Pedo guy megalomaniacal manchild officially owns Twitter



11 minutes ago, LazyPiranha said:


I don’t know the stats on deaths, but computers crashed 9.1 times per million miles and humans crashed 4.1 times per million miles.


Also, I don’t know how these things are counted on paper when self-driving vehicles abandon control once they know an accident is imminent.  How long before impact does the computer have to be in control for the crash to count against it?

 


I have read a dozen studies and never come across a single one that had ADVs at 2x more crashes than HDVs, outside of subsets of circumstances (there are circumstances where ADVs just don’t work well yet).

 

Would be curious to read the study you are referring to as it seems to be way outside the consensus at this point. That doesn’t mean it is incorrect, but they must be doing something different to generate such a result and I am quite curious what that is.


23 minutes ago, sblfilms said:


I have read a dozen studies and never come across a single one that had ADVs at 2x more crashes than HDVs, outside of subsets of circumstances (there are circumstances where ADVs just don’t work well yet).

 

Would be curious to read the study you are referring to as it seems to be way outside the consensus at this point. That doesn’t mean it is incorrect, but they must be doing something different to generate such a result and I am quite curious what that is.

 

NATLAWREVIEW.COM

The concept of driverless cars is here to stay. America is competing in a global race to make driverless cars the norm, and as predicted, nearly all major car manufactures currently...

 


2 hours ago, jinx8402 said:

My "for you" tab on Twitter just became like 90% elon and trump posts. Anyone else seeing that?

 

God this app sucks now. Only use it for following NBA/Celtics chatter.

 

I follow Musk (because of space stuff) so I see a lot of (all of the?) chud posts on my "following" tab, along with retweets by chud-adjacent people I follow.  It's gotten so bad over the last few weeks that I might as well be looking at the "for you" tab instead to get some variety. 


3 hours ago, jinx8402 said:

My "for you" tab on Twitter just became like 90% elon and trump posts. Anyone else seeing that?

 

God this app sucks now. Only use it for following NBA/Celtics chatter.

 

I can only spend like 5 minutes on it before it asks me to subscribe and tells me I'm done for the day. lol

 

I only recently started trying to use it and it's already lost me


17 hours ago, sblfilms said:


I know what you mean, and it is a standard that no company puts into place for any of their automated hardware or software.

 

But your claim that they don’t stand behind it was a response to me pointing out that it is safer than human driving per mile. The fact that the insurance coverage they sell is cheaper for those who use FSD more often shows that they do financially stand behind FSD. If they believed it was more prone to failure than human-driven miles, they would charge a higher rate for miles driven under FSD.

 

Waymo has no driver in their taxis and is fully responsible for accidents that are its fault.  I had a very long layover in SF last week and took a couple Waymos around, it is a bizarre experience being in a car with no driver.


4 minutes ago, finaljedi said:

 

Waymo has no driver in their taxis and is fully responsible for accidents that are its fault.  I had a very long layover in SF last week and took a couple Waymos around, it is a bizarre experience being in a car with no driver.

 

Really cool though. 


4 minutes ago, finaljedi said:

 

Waymo has no driver in their taxis and is fully responsible for accidents that are its fault.  I had a very long layover in SF last week and took a couple Waymos around, it is a bizarre experience being in a car with no driver.


Waymo owns the car. The liability falls on the owner of the equipment. Who else would be liable?


6 minutes ago, sblfilms said:


Waymo owns the car. The liability falls on the owner of the equipment. Who else would be liable?

 

Liability falls on both the owner and operator of equipment. So in the case of a rental car, for example, the driver is liable in most collisions (unless the owner, the rental company, is somehow negligent due to proven maintenance failures, etc). So if you are renting/leasing a self-driving car, then the liability situation should be similar to if you own it, provided the vehicles are the same (since the operator is either you, or the self-driving program). If we are shifting liability to the owner of self-driving vehicles rather than the operator/passenger, then it radically changes how vehicle ownership would operate. I can already envision a situation where there are companies set up to own vehicles that are independent of the companies that use them, thus passing the liability up the chain (so those companies can potentially take losses and fail without impacting the operating company, etc). 

 

The really interesting question is, if self-driving cars are mobile Chinese Rooms and can't be changed/modified by the owner/operating company or individual...then why should that owner or individual carry any of the liability? It should be on the manufacturer, since the owner/operator has no direct control over how the vehicle works.


I've been slow to post my hot take, but FSD will be here soon enough, and the big piece that will need to be ironed out is car-to-car communication, even for cars without FSD. The sooner this gets widely adopted, the sooner FSD can really take off. I know Tesla has been getting better with their FSD, but I still think the real winners will be someone else, like Waymo. 


10 minutes ago, chakoo said:

I've been slow to post my hot take, but FSD will be here soon enough, and the big piece that will need to be ironed out is car-to-car communication, even for cars without FSD. The sooner this gets widely adopted, the sooner FSD can really take off. I know Tesla has been getting better with their FSD, but I still think the real winners will be someone else, like Waymo. 


As I mentioned previously, c2c communication will be a huge improvement in safety even for vehicles that have basic brake and steering assist functions. I foresee government mandated standards for c2c communication in the next 10 years, and it will be a seatbelt level benefit to society.


17 hours ago, LazyPiranha said:

 

NATLAWREVIEW.COM

The concept of driverless cars is here to stay. America is competing in a global race to make driverless cars the norm, and as predicted, nearly all major car manufactures currently...

 


I tracked down the actual study they based their claim on. I think it is worth noting that the data sources they used are tracking different things. The ADV data came from manufacturers working with NHTSA, while the HDV data came only from accidents reported to police.
 

I am not a big fan of using wholly different types of data to do this sort of analysis, but going with what we have, it was interesting to read that despite finding more accidents in total per mile driven, ADV accidents were notably less serious than HDV accidents.

 

It seems like that too could be a bias of the data. Accidents needing a police report are typically more serious than accidents which don’t, so only including police reported HDV accidents probably inflates the average seriousness of accident in the data set.
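That selection effect can be illustrated with a toy simulation. This is a made-up sketch, not calibrated to any real crash data: the exponential severity distribution and the 1.5 reporting cutoff are pure assumptions, chosen only to show how a police-report-only dataset both undercounts crashes and overstates their average severity.

```python
import random

random.seed(0)

# Toy model: each crash gets a "severity" score drawn from an
# exponential distribution (assumed shape, purely illustrative).
severities = [random.expovariate(1.0) for _ in range(100_000)]

# Assume only crashes above some severity threshold get a police report.
REPORT_THRESHOLD = 1.5  # hypothetical cutoff

reported = [s for s in severities if s > REPORT_THRESHOLD]

mean_all = sum(severities) / len(severities)
mean_reported = sum(reported) / len(reported)

# The police-reported subset is both smaller and more severe on average,
# so an HDV dataset built only from police reports understates crash
# counts per mile while inflating average crash seriousness.
print(f"all crashes:      n={len(severities):6d}, mean severity={mean_all:.2f}")
print(f"reported crashes: n={len(reported):6d}, mean severity={mean_reported:.2f}")
```

Under these assumptions the reported subset comes out markedly more severe on average than the full population, which is the direction of bias described above.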


1 hour ago, sblfilms said:


Waymo owns the car. The liability falls on the owner of the equipment. Who else would be liable?

 

Car mfgs are liable for their designs and defects. Full stop. So many cars run on so much software outside of FSD that it is stupid not to assign liability to manufacturers for something they alone control, absent the user or owner voiding the warranty by accessing the vehicle’s computers and software, or some improper use (outside of normal use and operation). There have been recalls and class action settlements for other safety equipment and software defects, for things that go on literally under the hood, and manufacturers are liable for those. And when FSD is active, the liability should generally be borne by the software owner. But calling what Tesla has “FSD,” given the stringent requirements they impose for “normal” operation, is misleading at best imo. They’re trying to have their cake and eat it while Tesla owners are beta testers. (Beta cuck testers, but that’s neither here nor there.)
 

If there’s a product feature that leads to damages the manufacturer is and should eventually be held responsible, even if only financially. And that’s regardless of the industry, and even in the name of meeting other regulatory standards. FSD should be treated the same.
 

But also:

46 minutes ago, Kal-El814 said:

 

Imaging putting your life in the hands of anything this man is remotely associated with; truly one of the dumbest motherfuckers walking god's green earth. Somewhat ironically he's convinced me the woke mind virus is real and his case is terminal.

This. But his cars are not just a danger to passengers, his and all cars in general are a danger to the public especially at higher speeds. 


3 minutes ago, b_m_b_m_b_m said:

Car mfgs are liable for their designs and defects. Full stop.

Waymo doesn’t manufacture cars though. They own and operate vehicles manufactured by somebody else. So of course the liability is fully on them when their fleet vehicles are at fault.

 

But who is operating a privately owned car when using computer assisted driving, from basic lane keeping all the way to stuff like FSD and Bluecruise? The notion that you stop being an operator of a piece of equipment simply because there is now automation involved is not something we apply to any other industry. If the automation fails to perform in the way it is supposed to, the end user certainly can argue that the manufacturer has liability.

 

Tesla has been sued for damages due to their various automated driving technologies, though I don’t know how far any of those cases have gone at this point. My guess though would be that even if Tesla were found liable to some degree in a particular scenario, the liability would still be split between the operator and Tesla because FSD is plastered from start to finish with warnings to the driver to remain engaged at all times because the software is incomplete.


2 hours ago, sblfilms said:


As I mentioned previously, c2c communication will be a huge improvement in safety even for vehicles that have basic brake and steering assist functions. I foresee government mandated standards for c2c communication in the next 10 years, and it will be a seatbelt level benefit to society.

It's actually more than that. It allows for cars that are doing FSD to know the state of other cars around them so they can make smarter choices on the situation. You're starting to see this a lot with drones in how they can interoperate together on their own. I agree this is going to be an area that will have government mandating it once a standard is established.


35 minutes ago, sblfilms said:

They own and operate vehicles manufactured by somebody else. So of course the liability is fully on them when their fleet vehicles are at fault.

It’s the software (and corresponding hardware modifications) that makes them liable because you do not own the software that is utilized for fsd from Waymo or Tesla. They own the software and license it to you on a limited basis for use in your or their vehicle.
 

With a Tesla, if your vehicle is equipped with the necessary sensors you can purchase a software license, which they can revoke or modify access to under certain conditions. That is more intrusive than other software used in the vehicle because of the value add and the liability they wish to offload. They maintain that they are the owner of the software that turns your normal Tesla into an FSD vehicle, yet they want to convince you that this software (which they own, create, and maintain) carries no additional liability for them. 
 

1 hour ago, sblfilms said:

The notion that you stop being an operator of a piece of equipment simply because there is now automation involved is not something we apply to any other industry. If the automation fails to perform in the way it is supposed to, the end user certainly can argue that the manufacturer has liability.

Subverting expectations only works well in movies, not in real life. But I promise you that if software does not perform as expected and real losses are incurred the software owner is held liable.

 

All of this is beside the point: are you an operator, or is it FSD? If you have to operate or manage the vehicle while in motion, it’s definitionally not full self driving. And if it’s not full self driving, there would be reduced liability for the software owner. But it’s clear there’s a level of obfuscation here: certain wealthy parties want to push what they consider FSD, and the risks of the level of FSD on offer, for their beta testing (and $$$), while maintaining that it is a driver assist for legal liability purposes. 


1 hour ago, b_m_b_m_b_m said:

But I promise you that if software does not perform as expected and real losses are incurred the software owner is held liable.

 

This is exactly the crux of the issue. What expectation is being set by a company when they make their software available? The relevant parts of the Tesla owner's manual are linked below.

 

https://www.tesla.com/ownersmanual/modely/en_us/GUID-2CB60804-9CEA-4F4B-8B04-09B991368DC5.html

 

https://www.tesla.com/ownersmanual/modely/en_us/GUID-E5FF5E84-6AAC-43E6-B7ED-EC1E9AEB17B7.html

 

There is a substantial gulf between the language used in early Tesla documentation and today. I would say the company was rightly criticized for the way it described the state of the technology 3-4 years ago. Even the name of the software has been updated to state explicitly that it is a driver-supervised technology at this point. The vehicle itself will also give you all of this same language about what you are engaging when you activate FSD, or lower-level assistance features like Autopilot.

Misuse of a product that results in harm carries decidedly lower liability for the manufacturer of the product, and I would venture to say that most examples of low-level or high-level driver assistance leading to crashes are far more commonly due to drivers being inattentive and not reacting to what is happening around them than to the car behaving completely outside the expectation of what it is capable of. But that is mostly a read on the relatively limited record of lawsuits won on the basis that the automated function of the car was primarily at fault.

 

2 hours ago, b_m_b_m_b_m said:

all of this is beside the point: are you an operator or is it fsd?

 

You are the operator getting driving assistance from the software. You will still be the operator of the vehicle even when we eventually have autonomous vehicles with no steering wheels. The type of operation has merely changed. Modern trains feature a ton of automation, and yet BNSF or Union Pacific are still responsible for not making the things fly off the tracks. They continue to have human engineers sitting in the engines and intervening when needed.


5 minutes ago, Jason said:

 

So it's not full self driving.

Imagine optionally paying thousands of dollars more for your car to still be mostly in your control and you still have to have hands on wheel and eyes on the road, glorified cruise control. Truly living in the self driving future 


2 minutes ago, b_m_b_m_b_m said:

Imagine optionally paying thousands of dollars more for your car to still be mostly in your control and you still have to have hands on wheel and eyes on the road, glorified cruise control. Truly living in the self driving future 


I certainly wouldn’t pay for FSD in its current state. The included Autopilot does the thing I most care about, which is making long distance highway driving substantially more comfortable than active steering, though most people don’t understand why until they have to drive again without it. Fortunately, in the next 5 years pretty much every car will be delivered with something like Autopilot or Bluecruise.


2 minutes ago, sblfilms said:


I certainly wouldn’t pay for FSD in its current state. The included Autopilot does the thing I most care about, which is making long distance highway driving substantially more comfortable than active steering, though most people don’t understand why until they have to drive again without it. Fortunately, in the next 5 years pretty much every car will be delivered with something like Autopilot or Bluecruise.

How good is it all in less than ideal conditions? Mountains, rain, snow, etc.


6 hours ago, sblfilms said:


I tracked down the actual study they based their claim on. I think it is worth noting that the data sources they used are tracking different things. The ADV data came from manufacturers working with NHTSA, while the HDV data came only from accidents reported to police.
 

I am not a big fan of using wholly different types of data to do this sort of analysis, but going with what we have, it was interesting to read that despite finding more accidents in total per mile driven, ADV accidents were notably less serious than HDV accidents.

 

It seems like that too could be a bias of the data. Accidents needing a police report are typically more serious than accidents which don’t, so only including police reported HDV accidents probably inflates the average seriousness of accident in the data set.


The problem with any of these things is you’re trying to track a number that by definition is unknowable.  Car accidents that no one reports probably outweigh ones that are reported, especially if you consider accidents that don’t involve anyone else.  If I accidentally back into a concrete pylon in a parking lot, that’s an accident but no one ever finds out about it.  At the same time, I feel like it’s an unfair comparison because an automated driving system can just go “nah son” and tell a driver that conditions are too crap for it to drive, whereas a person doesn’t necessarily have that choice.  
 

The bizarre reality of it is that, in general, people are shockingly good at driving.  The fact that we regularly get in huge metal boxes and travel at 70 mph directly at one another with less than 15 feet separating us and any of us live to see tomorrow is astounding.  I don’t know when or if a computer is ever going to become as good as a human at perceiving the world around it. 


10 minutes ago, b_m_b_m_b_m said:

The latter


Rain is not much of an issue at this point, while snow remains a more difficult problem to solve (snow is notably more difficult than rain for human drivers too). Unmarked/poorly marked roads have seen massive leaps in the ability of these systems to correctly recognize the road. I am unaware of how mountain terrain impacts these systems.

 

I think, like most technologies, you as the end user need to make educated choices about how to utilize them, and each flavor of system is different and has a unique set of strengths and weaknesses to understand.


22 minutes ago, sblfilms said:

How many of you have actually driven a car using any of these systems? I feel like a lot of the perspectives here are based off of videos of early versions of the software, which were legitimately terrible and scary 😂


Closest I ever got was renting a Subaru which would stay in lanes and accelerate/brake automatically.  You still had to turn manually, etc.  It was this model year, so recent.

 

It was… weird.  It worked, and I never had a problem with it, but I never got over feeling really uncomfortable as it drove along.  You had to keep your hands on the wheel at all times or it yelled at you and disengaged, so weirdly I felt MORE stressed out: instead of just knowing what was happening because I was the one doing it, I was constantly monitoring something to make sure it didn’t fuck up.  I had to be aware of the road AND the machine at the same time.  I’ve also driven my dad’s Audi, which has lane assist and adaptive cruise; it basically makes the wheel harder to steer out of the lane and rumbles to warn you that you’re drifting.  I found that easier to deal with because I was not actively relinquishing control, and I really only had to pay attention to speed on the highway, which wasn’t much different than normal cruise control.

 

I’m sure with more time, I’d get progressively more comfortable with it until it didn’t stress me out, but I don’t know if I want to get to that point.  I know it’s not a reasonable position given the data, but there’s just something I find really uncomfortable about it. 


9 minutes ago, LazyPiranha said:

I know it’s not a reasonable position given the data, but there’s just something I find really uncomfortable about it. 

I think it’s reasonable. These half measures are helpful at best, but mostly annoying. I don’t even keep the lane assist on in my mostly dumb car because the feeling of the car starting to steer for you to keep it between the lines is so off-putting. It’d be fine if it always worked, but sometimes it just doesn’t. I do appreciate automatic emergency braking at low speeds. It’s those minor collisions (or hitting someone walking, lawd forbid) that suck hard. I haven’t tried GM’s SuperCruise, but that’s the self-driving I’m most likely to encounter in the future. I’m never getting into a Tesla, that’s for sure. I refuse to have anything to do with that egotistical asshole. 
