New Footage Shows Tesla On Autopilot Crashing Into Police Car After Alerting Driver 150 Times
Six officers who were injured in the crash are suing Tesla despite the fact that the driver was allegedly impaired

@EndOfLine@lemmy.world · 12 points · 1Y · edited

Officers injured at the scene are blaming and suing Tesla over the incident.

And the reality is that any vehicle on cruise control with an impaired driver behind the wheel would’ve likely hit the police car at a higher speed. Autopilot might be maligned for its name but drivers are ultimately responsible for the way they choose to pilot any car, including a Tesla.

I hope those officers got one of those “you don’t pay if we don’t win” lawyers. The responsibility ultimately resides with the driver and I’m not seeing them getting any money from Tesla.

Peanut · 2 points · 1Y

I still think Tesla did a poor job of conveying the limitations to the wider public. They piggybacked on Waymo’s capability and practice without matching it, which is probably why so many drivers are over-reliant. I’ve always been against mass-producing semi-autonomous vehicles for the general public. This is why.

And then this garbage is used to attack the general concept of autonomous vehicles, which may become a fantastic life-saver, because then it can safely drive these assholes around.

queermunist she/her · -3 points · 1Y

wtf I love Tesla now

LazaroFilm · -2 points · 1Y

lol

deleted by creator

It’s a douche bag trifecta:

  - Tesla owner, driving drunk
  - Cops, being cops
  - Tesla, overselling their shitty car

I just hope that innocent bystander gets something from all three of them

daikiki · 46 points · 1Y

I have a lot of trouble understanding how the NTSB (or whoever’s ostensibly in charge of vetting tech like this) is allowing these not-quite self driving cars on the road. The technology doesn’t seem mature enough to be safe yet, and as far as I can tell, nobody seems to have the authority or be willing to use that authority to make manufacturers step back until they can prove their systems can be integrated safely into traffic.

It’s not “not-quite-self-driving” though, it’s literal garbage. It’s cruise control, lane assist and brake assist. The robot vision in use is horrible.

There are Tesla engineers bad mouthing the system openly.

Musk is a scammer and they need to issue an apology for all of the claims around autopilot, probably pay a great deal of money, and then change the name and advertising around it.

Oh, and also this guy should never drive again.

$$$ that’s how.

It’s just ADAS - essentially fancy cruise control. There are a number of autonomous vehicle companies who are carefully and successfully developing real self-driving technology, and Tesla should be censured and forbidden from labeling their assistance software as “full self-driving.” It’s damaging the real industry.

Hard to argue Tesla is at fault when the driver was clearly impaired and at fault here.

@chakan2@lemmy.world · 1 point · 1Y

I hope the cops win. Autopilot allows for a driver to completely disengage their attention from the car in a way that’s not possible with just cruise control.

There’s no way you can drop a human in a life threatening critical situation with 2.5 seconds to make a decision and expect them to make reasonable decisions. Even stone cold sober, that’s a lot to ask of a person when the car makes a critical mistake like this.

On cruise control, the driver would still have to be aware that they were driving. With Autopilot, the driver had likely passed out and the car carried on its merry way.

@meco03211@lemmy.world · 2 points · 1Y

Because people can’t pass out with just cruise control? He didn’t have 2.5 seconds. According to the article he had 45 minutes of multiple warnings.

N3Cr0 · 13 points · 1Y

Poor drunk impaired driver falling victim to autonomous driving… Hopefully that driver lost their license.

Cyber Yuki · 1 point · 1Y

That doesn’t solve the problem of Autopilot not making the right choices. What if the driver wasn’t drunk, but had a heart attack? What if someone put a roofie in their drink? What if the driver was diabetic or hypoglycemic and suffered a drop in blood glucose? What if they had a stroke?

Furthermore, what if the driver got drunk BECAUSE the car’s AI was advertised as being able to drive for you? Think of the false advertising.

If your AI can’t handle one simple case of a driver being unresponsive, that’s negligence on the company’s part.

@zerbey@lemmy.world · 26 points · 1Y

150 more warnings than a regular car would give, ultimately it’s the driver’s fault.

Armok: God of Blood · 7 points · 1Y

If the driver was unresponsive in a normal car, it would stop.

@sugartits@lemmy.world · 4 points · 1Y

The driver was responding. If he didn’t respond the car would have stopped.

If this was a normal car he probably would have just crashed earlier.

@socsa@lemmy.ml · 4 points · 1Y

TIL cruise control doesn’t exist

@Md1501@lemmy.world · 15 points · 1Y

You know what might work: program the car so that after the second unanswered alert, the autopilot pulls the car over, or reduces speed and turns on the hazards. After a third violation, autopilot is disabled on that car for a period of time.
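The escalation policy proposed above could be sketched as a small state machine. This is purely illustrative: the class name, thresholds, and return values are hypothetical, not Tesla’s actual driver-monitoring logic.

```python
# Hypothetical sketch of the proposed escalation policy: unanswered
# alerts escalate from a warning, to pulling over with hazards on,
# to disabling autopilot for a period of time after three violations.
# All names and thresholds are illustrative, not Tesla's real logic.
from dataclasses import dataclass

@dataclass
class AutopilotMonitor:
    unanswered_alerts: int = 0
    violations: int = 0
    autopilot_enabled: bool = True

    def on_alert_timeout(self) -> str:
        """Called when the driver fails to respond to an attention alert."""
        if not self.autopilot_enabled:
            return "autopilot_disabled"
        self.unanswered_alerts += 1
        if self.unanswered_alerts >= 2:
            # Second unanswered alert: record a violation and pull over.
            self.violations += 1
            self.unanswered_alerts = 0
            if self.violations >= 3:
                # Third violation: lock autopilot out for a cooldown.
                self.autopilot_enabled = False
                return "autopilot_disabled"
            return "pull_over_hazards_on"
        return "warn_driver"

    def on_driver_response(self) -> None:
        """Driver responded in time: reset the unanswered-alert counter."""
        self.unanswered_alerts = 0
```

A driver who keeps responding just before the second alert would never accumulate a violation, which is roughly the loophole the replies below describe.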

They didn’t say he didn’t respond to the alerts. If you don’t respond, autopilot turns off.

This is literally exactly how it works already. The driver must have been pulling on the steering wheel right before it gave him a strike. The system will warn you to pay attention for a few seconds before shutting down. Here’s a video: https://youtu.be/oBIKikBmdN8

@Md1501@lemmy.world · 6 points · 1Y

Ah, so it’s just people defeating the system.

@stealin@lemmy.world · 2 points · 1Y

The principle with cars is that you don’t distract the driver from driving; a system that takes over driving does exactly that, so the idea of the system is flawed to begin with.

@Technoguyfication@lemmy.ml · 1 point · 1Y · edited

I have to say this is extremely inaccurate imo. Self driving takes over the menial tasks of keeping the car in the lane, watching the speed, etc. and allows an attentive driver to focus on more high level tasks like looking at the road ahead, watching the sides of the road for potential hazards, and keeping more aware of their blind spots.

Just because the feature can be abused does not inherently make it unsafe. A drunk driver can use cruise control to more accurately control the vehicle’s speed and avoid a ticket, does that make it a bad feature? I wouldn’t say so.

Autopilot and other driver assist systems are good when used responsibly and cautiously. It’s frustrating to see people cause an accident after misusing the system and blame the technology instead. This is why we can’t have nice things.

> It’s frustrating to see

> This is why we can’t have nice things

It is also frustrating to see people whining about the technology when they should rather be thinking about the injured policemen and rescuers.

You should get your priorities straight if you ever hope to be taken seriously

Evie · 0 points · 1Y · edited

So self driving cars, are not so self driving… Huh, whodathunk it lol /s

@CaptainProton@lemmy.world · 34 points · 1Y · edited

This is stupid. Teslas can park themselves, they’re not just on rails. It should be pulling over and putting the flashers on if a driver is unresponsive.

That being said, the driver knew this behavior, acted with wanton disregard for safe driving practices, and so the incident is the driver’s fault and they should be held responsible for their actions. It’s not the court’s job to legislate.

It’s actually the NHTSA’s job to regulate car safety, so if it doesn’t already have the authority, Congress needs to grant it the power to regulate what AI behavior is acceptable and to define safeguards against misbehaving AI.

@doggle@lemmy.world · 2 points · 1Y

Sounds like the injured officers are suing. It’s a civil case, not a criminal one, so I’m not sure how much the court would actually be asked to legislate. I’d be interested to hear their arguments, though I’m sure part of their reasoning for suing Tesla rather than the driver is that Tesla has more money.

@socsa@lemmy.ml · 5 points · 1Y

There’s no way the headline is true. Zero percent. The car will literally do exactly what you stated if it goes too long without driver engagement and I’ve experienced it first hand.

@doggle@lemmy.world · 1 point · 1Y

The headline doesn’t state that the warnings were consecutive.

Perhaps the driver was just aware enough to keep squelching warnings and prevent the car from stopping altogether?

I’ll grant you, though, 150 warnings is still a little tough to believe…

@hark@lemmy.world · 25 points · 1Y

Setting aside the driver issue, isn’t this another case that could’ve been prevented with LIDAR?

@Snapz@lemmy.world · 15 points · 1Y

This source keeps pushing Tesla propaganda. There’s always an angle trying to sell the idea that it wasn’t the Tesla’s fault.
