Use Baidu's platform to show how the fusion of Lidar, radar, and cameras can be fooled by stuff from your kids' craft box

A team of researchers from prominent universities – including SUNY Buffalo, Iowa State, UNC Charlotte, and Purdue – was able to turn an autonomous vehicle (AV) running Apollo, the open-source driving platform from Chinese web giant Baidu, into a deadly weapon by tricking its multi-sensor fusion system, and suggests the attack could be applied to other self-driving cars.

MeatPilot

Human-driven cars can be crashed with a brick, or a quart of oil.

Jesus

Wait until you see what my uncle Jerry can do with a 5th of vodka and his Highlander.

@EvilBit@lemmy.world

https://xkcd.com/1958/

TL;DR: faking out a self-driving system is always going to be possible, and so is faking out humans. But doing so is basically attempted murder, which is why the existence of an exploit like this is not interesting or new. You could also cut the brake lines or rig a bomb to it.

@Beryl@lemmy.world

You don’t even have to rig a bomb, a better analogy to the sensor spoofing would be to just shine a sufficiently bright light in the driver’s eyes from the opposite side of the road. Things will go sideways real quick.

@EvilBit@lemmy.world

It’s not meant to be a perfect example. It’s a comparable principle. Subverting the self-driving like that is more or less equivalent to any other means of attempting to kill someone with their car.

@Beryl@lemmy.world

I don’t disagree; I’m simply trying to present a somewhat less extreme (and therefore, I think, more appealing) version of your argument.

@NeoNachtwaechter@lemmy.world

It is old.

I mean, not this particular attack, but the principle is well known.

The solution is also known: any sensor (or at least any critically important sensor) in a robotic system must be able to recognize its own state of “blindness”, and the system must react accordingly. (For example, the camera behind the windshield would activate the wipers and the windshield heating to remove possible rain, snow, or dirt.) If several sensors go “blind” at the same time, the system must bring the car to a safe stop.
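The policy this comment describes can be sketched in a few lines. This is a minimal, hypothetical illustration, not code from Apollo or any real AV stack; the sensor names, the `signal_quality` metric, and the 0.2 threshold are all invented for the example.

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    name: str
    # Fraction of returns/pixels carrying usable signal; a blinded or
    # occluded sensor reports a very low value. (Invented metric.)
    signal_quality: float

BLIND_THRESHOLD = 0.2  # hypothetical cutoff for "this sensor is blind"

def blind_sensors(readings):
    """Return the names of sensors that cannot currently see."""
    return [r.name for r in readings if r.signal_quality < BLIND_THRESHOLD]

def react(readings):
    """Mirror the comment's policy: one blind sensor triggers a recovery
    action (wipers, heating, ...); several blind at once means the fused
    picture can't be trusted, so perform a safe stop."""
    blind = blind_sensors(readings)
    if len(blind) >= 2:
        return "SAFE_STOP"
    if len(blind) == 1:
        return f"RECOVER:{blind[0]}"
    return "DRIVE"

print(react([SensorReading("camera", 0.05), SensorReading("lidar", 0.9),
             SensorReading("radar", 0.85)]))   # -> RECOVER:camera
print(react([SensorReading("camera", 0.05), SensorReading("lidar", 0.1),
             SensorReading("radar", 0.85)]))   # -> SAFE_STOP
```

The point of the fusion attack in the article is precisely that it defeats this kind of cross-check: if the spoofed input still looks like a healthy signal, no per-sensor blindness test fires.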

@Gustephan@lemmy.world
link
fedilink
English
54M

It’s basically chaff, lol. We’ve known chaff is an effective radar countermeasure since the 40s, and it seems like the researchers have found the lidar and optical equivalents of chaff. What really scares me is the idea of this evolving into more sophisticated deception attacks like range or velocity gate pulls. No idea how you’d do that with lidar or optically, but I’d bet money that’s a line item on a black budget somewhere
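For readers unfamiliar with gate pulls: a range-gate pull-off works by injecting a false return stronger than the real echo, then slowly walking it away so the tracker’s gate follows the decoy until the true target falls outside the gate. The toy simulation below is purely illustrative – the tracker, signal strengths, and gate width are invented, and this models classic radar EW, not any demonstrated lidar or camera attack.

```python
def track(true_range, spoof_ranges, gate_center, gate_width=10.0):
    """Toy range tracker: each step, lock onto the strongest return
    inside the current gate. The real echo has strength 1.0; the
    jammer's false return overpowers it at 5.0."""
    history = []
    for spoof in spoof_ranges:
        candidates = [(1.0, true_range)]
        if spoof is not None:
            candidates.append((5.0, spoof))  # jammer overpowers the echo
        in_gate = [c for c in candidates if abs(c[1] - gate_center) <= gate_width]
        if not in_gate:
            history.append(None)  # nothing in the gate: lock is lost
            continue
        _, gate_center = max(in_gate)  # follow the strongest return
        history.append(gate_center)
    return history

# The spoofer first mimics the true range (100 m), walks the false
# return out in 5 m steps, then switches off, leaving the gate empty.
steps = [100.0, 105.0, 110.0, 115.0, 120.0, None]
print(track(true_range=100.0, spoof_ranges=steps, gate_center=100.0))
# -> [100.0, 105.0, 110.0, 115.0, 120.0, None]
```

By the final step the gate sits 20 m from the real target, so when the jammer stops transmitting the tracker has nothing to lock onto – which is the whole point of the pull.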

It’s still a problem. A sheet held across the road on a string would show up as a wall to both cameras and lidar. I, for one, am looking forward to the emerging profession of road pirates robbing automated trucks this way.

road pirates robbing automated trucks

OK, but the problem of road pirates isn’t new either, is it? Let’s watch ‘Herbie’ again :-)

There is just one risk that is kinda new (though it actually comes with every automation): systematic errors could introduce vulnerabilities that get exploited in large numbers.
