Driverless cars worse at detecting children and darker-skinned pedestrians, say scientists: Researchers call for tighter regulations following major age- and race-based discrepancies in autonomous AI systems.

Eager Eagle

I hate all this bias bullshit because it makes the problem bigger than it actually is and passes the wrong idea to the general public.

A pedestrian detection system shouldn’t have equal detection across skin tones and pedestrian sizes as its goal; there’s no benefit in that. It should do the best it can to reduce the false negative rate of pedestrian detection across the board, and hopefully do better than human drivers in the majority of scenarios. The error rates will differ due to the very nature of the task, and that’s ok.

This is what actually happens during research for the most part, but the media loves to stir up polarization and the public gives them their clicks. Pushing for a “reduced bias model” is actually detrimental to overall performance, because it incentivizes development of models that perform worse in scenarios where they could have an edge, just to serve an artificial demand for reduced bias.

@zabadoh@lemmy.ml

I think you’re misunderstanding what the article is saying.

You’re correct that it isn’t the job of a system to detect someone’s skin color, and judge those people by it.

But the fact that AVs detect dark-skinned people and short people less effectively is a reflection of the lack of diversity in the tech staff designing and testing these systems as a whole.

The staff are designing the AVs to safely navigate a world of people like themselves, but when the staff are overwhelmingly male, light-skinned, young, single, urban, and based in the United States, a lot of considerations don’t even cross their minds.

Will the AVs recognize female pedestrians?

Do the sensors cover a wide enough light spectrum to detect dark-skinned people?

Will the AVs recognize someone with a walker or in a wheelchair, or some other mobility device?

Toddlers are small and unpredictable.

Bicyclists can fall over at any moment.

Are all these AVs being tested in cities being exposed to all the animals they might encounter in rural areas like sheep, llamas, otters, alligators and other animals who might be in the road?

How well will AVs tested in urban areas fare on mountain roads that suddenly change from multi-lane asphalt to narrow, twisty dirt?

Will they recognize tractors and other farm or industrial vehicles on the road?

Will they recognize something you only encounter in a foreign country like an elephant or an orangutan or a rickshaw? Or what’s it going to do if it comes across that tomato festival in Spain?

Engineering isn’t magical: It’s the result of centuries of experimentation and recorded knowledge of what works and doesn’t work.

Releasing AVs on the entire world without testing them on every little thing they might encounter is just asking for trouble.

What’s required for safe driving without human intelligence becomes more mind-boggling the more you think about it.

@rDrDr@lemmy.world

But the fact that AVs detect dark skinned people and short people at a lower effectiveness is a reflection of the lack of diversity in the tech staff designing and testing these systems as a whole.

No, it isn’t. It’s a product of the fact that darker skin is darker and children are smaller. Human drivers have a harder time seeing these individuals too: they literally reflect less light into the camera sensor. This is why people wear reflective vests for safety at night, and why ninjas dress in black.

That is true. I almost hit a dark-skinned guy, wearing black, who was crossing a street with no streetlight at night as I turned into it. Almost gave me a heart attack. It’s bad enough almost getting hit, as a white guy, when I cross a street with a streetlight.

@ashok36@lemmy.world

This is true, but Tesla and others could compensate for it by spending more time and money training on those form factors, something human drivers can’t really do. It’s an opportunity for them to prove the superhuman capabilities of their systems.

Eager Eagle

These are important questions, but addressing them for each model built independently and optimizing for a low “racial bias” is the wrong approach.

In academia we have reference datasets that serve as standard benchmarks for data-driven prediction models like pedestrian detectors. The numbers obtained on these datasets are usually the reference points used when comparing different models. By building comprehensive datasets, we get models that work well across a multitude of scenarios.

Those are all good questions, but they need to be addressed when building such datasets. Whether model M performs X% better at detecting people of one skin color is not relevant, as long as the error rate for every skin color stays within an acceptable range.
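To make that acceptance criterion concrete, here is a minimal sketch: instead of demanding identical detection rates across groups, it checks each group’s false-negative rate against a single safety cap. All group names, counts, and the threshold are made up for illustration.

```python
# Hypothetical sketch: per-group false-negative rates vs. a safety cap.
# Group names and counts are fabricated for illustration only.

def false_negative_rate(missed: int, total: int) -> float:
    """Fraction of actual pedestrians the detector failed to flag."""
    return missed / total if total else 0.0

# (missed detections, total pedestrian instances) per group -- fabricated
results = {
    "lighter_skin": (11, 1000),
    "darker_skin": (16, 1000),
    "children": (19, 1000),
}

MAX_ACCEPTABLE_FNR = 0.02  # assumed 2% miss-rate safety budget

for group, (missed, total) in results.items():
    fnr = false_negative_rate(missed, total)
    verdict = "OK" if fnr <= MAX_ACCEPTABLE_FNR else "UNACCEPTABLE"
    print(f"{group}: FNR = {fnr:.3f} -> {verdict}")
```

Under this criterion all three hypothetical groups pass even though their rates differ, which is exactly the point above: the requirement is a hard cap on misses, not equality between groups.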

The media has become ridiculously racist; they go out of their way to make every incident appear to be racial now.

@Fedizen@lemmy.world

Cars should be tested for safety in collisions with children, and it should affect their safety rating and taxes. Driverless equipment shouldn’t be allowed on the road until these sorts of issues are resolved.

@reddig33@lemmy.world

LiDAR doesn’t see skin color or age. Radar doesn’t either. Infra-red doesn’t either.

That’s a fair observation! LiDAR, radar, and infrared systems might not directly detect skin color or age, but the point the article makes is that there are challenges in accurately detecting darker-skinned pedestrians and children. The bias seems to stem from the data used to train these AI systems, which may not have enough diverse representation.
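The “not enough diverse representation” concern is easy to check before training even starts. A quick composition audit of the annotation labels (all tags and counts below are fabricated for illustration) exposes under-represented groups:

```python
# Hypothetical training-data composition audit; labels are fabricated.
from collections import Counter

# One demographic tag per annotated pedestrian instance (made-up data)
annotations = (["adult_light"] * 800
               + ["adult_dark"] * 120
               + ["child"] * 80)

counts = Counter(annotations)
total = sum(counts.values())
for group, n in counts.most_common():
    print(f"{group}: {n} instances ({n / total:.0%} of dataset)")
```

If children make up 8% of the training instances but a larger share of real-world pedestrians, the imbalance is visible long before anyone measures detection rates on the road.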

The main issue, as someone else pointed out as well, is in image detection systems only, which is what this article is primarily discussing. LiDAR does have its own drawbacks, however; I wouldn’t be surprised if those systems still failed to detect children as reliably. Skin color definitely wouldn’t be a consideration for it, though, as that’s not really how that tech works.

@bisq@lemmy.world

Ya hear that Elno?

@RanchOnPancakes@lemmy.world

Pretty sure Teslas target kids.

@AllonzeeLV@lemmy.world

Worse than humans?!

I find that very hard to believe.

We consider it the cost of doing business, but self-driving cars have an obscenely low bar to clear to surpass us in terms of safety. The biggest hurdle they have to climb is accounting for irrational human drivers, and for other irrational humans diving into traffic, which even the rare decent human driver can’t always account for.

American human drivers kill more than ten 9/11s’ worth of people every year. I’d rather modernizing and automating our roadways were a moonshot national endeavor, but we don’t do that here anymore, so we complain when the incompetent, narcissistic asshole who claimed the project for private profit turns out to be an incompetent, narcissistic asshole.

The tech is inevitable; there are no physics or computational-power limitations standing in our way. We just lack the will to act as a society (that means funding things together through taxation) and do it.

Let’s just trust another billionaire to do it for us and act in the best interests of society, though; that’s been working just gangbusters, hasn’t it?

Any black people or children in your ‘study’?

Not necessarily worse than humans, no, just worse than it can detect light skinned and tall people.

@baatliwala@lemmy.world

2020 was lockdown year, how on earth have accidents increased in the US?

ma11en

They need Google Pixel cameras.

Self-driving cars are Republicans?

What? No. They’d need to recognise them better - otherwise how can they swerve to make sure they hit them?

@thantik@lemmy.world

It’s almost like less contrast against a black road or smaller targets are computationally more difficult to detect or something! Weird! How about instead of this pretty clear fact, we get outraged and claim it’s racism or something! Yeah!!

@Frozengyro@lemmy.world

If you say the cars are racist you might get the far right to buy into them…

Sue the road builders for building racist roads!

/s

Wouldn’t good driverless cars use radars or lidars or whatever? Seems like the biggest issue here is that darker skin tones are harder for cameras to see

@mint_tamas@lemmy.world

I think many driverless car companies insist on only using cameras. I guess lidars/radars are expensive.

Tesla is basically the only one. Even Mobileye, which is objectively the best at computer vision in the ADAS/AV space, uses other sensors in its fleet. It has demonstrated camera-only autonomy, but realizes it’s not worth saving $1,000 in sensors to risk killing people.

@rDrDr@lemmy.world

Even Comma.ai, which is vision-only internally, still implicitly relies on the car’s built-in radar for collision detection and blind spot monitoring. It’s just Tesla.

@DoomBot5@lemmy.world

To be fair, that’s because most cars aren’t equipped with cameras for blind spot detection.

@rDrDr@lemmy.world

That’s because cameras aren’t good for blind spot detection. Moreover, even for cars that have cameras on the side, the Comma doesn’t use them. AFAIK, in my car with 360 cameras, the OEM system doesn’t use the cameras for blind spot detection either.

A single FLIR camera would help massively. It doesn’t care about color or height, only temperature.

I could make a warm water balloon in the shape of a human and it would stop the car, then. Maybe a combination of various types of technologies? You’d still have to train the model on all kinds of humans, though.
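The “combination of technologies” idea is essentially OR-fusion, which can be sketched in a few lines (the function and sensor names here are hypothetical, not any vendor’s API): report a pedestrian if any modality does, so a weakness in one sensor, like low camera contrast on dark clothing at night, is covered by another, at the cost of more false stops.

```python
# Illustrative OR-fusion sketch (all names hypothetical): a pedestrian
# is reported if ANY sensor fires, trading extra false positives for
# fewer shared false negatives.

def pedestrian_present(camera: bool, lidar: bool, thermal: bool) -> bool:
    # Err on the side of braking: a spurious stop is cheaper than a miss.
    return camera or lidar or thermal

# Camera misses a dark-clad pedestrian at night, but thermal still fires:
print(pedestrian_present(camera=False, lidar=False, thermal=True))
```

The design choice is deliberate: OR-fusion only needs one modality to succeed per scene, which is exactly why a warm decoy balloon would indeed trigger a stop; that failure mode is considered acceptable next to missing a real person.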
