Tesla Autopilot drove into Wile E. Coyote-style fake road wall

larrydahooster

It has long been a known issue that Teslas can crash into objects without the help of LIDAR. I’m amazed they still have not implemented it.
https://www.truckinginfo.com/135780/white-trailer-proved-invisible-to-teslas-autonomous-system

This was a [Mark Rober](https://youtu.be/IQJL3htsDyQ?si=qek_g2xgjt_rBAQn) video on YouTube

The Tesla murdered the child way too many times. The cheap camera systems Tesla uses are not safe.

Okay – but which were they testing, the camera or the lidar? Because that matters in interpreting the test results! xD

EDIT: Thanks, it’s been explained. 🙂

90% of non-self-driving cars would also fall for it.

Paid for by ACME Corp.

I really don’t know why they insist on not using LIDAR. Yeah, Leon, we do the same thing with our brains without LIDAR but we have, you know, brains. A car does not. It just seems like an overly-idealistic constraint.

is this a safe place to admit I would have made the exact same mistake?

A perfect example of what Elmo is doing to the country.

I think it should be fixed, obviously, but come on: what are the odds that somebody puts a wall across the road and paints it to look like the road continues? That would be a murder sentence.

Someone remind me why Tesla stopped using LiDAR some time ago?

[comment image]

Looks ultra-realistic. The hole in the wall, and whatever is behind the hole, even though there's nothing like that to be seen on either side.

Did they also test a human?

This is literally a moment in Goldfinger

Now test a human, because I'm very interested in the result.


This is from a Mark Rober video
https://youtu.be/IQJL3htsDyQ?si=T5ax6_K_4jlU9Xv1

Guys, I have a great idea.

With that kind of picture, even a human driver would have been fooled.

So for anyone curious, as funny as this is, it's less a problem of people painting fake road walls than of mirrored surfaces and adverse weather.

You, a human, know what the concept of a reflection is (I hope). When you pass a glossy car, you don’t assume the reflections you see in the car door are behind the door. 

Machine vision has no such sense of context. It sees something, and it thinks it's there.

Lidar works differently. Lidar measures how much light is reflected and how long it takes to come back. The laser will hit the glossy surface and return some light from the car door and some from the reflection "through" the door (or, in the test above, from the wall instead of the painted image of a road), but the systems prefer the "first return" bounce for safety. Whatever bounces back quicker is real.

When you think about how many glossy, mirrored, or shiny surfaces or puddles are out there on the road, you realize how dangerous cameras alone really are.
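
To make the "first return" idea concrete, here's a minimal sketch in plain Python (not any vendor's actual code; the Echo type, field names, and intensity threshold are made up for illustration) of keeping the earliest sufficiently strong echo from a pulse that produced several returns:

```python
# Hypothetical sketch: a lidar pulse hits a glossy car door and produces two
# echoes: one off the door itself and a later, weaker one from the object
# whose reflection appears "in" the door. Obstacle detection keeps the
# earliest strong echo, i.e. the nearest real surface.
from dataclasses import dataclass

SPEED_OF_LIGHT = 299_792_458.0  # m/s


@dataclass
class Echo:
    time_of_flight_s: float  # round-trip time for this echo
    intensity: float         # fraction of emitted light that came back


def first_return_distance(echoes: list[Echo], min_intensity: float = 0.05) -> float | None:
    """Range (metres) of the earliest echo strong enough to trust.

    Later echoes (light that travelled further, e.g. via a mirror-like
    bounce) are ignored for obstacle detection.
    """
    strong = [e for e in echoes if e.intensity >= min_intensity]
    if not strong:
        return None  # nothing usable came back (fog, absorption, etc.)
    first = min(strong, key=lambda e: e.time_of_flight_s)
    return first.time_of_flight_s * SPEED_OF_LIGHT / 2.0  # round trip -> one way


# Example: door surface at ~3 m, reflected object at an apparent ~20 m.
echoes = [
    Echo(time_of_flight_s=2 * 3.0 / SPEED_OF_LIGHT, intensity=0.6),
    Echo(time_of_flight_s=2 * 20.0 / SPEED_OF_LIGHT, intensity=0.15),
]
print(first_return_distance(echoes))  # ~3.0, the real surface
```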

——–

I think it would be extremely dangerous and probably illegal to paint a stop sign or small child on the back of your car. It’s really important you don’t do that. That wouldn’t be funny.

My wife has a Tesla with self-driving. There are a few places where it gets weird.
It will take a left turn on a red light at a specific intersection. Why? Because there’s a green OPEN sign in a building about 50 feet behind the left turn light. The sign flashes on and off. So the car thinks the light has turned green.

Meep meep!

Okay I ain’t tryna defend teslas subpar system, but I can’t imagine when this would apply in regular situations

It is hilarious though.
