Mark Rober just set up one of the most interesting self-driving tests of 2025, and he did it by imitating Looney Tunes. The former NASA engineer and current YouTube mad scientist recreated the classic gag where Wile E. Coyote paints a tunnel onto a wall to fool the Road Runner.
Only this time, the test subject wasn’t a cartoon bird… it was a self-driving Tesla Model Y.
The result? A full-speed, 40 MPH impact straight into the wall. Watch the video and tell us what you think!
I seem to recall that fElon prevented the self driving team from utilizing LIDAR for any part of the system, instead demanding that everything run off of optical input. Does anyone else remember the same?
Yes, I recall experts at the time saying it was a terrible mistake, and Elon saying machine learning would bridge the gap.
The real reason was to increase margins.
I remember there being claims from him or his team about lidar being a dead end that would not scale as well as computer vision.
I believe he claimed that since humans use their vision to drive that computer vision was more than enough.
I don’t know about you, but I also rely on sounds & feel when I drive. I also know that the human eye has evolved to detect motion, filter out extraneous information, and send just the important bits to the brain so that it doesn’t get overloaded with everything the eye sees. Computer vision is the exact opposite of that: it has to process every bit of every image the camera sees.
I also know of many times my vision fails. Driving into a sunrise for example
I don’t know about you, but I also rely on sounds & feel when I drive.
Of course. When I feel myself driving into a wall, I stop immediately.
You must connect with the road, every km or so stop and hug the asphalt.
since humans use their vision to drive that computer vision was more than enough
Surprised he didn’t swap out the wheels with legs while he was at it
o p t i m u s
Yep! That’s what I’m thinking of. It was Elmo. The real engineers objected.
Was just thinking this
A single LiDAR sensor prevents this kind of issue
I’m trying to find an article that covers what I remember but I know for sure that it’s been a good while since I saw the info I recall. Hopefully I can dig something up.
That might actually be the exact article I’m thinking about. Thanks!
Even RADAR prevents this and the cars had RADAR! They started disabling RADAR for the older cars since the new ones don’t even have the hardware installed.
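To make the point concrete, here’s a toy sketch with my own made-up numbers (this is not Tesla’s or anyone’s actual stack): a ranging sensor measures a hard return at a real distance no matter what the surface looks like, so a simple fusion rule lets range data veto a camera that says “road ahead”.

```python
# Toy fusion sketch, assumed numbers throughout: a lidar/radar return
# inside stopping distance overrides a camera that thinks the road is clear.

def braking_distance_m(speed_mps: float, decel_mps2: float = 7.0) -> float:
    """Distance needed to stop from speed_mps at an assumed deceleration."""
    return speed_mps ** 2 / (2 * decel_mps2)

def should_brake(camera_says_clear: bool, lidar_range_m: float,
                 speed_mps: float, margin: float = 1.5) -> bool:
    """Brake if the camera flags an obstacle, OR if lidar reports a return
    inside the (margin-padded) stopping distance despite the camera."""
    if not camera_says_clear:
        return True
    return lidar_range_m < margin * braking_distance_m(speed_mps)

# 40 mph is ~17.9 m/s; a painted wall still shows up as a ~30 m lidar return,
# even though the camera classifier is happily reporting open road
print(should_brake(camera_says_clear=True, lidar_range_m=30.0, speed_mps=17.9))
```

The camera never has to “understand” the illusion; the range return alone is enough to trigger braking.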
What’s cool is, Teslas used to have radar sensors at least, but Elon removed them from production to save money. Even if you have a car from back then, the software no longer uses them, and they’ll just physically unplug them the next time you have the car serviced, as it’s just a drain on the battery at this point 🙃
meanwhile our subaru has lidar for adaptive cruise control and emergency braking
I didn’t realize EyeSight had different versions, on the Solterra it looks like it is indeed LIDAR.
My Crosstrek has the older dual camera setup for depth perception, it would not be fooled by a picture of a road on a wall… I’m surprised the Teslas are.
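For anyone curious why two cameras help: stereo depth is just Z = f·B/disparity, so a painted road on a flat wall produces the same disparity everywhere instead of receding into the distance. A quick sketch with made-up camera numbers (not EyeSight’s actual internals):

```python
# Illustrative stereo triangulation with assumed camera parameters:
# depth Z = f * B / disparity, where f is focal length in pixels and
# B is the baseline between the two cameras in meters.

F_PX = 800.0       # focal length in pixels (assumed)
BASELINE_M = 0.35  # spacing between the two cameras in meters (assumed)

def depth_m(disparity_px: float) -> float:
    """Triangulated depth for a matched feature with the given disparity."""
    return F_PX * BASELINE_M / disparity_px

# Real road: farther features have smaller disparity, hence larger depth
print(depth_m(28.0))  # ~10 m
print(depth_m(5.6))   # ~50 m

# Painted wall 10 m away: the feature painted to look 50 m away still
# produces the near disparity, so stereo reports a flat obstacle at ~10 m
print(depth_m(28.0))
```

Every “distant” feature in the painting reads as sitting at the wall’s actual range, which is exactly the signature of an obstacle.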
Yikes.
they’ll just physically unplug them the next time you have the car serviced
So, (looks at watch), in an hour?
So did they unplug it
Yes. He took too much inspiration from Stanford University’s “Stanley” winning the DARPA Grand Challenge in 2005. This was an early competition to build viable autonomous vehicles. Most of them looked like tanks covered in radar dishes, but Stanford wound up taking home the gold with just an SUV with cameras on it.
It was an impressive achievement in computer vision, and the LiDAR-encrusted vehicles wound up looking like over-complex dinosaurs. There’s a great documentary about it narrated by John Lithgow (who, throughout it, pronounces the word robot as “ro-butt”). Elon watched it, made up his mind, and like a moron, hasn’t changed it in 20 years. I’m almost Musk’s age so I know how the years speed up as we go on. He probably thinks about the Stanford win as something that happened relatively recently. Especially with his mind on - ahem - other things, he’s not keeping up with recent developments out in the real world.
Rober just made Musk look like the absolute tool he is. And I’m a little worried that we may see people out there staging real world versions of this somehow with actual dangerous obstacles, not a cartoonish foam wall.
I did low-key get the squiggles before writing the article. I thought, from an ethical hacking disclosure-type perspective, that this info might cause folks to… well, ya know, paint tunnels on walls.
Then I looked, the cat was already out of the bag, the video had something like 5 million views on it in the 4 hours it took me to draft the article. So I shared it, but I definitely did have that thought cross my mind. I am also a little worried on that score.
Iirc they were using a combination of lidar and radar, but Elmo wanted to cut costs.
Funny thing is, the price of lidar is dropping like a stone; they are projected to be sub-$200 per unit soon. The technical consensus seems to be settling in on 2 or 3 lidars per car plus optical sensors, and Chinese EV brands are starting to provide self driving in baseline models, with lidars as part of the standard package.
Cameras and radar, I believe. Never lidar.
Did he want to cut costs or did he want a network of cameras at his control all over the world?
Yes.
Ah okay. I was genuinely curious if I was remembering correctly, because I definitely know it’s been a while since I’d read anything on the subject.
Came here to actually write this. Everyone remembers that. He made Tesler the hated shit it is today.
As a space nut I seriously hope that he never gets a chance to do anything similar with SpaceX. Thankfully he’s mostly been kept away from important things thus far.
Don’t get me wrong, I know SpaceX’s closet is overflowing with skeletons. But since Congress has been so kind as to continuously cut NASA’s budget for the last few decades, I have to rely on SpaceX and other private companies to keep our space endeavors going.
I’m (was) a huge SpaceX nerd, but over the last year or so, less and less. He always was a dumb narcissist asshole, but now I can’t take it anymore. Also, the idea that we’ve fucked up this planet and need to move somewhere else, finishing this planet off with thousands of launches, has always made me sick. If someone took him out, I’d probably go back to liking the company.
To the famously already fucked planet Mars, no less.
Is that just to cover his ass cuz he was promising backwards-compatible FSD for models that don’t have LIDAR?
The actual wall is way more convincing though.
As much as I want to hate on Tesla, seeing this, it hardly seems like a fair test.
From the perspective of the car, it’s almost perfectly lined up with the background. It’s a very realistic painting, and any AI trained on image data would obviously struggle with this. AI doesn’t have the human component that allows us to infer information from context. We can see the borders and know that they don’t fit. They shouldn’t be there, so even if the painting is perfectly lined up and looks photorealistic, we can tell something is up because it’s got edges and a frame holding it up.
This test, in the context of the title of this article, relies on a fairly dumb pretense that:
- Computers think like humans
- This is a realistic situation that a human driver would find themselves in (or that realistic paintings of very specific roads exist in nature)
- There is no chance this could be trained out of them. (If it mattered enough to do so)
This doesn’t just affect Teslas. This affects any car that uses AI assistance for driving.
Having said all that… fuck elon musk and fuck his stupid cars.
This doesn’t just affect Teslas. This affects any car that uses AI assistance for driving.
Except for, you know… cars that don’t solely rely on optical input and have LiDAR for example
Fair point. But it doesn’t address the other things i said, really.
But I suppose, based on already getting downvoted, that I’ve got a bad take. Either that, or the people downvoting me don’t understand that I can hate Tesla and Elon, think their cars are shit, and still see that tests like this can be nuanced. The attitude that paints with a broad brush is the type of attitude that got Trump elected…
No, it’s just a bad take. Every other manufacturer of self driving vehicles (even partial self driving, like automatic braking) uses LiDAR because it solves a whole host of problems like this. Only Tesla doesn’t, because Elon thinks he’s a big brain genius. There have been plenty of real world accidents with less cartoonish circumstances involving Teslas that also would have been avoided if they just had LiDAR sensors. Mark just chose an especially flashy way to illustrate the problem. Sometimes flashy is the best way to get a point across.
based on already getting downvoted
In this case, yes, but in general, downvotes just mean your take is unpopular. The downvotes could be from people who don’t like Tesla and see any defense of Tesla as worthy of downvotes.
So good on you for making the point that you believe in. It’s good to try to understand why something you wrote was downvoted instead of just knee-jerk assuming that it’s because it’s a “bad take.”
I agree the wall is convincing and that it’s not surprising that the Tesla didn’t detect it, but I think where your comment rubs people the wrong way is that you seem to be letting Tesla off the hook for making a choice to use the wrong technology.
I think you and the article/video agree on the point that any car based only on images will struggle with this but the conclusion you drew is that it’s an unfair test while the conclusion should be that NO car should rely only on images.
Is this situation likely to happen in the real world? No. But that doesn’t make the test unfair to Tesla. This was an intentional choice they made and it’s absolutely fair to call them on dangers of that choice.
That’s fair.
I didn’t intend to give Tesla a pass. I hoped that qualifying what I said with a “fuck Tesla and fuck Elon” would show that.
But i didn’t think about it that way.
In my defense, my point was more “what did you expect?”: the test was designed to show that a system not built to perform a specific function can’t perform that specific function.
We know that self driving is bullshit, especially the tesla brand of it. So what is Mark’s test and video really doing?
But on reflection, I guess there are still a lot of people out there that don’t know this stuff, so at the very least, a popular channel like his will go a long way toward raising awareness of this sort of flaw.
I agree that this just isn’t a realistic problem, and that there are way more problems with Teslas that are much more realistic.
Tell that to the guy who lost his head when his Tesla thought a reflective semi truck was the sky
Well that seems like a realistic problem. Not picture of tunnel
I am fairly dumb. Like, I am both dumb and I am fair-handed.
But, I am not pretentious!
So, let’s talk about your points and the title. You said I had fairly dumb pretenses, let’s talk through those.
- The title of the article… there is no obvious reason to think that I think computers think like humans, certainly not from that headline. Why do you think that?
- There are absolutely realistic situations exactly like this, not a pretense. Don’t think Looney Tunes. Think 18-wheeler with a realistic photo of a highway depicted on the side, or a billboard with the same. There’s an academic article, linked in my article, where three PhD-holding engineering types discuss the issue at length. This is accepted by peer-reviewed science and has been for years.
- Yes, I agree. That’s not a pretense, that’s just… a factually correct observation. You can’t train an AI to avoid optical illusions if its only sensor input is optical. That’s why the Tesla choice to skip LiDAR and remove radar is a terminal case of the stupids. They’ve invested in a dead-end sensor suite, as evidenced by their earning the title of Most Lethal Car Brand on the Road.
This does just impact Teslas, because they do not use LiDAR. To my knowledge, they are the only popular ADAS in the American market that would be fooled by a test like this.
Near as I can tell, you’re basically wrong point by point here.
Excuse me.
-
Did you write the article? I genuinely wasn’t aiming my comment at you. It was merely commentary on the context implied by the title. I just watched a clip of the car hitting the board. I didn’t read the article, so I specified that I was referring to the article title. Not the author, not the article itself. Because it’s the title that I was commenting on.
-
That wasn’t an 18-wheeler, it was a ground-level board with a photorealistic picture that matched the background it was set up against. It wasn’t a mural on a wall, or some other illusion with completely different properties. So no, I think this extremely specific setup for this test is unrealistic and is not comparable to actual scientific research, which I don’t dispute. I don’t dispute the fact that the lack of LiDAR is why Teslas have this issue, and that an autonomous driving system with only one type of sensor is a bad one. Again: I said I hate Elon and Tesla. Always have.
All I was saying is that this test, which is designed in a very specific way and produces a very specific result, is pointless. It’s like me getting a bucket with a hole in it and hypothesising that if I pour in water, it will leak out of the hole, and then proving that and saying, look! A bucket with a hole in it leaks water…
Y’all excused, don’t sweat it! I sure did write the article you did not read. No worries, reading bores me sometimes, too.
Your take is one of the sillier opinions that I’ve come across in a minute. I won’t waste any more time explaining it to you than that. The test does not strike informed individuals as pointless.
-
Still, this is something the car ought to take into account. What if there’s glass in the way?
Yes, I think a human driver who isn’t half asleep would notice that something is weird, and would at least slow down.
Glass would be very interesting, might actually confuse lidar also.
A camera will show it as being more convincing than it is. It would be way more obvious in real life when seen with two eyes. These kinds of murals are only convincing from one specific point.
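You can put rough numbers on that. A mural painted for one eyepoint breaks down as soon as the viewpoint shifts, because paint shifts like the wall it’s on, not like the distant scene it depicts. A back-of-envelope sketch with assumed distances:

```python
import math

# Back-of-envelope parallax, assumed distances throughout: moving the
# viewpoint sideways by dx shifts a real object at distance Z by about
# atan(dx / Z). Paint on a wall shifts with the wall, not with the far
# scene it depicts, so two viewpoints disagree about the illusion.

def angular_shift_rad(dx_m: float, distance_m: float) -> float:
    """Apparent angular shift of a point when the viewer moves dx_m sideways."""
    return math.atan2(dx_m, distance_m)

DX = 0.065  # roughly a human interpupillary distance, in meters (assumed)

wall = angular_shift_rad(DX, 10.0)    # painted "tunnel" on a wall 10 m away
scene = angular_shift_rad(DX, 100.0)  # where the depicted road would really be

# The paint shifts about ten times more than the real distant road would,
# which is why two eyes (or two stereo cameras) see through the illusion
print(wall / scene)
```

That disagreement between the two viewpoints is exactly what a single forward camera never gets to see.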
That’s true, but it’s still way more understandable that a car without lidar would be fooled by it. And there is no way you would ever come into such a situation, whereas the image in the thumbnail could actually happen. That’s why it’s so misleading; can people not see that?
I absolutely hate Elon Musk and support the boycott of Tesla and Starlink, but this is a bit too misleading even with that in mind.

So, your comment got me thinking… surely, in a big country like the US of A, this mural must actually exist already, right?
Of course it does. It is an art piece in Columbia, S.C: https://img.atlasobscura.com/90srIbBi-XX-H9u6i_RykKIinRXlpclCHtk-QPSHixk/rt:fit/w:1200/q:80/sm:1/scp:1/ar:1/aHR0cHM6Ly9hdGxh/cy1kZXYuczMuYW1h/em9uYXdzLmNvbS91/cGxvYWRzL3BsYWNl/X2ltYWdlcy85ZTUw/M2ZkZDAxZjVhN2Rm/NmVfOTIyNjQ4NjQ0/OF80YWVhNzFkZjY0/X3ouanBn.webp
A full article about it: https://www.atlasobscura.com/places/tunnelvision
How would Tesla FSD react to Tunnelvision, I wonder? How would Tesla FSD react to an overturned semi truck with a realistic depiction of a highway on it? JK, Tesla FSD crashes directly into overturned semis even without the image depiction issue.
I don’t think the test is misleading. It’s puffed up for entertainment purposes, but in being puffed up, it draws attention to an important drawback of optical-only self-driving cars, which is otherwise a difficult and arcane topic to draw everyday people’s attention to.
Good find, I must say I’m surprised that’s legal, but it’s probably more obvious in reality, and it has the sun which is probably also pretty obvious to a human.
But it might fool the Tesla?

Regarding the semi video: WTF?
But I’ve said for years that Tesla cars aren’t safe for roads. And that’s not just the FSD, they are inherently unsafe in many really really stupid ways.
Blinker buttons on the steering wheel. Hidden emergency door handles. Emergency braking for no reason. A distracting screen interface. In Denmark, 30% of Tesla Model 3s fail their first four-year safety check.
There have been stats publicized claiming they aren’t worse than other cars, when in fact the “other cars” were an average of 10 years older. The newer cars obviously ought to be safer, because they should be in better condition.
I’m so glad I wasn’t the only person who immediately thought “This is some Wile E. Coyote shit.”
I mean, it is also referenced in the article and even in the summary from OP.
And extensively in the video too.
Is this video being suppressed by the YouTube algorithm? I wonder if it’s because of Tesla or Disney. Or maybe it’s because of simulated child harm?
New stuff to add to the car kit bag for the 21st century
- poster board to block sonic weapons
- black paint, white paint, roller, brush to paint tunnels on walls
- orange cones to pen in self driving cars
And a “Yikes!” sign 🤣
Apparently they keep getting tickets in China because they didn’t bother to adjust the settings to accommodate Chinese roads and traffic laws. Result is Tesla is getting utterly crushed by BYD in their one major market that doesn’t care about Elon’s antics.
Huh, now I’m mildly interested in the differences in traffic laws in China vs US vs Europe that lead to Teslas getting more tickets in China than elsewhere.
I found this article. My takeaways were:
- No driving in bus lanes during certain times of day.
- No using the shoulder as a turn lane.
- No using a bike lane as a turn lane.
Wow
(Basedbasedbasedbasedbasedbasedbased)
This post brought to you by American car centrism
Not sure what “American car centrism” has to do with Chinese traffic regulations tbh
I think their regs, while seemingly very basic rules of the road, are based because I live in the US and we have bike lanes here that just whole ass turn into turn lanes with almost no warning. I wish we could get basic decency for everyone on the road, too.
There’s a very simple solution to this. BUILD MORE TRAINS!
Meep meep
A genius :-)
Well, I guess we know how to defeat Teslas.
By not purchasing one.
They’ll just go out of business eventually.
This is why it’s fucking stupid Tesla removed Lidar sensors and relies on cameras only.
But also who would want a tesla, fuck em
They also removed radar, which is what allowed them to make all of those “it saw something three vehicles ahead and braked to avoid a pileup that hadn’t even started yet” videos. Removing radar was the single most impactful change Tesla made in regards to FSD, and it’s all because Musk basically decided “people drive fine with just their eyes, so cars should too.”
I was horrified when I learned that the autopilot relies entirely on cameras. Nope, nope, nope.
Leon said other sensors were unnecessary because human driving is all done through the sense of sight…proving that he has no idea how humans work either (despite purportedly being a human).
They never had lidar. They used to have radar and ultrasonic sensors, but they decided “vision” was good enough. This conveniently occurred when they had supply chain issues during COVID.
There’s a very simple solution to autonomous driving vehicles plowing into walls, cars, or people:
Congress will pass a law that makes NOBODY liable – as long as a human wasn’t involved in the decision making process during the incident.
This will be backed by car makers, software providers, and insurance companies, who will lobby hard for it. After all, no SINGLE person or company made the decision to swerve into oncoming traffic. Surely they can’t be held liable. 🤷🏻♂️
Once that happens, Level 4 driving will come standard and likely be the default mode on most cars. Best of luck everyone else!
You can’t sue me for my driverless car that drops caltrops, forcing everyone else to take the train.
I’ve said for a while that you could shut down an entire city with just a few buddies and like $200 in drywall screws. Have each friend drive on a different highway (or in a different direction on each highway) and sprinkle drywall screws as they go. Not just like a single dump, but a good consistent scatter so the entire highway is a minefield and takes hours to properly sweep up.
These same laws will be grandfathered in for AI. That way, your health insurance company’s computer can murder you by denying care, and nobody can be held civilly or criminally liable.
Beep beep! Damn things are using ACME LiDAR!
Actually, Elon demanded that lidar be deprecated years ago because of phantom braking; they only use visible-spectrum cameras now.
Show someone footage of 9/11 and they’ll think of 2001. Show someone footage of a burning or crashing Tesla 20 years from now and they’ll think of 2025.
They obviously pre-cut the wall, probably for safety reasons, and they were like, let’s make it a silly cartoon impact hole while we’re at it.
Good job.
You think you’re reliably going to notice this after a hundred miles of driving? (X) doubt.
Easy
Dude.
You could at least look at what you’re replying to before jumping in and going full outrage mode. I didn’t even say anything about what I thought of the validity of that experiment.
Keep putting yourself forward to defend the poor, misjudged car company belonging to a crazy asshole.
👌👍
Username checks out.
So it’s Road Runner rules in play here.