Self-driving cars are supposed to make roads safer and driving easier. But a recent head-to-head test featuring Tesla Autopilot has raised more concerns than confidence. In a widely shared video by engineering YouTuber Mark Rober, a Tesla Model Y—relying solely on cameras—drove straight into a fake road wall, Looney Tunes-style, while a lidar-equipped vehicle easily avoided the trap.
Let’s break down what happened, what it means for Tesla’s self-driving strategy, what regulators are saying, and how it compares to other similar tests making the rounds on YouTube.
The Problem: Tesla’s Camera-Only Autopilot
Most autonomous vehicle developers, including Waymo and Cruise, use a combination of cameras, radar, lidar (light detection and ranging), and ultrasonic sensors to “see” the world around them. Tesla has gone against the grain.
Starting in 2021, Tesla removed radar from new vehicles and began deactivating it in older ones via software updates. The company’s Full Self-Driving (FSD) beta and Autopilot now rely exclusively on a vision-based system powered by neural networks.
Elon Musk argues that this mimics human driving better and is more scalable. In theory, once the AI gets good enough, it can drive anywhere, under any condition, like a person. In practice? Not quite.
The Viral Test: Tesla Autopilot vs. Lidar
In Mark Rober’s test, the Tesla Model Y was pitted against a lidar-equipped vehicle in several conditions:
- Child Mannequin in the Road (Daylight)
  ✅ Tesla stopped.
  ✅ Lidar car stopped.
- Child Mannequin in the Road (Blinded by Lights)
  ✅ Tesla stopped.
  ✅ Lidar car stopped.
- Child Mannequin in Fog
  ❌ Tesla failed to stop.
  ✅ Lidar car stopped.
- Child Mannequin in Heavy Rain
  ❌ Tesla failed to stop.
  ✅ Lidar car stopped.
- Wile E. Coyote-Style Fake Road Wall
  ❌ Tesla drove straight into the wall.
  ✅ Lidar car identified it as a solid object and stopped.
These tests highlight a critical flaw in camera-only systems: they can be tricked by visual illusions and struggle in poor visibility. Lidar, using lasers to map the environment, doesn’t care what something looks like—it only cares what’s physically there.
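That difference can be illustrated with a toy sketch (not real autonomous-vehicle code, and not how Tesla or any lidar vendor actually implements braking). A lidar measures distance directly, so a flat painted "fake road" still produces close-range returns, regardless of what the surface looks like; the threshold and distances below are hypothetical values for illustration only.

```python
# Toy illustration of why a ranging sensor is hard to fool with a
# painted wall: it measures distance, not appearance.

BRAKING_DISTANCE_M = 30.0  # hypothetical stopping threshold


def lidar_sees_obstacle(point_distances_m):
    """Return True if any lidar return falls within braking distance."""
    return any(d < BRAKING_DISTANCE_M for d in point_distances_m)


# A painted wall ~20 m ahead: a camera may classify it as "open road",
# but every laser return comes back at roughly 20 m.
fake_wall_returns = [20.1, 19.8, 20.3, 20.0]
print(lidar_sees_obstacle(fake_wall_returns))  # True -> brake

# Genuinely open road: all returns are far away.
open_road_returns = [120.0, 150.0, 200.0]
print(lidar_sees_obstacle(open_road_returns))  # False -> continue
```

A camera-based system, by contrast, must infer depth and object identity from pixels, which is exactly where a convincing illusion or poor visibility can break the inference.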
Similar Tests on YouTube
Rober isn’t the only one raising questions about Tesla’s approach. Here are other creators and researchers putting Tesla Autopilot to the test:
- “Tesla FSD Beta vs. Foggy Roads” – AI Addict
  Showed erratic behavior and missed road lines when visibility was low.
- “Tesla Full Self Driving Can’t Handle Fake Humans” – Consumer Reports
  Demonstrated Tesla vehicles failing to identify pedestrian dummies under certain conditions.
- “Waymo vs. Tesla FSD: Urban Challenge” – Tech Vision
  Compared Tesla’s vision system against Waymo’s lidar-based platform in complex city traffic; Tesla hesitated or disengaged in several scenarios where Waymo sailed through smoothly.
What Regulators Found: Safety Concerns About Tesla’s Autopilot
U.S. regulators, particularly the National Highway Traffic Safety Administration (NHTSA), have been scrutinizing Tesla’s driver assistance systems for years. As of 2024, NHTSA has:
- Linked Tesla’s Autopilot to dozens of crashes, some fatal.
- Opened multiple investigations into phantom braking, failure to detect stationary vehicles, and driver misuse of Autopilot features.
- Pressed Tesla to issue over-the-air updates to improve Autopilot’s behavior in edge cases.
In December 2023, NHTSA concluded that Tesla’s user interface design encouraged driver inattention, especially during Autopilot operation. While Tesla made interface adjustments, the system remains classified as a Level 2 advanced driver-assistance system (ADAS)—a far cry from the Level 5 “fully autonomous” promise.
Understanding the Levels of Driving Automation
- Level 2 (Tesla today): Driver must monitor the system and be ready to take control at any time.
- Level 3–4: The system handles most or all driving within defined conditions; at Level 3 a human must still take over when requested, while Level 4 needs no human fallback but is limited to a specific operational domain.
- Level 5: Fully autonomous in all conditions, no human input required at any point.
Elon Musk has frequently said Tesla will achieve Level 5 autonomy “soon.” That claim is increasingly viewed with skepticism given the current limitations of its vision-only strategy.
Final Thoughts: Is Vision-Only Enough?
Tesla’s strategy is bold, but this test shows it’s still flawed. While cameras and neural networks might work in ideal conditions, real-world driving is full of rain, fog, bad lighting, and unpredictable obstacles. Lidar and radar offer redundancy and precision that cameras alone simply can’t match.
Until Tesla addresses these gaps—or proves otherwise with real-world data—its cars won’t be anywhere near full autonomy.
Key Takeaways
- Tesla’s Autopilot failed several real-world vision tests that lidar systems passed.
- Camera-only systems struggle in poor visibility and with visual deception.
- Regulators are closely monitoring Tesla due to safety concerns and incomplete autonomy promises.
- Multiple independent tests suggest Tesla is far from Level 5 autonomy.
Want to Learn More?
Check out these recommended YouTube tests for deeper insight:
- The Independent – Tesla and the Fake Child Test
- AI Addict – FSD Beta in Bad Weather
- Tech Vision – Tesla vs. Waymo Urban Driving Challenge