There’s Hidden Tesla Autopilot Data That Reveal Why They Crash & No, They’re Not Sharing

A Wall Street Journal report on hidden Tesla Autopilot data has revealed an inherent flaw in the company's camera-based Autopilot "technology".
In its investigation of 222 crashes involving the Tesla Autopilot feature, the WSJ managed to uncover data that the company does not want people to see.
Even after a crash, the company does not release the crash data that is crucial for investigators to determine where human error ends and machine error begins.
Since Tesla is a little shy with its records, crash data had to be physically extracted from the on-board computers by hackers, and the findings are damning.
Tesla’s Camera-based autopilot is heavily to blame
While the findings aren’t conclusive, as the WSJ was able to precisely reconstruct only about 100 of the crashes, the computer data points to an inherent flaw in the camera-based Autopilot system.
Conventional car safety systems use Frequency Modulated Continuous Wave (FMCW) radar, which measures both the distance and the velocity of moving objects from its reflected signals. On a Tesla, by contrast, the on-board computer uses its cameras like eyes to “see” the road and identify objects.
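To see why radar is so robust, here is a minimal sketch of the FMCW principle mentioned above: range comes from the beat frequency of a chirp, and radial velocity from the Doppler shift. The function names and the 77 GHz automotive parameters are illustrative assumptions, not taken from any particular sensor.

```python
# Illustrative sketch of FMCW radar math (not any real sensor's API).
C = 3e8  # speed of light, m/s

def fmcw_range(beat_freq_hz, chirp_duration_s, bandwidth_hz):
    """Range from a chirp's beat frequency: R = c * f_b * T / (2 * B)."""
    return C * beat_freq_hz * chirp_duration_s / (2 * bandwidth_hz)

def fmcw_velocity(doppler_shift_hz, carrier_freq_hz):
    """Radial velocity from the Doppler shift: v = c * f_d / (2 * f_c)."""
    return C * doppler_shift_hz / (2 * carrier_freq_hz)

# Assumed example: 77 GHz automotive radar, 4 GHz bandwidth, 40 µs chirp.
r = fmcw_range(beat_freq_hz=2e6, chirp_duration_s=40e-6, bandwidth_hz=4e9)
v = fmcw_velocity(doppler_shift_hz=5133, carrier_freq_hz=77e9)
print(r, round(v, 1))  # about 3.0 m away, closing at about 10 m/s
```

The point of the sketch: both quantities fall out of direct physical measurements of the reflected signal, with no need to recognize *what* the object is, which is exactly where a camera-only system can fail.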
Unlike an AI that learns continuously from new data, the on-board computer in a Tesla depends on its manufacturer to push updates, and so it remains “blind” to objects on the road it has not been trained to recognize.
Several instances in the investigation show that when an unfamiliar object is on the road, especially at night, the car cannot understand what it is looking at and therefore cannot react.
In one instance, over-reliance on Autopilot cost a driver his life when the system failed to identify an overturned truck in the middle of the highway.

The overturned truck is clearly visible above, but the on-board computer failed to understand what it was looking at and did not brake in time.
In another crash, one camera picked up an object on the road, but the seven other cameras could not work out what the object was.
This data did not come from Tesla, of course; the company has adamantly opposed sharing it. Now Trump is on Tesla’s side, proposing to drop the car crash reporting requirement altogether.
Elon did sink a quarter of a billion dollars into the Don’s reelection campaign, after all.
Misinformation about the Autopilot system
Elon Musk has been parading his “autopilot” system since as early as 2016 and has been adamant that his cameras are far more advanced than systems like radar, or even the more sophisticated LiDAR.
His confidence and charisma have given his fans a false sense of security, and despite Tesla’s own website telling drivers they need to pay attention, countless influencers share videos of themselves sleeping behind the wheel, flagrantly showing off this flawed technology.
Perhaps more than over-promising and under-delivering on an all-electric green future, Tesla’s biggest reckoning is coming in the form of its underbaked driverless technology.