
Tesla Staged a Self-Driving Video In 2016, According to Engineer’s Testimony

The testimony may be used in a lawsuit against Tesla.


Back in 2016, Tesla published a video promoting the “Full Self-Driving” capabilities of its vehicles. But in recent testimony, Tesla’s director of Autopilot software claimed that the promotional video was staged. Tesla simply preprogrammed a route for the car, and human drivers intervened whenever things got dicey.

As reported by Reuters, this testimony was made by Ashok Elluswamy during a July deposition for a lawsuit against Tesla. The lawsuit stems from a 2018 car crash—a Model X steered into a traffic barrier while Autopilot was enabled, killing the driver.

We aren’t legal experts, and we don’t know how this testimony will be used in court. That said, the fact that Tesla faked this 2016 self-driving video is alarming. Tesla misled the general public by advertising Autopilot functionality that did not exist. The reality of this promotional video was kept under wraps for half a decade, and we hope that it didn’t influence public policy or autonomous vehicle regulation.

According to Elluswamy’s testimony, Tesla vehicles were incapable of stopping at red lights when this video was created. He also stated that “the intent of this video was not to accurately portray what was available for customers in 2016.” That much is accurate, and Tesla’s website explicitly states that “Enhanced Autopilot and Full Self-Driving features require active driver supervision and do not make the vehicle autonomous.”

But Tesla itself did not treat this video as a mere proof of concept. In fact, Elon Musk shared the video on Twitter as evidence that “Tesla drives itself.” He even gushed that the Tesla automatically recognized and avoided a handicapped parking space. (Musk has a long history of overstating Full Self-Driving’s capabilities. He claimed that it would perform “at a safety level well above that of an average driver” by the end of 2021, for example.)

In response to the 2018 crash, Tesla published a blog post claiming that the driver “had received several visual and one audible hands-on warning earlier in the drive and the driver’s hands were not detected on the wheel for six seconds prior to the collision.” The carmaker acknowledged that “Tesla Autopilot does not prevent all accidents,” while dissuading lawmakers from cracking down on self-driving systems on the grounds that Autopilot “unequivocally makes the world safer for the vehicle occupants.”

Source: Reuters

Andrew Heinzman
Andrew is the News Editor for Review Geek, where he covers breaking stories and manages the news team. He joined Life Savvy Media as a freelance writer in 2018 and has experience in a number of topics, including mobile hardware, audio, and IoT.