Tesla Under Investigation Due to Recent Auto Accidents

Elon Musk's Tesla is a pacesetter in auto manufacturing, and its vehicles have made big waves in the automobile industry. However, recent developments have raised concerns about the technology Tesla has adopted and advanced. Federal agencies in the US recently launched an investigation into Tesla's cars after reports of several crashes. These accidents occurred while the Autopilot driver-assistance system, which includes features such as Traffic-Aware Cruise Control, was engaged.

The National Highway Traffic Safety Administration (NHTSA) announced the investigation on August 16, 2021. It covers an estimated 765,000 Teslas spanning the Model 3, Model S, Model Y, and Model X lines. The crashes under investigation occurred between January 22, 2018, and July 10, 2021, mostly at night. The investigation will examine the entire functioning of the Autopilot system: how it monitors the driver's attentiveness, how it detects and responds to objects and events around it, and any other factors that contributed to the accidents.

Tesla’s History of Accidents

Since 2018, there have been 11 recorded crashes involving Teslas with Autopilot engaged, resulting in 17 injuries and 1 death. In some of these crashes, the Tesla hit parked emergency vehicles even though they were marked with safety measures such as road cones, flares, and flashing lights. The European New Car Assessment Programme has reported that Autopilot, an advanced driver-assistance system (ADAS), does not monitor drivers' attention effectively, and this weakness has been a major factor in the Tesla crashes. The National Transportation Safety Board (NTSB), the agency that investigates plane crashes and other fatal accidents, likewise stated in its report that Tesla's ineffective monitoring of driver engagement contributed to a 2018 crash that claimed the life of Wei Huang, whose Model X hit a highway barrier in Mountain View, California.

When the report was released, Robert L. Sumwalt, then chairman of the transportation safety board, called on the highway safety administration to step in, carry out its oversight function, and enforce corrective measures urgently. He said, "It's time to stop enabling drivers in any partially automated vehicle to pretend that they have driverless cars." The series of investigations was sparked by a 2016 crash involving a Model S that killed Joshua Brown. In response, Tesla said the Model S had failed to distinguish the white side of a tractor-trailer crossing the car's path from the brightly lit sky.

In the 2018 California crash, the Tesla, a 2014 Model S, hit a fire truck from the rear at 31 mph. Autopilot failed to steer away from the fire truck after the car in front changed lanes to avoid it, and the driver, who was eating a bagel at the time, did not react quickly enough to prevent the collision. Unfortunately, this scenario is common in accidents involving Autopilot: the drivers are either distracted or not sitting in the driver's seat at all. In one incident in May 2021, a California driver engaged Autopilot and moved to the back seat as the car crossed the San Francisco–Oakland Bay Bridge.

However, despite Tesla's big dream of making self-driving cars the norm, the company instructs drivers to keep their hands on the wheel even when Autopilot is engaged. It also admits that Autopilot sometimes fails to recognize parked first-responder vehicles. Meanwhile, experts and Tesla drivers alike have shared videos of Autopilot's defects, with some arguing that the system is not merely unsafe but deadly.

According to a statement by the company, "the system also makes frequent checks to ensure that the driver's hands remain on the wheel and provides visual and audible alerts if hands-on is not detected. It then gradually slows down the car until hands-on is detected again." But in light of the crashes, it is clear that these checks did not make up for the drivers' lack of attention.

Tesla's shares dropped by 3% after the 2016 crash, and in 2021 the stock fell a further 5% on news of the investigation. The agency's findings could bring more consequences for the company: NHTSA could order Tesla to recall the affected vehicles and fix the system, and it has the power to compel automakers to add safety features to their cars, as it has done with rearview cameras and airbags. Even though the investigation is a heavy blow, Elon Musk says Tesla is forging ahead with producing fully self-driving cars.

Limiting Future Tesla Autopilot Accidents

The NTSB blames Autopilot for the 2018 crash, but it has no enforcement power and can only make recommendations to other federal agencies. It therefore proposed that NHTSA require Tesla to limit the use of Autopilot to areas where it can operate safely. The board also recommended that NHTSA require Tesla and other automakers with similar systems to build better safeguards for monitoring drivers and ensuring they stay attentive.

The NHTSA, for its part, says it is ready to use its "robust enforcement tools" to protect public safety and investigate potential threats, and that it will not hesitate to act when it finds evidence of non-compliance with safety rules. In June 2021, the agency ordered all car manufacturers to report any accident involving fully or partially automated cars. In a statement, NHTSA also reminded drivers that they remain responsible for the operation of their vehicles as prescribed by state laws: even though systems like Autopilot can help drivers avoid crashes and reduce their severity, drivers must use them responsibly.

Gordon Johnson, an analyst and vocal critic of Tesla, said the issue concerns not only Autopilot users but also bystanders who can be injured by a Tesla, or any other vehicle, running on such a system. Tesla's old defense, that users accept the risks involved in using Autopilot, is therefore no longer tenable. Sam Abuelsamid, an expert in self-driving vehicles, noted that Autopilot can slow the car when the vehicle in front of it slows down, but pointed out that, by design, these systems do not brake for immobile objects at speeds above 40 mph. At slower speeds, automatic emergency braking can bring the vehicles to a halt for stationary objects.

According to him, it is wrong for Tesla drivers to presume that their cars can drive themselves or recognize cues as a human would. Machines are not infallible, and Tesla's cars make mistakes as well. NHTSA has previously allowed car manufacturers to develop ADAS features with little regulatory oversight, but after the recent crashes and the follow-up investigation, it is taking a new approach. On this point, Raj Rajkumar, an engineering professor at Carnegie Mellon University, says: "Driver monitoring has been a big deficiency in Autopilot. I think this investigation should have been initiated some time ago, but it's better late than never."
