Federal auto safety regulators wrapping up Tesla Autopilot investigation

The National Highway Traffic Safety Administration’s multiyear safety investigation into Tesla’s driver assistance systems is drawing to a close. Reuters’ David Shepardson reported the development Thursday, citing NHTSA acting administrator Ann Carlson, and CNBC confirmed the report with the federal auto safety regulator. An NHTSA spokesperson told CNBC by email, “We confirm the comments to Reuters,” adding, “NHTSA’s Tesla investigations remain open, and the agency generally does not comment on open investigations.”

The agency opened its safety investigation into Tesla’s driver assistance systems, now marketed as Autopilot, Full Self-Driving, and FSD Beta, in 2021, after a series of crashes in which Tesla vehicles struck first responders’ stationary vehicles. Despite their names, Tesla’s driver assistance features are not autonomous. Tesla vehicles cannot operate as robotaxis the way Cruise or Waymo vehicles can; they require a human driver ready to steer or brake at any time. The standard Autopilot and premium Full Self-Driving systems automate only some steering, braking, and acceleration tasks in limited circumstances.

Tesla CEO Elon Musk, who also owns the social network X (formerly Twitter), often suggests that Tesla cars can drive themselves. On July 23, a former Tesla employee who had led the company’s artificial intelligence software engineering posted on social media about ChatGPT and how much it impressed his parents when he first showed it to them. Musk replied on X that Tesla’s FSD produces the same reaction, adding, “I forget that most people don’t know cars can drive themselves.”

In its owners’ manuals, Tesla cautions drivers using Autopilot or FSD to keep their hands on the steering wheel and remain aware of road conditions, traffic, and other road users such as pedestrians and cyclists. The manuals instruct drivers to be ready to act immediately, warning that “Not following these instructions could cause harm, serious injury, or death.” In-cabin cameras and steering wheel sensors in the company’s cars evaluate whether a driver is paying enough attention to the road and the driving task. The system “nags” inattentive drivers with a beep and a message on the car’s touch screen telling them to pay attention and hold the wheel, but this mechanism may not be robust enough to ensure that Tesla’s driver assistance features are used safely.

Tesla has previously conducted voluntary recalls over other Autopilot and FSD Beta defects, delivering fixes through over-the-air software updates. In July, the regulator ordered the company to hand over additional driver assistance system data for its Autopilot safety probes. NHTSA also regularly publishes data on U.S. vehicle crashes involving advanced driver assistance systems such as Tesla’s Autopilot, Full Self-Driving, and FSD Beta, which SAE International classifies as “level 2” systems.

The newest crash report shows 26 fatal incidents involving Tesla cars equipped with level 2 systems between Aug. 1, 2019, and mid-July. According to the agency’s analysis, Tesla’s driver assistance features were in use within 30 seconds of the collision in 23 of those incidents; in three, it is unknown whether the features were engaged. Ford Motor Company is the only other automaker to report a fatal crash involving a vehicle equipped with level 2 driver assistance, and the NHTSA report did not indicate whether that system was active before the collision.
