WASHINGTON (Reuters) – Tesla Inc is recalling just over two million U.S. vehicles equipped with its Autopilot advanced driver assistance system to install new safeguards, after a safety regulator said the system raised safety concerns.
The National Highway Traffic Safety Administration (NHTSA) has been investigating the electric automaker led by billionaire Elon Musk for more than two years to determine whether Tesla vehicles on U.S. roads adequately ensure drivers pay attention when using Autopilot.
Tesla said in a recall report that the Autopilot software system controls “may not be sufficient to prevent driver misuse” and could increase the risk of a crash.
NHTSA Acting Administrator Ann Carlson told Reuters in August that it was “really important that driver monitoring systems take into account that humans rely too much on technology.”
Tesla's Autopilot is intended to allow cars to automatically steer, accelerate and brake within their lane, while Enhanced Autopilot can help with changing lanes on highways but does not make them autonomous.
A component of Autopilot is Autosteer, which maintains a set speed or following distance and works to keep the vehicle in its lane.
Tesla said it does not agree with NHTSA's analysis but will deploy an over-the-air software update that will "incorporate additional controls and alerts to those already in place in affected vehicles to further encourage the driver to adhere to their continuous driving responsibility whenever Autosteer is engaged."
The company did not respond to a question about whether the recall would be carried out outside the United States. It is not immediately clear whether China will require a recall over the same issue.
A spokesperson for the Italian Transport Ministry said it was not currently aware of similar actions being taken in Italy. Regulators in Germany said they were investigating the issue.
Foreseeable misuse
The NHTSA opened an investigation into Autopilot in August 2021 after identifying more than a dozen crashes in which Tesla vehicles struck stationary emergency vehicles, and upgraded the probe in June 2022. The agency said Tesla issued the recall after the investigation found that "Tesla's unique design of its Autopilot system may provide inadequate driver engagement and usage controls that may lead to foreseeable misuse of the system." The NHTSA reviewed 956 crashes in which Autopilot was initially alleged to have been in use and focused its investigation on 322 of those Autopilot-involved crashes.
Bryant Walker Smith, a law professor at the University of South Carolina who studies transportation issues, said the software-only solution will be quite limited. The recall “really seems to place a lot of responsibility on human drivers rather than a system that facilitates this misuse,” Smith said.
Separately, since 2016, the NHTSA has opened more than three dozen special Tesla crash investigations into cases where driving systems like Autopilot were suspected of being used, with 23 crash deaths reported to date.
The NHTSA said there may be an increased risk of a crash in situations where the system is engaged but the driver does not maintain responsibility for operating the vehicle, is not prepared to intervene, or fails to recognize whether the system has been canceled or remains engaged.
The NHTSA investigation into Autopilot will remain open while the agency monitors the effectiveness of Tesla's remedies. Tesla and NHTSA have held several meetings since mid-October to discuss the agency's interim findings on potential driver misuse and Tesla's proposed software solutions in response.
The company will roll out the update to 2.03 million Model S, X, 3 and Y vehicles in the United States dating back to the 2012 model year, the agency said.
The over-the-air update will increase the prominence of visual alerts in the user interface, simplify the engagement and disengagement of Autosteer, add checks when activating Autosteer, and bring the "eventual suspension of Autosteer use if the driver repeatedly fails to demonstrate continuous and sustained driving responsibility while the feature is engaged," Tesla said.
It did not provide further details on exactly how alerts and safeguards would change.
Shares of the world's most valuable automaker fell 1.5% in morning trading.
Tesla disclosed in October that the U.S. Department of Justice had issued subpoenas related to its Full Self-Driving (FSD) and Autopilot systems. Reuters reported in October 2022 that Tesla was under criminal investigation over claims that the company's electric vehicles could drive themselves.
Tesla in February recalled 362,000 U.S. vehicles to update its FSD Beta software after the NHTSA said the vehicles did not adequately comply with traffic safety laws and could cause accidents.
The NHTSA closed a previous investigation into Autopilot in 2017 without taking any action. The National Transportation Safety Board (NTSB) criticized Tesla for its lack of system safeguards for Autopilot and the NHTSA for failing to ensure the safety of Autopilot.