A Tesla involved in a fatal crash on a Southern California freeway last week was operating on Autopilot, authorities said.
The National Highway Traffic Safety Administration (NHTSA) is investigating the May 5 crash in Fontana, 80 kilometers (50 miles) east of Los Angeles. It is the 29th crash involving a Tesla that the agency has responded to.
A 35-year-old man was killed when his Tesla struck an overturned semi truck on the freeway at about 2:30am local time (09:30 GMT). The driver’s name has not yet been made public. Another man was seriously injured when the electric car hit him as he was helping the semi’s driver out of the wreckage.
The California Highway Patrol (CHP) announced on Thursday that the car had been operating on Autopilot, Tesla’s partially automated driving system, which has been involved in multiple crashes. The Fontana crash marked at least the fourth death in the United States involving Autopilot.
The agency said in a statement: “While the CHP does not normally comment on ongoing investigations, the Department recognises the high level of interest centred around crashes involving Tesla vehicles. We felt this information provides an opportunity to remind the public that driving is a complex task that requires a driver’s full attention.”
The federal safety investigation came after the CHP arrested another man who, authorities said, was sitting in the back seat of a Tesla that was travelling on Interstate 80 near Oakland this week with no one behind the wheel.
The CHP has not said whether officials determined if the Tesla in the I-80 incident was operating on Autopilot, a feature that can keep the car centred in its lane and a safe distance behind vehicles in front of it.
But it is likely that either Autopilot or “Full Self-Driving” was in operation for the driver to be riding in the back seat. Tesla is allowing a limited number of owners to test its self-driving system.
Tesla, which has disbanded its public relations department, did not respond to an email seeking comment on Friday. The company says in owner’s manuals and on its website that neither Autopilot nor “Full Self-Driving” is fully autonomous, and that drivers must pay attention and be ready to intervene at any time.
Autopilot has at times had trouble dealing with stationary objects and cross traffic in front of Teslas.
In two Florida crashes, in 2016 and 2019, cars using Autopilot drove beneath crossing tractor-trailers, killing the people driving the Teslas. In a 2018 crash in Mountain View, California, an Apple engineer driving on Autopilot was killed when his Tesla struck a highway barrier.
Tesla’s system, which uses cameras, radar and short-range sonar, also has trouble handling stopped emergency vehicles. Teslas have struck several fire trucks and police vehicles that were parked on highways with their flashing emergency lights on.
For example, the NHTSA sent a team to investigate after a Tesla on Autopilot ran into a Michigan State Police car on Interstate 96 near Lansing. Police said neither the trooper nor the 22-year-old Tesla driver was injured.
After the fatal crashes in Florida and California, the National Transportation Safety Board (NTSB) recommended that Tesla develop a stronger system to ensure drivers are paying attention, and that it limit Autopilot’s use to highways where it can work effectively. Neither Tesla nor the safety agency acted on the recommendations.
In a February 1 letter to the US Department of Transportation, NTSB Chairman Robert Sumwalt urged the department to enact regulations governing driver-assistance systems such as Autopilot, as well as the testing of autonomous vehicles. The NHTSA has relied mainly on voluntary guidelines for the vehicles, taking a hands-off approach so it will not hinder the development of new safety technology.
Sumwalt also said that Tesla is using car buyers to test “Full Self-Driving” software on public roads with limited oversight or reporting requirements.
“Because NHTSA has no requirements, manufacturers can operate and test vehicles virtually anywhere, even if the location exceeds the AV [autonomous vehicle] control system’s limitations,” Sumwalt wrote.
He added: “Although Tesla includes a disclaimer that ‘currently enabled features require active driver supervision and do not make the vehicle autonomous,’ NHTSA’s hands-off approach to oversight of AV testing poses a potential risk to other road users.”
The NHTSA, which has authority to regulate automated driving systems and seek recalls if necessary, appears to have developed a renewed interest in the systems since President Joe Biden took office.