WASHINGTON: A US senator on Jan 24 urged Tesla Inc to rebrand its driver assistance system Autopilot, saying it has “an inherently misleading name” and is subject to potentially dangerous misuse.
But Tesla said in a letter that it had taken steps to ensure driver engagement with the system and enhance its safety features.
The electric automaker introduced new warnings for red lights and stop signs last year “to minimise the potential risk of red light- or stop sign-running as a result of temporary driver inattention,” Tesla said in the letter.
Senator Edward Markey said he believed the potential dangers of Autopilot could be overcome. But he called for “rebranding and remarketing the system to reduce misuse, as well as building backup driver monitoring tools that will make sure no one falls asleep at the wheel.”
Markey’s comments came in a press release, with a copy of a Dec 20 letter from Tesla addressing some of the Democratic senator’s concerns attached.
Autopilot has been engaged in at least three Tesla vehicles involved in fatal US crashes since 2016.
Crashes involving Autopilot have raised questions about the driver-assistance system’s ability to detect hazards, especially stationary objects.
There are mounting safety concerns globally about systems that can perform driving tasks for extended stretches of time with little or no human intervention, but which cannot completely replace human drivers.
Markey cited videos of Tesla drivers who appeared to fall asleep behind the wheel while using Autopilot, and others in which drivers said they could defeat safeguards by sticking a banana or water bottle in the steering wheel to make it appear they were in control of the vehicle.
Tesla, in its letter, said its revisions to steering wheel monitoring meant that in most situations “a limp hand on the wheel from a sleepy driver will not work, nor will the coarse hand pressure of a person with impaired motor controls, such as a drunk driver.”
It added that devices “marketed to trick Autopilot, may be able to trick the system for a short time, but generally not for an entire trip before Autopilot disengages.”
Tesla also wrote that while videos like those cited by Markey showed “a few bad actors who are grossly abusing Autopilot”, they represented only “a very small percentage of our customer base.”
Earlier this month, the US National Highway Traffic Safety Administration (NHTSA) said it was opening an investigation into a 14th crash involving a Tesla in which it suspects Autopilot or another advanced driver assistance system was in use.
NHTSA is probing a Dec 29 fatal crash of a Tesla Model S in Gardena, California. In that incident, the vehicle exited the 91 Freeway, ran a red light and struck a 2006 Honda Civic, killing its two occupants.
The National Transportation Safety Board will hold a Feb 25 hearing to determine the probable cause of a 2018 fatal Tesla Autopilot crash in Mountain View, California. – Reuters