Unrepentant Tesla provokes public outrage: how many more lives will the overhyped "Autopilot" claim?

The fatal Tesla Model S crash in Texas continues to reverberate, drawing ever more attention and debate.

The National Highway Traffic Safety Administration (NHTSA) and the National Transportation Safety Board (NTSB), the two agencies responsible for overseeing vehicle safety, announced on Monday that they are investigating the accident.

On the morning of April 20th, Beijing time, Tesla CEO Elon Musk broke his silence and spoke publicly about the accident for the first time.

Musk said on Twitter that the data recovered so far shows the car was not using Tesla's driver-assistance system, Autopilot, at the time of the crash, and that the vehicle had not purchased FSD (the Full Self-Driving package).

Image source: Elon Musk's Twitter

Musk added, "Moreover, standard Autopilot would require lane lines to turn on, which this street did not have." Autopilot comes standard on Tesla vehicles, but it does not always recognize lane markings reliably; it has been known, for example, to mistake sealed pavement cracks or bicycle-lane markings for lane lines.

Responding to Musk's remarks, Mark Herman, the Harris County Precinct 4 Constable whose office is investigating the crash, said it was the first time the police had heard anything from the company.

Herman said, "If he has already pulled that data, he hasn't told us. We are eagerly awaiting that data."

Herman also pushed back on Musk's statement, citing witness testimony: the two men had set out to try the Autopilot function and show friends how the car could drive itself.

Herman told Reuters that Texas police would serve Tesla with a search warrant on Tuesday to obtain the data from the vehicle involved in the weekend crash.

Clearly, Tesla's Autopilot system is central to the fatal Model S accident. In fact, the use of Autopilot had already drawn the attention of US regulators.

NHTSA said last month that it was investigating nearly two dozen crashes involving Teslas that either were using Autopilot at the time or may have been using it.

The overhyped Autopilot function

Autopilot has in fact been dogged by controversy since its introduction.

Tesla launched Autopilot in 2015. The name originally refers to the autopilot systems used in aircraft, and critics argue that Tesla's use of it is misleading: it can lead drivers to believe their cars are capable of driving themselves and to let down their guard, when in fact Autopilot is only a driver-assistance feature.

Initially, Tesla also used the term "autonomous driving" on its official website in China, which likewise raised doubts.

In early August 2016, a Tesla owner in China had an accident. The car was in Autopilot mode at the time but failed to avoid a car parked on the left; both vehicles were damaged, though no one was injured. It was the first publicly reported accident in China involving Tesla's "autonomous driving."

After the accident, the owner posted a detailed account on Weibo and complained that Tesla had exaggerated the Autopilot function and misled consumers.

Tesla subsequently replaced "autonomous driving" with "automated assisted driving" on its official website in China.

However, according to foreign media reports, Tesla still overstates Autopilot in its marketing, describing it on its website as "the future of driving," as if it were the world's most advanced self-driving system.

That is not the case. There are six levels of driving automation, from L0 to L5, with L5 the highest. Autopilot is an L2 system, which the US Department of Transportation defines as "partial automation," requiring the driver to "remain engaged with the driving task and monitor the environment at all times."

Just last Saturday, Musk was still promoting the company's latest safety report. "Tesla with Autopilot engaged now approaching 10 times lower chance of accident than average vehicle," he said on Twitter.

But Kelly Funkhouser, head of connected and automated vehicle testing at Consumer Reports, said Tesla's past data has been inaccurate and is difficult to verify without the underlying figures.

Funkhouser added that the report does not say how often the system failed without causing a crash, nor how often drivers failed to take over the vehicle.

This year, Tesla also plans to launch its so-called full self-driving capability, offered as a $10,000 software upgrade on most Tesla models. The technology, however, is far from L5 autonomy, that is, "full automation."

Alain Kornhauser, who heads the autonomous vehicle engineering program at Princeton University, said, "Musk's approach is completely irresponsible."

Kornhauser said that while Tesla does state in its terms that the cars are not yet self-driving, Musk has sold consumers the dream of a car that can drive itself. "This is not a game. This is a serious matter."

The consequences of over-reliance on "autopilot"

Some drivers have come to rely on, and abuse, the system, contributing to a rise in Autopilot-related accidents.

In the owner's manual, Tesla advises drivers to keep their hands on the steering wheel and stay attentive while using Autopilot. Yet numerous videos online show people sleeping in the driver's seat or driving in other unsafe ways, and some Tesla drivers say they can use Autopilot for long stretches without keeping their hands on the wheel.

Reports from US regulators have likewise documented the consequences of relying on the Autopilot system.

The National Transportation Safety Board (NTSB) has linked many fatal and non-fatal accidents to Autopilot. According to NTSB reports, these accidents share one thing in common: drivers relied too heavily on the "autopilot" system, yet under certain circumstances the system is flawed, and in some cases its capabilities fall short of what the driver assumed, leading to a crash. The following cases are all drawn from NTSB reports.

In a 2016 crash in Williston, Florida, a Tesla Model S drove under a tractor-trailer and the driver was killed. According to the report, "Tesla's automated vehicle control system was not designed to recognize a truck crossing the lane, nor did it determine that a collision was imminent."

In 2018, a Model X struck a highway barrier in Mountain View, California, killing the driver. The report found that "the probable cause of the crash was that, due to system limitations, Tesla's Autopilot steered the Model X toward the gore area of the highway, and the driver, distracted, likely by a game application on his mobile phone, failed to react because he over-relied on Autopilot's partial driving automation system."

In 2018, a non-fatal crash occurred in Culver City, California, where a Model S rear-ended a fire truck parked in the travel lane. According to the report, "the probable cause was that the Tesla driver, over-relying on the vehicle's advanced driver-assistance system, failed to react to the stationary fire truck on the road."

In a 2019 crash in Delray Beach, Florida, a Model 3 drove under a tractor-trailer, killing the driver. The report stated, "Autopilot did not issue a visual or audible warning for the driver to put his hands back on the steering wheel. Due to system design limitations, the collision-avoidance system neither issued a warning nor activated (automatic emergency braking); the surrounding environment was beyond Autopilot's (operational design domain)."

According to foreign media reports, at least 11 people have been killed in nine US crashes involving Autopilot since its launch. Internationally, there have been seven more such crashes, resulting in at least nine deaths.

The US government may tighten oversight

The string of Tesla Autopilot-related accidents has also prompted reflection among US regulators.

On February 1 this year, NTSB Chairman Robert Sumwalt wrote to the US Department of Transportation, criticizing lax safety standards for automated driving systems and singling out Tesla.

Sumwalt said that had the department required Tesla to improve Autopilot's safeguards after the 2016 Florida death, the fatal 2019 Model 3 crash might not have happened.

Sumwalt also warned, "Tesla is testing highly automated (self-driving) technology on public roads with only limited regulatory oversight, which puts drivers and other road users at risk."

The NTSB also criticized NHTSA's approach to automated vehicles as "misguided, because it essentially waits for problems to occur rather than proactively addressing safety issues," adding that NHTSA "has taken a hands-off approach to the safety of automated vehicles."

The NTSB can only issue recommendations. It has urged NHTSA and Tesla to restrict Autopilot's use to roads where the system can operate safely, and urged Tesla to deploy a more robust driver-monitoring system to ensure drivers stay attentive.

Neither Tesla nor NHTSA acted on those recommendations, drawing criticism from the NTSB.

NHTSA has the authority to regulate automakers and can demand recalls of defective vehicles. But out of concern about stifling the development of promising new features, it has so far taken a hands-off attitude toward regulating partially and fully automated driving systems.

Since March this year, NHTSA has stepped up its investigations of Tesla. To date, however, it has relied on automakers and technology companies to comply with safety standards voluntarily.

NHTSA said last month, "With the new administration in place, we are reviewing regulations on autonomous vehicles." Such regulations, particularly ones binding Tesla, are widely seen as long overdue.

In December last year, before former President Trump left office, NHTSA solicited public comment on regulations related to autonomous driving. Then US Secretary of Transportation Elaine Chao, whose department oversees NHTSA, said the proposed rules would address safety concerns "without hampering the development of and innovation in automated driving systems."

But her successor, Pete Buttigieg, told Congress that change may be coming.

"I would say that the US policy framework has not really caught up with the technology platforms," Buttigieg said last month. "So we intend to pay close attention to this and do everything within our jurisdiction." He added that the department may work with Congress to address the issue.

Missy Cummings, a Duke University professor of electrical and computer engineering who studies automated vehicles, said the Texas crash could become a watershed for NHTSA policy.

She hopes the crash will bring change, because "Tesla has had a free pass for a very long time," and it is time for that to end.

(An original Canada-US Finance article; plagiarism will be pursued.)

#Tesla# #SelfDriving#

Author: Song Xing