The Feds Are Investigating Tesla Over Autopilot Crashes
US government regulators are opening an investigation into Tesla's Autopilot system after cars using the feature crashed into stopped emergency vehicles.
The National Highway Traffic Safety Administration announced the investigation on Monday, and it encompasses 765,000 Teslas sold in the US, a significant fraction of all of the company's sales in the country. The agency says the probe will cover 11 crashes since 2018 that together caused 17 injuries and one death.
The NHTSA is looking at Tesla's entire lineup, including Models S, X, 3, and Y from model years 2014 through 2021. It's investigating both Autopilot and Traffic Aware Cruise Control, a subset of Autopilot that does not steer the vehicle but allows it to match traffic speeds.
In each of the 11 crashes, a Tesla hit first responders' vehicles that were parked and marked with flashing lights, flares, illuminated arrow boards, or road cones.
The investigation will cover the entire scope of the Autopilot system, including how it monitors and enforces driver attentiveness and engagement as well as how the system detects and responds to objects and events in or near the roadway.
Tesla has faced scrutiny for the way Autopilot verifies drivers' attentiveness while the system is turned on. In an assessment of advanced driver-assistance systems (ADAS), Autopilot received middling marks in the European New Car Assessment Program. The system was hampered by its relative inability to keep drivers engaged with the road.
Like many other ADAS offerings, Autopilot requires a driver to keep their hands on the wheel, though such systems can be easily fooled by draping a weight over one of the steering wheel's spokes. A recent investigation by Car and Driver found that it took anywhere from 25 to 40 seconds for the vehicle to flash a warning when drivers took their hands off the wheel, depending on the model. If drivers didn't respond, the car would drive for another 30 seconds before starting to brake. At highway speeds, this could result in the system operating without driver engagement for up to a mile.
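A quick back-of-the-envelope check bears that figure out. The timing windows below come from the Car and Driver findings above; the 65 mph highway speed is an assumed value for illustration:

```python
# Rough check: distance covered during the hands-off window
# described above. Speed is an assumed highway pace.
speed_mph = 65                # assumed highway speed
warning_delay_s = (25, 40)    # seconds before the wheel-touch warning flashes
grace_period_s = 30           # further seconds of driving before braking begins

for delay in warning_delay_s:
    total_s = delay + grace_period_s
    miles = speed_mph * total_s / 3600  # mph x hours = miles
    print(f"{total_s} s without hands on the wheel = {miles:.2f} miles")
```

At 65 mph, the 55-to-70-second window works out to roughly 1 to 1.25 miles of travel with no confirmed driver engagement.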
In the wake of a January 2018 crash in California, the National Transportation Safety Board criticized the way that Tesla attempts to keep drivers engaged. In that incident, which is also part of the NHTSA probe, a 2014 Model S rear-ended a fire truck in the high-occupancy vehicle lane of Interstate 405 in Culver City. The Tesla's driver had Autopilot engaged and was following another vehicle in the HOV lane when the lead vehicle changed lanes to avoid the parked fire truck. Autopilot did not swerve or brake, and the driver, who was eating a bagel, did not take control of the vehicle. The Tesla hit the fire truck at 31 mph, according to the accident report.
The NTSB attributed the crash to the driver's "inattention and overreliance on the vehicle's advanced driver assistance system; the Tesla Autopilot design, which permitted the driver to disengage from the driving task; and the driver's use of the system in ways inconsistent with guidance and warnings from the manufacturer."
Tesla recently began changing the way Autopilot works, ditching the radar sensor in Models 3 and Y in favor of additional cameras. (Models S and X will retain radar for the foreseeable future.) As the crashes in the NHTSA probe show, radar data doesn't guarantee that an ADAS will properly sense obstacles in the roadway, though additional sensors generally help these systems build a more complete picture of the scene. Because radar and lidar return direct distance measurements, they make it straightforward to determine how far a vehicle is from an object. ADAS software can extract the same information from camera images, but doing so requires more complicated computation than with radar or lidar. It's unclear whether the NHTSA investigation includes Tesla's new camera-only models.
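A minimal sketch illustrates the difference; the sensor parameters here are made up for illustration and reflect nothing about Tesla's actual implementation:

```python
# Illustrative only: why range falls out of radar almost directly,
# while cameras need extra computation to recover the same number.
C = 299_792_458.0  # speed of light, m/s

def radar_range_m(round_trip_s: float) -> float:
    """Radar measures range almost directly: half the echo's
    round-trip time multiplied by the speed of light."""
    return C * round_trip_s / 2

def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Cameras must first solve a correspondence problem to find the
    disparity between two views, then convert it to depth."""
    return focal_px * baseline_m / disparity_px

print(radar_range_m(0.5e-6))         # a 0.5-microsecond echo = ~74.9 m
print(stereo_depth_m(1000, 0.3, 4))  # hypothetical camera rig = 75.0 m
```

In practice, the hard part for a camera-only system is producing that disparity (or a learned depth estimate) in the first place, which is where the extra computation goes.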
Nor is it clear whether the probe will affect Tesla's so-called Full Self-Driving feature, beta versions of which have been released to a group of drivers. Videos of the system in action show that it's very much a work in progress, and it needs driver attention at all times.
While Full Self-Driving does make some decisions that closely emulate a human driver, in other cases it makes more questionable choices. In one video, a Full Self-Driving car brakes only after passing a disabled vehicle on the shoulder. On the same trip, it suddenly swerves right into another lane before taking a left. In another video, the car creeps forward into intersections despite cross traffic, and later it nearly drives into a hole in the street surrounded by construction cones. At times, Full Self-Driving can't tell whether the human driver has control of the vehicle, and it will drive for more than a minute between prompts to confirm driver attention.
So far, automakers have been largely free to develop ADAS features without significant regulatory oversight. The NHTSA has been relatively hands-off, to the point that the NTSB has been critical of its laissez-faire attitude. This new investigation suggests the agency may be considering a less lenient approach.
This story originally appeared on Ars Technica.