Two crashes involving Teslas apparently running on Autopilot are drawing scrutiny from federal regulators and suggest a potential new hazard on US freeways: The partially automated vehicles may not stop for motorcycles. The National Highway Traffic Safety Administration sent investigation teams to two crashes last month in which Teslas collided with motorcycles on freeways in the darkness, the AP reports. Both were fatal. The agency suspects that Tesla's partially automated driver-assist system was in use in each. Once investigators have more information, the agency said, it may include the crashes in a broader probe of Teslas striking emergency vehicles parked along freeways. The agency also is investigating over 750 complaints of Teslas braking for no reason.
- The first crash involving a motorcyclist happened at 4:47am July 7 on a state freeway in Riverside, California. A Tesla Model Y SUV was traveling east in the HOV lane. Ahead of it was a rider on a Yamaha V-Star motorcycle, the California Highway Patrol said. The vehicles collided, and the motorcyclist was ejected from the Yamaha and pronounced dead at the scene. Whether the Tesla was operating on Autopilot remains under investigation.
- The second crash happened about 1am July 24 on Interstate 15 near Draper, Utah. A Tesla Model 3 sedan was behind a Harley-Davidson, also in an HOV lane. "The driver of the Tesla did not see the motorcyclist and collided with the back of the motorcycle," Utah officials said. The 34-year-old rider died at the scene. The Tesla driver told authorities that he had the vehicle's Autopilot setting on, the statement said.
Michael Brooks of the nonprofit Center for Auto Safety called on the agency to recall Tesla's Autopilot because it is not recognizing motorcyclists, emergency vehicles, or pedestrians, saying the system "is putting innocent people in danger on the roads." Since 2016, the agency has sent teams to 39 crashes in which automated driving systems were suspected of being in use, per the AP. Of those, 30 involved Teslas, including crashes that caused 19 deaths. Brooks criticized the agency for investigating but not taking action. "Drivers are being lured into thinking this protects them and others on the roads, and it's just not working," he said. Tesla did not immediately comment. The company has said that Autopilot and "Full Self-Driving" cannot drive themselves, and that drivers should be ready to intervene at all times.