In recent weeks, a self-driving Uber cab struck and killed a pedestrian and a Tesla owner was killed when his self-driving car rammed into a concrete median. Earlier this year, a Tesla on Autopilot crashed into a fire truck.
Despite a number of crashes and near-misses, the companies maintain that autonomous vehicles are statistically safer than human drivers. Even so, Uber canceled its test program in California, and Tesla says it will re-examine its trademark Autopilot software.
Autopilot technology still has some bugs to be worked out
Tesla has disclosed that Wei Huang, an owner of its Model X SUV, was killed on March 23 while his car was in Autopilot mode. Although the car controls steering, braking and acceleration on Autopilot, there is a human in the driver’s seat who can quickly take over. Tesla’s records show that the driver received visual and audible warnings before the crash but did not put his hands on the wheel in the last six seconds before impact. Neither he nor the car applied the brakes or took evasive action to avoid the concrete highway divider.
- In March, a self-driving SUV owned by Uber struck and killed a woman who was walking her bicycle across a road. The accident occurred at night on an unlighted stretch of road, away from any crosswalk. Even so, the car’s lidar (laser sensors) apparently did not detect the pedestrian. Neither the car nor the human safety driver slowed down or swerved before impact.
- In January, a Tesla Model S slammed into the back of a fire truck parked on the side of the freeway. The human driver was not seriously injured, despite the 65-mph collision. Few details have been released except that the Autopilot was engaged.
- In 2016, Tesla Model S owner Joshua Brown became the first person to die in a collision involving a self-driving vehicle. The Tesla’s radar failed to “see” the side of a tractor-trailer that was crossing the road.
Where were the human operators?
Despite the “Autopilot” name, Tesla says that its technology is meant to assist, not replace, the human driver. However, in these catastrophic crashes, the humans either ignored warnings or were asleep or distracted. It may be unrealistic to expect human drivers to stay alert and focused while the car is doing all the work. Investigators have not determined whether Tesla’s technology or the human operators were at fault for these crashes. Perhaps both bear responsibility.
Are there self-driving cars in Connecticut?
Believing that self-driving cars can make roads safer – and not wanting to fall behind other states – the Connecticut Legislature in 2017 authorized limited use of autonomous cars. The legislation authorized testing in four municipalities, but allows fully autonomous vehicles (FAVs) only under tightly controlled circumstances. So far, those test communities have not been announced, but testing here seems inevitable. Hopefully, we are not “guinea pigs” for technology that is not quite ready for prime time.