Self-driving cars have already started appearing on our roadways. While many people assume that artificially intelligent cars are safer than cars driven by humans, there have been several notable car accidents caused by these AI vehicles. Whether self-driving cars are more or less safe than traditional cars remains to be seen.
Self-driving cars are under more scrutiny than traditional motor vehicles. When cars in self-driving mode are involved in an accident, those accidents receive more media attention. Below are some of the most prominent motor vehicle collisions involving cars that operate by artificial intelligence.
A Self-Driving Uber Car Hit and Killed a Pedestrian in Arizona
This self-driving car accident gained significant news attention. A self-driving Uber car struck and killed a pedestrian in Tempe, Arizona. The National Transportation Safety Board released a report of its investigation. According to this report, the self-driving system used by the Uber car misclassified the pedestrian as a bicycle. When the car struck him, the pedestrian was walking while wheeling a bicycle alongside him.
Because the artificial intelligence classified the pedestrian as only a bicycle, the system did not determine until 1.3 seconds before impact that an emergency braking maneuver was needed to avoid a crash. Had the car correctly classified the victim as a pedestrian, it would have triggered the emergency braking system sooner.
Uber has stated that when a vehicle is under computer control, emergency braking maneuvers are not enabled. The purpose of this design is to reduce the potential for erratic vehicle behavior. In these situations, the system relies on the human operator to intervene. Unfortunately, the artificial intelligence system was not designed to alert the operator that intervention was needed.
Tesla Model X in Autopilot Mode Kills Driver in Crash
An Apple engineer died in March 2018 after his Tesla crashed into a concrete barrier. At the time of the accident, the vehicle was operating under its partially self-driving Autopilot system, and the driver was reportedly playing a video game on his smartphone when the collision happened.
The National Transportation Safety Board stated that the Tesla did not alert the driver through its forward collision avoidance system. Cars that offer only partial automation are not truly self-driving. Drivers are not supposed to watch a movie or TV show, or send text messages, while a Tesla is in "self-driving" mode.
The victim had allegedly complained that the Autopilot system on his Tesla had malfunctioned at the same location in the past. According to his family's lawyer, the man had told his wife that his Tesla Model X had veered toward the concrete barrier on previous commutes. Tesla has repeatedly stated that the self-driving feature is not fully autonomous. Nonetheless, it remains an open question whether the technology should have warned the victim and should be more robust.
Whether a Tesla in Autopilot mode counts as a self-driving car is debatable. Tesla claims that the Autopilot feature is as much as nine times safer than cars driven by humans. According to Tesla, it registered only one accident for every 4.34 million miles driven with Autopilot engaged.
Self-Driving Cars Cannot Eliminate All Car Accidents
A study by the Insurance Institute for Highway Safety suggests that self-driving cars would have a difficult time preventing all car accidents. The group evaluated over 5,000 crashes reported by police officers in the National Motor Vehicle Crash Causation Survey. The accidents reviewed were serious enough to require that a car be towed and that emergency medical services respond to the scene.
The study sorted the accidents into five categories. It found that self-driving cars could prevent many crashes caused by impaired or distracted driving, but certainly not all of them. According to the report, 34% of car accidents, or approximately 2 million car accidents a year, happen due to distracted driving and driving under the influence. Today's self-driving car technology also cannot account for mechanical failures, such as blown-out tires.
In short, even though self-driving car technology could help reduce the incidence of drunk driving and distracted driving accidents, it cannot prevent all accidents, especially those caused by speeding, human error, or driver fatigue. Keep in mind that there is no clear timeline for how long it will take self-driving car technology to close these gaps. At some point, self-driving cars may be able to communicate with other vehicles, preventing even more accidents.
Self-Driving Cars and the Coronavirus
Self-driving vehicles are also facing another challenge. Self-driving cars are not as widespread as they might otherwise be, in part because of the fatal Uber crash in Tempe, Arizona, which forced the self-driving car industry to take extra precautions to ensure the technology is safe. Now, self-driving car operators are also dealing with the effects of the coronavirus. Businesses that operate self-driving passenger cars have a duty to keep their passengers reasonably safe, and that duty likely includes sanitizing the vehicles between each passenger ride and each food delivery.
The Need for Legal Representation in Self-Driving Car Accidents
Self-driving cars are already on the road. While it is impossible to predict how many accidents involving self-driving cars will take place, we can be sure that such accidents will happen. If you or a loved one has been involved in an accident with a self-driving or semi-autonomous car, you need to contact an experienced personal injury attorney. The sooner you speak to an attorney about your injuries, the better. Contact the New York City personal injury lawyers at Pulvers Thompson today to schedule your initial consultation.