What Happens When Self-Driving Cars Crash? The Legal Ramifications of Automation

Technology is developed to improve our lives, and many traditional industries are turning to automation as a result. Among the most notable is the automobile industry. As new practices develop around these technological advancements, how safe are we with this technology, and what happens when things don't work as designed?
Opinions expressed by Entrepreneur contributors are their own.
The Houston Chronicle reported that Waymo and Daimler have joined together to test the viability of computer-controlled 18-wheelers. The companies had previously run smaller tests but have agreed to expand the fleet to 60 trucks. Other trucks from the partnership are being tested elsewhere in Texas and in Arizona. The move is part of a broader recent economic push in the state.
If you see one of these trucks on the road, don't worry: a human safety driver is on board, ready to take over at any moment. The trucks are currently being tested for overall effectiveness, including their ability to respond to changing road conditions and to stop at unpredicted times.
As our technological capabilities advance, we will continue to see innovation across every industry. The thought of computer-driven semi-trucks raises a serious question: how safe are these trucks?
In 2020, 12.6% of crashes on U.S. roadways were caused by large trucks, and many safety advocates have questions and concerns about the increased prevalence of these vehicles on roads. From July 2021 to May 2022, the National Highway Traffic Safety Administration reported 400 separate crashes involving vehicles with at least partially automated control systems.
Beyond automated trucks, driverless vehicles in general are becoming more prevalent. Globally, the automated vehicle market is estimated to be worth $54 billion, and the personal automated vehicle market is growing along a similar trajectory.
Audi recently revealed plans to spend $16 billion on self-driving cars by 2023. However individual drivers feel about them, large corporations are clearly investing heavily; how quickly the market accepts this growing trend remains to be seen.
Many drivers find the increasing prevalence of self-driving vehicles unsettling. According to PolicyAdvice, 43% of Americans are not comfortable riding inside a driverless car, citing safety as their chief concern, and many remain uneasy with the idea of the United States moving into this phase of automation.
Despite these concerns, car manufacturers continue to try to demonstrate the safety of automated vehicles. Although accurate and relevant safety data is hard to come by, concerned drivers can look to the state of California for some guidance. Since 2014, the state has reported 88 accidents involving driverless vehicles. Of those, 81 were caused by other vehicles, and 62 of the 88 driverless vehicles were operating in fully autonomous mode.
Even as manufacturers use data to showcase the safety of driverless vehicles, public perception may be hard to move. Public opinion is notoriously difficult to shift, and even if autonomous vehicles cause only fender benders, the public outcry will likely be fierce.
Operating a motor vehicle is already a dangerous activity. In 2020 alone, 38,824 people were killed in car accidents on U.S. roadways. Knowing how frequently fatal accidents occur, it's natural to be skeptical and hesitant about turning over control of a vehicle. Even setting fatal crashes aside, the U.S. averages 17,000 or more car accidents a day. These accidents can result in substantial financial damages to the at-fault party, emotional trauma and lingering injuries.
The Centre for Data Ethics and Innovation in England surveyed drivers in the UK and found that, as predicted, most aren't ready to hand over control so easily: until trust is built between consumers and manufacturers, most drivers are unlikely to accept automated vehicles on roadways.
Who ultimately takes the blame if a driver is struck by a driverless vehicle? The owner? Or the manufacturer, for selling a product that couldn't protect its user, in this case the passenger in the automated vehicle? For now and the foreseeable future, owners of automated vehicles in the U.S. are held to the same standard as their traditional counterparts.
These legal questions will likely become more and more relevant, and product liability lawsuits in particular may rise. Some U.S. states are already beginning to adopt laws under which the manufacturer could be treated as the "driver" in some situations. Ultimately, anyone looking to prove fault on the part of an artificial intelligence or computer program faces a lengthy and difficult battle: limited precedent makes this a hard theory to argue in a court of law, but not an impossible one.
Product liability law traditionally covers products that cause bodily harm or injury to the user. Anyone arguing such a case must establish proof of injury, proof of a defect, proof of appropriate use and, ultimately, a clear connection between the injury and the defect. A product merely failing to work as advertised does not make the manufacturer liable; the product must cause injury, illness or other harm to the user or other relevant parties for there to be a case.
As more autonomous vehicles take to the roads, these issues will become more common, and the legal picture will become clearer.