
What Happens When Self-Driving Cars Crash? The Legal Ramifications of Automation

Technology is developed to improve our lives, and many traditional industries are turning to automation as a result. Among the most notable is the automobile industry. As new practices develop around these technological advancements, how safe are we with this technology, and what happens when things don't work as designed?

By Hank Stout Edited by Micah Zimmerman

Opinions expressed by Entrepreneur contributors are their own.

Drivers commuting along I-45 between Dallas and Houston will likely notice a new presence on the road: driverless semi-trucks.

The Houston Chronicle reported that Waymo and Daimler have joined together to test the viability of computer-controlled 18-wheelers. The companies had previously run smaller tests but have agreed to expand the fleet to 60 trucks, and other trucks from the partnership are being tested elsewhere in Texas and in Arizona. The expansion is part of the state's broader recent economic development push.

If you see one of these trucks on the road, don't worry: a human driver is on board, ready to take over at any moment. The trucks are being tested for overall effectiveness, including their ability to respond to changing road conditions and to stop at unpredicted times.

As our technological capabilities advance, we will continue to see innovation across every industry. The thought of computer-driven semi-trucks raises a serious question: how safe are these trucks?

Related: What's Under the 'Hood' of Self-Driving Cars?

In 2020, large trucks were involved in 12.6% of crashes on U.S. roadways, and many safety advocates have questions and concerns about the increased prevalence of these vehicles on the road. From July 2021 to May 2022, the National Highway Traffic Safety Administration reported 400 separate crashes involving vehicles equipped with at least partially automated control systems.

Beyond automated trucks, driverless vehicles in general are becoming increasingly prevalent. Globally, the automated vehicle market is estimated to be worth $54 billion, and the personal automated vehicle market is growing along a similar trajectory.

Audi recently revealed plans to spend $16 billion on self-driving cars by 2023. Regardless of how individual drivers feel about them, large corporations are clearly investing heavily. How quickly the market accepts this growing trend, however, remains to be seen.

Many drivers find the increasing prevalence of self-driving vehicles unsettling. According to PolicyAdvice, 43% of Americans say they would not be comfortable riding in a driverless car, citing safety as their chief concern, and many remain uneasy with the idea of the United States moving toward this phase of automation.

Related: Elon Musk Announced Tesla's 'Full Self-Driving' Technology Will Cost $3,000 More

Despite concerns about automated vehicles, car manufacturers continue to try to demonstrate their safety. Although accurate and relevant safety data on automated vehicles is hard to come by, concerned drivers can look to the state of California for some guidance. Since 2014, the state has reported 88 accidents involving driverless vehicles. Of those, 81 were caused by other vehicles, and 62 of the 88 driverless vehicles were operating in fully autonomous mode.

Even as manufacturers try to showcase the safety of driverless vehicles through data, public perception may be hard to move. Public opinion is notoriously difficult to shift, and even if autonomous vehicles cause only fender benders, the public outcry will likely be fierce.

Operating a motor vehicle is already a dangerous practice. In 2020 alone, 38,824 people were killed in car accidents on U.S. roadways. Knowing how frequently fatal accidents occur, it's natural to be skeptical and a bit hesitant about turning over control of a vehicle. Even setting deadly accidents aside, there are an average of 17,000 or more car accidents a day in the U.S. These accidents can result in substantial financial damages to the at-fault party, emotional trauma and lingering injuries.

Related: A Devastating Car Accident Left This Entrepreneur Unable to Speak or Walk for Months. Here's How She Rebuilt Her Life and Her Business.

The Centre for Data Ethics and Innovation in the UK surveyed drivers and found that, as predicted, most aren't ready to turn over control so easily. Until trust is built between consumers and manufacturers, the study concluded, most drivers are unlikely to accept automated vehicles on roadways.

Who ultimately takes the blame if a driver is struck by a driverless vehicle? The owner? Or the manufacturer, for selling a product that couldn't protect the user (in this case, the passenger in the automated vehicle)? For now and the foreseeable future, owners of automated vehicles in the U.S. are held to the same standard as their traditional counterparts.

These legal questions will likely become more and more relevant. In particular, product liability lawsuits may rise in the future, and some U.S. states are already beginning to adopt laws under which the manufacturer could be treated as the "driver" in some situations. Ultimately, anyone looking to prove fault on the part of an artificial intelligence or computer program faces a lengthy and difficult battle. Limited precedent makes this a difficult theory to argue in a court of law, but not an impossible one.

Related: Grand Theft Auto V Is Helping Teach Driverless Car AI

Product liability law traditionally covers products that cause bodily harm or general injury to the user. Anyone arguing such a case must establish proof of injury, proof of a defect, proof of appropriate use and, ultimately, a clear connection between the injury and the defect. A product simply not working as advertised does not by itself create liability on the manufacturer's side; the product must cause injury, illness or other harm to the user or other relevant parties for there to be a case.

As more and more autonomous vehicles take to the roads, these issues will become more common, and the legal impact will be made clearer.

Hank Stout

Entrepreneur Leadership Network® Contributor

Attorney at Law

Hank Stout co-founded Sutliff & Stout, Injury & Accident Law Firm because he wanted to help real people with real problems. Raised in a small West Texas town, Hank was taught the value of hard work, determination, fairness and helping others.

