Imagine a world where you never have to sit in a bumper-to-bumper traffic jam. Or where no one flips you the bird and yells at you for taking more than 0.0001 seconds to accelerate after a red light turns green.
What if we could prevent millions of traffic-related fatalities every year?
Autonomous cars are going to forever change the way we travel. Despite the obvious benefits, there is still much work to do, which became painfully evident after the first fatality caused by a self-driving car. In the wake of that accident, more people are considering the moral and legal implications.
Human Error Car Accidents
A car crash occurs every 5 seconds in the United States. Every 13 minutes, there is a death due to one of these accidents.
Human error is the principal cause of close to 98% of motor vehicle accidents.
Avoidable mistakes too often end in tragedy. Speeding, drunk driving, distracted driving, failing to adjust for weather conditions, and other lapses could be prevented by self-driving cars.
Self-Driving Car Accident Laws
There is a very small chance that you have come across one of these vehicles in your travels. However, developers and researchers are testing them in different cities throughout the US, and it is only a matter of time before they become commonplace on American streets.
The federal government has been unable to pass any legislation regarding the safety of driverless cars or how to proceed following an accident.
Several states, such as California and Arizona, have taken the initiative to approve regulations for these situations.
In other words, it all depends on the state in which the accident occurs.
Who is at Fault?
Let’s examine the case of the pedestrian struck and killed by a driverless car in Arizona.
One of Uber’s self-driving vehicles was traveling down the road at almost 40 mph when 49-year-old Elaine Herzberg attempted to cross the street. This was the first known fatality of its kind.
At the time, the police alluded to the possibility that Herzberg may have acted imprudently in the way she stepped onto the roadway and into the path of the vehicle.
But could better technology or human diligence have prevented the tragedy?
Before anyone receives the blame for one of these autonomous vehicle accidents, one must consider the parties involved.
Self-driving Car Liability
Essentially, these autonomous cars use a computer to navigate the streets and avoid other cars and pedestrians.
Good luck exchanging insurance info with a piece of software.
However, it should be noted that sometimes a human is in the driver's seat of these self-driving cars and can take control if necessary. In some cases, that person may be at fault.
Other Human Drivers/Pedestrians
Just because a car is autonomous does not mean someone can get away with blatantly rear-ending it. The same rules of the road apply and must be adhered to.
Testing Company Liability
Companies like Uber, Waymo (by Google), Tesla, GM, and others are very involved in testing these vehicles. It is very possible that they will be liable for an accident occurring during one of these tests.
Also, one of their employees may actually be in the car during the incident. In that case, the principles of employer liability will apply.
Manufacturer Liability
In some cases, the maker of the vehicle or of the autonomous driving system is to blame. Many car makers, like GM, BMW, and Ford, are producing automated driving systems.
If it can be proven that the accident occurred due to a fault in the computer system, it may be possible to build a strong case against the manufacturer.
What You Can Do
Someday in the near future, you may find yourself or a loved one involved in a self-driving car accident.
First, call emergency services if someone is injured.
Then, there are many ins and outs to consider. No two cases are exactly alike and there may be several confounding factors. Also, local and state laws may apply.
Don’t do it alone. Contact an attorney with knowledge and experience in car accidents and personal injury claims to find out the best course of action.