Self-driving cars – who is responsible if you have an accident?
The age of 'driverless' cars is nearly upon us. Funnily enough, it's internet giant Google that's been leading the way, developing truly driverless vehicles that don't even have a steering wheel you can grab hold of if things start to go sideways.
However, the development of driverless cars has brought up an interesting legal issue - who is responsible if you have a crash? If you're not technically 'driving' a vehicle, are you, as the owner, responsible for its actions, or is it down to the manufacturer? This whole conundrum has been thrown into sharp relief lately, after two high-profile crashes in the USA: one involving a Tesla and one that involved a new Chevrolet Bolt using GM's Cruise Automation technology.
This isn't a problem yet, is it?
If you don't think that this issue concerns you just yet, then you'd be wrong. We're now on the outer fringes of driverless technology, with many new models incorporating things like crash avoidance systems, hands-free parking, or cruise control options. These all count as 'autonomous' technology, but don't for one second think you can lay the blame for a parking fender-bender or a collision caused by lane wandering on the tech. They may be automatic systems, but ultimately we drivers are still in charge of our vehicles, which means the buck stops with us. We can't blame manufacturers for our own inability to use automotive technology appropriately, so the driver must take responsibility for any accidents caused while using that tech.
Completely driverless cars raise a whole raft of new issues, and a blame game that could result in manufacturers having a really big rethink about driverless technology in general.
Putting the brakes on things
One of the most common autonomous automotive systems is the impact-awareness braking system - the vehicle uses a bank of sensors to anticipate a potential hazard and, if the driver doesn't hit the brakes, the car will. Do they work? Well, not all the time, and there have been instances where cars fitted with Automated Emergency Braking (AEB) haven't quite understood the whole 'crash mitigation' principle of their programming and ploughed into the back of a vehicle in front or, in a few instances, a wall.
In these cases, there's no point in trying to pin the blame on the technology. Reliance on automated systems that haven't got past the 1.0 level yet means that you're putting your faith in potentially unreliable tech. While shifting the blame over to the black box under the bonnet may be tempting, once again it's the driver's responsibility to take back control of the vehicle and employ their own 'crash mitigation' responses. If they don't, then the insurance and legal companies will certainly apportion blame to the driver, and not to the technology or its manufacturer.
Doing things Scandi-style
Volvo, however, has decided to buck the trend and is out in front when it comes to innovation. It was one of the first to incorporate sophisticated cruise control systems into its cars, and has been at the forefront of the race (if you'll pardon the expression) towards driverless tech. In 2015, the company officially said it would accept full responsibility and, more importantly from a legal standpoint, liability for accidents involving its driverless cars, if the accident was the result of a flaw in the car's design or manufacture. Google has made similar claims, but the issue may lie with legislation, rather than the manufacturer's willingness to shoulder responsibility.
Currently, the number of truly autonomous vehicles on public roads can be fairly accurately estimated at a big fat zero. Legislators are still very jittery about testing first-generation tech on the open road. Even in the USA, one of the most progressive countries in the world when it comes to autonomous vehicles, state legislators are reluctant to allow them onto the public highway. That, in turn, is holding back development, as well as log-jamming any potential data that would enable manufacturers to address any safety issues. Put simply, even the manufacturers don't know how well these driverless cars are going to perform when it comes to the real world.
Until we reach 2.0-level tech (or beyond), where the bugs have been properly ironed out, and until it has been clarified who is responsible for what, the driver's default position should be that they are ultimately responsible for their own actions, and for those of their vehicle.