Dr Xiaocong Liu, Research Assistant at Nottingham Law School, NTU
As we approach the era of autonomous vehicles, the legal community is gearing up to tackle the looming question of liability in incidents involving autopilot-enabled cars. The advent of Tesla's Autopilot system has not only revolutionised driving but has also introduced a complex legal conundrum: when an autonomous vehicle is involved in an accident, who bears the responsibility? This article compares the different approaches and attitudes adopted by the UK and US legal systems in addressing this question.
Notably, the US legal framework adopts a passive stance, relying primarily on the courts to litigate and determine the contours of liability. This has resulted in a patchwork of case law that, while providing some guidance, lacks the consistency and predictability that formal legislation might offer.
By contrast, the UK is taking strides towards a more proactive approach. Recognising the complexity and potentially wide-ranging impact of self-driving cars, the UK government has been working on forthcoming legislation designed to clearly delineate liability. This approach not only addresses current ambiguities but also anticipates future challenges, aiming to provide a legal framework that can guide both domestic policy and international standards.
Tesla's Autopilot and US Courtrooms
In 2019, a 28-year-old man, Kevin George Aziz Riad, drove a Tesla Model S through a red light in southern California while Autopilot was engaged and crashed into another car, killing two people. The resulting case delved into the blurred lines of responsibility when autopilot systems are activated. The proceedings ran until June 2023, when Riad pleaded guilty; throughout, the prosecution argued that Riad's conduct was reckless, while the defence emphasised his reliance on Tesla's technology.
It is interesting to note that in this case, Tesla's engineers testified that Riad had been using Tesla's Autopilot technology at the time of the accident, but that he made no braking movements in the six minutes leading up to the collision, even though he had both hands on the steering wheel at all times. It is also worth mentioning that Tesla publishes what amounts to a disclaimer on its website, stating that anyone using Autopilot "should be a fully attentive driver, with both hands on the steering wheel, and ready to take over at any time".
Legally, such a disclaimer is an attempt to establish a basis for the company to avoid liability for the consequences of improper use where the user fails to follow the guidelines and warnings it provides. It is not just Tesla; many technology and product manufacturers use similar language where their products may pose a risk. However, avoiding culpability and liability does not rest solely on the existence of a disclaimer. In the event of an accident, courts will consider a variety of factors, including, but not limited to, the design of the product, the clarity of the warnings and guidelines provided by the manufacturer, and whether the user could reasonably have followed those guidelines.
In a groundbreaking development, Riad was ultimately sentenced to two years of probation, setting a precedent for how cases involving self-driving car technology may be decided. The court's decision to sentence Riad to probation reflects the evolving legal view that drivers must maintain a certain level of responsibility even when using advanced assisted driving systems. Notably, evidence presented at trial showed that Riad's Tesla was travelling at 74 mph and showed no signs of braking or steering adjustments prior to the collision. This data, together with the legal outcome, underscores the need for drivers to remain engaged and responsible despite the assistance of Autopilot.
More importantly, the case is also believed to be the first felony prosecution in the US against a driver using widely available partially automated driving technology. Accompanied by growing condemnation and a rising chorus of voices from accident victims, it symbolises the broader legal dilemma of "man versus machine" that has brought Tesla's Autopilot system under scrutiny year after year.
Six months after the end of the Riad trial, a US judge, in a November 2023 decision, appeared to take a stricter and more cautious view of Tesla, finding that the reliability of Tesla's Autopilot was not as 'miraculous' as it was advertised to be.
The case arose from a 2019 accident north of Miami in which a Model 3 driven by its owner, Stephen Banner, hit the trailer of a large 18-wheeler truck as it turned onto the road, shearing off the Tesla's roof and killing Banner. Specifically, Judge Scott found evidence that Tesla "adopted a marketing strategy that portrayed its products as automated" and that Musk's public statements about the technology "had a significant impact on people's beliefs about the products' capabilities". In addition, the judge found that the plaintiff, Banner's wife, should be able to argue to the jury that Tesla's warnings in its manuals and "click-to-consent" agreements were inadequate (i.e., that they failed to adequately inform drivers about the potential pitfalls and risks of the Autopilot system). At the same time, the judge noted that the accident was "strikingly similar" to the one that killed Joshua Brown in 2016, in which the Autopilot system failed to detect a truck crossing the road, causing the vehicle to hit the trailer at high speed.
Among other things, the judge referred to a Tesla promotional video released in 2016 to promote the Autopilot feature, which showed a Tesla car seemingly driving itself without human intervention. At the beginning of the video, a disclaimer states that the person sitting in the driver's seat is there only because the law requires it, implying that the car is fully capable of driving itself.
Judge Scott noted that the video gave the impression that Tesla's Autopilot technology was so sophisticated that it could safely drive the car without human intervention. In fact, the video did not make clear that the technology was still in development, or that the driverless scenario was a long-term goal rather than a true reflection of the technology's current state. In other words, the judge found that the video could mislead consumers into believing that Tesla's Autopilot feature is more advanced and reliable than it actually is, without sufficiently emphasising that it remains a developing technology with potential problems and limitations.
To summarise the judgment, it is reasonable to conclude that the defendant, Tesla, through its CEO and engineers, was well aware of Autopilot's failure to detect crossing traffic. The US court therefore ruled that the plaintiff could proceed to trial on her claim for punitive damages against Tesla for intentional misconduct and gross negligence. This ruling is a setback for Tesla, as the company had just won two product liability trials regarding its Autopilot system in California in 2023 (case details in Further Reading).
The ruling highlights the potential discrepancy between advertised capabilities and actual functionality, raising important questions about consumer safety and corporate responsibility. Together, these two major cases illustrate the evolving legal battle over self-driving car technology in the United States, focusing on the fine line between innovation and safety, and on the primary responsibility manufacturers should bear for ensuring that their technology neither misleads consumers nor compromises safety. While in earlier cases Tesla successfully defended liability claims on various grounds and placed the onus of negligence on the drivers of its cars, subsequent judgments, such as the Banner ruling, have progressively suggested that Tesla may have knowingly misrepresented its technology, and have been accompanied by increasing scrutiny from the courts.
The UK's Attitude to Self-Driving Car Liability
Across the pond, the UK has taken a very different approach to self-driving car accidents, emphasising regulation over litigation. The Automated Vehicles Bill 2023, which will revolutionise road laws, makes it clear that manufacturers, not drivers, will be held liable for accidents involving self-driving cars. More specifically, the Automated Vehicles Bill [HL] 2023-2024, Bill 167, was introduced in the Lords in November 2023 and will have its second reading in the Commons in March 2024.
This legislative initiative is designed to provide clarity and safety for car manufacturers and drivers, encouraging the development of self-driving technology on UK roads. The bill also addresses the previously mentioned issue of misleading marketing, which directly affects companies such as Tesla. The bill stipulates that only vehicles that meet strict safety standards can be marketed as self-driving, aiming to prevent consumer confusion and ensure a higher level of road safety.
The contrast between the US and UK approaches highlights the wider global challenges of bringing self-driving cars into existing legal frameworks. While the US has approached the issue of liability gradually through the courts, the UK has opted for pre-emptive legislation, setting a clear path for the future of autonomous driving.
As autonomous driving technology continues to evolve, the legal implications of autonomous driving systems remain a contentious issue. The balance between innovation and safety, as well as clear legal responsibilities, is crucial. The decisions made by courts and parliaments around the world will determine the future of transport as we explore this uncharted territory.
Further Reading
First Case: Maria Luz Nieves v. Kevin George Aziz Riad, et al. [2020] Superior Court of California, County of Los Angeles, Case No. 20STCV08423
Second Case: Monet v. Tesla Inc. [2024] United States District Court, N.D. California Case No. 2:24-cv-00107
Third Case: Matsko v. Tesla, Inc [2022] United States District Court, N.D. California, Case No. 3:22-cv-05240
The California Case 1: Molander v. Tesla Inc [2020] Riverside County Superior Court, California, Case No. RIC2002469
The California Case 2: Justine Hsu v. Tesla Inc [2020] Los Angeles County Superior Court, California, Case No. 20STCV18473
Stefanie Dazio ‘Tesla driver who killed 2 people while using autopilot must pay $23,000 in restitution without having to serve any jail time’ (Fortune 16 December 2023) https://fortune.com/2023/12/15/tesla-driver-to-pay-23k-in-restitution-crash-killed-2-people/
Tom Krisher and Stefanie Dazio ‘Felony charges are 1st in a fatal crash involving Autopilot’ (AP News 18 January 2022) https://apnews.com/article/tesla-autopilot-fatal-crash-charges-91b4a0341e07244f3f03051b5c2462ae
Rachel Abrams and Annalyn Kurtz ‘Joshua Brown, Who Died in Self-Driving Accident, Tested Limits of His Tesla’ (New York Times 01 July 2016) https://www.nytimes.com/2016/07/02/business/joshua-brown-technology-enthusiast-tested-the-limits-of-his-tesla.html
Edward Helmore ‘Judge finds ‘reasonable evidence’ Tesla knew self-driving tech was defective’ (The Guardian 23 November 2023) https://www.theguardian.com/technology/2023/nov/22/tesla-autopilot-defective-lawsuit-musk
Department for Transport and Centre for Connected and Autonomous Vehicles ‘Policy paper: Automated Vehicles Bill 2023’ (Gov.uk 21 November 2023) https://www.gov.uk/government/publications/automated-vehicles-bill-2023
Roger Tyers ‘Research Briefing: Automated Vehicles Bill [HL] 2023-24’ (UK Parliament 01 March 2024) https://commonslibrary.parliament.uk/research-briefings/cbp-9973/