Driverless cars, also known as Connected and Autonomous Vehicles (‘CAVs’), are a hot issue in the legal sector as many jurisdictions brace themselves for increased numbers of driverless cars on their roads. This technological leap forward raises many questions and lawyers will need to keep up with the developments in the years ahead.

Contrasting perspectives: know your client

Different stakeholders have differing interests in the self-driving revolution. Knowing your client and why they do what they do is integral to a lawyer’s role. For the producers, the car manufacturers, the main concern is safety. This is especially important considering that 95% of all road traffic accidents in the UK are the result of human error.[1] It is hoped that self-driving cars will minimise, if not eliminate entirely, the number of accidents.

Other stakeholders have different interests: suppliers would welcome the potential to save labour and resources. If all cars can communicate, predict potential traffic jams and reroute accordingly, then there is potential to save time on the road, which would benefit distributors and commuters alike. Governments and green initiatives would be interested in the potential to reduce traffic and thereby reduce pollution. There is also the chance to increase mobility for the old or infirm. Other stakeholders, such as unions, would be more concerned by the challenges that automation brings to people’s jobs and livelihoods.

The introduction of CAVs onto our roads is a technological shift that will touch all clients and industries in one way or another, and a good lawyer should understand their client’s perspective and concerns.

How do they work?

The way CAVs work differs between manufacturers, but certain features are common across designs. Video camera technology is employed to track and identify obstacles, pedestrians, road signs and traffic lights. However, a CAV cannot rely on cameras alone – what if the CAV is driving at night, or visibility is low due to fog? This is why CAVs also employ Light Detection and Ranging (LiDAR) sensors, which project beams of light to map the car’s surroundings and detect road edges. Similarly, radar sensors are used to monitor the presence of nearby vehicles and the speed at which they are travelling, in order to avoid collisions. Ultrasonic sensors located on the wheels help detect the positions of kerbs and other vehicles when parking and manoeuvring. Finally, a central computer system is needed to process the data collected from all of the sensors, cameras, radar and LiDAR to accurately determine the steering, acceleration and braking.
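The fusion step described above – a central computer combining camera, LiDAR and radar inputs into a driving decision – can be sketched in very simplified form. Everything below (the sensor fields, thresholds and decision rules) is invented purely for illustration and does not reflect any manufacturer’s actual system:

```python
from dataclasses import dataclass


@dataclass
class SensorReadings:
    """One snapshot of the (hypothetical) sensor inputs described above."""
    camera_obstacle: bool          # camera has identified an obstacle ahead
    lidar_distance_m: float        # LiDAR-estimated distance to nearest object
    radar_closing_speed_ms: float  # radar: closing speed on the vehicle ahead


def decide_braking(readings: SensorReadings,
                   min_gap_m: float = 10.0,
                   max_closing_ms: float = 5.0) -> str:
    """Fuse the individual sensor readings into a single braking decision.

    The thresholds are illustrative only; a production system would use
    far richer probabilistic models and redundancy checks.
    """
    # Obstacle confirmed by two independent sensors: brake hard.
    if readings.camera_obstacle and readings.lidar_distance_m < min_gap_m:
        return "emergency_brake"
    # Closing too quickly on the vehicle ahead: ease off.
    if readings.radar_closing_speed_ms > max_closing_ms:
        return "slow_down"
    return "maintain_speed"


# Example: camera and LiDAR agree that an obstacle is close.
print(decide_braking(SensorReadings(True, 6.0, 0.0)))    # emergency_brake
print(decide_braking(SensorReadings(False, 50.0, 8.0)))  # slow_down
```

The point of the sketch is the cross-checking: no single sensor is trusted on its own, which is precisely why CAVs carry several overlapping sensing technologies.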

Having said this, the technology is always progressing and these features will no doubt be improved and refined in the years to come. For instance, at the time of writing, Toshiba has recently upgraded its LiDAR to attain even more robust 3D scanning and we are discovering that smart infrastructure (such as smart roads) is going to be just as important as the technology on the cars themselves.[2]

How do we apportion responsibility?

This will be a central question facing litigators for the next few years. Last year, Rafaela Vasquez, the safety driver of a self-driving Uber that struck and killed a woman in 2018, was indicted for criminal negligence.[3] Uber, her employer and the company that built the automated system, did not face any charges.

The car hit a pedestrian pushing a bicycle across a darkened road, away from a designated crossing, and Uber’s software system did not consider the possibility of pedestrians crossing roads outside of crossings, or of a person pushing a bicycle while on foot. Uber’s technology repeatedly tried to “categorise” the woman as a different kind of object and predict her path accordingly. By the time the vehicle sounded the alarm, it was too late for Vasquez to react. This is in keeping with research showing that it is very difficult to keep human attention focused on partially automated tasks.

The decision that the safety driver should bear total responsibility is symptomatic of the difficulties legal systems face in keeping pace with advances in technology and in holding people responsible for the technology they build. It is easier to hold the human involved, the person behind the wheel or the screen, accountable for the damage. According to Ryan Calo, a law professor who studies robotics at the University of Washington School of Law, it is easier to bring a case against the human involved because this is a simpler story to tell a jury. To bring a case against the company, you have to tell a more complicated story about how driverless cars work and what the company did wrong.[4]

The current state of the law

The laws around self-driving cars differ between jurisdictions. In England and Wales, the Law Commission is in the process of carrying out a detailed review of the legislative and regulatory change necessary to support CAVs on UK roads. Currently, the legal and regulatory rules applicable to testing can be found in the Code of Practice: Automated vehicle trialling (the “2019 Code of Practice”), cyber security and data protection laws (such as the GDPR), traditional road safety laws such as the Road Traffic Act 1988, and the Automated and Electric Vehicles Act 2018.

The 2019 Code of Practice is written for the benefit of trialling organisations and aims to improve safety guidance and ensure transparency in trials. A recurring trio of requirements in the Code is that trialling organisations must ensure that they have:

(i) a driver or operator, in or out of the vehicle, who is ready, able, and willing to resume control of the vehicle;

(ii) a roadworthy vehicle; and

(iii) appropriate insurance in place.

Annex C of the 2019 Code of Practice “bolts on” the following laws to the testing of CAVs: the Road Traffic Act 1988, the Road Vehicles (Construction and Use) Regulations 1986 and the Highway Code. These continue to apply with regard to the duties and obligations of persons responsible for driving the vehicle and compliance with road traffic directions (for example, the fact that the safety driver is not the only entity in control of the vehicle does not mean they may be any less alert, sober or compliant with traffic signs).

Since some trials may use members of the public as passengers in their CAVs, and those passengers may be filmed as part of the trial, testing companies must ensure that they comply with data protection legislation, including the requirements that personal data is used fairly and lawfully, kept securely, and retained for no longer than necessary. Companies should also follow the Key Principles of Cybersecurity for Connected and Autonomous Vehicles, developed by the Department for Transport in conjunction with the Centre for the Protection of National Infrastructure. These principles are aimed at companies throughout the supply chain of the trialling organisation, such as those involved in the manufacture of parts and the creation of the technology used in the development, testing and deployment of CAVs.

Lastly, when it comes to insurance, the Automated and Electric Vehicles Act 2018 extends the compulsory motor insurance requirement to include owners of CAVs. Motor insurance usually covers damage caused by the fault of the driver – but what if the driver is the vehicle itself? Surely then the accident is due to a fault with the vehicle? One might then ask: if an insured event such as a crash occurs while the car is in automated mode, should compensation be sought from the manufacturer on the basis of product liability?

The Automated and Electric Vehicles Act 2018 provides useful clarity that, rather than a product liability claim, compensation should be sought through the motor insurance settlement framework. This avoids the injured owner of the CAV bringing a long and costly claim against the manufacturer on a product liability basis. Instead, the owner only needs to prove that the automated vehicle was at fault, and they can recover from their insurer. The insurer can then claim against any other person liable to the injured owner in respect of the accident (e.g. the vehicle manufacturer or software developer).[5]


[1] The Royal Society for the Prevention of Accidents, Road Safety Factsheet (November 2017) 3

[2] Dr Georges Aoude, ‘Why smart roads are just as important as autonomous vehicles’ (Traffic Technology Today, 21 May 2021) <https://www.traffictechnologytoday.com/opinion/opinion-why-smart-roads-are-just-as-important-as-autonomous-vehicles.html> accessed 30 June 2021

[3] Matt McFarland, ‘Uber self-driving car operator charged in pedestrian death’ (CNN, 19 September 2020) <https://edition.cnn.com/2020/09/18/cars/uber-vasquez-charged/index.html> accessed 30 June 2021

[4] Aarian Marshall, ‘Why Wasn't Uber Charged in a Fatal Self-Driving Car Crash?’ (Wired, 17 September 2020) <https://www.wired.com/story/why-not-uber-charged-fatal-self-driving-car-crash/> accessed 30 June 2021

[5] Matthew Felwick, Lydia Savill, Emmie Le Marchand and Eleanor Griffith, ‘The Road Ahead: Product Liability and Motor Insurance Implications of the Automated and Electric Vehicles Act 2018’ (2019) 74 International Products Law Review 6