Is legal certainty too big an ask in the information technology revolution?

This article was written by Nicoletta Bakolas, who spent a week with us as the recipient of the Best Student in IT Law Prize from the University of Bristol, a prize sponsored by Gregg Latchams for the third year running.

The UK in the driver’s seat

The race to become a key player in the particularly competitive market for the development of Connected and Autonomous Vehicles (CAVs) has begun, and achieving this status is especially attractive to the UK in the post-Brexit era.

The current regulatory framework in the UK allows for the testing of CAVs, as set out in the 2015 Summary Report entitled ‘The Pathway to Driverless Cars’. The report also identifies other competitive advantages of testing in the UK, namely the varied weather and traffic conditions across the country, which allow for the collection of highly varied data. The government, recognising the potential for securing market growth and boosting the economy by investing in automation, is adopting a permissive approach to the regulation of automated technologies in order to provide attractive ground for developers and investors. The government’s daring strategy envisages an autonomous technologies revolution, with driverless cars on the road by 2021. Significantly, in March 2018 the UK government commissioned a three-year review by the Law Commission intended to form the basis of a regulatory framework for automated vehicles, covering safety assurance mechanisms, safety certification and accountability for accidents and criminal offences.

As noted in the 2015 Summary Report, while the UK is a signatory to the 1968 Vienna Convention on Road Traffic (which, among other things, requires a driver to be in control of their vehicle at all times), it is not bound by it because it has not ratified it. However, in order to ensure that drivers can continue to travel as they do now in EU member states post-Brexit, the UK will be ratifying the Convention, which is in the process of being amended to allow for CAVs as long as the system can be overridden or switched off by the driver.

The blame game

Questions raised by CAVs are numerous and interdisciplinary. The ‘Moral Machine’ developed by the Massachusetts Institute of Technology helpfully demonstrates the ethical implications of how CAVs react to emergency scenarios, and the ‘trolley problem’ has been at the forefront of public discussion, raising valid concerns about moral programming and AI decision-making capabilities. Nevertheless, the popularity of these enquiries has pushed issues surrounding blameworthiness to the margins of the debate. Liability controversies have already begun to emerge as technology threatens to render traditional legal concepts inapplicable, overtaking legislators at both national and EU level.

It is vital at this stage to acknowledge that the legal issues differ depending on the level of vehicle automation, which ranges from specific automated features to autonomous decision-making and, ultimately, the absence of any involvement from a driver. Thus the driver/user, the owner, the manufacturer and the programmer are all parties to whom liability may be attributed, depending on context. To put this into perspective, adopting the scale of automation levels set by the Society of Automotive Engineers, the driver at level 2 remains obliged to perform safety functions and would therefore reasonably retain liability arising from any related error. Approaching level 5, however, driver interaction becomes non-essential and the risk shifts markedly to other parties.

The existing approach to such questions will require adjustment to reflect the diminishing role of the driver and the expanding responsibility of manufacturers and software developers. As negligence becomes an inappropriate basis for placing blame on the “driver” of a fully automated vehicle, attention turns towards owner and product liability. While concerns have been raised that product liability could harm innovation and discourage manufacturers from investing in driverless car technology, it is unreasonable to expect owners to accept full responsibility for a product that makes autonomous decisions based on its independent programming. That is not to say that owners are to be absolved of all liability; the dangers arising from owning a vehicle are already reflected in the traditional system through existing strict liability offences.

It is within the transitional period between manual control and complete automation that the most complex legal issues arise. The lack of discernible responsibilities and uncertainty about the level of involvement required from the driver lead to inappropriate over-reliance on partially automated systems, resulting in preventable accidents. At this stage, a slightly modified question of fault arises: is it the duty of the manufacturer to ensure the consumer is appropriately informed of the limitations of the technology and, more importantly, what degree of information is necessary to satisfy that threshold and place sole blame on the careless user? Assuming that full automation will automatically resolve these legal questions is, however, a misguided approach. Even in cases of pure manufacturing error, the complexity of company structures and of the development process may cause further difficulties in attributing fault to specific parties, for instance where an accident is caused by miscommunication between several systems and vehicles.

The industry itself has recognised the barriers posed by uncertainty over responsibilities, and some companies have adopted a proactive stance to counteract these problems. Volvo, for example, made a bold statement by choosing to remove the uncertainty surrounding liability, announcing that the company will accept full responsibility for incidents caused while its car is ‘in autonomous mode’. Nevertheless, this approach is no panacea for liability questions; it is crucial to acknowledge that the declaration has not been tested in court and that the statement itself is open to interpretation. In any case, such statements are not uniform across the industry. Ultimately, the question is likely to be not who bears responsibility for an accident, but what degree of fault attaches to each party depending on the context and the level of automation involved, making a degree of legal uncertainty unavoidable.

Long road ahead

Fast-lane development and ambitious objectives are not necessarily negative strategies, so long as they are supported by critical engagement and reflection to ensure the infrastructure and legal landscape progress at an appropriate pace. Any competitive edge secured by hastily removing barriers to development is sure to be blunted by accidents, high-profile liability cases and data safeguarding scandals. Current reliance on light-touch restrictions and informal industry self-regulation raises concerns within the community, and the potential for public outcry and distrust could pose a critical barrier to future development and engagement with the market.

While it is rational to allow questions of liability to be resolved incrementally alongside the development of the technology, a functioning justice system requires the obligations and responsibilities of each party to be clear under the law. It will be interesting to see how the Law Commission approaches the core issues of civil and criminal liability during its consultation process on the ‘Automated Vehicles’ project. It is clear, however, that regardless of the findings of the consultation, unresolved issues will remain; the Commission, for instance, placed data protection and privacy outside the scope of this particular consultation, a surprising decision given the attention surrounding the General Data Protection Regulation and public concern over the misuse of personal data. In any case, any insight into, and development of, regulation is to be welcomed, as it would provide the certainty necessary for the industry to continue innovating in the field.

The contents of this article are intended for general information purposes only and shall not be deemed to be, or constitute legal advice. We cannot accept responsibility for any loss as a result of acts or omissions taken in respect of this article.

