
A Question of Ethics: Connected and Automated Vehicles

The European Commission recently published a detailed report on the ethical issues associated with connected and automated vehicles (CAVs). The report aims to promote a safe and responsible transition to CAVs by engaging the various players in the field, such as manufacturers, deployers, policymakers and researchers, in addressing these ethical issues. Much like ‘privacy by design’, the report recommends ‘ethics by design’: the systematic consideration of ethics at the development stage.

The report presents a set of twenty ethical recommendations concerning the future development and use of CAVs in the context of three broad areas:

1) Road Safety, Risk, Dilemmas;

2) Data and Algorithm Ethics: Privacy, Fairness, Explainability; and

3) Responsibility.

The overlap with the privacy world is substantial, so below we consider some key data protection and privacy issues emerging from the report that manufacturers and deployers of CAVs should take to heart:

Protect Users’ Right to Informational Privacy

In line with the GDPR’s basic principles regarding data minimisation and purpose limitation, manufacturers and deployers of CAVs must inform data subjects about the specific purposes for which their data are collected. They must also give data subjects control over their data by enabling them to easily access, rectify or erase their personal data.

If personal data is processed for purposes that are not necessary for the proper functioning of the CAV, such as advertising, the carmaker has to obtain the user’s consent for such use, especially when these purposes involve sharing data with third parties. The manufacturer cannot make the use of the CAV service contingent upon the user consenting to such non-essential processing. The report specifically calls for a more nuanced approach than the “take it or leave it” model of consent that is common in many industries; instead, carmakers should develop more granular and sophisticated consent options.

Moreover, the report notes that it will not always be appropriate to request consent, for example when the CAV driver is under pressure, in an unsafe area, or in a situation where they need to make a quick decision. In such cases, the user may be in a vulnerable position with limited ability to choose or negotiate the conditions offered by the service provider. Consent given in such a situation may be deemed invalid, and the manufacturer should refrain from processing the relevant data.

Develop Transparency Strategies

A CAV moves through physical space and, while doing so, exchanges data with other CAVs, with the carmaker’s servers and with the ever-growing landscape of other connected infrastructure on the road. This complex ongoing multi-party data sharing is often difficult to explain to drivers, especially its effect on their privacy rights.

In addition, CAVs collect data about other road users, including pedestrians, which can affect their privacy rights – these other road users also need to be informed if their data are collected. The report suggests that they can be informed via in-vehicle or wearable smart-device displays, audio-visual aids on roads (e.g. street signs, flashing icons, beeping sounds), or other communication modes. Manufacturers should also endeavour to anonymise any third-party data collected by CAVs, especially when they wish to use this data for internal R&D.

Manufacturers are advised to develop meaningful, standardised transparency strategies to cater for such complex data interactions and collection.

Transparent Algorithms and Regular Audits

Algorithms in CAV systems can create new personal data about the driver or be used as a basis to make automated decisions about them. Algorithm-based machine operations and decisions can have a significant impact on users. Manufacturers and deployers must explain this impact as well as the functioning of the CAV technology in a way that is transparent and easily understood by anyone without prior knowledge of the technology.

If the CAV technology results in significant automated decisions affecting an individual, the manufacturer has to ensure that such decisions can be explained and justified, and that the circumstances that led to them can be examined.

Another CAV-specific concern is that the algorithmic basis of CAV systems and operations can introduce variables that lead to biased outcomes. The report points out that, as prior examples show, social biases prevalent in data sets, combined with limitations in sensing systems and automated machine learning models, are highly likely to reproduce and reinforce biases, for example by negatively representing women, children or people of different races. Manufacturers and deployers should take steps to increase users’ awareness of the potential risks of bias; initial examples might include “warning flags”, labelling remedies, and diversity requirements when presenting users with options. At the same time, manufacturers should develop a targeted algorithm-auditing procedure to regularly assess algorithms for efficacy and bias, in order to avoid reinforcing biases and to prevent any unnecessary lack of transparency.

Next Steps

While the report itself is not legally binding, its recommendations are intended to give manufacturers and deployers confidence to develop CAV technology in ways that are ethically defensible, and to provide a platform for future CAV research, development and deployment. We would strongly recommend that manufacturers familiarise themselves with the entire scope of the report, as its recommendations for new legal standards and policies give a good indication of the general framework that will govern such activities in the future. It is becoming clear that data protection is a crucial element in the field of CAVs, and manufacturers are advised to ensure that their products and services are aligned with the latest legal requirements.

The full report can be found here, and a factsheet here.

 

As ever, we are ready to assist with all your needs. Please don’t hesitate to contact us.

 

   

Roni Abelski
Dr Laura Jelinek
