How to Design Connected and Automated Vehicles While Taking Legal and Ethical Concerns into Consideration

Authors: Paolo Balboni, Kate Francis, Anastasia Botsi

Edited by: Vito Pavese

This blog post is based on a paper presented by Prof. Dr. Paolo Balboni at the Workshop on AI, Law, and Ethics (WAIEL 2020), hosted in the special events section of SETN 2020, the 11th Hellenic Conference on Artificial Intelligence.

Connected and Automated Vehicles (CAVs), as is often the case with technologies based on Artificial Intelligence (AI), bring with them many advantages for society. However, they also pose legal and regulatory challenges and raise ethical concerns. By complying with the European personal data protection legal framework and embedding notions of fairness, transparency, and ethics into the development process of CAVs, user trust and technological uptake can be furthered.

The European legal data protection framework applicable in the CAV context consists of the General Data Protection Regulation (GDPR), the ePrivacy Directive,[1] and, once the adoption of CAVs reaches a critical mass, the Directive concerning measures for a high common level of security of network and information systems (NIS Directive).[2] Furthermore, according to the United Nations Economic Commission for Europe (UNECE), UN Regulation No. 155 on Cyber Security and Cyber Security Management Systems, which regulates the cybersecurity of vehicles and will be mandatory for new vehicle types in the European Union (EU) starting in July 2022, must also be taken into consideration, together with the UN Regulation on Software Updates and Software Update Management Systems and others such as UN Regulation No. 157 on the type approval of Automated Lane Keeping Systems (ALKS).[3]

While the applicable legislation provides safeguards for users in the CAV context, it should be noted that the law alone may not be enough to effectively reduce negative societal impacts to the maximum extent possible while also increasing benefits for users.[4] Indeed, the best way to adequately address ethical and human rights-related concerns is to take ethics into consideration in the development process in order to ensure that data security, transparency, and fairness are built into CAVs.

With reference to the GDPR, the developers and manufacturers of CAVs face a number of legal hurdles, such as identifying the data processing roles and ensuring that the obligations of controllers and processors are met, and complying with the principle of purpose limitation,[5] data minimization,[6] and the principles of data protection by design and by default.[7] For example, the principle of data minimization is in tension with practical processing needs, since a significant amount of data is required to train AI systems. Data protection by default requires that, by default, only the personal data necessary for each specific purpose are processed.[8] Furthermore, the European Data Protection Board's interpretation of art. 5(3) ePrivacy Directive suggests the need to request specific user consent for "storing and gaining of access to information already stored and the subsequent processing of personal data".[9] A valid legal basis must also be identified for further use of data that is stored and accessed in the CAV, such as for analyzing telematics data. This is a challenge when the legal basis for such further use is not consent (consider pay-as-you-drive insurance, which would rely on the performance of a contract). From the data security perspective, an important challenge is that the highly connected nature of CAV components may increase the potential attack surface of the CAV.[10] In addition, there is a lack of shared security standards.
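To make the notions of data protection by design and by default more concrete for developers, the following minimal Python sketch shows how a CAV telemetry component could, by default, collect only the signals needed for its core service and gate any optional processing on specific, purpose-bound consent. All class names, data categories, and purposes here are hypothetical assumptions for illustration; they are not drawn from the cited guidelines or from any real CAV platform.

```python
# Illustrative sketch only: a simplified, hypothetical example of
# "data protection by default" and purpose-bound consent in a CAV
# telemetry component. All names and data categories are assumptions.

from dataclasses import dataclass, field

# Hypothetical data categories a connected vehicle might generate.
MINIMUM_SAFETY_DATA = {"speed", "brake_status"}          # needed for the core service
OPTIONAL_DATA = {"gps_trace", "driver_behaviour_score"}  # e.g. pay-as-you-drive insurance

@dataclass
class ConsentRegistry:
    """Tracks, per purpose, whether the user has given specific consent."""
    consents: dict[str, bool] = field(default_factory=dict)

    def has_consent(self, purpose: str) -> bool:
        # By default nothing is consented to (privacy by default).
        return self.consents.get(purpose, False)

def collect_telemetry(raw_signals: dict[str, float],
                      consents: ConsentRegistry) -> dict[str, float]:
    """Return only the signals that may be processed under this simplified model.

    Safety-critical signals are kept (data minimization: only what the core
    service needs); optional signals are kept only if a specific consent
    exists for the corresponding purpose, e.g. usage-based insurance.
    """
    allowed = set(MINIMUM_SAFETY_DATA)
    if consents.has_consent("usage_based_insurance"):
        allowed |= OPTIONAL_DATA
    return {name: value for name, value in raw_signals.items() if name in allowed}

if __name__ == "__main__":
    signals = {"speed": 87.0, "brake_status": 0.0, "gps_trace": 48.8566}
    registry = ConsentRegistry()  # no consent recorded yet
    print(collect_telemetry(signals, registry))  # only the safety-critical signals remain
```

In this sketch, the absence of a recorded consent means optional data are simply never collected, which mirrors the idea that the most privacy-protective setting should be the starting point rather than an opt-out.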

In order to ensure legal data protection compliance and to foster the deployment of CAVs, the entities that develop and manufacture such vehicles must apply novel solutions which adhere to legal obligations. Among the actions to be taken: Data Management Agreements which identify and regulate the relationships between CAV stakeholders and their relative activities should be entered into; data protection by design and by default should be prioritized; and IT security risk assessments and data protection impact assessments should be carried out. Importantly, security best practices should be implemented within the organizations of the CAV stakeholders, and human oversight should be foreseen. Compatibility tests should be carried out to permit the reuse of personal data for compatible purposes. CAV developers, manufacturers, and service providers should adopt solutions that make data processing transparent to users, for example through the use of standardized icons. Finally, ethical aspects should be taken into consideration from the start of the CAV design process, with the aim of developing products that not only avoid causing harm but actually do good for the digital society.
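As an illustration of how the compatibility test mentioned above could be embedded in an engineering workflow, the following Python sketch records the kind of factors listed in Article 6(4) GDPR for assessing purpose compatibility and flags reuse as permissible only when all of them are documented as satisfied. The structure, factor names, and all-factors threshold are assumptions made for illustration; a real assessment is a documented legal judgement, not a boolean check.

```python
# Illustrative sketch only: a hypothetical checklist-style helper reflecting the
# factors in Article 6(4) GDPR. Factor names and the decision rule are
# assumptions for illustration, not a substitute for a legal assessment.

from dataclasses import dataclass

@dataclass
class CompatibilityAssessment:
    link_between_purposes: bool      # is the new purpose related to the original one?
    within_user_expectations: bool   # reasonable expectations given the collection context
    non_sensitive_data: bool         # no special categories of data involved
    limited_consequences: bool       # consequences for the data subject are limited
    safeguards_in_place: bool        # e.g. pseudonymization, encryption

    def is_compatible(self) -> bool:
        """All factors must weigh in favour of reuse in this simplified model."""
        return all((
            self.link_between_purposes,
            self.within_user_expectations,
            self.non_sensitive_data,
            self.limited_consequences,
            self.safeguards_in_place,
        ))

if __name__ == "__main__":
    # Example: reusing trip data, originally collected for navigation, to improve routing.
    assessment = CompatibilityAssessment(
        link_between_purposes=True,
        within_user_expectations=True,
        non_sensitive_data=True,
        limited_consequences=True,
        safeguards_in_place=True,
    )
    print("Reuse permitted under this simplified model:", assessment.is_compatible())
```

Keeping such an assessment as a structured record alongside the codebase can also help demonstrate accountability, since the reasoning behind each reuse decision remains auditable.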

To read the full paper and to learn more about designing CAVs in a way that respects legal and ethical concerns, see:

Paolo Balboni, Anastasia Botsi, Kate Francis, Martim Taborda Barata. 2020. Designing Connected and Automated Vehicles around Legal and Ethical Concerns – Data Protection as a Corporate Social Responsibility. WAIEL2020, September 3, 2020, Athens, Greece. Available at: http://ceur-ws.org/Vol-2844/ethics8.pdf

References

[1] Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002 concerning the processing of personal data and the protection of privacy in the electronic communications sector.

[2] The NIS Directive, applicable to operators of essential services and digital service providers, ensures the security of network and information systems vital to economic and societal activities and to the functioning of the internal EU market. See also Recital (1) NIS Directive.

[3] For more information see https://unece.org/sustainable-development/press/three-landmark-un-vehicle-regulations-enter-force

[4] European Data Protection Board, Guidelines 1/2020 on processing personal data in the context of connected vehicles and mobility-related applications, Version 1.0, 28 January 2020, p. 10. Available at: https://edpb.europa.eu/sites/edpb/files/consultation/edpb_guidelines_202001_connectedvehicles.pdf.

[5] According to Article 5(1)(b) GDPR, the personal data must be "collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes".

[6] The principle of data minimization, according to Article 5(1)(c) GDPR, requires that personal data be "adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed".

[7] Commission Nationale Informatique & Libertés, Compliance Package: Connected vehicles and personal data, October 2017. Available at: https://www.cnil.fr/sites/default/files/atoms/files/cnil_pack_vehicules_connectes_gb.pdf.

[8] Commission Nationale Informatique & Libertés, Compliance Package: Connected vehicles and personal data, October 2017. Available at: https://www.cnil.fr/sites/default/files/atoms/files/cnil_pack_vehicules_connectes_gb.pdf.

[9] European Data Protection Board, Guidelines 1/2020 on processing personal data in the context of connected vehicles and mobility-related applications, Version 2.0, 9 March 2021. Available at: https://edpb.europa.eu/system/files/2021-03/edpb_guidelines_202001_connected_vehicles_v2.0_adopted_en.pdf.

[10] See European Data Protection Supervisor, Connected Cars, TechDispatch, Issue 3, 20 December 2019, p. 2. Available at: https://edps.europa.eu/data-protection/our-work/publications/techdispatch/techdispatch-3-connected-cars_en; and the European Union Agency for Cybersecurity, Good Practices for Security of Smart Cars, 25 November 2019, pp. 6-7. Available at: https://www.enisa.europa.eu/publications/smart-cars.
