17 December, 2015
Over the past few years, major steps have been taken in the development of software capable of recognising emotions. This software could be used in various ways; one of the more obvious applications would be fine-tuning advertisements based on a consumer's emotions. However, if the software were used in that fashion, it would very likely touch upon the right to privacy and the protection of the personal data of those under analysis. Organizations that put emotion recognition software to use ought to investigate whether they are subject to, and comply with, all applicable privacy legislation.
Reading the emotions of consumers could be very valuable for organizations. Knowing how a person reacts to certain advertisements could, for example, enable organizations to advertise more effectively. As mentioned previously, privacy legislation could apply in such cases. This is the case, for example, if data regarding the emotional state of data subjects is stored or combined with other data for the purpose of profiling. In order for organizations to make optimal use of the possibilities offered by such software, it is important that they take the relevant privacy legislation into account before starting any processing activities in this regard.
In determining the applicability of the Dutch Data Protection Act (Wet bescherming persoonsgegevens or Wbp), it must be assessed whether personal data is processed. The Wbp does not apply, for example, when only anonymized data is processed. Organizations thus first need to establish whether personal data is processed and, if so, what categories of personal data can be identified. The next step is to investigate how privacy risks can be diminished (Privacy by Design). An instrument that could be used in that regard is a Privacy Impact Assessment (PIA). Conducting a PIA prior to the processing of personal data could give insight into the privacy risks, thereby enabling the organization to take well-informed measures to diminish or prevent the impact on a consumer's privacy.
If privacy and the protection of personal data are well embedded in the organization, consumer trust, and thereby consumer support for the processing of personal data, could grow. This increases the chances of success for projects in which emotion recognition software is used. Especially with regard to new technologies, it is of great importance to create sufficient support among the general public and/or users of the service.
The ‘Ethics in Networked Systems Research’ project at the Oxford Internet Institute launched a...