Sentiment analysis and sensitive data
Many systems use ‘sentiment analysis’: analyzing some form of activity or communication to extract useful data about the relevant party’s state of mind. That could mean analyzing a customer’s voice to gauge whether she is in the mood for an upsell, or a client’s email to detect that he is irate and needs a discount to keep the business. There are also systems that analyze employees’ sentiment, allowing employers to optimize productivity, engagement and job satisfaction.
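For readers curious what such analysis looks like in practice, here is a deliberately crude illustration: a toy lexicon-based scorer. The word lists, threshold and function names are invented for this sketch; real systems (voice analytics, machine-learning classifiers) are far more sophisticated, and nothing here reflects the system at issue in the Hungarian decision.

```python
import re

# Purely illustrative word lists -- made up for this sketch.
NEGATIVE = {"angry", "unacceptable", "refund", "cancel", "terrible"}
POSITIVE = {"great", "thanks", "happy", "interested", "excellent"}

def sentiment_score(text: str) -> int:
    """Return a crude score: positive word hits minus negative word hits."""
    words = re.findall(r"[a-z']+", text.lower())
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def flag_irate(text: str) -> bool:
    """Flag a message for follow-up if it skews negative."""
    return sentiment_score(text) < 0
```

Note that the moment such a score is stored against a customer or employee record, it becomes data about that person’s state of mind tied to an identifiable individual, which is exactly where the analysis below begins.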
A recent fine notice in Hungary concerns exactly this kind of personal data processing, so we elaborate on a few important takeaways from that decision here:
- data relating to a person’s emotional or mental state is personal data where it is tied to other data that can identify the person, even indirectly;
- in the particulars of that case, the voice analysis itself created no data that uniquely identified a person;
- because no conclusions were drawn about a person’s physical or mental health in that case, GDPR Article 9 was not triggered. In other instances, however, the same kind of processing could well involve sensitive personal data.
Another interesting point arising from the decision is that ‘automated data processing’ does not require the decision to be made by a machine: if a computer produces a result that a human takes into consideration when making the decision, that still counts as automated data processing, and GDPR Articles 4(4) and 21 apply. In that case, the computer produced a list of recommendations for which customers to call back, while the final decision was made by an employee.
Ensure you complete a DPIA for any sentiment-analysis processing of personal data, that you meet transparency requirements, and that data subjects’ rights relating to automated decision-making are respected.
* * * * *
Reminder: this isn't legal advice. To subscribe, click here.