The Information Commissioner’s Office (ICO) has warned companies against using biometric technology to analyze emotions, saying that immature systems carry the risk of systemic bias, inaccuracy and discrimination.
Biometric systems used for emotional analysis rely on processing data collected in various ways, such as gaze tracking, mood analysis, facial movements, gait analysis, heartbeats, facial expressions and skin moisture. The aim is to gauge someone’s emotional state in order to tailor a service offered by a company.
This means a large amount of data is collected and analyzed, which the ICO notes is “much riskier” than traditional uses of biometrics to verify or identify a person.
“The market development of biometric data and emotional artificial intelligence is immature. They may not work yet, or even ever,” Deputy Commissioner Stephen Bonner said.
“While there are opportunities available, the risks are currently greater. At the ICO, we are concerned that incorrect data analysis can lead to assumptions and judgments about an individual that are inaccurate and lead to discrimination.”
Bonner added: “As it stands, we are yet to see any emotional AI technology develop in a way that meets data protection requirements, and we have more general questions about proportionality, fairness and transparency in this area.”
The ICO is set to explore the biometrics market further in 2023. The organization is developing guidance on biometrics, due to be released in spring next year, to help companies looking to adopt technologies such as facial recognition, fingerprint scanning and voice recognition.
Meanwhile, the ICO published two new reports this week that aim to support businesses in embedding a privacy-by-design approach when adopting biometrics.
Security of biometric data is particularly important due to its immutable nature. As the ICO says: “Biometric data is unique to an individual and is difficult or impossible to change if it is ever lost, stolen or misused.”
The ICO remains cautious about biometrics
The latest cautionary note is a continuation of the ICO’s existing position on biometrics. The regulator has long been concerned about the technology, particularly the use of facial recognition. Last year, then-commissioner Elizabeth Denham said she was “deeply concerned” about the use of live facial recognition and that such mass data collection could have “significant” consequences.
The ICO has also spoken out against the use of facial recognition by the police, in schools and supermarkets, and this year fined Clearview AI £7.5m for collecting and using image data of UK residents.
https://www.computing.co.uk/news/4058924/ico-emotional-biometrics-risky-mature