The use of biometric applications is becoming increasingly common in our everyday life. Many regard finger scans, face or voice recognition as a harmless and secure form of identification; however, the use of biometric data also carries risks such as data misuse and identity theft. An event organised by AK EUROPA and BEUC made it clear: the safety of consumers must be guaranteed, because personal biometric characteristics, unlike passwords, cannot be changed or deleted.
A study by the Institute of Technology Assessment (ITA), commissioned by AK Vienna, examines the effects of the widespread application of biometric methods on consumers and society. The study was presented in a webinar, in which its author Walter Peissl (ITA) drew attention to the dangers and pitfalls of collecting biometric data: these include the often unresolved question of consent, the systems' high error rates for non-white people, the huge potential for misuse and doubts about the confidentiality of the stored data.
Face recognition is especially problematic, as in both the private and the public sector such data is mainly collected unnoticed. The scandals surrounding Clearview AI and PimEyes demonstrate the far-reaching dimensions of face recognition. These companies created gigantic databases of photos of human faces taken from the internet, among others from platforms such as Facebook, YouTube, Instagram, Twitter and TikTok, enabling the identification of a person in real time. The growing circulation and increasing use of biometric data in daily life also have a social impact: society's attitude towards digital applications that utilise biometric data is becoming increasingly uncritical, which gradually undermines data protection and data security.
AK expert Daniela Zimmer explained how the growing use of biometric applications in the consumer sector significantly increases the risks of misappropriation, identity theft and data misuse. Weighing the advantages against the risks and dangers of using biometric data, there is, from the perspective of consumer protection, hardly any potential for sensible applications for end users beyond a few isolated cases, such as applications with increased security relevance. Zimmer therefore emphasised that biometrics must not become a line of business. The commercialisation of and trade in biometric data, as well as sharing it with external third parties, should be banned and strictly sanctioned. This requires tightening data protection and security standards at European level.
The Deputy Director-General of BEUC, Ursula Pachl, pointed out that neither current EU legislation nor the newly proposed Artificial Intelligence Act (AI Act) adequately addresses the recording of biometric data so as to guarantee consumer protection. The AI Act proposal only bans the use of biometric data in the field of law enforcement; it does not regulate applications aimed at consumers or the use of biometric data by private enterprises. Accordingly, Pachl called for a ban on the commercial use of such data, as already requested by the EDPS (European Data Protection Supervisor) and the EDPB (European Data Protection Board). Furthermore, an alternative to the use of biometric data has to be ensured: consumers should always have the choice to decide for themselves whether or not to release and share their data.