According to Big Brother Watch, a British privacy and civil liberties organization, the Voice ID system of HMRC (Her Majesty’s Revenue and Customs) has stored the voice records of over 5.1 million British citizens, all without their consent or knowledge.
The voice collection saga began in January last year, when HMRC launched a service called Voice ID that lets callers verify their identity when contacting the agency through its customer support line.
HMRC, for its part, has acknowledged the practice but maintains that callers could have used the opt-out feature. However, Big Brother Watch explained that “Upon calling HMRC’s self-assessment helpline we were met with an automated system. After the account verification questions, the system demanded that we create a voice ID by repeating the phrase ‘my voice is my password’.”
“Far from ‘encouraging’ customers, HMRC offers no choice but to do as the automated system instructs and create a biometric voice ID for a government database.”
See: Vault 7 Leak: CIA Collected Biometric Data from Partner Agencies
The privacy group further stated: “In our investigation, we found that the only way to avoid creating a voice ID is to say ‘no’ to the system – three times – before the system resolves to create your voice ID ‘next time’.”
The issue is now being investigated by the Information Commissioner’s Office (ICO). If found in breach, HMRC could face heavy fines, since the whole process may constitute a violation of the GDPR. (Read More: GDPR and the REAL impact on business).
According to Article 9 of the GDPR (General Data Protection Regulation), the data subject must “give explicit consent to the processing of those personal data for one or more specified purposes.” However, the group claims HMRC does not obtain explicit consent from users.
“We sent HMRC a Freedom of Information (FOI – PDF) request, asking how an individual could securely delete their voice ID and use the usual method to access the helpline. Disturbingly, HMRC refused (PDF) to answer our question under FOIA Exemption s31 (1) (a) — prejudice to the prevention or detection of crime,” the group said.
“This suggests that taxpayers’ voiceprints are being used in ways we do not know about.”
Tom Harwood, CPO and Co-Founder at Aeriandi, commented on the issue: “Biometrics technology has been shown to significantly reduce fraud, especially in the financial sector – but it’s not the whole solution. Last year, a pair of twins demonstrated how easy it is to trick these systems after they gained access to HSBC’s voice biometrics security platform.”
“No security technology is 100% fool-proof, and it is now possible to cheat voice recognition systems. Voice synthesizer technology is a great example. It makes it possible to take an audio recording and alter it to include words and phrases the original speaker never spoke, thus making voice biometric authentication insecure,” added Harwood.
“Organisations need additional technologies – beyond biometrics – to protect their customers. Fraud detection technology is the prime candidate. It looks at far more than the voiceprint of the user; it considers hundreds of other parameters to ensure the caller and the call are legitimate – everything from their location to the acoustic dimensions of the room they’re making the call from.”