Voice cloning enables many beneficial applications, but it can also help bad actors commit crimes and erode public trust by impersonating people without their consent.
ID R&D, a Mitek company and an award-winning provider of voice biometrics and liveness technology, has introduced voice clone detection as a new option for its IDLive Voice liveness detection product. Detecting voice clones and audio deepfakes can prevent fraud and crime, deter bad actors, and help preserve trust in the authenticity of digital audio communication.
The software processes a recording of speech and uses AI to determine whether it was spoken by a person or a voice clone. The covert use of a voice clone is a strong indicator of criminal intent.
Generative AI has substantially accelerated the advancement of audio deepfakes and voice clones. Today’s voice clones, created using just a short audio sample of a person’s voice, are virtually impossible for people to differentiate from their source. It is a powerful technology with many compelling applications in communications, healthcare, and productivity, particularly when combined with text-to-speech and conversational AI. But it can also be used to commit fraud and other crimes by impersonating people without their consent. These include account takeover, identity theft, misinformation, extortion, defamation, and appropriation of identity.
President Biden has issued an Executive Order establishing standards for AI safety and security, in an effort to protect Americans' privacy while seizing the promise of AI. Bruce Reed, the White House deputy chief of staff spearheading the Biden administration's AI strategy, recently said that what worries him most about AI is voice cloning.
“Just as deepfakes have made it harder to distinguish between fact and fiction in the digital world, voice clones make it hard to believe what we hear said by people we think we know and recognize,” commented Alexey Khitrov, ID R&D CEO and Co-Founder and Mitek GM. “In a world where deepfake impersonations are proliferating so rapidly, voice clone detection plays an essential role in preserving trust between people and technology, securing the voice interface from fraud.”