They Clone Your Voice and Your Friends' Voices with AI to Commit Million-Dollar Thefts: This Is the Scam
Frauds targeting companies and families are becoming more frequent around the world, largely because victims do not know how cybercriminals can attack.
This technology can automatically mimic patterns of human behavior, including a person's speech.
The development of artificial intelligence (AI) has opened new doors for innovation but has also raised concerns about its malicious use. One of the most alarming trends is voice cloning, a practice that scammers are increasingly exploiting, resulting in multi-million-dollar frauds.
A recent study in the UK revealed that 46% of adults are unaware of these types of scams, highlighting the urgent need to raise public awareness about this dangerous phenomenon.
This malicious use of AI is expected to become more sophisticated in the coming years, increasing the risk for individuals and organizations handling large amounts of data and financial resources.
Cybercriminals: How Voice Cloning with AI Works
Voice cloning with AI is a technique that allows scammers to replicate a person's voice with surprising accuracy, using existing recordings of their speech. While most people assume hours of audio are needed to mimic a voice, as little as three seconds can be enough to create a convincing replica.
With these brief recordings, AI technology can analyze and replicate an individual’s vocal characteristics, producing an artificial voice file that, in some cases, is indistinguishable from the original.
This type of technology, which has rapidly advanced since 2023, is accessible to anyone with the right tools, posing a serious security risk. Scammers can use AI to imitate the voice of someone familiar to the victim and make phone calls requesting confidential information or bank transfers.
Victims, convinced they are speaking to a friend or family member, often comply with the criminals’ demands without questioning the authenticity of the call.
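To illustrate how low the barrier has become, the sketch below uses the open-source Coqui TTS library and its XTTS v2 voice-cloning model, one of several publicly available tools, chosen here purely for illustration; the file names are hypothetical.

```python
# A minimal voice-cloning sketch using the open-source Coqui TTS library.
# Illustrative only: the reference clip and output file names are hypothetical.
from TTS.api import TTS

# Load XTTS v2, a multilingual model that can condition its output on a
# short reference recording of a target speaker.
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# A few seconds of reference audio are enough for the model to imitate
# the target speaker's vocal characteristics.
tts.tts_to_file(
    text="Hi, it's me. Sorry to rush you, but I need a quick favor.",
    speaker_wav="reference_clip.wav",  # hypothetical short voice sample
    language="en",
    file_path="cloned_output.wav",
)
```

The entire setup amounts to a few lines of code plus a short audio sample, which is exactly what makes voices shared on social media such attractive raw material for fraudsters.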
How Much Money Can Be Stolen by Cloning a Voice with AI
Global companies have lost millions of dollars due to these types of scams.
The impact of these frauds has been significant. One notable example is the theft of 51 million Australian dollars from a company in the United Arab Emirates. In this case, fraudsters used voice cloning to impersonate a high-level executive and convince the company to make a massive transfer of funds.
This type of fraud is not an isolated incident. Reports of similar scams have surfaced in various parts of the world, from North America to Europe and Asia.
Another case that has caused alarm involves scammers using voice cloning to convince parents that their children have been kidnapped. In these situations, the criminals imitate the child’s voice to demand a ransom, instilling fear in the parents and forcing them to act under pressure.
The sophistication of these frauds has grown rapidly, and victims often have no way to verify the authenticity of the call in the moment.
What Will Happen in the Future with These AI Scams
Several cybersecurity reports warn that these crimes are on the rise and are expected to keep evolving. As the technology advances, scammers will refine their techniques, making these frauds even harder to detect and prevent.
Furthermore, the ease of access to voice cloning tools is a major concern: today, anyone with minimal technical knowledge can obtain AI programs capable of carrying out this kind of fraud.
Additionally, social media platforms and other sites where people share audio or video of themselves provide scammers with all the raw material they need.
How to Avoid Scams with AI-Created Voices
One basic precaution is to be wary of calls from unknown numbers.
Fighting these crimes requires a comprehensive approach that combines technology, legislation, and public education. It is crucial for companies and individuals to be aware of this type of fraud and take precautions.
Implementing more robust verification technologies, such as biometric recognition, could be a long-term solution to detect these voice imitations.
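As a rough illustration of that verification side, the sketch below compares a caller's voice against an enrolled voiceprint using the open-source resemblyzer library; the library choice, file names, and threshold are assumptions for demonstration, and real systems add liveness detection precisely because cloned audio can defeat naive voice matching.

```python
# A minimal speaker-verification sketch using the open-source resemblyzer
# library. Illustrative only: file names and the threshold are hypothetical,
# and production systems add liveness checks to resist cloned audio.
import numpy as np
from resemblyzer import VoiceEncoder, preprocess_wav

encoder = VoiceEncoder()

# Enrollment: compute a voiceprint from a known, trusted recording.
enrolled = encoder.embed_utterance(preprocess_wav("enrolled_user.wav"))

# Verification: compute a voiceprint from the incoming caller's audio.
caller = encoder.embed_utterance(preprocess_wav("incoming_caller.wav"))

# The embeddings are L2-normalized, so their dot product is the cosine
# similarity between the two voices.
similarity = float(np.dot(enrolled, caller))
print(f"Voice similarity: {similarity:.2f}")

# The cutoff is application-specific; values around 0.75 are a common
# starting point for this encoder.
if similarity < 0.75:
    print("Caller does not match the enrolled voiceprint; verify another way.")
```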
At the same time, governments and tech companies must work together to regulate access to voice cloning tools and impose stronger penalties on those who use them fraudulently.
Some useful tips: be cautious with unexpected calls requesting confidential information, verify the caller's identity through another channel (such as a text message or video call), and avoid posting voice recordings on social media or other public platforms without a clear purpose.