Beware of AI Voice-Cloning Scams: How Can People Protect Themselves?

With the rapid development of Artificial Intelligence (AI), its applications keep expanding. Used for good, AI can benefit society; in the hands of malicious actors, it can also pose a serious threat. Capabilities that once sounded like science fiction, such as voice cloning, are now easily achievable with AI, and criminals are already exploiting them for fraud. How should people counter such crimes?

UK online bank Starling Bank highlighted in a September press release that criminals can now use AI to replicate someone’s voice from just three seconds of audio, which can be extracted from videos people post online.

Criminals then identify the target’s acquaintances and use the AI-cloned voice, in phone calls or voicemails, to ask those acquaintances for money. This type of scam has the potential to deceive many people.

Incidents of such fraud have already occurred. In a survey the bank conducted among UK adults, 28% of respondents said they had encountered this kind of scam at least once in the past year, yet 46% had never heard of this type of fraud, let alone knew how to protect themselves against it.

The survey also found that 8% of participants would transfer money as instructed over the phone, even if they found the call suspicious.

Lisa Grahame, the Chief Information Security Officer of the bank, emphasized, “People often post audio recordings of themselves on the internet without realizing that this could make them more vulnerable to fraudsters.”

She added, “Therefore, it is essential for people to understand such fraudulent schemes and how to protect themselves and their loved ones from becoming victims.”

Four scholars, including Duane Aslett, a senior lecturer in policing studies at Charles Sturt University in Australia, wrote on The Conversation that advancing technology makes it easier for criminals to intrude on people’s personal space, and stressed the importance of using technological products with caution.

These scholars pointed out that voice cloning is a form of deepfake technology that can extract a person’s accent, speech patterns, and breathing style from a brief audio sample and use them to replicate the voice. A sample as short as three seconds is sufficient.

Once the speech pattern has been captured, an AI voice generator can convert typed text into speech that closely resembles the individual’s voice, and the result can be highly convincing.

Even a simple phrase such as “Hello, is anyone at home?” can provide enough material for voice cloning, while longer conversations give fraudsters more vocal detail to make the cloned voice sound more authentic. It is therefore wise to keep calls brief unless you are sure of the caller’s identity.

The scholars noted that fraudsters can impersonate celebrities, authorities, or ordinary citizens using AI-cloned voices in phone scams. They create a sense of urgency, gain the victim’s trust, and then request money in the form of gift cards, wire transfers, or cryptocurrency.

Numerous voice-cloning scams have made headlines. In one case, fraudsters replicated the voice of a company executive in the United Arab Emirates and swindled $35 million.

The scholars proposed various measures that individuals and organizations can take to prevent the misuse of voice cloning technology.

Firstly, raising public awareness through education and campaigns helps protect individuals and organizations and reduces the frequency of such scams.

Secondly, individuals and organizations should consider using biometric technology with liveness detection capabilities. This technology can identify and verify genuine human voices, distinguishing them from fake ones. Organizations that utilize voice recognition should consider implementing multi-factor authentication, requiring two or more verification mechanisms, such as entering a password along with fingerprint verification.

Thirdly, law enforcement agencies should enhance their capabilities in investigating voice cloning.

Lastly, governments worldwide need accurate and up-to-date regulations to manage the related risks effectively.

In conclusion, these scholars emphasized that cybercrime has significant economic impacts on a country, stressing the importance of public awareness and robust protective measures. They stated, “All stakeholders – government, citizens, and law enforcement agencies – must remain vigilant and elevate public awareness to mitigate the risks of victimization.”