“Beware of the ‘Pig Butchering’ Scam Targeting Your Assets: How to Avoid Being Cheated”

Scam artists in the “pig butchering” scheme have created fake online personas to lure victims into their fraudulent investment schemes.

You may not need to worry about Skynet just yet, but it’s time to start worrying about online pig butchering scams. Fraudsters are expanding their use of tools that infiltrate victims’ lives and drain their accounts of cryptocurrency or real money.

The so-called “pig butchering” scam is an online investment fraud where scammers create fake online personas to lure victims into their deceptive investment schemes. The term “pig butchering” refers to fattening up victims over time through the establishment of trust, only to eventually slaughter them and plunder their finances.

First, scammers create a false online identity. This might be a handsome, single, wealthy investor, depending on whether the chosen channel is a dating app or social media. The aim is to appeal to victims’ interests while exploiting their vulnerabilities. Photos, personal details, and backstories are carefully crafted, either stolen from real people or generated by AI.

They initiate contact with scripted messages to gauge each victim’s receptiveness, and to make the effort profitable, they work many targets at once.

Over weeks or even months, as the relationship deepens and sometimes turns romantic, scammers use a sophisticated “mirroring” technique: they closely echo the victim’s language, interests, and viewpoints to build connection, familiarity, and trust.

Once trust is firmly established, scammers steer the conversation towards their “investment” pitch. It could be gold, foreign currency exchange, or cryptocurrencies. They present themselves as knowledgeable, experienced investors who only want to help victims get started and succeed.

Next, scammers persuade victims to download an investment app or visit a fraudulent trading platform. They walk them through how it works, help set up an account, and accept an initial deposit. Savvy scammers make the “investment” appear to grow over time, gradually coaxing victims into putting in more money. Much like the fabricated statements of the infamous fraudster Bernie Madoff, fake monthly reports supply “proof” that the investment is paying off.

When they have extracted as much money as possible, the scammers vanish without a trace. In the worst case, the information they collected is used for identity theft or to target the victim’s friends and family.

The United Nations Office on Drugs and Crime (UNODC) recently issued a report sounding the alarm on the rapidly expanding criminal ecosystem.

Many digital scams rely on social engineering, manipulating victims into willingly handing over money, rather than on malicious software or other technical methods.

Researchers now caution that scammers are integrating generative AI content and deep fake technology to scale up their operations and efficiency.

Until recently, scammers’ limited language skills and inability to engage hundreds of victims at once restricted how many people they could target.

However, advances in generative AI over the past few years, such as writing tools like ChatGPT, make it easy for criminals to overcome language barriers and mass-produce the content their scams require.

The UN report goes on to say that artificial intelligence is used for automating phishing attacks, creating false identities and online profiles, developing personalized scripts to deceive victims, and sending messages in almost any language.

The report states, “These developments not only expand the scope and efficiency of online fraud and cybercrime but also lower the entry barriers of criminal networks, which previously lacked the technical skills to exploit sophisticated and profitable methods.”

Perhaps the most significant AI transformation in digital attacks comes from what’s known as “deep fakes.” Scammers use machine-learning systems to perform real-time face swaps, changing their on-camera appearance during conversations so they resemble someone else entirely. The UN report notes that this technology enables “one-click” face swapping from high-resolution video sources, giving attackers “proof” of their claimed identities.

So how do you protect yourself? Start with common sense: trust your first impressions and instincts, and be cautious when someone online shows excessive interest in you.

Never share sensitive personal information with strangers or people you barely know.

Immediately block suspicious emails, texts, or calls.

When unsolicited messages arrive from strangers, skepticism is a healthy online habit.

Finally, educate yourself about fraud tactics and stay up to date. The new generation of AI-assisted schemes has taken fraud to a whole new level, and all of us should do our research before making any transaction or investment.