12 AI Scams You Need to Be Aware Of

Artificial Intelligence (AI) was introduced to the general public to improve their lives. Little did its developers know that scammers would be among the first to leverage this cutting-edge technology to refine their tactics. Almost 48% of Americans believe they’ve become less ‘scam-savvy’ because of AI, and one in three admits it would be challenging to identify a potential scam.

Let’s explore the common AI scams making people call for stricter AI regulation.

Voice Cloning Scams

Image Credit: iona didishvili, Shutterstock

The constant refinement of AI isn’t only helping the economy grow; it’s also giving cybercriminals new avenues to dupe people. Today, anyone can use cutting-edge AI tools to clone a voice from a mere three seconds of audio. Many scammers have already started using AI voice cloning to rob people of their savings. McAfee recently surveyed more than 7,000 people across seven countries and found that almost a quarter had fallen victim to this scam and lost significant amounts of money.

Deepfake Scams

Image Credit: Deposit Photos

A deepfake scam involves impersonating another person by digitally replicating their appearance, voice, and mannerisms to steal money from potential victims. One of the biggest such cases occurred in January 2024, when an employee at a Hong Kong-based firm was tricked into sending US$25 million to fraudsters who used a deepfake video of the company’s CFO. There has been a sudden surge in such cases, and experts warn they will only increase in the coming years.

Phishing Scams

Image Credit: fizkes, Shutterstock

Phishing scams aren’t new, but the introduction of generative AI tools has helped fraudsters sharpen their phishing efforts. Many cybercriminals use AI tools like ChatGPT to write professional-sounding emails that trick people into sharing sensitive information. Such attacks have increased by 1,265% since the fourth quarter of 2022.

Romance Scams

Image Credit: Pixel-Shot, Shutterstock

AI scams aren’t limited to the finance and investment fields anymore; they’ve made their way into the dating world. Cybercriminals are creating fake identities, sultry videos, and more to catfish people looking for love on dating apps and social media platforms. Norton’s data reveals that such AI-backed online dating scams rose by 72% in 2023, with almost 27% of dating app users reporting they had fallen victim to one.

Tech Support Scams

Image Credit: Deposit Photos

These are common monetary scams that target elderly or gullible people and trick them into believing there are technical issues or viruses on their computers. Scammers then offer to fix the devices for exorbitant fees. They have started using AI to refine these fraudulent schemes, resulting in an enormous rise in tech support scams. Losses from these impersonation scams reached $1.3 billion by 2023, up from $178.3 million in 2019.

Fake News and Propaganda

Image Credit: Deposit Photos

AI-powered misinformation campaigns can manipulate people into believing fabricated stories and influence their actions. Because it is easy and comparatively cheap to create AI-generated fake news and propaganda at scale, people must learn to distinguish what is real from what is not. One such instance involved the widespread sharing of an AI-generated image showing an explosion at the Pentagon; it went viral soon after it was posted and caused a brief dip in the stock market.

Sextortion Scams

Image Credit: CC7, Shutterstock

The FBI issued an alert last year warning U.S. citizens about the rise in sextortion cases. It described how criminals use AI to doctor explicit images and videos of victims to intimidate them into paying money. As of April 2023, there had been a noticeable uptick in such cases, and authorities fear the numbers will keep rising in the coming years. The only way to protect oneself is to remain vigilant and contact the authorities immediately for support.

Fake Reviews

Image Credit: voronaman, Shutterstock

Online shoppers base their buying decisions on product reviews, and scammers use this to their advantage. They use sophisticated AI tools to generate positive, human-sounding reviews that tempt people into buying specific products. It is estimated that about 30% to 40% of online reviews are doctored or not genuine. Authorities are considering strict regulations against such practices to protect shoppers’ money and keep them from making misguided purchases.

Click Fraud

Image Credit: Andrey_Popov, Shutterstock

Cybercriminals are becoming more sophisticated in their tactics, and AI is making their work much easier. A Fox News report highlights how scammers launch advanced AI scams at scale: they purchase ad spots on search engines and point them to bogus websites disguised as authentic ones to harvest sensitive information and steal money. A single click can be enough to lose money, so people are advised to be careful before visiting unfamiliar websites.

Subscription Trap Scams

Image Credit: fizkes, Shutterstock

Scammers now use AI to automate their schemes by tricking people into signing up for fake subscriptions. They may dangle any subscription model, such as a 14-day free trial or a sweepstake, that is lucrative enough to get people to sign up and enter their credit card details. Once the card details are entered, cybercriminals make it almost impossible to cancel the subscription, and money keeps getting deducted month after month. Authorities have noticed an increase in such scams and published a press release in 2022 warning people not to fall for these traps.

Real Estate Scams

Image Credit: SaiArLawKa2, Shutterstock

A lot of money changes hands in real estate, making it a top target for cybercriminals. Industry experts say real estate scams existed before, but AI makes them much harder to detect. Fraudsters use AI tools to write emails and generate voicemails that sound exactly like a prospective buyer, broker, or seller. They use these authentic-sounding yet synthetic materials to manipulate victims into taking a specific action, leaving them with hefty financial losses.

Tax Scams

Image Credit: Cast Of Thousands, Shutterstock

Sophisticated AI tax scams increasingly target tax professionals to steal their clients’ sensitive information, which can cause those clients to lose money. The IRS recently published a news release warning tax professionals about AI-generated phishing emails and calls designed to extract details such as EFINs, PTINs, and CAF numbers. Scammers use this information to carry out elaborate identity theft, refund fraud, and numerous other crimes.
