A father in Pune gets a WhatsApp call. The voice on the other end is his son. Same tone. Same accent. Same way of saying “Papa.”

“Papa, I am in serious trouble. Please transfer money right now. Do not tell anyone.”

He transfers Rs 80,000 within minutes. His son calls him an hour later from his own phone. The son had no idea what had happened.

This is not a movie scene. This is a deepfake voice scam and it is happening across India right now.

What Exactly Is a Deepfake Scam?

A deepfake is AI-generated content: a video, image, or audio clip that looks or sounds completely real but never actually happened.

Scammers use this technology in three main ways: cloned voice calls, fake videos of celebrities and public figures, and live deepfake video calls. Each is covered in detail below.

Here is the part most people do not realise. In 2026, an AI tool needs just 3 to 5 seconds of your voice to clone it with frightening accuracy. That audio does not have to come from somewhere obvious. A WhatsApp voice note you sent last month. A video you posted on Instagram. A call you picked up without knowing it was being recorded. Your voice is already out there in more places than you think.

How Big Is This Problem in India?

The numbers are hard to ignore.

A 2025 analysis of AI scams found that 47 percent of Indian adults have either been targeted by an AI voice cloning or deepfake scam themselves or know someone who has. That is almost double the global average of 25 percent.

Of those who were targeted, 83 percent lost money. And nearly half of them lost more than Rs 50,000 in a single scam.

In just the last three months of 2025, the Haryana Cyber Cell alone recorded over 2,300 voice cloning fraud cases. That is a 450 percent jump from the year before.

This is not something that might happen someday. It is already happening, in every city, to people of every age, including people who consider themselves careful.

How Does a Deepfake Scam Actually Work?

There are three main types of deepfake scams targeting Indians in 2026.

Type 1: AI Voice Cloning Scam (Most Common)

This is the “Papa, I am in trouble” scam described above. Here is the full process:

  1. The scammer finds a voice sample of someone you trust, from social media, YouTube, or a public video
  2. They feed it into an AI voice cloning tool, freely available online
  3. The AI generates a clone of that voice in seconds
  4. The scammer calls you using that cloned voice and creates a fake emergency
  5. You panic and transfer money before verifying

This works on families, colleagues, and employees. A Mumbai CFO authorized a Rs 2.3 crore wire transfer after receiving what sounded like a call from his own CEO. The CEO was in a meeting the whole time. The voice was AI-generated.

Type 2: Deepfake Video Scam (Growing Fast)

In this version, scammers create a fake video of a celebrity or trusted public figure promoting a bogus investment scheme.

One of the most well known cases in India involved Ankur Warikoo, a popular personal finance educator. AI generated deepfake videos using his face, voice and likeness spread across social media and convinced his followers to join WhatsApp groups for fraudulent stock market advice. The videos looked real enough that the Delhi High Court had to step in and grant emergency relief to get them taken down.

You have probably seen similar videos yourself. Mukesh Ambani promoting a guaranteed return investment app. Narendra Modi endorsing a trading platform. Virat Kohli recommending a scheme that will double your money in 30 days.

All fake. All deepfake. None of them ever said any of it.

Type 3: Deepfake Video Call Scam (Most Dangerous)

This is the newest and most dangerous variant. The scammer joins a video call with you using a real-time AI face filter that overlays someone else’s face onto theirs.

You see the face of your boss, your bank manager, or a government official. You hear their voice. You believe it is them. And then you follow their “urgent instructions.”

In 2024, a Hong Kong company lost the equivalent of Rs 210 crore when an employee attended a video conference where every participant, including the CFO, was a deepfake. This type of attack is now reaching India.

Real Cases from India

Case 1: Indore Deepfake Kidnapping Video Scam

A family in Indore received a video call showing their relative being kidnapped. The visuals looked completely real. The caller was aggressive and demanded money immediately.

The family panicked and transferred the money.

Their relative was sitting at home, completely safe, with no idea any of this had happened. The kidnapping video was entirely AI generated. None of it was real.

Source: Indian Express

Case 2: AI Voice Clone Scam (Kanpur – Nephew Impersonation)

A man in Kanpur received a call from someone who sounded exactly like his nephew. The voice, the tone, everything matched. The caller said he was in serious trouble and needed money urgently.

The man transferred around ₹1 lakh without hesitating. Why would he? It sounded exactly like his nephew.

It was not. The voice had been cloned using AI and the real nephew had never made that call.

Source: Times of India

Case 3: ₹2+ Crore Digital Arrest Scam (Mangaluru)

A man in Mangaluru got a call from someone claiming to be a law enforcement official. They told him his SIM card had been linked to illegal activities and that he was under investigation. The pressure was immediate and relentless.

To clear his name, they said, he needed to transfer money.

He transferred over ₹2 crore before he realised what had happened.

Source: Economic Times

How to Know If a Voice Call or Video Is Fake

This is getting harder as AI improves. But there are still signs to watch for.

For voice calls:

  - The voice sounds slightly flat, robotic, or rushed, with odd pauses between words
  - The caller avoids personal questions and keeps steering the conversation back to money
  - The call comes from an unknown or international number instead of the person's usual one

For videos:

  - Lip movements do not quite match the audio
  - Blinking looks unnatural, or the edges of the face blur when the person moves
  - Lighting and skin tone look inconsistent from frame to frame

The most important sign in both cases: someone is creating panic and asking you to act immediately without telling anyone.

What Should You Do If You Get a Deepfake Call?

If you receive a suspicious call claiming to be from a family member, colleague, or official:

  1. Do not transfer any money immediately, no matter how urgent they sound
  2. Hang up and call the person back on their known number directly
  3. Ask a question only that person would know the answer to
  4. If a video call looks suspicious, ask them to wave or blink, which trips up some AI filters
  5. Report the number immediately by calling 1930

If a suspicious link was shared during the call, paste it into the ScamDekho URL Checker before clicking anything.

How Are Scammers Getting Your Voice Sample?

You might be thinking: I am not a celebrity, so why would anyone clone my voice?

The answer is that scammers do not need you to be famous. They target your family members. If your child has even one Instagram Reel or YouTube Short online, that voice can be cloned.

Common sources scammers use to steal voice samples:

  - WhatsApp voice notes you have sent or forwarded
  - Instagram Reels, YouTube Shorts, and other videos posted on social media
  - Phone calls from unknown numbers that are quietly recorded

This does not mean you should disappear from the internet. It means your family needs to know this threat exists.

What Is the Indian Government Doing About Deepfakes?

The government has taken some steps in 2026 worth knowing about.

New IT rules that came into effect in February 2026 now require social media platforms to remove deepfakes within 3 hours of receiving a government or court order. For non-consensual intimate imagery, the window is even shorter, at 2 hours.

Instagram, Facebook and YouTube are now legally required to explicitly ban AI deepfake content in their terms of service and respond to takedown requests quickly. MeitY has also issued advisories requiring platforms to inform users about AI content policies at the time of registration and login.

But here is the honest assessment from experts: regulation cannot keep up with how fast this technology is moving. By the time a rule is written, the tools have already moved three steps ahead. Your own awareness is still your strongest protection, more than any policy or platform guideline.

How to Protect Yourself and Your Family

You cannot prevent scammers from trying. But you can make it much harder for them to succeed.

  1. Agree on a family code word that anyone calling to ask for money must say
  2. Always call the person back on their known number before transferring anything
  3. Make it a family rule that no one sends money based on a call alone, ever
  4. Talk to older relatives and children about voice cloning before a scammer reaches them first

Also read: AI Voice Cloning Scam in India: How It Works and How to Stay Safe for a deeper look at voice scams specifically.

What to Do If You Already Lost Money

Act within the first hour if possible.

  1. Call 1930 immediately; this is India’s cyber crime financial fraud helpline
  2. File a complaint at cybercrime.gov.in with all details: the number, the amount, the UPI ID used
  3. Contact your bank to freeze or reverse the transaction
  4. Keep all evidence: call logs, screenshots, transaction IDs

Also check if any suspicious message or link was involved using the ScamDekho Scam Message Checker.

A Word on AI Voice Cloning Scams Specifically

Your voice is now as sensitive as your fingerprint. It can be copied, replicated, and weaponized against people who love you.

This does not mean living in fear. It means having one honest conversation with your family about how this technology works. Show them this article. Tell them about the family code word. Make a rule that no one transfers money based on a call alone, ever.