AI scams on the rise; voice cloning and deepfakes target Indians

Nearly half of Indian victims of AI voice scams report losing over Rs 50,000, says McAfee report

In March 2025, a 72-year-old homemaker in Hyderabad lost Rs 1.97 lakh to an artificial intelligence (AI)-driven fraud. The scam began with a WhatsApp message that seemed to come from her sister-in-law in New Jersey, urgently requesting money. When she called back, a voice that sounded familiar responded with a simple “Yes.” Believing it to be genuine, she transferred the money via Google Pay. 

Investigators later confirmed that fraudsters had used AI-generated voice cloning to mimic her relative. The Hyderabad cybercrime unit is now tracing the digital trail, according to a report in The Indian Express.

Rising cases and statistics

Police have cautioned that such scams are increasing sharply and urged citizens to verify money requests even from known contacts. Suggested measures include using video calls for confirmation, enabling WhatsApp’s two-factor authentication, and reporting suspicious activity immediately.

A McAfee study earlier in 2025 found that 83 per cent of Indians targeted by AI voice scams reported monetary losses, with 48 per cent losing over Rs 50,000. The report also revealed that 69 per cent of Indians could not distinguish between real and AI-generated voices, while 47 per cent had either been victims themselves or knew someone affected—almost double the global average of 25 per cent.

Why these frauds succeed

“AI-based scams in India have surged in 2025, exploiting WhatsApp, SMS, and phone calls,” said Imteyaz Ansari, Founder and Chief Executive Officer, Azmarq Technologies. 

Experts explained that fraudsters often use publicly available data to clone voices or create deepfake videos, inserting them into emails, calls, or fake video conferences to extract money or sensitive information. 

According to Neehar Pathare, Managing Director, 63SATS Cybertech, “These scams succeed because they exploit human psychology—fear, trust, and urgency.”

Safety measures and reporting

Experts recommend avoiding the sharing of one-time passwords (OTPs) or login details, verifying identities through video calls or official channels, and being cautious of offers that appear too good to be true. 

Citizens are also advised to limit personal information shared online and enable two-factor authentication.

Those who fall victim are urged to report immediately on cybercrime.gov.in or through the helpline 1930, and to preserve evidence such as screenshots and messages. “The best way to stay protected is to pause before responding, question before believing, and verify before acting,” Pathare added.
