Beware of AI-Mimicked Voices in Phone Scams, Warns Tamil Nadu Police

Preeti Bali / 1:06 pm / May 2, 2024

The Tamil Nadu police’s cybercrime unit has issued a strong warning to the public regarding a new scam tactic. Fraudsters are now utilizing artificial intelligence (AI) to clone the voices of loved ones and trick victims into transferring money under the pretense of emergencies.

How the Scam Works: Mimicking Voices to Create a False Sense of Urgency

According to the police advisory, scammers acquire voice samples of the victim’s trusted contacts, such as family or friends, from social media accounts or through “wrong number” calls. This stolen audio is then fed into AI software that can convincingly replicate the person’s voice.

Armed with this mimicked voice, the scammer contacts the victim, impersonating their trusted contact and fabricating an urgent situation. This ploy creates a sense of panic and distress, making the victim more susceptible to transferring money immediately.

Protecting Yourself: Recognizing Red Flags and Verifying Requests

The advisory urges citizens to remain vigilant against such scams. Be wary of unexpected requests for money, particularly those accompanied by claims of emergencies or attempts at emotional manipulation.

To verify the legitimacy of such requests, especially those involving significant sums, use secure communication channels such as encrypted messaging apps or video calls that let you confirm the person's identity directly.

A Growing Concern: The Misuse of AI and Deepfakes

This advisory highlights the growing concerns surrounding the misuse of artificial intelligence and deepfake technology. Prime Minister Narendra Modi, in a recent interaction with Microsoft co-founder Bill Gates, expressed his concerns about the potential dangers of unregulated AI.

PM Modi advocated for the implementation of watermarks on AI-generated content as a safeguard against malicious applications.

The rise of deepfakes in India further underscores the urgency of addressing this issue. Stock exchanges BSE and NSE have issued warnings regarding fake videos featuring their CEOs, and actor Ranveer Singh filed a complaint against a deepfake video circulating on social media that impersonated his voice for political commentary.

These developments call for increased awareness and caution from the public to stay safe in an evolving digital landscape.
