
STF’s Cyber Cell issues advisory on AI-based voice cloning

Friday, 18 August 2023 | PNS | DEHRADUN

The Cyber Cell police station of the Uttarakhand Special Task Force (STF) has issued an advisory on artificial intelligence (AI)-based voice cloning, a cybercrime in which AI is used to clone a person’s voice in order to dupe his or her friends or family. The authorities issued the advisory after the State’s first such case was registered in Dehradun last week, in which a Vasant Vihar resident was cheated of over Rs six lakh. Thousands of such cases have been registered in various parts of the country.

According to the officials, AI-based voice cloning and deepfakes have existed for several years, but in the past they required significant technical knowledge and access to resources. With recent advancements in AI, legitimate AI tools can now be used at little or no cost to clone a victim’s voice. The deputy superintendent of police of the Cyber Crime police station, STF, Ankush Mishra, said that the technology can create a unique human voice after studying data sets of human speech. He said that an artificial intelligence and machine learning (AI-ML) model learns the patterns and characteristics of a person’s voice from recordings, including speech patterns, accent, inflection and even breathing, along with pronunciation, tone of voice and emotion.

Criminals use this AI-ML model to convert text to speech, generating a waveform that reproduces the sound of the targeted person’s voice. They then use the computer-generated voice to call the person’s family or friends, pretending to be caught in a serious emergency; most victims end up paying, believing they are helping a loved one in need, the DSP said. He said that with the constant development of AI, criminals will keep upgrading their methods of duping people, which makes it all the more necessary for everyone to remain cautious while making any kind of digital transaction. As a precaution against such frauds, he said that anyone who receives such a call should listen carefully for unnatural pauses or a robotic speech style, along with errors in pronunciation or tone.

One should also notice whether the voice lacks emotional expression or variation and, if possible, compare it with the real person’s voice. Mishra emphasised that one must always try to call the person concerned on their personal phone number to confirm that it was really them. He also asked people not to share voice samples or recordings with unknown persons and to limit access to sensitive voice recordings or audio files. He appealed to the public to always register complaints against cyber fraud on the National Cybercrime Reporting Portal (NCRP) at www.cybercrime.gov.in or to call 1930 to report financial fraud.
