As Aamir Khan’s fake video on Congress goes viral, how to spot disinformation amid elections
Kartavya Desk Staff
The first phase of voting for the Lok Sabha elections is on Friday (April 19). Over the past few weeks, there has been a deluge of disinformation and manipulated media online.

Two videos of actor Aamir Khan went viral this week. Both were manipulated versions of a promo for Khan’s popular TV show, Satyamev Jayate. In one, Khan appears to explicitly support the Congress party; in the other, he is seen speaking about nyay (justice), a key Congress talking point in recent years and the title of the party’s manifesto (Nyay Patra or ‘Document [for] Justice’).

Recently, actor Ranveer Singh too fell victim to deepfake technology, when a manipulated video of him criticising Prime Minister Narendra Modi on the issues of unemployment and inflation was widely shared. In the original clip, however, Singh was actually praising the Prime Minister.

Here is how these deepfake videos are made, and how you can spot them.

## Voice swap technology

itisaar.ai, an AI detection tool developed in collaboration with IIT Jodhpur, shows that these videos were generated using ‘voice swap’ technology. As the name suggests, this refers to the process of using an AI algorithm to alter or mimic an individual’s voice. The technology also allows creators to change the characteristics of a voice, such as accent, tone, pitch, and speech patterns, to make the videos more realistic.

Several easy-to-use AI voice swap tools are currently available for free. The creator simply has to upload or record the audio sample she wants to replace, and then customise the settings to make the uploaded sample sound as realistic as possible.

## How to spot deepfakes

While it is not easy to spot well-produced deepfakes, here are some tips to keep in mind while scrolling through social media, especially during election time.

- Verify sources: Be cautious of audio or video content from unfamiliar sources, especially if it seems controversial or sensational.
Verify the authenticity of any suspicious post by cross-referencing it with reliable sources and trustworthy media organisations.
- Listen for anomalies: Deepfake audio may exhibit subtle anomalies, such as an unnatural tenor, slightly robotic speech, and irregular pauses. Listen closely for these telltale signs of manipulated or synthetic speech.
- Scrutinise visual content: Deepfake audio is often accompanied by manipulated visuals, such as altered video footage. Check both the audio and the visual elements for discrepancies or inconsistencies. For instance, if the lips do not move in sync with the speech, the video you are watching may be manipulated.
- Stay informed: Staying updated on day-to-day news and events is key to recognising the risks associated with deepfakes. It is harder to fool people who have a general awareness of what is happening around them.
- Use AI voice detectors: A few AI detectors, such as Optic’s ‘AI or Not’, are available for free. You can upload suspicious audio or video to such a detector, which will assess the content’s authenticity.

Ankita Deshkar is a Deputy Copy Editor and a dedicated fact-checker at The Indian Express.
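To make the "irregular pauses" tip concrete, here is a toy sketch, not how professional detectors such as itisaar.ai actually work, of one way a program can surface pause patterns in audio: split the signal into short frames, measure each frame's energy, and report the lengths of silent stretches. The function name, frame size, and threshold are all invented for this illustration.

```python
import numpy as np

def silence_gaps(signal, rate, frame_ms=20, threshold=0.01):
    """Return the lengths (in seconds) of silent stretches in `signal`.

    A frame counts as silent when its RMS energy falls below `threshold`.
    Both values are arbitrary choices for this sketch.
    """
    frame_len = int(rate * frame_ms / 1000)
    n_frames = len(signal) // frame_len
    frames = signal[: n_frames * frame_len].reshape(n_frames, frame_len)
    rms = np.sqrt((frames ** 2).mean(axis=1))  # per-frame energy
    silent = rms < threshold                   # mask of quiet frames
    gaps, run = [], 0
    for is_silent in silent:
        if is_silent:
            run += 1
        elif run:
            gaps.append(run * frame_ms / 1000)
            run = 0
    if run:
        gaps.append(run * frame_ms / 1000)
    return gaps

# Synthetic demo: 1 s of "speech" (a tone), a 0.5 s pause, then 1 s of tone.
rate = 16000
t = np.linspace(0, 1, rate, endpoint=False)
tone = 0.5 * np.sin(2 * np.pi * 220 * t)
clip = np.concatenate([tone, np.zeros(rate // 2), tone])
print(silence_gaps(clip, rate))  # reports one gap of roughly 0.5 s
```

In real speech, some pauses are natural; the idea behind the heuristic is that synthetic or spliced audio can show pauses that are oddly uniform, oddly placed, or unnaturally long, which is exactly what a listener is being asked to notice by ear.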