The third speaker in this session at the IAMCR 2025 conference in Singapore is Maham Sufi, whose focus is on misinformation and deepfakes in Pakistan. Deepfakes are AI-generated synthetic media, and their realism creates substantial potential for audiences to be misinformed; however, image manipulation has been a feature of political misinformation since well before the emergence of AI image generation technologies.
Pakistan represents a hybrid regime with weak political parties that rely on the support of other elements of the establishment – not least the military. Image manipulation has a history here, directed at various leading politicians; this has now transitioned into the deepfake era too. This project interviewed some 24 Pakistani politicians to explore their understanding of deepfakes.
Many of them had only a loose understanding of deepfakes, conflating them with other forms of misinformation, and were unaware of any tools for image verification; most simply relied on their own ability to detect misinformation. Nonetheless, they regarded deepfakes as a threat, and saw a lack of public awareness about them; many blamed the public and their low information literacy for the circulation of misinformation. They also saw such misinformation as part of a political culture thriving on entertainment and sensationalism.
The use of deepfakes is seen as a political tool to engage in character assassination, digital intimidation, and other exclusionary practices; the fear of being targeted by deepfakes can also lead to preemptive self-censorship, which entrenches power imbalances between dominant and vulnerable groups. Deepfakes also co-exist with other problematic practices such as genuine audio leaks and phone tapping, which increase their believability.
Politicians called for stronger laws and stricter punishments, and saw a need for greater media self-regulation; however, such calls appear to be driven more by a fear of losing control over political narratives than by a genuine concern for public resilience against misinformation.