LAHORE: Fake news, or misinformation, on social media platforms is a growing phenomenon worldwide and can have a profound social, economic, and political impact on societies, with consequences ranging from election interference and polarisation to violence. The problem is particularly challenging in developing countries, where lower levels of literacy and limited exposure to technology can make users more susceptible to believing and acting upon misinformation.
LUMS faculty members Dr. Ayesha Ali (Assistant Professor of Economics) and Dr. Ihsan Ayyub Qazi (Associate Professor and Chair, Department of Computer Science) were recently awarded a research grant by Facebook for their proposal on understanding the impact of digital literacy on the spread of misinformation in Pakistan. The highly prestigious Facebook Integrity Foundational Research Award was given to only 11 proposals worldwide in 2019, including proposals from Stanford University, Princeton University, Columbia University, the University of Pennsylvania, and the University of Michigan. Of these, only two awards went to universities outside the USA.
Presenting the results of their research, Dr. Ali gave a seminar at Facebook in Menlo Park, California; she was later invited to deliver a seminar at the MIT Sloan School of Management in Cambridge.
Dr. Ali and Dr. Qazi conducted a household-level survey to capture trends in social media use among low- and middle-income users in Lahore, Pakistan, and evaluated the effectiveness of two educational interventions for countering misinformation among populations with lower levels of digital literacy in a randomised controlled setting.
The researchers used a list of actual news stories circulated on social media to measure the extent to which users are likely to believe misinformation. The first intervention educates users about common features of misinformation through a video in a local language, while the second intervention, in addition to the video, provides feedback to users about their past behaviour in engaging with misinformation. “We found that showing the video alone has no effect, while showing the video and giving personal feedback, on average, increases the ability of treated users to identify misinformation by 11% relative to the control group. The effect persists even after a lapse of 4-6 weeks after the experimental intervention,” shared Dr. Ali. “This is a very encouraging result because it opens up the possibility of implementing such interventions on social media platforms at a larger scale. It also provides insights into avenues for future research, such as exploring the role of who delivers the message and examining differential changes in behaviour depending on the strength of prior beliefs,” added Dr. Qazi. PR