Google will launch a new campaign in Germany that tries to make people more resistant to the corrosive effects of internet misinformation, following its success in Eastern Europe.
The tech behemoth intends to release a series of brief videos illustrating the techniques used in many deceptive claims. In Germany, the videos will run as advertisements on platforms such as Facebook, YouTube, and TikTok. A similar effort is in the works in India.
Pre-bunking is a strategy that involves teaching individuals to identify fraudulent claims before they encounter them. The approach is garnering backing from academics and technology firms.
“There is a real appetite for solutions,” said Beth Goldberg, head of research and development at Jigsaw, a Google incubator that investigates emerging societal concerns. “Using advertisements to counter a disinformation strategy is quite unusual. And we are thrilled with the results.”
Although belief in falsehoods and conspiracy theories is not new, the internet’s speed and reach have given them increased influence. When fueled by algorithms, deceptive claims can prevent vaccinations, promote totalitarian propaganda, sow distrust in democratic institutions, and incite violence.
It’s a difficult problem with few simple answers. Journalistic fact checks are effective, but they are labour-intensive, not read by everyone, and will not persuade people who are already sceptical of traditional media. Content moderation by technology companies is another option, but it often simply pushes disinformation elsewhere while eliciting accusations of censorship and bias.
Pre-bunking videos, on the other hand, are relatively inexpensive and simple to create, and they can be viewed by millions when uploaded to prominent platforms. They also dodge the political difficulty by focusing on the tactics that make viral disinformation so contagious, rather than on the topics of false assertions, which are often cultural lightning rods.
These strategies include fearmongering, scapegoating, false comparisons, hyperbole, and context omission. Whether the topic is COVID-19, mass murders, immigration, climate change, or elections, deceptive assertions frequently employ one or more of these strategies to exploit emotions and impede critical thought.
Google conducted the broadest test of the approach to date by launching a pre-bunking video campaign in Poland, the Czech Republic, and Slovakia in the fall of 2022. The videos deconstructed various techniques used in bogus claims about Ukrainian refugees. Many of these assertions were based on startling but unsubstantiated tales of refugees committing crimes or stealing jobs from locals.
On Facebook, TikTok, YouTube, and Twitter, the videos were viewed 38 million times, a figure equivalent to a majority of the population of the three countries. Researchers discovered that participants who saw the videos were more likely to recognise disinformation techniques and less inclined to share misleading information.
The results of the pilot study contribute to growing support for the pre-bunking approach.
Alex Mahadevan, director of MediaWise, a media literacy initiative of the Poynter Institute that has incorporated pre-bunking into its own programmes in countries such as Brazil, Spain, France, and the United States, stated, “This is a good news story in what has essentially been a bad news business when it comes to misinformation.”
The method, according to Mahadevan, is a “very efficient way to combat misinformation at scale, because you can reach a large number of individuals while simultaneously addressing a wide variety of misleading claims.”
Google’s new campaign in Germany will emphasise how easily photographs and videos can be misrepresented as evidence of something false. Following last week’s earthquake in Turkey, several social media users circulated footage of a large 2020 explosion in Beirut, falsely claiming it showed a nuclear explosion caused by the earthquake. It was not the first time misinformation about the 2020 explosion had circulated.
Google will reveal its new German campaign on Monday, prior to the Munich Security Conference the following week. The timing of the statement, preceding the annual gathering of foreign security leaders, underscores increased worry among tech companies and government officials regarding the impact of misinformation.
According to Sander van der Linden, a professor at the University of Cambridge and a leading expert on the idea, pre-bunking is favoured by tech corporations because it avoids sensitive, easily politicized subjects. Van der Linden assisted Google with its campaign and is currently advising Meta, the owner of Facebook and Instagram.
In recent years, Meta has integrated pre-bunking into numerous media literacy and anti-misinformation initiatives, according to a statement sent to The Associated Press via email.
They include a 2021 initiative in the United States that provided Black, Latino, and Asian populations with media literacy training about COVID-19. Later tests revealed that participants who received the instruction were significantly more resistant to false COVID-19 claims.
Pre-bunking presents its own difficulties. The videos’ effects eventually wear off, requiring periodic “booster” videos. In addition, the videos must be well-crafted enough to hold the viewer’s interest and must be adapted to different languages, cultures, and demographics. Like a vaccine, pre-bunking is not 100 percent effective for everyone.
Google discovered that the effectiveness of its ads in Eastern Europe varied between countries. Researchers found that while the impact of the videos was greatest in Poland, they had “little to no discernible effect” in Slovakia. The videos were not developed expressly for the Slovak audience; they were merely dubbed into Slovak.
In conjunction with traditional journalism, content moderation, and other strategies for combatting disinformation, pre-bunking could help communities develop a type of herd immunity to misinformation, thereby reducing its spread and impact.
“Incorrect information is analogous to a virus. It spreads. It persists. It can influence people’s actions,” Van der Linden told the Associated Press. “Some patients experience symptoms, while others do not. If it spreads and behaves like a virus, then perhaps we can determine how to immunise people.”