Resilience to Online Censorship

Political Science professor Margaret E. Roberts writes in 2020:

* While the Internet has long been touted as a technology that is difficult to censor, regimes around the world have adopted a wide variety of censorship technologies and online propaganda strategies to try to control it. As a result, a rich debate has emerged as to whether the Internet solidifies or undermines autocratic rule. Some scholars have called attempts to control the Internet futile because the controls can often be easily circumvented (Diamond 2010). Others have a much more dire view of the ability of governments and powerful interests to manipulate the information environment and limit their own accountability…

* Internet activist John Gilmore once posited that the Internet was impervious to censorship because “[t]he Net interprets censorship as damage and routes around it”…

* For those who are unaware that censorship exists and do not know what information might be censored, compensating for information manipulation is very difficult, particularly in online contexts, where censorship is masked by algorithms and the complexity of user interfaces.

* While many pundits predicted that the Internet would be difficult to censor, a wide variety of research has shown that government censorship efforts can have a huge impact on information access and—at times—political belief and action.

* These findings, combined with online experiments in democracies that show huge impacts of friction on the Internet, suggest that the costs of access to information can have large effects on consumption of information and belief about politics. Epstein & Robertson (2015) use lab and online experiments to show that small changes in the order of results presented by a search engine have a large influence on the voting intentions of participants. While the experimental setting makes it difficult to know the external validity of this experiment, this “search engine manipulation effect” (Epstein & Robertson 2015) suggests that government manipulation of search engine algorithms could have large effects on political behavior. King et al. (2017b) show that coordinated coverage in national newspapers has large impacts on the distribution of information that is discussed in social media. Participation experiments in democracies, such as Facebook experiments (Bond et al. 2012, Jones et al. 2017), have shown that an online nudge to go out and vote can significantly increase the likelihood of participation. These studies suggest that actors with power over what information reaches Internet users, and how quickly, can potentially have a large impact on what users see, what they believe, and when they decide to participate.
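The mechanism behind the “search engine manipulation effect” can be illustrated with a toy model (my sketch, not from Roberts or Epstein & Robertson): suppose user attention decays geometrically with rank, so demoting a story by even one position sharply cuts its share of exposure. The decay rate and story labels here are hypothetical.

```python
# Toy position-bias model: attention to a search result decays
# geometrically with its rank on the page.

def exposure(results, decay=0.5):
    """Return each result's share of total attention under rank decay."""
    weights = [decay ** rank for rank in range(len(results))]
    total = sum(weights)
    return {r: w / total for r, w in zip(results, weights)}

# Hypothetical result pages: the same three items, reordered.
neutral   = ["story_a", "story_b", "other"]
reordered = ["story_b", "story_a", "other"]

# Demoting story_a from rank 1 to rank 2 halves its exposure share.
print(exposure(neutral)["story_a"])    # ≈ 0.571 (= 1/1.75)
print(exposure(reordered)["story_a"])  # ≈ 0.286 (= 0.5/1.75)
```

Under this simple model, whoever controls the ranking controls relative exposure without removing anything from the page, which is the sense in which reordering works as “friction” rather than outright blocking.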

* While fear-based censorship—meant to intimidate and deter—must be visible in order to be effective, more sophisticated forms of censorship that work through friction and flooding such as blocking of websites, reordering of search results, and covert information campaigns can exert their effects without users’ awareness (Roberts 2018). For this reason, information manipulation can easily go undetected, and users may not notice government influence on their information environment.

* Given that awareness of censorship can create more interest in censored material and can lead to backlash, governments have adapted their censorship strategies by only exerting partial control of the Internet through friction and flooding in an effort to hide their manipulation.

* It is well established in the political science literature that demand for political information is typically quite low. Downs (1957) calls citizens’ general lack of interest in politics “rational ignorance,” meaning that for the most part, people rationally should be ignorant of political issues because they are unlikely to be pivotal in those issues. Surveys have documented a very low level of political knowledge among average citizens in democracies (Converse 1964, Popkin 1994). Rational ignorance in politics may be even more likely in authoritarian contexts, where citizens have less control of the political environment than in democracies.

* Even for politically interested citizens who are aware of censorship, the inability to know what is missing might make demand for circumvention low.

* Given low demand for political information, resilience to censorship may be stronger when censorship is applied not just to political information but also to entertainment. Zuckerman’s (2015) “cute cat theory of censorship” posits that while demand for political information is often low, Internet platforms that combine entertainment (like photos of cute cats) and politics may be more immune to censorship. This theory would predict general websites (such as YouTube, Facebook, and WhatsApp) that contain both entertainment and political information to be more resilient to censorship than specific websites that mostly offer political information. This argument is consistent with scholarship that shows that consumption of political information increases when it is paired with entertainment. For example, Baum (2002) shows that Americans are more likely to consume news about international politics when the news is paired with human interest stories. Pan & Roberts (2020) show that before the block of Wikipedia, mainland Chinese users largely sought out entertainment content on Wikipedia, but they ended up consuming political content because they were directed to it through the Wikipedia homepage.

About Luke Ford

I've written five books (see Amazon.com). My work has been covered in the New York Times, the Los Angeles Times, and on 60 Minutes. I teach Alexander Technique in Beverly Hills (Alexander90210.com).
This entry was posted in Censorship.