American University School of International Service Big World podcast, Episode 36: Who Controls the Internet?

Who Controls the Internet?

Over the past decade, the internet’s role in international affairs has expanded, with governments, including India’s, periodically shutting down the internet; great powers, like Russia in the 2016 US presidential election, spreading disinformation; and private companies like Facebook and Twitter becoming the gatekeepers of public discourse. In this episode of Big World, SIS professor Eric Novotny joins us to discuss how the internet is used and misused to impact international affairs.

Professor Novotny describes how governments’ internet shutdowns impact democracy (2:01) and the implications of social media platforms making decisions that affect international affairs (5:33). He also breaks down issues that arise when large social media companies interact with the US government (6:04) and differentiates what constitutes misinformation and disinformation (9:41).

Now that the internet and social media have become our digital public squares, what happens when those spaces host a rise in truth decay (16:04)? How important is internet freedom in citizens’ abilities to spark change and grow human rights movements (17:56)? Professor Novotny answers these questions and describes the kinds of anti-censorship tools he’s built that have been used in different countries (20:20). Lastly, he reveals whether he believes the internet is still a great equalizer or if it has been completely taken over by governments and corporate interests (21:52).

During our “Take Five” segment, Professor Novotny shares the five practices he would institute to address misinformation and disinformation on the internet (12:10).

0:07      Kay Summers: From the School of International Service at American University in Washington, this is Big World, where we talk about something in the world that really matters. The internet was conceived as a way to quickly and easily share information. It became a great equalizer, galvanizing interest groups, driving social change, and sparking innovation. But dark corners have always existed, and so has the possibility for governments and non-state actors to weaponize information and use the monitoring and data accumulation powers of the internet in scary ways. Over the past decade, the role of the internet in shaping and even determining the outcomes of events, and even elections, has expanded, with governments periodically shutting down the internet, great powers spreading disinformation, and private companies becoming the gatekeepers of public discourse.

0:58      KS: Today, we're talking about how the internet is used and misused to impact international affairs. I'm Kay Summers, and I'm joined by Eric Novotny. Eric is senior advisor for digital media and cybersecurity at the US Department of State. He was also senior advisor for democracy and technology at the US Agency for International Development. Eric is a faculty fellow of the Internet Governance Laboratory here at American University and a professor in the School of International Service. Eric, thank you for joining Big World.

1:28      Eric Novotny: Good. Thank you very much, Kay. It's a pleasure to be here.

1:31      KS: Wonderful. So, Eric, just to get us started, there's been a trend of governments around the world shutting down the internet in the face of protests in order to keep the political opposition from organizing. These are not just authoritarian regimes taking part. India is the world's largest democracy, and authorities there have shut down the internet a number of times, most recently in the midst of farmers protesting the government's plan to deregulate wholesale trading, which is a huge issue over there. So, how do governments' internet shutdowns impact democracy?

2:01      EN: The whole idea of an internet shutdown has typically been associated with authoritarian regimes, but as you point out, this practice is spreading. And one of the reasons this happens is because of mobile communications. When the internet was primarily a desktop phenomenon, it could still be used, as you said in your introductory remarks, as a way of creating and passing information, imparting new ideas, getting work done, contributing to innovation, and so forth. But once we realized that we could take the internet in motion, that is, drop it in our backpack or our pocket and have it go wherever we were going with the same communication and functionality, the power of the internet was extended to organizing. In the Iranian civil disorders in November of 2019, an internet shutdown was used to prevent groups from organizing and communicating with one another.

3:16      EN: What's more troublesome is when democratic states begin to shut down all or part of the internet. It's something that we expected authoritarian regimes to do, and they still do, but many times we are now finding, as you point out, in India and in other countries, that mobile communications are shut down first, usually to prevent people from organizing effectively. It actually doesn't work very well, because if you take away the electronic communication that people are accustomed to, they may spill out into the streets just to figure out what's going on, since they can't communicate electronically. So, in many ways it can be thoroughly counterproductive. The second thing I'd point out is that the rules for these types of shutdowns, which you would expect to be very well-defined in democratic states, are not. And therefore, there's a lot of work to be done on under what circumstances and under what authorities governments would have the power to shut down all or part of the internet.

4:29      KS: And then in contrast to governments' control over the internet, we have social media. And some of the largest social media companies are based in the US, but the decisions they make can affect countries around the world. And in some places, Facebook in particular can almost serve as a de facto window to the internet. For instance, Facebook played a role in the Rohingya genocide in Myanmar. The company failed to prevent its platform from being used by military leaders, among others, to incite violence.

4:59      KS: So, what are the implications of private companies like Twitter and Facebook making decisions that purposefully, or even inadvertently, affect international affairs? And I guess, I would take it one more step further and say, what are the implications of them not making decisions? Because I think that's been one of the questions for these CEOs has been, what decisions are you making, but then what also are you just leaving up to benign neglect? So, what would you say are the implications of what these companies do?

5:33      EN: These platforms grew up in an environment where the legacy of free speech was firmly established. Under the US Constitution, the First Amendment basically makes it very difficult for Congress to legislate against freedom of speech, of assembly, or of the press. And under those circumstances, when these platforms, particularly Facebook, became so easy to use and so widespread around the world, that default principle, in which you did not censor almost anything unless it was in the service of conspiring to commit or committing a crime, spread to the rest of the world without the rest of the world really knowing what it had gotten into.

6:35      EN: So, two things developed from that. One is a model that may not be appropriate in all cases. Secondly, platforms like Facebook were absolved of the normal liabilities for slander or libel that would otherwise have allowed them to be sued. In a law passed by Congress, those internet platforms were basically exempted in a famous provision, Section 230 of the Communications Decency Act, and their monopoly power, or monopoly-like powers, were free to expand while their accountability was not. This exposes a fundamental tension between, on the one hand, trying to prevent or correct misinformation, disinformation, or incitement to protest and violence, and, on the other hand, respecting free speech. And this tension is perhaps nowhere more pronounced than in the case of Facebook.

7:39      KS: We see this theater of these tech CEOs testifying in some form or another before Congress, and they answer sometimes ill-informed questions from different senators or Congress people, and there's always the internet game of trying to figure out what they were really thinking or seeing if Jack Dorsey was actually tweeting while he was supposed to be testifying and that kind of thing. So, that's the theater piece. But as far as how these companies actually interact with the US government and with other governments, aside from those public performances, what are the issues that you see with how these large social media companies interact with different governments, including the US government?

8:26      EN: The ethic that has grown up out of Silicon Valley is one of complete freedom of movement, so to speak, in which these companies really do not want to be regulated. Now, why then did Mark Zuckerberg in his most recent testimony put on a publicity campaign, inviting regulation? Facebook is so dominant and so ubiquitous that it's very difficult for Congress to figure out exactly what it wants to regulate. So, Facebook can go on the record and say, "Yes, we'd like to be regulated. We want to be good citizens. We want to be responsible." But then when you actually try to write a law, which is what Congress is supposed to do, to actually perform those functions, it becomes very, very difficult to do.

9:20      KS: Some of the effects of misinformation and disinformation include increased distrust in democratic institutions and a rise in the prominence of conspiracy theory discourse, both of which we have seen lately. So, with your teacher hat on, first, please draw a clear line for me: how are misinformation and disinformation different?

9:41      EN: So, misinformation is usually defined as an erroneous assertion or belief. Disinformation, by contrast, can be partially true, in terms of how I select and assemble factual information, or how I conceal fact or fiction, but it has to feed into a strategic goal. It has to feed into a larger narrative. And typically, in disinformation, and especially in its propaganda elements, there's also an element of emotional appeal. It's not just misrepresenting a fact; it's also appealing to non-rational elements, where you are looking at "a political crisis," or "it's time to act," or "this is a terrible situation," or "it's the right thing to do."

10:40      KS: And there's also a question of intent then, isn't there? Where it becomes a little hard to parse some of this, because you can have a piece that was very clearly disinformation. Something that someone created for the purpose of misleading someone, and it gets shared by—the example's always crazy uncle, which you know, is sad because I have some really nice uncles, but let's use that example. You got a crazy uncle, they share it, they believed it, so they're sharing what they believe that conforms to their worldview, and it just happens to be something that was created for the purpose of misleading people. So, it's just disinformation being shared, but there was not, on their part, the intent to obfuscate. They believed it, which is how we get into the place, I think, where we are.

11:27      EN: Yeah. That's very true, Kay. And in part, it's because if you are saying extreme or outrageous things, then because social media is so widespread, the chances that you will find some other like-minded people who agree with you, no matter how far out that might be, are fairly high. There will be people who come to your support, and that in turn reinforces the feeling that, "Hey, I've got something here. I'm going to keep going with it."

12:10      KS: Eric Novotny, it's time to take five. This is when you, our guest, get to blue sky it and change the world as you'd like it to be by single-handedly instituting five policies or practices that could change the world for the better. What five practices would you institute to address misinformation and disinformation on the internet?

12:29      EN: So, I would say that policies should be ex post rather than based on prior restraint. That would be my number one. If you base these policies on what the Supreme Court has called prior restraint, that is, before I publish something, it's going to be reviewed and some third party is going to decide whether or not it's good for you, then you've gone wrong: many of these situations require an ex post facto, or reactive, approach, where you are essentially still engaged in the marketplace of ideas and not snuffing out opinion before it's published. Prior restraint, to me, invites instant abuse, particularly against opposition groups and civil society organizations, and I would not like to see that. The second thing I'd like to rely upon is public debate, the exposure of contending ideas, particularly false claims, and the publishing of counter-arguments, as opposed to the suppression of arguments.

13:37      EN: As long as we're conscious of it, and we do it in the spirit of getting to the answer or to the policy, separating fact from opinion and so forth. The third thing I'd like to see is organizations rise to a level at which trust is restored in the gatekeeper organizations. Fourth, I do think that there are circumstances, given the monopoly position that many social media or other big companies have, in which they should be able to temporarily suspend repeat offenders who, after the fact, constantly engage in behavior that may violate their terms of service. I realize this could be a slippery slope, but if we're not going to reform Section 230, then I think there has to be some leeway to allow private media and platforms to say, "Hey, I've got to take this down, or at least I'm not going to allow any more. I'm going to freeze this particular account."

14:49      EN: And it doesn't mean that you have to suppress information, necessarily, but you have to tell the public what you're doing and why. And then finally, I would hate to see the courts flooded with tens of thousands of free speech cases every year; I think that would be a mistake. And so, I would rely on legal process to solve these kinds of issues only as a last resort.

15:16      KS: That is a hefty to-do list. Thank you for that. Think about the way we use the internet and social media as our gathering places. In the old town square model, information didn't have such a long tail: it was shared, and if it was wrong, it was quickly disproven and people moved on. But the internet and social media are our digital public squares now. This is where we gather, and we have that long tail where information really never dies; it's always out there. So, what happens when those spaces host a rise in truth decay? Which is such a great phrase, truth decay.

16:04      EN: Yeah. One of the reasons, I think, as you point out, is that once information is published electronically in a digital network, it's very, very difficult to get rid of it. There are companies out there that perform a number of functions to try and manage this whole business, and there are services that will be on the lookout for bad things other people say about you. I've had this happen to me, where I will get a spam message that says, "Somebody posted something about you, and it's not really complimentary. You might want to hire us and we'll tell you what it is, and then you can pay to have it removed." And I look at those things and I say, "One, they're probably fake in themselves. But number two, I could see where some companies would deliberately plant that sort of thing so they can clean it up." And on the internet, it's very difficult to police that sort of thing.

17:03      EN: And then there are also media organizations that will help you shape the social media environment by actually planting information to try and steer the narrative or the conversation in one way or another. And they may never make themselves publicly known. There've been a number of scandals about organizations that have done this for both government and private sector clients that are interested in shaping the information environment.

17:34      KS: So, we talked a lot about the bad stuff. So, just a little bit about the good stuff. The internet and social media have catalyzed a number of positive things, including the Arab Spring, whose 10th anniversary is this year. How important is internet freedom in citizens' abilities to spark change and grow human rights movements?

17:56      EN: Yes, that's a very broad question and something that I could talk about for a long time. I work a lot for a number of human rights groups. I'm on the boards of directors of a number of major human rights organizations. I think I would approach it in this way: here in the West, we do complain about all of the nonsense that surfaces on social media, and some of it is very bad and cruel and untrue and so on. But we have to remember that there are millions of people in the world who would love to have a free, flat, fair, open internet and do not have it. In the Arab Spring, because mobile communications in particular were on the upswing, people were able to communicate and to organize with one another much more effectively.

18:55      EN: I think the authoritarian regimes, particularly in Tunisia and in Egypt, were caught off guard. I wrote a very detailed article on this a few years ago, where I looked at these countries and what the reactions were. And so, I think people, particularly opposition groups, were able to mobilize, organize, and begin to focus their opposition in a much greater way than would have been possible in an era of print media. Interestingly enough, what I also discovered in my research was that there was no emergence of leaders who could then propel this social media phenomenon into political power, whether to form a political party, organize an opposition government, pressure the regime, or what have you.

19:54      EN: And I'm still a bit puzzled by that. If you study revolutions, you do find that a leadership typically does emerge in cases like this, but this was not the case with a movement that was social media based, and we need to understand that a little bit more.

20:12      KS: Eric, can you tell us a little bit more about anti-censorship tools that you've built that have been used in different countries?

20:20      EN: Yeah. I have participated in the design and deployment of a number of them. When I was in government, I also funded many of these tools. What they essentially do is allow an individual citizen to overcome blocking by a government. Blocking can occur on the internet in a number of ways: you can block IP addresses you don't like, you can inspect URLs and domain names, and you can block keywords that you don't like. These governments are becoming more and more sophisticated in the way they engage in surveillance or in blocking, using techniques like deep packet inspection, where they can go down into the contents of communications, or simply blocking an entire range of IP addresses that they know a social media platform uses and wiping them out that way. What circumvention tools do is essentially trick, so to speak, the censorship software, so that it looks like a user is going to an innocent or permitted website when they're actually going somewhere else.
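The blocking techniques Professor Novotny lists (IP address blocking, domain inspection, and keyword filtering) can be sketched in a few lines of code. This is a toy model for illustration only: all of the IP addresses, domain names, and keywords below are hypothetical, and real censorship systems and circumvention tools are far more sophisticated than this simple rule check.

```python
# Toy model of the censorship rules described above: a censor that drops
# traffic by IP address, domain name, or URL keyword. All values here are
# illustrative examples, not any real deployment.

BLOCKED_IPS = {"203.0.113.10", "203.0.113.11"}   # known platform addresses
BLOCKED_DOMAINS = {"social-example.net"}          # blocked domain names
BLOCKED_KEYWORDS = {"protest", "rally"}           # filtered URL keywords


def censor_allows(ip: str, domain: str, url_path: str) -> bool:
    """Return True if this toy censor would let the request through."""
    if ip in BLOCKED_IPS:
        return False
    if domain in BLOCKED_DOMAINS:
        return False
    # Crude keyword match, as in simple URL inspection
    return not any(word in url_path.lower() for word in BLOCKED_KEYWORDS)


# A direct request to the blocked platform is dropped:
assert not censor_allows("203.0.113.10", "social-example.net", "/feed")

# A circumvention tool tunnels the same request through an innocuous-looking
# front domain and address, so none of the censor's rules match:
assert censor_allows("198.51.100.7", "weather-example.org", "/forecast")
```

The final assertion is the essence of what circumvention tools exploit: the censor can only act on what it sees on the wire, so disguising the visible destination defeats rule-based blocking (which is also why governments escalate to deep packet inspection).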

21:38      KS: Eric, last question, it's big, existential, possibly unanswerable. Is the internet still a great equalizer or has it been completely taken over by governments and corporate interests?

21:52      EN: I do believe the pendulum is actually swinging in the opposite direction. Not decisively yet, but I think people are now a little bit more aware that they cannot take at face value everything that they see on the internet. They're seeking out better information in some cases. There are organizations that are trying to change that model of the internet, in which the big tech companies depend primarily on advertising. And when that's the case, the corporations are really the customer, not you. What you are is a source of data used to push ads at you of one kind or another. And I think a few of these scandals have now exposed the fact that we may have reached the era of peak monopoly tech, or at least I hope so. There are trends in the opposite direction, but it hasn't happened as much as I would like.

22:58      KS: Eric Novotny, thank you for joining Big World to discuss the huge topic of the internet and global affairs. It's been a real treat to speak with you.

23:07      EN: Absolutely. I enjoyed it very much, Kay.

23:10      KS: Big World is a production of the School of International Service at American University. Our podcast is available on our website, on iTunes, Spotify, or wherever else you listen to podcasts. If you leave us a good rating or a review, it'll be like finding out your dog's Instagram account just topped 100,000 followers. Our theme music is "It Was Just Cold" by Andrew Codman. Until next time.

Episode Guest

Eric Novotny,
professor, SIS
