Seven stocks carried the S&P 500 in 2023. Their names are familiar: Apple, Alphabet, Amazon.com, Meta, Microsoft, or “GAMAM,” plus NVIDIA and Tesla (Scheid 2023). Together, these Big Tech companies account for over a quarter of the value of one of the major stock indexes in the United States and, by extension, loom large in global markets. Big Tech is economically massive – so much so that observers have taken to calling these companies “monopolies” (Moazed and Johnson 2016; Hubbard 2021). But what makes Big Tech big isn’t just global market dominance and the financial value these companies generate. They have infiltrated how we see the world.

We have started to look to Big Tech to fix all things. Just think: in the early days of the Covid pandemic, Big Tech companies and their leaders, such as Bill Gates (Microsoft) and Eric Schmidt (Google), were invited to consult with governments about solutions to the global medical crisis. Apple and Google joined forces, overcoming their corporate differences to allow interoperability for the various Bluetooth-based contact tracing apps in the works worldwide. Meanwhile, public officials recommended social distancing, masking in public spaces, and air filtration while we waited for (what turned out to be) the record-breakingly quick development of an effective vaccine. These scientifically informed tips were roundly derided in many countries, resulting in long-lasting “anti-masker” and “anti-vax” political groupings. To solve a historic mass health crisis, we turned to Big Tech while distrusting medical professionals on a global scale.

In short, Big Tech companies govern how we live our lives, see the world, and perceive what is of value. Governance has no shortage of definitions or investigations in political science. Briefly, governance describes how social order is created and maintained. In reference to global governance, Rosenau famously coined the phrase “governance without government,” pointing out that governance is “a system of rule that is as dependent on intersubjective meanings as on formally sanctioned constitutions and charters” (Rosenau 1992, 4). In IR, we tend to study “global governance,” with the idea that at the international level there are many possible actors who exercise governance. One of the watershed texts in this space is Deborah Avant, Martha Finnemore, and Susan Sell’s volume (2010), which introduced the idea of “global governors” and highlighted the authority that non-state actors exercise over global policy outcomes. Transnational corporations, a category into which Big Tech companies most certainly fall, are certainly global governors.

But if governance is about regulation without formal decree, we need to look beyond “policymaking” in a formal sense. Policymaking is about rules; we therefore expect governors to negotiate over rules and try to implement them. Scholars often offer evidence that non-state actors are being invited to the governance table, a table controlled by states. Non-state actors, in this view, govern indirectly by influencing other actors, or by agitating when rules are violated. Big Tech, by contrast, builds the table itself, through its market dominance, technosolutionism, and, as we will see below, its digital platforms, which generate intersubjective meanings – and value – for all their users. Big Tech directly creates and controls how we do things through its creations. This has been called “platform governance” (Gillespie 2018; Gorwa 2019). Platform governance goes beyond traditional ways of thinking about governance to consider how we live through our objects – digital devices – which create rules about how we live.

It’s also important to point out that, in the process of governing on and around their platforms, Big Tech companies are embedded and globally widespread enough that governing through their platforms effectively means they govern beyond their platforms as well. Not only does Big Tech govern within its own space – digital technology development – but the nature of its innovations carries it well beyond “just” digital technology. Its outsized economic heft helps it dominate global markets. Its embeddedness in daily life, from smartphones to smart homes, facial recognition technology, and technology-enabled cars, makes Big Tech creations integral to the human experience (W. H. Wong 2023), including the economy and social relations.

In this article, I briefly review how non-state actors have been considered in global governance. I also link that literature to the writing on platform governance that has emerged in Science and Technology Studies (STS) and Media Studies/Communications as a way to understand Big Tech as a global governor. Building on previous work on global governors, I argue that Big Tech’s purview is far wider and deeper than that of many other governors because of the types of products and services Big Tech companies make. Big Tech products are infused into our daily activities and, as such, these companies have the capacity to shape our behaviors, values, thoughts, and relationships in ways that go far past how we have thought about the power of corporations in IR theory. Using the example of Meta, I show how one (very large) company engages in many kinds of governance: through its own internal policies that control how we understand, navigate, and live on its platforms Facebook and Instagram (Meta also owns WhatsApp and Messenger).

Global Governors

As a concept, “global governors” shed plenty of light on how politics was changing. By identifying agents rather than processes, Avant, Finnemore, and Sell underlined the critical place of non-state actors in the modern international system, examining them “as actors in their own right” (2010, 25). Though this somewhat continued a vein begun in the 1970s with the recognition that states were not the only actors of consequence in global politics (Nye and Keohane 1971), non-state actors were still generally seen not as governing so much as being included in the governing process.

In their watershed volume, Avant, Finnemore, and Sell confirmed what many of us were finding in our own work on specific non-state actors. They recruited some of the top thinkers to consider the kinds of non-state actors governing in the world, across a diverse variety of issue domains. Defining global governors as “authorities who exercise power across borders for purposes of affecting policy” (2010, 2), the volume identified several of their powers: creating issues, setting agendas, establishing and implementing rules and programs, and assessing outcomes. Critically, the power of global governors hinged on their authority – their ability to induce deference from others (2010, 9). What distinguishes platform governance – and therefore Big Tech governance – is that it does not depend on this kind of authority to enable governing.

But at the time, what made this book important was that it rather firmly established that, rather than being handmaidens to the state as governor, non-state actors were making social order, determining shared expectations, and creating ways to enforce those orders and expectations, sometimes supplementing, and often substituting for, state (in)action. A cottage industry around non-state actor power offered ways to think about the authority of global governors in various sectors (e.g., Barnett 2009; Büthe and Mattli 2011; Green 2014; Newman and Posner 2018; Stroup and Wong 2017; W. H. Wong 2012).

The global governor framework, however significant for the advance of IR theory, does not quite capture the types of corporate actors that have sprung up since the advent of so-called “Big Data.” In the late 2000s, the capability to collect vast amounts of data about human behavior through our interactions with Internet-enabled devices, together with the ability to merge and analyze these massive datasets, ushered in a new set of capacities for tech corporations and governments. As one prominent paper puts it, Big Data is the interplay between technological advances, the use of data to make societally relevant claims, and the belief that these abilities can generate a different (better) form of intelligence (boyd and Crawford 2012). Since then, “Big Tech” has taken off, thrusting itself into nearly every facet of human existence. The 2020 pandemic intensified this dynamic, as people shifted much of their lives into physical isolation while still trying to connect digitally.

Thus, the focus on “making rules,” so pronounced in the global governance literature, does not apply as well in the current era of Big Tech’s capacity to “make realities.” These companies do so by understanding our every move through the harvesting of data. They then analyze those data to find generalizable patterns that shape how we see the world and make decisions. Algorithms not only manage our online experiences; they also mediate our access to daily necessities through practices like dynamic pricing, which changes the cost of goods and services depending on real-time supply and demand metrics.

All of this is to say that we need to go beyond what Avant, Finnemore, and Sell encouraged us to think about in terms of managing various stakeholders (2010, 18–25). Not only does Big Tech make rules; it mediates and facilitates human interactions on its platforms. Platforms, as will be explained below, facilitate social, political, economic, and cultural interactions in ways that go beyond convenience. As Lessig wrote more than two decades ago: code is law (Lessig 1999). Big Tech writes the code that makes the platforms. It makes law that nudges not only individual behavior but also collective actors, such as governments and other corporations. And it makes the platforms where exchanges happen – the digital meeting places in which human interactions take place.

Platforms as Governor Makers

Many people living today find themselves working on cloud-based computing systems. Most universities subscribe to cloud computing services, so readers of this piece might be intimately familiar with how well (or badly) these systems work. There are two dominant providers of cloud computing – Amazon (Amazon Web Services) and Microsoft (Azure) – with Google a distant third (Lu 2024). Governments and businesses alike use cloud computing, and that means two companies dominate the global dynamics of how data are stored and accessed remotely.

Platforms shape and enable a rich, globally intertwined human life that is not without its downsides. For one, the definition of “platform” is contested. There are technical aspects to “the platform,” which boil down to: “If you can program it, then it’s a platform. If you can’t, then it’s not” (Andreessen 2007). A standard definition in media studies describes platforms as “online, data-driven apps and services” (Gorwa 2019, 856) that can communicate with third-party services (Helmond 2015) via application programming interfaces (APIs). APIs enable different software to communicate and share data.

Beyond the technical aspects of the platform, there are also social and political aspects. In other words, what does the platform enable (McKelvey 2011)? What parts of the technical functioning of platforms also change what users can or cannot do on those platforms? From STS and media studies we take “sociotechnical” to mean how people interact with technologies and how technologies are enabled by and build on societal characteristics (Chen and Metcalf 2024). It is important to acknowledge that platforms govern as they arise in, and change, the political, social, and economic contexts into which they are born. In other words, platforms govern by allowing things to happen (or not) on them. Companies develop and own these platforms, though companies differ in how many platforms they hold (X/Twitter has one, for example, while Meta has four). This elision between platform(s) and company can create confusion (van Dijck, Nieborg, and Poell 2019).

One example of platform governance is character limits. Twitter initially limited all messages posted on the platform to 140 characters, a cap later expanded to 280. Nowadays, X allows premium users an even higher character count, with support for different text formatting (Weatherbed 2023). In this narrow example, Twitter/X as a social media platform regulates the length of individual communications and has created tiers of users. In so doing, it creates order around shared, enforceable expectations of how users communicate and who is subject to which rules.

In another example, Apple has been changing the way that its customers use its platforms. At its June 2024 developers’ conference, the company announced Apple Intelligence, its foray into generative AI in partnership with OpenAI, which will roll out this fall with its products (M. Wong and Warzel 2024), effectively moving buyers of its forthcoming devices onto its new AI systems. On a more contentious front, Apple has recently made changes to the App Store, the app that users of Apple devices use to download additional software. In the past, it collected a hefty 30 percent commission from app developers, leading to widespread protests by other software companies, including the music streaming service Spotify and Epic Games, maker of the popular game Fortnite (Stempel 2024; Mickle 2024). To reach owners of Apple mobile devices, app developers had to play by Apple’s rules. After many years of complaints, regulators in Europe, the US, and elsewhere have started curtailing these practices.

Apple illustrates that those who control a platform (iOS) create order through shared expectations – in this case, expectations about the safety of the apps available to Apple users. In turn, Apple’s control of the iOS platform allows it to create and enforce rules, including levying hefty broker fees for the use of its App Store and infusing its devices with AI technologies created by the controversial company OpenAI.

Both X and Apple are American corporations with a global customer base. These brief examples show how the language of global governors can be very easily applied to the practices of platforms. At the same time, neither X nor Apple appeals to any other policymakers or actors for authority. The source of their authority – their ability to govern – comes from their integration into our understanding of the everyday, which forms and reinforces their market power. They are simply governing through their platforms because their platforms have become embedded in the lives of millions and millions of people. The embeddedness of platforms has led some scholars to investigate how we have become “a platform society,” in which individual platforms (e.g., Uber) exist in an ecosystem of platforms (e.g., Uber, plus Bolt, Didi, Gett, Lyft, and other ride-hailing apps) (van Dijck, Poell, and de Waal 2018). Their embeddedness determines how their millions of worldwide users might engage on them. Neither of these examples, however, comes close to how Facebook, and now Meta, governs billions of people online in the practice of freedom of expression.

Meta as Governor of Freedom of Expression
*Content in this section is drawn from We, the Data: Chapter 6.

Facebook, we were told by founder Mark Zuckerberg, was established to help connect people (Constine 2012). Who doesn’t want connection in the great, big world? Yet, in so doing, Meta has de facto become a global governor of free speech through its platforms, in particular Facebook and Instagram. It does so by creating enforceable rules, but also formally and explicitly as an arbiter of what counts as legitimate freedom of expression on Facebook and Instagram through the establishment of the Oversight Board.

Meta has, depending on who is counting and when, some 3–4 billion users of its products overall. The point isn’t the precision of the number but the scale: somewhere between almost half and more than half of the world’s population uses a Meta product on a regular basis. That is more people than any state can legitimately claim to govern. One in two people in the world uses Meta platforms to express themselves, subject to the terms and conditions set by Meta, what the company calls “community standards.” That is a lot of content, easily misinterpreted across the many geographical boundaries contained within Meta’s borders. On a daily basis, content moderators, aided by AI systems, evaluate whether jokes are actually jokes and whether images are lewd or political, and track down violent or law-breaking videos.

The Oversight Board (OB) is the brainchild of Harvard law professor Noah Feldman, who envisioned it as a way to make decisions about content removal and reinstatement on two of the world’s most popular social media platforms. Content moderation is case-by-case and can be quite ad hoc. Why not have a body that can make definitive, legally grounded, and expertly reasoned decisions about taking down or re-posting questionable material? The OB is grounded in the recognition that freedom of expression is a human right (“Oversight Board Charter” 2019). It has independent funding, initially supplied by Facebook, but maintains an arm’s-length distance from the company. Although Meta bears no human rights responsibilities under any international human rights treaty, in establishing the OB it recognizes and applies the authority of international human rights instruments in its justifications. Some legal scholars at the time pondered whether the OB constituted a kind of human rights tribunal (Klonick 2019; Helfer and Land 2021).

The OB opened for business in 2020. Its decisions on individual cases are binding. As of February 2023, it had decided 34 cases. It draws on human rights broadly, citing rights beyond freedom of expression in its decisions. Although some saw it at its founding as a patsy, giving then-Facebook official cover for its decisions about expression online (Butcher 2020), it has since proven some of its critics wrong (Edelman 2021). One of its best-known cases upheld Facebook’s banning of Donald Trump from the platform in the aftermath of the January 6 insurrection, while also asking that the company set a limit to the ban – it was set at two years (Clegg 2021). The OB also met with Facebook whistleblower Frances Haugen, who testified before the US Congress on the harms of the platform to kids and its differential treatment of different kinds of users with regard to what could be posted (Ordonez 2021). This led the OB to demand that the company be more transparent about its content moderation practices (Facebook Oversight Board 2021).

What the OB shows us is that Meta implicitly governs through its platforms and explicitly establishes rules through the OB. It is the only Big Tech company (to date) that does this consistently and openly. We might expect other companies to emulate Meta’s decision to use public law to justify their decisions on their platforms. As platforms’ use of massive amounts of human data, especially in AI systems, becomes more and more of a human rights problem, there is no reason why we wouldn’t see other companies going beyond their platform-based power to consider how they can use and shape regulations around them.

Big Tech’s infusion into modern life means that its global influence over human systems cannot be contained by the economic idea of “monopoly.” As political scientists, we have to start using the very tools we used to establish non-state governance to consider the ways in which Big Tech has gone beyond our wildest imaginations in governing our lives through digital technologies.


Andreessen, Marc. 2007. “The Three Kinds of Platforms You Meet on the Internet.” Pmarchive. September 16, 2007. https://fictivekin.github.io/pmarchive-jekyll//three_kinds_of_platforms_you_meet_on_the_internet.

Avant, Deborah D., Martha Finnemore, and Susan K. Sell. 2010. Who Governs the Globe? New York: Cambridge University Press.

Barnett, Michael. 2009. “Evolution without Progress? Humanitarianism in a World of Hurt.” International Organization 63 (4): 621–63.

boyd, danah, and Kate Crawford. 2012. “Critical Questions for Big Data.” Information, Communication & Society 15 (5): 662–79.

Butcher, Mike. 2020. “‘The Real Facebook Oversight Board’ Launches to Counter Facebook’s ‘Oversight Board.’” TechCrunch (blog). September 30, 2020.

Büthe, Tim, and Walter Mattli. 2011. The New Global Rulers: The Privatization of Regulation in the World Economy. Princeton, NJ: Princeton University Press.

Chen, Brian J, and Jacob Metcalf. 2024. “Explainer: A Sociotechnical Approach to AI Policy.” Data & Society, May. https://datasociety.net/library/a-sociotechnical-approach-to-ai-policy/.

Clegg, Nick. 2021. “In Response to Oversight Board, Trump Suspended for Two Years; Will Only Be Reinstated If Conditions Permit.” About Facebook (blog). June 4, 2021. https://about.fb.com/news/2021/06/facebook-response-to-oversight-board-recommendations-trump/.

Constine, Josh. 2012. “Facebook’s S-1 Letter From Zuckerberg Urges Understanding Before Investment.” TechCrunch (blog). February 1, 2012. https://social.techcrunch.com/2012/02/01/facebook-ipo-letter/.

Dijck, José van, David B. Nieborg, and Thomas Poell. 2019. “Reframing Platform Power.” Internet Policy Review 8 (2): 1–18.

Dijck, José van, Thomas Poell, and Martijn de Waal. 2018. The Platform Society: Public Values in a Connective World. Oxford: Oxford University Press.

Edelman, Gilad. 2021. “Admit It: The Facebook Oversight Board Is Kind of Working.” Wired, June 4, 2021. https://www.wired.com/story/facebook-oversight-board-kind-of-working-trump-ban/.

Facebook Oversight Board. 2021. “To Treat Users Fairly, Facebook Must Commit to Transparency | Oversight Board.” September 2021. https://oversightboard.com/news/3056753157930994-to-treat-users-fairly-facebook-must-commit-to-transparency/.

Gillespie, Tarleton. 2018. Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media. New Haven: Yale University Press.

Gorwa, Robert. 2019. “What Is Platform Governance?” Information, Communication & Society 22 (6): 854–71.

Green, Jessica F. 2014. Rethinking Private Authority: Agents and Entrepreneurs in Global Environmental Governance. Princeton, NJ: Princeton University Press.

Helfer, Laurence, and Molly K. Land. 2021. “Is the Facebook Oversight Board an International Human Rights Tribunal?” Lawfare. May 13, 2021.

Helmond, Anne. 2015. “The Platformization of the Web: Making Web Data Platform Ready.” Social Media + Society 1 (2).

Hubbard, Sally. 2021. Monopolies Suck: 7 Ways Big Corporations Rule Your Life and How to Take Back Control. Simon & Schuster.

Klonick, Kate. 2019. “The Facebook Oversight Board: Creating an Independent Institution to Adjudicate Online Free Expression.” Yale Law Journal 129 (8): 2418–99.

Lessig, Lawrence. 1999. Code and Other Laws of Cyberspace. New York: Basic Books.

Lu, Marcus. 2024. “The World’s Biggest Cloud Computing Service Providers.” Visual Capitalist. March 22, 2024. https://www.visualcapitalist.com/worlds-biggest-cloud-computing-service-providers/.

McKelvey, Fenwick. 2011. “FCJ-128 A Programmable Platform? Drupal, Modularity, and the Future of the Web.” The Fibreculture Journal, October. https://fibreculturejournal.org/fcj-128-programmable-platform-drupal-modularity-and-the-future-of-the-web/.

Mickle, Tripp. 2024. “How Regulations Fractured Apple’s App Store.” The New York Times, March 4, 2024, sec. Technology. https://www.nytimes.com/2024/03/04/technology/app-store-europe-law.html.

Moazed, Alex, and Nicholas L. Johnson. 2016. Modern Monopolies: What It Takes to Dominate the 21st Century Economy. New York: St. Martin’s Press.

Newman, Abraham L., and Elliot Posner. 2018. Voluntary Disruptions: International Soft Law, Finance, and Power. Oxford: Oxford University Press.

Nye, Joseph S., and Robert Keohane. 1971. “Transnational Relations and World Politics: An Introduction.” International Organization 25 (3): 329–49.

Ordonez, Victor. 2021. “Key Takeaways from Facebook Whistleblower Frances Haugen’s Senate Testimony.” ABC News. October 5, 2021. https://abcnews.go.com/Politics/key-takeaways-facebook-whistleblower-frances-haugens-senate-testimony/story?id=80419357.

“Oversight Board Charter.” 2019. September 2019. https://about.fb.com/wp-content/uploads/2019/09/oversight_board_charter.pdf.

Rosenau, James N. 1992. “Governance, Order, and Change in World Politics.” In Governance without Government: Order and Change in World Politics, edited by James N. Rosenau and Ernst-Otto Czempiel. Cambridge Studies in International Relations. Cambridge: Cambridge University Press.

Scheid, Brian. 2023. “Just 7 Companies Are Carrying the S&P 500 in 2023.” S&P Global Market Intelligence. May 18, 2023. https://www.spglobal.com/marketintelligence/en/news-insights/latest-news-headlines/just-7-companies-are-carrying-the-s-p-500-in-2023-75823741.

Stempel, Jonathan. 2024. “Epic Games Says Apple Violated App Store Injunction, Seeks Contempt Order.” Reuters, March 13, 2024, sec. Legal. https://www.reuters.com/legal/epic-games-accuses-apple-violating-app-store-injunction-2024-03-13/.

Stroup, Sarah S., and Wendy H. Wong. 2017. The Authority Trap: Strategic Choices of International NGOs. Ithaca, NY: Cornell University Press.

Weatherbed, Jess. 2023. “Twitter Blue Subscribers Now Have a 10,000 Character Limit.” The Verge. April 14, 2023. https://www.theverge.com/2023/4/14/23683082/twitter-blue-10000-character-limit-bold-italic-features-substack-newsletter.

Wong, Matteo, and Charlie Warzel. 2024. “The iPhone Is Now an AI Trojan Horse.” The Atlantic (blog). June 10, 2024. https://www.theatlantic.com/technology/archive/2024/06/apple-generative-ai-wwdc/678648/.

Wong, Wendy H. 2012. Internal Affairs: How the Structure of NGOs Transforms Human Rights. Ithaca: Cornell University Press.

———. 2023. We, the Data: Human Rights in the Digital Age. Cambridge, MA: MIT Press.
