
Who Decides What You See? How Middleware Can Reshape Social Media

By: Renée DiResta, Associate Research Professor, McCourt School of Public Policy

For as long as social media has existed, there have been debates about content moderation and curation. Moderation – the process of removing, reducing, or contextualizing content, such as through labeling – is often opaque and inconsistently applied. Everyone has a story of being unfairly treated by an auto-moderator. Curation – the selection and ranking of content in feeds – frequently prioritizes sensational or enraging material, locking users into algorithmic loops that seem impossible to break. While these issues have long been sources of frustration, their significance has grown in an era of heightened political polarization. Moderation and curation shape what people see and what they pay attention to, making them flashpoints in partisan battles. In response, Big Tech companies occasionally make abrupt shifts in their policies—often in ways that appear to serve political and corporate interests rather than users.

Following the recent U.S. presidential election, and amid efforts to curry favor with a new administration, Mark Zuckerberg announced major policy changes at Meta: eliminating fact-checking, changing hate speech policies, and tweaking algorithms to once again promote “civic” (political) content. Elon Musk’s $44 billion acquisition of Twitter, now X, led to a chaotic overhaul of its moderation policies, the dismantling of trust and safety teams, and curation shifts to prioritize posts from paid subscribers. Meanwhile, TikTok, under pressure from bipartisan scrutiny in the U.S. over what its algorithms do or do not promote, made conspicuous efforts to appease Donald Trump in hopes of reversing a looming ban. These examples underscore a reality that technologists have long discussed: moderation and curation are never neutral, but shift in response to platform incentives and external pressures.

But what about user interests? 

This is precisely the question that a recent report, Shaping the Future of Social Media with Middleware, seeks to address. Produced by the Georgetown University McCourt School of Public Policy and the Foundation for American Innovation, and derived from a workshop held in April 2024 at the Stanford Cyber Policy Center, the report explores how middleware—third-party software intermediaries between users and platforms—might serve as a counter to the concentrated power of social media platforms, particularly where content moderation and curation are concerned.

[Cover of the report "Shaping the Future of Social Media with Middleware," December 2024, bearing the logos of the Foundation for American Innovation and the McCourt School's Tech & Public Policy program.]

If you’ve ever used (or heard of) tools like Block Party, which allowed Twitter users experiencing a high volume of harassment to mass-block accounts that had liked or retweeted a particular tweet, you’re familiar with middleware. These tools can perform actions on behalf of the user. By giving users more control over moderation and curation – more agency over what they see and who they engage with – middleware has the potential to foster a more user-driven digital space. A marketplace of providers and algorithms would create a flexible alternative to both opaque, centralized platforms and an unfiltered, unregulated internet.

This is a vision that others have advanced in the past – indeed, the report builds on the 2020 work of the Stanford Working Group on Platform Scale, led by Francis Fukuyama. However, recent technological shifts, such as the adoption of decentralized and protocol-based social media platforms like Mastodon and Bluesky, mean that middleware might be adopted via alternative platforms – rather than being something that Big Tech would have to decide (or be compelled) to adopt.

The report outlines key theoretical frameworks for middleware’s role in digital ecosystems, highlighting its potential to decentralize power, enhance user autonomy, and introduce competition into the moderation and curation marketplace. By enabling users to choose among different content ranking and filtering services, middleware provides a way to mitigate the influence of platform-driven engagement incentives that often amplify divisive or low-quality content. It also introduces flexibility into moderation, allowing users to adopt third-party moderation tools aligned with their values rather than being subjected to one-size-fits-all platform policies.

Of course, middleware is not a panacea. It risks exacerbating echo chambers; privacy and security issues must be addressed through responsible development; and economic and regulatory hurdles must be navigated before middleware can be widely adopted. Nonetheless, it offers a vision for a future where users—not platform executives—hold greater agency over their online experiences.

As social media governance remains a battleground, Shaping the Future of Social Media with Middleware presents a compelling blueprint for a more open, user-driven internet—one where individuals, not platforms, shape their online experiences.

Renée DiResta is an Associate Research Professor with the Massive Data Institute and Tech & Public Policy at the McCourt School of Public Policy at Georgetown University. She is the author of “Invisible Rulers: The People Who Turned Lies Into Reality.”
