Computing Community Consortium Blog

The goal of the Computing Community Consortium (CCC) is to catalyze the computing research community to debate longer range, more audacious research challenges; to build consensus around research visions; to evolve the most promising visions toward clearly defined initiatives; and to work with the funding organizations to move challenges and visions toward funding initiatives. The purpose of this blog is to provide a more immediate, online mechanism for dissemination of visioning concepts and community discussion/debate about them.


CCC@AAAS 2025 | Social Technologies, Artificial Intelligence, and the Future of Online Trust

May 1st, 2025 / in AAAS, CCC / by Haley Griffin

This is the second post in a two-part series recapping the panel Social Technologies: Why We Can’t Live With Them or Without Them, which was supported by the Computing Community Consortium (CCC) at the 2025 AAAS Annual Meeting. The first post, Rethinking Social Technologies: Platforms, Protocols, and the Push for Decentralization, explored how researchers and technologists are approaching the future of social media through decentralized design and protocol-driven innovation.

This follow-up highlights the second half of the panel discussion, which examined critical issues related to artificial intelligence, online trust, and potential regulatory responses.

The panel was moderated by Sarita Schoenebeck, Professor of Information at the University of Michigan, and featured Andrés Monroy-Hernández, Associate Professor and co-leader of the Princeton HCI Lab at Princeton University; Motahhare Eslami, Assistant Professor of Software and Societal Systems at Carnegie Mellon University; and Bryan Newbold, Protocol Engineer at Bluesky.

Opportunities for Decentralization at Scale

Schoenebeck continued the panel discussion by asking: “How is decentralization going to scale to everyday people?”

Monroy-Hernández offered several ideas:

  • Decentralization through protocols: Break social media into distinct components (e.g., moderation, storage, account management), each managed by independent organizations to enable interoperability across platforms.
  • An ecosystem of providers: Envision a diverse landscape of for-profit and nonprofit entities offering specialized services. This would foster competition and innovation, giving users more choice and control.
  • User empowerment: Allow users to choose providers based on preferences for privacy, moderation, cost, and other needs.
  • Email as an analogy: Like email, a successful decentralized model enables users to switch providers while retaining content and identity.
  • Increased user awareness: Users should be more mindful of the tools they use—“If you are not paying, you’re the product,” he said.
  • Capitalism as an incentive: Let market forces drive innovation among competing providers.

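The component split Monroy-Hernández describes, and the email analogy that goes with it, can be illustrated with a toy sketch. This is not actual AT Protocol code; the class names and `migrate` method are hypothetical, chosen only to show how a portable identity separated from a storage provider lets a user switch providers while keeping content, much like changing email hosts:

```python
from dataclasses import dataclass, field

@dataclass
class Identity:
    """A portable handle, independent of any storage or moderation provider."""
    handle: str

@dataclass
class StorageProvider:
    """Hosts users' posts; any provider can serve any identity."""
    name: str
    posts: dict = field(default_factory=dict)  # handle -> list of posts

    def publish(self, who: Identity, text: str):
        self.posts.setdefault(who.handle, []).append(text)

@dataclass
class Account:
    identity: Identity
    storage: StorageProvider

    def migrate(self, new_storage: StorageProvider):
        """Switch providers while keeping both handle and content intact."""
        new_storage.posts[self.identity.handle] = self.storage.posts.pop(
            self.identity.handle, []
        )
        self.storage = new_storage

# A user moves from one provider to another without losing identity or posts.
alice = Account(Identity("alice.example"), StorageProvider("host-a"))
alice.storage.publish(alice.identity, "hello, decentralized world")
alice.migrate(StorageProvider("host-b"))
```

Because the identity is not owned by the provider, the ecosystem of competing providers the panelists envision can interoperate around it.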
While the audience recognized the benefits of decentralization, many expressed skepticism about whether such systems could scale effectively.

Distinguishing Between Humans and AI

Schoenebeck then asked: “Should platforms and apps distinguish between humans and AI? How do we do this?”

Newbold responded that this task is increasingly difficult. He described a globalized underground economy of inauthentic accounts, often created and maintained through coordinated human labor. Detecting and moderating these accounts is a major challenge.

Eslami agreed and emphasized the need for transparency: “It should be very clear whether something comes from a bot or a human.” She also shared personal anecdotes about her son forming attachments to Alexa and ChatGPT — illustrating how people, even children, are building emotional bonds with AI.

Regulating Social Platforms and Anonymity

Schoenebeck polled the audience: “How many people think AI agents, bots, and user accounts should be labeled as such on the internet?” Most attendees raised their hands.

Monroy-Hernández noted that the European Union is leading on global tech regulation, while in the United States, progress is more likely at the state level. He compared labeling in tech to food labeling: consumers deserve to know what they’re getting, even if they continue choosing lower-quality options.

Newbold supported labeling but cautioned against tying online identities to government-issued IDs. Instead, he advocated for invite-based systems or social vouching to counter inauthentic behavior.

Monroy-Hernández added that he is more concerned about users migrating to private, invite-only spaces than about the proliferation of AI “sludge.” He noted that even when users rely on tools like Grammarly or ChatGPT, they are still responsible for the content they share.

Eslami emphasized that anonymity should be context-dependent. In settings like ridesharing or healthcare, identity transparency may be essential. She also expressed skepticism that market forces alone would lead to meaningful user protections, citing weak enforcement of current privacy regulations.

Protecting Individuals Online

Katie Siek, Vice Chair of the CCC Council, asked the panel how individuals who share sensitive health information online can be better protected.

Monroy-Hernández responded that decentralization allows users to selectively share identity information and limit how far it travels. In decentralized systems, harms could be more contained within smaller, trusted circles.

Newbold agreed, suggesting that sensitive conversations should take place in trusted, smaller-scale online settings—such as private servers with clear privacy rules and accountable hosts.

Eslami referenced an idea from David Karger at MIT: hierarchical trust networks, in which users assign different levels of access to information depending on their relationship to others.
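The hierarchical-trust idea Eslami references can be sketched in a few lines. The trust tiers and API below are hypothetical, chosen for illustration: each contact is assigned a level, each shared item carries a minimum level, and a contact sees only items at or below their tier:

```python
# Illustrative trust tiers, ordered from least to most trusted.
TRUST_LEVELS = {"public": 0, "acquaintance": 1, "friend": 2, "inner_circle": 3}

class TrustNetwork:
    def __init__(self):
        self.contacts = {}  # contact name -> numeric trust level
        self.items = []     # (required level, content) pairs

    def add_contact(self, name, level):
        self.contacts[name] = TRUST_LEVELS[level]

    def share(self, content, min_level):
        """Attach a minimum trust level to a piece of information."""
        self.items.append((TRUST_LEVELS[min_level], content))

    def visible_to(self, name):
        """Unknown contacts default to the public tier."""
        level = self.contacts.get(name, TRUST_LEVELS["public"])
        return [content for required, content in self.items if level >= required]

net = TrustNetwork()
net.add_contact("doctor", "inner_circle")
net.add_contact("coworker", "acquaintance")
net.share("new blog post", "public")
net.share("health update", "inner_circle")
```

Here sensitive health information stays within the inner circle, in line with the panel's point about containing harms to smaller, trusted groups.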

Cost of Decentralization

Cory Doctorow, a prominent technology author and activist, asked about the cost of setting up federated Bluesky instances, contrasting it with the lower cost of launching a Mastodon server.

Newbold clarified that the infrastructure cost of Bluesky components is not as high as some believe. Relay servers may eventually support tens of millions of users for only a few hundred dollars. The Bluesky team is working to make deployment more scalable by focusing on core network features.

Modifying User Behavior for Decentralization Adoption

Science journalist Bennie Mols asked how platforms can encourage user behavior changes when adopting decentralized systems—especially related to user interfaces and network size.

Monroy-Hernández responded that Mastodon emphasizes values over usability, while Bluesky takes a more centralized approach to ease adoption. Balancing usability with decentralization remains an active area of exploration.

Newbold added that identity alignment creates friction: users are often forced to choose a “side” or community (e.g., based on geography or interests), which can be limiting. Bluesky minimizes this by making instances nearly invisible to users.

Schoenebeck added: “I think we need friction — speed bumps — things that say not every thought has to be said at all times, right away. And I think that a lot of our social spaces would be better if there was a little bit of friction and some more accountability in how we interact.”

Newbold concluded by emphasizing that making it easier for users to preserve their contact lists and migrate between platforms will be key to long-term success.

Pros and Cons of Regulations

Nazanin Andalibi, Professor in the School of Information at the University of Michigan, asked the panel: “What kinds of regulations would you like to see for decentralized platforms?”

Monroy-Hernández advocated for regulation of behavioral advertising and mandates for platform interoperability — both of which, he argued, would reduce toxicity and enhance user choice.

Eslami emphasized the importance of transparency in how platforms apply rules and voiced concern about loopholes companies use to avoid compliance.

Newbold noted that small companies often lack the resources to meet regulatory demands. He supported governance models that emerge from within platform communities, with mechanisms for accountability and responsiveness.

Eslami pushed back, arguing that without formal regulation, platforms are unlikely to prioritize user well-being. “In my view, regulation is necessary to give users a voice,” she said.

Reflections and Thanks

CCC thanks the panelists and audience members for their thoughtful contributions to this timely conversation. To revisit the first half of the discussion, read part one of this series, which focused on decentralized infrastructure and the future of social platforms. Stay tuned to the CCC blog for more insights on the evolving role of computing researchers in shaping responsible technology.
