2022-07-24, 16:00–17:30, Envelope ✉️
Do we want social media platforms that provide a neutral venue for pluralistic debate, or do we want social media platforms that protect their users from abuse and de-platform abusers? Can we have both? Is moderation censorship? Is Signal social media?
The debate about how to deal with abusers and fascists on social media has been simmering for a long while, but really blew up after the siege of the US Capitol. It has become clear, finally, that something needs to be done. What remains unclear is... what?
This workshop will probably not offer answers, but will at least try to make sense of that debate. A debate which is tangled and twisted, and complicated by the fact that for some (like journalists) social media is the infrastructure they need to do their work and reach their audience, while for others (like activists) it's the service they use to communicate among themselves.
Combine this with how strong the incentives are for centralized social media monopolies to not "police content" (lest they be accused of censorship), and how their algorithms tend to promote most-clickable (and therefore often extremist) content, and you find yourself in a discussion where journalists defend Nazis' access to Facebook.
There is plenty to unpack there, so let's sit down and try to do just that together. Some questions that seem important (though of course we will not limit our discussion to only these!) would be:
Is social media infrastructure, or service?
If it's infrastructure, we want it to be neutral (just like we want our ISPs not to police our bits on the wire). If it's a service, perhaps we want some moderation to happen?
Perhaps it needs to be both – a neutral infrastructure on which opinionated services can be built? This would immediately simplify this conversation, making it clear where moderation belongs, and where it doesn't.
Giving platform vs. providing tools
There is a difference between giving someone a platform and merely providing communication tools. It is hard to argue that Facebook, by not banning a prolific fascist, is not providing them with a platform. It is just as hard to argue that Signal, doing the same, is.
This difference seems to disappear in a lot of discussions; for example, when some propose that mainstream social media should ban fascists, others claim that it doesn't matter since they will still be able to communicate on Gab or Parler.
However, neither Gab nor Parler has the reach of mainstream social media. Effectively, this leaves their users stewing in their own juices, unable to radicalize anyone new. They can still communicate and organize, but adding new members comes with added friction. So, how important is that friction? How much difference does it make?
Public vs. private channels of communication
A public Twitter profile with 50k followers is clearly a public channel. A Signal group of 10 people is clearly a private channel. Should a 50k-member Signal group be considered private or public?
This question is relevant in the context of platform vs. tools distinction above. It's another way of asking: what is, really, the difference between a platform and an internal communication tool?
Signal has already added several "social" features (message reactions, and the ability to join groups by clicking a link). Is keeping groups invite-only on an encrypted platform enough for them not to be considered a "public" channel? If not, should Signal start moderating groups?
What about closed Facebook groups – can these be considered private channels? If so, can they be ignored by Facebook moderators?
Christopher is a journalist and technologist, currently leading technology research and development at Duke University's Reporters' Lab, while based in Brooklyn, NY. His main focus is on automating fact-checking, developing data standards for the same, and supporting the efforts of misinformation and disinformation journalists around the world.
Previously, Christopher worked on organized-crime investigative reporting in Eastern Europe, consulted with global-south central banks on anti-money-laundering policies, founded startups in New York City, and worked as a photojournalist in East Africa.
James has worked in web technologies since the early 2000s. He has experience across industries including pharmaceuticals, government, oil, and consumer advertising and in that time has been a developer, designer, user experience architect, dev-ops engineer, data analyst, experience design innovator, digital strategist, and more. For several years he managed teams at multiple offices across multiple cities in the eastern United States as the technology director. Today he freelances as a web developer and digital strategist. James was also a nuclear electronics technician in the US Navy and spent time as a Jesuit while exploring a religious vocation.
James is a Gopher evangelist and operates a public-access *nix system called Cosmic Voyage. It is an open community for amateur writers to share in a semi-collaborative science fiction universe while exploring command line skills. He is one of the operators of the Tildeverse, a collective of like-minded "tilde" servers in the style of Paul Ford's tilde club. You can find him all over the web and in IRC.
Information security at ISNIC, the .is DNS registry. Co-founder of the Technical Error Correction Collective. Tech, policy, and activism background. Previously Chief Information Security Officer / Head of Infrastructure at OCCRP.
Has cooperated with a number of EU-based organisations working in the digital human rights area and participated in numerous Internet governance meetings. Main policy interests: information security, privacy in the digital age, Internet governance (including censorship, surveillance, and Net Neutrality), copyright reform, and digital media literacy.