The Fediverse And The Future Of Social Media: What Policymakers Need To Know
7:53PM Jun 11, 2024
Speakers:
Hillary Brill
Ross Schulman
Mike Masnick
Amy James
Keywords:
moderation
users
system
decentralized systems
decentralized
concerns
mastodon
content
servers
questions
centralized systems
federated
centralized
social media
terms
interesting
services
decentralization
twitter
metaverse
Good afternoon again, everyone, and welcome. It is my pleasure to introduce our panelists to everyone for this webinar today. The Decentralized Future Council is proud to host The Fediverse and the Future of Social Media: What Policymakers Need to Know. My name is Hillary Brill, and I am joined by Mike Masnick, founder and editor of Techdirt; Amy James, founder of the Web3 Working Group, author of What Kind of Internet Do We Want, and co-founder of the Open Index Protocol; and Ross Schulman from the Electronic Frontier Foundation (EFF), the first official Senior Fellow on decentralization policy. They've done many, many other things. But I want to jump right into this exciting and interesting topic, which, frankly, is going to take you outside of the box of your regular, run-of-the-mill discussion of social media platforms, how to moderate them and their content, and the platforms you might think of immediately: Facebook, Twitter, TikTok. Instead of those, we're going to do a deep dive and discuss federated social networks, collectively known as the fediverse. These are emerging as alternatives to traditional social media platforms. The movement to these decentralized social networks is growing. Users are joining some that you may have heard of, Mastodon, Bluesky, and Threads, and they are joining for all sorts of benefits that these networks provide: enhanced user control and more personalized content moderation policies, among others. However, as with any new technology, and any shift in use of current technology, this move to federated social networks is introducing new questions, new policy challenges, and new design challenges. These questions about how we regulate and how we moderate touch issues we're already discussing, content, privacy, intermediary liability, and what new policy frameworks, if any are needed at all, could help innovate and guide an effective and useful future for these federated networks. And with that, I want to do some table setting.
So we can discuss what we are even talking about with the fediverse. Because with any new terminology, I might think of it one way, and somebody else might think of it another way. So let's discuss: what are we talking about when we say the fediverse? And, Mike, I'd like to start with you.
Sure. So, first of all, thanks for having me, and thanks to everyone who's here as well. I think we'll have a really interesting discussion today.
You know, part of the problem is that fediverse means different things to different people. It really started out as a term that referred to one particular type of decentralized social network, or social tool: one that is based on a protocol called ActivityPub, which is a standardized protocol, one of the earliest of the decentralized social protocols. I think for the purpose of this conversation, and the way that we all want to talk about it today, we're going to use a broader definition of the fediverse, which covers a wide variety of these decentralized platforms and tools. That covers Mastodon, which is ActivityPub-based, but also things like Bluesky, and other services like Nostr, which is more of a peer-to-peer, totally decentralized system, or something like Farcaster, which is based on cryptocurrency. There are a number of different social media protocols and systems out there right now that are based on a variety of different approaches to decentralization. They work in different ways and have some different affordances, and therefore some different ways of being set up, different ways that they handle questions of content moderation, different ways that they handle a variety of different questions. So for the sake of this particular panel, we're going to be thinking of the fediverse as sort of encompassing all of that, with a recognition that not everything we say about one of these systems applies equally to all of them. It helps to have a general sense of what happens when you decentralize social media, while recognizing that there are a number of different ways that decentralization can actually take place, and then asking what that means for the rest of the questions we're going to discuss.
Thank you for starting us off with that. And I want to ask if someone can provide a basic example for the listeners of how these federated social networks, the fediverse, would work in a way that would be positive, something that someone would want to use. Amy, I would love for you to give us some examples of the various options that people could use.

Okay, great. Um, so,
I think, building on what Mike illustrated there, there are kind of two buckets that these fall into: there are the more federated, network-shaped ones that are using servers, and then there are the more peer-to-peer ones. On the federated side, those are primarily going to be built on ActivityPub, like Mike was saying. So Mastodon, Threads, PeerTube, Pixelfed, and a handful of others (Bluesky is similar in shape, but, as Mike noted, runs on its own protocol). ActivityPub is the one that has the most apps built on it, compared to something like Nostr, Farcaster, or Lens, which do have some; you know, Farcaster has Warpcast, but it doesn't have a handful of others the way that ActivityPub does right now. And Nostr is more peer-to-peer, so the shape of the network is different, where the content moderation can happen is different, and the kinds of controls and responsibility that users have are different. So yeah, there's a lot of experimentation going on right now. And when you think about decentralized social media, I think you also have to think about, well, what is that? It's actually a lot of different protocols: a protocol for identity, a protocol for video transcoding, a protocol for video streaming, maybe a protocol for compute if you want to have some sort of chat. So anyway, there are just a lot of different options for how these things can be built, and we're in the very early stages.
One quick thing I'll add, in terms of what I think might be useful in thinking about this, is just to distinguish all of the decentralized platforms from the centralized ones. Centralized systems are what we've grown accustomed to over the past decade, decade and a half: obviously the Facebook, Twitter, YouTube, TikTok kind of thing, where it's one company that is controlling everything, one company that has the full say in terms of how the platform operates, what happens on it, who is allowed, what the algorithms are, what the recommendations are, and what the moderation is. And therefore some of the liability questions are a little bit different for those as compared to the decentralized ones. The decentralized ones can be built in a variety of ways. The federated setup is mostly sort of mini centralized systems; this is the way Mastodon works, where there's a bunch of mini Facebooks or mini Twitters that can all communicate with each other, which makes it a little different. There's Bluesky, which is a slightly different approach, where you can have many, many different systems that are kind of peer-to-peer, but there are tools to connect them in a way that seems much more seamless. And then you have the Nostr and Farcaster kind of setup, where there aren't centralized servers necessarily; there's just a whole bunch of different ways that people can connect. The idea is to make it feel like the centralized services, but where the actual happenings, what is decided and how those platforms work, is not controlled by just one company.
I think it's also kind of useful, Hillary, that a lot of people sort of bounce off the word federated and go, I don't understand that, and it can be intimidating. So I think it's oftentimes important at the outset of these conversations to reinforce for people that they actually already know what federation is and how it works, because it's how email works. If you have an email account at one email service, let's say Gmail, and you want to email me at my eff.org address, which lives on Microsoft's servers, those are two different servers. But they can talk to one another because they both understand a protocol called Simple Mail Transfer Protocol. ActivityPub is exactly that; it's just got other social media stuff bolted onto the side of it, right? We're sending messages, but we're also sending likes and reposts and stuff like that. But at the core of it, it is very much the same thing as email: it's just two servers talking to one another. They understand the same language, and we can exchange messages by doing that.

Thank you, Ross. And one other thing I just wanted to clarify is that a federated social network can be something that is not what we think of now with decentralized technologies on the blockchain. So there are decentralized social media networks that you may or may not have heard of that are blockchain-based, but we also have ones that are not
at all blockchain, and the ones that I mentioned earlier are not blockchain. That's why I like to set the table and say, what are we talking about? The fediverse can include both of them. So I'm not trying to confuse anyone. But it is exactly what you said, Ross: it's a federation, where all these different systems are able to talk to each other, and we already have them; they already exist. So I really appreciate you saying that. I also want to talk a little bit about the basic user of these systems, some of the promises that they provide, and why people would want to join and use them. I'm going to start again with Mike: maybe you can give us a little genesis of how the fediverse kind of started to pick up and grow. And you mentioned Farcaster, for example; now Wall Street and many, many different venture capitalists are interested. They only have 80,000 users, and they just raised $150 million. So we are seeing the emergence of these systems, and they're not to be ignored. They're very important. So go ahead.
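Ross's email analogy maps onto code fairly directly. As a toy sketch, with hypothetical server names and actor IDs (real ActivityPub delivery also involves HTTP signatures, WebFinger discovery, and more), here is roughly what one server hands to another when a user posts:

```python
import json

def make_note_activity(sender, recipient, text):
    """Build a minimal ActivityPub-style 'Create' activity wrapping a Note."""
    return {
        "@context": "https://www.w3.org/ns/activitystreams",
        "type": "Create",
        "actor": sender,
        "to": [recipient],
        "object": {"type": "Note", "content": text, "attributedTo": sender},
    }

def inbox_url(actor_id):
    """Derive an inbox endpoint for an actor (a common server convention)."""
    return actor_id + "/inbox"

# Two users on two entirely different servers, just like Gmail -> eff.org:
activity = make_note_activity(
    "https://mastodon.example/users/amy",    # sender's server (hypothetical)
    "https://pixelfed.example/users/ross",   # recipient's server (hypothetical)
    "Hello across the fediverse!",
)

# The sender's server would POST this JSON to the recipient's inbox,
# the same way an SMTP server relays a message to another mail server.
payload = json.dumps(activity)
target = inbox_url("https://pixelfed.example/users/ross")
```

The point of the sketch is that, as with email, neither server needs permission from the other's operator; they only need to speak the same protocol.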
Yeah, and I think Ross's point about using email as an example, as a way to think about federation, is actually really important. Because a lot of the early internet worked through federated systems like that, and email is a perfect example, where you have different servers and different systems, but they can all communicate with each other. And we had a few other things like that: Usenet was an early group communication platform that was built on an open protocol, and you could have different servers and they could federate. And really what happened in the early to mid 2000s was that a bunch of companies recognized that putting a very nice front end on these things, and creating a centralized version of them, was a lot easier from the standpoint of user adoption. So that's where we started to get what would generally, historically, be referred to as Web 2.0 services, which were centralized, and they provided a very, very easy-to-use system that didn't involve having to understand what was going on at the protocol level. And that certainly had a whole bunch of benefits. The internet that a lot of people know was based on a bunch of these centralized services. But we also began to realize some of the potential problems with that. And really, over the last decade or so, more and more people have recognized their concerns about competition, their concerns about privacy, their concerns about content moderation, their concerns about all of these things that have certainly come up, and there have been lots of different discussions about them. And that has resulted in a lot of people trying to think through, you know, are there better ways to do this.
And the more recent focus on decentralization and federation, I think, was really in some part started by a lot of people recognizing that we had this early promise, really in the 1990s, of these different decentralized systems that everybody could access. That got a whole bunch of people excited. But we ended up with these more centralized systems, which sort of went against where a lot of people really thought the early internet was going to head. And so I think the more recent rise in these decentralized platforms came about as a sort of retort to the concerns and problems people had with centralized systems, and those concerns cover that wide range of things. So there are different concerns and different approaches, because of those different concerns. And I think that a lot of the current interest in this space is: can we build something different, that is not just a competitor to what is out there, but is built on a fundamentally different system that hopefully avoids some of the larger concerns around competition and privacy and free speech and moderation that came about as the central providers existed. So there's been a lot of work done on it over the last few years. I think the real kick-starter to a whole bunch of this was Elon taking over Twitter and really giving a shock to the system. I mean, a lot of the work on these other systems was happening before all that came into play; certainly ActivityPub predates it by quite a bit, and Mastodon by quite a bit.
But I think that really, the problem with moving from a system that everybody is used to, to something that is brand new, is dealing with the momentum of what everybody's already doing; it's very, very difficult to change that. But I think Elon taking over Twitter and really changing a lot of the way that it worked caused a lot of people to at least be much more open to rethinking what services they were using. And I think that's created a really big jumping-off point for a lot of the new services.
And that gets me right into a discussion of content moderation and content moderation policies, and how this different alternative, or retort, to use your language, to our traditional centralized networks could actually work. Why would it be better to have this alternative system? But before I throw that out to all three of you: you mentioned the time when Musk took over Twitter. I remember doing a webinar before that about the deplatforming of Trump, right, when he was taken off. And one of the promises, and I throw this out again back to you, about the idea behind the fediverse is that you own your subscribers, you own your followers; you can allow Twitter to go away, and you wouldn't lose all of the brand, branding, and engagement that you have. So whether you supported that situation or not, that's not the point. The point is that it wouldn't have had the impact that it did, in my opinion; that seems to be one of the results of what could have happened, if we had had a full acceptance, a fully developed, fully designed fediverse at that moment.
Yeah, I mean, I think it creates a different kind of response, right, and a different sort of setup. One of the central ideas behind decentralization, and "central ideas behind decentralization" is kind of a weird phrase, but one of the ideas is that you are pushing control and power out to the edges of the network, rather than keeping it at the center. And that means that users themselves, or other intermediaries, perhaps a lot of different intermediaries rather than just one, can make those decisions. As a user, it gives you more control over what it is that you see, what you're exposed to, and what you're not exposed to. And that is part of what makes it really interesting. In terms of how that actually plays out, how much say you have over these different things, it really does depend on which of these different systems you're using, because they all take, in some cases, very different approaches to how all of that works. We're at this stage right now where a lot of it is very, very experimental, and nobody knows which ones are really going to catch on or which ones make the most sense. But yeah, the general idea is that you as a user have a lot more say in how this works. And if you adopt someone's setup and you don't like how they're handling these questions, you can change and switch without losing access to the rest of the platform.
It seems very American, right? Like design your own, whether it's designing your own burger or your own Build-A-Bear at the mall; it's like designing your own social media policies and guidelines and the communities that you want to use and engage in. And I think that's a promise that resonates with, you know, our cultural values and norms as Americans. Amy, I was going to ask you, especially with your background in decentralization in general, I think some of those same promises and hopes connect here with the fediverse.
Yeah, just on what you were saying about it being American, I would say that what's really wonderful about the potential is that it gives us the opportunity to have both freedom of speech and freedom of association, right? Like you're saying, design your own burger, design your own algorithm: you can both say whatever it is that you want to say, and seek out whatever speech it is that you're looking for, or not. And what has been missing in the world that we're in with the web right now, Web 2.0, is public space, right? All of these spaces are privately controlled, they are privately owned; the big tech monopolies control the entire pipeline, whether that's the physical infrastructure, the servers actually providing the information to you, or the tech stack that they're using to provide it. You know, I was looking for something recently and found that Amazon is the one that's supporting OpenSearch these days, and they have it registered as a trademark. It's just really interesting to me that these things that I think of as open are functionally very closed. And that's what this gives us the chance to change significantly.
So we're still talking about content moderation, and I want to dig in a little bit more there. Clearly Congress has looked at this and tried to come up with solutions, many solutions; states are coming up with solutions. Some of those solutions have been tried in courts, some have gone all the way to the Supreme Court, about what we do with content moderation. Is the fediverse going to resolve those problems? We were talking about the benefit of designing our own, picking what we want; is the fediverse going to get rid of some of those guidelines that we need to have out there? Clearly, there are other concerns, right, than just your moderation or your guidelines or policies. But if this is truly going to be a potential alternative or solution, it would have to address so many different concerns that are already being bandied about. What do you think?
I don't think there's anything out there that will resolve or get rid of the need for, or the conversation surrounding, content moderation. As long as there are people in the world using social media, we're going to want ways to moderate that social media. What the fediverse does do is certainly change the conversation, and it changes the locus of power in discussions about what should or should not be moderated: from a handful of companies run by a handful of men, mostly living within a 50-mile radius of San Francisco, to much closer to the ground, where people are actually experiencing the social media that they're using. To give two examples: in ActivityPub-based services like Mastodon, you choose a server when you sign up at the outset, and that server may have anywhere from just yourself to a few thousand other people on it, but it's much less than the population of, say, Twitter or Facebook, right? And moderation decisions on that platform are made either by yourself, you can choose to block other people, obviously, or by the administrators of your server. So the content moderation happens in a much smaller community, and in a way that hopefully gives you more recourse to go to the administrators of your community, talk to them, appeal their decisions, and so on and so forth. Let's take another example, the Bluesky model, a very interesting model: the system actually has what are called labelers, and anyone can be a labeler. I could, if I wanted to, start being a labeler today. All the labelers do is assign, as you might imagine, labels to content. Those can either go on individual posts, or they can be attached to users, and the labels convey some information about the post or the user, saying, you know, maybe this is hate speech, or this is not safe for work, or something like that.
And then, as a user of this service, you can subscribe to any labeler that you want, and you can decide what you want to do with all of the labels that that labeler offers. So you may say, posts that are labeled, you know, hamburgers, I don't want those to show up in my feed at all; but maybe for posts with some other label, just give me a warning first, or something like that. So you get this system that's very flexible, and that gives a lot of power to end users, both in how they consume the social media, and also, you know, it opens the door for anybody to become a moderator in a sense. And their power is only proportional to how many other people decide that they want to subscribe to them. So if you gain a reputation as being very fair and very reasonable, and so forth, then maybe you get more followers and more people subscribing to you. The one sort of small exception to that is the Bluesky labeler itself, which in every other way is exactly the same as every other labeler, except that within the Bluesky app their moderation is basically mandatory. You can't opt out of it; you are subscribed to it automatically. And they do a whole bunch of what you might think of as really low-level moderation, like taking out spam and stuff like that, that basically nobody ever wants to see.
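The labeler model Ross describes can be sketched in a few lines. This is a deliberately simplified toy, with hypothetical labeler names and data shapes (the real Bluesky labeling system is richer): labelers attach labels to posts, and each user decides, per subscribed labeler and per label, whether to hide the post, show it behind a warning, or show it normally.

```python
HIDE, WARN, SHOW = "hide", "warn", "show"

def moderate(post, labels_by_labeler, user_prefs):
    """Return (action, reasons) for a post under the user's label preferences.

    labels_by_labeler: {labeler_id: set of labels that labeler put on the post}
    user_prefs: {labeler_id: {label: HIDE|WARN|SHOW}}; only labelers the user
                subscribes to appear here, so everyone else is simply ignored.
    """
    action, reasons = SHOW, []
    for labeler, prefs in user_prefs.items():
        for label in labels_by_labeler.get(labeler, set()):
            pref = prefs.get(label, SHOW)
            if pref == HIDE:
                # A single 'hide' preference wins outright.
                return HIDE, [f"{labeler}:{label}"]
            if pref == WARN:
                action = WARN
                reasons.append(f"{labeler}:{label}")
    return action, reasons

# Two labelers have labeled this post, but the user subscribes to only one:
labels = {"labeler.example": {"hamburgers"}, "other.example": {"spoiler"}}
prefs = {"labeler.example": {"hamburgers": HIDE}}
action, reasons = moderate({"text": "lunch pics"}, labels, prefs)
```

Note the design point: moderation power here is opt-in, so a labeler's influence extends exactly as far as its subscriber list, which is the "power proportional to reputation" dynamic described above.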
Yeah, and I think it's important to think about these different approaches, because when we talk about the regulatory side, the different demands and ideas being bandied about by various governments about requirements for content moderation, most of those are designed for a world of very much centralized providers, where you can order Facebook to do certain things, you can order Instagram to do certain things. With the Mastodon ActivityPub setup, because they're sort of mini Facebooks, you can see how some of those regulations could map to it, though it probably makes less sense. Because, as Ross said, you have these situations where it tends to be more community driven; if there are disputes, there are more ways to appeal and discuss it at a more community level, rather than with the big centralized systems, where it's a sort of big white wall of, who knows who I'm reaching out to if I'm complaining about something. On Mastodon, it is more locally focused. But when you get to something like Bluesky, a lot of these approaches don't necessarily match, because anyone can be a labeler. And it's not just that you can subscribe to any of them; you can subscribe to all of them, right? They can stack on top of each other, you can have all different kinds of labelers, some that might disagree with each other. But again, you as the user get to decide what you do with different labels and how you handle them. And I'm not even sure how that works in a world in which there are certain moderation requirements being pushed in some of the regulatory proposals, because the entire framework and thinking of how it works is entirely different from one in which you have a centralized world.
So things like a right to appeal any decision, for example, begin to seem a lot stranger when anyone can set up their own labeler for any reason at all, and it's basically just, I have an opinion about this account or these posts or whatever. What happens in a world where you have regulatory requirements around that? It becomes a sort of mismatch between how those systems work and how a lot of the regulatory proposals are coming about.
Yeah, especially this early in the process, with so much experimentation going on, it's just really unclear at what level and where these types of decisions will be happening. I mean, ultimately, the party in charge, or the party who commits, you know, some sort of crime, should be held responsible. But dictating where certain things happen at certain points in the process just doesn't make sense with how the technology itself works, or where we are in terms of figuring out how it will ultimately be adopted. And so, you know, right now, not only would it be very difficult to implement some sort of regulation around it, it would be near impossible to enforce it.
Which is a key point, right? We're in the early phases, and if you try to regulate in the early phases, it's hard to understand where this technology is going in general, let alone in terms of regulating it. I know I had a chance to talk to the three of you before, and one of the things we had said was that the current legislative or regulatory conversations about moderating social media in general never even thought about the fediverse, right? We're just having these conversations now. So let's say a lot did get passed that was targeted towards Facebook, Twitter, or TikTok regarding content moderation, or Google with YouTube; it may not address some of these concerns that could be coming up with a fediverse. And I do want to raise one of the concerns. This idea that you get to design your own experience sounds fabulous, it sounds great. You put in your own content and, as I understand it, your own moderation policies; you can join a community, and then there are different filters that happen at different stages. But it seems that I could create such a personalized system that I really get very, very tunnel vision in terms of what I see, what I hear, what I receive. And general concerns with social media platforms are misinformation and bias, and how do we combat that? Or is that going to be the same problem with the fediverse, or possibly even worse?
Yeah, I mean, I'll start on that one. I think that's already true of the way that social media works today, to some extent, so we're not talking about anything different there, right? I mean, the way social media works today is that, based on who you follow, there may be recommendation algorithms and things like that, but those are still going to be based, more or less, on who you follow and maybe what you like. I think there has been some research suggesting that the story of echo chambers, or, you know, filter bubbles driven by social media, is probably less true than most people think it is. And in fact, a lot of the research has suggested that social media has exposed people to more and different ideas than they've been exposed to in the past. So I'm less worried, generically, about that as a concern; I'm not sure that's been a realistic problem in the way that the world works. I do think that the nice thing about the decentralized systems is that they change the motivations behind where these things play out and how they play out. Generally speaking, when you're on a centralized system, the motivations are whatever is best for those companies. The big ones are all, obviously, public companies, and they have quarterly earnings reports that they have to make, and they have to hit their numbers and meet the bottom line. So the things that you see, and the way the algorithms work, are often driven by what is going to be the most sustainable and the most economically advantageous to those companies. And when you begin to shift that, you have these other systems, some of which are certainly based on private companies, some of which are nonprofits, some of which are not based on any company at all; it just all depends on what approach you have.
And where you're giving more control to the users, the decisions now might be based on other things beyond just the economic advantage of the particular platform. And so we're going to start to see different ways in which those things play out, in terms of what is recommended, how the recommendations are happening, and for what reason. So I think we begin to see some interesting experimentation around there, and then we'll see. I'm not so sure that it is the job of regulators to say, don't just listen to the people you agree with; that creates other issues in the long run. But I think the nice thing about these decentralized platforms is that they offer different incentive structures and different ways in which these things can work, and we're going to learn a lot as these things begin to grow.
To put a really fine point on it, sorry, go ahead, really quickly: I think a really short version of what Mike is saying, and I agree with everything he just said, is that we don't have to worship at the altar of engagement anymore, and therefore we don't have to worship at the altar of hatred, which is effectively what we got when we were on centralized social media that had to pay the bills, as Mike said. We don't have to pay the bills that way anymore, so we don't have to constantly search for the next hit of rage in order to keep people's eyeballs on the screen.
I think the bigger sort of threat here, when we're talking about regulation and content moderation, is some sort of regulation that is written to target Facebook that then constrains the design space of the startups that are working to solve these problems using technology. And so I think any sort of legislation around content moderation doesn't really make sense; we already have laws that protect us from things that are wrong, and we can just hold the people responsible who do those things online using digital protocols. But I think the lack of legislation that supports Web3 flourishing here in the US, you know, Americans being able to run a node, host, participate in these networks, and provide services and resources to them, is what's really holding back progress at this point, and that is what would be really useful.
I think, Amy, that was a good takeaway: don't forget the fediverse — learn about it, and keep it in mind as these conversations, which are still ongoing, continue. It can be a good option or a good alternative. I want to address a question from one of the attendees that I think is similar and on point here. The question was, how do you deal with illegal content? The content this person raised was CSAM, but I think it applies to any illegal content. With a decentralized system for content, what would you say to that?
So there are a couple of different ways in which that gets handled. In the Bluesky system, content that is just flatly illegal is actually filtered out before it even gets to the labeling system, and Bluesky does that themselves at this point in time. In the purely federated Mastodon and fediverse ActivityPub world, it's up to each individual server operator to make sure that the users of that server are not uploading things that are illegal. And that gets enforced eventually by other servers: if there's a particularly egregious instance out there that just refuses to police its own stream, other servers will start wholesale blocking the offending server, basically kicking it out of the network, because they don't want to inadvertently carry that kind of content. So it will just get boxed out of the service altogether.
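The defederation mechanism described here can be sketched roughly in code: a server keeps a blocklist of domains and refuses to ingest posts originating from them. This is an illustrative toy, with hypothetical names, not Mastodon's actual implementation:

```python
# Toy sketch of instance-level defederation in an ActivityPub-style
# network. Names are illustrative, not Mastodon's real internals.

class Instance:
    def __init__(self, domain):
        self.domain = domain
        self.blocked_domains = set()  # servers we refuse to federate with
        self.timeline = []

    def defederate(self, domain):
        """Wholesale-block an instance that won't police its own content."""
        self.blocked_domains.add(domain)

    def receive_post(self, author, body):
        """Accept an incoming federated post unless its server is blocked."""
        _, _, origin = author.partition("@")
        if origin in self.blocked_domains:
            return False  # dropped: the whole origin server is boxed out
        self.timeline.append((author, body))
        return True

server = Instance("example.social")
server.defederate("egregious.example")

assert server.receive_post("alice@friendly.example", "hello fediverse")
assert not server.receive_post("mallory@egregious.example", "spam")
```

The key point the sketch illustrates is that the decision is local: each operator maintains its own blocklist, and a server only gets "kicked out of the network" when enough peers independently make the same call.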
And I think there are some interesting challenges there. Most of the different providers in these spaces recognize that they need to deal with illegal content in some form or another; otherwise they face serious legal, potentially criminal, liability. And so different tools are being built around this. Certainly in the ActivityPub world, there are new tools constantly being worked on to improve that process, to allow the different admins and instance owners to better handle policing, things like that. There are concerns, though. In the traditional, centralized world, especially when it comes to CSAM in particular, there are tools like Microsoft's PhotoDNA, which are widely used among those services, but those are not available to smaller organizations, such as those running Mastodon instances. And that has created an interesting challenge. There are reasons why those tools are not widely available — there's fear of people reverse engineering them or using them in dangerous ways — but that also creates real challenges for people running these smaller services like a Mastodon instance. There are discussions going on about how to deal with that: are there ways to give access to an API or something that allows instance operators to effectively filter, police, and report blatantly illegal content and deal with it that way? But it's challenging. And then when you get to the even more decentralized systems, like Nostr, I think they're still figuring out some of how they're going to deal with that. And there are, effectively — I don't want to say choke points, because they're not exactly choke points.
But there are different places in which the moderation for illegal content in particular can occur, whether it's the relays or the clients or other aspects of the Nostr ecosystem, and they're still sorting out how that's going to play out and who is going to be in charge of dealing with blatantly illegal content. But it is something that each of these platforms is approaching in different ways and trying to explore. And I don't think there's any platform out there that is not thinking about this in some way or another, because if they don't think about it, then police are going to show up at the doors of some people pretty quickly.
I want to address another topic that comes up when there are concerns about regulating the fediverse, and that regards how information is being collected, who is going to be collecting it, and how it's going to be used. Discussions are going on in all sorts of different circles, at the federal and state level and internationally, about how to regulate data privacy and the use of information: what can be collected by brokers, what can be sold by brokers. At least with something centralized, we know who is using that information and why. There are concerns right now about how Facebook and Twitter are using personal information and selling it. There are clearly concerns with TikTok about it not being an American company, and whether that information is being used in ways it shouldn't be, in addition to being used for financial and economic reasons. So now, in the fediverse: how is it different? How is it the same? Are there greater concerns about who is collecting information and selling it? Amy, if we start to regulate and do get a comprehensive privacy bill on how data can be collected, sold, or brokered, will it affect the fediverse?
Right, because we all have that window that pops up now asking
if we right click
So what will the unintended consequences be? I don't know that I can give a clear answer on that right now. But as Mike was saying, one of the main things that's important about Web3 is that it pushes power to the ends of the network — and with great power comes great responsibility. The question of users taking responsibility for their online footprint, for their data, for their access to these tools that they rely on to connect with their friends or to do work, is something that I don't think has really been figured out well yet, because there's just a lot of friction between the kinds of backups we're used to — where you can just hit "lost my password" and get a new one on a centralized system — and having to actually keep track of your keys in a decentralized system. Now, to the point we were talking about earlier about CSAM and other kinds of illegal material online: one of the benefits of some of these decentralized systems is that they can create a bright line between authorized content and unauthorized content, so that I could maybe choose, with my filter settings, to only see content from users who have verified themselves in some way that I'm comfortable with. Therefore I know I'm getting it directly from a source, versus a deepfake of me, or something like that, that's pretending to be me. And the same aspect that makes that possible — public-private key cryptography signing your information — would also be part of this data ownership. Part of owning my footprint means using cryptography to protect my online footprint and my access to these various networks. But then I also have the problem that I can't necessarily recover my keys. There are things being worked on around those kinds of issues, too.
So you can maybe set up a backup where you get some of your friends to all sign it, and by getting three of five signatures from friends you've set up in advance, there's a recovery path. There are different ways people are trying to solve these issues right now. But I would say, when people ask where we are in this progression from Web 2.0 to Web3, a lot of people say we're in Web 2.5 right now. One of the things that's going to be necessary to get us over the hump is finding that balance between things that are acceptably easy for users, given where we are now, and also acceptably secure for the demands that Web3 requires.
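The "three of five friends" recovery idea can be sketched as a threshold scheme. This toy version stands in for real digital signatures with HMACs over a recovery request, using per-friend secrets registered in advance — a deliberate simplification of real social-recovery designs, not any particular wallet's scheme:

```python
import hashlib
import hmac

# Toy sketch of k-of-n social recovery. Real systems use asymmetric
# signatures or secret sharing; HMAC with pre-registered per-friend
# secrets is a simplified stand-in for illustration only.

class RecoveryService:
    def __init__(self, friend_secrets, threshold):
        self.friend_secrets = friend_secrets  # friend name -> shared secret
        self.threshold = threshold            # e.g. 3 of 5

    def approve(self, friend, request, tag):
        """Check that a friend's approval tag is valid for this request."""
        expected = hmac.new(self.friend_secrets[friend],
                            request.encode(), hashlib.sha256).hexdigest()
        return hmac.compare_digest(expected, tag)

    def recover(self, request, approvals):
        """Grant recovery once `threshold` distinct friends vouch."""
        valid = {f for f, tag in approvals.items()
                 if self.approve(f, request, tag)}
        return len(valid) >= self.threshold

secrets = {f: f.encode() * 4 for f in ["ana", "bo", "cy", "di", "ed"]}
svc = RecoveryService(secrets, threshold=3)

request = "restore account for alice, nonce=42"

def vouch(friend):
    """What a friend's device would compute to approve the recovery."""
    return hmac.new(secrets[friend], request.encode(),
                    hashlib.sha256).hexdigest()

# Two approvals are not enough; three distinct valid approvals are.
assert not svc.recover(request, {f: vouch(f) for f in ["ana", "bo"]})
assert svc.recover(request, {f: vouch(f) for f in ["ana", "bo", "cy"]})
```

The design tradeoff the panel is pointing at lives right here: the threshold trades recoverability (any 3 of 5 friends can help) against security (no single friend, and no centralized operator, can reset the account alone).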
Yeah, and "easy" is a loaded word there, right? A lot of user experience research still needs to go into figuring out how to get people to understand and manage their cryptography keys responsibly. Does that answer the question? Yeah.
It's a massive point of friction. When you're talking about moving networks — there's network lock-in on something like Twitter — and about pulling people into these new ones, giving us control over our social graph, answering this question is the primary source of friction at this point. Yeah.
And what's kind of interesting here is that, because we have all these different systems, which tend to prioritize different aspects of this, we're going to begin to see what it is that users actually prioritize as well. Ease of use is certainly something people prioritize across the board, but do people prioritize data ownership or privacy? Do they prioritize having full control over the moderation? There are all sorts of different questions, and it's not clear; different people are going to prioritize different things, and we're going to see how that shakes out. This is a natural experiment: we're going to see which aspects of this, beyond just ease of use and nice UI features, people actually do prioritize. Because, going back to your question, Hillary, on privacy: each of these systems really handles it somewhat differently. There are different aspects to the way Mastodon handles privacy questions versus Bluesky versus Nostr versus Farcaster, and some of that is going to come down to user preferences — which things are most important to them, what level of control, what style of control. That's something we're still finding out. And similar to the question around content moderation and regulations, the questions around privacy regulations apply in a similar way: some of the proposed privacy rules going around might make sense in a world that is all centralized, all Facebook and Twitter, but don't necessarily make sense
in a world where you have something like Nostr, where there is no central entity that could even be collecting data on users — and yet they may end up with certain requirements based on these privacy laws. So understanding these different systems — functionally how they work, how they're different, and how they might not even fit into the framework that a lot of privacy laws are being built around — is really important. As people are looking at different regulations, having at least a basic sense of how these systems technically work, and what happens in these systems if you pass a certain kind of privacy regulation, is going to be really important.
So what I'm hearing is that we're at the beginning of an experiment, even though it's been going on for a while — I use "beginning" loosely. Since this is a federated system, interoperability, collaboration, and working together can lead to more success in a variety of different ways, depending on what users actually want. As you all said, ease of use seems to be an obstacle to adoption for any new technology, and then there are these other protections that people are looking for. So I want to ask you this. We are in the beginning stages, but consider the discussions right now around the bill that passed Congress about banning TikTok. If the fediverse were fully established, and somebody had their 30 million users, and something got banned, they could theoretically take that with them, right? That really is one of the promises: you wouldn't be as worried about TikTok being banned if it were a federated system. Is that correct? I just want to make sure.
That is, yeah — it would be a lot more difficult to do a TikTok-style ban of a decentralized system, putting aside the legality of the TikTok ban itself. The mechanism right now is that the app stores effectively have to block it, and that would be much more difficult to do with most of the decentralized systems because, almost by their very nature, they allow for many different apps that can all access the network in different ways. There are other ways of getting access where there isn't the same choke point. The TikTok ban in particular relies on the fact that there's a choke point in the app store, and part of the point of most of these decentralized systems is that, by being decentralized, you don't have single choke points, or they're much harder to get at. So I don't think you could ban a decentralized platform in a similar way. I'm sure there are creative legislators out there who might come up with ways to make it a lot more difficult and throw some sand in the gears of getting access to some of these, but it would be a lot more difficult to do.
Well, once the TikTok ban was proposed, there were many different users who spoke up and said, I have my small business on TikTok, or I have this many followers. What I see as one of the promises of the fediverse is that it wouldn't matter whether you were hosted on TikTok, or on Twitter, or on YouTube — that whole discussion would be moot.
That's the goal.
If we get it right, that's the answer. TikTok would just be the interface, the application you're using because you like the filters they highlight, or where they put the buttons, or that they have a dark mode — whatever it is that's your cup of tea about how they lay it out. The thing that's interesting here, especially with the rise of AI, is that it opens the door to a massive proliferation of personalization at the app level, where you can almost just ask your AI to write you an app that looks a certain way, because there's this massive data lake it's pulling all of that information from. All of the different aspects that would make up that app — your identity, how you upload videos or photos, how you do the tagging — are just composable protocols that get plugged into that app and give you the output you're looking for. That's the dream.
And even today: if you choose a Mastodon server and for whatever reason you end up not liking it — maybe you disagree with the moderation policies, now that you've been there for a little bit — you can move servers. It's not 100% seamless, but it's pretty good. You can pull up stakes, your followers will come with you, and anyone who wants to find you afterward will find you at your new address. It's kind of like setting up forwarding on an email address and moving to a different server. It works pretty well.
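Under the hood, Mastodon signals this migration with an ActivityPub `Move` activity: the old account announces its new address, and the new account lists the old one in `alsoKnownAs` so followers' servers can verify the move before re-following automatically. A simplified sketch of the shape, as Python dicts (illustrative, not a byte-for-byte Mastodon payload):

```python
# Simplified shape of the ActivityPub Move activity used for account
# migration -- illustrative, not an exact Mastodon payload.

move_activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Move",
    "actor": "https://old.example/users/alice",   # the account moving away
    "object": "https://old.example/users/alice",
    "target": "https://new.example/users/alice",  # the new home
}

# The new account must point back at the old one, so followers'
# servers can verify the move rather than trusting the claim blindly.
new_actor = {
    "id": "https://new.example/users/alice",
    "alsoKnownAs": ["https://old.example/users/alice"],
}

def move_is_verified(move, target_actor):
    """A follower's server accepts the move only if the target account
    vouches for the old one via alsoKnownAs -- both sides must agree."""
    return (move["target"] == target_actor["id"]
            and move["actor"] in target_actor.get("alsoKnownAs", []))

assert move_is_verified(move_activity, new_actor)
```

The two-sided check is what makes migration safe: nobody can hijack your followers by merely claiming to be your new account, because the claim has to be confirmed from the new account's side as well.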
Before I go into some of the questions, I do want to clarify — and thank you, Patrick — that the law does not officially ban TikTok; that's not exactly what the law does, and I apologize for using those words. It's more complicated than that. But say there were a law that specifically banned Twitter, or a law that specifically banned YouTube — that's basically where I was trying to go with that. So thank you, Patrick; I do want to be specific with my terms. I didn't mean to interrupt anyone — does anyone have any final words on that? Otherwise, I'd like to go into some of the questions that have been proposed. One of the attendees asked: in order to achieve the content moderation that is now being discussed, could this be accomplished by regulating Facebook and X (formerly Twitter) to allow individual users to adjust their personal filters? Or would it be necessary to go so far as to allow multiple companies to run the system?
So I think there are a few assumptions in there, and "it depends" is kind of the answer. There is this concept, which I think is what that question is getting at, of middleware: can we create a space in which you still have centralized providers of the services, but there's a marketplace for third parties to provide services on top of that, around content moderation and maybe around other things? I think that's a really interesting space, and there's value in exploring what's happening with the middleware concept, but it's still one that is fundamentally based on the idea that centralized services control the ground layer. Really interesting things could happen with that, but if there's some sort of mandate, it becomes very tricky when you get into the weeds of how you do it in a way that continues to protect privacy and doesn't lead to bad outcomes. You could argue that in a world in which there is mandatory middleware, you get situations like Cambridge Analytica, which was set up as an app on top of Facebook that was supposed to be a fun sharing of data, and was later converted into being used for political messaging — which was not what people signed up for — with lots of concerns, lawsuits, and everything along those lines. If you're mandating middleware on a centralized platform, you could still end up in situations like that. There are also questions around things like Clearview AI, the facial recognition service that built its AI system by scanning all sorts of social media pictures it was able to get access to. So there are always concerns about how you handle that.
Again, getting back to the issue of privacy regulations, and how privacy regulations interact with that — I think all of those are interesting areas to explore. But they're fundamentally different from what I think we're trying to get at with these discussions of the more decentralized systems, rather than just building another layer on top of the centralized ones.
I'm gonna jump into another question. We discussed the hurdles, but what do we need to do to incentivize existing platforms or networks to open up and join the fediverse? We didn't mention Threads — Threads just joined the fediverse. But not everyone is federating with each other; I know there's a website that will show which ones are and who they're connected to. So we're talking about a fediverse, and there's this assumption that everyone's connected. Number one, that isn't true. But number two, what do we need to do to incentivize opening this up and getting people to join?
Well, one extreme is the EU's Digital Markets Act. Right now they're just focusing on chat apps, and frankly a bare handful of those, but they've said that they want to expand the scope of what is covered by the interoperability mandate in that regulation. I don't think that's a terribly American approach — I don't think we would ever see that happen here as a law through Congress — but it's the extreme version on one end.
Yeah, and I think there are a few other interesting approaches along those lines. One is that a lot of smaller and midsize companies are recognizing that there are advantages in this, and that they don't have to be as reliant on the big centralized players, which has been an issue in the past. So you have companies like Flipboard, and Medium, and Ghost — smaller, midsize companies that are embracing the fediverse — because they were previously very reliant on Twitter and Facebook, and that presented certain challenges. They're recognizing the basic advantages of connecting to this world. I think there are some other, more speculative policy proposals that could be really interesting. I'd love to see some sort of safe harbors — potentially antitrust safe harbors — for companies that are willing to adopt a decentralized system that allows users to get out of the centralized system. So if Meta or Google are scared of antitrust cases being brought, and they're willing to adopt decentralized tech that allows users to still reach the users on their system without being on their system, maybe there's an interesting safe harbor proposal there.

I wrote a paper a few years ago recommending basically that.
That'd be awesome. I think the last piece of it is just the financial incentive: every time the Web grows, the total addressable market gets bigger. There's all this discussion about the value of the content that influencers and people are putting out there, and who owns it. If you could actually put some sort of licensing or commercial terms on your content when you publish it, and the app that's sharing it on your behalf is able to receive a cut, that creates a financial incentive that is baked into the protocol itself. That's part of the cryptocurrency side of this conversation, but there's a lot of potential for revenue there.
I appreciate that — like minds think alike on this issue. And you addressed a bit of one of the questions, which is: what are some policy recommendations to improve the fediverse and benefit users? I think we touched on a lot of that when we talked about how the main concerns that exist outside the fediverse also exist inside it. We don't want to forget that there are data privacy concerns, and we didn't even get a chance to really talk about accessibility and inclusivity. We want to make sure that as this is being designed, while the fediverse is in its experimental stage, it's inclusive, that people can use it effectively, that it's actually useful and they're empowered, and that there are legal protections. I think all of that answers this person's question, unless there's something else you want to add. We have a few minutes for last words, and I do want to thank all of you for coming. If there's one thing you want all the listeners to take away, now is your chance. So, to put you on the spot, Ross — that's your 30 seconds.
Yeah, I think the one thing that is really exciting, that we didn't quite touch on or illuminate as much, is the way in which the fediverse gives more power to users — not in the sense of limiting what they can see, but in growing what they can see. If you're the sort of person who, like me, every time you opened up Twitter, used to go up to the top and say, no, I want it in chronological order, not whatever you think I want — the fediverse is for you. Because it's completely open, your client can do whatever you want, and it's not going to keep flipping back to the other way, or at least it shouldn't. But if you're the sort of person who loves the idea of a really curated algorithmic feed — oh my god, the world is your oyster here. People are building feeds for themselves on Bluesky, for example, that say, show me all the penguin pictures, and there's some AI scanning every post that comes through Bluesky, pulling out all the penguin pictures and putting them into your feed, if you really love penguins. The level of control that individuals have now is amazing.
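Conceptually, a custom feed like the penguin example is just a filter function someone publishes over the firehose of posts. Bluesky's actual feed generators are services that return post references over the AT Protocol; this sketch, with made-up data, only illustrates the core idea that the algorithm is the user's choice, not the platform's:

```python
# Conceptual sketch of a custom feed: a user-chosen filter over the
# network firehose. Bluesky feed generators are real services; this
# toy version just illustrates the idea with invented posts.

firehose = [
    {"uri": "at://u1/post/1", "text": "Look at this penguin waddle!"},
    {"uri": "at://u2/post/2", "text": "Hot take about markets"},
    {"uri": "at://u3/post/3", "text": "Baby penguin pictures, thread:"},
]

def penguin_feed(posts):
    """Keep only penguin posts, newest first -- the selection logic
    lives with the feed author, entirely outside any one platform."""
    matches = [p for p in posts if "penguin" in p["text"].lower()]
    return [p["uri"] for p in reversed(matches)]

assert penguin_feed(firehose) == ["at://u3/post/3", "at://u1/post/1"]
```

Because the feed is just code anyone can publish and anyone can subscribe to, swapping algorithms is as easy as picking a different function — which is exactly the "world is your oyster" point being made here.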
Any other last words?
The only thing I'll add, just really quickly, is that it is actually a really exciting time. There is all this experimentation going on, and it's very easy to feel that the world of the internet is stale — that we have these five companies, each made up of screenshots from the other four that are being passed around, and that's what everyone is talking about, and everyone's mad about this or that, or concerned about privacy. There's something really interesting and fascinating going on with all of these different experiments in decentralized systems, and that's something we should be excited about. We should hope that more of that continues, and that we begin to see what comes out of it, because it's really the early days. It's something that is different and fundamentally more optimistic than what we've dealt with for the last decade.
I completely agree. Web3 makes the internet better. It's the opportunity for the internet to live up to that original dream we all had of a free and fair marketplace for ideas and for people to connect. And I think we can achieve that with these new tools.
Thank you for ending on such a wonderful, optimistic note. I look forward to seeing how the fediverse grows and unfolds. Thank you again for your time and this interesting and engaging discussion. And with that, I'm going to say thank you to the panelists and attendees, until we all get another chance for the next fediverse panel. Thank you.