Why social media is broken and what to do about it
8:48PM Mar 3, 2021
Hey, everyone, and thank you for joining us today. We really appreciate your time. Today we're going to talk a bit about how social media is broken. That's not a particularly controversial idea; I'm sure everyone on this panel would agree. But in recent years, and particularly in recent months, social media's problems have been manifesting at a societal scale, and obviously that's an even bigger problem than before. From deadly COVID misinformation, to white supremacists organizing militias online, to things like harassment that have been around for ages, all of this stuff has come to a head this year. So it's a conversation we want to have. And on the other side of the coin, I think it is possible to imagine something better. If social media is broken, there's bound to be a way to fix it, and if not, there's bound to be a way to build something better. That's where we want to kick off the conversation today. On that note, Naj, I'd love to start my first question by asking you a little bit about your projects. You have two things going on, Ethel's Club and Somewhere Good, both communities built by and for people of color. They sound like really special places, and they start from a really different idea than traditional, huge social media platforms. So I'd love to know what you think about whether social media platforms need to be smaller to be better. Do they need to be small? Do they need to be custom built for communities? Is size important here?
Yeah, thank you for asking that. I think the most important thing that social platforms should do moving forward, and something that we're super focused on, is to be intentional from the start: about size, about our framework, about the foundational aspects of what the product is delivering to the end user. One of the aspects that was really important to us when we got started was size. A core piece of Somewhere Good is connecting people with smaller, more intimate communities, because people tend to behave better in smaller networks. One of the things that we use in our product meetings to base a lot of our decisions on is mimicking a real-life dinner party. Who would you invite to that party? How do people interact when they're in those kinds of smaller, intimate spaces? It tends, again, to be a lot more aligned with better behavior. So that's our perspective on one of the many ways in which social networks can be more user friendly.
That's really interesting. Obviously, Ethel's Club began as an offline community, right? It was founded pre-COVID, and then you had to transition into being an online community, which is a really unusual story, I think, and maybe a lesson for other social networks.
Yeah, we had to make the decision in about three hours. It was March 12 of last year, so we're coming up on a year. And it answered the same problem, which was that in the physical spaces people were visiting pre-COVID, you'd go into a space and it was an empty white box, and you didn't have that familiarity. Even with Ethel's Club, the physical location in Brooklyn, we built it out so that it necessitated smaller, more intimate interactions among our members. So we took that same little nugget of knowledge and used it digitally, and Somewhere Good is now our technology platform that allows us to build that out at a much larger, faster scale, but still with that level of intention, because people want to be connected with one another in a more intimate manner.
Something really interesting about your project, and I think it fits into a broader conversation we're going to have here, since we might get a bit into policy and what's going on politically with tech platforms, is the idea that it's custom built from scratch and it's small. The big conversation right now in tech is that we need to break up Big Tech, that Big Tech is the problem and the problem is that it's big. Jesse and Rashad, I'd be really curious to know what you think about this idea that you could build social networks from a completely different place. Is bigness part of what's made social networks today so toxic and created all these problems?
I can take a first crack. I think what Naj said about intentionality is so important, and I do think these platforms have gotten too big. Facebook, for example, is theoretically all built around community, but you just can't really have a community of 3 billion people. That's just not how communities work. But part of it is about intentionality. There are platforms, Snapchat for example, that I'm sure have plenty of issues, but Snapchat was designed with a certain intentionality, and you don't see stories about it being overrun by misinformation, even though it is a large platform. That's because it was fundamentally designed in a way such that you're not just being shown algorithmically recommended and amplified posts from whoever, based on virality. It was built in a way where they were clear-minded about how they wanted people to be interacting and what the app was supposed to do. So Naj makes such a good point: if you have that intentionality from the beginning, everything else can flow from there. Whereas if your goal is just growth at any cost, you're going to end up with all sorts of problems.
I would agree with all of that. Here's what I think is important. I live in New York City, in Manhattan. You go down the street in Manhattan and you see all sorts of buildings: some are really big, some are really small, and some are in between. And there are rules about how these buildings get designed, and there's accountability on the back end. People who call themselves engineers and architects build these buildings, and if those buildings fall apart and hurt people, they can lose their licenses or be held accountable. Meanwhile, we have a whole industry in tech where people can hold those titles, build things, and face no accountability for the impact. So yes, something can be big and something can be small. But if you don't have rules at the heart of the design process, and accountability on the back end to create dynamics people have to hold themselves to, then even if you keep things small, you could have a whole bunch of small networks of white supremacists creating clubs to then go out and do violence. It could be small and still do a lot of damage. So what I want to speak to is what's at the heart of how we hold accountable an industry that has so much potential to bring us into the future, but is dragging us into the past, because of the incentive structures, the violent algorithms, and, as Jesse said, this idea of growth and profit at the expense of safety, integrity, and security, all the things that other industries across the board hold themselves to in some way, shape, or form.
While we're talking about those incentives and how large tech companies operate: Rashad, something really interesting about Color Of Change is that it didn't begin as a tech accountability organization. Color Of Change integrated tech industry work into its existing work, and it has a really broad mission; it addresses things like voter suppression, policing reform, everything. Based on that dual perspective, operating both in the tech world and in a number of other industries, have you noticed any attitudinal differences in how tech considers itself when you're working with the tech industry versus, say, Coca-Cola or whoever else?
It's complicated. First of all, we operate from a theory that people don't experience issues, they experience life; that the forces that hold us back are deeply interrelated; that a racist criminal justice system requires a racist media culture to create the demand; that political inequality goes hand in hand with economic inequality and vice versa; and that a hostile climate for communities is not something you can solve by just hitting one issue over here. So while we did not start as a tech accountability organization, we started as a racial justice organization in the aftermath of a flood, Hurricane Katrina, that was turned into a life-altering disaster by bad decision makers, with Black folks on their roofs begging for the government to do something and literally left to die. The thing about Katrina that's really important, that animates our theory of change and really connects to your question, is that Katrina illustrated things people already knew: geographic segregation, generational poverty, the impacts of what we've done to our planet, all the ways in which structural racism undergirds underinvestment, disinvestment, and the targeting of communities. But at the heart of Katrina, no one was nervous about disappointing Black people: not government, not corporations, not media. So if you think about the tech world and people building platforms, think about the head of Zoom saying, in the earlier days of the pandemic when all the Zoom-bombing was happening, that they never imagined people would want to interrupt someone's gathering. Well, you could only fail to imagine that if having your gatherings, your quiet get-togethers, interrupted or disrupted is not your lived experience. To the extent that our mandate is to force institutions to be nervous about disappointing us, to bring about change in the areas that impact Black people's lives, that's important. But I do think that in Silicon Valley, and in the tech world in general, there's a tendency for people to be a lot more impressed with themselves, impressed with their politics, impressed with their ability to believe in something bigger. Sometimes it's a lot easier to deal with a Coca-Cola, where I go in and I know these people think they're making soda, not making a new society for all of us. I can deal with the impacts of something they're doing, and we can all be on the same page, because they don't think they should get a Nobel Peace Prize. But in the tech world, people think we can code our way out of structural racism, when in fact the code is just amplifying structural racism. And so it becomes a lot harder. One quick thing I will say: in the aftermath of the killing of Trayvon Martin, with all the voter ID laws spreading, we had a very major campaign against the American Legislative Exchange Council, otherwise known as ALEC, which was behind Stand Your Ground laws and behind voter ID. We got a lot of corporations, probably over 150 in the end, to divest. It was all over the news, right?
We had gotten corporations that cared about the environment, that cared about justice, all these things, to divest from ALEC. Then, in the middle of the Zimmerman trial, a whole bunch of Silicon Valley companies joined ALEC, because they thought it was going to help them on some law in California that was going to create more accountability. The fact that they didn't have anyone around the table who said, "Hey, we might still want to work on this law, but we probably shouldn't join this group that people are protesting over the killing of Trayvon Martin and the Stand Your Ground laws that shielded George Zimmerman," is an example of how these companies, as large and as powerful as they've become, lack depth in terms of the real world and how race and gender inequality impact people's outcomes.
While we're on what you said about tech being ahistorical by nature, blindsided by these systemic issues that have been around forever and are manifesting in new ways: Jesse, I would love to ask you, and I'm sure you've been asked this before, in 2016 you were working on Hillary Clinton's campaign for president. In hindsight, being perfectly honest, looking back with the experiences that you had, did you see some of this stuff on the horizon? The conspiracies, QAnon, misinformation? What could you see, and what couldn't you see, from where you were?
Yeah, it's a really interesting question. There's no doubt we had a lot of discussions about how to manage it. I love Hillary Clinton with all my heart, but she carries more baggage from 40 years of disinformation targeted at her, predating Facebook and all of that. And then she faced a unique opponent. I don't think Donald Trump was necessarily an aberration so much as the culmination of years of the Republican Party moving toward more outright discriminatory racial appeals and just bigotry. We had to deal with all sorts of debates internally, like: how are people going to react if we call Donald Trump a racist? How are people going to react if Hillary Clinton is the first presidential candidate in modern times to call her opponent a racist? She gave a big speech about the alt-right, which at the time I think was pretty groundbreaking. At the same time, looking back on it, I wish we hadn't called it the alt-right, which is the sanitized version of what it really is, which is white supremacy. So when you ask, did I see QAnon coming? Probably not. But did I see this horrifying crawl toward more and more explicit racism, which has always been there, obviously, but had at least been relegated to the corner in some sense, where people didn't feel okay being out-and-out racists, proud and out in the open as part of these kinds of communities? We definitely saw that on the horizon, and I don't think we necessarily grappled with it the right way. I don't know; it was a really challenging thing to try to navigate at the time. But I very much felt like something really ugly was coming, and I poured every hour of my waking time during that year and a half into the campaign, in part because I was terrified of where the country was headed if we didn't win the election. Everything I feared, and more, has definitely come to fruition.
While we're doing a bit of looking back: it's simplistic to say, oh, this is where it all went wrong with social media platforms, it could have been different from the start. Because, as Rashad was saying, there's an intentionality that maybe wasn't there in the beginning, as we're seeing now. Naj, I would love to talk to you a little bit more about the model of what you're working on, because it's just completely different. Ethel's Club is, what, $16.99 a month, I think? (Yep.) Right. It's a paid subscription: people who want to be involved in the community sign up, pay monthly, and get to engage. It might be overly simplistic, but do you think there's an argument that social media being free, by which I mean being ad-supported, where the user is the product, is arguably where it all went wrong? And do you think social networks need to stop being free, like you've chosen to do, to realign those incentives?
Yeah, I don't think the issues we're seeing come from the internet and social platforms being free. I think they come from deeply embedded systemic problems, as Rashad mentioned, that have haunted this country since day zero. The people creating these platforms are creating what they know, which is patriarchal systems that live within platforms that look glossy, have illustrations, and use fun, human-centric language, but at the heart of it don't make space for what marginalized communities feel on those platforms. I think that's embedded in almost every platform we use. As Rashad mentioned, there's Zoom, which I don't think we would consider a social platform, but the fact that they didn't recognize that there are people who want to cause terror to Black people and people of color purely for fun is a big issue. With Somewhere Good specifically, and also with Ethel's Club, all we've done is say: we know what that is like, and so we're simply creating an alternative. Whether that alternative is free or paid is a much larger discussion. Ethel's Club is obviously paid, and as a member you get access to a community centered on holistic wellness; it's incredibly specific. Somewhere Good will be a free platform, but the thing we're doing differently is taking moderation and safety incredibly seriously from the beginning. Before we had a feature Trello board for things like audio or video, we were talking about questions like: how do you build consent between two parties on the platform? How do you make reporting something that is accessible and quick, and not something where you're forcing someone to relive a horror or go through some long process where they maybe have to call a number? How do you get in front of these issues that my team, which is composed of Black people, Latino people, Asian American people, queer people, non-binary people, experiences every single moment of being online? That can be Tumblr, it can be Twitter, it can be almost every platform. All the things that we've lived through, we are saying: what if we didn't have to? And what does it take to create that kind of platform? We're constantly grappling with questions about the interactions we're having now and the interactions we've had in the past, and that's what's driving us to make a better platform, not necessarily the financial difference of something having a cost behind it or being free.
I'm glad that you brought up moderation. That's actually something really striking about Ethel's Club: when you go on the site, it says there's a zero-tolerance policy, that no "isms" are allowed, which is great. And I'd be curious, has that played out in any kind of nuanced moderation discussion so far? I'm not asking whether you've had issues or whether anything has gone wrong, but I'd just love to know what those conversations look like, and, for Somewhere Good, how that's going to work with the moderators who will be appointed to those communities.
Yeah, I mean, we take moderation incredibly seriously. Again, before we even knew what the product was trying to sell people, we knew that it had to feel safe and highly moderated, top down. And we have not had any issues to this day over this past year, and this is me knocking on wood. I think a lot of that is because we do cause some friction in the user experience during onboarding. In traditional product-building language, you try to make it as simple as possible for people to just sign up and kind of be thrown into a room, where they're given their pedestal and they can shout. We actually like to put up walls and make people jump through hurdles, because it's important that the people who are spending their time in our communities want to be there and understand the implications of what it means to be a part of that community. If that means you have to read through 12 guidelines and actually mark off every single one, that's what has to happen. For the events that we do with Ethel's Club, we send guidelines to the talent ahead of events, so anyone who comes into our space understands what we built and the importance of following the rules we've put in place. I could talk all day about moderation; I'm constantly thinking about ways to make it stronger and better. But I do think it boils down to the fact that we've done a great job of honing in on who our end user is, so that when people find us, they say: I've been looking for this super curated and moderated place online for a long time. And so we haven't had too many issues there.
That's such a special experience, having users finally find the thing that they've been looking for, so that's really great to hear. Jesse and Rashad, to get a little bit into the policy discussion: what Naj is talking about, having that kind of moderation from the beginning, obviously contrasts with the major social platforms, which are still struggling and still haven't devoted enough resources to taking things like racist hate, harassment, and misinformation seriously. These problems are ongoing. Of the kind of platter of policy solutions that are out there in 2021, things like Section 230 reform, state and federal antitrust suits, bills targeting more specific stuff like algorithms, which seem most promising to you all? And will any of them really have an impact when you can fine a company $5 billion and it just brushes it off?
Well, that's a big question, because we do actually have to deal with the fact that fines like the FTC's are things these companies can simply write off; they don't actually provide the type of pain necessary. Some of these companies need to be broken up. Some of these companies need to be regulated in deeper ways. Some of these companies hide behind Section 230, which means we need a reform of Section 230 that both upholds civil rights principles and deals with targeted marketing on these platforms, and with the ways the companies can wipe their hands of it. We need some sort of CFPB- or FDA-style infrastructure at the government level, because anyone who has watched some of the hearings knows that hearings on nuclear power, or even on keeping our milk safe, would look the same if we didn't actually have government infrastructure. The people we elect to Congress can't be experts on all these issues. Part of having that infrastructure is the same as what I said about buildings that meet code: it's not because our elected officials are experts, it's because we build the infrastructure at the government level, and we have fines and accountability at a scale that ensures there are consequences. So I think all of those things are going to be necessary. Whether or not we have the political power is really the question. And right now, it should be clear to all of us that we're going to need a lot more people involved, raising their voices, if we're going to win any real change. We will lose time and time again in the back rooms, because of the power of Silicon Valley, because of their money, because of how complicated these issues may feel. We will lose in the back rooms if we do not have millions of people lined up at the front door. And part of lining people up at the front door is centering racial and gender justice in these issues, recognizing the motivational role that communities will have to play in rewriting and reshaping the rules.
Yeah, I totally agree with everything Rashad said there. When you were ticking through the list of policy options, I was going to say: yes, we need all of these things. It gets to something Rashad said earlier, which is that the fundamental incentive structure around large social media platforms right now is so perverse. They are incentivized to amplify the most toxic content, disinformation, hate speech, because that stuff drives engagement. And so long as they run platforms where they have no accountability, essentially, talking about Section 230 reform, there's no way to hold them liable from a legal standpoint, and the FTC is not hitting them with fines that really hurt them, there's just no friction anywhere. So the more engagement, the better for them. I don't know, Rashad, how you put up with continuing to meet with these people and have them tell you, "Oh, we don't profit off hate," when that's just fundamentally not true. I'm trying to censor myself here. Mark Zuckerberg has written posts about how, no matter where they set the line, the closer content gets to the line, the more engagement it generates. The more sensationalized the content, hate speech, disinformation, conspiracy theories, the more engagement it generates; and the longer you're engaged, the more data they can collect and the more advertisements they can show you. There's just no incentive for them not to do that. Until we fundamentally upend that incentive structure, they're going to continue profiting off hate and distortion and deceit and delusion and discrimination. That's just the reality. The content moderation stuff is very nuanced and very important, and it's so refreshing to hear someone like Naj coming at it from the outset, asking what kind of community we want to build. But at this point, with companies like Facebook, we can't just be talking about content moderation. We need to change the incentive structure.
That's a great point about looking at the big picture. We have a question from an audience member that I'll throw out, and I actually like it because it segues into a different question I was curious about, so I'll give a bit of context first and then get to the question. There's some interesting near-future stuff going on right now on social media. We've got voice-based platforms like Discord and Clubhouse, which are very hot right now; people are talking about social media moving in a different direction, trend-wise. So the question from the audience member is: this is so timely, given all the hype around Clubhouse; I'm wondering whether social networks that grow more slowly build better and more polite large communities.
I think that, once again, this gets back to the rules. White nationalist groups can grow slowly too. Isn't Roger Stone on Clubhouse? Weren't they just hosting him for something the other night? All of this stuff heads in the same direction when the incentive structures are off and the goals are off. We keep trying to figure out a way for racism, for misogyny, for all of these things not to seep in, and the best way is to create structures, rules, and accountability. Otherwise we will end up exactly where we're at right now. We may just end up there a little bit slower, but that doesn't necessarily change the impact on the communities that have to bear the brunt of these platforms, which have created a hostile climate for them.
Thank you. Unfortunately, this is a huge conversation and we're about out of time for today. I wish we could go on forever, and I'm sure we could. But thank you all so much for your time; we really appreciate it. We loved hearing about what you're building and what you're working on. Imagining a better future for social media, one that would make all of us a lot safer, healthier, and happier, is always a good thing.