The Big 5 S2 Ep.9 - Deplatforming and the importance of moderation on social media
3:45PM Jan 31, 2023
Hello and welcome to the Big Five podcast from Northumbria psychology department, where we learn big facts about human behaviour and experience. My name is Dr. Genavee Brown and I'll be your guide into the minds of psychology students, alumni and researchers at Northumbria University. I'm a lecturer and social psychology researcher in the psychology department, and I love learning more about all fields of psychology. Each week on this podcast I'll speak to a guest who is either a student, alumnus or researcher in the Northumbria psychology department. By asking them five big questions, we'll learn about their time studying psychology. Today I have the pleasure of speaking to Dr. Carolina Are. Hi Carolina.
All right, thank you for having me. Really excited to speak to you.
Me too. So Carolina is a Research Innovation Fellow at the Centre for Digital Citizens, and she's going to share a bit of her research with us today. So just to start with: what is your research topic, and how did you get interested in this topic of study?
So my research topic mainly surrounds deplatforming and content moderation, the content moderation of nudity in particular. And I kind of ended up doing that as a backup, because while my PhD was in content moderation, it was in the content moderation of online abuse. So what happened was that I was observing a really, really violent subculture that I'm not going to name. I ended up not being able to publish from my PhD, because if I published I would have outed myself to these people. And previously, researchers that published on this subculture have essentially been trolled for life, like it hasn't stopped. So I wanted to avoid that. And conveniently or inconveniently, during the second year of my PhD, Instagram started adding more and more censorship to some specific accounts and some specific content. And because in my free time, and also to support myself during my PhD, I was a pole dance instructor, I was affected straightaway. Because pole dancing is an art and a sport that was created by strippers. So like strippers, we don't wear that much in terms of clothing, we wear chunky shoes, and, you know, we dance around the pole. So it's very weird and difficult for an algorithm to understand what we're doing. And basically, I started contextualising what was happening with this censorship within the broader platform governance literature and research, because that was my expertise from my PhD, and I also had previous professional expertise as a social media manager before retraining to become an academic. So I have a blog, which is called bloggeronapole.com. And I was just like, okay, this is happening to me and my community, let's put this into context. And I asked Instagram for comment, because coming from public relations and marketing, I was like, well, I can't slag off Instagram without giving them the right to reply.
And strangely enough, they did reply. That started a conversation with the platform which, even if they don't say very much, proved insightful for a variety of reasons. So that resulted in me having increasing material on censorship, why it was happening, what was happening, as well as a means to observe my own experiences. So I conducted a lot of autoethnographies. And obviously, with me being a pole dancer, I am in the same social media circles as strippers, as sex workers, as sex educators, as a variety of people who post content involving nudity and sexuality. So doing my research became a way to inform these communities and to help them with their experiences of censorship. And yeah, that's continued ever since.
Awesome. Yeah, I find it really interesting what you said about pole dance being a sport, because there are other sports where people wear very little clothing, and I imagine those wouldn't get censored. So what kind of things are these algorithms looking for when they're trying to censor things, that makes pole dance or sex education or topics like that stand out and get censored?
So that's a really good question, actually. I would say that other sports where people, and particularly women and gender nonconforming people, don't wear that much also get censored quite a lot. Which is why my activism campaigns at the time really took off, because when I received an official apology from Instagram about the censorship of pole dance, a variety of other users across all different fields were like, oh my god, this is what has been happening to me, because obviously it made the news. So it made a lot of people feel seen and heard. So they do get censored; however, you are right in the sense that they probably don't get censored as badly as pole dance would. And obviously, pole dancing is rooted in the sex industry, because strippers started teaching other strippers, and they realised, oh, this makes me money, let's bring this outside of strip clubs and let's teach women who don't do this job as well. Pole dancing became this really fun, really entertaining sport and art that still owes so much to strippers. You know, we dance wearing the same things and we dance on the pole. So it becomes very sensual and sexual, if you want it to be. Obviously, there is a very fitness-focused version of it, which is trying to access the Olympics and stuff like that. But I personally prefer sexy pole, and I find it more interesting, because otherwise part of me is like, well, I could just do gymnastics if I wanted to. Although, you know, that's a beautiful and hard thing to do as well. But I'm more interested in that expressive side of pole. And this censorship of pole and of sex work and of everything surrounding bodies and sex happened and increased on the back of FOSTA-SESTA, which is an exception to Section 230 of the Communications Decency Act in the US.
So before this exception, platforms essentially had no legal liability for what was posted on them, which meant that, let's say someone posted, I don't know, something incredibly violent on Facebook, Mark Zuckerberg wasn't going to get done for it, because he was just an intermediary. But this has now changed, not for all content, but for content essentially showing nudity and sexuality, because what this exception did was make platforms legally liable for hosting content facilitating and promoting sex trafficking. Now, you might be like, well, sex trafficking is not sex work, and sex trafficking is not pole dance. Of course it's not. But this law lumps in consensual sex work, a job, with sex trafficking, a crime. And on top of that, because platforms work with machine learning, what it's done is made them really afraid of getting really heavy fines. So they made their algorithms detect anything remotely sexual, and it's become a slippery slope. Like, when I got that apology from Instagram, it was specifically about shadowbanning, which is a light censorship technique where your content isn't deleted by platforms, but it's made a lot less visible; it doesn't end up on the Explore page or on the For You page. So if you're a content creator making money or reaching audiences through your profile, it really affects you. And as dancers, we find inspiration through social media, right? We find trick inspiration, choreography inspiration, and we look at hashtags. The pole dancing hashtag system is like hashtag PD, for pole dance, and then the name of the move. So when a few pole dancing hashtags were shadowbanned, all of those hashtags were shadowbanned all of a sudden. So it just became this slippery slope. And at the time, I remember even looking at hashtag female fitness.
You weren't notified that your content was shadowbanned, because you don't get a notification when you get shadowbanned. But if you looked at the hashtag page, it said hashtags in this, you know, family of hashtags have been restricted, because our community has found that they go against community guidelines. And if you looked at the pictures and videos featured on that page, it was women in sports: women in sports bras, women in leggings, even women with a T-shirt on. And then if you looked at hashtag male fitness, that was not shadowbanned; it was full of, you know, bare-chested, oily dudes, but they were fine. It was female fitness that was not okay. So, you know, it's just this slippery slope that affects a variety of users in all sorts of ways, sadly, and it's mainly women, because, you know, the population of big tech workers is largely men, so I guess they code into their algorithms anything related to women and LGBTQIA+ people as sexual.
Yeah, I was actually going to ask you about that, and what role you think gender plays in this dichotomy. Because, as you said, you know, volleyball players on the beach who are male and topless, that's fine, but for women it wouldn't be the same case. Or, you know, I've seen a lot of discussion recently about transgender people posting, and Instagram banning images of nipples after top surgery, for example. So I wondered, yeah, what are your thoughts about how gender plays a role in this, and how can we combat some of those biases?
So gender definitely plays a role in this. And I think, as I said, it's definitely related to the population of workers in big tech, because, occupying a space that is in between sociology, criminology, psychology and cybersecurity, every time I go to cybersecurity conferences, women are the minority. And it's the same in big tech, and it's the same, you know, in the specific jobs that require people to make technology. And I've witnessed that with my own eyes, where even the mention of sex, of sensuality, of bodies makes these types of workers really uncomfortable. So it's not unlikely that this is how they would create detection algorithms. Plus, you've probably seen in the news this week that the Oversight Board, which is Meta's independent oversight body and tribunal, has overturned a decision to basically ban some posts by transgender and non-binary people that showed nipples. And I was actually part of advising the board for those decisions. And on top of that, apart from overturning this ban, what they did was recommend some policies, including removing the female nipple as something so taboo on platforms. And if you think about it, it's really silly, because a lot of the reasons that platforms give for making nipples not, you know, okay on the apps is that it's a way to detect whether those posts are consensual or not. But it's a bit weird, because, you know, you could post a breast consensually or non-consensually whether there's a nipple or not; it doesn't really make sense. So it's a very arbitrary and very visual way to detect non-consensual posts that is completely devoid of context. So I think we need to continue campaigning. I feel like decisions like the ones the Oversight Board has made could not have happened without all the activism work involved in this. And, you know, I personally really admire and respect sex workers for their fight on this, because this really, really affects them.
It prevents them from working in safe spaces; like, online spaces are a lot safer for sex workers than offline spaces, because they can vet clients. And similarly, if you're a brand, if you're a creator, if you're an artist, this censorship has a huge chilling effect on you, and it really affects your earnings. So I created a campaign on the back of the apology for shadowbanning, together with a variety of users across all sorts of walks of life, which was called Everybody Visible. This was in 2019, and it protested against Instagram's very sex-negative community guidelines and against their lack of transparency. And that really made noise; even Dita Von Teese, the burlesque performer, supported it. So, you know, all of these things, year on year, they matter. And I think, you know, Meta are increasingly realising that this censorship is not helping them image-wise. So in December, I was one of the academics invited by them to consult on their nudity, sexual activity and sexual solicitation policies. And even though it doesn't mean that they will do anything about them, I think the fact that we were invited to talk about these is a positive step forward, because if they didn't think it was a problem, they wouldn't have invited us.
Sure, that's really positive.
I have a couple more questions about the tech industry. So I have done research on big data and tried, you know, to code things and to use automated coding systems, and I've been working on sexism on Twitter around the overturn of Roe v. Wade. But one thing we found is that, while language models learn how to code for things like toxicity or profanity or individual words that are easy to find, it's quite difficult to pick up on some of the subtle sexism or hate speech online. And I'm sure that's probably true in the visual sphere as well. So what kind of solutions do you think platforms can use to moderate content, beyond just these blanket bans around, you know, certain types of content? And what advice did you give to Meta about better ways to moderate content?
So I signed an NDA, so I can't say everything. But what I am going to say, which is something that I say all the time and nobody likes, is that more investment is needed in human moderation. Because all of these algorithmic, you know, automated tools are great. Or rather, they're not great, they're important, and they pick up content at a scale that no human moderator could. But at the same time, the fact that there is so little investment in human moderation, so little support for human moderators who have to watch harrowing content and make split-second decisions over something on which their pay depends, doesn't really encourage attention to detail or nuance. Without investing in more human moderation, and in more human appeals, for instance for false positives, to reverse these automated decisions, which are based on, you know, mistakes and stuff like that, it's very difficult to go forward, because so many things related to nudity and sexuality are nuanced. And I personally find it appalling that, again, you know, women's bodies and LGBTQIA+ bodies are so strongly detected, while I get so much harassment, so many unsolicited dick pics, and those are never a problem. So sexism and homophobia and whorephobia and racism and other discrimination are coded within these algorithms. And I feel like only well-paid, not exploited humans with the relevant education about these topics can actually make relevant decisions about them, because a lot of the time these human moderators will find themselves making decisions about something that they're not educated about. So I was recently at Stanford at the Trust and Safety Research Conference, and I met some of the very first Facebook workers, who essentially ended up making community guidelines from scratch when the platform started growing, and this person told me, look, we didn't know what we were doing.
We just found ourselves in this ever-growing thing that we needed to regulate. And he gave me the example of the napalm girl picture, which was wrongly flagged as child nudity, violence, all sorts of stuff. And obviously, if a moderator that doesn't know the historical and journalistic importance of that picture sees it, they're like, oh my god, bing, bing, bing, you know, all the things that Meta wants me to regulate and to get rid of. But because you don't know, you're not going to be able to make the right decision. So on top of better working conditions for human moderators, and more human moderators, I would also say a form of community-sourced moderation would be important. So if you think of Reddit and subreddits, you get, you know, moderators within channels that can help moderate specific issues that they're volunteering for, or maybe even with some sort of payment. I think you've got to trust communities with their own content. Of course it has risks; it can make communities increasingly insular, like incels and all of that, but at the same time it would really foster a freer internet.
Yeah, I think that's a really good point, too, because I remember stories coming out about moderation not taking place on racist posts, but they were in other countries, and so moderators didn't understand that there were these racial divides. You really need local knowledge about the topic, about the culture that things are being posted in, to perform the correct moderation. So yeah, I think that's a great point. I'm going to use one of those naughty buzzwords that's floating around. I find it really interesting that there's a lot of discussion about cancel culture right now, and I find it fascinating that what you're describing is a clear example of cancel culture, but that's never how it's discussed in the media. Do you want to talk a bit about how cancel culture and deplatforming kind of go hand in hand?
Sure. Basically, you know, it's such an interesting question, because I'm very nerdy about cancel culture. I studied Latin and ancient Greek in high school in Italy, and the Romans were the ultimate bosses at cancel culture, right? There's this thing called damnatio memoriae, which is the damnation of memory. Basically, you'll go to some historical landmark and the guide will be like, oh yes, so this temple was dedicated to someone, but we don't know who it is, because apparently they did something naughty, and whoever was the emperor at the time made sure that they went there and just removed the names from everywhere, with, like, whatever hammer or tool they used at the time. So that is real cancel culture. And also, yeah, deplatforming is real cancel culture, because you are preventing someone from having a voice. Now, I'm not completely against the deplatforming of harmful individuals. If it were up to me, I would like everyone to have a voice, even really harmful individuals, provided that their views are contextualised and explained by platforms. The issue is that, more often than not, algorithms push controversial content, like Andrew Tate or like Donald Trump or whatever, and they don't tell you, actually, this isn't true, or this is harmful; like, there is no fact-checking. What they do go against, however, is bodies, nudity and sexual activity and stuff like that. And it is not a coincidence that FOSTA-SESTA focused on nudity and sexual activity, because it was pushed by a coalition of anti-sex, far-right evangelical Christians in the US, who are the same people that are campaigning against credit card companies supporting porn platforms and stuff like that. So their crusade is very much anti-sex, and that's what it's targeting.
So yeah, I think it's very interesting that, you know, people say that they're being silenced from the height of their millions of followers or their newspaper columns, while the real people that are being cancelled and wiped off the face of the internet are sex workers and marginalised people who really need the internet to survive. So it's really bittersweet to see that, because I'm, you know, I'm seeing that even in the gender-critical wars, some very high-profile individuals will, you know, set their followers on to trans people with, like, 100 followers, and the notion of power is completely off. And the platforms are not doing much about that, because those people are popular anyway, and, you know, they bring viewers to that platform. So really interesting parallel there. Thanks for raising it.
Yeah, no, I think that point you made about power is really key there, because oftentimes we hear the stories about, oh, I got cancelled, from very powerful people, because they still have a platform even after they got so-called cancelled. So yeah, I think that's really interesting. So, one last question. This has been a fabulous discussion; as a tech researcher, I've loved hearing about all this. So where do you hope to go from here? What's your next kind of research direction? Where are things going to take you?
Thank you so much. Yeah, that's a nice positive question to end on, because my research is generally usually depressing, because of deplatforming and stuff like that. It's not like I don't enjoy doing it; I love what I'm doing. But at the moment, Professor Pam Briggs, who is my line manager and my boss in the Centre for Digital Citizens, and I are working on a paper on LGBTQ+ identity and how social media platforms can help LGBTQ+ individuals harness and discover and create their LGBTQ+ identity. So that's quite nice, because, you know, it's not all doom and gloom. So I'm going in that direction as well. But I think I will still continue researching content moderation, because it is a really hot topic, and I think it's important that we hold these big tech monopolies to account. My role ends in April 2024; there's the option to extend it, and I'm really loving my time at Northumbria and the Centre for Digital Citizens, so I would love to stay. We will see where we go. But yeah, I really like the institution, because it allows me quite a lot of freedom in terms of how I present my profile. So I'm still able to teach pole when I want to, I'm still able to continue my content creation and to present my research, as a social media creator myself, essentially. So I'm hoping to continue doing that as part of, you know, impact and all those buzzwords.
Well, I can't wait to see that paper and see what you do next. Thank you very much for joining us today. Where can people follow you online and stay up to date with your research?
Thank you so much for having me and for the amazing chat. So, I'm all over the place online at Blogger On A Pole, and that comes with a trigger warning: it is a lot of pole dancing content, so there will be a lot of nudity. I am apparently one of the most difficult people to look at in the workplace; apparently I'm, like, very not safe for work for people's Instagram when they open it at work. To me it feels pretty PG, but I appreciate that a lot of people are not used to that. So if you're following me, particularly on Instagram, maybe don't do it in the office.
Good to know. Thank you for that content warning. All right, so listeners, if you'd like to learn more about Northumbria psychology, you can check out our psychology department blog at northumbriapsy.com. You can also follow us on Twitter at Northumbria PSY. If you want, you can follow me on Twitter at Brown Genavee to stay updated on episodes. If you'd like to be interviewed on the podcast, or know someone who would, please email me at firstname.lastname@example.org. Finally, if you liked the podcast, make sure to subscribe on your listening app and give us a review and rating. I hope you've learned something on this voyage into the mind. Take care till next time.