How Instagram's Algorithm Connects Vast Pedophile Networks | WSJ Tech News Briefing

Captions
Zoe Thomas: Welcome to Tech News Briefing. It's Thursday, June 8th. I'm Zoe Thomas for The Wall Street Journal.

Investigations by The Wall Street Journal and researchers at Stanford University and the University of Massachusetts Amherst have found that Instagram helps connect and promote a vast network of accounts openly devoted to the commission and purchase of underage-sex content. These pedophilic accounts are brazen about their interests, using hashtags like #pedowhore and #preteensex to search for sellers of child-sex material, and seller accounts often claim to be run by children themselves and use overtly sexual handles. Both of these things violate the policies of Instagram's parent, Meta. The company has acknowledged the problems with its enforcement operations and vowed to improve them.

With me to discuss this is our social-media reporter Jeff Horwitz, who covered these investigations with reporter Katherine Blunt. Jeff, when someone on Instagram searches for one of these pedophilic hashtags, what kind of material are they finding?

Jeff Horwitz: A network of accounts that is very openly advertising child sexual abuse material for sale. A lot of it is self-generated, meaning kids and teenagers are essentially offering really vile stuff, up to and including self-harm and bestiality content. These hashtags are kind of watering holes where a community that is interested in this stuff can form.

Zoe Thomas: The investigations found that Instagram helps connect and promote these kinds of accounts. How did researchers find that the platform is doing that?

Jeff Horwitz: Recommendation systems are very good at connecting people to the communities they want to be part of and the content they're interested in, and it's no different for pedophiles. Obviously there are a lot of pedophiles on Instagram, because Instagram has 1.3 billion users, or a little more than that. The question is how the platform treats them, and the answer is that it caters to their interests by suggesting, at every turn, content that might be interesting to them. That could be videos of little girls dancing. If they enter a hashtag that is a little creepy, it might suggest another version of it with, say, an indication that it is going to be specifically devoted to child pornography. Or it might suggest accounts similar to one that is purporting to sell underage-sex content.
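To make the mechanism concrete, here is a minimal, purely illustrative sketch of one common way "similar account" suggestions can work: ranking accounts by how much their follower sets overlap. This is not Instagram's actual system; the function names and toy data are hypothetical, and the flagged-account filter mirrors the shape of the mitigation Meta describes later in the interview.

# Hypothetical sketch of follower-overlap ("people who follow X also
# follow Y") recommendation, with a safety filter that keeps flagged
# accounts out of the results. Toy data only -- not Instagram's system.

def jaccard(a: set, b: set) -> float:
    """Overlap between two follower sets: |A intersect B| / |A union B|."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

def similar_accounts(target: str,
                     followers: dict[str, set],
                     flagged: set[str],
                     k: int = 3) -> list[str]:
    """Rank other accounts by follower overlap with `target`,
    excluding the target itself and any account flagged as suspicious."""
    scores = {
        acct: jaccard(followers[target], fans)
        for acct, fans in followers.items()
        if acct != target and acct not in flagged
    }
    return sorted(scores, key=scores.get, reverse=True)[:k]

# Hypothetical toy data: users are numbers, accounts are letters.
followers = {
    "account_a": {1, 2, 3, 4},
    "account_b": {2, 3, 4, 5},   # heavy audience overlap with account_a
    "account_c": {3, 4, 5, 6},
    "account_d": {7, 8, 9},      # disjoint audience
}

# With no safety filter, the most-overlapping account ranks first:
print(similar_accounts("account_a", followers, flagged=set()))
# -> ['account_b', 'account_c', 'account_d']

# Flagging an account removes it from everyone's suggestions, which is
# roughly what "suspicious accounts don't get recommended" amounts to:
print(similar_accounts("account_a", followers, flagged={"account_b"}))
# -> ['account_c', 'account_d']

The point of the sketch is that nothing in the ranking step knows what the content is; accounts whose audiences overlap heavily get recommended to each other by construction, which is why a niche community surfaces its own pages to its own members, and why the fix has to be an explicit exclusion list layered on top.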
Zoe Thomas: Can you explain a bit more about how that works? If you're not specifically searching for this material, how might you end up coming across it?

Jeff Horwitz: I spoke with a woman named Sarah Adams. She goes by the handle mom.uncharted on most social media. She is a Canadian parenting activist whose main focus is the idea that devoting social-media content to your children is a form of exploitation in and of itself. In the course of doing that online activism, she gets sent terrible things all the time by people asking, "Can you believe this?" One of those accounts was, and I'm sorry to have to use these words, named Incest Toddlers, and it was an account of pro-pedophilia memes. She reported it to Instagram and didn't think anything more about it; it took a few seconds to click a few buttons and say, this is terrible, take it down. Over the next few days, her followers started bombarding her with questions, saying, "What is going on? When I look at your profile, I'm being told to check out Incest Toddlers." And this is actually the algorithm working exactly as it is intended to: the people who are interested in Incest Toddlers, the pro-pedophilia meme page, are very strongly drawn to it.

Zoe Thomas: Right, that's a niche community, obviously serving a particular audience.

Jeff Horwitz: Yes, and I would say that if you are really uninterested in this stuff, it's not as if Instagram is going to shove it in your face. The question isn't whether Instagram is trying to push everyone toward pedophilia so much as whether the platform is facilitating the formation of a community that normalizes and facilitates the grooming and abuse of children.

Zoe Thomas: The promotion of underage-sex content violates Instagram's rules as well as federal law. So what has Meta, the parent company of Instagram, said about the findings in these investigations?

Jeff Horwitz: In response to what we and Stanford and UMass have all been talking to the company about, they have removed thousands of hashtags, they've removed large numbers of posts, and they're planning future actions. They have set up a child-safety task force internally, and they have discovered some operational failures. One thing that was happening is that a software glitch was causing them to discard many user reports of child sexual abuse without reviewing them; they have found that and fixed it, the company says. They have also given their moderators new training, the takeaway of which, I think, is that maybe they should be a little more proactive in taking this stuff down. And they've said they are working on changing their recommendation systems so that suspicious, potentially pedophilic accounts don't get recommended to each other and aren't allowed to follow each other.

Zoe Thomas: Jeff, the laws around child-sex content are extremely broad, and investigating even the open promotion of it on a public platform is legally sensitive. What did the university investigators say about the findings?

Jeff Horwitz: The university researchers, I think, were all very surprised that this was this bad. Everyone generally understands that these are chronic problems, but the thing that came as a shock was that this had been allowed to fester and grow to the scale it has. There was an expectation that a company such as Meta, with the resources it has and the need to make sure its platform is safe for advertisers and kids, would have done a better job without prompting.

Zoe Thomas: So what did the researchers say about fixing a problem of this scale?

Jeff Horwitz: The thing that Stanford and UMass have been very focused on is that Instagram needs to, one, block the discovery mechanisms that are leading people to these communities, and two, invest heavily in human research to track what's happening.

Zoe Thomas: How does the problem on Instagram compare with other social-media platforms?

Jeff Horwitz: That is difficult to say. Stanford focused on this network of open child-sex-content sellers, and they operate across platforms; they have links to Snapchat, to Twitter, and so forth. But really the center of it was Instagram. I think a lot of that comes down to platform features to some degree: Instagram makes it so easy to see who is following whom, and it recommends so many different things, and recommends connections, not just content, so it's a higher-risk place.

Zoe Thomas: That was our reporter Jeff Horwitz. Thanks for joining us, Jeff.

Jeff Horwitz: Happy to do it.
Zoe Thomas: And that's it for today's Tech News Briefing. For more tech stories, head over to our website, WSJ.com. I'm Zoe Thomas for The Wall Street Journal. Thanks for listening.

[Music]
Info
Channel: WSJ News
Views: 39,099
Keywords: instagram, algorithm, meta, safety issues, content moderation, instagram algorithm, pedophile networks, wsj, tech news briefing, child-sex content issues, safety concerns, inappropriate hashtags, stanford university, meta search concerns, algorithm issues, network of accounts, sensitive content, nsfw, teenager social media, child social media concerns, social media content moderation, social media inappropriate posts, social media safety, snapchat, twitter, content sellers, techy
Id: AJO2w5LFZIk
Length: 7min 41sec (461 seconds)
Published: Thu Jun 08 2023