There is No Algorithm for Truth - with Tom Scott

Reddit Comments

That can't be Tom Scott. He's not wearing a red shirt.

👍︎ 334 👤︎ u/Astronox 📅︎ Oct 26 2019 🗫︎ replies

I think his opening is all you need to understand the core problem here: Option A: Get heard less, or Option B: Disagree and shout louder. Where is Option C: Change your mind?

The real problem is that people tie their identity to something they don't even know is true, which means you can't even talk about it with them, because if you disagree then you're threatening their identity.

👍︎ 43 👤︎ u/Healovafang 📅︎ Oct 26 2019 🗫︎ replies

It's a bummer that only a handful of commenters actually watched the video. I found it really insightful and thought-provoking.

👍︎ 47 👤︎ u/incidesi 📅︎ Oct 26 2019 🗫︎ replies

Here use this instead:

return !user.isLying();

👍︎ 54 👤︎ u/Noch_ein_Kamel 📅︎ Oct 26 2019 🗫︎ replies

Having a hard time getting through this. Not much focus and a lot of opinion.

👍︎ 74 👤︎ u/Deracination 📅︎ Oct 26 2019 🗫︎ replies

Tom Scott at the Royal Institution is a good time.

👍︎ 3 👤︎ u/maccas_run 📅︎ Oct 27 2019 🗫︎ replies

I like Tom's videos; they tend to be educational and interesting. However, like a lot of YouTubers, he places far too much weight on the importance of the social media platforms. Most YouTubers seem to operate purely within the bubble of YouTube, Facebook, and Google, but that is just one aspect of communication, and honestly this is nothing new. Look back in history at the old snake-oil salesman peddling his goods. It was miscommunication, lack of education, but also a good sales pitch, all operating without the social media of today. History is full of examples of the same issues we see today, existing in Rome and Greece. If you want a good example of what he is talking about that has nothing to do with social media, look at the Church.

The Church, through the Bible, has had a massive effect on the world. If you take the basics of the Ten Commandments, they have become the guiding steps to law as we know it today. I am pretty sure laws around killing and stealing predate the Bible, but it is largely via that document that this spread and was unified as the truth. This happened not through Reddit but was spread by word of mouth, by books, songs, poems, etc. You cannot look to blame social platforms for what would seem to be human nature.

It is not the lack of an algorithm for truth that is the problem; it is the nature of humanity, for which there is no algorithm.

👍︎ 5 👤︎ u/Redscoped 📅︎ Oct 27 2019 🗫︎ replies

You could omit fluff and decide whether or not dude even said anything. It's a great way to reduce false information.

👍︎ 9 👤︎ u/ElSeaLC 📅︎ Oct 26 2019 🗫︎ replies

I'm always at the edge of my seat with Scott's videos because with every click I know we are inching closer to a bald Scott

👍︎ 3 👤︎ u/digitalkiks 📅︎ Oct 26 2019 🗫︎ replies
Captions
Imagine that tomorrow, Google announced that they have invented a machine learning system that can tell fact from fiction, that can determine truth from lie. And this is magical thought-experiment land, so in this thought experiment the machine learning system is right, all the time. And then they say that they've hooked it up to the Google search results and YouTube recommendations. So now, when you search for vaccinations, only the scientific consensus comes up. All the anti-vaxxer stuff is down on page six and page seven, where no one's gonna look. YouTube videos on conspiracy theories start to be recommended less and less, and the ones debunking them keep coming up at the top. It still matters whether a search result is engaging and relevant and well-referenced, but now it also matters whether it's objectively true. And then you realize that this new system, this algorithm, disagrees with you on something really important, and it produces a result that you find abhorrent. Now, I struggled to find an emotive example for this that wouldn't also be too sensitive for an audience like this. So, as this is the Royal Institution in London, and it is September 2019, imagine this machine learning system concludes that it would be a good idea to have a No Deal Brexit. (*audience laughing*) Okay, no. Don't applaud that, that was a cheap shot. That was a cheap shot, and there may be people in here who genuinely believe that, so for those people: please imagine the extreme opposite conclusion, that this system believes it would be a good idea to dissolve the United Kingdom into Europe entirely. Look, this is a system that, in this thought experiment, does determine objective truth, and it disagrees with one of your fundamental core values. What's more likely? A: that you are just gonna go,
"Oh, okay, I guess I'm gonna be heard less and less, and I guess we'll just have to deal with that," or B: that you decide that instead you need to shout louder, and that the algorithm is wrong? So, for the next hour, I want to talk about the state of play for science communication, and broadcast communication in general, for an audience that has wildly different ideas and areas of knowledge about how that works right now. I want to talk about why some science communication seems to go everywhere, and a lot of it doesn't, and to give advice for people who are trying to reach out and to broadcast their truth, whether that's small individual groups putting out content, or big corporations and charities who are trying to do outreach. That's also going to include a dive into the concept of parasocial relationships, those odd one-way relationships that appear in other media a lot. Because if you want to understand how to reach an audience, then you need to understand what that audience is looking for. And I want to talk about one particular set of things that is increasingly governing the media we consume and everything in our lives: algorithms. The ones that recommend the videos you watch on YouTube, the search results on Google, the order that your Twitter feed appears in, and basically everything that includes recommendations on social media these days. And, from the corporations' point of view, which advertising to show you. Now, there are a couple of caveats here. I have generalized this talk as much as I possibly can. I've run it past quite a few folks in my industry, but I am speaking from a position of success. I'm lucky enough to have 1.9 million subscribers on YouTube at the minute; we didn't quite get to two million in time. Now, subscribers is not a particularly useful metric. That's more a function of how long you've been on the platform and how many one-hit wonders you've had.
It's more honest to say that an average science communication video from me gets somewhere between a quarter of a million and a million views. Those might be about linguistics, which is what my degree is actually in. They might be about the basics of computer science, where I'm self-taught but checking my script with experts. Or they might be about infrastructure and science, the interesting things in the world, which is where I go out on location and hand over to people who know what they're talking about. Now, my degree is in linguistics, with a research master's in educational studies, but ultimately I am in this position because I spent 15 years throwing things at the internet before something worked. I was extremely lucky that the thing that turned out to work was science communication, was going out and telling the world about things I'm interested in. I am even luckier that it turned out to involve filming on location. In the past few years I've been lucky enough to experience zero gravity. I have gone to the Arctic, I have flown with the Red Arrows. And yes, that is mostly a brag, and mostly just an excuse to show the best photo of me that will ever be taken in my life. Like, have you ever looked at a photo of yourself and thought, "It's all downhill from here"? Because it is. One more caveat: for those of you who've been to a Royal Institution discourse before, you'll know there are generally two types. There is one where a researcher with a PhD and an associate professorship talks about their research, and there is one where someone in arts and culture shares their experience. This is more of the latter. Some of what I say is going to be opinion and not fact, and hopefully this audience will be able to tell the difference. There are points in here where I explicitly say I do not have all the answers. And I also want to add one conflict-of-interest note: my company gets a lot of its revenue from the adverts that go on and around YouTube videos.
Which means that my rent is indirectly paid by Google Ireland. I can't imagine why it's in Ireland; no reason at all why they'd set up there instead of the UK. Not a word of this discourse has been passed by anyone at Google. They don't even know I'm doing it. I'm not employed by them, but ultimately... while I'm willing to irritate that company and bite the hand that feeds me, they are indirectly paying my rent. I can try to represent the folks who have not been as lucky, who the algorithm has turned against, but I'm quite happy with them right now, and a lot of other people aren't. So anyway, that's the plan. This is the state of science communication in the English-speaking world at the end of the second decade of the 21st century. And to understand how it works, we need to start with the algorithm. "Algorithm" in this context means something quite different from what people with a lot of experience in mathematics and computer science might think it does. The algorithm is referred to in the singular; it's the almost anthropomorphised name given to this collection of machine learning systems. I went to a conference for science communicators last year, and after about three or four hours we realized we had to ban the word from conversation, because while a lot of folks from YouTube would just sort of endlessly froth about it, anyone not on the platform just found it confusing and messy, and we would not shut up about it. So when I talk about the algorithm, I'm talking about this sort of almost magical black box of code. The idea is that you set up this black box, then you provide it with a list of human-curated examples, and it works out their distinguishing features and provides some sort of categorization system. And then, as you throw novel examples at it, it categorizes them and learns from feedback. And those distinguishing features may be completely novel or completely unknown to humans.
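The train-then-classify loop just described can be sketched in miniature. This is a toy nearest-centroid classifier, nothing like the real systems in scale or sophistication; the two feature names and all the numbers are invented purely for illustration.

```python
# Toy sketch of the "black box" loop: train on human-curated examples,
# then categorize novel ones. All data here is invented.

def centroid(points):
    """Mean of a list of (x, y) feature vectors."""
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

def train(labelled_examples):
    """labelled_examples: dict mapping label -> list of feature vectors."""
    return {label: centroid(pts) for label, pts in labelled_examples.items()}

def classify(model, point):
    """Assign the label whose centroid is nearest to the novel example."""
    def dist2(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    return min(model, key=lambda label: dist2(model[label], point))

# Human-curated training data: hypothetical "whisker score" / "bark score".
model = train({
    "cat":     [(0.9, 0.1), (0.8, 0.2), (0.95, 0.05)],
    "not-cat": [(0.1, 0.9), (0.2, 0.7), (0.15, 0.8)],
})

print(classify(model, (0.85, 0.1)))  # a novel, cat-like example -> "cat"
```

The real systems learn their own distinguishing features from raw data; here the two features are chosen by hand, which is exactly the simplification the black box removes.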
So one of Google's recent AI projects was looking at retinal photography, and this recent paper claims that, with uncanny accuracy, it was able to look at retinal photos and work out sex with 97% accuracy, age to within four years, and, better than chance, smoking status, blood pressure, and major adverse cardiac events. Eye doctors cannot detect any of those things themselves, and they're not really sure how the machine did it. Now, it's right to be sceptical of some of those claims. Maybe there's a difference in the metadata; maybe the retinal photography machine, I guess that's technically a camera, is set up or focused differently depending on some of those attributes, but the paper does a pretty good job of covering their bases there. And maybe a human could also be trained to pick out those differences; it's just that nobody's bothered when you can just look at the chart next to the patient. But the simplest black-box machine learning system is essentially categorizing pictures. You give it a load of pictures of cats, and you give it a load of pictures of things that are not cats, and then you ask it, "Is this new picture a cat?" Which sounds like it's going to be useless, unless you're trying to design a filter for adult content, and you're sending it pornography and not-pornography, and the aim is to have a classifier that can look at a photo it's never seen before and work out whether you should show it to all ages or not. Of course, it's not that simple. There are stories after stories after stories of machine learning systems that have failed through bad training data or, more likely, biased training data. And this is slightly outside my ballpark; I know Hannah Fry covered this in her lecture here a while ago. But I have an example that's very close to my heart. YouTube uses a machine learning system to try and detect whether videos are suitable for advertisers to place their adverts next to.
It was rolled out a little bit too fast, before it was entirely ready, because YouTube had one of the many little scandals that they have, and they needed to do something to reassure their advertisers. So, this is a highly abridged summary based on unofficial conversations and innuendo and scuttlebutt; I'm breaking no NDAs here. But the story goes that they provided the machine learning system with a big block of videos that were definitely 100% safe for advertisers, and then they gave it a big block that were definitely not, and they told the system to be fairly conservative, because it was only a first line of defence: if it deemed your video unsuitable, you could send it off to humans for review. And this is, in my industry, a fairly controversial thing to say, but I don't think that's an unreasonable solution to a very difficult problem. YouTube has 500 hours of video, and I'm not saying "content", YouTube has 500 hours of video uploaded every minute. That's a human lifetime every single day. It is not unreasonable to have a machine learning system be the first line of defence, as long as there's human review behind it, and nowadays it's working fairly well, with some high-profile exceptions. But the problem, so I'm told, was that there was a bias in the training data. Videos, I'd say channels, people, talking about LGBT issues were more likely to talk explicitly about sex in some of their videos. Not all of them, not the majority of them, but enough that the machine learning system figured out there was a correlation between people talking about LGBT stuff and people talking about explicit sex. Again, only in a small number of videos, but enough that when the machine learning system found something to be about being gay, it viewed it as more likely to be unsafe.
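As a quick sanity check on the figure above: 500 hours of video uploaded per minute really does work out to roughly one human lifetime of video per day.

```python
# Check the claim: 500 hours of video per minute is about one
# human lifetime of video uploaded every single day.
hours_per_minute = 500
hours_per_day = hours_per_minute * 60 * 24   # 720,000 hours of video per day
years_per_day = hours_per_day / 24 / 365     # convert viewing hours to years
print(round(years_per_day, 1))               # ~82.2 years, about one lifetime
```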
Now, the YouTube CEO said in a recent interview: "We work incredibly hard to make sure that when our machines learn something, because a lot of our decisions are made algorithmically, that our machines are fair." I know some YouTube employees. I'm friends with some YouTube employees. I believe that they work incredibly hard to minimize that bias, but it's still there. Algorithmic bias is a major concern for every single machine learning system, and the systemic biases in the wider world have already found their way to social media without machine learning being involved. Of the top 10 earning creators on YouTube right now, well, as of 2018, as of last year, all of them are male. And I'm well aware that, standing here in the Royal Institution giving this talk, one of the reasons I got that audience in the first place, one of the reasons I ended up standing here, is because I'm a white guy with a British accent that sounds authoritative. Trying to make sure that artificial intelligence doesn't inherit these systemic biases is an incredibly difficult job, and it's one for Hannah Fry and her crew, and not for someone who got a linguistics degree. When YouTube handed over the recommendation engine to machine learning, they set it to increase watch time. This is what they told everyone: if people stuck around watching your video all the way to the end, and it was 20 minutes long, then it was viewed as good by the system. At which point they fell foul of Goodhart's Law: "When a measure becomes a target, it ceases to be a good measure." So people made longer videos, and they put all the important stuff at the very end, forcing people to watch all the way through. So worse videos were being recommended. So now YouTube's official line is that they reward high-quality videos that keep people on the platform, and that may not be videos on the same channel, by the same creator, or in the same genre. In 2017, Jim McFadden was the technical lead for YouTube recommendations.
He talked about the new engine they had, which came from a department wonderfully called Google Brain. "One of the key things it does is it's able to generalize," he says. "Whereas before, if I watch this video from a comedian, our recommendations were pretty good at saying, here's another one just like it. But the Google Brain model figures out other comedians who are similar but not exactly the same, even more adjacent relationships. It's able to see patterns that are less obvious." And as for which videos to recommend, one of the new model's basic ideas is that if someone comes to YouTube because of your video, tick, that's a good thing; and if someone does not leave YouTube because of your video, does not stop watching, that's a good thing. So the black box takes in those signals, and it works out what's going to keep people on the platform, what's going to keep them watching the videos and, more importantly for Google, watching the adverts in between. Incidentally, apparently Google also serves more adverts to people who are more tolerant of adverts. If you'd like adverts to appear less often before your videos, skip them more! That was bad advice to give for someone who makes his money from that. Because you've got to remember, all these big companies, Google, Facebook, Twitter, are essentially advertising companies. Almost all their revenue comes from being the greatest marketing, advertising, targeting company that the world has ever seen. Their ideal is that every advert is perfectly targeted to you. As all of you will be aware, they haven't got there yet. But it turns out that if you reward videos that keep people on the platform, then what you end up with is conspiracy theories and clickbait. Yes, all of the companies that have algorithms are working on fighting disinformation, because they are aware that it's a public relations disaster for them. But they are doing it in English first.
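The Goodhart's Law failure described a moment ago can be sketched with invented numbers: rank videos by raw minutes watched and a padded long video wins; rank by completion rate and the tight short video wins.

```python
# Sketch of the Goodhart's Law failure described above (all numbers
# invented). If the ranking metric is raw minutes watched, a padded
# 20-minute video with the payoff at the end beats a tight 5-minute
# one, even though viewers finish far more of the shorter video.
videos = [
    {"title": "tight 5-minute video",   "length": 5,  "avg_minutes_watched": 4.5},
    {"title": "padded 20-minute video", "length": 20, "avg_minutes_watched": 12.0},
]

# Metric 1: raw watch time (the old target).
by_watch_time = max(videos, key=lambda v: v["avg_minutes_watched"])

# Metric 2: completion rate (fraction of the video actually watched).
by_completion = max(videos, key=lambda v: v["avg_minutes_watched"] / v["length"])

print(by_watch_time["title"])   # padded 20-minute video
print(by_completion["title"])   # tight 5-minute video
```

Any single metric invites its own style of gaming; switching the target just changes what creators optimize for, which is the point of the law.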
I'll get back to that later. From a creative perspective, "the algorithm" is often seen as being a bit like a Skinner box, an operant conditioning chamber. It is a food dispenser that might give you some money if you tap the lever enough. Google and YouTube and Twitter will never tell you exactly what that algorithm is looking for, because every bit of information they give away means there's more opportunity for people to abuse it and send spam. But from my perspective, the pellets come out at random, and we all develop superstitions, and we all keep pushing the lever. I was lucky enough this year to have a conversation with the Head of Product for Recommendations at Google, the person in charge of the algorithm. It seems like the idealized version of what they want for those recommendations that appear next to your video is that it's a bit like, and this is something that shows my age, it's a bit like a TV guide. It's a bit like the Radio Times. Every single channel should always have something on it that is interesting to you, all the time. It should be a transparent glass layer between the audience and what they want to see. And the question is, as they're weighing up all those videos, how much do they put their finger on the scale? How much do they make that TV guide be for the worthy, honest, truth-seeking version of you, and not the clickbait conspiracy version of you? Because ultimately, yes, sometimes you want to watch a documentary, but sometimes you want to watch someone trip over and hurt themselves; "You've Been Framed!" existed for a reason. And I'm not just talking about YouTube. I'm talking about Twitter, I'm talking about every single algorithmic system out there. If all they're recommending is quick, short dopamine hits that just get you in and get you out, then that's not a long-term survival strategy.
That spirals down into the lowest common denominator, which ultimately hurts the world. But if they don't have some of that in there, then people are going to go elsewhere; they're going to go to the company that does have it. If everything is painstakingly verified and educational, then only a minority of folks are going to watch it. Finding that balance, finding that solution... I said there were gonna be analogies to older media: that's what TV commissioners still do. It's what the people programming the YouTube algorithm are trying to do, and let's be clear, it's an unsolvable problem. There is not some magical equilibrium in the middle that will make this work. It's about finding a balance. It is about finding the least worst option. You cannot have a successful platform that is all clickbait, but you also can't have a successful platform with no clickbait, because either way advertisers are gonna leave, viewers are gonna tire of it, and it's not sustainable long term. There have been plenty of investigations into the effects of the algorithm, plenty of research, formal and informal, that showed how you could very, very easily go from something apolitical, and then click again and find something just slightly political.
And then click again, and find something that's a little bit clickbait but still honest; and then maybe something that's about moderately conservative politics, that's a little bit untrue; and then, on the next click, find something about why Hillary Clinton is evil and Donald Trump is the greatest thing to ever happen to the universe, or vice versa. The most notable recent investigation into online radicalization is a deep dive by The New York Times, and, like I say, the companies are working in English first, because The New York Times looked into radicalization in Brazil, and it includes one of the most sobering paragraphs I've seen in a while: "Right-wing YouTubers had hijacked already viral Zika conspiracies, and added a twist: women's rights groups, they claimed, had helped engineer the virus as an excuse to impose mandatory abortions." Quoting Zeynep Tufekci, who was referring to research in The Wall Street Journal: "Videos about vegetarianism led to videos about veganism. Videos about jogging led to videos about running ultramarathons. It seems as if you are never 'hard core' enough for YouTube's algorithm. It promotes, recommends, and disseminates videos in a manner that appears to constantly up the stakes. Given its billion or so users, YouTube may be one of the most powerful radicalizing instruments of the 21st century." I worry quite a lot about how complicit I am in that. Ben McOwen Wilson, YouTube's UK managing director, gave an interview, and obviously he disagrees with that. He says that the platform "reduces the spread of content designed to mislead people, and raises up authoritative voices". Note that he didn't say those voices were true, or that those voices were correct. He said they were authoritative. There's an old Jonathan Swift quote, and it took quite a lot of research to prove that this actually was a Jonathan Swift quote: "Falsehood flies, and the truth comes limping after it." Quick show of hands: who saw this tweet?
A few people, yeah, that's about ten or twelve in the audience. I mean, look, it did great numbers. The idea that with Marmite, you lay the bottle on its side to get it out, that did great numbers. Who here saw the confession and the retraction? Alright, about half as many people. Good. Sure, that's pretty much harmless. By the way, that is now the Marmite Guy; that's what he's known as. Everyone assumed it had been fact-checked by someone. It's not that important, but everyone always assumes that, particularly if it agrees with our preconceptions. I mean, the long-term solution is to teach information literacy in schools, but as a real-world education policy, that's about as useful as saying we should reduce our carbon emissions. It's true, but how do we get there? I don't have the answer to that. Douglas Adams, in his novel "Dirk Gently's Holistic Detective Agency", came up with the idea of a fictional bit of software called Reason, a program which allows you to specify in advance which decision you want to reach, and only then give it all the facts. The program's job, which it was able to accomplish with consummate ease, was simply to construct a plausible series of logical-sounding steps to connect the premises with the conclusion. In the novel, it was sold to the Pentagon, highly classified, and it explained why they spent so much on Star Wars missile defence. If you would like to be convinced of a thing, YouTube and Twitter and all the other networks will happily find you people to convince you. If you've lost your faith in God, then there will happily be dozens of evangelical preachers from all sorts of denominations who will bring you back into the fold, and dozens of angry atheists who will make sure you stay out of it. Take your pick which way you want to fall. And if you want to know who's to blame, or what's wrong with the world right now?
Then I can find you hundreds of people who will get you apoplectically angry at billionaires, or immigrants, or both. Whichever way you want to go, there will be someone authoritative to tell you about it. So how do we define that authority? There used to be gatekeepers; or at least it seemed like there used to be gatekeepers. Science communication was in magazines, and television, and radio, and while there wasn't any significant peer-review system out there, you knew there were researchers at the back end, and standards and quality control to keep these things fairly accurate. There were professionals. There's David Attenborough, there's James Burke, and a lot of other men, and they were all men, who were all given authority, with the BBC to back them up. Which is certainly true, but for every Planet Earth or Connections, somewhere on a channel there would be an Ancient Mysteries, or an In Search Of..., or Ancient Aliens. The Bible Code became a phenomenon in the 1990s. This was the idea that there were hidden messages in the Scriptures that could be found if you just went through the text counting exactly the right number of letters in a row. The Daily Mail loved that. They put it right on the front page, a splash above the headline. Not because it was true, but because they knew it would sell newspapers. We had gatekeepers, we definitely did have gatekeepers, but I'm not convinced that, from the public's perspective, it was all that different to what we have today. Authority online often comes from having an audience. You know, why am I standing here giving this talk today? It's not because I've done 15 years of painstaking research into this stuff and I'm now presenting my doctoral thesis to you. It's not because I have a particularly broad range of knowledge, or depth of knowledge. It's because I've worked with the Royal Institution before, and you know I can present.
I wouldn't pretend to know your motivations, Sean, but I think it's safe to say that I'm partly here because you knew you could sell tickets, and you knew you'd get some clicks online from having me associated with this. I'm not saying that's all of it; I'm saying that was probably a consideration. There is a reason that every major British documentary about space for the last 15 years has been presented by Professor Brian Cox. Except it's not just documentaries about space. That's Wonders of Life! That was about natural history. Sorry, I'm just going to take a moment. Oh good, there he is. That joke would've landed a lot better if that had come up at the right time. Can we just take a minute to appreciate how bad the Photoshop job is, by the way? The lighting doesn't match, he's got a halo around his head, and that right arm isn't even cut out properly. It's the official BBC press photo! Anyway, this was billed as a physics-based approach to natural history, and yes, it did cover some pretty advanced physics concepts, but let's be honest, from a television commissioner's perspective, it's a bit of a reach. It was almost certainly an excuse to provide Brian Cox on television to the audience who want him, to get those ratings in. I'm sure it was well researched, I'm sure they did a really good job of it, but Wonders of Life with an extremely qualified dual biologist and astrophysicist no one's heard of would not have had as many people watching it, and it would have been much harder to commission. How much is it worth to get someone less qualified but known to the audience, rather than the best person for the job? Sister Wendy was not some great art historian; her degree was in English. But in the 1990s she presented five BBC series on art history, because audiences liked her and would listen to her.
In her obituary, The New York Times said, "Her insightful, unscripted commentaries connected emotionally with millions." Dani Beck, a Norwegian neuroscientist, recently said this: "Your desire for a platform or interest in several sciences should not supersede your responsibility and ethical duty to not speak on topics you are not qualified to. There exist others who know better and can do the job. Consider amplifying their voice instead." And I agree with that, which is why the rest of this lecture will be present... No, it won't, it won't. But just for a moment you believed me, and how did you feel about paying to come here and see that? Like, the early videos in my educational series were the stereotypical "guy who hasn't done his research and thinks he knows everything, spouting unsourced facts". There are still a heck of a lot of people who stand in front of a green screen, or use voiceover and stock footage, and provide no citations and no references, and just ask people to trust them. I learned quickly not to do that, but the trouble is, that's sort of what the audience want. A loyal audience, whether it's on YouTube, or Facebook, or Twitter, does not want to see someone else's voice amplified. If I hit the retweet button on Twitter, if I just take someone else's message wholesale, with their face, and send it out to my audience, few people will pass it onwards. If I quote-tweet it, if I take their message and then add a bit of useless commentary around it with my face, that goes much further: way more people interact, way more people pass it on. YouTube gives creators retention statistics; we get to see, on average, where people stay and where they leave a video. I will note that that chart does go up to 120 percent. There are a couple of reasons for that. One is that sometimes people go back and watch a bit again; that counts twice, so it can go above 100. But it's also because YouTube's graphing API just isn't good at percentages. There's a big drop-off in the...
first few seconds up there, as people go, "Oh, I don't like that," or "I don't like him," or "this isn't what I thought". There's a big drop-off at the end, at what's now called the end card and was once called the credits. But apart from that, it's just a steady slope downwards as people get bored. And you know the algorithm will look kindly on you if that slope is less steep. But if I go somewhere on location, and I hand over to experts, and the experts carry most of the video, that slope will be steeper, and retention will be lower. People will get bored more quickly, and they'll share the video less. So, how much do you pander to your audience? How much do you tell them what they want to hear, particularly when, on YouTube, there is literally a dollar amount attached to each video and attached to each view? How much do you make what your audience wants, at a social cost? You know, it's not just the viewers being sent down a rabbit hole of radicalization; it's also the creators. When you look at clickbait that brings in numbers and money, it can be very, very tempting to just cut your losses, double down on the clickbait, and accept that, well, you make money while it's coming in, "make hay while the sun shines", because in a couple of weeks it could all go away. But the light that burns twice as bright burns half as long. In 1988, Jimmy Cauty and Bill Drummond, the KLF, wrote "The Manual (How to Have a Number One the Easy Way)". They'd had a novelty pop song hit the top of the charts a few months earlier, and The Manual was a tongue-in-cheek guide to success in the music industry: how to achieve a number one with no money and no musical talent, which, as they said themselves, they'd managed. Almost all of it is out of date now. There's a wonderful section a couple of chapters in which says that by the mid-90s, someone in Japan will have invented a machine that lets you do this all from home, without a recording studio.
That was right. But at the very start is a little bit that is as true now as it was when it was written: "The majority of number ones are achieved early in the artist's public career, before they've been able to establish reputations and build a solid fan base. Most artists are never able to recover from having one, and it becomes the millstone around their necks to which all subsequent releases are compared. Either the artists will be destroyed in their attempt to prove to the world that there are other facets to their creativity, or they succumb willingly and spend the rest of their lives as a travelling freak show, peddling a nostalgia for those now far-off, carefree days." The Cheeky Girls. They wrote that ten years before the Cheeky Girls. They wrote that ten, twenty years before YouTube came along, and it's exactly true for people online: a number one does not create a career; it can kill a career. If you've just got one hit, all the world will want to see is that one thing. A single video does not make you a YouTube star; a single tweet does not get you a book deal. Well, to be honest, that hasn't happened in five years anyway. A number one just makes you the person who repeats that catchphrase until everyone is tired of it. What you want to do is build up a catalogue of minor hits over time, get a little bit of respect, hone your trade, learn your craft, and then, once you've got a bit of an audience, start aiming upwards. Bands and singers and the music industry are actually a really good analogy for how the Internet works right now; after all, Brian Cox does arena shows and tours around the world. For the folks out there who are hoping to communicate science to the world, it is like starting a band, or launching a music career, or going out solo and singing. It will not pay the rent for the first few years, or decades, or maybe ever. And if you are one of the lucky ones, if you make it, it will not set you up for life.
And if you're an organization trying to get your message out: well, I used to think that people had a built-in corporate baloney detector. I used to think that anything put out by an institution or by an advertising agency was doomed to fail just because it wasn't authentic. That's not true. That's not true at all. You remember what I said earlier: 500 hours of video uploaded per minute, most of it never watched. That's 82 years of video content a day, and most of it took almost no time and effort to produce, but every one of those videos has more or less the same chance as that project your organization spent six months and a million dollars on. The most popular recurring series that I do right now is very simple (that's sped up, obviously). It is a ten-minute, one-take monologue to camera about computer science. The camera does not move. The camera does not cut away. There are none of those jump cuts that a lot of people use; I can now completely understand why they do them, it's a lot easier. There are occasional graphics sometimes, but mostly it's just me and the camera. And that breaks all the rules established by television when I grew up. It breaks all the rules that corporate media types think they need to make this work. It's not about "spectacle"; it's about people. To explain that, we need to talk about parasocial relationships. The term "parasocial relationship" was invented by these two, Donald Horton and Richard Wohl...
I should really have fact-checked that pronunciation. The term means that there is a difference between the spectator, the viewer as we now call them, and the performer, what we now call the creator: the spectator is emotionally invested in the person on screen, but the person on screen has no idea the spectator exists, and even if they do, there's such a power imbalance that there couldn't possibly be a friendship. Parasocial relationships are not a new thing, and neither is turning them into money: any celebrity that ever had an official fan club with a membership fee was doing exactly that. Actually, in the late eighties and early nineties there was a fad for celebrities to set up phone numbers that were supposedly their personal number or their personal voicemail, and some of them made a lot of money from it. That's Corey Feldman and Corey Haim, and if you think I didn't track down that advert just to play it halfway through and get everyone's attention back, you're completely wrong. "You can listen to their private phone messages and get their personal number where you can leave them a message of your own. Two dollars the first minute, forty-five cents each additional minute. Ask your parents before you call." Yeah. I haven't tested the number; I don't imagine it works anymore. But those celebrities didn't have Twitter. Twitter is effectively someone's personal number. If you know that a celebrity is always on their phone, always typing on the computer, sending their thoughts out to the world: oh, why not reply, send them a message? They might notice it. They might actually reply to you personally. You might get attention from them, and suddenly it's not a weird parasocial relationship, they're your friend. More than that, because it's on social media, you can try to get that attention, you can try to get
that over-the-top fandom, but it's also performative, and it can be competitive: all the fans can now see each other doing this. I have a couple of friends who have that sort of terrifying, Beatlemania-esque fandom, the sort of kids screaming at Take That concerts, except they're not poring over magazines in small groups now. They're not kids who come to a concert, scream at their idols and then disperse. They have notifications on for when their idol tweets, so they can get the first reply. They have group chats whose only distinguishing characteristic for the people in there is that they're all fans of this person. If you've ever wondered why kids would rather sit and watch a stream of someone playing a video game than just play the game themselves: it's not about the game, it's about the person playing it. A stream of a video game on its own isn't interesting, but someone you know, or someone you think you know, playing a video game? That's just hanging out with a friend, with a chat next to it, and in that chat, that's a lot of friends hanging out. You all just hang out together, except that one of you has a lot more power and influence than the others, and often is indirectly asking for money. You'll notice that I'm not using any slides during this bit; I don't want to call out any particular individuals for what is basically just hustling. Patreon and Twitch subscriptions and YouTube memberships, all these tools they have to raise money for individual people, are not inherently a bad thing. Patreon has meant that science communicators are able to support themselves
despite the fact that their content is sometimes not advertiser-friendly, so people who talk about sex education, or mental health, or ancient weapons can all get money for their content, and perhaps not even have to hold down a separate job, because individuals out in the world have thought "this should exist, and I'm willing to donate money to make that happen". Animators, writers, podcasters: the sorts of people who work incredibly hard and are only able to make what they do because people have chosen to support them. That is a brilliant thing. But where it starts to get unsettling, where it starts to get a little bit weird, is when it becomes not about supporting someone's craft but about selling friendship. And if you think it's weird when I put it that way: yeah, you should. It's really weird. If you watch one of the really popular video game streamers on Twitch, which is a platform that is just video game streaming, you'll see they're almost always talking. They're watching the chat. They're reacting to the messages coming in. They're reading them out loud, replying to them, calling out the names they've seen, greeting people who have been hanging around in that group for a long, long time. They are being friendly and open and "on" for hours at a time, performing exhausting emotional labour. They will thank anyone who sends them a tip, or better yet subscribes, because in a world where Netflix costs $13 a month, subscriptions to a single person on Twitch can be $5 or $10 or $25 a month, and depending on what you pay, that might affect what perks you get back and what attention you get. And if someone renews their subscription, they can choose to announce it to the whole stream, with a little animation that says how long they've been subscribed for. You'll see people on Twitch say: "hey, So-And-So, thanks for being part of the cult for four years." That's
literally language I heard while researching this; it seems normal to anyone embedded in that culture. Now, if you're a science communicator, it won't be quite that much, but it might get you behind-the-scenes access. You might get to see someone's videos early and put some comments in, or get your name in the credits. It might give you access to a special members-only private chat room. Or, if you're giving someone maybe $50 a month, maybe there'll be a private video chat with the creator, just for the folks who are spending that much money. Maybe that is okay. Maybe that's the way social norms are going now, and I'm the old guy looking at that going "what on earth are the kids up to?" That might be the case, but it can be, essentially, selling friendship. And again, it doesn't have to be; there are a lot of people who use it purely as a way of funding their work. But here is some of the advice that Patreon gives on how to get more people signed up to monthly subscriptions: "Bring your audience along for the ride by sharing pics, videos and anecdotes from your life. Get vulnerable, within reason. An emotional connection to you as the creator can be key in converting a fan to a patron." An emotional connection can be key in converting a fan to a patron. There is a very, very blurred line between being a fan of someone's work and being a fan of someone, and that's a line I find really uncomfortable, because, in part, my brain doesn't do parasocial relationships. It never has. Maybe there's something wrong up here. But there is, to me, a huge distinction between "I'm a fan of X's work" and "fan of X". I know linguistically one can be shortened to the other, but those are, to me, very separate concepts. I like the work of Derren Brown, the mentalist magician. I have cribbed some of the techniques I use in talks (not the magic tricks, but the rhetorical tricks)
shamelessly from his live shows: the idea of setting something up at the start, letting the audience forget it, and then bringing it back at the end, revealing it's the key to the whole thing. I have blatantly ripped that off Derren Brown more than once. I thoroughly enjoy his work. I think he's a great entertainer. But I don't give a damn about the man himself, because I don't know him. He's a stranger. Parasocial relationships, and everything about the way these one-way, one-to-many relationships work, blur that line in the service of greater profit. I have a strong memory, from when I was a kid, maybe about that tall, of being asked in school to write something about a personal hero. And I didn't have any. With the benefit of adult hindsight, obviously the cop-out option was to talk about my parents, but as the kids around me wrote about sporting heroes or actors or whoever, I just sat there, stumped. And it wasn't until I was much older that I realized that, to most people, liking someone's work and liking someone are the same thing. That was blindingly obvious to most people, I'm sure, but to young me, that was a revelation. There: that has revealed something personal and vulnerable, within reason; that's helping create an emotional connection between me and the audience. Tick. The collection buckets will be by the door on the way out. God, it's like a cold breeze just came into the room. That was brilliant. So why have I covered that in so much detail? What does all that have to do with science communication? It was worked out very early in television history that the people who were good at getting an audience were not the people who went "Ladies and gentlemen!"; they were the people who went "hey, hello". They weren't the people who went "how are y'all doing tonight?!"; they were the people who looked down the camera and said, "how are you?"
It's the difference between talking to the audience and talking to the viewer. There is a difference between a nature documentary with stock footage and a voiceover and a David Attenborough nature documentary; there is a difference between Wonders of Life and Brian Cox's Wonders of Life. Sound quality, factual accuracy, video quality: they all matter, but not nearly as much as having someone on screen who the audience can connect with. My friend Dr Simon Clark vlogged his PhD at Oxford. (For the more old-school people in the audience, "vlog" essentially means video diary.) Simon now has a doctorate in atmospheric physics. In 2018 he stopped making personal videos about his life in favour of science communication, and he wrote a post about why and how. I want to quote a little bit from it: "The motivation of watching someone struggle with the monumental task of researching a PhD was mostly what attracted viewers to watch... I was the product. By that, I mean that my lived existence on earth was a commodity, something to be bottled, refined and sold." Simon's recent science communication videos are really good, but some of his audience didn't stick around as he changed from talking about his life to talking about his work. He has had to build a new audience, one that's interested in that post-academia career of his and in the subjects he's interested in now. He's getting there. He's doing really well. But Dr Clark is in the minority, because he's qualified to talk about his subject. For every one of him, there are countless people speculating, or repeating misguided facts, or just flat-out lying, or trying to shill essential oils by claiming they cure cancer. You would hope that it's the people like Dr Clark who would be authoritative.
But often that's not the case. Authority frequently comes from having an audience, and having an audience comes, all too often, from that parasocial emotional connection with people. If you are going to try to talk science to the world as an anonymous voice, or as a corporation just saying words, it doesn't really matter how well cited your sources are or how groundbreaking your research is. You have to tell people about the human story in it, preferably your own, and to a certain extent you have to be parasocial. So you may think: well, okay, on television we had gatekeeping. We did have that; there were certain standards. Surely, even if it was about parasocial relationships, and even if there were occasional failures, at least these were people we could relate to, even if they weren't technically qualified. And again, to be clear, most of the people doing this today are extremely qualified, and even if they're not, there is a whole team researching behind the scenes. But it's still the medium that gave us Most Haunted and Ghost Hunters, and even with Most Haunted, you weren't watching because you were interested in ghosts; you were interested in watching the presenter gasp at something that wasn't there. And the online world is often seen as this uncontrolled, unmediated place where anyone can say anything about anyone, but what the last few years have shown is that that's also not the case. Which brings me to the last main part of this, which is about echo chambers and Nazi bars. The final piece of the puzzle, in working out why some lucky broadcasts go around the world and some don't, is the people who pass them on: the groups in which someone can choose to amplify your voice or condemn it, where it's passed on person to person to person, group to group to group.
And some things are passed on because they're interesting or entertaining, for no more reason than that. But often it's because they support existing views, because they reinforce the in-group, or because they're diametrically opposed to the in-group's views and the group can bond over despising them. Up there are the two extremes of online moderation. Let's talk about the Nazi bar first. This is what happens when a site is set up to be a bastion of free speech, where anything legal (and by that, they usually mean legal under United States law) is free to post. You see this set up by the sort of well-meaning libertarian tech bros out of Silicon Valley. Reddit, which is one of the major hubs for discussion, at least among tech-savvy Americans, is perhaps the canonical example. It was set up as this perfect bulwark of free speech: if it is legal, Reddit will let you say it. Small groups in there might have their own rules, but other than that, it is a meritocracy; the best ideas will rise to the top. At which point, inevitably, the Nazis moved in. And that's not just a label. I'm not slandering people with right-wing views there; I mean literal, modern-day neo-Nazis. And unlike many European countries, the US does not have a law against incitement to religious or racial hatred, so that was legal. So Reddit let them post. Bad speech,
they said, could be countered with more free speech. And as you might expect, that lasted until it started to affect their bottom line. When advertisers were starting to have problems with them, when there was finally a crackdown on the overtly, staggeringly racist discussion there, this was the quote from one of Reddit's co-founders, and it's kind of astonishing: we didn't ban them for being racist; we banned them because we had to spend a disproportionate amount of time dealing with them. This isn't exclusive to Reddit, by the way; Facebook's moderation has been similarly lax and inconsistent, and somehow they've mostly got away with it. The inevitable conclusion of "let anyone say anything" is that the worst people, having finally found a place that will let them in, start to drive out the more careful and cautious. So the discussion swings a little bit more towards their views, which means that more moderate people leave, so it swings a little bit further that way, and the cycle continues, and continues, and continues, until eventually you realize that either only the worst people will survive, or maybe the moderators might want to kick out the Nazis. This is the analogy of the Nazi bar: the local pub might be the greatest place in town, but if they let the Nazis meet in the basement, you're not going to want to go in there. Or at least, you're not going to tell your friends you go in there. In 2015, Reddit conducted a survey of its users, and they found the number one reason their users do not recommend the site, even though they use it themselves, is because they want to avoid exposing friends to hate and offensive content. So let's look at the other extreme, the echo chamber. For better analysis of this, I would direct you to the work of Walter Quattrociocchi and the folks he works with at the Laboratory of Data Science and Complexity at Venice's Ca' Foscari University. I've mispronounced one of those words; I don't know which.
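That drive-out-the-moderates cycle can be sketched as a tiny toy simulation. To be clear, everything here is my own illustration: the 0-to-10 "extremeness" scale, the rule that the room's tone skews toward its loudest voice, and the tolerance threshold are invented assumptions, not anything Reddit or any other platform actually computes.

```python
def simulate_drift(members, tolerance):
    """Toy model of moderation drift. `members` are positions on a
    0 (mildest) to 10 (most extreme) scale. Each round, the room's
    tone is pulled halfway toward its most extreme voice; anyone
    sitting more than `tolerance` below that tone gets uncomfortable
    and leaves. Repeat until nobody else wants to go."""
    group = sorted(members)
    while True:
        # The loudest, most extreme voice drags the average tone upwards.
        tone = (sum(group) / len(group) + max(group)) / 2
        stayers = [m for m in group if tone - m <= tolerance]
        if len(stayers) == len(group):
            return group  # stable: the drift has settled
        group = stayers

# An evenly spread room, where everyone tolerates a tone up to three
# points more extreme than their own view: the moderate half walks
# out, round by round, and only the extreme end remains.
print(simulate_drift(list(range(11)), 3))  # [6, 7, 8, 9, 10]
```

With a larger tolerance nobody leaves; with a smaller one the drift runs further. Mirror the leaving rule and you get the echo chamber instead: dissenters below the group line are the ones chased out.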
In an echo chamber, there is definitely not free speech: no dissent is allowed. You see that in places like Facebook groups for multi-level marketing schemes and anti-vaxxers, where everyone has to buy in to the group's philosophy or else be branded a shill or a hater. Anyone with a dissenting opinion is shouted down by a large crowd, all of whom support each other in their views, whether those views happen to coincide with reality or not. And if all dissent is banned, you end up with a similar problem: the most obsessed, most extreme, radical believers chase out the people who aren't so sure, so the discussion, on average, starts to tolerate more extreme obsession and less dissent, and the cycle continues, and continues, and continues. If those two failure modes sound similar: well, they sort of are. Both of those extremes are harmful, and note I am NOT talking about political alignments here; I'm talking about the extreme edges of the policies that either allow anyone to say anything or allow no dissent whatsoever. And every major company that enables discussion has to pick where it sits along that scale. So this is a post from a Florida-based natural medicine clinic that is selling homoeopathic vaccines. I've anonymized them as best I can, for obvious reasons; their phone number shouldn't be up here. I think all of us here can agree that this is dangerous, and there will be wide-ranging views on whether it should be legal. Ah, they did get part of it right: those are definitely safe for ages above five. There is a concept on Twitter called the ratio: the number of replies you get, to the number of likes, to the number of retweets. If your retweet number is biggest, you have made something that has resonated, that should be signal-boosted and sent out to the world. If your likes number is biggest, you've said something heartwarming or personal or emotional that people want to sympathize with, but maybe don't want to send on to their friends.
If your replies number is largest, like at the bottom of that tweet, you've probably said something that a lot of people disagree with. A response like that is called getting "ratioed", and that homoeopathic vaccine tweet was thoroughly ratioed: those 132 replies are all people mocking it, or occasionally clearly laying out why homoeopathic vaccines are a bad thing, along with the scientific consensus. I think we can agree, at the Royal Institution, that it's probably a good idea for homoeopathic vaccines to get some pushback. But here's the problem: 132 people suddenly replying to that company, which usually gets literally zero replies to anything, is algorithmically indistinguishable from a mass abuse pile-on targeting a vulnerable person. If some awful person with a moderately sized following says "hey, I hate that guy, go and mock him", then machine-learning systems cannot tell the difference between abuse and a company getting pushback for selling homoeopathic flu shots. Any policy decision designed to reduce abuse on Twitter, to stop vulnerable people being mass-targeted (which is vital and needed), is also going to help the snake-oil peddlers. I don't know where to draw that line. Policy decisions about community standards are often seen as drawing that line somewhere between the echo chamber and the Nazi bar, and there's this idea that if the company just nudges the line a little bit this way or a little bit that way, there will be a perfect solution that keeps everyone happy. I'm really sorry to say it, but that's not true. The echo chamber and the Nazi bar are not two polar opposites; they are both on a gradient, and they overlap in the middle. You cannot choose the best option.
You can only choose the least worst. A slight tangent, but I think that's due in part to the centralization of the web that's happened over the last 20 years or so. In the early 2000s, every discussion site out there was on a different server, run by a different person, maybe in a different country, with completely different rules, and that way it sort of reflected the systems we've got in real life. One of those sites might well allow vulgar abuse and back-and-forth argument; another might ban someone for even mild swearing. And this is how it works in the real world: there's a big difference between the conversation football fans have, chanting as they go into the stadium, and the conversation you have in the hallowed halls of the Royal Institution. I mean, I assume there is; I haven't been here after the Christmas lectures, I guess it gets a little bit messy. But that difference in register and social norms is true for a lot of the bubbles within Twitter and Facebook and YouTube. You can have community norms within smaller subgroups, but all those subgroups are centralized on platforms run by enormous, mostly American corporations, so the community standards have to be standardized across all of them, and they have to be what the corporation, or their advertisers, or their VC backers will support. You cannot establish massive cross-platform policies that allow every type of discussion, or, in the case of YouTube comments, any type of discussion whatsoever. The problem is that while the communities can be small, the platforms they're on are too big and too centralized. Some people just go on Twitter thinking it's a nice little coffee shop, a friendly space with some people they know, "I'm just going to talk to my friends", and then suddenly, boom:
they get attacked by these "other people" who use Twitter as this massive shouting-match forum, who will search out anyone using a particular hashtag, anyone with particular opinions, and shout at them, because that's the way they use the platform. A few people have tried to federate this. There's a network called Mastodon, which is basically Twitter split across several different servers: they all have their own different rules, and they all talk to each other. But running a server is complex and expensive, and joining Twitter is free and easy; why would you not do that? There is something called Discord, which is the closest thing, I think, to those old web forums, those old bulletin boards: each discussion section is locked off, private, and just separate from the rest of the world. Sounds like a great plan. Sounds brilliant. And it might work for a lot of small groups, but it still has a single sign-on (excuse me, not federated: a single sign-on) across the whole network. Unintended consequences are rife if you try to play about with this stuff. YouTube recently had an algorithm change that tried to raise up authoritative voices. Suddenly, if you were watching videos about climate change, you might be sent to something like the Royal Institution's videos, which are about the scientific consensus. Which means that suddenly all the climate deniers, who were already entrenched in their views, were being sent to videos like the Royal Institution's, and suddenly, underneath each of those videos,
there are ill-thought-out, unscientific comments, and as a viewer, you can just scroll down a bit and go "oh, there are my people, they're the ones that are right", just as you might if the algorithm had determined that your fundamental beliefs were wrong. Which brings us back to the start. I'm pretty sure the person running that homoeopathic flu clinic genuinely thinks that they are doing good for the world. Their fundamental beliefs are at odds with reality, but that's never stopped people believing things. People may take what is just a saline injection and think that it's effective, and they may spread flu, and that is, in the worst case, lethal. And I'd argue that, in a perfect world, the tech companies that facilitate that discussion have a moral imperative to reduce or remove messages like that. But it can't be that clear-cut, because we aren't perfect, and the people running them sure as hell aren't perfect. Now, ideally, the algorithms for Facebook, or YouTube, or any other company would be able to think a little bit further ahead. They'd have increasing long-term profits as their goal; at least, that's what the corporations would like. They do understand public relations, and they'd be able to work out what to do. And from humanity's perspective, the ideal algorithm would be one helping humanity to survive long term: it would suppress conspiracy theories and fake news, but it would allow enough entertainment and nonsense that we still pay attention to it. In 1950, Isaac Asimov wrote a story called "The Evitable Conflict". It became the last chapter of I, Robot, and it was about giant supercomputers that run the world's economies. They were called "the Machines", and in the story, they're not being perfectly efficient. They're not perfectly designed; they're making small errors here and there. It turns out (spoilers, for a book from 1950: spoilers)
they are programmed to protect humanity, and they know us better than we know ourselves: a little nudge here, a little nudge there, and humanity is less likely to destroy itself, and the people who believe the Machines are doing that are more likely to be seen as conspiracy theorists. We don't have machine-learning systems like that. We don't have that sort of artificial intelligence, not yet anyway, and we can't tell a computer program "here are the odds of humanity surviving into the next century and beyond: improve them". If you ever do have a machine-learning system like that, the sort of superintelligence that could, in theory, control broad strokes of humanity, then the world will be about to change so much that fake news will be the least of our worries. If anyone ever does build one, I can only hope that its goal is not to maximize the profit of one company. But until someone does work out an algorithm that can do all that, it's up to us. I know this is a really corny note to end on, but we are that system. It's up to us to fact-check things before we pass them on. It's up to everyone to create things that are honest, that don't smudge the truth. It's up to the few of us who create and train those algorithms to understand the biases and make sure we're not creating conspiracy rabbit holes. And it's up to those of us who create things to manage those commercial demands of clickbait and drama against honesty and truth, and help the world not turn into a horrible pit. The only "algorithm for truth" that we have right now is ourselves. My name is Tom Scott. Thank you very much. < Applause >
Info
Channel: The Royal Institution
Views: 1,635,278
Rating: 4.8754487 out of 5
Keywords: tom scott, algorithm, social media, truth, science, science communication, lecture, discourse, royal institution, internet
Id: leX541Dr2rU
Length: 59min 34sec (3574 seconds)
Published: Thu Oct 24 2019