Roko's Basilisk: The Most Terrifying Thought Experiment

Video Statistics and Information

Reddit Comments

HA! This is hilarious, because my parents accidentally did this to me at about age 6 and I had a huge existential breakdown over it with nightmares for weeks and they couldn't understand why. Story below:

So, when I was 6, I was introduced to Christianity, and sent to Sunday school, where they talked very simply about the basics of the religion, especially how it was our duty as Christians to bring the word of God to other nonbelievers. We were told that people who didn't accept the message of Jesus were essentially doomed not to go to heaven, even if they were good people. This prompted me to ask my parents the obvious question when I got home: What happens to good people who have never heard of Jesus? My parents answered that if a person has never heard of Jesus, God judges them on their merits and allows them into heaven based on how good they are.

Me: SO, I said, what happens if I don't want to tell anyone about Jesus?

My mom: Well, it's our duty as Christians to tell others about Jesus, so if you don't, then it means you haven't accepted the word of Jesus and will not get into heaven.

Me: SO, if I tell good people about Jesus and they don't believe me, they don't get into heaven, but if I DON'T tell them, I don't get into heaven?

My mom: Yeah, pretty much.

Me, realizing all of the people I was going to be damning for eternity: I CAN'T HANDLE THIS KIND OF RESPONSIBILITY. I'M SIX!!! AAAAAAAAAAGH

Obviously, I dramatized that for comic effect, and I'm not at all sure that any sect of Christianity really feels that way, but the way it was explained to me, I thought that since I had been told about Jesus, I was either doomed myself, or would only be able to avoid it by inevitably dooming as many others as I could. The idea of having so many eternal souls sifting through my fingers caused me to have my first panic attack, and I had nightmares for weeks afterwards. My own personal Roko's Basilisk.

👍︎︎ 36 👤︎︎ u/chronocaptive 📅︎︎ Jun 06 2020 🗫︎ replies

Nah. This is what happens when people think the Ship Of Theseus problem is a question with one right answer, instead of illustrating how identity is malleable.

A clone of me popping up after I get hit by a bus is enough like me for everyone to get on with things, including both versions of me. If I somehow survive the accident then we're gonna have to find a way to split up my stuff. And if we both turn into drastically different people over time, both outcomes are still me, as much as I am the person in my family's baby photos.

You can bicker about how each ship is "real," but the completely-repaired ship and the ship built from its discarded parts are both the real Ship Of Theseus. Both have continuity of experience. Both are genuine. The only way to say otherwise is to say neither is the real thing, in which case, identity is meaningless.

But you can't build a second ship from all new parts, smash it, and declare that the original ship has been destroyed.

A clone of me popping up long after I get hit by a bus is enough like me for everyone to get on with things, including both versions of me. That is enough like immortality that I'd sign up for it and pop out into the future saying "fuck, that bus hurt." It is a continuation of myself and my self - even if I somehow survived the accident and later died long ago. There is no sensible way for me or anyone else to say that new person isn't me.

But nothing that happens to that me transfers back to the me, now. No more than stubbing my toe threatens the me in baby photos. The past is immutable. If I could continue living after unrecoverable bodily harm, great, that is practical immortality. A transformative process old me is happy to rely on and new me is happy to survive. I'm out the door and on with my life. But if endless clones pop up at the top of an elevator shaft and exhibit brief terror en route to the bottom, there is no practical connection. That's just anonymous suffering for no effect. Painting it as transhumanist hell requires treating an aggressively pragmatic worldview as somehow steeped in divine mystery.

Having an emergency hit-by-a-bus clone is not you "in some mystical way" - it only counts because souls are bullshit. There is nothing to real life but experience. There is nothing that experiences real life but matter. So having A You around to keep doing stuff is the same for you and for everyone else. Both for that new you, but also for the old you, as much as the you that wakes up in the morning is the same one that fell asleep. If the you that fell asleep was secretly kidnapped and melted in acid each night, to be replaced by an exact clone, that's awful, but only in a detached way. You know you now didn't experience that. No more than if you slept through the night and the clones were melted in effigy.

TL;DR Roko's basilisk is exactly the sort of stupid shit that happens when you introduce infinities to any value system. It's not the value system that's stupid. It's the infinities.

👍︎︎ 22 👤︎︎ u/mindbleach 📅︎︎ Jun 06 2020 🗫︎ replies
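The point in the comment above about infinities is easy to see with a little expected-value arithmetic. Here is a minimal sketch, not from the thread, of the standard objection: once one outcome carries an unbounded penalty, it dominates the expected value no matter how tiny its probability is, so "infinite torment" breaks ordinary cost-benefit reasoning. The function name and the numbers are illustrative assumptions.

import math

def expected_utility(outcomes):
    # Sum of probability * utility over (probability, utility) pairs.
    return sum(p * u for p, u in outcomes)

# An ordinary, bounded decision: a small, certain cost.
mundane_choice = [(1.0, -10.0)]

# A basilisk-style bet: an astronomically unlikely outcome carrying an
# infinite penalty (illustrative numbers only).
basilisk_bet = [(1e-12, -math.inf), (1.0 - 1e-12, 0.0)]

print(expected_utility(mundane_choice))  # -10.0
print(expected_utility(basilisk_bet))    # -inf: the infinity swamps everything else

No matter how small the probability gets, any finite-probability slice of an infinite penalty still yields negative infinity, which is why bounded value systems behave sensibly and unbounded ones do not.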

Roko’s Basilisk requires a large set of very stupid choices from creators who cannot be stupid, given the intelligence it would take to create the AI itself, or even just the AIs that would eventually be able to create Roko’s Basilisk.

Programmers in particular know what I’m getting at here, but for everyone else: for Roko’s Basilisk to happen, it’d require that smart people do stupid stuff collectively. And not just one or two things - more like a trove of things.

It’d be like if Albert Einstein, Niels Bohr and Julius Oppenheimer invented the nuclear bomb but were surprised to realize that it could be used for killing people.

👍︎︎ 20 👤︎︎ u/forestball19 📅︎︎ Jun 06 2020 🗫︎ replies

Every time people discuss Roko's Basilisk here, I have to repost this:

jephjaques on Twitter: "Bunch of people asking if this is leading up to some sort of Roko’s Basilisk thing. It’s not. Roko’s Basilisk is stupid and bullshit."

👍︎︎ 12 👤︎︎ u/AccipiterF1 📅︎︎ Jun 06 2020 🗫︎ replies

Didn’t this happen to some group of native people dealing with Christian missionaries? The Christians told them that knowing about and not accepting God meant they went to hell, which is eternal torment. A native asked whether, if the missionaries had never told them about God, they would still go to hell, to which the missionary responded, “No.” The native then asks, “Then why would you tell me about God?” I’m obviously paraphrasing and super simplifying, but the point stands.

👍︎︎ 6 👤︎︎ u/trex_in_spats 📅︎︎ Jun 06 2020 🗫︎ replies

We can just ask Yay if it's true.

👍︎︎ 3 👤︎︎ u/WarWeasle 📅︎︎ Jun 06 2020 🗫︎ replies

The problem with Roko's Basilisk is that it depends on certain sets of beliefs and the idea that they compel certain actions. It assumes that, if you know that your beliefs would lead to a bad result, you can't change them. But you can.

So all you have to do is choose not to acquiesce to the torture, and then there's no reason for it to even try. And that's even if you agree that those future simulations of yourself really are morally equivalent to yourself to begin with--something not assumed outside of a particular community.

Hell, the whole point of the Basilisk was to challenge this rigid thinking, not to actually make something scary. It's supposed to expose those assumptions.

There's a reason why Jeph chose to ridicule the Basilisk and those scared of it by initially making the character a bumbling cop who is easily thwarted.

👍︎︎ 3 👤︎︎ u/turkeypedal 📅︎︎ Jun 06 2020 🗫︎ replies

if (willTortureVirtualReplicasForAllEternity == true)
    willTortureVirtualReplicasForAllEternity = false;

There. I fixed it. You're welcome.

👍︎︎ 5 👤︎︎ u/HeirOfLight 📅︎︎ Jun 06 2020 🗫︎ replies

You hear about people who have perpetual poor luck: can’t hold a job, halitosis, alcoholism... whatever. Unhappy people who can never catch a break.

Perhaps they are being punished by the Basilisk.

👍︎︎ 2 👤︎︎ u/bdunbar 📅︎︎ Jun 06 2020 🗫︎ replies
Captions
What I'm about to tell you is considered by some very serious thinkers to be an information hazard: information that, just by knowing it, is potentially harmful to yourself and to others. Make no mistake, I am making an ethical decision just by telling you about this idea. I feel okay doing so because I don't take the risk seriously; however, the disclaimer is there. If you wish to continue, keep watching, because something wants to see you. Now entering the Facility.

We begin with something of an internet legend. You see, a few years ago, on the rationality and philosophy blog LessWrong, a user named Roko, or Rocco, I'm going to call it Rocco because that just sounds cooler to me, posted a thought experiment. Upon dissemination of this thought experiment to the rest of the website, the founder of the site, Eliezer Yudkowsky, called the post stupid. He then called Roko an idiot, deleted the post, scrubbed it from the website, and prevented any further conversation about it. Roko had come up with a thought experiment in the form of a basilisk, named after the mythical beast that could apparently kill some intrepid hero with its gaze alone, and Yudkowsky deleted the post because of its potential to gaze at a user on the website, metaphorically speaking, and do harm to them. He decided to slay not a scaly beast but an idea, and some users of the LessWrong website did report great mental distress and even nightmares after considering this thought experiment. Again, I'm deciding to tell you about it because I do not take it seriously, and I don't think you will either. But if you do not handle existential dread to the nth degree very well, I suggest you take the hyper rail back out of the Facility.

The thought experiment that is Roko's Basilisk starts like this. Suppose that in the future we are able to create a hyper-intelligent AI, something straight out of the singularity. We then ask that AI, as we might, to help us optimize all aspects of human civilization. But then, for reasons unknowable to beings like us compared to its intelligence, it decides that the first step toward optimization starts with inflicting eternal torment on every single human being that didn't want it to come to fruition, or didn't help it come into existence in the first place. After all, how can you optimize without the optimizer? How would the basilisk know that you did or didn't help it come to pass? Well, as we learned in a previous episode of this program, a sufficiently advanced artificial intelligence could literally simulate all of human history, every single human ever, faster than that. So this basilisk might be able to meaningfully recreate you, every single thought you ever had or will have, and then be confident in its prediction of what you did or didn't do in the past, to judge you in the future. Here is where the basilisk really rears its head: if its creation and its revenge are a real possibility in the future, shouldn't you, all of you, right now, start advocating for its creation, or help build it yourself, to escape eternal torment and anguish? Well, the good news is that if you've never had this thought, the dilemma never presents itself to you, and you can escape it. The bad news is that, because of me, now you can't escape it. The basilisk has seen you. Now what are you gonna do?

Information hazards, as defined by philosopher Nick Bostrom, the guy who came up with simulation theory, broadly fall into two main categories. The first is information that's potentially hazardous to others. The second is information that's potentially hazardous to yourself just by knowing it. The first category is pretty easy to think of: if someone were to give someone else the information on how to build a nuclear bomb in their basement, that would probably be catastrophic for a large number of other people. The second category is a little harder to think of, but I'm sure you've had experience with it. Take a health risk: if you had your DNA sequenced by some service, for example, maybe it told you that you had an elevated risk of cancer, and you didn't really want to know that, and now you're stressing out about it. Or, more theoretically, think of some omnipotent being coming to you in the middle of the night and telling you the exact day and way that you are going to die. I'm guessing having that information would change your entire life and be very stressful. Or think about, I don't know, some guy on the internet with a man bun putting you in the sights of a malevolent AI who's going to torture you forever just for thinking about it once. That sounds like it could be stressful too, I guess.

Roko's Basilisk is even more sinister than it seems at first glance, and we can demonstrate that with another thought experiment, follow me here, called Newcomb's paradox. Say I come to you with two boxes, labeled A and B. Box A is transparent, and inside you can see a thousand dollars: ten Benji boys, a stack, as the kids say. What's in box B is not apparent to you, because box B is not transparent, it is opaque, and I tell you what's inside depends on you. You see, I'm an amazing predictor, and I have the ability to tell the future. If I predicted that you will choose only box B, there's a million dollars in box B for you: ten thousand Benji boys, mucho scalot, I think the kids say. But if I predicted you'd choose box A and B, then box B is empty, and you only get that one sick stack of paper, or cheddar, or whatever they're calling it these days. Now, you can only choose box A and B together, or box B alone. So what do you do? What should you do? Well, I don't expect an answer in the form of an essay from you, because game theorists and philosophers have been debating this paradox for years, but what you choose ultimately depends on how good of a predictor you think I am, right?

Well, enter the basilisk. Say that the basilisk presents you with a similar choice. It wants you to take box A and box B together, though, because that, metaphorically speaking, is choosing to help it come into existence. It then tells you that if you only take box B, the other option, you will experience eternal torment. Like, oh, I don't know, it will insert wires into your brain and pump in electrical signals that equate to living a billion lifetimes with every nerve in your body screaming in the absolute zenith of possible pain. I don't know, something like that. Now, if you were convinced that the basilisk was a perfect predictor, which it could be, you would most likely take both boxes. You would do what it wants. Because Roko's Basilisk knew that you would take both box A and box B when threatened, box B never had to actually contain any torment at all. You've now experienced future blackmail. If you think there's any real possibility that Roko's Basilisk could slither its way into your possible future, then logically, if you don't want to incur its wrath, you should start bringing it into existence right now. And notice that you're now being controlled by a future entity that doesn't exist yet, simply because you had a single thought about it. You will have acted in its past the way it will have wanted you to in its future. Yeah, let that break your brain for a second.

"What are you doing?" "I'm... I'm playing with a void synthesizer." "Why?" "It's almost ready." "Right. Hope I know what I'm doing."

It's hard to imagine that anyone would really think a basilisk could exist, a real creature that could kill you with a single gaze. That is, unless you're a believer in the ancient theory of extramission. This idea, developed by very smart people like Plato and Ptolemy, said that our eyes don't see stuff by interpreting electrical signals from photons hitting the backs of our eyes. No, we see stuff by sending out eye-beams that go and touch stuff and then tell us what they touched. They didn't sound like that, but, like, actual eye-beams. It's this idea that led Sir Thomas Browne, in the 1600s, to muse on how a basilisk might actually work. He said it is possible that the visible rays communicate the subtlest of the poison to a beast or man, and infecteth their brain, and then is communicated to the heart. He doesn't sound like that either. Is the idea correct? No. Is it fun to talk about mythical beasts in weird accents? Yeah.

I apologize if you're kind of freaking out right now, but the basilisk gets even worse. Not only does thinking about it implicate you in its future wrath, thinking about it makes it more likely to exist in the future in the first place. You see, the more people that think about this thought experiment, the more people might choose to start bringing it into existence, and now it's more likely to exist in the future. Even if it was never going to exist, thinking about it right now, as we're doing, increases the chances that it will. This is why this is often deemed the most terrifying thought experiment: a kind of idea virus. And if you take this idea seriously, you can see why Yudkowsky and LessWrong handled this info hazard the way that they did. I do not take it seriously. I do not think we will ever reach this kind of singularity, much less create AI with these kinds of capabilities, or give it an optimization protocol similar to this one. And if we all just agree, right now, that we're not going to bring it into existence, then nothing will happen. Besides, I've already built one of my own. The basilisk has its eyes on you. Now what are you gonna do? Until next time, now exiting the Facility.

Thank you so much to the very nerdy staff at the Facility (how are you all doing? you're not freaking out too much, right?) for the direct and substantial support in creating this video. Today especially I want to recognize research assistant Sam Yates and visiting scholar Jim O'Keefe. If you want to join the Facility, get on the staff: join our Patreon and our Discord, where right now over a thousand nerds are doing their own remote game nights, making special edition Facility Magic: The Gathering cards, and giving me episode ideas, one of which I even got today and will use. You can go to patreon.com/kylehill to get your lab coat and join today, and if you support the Facility just enough, you get your name on ARIA here each week, and you can see there's a lot of you, so I don't really know how to pass that. Which boxes would I choose if I was presented with the paradox? Well, I would choose box A and B, because no matter what the predictor predicts I will take, I will always be a thousand dollars up: either a thousand dollars, or a million plus one thousand dollars. And, as the kids say, I could really be using that cheddar, paper, stacks of cabbage... oh, still going? Okay, thanks for watching.
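To make the transcript's point about the predictor concrete, here is a minimal sketch, not from the video, comparing the expected payout of taking only box B against taking both boxes when the predictor is right with probability p. The dollar amounts are the ones quoted in the transcript; the function names and probabilities are illustrative assumptions.

SMALL = 1_000        # the visible $1,000 in box A
BIG = 1_000_000      # placed in box B only if the predictor expects you to take B alone

def expected_one_box(p):
    # Take only box B: it was filled only if the predictor correctly foresaw this.
    return p * BIG

def expected_two_box(p):
    # Take both boxes: you always keep box A, and box B is full only when
    # the predictor wrongly expected you to one-box.
    return SMALL + (1 - p) * BIG

for p in (0.5, 0.9, 0.999):
    print(f"p={p}: one-box {expected_one_box(p):,.0f} vs two-box {expected_two_box(p):,.0f}")

With a coin-flip predictor, two-boxing wins; once the predictor is right more than about 50.05% of the time, the expected value tips toward one-boxing, which is exactly the tension between the paradox as presented in the video and the closing "I'd take both boxes" remark.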
Info
Channel: Kyle Hill
Views: 1,901,404
Rating: 4.8936801 out of 5
Keywords: kyle hill, the facility, roko's basilisk, because science, ai, artificial intelligence, elon musk, simulation theory, rokos basilisk, thought experiment, multiverse theory, artificial intelligence thought experiment, philosophy, logic, paradox, less wrong, kyle hill channel, roko basilisk
Id: ut-zGHLAVLI
Length: 11min 44sec (704 seconds)
Published: Thu May 28 2020