So our topic today is Consciousness and Identity. This is something we’ve looked at before
and I felt we should spend some more time on, and so this is going to be mostly a concept
video where we overview a lot of the stranger approaches to this topic. Toward the end we’ll look at the notion
of how immortality or very long lifetimes can potentially mess with personal identity
and even explore a somewhat strange solution to the Fermi Paradox involving identity that
we’ve not looked at before. But first we should get the preliminaries out of the way.
There is no definition of either consciousness or identity that is universally agreed on by philosophers, which is a bit amusing, since both are things everybody knows intuitively. I am me, you are you, my cat Prospero is my
cat Prospero, though I’m not sure he knows that’s his name or just some noise I make
when I want his attention, which he’ll sometimes deign to grant me. Of course, when you get right down to it, that’s probably what names began as anyway: a specific noise made to get a specific person’s attention. We all know what identity is. We all know what consciousness is. But if you try to slap a proper definition
on it, things tend to get complicated. I sometimes wonder if trying to describe identity
in terms of concepts is like trying to describe smells in terms of colors or sounds. Some things you just can’t describe without
resorting to a circular definition, which is a definition that is circular. We know people change with time, most of your
cells are replaced gradually and those themselves routinely take in fuel and material to replace
damaged components. There are probably some atoms left in your
body from when you were born but odds are good a lot of those left your body at some
point and have since returned by coincidence. But while the child often differs from the man or woman they will grow to be, we know they are the same person. We just know it. We’re probably wrong, but it hardly matters, because it’s like free will: if it doesn’t exist, it makes precious little difference that we incorrectly think it does, since that thought was predestined to happen. If you don’t have free will then you don’t
make decisions any more than a rock does and you don’t have opinions any more than a
rock does. And if identity isn’t something more wide-spectrum than your exact arrangement of atoms at this moment, then there isn’t any you
to have opinions since the you who thought them up wasn’t you, it was someone else,
and not the same person who was listening when I began this sentence, or whoever it
was who began that sentence, since that wouldn’t have been me. Such ruminations are absurd… they might
actually be right… but they’re still absurd in the sense that there’s no reason to contemplate
them since if true they’d mean you’re not contemplating anything anyway since there’s
no you to be doing the contemplating. So for the purpose of this video we will assume
this video has a purpose, and that I exist as a distinct entity and so do all of you. I can doubt many things but I can’t doubt
my own existence since if I don’t exist there wouldn’t be anyone to be doing the
doubting, which doesn’t prove I am thinking or exist, but it does prove there’s no point
doubting your identity, since you’re incapable of doubt if you don’t exist. Thus, since I think, I am. We’ve been whacking our foreheads against this wall for centuries; it’s occupied the minds of some of our greatest philosophers, and to this day we’ve got no conclusive answer. So it isn’t one we’ll answer today. Quite to the contrary, I’m just going to
throw in some extra mental confusion by seeing how science and concepts from science fiction
have made this whole set of topics even worse. I mean there is a reason why this is called
the existential crisis series, if you don’t want some aspirin by the time this episode
is over I probably haven’t done my job. So we know people change with time, both in
their software and hardware. It’s not just addition either: you lose stuff, cells die, atoms switch out, memories get forgotten. We usually say that if it’s gradual and maintains continuity in the process, the identity has been maintained. I plant a seed, that seed becomes a sapling and grows to be a tree, and throughout that process identity is maintained, even though there are few if any atoms left over from that first seedling. If I chop it down and make a table, and a bunch of firewood that becomes ash that becomes fertilizer for my garden that becomes vegetables
I eat on that table, well, identity would seem not to be maintained. I could maybe argue that the table was kind
of sort of the tree still, but I can’t take that too far, not to the point of claiming
my salad from my garden fertilized with wood ash was, because that path leads to madness. You could then argue that the tree, coming from a seed of a previous tree, is actually that previous tree, a claim that’s a lot less crazy since at least the contribution was significant, in terms of DNA. It would still get confusing, since that tree came from a tree too, and that grandfather tree would then have an equally valid claim to being the tree I cut down and made a table and salad out of. Your parents have an even better claim, under
the default definition of the people whose DNA you got and who raised you. And we know identity really is blurry there
too, because you did pick up a lot of your personality from the people who raised you, and a lot of the physical traits of whoever bred you, and both if those were the same people. But if they weren’t, most of us would say these days that it was the folks who raised you who have the better claim on your identity. We tend to think the software matters more. So if I clone you, it’s definitely not you;
it’s just a younger twin sibling with none of your memories, or maybe arguably your kid,
via parthenogenesis, depending on how you look at it. What if I duplicated you? Okay, now we have a copy of you complete with
memories. Now we know in time you and they will diverge, and arguably count as separate people right from the first moment; after all, software maybe matters most, but hardware is a big deal. Now the hardware is mostly identical if I copy you into a full-grown clone body, but it’s not getting the same sensory inputs
or experiences anymore. Of course it could be more than that. If I duplicate my mind onto a different body,
say a professional basketball player’s, I won’t know how to play basketball, but I’ll be better at it than I am now, and I’m sure being way taller than average will alter my
view of the world more than just literally, though that’s enough. Now if I just switched bodies with him, we would view it that way: we switched bodies, not minds. The body is the inferior partner in the pair. If Bob and Todd switched bodies, we wouldn’t say, “Hey look, there’s Bob with Todd’s brain,” we’d say Todd is in Bob’s body. If that stayed that way long enough we’d
probably say Todd was now that body too. Maybe it’s a whole new identity, but there
is no one with a better claim to that identity. And that’s probably important, and will
be when we look at this with the Fermi Paradox in mind at the end. Maybe Todd isn’t actually Todd anymore,
but nobody has a better claim to be him. Maybe you aren’t who you were ten years
ago, but nobody has a better claim to that identity than you do. Most of us would agree that copy wasn’t
actually us, but it was a pretty good approximation, and if a friend died and was so duplicated,
complete with memories, we’d be pretty justified in treating them as we did the friend. A lot of folks would say it was the same person. That’s why concepts like mind uploading
or dumping yourself into a clone body tend to be popular routes to immortality in fiction. So I’m on the table getting ready for an
upload or transfer after which I’ll die. In one reality that’s exactly what happens,
I’ve got some untreatable illness and we’re going to clone or print me a new body, or
maybe stick me in an android body or upload me to a virtual reality. And that happens and the new me wakes up,
takes a sad glance at dead me on the table, and probably decides it is me. It’s a perfect clone of mind and body. Now we handled a similar case when we looked
at the Simulation Hypothesis, but we went through that quickly. At this point a computer monitor flicks on
and a digital copy of me pops in proclaiming itself the true me, and downloads itself into
an android. That android severs ties with the digital
self and now we’ve got three of me standing around glaring at each other. Now, because I am the sort of person who has contingency plans for identity crises like this, they all actually know the plan for such a weird scenario: they throw 24 of the 26 letters of the alphabet into a hat, leaving out I for Isaac and R for Rascally Rabbits, each pull one, and pick an extra new name beginning with that letter to be used for internal conversations and official documents. They randomly divide my possessions. Of course what they do about friends and family
is a little trickier. In a very real way we are our friends and
family and various professional or social obligations. It would be nice not to have to divide those up, certainly not initially, but sharing them could cause a lot of confusion, even if we made it very clear to everyone which of us was which. So what if someone found a way to keep us
linked together so we could share memories? Now this isn’t quite the same thing as a
hive mind, where there is one single collective mind running everything, and all else are
drones, but it raises a problem we’ll get to in a moment. We could throw in a lot more folks too, like having my brain cut out and stuck in a brain-dead person who’d bequeathed their body to science, which raises the issue of whether I’d want cosmetic surgery to better look like my old self. That might get pretty intensive too. Let’s say a woman had her mind put in a
male body: is she a man or a woman? Should she get gender reassignment surgery? Or should she spend some time as a man to experience that? We’d usually say that’s entirely her choice, but if it were you making that decision for yourself, which would you do? So, hive minds. These come in a lot of forms, from the single
mind running the bodies like drones to where everyone has simply done some sort of linkage
to share some or all experiences, to a sort of loose internet that expedites chatting between them and allows important material to be shared out automatically. We always see this as multiple bodies, whether it’s one mind or many, but it could go the other way too. Imagine in the future we’ve reached a point where we can’t have more people. Earth’s reached its comfortable maximum, our nearby colonies have too, and there’s nowhere with space left to migrate new people to. And everyone is functionally immortal. Now folks occasionally still do die, suicide
or whatever, so we still get some new kids. Indeed maybe you can only have a new kid if
someone volunteers to die. But a lot of folks who are very old, in terms
of experience not body age, firmly believe we need to be bringing in new fresh minds,
and at a faster rate than is occurring. Maybe they even get together, say 1000 of them, and agree to a lottery: someone from that group dies and a child replaces them. The person with the short straw agrees to die when the kid is born, having nine more months of fun and a big death-slash-birthday party on that day, and the kid is named after them. A bit bleak, but pretty civilized. Now the question is who raises them? After all, the person with the best claim is
gone. Many might remove themselves from the contest,
they might have an election, they might have a lottery. But someone suggests that since many of them are transhumans with very accelerated thinking, and many are totally digital entities on a computer, they could just build a pair of mom and dad androids where they all vote on each major action, and agree to a general set of behaviors while piloting the robots. So the kid ends up with a few hundred people
raising them as a collective pair of minds, with some guidelines to avoid either entity
acting like someone with a very severe and real case of multiple-personality disorder. That way those who want to, get to enjoy every minute of being a parent, and the kid gets the benefit of having hundreds of parents, most of whom might have thousands of years of life experience. It’s a bit weird, but I’d tend to guess
the big threat to the child would be getting spoiled horribly. Similarly we’ve all got a long list of things
we want to do, many of which can’t realistically be pursued simultaneously. Now while an immortal person has time to do everything, we are also impatient. Maybe you’ve got three things you really
want to do and get yourself cloned twice, and each of you picks one of those three things. Say one of you wanted to focus on running
a business, another wanted a family, and a third wanted to go explore other solar systems. You agree to exchange memories every so often
so you can all enjoy this. You know you’ll diverge with time to become truly separate people if you don’t, and maybe will anyway. Now that’s all well and good, but then another
person comes along and decides to make tons of himself, all to operate a giant business
empire, with each individual copy going off to train to be an accountant, a marketing
expert, a lawyer, and so on. This entity employs no one but themselves,
as it were. Now with all those different life experiences
you’d expect this might fragment but let’s say it doesn’t. Society doesn’t really approve of this mega-personality
of a million clones but they don’t break the law, they work hard, they pay their taxes,
and many of them are out constantly doing volunteer work. But now a big election is coming up and they
all decide to register to vote, all one million of them. And they are definitely planning to bloc vote. What do you do? Now at the same time one of them goes out and does commit a crime, a pretty big one too, a wave of murders, and turns himself in, ready to take the consequences. Who do you charge? This seems like it would need to depend a lot on how tight that network was. If they really are all totally separate people,
then only one person committed the crimes and the others are either innocent or maybe
accomplices. It would also seem like they all get to vote. But if they’re still one big mind, then
they are all guilty and shouldn’t get even one vote since felons normally don’t get
to. There’s probably going to be a very gray
area in between, where you can see evidence of a single mind and evidence of separate
personalities too. This isn’t a scifi novel where the answer
is going to be painted as very clear, simple, black and white at the end. So I’d actually encourage you to go with
your gut instinct for the moment, whichever that is, individuals each with a vote or a
collective mind that is guilty of murder. Let’s tack in a different direction for the moment, and we’ll come back, but keep your answer in mind. If we are the sum of our experiences more than anything else, then adding or subtracting those experiences changes the person. Let’s say we have the ability to edit memories too, like removing a traumatic experience. Some would argue you shouldn’t do that,
that we are the product of those events, good and ill. Others might agree in general but would say,
look, this kid just saw her parents brutally murdered in front of her; we don’t need to remove the knowledge of their deaths, but we can at least cut the minute where she saw it happen. She’ll still remember they are dead. And that’s kind of hard to argue with; I mean the big objection would seem to be that it is a kid, and we are deciding for her, and even if she’s quite happy to proceed with that, she’s still not an adult. Flip side: what about adding memories to a kid, like sitting through a math lecture? Or even for an adult: if someone sits through a math lecture for you and transfers that memory to you and many others, are you now partially
her, because you have those memories? Is that really any different than watching
a recording of that math lecture, only in Ultra-high-definition, as it were?
Bit of a tricky notion, since it would then imply some of my identity is being imprinted on you right now, and we know there’s some truth to this: we do inherit a lot of our personality from our parents and what they imprinted on us. We definitely do get our personalities, a
big chunk of our identity, altered by those we interact with. Let’s go in one more direction. If we are the sum of our memories, and I clone
myself, mind and all, and it goes and commits a crime right away, it would be very hard
to argue I’m not guilty too, just because I didn’t pull the trigger and don’t remember
the event. We might even need to use an affirmative defense,
which is when you have to prove something rather than have it proved against you in
a court. Like Bob shot Todd and that can be proven
beyond a reasonable doubt, because he says he did for one thing, but he says it was in
self-defense and he now needs to show this is true. We argue the clone was mentally unbalanced
perhaps, or that the motive for the murder came after the process, that sort of thing. If we fail we might both get jail time or
the needle. The same would go for if someone deleted their
memory of a crime. No memory, no guilt, some would say, but not
the US courts. We’ve got some legal precedents for amnesia
of a crime. In People vs Hibbler, 1971, a man with chronic
alcoholism who didn’t remember committing forgery was convicted of it, and in Lester
vs State in Tennessee in 1963 a murder conviction was upheld even though the defendant didn’t
remember committing the crime. And in neither case was the ruling based on
an assumption the person was lying, though the jury or court might have thought they
were. That doesn’t necessarily mean they decided
correctly though, but that is the precedent in US law if you’re wondering. Now amnesia can be used as a mitigating factor
in sentencing, that’s separate, but the law hasn’t really had to deal too much yet
with the idea that identity might be a mobile and flexible concept. Let me throw another out there for consideration. You are abducted by a man who just committed
murder, he transfers his last year of memories into you, including the crime and original
motivation for it, and adds your last year to his own, deleting the actual last year
for both of you from your minds. This process is not reversible and the irritating
thing is you have the motivations for the crime now too and you think the victim had
it coming. So who do we charge with the murder, and for that matter the abduction and mental assault? You, with the memories and motivation for the murder, and for the kidnapping of yourself, or him, who remembers none of that except being kidnapped? Let’s make this worse by adding in a sympathy
aspect. He picked you because you were the passenger
in the car that hit his kid, and your best friend was driving, and wasn’t paying attention
because he was messing around with the radio. You lied for your friend, saying the kid jumped
into the road, and nobody really believed you but it raised reasonable doubt and he
didn’t get convicted. Now hitting someone by accident, no matter
how negligent you were, is not murder but the father saw it that way and wanted revenge. He killed your friend and switched memories
of it with you. Now if this was a novel I’d probably have
it turn out to be that you were the one who actually was driving and your friend took
the fall for you, and you lied to protect them, and you and the kid’s dad would end
the story staring at each other in absolute horror, and either shooting yourselves in
self-disgust or shooting each other in blind hatred. It’s not a novel though, so we are still left
trying to figure out who to charge with a crime, and now which ones. We’ve got perjury, murder, and abduction,
not to mention whatever forcibly taking someone’s memories away or forcibly implanting them would qualify as, which I suspect would end up being viewed as something at least as bad as rape. Now insanity defenses and mitigating circumstances
can all play a role afterward but you first have to decide who is actually getting charged,
and with what. This is just as bad with hive mind scenarios
where the individual identity is still maintained. There’s, say, twenty of you who share a
communal mind, exchanging memories and able to mentally talk to each other, but still
essentially unique people. The memories aren’t shared as fully perhaps,
so you don’t converge entirely into one personality. One of the twenty joined because he’s got
anger issues and being part of a loose group mind helps him keep that in check, and one
day he experiences one of his score of mental kindred getting abused and it triggers him
to go kill the abuser. Now in and of itself, that’s not complex; it wouldn’t be that weird for a friend to relate a crime committed against them and for that to send someone off to get revenge on their friend’s behalf. Their friend didn’t ask them to, and so
is guilty of no crime, and revenge might come up as a mitigating factor in sentencing but
the murder still took place. It is a bit different if you’re sharing
a memory though. For instance, if someone attacks me and I kill them during the attack, that’s self-defense; if I go chasing after them afterwards, now it’s murder. Shooting someone in the middle of an argument
is not the same crime as walking out to your car and getting a gun, then coming back and
shooting them. It adds premeditation. But if your hive-mind buddy is busy thinking
how much they want to kill that person and that’s leaking into your head with vivid
clarity, it starts getting a bit dubious what the situation is. I suspect such contemplations will be nightmares for any judge who has to make the first ruling on cases like that. I bounced something similar off a friend of mine who is a judge, noted for having a good head for that kind of esoteric thinking, and she just laughed and said she had no idea. Law and science fiction together make for some very interesting
concepts and conundrums that I feel tend to get neglected in the genre, or treated over-simplistically. Wasted opportunity, I think. Whodunit murders are certainly popular in scifi, but not so much the actual trials. So I go to the doctor and get a bad diagnosis: it’s cancer, it’s malignant, and I’ve got two years left tops. The next day I wake up and they tell me it’s the year 2100 AD, and the original me had a memory transfer done, meaning to update it till the moment of his death while I was being grown or stored in stasis, but they found out that the clone, me, had a fatal genetic weakness pop up. He, the original, had me stored on ice until
they found a cure. He also got very bitter, and sometime a year later shot someone and then died in a shootout with the police. Now I don’t have those memories, but we’ve
already argued that just deleting the memories of a crime or even the motivations for it
too, doesn’t necessarily make you innocent of it. But let’s say you think right now I am,
what if they come up with the big black box that had my memories all the way up to death
on it. If I am innocent right now, can we actually
argue I’d stop being innocent if I downloaded those into my brain? And if that does make me guilty, wouldn’t
that also make anyone else who downloaded them guilty, whether it was voluntary or not? Like with our passenger who got abducted by
the father of the kid his best friend had run over? It doesn’t actually matter if it was voluntary or not; that’s a separate crime and problem. If now having those memories makes me guilty, then anyone else having them is also guilty, and not having them makes me innocent. So deleting my memories of a crime makes me
innocent. This is a good way to tie your brain up into
knots, and as I mentioned before, I’ve got no answers. This is stuff to think about, because it extends beyond that. Making yourself smarter, for instance, is very like mind uploading, or like replacing bits of your mind with computer parts too. It could be done gradually or instantly, but you can make a very good case it’s not you anymore. And a very good one too, because suddenly jumping
someone’s IQ up a few orders of magnitude is probably going to seriously change their
personality. Arguing over whether a duplicate of you, down to the last memory, is you is maybe somewhat semantic. Ditto pulling out some memories or adding others in; a straight duplicate shouldn’t result in any change of behavior except gradual divergence between you and that clone as you develop different new experiences. But usually when we talk about extending life in a transhumanist sense, we’re talking about changing the mind too. You can make a pretty good case that massively
upgrading your mind or merging it partially with others is creating a new person and arguably
killing the old one. Some might view it, especially major changes,
as tantamount to suicide. Others might argue that it’s no different than a metamorphosis from child to adult. Jessica isn’t the same person she was when she was a little girl, any more than the clone she made of herself ten years ago, complete with memories, is the same person she is now. Yet there was no death there, and Jessica in the year one million AD, as some giant planet-sized computer orbiting with a few million other megaminds around Alpha Centauri as a form of Matrioshka Brain, is probably not the same person anymore. Of course if things like identity and consciousness
and free will genuinely are illusions, things we trick ourselves into believing to maintain
sanity, I sometimes wonder if there is a maximum amount of intelligence something can have
before it can’t trick itself anymore into believing in them and just shuts off, not
even regarding it as a suicide. It doesn’t help that a mind that big and potent, like we discussed in the episode on Matrioshka Brains, has no
problem simulating quadrillions of human minds simultaneously and could run through all their
lifetimes, again simultaneously, in one second. If we can imagine ourselves getting bored
after a few thousand years of life, to the point that death is just fine, imagine what
that would be like for something like that. And your brain can’t store endless memories,
but it probably can be modified to store millions of years’ worth without a problem, probably
a lot more. You might need to archive a lot of them or
zip them down to take up less memory, but that itself shouldn’t be an issue. In fact it ought to be so little of an issue
that if we do figure out how to reasonably seamlessly integrate memories from other people
into ourselves too, and run ourselves at higher subjective rates of time, overclocking
our brain as it were, you’d expect there to be a huge market for experiences. I mean we could probably build a simulation,
a virtual reality, of being the first person to set foot on the moon, but is that actually
the same as feeling the dread of being a quarter million miles from home in a thin tin can
or the sheer exhilaration and awe of being the first person to leave your footprint there,
of remembering the feel of lunar regolith under your feet? So it isn’t hard at all to think folks might
buy up all the memories they can from other people. We’ve never really talked about currency before in a futuristic context, though we probably will do a video on cryptocurrency pretty soon here, but we tend to figure the main commodities are energy or mass or processing power, and memories might easily be in there too. And if you’ve got the capacity for it you might cram an awful lot of them in there, and, people being people, memories of crimes
might be a big commodity too, openly or on a black market. Which raises the issue again of whether the memory of an event, like a crime, makes you guilty of it, or a victim of it if you remember the other side of it. And identity could get really blurry if a billion
people remember winning the Olympics; if me uploading my mind to an android makes that android me, it’s kinda hard to claim that me downloading the memories of someone who won the Olympics doesn’t make me that person too. That also means that to save space a shared
memory might be stored in just one place, or a few for redundancy, and folks might end
up with some equivalent to cloud memory storage. All of which leads to the idea that civilizations
where the people live a really long time and can move memory around and store lots of it
might get incredibly touchy about the notion of identity theft, particularly in this especially
literal way. I could well imagine it being taboo or a crime
to voluntarily transfer memories or mix memories or delete memories or run multiple copies
of yourself. One person, one lifetime, one unique set of
memories, even if that lifetime was essentially indefinite. They could get a lot more touchy about that
too, once on the pathway where uniqueness is important and violating it gets seen as
the worst sort of crime. Not just, ‘you made a copy of me, you took
my memories for your own’ but ‘you are impersonating me’. There is, after all, only so much true uniqueness, just as me tweaking a few things about you doesn’t change who you are, since that sort of tweaking is constantly happening and your identity is a wide spectrum. If I took a book by Alastair Reynolds, say
House of Suns, one of my favorites that plays with mass cloning, long lifetimes, and identity,
if I took that book and changed just a few words, that is still plagiarism. Hands down. On the same idea, if I dye my hair blond like
it was when I was a kid, I’m still me, and if I’d chosen to reference Peter Hamilton, who also incorporates these kinds of themes in his work, instead of Alastair Reynolds, this is still the same video. So identity isn’t some discrete state with
clearly defined borders, and there could actually be a limited number of them available. There are infinite shades of gray, for instance, or may as well be, and the lines between gray and white and black are hazily defined, but there are only so many genuinely, substantively unique shades. And a civilization may decide that number is, say,
one quadrillion. There are only about a quadrillion reasonably
unique personalities. Try to make a new one and someone says ‘that is identity theft, I want her deleted, she’s impersonating me’, and they might actually do that; after all, in their eyes nothing is being lost, there’s no new person being killed. I think that’s pretty dark, but I could envision
how something like that could happen, because I can see myself being angry, and we can all
see ourselves being angry, at someone who tries to mimic us when it goes beyond a bit
of flattery. In such a situation killing that copy might be seen as self-defense of your own identity, the only thing that probably matters to a transhuman society where, if someone lops off your head, they can just dump you in a new body and might just send the culprit a bill for that and a fine for the inconvenience and trauma. So I mentioned a new Fermi Paradox solution,
and it ties to that. I sometimes get asked whether, after the Dyson Dilemma, which as I’ve mentioned isn’t a Fermi Paradox solution in itself, just a challenge to many of them that points toward one solution being more likely, I’ve come up with any solutions that still fit the criteria it imposes. Specifically, the notion of eternal expansion
of a civilization till it runs out of new and uninhabited places to be. That’s essentially it: I can imagine that a civilization might permit only one copy of each person inside itself, and that would impose a maximum population on them. I can also imagine them aggressively enforcing
that. With the Fermi Paradox as I’ve mentioned
before, it doesn’t matter if a civilization feels it has enough people if there are some
dissenters, since unless they are willing to kill anyone trying to leave to set up shop
elsewhere some will do that. If you’ve come to determine there are essentially
only so many unique identities and that it takes less than a Dyson Sphere to support
all of them, and if you’ve come to view copying or impersonating that identity as
the worst of crimes, yes, you would send missiles and warships after anyone trying to flee your empire to set up their own new place with new people. Because you would view those people as the
worst sort of criminals. But this only works as a solution if they come to see identity with that degree of passion, and only if that’s an inevitable thing virtually all civilizations go for. I don’t think that’s likely; I certainly
hope it isn’t because it’s kinda dark, but I could see it happening. It would also only work if the number of unique
identities was less than one solar system or a handful could handle, though considering
how many human-level intelligences we calculated you could cram into a Matrioshka Brain, that’s not much of a cap. And irritatingly it fits in rather nicely
with the Doomsday Argument, since you then have a set and finite number of people. If there’s a maximum of, say, a trillion identities, us being among the first hundred billion of them is not particularly improbable; that would put us in the first ten percent of everyone who will ever exist, a one-in-ten draw. If those folks are immortal the count ends
but it arguably doesn’t matter if they don’t since if Bob, with his specific set of twenty
key personality traits, does die and they replace him with someone of those same traits,
it’s kind of debatable if they actually replaced him or just resurrected him and the
count remains the same. And we can play a similar matching game with
the Simulation Hypothesis obviously, though we’ll stop here for today. I don’t care for this solution as I mentioned,
but of all the ones I’ve tried to slap together after coming up with the Dyson Dilemma it
is the only one that I feel has sufficient consistency to earn a place as one. We’ll call it the Uniqueness Solution, though
I’m tempted to call it the Uniqueness Holocaust solution, but I suppose I’m obliged to try to be neutral and open-minded even toward my own theories. Summary version: we still don’t understand
identity and consciousness yet, and maybe we never will, which doesn’t permit us to speak with much certainty about things in the future that revolve around them. Hopefully you’ve come away from this episode
with some more questions about this key topic of philosophy and science. Incidentally in regard to last week’s video
on Tabby’s Star, there was a new paper on the long-term dimming that came out right
as the video did, and I ended up directing folks to Centauri Dreams to get Paul Gilster’s take on it. There are probably going to be a lot of developments on that matter, and his is an excellent place to go to stay up to date on those, and you can subscribe to his site to get emails of the material as it comes in. Centauri Dreams is one of my own favorite
sites and he’s very good about presenting the material in-depth without dumbing it down
while still making it very easy to read. So check him out if you want updates about
that crazy star. So, channel updates. First, the website is up, and that is IsaacArthur.net; doubtless it will be tweaked or added to as time goes by, but I’m very happy
with it and very grateful to Luis, the gentleman who volunteered an awful lot of his time to
making it happen. With it in place we’ll move on with some
other expansions like getting the channel its own Facebook page, and I did open a Twitter account for the channel, which was my first time using that, and I’ll look at some other
options like Reddit. Starting last week I uploaded the episode narrations to SoundCloud, and everything is up there now with each series having a new
audio-only introduction, some of which amount to short episodes. I’m still surprised folks wanted an audio-only version, but it is available. Speaking of that, you may have noticed that in this video I didn’t remind people to turn on the subtitles or mention that they can click on video links; that sort of reminder isn’t compatible with the audio-only versions, and I think a quick pop-up on the video will do. I might mention it out loud in a gateway video
that I think will have a lot of new audience members, but mentioning it almost every time is getting on my own nerves, and I think on a lot of the audience’s too. Of course the subtitles will stay, as will
the video links, and I’m looking at putting them up in other languages. A professor was kind enough to hand-translate one of the videos into French, though I still haven’t figured out how to get the timing right so I can upload it, and he’d suggested I might do that for all the videos and in as many languages as possible. I love the idea; more than half my audience
is not from the US, where I make my home, and most of them aren’t from English-first
countries, so having translation available as subtitles, even if I have to resort to
dubiously done auto-translations, is very tempting if I can figure out how to do it by some reasonably quick method. If anyone happens to know of one, let me know. I’m also adding Frequently Asked Questions
to each video on the website; those will emerge as time permits, some are already done, others might take a week or two. Also on the webpage is a donations page link. That will take you to the options of Patreon for continual donations, but also now PayPal for one-time donations and Bitcoin for those who prefer that. Next week’s episode will be a look at Dark
Matter, as we prepare to do some cosmology discussion to clear up confusion folks often
have about the size of the Universe, Dark Energy, Dark Flow, Galactic Superclusters
and Walls, and Voids. I’d thought about covering both Dark Matter and Dark Energy at the same time, but they truly are different topics, and I’ve been letting the videos get too long again; I can’t realistically keep up the half-hour-plus video a week I’ve been doing of late and need to start breaking things into smaller bits. As always, questions and comments are welcome. Subscribe to the channel if you want alerts
when that and other videos come out, and if you enjoyed this episode, share it with others,
and try out some of the other series and episodes on this channel. Thanks for joining me today for this look at Consciousness and Identity, and we’ll see you next time!