(light music) (applause) I'm here to talk to you about
some of the lessons I learned at CIA over my
32-year career there that saved my butt from really
some embarrassing mistakes, although, I nevertheless
made embarrassing mistakes while I was at CIA and
I've entitled my talk, Survival Heuristics. So, just a little about
a (mumbles) thing here. I did work at the CIA,
I spent 32 years there as an analyst and as
a manager of analysts, and here I am in only one
of two possible countries, which one is it? Afghanistan, I heard
the F somewhere, and these two guys here are... Protecting me and I
carefully cropped the photo so that you can't figure
out the plane number, and I spent 32 years there. I worked on Southern
Africa and the Middle East and then I eventually
became a Pooh-Bah and hardly had to work at all. As I went through my career, I became less and
less interested in any particular
part of the world, and much more interested
in how we think. How do we generate good ideas? How do we examine an
issue comprehensively? And so, I became really
interested in that and as a result, I sort of
became a heretic at the CIA. I was thinking too much, right? And I was always
suggesting different things that we could do, different
approaches that we could take. In fact, when the
internet reared its
head in the mid 1990s, I was the person at
the CIA who was saying, "You know, the internet is
gonna be big, it's gonna change "the way all knowledge
organizations do their work." In fact, it actually
changed the way all organizations do their work. And at the time when I was
arguing that in the mid 1990s, people were not very
receptive to it at the CIA. I didn't realize,
and this is something that I get into a little bit
in my book, Rebels At Work, which is co-written
actually with Lois Kelly, is that when you're
trying to make change in an organization, and
this may have some relevance to being a cyber security
officer in a large organization, when you're trying to affect
change in an organization, it's very difficult to do it if what you are presenting
is a theological argument. So, the internet is all about
open information, right? At least it was in the 1990s,
and the CIA is all about... Right, closed information
and arguing the benefits of something that
would promote openness and sharing of information to an organization
that was so closed, I might as well have
been like Martin Luther, nailing his theses to the door of the
church in Wittenberg, right? That's how receptive they were. So, some of the
lessons about that are captured in the
book, Rebels At Work, which some of you, some
unlucky people I guess are going to get today. But the rest of my talk that
I really wanna talk about are the lessons I learned
about good thinking and avoiding the traps
that lead to bad thinking. So, are you good with that? (hooting)
Yeah? Alright, let's go, so... I'm just gonna go through
a few of the traps. Avoid the Streetlight Effect. Raise your hand if
you have some idea of what I mean by the
Streetlight Effect. There's a few,
okay, but not many. So, there's a joke, and the
politically incorrect version of the joke is that
there's a drunk at night on a sidewalk
on his hands and knees. And the person doesn't
have to be drunk, right? It's just something
they threw in the story. And a policeman finds
him and says, or her, "What are you doing?" And the drunk says,
"I lost my car keys, "I'm looking for my car keys." And the policeman says, "Is
this where you lost 'em?" The drunk says, "No, but it's
the only place I can see." So, that's the
Streetlight Effect where we as analysts
treat the information that is in front of
us, the streetlight, as if it accurately
represents reality. And I have to say and
I'm ashamed to admit it that I was well
into my CIA career, probably 20 years
into my CIA career, which gives you an idea of how effective their
internal brainwashing is and/or how unintrospective they are, before I said, "Wait a minute, "this information that
reaches me through my inbox, "I don't actually know "how accurately it
represents reality." Actually, I don't know what
share of reality it represents. Is it 5%, is it 20%? I mean, if you think
about it yourself as cyber threat analysts, if you were God and
you were omniscient, and you knew everything
there is possible to know about cyber threat, now compare that
to what you know. What percentage of everything that you should
know do you know? And yet, you're forced to
make decisions on that, you know, you have no choice but to make decisions
on what you do know. But the only point here that
I wanna make is be humble about those decisions
that you're making. Avoid the Streetlight Effect. Always realize that
the information you have in front of
you represents a slice, and a kind of a flawed slice
'cause it's biased too, and I won't get into that
for the sake of time, just represents a
slice of reality. So one, avoid the
Streetlight Effect. Trends are always about the past. I hate it when people present
an argument about the future based on trends of the past. Trends are composed of data
points and so by definition, if it's a data point, it
has to be about the past. So, anyone know what this
trend line represents actually? It's a very famous trendline
in American history. It's the trendline of
the Great Depression. So, the first severe
drop is October 1929 when the stock market crashed. Notice that it began
to go up after that and a lot of people thought, "Oh, it's over,
everything's gonna be fine." And then it just
steadily plummeted down until I think the last point is maybe 1936, 1937 on that line. So, at any point
on that trend line, if you had based your decision on what had happened previously, you were likely to
make a flawed decision. So, what do I like to
use instead of trends? Well, I'm a fan... Okay, so first you have
to use trends, right? You have to know what
the past tells you. But if you know anything
about probability, what happened a moment
ago doesn't really affect what's gonna happen next. It's a false construction
that we have in our heads. And what I like to look at a lot are small indicators of change. I think the best way
to observe the future is to observe the present
very, very carefully. So, when I travel, for example, I pay a lot of
attention to graffiti. Graffiti is one of my
favorite social indicators. Just the volume of graffiti
in a particular city like Madrid, for
example in Spain, tells you something
about the social cohesion of that country. And from what I know about
cyber threat intelligence, perhaps this is the
point for me to confess that I don't know
a lot about cyber, I didn't work on it
in my agency career, but from what I understand of the way these
malicious groups work, they do have a kind of graffiti that exists, that
they leave behind as they troll companies and
as they talk to each other about what they wanna do next. So, I like small indicators, I describe myself as an
analyst of small things. And kind of related, but a
somewhat expanded category are non-obvious indicators
of whatever phenomenon that you're trying to look at. What can you see that is
not a direct indicator, but somehow travels
with the phenomenon that you're trying to look at? But my bottom line is
don't depend on trends. Related, most things
don't happen by chance. Now you know what? I've said this before
at other conferences and I've been
appropriately challenged by someone who points out
that you do have things like random clustering, correct. How I really came
upon this rule, or thinking trap to avoid, is an analyst actually, when I was reviewing
his or her paper, actually wrote that
X event in a country had happened by chance. And I pulled the
analyst in and I said, "Well, what do you
mean by the fact "that this thing
happened by chance?" And the analyst was you
know, hemming and hawing and I realized that
when we use the phrase, something happened by chance, what we're actually saying is that we do not understand
the causality chain that led to this event, right? So, when you say something
happens by chance, you're saying, "Well, there's
no way I coulda known, "therefore I'm not
responsible to try "to figure out
how I could know." If you replace that phrase with, "I do not yet understand
the causality chain," then in fact, you are
much more likely to work on trying to figure
out why things that you didn't think
were gonna happen, or that completely surprised
you, in fact did happen. So related to most things not
happening by chance is this, which is exponential causality. Now, I've scoured the
internet looking for an image that could represent
exponential causality. At this point,
the Grand Taxonomy of RAP Names is what
I've settled on. If you Google that
and look it up, you'll see that one rapper
comes up with a name and all of a sudden, that influences a whole
lot of other rappers, which then leads to a whole
lot of other rappers and this great universe
of unusual names emerges. Now, exponential
causality is related to the notion of exponential
functions, of course, and how many of you have read, Thinking Fast and Slow
by Daniel Kahneman? Required reading, I think, for a serious
intelligence analyst. He makes the point
that one of the reasons why humans are so bad
at statistics, me, is because we're not very good at understanding
exponential functions. But related to
exponential functions, how something can go from
like three to 15 in one step, is what I call
exponential causality. So, we tend to think
linearly, unfortunately. It's something that our
education encourages us to do and the way we look at
facts encourages us to do, but we tend to
think A leads to B, B leads to C, so
forth and so on. But in reality, A leads to a
multiple set of consequences and each of those leads to another multiple
set of consequences, and if you keep up
on quantum theory, it's even argued that something in the future can
affect the past and affect the consequences
of the thing in the past. So, you're dealing with
exponential causality and you have to respect that. And trend lines, actually, are a great example of
something that we use a lot that doesn't take into
account exponential causality. Are y'all still with me? Yeah? Okay.
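To put rough numbers on the difference between linear and exponential causality, here is a minimal illustrative sketch; the branching factor of three is an invented assumption for illustration, not anything from the talk.

```python
# Illustrative sketch only: linear vs. branching ("exponential") causality.
# The branching factor of 3 is an invented assumption for illustration.

def linear_consequences(steps: int) -> int:
    """A leads to B leads to C: one new consequence per step."""
    return steps

def branching_consequences(steps: int, branching_factor: int = 3) -> int:
    """Each consequence spawns several of its own: b + b^2 + ... + b^steps."""
    return sum(branching_factor ** k for k in range(1, steps + 1))

for steps in (1, 2, 3, 4):
    print(steps, linear_consequences(steps), branching_consequences(steps))
# Four steps out, linear thinking expects 4 consequences; branching gives 120.
```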
Worst case scenarios happen. Now, I love this gif, I will play it over and over again. Hopefully it'll
just, there it goes. My favorite thing
about this gif, or jif, there's a controversy as
to what you should say, if you look at it again, is the car approaching
the intersection. The driver, I love to imagine
the driver in that car going, "Large red ball has
appeared in front of me." You know, something he
had never prepared for. This is apparently a
Toledo art installation or something like
that gone wrong and the large
rubber ball escaped. I don't think I can
stop it unless I advance to the next slide, so we'll
just leave it like that. Anyway so, I learned this lesson as a result of a conversation
I had with a policy maker. And bad stuff had begun
to happen in the country, you might even imagine that
country as Iraq, for example. And there's a meeting
with the policy maker and the policy maker goes, "You
didn't warn me about this." And we say, "Well, we did, we
told you this could happen." And the policy maker says, "But you said that
was a worst case." Interesting, so what was
the policy maker doing? What was he assuming we meant
when we said worst case? Unlikely, so policy
makers often conflate the worst case scenario
that you've painted and think to themselves,
"Oh, that must be unlikely." So, one of the most
important pieces of advice that I have to give you all, particularly those
of you who deal with decision makers
in your company, and who have to convince
them of the importance of a security threat
of one kind or another, that just because you say
something is worst case doesn't mean that it's
unlikely and in fact, the probability of something
occurring is independent from the consequences that
that event may have, right? And it's human nature to equate worst-case scenario thinking with unlikely; you have to fight that really hard. They're completely different things and can
cause all sorts of disconnects and communication issues
with your customers.
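One way to keep those two axes from collapsing into each other when you brief a decision maker is simply to score them separately. The sketch below is purely hypothetical; the scenario names and numbers are invented for illustration, not a standard scoring scheme.

```python
# Hypothetical sketch: likelihood and impact are independent axes.
# Scenario names and all scores are invented for illustration.

scenarios = [
    # (name, likelihood 0-1, impact 1-5)
    ("credential phishing of one employee", 0.60, 2),
    ("ransomware across the whole network", 0.20, 5),  # "worst case" != unlikely
    ("defacement of the public website",    0.05, 1),
]

for name, likelihood, impact in scenarios:
    print(f"{name:40s} likelihood={likelihood:.2f}  impact={impact}  "
          f"risk={likelihood * impact:.2f}")
# The worst-case row (impact 5) is not automatically the least likely row;
# the two numbers have to be estimated and communicated separately.
```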
Finally, we get to move on from that. Again, this is also about
dealing with your customers. When you're explaining,
you're losing. I think it was Ronald
Reagan, when I Googled this, Ronald Reagan comes up as
the originator of this quote. "If you're explaining,
you're losing." But when we as technical
experts present whatever our findings are, we tend to present them
with a lot of facts and a lot of information, and we're doing that
to a policy maker who wants simply
to make a decision and the very act of explaining, the more you go on,
the more you drone on, I think the more you're
losing, as a general rule, the person that you're
trying to persuade. This is particularly true
when an argument ensues. You say X and the
policy maker says, "No, I don't believe
X, Y is true." And then you start to explain
why the policy maker is wrong. And you ain't getting
anywhere there. So my advice to you, and we're
gonna talk a little bit more about best practices
in a little while. But my advice to you when
you're talking to a policy maker is that you need to have
compelling stories and examples to make the point
that you want to make and have those ready to go rather than another
technical explanation. I don't know what the analogs and metaphors are
in your domain, but you need to
have those available so that you can present them rather than only rely
upon explanation. So for example, a policy maker might say,
"Well, do you have any evidence "that this could happen?" And you might not have any
evidence, any hard evidence. And you could say, "No,
I don't but you know, "we wanna avoid being
like the drunk looking "for his car keys only
in the streetlight." Something that really
makes the point without having to deal with
a lot of technicalities. And finally, I think this is
the final one of my traps is you never run out of bullets. This is again, I
guess another way to come at linear thinking, and I
think this might really apply to your domain. One of the lessons
I learned early on from one of my
first team chiefs, he had been working
on Laos, I think, and he told me, "Carmen,
I had it figured out. "I had counted the
number of bullets "the Laotian guerrillas have, "and I knew that on April 13th, "they were gonna
run out of bullets. "They weren't gonna have
any more bullets to fire." And my boss at the
time said to me, "John, you never
run out of bullets." i.e. life happens,
stuff happens, and they find a
way to keep going. So, when you think about what
you're working on analytically and you've got it, like the
whole linear train figured out and you know that on this day, the attack is gonna
end or on that day, the attack is gonna start, step back, swallow a
little bit of humility, and realize that you
never run out of bullets. Still with me? Alright, I just like
to check, you know? Alright, oh, one more,
speaking of bullets. I ruined my segue,
emotions can kill. Has anybody here ever
worked at CIA, actually? I should have asked
this question earlier. Ah, I see a couple
of hands, yep. So, one of the things
that we did as analysts is we tried to be
extremely objective and we didn't wanna have any
kind of subjective issues injected in our analysis. And again, as I went
further into my career, I realized that
that was mistaken because the world
is made of humans and humans are of
course highly emotional. And we often would
misjudge a situation or the severity of an outcome, or what happened in
Iraq, for example, or what's happening
today in Afghanistan, because we underestimated
the emotional motivation that the actors had
in the conflict. I was thinking how this might
apply to what you all do and I understand that a lot
of what happens in terms of... Successful attacks on a
system involve getting a human to click on a
piece of click bait that brings the virus or
the malware into the system. And you can say to the
human over and over again, "Don't click on attachments." But you know, if they
love kitten pictures and they are having
a really bad day, the chances are heightened
that they are gonna click on that thing that
appeals to them. And of course, there's
this entire field of, see if I get it
right, social cyber... Social cyber security or social
cyber attacks where in fact, and we saw some of this
in the 2016 elections and in the Brexit elections where people who are
trying to influence how you act or how you
vote are very methodical in presenting information to you in a way that will appeal to you so that you will begin to
believe or accept the story that they want you to accept. And all of that has
emotional resonance, that's what they're
appealing to. So, emotions can
kill, and if you're not taking emotions into account in how you're thinking about your cyber threat environment, you're underestimating
your opponent. So, I wonder a lot
for example about how emotions might play in terms of how people perceive
the US or whatever, in terms of the different actors that might enter the
cyber threat environment in the months and years to come. Okay, now we're talking
about survival strategies. Construct an analytic landscape. Now, that is actually my dryer underneath that little
thing, a picture of it, and this is the landscape
of the future of warfare that some analysts
who were working for me 15 years ago built. What's interesting about
this is it was an assignment where I said, "Okay, I
want you all to step back "from your day-to-day work
and I want you to think "about every element involved
in the future of warfare." And I gave them two
rules, "And I want you "to build an analytic
landscape of your issue." And they said, "What's
an analytic landscape?" And I said, "I don't know,
and I want you to build it." And I said, "There's
only two rules. "You have to consult outsiders
in building your landscape," In other words, it can't be
just internal group think, "and you have to
present it graphically. "I don't want us to be... "Kidnapped or held
hostage by words, by text. "I want you to present it,
you could present it text, "but I also want you to
present it graphically." So, they built this using
kind of a Delphi method where they asked people
to rate different aspects of the issue, (softly
mumbles off mic) So, the spectrum of
conflict was the bottom axis, so it's a 3D thing
and they go from rock throwing to nuclear war, see if I can see, what's
the one on the side here? I can't read it. (audience members mumbling)
Adversary Capability, of course, we go
from low to high, and importance to the US
or to whoever your actor is, again from low to high.
imagine that this peak in the middle could
conceivably have been North Korea. Nuclear warfare,
mid-level capability, but highly important to the US. The point of this is not
this particular landscape, although I think it's very cool and our customers loved it. The navy wanted us to
build a simulated version that their jets
could fly through or something odd
like that, right? (laughing) But you know, it
captured a reality about the future of warfare that you could not
represent just with words and it forced them to think
about every aspect of it. And what's important
about this is that it helps you avoid
the Streetlight Effect. You're only working on this because this is what you
have the information on. By building an
analytic landscape, another dimension that
you could overlay on it is how well do we know each area? You could have colored
the mountain ranges
by a color scheme that indicated I feel confident about my information
in this area, and I need to know a lot
more about that area.
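As a rough, hypothetical sketch of what that could look like in data, here is a toy landscape using the talk's dimensions (spectrum of conflict, adversary capability, importance) plus the suggested confidence overlay; every label and value below is invented for illustration.

```python
# Toy analytic landscape. Dimensions follow the talk (spectrum of conflict,
# adversary capability, importance to the US) plus a confidence overlay;
# all cells and values are invented for illustration.

landscape = {
    # (conflict_level, adversary_capability): ratings
    ("rock throwing", "low"):  {"importance": 1, "confidence": 0.8},
    ("conventional",  "mid"):  {"importance": 3, "confidence": 0.6},
    ("nuclear",       "mid"):  {"importance": 5, "confidence": 0.3},  # the "peak"
    ("insurgency",    "high"): {"importance": 4, "confidence": 0.2},
}

def blind_spots(landscape, min_importance=3, max_confidence=0.4):
    """Cells that matter a lot but that we know little about --
    the places where the Streetlight Effect is most dangerous."""
    return [cell for cell, v in landscape.items()
            if v["importance"] >= min_importance
            and v["confidence"] <= max_confidence]

print(blind_spots(landscape))
# [('nuclear', 'mid'), ('insurgency', 'high')]
```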
Once you develop that analytic landscape, you then have a tool
for shifting attention and for making sure that
you're not ignoring areas that could be important. And I should change this
slide and it should read, Construct and Revise
Your Analytic Landscape 'cause one of the problems
with all these tools, 'cause you know, we're weak
human beings, we're flawed. No matter what tool
you hear about, you're gonna use it as a crutch. And all of these
tools have limitations and they particularly
have limitations if you overly rely on one or if you use the same
analytic landscape like for five years in a row. One of the things I learned
about my analysts at CIA, or they're not mine, I don't
own them, the analysts at CIA, was that we had a
program that brought our classified information
into our inboxes based on our areas of interest, and it was sort of like
this Bayesian kind of thing, this and not this. And one day, I asked
the analysts, I said, "How often have you
revised your search query?" The profile that brings the
information into your inbox, what do you think I heard? (audience member shouts off mic) Five years, never, "I didn't
know you could revise it." "I'm using the one the
senior analyst gave me "when I started
eight years ago." It was apoplectic, but
such a simple thing of such significance had
long been overlooked. So, construct and
constantly revise your analytic landscapes. Apply a category
system of some kind. I actually think the first step in analysis is categorization. It's a necessary step but
can be a very dangerous step because we live and die by
the categories we create. Now this is my favorite first
order categorization system. Who knows what this is? No? This is the Cynefin
framework, does that help? C-Y-N-E-F-I-N? No, look it up, C-Y-N-E-F-I-N,
it's a Welsh word I think. There's so much information
on this on the internet. But it's a guy named David
Snowden, no relation. (laughing) And he came up with this
very simple heuristic device that your problem
can fall into one-- Oops, sorry, went the wrong way. Your problem can fall
into one of four spaces. It can be simple, so
that's the problem where linear analysis
really helps. It can be complicated, so
it's still cause and effect but there's a lot of causes
and effects, so it's like the famous Stalin
quote that quantity has a quality all its own. So, it just takes more time. Complicated problem
sets are the ones where experts are
really helpful. You don't really need
experts when it's simple, but experts can really help
you when it's complicated. Complex is when cause
and effect is unclear and generally cannot be
determined in advance. Complex problems
are the problem set that experts can fail you
because they will apply, confidently apply
their expertise and be fooled by randomness. And then chaotic is
where you never wanna be. Chaotic is where there's
seemingly no order and often it happens when an
existing system breaks down. And for a while,
it's a free for all and you're not
really sure what are the important casualties
are going to emerge. I was thinking when I threw
this up, where is cyber? Okay, so correct
me if I'm wrong, I sort of thought a lot of
cyber threat intelligence might be complicated,
in other words, that you sort of understand
the causes and effects, but there's so many of them that they're very
hard to keep track of. But then sometimes it'll kind
of slosh over to complex. I don't know if anyone... How many of you think
it's complicated? I'll just have you
raise your hands. Ah-ha, how many of you
think it's complex? Okay, how many of you
think it's chaotic? Alright, okay, but anyway, it's just an interesting
thing to think about. If you think about your domain and maybe your particular
company's profile, then you could go,
"Okay, you know, "if it's simple trends,
okay, I can work with trends. "If it's complex, I
better built myself "out an analytic landscape
and because it's complex, "I better revise
it all the time." That's the way I would use this very simple
heuristic device. Know your thinking style. I know, who doesn't
love Winnie the Pooh? Oh, gosh, but I
know that I heard in the introductions
briefly a discussion about cognitive bias. And we each have to know
what our thinking style is. There are a few instruments
on the internet. I've used, it's free, the
Gregorc thinking inventory, G-R-E-G-O-R-C, it's
free and it's simple and it kinda puts
you, are you abstract, or a concrete thinker, or
are you random or sequential? I asked a group of intelligence
analysts to take it once and they all ended up almost exclusively concrete
sequential thinkers. So, what that told them, and they were really
kind of upset about it, is that they're vulnerable
to not seeing change because change, sudden change, is neither concrete nor
sequential, correct? Right, so understand
your thinking style. And your thinking style,
there's so many dimensions, I think that's why
there's not a good test. But one for example is
optimism versus pessimism. I'm an optimist by nature. I believe men and women are
basically good but feckless, so they end up doing
the wrong thing even though they
had good intentions. And so, I know that affects
how I think about problems. I'm one of those
who's likely to think that a worst case
scenario is unlikely. And so, just knowing that
helps me calibrate my thinking so I'm a little
bit more effective. Find a thinking partner. Now, because I'm
kind of an optimist, my favorite thinking partner when I was at CIA was an
Eeyore, a real pessimist. And we knew there was a
disturbance in the force when she was optimistic
about something and I was pessimistic
about something, that was like a
fascinating outcome to us. But do you have a
thinking partner? Do you have more than
one, really, a go-to, a group of people that you
can explore ideas with? I was part of the executive team that took over the analytic
program at CIA after WMD, 9/11, and the Iraq war,
that was a trifecta, and we spent a lot
of time introducing, and I'm sure you're
familiar with this, structured analytic techniques. And of course, as
I said earlier, when you introduce a new method, it tends to become a crutch. So, everybody was into
structured analytic techniques. What I like to say is, taking
a walk around the block with your favorite
thinking partner is as effective an
analytic technique as anything you can do. So, don't ignore the
analog analytic techniques and having a thinking partner
is a very important one. Deploy diversity of thought. In certain domains, diversity
of thought becomes harder because your domain
does not attract a full range of thinking styles. At CIA, for example, we had 100,000 plus
resumes every year. But I worried that they
did not represent thinkers who were very creative, or who were radical
in their approach. I worried that the only people who would put their
resume in for the CIA would be people who sort of saw
the world in black and white and didn't see a lot of
gray or a lot of color, and I think that your
domain is a similar domain where you have an attraction
bias where the people who come are self-selecting
themselves away. When I would go to universities, I would tell the audience
that I was talking to the people that I
really want to see apply are the ones who think they
could never work at CIA. You're the people I most
want, and rarely got them. So, deploy diversity of thought. I have a whole other talk
about diversity of thought and ways to think about
diversity of thought. I'm curious, how many of
you manage other people? Ah, a lot of
people, so I'll talk about this for a
couple of minutes. First, diversity of
thought is important. Organizations that
allow for a lot of different ideas
have better outcomes even when the
dissenters are wrong. So even when the people who are expressing a
different view are wrong, the organization still
has better outcomes and the reason is because
allowing for free-flowing debate leads everyone to
raise their game. The people that have the
majority point of view, or the majority opinion, they work harder when they know that their team
has vibrant debate. So, diversity of thought... Is important, leads
to better outcomes, particularly the more
complex the tasks, the more important
diversity of thought is, and contributes even when
the dissenters are wrong. So, the literature
shows that very clearly. The second point I wanna make
about diversity of thought to your managers is
that it's very hard to manage diversity of thought and none of you have
been taught how to do it. Most organizations want the
organization to run smoothly. When I was a manager,
people would say, "Oh, you never hear any problems
from Carmen's team, A-OK." But that's not necessarily
a good indicator because if you're going to
encourage diversity of thought, your team is gonna
be crunchy, right? There's gonna be
differences of opinion and you haven't been
taught how to do that. I don't know of any
management program that actually helps
managers manage differences of
opinion effectively. And not only do you have to
manage differences of opinion, you have to conduct yourself
so that your team members know that you welcome their opinion. So, a couple of points there. Very popular today
in organizations is
the stand-up meeting. Five or 10 minutes, stand
up, that's it, we're done. When you as a manager only have a five-to-10-minute meeting, what are you telling
your reports? That it better be really
important for them to mention it because clearly,
this 10-minute limit on the meeting is very
important to the manager. So, the way you conduct yourself helps or depresses diversity
of thought in an organization. Another thing that's
very common for managers to do is to
talk for 45 minutes, which I'm about to do, and
then say, "Any comments?" What happens when you talk
most of the time in the meeting and then you say,
"Any comments?" What do you get? You get crickets, right? The Any Comments is a sign
that the meeting is over. So, what could you say instead? As a manager, what
did I get wrong? How would you do it differently? What did I miss? Or even, why don't
you set the agenda for the meeting next time
and I'll sit in the audience? So, I could spend way too much time on this. But as a manager, the
way you conduct yourself, both physically and
with your words, has a lot to do with
whether or not you're able to operationalize
diversity of thought. Think together from the start. Okay, so you've got
diversity of thought, you've got different
people in your group, and then often what
happens is that you don't actually incorporate
the other view points until later in the process. So, when you collaborate
toward the end of the process, you're not collaborating at all. What you're doing
is deconflicting. Collaboration must occur
from the beginning. When you're close to the end
of whatever you're doing, the last thing you want
is someone else's opinion to derail you and make you
start again from alpha. So, when you collaborate and you deploy
diversity of thought, do it from the beginning
of your project, think together from the start. And finally, respect
your intuition. So, we talk a lot about method, but your brain has a way
of revealing things to you without articulating them
to your consciousness, and I'm gonna give
you an example. I've never actually said
this in public before because I think it's a
little, I don't know, you'll tell, you
can, you'll decide. But over the years,
as I drive around, I'm always observing, looking, and I've noticed,
I'll be driving and overtaking a
pedestrian who's walking, and as I overtake
that pedestrian, the thought will enter my mind, "Oh, that person is Asian." And then I'll go past
him and I'll look in my rear-view mirror
and they're Asian. They're Japanese, or
Chinese, or Korean, and every time I'm right, I
think I'm batting like a 6/6, my number on that, I may
have been mistaken once, and this is over
many, many years. It's been at least 10 years
since I noticed that phenomenon. To this day, I don't know
what it is I'm noticing. I don't know, and I'm not
gonna Google, you know... I don't even know
what to Google. So, I'm driving along,
winter or summer, I notice a person walking along, and I go, "I think
that person is Asian." And I think that it has
to do, I don't know, with the way their head
and their shoulders, or something relate
to each other. Whatever it is, I
can't articulate it. So, my brain has
noticed something that I'm not consciously
able to explain. That's what intuition is. Intuition is what I think, at least intuition as it applies to what we're talking about, intuition is when your
brain has picked up on something visually
'cause you know, your brain is best
at visual processing, but it can't figure out the
text to explain it to you. And so, I think it's
always important to respect that as well. And I hope that you
feel like this dog, happy and content. Thank you very much
for your attention. (applause)