Creating the sixth sense - David Eagleman, Baylor College of Medicine

Video Statistics and Information

Captions
So we are trapped between the infinitely small and the infinitely large, and we're not very good at understanding reality at either of those scales, because we didn't evolve to understand reality there. Instead, we're trapped on this very thin slice in between. But the surprise is that even at that level, we're not seeing most of what's going on. Take, for example, the colors of our world. This is electromagnetic radiation that bounces off objects and hits the back of our eyes, and that's what we call visible light. But in fact, what we're seeing is only one ten-trillionth of the available spectrum. The rest of it, radio waves and microwaves and gamma rays and X-rays, is all passing right through your body right now, and it's completely invisible to you, because you don't have the specialized biological receptors for picking that stuff up. It's not that these other frequencies are inherently unseeable: snakes pick up information in the infrared range, and honeybees in the ultraviolet range. But you don't have access to any of that, at least not yet.

What this means is that your experience of reality is constrained by your biology, and that goes against the common-sense notion that your eyes and your ears and your fingertips are just picking up the objective world out there. Instead, your brain is sampling just a small bit of the external world.

Now, when we look across the animal kingdom, we find that different animals pick up on different slices of their world. In the deaf and blind existence of the tick, the important signals from its environment are heat and butyric acid. For the black ghost knifefish, its world is lavishly colored with electrical fields. And for the echolocating bat, it's all about air compression waves. That's the slice of their environment that they pick up on, and we have a word for this in science: it's called the umwelt, the German word for the surrounding world. Here's the important part: presumably, every animal takes its umwelt to be the entire objective reality, because why would you ever stop to imagine that there's something beyond what you can sense? Instead, we all accept the reality that's presented to us.

So let's do a consciousness-raiser on this. Imagine that you are a bloodhound. You've got a long snout with 200 million scent receptors, and you have wet nostrils to trap scent molecules. Your entire world is about smelling. Now, one day you stop in your tracks, you look at your human master, and you have a revelation. You think: what is it like to have the pitiful little nose of a human being? What's it like to take in a feeble little noseful of air? To not know that there's a cat 100 yards away, or that your neighbor was on this very spot six hours ago? But because we're humans, and we've never experienced that, we don't miss it; we're firmly ensconced in our own umwelt. The question I'm going to ask is: do we have to be stuck there?

As a neuroscientist, I'm very interested in how technology might expand our umwelt, and how this will change the experience of being human. We're already getting very good at marrying our technology to our biology. There are hundreds of thousands of people walking around with artificial hearing and artificial vision. The way this works is you take a digital microphone or a digital camera and you plug the signals straight into the biology. At first, people thought this was never going to work. Why? Because these devices speak the zeros and ones of Silicon Valley, and that's not the same as the dialect of our natural biological sense organs. But the surprise was that it works just fine: people who are deaf and blind come to understand, because their brains figure out what to do with the signals.

Now, how can we understand that? The big secret behind the whole thing is that your brain isn't actually seeing any of this. Your brain is locked in silence and darkness inside the vault of your skull. All it ever sees are electrochemical signals; that's all it has to work with, and nothing more. But the brain is very good at extracting patterns and finding meaning in these signals, and from that it serves up to you your private, subjective experience of the world. The key thing is that the brain doesn't know, and it doesn't care, where the data are coming from. Whatever is fed into it, it just figures out what to do with it and how to extract meaning. I think what this means is that the brain is a very efficient sort of machine: a general-purpose computing device. Once Mother Nature figured out the principles of brain operation, all she had to worry about from there was designing new peripheral devices to feed information in.

This is what I call the P.H. model of evolution, where P.H. stands for Potato Head. The idea is that all these sensors that we know and love are just peripheral plug-and-play devices: you stick them in, and you're good to go; the brain will figure out what to do with the data. And when you look across the animal kingdom, you find all kinds of strange and interesting peripheral devices everywhere. Nature is the ultimate hacker space, and these are just a few of the ways that genetics has figured out how to channel information into brains. I think the lesson that surfaces here is that there's nothing special or fundamental about the sensors we happen to come to the table with. This is just what we've inherited from a complex road of evolution, but it's not what we have to stick with.

I think the best proof of principle for this idea comes from what's known as sensory substitution: the idea of feeding sensory information into the brain via an unusual channel, where the brain figures out how to make sense of it. This might sound like a speculative idea, but the first demonstration was in the journal Nature in 1969. Blind people were seated in a modified dental chair, a video feed was set up, and whatever was put in front of the camera, they felt in their back through a grid of solenoids. So you wiggle something in front of the camera, and you feel it. Blind people got really good at telling what was going on in the visual world just by feeling it through the small of their back.

There have been many modern incarnations of this. The sonic glasses turn a video feed into a sound landscape, so the sound changes as you move closer to and farther from things. It sounds like a cacophony at first, but people are able to figure it out. And it doesn't have to be just through feeling or hearing: you can put an electro-tactile grid directly on the forehead and turn a video feed into a feeling of little pokes on your forehead. Why the forehead? Because you're not using it for much else. People can figure out how to see this way. The most modern device along these lines is called the BrainPort, which puts an electro-tactile grid on your tongue, so blind people can learn to see through their tongue. They can get very good at this and navigate obstacle courses. Now, if it sounds crazy that you could see through your tongue, just remember that all vision ever is, is electrochemical signals moving around in the brain. That's it.

One of the projects in my laboratory is to develop sensory substitution for the deaf, and this is something I've done in collaboration with my graduate student Scott Novich. Here's the idea: we wanted to capture sound from the world and push it through some sort of channel, so that someone who's entirely deaf can understand what's being said. And we wanted it to run on our cell phones and be a wearable device. So here's what we did. This is a tablet, and what it's doing is capturing the sound of my voice and converting it into a pattern of vibrations on little vibratory motors on a wearable vest. And it turns out this is not just conceptual: I'm wearing the vest right now, and as I speak, my sound is getting turned into a dynamic pattern of vibrations. I'm feeling the auditory world around me.

We've been using this to test deaf participants. (Thank you.) This is Jonathan. He's 37 years old, he has a master's degree, and he was born profoundly deaf, which means there's a whole part of his umwelt that's unavailable to him. We trained Jonathan with the vest for four days, two hours a day, and on the fifth day, this is what we observed: Scott says a word, Jonathan feels the vibrations on the vest and writes on the board what he understands. Jonathan is able to translate this complex pattern of vibrations into an understanding. (Thank you. Actually, the applause feels good; it's like a massage.) Here's the thing: Jonathan's not doing that consciously; the patterns are too complicated. But his brain is unconsciously figuring out how to unlock the patterns for an understanding, and our expectation is that after wearing this for a few months, he will have a direct perceptual experience of hearing.

We've been very encouraged by our results with sensory substitution, and that's led us to do a lot of thinking about sensory addition. How could we pass in other kinds of data streams to create an entirely new sensation of the world and expand the human umwelt? For example, by passing real-time data from the net to the vest. Here's an experiment we're doing in the lab. This is a participant who's feeling a pattern of vibrations streamed from the net. After five seconds, two buttons appear on the tablet, and he has to make a choice. He makes a choice, and a second later he gets feedback: a smiley face or a frowny face. Now, he has no idea what these patterns mean or what he's doing; we're seeing if he gets better at making good choices on the tablet. What he doesn't know is that we're feeding in real-time stock market data, and he's making buy and sell decisions. So what we're seeing is whether he can develop a direct perceptual experience of the economic movements of the planet.

Here we're working with a quadcopter pilot, and we're feeding nine different streams of data into a vest while he's flying: pitch, yaw, roll, orientation, heading, and so on, to see if that improves his flight. And it does, because essentially it's like extending his skin up there. Instead of seeing all these measurements, he's feeling them. And this is where we're going: a modern cockpit is full of gauges, and it's very hard to take it all in with the visual system. What we're doing is making it so that a pilot can feel it, because there's a difference between having access to big data and experiencing it.

I think, in general, there's no limit to the expansions that are possible on the horizon. Just imagine if the astronauts on the International Space Station could feel the overall health of the ISS, or if you could feel the invisible states of health of your own body, like your blood glucose level and the state of your microbiome. Or being able to see in infrared or ultraviolet, or having 360-degree vision. The fact is that we're now in a position where we don't have to wait for Mother Nature's sensory gifts on her timescale. Instead, like any good parent, she's given us the tools we need to go out and define our own destiny. So the question is: how do you want to experience your universe? Thank you very much.
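The core idea behind the vest, turning sound into a spatial pattern of vibration intensities, can be illustrated with a coarse frequency decomposition: split each audio frame into frequency bands and drive one motor per band. This is only a minimal sketch of the general technique; the function name, motor count, and band choices here are illustrative assumptions, not the lab's actual algorithm.

```python
import numpy as np

def sound_to_motor_pattern(samples, sample_rate=16000, n_motors=24):
    """Map one audio frame to vibration intensities for n_motors motors.

    Illustrative sketch only: a coarse spectrogram slice spread across
    the torso, with one log-spaced frequency band per motor.
    """
    # Windowed magnitude spectrum of the frame.
    spectrum = np.abs(np.fft.rfft(samples * np.hanning(len(samples))))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)

    # Split a speech-relevant band (~100 Hz to 8 kHz) into n_motors bins,
    # log-spaced to roughly match the ear's frequency resolution.
    edges = np.logspace(np.log10(100), np.log10(8000), n_motors + 1)
    intensities = np.empty(n_motors)
    for i in range(n_motors):
        band = spectrum[(freqs >= edges[i]) & (freqs < edges[i + 1])]
        intensities[i] = band.mean() if band.size else 0.0

    # Normalize to [0, 1] so each value can drive a motor's duty cycle.
    peak = intensities.max()
    return intensities / peak if peak > 0 else intensities

# Example: a 440 Hz tone should mostly activate low-frequency motors.
t = np.arange(1024) / 16000
pattern = sound_to_motor_pattern(np.sin(2 * np.pi * 440 * t))
```

Run frame by frame over a live microphone stream, this yields the kind of dynamic vibration pattern described in the talk; the brain's job is then to learn the mapping from those patterns back to speech.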
Info
Channel: Web Summit
Views: 31,864
Keywords: David Eagleman (Author), Web Summit, 2015, Dublin, Ireland, Neuroscience (Field Of Study), Baylor College Of Medicine (College/University)
Id: MPG9kNKron4
Length: 14min 1sec (841 seconds)
Published: Tue Dec 08 2015