Using AI to Decode Animal Communication with Aza Raskin

Video Statistics and Information

Captions
Welcome to the stage comedian and activist Josh Burstein. So, has anyone played around with ChatGPT and thought, "This thing's going to help me become the ambassador of the dolphin race"? Well, our next speaker certainly sees the promise in these robots for helping us speak to nature. He is the co-founder of Earth Species Project, a non-profit dedicated to translating animal communication. He's also co-founder of the Center for Humane Technology, which was featured in the documentary The Social Dilemma. To discuss how we can use AI to learn from other species — he's the co-host of the hit podcast Your Undivided Attention — please give exactly that for Aza Raskin.

Thanks so much, brother. Good morning, everyone. I'm so impressed we're all awake — somehow. Yeah, we'll see how awake you are by the end; I think we might be able to bring it up a little bit. So, as you know, my name is Aza Raskin — all my contact information is up there. I run Earth Species Project, which has the cute acronym of ESP, which is probably on purpose. Does anyone know what animal makes this sound? [Music] No... no... close... yes, it is! Oh nice, thank you — oh, that really helps. Isn't that guy cute? This is a bearded seal, making the sounds of, like, 1950s UFOs. By the way, that was the mating call of the bearded seal — so if you're excited, now you know where to go.

Because that's such a wild sound, right? We don't normally think of the incredible diversity of communication that exists on this planet. It helps us open the aperture of the human mind. And our goal is not just: can we learn to listen to animals — but can we unlock communication to transform our relationship with the rest of nature?

This idea actually started in 2013. I was driving my gold Volvo station wagon and heard an NPR piece about gelada monkeys, which I'd never heard of before. They're these guys — huge manes, giant fangs — and they live in fission-fusion groups of like 2,000. And the researcher studying them, Morgan Gustison, said they had the largest vocabulary of any primate except humans. They sound like women and children babbling — a little bit like this. Let's see, does it work? [Music] They do this incredible lip-smacking thing that mimics the way humans move our mouths to modulate sound, and the researchers swear the animals talk about them behind their backs — which is probably true. And I was like, well, why are they out there with a hand recorder, hand-transcribing, trying to figure out what they're saying? Shouldn't we be using machine learning and microphone arrays to figure this out? But when I started looking into it, at that point machines couldn't do anything human beings couldn't already do: they couldn't translate a language that didn't already have a translation. I'll get to the punch line of what changed in 2017 that said now is the time to start.

But I just want to say: mycelial networks have mushrooms — the weird fruiting bodies that come forward — but it's really the things under the ground doing all the work. I feel the same way. I'm the weird fruiting body you see in front of you, but the real work happens through this incredible team behind me. And since we got going in 2017, when it was really just us, it has really grown. There are now two books out about this topic, there's a huge set of people and partners working on it. There is a wave coming that will crash in the next couple of years, and it's incredibly exciting.
So, to the core of it. If there are two main thoughts to hold throughout this talk, one is this: our ability to understand is limited by our ability to perceive. If we can throw open the aperture of what we can perceive, that throws open the aperture of what we can understand. I always like to start this way whenever I'm feeling a little ego-forward — okay, but how much of reality am I actually experiencing? And here it is. This is a ridiculous chart: time is on the left — all of it — spatial scale is on the bottom — all of it — and the human experience occupies a tiny little fraction of what we can perceive, which bounds what we can understand.

To give a sense of this: do you think flowers can hear the sound of approaching bees? Do you think plants can hear? I'm getting some... I mean, technically this is a leading question, so you are all correct. Researchers at Tel Aviv University did this in 2019. They took an evening primrose and played different sounds to the flower — traffic noises, low noises, bat noises, high noises — and then the sound of an approaching pollinator. Only when they played the sound of an approaching pollinator did the flowers respond, producing more and sweeter nectar within just a couple of seconds. The flower hears the bee through its petals and gets excited.

Okay, so plants can hear. And there's another incredible study the same university did right after, asking: okay, but can they speak? They stressed out tobacco plants — either cutting them or dehydrating them, sort of plant torture — and the more dehydrated the plants got, the more they would emit sound. And not quietly: at roughly the loudness of human speech, just up at 50 or 60 kilohertz, so we can't hear it. So whenever I'm wandering now through a dry California meadow — the plants are hearing, the plants are emitting sound. The world is awash in signal and meaning that we're completely unaware of, but on the cusp of really diving into, because these are just the first sets of experiments.

All right, so plants emit sound and respond to sound. Pea plants, actually — if you put their roots into a plastic bag, the roots will still move toward the sound of running water. So this is everywhere. But how about this? There's a 2013 study on a very common vine in South America, Boquila trifoliolata, and what's interesting about this plant is that it mimics the leaves of the plant it grows on. If it's growing on a red plant, it'll change its leaves to be red; if the host leaf has three lobes, it'll change its leaves to have three lobes. It took a while for — at least Western — scientists to discover it, because it camouflages. But then there was a huge debate: if it can change its leaves to mimic what it's growing on, and it will change along the same vine, how is it doing it? Maybe it's volatile chemicals and it's pre-programmed. It wasn't until 2020
that a scientist realized: hey, shouldn't we try growing this thing on an artificial plant? Can it still mimic the leaves of an artificial plant? And the answer is yes — it'll change the color, shape, and size of its leaves even to match an artificial plant. By process of elimination — how is it doing this? — the only thing the scientists could come up with is ocelli, which is a face-saving term in biology for eyes: they believe the plant can literally see what it's growing on and mimic it. And it was actually Darwin who first hypothesized that plants could use their cell walls as a kind of lens to focus light — to see.

Okay, so plants can see, hear, and emit sound. Maybe it's not so ridiculous to think that animals communicate and have some kind of rich, deep language. But still — how do we know there's a "there" there? This is a study from the University of Hawaii in 1994, and it's fascinating. They taught dolphins two gestures. The first gesture was: do something you've never done before — innovate. It's hard enough to teach that to humans, and the dolphins do it. Think about the cognitive load: you have to remember everything you've done that session, you have to understand the concept of negation — not one of those things — and then invent a new thing you've never done before. But dolphins will do it. Then there's a second gesture: do something together. And they'll say to dolphin pairs: do something you've never done before, together. The dolphins go down, exchange information, come up, and do the same never-before-done thing at the same time. Now, that doesn't prove representational language, but Occam's razor — the null hypothesis is now sort of on the other foot: we have to show how they could do that if it wasn't via some kind of rich symbolic communication. The study got us so excited. They replicated it over multiple dolphin pairs, by the way, so it has replicability. Unfortunately, they didn't record the audio, and so now there are groups of scientists starting to work to recreate this experiment.

Okay — but I said something happened in 2017 that said now is the time to go. How can AI possibly help translate a language that doesn't already have a translation? How do you translate a language that has no Rosetta Stone, no examples? Well, that was impossible — until 2017.
In 2017, two papers, back to back, discovered that AI could translate between two human languages — any two human languages — without the need for any examples or a Rosetta Stone. It's deeply profound. What I want to do now is walk you through that process so you have an understanding of how it works. If you want a deep understanding of AI, remember this one thing: AI turns semantic relationships into geometric relationships. What does that mean?

This is English — the top 10,000 most-spoken words in English. Imagine every star is a word, and words that mean similar things are near each other. Words that share a semantic relationship share a geometric relationship. "King" is to "man" as "woman" is to "queen" — so in this shape, "king" is the same distance and direction from "man" as "woman" is from "queen". You can just do king minus man plus woman equals queen; or plus boy equals prince; or plus girl equals princess. Semantic relationship into geometric relationship. The first thing I ever tried when I got my hands on this data set — the technical term for these things, by the way, is a latent space or an embedding space — was: hipster minus authenticity, add that to conservative, and the computer spit out "electability". Right? Computers should not write better jokes than we can. A couple more examples, just to get the hang of it: "malodorous" is the pretentious way of saying "smelly", so malodorous minus smelly gives you pretentiousness as a relationship; add that to "book" and you get "tome"; add it to "clever" and you get "adroit".

And to get a conceptual understanding of how this works: the computer doesn't know what anything means, it just knows how things relate. If you're looking at lots and lots of text and you see the word "ice", you notice: "ice" appears next to "cold" often, but "ice" does not appear next to "fashion" often. Divide those two ratios and you get a hint that ice and cold are more related than ice and fashion. That's the basic way this stuff gets built — of course it gets more advanced in modern AI. What that means is that if you think about the point which is "dog", it has a relationship to "friend", to "guardian", to "howl", to "yelp", to "wolf", and to "man" — and that fixes it at a point in space. If you solve the problem of every concept's relation to every other concept, it's like solving a massive multi-dimensional Sudoku puzzle, and out pops a rigid structure that represents all the internal relationships of a language.

Okay, that's cool — but what would you do with that? Well, the shape which is, say, Spanish can't possibly be the same shape as English, right? If you asked anthropologists, they'd say: different cultures, different cosmologies, different ways of viewing the world, different ways of gendering verbs — obviously different shapes. But the AI researchers were like, whatever, let's just try. They took the shape which is Spanish and the shape which is English, and they literally rotated one on top of the other — and the point which is "dog" ended up in the same spot in both. And you're like, okay, but Spanish and English are super related — let's try Japanese. Even though there are definitely words in Japanese that don't exist in English or Spanish, if you blur your eyes, the shapes are still the same. (A toy sketch of both moves — the vector arithmetic and the rotation — follows below.)
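To make "semantic relationships become geometric relationships" — and the rotation trick — concrete, here is a toy sketch. Everything in it is fabricated for illustration: the vectors are random stand-ins (real embeddings are learned from co-occurrence statistics over large corpora), and the "Spanish" space is just the English one under a hidden rotation. The alignment step shown is classic orthogonal Procrustes with a few anchor words; the 2017 papers showed the matching can be done with no anchors at all.

```python
# Toy sketch: analogy arithmetic in one embedding space, then aligning two
# spaces with a rotation (orthogonal Procrustes). All vectors are made up.
import numpy as np

rng = np.random.default_rng(0)
dim = 3

# Pretend "English" space: a few 3-d word vectors (real ones are learned).
english = {w: rng.normal(size=dim) for w in ["king", "man", "woman", "dog"]}
# Build the analogy in by construction: queen = king - man + woman.
english["queen"] = english["king"] - english["man"] + english["woman"]

def nearest(space, vec):
    """Word whose vector has the highest cosine similarity to vec."""
    cos = lambda a, b: a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return max(space, key=lambda w: cos(space[w], vec))

# Semantic relationship as geometric relationship:
print(nearest(english, english["king"] - english["man"] + english["woman"]))  # queen

# Pretend "Spanish": the same structure under an unknown rotation R.
R, _ = np.linalg.qr(rng.normal(size=(dim, dim)))
spanish = {w: v @ R for w, v in english.items()}  # think "rey", "perro", ...

# Orthogonal Procrustes: recover the rotation W from a few anchor pairs
# (the unsupervised 2017 methods manage without even these anchors).
anchors = ["king", "man", "woman"]
X = np.stack([english[w] for w in anchors])
Y = np.stack([spanish[w] for w in anchors])
U, _, Vt = np.linalg.svd(X.T @ Y)
W = U @ Vt

# "Translate" a held-out word: rotate English "dog" into the Spanish space.
print(nearest(spanish, english["dog"] @ W))  # dog (i.e. "perro")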
And that's not just true of English, Japanese, and Spanish, but of Urdu and Aramaic and Finnish — pretty much every human language that's been tried ends up fitting a kind of universal human meaning shape. Which I think is just so profound, especially in this time of such deep division: there is a universal hidden structure underlying us all.

Okay, so the obvious next step is: can we build one of these kinds of shapes for animal communication? I should point out that this is 2017 technology, which is essentially Stone Age in AI — many more techniques have emerged since — but it's really helpful to hold this in your mind, because it gives an intuition for how this kind of thing works. If we could build the shape that represents animal communication, maybe there's some part of that shape that overlaps with how human beings communicate and see the world. Because language is a crystallized shadow of mind — it is a model of the world — and to the extent that we live in the same world, maybe there's an overlap. The part that overlaps would be a direct translation into the human experience, and the parts that don't overlap would stick out: we could see that there is rich structure there, but we wouldn't be able to directly translate it. And I still don't know which of those two would be more fascinating. You know, whales and dolphins have had culture passed down vocally for 34 million years; humans have only been speaking vocally and passing down culture for maybe 200,000 years, tops. And that which is oldest correlates with that which is wisest, so there's probably some very deep truth embedded in a 34-million-year-old culture. By the way, whales and dolphins have dialects that drift far enough apart that they become mutually unintelligible languages. There's a great case that happens, generally yearly, off the coast of Norway, when a group of false killer whales speaking one way and a group of dolphins speaking their own different way come together, hunt in a super-pod, and speak a third, different way. So let's go figure out what that overlap is.

But you could say the world of an animal is so different — their umwelt is so different — that we shouldn't expect any overlap. I think that's a kind of human exceptionalism. So here's an example. Do you know the mirror test? These are dolphins looking in a mirror. You paint a dot on a dolphin — and I love this, because this is definitely a universal experience of looking in a mirror and checking out your abs — but the test is: you paint a dot on an animal, and they don't know about it until they look in a mirror. They look in the mirror and go, oh, what's that thing? That means they have to connect the image in the mirror with themselves: that is me. Self-awareness — one of the most profound experiences you can possibly have. And if they're communicating, they may well be communicating about a rich interiority. That's shared with animals: dolphins, elephants, a number of other species.

We'll pass on this — these guys are lemurs taking hits off of centipedes. They bite centipedes, literally get high, and go into these trance-like states — I'm sure this is not at all familiar to anyone here. They get super cuddly, and later wake up and go on their way. They are seeking a kind of transcendent state of consciousness. Apes will spin — they'll hang on vines and spin to get dizzy. And dolphins will intentionally
inflate pufferfish to get high and pass them around — the ultimate puff, puff, pass, right? Many mammals seek a transcendent, altered state of being, and if they communicate, they may well communicate about it. It's too early in the morning for this example, but this is a pilot whale carrying her dead young — this is week three — so grief is an experience that's shared. And, to not end on that note, here's another way: this is an orangutan watching a magic trick, and you'll recognize this behavior — which is definitely me at all magic shows. Surprise and delight at having your expectations frustrated: that is another shared experience, and presumably some kind of gentle hatred of magicians. So there is a lot that is conserved, and a lot that's not, and that's what we're moving towards.

But actually building this shape takes a lot of work, because real-world data turns out to be super messy. Oh — here's another guessing game. What animal makes this noise? [Music] No, but you're in the right ballpark. It is a type of whale; it's sort of white; it has a melony head. Yeah, nice — it's a beluga. These are a couple of belugas hunting and communicating, and to me it sounds like an alien modem. None of that is digital — it was just a microphone stuck in water. We can hear up to 20 kilohertz; they're speaking up to 150 kilohertz. It turns out dolphins have names — there's a 2016 study by Vincent Janik showing that they not only have names they address each other by, but they will use them in the third person. Belugas were only recently discovered to have names, and in their call they put both their unique identifier and their clan identifier. But unlike a dolphin, whose name — its signature whistle, as it's called — is just a whistle, these guys have these broadband, modem-like packets.

Now here's the surprising thing — this is one that made my eyes get really wide. The woman who did this research, Dr. Valeria Vergara, said that even though she has tags she puts on the animals that record video, audio, and kinematics, she still has to throw away 97% of her data, because she can't tell who's speaking, or because the calls overlap — they can't be separated out for analysis. Only three percent of beluga communication from these data sets has been explored. The ocean is more explored than that; the Moon is more explored than that. This is so exciting: the most vocal underwater animal has the supermajority of its data never analyzed by Western science. But it means that if we want to start doing this work, we have to figure out how to separate out each individual's track, so that we can do analysis that's been impossible. So one of our first papers did exactly this. This is an example — it works on bats, it works with elephants, it's starting to work with humpback whales — but here are two dogs barking at the same time, and then you'll see the AI we trained separate them into their own individual tracks. Together... thank you... and then separate. So you can imagine we're at the cusp of a moment, as we move this from lab-like data to the real-life data we're about to have access to — like a new telescope to look out at the universe and discover all the things that were invisible to us before. What we can understand is limited by our ability to perceive, and AI is throwing open the aperture of what we can perceive.
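For intuition about what a separation model like the one behind the two-dogs demo has to do, here is a minimal sketch of time-frequency masking, the family of techniques such systems typically build on. A real model learns the masks from data; this toy computes "oracle" masks from the clean sources (which you would not have in practice) and uses synthetic tones as the two "dogs".

```python
# Minimal sketch of mask-based source separation: mix two signals, then
# recover each one by masking the mixture's spectrogram. Oracle masks are
# computed from the clean sources purely to show the mechanics; a real
# system predicts the masks with a neural network.
import numpy as np
from scipy.signal import stft, istft

fs = 16000
t = np.arange(fs * 2) / fs
bark_a = np.sin(2 * np.pi * 400 * t) * (np.sin(2 * np.pi * 3 * t) > 0)  # toy "dog A"
bark_b = np.sin(2 * np.pi * 900 * t) * (np.sin(2 * np.pi * 5 * t) > 0)  # toy "dog B"
mixture = bark_a + bark_b

# Short-time Fourier transforms of the mixture and (oracle-only) sources.
_, _, M = stft(mixture, fs, nperseg=512)
_, _, A = stft(bark_a, fs, nperseg=512)
_, _, B = stft(bark_b, fs, nperseg=512)

eps = 1e-8
mask_a = np.abs(A) / (np.abs(A) + np.abs(B) + eps)  # ideal ratio mask for A
mask_b = 1.0 - mask_a

# Apply each mask to the mixture spectrogram and invert back to audio:
# est_a and est_b now approximate the two individual tracks.
_, est_a = istft(M * mask_a, fs, nperseg=512)
_, est_b = istft(M * mask_b, fs, nperseg=512)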
I sort of think about this moment like that moment in 1995 — I think it was late December — when the Hubble telescope was pointed at an empty patch of sky, and what was discovered was the most galaxies we'd ever seen in one spot. We pointed at nothing, and what we found was everything. There are so many blank spots in what human beings know, and I think what we're going to discover when we look at nature is everything. Freeman Dyson had this to say about how scientific revolutions happen: in the last 500 years we've had about six major concept-driven scientific revolutions, and in that time there have been about 20 tool-driven revolutions. Science progresses generally not because of a thing that we see, but because we increase our ability to perceive.

And it's not just about the end goal — you go to the Moon, but you also invent Velcro. So our goal here is not just the end goal of translating animal communication. Ethology, biology, and conservation are deeply underfunded, and if we can act as a bridge between the AI world — the incredible resources of tech and the Valley — and conservation and ethology, I think that can do a whole bunch.

So in the last four years — I guess five years now — there are these things called foundation models, which are what ChatGPT and all these other models are built on: large-scale models trained on raw data to discover its fundamental patterns, to build a vocabulary to describe what they're seeing. They've been transforming the human world. This is the number of papers in the top NLP venues built on foundation models: in 2018 around 4% of papers were based on them; by 2020 it was 90%, and that number has continued to shoot up into 2023. At the same time, in the non-human domain, it's been essentially zero — it ticked up in 2022 because we published the first one. And the goal is: if we can make these kinds of large-scale models for the rest of nature, we should expect a broad-scale acceleration — I don't know whether it'll be 20, 50, or 90 percent — and to the extent that ethology research already helps conservation outcomes, this kind of work should broadly accelerate all of conservation. Go to the Moon; also invent Velcro.

These are our first two papers on this. This is the first major paper of benchmarks of animal sounds — essentially, if you don't have a yardstick, how do you know when you're getting better? All of machine learning is built on this kind of thing, and it didn't exist for animal communication, so we made it. And this is AVES, the very first foundation model for animal communication, and it's already helping change things. This slide is just a lot of numbers — I knew you woke up early and wanted to see numbers — but essentially all it says is that the thing we built, down at the bottom with all the bold lines, is beating all the other methods out there. It's already starting to help us do things like build these kinds of language clouds — in this case for beluga communication — so that the different types of calls they make can be automatically separated out. These tools help you do classification, detection, source separation, denoising: all the things biologists need to do. So we're pretty excited about that.
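To give a flavor of how a frozen foundation model gets used for the classification and detection tasks just mentioned, here is a hedged sketch of the standard "linear probe" recipe: embed each clip with the pretrained encoder, then fit a small classifier on the embeddings. The encoder below is a fake that returns clustered random vectors — it is not AVES's actual API — but the surrounding workflow is the conventional one.

```python
# Hedged sketch of a linear probe on top of a frozen audio foundation model.
# fake_embed stands in for the real pretrained encoder's embedding function.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def fake_embed(seed, n, dim=64):
    """Stand-in encoder: each call type clusters around its own center."""
    center = np.random.default_rng(seed).normal(size=dim)
    return center + 0.5 * rng.normal(size=(n, dim))

# Pretend we have 100 labeled clips for each of three beluga call types.
call_types = ["whistle", "pulse", "mixed"]
X = np.vstack([fake_embed(seed, 100) for seed in (1, 2, 3)])
y = np.repeat(call_types, 100)

# Shuffle, split, and fit the probe on the frozen embeddings.
idx = rng.permutation(len(y))
X, y = X[idx], y[idx]
probe = LogisticRegression(max_iter=1000).fit(X[:240], y[:240])
print("held-out accuracy:", probe.score(X[240:], y[240:]))  # near 1.0 on toy data
```

The design point: the expensive model is trained once on raw audio; a biologist then only needs a handful of labels and a tiny classifier, which is why one foundation model can accelerate many downstream tasks at once.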
For those who came to the talk yesterday, this is where we dovetail for just a second. A question you might have in your mind is: okay, you've convinced me we can translate between human languages without a Rosetta Stone, and I can see that maybe some parts would overlap and some parts wouldn't. But really — translate to something like a whale, a kilometer deep, whose world is so different? Here's where I think "everything is language" is really key, because this kind of shape-aligning doesn't just work for language — it works for everything. It's an incredibly deep thing.

You probably saw this yesterday — it's great when I get to reuse slides. AI used to have separate fields: speech recognition, computer vision, robotics, music generation were all different fields. That changed, also in 2017, when they became one thing: language. You could learn to treat the text of the internet as just a sequence of words — that's language. Code is a sequence of characters — a type of language. DNA is just a sequence of base pairs — it's a language. And what AI is learning to do is translate to and from all of them.

To give an example of how this works — you know when you type in some text and out comes an image? How does that work? Well, you ask the computer to build a shape that represents language, then you ask it to build a shape that represents images. And to give a sense of what "semantic relationships into geometric relationships" means here, let's focus for a second on human faces. Imagine you have a database of human faces and you turn it into a shape. That means there's a relationship between your face and your face-which-is-smiling. If I take your face and your smiling face and subtract them, I get the relationship which is smiling-ness — and I can add that to any face in the shape and get the smiling version of that face. There is a direction that represents oldness, youngness, maleness, femaleness.

So what happens when you build these shapes for language and for images? You look at all the image-caption pairs on the internet, and now it's not just a rotation but a linear transform — a squishing and a rotation — and if you do that, you get a correspondence between the two. So when you say "portrait of Chile as a person", it finds the spot in language space, translates it over to image space, and then you say: computer, generate the image that lives at this point. That's how you get this. So it's not just language-to-language that lets you match these embedding spaces to do translation — it's everything. And that helps you understand how this worked: on the left is a human seeing an image, and that's the fMRI pattern from seeing the image. You give the computer only the fMRI pattern, and it creates this draft image. It works because you build a shape representing all the internal relationships of fMRI patterns: you have the fMRI shape, you have the image shape, you match the two, and that's how you translate between them. So that gives you a sense of the deep power of this technique.
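Here's a toy version of the caption-to-image (and fMRI-to-image) matching step just described: fit a linear transform between two embedding spaces from paired examples, then map a held-out point across and retrieve its nearest neighbor on the other side. The "embeddings" are synthetic stand-ins; real pipelines get them from trained encoders.

```python
# Toy sketch of cross-modal matching: learn a linear map from "text space"
# to "image space" from paired embeddings, then retrieve by nearest neighbor.
import numpy as np

rng = np.random.default_rng(1)
n, d_text, d_img = 200, 16, 24

# Stand-in embeddings: captions, and images generated as an unknown linear
# function of their captions plus noise (real encoders are nonlinear, but
# the matching step itself is this kind of fit).
txt = rng.normal(size=(n, d_text))
W_true = rng.normal(size=(d_text, d_img))
img = txt @ W_true + 0.1 * rng.normal(size=(n, d_img))

# Fit the "squishing and rotation" (a linear transform) from 150 pairs.
W, *_ = np.linalg.lstsq(txt[:150], img[:150], rcond=None)

# Query: map a held-out caption into image space and retrieve the nearest
# image -- the returned index should be the caption's own paired image.
q = txt[150] @ W
print(np.argmin(np.linalg.norm(img - q, axis=1)))  # expect 150
```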
And actually, I think there's something even deeper going on here — this is just a hypothesis. There's a thing called the unreasonable effectiveness of mathematics: why is it that you go off and invent complex numbers and quaternions and do abstract algebra, and somehow that has something to say about the physical world? Still a mystery. But there's also an unreasonable effectiveness of deep learning. There is no a priori reason why DNA and images and video and speech and fMRI should share a kind of universal shape — but they do, and I think that's telling us something very deep about the structure of the universe. This was my face when I saw it, by the way. AI is learning how to decode, translate, and synthesize the languages of reality, which is both terrifying and rad — which is what the future is: terrifying and rad.

So we actually just won a National Geographic Explorer award for doing this kind of work, where we're turning motion into meaning — so we can translate not just from what an animal says into another thing an animal says, but into how they move. This is one of our partners, Ari Friedlaender, in Antarctica tagging whales. These are devices that record video, audio, and kinematics — how the animals move — and they're very mesmerizing. This gives us a wealth of data for multimodal models that let you translate between what an animal says and how a pod of animals moves, in both directions. We've also just done the first big benchmark for turning all that motion into meaning. And you can see how we approach this: we fundamentally think of ourselves like Switzerland, or the mycelial network — we're a non-profit, not in academia, not a startup; we sit in between everything, and that lets us collaborate with everyone. So what we're doing now is building the ability to translate between motion, video, and what animals are saying. That lets us do things like — we haven't gotten here yet, but we're getting there — given this motion for an animal, what sound might it make? For example: two whales coming together — what sound do they make that might mean hello? If a whale dives, what sound would the other whales have to make to make that whale dive? Maybe it means "dive", maybe it means "there is danger up here", maybe it means "there's food down there" — but it has something to do with diving. Or you could say: generate me the audio of two elephants coming together; okay, now do it again, but with ears flapping and the elephant running faster. You can start to see how this gets you very quickly toward the ability to decode — and then you combine it with the ability to align shapes, and you can get up into the abstractions.

So where do we go from here? We've talked about translating motion into meaning, and about aligning shapes. We need the final step: how do we test our hypotheses? How do we get to two-way communication? Would you like to know how to say hello in humpback? All right — on the condition that we use it all over the boat as often as possible. The way you say hello in humpback whale is... [Music] ...and they respond. But we're sort of at the Speak & Spell level of communication: we can record audio and play it back, maybe shift some frequencies, but it would be really nice to get to two-way, fluent conversation.

So I want to tell this story, because this is a photo I took one morning in Alaska. I was on a boat; we were out there for almost two weeks, and
we were following the humpbacks. We hadn't seen them, we hadn't seen them — we'd see maybe one off in the distance. We recorded audio one day, and the next day — we weren't running this trip; we were passengers, learning how the work was done, helping these scientists. This particular morning was very foggy; then the sun came out in this vivid white light, and birds were flying out of nowhere, going to nowhere. And you could hear the humpbacks breathing — when they come up they sound a little like a mix of Darth Vader and a yoga class; it's a very specific vibe. Then we started the experiment. We were playing something, and we actually didn't know what, because we were keeping it double-blind. We were standing on top of the boat, supposed to record where the humpbacks were and how far away. They start the experiment, and almost instantly a humpback, around 30 meters away, lunges at the boat with its mouth half open — and we had never seen that behavior before. Then it goes down, comes across, and from the other side of the boat comes back with the same lunging behavior, and for ten minutes it was moving back and around the boat, around the boat. The experiment stops — we stop playing anything — and we just see the humpback beeline away. We go downstairs, like, what happened? It turns out you can identify humpbacks by their flukes — they have a fluke print the way we have a fingerprint — and it turned out this humpback, which humans had given the name Twain, had been hearing Twain's own hello played back to himself. We did this accidentally — it's like an audio mirror. And our speaker — you saw it, it's sort of small — so on the recording you can hear us go, quietly, "I'm Twain," and then the humpback going "I'M TWAIN. I'M TWAIN. I'M TWAIN." Can you imagine how psychedelic that must have been — to live 38 years and then encounter yourself in the ocean? You're like, this is not okay. We felt just a little crappy — but it was fascinating to experience.

And this is the point: we are at Speak & Spell level. The state of the art is record-and-play-back. Imagine going to a foreign country and trying to order food with one of those — it just wouldn't go, let alone a meaningful conversation. So where we're heading now is two-way communication — but not directly human-to-animal: AI-to-animal, and then using that to help decode. And as you can tell from that story, there's some really interesting ethics that comes out of this, which I'll dive into in a second.

We gave this demonstration yesterday too. From the last couple of months — this is from four months ago — the ability to listen to three seconds of anyone's voice and then continue speaking in that voice. Here's the prompt — this part is real: "Just impressions of people are, in nine cases out of ten..." — and this part is synthetic: "...mere spectacle reflections of the actuality of things, but they are impressions of something." And the same thing with piano: piano prompt, real... synthetic... You just can't tell. So the obvious next idea is: can we do that for animal communication? And of course the answer is yes. Just like you can build a chatbot in Chinese without needing to speak Chinese, in the next couple of years — one, three, five — we're going to be able to build essentially synthetic whales, synthetic tool-using crows, that can vocalize fluently. The plot twist is, we won't yet fully know what we're saying. It's so bizarre.
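At the concept level, "listen to a prompt, then keep going in the same voice" is next-token prediction over audio tokens: real systems compress audio into discrete codec tokens and continue the stream with a transformer. Here's a deliberately tiny stand-in — a first-order model over made-up call tokens — just to show the prompt-and-continue mechanic; none of this is any lab's actual pipeline.

```python
# Conceptual toy of prompt-and-continue generation: learn next-token
# statistics from a "recorded" token stream, then continue from a prompt.
import random
from collections import Counter, defaultdict

random.seed(0)

# Pretend token stream from one animal's recordings (codec tokens in reality).
corpus = "A B C A B D A B C A B C A B D".split()

# Count next-token statistics (a first-order language model).
counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    counts[prev][nxt] += 1

def continue_sequence(prompt, n_steps):
    """Extend the prompt by sampling each next token from learned counts."""
    seq = list(prompt)
    for _ in range(n_steps):
        tokens, weights = zip(*counts[seq[-1]].items())
        seq.append(random.choices(tokens, weights=weights)[0])
    return seq

# Given a short "prompt", keep vocalizing in the same style.
print(continue_sequence(["A", "B"], 8))
```

The gap between this toy and the real thing is scale, not kind — which is exactly why fluent mimicry without understanding arrives before translation does.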
You could not have predicted this. Here are examples. These are chiffchaffs, and you can hear the prompt and then the synthetic continuation. We can't tell the difference — the question, of course, is whether the animals can. Here's us doing it with humpback whales: prompt, and then synthetic continuation. [Music] Can't really tell. That, by the way, was a version of "hello", which I'm sure you could all translate by now.

Think about it a little like this — and I think this is helpful when you're looking at ChatGPT and these other systems: it's not that they have an understanding, it's that they have the ability to mimic or emulate having an understanding. It's like having a superpower. Imagine your superpower is that you can walk up to anyone speaking any language you don't understand, and you sit there for a while just listening: okay, I sort of see — when they make this kind of sound and this kind of sound, then that kind of sound follows. And you're like, got it — and you just babble. You have no idea what you're saying, but the other person sits there going "yes...", and it looks like you're having a really meaningful conversation, and you walk away going: I still have no idea — I was just speaking light language, whatever that is. I'm babbling, but they understand. That's where we'll get to, and it's profound, because it means that really, in the next 12 to 48 months, we will cross whatever this conversational barrier is. I think it's problematic to call it "first contact" — indigenous cultures have obviously been communicating with other species for a long time, and we communicate with our pets — but something fundamentally new is happening here. I think we are going to get to experience and understand cultures of Earth that human beings have never really been able to see, hear, or understand. We are on the very cusp of that happening.

But there's a problem: humpback whale song goes viral. For whatever reason, Australian humpbacks are like the K-pop singers — whatever song is sung off the coast of Australia will get picked up and often sung by the world population within a season or two, if it's a particularly catchy tune. They sing halfway across an ocean basin — there's actually a fascinating zone in the ocean called the SOFAR channel, where conditions work out just right for it to act like a fiber-optic cable, and whales will go down there and speak over a thousand kilometers back and forth — plus they migrate halfway around the world. That gives you a sense of how these songs go viral. So if we're not careful — if we just create a synthetic whale that sings — we may mess up a 34-million-year-old culture. We may create the equivalent of whale QAnon. We probably shouldn't do that. What that means is that to actually start this work, we start in the lab, with animals that don't socially learn, and build our way up, so that at every step we know what we're saying before we go out into the wild and blindly try to communicate. I think it's really interesting how easy it is for us to see how we could mess up animal cultures — and how much harder it is to see that when we have these things blathering to humans, we're going to mess up our own.

Before I get into this: I think there's a huge opportunity in storytelling, which I'm excited about. Showing up to the newfound power of being able to communicate requires a different kind of responsibility. And the whole point of
this project is to change the way we relate to the rest of nature — to have everyone on the planet realize that we can have two-way conversation with animals that have rich culture and rich symbolic language, and so to shift our relationship. You can imagine a kind of prime directive, or a Geneva Convention for cross-species communication, that we should probably work on before we get the actual ability. Then imagine telling that story on the world stage — a kind of The Social Dilemma, but for talking with animals. I always think about the number of people who come up to me to talk about My Octopus Teacher and how it shifted their behavior. Imagine what we can do when we tell the next version of that story at global scale — that gets me so excited. Then you pair that with rights for nature, personhood for non-humans, E.O. Wilson and Half-Earth. I think there's a way to use the power of the story we're telling now to superpower the theories of change of everyone already out there working on climate, and that is super exciting.

The way I've been thinking about it: AI is like everything — it's a hard thing to pin down in a metaphor — but in this case I think AI is like the invention of modern optics. Optics gave us two things. It first let us create microscopes and discover the whole world of germs and viruses — a completely unknown thing. But even more so, it gave us the telescope. And what did the telescope do? It let us look out at the patterns of the universe, and what did we discover? We discovered that Earth was not the center. This time, we're going to look out at the patterns of the universe, and what are we going to discover? Yeah — that humans are not the center. Even if we could draw down all of the carbon — if we had that technology, and we should do it — it wouldn't fix the core generator function, which is human ego. So I think of this project as one of the pellets: there is no silver bullet — these are complex systems — but maybe there is silver buckshot, and this is one pellet that can help shift how we think about ourselves in relation to the cosmos and this tiny planet we live on.

This, I think, is the next frontier. Sometimes I think our telescopes are pointed in the wrong direction — that if we really want to understand that we are not alone, we need to be looking in our oceans and in our backyards. So thank you very much. [Applause] [Music]
Info
Channel: Summit
Views: 6,343
Keywords: summit, conference, ideas, talks, performances, gathering, community
Id: 3tUXbbbMhvk
Length: 49min 36sec (2976 seconds)
Published: Tue Aug 01 2023