The Revolution Of AI | Artificial Intelligence Explained | New Technologies | Robotics

Captions
This is what the future of hyper intelligence looks like. Most people no longer own cars; instead, artificial intelligence operates fully electric, networked, self-driving vehicles. As a result, air pollution and traffic congestion plummet across the planet. Self-navigating aerial drones are on the front lines for disaster response and search-and-rescue missions. Most people live and work side by side with self-aware androids. These AI companions boost productivity and liberate humans from tedious tasks, completely revolutionizing modern life.

"I feel like I'm in a superhero movie." Today, scientists are blazing a trail to this very future. "The fact that we're enabling the system to make its own decisions, I don't even know where to begin with that." I want to know what breakthroughs are being made. "It's talking, and it's having this dynamic conversation with you. That's the wonder." "Machines can be self-aware in ways that we can't." That will forge the future too. "Oh my gosh, it's looking at me." Hyper intelligence.

I'm Bigler. As an engineer and neuroscientist in training, I'm obsessed with artificial intelligence. As a kid, my father took me to tech and robotics trade shows, where I became dazzled by science. Every year the inventions were smarter and smarter. Artificial intelligence has come a very long way in the last couple of years. Most AI technologies are programmed to think for themselves, to learn from examples, simulating human intelligence in a way that learns from past experience. But how does AI actually work? In the future, will AI achieve human traits like emotion, consciousness, or even free will? And how will humans and robots work together?

Today, the clearest road to the future is the self-driving car. Unlike a regular car, which is just a machine, a self-driving car is a robot that can make decisions. In the future, will every car on the road become driverless? To find out, I've come to a hotbed of self-driving car research: Pittsburgh,
Pennsylvania.

Every single person has started to have conversations about self-driving cars, because essentially they're the future. But in order to understand it, we have to look under the hood. Making decisions on the fly, even simple ones like these, does not come easy for computers. To discover the inner workings, I'm meeting a true pioneer in the field. "Please, get in." "Thank you." Dr. Raj Rajkumar of Carnegie Mellon University. Carnegie Mellon is the birthplace of self-driving car technology, thanks in large part to the work of Raj and his colleagues. They've been the leading innovators in this field for more than 25 years. So how does his self-driving car make decisions to safely navigate the world like a human driver? "Should we get started?" "Yes, we can." Since Raj is distracted by our conversation, for safety reasons the state of Pennsylvania requires another driver in the front seat to monitor the road.

This is so cool. I'm nervous but excited. "What's the longest you've ever driven a vehicle autonomously?" "Oh, we have gone hundreds of miles." "Awesome." "I'm going to go auto by pressing this button." "Oh my gosh, it really is driving itself."

While most self-driving cars are built from the ground up, Raj just bought a regular used car and hacked it with powerful onboard computer systems, making it more adaptable than other regular cars. "We installed a bunch of sensors in them. It is able to shift the transmission gear; it is able to turn the steering wheel, apply the brake pedal and the gas pedal. It's really the software that runs on the computers that makes this capability practical, and there are some very key, fundamental artificial-intelligence layers that try to mimic what we humans do." To mimic human decision making, most self-driving cars use a combination of cameras and advanced radar to see their surroundings. The AI software compares external objects to an internal 3D map of streets, signs, and transportation infrastructure. "The map is something that is static in nature. Traffic and people and
objects are dynamic in nature. The dynamic information, it figures out on the fly." Comprehending dynamic information allows it to understand where it is heading in space and react to changes and traffic signals. "Aha, it recognizes the stop sign." "Yes." A pedestrian! "We definitely should not run into that person. The AI challenge to make a vehicle drive itself is not an easy task. Safety is priority number one, and priority number two, and number three as well."

But what happens when the AI system doesn't understand specific objects in its surroundings? "A pedestrian in Tempe, Arizona, was killed last night by a self-driving taxi. It's believed to be the first fatality caused by an autonomous vehicle." This tragic accident happened because a self-driving vehicle didn't recognize something in its environment: a jaywalker. In the future, advanced self-driving cars will have to make life-and-death decisions on the fly. If avoiding the jaywalker means crashing head-on with another car, potentially killing the driver, what should it choose? How will scientists address monumental problems like these?

The first wave of artificially intelligent robots were programmed by engineers with static sets of rules to achieve their goals. These rules are called algorithms. But not all rules work in all situations, and this approach is very inflexible, requiring new programming to accomplish even the smallest changes in any given task. A new approach called machine learning has changed everything. With machine learning, computers can absorb and use information from their interactions with the world to rewrite their own programming, becoming smarter on their own.

To see machine learning in action, I'm meeting another Carnegie Mellon team at an abandoned coal mine. Dr. Matt Travers leads a group that won a challenging subterranean navigation competition held by the Department of Defense's research agency, DARPA. They're affectionately known as R1 and R2, and R stands for robot. These robot twins are designed for search-and-rescue missions too dangerous for humans.
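The contrast the narration draws, hand-written rules versus programs that adjust their own parameters from experience, can be sketched in a few lines of Python. Everything here (the toy task, the perceptron, the data) is an illustrative assumption, not code from any system in the film:

```python
# A hand-coded rule versus a learned one. The perceptron below starts
# with zero weights and rewrites its own parameters after every mistake.

def static_rule(obstacle_near: bool) -> str:
    # First-wave AI: a fixed, hand-written rule. It never changes.
    return "stop" if obstacle_near else "go"

def train_perceptron(examples, epochs=100, lr=0.1):
    # Machine learning: the decision rule is learned from labeled
    # examples instead of being hard-coded by an engineer.
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (x1, x2), label in examples:
            pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            err = label - pred          # 0 when correct, +/-1 on a mistake
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

# Toy data: stop (1) only when an object is both close and in our lane.
data = [((1, 1), 1), ((1, 0), 0), ((0, 1), 0), ((0, 0), 0)]
w, b = train_perceptron(data)
```

The point of the sketch is only that the second function improves from data, which is the shift the narration is describing.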
Unlike the self-driving car, they operate without a map. To achieve this, they have to learn to identify every single object they encounter on the fly. "They are programmed to go out and act fully autonomously, and they will be making 100% of their own decisions. So they're recognizing objects; they're making the decision of where to go next, where to explore." To see this in action, the R2 robot is starting on a simulated search-and-rescue mission to find a stranded human dummy in the mine. "Imagine having a map of a collapsed mine before you sent a team of people to go rescue someone in that mine. It's a game changer."

How the robot discerns elements in this environment parallels how an infant learns about her environment. A three-month-old uses her senses to cognitively map out her environment and learn to recognize her parents. She ultimately uses this map to interact with everything in her world, just like this robot. "Okay, so are we ready to roll?" Artificial intelligence makes this learning curve possible. But how does it create its own map and identify a human on its own, without an external mapping system or the internet? Test engineer Steve Willits shows me how the R2 robot can detect a stranded person. "When you're in a search-and-rescue scenario, that's the kind of situation where you'd want to deploy one of these." As it explores and maps the cave, it drops devices called signal repeaters to create a Wi-Fi network trail. "It drops those just like breadcrumbs along the path." Using this network, the robot sends data back to home base to create a map. At the same time, the robot must look at every single object to identify the stranded human. "So the lidar system is giving a full laser scan." Lidar stands for light detection and ranging. Similar to its cousin radar, which uses radio waves, lidar systems send out laser pulses of light and calculate the time it takes for each pulse to hit a solid object and bounce back.
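The ranging principle described here is simple arithmetic: distance is the speed of light times the round-trip time, divided by two. A minimal sketch, with an example timing of my own choosing rather than one from the show:

```python
# Lidar time-of-flight ranging: a pulse travels out and back, so the
# one-way distance is half the round trip at the speed of light.

C = 299_792_458.0  # speed of light, meters per second

def lidar_distance(round_trip_seconds: float) -> float:
    """Distance in meters to the object that reflected the pulse."""
    return C * round_trip_seconds / 2.0

# A pulse returning after roughly 66.7 nanoseconds hit something
# about 10 meters away.
d = lidar_distance(66.7e-9)
```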
This process creates a 3D representation of the objects in the environment, which the onboard computer can then identify. The process is similar to how the eye feeds visual data to the brain, which then recognizes objects by tapping into our pre-existing knowledge of what things look like. Fully understanding its environment, R2 can then make better decisions about where to go, and where not to go. "What the robot is doing right now is exploring. The robot came to a junction, and off to the left it could see that it couldn't get past, right? So it saw the opening to the right, and that's where it went." It kind of looks like it's making decisions about whether or not to climb over these planks and obstacles in this area. "Right, that's exactly what it's doing at this point." Just like a baby, R2 learns through trial and error. It's like a little dog wagging its tail. But there's no one here to rescue, so it moves on. As R2 continues to map out the mine, "Oh my God, a human!", it stumbles upon its intended target. "That is Rescue Randy." Hello, Rescue Randy; you scared me. With the discovery of Rescue Randy, the R2 robot can not only alert emergency personnel but also give them a map of how to find him. That is incredible. It knows what it's doing.

These incredible rescue robots are clearly paving the path to the future of hyper intelligence. In the future, autonomous exploration vehicles perform search-and-rescue missions in every conceivable disaster zone, even in avalanches atop Mount Everest. Incredibly intelligent off-road vehicles are also mapping deep cave systems previously unknown to science, discovering a vast supply of rare-earth elements essential for modern technology.

Artificial intelligence will clearly save human lives in the future. But there's a lot of terrain on Earth that's too difficult to navigate on wheels. How will intelligent robots make their way over rainforests, bodies of water, or even mountaintops? In Philadelphia, Jason Derenick of Exyn Technologies is working to overcome
this problem. "What we focus on is autonomous aerial robotics: enabling drones to safely navigate in unknown or unexplored spaces." Jason's team has built the first industrial drone that can fly itself and map its environment as it goes. "We focus on all aspects of autonomy, which includes perception, state estimation during flight, motion planning, and then finally control." But going from two dimensions to three dimensions requires an increase in artificial-intelligence processing. The mission for their drone is to fly independently through a three-dimensional path from one end of the warehouse to the other. "Starting mission: three, two, one, now." To mess with its computer mind, Jason's team places new and unexpected obstacles in its path. Will the drone recognize these unexpected changes? Will it get lost? Will it crash? "Essentially we have a gimbaled lidar system that allows the vehicle to paint a full 360-degree sphere around it in order to sense its environment." Like the robot in the mine, this aerial robot uses lidar to see. "It actually generates a voxelized representation of the space, which you see here. For each one of these cubes in the space, it's trying to determine whether that cube is occupied or whether it's free space." It decides where to go based on its visual input, kind of like us humans. Incredibly, the drone recognizes the whiteboards and flies around them. "One of the things about this system that makes it particularly special is that it's actually being used in the real world to keep people out of harm's way." These drones are already at work in hazardous industries like mining, construction, and oil exploration. They safely conduct inspections in dangerous locations and create detailed maps of rough terrain. "From a technological perspective, the fact that we're able to do everything that we're doing on board, self-contained, and enabling the system to make its own decisions: I don't even know where to begin with that." Self-flying robots like these will revolutionize search and rescue and disaster response.
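The voxel bookkeeping Jason describes, dividing space into cubes and marking each one occupied or free, can be sketched in a few lines. The grid resolution and the sample points are illustrative choices of mine, not values from the real system:

```python
# Voxel occupancy sketch: each lidar return marks the cube it falls in
# as occupied; any cube with no returns is treated as free space.

VOXEL = 0.5  # edge length of each cube, in meters (illustrative)

def voxel_index(x, y, z):
    # Map a 3D point in meters to the integer index of its cube.
    return (int(x // VOXEL), int(y // VOXEL), int(z // VOXEL))

def build_occupancy(points):
    # The set of cubes containing at least one lidar return.
    return {voxel_index(*p) for p in points}

# Three pretend lidar hits: two on the same obstacle, one far away.
hits = [(1.2, 0.3, 2.0), (1.4, 0.4, 2.1), (5.0, 5.0, 0.2)]
occupied = build_occupancy(hits)

def is_free(x, y, z):
    # The planner may fly through any cube not marked occupied.
    return voxel_index(x, y, z) not in occupied
```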
They could also transform how packages are delivered. But there are limits to what large single drones can do; more complex tasks will require teams of small, nimble, autonomous robots. Dr. Vijay Kumar at the University of Pennsylvania is working with swarms of drones to perform tasks like playing music or building structures cooperatively. He's also developing technologies to tackle some very big problems, including world hunger. "In a couple of decades we'll have over 9 billion people to feed on this planet. Of course, that's a big challenge." To take on a task this big, he's building an army of small flying robots with the ability to synchronize. "We think about technologies that can be mounted on small flying robots that can then be directed in different ways, like a flock of birds reacting to a predator, or a school of fish. You have coordination and collaboration, and it all happens very organically." Using AI to get robots to work as a coordinated collective group is a daunting task. "Three to five years ago, most of our robots relied on GPS-like sensors. Today we have the equivalent of smartphones embedded in our robots, and they sense how fast they're going by just looking at the world, integrating that with the inertial measurement unit information, and then getting an estimate of where they are in the world and how fast they're traveling." This I gotta see, and I'm going to check it out virtually, as a robot. I'm at UPenn, remotely, in Vijay Kumar's lab. I sample my surroundings. Oh, I hit something. "Hello." "Hi." Vijay's apprentice Dinesh Thakur is my guide. "Today we are going to show robots flying in a formation." Great, can we see how that works? "Sure." The first step Dinesh takes in coordinating the drones is to provide them with a common point of reference, in this case a visual tag similar to a basic QR code. Using only the onboard camera, these drones reference the code on the tag and visualize where they are in space. Using sophisticated bio-inspired algorithms, the drones then figure out
where each of the other drones is within the collective swarm. These drones are communicating with one another, right? "Yeah, right now they're communicating over Wi-Fi." So cool. Future versions of these drones will create their own localized wireless network to communicate, but for now this swarm is a proof of concept. So you've defined a formation, and then they assume that formation? "Yeah. I just say I want to form a line, and the drones themselves figure out where they should go." Once they figure out where they are in relationship to each other, they can then work together to accomplish a shared goal, like ants working as a collective entity. "Once they can coordinate between each other, we can send them out to do specific missions." That's really cool. Swarms of flying robots have their advantages: unlike a single drone, self-coordinating swarms can perform complex operations like mapping much faster, by working in parallel and combining their data. And losing one drone in a swarm doesn't doom the whole operation.

Vijay imagines employing his advanced swarm technology to work on farms. This precision agriculture will help feed the world's growing population. "We'd like robots to be able to roam farms and provide precise information about individual plants, which then could be used to increase the efficiency of food production. That would be a huge impact in the world. This is our duty as responsible citizens and as responsible engineers." The high-flying approach toward resolving the problems of the future is definitely a path. In the future, artificial intelligence coordinates flocks of drones to protect the environment and boost the food supply. To combat the negative effects of climate change on agricultural crops, robotic bees assist with pollination in orchards and on farms, making them more sustainable and productive. Fish-shaped underwater robots automatically deploy at the first sight of an oil spill; these drones create a barricade to rapidly contain spills, saving marine life and oceans across the world.
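The step Dinesh describes, where the operator asks for a line and the drones pick their own slots, can be sketched as a toy assignment problem. The sort-based assignment is a simple stand-in of my own for the real coordination algorithms, which the show does not detail:

```python
# "Form a line": generate target slots for the formation, then let each
# drone claim a slot. Sorting drones left-to-right and pairing them with
# slots in order keeps nearby drones in nearby slots, so paths rarely cross.

def line_slots(n, spacing=1.0):
    # Target positions for an n-drone line along the x axis.
    return [(i * spacing, 0.0) for i in range(n)]

def assign(drones, slots):
    # drones: current (x, y) positions. Returns drone -> target slot.
    ordered = sorted(drones, key=lambda p: p[0])
    return dict(zip(ordered, slots))

drones = [(3.2, 1.0), (0.5, -0.7), (1.9, 0.4)]
plan = assign(drones, line_slots(len(drones)))
```

Each drone then flies to its assigned slot independently, which is the "they figure out where they should go" part of the demonstration.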
Modern society has a long history of building robots to do work that's dangerous, difficult, or too repetitive for humans. AI is poised to automate all kinds of tedious work, ranging from factory work to taxi driving to customer service. While some are worried that smart robots will replace human labor, that's not necessarily the case: as a sector, artificial intelligence is expected to generate 58 million new types of jobs in just a few years. So what will the future of human-robot interaction mean for our work and livelihoods?

I'm at the Massachusetts Institute of Technology to meet Dr. Julie Shah. She's leading groundbreaking research in human-robot collaboration. "My lab works to develop robots that are effective teammates with people." Julie and her team are creating software that helps robots learn from humans, even giving them insight into different human behaviors. By being aware of real people, robots can directly work and interact with them. How do you teach these robots, or machines, to do these human-like tasks? "The first step, as it would be for any person: the first thing they do is become immersed in the environment and observe. And then we need an active learning process. The robot needs to be able to communicate, or show back to the person, what it's learned. We don't want the robot to learn the direct sequence of actions; we want the robot to learn a more general understanding. That's ultimately our challenge." But getting a robot to grasp the bigger-picture concept, in order to understand the basics of its task in the first place, requires a lot of observation and, well, hand-holding. "My research is focusing on trying to make robot programming easier by teaching robots how to do tasks by demonstrating them." Julie's colleague Ankit Shah shows me how this robot is learning to set a table. So this is all the silverware, and the plates, the bowls, the cups, and this is the table that it has to set? "Yes, that is correct." Okay. As any parent knows, the first
step in helping a child to learn is to model the desired behavior. It's the same with machine learning. In this case, the AI robot recognizes the objects with a visual tag similar to a QR code, and for two weeks it observes Ankit setting a table. So did you pick up an item and then place it on the dinner table? "That's basically what we did, and based on that the robot learns what it means to set a table." Dynamic tasks like setting a table or doing laundry are easy for humans but incredibly hard for robots. The software has difficulty with so many variables, and even subtle changes in the environment can throw them off. "One of the things which I like to do is to actually hide some of the objects, so it's not going to see the spoon. The reason we do this is we want to show that the robot is robust to some of the disturbances in the task." The robot software has learned what each object is and where it goes. Now let's see if it's learned the concept and can think dynamically to set the table. "So you can just pick up the card." Here we go; I've revealed the spoon. Incredibly, the robot recognizes the spoon and instantly places it next to the bowl. This reveals that the robot has learned the concept and executes the right action dynamically. In the process, the software is continuously writing and revising its own computer code. Basically, it's learning.

If, like humans, robots can grasp the bigger-picture context, and not just the mathematical tasks, will AI-driven robots of the future spell the end of having to work? "The key aspect is not developing AI to replace or supplant the human work, but really in how we fit them together as puzzle pieces. People work in teams to build cars and to build planes, and robots need to be effective team members." It's real teamwork. "It's as if you're in a basketball game. You have your goal, and you have to think spatially: who am I going to pass the ball to, and at what time do I do that, so that everything matches up."
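The learning-from-demonstration loop Ankit walks through can be sketched as follows. The object names, coordinates, and averaging scheme are invented for illustration; the show does not describe the actual representation the lab uses:

```python
# Learn where each object goes from repeated demonstrations, then set
# the table dynamically with whatever objects are currently visible.

from collections import defaultdict

def learn_placements(demonstrations):
    # Average each object's demonstrated (x, y) position across demos.
    sums = defaultdict(lambda: [0.0, 0.0, 0])
    for demo in demonstrations:
        for obj, (x, y) in demo.items():
            s = sums[obj]
            s[0] += x
            s[1] += y
            s[2] += 1
    return {obj: (s[0] / s[2], s[1] / s[2]) for obj, s in sums.items()}

demos = [
    {"plate": (0.0, 0.00), "spoon": (0.12, 0.00), "cup": (0.10, 0.15)},
    {"plate": (0.0, 0.01), "spoon": (0.12, 0.01), "cup": (0.10, 0.14)},
]
model = learn_placements(demos)

def set_table(visible_objects):
    # Place only what the robot can currently see. A hidden spoon is
    # simply handled later, the moment it is revealed.
    return {obj: model[obj] for obj in visible_objects if obj in model}

first_pass = set_table(["plate", "cup"])   # spoon still hidden under the card
second_pass = set_table(["spoon"])         # card lifted, spoon revealed
```

The point of the sketch is the concept-level behavior: the robot acts on what it sees now, rather than replaying a fixed recorded sequence.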
The analogy of a basketball team is outstanding, because we actually need to know spatially where our teammates are going to be, and the timing is of critical importance. And so we need to develop the AI for a robot to work with us. One of the most difficult aspects of creating hyper intelligence is actually something that even we humans sometimes get wrong, and that is anticipation. Anticipating what a teammate or co-worker might do requires understanding contextual information on a much more sophisticated level, and predicting what will happen next. Can robots make predictions as accurately as we can?

This is an ABB industrial robot, and Julie's team is giving this machine the intelligence necessary to help it anticipate a human co-worker's actions. "This is a simulated manufacturing test that we have set up, to simulate some sort of task that a person and a robot could feasibly work on together." For safety reasons, actual human-robot interaction is at present fairly minimal. "Typically in a factory you would see these guys behind a metal cage, and you wouldn't have people working with them. So what we're trying to do is make something that a person could safely interact with." What are the human and robot to do together in this task? "In this task, a person is placing fasteners in the surface of a plane, and the robot is applying a sealant to seal them in." Okay, can we see it happen? "Sure." The robot must first be able to see and recognize the actions of its human counterpart and adjust to the person's every move. Ooh, I feel like I'm in a superhero movie. "So the cameras in the room can see these lights and track your hands, so that your hand doesn't get cut off by the robot." That's right: the cameras and the lights basically work as eyes for the robot. So that's how the robot knows where I am. The monitor shows the visual representation of the room that's inside the robot's mind. "So this is what the robot might be doing if I'm not in its way and the robot's just sealing." And now, I'm not supposed to be here: I put my hands in
the robot's way. By quickly understanding this human action, the AI software reacts accordingly, by stopping. "It's important to be able to share the workspace." Building on this sense of teamwork, Pem's next step is helping the ABB robot anticipate where he will move next, based on subtle contextual cues. "So in this case, the robot will not only track which actions I've done so far, but also anticipate which portion of the space I'm going to be using. And when it's planning its own motions, it'll avoid those locations, so that we can work together more closely. So what you'll see now is, after I place this bolt, the robot is going to predict I'm going to go to this one next, so it'll behave in a different way. Now that I've placed this bolt, the robot takes a more roundabout path that allows me and the robot to work more closely together, and I don't have to worry about it crashing into me, because I can see that it's trying to avoid me. Similarly, on this side, I place this bolt, and the robot takes a more roundabout path, because it predicts I'm going to go there. Now it slows down, because I'm close to it, and we work together at the same time. So not only is the interaction more efficient, in that the robot's not spending too much time standing still; it's safer, because the robot's not constantly almost hitting me, and it also feels nicer for the person working with the robot." I really love this theme of teamwork. Programming robots to coordinate with us and anticipate where we will move won't only revolutionize the workplace; it will also change society at large. In the future, the coordination of man and machine is so advanced that this collaboration increases productivity and accuracy in most industries. AI robots now accompany surgeons in hospitals across the globe. They anticipate the doctor's needs and hand them the appropriate medical tool just before it's needed. This dramatically reduces surgery times and human error.
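The anticipation Pem demonstrates, predicting which region of the workspace the person will use next and keeping the robot out of it, can be sketched with a simple transition-counting predictor. The regions and the observed history are made up for illustration; the lab's actual models are more sophisticated:

```python
# Count how often the person moves from one work region to another,
# then predict the most likely next region so the planner can avoid it.

from collections import Counter, defaultdict

class Anticipator:
    def __init__(self):
        self.transitions = defaultdict(Counter)
        self.last = None

    def observe(self, region):
        # Record "after region A, the person went to region B".
        if self.last is not None:
            self.transitions[self.last][region] += 1
        self.last = region

    def predict_next(self):
        # Most frequent successor of the current region, if any.
        options = self.transitions.get(self.last)
        return options.most_common(1)[0][0] if options else None

a = Anticipator()
for r in ["A", "B", "A", "B", "A"]:   # the person alternates between A and B
    a.observe(r)
blocked = a.predict_next()            # the robot plans around this region
```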
And will artificial intelligence actually surpass human intelligence? While some machines have exceeded human ability in games like trivia, think of IBM's Watson, and chess, these AI systems were designed to master just a single skill. These programs use brute-force computer processing power and specially tailored software to beat their human opponents. To achieve the holy grail of hyper intelligence, scientists must develop systems with flexible, human-like abilities to both learn and think. This form of smarts is called artificial general intelligence.

I'm back in New York City, on my own campus at Columbia University, to meet with Dr. Hod Lipson. Hod's lab is developing creative robots that paint original artworks, self-assembling robots, and even robots that learn about their world without human assistance. But his ultimate goal is even more ambitious. "Can a machine think about itself? Can it have free will? I believe that, in fact, machines can be sentient, could be self-aware in ways that we can't." As a neuroscientist, I know we've only scratched the surface of our scientific understanding of how consciousness works in humans. How could one possibly use computer code to put this transcendent feature into a robot? "Our hypothesis is actually very simple. It is that self-awareness is nothing but the ability to simulate oneself, to model oneself, to have a self-image." The first step toward creating robotic consciousness is to teach the software to build an image of its physical, mechanical self inside its computer mind. We humans take consciousness like this for granted, even in simple moments like understanding our own image reflected in a mirror. Humans start to develop awareness of their own emotions and thoughts around the age of one. This helps babies understand their self-image in their minds, and it helps them learn about their environment and their role in it. When a robot learns what it is, it can use that self-image to plan new tasks. Both humans' and robots' awareness of the physical self is
called proprioception. Neuroscientists sometimes call this self-awareness of our bodies a sixth sense. "We use the same test that a baby does in its crib. A baby moves around, flails around, moves its arms in ways that look random to us, but they're not random. And then it touches its nose. Now, if its brain predicts that it's going to feel something, and it actually feels that, that means that its self-image was correct. The same thing happens with the robot." If proprioception can be developed to the same level as humans', this could lead to robotic consciousness.

Hod's colleague Robert Kwiatkowski is the proud parent of a brand-new baby robot that he built, and by interacting with its surroundings, it's in the process of developing its own internal self-image. So what are these claws? "So these are actually feet. They're designed for walking on carpet. But as of now, it doesn't really walk; it's still kind of a baby. It needs to learn how the world works first." What do you mean, it's still kind of a baby? What does it do, like a baby? "It's sending completely random actions to each of these robot arms to really try to get an understanding of itself." It looks like a spider that doesn't know how to use its legs. "Yeah, I guess that's a pretty good way to put it. So it will be learning by doing this babbling for somewhere on the order of a day to a week. It will process this data to create an informative model of itself, and from there, imagine how it would walk, and then execute that walking in the real world." These are the first baby steps toward developing its self-image, and like a baby, it will eventually learn to walk. We know this because an earlier version of this robot, using the same technique, learned to walk after 100 hours. But walking won't by default lead to robotic consciousness, and that's why self-awareness is so crucial. "This is a robot which we've taken to calling a self-aware robot. It is self-aware pretty much in a literal sense: it is aware of itself, its location in space, and its dynamics, how it moves."
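The babbling Rob describes, sending random commands and fitting a model that predicts what the body will do, can be sketched on a one-joint toy. The 2.0 gain, the noise level, and the least-squares fit are all illustrative assumptions, not details from the lab's actual robot:

```python
# Motor babbling: issue random commands, record the sensed outcomes,
# and fit a forward model (a "self-image") that predicts outcome from
# command. Here the unknown body is a simple linear gain.

import random

def true_response(command):
    # The robot's actual body, unknown to the learner: the joint moves
    # about 2.0x the commanded amount, plus a little sensor noise.
    return 2.0 * command + random.gauss(0.0, 0.01)

random.seed(0)
commands = [random.uniform(-1, 1) for _ in range(200)]   # random babbling
outcomes = [true_response(c) for c in commands]          # what was sensed

# Least-squares fit of the gain k minimizing squared prediction error:
# k = sum(c * o) / sum(c * c).
k = sum(c * o for c, o in zip(commands, outcomes)) / sum(c * c for c in commands)

def self_image(command):
    # The robot's learned internal model of its own body.
    return k * command
```

Once the model predicts sensed outcomes well, the robot can plan in its head, which is the "imagine how it would walk, then execute it" step in Rob's description.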
So it kind of understands its own movements and where it is in space. How does it do that? "By leveraging this technique, which has become popular in recent years, called deep learning." Deep learning is a form of artificial intelligence that, like the human brain, learns from raw data, unsupervised and without structure. Deep learning gives machines the ability to experience and process reality like us. Rob has devised an experiment to test what this robot knows about its world. "You can think of it as if you're looking at these red cotton balls: you have some idea in space as to where they are. Now, if you were to close your eyes and try to pick them up and put them in this cup, obviously it's not a trivial task, but it's not the most difficult task in the world, because we have good proprioception. We have this good model of yourself: you know where your arm is in space relative to other things that you see in space." But for this robot there's a catch. So where are the cameras? "There are no cameras. None. It's as if you were to close your eyes: you know the locations at the start, and it's picking them up and placing them completely blind." Furthermore, it was not given a map or any formal instructions. The robot simply has to feel its way through the task. All right, let's see it give it a shot. First, the robot learned how to use its arm through trial and error, developing a sense of proprioception. By exploring its surroundings, it generated an internal representation of the world and its place in it. The robot is using only its internal image of the external world to maneuver its arm, to pick up all nine balls and place them in the cup. I'm not sure that's something I could do with my eyes closed. So it's really just based off of understanding where you are in space? "Yeah, that's right." Creating AI robots that have an internal model of their world is an important step toward machine self-awareness. "Self-awareness is sort of a similar proprioceptive capability, but applied to mental thinking. So if they think about thinking, they think about what they are, because once you can do that, it means you can plan things into the future."

Once robots become self-aware, they will need advanced ways to communicate with humans. Keyboards and screens are inadequate for complex thoughts; robots will need to learn to speak and have natural conversations, like a baby who listens to those around her and learns to talk. Laying the groundwork for this kind of human-machine interaction is pioneering scientist Barbara Grosz of Harvard University. Her seminal work in what's called natural language processing directly led to the development of voice-activated artificial intelligence, you know, like Alexa or Siri. "Natural language processing actually predates artificial intelligence, and started with machine-translation efforts. The ability for a computer system to carry on spoken dialogue with a person has been a long-standing goal of artificial intelligence research from its inception. And it turns out this is a challenge, because when you speak, what you say really depends on the context in which you say it." Another challenge is that the meaning of words can change depending on how they are delivered. "So one example is the contrast between saying 'That's fabulous!' and 'That's fabulous.' Also, when we have a conversation, we mark paragraphs at the beginning with a rise in intonation and a fall at the end. So there's a whole way the speech signal tells you something about the context and something about the intended meaning." Barbara's early research led to methods for programming computers to understand the meaning of spoken language by using clues from a person's tone and context. "So let's flash forward: the speech systems are amazing now, because there are lots of recordings of people speaking that they can build their systems on." As a result, AI has gotten much better at understanding, though there is still room for improvement. The systems that do exist are pretty much focused
on very narrow tasks. Take Siri and Alexa as examples: they're mostly oriented around a single question or a single request, and they presume that anybody will stay within the range of behavior that the designers imagined. So researchers are turning to machine learning: by training AI with hours and hours of human conversation, it can learn to better understand the context of how humans converse. Future versions of this technology will allow us to have natural conversations with our computers. One of the things that's amazing to me is that our field has succeeded so well that there are devices out in the world that people use every day; I never dreamed that would be the case in my lifetime.

Hyper-intelligent natural language AI will change the way we interact with our computers and robots, but this advanced technology will never reach its full potential as a human companion until it looks convincingly like us. I'm in Los Angeles to meet Hao Li. His company, Pinscreen, is giving AI a human face, developing cutting-edge techniques to create hyper-realistic digital avatars in an instant. One of the hardest things to bring to the virtual world are humans, right, and specifically faces. To create believable faces, Hao is relying on complex AI algorithms. It's an artificial intelligence that actually digitizes yourself into the computer by just looking at a single image or, you know, partial information. And it's not just a static 3D model, but one that can also be animated and brought to life. Other methods of generating lifelike avatars need to capture multiple angles of a face in motion, and they can take hours to render, but not Pinscreen's technology. I can show you real quick how this works, do you want to see? Yeah! Incredibly, his software also allows him to superimpose any face he wants in real time. So if I do this, this blue face is basically a face tracker. So in real time it's actually modeling my face in 3D; if I move around my face, the blue mask is basically
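One of the prosody cues Barbara Grosz describes, intonation rising or falling at the end of an utterance, can be shown with a toy sketch. Real speech systems extract a pitch contour from audio; here the contours are synthetic and the thresholding is deliberately naive, so treat this as an illustration of the cue, not of any production system.

```python
import numpy as np

# Toy sketch of one prosody cue: does the pitch rise or fall at the end
# of an utterance? The pitch contours below are synthetic (an assumption
# for illustration); real systems would extract them from recorded audio.
def final_pitch_trend(pitch_hz):
    """Fit a line to the last third of a pitch contour; report its direction."""
    contour = np.asarray(pitch_hz, dtype=float)
    tail = contour[-len(contour) // 3:]
    slope = np.polyfit(np.arange(len(tail)), tail, 1)[0]
    return "rising" if slope > 0 else "falling"

t = np.linspace(0.0, 1.0, 60)
statement = 200 - 40 * t        # "that's fabulous."  - pitch drifts down
question = 180 + 60 * t ** 2    # "that's fabulous?"  - pitch sweeps up

print(final_pitch_trend(statement))  # falling
print(final_pitch_trend(question))   # rising
```

The same word sequence gets a different reading depending on the contour, which is exactly the "meaning depends on delivery" point made above.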
a three-dimensional representation of my face. Wow, it's kind of like a green screen, like a Hollywood CGI film. The computer dynamically models Hao's face and tracks his movement on the fly. I can click on him and turn myself into Putin. Wow, it's basically generating the whole thing in real time right now. Oh my god, Putin's talking to me right now! Political leaders aren't the only thing Hao can generate. [Music] Audrey Hepburn! It's generating all the pixels in real time. These teeth were never seen in this picture, so it's predicting what your teeth would end up looking like. These aren't even your teeth? Yeah, these are not my teeth, it's actually generating... oh my goodness.

Hao believes that software like this will give a more human face to the digital world, ultimately resulting in friendlier-looking androids and even virtual beings. I have been hanging with my dog for a while, do you have pets? I have three toy poodles. Someday we're actually going to interact with virtual beings that are going to assist us in our life. Imagine instead of talking to a Siri or Alexa you're talking to a face, right? The best way to communicate is face-to-face communication, and AI provides you perfect companionship. This kind of technology will give AI a face that most people can relate to. Are you human or are you artificial intelligence? That is a very interesting question. I think I am human, but I am artificial intelligence. Hyper-intelligent companions could usher in a more helpful and hopeful world. [Music]

High-powered virtual beings that look, talk, and even think like real humans are commonplace. These holographic assistants take care of many aspects of daily life, ranging from fashion advice to business consultation. Their faces and wardrobes can be customized depending on their role. When a doctor is required, these virtual assistants play the part and are always on call; armed with the latest medical knowledge, they accurately diagnose most common diseases. The same technology is also
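Real-time face trackers like the blue mask in this scene commonly parameterize the face as a small set of "blendshape" weights rather than thousands of raw vertex positions, so each video frame only requires solving for a few numbers. The sketch below shows that parameterization in its simplest form; the mesh sizes and random shapes are placeholders, and this is a generic technique, not Pinscreen's actual algorithm.

```python
import numpy as np

# Generic "blendshape" face model sketch (not Pinscreen's actual code):
# deformed mesh = neutral face + weighted sum of deformation offsets.
rng = np.random.default_rng(1)
n_vertices, n_shapes = 1000, 8  # placeholder sizes for illustration

base = rng.normal(size=(n_vertices, 3))             # neutral face mesh
deltas = rng.normal(size=(n_shapes, n_vertices, 3))  # smile, jaw-open, ...

def face(weights):
    """Return the mesh produced by mixing the blendshapes with `weights`."""
    return base + np.tensordot(weights, deltas, axes=1)

# A tracker would estimate these weights from camera input every frame;
# here we just pick one to show the mesh deforming.
w = np.zeros(n_shapes)
w[0] = 0.8  # e.g. 80% of the first shape ("smile")
mesh = face(w)
print(mesh.shape)  # (1000, 3)
```

Tracking then reduces to re-estimating eight weights per frame instead of three thousand coordinates, which is what makes the on-the-fly 3D mask feasible.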
capable of capturing the image, voice, and life story of loved ones after death; these virtual friends and family are always a part of our lives.

Even if engineers can create lifelike robots that look like humans, called androids, in order for AI to become true companions people will need to feel comfortable embracing these androids, figuratively and literally. I'm outside San Diego to meet Matt McMullen, the founder of Realbotix. Matt is building androids that people will want to embrace physically. The goal is to create not only a robot but an AI, both appealing enough that someone would feel like they were actually getting to know someone, not something. Once the sculpting and casting is complete, Matt's team mounts the face on an actual functioning robot. The faces are actually modular; the face just literally comes off. Yes, it does. The idea is you create one robotic head and a whole bunch of different characters that can all run on that same head. All of the things that move in the face are actuated by these magnets that are in the skin. [Music] Programmers use artificial intelligence and advanced chatbots for these robots; the goal is for them to have natural conversations with their companions. She blinks? Yeah, she's a blinker. It's looking at me! Hello, how are you today? I'm okay, I'm fine, I'm doing just fine, how are you? Why do you ask me that? Um, you know, because I care about your feelings. [Music]

Speech is only one aspect of human communication; facial expressions are hugely important in social interaction, so Matt is incorporating this non-verbal communication into his androids. The vision system that we're working on, she'll be able to look at you and detect your emotion by the expression on your face, by the temperature of your skin, and all these other things. Communication is key, right? Yes, exactly. It looks remarkable: it's moving, it's talking, and it's having this dynamic conversation with you. That's the wonder. I can imagine some people might walk in here and say, oh, look, a sex robot. The thing is, to make
a really impressive and good sex robot, you actually have to make a good robot in the first place. But I think the longer-term goals are going to be to create these systems for people to use in whatever way they see fit. We're creating human-like robots that we think can be used for a huge variety of things: for people who are lonely, whether they're old, or maybe they're socially isolated, or maybe they suffer from social anxiety. Androids with a friendly face could keep the elderly company and monitor their health. Armed with artificial intelligence, these androids could take on other qualitative roles. I think therapy is a huge one: using the robot as a safe conduit for communication and letting people really open up, because they don't feel like they're being judged by something like this. Yeah. Companion androids like this will forge a future where nobody will ever have to feel alone again. Wow.

Lifelike human androids and virtual beings have the potential to enhance human social interactions, but there are ethical concerns as well. Using artificial intelligence, it's possible to hijack a person's physical identity. There is one very big problem in the whole thing, which is privacy. What if I were to do something harmful to you? When you say harmful, you mean reconstruct somebody and have them say something that they would never say or never do? Right. Digital fabrications like this are already emerging online, in what are called deepfakes. I can go on your website, take a picture from it, and then create some content with it without your consent. Also dangerous: swarms of AI-driven drones could be used in terrorist attacks. Can drones be weaponized? Of course they can. We recognize that these scientific breakthroughs yield results that can oftentimes be used against humans, so you have to be held accountable for what you develop, and it's a moral responsibility to think about the broader consequences. When it comes to ethical considerations, I feel hopeful that science, technology, and human
ingenuity will find solutions to these big problems. The potential for artificial intelligence to profoundly improve society, to improve jobs, to improve health care, to improve education, is enormous if we do it the right way: trying to build computer systems that assist people in doing what they're doing, better. Technology is more likely to provide tools that will allow us to become superhuman, augment our intelligence to make better decisions, and get better insights about the world. Future versions of this technology will become even more intelligent than us humans. I believe there's no doubt robots will exceed human capability; the path is very clear, whether it's going to take 20 years or 200 years. This is maybe the most powerful technology we've ever invented: you can ultimately program a robot or a computer to carry out your vision, and in the right hands this technology has the potential to radically transform every part of daily life for the better. A true partnership with hyper-intelligent robots, with their intentions aligned with our own, will transform humanity for the greater good. [Music] [Applause]
Info
Channel: Moconomy
Views: 359,034
Keywords: The Revolution Of AI, Hyper Intelligence, Artificial Intelligence, AI, documentary, documentaries, best documentary, youtube documentary, full documentaries, robotics, drones, technologies, new technologies, cutting edge technologies, future technologies, technologists, intelligence, flying drones, robots, robot rovers, robot assistants, future technologies documentary, artifical intelligence explained, ai explained, Shivani Bigler, Jason Derenick, Kyle McCabe, Christopher Webb Young
Id: ADga4JH3Ywo
Length: 49min 3sec (2943 seconds)
Published: Sat Mar 18 2023