Michael Perry // Boston Dynamics

Captions
Good morning, everybody. A few months ago, when the schedule for this event came out, somebody in the construction industry emailed one of our team members at Boston Dynamics and said, "Why on earth is Boston Dynamics presenting at a construction symposium?" And that's what I'm here to talk to you about. I'll give you an overview of how Boston Dynamics develops robots, how we're moving from research into productization, and how we believe that productization can be applied to a wide variety of industries, specifically the construction world. Now, we have some theses about how this technology can be applied, but I hope the main thing you take away isn't necessarily the exact applications that we imagine for construction, but rather the message that the system we're developing is a platform: flexible for different types of payloads, sensor configurations, manipulation devices, and software, so that you can bridge the digital world with the physical world in a way that was previously impossible.

Many people are familiar with Boston Dynamics because of the research projects we've been putting up on YouTube for the last 10 to 15 years. We're famous for putting our robots through antagonistic environments, making sure that they can persist in their tasks even when faced with things that knock them off course: difficult terrain, adverse weather. This is LS3, which was an 800-pound robot designed to carry 400 pounds of payload into the field for the military. After doing that research, we started thinking about how we could scale down that idea of total mobility into something that could be applied more generally to different applications. This is an example of us testing what it would be like to do package delivery with a robot. Over time we scaled that down even further, using electric actuation and electric power, into the very first version of Spot Mini, which was designed to start going into human-purposed environments. No longer having to operate in large open spaces, we could do locomotion and applications in a house, for example. This isn't something we're planning to do anytime soon; it's more a demonstration of the robot's ability to get inside a human-purposed environment and do something meaningful while still being terrain-agnostic and persistent in its tasks.

Those were all research robots, but a few years ago we did another design iteration on Spot and went through a design-for-manufacturing pass, so that we could hand this robot to contract manufacturers and produce this technology at scale for the first time. Now, the world of legged mobile robots is pretty small right now, and actual deployments of mobile robots in the world are even smaller. So when we started thinking about how to bring this technology to market, we decided to take a platform approach, so that people could build on top of the robot and discover its utility and its applications as they actually started using it. But we also wanted to dig deep into some of the industries where we saw challenges in mobility and automation that a robot could handle, and that's what led us to the construction world. We started seeing a lot of tasks, in terms of surveying and documentation, that take up a lot of time and put people in hazardous situations, people that are already really busy. Many people are working overtime on construction projects as it is, so adding anywhere between 8 and 15 hours per week just going around a site taking pictures is really cumbersome. But that work is important, because we're starting to digitize construction sites by putting sensors on them, so that the data can go into your project management tools and your BIM software and feed back into the overall construction loop. Right now, in the parlance of automation, construction is open-loop: you're taking a very analog process
and saying that, in order to make it digital, you have to do more work to actually reap the benefits of these digital tools. So people that are already busy, already working overtime, are having to increase the amount of work they're already doing just to collect enough data to put into the digital tools, and then do even more work to extract the insights from those tools and start planning and preparing the site, so that you can actually get to the build phase. So we started asking: what would it take, and what would it mean, if you automated some of this data collection and some of the preparation of the site after putting the data through a digital process? Well, you could increase transparency on site, have better insight into the challenges happening day to day, improve your billing cycle, and reduce the amount of rework. You could probably optimize the time of your people on site, not only decreasing the amount of time they spend on documentation, but potentially staging materials and making sure workspaces are less cluttered when people actually go in to do work. And at the end of the day, you'd save money in the construction process.

But the real question is: why isn't this happening yet? Our assessment is that there are a ton of tools already available to start doing some of this work, but none of them are fully capable of addressing the whole construction site. You have fixed sensors, wheeled and tracked robots, and drones, all capable of doing some of this data collection, but there are so many areas that are difficult for automation to reach that this type of automation hasn't been fully deployed on construction sites.

So what would it take? Well, we imagine that if you tell the robot to go up those stairs, it should be able to do it automatically, without any additional work or input from the operator. The robot should also have the intelligence, if it's directed at something that's moving in the environment (dynamic objects like scissor lifts, or materials staged differently from one day to the next), to decide on board how to get through the environment and still persist in its task. And it should be intuitive: you shouldn't have to be a trained robot operator to interact with the robot and get it to do what it needs to do. You should just be able to tell the robot where to go and what to do, and it should execute the task. We imagine this would create a closed loop for the construction environment, where collecting data, processing, and planning are all automated, so that people on site can focus on actually building.

And that's where Spot enters the picture. Spot is our first productized robot, which we will be bringing to market later this summer, and this is the robot that is finally bringing unrestricted mobility to the construction site and other applications. The first thing you'll notice about Spot is that, by having legs, it can get through dynamic, uncertain environments with ease. As you saw, it can drive up those stairs without having to change any of the software, without having to do anything besides saying "drive forward." Being a quadruped, it's also omnidirectional: it can move forward and backward, it can turn in place, and it can move side to side (strafe), which means it has a much smaller footprint than any other type of mobile robot on a construction site. It also uses its legs to position its body in different ways: as you can see, it can stretch up, stretch down, and shift side to side, which, besides looking adorable, means you have a lot more flexibility in where you can point a sensor, and, as we'll show, it increases the workspace of the arm. Spot can even sit down.
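As a rough illustration of what "omnidirectional" means in practice, here is a small sketch (my own toy example, not the Spot API; the function name is hypothetical) that integrates a body-frame velocity command, forward speed, strafe speed, and turn rate, into a world-frame trajectory:

```python
import math

def integrate_body_velocity(x, y, heading, vx, vy, omega, dt, steps):
    """Integrate a body-frame velocity command (vx forward, vy left,
    omega turn rate) into a world-frame pose using simple Euler steps."""
    for _ in range(steps):
        # Rotate the body-frame velocity into the world frame.
        wx = vx * math.cos(heading) - vy * math.sin(heading)
        wy = vx * math.sin(heading) + vy * math.cos(heading)
        x += wx * dt
        y += wy * dt
        heading += omega * dt
    return x, y, heading

# Pure strafe: move sideways for 2 s without the heading ever changing.
x, y, h = integrate_body_velocity(0.0, 0.0, 0.0,
                                  vx=0.0, vy=0.5, omega=0.0, dt=0.1, steps=20)

# Turn in place: the heading changes while the position does not.
x2, y2, h2 = integrate_body_velocity(0.0, 0.0, 0.0,
                                     vx=0.0, vy=0.0, omega=math.pi / 2,
                                     dt=0.1, steps=10)
```

A wheeled, non-holonomic base would couple these terms; a quadruped can command them independently, which is what keeps its footprint small in cluttered spaces.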
Sitting matters because legged robots do have a tendency to fall over, and if Spot falls, it can get back up again by itself and stand. The robot has five stereo camera pairs around the body giving it real-time information about its surroundings, which not only helps it understand where to put its feet, but also where there might be obstacles it can walk around, and it does that autonomously. The robot has three payload ports on its back. They're pretty standard DB25 connectors that let you provide both power and comms to external processors, sensors, and communication devices. Here we have a spread-spectrum radio on the back of the robot, which gives it a wide communication range; we've even done tests with LTE on the back of the robot, where we've been sitting in Tokyo driving a robot in Boston. The payload ports also allow you to add a manipulation device. This is the arm that will ship for the robot, selling separately in the months after the robot is available, and it's a critical component because it moves the robot from just sensing the world to sensing and interacting with the world, doing a wide variety of tasks. Critically, we've automated a lot of behaviors specifically for the arm, so that you don't have to be a skilled robot operator to do great things with it. No longer do you have to do joint-by-joint kinematic planning: Marty is just driving the arm, telling it where to go, and the arm moves naturally. The other thing that's really cool about the arm is that you can decouple the body motion from the manipulation task. This is what we call "chicken head mode": the head stays in place as the body moves around. That's not only really cool for demonstrations, it's also critical if you're doing manipulation tasks in the world, because the robot can hold on to an object, like a door handle, while the body does something else, leveraging the power of its form factor.
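To make the decoupling idea concrete, here is a toy 2D (SE(2)) version of the bookkeeping involved (my own illustration, not Boston Dynamics code): if the gripper must stay at a fixed world position while the base moves, the body-relative arm target is just that fixed world point re-expressed in the new base frame.

```python
import math

def world_to_body(pose, point):
    """Express a world-frame point in the body frame of pose = (x, y, heading)."""
    x, y, th = pose
    dx, dy = point[0] - x, point[1] - y
    return (math.cos(th) * dx + math.sin(th) * dy,
            -math.sin(th) * dx + math.cos(th) * dy)

def body_to_world(pose, point):
    """Express a body-frame point in the world frame."""
    x, y, th = pose
    px, py = point
    return (x + math.cos(th) * px - math.sin(th) * py,
            y + math.sin(th) * px + math.cos(th) * py)

target_world = (1.0, 0.5)       # e.g. a door handle the gripper is holding
base_after = (0.2, -0.1, 0.3)   # the body has shifted and rotated

# Recompute the body-relative arm target after the base moves...
arm_target = world_to_body(base_after, target_world)
# ...and the gripper stays put in the world frame.
held_point = body_to_world(base_after, arm_target)
```

The real controller works in 3D and at high rate, but the principle is the same: keep the end-effector goal in the world frame and continually re-express it relative to the moving body.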
The last thing we'll show is that, through software, we have an API that allows you to program different types of interfaces for the robot. This is an interface we've designed that lets you see the cameras from the robot, and in fact we have a mode called "tap to go," which we'll be showing during the coffee break, where you just tap on the screen to tell the robot where you want it to go, and it locomotes there automatically, by itself. That creates a really simple and intuitive way to drive this robot around, in a way that was previously incredibly difficult.

Beyond this type of human-in-the-loop teleoperation, Spot is also capable of a wide variety of autonomous behaviors. This is an example of what Spot is seeing as it goes through the world. As I said, it's constantly looking at its environment, using the stereo cameras around its entire body, looking for places to put its feet safely, figuring out "if I put myself here, will I be unstable, and if I am unstable, how can I quickly move to become stable again?" Here's an example of an antagonistic behavior that Spot can automatically correct against: the vision system, along with the internal IMU, tells it what is stable and how to stay there. It's also looking for obstacles, so if you drive it toward a doorway or a static obstacle in its path, it'll figure out a way to walk around it, or just stop. And finally, it's using its stereo camera pairs to build a 3D point cloud of the environment around it. This is Spot doing an autonomous patrol in our office. In the lower left-hand corner you'll notice the prior map that it collected on a previous teleoperation mission: you drive the robot through the environment, collect a point cloud, and then you can start creating waypoints to say "go here, here, and here," and at each one of those places you can have it do a different task.
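A "tap to go" interface ultimately has to turn a 2D screen tap into a walk target on the ground. As a hedged sketch of one common way to do that (a toy pinhole-camera model of my own, not the actual Spot implementation), you can intersect the tapped pixel's viewing ray with a flat ground plane:

```python
import math

def tap_to_ground(u, v, fx, cx, cy, pitch, height):
    """Project a tapped pixel (u, v) onto the ground plane.

    Assumes a pinhole camera `height` meters above flat ground, pitched
    down by `pitch` radians, with focal length `fx` (pixels) and
    principal point (cx, cy). Returns a (forward, left) target in
    meters, or None if the tap points at or above the horizon."""
    # Ray direction in the camera frame (z forward, x right, y down).
    x = (u - cx) / fx
    y = (v - cy) / fx
    # Rotate into a world frame (X forward, Y left, Z up).
    d_fwd = math.cos(pitch) - y * math.sin(pitch)
    d_left = -x
    d_down = y * math.cos(pitch) + math.sin(pitch)
    if d_down <= 1e-9:
        return None  # the ray never hits the ground
    t = height / d_down
    return (t * d_fwd, t * d_left)

# A tap at the image center, camera 0.5 m up and pitched down 45 degrees,
# lands half a metre ahead of the camera:
target = tap_to_ground(320, 240, fx=400.0, cx=320.0, cy=240.0,
                       pitch=math.pi / 4, height=0.5)
```

A real system would use the depth from the stereo cameras rather than a flat-ground assumption, but the interaction pattern is the same: pixel in, world-frame goal out.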
The important thing about this video, though, is that it isn't only the fixed point-cloud map that Spot navigates against; it also has a real-time voxel map, so it can sense its environment live and see, "there's an obstacle here that wasn't here before, let me figure out a way around it." Even as the environment changes, Spot has enough intelligence on board to navigate through a really complex environment.

So the question is: what does that mean for a construction site? With these capabilities, what can Spot potentially do? We've been speaking with a number of people in the construction industry over the past several months to get a general sense, and that could include doing regular site surveys, some human interaction, and potentially manipulation tasks like staging materials or collecting tools and staging them where they're supposed to be for the next day's work. Even more speculative would be the human-interaction loop: you could start doing projection mapping for construction workers, showing them exactly where to drill on the site. So let me walk you through some of the behaviors already in development that would enable this type of future, where robots collaborate with construction workers on site.

Let's say this is a blueprint, and this is a map of where you want Spot to go on the construction site. You could start tagging individual places for Spot to perform a behavior. Say we start with an inspection mission: you have an inspection payload, a 360-degree camera plus a pan-tilt-zoom camera that can zoom in and look closely at details. Now you can create, effectively, a Google Maps-style view of your work site, comparing day 1 versus day 3, and drop into any of these blue spheres to pop into the more detailed inspection work. Here, on day 1, the drywall pallets hadn't been delivered, and now they have.
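The navigation behavior described above, a prior waypoint map plus a live obstacle map, can be sketched with a toy occupancy grid (my own minimal illustration, not Spot's actual navigation stack): plan a path on the known map, then re-plan when a newly sensed obstacle lands on a cell the old path used.

```python
from collections import deque

def plan(grid, start, goal):
    """Breadth-first shortest path on an occupancy grid (0 free, 1 occupied).
    Returns a list of (row, col) cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and (nr, nc) not in prev:
                prev[(nr, nc)] = (r, c)
                queue.append((nr, nc))
    return None

grid = [[0] * 5 for _ in range(5)]
path = plan(grid, (0, 0), (0, 4))        # straight line along the top row

# A scissor lift appears where the old path went: update the live map...
grid[0][2] = 1
replanned = plan(grid, (0, 0), (0, 4))   # ...and route around it
```

The real robot does this against a 3D voxel map at walking speed, but the loop is the same: sense, update the map, re-plan, keep going.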
So now we've done an autonomous survey of this site on a daily basis, and we can start comparing data from one day to the next. Obviously this is a human driving the inspection and doing the change detection, but in the future you could use an algorithm to do some of that change-detection work.

Now, say you want the robot to start manipulating the environment, so it can open doors, get through more spaces, and actually start touching the world around it. This is an example of an automated behavior that we built into the robot to do some of that object-manipulation work: Spot opening a door. It's able to see where the handle is, then it uses force control to twist up and down, figures out whether it's a push or a pull door, reaches around, and figures out a strategy for getting through the door itself. Now, we've shown videos of this before, and if you look at the YouTube comments (I never recommend looking at the YouTube comments), one of the first criticisms people raise is, "Oh, that was fully scripted; someone like Marty was in the back puppeting the robot through." That's actually not true. We've been putting the robot's autonomy system through a wide variety of door types to really tease out the faults and make sure we're able to get through every single door in our office, and every single door in any new environment we go into. You'll notice in this video that some of the doors are push, some are pull, some have highly reflective surfaces, and some are heavier than others. You can't really see it, but some of the hinges here are really sticky. Spot is still able to figure out a strategy for opening them and getting through the space, and that type of localized intelligence could be useful not only for getting through an environment, but also for a wider variety of interactive tasks on a construction site.
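The push-or-pull decision can, in principle, be made from force feedback alone. Here is a deliberately simplified sketch of that idea (a toy simulation written for illustration; the real controller is far more involved): push gently on the latched-open handle, and if the door doesn't yield under a modest force, classify it as a pull door.

```python
def classify_door(measure_displacement, push_force=20.0, threshold=0.01):
    """Probe a door whose latch is already released: apply a gentle push
    and check whether it moves. `measure_displacement(force)` returns
    how far the door swung, in meters, for a given push force in newtons."""
    moved = measure_displacement(push_force)
    return "push" if moved > threshold else "pull"

# Toy door models standing in for real force/displacement feedback:
def push_door(force):
    return force / 100.0   # swings away when pushed

def pull_door(force):
    return 0.0             # the frame stops it; no displacement

kind_a = classify_door(push_door)
kind_b = classify_door(pull_door)
```

The point of the sketch is the control pattern: probe with a bounded force, observe the response, and branch the behavior, rather than scripting each door in advance.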
So beyond just getting through a space and taking pictures, actually doing some object manipulation on the construction site could be valuable. We've heard several times that you go into a construction site and it turns out the place you're going to be working has a ton of debris covering it, and the first hour is just clearing out all that material before you can start on the thing you're supposed to do that day. Or tools have a tendency to walk all over the job site; if you could have something retrieve them and stage them in one place, perhaps that would be useful. So here's an example of Spot doing that type of manipulation task repetitively: we're playing fetch with Spot. We've got a newspaper roll that we've thrown to random locations through our lab, and Spot is still able to go identify the object, grab it, and stage it.

We've now gone through several iterations of Spot; as you saw in the research slides, the design and the morphology have changed over the years. Last year we built 12 robots in-house, where we rebuilt everything for this new design-for-manufacturing morphology, and that allowed us to get out into the real world and do some testing. Starting at the beginning of this year, we passed everything to contract manufacturers. We're now through a beta build phase, on our way to building 100 robots by the end of this month, and we're putting these beta robots through hundreds of hours of cycle time per week to tease out hardware and software faults before we go into mass production. Here we're putting a lot of strain and stress on all the actuators: you saw the knee actuators moving, and these are some of the hip actuators being tested. These are the robot lanes, where we just
have the robots walking up and down hundreds of hours per week, again to tease out software faults that might hinder the robot's performance out in the world. I love these videos; it always looks like a stockyard to me. One of the things that's really great about having the autonomy system up and running is that the robots do all of these tests themselves: those were autonomous behaviors, just running back and forth, so only one driver monitors the entire fleet. We're also making sure that its mobility is still up to our standard, going through difficult, rocky environments and up stairs, which is still something we test through this process.

But we're not just looking at the robot in the lab; we're already starting to deploy it on real-world sites to do proof-of-concept trials and really understand the value proposition of a robot in the real world. These are some of the autonomous inspection tasks that we did together with some of our partners in Japan; this is the robot going through a Takenaka site in Tokyo. This is Spot going through a Hensel Phelps site at SFO Terminal 1, where they're working together with HoloBuilder to do documentation on the site. They'd been doing this as a manual process for several months already, but using Spot they can now start automating that documentation process. We're putting Spot through more difficult terrain, like grass and snow, to test its mobility at other Japanese construction sites, such as Fujita's. In the next piece you'll see what the full HoloBuilder integration looks like at the San Francisco Airport Terminal 1 test site: they designed a 360-degree camera payload as a mast on top of the robot, and the tablet running on the back gives it cues for when and where to take pictures as it drives through the site. And that's something we handed to them; we enabled this customer to design their own application with the robot, creating their own payload and their own hardware, and to actually deploy it in the real world.

This is a picture of Spot doing an inspection task for Kajima, a construction company in Japan. They're doing a tunnel construction project where, every meter they drill deeper into the tunnel, they have to send people, effectively wearing armor, to the tunnel face to do an inspection and make sure it's not going to collapse on their workers. They've had challenges deploying automation that can't get close enough to the tunnel face, or that has trouble getting down the full kilometer to reach it. They've started testing Spot for this task, which they believe will save them time, reduce the workload of people on site who are already working overtime as it is, and take them out of harm's way. We're really excited about applications like this, because they prove the full value of a robot deployed in a real-world environment, doing something that's dull, dirty, dangerous, time-consuming, and costly, and improving the process as a whole, eventually connecting the analog and the digital worlds together.

So, as I mentioned, we plan to start selling this robot later this summer, and we're hoping to find ecosystem partners who can develop applications on top of the robot, using the API and the payload ports to explore its capabilities. In parallel, we're working closely with actual end customers to deploy the robot at their facilities, to make sure we're able to provide the full value of what this technology can be. We invite you to reach out to us at bostondynamics.com/AEC if you've got any interest in working together with us.
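Payload integrations like the one described above often boil down to a simple rule for when to capture: for example, take a 360-degree photo every few meters of travel. Here is a minimal sketch of such a distance-based trigger (my own illustration; the actual cueing logic used on that site isn't public):

```python
import math

def capture_points(path, spacing):
    """Given a robot path as (x, y) samples, return the points at which
    an image should be captured, one every `spacing` meters of travel."""
    captures = [path[0]]            # always capture at the start
    travelled = 0.0
    for prev, cur in zip(path, path[1:]):
        travelled += math.hypot(cur[0] - prev[0], cur[1] - prev[1])
        if travelled >= spacing:
            captures.append(cur)
            travelled = 0.0
    return captures

# A straight 10 m corridor sampled every metre, captured every 3 m:
path = [(float(i), 0.0) for i in range(11)]
shots = capture_points(path, spacing=3.0)
```

In a waypoint-mission setting you could instead attach capture actions to specific waypoints; the distance-based rule is just the simplest policy that gives evenly spaced coverage.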
I think in the five minutes I have remaining, I can open up for some questions. Martin's back there with a microphone. There's a question right here.

Q: Hello. It's probably a question you get a lot, but roughly what sort of pricing do you think they will be put out at?

A: We've not finalized the pricing yet; some of that is going to depend on some of the final tweaks that we're doing in the production process, but we're imagining that this is going to come in below $100,000. Any other questions?

Q: I was just wondering, what's the most unexpected thing you've discovered by using the robot and letting other people develop the payloads? What's the most unexpected and the best thing you've seen?

A: So the question was, what's the most unexpected thing we've seen built on top of the robot? Honestly, not a lot, because a lot of it is our own internal development right now; eventually we plan to release it to other people. In terms of the development that our internal team has put together, I really thought the application we filmed recently, of several Spots pulling a trailer in a coordinated way, was a pretty surprising application of the technology. But we do get a lot of really interesting requests through our website in terms of what people hope the robot can do. Christmas will be interesting for Santa. I think we've got time for one more question.

Q: Hi there, Greg from React Robotics. We have the other quadruped in the exhibition space downstairs. Our focus at the moment is to allow researchers, particularly AI and machine-learning researchers, to use the platform, and one of the reasons we do that is the potential improvements that a large community of researchers can make, as well as the accountability aspects, so having open, low-level control is quite important for them. Is this an area that you're interested in exploring? What are your thoughts on being open with various aspects of your software?

A: So, we are making the API public, but we're only exposing the mission layer right now, and that's primarily because most of the customers we're hoping to engage with are not interested in the lower-level joint kinematics; they're more interested in what the robot can do. But that is part of the reason why we're putting hundreds of hours into qualification testing of its locomotion capabilities, which are effectively the culmination of several decades of research that we've done internally to get legged robots to work.

Great, thanks. All right, I'll pass things back to Joe, but if you're interested in seeing Spot in action, we'll be downstairs during the coffee break, so please come by, say hi, and get some stick time with Spot. Thank you.
Info
Channel: AEC MAGAZINE
Views: 4,287
Rating: 4.8367348 out of 5
Id: oXXVtQX0nsw
Length: 30min 18sec (1818 seconds)
Published: Mon Jul 29 2019