Nvidia Just Revealed Stunning New AI Upgrades! (Nvidia Computex)

Video Statistics and Information

Captions
So Nvidia's CEO Jensen Huang recently did a keynote speech at Computex 2024, and it was genuinely fascinating to see all of the updates that Nvidia is going to be delivering over the years to come. This video is going to show you guys only the key AI updates that you need to be aware of, because they are rather important and also quite fascinating.

Coming in at number one is, of course, digital twins and advancing the capabilities of robots and AI, with a lot more developments that we're going to see in the future.

The era of robotics has arrived. One day, everything that moves will be autonomous. Researchers and companies around the world are developing robots powered by physical AI. Physical AIs are models that can understand instructions and autonomously perform complex tasks in the real world. Multimodal LLMs are breakthroughs that enable robots to learn, perceive, and understand the world around them and plan how they'll act, and from human demonstrations robots can now learn the skills required to interact with the world using gross and fine motor skills.

One of the integral technologies for advancing robotics is reinforcement learning. Just as LLMs need RLHF, or reinforcement learning from human feedback, to learn particular skills, generative physical AI can learn skills using reinforcement learning from physics feedback in a simulated world. These simulation environments are where robots learn to make decisions by performing actions in a virtual world that obeys the laws of physics. In these robot gyms, a robot can learn to perform complex and dynamic tasks safely and quickly, refining its skills through millions of acts of trial and error.

We built Nvidia Omniverse as the operating system where physical AIs can be created. Omniverse is a development platform for virtual world simulation, combining real-time physically based rendering, physics simulation, and generative AI technologies. In Omniverse, robots can learn how to be robots: they learn how to autonomously manipulate objects with precision, such as grasping and handling objects, or navigate environments autonomously, finding optimal paths while avoiding obstacles and hazards. Learning in Omniverse minimizes the sim-to-real gap and maximizes the transfer of learned behavior.

Building robots with generative physical AI requires three computers: Nvidia AI supercomputers to train the models; Nvidia Jetson Orin and the next-generation Jetson Thor robotic supercomputers to run the models; and Nvidia Omniverse, where robots can learn and refine their skills in simulated worlds. We build the platforms, acceleration libraries, and AI models needed by developers and companies, and allow them to use any or all of the stacks that suit them best. The next wave of AI is here: robotics powered by physical AI will revolutionize industries.
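To make that "robot gym" idea a bit more concrete, here is a rough sketch of the trial-and-error loop the narration describes. It uses the open-source Gymnasium library and a toy pendulum task rather than Nvidia's actual Isaac Sim or Isaac Lab APIs, and the one-parameter "policy" is purely illustrative; a real pipeline would train a neural network policy with something like PPO across thousands of GPU-parallel simulated environments.

```python
# Minimal sketch of "learning by trial and error in a simulated gym".
# Uses the open-source Gymnasium library and a toy physics task
# (Pendulum-v1), NOT NVIDIA's Isaac Sim / Isaac Lab APIs -- it only
# illustrates the loop those tools run at much larger scale.
import gymnasium as gym
import numpy as np

env = gym.make("Pendulum-v1")          # a virtual world that obeys (simple) physics
best_gain, best_return = None, -np.inf

# Crude trial-and-error search over a 1-parameter "policy":
# torque = -gain * angular_velocity. A real setup would learn a neural
# policy instead of sampling a single scalar gain.
for trial in range(50):
    gain = np.random.uniform(0.0, 4.0)
    obs, _ = env.reset(seed=trial)
    total_reward = 0.0
    for _ in range(200):
        action = np.clip([-gain * obs[2]], -2.0, 2.0)   # obs[2] is angular velocity
        obs, reward, terminated, truncated, _ = env.step(action)
        total_reward += reward
        if terminated or truncated:
            break
    if total_reward > best_return:
        best_gain, best_return = gain, total_reward

print(f"best gain {best_gain:.2f} achieved return {best_return:.1f}")
```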
So this recent demo showed us, for this first part of the video, that robots and digital twins trained in simulation really do speed up the time in which we can get feedback, in order to get these robots out into the physical world to do tasks very effectively. We can see that these digital twins are already being used: there was even a robot demo that I was looking at yesterday where the GR-1 robot, in fact the Unitree G1 robot, was showcased being trained in the digital twin. So basically, as you can see right here, the demonstrations happen in the Isaac Sim digital twin in Omniverse, and then of course the result is mapped onto the real robot, which is very effective.

Now, some of the robots that you are seeing do have partnerships with Nvidia, and it seems that not only are there humanoid robots, but the entire robotics industry is taking this upgrade and evolving at the same time. That means, like we saw at the start, we could see an entirely different evolution: not only humanoid robots but other kinds of robots too, robots in surgery, autonomous robots, just a huge range of applications for robots in our everyday lives.

Now, another thing that was really fascinating from Nvidia's demo was, of course, Nvidia's digital humans. This caught me a little bit by surprise, but not too much, because I predicted this in a video where I spoke about the things I thought were going to come in AI for 2024. But this is digital humans, agent avatars, or however you want to describe it. I'll let Nvidia take it away, and then I'll further describe exactly what's going on here.

Before I head out to the night market, let's dive into some exciting frontiers of digital humans. Imagine a future where computers interact with us just like humans can. "Hi, my name is Sophie, and I am a digital human brand ambassador for UneeQ." This is the incredible reality of digital humans. Digital humans will revolutionize industries, from customer service to advertising and gaming; the possibilities for digital humans are endless. Using the scans you took of your current kitchen with your phone, they will be AI interior designers, helping generate beautiful photorealistic suggestions and sourcing the materials and furniture: "We have generated several design options for you to choose from." They'll also be AI customer service agents, making the interaction more engaging and personalized, or digital healthcare workers who will check on patients, providing timely, personalized care. "Um, I did forget to mention to the doctor that I am allergic to penicillin. Is it still okay to take the medications?" "The antibiotics you've been prescribed, ciprofloxacin and metronidazole, don't contain penicillin, so it's perfectly safe for you to take them." And they'll even be AI brand ambassadors setting the next marketing and advertising trends: "Hi, I'm imma, Japan's first virtual model."

New breakthroughs in generative AI and computer graphics let digital humans see, understand, and interact with us in humanlike ways. "From what I can see, it looks like you're in some kind of recording or production setup." The foundation of digital humans is AI models built on multilingual speech recognition and synthesis, and LLMs that understand and generate conversation. The AIs connect to another generative AI to dynamically animate a lifelike 3D mesh of a face, and finally, AI models that reproduce lifelike appearances enable real-time path-traced subsurface scattering, simulating the way light penetrates the skin, scatters, and exits at various points, giving skin its soft and translucent appearance.

Nvidia ACE is a suite of digital human technologies packaged as easy-to-deploy, fully optimized microservices, or NIMs. Developers can integrate ACE NIMs into their existing frameworks, engines, and digital human experiences: Nemotron SLM and LLM NIMs to understand our intent and orchestrate other models, Riva speech NIMs for interactive speech and translation, Audio2Face and gesture NIMs for facial and body animation, and Omniverse RTX with DLSS for neural rendering of skin and hair. ACE NIMs run on Nvidia GDN, a global network of Nvidia-accelerated infrastructure that delivers low-latency digital human processing to over 100 regions.
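Stripping away the rendering, the pipeline ACE describes is essentially speech in, text reply out, animated face out. Here's a minimal, purely illustrative sketch of that flow; every function is a hypothetical stand-in rather than the real Riva, Nemotron, or Audio2Face NIM APIs, which in practice run as networked microservices.

```python
# A minimal sketch of the kind of pipeline the ACE demo describes:
# speech in -> transcript -> LLM reply -> speech out -> facial animation.
# Every function here is a hypothetical stand-in, NOT the real Riva /
# Nemotron / Audio2Face NIM APIs; those run as microservices called
# over the network in production.
from dataclasses import dataclass

@dataclass
class AnimationFrame:
    timestamp: float
    blendshape_weights: dict   # e.g. {"jawOpen": 0.4, "mouthSmile": 0.1}

def transcribe(audio: bytes) -> str:
    # stand-in for a speech-recognition service
    return "Am I allergic to anything in this prescription?"

def generate_reply(transcript: str, context: str) -> str:
    # stand-in for an LLM / SLM that holds the conversation
    return f"Based on your chart ({context}), these antibiotics contain no penicillin."

def synthesize_speech(text: str) -> bytes:
    # stand-in for text-to-speech
    return text.encode("utf-8")  # placeholder "audio"

def animate_face(audio: bytes) -> list[AnimationFrame]:
    # stand-in for audio-driven facial animation
    return [AnimationFrame(t * 0.033, {"jawOpen": 0.5}) for t in range(3)]

def digital_human_turn(mic_audio: bytes, patient_context: str):
    transcript = transcribe(mic_audio)
    reply_text = generate_reply(transcript, patient_context)
    reply_audio = synthesize_speech(reply_text)
    frames = animate_face(reply_audio)
    return reply_text, reply_audio, frames   # a renderer would consume audio + frames

text, audio, frames = digital_human_turn(b"...", "allergy: penicillin")
print(text, "|", len(frames), "animation frames")
```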
So Nvidia's digital humans are going to be a very fascinating concept to deal with in the future, as humans get more and more accustomed to dealing with AI in, I guess you could say, their everyday environment. I think this is a perfect segue and a perfect demo into what the future holds, because there are a lot of interactions that AI actually improves. I know that a lot of people might think this is too hypey, or too much AI integration, but trust me when I tell you that there are just some interactions that are far better to have an AI guide you through in terms of the overall experience. Humans always do want to talk to another human, but sometimes deciphering that kind of information is just much easier if you can talk to someone who is patient and who can explain things in a way you may not previously have understood. And a lot of the time these AI systems will genuinely speak other languages fluently, meaning that the cross-border barrier many used to suffer from is no longer going to be an issue. So I think this is real progress across the board. I'm not entirely sure how AI influencers are going to be perceived; I've seen some of them do pretty well so far, but I do think this entire digital human thing is truly the next step in AI.

The next part that we do have to talk about is, of course, Nvidia's robot factories, and this one is crazy, so take a look.

Demand for Nvidia accelerated computing is skyrocketing as the world modernizes traditional data centers into generative AI factories. Foxconn, the world's largest electronics manufacturer, is gearing up to meet this demand by building robotic factories with Nvidia Omniverse and AI. Factory planners use Omniverse to integrate facility and equipment data from leading industry applications like Siemens Teamcenter X and Autodesk Revit. In the digital twin, they optimize floor layout and line configurations and locate optimal camera placements to monitor future operations with Nvidia Metropolis-powered vision AI. Virtual integration saves planners the enormous cost of physical change orders during construction. The Foxconn teams use the digital twin as the source of truth to communicate and validate accurate equipment layout.

The Omniverse digital twin is also the robot gym where Foxconn developers train and test Nvidia Isaac AI applications for robotic perception and manipulation, and Metropolis AI applications for sensor fusion. In Omniverse, Foxconn simulates two robot AIs before deploying runtimes to Jetson computers on the assembly line. They simulate Isaac Manipulator libraries and AI models for automated optical inspection, covering object identification, defect detection, and trajectory planning, to transfer HGX systems to the test pods. They simulate Isaac Perceptor-powered Foxbot AMRs as they perceive and move about their environment with 3D mapping and reconstruction. With Omniverse, Foxconn builds robotic factories that orchestrate robots running on Nvidia Isaac to build Nvidia AI supercomputers, which in turn train Foxconn robots.

Now, that was not surprising to me at all. I know that Nvidia is truly working hard on these robot factories with all of the Omniverse digital twin work they are doing.
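One small piece of that planning workflow, picking camera placements before anything is physically installed, can be illustrated with a toy search. The sketch below scores candidate camera pairs by how much of a 2D floor grid they cover; this is not how Omniverse or Metropolis actually work (they operate on full 3D scenes and real sensor models), it just shows why iterating virtually is cheaper than physical change orders.

```python
# Toy illustration of the "optimal camera placement" idea from the
# Foxconn digital-twin description: score candidate camera positions by
# how much of a 2-D floor grid they cover. Purely illustrative; not an
# Omniverse or Metropolis API.
import itertools

FLOOR_W, FLOOR_H, CAMERA_RANGE = 12, 8, 4   # grid cells

def covered_cells(cam):
    cx, cy = cam
    return {(x, y)
            for x in range(FLOOR_W) for y in range(FLOOR_H)
            if (x - cx) ** 2 + (y - cy) ** 2 <= CAMERA_RANGE ** 2}

candidates = [(x, y) for x in range(FLOOR_W) for y in range(FLOOR_H)]

# Exhaustively try every pair of camera positions and keep the pair
# that covers the most floor cells.
best_pair, best_cover = None, -1
for pair in itertools.combinations(candidates, 2):
    cover = len(covered_cells(pair[0]) | covered_cells(pair[1]))
    if cover > best_cover:
        best_pair, best_cover = pair, cover

print(f"best camera pair {best_pair} covers {best_cover}/{FLOOR_W * FLOOR_H} cells")
```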
But another thing that I did find completely interesting from this announcement was the gaming assistant that Nvidia announced. This one is pretty fascinating, because we recently saw some kind of gaming assistant released by Microsoft as well, but here is Nvidia's new gaming assistant. I'll let them take it away, and then I'll update you guys on my thoughts and theories. It's a demo showcasing how developers can enhance games and apps with AI assistance.

Many games offer vast universes and rich gameplay systems to explore, each with its own intricacies and mechanics. Depth has become the cornerstone of some of the most captivating gaming experiences. Whether it's a new player trying a game for the first time, overwhelmed by massive skill trees, or a seasoned veteran looking for that perfect weapon to min-max a DPS build, many of us spend hours researching, poring over the internet to learn, to dig deeper, and ultimately enjoy more of what the game has to offer. To show you what's possible, we've partnered with Studio Wildcard for a tech demonstration using ARK: Survival Ascended.

The assistant is your conduit for game-specific knowledge. You can ask it questions you often find yourself looking up online, like what's the best early-game weapon and where do I find the crafting materials for it. "The best early-game weapon is the spear; it provides knockback and is essential for survival." Not only does it know about the game, it can understand what's happening on the screen. Maybe you're staring at a dinosaur you're not familiar with and you want to know what it is or how to tame it. And because the assistant is context-aware, it can tailor its recommendations to your playthrough, say you've racked up enough points to level up or to unlock a crafting recipe but you're just not sure what to go for next.

But gamers often need help outside the game too. The AI assistant understands your system and can help tune and optimize it for you. It can provide insights into performance metrics like FPS and PC latency, for example, and chart these for you as you tweak your PC for more responsiveness. It can scan your system to see if there's anything out of order, like not taking full advantage of your display's capabilities. You can ask it for recommendations to increase performance or efficiency; it may suggest optimal graphics settings for your particular setup, or even overclocking, or ways to consume less power while maintaining a target of 60 FPS. "Your system has been optimized for performance per watt with a minimum maintained frame rate of 60 FPS." AI assistants will transform the way we engage with our favorite games and apps, and Project G-Assist is a glimpse into that future.

So yeah, I think this is kind of interesting, because one increasing trend that we're currently seeing is that gaming AIs are merging with the current gaming landscape. These AI systems are going to be in there, it seems, acting as guides for players in these games in the future. I wonder if this trend will continue and if they can actually pull this off, because certain gaming trends do really well and other gaming trends just don't do well at all. So it will be interesting to see how the AI evolution impacts gaming, as well as the digital humans and, of course, this gaming assistant.
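The "maintain 60 FPS" tuning idea is easy to picture as a feedback loop: watch recent frame times, compare against the target, and nudge a quality setting. The sketch below simulates that loop with made-up frame costs; it is not Project G-Assist or any Nvidia API, just an illustration of the control logic such an assistant might apply.

```python
# Toy version of the "maintain a 60 FPS target" idea from the Project
# G-Assist demo: watch recent frame times and suggest stepping a quality
# setting down (or up) to stay near the target. Frame times here are
# simulated; a real assistant would read them from the game/driver and
# know the actual settings it can change.
from collections import deque
import random

TARGET_FPS = 60.0
SETTINGS = ["low", "medium", "high", "ultra"]
current = SETTINGS.index("ultra")
frame_times = deque(maxlen=120)          # roughly 2 seconds of frames at 60 FPS

def observed_fps():
    return len(frame_times) / sum(frame_times) if frame_times else TARGET_FPS

for frame in range(600):
    # simulate: heavier settings -> longer frame times, plus noise
    cost = 1.0 / (75 - 12 * current)
    frame_times.append(cost * random.uniform(0.9, 1.2))

    if frame % 120 == 0 and frame > 0:
        fps = observed_fps()
        if fps < TARGET_FPS * 0.95 and current > 0:
            current -= 1
            print(f"{fps:5.1f} FPS -> suggest lowering quality to {SETTINGS[current]}")
        elif fps > TARGET_FPS * 1.15 and current < len(SETTINGS) - 1:
            current += 1
            print(f"{fps:5.1f} FPS -> headroom, could raise quality to {SETTINGS[current]}")
```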
But with that being said, what is your favorite announcement from Nvidia? There were some announcements regarding the kinds of chips they're going to be making, nothing too crazy, but of course Nvidia is right now one of the biggest companies in terms of AI development, because their architecture and their chips are needed to power the next big training runs. So with everything that Nvidia is doing, from Omniverse to Isaac Sim to Isaac Gym to their GPU clusters and whatever else they might be building, hopefully you enjoyed this video, and I'll see you in the next one.
Info
Channel: TheAIGRID
Views: 24,887
Id: pmxKUq75Kwg
Length: 16min 59sec (1019 seconds)
Published: Sun Jun 02 2024