Intro to Building Apps for HoloLens 2 Using Unity and Mixed Reality Toolkit - BRK1003

Video Statistics and Information

Reddit Comments

For those who were really concerned about the apparent lag time of hand tracking in the original HL2 reveal, the live demo here at 37:34 shows the interactions to be much more immediate. It looks like there's still some latency, but that's probably just networking and rendering.

👍 1 · u/FizixMan · May 14 2019
Captions
>> My name is -- [applause] All right! Good morning, everybody. Thanks for coming out. My name is Julia Schwarz. I'm an engineer and I work on input and interaction for HoloLens.
>> Hi, everybody. My name is Andrei Borodin. I'm an engineer on Mixed Reality Toolkit. Today we are going to talk about mixed reality. It is a new form of computing that Microsoft has been developing for the past several years. But before I continue, who here has seen the HoloLens launch event, the videos? Yeah? You remember the amazing HoloLens video, just beautiful. I love it. You guys like it? Cool. Do you remember Julia from that event? Great. She will talk more about that.

All right. So, as you know, HoloLens and mixed reality computing is basically when we blend the digital world with the physical world. I mean, consider this: right now we are looking at a 2D screen for content, for digital content. It may not be the best way, for example, to interact with a solar system, to inspect a solar system. Interaction is often indirect; you use a keyboard and a mouse. But with HoloLens 1, we put holograms, digital content, into the world around you.
>> Yes, exactly. That solar system can be right here. With HoloLens 2, we went further: instinctual interaction. We allowed you to touch holograms. Consider this: let's say I have this solar system right here, with the sun, Earth, Mars. If I wanted to interact with the Earth, the instinctual interaction would be to pick it up. Now that interaction with holograms, with digital content, is not only direct but natural to us. It is computing that is natural to us, as it should be.

And using these two capabilities, mixed reality developers today are building unique experiences to solve real-world problems. They are bringing to fruition what only recently was in the realm of science fiction: interacting with holograms. And what we are here to talk about is you, the mixed reality developers. Who here, by the way, has developed for HoloLens? Okay. For the rest of you, welcome to a whole new reality of development, and that is what we are going to talk about. We are going to talk to the people who haven't developed for HoloLens and the people who have, about the tools we are building to empower you to build the next generation of experiences.

One of those tools is Mixed Reality Toolkit, an open-source project for 3D development built by Microsoft, by our team. It gives you the ability to build new HoloLens applications very quickly, even if you are a new developer, or to enhance existing applications. Out of the box, just opening up a sample scene, you can start playing with holograms, interacting with them, resizing them. It gives you a lot of functionality. That one is actually my favorite, the way it reacts. So Unity, as I'm sure many of you are familiar, is a game engine for developing games. It is also for developing mixed reality applications for HoloLens and Windows Mixed Reality, and there are other devices. Mixed Reality Toolkit is built to support HoloLens 1, HoloLens 2, Windows Mixed Reality, and any device running on top of OpenVR. We are making strides into ARKit and ARCore, and we will show you some of that today. When I say you get a lot out of the box with Mixed Reality Toolkit, which is available right now: you get buttons, a system keyboard, many, many other controls, and a bounding box, all built on top of guidelines that Julia will talk about later. Then you get a bunch of functionality: eye tracking, spatial understanding, and many, many more. With that, let's switch gears a bit.
Julia is very passionate about human interaction with digital content. She has participated in a lot of studies, prototypes, and research, and today she is going to share with you some of the guidelines and learnings we have compiled over the years. After that, she will talk about the Mobile World Congress video and how simple it is to recreate that using Mixed Reality Toolkit. Actually, out of curiosity: HoloLens 2, personally for you, what do you like the most about working on that?
>> What do I like the most about working on HoloLens 2? That is easy: definitely getting to work on all of the input and interaction techniques and basically discovering a new way of interacting with objects. For those who didn't see it, a couple of months ago we announced HoloLens 2 in Barcelona at Mobile World Congress. This is actually a video of me getting to show all of those interaction techniques that we spent years developing. And the thing I love about it is, I actually had to pinch myself before I went on stage and ask, was I dreaming? When you think about it, this kind of stuff is the stuff you normally see in movies, with actors waving their hands in the air on a green screen, and the artists put it all in in post-production. But what you see here, on stage, this is real. What do I mean by that? I am literally seeing the holograms that you are seeing in this video. I'm actually interacting with those sliders, with the buttons I'm pushing in. That is me; it is actually what I am seeing. And that is really incredible. In fact, to prove to you this is real and not just smoke and mirrors, we are going to recreate the buttons you see here and those sliders you saw earlier, today, in Mixed Reality Toolkit.

So yeah, the thing I love about this too is that it is actually kind of incredible that we can do all of this. It might look really easy when you see it done like this, but let me tell you, it was actually quite a lot of work to figure out what the right way to do these interactions was, how you should actually interact with these interfaces. We are in the wild west of 3D UI and augmented reality today. We are all discovering this together. Maybe some time in the future it will be commonplace and everyone will take it for granted; right now it is definitely not obvious. When you get your devices, you will want to try many different things and be as excited as I was when I started discovering all of this. And you absolutely, by all means, should try everything, because there is a small team of us working on this, discovering things, and there are hundreds, thousands of you. You will discover new stuff. I absolutely recommend that you try everything you can and explore, and then, of course, publish and share your learnings like we are doing here. But what I want to do today, before I actually get started showing how we coded this up, is give you a little bit of a guiding light about where to get started: not only these ideas, but the grounding things we learned from our own prototyping about what worked and what didn't for HoloLens, and what a good way to build interfaces for this is. So I'm going to provide some basic guidance. I'm not going to have time to go over everything, but I will tell you a great place to go online to go over the docs so you can see it later on.
And by the way, unfortunately, today I'm not going to have time to show you all of the prototypes we built to get where we are today and to discover this model. But I gave a talk last week at MR Dev Days where we shared most of the prototypes we built while we were discovering the ways to interact with holograms. We shared a prototype, showed what worked and what didn't, and then the principle we learned from it. We are working to publish that all online and make it available, so I really encourage you to look up the MR Dev Days hand interaction crash course talk to learn more.

All right, let's get started with the basic idea of this new input model, this new way of interacting. It is actually very simple. The core idea of HoloLens 2 is instinctual interaction: you shouldn't need to learn how to use the device, you should just know how to do it. It is quite simple because you see these 3D objects, and we know how to interact with 3D objects in the real world: reach out and interact with them. That is the same idea here in HoloLens. As I mentioned before, I won't be able to cover all of the details of all of the interaction principles we learned, so I'm just leaving the link here, an aka.ms link. This is where the design team is putting up the guidance, gorgeous pictures, and thinking about the right way to interact. I'm going to leave it up here for folks to get a picture of it, get that in their mind. For now, let's talk about a couple of basic ideas that we are going to touch on a little bit in the demos today.

The first idea is very simple. Actually, it is great when you can have a simple interaction model, because if it is simple and easy to explain, then it should be easy to use. The first idea is that you should be able to press buttons with your finger. Sounds obvious. When you think back to it, when I was starting out on 3D interaction, I started with Kinect, like many of you did. I was building all of these 3D interactions with pressing buttons in the air, but mapping to a 2D UI: 3D input to 2D output. That made it difficult to use. The amazing thing about HoloLens is that now we have 3D input that matches the 3D output, and that makes these interactions work much better. When we demoed to the press and all sorts of other folks at Mobile World Congress, it was really rewarding. You would show someone a button and say, just do what you think you should do, what is natural. They press a button and it works. They don't get excited that it works; I'm excited that it works. They are like, of course, that is what should happen. So this is a very simple idea but an important one.

Now, the details of how to build these buttons to make them work well, that is where the devil is in the details for sure. The basic ideas are that it is very important to provide audio and visual cues. What do I mean by that? Visual cues: it is very important to show a proximity light, where your finger is in relation to the button, a nice shader effect as if your finger had a light illuminating the button. The reason is that it gives the user a sense of where their finger is in relation to the target. We also have a thing called the finger cursor that you will see in HoloLens, a ring attached to your finger that changes in radius as you get closer to the buttons and press them. That helps communicate not only the location of the finger but also how close it is to the interactive surface, to help people target and click. I mentioned clicks.
Sound is very important for a button interaction, actually. It is very important to play audio to communicate all of the states of the button: on press, on contact, on release. The reason for this is that when you think about what you are doing (maybe some of you saw me earlier testing our demo out), I'm just poking the air. I'm not actually touching a real surface, and because of that we are missing a lot of the feedback we get from our fingers when we are touching things. We compensate for that by using audio and visual cues. My usual piece of advice is to crank up all of the other input sources your brain can get, so audio and visual, to compensate for the fact that you are not touching something. When we were starting out prototyping, we weren't sure that would work. It turns out it does work quite well; you can compensate for those things and have people successfully press buttons. The data I have to back that up: we ran a lot of studies, and we know people are able to press buttons. I spent a lot of time on that slide; I will go faster on the rest. But as you can see, I could talk about buttons all day. We have to keep going.

When you get your HoloLens 2, or any device that has hand tracking, you will probably want to stick a collider on every single one of your fingers and type with all ten fingers. In fact, that is what we tried when we first got our devices. What we learned very quickly is that it is actually very hard to do very accurate interactions and typing, and by accurate I mean that when a user intends to press one of three buttons, say button A on the left, they press the left button and not the one in the middle or on the right. The reason is false activations. On a surface, your fingers stop when they hit that surface; in the air, they all go through. It is actually quite difficult to make the UI recognize which finger you intend to use for the press in a general way, and this is a really active area of research. Really good low-hanging fruit is to just have a collider on the index finger of each hand. The nice thing is, this actually works in many cases. Even if people use a different hand pose, like their full hand, to press a button, it is still going to work because the index collider will go through it. In fact, a story: at Mobile World Congress we had somebody try to press the button with their pinky. I was worried. It worked. The reason is that the index finger still went through the button. So people thought they could press however they liked, but actually it was just the index finger. This is a really nice trick that works quite well. As I mentioned, there are a lot of details in buttons to make them work really well, and luckily, in Mixed Reality Toolkit we are providing you with the button and an input system that correctly assigns colliders to the fingers, so you don't need to worry about this. We'll get into that a little bit later.

Okay. So, push buttons with your hands. Now how about moving objects? Another very basic idea: grab objects directly with your hands to manipulate them. Something that is kind of interesting about grabbing objects, which we learned when we showed this to people in our studies, is that the shape of the object really dictates the way people grab it. For example, for a smaller object, people will do a finger pinch, or maybe (we gave the different gestures names) this one, "baby shark," kind of an open hand.
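A minimal sketch of the audio and visual touch cues described above, assuming MRTK v2's IMixedRealityTouchHandler and a GameObject that also has a collider plus a NearInteractionTouchable component configured to raise touch events; the class name and serialized fields are illustrative, not taken from the session's demo code:

```csharp
// Hypothetical touch feedback: highlight on contact, play a sound, restore on release.
// Requires a collider and a NearInteractionTouchable on the same GameObject so the
// MRTK input system routes touch events here.
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class TouchFeedback : MonoBehaviour, IMixedRealityTouchHandler
{
    [SerializeField] private AudioClip contactSound = null; // played when the finger enters the touchable volume
    [SerializeField] private Color touchColor = Color.cyan; // highlight color while touched

    private Renderer surfaceRenderer;
    private Color defaultColor;

    private void Awake()
    {
        surfaceRenderer = GetComponent<Renderer>();
        defaultColor = surfaceRenderer.material.color;
    }

    public void OnTouchStarted(HandTrackingInputEventData eventData)
    {
        // Visual cue so the user knows contact was registered.
        surfaceRenderer.material.color = touchColor;
        if (contactSound != null)
        {
            // eventData.InputData is the fingertip position.
            AudioSource.PlayClipAtPoint(contactSound, eventData.InputData);
        }
    }

    public void OnTouchUpdated(HandTrackingInputEventData eventData)
    {
        // Could drive a proximity light from the fingertip position here.
    }

    public void OnTouchCompleted(HandTrackingInputEventData eventData)
    {
        surfaceRenderer.material.color = defaultColor;
    }
}
```

Attaching a handler like this to the same object as the touchable collider is enough for the input system to deliver the touch started, updated, and completed events mentioned later in the demo.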
And those grab gestures are actually really easy to detect; they are highly reliable. So grabbing small things, like small objects, generally tends to work quite well. When you go for the large ones, people try to use the fist, which actually works quite well with the hand tracker. Now, you notice in the diagram I stop at hand-sized objects. But what if I wanted to pick up this podium here? We asked people to do that. We would show people a large object: pick it up. They are like, okay, let me pick it up with both my hands, like you pick up a real object. There are a couple of problems with that. First, your hands are outside of the field of view of the HoloLens, so it is not actually going to see your hands if you do the bear-hug style pickup. The other problem with framing gestures is that while they can work, and I have implemented something like this, it is actually quite hard to do well if you are also enabling two-hand manipulation. When you are approaching something from the side versus doing a framing gesture, you end up building a UI that has a lot of ambiguous states. While it is an interesting research project, it leads to complexity in the code and makes it difficult to debug. So my advice about grabbing objects: absolutely grab objects to manipulate them, but for larger objects, provide handles or obvious cues for people to grab instead of forcing them to grab the whole thing. If you don't provide obvious UI cues, they are going to treat it like a real object and pick it up like a real object. What we learned when we ran studies showing people objects with and without handles is that you just need to show people handles once. If you show them once and then hide them, like in the shell (handles that appear and disappear), they remember, and people will pick the object up from that handle. If you never show handles, people will try to pick up objects like they were real objects, and that is going to be a lot harder to do reliably, although it is interesting research.

All right, next, going back to visual and audio cues. I'm going to repeat myself because it is very important; there is no harm in that. It is very important to provide visual and audio cues to communicate the state of your object. For example, when you are hovering near a grabbable object, change the color of the grabbable so that people know that when they grab, that is the object that will be grabbed. It sounds like basic stuff, but it makes a world of difference. Also, playing sounds on grab start and end is very important, because again you want to communicate the state of your system to your user. It is critical to do this in holographic applications. For manipulation, support one- and two-hand manipulation. There is a lot of detail here, a lot of stuff to do: supporting one-hand manipulation, two-hand manipulation, and the state transitions between them. For those who used HoloToolkit, this is the two-hand manipulation component. It works both directly and indirectly at a distance, and it works on HoloLens 1 as well.

Speaking of indirect, at-a-distance interaction: another thing we learned when we showed these holograms to people. We would show them a bunch of holograms and say, how would you move that coffee cup that is on this side of the table to the other side? Not surprisingly, we learned that people are kind of lazy. They don't necessarily want to walk over and grab something directly; they want to do this indirect interaction. So how do we interact at a distance on HoloLens now that we have articulated hands? We use these hand rays.
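The grab and hover feedback described here can be wired to the manipulation events MRTK v2 exposes. A rough sketch, assuming the ManipulationHandler component (the MRTK v2 counterpart of HoloToolkit's two-hand manipulation) plus a collider and a NearInteractionGrabbable on the same object; the field names and color choices are illustrative:

```csharp
// Hypothetical grab feedback: hover changes the color, grab/release play sounds.
// Event names are from MRTK v2's ManipulationHandler and are worth double-checking
// against the MRTK docs for the version you use.
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

[RequireComponent(typeof(ManipulationHandler))]
public class GrabFeedback : MonoBehaviour
{
    [SerializeField] private AudioClip grabSound = null;
    [SerializeField] private AudioClip releaseSound = null;
    [SerializeField] private Color hoverColor = Color.yellow;

    private Renderer rend;
    private Color defaultColor;

    private void Awake()
    {
        rend = GetComponent<Renderer>();
        defaultColor = rend.material.color;

        var handler = GetComponent<ManipulationHandler>();
        handler.OnHoverEntered.AddListener(_ => rend.material.color = hoverColor);
        handler.OnHoverExited.AddListener(_ => rend.material.color = defaultColor);
        handler.OnManipulationStarted.AddListener(_ => Play(grabSound));
        handler.OnManipulationEnded.AddListener(_ => Play(releaseSound));
    }

    private void Play(AudioClip clip)
    {
        if (clip != null)
        {
            AudioSource.PlayClipAtPoint(clip, transform.position);
        }
    }
}
```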
Think of hand rays like this: if any of you are familiar with VR and motion controllers in VR, you have rays coming out of the motion controllers. We do the same thing for the hands, so you have a hand with a ray shooting out of it, like a motion controller. When you get close to an object, the ray disappears and you directly manipulate it. When you move your hand away from it, the ray shows up again and you are interacting at a distance. It is really nice; it works both near and far, and it is one single input model. We kind of like that. In our MR Dev Days talk I go into detail explaining all of the other ways we tried to interact at a distance, why we ended up with this, and why we ended up with the algorithm that we did. I am really happy to share all of that with you. There will be things you are going to want to do where you will be like, why are the hand rays not doing this? I wanted them to do this. There is a reason, and I talk about it in that talk for a while. So again, in Mixed Reality Toolkit we are providing you with the hand rays. I'm showing you the motion controller because we do this in Mixed Reality Toolkit for virtual reality as well, showing the motion controller rays in virtual reality and how they parallel the hand rays, and how nice that is. HoloLens 2 is not shipping with motion controllers; this picture is just showing you that nice parallel.

And there is a link up here that I highly recommend folks take a look at: aka.ms/mrtkdocs. This is where the Mixed Reality Toolkit team is putting all of the documentation; it comes from GitHub, and it has a lot of detail about how to get started with Mixed Reality Toolkit and also all of the different UI components I talked about here. So again, here is the same link. I also wanted to point out one specific scene in Mixed Reality Toolkit called the hand interaction sample scene. The really cool thing about this scene is that it shows you pretty much all of the guidance I talked about here: directly grab objects, press buttons to activate them, and at a distance, use hand rays. It shows you all of those interaction principles with a set of common UI controls that you can then copy to use in your own applications. We have all sorts of things here. And of course, in keeping with our principle of supporting both near and far interaction, everything here works far and near, and in fact, if you had a VR controller, it works with VR too, which is great. So please go to aka.ms/mrtkdocs to learn more about Mixed Reality Toolkit.

Let's get coding, ready? Let's start rebuilding the demo I talked to you about, and remind ourselves of what we are doing, so I can remind myself too. Here we are going to make buttons. Let's look at the button a little more to understand it. What am I doing? You are not hearing sound here, but we are playing a sound when the button presses and releases. We are visually changing the state of the button on touch. It is hard to see, but we are doing a little bit of a pulse effect. Okay, keep that in our brains. The other control we are going to make is the slider control. This is the control where we were most successful with that tactile sensation, when you pinch with your hands and move. It is going to be playing sounds as we grab and release, and you can see, importantly, that it turns bright blue when I grab it. And we are actually going to do one extra thing: have it turn blue when I hover near it.
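Because the hand rays and the near pointers feed the same pointer event pipeline, a single handler can cover both near and far selection. A small sketch assuming MRTK v2's IMixedRealityPointerHandler; the class name and logging are placeholder app logic, not from the demo:

```csharp
// Hypothetical pointer handler: the same callbacks fire whether the object is poked
// up close or selected at a distance with a hand ray or motion controller.
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class PointerFeedback : MonoBehaviour, IMixedRealityPointerHandler
{
    public void OnPointerDown(MixedRealityPointerEventData eventData)
    {
        // Which pointer selected us: near grab, poke, or a far hand ray.
        Debug.Log($"Selected by pointer: {eventData.Pointer.PointerName}");
    }

    public void OnPointerDragged(MixedRealityPointerEventData eventData) { }

    public void OnPointerUp(MixedRealityPointerEventData eventData)
    {
        Debug.Log("Pointer released");
    }

    public void OnPointerClicked(MixedRealityPointerEventData eventData)
    {
        // A completed select (press + release): a good place for on-click app logic.
    }
}
```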
Okay, so let's get right to it and go up to my Unity here. Here we are in the Unity game engine editor, for those who aren't familiar. We have a lot of stuff here already because it takes a while to import everything. What we have done is created a new project and then gone ahead and imported Mixed Reality Toolkit. If you go to the MRTK getting started page you can learn how to do that: you download two packages, Foundation and Examples, and then go to Assets, Import Package to actually import the two packages. We have that. Then we have a demo folder with the finished scene, so you all can see what the finished product will be, and also a couple of prefabs that our artists have given us with all of these gorgeous meshes and buttons.

Okay, let's go ahead and get started. We are going to make a new scene. Because I'm using Mixed Reality Toolkit, it is actually going to prompt me and say, hey, you don't have Mixed Reality Toolkit configured for this scene, would you like to configure it with a profile? A profile in Mixed Reality Toolkit is basically a set of settings. It has a lot of stuff in it, and you don't always need everything, so the profile lets you enable and disable all of the different systems you want. I will use the default one, which is pretty much everything. Now you can see that in my empty scene I now have this Mixed Reality Toolkit object; that is how you know you have Mixed Reality Toolkit in your app. Then there are a bunch of settings here; we will go through those later.

I like to do rapid iteration and deployment, so I am going to go ahead and press play to see what we get. Right away we see this frame rate counter that comes from MRTK that tells us if we are performant; we want to keep things running at 60 frames per second if possible. Then, check it out, this is cool: a hand, a fully articulated hand. With that hand I can move in and out and press. This is basically emulating the articulated hands you will get in HoloLens 2. Even if you don't have a HoloLens 2, you can already pretty much develop your whole application here with these simulated hands. We have ways to support multiple poses; you can record and support arbitrary poses and switch between them with different key presses.

All right. This frame rate counter is nice, but I think it will get in the way for the demo, so let's go ahead and disable it. That is part of the configuration, so I will make a customized version of my configuration, and the frame rate counter is in the diagnostics system, so I will uncheck that box. Now when I hit play, it gives me the scene, my beautiful hand, and no diagnostics system. Awesome. I like to do one other thing, which is to make my background black; it makes everything easier to see. This is a little trick for folks, a bit of an Easter egg: I like to set the skybox to none. Now when I hit play, it will actually be a black background. Boom. Most of the example scenes, like the hand interaction scene, do the same trick.

All right, so, buttons. Let's get started on that. For this, I'm going to drag in my button prefab. A prefab is like a pre-made set of controls that you can copy and paste around. Let's have a quick look at this panel. This is the overall panel; I've got my large and medium buttons.
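For reference, the black background trick mentioned above can also be done from code with plain Unity camera settings rather than the Lighting panel; this tiny helper is illustrative, not part of the demo project:

```csharp
// Hypothetical helper: a code equivalent of the "set the skybox to none" trick.
using UnityEngine;

public class BlackBackground : MonoBehaviour
{
    private void Start()
    {
        var cam = Camera.main;
        cam.clearFlags = CameraClearFlags.SolidColor; // don't render a skybox
        cam.backgroundColor = Color.black;            // black renders as transparent on HoloLens, black in-editor
    }
}
```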
And then in here I have this thing called the button component, and that has all of the logic for the button. I am going to cover the parts that make up the button component and then we will hit play and see what we get. For the button component we want to have a collider here. This collider gives us the region where the finger is going to be tracked; when the finger is anywhere in this region, you will get touch events (touch started, updated, ended) or pointer events. Pointer events are our more general way of handling both near and far interactions, motion controllers, etc. We have the near interaction touchable: if you want to make a collider touchable, add the near interaction touchable component so our input system knows how to route the input. Otherwise it would be sending input to all colliders, and that would be expensive. We configured it to send touch events instead of pointer events. We are also specifying the forward vector of the touchable there; that says the button is touchable from this particular angle. Right now our surfaces are touchable from just one direction; we are working on adding touchable volumes, etc.

All right, so for the button itself, we can see a lot of stuff here. These planes describe the different events and what happens. The front plane, the one here, is where the touch tracking starts. The blue one is the maximum pushable distance of the button; it will not go past that. The yellow plane is the press event, and the red is where you get the release event. When the button goes through the yellow, it sends a press event, and then back through the red, you get a release event. You can listen to all of these events to give your user interface various kinds of feedback and do actions like on-click event handlers, etc. What we have done here is hook up the events to an animation controller (our artist actually hooked it up), and the animation controller changes the visuals of the button to make it, for example, light up when you touch it and do that pulsing effect when you press. We are also hooking it up to play sounds when the button gets pressed and released.

Okay, lots of talking. Let's go ahead and see what is actually going on here. So I pressed play. I can use the WASD keys to move around. When I move up, I can see that when my finger touches it, it lights up. When I press, I get a pulse; on release, I get a second pulse. With audio you would hear a press sound and a release sound. You can see the pulse better here: the press event does one pulse, the release does the other.

Let me show you something cool: how about we change that button right now. We have to press in deep for the press to happen; let's make it press right away. To do that, I am going to make these planes editable. I'm in play mode, editing the scene directly: move the release and press planes, and the max push distance plane, much closer to the front. If we look here, now I shouldn't have to move the button much at all and it should activate. So let's go ahead and do that. All right, I moved them up, still in the game, and now I get that press event without moving it far. Even when I keep moving my hand, basically scrolling, it is still not going further; that shows you the max push distance. This also shows how nice it is to edit in the editor, because you can change parameters live in your game and iterate really quickly this way.
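The press and release events described above can also be subscribed to from a script instead of the Inspector. A hedged sketch, assuming MRTK v2's PressableButton component and its ButtonPressed / ButtonReleased UnityEvents (worth verifying against the MRTK docs); ButtonLogger is a hypothetical name:

```csharp
// Hypothetical subscriber to the button's press/release planes described above.
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

public class ButtonLogger : MonoBehaviour
{
    [SerializeField] private PressableButton button = null; // drag the button prefab's PressableButton here

    private void OnEnable()
    {
        button.ButtonPressed.AddListener(OnPressed);   // finger crossed the press (yellow) plane
        button.ButtonReleased.AddListener(OnReleased); // finger moved back past the release (red) plane
    }

    private void OnDisable()
    {
        button.ButtonPressed.RemoveListener(OnPressed);
        button.ButtonReleased.RemoveListener(OnReleased);
    }

    private void OnPressed() => Debug.Log("Button pressed");
    private void OnReleased() => Debug.Log("Button released");
}
```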
This live iteration is why I love the fact that we have in-editor simulation, because having to build and deploy every time would be time consuming. It is useful to have these sorts of things. The nice thing is, when you stop the game, it resets everything back to where it was. So we have our buttons; now let's go ahead and add the sliders. Here is the slider prefab. It is a little close to the button, so let me move it over, and now let's see what we get out of the box for the slider. I'm going to press play, and now I can move over to my slider. By the way, to bring up the hand I'm pressing and holding the space bar, for the folks who don't know. You can see, when I get near, it hovers to indicate that if I grab now, I'll grab the slider, and when I move, it moves the slider. You can see the text value here updating. Let me show you one cool thing first: we talked about near and far, and again, because we are using Mixed Reality Toolkit, we get all of the near and far manipulation out of the box. Everything here handling pointer events works near and far.

Okay, let's take a look at what is going on with the slider. The text basically has a script that is essentially just changing the value of the text when a method gets called. Then, whenever the value of the slider gets updated, we tell that show-slider-value script to update the text. It is simple. We also have events for when the interaction starts and ends, so when you grab and release, and when we start hovering and un-hovering it, we are also changing animation states there. Now, the thumb is configured here: we are telling the slider component that this is the thumb you are actually going to be grabbing, and it moves along a track that is configured here with a start and an end. Let's look at that thumb to see what is going on. The thumb is kind of deep in the hierarchy; it is set up this way because this is how we got it from the artist, but it is actually kind of nice to separate all of the pieces out. We have the box collider; this is the object that is receiving all of the pointer events, and it is grabbable nearby, so you can reach out and directly grab it. The collider also gives you the region you can target at a distance. When the hand or the ray grabs this component, it grabs it with a pointer, and that bubbles up to the pinch slider. Then that script handles it: it does the math to figure out, as you are dragging, how to project the slider thumb down onto the axis of interaction, move it, and send the updated events. We also have the slider sound script, which again listens for the events and plays sounds on grab, release, etc. We are absolutely publishing this online; it will be much easier for folks to look at the content when it is there.

Another cool thing: we can actually change the start and end of the slider live. Let's say I want it to be short. And again, we could change the visuals by changing the track visuals' scale here. But now, if I update it, getting close, I can move my hand but it only moves that short distance. And if we wanted to, we could also change the axis of the slider by having it move along the Y axis here. Now it is going to move vertically, and it moves a bit funny, hey, there we go. All right, so there is lots of stuff we can do with the slider, and we can interact with it at a distance. That is great.
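A hypothetical version of the show-slider-value script described above, assuming MRTK v2's PinchSlider, whose OnValueUpdated event passes a SliderEventData with the new value; the field and class names are illustrative:

```csharp
// Hypothetical "show slider value" script: writes the slider's value into a TextMesh.
using Microsoft.MixedReality.Toolkit.UI;
using UnityEngine;

public class ShowSliderValue : MonoBehaviour
{
    [SerializeField] private TextMesh textMesh = null;

    // Hook this method up to PinchSlider.OnValueUpdated in the Inspector,
    // or call pinchSlider.OnValueUpdated.AddListener(OnSliderUpdated) from code.
    public void OnSliderUpdated(SliderEventData eventData)
    {
        if (textMesh != null)
        {
            textMesh.text = $"{eventData.NewValue:F2}";
        }
    }
}
```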
Now I want to do one more thing. The hand rays look really cool, but for the demo we will show you running live, they will get in the way, so let's turn off the hand rays. That setting is in the input system, in the pointer settings. I'm going to make my own input system profile and then customize my own pointer profile. In the pointer profile, the rays are basically called the default controller pointer; all of these inputs, Oculus and so on, are going to create rays, and articulated hands create hand rays. So we are going to uncheck that so the articulated hand is not making rays. Now we won't have any hand rays for our interaction; you can see that when I move away, I don't see one. I also want to be able to turn off the hand visualization when running live, so for that I am going to use the toggle features panel. That is a component that comes for free in Mixed Reality Toolkit; let me bring it closer. So here we go: now it lets me turn the visuals off and on. There is the panel, and let's say I want to turn off the hand joint visualization; I go here and, boom, all we see is that nice little ring cursor that gives you the proximity. Okay, that is a lot. I think I have tested everything and it seems to be working; this is the normal flow. Now we are ready to go on to the HoloLens. To do that, I build the UWP solution here.
>> Julia?
>> Yeah?
>> Actually, a question. We tried this app and I feel confident. Do you want to try using the phone to spectate?
>> Film me using the phone? Instead of that big, expensive rig on HoloLens 2, we are going to film me doing the HoloLens interactions and see it on your phone.
>> Yeah.
>> Okay. You want to do it?
>> Yeah, let's give it a shot. What do I need to do?
>> Go to Mixed Reality Toolkit sharing, spectator view.
>> I need to drag in the spectator view prefab.
>> You do that, and I'll explain the rest.
>> Sounds good. Let me start compiling it.
>> All right. So in Mixed Reality Toolkit we are building out a couple of pieces around sharing; I'll get to that a bit later. One of those pieces is spectator view. We have the same project here, and I need the same project as Julia because I'm going to be relying on the same meshes; however, I don't need the same scene. So what I am going to do is build the scene for an ARCore device and drop in the spectator view prefab we have been building out. We go ahead and do that, and on the prefab we switch it to spectator. What that does: this prefab works off of the local network. The HoloLens, which has a host version of it, will act as a host, and then the phone will connect to it. I have also configured Azure Spatial Anchors here, and we are going to use that to localize. If you aren't familiar with that, who here attended the Spatial Anchors session on Monday? Great. For those who haven't, it is basically a service running in the cloud that allows you to share location between multiple devices: between HoloLens, ARCore, and ARKit. The way it works: the HoloLens creates a spatial anchor from its understanding of the world, serializes that data, and communicates it to the service. My phone afterwards also communicates with the service, asking for that spatial anchor, and the service figures out where the phone is relative to it and sends that information back.
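A very rough sketch of the host-side flow just described (create a local anchor, publish it to the Azure Spatial Anchors service, then locate it by id from another device). All type and method names here are assumptions based on the Azure Spatial Anchors SDK, not the session's spectator view code, and real usage needs session readiness checks and error handling; see aka.ms/SpatialAnchors:

```csharp
// Hypothetical sketch of the anchor-sharing flow; credentials are placeholders.
using Microsoft.Azure.SpatialAnchors;
using UnityEngine;
using UnityEngine.XR.WSA;

public class AnchorSharingSketch : MonoBehaviour
{
    private CloudSpatialAnchorSession session;

    private async void Start()
    {
        session = new CloudSpatialAnchorSession();
        session.Configuration.AccountId = "<account-id>";   // placeholder
        session.Configuration.AccountKey = "<account-key>"; // placeholder
        session.Start();

        // 1. The HoloLens creates a local anchor, wraps it in a cloud anchor,
        //    and uploads its serialized understanding of the world to the service.
        var worldAnchor = gameObject.AddComponent<WorldAnchor>();
        var cloudAnchor = new CloudSpatialAnchor
        {
            LocalAnchor = worldAnchor.GetNativeSpatialAnchorPtr()
        };
        await session.CreateAnchorAsync(cloudAnchor);
        Debug.Log($"Share this id with the phone: {cloudAnchor.Identifier}");

        // 2. The other device asks the service for that same anchor id and gets back
        //    its pose relative to that device's own coordinate frame.
        session.AnchorLocated += (sender, args) => Debug.Log($"Located: {args.Anchor?.Identifier}");
        session.CreateWatcher(new AnchorLocateCriteria { Identifiers = new[] { cloudAnchor.Identifier } });
    }
}
```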
Once we have this, once we have agreed on a world position relative to each of the coordinate frames, we can start synchronizing data. Again, this scene is empty because the code we have built will regenerate the scene based on what Julia is running. I'm actually going to skip the compiling step because we have all of the wiring hooked up; you can try this exact code out for yourself later this month. All right, so I'm just going to enter it here, the IP of the HoloLens, which was -- all right, so we are going to connect. All right, now it should be localized. Yeah, there we go. [Applause]
>> I'll give you guys a bit of time. This is, I have got to say, really a dream come true for us. We really wanted to create this using tools that are freely available to our developers, and just the phones they have in their pockets, instead of needing a custom setup or anything. It is really special to share this with you and let you take videos. It is great: see what Julia is doing. We can participate right now, in spectator mode only, with this. And this is real code that is running; it is going to be in Mixed Reality Toolkit shortly.
>> You want to take a shot?
>> What do you feel like, a screenshot? Post it on social media? Let's do it. All right, I will post it later. #Build2019. All right. So, is that awesome? [Applause]
>> Let's get back here.
>> So I talked about Spatial Anchors; for more information, go to aka.ms/SpatialAnchors. It is an amazing service and it will allow you to do a lot of shared experiences. Speaking of which, remember at the start I said that mixed reality computing is a blending of digital content with the physical world. We have holograms with HoloLens 1 and instinctual interaction with HoloLens 2, and that was what Julia was doing up there, interacting with the holograms. What was left out was this sharing, where HoloLens can interact in a shared session with any other device, be it ARCore or ARKit, which is something we strongly believe in. There are many applications out there that currently embed this into the application themselves. What we are doing with Mixed Reality Toolkit is building the proper support to allow developers to enable full sharing capabilities in any application. What we saw today is real code that is running; it is spectator view, one-way synchronization, and it will be available very shortly. And that is only the tip of the iceberg. If you want to learn more, go to aka.ms/mrtk, consume the code from there, and you can interact with the developers. You can reach out to me or Julia. You can also submit feature requests, or implement the features yourselves and we'll code review and work with you to get them in. We simply love engaging with our passionate community, a passionate community of developers who are excited to build with MR. We are excited to build with developers like yourselves in the future. Thank you very much. [Applause]
>> Yeah, also, if you follow us on Twitter, we will announce when we post the demo code that we saw today; we are working on getting it up and ready for you all. Any questions?
>> For questions, please come on up to one of the mics; there are several in the middle. Who here is enjoying Build? [Applause] That is the most important thing. Yes?
>> Yeah, I have a question about WebXR, WebVR. Are there plans to have support in the toolkit for WebXR?
>> I think we are investigating that right now.
>> Could you go to the GitHub page and follow up there?
>> Yes. Thanks.
>> Awesome. Thank you.
>> Are voice commands built in?
>> Voice commands, yes. It is in the speech configuration profile. You can add any voice commands that you want and map them to these things called actions. Then there is a speech input handler that you can use to listen for a specific voice command and respond.
>> Yeah.
>> Go ahead. Could you share any experiences you had when creating two-dimensional holograms? For example, how is it different to handle the interactions with the 3D holograms you showed versus, say, the flat rendering of a 2D web browser?
>> Yeah, I spent a long time on that. Actually, in the MR Dev Days hand interaction crash course talk I go over a lot of the prototypes that we built, and I share learnings there. A couple of points on it. First of all, the basic principle is to follow touch interactions: if you treat it like a touch screen, that basic principle works well. There are a lot of details in the implementation. One thing that can be challenging when you have a very flat, bright object is that it can be hard to see where your hand is relative to that object. You can render the hand on top of it; we tried, and the problem then is that you are occluding the content, like the web browser content, that you want to see. And because you can't see your hand, it can be hard to know when your hand is pressing the content. So what we do: that finger cursor you briefly saw in Mixed Reality Toolkit, the ring that shrinks in size to communicate both the location of the finger and its proximity to the slate, is basically an aiming aid; it helps you click on small links. We actually have this problem in the shell, because we have people use Edge, so it is the hardest possible version of the problem. The other thing that works, that is critical, is to have a shadow projected onto the slate, basically a shadow of that cursor. The eye uses the connection between the shrinking ring and the shadow to know when you are touching. Then, to further communicate that you are making contact, we play a pulse and a sound. On top of that, we had to adjust the dead-zone sizes we have for scrolling versus clicking in the simulation of the 2D UI, to make them larger. There were really interesting observations we had about hand movements: when people click in the air, their hands actually move in arcs; they do these arc motions, and if you project that down onto the plane, it moves in the XY plane. It is challenging but doable. Long answer; I like talking about that kind of stuff. Sorry.
>> Go ahead, please.
>> Do you have any future plans to create feedback gloves to solve some of the problems with people moving through the buttons?
>> We have definitely researched it. I'm not aware of any plans that I can speak of, but I have looked into it a lot, and I know there is active, interesting research in that area. My main question when thinking about how to do this for the show was: knowing that is an extra piece of hardware, can you get away with not having it? The answer was yes. But it is active research. All right.
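A small sketch of the speech command setup described in the answer above, assuming a keyword (here "toggle color") has been added to the MRTK speech commands profile, and using MRTK v2's IMixedRealitySpeechHandler; note that, unless the script is registered as a global input handler, this typically only fires while the object has focus:

```csharp
// Hypothetical speech handler: the keyword "toggle color" is an illustrative example
// and would have to exist in the speech commands profile.
using Microsoft.MixedReality.Toolkit.Input;
using UnityEngine;

public class SpeechColorToggle : MonoBehaviour, IMixedRealitySpeechHandler
{
    private Renderer rend;

    private void Awake() => rend = GetComponent<Renderer>();

    public void OnSpeechKeywordRecognized(SpeechEventData eventData)
    {
        if (eventData.Command.Keyword.Equals("toggle color", System.StringComparison.OrdinalIgnoreCase))
        {
            rend.material.color = rend.material.color == Color.red ? Color.white : Color.red;
        }
    }
}
```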
>> I was just wondering, does it work with Unity's default UI system? One of your slides showed that.
>> Yeah.
>> So you don't have to do anything to make it work, to set it up?
>> You want me to take it? Yes, it does, actually. We had a contributor from a partner team make it work with the Unity UI system. So if you add a Unity UI canvas, as long as you are using Mixed Reality Toolkit, it works. It is pretty cool. I would run the example scene; you can basically work at a distance with the rays and directly touch it.
>> Really cool.
>> We have two questions. The first one is: why did you replace the bloom gesture with a wrist tap for the start menu? What was the thought there? And the second one is: is there support for radial menus around a closed fist?
>> I can take the bloom versus wrist tap, because I spent a lot of time on that. Very quickly we realized, actually almost instantly, in one of the first prototypes where we stuck a Leap Motion onto a HoloLens, that the first thing people do is go like this, and that blooms you instantly. Even when people get past that amazement, when you grab and release something, half the time you accidentally do a bloom gesture. Furthermore, we got a lot of feedback from customers doing tasks in apps. My favorite example is in medicine, where someone was training how to give someone a shot or something; they would let go of whatever they were holding and bloom. So there was a large number of false positives. We needed something that was more constrained than the bloom, and we also wanted something that was more of a direct manipulation instead of a symbolic gesture. Originally we actually thought we couldn't do the wrist tap, because we thought we couldn't do two hands close to each other; the hand tracking team figured it out. That is why we went with the button here on the wrist. As for the idea behind the UX: people really like menus around their hands. I have seen designers prototyping radial menus around the hands already. I believe there is no radial hand menu component yet, but you can do it with a hand solver component. That is great feedback; it would be great to have an issue around "I want a hand menu." The UI convention is that anything below the wrist belongs to the system, and anything above the wrist is for the app's menus.
>> Thank you.
>> To that point, Julia was referring to the solvers built in. You can download it today and actually try out a radial menu, as long as you follow the guideline of putting it above the wrist.
>> A quick question: what do I need to get started? This seems pretty cool. I have nothing but a laptop and Visual Studio at home. What do I need?
>> What do you need to get started with Mixed Reality Toolkit?
>> Yeah.
>> Good question. Right now we are recommending Unity 2018.3, and you need to download -- actually, go to the first page in the readme; it details exactly what you need so you don't have to remember it.
>> Yeah, we have a getting started page. Then there is the Windows SDK, which is included in the latest Windows update; you don't need to install a preview build of Windows. Actually, you don't need the Windows SDK to do the simulation; all you need is Unity, like a personal-license Unity.
>> Okay, thanks.
>> Going back to the wrist option: are there any other modalities that people can opt into if they have only one hand?
>> Absolutely. I'm also very passionate about the accessibility side of that. We are very actively planning to support a one-handed option. It is actually funny, because that one-handed option is the one we started with; then we went to the two-handed thing. We will support a hand-up option: when you bring your hand up, there will be an icon here that you pinch to open.
We are actively working on that and we are planning to release it soon, in the first update, so it is a timing thing for us.
>> Julia, has there been any work done on a taxonomy of interactions in AR? Maybe it's a dissertation we should suggest to someone?
>> I would love to do that.
>> Has that started anywhere?
>> I personally try to collect these interactions myself. There is a website that a student created that is collecting AR and VR interactions. I wouldn't say there is one definitive guide that I'm aware of. There are also research papers that do studies like, how would you do a gesture to close an app? Then they record the five different ways people want to do it in the air and try to come up with a gesture set that way. I have seen a lot of those, but I haven't seen anything really detailed except for the ones I personally worked on.
>> It is on my wish list.
>> Yeah, it would be fun to do.
>> Hi, this is very cool. My question is: is there any machine learning in the system that can learn your gestures, so that it gets used to your way of doing certain things?
>> We do have some machine learning in the hand tracker. It is based entirely on a deep neural net to segment out the hand from the background and also to align the mesh to the hand pose from the depth data. It is also learning the size of your hand. But as far as gestures go, that is a great idea and an active area of research, and something we really want to include: personalizing the gesture recognition. For now we are just scratching the surface; I can't emphasize enough that we are just getting started with this. It is active research for us.
>> Thank you.
>> What can you tell us about eye tracking, not just the technology but also the ethics and privacy concerns around it?
>> I can take that one. You can take it too. You want to take it?
>> Go ahead.
>> First of all, we support eye tracking on HoloLens 2. Second of all, we are supporting it in Mixed Reality Toolkit. We have awesome examples that Sophie has been working on to show you how effortless eye tracking can be and how it can make it feel like the device is really reading your mind. She has made a lot of examples in Mixed Reality Toolkit, and there is also a really great talk that gives a lot of detail about the implementation. As far as the ethics question goes, I can't speak to it directly because I don't know, but I do know it is very important for us to make sure we are not sharing that sort of data anywhere; we actively care about that.
>> Microsoft has a very clear Windows privacy policy, and I believe if you reach out to us online, we can direct you to the exact details for eye tracking.
>> All right, thanks.
>> Thank you.
>> Is it possible to simulate eye tracking with an external camera?
>> We don't have it with an external camera, although that is a great idea, like sticking on an external eye tracker. You can simulate eye tracking in MRTK: you simulate it with the camera movements, and you can have it follow the eyes instead of the head. So there are ways to simulate it in Mixed Reality Toolkit. Having an external eye tracker drive it is an interesting idea.
>> It is on my wish list.
>> Yeah, actually --
>> Mine too.
>> Submit a feature request to get it; that is actually a pretty cool idea for eye tracking simulation.
>> You showed HoloLens today.
Where are you in regards to using your phone, ARKit, ARCore, with the toolkit?
>> Right. So officially, on the page right now, we are supporting HoloLens 1 and 2, Windows Mixed Reality, and devices based on OpenVR. What you saw today is us experimenting with how to go to other devices. For example, for spectator view, this developer workflow where someone is working on something and I want to very quickly participate and show it to the rest of the team is highly valuable. There will be more details coming on how we will support it, but this specifically, at least, will have spectator view capabilities this month.
>> Just follow us on Twitter; we'll post the updates. The handles are right there.
>> Yeah.
>> For HoloLens 1 you had research mode to access the sensor data. Is there something similar in HoloLens 2?
>> That is a good question, but I think it is outside our area; we can direct you if you reach out to us.
>> Just reach out to us and we'll direct you to the right person to officially answer the question. It is a great question. By the way, Mixed Reality Toolkit does work with HoloLens 1; all of this stuff will work if you deploy to HoloLens 1 as well.
>> Cool. Thank you very much.
>> Thank you.
Info
Channel: Microsoft Developer
Views: 54,812
Keywords: b19, msbuild19, microsoft build 2019, Intro to Building Apps for HoloLens 2 Using Unity and Mixed Reality Toolkit - BRK1003, Mixed Reality, Breakout, Foundational (100)
Id: P8og3nC5FaQ
Length: 55min 53sec (3353 seconds)
Published: Wed May 08 2019