Oculus Connect 4 | Day 2 Keynote: Carmack Unscripted

Video Statistics and Information

Reddit Comments

Does anyone have videos from the hall?

πŸ‘οΈŽ︎ 10 πŸ‘€οΈŽ︎ u/valdovas πŸ“…οΈŽ︎ Oct 13 2017 πŸ—«︎ replies

That's not actually a full recording. They cut out the Q&A.
The VODs of the Facebook live stream and the YouTube 360 stream still have it.

πŸ‘οΈŽ︎ 14 πŸ‘€οΈŽ︎ u/Kaschnatze πŸ“…οΈŽ︎ Oct 13 2017 πŸ—«︎ replies

This video just got taken down while I was watching it...

πŸ‘οΈŽ︎ 4 πŸ‘€οΈŽ︎ u/Jordak6200 πŸ“…οΈŽ︎ Oct 13 2017 πŸ—«︎ replies

At 1:00 he mentions virtual wine-tasting experiences. As predicted! (almost)

πŸ‘οΈŽ︎ 4 πŸ‘€οΈŽ︎ u/SvenViking πŸ“…οΈŽ︎ Oct 13 2017 πŸ—«︎ replies

Listened to half of this last night. I love Carmack. I don't necessarily understand about a third of what he's talking about, but that's ok. It also feels a little bit naughty when he criticizes stuff internally, like the Rift lenses. I can practically hear the marketing team and other managers tearing their hair out...

πŸ‘οΈŽ︎ 7 πŸ‘€οΈŽ︎ u/hobx πŸ“…οΈŽ︎ Oct 13 2017 πŸ—«︎ replies

All I do is convert the video to MP3 and listen on my phone while I'm at work. It's the best part of John not using slides.

It boggles my mind how much experience he has when you consider he started out learning BASIC on late-'70s computers and figuring out how to do scrolling and simple fake 3D on PCs, to now working on machines tens of thousands of times faster, developing things he couldn't even dream of back when he and Abrash were designing the Quake engine together in the mid-nineties.

πŸ‘οΈŽ︎ 4 πŸ‘€οΈŽ︎ u/[deleted] πŸ“…οΈŽ︎ Oct 13 2017 πŸ—«︎ replies

Only an hour and a half? What happened?

πŸ‘οΈŽ︎ 2 πŸ‘€οΈŽ︎ u/chibicody πŸ“…οΈŽ︎ Oct 13 2017 πŸ—«︎ replies

He mentions around 1:47:00 that they have experimental lenses that give you an "amazing" 140° FOV, but he doesn't believe it's the right trade-off because of the pixels wasted compared to 90° FOV at an equal pixel count.

I wonder what he thinks about Pimax offering an even crazier FOV, with "8K" resolution to make up for the loss of pixel density. Are we missing something about Pimax's promises? How can they be pulling this off?
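To put rough numbers on the trade-off being discussed: at a fixed panel resolution, widening the field of view spreads the same pixels over more degrees, so angular pixel density drops. A minimal sketch, assuming an idealized linear mapping of pixels to view angle (real lens distortion profiles are ignored) and an illustrative 1280-pixel-wide eye buffer:

```python
def pixels_per_degree(horizontal_pixels: int, fov_degrees: float) -> float:
    """Average angular pixel density, assuming an idealized linear
    mapping of pixels to view angle (lens distortion is ignored)."""
    return horizontal_pixels / fov_degrees

# The same 1280-pixel-wide eye buffer spread over two fields of view.
narrow = pixels_per_degree(1280, 90)
wide = pixels_per_degree(1280, 140)
print(f"90 deg FOV: {narrow:.1f} px/deg")    # 14.2 px/deg
print(f"140 deg FOV: {wide:.1f} px/deg")     # 9.1 px/deg
print(f"density ratio: {narrow / wide:.2f}x")  # 1.56x
```

So at equal pixel count, the 140° view is roughly 1.5x blurrier per degree everywhere, which is the "wasted pixels" argument; a higher-resolution panel is one way to buy that density back.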

πŸ‘οΈŽ︎ 2 πŸ‘€οΈŽ︎ u/my_name_is_memorable πŸ“…οΈŽ︎ Oct 13 2017 πŸ—«︎ replies

In the Q&A he let slip the internal code name for the Oculus Go: β€œPacific.”

Just thought I’d share :)

πŸ‘οΈŽ︎ 2 πŸ‘€οΈŽ︎ u/Augmentl πŸ“…οΈŽ︎ Oct 13 2017 πŸ—«︎ replies
Captions
[Music]

Okay, so of course the main keynote gets obsessed over by a big team of professionals for weeks, but here I get to be a lot rougher and more opinionated, and even a little bit critical. I actually wrote up all of my notes on the plane flight over here, which I know drives a lot of production people crazy, but I think that's part of the charm of this.

So here we are, four years on. I get reminded of this quite a bit: when I started at Oculus, I did what a good new employee does and adopted the company coding style, which was Michael Antonov's, which included headers at the tops of files with creator and date. So now I still run by all this stuff and see "Author: John Carmack, created four years ago." It's been a long time that we've been working on this, and VR hasn't taken over the world yet, but it is showing up in a lot of surprising places. Just last week I was at the grocery store, and I looked over and there was a Gear VR on a poster for some immersive wine-tasting experience or something, which is a very strange thing that four years ago we probably would not have predicted we'd be seeing in those sorts of circumstances.

But there still is this sense that maybe this year we are out of the heroic age of VR, of shipping consumer, commercial VR out to people, where the people that signed up for this bold new journey to go out and build these products that hadn't been out before... some people expected it to be magic right off the bat, that there would be a bolt of lightning and this huge amount of success. But the truth is, our products are competing against hundreds of other very good, very finely tuned products, and it's going to take a whole lot of work to reach that level of success.

But I am optimistic in so many ways, and much of what makes me optimistic about VR now is that it actually feels like all of the pieces, all of the ingredients that we need, are already really here; they're just not stirred, cooked, and seasoned exactly how they need to be.

We have over two thousand applications in our store. There's a broad range of quality, but there are some very good things, and almost anybody should be able to find something good and interesting in the store. I look at that and say: Facebook is largely an advertising company; we should be good at connecting people with the content that they're going to enjoy. That's something that we're not doing a great job at yet, but I think there's huge room for us to improve, and a lot more that we can do.

And 360 media has probably exploded beyond anything I would have expected four years ago. Facebook having 360 in the feed, both photos and videos, is huge. We have thousands and thousands of high-production-quality 360 videos and millions and millions of 360 photos. 360 photos are by far the most popular form of user-generated content for virtual reality, or for immersive media. Terminology-wise, I have tried to settle on using "immersive media" more often, because we have this whole mess of different content types, whether it's 360 or 180 or stereo, whether it's got 60 frames per second or narrower fields of view, and I lump all of this together and just say immersive media is something that's better than looking at media on a flat screen. I think there's enormous value that's already there that we're, again, just doing a poor job of exposing to people. I make this comment a lot: if you look into Oculus Video and you say "I want to watch 360 videos," you could be forgiven for thinking that there are only 80 videos as you page through the top featured things, not that there are thousands and thousands of them available, just waiting to be discovered.

So these are some of the
things that, you know, Oculus should be doing a better job at. In our partners we do have some world-class brands, like Netflix and Minecraft, that everybody in the world knows and that are on our platform, and the web experience that we're providing is getting pretty good. In many ways the web is the backstop for all of the apps that you don't have, where there's a webpage for practically anything that you could want to do. So again, it feels like the pieces are there.

In many ways Samsung has done a better job at being user-focused than I feel Oculus has. When their applications first launched, the original Samsung VR (or Milk VR as it was at the time) or the Samsung browser, I was fairly critical of them. I would look at them and say: they should be doing this in a time warp layer, this should be filtered better, the wrong option is on here. But they released version after version, and we look over some number of versions later and Samsung had the number one application in VR for a while. I used that to berate a bunch of people at Oculus: we need to step up our game. Samsung is out there listening to users, doing what they want, providing the features that they actually find valuable, while Oculus has in many cases focused on "we want to build the platform, we want to build the infrastructure." It's easy to fall into a trap there and not judge yourself on value and impact to the users when you're saying, well, I'm building infrastructure that's going to rely on somebody else to extract the value. So I do push Oculus very much to say: let's actually do things for our users, not necessarily for our platform. Often I'm like the voice in Tron: we should fight for the users.

But of our applications in the store (we've got 2,000-plus applications), so many of them are stuck on sort of that first version, like the original Samsung applications, and they don't really get to the point of being iterated on and cleaned up and brought to that level of value. In fact, some developers go through a series of applications: they build a VR application and do something interesting, then they build another VR application and do something interesting, but they're all only taken to that first maybe 80%. Sometimes, like in game development, you talk about how the second 90% is always the hardest, and not many of our applications are going that extra mile where all the really important magic and user value happens.

There's a school of thought where that could be the intelligent direction: if you believe that you need a killer app, something nobody's ever seen before, then it can be completely rational to be point-sampling the application space, trying lots of different things, throwing things at the wall and seeing what sticks. If your thought is that nothing anybody's doing is even on the right path to being the winning solution, then you just try shooting around randomly in different places.

But I'd like to present an alternate viewpoint: there are a whole lot of nonlinear value thresholds. It's not always a matter of "a little bit more work here results in a little bit more user value"; sometimes there are very sharp cutoff points. If you look at many of the things about winning and succeeding in lots of different areas, and even the revenue streams in our stores (it hurts to look at these sometimes), you have a handful of applications that make a million dollars and are very successful, and then you have hundreds of applications where developers put developer-years of effort into something and wind up with five hundred dollars of revenue from it. That's really sad, but unfortunately
that's the reality in almost all of these consumer-facing industries, where sometimes the differences between something that does almost nothing and something that is really huge are relatively minor things. Facebook and Instagram are examples: they had competitors that were doing very similar things, and in many cases the differences between what became the multibillion-dollar juggernauts and the companies that went out of business were dozens of small things done somewhat better.

So I would like to push everyone towards working harder on existing applications. There's another argument for that: each new generation of Samsung devices especially brings in a huge new wave of users, and that's an opportunity for an older application, cleaned up and improved, to make brand-new fresh impressions on people. I'd also say, again in terms of application discovery and presentation to users, we're not doing the best possible job. An application could go out and vanish without a trace beneath the surface, not necessarily because it didn't have a kernel of something right in it, but because it never got exposed to the users right. The guerrilla marketing tactics and all the things that people do to be successful on other platforms also have to be applied to the platforms in VR. So if you have a brilliant idea, if you think you've got that lightning in a bottle, the killer app, by all means go ahead and work on it. But otherwise you can always spend time improving the existing applications, and those are good muscles to exercise. Even if it turns out that wasn't the magic app, going through the disciplined work of making it as good as it can possibly be is what's going to need to be applied when you eventually do get the magical application.

When I complain and bemoan internally about how much better our VR applications could be, about the things they could be doing better (why the graphics should be better, why the responsiveness should be better, why the load time should be better), I have people that know and agree, but they'll say: "but all the devs don't have your experience." To which I reply: well, go get it, grow stronger, I'll help. There are a lot of people that do have this experience. You can find old-timers that know how to do this type of optimization, and newer people that know how to do this type of user testing, or even marketing. These are skills that are going to be necessary for success.

Some people would like to think that this type of disciplined coding and design within very tight constraints is optional; that maybe, like in the old days, you could just wait a few years and your lazy design would just work on the new computers. That was the way of PC development for a very long time, but we're in a situation now, with mobile being important and with the end of Moore's law drawing nigh, where those types of skills are absolutely going to remain important for the foreseeable future. This future device that we all imagine, where we have AR sunglasses that billions of people are wearing, is probably going to have at least as tight a power budget and design constraints as you have on Gear VR. It's going to have lighter weight, smaller battery, less thermal mass, and it'll be expected to run for 20 hours instead of three or four. You are not going to have some magical thing come in between now and then that's going to just solve all of our problems.

On this distinction, I raised a hypothetical question to a lot of people: if you had a choice between getting magic hardware and magic software, which would you choose? If I could take Gear VR or the standalones and automatically
give it 4K displays, give it twice the performance that it's got right now, magical optics that are perfectly crisp across everything; or you could get software that actually does things to the limits of what the hardware we have today can do. I have always taken the side that right now I would rather have magic software over magic hardware. I think we have enough extra value that we can put into the software on the platforms we have today to make those kinds of critical differences, and I think we could take what would be thousands of dollars of hardware, magically put it down to this low price level, and the current software that we have still wouldn't be incredibly, awesomely engaging just because some of the hardware magically got a lot better.

I gave a commencement speech earlier this year, and the advice that I gave to the young, impressionable minds there is basically the same advice that I'd give to the older but hopefully still impressionable minds here, and that's to embrace the grind. You've all shown that you're bold in starting to work on an emerging platform that's not really mainstream yet, but it takes more than just being bold; you have to actually work really hard, and you've got to fill your products with give-a-damn: to really care about every aspect of them and turn them into things that you believe in, where you think you've done the best possible work. The whole world's first impression of VR is largely ahead of them, and you might be providing it, so every little detail counts. I like to say that success isn't about that one brilliant idea; it's about doing five hundred or a thousand little things right and getting it all done.

So despite my comments about favoring magic software over magic hardware, it's fair to say that the big announcement, a lot of the buzz from yesterday, is around our standalone hardware. A lot of people have been surprised that I've been the champion of mobile and basically lower-end VR, given my prior history with high-end PC and pushing the limits on expensive GPUs and so on. But I signed up for this mission of getting a billion people in VR, and that's not going to happen with very expensive hardware. I'm also not willing to just say, well, let's wait 10 or 15 years so that the capabilities of the very expensive hardware somehow trickle down, if that ever happens. I do make the point that you should believe, today, that literally the power of the PC will never get to a mobile platform. We'll run out of Moore's law first; it's just not going to get there. We'll get another order of magnitude faster, but we are unlikely to get two.

So if we want to get to this massive scale, it does mean adopting these lower-power, lower-end mobile technologies. And I like Go a lot. I've been using it; normally I spend most of my time actually programming, but in recent weeks I've been spending a lot of my time being probably the most valuable beta tester, or dogfooder, on Go. I've been trying to take it to the places in my life where I would normally use my iPad; instead I'm carrying around my Go, in a little bag for secrecy reasons up until now, and now I can start actually carrying it openly. This is the place where I feel standalone will fit: it's not your phone, it's not your work computer, but it's the place that the tablet could fill. I would take it around; I'd do my morning trawl of Twitter and Hacker News inside the browser on the Go, I'd watch Netflix with it, I'd watch movies in it. In fact, one day, as an example of the different use cases, I basically left it in Netflix and it sort of functioned as a Netflix machine: the headset was sitting there, I could put it on, watch for 15 minutes, take it off, go do something else, and a few hours later put it back on and watch a little bit more. And that's a very
different user experience than working with Gear VR. It's the same software there, but you skip the trouble of starting it up, docking it, launching an application. Getting through that friction matters a lot, and I preach this in all sorts of application areas: one click versus two clicks versus three clicks makes enormous differences, and people that are hyper-analytic about things like website conversion rates know the value of how long it takes you, how many actions it takes you. The standalone, a device that really is just ready when you put it on, is a significant change in the experience, even if you do basically the same things. So I'm pretty bullish about them; the possibility is there.

One of the things in me trying to treat this like I would one of the other devices: I left it on my nightstand, and in the morning I'm like, okay, pre-dawn time, let's put on the headset. And of course it's like, ah, too bright, blinding in the morning. So that's where you can think about: well, should we have something like an automatic sensor for ambient light, so you could adjust for that kind of condition? But it's interesting that it does show up some of the trades that were made with Go, where, because it's an LCD screen, even when the screen went completely to black, on my dark-adjusted eyes it was still pretty bright. There are several points about Go where they are net wins but not pure wins; there are some trade-offs involved.

And of course the biggest trade-off: I think the price point is magically important. When I talk about nonlinear thresholds, getting to $199 I think is incredibly important. There's a lot of data from things like console sales where maybe you start at $400 and you sell this much, and you drop to $300, but dropping to $199 is one of those threshold points where you do a whole lot better at that number. It's also a number that, psychologically, for a lot of the well-off people, is at the point where it's a giftable price point. Lots of us in VR do not want to buy a gamer PC for our parents or aunts or uncles, and if they're not using a Samsung phone we don't want to set up a whole separate phone for them to set up a Gear VR. But when you've got the standalone, it's just: here's this box, you don't need anything else, open it up, get through it, and you'll be in VR and you can look at these applications. I think that's really important.

But some of the trades that happened: obviously the LCD screen versus an OLED is a big one, where we've spent years talking about all the virtues of OLED displays. Rift is OLED, Gear VR is OLED, and there's some powerful value to OLED: you get perfect blacks, you get very pure colors, and you have very quick response times. These are things that are valuable for VR. But it's worth noting that there are some significant trade-offs to OLED as well. The Rift in particular spends a decent amount of computational time during mura correction, to correct for the inconsistencies across the individual OLED pixels, and doing that generally involves adding some values to some things, which can detract from the pure-black argument. And on the Samsung Gear VR in particular, for years I've struggled with the problems with deep blacks. Anybody that watches movies in the theater, or Netflix, when scenes go very dark, they just don't look that good, and only this year did I figure out the last piece of the puzzle of why that is.

For a long time we've had the obvious case of the two-frame rise-time ghosting problem: if you have a very dark line next to a bright area and you move back and forth, you will see one distinct frame of a slightly dim ghost version, and that
happens with the very dark values not being able to rise completely to the very brightest values. Again, on PC, where we have lots of performance, Rift does a software overdrive to try to deal with that. It can't handle everything for full black-to-white transitions, but if you're going black to gray for the next frame, it can boost the gray up a little bit higher, and it does a pretty good job of working that out. But again, it takes processing power.

The other problem is what I've called the black smear problem, where if you have some very dark imagery and you move side to side, it has more persistence and smears across the screen. At least three times I tried to fix this problem in Oculus Video by adding dithering: I would put it directly into the time warp layer, reading the movie file or the scene images directly, and I'd add noise, temporal and spatial dithering, and it never really worked right. I was very disappointed with this, and I never quite understood it until just this year, when I was doing some comparisons of the video screens, trying to get to the bottom of some of these problems. I went into a completely dark broom closet, all the light blocked out, and I had my Samsung phone down there with a color pattern grid in low-persistence mode going from 0 to 255, and I stared at it until I could carefully look at everything. And then it was clear that many of the low-order values, instead of being a smoothly increasing color gamma ramp, have flat spots in them. There are several areas where it's like 0, 0, 0, and then it bumps up to some nonzero value. This is why dithering plus or minus one value never wound up helping: it was all inside a flat spot.

So that's something that I think Samsung could probably fix. I suspect the way that comes about is: you run the proper sRGB gamma curve and say the colors need to come down to here, and you've got different responses for red, green, and blue, so it's mathematically the closest. But what I would argue for is that we should have an always-increasing color ramp, even if it meant that the colors weren't as pure and you maybe had hue shifts down at the bottom level. So I think that's potentially fixable.

In contrast, on the Oculus Go screen, which is an LCD, all 256 color values are discrete. I can look at that in my dark broom closet and see that each of those steps does show an increase, so the dark black areas that tended to look actually bad on the OLED screens look better here. It also has more sub-pixels. Something else that's kind of an open secret is that the PenTile arrangement that a lot of OLED displays especially use is basically a cheat on resolution. Instead of having red, green, and blue for every pixel in a 2560 by 1440 display, you'll have one green for every pixel, but only one red and blue for every other pixel, so you have two sub-pixels out of three on Samsung. In fact, there are PenTile schemes from some other manufacturers that have even fewer sub-pixels; some only have one and a half out of three. This is a difference where the Go screen is full RGB stripe, so it's got three sub-pixels per pixel. There's more actual resolution there, it fills out the space better, so the screen-door effect is reduced; you wind up getting some wins there.

So overall I'd say on net the screen is a positive: a little more resolution, a little less screen-door effect, no deep-black problems. But it does have the lower contrast that you'd expect out of an LCD. You notice it especially when the entire theater dims down; in that dark area you can still see this kind of gray backlit area, like you're looking at an LCD TV in a dark room. But still, I call it a net win.

The other disadvantage of the LCD is that it does hurt latency a little bit. The way the OLEDs work is, we do a rolling shutter: we chase the
raster as it's lighting up the screen a little bit at a time, and we are drawing basically as close as we can, within scheduling limits, behind it. That's how we get this super low latency on Gear VR. Now, with the LCDs... the OLEDs change in a microsecond or so; they are just super fast. You command that pixel and it lights up or shuts off. But the way LCDs work, you say "I'm changing your value" and they take a while to change. Older LCDs could take 20 milliseconds in some cases; on many desktop LCDs you can see a two-frame ghost just when you're moving your windows around. The better LCDs now can change in four or five milliseconds, but that still means what we have to do is change all of the values with the screen backlight turned off, wait a little while, and then blast the backlight and everything comes on. That is a global shutter effect, which is the way Rift runs in most cases.

Internally, this used to be one of our great technical debates: what artifacts rolling shutter would have versus global shutter. Many people argued that this was just not going to work out well on Gear VR, but now pretty much everyone agrees that it's an okay compromise. You get less latency; there are absolutely some artifacts that we don't fix with rolling shutter, but they're things that in practice don't really matter. Time warp does an interpolation between the top and bottom across time, as well as across your orientation changes, but in Gear VR, if you had a bar in front of you that was moving up and down rapidly, because it's a rolling shutter the bar would feel like it's tilted a little bit. If you were moving your head up and down, moving the entire viewpoint, time warp could fix all that up by moving the entire thing; but an animation of something moving up and down in front of you would not be corrected correctly. It turns out that just doesn't happen all that often, and the latency advantage of rolling shutter is a big deal.

So when we don't have rolling shutter on the LCD, it means we've lost some responsiveness, and we should fight really hard to find a couple more milliseconds here and there. We can tighten up the scheduling, and we may be able to do some kind of late latching, even for attitude, to carve a little bit off on the final GPU time warp. But it's still unquestionably a few milliseconds slower than Gear VR. It's still very good, and we're hoping to make it a little bit better, but it will probably always be a little bit more latent than Gear VR. Again, on net, I think it's a good win.

The lenses, the optics in Go, are second-generation Fresnels, or there might be a later generation in there, but they're newer Fresnels from our research team's development work. I always thought the Fresnel trade-offs on Rift were pretty serious. I was pretty put off by the glare effects that happen in Rift: in all of our early demos, we'd get the nice bright Unreal Engine 4 logo popping up there, and I'd see these fringes coming off in all the different directions. We have design rules for this; in Gear VR, one of the design rules you should follow is don't put bright whites in the corners, because of flicker sensitivity. On Rift, you should never have bright whites across a black background, not because of flicker but because of the Fresnels.

But it does help a lot in other areas. The biggest win for me in the optics is that Gear VR is really only clear right in the center, and if you've got things further away, it's just blurry, almost no matter which way you adjust the focus wheel. This shows up most in games that have face-locked HUDs, like in Minecraft or Drop Dead, things like that, where you might have information presented around your periphery. I found that strikingly better on the Go optics than on Gear VR, and in fact it's
encouraged me to to go back and say now that we can actually clearly see more of this I'm reading some of my trade-offs around chromatic aberration correction where in the last couple years while we shipped initially on note 4 we had chromatic aberration correction on like the movie screens we turned it off on some later versions because we were having performance problems on some driver versions and part of my argument was it's a blurry mess away from the center anyways the fact that we have chromatic aberration is not the worst thing there but now that I'm spending so much more time in go one of the things that I did last week was renamed okra Matic aberration on movie screens and some really picky fix ups for the things that have to happen I if you're just doing a normal game rendering you render the 3d world you might have screens or different things in it you just turn on chromatic aberration correction and everything works but if you're using layers time warp layers specifically to get the high quality movie screen or the high quality panel there are some surprising subtleties to that we're turning on chromatic aberration correction can actually give you more artifacts in some ways where this was visible in you would see blue fringes at the edges of the movies in many cases and I you could carefully blend over it and they just wouldn't go away and there's two interlocked reasons why why that happens one is that I when we do chromatic aberration correction blending layers we have one alpha channel sample that blends this but we've spread the three chroma channels apart again rift with lots of computing power at its disposal does the the compositing of layers properly where every channel gets its own alpha value this is again a case where you do all of these things that riff does with mirror correction and overdrive and proper blending I'm just running the tie in an asynchronous space warp just running the compositor on riff from rift would take up all of the 
But it's possible to fix a lot of these things with design choices. Avoiding the blending problem was a matter of — I used to have a vignette around the screens to nicely fade off the edges; I turned that into a hard, truncated edge. But then you get visible stair-step edges in some cases, and you still have fringes. This was one of the other unexpected aspects: if you've got a rectangle, a screen in space, and it's clipped to the texture coordinates 0 to 1, and you do chromatic aberration correction, then for the pixels up in the corner, red spreads out toward the edge and blue spreads inward. The problem is that red spreads off past the edge of your layer, and that red never gets drawn anywhere, so you're left with only the blue and green there, which winds up color-tinting everything around it. The solution is to make sure that if you've got a layer that's going to be hard-clipped, any colors in it are inset — say, 16 pixels or so — so there's room for the reds in the corners to spread out. This works fine in the cinema, where it's okay to have a little bit of a black border around things, but it can have some trade-offs when you try to do it in other panel applications. Still, a net win for the Fresnel lenses.
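The inset rule above follows directly from the geometry. Here is a minimal sketch of that arithmetic (my own illustration with made-up but plausible numbers, not Oculus code):

```python
def required_inset_px(edge_radius_px: float, red_scale: float) -> int:
    """Pixels of margin needed at a hard-clipped layer edge.

    edge_radius_px: distance from the distortion center to the layer
                    corner, in pixels (hypothetical value).
    red_scale:      radial scale the aberration correction applies to
                    the red channel relative to green (e.g. 1.02 means
                    red samples are pushed ~2% further out).
    """
    # Red spreads outward by (red_scale - 1) * radius; anything that
    # lands past the 0..1 texture edge is never drawn, leaving only a
    # blue/green tint, so content needs at least this much inset.
    return round((red_scale - 1.0) * edge_radius_px)

# A corner ~800 px from the distortion center with ~2% red spread
# lands right at the "inset 16 pixels or so" rule of thumb:
print(required_inset_px(800, 1.02))  # -> 16
```

The exact numbers depend on the lens and the correction profile; the point is just that the margin scales with distance from the distortion center times the channel spread.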
I really love the built-in audio on Go. This is something that pains me with Gear VR: we try to tell people to do good audio, but the truth is that most of the people, most of the time, using Gear VR are just hearing the phone speaker — the little monophonic speaker on the phone — because it's just a hassle to dock the Gear, plug in your headphones, hope that plugging in the headphones doesn't actually undock the Gear VR if you press too hard, put the headset on, put the headphones over it, and worry about tangling up cords. For somebody that's going to sit down and watch a long show, it can still be worthwhile, but it's a huge amount of this friction that I keep talking about in getting people into the experiences. And while Rift has built-in audio, I've always felt the little flip-down headphones were fairly awkward for me — a little bit of friction in donning the headset and getting them adjusted. The way the Go audio works is that the drivers are inside the main headset, and it basically pipes the audio through the first part of the straps. An audiophile will probably have some complaints about this and say it doesn't have the best frequency response for whatever range, blah blah blah, but it really is optimal from a convenience standpoint: you put it on, spatial audio works, you can look around and hear things spatialize. It puts a lot more value on doing that, so I count that one as a pure win. You can still connect other headphones if you want a perfect over-ear experience, but for 99 percent of the times people are going to use it, it's a big win.

Now, the great thing about having our own standalone headset is that we can address things top to bottom; there's no longer the barrier between all the different players — what Samsung is going to do, what Oculus is going to do, what the carriers will allow Samsung to do. One point that wasn't made completely obvious: this does not have cellular service. That's one of those really tough decisions you have to make at a product level. On the one hand, cellular stores are incredibly enticing — thousands of stores all over the place, where if you get your product in there and they push it, they're behind it, you can sell a whole lot of units — but it's usually tied into cell phone contracts and data plans and all the things there. The worst part is that you lose control over being able to completely dictate the user experience.
And that has been one part of the enormous set of problems we have with updates through Samsung. There are so many players involved in doing updates; Gear VR is a nightmare for system updates. At my house I've got an older S7 that I occasionally use for playing Gear VR Minecraft, but if I haven't used it in two weeks, I can practically guarantee that I'm going to turn it on and there's going to be some combination of an Android update, a Samsung update, or an Oculus update — and you know the drill: you turn it on, maybe it updates and reboots; you try to dock it, Samsung needs to update something; you start up, and Horizon is downloading something else. It's a terrible user experience, and it's something we want to put a high priority on making a lot better. Even as the product is designed, if you turn it off every time after you use it, you're still going to have some update problems — you won't get the updates — but what I'm arguing for is that we should never, ever update something while the user is in VR, and somebody who did turn it off every time after they used it would never be blocked; they would just keep using the experience they've got. That may not hold all the way to the end, because it gets really tempting, when you're maintaining backwards compatibility with things, to say: roadblock everyone, force them to update. But some of the worst things will get much, much better, like the problems with driver updates and GPU driver updates. This has been a nightmare. The problem is that we go and work with Qualcomm and ARM, and we get these extensions built, but we can't even get them onto the Samsung headsets at first, because Samsung has a whole process — they don't take things wholesale from the vendors; they evaluate each thing and apply different patch sets and so on — and then those don't even go out to their internal builds for quite some time.
And then the big problem is that Samsung doesn't even have the power to get these things to users, because when they send out a maintenance release — when they can say, all right, we've updated our kernel, our drivers, all of these important things, or we've got a new OS version — it's still up to Verizon or AT&T or whoever to decide when their customers actually get that over-the-air update. This is going to be one of the powers that is very good for us in the standalone space: we can decide in any given week to replace the kernel, or the drivers, or any of these things, and that should allow extensions and advancements to happen a lot more frequently. I hope to use this in many cases to improve the Gear VR experience too, where we can have arguments with Samsung about whether something is a good idea or a bad idea. I absolutely know our place with Samsung: Samsung is about selling hundreds of millions of phones, and VR is a small fraction of their business — it's a huge part of what we're doing, but they are perfectly justified in being conservative, not wanting to jump on things, and letting their most important businesses drive their decisions. But if we can say, we think this is a good idea — look, we already did it, here's the code, here's where you need to patch Android — I think we've got some chance of getting more improvements more rapidly into the Gear VR side of things.

This is like the multi-view debacle. I've been talking about multi-view for at least three years here, and yes, it is there, finally; it seems to be reliable, and it's providing excellent performance wins for many applications. But wow, that took a long time. That was very surprising to me. In the world I came from, I had such good relationships with Nvidia and AMD in the PC space that we would do things like say, all right, this is a good idea, let's go do this,
and there could be an experimental extension turned around in a week, and we could have things out to users sometimes in just a couple of months. Seeing this take all the way to three years — and still not across all of the devices; it probably never will show up on some of the earlier ones — was a very eye-opening experience. There's an aspect of that that has fed into some of our other strategic thinking. Some of the problems were in the way multi-view was set up — there were some tactical mistakes made in the actual specification that I wish we could go back and change, that would have made things easier — but it also relied on getting into devices, then getting into game engines, and then finally actually getting taken advantage of by new applications. We're working closely with Qualcomm on a foveated rendering extension which, by strategic design, is trying to bypass all of those things that made multi-view so painful to deploy. It's being set up in a way that we can enable it almost transparently to an application. In general I'm never in favor of doing things completely behind an application's back, but if we can make it just a matter of "turn on a flag in your manifest and we will enable foveated rendering," that's a far different thing from "upgrade to the latest version of Unity, change your code to behave this way, rebuild everything, and then it will start working." So there are approaches we can take here, and this has been a real change in my viewpoint. Five years ago, as a low-level PC developer, I was deeply suspicious any time drivers had to go do magical things, because it was almost always not what I wanted — let me do the specific low-level things; I know what I'm doing, I'll take care of it; you're going to make decisions that are not optimal for me in various ways. But this three-year odyssey with multi-view has changed my risk profile on that.
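As a rough illustration of why a transparently enabled foveated rendering flag is worth fighting for, here is a toy model of the fragment-work savings from a simple radial foveation map. The tile grid, radii, and rates are entirely made-up numbers for illustration, not the actual Qualcomm extension:

```python
import math

def shading_fraction(grid: int = 8) -> float:
    """Fraction of full-rate fragment work for a toy foveation map.

    The eye buffer is split into grid x grid tiles.  Tiles near the
    center (where the Fresnel lens is sharpest) shade every pixel;
    tiles further out shade 1/4 or 1/16 of the pixels and rely on
    interpolation to fill the rest.
    """
    work = 0.0
    for ty in range(grid):
        for tx in range(grid):
            # Tile center in [-1, 1] normalized eye-buffer coordinates.
            x = (tx + 0.5) / grid * 2 - 1
            y = (ty + 0.5) / grid * 2 - 1
            r = math.hypot(x, y)
            if r < 0.5:
                rate = 1.0       # full detail in the sweet spot
            elif r < 0.9:
                rate = 0.25      # quarter-rate mid ring
            else:
                rate = 0.0625    # sixteenth-rate periphery
            work += rate
    return work / (grid * grid)

print(shading_fraction())  # -> 0.33203125
```

Even this crude map cuts fragment shading to about a third of full rate — the kind of win that, applied automatically behind a manifest flag, could make hundreds of applications a little faster without a rebuild.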
Okay, yes, the vendors can very likely still make terrible mistakes in the drivers in different ways, but it's still likely to be higher user value if they can make a change like that and we can make 500 applications automatically somewhat better — a little higher performance in the middle, or a little better frame rate. And for any type of extension, because we own the whole platform now, we're going to be able to roll out fixes and changes and experiments much more quickly, and that's very exciting.

Now, Go is still an Android device, and it's Gear VR compatible. I would not encourage anyone to develop anything Go-exclusive; you should still consider Gear VR the primary platform on mobile. It will be quite some time before standalone reaches the numbers that mobile has, and this is a point of strategic contention inside the company, but I'm of the belief that drop-in, cell-phone-based VR is going to be persistent. There are many people who think that things like standalone will take over and become the dominant form, but I think it's likely that even if we look four or five years into the future, mobile cell-phone-based VR will still probably have the largest number of users. Again, our software's not great now; as we make it much better, even the same type of platform will become a better, more valuable experience for users. And what constitutes our minimum bar — this kind of Note 4 level of performance — we're going to be seeing that in $50 phones in future years, and I think there are massive markets where a cheap phone playing VR applications will still have significant value to users as our software ecosystem gets better and better. So Go should not be looked at as a completely separate thing. Our branding and marketing on this are going to look pretty weird, because internally I can say, well, it's kind of a standalone Gear VR, but
Gear VR is Samsung, Go is Oculus — it's a different thing; it's our mobile VR platform, and we need to figure out better messaging about that. But in talking to developers: you should still certainly think about it as one target platform. In many ways, Go is going to be an excellent tool for developers, because we have this lower-level control, and we are much less likely to break things that are important to developers, like the various profiling tools or the debugging tools. We've gone through so many generations with Samsung where, well, this tool worked this generation, but on the next rev it's broken again, or it finally works over here, so please don't upgrade anything and potentially break it. We can be a lot better about that — we'll still make mistakes, but because we can actually fix them the next week if we need to, it's going to be important for developers. It's interesting, though, when we consider the actual use cases: on the one hand, you might think this would be a perfect development target, since we have better control over things, but because it is an all-in-one headset — I find myself setting it face-down on the table next to me, so I spend a lot of time sticking my face down into the headset to quickly check something, as opposed to the mobile stuff, where everybody has their little phone holders with phones sitting up in front of their workstation when they're working on regular Gear VR things. So there's still a good argument — certainly you need to test on multiple platforms, but as development tools they'll have different values in different places.

Then there's the high end. Again, I'm most excited about Go because it's near-term — it's coming quickly; developers, you'll all be seeing them, like, next month or something, and then early next year it'll be available to users. But the high-tech side of things, with the Santa
Cruz prototype direction, is of course exciting for people who are interested in where the bigger technology bets go. Again, in terms of strategic disagreements inside the company, there's a legitimate argument about the place of three-degree-of-freedom tracking — basically the Gear VR and Go level of things. What experiences can people have there, versus: do we need six-degree-of-freedom tracking to deliver reasonable value, and then do we need controllers to deliver value? Santa Cruz is making the bet that it's all the way to Rift spec, essentially — a six-degree-of-freedom position-tracked headset with dual six-degree-of-freedom Touch controllers — while Go is a couple of steps back from that: a three-degree-of-freedom controller and a three-degree-of-freedom tracked headset. I still pretty firmly believe there's a lot of value we can deliver at the low end, and this will be one of those things where, if we do everything right on Go and it goes out and doesn't get much traction, then I will have learned something, and we will have pivoted, probably permanently, to the higher end — but I suspect it's going to do pretty well. Realistically, of course, I don't expect Go to do Samsung-type numbers, because Samsung is a juggernaut at all of that marketing and sales. It's been amazing to see — I was in London, seeing a building covered with Gear VR and Gear 360 advertisements. Don't expect that from Oculus in the coming year; that level of presence is really a Samsung superpower. But on the six-degree-of-freedom tracking side, the argument that I always make is that a lot of what people do, in Gear VR especially, wouldn't even benefit from six-degree-of-freedom tracking: the video applications — the 360 videos, 360 photos, even watching things in Netflix or the cinema theater — benefit very little, or not at all, from six-degree-of-freedom tracking,
and it is true that this is where people are spending their time. The argument we go around in circles on: I keep saying I made the prediction early on that more than half the time was going to be spent in these media applications, and it absolutely is. The counter-argument is that, well, that's because our games aren't very immersive — they don't have a real sense of presence; you need hands to be able to do something in VR, otherwise you're a passive spectator, so it's not surprising there aren't really involving activities there. And it's a credible argument — reasonable people can differ on some of these things, and ultimately the market will be informed by how people respond to the product — but I think there's still quite a bit of value there. Still, there are amazing things to be had by getting full six-degree-of-freedom tracking in a mobile form. Everybody talks about room-scale VR, but the ability to walk around a large space — stadium-scale, arena-scale, or world-scale VR, where you can move around and walk behind things rather than navigate behind things — this is of course the core magic of VR; this is the really powerful stuff VR can bring, and it is inevitable that all of this will come down to very low price points in mobile. There are enough trends in things like the quality of cameras and the integration of different parts that these prices are going to keep dropping. It is a question of where mass adoption happens, though, and at what price points. I'll harp again on this: I think $199 is a superpower for Go, and it's unlikely that we can throw all the other things in at once at that price. But everybody that's seen the demo, that's played Dead and Buried on it — I do agree that it's possible to take the magical essence of anything anybody's doing on the Rift and the PC and bring that to a mobile form factor, but
it is not going to be easy. I've made this comment a lot: a high-end PC might be burning 400 watts of power, while on a mobile system maybe you're burning four, if that, in a lot of cases — a factor of a hundred in power. It can be bridged, and if you look at it, Dead and Buried is done by crack internal developers — Andrew and Ryan and the Oculus strike-team people. They know what they're doing, they're veterans, they've been around for a long time, and they bring over the entire essence of what they were doing. Sure, you're missing a lot of subtleties in some of the graphics and different aspects, but you can get the core magic. It's possible, but it is a lot of work, and, like my talk at the beginning, it takes people buying into some skill sets that are not necessarily being exercised very much right now. If you're interested in seeing that level of interactivity and performance, you might as well start by sprucing up some Gear VR-level things, because those are the relative performance levels that you have to deal with.

Some of the things that are interesting from the standalone engineering side: I've often commented that on mobile you only get about half, or less, of the performance you'd think you would get, and largely that's because of thermal limits. Everybody's had the issues of Gear VR overheating, and I stress about this a lot. Each time we get a new generation — like the S8, which is a lot better — people say, well, we want to do more aggressive graphics, and I'm always asking: can we just hold back and instead have a device that never overheats? It's a balance between saying, well, I really want to add my specular highlights to this, and, well, I really don't want to overheat and have a user say, "my device just kind of crapped out on me" — that's not very good. But with proper cooling, it is possible to run the devices at their maximum, or very high, clock
rates. But that's one of those engineering trades: do we put on active cooling and have fans blowing over it? And certainly we're going to have lots of this computer-vision work going on — tracking the world, tracking the controllers — that comes on top of the minimal hardware that's going to be there, all driven by the battery, so there are a lot of trades there. Some of the other interesting aspects of owning the whole device: a discussion that came up — one of the problems we have on Gear VR is lens fogging. One of the reasons it's less of a problem on Rift is that you'll notice the Rift stays kind of warm. If it just sits on your desk, it's burning power — probably as much power as an active cell phone while sitting there idle, for the cameras and the Rift itself — but that actually keeps the lenses from being as much of a fogging problem, because the temperature of the lenses determines when humidity or sweat condensation will wind up landing on them. Something we've talked about for Go: maybe, when you've got it on the charger and plugged in, we should turn the screen backlight on but leave the pixels all black, so it generates heat into the device and notably keeps the LCD warm, so the pixels switch faster. We have concerns about various low-temperature conditions for the LCD, as another trade-off against OLEDs.

One of the other great things is having our own operating-system team, so we can now track everything down. It's been so wonderful for me — on a Samsung phone, you do a systrace and you get a list of everything that's going on, and I just stare at it dumbfounded: what is all of this crap running on my phone when I'm supposed to just be in VR? Things like, why is Amazon Video doing something in the background when I've never downloaded it — because it's a carrier preload for
something. All of that is gone on Go. On Go, you get a nice list of: here are all the processes; they're all Oculus processes, plus a few basic Android processes. That's really great, and now we get some of the really great kernel-ninja types inside Oculus tracking down exactly why this interrupt is blocking this other thing, and why we're getting scheduling delays for various reasons. So that's been really good.

Back onto the magic software side of things: my big win for this last year has been VR Shell. It's something we haven't really talked about explicitly a lot — we just kind of rolled out, hey, there are changes in all of our first-party apps — but a bunch of the people that hung around with me after hours at last year's OC3 got to see a preliminary version of it. For years I had been so frustrated by how many of the applications just looked terrible, and I had this litany about how nobody — not even Oculus first-party — was doing a good job presenting text in VR. It pained me deeply how Home was an aliased mess of unreadable text when you started up. I couldn't just point at our own applications and say, do it like we do. Everybody would get baselined by it: they'd put it on, look at Home — the environment's aliasing, the text is hard to read, all these problems — and I think that hurt people's expectations, and then people wouldn't even strive for better, because they figured: well, Oculus is probably doing everything right; if I look like that, I'm probably doing everything right. But Oculus was not doing everything right. I had laid out the plan for years. I would say: you always have MSAA on; you always want blended edges on text — never a hard discard; you want to use sRGB; you want to use a custom time warp layer and have the resolutions carefully
matched. If people had followed the app critiques that I would write up — I said this for years, over and over: here are the steps to make VR look good — but very few people actually followed them. So last year I was showing people this little demo I had, where I took the top applications in the store — like, 99 random applications — and I had a perfectly pre-filtered environment map, and I used a cylindrical time warp layer, which was the one new bit of magic that happened about that time last year. Previously we had environment maps, and we had flat planes for movie screens, but the addition of cylindrical layers was a surprising goodness: the lenses have a distortion where you have less detail at the sides, and a layer curving around you not only feels nice for VR, it almost magically corrects for the distortion of the lenses and makes for an even pixel density. So I was using that new bit of technology, plus doing all the things I had already been preaching, and it looked magnificent — so much better than what we had. Internally, we've always had this long battle about doing things in Unity. Unity is a wonderful place for doing game engines, but I've always said Unity is not the most appropriate tool for building a media-browsing system, whether it's for videos or applications or whatever. What I pitched was a new environment I was calling VR Shell: it would handle two-dimensional user interfaces and minimal 3D environment things, and it would be focused on the user-interface side. A point I would make is that two-dimensional user interfaces are unreasonably effective. Everybody thinks, well, we have to be pulling things in and out, spinning things away, doing crazy tilty 3D things — and almost all of those turn out to actually be negative for the user experience of being able to find what you're looking for and be sold on the value of what you're looking at.
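Going back to the cylindrical layer for a moment, the "even pixel density" claim can be checked with a little math. This sketch (my own illustration with arbitrary units, not Oculus code) compares texel density per degree of view for a flat quad versus a cylinder wrapped around the viewer:

```python
import math

def texels_per_degree_flat(theta_deg: float, d: float = 1.0,
                           texels_per_unit: float = 100.0) -> float:
    """Texel density for a flat layer at distance d with uniformly
    spaced texels: x = d * tan(theta), so dx/dtheta = d / cos(theta)^2.
    Density grows toward the edges instead of staying constant."""
    t = math.radians(theta_deg)
    return texels_per_unit * d / math.cos(t) ** 2 * math.radians(1)

def texels_per_degree_cylinder(theta_deg: float, r: float = 1.0,
                               texels_per_unit: float = 100.0) -> float:
    """On a cylinder centered on the viewer, arc length per degree is
    constant (r * dtheta), so density is even across the layer."""
    return texels_per_unit * r * math.radians(1)

# The flat layer spends ~33% more texels per degree at 30 degrees
# off-axis than at the center; the cylinder is perfectly uniform:
print(texels_per_degree_flat(30) / texels_per_degree_flat(0))
print(texels_per_degree_cylinder(30) / texels_per_degree_cylinder(0))
```

This is only the geometric half of the story — the lens distortion correction he mentions is on top of this — but it shows why a curved layer distributes texture resolution more evenly across the field of view than a flat plane.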
So I was proposing we push it all to basic 2D interfaces and basic 3D environments, but have them all done right: we guarantee the pixels come out as good as we know how to make them, and we efficiently handle the loading of the environment assets. And then, because it's a system environment, we're able to switch between applications seamlessly — no fade-to-black and loading up another application. You want to go to Video, it's here; you want to go to Browser, it's here; you want to go to Facebook, it's here. And then we start delivering synergies: you can be in Facebook, click on a link, and it goes to Browser; you download a video, and it goes to Video — using our optimal presentation tools for all of this. There was a lot of institutional resistance to changing what we're doing and the way we're building our applications, but I'd say this really was my big win for the last year: getting our applications migrated over to this new platform. I think this is where a lot of our future value rests — not only does it look much better, but we are pulling in these synergies.

The one thing that hasn't made it in yet, which is our big focus for the rest of the year and toward the Go launch, is that Shell was always pitched to be a social substrate. We've seen enough experiments now — we have Spaces, we have all the other places where other companies have made social applications — and there's a trajectory all of these things follow: you build a social space, you've got a chat room, you have a couple of things you want to do there, but then everybody wants to make it expandable, so you add a plug-in architecture, you add some way to start doing other things. I contend that's backwards: instead of building a social place and then hacking applications on
top of it, I have proposed that we build an application infrastructure that has social built in. There are going to be trade-offs involved — it's not going to have every nifty little social feature that the dedicated social apps have — but the bet, the play that we're making here, is on having good-enough social that's always there, that can be a simple click away rather than an app launch and browsing and all the other multiple steps — again, that friction funnel that keeps people from doing things. This is the core behind what we're doing with Venues. There's still a directional bet we're making there: we're starting off with Venues being stranger-focused, for large events, because there's still the shallowness of the friend graph — it's hard to find two or three friends who, at the same time, want to go do something. But fundamentally, it's not Venues as a separate product. Internally, I've had this whole problem with how we look at it internally and position it externally: it's not much of a product itself; it's really just enabling the social core in VR Shell. Venues is going to be, essentially, a trivial video-playing app that works with this, but eventually all of that stuff should be available everywhere. People will say, well, what's the point of letting ten people join you while you're looking at your home screen? There are probably plenty of things that don't make sense, but even browsing through the Store — being able to have somebody sitting next to you, watching you page through the stuff in the Store and pointing out different things — I do think there's some value there.

Shell is a multitasking environment. When you pop up the keyboard for typing in a URL, or doing a search in Home, that's a separate — we call them panel applications. At the beginning, we started off making Home as actually three separate applications — we were going to have social and store and
recents, different things like that. It turned out we didn't do that, and the landing page is one big application, so it has the freedom to rearrange things in different ways — but you do have the ability to do that. We're rolling out a new interface that's going to sit below a lot of the applications to provide common things everybody wishes we had, like an easy way to see the time and the battery level. We're going to have a universal bed mode in Shell, where you can pull any of the UI up so you can lie down and browse the web, or Facebook, on the ceiling, while the environment around you still stays properly referenced, without going to a void theater. I think a lot of these things are going to be very powerful. There are still a few pieces left — I wince every time I see the old pieces of System Activities, like the quick return-to-Home or return-to-application mode, which is still not done in layers: blurry, low resolution. All of those things are eventually going to be part of the Home / VR Shell system that all applications will be able to take advantage of.

Now, the actual programming model for the shell applications. This marks the second time now that I've kind of failed to bring in a radical change in program development. The first one was the VR scripting stuff I was talking about two years ago, where I thought the demonstration — the live coding that I did here — was pretty powerful and compelling. But internally the general thought was: well, nobody really is going to want to use Lisp or Scheme — the Racket system that I used — so it should be JavaScript. We started hacking together a JavaScript version of that, which Michael Antonov did a lot of work on, but then it was: well, if it's JavaScript, then it should really just be WebVR, and we spun up the WebVR team and started working on all of that. So my hope of two years ago — being able to have this super simple, easy-to-distribute
simple VR application model — we've lost that. I have kind of a guarded relationship with WebVR, where in general it's never wise to bet against the web and open standards, and I think we should support it as well as is reasonable. We have some passionate WebVR advocates internally, and I'm always trying to steer them: no, let's build better stuff in VR Shell rather than spending a lot more time on WebVR. There are reasonable arguments — reasonable people can differ on the priorities there — but I want to make my play on: let's do the best thing the hardware is capable of on our systems, in VR Shell, rather than spending lots of time on WebVR stuff, which has to work on various lowest-common-denominator systems.

Then the second one: when I started on the VR Shell system — it's one core application, but it loads and communicates with each of these separate panel applications — the first thing I tried was to set up the simplest, fastest application loading and switching system I could. I was like, all right, I'll spin up these services and then I will literally dlopen and load shared-object files. I could set things up so it's like virtual reality as it would have been in the 1990s: you've just got your C++, POSIX, OpenGL, OpenSL, and you write these simple things with one simple little makefile, and it had these half-second turnarounds — applications could launch and show their first frame in 50 milliseconds. For a low-level developer like me, I loved this; I thought it was amazing. Instead of a whole directory full of stuff to build your project, hello world could look like one C++ file and one makefile, and some of the simple applications that I did had these magical response rates and all these high qualities.
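The dlopen/dlsym mechanism described here can be sketched from Python, whose ctypes module wraps the same runtime loader. This is only an illustration of the mechanism — the entry points a real panel app would export are not shown here, so we just load the C math library and resolve a symbol from it:

```python
import ctypes
import ctypes.util

# dlopen() the C math library; a panel app would instead be a shared
# object exporting agreed-upon init/frame entry points (that interface
# is hypothetical here -- this just demonstrates dynamic loading).
libname = ctypes.util.find_library("m") or "libm.so.6"
libm = ctypes.CDLL(libname)

# dlsym() the "cos" entry point and declare its C signature.
cos = libm.cos
cos.argtypes = [ctypes.c_double]
cos.restype = ctypes.c_double

print(cos(0.0))  # -> 1.0
```

The appeal he's describing is exactly this: no build system, no packaging step — the host process maps the object into memory, resolves a couple of symbols, and calls them, which is how you get first frames in tens of milliseconds.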
We need to load all of this stuff that our applications depend on; we need all these other libraries. It very quickly slid into panel applications now being full Android APKs, which makes me sad. They've got the whole Gradle build, it takes longer to start up, and people do take the opportunity to pile in lots of Java, JavaScript interpreters, and all these different things. But it's probably the right thing: it's more important for people that have large, extensive, powerful code bases to be able to easily retask them for this than to be able to build brand-new-from-scratch, super efficient diamond jewels. So where we are right now is that a panel app is an Android service that an APK provides. VR Shell launches it, hands it a surface to draw onto, which we can take care of positioning in 3D, and there's a command channel to do things like changing the environment behind you, loading up different models, playing videos, and all sorts of different things like that. And so I hope, and this is one of my other kind of uneasy relationships with WebVR, I think there's this great possibility of doing super trivial website annotations, where instead of adding a big thing, if you want to add a little bit of flavor in the browser, you should be able to drop one line into an HTML file which basically says: send this environment command to VR Shell. Everything else ignores it, but if you're browsing in VR Shell and you go to that page, it changes the environment around you. I think there's some great low-hanging fruit there. Again, the friction funnel: if all it takes to make your website a little bit more VR-spiffy is to put one line into your HTML file and point it at a 360 photo somewhere, I think that's the type of thing tens of thousands of web pages could reasonably do. If it's a matter of rethinking it for some virtual reality experience and writing a mini game engine
in WebVR, that's something that I don't think many websites will actually go out of their way to do. So I consider VR Shell so far a pretty big success for us. We have lots of work going on to develop it; the intention is for it to become this next computing platform, so that you can do everything you would do in a traditional computing environment in VR, with high quality and good responsiveness, and there's lots of active effort going on with that. So going forward, my window for this coming year, really the second half, is to make a similar step in the quality of our immersive video. The power of a really good 360 video, in stereo, 60 frames per second, high resolution, is really high, and we've really lost something even from the very earliest Gear VR Innovator Edition. One of the things that I was so happy about there was that we had, packed in, an SD card that had 16 gigs of media on it, including some high-quality 360 videos, and you looked at those and that was many people's first impression, and it's a wow when you see something really great. In contrast, now, if you start up Gear VR, go to Oculus Video, and watch some 360 videos, it's just not that great, for a lot of different reasons. And I think it's fixable. I think we can make what I hand-wavingly call about a factor of two improvement in the quality of what most people see when they just go through that process: I want to go watch a 360 video, browse through, find something, play it. It should just come out a lot better than what we're giving today. One of the things I also think we can improve in that experience, some of the stuff I've been experimenting with, is more user controls inside video playback. I always regret it in normal video players when you don't have things like frame advance and frame back, but in an immersive environment it's almost a godlike power when you've got all the 360 stuff going on: you can freeze, forward,
forward, forward, jog backwards, smoothly go forward. That's a really powerful feel which plays to some of the strengths of virtual reality. So I've spent a lot of time in the last six months leveling up my video skills in a lot of different ways, reading all sorts of ISO specs and writing implementations and different things. Now, I can't magically make a crummy 360 video look great, although there are some interesting research opportunities going on with using machine learning to do things like super-resolution, and frame rate interpolation is a reasonable thing to do, so there potentially is some room for magically making bad videos look better. But what I'm concentrating on is reducing the delta between what's actually captured and what the user sees, because there are a lot of steps in between right now that do a whole lot of damage. So there are four things that I'm working on that make a big difference on video. One, the simplest, most obvious thing that hardly takes any effort, is just allowing users to take advantage of the bandwidth that they've got. Facebook does all of these surveys about how much bandwidth users have on average, and if you look across the main Facebook applications, you've got people across the developing world that are on tiny little straws of bandwidth, and one of the things done for videos is to have very low quality versions of things. One thing that's really a shame right now and needs to be addressed: there are all these different levels of video quality, going in some cases from 4K down to little 160 by 120 thumbnails, but there's a single audio track for anything that gets uploaded to Facebook, a single stereo audio track (the 360 spatial audio is much higher quality; that's got a lot of bitrate). But if you just upload a big video, even if it's a 4K video and you upload it at 100 megabits per second master quality, the audio is coming down at 50
kilobits per second, and most people would say that even with good compression you should have a hundred or maybe 150 depending on the codec that you're using. So audio takes a pretty big hit there. Again, changes in priorities: when all you've likely got is a little Gear VR speakerphone speaker, maybe that's not such a big deal, but now that we're focusing more on Go, where everybody's got reasonable quality stereo audio, we can care more about that. So look at the bandwidth rates: lots of people browsing the web or the Facebook application may have very low bandwidth, but the Gear VR users are way up there; they're people with premium smartphones that tend to have much better bandwidth. The highest bandwidth version that you get from a video that goes up to Facebook is usually around 10 or 12 megabits per second, and that's somewhere around our median bandwidth for users; like half of our users have significantly more bandwidth than that. So the easiest thing to do is say, well, let's just allow a 20 megabit per second or some other higher rate there, and let people that have the better bandwidth just magically get better pixels. That can be a big help. The other thing is that on the newer phones, since the Snapdragon Galaxy S7 and all of the S8s, the codecs have the ability to decode 4K video at 60 frames per second. The earlier phones were limited to 4K at 30 frames per second, or, as a quirky aspect way back on the original Note 4, it could do 3840 by 1920 at 30, but 4K by 2K only at 24 frames per second. So we should encourage more videos to take advantage of this greater headroom that we've got on the newer phones. Then, one of the things that's bothered me since the very beginning, the original Note 4 releases: the frame release tempo on the videos has not been perfect. I say 60 frames per second is magically so much better, but if you play a 60 frames per second video, it does not come out one-to-one, video frame to display frame.
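Before moving on to frame timing: the "let high-bandwidth users get better pixels" idea above is, mechanically, one comparison in any adaptive-bitrate scheme. A minimal sketch, where the encoding ladder and the 80% headroom factor are illustrative assumptions, not Facebook's actual values:

```python
# Illustrative encoding ladder in megabits per second. The talk says the real
# ladder topped out around 10-12 Mbps; the proposal is simply to add a higher
# rung (e.g. 20 Mbps) for users who can sustain it.
VIDEO_TIERS_MBPS = [0.3, 1, 2.5, 5, 10, 20]

def pick_tier(measured_bandwidth_mbps, headroom=0.8):
    """Pick the highest tier that fits within a fraction of measured bandwidth."""
    budget = measured_bandwidth_mbps * headroom
    usable = [t for t in VIDEO_TIERS_MBPS if t <= budget]
    return usable[-1] if usable else VIDEO_TIERS_MBPS[0]

print(pick_tier(30))   # premium-phone user on fast wifi -> 20
print(pick_tier(12))   # around the stated median -> 5
print(pick_tier(0.5))  # tiny straw of bandwidth -> 0.3
```

The point of the passage is that the top rung was missing, so half the users were capped well below what their connections could carry.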
This is an interesting little problem. On the original Note 4 there was a tiny tick: about one frame per second would get missed when we were using the Android media player. I had noticed that, and the Felix & Paul people pointed it out too: hey, what's the deal with this little tick? I was heartened to find that anybody else had actually even noticed. That wasn't a huge deal, but I always thought: what needs to happen is, instead of locking to the audio time, you need to lock to the video time and then slightly resample the audio rate. The videos are mastered at either 60 or 59.997 frames per second, but the cell phone displays have more variability than you'd think; there's one or two percent variability in the display frame rate, which means that you can wind up dropping a frame every second or two just to compensate for that. But if you instead match those one to one and resample the audio, then you can have it perfect: every frame is a new one. If you don't resample the audio, then you find after five minutes that you've got this very long delay between what's going on in the video and the audio. So I had wanted to do that for four years; it had been on my hit list, and I'd been trying to nudge other people into looking at doing it, but this year I finally sat down and said, all right, I'm going to address a lot of these video problems. Then the last thing on the pre-processing side is avoiding resampling in as many places as possible. This was one of my keys to the super high quality text in VR Shell: conventional VR rendering is a matter of rendering your 3D world to a projection, and then we distort the projection onto the screen to account for the lens distortion. That involves an inevitable loss of quality, even if you put the eye buffer up to an insanely high (for mobile) resolution, even if you ran it at something like 1536 by 1536.
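The drift he describes is easy to quantify: if video frames are released one-to-one with a display clock that is off by a couple of percent while the audio keeps playing at its mastered rate, the A/V offset grows linearly. A sketch of the arithmetic; the 1.5% display error is an assumed example inside the "one or two percent" range he quotes:

```python
def av_drift_seconds(mastered_fps, display_hz, minutes):
    """Seconds of A/V desync after `minutes`, if video locks to the display
    clock while audio plays at the mastered rate, with no resampling."""
    video_time_scale = display_hz / mastered_fps  # how fast video actually plays
    wall_seconds = minutes * 60
    return abs(wall_seconds * (video_time_scale - 1.0))

# Display running 1.5% fast versus a 59.997 fps master:
drift = av_drift_seconds(59.997, 59.997 * 1.015, 5)
print(round(drift, 2))  # -> 4.5 seconds of desync after five minutes

def audio_resample_ratio(mastered_fps, display_hz):
    """Factor by which to resample the audio so it stays locked to the video."""
    return display_hz / mastered_fps

print(round(audio_resample_ratio(59.997, 60.0), 5))
```

A few seconds of lip-sync error is glaring, which is why the fix is to lock to video time and stretch the audio by a fraction of a percent nobody can hear.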
Even then, in the center you would wind up having a double resampling of the images, and of course on the outside you'd have aliasing. Shell is great because you draw your text to a flat image directly, instead of to a random floating place in the 3D world, and then that gets resampled one time as it goes to the screen. Now, videos get resampled a lot of times in the process of becoming the final pixels on your screen. It starts when you take a video on a camera: it's going to be saving it out in some compressed form. Whatever comes off the camera, hopefully it's at a very generous bitrate, but that gets pulled into a program to do some stitching, and then eventually whatever post-processing effects people want to put on it. So it gets decompressed there, gets resampled into an equirect, and then saved out and delivered to Facebook or wherever it's going to be uploaded. Often there are multiple steps in between; often it will be mastered at a high resolution, then resampled down to a lower resolution before going to Facebook, and once it went to Facebook we would resample it to a cube map of some kind. This is a legacy of some early decisions. Speaking of early decisions, one of the things from four years ago that I still get to kick myself fairly regularly for: when I defined the cube map format, the cube strip format that we use for 360 photos, it's actually inverted. We ran into this in Shell: all right, we take 360 shots inside Unity and they come out one way, but all of our thousands of 360 photos that are already up there on the web would all be rotated and backwards, and worse, for 3D videos the eyes would wind up swapped when you use the wrong one. So we have fix-up code dealing with this. But the reason we used cube maps initially was two things. One, everybody knows equirects waste a fair amount of
data on the top and bottom; you've got these massive pole spreads, and it's about 30% more pixels than if you treat it as a cube map. You still have unequal distribution (the corners are much higher resolution than straight in front of you), but it's still about 30 percent better than an equirect. The other aspect was being able to do direct time-warp sampling. Now, it turns out for videos that hasn't been that important a thing, because the direct sampling only really matters when your resolution is high enough that you're getting close to the resolution of your display screen, and we've always been a long way off in video resolution. But those were still the reasons that we did it. The resampling means that, as you've probably noticed in 360 videos, sometimes you'll look one way and you'll see a little stair-steppy artifact visible at the edges where things were resampled to this cube map. We should have been able to fix those by slightly expanding our filter to overlap a little bit, but the larger point is that every resampling does some damage in some way to the imagery. So turning an equirect into a cube map, even if in theory it covers the same number of pixels: you take your 4K equirect and you turn it into, say, a 1280 by 1280 per-face cube map that in theory even has more resolution for most of the areas, but it's likely still going to look worse than if you used the original. And it's especially worse when compression is involved, because every one of these steps winds up getting compressed. You H.264-compress this and you get macroblock artifacts, so you've lost high-frequency components and you've got kind of a blurry block; well, then you resample that, and that blurry block is now stretched across four blocks that get recompressed, so you get multiple generations of compression artifacts kind of baked into it.
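The "about 30%" figure can be sanity-checked with the standard back-of-envelope comparison: for a 2:1 equirect, take cube faces a quarter of the equirect width, so the equator keeps roughly the same sampling density. That rule of thumb gives about 25% fewer pixels, the same ballpark as the figure quoted here; the exact number depends on the transform used.

```python
def equirect_pixels(width):
    """Pixel count of a 2:1 equirectangular image: width x (width / 2)."""
    return width * (width // 2)

def cubemap_pixels(width):
    """Common rule of thumb: cube face edge = equirect width / 4, six faces."""
    face = width // 4
    return 6 * face * face

w = 4096
eq, cm = equirect_pixels(w), cubemap_pixels(w)
savings_pct = round((1 - cm / eq) * 100)
print(eq, cm, savings_pct)  # -> 8388608 6291456 25
```

The waste being measured is exactly the "massive pole spreads": an equirect spends a full row of pixels on each pole, a single point on the sphere.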
The ideal world would be if you took something from a camera and the direct raw images from the camera were then projected appropriately into virtual reality. It's possible to get what I've been chasing with video quality for years now: display-limited resolution, where we have some media, like our 360 photos, that are at a higher resolution in some areas of the screen than the physical 2560 by 1440 display on the devices. Some of the best 360 photos, made by photographers that really know what they're doing and have done everything right, and certainly the synthetic ones, like all the OTOY Render the Metaverse images, are still the finest looking things that you see through a headset. For years I've thought: if only we could make these move, if we could have this quality of pixels in our videos, that would be a really magical thing; it would be very different from what we've got now. And just a few weeks ago I started doing some tests. There are all the ways that we've chased this. View-dependent streaming was our big bet that we thought was going to allow us to get display-limited pixel quality in the middle: content creators could upload a 6K video, like a 6K by 3K video, and then we made 20 different versions of it depending on exactly where you were looking. But it really did turn out kind of disappointing, in that if you sat there and only looked straight forward, and you got to the right quality level and everything (there were a number of things that had to happen to turn out right), sometimes it looked really great; you had extremely high quality. But what almost everybody would notice about it is: well, that's interesting; I look over here, and now it looks like garbage over here for two seconds, until it winds up picking up that stream and streaming it in, and then it snaps into super high
quality. That was very much a mixed bag, and I think most people decided that it was a net negative. This is something we could A/B test, in terms of how many people watch longer videos if we turn that off and just give them the standard cube map, our best other projection, versus that. And the truth was, people would give up watching those because there would be more hitches in the display and the differences in the video quality. So that was not going to give us this magical quality level. But something came up just a few weeks ago: Matt Hooper had been working on the question of what we're going to do for our in-store demos for selling Oculus Gos. Do we just have an app launcher? Do we have some kind of intro-to-VR video? What do we want to do? Well, we really want to put our best foot forward there: get the highest quality videos and different things that we can, mix in some 3D environments. But we started talking again about my quest for this display-limited resolution, and we said, well, what if we just did an experiment about what this could look like? Everybody knows now that if you go from a 360 to a 180 video, you double your pixel quality, and that's a pretty big step. Well, if you drew it in even more, there would be some point at which you would be at the display-limited resolution. By kind of a happy coincidence, it turns out that with just regular GoPros, one of their resolutions, 1920 by 1440 at a 90 degree by 120 degree field of view, is almost spot-on to what Gear VR's actual resolution is. So Matt went and set up a couple of GoPros, did the type of thing that probably dozens of you in the audience have done: set up a stereo camera rig, worked out the syncing stuff, and started taking some footage with that. And I played it back with all my latest code.
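The "happy coincidence" can be checked with pixels-per-degree arithmetic. This treats angular density as uniform, which neither a fisheye lens nor a VR eyepiece really is, so it's only the back-of-envelope version, and the ~90 degree per-eye FOV for Gear VR is an assumed round number:

```python
def pixels_per_degree(pixels, degrees):
    """Average angular resolution over a field of view, assumed uniform."""
    return pixels / degrees

# GoPro capture mode mentioned above: 1920 x 1440 over a 120 x 90 degree window.
gopro_h = pixels_per_degree(1920, 120)  # -> 16.0 px/deg horizontally
gopro_v = pixels_per_degree(1440, 90)   # -> 16.0 px/deg vertically

# Gear VR: the 2560 x 1440 panel is split between two eyes, so roughly
# 1280 px per eye horizontally over an assumed ~90 degree field of view.
gearvr = pixels_per_degree(1280, 90)    # ~14.2 px/deg

print(gopro_h, gopro_v, round(gearvr, 1))
```

The capture is at or slightly above the headset's average angular resolution, which is the definition of display-limited: extra source pixels would be invisible through the optics.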
I call it the new suite: there's VideoDirect for the playback part of it, a direct implementation for the streaming, and MultiStreamer for doing multi-socket streaming, all these different things. So I plugged it in with the VideoDirect back end, and it looks pretty amazingly good. I experimented with a few different edge treatments: a 3D model around the edges, or a properly positioned 3D fade, or just showing the raw camera edges. And I've been showing a bunch of people around here; in fact, like one minute before they pushed me out onto the stage, I was showing one of the producers that did all of the staged production work here: this is what our video should look like, this is what's possible. And it gets an amazing reaction. By now probably 20 or 30 people have had an S7 passed around to see it; I'm here certainly the rest of the day, and more people will see it. I'm looking at this as very much like what I did last year, and if I can keep this going in future years, this will be great: last year, show how text and imagery should look and deliver VR Shell; here, show how video should look and deliver massive video improvements next year. It reinforces to me that there is magic there, that there's something very powerful about locked stereo, locked 60 frames per second, having everything work perfectly, without the glitches, without the artifacts, at that resolution. It's really a pretty big deal, and these were just GoPros, not professional lenses, not professional lighting. You could stretch out the fields of view even without doing any more tricks; that's not maxing out the codec resolution, so you could go to maybe 160 by 120 degrees or something and still have this exact same super quality. To go beyond that, we start looking at many more kinds of tricks and games we can do, more projections if we don't mind a resampling on the way. I have
some hardcore H.264 bitstream hacking stuff that I want to do for people with location-based things that don't mind loading up gigs of video, where I can rebuild synthetic bitstreams on the way, so there are possibilities there. But again, put all of these things together: avoid resampling, use the bandwidth people have available and take better advantage of it using multiple-socket streaming, and so on. And the way I'm doing all of this for this narrow path: we use ExoPlayer for almost all of our video work. It's great software, open source from Google; most of our video stuff is based around it. But it's a big Java application that doesn't specifically cater to the super high-res videos and things that we're doing. So for this narrow path I have custom code; my goal was to own it from kernel sockets to the MediaCodec layers and everything in between, not having anybody else's code, so for these specific cases I can wrap all of this stuff together and have a much, much better quality presentation. We are piecemeal putting some of this into the rest of the systems: the video team has replaced ExoPlayer's general back end with the VideoDirect back end, so a lot of things get some advantage from this, but they don't get the frame locking and resampling, and they're not using the MultiStreamer side of things, and lots of other stuff. So we're going to have tiers of different things, but the most important thing is that in the not-too-distant future, sideloaded MP4 videos, if it's an MP4 and it's a local file, playback will get a lot better. For 4K 60 and these very demanding things, instead of copying through an intermediate buffer, it's rendered directly. It's very power efficient as well as being very frame locked; you can play a 4K 60 video for something like three-plus hours sitting there on an S7-class system. So there are interesting possibilities there.
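The MultiStreamer idea, filling a fat pipe by fetching one file over several parallel connections, rests on splitting the file into byte ranges, one per socket. A minimal sketch of just the range math; MultiStreamer itself is custom native code, and nothing here is its real API:

```python
def byte_ranges(file_size, num_sockets):
    """Split [0, file_size) into contiguous (start, end_inclusive) pairs
    suitable for HTTP Range requests, one per connection; the last range
    absorbs any remainder."""
    base = file_size // num_sockets
    ranges = []
    start = 0
    for i in range(num_sockets):
        end = file_size - 1 if i == num_sockets - 1 else start + base - 1
        ranges.append((start, end))
        start = end + 1
    return ranges

print(byte_ranges(1000, 4))  # -> [(0, 249), (250, 499), (500, 749), (750, 999)]
print(byte_ranges(10, 3))    # -> [(0, 2), (3, 5), (6, 9)]
```

Each pair becomes a `Range: bytes=start-end` header on its own TCP connection; the client reassembles the segments in order, which sidesteps a single connection's congestion-window and per-flow throttling limits.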
As opposed to that, trying to play 4K 60 video right now has these terrible tempo and judder problems. If you watch the great Felix & Paul stuff, we've got Emmy-award-winning videos, and I look at that and it plays smooth, then goes stutter stutter stutter, smooth, stutter stutter stutter. You know how typographers say: if you hate someone, teach them to recognize bad kerning, how the letters aren't necessarily spaced correctly together? Well, I want to teach everyone to recognize bad video frame release tempo: you're watching something moving smoothly, and then it glitches a frame, and then it goes on. When we were running the main keynote, with all this animation on screen, I was critically watching it going: all right, they've got the frames right, they're not dropping frames. When I was at GDC earlier this year, with a similarly super expensive high-end video projection system, they had the same damn tempo release problem: animations going on smooth, stutter stutter stutter, smooth. Usually it's smooth and then a little glitch as you're going from 59 to 60, but there's a problem when you've got an asynchronous time basis where you fall into this smooth-then-stutter pattern, and the frame timing graphs look absolutely terrible from it. But there's good quality to be had from this. On the content creation side, there have been a lot of interesting learnings and surprises as we look at the data. As part of Facebook, we've got a great analytics team on some of this, and there are some interesting insights you get from really paying attention to what people are actually doing, things people never would have guessed four years ago. For example, roller coasters are unreasonably effective with people. It is shocking: not only do people watch the roller coaster videos, they go back and watch them again and again; they're some of the most replayed things we have.
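Bad frame release tempo is detectable from presentation timestamps alone: for a 60 fps stream on a 60 Hz display, every interval between displayed frames should be one vsync, and any interval near two vsyncs is a frame shown late. A sketch of such a detector; the timestamp lists are fabricated for illustration:

```python
def tempo_report(frame_display_times_ms, vsync_ms=1000 / 60):
    """Count frames released late (more than 1.5 vsyncs after the previous
    one), which a viewer perceives as the smooth/stutter pattern above."""
    stutters = 0
    for prev, cur in zip(frame_display_times_ms, frame_display_times_ms[1:]):
        if cur - prev > 1.5 * vsync_ms:
            stutters += 1
    return stutters

# Perfect tempo: every frame exactly one vsync apart.
smooth = [i * (1000 / 60) for i in range(10)]
# One dropped frame: the sixth frame (and everything after) shows up a vsync late.
stuttery = smooth[:5] + [t + 1000 / 60 for t in smooth[5:]]

print(tempo_report(smooth))    # -> 0
print(tempo_report(stuttery))  # -> 1
```

This is the "locked" property in a measurable form: a perfectly frame-locked player scores zero over the whole playback, while the smooth/stutter pattern shows up as a steady trickle of late releases.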
That violates all of the basic wisdom about what a good VR experience should be: focusing on comfort, not doing locomotion, not turning you through parabolic arcs and things like that. But the honest truth is people are doing them: they'll watch them, and then they probably go show their friends, and all these other things. Another one is the horror genre, which makes sense in hindsight: putting someone in an immersive virtual environment, cut off from the world, means that what makes horror tick as a genre can be much more effective there. Horror's never really been my genre, but I recognize its effectiveness in many cases. Most of Face Your Fears was fine for me, but the creepy bedroom scene on there, I just don't want to see the rest of this. It's impactful, and for what it's trying to do it has a powerful effect, and a lot of our 360 videos in the horror genre are similarly very highly ranked. Some of this comes down to brass tacks on development: the dramatic stuff is often very popular, and the Lego things are always very high in our rankings, Lego Batman and so on, Lego Star Wars now, but it's very expensive to produce that high quality 3D rendering. Things that are also often very popular but dirt cheap to produce are things like the animal videos. Nature videos somewhat less so, but putting actual animals in there and letting them come up right around the camera, it's basically the cat videos sort of thing, and that is also remarkably popular in VR. One of the things that Google announced that I thought was really important, and I'm kind of disappointed that I haven't seen more aggressive follow-up on, because I thought it was potentially one of the really great ideas here, was pushing more on the 180 videos, and allowing, if we
can convince people that do normal things, like on YouTube or wherever: instead of just having the one camera pointing at them, you put up a dual 180 stereo camera, but you automatically pull out the appropriate 1280 by 720 center part to send to everybody that's watching it normally. That becomes something where you serve all of the existing customers (please stop flashing me, I'll get to Q&A), but then you also get this wonderful immersive experience that really is different, especially at 4K 60. At that distance, when you're just three feet away from people, with good quality, it is a magically different experience, and I think that's an opportunity to get immense amounts of content. That seemed really brilliant as an idea, and I think they did a few trials with it, but I hope it rolls out really broadly, because it covers almost everything that people wind up doing, photography- or videography-wise; we have all these things that we know how to do by pointing the camera at them. Some of the things that wind up being surprising, talking to people that go through the work of 360 production: you think about all the challenges, like, obviously you can't have your set stuff back here, but you also can't have your director sitting behind the camera, which is something directors are used to being able to do, to direct the action in various ways. There are all these challenges with 360. There are certain things that 360 is better than anything else for, when you put yourself in some wonderful environment and you look around and appreciate the whole thing, but it really does seem likely that that is a subset of the things people want to experience. We also find that, unfortunately, people don't turn around nearly as much as we kind of wish they would. For years I preached the virtues of swivel-chair gaming for Gear VR: that the way to have the most fun in a gaming experience is to either stand or
sit in a swivel chair and be able to turn around 360 degrees, chasing the action wherever it's going. I think the best games are still like that, but the vast majority of people want to sit on their couch and have something happening in front of them. So I think there's an excellent opportunity for taking things that you're going to present normally in a rectilinear format, but then also producing, essentially for free, a high quality immersive version that you get with almost no extra work, because production costs matter. We start looking at these and say, all right, it's great that we won an Emmy for The People's House, but that was really expensive to do; everybody can't build content like that. And YouTube has certainly shown that a lot of surprisingly low-budget, inexpensive content winds up being extremely powerful for people and makes a big impact.
Info
Channel: Oculus
Views: 61,768
Rating: 4.9090114 out of 5
Keywords: oculus, oculus connect, oc4, virtual reality, VR, mobile VR, keynote, John Carmack
Id: vlYL16-NaOw
Length: 93min 0sec (5580 seconds)
Published: Thu Oct 12 2017