The billion dollar race for better VR

Captions
The displays and the optics inside modern virtual reality headsets are some of the most complex components humanity has ever built, and yet both Apple and Meta agree that they're still nowhere near good enough. In 2010, Apple introduced a brilliant new marketing term, something we call the Retina display. Instead of giving consumers a count of how many pixels their screen had and letting them do the mental math of figuring out what that actually means, Retina is simply a stamp of approval that says the display is sharp enough that you won't see individual pixels on it at all. Today, every single product on Apple's website, from the $329 iPad all the way to the $5,000 monitors, carries the Retina label. All except the Vision Pro. In fact, despite featuring 23 million pixels, more than six times as many as even a modern iPhone, the Vision Pro is still only about halfway to having actual Retina resolution, and other VR headsets are of course even further behind. Similarly, despite Apple's display panels already outputting a mind-boggling 5,000 nits of brightness, fewer than 100 of those nits, or about 2%, ever actually reach the user. This means that the actual experience is a little bit dim if anything, and the displays can't show proper HDR either.

Now, despite all of this, I'm actually a believer in this technology. Not because the headsets today are perfect, but because I've seen so much progress being made already, and because I've seen many, many exciting things in the pipeline for the future as well. So in this video, let's talk about the fascinating evolution of VR displays and optics. This video was sponsored by Insta360; more about them at the end.

This is perhaps the simplest way to make a VR headset: we have a pair of lenses, and then my phone pretends to be two different displays, so we can have one image per eye. We need to show two different scenes to fool our brain and eyes into thinking that we're actually looking at a 3D scene. Our eyes help us see
everything from two slightly different perspectives, and you can test this yourself: hold a finger up close to you, and if you close one eye and then the other, it will jump around quite a bit. If you move it further away and do the same, that jumping becomes much, much smaller. Our brains use this information to automatically guess the distance of basically all the objects we see and to build a sort of 3D map of the world. So on our headset, having two different displays allows us to show virtual scenes from two different perspectives too, which achieves the illusion of 3D.

Now, the displays are actually way closer than anything our eyes could focus on naturally, so to help, we also add lenses, which simply move the whole focus distance as if the screen were about 2 m, or 6 ft, away from us. Once you can focus and also perceive things in 3D, you have a very basic VR experience.

Of course, modern headsets have become way more complicated than that by now, and to explain what I mean, let's take a look at their lenses. This is a basic lens, and it bends light by making light waves pass from one material to another at an angle. But this design makes for a pretty bulky lens, so how could we fix that? Well, the actual bending happens right at the surface of the lens, which means that the majority of the glass is basically dead weight. To get rid of that, we've developed so-called Fresnel lenses: we cut a lens into segments, remove the excess material, and create a radically thinner and lighter lens. The light rays that pass through it still hit the edge of the glass at the same angle as before, so the focusing still works, but the lens is now much lighter and much thinner. Fresnel lenses are popular in lighthouses, where they allow us to have gigantic lenses without using literally tons and tons of glass, and they're also popular in VR headsets like the Meta Quest 2, where you can see all the little rings if you look closely.
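The finger experiment above can be put into rough numbers: the angle between the two eyes' lines of sight to an object shrinks as the object moves away, and that shrinking disparity is exactly the depth cue the brain uses. Here is a minimal sketch of that geometry; the ~63 mm interpupillary distance is a typical adult value I'm assuming, not a figure from the video:

```python
import math

def angular_disparity_deg(distance_m, ipd_m=0.063):
    """Angle subtended between the two eyes' lines of sight
    to a point straight ahead at the given distance."""
    return math.degrees(2 * math.atan((ipd_m / 2) / distance_m))

# A finger 25 cm away vs. 1 m away: the apparent "jump" between
# eyes shrinks roughly fourfold as the distance quadruples.
near = angular_disparity_deg(0.25)  # finger held close
far = angular_disparity_deg(1.0)    # finger at arm's length and beyond
```

The near/far ratio is close to 4, which matches the intuition that for distant objects the disparity falls off roughly in proportion to distance.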
The downside of this technique is that you can get a little bit of light scattering where the cuts are made, which results in the infamous god rays that you might have seen if you've used VR before. But Fresnel lenses do work, and they're still pretty popular.

Now, another traditional problem in VR is that the lens also needs to be a certain distance away from the screen so that the focusing can actually work, which leaves loads of empty space. Shrinking this space is what so-called pancake optics, or folded optics, are for, and you can see how effective this shrinking can be in this comparison photo by iFixit. Apple and Meta have both fully embraced pancake optics in their new headsets, and here's the basic idea of how they work. You add two lenses in front of the screen, one of which gets coated to be a semi-transparent mirror, while the other lets light of one polarization through but reflects the other. On top of these, there are also two layers that can change the polarization of light as it passes through. So: the screen emits the light for our image, and half of that passes through the 50/50 mirror lens. That light continues to the second lens, which initially reflects it back, because it is currently polarized the wrong way. But once the light makes a round trip through the system, its polarization gets changed a couple of times, until it can pass through on the second attempt and land in the user's eye. This whole back-and-forth creates an optical path that still has the same length as before, and you still have the curved surfaces that do the actual bending, but now the path is folded over itself, which saves a lot of space. This allows us to create much thinner headsets, it moves the weight closer to our face, and it is almost single-handedly responsible for the Quest 3 being 40% thinner overall than the Quest 2. So that's great, but pancake optics also have a major downside: they lose a massive amount of light along the way.
Just that first lens alone, which we've turned into a semi-transparent mirror, loses 50% of the light both times it's used. That is a 75% light loss in total, so LCD screens can get a maximum of 20 to maybe 25% of their light through it. But even worse, OLED screens actually emit unpolarized light by default, and of course these lenses need polarized light to even work in the first place, so you also need to add an extra polarizer up front that eats up another 65% or so. This means that for OLED screens, the final efficiency is something closer to 10%. Compare that to Fresnel lenses, which let over 90% of the light through, and you can now understand the trade-off: the PlayStation VR2, with Fresnel lenses, can output a very respectable 265 nits with a pretty basic screen, but in exchange it is way bulkier than the Quest 3 and the Vision Pro, despite those two also housing an entire computer right on your face. Meanwhile, the Quest and the Vision Pro might be thinner, but both also struggle to hit 100 nits, even despite having much, much brighter screens. It's a real trade-off, and yet for standalone headsets at least, I'm pretty convinced that pancake optics are going to be the way forward, and the reason for that is a revolution in new display technologies.

Historically, headsets have used pretty generic OLED and LCD screens, but lately a new type of display called micro-OLED has appeared as the next-generation technology. Micro-OLED screens have been used as camera viewfinders for many years, which is why Sony, with its large camera business, has historically dominated their production, but now they're clearly taking over the VR space too. They first popped up in an enthusiast headset called the Bigscreen Beyond, and then in the Apple Vision Pro. Sony has since confirmed that their next VR headset will have them too, Samsung acquired the leading US micro-OLED maker called eMagin for $218 million last year, and the Chinese display giant BOE says that they have some micro-OLED screens ready
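The light-loss arithmetic above is easy to check. A minimal sketch of the budget, using the approximate percentages from this section (real coatings and polarizers vary per headset, so these are illustrative figures, not measurements):

```python
def pancake_efficiency(oled=True):
    """Rough fraction of emitted light that survives pancake optics,
    using the approximate loss figures discussed above."""
    light = 1.0
    if oled:
        light *= 0.35   # OLED emits unpolarized light; the polarizer eats ~65%
    light *= 0.5        # first pass through the 50/50 semi-transparent mirror
    light *= 0.5        # second pass after the polarization round trip
    return light

lcd_eff = pancake_efficiency(oled=False)   # 25%: the best case for LCD
oled_eff = pancake_efficiency(oled=True)   # ~9%, i.e. "closer to 10%" for OLED
```

Two passes through a 50/50 mirror alone cap any panel at 25%, which is why the extra polarizer needed for OLED is so painful: it multiplies an already brutal budget.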
for production too. Basically everyone agrees that this is the next big thing for VR displays, and one quick look at their spec sheets easily tells us why. Micro-OLEDs can reach insanely high resolutions, more than double those of high-end LCD screens, even though the panels are much smaller. Their pixel response times can go below 1 millisecond, and at 5,000 nits, even current-gen displays get crazy bright, while Samsung and eMagin have shown that their next-generation screens will go over 15,000 nits. And that is on top of all the regular benefits of OLED: perfect blacks, really high contrast ratios, etc. This is clearly the next-gen tech, and here's how it works on a basic level. In traditional OLED screens, we place both the diodes, aka the things that actually light up, and their control electronics onto a piece of glass. In micro-OLEDs, we instead build the electronics right into a chip called a silicon backplane, and deposit the diodes right on top of that. And because we've gotten ridiculously good at miniaturizing chips, we can fit a mind-boggling number of transistors and pixels on these. Fun fact: Apple reportedly designed its own silicon backplane for the Vision Pro, had TSMC manufacture it, and then had Sony build its OLED tech on top of that. So in a way, Apple is now at least partially making its own displays for the first time ever, which is pretty wild.

Anyway, while these micro-OLEDs are absolute game changers, there are still three clear areas in which they are not quite perfect yet. First, they are of course extremely expensive for now, with estimates ranging between $456 and $700 for just the two displays of the Vision Pro. That is more than most VR headsets cost altogether, and we're just talking about the displays here. The second potential problem is that even at 5,000 nits, they might still not be bright enough. As I said in the intro, only about 2% of those 5,000 nits actually make it to your eye. We
now know that pancake optics already bring us down to 10%, but for the rest, you can mostly blame something called low persistence, and here's how that works. Our headsets fake motion by showing us many pictures in quick succession, same as basically any other video. The Vision Pro can show about 100 frames per second, which sounds like a lot, but the human eye is very sensitive to motion, so for the perfect illusion you'd need at least 500, but ideally 1,000 of those frames, and we're just nowhere near that. So, especially if you moved your head quickly, you would notice that each frame stays on for 5 to 10 times too long, which you would perceive as a weird sort of motion blur, which is one of the many, many things that make people sick in VR. And since we can't match even the bare minimum of 500 Hz, we have to employ a trick: we only show each image for 10 to maybe 20% of each frame at most, and then we switch to black for the rest. In the brief moment when the image is visible, it looks all right, and then your brain magically fills in the rest. The shorter you show the image, the better the clarity, but of course you're now showing black most of the time, so everything looks darker. Based on the fact that we can still see a little bit of smearing on the Vision Pro, I think Apple chose the kind-of minimum of 20% there. Combine that with the 10% efficiency of our pancake optics, and you have a total of only 2% of the light passing through. 2% of 5,000 nits is 100 nits, which is more or less what we see on the Vision Pro. This is a pretty brutal ratio, and if you think about it, it means that 5,000 nits is really the minimum here. If you wanted a display that you could see at 300 nits, for example, while also showing the frames only 10% of the time to reduce the smearing even further, you'd have to increase the brightness sixfold, to an insanely bright 30,000 nits. Now, micro-OLED screens at Sony have increased their brightness tenfold over the last few years, so maybe
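The whole brightness budget in this section boils down to one line of arithmetic: perceived brightness ≈ panel nits × optics efficiency × duty cycle. A quick sketch using the numbers from the video (the helper names are mine):

```python
def perceived_nits(panel_nits, optics_eff, duty_cycle):
    """Brightness reaching the eye after optics losses and
    low persistence (image shown only a fraction of each frame)."""
    return panel_nits * optics_eff * duty_cycle

def required_panel_nits(target_nits, optics_eff, duty_cycle):
    """Invert the budget: how bright must the panel itself be?"""
    return target_nits / (optics_eff * duty_cycle)

# Vision Pro estimate: 5,000-nit panel, ~10% pancake optics, ~20% persistence
current = perceived_nits(5000, 0.10, 0.20)

# Target: 300 nits at the eye with a sharper 10% duty cycle
future = required_panel_nits(300, 0.10, 0.10)
```

Plugging in the estimates gives roughly 100 nits for today's Vision Pro and a 30,000-nit panel requirement for the hypothetical 300-nit, 10%-persistence target, matching the sixfold increase mentioned above.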
that is possible in the relatively near future, but it's going to be a challenge for sure.

Okay, and problem number three is that even though the resolution of these screens is incredible, it is somehow not high enough either. Apple itself defines a Retina display as one that offers roughly 60 pixels per degree. This means that if a screen can show you at least 60 pixels across any one degree of your vision, you shouldn't be able to make them out as individual pixels anymore, and the screen will just look perfectly sharp. Now, 40 pixels per degree is roughly equivalent to watching a 1080p monitor at a normal distance, and that, I think, is kind of the floor for being really productive and comfortable. But meanwhile, in VR, these are the numbers that our most modern headsets have today. You'll notice that the Apple Vision Pro is way ahead of everyone else, and yet even it is pretty far from Retina, and not even in 1080p monitor territory. And that is despite Apple squeezing its pixels into a narrower field of view than many of its competitors. So we're still pretty far from actual Retina resolution, but if you try the headset, it does feel surprisingly sharp. How is this possible? According to the optics expert Karl Guttag, whose fantastic tests I've linked to down below, Apple actually tricks your eye in two different ways. First, they intentionally set the screen to be just very slightly out of focus using their lenses. You can see this happening in these fantastic side-by-side images with the Meta Quest 3, and this blur hides the pixelation of VR displays quite convincingly. The downside, of course, is that you end up losing so much detail that the Quest 3 then becomes technically sharper in many ways. But Apple then uses trick number two to make up for that: they artificially add font weight to make text appear bolder, they add thickness to lines that isn't there, they artificially increase the contrast in scenes, etc. In other words, they fake detail at basically the
operating system level. That is philosophically really weird, and of course, everything being just slightly out of focus is one of the things that actually contributes to people getting eye strain in VR, so there's that downside too. But I guess the illusion generally works, and people seem to think everything looks sharper than it actually does.

Now, I should say here that pixels per degree is an imperfect measure for VR devices, because the lens over the display complicates things a lot. For example, headsets can often stretch the center of the image over loads of pixels, which the lens then compresses to the right size, so the part of the image that you most often look at actually appears sharper than the display would normally allow. On the other hand, lens distortions also eat up a lot of clarity, so in the end, you might end up seeing less detail than the display itself would allow. So these PPD numbers are more of a rough measure. But just for fun, I did actually do a calculation: if you wanted a true Retina display at 60 pixels per degree, you would need three times as many pixels as the current Vision Pro has, and if you also wanted to stretch that out to a more generous 120° field of view, you would then need 51 million pixels per eye, which would be four and a half times what even the Vision Pro has today. And that would have to run off a battery, and on a chip that is literally on your face. So yeah, the perfect display is still quite some time away.

So that's where high-end VR is today, and this technology is going to continue to evolve, but there's of course also the chance that something completely different, some completely new technology, will replace it or kind of leapfrog it in the future. To illustrate what I mean, I want to show you one thing that Meta really seems to believe in. The company has long said that they're working really hard on holographic lenses, and this is a research prototype that the company showed off in 2021.
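The "just for fun" calculation above can be reproduced directly: pixels per degree times field of view gives the pixels needed along one axis, and squaring that (assuming a roughly square per-eye panel, which is my simplification) gives the per-eye total. A sketch:

```python
def pixels_per_eye(ppd, fov_deg):
    """Pixels needed per eye for a given sharpness (pixels per degree)
    across a given field of view, assuming a square panel."""
    side = ppd * fov_deg     # pixels along one axis
    return side * side

retina_120 = pixels_per_eye(60, 120)   # 7,200 x 7,200 per eye
vision_pro_eye = 23_000_000 / 2        # ~11.5 million per eye today
ratio = retina_120 / vision_pro_eye
```

The result is about 51.8 million pixels per eye, roughly 4.5 times the Vision Pro's per-eye count, which is where the "four and a half times" figure comes from.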
Now, this only has the display and the optics in it, not the whole computer, but notice how insanely thin it is. That's because they replaced all the glass lenses with a hologram. That's right: a holographic lens. If you want to understand how holograms work in general, I've left a link down in the description, but for our purposes, you can just think of a hologram as kind of like a 3D photo: you make a 3D snapshot of a real-life scene, and then you store it on a thin film. You can move the film around and see different parts of the 3D object captured on it. So if the thing that you take this 3D snapshot of is a lens, then the hologram you create with the snapshot will also act like a lens. This company shows off that even with their pretty cheap DIY kit, you can make holograms of scenes with lenses in them, and then you can see that the lens inside the hologram distorts light the same way the real one did in the original scene. So Meta places a hologram of a lens in the headset and gets something that looks this ridiculously compact and light: this is the display, and this is the holographic lens, and that's it. That is just mind-blowing to me. And Meta says that they could even bake prescriptions into the lenses, by taking 3D snapshots of many, many different lenses covering all the various combinations. So you could order a headset that arrives with the perfect vision correction built in already, without even having to add inserts or anything. This is a 3D render that Meta made to show what a fully functioning headset, made with real existing parts and even including eye tracking and an outer screen, could be shrunk down to using holographic lenses, and boy, does that look wild. Meta has been pretty realistic in the past about the technologies they've projected for their VR future, so I guess it's real, and based on their various statements, my estimate is that we're maybe three to four years from this being
in real headsets. So if you thought that the crazy VR train was stopping anytime soon, then I think the answer is that it's definitely not.

Okay, if you enjoyed the last 20 minutes or so of me talking about VR optics and displays, then I think it's pretty safe to say that you're the type of person who enjoys learning about the bleeding edge of consumer tech, and if that's the case, then I think you'll really enjoy this too. This is the Insta360 Ace Pro, an action camera that is about as high-tech as a camera can get today. I took it with me on a trip recently, where I put it through the torture test of scuba diving, down to over 30 m, among other things, and the results were really impressive. Filming a dive is a huge challenge, because the deeper you go, the darker and less colorful everything looks, which is why divers usually have big rigs with huge lights like this. But I just took this little guy in its dive case, put it in full auto mode, and it turned out some really impressive shots on its own. You just point and shoot, and you have great footage. Of course, the camera is an absolute pleasure to use outside of the water as well, even at night, where something called PureVideo mode can automatically optimize night scenes with the right amount of noise reduction, clarity, and more. So what magic is Insta360 using to make their image quality this good? Well, to start with, there's a massive 1/1.3-inch image sensor that is bigger than what you'll find in even most high-end phones, and way bigger than that of the GoPro, for example. Plus, it's paired with really high-quality optics co-engineered with Leica. The camera is also surprisingly smart, with a state-of-the-art 5 nm AI chip that powers algorithms to analyze the footage and apply the right denoising and color enhancements in real time. I have spent hours color grading underwater footage, and I usually can't get it to look as good as this camera can get it automatically. You have 60 m of water resistance in
the case, and 10 m even without the case, and that is despite this fantastic flip-up touchscreen. This makes filming yourself, or anyone else, at an odd angle so much easier, and together with the magnetic quick-release mount, it turns this action camera into easily the most convenient form factor out there. I also find it super convenient that you can stop and start recordings with simple hand gestures, and that you can cancel and pause recordings to save some space, which is really smart. And just for fun, you can even apply AI warp effects to your footage to make yourself look extra swanky in just a few taps. It's a truly unique experience, and the first 20 people who buy an Ace Pro with my link will get an 11% discount on the product and also a free screen protector. So check them out, I hope you have fun with it, and I'll see you in the next video.
Info
Channel: TechAltar
Views: 108,112
Keywords: VR, virtual reality, AR, Augmented reality, MR, mixed reality, XR, extended reality, spatial computing, Apple Vision Pro, Google Cardboard, cardboard, lens, lenses, fresnel, pancake, how lenses work, how displays work, how VR lenses work, how VR displays work, explained, explanation, microOLED, micro OLED, microLED, micro LED, oledOS, oled on silicon, oled microdisplay, Meta Quest 3, Meta Quest pro
Id: OnNDAhigVtc
Length: 19min 29sec (1169 seconds)
Published: Thu Apr 04 2024