(lively music) - Is this another Vision Pro video? - Uh, yes. Yeah. Just, it's a good one though. I promise. - Yeah, mhm. I'm sure. - No. Okay. So I've been thinking a lot about what a second-generation
Vision Pro might look like. - But didn't this literally just come out? - No, I know, but okay, so obviously, there are lots of other VR headsets, and the whole space got way more interesting when this one joined the market, but if you compare this to those, this is missing stuff. - Right, but it's also the most expensive one. - Yeah, yeah, yeah. There are things that the Quest 3 does that if the Vision Pro
did it, it would feel like it's a revolutionary change to this thing. It's a good video, I swear. - All right. I'll allow it. - This doesn't work though. You can't. - Okay. Fine. - Anyway. (upbeat music)
(logo whooshes) So I don't think the
Vision Pro is gonna be an every single year product
update, like the iPhone. I think it's gonna be one of those that's like every couple of years. You stack up enough tech
updates every once in a while and it's worthy of a new generation and then it's a bigger
deal when it comes out, and by the way, that's how a lot of other
VR headsets have been. The Quest 3 came out like
three years after the Quest 2, and the PS VR 2 came out like
six years after the first one. So, granted, these are still very early thoughts, but it's been out for a little bit and we're really starting
to get a sense of where people are using it and how people are getting
the most value out of it, and let me tell you, a lot
of people are definitely still not convinced on the whole VR or
Vision Pro thing at all. Like, it's an interesting
product for sure, but it just doesn't do much, and I started having
those thoughts immediately as soon as I started testing this thing, just all sorts of missing stuff
that I would like it to do. So this is all those things in one place. So probably the biggest
thing that you've noticed if you're familiar
with the VR headset world or used any other VR headset before is the Vision Pro is
missing shared experiences. Like, you are always so
alone in this headset. Meaning, there are lots of great, super immersive experiences, and you can be productive
and you can watch a movie, but no matter what's
happening, you're always alone. Even if you have pass through on, nobody else can see what you see, and so it's basically impossible to share what you're experiencing without, you know, mirroring
everything on an iPad or something like that. Now, FaceTime is close. It's one thing, but I'm talking about sharing one virtual space
with more than one person or seeing the same virtual environment at the same time with multiple people or seeing the same virtual object from two different
perspectives at the same time. You ever play "Rec Room"? It's one of the oldest but most
fun games on the Meta Quest. It's just so simple. You're just out there having fun. It's a bunch of people running around a room or an environment, hopping between different game areas where they can jump into an experience and play a game together. So, in that digital space, you look around and see other people, and I use "people" with air
quotes, but you see them, they see you, and you can
see the same environment from different angles at the same time and explore it together. It doesn't even have to be super realistic or high resolution or convince you that it's another reality, but it's just way more social and interactive and fun that
way, but on Apple Vision Pro, aside from FaceTime really,
all of the other experiences, it's just you in there solo. Like, there's nobody else in there. So, no matter what you're doing, no matter how immersive the movie is or no matter how good
the game is around you, you're in there by yourself and nobody else can
see what you're seeing. Like, just, here's a basic example. How sick would it be if you and the person next to you on a plane could sync up your experiences
and both watch the same movie in a virtual theater at the same time? Simple as that, or maybe if you were manipulating
a 3D object in space, wouldn't it make a ton of sense for someone else to be able
to share that object with you so that they can see how
you're manipulating it? Like, it feels like it
just makes too much sense. I told you guys about that Sky Guide app, where, you know, you look up and there's all the
stars and the constellations and it shows you where
they're all supposed to be. It is pretty awesome. Funny enough, there's
a laser pointer feature that lets you point at the sky and circle around and point
at things and draw things, but, still, nobody can see what you're pointing at with the laser, even if they have a Vision Pro. So there are two basic
types of shared experiences that it would just be great
for Apple to add to this thing. The first one is two different
people with Vision Pros in the same room with pass-through on, and one of them drops a
virtual object into that room and the other can see it at the same time. Manipulate the object.
They can see it as well. Great. The other is two different
people with Vision Pros in different places, halfway across the world, doesn't matter. They both turn on or join
the same virtual environment and can see it at the same time. As of right now, I believe
the first one is harder, because the way these headsets handle pass-through is, basically, they're mapping the volume of space around you in real time with the sensors on the front: the surfaces, the walls, the floor, the objects around you, everything. That's how visionOS is able to keep your apps locked floating in 3D space and how it casts shadows onto different surfaces in your room with decent accuracy. But there's no guarantee that, if you're in a room with someone else who also has a headset on, both your headsets are mapping the room in the exact same way. Like, maybe I place an object into space, but your headset doesn't see space there, so it's confused, or it doesn't map my hands or what I'm manipulating in the exact same way. It's just not guaranteed to be the same.
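For what it's worth, Apple already has a rough version of this on iPhone and iPad: ARKit's collaborative sessions let two devices trade mapping data until they agree on one shared coordinate space, and then an anchor one person drops shows up for the other. Here's a minimal sketch of that existing iOS API, just to make the problem concrete; the networking layer and the sendToPeers() helper are assumptions, and, as far as I know, visionOS doesn't give apps an equivalent of this today.

```swift
import ARKit

// A minimal sketch of ARKit's collaborative sessions on iPhone/iPad,
// which is roughly the "both devices agree on the same room" problem
// described above. How the data travels between devices is up to you
// (MultipeerConnectivity, your own server, etc.); sendToPeers() is a
// hypothetical stand-in for that networking layer.
class SharedRoomSession: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        config.isCollaborationEnabled = true // share mapping data with peers
        session.delegate = self
        session.run(config)
    }

    // ARKit periodically emits chunks of mapping data that peers need
    // in order to align their coordinate space with ours.
    func session(_ session: ARSession,
                 didOutputCollaborationData data: ARSession.CollaborationData) {
        guard let encoded = try? NSKeyedArchiver.archivedData(withRootObject: data,
                                                              requiringSecureCoding: true) else { return }
        // sendToPeers(encoded) // hypothetical: ship this to the other device over your own transport
        _ = encoded
    }

    // Feed a peer's mapping data back into our session so both devices
    // converge on one shared map and can see each other's anchors.
    func receive(_ encoded: Data) {
        if let data = try? NSKeyedUnarchiver.unarchivedObject(
            ofClass: ARSession.CollaborationData.self, from: encoded) {
            session.update(with: data)
        }
    }

    // An anchor dropped here shows up, in the shared coordinate space,
    // for every peer that has merged our collaboration data.
    func placeSharedObject(at transform: simd_float4x4) {
        session.add(anchor: ARAnchor(name: "shared-object", transform: transform))
    }
}
```

The hard part isn't dropping the anchor; it's getting both devices to agree on the map, which is exactly what that back-and-forth exchange of collaboration data is doing.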
But the other kind, just sharing a virtual
environment someone builds with anyone anywhere in the
world seems like a no-brainer. Like, being able to watch a
movie in a virtual movie theater with someone else who also gets a seat in that movie theater, or playing a multiplayer game in the same environment. Like, this stuff is pretty basic with VR. I love multiplayer table tennis on the Quest. It's one of my favorite
shared VR experiences. You can play against people
anywhere else in the world. There's all kinds of
other multiplayer games like this in VR. That is a huge part of
the VR gaming experience, and I just wonder why this
doesn't have that yet. It seems like a huge, huge thing, especially since, you know, Apple has environments that they've built in, and they're massive and high resolution, but you can only move a little bit before you're out of the
play zone, if you will. It's fascinating. I wonder if that's something to do with the way Apple thinks
about these environments. Nevertheless, what I know is that's the number one missing
feature on the Vision Pro that I'd love to see in a second gen. Now, the other big one
that stands out to me is, if I wanna use Vision
Pro a lot in, let's say, multiple, just two different
locations, home and work, I wish that it had a memory
of all the windows and apps that I leave open in each place. See, Vision Pro is actually already amazing at this in one space. It's seriously incredible. Windows stay locked to
where you leave them as it live maps its way around your space. So you can pin something to a wall and then walk around and pin
something to another wall or just in the middle of the room. Leave the room, come
back, they're still there. I even tried, ready for this? I pinned a window right here between these two cars in the parking lot. Then I turned and walked away, just left it behind me in the
parking lot, walked inside, walked all the way down this long hallway, totally out of sight, around
another corner into the studio, and sure enough, that window is still right
there where I left it between those spaces in the parking lot. The only thing holding it
back from being nearly perfect is occlusion. Basically, the only thing that ever gets between the window and you is your own arms and hands,
which is usually totally fine, but if you leave a window around a corner, it doesn't put the wall in
between you and the window, which is where it should be. So, sometimes it gets a little wonky, but overall 9 out of
10, already really cool, but the second it gets dinged is when you wanna do this
in more than one space. So let's say, at home, I've set
up all these sweet monitors. I've got a virtual TV
in one room and a game and some windows all over
the walls or whatever. Cool. I pack it up, I drive here to the studio, and I put it on, and the second
I start opening apps here, they have to disappear from home. They basically disappear
from any other space and open in your new space. So there is no memory of
different older spaces, so when I go back home, my
windows are all gonna be gone, and I'll have to set them
each up individually again. Not a huge deal, but if it had a memory, wouldn't that just make sense? I wonder if you could set
up like little beacons. Like, all it really probably needs is like a QR code or some visual identifier, but basically, you get home,
you put the headset on, it sees the beacon, and then
it goes, "Oh, I'm at home," and then it puts your windows all back up where it already knows you usually have them, and so you basically just
kick back on the couch, put the headset on, and
you don't have to re-set up each window in all the
same places you want it every single time. It just remembers that. That would be sick.
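And just to sketch how small the "which room am I in?" piece could be: Apple's Vision framework (the computer vision library, not the headset) can already pull a QR code's payload out of a camera frame, and that string could key into a saved layout for that place. To be clear, WindowLayout, savedLayouts, and the "home" and "studio" beacons below are all made up for this sketch; visionOS doesn't expose per-location window memory like this today.

```swift
import Vision
import CoreGraphics

// A sketch of the beacon idea: spot a QR code in a camera frame with the
// Vision framework and use its payload to decide which saved window layout
// to bring back. WindowLayout and savedLayouts are hypothetical; visionOS
// doesn't offer per-location layout memory like this today.
struct WindowLayout {
    let name: String
    // positions and sizes of each pinned window would live here
}

// Hypothetical store of layouts keyed by beacon payload.
let savedLayouts: [String: WindowLayout] = [
    "home": WindowLayout(name: "Home setup"),
    "studio": WindowLayout(name: "Studio setup"),
]

func detectBeacon(in frame: CGImage, completion: @escaping (WindowLayout?) -> Void) {
    let request = VNDetectBarcodesRequest { request, _ in
        // Grab the first QR code we can decode and look up a layout for it.
        let payload = (request.results as? [VNBarcodeObservation])?
            .first { $0.symbology == .qr }?
            .payloadStringValue
        completion(payload.flatMap { savedLayouts[$0] })
    }
    let handler = VNImageRequestHandler(cgImage: frame, options: [:])
    try? handler.perform([request])
}
```

In practice the beacon probably wouldn't even need to be a printed code; any stable visual identifier the headset can re-recognize would do, which is exactly the kind of thing the room mapping is already good at.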
Those two features alone I think would make a dramatic update to how often I realistically
would use Vision Pro. Just, I don't know if that requires some visionOS software update or if you need more compute to do things like more memory of locations or shared computing spaces, but those are the giant things
it that would be awesome. Then the rest of the
stuff, on my list anyway, is maybe a little more icing on the cake. Like, here's another one. I think probably 99.99% of people who buy a Vision Pro have an iPhone. I think that's pretty safe. It might be a hundred, but I think that's a
reasonable assumption, and I found it interesting. I think a lot of people assume that it'll just connect to the iPhone and they think that they'll be able to see their phone notifications straight away on the
Vision Pro, but that's not how it works. It's a separate device
like a Mac or an iPad. So, yeah, it'll show you iMessage stuff 'cause that's everywhere,
but it is a separate device. But, you know, the obvious difference is, unlike a Mac or an iPad,
when you put on a headset, well, now, you're wearing a headset, so if I were to get a
phone call on my iPhone, I couldn't necessarily even see that I'm getting a phone call. There is no notification
for it in the Vision Pro, and I'm gonna have to take the headset off to even see it coming and
accept that phone call. So it would be nice to have an option where these two things talk
to each other a little better. Maybe just a little hub for my smartphone's notifications over to the side if I wanna check them. I think it makes sense. They
probably won't do that though. But here's a number for you. 3,386. That is the pixel density of
the Vision Pro's displays. It's kind of a ridiculous number, over 3,000 pixels per inch each. So there have been some teardowns. I'll link some below the like button. You can see them up close. They're incredibly sharp.
They're incredibly sharp. So, that and the high refresh
rate and minimal distortion all contribute to the pass-through and just everything feeling so real, but the number I'd actually
like to improve is 92. So the Vision Pro's displays show 92% of the DCI-P3 color gamut. Pretty good coverage. Like, for a display, that's honestly pretty good. A reference-grade display, like the Pro Display XDR, for example, will show you 99% of DCI-P3. So, again, it's, for a
display, a pretty great number, but the thing about a VR headset is it's replacing your eyes. Like, the pass through is
pretty good for what it is, but without getting too complicated, 100% coverage of DCI-P3
is about 50% coverage of all of the colors that
the human eye can see. So this headset looks great and it's very sharp, but it's only showing me a little less than half of the colors of reality (92% of roughly 50% works out to around 46%). So I wonder how much
they can improve that, because the human eye is obviously insane. It has crazy dynamic range
and foveated rendering and great sharpness
and all that fun stuff, but, you know, Apple's been really good at a lot of displays for a long time and these are some incredible displays. So, field of view, wider
please, but also, yeah, just generally more color
would be interesting. There's lots of other little
things that are obvious, like weight reduction please, of course. Higher quality screen recording is a niche little request of mine, but I think would help a lot for people trying to make
videos with these things. Also, even more specific,
keyboard pass-through while in an environment. So you're in an
environment, you're typing, you've got your virtual Mac display, and if it can see and recognize
my hands after a scan, I think it would be nice
if you could also build in a feature to scan your keyboard or recognize the keyboard of your MacBook, just 'cause it's gonna
look the same every time, kind of like my hands. All that being said, I'm
sure there are more things that people, you know, are
thinking about on their lists, but let me know what you think about Apple Vision Pro gen two. Maybe leave your wishlist
in the comments below. Thanks for watching. Catch you guys on the next one. Peace. (lively music)