I built an Apple Vision Pro app... visionOS tutorial

Video Statistics and Information

Captions
A few days ago, me and my good friend Mark Zuckerberg made a video roasting the Apple Vision Pro. A lot of people enjoyed it, but a lot of people got offended as well, because obviously the Apple Vision Pro is the greatest piece of technology ever invented. After that video I went out and bought a Meta Quest 3, did some soul searching, and have come to realize that I never should have trusted Mark Zuckerberg. In today's video I will atone for my sins against the Apple religion by building a visionOS app for the Apple Vision Pro from scratch, and believe it or not, you don't even need to own this $3,500 paperweight to start building apps for it. My experience building a basic 3D app was surprisingly smooth and easy, and there's a huge opportunity here for developers who want to make money on this new platform. What we're building today is a basic app that can fetch animated GIFs from the internet, and then we'll layer in some random 3D balls just to figure out how Apple's RealityKit works.

But before we get into this Apple Vision Pro code tutorial, let's first talk about the VR landscape in general. If you've never tried on a VR headset, you owe it to yourself to go to an Apple Store or Best Buy and try the Apple Vision Pro or Meta Quest, because they have a massive wow factor the first time you put one on. I actually have an original Oculus dev kit that's almost 10 years old at this point, and even back then it had that wow factor. I think a lot of normies are freaking out about the Apple Vision Pro because it's their first experience with VR and their minds are blown, but that honeymoon phase doesn't last very long. There are already a lot of reports of people returning their Apple Vision Pros, and I think VR headsets will always be a niche product unless they become significantly smaller and more convenient, but that'll likely take decades of advancement on the hardware side. Despite that, Apple has sold tons of these devices and is making VR more mainstream, which means Meta will likely sell more of its devices as well.

When it comes to development, getting in early on a new ecosystem is one of the best ways to make money. In the early days of iOS you could make $10,000 a day with a fart app, and with visionOS you have apps like YouTube lagging behind, so developers are selling YouTube apps for $5 a pop until Google builds the official one. The other thing to consider is that building a 3D app tends to be a lot more difficult than building a basic 2D website: the graphics, the gestures, and just the requirements to build a good user experience are a lot more challenging for an indie developer to pull off. The other question is whether you should build something for Apple's closed ecosystem or for Zuckerberg's quote-unquote open system. It's a tough call with a lot of trade-offs, and it likely comes down to the device you like more.

Here's a quick review of the Meta Quest 3 versus the Apple Vision Pro. The clarity of the graphics, as well as the passthrough, is way better on the Apple Vision Pro. However, it doesn't beat the clarity of your naked eye, and looking at the outside world through a screen is pretty stupid most of the time. You'll also notice motion blur, and when I take the thing off after using it for a while I get this weird feeling in my eyes; it just doesn't feel healthy to wear for long periods of time. Surprisingly, I found the Meta Quest 3 to be a bit more comfortable, but neither one of them is fun to wear. Once you're inside your VR world, you've got a bunch of apps to choose from,
although with Apple, things are tightly integrated into the Apple ecosystem, so if you use iCloud and things like that, you'll get a much better experience than if you're not an Apple user. When it comes to content, Meta has been working at this for a long time, so they have a ton of different games and apps to use; this game Superhot is the most fun I've ever had in VR, period. The Apple ecosystem, on the other hand, is just getting started. In general, Meta is much more focused on gaming, because that's what most people use VR for, whereas Apple is trying to start this new paradigm of VR as a productivity tool.

The final question is which one you should buy. Well, the Meta Quest 3 is a much better value, and at a $500 price point I feel like Meta has to be losing money on this thing, which would make sense, because I think they want to subsidize more people into their ecosystem, and Meta already makes an ungodly amount of money from its ad business. But with price set aside, the Apple Vision Pro is definitely a more impressive piece of hardware; Apple has always made the best hardware, and this is no exception.

Now, as a developer, if you want to build something for Meta Quest, you'll likely use a game engine like Unity or Unreal Engine. You can also use Unity for the Apple Vision Pro, but if you're not already familiar with Unity, you're likely better off going directly to Apple's SDK and building your app with SwiftUI, and that's what we're going to do. In order to do that, you'll need a modern Mac running Apple silicon along with the latest Xcode installed. From there we can go to File > New > Project and navigate to the visionOS App template. The most interesting option here is the immersive space, which can be full, mixed, or progressive. With full, you're totally immersed in a 3D world, like true VR; in mixed reality, you can see the outside world along with some 3D objects inside of it; and progressive allows you to transition between the two states. For this tutorial we'll go with mixed. That'll generate the template, then bring us to this hello-world ContentView.

Inside this app we have two different paradigms to work with. We have windows, which are these 2D things that look like pieces of paper flying in space, and that's where the main UI is going to go. But we also have volumes, which you can think of as 3D objects that take up some volume in the space. The end user is going to want to interact with both windows and volumes, but the way they do that in visionOS is very unique. Some interactions are direct, where the user puts their finger directly on an object and performs some type of gesture that you can then handle in your code; in this app we'll eventually have a button that the user can tap. What's weird, though, is that other interactions can be indirect, because visionOS has eye tracking: a person's eyes can indicate an element they want to interact with, and then they can pinch their fingers together instead of tapping it directly. Pretty cool, but now it's time to write some code.

We're using Apple's Swift programming language along with SwiftUI, which is their framework for UI design that actually feels pretty similar to Flutter. What you'll notice is that we have a ContentView, and inside that view we can nest different widgets like text and buttons to build out the UI in a hierarchy. First we have a VStack, which is like a vertical column of elements. Inside that stack we can add a button, and then we can chain methods to it to modify its appearance: adding padding, changing the color, or adjusting the shape to make it more rounded.
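As a rough idea of what that modifier chain looks like, here's a minimal sketch; the button label, color, and the gifURL state property are assumptions for illustration, not the video's exact code:

```swift
import SwiftUI

struct ContentView: View {
    // State for the GIF we'll fetch a bit later; the property name is an assumption
    @State var gifURL: URL?

    var body: some View {
        VStack {
            Button("Fetch GIF") {
                // Will call fetchGif() once we write it (see the networking sketch below)
            }
            .padding()
            .tint(.purple)                    // arbitrary color, just to show the modifier chain
            .buttonStyle(.borderedProminent)
            .clipShape(Capsule())             // rounds the button shape
        }
    }
}
```

Each modifier returns a new view, which is why you can keep chaining them like this.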
What you'll notice is that as we make these changes, they're automatically updated in the simulator. What's cool about Xcode is that it has a built-in visionOS simulator, so you don't even need to own the Apple Vision Pro to develop apps for it.

Now that we have a button, we need to make it do something useful, so inside of it I'm going to call a method that will fetch a GIF from the Giphy API. Ironically, Giphy is a company that's owned by Meta, so Mark might not be too happy about me using it to build an Apple Vision Pro app. To use it you'll need an API key, but once you have that, you can make a request to the API for a random GIF, so it's a really simple API to integrate into an app like this. Inside our fetchGif method we'll first create a URL data task that sends an HTTP request to the Giphy API. That happens asynchronously and responds with either data or an error. The thing that was actually surprisingly tricky here was deserializing the JSON back into Swift code. Swift has this JSON decode method, but in order to use it we need a struct that represents the shape of the JSON data from Giphy. To do that, I created a few structs here that conform to the Codable type, which ensures the data can be encoded and decoded as JSON. Now that we have that, we can use it in the JSON decode method, and finally, once we have the actual image URL, we can set it as a property on the struct itself, which then allows us to use it anywhere in the UI. To do that, we can go back into the VStack and add an AsyncImage component, but this time wrap it in a conditional that checks that the GIF URL is defined, because without that URL we can't show an image. Now if we click on that button, we should get a random image in the UI, and we have a minimum viable product for a million-dollar visionOS app. One problem, though, is that the GIF animations don't seem to play, but to be honest I don't really care enough to figure out why.

Now what I want to do is add some actual 3D objects to the scene. Apple has this SDK called RealityKit that tries to make this process as easy as possible. For example, the starter code comes with a 3D model, so let's go ahead and add Model3D to the UI. It's referring to a 3D model named Scene, which you can find in the Packages directory under RealityKitContent. As you can see here, it adds this little gray ball to the UI. Now let's go into the file explorer and click on the scene; that'll isolate the shape so we can view it here in 3D. However, there's another tool called Reality Composer Pro, and we can access it by clicking on this button up here on the top right. This tool is a 3D editor, kind of like Blender, that allows you to add different nodes to the scene and then customize them by modifying the mesh and adding materials. This is probably the best way to design a 3D scene, but we can also do it programmatically.

Let's go back into our UI code and create a new view called Balls. Inside of it we'll add a new RealityView that calls a method called generateSphere; in fact, we'll put this code inside a loop to generate five spheres. We can customize each sphere's radius, and then to give it some actual color we need to add a material to it, which in this case will be a simple material that's red and metallic. After that we'll generate some random numbers to use as the position for the sphere. Now we have these random balls floating around in space, but we can also make them interactive, and that's because RealityKit has things like colliders that can detect gestures from the user or other things in the environment.
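Before getting to the 3D code, here's a hedged sketch of the GIF-fetching steps described a moment ago, assuming the ContentView and gifURL state property from the earlier button sketch. The field names for Giphy's response and the endpoint details are assumptions to check against the Giphy docs, not code from the video:

```swift
import SwiftUI

// Rough shape of Giphy's random-GIF JSON; only the fields we need, names assumed
struct GiphyResponse: Codable {
    let data: GifData
    struct GifData: Codable {
        let images: Images
        struct Images: Codable {
            let original: Original
            struct Original: Codable { let url: String }
        }
    }
}

extension ContentView {
    func fetchGif() {
        let apiKey = "YOUR_GIPHY_API_KEY"   // placeholder; get a key from Giphy
        guard let url = URL(string:
            "https://api.giphy.com/v1/gifs/random?api_key=\(apiKey)") else { return }

        // Asynchronous data task that responds with either data or an error
        URLSession.shared.dataTask(with: url) { data, _, error in
            guard let data, error == nil,
                  let decoded = try? JSONDecoder().decode(GiphyResponse.self, from: data)
            else { return }
            DispatchQueue.main.async {
                // Setting the @State property lets the UI react to the new URL
                self.gifURL = URL(string: decoded.data.images.original.url)
            }
        }.resume()
    }
}
```

Back in the VStack, the AsyncImage sits inside an `if let gifURL { AsyncImage(url: gifURL) }` check so the image only renders once a URL exists.

And here's a similarly hedged sketch of the Balls view with the sphere generation; it also wires up the collision component and tap gesture that the next part covers. The radius, color, position ranges, and the InputTargetComponent (which visionOS requires on an entity before it can receive gestures) are my assumptions, not necessarily what the video uses:

```swift
import SwiftUI
import RealityKit

struct Balls: View {
    // Toggled by the tap gesture; drives the update closure below
    @State private var enlarged = false

    var body: some View {
        RealityView { content in
            // Generate five red metallic spheres at random positions
            for _ in 0..<5 {
                content.add(generateSphere())
            }
        } update: { content in
            // Runs when @State changes: loop over the spheres and rescale them
            for entity in content.entities {
                entity.scale = enlarged ? SIMD3<Float>(repeating: 1.5) : .one
            }
        }
        // Fires on a direct tap or an indirect look-and-pinch on any entity
        .gesture(
            TapGesture()
                .targetedToAnyEntity()
                .onEnded { _ in enlarged.toggle() }
        )
    }

    func generateSphere() -> ModelEntity {
        let radius: Float = 0.05
        let material = SimpleMaterial(color: .red, isMetallic: true)
        let sphere = ModelEntity(mesh: .generateSphere(radius: radius),
                                 materials: [material])
        // Random position near the center of the view
        sphere.position = SIMD3<Float>(Float.random(in: -0.25...0.25),
                                       Float.random(in: -0.25...0.25),
                                       Float.random(in: -0.25...0.25))
        // Collision shape with the same radius so the sphere can be hit-tested,
        // plus an input target so it can receive gestures
        sphere.components.set(CollisionComponent(shapes: [.generateSphere(radius: radius)]))
        sphere.components.set(InputTargetComponent())
        return sphere
    }
}
```

Toggling the @State flag in onEnded is what causes RealityView's update closure to run again.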
What we're doing here is adding a collision component to the sphere with the same radius, which then makes it possible to chain a gesture method to it to listen for a tap gesture. When a sphere is tapped, we'll toggle the scale property on the struct. That's going to trigger an update in the UI, where we can run some code: in this case we'll loop over all the spheres and update their scale to a different size. Now in the Apple Vision Pro we have this totally useless new feature, but even though it's useless, it gives us a pretty good idea of how things work on this platform.

The bottom line is that building apps for visionOS is probably a lot easier than you think. If you have some basic 3D design and coding skills, it's not all that hard to deliver that VR wow factor to end users. My goal over the next few months is to do a lot more coding tutorials, so let me know what you want to see in the comments, and if you want more content right now, become a pro member at fireship.io. Thanks for watching, and I will see you in the next one.
Info
Channel: Beyond Fireship
Views: 511,593
Id: _xfZIr5sDLw
Length: 9min 18sec (558 seconds)
Published: Sun Feb 18 2024