EVERYTHING NEW COMING TO UNREAL ENGINE IN 2024

Captions
Hello guys, welcome back to another video. Epic Games has just updated the official Unreal Engine roadmap with all of the new features that are coming to Unreal Engine in 2024, and it is an ambitious roadmap. I mean, there are loads of new features planned: things like a next-generation terrain solution; loads of PCG updates like micro scattering, manual editing, and attribute sets; a whole new Unreal Cloud DDC solution; runtime virtual assets; and the possibility of having Nanite and Lumen supported on high-end mobile devices. There are a lot more updates to the engine too, things like new animation 3D gizmos and some new editor UI layout changes. So yeah, let's go ahead and just jump right into the video. So here we are on the Unreal Engine public roadmap, and it is important to note that they have a written disclaimer. Basically, the things we see on this roadmap are stuff they want to implement, but they aren't promising that they will implement everything we see here. It's more or less the stuff they are going to be working on and adding over time. After the disclaimer, we have the rendering section, so these are all the different updates for rendering. First off, on Nanite, we have Nanite Dynamic Displacement. Dynamic programmable displacement allows Nanite meshes to be modified at runtime using a displacement map or a procedural material. So unlike world position offset, where you can only operate on the original mesh vertices, Nanite displacement can tessellate the mesh at runtime into additional triangles to conform to the detail of the displacement map. You can use this to create things like material-driven and animated displacement, and also create very detailed Nanite landscapes. A very interesting feature, because we're seeing Nanite used more and more at runtime, and also for dynamic objects. First, we had it for foliage, things like trees and rocks.
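To make the idea concrete, here's a tiny sketch of what displacement mapping does conceptually: push each vertex along its normal by a height sampled from a map. This is a toy 1D illustration in Python, not Nanite's actual tessellation code, and all the names here are made up.

```python
# Toy displacement: offset each vertex along its unit normal by a sampled
# height. Real Nanite displacement also tessellates new triangles at runtime;
# this only shows the per-vertex offset step.
def displace(vertices, normals, heightmap, scale=1.0):
    """Offset each (x, y) vertex along its unit normal by the sampled height."""
    out = []
    for (vx, vy), (nx, ny) in zip(vertices, normals):
        h = heightmap(vx) * scale       # sample the displacement at this vertex
        out.append((vx + nx * h, vy + ny * h))
    return out

# Flat strip of vertices with up-facing normals, displaced by a simple ramp.
verts = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
norms = [(0.0, 1.0)] * 3
bumpy = displace(verts, norms, heightmap=lambda x: 0.5 * x)
print(bumpy)  # [(0.0, 0.0), (1.0, 0.5), (2.0, 1.0)]
```

The difference from world position offset, as the roadmap describes it, is that WPO can only move these original vertices, while Nanite displacement can add triangles between them to capture the map's detail.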
I think we're getting closer to the point where Nanite is going to be supported on other mesh types, like skeletal meshes and stuff like that. The next update we have for Nanite is the Nanite spline mesh. This was actually added as an experimental feature in Unreal Engine 5.3, but the remaining work they are focusing on for Nanite spline meshes is things like performance updates and optimizations, and fixing things like level streaming. There are also a couple of different updates for things like virtual shadow maps, just more performance and control. Actually, a lot of the features and updates on the rendering side were mentioned at Unreal Fest 2023, which I attended this year. So if you want to see a good breakdown of all of these rendering features that are going to be coming in the next version of the engine, I believe 5.4, they live-streamed the Unreal Fest session, and it's called "Rendering Roadmap: More Data, More Speed, More Pixels, More Fidelity." It goes more in depth about all of these different rendering features that are actually going to be coming in Unreal Engine 5.4, which currently doesn't have a release date, but I can imagine it's going to be scheduled somewhere in 2024, either Q1 or Q2. Next up, we have large world coordinates on the GPU. Large world coordinates in UE5 introduced support for double-precision transforms and data types, allowing the maximum world size to increase from 21 kilometers to over 88 million kilometers. Now, large world coordinate rendering support on the GPU was based on a tiled representation, which has a number of limitations, including imprecision and jittering near tile boundaries. Actually, here's a little video that showcases some of the improvements with the jittering that are coming in Unreal Engine 5.4. Basically, large world coordinates rendering is undergoing a refactor to support a full double-precision representation on the GPU, which will allow large worlds to be rendered more reliably when the camera is far from the world origin. For Nanite, we're also getting an optimization for shading. Nanite compute-based shading is a long-term project focused on moving Nanite materials from traditional raster shading over to compute shaders, for a number of optimization and new feature opportunities, in addition to making it possible to clean up a lot of the complex code required for the raster approach. The ultimate goal is to fully replace the pixel shader path in its entirety, offering increased performance on both CPU and GPU, improving code maintainability, and making it possible to implement advanced Nanite material functionality that would not otherwise be possible. So this sort of sounds like a Nanite material system that will offer benefits such as performance increases for your CPU and GPU when it comes to material functionality. In rendering, we also have a temporal super resolution update. Basically, TSR is UE5's built-in cross-platform upscaling technology, so images render at a fraction of the cost of full resolution by amortizing some of the costly rendering calculations across many frames. The next version of Unreal Engine will include these changes to TSR. We have history resurrection, a new feature on the High, Epic, and Cinematic anti-aliasing scalability levels. Previously accumulated details can be discarded for different reasons, and TSR then has to re-accumulate those details, which can result in noise and trailing artifacts. History resurrection keeps old TSR history that has these previously seen details, to reproject samples from whenever it's a better match for reprojection than more recent frames. Again, this was another feature they were talking about in the rendering update talk over at Unreal Fest.
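The "amortizing across many frames" idea behind TSR can be sketched with a simple exponential moving average: each cheap, noisy frame is blended into a running history so detail accumulates over time. This shows only the general principle of temporal accumulation, not Epic's TSR implementation.

```python
# Temporal accumulation sketch: per-pixel exponential moving average.
# Each frame is a cheap, noisy estimate; the history converges toward the
# true image as frames arrive.
import random

def accumulate(history, new_frame, alpha=0.1):
    """Blend the new frame into the running history (per-pixel EMA)."""
    return [(1 - alpha) * h + alpha * n for h, n in zip(history, new_frame)]

random.seed(1)
true_pixels = [0.2, 0.8, 0.5]          # the "ground truth" image
history = [0.0, 0.0, 0.0]              # accumulated TSR-style history
for _ in range(200):
    # each frame is a cheap, noisy estimate of the true image
    noisy = [p + random.gauss(0, 0.05) for p in true_pixels]
    history = accumulate(history, noisy)
```

History resurrection, as described on the roadmap, would additionally keep an older history buffer around to fall back on when the current one has discarded detail it had previously accumulated.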
And it seems like this feature is guaranteed to be coming in Unreal Engine 5.4. Next up, we have Lumen hardware ray tracing. The hardware ray tracing mode in Lumen has had a number of quality and feature improvements over software ray tracing, but is not currently practical for 60 FPS gameplay on next-gen consoles. So performance updates for hardware ray tracing are underway, with a goal of achieving four milliseconds per frame in typical scenes, which would match the budget for Lumen software ray tracing running at 60 FPS. We also have updates to Substrate, the experimental material system which is currently available in Unreal Engine 5.3. Basically, they're going to be taking it from experimental into beta. The next steps towards achieving beta status include testing Substrate on large real-world projects, ensuring legacy materials achieve performance and memory parity on all platforms, parity of the final rendered image with legacy materials across a wide variety of use cases of the engine, and just user experience and workflow improvements. So it sounds like they just need to do additional testing to ensure that the performance and the materials are consistent across all the different platforms. Heterogeneous volumes are also receiving an update. Of course, this was a new feature also introduced in 5.3 that allows you to render different types of volumetrics, such as fire, smoke, or fluids, driven by Niagara Fluids or OpenVDB files. The remaining work for the deferred renderer includes support for shadows, translucency, and Lumen rendering. Path tracing support for this was mostly complete in 5.3, but it will also receive some minor updates in future versions of the engine. Of course, we also have orthographic rendering, another feature introduced in 5.3.
They're planning on adding some updates to this, mainly things like Nanite, Lumen, and virtual shadow map support, as well as making sure that orthographic rendering is compatible with stuff like TSR, water rendering, volumetrics, and much more. Another feature in the works is path tracer adaptive sampling. Adaptive sampling allows the path tracer to skip parts of the frame that are deemed low enough in error that subsequent samples don't need to be traced for the entire frame. The goal is faster rendering times with equivalent quality levels. We also have path tracer denoising. Denoising is a standard technique used within path tracing whereby an image processing pass is applied after completion of the render to reduce sampling noise. The current path tracer denoising options have limitations, such as removing too much detail in some cases and not being temporally stable. Multiple approaches are currently being researched for improved denoising, including better support for command-line denoisers, porting an existing open-source denoiser onto the GPU, and a custom machine-learning-based denoiser. We also have optimization updates to shader cook time. Cooking is the process of converting assets such as meshes, textures, and materials into platform-specific formats for deploying and packaging games. Cooking materials on a large project can take a very long time, often many hours. In fact, I remember at one of the Unreal Fest talks, I believe for the game Hogwarts Legacy, they were talking about how they had a dedicated server room just for cooking their game, because they had many different platforms they were releasing for, like Xbox, PlayStation, Nintendo Switch, PC, and Steam Deck, and they were constantly cooking their game 24/7. Now the process of translating material assets into shader code is undergoing architectural changes to run more quickly and make better use of caching.
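The caching side of this can be sketched as a derived-data cache: key the compiled result by a hash of the shader's inputs, so an unchanged material is a cache hit rather than a recompile. A toy Python sketch under that assumption; the "compiler" here is obviously a stand-in, not real shader compilation.

```python
# Content-hash keyed cache: recompiling the same source is a cache hit.
import hashlib

cache = {}

def compile_material(source):
    """Pretend shader compile, consulting a content-hash keyed cache first."""
    key = hashlib.sha256(source.encode()).hexdigest()
    if key not in cache:
        # stand-in for an expensive real compilation step
        cache[key] = b"BYTECODE:" + source.encode()
    return cache[key]

a = compile_material("float3 c = tex.Sample(s, uv);")
b = compile_material("float3 c = tex.Sample(s, uv);")  # cache hit, no recompile
print(len(cache))  # 1
```

The same hashing idea powers the deduplication mentioned next: if two different materials compile down to identical bytecode, they collapse to one cache entry.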
Additionally, dead code elimination and other automated cleanup of the generated code will allow for more aggressive deduplication of shaders, as many of the translated shaders ultimately compile to the same bytecode and can be reused. Next up, we have RHI, or Rendering Hardware Interface, parallelization. Render thread performance is often the limiting factor for UE titles. This is because some operations are restricted to this particular thread, even though current platforms and graphics APIs allow them to be done in parallel. So the goal is to improve performance by refactoring the RHI API to remove these constraints and fully utilize the multi-threading capabilities of the target hardware. It seems like a common theme with all of the different upcoming updates to Unreal Engine is just rewriting and refactoring existing code to run more performantly on the target hardware or target platforms, which is certainly no easy feat, especially trying to go back and redesign and rethink how it's all set up so it can run better in the long run. Now, for world building, we have a lot of different updates, actually. First up, we have the world partition runtime hash. This is a new world partition runtime hash solution containing a list of partition objects, which can be of different types and easily expanded for project-specific requirements. Each partition object holds its HLOD layer settings, and the runtime hash still supports data layers and one-file-per-actor, and relies on the streaming source components. The new solution comes with two partition types. We have a loose hierarchical grid, a 3D grid where actor extents are used to vary the streaming cell bounds. And then we have level streaming, which includes an individual streaming level per actor set in the runtime grid property.
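As a rough mental model of grid-based streaming, here's a toy 2D version: given a streaming source position, compute which grid cells fall within the streaming distance and should be loaded. Purely illustrative; Unreal's actual runtime hash also deals with 3D cells, HLODs, and data layers.

```python
# Toy world-partition streaming: load only the cells whose centers are
# within streaming distance of the source (e.g. the player).
def cells_to_load(source_pos, cell_size, streaming_dist):
    """Return the (ix, iy) grid cells whose centers are in range of the source."""
    sx, sy = source_pos
    cx, cy = int(sx // cell_size), int(sy // cell_size)
    r = int(streaming_dist // cell_size) + 1   # how many cells out to check
    loaded = set()
    for dx in range(-r, r + 1):
        for dy in range(-r, r + 1):
            center_x = (cx + dx + 0.5) * cell_size
            center_y = (cy + dy + 0.5) * cell_size
            if ((center_x - sx) ** 2 + (center_y - sy) ** 2) ** 0.5 <= streaming_dist:
                loaded.add((cx + dx, cy + dy))
    return loaded

cells = cells_to_load(source_pos=(100.0, 100.0), cell_size=64.0, streaming_dist=128.0)
```

The "loose hierarchical grid" variant described above would additionally grow cell bounds to fit the extents of the actors inside them, rather than using fixed bounds as this sketch does.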
For the next-generation terrain solution, we have a terrain solution to support large worlds with 3D modeling, layering, virtual textures, generic editor tools, variable tessellation, and Nanite support. This aims to replace the current heightfield-only landscape solution in Unreal, so it sounds like they are working on a terrain system that incorporates features beyond the current landscape system. Now, skipping over some of the other world building features, another one that caught my eye was server streaming. Basically, this is an update to world partition for dedicated servers, streaming in and out based on streaming source position replication, allowing for better server balancing and optimization in large worlds. So it seems like the world partition system, which they announced in Unreal Engine 5.0, is just undergoing many more feature updates and compatibility checks with all the other facets of the engine to ensure things like performance increases. We also have things like sub-world partitions, so I guess this is sort of like having multiple sub-world levels within a world partition level, and you can enable these through code and Blueprint. And lastly, we have static lighting: static lighting baking for world partition, including data layers and level instances support. This is actually a neat little feature, because if you are using world partition for creating open-world maps, I don't know if static lighting is currently supported for huge open-world maps. Usually you'll be using Lumen or dynamic lighting in parallel with things like world partition. Also coming in the future are a lot of updates to the Procedural Content Generation, or PCG, system. First up, we have runtime hierarchical generation.
Essentially, this maximizes PCG graph execution at runtime by combining user-defined grid sizes and generation distances per grid. Adding PCG generation source components and adjusting generation policies for ordering priorities and update frequency allows for creating richer dynamic environments, complex rules generated live, and larger procedural worlds in an automatically partitioned, runtime-executed solution. Additionally, it can also greatly improve editor-only workflows and iteration by generating only around the camera and other actors with a PCG generation source. I think this is one of the features I keep forgetting: PCG has the ability to generate dense forests at runtime. I always forget that this is an active feature of PCG. Next up, we have attribute set arrays. PCG attribute set arrays can be created from graph parameters, Blueprint structures, and arrays through actor property getters, or constructed from multiple attribute sets within the PCG graph. This allows for driving more complex logic from exposed parameters, such as biomes with their sets of assets. So I think this is a better way of adding groups of assets to your graph. I won't have to go into, I believe, the static mesh spawner node and manually add each individual tree mesh; I can instead just apply an attribute set that has all of the different trees and rocks for that biome and reuse it in multiple different graphs. A really neat feature that they're adding to PCG. Next up, we have GPU generation: computing PCG operations on the GPU for fast, non-colliding processing of wide amounts of data. CPU and GPU execution can be defined, mixed, and share data through branching within the PCG graph. Some more updates coming to PCG are extra execution domains. The PCG framework is used in a wide variety of use cases, ranging from editor-only workflows to runtime.
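The biome attribute-set idea mentioned a moment ago can be pictured as data-driven spawning: the graph logic stays generic, and a reusable set of per-biome assets is passed in as data. A hypothetical Python sketch; the asset names and fields are made up and this is not the real PCG API.

```python
# Data-driven spawning: the spawner logic is generic, the biome "attribute
# set" carries the assets and parameters. All names here are hypothetical.
import random

FOREST_BIOME = {
    "meshes": ["SM_Pine_01", "SM_Pine_02", "SM_Rock_03"],
    "scale_range": (0.8, 1.3),
}

def spawn_points(biome, num_points, seed=42):
    """Pick a mesh and scale for each generated point from the biome set."""
    rng = random.Random(seed)
    return [
        {"mesh": rng.choice(biome["meshes"]),
         "scale": rng.uniform(*biome["scale_range"])}
        for _ in range(num_points)
    ]

points = spawn_points(FOREST_BIOME, 100)
```

Swapping `FOREST_BIOME` for a desert set reuses the same graph logic with different assets, which is the appeal described above.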
This can be things like decorating my scene for a movie set, or using it at runtime for some dynamic gameplay functions. Now, where the execution domains come into play is that they add control over when to execute the PCG graph, or part of the graph. So they're adding more options you can call: things like manual execution, editor execution, internal with preview, build process, cook, a begin-play node, and runtime or on demand. Just more options for when you want to execute your PCG graph or certain parts of the graph. We also have things like dependency tracking coming to PCG, so you can see what assets are referenced by what. Just more quality-of-life feature updates. Now, one really cool thing they are working on is integration into UEFN. They are currently adding support for the PCG framework into UEFN, or Unreal Editor for Fortnite, where you can use the tools for full runtime generation and also use them with the programming language Verse. Now, there are just so many different PCG updates, and I'm just going to name a few more of these. We have micro scattering, which is GPU-based micro scattering, compute and Nanite instancing for very high-frequency details over objects. I think this is a very necessary feature to get that dense, detailed forest floor, and I think this micro scattering feature will solve that particular issue. We also have things like manual editing, so you can actually go in and manually select certain trees or objects that the PCG graph has spawned, and you'll be able to manually modify that asset's rotation, scale, location, all that stuff. Now, next up we have developer iteration. It sounds like they are working on this Unreal Cloud DDC, which is an Unreal derived data cache and content-addressable storage solution designed to deploy on cloud infrastructure for distributed/hybrid studios and teams.
So you can efficiently share cooked data between distributed team members, with active/passive region replication, auto-connect to the closest region, and also enterprise login and authentication. It sounds like this is a whole separate cloud application to share things like cooked data. That way you don't have to wait for stuff like compiling shaders; if your team member has all that cooked data, it can just be shared over, and it should hypothetically save your team a lot of time. We also have incremental cooking. Zen server is the codename of the new architecture for the local/shared Unreal Engine derived data cache, or DDC. It provides improved cooked data conditioning for read and write performance by no longer storing cooked outputs as loose files on disk. Data in Zen server is stored in a centralized service for local and/or shared projects and is inherently deduplicated. Part of the new Zen architecture will provide improvements to the iterative content cooking process by enabling incremental cooking of content, config, and code changes across target platforms. So again, more updates to the cooking process in Unreal Engine. And at Unreal Fest, it's actually funny, someone mentioned that maybe ten years down the line they're going to somehow eliminate the need to compile shaders. Just imagine you open up a project and you don't have to compile shaders at all. He kind of just joked that, you know, maybe in ten years we'll have that feature. Next up, we have Zen server streaming to target platforms. So the Zen server, the new architecture for the local/shared Unreal Engine derived data cache, provides improved cooked data conditioning and read/write performance by no longer storing cooked outputs as loose files on disk. Instead of consuming full-copy deployments, the server component enables local network streaming of cooked data directly to a target platform for the game client and game server, for faster play-on-target iteration. Again, this is just making Unreal Engine have more shared resources by having something like a local server. In this case, it sounds like you can cook your game and it will update to the Zen server; then you can go to your game client or game server, test out your newly cooked build, and it will just stream those files from the Zen server. So it'll save you a lot of time when you're actually playtesting your game. And I think these particular changes are more or less for bigger companies that have a large QA department, where they have multiple different platforms they're packaging to and multiple different development kits, with lots of different tests running simultaneously. So I'm not entirely sure if this is going to apply as much to small indie creators or solo developers. Next up, we have runtime virtual assets: reduce the shipped build's install footprint by splitting bulk data from structured data within individual assets, with streaming delivery directly to the game client for faster play iteration and a smaller first download for players. This is actually a neat feature. I think what this is getting at, and it was also mentioned in one of the talks at Unreal Fest, is that on mobile games, for example Fortnite, they have this feature where you can download, I think, 50 percent of the game and it'll be ready to play, and while you play through things like the tutorial or certain other parts of the game, it's actually downloading the rest of the game in the background. I think that's what the runtime virtual assets feature is getting at. Of course, we have a couple of other developer iteration tools, but a lot of these tools mainly apply to larger companies or studios that have big teams of people.
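The "play while the rest downloads" idea behind runtime virtual assets boils down to ordering: ship the chunks required for the first playable experience up front and stream everything else in the background. A toy sketch of that prioritization, not Epic's actual asset-streaming system.

```python
# Toy "smaller first download": chunks needed for the first playable section
# are ordered ahead of everything else. Chunk names are made up.
def plan_download(chunks, first_playable):
    """Order chunks so the first-playable set downloads before the rest."""
    upfront = [c for c in chunks if c in first_playable]
    background = [c for c in chunks if c not in first_playable]
    return upfront + background

order = plan_download(
    ["tutorial", "open_world", "endgame", "menus"],
    first_playable={"menus", "tutorial"},
)
print(order[:2])  # ['tutorial', 'menus'] download first
```

The roadmap's version additionally splits each asset's bulk data (texture and mesh payloads) from its structured data so the client can start with just the structure and stream the bulk on demand.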
Now we're going to go ahead and check out some of the other updates, such as the platform updates. We have things like the desktop renderer on mobile devices. The desktop deferred renderer will be supported on high-end mobile devices with the Vulkan RHI, allowing flagship features like Lumen and Nanite to run on these platforms. This is actually a very big update, because imagine now on mobile devices we'll actually have support for Lumen and Nanite. Now, this is not only huge for actual phone devices, but you could also potentially use this on things like VR, because those devices run on the same mobile architecture. So whenever this feature comes out, it will be huge, because you'll be able to leverage Nanite optimization to create very detailed and dense scenes that will run on such a light platform as mobile, which up until this point has just been completely unheard of. Next up, we also have XR. So we're continuing our efforts to support major XR platforms, which includes our commitment to the OpenXR standard, as well as exciting new rendering technologies that are applicable to XR devices. Performance is of the utmost importance to XR developers, and as such, we're investing time into solutions that can optimize your project. We have things like eye tracking and variable rate shading. Single-pass stereo for Nanite was also added in 5.3, and we're looking forward to exploring similar techniques to enhance your developer journey. So expect various quality-of-life improvements to your development workflows for VR, AR, and mixed reality experiences. Next up, we have Metal. Several improvements are being made to the support of Metal shaders for Apple platforms. The Metal Shader Converter will be adopted to improve shader translation, resulting in faster compile times as well as improved code generation, leading to better runtime performance.
Also, metal-cpp will be adopted to modernize the usage of Metal, which should also result in better performance. Lastly, we have Vulkan ray tracing. Work is currently underway to bring Vulkan ray tracing up to parity with DirectX 12, including on the Linux platform. This will allow the full suite of ray tracing features to be used, including things like hit lighting mode in Lumen and path tracing. So those are all the platform updates. Next up, we have a lot of different updates for characters and animation. First up, we have the modular rigging framework. This is a framework for Control Rigs to be modularized and distributed as modules. They will be adding new features to allow you to create character parts and assemble them into rigs for real-time or keyframe animation. So I'm wondering if this is like different prefab rig modules I can create and use to very quickly rig an advanced character. I guess we'll just have to see how this feature pans out in the long run. Next up, we have the Skeletal Editor, which was an experimental feature added in Unreal Engine 5.3. So we have some additional updates and quality-of-life improvements to the skeletal mesh editor, things like expanded component editing for bones and additional component selection schemes, for easier editing and selection of per-vertex bones and weights. We have things like animation insights, so you can create bones and weights in the context of your animation. What this means is you can actually transform the bones while painting, so you can play an animation and see how the skeletal mesh deforms based on your weight paints. And lastly, we have things like characterization: further simplified bone and weight creation tools, providing improved starting points and helpful defaults, with things like copy, paste, and duplicate for bones. The machine learning deformer is also receiving some updates.
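That "see the mesh deform while painting" workflow rests on skinning: a vertex's deformed position is a weighted blend of what each bone's transform does to it. A minimal 2D, translation-only sketch of linear blend skinning (real skinning uses full bone matrices, not just offsets):

```python
# Minimal linear blend skinning: blend each bone's effect on a vertex by its
# painted weight. 2D and translation-only for clarity.
def skin_vertex(vertex, bones, weights):
    """vertex: (x, y); bones: list of (dx, dy) bone translations;
    weights: per-bone weights summing to 1."""
    x, y = vertex
    dx = sum(w * b[0] for b, w in zip(bones, weights))
    dy = sum(w * b[1] for b, w in zip(bones, weights))
    return (x + dx, y + dy)

# A vertex weighted 50/50 between a moving bone and a static one
# moves half as far as the moving bone.
p = skin_vertex((1.0, 0.0), bones=[(2.0, 0.0), (0.0, 0.0)], weights=[0.5, 0.5])
print(p)  # (2.0, 0.0)
```

Changing a weight while an animation plays immediately changes this blend, which is exactly why live preview during painting is useful.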
So with ML Deformers, you approximate a complex rig's nonlinear deformations, or any arbitrary deformation, by training a machine learning model that runs in real time in Unreal Engine. Some future developments include debug filtering, with the option to connect in-game to a heatmap against ground truth; the ability to generate multiple geometry caches and animations; things like noise filtering for caches; LODs; and much more. Extended development includes structured ROM (range of motion) sampling, clamping weights, outputs, and inputs, and cleaning up nearest neighbors. Animation in Unreal Engine is also receiving a major update. We have things like new gizmos: these are the newly redesigned gizmos for translating, rotating, and scaling when you're animating inside the engine. So there's an improved new look and style, proper indirect manipulation, an improved arc ball, and a parent-space option in the viewport while posing; we currently have only world and local. I think this is a much-needed update to the gizmos, because I don't know if any of you have tried animating in the engine, but when I tried it a couple of times, it was always a pain for me trying to move the gizmos around and keyframe everything. Maybe it's also something I'll have to revisit when they add these new updates to the animation pipeline. We also have some updates to constraints. They have rewritten most of the constraint system to better handle level sequence workflows: animated constraints are now fully supported in the level sequence, there's an improved evaluation engine to better support constraint setups without causing cycles, and there's user interface work to improve the tool while working with it. Then, of course, we have the Anim Details 2.0 channel box mockup, just an overhaul of the animation channel box to provide an experience more familiar to users of other popular DCC tools. Another interesting feature they're adding is extensibility for animation authoring tools.
They're providing further extensibility within the Blueprint/Python APIs for users to create things like customizable tools for animation authoring. We have better selection scripting: being able to grab specific keyframes and edit the selection of objects within Sequencer. We also have things like layering control rigs. The layered control rig system allows you to use the power of Control Rigs on top of many workflows without the need to bake the data down to edit the characters, which can be destructive. Sequencer is also getting an update. They are providing a new set of features that will enable and support dynamic, user-defined operations on data to enhance interactive cinematic experiences. Things like the condition track, which allows you to set the active state of an animation track based off a condition. This will allow creators to work with more dynamic playback scenarios for handling game scalability, interactive events, or state-based inputs. We have things like parameterized keyframes: change the value of a keyframe to a different value based on user-defined runtime logic. This enables creators to author more dynamic changes at the keyframe level for game accessibility within UI or player-driven inputs. And then lastly, we have subsequence play rate, where you can dynamically change the play rate to selectively speed up or slow down subsequences for gameplay or artistic performance reasons. So that's Sequencer, and this is probably targeted more at gameplay cinematics than at things like short films themselves. Again, there are a lot more character and animation updates beyond these; those are just a few to name. We also have things like updates to simulation, such as the Chaos panel cloth editor, which was an experimental feature released in 5.3.
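The Sequencer condition track described above is easy to sketch: a track that only evaluates when a user-defined condition on the game state holds. A hypothetical toy, not the actual Sequencer API; the class and field names are made up.

```python
# Toy "condition track": the track evaluates only while its user-defined
# condition on the game context holds.
class ConditionTrack:
    def __init__(self, keyframes, condition):
        self.keyframes = keyframes      # {time: value}
        self.condition = condition      # callable(context) -> bool

    def evaluate(self, time, context):
        if not self.condition(context):
            return None                 # track inactive this frame
        # hold the nearest keyframe at or before `time`
        times = [t for t in self.keyframes if t <= time]
        return self.keyframes[max(times)] if times else None

track = ConditionTrack({0.0: "idle", 1.0: "wave"},
                       condition=lambda ctx: ctx["player_nearby"])
print(track.evaluate(1.5, {"player_nearby": True}))   # wave
print(track.evaluate(1.5, {"player_nearby": False}))  # None
```

Parameterized keyframes, as described, would go one step further and let the keyframe *values* themselves be rewritten by runtime logic, not just the track's active state.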
It sounds like they're adding more support for different types of weight painting, better runtime support, updates to the Dataflow graph editor, and just general UI/UX development. Chaos Destruction is also receiving a couple of different updates to the fracturing system, as well as general performance and quality-of-life updates. And we're also getting some more updates to Chaos Flesh, which is another experimental feature, actually introduced in 5.2. As far as audio, we're getting a new Audio Insights debugger plus profiler. Audio Insights is a new tool that enables creators working in audio to intuitively debug audio and profile CPU and memory to optimize your project. Audio Insights will provide more features and a unified toolset, with designer-friendly workflows and the ability for users to customize their layouts and views. It will allow things like audio event debugging, where you can track and record audio events sent to the audio renderer for analysis, with the ability to sort and filter audio events by properties, including duration. You also have things like audio output recording, so you can record the actual rendered audio from a capture session and scrub through that capture profile to hear the actual rendered audio when a given profile event happened. Now, in terms of framework, there are a couple of different updates: things like NavMesh and NavLink improvements, mass instanced actors, and a new StateTree debugger. These are mainly updates to general framework features, and some of them, like StateTree and smart object player interactions, are interesting. I actually recently just learned what StateTree can be used for and how you can set it up with things like smart objects. It's actually a really powerful tool for creating things like AI that can interact with different types of smart objects.
It's sort of like behavior trees, but it can be used with things like the Mass AI plugin, or you can have, you know, open-world AI that will interact with different objects within the world. So yeah, hopefully I'll be messing around with this feature a lot more and actually show you guys a tutorial on it, because I think this is a very powerful solution for open-world AI. Modeling tools are also receiving some more improvements: additional sculpting and mesh editing tools, general UI improvements, and much more. And of course, for the content pipeline, we have added support for glTF import and export. This is a common 3D model file format used as an interchange format in the 3D ecosystem. As of 5.3, Unreal Engine provides reliable import and export features, both in the editor and at runtime, and in upcoming versions of the engine, the goal is to polish this glTF integration with various further improvements. UI is also receiving a couple of different improvements and new features. In fact, we have this quick UMG preview, which allows you to test your widgets, all the buttons and features, inside of the widget editor, rather than having to play in editor to test them out. This will just help with faster iteration, letting you see very quickly how the widgets respond to your user inputs. And then finally, we have some various updates to the Unreal Editor itself. Stuff like the content browser is going to be receiving an update, including a design overhaul to improve content discovery and organization, optimized vertical and horizontal layouts, and enhanced searching and filtering capabilities. There are also per-editor preview scene settings, so these little sub-editors will have some more settings we can tweak inside their little viewports, which should allow you to quickly set up multiple test environments when inspecting your assets.
It looks like the viewport toolbar is also receiving a complete overhaul. As you can see here, this looks a lot different from what we currently have set up. So, a little annoying, because I'll have to go in and update, you know, my beginner tutorials to show you guys the new toolbar. But as you can see, we have all the scaling and snapping tools moved over from the right side to the left. So again, just an overhaul of the viewport toolbar. A little annoying for me and, you know, maybe everyone else using the engine, but I guess we'll just have to see how it pans out. So yeah, those are all of the different future updates that we can expect in Unreal Engine 5.4. However, again, a lot of these things won't actually make it into 5.4; maybe they'll land in, you know, 5.5, or later on. But yeah, there are a lot of new changes, I think, and just a ton of different new tools that are going to be releasing. I'm not too excited about the, uh, new viewport toolbar, I guess, as, you know, rearranging things in the UI and viewport is just a headache for people who are trying to learn the engine. But other than that, there are a lot of performance updates, so it sounds like they are just working on rewriting and refactoring a lot of their code to gain massive performance increases in the long run. But yeah, those are all of the new features on the Unreal Engine roadmap. I'll leave a link to it down in the description below. Let me know what you guys think about these new features. Also, let me know which one of these features you are particularly interested in down in the comments below. Also, don't forget to subscribe to keep up to date with the latest Unreal Engine news, and as always, I'll see you guys in the next one.
Info
Channel: Smart Poly
Views: 204,308
Keywords: Unreal Engine 5, Unreal Engine 5 Release, Unreal Engine 5 New Features, Unreal Engine 5 Next-Gen, Unreal Engine 5 Nanite, Unreal Engine 5 Lumen, UE5 Release, UE5 Next Gen, UE5, Lyra Starter Game, Unreal Engine 5 Lyra Game Tutorial, UE5 Lyra Tutorial, Unreal Engine 5 Open World Tutorial, Unreal Engine 5 Open World Games
Id: GypbOLlFh2w
Length: 32min 29sec (1949 seconds)
Published: Tue Oct 10 2023