AMANDA: Hey, folks! Get ready to hack, slash,
and slam on the gas with this month’s free Marketplace content. Put your pedal to the metal with
an advanced vehicle system, embrace the elements with mesh-based Niagara
effects, siege your next project using medieval components, excel
in the office with high-quality assets, and power up your characters
in a turn-based jRPG template. Now with the help of the
permanently free Physical Layout Tool, you can take your level
design to the next… level. Claim all these free goodies
on the Marketplace today! Have you downloaded the
new MetaHuman sample yet? We released two new tutorials
to help you get the best performance out of them! Explore the scalability settings,
including LODs, hair options, and toggling ray tracing, and then
find out how you can use Control Rig to animate the MetaHumans to
express emotion and your creativity. Watch the new videos on the
Unreal Engine YouTube channel. And if you’re eager for more
information on The Rise of Real-Time Digital Humans, the next episode
of The Pulse will feature digital humans researcher and expert Dr. Mike Seymour—joined by talent from
Sony Pictures Imageworks, Skydance Media, Brud, and 3Lateral—to
discuss the recent exciting and innovative advancements in digital
humans, and their impact on the future of games, films, and beyond. The event takes place on March 17. Head to unrealengine.com/thepulse
to register now. We’ve teamed up with ArtStation
to showcase incredible artwork created in Unreal Engine,
Twinmotion, and with Quixel MegaScans during Unreal Days. Hop over to ArtStation to
check out the featured Unreal Engine channel and submit your
own work to be highlighted! Ever dreamed of
building your own home? A new real-time app created by Zuru
Tech enables anyone to design a building and order the constituent
parts for assembly on site. Head to the Unreal Engine
feed to find out how this innovative idea could change
the construction industry. In our ongoing journey to provide
you with better resources, we’ve crafted a survey for the Unreal
Engine documentation—if you accept this quest to help us shape the
future, share your feedback and suggestions at the survey link
shared on docs.unrealengine.com! Now for our top weekly karma earners. Many thanks to: Everynone, ClockworkOcean,
L1z4rD89, MMMarcis, ShepperdsHook, T_Sumisaki, FatalBreak, fabiomsilva,
FridgeFace16, and Firefly74. Popping over to our spotlights—we’re
celebrating this beautiful cinematic called The Lost Path, created by a
handful of the folks at TeamQode. Labeled as the first in a series,
there’s a lot to look forward to in their upcoming projects. Give them some love on
ArtStation and then head to their site, badqode.com-- that's q-o-d-e. This lovely mini-trailer was
created by MR3D-Dev, using the Quixel Medieval Map with other
assets from the Marketplace. They’ve also hinted at future
videos following Coming Home, so make sure to keep an eye on their
YouTube channel for more content! And lastly, join the young Oona
on an adventure to complete her first task to become a full druid. Travel through thick forests and
swamps, face the creatures inhabiting them, reach the Cloud Kingdom and
fight against the evil Meiga in this beautiful 2.5D platformer. Oona the Druid’s
Path is now on Steam. Thanks for watching this week's
News and Community Spotlight. VICTOR: Hi, everyone
and welcome to Inside Unreal, a weekly show where we
learn, explore, and celebrate everything Unreal. I'm your host, Victor Brodin. And with me today I
have Sebastien Loze, industry manager
for simulations, as well as Alban Bergeret,
solutions architect for simulation and training. Welcome to the show. Today we're going
to talk a little bit about building simulation
applications with Unreal Engine. And I will hand it
over to you, Sebastien. SÉBASTIEN: Thank you very much. Hey, everyone. We are super happy to
be here with you today. Thank you for joining us. And thanks, Victor,
for setting this up. We're really happy to
be on Inside Unreal today. When you start the
conversation like this, it's super important to define
what we're talking about. And we all come from
different horizons here when we gather
in these sessions. And as we all come here online
from different backgrounds, experience, and skill
set, some of you might already be very
clear and precise, and get a very precise
picture on what we call simulation in this context. But it's good to make sure
that, for the duration of the session, we're
sharing a common lingo. So what really is simulation? That's what we're going to go through in this little deck that we have here, which we can bring to the screen for you guys. So really, when it
comes to simulation-- as you've probably detected, English is not my first language, so it's very important for me to, in some cases, go back to the dictionary. And let's do that together. Let's have a look at the
definition of simulation from Oxford Languages. As you can see, the scope of application for the word "simulation" itself is very, very large. The part we will be focusing on today, in this session, is this one: the production of
a computer model of something to study. And we can extend it. These computer models are
used to study, to train, or to analyze. The goal can be to
either augment your skill set, or preparedness to
complex and unexpected events, or to train the robots taking
care of our transportation or protection, or to
evaluate new vehicles or automatic systems
we are creating before they even exist. That's what we call
simulation in the context of our conversation today. And if we actually take a step back, I'm sure that, if we're
all in that call together, we all agree that we're living
through a very exciting period of technological changes. In the past five
years alone, we've seen a huge advance
in cloud computing, in connectivity, artificial
intelligence and automation, as well as interactive
and immersive technology. Across all aspects of
businesses and society, we're seeing exciting shifts
emerging as this technology continues to evolve. And in some cases,
it's revolutionized how we work, how we play, how we
interact, but also how we train and how we learn. And the door has been opened
to richer and more immersive experiences and
content, from the game we play to the movie we
watch, as well as the way we collaborate in the
workplace, and the way we train and set up training and
analysis of the world around us. And it's not magic. It's linked to some very
tangible points, very tangible elements that we're
going to discuss today. There are some specific
elements specifically linked to your pipelines. How we create things-- I mean, we've all been
developing virtual applications for a long time. And in this context,
it is remarkable to see the evolution between
the traditional pipelines that we dealt with-- and we are still dealing
with in many cases-- and the real-time pipelines. It allows for almost instantaneous ways of creating or rendering
your content in your scenes when you're creating
new environments. And these new
environments are dynamic. Like any video game,
the simulation world is using exactly the same
paradigm of nonlinear content, content that can be
evolving over time and that can be deformed
by the decisions that the end users are taking. And the beauty of it is that
this paradigm is not only touching training
and simulation. It's not only about simulation. I know Victor, when we
prepped, reminded that to me. It's not all about yourself. It's about everything we touch. And all these universes
that you see here, from architecture to
advertising, the benefits of real-time are everywhere. But today we'll dive into some examples and some specific use cases related to simulation. And before I dive in,
I'll introduce this short 30-second
video to reflect on some of the
simulation use cases that creators worked on in 2020. So a short video for us. And we'll discuss
that just after. [VIDEO PLAYBACK] [MUSIC PLAYING] [END PLAYBACK] So it's a small vignette. As you can see from this video, simulation is
everywhere-- in autonomous vehicles, driving
simulation, in aerospace, as well as health care, defense. It's a very large domain that is
touching different places where it's either too hard, too
expensive, or too dangerous to try things in real life. "Don't try it at home." It's really the
paradigm of simulation. If you can't do it
in real life, you need to find a way to experience it beforehand, to fail safely. And the problem with
that, the challenge that everybody is facing, is the
digitalization of your world. And how do you bring your real world into the simulation environment? Within the audience
today, we have tons of creators, inventors,
people creating universes that are far from real life. We also have people who
want to replicate real life. And it's interesting to
see that, with one engine, you'll be able to do both by
using different techniques and different capabilities. And we'll dive on the
simulation ones today. As I said before,
definitions are important. This one's coming
from Wikipedia. That's another source
that you can use. And here is a definition
of a game engine. The important part here
is that, in the context of building a simulation
application, many times, people get confused and see
only the shiny pixels and the beautiful
images that you produce with a platform like Unreal,
while at the core of it, it is a development platform
that you can build upon and create any type
of interactivity. And the values you will find
in simulation applications very often are
linked to the values that you want in a
video game as well. If you look at that table
here, the only point that is different is your goal. In one case, people
will enter an experience and will live a
fantastic experience and be entertained by it. They will live something
that they could not live in real life. And they will be feeling all
these emotions connecting them with the characters or the
scenario or the situation that they are playing. And they'll leave the
room with this memory of this entertainment and
feeling that they went through. In the other case, in
the case of simulation, what we want to achieve
is we want the interactors with the experience to be
augmenting their knowledge about something. They either will
learn a skill set or they will understand
better the outcome of a specific paradigm
that they will be testing to analyze that. So they either train to learn
and to learn new skill set, or they use simulation to
augment their comprehension of the real world. So as you can see here, I
mean, with all the environments that we are creating in
the context of Unreal, the one that is the closest
to video game development is developing a simulation
application-- something that is not linear, that the
operators or the gamers will interact with, and will
generate a different outcome depending on how you play it. And historically,
from firefighters to different interactors
in different fields, it was really, really close
to third-person type gaming applications when we saw
the first use of Unreal in the simulation domain. And it didn't start with us. It really started by the
community figuring out that, hey, instead of
using xyz solutions, I could actually build it
myself with a game engine and have fantastic
results out of that. Fast forward to today, the types of use cases have evolved like crazy. It's not only a third-person
type of application that you're seeing, but
it's any type of application that replicates
exactly what you would do in any type of simulators,
either to train people to do new things-- to
pilot complex vehicles, to control or to interact
with complex elements of the human anatomy
as you're training a surgeon on a
specific procedure for orthopedic surgery, as you
see on the top-right corner here on that slide, or preparing
for complex operations, as you see here, with the brain
pictures used by neurosurgeons in the Tokyo University
when they are doing their research on how to reduce
a clot on the brain prior to the brain operation. All these type of
complex procedures can either be
rehearsed, analyzed, or taught to the end users. So to summarize, simulation
is a very large field. It goes from space
exploration to health care, passing by civil aviation
and earth-moving equipment training. And because of all that,
we have to make sure that we have a very complete
platform solution with Unreal Engine to ensure that, no
matter what is your use case, you find the right
tools within Unreal. And when we look at the
workflow that anybody would be using when they're
building a simulation application, you always go
through three main phases. And there are more than that. But if we take a very,
very high-level point of view on any
simulation application, you always have
three main phases. You will create your content,
the environment you deal with, and the objects that
you interact with. You will develop your scenario. And you will create some
specific interactions and some deployment
capabilities for your solution so that it can go
to your end users. So from that
standpoint, again, it is another similarity
to video games. You have your content, your
storyline, and your gameplay. The same type of
mechanism exists there. And while some of you are more
familiar with simulation today than gaming, and
vice versa, you can see how these two universes
are feeding each other and, in terms of innovation, are driven by the same need to evolve. And now we'll dive into
a more detailed slide. And bear with me. I hope that you don't
have a too-tiny screen. Otherwise you won't
be able to read that. But we'll help
you by zooming in. So yeah, that's
very, very small. But there are colors
on that slide. I hope you can see it. You can see these different
phases represented again. Environment-- creating your
environment in your models. Defining the experience
definition and the execution. Deploying your application. And making sure that you
have a post-experience result analysis. Let's dive in. We'll be nice and we'll zoom
in on some of these boxes. So first off, the environment
and model creation. While there are a lot of tools
within Unreal Engine that allow you to create and
inject a lot of objects into your applications,
in the simulation domain, some specific tools
and specific formats are important to be supported. And we want to make
sure that we build an ecosystem around Unreal. So we associated ourselves
with different players of the simulation industry to
ensure that their formats are supported properly into
Unreal, either through elements that we would develop
or through elements that they would develop. And as you can see
here, from the pipeline tools to the capabilities of
streaming different terrain servers and terrain formats, we have a complete loop to ensure that,
no matter what is your favorite tool, no matter
what is your favorite pipeline, there is a way in for
you to jump into Unreal without suffering while you're
developing your application. And by the way, today,
we have Alban with us. And that will be part of his presentation at the end of our
session today, where we will be presenting
two examples of streaming services of
geo-intelligence content-- geographical content that can now be loaded directly and streamed
directly into Unreal. We gave the example of
the Cesium 3D tiles, so Cesium for Unreal plugin. And we will also
have a demonstration of the ESRI ArcGIS Maps SDK
for Unreal Engine plugin. And we will show
you how to inject all that great geographic
content into your applications. It's super simple. You will see there are
a couple of very, very well-defined steps. And it's not magical. It's all based on the great work
that we did with our partners to make sure that this
comes to you in an easy way. Now, when it comes to
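To give a feel for how streamed globe content is addressed under the hood, here is a small sketch of tile indexing. It assumes the generic Web Mercator "slippy map" convention purely for illustration; it is not the specific tiling scheme that Cesium 3D Tiles or the ArcGIS service actually uses:

```python
import math

def latlon_to_tile(lat_deg, lon_deg, zoom):
    """Map a WGS84 lat/lon to integer tile coordinates at a zoom
    level, using the common Web Mercator tiling convention. Each
    zoom level doubles the tile grid along both axes."""
    n = 2 ** zoom  # tiles per axis at this zoom level
    x = int((lon_deg + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(math.radians(lat_deg))) / math.pi) / 2.0 * n)
    return x, y
```

A terrain streamer resolves the camera's position to tile indices like these, fetches only those tiles, and refines to deeper zoom levels as you get closer.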
the experience definition and the execution, the
creation of scenarios, the creation of
initial conditions, there are specific objects
and specific components within Unreal that
you all know about, that you've seen demonstrated recently with the arrival of 4.26,
with these new capabilities for the different modules
that we have for the sky atmosphere, the volumetric
clouds, the water system as well. All these components can be
directly leveraged, obviously, in the context of simulation. And in that specific
context, there are some modules in the
execution of your scenarios that are representing a very
strong advantage in Unreal. In the simulation
context, for example, we can talk about Chaos and the
ability, through Chaos physics, to bring a lot of new capabilities into your applications, including the full-body IK that came in very recently.
because we understand that monolithic simulations
are not trendy anymore and that everybody is dealing
with non-walled-garden environments where we can interact with different simulation systems. To make sure that we bridge correctly with your existing simulation engines, interoperability is a core element of what we need to maintain as part of Unreal. And that's why you have
these new plugins created by Pitch, by CoreDS, allowing you to have all this
interoperability setup so that your traditional
communication protocols such as
DIS or HLA, and soon, DDS, are supported as part of
your application in Unreal. And yes, for those
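For those curious what these protocols actually move around: every peer agrees on fixed binary message layouts. The toy packet below is a hypothetical layout for illustration only-- real DIS PDUs (defined in IEEE 1278.1) carry far more fields-- but the encode/decode discipline interoperability relies on is the same idea:

```python
import struct

# Toy entity-state message: site id, entity id, and a 3D position,
# packed in network byte order. Hypothetical layout for illustration.
STATE_FMT = "!HHddd"

def encode_state(site, entity, x, y, z):
    """Pack one state update into bytes a peer simulator could parse."""
    return struct.pack(STATE_FMT, site, entity, x, y, z)

def decode_state(payload):
    """Unpack a state update produced by encode_state."""
    site, entity, x, y, z = struct.unpack(STATE_FMT, payload)
    return {"site": site, "entity": entity, "position": (x, y, z)}
```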
of you who are not familiar with
simulation context, welcome to this
acronym-fantastic universe. We have plenty of acronyms. So for any questions, feel free to use the chat capabilities to ask all your questions. We'll dive in during
the Q&A session. And Victor is already
aggregating all of your questions as we speak. Another element about
the experience definition and the execution is the
co-simulation capability. In the simulation
domain, you probably already use some expert tools for a specific physics element, or for specific sensors,
or for specific behaviors of your objects in your scenes. And this co-simulation
aspect is super important. There are some plugins
already existing for many of the players. The nature and the
structure of Unreal-- and we'll talk about that
a little bit further down in the session today-- the nature of Unreal allows
you to build your own module as much as you need. A good example of that is the
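At its core, the co-simulation pattern is simple to sketch: the engine owns the frame clock and, on every tick, hands the timestep to the external solver, then reads the state back for rendering. A minimal sketch, where the solver class is a stand-in stub rather than any real physics tool:

```python
class ExternalSolver:
    """Stand-in for an expert external tool: here, just constant-
    velocity motion integrated one step at a time."""
    def __init__(self, position=0.0, velocity=2.0):
        self.position = position
        self.velocity = velocity

    def step(self, dt):
        """Advance the solver's state by one timestep."""
        self.position += self.velocity * dt
        return self.position

def run_cosim(solver, dt, ticks):
    """Engine-side loop: advance the external solver each tick and
    collect the states the renderer would consume per frame."""
    return [solver.step(dt) for _ in range(ticks)]
```

In a real setup, step() would be a call across a plugin boundary or a network protocol, but the ownership of the clock works the same way.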
example that we released today. There had been a story
posted about Duality. And Duality, they're building
their Falcon software. Their application is a
machine learning and AI-oriented simulation engine
based on Unreal Engine, which allows us to leverage this
co-simulation capability to have your machine-learning
interaction directly connected with your Unreal
Engine experiment. The article that we
posted today is really focusing on their experience
and their development around Falcon and
the dedication they had on that one specifically
around the Honeywell aerospace drone surveying tool to analyze
the status of the power grid electrical lines as well
as the wind turbines that they are monitoring. In terms of deployment, there's
also a lot to be said here. But you heard us
many, many times, at Epic, talking
about these elements. We want to make sure that no
matter where and how you plan to deploy your application, your
game, your simulation system, you're covered. We want to make sure
that, whether you're deploying on handheld
devices, or computers, or through any type of HMD,
we give you the flexibility to deploy your
application the right way. And furthermore, in the context
of simulation, many times, especially when you're dealing
with complex civil aviation applications, for
example, you will want to have a dome display
around your trainee. And to ensure that
this is something that you can do very easily, we
have this ability with nDisplay to help you to blend
and warp your images between different projectors
in a very coherent way. And that's another
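One small piece of what "blend" means can be sketched as a brightness weight across the overlap zone between two adjacent projectors. This is a simplification under stated assumptions (a horizontal seam, a gamma-corrected linear ramp); real warp/blend calibration also handles geometry and color correction:

```python
def blend_weight(u, overlap, gamma=2.2):
    """Intensity weight across one projector's image.
    u: horizontal position normalized to [0, 1]; overlap: fraction
    of the image shared with the neighboring projector. The ramp is
    gamma-corrected so the summed light across the seam looks even."""
    if u <= 1.0 - overlap:
        return 1.0  # outside the overlap zone: full brightness
    ramp = (1.0 - u) / overlap  # linear falloff from 1 to 0 in the zone
    return ramp ** (1.0 / gamma)
```

The neighboring projector applies the mirror-image weight, so the two ramps together give roughly uniform perceived brightness across the seam.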
key element that is leveraged by a lot
of simulation builders. Last but not least, the post-experience result analysis, which is extremely important. Many simulation developers
had that challenge when aggregating systems that they didn't fully control. Let's say you're playing with LEGO bricks and you're aggregating a lot
of applications together. If you don't have the
access to the data set that is generated by
your application system, you will not be able to export
all the interactions, all the behaviors of your
trainees or your analysts throughout the course
of the experiment that they will be running. And you will be missing out on all the value of this training. You won't be able to do an
after-action review correctly. You won't be able to connect
with your LMS correctly. So that's where this openness
of the core of Unreal Engine is super important. So it's a lot of information. And I hope you're staying. I hope you're still here. I know, Victor, we
have that subtle cue that he would wave at me if
everybody's leaving suddenly. Are they still here, Victor? VICTOR: They're still
here as far as I can tell. SÉBASTIEN: Good. Thank you. So to summarize, when you're
developing a simulation application, it's basically
some type of a layer cake here. So you have your
framework level, where you have your
development environment, your stand-up databases, your
interoperability protocols, as well as your game engine. On top of that, you need to
build your integration effort to connect with any other tools
that you want to connect with. On top of that, you will
build your specialty features implementation. And then, based on
all that, you will establish your curriculum
definition or your experience definition. Now, what we want to make sure
is that, while we're bringing you a very solid platform
for your framework level, we want to reduce the integration effort for you and make sure that most of the integrations that you need are taken care of by our teams
or with our partners in advance of you developing
your applications. And that's why we're here. We're really here to
hear back from you and understand what
it is that you need to simplify your
process when you're developing your applications. I talked a little bit
about the pipeline. I won't dive in too, too much,
because that's really part of the core of our demo today. But in a nutshell, there
are several elements that are important to notice. The first one is, as
some of you know and are experts with, Datasmith and Dataprep allow you to inject, into your simulation application, virtually any type of 3D asset
that you already have on file. These systems, these
specific formats that you are already
using, we don't want you to have to
reinvent your files or adapt or adjust
to a new pipeline. We want to make sure that you
use your existing pipeline and you're able to not only load
but also optimize this content when you're building your
actual real-time applications. And once you set
up your pipeline-- and these pipelines can be
set in three different ways. They can be set based on coding
using C++ or using Python scripting or using our
visual programming interface, which is the Blueprint
concept that you see here. Once your pipeline is set, you
can reuse that and make sure that you leverage
that moving forward. When it comes to the
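That "set up the pipeline once, reuse it" idea can be sketched in plain Python. The operations below are invented stand-ins, not real Unreal or Dataprep API calls; the point is the composition pattern, where a recipe is defined once and re-run on any incoming asset:

```python
def make_pipeline(*operations):
    """Compose import/optimize steps into one reusable callable,
    mirroring how a Dataprep-style recipe is defined once and re-run."""
    def run(asset):
        for op in operations:
            asset = op(asset)
        return asset
    return run

# Hypothetical stand-in operations over a dict describing an asset.
def strip_hidden_meshes(asset):
    """Drop meshes flagged as hidden before they reach the level."""
    return dict(asset, meshes=[m for m in asset["meshes"] if not m.get("hidden")])

def halve_triangle_count(asset):
    """Crude stand-in for an optimization/decimation pass."""
    return dict(asset, triangles=asset["triangles"] // 2)

optimize = make_pipeline(strip_hidden_meshes, halve_triangle_count)
```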
geographic data set that you want to integrate
into your application, there are already a ton
of tools within Unreal, with the landscape capabilities, the foliage tools, and all the Marketplace assets that
already exist and allow you to already build games
and all this content that we call geotypical. Geotypical environments
are environments that look like the
real environment but are not actually
the real environment. So that's geotypical. And in some cases, you want to create a geospecific environment. Geotypical looks very much like the real thing, but it is not the real thing. Geospecific is actually the real thing. So I'll give you an example. If I fly over terrain
that is geotypical and that is supposed
to represent Montreal, where I live, I
will not see my building. I will not see
the building where I live as I fly on
top of that terrain. If I have a geospecific
terrain, then, if I fly over that
terrain, I will be able to recognize my house. So that's the difference between
geotypical and geospecific. When it comes to geospecific,
it requires the integration of GIS files-- Geographic Information System files-- and aggregating them together to
create a virtual environment. And that's a role for a lot
of tools that already exist out there that are supporting
Unreal by creating content for Unreal Engine, as you
can see on that slide here. But what's happening when you
don't have your GIS files, when you don't want to build
these environments yourself? That's when leveraging existing
data set is super important. And we didn't invent formats,
we didn't have a crystal ball and decide to reinvent the
wheel and redefine a new format. We went and sat with
some of the experts that are doing that already. We are close to the OGC. We're actually a member of
the OGC, the Open Geospatial Consortium, as well as the
USGIF, the US Geospatial Intelligence Foundation. And in these two groups, there
are a couple of consensus points that are super important. And basically, in
a nutshell, there are three main
formats that people want to interact with when
they are building simulators these days-- the ESRI ArcGIS
Maps SDK, as well as the CDB, the Common Database,
as well as the 3D tiles. And all these formats
have different benefits and different roles and
different pros and cons. And we couldn't decide for you. So we decided to make
sure that, no matter what format you
decide to use, there's a path for you to
use it into Unreal. And again, that's
part of the demo that Alban will
show you in a bit. I talked briefly about
deployment earlier. And I won't dive into
that more right now. But if you want, during
our conversation after, we can jump on more
details about this. Specifically we can talk about
our support of the OpenXR Consortium and what we're
doing along these lines. I mentioned the
domes and the work that we can do with nDisplay. Same thing-- we can dive a
little bit more in our session after the demo. When it comes to deployment,
cloud-based deployment is super important for all
the players in the simulation community and for
different reasons, either to leverage
very large install base of computers for
massive deployment of very complex simulation systems,
or also to make sure that IT work is simplified for deployments in places where you can't install what you want on computers, or where you can't get the right GPU capabilities on machines that you can't really manage yourself. So that's where the
online capabilities and the distribution that we can
do, either through AWS, Google Cloud, or Azure, are allowing users
to leverage our Pixel Streaming capability from
anywhere they are. And that was actually a demo
that we presented in December last year, during the
simulation conference, called I/ITSEC where we demonstrated
a project called Project Anywhere. In Project Anywhere,
all the guests to that conference,
the ITSEC conference, were able to, from
their computers or from their mobile phone
or from the tablet, anywhere they were in the world, to
connect to an application that was an Unreal-based application
running a geotypical terrain and loading it directly
into their application using Pixel Streaming. So we mentioned, briefly,
the dynamics of the objects and the simulation of
these dynamic objects. The co-simulation
is an essential part of the philosophy in Unreal. We don't want to lock you or
to provide a solution that is an all-in-one solution. We want to make sure that you
have the ability to connect your tools with Unreal. And that's part of
that vision here. The examples that we
listed in the chart earlier are represented here as well. One of the last points
I have before we jump to the demonstration
is to also keep in mind that no matter what you want
to build on top of Unreal or around Unreal, to
extend it, to customize, or to automate some tasks around
Unreal Engine, you can do it. You can do it yourself. The API is open. The source code is free. It's available for everybody. And you can download it today. It's on GitHub. And that's the right
place to download it and to have a look at it. So when we are connecting with
the simulation community, when we're building that
ecosystem, that's also one place where all
these builders and creators and inventors of solutions
can go to download everything that they need around Unreal
to amplify it if they need to. And some other people
are amplifying it and are supported
by Epic directly. That's the MegaGrants
concept that some of you may have heard about already. It's described here. In the recent years,
more than $8 million has been awarded by Epic to simulation-dedicated
applications. That gives you a feel of the
involvement and the dedication that Epic has towards
this simulation community. And there's a reason for that. As you can see here,
the simulation community is changing quite a lot. There are new drivers to the
next generation of simulators that are being built. And
between the progression of the XR type of applications,
the need for more deployable and more reconfigurable training systems,
we've seen a lot of new drivers pushing towards
an evolution of the simulation domain. There's also the need for more realistic environments, which is not necessarily required when you're training humans, but is when you're training machines. As a matter of fact,
it's a strange paradigm, but as a human, we
can observe an animation and say that, if we see some
wide vertical lines in front of us on a screen, we can
very easily interpret them as rain, for example. Whereas a computerized system
with an electro-optic device like a camera or a
digital camera observing that same scene will say,
hey, yeah, that's cool. I see vertical lines. But it won't infer that as
rain-- as drops of rain falling across the lens of the camera. So you need to go to an
extra level of fidelity when you're actually
training machines compared to training humans. So the need for more
realistic representation of your environment is growing
in the simulation domain. So the simulation
community provides a very strong and stable
working environment. And there is a big, big need. We hear, every week, about new
companies in the simulation domain coming to Epic
or to our partners, our friends, asking for
skilled individuals on Unreal. And if you are looking for
a career change or a career shift, that can be a very
good place to look at. So no matter what
you're working on today, if you're working
on applications that are Unreal-based and if you
have developed your skill set around Unreal and are looking for your next gig, that might be an
interesting path to look at because
there's a lot of need. All right, so I'm done
with the presentation part of our conversation today. And I want to take a quick
moment here to pause for a bit and see if we have any
specific questions that we want to handle right now. And if not, it would
be the right time to jump to more interactive
sessions with the demos that Alban has
prepared for us today. VICTOR: Thank
you so much, Seb. Let's see. Taking a quick glance, we
have some general questions. They're more general
in regards to the tools and some specific ones. So I think we can
leave all of them for later on during
the Q&A section and head over into Alban's
demonstration here. SÉBASTIEN: Excellent. Alban, the floor is yours now. ALBAN: OK. Thank you. Do you see my screen? Is it OK? VICTOR: Just a moment, Alban. ALBAN: Yes. I will wait. VICTOR: All right. Yes, we're good to go. ALBAN: Yes, good to go. Sorry. So I will start by
apologizing for my English. But I'm French, so I'm working every
day to improve it. But you will hear
my strong accent. Anyway, so my goal today is to quickly demonstrate to you how quick and easy it is to create an Unreal world with accurate lighting. Because in simulation, we
are searching for accuracy: accurate lighting, along with a concrete georeferencing system. And once we have set
up this environment, we'll be able to inject
into this environment some data set from
ESRI and Cesium. So I will go live and start
by creating a game project. I start with an empty template. It's not necessary to have anything fancier. And Blueprint--
it's fine for now. And I will just name this
project Simulation Demo and fix my typo. And that's all. Project is creating. We need to wait a little bit. VICTOR: Alban,
sorry to interrupt. Could you change one
setting in your OBS instance so that we can see
your mouse cursor. ALBAN: OK, no problem. For sure it would be better. My Screen Properties. VICTOR: It
should be in the Scene. ALBAN: Yes, I got it. You see it now? VICTOR: Yes. Thank you. ALBAN: OK, sorry. [CHUCKLES] It's
better with it, yeah. So first we will
start by setting some specific plugins and some
specific project settings. So I will start to
go to my plugin page and enable some plugins. First one is the Sun
Position Calculator. It's something which is
provided by Epic Games, but which is not
very well known. This plugin allows you to put
your sky light at an accurate location depending on
your position on Earth and the time of day. So we'll start by
enabling this one. Another one we need to enable,
just for the demonstration, is the Editor
Scripting Utilities. Because we will use them
inside editor widget. And just for my
computer, I will disable SteamVR, just because I
have a headset connected, not connected. If it's not connected,
I can have some error message that's important for
normal purposes. So that's all for the
plugins we need for now. And we need to,
since we are there, change two things in
our project settings. I will go quite fast. Victor, this video
will be recorded and available afterward, so you will have time to look at it again. So first thing we need
to set in our case is to extend the
default luminance range in auto-exposure. Why? Because we will work with
a physically accurate sun which has a very
high luminance value. And if we don't set that, we
will have a bad lighting setup. The other one is, since we will
be using the sky atmosphere, it's better to enable Support Sky Atmosphere Affecting Height Fog, so the two stay consistent. Yes, so that's all. Our project is set. And I can just restart-- I will not restart,
but I will close it, just because we will need to
bring some additional content in our project today. So my project is created
on my hard drive. And I will add 3 plugins for
the purpose of this demo. First one is ArcGIS,
second one is Cesium, and the third one is a
GeoReferencing one made by Epic and that will be available
inside the next version of Unreal. So I just copy it to
my project folder. And once again, for
the demonstration, I prepared a little terrain to illustrate GeoReferencing. So I will bring this
data set into my project. That's all I need to set up. So Read-- not that one. If I go to my demo and open,
once again, my project, it will load all these plugins. Wait a little bit. OK, everything is set up. Not needed anymore. By default, Unreal starts with
the old sky system and some other actors.
We don't need that today. So we will remove everything
except our floor and the Player Start. Then delete everything. Of course it's dark. And we will start by
placing some Actor. First one is the SunSky. The SunSky Actor comes
when enabling the Sun Positioner plugin. And it's a Blueprint-- I will describe
it after a while-- but it's a Blueprint
that already aggregates the basic components
for your sky, meaning the atmosphere,
the sunlight, the sky light, a compass, just
to be sure that you are pointing to the north. If you just do that, you have
your physically-based sky, but you still notice that
there is a black part at the bottom of your screen. This black part is-- normally there will
be trees there. You should have a terrain. You should have
anything you want. And it should be
hidden by the terrain and you just see the atmosphere. So if you want to get rid
of this one, in that case, you can just place the fog. An Exponential Height Fog.
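As an aside on the SunSky Actor just described: it drives the sun direction from your latitude, time zone, date, and solar time. As a rough sketch of the kind of math involved, using simplified textbook formulas (Cooper's approximation for solar declination), not the plugin's actual astronomical model:

```python
import math

def solar_elevation(latitude_deg, day_of_year, solar_time_h):
    """Approximate solar elevation angle in degrees.

    Illustrative only: latitude, date, and solar time together
    determine where the sun sits in the sky.
    """
    # Approximate solar declination for this day of the year (Cooper).
    decl = -23.45 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    # Hour angle: the sun moves 15 degrees per hour around solar noon.
    hour_angle = 15.0 * (solar_time_h - 12.0)
    lat = math.radians(latitude_deg)
    d = math.radians(decl)
    h = math.radians(hour_angle)
    # Standard solar elevation formula.
    sin_el = (math.sin(lat) * math.sin(d)
              + math.cos(lat) * math.cos(d) * math.cos(h))
    return math.degrees(math.asin(sin_el))

# Near the March equinox (day ~81) at solar noon in Paris (lat ~48.85 N),
# the elevation should be close to 90 - 48.85, i.e. about 41 degrees.
noon_elevation = solar_elevation(48.85, 81, 12.0)
```

This only illustrates why the plugin asks for latitude, time zone, and date; the actual Blueprint also handles azimuth, north orientation, and the atmosphere.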
And it will add a blue gradient there. So let's have a look at
what is the SunSky Actor. The advantage of the SunSky Actor is that it exposes properties such as latitude, longitude, and time zone, and also day and month values, along with solar time. And when you move
your time of day, you'll see your
sun goes like that. So I can go there and move
my sun through the sky. So it's strange now, but we
will fix it very quickly. So what can I-- no, it's good. Just to speak a little
bit about shadows, I will start by
placing a cube, just to have some basic
geometry on that one. I will expand the floor to make
it more-- no, don't do that-- to make it wider. It's live. OK. And give my cube something like house dimensions. So let's say I have a
terrain with a house. So that's good. If I go backward
a little bit, you will see that, after a
time, my shadow disappears. It's not good. And I'm not very far from 0. So I need to adjust some settings on the default SunSky Actor to make sure that everything is better. So first thing, we need to edit our directional light, the sun, and give it a good physical value for its luminance. And for the sun, it's 120,000 lux. It gives a good, stronger light. Another thing, we can go
to the shadow settings. And you will notice that,
by default, the range is only 2,000 meters. That's why my shadows
disappear very fast. So just add a bunch of zeros. And I have something very far. I set 2 kilometers. But when you do that, your cascaded shadow distribution is wrong. You get some blurry shadows. And we can fix that by working on the Distribution Exponent. And if you raise this value, we get crisper shadows farther away. With that set, I can
have crisp shadows that goes from very far. So when editing the SunSky,
I'm going, for instance, to a specific time of day,
let's say 18:00 hours. You see this strange
behavior in the sky. It's because the fog's directional inscattering is layered on top of the Sky Atmosphere's own scattering. And you need to clear that. For that, we go to the
Exponential Height Fog. Of course, if I remove it,
I have a nice atmosphere. So it's the fog that is causing the issue. And for that, we just go to the directional inscattering color and set it to black. That discards this value. If we enable and disable the fog, we now see a nicer gradient at the horizon, from the fog into the sky. And if you want
to adjust it, you can play with the haze falloff
to have something crisper or, I would say, more
polluted, more dirty. So this one is really artistic. So let's set it to 1 for now. OK, so we have something
which is quite good. If I test another time of day, we have something nice at noon and something nice at sunset. And during night, OK, it's dark. But if I wait a little bit-- it takes a while because the automatic Eye Exposure takes time to settle-- I have something very bright at night. It's not what I want. So we need to also
control the way our Eye Exposure is working. So for that, we need to
add a post-process volume to our scene and make it the
global post-process volume by setting it at
an infinite extent and go to the exposure
setting of this post-process. In order to work faster, we can
raise the Eye Adaptation Speed there. It will go, of course,
very fast-- too fast, but it's just for setting
up our parameters. So if I go to noon and
go back to the night, I have something very fast. Now what we need to do
is to tell the system-- not this one, this guy-- that, during night,
we don't want our eye to adapt too much to the darkness. For doing that, we can clamp
the lower exposure value and have something which
is quite dark at night. And with that done,
you can have something which is consistent
from day to night. Is there something
here has-- no, OK. So right now, we have a sun. We forgot something
important, which is a moon. Currently, the SunSky system
doesn't come with a moon. But it's very
straightforward to add one. We will add a directional
light to our scene. Let's rename it to Moon. And the moon is very specific. It has a very low intensity. And a full moon is around
3 lux for intensity. So you can add this
light intensity to 3 lux. And yes, I've put
the moon somewhere, but I don't know where it is. And I will need to move it. Usually, when you are working
with a directional light, normally you press Control-L
and have this nice gizmo to tune the light. When you are using the
SunSky, the main light is driven by time of day
and not your actual gizmo. So we need to find a way to give
good orientation to our moon. It's a simple trick. We place our sun at
a specific location. We go to this light, copy
the rotation, go to the moon, paste the rotation. And here we have the
moon placed right here. Of course we don't see it. So we move to the sun further. And we should have the moon. Why? VICTOR: Because it's live,
and things don't always go as we want it to. ALBAN: Sorry. Yeah, it could be
because it's live. We'll try another one. OK, put my sun there. Oh, I think I took-- I copied the wrong rotation. Paste. And go to SunSky. OK, but she was there yesterday. Well, anyway, we will go on and
try to fix it after a while. Oh, yes, I remember. I forgot two important things. When creating the moon, I just created it as a directional light. And I didn't tell the system that it was part of the sky atmosphere. I forgot to do that. Sorry. Because it's live. And for that, we need to go
to the Atmosphere and Cloud setting and say that moon
will affect the atmosphere and that its index will
be the second index, 0 being for the sun. And now, since I do that, I
can see the moon on the sky. Phew. Sorry. It was fun. And now, if we move the sky,
we can have something at night with a moon. Of course, you can imagine
setting your moon direction using additional scripts to have something like actual moon phases, or adding a small sphere or another model for the moon disk. With that done, we are set. And we have something good
with sun, moon, time of day, good shadows. And we can save our
current level as-- let me just create a folder
first, call it Levels, and name this Outdoor. OK, that was our first part. Victor, do we have
any questions so far? VICTOR: Not specific in regards to the previous part of your-- oh, let's see here. Yes, we do, actually. Batsirai Biti asked, did
you add the ArcGIS plugin as part of the three
plugins you added? ALBAN: Yes. Yes, I will explain how
I get it, afterwards. It's publicly available
on ESRI ArcGIS. But up to now, what I did
doesn't require this plugin. The only thing you see is that, because I put this plugin in my project's plugin folder, it gets loaded, and I've got this extra ArcGIS button along with this extra Cesium window there. But we will cover that later on. Yes, last thing, this
message related to lighting. It's important: when we are doing live simulation, we are lighting our environment in real time; we are not baking lights or anything like that. So the moon should
be set to movable. With that done, this
message will disappear and we will not have to compute
and bake our light maps. Other questions, Victor or Seb? SÉBASTIEN: Yeah, up to now,
there's really nothing simulation-specific, right? This can apply to video
game or any application. But in simulation
industry, we need more. We need to know
where we are, where the scene is taking place. And that's the part I think
where the GeoReferencing plugin becomes very handy. Alban, do you want to dive a bit more into that specific point? ALBAN: Yeah, so at
the beginning of this demo, I dropped, into this project,
the GeoReferencing plugin. This is a plugin I made which is
currently available on GitHub, but on early stage. This one is a current
state of plugin. And it will be part
of Unreal Engine 4.27. So you will be able to use
it in our projects very soon. So when you do that,
you have access to the means to give your environment an actual location on Earth. So I will start by saving my
current level as a new level called Georeferencing. And we don't need this
floor and building anymore. We will use another asset
I brought to the engine when preparing it. And I drop it in my scene. I make sure that it is
placed around the origin. And I need to go a
little bit faster because we are going wider. So I took a random nice country. You might probably recognize it. And it's a one-scale
terrain for France. And it's made from
geospatial data. It has a digital elevation
model, texturing, and so on. It's very low-level
terrain, but it's just to illustrate what I want to do. When I created this terrain,
I made it precisely located at one origin I know on Earth. The origin of this terrain is at
some coordinates I really know. So if I look backwards,
you will notice two things. My terrain is flat,
but the Earth is round. And so the Sky
Atmosphere is also round. And we have this crazy
stuff that my terrain is going into the Sky Atmosphere. It's normal. It's exaggerated there. It was just to illustrate
this flat-earth mode is more for small databases
of 1, 2, 500 kilometers, but not at scale. We also noticed this
strange blue thing. And this blue thing is related
to the Exponential Height Fog. It works well when you
are close to the ground, but not when you are in space. So when you are trying to
make bigger environments, get rid of it and consider that the Sky Atmosphere will do the fog for you. SÉBASTIEN: Alban, a
quick question for you. The content that
you're showing here is meant as a reference for placement. But you don't really need
to have it there, do you? ALBAN:
What do I need to-- oh, the terrain. No, this is just
a random terrain. You can do it with any
kind of Unreal terrain you would like to georeference. SÉBASTIEN: So it could be
basically any primitives or any mesh that we would use there
instead of having an actual terrain, correct? ALBAN: Yes. You could take another
floor, make it wider. [INTERPOSING VOICES] SÉBASTIEN: Yeah. It's important to know that
you don't need to pre-construct and get GIS data and elevation
from a specific region to do that. It's just to give
you a reference point that you would dictate as
having the proper geocoordinates to start your application. ALBAN:
Yeah, that's right. You can georeference
any Unreal database, even if you don't create
it by geospatial data. So in order to define
our GeoReferencing in-- oh, why-- my plugin
is not shown. Why, why, why? Settings, to Plugins,
Georeferencing. This one is there. Why is it not showing? I need to fix it, because
it should have brought me another component. So I will close, open it
again, just to be sure. I have it there. Good. SÉBASTIEN: So
while you're reopening, we had a couple of questions
here about the two main plugins that you're using today. And I'll take care
of these as you're relaunching the application. ALBAN: Yeah, thank you. SÉBASTIEN: So some
questions came to us about-- ALBAN: It's ready. SÉBASTIEN: --the ESRI
plugin for the ArcGIS Maps SDK, as well as the Cesium
for Unreal plugin, in terms of availability. So just to restate that
the ESRI ArcGIS Maps SDK plugin for Unreal
Engine is already available as a beta on the
beta program from ESRI. We will add links, after the
fact, to the conversation. So you'll have all these links. We will also put the link on
the preview page for the Cesium plugin. The Cesium plugin will
be available for everyone as of the 30th of March. So the wait is
almost over, right? [INAUDIBLE] if you
currently had it. But everything that
Alban is using today will be in your hands, at the
latest, on the 30th of March. You already can download
your ESRI plugin. It will work directly
based on what Alban will demonstrate in a bit. And from the Cesium plugin
perspective, the 30th of March is your D-day. When it comes to the
geocoordinate plugin that Alban will
demonstrate in a second, it's already available
on GitHub for everybody, on the Unreal Engine
plugins on GitHub directly. We will share that link as well. Sorry. Go ahead, Alban. ALBAN: Yeah,
yeah, no problem. Related to GitHub, the one currently on GitHub is the older version. But I will push the newest version to GitHub very, very soon. Anyway, so I don't
know what went bad. But now I have my
plugin loaded correctly. And I have a new
kind of Actor, which is the GeoReferencing system. I can just grab it, drop
it in my level, and there you can define your georeferencing. We have two modes-- flat planet and round planet. In that case, my
terrain is flat. I would say it's
projected like a 2D map. And we are in the
case of flat planet. We will see the round
planet case later. If you are familiar with geospatial data, you know that this is a projection system. And every coordinate system has a name. There is the EPSG number, you have the well-known text, you have the proj string, and so on. I will not elaborate
about that today. But when I created this
terrain, I asked that my-- I won't say that. I'm located in France. And in France, there
are two UTM zones. The first one I used is the
UTM zone 31 North simulation. Maybe people know this zone. And I just need to declare that
I'm working on this EPSG coordinate system. It means UTM 31 North. So geographic 1 is for when you
are expressing your coordinate in latitude and longitude. And this one, 33, 26,
is the one for WGS 84. And now where is my location? Where is my initial
terrain location? When I created the terrain, I put its origin, the 0, 0 of Unreal, at one specific location, which was expressed in that coordinate system at 5,500,000. It's a big number, but it's a classical projected-coordinate value. So that's all. We have defined that
our Unreal origin is at the specific location
on Earth, expressed by using these coordinates
in this projected coordinate system. That's all. Related to SunSky,
the SunSky still needs to be located on Earth. So in that case, we have to
make something consistent. And I translated this coordinate
into latitude and longitude using, I would say, an
online coordinate translator. It's really straightforward. And its time zone is GMT plus 1. Now, where is north in all that? If I zoom in on the compass mesh, you have a compass-- oh, I'm moving really fast-- that shows where north is. And in fact, my north is in front of me. So I need to turn my SunSky to make it point directly to the north. So in that case, I need to say,
OK, north is minus 90 degrees. And when I do that,
and I exit SunSky and change the time of
day, you will see my sun in the west at sunset, and so on. You know the story. OK, so we have set up. So up to now, I just
made some demonstration and I didn't prove to you that we were at the right location. So if I now show
my plugin content and go to the GeoReferencing
plugin content, we provide some utility
widgets that you can run, the Editor Utility Widget. And it displays your actual camera location in engine coordinates, in the projected coordinate system of choice-- the one we defined-- in latitude/longitude, and also in geocentric coordinates. Geocentric is a
particular frame which is also called ECEF,
which is centered at the center of Earth. And you can also point some
coordinates, some point, with your mouse. And you will get, in the
bottom part of it, the coordinates you
are pointing to. So in that case, if I tried to
point something around Paris, the texture is quite ugly,
but that seems to be Paris. And 3 degrees
longitude, 49 latitude. It's fine. And in fact it was
not Paris at all. Paris is there. [CHUCKLES] There we go. That was the first proof
of a geocoordinated system. But we can add
other things to it. We also provide a UI which is a kind of status bar that displays your viewport coordinates. It's very straightforward. To add it to your
environment, you go to your Blueprint. Event Begin Play--
you create that widget-- create widget-- of which kind? This geographic status bar
with player controller. And you just add
it to the viewport. Yeah. That's all. Compile. Save. And now, if I click
Play, I go to Runtime. I'm below the terrain
of course, because I didn't change my Player Start. So let's go there and make
my Player Start there. Snap Object to View, Play,
and now, OK, here we are. And got to full
screen to make it more visible to see, at runtime,
the bottom of the screen, that we are located
at some point. I don't move. But it's quite normal
because I didn't set up a pawn to move correctly
with that scale. So I need to make a pawn that
will enable us to go faster. So let's create a
Blueprint classes. It's classical stuff. I create a game mode. I want to call it Simulation. SÉBASTIEN: The reason for
you to have that pawn with that very high multiplier for your
movement is because suddenly your application is
real scale, right? Your scale is 1-to-1. And the environment
that you're taking is multi-thousands
of kilometers now. ALBAN: Yes, that's
why I need to go very fast. SÉBASTIEN: Otherwise the pace you would get would be far too modest in
the placements that you will get. That's why we are creating
that enhanced pawn to move your viewpoint
or your player faster. ALBAN: Yeah. So in that case, I just go
to the Movement component and add a bunch of zeros there. It's quick and dirty. You should have
actual relevant speed or make something
cleverer, where you change the speed based on
altitude, something like that. You can go on anywhere. We created this pawn more clever
than that, one that was changing its speed based on altitude. So now my simulation game mode
will use this simulation pawn as a base pawn. I need to go to the Project
Settings, Maps and Modes, and ask to use my
simulation game mode. And I would say you can start
with georeferenced terrain. VICTOR: And this is
typical Unreal framework stuff that applies to pretty
much anything you'd use the engine for. ALBAN: OK. And now I have something
which is going crazy fast. I can check that my coordinates
are changing dynamically. And I can also see that I'm
very high from the ground. OK? Next thing I wanted
to showcase, really to prove that we are in a fully georeferenced model, is that, with this plugin, we will
provide a location probe. It is a Blueprint that you
can drag at a new location in your scenes. Or I just dropped it on the sky. So whatever. And this probe, when
I click on Play, you'll notice that
it will display-- zoom-- it will display
its actual coordinates. These are the Actor's coordinates, expressed in the three coordinate systems we had before, the first one being
and the third one, the geocentric one. It's very straightforward to
make some point of interest and to check that your level
is at the right location. I just wanted to
illustrate, if you look at this Blueprint,
what it does. It has a location widget
which is a probe text. And everything is
computed there. It's basically some strings. And if you go to the graph, you see the interesting part, how the GeoReferencing plugin works. So at each tick on my probe, I
that, geographic-- anyway, I don't
know why I got it. And when you have
access to this guy, you can go to GeoReferencing. And you have different
functions that will enable you to convert
your coordinates at runtime. You can go from
Engine to geocentric, Engine to Projected. And once you are in
geocentric or projected, you can go to geographic,
latitude/longitude, and back and forth and so on. So what I do that is I take
my engine location, I express it in the
projected coordinate system. And I set the text
of block project. And these projected
coordinates, after that, I convert them to geographic,
latitude/longitude, and I set text block with
latitude and longitude. And same with projected
to geocentric. So you have access to that in
Blueprint, but also in C++. So if you want to make
accurate conversion of if you of incoming Actors
coming from, let's say, a sky tracker system
when you get aircraft location expressed in
latitude/longitude, you would just go in
this Blueprint node. And automatically it
will put your Actor in the actual Unreal
location which is consistent to the
location you defined before. Last thing you may probably
notice is that, in our probe, we have three arrows, it's
east and north and up vector. If you have played with
large database for simulation and you reach high
latitude, you will notice that, when
you are dealing with projected
coordinate system, the north is
not the y vector. You have a slight angle. It could be very little. But in my former life, we
made a shooting simulator for BMP-3, which is
a military vehicle. And there was a bug
in our simulator. And we confused the north
thing and the y-axis. And the small error angle
make us miss our target when shooting at target. So if you are doing
serious simulation, you probably know what I mean. This plugin will
take care of that. So that's all for the second
part, where I explain about how to georeference the world. So do we have other
questions right now? [INTERPOSING VOICES] SÉBASTIEN: Sorry. Go ahead, Victor. VICTOR: No, I was just
going to say, yes, we do. Seb, please go ahead. SÉBASTIEN: Sure. I mean, we have a
couple of questions that I think we can
handle further down the road about the plugins. But one thing that's
interesting here, I mean, we already went two steps. The first step is we created
a project out of nowhere. And we created
that base project. And the other
thing we did so far is we injected the geocoordinate
plugin into a project. And we demonstrated
how it reacts and what are the potential
of the plugin at the moment. But we showed a lot of
things within the editor. How can it be used in
applications in real time? What's the next
step in that, Alban? ALBAN: So in real time,
if you are creating your own plugin, you can add a dependency
to it in C++-- the source files of GeoReferencing are public, of course. And you can have access to all the includes, and you can link and depend on this plugin to call the conversion system yourself. And if you are not a C++
programmer, as I demonstrated, you have Blueprint's node
that do exactly the same. I didn't go into all the nodes. But if I go there, we have
some additional methods to compute east-north vector. We have-- well, this
one is very specific-- we can get tangent
transformation at one specific
point on Earth if you have a specific
inset or a value you want to place at a
specific location on Earth. Because the Earth is round, this works on each side of the Earth. So this plugin will help you to
do all that boring computation. And it will be part of 4.27. One last concern is
that, currently, it's a very accurate plugin. We did all of our computation
in double precision. But it's running on 4.26 there. You see it. I'm not cheating at any point. And we are able to do
that by some tricks. Because currently the engine
is on single precision. And in order to be able to deal
with accurate 64-bit precision, we created some
intermediate shapes. Because we know that we cannot use doubles in Blueprint in Unreal 4. And once Unreal 5 is there, this plugin will change. And everything will be ported
to the new 64-bit system of Unreal 5. It will make our life easier. You will have more access
to internal double-precision features in Blueprint. You will no longer have to rely on rebasing tricks to deal with your large geospatial coordinates and so on. So right now, you can
go with Unreal 4 and make your geospatial application. It's not a problem. You have all the tools you need. And in the future, we plan
to support double precision in Unreal. That will save a lot of concern. SÉBASTIEN: OK,
so just to summarize-- and before we jump
to the next part, a very important point
here that you raised, the fact that you had to jump
through hoops in the creation of this plugin is linked to the
fact that 64-bit is only coming in Unreal Engine 5. But all these hoops
that you jumped through for the creation of
the plugin allows users to, today, have the right level
of precision out of the plugin directly in their computations. So they can already rely on
that geocoordinate plugin today. And the evolution that will
come in the next generation of Unreal will
simplify all the things that we do behind the scenes. But from a user
perspective, the realism and the fidelity
of the coordinates that they want to use is
already there with this plugin. So that's a very
important point. And we could take some
questions at that point, Victor, if you want to choose
to go through them. Or we could continue
and start seeing, now that we established the
main structure of this demo, it will be time to jump
on the actual plugins. Victor, do you want to
continue with the demo or jump to some
questions at that point? VICTOR: Yeah,
let's go with the demo. We might even
tackle some of them through Alban's presentation. And don't worry, everyone,
we're getting your questions. And we will make sure that
we cover as many as we can during Q&A. We got time today. So Alban, please, continue. SÉBASTIEN: So really I
think that we demonstrated something flat and
of limited size. Let's go crazy. And let's do what
we always wanted to do when we started on this
simulation journey at Epic, is to answer the big
question that we always had. When I started at Epic,
the biggest question that I had from
simulation developers was, yeah, that's cool. That's pretty interesting,
this game engine. But how big of a terrain
can you ingest into Unreal? And we were saying,
well, there's no limit. You can get the
terrain that you want. You can integrate a very large
terrain, as much as you want. And our answer was good,
but not good enough. We needed to demonstrate
something to prove a point and to say, well, you know what,
if the world is enough for you, you will be fine. And that's what we're
going to demonstrate now. It's the answer to this
big, big, big question of, how big of a terrain can
we swallow in Unreal today. And the planet is
a good answer to that. So Alban, let's have a look at
the ESRI ArcGIS Map SDK plugin that you installed
on your project. And let's see how you implement
that into your application. Because at that point,
that's only the missing part. And you will all
see, from an existing project, how quickly it goes
to cloud streaming. Go ahead. ALBAN: Yes, flat
terrains are for kids. And now it's time to switch
to grown-up terrains. So I will remove everything. I'll keep my lighting
setup with my atmosphere. And I just call the ESRI
plugin there by clicking on it. It will add an ArcGIS map
controller to my level. And now, if I select
it, I have access to a specific panel where I
can select what kind of data set I want to see in Unreal. So we have two options. First one being a local scene. It means a small scene which
is around a specific location but with a limited map extent. Expressed in meters, for instance, it is something like 1 kilometer by 1 kilometer. But if you want
to go further, you can make something big
using a global-scale scene. And that would be
the full world. So I will make something
around New York. So I need to coordinate
from New York. I will also say
that my camera will start at 3,000 meters
from the ground. And now I select
my base data set. I can have world imageries,
trees, other kind of ESRI geospatial layers. But I will just be something
living with the world imagery. And of course I want
terrain elevation. I don't like when
things are flat. So I just press on it. And now, if I click
Play, just by doing that, I have my Earth in 3D. And I can go anywhere
from the ground. And I have nothing
on my computer. Everything is streamed
in from the ESRI website. And I can go up to the
sky and to the globe. So I'm a little bit stuck. It happens sometimes, probably
a garbage collection, something like that. It will become live very soon. It's not crashing. Yes, it was just frozen. SÉBASTIEN: While
you're demonstrating that, I want to address another question that we had about the type of ESRI ArcGIS Online content that you can display thanks to this plugin. And the interesting
point is that, as you look into the structure
here of the layers that Alban is showing
right now, you can use the API key
from ArcGIS to enter any type of web maps, web scene,
layers, items, or services, to inject them into
your application here. So if you need to go online or
use the API keys that you're using for your existing
project in ArcGIS, you will be able to just copy
your identifier here, your key here, and suddenly see all
the content that you're dealing with traditionally
in the context of ArcGIS from ESRI. So anything that you have that
exists in your environment, you'll be able to look at there. So we're currently
running the second beta of this plugin that has been
released in February this year, so very recently. The team from ESRI is
planning on making it out of beta in the near future. And there will be more dates
announced by ESRI in the coming months, I believe. Currently, if you go to
the developers site from ESRI on the
ArcGIS.com website, you will get access to
the entire documentation of the plugin and will be able
to download it and download a sample project that
contains everything that Alban had to
create manually to be able to start your application. So everything that
we've done today, you can find directly from
downloading the plugin. And you will get access to
a very complete and running project in Unreal
Engine that you will be able to directly load
on your system and customize to what you want. So no hand creation is
needed at that point. You will be able to
do a lot of things by yourself just
downloading the plugin. ALBAN: Yeah. Thank you. So up to now, we were just
using the basic base map. But with ArcGIS, you can add
other layers of data sets, of course using your API key. But some of these data
sets are free to use and that's the case
for the New York, USA data sets. So I will just quickly
add another layer, which is Building. And now it happens there. And I searched for additional layers for the demo. So I will go very quick
and paste the URL there. It's the transit
layer for New York. And you can also add population
density using this URL. You could have found it on the ESRI website. And let's call it Pop layer. We have three layers. One thing we need to do-- it's not mandatory, but when
you go to World Settings, ESRI provides a sample game mode that uses-- what is it-- a sample default Pawn, which is provided by ESRI. So I encourage you to use
this pawn for better motion. And now it's the same level,
but with additional data set, where you can see the
population density along with the traffic points in
New York and the buildings. So in that case,
the free layers I took are very basic. It's, let's say, gray cubes. But if you go to some
cities like Montreal, you can have full
photogrammetry assets with your actual nice buildings. And you can see my bottom
status bar is still there. I still have my coordinate. I'm able to do whatever I
want with this data set. OK? SÉBASTIEN: Yeah, thanks. That gave a lot more context
than just the Earth's surface and the elevation. So yeah, that answers
one of the questions that we had on that topic
about the types of databases that you can load. ALBAN: Yeah. I'm not able to monitor all
the questions while demoing. [CHUCKLES] It's a
bit complicated. So I trust you. So let's switch to
the other parts. And go to Cesium. I start again from my
georeferenced world. I remove terrain and probe. SÉBASTIEN: So what
you're showing now is really restarting from the same point
where we were when we branched to the ESRI ArcGIS plugin. And you're taking the
same core of a project that you had before and starting
to use the Cesium plugin, the one that we
mentioned would be released on the 30th of March. VICTOR: Yeah, go ahead. ALBAN: That's all. We are speaking a lot,
so it takes more time. But you can create what I
did in 10 to 15 minutes. It's very, very easy. So with Cesium, you
add not a button but an additional window. And you are welcomed
by a panel that enables you to create a Cesium ion account, or use your own if you have one. So in our case, I will
we have a Quick Add feature that enables us to
choose a base map. So in that case,
we want the world. And we want to use the
Bing Maps Aerial imagery. But you can also use Label
imagery or Road imagery if you are doing something
related to mapping or driving maps on top. I press on plus, and everything
is added automatically. On top of that, you are
connected to your Cesium ion account. And you can see all the assets
that are currently related to your account at Cesium. And in my case, I have
some photogrammetry assets in Melbourne. And I will
also add it to the level. So as long as you have
assets set up as 3D Tiles, you can use that panel to
bring them to your scene. So up to now, we
don't see anything because we didn't make
any settings related to georeferencing. We need to tell Cesium where my Unreal origin is, once again, for the region. I forgot to say one thing here. If you go to Melbourne,
or probably to the world terrain, you can see it's an Actor
with some properties. In that case, we have
two ways to bring 3D tile assets to the scene. First one is when
using the ion server from Cesium, where basically
you have an asset ID and an authorization token
to have a look at it. And you can also use this URL
to point to an ion server, to an external server
of your choice, or even to your
local file on disk. So if you want to make
3D tiles outside of ion, you are able to with Cesium. So let's switch to the
georeferenced setting. We want to express our
origin in latitude/longitude. So for those who
know, Melbourne is something like this location. And if I do that,
you will probably notice that now we are
still in the editor. And we have the data
streamed in. It's already streamed in. It's very interesting. Because you can stream
the data right away, right in the editor, without
having to press Play. Last thing I didn't have data-- I forgot to update
it also on the latest demo-- is to the Unreal
GeoReferencing system. We need to, in that case,
switch to round planets. We need to say, if we want to have UTM coordinates around Melbourne, trust me, it is this one, UTM zone 55. And in that case, we will express the origin-- not the same as before, but-- oh, latitude is clamped, of course. It's not the same order. That's all. If you want to switch to
a full geocentric or ECEF mode, you can just click on that. It means that the origin is at the planet center. It's the basic definition of the ECEF coordinate system. Otherwise, let's say, the origin sits on a region tangent to the Earth's surface. We did it for the
GeoReferencing system. We need to do the
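[As an aside for readers: the two modes Alban describes-- projected UTM coordinates around a local origin, versus a geocentric origin at the planet center-- boil down to arithmetic like the following. This is a standalone, illustrative C++ sketch; the function names and structure are mine, not part of the ESRI or Cesium plugins.]

```cpp
#include <cmath>

// Standard UTM zone from longitude in degrees, ignoring the Norway/Svalbard
// exceptions: zones are 6 degrees wide, numbered 1..60 starting at 180 deg W.
// Melbourne (~144.96 deg E) falls in zone 55, matching the demo's setting.
int utm_zone(double longitude_deg) {
    return static_cast<int>(std::floor((longitude_deg + 180.0) / 6.0)) + 1;
}

// Geodetic latitude/longitude/height to Earth-Centered, Earth-Fixed (ECEF)
// coordinates on the WGS84 ellipsoid -- the "origin at planet center" mode.
struct Ecef { double x, y, z; };

Ecef geodetic_to_ecef(double lat_deg, double lon_deg, double height_m) {
    const double kPi = 3.14159265358979323846;
    const double a   = 6378137.0;            // WGS84 semi-major axis (m)
    const double f   = 1.0 / 298.257223563;  // WGS84 flattening
    const double e2  = f * (2.0 - f);        // first eccentricity squared
    const double lat = lat_deg * kPi / 180.0;
    const double lon = lon_deg * kPi / 180.0;
    const double sin_lat = std::sin(lat);
    // Prime vertical radius of curvature at this latitude.
    const double n = a / std::sqrt(1.0 - e2 * sin_lat * sin_lat);
    return { (n + height_m) * std::cos(lat) * std::cos(lon),
             (n + height_m) * std::cos(lat) * std::sin(lon),
             (n * (1.0 - e2) + height_m) * sin_lat };
}
```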
same for the SunSky to have something consistent-- even if it's something very strange to see your sun in the north at noon. I never went to the Southern hemisphere. But trust me, it works. So I put the same settings,
the right setting. And now I can just press Play
and have everything working. So I didn't rebase
my Player Start. So I'm probably a
little bit away. But now we have something
[AUDIO OUT] data set, which is coming in. So it takes some time because
I'm at home on a consumer internet connection. And moreover, three
kids at home today. So my bandwidth
is not that good. [AUDIO OUT] going
very, very fast. So it's not the best
way to demonstrate. Yeah, we are inside
the full-Earth system, where we can go everywhere. Last thing I wanted
to illustrate, if I go to one particular
location, far away from my origin, we'll notice
that your horizon is not horizontal. For sure, because we
are using a round Earth. And if you go far
from your origin, you have this not balanced
but skewed horizon. And we have a way to fix it. Cesium provides a pawn which
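[The tilt Alban points out follows directly from geometry. An illustrative, standalone C++ sketch-- my own function name, spherical approximation, not plugin code:]

```cpp
#include <cmath>

// On a round planet, "up" is the local surface normal. A pawn that keeps
// using the origin's up vector sees the horizon tilt as it travels; the tilt
// equals the central angle between the two surface normals. A globe-aware
// pawn re-derives up from its current position, which keeps the horizon level.
double horizon_tilt_deg(double distance_from_origin_m, double planet_radius_m) {
    const double kPi = 3.14159265358979323846;
    // Arc distance over radius gives the central angle, converted to degrees.
    return (distance_from_origin_m / planet_radius_m) * 180.0 / kPi;
}
```

On an Earth-sized sphere (~6,371 km radius), flying 1,000 km from the origin already tilts the apparent horizon by roughly 9 degrees.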
is, if you search for a pawn, a globe-aware default pawn. I create my own pawn inheriting everything from it-- let's call it Cesium pawn-- and, once again, go to its motion settings and add a bunch of zeros to make it faster. And once again I create another game mode, which will be CesiumGM, and tell this game mode to use this-- Cesium pawn, where is it? Yeah, Cesium pawn. Save everything, compile. And go in World Settings and
ask for using Cesium game mode. We are set. And we go and [AUDIO OUT]. And that's all. Play again. We are starting once again. You can see the status bar
with our current location. We can go to our SunSky
system, at runtime, and make something
in the morning, something nice in the morning. Yeah. And now we go anywhere on Earth. Everything is
streamed in on demand. And your horizon will
always be horizontal. And you can go, like that,
very far, at any time. If you have seen
anywhere demonstration, it's something which
sounds similar. Yeah, that's all. SÉBASTIEN: Thank
you very much, Alban. VICTOR: Yeah,
very impressive. SÉBASTIEN: That really
illustrates how simply you can go from a very basic project
to a point where you can start simulating environments. There are a lot of
questions, Victor, I believe. VICTOR: Yes. We received plenty. And so if you both are
ready, we can start diving into some of them. SÉBASTIEN: Before we're
jumping into questions, something I would like to ask our audience today-- using the same
mechanism you had when you asked us all
these questions, can you let us know what you're
dreaming of building in terms of simulation application? What's the most complex
thing that you're dreaming of creating? And why is it that tricky? Our goal is to
try and understand where you're going so that
we know where to go next. Today we demonstrated
two different ways of integrating an open
and unlimited world into your applications. But we would love to see where
you see the next big thing and what is the biggest
concern you have when it comes to simulation. And the type of simulation
you're building-- are you in civil aerospace? Are you in the medical domain? Are you in railway management
or testing of systems? We're really curious about you. And that would be
great if you can talk a little bit
more about what you're building. Now let's jump on the
questions if you want, Victor. VICTOR: On that note,
real quick, if you're watching the
stream when we're not live, feel free to use the
forum announcement post. That's the place for
discussions post-stream. It's a great place
for us where we can follow up and continue
the conversation when we are no longer live. That said, let's
go ahead and dive into some of the questions. And we will start from the top. phiggis asked, when
representing a simulation, I want to achieve an
augmented reality tabletop use case while having a scaled-down
environment with WGS 84. What approach should be used? SÉBASTIEN: So in
that specific context-- and I see Alban
laughing because that's something we're
working on currently with a specific head-mounted
display for specific contexts. But it's not more complicated
than building the sample that you have seen right now. So once you have
your application, as we discussed earlier, the way
you deploy your application is handled by Unreal right. And in your
compilation, once you're building your application, you
can select what type of devices you will build upon. Obviously no sky will be the
best answer to your question. Put a very, very black sky. Because that's what you want
for your augmented reality application. So make sure that you have the
right light for your terrain. But the trick is
getting rid of that dome that you have around it. And your atmosphere will
have to be handled properly to suppress that. You want to add to that,
Alban, at that point? ALBAN: Yeah, I
didn't see any scaling value on this current beta from ESRI. So I don't know if it's
possible to change. But for Cesium, you
just take your tile sets, you lower the scale, and it just works. VICTOR: If you're using
a stereo augmented reality device, essentially like
a head-mounted display, I'm fairly certain you'd
be able to just change the World to Meters variable,
which is a global setting. All that actually does is to
change how far away the two virtual cameras are
from each other, which, if you think about it
in real-life terms, is sort of like how an insect has a very small
interpupillary distance. And that's actually what
creates our sense of scale. And so we have that opportunity
inside Unreal Engine to just change the
World to Meters scale. And that will actually
adjust the perceived scale when you're viewing this using
a stereo device such as a VR headset or a head-mounted
augmented reality display. And that should actually
work with the plugins as well, considering that you're
actually not changing any world scale, you're just
changing the way that we are perceiving
what we have in the world. Let's move on to a question
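[Victor's point about eye separation and scale can be sketched as plain arithmetic. Illustrative only-- invented names, not engine code; 100 units per meter is Unreal's default World to Meters value:]

```cpp
// In Unreal, World to Meters says how many Unreal units make one real-world
// meter (default 100). For a stereo device, the virtual camera separation in
// world units is the wearer's real interpupillary distance (IPD) scaled by
// that value, so changing World to Meters changes the perceived scale of the
// scene without moving or rescaling any geometry.
struct StereoView {
    double ipd_m;            // real IPD of the wearer, e.g. 0.064 m
    double world_to_meters;  // Unreal units per real-world meter
    double separation_units() const { return ipd_m * world_to_meters; }
};

// Perceived scale relative to the default of 100 units per meter: raising
// World to Meters widens the eye separation relative to the scene, so the
// world reads as a miniature -- the tabletop effect Victor describes.
double perceived_scale(double world_to_meters) {
    return 100.0 / world_to_meters;
}
```

For example, raising World to Meters from 100 to 1000 makes the streamed terrain read as a 1:10 model on a table, with nothing in the level actually rescaled.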
from Raulpassos1808. Applications often require
online data reception to be more realistic. What are the ways
of receiving data that you visualize that will
facilitate this connection? Something like
Livelink or USD files? SÉBASTIEN: You want
to take that one, Alban? ALBAN: Yeah, but in
that case, the terrain server, the terrain data
provided by ESRI or Cesium is already
their own data with streaming capabilities. The only thing we do
is getting this data and integrating them in Unreal
as any other kind of geometry. They are live in the
physics system and so on. Since it's just a layer
of geometry in Unreal, you can get other sources of
data in your Unreal level. You can use USD, for instance,
because Unreal supports USD. And you can make some insets
at some specific location on top of these data layers. It's very straightforward. You have to think you
are not in a GIS tool. You are in Unreal Engine,
which is a game engine. And you can mix everything. It's just your imagination
that is your limit. Of course, in that case if
you imagine the amount of data for the whole Earth at 50
centimeter scale from imagery, you cannot host it in your hard
drive unless you have a very good hard drive and
access to the data. So you need to rely on
online services to do that. But if you are more
interested in a smaller area, you can have it also
in your hard drive or your local
server, not relying on an internet connection,
but just a local network to have this put in Unreal. And also for security issues,
sometimes it's mandatory. We need to differentiate the
hosting mechanism and streaming mechanism and the
rendering mechanism. VICTOR: Next question
comes from nicolasfrm. What is the best way to simulate
the real-world global map, say for the purpose of
creating a strategy game, so that when the user clicks
on the province and country, information pops up, et cetera? SÉBASTIEN: Same way
Alban described the concept of interaction in
your application. You can integrate, on
top of your environment, the type of data set
that you want to map with your application. And using segments
and specific locations that you want to predefine,
you can outline any region of the map that you want. And like you would do in
any interactors in Unreal, you would be able to point
to a specific location. And all these highlighted, augmented terrains will react to
either the entrance of a character on the
specific territory or the presence of a collision
with any region of your map. It would provide you
a set of information that you would pre-register
on your specific location. ALBAN: And ESRI
is working on the future by bringing some
spatial requests. Because when you are in
the geospatial industry, you have some vectors with
some attributes and so on. And the future will be to be
able to make these requests. But right now, I think
there is a shapefile plugin in the marketplace where
you can import your vector data and get the associated
information. I didn't try it for a long time. But I think, with a
GeoReferencing system, it should work and get you your geometry. VICTOR: Next
question comes from phiggis. What is the road map of
the geolocation plugin? ALBAN: First
to have it part of 4.27. Second, to make it
switch to 64 bits. And third, everything you want. Just express your needs
and we will fulfill them. VICTOR: Once
again, going to point to the forums as a good
place for suggestions, recommendations, and
discussions around the topic. SÉBASTIEN: Exactly. At the moment, it's
part of the plugin that we distribute on GitHub. But moving forward,
it will be part of the distribution of Unreal. VICTOR: Let's see. Next question comes
from Monochro100. Since the slide mentioned it, is there any news on Cesium for Unreal Engine?
are plenty of news. [CHUCKLES] So there had
been a recent blog post on the Cesium website
on that topic, defining more in detail the
roadmap and the release time frame, as we mentioned earlier. One thing that's
important as well is that, if you want
to stay connected, they currently have,
on their website, a registration form on which
you can enlist yourself to make sure that
you're up to date each time they are posting
news on that specific topic. ALBAN: I put
the link on the chat for-- on the Twitch. You can probably do
the same on YouTube. VICTOR: Thanks, Alban. We'll try to gather
some of them. And we can put them on the
forum announcement post as well, under the
Resources section. Mathieu Tremblay asked, in
the simulation industry, we make extensive
use of ESRI formats. It would be great to have a
seamless integration in Unreal editor to be able to
read/write these formats. Will this plugin allow that? I think that's a yes? SÉBASTIEN: Well,
currently, that's a no. [CHUCKLES] I mean, our goal with
Unreal is to, at that point, consume the data set that are
created by these fantastic generating tools that you
use in the ArcGIS ecosystem. And that's how you create
your content, that's how you create your maps and
all your data sets for ArcGIS. Same thing goes for 3D tiles. The Unreal Engine will be
leveraging the content, and at this point will not
be creating the content. There are pro tools
and middleware that have been
developed over the years by all these partners. And we don't aim to replace
it but to actually consume the terrain data that
are generated by them. ALBAN: And probably,
yes. For now, we need to press Play to have our data streaming in the editor. That's currently a beta phase. And if there are a lot of
requests from customers to have it available
straightaway in the editor, that is probably something
that ESRI can consider. And we can help them in
implementing such a feature. VICTOR: Next question
comes from ChimeraXR. How does photogrammetry
work with Unreal? If I want to use a drone to
send imagery to a VR headset not in real time,
can UE4 support it? SÉBASTIEN: ChimeraXR,
your question is fantastic. We love this question. It's really the way
to go moving forward. I mean, you remember, at the
beginning of our session, when I mentioned that the
evolution of the pipeline is super important. Well, traditionally,
the sources of files that we are using to generate
your virtual terrains were only linked to GIS files,
GIS files that were sometimes outdated, sometimes
not containing all the right information. And that's a lag in the
pipeline of creating your georeferenced environment. The use of
photogrammetry is really where we, collectively, as
a community of simulation creators, see the future. The USGIF is talking about it,
the OGC is talking about it, everybody's talking about it. That's the subject. That's today's
subject-- how do you take photogrammetry elements
and bring them to Unreal Engine? Well, the use of the ESRI
ArcGIS data set as well as the use of the 3D tiles data set
allows you to do exactly that. And that's why, in the terrain
that you saw, for example, in the Cesium data set,
when Alban was flying over Melbourne, the
data set that you saw there was direct
photogrammetric assets coming from drone captures. And that's one of the
beauties of this process, is that once you reach the
format that you want-- and once again, we're not creating
the format data set here. And in that example, we're
just consuming the 3D tiles. But once you ingest your drone
data set into your 3D tiles, you can just load
it into Unreal. And it will just work. That's part of the beauty of
leveraging standards for us, is making sure that, no
matter what format you use, it's ingested globally. One of the formats,
the core formats that is super important
in that process is the glTF format that Unreal supports natively. And it allows a strong acceleration of that pipeline as well. There will be more
coming from Epic on photogrammetric
ingestion into Unreal. But there are a couple
of topics that we'll be able to talk to you more in
the coming months, I believe. VICTOR: Exciting. Next question comes
from Martin Sporrer. Is there already a
multichannel synchronization in Unreal Engine? Use cases for multichannel
projection systems? SÉBASTIEN: So we alluded to
the nDisplay display mechanism in Unreal. But I'll let Alban give you more
detail on that specific topic. ALBAN: Yeah. So in Unreal, we have a
technology called nDisplay, which is basically
doing everything you need to have a multichannel
visualization system. So basically you have
a host, a master, and several clients
connected to the master, where everything is synchronized--
all the properties are synchronized-- and then you
can even framelock or genlock your projection system
using that technology. And you know where
is it used, it's used in virtual production for
all of the big screens with LED panels everywhere. And you can define
your camera location and the system
will automatically take into account your
projection system shape if you are projected on a dome,
or a flatscreen, or something-- anything you want. It supports all the basic formats, the standard formats related
to projector calibration. And the only thing
you have to do is just create an
Unreal application and enable the nDisplay plugin,
set up a configuration file related to your projection
system, and it just works. So you can take this
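[The synchronization Alban describes can be pictured with a toy frame barrier. This is not nDisplay code-- just a hedged, self-contained sketch of the framelock idea, with invented names:]

```cpp
#include <condition_variable>
#include <mutex>
#include <thread>
#include <vector>

// Sketch of the core idea nDisplay provides out of the box: every render node
// blocks on a barrier at the end of each frame, so no channel of the
// multichannel display can run ahead of the others.
class FrameBarrier {
public:
    explicit FrameBarrier(int nodes) : nodes_(nodes) {}
    void end_of_frame() {
        std::unique_lock<std::mutex> lock(m_);
        const long frame = frame_;
        if (++arrived_ == nodes_) {   // last node to arrive releases everyone
            arrived_ = 0;
            ++frame_;
            cv_.notify_all();
        } else {                      // wait until the whole cluster is done
            cv_.wait(lock, [&] { return frame_ != frame; });
        }
    }
private:
    std::mutex m_;
    std::condition_variable cv_;
    const int nodes_;
    int arrived_ = 0;
    long frame_ = 0;
};

// Run `nodes` simulated render threads for `frames` frames in lockstep and
// return how many frames each one completed (all equal if the lock holds).
std::vector<int> run_cluster(int nodes, int frames) {
    FrameBarrier barrier(nodes);
    std::vector<int> completed(nodes, 0);
    std::vector<std::thread> pool;
    for (int n = 0; n < nodes; ++n) {
        pool.emplace_back([&, n] {
            for (int f = 0; f < frames; ++f) {
                completed[n] = f + 1;  // "render" the frame
                barrier.end_of_frame();
            }
        });
    }
    for (auto& t : pool) t.join();
    return completed;
}
```

In the real system the nodes are separate machines and the barrier can additionally be genlocked to the projection hardware; the lockstep principle is the same.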
geometry layer coming from ESRI or Cesium into
your Unreal application and make your flight
simulation inside the editor. That's all. VICTOR: And potentially
invest in a rather large farm of computers that can handle-- ALBAN: Yes, for sure. We can have several channels
on the same computer. But if you are really
looking for high performance, you should really have
a cluster of computers. VICTOR: There have been
some really cool demonstrations of the dome experience using
nDisplay in Unreal Engine, even for live performances. There's plenty of
content out there. ALBAN: There was a
demonstration on our booth at ITSEC in 2019. No, the last one before. VICTOR: The next question
comes from Yvain Tisserand. I would like to know if
you plan to develop things for virtual humans,
having NPC that can be more easily controlled
for simulation application-- chatbot system,
animation, locomotion. SÉBASTIEN: There's
a lot being done. I mean, we've all seen
the MetaHuman Creator capabilities. And that's a great
link to your question. I mean, if you didn't have
a chance to look at it, search for "MetaHuman Creator for Unreal Engine." You'll see a lot of
information about it. And these humans, while
sometimes looking, like, overwhelming for
developers are actually pre-rigged. So you can use them directly in
your application as simply as the terrible avatars that you were using in the past, and manipulate
them the same way. So the same logic of
control applies and allows you to, while using the same
mechanism that you've always had for chatbots and
conversational avatars, create things that are
way more realistic, allow you to have more
realistic interactions with these virtual humans. So that's a question that
almost looked planted by us but it's not planted by us. I hope I answered
correctly to that question. ALBAN: Yes, but
I will add something. When you have your
character inside Unreal, you have also character
animation tools that enable
animation state machine, say, with all your
phased animation. But on top of that, you have
the inverse kinematic system where, on top of any animation,
you can drag your character and put it at a
specific location to grab something and so on. You have all of these tools
already in the engine. VICTOR: Next question
comes from John Graham. And I think we
covered this, but when will Cesium be available? SÉBASTIEN: Yeah. Same as mentioned
earlier, if you missed it. It will be available on the
30th of March this year. So, you know, a couple
of weeks away from us. It's coming up. So if you want to be sure of
being informed very early, you can register on
the Cesium web page to be part of their
communications. VICTOR: Next question comes from Martin Sporrer again.
Is double precision already launched? SÉBASTIEN: Alban, do
you want to take that one? ALBAN: Yes. It depends what
you mean by launched. If you mean that it's available for a user, I would say no. But if you mean if we're working on it currently, the answer is yes, and it
would be part of Unreal 5. But even if you don't want to
wait for 5 to experiment with it, Unreal 4 already comes with a
rebasing mechanism, rebasing system, and you can dynamically
change your Actor origin to be able to support large
coordinates while using float precision. I did a webinar. We can probably
point back to it. It's available on
the Unreal Engine channel-- "Working with Geospatial Data in Unreal Engine," something like that. And I explained this
system, showed an example of a rebasing
Actor that can trigger. So rebasing system in
Unreal automatically, it is a system
which is used when you are doing world
composition, but, in fact, you can use it at any point. It's just a matter of setting
a translation-- all Actors in Unreal react to this translation, and you can support
large coordinates or not. But the rebasing comes
with a cost, as well. When you do the rebasing,
you have a small hitch in your performance. So that's why 64-bit will
save a lot of things, and our internal experiments
so far proved that we will not suffer any performance drop. So I'm waiting for it a lot. VICTOR: Next question
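[The float-precision problem and the rebasing fix Alban describes can be seen in a few lines. Illustrative only-- the names are mine, and this mirrors the mechanism, not the engine's actual implementation:]

```cpp
// Why large coordinates hurt in 32-bit floats: at 10,000 km from the origin,
// a float only has about 1 m of resolution, so centimeter-scale motion is
// silently lost. Origin rebasing subtracts a large offset once, in double
// precision, so per-frame math happens near zero where floats are precise.
float step_survives(float position, float step) {
    return position + step;  // may round straight back to `position`
}

// Rebase: keep a double-precision world origin and store only the local
// offset from it as a float.
float rebase(double world_position, double new_origin) {
    return static_cast<float>(world_position - new_origin);
}
```

With a native 64-bit world (as planned for Unreal 5), the subtraction becomes unnecessary, which is why it also removes the periodic cost of shifting every Actor.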
comes from phiggis again. Does this approach work well on
a VR device, such as the Quest? SÉBASTIEN: Yeah, so as
we pointed out earlier when we talked about AR, I mean,
in the Quest, which is VR, the same approach
will apply, right? When you're building your
application in Unreal, you will be able to
build your application and generate it for different
devices, the Quest included. And as long as you're
connected and as long as you have the
right data flow, you would be able to load
your data set the best way possible in the Quest. You can also keep
some data set local. If you have, you
know, a data set that is local that you
want to capture there, you can load your data set
for your 3D tiles for example, and have them
locally on your Quest based on the
terrain on which you want to operate your scenarios
or your training environment. VICTOR: I saw some
chat around the possibility-- I'm not sure if we received
the specific question, but I was curious. Do all the tools that we
showed in editor today also work in a packaged
version of the project? ALBAN: Of course. VICTOR: So not
only editor time, also at runtime in a
packaged project. Yeah. ALBAN: Yeah, of course. VICTOR: Let's see. Next question comes
from Marvin235. When importing data sets of a geographic location, how does the engine manage
to create the collisions for the models and the surface,
and how does it perform with extra-large objects? ALBAN: Good point. I cannot answer it for ESRI,
because I didn't have a look at the internal stuff. But for Cesium, the
collision meshes are built on demand
when streaming in the engine on
a separate thread, so it's not creating any
spikes in the performance. And they are present
in this [AUDIO OUT]. And it's probably something
similar with ESRI, but I'm not 100% sure. VICTOR: Next question comes from Katrina K. Can you
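[The pattern Alban describes-- cooking collision for streamed-in tiles on a worker thread so the game thread never hitches-- looks roughly like this. A hedged sketch with invented names, not the plugin's actual implementation:]

```cpp
#include <future>
#include <vector>

// Stand-in for a cooked physics representation of a streamed tile.
struct CollisionMesh { std::vector<float> vertices; };

// Pretend cooking step: here just a copy; a real cooker would weld vertices,
// build acceleration structures, and so on.
CollisionMesh cook_collision(const std::vector<float>& render_vertices) {
    return CollisionMesh{render_vertices};
}

// Kick off cooking asynchronously. The game thread keeps running and only
// calls .get() on the future when the tile actually needs collision, so the
// expensive work never blocks a frame.
std::future<CollisionMesh> cook_async(std::vector<float> render_vertices) {
    return std::async(std::launch::async,
                      [v = std::move(render_vertices)] { return cook_collision(v); });
}
```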
access global data outside of the continental US? I'm working on a project
set in Sierra Leone and the GIS information
is challenging. SÉBASTIEN: So
the answer is yes. As long as the data set
is available, you know-- and that's always a
question of data set, right? We're not producing that-- we're not owning the data
sets here in these two demos that we showed you. As long as data set exists
in your environment, either on the ArcGIS domain
or on the Cesium domain, you'll be able to load
it into Unreal Engine. And depending on
the regions, you will have more or less
detailed data sets, because you will
have more or less detailed data set survey
authorized in a country or in a region. But we can give you
a couple of pointers. In the forum, we would
be able to give you more pointers towards
the sources of data that we used here. ALBAN: Yeah. VICTOR: Another
question from Marvin235. Can you compute light or static shadows with this type of data, or is it recommended to keep the light source as movable? ALBAN: No, it's not
recommended, because your data set will be huge and
your light map costs will be too demanding. Light maps are well suited
for interior, stuff like that. But light maps are often
needed for indirect lighting, and we have Lumen coming very soon. So you will have-- light maps will probably be part of the past for some applications.
comes from AngryBaguette. With the actual plug-in,
do we have access to feature database attributes? Is it possible to have some
interoperability with feature selection and modifying the
appearance, for example? SÉBASTIEN: So
depending on how you set up your application, you have
the ability, layer by layer, to have different treatments
on the way you would display things, either for one
plug-in or the other. That's a very simple and short
way of fixing that question. However, the complexity that you
will be facing moving forward is the need to have
very semantic content in your application. And the semantic
metadata that is coming in all descriptors
of these terrains are really your best friends
moving forward, right? So as the 3D tiles
next is coming up, you will see more
metadata that allow you to have specific
treatment for your data set. As for the Esri ArcGIS,
leveraging your layers would be a very good
answer in the short term. And we'll see in the coming
releases what are the best ways to interact with this content. ALBAN: But I
would add something else. You are inside Unreal. You can write your
own Unreal plug-in. And if you click on one
point and it's something in the geometry, you can
get the actual location-- latitude, longitude, whatever--
and take this location and from inside
your plug-in could make a request to another
geospatial data center and say, OK, please
give me the vector from this layer at this location
and then give me the data set. And you'll get it in
your plug-in and use, let's say, a UMG widget
to display it anywhere. When we integrated an aircraft tracking system, we made some web requests
to another server to gather aircraft data
and displayed it in Unreal. It was not in the plane itself. We took the information of
the plane to make requests to another external server. VICTOR: Next question
comes from Steven Haslemore. Is content only loadable via-- I think they are asking online,
or can we self-host content? SÉBASTIEN: I
think the question is, is the content only
loadable via ion, or can we self-host content? For those who are
not familiar with it, Cesium ion is the repository,
basically the server, from Cesium. It's a 3D tile
server, if you want. And in the demo
we gave you today, we connected to the
Cesium ion server directly into the application. There are three ways you can get
your 3D tiles into the Cesium for Unreal plug-in. You can get them from the
Cesium ion through the internet, exactly like Alban did today. You can have a local Cesium
ion server on location, you know, where you are, on your intranet. So you have your own
little personal server. Or you can have your repository
of 3D tiles organized on your computer, and these
can be served directly into the plug-in without
requiring a specific ion server. Right, so three
ways are possible. So the short answer is, no. It is not only loadable via ion; content can also be self-hosted. VICTOR: All
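Since a 3D Tiles tileset is just a tree of static files rooted at a tileset.json, the third option, self-hosting, can be as simple as putting the files behind any static file server. The sketch below, plain Python for illustration, serves a minimal placeholder tileset.json from a temporary directory and fetches it back over localhost; a real tileset would of course reference actual tile content, and you would point the plug-in at the resulting URL.

```python
import json, os, tempfile, threading
from functools import partial
from http.server import HTTPServer, SimpleHTTPRequestHandler
from urllib.request import urlopen

# Minimal placeholder tileset.json (a real one references tile content)
root = tempfile.mkdtemp()
tileset = {"asset": {"version": "1.0"},
           "geometricError": 500,
           "root": {"boundingVolume": {"region": [0, 0, 0.1, 0.1, 0, 100]},
                    "geometricError": 100, "refine": "ADD"}}
with open(os.path.join(root, "tileset.json"), "w") as f:
    json.dump(tileset, f)

# Serve the directory on an ephemeral localhost port
handler = partial(SimpleHTTPRequestHandler, directory=root)
server = HTTPServer(("127.0.0.1", 0), handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# This URL is what you would give the plug-in instead of an ion asset
url = f"http://127.0.0.1:{server.server_port}/tileset.json"
fetched = json.load(urlopen(url))
print(fetched["asset"]["version"])  # the tileset round-trips intact
server.shutdown()
```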
right, let's move on. We got a couple more here. Brian Windsor asked, is
there an ion database-- now that I understand what ion
is, and that it's not a typo. Is there an ion database
you can download locally, or do you have to
create it yourself? SÉBASTIEN: So, yeah. You can use a free registration
to the Cesium ion database and start using it today. Or you can, as we mentioned,
create your own 3D tiles, putting together the right
pipeline on your end, if you want to control the
content from A to Z. All right? You can use a repository,
which is usable today. Or you can find another way. Yeah. So that's pretty much the answer to that. VICTOR: Next question
comes from Milan Petrovski. What is the best way to simulate cities that don't have aerial photogrammetry? It doesn't need to be the exact same buildings, but I need the layout with generic buildings. OSM, I saw, is one way to go. Is there another way? SÉBASTIEN: So
as you pointed out, OpenStreetMap is a type of layer from which you can easily get data. That way you make sure your elevation, building footprints, and road network are correctly defined. That can be a very
simple way that works, you know, in different
environments easily. And on your OSM
data set, you can apply any treatment that
you want in the look and feel of your application. So you can apply, you know, randomly distributed shaders so that your buildings feel like buildings rather than just a gradient of colors. But there might be other ways. Alban, do you have
any ideas around it? ALBAN: Oh, yes,
or better than that. There is already a free
plug-in on the marketplace that does it for you. It's Vitruvio from Esri, and it
enables you to generate cities right into Unreal. You can go-- I put it on the
chat for Twitch and YouTube, and you can have a
look at it if you want. SÉBASTIEN: Thanks. VICTOR: Thank you, Alban. Let's see. Next question comes
from ComradeInCharms. With the GeoReferencing
plug-in, could you describe how one would warp
a flat mesh with a given CRS to a curved mesh? ALBAN: Oh, nice. Nice one. I think there is no-- well, it's a funny one. But yeah, basically, the GeoReferencing plug-in exposes some methods and functions to convert any location to any other location. And it's accessible from a
Blueprint or C++ perspective. So currently I would say, no, there is no way to curve
a flat mesh using it. There is no magic
function to do that. But it could be possible to
write a C++ plugin that loads the mesh and iterates
over all vertices, getting their XYZ coordinates, converting them to ECEF coordinates, something like that, applying some rebasing, and creating another mesh for you. So you could use this
function to, of course, curve it correctly. But it's not available
out-of-the-box. VICTOR: Next question
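The vertex-by-vertex conversion Alban sketches boils down to the standard geodetic-to-ECEF formula for the WGS84 ellipsoid. Here is a minimal sketch in plain Python, assuming vertices carry (latitude, longitude, height); curve_mesh is a hypothetical helper, not a GeoReferencing plug-in API, and a real implementation would also rebase the large ECEF values around a local origin to preserve float precision.

```python
import math

# WGS84 ellipsoid constants
A = 6378137.0                 # semi-major axis, metres
F = 1 / 298.257223563         # flattening
E2 = F * (2 - F)              # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h):
    """Convert latitude/longitude/height to Earth-Centered,
    Earth-Fixed (ECEF) coordinates in metres."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = A / math.sqrt(1 - E2 * math.sin(lat) ** 2)  # prime vertical radius
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1 - E2) + h) * math.sin(lat)
    return x, y, z

def curve_mesh(vertices_geodetic):
    """Hypothetical helper: map each (lat, lon, height) vertex of a
    flat mesh onto the curved ECEF frame."""
    return [geodetic_to_ecef(*v) for v in vertices_geodetic]

print(geodetic_to_ecef(0.0, 0.0, 0.0))  # → (6378137.0, 0.0, 0.0)
```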
comes from Alexander Kranz. Is it supporting CDB OGC? SÉBASTIEN: So,
it's a good question. We talked about the three main database formats that are supported by the OGC. And we didn't demonstrate
in that specific demo for lack of time all the
plugins that exist for Unreal. There is a plug-in
for the OGC CDB that has been created by
our friends at SimBlocks. So if you dive in-- we'll send you
the link, as well. So SimBlocks.io is the website,
and they released in December last year the CDB OGC
plug-in for Unreal Engine. It's available through them
directly, and you can-- yeah, you can use your
CDB formats directly in the context of Unreal
using that plug-in. ALBAN: Yeah. I dropped it in the chat. SÉBASTIEN: Thank you. So yeah, it's called
the CDB SDK for Unreal, and it's coming to
us from SimBlocks.io. VICTOR: Making
sure I'm gathering these for the formal
announcement post, as well, to make
sure that they exist somewhere a little bit more
permanent than the livestream chat. I've seen a couple of
questions in regards to Python. Python seems to be a
language that is commonly used in simulation applications,
and I just wanted to reiterate. We talked about this before,
but there are currently no plans to build runtime
support for Python. Python is exclusively
used as a tool to help you import and do all
kinds of automated processes when you're working with the
editor, and not as a language that you would use to
write runtime logic. ALBAN: Yeah, but if
you are looking for examples that mix runtime Unreal
application along with Python scripting, you
can have a look at what the people from CARLA did. CARLA.org. It's an open-source driving simulator for autonomous vehicles. And they're able to make a
bridge between some Python scripts and Unreal at runtime. You create your scenario for
autonomous driving in Python and you have Unreal
running it in runtime. VICTOR: We've
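The CARLA-style split, an external Python script driving a running simulator over a socket, can be illustrated in a few lines. In this sketch a background thread stands in for the engine side, and the newline-delimited JSON protocol is entirely made up for illustration (CARLA's actual RPC protocol is different); in Unreal the server side would live in C++ inside your game module.

```python
import json, socket, threading

def runtime_stub(server_sock):
    """Stands in for the engine side of the bridge: accept one
    connection, read a JSON command line, acknowledge it."""
    conn, _ = server_sock.accept()
    with conn:
        cmd = json.loads(conn.makefile("r").readline())
        reply = {"status": "ok", "spawned": cmd["actor"]}
        conn.sendall((json.dumps(reply) + "\n").encode())

# Listen on an ephemeral localhost port
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
threading.Thread(target=runtime_stub, args=(server,), daemon=True).start()

# "Scripting" side: send a scenario command, read the acknowledgement
client = socket.create_connection(server.getsockname())
client.sendall((json.dumps({"cmd": "spawn", "actor": "vehicle.tesla"}) + "\n").encode())
response = json.loads(client.makefile("r").readline())
client.close()
print(response)
```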
received some more questions while we were asking them here. Some of them are general
Unreal Engine questions. Let's see. Quick glance through. Leandro Oliveira
asked, will Cesium be able to gather 3D
models from places with little OpenStreetMap data,
like countryside of third world countries? SÉBASTIEN: So that's where
your photogrammetry pipeline becomes handy. And that's why having tools
which are able to, you know, launch a drone in a region,
get a photogrammetry survey of an area, and generate that area in the 3D Tiles format, allows you to combine
that data set with a data set already available
by Cesium, right? So you don't rely on
Cesium for the content. You have the global content
from them on the ion server, but you can augment it
with your own data set. So launching, you know, drone surveys or LIDAR surveys on a region and generating your own photogrammetry data set is something that's now accessible for a lot of operators in the
simulation environment. And that's one of the
great things about having an open format: you can feed it with whatever data set you want. ALBAN: And if you
want, as a side project, to make something that takes your OpenStreetMap building footprints, generates buildings with nice-looking facades, and exports them in the 3D Tiles format-- because you can do it, it's a public format-- you'd be able to generate a new kind of 3D Tiles layer of your actual buildings that we could load in the engine. VICTOR: All right. With that said, I think we're
done for questions today. Sébastien and Alban, if there's
anything else you would like to cover, feel free to jump into
the forum announcement post. I will make sure that you
receive all the questions that we received
in chat here today. Before I have you
leave, I do want to mention that, if you've
been watching the stream today and you're curious about the use
of Unreal Engine in simulation and applications or
for game development, virtual production,
media and entertainment, any of those, make sure
you visit unrealengine.com, where you can download
the engine for free right now and start learning. We do have a
library of, I think, over 160 different
videos now when it comes to learning Unreal Engine
at learn.unrealengine.com. And you can also find
the many thousands of tutorials and videos created
by the awesome community on YouTube. Yeah, make sure you visit. There's a subreddit,
reddit.com/r/unrealengine. We also have Unreal Slackers,
which is an unofficial Discord community boasting over 50,000
members, unrealslackers.org. But also make sure to
visit our forums as well as our Twitter,
LinkedIn, and Facebook handles, where we post
all of the news related to Unreal Engine. It's a lot of good blog posts
and project spotlights going up as well, if you're just curious
what's going on in the industry with the use of Unreal Engine. If you are a little bit confused
by some of the terminology that we've been talking
about today in the stream, within about seven
days after the stream we upload the full
transcripts to YouTube. And this is a manual
transcript that is reviewed by a
"captioneer," is the new term I've started calling them. And if there were any terms
that you were unfamiliar with and you weren't really sure
how to spell them and search for them, the transcript
will be uploaded. As well you can just
turn on captions when you're watching
this on YouTube. What's really nice is that
all the timestamps are listed in that PDF that gets
uploaded, and so if you're curious about something
in particular, you can Control-F,
search for that, find out when we talked
about it, and then go ahead and do any follow up searches
that you might want to do. Please make sure that
you fill out a survey. Oh, almost had an instant cough right there. Make sure you fill
out the survey. I think we're going to go ahead
and drop it in chat right now. Let us know what topics you
would like to see in the future, as well as how we did today. If there was anything
that was missing, make sure you let
us know so that we can improve in the future. Also, there are no
physical meet-ups going on right now in the
world due to the pandemic, but our communities around the
world are still spinning up virtual meet-ups. You
can go ahead and find out where all those
groups are located if you go to
communities.unrealengine.com. If you are interested in forming
a meetup group in your area, there's also a
button right there that says "Become a Leader." You can go ahead and
fill out that form and we will go ahead and
get in touch with you. Make sure you submit us
all the amazing products that you're working on. A good place for
that is the forums. Unreal Slackers is another one. We frequently browse Twitter,
Facebook, and LinkedIn, as well, and you might be one
of the community spotlights that we air at the
beginning of the stream. You also get a nice
little spotlight in the launcher for a week. We're still looking for
new countdown videos. I mentioned this to Séb and
Alban as we got started, when we were playing
the countdown video, that it would be great to
receive some countdown videos that are in other areas
of the use of the engine, such as simulation and training. These are 30-minute
development videos. I speed them up to five minutes. Send that separately
to us with your logo, as well as the name and
any links to your studio, and we might go ahead and
composite a new countdown video with your content. Make sure you follow
us on social media, and if you do stream on
Twitch, use the Unreal Engine and game development tags. Maybe not the latter if you're doing
simulation and training. I'm actually not sure if
there's a tag for that on Twitch just yet. We might go ahead and
see one in the future as the industry grows. But it's always fun for
us to go ahead and look at all the cool projects
that you're working on, and Twitch is a good
platform for that, where you're also
able to interact with the community in real time. Make sure you hit the
notification bell on YouTube if you want to see
when we go live, as well as all the content
that we're producing. I know the evangelism
team are producing a whole lot of feature
videos now that we're unable to visit conferences
and do them live, and so expect them on YouTube. There's a lot of cool
stuff in the works right now that
you'll be able to see in the next couple of months. Next week I have Richard Cowgill
coming back on the stream to talk about how to use Nvidia
DLSS plug-in in Unreal Engine. It is now available to be used. You can find it on the
marketplace and download it. Get ready, maybe prepare
some questions for him. If you watched the
last stream he did, I think we spent almost two and a half hours talking about it. He was showing off some awesome
new technologies from Nvidia, but this time we're
actually going to go through exactly how to
use the Nvidia DLSS plug-in. And with that said, I would like
to thank Séb and Alban so much for coming on today, talking to
our audience about simulation and training. There's been a lot
of good questions, a lot of good content. And March 30th, right? That's what everyone
should look out for. SÉBASTIEN:
Yeah, absolutely. Thank you very much
for having us, Victor. It was really interesting
for us to be with you and all of our audience today. If you're building
simulation applications, and if you don't know where
to start, that's super scary. I mean, Unreal is
a monster, right? It's huge. You've seen the charts that
we displayed here today. You know, if you don't
know where to start, it's always easy
to send an email. You know, contact us. There's simulation@epicgames.com
that you can use in the simulation
context, right? If you don't know how to set
something up for your training or analysis for machine learning
or for any application you're working on in the
simulation domain, you know, just one line,
simulation@epicgames.com. That will let you get in touch with the team and we'll take it from there. So thank you again, Victor,
for setting this up, and everybody for attending. It was a pleasure. ALBAN: Yeah. Thank you, everybody,
for attending and for your
interesting questions. VICTOR: Just
wanted to make sure that-- Alban did produce two
videos that we are now hosting on our YouTube channel as well, on the use of the two separate plug-ins. And so if you're interested
in a particular segment that is a little bit, I
should say, faster, with less of our
communication in between, I'll go ahead and link
those in chat right now. And you can also find them on
the forum announcement post page. Alban was nice enough to
put those videos together for us right before the stream. With that said,
thanks, everyone, for hanging out today. I hope you had a good time; we sure did. With that said, yeah. Stay safe, and we will
see you again next week. Bye, everyone.