Creativity lies in all of us. It's what makes us human. It starts with an idea.
The desire to create. To solve. To play. To share. Together we can build new worlds. Freedom to create. For all. SIMON JONES: Hi, everybody. Welcome to Unreal Engine Build:
Automotive 2021. I'm Simon Jones, director of
Unreal Engine Enterprise. Back in 2016, when we held
the first ever Build event in front of 300 invited guests, we
couldn't have imagined that five years later we'd be streaming
to an audience of thousands. We see Build as a way to
showcase the amazing uses of Unreal Engine from non-games industries. And in 2017, we hosted our first
automotive event in Munich, Germany. Now this annual automotive
gathering has taken on a life of its own online. So for everyone watching at home
for the first time, welcome to the show. Of course, 2020 was
a year like no other, with many of us leaving our offices
behind and adapting to remote working. And since that first Munich event in
2017, we've seen such innovation from our fantastic community, including
projects across design, production, engineering, factory planning,
retail, and autonomous driving. We've seen your marketing teams
blend cutting edge techniques from Hollywood to produce some
of the best marketing content created by any industry. The latest of these technologies
being in-engine virtual production, which has revolutionized the
creation of launch films, of product reveals, and of
global events like the one today. You'll learn more about
that technology later, as we take you behind the scenes.
And perhaps one of the proudest developments for our team, we're
beginning to see Unreal Engine appear in the car dashboard
systems via our HMI program. You're going to hear success
stories covering all these subjects later today. Now the vision which drives our team
is at the core of Epic's philosophy: to build the best tools and put
them in the hands of creators. And we continue to work hard
to make content creation as frictionless as possible. Whether you're working on a car
configurator, designing an airport, shooting a movie, or making a
video game, we're continuing to invest in assets, materials, and
technologies to further reduce the high cost of content creation. And we have our eyes on the future
and the convergence of everything we do in a persistent digital world. Complete with a digital
twin of everything. For the automotive industry this starts with having a car
that you designed in Unreal Engine be able to easily appear
in a video game or a film, all of which is also made in Unreal. And this will keep getting bigger
and bigger until all content is available in a fully connected
and interactive environment, which is our vision of the metaverse. And people like us,
and each and every one of you here today,
are taking another step closer
to making the metaverse a reality. This event isn't about Epic Games. It's about all of you. It's about all of your project stories,
not just the ones that made the stage. And even though this year, we can't
meet at the bar and share a beer-- I'm sure we'll all be
getting together soon enough-- we can hang out by interacting
live in the dev lounges. Our whole team looks forward
to meeting you there. We have an amazing lineup of
stories to share, and I'm proud and delighted to be the one to
kick off Build: Automotive 2021. That's enough from me. Let's get on with the show. DOUG WOLFF: Hi, everybody. I'm your host, Doug Wolff. I'm a business development
manager here at Epic Games, working in the automotive industry. The last time we all got in a room
to discuss cool automotive stories and Unreal Engine
was well over a year ago. And all things being equal, we would have been in one of
those rooms by now. However, we had to do things
differently this year and we're on a digital platform called Hopin. The video is streaming to you,
but the key is you need to interact. Interact with each
other and reach out to us. Build events bring you the biggest
automotive stories of that year. And one of the biggest stories of
this year is virtual production. BMW did something really remarkable. Not only did they build a virtual
set, they used it to live launch a vehicle at their Next Gen show. It's an amazing story. And I'm super excited to
share with you off the top. AMIAZ: Wow, hello, Amiaz here. So they told me,
what's coming up in terms of content. And then they told me I'm standing in a
cave and everything is so Unreal Engine. But for me, this is so very real. [CAUTION BY SKRXLLA PLAYS] ♫ Stop that ♫ ♫ Stop that ♫ STEFAN PONIKVA: Because
of the uncertainty, we had to change all our plans. We had plans for a live event,
and in mid-April, due to COVID-19, we said we can't continue
with these plans. We need to change this
into a fully digital event, and therefore we then
screened and evaluated the entire industry
when it comes to virtual productions, when
it comes to broadcasting studios, et cetera. And then after a while,
we came to the conclusion that this LED cave is
the best of the best. ♫ Stop that ♫
♫ Yeah ♫ ♫ I hear 'em chatting the noise ♫ ♫ Move too quick,
can't stop for the talking ♫ ♫ I hear 'em chat with the boys ♫ ♫ Man so tough but
mans keep walking ♫ ♫ Dress too sharp
with the poise ♫ ♫ White girls love to tell
me I'm awesome, yeah ♫ ♫ Hot like fire on the pan ♫ ♫ If you wanna touch man,
please use caution ♫ ANDREAS SCHIMMELPFENNIG:
The interesting idea is that we can go
into some scenes, and people who are
doing the presentation, they're really in the scene. And as soon as we
pull out, we want to destroy that illusion
so that you feel, oh, no, they are on the stage. ♫ --in a cage Never let out,
let out the, let out the ♫ ♫ Yeah
I hear 'em chatting the noise ♫ ♫ Move too-- ♫ ANDREAS SCHIMMELPFENNIG:
This back and forth playing with the spaces
with the impressions is really, really
exciting for us. ♫ --walking, yeah ♫
♫ Dress too sharp with the poise ♫ ♫ White girls love to tell
me I'm awesome, yeah ♫ HADRIEN LEDIEU: Everything is
reacting to the camera motion, and it feels really
a lot more natural. And it tricks you into believing
that the people are really in the world. ♫ Let out the, let out the ♫ ♫ Big shoes, check out the crease ♫ ♫ Flow like I'm Bigfoot,
step on the beat ♫ ♫ Make a mans run till
he stepped out the cleats ♫ ANDREAS SCHIMMELPFENNIG:
For the perfect illusion you have to stick to some
physical size relations. If, for example, a human
is too big really far away, it totally crushes the spatial
impression because it's wrong. ♫ Hot like fire on the pan ♫ ♫ If you wanna touch man,
please use caution ♫ ♫ Stop that ♫ ♫ Stop that ♫
♫ Yeah, I hear 'em chat-- ♫ OLE REINSBERGER: When Andreas
came up with this idea to produce Unreal content,
we said, OK, all right. Nobody has ever done it before. ♫ --caution, yeah ♫
♫ Hot like fire on the pan ♫ ♫ If you wanna touch man,
please use caution ♫ AMIAZ: Hello people, so now
Amiaz is getting a little pretty. ♫ 4, 3, 2, go ♫ SPEAKER: Do you have
a favorite world? ANDREAS SCHIMMELPFENNIG: Um. HADRIEN LEDIEU: I like the
simplicity of the design world. Like, sometimes I
like the forest as well. ALEXA GOLDSCHMIDT: The
one with Supercar Blondie when you have the
wood, and the stars, and it sparkles, and it looks
so amazing like in a fairy tale. HADRIEN LEDIEU:
Something simple. Just pretty. It's a nice color. ALEXA GOLDSCHMIDT: And actually,
the Mini with the basketball court, that looks pretty cool. HADRIEN LEDIEU: It feels quiet. It's calm, and it's just nice. ANDREAS SCHIMMELPFENNIG:
Jumping into these worlds, going out, going
to the other one. HADRIEN LEDIEU: You
don't feel stress. Like, it's just
a peaceful place, and that's probably the
one I like the most. ♫ If you want
to do it, just do it ♫ ♫ Do it like it's never
been done before ♫ ♫ Just do it ♫ ♫ Move it like I'm
moving in stereo ♫ ♫ Just do it ♫ ♫ Move it like it's all
you've been waiting for ♫ ♫ Do what you do ♫ ♫ Just do it for you ♫ ♫ Just do it ♫ ♫ Just do it ♫ ♫ Just do it for you ♫ HADRIEN LEDIEU: Oh, no. DOUG WOLFF: So
BMW have just shown us how you can use virtual
production techniques for a live vehicle launch event. And that was great. It was an amazing accomplishment. When it comes to shooting commercials,
virtual production also excels. In an indoor,
controlled environment, you can take the car and your whole
team from the center of a forest to a space station
in the blink of an eye. A bunch of top automotive brands
have been doing this at a facility called the Hyperbowl near Munich. Let's check it out. FRANK FOERSTER: Cut! OK, everybody,
moving to the next scene. Glad you made it. Welcome to the Hyperbowl. This studio is a joint
venture of four companies-- NSYNK, ACHT, TFN, and Fournell. It now operates as an
independent virtual production studio that is open to everyone. With more than 400 square
meters of high-res LED, the Hyperbowl is the largest
volume of its kind in Europe. Even our ceiling is made of the
same high-res LED as the walls. Automotive is a
particularly valid use case for virtual production
because its hero, the car, is such a glossy object. You can't photograph a car
without an environment. And this is where classical
studio shoots or even green screens hit their limits. Now, we can
radically expand what can be produced in the
studio with all the benefits that come along with this. [MUSIC PLAYING] Here, we have steady
lighting conditions over the whole day, which can be
exactly reproduced at any time. You can shoot multiple
locations in one day. And this is time-saving
and cost-efficient, but this is also, of course,
much more sustainable. And another important aspect
is that confidential prototypes can be kept out
of the public eye with ease and still be shot as if
they were on location. [MUSIC PLAYING] CHRISTIAN GENZ: When I think about how many
streets had to be closed for various shoots in the past. [MUSIC PLAYING] We do not need that here! JAN PRAHL: And pretty cool
that it worked out so well the first time. I like it! JULIAN KRUGER: What
really fascinates me about this technology
is that it opens up so many creative possibilities. When it comes to
content creation, you can assume that about 80% of
the traditional post-production becomes pre-production. You can use existing
3D pipelines to create your environments
to be used on an LED stage. Concept art, 3D modeling,
shading, and lighting, it's basically all the
same, except you have to bring it all into Unreal. And make sure that it can
be rendered in real-time. Then you can benefit from
all the possibilities of a real-time engine, such
as changing light in real-time and altering the scene. But also, virtual interactive
shooting planning and previs. With this process, you can
create a lot of content in a very short time. The ID.4 campaign was
shot in only two days. We had five locations, two
abstract worlds, six talents, and we created a 60-second
ad, various formats for social media, and then also
for different international markets. ENO HENZE: Virtual
production is also getting a lot of attention
for live communication, such as car launches
or virtual keynotes. For these projects, we have
combined the virtual sets with layers of augmented
reality in real-time. You can add virtual props
to your set designs. And you can even
present cars that only exist as a virtual
concept and have never been physically produced. STEFAN WENZ: Having worked
for a couple of years in the automotive
area, I know this industry's
expectations of CG are very high. Moving footage has always
been the supreme discipline. But with virtual
production, known from Hollywood productions
like The Mandalorian, there's now a
cost-efficient solution, which reflects the
high expectations and can match the
creatives' ideas. Storytelling has become
more important with today's customers buying a lifestyle
alongside the product. The CG environment
can perfectly match the communication
strategy and can be integrated into a seamless
storyline using various media assets. Starting with the
virtual courtyard for OEM internal
management presentations and reusing it for the commercial
production and all other media assets, the synergies
are obvious. Epic Games supports this by
providing a huge Megascans library for free in all
Unreal-based productions, which can be endlessly enriched
using the Quixel Mixer technology. ENO HENZE: Thanks for having us here from the Hyperbowl. It was a pleasure to be here. And enjoy the rest
of your Build event. See you. DOUG WOLFF: So in
those last two stories, we saw some absolutely
huge car brands using virtual production techniques for
both live events and car commercials. On the subject of virtual production, I'd like to invite a special
guest, Alistair, from our London Innovation Lab. ALISTAIR: Thank you, Doug. It's terrific to be here
at the Auto Build event today. I say here at the event, but
it's a little bit of an illusion. Some of you may have guessed
already, but the garage space that you see behind me
is actually virtual. It's been created in Unreal and is
being displayed on our LED demo setup at the Innovation Lab. Let's take a break
for a moment and we'll take a wander around and dive
behind the curtain if you will, and let me show you
the Innovation Lab. Come with me. Being as this is a
break and I'm English, I'm going to grab a quick
cup of tea as we show you around. This space is actually
a converted television studio. And it's a fantastic space too,
to get lots of people in here to meet and greet and talk about
the sort of stuff that they're creating in Unreal. Obviously a little bit
quieter here today. We've also learned to virtualize
these things from the lab. So this is a great example. The Auto Build event will reach
thousands upon thousands of people and we're able to do that from here. The lab is base camp for a
fantastic team of artists that we have here and that team of artists,
they really roll their sleeves up and work alongside you in the community,
to make sure that the projects that you're doing in Unreal are
as fantastic as they can be. They need to look as high
fidelity as possible. They need to be optimized for
real-time interactivity as well. And that's what the
group at the lab here do. Now here's our demo volume again. And just to point out, it's
actually quite a small and modest space, but you can see the type
of results that we're able to get out of this and bear that in mind
for the sort of projects that you might want to do. You don't always need
to have a big stage setup. Right. Let's jump back into
the virtual world. And I want to introduce you to
a great friend and colleague of mine over in America. Heiko, who runs the
Detroit Innovation Lab. I've got to actually
try and get him up on screen cause he's a hologram. Oh, there we go. Works on a good click. Heiko, fantastic to see you today. I'm afraid I've wandered
back onstage with a cup of tea. Great for me, but I'm afraid
I didn't bring anything for you. So perhaps as a hologram,
you'll forgive me. It's fantastic to have you here
with us in the Innovation Lab today. Can you tell us a little
bit about what you're doing over there in Detroit? HEIKO WENCZEL: Alistair,
look at this crazy setup. I was already jealous of
the physical space you have. Now you're adding this amazing
virtual space on top of it, and you can summon me at will.
Look at this. If I ever grow up, I want to
be like you. But going back to your question, building a physical lab
and collaborating with partners certainly hasn't been easy in the
last year, but it gave us some time to rethink our options.
Looking at the technology we have with virtual
production, as well as what's happening in the virtual events
space overall, we had the idea to build an extension to
the physical labs in the virtual world: a consistent space
where you can come and visit us, partner with us, and work with us together. And so we're building that out now. I'm hoping I can do some tests
with you, invite you into that space and sooner or later, we
all can summon Alistair to us. ALISTAIR: I look forward to it,
Heiko. I really do. Thank you so much
for joining us today. From virtual spaces
and virtual production, over to virtual interfaces and HMI,
which is a big theme this year, I want to introduce you to Joe, who's our technical product
manager at Epic Games for HMI.
Over to you, Joe. Thank you. JOE ANDRESEN: Hi, I'm Joe Andresen,
technical product manager for HMI and embedded systems at Epic Games. Last October, we announced
our HMI initiative and I'm here to give you an update on
what we've been working on. One of the things that we discussed
last October was this idea of design-driven development, where your Unreal
project could be deployed directly to the vehicle in production. This allows designers to
iterate more quickly on their designs and get a more polished
design in front of the driver. Also, our partner Vectorform has developed an HMI starter widget
pack as an introduction to developing HMI user experiences in Unreal. What's great about this is that
there are UI templates that allow you to get up and running
quickly with common HMI elements. They've also developed
an HMI simulator. This allows you to take your Unreal
HMI project and drop it into a 3D car inside a 3D environment and feed
sensor information to your HMI. Things like blind spot warnings,
speed, day/night cycle, and much more are available for you to test before
your HMI ever gets to production. The HMI starter pack and
the HMI simulator by Vectorform are available in the Unreal
Engine Marketplace right now. I really enjoyed hearing from all the
people who reached out to us and the cool projects that you're working on. If you haven't reached out
to us, it would be great to see what you're up to. Thank you. DOUG WOLFF: Thanks, Joe. That was a great update. You guys might remember Joe
from our HMI announcement. Our automotive partner in that film
was General Motors and they're here today with an update on not
only their HMI developments, but a bunch of other cool stuff as well. KAREN LEHR: How could a game engine
that my little nephew is hooked on be a game changer at GM? That was one of the questions
I had when I joined the team. I'm Karen Lehr, and I lead the
IT immersive technologies team. It started out with a teapot. Then we made it spin.
Fast forward to where we are now. We've used Unreal Engine to
create efficiencies and improve experiences for vehicle design,
simulation, marketing, and now our infotainment systems. Here's a glimpse at just a few of
the ways we've utilized the power of Unreal Engine and our
awesome partnership with Epic. SPEAKER 1: Global
Creative Visualization, or GCV, is part of the industrial design
function here at General Motors. And we're responsible
for the first public look at our vehicles. In addition to the internal
highly confidential assets created during development, we also produce the first photos,
videos, CGI images, and animation when GM is ready to show something
new to the public, including the Hummer EV. For Hummer EV,
the team produced an extensive and beautiful set of assets
worthy of this revolutionary vehicle. I'm continually amazed
by the talent of our team. The portfolio photography and
video, often blended with CGI imagery and animation, shows the
true capability of this vehicle. Today, I was invited to share with
the design family an exclusive first look at the team's animation. STEPHEN GRAY: Hi,
my name is Stephen Gray and I'm the manager of GM Design's
Global Creative Visualization group. We're responsible
for crafting the first look the public sees of GM's
products from a design perspective. Our primary tools are Maya, V-Ray, Nuke,
Adobe products, and now Unreal. GCV does a lot of live action
CGI integration, and we will often work with short
timeframes to put together projects. As part of our technical development, we're working hard transitioning
this pipeline over to Unreal for the flexibility and reusability the platform presents us with. Currently, if we have a track shot
that needs to have the vehicle replaced, we need to go through a
lot of steps in different software. Unreal opens up options to
substantially streamline this process for faster replacements
through GPU rendering. It's not uncommon for color,
parts, or even the full vehicle to change during our productions. Unreal enables us to remove many
Unreal enables us to remove many traditional production steps,
and there's a lot of value in that. Given the complex and varied
assets we produce, separate pipelines for CGI images,
immersive projects, and VFX exist. Amalgamating this through the use of
Unreal for single data preparation pipelines and versioning systems is
really very powerful in our world. There are areas of our business
where we're already comfortably shifting asset creation over to
Unreal and getting results comparable to our traditional pipelines. It's something the team's
really excited about. Our clients don't care how a
shot is created, just that it's believable and it looks great. That's our measuring stick. Optimizing our pipeline through
the integration of Unreal's real-time workflow enables new creative
possibilities, more complex storytelling, and substantial
increases in the volume of content. On the immersive side, we've found
that the use of real video and photography adds a lot to create a
strong suspension of disbelief during experiences. Combined with tools such
as Megascans and photogrammetry, the future is exciting. Not too much I can say at this
point, but stay tuned on that front and expect some great
things from General Motors. SCOTT MARTIN: Hi, my name is Scott
Martin, and I'm the creative director for user experience at General Motors.
By integrating the Unreal Engine into our infotainment systems
to engage our customers with their vehicles in new ways, the engine is allowing us to make
vast improvements in a few key areas. By achieving more robust vehicle
visualization on our displays, richer, more dynamic, and
tailored to a customer's vehicle. Having a more responsive and iterative
visualization process has been improving our design
time and overall pipeline, allowing for quicker sketching,
quicker feedback and quicker overall turnaround times. Having the interactive data
flexibility is giving us performance boosts not only in hardware, but
also in animation and improving our onboard memory storage footprint. The Unreal Engine has been
opening the door to new user experience
potential thanks to that real-time environment onboard.
Epic has been instrumental in collaborating with our
development teams to help us optimize our processes and lay
the foundation for what's to come. They've been incredibly
informative and will be a key partner in our future
user experience endeavors. We're just beginning to scratch
the surface of what we can do with Epic's Unreal Engine and there's a lot more to come. STEPHEN MURPHY: GM is using
Unreal Engine on many projects, across many different
business groups. Product design,
engineering, manufacturing, IT, and marketing are all
working on Unreal-based projects. We have proven Unreal can
help deliver business value. And we continue to create products
and applications in Unreal Engine. But for us, this is just
the tip of the iceberg. We have a mission to continue
to improve and innovate. We have a strategic goal to
make real time rendering more available and more
accessible to our users. To empower teams to leverage
better VR and desktop hardware. Unreal-powered in-car experiences, autonomous learning,
and simulations, and visualizing our map data are just some of the
targets we've set for ourselves. Thank you for allowing us to
share our journey with you today. And we look forward to being able to
share more with you in the future. Thank you. DOUG WOLFF: Being able to use your car
model in multiple places is extremely valuable, whether that's a still,
a video, or in a car configurator. In fact, car configurators were
among the first use cases for Unreal Engine in
the automotive industry. Real-time configurators,
particularly VR ones, were extremely popular and you guys really
embraced game engines as a solution for those projects. Year on year, car configurators
and Unreal Engine get better and better and better. But I think it's going to be
really hard to top this one. Pagani teamed up with MHP, Monkey
Way, NVIDIA, and Google to make what is pretty much the best
car configurator of all time. Let's have a look at it. EMMA SCHRODER:
Hello and welcome. My name is Emma. I'm responsible for the project
management here at MHP, and my team and I are very excited to
share with you a short insight into our Pagani Immersive
Experience Platform project. If you have the opportunity to
experience the cars and the company, you get an idea about the
craftsmanship, the precision, and also the passion that every
single vehicle is built with. The clients are not
only configuring a car, they are basically
creating a very unique and hyper-personalized piece of art. This is an incredible, but also
very emotional experience, which was also one of our biggest
challenges in this project. Because every single vehicle is
built exclusively for a customer, the retailers don't have the physical
cars available in the stores. We had to develop a solution
that is visually appealing, photorealistic, and provides this
immersive feel, all inside of an Unreal Engine based application. In our end to end approach,
we created a visual creative and technical concept
together with the clients. We also developed realistic 3D
environments based on spheres or photogrammetry, like the
actual Pagani headquarters you can experience in our engine. This also has to be combined
with a very intuitive, easy and fun to use interface with
many interactive features. We've recreated high definition
materials, like the very special tinted exposed carbon
fiber options by Pagani. And after or during the configuration
process, the clients can visualize their configuration in driving
experiences or other digital giveaways that the retailers can
provide to the clients afterwards. And of course, we had to realize
the rollout and operation of our application on all
dealer systems worldwide. KIM WAGNER: Hi, I'm Kim and
I'm part of the Unreal development team here at MHP. We put a lot of time and effort
into creating this amazing looking interactive point
of sales application, but we wanted to get more out of it. And we wanted to use it for more
than just a pure dealer experience. Over the last few years, especially with
the addition of ray tracing, Unreal has basically achieved photorealism. So our goal was to use the
point of sales application as it is to generate content, be it static images or videos
delivered on demand or pre-rendered. We tried to break down
our requirements for this. We need a description of a
scene, get our application into that state, and then render out an
image, easy as that. But different Unreal applications
might use different input models, different interfaces, or even use
external microservices like rule engines. We didn't want to spend much time
on integration and adaptation of our tools for every application. So we thought, what if we created
a standalone plugin, with its completely separate communication
interfaces and its own generic scene description. All the integration we would
need is a small translation layer between this plugin and
our native Unreal project. Having a uniform platform enables
us to develop and reuse our rendering features in Unreal independently of the application
the plugin is integrated in. That started as a
rough idea on a whiteboard, and it became what we
call our Elastic Content Platform. The goal was to make it seamlessly
integrate into any existing Unreal project and make all
our custom rendering features part of it. We developed this idea in parallel
with the Pagani application and integrated it into it, transforming a point of
sales application into a flexible rendering backend. We use this internally to render out
work in progress video sequences to share with Pagani for feedback,
but we also generate hundreds of images for quality assurance purposes
between application versions. And this platform was also used
in our Pagani streaming demo to deliver screenshots
to the user at home. It's a lot of fun to develop
new ideas and use cases around this concept because every time
we want to get an image out of Unreal, it's easily possible. PASCAL BAYER: Hi. I am Pascal Bayer, application
architect at MHP, and I'm responsible for the service architecture of
our Elastic Content Platform. The idea behind our offering is a
modular service platform that can serve all of our customers' content
needs from a single 3D pipeline. Modularization is the key success
factor in building a scalable cloud native content platform. The Unreal Engine is our core
component for visualization. On top, components like a retail
configurator, a streaming configurator, and image on
demand generation can be added as needed. Within a few minutes, each of these services can be
scaled up and down automatically based on the content demands to
provide high availability, but also keep costs under control. This helped us to deliver a real-time
ray tracing streaming configurator for Pagani at the last GTC event within only a few weeks. The Unreal Engine
perfectly fits into the modular approach by connecting
our broad range of APIs on the one hand, but also exposing
additional ones for interaction with the content on the other. Additionally, this allowed us to
easily integrate existing systems in the enterprise world
like analytics or CRM. That strong, extensible, and open platform foundation is enabling
our current efforts towards a hyper-personalized
content strategy in the future. It will enable our
customers to create a more immersive experience and a deeper emotional connection
with their brands by creating content on demand for newsletters,
landing pages, interactive ads, digital giveaways, and so much more. STEPHAN BAIER: So,
as you already heard from Emma, Pascal, and Kim, Pagani offers
the best possible digital experience to their clients. But we aimed to push the
boundaries between digital and real worlds even further by creating
a configurable automotive film for Pagani. So we went for a three-day
production and shooting at the Imola Formula One
race track in Italy. The film which we produced might look like a regular TV ad,
but there's one big difference. Like in the digital showroom,
the cars in the film are instantly customizable. You can swap the model or color
in and out as you watch, it's completely personalizable. To push the visuals
as far as they can go, the rendering will accurately
reproduce the real time lighting from the filmed environment using real time ray
tracing from Unreal Engine. The technology involved in the
production provides us a glimpse of the future of customer engagement
and could play a unique role in how we showcase personalized
content and products in the future. This is only possible with
real-time technologies like Unreal Engine; the experiences
we are building are more like games. All of this was made possible
by an Epic MegaGrant. And we were really
thankful to get this. And the best part for
the Unreal community, and this is something which we
are proud to announce today, the Imola race track environment and
the Pagani hypercar as it's used on the project will be made available to
download from the Unreal Marketplace after today's Build event. So thanks again for the event. Thanks again to Epic Games
for making this possible. And thanks for having me. DOUG WOLFF: Projects
that awesome are delivered when a group of companies come
together to really try and accomplish something special. We've already
heard from MHP, but those outstanding visuals were delivered
by NVIDIA RTX GPUs. We've got Sean here from
NVIDIA to tell us more. Hey Sean, how you doing? SEAN: Hey, great Doug, thanks. DOUG WOLFF: So what's the state
of pixel streaming at large scale and high quality right now? SEAN: Well, in my mind, the Pagani
RTX ray tracing streaming experience is the proof that we as an industry
have finally reached the moment when all three of quality, performance,
and price meet OEM requirements. First, the quality, the quality
of the Pagani experience, in my mind, is a testament to the amazing
creative talents of MHP and the perfect suitability of Unreal
Engine with NVIDIA RTX ray tracing for this streaming use case. Next the real time photo realistic
performance of this experience, which can exceed 30 frames per second,
is delivered today by NVIDIA RTX GPUs with dedicated ray tracing cores and NVIDIA NVENC GPU encoding. DOUG WOLFF: When do you think
this will roll out at a large scale? SEAN: Well, that is a great question. And frankly,
I think the answer is today. Pixel streaming with ray tracing can be deployed today on NVIDIA T4 GPUs at Google Cloud, AWS, and Azure
for around one penny per minute. And that's for the entire 16
gigabyte GPU frame buffer. Now this pricing obviously improves
if you're able to fit more than one UE4 streaming instance on that 16 gigabyte frame buffer. And this is the perfect opportunity
to leverage NVIDIA's new DLSS plugin for UE4 that's now
available on the Epic Marketplace. So DLSS taps into the dedicated
tensor cores on RTX GPUs. And these are in addition to the dedicated ray tracing cores; they upscale images using AI, artificial intelligence. This means you can use a lower
internal resolution in your UE4 scene and up-res on the fly on the GPU. So now you can fit more
streaming instances on a GPU and thus lower costs. DOUG WOLFF: This
project looks amazing. And all of the stuff
you said sounds great. OEMs are certainly
going to be excited. How can they reach out to you? SEAN: Yeah, I would love to talk to
OEMs about any of these topics. Please contact me directly
if you have any questions, seyoung@nvidia.com. DOUG WOLFF: Awesome Sean,
thanks for being here today. SEAN: Thanks Doug. DOUG WOLFF: So the amazing visuals in that
piece were produced by NVIDIA technology, but the experience was actually
delivered through pixel streaming via the power of Google Cloud. Let's talk to Travis from
Google to tell us more. Hey Travis, how are you doing? Why is pixel streaming
important for the auto industry? What benefits are derived with
doing this on Google Cloud? TRAVIS: Ah, great question. So I think the auto industry can
benefit in multiple different ways. So I think pixel streaming
really does lower the friction around delivering
interactive experiences around products and services. And so I think when you got
to do that, right, you unleash the power of virtual assets that
businesses can leverage to actually deliver these experiences or design
products, build products, right. Go through virtual
warranty processes. So there's a ton of things,
that they can kind of address along that value chain. And that extends all the way out
to things like marketing and sales, as you've seen with what we've
done with Pagani, we're really building sort of industry leading,
very interactive, high fidelity, real world simulated experiences. And I think that will change
the way customers see and interact with products online. DOUG WOLFF: Why do you
think Unreal Engine was the right technology for this project? TRAVIS: So I think there are
two reasons for that. So I think the first is
really the people, right? The people at Epic are great.
They're visionaries. I think they see a very clear
vision of the future where interactive experiences play a big
role in everyone's life, right? Even outside of the role of
the traditional gaming space. And so I think the people
are definitely key. Great minds, great engineers and
great ability to deliver. The second thing is
really the technology. So Unreal Engine is the world's
leading 3D creation tool. And I think that will continue
to be true as the Epic folks integrate and build out more
capabilities into Unreal Engine. We're really looking forward
to Unreal Engine 5 and what that'll bring to the industry. DOUG WOLFF: Can you give us a
teaser of what future Google services will look like in this area? TRAVIS: Sure. Sure. So I think we've got a set of design goals and principles around removing any friction related to leveraging Unreal and delivering interactive experiences. We think the auto industry tends to be a leading industry in this
sector where high fidelity simulated, real world experiences
around vehicles and around other aspects of the brand are really
important to customers to help understand how that car might fit
into their lives and in the future. So I think you'll see
us deliver a service that allows everyone, any
brand big or small, to be able to leverage pixel streaming, to
build and deliver interactive experiences in the future. DOUG WOLFF: Thanks, Travis. Lovely to have you here today. TRAVIS: Thank you. DOUG WOLFF: We've seen
some really great stories so far and there's plenty more to come. But we wanted to take a few minutes
to share with you our vision for how Unreal Engine fits in the automotive
industry, as well as highlight some individual features that will really
make your projects shine this year. So here's Heiko not as a hologram
this time to tell you about our overall automotive strategy. HEIKO WENCZEL: Welcome to Build. I'm Heiko Wenczel. I'm the industry
manager for automotive. And I'm excited to have you here
today and talk about our future perspective, and a little bit about the past, of what we've done to define the automotive industry as a segment for Unreal Engine and the product development that we see as important to us. The automotive industry has always been a driving factor in the manufacturing world, driving disruptions and changes in how things are done, how things are produced, and how they are messaged to the community. And we see a great opportunity for
Unreal Engine to support in that field and be part of the overall user
journey and the creation of the product. To support the different
industries that we're working in, we started to build innovation
labs around the world. The one that is the furthest
ahead right now is the London Lab, run by Alistair. If you think about the innovation
labs as an engagement point, it shows our general philosophy of how we engage with our product ourselves. We are part of the creative community. We believe that creative tools should be easy to access, or free, for the creatives of the world to create and envision their products, and to work on a production platform, or a product platform, that encompasses a lot of the elements that are necessary to be creative in this world today. And so what you see with Unreal Engine being free is meant to enable that access to this metaverse philosophy, as well as thinking about what it takes to build open source platforms, or openly accessible platforms, as you see with Unreal Engine. And as I just pointed out about being
part of the creative community, one part of that vision is
that it's a connected platform. It's an open platform and
it gives you access to everything that you need to build your
vision of a product or a process or anything else that is connected to
that world of real time technology. And if you look at what
we promote from a philosophy perspective, it all points in the direction of building a metaverse kind of system that
connects libraries with real-time tools as well as with
the engagements in between to drive the development of future
industry engagements overall. Going back to the
automotive industry. Over the last four years, we worked
closely with different clients that were willing to share where they're
going and what they want to achieve. We learned a lot from the use cases, the POCs, and the products that came out of it. And it resulted in a better view of how we need to make the product work within that field. With the Pagani POC that we have done, a key aspect we engaged with is the marketing and design side. We learned a lot from these POCs. It was great working together
with Google and NVIDIA and optimizing the experience. And then going all the way
through to understanding what it takes to create a product from a pixel streaming perspective that is attractive and can be part of the overall channel communication that you want to have in that field. Which all leads to that bigger platform vision that we have, to make sure that the components are all there for you to connect a grand vision and a user journey that is meaningful. And if you take that Pagani
POC, how it showed what kind of experience you can have, how quickly it was actually created, and how it connected to the other elements in that field, that was, particularly for me, an exciting part of the journey toward the automotive platform definitions that we're making all the time. And we're getting really close to
showing you an overall picture of where we want to be in this field. If you take that automotive platform graph that has been shared a couple of times, it gives you an idea of how we connect to the principles of the digital twin, as well as the digital thread, which are used a lot in the industry. But very seldom do you see a direct
connection to the technology that enables you to put that
into place within your production cycles or production definitions. And if we take Unreal Engine as
a core component of that platform approach that you want to see in the future, from all the different engagement and trend levels that we're seeing in the industry, and that we heard about from our industry partners, you see that having a visual data layer enables real-time technology across the different departments where you want to use it. There is also the connection from the IoT side, where a lot more is coming; it will be a focus for me to understand how we make that better available to you, and to see more POCs and applications in that field in the future. There is also the connection to the libraries that we put into
the mix through Quixel, and a great shout-out to the components they created and the library they are building there, which enables you to bring the world into your view without having to create it yourself, or having to shoot it in complex ways, as has been done in the past. And what's left
really, for us to do on that platform is to make a solid connection to the PDM and PLM systems. So a key step for us this year is figuring out how we make that a more solid connection and a better experience overall, which I haven't seen anywhere so far at a major level with any software. So we're going to try to figure
out how we can make that a more convenient approach to
truly underpin and create the foundation of a platform system as we see it
based on Unreal Engine. DOUG WOLFF: Thanks Heiko for that
glimpse at the big picture. But what about some features
that can make a difference in your projects today? Let's hand over to
Thomas to find out. THOMAS CONVARD: Hi, everybody. I'm Thomas, senior product
manager at Epic Games. Today, I'm going to walk you through
some of our product features and our vision for the industry. Hopefully you get
as excited as we are about where we're
taking Unreal Engine. Building an interactive experience
for automotive, whether it is deploying a car configurator
or preparing a design review session, involves many different
stages where data is transformed, content is created and assembled,
and decisions are made. Our vision is that Unreal Engine
is the open platform where creators can build complete experiences,
have full control over their assets and can connect to any of
their standard systems and tools. First, I would like to go through
a few of our achievements. Unreal Engine introduced real-time
ray tracing three years ago, and it has completely changed
the way artists visualize cars, thanks to unmatched rendering fidelity and performance. And Epic Games continues
to push the boundaries. Unreal Engine 4.26 brings performances and visual
improvements from global illumination quality to [INAUDIBLE]
shadows and many more. Also in 4.26, still images
and videos can be exported with the Movie Render Queue. This tool can now create render
passes for compositing outside of the engine and supports
industry standards such as OpenColorIO or ProRes codecs. Datasmith can now import, process, and optimize huge amounts of industrial CAD data. It offers several options for
automation from visual programming to Python scripting. The Unreal editor with
Datasmith is becoming the most open, flexible, and performant
tool for data preparation. With the new modeling tools,
users can create geometry from scratch with poly editing or
scripting, or they can modify imported CAD data, adding details, generating UVs, or deforming meshes. All of those operations are performed
inside the Unreal editor without having to move data through
other software packages. I can also mention
the Variant Manager, which allows the creation and control
of complex product configurations. Unreal Engine 4.26 brings variant
dependencies that can be used, for example,
to design option packages. Animating has been possible in the Unreal editor for a long time
thanks to the Sequencer. In addition, we now
have Control Rig. It is a node based rigging system,
which allows artists to fully rig and animate a car or any kind of
character within the Unreal editor. In addition to engine features, we
offer content to help quickstart projects or illustrate
the best practices. The project templates shipping along with the engine
provide foundations for the most common use cases in the industry,
such as photo studio rendering or collaborative design review in VR. Then on the Unreal Engine
Marketplace, we release free content that can be
used to enrich your project. The Automotive Material Pack is a
collection of high quality materials using the Quixel
library of textures. The materials are optimized
for real-time performances, as well as ray tracing. This material pack is updated
with each new engine release. For example, we added support
for anisotropy with 4.26. The Marketplace also contain
series of realistic automative environments produced by Epic Games. So far, we've released a bridge,
a beach, a winter road, and we have many more to come. We recently released a complete
configurator sample, which demonstrates how to put it all
together with variants, rendering, and animations in order to build a
full car experience. The project is available
from the learn tab of the Epic Games launcher. All assets are provided, from meshes to sounds, and everything has been implemented with the
Blueprint visual scripting language. It's a great resource to learn
Unreal Engine. And UE5 is coming this year. With Nanite, a virtualized micropolygon geometry technology, users can forget about polygon counts or draw call budgets. And they don't have to worry about tessellation settings or removing hidden parts. The highest level of detail can be used
for every mesh in the scene. UE5's dynamic
global illumination system, Lumen, allows artists to manipulate
light sources or move objects and see, in real time, the impact on indirect lighting. We believe that UE5 will change the way users create and
experience any 3D automotive content. DOUG WOLFF: As Thomas mentions,
the automotive configurator example
is available now. You get it through the
Epic Games launcher, under Unreal Engine on the learn tab. I can't wait to see what
you guys come up with. But I wanted to share another
exciting development with you. For years,
pixel streaming has been the way to deploy high quality Unreal Engine experiences on the web. And you could see from that Pagani
example, how great that can look. That was absolutely fantastic. But now there's another option. So you can choose the solution
that works best for you. Our friends at Animech have
developed the Mega Converter plugin. It allows you to take the same
Unreal Engine project that you're using for pictures or videos,
or even pixel streaming and export directly to WebGL. I really hope you like it. STAFFAN HAGBERG: Hi,
my name is Staffan. I work at Animech as the
Chief Marketing Officer. And we are here today to
present our new project, Unreal for Web. What we've been doing
for the last year is to develop a
plugin for Unreal that converts Unreal 3D
scenes and 3D models so you can just publish
them straight to the web. Companies can
reuse their content that they've already created
in Unreal when creating web-based applications. Just convert your
Unreal files and start developing a web-based
configurator or an AR application, or whatever
you want to do in 3D online. Reusing all 3D materials
that you already have in Unreal saves
enormous amounts of time and makes projects much more efficient. The first version of Unreal for Web is aimed at the
automotive industry. But I mean, anyone can use it. So now, I'm going to hand it
over to Aidin, who's our CTO. He's going to give you some
of the technical overview of the plugin. AIDIN ABEDI: Unreal
for Web is more than a plugin. It's a groundbreaking glTF exporter that converts your files, but also an Epic-enhanced,
web-based glTF viewer that takes your assets online
with key interactive elements that you are already
accustomed to in Unreal. Both are open source
and easily extendable. What's glTF? An open standard made
by Khronos Group-- that's the guys behind WebGL-- to efficiently share 3D
between a wide range of apps. It's specifically designed
for modern photorealism, compact size, fast
loading, and extendability. In other words,
perfect for the web. What can the plugin export? All the standard stuff. Meshes, both static
and skeletal. Textures, materials,
default lit, unlit, all the blend modes. Animations, level sequences,
sockets, bones, levels, including cameras and lights. But now the interesting stuff. It can also export level
variant sets, let your users configure meshes,
material, visibility. Basically, make your own
web-based product configurator in three minutes. It can also export special
Actors and Blueprints. HDRI backdrops, sky
sphere, hotspots, and player controlled
cameras for interactivity. Oh, and one last thing. Let's not forget the clear coat. How do we make it possible? By lots and lots of
blood, sweat, and tears. We had to push some serious
limits putting glTF on steroids by crafting Epic extensions
to support all the features, then forking the material
analyzer and baking module to make the plugin
even more powerful. The roadmap is not set, but this
is what I'm looking forward to. More amazing Unreal shading
models, more compression and optimizations to squeeze out every last bit of size and speed. Epic extensions to
more WebGL engines. Fine-tuned logging
to help understand how to get the best
out of each asset. Mobile focus. Thank you for listening. I'm super proud of
what we've done. Let me hand you back to Staffan. STAFFAN HAGBERG: We're
super excited to have been working with Epic for the last
year developing this plugin. We gave it to a
bunch of customers, and they all tried it out. The feedback that we got
from them was awesome. Finally, now we can take
our Unreal content online. Thank you very much for
listening and have a great day. Bye Bye. DOUG WOLFF: If you'd like to try out the
Mega Converter plugin, it's available in the Unreal Engine Marketplace. So we've heard about our overall
automotive strategy and some individual features and developments. Now I'd like to bring Heiko back
for some additional thoughts and a look into the future. HEIKO WENCZEL: A
good summary of all these points that I
mentioned so far can be found in the Automotive Field
Guide, which we just released. And the purpose of the Automotive
Field Guide in general was to give you a good overview of all the
touch points we see within that approach of having a platform
for the automotive industry. You hear that term of having a
platform so often, but truly showing where all the touch points are, how we can focus on those, and how we can develop against those is a key aspect for us. And you will find about
30 use cases overall that are at the central focus for us to
understand how to engage with the different verticals that we have. And if you look in the guide,
you will find that we have six key engagement points
that we believe give us good insights into the industry. And of course, with the announcements
that we've made already, and the things that you see in the event,
those are in marketing and design, which are very conventional, very easy entry points for a very visual, high fidelity engine like Unreal Engine. But there are other points
that are really important to us as well, such as the engineering department, where we see great benefits in using training applications or previewing and analyzing where you want to go. And it's with the simulation and
training fields in general that we see a great future from an AR/VR/XR
perspective of how there's going to be a massive change in the industry. If you see what the pandemic did
over the last year, and how it forced people to think differently about how they engage across borders, how you provide expertise to different places in the world if you can't travel, or how you ramp things up if you can't be there, or how you collaborate overall, those kinds of parts show us where we need to do the most development, because we believe that those things are going to stay. And they are things that we're taking to heart ourselves in how we operate in the field in general. And so I'm very excited to
see what the next POCs are going to be in the next two
years in that particular field. And so I hope you have a
chance to take a look at that Automotive Field Guide. And we hope, of course, for more
feedback and more engagement along those use cases that we've defined
to make sure we're as close as we can be to the needs that you see and
challenges that you have in finding solutions, especially on the real-time side, but also in the general way you engage with technology on the automotive industry side. Over the last decades, people have become accustomed to
amazing experiences on the gaming side, based on game engines that have created game philosophies and all kinds of technologies and storylines that make sure customers learn quickly how to engage with the content and how to live in that world. And that world is now
coming into your product. And why shouldn't it be fun? Why repeat ways of working that are not as engaging as they could be? Using game philosophy and real-time technology can change that principle and bring the kind of fun into the mix that sparks the creativity that every one of us has within, and make us create better products in the future. I also want to use the
opportunity and give a big shout out to Quixel and the
Megascans library that we have here and that is available to you
because it creates a big piece of the metaverse that we believe in. It reduces the amount of time it takes you to create content that is not product-related, and it gives you and us a great opportunity to create content that can be shared across IPs and brands, and to create a world for creatives to engage on different levels and create more value in shorter time cycles. Which is, of course, always a manufacturing and especially an automotive problem: how fast can you go to market? How fast can you create
the right things? But if you take the last year and the restrictions we had on traveling and creating new content libraries, the capabilities that we see through Megascans, Bridge, and Mixer allow you to create high quality realities that otherwise would be either impossible or very expensive to create. And that makes it a key part
of the platform that we see for the automotive industry. And it gives you a unique opportunity
in creating better content overall. I think one of the key elements that we'll see in the future is the collaborative aspect of Unreal Engine: the way you share and promote content that you've created. Pixel streaming, WebGL, all
these touch points that are related and based on
cloud-based computing, will get more important and will drive development and engagements overall. So the smart digital user journey
for any product in that field, but particularly for the automotive industry, will depend on using the right technology at the right time, building a solid platform overall that allows us to connect the different data sources as well as the libraries and the metaverse as we see it. And overall, that will allow us to establish Unreal Engine as a solid meta part
of your production cycles. And so we're excited about the
things that are coming in the future in this area. And if you have any ideas,
or any needs that you see in the particular fields and in
the use cases that you've seen throughout this presentation,
then please reach out to us and let us know how we can help. DOUG WOLFF: Thanks to Heiko and the
whole team for sharing our vision of the automotive
strategy for Unreal Engine. It's been really rewarding to
work with multiple automotive OEMs around their own strategies
for game engines. BMW has such an advanced approach to Unreal Engine, one that allows them to do some really cool and flexible things. Here's how they used that flexibility across a challenging 2020. BURAK SOEHMELIOGLU:
I'm Burak from BMW, and I'm currently IT product
owner for 3D, VR, and AR at BMW. I've been working at BMW for seven years now. I'm a computer scientist,
and now my focus is really bringing 3D and
VR/AR technologies to everyone at BMW. We started using Unreal
Engine at BMW six years ago, and it was the time
where the first VR headsets came to the market. And we had the idea of using
VR headsets in our development phase and early phase to experience our future cars, and also to give our engineers and decision makers the opportunity to experience the car functionality by having a ride. And it was, of course, not just a thing about hardware. Having the VR headsets
is not enough. We also needed the appropriate
software application for that. We already had a rendering
application at BMW, but the issue was that it really did not support VR. So we were looking for a solution on the market. And the main usage of VR at that time was in the gaming industry, so it
was clear for us to look there. And since the visual quality is
one of the most important parts for our first use case, we
decided to use Unreal Engine. The success of our first
project was so high that the usage of
Unreal Engine spread across BMW quite quickly. So we started to use
Unreal Engine, also, for other departments
like factory planning, but also later in the phase for a dealer configurator presenting the new car, so the customer can go to the dealer and configure their personalized BMW in VR using Unreal Engine. Here is a video showing
how BMW uses Unreal Engine for remote collaboration. Enjoy it. [MUSIC PLAYING] MARCEL STRZELETZ: So I use
Unreal Engine in my work for visualization things. We can show everyone who's
involved different solutions for problems. Then everyone is
able to imagine what will happen in the real car. We were able to show the
designers what we did, and they had the same quality
as before in the meetings. They were able to
move into the car and show me which perspective
they meant if they had something they didn't like. That was really cool. SENER YILMAZ: We are having
very big structural changes in our plans. We are implementing the
new model E4 with 2D tools, like on paper or PDF. We were forgetting, for example,
a scanner or a screwdriver. We have the possibility
now, really, to build up everything exactly how it's going to be. It's giving us the opportunity
to be a communication tool. For example, in Teams meetings, I can open my project. And together with the process
planner and the logistic planner, we can try out new
alternatives, and we can optimize the planning
status in the early phase. This is saving us
time and costs also. MELTEM MIRZAOGLU: We have less
customers in our showrooms. Yeah. That's why it's very
important to have another way to show our customers
the different cars, different options,
different colors. EVE is Emotional
Virtual Experience. We use it all the time every day
to show our customers our stock cars, what the
options are, calculate or configurate a new car. And after that, I can show
my customers here also on the screen, or I can
send the link via mail with the offer to my
customers at home. It's easier for my customer
to select the right car. Half an hour or an hour, and we
have a finished configuration with the customers. And the customer's happy, I'm happy. Yeah. JOHANNES KNIPPEL: For me
as a software developer, there's no barrier between
me and the business partners. When it comes down to
implement the software, what we use to
enable remote work , most people don't even know
it's a game engine behind everything. The great thing about
using the Unreal Engine is that it provides
the perfect space that has everything you need to
implement the perfect solution for any need. DOUG WOLFF: Work
habits are changing. People can leverage the
fact that Unreal Engine is a connecting technology that
allows people to collaborate. So whether you're
in a dealership, whether you're at home,
whether you're interacting with a customer, by running
off the same Unreal Engine platform, you can really
do awesome things. [MUSIC PLAYING] BURAK SOEHMELIOGLU:
Isn't it great how BMW benefits from the usage of Unreal Engine for collaborative work? But the question is, what's the next level? The next level could look like this: Unreal Engine is not just used by BMW employees, but is also extended to our suppliers, so that we have a common collaborative environment where we can work together interactively. Thank you very much for participating
and enjoy the rest of this event. Thank you. DOUG WOLFF: Earlier in the show, Heiko
touched on the topics of simulation and autonomous vehicles. They form a huge part of
our automotive strategy. I'd like to invite Sebastien
Loze, our industry manager for simulation, to tell us more. SEBASTIEN LOZE: Hello, everyone. My name is Seb Loze. I'm the simulation industry
manager here at Epic Games. If you are attending
these sessions with us, it means that you already understand that Epic Games is committed to providing a meaningful and relevant set of solutions to the simulation community. As with the health care,
defense, and civil aviation simulation applications,
the automotive domain represents a quasi-unlimited source of use cases, which can be sorted into two main categories: analysis and training. Now within these two
categories, two main segments of applications coexist. The first ones are linked to the driver-related use cases: everything where the driver needs to interact with their vehicle. This encompasses the notion of driving simulation as well as, for example, the vehicle human-machine interface, or HMI, prototyping applications. The second segment
of applications is linked to the analysis
and training applications where no humans are involved. These are all the autonomous
vehicle use cases, which go from conceiving these vehicles to ensuring they are fitted with the right experience before they hit the road. As you can see from the first
elements of this taxonomy, to ensure that Unreal Engine remains relevant for this multitude of applications, we have to maintain a very
strong platform approach. But this would not be enough. There's a particular problem
linked to simulation. A simulation application is a stack containing your framework level, where Unreal Engine lives along with your standard data sets and communication layers. On top, you have
your second level, which is the integration layer. And this integration layer is
topped by a dedicated feature set layer containing
all the useful tools and functionalities,
specifically developed for your applications. So to summarize, you have
a layer cake of technology, integration, and
specialized feature layers. In the specific
context of simulation, the entire community realizes
that monolithic solutions are often not the best approach and that walled-garden technology cannot be the right solution. So while we take care of the
first layer of our layer cake, by continuing to grow
the core of Unreal Engine and to innovate with
new modules and assets, as Thomas mentioned
in his presentation, we are also lowering the effort for simulation creators on the second layer, the integration layer. By supporting our users at both the technical level and the business level, we are growing an ecosystem, from motion cueing systems to HMD equipment to machine learning, that acts as a tangible integration accelerator around Unreal Engine. In these sessions that
you've attended here and in the presentations
you have seen, you saw a glimpse
of what is possible when you aggregate the
right software and hardware architecture around Unreal. As you saw, building
simulation solutions for the automotive domain is entering a new phase of maturity. And it is time to engage in a more active way together. So send us a line at simulation@epicgames.com. Our simulation team
is waiting to learn from your ideas, your
challenges, and your dreams. For the second part
of our presentation today, I'm very happy
to introduce the team from AV Simulation. In the course of the
last year, AV Simulation took their mature and well-established solution, SCANeR, and brought it from its existing technology to Unreal Engine. This transition was a very important move for the R&D team at AV Simulation. And they are here today to talk to us about it. THOMAS NGUYEN THAT:
Hello everyone. I am Thomas Nguyen That, automotive domain director at AVSimulation, and I will present the amazing work we have been doing with Epic Games to develop the new generation of SCANeR. So we develop SCANeR, which is an open, modular, and scalable driving simulation software package. SCANeR is a tool to simulate
a virtual environment to test automotive systems. It means that we build virtual roads, virtual traffic conditions, and scenarios for automotive engineers who work on ADAS systems, headlamps, chassis, body design, and HMI. We also build driving simulators, which are very big machines with screens and motion systems. At the moment, we are developing
the two largest driving simulators in the world. One for BMW and the other
for Renault. Our aim is to accelerate and secure the development of autonomous vehicles. There are two big challenges for these incredibly complex systems. The first one is that you need to make sure, before putting the vehicles on the streets, that they will be safer than human drivers. It means that you have to prove that they can perform in any day-to-day situation: at every time of the year, in any weather or traffic conditions, and on every road in the world. It's such an
incredible challenge that it has been computed that proving this in reality would require billions of miles of real driving, which would take hundreds of years. Simulation is then
the only solution. The second challenge is the
interaction between the driver and the system. You can build a very
sophisticated autonomous system, but if it is not understood or trusted by the humans in the car, it will be useless. Simulation is, again, very useful for studying driver behavior in complex driving tasks, because you can observe how the driver behaves and reacts to dangerous situations in a very safe environment. It means that we
have to simulate not only for the driver but for
the complete perception system. And it's a much more complex
task because the human brain can make adaptations to
understand the difference between the simulation and the
reality, but the sensor system won't. Any slight difference between
the simulation and the reality will change the results. To address these
two big challenges, we had to invent and develop the
next generation simulation tool. To immerse not only
the driver but also the complete perception
system: camera, radar, lidar. And for that reason, we have chosen Unreal from Epic Games. Why? Because it's the most realistic and performant 3D engine on the market. It's physics-based, and we need to bring trust to the simulation results. It's very complete, with a full editor, and very easily accessible to end users. It also brings a
very rich marketplace with a lot of high
quality 3D assets. We also found out that
Unreal was very popular among our customers. We have developed a
complete, seamless workflow that very easily brings high realism to our customers with their existing assets and scenarios. We have developed this
very rich and very detailed environment model where
you can dynamically change the time of day,
the weather conditions, and where you can have multiple vehicles, pedestrians, and animals in the scene. You can program very complex
scenarios from everyday life. We also have developed
NCAP scenario tests with our partner Utextron
that behave exactly like the ones on the test track. We have also developed an automotive camera model that generates photorealistic images and exhibits exactly the same defects as the real camera. That really allows us to verify
how your perception system will behave in any complex situation. We also provide a
more advanced workflow for advanced users with development skills, so they can completely customize and enrich their rendering. We also have a very
complete roadmap to improve the
environment creation, to integrate all
the sensor models and to move our headlight
simulation to Unreal. With this new
solution, our customers can recreate super realistic
simulation environments that they can use to perform
their validation tests and prove that their
systems are safe. This is really a strong
accelerator for them, and it will help bring autonomous systems to the industry sooner than expected. Well, thank you very
much for listening and I wish you a nice event. DOUG WOLFF: We've got one last
story for you, and it's a great one. Over the years, we've had a
bunch of the world's biggest car brands present at Build. But we've never had
Volkswagen until now. The teams at Unevis and Effekt-Etage worked with Volkswagen to create something truly remarkable: a pipeline that ingests raw CAD data, uses AI, and produces marketing-ready assets. It's called Project Solid Machine, and I'd like to share
it with you now. RICK POLEY: Simplification in our
complex industry is always in demand. After all, we at Volkswagen are a very data-driven company. When a customer configures a car on our configurator, they also define the DNA of a digital twin, which is
born right inside of our database. One single car might hold millions
of different personalization and configuration options. Each car part that changes on
the inside and outside of the car originates from a gigantic
database, which holds detailed information about size, function,
identification numbers, and so on. We call this car DNA the PR configuration string. Ever since real-time technology
was able to render an image of our complex digital twins, we were searching for more and
better ways to automate our production pipeline and to capitalize on our wealth of data. PIERRE GRAGE: When VW approached us with
the idea to use real-time technology in order to cut down post production
times of their digital marketing images, I was more than intrigued. So we had to find a way to
somehow convert VW's CAD data into real-time data without changing the
underlying structure. Yet CAD data was never intended to run at real-time performance. In close collaboration with VW,
Epic Games, and Effekt-Etage, we developed a solution to this
challenge called Solid Machine. A challenge considered
by many as the Holy Grail of the automotive industry. Well, at least to marketing it is. MIRKO HAENSSGEN:
In a modern-day real-time engine, the CAD data has to be
entirely simplified, restructured, regrouped, and the appropriate
metadata has to be extracted and assigned to the correct geometries, all optimized for real-time production. Solid Machine handles the error-prone and tedious process of real-time optimization automatically.
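One step of that cleanup, matching inconsistent CAD part or material names to a library of canonical names, can be sketched with simple fuzzy string matching. This is purely illustrative: the names below are invented, and Solid Machine's actual fuzzy logic system is not public.

```python
import difflib

# Invented canonical names; the real Solid Machine dictionaries are not public.
CANONICAL = ["CarPaint_Metallic", "Glass_Windshield", "Chrome_Trim", "Rubber_Tire"]

def correct_name(raw: str, cutoff: float = 0.6) -> str:
    """Map a possibly misspelled CAD name to its closest canonical name."""
    lowered = {name.lower(): name for name in CANONICAL}
    match = difflib.get_close_matches(raw.lower(), list(lowered), n=1, cutoff=cutoff)
    return lowered[match[0]] if match else raw  # leave unknown names untouched

print(correct_name("carpaint_metalic"))  # close misspelling -> CarPaint_Metallic
print(correct_name("totally_unknown"))   # no close match: returned unchanged
```

A production pipeline would of course use a far larger dictionary and tuned similarity thresholds, but the basic idea of snapping noisy names onto known ones is the same.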
Our AI routines analyze the CAD data, check for errors, and correct them wherever necessary. PR string metadata is connected to the correct real-time geometry, and wrong shader, material, or geometry names are corrected by a fuzzy logic system. Digital materials are
segmented and automatically assigned to the correct real-time car parts. The CAD data will be ready to use in
the Unreal Engine with breathtaking performance, but we also need to
reduce and fix a lot of polygons. This is where our Solid Slim
module comes into play. Our own semi-automated reduction
algorithm, Solid Slim, automatically detects polygon errors and corrects
them before the reduction process. Geometry parts can be
categorized into reduction groups so that the 3D operator does not have to make adjustments for the same kind of part over and over again. With Solid Leap,
we've created an AI tool for artists. Leap analyzes the look of a material against a huge material library. This can be a picture of a
car or a reference picture, if you have one. We can see how the car paint
shader is supposed to look. We tell Leap to only look at the car paint shader, and then Leap will analyze the reference picture
and try to match it in Unreal. The AI does the same thing an
artist would do, but it just tries out more variations in less time. It's a great tool to speed
up the artist's process, and the first addition to our AI-powered artist tools for Solid Machine. One of the most challenging
tasks is getting the correct PR configuration string
for multiple car configurations. What do you do if you work at an
agency that has to set up a bunch of different car configurations,
like a Passat variant with certain rims and certain colors? It's highly probable
you don't know any of those PR codes. With Solid Matter, we learn all possible car
configurations of a car while we are processing the geometry parts.
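The idea of checking a requested configuration against learned ones, and suggesting the closest known alternative, can be sketched with plain set arithmetic. Everything here is invented for illustration: the PR codes are made up, and the real Solid Matter AI is certainly more sophisticated than this simple Jaccard lookup.

```python
# Invented PR codes; illustrative only, not the real Solid Matter system.
KNOWN_CONFIGS = [
    frozenset({"PQ11", "RIM17", "COL_RED"}),
    frozenset({"PQ11", "RIM19", "COL_BLUE"}),
]

def check_config(requested):
    """Return (is_known, closest_known, missing_codes, extra_codes)."""
    requested = frozenset(requested)
    if requested in KNOWN_CONFIGS:
        return True, requested, frozenset(), frozenset()
    # Otherwise pick the closest known configuration by Jaccard similarity.
    closest = max(KNOWN_CONFIGS,
                  key=lambda c: len(c & requested) / len(c | requested))
    return False, closest, closest - requested, requested - closest

# An unknown mix of codes: valid rims, but a color not learned for them.
ok, alt, missing, extra = check_config({"PQ11", "RIM17", "COL_BLUE"})
print(ok, sorted(alt), sorted(missing), sorted(extra))
```

The returned difference sets are exactly the kind of "why it is different" feedback described here: which codes the closest known configuration has that the request lacks, and vice versa.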
Solid Matter is able to show the user if a given configuration is
known to the AI and if not, why it is different and what would
be the closest known alternative. The user can then view that alternative configuration, with or without the changes he or she wants to add. This way, we know if the configuration
given is valid, or if there will be car parts that are missing
and might lead to visual errors in a later review stage. PIERRE GRAGE: In Solid
Creator, everything you saw comes together and gives the user the
power of Unreal and Solid Machine in one easy-to-use interface. Solid
Creator works right in your browser. And it feels like playing a computer
game while you are making great looking marketing pictures. So let's say you want to
change the configuration of a car. You can do that via
the configuration tab. This is also the place where
you can check the PR string for buildability through our AI. Let's say you want to
change the car paint. For this, you open the paint tab. Now you see all the possible paint
options for the current PR string that our AI has learned so far. You can change configurations
easily with a click and do the same for the rims. To get a better view of how this change looks, I simply go through the cameras in the upper menu. The PR string adapts in the background
and can be accessed at the bottom of the configuration tab. BJORN KOWALSKI: Solid Machine
and its AI tools help us to cut down production times and really focus on the beautiful pictures instead. The technology eases our pain when working with automotive CAD data and makes more complex processes much simpler. Solid Machine, in combination
with Unreal Engine, is bliss. Not only were we able to adopt Unreal Engine into our production pipeline very fast, we are also able to produce movies entirely within Unreal Engine that people thought we had produced in classic DCC packages with compositing. MARK GRUSKA: With Solid
Machine, we're going to the next level of car visualization. The combination of CAD data with AI components really brings us to another level. I'm looking forward to seeing how we can save effort and speed up the processes. I think that's really the
next big step in our business. PIERRE GRAGE: It was a great honor to
work with Epic, VW, and Effekt-Etage on this project. We can't wait to see what our clients
will do with this new technology. Thank you for joining us. Stay well and have a great day. DOUG WOLFF: So that was Project Solid Machine. And how inspiring was it? But how inspiring were all of the stories that we shared today? To be honest with you, that's really the point of a Build event. The projects we inspire today become
the Build event stories of tomorrow. If you have been inspired and don't
know where to start, the best thing you can do is go to UnrealEngine.com
and download Unreal Engine. It's free. Once you begin your Unreal journey,
we would love to hear from you. And there's lots of ways
that you can reach out. You can go to our Automotive
hub page that collects all of our industry stories in one place, you can sign up for the
newsletter there, and we would encourage you to do that. If you have a more specific need, for example, we have a great training department. They can take your whole team and
skill them up on Unreal Engine. Or maybe you'd like to join
something like the development initiative we have for HMI. You can reach out to us on social. Our tag is @UnrealEngine
pretty much everywhere. Twitter, Facebook,
Instagram, LinkedIn. Now this event here is
designed to be the kickoff of automotive content in 2021.
We have a great lineup of stuff. We've got webinars
around specific topics. We have a web series called The
Pulse that has automotive episodes. We do summits with small
groups of people and really dive deep into certain topics. There's a lot of it coming up. The way to stay in
touch with that is just to watch your inbox because that's where
we'll be dropping all the information. Now at the top of the event, I
asked you guys to socialize and chat throughout the whole thing,
and I hope you've been doing that. And just because we're coming to a
conclusion doesn't mean we're going to stop. Off to the left here, you can see the link
for our dev lounges. These are specific chat rooms
that are staffed by all the people who have appeared in this film, where you can ask all of your questions and interact some more. They'll be open for about an hour. So I hope to see you there. I would like to take this opportunity
to thank all the speakers. It's their content and time
and dedication to being filmed that really makes up all the
really interesting stuff in this event. And finally, I'd like to thank you,
the audience, for your attention. Have a great day.