DAVID WEIR-MCCALL: Hello, everyone. Thanks for joining us, and welcome
to Build: Architecture 2021. My name is David Weir-McCall,
AEC business development manager at Epic Games. BELINDA ERCAN: And my name is Belinda
Ercan, product marketing manager for Twinmotion. We'll be your hosts for today's
event, and in the next 90 minutes, we'll explore exciting
customer stories, and we'll learn how
real-time technology is changing the world of architecture. DAVID WEIR-MCCALL: Right. And that's why we've gathered
these amazing customer stories from across the
globe, which will be just a glimpse of what is possible. But if you want to learn how to do
it for yourself, after the event, we'll share details and resources
and our post-event messaging at unrealengine.com. BELINDA ERCAN: Right. We are going to present
our stories with the help of the virtual production
technology currently changing the face of Hollywood. If you've seen the popular
Mandalorian series, you have seen Unreal
Engine in action, creating virtual worlds in real time. David and I are here in
a Montreal soundstage, and we've created a virtual
studio setting custom-made just for this event. This will be the stage. And we'll be covering three acts. Act number one will be all about
storytelling with Twinmotion. We'll go over to act two, which
goes beyond with Unreal Engine. And the final act will be building the future with digital twins and open worlds. DAVID WEIR-MCCALL: At the
end of the 90 minutes, join us in one of our dev
lounges for some real-time Q&A with many of the people involved
in today's presentations. We're going to kick things
off with our general manager of Unreal Engine, Marc Petit. But please stick around
for the entire event, as we'll be sprinkling little
announcements in each of the acts that you will not want to miss. So buckle up. Your trip to the future starts now. MARC PETIT: Hello, everybody. My name is Marc Petit, and
I'm the general manager for the Unreal Engine at Epic Games. As you know, Epic Games
is assembling a portfolio of solutions for content
creators across all media, with a particular focus
on the needs of the AEC industry and visualization. You're familiar with
the Unreal Engine, but we also have our
Twinmotion product, which is, for you, the simplest way
to harness the power of the Unreal Engine. And we also, more recently,
brought into the family the RealityCapture team, which
is the best solution out there in terms of photogrammetry
and giving you this ability, using drones, or lidar
scanning, or photography, to actually capture the
existing 3D world as built and bring it into the virtual
world with your design. The reason why you're hearing so much
about game engines in the AEC industry, and particularly the Unreal Engine, is that they bring photorealism to the real-time domain. You've seen this with the UE5 demos. UE5 is going to go even further in terms of visual fidelity. The promise of that photorealism
in real time is important. Real-time 3D is kind of a revolution
in content creation for two reasons. Reason number one. It's the most efficient
way-- cost-efficient and time-efficient way-- of
creating your visualizations, creating those pictures and those
videos to represent your designs or to sell your designs. The other important aspect about real-time 3D is that it enables digital twins. You're familiar with the concept, and we believe digital twins are very
important for owner-operators to make sure they understand
how their assets are performing. And today, you will hear a number
of presentations about digital twins and how the Unreal
Engine actually allows you to get to the digital
twin in a very easy manner. As you know, architecture
is all about spaces. So the ability to explore a design
in VR is critically important. We bring that to the
table, especially with the level of visual
fidelity that we achieve now. Or just letting people
interact with the space-- guided tours, or even
use the interactivity to build those condo
configurators that we've seen. And finally, more
recently, we announced Sketchfab joining Epic Games. Sketchfab is important
for us, because this is how you publish your
content on the web-- how you publish your
3D model on the web, just like you would do by
putting a video on YouTube. Now, you can take your 3D
models and put them on Sketchfab to make them available to
everybody on the internet. We think that real-time twins are
transforming the way you create your visualization and your content. And also, real-time 3D is
powering your digital twin. As the internet embraces
more real-time 3D, we'll see the creation
of more ambitious projects around those shared virtual spaces,
often referred to as the metaverse. We don't quite know what the
metaverse is going to be. But we think that the
right way to get ready for it is to embrace real-time
3D, embrace the Unreal Engine, and start bringing
your design into it. Thank you. Today, we have a great lineup
of speakers for this event. I hope you enjoy the content
and hearing from your peers. Thank you very much, and
we'll see you in the chat. DAVID WEIR-MCCALL:
Thanks, Marc, for sharing how real-time technology enables
content creation for creators, moving us all to the
future metaverse. BELINDA ERCAN: In this first act,
storytelling with Twinmotion, we are introducing a
range of customers, from freelancers up to
large firms, that use Twinmotion for visual storytelling. And believe it or not, we have
plenty of those storytellers, because over one million people have
already registered for Twinmotion over the last couple of years,
which explains why Twinmotion is the second most popular
real-time tool used by people who care about their pixels. DAVID WEIR-MCCALL: What's
the most popular one? Well, according to the 2020
CG Architecture Survey, it's the Unreal Engine. And today, we're going
to help you understand why there is all of this momentum
towards our real-time solutions. BELINDA ERCAN: Right. Now, we know many of you
are designers and creators, and you just want your
visualization tool to be kind of like that
extension to your creative mind. But such an extension needs
to be very simple, fast, rich in
assets, and collaborative. And let's not forget, all
of this with the highest visual quality right out of the box. Sounds crazy, I know. Sounds unimaginable. But you really don't need
to imagine this any longer, because Sam Anderson has something
very interesting to share with you. SAM ANDERSON: Thank you very much. I'm Sam. And as many of you may know,
the next Twinmotion preview has just been released. I'm here to take you
through a few key features. Here, you see Twinmotion's newest
photorealistic rendering technology. The new release leverages
the power of Unreal Engine. You'll be able to render your 3D
scenes with path-tracing technology right inside your viewport. It's going to take advantage
of all of the GPU hardware to give you a very accurate,
high-level visual quality of shadows, lighting,
and reflections. If you're used to working with your current workflow, that's totally fine. You can keep doing that to get
quick prototyping and immediate results. And then for later design
stages, you can quickly switch to path tracer
mode if you need to add that extra level of
photorealism to your story. To further refine your story,
we've added nearly 300 skydomes into our Twinmotion library. These are going to give
your models that extra touch of realistic lighting quality. You can browse through a
variety of skies, or even upload your own custom HDRI. Now, if you would like to,
you can combine the path tracer with the HDRIs to leverage the information in the sky texture, so you can get the correct reflections in the background and
in your glass, making your story that much more immersive. Now, in addition to
the skydomes, we've added other assets into
our Twinmotion library, including 3D plants,
decals, and furniture, all optimized for
real-time performance. Currently, we have 10,000 assets
inside our Twinmotion library. We'll be able to add
into in Twinmotion assets as they are developed. Now, this will be regardless
of the Twinmotion release so that you can bring in
your assets at any time. Many design firms work with design
components in different softwares. For example, you might have
structural data in one program and your landscape data in another. For us, it's very important that
Twinmotion helps your data stay more interconnected and streamlined. We've been continuously improving our
Datasmith plugin technology, which now enables you to link multiple
files of various CAD data into one single Twinmotion scene. You'll be able to
reload updated geometry, develop the models
in one environment, and even take your
project into Unreal Engine if you need that next
level of customization. The Datasmith plugin opens up
a degree of interoperability that was not present before. Now, the creative
design process isn't all about designing and visualizing. It's also about collaboration
and communication. With our Twinmotion Cloud, you'll
be able to share your design presentations to anyone, regardless
of software, hardware, or location. Anyone with an invite link
can access your presentations through their browser on both
desktop and tablet devices. Last, but not least, we added a
new content type, panorama sets, which allow you to curate
lightweight, high-quality 360 images. With the next release
of Twinmotion, you'll have a true design companion tool. It will be simple, rich,
powerful, and connected. Be on the lookout for an offer
from us to download it for free. BELINDA ERCAN: Thanks, Sam. You see, that simple, fast, collaborative extension of your creative mind is becoming
real with the next version of Twinmotion. I'm sure you'll be keeping your GPU
busy with the path tracer right out of the box. Also, there's a lot in the
release beyond what Sam had time to show you, so make
sure you try it out for yourself. And please share your work with us. I'm looking forward to your stories. Now, speaking of
stories, the next one is about how Twinmotion
actually changed someone's life. I'd like to introduce Pawel Rymsza. PAWEL RYMSZA: Hi, my name is Pawel. I come from Poland, and I was a
software analyst for five years. And at the time, I was also
finishing my first apartment. And I thought it would be really cool
to be able to show it to my family before it's built. That's the time
when I found out about Twinmotion. And I quickly downloaded it
and started to work in it. And after a few days, I was able
to showcase a quick walkthrough video to my family. They were really, really
impressed by the quality. And in the end, that's exactly
how my apartment looks. So I was able to make the final
version of it inside Twinmotion. I found out about a Twinmotion challenge which was taking place. The brief was a short animation of a modern building in fall. And actually, I was able to win it. After a short while, I found out about
another Twinmotion challenge, and I had to create a green
space in an urban jungle. And I created an image which actually won again. And that gave me an additional
boost of confidence to think about it maybe not
as a hobby, but something which I could do for a living. So I decided to
create my own company. I was finally able to
sustain myself with my work, which was a great feat. And that gave me a big boost of confidence to just keep working
and keep getting better. The reason why I
started using Twinmotion was because it was super easy to use. It was really quick. I had no architecture
background, yet I was able to create my
apartment in a few days. I'm still surprised
by how quickly I'm able to create a vast landscape with
trees, grass, people walking, cars driving. A few clicks, and you have
everything set up for the render. For 99% of people,
the things which you can do in Twinmotion
in minutes are more than enough to present to the client. The current version of
Twinmotion introduces one of the biggest improvements
so far, which is path tracing. It's actually a way to make your
scenes absolutely photorealistic, especially visible in
interiors, where light bounces through different objects, and thus
creating those soft shadows which are now present inside Twinmotion. Other than that, there
is also HDRI skies, which allows you, with one click, to
change whole scene from day to night and give you also a correct light
representation inside your room or inside your apartment. Thank you very much for
being able to tell my story. I really encourage everyone
to start using Twinmotion. It's really easy and fast. And hopefully, you will have as great
of success in using it as I did. Have a great day, and
thank you very much. BELINDA ERCAN: Thanks, Pawel. It is indeed an incredible,
life-changing story, and certainly an inspirational one. If you want to see more
of Pawel's inspiring work, head over to our official Twinmotion
community group on Facebook. Now, speaking of the
community, we also wanted our talented Twinmotion users
to share their passion with you. So we'll have them show
their artistic work in this short compilation. ANH PHAM: There are three things
that I love about Twinmotion. First, it's very intuitive. As a PC gamer, I don't have any
trouble navigating around the scenes and getting used to
the user interface. ARNE VAN KEER: For me, it's all about
images and animations, preferably as realistic as possible. Its ease of use-- whenever you like a
texture but you don't like its color, with one slider,
you can change your entire scene. PIERRE-ANDRÉ BIRON: What we love the
most about Twinmotion in our studio is the perfect integration in the
pipeline with the Archicad plugin that allows us to import, very
fast, our project in Twinmotion and to make some changes
very, very easily. BLESSING MUKOME: Making Twinmotion
visualizations interactive through the Twinmotion-to-Unreal
Engine bridge, as well as through cloud
sharing experiences. ANH PHAM: Second, the
rendering time is very short compared to other programs. ARNE VAN KEER: It's easy when a
client is with you at the office. You can just change
some simple lighting, and your scene will look completely
different without having to wait. The cloud is also a good option. This way, the end user can
easily walk around in his design without having to download anything. CONMES: You dive into your project
and you get the result in real time. That's perfect. And you get what you see. JÉRÉMY TESSIER: The updates and
the latest iteration of the Quixel Megascan library make it the
perfect tool for me today-- a real pleasure to work with. DAY SENA: The VR experience
is great, especially if you want to share
your project with others. That is the reason that I
started to use Twinmotion. And when I'm working
with Twinmotion, my mind escapes from the domain
of the daily architecture. And I see possibilities of
creating new forms of art. ANH PHAM: Third, it's very easy to
teach the program to my colleagues and integrate it into our
design workflow at the firm. BILAL KHAN: It's just
really fun to use. Since everything is real-time,
the response is real-time. Really enjoy the exploration
of design options. It just feels effortless. ANH PHAM: It's not
just about the program, but also about the social aspect
when people come together and share their knowledge. To me, it's been a very
fun journey so far. JÉRÉMY TESSIER: The Twinmotion user
community is super friendly and very stimulating, and never
lacks imagination. I learned so much with them. Thank you, Twinmotion teams. BLESSING MUKOME: I think what
I like most about Twinmotion is the creative
freedom it provides me to express my ideas,
my work, in what I find to be the simplest way possible. JÉRÉMY TESSIER: Have fun,
and never stop creating. Bye! DAVID WEIR-MCCALL:
What a great community. Amazing to see the different ways
that people are using Twinmotion. BELINDA ERCAN: And
speaking of community, we are often asked, where can I find
someone with real-time skills? Well, recently, ArtStation, a leading global platform for discovering and showcasing art and connecting creators with opportunities, joined the Epic Games family. If you're a creator,
ArtStation is a one-stop shop with all of the tools needed
to showcase your artwork, improve your skills,
and develop your career. DAVID WEIR-MCCALL: That's right. And it really was designed
with artists in mind. You can share your 4K images, videos,
360 panoramics, or even full 3D models. It's also a great place to earn
passive income selling digital or printed assets, or even learn
something new through ArtStation learning. And for anyone looking
to hire a creator, ArtStation is an excellent
free tool to easily search and browse artist
portfolios from a range of different industries. BELINDA ERCAN: And now, our next
story is about transformation, and comes from Ernesto
Pacheco from CannonDesign, which is a large architectural
firm based in New York. And they will show you how they
used real-time to transform their process with Mercy Hospital. ERNESTO PACHECO: Hello, my
name is Ernesto Pacheco. I am the director of
visualization at CannonDesign. We are a global design firm with over 20 offices throughout North America and Asia. My main role at CannonDesign is the research and development of digital technologies for our design practice. We have been using Unreal Engine since 2015. We introduced this tool as a more interesting way to tell the story. The only thing that you have to
take into consideration with Unreal Engine is that it's a
very sophisticated tool. It takes a little bit
longer to get used to. And also, training takes
a little bit longer. So when Twinmotion came to
light, we were very excited. We saw Twinmotion as this
one-step solution for designers. We wanted to expedite
especially animation processes, and have smaller teams working
on tasks throughout the firm. One of our talented designers
from our St. Louis office, Allison Mendez, had
an animation project that she needed to
complete for a client. This project included
over 300,000 square feet. And for this particular project,
we had interior, exterior shots. We also had a lot of
landscaping elements moving. We actually partnered with
a landscape architect. So we were able to share the
Twinmotion library with them and have them optimize landscaping
elements we could use and iterate throughout the space. We are always
fascinated with how easy it is to iterate with Twinmotion. One of the challenges
that we actually faced is how we share these types
of solutions with clients. We used to export an executable,
share that via a cloud service, and then the client would have to
have access to a gaming laptop to actually open these files. So when Twinmotion offered the cloud
service, we jumped right into it. Having access to this cloud service
allows designers to just send a link, then the client
can just open that on a laptop or a mobile
device, for instance. They don't have to have the dedicated
hardware with a high-end GPU. One of the things that
we love about Twinmotion is how easy it is to
just open the file and jump on a virtual call
with the client or design team and have these organic
conversations happen. We can move things around,
change materials, add assets, and everything happens in real time. You really don't have to wait
for anything to pre-render. It creates this organic
interaction with them, and the whole experience
becomes more natural. It's a testament to how
easy it is to pick up Twinmotion for any type of work--
any type of visualization work, in this case. And we were happy with the outcome. We have Ricardo Orfila
from our Baltimore Office to take on this project. This was actually his second
animation project ever. I was very impressed by how easy it was for him to pick it up and
actually lead an animation project all by himself. In a nutshell, Twinmotion
is super easy to use. It's amazing when somebody
opens Twinmotion and starts playing with it. You can see their
smile on their faces. It's super fun to use. You have access to a
real-time solution. That means that you can
have real-time reviews with clients, which is priceless. It's something that, now,
we do on a daily basis. Thank you again for the
opportunity, Unreal Build 2021, and I hope to see you soon. DAVID WEIR-MCCALL: Thanks for
sharing that story, Ernesto. The Twinmotion cloud
service is really helping firms connect and
collaborate with their customers. And with a new panorama
feature that we're launching for Twinmotion Cloud,
we're excited to see what you can do. Next, let's go big. And by big, I mean Jens Kaarsholm
and Vladislav Saprunenko from Bjarke Ingels Group,
otherwise referred to as BIG. See what I did there? They're headquartered
in Copenhagen and known for their innovative design approach. And they're here to talk
to you about how Twinmotion is helping them tell their stories. JENS KAARSHOLM: I'm Jens
Kaarsholm, BIM director here at BIG in Copenhagen, Bjarke Ingels Group. Normally, I'm looking
after our BIM operations, but I've been an early adopter
and a long-time advocate for real-time rendering, and
also one of the ones pushing a lot for real-time rendering
internally, here at BIG. And what we particularly like
about Twinmotion, in this case, is the animated part, and being
able to really tell a story. That's where we find Twinmotion
being the one to go for. The project we want to
showcase today is our AICD4 for the Terminus group
in Chongqing, China, which will be the new headquarters
for the Terminus group, and also this sort of innovation
hub, which will be super exciting. It's a project that we
worked on for quite a while. And obviously, as you
can imagine, we are doing a lot of different options
with all of our projects here at BIG. It's obviously a fantastic way to be
able to utilize real-time rendering to be able to sort of walk the
project, even in the early stages. And it provides us
a better way to hone in on what option we want to
go for and to sort of push the amount of options that we go
with further into the project, where we would normally have
to narrow down way earlier. In this case, we can keep
more options in play. And I think that's where
real-time rendering has its big positives that you can
really explore, and utilize different options, and
play around with things, and still get a really good result
out in a very short time frame. VLADISLAV SAPRUNENKO:
Hi, my name is Vladislav. I am a visual designer at Bjarke Ingels Group, here in the Copenhagen office. Since the core idea of
the design was to remove the barrier between the
architecture and nature, and also show what robotics and artificial intelligence would look like in the future, we needed to emphasize those ideas. Twinmotion, and real time in general, helped us to find good solutions
during the design process, and also in the final
video production. We're using the real
time engine not only for the final production
of digital media, but also during the
early design stages, such as searching for ideas or seeing how the space will interact with people. And the real-time engine
allows us to actually be a human inside the project and see how it would work. We learned that Twinmotion has
a lot of really handy tools that allowed us to reduce production time and meet deadlines, such as the huge material library and the 3D models, which also allowed us to explore different scenarios that we wanted to present to the client and approve different
architectural ideas. If I can compare the real time
engine to the traditional ways of digital production, I
can say that the artists can feel the pure freedom in
terms of what they're creating, such as the moving
images or animation production, because the
software allows you, having no knowledge of
the technical aspects, to still be able to produce
nice images and animations. JENS KAARSHOLM: Thanks for watching. I hope you enjoyed what
we showed you here today. And enjoy the rest of the event. Thanks for having us. DAVID WEIR-MCCALL: Thanks, Jens. Thanks, Vladislav. Looking forward to seeing more. Now,
according to Architectural Record, HOK is the fifth largest
architectural firm in the US. And as today's event
is a journey, we're going to see HOK appear
in each of our three acts to show how Epic
products are being used across a firm in different ways. We're going to start by bringing on
Greg Schleusner and Katelyn Lagace to talk about how they're using
Twinmotion for interactive design reviews on a rather
important building. GREG SCHLEUSNER: Hi, my
name is Greg Schleusner. I'm the director of
design technology for HOK. HOK is a global architecture,
engineering, and planning firm. I'm going to talk to you today about
the Centre Block Project, which is Canada's parliament. The Centre Block Project
is a full seismic, architectural, interior, electrical and mechanical, and heritage upgrade to the building. During this time, the
parliament has actually left the building for the
duration of the project, which is at least 10 years. Fundamentally, the complexity of
this project is compounded by the fact that it's an ongoing
construction site. We're continuously doing
laser scanning, monitoring, and many other activities that
help us understand the project. The ability to bring all of
this information together, the ability for us to execute
on all of these different areas of expertise, is
fundamentally why HOK and WSP were brought onto the project. Now let me introduce you
to Katie, who will show you how we're using
Twinmotion on the project. KATIE LAGACE: Thanks, Greg. I'm Katie, and I work on the
visualizations on the Centre Block Project. We used Twinmotion and
Unreal Engine as a way to bring our clients
with us on our journey through this project in real time
through the different design phases and iterations. We take the scan data that
we're getting from the exterior and interior of the
building and we process that through RealityCapture, and we get
point clouds and high-poly meshes. We take those high-poly meshes
through our procedural optimized workflow, through
Houdini, and that way, we can get some LODs for use in
Unreal Engine and Twinmotion.
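As a rough, hedged illustration of that LOD idea only (HOK's actual workflow runs through Houdini; the open-source Open3D library and the file names below are illustrative stand-ins), successive levels of detail can be generated from one photogrammetry-derived mesh by quadric decimation:

```python
# Hedged sketch only: producing LODs from a high-poly photogrammetry mesh.
# Open3D stands in for the Houdini workflow described above; the file
# names are hypothetical placeholders.
import open3d as o3d

mesh = o3d.io.read_triangle_mesh("scan_highpoly.ply")
print("source triangles:", len(mesh.triangles))

# Halve the triangle budget for each successive LOD level.
for lod in range(1, 4):
    target = len(mesh.triangles) // (2 ** lod)
    simplified = mesh.simplify_quadric_decimation(
        target_number_of_triangles=target)
    o3d.io.write_triangle_mesh(f"scan_lod{lod}.ply", simplified)
```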
With this, we were able to create visualizations for our internal
teams and our clients. An example of this is, we were able
to conduct multiple accessibility studies with our design teams. We were able to invite differently
abled people into our live demos, and they were able to
put on the VR headset, explore around, and
actively tell us what worked and what needed revisiting. This helped the designers a lot
to see their designs in real time and see, for example,
what was within reach and what wasn't for
someone with limited mobility. The flexibility to
do this in real time saved us a lot of time and effort
during the production process. GREG SCHLEUSNER: Thanks, Katie. We really like the
way Twinmotion makes it easy to iterate
early on in a project. What makes it unique,
though, is the fact that we can transition
that data into Unreal. This is something we'll show you
later on in the presentation. BELINDA ERCAN: Thanks,
Greg and Katelyn. As Greg mentioned,
part of our vision is to allow Unreal Engine
users to also benefit from Twinmotion's rich 3D content. And that's why, in the last year,
we've released all of the Twinmotion content for free on the
marketplace, and we've built a bridge to import Twinmotion
projects directly into Unreal. DAVID WEIR-MCCALL:
And with this bridge, you can start a project in
Twinmotion and continue working on it in Unreal Engine. We're also pleased to
announce that we've been working with the
team at Vectorworks to improve the Datasmith
integration for Vectorworks 2022. The new workflow uses our
Direct Link technology to dynamically sync
changes while never losing the work you apply
into in Twinmotion or Unreal. This new workflow allows our
tools to integrate more seamlessly into your workflows,
and lets you iterate more effectively and creatively. BELINDA ERCAN: That's right. And in addition, for a
limited time, Vectorworks has announced that all
current Vectorworks users will get a full license of the current release of Twinmotion for free. But Vectorworks users aren't the only ones who can benefit from our efforts to make Twinmotion
accessible to as many people as possible. We are proud to announce
yet another partnership, this time with the world's
largest provider of PCs, Lenovo. Many of Lenovo's professional
workstations, powered by NVIDIA RTX, will be eligible for a free copy of
a commercial Twinmotion release. This is a great combination
of both software and hardware. DAVID WEIR-MCCALL: And it's
available for a limited time only. So stay tuned, and we'll be sharing
all the details after the event. With that, let's take a short break. You can go stretch your
legs, grab a cup of tea. And we'll join you on our next leg
of the journey in Unreal Engine. DAVID WEIR-MCCALL:
Welcome back, everyone. Well, that was an
action-packed first act with the announcement of the
latest version of Twinmotion, with the Lenovo partnership,
and the new Vectorworks plugin. There are a lot of great things
happening for you around Twinmotion. In Act One, our customers showed
the storytelling potential of Twinmotion. But what's great is that the
journey doesn't end there. BELINDA ERCAN: Right. The journey continues. And in this next act, "Going
Beyond with Unreal Engine," you'll see customers
using the engine to go beyond the limits of traditional
visualization and storytelling. And to kick this section off,
we'd like to bring in our product managers, Rod Recker and Pierre-Félix
Breton to talk about how Unreal Engine 5 can take things
to a whole new level. ROD RECKER: Thank you, David. Some of you may have heard of
something called Unreal Engine 5. I'm going to tell
you about just a few of the many new capabilities
of Unreal Engine 5 that will help AEC users. Lumen is a game-changer
for interior lighting, Nanite allows for massive worlds
with unprecedented object counts, and our new large world
coordinate systems will allow you to create
projects that can span the Earth.
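To see why that matters, here is a hedged back-of-envelope sketch (illustrative numbers only, not engine code) of how 32-bit floats lose centimeter precision at planetary distances, which is exactly the problem double-precision large world coordinates address:

```python
# Illustrative sketch: float32 runs out of precision at planetary scale.
import numpy as np

offset = 10_000_000.0  # 10,000 km from the world origin, in meters
step = 0.01            # a 1 cm move

moved32 = np.float32(offset) + np.float32(step)
moved64 = np.float64(offset) + np.float64(step)

print(moved32 - np.float32(offset))  # 0.0   -- the 1 cm step vanishes
print(moved64 - np.float64(offset))  # ~0.01 -- double precision keeps it
```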
A lot of you have been using the Unreal Engine 5 early access build
and have already seen how significant this release is. That's why one of the most
exciting aspects of Unreal Engine 5 is that it takes real-time
interactive visualization to a whole new level of quality. With it, you can easily bring in
data from a number of sources, interact at higher
levels of visual quality, and make stunning, physically
accurate imagery all in one tool. In short, Unreal Engine
5 is making it possible for you to spend more time designing
and less time managing your data and renderings. Datasmith is one of the
elements of our tool that makes it easy for you to bring
your CAD data into Unreal Engine and Twinmotion.
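As a minimal, hedged sketch of what that can look like when scripted (this uses the Unreal Editor's Python scripting with hypothetical file paths, and assumes the Datasmith and Python Editor Script plugins are enabled; it is not the only way to bring data in):

```python
# Hedged sketch: batch-importing Datasmith exports inside the Unreal Editor.
# Paths are hypothetical placeholders.
import unreal

for path in ["C:/exports/structure.udatasmith",
             "C:/exports/landscape.udatasmith"]:
    scene = unreal.DatasmithSceneElement.construct_datasmith_scene_from_file(path)
    if scene is None:
        continue  # not a readable Datasmith file
    # Import each export into its own content folder of the open project.
    scene.import_scene("/Game/Datasmith/" + path.split("/")[-1].split(".")[0])
```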
If you haven't looked at Datasmith recently, it now includes our
Direct Link technology, which allows both
Twinmotion and Unreal Engine users to create a live
connection with their CAD tools and seamlessly sync their data. You no longer need to spend time
manually exporting and importing data. Direct Link is available
now for our Revit, Archicad, SketchUp,
and Rhino plugins, and is coming soon for 3ds Max. Another important part of
our Datasmith capability is the addition of an SDK that allows
third parties like Vectorworks, Esri, and others to bring their data
into Unreal Engine and Twinmotion. We now have over a dozen AEC tool
vendors supporting Datasmith, and the list is growing rapidly. In fact, it's my pleasure to announce
that Bentley Systems have recently developed an integration of their
iTwin platform with Datasmith. Now you can extend 3D and 4D data
from any Bentley solution powered by iTwin into Unreal
Engine using Datasmith. In this video from one of our
favorite partner studios, Solis, you can see an example where
Synchro construction sequencing data from a massive
project in Central London was exported from iTwin to Datasmith. The construction sequence and
animation is pulled from Synchro and made accessible in Unreal Engine. This project, 100 Liverpool
Street, is under construction by Sir Robert McAlpine
and the 4D planning work was produced by Freeform. Using these tools, you can now
explore your biggest projects in unmatched fidelity and communicate
with your stakeholders in ways never before possible. Unreal Engine was already
known for the quality of its real-time visualization. That's why one of the most
exciting aspects of Unreal Engine 5 is that it takes that
quality to a new level. With Lumen as its
rendering system, you can immediately see interactive
global illumination results with little to no preparation
or waiting for pre-processing. In most situations, you no longer
need to wait for light baking or spend time performing
UV unwrapping. You can get right to
seeing your creation and speed up your design iterations. This real-time capability also
makes it easier and faster to collaborate with your
clients and stakeholders. The combination of Lumen with our
physically accurate path tracer that was introduced
in Unreal Engine 4.27 shows our commitment to making Unreal
Engine a complete solution for all of your visualization tasks. You save time because
you no longer need to move your design data between
different visualization tools. You can stay in Unreal Engine and get
both high-quality, real-time visuals and physically accurate stills
and animations for marketing or simulation purposes. Now, Pierre-Félix Breton will tell
us about some additional things that we're working on for Unreal Engine 5. PIERRE-FELIX BRETON: Thank you, Rod. First, I would like to thank the
ArchVis community for demonstrating a strong interest in our technology. We're impressed by
your work showcasing the capabilities of Unreal Engine 5. My team and I are watching social channels, and your feedback helps us build the right product. We look forward to continuing
the conversation with all of you. Now, as Rod mentioned,
visualization artists and designers will be able to use Unreal Engine
5 to deliver beautiful results much faster than before. And there's another important
aspect to the story-- we are seeing more
and more people use Unreal Engine beyond visualization projects. You may be interested in
creating interactive dashboards to monitor the state of
IoT devices installed on industrial equipment,
inside buildings, or to track events happening at the scale of a city. To help you create digital
twins of the physical world, I'm pleased to tell
you that we're working on adding support for web-based
remote control and IoT protocols as part of our standard feature set. For example, we're currently
working on adding connections for MQTT, Node-RED, and OpenAPI. This will make it easier to
start your digital twin projects in Unreal Engine.
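As a hedged sketch of the kind of live feed involved (a generic subscriber using the open-source paho-mqtt library with its 1.x callback API; the broker and topic are hypothetical placeholders, and this is not the in-engine integration itself):

```python
# Illustrative only: a minimal MQTT subscriber of the sort a digital twin
# can use as its live data feed. Broker and topic are hypothetical.
import json
import paho.mqtt.client as mqtt  # paho-mqtt 1.x callback API

def on_connect(client, userdata, flags, rc):
    client.subscribe("building/+/temperature")

def on_message(client, userdata, msg):
    reading = json.loads(msg.payload)   # e.g. {"value": 21.5}
    print(msg.topic, reading["value"])  # update the twin's state here

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect("broker.example.com", 1883)
client.loop_forever()
```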
Later on, we will show great examples from our users showing digital twin capabilities
done with Unreal Engine. But for now, let's have a look at
even more great examples of Unreal being used for AEC projects. BELINDA ERCAN: Thank you,
Rod and Pierre-Félix. I mean, Unreal Engine 5
and Datasmith will truly unlock a level of
real-time exploration we have never seen before. And all of this moves us a step
closer to an interactive version of our physical world. Now, to add on to this,
we're working with companies all over the world to bridge
their workflows with ours through Datasmith. And even though we already
support many Autodesk products with Datasmith, we're not done. We are now working
on a proof of concept to use Datasmith to link Autodesk
Forge and the BIM 360 platform with Twinmotion and Unreal Engine. We'll share more details
about this early next year. And if you're interested
in joining our beta, you'll find all the information
on our post-event web page at unrealengine.com. Now, I want to introduce
Jose Uribe from Pureblink. Pureblink is a long-time Unreal
Engine agency based in Toronto. And they're not only creating
high-fidelity visuals, no. They're building entire
interactive experiences to engage with the public around
a very large development project. JOSE URIBE: Hello, my
name is Jose Uribe. I'm the creative director
and founder of Pureblink. We are a creative digital
agency specializing in real estate visualization,
interactive applications, and storytelling. Over a year ago, we
were engaged by Elad to help them visualize the third
phase of Galleria on the Park. This is Galleria III. It is part of a large
master plan composed of eight different buildings,
an eight-acre park, and a very large community facility
with plenty of activities. One of the key challenges that we
had with this particular project was the fact that it is located
in a very industrial area. So for the typical
prospective buyer, it would be very difficult to really
understand and visualize and experience what
this master planned community is going to be like
probably 10 years from now, once everything is built. Our
approach was to not only create visuals or renderings of different
areas of the project where we could visualize what the gallerias or the park were going to be like, but also an interactive
application and a video component. And for all these different
components, we used Unreal, and it was a critical
technology to allow us to reach the level of photorealism
and the level of cinematography in the video. DROR DUCHOVNY: Preparing to
launch a project during COVID had us quite concerned, and Pureblink
provided the 360 solution that allowed us to bring clients
through the community, whether they were in the sales
center physically or at their homes. We believe that Pureblink's
solution provided us the ability to hit our threshold,
our sales threshold, within a shorter period
of time during COVID. JOSE URIBE: Right now,
we're extremely excited about Unreal and all the new features
we've been testing heavily-- Lumen, ray tracing, global
illumination, path tracing, both on 4.27 and Unreal 5. It's absolutely amazing
what you can do with it. Lighting is one of the most
important components of it. Having a tool that allows us to play with lighting-- practical lighting, natural lighting, diffused lighting, ambient lighting, all sorts of ways-- and to try it out in real time--
it's completely priceless. We started to code and develop
our own WebGL application that connects the geometry of the building
with the database of the unit availability.
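As a hedged sketch of the server side of such a link (Flask, the endpoint, and the data here are illustrative stand-ins, not Pureblink's actual stack), the 3D front end can simply poll a small HTTP service for availability:

```python
# Hypothetical sketch: a tiny endpoint a WebGL front end can poll to
# color building units by sales availability.
from flask import Flask, jsonify

app = Flask(__name__)

UNITS = {  # in production this would come from the sales database
    "tower-a-1203": {"status": "available", "price": 799000},
    "tower-a-1204": {"status": "sold"},
}

@app.route("/api/units")
def units():
    # The front end maps each unit ID onto a mesh in the 3D model.
    return jsonify(UNITS)

if __name__ == "__main__":
    app.run(port=8000)
```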
That is the same type of capability that we have for the touch-screen applications. We're experimenting
with SketchUp, in order for it to really be the tool that allows us to connect one single model to our touch-screen applications and our online applications. We're extremely excited
only in the applications that we're doing at the moment but also
in mixed-reality applications, virtual reality applications,
and virtual production using nDisplay. Thanks for listening. I hope you enjoyed the presentation. And enjoy the rest of the show. DAVID WEIR-MCCALL:
Fantastic work, Jose, and great to hear from
Elad how virtual worlds are supporting the real estate industry. Next, we're going to bring on
the team at Foster and Partners who have been pushing the limits
of real-time technology for over 20 years. Let's hear from Francis and
Gamma about a novel approach of merging the physical
world and the virtual to help stakeholders understand the future
in a more immersive setting. FRANCIS AISH: Hi, I'm
Francis Aish, head of applied research and development. GAMMA BASRA: And I'm Gamma Basra,
head of visualization and film. We work at Foster and Partners,
one of the world's leading architectural engineering
and design studios. We communicate visually using
images, films, and real-time game engines like Unreal. FRANCIS AISH: We've worked on
augmented and virtual reality for many years, developing
custom tools that allow our designers to create,
analyze, and experience their ideas in intuitive, yet powerful, ways. Today, we'd like to tell you about
some of our latest AR and VR work. The projects we've worked
on are still sensitive, so we're going to use a dummy project
to illustrate the key concepts. We use a wide range of tools
and media to communicate ideas, as different tools give
different experiences. Physical models can give
great overviews of a design, but we wanted to
explore how best to show the street-scale human experience. We're long-running users of Unreal,
with a very good relationship with Ken and the Epic
team, so we had already built a very detailed,
high-quality visual model of the new tower and
its context in Unreal, and used it for regular VR reviews. We'd also previously conducted
VR and AR design reviews using VR backpacks,
HoloLenses, and iPads. These can be done at 1 to 1
scale indoors in warehouses, and got a very good reaction from clients. We always like challenges, so
together with the design team, we posed ourselves a harder one. Could we walk the stakeholders
around the real site at 1 to 1 scale, but give them the VR
experience of the new building to help them understand
the spatial experience and how the building would
fit in within the city as part of the design and planning process? To begin with, we started
laser-scanning the site and context to align the virtual and real worlds.
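As a hedged sketch of that alignment step (Open3D stands in here for whatever tooling was actually used, and the file names are hypothetical), a scan of the real site can be registered to the virtual model's coordinate frame with ICP:

```python
# Illustrative only: registering a laser scan of the real site to the
# virtual model's coordinate frame with point-to-point ICP.
import numpy as np
import open3d as o3d

scan = o3d.io.read_point_cloud("site_scan.ply")      # real world
model = o3d.io.read_point_cloud("design_model.ply")  # virtual world

result = o3d.pipelines.registration.registration_icp(
    scan, model,
    max_correspondence_distance=0.5,  # meters
    init=np.eye(4),                   # rough manual pre-alignment
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())

print(result.transformation)  # 4x4 matrix mapping scan into model space
```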
The site itself is approximately 100 meters square, and it can be quite a dynamic environment, with pedestrians walking around and changing lighting, which makes for great visuals. Unreal also provided
the development platform we used to create our custom
software called CloudCom. We also work closely with HP and
Microsoft on the hardware side. We then customized the hardware to enable robust multi-user tracking at this scale. Of course, as with so many
projects, COVID got in the way, and for a while we had to
continue working in our bedrooms and back gardens. But because of the
multi-platform nature of Unreal, we could quickly create an
iPad version of our tool and stakeholders could
use that to continue to participate in the process
from their back gardens. Once we were allowed back on-site,
we continued testing and refining the system, getting feedback
from our stakeholders. Critically, this is a
multi-user experience. You can see the other
participants and speak to them and point out features
of the new building. It's an incredible,
multi-sensory experience of walking around the plaza, seeing
the very powerful virtual visuals, but feeling the real
plaza beneath your feet and hearing the sounds
of the city around you. Something we found very
useful was switching between virtual and real
reality to compare and check the present and potential future. And by incorporating screens
on the rear of the backpacks, we made the process inclusive to
allow larger numbers of stakeholders to participate. Overall, the user reaction
has been incredibly positive. We've also developed remote
collaboration functionality to allow remote participants to
join and contribute to the process. And one future use could be to
increase design participation using tablet AR and other low-cost
XR devices to allow a broader range of stakeholders to participate. So we think this has pushed
the boundaries of outdoor 1 to 1 scale VR. This ability to collaborate at
any scale and from any location is incredibly powerful. It's giving our clients
and designers a way to experience their
designs like never before. DAVID WEIR-MCCALL: The idea that
the physical world and the virtual can be brought closer
together to drive decision making is a great approach. And you can see from the hardware
that Foster and Partners had to use that we're still in the
infancy of using these technologies. But there's no doubt it'll
get simpler over time. So thanks Gamma and
Francis for giving us a glimpse into the future of XR. Our next presenters are Henry
and Cesar from Zaha Hadid Architects. Who hasn't heard of this studio? They are one of the world's most consistently inventive architectural firms, and you may have read our story on
their Honduras configurator, but they're here to reveal
details and discuss their results. CESAR FRAGACHAN: We are
presenting to you today an overview of platform design and
game engine technology at Zaha Hadid Architects to deliver engaging
interactive spatial experiences for clients and consumers. HENRY LOUTH: The
computation and design group is a practice-embedded
research team focusing on creating computational geometry
and building strategic partnerships to bridge industry advances
toward project-based applications. We incorporate these advances into
workflows at Zaha Hadid Architects to address issues of sustainable
development of the built environment. CESAR FRAGACHAN:
Shifting design thinking to integrate game technologies
and platform development has empowered our capacity to disrupt
conventional procurement processes and bring together stakeholders
to deliver high value, locally relevant, resource-effective
supply-chain integrated design solutions for residential living. HENRY LOUTH: There
are several problems faced in architectural and
construction industries for which game technologies
are particularly well-suited to negotiate. Construction projects take years
from conception to occupation, during the course of
which markets fluctuate, resulting in misalignments
to the consumer base they were originally intended for. Similarly, the priorities
of key stakeholders, including regulators, developers,
operators, and occupiers, in addition to
architects and engineers, are often competing, resulting
in delays requiring alignment and compromise for success. The Beyabu Residential
Configurator is the latest build in a lineage of residential
platforms developed at Zaha Hadid Architects, with Epic and Unreal
Engine offering investors, occupiers, and developers ways
to configure properties to suit individual preferences, offering
the ability to position units, configure add-ons to the envelope,
and customize the interior space layout and fittings, as well. It monetizes such features as air and development rights in the process, configuring modular building components in real time using a web-based application. CESAR FRAGACHAN: The configurator
leverages real-time ray tracing technology to
achieve the highest level of visual fidelity for both the
interior and exterior of the units. Additionally, Pixel Streaming enables us to share the configurator experience with clients, while custom frameworks let us maintain non-destructive workflows with designers. HENRY LOUTH: Platform
design offers clients a universe of feasible
variations at their fingertips. For instance, there
are over 2,200 ways to configure a custom residence in the Honduras Configurator.
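As a hedged back-of-envelope sketch of where a figure like that comes from (the categories and counts below are hypothetical, chosen only so the product lands near 2,200), a configurator's option space is simply the product of the choices in each category:

```python
# Illustrative only: the total number of configurations is the product of
# the per-category option counts. All numbers here are hypothetical.
from itertools import product

options = {
    "unit_position": 4,
    "envelope_add_ons": 8,
    "interior_layout": 7,
    "fittings_package": 10,
}

total = 1
for count in options.values():
    total *= count
print(total)  # 2240 distinct configurations

# Enumerating them (e.g. to pre-validate feasibility) is one call:
all_configs = product(*(range(n) for n in options.values()))
```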
We have developed and curated a cohesive digital kit of parts that result in vibrant and
diverse residential communities. Exteriors and facade kits
are created fit for place. The platform approach allows local construction companies, building trades, and local craftspersons to benefit from an infusion
of digital technologies into the economy. CESAR FRAGACHAN: Platform design
in architecture, engineering, and construction helps
to test fit, simulate, and explore scenarios to design
market-tested, demand-driven solutions. The configurator can be used
socially to curate a community, to accommodate the particular
spatial needs of family members, invite friends, and connect
to like-minded individuals or to share resources
and costs with neighbors. HENRY LOUTH: Health, well-being,
and protection of the environment are fundamental to
the platform approach, providing consumers with
local and renewable materials, such as sustainable lightweight FSC
timber in structure and cladding options, as well as pre-fabrication
off-site and technology integration, reducing waste and the total
carbon footprint of the schemes. CESAR FRAGACHAN: One
of the challenges we have faced has been interface and experience design: developing device actions, storyboarding, heads-up displays, and sequence behavior to maintain the seamless,
prevalent in gaming environments. HENRY LOUTH: At Zaha
Hadid Architects, our workflow is tuned to
realize complex geometry as fabrication-optimized
and manufacture-ready. We continue to develop
lightweight control mechanisms and retain live geometry
as long as possible to empower design,
interaction, and exploration. However, game technologies pose
distinct real-time optimization criteria for content
creation, management, and version control to prepare
geometry for the real-time pipeline. We are taking steps towards
multiplayer modes of play and distributed gameplay
models to capture real-time decisions of human
actors in a digital marketplace for real estate bidding,
negotiation, and property valuation. CESAR FRAGACHAN: The value
of interactivity and occupier engagement in the
design process resulted in converting money traditionally left on the table into direct sales of housing units. The immersive environment, coupled
with the unparalleled co-authorship of the built environments
in Unreal Engine, increased the margin of closed sales while increasing overall unit variability to suit individuals
and meet investment goals. HENRY LOUTH: Twinmotion's
gentle learning curve allowed multiple Rhino users
unfamiliar with Twinmotion to learn quickly,
deliver stunning imagery, and contribute to content
preparation from day one. This empowered the technical
and creative aspects of the project to move forward in
parallel with a very small project team. With clients and end-users
distributed globally across both hemispheres,
web deployment integration was a priority, which
Unreal streamlined. The web-based platform
removed potential issues with client-side hardware
requirements, technical support, and remote connectivity, resulting in simple, seamless, no-nonsense browser-based design exploration. It's been a pleasure
working with Epic to deliver an unparalleled service
model for architectural design, exploration, and delivery. Thank you for your time. BELINDA ERCAN: Thanks,
Henry and Cesar. It's great seeing more and more
examples of companies building sales configurators for real estate. It's officially a thing. And now, it's time for Greg
from HOK to come back and join his colleague, Mark Cichy, to show
us their Twinmotion project taken to the next level in Unreal Engine. They have created a single
interactive experience of the Parliament Building in Unreal,
combining detailed lidar scans with design data. And with Jozef Dobos
and the team at 3D Repo, they're exploring the
creation of a federated model for their daily visualization needs. GREG SCHLEUSNER: Hi, it's
Greg again, from HOK. Centre Block is such
a complex project. It's important that we give
the stakeholders the ability to monitor and view the
work as it proceeds. Mark Cichy, my colleague, is going
to walk you through that process now. MARK CICHY: Hi, I'm Mark Cichy. I am a principal and the director
of design technology at HOK. We currently have about 75
terabytes of RealityCapture data. That accounts for terrestrial
laser scanning, hand scanning, orthographic imagery,
and photogrammetry. The RealityCapture data
that we currently have represents the first three years
of the project's existence. We still have around
seven years to go, which means that 75
terabytes will balloon by the time we get to the end, to
around 350 to 400 terabytes of data. We've developed an application
leveraging Unreal Engine that allows us to concatenate all
of the aspects of the reality capture we've done to date. This streams the data in
from a variety of sources, including our local data
center and Cintoo in the cloud. We will ultimately
deliver the application via Pixel Streaming in the cloud. The primary purpose of the
Pixel Streaming application is to give context to the
entire trajectory of the project. So it allows you to, in
effect, trace the entire state of every space in the building
throughout its history. That means we can go
back to the beginning, before construction,
when the building hadn't had any demolition acted upon it. And it means that we
can always go back to the very end, where we've
reinstated the finished materials. We can also investigate
anything in between. We can look behind
mechanical ductwork. We can see at what state drywall
was removed and replaced, and if there's any differential
between what the design documents say in regards to
where a conduit may be placed or where it actually is in reality. And we can actually track
that delta and see over the course of the history of
that space where the services are and what the finishes
ultimately transition to. Leveraging Unreal Engine and
Pixel Streaming in this way allows us to create a
comprehensive and robust versioning system of the geometry
and the point cloud scanned data throughout the
project's entire history. We've been investigating
the use of 3D Repo in parallel to create a
supplemental versioning system that allows us to trace our BIM-related
data and the Revit model specifically throughout its
existence so that we can compare that information when
upgrades are performed or when major model
changes are executed. In addition to this, we've been
leveraging the new plugin that's been developed by 3D
Repo to be able to stream that content and those differential
checks into Unreal Engine. Now, I'll hand it over to Jozef
to speak a little bit more about that process and what they
are doing with Unreal Engine. JOZEF DOBOS: Hi, I'm Dr. Jozef
Dobos, founder and CEO of 3D Repo. With HOK, we've been working
on countless innovations over the years, and this
project is no exception. We've been instrumental in
cloud-based data validation and large-scale geometry
processing that would not have been possible otherwise. Using 3D Repo, HOK were able to push
their complicated 3D models straight from the authoring tools to the cloud
for version control and federation purposes. Since all of the data is
available through the APIs, there is no need to recompile
and reissue the game anymore because all of the data is instantly
available through Unreal Engine and 3D Repo, even at runtime.
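As a generic, hedged sketch of that runtime pattern (the endpoint shape below is a hypothetical placeholder, not 3D Repo's documented API), a client polls the cloud for the latest federated revision instead of baking data into the packaged build:

```python
# Illustrative only: ask a cloud service which revision is current, then
# stream only the geometry and metadata that changed. Endpoint and model
# identifiers are hypothetical.
import requests

BASE = "https://api.example.com"   # hypothetical host
MODEL = "teamspace/centre-block"   # hypothetical model id

def latest_revision():
    resp = requests.get(f"{BASE}/{MODEL}/revisions/latest", timeout=10)
    resp.raise_for_status()
    return resp.json()

rev = latest_revision()
print(rev.get("_id"), rev.get("timestamp"))
```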
This way, client-facing visualizations can be developed and kept in
step with the engineering process without any further delay. HOK were also able to retrieve
3D locations, metadata, and other engineering information
directly from the cloud without the need to build
their own infrastructure. 3D Repo has been built directly on
top of the existing AEC workflows. So an Unreal experience can simply be enabled by installing our own plugin. For us, UE5 delivers significant
performance improvements, as well as the decoupling of the
rendering pipeline from the underlying
data structures so that we were able to populate
that dynamically, on the fly. UE5 was not just about visual
fidelity, but more importantly, about processing
capability and prowess. We are very much looking forward to
continued partnership with both HOK and Epic Games. BELINDA ERCAN: Now, seeing all this,
I think it's a good time for a break so you can reflect on all the
magic that you've seen here today before we head in into our final
act, which we'll look at global scale projects and digital twins. So please stick around. We have a few more
announcements to make. DAVID WEIR-MCCALL:
Welcome back, everyone. Before the break HOK
and 3D Repo showed us how real time technology
can be used to federate large models and a glimpse
into the great companies, like 3D Repo, who are incorporating
Unreal in their pipelines. Now, as we enter act three we're
going to go one step further. BELINDA ERCAN: And
one step further means connecting the physical
and the digital worlds. So in this act, it's all about
open worlds and digital twins. You'll see projects that
truly expand your horizons. Because we'll be exploring
global scale projects, and connect real time data to Unreal
Engine to create such digital twins. But what exactly is a
digital twin, David? DAVID WEIR-MCCALL: That's
a very good question. And if you've looked online
recently, digital twins mean a lot of different
things to different people. And I've seen the heated forums. So for us, supporting connection in
Unreal to your live data sources, to empower your digital
twins are really key for us. So an example of a digital
twin would be, say, a digital model of a city that
shows the status of all the traffic lights at any given moment. That would be a good
digital twin of a city. So today we'll touch on all
aspects of creating a digital twin, from capturing the digital version
of the physical asset, all the way to a live data feed. BELINDA ERCAN: So with
Capturing Reality, having joined the Epic
family earlier this year, the possibilities around
capturing the physical world have dramatically increased. And since every digital twin starts
with a copy of a physical asset, we're quite confident that we
have all you need to get started. Now, I'd like to introduce Elmer
Bol to share with you where we are at with Quixel and RealityCapture. ELMER BOL: Hi, everyone. My name is Elmer Bol. I'm a product director
in the Quixel team. For those of you who
don't know Quixel, we've been all about
capturing the world, and creating digital
twins that others can then leverage to create their own worlds. So we've spent the
last decade capturing some of the most iconic
biomes in the world, with the goal of creating the Megascans library that's available to you all today, and that you can leverage
in creating your own worlds. But really for us, the
mission is going to be how do we make scanning more
broadly available to the world? And how do we enable you to go
and scan the world yourself? And with that, we've recently joined
forces with Capturing Reality. We're very excited
about this because they bring the best 3D photogrammetry
technology into the fold. And we're looking forward to actually
making that available to all of you. And this will really
energize us and enable us to create much easier
ways to capture the world, and allow anyone to
create these real-time 3D experiences and leverage these digital twins of the real world. We just released the latest version of RealityCapture, which I really encourage you to go out and try. It's got a ton of AEC features, leveraging drone capture and other types of input. Please go out and try it yourself. You'll be thrilled to use it. But really for us, the
next step is how we can let anyone go and capture the world around us and leverage it in 3D experiences. So with that, we're really
excited to work with you, to get your feedback on this, and
to continue evolving this tool. And with that, thanks very much. I wish you happy scanning. I look forward to
seeing you next time. Bye bye. BELINDA ERCAN: Thank you, Elmer. I mean, there are a lot
of steps to the Metaverse. But this is certainly one of them: RealityScan can help our customers
about this and other announcements at the end of the event. So please stay tuned. Now, speaking of worlds,
let's also talk about scale. You know, in architecture, projects range from small-scale floor plans all the way up to entire city-scale developments. And if you work with such large-scale assets, covering hundreds or thousands of square kilometers, well, we all know that that
can be quite a challenge. But Sebastien Loze, our industry
manager for training and simulation, will now touch on ways
to bring such large scale worlds, along with their plethora
of data, into Unreal Engine. SEBASTIEN LOZE: Thank you, and it's
very exciting to be here today. Hi, everyone. In the simulation world
one of the first questions we get asked when we introduce what we do at Epic Games is: this is all great, but how large a world can I handle with Unreal Engine? This question comes from the past, from a time when monolithic application structures did not allow developers to get more than a few dozen square kilometers into a game engine if they wanted the right level of visual accuracy. Today, the entire world can
be loaded, streamed, or explored in Unreal Engine. And we'd like to introduce
you to these large worlds and how to inject them into
the applications you create. In addition to the photogrammetry and
real world data capture technology the previous sessions
just presented to you, it is important to understand
that a plethora of data is already available. This data, from the GEOINT community, is just waiting to be leveraged in your applications. Data captured from the real world can come as geographic information system files or 3D environment databases, already sitting in a format ready to be loaded and streamed in UE. In both cases we worked with seasoned and smart individuals, teams, companies, and global
organizations to make sure that we're supporting standard
formats for large environments the same way we do for
3D models or CAD data. Over the last few years, many application creators with expertise in the GIS world have added methods to get GIS files as sources in
your Unreal Engine projects. To do this, you can simply use
the applications of your choice, such as World Machine, Instant Terra,
World Creator, TerraForm, Terrain 3D Builder, or Terravista to transform
your GIS files into your 3D environment. We also partnered with the Open
Geospatial Consortium and the US GEOINT Foundation and
others to identify what industry standards
and open standards should be supported so that you don't
even have to create your data set, but you can use ones
from your ecosystem. Today, the Esri ArcGIS Maps SDK, the OGC CDB format, the 3D Tiles format, and the SuperMap format can be streamed directly to Unreal Engine. What we've just described
is the last mile to ensure a complete
pipeline, allowing creators to bring the smallest details
of a piece of furniture to the entirety of the
Earth into Unreal Engine. Now that we have the
content figured out, we need to make sure to develop the right tools inside the engine, like the georeferencing capabilities we recently launched in 4.27. This is just the beginning.
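To give a feel for what georeferencing involves under the hood, here is a small, self-contained sketch of the kind of transform these tools perform: converting WGS84 latitude/longitude/altitude into local east-north-up (ENU) offsets around a project origin, which can then be mapped to engine coordinates. This is the standard textbook math, not the plugin's actual source, and the coordinates used are placeholders.

```cpp
// Sketch of georeferencing math: WGS84 geodetic coordinates to a local
// east-north-up (ENU) frame centered on a project origin.
// Standard geodesy, independent of any particular engine plugin.
#include <cmath>
#include <cstdio>

constexpr double kPi        = 3.14159265358979323846;
constexpr double kSemiMajor = 6378137.0;         // WGS84 semi-major axis (meters)
constexpr double kEccSq     = 6.69437999014e-3;  // WGS84 first eccentricity squared
constexpr double kDegToRad  = kPi / 180.0;

struct Ecef { double X, Y, Z; };

// Geodetic (degrees, meters) -> Earth-centered, Earth-fixed coordinates.
Ecef GeodeticToEcef(double LatDeg, double LonDeg, double Alt)
{
    const double Lat = LatDeg * kDegToRad, Lon = LonDeg * kDegToRad;
    const double N = kSemiMajor / std::sqrt(1.0 - kEccSq * std::sin(Lat) * std::sin(Lat));
    return { (N + Alt) * std::cos(Lat) * std::cos(Lon),
             (N + Alt) * std::cos(Lat) * std::sin(Lon),
             (N * (1.0 - kEccSq) + Alt) * std::sin(Lat) };
}

// ECEF point -> local ENU offsets (meters) relative to an origin.
void EcefToEnu(const Ecef& P, const Ecef& O, double OLatDeg, double OLonDeg,
               double& East, double& North, double& Up)
{
    const double Lat = OLatDeg * kDegToRad, Lon = OLonDeg * kDegToRad;
    const double dx = P.X - O.X, dy = P.Y - O.Y, dz = P.Z - O.Z;
    East  = -std::sin(Lon) * dx + std::cos(Lon) * dy;
    North = -std::sin(Lat) * std::cos(Lon) * dx - std::sin(Lat) * std::sin(Lon) * dy
          +  std::cos(Lat) * dz;
    Up    =  std::cos(Lat) * std::cos(Lon) * dx + std::cos(Lat) * std::sin(Lon) * dy
          +  std::sin(Lat) * dz;
}

int main()
{
    // Hypothetical project origin and a nearby point (Montreal, for illustration).
    const double OLat = 45.5019, OLon = -73.5674;
    const Ecef Origin = GeodeticToEcef(OLat, OLon, 0.0);
    const Ecef Point  = GeodeticToEcef(45.5083, -73.5580, 30.0);
    double E, N, U;
    EcefToEnu(Point, Origin, OLat, OLon, E, N, U);
    // Unreal works in centimeters, so engine coordinates would be roughly
    // (E*100, N*100, U*100), modulo the project's axis conventions.
    std::printf("ENU offset: %.1f m east, %.1f m north, %.1f m up\n", E, N, U);
    return 0;
}
```

In Unreal, the GeoReferencing plugin and tools like Cesium for Unreal wrap this kind of conversion, so you rarely have to write it by hand.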
And more will come with Unreal Engine 5, including World Partition and other tools to simplify large-world coordinate handling for developers. On top of these developments by
Epic, the ecosystem around Unreal is evolving very fast. And today we're introducing you
to the work of Blackshark AI. This AI-based digital twin is a geospatial platform offering a 3D semantic map of our planet. Using satellite imagery and
various other input sources, Blackshark is extracting
information about our environment and reconstructing this
information in a 3D way so everyone can virtually
interact with and better understand the world around us. Recently Blackshark has chosen
Unreal Engine to be the first game engine to deliver the world's
best simulation and visualization capability. Using AI to augment existing geodata is a disruptive, revolutionary workflow for all industries, and it democratizes geospatial data access for everybody. More details about the Blackshark
plugin for Unreal Engine will be announced closer to the
launch of Blackshark's AI Digital Twin USA. And I recommend that you stay
tuned for more on that very topic. As you can see, to the
question, how big of a world can you ingest into Unreal Engine? The answer today is, how
much can you dream of? And with humans looking at the stars, aiming to return to the Moon or walk on Mars, I would not be surprised to see planets beyond Earth simulated in Unreal Engine pretty soon. So with all that, I want
to thank you very much. And back to our hosts today. BELINDA ERCAN: Thanks, Sebastien. With global datasets from Blackshark, Cesium, or Esri, and RealityScan on your phone, it's never been easier to capture your world
and explore it with Unreal Engine. Now, for one final time, let's bring
back the creative team from HOK to learn how they
integrate open worlds and digital twins in their project. GREG SCHLEUSNER: Hi, Greg again. Katie walked you through Twinmotion. Mark and Jozef illustrated
our integration between 3D Repo and Unreal. Now we're going to put the
Center Block on the campus. Sebastien previously mentioned
Esri and Cesium integration. We're going to illustrate how
we're using Cesium on the project. And Mark's going to
do that for us now. MARK CICHY: Cesium presents
an exceptional opportunity for architects and designers, because it allows us to stream geospatial data directly into our models in real time.
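As a rough illustration, this is what adding such a streamed tileset can look like in C++ with the Cesium for Unreal plugin. The asset ID and token are placeholders, and the setter names reflect recent plugin versions (older releases exposed these as public properties), so treat this as a sketch to check against the plugin's headers.

```cpp
// Rough sketch: spawn a Cesium 3D Tiles tileset that streams geospatial
// context into the level at runtime. Requires the Cesium for Unreal plugin.
#include "Cesium3DTileset.h"
#include "Engine/World.h"

void AddCityContext(UWorld* World)
{
    if (!World) return;

    ACesium3DTileset* Tileset = World->SpawnActor<ACesium3DTileset>();
    if (!Tileset) return;

    // Placeholder Cesium ion asset ID and token; use your own from cesium.com.
    Tileset->SetIonAssetID(123456);
    Tileset->SetIonAccessToken(TEXT("<your-ion-access-token>"));

    // Trade visual quality for performance: a higher screen-space error
    // loads fewer tiles. (Accessor name may differ by plugin version.)
    Tileset->SetMaximumScreenSpaceError(16.0);
}
```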
This saves us a significant amount of time and resources that we would otherwise spend manually or parametrically modeling the surrounding context. Plus, it has the added benefit of a community supporting and developing it, which enables additional contributors to create content that would otherwise be difficult to produce on a firm-by-firm basis. This represents real, tangible
benefits to us on the Center Block project, because it allows us to
visualize, along with the client, how the context will
appear within the models that we're developing on the project. We're currently
investigating and developing a tool that will allow us to tie
the real-time sensor data that is being captured by our project partner, WSP, and our structural geomatics team to a Cesium model, which will allow us to correlate both the context and the geospatially rich reality capture data in one
continuous location. There's an exceptional
opportunity here to develop a comprehensive
digital twin that incorporates all of the data that we've
generated from various sources, including our development models, our
geospatially referenced point cloud data, and any other related
construction 6D, 7D, or 8D BIM-related data. We and the client believe that there's an exceptional opportunity to leverage all of the data that we're collecting on the project and incorporate it into a comprehensive digital twin that will be part of the building's operations and facilities management for years to come. GREG SCHLEUSNER: Thanks, Mark. We've shown you Twinmotion, and the many
different ways we're using Unreal. We think this integration of external data, like laser scanning, photogrammetry, and BIM information, into real-time technologies is the future, and frankly, the way heritage projects must be done today. What we're most excited about is how this will trickle into the rest of our work as we move forward. On behalf of HOK and
WSP, I want to thank you for taking the time to let us
tell you the story of our once-in-a-lifetime project, Center Block. Thank you. DAVID WEIR-MCCALL: Thanks, guys,
and the amazing team at HOK for your videos over
the three acts, showing how a company can start with Twinmotion, then evolve its project into Unreal and tackle any scale using our tools and ecosystem. Now, just like we saw in the HOK example, the crucial part of a digital twin is that connection to the real world. Presenting live, real-time data from IoT sensors and other sources in that 3D context can accelerate decision-making and help anticipate problems. You see, the world we live in
is growing ever more complex. And we'll need methods of presenting
information in an intuitive manner. There are going to be a million different forms of digital twins, from helping us make better
decisions in the design and operation of a building, to
public engagement within a city. So let's check in with
some companies who are working to connect Unreal Engine to the real world. First, we'll hear from Shi Yanjie and his team at Vouse. They'll show their efforts to build a digital twin of Changi Airport in Singapore, which in 2019 had almost 190,000 people passing through it per day. So understanding what's going on
at any moment can be quite crucial. YANJIE SHI: Hi, I'm
Yanjie from Vouse. We are a design agency
based in Singapore. Vouse is an assemblage of architects,
programmers, and game designers. We use game engines
in the architectural and built environment sector
on a wide range of projects, ranging from residential, to
commercial, and even to heritage. Our projects are often
tailored specifically to the needs of the client. And we deliver from
the conception stage all the way to development
of the software, as well as maintenance of the
project after it's completed. Today I am very happy
to share with you our project with Changi Airport
Group, the Changi Airport digital twin. Vouse has been a partner of Changi Airport Group since 2017. And we developed a digital twin of Changi Airport over the past two years together with Mr. Ao Chin Wen and his colleagues from T5 Planning. The digital twin covers around 1,200 hectares. And that is roughly the size
of 3,200 football fields. Our approach to digital twins is
to build a data-integrated model. We stream live operational data into the 3D model. And we use that as the base
to develop applications for different end
users of the airport. The reason that we
chose Unreal Engine is because of its ability to provide real-time graphics while at the same time integrating with backend databases. And this allows us to scale with the size of the project. Throughout the two-year
period, we deployed to different platforms
and applications from the same project file. The base model was built based
on CAD and BIM information. The 3D models are constructed modularly and assembled in Unreal Engine to take advantage of the many optimization tools often used for open-world games. These include instancing as well as the foliage tools.
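For a flavor of what that instancing looks like in practice, here is a small sketch using Unreal's UInstancedStaticMeshComponent, which renders many copies of one mesh in a single draw call. The actor name, mesh, and layout are hypothetical placeholders, and the code assumes it lives in a header named InstancedAssetActor.h.

```cpp
// Sketch: render many copies of a repeated airport asset (e.g., a light pole)
// through one instanced component, keeping draw calls low at city scale.
#include "GameFramework/Actor.h"
#include "Components/InstancedStaticMeshComponent.h"
#include "InstancedAssetActor.generated.h"

UCLASS()
class AInstancedAssetActor : public AActor
{
    GENERATED_BODY()

public:
    AInstancedAssetActor()
    {
        Instances = CreateDefaultSubobject<UInstancedStaticMeshComponent>(TEXT("Instances"));
        RootComponent = Instances;
    }

    // Assign the repeated asset in the editor.
    UPROPERTY(EditAnywhere, Category = "Setup")
    UStaticMesh* SourceMesh = nullptr;

protected:
    virtual void BeginPlay() override
    {
        Super::BeginPlay();
        if (!SourceMesh) return;

        Instances->SetStaticMesh(SourceMesh);

        // Placeholder layout: a row of 500 instances every 10 m (1,000 UU).
        for (int32 i = 0; i < 500; ++i)
        {
            Instances->AddInstance(FTransform(FVector(i * 1000.0f, 0.0f, 0.0f)));
        }
    }

private:
    UPROPERTY()
    UInstancedStaticMeshComponent* Instances = nullptr;
};
```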
This is important because, on a project of this scale, performance is key. Using georeferencing, we aligned the model with real geographic coordinates, which allows for integration with sensors and data. With Blueprints, we built integrations with the Changi backend databases that stream in live data and project it onto the model in real time. To talk a bit more about
the technical details, let me invite the next speaker, Pun Hei, who is our senior developer at Vouse. PUN HEI CHIU: Hi, I'm Pun Hei, a
senior software developer at Vouse. I'll be talking about our application, which uses aircraft transponder data to visualize aircraft position and movement around a digital scale model of Changi Airport in Unreal Engine 4. The application can
display plane position and additional information in
real time, relayed from a server in the airport. It also allows users to play back
historical simulation records transmitted from the server
itself or from records stored in a local database. Going forward, our future
plan is to integrate as much information as
possible into the application to serve as a central hub of
information for monitoring purposes, such as arrival and departure
times at each gate, human traffic counters, temperature and
humidity sensors, and so on. YANJIE SHI: Thank you, Pun Hei. That was an example of deploying
the digital twin on a macro scale. On a more micro scale, we investigated the role of traffic infrastructure from the angle of the actual user by deploying the digital twin onto a driving rig. We showed the level of detail that can be achieved, down to ground-level signage and markings. With this, end users such as
technical staff and operators of the airport who may not be
familiar with architectural drawings can be engaged to test
out the digital twin. This opens up opportunities
with the complex tools that Unreal provides to build
scenario training and design validation applications. This is especially relevant in today's world, where digitization and remote training have become increasingly important. The digital twin is
currently deployed internally in Changi Airport. It is meant to be a live model,
which is consistently updated with new content and features. This project serves as a
foundation for Changi Airport Group to continue exploring
digital transformation in the domain of airport planning,
simulation, and operations. Thank you for listening
to my presentation. I wish you a pleasant evening, and
please enjoy the rest of the event. Thank you. DAVID WEIR-MCCALL:
Thank you, Shi Yanjie. That was a great example of how Unreal Engine can tie together many different data feeds into one easily understood context. The team at Vouse has shown us how owner-operators can use digital twins to manage their complex assets. But what about the public? Could they benefit
from a digital twin? Tim Johnson from
Buildmedia has been working on such a project with the Wellington
City Council in New Zealand. Let's go see where it's at. TIM JOHNSON: Hi, my
name is Tim Johnson. I am the technical director
here at Buildmedia. Unfortunately, due to New Zealand's COVID lockdown and social distancing requirements, we're unable to bring our full presentation. However, the Buildmedia
team is still very keen to share the Wellington digital
twin and how it was created. Buildmedia is a visualization
and experiential studio creating meaningful digital experiences. The team is a mix of
specialists spanning gaming, programming, animation, visualization, GIS, and architecture. We love pushing the
boundaries in everything we do using Unreal Engine. Unreal Engine has allowed
us to explore new industries and capabilities we never would
have imagined only a few years ago. And the Wellington digital
twin is one such project. Buildmedia has visualized many
large-scale infrastructure projects across New Zealand and
Australia, and spent years creating accurate 3D visualizations,
placing these in context to help stakeholders and
the public understand the scope and impact of the project. With the Wellington
digital twin, we wanted to create a dynamic
real time model that presents information in a
non-linear and engaging way, bringing together our
visualization capabilities with sensor, geospatial, building,
infrastructure, and online data from across government and
non-government organizations. The aim is to help make better
multi-agency decisions and understand complex issues like climate change and economic development. And most importantly, as with all
our visualization work, better engage with the communities
and stakeholders. We've been developing the Wellington
model over the last few years, using it to create a number of animations, visual simulations, accurate photomontages, and marketing imagery. With everything we have learned with
UE4, we decided to take the leap and bring the entire
model across into the engine and push what could be possible. VINCENT VULLIEZ: Hi, I'm
Vincent, one of the lead CG artists at Buildmedia. I'll be walking you
through how it was created and how you can get started
on your own digital twin. When building any kind of
digital twin, ask yourself: what are we trying to achieve? This helps you define
the scope of the project and where to direct your efforts. For Wellington City, accuracy and visual appeal were our main focus. Wellington was created using a number
of data sets from multiple agencies. The availability of
good spatial data is one of the key factors for
a successful twin of a city. Work with your local geodetic system so that surveyors, engineers, and architects can all be coordinated to the same reference point. We started with a small
area, and continued to expand the model outward to
keep on top of optimization. As we continued development, we were surprised every day by just how much you can throw at Unreal Engine. Starting with the terrain, we used
publicly available contour data from Land Information New Zealand. We generated height maps and exported
terrain tiles as static meshes from 3DS Max. In the material, a World Position Offset function creates the Earth's curvature.
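The math behind that curvature effect is compact: each vertex is dropped vertically by roughly d^2 / (2R), where d is its horizontal distance from the view origin and R is the Earth's radius. Here is a self-contained sketch of the same calculation the material performs per vertex; the material itself implements this in its World Position Offset graph, not in C++.

```cpp
// Sketch of the Earth-curvature drop applied per vertex: a point d meters
// away horizontally sits roughly d*d / (2*R) meters below the tangent plane.
#include <cstdio>

constexpr double kEarthRadiusM = 6371000.0;

// Vertical offset (meters, negative = down) for a vertex at horizontal
// distance (DxM, DyM) in meters from the viewer.
double CurvatureDropMeters(double DxM, double DyM)
{
    const double DistSq = DxM * DxM + DyM * DyM;
    return -DistSq / (2.0 * kEarthRadiusM);
}

int main()
{
    // A hill 10 km away appears about 7.8 m lower due to curvature.
    std::printf("Drop at 10 km: %.1f m\n", CurvatureDropMeters(10000.0, 0.0));
    return 0;
}
```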
Projects were modeled and animated in 3DS Max and imported through individual Datasmith files, with materials automatically assigned in Unreal Engine through Dataprep. For the buildings, a detailed
photogrammetry model of the CBD was supplied by Wellington
City Council: 18,000 buildings with unique 3D models and textures. Unreal Engine has inbuilt optimization tools that were key to managing draw calls and getting a stable frame rate. Using Blueprints and
shapefiles, we accurately placed trees, street lights, cars,
boats, and even wind turbines. And finally, my favorite
part of the project, we let Unreal Engine's
amazing volumetric cloud and atmospheric system
bring the scene to life. TIM JOHNSON: The Wellington
twin is connected to IoT sensors located
across the city, bringing the Unreal model to life. We capture and record traffic
volume, cyclist movements in and out of the city, pollution
levels, pedestrians, temperature, weather, and more. We access this data using REST APIs, a mixture of JSON, XML, and HTTP requests, depending on the data source. This is very straightforward, using easy-to-find UE4 plugins on the Marketplace.
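For readers who prefer plain C++ over a Marketplace plugin, the engine's built-in Json and JsonUtilities modules can map a REST payload straight onto a USTRUCT. A minimal sketch follows, with a made-up sensor payload standing in for a real city feed; the struct assumes it lives in a header named SensorReading.h.

```cpp
// Sketch: deserialize a (hypothetical) city sensor reading into a struct
// using the engine's built-in Json / JsonUtilities modules.
#include "JsonObjectConverter.h"
#include "SensorReading.generated.h"

USTRUCT()
struct FSensorReading
{
    GENERATED_BODY()

    UPROPERTY() FString SensorId;
    UPROPERTY() float   Value = 0.0f;   // e.g., cyclists per hour, or degrees C
    UPROPERTY() FString Timestamp;
};

void ParseReading()
{
    // Placeholder payload; a real feed's field names will differ per source.
    const FString Payload = TEXT(
        R"({"SensorId":"cycleway-01","Value":42.0,"Timestamp":"2021-06-29T10:00:00Z"})");

    FSensorReading Reading;
    if (FJsonObjectConverter::JsonObjectStringToUStruct(Payload, &Reading, 0, 0))
    {
        UE_LOG(LogTemp, Log, TEXT("%s = %f at %s"),
               *Reading.SensorId, Reading.Value, *Reading.Timestamp);
    }
}
```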
It is still early days connecting to the Wellington digital twin infrastructure, and we're excited to explore this further. The Wellington twin is also used
to visualize a number of projects around the city. These projects have been included in the model for public consultations and stakeholder meetings. From our experience,
the consultation phase is critical for any major project. Having all the
information in one place makes it more efficient
and effective to engage with the industry and
public stakeholders. It can also provide a history
of proposals and feedback. Every model, every decision can
remain within the interface. This then demonstrates to the public
that they are valued and engaged throughout the whole
consultation process. SEAN AUDAIN: Kia ora! My name's Sean Audain,
and I'm a city innovation lead at Wellington City Council,
working on our digital twin project. Here in Wellington, we've been
investing in a digital twin because it helps us understand and
communicate the future of our city and the complexity of what makes it work. What we love about Unreal Engine is
its ability to take disparate data feeds and put them into
a universal experience, which we can use to help make
that future real for people. Our aim is to be an
informed democracy. And so that ability to
universally fuse together that future of the city
is everything to us. TIM JOHNSON: Pixel
Streaming will enable us to share the digital
twin throughout the council and potentially in the
future to the public. Streaming the digital
twin from the cloud means more people are
able to access and utilize the twin for presentations
on any device. Messaging is then consistent
across all departments. We want the Wellington digital
twin to enhance public, agency, and stakeholder collaboration,
encourage open data, and stimulate the
co-designing of Wellington for the benefit of
future generations. Buildmedia is looking forward
to the continued development of the Wellington digital
twin in Unreal Engine, and hope this will be a
great asset to the city. Thank you. BELINDA ERCAN: Thanks,
Tim, and the team at Buildmedia for giving us
a glimpse of how anyone might use a digital twin to either
learn more about their city, or simply make better decisions. DAVID WEIR-MCCALL: And
finally, before we go, we want to make it easier for any
of you to build a digital twin. Earlier on, we heard
Pierre Felix mention some of the efforts we're
making around Unreal Engine 5 to support native methods
of using various protocols. But we also have something
that you can play with now. BELINDA ERCAN: Because
earlier this year, we awarded an Epic MegaGrant to WSP, a global engineering firm, to bridge the gap
between Microsoft Azure's digital twin platform
and Unreal Engine. DAVID WEIR-MCCALL: A team at
WSP's Swedish office, together with Microsoft, has created an Unreal Engine plugin and sample project to help people get
started with digital twins. The resource includes an application
showing a floorplan of a building, a scripted installation of
all the necessary Azure, IoT, and digital twin components, and
even an IoT device simulator. BELINDA ERCAN: And
this sample project shows different ways of visualizing real-time data, from using our Niagara particle system to show airflow and temperature, to utilizing our lighting methods to show when lights are on or off.
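To sketch the pattern (not the WSP sample's actual code, which lives on the Marketplace): a live sensor value can drive a user parameter on a Niagara component and toggle a light, roughly as below. The parameter name and thresholds are illustrative assumptions.

```cpp
// Sketch: push a live sensor value into visual feedback, driving a Niagara
// user parameter for airflow/temperature and a light's on/off state.
// Parameter name is illustrative, not from the WSP sample. Requires the
// Niagara plugin.
#include "NiagaraComponent.h"
#include "Components/PointLightComponent.h"

void ApplySensorValue(UNiagaraComponent* Airflow, UPointLightComponent* RoomLight,
                      float TemperatureC, bool bLightIsOn)
{
    if (Airflow)
    {
        // Assumes a float exposed as "User.Temperature" on the Niagara system.
        Airflow->SetVariableFloat(FName(TEXT("User.Temperature")), TemperatureC);
    }
    if (RoomLight)
    {
        RoomLight->SetVisibility(bLightIsOn);
    }
}
```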
We even use our animation Control Rig to simulate people entering and exiting a room. It will also show you how to connect to Azure's digital twin suite of tools, such as their Time Series Insights, to get historical context. DAVID WEIR-MCCALL: We've
documented all of this so that you'll know exactly how
to leverage the resources to build your own digital twins. And all of this is open source, free, and available on the Unreal Engine Marketplace today. A massive thank you to
WSP for their contribution to the digital twin community. BELINDA ERCAN: And that's us. In the last 90 minutes
we have shown you how you can tell your
stories using Twinmotion, and then create unlimited
worlds in Unreal Engine. We have shown you how our
customers are experimenting with these transformational
technologies, and what they have learned so far. DAVID WEIR-MCCALL:
We've also shown you where these technologies might take
you if you master them and deploy them on your most complex problems. This is a journey, and we're all just
starting to unravel what's possible. We're at the earliest
stages of exploring how these immersive technologies
will benefit people and lead to new efficiencies. BELINDA ERCAN: Now, not
everything will be a success. But we'll learn from the
failures and use them to power a better future, one
where people take a more central role in
the design of their spaces and will have better access to, and insight into, the data driving the world around them. DAVID WEIR-MCCALL: So
what's the next step? Spend a lunch break with
a free trial of Twinmotion and see what stories you can tell. It might just change your
life like it did for Pawel. DAVID WEIR-MCCALL: See you later. BELINDA ERCAN: Bye bye.
DAVID WEIR-MCCALL: Bye, everyone.