Fortnite Trailer Pipeline | Unreal Dev Day Montreal 2017 | Unreal Engine

Captions
[Music] My name is Brian Pohl. I'm at Epic Games' media and entertainment group as a solutions engineer for all of North America. I'm sure you've seen a number of really technical presentations today; for my presentation we're going to switch gears a little bit, go up to the 30,000-foot level, and take a look at the big picture. We're going to be talking about pipelines and workflow.

First, a little background on myself. I've been working in the film industry for about 17 years. Back in 1999 I moved out to Hollywood, California to learn a new software package called Maya, which was supposed to revolutionize the animation industry, and frankly I think it did. While I was out in California I went back to school, found that I was in the right place at the right time, and got a job at Lucasfilm and ILM working on the new Star Wars prequels as a previs artist. And I know what you're thinking: I had nothing to do with Jar Jar. Previs ultimately became my career, and I went on to work on dozens of different movies. It's kind of like being a virtual cinematographer, and it served me really well. But today I'm on a new path, working for a completely different kind of company, a game company, which kind of surprises me, and I'm transferring my film experience over to embracing the Unreal Engine. As a solutions engineer, it's my goal to assist animation and visual effects companies in integrating Unreal Engine into their pipelines.

I want to start off with a quote from my former boss, Mr. Lucas: "All art is dependent on technology because it's a human endeavor, so whether you're using charcoal on a wall or designing a proscenium arch, that's technology." What was he trying to say here? Essentially, art pushes technology and technology pushes art; the two go hand in hand in a symbiotic relationship. And in just the past five years we've seen such a shift in technology that once again we're going to have to restructure our approach to animation and VFX for film production. The next evolution of animation technology will unquestionably rely on being real-time, interactive, immersive, programmable, nonlinear, and viewer-customizable.

Our clients are already starting to take advantage of these capabilities, and we've got a few examples. First is ZAFARI, a new animated episodic kids' show being produced by Digital Dimension right here in Montreal; it will be released in France first in October and then distributed in the US next year by Universal. Next is the Barbie vlogger project, being done by House of Moves in Los Angeles. It's a Mattel property, essentially a real-time Barbie variety show: Barbie and her friends take questions from the audience and answer them in real time, with an actor in a mocap suit streaming mocap data into the engine and broadcasting out live. And finally, some of my previs friends at HALON Entertainment used the engine to create previsualization for War for the Planet of the Apes. The new real-time technology is really helping directors in Hollywood see their films being constructed before they actually have to go shoot them, and with the engine it's even better, because they can get a better sense of lighting and the process is quicker.

This is all great, but today we're going to be talking about Fortnite, and for this presentation we're going to take a snapshot in time and examine how the team at Epic Games constructed this particular trailer.
At Epic we understand that creating animated content is not an easy task, so if we're going to provide really excellent tools for our artists, there's an expression: we have to eat our own dog food. In other words, we need to use Unreal Engine in production and try it out for ourselves. So we decided to create this trailer in order to learn all the benefits and challenges of creating an animated short in real time. As a result, lots of progress was made, and some of those changes and advancements in technology are making it into the engine in 4.16, 4.17, 4.18 and onward, but there's still a lot more to do. After my presentation there's another talk about the Unreal roadmap and what's coming in the future, and it should be interesting. For right now, let's get started and take a look at the trailer being run in the engine in real time.

[Trailer plays] "All right, people, let's search this dump and get out of here ASAP." "What's wrong with the Durrr Burger?" "Everything. Garbage, rats, severe structural damage." "Some people just can't appreciate fine dining." [Music] "Did you hear that?" "You've got nothing to worry about. Trust me, I've got the heightened senses of a ninja." "Oh, I am so sorry!" "Hey, hey, it's okay, we're gonna get you out of here." "Wow, thank you. We thought you were more of those things." "Monsters are coming. Looks like we're staying put, knights. All right, you two, sit with me." "Yes, ma'am." "Let's get to work." [Music] "I guess meat is murder." "You're killing me, man." "Okay, that's up to the referee." "Back from the part where we almost died. It was amazing." "Hey, don't mention it, it's all in a knight's work." "All right, guys, who's up for pizza?" [Music]

I can't begin to tell you how many times I've heard that song. All right, let's talk about the goals that were set for this particular trailer. In 2017 they started production on a trailer that was to be three minutes in length; it was essentially there to drive excitement and retention for the video game Fortnite. It's comprised of six sequences and a hundred and thirty shots, and the team was told it must be performant in real time at 24 frames per second. That's a maximum render time of 1000/24, or roughly 41.67 milliseconds, per frame, and for those of us coming from film, you should remember that number, because it's going to become the new standard, the new 24 frames per second.
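That budget is just arithmetic, but it's worth internalizing. A quick check in plain Python:

```python
# Per-frame render budget in milliseconds for a given frame rate.
def frame_budget_ms(fps):
    return 1000.0 / fps

for fps in (24, 30, 60):
    print(f"{fps} fps -> {frame_budget_ms(fps):.2f} ms per frame")

# 24 fps -> 41.67 ms per frame  (the trailer's hard performance target)
# 30 fps -> 33.33 ms per frame
# 60 fps -> 16.67 ms per frame
```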
Scheduling was reasonably aggressive: production started around January of 2017 and finished in about 20 weeks, not including some time in late 2016 for development and pre-production work. Throughout the production calendar the team size ranged anywhere from five people all the way up to about 20, depending upon the workload. All of the character performance animation was outsourced to a company called Steamroller Studios in Florida, and I'm not exactly certain of their animation team size. As for the applications and formats that were utilized: Unreal, obviously, as the game engine; Maya for character animation; 3ds Max for modeling; MotionBuilder for motion capture; and Premiere Pro for offline and final editing. Then there were some additional secondary programs: Substance Designer for materials, Blender for destruction simulations, Shotgun for production tracking, Perforce for version control, Alembic and FBX as file formats, and a Vicon system for motion capture.

So let's get underway and start familiarizing ourselves with the five phases of production. Most people in creative industries, whether film, animation, or games, are probably already familiar with these: development, pre-production, production, post-production, and distribution. I list this as a traditional linear approach because these techniques were developed during the Industrial Revolution; manufacturing-industry concepts were applied to film production, and as a result each of these five phases produces a product that has to be completed before it can be passed to the next stage. It's very assembly-line driven. This fit the technology of the time beautifully, and it allowed Hollywood to churn out movies at an incredible rate during the Golden Age.

There are specific pros and cons to this linear approach. The pros: it's very formulaic, so it's easy to establish a budgetary model for films; it's designed to spread work incrementally across different departments and work groups; and it was basically the resource driving the animation and film industry during the Golden Age. The cons: it's very linear; it's highly departmentalized, which is also a bad thing; it has an archaic way of handling version control of all the different products being produced; it has a very slow revision process; it's difficult to schedule; and it's just generally slower. Ultimately, computerization and the advent of the visual effects industry would change all of this.

So today we have a more modern outlook on the five phases of production. Nonlinear editing processes, the computer graphics field, and films like Jurassic Park essentially helped usher in a more interdependent five phases of production. Very quickly we realized that an action taken in one phase of production could dramatically alter the preparations and execution of another phase, and that was both really good and really bad. It was really good because the entire production team started thinking of the film as a holistic entity rather than as an assembly line, but it was bad because directors started to catch on to this CG technology and essentially started asking for whatever the hell they wanted, and the producers of the time were pretty much happy to go right along with it. I'm sure all of you have heard the saying "fix it in post"; this was practically Hollywood's mantra. It wasn't until about the late '90s that previsualization came into use to help improve the blueprinting process during pre-production, reducing costly mistakes and controlling the expectations of the director. This model is still effective today, but the artistic demands we're seeing on technology keep increasing, and something has to be done about the tighter schedules and smaller budgets that a lot of visual effects houses, animation studios, and even game studios are having to face.

So this brings me to methodologies. I want to do a comparison between the current model, what I would call the traditional CG animation approach,
versus a next-generation version of production using real-time technologies and game engines. There are a few different categories here: primary creative tools, developmental model, data and production organization, version control, workflow, and output target. Let's take a look at the two.

Primary creative tool: in the current model it's the DCC, the digital content creation package like Maya or Max that's commonly found in animation studios today. In a next-gen studio it would be focused around real-time tools like a game engine.

Developmental model: the current model is a push-based system; each department in an animation studio has to produce a specific product that has to be finished, or close to finished, before it gets passed down to the next department so that department can actually start work. In a next-gen system, artists fetch data as needed, make individual changes, produce a changelist, push it back up into the system, and it disseminates across the entire workflow.

Data and production organization: the current model is very decentralized; all of your data is stored on various servers, in any number of different places, and the production organization is very departmentalized. A next-gen studio has highly centralized data organization; typically there's a depot where everything is transferred to and stored, and the production organization is designed to be more cooperative.

Version control: in the current model, if you're a small studio it's typically manual version control with symbolic linking, and if you're a large studio and you're lucky, you've got scripts and people who can build better ways to handle version control. A next-gen system uses atomic, ACID-style transactions for database handling, which makes it a much more durable system and less prone to problems.

Workflow: the current model is less parallel; next-gen is more parallel.

And finally, the output target. In the current model, essentially all of the end product works toward producing images and layers that can be assembled in a compositing process and sent off to editorial; this is very common in both animation and visual effects. The next-gen model is more about what Epic embraces: a final-pixel output philosophy. Generally, we want to produce as much inside the engine as possible and get those final pixels out of the engine directly to your audience, whether that's up on the silver screen, web-broadcast, or broadcast on air.
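To make that changelist model concrete, here is a minimal sketch of the fetch-change-submit loop, driving the standard p4 command-line client from Python. The depot paths, asset name, and changelist description are hypothetical:

```python
import subprocess

def p4(*args):
    """Run a Perforce command and return its output."""
    return subprocess.run(["p4", *args], check=True,
                          capture_output=True, text=True).stdout

# Fetch the latest data from the central depot.
p4("sync", "//depot/FortniteTrailer/...")

# Check out a file, make a local change, then submit it as one
# atomic changelist that every other artist can immediately sync.
p4("edit", "//depot/FortniteTrailer/Shots/DB_0010.uasset")
# ... artist edits the asset locally ...
p4("submit", "-d", "DB_0010: adjusted camera framing")
```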
Knowing this, let's take a look at a typical animation studio pipeline. This is a modern, nonlinear pipeline with some very distinct stages. There's a story department, which will sometimes work anywhere from 6 months to 12 months to a year and a half producing story. They produce storyboards that are fed into editorial, which produces a story reel, and that story reel is fed into the rough layout department. Meanwhile, running somewhat in parallel, you have an art department and a production designer feeding concept designs and sketches into the modeling, rigging, and surfacing departments. This all happens as various concepts and designs are approved, and the end product of the asset construction phase, along with the story reel, gets fed into the shot-blocking process of rough layout and final layout.

Rough layout basically creates the entire sequence, comprised of multiple shots, using the story reel as a reference point, and as shots are completed in rough blocking, the storyboard shots are replaced with 3D blocking stand-ins. After the sequence is completed, with all of the individual shots in rough layout complete, it goes over to animation. The animation supervisor's team starts producing individual performances for each shot of the sequence, and once completed it's eventually fed back into final layout, which massages in the various cameras and so forth. Then we move into a shot enhancement phase, where we have stereo, crowds, character effects, and potentially even a matte department, and ultimately the result gets pushed into the lighting and rendering pipelines, where the scene is lit, rendered, and its images are spit out to go into finalization, editing, and so forth. That's generally how it's done today.

Before I get to the Fortnite animation pipeline, there are some things you should understand: the Fortnite project had some specific advantages. Number one, and the biggest of course, the Fortnite video game was being produced, right? So all of the game assets within that game IP could be pulled for use in constructing the Fortnite trailer. We had pre-built models, we had game rigs, we had some game animations, we even had constructed scene assemblies for the different portions of the game, and all of this was used to feed a previsualization process for Fortnite. In a traditional CG process they would have to construct all of this, and that could be months' worth of work, so we had a huge head start.

So let's take a look at the Fortnite pipeline. It's quite similar, and yet there are some differences. We didn't have a dedicated story department, but we did create storyboards, which were fed into editorial to produce a story reel. Sounds familiar, right? From there, our rough layout essentially took those game assets, combined them with motion capture, and used a technique that in previs we call first-unit previs: the director working with live-action actors inside a mocap volume with a camera, so that he can interactively, intuitively, and rather fluidly produce shots and block out the animation with real people. It's a tremendous difference in approach: a live-action approach to animation, as opposed to the more traditional route where you're keyframing everything and using a rough layout unit to produce your previs, so to speak.

All of the data constructed during this rough layout process was routed into Sequencer, where we did a rough, loose pass of creating level sequences. We kept it very loose and free so that we could create individual sequences and shots. This would eventually become a two-part process, and I'll get to it right now: after everything was completed and our additional one-off shots were done, there was a refining stage of re-indexing and conforming all of the level sequences that were produced inside Sequencer. We'll get through all of this in more detail shortly, but during this time shots are officially created, re-indexed, and sub-levels are defined.
All of this is done inside Sequencer; essentially, Sequencer becomes the hub of production, whereas in the traditional system editorial and layout are the hub of production. Step four is essentially an input/output process: all of the data created during the rough layout phase had to go through an export process and be sent to an asset enhancement phase, because the game models may not have been sufficient for a cinematic trailer, so they had to be appraised and improved. There were additional rigging steps to improve the rigs, textures were resurfaced where necessary, and so on, and it would all be fed into the Maya pipeline for animation, where the keyframed performance work would be done. All of this would then be routed back out of Maya and into Sequencer again, where we would start the process of final layout, lighting, effects, simulations, post effects, and output for rendering. Each one of these steps we'll go into in a little more detail in just a second.

So what are the practical differences between these two approaches? Well, the current animation pipeline has a very front-loaded story process; like I said, you have 18 months to figure out your story. In a next-gen pipeline it's a very integrated story process; a lot of people found that on Fortnite the director had considerably more control to make suggestions throughout the entire process. In the current animation pipeline, shot construction is very deliberate: it's defined early and up front in the storyboarding process, which is fed into rough layout, and by that time you know what your shots are, period. The next-gen pipeline for Fortnite was much more like a live-action shot construction process, like I was mentioning; the director could actually make decisions on the day. Editorial and layout are the production hub of the current system; in the next-gen system it's all about Sequencer. And in the current system, after things are done in layout, it becomes a very team-governed system: the animation supervisors, the head of layout, all of these extra creatives essentially take over, and the director starts to lose his creative influence, while a next-gen pipeline is much more director-centric, and he's capable of making changes throughout the process. We'll talk about revision systems in a couple more slides.

Story development: now we're getting into the specifics. There were storyboards drawn up for Fortnite; they were used for initial ideas, and there were multiple concepts considered. All the storyboards were essentially scanned and brought into Premiere Pro, where a temp cut, a story reel, was made that was used as reference for the mocap shoot. This is not too different from the traditional approach of creating an animatic or boardomatic.

Next, we're into the mocap and previs session that we're calling rough layout, though not in the traditional sense. Essentially, a Vicon and MotionBuilder solution was being used. The action was examined for logical breaks, and for the progression and location of where the animation was going to occur within the scene assembly. The idea was this: let's record as much motion capture as we can, in takes as large and as long as possible.
Then we would review it, import it into the engine, and validate it to make sure everything was playing back fine. During this time we were starting to block out what we thought was best for the rough level sequence definition, again very loose and free at this particular moment, and all of these takes and shots were assembled into one large, flat master level sequence. From there, if there were additional changes to the script or voiceovers, there would be one-off sessions with the director to capture those particular shots, and those were also added into this master level sequence. Once that was complete, the sequence was divided up into what would be defined as shots, and all the excess footage in these duplicated level sequences was deleted, leaving just the individual shot. Cameras were added and ordered within the duplicated level sequences to help with framing and obtaining your screen direction, and by the time the individual shots were completed in this rough phase, all the unnecessary actors and set pieces were removed from the camera's field of view to keep things simple.

Once this process was complete, it required a period of conforming all of this inside Sequencer. It's at this stage that the loose definitions we had made had to get serious. All the rough level sequences were organized into a more traditional show/sequence/beat/shot structure commonly found in film. The level sequences were completely rebuilt to support this new configuration, official shot numbers were assigned and re-indexed within Sequencer, and everything was also put into Shotgun for production tracking. It was at this point that the Fortnite trailer was branched off from Fortnite main; we had still been attached to the actual game, and we knew it was time to create a separate fork in the road so that we wouldn't wind up making any destructive changes to what was happening on the game front.

With that, let me quickly show you. Here we are inside Sequencer, and these are the six individual sequences that were created: the opening sequence, the DB sequence, and on through. When you enter each one of these sequences, you can see there are nested level sequences that make up the individual shots, and if you go inside each individual shot, you can see a number of folders containing the specific characters, scene assemblies, cameras, and lighting for that single shot.
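For illustration, here is a sketch of how that show/sequence/shot nesting could be built with the editor Python API that shipped in later engine versions; the trailer team assembled these by hand (editor Python wasn't available yet), and the sequence names, shot list, and frame lengths here are hypothetical:

```python
import unreal

tools = unreal.AssetToolsHelpers.get_asset_tools()

def make_sequence(name, path="/Game/Cinematics"):
    """Create a LevelSequence asset at the given content path."""
    return tools.create_asset(name, path, unreal.LevelSequence,
                              unreal.LevelSequenceFactoryNew())

# Master sequence for one of the six sequences (e.g. the DB sequence).
master = make_sequence("SEQ_DB_Master")
shot_track = master.add_master_track(unreal.MovieSceneCinematicShotTrack)

# One nested level sequence per shot, laid end to end on the shot track.
frame = 0
for shot_name, length in [("DB_0010", 120), ("DB_0020", 96)]:
    shot_seq = make_sequence(shot_name)
    section = shot_track.add_section()
    section.set_sequence(shot_seq)
    section.set_range(frame, frame + length)
    frame += length
```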
All right, so we have all this data inside of the engine, and now we have to start thinking about the animation process. We had to get data out of the engine and into the Maya pipeline, because our animators were going to be working entirely inside Maya. The various scene assemblies and environments, as well as the game-asset characters and their motion capture, were all exported out, unfortunately manually, because at the time we did not have any tools designed to do this process for us, and the scene assemblies were brought into Maya so that the animators could have a reference for their performances. Restaging shots with additional rough layout usually was not required unless it was absolutely necessary; our goal was to take what was blocked out in our first-unit previs, our rough layout stage, and go from there.

Just a word quickly about version control. Epic of course uses Perforce, with a customized wrapper known as UnrealGameSync. UnrealGameSync basically helps us track changelists, and it timestamps all of the specific information and detail about each individual change being made. This is how the process was utilized on the trailer.

I also want to talk briefly about batch I/O. As I said, this process at this time was highly manual; it's all we had. We could do scripted batching and exporting out of Maya with Python, which was easiest, but everything in Unreal Engine we had to do manually, and this was a major pain point. I want to make note that Python is coming to Unreal and is expected, at least the last I was told, in 4.19. For visual effects houses and animation studios, this is a big, major yes.
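For the Maya side of that batch I/O, here is a minimal sketch of the kind of scripted FBX batch export the talk describes, using standard maya.cmds calls; the rig name, shot list, and output paths are hypothetical:

```python
import maya.cmds as cmds

cmds.loadPlugin("fbxmaya", quiet=True)

def export_fbx(root_node, out_path):
    """Select a rig hierarchy and export the selection as FBX."""
    cmds.select(root_node, hierarchy=True, replace=True)
    cmds.file(out_path, force=True, options="v=0",
              type="FBX export", exportSelected=True)

# Batch over a shot list instead of exporting each body by hand.
for shot in ["DB_0010", "DB_0020", "DB_0030"]:
    export_fbx("soldier_body_grp",
               "P:/FortniteTrailer/export/%s_soldier.fbx" % shot)
```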
The next thing I want to talk about is sub-level organization. Inside Unreal, the CG supervisor and the director wanted to make sure that everybody could work on this show in parallel, so they created a series of sub-levels to organize the labor by specific discipline, and ultimately to control scene assembly visibility, character visibility, and a number of other factors. So there are some specific sub-levels that were created. Of course there's a persistent level, which is the virtual scene container for all the sub-levels, and inside of that there are environment sub-levels for the various scene assemblies (I'll show you some of those in just a second); a character sub-level where all the FBX bodies for the characters reside (we'll get to that); cine sub-levels for the sequences for each beat; lighting sub-levels; blueprint sub-levels; and a post-processing sub-level for doing post-processing effects after the fact.

Just to show you here: the sub-levels were designed to have specific visibility cues that could be overridden, set up for each individual sequence and potentially changed inside each individual shot. In this particular case you can see some of the properties, and which levels are visible and which are not, depending upon where you are inside Sequencer. Here we have specific visible sub-levels (DB cine, DB lighting, lighting exterior, etc.) specific to the DB sequence. This process was essentially a way to help organize, control visibility, and allow a more specific means of working in parallel.

Okay. While we were preparing for the animators to have a reference for blocking, there was also a stage happening in parallel in which we were enhancing all of the game assets. The Fortnite character game assets were good, and some of them were usable within the trailer, but most were not, and the idea was that they had to be enhanced. These game-asset characters were brought into Max, and all of the heads and bodies were separated. The models and textures were then augmented with additional resolution, and extra topology where needed. Any baked lighting was stripped out of the assets' texture maps, because, as you heard in the last presentation, Fortnite used a dynamic lighting setup, so there was going to be no baked lighting (we'll explain why). And all the facial geometry implemented a universalized topology so that we could exchange data between characters and have shared expressions and so forth.

Character rigging: all the game rigs, too, had to be enhanced. Characters were re-rigged inside Maya with the ART (Animation and Rigging Toolkit) 2.0 tools. Additional leaf joints were added for anim dynamics (a good way to get secondary animation for free), and the entire facial rig was rebuilt with new controllers and additional joints to improve the facial appearance.

Let's talk briefly about this whole FBX and Alembic cache workflow. The body of each individual character would be FBX, and each body for each hero character (our four primary characters, plus the mother and the child) averaged about 55,000 polygons, or 110,000 triangles. Each Alembic head averaged 35,000 polygons, or 70,000 triangles, again with a universalized topology, so each character would add up to approximately 185,000 triangles per hero character. So if you look at this, you have the individual FBX body and the head; it's kind of funny, because in a number of the scenes you'd see a lot of people walking around without heads. In the breakdown for each individual shot, the bodies of the characters would be referenced into the shot, while the heads, the Alembic caches, were spawned on a per-shot basis.

All right, let's take a look at the original game rig. Here's our lovely soldier: this is the 26-facial-joint game rig. It looks pretty good, and you get some decent expressions, but the face and its structure look harder and sharper, and there's a specific limitation with FBX's implementation in Unreal: you can only have eight influences per vertex using FBX in Unreal, which gave us limited deformation capability. The improved cinematic facial rig and improved topology on the faces upped the joint count to 201 for the facial rig, and this rig was a combination of joints, blend shapes, and lattices, basically taking advantage of all of the animation and deformation capabilities inside Maya. Ultimately, this deforming mesh removes that whole influence limitation of FBX. The Alembic cache for each head, for each shot, would be exported, converted to GPU morphs inside Unreal Engine, and blended and streamed out of memory in real time. Here's an example of what that looks like: you can see it's much more fluid, much more expressive; the face looks softer.

All performance animation was outsourced to Steamroller Studios in Eustis, Florida. Steamroller utilized their own asset and production management pipeline to produce and track all the data. As Steamroller completed the animation for each shot, it would be uploaded to Box, where it was downloaded by the Epic animation team, tweaked where necessary, and then integrated into Unreal. All Maya files at Epic were then tracked through Perforce to ensure version control. This was tricky, because you had an FBX body, you had an Alembic head, and you had the various props the characters were carrying. All facial performance was keyframed and exported as an Alembic cache; body performances, again utilizing a combination of game assets, mocap, and keyframing, were exported as FBX; prop animation was keyframed and exported as FBX; and all destruction simulations were actually done in Blender and exported as Alembic caches.
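Here is a sketch of what a per-shot Alembic head-cache export out of Maya might look like, using Maya's standard AbcExport plugin command; the frame range, node name, and output path are hypothetical:

```python
import maya.cmds as cmds

cmds.loadPlugin("AbcExport", quiet=True)

def export_head_cache(head_root, start, end, out_path):
    """Export a facial mesh hierarchy as a per-shot Alembic cache."""
    job = ("-frameRange %d %d -uvWrite -worldSpace "
           "-dataFormat ogawa -root %s -file %s"
           % (start, end, head_root, out_path))
    cmds.AbcExport(j=job)

export_head_cache("|soldier_head_grp", 101, 221,
                  "P:/FortniteTrailer/abc/DB_0010_soldier_head.abc")
```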
Material matching for Alembic, just to touch on that: the materials needed to match between Maya and what would be in Unreal, and you would assign your textures and materials to the actual faces of the geometry, not just the overall mesh. All the characters inside Maya were primarily modeled in quads, so each mesh had to be triangulated and exported as an Alembic cache, where the ABC file was then imported as a skeletal mesh inside Unreal and, again, converted to PCA-compressed morph targets. And I was told to make sure to tell you: "use GPU for morph targets" needs to be turned on when you're importing this.

I want to note that there's a YouTube stream, just released on the 21st, by Tim Hobson and Sam Deiter, which gives you an excellent step-by-step overview of the Alembic pipeline. It's a great demo, so look for that online and you can learn more about this process.
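As a small illustration of that prep step, here is one way in Maya to triangulate an export copy of a quad head mesh and list its shading groups for material matching; the node names are hypothetical:

```python
import maya.cmds as cmds

# Triangulate a duplicate of the quad mesh so the Alembic cache matches
# the triangulated mesh Unreal will build its morph targets against.
head = "soldier_head_geo"
tri_copy = cmds.duplicate(head, name=head + "_export")[0]
cmds.polyTriangulate(tri_copy)

# Shading groups assigned to the mesh (per-face assignments included),
# so the same materials can be matched up on the Unreal side.
shading_groups = cmds.listSets(object=tri_copy, type=1) or []
print("materials to match in Unreal:", shading_groups)
```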
All right, out of animation we're now back in the engine, doing final layout and scene polishing. There usually was no need for additional restaging of shots unless it was absolutely necessary, and if it was, animation changes could be done inside Maya and brought back into the engine. All camera animation was constructed inside Unreal Engine, and as I've already mentioned, if character animation performances needed to change, they would return to Maya for the changes and be brought back into Unreal.

Briefly on lighting: there was a system of light rigs used for some sequences, and other sequences used spawnable lights. Generally the team found that light rigs were the preferred methodology, though spawnable lights worked as well, and both were performant. Lighting priorities were utilized to alter the light plan on an individual-shot basis: you could have a light plan for the overall sequence, and then when you got into the individual shots, you could override that plan and make changes specifically for the shot. The choice was dynamic lighting over baked lighting, mainly for the benefit of the director. When you're trying to exercise creative control and make things look good, baked lighting, though it probably gave better performance when optimized, was just taking too long for the changes the director and cinematographer may have wanted, so they chose dynamic lighting instead. Distance field ambient occlusion was connected to the various base materials of objects in the scene, providing a soft shadowing effect for anything being hit by non-shadow-producing lights, and they also utilized PCSS soft shadows on dynamic lights and spots to give a nice soft shadow appearance with the lowest overhead possible.

Effects and post-processing: I want to point out that if you did not get the chance to see the SIGGRAPH Live presentation online, please check it out, because Ryan Brucks gave a much more detailed description of this entire process. Essentially, real-time Navier-Stokes fluid simulations were used for all the 3D volumetric storm clouds and fog, and I'm just going to leave it at that. I'd highly suggest you take a look at the YouTube presentation; it goes over all of the processes of creating the fog and the deaths of the husks, how they dissolve once they're struck by bullets or whatever.

I also mentioned that there was a post-processing sub-level; it lets you use the tone mapper capabilities inside Unreal, and the team used this to do color correction within the sequences and shots as necessary. This is kind of interesting, because this whole process lets you control, with the tone mapper, the entire look and feel of your final outputted pixels. The team went and created an interesting little extra in-engine: a variation of the Fortnite trailer with a film noir look. We go full black-and-white, add a little vignette, do some work on the audio, and, even cooler, put in some film grain. [The noir version of the trailer plays.] I want to point out that using post-processing effects, you can make immediate changes to your project, especially if you're going live, to customize your final output on an as-needed basis, which I think is pretty cool.

I know I'm running out of time, so: final output. All the shots were rendered out of Sequencer as image sequences, and those image sequences were brought into Premiere Pro for final editing. The conforming process between editorial and Sequencer is still something that leaves much to be desired. Unreal does have support for EDLs, but this was definitely a manual conforming process, making sure there was parity between what was being seen in editorial and what was currently inside Sequencer. There are a lot of discussions right now, and part of my interaction with animation and visual effects houses is to improve this process.

Okay, that basically wraps up the pipeline, and I guess if we have questions we can go from there.
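For reference, image-sequence renders like these can be kicked off from UE4's command line via automated level sequence capture. A hedged sketch, wrapped in Python (the talk doesn't say exactly how the team invoked their renders, and the project paths and sequence names here are hypothetical):

```python
import subprocess

# Render a level sequence to a 24 fps PNG image sequence using UE4's
# automated level sequence capture from the command line.
subprocess.run([
    r"C:\UE4\Engine\Binaries\Win64\UE4Editor.exe",
    r"P:\FortniteTrailer\FortniteTrailer.uproject",
    "/Game/Maps/DinerMap",
    "-game",
    "-MovieSceneCaptureType=/Script/MovieSceneCapture.AutomatedLevelSequenceCapture",
    "-LevelSequence=/Game/Cinematics/SEQ_DB_Master",
    "-MovieFormat=PNG", "-MovieFrameRate=24",
    "-MovieFolder=P:/FortniteTrailer/renders/SEQ_DB",
    "-ResX=1920", "-ResY=1080", "-NoLoadingScreen",
], check=True)
```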
Q: To what extent was sound design implemented in the production process on the trailer, using Sequencer?

A: It's my understanding that, as in a traditional process, there was a recording process for the characters' voiceovers. I believe most of the sound design was handled through a more traditional editorial process rather than within Sequencer, but the tracks were recorded and inserted into Sequencer to help with performance blocking and timing purposes. I'd have to get you more information on that, but good question, thanks.

Q: I have another audio question: when will the new audio engine actually be out and usable, in Sequencer or more in-game?

A: I can't give you an answer for that one; I'm unaware. I will check with my tech guys, though, and see if I can get you an answer after the show.

Q: You mentioned the look that was added. Was Sequencer used to completely control the color grading process of the entire trailer, or was there a more classical grading pass, for example?

A: The desire was to keep everything within the engine and utilize the tone mapper capabilities in the engine to make modifications and changes to color on a shot-by-shot basis. So it was purely the tone mapper, used systematically on a shot-by-shot basis. That's my understanding.

Q: That was a great talk, very inspiring. I founded a studio a couple of years ago on the idea that the future of animation was going to be real-time, so it's super great to see this process reflected in the adjustment of the pipeline. I know this is Fortnite, but have you thought about, or done any experiments in Unreal with, ARKit or ARCore? How do you see this pipeline, or the idea of this pipeline, changing with augmented reality and virtual reality?

A: Definitely. As part of the Enterprise team, it's our goal to handle all non-game interactions with our clients, so ARKit, ARCore, augmented reality: a huge, big thing. I was just looking at some today in one of our meetings, in which I was able to look at different pieces of furniture inside an iPad, and you've seen some of the other examples. For the future, the idea of being able to utilize augmented reality as a storytelling device is intriguing, it's different, and I would say it's what will be expected within the next three to five years, especially as the technology continues to get better and we get decent glasses that make this lightweight, portable, and simpler to use. But yes, I would say you will definitely see the story process embrace that medium.

Q: If I understand correctly, you had a process where you exported everything that was in Sequencer out to editorial and did editorial refinement there. I'm not entirely sure I understand why that was required. And the corollary question: how does Sequencer deal with actually generating media that is suitable for an editorial package, in other words fully unique IDs, timestamped correctly with timecode, frame rate management, all of that?

A: All of that is definitely something that has to be addressed. At this particular stage it was all about simply outputting image sequence files out of Sequencer, out of Unreal, and importing them into Premiere Pro. The whole relationship of conforming between Premiere Pro and Sequencer was, at this stage, manual at best. That's definitely something I've heard from a number of different clients, requesting that kind of functionality, to have a level of parity between Sequencer and an external editorial package. Trust me, that's something we know about and are planning to address. To answer your other question, the idea of still allowing an actual film editor to have his hands and his eyeballs on the content is still of great value. Until we get editors who are fully proficient in using Unreal and Sequencer for editing, that may be a little bit difficult, so the whole idea here was to provide the footage to editorial in order to have them work their magic, so to speak.

Q: So it's more a question of skills as opposed to a question of functionality?

A: Well, it's a bit of both, actually. You've got to have editors who are familiar with this kind of technology and able to put this all together, plus we're clearly missing some key toolsets to allow parity between the two.

Q: Thank you.
A: To add one point on that: it would be really great, even in the earlier stages when you're doing storyboarding and creating a story reel that will ultimately be brought in and followed as a guideline, much in the same way as a traditional pipeline, to have a link where, if the editor makes changes even as early as the story reel, it would update Sequencer, and likewise, if you made changes in Sequencer, it would inform editorial. That's hugely crucial.
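To make that conform problem concrete, here is a tiny sketch of reading cut points from a CMX3600-style EDL, the kind of file an editorial package and the engine's EDL support would exchange; the file name is hypothetical, and only straight cuts on the video track are handled:

```python
import re

# Match a CMX3600 event line: number, reel, V track, Cut, and the
# four timecodes (source in/out, record in/out).
EVENT = re.compile(
    r"^(\d+)\s+(\S+)\s+V\s+C\s+"
    r"(\d{2}:\d{2}:\d{2}:\d{2})\s+(\d{2}:\d{2}:\d{2}:\d{2})\s+"
    r"(\d{2}:\d{2}:\d{2}:\d{2})\s+(\d{2}:\d{2}:\d{2}:\d{2})")

with open("SEQ_DB_cut.edl") as edl:
    for line in edl:
        m = EVENT.match(line.strip())
        if m:
            num, reel, src_in, src_out, rec_in, rec_out = m.groups()
            print(f"event {num}: {reel} {src_in}-{src_out} "
                  f"-> timeline {rec_in}-{rec_out}")
```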
Info
Channel: Unreal Engine
Views: 26,222
Rating: 4.98 out of 5
Keywords: Unreal Engine, Epic Games, UE4, Unreal, Game Engine, Game Dev, Game Development, Unreal Dev Day Montreal 2017
Id: LS0VsMMTaQA
Length: 59min 30sec (3570 seconds)
Published: Mon Oct 09 2017