Using OBS and NDI for External vMix Processing

Captions
Hello everyone, welcome back to Streaming Alchemy. I'm John Mahoney, and on today's show we're going to look at how you can use OBS as an external video processor for things you're doing inside your vMix productions. Before we get to that, I'd like to invite everybody: if you have any questions or comments, please put them in the comment section below and we'll try to get to them here on the show. And if you'd like to join us and talk to us live here in the studio, you can give us a call through LiveToAir, the calling system we have here, and somebody in the studio will be happy to get you on.

So let's get started. When we thought about this, we realized that while vMix is incredibly powerful, even with scripting there are some things you just can't do inside vMix right now. One of the advantages we have today is that we can have multiple components in our production workflow, in this case OBS and vMix, and we can connect those components using NDI. That's exactly what we're doing here: we have a vMix session that we'll use to produce and switch our theoretical production, and we're also using OBS to do some very specific processing with something OBS has that is incredibly powerful: plug-ins. Not audio plug-ins, which vMix does have via VST, but generic video processing plug-ins that let us do some really incredible things inside OBS.

The one we're using for today's example is called StreamFX, and it's an incredibly powerful plug-in on its own. It does 3D video transformations; it lets you apply LUTs (color lookup tables), so you can do sophisticated color toning and tinting that would be outside what you'd normally get with just color correction in vMix; and it lets you do blurs, and not just blurs but blurs with masks. It's very powerful for a lot of things, and we'll dive into that. The first example we want to show uses OBS to create a virtual set for vMix, basically using different layers with different blur levels applied and compositing it all together.

But before I jump into that, let me check the comments. Rudy has joined us; Rudy, always good to see you. Samuel is joining us; Samuel, thank you for taking the time. We have KXO Radio, so yes, let's do this, thank you. And we have Peter from Berlin, who says he hopes to be at NAB 2022.
We're rooting for NAB 2022 as well. We have Terrell giving us a hoot hoot from Dallas. We have Gee telling us the audio is low, so we're working on it; I apologize, I know we've had some audio issues over the last few shows. Terrell says he thought it was his hearing aid, so no, Terrell, you're safe. We have Thomas from Leverkusen in Germany (hopefully I said that right); Thomas, thank you for joining us. We have Tom from Kansas City; Tom, always great to see you here. And we have Terry; Terry, thank you for joining us. I appreciate everybody making the time.

So let's jump in, and a good place to start is the final output. I'm going to cut to the output from vMix and we'll take a look at what we have. There are basically four layers here: a background image layer; a layer with the television screen, the frame with the monitor mounted on top; a layer with the video image that's inside the screen; and a layer with the talent who's talking and pointing back at the screen. Those are the four layers, but let's talk more specifically about how we did this.

Let's start in OBS. In OBS we have just these three background elements: the background, which is right here; the monitor content; and the monitor in front. If I turn off the monitor content, you'll see that the screen is just cut out, and we layer the content in behind it, transformed. The content itself is actually coming from vMix, and we'll talk about that in a minute. These images were basically stock footage we took off Storyblocks, and we used StreamFX to add different levels of blur to create a sense of depth inside the composite.

Start with the background. If I go in and look at the filters (these plug-ins are applied as video filters), we added a filter called Background Blur, which is part of the StreamFX library, and we can then pick what type of blur we want. We're using a Gaussian blur with subtype zero, and there are other types, such as box and linear variants, that give slightly different results, plus different ways they can be applied, like area or directional. You also get to set the size: as I move the size of the blur it goes from one, which is basically not blurred, up to something totally out of focus if you wanted a very deep depth-of-field look. Here we had it set at around eight, which we thought gave a nice intermediate look.
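As a side note for anyone who wants to script this kind of tweak: a setting like the blur size can also be driven over obs-websocket rather than by hand. The sketch below is not something shown on air; it assumes the obsws-python client package, a source named "Background" carrying a StreamFX filter named "Background Blur", and it reads the filter's current settings back first because the exact StreamFX settings key for the blur size is an assumption here, not something confirmed in the show.

```python
# Minimal sketch: adjust a StreamFX blur filter from a script via obs-websocket v5.
# Assumptions: the obsws-python package, OBS WebSocket enabled on port 4455, and a
# source "Background" carrying a filter "Background Blur". Method and attribute
# names mirror the obs-websocket v5 requests; check the client's docs if they differ.
import obsws_python as obs

client = obs.ReqClient(host="localhost", port=4455, password="your-password")

# Inspect the filter first to see which settings keys StreamFX actually exposes.
current = client.get_source_filter("Background", "Background Blur")
print(current.filter_settings)

# "size" is a hypothetical key; replace it with the real key printed above.
client.set_source_filter_settings(
    "Background", "Background Blur", {"size": 8.0}, True
)
```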
One area we didn't use in this example, but one of the ways this could be incredibly powerful, is that the filter also lets me apply a mask, which means I could decide which areas of the background image get blurred and by how much. I could basically paint the blurring I wanted so it lined up with the depth: objects like the chairs right here and the front edge could have almost no blur, while the entire back wall and those far chairs could be much more heavily blurred, which would make this even more realistic. We think it looks good as a quick blur application, but if we wanted to create real depth of field, the mask would give us the opportunity for even more realism.

So that's the background. The monitor content is slightly different: it's coming to us from vMix via NDI. We set up a playlist in vMix that feeds whatever we want to show in the monitor, and we put a very slight blur on that content just so it has some separation from the host. We're also doing a transform: we take the video being sent from that vMix playlist and rotate and scrunch it to fit inside the frame of the monitor, using a perspective camera. It took quite a bit of tweaking to get it exactly the way we wanted, but we can do that subtle blur and the transformation in 3D space. You could, for example, create something like the Star Wars effect with scrolling credits: tilt the image back, expand it up so everything recedes toward a point in the distance, and even apply a blur so things fade out as they reach the edge of the screen. There's a lot of flexibility in how you use this; it's just one example, but having these kinds of transformations plus the ability to apply different masks to them is incredibly powerful.

For the monitor itself we just have, as I mentioned, a very slight blur. That layer is a transparent PNG, and its filter is just a small amount of blur. Combined, these create the sense of depth that adds a little more realism.

What's cool here, and let me switch over to the vMix interface now, is the playlist we're using: it just loops through these videos. Let me start it playing so you can see it coming in. What's actually happening is that vMix sends this out on one of its NDI outputs, OBS picks it up as an NDI feed, does the transformation and the blur, and the whole thing comes back into vMix as a single composite. We then do the green screen of the talent, which could be in-studio talent, here in vMix. If I go over here you can see I have this set up as layers: I take the feed coming in from OBS and apply the keyed green screen footage of the presenter on top of it. As a collective set of processes this is really cool: we feed the monitor content out, apply the talent here in vMix, the feed goes into OBS, OBS does the composite, the transformations and the depth, and then feeds it back into vMix.
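For anyone who wants to automate the vMix side of a setup like this, vMix's built-in Web API (HTTP on port 8088 by default) can switch and overlay inputs from a script. This is an illustrative sketch only: the input names "OBS Composite" and "Talent Key" are placeholders for whatever your inputs are called, and the show itself composites the talent as layers on an input rather than through an overlay channel, so treat this as the general API pattern rather than the exact recipe used here.

```python
# Sketch: drive the vMix side of the round trip via the vMix Web API.
# Default endpoint is http://127.0.0.1:8088/api/; inputs can be addressed
# by number or by name. The input names below are placeholders.
import requests

VMIX_API = "http://127.0.0.1:8088/api/"

def vmix(function, **params):
    """Call a vMix function such as Cut, Fade, PreviewInput or OverlayInput1In."""
    r = requests.get(VMIX_API, params={"Function": function, **params}, timeout=2)
    r.raise_for_status()

# Put the NDI feed coming back from OBS into preview, then cut it to program.
vmix("PreviewInput", Input="OBS Composite")
vmix("Cut")

# Bring a keyed talent input in on overlay channel 1.
vmix("OverlayInput1In", Input="Talent Key")
```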
That's where we do the green screen on top of that composite. So there's round-tripping going on, and this external processing gives you capabilities vMix doesn't have baked in: vMix can't do a blur natively and it doesn't have video plug-ins yet (it would be great to see that). This is just one use case for round-tripping over NDI to an external video processing app for things you may want to do that you just can't do in vMix alone, even with scripting.

Peter just made a point I want to jump on: yes, you do have delay, and that's not something I was trying to gloss over. For anything that requires significant video processing, especially transformations, the latency will go up, so this is not a zero-time processor. But you can use it in cases where that delay isn't significant enough to impact the production. The talent is here and the composite is running out; there could be up to a second of delay, but if somebody is just going through slides, that kind of delay may not be noticeable. Keep in mind that on top of the processing you also have the NDI round trip, so you're probably adding a few frames of latency in each direction just for NDI, plus the processing that takes place. It's a very good point, Peter: you have to use this selectively and make sure the way you're using it makes sense for your production.

The other thing to keep in mind is that you want the audio routed through the longest loop. If somebody is talking and their video is going through OBS, you don't want to take their audio on a separate, shorter route when this delay is being introduced; you have to keep audio and video in sync. In terms of general processing capability, though, something like a virtual set is probably a great way to leverage this, because even though you see the latency on set, the audience just sees the composite; they don't know that what's going on behind the talent happened 700 milliseconds ago. So you can make it work very well for some things, and it just wouldn't work for others. Good for pointing that out.
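To make that delay point concrete, here is a rough back-of-the-envelope budget. The per-hop and processing frame counts are assumptions for illustration, not measurements from this setup; the takeaway is simply that every NDI leg and every processing stage adds frames, and the shorter audio path needs a matching delay to stay in sync.

```python
# Rough latency budget for the vMix -> OBS -> vMix round trip.
# All frame counts below are illustrative assumptions, not measurements.
FPS = 60
NDI_HOP_FRAMES = 2        # assumed cost per NDI send/receive leg
PROCESSING_FRAMES = 3     # assumed OBS filter/compositing cost
HOPS = 2                  # vMix -> OBS, then OBS -> vMix

total_frames = HOPS * NDI_HOP_FRAMES + PROCESSING_FRAMES
total_ms = total_frames * 1000 / FPS

print(f"~{total_frames} frames round trip, about {total_ms:.0f} ms")
print(f"Audio delay to add on the short path: about {total_ms:.0f} ms")
```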
Before I jump to the next use case, let me catch up, because I know there were a few more people. Gee mentions that our audio seems a little better now, which is good, and Thomas says the same, so I'm glad everybody can hear us. Team 4M is giving us a hello from Münster, Germany; thank you for joining us. We have George Bass (or Beast, sorry if I'm saying your name incorrectly) with a hello from Stuttgart, and he mentions that he uses a combination like this for overlaying a spectral waveform. Yes, there's some cool stuff you can do with that, especially where you want to do audio analysis as a kind of side chain. It gives you a good view of loudness levels and other things inside the studio, and that's a case where you don't have to reintroduce it into the production; you use it as a support tool. Another great use case, so thanks, George.

We have Mike Graham asking: in this case, would the talent need a reference monitor to see the finished output, like a weatherman on TV? Yes, this would work exactly like a typical virtual-set green screen, which is what a weather presenter uses when they're talking in front of a blank wall; you'd need some sort of confidence monitor for the talent if you were doing that type of thing. But for some setups the two can be fairly disconnected: the background might just place the talent in a location for branding purposes, where the talent isn't interacting with the background directly or doesn't need to stay in sync with it, and in those cases you wouldn't necessarily need the talent monitor. In most cases it's still good to have, because if the talent feels immersed in the place you're virtually putting them, it's easier for them to behave naturally and do things that fit the environment they're in. It keeps everybody in the virtual headspace.

Peter suggests running OBS and vMix on the same machine to avoid some network delay. Yes, we are actually running these on the one machine, so in this case the NDI latency back and forth is pretty de minimis, though you're still going to have at least a frame in each direction just to sync them up. Again, it depends on the use case and what's being done: if you're doing heavier processing on the OBS side, you could actually get better performance with a bigger processor on a separate box doing just that one thing, rather than one machine doing multiple things. You'd really have to look and see; there could be multiple approaches, so find which works best.

George tells us his name is pronounced like the "base" in baseball, so George Bass it is; thank you, George. And we have Bruce from Scotland giving a hello to everybody; Bruce, thank you for joining us.

So let me show one more example, which I think is an interesting, if more artistic, use of OBS. Let me cut to this: I'm creating a sort of vignette blur around myself. Let's take it full screen so you can see it. I've tried to keep the center in focus, but it blurs out of focus toward the sides; you can see that as my hands go up they're all blurred, and you can wrap this around yourself. This has less latency because you're really just doing image processing here, not heavier video processing.
It is going to be a little out of sync with the audio, as you can see, but I think this is another good use case: if you were doing something like a church service or a wedding, say a Christmas procession, you could switch to this to give a softer, more ethereal look and then switch back. So let's go back to the regular camera, and I'll jump over to OBS and show you what we did.

In OBS this scene has just two elements: the guest video, which is coming straight from the camera, and an overlay mask. That's all there is. If I look at the filter, we're applying the same type of blur we applied to the layers last time, but now we're using a mask. The mask is basically a circle with a black core that transitions over a gradient to a white outer band, and then we just play with the size of the blur we want. If you look here, I can take the blur away or make it really severe, but at a certain point it starts to look like a hard break between the clear and blurred areas, so we positioned it right around here, which we thought gave a nice soft transition. Again, I wouldn't use this where lip sync matters, because we're obviously a bit off here, unless you delayed the audio somehow to keep everything in sync. But we think it's great for effect shots; the perfect example is a wedding ceremony you're live streaming, where this could be a really nice effect that enhances the production value. This is very simple, but it shows some of the capability you have here: you can apply different masks, and the mask could even be video (which would add to the delay), so you could have something softly moving all around the image. It's just another example of what you can do.
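If you want to roll your own mask image like the one described above (black core, gradient falloff, white outer band), a few lines of Python will generate one. This is a sketch under assumptions: 1080p resolution, a centered circle, and falloff radii picked arbitrarily. It isn't how the mask in the show was made, just one way to produce an equivalent PNG to load into the filter.

```python
# Sketch: generate a radial vignette mask PNG, black (sharp) in the centre,
# fading to white (fully blurred) toward the edges. Resolution, centre and
# falloff radii are arbitrary assumptions; tune them to taste.
import math
from PIL import Image

W, H = 1920, 1080
INNER = 0.35   # fraction of the half-diagonal kept fully sharp (black)
OUTER = 0.85   # fraction at which the mask reaches full white

img = Image.new("L", (W, H))          # 8-bit grayscale, starts black
cx, cy = W / 2, H / 2
max_r = math.hypot(cx, cy)

px = img.load()
for y in range(H):                     # plain Python loop: slow but dependency-free
    for x in range(W):
        r = math.hypot(x - cx, y - cy) / max_r
        t = (r - INNER) / (OUTER - INNER)   # 0 inside the core, 1 past the outer band
        px[x, y] = int(255 * min(max(t, 0.0), 1.0))

img.save("vignette_mask.png")
```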
So let's switch back to the main shot. Hopefully this gives you a general sense of what you can do with these types of outboard processing. There are lots of plug-ins for OBS. For instance, there are plug-ins that do face tracking and can move a camera, so you could use OBS to take a camera feed from vMix, run it through a face-tracking plug-in, and have OBS do PTZ control to move the camera, or do a digital punch-in, say taking a 4K feed and moving a 1080 window around inside it. There are plug-ins for all of these things, and as you start to look at the power of plug-ins in OBS, I think you'll find lots of creative ways to apply them to different types of productions.

Before we wrap the show today, let me jump back to the comments. Bruce from Scotland hoisted a beer for us; thank you, that would not be a bad thing to be doing. Gee says John is going for the David Hamilton effect; thanks, Gee. George Bass made the point that it's great for beauty shots, and I think that's exactly the sort of thing: photographers use filters specifically for that, and this gives you those kinds of effects. You could also think of it almost like frosted glass, as if you put a window frame in front of somebody and shot the camera through it, so there are lots of creative uses that could give you that frosted look.

Let's see, Mike was asking whether that blurring example could work for a virtual panel discussion where you want to blur the face of a guest who needs to remain anonymous. The quick answer is yes, that could work, but for somebody moving around, sitting back, going side to side, you'd probably need something that tracks them. And again, I haven't looked through all the available plug-ins, but this could potentially be done with camera tracking: blur the center of the screen in the rough space where the person's head is, and as long as the tracking is fast enough, even using an automated tool like a plug-in in OBS, the person stays blurred through the production even as they move around, which matters especially if you're doing something live.
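As a rough illustration of that face-anonymizing idea, here's a small OpenCV sketch that detects a face each frame and blurs a padded box around it. It uses OpenCV's stock Haar cascade as a stand-in for whatever tracking plug-in you would actually run inside OBS; the camera index, padding and blur kernel are assumptions, and a production setup would want something more robust than per-frame detection.

```python
# Sketch: per-frame face detection plus a padded Gaussian blur, as a stand-in
# for a proper tracking plug-in. Camera index, padding and kernel size are
# arbitrary assumptions.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)
cap = cv2.VideoCapture(0)

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in cascade.detectMultiScale(gray, 1.1, 5):
        pad = int(0.3 * w)                 # over-blur so movement stays covered
        x0, y0 = max(x - pad, 0), max(y - pad, 0)
        x1, y1 = x + w + pad, y + h + pad
        frame[y0:y1, x0:x1] = cv2.GaussianBlur(frame[y0:y1, x0:x1], (51, 51), 0)
    cv2.imshow("anonymised guest", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```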
We have Torture CH saying hello; hello, good to have you here. I'd do a very poor job pronouncing Cyrillic, but you're saying your PC can use Resizable BAR from NVIDIA. Right now we're not; I know that's in the RTX 30 family of GPUs and we're running a 20-series card here. It's possible that could shave off a little delay, but this is one of those things where we go with what's available to us, and the 30 series has been like a lottery if you're lucky enough to get one. Thank you for pointing it out, though; Resizable BAR is an interesting capability, and it is something vMix does leverage; they've had some coverage of that.

George is asking whether there are any plans for the vMix guys to implement a plug-in system. I don't know; I have no real relationship with Martin where I'd get any kind of inside information, but I think it would make a huge difference to vMix as a complete solution, because it would stop limiting it to the feature set that can be developed in-house. They're a great product but a small company, and as another small company owner I appreciate the limits on what you can develop in a given period of time and still produce a quality product. If they opened it up to an ecosystem of video plug-ins, even with certification criteria plug-ins had to meet, I think that would be great not just for vMix but for the community in general; it would let the third-party community fill in any gaps people see for their specific productions.

Let's see: Bruce says "the 20 family for the win." Right now it looks like NVIDIA is actually going to go back and release 2060s again with more GPU memory, so even NVIDIA may be getting back on the 20-family bus. JP has just joined us and says hi John, great to see you again, looking forward to the show; JP, thank you for joining us.

I guess that's a wrap for today's show. I know it was probably a little off the beaten path compared with some of the things we do, but I thought it would be an interesting way to showcase a more distributed processing pipeline, so hopefully you enjoyed it. Please stick around for the post show, where we can talk about anything; last week the post show was twice as long as the main show, and sometimes we get onto some interesting topics there. Everyone here is welcome to join us for the post show, along with anything you'd like us to cover in the upcoming season. As I mentioned, next week's show will be the final official show of season two of Streaming Alchemy, and then we'll have one more show on the 24th, Christmas Eve, basically an ask-me-anything video show where I may show some things I thought were cool but wouldn't quite fill a full show. So we have two more shows, and we're putting together an editorial calendar of what we'd like to cover in season three, which will probably start at the beginning of February; that gives us time to get some of this prepped while still making sure we cover the things you're interested in. Before we run, let me just see: okay, I'll cover everybody who still has an outstanding comment in the post show. If I don't see you in the post show, I'll see you next week. Take care, everybody, and thanks for watching; otherwise I'll be back in a couple of seconds.

All right, welcome back everyone to the post show. I wanted to make sure I caught the last few posts. George mentioned he agrees, since they've already done this with VST audio; he's talking about vMix supporting plug-ins in the architecture. I agree, and I know a video plug-in architecture probably has more challenges and would require some significant underlying development in the way they process video, so I can understand why it isn't there yet even though the audio side is. But I feel strongly that adding it would put vMix into a whole new space they could start to mine, and it would let people in the production space do things that may never attract a mainstream product feature but could certainly apply to specific niches. Bruce says he has a question about an issue he's only just noticed and he'll try to condense it into the chat. Bruce, this is the time, so throw it up there and we can talk about it, and if it's something I don't have the information on hand to answer, we can certainly follow up in any of the chats.
Torture says thank you for your work, and we really appreciate that. For us this is a piece of something we look forward to every single week, and having this community develop the way it has is incredibly gratifying, so we're really grateful for all of you and everybody who tunes in every Friday. Jay Raphael says it's his first time catching us live and he's on his way to the hospital; oh my gosh, Jay Raphael, please take care of yourself, I hope everything is okay, and if you're on your way to the hospital I'm sure you have more important things to do than watch this stream, so please be well and safe. We also have America Newscape saying it was a great show and he enjoys these post shows; I do too. Sometimes in the post shows you get into topics you wouldn't normally think of as part of a regular show, and the tangents are often the most interesting pieces of these discussions.

Torture asks: what's the main difference with the latest iteration of NDI? There are a few things, but the main one is the move to reliable UDP, which allows UDP traffic to traverse lossy networks. That means you can go over Wi-Fi where the signal may not be high quality or consistent; you could have intermittent gaps, and reliable UDP means NDI can now get across that, provided there's enough bandwidth, even if the signal drops in and out, because it can recover what was lost during the dropouts. The same applies to wide-area uses, which is where NDI Bridge comes in. As a tool set, NDI Bridge is probably the most significant new tool added in NDI 5, but as an underlying component we think reliable UDP is the big winner. The other change is on the audio side: you can now use VST3 plug-ins to process audio in NDI. That's brand new in NDI 5, and we think it's another big win for the NDI ecosystem to have that ability baked into NDI itself. Hopefully that gives you enough of a sense, but the NDI 5 tools are available and definitely worth downloading. There's lots of cool stuff in NDI Bridge; beyond relaying NDI over the internet, the next biggest thing is that you can do transforms, so you can take NDI and convert it to NDI|HX so it's lower bandwidth and can traverse, say, a wireless hop in your signal chain. Really cool stuff in the tools as well as a general upgrade of the overall NDI technology.

Bruce is asking about vMix desktop capture versus window capture on the same device: when he captures the full display, video is perfect, but when he selects a specific window (Zoom, Teams, Chrome), it's jerky. Any ideas? I'm not sure. One possibility is that if it's not a normal video reference size and vMix has to process it, there may be additional overhead in the way window capture works that makes it less smooth, whereas a 16:9 or 4:3 display may be something vMix handles in a more optimized way. I'm totally speculating here.
We've used desktop capture for some things, but not for anything motion-critical. It may also be that window capture just doesn't handle full-frame video well; you're looking at a single video window and expecting 15 or 20 frames per second, and it may not be able to deliver that, so it may be jerky naturally. Again, I'm not sure. If you're capturing these kinds of things, especially with Zoom, one approach might be to use an NDI screen capture instead, pull that in as a single input, and then cut that input up into different segments by cropping in. That would let you layer it and create individual feeds for everybody in a way that might be a little more efficient, because you move more of the processing onto the computer running vMix rather than the machine doing the screen capture. Just a couple of thoughts; if anybody else has anything, I'd definitely appreciate it. And actually, John Drinkwater just mentioned that it does take more processing to grab a window, so yes, it's probably related to the processing plus the frame rate you're expecting; some combination of those makes it more stuttery.

Let's see, Jay Raphael, who was on his way to the hospital (I'm not sure why you're still commenting here), says he'd sell his soul for a 30 series. If you're on your way to the hospital, why don't you hold off signing any contractual bargains for your soul and just worry about staying well; but I understand the sentiment, absolutely. Torture says thanks for the answer; as I said, I'm not sure whether that is the answer, just some conjecture.

JP Odette says he'd love to hear some ideas on bringing wireless remote cameras into vMix, or playing with a mobile phone via NDI. The first thing is that if you're going wireless, you really want to be in the NDI|HX space, so if you're bringing in cameras, make sure they're HX based. The second thing, which you mentioned, is reliable UDP: definitely look at NDI|HX in the NDI 5 space, because reliable UDP will let these kinds of connections work much better. That said, there are other wireless solutions for bringing in cameras. Teradek has a series of wireless transmitter and receiver pairs that work with the native camera signals rather than NDI, and you could bring that in locally; maybe convert to NDI at a laptop, where you take the receiver output, push it into NDI, and feed it in. Having a reliable connection when you're wireless is critical, and that comes with NDI 5 and with some of these platforms. I think it's the Teradek Cube (sorry if I'm not remembering it exactly; it's not something we use in house), but that would be one way, beyond NDI, to potentially do this.
Torture says that for now they're only using NDI for short connections; at the EBU in Geneva they had some random failures on complex networks with NDI, so they're sticking with SDI between the conference room and the... I'm sorry, I'm not sure what that term is, but I understand the sentiment. I was asked this question on a show this week, whether I thought there was a future for SDI and what its advantages are in an NDI/IP world, and your point is exactly the reason SDI is still going to be around for quite a while. Getting network infrastructure to transport IP-based video streams, especially really high-quality streams, is not simple, and it's largely an area that's out of our control: we may have a studio, but our traffic potentially passes over additional corporate infrastructure, generic internet infrastructure, and guest infrastructure if you're pulling people in from another location. There are a lot of unknowns in any kind of networking. SDI eliminates that: it's a digital signal, but it's point to point; screw the BNC in on both ends and you know you're getting a signal across that line. So I agree with you. NDI itself is great, but it requires the right infrastructure to really become your go-to solution across the board, and unless that's infrastructure you fully control, I'd agree with your decision to go with SDI for mission-critical hops you don't control.

John Drinkwater mentions he has an Edelkrone (I'm guessing you mean the jib), so now he needs to understand their API. They make some really cool dollies and automated camera control for pan and tilt; very nice gear. They also have a jib, so you can get those big flying shots where you look up at something and see it sweep down or sweep up, and it's computer controlled, so you can create predefined movements. Very cool stuff; I'm really looking forward to seeing what you do with it, and I'd love to see what you program against the API, so thank you, John.

Jay Raphael says he's a true, dedicated alchemy man; thank you, but please be well and be safe, I definitely don't want you doing anything that jeopardizes your health. Oh, that's Lee, I'm sorry; Lee, thank you for speaking for Jay Raphael. Maybe it's just a habit at this point, but we appreciate it. JP confirms that yes, it is the Cube; thank you, JP, I was trying to recall that off the top of my head. Torture explains that a régie is a studio or switching room; okay, thank you, I always love learning something new here, and I apologize for my monolinguistic skills; I wish my French were as good as your English, so no need to apologize for yours. America Newscape is dropping off and says take care everybody; thanks for joining us.

If you have any more questions, let's get them in. I just wanted to share one other thing that happened this week, which was really nice.
On Wednesday I was invited to join Office Hours, which, if you haven't seen it, is hosted by Alex Lindsay, someone at the pinnacle of the streaming community in terms of knowledge and the types of productions he's been running. He had me come on and talk about our new LiveToAir product and take questions from everybody, and that was a lot of fun. If you haven't seen Office Hours, just look up "Alex Lindsay Office Hours" on YouTube. It's a community of incredibly talented production people with huge real-world experience who let people ask them any questions they want, so it's similar in spirit to some of what we do here, but it was really fun for me to be a part of it and to spend that time. They run different shows every day, usually from about 9:00 a.m. to around 12:00 p.m. Eastern. Great group, great community, and if you tune in on any given day you're going to learn a lot of cool stuff, so definitely check it out.

Let me see. Bruce just threw out that he's been looking on his PC and may have figured it out; he needs to play with it more and will be in touch when he knows more. Please do share, because if you find out more I'd love for you to share it with everybody. Matt Wade says he appreciates the content and has learned a lot as he tries to become a video producer. Matt, this is a great community, so if you have any questions, post them, probably in the Facebook group; the Streaming Alchemy group on Facebook has the most continuous traction. If you post comments on YouTube I try to get to them too, but you'll get more of a community response over on Facebook.

I guess this is probably a good point to wrap the post show. As I mentioned, next week's show will be a deep dive into the new LiveToAir 12 remote guest channel neural net product. We figured we'd give a little exposure to the product that's helping pay for this show each week. If you're interested, we think it could be a lot of fun; we'll talk about things in more general terms as well, but we're pretty excited: it's a new product we're launching in January, we think it has a lot of potential, and we wanted to share it with everybody here before we actually roll it out, and give everybody a chance to see what I do as my day job. I hope you get a chance to join us.

Very quickly before we wrap: George says good night from Germany; thank you, George. Matt says will do; he's moved to vMix from Ecamm. Matt, I think you're going to like vMix; Ecamm is a great product, but vMix opens up a lot more ways to express your own personality in your productions, things that would be unique to you, so I think you'll enjoy it a lot. Mike Graham had a quick question: with the six-guest version of LTA, will a 2060 GPU be sufficient? The quick answer is yes, it will. You get three native encodes on a 2060, and you can do three additional software encodes for the remaining return feeds for six guests, so with that and a decent CPU, probably an eight-core, the 2060 would be fine for six guests without a problem.
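As a rough way to think about that kind of sizing question, here is a tiny back-of-the-envelope helper. The numbers (three NVENC sessions on the card, one CPU core per software encode, keeping half the cores free for the mixer itself) are illustrative assumptions rather than specs, so treat it as a starting point for your own testing, not a guarantee.

```python
# Sketch: back-of-the-envelope budget for guest return-feed encodes.
# The session count and per-encode CPU cost are assumptions for illustration.
def return_feed_budget(guests, nvenc_sessions=3, cpu_cores=8, cores_per_sw_encode=1):
    hw = min(guests, nvenc_sessions)          # feeds handled by the GPU encoder
    sw = guests - hw                          # remainder falls back to software
    cores_needed = sw * cores_per_sw_encode
    return {
        "hardware_encodes": hw,
        "software_encodes": sw,
        "cpu_cores_for_encoding": cores_needed,
        "fits": cores_needed <= cpu_cores // 2,   # leave headroom for the mixer
    }

print(return_feed_budget(6))   # six guests on a 2060-class card with an 8-core CPU
```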
We have Jose saying thank you from Portugal; Jose, thank you, and as I said, I love having everyone here as part of the community. We have Kevin Michael Reed saying "bring your audio back into vMix and then send it out to your encoder after that; that's been successful for me." Kevin, thank you. This is very strange; I know we're wrapping, but I'll just throw this out: we're finding that the audio on our internal recording is fine, and we only see these audio problems on what ends up getting encoded, so we're going to spend a little time between the two seasons trying to dig in and understand exactly what's going on there. Oh, I'm sorry, I've just been told you were responding to somebody else; my apologies, we were talking about the audio issue we were having at the beginning. Okay, I guess we're at a good point to do a wrap here. Thank you everybody for hanging around for the post show, and we'll see you all next week. Take care.
Info
Channel: The Streaming Alchemy Show
Views: 357
Id: wPMAsrkG9qs
Length: 56min 5sec (3365 seconds)
Published: Sat Dec 11 2021