Introduction to GStreamer (gst-launch) for embedded devices: Raspberry Pi and Jetson Nano

Captions
Well, in this video I will introduce the GStreamer gst-launch tool. We will create some pipelines with gst-launch on embedded devices, but that does not mean the examples only work there: you can run the same commands on your computer or in a virtual machine, as long as you are running Linux.

So what is a pipeline? Simply put: you have a camera, the camera gives you some output, you push that output through a filter, and after the filter you send the frames to the screen so you can see them. That chain is called a pipeline, and gst-launch gives you the ability to build such pipelines directly from the command line; it is just a command-line tool. In this tutorial we will create several of these command-line pipelines, and then I will show you how to use them from Python and from Qt/QML: sending frames from one application to another, some small test applications, Pi camera usage on the Jetson Nano and on the Raspberry Pi, a few things like that. I hope you enjoy it; let's start.

I created a GitHub page where you can follow the commands and find the examples, and I will keep updating it with new examples and ideas around GStreamer, Python and Qt, so you can follow that page alongside the video; I will simply follow its content now. For the installation you should first update and upgrade your system and then install the GStreamer packages listed there; I am not going to repeat that step here. After this you will have gst-launch and some other useful tools that we will also talk about.

Now the hello-world pipeline. I am on the Raspberry Pi over an SSH connection (later we will switch to the Jetson Nano as well). Because of the SSH connection you should first export the DISPLAY variable so the output goes to the X server on the device itself; if you work on the Raspberry Pi directly you do not have to do that. Then type gst-launch-1.0, and since our aim is to build a pipeline we need a source element first: for testing we use videotestsrc. The exclamation mark sends the output of one element to the next element or sink; it works exactly like a shell pipe: when you run ps aux it prints everything, and when you want a single specific process you pipe it into grep with the process name. So after the exclamation mark we send the videotestsrc output to an X window, and for that there is the ximagesink element. On the screen you should see the colorful bars you know from old televisions. That is the hello world: a simple pipeline with a test source.
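For reference, this part of the setup looks roughly like the following; the exact package list is on the GitHub page, the plugin packages below are just the usual Debian/Raspberry Pi OS names:

    sudo apt update && sudo apt upgrade
    sudo apt install gstreamer1.0-tools gstreamer1.0-plugins-base \
        gstreamer1.0-plugins-good gstreamer1.0-plugins-bad gstreamer1.0-libav

    # only needed over SSH: send the output to the X server running on the device
    export DISPLAY=:0

    # hello world: test source -> X window sink
    gst-launch-1.0 videotestsrc ! ximagesink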
You can ask yourself: what is this videotestsrc, what is this ximagesink element? Whatever you have as an element, you can inspect it with the tool called gst-inspect-1.0: just give it the name of the element, for example videotestsrc, and it prints the information and features of that element. This is important, because videotestsrc is a test element and supports most of the formats and lots of fancy features, but when you have a Raspberry Pi with a Pi camera, or some cheap USB camera, those devices cannot support everything. That is why you should first inspect the elements and understand what is going on and which features you can actually use.

In the output you first see some metadata: the people who wrote the element, the version, the license, the source release date; you can skip past that. The GObject part is also GStreamer related, but we can talk about it in later videos. The important part is the pad templates: this element has a source pad with availability "always", and under it the caps, which stands for capabilities. With the caps you can change the format of the output and the size of the window. As you see there are a bunch of formats (it is worth playing with them to understand the differences), and you can also change the width, height, framerate, multiview-mode and so on; here the type is video/x-raw. The caps change according to the source element: for the Pi camera on the Jetson, for example, we will use a different source element and its capabilities will be different, as we will see. That is why this part is important to understand.

OK, let's change the width and height of the hello-world pipeline. Go back to the gst-launch command, put one more exclamation mark after the source, and add a caps filter: video/x-raw with width=500 and height=500, just to pick something. For videotestsrc the exact value is not important, but for real sources the width and height usually have to be a multiple of some integer, otherwise you break the frame alignment and you see broken images; check gst-inspect for your source element to find more information about that. Run it and you can see on the screen that the size has changed. Nice.

But we can do more: we can also change the format. Let me check the format list again and deliberately choose one that is not supported by ximagesink, to show you what happens; for example GRAY16_LE, a 16-bit grayscale format, so black and white. We add format=GRAY16_LE to the caps and run it, and we get an error, which is exactly what I expected.
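Roughly, the commands for this part look like this; 500x500 and GRAY16_LE are just the values used here as an illustration:

    # inspect an element: metadata, properties, pad templates and caps
    gst-inspect-1.0 videotestsrc
    gst-inspect-1.0 ximagesink

    # force the output size through a caps filter
    gst-launch-1.0 videotestsrc ! video/x-raw,width=500,height=500 ! ximagesink

    # force a format that ximagesink cannot handle: this produces a negotiation error
    gst-launch-1.0 videotestsrc ! video/x-raw,width=500,height=500,format=GRAY16_LE ! ximagesink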
It says: additional debug info, gstbasesrc.c, gst_base_src_loop, pipeline0/source0: streaming stopped, reason not-negotiated. What was negotiated by whom? The idea is that the elements of a pipeline have to negotiate a common format between each other. If you do not know the elements and have no experience yet it is hard to understand, but the message simply says that the ximagesink element cannot support this format. The solution is very simple: we put another element into the pipeline that converts this format into something ximagesink understands, and that element is videoconvert. Add it before the sink and run again: it works, and you see the test pattern in black and white.

I showed this with ximagesink, but there is also xvimagesink, which naturally supports more formats, so you do not always need videoconvert. For example, as far as I remember xvimagesink supports the I420 format, which ximagesink does not. Let's try it: first without videoconvert and with ximagesink I get the same problem, the same cannot-negotiate error; then I change just one letter in the sink name, from ximagesink to xvimagesink, which is a different element, and it works. So ximagesink does not support I420 but xvimagesink does; I cannot remember the complete format lists, you should check them with gst-inspect or on the internet.

Let's continue; that was simply the hello-world application, but I almost forgot something. Run gst-inspect-1.0 on videotestsrc again and look at the element properties, which are the arguments you can put after the source element in the pipeline. One of them is pattern: 0 is the default, the SMPTE 100% color bars; 1 is snow, a random television-noise signal; 2 is 100% black; then come white, red, green, blue, the checkers, the gamut checkers, blank, gradient, spokes, pinwheel and so on. How do we use it? Simply append pattern=1 after the source element; note that the spaces are important here, the property goes after the element name separated by a space. With pattern=1 you get noise, with pattern=2 black, and further down the list there are fancier ones; let me pick number 21.
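The fixed pipelines and the pattern examples from this part:

    # videoconvert translates GRAY16_LE into something ximagesink accepts
    gst-launch-1.0 videotestsrc ! video/x-raw,format=GRAY16_LE ! videoconvert ! ximagesink

    # xvimagesink accepts more raw formats directly, e.g. I420, so no converter is needed
    gst-launch-1.0 videotestsrc ! video/x-raw,format=I420 ! xvimagesink

    # element properties go right after the element name, separated by spaces
    gst-launch-1.0 videotestsrc pattern=1 ! ximagesink    # snow / noise
    gst-launch-1.0 videotestsrc pattern=2 ! ximagesink    # 100% black
    gst-launch-1.0 videotestsrc pattern=21 ! ximagesink   # one of the patterns further down the list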
Let's check pattern 21. Oh nice, although I get some weird artifacts; there might be a timestamp problem, or maybe it is the width and height parameters, I am not sure; this computer is a Raspberry Pi and it is a bit slow for this one. Anyway, you have now learned gst-inspect and the simple hello world, so let's switch to the next topic: Pi camera usage. Let me clear the screen.

We can start on the NVIDIA side, so let me switch to the Jetson. There is a dedicated source element here for the CSI camera called nvarguscamerasrc, so let's inspect it: gst-inspect-1.0 nvarguscamerasrc. Again a bunch of information, nice: the supported sensor modes, and properties like sensor-id, saturation, gainrange, tnr-strength; these are different features we can play with later. Let's continue to the pad templates and the caps. Here the type is video/x-raw(memory:NVMM), which is new: NVIDIA's own buffer memory, so it is different from before. And the formats: only NV12 is listed, nothing like the long list of videotestsrc. That is normal: this is an NVIDIA-specific source element, so its capabilities are restricted, and this is exactly why inspecting the element first matters.

Let me grab the camera; I have another one but its cable is too short, so I will hold this one like this, sorry. Let's run the Pi camera on the Jetson Nano or NX (or whatever board you have), but be careful to connect the Pi camera to the correct CSI port. I took the pipeline from the internet, I did not write it myself: nvarguscamerasrc as the source, then a caps filter with video/x-raw(memory:NVMM), a width and a height (I use a small size of 300 by 300 here; we could also set a framerate, but I will only play with that if we see problems) and format=NV12. On the Jetson we do not use videoconvert and ximagesink; NVIDIA ships its own elements instead: nvvidconv as the converter, nvegltransform for the conversion to EGL (it is used for OpenGL output), and nveglglessink as the sink, which is a strange name to pronounce but that is what it is called.
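Put together, the Jetson pipeline looks roughly like this; the resolution and framerate below are example values, pick a mode that gst-inspect-1.0 nvarguscamerasrc reports for your sensor:

    gst-inspect-1.0 nvarguscamerasrc

    # CSI camera on the Jetson: NVMM buffers, NV12 only, NVIDIA's own converter and sink
    gst-launch-1.0 nvarguscamerasrc ! \
        'video/x-raw(memory:NVMM),width=1280,height=720,framerate=30/1,format=NV12' ! \
        nvvidconv ! nvegltransform ! nveglglessink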
I run it and get an error at first: I typed an extra letter where a comma should be, so the caps before nvegltransform were wrong. Fix the typo, run again, and there I am on the screen, just rotated because of how I am holding the camera, sorry about that. Is there a flip property somewhere? I am sure there is some rotation option; as far as I know nvvidconv has a flip-method property that can be used for this, but we can check it later. So that is it: the Pi camera running on the NVIDIA Jetson with gst-launch. Nice.

Now, what should we do on the Raspberry Pi to run the Pi camera with gst-launch? I am going to show you two ways. Let me switch to the Raspberry Pi; another Pi camera is connected to it. Please first make sure the camera is enabled: go to Preferences, Raspberry Pi Configuration, and on the Interfaces tab the camera should be enabled; you probably have to restart the system after changing it.

The first way uses a generic source element that also works for USB cameras and other Video4Linux devices: v4l2src. So, gst-launch-1.0 v4l2src, then the caps: video/x-raw with width=300, height=300 and, let's say, a framerate of 30/1 (you should check the supported values with the inspect tool), then ximagesink. It is not working. Let me put videoconvert in; still not working, and that is not nice. I also remembered that you may have to load the V4L2 driver for the Pi camera first, the bcm2835-v4l2 module I had noted down; I try again and it still does not work, and I do not understand why.

Then I realize the problem might be the USB camera that is also plugged in: by default v4l2src opens /dev/video0, and that is not my Pi camera. So let's first list the video devices with the v4l2-ctl control tool. The listing shows the codec and ISP devices, my Trust webcam, and the platform entries; which one is the Pi camera? In my case it turns out to be /dev/video2. So I add device=/dev/video2 to v4l2src.
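The device listing and the working pipeline from this part; the device number and the 300x300 size are specific to this setup, so check your own v4l2-ctl output:

    # load the V4L2 driver for the Pi camera if it is not already loaded
    sudo modprobe bcm2835-v4l2

    # list all V4L2 devices to find which /dev/videoN belongs to the Pi camera
    # (v4l2-ctl comes from the v4l-utils package)
    v4l2-ctl --list-devices

    # point v4l2src at the right device explicitly
    gst-launch-1.0 v4l2src device=/dev/video2 ! \
        video/x-raw,width=300,height=300,framerate=30/1 ! videoconvert ! ximagesink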
With the right device it works. Let me explain what happened before. I had tried these commands earlier: with only the USB webcam connected it worked, and with only the Pi camera connected it worked, but when I combined them there was a problem I could not figure out at first. When you just write v4l2src without a device, it tries to find the default device, /dev/video0, which here belongs to the Trust webcam, and the caps I asked for do not exist for that camera, so the negotiation could not be created. Once I listed the devices with v4l2-ctl and pointed v4l2src at the right node, it worked. If it still does not work for you, also make sure the camera interface is enabled and run the modprobe command again; in my case the module was already loaded, but it does not hurt to check.

So now we are connecting to the Pi camera through the generic v4l2src source element. I am not sure whether that counts as best practice, but there is another way you can use for the same job which is actually more efficient, similar in spirit to the nvarguscamerasrc element on the Jetson: the rpicamsrc element. When you search for it on the internet you will find a GitHub page with clear instructions. I compiled and installed it here before, so I am not going to do it again; simply git clone the repository and follow the steps there: install some tools (autoconf, automake, libtool, pkg-config and the GStreamer development packages), then configure it with autogen.sh, make and make install. After this you will have the rpicamsrc element.

Let's check it, not with gst-launch but with gst-inspect-1.0 rpicamsrc. Still a bunch of information, with different numbers and features: sensor-mode, shutter-speed, inline-headers, brightness, video-direction and a lot more; we can check these in more detail later because there are really nice things in there. In the pad templates you see image/jpeg, video/x-raw and video/x-h264. I am going to use video/x-raw again, and as you see these are generic formats; if you remember, on the NVIDIA side there were NVIDIA-specific caps, so again it depends on the source element you use, which is normal. One property to note is preview: by default rpicamsrc also brings up its own preview overlay on the display, so you would get two different outputs, the X window and the overlay; you can try it, but I will simply set it to false.

So the pipeline is gst-launch-1.0, then rpicamsrc with preview=false, an exclamation mark, then the caps on video/x-raw. For the width, as far as I remember it should be a multiple of 32, so I put 320, and the height should be a multiple of 16; you can try other values and see the difference. Let me check whether we need a framerate; no, we do not need one right now. Then videoconvert and ximagesink.
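A sketch of the build steps and the pipeline described above; the repository URL is from memory, so please double-check it against the project page you find, and 320x240 just follows the multiple-of-32/16 rule mentioned here:

    # second way: the dedicated rpicamsrc element, built from source
    git clone https://github.com/thaytan/gst-rpicamsrc.git
    cd gst-rpicamsrc
    ./autogen.sh && make && sudo make install

    gst-inspect-1.0 rpicamsrc

    # preview=false: otherwise rpicamsrc opens its own overlay in addition to the X window
    gst-launch-1.0 rpicamsrc preview=false ! \
        video/x-raw,width=320,height=240 ! videoconvert ! ximagesink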
Run it, and you can see me again: now we are using the rpicamsrc element, and it works, nice. So we have learned how to use the Pi camera on the Raspberry Pi and on the NVIDIA Jetson Nano, NX or other boards with gst-launch, that is, with GStreamer.

Now let's play with something else: an RTSP example. RTSP is the Real Time Streaming Protocol. Until now we connected directly to the Pi camera or a USB camera, but we can also use RTSP to send the frames over the network from one point to another. There are a few ways to do this; we can talk about a Python approach in another video, but here I am going to show you the C example that ships with gst-rtsp-server. So go to freedesktop.org, where there is a tar file for gst-rtsp-server; we need a server first for RTSP. Download it (it went fast only because of the Wi-Fi), extract it with tar xf, go into the directory, configure it, then make to compile; it takes a few minutes. Then make install as the last step; it says relinking and installs everything.

After this, go to the examples directory of gst-rtsp-server. There are a bunch of examples there, already compiled, and we will use test-launch. You pass it a pipeline description as a string, wrapped in parentheses; it is the same kind of pipeline we built with gst-launch, just passed as a string, and we will do the same thing later from Python. The pipeline I pass uses the rpicamsrc source element with preview=false, as we just did, but this time with video/x-h264 caps: the format is different because we stream encoded H.264 rather than raw video, together with the width, height and framerate, then the parser and the RTP payloader. Run it, and through the server's port and the /test path we can now access the Pi camera, which we will test from another machine.
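A rough sketch of the server side, assuming the autotools-based gst-rtsp-server release used here (newer releases build with meson instead), the default test-launch port 8554, and an example resolution:

    # build the RTSP server and its examples
    tar xf gst-rtsp-server-<version>.tar.xz
    cd gst-rtsp-server-<version>
    ./configure && make && sudo make install

    # serve the Pi camera as H.264 over RTSP at rtsp://<pi-ip>:8554/test
    cd examples
    ./test-launch "( rpicamsrc preview=false ! video/x-h264,width=640,height=480,framerate=30/1 ! h264parse ! rtph264pay name=pay0 pt=96 )"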
Now the client side. Let me check my Raspberry Pi's IP address; I have an Ubuntu virtual machine here, and first let's be sure the Raspberry Pi is pingable from it, otherwise we cannot talk to it at all. On the client I do not need rpicamsrc or anything camera-specific any more; this is RTSP now, we have isolated and abstracted the system behind the network. The receiving pipeline I used before is: rtspsrc with the location set to the Pi's address, the port and the /test path, with latency=0 and buffer-mode=auto, then decodebin, which does the decoding (more about it in a moment), then videoconvert, which we already know, and autovideosink with sync=false to relax the synchronization. I connected the Raspberry Pi over Wi-Fi, so it will probably be a little slow, but let's try; I hope it brings something up. Yes, great, and actually it is not slow at all; it is Wi-Fi and it is not bad, nice. So as you see we can also reach the Pi camera, or whatever camera, over the RTSP protocol. You might use this for security cameras, for example, and you can also connect from VLC or any other player that supports RTSP, even from your phone.

So far we have talked about the hello-world application, Pi camera usage and RTSP. The next one is the USB camera: I will show you how to handle a USB camera on the Raspberry Pi, again using the v4l2-ctl tool I just showed you, and we will put the output of a cheap USB camera on the screen in an X window. Both the receiving RTSP pipeline and the USB camera pipeline are sketched below. Let me clear the screen and close this terminal, and let's switch to the next section.

To check which devices are connected to your Raspberry Pi or your computer, we again use v4l2-ctl with the list-devices option. Here you see that I have a Trust webcam, a quite cheap USB camera, which shows up as /dev/video0 and /dev/video1. Now I will run it with gst-launch just like the Pi camera: v4l2src, the generic source element, with device set to the right node, then decodebin, which I mentioned before: it constructs the decoding part of the pipeline automatically, trying the available decoders and demuxers while auto-plugging, so we do not have to care which format the camera actually delivers. After that we still need videoconvert for ximagesink, as we did before. My first try with /dev/video1 complains that it is not a capture device, so it should be /dev/video0, and there is the USB camera picture. I did not set any width or height parameters, so it comes up at its default size, which is quite big on this screen. So if one node does not work, just try video0, video1 and so on to find the capture node quickly. That is it for the USB camera: with the generic v4l2src element it is quite simple.
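The two pipelines from this part, roughly as used above; replace $PI_IP with your Raspberry Pi's address, and use whatever device node v4l2-ctl reports for your webcam:

    # RTSP client, run on any machine that can reach the Raspberry Pi
    gst-launch-1.0 rtspsrc location=rtsp://$PI_IP:8554/test latency=0 buffer-mode=auto ! \
        decodebin ! videoconvert ! autovideosink sync=false

    # USB webcam on the Pi: let decodebin figure out the camera's output format
    gst-launch-1.0 v4l2src device=/dev/video0 ! decodebin ! videoconvert ! ximagesink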
Now let's switch to OpenCV usage with GStreamer, of course with Python 3. Let's write a simple hello-world Python example together on the Jetson Nano. I am not using the Raspberry Pi for these examples because the OpenCV version on it is not the right one for me at the moment; on the Jetson I set up OpenCV myself, and you will see the version when we run the script. You can find these examples on the GitHub page I showed at the beginning, so I am not going to write them again; let's just talk through the code.

The first example is the same hello world as before: videotestsrc and videoconvert, but instead of ximagesink the sink is appsink. appsink is what you use when you do not write directly to a window but hand the frames over to your application. In the script we import cv2, which is OpenCV, and check the version; it should be 4.4 here. I noted it should be at least version 4, although I am not sure of the exact minimum; I could not get it to work with the 3.2 version on the Raspberry Pi. Then we pass the whole pipeline as a string to cv2.VideoCapture; if you check the definition of VideoCapture you will see that you can pass a string there. After passing the string we have the capture object, which has a read method that returns a return flag and the frame; these are the real frames, so we can show them with the imshow function in a while loop, and we can also add things to the frame before showing it. If you are more familiar with OpenCV than I am, you can probably already think of more things to do here, drawing on the frames and processing them; it will be nice.

Let's go back and run it: python3 on the hello-world GStreamer Python script. It prints version 4.4.0, which is what I am using right now, and it works; there is a warning, but that does not matter. So we just ran the first hello example with Python. This is really nice, and believe me, you can do whatever image processing you want here, send the frames from one process to another and manipulate them, all with simple scripts in a few lines of code; it is a very handy and powerful approach.

For example, I have another script here which I called gstreamer-drawing-python (I may change the names, so please follow the GitHub page): as you see, it draws a circle on a noisy test image, the snow pattern I tried before. What we did there: we pass the snow pattern in the pipeline string given to VideoCapture, and after getting each frame with the read method we use OpenCV's circle method to put a constant circle on the frame; then the frames are shown with imshow in an X window. Really simple, and very nice.
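A minimal sketch of what such a script looks like, assuming an OpenCV build with GStreamer support; the window name, circle position and colour are illustrative, the actual scripts are on the GitHub page:

    import cv2

    print(cv2.__version__)  # needs an OpenCV build with GStreamer support (4.x here)

    # the hello-world pipeline, ending in appsink so OpenCV receives the frames
    pipeline = ("videotestsrc pattern=snow ! videoconvert ! "
                "video/x-raw,format=BGR ! appsink")
    cap = cv2.VideoCapture(pipeline, cv2.CAP_GSTREAMER)

    while True:
        ret, frame = cap.read()          # ret is False if no frame could be read
        if not ret:
            break
        # draw a fixed circle onto the noisy frame: center, radius, BGR colour, thickness
        cv2.circle(frame, (160, 120), 50, (0, 0, 255), 2)
        cv2.imshow("gstreamer + opencv", frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):   # press q to quit
            break

    cap.release()
    cv2.destroyAllWindows()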
Now we are done with the Raspberry Pi; I am still working on the Jetson Nano here, but if you install a recent OpenCV on the Raspberry Pi, everything that follows, including the Qt part, will work there as well. Next we are going to use Qt and Python together with a shared-memory approach, to share the frames from a Python application with a Qt application, that is, with another application.

To see the idea first, let me copy a pipeline; it is quite long and I do not want to type it. What it does, simply, is push the video into a shared-memory sink: it creates a socket under the /tmp folder (that is the socket path) with a given size for the shared memory; you can find the full command on the web page. Note that this only works in the local environment: it will not work over the network from outside the device, which is maybe even good for security. After running it, it creates a server, and if you check /tmp you can see the socket there, a Unix domain socket I think. Then we can connect to that socket with another command from a second terminal: I copy it, paste it, and it is already working, the test image is on the screen, quite huge actually. So what I did is simply: I created a server in one SSH session and connected to it from another one. Now think of it like this: the producing command could be the Python example, and the consuming command could run, for example, in the QML and Qt application, so we can share frames between two different applications, like an IPC mechanism. That is a nice idea.

So let me show you the startup examples. There is a gst example written in Python; as you see there are two different strings in it. The first string is for the capture: the nvarguscamerasrc pipeline we played with before, ending in appsink. The second string is for the writer: it starts from appsrc, because this time we feed frames from the application into GStreamer through a cv2.VideoWriter, and it ends in the shared-memory sink so that another application can read the frames. (There is also an empty placeholder string in there; it is not important right now, it will still work.) The script reads the frames, runs a simple face detection on them and draws the result onto the frame, flips the frames, and then writes them out to GStreamer so we can use them from the other application. Let me run it: python3 on the gst example, and it is running now.
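A sketch of the shared-memory demo between the two terminals; the socket path, caps and shm-size are illustrative, and note that shmsrc needs the caps spelled out explicitly because a raw shared-memory stream carries no format information:

    # terminal 1: the server side writes raw frames into a shared-memory segment
    gst-launch-1.0 videotestsrc ! video/x-raw,format=I420,width=640,height=480,framerate=30/1 ! \
        queue ! shmsink socket-path=/tmp/foo shm-size=2000000 wait-for-connection=false

    # terminal 2: the client side reads from the same socket; the caps must match the server
    gst-launch-1.0 shmsrc socket-path=/tmp/foo is-live=true ! \
        'video/x-raw,format=I420,width=640,height=480,framerate=30/1' ! videoconvert ! autovideosink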
I compiled the Qt client example before, so we can simply run it; I am using the eglfs platform here, which is faster (I explained the Qt installation and these platform options in my older videos, so you can check those). Let's run it. Hmm, we are not seeing anything, and I think we should remove the stale socket under /tmp first. Let me clean up the socket and run both sides again; now it is working. Let me show myself a little; it is quite slow, but that is because of the Python side. So Python detects the faces, draws the rectangles on the frames and sends the frames to the socket through GStreamer, and then, again using GStreamer, we read those frames from the Qt side. Come on, detect my face, not my eyes... there, it detects it now. It is a really simple detector, you can use much better ones; this is just a helpful example for the camera.

Let's look at the client. In the QML, in the main camera file, we are using a MediaPlayer, and you can write the GStreamer pipeline directly as its source: a shared-memory source with the socket path under /tmp, like we did before from the terminal, and at the end the Qt video sink element instead of an X sink, nothing more. Then a VideoOutput uses this media player as its source and plays it. That is all, maybe 40 lines of QML, maybe 39; I call it from C++, though I think you could also run the QML directly, I did not try.

There is a second client example too, which is also quite nice. Let me run everything again, with the eglfs platform... and we hit the same problem: stopping the producer with Ctrl+C is not a good idea, it does not clean up and you get a segmentation fault, so don't try this at home; of course I am not a good boy. The fix is simply to remove the socket under /tmp, nothing more, and run the example again; this is the face-detection shared-memory example that you can find on the page, don't worry. Run the Qt example once more and now we see the video; let me put myself in front of the camera. There is also a red rectangle that you see; I made it movable with a small change in the QML, so together with the face detection I can drag the rectangle around. You can put buttons there, design a menu, whatever you want; it depends on you, it is very simple and really powerful. And do not worry that it looks slow: that is about the Python face detection; if you use a more modern approach it will be faster, I just wanted to give you an example. See, this works very well; you can put text on it and design whatever you want, it is perfect.
So guys, that is all from me here, thank you; it took quite some editing and it is quite a long video. In the next ones we will go deeper. We have mostly been talking about the test video and the Pi camera, but of course gst-launch is used for all kinds of applications; we just did not focus on them here. We can also run more complex examples, reading video files and other kinds of sources, and I will show you those as well, maybe on the NVIDIA board since that might be the better place for it, we will see. See you again, take care of yourself, and don't forget to subscribe. Bye.
Info
Channel: Ulas Dikme
Views: 1,799
Keywords: Raspberry pi, jetson nano, gstreamer, python, qt, hello world, image processing
Id: rPcQiDHyGnI
Length: 64min 49sec (3889 seconds)
Published: Fri Jun 18 2021