Hand Tracking & Gesture Control With Raspberry Pi + OpenCV + Python

Captions
Hey gang, Tim here at Core Electronics, and today it's hand recognition and finger identification with a Raspberry Pi single board computer.

Have you ever wanted your Raspberry Pi 4 Model B to be able to identify every single joint or finger in your hand, live? Or perhaps use hand gestures to send commands, or utilise hand key point detection or hand pose detection? Then this video is exactly where you need to be. Machine and deep learning has never been more accessible, and this video will show just this.

Here on the table is everything you need to start locating hands. Naturally, you're going to need a camera; you can use any Pi camera module, and today I'm using the High Quality Camera with a 5 mm lens attached. Bring along with you the most powerful Raspberry Pi you can get your hands on; here I'm using a Raspberry Pi 4 Model B. You're also going to want a micro SD card that has been flashed with Raspberry Pi 'Buster' OS, connected to the internet, updated, and with the camera enabled. We have also installed a number of packages, such as OpenCV and MediaPipe, to the OS on this micro SD card; at the time of recording, OpenCV works best with the previous 'Buster' OS. Jump to the article link in the description to see a step-by-step process for all of this. We will also set up our Raspberry Pi to run as a desktop, so a power supply, HDMI cord, and the normal keyboard-and-mouse combo are needed too.

Once the booting sequence is complete, you'll be welcomed by the familiar Raspberry Pi 'Buster' OS. I've created a couple of different scripts to do hand recognition; you can find and download all of them by jumping to the bottom of the article page (use the link in the description). With the file downloaded, unzip the contents onto your Raspberry Pi desktop, or wherever you deem appropriate. Each script has been filled with comments, so you'll know exactly what's going on behind the curtains if you're curious.

The first one to check out is the simplest, which you saw at the start of the video, named simplehandtracker.py. All you need to do is right click it and open it up in Thonny IDE, or any Python interpreter you would like. With it open, run the script by pressing the big green run button, and as soon as you do, it will open up a live stream of what the camera is seeing and initiate hand identification. The really sweet thing about this machine-learned system is the way it predicts where it believes your fingers are going to be, even if it cannot see them completely. It's quite incredible how accurately it knows where your fingers are, or where they're going to be.

If we look at the script, we can see we import those two important packages, MediaPipe and cv2; this is part of the method of getting that hand framework drawn onto the live video. Here we can add confidence values and other kinds of settings. For example, right now it's only able to track one hand, but let's stop it for a sec and increase the number of hands to two; when we run the code now, it should naturally be able to deal with two hands. You can increase that number even more, but keep in mind the Raspberry Pi's frame rate will decrease the more computing you have it doing. You can also easily alter the code to spit out the X-Y location of a particular joint or fingertip to the shell. For instance, if we jump into the script and I uncomment this section of the code, it will print to the shell the X-Y coordinates of the tip of my index finger, which is referred to in this code as joint 8. You'll notice that the index finger is given an X-Y coordinate based on where I put it.
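If you're curious what the shape of that kind of script looks like, here is a minimal sketch of a MediaPipe hand-tracking loop of the sort described above. The settings (max_num_hands, the confidence values, landmark 8 for the index fingertip) follow the video, but the variable names and layout are illustrative rather than a copy of simplehandtracker.py:

import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_draw = mp.solutions.drawing_utils

# max_num_hands is the setting mentioned above; raising it tracks more
# hands at the cost of frame rate on the Pi.
hands = mp_hands.Hands(max_num_hands=2,
                       min_detection_confidence=0.5,
                       min_tracking_confidence=0.5)

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # MediaPipe expects RGB; OpenCV captures BGR.
    results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if results.multi_hand_landmarks:
        for hand in results.multi_hand_landmarks:
            # Draw the hand framework onto the live video.
            mp_draw.draw_landmarks(frame, hand, mp_hands.HAND_CONNECTIONS)
            # Landmark 8 is the index fingertip. Coordinates are
            # normalised, so scale by the frame size to get pixels.
            h, w, _ = frame.shape
            tip = hand.landmark[8]
            print(int(tip.x * w), int(tip.y * h))
    cv2.imshow("Hand Tracker", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break
cap.release()
cv2.destroyAllWindows()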
The natural next step is to add some extra functionality to our previous script, so that it can tell when certain fingers are up and also give a total of how many fingers are up and how many are down. I've already added this functionality to another script; you'll find it in the unzipped location named 'Fingers Up or Down.py', along with 'Module.py'. Both are fully annotated for understanding. It works very similarly to the previous script, but also provides a total finger count, up and down, plus specific details on which finger is up and which is down. This statement then gets printed to the shell: five fingers up, zero fingers down, and the thumb, index, middle, ring and pinky tips are all fingers it can see. If I give it a rock and roll sign, this way so you can keep track of what's going on, it knows my pinky finger and my index finger are up, so two are up and three fingers are down. Now it knows four fingers are up and one finger is down, and it identifies them all correctly.

The way this script works is that it knows the coordinates of each joint, and if it determines that a fingertip is below its first knuckle, it states that the finger is down. As it currently stands, the script works for a single hand, but it can be altered to accommodate more; you will notice an increased delay when doing so. Also, as an Easter egg, there's an added text-to-speech feature: whenever you press Q on the keyboard while the script is running, the Raspberry Pi will enunciate exactly what it sees, letting you know exactly how many fingers are up and how many are down. Allow me to demonstrate: "You have five fingers up and zero fingers down. You have two fingers up and three fingers down."
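Under the hood, that up-or-down call is a simple geometric test on the landmark coordinates. Here is a minimal sketch of that idea, assuming the landmarks have already been converted to (x, y) pixel tuples by a tracking loop like the one earlier. Indices follow the MediaPipe hand model; the thumb test (x-based, valid for one hand orientation) is a common variation and an assumption, not necessarily the exact rule in the downloadable script:

# Fingertip landmark indices for index, middle, ring and pinky.
FINGERS = {"Index": 8, "Middle": 12, "Ring": 16, "Pinky": 20}

def fingers_up(landmarks):
    """Return the list of finger names judged to be raised."""
    up = []
    # Thumb: compare the tip's x to the joint below it (landmark 3).
    if landmarks[4][0] > landmarks[3][0]:
        up.append("Thumb")
    # Other fingers: a tip (e.g. 8) sitting above the knuckle two
    # landmarks back (e.g. 6) means the finger is up, because image
    # y-coordinates grow downward.
    for name, tip in FINGERS.items():
        if landmarks[tip][1] < landmarks[tip - 2][1]:
            up.append(name)
    return up

# Example use, with 'points' being one hand's landmark list from the
# tracking loop: up = fingers_up(points); len(up) is the up count and
# 5 - len(up) is the down count.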
Naturally, the next step is to use our hands to control the general purpose input and output (GPIO) pins on our Raspberry Pi. GPIO pins are the doorway to controlling an almost endless number of sensors, motors, actuators, or signals. So I'm going to wire up this 4x4 GlowBit matrix, add a few lines of code to our previous script, and we should have a hand-activated light system. You'll find this script under the name 'Glowbit Gesture Control.py'. Depending on how many fingers I raise in front of the Raspberry Pi camera while the script is running, it will change the colour of this matrix right here.

Allow me a second to dive into the script so you can see exactly how I altered the fingers up or down script to add this GlowBit functionality. The first difference is at the very top of the script, where I've added two new import statements. The first imports all the functionality of the GlowBit library, which makes it easy to light up this 4x4 matrix in whatever colour we want, and the second imports the general purpose input and output library, which enables us to send data through specific GPIO pins on our Raspberry Pi. The next line, GPIO.setwarnings(False), just keeps the shell clear of any warning messages the GPIO library might want to output. The line under that is the first to take advantage of the added GlowBit functionality; in it we determine the general behaviour of the 4x4 matrix. A brightness value of 255 means we will be maxing out the brightness this little guy can output, and rateLimitFPS = 80 means the refresh rate of the matrix will be 80 frames per second.

After that, all the code is exactly the same until we get closer to the bottom. Scrolling down, note that inside the while True loop I have created a number of if statements that have a chance of running on every iteration. Let's follow one of these through: say we hold one finger up to the camera. When this occurs, the first if statement triggers, as the up count will equal 1. Inside it, you can see it creates a variable named c with the value matrix.purple(). This value comes directly from the GlowBit library, which has several colour functions like this that we take advantage of. This line, in combination with the next, makes all the pixels in the matrix be assigned the colour purple. In the other if statements, which trigger for different numbers of fingers up or down, you can see I have chosen different matrix colour values for c, which then get told to fill the matrix in the same way. The final step, after all these if statements, is the matrix.pixelsShow() statement, which refreshes the output of the matrix to the correct colour. If wiring up and coding for a GlowBit is new to you, I have a link in the description that will give you all the information you need.
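To make that concrete, here is a minimal sketch of the GlowBit side, assuming the GlowBit Python library's matrix4x4 class, colour helpers such as purple(), and the pixelsFill/pixelsShow calls behave as in its documentation; the colour-per-count mapping below is illustrative, not the exact one from the video:

import glowbit
import RPi.GPIO as GPIO

GPIO.setwarnings(False)   # keep the shell clear of GPIO warnings

# Full brightness, refreshing at 80 frames per second, as in the video.
matrix = glowbit.matrix4x4(brightness=255, rateLimitFPS=80)

def show_finger_colour(up_count):
    """Fill the 4x4 matrix with a colour chosen by the finger count."""
    if up_count == 1:
        c = matrix.purple()
    elif up_count == 2:
        c = matrix.blue()
    elif up_count == 3:
        c = matrix.green()
    elif up_count == 4:
        c = matrix.yellow()
    else:
        c = matrix.red()
    matrix.pixelsFill(c)      # assign every pixel the chosen colour
    matrix.pixelsShow()       # push the new colour out to the LEDs

A call like show_finger_colour(len(fingers_up(points))) inside the while True loop would then recolour the matrix on every iteration.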
Now, this kind of gesture control can be used to control not only hardware but also software. For instance, let's take our script a step further and create some gesture controls to aid accessibility when watching a video. Open up 'Computer Gesture Control.py'; in it I have coded the ability to start and pause videos and also control the sound. When I lower one finger it starts the video; when I lower two fingers it stops the video; if I lower three fingers it turns on the music; and if I lower four fingers it stops the sound. You will notice some delay using this system.

Initially I had play and pause set to the same keystroke, the spacebar, but this was less than ideal: if you're not fast enough with your hand inputs, you can easily end up with multiple spacebar inputs, which stutters the VLC player. The solution is to open up VLC and create custom hotkeys via the preferences settings. In there you can configure one hotkey whose sole purpose is to play and another whose sole purpose is to pause. The particular ones I created were Ctrl+Alt+B and Ctrl+Alt+N. Almost all programs will let you create dedicated macros in a similar manner.

Diving into the script for this, you can see I added the functionality in exactly the same places as I did for the GlowBit script. The difference here at the top is that I have imported keyboard and subprocess, which add functionality to the system. The keyboard package allows us to execute, in Python, those VLC hotkeys we created to play and pause our video; subprocess enables us to directly alter the system sound to a specific percentage from this script. Scrolling down to the section where, in the previous script, we were controlling the light depending on how many fingers it could see, this section has been remixed. There are still if statements that have a chance to be triggered with every iteration of the while True loop. Looking only at the first if statement I added: if four fingers are seen to be up, it executes a keyboard function, our VLC hotkey, to start playing the video. The next if statement is very similar: if three fingers are shown, it executes the hotkey we created to stop the video. The next if statement activates when it sees only two fingers and turns the volume up to one hundred percent. It starts by setting a variable named volume equal to 100; the next variable, command, may look a little complicated, but mainly it is just formatting to make the data understood when it gets passed to the subprocess function that follows. When that function runs, it results in full volume for the Raspberry Pi system. The final if statement goes through the same process, but for one finger; if it sees that one finger, it turns the volume to zero. I also have a section here, which you can uncomment, that will shut down your Raspberry Pi if it sees zero fingers up. I kept accidentally triggering it, so I commented it out, but it is a great demonstration of how to run a terminal command through Python, so I left it in just for that.

If you continue down this direction, you could definitely control a YouTube video to play, pause, fast forward, or thumbs up, depending on the finger gestures you provide to the camera. There is also potential for true gesture recognition here, but it will require some re-tinkering of my scripts. Hopefully this has got you excited to try this out for yourself, tinker with some code, and make something really rad in your Makerverse. This hand recognition system is absolutely raring to be expanded upon and taken to some really amazing places. So, until next time, stay cozy.
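For reference, here is one final minimal sketch of how the media-control pieces described above could fit together. It assumes the custom VLC hotkeys Ctrl+Alt+B (play) and Ctrl+Alt+N (pause) from the video, and that the ALSA mixer control on your Pi is named "Master" (on some setups it is "PCM" or "Headphone"); the function names are illustrative, not taken from Computer Gesture Control.py:

import subprocess
import keyboard   # note: the keyboard package usually needs sudo on Linux

def play_video():
    keyboard.press_and_release('ctrl+alt+b')   # custom VLC "play" hotkey

def pause_video():
    keyboard.press_and_release('ctrl+alt+n')   # custom VLC "pause" hotkey

def set_volume(percent):
    # Format the target volume for amixer, then hand it to subprocess,
    # mirroring the volume/command pattern described above.
    command = ["amixer", "-q", "sset", "Master", "{}%".format(percent)]
    subprocess.call(command)

# Example mapping, mirroring the if statements in the script: four fingers
# up -> play_video(), three -> pause_video(), two -> set_volume(100),
# one -> set_volume(0). A shutdown on zero fingers could similarly be
# subprocess.call(["sudo", "shutdown", "now"]) — kept commented out!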
Info
Channel: Core Electronics
Views: 63,168
Keywords: real-time hand gesture recognition, hand keypoint detection opencv Raspberry Pi, RASPI, Raspberry single board computer hand detection, hand gesture application, accessibility using hand tracking software, open-cv and MediaPipe, Cvzone, hand gesture recognition project, applications, human hand-detection, vtuber, multiple hand tracking, hand identification and control using MediaPipe, Control Media using hand tracking, GPIO control using hand and computer vision, tensorflow
Id: a7B5EZVHHkw
Length: 13min 1sec (781 seconds)
Published: Mon Dec 20 2021