Face & Movement Tracking System Using a Raspberry Pi + OpenCV + Pan-Tilt HAT + Python

Captions
Hey gang, Tim here at Core Electronics. Today we're building a face tracking system for Raspberry Pi. Using a Raspberry Pi camera to search for faces, it's going to lock onto them with the Pimoroni Pan-Tilt HAT, following them around the room. Then we're going to take it to the next level by adding a movement tracking feature and a patrol phase.

Locking your face into the center of the frame means always keeping the action in shot. This video provides code for the Pimoroni Pan-Tilt HAT, but the fully commented code has been written with some logical alterations so that any pan-and-tilt system can be controlled with a Raspberry Pi single-board computer. The intention here is not only to create an easy-to-use face tracking system with a Pan-Tilt HAT, but to do so in a way that can readily be expanded upon, no matter what system or features you choose to use. The backbone of the system is OpenCV machine learning, used to identify faces and to identify movement.

On the table before me are all the components you will need to get the system up and running fast. This video makes extensive use of the Pimoroni Pan-Tilt HAT; if you haven't set one up before, check out our online guide, as it will provide you with all the knowledge you need (link down below). I'm going to be using the official Raspberry Pi Camera Module V2, but you could use any camera module with a similar form factor. I am mounting the HAT to a Raspberry Pi 4 Model B; the extra power these boards have is hugely valuable for computer vision systems like the one we're building towards here. You will also want a micro SD card that has been flashed with Raspberry Pi OS, connected to the internet, updated, and with the camera enabled. We have also installed a number of packages to Raspberry Pi OS through a handful of terminal commands, which can be found in the article and description below. With these components and the usual power supply, HDMI cord, monitor and peripherals, you'll be good to go.

So we've jumped ahead to everything connected, plugged in and powered. The first code I'll demonstrate will get our system right into face tracking. The download link for all the code talked about here is in the description: find and download it from the bottom of the article page, then unzip it to your Raspberry Pi desktop, like I have done over here. Open the code by right-clicking and selecting Thonny IDE, which is what I'm going to do right now on the file facetracker.py. When run, it is going to actively identify faces and attempt to keep the found face in the center of the captured video by panning and tilting the servos. After a second or two you're going to see a live feed from the Raspberry Pi camera on screen. Now I'm going to show it my face, and hopefully it's going to draw a box around my face and use the pan-tilt servos to keep my head centered in the middle of the frame. There we go, it's working straight away. It is interesting to test how well the system locks onto your head and which head orientations it can or cannot pick up; for example, I found that cocking one's head makes the face invisible to the face detection.

Of potential interest to those who have seen our video on facial recognition with the Raspberry Pi: you will notice that the machine learning used to identify faces here is different from the one used in that video. The system used here doesn't identify and label different learned faces, but in exchange it is much faster at finding human faces.
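The transcript doesn't walk through the downloadable code line by line, but the behaviour it describes (detect a face, then nudge the servos so the face stays centered in the frame) can be sketched roughly as follows. This is a minimal sketch only: it assumes OpenCV's bundled Haar cascade for the fast face detection mentioned above and the Pimoroni pantilthat Python library for the servo calls, and the step size, dead zone and camera index are illustrative values rather than the values used in the actual project code.

import cv2
import pantilthat  # Pimoroni Pan-Tilt HAT driver

# The Haar cascade ships with OpenCV; it is much faster than full face
# recognition, but it only finds faces, it does not tell them apart.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture(0)   # assumes the Pi camera appears as a V4L2 device
pan, tilt = 0, 0            # servo angles in degrees, clamped to -90..90
STEP = 2                    # degrees to move per frame (assumed value)
DEAD_ZONE = 30              # pixels of tolerance around the frame centre

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    if len(faces) > 0:
        x, y, w, h = faces[0]                       # track the first face found
        cv2.rectangle(frame, (x, y), (x + w, y + h), (255, 0, 0), 2)
        err_x = (x + w // 2) - frame.shape[1] // 2  # +ve: face right of centre
        err_y = (y + h // 2) - frame.shape[0] // 2  # +ve: face below centre

        # Nudge the servos towards the face; the signs may need flipping
        # depending on how the HAT and camera are mounted.
        if abs(err_x) > DEAD_ZONE:
            pan += -STEP if err_x > 0 else STEP
            pan = max(-90, min(90, pan))
        if abs(err_y) > DEAD_ZONE:
            tilt += STEP if err_y > 0 else -STEP
            tilt = max(-90, min(90, tilt))
        pantilthat.pan(pan)
        pantilthat.tilt(tilt)

    cv2.imshow("Face tracker", frame)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()

Stepping a few degrees per frame, rather than jumping straight to the detected face, is one way to keep small hobby servos from overshooting and hunting around the target.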
So after achieving this, the natural next step was to create a patrol phase for the pan-tilt system. This will be active whenever it doesn't see any faces on screen, and will let the pan-tilt move around its degrees of freedom logically in search of one. Then, to aid this patrol phase further, I'm going to add a feature to automatically swivel the camera towards any objects that it identifies as moving. Whenever a face is seen, it's going to lock onto that face just like before and disregard these two new features. These additions definitely make finding elusive faces much more likely.

So let me show off its features. The code is named facetrackpantilthatpimoroni.py, and I'm going to open it the same way as before in Thonny IDE; the code can be downloaded at the bottom of the article page. As soon as I press this green button, you're going to see the system revert to sentry mode, which is what it does whenever it's unable to see a person or movement, and a live feed is going to appear on your desktop. When movement is identified, like me sidestepping here or moving my hand like this, it places a green circle on the video feed where it approximates the center of the movement to be; the system then attempts to make that center of movement the center of the camera frame. When it finds a face, just like before it snaps to it, paints a blue box around the face on the video stream, and does whatever it can to stick to that face like glue, making your face the center of the action. As soon as it cannot find a face, it returns to sentry mode.

There are a lot of settings you can adjust, and worth checking out is the config.py file for just a small sample of the things you can change. You can find that config.py file right here. Having opened config.py just like before, in this section you can see that you can alter the default positions the servos take in sentry mode. If you scroll down to the bottom of the config file, you can see a sensitivity setting: increasing it will make the system more likely to pick up small movements, but increase it too much and it will produce false readings. The movement detection used here is a similar method to our measuring-speed-with-a-Raspberry-Pi video, so check that out, linked down below. A big thanks goes to Claude Pageau once again, as I based a lot of this on his previous work, particularly the face-track-demo, which I'll also link in the description.
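Again as a rough illustration rather than the actual downloadable code: the sentry-mode behaviour described above (compare successive camera frames and report the approximate center of whatever moved enough) can be sketched with OpenCV frame differencing. The names HOME_PAN, HOME_TILT and MIN_MOVE_AREA below are placeholders for whatever config.py actually exposes; note that lowering MIN_MOVE_AREA in this sketch corresponds to raising the sensitivity setting described in the video.

import cv2

# Placeholder names for the kind of settings config.py exposes:
HOME_PAN, HOME_TILT = 0, -20   # assumed default servo angles for sentry/patrol mode
MIN_MOVE_AREA = 500            # smallest changed region (in pixels) that counts as
                               # movement; lower values = more sensitive, but too low
                               # and noise produces false readings

def centre_of_motion(prev_gray, gray):
    """Return (x, y) of the largest moving region, or None if nothing moved enough."""
    diff = cv2.absdiff(prev_gray, gray)                      # pixel-wise frame difference
    _, thresh = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    thresh = cv2.dilate(thresh, None, iterations=2)          # join up nearby blobs
    contours, _ = cv2.findContours(thresh, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    moving = [c for c in contours if cv2.contourArea(c) > MIN_MOVE_AREA]
    if not moving:
        return None
    x, y, w, h = cv2.boundingRect(max(moving, key=cv2.contourArea))
    return (x + w // 2, y + h // 2)                          # approximate centre of movement

In the main loop this would sit between the face check and the patrol sweep: if a face is found, track it as before; otherwise, if centre_of_motion() returns a point, pan and tilt towards it; and if neither, step the servos back towards HOME_PAN and HOME_TILT and continue patrolling.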
There are a number of excellent places we could take this. For instance, what should happen when the system sees multiple faces? Perhaps giving preference to one face over another based on particular parameters would be a worthy code addition. Or picture the scenario where the people you're interested in are all wearing masks; in that case, instead of running face detection, we'd be better off running an object detection layer and searching just for people. Also, there are many types of pan-tilt mechanisms, and I have attempted to make it so that all of these can be incorporated successfully and painlessly into the code I've created. Many common pan-tilt systems use slow-moving brushless DC motors, and some even use stepper motors; a little tinkering with the code and you'll be off to the races running any pan-tilt system successfully.

So hopefully that has got you planning ways to forge ahead in a new direction. With machine learning there is a lot of magic and an endless amount of creativity possible within this world, and I'm only just getting started. So until next time, stay cozy.
Info
Channel: Core Electronics
Views: 57,525
Keywords: Code to create a sentry turret, Raspberry Pi and pimoroni Pan-Tilt HAT, Pan and Tilt with RASPI, How to do face tracking on Raspberry Pi Board, How to make a camera that can track movement, how to create a camera that patrols, Velocity tracker raspberry pi, machine learning, face identification, facial recognition, human head tracking, targeting human face, Rasp Pi, single board computer, open cv, open-cv, openCV, artificial intelligence, method to create tracking system
Id: T_892SKVNf4
Length: 7min 32sec (452 seconds)
Published: Mon Nov 08 2021