Raspberry Pi AI Kit - Unboxing and Installation Guide

Captions
Hey everyone, I'm Gilad, and I lead the makers and developers community at Hailo. Today I'm very excited to introduce Raspberry Pi's new AI Kit, featuring the Hailo-8L entry-level AI accelerator, delivering 13 TOPS at a typical power consumption of around 2 watts. You can find the kit starting today at official Raspberry Pi resellers. With this release we are also launching the Hailo Community platform and opening our Developer Zone to all users; links are in the description. We've put a lot of effort into making the installation flow and examples as straightforward as possible, enabling you to keep data processing local, ensuring your privacy, optimizing performance, and managing costs according to your preferences. All of Hailo's examples are open source, and we encourage you to use them in your projects and products. Today we are releasing three basic pipelines for different tasks: detection, pose estimation, and instance segmentation. These pipelines are built in Python for easy integration. Additionally, Raspberry Pi has integrated Hailo inference into its official rpicam-apps repo, which is Raspberry Pi's C++ camera framework. In this video we will review the Hailo installation flow on the Pi, our new GitHub repository, and the available examples. So let's jump to the Pi.

Okay, so what will we need? A Raspberry Pi 5, the Raspberry Pi AI Kit, a micro-HDMI to HDMI adapter, an active cooler, a Raspberry Pi Camera Module 3 or the High Quality Camera, a Raspberry Pi display cable, and a 27 W USB-C power supply. Let's start building.

Okay, let's start by unboxing the Raspberry Pi 5 and connecting the active cooler; make sure to connect the fan to its connector. Let's prepare the camera, and let's open the AI Kit. The kit comes pre-installed with a thermal pad between the M.2 module and the board, so you will probably not need an additional heat sink if you keep your design open to air and well ventilated. Okay, and we are good to go. Let's get it connected.
So, we are now in a freshly installed Pi OS on the Raspberry Pi. We went to GitHub and searched for Hailo's hailo-rpi5-examples. Let's review what we have here. This is our new repo; it is built to give you examples for the Raspberry Pi 5. We have an installation guide, which we'll review in a minute, and the Hailo examples: the detection example, the pose estimation example, and the instance segmentation example. We will review them soon.

Now let's start with the installation on the Raspberry Pi. Go to the Hailo packages installation section and switch to the Hailo Raspberry Pi installation guide. Open the terminal, and let's start with the installation flow. First of all, you will need to update the Pi. We assume that you are starting from a fresh Raspberry Pi operating system; if you need guidance on how to do that, there are plenty of tutorials online. Make sure to select Raspberry Pi 5 and the 64-bit Pi OS, put the SD card in the Raspberry Pi, and we are ready to update the system. Start by running sudo apt update and then sudo apt full-upgrade; this will install the latest Raspberry Pi kernel, which also includes the Hailo driver support.

Okay, and we are done. Now, in order to achieve optimal performance on the Hailo device, it is necessary to set PCIe to Gen 3. Gen 2 will also work; however, you will see lower performance. The new Raspberry Pi kernel includes the option to switch to Gen 3 in the new raspi-config UI, so let's run sudo raspi-config, go to Advanced Options, PCIe Speed, and enable Gen 3. That's it; now you can reboot your Pi.

Okay, we are back after the reboot, and now we should install the Hailo software. You only need to run sudo apt install hailo-all. This will install the following software components: the Hailo firmware and the HailoRT runtime software (you can find more information in the HailoRT GitHub repo), and, in addition, the Hailo TAPPAS core package, which is a derivative of our TAPPAS repository.
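The update-and-install flow just described condenses to a few commands. This is a sketch, not an official script: the raspi-config menu path reflects the UI shown in the video and may move in later releases.

```shell
# Update the system; the full-upgrade pulls in the latest Raspberry Pi
# kernel, which includes the Hailo driver support.
sudo apt update
sudo apt full-upgrade

# Set PCIe to Gen 3 for best performance (Gen 2 works, but slower):
sudo raspi-config        # Advanced Options -> PCIe Speed -> Gen 3
sudo reboot

# After the reboot, install the whole Hailo software stack
# (firmware, HailoRT, TAPPAS core, rpicam-apps post-processing stages):
sudo apt install hailo-all
sudo reboot
```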
The TAPPAS repository is our application layer, used to develop applications faster with the GStreamer framework. The TAPPAS core package installs only the Hailo GStreamer elements and the post-processing functions; it is used as a dependency for the examples we are showing here. In addition, this will also install the rpicam-apps Hailo post-processing software stages; you can find more documentation in the official Raspberry Pi rpicam-apps repo. Once this is done, we can reboot again, and this concludes our installation.

Back from the reboot, the installation is finalized. Let's verify it by running these commands. hailortcli fw-control identify makes sure that the chip is identified; if you get something like this, you are clear to go. Next, let's check the installation of the TAPPAS Hailo tools; we are okay, and you can exit by pressing Q. The last thing we want to check is the Hailo inference element: this is the GStreamer element which runs inference on the Hailo device. If you got this output, you are good to go. If the Hailo elements or Hailo tools were not found, try deleting the GStreamer registry, especially if you are not running on a clean Pi OS. If everything works, you can go back to the hailo-rpi5-examples. Please see the troubleshooting section if you hit any issues, and join the discussion on the Hailo Community forum; maybe your question was already answered.

Okay, so let's continue and see the demos. Go to the rpi basic pipelines documentation. In order to install the application, clone the repo. Once this is done, cd into the repo directory; now you need to configure your environment. As we said, we are using the TAPPAS core package, and we use its pkg-config file to get the Hailo dependencies. You can set up all dependencies and the virtual environment by sourcing the setup-env script. Once this is done, all your environment variables will be configured and you will be inside the Hailo virtual environment for this demo. When you come back later, you can just rerun the setup-env script.
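The verification steps above can be summarized as follows. The GStreamer element names (hailotools, hailonet) follow standard TAPPAS conventions, and the registry-cache path is the usual GStreamer location on a 64-bit Pi; treat both as assumptions to check against your system.

```shell
# 1. Confirm the Hailo-8L chip is detected over PCIe:
hailortcli fw-control identify

# 2. Confirm the TAPPAS GStreamer plugins are registered
#    (press q to leave the pager):
gst-inspect-1.0 hailotools      # post-processing elements
gst-inspect-1.0 hailonet        # the inference element

# 3. If the elements are missing (e.g. on a non-clean Pi OS image),
#    clear the GStreamer plugin registry cache and re-run the checks:
rm ~/.cache/gstreamer-1.0/registry.aarch64.bin
```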
Rerunning the setup-env script makes sure that everything is configured again; your virtual environment will not be overwritten, we will just reopen the same one. Okay, let's install the Python requirements (make sure that you are inside the virtual environment when running this), and let's run the download-resources script. This will download our network HEFs and a sample video.

While this is running, let's review the application structure. The application is built of three parts. The first one is the user-defined data class. This is a user-defined class which is passed as an input to the callback function that runs on every frame in the pipeline. It is used to communicate between the main application and the callback function. It extends the app-callback class defined in the Hailo rpi-common file, and it can be customized with your specific variables and functions.

The second part is the application callback function. This is where you should add your code. It is a user-defined function that processes each frame in the pipeline. It is called from the identity callback element in the pipeline, which is placed after the network inference and the post-processing. This means that the GStreamer buffer which is the input to this function already includes the network output, as Hailo metadata, together with the frame itself. Each example demonstrates how to parse the specific metadata for its task. For more information on the Hailo metadata objects, refer to the Hailo objects API.

The last part is the GStreamer application class. No changes are needed to this class in order to run the basic pipelines. This class sets up the GStreamer pipeline and handles events and callbacks. It extends the GStreamer app class, which is defined in the Hailo rpi-common file. Applications can modify the network parameters and the pipeline by overloading the get-pipeline-string function. To see more information about how to build pipelines with the TAPPAS infrastructure, visit the TAPPAS documentation.
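The three-part structure described above can be sketched in plain Python. All names here are illustrative stand-ins, not the repo's actual identifiers, and GStreamer itself is omitted; the point is only how the user data class, the callback, and the application class relate.

```python
# Illustrative sketch of the basic-pipelines structure; names are
# hypothetical and GStreamer/Hailo APIs are simulated with plain data.

class AppCallbackBase:
    """Stand-in for the repo's app-callback base class: tracks the
    frame count so the user callback can follow progress."""
    def __init__(self):
        self.frame_count = 0

    def increment(self):
        self.frame_count += 1


# Part 1: the user-defined data class, passed to the callback on every
# frame; extend it with whatever state your application needs.
class UserAppData(AppCallbackBase):
    def __init__(self):
        super().__init__()
        self.persons_seen = 0   # custom per-application state


# Part 2: the application callback. In the real pipeline it receives a
# GStreamer buffer whose Hailo metadata already holds the network
# output; here the metadata is faked as a list of detection dicts.
def app_callback(user_data, detections):
    user_data.increment()
    for det in detections:
        if det["label"] == "person":
            user_data.persons_seen += 1


# Part 3: the pipeline application class. The basic pipelines need no
# changes here; custom apps can overload get_pipeline_string() to
# change the network or the pipeline layout.
class GStreamerDetectionApp:
    def __init__(self, callback, user_data):
        self.callback = callback
        self.user_data = user_data

    def get_pipeline_string(self):
        # Illustrative only: the real string chains the camera source,
        # hailonet inference, post-processing, and display elements.
        return "libcamerasrc ! hailonet ! identity ! autovideosink"

    def run(self, frames):
        # One metadata list per frame, fed to the user callback.
        for metadata in frames:
            self.callback(self.user_data, metadata)


if __name__ == "__main__":
    data = UserAppData()
    app = GStreamerDetectionApp(app_callback, data)
    # Two fake frames: one person, then a person and a cat.
    app.run([[{"label": "person"}],
             [{"label": "person"}, {"label": "cat"}]])
    print(data.frame_count, data.persons_seen)  # prints: 2 2
```

The real base classes wire the callback into the pipeline's identity element; this sketch keeps only the data flow between the three parts.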
Okay, now we are set up and ready to run our first application. Let's start by running the detection example. As you can see, this is a detection application. It uses YOLOv6n by default; it also supports YOLOv8s and YOLOX-s-leaky. Let's run it again and watch htop. As you can see, we are now running at 30 FPS, and the CPU is using about a fifth of its capacity, so you have plenty of CPU left for your own application. You can exit the application by pressing Ctrl+C.

Okay, let's run with the --help flag in order to see the additional options available for this application. We have the ability to control the input: we can run from a file, a USB camera, or a Raspberry Pi camera. We can add the --use-frame flag, which enables some additional post-processing in the callback function. The --show-fps flag will show you the FPS at which we are running. --disable-sync applies when we are running from a file: it makes the pipeline run as fast as possible. It is only relevant when you are working with a file; when you are running from a camera, the pipeline will just run at the camera's speed. The --dump-dot flag is a debug feature that we will review later on. Finally, the network flag lets you select different networks: YOLOv6n, which is the weakest but the fastest; YOLOv8s, which is the most accurate, though the FPS might be a little lower; and YOLOX-s-leaky, which is somewhere in the middle.

Let's run with the video input and add the --show-fps and --disable-sync flags. As you can see, we are now running at 150 FPS, and if you look at htop, the CPU is fully loaded. This is expected, because currently the bottleneck is the CPU, not the Hailo device, so the CPU is limiting us to 150 frames per second. If you want to check the actual speed the network can reach, see the HailoRT CLI tools in the HailoRT documentation.

Now let's see the pose estimation. In order to run it, you can run this line; we are using the YOLOv8-pose network.
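The invocations discussed in this section might look like the sketch below. The script path and the sample file name are assumptions based on the hailo-rpi5-examples layout at the time of the video; check the repo README for the current names.

```shell
# Enter the Hailo virtual environment set up earlier:
source setup_env.sh

# List all available options:
python basic_pipelines/detection.py --help

# Run from a video file as fast as possible, printing the FPS
# (sample.mp4 is a placeholder for a downloaded resource):
python basic_pipelines/detection.py --input sample.mp4 \
    --show-fps --disable-sync

# Trade speed for accuracy by picking a different network:
python basic_pipelines/detection.py --network yolov8s
```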
As you can see, all detected persons are printed to the terminal, and their left-eye and right-eye coordinates are printed as well. Let's switch to the instance segmentation example. In order to run it using a video file, copy-paste this line. As you can see, we are running instance segmentation on the video, and the detections are printed to the terminal.

Okay, now let's see how to run with USB input; we have a specific guide for this. You can run from the Raspberry Pi camera using the rpi input option; to run from a file, you can just add the file path; and in order to use a USB camera, you should add the device location, i.e. the V4L2 device. Note that by default it will usually be /dev/video0; however, this is not always the case, so you might need to check several camera files. You can do that using ffplay; the device will probably be one of the even numbers: video0, video2, 4, 6, or 8. Let's try /dev/video0. Okay, we got it on the first try, this is great. So now we can run the instance segmentation with the USB camera.

Okay, and here we are. New and more sophisticated projects and examples are coming soon. Join our community and follow this channel to get notified. What would you like to see next? Frigate integration, Unity integration, or maybe just a wacky robot? If you have requests, suggestions, or crazy project ideas, please write them down in the comments.
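The V4L2 device probe from the USB-camera section above can be sketched as a small loop. The instance-segmentation script path is an assumption based on the repo layout; ffplay (from FFmpeg) needs a display to show the preview.

```shell
# A USB camera usually appears as one of the even-numbered V4L2 nodes;
# list the candidates that actually exist on this system:
for dev in /dev/video0 /dev/video2 /dev/video4 /dev/video6 /dev/video8; do
    [ -e "$dev" ] && echo "candidate: $dev"
done

# Preview a candidate to confirm it is the camera:
# ffplay /dev/video0

# Then run, for example, instance segmentation from that device:
# python basic_pipelines/instance_segmentation.py --input /dev/video0
```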
Info
Channel: Hailo
Views: 18,043
Id: _aCyR8XJcws
Length: 16min 20sec (980 seconds)
Published: Thu Jun 06 2024