NEW STUFF in Frigate 13 with my configs and alert automation in Home Assistant.

Captions
It's been a while since I've done a Frigate NVR video, and one of the reasons is that once I installed Frigate, it just runs. I don't have to do anything to it beyond the occasional tweak to a camera setting when the environment changes. Well, the version 13 beta is coming out, and I've been playing with it. There are some changes you'll need to make to your configuration file, plus some new features and updates, so I figured I'd go over those and get everyone ready for when version 13 is officially released. Of course you can run the beta at any time if you want to; I've been doing that for a couple of weeks now with no significant issues. So let's jump into what's new: the features, the breaking changes, my configuration, and then a little bit about the alerting I use with my Frigate NVR.

This video is not going to be a full install of Frigate. You can use the add-on version, the Docker version, or whatever you prefer. I do have a video that shows how I installed Frigate with Docker on my little Dell OptiPlex mini PC, and it's been working fantastically; I've got a Coral TPU on it, which helps a lot. That video is in my blog post, and I'll also link it in this video.

The project has a full writeup on the GitHub releases page, and I've taken bits and pieces of that and highlighted them here so you can reference them quickly. The first thing to talk about is Frigate+. Frigate+ is a subscription-based service you can use with Frigate NVR, and there are two tiers. There's a free tier where you can upload and annotate your clips for training purposes, and you get the Frigate integration. Or you can pay roughly $50 a year, and in addition to uploading, annotating, and the integration, you
help support this open-source project and get 12 model-training credits. The release notes cover all of this: you can have models that are trained from scratch, specifically designed for the way Frigate actually does things, which means better object detection on your images with less resource usage. You can upload your own examples so the model gets tuned for your specific environment, and it's trained to detect a more relevant set of objects for security cameras: person, face, car, license plate, and the big package-delivery providers. With the paid subscription you get 12 model-training credits you can use to train those models for your environment. This is being rolled out in version 13 by invitation; you'll need the Frigate+ subscription, and then you can download these models and reference them in your Frigate configuration. Again, all of this is on the GitHub releases page. In addition to everything I just mentioned, you can submit false positives to Frigate+ for training the models.

Before I go any further, let me mention that Frigate is not sponsoring this; they haven't talked to me about making this video. I just enjoy using Frigate NVR. I was a Blue Iris user for a long time, and that's also great software, but it's resource intensive and requires a Windows PC. I prefer to run something on a small-form-factor PC and not be tied to Windows, which is what I'm doing here with Frigate, and I've never really looked back.

In addition to all that, there's new support for SBCs (single-board computers) and other devices. If you're running version 12 of
Frigate, which is the latest stable version, there's support for Intel GPUs and Nvidia GPUs. With version 13 there is now a community-supported board framework, which allows the community to add support for all kinds of different boards. Right now there's Jetson 4.6 and 5 device support, and a lot of the Rockchip SoCs are supported too, and the number of supported devices will continue to increase as the community builds things out using this new framework.

Now, Frigate's bread and butter is object tracking and motion detection, and there are a lot of updates in this area as well. In version 13 Frigate switched to using Norfair for object tracking, which allows for many improvements in how objects are identified and tracked. Norfair is a customizable, lightweight Python library for real-time multi-object tracking that can add tracking to a detector with a few lines of code, and it's now built into version 13 of Frigate. Motion detection is more efficient, and it also recalibrates when there are big lighting changes, such as a switch from day to night mode (the IR cutover), lightning in the area, or, now that Frigate supports PTZ, big changes in the framing when a camera moves.

The recording timeline is also used to improve detection accuracy: an 8x8 grid is created for each camera and updated daily, so there's always a fresh grid of expected regions that helps the system do its object detection more accurately. It is also now possible to set the minimum amount of time an object needs to be in a zone before
it's considered to actually be in the zone. For example, say you have a driveway next to a street, and somebody puts a foot into your driveway for a second or two while walking down the street. You don't necessarily want to count that person as being in the zone; you want to ignore it. With the new inertia setting, you can set a value that says an object needs to be present for some number of frames before the zone is actually triggered. That's helpful in situations like that, or if you're in a busy area where things briefly pop through the edge of a zone.

So those are some of the object tracking and motion detection improvements. On to recordings. One of my favorite things in this new version is the ability to export recordings, either as a regular-speed video or as a time-lapse, based on a user-selectable date and time range. For example, over in Frigate I can select a camera, say the driveway camera, choose real-time or time-lapse, and make a time-lapse video from, say, Sunday at 12:00 p.m. to Sunday at 6:00 p.m.
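A quick aside before finishing the export demo: the zone-dwell feature described above is configured per zone. As a rough sketch, it might look like this in the Frigate config; the camera name, zone name, coordinates, and the value of 10 are placeholder assumptions for illustration, not settings from my actual config:

```yaml
cameras:
  driveway:            # hypothetical camera name
    zones:
      front_walk:      # hypothetical zone
        # Placeholder polygon (x,y pixel pairs outlining the zone).
        coordinates: 0,900,640,700,1280,900,1280,1080,0,1080
        # Require the object in roughly 10 consecutive detect frames
        # (about 2 seconds on a 5 fps detect stream) before Frigate
        # considers it to actually be "in" the zone. A value of 1
        # makes zone entry immediate.
        inertia: 10
```

Check the exact key placement against the version 13 configuration reference before copying this.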
After a bit of fumbling with the time picker to get the end time set to 6:00 p.m., I submit it, and Frigate creates the export. The time-lapse then pops up in the export section, and once it's created you can grab it with a click; it doesn't actually play in the browser, it downloads the file for you. So instead of having to stitch a bunch of files together in your favorite media tool, which I've had to do in the past (remember, Frigate records in segments, so an hour might mean joining 60 one-minute files), the export makes it super simple to get either a regular recording or a time-lapse for a time range. That's one of my favorite features, because sometimes something will happen and I'll want a time-lapse of a specific window, especially during storms; those are always fun, watching a lightning storm or a wind storm blow through in a time-lapse.

Next, timeline metadata is now stored for specific moments during an event, so it can highlight when an object is detected, enters or leaves a zone, or becomes stationary. Looking at one event on my camera (this one doesn't have any stationary moments), you'll notice across the top, though it's hard to read, that it says a person was detected at a particular time. It shows you where that person is, and you can see that they're in the actual camera area
here, and then they entered the zone. I have a zone that's off to the side, so they entered the frame first, and then entered the actual zone, which is a smaller subset of the camera picture, at this time, and then left the zone down here. So you can see when things happen, and if someone or something comes into a zone you'll also see entries for when they go stationary, resume activity, and so on. It's nice, because if you're not sure what tripped the zone or the event, you can click through these markers and get a timeline of where things happened.

One thing you'll see here, which is almost impossible to read, is a disclaimer: this data comes from the detect feed but is shown on the recordings, and since it's unlikely the streams are perfectly in sync, the bounding box and the footage will not line up perfectly. The annotation offset field can be used to adjust this. In my configuration, which I'll show you later, I set that annotation offset, and you can see my box is actually lined up pretty well. Typically that wouldn't be true, because where the box sits in the detect frame is not where it is in the main recording, so the box might be drawn off to one side or the other. Setting the annotation offset fixes that, and you have to play with the value a little to get it in the right spot.

There are also some efficiency improvements: the recordings maintainer now lives in a dedicated process, recordings maintenance is fully asynchronous, and database writes are more efficient. The recording config can now have a sync-on-startup option enabled, which checks for recordings in the database that are no longer on disk and deletes them. So if you go through and delete a
bunch of files on your disk to make space, the sync will clean up the corresponding database entries for you. It just means startup takes a few extra seconds, especially if you've deleted a lot or have a huge database.

Here's another cool feature: audio event triggering is now supported using YAMNet. YAMNet can listen for hundreds of different sounds; it's a deep net that predicts 521 audio event classes from the AudioSet corpus it was trained on, and it employs the MobileNetV1 depthwise-separable convolution architecture. Don't ask me exactly what that means; the point is that Frigate now includes it in version 13. If you're using Home Assistant with the Frigate integration, this will not work for you unless you're running the 5.0 version of the integration, which is currently in beta. If you do want the sound events, install the beta version, and it will create sensors for the different sounds, which means you can automate on them: a siren, a dog barking, screaming, any of these sounds, assuming your camera has audio enabled and Frigate is picking it up. I haven't played with that yet, but I think it'll be pretty cool to see what it can do.

There are also manual events via the API. You can now create events manually through the API, so if a doorbell rings, a sensor trips, or something happens, say you have a sensor in the backyard that trips, and you want all your cameras to
start firing off, you can do that through this manual events API. You can set the event length, the recording-save preferences, and much more; there's full documentation linked in my blog post and on the Frigate GitHub page.

Frigate now also supports PTZ cameras, and that can be done with basic autotracking. You can see examples of this on their website under PTZ control: some autotracking tests done by the Frigate developers. That'll be fun to play with and learn more about. (I told you I can't talk in this video. I don't know what's going on; I am drinking an energy drink, cherry vanilla, but that's about it.)

All right, breaking changes. This is a major release, going from 12 to 13, so there's always going to be some discussion around breaking changes. Here are some of the highlights. Since the detection model has changed, the default values for motion detection have also changed. It's a different model, so you'll want to start over from scratch: the recommendation is to remove any specific motion-detection values in your config and start tuning and tweaking them again. If you have anything in your config file that deals with motion detection settings, you'll want to reset them, which might be a big change for those of you who have heavily tweaked the settings you're already running. Just keep that in mind before you upgrade.

If you have configured settings for stationary objects and have set that interval to zero, you'll either need to delete it or increase the value, because stationary object validation is now required. You can't just set it to zero to disable stationary objects; you have to set something in this
value. There's more on that in the configuration documentation for the beta; it covers stationary objects, and it notes explicitly that there is no way to disable stationary object tracking with this value, so you need to set that interval appropriately.

Super important: changes to the database have been made. They're redoing the schema, so if you are on version 12 or anything before version 13, make sure you back up your database if you ever want to revert to 12, because the upgrade will overwrite the database and change the schema. It's as simple as copying your DB files somewhere else safe so the upgrade doesn't overwrite your only copy. Also, the default location for the database has been moved; it's now in the config directory. (I say this for the Docker version; I don't know how the add-on version stores things.) This is because a lot of people store their media on a NAS, and moving the database to the config directory speeds things up and prevents lagging writes to the database, which then cause other problems. When you upgrade to version 13 and start Frigate, it will reconfigure things and move the database for you. However, if you're running Docker, you will need to reference the new location in the volumes section of your Docker Compose file, or however you're pointing your mounts at that particular directory: for Docker, the database now lives under the host-mounted config folder rather than where it was before. So it's very important that you deal with these database changes.

The configuration for
retain days needs to follow the record, then retain, then days format, and I'll show you that in my config. You used to be able to put retain days on one line directly under record; now it has to be nested as record, then retain, then days, so it's essentially one more level in the configuration. Make sure you're using the three-level form. I had already changed this long ago in version 12, so I didn't have to do anything, but if you haven't changed it, this will now break your config.

Again, the 5.0 Home Assistant integration is required if you want all the features working. I was able to upgrade to 13 without upgrading my integration, and it works fine except for the sounds and some of the new features. So if you want everything that's now available in version 13 of Frigate, make sure you're on the 5.0 beta, or, if you're watching this later and it's already out of beta, on 5.0 or later. And if you're using TensorRT, it now requires Nvidia driver version 530 or greater.

Okay, that's a lot of stuff about what's in Frigate 13. Let's talk about my config briefly; this video is getting really long, but we'll go through it a little. I'll just talk about one camera that I have, and then you can look at all of it, update it, change it, copy it, and put it in your own config. Obviously you need MQTT for some of this functionality to work, so I have my MQTT setup already in place; that's beyond the scope of this video, but you can look at my past videos that cover MQTT, and you probably already know what you're doing if you're running Frigate.

First there's the annotation offset I talked about: mine is negative 800. They can be
negative numbers, so you'll have to experiment with this value to see how it lines up the red boxes on your videos. Next is my go2rtc stream setup: I set up a stream for a main and a sub channel for every camera and then reference those down in the cameras section. This is my driveway camera: the rtsp path references the go2rtc stream, and my input args are set for that particular camera and for the fact that I'm using RTSP restreaming. This particular role is detect, which is why it's called driveway_sub; if you look at driveway_sub up top, that's this camera with channel 1, subtype 1 (that's the URL scheme for my particular cameras). The same goes for record, which uses the full-resolution stream from the same camera. You don't want to run detect on a full-resolution stream; that's CPU intensive and 99.9% of the time not necessary. Then of course you set your roles, detect and record, depending on what you're doing.

For audio, my recording setting transcodes to AAC so Frigate can record it. My detect section is the width, height, and frames per second of my detect stream, the substream, in this case 640x480 at 5 frames per second. That's set on the camera itself, and Frigate just matches it here. I want to track person, dog, bicycle, cat, and car. I enable snapshots with a timestamp and a bounding box on the snapshots, and I require the object to be in any of these four zones for a snapshot to be taken; otherwise I don't care about it and it won't take one. Crop is true, height is 500, and retention is three days for my snapshots.

Then I define all my zones with their coordinates; this is probably out of order in the file, but that's fine. If you've used Frigate at all, you know how to define these zones, and I have a lot of videos on that, and because this video
is super long, I'm not going to go into how you do that here, but here are all my zones. You can set an inertia value per zone; again, that's the setting that tells Frigate how many frames something needs to be in a zone for it to be considered part of that zone. Mine is commented out here, but essentially, if you want it to be immediate, it's one frame. For the right side I only want to track person, dog, cat, and bicycle; on the left side I added car, because I'm doing a specific automation on that left side of the driveway to turn on an ultrasonic sensor. All the zones are set up here, and it's the same idea for every other zone.

Then there's my motion mask: I mask out the clock overlay on my camera because I don't need to see that. If your camera stream has a clock burned in, mask it out and save some CPU, because every time the seconds tick on that clock, Frigate has to detect the change and see if it's an object it wants to deal with.

Recording is enabled, and I keep it for 3 days, meaning it records constantly; I have 24/7 recording on this particular camera with three days of retention, plus three days of active objects in these particular zones, with a pre-capture of 3 seconds and a post-capture of 10 seconds. So that's one camera. You can read through all of the settings; there's a full configuration file reference that tells you everything you could possibly want to know about setting up Frigate, and feel free to take my config, apply it, and test it as necessary.

The only other things I have in here: the detector is an EdgeTPU USB device, RTMP is disabled (I don't use it because go2rtc works for me), and I enable Birdseye with restreaming turned on, simply so I can continuously stream the Birdseye view on some of my
tablets and use the Frigate card there. You can disable this and set it to false; just keep in mind that restreaming uses a bit more CPU, so if you already have CPU issues, don't turn on Birdseye restreaming, just leave it off.

All right, quickly (again, this is a long video), let's talk about alerts. This is one of the most useful features I find in Home Assistant. With the latest version of Home Assistant, many of these automations can be done via the UI, but I'm going to show you the YAML; I use the non-blueprint version. In your Home Assistant instance you'd basically create a new blank automation and put this into it. I'll go over this particular automation briefly and what it's doing, and if you have questions, ask them on Discord or somewhere I can actually get technical; YouTube comments are great, but it's hard to answer technical questions there when I need to show examples.

Essentially, this automation is titled "frigate driveway with image." It's triggered by MQTT events, which is why you need MQTT enabled, so Home Assistant can pick them up; Home Assistant and Frigate need to be talking to the same MQTT broker. It listens for any Frigate events with a payload for the driveway camera. Then there are some variables I set: before and after. The way the MQTT events from Frigate work, there is a "before" state and an "after" state in each message, and you use both of these to determine how things should operate. I also pull out the camera name, the event ID, the label, the score, and a bunch of other things. The first thing I do is check to make sure that I
haven't already triggered this within the cooldown period. Down at the bottom I set a 90-second cooldown, which means I will not get another alert within 90 seconds. You don't want alerts firing away constantly if you have a busy area or a busy zone, because it'll go crazy all the time, so I give this particular camera at least 90 seconds; you can set whatever value you want, and vary it based on the camera or area. That check has to pass first.

If we're not in the cooldown period, we then check that the payload type is "new," meaning a new event, and that the length of the "before" entered zones is zero, meaning nothing had already entered a zone. These are combined as OR conditions: either it's a new event whose "before" entered-zones list is empty, or the number of "after" entered zones is greater than zero. Then the object has to be in one of the zones I care about: driveway left side, driveway right side, or both. Those two names form a list, and the intersection with the "after" entered zones has to be non-empty, so if the object is in at least one of those zones, the condition is satisfied. In this particular automation I also don't want a trigger if it's a car, because I do something different with cars. If all of these conditions are met the way we asked for them, it's
going to come down here and notify my phone through the Home Assistant app, tell me there's driveway motion, send me a picture of the thumbnail or snapshot from that event, and also give me an actionable URL, so I can click the link and it sends me to the MP4 file that was generated from this event. A couple of other things it does on Android: I give it a channel name, so I can assign a specific notification sound in the Android app, and I set priority high with TTL zero, which means it gets delivered regardless of the sleep status of the phone; the high priority bypasses some of that battery-saving garbage that prevents notifications from getting through. I also send it to my computer through HASS.Agent (the computer name is Superman); in that case I just send a thumbnail, and I get a little pop-up in the standard Windows notification area. The base URL and the action URL are set down here in this blurred-out section: the base URL is where it gets the picture from, and the action URL is where it gets the clip from. They're different because it sends me the image first, and then if I want to actually play it back, it goes to the API events endpoint and fetches the MP4 based on the event ID. So that's basically how that works.

There's also another automation here that turns on my ATOM Matrix light and the ultrasonic transducer, so that when the car pulls into the garage you can tell whether you're far enough in. I have a video on that, also referenced in the blog, that shows how I set it up. All it does is turn on the garage ATOM Matrix switch for two minutes and then turn it back off again.
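Before wrapping up, the zone gating in that driveway automation boils down to a small predicate. Here's an illustrative Python version of the same logic; the zone names, the ignored label, and the exact shape of the event payload are assumptions modeled on how Frigate's MQTT events generally look (`type`, `before`/`after`, `entered_zones`), not a copy of my actual Jinja template:

```python
# Hypothetical names; substitute your own zone names and labels.
WATCHED_ZONES = {"driveway_left_side", "driveway_right_side"}
IGNORED_LABELS = {"car"}  # cars are handled by a separate automation


def should_notify(event: dict) -> bool:
    """Return True for a brand-new event whose object is in a watched zone.

    `event` mimics a Frigate MQTT events payload: a "type" field plus
    "before" and "after" snapshots, each with a label and entered_zones.
    """
    before = event["before"]
    after = event["after"]
    # Skip labels this automation doesn't handle.
    if after["label"] in IGNORED_LABELS:
        return False
    # Only fire for a new event that hadn't already entered a zone...
    if event["type"] == "new" and not before["entered_zones"]:
        # ...and only if it is now inside at least one watched zone.
        return bool(WATCHED_ZONES & set(after["entered_zones"]))
    return False
```

In the real automation the same checks are expressed as Jinja conditions against the MQTT trigger payload, plus the 90-second cooldown, which isn't modeled here.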
I don't need that thing running 24/7; I don't know how long those transducers or that little ATOM Matrix board last, and I don't want to burn them out, so I just turn it on for two minutes, which is enough time to get in the garage and park the car, and then it turns off again. Same idea here: the automation is set up based on those zones, and all of these conditions have to be met before it fires. I also get a notification that there's a car in the garage, so all those things fire off together.

All right, that is a quick rundown of Frigate 13 beta and some of the new stuff, and by quick I mean it looks like this is going to be another one of my long videos. If you're still here at the end, thank you so much for watching, thank you for supporting what I do and for being a part of my mostlychris community, where I tell you about things you may or may not want to know about. And with that, thank you for watching, and thank you for subscribing; if you're not subscribed, just push the button, it's quick. It's all algorithmic baloney nonsense, but it helps the videos get out there so that I can make more. We'll see you on the next video.
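As a footnote on the retain breaking change covered earlier, here's a minimal before-and-after sketch of the nested retention format. The day values mirror the ones from my driveway camera above, but treat the exact key placement as an assumption to verify against the official version 13 configuration reference:

```yaml
record:
  enabled: true
  # The old single-line style (e.g. "retain_days: 3" under record)
  # no longer works; retention is now nested three levels deep:
  # record -> retain -> days.
  retain:
    days: 3            # keep 24/7 continuous recordings for 3 days
  events:
    retain:
      default: 3       # keep event recordings for 3 days
    pre_capture: 3     # seconds recorded before each event
    post_capture: 10   # seconds recorded after each event
```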
Info
Channel: mostlychris
Views: 35,167
Keywords: Smart Home, Home Assistant, frigate nvr home assistant, frigate nvr setup, frigate nvr docker, frigate nvr demo, frigate nvr review, home assistant automation ideas, home assistant automation examples, frigate home assistant notification, home assistant frigate notifications, frigate home assistant notifications, security cameras for home, security camera system, nvr security system, nvr security system setup, best nvr security system 2023
Id: 5s_Kz7chpaU
Length: 32min 40sec (1960 seconds)
Published: Sun Nov 19 2023