Automagically summarize repo activity with a bash script (ft. mods + pop)

Captions
It's a lot of work trying to keep up with what's happening in your code repos, let alone what to focus on. Here is where AI can come in handy. Personally, I don't think that AI is a great tool for doing your job for you, but it can give you a really good base to work off of. For those who don't know, GitHub has a CLI that allows you to see any of your GitHub-related activity for a certain repo through your terminal. I use it quite heavily in my workflow; I actually can't remember the last time that I created a pull request through the web interface instead of just using gh pr create. Dov, an OG Charm community member, even created an entire dashboard with the GitHub CLI to view all of your activity in GitHub right from your terminal. So: we can access our GitHub issues from our terminal, and we have an AI chatbot that also uses a CLI. Why not just write a little script that can summarize the key trends happening in issues and therefore help you figure out where you need to target your attention? Better yet, let's get those summaries sent via email so that you've got a record that's timestamped, and you now have to stay accountable to your responsibilities, whether you like it or not.

For this thing to work, you've got a few things to install. You've got to have pop, which lets you send emails from your terminal, and you also need mods, which is the AI chatbot that is scriptable. Both of these require environment variables to be set. You can add these environment variables either to your script or to your RC files, whether that's .bashrc, .zshrc, or whatever other shell you use; then, if you want to access them from your existing terminal session, you can just source that RC file or start up a new terminal session and you'll be good to go. Because I keep my dotfiles public, I like using skate to protect my API keys, to protect me from myself mostly, so I'll include an example here of what the comparison looks like between defining the environment variables normally and using skate to obscure that sensitive information. Skate is end-to-end encrypted, and data gets encrypted on your machine, in case you're worried about that.

Some of you might not have heard of mods or pop, so let me explain that real quick before we get into the actual meat of the script. mods is a command-line interface for interacting with AI chatbots. You can tie it in with any OpenAI-compatible endpoint, or even local AI models, so you can use your own self-hosted models. It uses GPT-4 by default, and it also has some ports that it expects by default, such as 8080 for your local model to run on; you can read more about that in the description if you're curious. pop is a terminal-based program that allows you to send emails either from the command line or through a TUI, so right from your terminal you have the option to send through Resend or with a custom SMTP setup. In either case you're going to have to set some environment variables, so I will attach a link to the blog post that has all these details written out in case you prefer to follow a written format, but I'll also put them on the screen here so that you can see what's up.

All right, so now we can actually put it all together. Step one is that you set up your OpenAI API key environment variable. Step two is that you have whatever environment variables you want for pop set up, whether that's Resend or your custom SMTP server. Then we run the script. What's happening in the script: as you can see, I am defining the API keys in the script itself, because it's always going to need those environment variables. Then I'm checking that you've provided an argument; the argument is the path to whatever repo you want to get information from. If you have provided a path to a repository, we go to that repo, retrieve all the issues, sort them by bug reports and enhancements, and then send the summary via email with pop.

And you know, because it's always fun to layer on the complexity like a cake, or dip, or whatever's layered like an onion: what if we wanted to schedule this thing to happen on a regular basis? The way we do that is with cron. Now, as soon as I mentioned cron, I had some people say systemd timers are what you should be using, that's what all the Giga Chads on the internet are using nowadays, so you're actually behind the times, Bunny. And I said, wow, that's tragic. But anyway, I'm going to teach you about cron jobs, because they are the more popular and more widely accepted solution; systemd might be better in some ways. For those who don't know what cron is: cron is a daemon that is used to execute scheduled commands, mainly on Unix systems. It comes pre-installed on most Unix systems, I think; okay, maybe I'm spreading misinformation, I don't know if it still comes pre-installed, but it always comes pre-installed with my Linux distros, so it should come pre-installed. If you're a Windows user, you can also use WSL, the Windows Subsystem for Linux. Another option on Windows is to use Windows Task Scheduler to automate your tasks, but I'm not going to get into that here.

To configure this thing you use crontabs, which are files that have all the information on what you need to execute: the interval at which the script will run, the path to the script, and potentially logging information. Cron jobs don't execute in your regular shell, so while before it might have been okay to set your environment variables in your RC files, your cron jobs won't have access to that. If there are any environment variables that you need to pass forward, you do that in your crontab file. In my case I'm depending on my GOPATH, because that's where the binaries for pop and mods are installed (I used go to install them), so I had to add my PATH and my GOPATH to carry those environment variables forward into the crontab. This is part of where I found it helpful to define the environment variables in the script itself, because the cron job will have access to those, given that they're in the script.

A few things I did to clean that up: making sure that my script was executable on the system; moving it to usr/bin so that it's executable from anywhere and just shows as repo summary; and moving my logging (because I do want logging) to /var/log, which on Linux systems is typically where programs put their logs. I did have to change the permissions on the directory that I created within /var/log, because it is a protected directory, so keep that in mind; my crontab wasn't able to write to that directory right away because it needed permission to write to it. I'll include all of the information that you need to get started with that.

And of course, because we're layering on the complexity, why not go all the way: I was looking at over-engineering this with containers. The goal was to maybe get a plug-and-play experience: you could just run the container, have a cron job running inside it, and when you don't want that thing to run anymore, you just shut down the container. It would also potentially make it easier to export all of our environment variables from the host to the container, and it would really centralize the dependency management; we wouldn't have to install all this stuff, it would only be installed in the container. Dockerfiles are so good for this, so why not try to containerize it? Yeah, so this ended up not being a good idea, and I realized that after I got it working. I mean, it was fun to explore, okay, don't judge me. It was not the right solution, because the GitHub CLI requires a git repo to run properly, and even if you were to mount your git repo as a volume in your container, there's some weird ownership issue that you get. It's not that much benefit for the amount of setup that it takes; it's easier to just not containerize it. So I didn't think there was a ton of value added there, but I did include the Dockerfile, because I knew that some of you would be curious about it.

If you want, I can walk you through some of that. It's essentially all of the steps that I'd outlined before: I copy over the script, I make sure that the script is executable, I copy over my PATH, and I install pop and mods as dependencies. I also install the GitHub CLI, and I clean up the cache after each time the container gets run. I set the work directory to whatever repo it is that I want to follow, so whenever it starts up it already knows what I want from it, and I have it run the script when it starts as well. Given that Dockerfile, I have to do a few extra things when I run the docker run command; I do have to provide some additional information to make it available at runtime. This includes: one, making it an interactive terminal; two, mounting whatever repo I want to get the information from as a volume for the container; and three, passing all of my environment variables as arguments. Then finally I provide the image ID, and I say we're going to run repo summary, and we're going to run it in the work directory, which is just, like, slash — that will be the current directory once we're in the container. If you do want to challenge yourself a little bit, you can try getting a cron job set up in the Docker container, if you're, you know, a real go-getter. Let us know what you think in the comments, let us know if you try it out. Otherwise I'll see you in the next video, where we'll explore some other thing and over-engineer it and see how we go. Also check out this video for, like, mods or pop or something — okay, I'll put something cool in the corner here and you've got to click it.
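A crontab entry along the lines of what's described above could look like this; the paths, schedule, and log filename are made up for illustration (edit your crontab with `crontab -e`):

```
# Variables first — cron won't read your RC files, so PATH and GOPATH go here.
PATH=/usr/local/bin:/usr/bin:/bin:/home/you/go/bin
GOPATH=/home/you/go

# Every Monday at 09:00, summarize the repo and append all output to the log.
0 9 * * 1 /usr/bin/repo-summary /home/you/projects/some-repo >> /var/log/repo-summary/summary.log 2>&1
```

Since /var/log is protected, the log directory has to exist and be writable by the user the cron job runs as, which is the permissions issue mentioned above.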
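The Dockerfile steps described above could be sketched roughly like this. The base image, package names, and paths are all my guesses, not the actual file from the repo:

```
# Sketch only — base image, versions, and paths are assumptions.
FROM golang:1.22-alpine

# Install the GitHub CLI and git without keeping the package cache around.
RUN apk add --no-cache github-cli git

# Install pop and mods with go; their binaries land in /root/go/bin.
RUN go install github.com/charmbracelet/pop@latest && \
    go install github.com/charmbracelet/mods@latest
ENV PATH="/root/go/bin:${PATH}"

# Copy the script in and make sure it is executable.
COPY repo-summary /usr/bin/repo-summary
RUN chmod +x /usr/bin/repo-summary

# The repo to summarize gets mounted here at runtime.
WORKDIR /repo

CMD ["repo-summary", "/repo"]
```

Even as a sketch, this shows the ownership problem: the mounted repo belongs to the host user, while everything inside the container runs as root.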
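The comparison between plain environment variables and skate might look something like this. The key values are placeholders, and I believe the Resend/SMTP variable names are the ones pop documents in its README, but double-check there:

```shell
# Plain version — the secret sits in your RC file in cleartext:
export OPENAI_API_KEY="sk-placeholder"        # placeholder value

# pop's variables (names per pop's README; verify against the current docs):
export RESEND_API_KEY="re-placeholder"        # if sending through Resend
# export POP_SMTP_HOST="smtp.example.com"     # ...or a custom SMTP setup
# export POP_SMTP_USERNAME="you@example.com"
# export POP_SMTP_PASSWORD="app-password"

# skate version — the secret lives in an encrypted local store instead,
# so public dotfiles never contain the key:
#   skate set openai-api-key sk-placeholder   # one-time setup
# export OPENAI_API_KEY="$(skate get openai-api-key)"
```

With the skate version, anyone reading your public dotfiles sees only the `skate get` call, not the key itself.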
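A minimal sketch of a script like the one described above. The prompt wording, issue limit, email addresses, and the assumption that pop reads the message body from stdin are mine, not the exact script from the video or blog post:

```shell
#!/usr/bin/env bash
# repo-summary (sketch): summarize a repo's GitHub issues with mods, email with pop.
# Flag values, prompt, and addresses are illustrative assumptions.

# Pure helper so the prompt is easy to tweak (and test) in one place.
build_prompt() {
  printf 'Summarize the key trends in these GitHub issues, grouped into bug reports and enhancements.'
}

summarize_repo() {
  local repo_path="$1"
  cd "$repo_path" || return 1                # gh needs to run inside a git repo
  gh issue list --limit 100 --state open |   # fetch the open issues
    mods "$(build_prompt)" |                 # hand them to the AI for a summary
    pop --from "me@example.com" \
        --to "me@example.com" \
        --subject "Issue summary: $(basename "$repo_path")"
}

if [ "$#" -ne 1 ]; then
  echo "usage: $(basename "$0") <path-to-repo>" >&2
else
  summarize_repo "$1"
fi
```

Splitting the prompt into its own function keeps the pipeline readable and makes the "what am I actually asking the model" part a one-line edit.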
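And the docker run invocation with the three pieces of runtime information listed above — interactive terminal, volume mount, forwarded environment variables. The image name and repo path are placeholders; building the command in an array and printing it keeps this an inert, inspectable sketch:

```shell
# Hypothetical image name and repo path; `-e VAR` with no value forwards
# the variable from the host environment into the container.
repo="$HOME/projects/some-repo"

cmd=(docker run -it \
  -v "$repo:/repo" \
  -e OPENAI_API_KEY \
  -e RESEND_API_KEY \
  -w /repo \
  repo-summary-image repo-summary /repo)

# Print the assembled command instead of running it, so the sketch is safe.
printf '%s\n' "${cmd[*]}"
```

Forwarding variables with bare `-e VAR` (no `=value`) avoids ever writing the secrets into shell history or a compose file.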
Info
Channel: Charm CLI
Views: 3,274
Id: xQcUC0OAjL0
Length: 9min 49sec (589 seconds)
Published: Thu May 02 2024