FastAPI & Alembic - Database Migrations in FastAPI apps

In this video we're going to add the Alembic package to our FastAPI project. Alembic is a package that allows you to create and manage migrations within a Python application, and we'll see how nicely Alembic plays with the SQLModel package that we installed in the last video. Whenever we make changes to our model classes, we're going to set up Alembic so that it detects those changes and automatically generates the appropriate revision file in response. Having a proper migration setup using something like Alembic is a really important part of application development: you need a way to manage the changes you're making to the underlying database, and a way to track the historical revisions that have been applied to it. So if you're enjoying this content, give the video a thumbs up, and it would be greatly appreciated if you'd subscribe to the channel if you've not already done so. Let's dive in.

Let's start with a quick look at the Alembic documentation. You can see that it's a lightweight database migration tool for usage with SQLAlchemy, and that includes the SQLModel package that we're using with FastAPI, which builds on top of SQLAlchemy, so we can use Alembic with that as well. Now, a note about pronunciation: I don't know the best way to pronounce these tools, so if you have an opinion, let me know in the comments; I'm going to keep saying "SQLAlchemy" and "SQLModel" because that rolls off the tongue a little better in my opinion. We did a video on Alembic previously, which should be appearing on screen now; if you're not familiar with Alembic at all, that's a good introduction to the package. To get started in this video, we go to the installation section of the docs, which shows that Alembic is installed into a Python virtual environment using pip. Let's copy the command and go over to VS Code, where we have the project open.

This is an existing FastAPI project with database connectivity already implemented. We're using SQLModel to create all of the tables in the database, and the database we're using at the moment is SQLite. Ideally, we want to manage the database, and any changes to its schema, using Alembic or another migration tool. So in the virtual environment we paste in the command we copied, `pip install alembic`, and run it to install Alembic. Once that's completed, we can clear the terminal, and there's another command we can run to set up migrations in this project: `alembic init`. We can give the Alembic migration environment a name; I like to call it `migrations`. Once we execute that, it creates a directory called `migrations` in the root of the project, along with some files within that folder, and it also creates a file at the root of the project: the `alembic.ini` file. If you want to learn more about the contents of this file and the generated folder, check out that earlier video on Alembic.

What we're going to do now is use the models defined in this FastAPI application. The models that have `table=True` are SQLModel classes, and we want to manage any changes to these classes using Alembic. For example, if we look at the `Album` class here, it represents album data, and there will be an `album` database table under the hood. At the moment the album has an ID and a foreign key relationship to a band, and because of the structure of these classes, it also inherits a couple of extra properties. Let's say we wanted to add a new field to this class and to the underlying database table: we want to manage that change with Alembic, so that when we run Alembic commands, we want to autogenerate a migration
file, and when we apply that migration it changes the underlying database. Let's see how to do that.

If you look at the `migrations` directory on the left-hand side, Alembic has generated a file called `script.py.mako` (again, I don't know exactly how that's pronounced; I'm going to say "Mako"). In this Mako file we're going to import `sqlmodel` at the top. You can see that when you run `alembic init`, there are a couple of imports already in this template: from the Alembic package we import `op`, which stands for "operation", and we also import SQLAlchemy as `sa`. In order to autogenerate migrations from SQLModel classes, we add `import sqlmodel` at the top of this file as well. If you're wondering what Mako is, it's a template library for Python: the contents of this template are injected into the revision files that Alembic generates, so we're essentially defining a schema for those revision files, and we want `sqlmodel` to be imported as part of each migration.

The next thing we need to do is go to the `env.py` file. This is a regular Python file, and at the top, we import the `SQLModel` base class from the `sqlmodel` package, and underneath the Alembic import, we import the two models from our `models` package that are defined with the `table=True` keyword argument: the `Album` and `Band` tables. I say tables, but these are actually model classes; they're defined in `models.py`, where this is one of them and the `Band` model is here at the bottom. So we're importing them into Alembic's environment file and essentially telling Alembic about these models. Next, we find a variable in this file called `target_metadata`, where we can provide the model metadata object: for example, if we had a model called `MyModel` in the application, we could reference its metadata and set `target_metadata` to that. We've imported the `SQLModel` base class, so we can reference the `metadata` property on it, and that pulls in the metadata for all models (not just tables) that have `table=True`; again, this tells Alembic about those model classes via the `target_metadata` variable.

There's one more thing we need to do before we can actually generate a revision, or migration, file: we need to tell Alembic what the SQLAlchemy URL is, in other words, the URL that points to our database. To do that, we can go to the `alembic.ini` file and search for a property called `sqlalchemy.url`. The value of this property should point to the database we're actually connecting to: for example, a local SQLite database, or a database on a remote server, such as Postgres or MySQL. What we're going to do instead is define this dynamically in the `env.py` file. So let's go back to that file, and at the top, import the `Path` object from Python's `pathlib` module. We can use the `config` object that's already defined in `env.py` to set properties in the Alembic configuration, and one of the properties we're going to set is the SQLAlchemy URL. Below the `config` object, I'm going to point to our database: starting from the `env.py` file, we take the current directory, go up to the parent directory, which is the project root, and reference the `database.sqlite` file within that directory (I'm going to change this so it doesn't have a "3" at the end). Once we have that path, we can resolve the full absolute path on the file system, and that gives us the path to our SQLite file. After that, we call the `config` object's `set_main_option` function: the first argument is the name of the property we want to set, `sqlalchemy.url`, and the value is defined using an f-string. We're connecting to SQLite, so we use the `sqlite` protocol, which is followed by three forward slashes, and we interpolate the resolved path from the line above into the f-string. Let's save `env.py`: what we've done on these two lines of code is tell Alembic where our database lives by setting the `sqlalchemy.url` property. If you have a simple configuration, say your database has a fixed remote URL, you can set this directly in `alembic.ini` without calling `set_main_option`; but because I'm computing this dynamically in Python using file system operations, we're doing it with `config.set_main_option` in `env.py`. With that, we're now ready to generate the initial revision for this application. In order to generate a migration we can go back to the command line and
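Before moving on, here's a sketch of the `script.py.mako` change described above. Everything except the added `import sqlmodel` line is part of Alembic's default template (the `${...}` placeholder is Mako template syntax, not Python):

```python
# migrations/script.py.mako (excerpt)
from alembic import op
import sqlalchemy as sa
import sqlmodel  # added so generated revisions can reference sqlmodel column types
${imports if imports else ""}
```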
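The `env.py` changes described above might look like the following sketch. The model names `Album` and `Band` and the `models` package come from this project; the exact default contents of `env.py` vary by Alembic version:

```python
# migrations/env.py (excerpt)
from sqlmodel import SQLModel

# Importing the table=True models registers their tables on SQLModel.metadata
from models import Album, Band  # noqa: F401

# Alembic compares this metadata against the live database when autogenerating
target_metadata = SQLModel.metadata
```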
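And for the database URL, the two lines added below the `config` object boil down to the path logic below. This is a minimal sketch: the helper function `sqlite_url` is hypothetical (in the video the logic is written inline), but the `database.sqlite` file name and the `sqlite:///` URL scheme match what's described:

```python
from pathlib import Path

def sqlite_url(env_file: str, db_name: str = "database.sqlite") -> str:
    """Build an absolute sqlite:/// URL for a database in the project root,
    one directory above the migrations folder that contains env.py."""
    db_path = Path(env_file).resolve().parent.parent / db_name
    return f"sqlite:///{db_path}"

# inside env.py this would then be applied with:
# config.set_main_option("sqlalchemy.url", sqlite_url(__file__))
```

Note that with an absolute path the URL ends up with four slashes after `sqlite:` (three from the scheme, one from the path root), which is the format SQLAlchemy expects for absolute SQLite paths.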
we're going to run the `alembic revision` command. Because we want to autogenerate these revisions from the models that we pointed Alembic to via the SQLModel metadata, we pass the `--autogenerate` flag for this, and we can also attach a message to the revision with the `-m` option; it's very similar to what you do when you create a commit using git, where each commit you generate should have a message. I'm going to call this one "initial migration". Let's execute that command, and if we go to the `versions` directory within the `migrations` folder, we can see it has generated this initial migration `.py` file. Let's open that file and look at some of its properties and functions. You can see the revision has an identifier, and the down-revision identifier is set to `None` because this is the initial migration; if we generate a second migration, its down-revision will point to the previous one. One thing to note is that the `upgrade` function here is empty, and so is the `downgrade` function; I'll explain why in a second. The code in the `upgrade` function defines the change you actually want to apply to the database when you run the migration against it, and conversely, the `downgrade` function defines what happens when you want to reverse those changes. The reason these functions are empty is that we already have this database file on the left-hand side, and the tables within it already exist. That's because, if we go to `database.py`, we called this `init_db` function in the last video, and it uses a helper function from SQLModel to create the tables in the database for the SQLModel classes.

What I'm going to do now is delete the `database.sqlite` file, and then, back on the command line, remove the initial migration file as well. After that, we can rerun the `alembic revision` command, which again generates a migration file, and this time we can see what's in it. The `upgrade` function now actually contains some code, so let's look at what's been generated. I'm going to indent some of these operations so we can read them a little better, and let's start with the one at the top: it creates a table called `band` using the schema defined here, with a SQLAlchemy column each for the name, the genre and the ID. We use the `op` object, imported at the top from Alembic, to call the `create_table` function, and we do that for both the `band` and the `album` tables. Each SQLAlchemy column has a data type, for example a string, an enum type or an integer, and these types come directly from the model classes: if we look at the `Band` model, we have an ID of type integer, which is also the primary key. Back in the migration file, you can see the primary key constraint has also been added when we generated that revision, and we get some other constraints too; for example, some fields in the `Album` model are nullable and others are not. When we run the `alembic revision` command with the `--autogenerate` flag, Alembic looks at the `models.py` file and at the SQLModel classes (if you're using plain SQLAlchemy it works exactly the same way), and it creates the tables by introspecting those classes and looking at the properties and data types on each one.

Now, if we look at the `database.sqlite` file, and I'm using the SQLite extension for VS Code here to do this, you can see we only have one table, called `alembic_version`. So back in the migration file we have a couple of `create_table` statements; how do we actually apply those to the database? Before we do that, let's quickly look at the `downgrade` function, which is for reversing a migration. For this particular migration, reversing it is very simple: we call the `drop_table` function on the Alembic `op` object. So, back to the `upgrade` function: how do we apply these changes and actually create the tables in the database? We return to the command line and the Alembic command-line tool. We used the `revision` command to generate the file; what we use now is a command called `upgrade`, and if we want to upgrade the database based on the most recent revision that was generated, we can run `alembic upgrade head`. Let's run that command and look at the database again: if we refresh this SQLite database, we can now see we have an `album` and a `band` table.

Next, in the FastAPI code, I'm going to very quickly remove one of the functions that we defined. Back in `main.py`, we defined this lifespan event that called `init_db`; we no longer want to manage the database from FastAPI, we want to manage it completely using Alembic migrations. So let's remove this function, remove the lifespan event that's passed into the `FastAPI` object, and remove the import of the async context manager at the top, and
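The autogenerated initial revision discussed above has roughly the following shape. This is a hedged sketch rather than the video's exact output: the revision identifiers are placeholders, the genre enum values are assumptions, and the `album` table's columns are elided:

```python
# migrations/versions/xxxx_initial_migration.py (sketch)
from alembic import op
import sqlalchemy as sa
import sqlmodel  # available because we added it to script.py.mako

revision = "a1b2c3d4e5f6"   # placeholder identifier
down_revision = None        # None because this is the first revision

def upgrade() -> None:
    op.create_table(
        "band",
        sa.Column("id", sa.Integer(), nullable=False),
        sa.Column("name", sqlmodel.sql.sqltypes.AutoString(), nullable=False),
        sa.Column("genre", sa.Enum("ROCK", "METAL", name="genrechoices"), nullable=False),
        sa.PrimaryKeyConstraint("id"),
    )
    # a similar op.create_table("album", ...) call follows for the Album model

def downgrade() -> None:
    op.drop_table("album")
    op.drop_table("band")
```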
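For reference, removing the lifespan hook from `main.py` looks something like the sketch below; the original handler's exact shape in the project may differ slightly:

```python
# main.py — before: FastAPI created the tables itself on startup
# from contextlib import asynccontextmanager
# from database import init_db
#
# @asynccontextmanager
# async def lifespan(app: FastAPI):
#     init_db()  # create_all() on startup — now handled by Alembic instead
#     yield
#
# app = FastAPI(lifespan=lifespan)

# after: the schema is owned entirely by Alembic migrations
from fastapi import FastAPI

app = FastAPI()
```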
everything should still work as expected, but we're now managing the database using the migrations. I want to very quickly test one of the endpoints, so let's send a POST request to create a new band in the database. In order to do that we need to start the server, so let's run uvicorn and point it to our `main` app. With the server running, we can send that POST request, and on the right-hand side we get back the data for the newly created band, so everything is still working. We'll stop the server at the bottom, and I'm going to close this response page.

Let's now demonstrate an example of why these migrations are so useful when you're working with model classes. I'm going to clean this up a bit, close all of the files at the top, and go back to `models.py`. The typical workflow when you're developing an application is that you have a model and you realize something has to change: you need to add a new field, remove a field, or add a new relationship from one model to another. Remember, the revision files themselves are autogenerated from our SQLModel class definitions, so when we change a model definition, we want to generate a migration file that encapsulates those changes to our database schema. Let's do that now. We have a `Band` model here, and let's imagine I wanted to add a field that tracks which year the band was formed. We're going to add a new field called `date_formed`, of type `date`, but we also want to make it nullable, so let's give it a union type of `date` or `None`. I think we have `date` imported at the top from Python's `datetime` module, and that Python `date` object will be mapped to the underlying date type in the database. Now, whenever we change a model like this with Alembic, we
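As a sketch, the modified `Band` model might look like this; the field names other than `date_formed` are assumptions based on what's shown on screen (in the project, `genre` is actually an enum type):

```python
# models.py (excerpt)
from datetime import date
from sqlmodel import SQLModel, Field

class Band(SQLModel, table=True):
    id: int | None = Field(default=None, primary_key=True)
    name: str
    genre: str
    date_formed: date | None = None  # new nullable column; maps to a DATE type
```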
need to generate the migration, so let's go back to the command line. Again we run the `alembic revision` command, pass the `--autogenerate` flag, and also pass a message; let's use "added date_formed field to band model", and execute the command. Back in the `migrations` folder, in the `versions` folder within it, we can see the new revision file generated by that command. If we inspect that file, its down-revision points to the previously generated revision file, and of course this new revision has its own identifier. Looking at the `upgrade` function, we have a new operation calling the `add_column` function. The table we're adding the column to is `band`, which was the model we changed; the column we're adding is called `date_formed`, which was the name of the field we added to the model; and for the type in the underlying database we're using the SQLAlchemy `Date` type, which sets the data type of the new column to date. Finally, because we used the union of `date` and `None`, it sets `nullable` to `True`. If you look at the `downgrade` function, reversing that change is very simple: we call the `drop_column` function and drop the `date_formed` column from the `band` table. One thing to note: if we need to change anything about this autogenerated revision, we can do that here in the two functions, but if you're happy with the code that's been generated, we can go back to the command line, clear the terminal, and run the `alembic upgrade` command (I need to spell "alembic" correctly here). Again we want to apply the most recently generated revision file, so we use the head
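The body of that second revision, as described, comes down to two one-line operations. A sketch, with placeholder revision identifiers:

```python
# migrations/versions/yyyy_added_date_formed_field.py (sketch)
from alembic import op
import sqlalchemy as sa

revision = "f6e5d4c3b2a1"        # placeholder
down_revision = "a1b2c3d4e5f6"   # points back at the initial migration

def upgrade() -> None:
    # date | None on the model becomes sa.Date with nullable=True
    op.add_column("band", sa.Column("date_formed", sa.Date(), nullable=True))

def downgrade() -> None:
    op.drop_column("band", "date_formed")
```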
command, and when we run this, it should add that new column to the `band` table. We can verify that: let's minimize the terminal, bring back the sidebar, and look at the SQLite explorer. If I refresh it and look at the `band` table, we have a single record in this table, and the table now has a new column, the `date_formed` column that we added to the SQLModel class. We generated the revision, and applying that revision added this column, as you can see here, to the `band` table.

The crux of this is that any time we change our models in the application, after the initial setup with Alembic, we can just run the `alembic revision` command and the `alembic upgrade` command to apply those changes to the underlying database. That allows us to manage the database from our application using these migrations, which is a lot easier and a lot more scalable than trying to manage the changes outside of the application. We can very easily keep the database structure in sync with our models by using Alembic with FastAPI and SQLModel, and if you're using a different Python backend package, for example Flask or Litestar, you can do this in a very similar way with those packages.

So that's all for this video. We've seen how to use Alembic with FastAPI, and how we can manage databases using changes to our model classes. If you've enjoyed this, give it a thumbs up and subscribe to the channel if you've not already done so. What we want to do next in this series is actually build an application using FastAPI, so if you have any ideas for what you'd like to see as that application, drop them in the comments and we can consider them for the next step of this series. Thanks again for watching, and we'll see you in the next video.
Info
Channel: BugBytes
Views: 4,050
Id: zTSmvUVbk8M
Length: 18min 33sec (1113 seconds)
Published: Thu Apr 04 2024