Rust & SQL Databases (With Diesel)

Captions
When using a SQL database with your application, there are two patterns you should at least consider using: one is ORM, or object-relational mapping, and the other is automated schema migration. Luckily, Diesel provides both of these for Rust applications. In this video we'll walk through the value Diesel provides and how to leverage it in your application. Make sure you stay until the end for some thoughts on productionizing with Diesel.

Object-relational mapping refers to automating the conversion between SQL structures and the structures of a higher-level programming language, in our case Rust. Doing this can eliminate the need to put SQL queries in your code, which can make things a lot cleaner and less error-prone. It's almost never the case that our database schema stays the same forever: as new features are added to the application, more tables and columns will likely need to be added as well. Manually adding these tables and columns can be tedious and error-prone. Automated schema migration makes schema changes more testable, repeatable, and also reversible. Developers just need to write a migration script for each schema update, and then they can be confident that the change will be applied consistently across all environments, and that it can be rolled back if there's a problem. In this example we'll be using Diesel with PostgreSQL, but it also works with MySQL and SQLite.

In this video we're going to build rustflix. Rustflix lets us manage users, videos, and views of those videos by those users. So we can do things like rustflix video create with a title of "bad movie" and a description of "really bad movie", and it'll create that video; behind the scenes it's storing it in a SQL database. Then we can do rustflix video show, and we can see that the video exists.

One thing to note about Diesel is that it makes use of a command line application. Since we're working with Postgres, we'll need to install the libpq library before installing the Diesel CLI, so we're going to do brew install libpq. Notice that after the install completes, it shows some suggested values for the LDFLAGS and CPPFLAGS environment variables; those are actually for C and C++ compilers. We're going to set the RUSTFLAGS environment variable to the value it suggests for LDFLAGS, so we're going to do export RUSTFLAGS= and copy in that value. When we install the Diesel CLI itself, we don't want to worry about the MySQL or SQLite libraries, so we're going to disable the default features and specify only the postgres feature. Looks like that worked; let's see if we can run the Diesel CLI. Yep.

We already have the skeleton of the rustflix application set up: it's parsing command line arguments, and now all we need to do is incorporate Diesel. We want to be able to specify our database connection URL via an environment variable, so we're going to create a .env file in our package directory and specify the DATABASE_URL there. As long as we use the dotenv crate, we can pull in this environment variable when our program starts.

Before we do anything else with Diesel, we need to run diesel setup. diesel setup does two things: it reads that .env file, connects to Postgres, and creates our database; and it also creates the migrations directory in our project.
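Putting those setup steps together, the workflow looks roughly like the following shell session. This is a sketch: the libpq path is whatever brew prints for LDFLAGS on your machine, and the connection string is a placeholder for your own Postgres instance, not a value from the video.

```sh
# install libpq and point the Rust linker at it
# (copy the path brew suggests for LDFLAGS; this one is just an example)
brew install libpq
export RUSTFLAGS="-L/opt/homebrew/opt/libpq/lib"

# install the Diesel CLI with only the postgres feature enabled
cargo install diesel_cli --no-default-features --features postgres

# tell Diesel (and our application) where the database lives
echo "DATABASE_URL=postgres://localhost/rustflix" > .env

# create the database and the migrations/ directory
diesel setup
```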
Now let's talk about what a migration is. In the context of automated schema migration, a migration is a unit of SQL that describes both how to apply and how to reverse a schema change. The Diesel CLI can help us set up the file structure for a new migration: I can do diesel migration generate videos. The CLI has now set up the file structure for this new migration, and we can see it created a directory inside the migrations directory that's prefixed with a date-time and suffixed with what we named the migration, which in our case is videos. Migrations are run in chronological order, as specified by the timestamps in the directory names. Beyond that, you can arrange migrations however you like. In my case I'm going to create a separate migration for each table this application uses, so I'll have one for videos, one for users, and one for views; we're only going to focus on videos in this video. You could put all of those table creations in one migration if you wanted to; it's completely up to you. The important thing is that migrations run chronologically based on when they were created, so as long as you don't have a migration that depends on something in a subsequent migration, you should be in good shape.

Now let's look at our migrations directory. Each migration has an up.sql and a down.sql. The up.sql file contains the code for applying the schema change, and down.sql does the reverse: if we're creating a table in up.sql, we'll likely be dropping that table in down.sql. This is what helps make migrations repeatable, and reversible if there's a problem. Again, in rustflix we're going to create a migration for each of the tables the application uses, but we'll focus on videos here because the concepts for users and views are basically the same.

So in up.sql we're going to do CREATE TABLE videos, with an id column of type SERIAL NOT NULL, so the database decides the id for each insert. Each video is going to have an id, a title, and a description. We're never going to delete videos from the database, because a view might reference a video that has been deleted and we want to keep it around, so we'll also have a removed flag, which is a boolean. And of course our primary key is going to be the id. Like I mentioned, down.sql needs to undo everything that happens in up.sql, so we're just going to do DROP TABLE videos. That gives us a self-contained migration that's reversible and repeatable; a sketch of both files follows.
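Based on the columns described above, the two migration files end up looking roughly like this. The column types for title and description are assumptions, since the narration only names the fields, not their types.

```sql
-- migrations/<timestamp>_videos/up.sql
CREATE TABLE videos (
  id SERIAL NOT NULL,
  title VARCHAR NOT NULL,        -- type assumed; the video only names the column
  description VARCHAR NOT NULL,  -- type assumed
  removed BOOLEAN NOT NULL,
  PRIMARY KEY (id)
);

-- migrations/<timestamp>_videos/down.sql
DROP TABLE videos;
```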
Now, if we want to test running this migration against our database, all we need to do is diesel migration run. That ran the videos migration, so we should see a table called videos at this point. Yep: nothing's in there, but the table is there. If you want to roll back the most recent migration, you can do diesel migration revert, and we should see no table now. Yep, the table doesn't exist. If you want to verify that a migration is properly reversible, you can do diesel migration redo, which rolls it back and then reruns it, so if there were any problems with the rollback part you'd get errors here.

One really important thing to note: if you're testing your migrations and you later decide you want to add another field to the table, just running your migration again isn't going to do anything. You won't get the new field, because as far as Diesel is concerned that migration has already been run and doesn't need to run again. You actually need to redo the migration, and then we can see we have the extra column we added. This is something that can definitely bite you when you're starting out with Diesel or any other automated schema migration tool.

The other thing that diesel migration run does is generate schema.rs in your source directory, which is generated code that allows you to interact with your tables from Rust.

Now that we understand migrations, let's look at Cargo.toml. We're going to add diesel as a dependency, as well as dotenv for reading environment variables and chrono for dealing with date-times. In main.rs we'll add some mod statements for the modules we're going to create: a db module, the schema module that was auto-generated for us, and a module called models. We also need to add #[macro_use] on extern crate diesel.

Now we'll create db.rs in our source directory, which contains some boilerplate code for establishing a connection to the database. We start off with some use statements: the Diesel prelude, dotenv for loading those environment variables, and std::env. We're going to create one function in this file, called establish_connection, which simply connects to the database specified in the DATABASE_URL environment variable and returns a PgConnection struct for that connection. We grab the DATABASE_URL from the environment variable and then establish a connection to it, and that should do it for db.rs.

Next we'll create the models we'll use to represent the video structure in our Rust code, in a file called models.rs in our source directory. We write a use statement to grab the videos structure from the auto-generated code in schema.rs. The structures in this file leverage some Diesel derive macros that let us pass them into Diesel's functions. The ones we're going to use are Queryable, which indicates that the structure can be used to receive the results of lookups; Insertable, which means the structure can be used to perform inserts; and a third one that's a little less intuitive, AsChangeset, which indicates that the structure can be used to perform updates. Again, we're just going to look at the structures for videos here; users and views use the same concepts.

We're actually going to make two structures for videos. One is for inserting a video: you don't need the id when you're inserting, since that's generated by the database, so this structure has all the fields except the id field. As I mentioned, we derive the Insertable macro, and we use Diesel's table_name attribute to refer to the table this structure is meant to be inserted into. The structure itself is struct NewVideo, and it uses a lifetime annotation, and you'll see why in a second: it has three fields, title, description, and removed, where title and description are string slice references with that lifetime. The reason the lifetime annotation is there is that, in order for this to compile, we need to guarantee that the data title and description point to lives at least as long as the NewVideo struct; otherwise those fields would essentially point to freed memory. The next structure is used both for reading query results and for updating videos, so we derive Debug, Queryable, and AsChangeset. This struct is called Video, and it has four fields: id, title, description, and removed.
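Here is a sketch of what db.rs and models.rs might look like as described above. It assumes the Diesel 1.x-era API that was current when this video was made (the dotenv crate and the string-style table_name attribute); the struct names come from the narration, and the field types follow the migration sketch above.

```rust
// src/db.rs
use diesel::pg::PgConnection;
use diesel::prelude::*;
use dotenv::dotenv;
use std::env;

pub fn establish_connection() -> PgConnection {
    // load DATABASE_URL from the .env file, then connect to it
    dotenv().ok();
    let database_url = env::var("DATABASE_URL").expect("DATABASE_URL must be set");
    PgConnection::establish(&database_url)
        .unwrap_or_else(|_| panic!("Error connecting to {}", database_url))
}

// src/models.rs
use crate::schema::videos;

// Used for inserts: no id field, since the database generates it.
#[derive(Insertable)]
#[table_name = "videos"]
pub struct NewVideo<'a> {
    pub title: &'a str,
    pub description: &'a str,
    pub removed: bool,
}

// Used for reading query results and for performing updates.
#[derive(Debug, Queryable, AsChangeset)]
#[table_name = "videos"]
pub struct Video {
    pub id: i32,
    pub title: String,
    pub description: String,
    pub removed: bool,
}
```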
That should be all we need to start interacting with the videos table from Rust. We have this ops module with a video_ops submodule, and we already have the signatures for the functions we're going to implement. The program is already parsing command line arguments and putting them into structures that get passed to these functions, so all we need to worry about in this file is actually reading from and writing to the database. If you'd like more details on how the command line arguments are parsed into these structures, there's a video linked in the description below that goes over that.

There are a few use statements we need to add: the models we just created from the models module, the establish_connection function we created in db.rs, and of course the Diesel prelude. The auto-generated code in schema.rs also contains a dsl (domain-specific language) submodule for every table we create, so we need to reference that in a use statement as well. Now we can start implementing these functions.

For create, first we just print some debug output to verify that the right thing got passed into the function. Then we call the establish_connection function from db.rs. We use the NewVideo structure from models.rs, because this is going to insert a video, and we populate it with the fields from create_video, the structure the command line arguments were parsed into. At that point we can perform the insert, passing in the connection object returned by establish_connection and outputting an error message if anything goes wrong. That's all we should need for inserting a video: we reference the auto-generated code in schema.rs via the videos structure, pass our NewVideo model to the values function, and then call execute with the connection we established. Fairly straightforward.

Now on to update. This is pretty similar to create, but we use the Video structure instead of NewVideo. We invoke the update function, and, this is really cool, we can reference the videos.find function from the generated code and pass in the id of the video we want to update. Then we call set, because it's an update, pass in the connection, and print a message if there's an error. That should do it for update.

Last but not least, we implement show videos, which outputs all the videos in the videos table. The auto-generated videos structure has a filter function that lets us exclude entries we're not interested in; in our case we filter out any rows where the removed field is true. So we reference the videos table in the auto-generated code, filter out any rows where removed is true, call the load function with the Video structure we created as the type to read results into, and then print the results. That should do it.
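A sketch of those three operations, again in the Diesel 1.x style. The argument structs (CreateVideo here) and the exact function signatures are assumptions, since they come from the command-line-parsing code that isn't shown; the Diesel calls mirror what's described above.

```rust
// src/ops/video_ops.rs (sketch; argument structs are assumed)
use crate::db::establish_connection;
use crate::models::{NewVideo, Video};
use crate::schema::videos::dsl::*;
use diesel::prelude::*;

#[derive(Debug)]
pub struct CreateVideo {
    pub title: String,
    pub description: String,
}

pub fn create_video(create: CreateVideo) {
    // debug output to confirm the parsed arguments arrived intact
    println!("creating video: {:?}", create);
    let conn = establish_connection();
    let new_video = NewVideo {
        title: &create.title,
        description: &create.description,
        removed: false,
    };
    diesel::insert_into(videos)
        .values(&new_video)
        .execute(&conn)
        .expect("error inserting video");
}

pub fn update_video(video: Video) {
    let conn = establish_connection();
    // find the row by id, then apply the struct as a changeset
    diesel::update(videos.find(video.id))
        .set(&video)
        .execute(&conn)
        .expect("error updating video");
}

pub fn show_videos() {
    let conn = establish_connection();
    // skip anything flagged as removed, then read rows into Video structs
    let results = videos
        .filter(removed.eq(false))
        .load::<Video>(&conn)
        .expect("error loading videos");
    println!("{:?}", results);
}
```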
Now let's build and test this. First, rustflix video create help: it says we need to specify a title and a description. rustflix video create with a title of "good movie" and a description of "really great movie": it says we created the video. rustflix video show: cool. Let's see if it shows up in the database. Yep, looks like it's there. Looks good.

I do have a few thoughts on working with Diesel in a production environment. First, you probably don't want to use the .env file; you probably want to set the DATABASE_URL environment variable via your CI/CD system, so you can have one value for tests and another for production. The other thing: recall that we had to run diesel setup and then diesel migration run. Ideally, in production we want these steps embedded in our code so we don't have to install the Diesel CLI on the production machine. Unfortunately there isn't a programmatic equivalent to diesel setup, although there's some discussion about creating one. But there is a programmatic equivalent to diesel migration run: in the embedded migrations module you can call the run function, and this is nice because you can embed your migrations in your binary, so you don't have to deploy them separately, and you can run them from your code every time your application starts. So it seems like, right now, the idiomatic way to do things is to install the Diesel CLI on your production image, run diesel setup before your application starts, and then call the embedded migrations run function to apply any migrations that haven't yet been performed; a sketch of that approach follows below.

That's a quick run-through of working with Diesel. Let me know in the comments if you're interested in any other SQL crates and maybe how they compare to Diesel. Thanks for watching, and we'll see you in the next one.
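For reference, here is a sketch of the embedded-migrations approach mentioned above, using the diesel_migrations crate as it existed in the Diesel 1.x era (the 2.x API differs, so treat this as an assumption about that version):

```rust
// main.rs (sketch): embed the migrations/ directory into the binary and
// apply any pending migrations at startup
#[macro_use]
extern crate diesel;
#[macro_use]
extern crate diesel_migrations;

mod db;
mod models;
mod schema;

embed_migrations!(); // generates the embedded_migrations module

fn main() {
    let connection = db::establish_connection();
    // run any migrations that haven't been applied yet
    embedded_migrations::run(&connection).expect("failed to run embedded migrations");

    // ... parse command line arguments and dispatch to the ops functions ...
}
```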
Info
Channel: Code to the Moon
Views: 62,168
Keywords: rust, rustlang, sql, postgres, diesel, schema, migration, orm, object, relational, mapping
Id: tRC4EIKhMzw
Length: 14min 59sec (899 seconds)
Published: Tue Apr 26 2022