SQLx is my favorite PostgreSQL driver to use with Rust.

Video Statistics and Information

Captions
There are a few Rust crates available for working with Postgres, but the one I've always gravitated to is sqlx (or "SQL-X"; I'm pretty terrible with names).

Naming issues aside, to get started with sqlx we first create a new Cargo project, then add both the tokio and sqlx crates to the Cargo.toml. Make sure to enable the same feature flags I have: `runtime-tokio-rustls` and `postgres`. With the crates added to the Cargo.toml, we can run a quick `cargo check` to make sure everything installs correctly.

Now we can connect to our database. First, let's change our main function to be an async function that is part of the tokio runtime; we'll also make sure that this function returns a Result. Now we can create our database connection using a connection URL, which contains the configuration details of our database instance. I'm assuming you already have a Postgres database set up and you know what those configuration details are; if not, you can check the source code in the description, which has some instructions on how to set up your own Postgres instance. The connection URL contains the database type (postgres in our case), the username, password, host, port, and database name.

Once we've connected, we receive a connection type. We can use this connection type with sqlx's query function to run queries against our database; here you can see we're getting the result of one plus one, which is two.

While having a single connection is fine for getting up and running, it's actually recommended to use a connection pool instead, especially in production. This offers better performance and concurrency when reading from and writing to the database. This is easily done with sqlx by creating a PgPool, which has the exact same interface as a connection, so we can use them interchangeably.

Now this is all well and good, but we're using Postgres as some form of calculator at the moment. What we really want
to do is interact with the data in our database, both reading from and writing to tables. In order to do that, we're going to need to create some tables. The recommended way to do this when working with SQL databases is to use database migrations, and fortunately sqlx provides migration support out of the box using the migrate macro.

First, we create our migrations in a directory of our choosing; in this case we're going to use the `migrations` directory in the root of our Cargo project, which is where sqlx searches for migrations by default. In this directory we create a new file which is going to hold our initial database migration. The name of this file has to follow the pattern `<version>_<description>.sql`. The version in this case is going to be 0001; versions have to be sequential so that sqlx knows in which order to run your database migrations. Inside the .sql file we have our CREATE TABLE statement, which is going to create a new books table for storing data on books.

Back in our main code we can call the migrate macro with our database connection. Next we run our code and check our database to see that the table was created. There is one caveat with using sqlx for migrations, as mentioned in the documentation: in order to pick up changes in the .sql files without code changes, a cargo build script needs to be added. Fortunately, this is easily done using the sqlx CLI, which can itself be installed using cargo. Once installed, the following command adds in the build script, which is all that we need.

Now that we have our table set up, we probably want to write some data to the database. sqlx is not an ORM database driver. ORM stands for object-relational mapping; it basically allows you to map database commands and queries onto objects. Because sqlx is not an ORM, we have to construct our SQL queries by hand. Let's go ahead and create a new struct that represents the database table we have:
Book. Now we're going to write a new function called create, which takes an instance of Book and the database connection, and builds an INSERT query from the book. Notice that we use the bind function to bind our input parameters to the numbered parameters in the SQL query. After that, we call the execute function on that query and await the result. Let's go ahead and create a new book, which is going to be 'Salem's Lot by Stephen King. If we check the database, we can see that our book has been created.

Updating is a very similar process. This time we create a new function called update, which takes an ISBN and the book we want to replace it with. Again we use the bind functions followed by the execute function, and with our new update method we're able to change details about our book.

With writing being one half of the puzzle, the other half is reading, or fetching, data from our database. We've seen this before: we can use the query function to pull rows out of our table. Once we have a row from our table, we can use the get function to obtain a value by its name or its index; make sure to import the sqlx Row trait to do this. Here we're using a fetch_one query, but there are actually four methods for obtaining rows from a query: fetch_one, fetch_optional, fetch_all, and fetch.

The fetch_one method is used for pulling a single row out of the database. This method does assume the row exists, however, and will return an error if it does not. The fetch_optional method is very similar to fetch_one, except it returns an Option, which will be None, instead of an error being thrown, if no rows are found. The fetch_all method returns all of the rows that match the query as a vector, which can then be iterated or mapped to convert each row into the type we're expecting, which is our Book. Finally, the fetch method returns all of the matching rows as well, but as a stream-like type that can then be iterated
through the rows in the result set. This is a more asynchronous approach than fetch_all, which can be useful when working with larger data sets. The try_next method that I'm using also requires the futures crate, so make sure to add that to your Cargo.toml.

As well as pulling data out by rows, sqlx provides some magic for automatically converting those rows into concrete types, using the query_as method. To do so, we first need to derive the FromRow trait on our Book struct; then we can use the turbofish operator to specify the type we want to convert into. We then have our vector of books with far less boilerplate. Magic.

Often when working with databases, there are times when you'll want to perform a group of operations atomically; this means that the operations occur together, all at once or not at all. Databases provide this through the use of transactions. Starting a transaction is easy in sqlx: all we need to do is call the begin function on our connection or pool, which returns a transaction type. We can then use this transaction as the connection type for our queries. Once we're done, we can either apply our queries using the commit function or revert them using the rollback method.

One quick thing to mention: all of these queries are prepared queries. By default, sqlx automatically prepares queries that have parameters associated with them, which improves performance when running the same queries multiple times.

sqlx also provides a number of feature flags to enable usage of popular Rust crates with Postgres types. JSON columns are supported using the `json` feature flag; with the flag enabled, we can set up our data structures to use serde, which sqlx uses internally. This allows us to encode and decode JSON when reading or writing column data. UUIDs are supported by enabling the `uuid` feature flag; we can then use the uuid crate to write to and read from Postgres UUID columns. The `chrono` feature allows us to use the chrono crate for timestamps.
With the feature flag enabled and the chrono crate added, timestamps and dates can be mapped from their associated chrono types into Postgres. The `ipnetwork` feature allows for interfacing with IP address types within Postgres; after enabling the feature in the Cargo.toml, one can use the IpNetwork type within sqlx to interface with Postgres's CIDR and INET types.

Another feature flag that sqlx provides is the `macros` feature. These macros provide drop-in replacements for the query functions we've been using, while also bringing in powerful compile-time verification. This compile-time verification performs static type checking of our queries against our database. In order to use these macros, the DATABASE_URL environment variable must be set, which is easily done using a .env file.

The last feature I find useful is the support for Postgres's asynchronous notifications using the LISTEN and NOTIFY commands. By using the PgListener type, we can specify the topics we want to listen for and wait until we receive them. This allows us to use Postgres as an asynchronous event notification system.

sqlx is one of my favorite crates to use for asynchronous Rust. I really appreciate the level of support it has for core Postgres features, and the integration with many popular Rust crates. I'd also consider using it for other databases such as MySQL or SQLite. I hope you enjoyed this video, and I'll see you in the next one.
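The dependency setup described in the video looks roughly like this. The version numbers are illustrative (sqlx 0.6 was current when the video was published; in sqlx 0.7+ the runtime flag is split into `runtime-tokio` and `tls-rustls`):

```toml
# Cargo.toml: tokio for the async runtime, sqlx with the runtime
# and database feature flags mentioned in the video.
[dependencies]
tokio = { version = "1", features = ["full"] }
sqlx = { version = "0.6", features = ["runtime-tokio-rustls", "postgres"] }
```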
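Connecting and running the "one plus one" check might look like the sketch below. The credentials, host, and database name in the URL are placeholders; you'd substitute your own, and a running Postgres instance is required:

```rust
use sqlx::postgres::PgPoolOptions;
use sqlx::Row;

#[tokio::main]
async fn main() -> Result<(), sqlx::Error> {
    // Connection URL: database type, username, password, host, port, database name.
    let url = "postgres://dbuser:mysecretpassword@localhost:5432/bookstore";

    // A pool is recommended over a single connection, especially in production;
    // it has the same query interface as a single PgConnection.
    let pool = PgPoolOptions::new().max_connections(5).connect(url).await?;

    // A trivial query to prove the connection works.
    let row = sqlx::query("SELECT 1 + 1 AS sum").fetch_one(&pool).await?;
    let sum: i32 = row.get("sum");
    println!("1 + 1 = {sum}"); // prints 2

    Ok(())
}
```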
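A sketch of the migration step, assuming a hypothetical `migrations/0001_create_book_table.sql` file and a `book` table with the columns used later in the video:

```rust
// migrations/0001_create_book_table.sql (the <version>_<description>.sql
// naming scheme) might contain:
//
//   CREATE TABLE book (
//       isbn   TEXT PRIMARY KEY,
//       title  TEXT NOT NULL,
//       author TEXT NOT NULL
//   );
//
// The migrate! macro embeds the migrations at compile time and runs any
// that haven't been applied yet; "./migrations" is also the default path.
sqlx::migrate!("./migrations").run(&pool).await?;
```

For the build-script caveat, `cargo install sqlx-cli` followed by `sqlx migrate build-script` generates a `build.rs` that triggers recompilation whenever the .sql files change.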
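The create and update functions could be sketched as follows; the table and column names match the hypothetical migration above, and the queries are written by hand since sqlx is not an ORM:

```rust
struct Book {
    pub title: String,
    pub author: String,
    pub isbn: String,
}

// Insert a book; bind fills the numbered $1..$3 parameters in order.
async fn create(book: &Book, pool: &sqlx::PgPool) -> Result<(), sqlx::Error> {
    let query = "INSERT INTO book (title, author, isbn) VALUES ($1, $2, $3)";
    sqlx::query(query)
        .bind(&book.title)
        .bind(&book.author)
        .bind(&book.isbn)
        .execute(pool)
        .await?;
    Ok(())
}

// Replace the details of the book with the given ISBN.
async fn update(book: &Book, isbn: &str, pool: &sqlx::PgPool) -> Result<(), sqlx::Error> {
    let query = "UPDATE book SET title = $1, author = $2 WHERE isbn = $3";
    sqlx::query(query)
        .bind(&book.title)
        .bind(&book.author)
        .bind(isbn)
        .execute(pool)
        .await?;
    Ok(())
}
```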
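The four fetch methods, plus the query_as shortcut, might look like this. The ISBN values are placeholders, and the streaming example needs the futures crate for try_next:

```rust
use futures::TryStreamExt; // provides try_next on the row stream
use sqlx::Row;

async fn read(pool: &sqlx::PgPool) -> Result<(), sqlx::Error> {
    // fetch_one: exactly one row; errors if no row matches.
    let row = sqlx::query("SELECT title FROM book WHERE isbn = $1")
        .bind("some-isbn")
        .fetch_one(pool)
        .await?;
    let title: String = row.get("title");
    println!("{title}");

    // fetch_optional: Some(row) or None instead of an error.
    let maybe_row = sqlx::query("SELECT title FROM book WHERE isbn = $1")
        .bind("no-such-isbn")
        .fetch_optional(pool)
        .await?;
    println!("found? {}", maybe_row.is_some());

    // fetch: a stream of rows, consumed asynchronously one at a time.
    let mut rows = sqlx::query("SELECT title, author, isbn FROM book").fetch(pool);
    while let Some(row) = rows.try_next().await? {
        let author: String = row.get("author");
        println!("{author}");
    }
    Ok(())
}

// query_as: map rows straight into a struct that derives FromRow.
#[derive(sqlx::FromRow)]
struct Book {
    title: String,
    author: String,
    isbn: String,
}

async fn read_typed(pool: &sqlx::PgPool) -> Result<Vec<Book>, sqlx::Error> {
    sqlx::query_as::<_, Book>("SELECT title, author, isbn FROM book")
        .fetch_all(pool)
        .await
}
```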
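A transaction sketch under the same assumed schema. With sqlx 0.6 the transaction itself acts as the executor (`&mut tx`); note that sqlx 0.7+ requires dereferencing it as `&mut *tx` instead:

```rust
// All-or-nothing: both inserts commit together, or neither does.
async fn add_books_atomically(pool: &sqlx::PgPool) -> Result<(), sqlx::Error> {
    let mut tx = pool.begin().await?;

    sqlx::query("INSERT INTO book (title, author, isbn) VALUES ($1, $2, $3)")
        .bind("The Stand").bind("Stephen King").bind("isbn-1")
        .execute(&mut tx)
        .await?;

    sqlx::query("INSERT INTO book (title, author, isbn) VALUES ($1, $2, $3)")
        .bind("Misery").bind("Stephen King").bind("isbn-2")
        .execute(&mut tx)
        .await?;

    // commit applies both inserts; calling rollback (or just dropping
    // the transaction) would revert them instead.
    tx.commit().await?;
    Ok(())
}
```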
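With the `macros` feature enabled, the compile-time-checked variant of a query could look like this. The macros need DATABASE_URL available at compile time, which a `.env` file provides:

```rust
// .env in the project root (read at compile time by the macros):
//   DATABASE_URL=postgres://dbuser:mysecretpassword@localhost:5432/bookstore

// query! verifies the SQL and the column types against the live database
// at compile time; the result is an anonymous record with typed fields.
let book = sqlx::query!("SELECT title, author FROM book WHERE isbn = $1", "some-isbn")
    .fetch_one(&pool)
    .await?;
println!("{} by {}", book.title, book.author);
```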
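Finally, the LISTEN/NOTIFY feature might be used like this; the channel name `book_added` is a made-up example, and another session would issue `NOTIFY book_added, 'payload'` for the loop to receive anything:

```rust
use sqlx::postgres::PgListener;

// Subscribe to a channel and block until a NOTIFY arrives on it.
let mut listener = PgListener::connect_with(&pool).await?;
listener.listen("book_added").await?;

loop {
    let notification = listener.recv().await?;
    println!("channel={} payload={}", notification.channel(), notification.payload());
}
```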
Info
Channel: Dreams of Code
Views: 48,862
Keywords: rustlang, rust programming, tokio, postgres, postgresql, pg, rust database, databases, sql, sqlx, rust coding, coding, db, pgdb, psql, rustdb, rust lang, programming, tutorial, rust tutorial, how to, database, mysql, sqlite, postgres sql, rusty, rustup, cargo, crates, crates.io, postgres crate
Id: TCERYbgvbq0
Length: 9min 46sec (586 seconds)
Published: Sun Mar 05 2023