PostGraphile for Beginners Mini Course

Captions
In this video I'm going to show you how to build the server side of your applications using the excellent open-source tool PostGraphile. So what's PostGraphile, and why would you want to use it? PostGraphile is an open-source project for creating high-performance GraphQL servers backed primarily by Postgres databases, and the reason you might be interested in it is that you can build the server side of your applications in a fraction of the time it would take with most modern programming languages and frameworks. Developing with PostGraphile is fast: you can get your server built really quickly.

You might be wondering what it is about PostGraphile that speeds up development. To answer that, let me explain how PostGraphile works in the simplest case. Basically, all you need to do is create a well-structured database and point PostGraphile at it. PostGraphile will inspect the database schema and automatically create a GraphQL server that your application can perform queries against. In other words, with PostGraphile you may be able to create a server without writing any server-side code, which obviously can save you a ton of time. Now, you'll definitely need to write some SQL to create tables, views, maybe some SQL functions, and a few other database-related things, but those are things you're already doing in a conventional server project. With PostGraphile, once you've written the SQL needed to create a well-structured database, there's a very good chance the server is done and ready to be used with your application.

So how exactly do you use PostGraphile? That's what we'll cover in this video. There are a couple of soft prerequisites for getting the most out of it: you should be somewhat familiar with both SQL and GraphQL. You don't need to be an expert in either of these, but it might be a bit challenging to follow along if you're a complete newbie with these technologies.

Okay, are you ready? Let's get started. I'm going to show you how to use PostGraphile by building an actual PostGraphile server for a web application that we'll look at right now. So what's the web app we'll be building? It's essentially a flashcard app you could use to help learn new things, but it's more than just a flashcard app, because we're going to implement it with spaced repetition. What's spaced repetition? Hold that thought for now; I'll explain in just a moment. First, let's take a look at some drawings of the web app we're building the server for.

Let's start with the cards page you see here. This page lists all the flashcards for the signed-in user. From this page you can edit an existing card, archive a card, delete a card, and create a new flashcard. If you click the "add card" button, you're taken to a page where you can enter a question and the corresponding answer. Once you've created a few flashcards, the home page shows how many cards need to be reviewed, and clicking the "begin review" button takes you to the review page, where you're prompted with a flashcard question like you see here. As a user, when you see the question you attempt to answer it in your mind, then click the "show answer" button to see if you remembered correctly. When you view the answer, you also see four buttons that let you score, or grade, how well you remembered it. For example, if you remembered it perfectly without thinking about it, you'd click "easy"; on the other end of the spectrum, if you couldn't remember the answer at all, you'd click "again". You're probably wondering what happens when you click these buttons. To answer that, I need to circle back to the concept of spaced repetition. So what exactly is spaced repetition?
First, let's briefly talk about the human brain. Our brains can hold a ton of information; in fact, it's estimated that our brains can store somewhere between two and three petabytes of data. That's enough storage for roughly two to three hundred years of recorded video. So our brains can store a lot of information, but unfortunately they don't work the way we might want them to. I mean, if our brains have so much capacity, why are we always forgetting things, like where I put my car keys or where I left my iPhone? There are a lot of reasons why you forget things, but I'll give you one big one: it's pretty healthy to forget some things. For example, imagine what your life would be like if you remembered all the bad things that have happened to you since you were born. I don't know about you, but I'm totally okay with forgetting the bad stuff.

So how does your brain decide whether to hold on to a memory or let it go? This is where the concept of spaced repetition comes in. Our brains have what's called a curve of forgetting, which is essentially a plot of memory retention over time. Let me give you an example of how this curve of forgetting works. Imagine I've got a rope and I show you how to tie a special knot. Then I hand you the rope and ask you to tie the same knot. You might be able to tie it on your first try, or you might have to watch me do it one more time, but pretty quickly you'll be able to tie the knot. Now, the question is: how long will you remember how to tie that knot you just learned? Typically, most people will remember for around 24 hours, but soon thereafter the memory fades and you forget. An interesting thing happens, though, if you practice the knot just before it falls out of memory, right around the 24-hour point. By practicing the knot again, your mind says to itself, "hey, I'm using this memory again, so it must be kind of important." The memory is strengthened, and you've got a new curve of forgetting, which is around three days. If you practice that knot again three days later, your memory is strengthened even more, and now you'll hold on to it for about a week. Every subsequent time you review that memory just before you'd normally forget it, you strengthen it, and the curve of forgetting grows and grows and grows. So the idea of spaced repetition is that you can sort of hack your brain into remembering things by recalling a memory just before you'd typically forget it, and the nice thing about spaced repetition is that you can memorize things with the least amount of effort possible.

Okay, now that you've got a sense of what spaced repetition is, do you have any ideas about what those buttons are for? Basically, clicking these buttons sets the spacing interval for when to review the card next. If you click the "again" button, it means you totally forgot the answer and the spacing interval resets to zero; in other words, you need to practice this card again right now. As you can imagine, clicking the other three buttons sets the spacing interval to different points in the future.

There are two takeaways from this little discussion of spaced repetition. First, you should use spaced repetition when you learn new things, because it's one of the most efficient ways to do so. Second, the server we need to build isn't just a dumb CRUD interface; there's a bit of logic to it that we'll need to implement. But how do you implement server-side logic if all you're doing is creating a database? Stay with me and you'll see.

All right, there are two other pages I want to look at real quick: a page for creating an account and a page for signing in. So basically, we'll need to somehow implement user registration and authentication.

Okay, let's start coding. I've got my editor open, and you'll notice that this project is mostly empty, except that I initialized it with git and npm, and I've got an rc file that specifies which version of Node.js to use.
For all intents and purposes, we're starting with an empty project. Now keep in mind that the vast majority of what we'll be doing going forward in this video is creating a well-structured Postgres database that can be used by our flashcard application. So how are we going to create the Postgres database? We'll be using a database migration tool named db-migrate, which is available on npm, the Node package manager, so I'll go ahead and install it right now. Of course, you don't have to use db-migrate; use whatever database migration tool you prefer. Next I'll install the db-migrate-pg library, which includes the appropriate drivers so db-migrate can talk to the Postgres database.

Okay, I've already got Postgres installed on my Mac via the excellent Postgres.app. If you're following along on your own, you'll need to make sure a fairly recent version of Postgres is installed on your computer; I'm using Postgres version 10 for this video. Next I'll create the database we'll be using by keying createdb, and I'll name the database learn-dev. Now I'll use Atom, my editor, to create a new config file for db-migrate named database.json. In it I'll create a dev key containing an object literal with the configuration information: the driver will be pg (the Postgres driver), the host is localhost, and the database will be the one we just created, learn-dev. Keep in mind that depending on how you set up Postgres on your computer, you may need to pass additional configuration options such as a username and password, but in my case, using Postgres.app, I don't need to pass any credentials along.

Okay, our initial setup and config is done, and now we can begin the process of structuring our database for our flashcard app. To create a migration file, I'll pull up the terminal and key npx (a command-line tool that comes with npm), then db-migrate create, then a file name of my choosing, which I'll call create-schema, and lastly I'll add the option --sql-file.
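The setup described above boils down to a few terminal commands and a small config file. Here's a sketch of what that looks like; the exact connection details in database.json are an assumption based on a local, credential-free Postgres.app install, so adjust host, user, and password for your machine:

```sh
# Install the migration tool and its Postgres driver
npm install db-migrate db-migrate-pg

# Create the development database (Postgres.app: no credentials needed)
createdb learn-dev

# database.json -- db-migrate's config file
cat > database.json <<'EOF'
{
  "dev": {
    "driver": "pg",
    "host": "localhost",
    "database": "learn-dev"
  }
}
EOF

# Generate a pair of plain-SQL migration files
npx db-migrate create create-schema --sql-file
```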
This --sql-file option tells db-migrate to generate two plain old SQL files that we'll use to specify how we'd like the up and down migrations to work; you'll see what I mean by this in just a moment. Okay, the command completed successfully, and you can see that it created three files in a new migrations folder. These two files, here and here, are the files we'll be modifying to construct our database. Notice that each file name is essentially a combination of a timestamp, the name we specified, and either "up" or "down". The up file will contain the new things we want to add to the database, and the down file will define how to remove the things we add in the up file, essentially allowing us to revert the changes made in the up file.

Okay, let's open the new migration files, which are located under the migrations folder and then within the sqls folder. The first thing we'll add to our database is a new schema named learn, which I'll do by keying create schema learn. Now, there's a good chance many of you have worked with databases for years but never done anything with schemas like this, and you're probably wondering: what exactly is a schema in Postgres, and why am I creating one? For now, you can think of a schema as just a namespace, and a namespace is kind of like a folder that we'll be putting stuff in, such as the tables we'll be creating in just a moment. As many of you know, you don't need to create a custom schema when you're using Postgres; you could just use the default public schema, and if you do, you probably don't even think about it. So why are we adding a custom schema? I'm not going to tell you just yet; we need to cover a few more things first. But I'll give you a hint: the reason has to do with security, or more specifically, with what information is made available by the PostGraphile server. More on schemas in a bit. Next I'll open the corresponding down file and key drop schema learn, which removes the schema that gets created in the up file.
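The pair of migration files described above end up containing just one statement each. A sketch, where the timestamp prefix in each file name will be whatever db-migrate generated for you:

```sql
-- migrations/sqls/<timestamp>-create-schema-up.sql
CREATE SCHEMA learn;

-- migrations/sqls/<timestamp>-create-schema-down.sql
DROP SCHEMA learn;
```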
Okay, now let's apply our migration in the terminal by keying npx db-migrate up, which runs any commands in the up file. All right, we see the SQL command that was run here, and there were no error messages, so all is well; it worked. Now, if we wanted to reverse the migration we just ran, we'd simply run npx db-migrate and, any guesses what comes next? Yep: down. And as you can see, this SQL command was executed and our database no longer has the learn schema. Of course, we do want the learn schema in our database, so I'll go ahead and run the migration again.

You know how I added this --sql-file option right here? I don't want to have to add this option every time I create a new migration, but I do want all of our migrations to be plain old SQL files instead of JavaScript files, which is the default. db-migrate allows you to specify default options by creating an rc file, or run command file, which I'll do right now in the Atom editor. The name of the file will be .db-migraterc; the rc file contains JSON, and I'll specify that the sql-file option should be true.

Okay, I'll close all the files that are open in my editor, then in the terminal I'll create a new set of migration files, which I'll name create-table-person; this time I don't need to add the --sql-file option. All right, we see the new migration files here, which I'll open in my editor. In this migration I'm going to create our first table, called the person table. I'll key create table, then enter our custom schema name, learn, a dot, and the table name, which is person, followed by opening and closing parentheses. Next I'll add the columns I'd like this table to have, starting with an id column of type serial, which is the primary key. Then I'll add a first_name column of type text, and I'll say it can't contain null values. Similarly I'll add a last_name column, and lastly I'll add a created_at column of type timestamp with time zone.
I'll set its default value to the current time, now(). Then I'll define how our down migration should work by saying drop table learn.person. Okay, let's create our new person table by running our migration, and it was successful.

All right, now that we've got a table in our database, we can start using PostGraphile, so I'll go ahead and install it by keying npm install postgraphile. The next question you probably have is: how do you actually start or run PostGraphile? Well, there are a couple of ways. You can run it as Node.js middleware with many of the popular Node.js servers, like Express.js or Koa. The other option is to simply run the postgraphile command that was just installed with the postgraphile library. In this video we'll run the postgraphile command, which we could do right here in the terminal, but we'll be specifying quite a few different options, so to make things easier on ourselves I'm going to create a shell script named server.sh and make it executable. Next I'll open our new script file in Atom.

To start PostGraphile, I'll key npx postgraphile, then extend the command onto the next line and add the database connection string it should use via the -c option, followed by a valid Postgres connection string. In my case the connection string will be pretty minimal: I'll key postgres://, and often you'd provide some credentials to use next, right here; however, in my case I don't need to provide credentials, at least for now, so I'll add another slash and specify the database name, learn-dev. Then I'll continue the command on the next line, where I'll add the --schema option and tell PostGraphile to use the custom learn schema. Do you remember a few moments ago when I posed the question: why are we creating a custom schema? I bet seeing this option right here is starting to answer that question in your mind.
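Put together, the person migration walked through above looks roughly like this. The NOT NULL on created_at is an assumption; the narration only states the default:

```sql
-- up migration
CREATE TABLE learn.person (
  id          SERIAL PRIMARY KEY,
  first_name  TEXT NOT NULL,
  last_name   TEXT NOT NULL,
  created_at  TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT now()
);

-- down migration
DROP TABLE learn.person;
```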
Basically, we're telling PostGraphile to only expose things it finds in the learn schema. So, for example, if we were to create some tables in some other schema, PostGraphile wouldn't expose those tables. The reasoning behind why we're using schemas will become even clearer a bit later, when we set up user registration and authentication, so hang with me for now.

Okay, let's start our PostGraphile server by keying ./server.sh, and as you can see from the response, it looks like it worked. I'm going to click on this link right here, which opens the GraphiQL tool that we'll be using extensively through the remainder of this video. So what is this GraphiQL tool, and how do we use it? This is where we'll be writing and trying out our GraphQL queries. Now, if you're not super familiar with GraphQL queries, you can think of them sort of like SQL queries. Well, not exactly, but it's a reasonable analogy to start with. Basically, we can write GraphQL queries to fetch data from things like tables, and we can write queries, or what are called mutations, to add new rows to tables or to do updates and deletes, plus a bunch of other operations, many of which we'll cover later in this video.

Let me give you an example of a query that fetches data from our new person table. I'll enter the keyword query followed by an operation name for this query, which I'll call PeopleQuery, then add a set of curly braces and start keying the letter "a". We see that GraphiQL gives us some options for autocompleting what I started typing; the first option, allPeople, looks pretty relevant for the person table, so I'll finish adding that field name. Then I'll add another set of curly braces, the keyword nodes, and another set of curly braces. Now I'll press the hotkey Ctrl-Space, and we see a few fields listed, the last four of which match the column names in our person table. I'll start by selecting the id field, then I'll select the firstName field and the lastName field.
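Before going further, here's what server.sh looks like so far; the credential-free postgres:///learn-dev connection string is my reading of the narration and assumes a local Postgres.app setup (with credentials it would be postgres://user:password@host/learn-dev):

```sh
#!/bin/sh
npx postgraphile \
  -c postgres:///learn-dev \
  --schema learn
```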
This is a valid GraphQL query that essentially says: fetch me all the people, but only return each person's id, first name, and last name. I'll go ahead and run it by clicking the run button here, and we see the results of the query over here: basically, an empty array, which makes sense because we haven't actually added any people to the person table. Let's do that. I'm going to pull up a SQL tool I use named Postico and run the following SQL command: insert into learn.person, then I'll specify the first_name and last_name columns and say the values to insert are a few first and last name combinations. Now I'll run this command, which inserted three new rows, then pull up GraphiQL again and run the query again, and this time we see the three rows I just inserted. Cool.

Okay, I didn't need to insert these new person records with SQL like I just did; we can also insert records into a table using GraphQL. I'll get rid of this query and create what's called a mutation, which I'll name CreatePerson, and I'll add a set of curly braces. Next I'll type the letter "c" and notice what shows up in the autocomplete here: a GraphQL field named createPerson, which does exactly what you'd expect; it allows you to create a new person. Now, the next question you probably have is: how do you pass along the first and last name of the person to create? You do this by passing some parameters to the createPerson field. I'll add a set of parentheses, and then we automatically see an autocomplete suggestion that includes an input identifier. I'll go ahead and use the input parameter followed by a colon and a set of curly braces, then I'll enter person, a colon, and another set of curly braces, and we see the input fields for the person table. Next I'll add the first name of Joe and the last name of Smith, and I won't worry about the id and created_at columns, as these are auto-populated.
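The query built step by step above, plus a seed insert of the kind run in Postico, look roughly like this; the three sample names are placeholders of my own, since the video doesn't spell them out:

```graphql
query PeopleQuery {
  allPeople {
    nodes {
      id
      firstName
      lastName
    }
  }
}
```

```sql
INSERT INTO learn.person (first_name, last_name)
VALUES ('Ada', 'Lovelace'), ('Grace', 'Hopper'), ('Alan', 'Turing');
```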
If you notice the red squiggly lines here under createPerson, it's an indication that our mutation isn't valid: we need to specify what information should be returned after creating a new person. So I'll add a set of curly braces, then key in the person field, indicating I want the newly created person to be returned, and I'll say: return the id, firstName, and lastName. Now I'll run this mutation, and we know it worked because there were no errors and we see the newly inserted person right here. Cool.

We can also do updates in GraphQL; let me show you an example. Let's say we'd like to change this name from Joe to Joey. To update the name, I'll create a new mutation named UpdatePerson, then use the field PostGraphile offers us named updatePersonById. I'll specify that the input should include the id of the record to update, and I'll add the personPatch parameter, where I'll say to update the first name to Joey. Then I'll return the updated person's id, firstName, and lastName. Now I'll run the UpdatePerson mutation, and as you can see over here, it worked.

We can also perform deletes with GraphQL, which I'll demonstrate right now. I'll create a new mutation that references the deletePersonById field; its input includes the id to delete, and then I'll specify what data to return, which in this case is the deleted person's id and firstName. Now I'll run the DeletePerson mutation, and it worked.

Okay, so hopefully this gives you a general sense of what PostGraphile gives you, but we've barely scratched the surface, so hang with me; there's a lot more to see. Now, by default PostGraphile uses some rather long names for the GraphQL fields. For example, a few moments ago we ran this query, and the field name we used was allPeople; I kind of wish it just said people here. This simpler naming convention can be achieved by using a custom PostGraphile plugin. Let me show you how this works. First I'll kill the server by pressing Ctrl-C; next I'll install the plugin.
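The three mutations walked through above come out looking roughly like this; the id values are placeholders for whatever your rows happen to have:

```graphql
mutation CreatePerson {
  createPerson(input: { person: { firstName: "Joe", lastName: "Smith" } }) {
    person { id firstName lastName }
  }
}

mutation UpdatePerson {
  updatePersonById(input: { id: 4, personPatch: { firstName: "Joey" } }) {
    person { id firstName lastName }
  }
}

mutation DeletePerson {
  deletePersonById(input: { id: 4 }) {
    person { id firstName }
  }
}
```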
The plugin I'm referring to is installed by keying npm install @graphile-contrib/pg-simplify-inflector. Now we've got to tell PostGraphile to use this plugin. To do this, I'll pull up our server shell script and continue the command on the next line, adding the option --append-plugins and providing the plugin's module name. All right, I'll restart the server, then head back to GraphiQL, click the history button, and click PeopleQuery. This is the query we ran to fetch all the people, but notice what happens when I run this query again: we get an error that says it can't find the field allPeople. Well, that makes sense, because the plugin we just installed should now make the field name people, not allPeople. I'll go ahead and get rid of the query and press Ctrl-Space, which gives us a list of all the fields we can query against. Why do you think we still see allPeople listed? The reason is that we haven't refreshed the browser since restarting the server, so the GraphQL metadata stored in the browser is out of date. This is easy to fix by just refreshing the page, and now when I press Ctrl-Space we see the field name we expect, which is people. I'll go ahead and choose the people field and create a query similar to what we ran earlier, and when I run this query we get the result we're expecting.

Okay, so here's the takeaway I want you to remember about what we just did when we added the plugin: PostGraphile offers a bunch of functionality that covers the vast majority of use cases you'll ever need to handle, but if you happen to run into a scenario that PostGraphile doesn't handle, you might be able to extend PostGraphile with a community plugin, or you can write a plugin yourself. Honestly, though, you probably won't run into any showstoppers that require you to write a plugin.

Next, I want to enable a couple of additional PostGraphile features that will add some interesting capabilities to GraphiQL. To enable these features, I'll add the --enhance-graphiql option and
the --allow-explain option. Next I'll kill the server and restart it. Now I'll pull up the browser and refresh the page, and as you can see, this page looks a bit different now; in particular, these two buttons are new. The explorer allows you to see all the fields and parameters that are available to you and lets you create queries and mutations by simply selecting what you want. For example, when I click the createdAt checkbox, you'll notice the field gets added to the query over here. The explain button shows the results of using Postgres's EXPLAIN keyword on a query. Basically, it can help you determine how efficient the underlying SQL query is, and if the query is inefficient, you can address the issue with indexes or other fixes. Below the explain section is the SQL used to fetch this data up here.

Okay, I bet I know what many of you are thinking as you look at this SQL: "that doesn't look like any SQL I've ever seen." Yeah, I get it; it's not what you're probably used to seeing. So what's going on here? Keep in mind that these SQL queries are generated dynamically based on the GraphQL query, and some optimizations are happening to minimize the number of queries that get sent to the database. In fact, for every GraphQL query tree like this, PostGraphile generates just one database query, and it avoids the N+1 problem. So the SQL might look a bit crazy at first glance, but it's quite efficient.

I also want to point out the documentation in GraphiQL, which you can open by clicking right here. Basically, the docs provide a description of the entire GraphQL schema; in other words, they show all the possible queries, mutations, fields, parameters, data types, and so on. For example, I could search for "person" and see all sorts of fields and types associated with our person table. If I click on this Person type, we see the table columns listed here. However, notice that there is no description for Person, and no description for each of these fields.
If we'd like to improve the documentation we see here, we can do so by simply adding some comments to our database. Let me show you how this works. I'm going to go ahead and kill the server, then create a new migration. Now I'll open up our new migration file and add some comments to the database. First I'll add a comment on the table learn.person, which is "a user of the flashcard app"; next I'll add some comments for each of the columns in the table. Actually, I'm not going to bother creating a down migration script for these comments; it's not necessary. Now I'll run the migration and start up the server. Next I'll pull up the browser, refresh the page, open the docs, and search for the Person type, and check this out: the comments we added here are now displayed here. Nice.

Okay, it's getting kind of annoying to have to kill and restart the server every time I make a change in the database, but this can be avoided if we enable another feature when we start the PostGraphile server. To do this, I'll extend the command onto the next line and add the --watch option. Now, any time we apply a migration, the PostGraphile server will notice the changes, automatically update the server, and automatically push the metadata changes to GraphiQL in real time. All right, I'll kill the server, restart it with the new --watch option, and open another terminal session where we can work with the migrations.

It's time to add another table: the card table. I'll create a new migration and open the new up and down files. First I'll create a table in the learn schema named card, then I'll add the id field, a question field of type text, an answer field, and a review_after timestamp field. Next I'll add a spacing field of type integer, and I'll add a check that basically enforces that the spacing value must be greater than or equal to zero. Now, if we were building a traditional server using whatever your favorite language is, you'd typically need to validate values sent from users in your server code.
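The comments migration described above is ordinary SQL. A sketch; the table comment is the one stated in the video, while the column descriptions are my own placeholder wording:

```sql
COMMENT ON TABLE learn.person IS 'A user of the flashcard app.';

COMMENT ON COLUMN learn.person.id IS 'The primary unique identifier for the person.';
COMMENT ON COLUMN learn.person.first_name IS 'The person''s first name.';
COMMENT ON COLUMN learn.person.last_name IS 'The person''s last name.';
COMMENT ON COLUMN learn.person.created_at IS 'The time this person record was created.';
```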
But in the case of PostGraphile, we're not writing typical server-side code, so we need to lean more on the features of Postgres to validate and check values; in this case, we're having Postgres validate the spacing column's value right here. Next I'll add an ease_factor column, which is a numeric type, and we're having Postgres verify, or check, that its value is greater than or equal to 1.3. Next I'll add a sequence column.

Okay, you're probably wondering what these four columns are for. They're going to be used to determine when a flashcard should be reviewed next; essentially, these four fields will be used by a simple spacing algorithm we'll implement in a bit. An important thing to know about these four fields is that they should never be manipulated by users; they should be set exclusively by our spacing algorithm. So somehow we need to control access to these columns, but I don't want to deal with that just yet; we'll come back to it in a moment.

Next I'll add an archived column, followed by a person_id column of type integer that references the person table's id column. This person_id column will hold the unique id of the person who created and owns each flashcard. Lastly I'll add a created_at column. Next I'll add a few indexes for this table: an index for the review_after column, the archived column, and the foreign key person_id. I'll add some comments on the table and each column in the table, and in the down migration I'll simply drop the table. Next I'll go ahead and apply this migration.

Okay, we don't need to restart the server, since we've added the --watch option, and we don't need to refresh the browser either. So I'll just add a mutation to create a new card, and I'll run the mutation, and it worked; we added this flashcard data. Cool. Now, there's a problem with this mutation. Do you see what the problem is? Remember how I said these four fields shouldn't be manipulated by users? Well, I just manipulated the spacing value: I set it to 100, which will break our app.
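Assembled from the narration, the card table migration looks roughly like this. The NOT NULL constraints and the default values (including the 2.5 starting ease factor and the boolean archived column) are assumptions I've filled in; only the columns, the two CHECK constraints, the foreign key, and the three indexes are stated outright:

```sql
CREATE TABLE learn.card (
  id           SERIAL PRIMARY KEY,
  question     TEXT NOT NULL,
  answer       TEXT NOT NULL,
  review_after TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT now(),
  spacing      INTEGER NOT NULL DEFAULT 0 CHECK (spacing >= 0),
  ease_factor  NUMERIC NOT NULL DEFAULT 2.5 CHECK (ease_factor >= 1.3),
  sequence     INTEGER NOT NULL DEFAULT 0,
  archived     BOOLEAN NOT NULL DEFAULT false,
  person_id    INTEGER NOT NULL REFERENCES learn.person (id),
  created_at   TIMESTAMP WITH TIME ZONE NOT NULL DEFAULT now()
);

CREATE INDEX ON learn.card (review_after);
CREATE INDEX ON learn.card (archived);
CREATE INDEX ON learn.card (person_id);
```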
There's another, similar but more subtle problem, which is the fact that I'm specifying the id of the person right here. Should we really trust the user to pass their own id in this query? If the user were devious, they might send a query like this with some other user's id, effectively corrupting the other user's set of flashcards. Really, this person_id should be set by the server based on the logged-in user, right? Okay, so how can we fix these issues? In part, we can address them with something called smart tags. Let me show you how this works.

First, I'm going to roll back the last migration for the card table. Next I'll rearrange the comments a bit, and then I'll modify the comment for the review_after column, adding a special comment called a smart tag, @omit create,update, which removes the ability to create and update values in this column. Then I'll separate the smart tag from the comment with a newline. Now I'll do a similar thing for the other column comments. Okay, so what does this do for us? It prevents these fields from being set or updated by users via our PostGraphile server. Let me show you. First I'll reapply our migration, then pull up the browser, and check this out: right here, GraphiQL is complaining about us trying to add values to the personId and spacing fields, because that's not allowed anymore. Nice.

So can we still view the spacing field and the personId field? Let's create a query and see. I'll add a card query that includes spacing and personId, and as you can see, GraphiQL isn't complaining about them, so yep, we can query these fields. We still need to figure out a way to set the values in these fields, but we'll get to that in just a bit.

There's another field available to us in the card query that I want to draw your attention to. I'll go ahead and enter the letter "p" and take a look at the fields that show up in the autocomplete list. Check out this first one right here: person. So what's going on with this field?
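Smart tags live in the first line of a comment, separated from the human-readable description by a newline. A sketch for two of the protected columns, with the descriptive text after the \n being my own wording:

```sql
COMMENT ON COLUMN learn.card.review_after IS
  E'@omit create,update\nWhen this card should be reviewed next.';

COMMENT ON COLUMN learn.card.spacing IS
  E'@omit create,update\nThe current spacing interval for this card.';
```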
field? I mean, we don't have a person column in the card table, so what is this? Well, since we added a proper foreign key to our card table, PostGraphile automatically added the person field, which allows us to easily query information about the owner of this card. So for example, if I wanted to include information about the owner of the card, I'd just add the person field, then I'd add a sub-selection of the fields I want to include from the person table. I'll add the person id, first name, and last name. Now when I run this query we don't get any results, because I just dropped the card table and recreated it, but for the moment I'll do the following: I'll insert a row into the card table using this SQL statement, then I'll run the query again, and as you can see, we were able to query our new card and the person who owns the card. Cool. Now it's worth noting that the smart tags are easily bypassed with this SQL expression, but we'll address these types of security concerns more in a bit here. The next thing I want to do is add a new table, so I'll create a new migration to create a response table. This response table is essentially a log of the response the user provides when reviewing a flashcard. So I'll add in the appropriate columns for the response table, then I'll add some indexes as appropriate, and I'll add some comments. Now, I don't want users to be able to create, update, or delete rows in this table, well, at least not directly. So to prevent PostGraphile from performing creates, updates, and deletes, I'll add a smart tag on the table comment to omit create, update, and delete operations. Now I'll apply our migration. Okay, to reinforce what we just accomplished with smart tags on the response table, I'll start a query, and if I start to type response, we see some response fields available to use, confirming that we can read from the response table, which is what we want. But look what happens when I start to create a mutation: when we look at the available mutation fields, we don't see any
fields to create, update, or delete responses, which is what you'd expect with these table-level smart tags. Oh, one more thing: we need to drop the response table in the down migration file. Okay, the next thing we'll add is a couple of Postgres functions to handle the process of scoring how well we remembered the answer and setting the appropriate spacing, in other words, setting the next time the card should be reviewed. So I'll create a new migration, then I'll open up the new migration files. I'm going to write this function offline, and I'll briefly walk you through it. Okay, I'm not going to walk you through the implementation of this function, because it's not particularly interesting and it's really only relevant for my flashcard app, but I do want to point out the basics of creating Postgres functions. So I created this score_response function in the learn schema. It takes three parameters, and it returns this type here, score_response, which is this custom composite type I created up here. Now, the $$ you see here delimits the start of the function body, and if we scroll down we see another $$ that delimits the end of the function body. Then down here we see the language the function is written in, and the function is marked as immutable, meaning it's a pure function that has no side effects. And lastly, we see this security definer, which we're going to skip for the moment, but it's important and we'll come back to it later. Okay, so you're probably wondering how this function gets used. Well, hold that thought, I'll show you in just a few moments here. Next, I'm going to go ahead and add the down migration code to drop the function and the custom composite type. Now we'll go ahead and apply our migration, and I'll create another migration for another function that will handle scoring. In other words, we're going to create a function that will get called when these buttons get clicked in the app. Next, I'll open our new up and down migration files, and I'll add a function offline, and
I'll walk you through the basic logic in this function. So this handle_score function takes two parameters: the card id and the score for how well the answer was remembered. Basically, if the user clicks this button the score is zero, clicking this button means the score was one, the score for this button is two, and this button's score is three. The return value for this function is a row from the card table. Right here I defined a few variables, which get used in the body of our function. Now, in this block I'm querying a card based on the passed card id and storing the result in this card variable. Then on this line I'm calling that other function we just finished creating, and I'm storing the return value in the response variable. Next, I'm setting a new review_after timestamp to be the current review_after time plus some spacing value in days. Now in this block I'm updating the card, setting the column values that the users aren't able to manipulate, and I'm returning the updated card row into the card variable. Then right here I'm simply logging, or inserting, the user's score into the response table, and lastly I'm returning the card. Okay, after the body of the function you see the language used in this function, and you see that this function isn't marked as immutable like we saw in the last function. It's marked as volatile, because it's not a pure function; we've got side effects happening in this function. And lastly, we see this security definer syntax again. Now in the down migration I'll drop the function, and next I'll apply our migration. Okay, at this point you might be wondering, what do we do with this handle_score function that I just created? Well, you call the function when these buttons are pushed, which begs the question: how do you call this function with PostGraphile? Well, as you might have guessed, PostGraphile automatically adds the handle_score function as a GraphQL mutation that you can use. Let me show you how. So I'll create a new mutation, and I'll key the letter h, and check it out:
we see our handle_score function, or more accurately, it's called a field in GraphQL. Alright, so to call it you'd create a mutation that looks something like this. Okay, the next thing I want to work on is creating a GraphQL query to find the next flashcard that should be reviewed. In other words, when you're on this homepage and you see that there are three cards to be reviewed, you click the begin review button, which should take you to the review page. Well, we need to write a query that fetches the next card to review, in other words, this card we see right here. But before we start on the next-card query, I want to insert a few more card rows, so I'll run this insert command right here. Okay, so we really need to write a next-card GraphQL query that's sort of like this SQL query you see here. Basically, we need to query for cards that have a review_after timestamp less than or equal to the current time, and the card shouldn't be archived, and the card must be for a given person. Then we want to order the cards by the review_after timestamp and only return the top card. But again, we need to do this type of query in GraphQL. Alright, we'll start with this next-card query, which returns all these cards. Then we can add some sorting, filtering, and limiting on the query by adding some parameters to the cards field. So I'll start by saying order by the review_after field in ascending order, and running this query works, but we've got more to do still. Next I'll say only return the first one card, which gets us closer to what we need, but we still need to filter the cards to only show the cards that are past due based on the review_after field. There is a condition parameter where we can choose a field to filter with, but what would I put here? Well, this condition is problematic for what we're trying to do. The condition parameter only allows us to do equality comparisons, so I could filter for a card with a specific timestamp, but that's not what we're trying to do. What we want to do is something like this,
where we'd say review_after is less than or equal to some date value. But this isn't valid syntax, so it won't work. Okay, so are we stuck, or is there some way to get around this? Well, I'm sorry to say, we're stuck, it can't be solved. Of course, I'm kidding. This is easily solved. There's a couple of solutions that come to mind, and I'll show you both of them. One possible solution is to create a database view. So to create a view, I'll create a new migration, then I'll open up the new migration files, and in the up migration I'll create a view in the learn schema named next_card. Then I'll add a select statement where I select every column from the card table where the review_after date is less than or equal to the current time right now, and I'll order the results by review_after and then id. Next, I'll handle the down migration by dropping the view. Now I'll apply the new migration. So how do we use this new view in PostGraphile? Well, I bet you could intuitively guess the answer at this point. Basically, PostGraphile exposes the view as a query field, similar to tables. So let's create a new query that uses this view. In GraphiQL I'll go ahead and press control-space to see what fields are available to use, and as you can see right here, there's a nextCards field. So I'll go ahead and use it, and I'll create a query that includes the id, question, and answer fields. Then I'll run the query. Okay, we still see more than one card, which isn't exactly what we want, but at least we know that they're cards that actually need to be reviewed, and they are in the order we want them to be reviewed in. Now all we need to do is modify this query so that it only returns one card, so I'll add a parameter to the nextCards field where I ask for the first one. Then I'll run the query again, and as you can see, it worked. There's another way we can handle this next-card query, which is to add a PostGraphile plugin that enhances the filtering capabilities. Let me show you how to do this. So first, I'll kill the
server, then I'll install the postgraphile-plugin-connection-filter library, or plugin. Next I'll modify our server shell script by appending another plugin right here, separating the first plugin from the next plugin with a comma, then including the plugin's module name. Now I'll go ahead and start the server again, then we'll try out our new filtering capabilities in GraphiQL. I'll change the nextCards field to just cards, then I'll get rid of the custom parameter first, and I'll press control-space to show you the autocomplete list. So, do you see any new parameters in this list? This filter parameter is new; it's made available by the plugin we just installed. I'll go ahead and select the filter parameter, then I'll choose the review_after field, and then inside a new set of curly braces I'll trigger the autocomplete list, and check out all these different filtering options. In our case, I'd like to use the less-than-or-equal-to option, and then I'll add in a date-time value. Now I'll add the first-one parameter again, and of course we'd need to set the order-by as well, but for now I'll skip that and just try out our query, and as you can see, we're able to filter by the date as needed. Okay, we've basically implemented the required tables, views, and functions to implement this app, except we still need to handle some security-related things, such as user registration and user authentication. So let's deal with the security-related stuff. The next thing we'll do is create a new migration where we'll create a new schema. Now I'll open up the migration files, and in the up file I'll create a new schema named learn_private, and in the down file I'll drop the learn_private schema. Okay, you're probably wondering, what's this learn_private schema for? Well, hang with me, you'll see in just a moment here. Next I'll go ahead and run the migration, then I'll create a new migration for a new table called person_account. Now I'll open the new up and down migration files, and then I'll create a new table in the
new learn_private schema named person_account. Then I'll add a person_id column of type integer, which is the primary key, and it references the learn schema's person table by the id column, and I'll say when the referenced relation is deleted, cascade the delete to this table. Next I'll add an email column of type text that can't be null, and I'll add a simple check on the format of the email. This validation isn't sufficient, really, but I don't want to add a ridiculously long regular expression right here, and I hope you get the gist of what's possible in the way of data validation. Lastly, I'll add a password_hash column of type text, which can't be null. Okay, so what's this person_account table for? Well, as you might have already guessed, it's for storing the sensitive information about a user's account: the email and the password hash. Can you guess why we put this table in the new learn_private schema? Well, because we don't want to expose this information via PostGraphile; in other words, we don't want there to be GraphQL queries that can return this information. Okay, so how is it that putting a table in this learn_private schema prevents users from querying it? Well, let's look back at our server shell script, and in particular look at this option right here, the schema option. With this option we're telling PostGraphile to only expose tables, views, functions, and so on if they reside in the learn schema. So by putting our sensitive information in another schema, we're ensuring that it won't be accidentally leaked. Okay, one more thing: in this up migration I'll add an index for the email column, then in the down migration I'll drop the person_account table. Now I'll go ahead and apply our migration. Okay, we're going to need to hash the user's password, as it's a best practice, and we'll do so using the bcrypt hashing algorithm. But to use bcrypt in Postgres we'll need to add an extension. So to add an extension, I'll first create a new migration, then I'll open the up and down migration files. Then
in the up file I'll create the extension, if it doesn't already exist, named pgcrypto, and in the down file I'll drop the extension. Next, I'm going to create a new migration that creates a Postgres function to register new users. Then I'll open the new migration files, and in the up file I'll create a new function in the learn schema named register_person, which takes the first name, last name, email, and password. This function will return a new person. Then in the body of the function I'll declare a person variable, and I'll insert into the person table the first and last names, and I'll return the new row into the person variable. Then I'll also insert into the learn_private schema's person_account table the columns person_id, email, and password_hash. The value for the person_id is in the person variable, using dot notation, the email is a passed parameter, and the password hash will be generated by calling the crypt function, passing in the password and a salt generated by calling gen_salt with the value bf. Lastly, I'll return the person variable and close out the function body. Then I'll include the function's language, I'll mark it as strict, and I'll add security definer. Okay, as we've seen before, this function will become a GraphQL field that we can essentially call with a GraphQL mutation. But let me ask you a question: is there anything in this function body that bothers you or seems problematic? If you don't see anything that bothers you, let me give you a small hint. Here's the hint: this register_person function will be called by random users on the internet when creating an account. So do you think both of these inserts will work for random internet users? In particular, what about the second insert, which is touching a table in the learn_private schema? Okay, it might seem concerning that an untrusted, random internet user can insert a value into the private schema, but it's actually desirable in this case, and it's possible because we're using this line right here: security definer. Basically, by
adding security definer, we're telling Postgres: when it runs this function, regardless of who calls it, run it with the permissions of the user who created the function, in other words, always run this function with the permissions of the user who applies this migration. Well, there's a bit more to the security story, but we'll get to that in a moment here. Okay, next I'll go ahead and create the down migration, which drops the function, then I'll apply the migration. Alright, let's try out this new register_person function. So here's a GraphQL mutation that effectively calls the register_person function, passing in the relevant input parameters, and it returns the new person's id and first and last names. Now I'll attempt to run this mutation, and it worked, cool. Okay, now that our PostGraphile server can register new users, the next task is to create a way for existing users to sign in, or authenticate. But before we deal with authentication, we need to create some database security roles, one of which we're going to assign to users upon successful sign-in. So I'll make a new migration to create some roles, then I'll open the migration files, and in the up migration file I'll start by creating a role named learn_postgraphile, which is a role that can log in with a password. This is the role that we'll have PostGraphile use in the server shell script in just a moment. Next I'll create a role named learn_anonymous, which is more like a security group than a user account, and this role will get used for non-signed-in users that make requests to the PostGraphile server. You'll see exactly what I mean by this in just a moment. Now I'll go ahead and grant the role learn_anonymous to the role learn_postgraphile. You can think of this command as saying: anything this role can do can also be done by this role. This will make more sense when we start granting privileges to this role in a few moments. Next I'll create a role named learn_person, which is the role that authenticated users will
be a part of, and lastly we'll grant the learn_person role to the learn_postgraphile role. Okay, let me quickly summarize what's going on in this migration. On this line we're effectively creating a database user with a password. Then on these two lines we're effectively creating two security groups, one for signed-in users and one for anonymous users. Then we said anything these security groups can do can also be done by the learn_postgraphile role, or user. In the down migration I'll drop all three roles. Next I'll modify the server shell script, and in particular I'll change the connection string to use the learn_postgraphile account with its corresponding password, and I'll specify that the server is running on localhost. I'm also going to extend the command and add the owner-connection option, to which I'll pass a connection string that uses my local superuser account. The only reason I'm adding this connection string is because this account here isn't going to have the necessary permissions to watch for schema changes. Keep in mind, in production you wouldn't actually use either of these options on the bottom here. Next I'll add another option, default-role, and I'll pass along the role learn_anonymous. Basically, this tells PostGraphile that if a non-authenticated user makes a request to the server, give them the permissions associated with this role. Of course, we haven't actually assigned permissions to this role yet, but we'll do that in a moment. Okay, I'll go ahead and apply our migration, then I'll restart our server so that it runs with all our new options. We're about ready to deal with user sign-ins, but there's one more thing we'll need to add first: a special Postgres data type we'll use to generate JSON Web Tokens, or JWTs. If you aren't familiar with JWTs, they are essentially a cryptographically signed token that gets sent to the user upon successfully signing in. Then, on every subsequent request the user makes, they'll send the JWT to the server
as a form of authentication. You can think of a JWT as an alternative to cookies in the browser. So, to define our new data type, I'll create a new migration, then I'll open the migration files, and in the up file I'll create a type in the learn schema named jwt, which is a composite type that includes a role field of type text, a person_id field of type integer, and an expiration of type big integer. So basically, this type here is a representation of a security token that will get sent back to the user when they successfully sign in. You'll see this in action in just a moment. In the down migration file I'll drop the jwt type. Now, to make use of this jwt type, we've got to pass a couple of new options to PostGraphile. First, we need to tell PostGraphile what composite type to use when generating the JWT. To do this, I'll pass in the jwt-token-identifier option, then I'll pass in the name of the new type we just created. Next, we need to provide a secret passphrase that PostGraphile will use when it signs the tokens upon user sign-in, and it'll use this secret to verify the tokens haven't been tampered with when the server receives GraphQL queries from authenticated users. Okay, I'll go ahead and run our migration, then I'll kill the server and restart it. Now we can finally create a function that handles user sign-ins, or authentication. So I'll create a new migration, and I'll open up the migration files. Next, in the up migration, I'll create a function in the learn schema named authenticate, which gets passed a user email and password. The function will return the jwt type we just created. Then I'll declare an account variable of type person_account. Now I'll select star into the account variable from the person_account table where the email equals the email passed as a parameter. Here you'll notice that I'm qualifying the email with the function name to avoid ambiguity between the email parameter name and the email column name in the person_account table. Next, we'll check to see if the
provided password is correct by saying: if the account's password_hash equals the value returned by calling crypt, passing in the password and the account's saved password hash, then we'll return a new jwt value, which is created by keying an opening parenthesis followed by the role learn_person, followed by the person_id, followed by the expiration time as a unix timestamp, which is the current time plus 30 days, and lastly I'll cast this new composite value as the jwt type. However, if the password isn't valid, we'll just return null. Next, in the down migration file, I'll drop the authenticate function. Okay, I'll go ahead and run our migration, and now I'll create a new migration to assign permissions to the roles we created a few moments ago. In the up migration file I'll start by altering the default privileges to revoke the execute permission on functions from the public role; in a moment we'll selectively grant execute privileges on the functions we've created. Next I'll grant usage on the schema learn to the roles learn_anonymous and learn_person. Now I'll grant execute privileges on the function register_person to the role learn_anonymous; in other words, we're saying it's okay for anonymous users to call the register_person function. Next I'll do a similar thing for the authenticate function, allowing both the learn_anonymous and the learn_person roles to call it. Now I'll grant a similar privilege on the handle_score function, to only allow usage by signed-in users, which are people in the role learn_person. Next I'll grant all privileges on the table person to the role learn_person; in other words, people who are in this role can do inserts, updates, deletes, and selects on this table. Now I'll grant select and delete privileges on the table card to the role learn_person, and then on the next line I'll grant insert and update privileges on the id, question, answer, and archived columns of the card table to the learn_person role. I'm just going to add the final few grants, and then I'll walk
you through them. So, this line gives users in the learn_person role the ability to use the sequence here; basically, this sequence is what auto-generates the unique id when new cards are inserted. This line lets signed-in users do selects on the response table, and this line lets signed-in users do selects on the next_card view. Okay, hopefully the puzzle is starting to come together for you on why we created roles: roles allow us to control access to all the different parts of the database. Alright, in the down migration I'll essentially reverse everything I added in the up migration. Next I'll run the migration, and now we can go ahead and try out our sign-in functionality, which is to call the authenticate function. So in GraphiQL I'll create a new mutation named authenticatePerson, but this time I'm going to use GraphQL variables in the query. Let me show you what I mean by variables. After the name authenticatePerson I'll add a set of parentheses, then I'll define an email variable that starts with a dollar sign, and I'll say it should be of type String, and I'll add an exclamation point on the end, which means it's a required parameter. Then I'll do a similar thing for the password. Next I'll enter the authenticate field name, and I'll add the input parameters, where the email parameter is the value in the email variable and the password parameter is the value in the password variable. Then I'll ask for the JWT to be returned. So, are we done with this query? Let's try it. And it failed, as you probably expected, and we're getting an error response indicating the email and password variables weren't supplied, which of course is true; we didn't provide an actual email and password. But we can do that down here in the query variables panel. So I'll click on the label query variables, which expands the panel, then I'll add some JSON which includes the email key with the corresponding email address, followed by a password key and the password we used earlier when we registered this user. Now I'll
run the query again, and it worked, and what you see right here is the JWT. So what's the point of using GraphQL variables? Well, with variables you can reuse a query with different inputs without having to do any sort of string concatenation. Let's take a look at the JSON Web Token that was returned. To help you better understand what this JWT is, I'm going to copy it to the clipboard, then I'll open up the website jwt.io, then I'll paste the JWT into this input, and I want to draw your attention to this section over here. Basically, this site decodes the JWT into its three component parts: the header, the body, and the signature. The interesting part is the payload here, or what's often called the claims, which as you can see includes the role, the person_id, the expiration timestamp, and a few other claims that PostGraphile added on its own. Now, when a signed-in user makes a request to the server, they'll send the JWT along with the request, typically as an authorization header in HTTP requests. Then the server will verify the token hasn't been tampered with, and the server can use the values in the payload as needed for security purposes. Okay, let's try using this JWT in GraphiQL. To include the JWT with GraphiQL requests, I'm going to use the ModHeader chrome extension. I'm going to get rid of this other JWT that I was using for something else, and I'll paste in our new JWT, then I'll click the checkbox to add this authorization header. Next I'll add a card query, and actually, let me remove the authorization header first and run the query, and as you can see from the response, permission was denied. So I'll go ahead and add the authorization header back and run the query again, and this time it worked, which is good. But do you notice any problems in this response? Let me give you a hint by adding in the person_id field and running this query again. So do you see the problem now? Basically, we're running this query as this signed-in user with a person_id of 7, but we're seeing
cards from other users, which isn't good. Each user should only be able to see and manipulate their own cards. However, this can be easily addressed by enabling a feature in Postgres called row-level security. I'll show you how this works by creating a new migration. Then, in the up migration file, I'll alter the table card to enable row level security. Next I'll create a policy named select_card on the card table for select SQL commands, and I'll apply this policy to the role learn_person using the following boolean logic: I'll say selects are only allowed when the person_id of each row equals the following expression. Okay, so let me explain this expression. When PostGraphile receives a JWT, it temporarily saves the payload, or claims, of the JWT to the database connection session, and this current_setting call allows us to access the JWT's claim values. So basically, when this JWT is sent to the server, which has this person_id set to 7, we can access the person_id like this. So essentially, with this policy, a given user can only see rows in the card table that have their own person_id. I'll add similar policies for updates, deletes, and inserts. The only significant difference between these policies is that the insert policy uses with check instead of using. The difference here is that selects, updates, and deletes have existing saved row values to check against, so these checks happen before the corresponding SQL command occurs. But since inserts don't already have row values to check, what happens is the insert occurs, then this boolean expression is checked, and if it fails, the insert fails. Now, in the down file, I'll simply reverse everything I did in the up migration file. Okay, I'll go ahead and apply the migration, and I'll run this query again, and this time the query works, but we don't see any of the cards, because the existing cards don't belong to the signed-in user. Alright, let's go ahead and add a new card using the following mutation. Actually, do you think this mutation will work? Let's try
and see. It failed, and it's complaining about a violation of the row-level security policy. So here's the problem: when we're running this mutation, the person_id isn't being added to the card, so the insert policy is failing. Okay, so how do we get the person_id column populated when creating a new card? Well, it's not a value we'd want to send as a parameter, because we just can't trust users on something like this. However, there is a way of handling this using database triggers. Let me show you how. I'm going to create one final migration, then I'll open up the migration files, and in the up file I'll create a function in the learn_private schema named set_person_id, which returns a trigger. Then in the body of the function I'll use the new keyword, which represents the new row to be inserted, followed by a dot and person_id, which I'll set to the person_id from the JWT, which again you can access by calling current_setting and passing in a string representation of the claim value you're wanting, so jwt.claims.person_id. Then I'll just return new and close out the function. Next I'll create a trigger named card_person_id, then I'll say before insert on the table card, for each row, execute the procedure we created up here, learn_private.set_person_id. Now in the down migration file I'll simply drop the trigger and the function. Okay, let's apply this migration, then we'll try running the create card mutation again, and this time it worked, cool. Now let's see if we can get any data from the card query, and we do; now we get the one row we just inserted, cool. Okay, I think we're mostly done, so the next question you might have is: how do we take GraphQL queries like these and use them in applications, like our flashcard app or a mobile app or a native app? Well, I haven't really pointed this out, but when you start PostGraphile, you see the API endpoint it exposes right here. In our case, it's an HTTP endpoint that accepts POST requests, but if we were using the real-time features of PostGraphile, it would also
accept websocket connections as well, and if you use PostGraphile as Node.js middleware, you could potentially accept requests in other ways, such as with HTTP GET requests. Now, since PostGraphile is just using HTTP for transport, you could send requests similar to how you might make REST API calls, by using the fetch API or a library like axios. However, I'd suggest looking at some of the more popular GraphQL client libraries, like Apollo, which adds all sorts of nice client-side features. Similarly, for mobile apps and other types of native apps, you'd likely use one of the popular client-side libraries. So, what do you think about PostGraphile? I think it's a pretty clever use of Postgres that offers excellent functionality and allows you to build your backend in a fraction of the time it would take with modern programming languages and frameworks. Oh, and keep in mind, there are a ton of features I didn't cover, like the real-time capabilities of PostGraphile. So here's my challenge to you: the next time you're in a hackathon, or implementing a prototype, or creating a side project, or just starting a new project at work, consider using PostGraphile. I'm pretty sure you'll save yourself a ton of time, and you'll probably really enjoy using it. Additionally, if you end up using PostGraphile, I'd urge you to strongly consider supporting the open source project financially, either personally or through your employer. This project is mostly maintained by Benjie Gillam, and your financial support allows him to spend less time contracting and more time working on PostGraphile. A couple of final words: if you like my teaching style, you should check out my courses at knowthen.com, and of course it would be great if you subscribed to my YouTube channel; don't forget to click the notification bell. Thanks for watching, and I'll see you next time.
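To make the smart-tag steps from the walkthrough concrete, here is a sketch of the kind of comments described in the video. The schema, table, and column names follow the video; the descriptive text after the tag is an assumption:

```sql
-- Column-level smart tag: hide review_after from create and update mutations.
comment on column learn.card.review_after is
  E'@omit create,update\nWhen the card should next be reviewed.';

-- Table-level smart tag: expose the response table read-only via GraphQL.
comment on table learn.response is
  E'@omit create,update,delete\nA log of the scores users gave when reviewing cards.';
```

The `E'...'` string syntax lets the newline (`\n`) separate the smart tag from the human-readable part of the comment, as mentioned in the video.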
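The function-anatomy discussion (the `$$` delimiters, the language clause, `immutable`, and `security definer`) can be illustrated with a minimal skeleton. The real score_response body is app-specific and not shown in the transcript, so the composite type's fields, the parameter names, and the body here are all assumptions, kept only to show the structure:

```sql
-- Composite return type (field names are assumed for illustration).
create type learn.score_response as (
  score integer,
  days_to_next_review integer
);

create function learn.score_response(streak integer, spacing integer, score integer)
returns learn.score_response as $$
  -- $$ above delimits the start of the function body; the matching $$ below ends it.
  select (score, greatest(spacing * 2, 1))::learn.score_response;
$$ language sql immutable security definer;
```

The markers after the closing `$$` match what the video points out: the language, the `immutable` marker (a pure function with no side effects), and `security definer`.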
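The next_card view described in the walkthrough might look roughly like this (names follow the video; note that, as described, it filters only on review_after, with per-person filtering handled later by row-level security):

```sql
create view learn.next_card as
  select card.*
  from learn.card
  where review_after <= now()
  order by review_after, id;
```

PostGraphile then exposes this view as a query field just like a table, which is how the nextCards query in the video works.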
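The filter query built with postgraphile-plugin-connection-filter might look something like the following. The field and enum names assume PostGraphile's default inflection, and the timestamp is a placeholder:

```graphql
query NextCard {
  cards(
    filter: { reviewAfter: { lessThanOrEqualTo: "2020-10-14T00:00:00Z" } }
    orderBy: REVIEW_AFTER_ASC
    first: 1
  ) {
    nodes {
      id
      question
      answer
    }
  }
}
```

The `filter` argument is what the plugin adds; `orderBy` and `first` are built into PostGraphile's connection fields.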
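The register_person function described in the video might be sketched as below, including the pgcrypto extension it depends on. Column names on learn.person are assumed from context:

```sql
-- bcrypt hashing via crypt()/gen_salt() comes from the pgcrypto extension.
create extension if not exists pgcrypto;

create function learn.register_person(
  first_name text, last_name text, email text, password text
) returns learn.person as $$
declare
  person learn.person;
begin
  insert into learn.person (first_name, last_name)
    values (first_name, last_name)
    returning * into person;

  -- This insert touches the private schema; it works for untrusted callers
  -- because the function runs as its creator (security definer).
  insert into learn_private.person_account (person_id, email, password_hash)
    values (person.id, register_person.email, crypt(password, gen_salt('bf')));

  return person;
end;
$$ language plpgsql strict security definer;
```

`strict` makes the function return null if any argument is null, and `security definer` is what lets the second insert reach learn_private, exactly as discussed above.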
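The roles migration might look like this (the login password is a placeholder, not the one used in the video):

```sql
-- The account PostGraphile connects as.
create role learn_postgraphile login password 'change_me';

-- "Security group" for requests with no JWT.
create role learn_anonymous;
grant learn_anonymous to learn_postgraphile;

-- "Security group" for authenticated users.
create role learn_person;
grant learn_person to learn_postgraphile;
```

The two grants are the "anything this role can do can also be done by learn_postgraphile" statements from the walkthrough.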
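The jwt composite type and the authenticate function might be sketched as follows. The expiration field is named exp here because that is the claim name JWTs conventionally use for expiry:

```sql
create type learn.jwt as (
  role      text,
  person_id integer,
  exp       bigint
);

create function learn.authenticate(email text, password text)
returns learn.jwt as $$
declare
  account learn_private.person_account;
begin
  -- Qualify with the function name to disambiguate parameter vs column.
  select a.* into account
  from learn_private.person_account as a
  where a.email = authenticate.email;

  -- crypt() re-hashes the supplied password using the stored hash as the salt.
  if account.password_hash = crypt(password, account.password_hash) then
    return ('learn_person',
            account.person_id,
            extract(epoch from now() + interval '30 days'))::learn.jwt;
  else
    return null;
  end if;
end;
$$ language plpgsql strict security definer;
```

PostGraphile signs a value of this type into a JWT when you pass the type name via the jwt-token-identifier option along with a jwt-secret.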
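The permissions migration described above might look roughly like this. The function signatures, the sequence name, and the archived column name are assumptions based on the walkthrough:

```sql
-- Functions are no longer executable by everyone by default.
alter default privileges revoke execute on functions from public;

grant usage on schema learn to learn_anonymous, learn_person;

-- Anonymous users may register and sign in.
grant execute on function learn.register_person(text, text, text, text) to learn_anonymous;
grant execute on function learn.authenticate(text, text) to learn_anonymous, learn_person;

-- Only signed-in users may score cards.
grant execute on function learn.handle_score(integer, integer) to learn_person;

grant all on table learn.person to learn_person;
grant select, delete on table learn.card to learn_person;
grant insert (id, question, answer, archived),
      update (id, question, answer, archived)
  on table learn.card to learn_person;

-- The sequence that auto-generates card ids, plus read access to the rest.
grant usage on sequence learn.card_id_seq to learn_person;
grant select on table learn.response to learn_person;
grant select on table learn.next_card to learn_person;
```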
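The row-level security policies might be sketched as below. PostGraphile exposes the JWT claims via current_setting; the second argument (true) makes the call return null instead of erroring when no JWT was sent, which is an assumption about how the video's version behaves:

```sql
alter table learn.card enable row level security;

create policy select_card on learn.card for select to learn_person
  using (person_id = current_setting('jwt.claims.person_id', true)::integer);

create policy update_card on learn.card for update to learn_person
  using (person_id = current_setting('jwt.claims.person_id', true)::integer);

create policy delete_card on learn.card for delete to learn_person
  using (person_id = current_setting('jwt.claims.person_id', true)::integer);

-- Inserts have no existing row to check, so the expression is verified
-- after the row is formed, via WITH CHECK.
create policy insert_card on learn.card for insert to learn_person
  with check (person_id = current_setting('jwt.claims.person_id', true)::integer);
```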
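Finally, the trigger that populates person_id on insert, as described in the last migration, might look like this:

```sql
create function learn_private.set_person_id() returns trigger as $$
begin
  -- NEW is the row about to be inserted; fill its person_id from the JWT claim.
  new.person_id := current_setting('jwt.claims.person_id', true)::integer;
  return new;
end;
$$ language plpgsql;

create trigger card_person_id
  before insert on learn.card
  for each row
  execute procedure learn_private.set_person_id();
```

With this in place, the createCard mutation succeeds because the trigger sets person_id before the insert policy's WITH CHECK runs.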
Info
Channel: knowthen
Views: 6,142
Rating: 5 out of 5
Keywords: graphql, postgres, postgraphile
Id: eDZO8z1qw3k
Length: 58min 17sec (3497 seconds)
Published: Wed Oct 14 2020