ElixirDaze 2016 - Building and Testing API Endpoints with Phoenix by Brian Cardarella

Video Statistics and Information

Captions
So I'm going to talk about building APIs in Phoenix. First, my company: DockYard. We build Ember applications, and Ember is really the reason that brought us to Phoenix as a back-end technology. We started as a Rails shop, but as we were building out more and more complex client-side applications, we kept paying for performance optimization on the Rails applications, and, at least in our initial estimation, we found that those performance issues are really magnified in client-side apps. So we actually stopped doing back-end development for about a year and a half, and when Phoenix came out, or at least got on my radar, we pursued it very heavily, to the point where, and I don't know if it's true, but at least I've heard it's true, we were one of the first shops to come out and say we were betting on Phoenix as our back-end technology. We've continued to invest in that bet: we've released a lot of our libraries, which I'm going to be talking about today, and a couple of months ago we ended up hiring Chris McCord, and we have him working on Phoenix nearly full time. We look at that as protecting our investment, so that he can keep making Phoenix the best framework possible. This is our little version of our logo, Phoenix-ized, however you want to say it; there are stickers of it out in the hall, and some over here, if you want to grab some.

I think a few people in the audience follow me, so: yesterday I went on a bit of a rant on Twitter, going off on, basically, page objects in Ember, and it continued for quite a while. It was really fueled by the fact that I'd missed my flight and was sitting in the airport, but I eventually got to the point where I was grasping at straws and started attacking abstractions in general, and then my last tweet of the thread was that, ironically, I'd be going over all the abstractions we're writing at DockYard. So we'll be getting into that; apologies for that.

That's the end of my slides; I'm going to be in code for the rest of it. I'm going over a basic API that I built out last night, and it has some common components: account creation, authentication, a query API, and some relationship data. I'll show the library-less version first, and some of the pains that can come from that, and then we'll see the libraries that we've written and the impact they have on the code base.

I should start by saying that the application here is built with JSON API in mind. If you're unfamiliar with JSON API, it's a schema format that's really being championed by the Ember community right now, and I think it's starting to spread outside of it. It came about because the creators of Ember saw that all of these RESTful services were emitting different schemas, and if you're consuming all those different schemas, you have to write a custom adapter for every single one; whereas if you're building a JSON API service, the schema is already predefined. It's a spec, it's a standard, it's at 1.0 and all that good stuff; if you're interested, you can go to jsonapi.org to read more. So the API here is written with JSON API in mind, and some of the libraries we've written exist to enable us to write JSON APIs, but some of the concepts I'll be going over could be brought to other JSON-schema-style APIs. JSON API is a weird name to say, because it names a spec, but it's also, you know, a concept in general.

Anyway, we'll start right here in the router. The library we use to build out our JSON API is called JaSerializer. This is not a DockYard library; it was written by Alan Peabody at Agilion, up in Vermont, and it's a really nice serialization library. When we first started building JSON API back ends with Phoenix, it was the only one available; I think if you go on Hex there are a few more now. What I like about JaSerializer is that it tries to cover a bunch of edge cases; if you check out the JSON API spec, it's a very verbose spec. You plug JaSerializer into your API pipeline: content-type negotiation will detect that the Accept and Content-Type headers are correct for JSON API, and the deserializer will take the inbound request parameters and massage them to be easier to use in Elixir; all keys in JSON API are hyphenated, and JaSerializer will underscore them, that type of thing. We then pipe through the API pipeline into our API scope.
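As a concrete sketch, a pipeline along these lines is what the ja_serializer README describes; the application, scope, and controller names below are assumptions, not the code shown in the talk:

```elixir
defmodule MyApp.Router do
  use MyApp.Web, :router

  # Content-type negotiation plus deserialization of dasherized
  # JSON API keys into underscored params, per the ja_serializer README.
  pipeline :api do
    plug :accepts, ["json-api"]
    plug JaSerializer.ContentTypeNegotiation
    plug JaSerializer.Deserializer
  end

  scope "/api", MyApp do
    pipe_through :api

    resources "/users", UserController, only: [:create]
    resources "/sessions", SessionController, only: [:create]
    resources "/posts", PostController
  end
end
```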
The next thing you have to do, and a lot of this is covered in the JaSerializer README so I won't spend too much time on it, is configure Plug. I don't know if you can see the text down there, but you configure it to handle the json-api format type, and then you have to recompile Plug; all of that is covered in the JaSerializer README.

Next, creating a user. We want to keep as much code out of our controller as possible, at least to a degree, and when we're doing things like encrypting passwords, a really good place to encapsulate and handle that is the changeset. You can see here that our create function really only takes us down two paths in a case statement: the happy path, meaning our data insertion happened properly and we render the happy result, or the sad path, meaning the changeset failed for whatever reason and we render, in this case, a 422. Inside the changeset we handle our validations, but the nice thing is that after we're done with our validations, we can encrypt the password. On the actual model here we have a password_hash field, but we also have password and password_confirmation fields. Now, I know there are some newer authentication libraries that have come about, but we've continued to build our own custom authentication internally, at least until I feel those other libraries provide enough value for us not to do it anymore. The reason we build our own is that it's so simple in Elixir; it's actually really nice. There's a library called Comeonin, and we use Bcrypt: to create our password hash we just pass our password through the hashpwsalt function, but we only encrypt if our changeset is valid. So now that we have this, our user actually gets created.
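A minimal sketch of that changeset flow, assuming virtual password fields and the Comeonin Bcrypt API of this era; the field names are guesses, not the exact code on screen:

```elixir
defmodule MyApp.User do
  use Ecto.Schema
  import Ecto.Changeset

  schema "users" do
    field :email, :string
    field :password_hash, :string
    # assumed virtual: submitted with the request but never stored
    field :password, :string, virtual: true
    field :password_confirmation, :string, virtual: true
  end

  def changeset(model, params \\ %{}) do
    model
    |> cast(params, [:email, :password, :password_confirmation])
    |> validate_required([:email, :password])
    |> validate_confirmation(:password)
    |> encrypt_password()
  end

  # Hash only once every validation has passed.
  defp encrypt_password(%Ecto.Changeset{valid?: true, changes: %{password: password}} = changeset) do
    put_change(changeset, :password_hash, Comeonin.Bcrypt.hashpwsalt(password))
  end

  defp encrypt_password(changeset), do: changeset
end
```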
Now, if we want to authenticate a session, we have a, whoops, that's the wrong controller, we have a session controller. Again we're handling a happy path and a sad path, and what we've done is extract all of the authentication code out into a strategy called Authentication, which we just put into a strategies directory. If you're coming from other frameworks, other languages like Ruby: you can put pretty much anything into any directory in Phoenix. There's no requirement that your module name map back to a given directory structure, because it isn't doing something like loading a file by guessing its location from the class name; everything gets compiled when you run the server, so it's just available at runtime.

Our strategy here for authentication ends up being pretty simple. It allows us to build out authentication with different types: if we want to authenticate against, say, an incoming signup email with a token associated with it, we can just use Phoenix's, sorry, Elixir's pattern matching right up in the function definition. But here, what I like to do is grab the ID of the account, and we also make the authentication strategy polymorphic: we grab the type of the account by taking the struct name off of the account model, and we sign both into the session. The reason we do this is that we may have different models handling different types of authentication; if we have a regular user model, a client model, an admin model, we want to be able to differentiate between them, and we don't want to rely on the account ID alone.

Next, if we go back to the router for a second and look down here, we can see we have a pipeline for authorization. In this case we have a post that we want to create, but we only want to allow it if we have the rights to do so, meaning we're signed in as a user, and we've limited that to the create, update, and delete functions, sorry, actions. If we take a look at the pipeline right here, we're using the authorization strategy, and this is actually role-based authorization; you can pull it out of this repo, which is going to be available for everyone to check out if you're interested. What ends up happening is that I don't have to build a really ugly nested if statement with many different conditionals; I can keep things fairly flat by relying, again, on pattern matching, which ends up being really, really nice. It will go through here and check the role I'm authorized as; if it doesn't match, it returns an unauthenticated response and just halts the request. A lot of this is fairly simple as-is, and I'm sure those in the audience who have already built Phoenix applications are looking at this and saying, oh, we've done this already, this isn't too interesting.
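A sketch of what such a role-based plug might look like; the module name, session keys, and account modules are assumptions, but it shows how pattern-matched function heads keep the checks flat:

```elixir
defmodule MyApp.Strategies.Authorization do
  import Plug.Conn

  def init(roles), do: roles

  def call(conn, roles) do
    authorize(conn, get_session(conn, :account_type), roles)
  end

  # Each account type gets its own clause instead of a nested `if`.
  defp authorize(conn, MyApp.Admin, _roles), do: conn

  defp authorize(conn, MyApp.User, roles) do
    if :user in roles, do: conn, else: unauthorized(conn)
  end

  # No recognized account type in the session: not authenticated.
  defp authorize(conn, _other, _roles), do: unauthorized(conn)

  defp unauthorized(conn) do
    conn
    |> send_resp(401, "")
    |> halt()
  end
end
```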
But where it becomes a little more interesting is when we get into posts. One pattern that I really like, and that I've now used on two applications, is composable queries. In this case somebody hitting the index action can either say "I want all the posts" or provide a query param to say "I want all the posts for a given user ID," and because Ecto lets us build up the query object over time, by leveraging recursion we can iterate through all of the query parameters. The very simple version of it ends up being, what is this right here, less than ten lines. On line 8 we hit this build_query function, passing in the query parameters as a list so we can iterate over them properly. We call the build_query function again; the clause on line 16 is guarding against a map. Next it goes into line 21 and we pass in the query; the first time through, that's going to be the Post model, and it just iterates over the key-value pairs. This very simple version says, okay, for each key equals this value, we add a where clause onto the query, and we continue to iterate until we're done; then it goes back up to line 10. Because of what I'll be showing in a minute, the nasty nature of testing JSON API responses, we ended up putting this order_by in just to normalize the tests a little, until I show you the other libraries. Then we just call Repo.all. And this feels great; this feels like a really, really nice pattern to build off of. It felt so nice that I extracted it into a library called Inquisitor.

With Inquisitor, you pass in the model you're going to act upon, and during the compilation phase it writes into your module a function with that model name in mind; then you just call build_post_query and pass it the params. Inquisitor gives you a default function that maps keys to values, but there are more complex situations where you don't want plain key-value matching. Maybe you want a limit to be available through the query params, so a client can just say limit=5 or limit=10. Or, as we're doing here on line 16, you may want to query against the published date of the post: we're guarding on the keys being month and year, building out a date fragment in SQL, and adding that onto the query.

What this looks like in use, and that's probably really small, right, this may not look fantastic, is: I fire up my server, and with this very simple query string we get back everything we need. There are only three pieces of data in this database, and we just asked for all the posts in January 2016. But we can compose more onto it: I have a limit handler down here on line 22, I add limit to the query string, and now we only get one result back. So with Elixir's pattern matching and leveraging recursion, Inquisitor lets you build out a fairly complex query system very, very quickly.
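For reference, here is a self-contained sketch of the hand-rolled pattern that Inquisitor generalizes; the schema, field names, and param names are assumptions:

```elixir
defmodule MyApp.PostQuery do
  import Ecto.Query

  def build(params), do: build(MyApp.Post, Map.to_list(params))

  # Base case: no params left, the composed query is done.
  defp build(query, []), do: query

  # Custom clause: a limit supplied through the query string.
  defp build(query, [{"limit", limit} | rest]) do
    query |> limit(^String.to_integer(limit)) |> build(rest)
  end

  # Custom clause: match the month of the published date via a SQL fragment.
  defp build(query, [{"month", month} | rest]) do
    query
    |> where([p], fragment("date_part('month', ?) = ?", p.published_at, ^String.to_integer(month)))
    |> build(rest)
  end

  # Default clause: key equals value. Keys should be whitelisted upstream;
  # to_existing_atom at least avoids minting new atoms from user input.
  defp build(query, [{key, value} | rest]) do
    query
    |> where([p], field(p, ^String.to_existing_atom(key)) == ^value)
    |> build(rest)
  end
end
```

Inquisitor essentially generates the default clause for you at compile time; handlers like limit or month/year are the ones you add by hand.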
All right, let's move on to testing and how we're doing it. Let's check out what testing JSON API responses may look like if you're not using some of these libraries. We'll start with the nastiest one, which is posts. Not only does JSON API have a verbose response schema, it also has a verbose request schema. In order to send data to a JSON API endpoint you have to observe its schema, and this ends up being kind of painful, right? The object is a data object; nested within that is the type name of the object, and the attributes, and there may be an ID in there as well; and doing this over and over and over again becomes pretty monotonous. What's even worse is asserting the expected payload. If we have relationship data coming back, we have the original primary object, then we have the relationships embedded within that object, and we may be including the full related object, not just its metadata, which lives in a separate object called included. This is not a fantastic way to write it, and you can even tell, as we go further down, that my editor starts throwing a fit because it can't syntax-highlight it anymore. It's all valid syntax, but it's just become a blur; I think it's all blue, it's completely gone blue on me. Doing this is a huge, well, I shouldn't say a waste of time, testing is fantastic and you should be doing it, but it's not really helping you if you're killing yourself doing this.

So one of our engineers, Dan McClain, wrote a library a few months ago called Voorhees; if you're familiar with Jason Voorhees, the masked killer, that's where the name comes from. Yeah, I know, we're exploring alternative names. I think a lot of this shows how we're evolving and learning how best to write libraries in Elixir; a lot of us are coming from different ecosystems, and it takes a little time to move your mindset from best practices in some other library, framework, and language over to Elixir and Phoenix. The first pass of Voorhees, actually the current pass, because what I'm going to show you is very experimental, is that you pass in just some expected attributes, and it's a somewhat nicer API for doing something as verbose as this. But the more Elixir I write, the more I want what's right here on line 13: I want piping, I want composability, I want something that's very simple but also very powerful, where I can just pass things through without any issue.

So the updated version is now this: rather than building up those huge JSON objects in our tests and doing a straight equality assertion, there's a nicer API with assert_data and assert_relationship functions. What assert_data does is: you pass in your model and the payload, and, that's actually not correct, this should be, it's assigned to the conn variable but it should really be called payload, because json_response on line 24 returns the body of the response. It takes that body and iterates through the data object to find whether there's a data object matching the model you passed in. It makes some assumptions for you: it tries to work out what the model's primary key is and find the value for it, and it also tries to find the type, which it may pull off the struct or may ask Ecto for help with, depending on what you pass in. Then it coerces the data segment of the response to a list, iterates through it, and looks for a corresponding data object with the same ID and type. At that point it iterates through all the attributes inside that data object and makes sure all the corresponding fields on the original model have the same values; if that's the case, it's happy. assert_relationship goes and finds that same data object, then checks whether the relationship metadata, in this case for user one, exists. So here we'd expect the JSON API response from the post endpoint to also tell us which user is associated with the post. This is a huge improvement over the previous version and ends up being a really, really nice API to work with.
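Pieced together from the description above, a test using these assertions might read roughly like this; since assert_data and assert_relationship live in a Voorhees pull request at this point, the exact signatures, and the relationship option used here, are guesses:

```elixir
defmodule MyApp.PostControllerTest do
  use MyApp.ConnCase
  import Voorhees

  test "index returns posts with their author", %{conn: conn} do
    # These records would normally come from fixtures or a setup block.
    user = Repo.insert!(%MyApp.User{email: "a@example.com"})
    post = Repo.insert!(%MyApp.Post{title: "Hello", user_id: user.id})

    conn
    |> get(post_path(conn, :index))
    |> json_response(200)
    |> assert_data(post)
    |> assert_relationship(user, for: post)
  end
end
```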
In addition to that, and I'm not sure if you saw it before, so I'll bring it up again: dealing with authentication and authorization during your test suite can be a little hairy as well. Here, on line 31, we're posting to the session path, creating our session; then we have to recycle the connection object, and the connection object, I don't know if this is a bug or not, loses the content type after the post to the session, so I have to put the content-type request header in again here. So we have another library that we've written, right here, called AuthTestSupport. AuthTestSupport gives you some basic authentication and authorization test helpers to ease the pain of doing all this boilerplate over and over and over again.

The first of these, right here on line 22, is authorized_as, and it will just authorize you as a given user; in this case it simply sets the assigns object on the session. What our authentication and authorization strategies actually do internally is first try to return the assigns object; if it exists, that's the account we're authenticated as; otherwise they fall back to the account ID and account type and try to authenticate against those. In addition, we also get this nice macro called require_authorization. require_authorization expands out to a big test, and what it saves you is the boilerplate of testing for 401s on pretty much every action you don't want to allow access to, whether you're not authorized as any user or you're authorized as a user who shouldn't have access to that given route. By default you can just write it like this and it will run against all the regular RESTful actions; you can pass it :only, you can pass :except, and you can also pass different roles. Let's say you had an action that only admins should be able to access: you want to test it when not authenticated in any state, and when authenticated as a regular user. In that case you would write something like no_auth and then auth. In this keyword list, the atoms on the left-hand side are purely for documentation; there's no magic about no_auth, I just happened to use that name, and there's no magic about auth either. The value I assign to the auth key is a function, I'll just call it auth in this case, and when the macro is compiling and building out the test, it will take the connection object and send it to that function, so you can do whatever you want with the connection at that point, before it hits the API endpoints it's testing against. So we might have something like: conn goes in, and it does auth_regular_user(conn), something like that, and returns the connection object. That function clearly doesn't exist, but I think you get the point.
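An illustrative sketch of how this might look in a test module. The helper names authorized_as and require_authorization come from the talk, but the option names (:only, :no_auth, :auth) are stand-ins, so treat the shape as approximate:

```elixir
defmodule MyApp.PostAuthTest do
  use MyApp.ConnCase
  import AuthTestSupport

  # Expands to a single test asserting 401s for each listed action,
  # once per role in the keyword list. The keys are documentation only;
  # each value is a function the conn is passed through first.
  require_authorization :post_path,
    only: [:create, :update, :delete],
    no_auth: nil,
    auth: fn conn -> authorized_as(conn, %MyApp.User{id: 1}) end
end
```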
The other thing we've done with it, being mindful of compilation speed: it does not pump out twenty different tests, it pumps out one test. You may not be aware, but when you're building macros that are then emitting macros, it's actually slow, so require_authorization builds one kind of monolithic test that iterates through all of this for you. Where we get away with it is in ExUnit's assertion messages: if something fails for, say, the update action, whether the route doesn't exist or it doesn't return the unauthorized status code, it will tell you properly in your test suite. So that's require_authorization.

I'll go down to, and you don't have to remember it from a few minutes ago, but the actual querying tests were huge, like forty or fifty assertions, close to a hundred lines of code, and we've cleaned that up into probably less than ten lines here. What's really nice is that when we're querying against sets where we want to make sure certain data is not returned, we also get refute functions alongside Voorhees's assert functions. Here we're querying against the user ID, and we want to make sure we only get posts back that belong to user ID 1, so we refute the data of post 2, because it's associated with user ID 2.

On the model side, this one's more complex: we've written a library called ValidField, and ValidField lets you validate your changesets. We take the original model struct and pass it through this with_changeset function; it assumes, if you don't pass any arguments, that the regular Model.changeset function is the one you're using for your changesets, but you can customize that and pass a reference to whatever changeset function you need. The idea is that this style works really well for unit testing your validations, because we don't actually care which validation is satisfying a condition, and you shouldn't care either; what we care about is the behavior driven by the changeset. Something like a format (pattern-matching) validation and an inclusion or exclusion validation may satisfy the same condition, and it's up to you which one you use. So on line 11, for email, we care that example.com is a valid state, and that nil, empty string, and foobar are invalid states. We did an application where we weren't allowed to accept users with military email addresses; in that case we'd just add a military email address to the right-hand side, run the test suite, watch it fail, and then add a format validator ensuring we don't accept anything with a .gov or .mil suffix on the email address. In addition, there may be situations where you need to set up data before testing, so on line 18 there's a put_params that lets us inject some data into the changeset; that way we can test something like password confirmation.
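A sketch of the ValidField flow just described, with assumed field values; put_params is the injection step mentioned above, and the exact signatures may differ from the released library:

```elixir
defmodule MyApp.UserValidationsTest do
  use ExUnit.Case
  import ValidField

  test "email accepts real addresses and rejects junk" do
    %MyApp.User{}
    |> with_changeset()
    |> assert_valid_field(:email, ["user@example.com"])
    |> assert_invalid_field(:email, [nil, "", "foobar"])
  end

  test "password confirmation must match the injected password" do
    %MyApp.User{}
    |> with_changeset()
    |> put_params(%{"password" => "secret123"})
    |> assert_valid_field(:password_confirmation, ["secret123"])
    |> assert_invalid_field(:password_confirmation, ["different"])
  end
end
```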
Where this pattern doesn't work, and I haven't come up with a good solution, is against database constraints. In Ecto, things like uniqueness and other constraints can be mapped back to constraint error messages, so you can capture those messages and emit and handle them properly. But at the database level you can't target which constraint you want to fail in which order, at least in Postgres; I don't know whether that's the case in other databases. Say you have a list of three different database constraints you're checking against, and you want to assert that the uniqueness constraint works. Conceptually it's easy to set up, right? You just create an existing record in the database with the same email address, run it through here, and you should capture the error. In reality, if the uniqueness constraint isn't the first one the database fires, it hits another constraint and just completely blows up. We could do something a little hairier and set up all the other values so they satisfy the other constraints, but that's not a clean solution, and I like clean solutions, so I'd be interested to know if anybody has ideas around that.

Finally, there's another library we've written called EctoFixtures, and I know there are a few other test-data harness libraries out there. I've been writing EctoFixtures for the past couple of months, and it does some things well and some things not so well; in particular its current API for generating data is not fantastic, but I consider that a nuance to be solved. The part I think it does really, really well is handling complex data sets. Whereas other libraries may insert data one record at a time as it's accessed, EctoFixtures collects all the data you're looking to insert and then analyzes it to determine whether any ordering needs to happen. For example, if you're using relationship constraints in your migrations, you may have foreign-key constraints in the database, and with other libraries you may have to manage the ordering yourself, making sure the parent is inserted before the child. EctoFixtures doesn't care: it sets up a directed acyclic graph internally, analyzes the relationships between all the data points, and makes sure that children are always inserted after their parents. That makes that pain much, much easier. In addition, in the current version we're leveraging test tags: we can just write @tag fixtures: and name the fixture files we want to pull from, and over here it's always injected as a data key on the test context.
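A sketch of that tag-driven flow; the fixture name and the shape of the injected data are assumptions here:

```elixir
defmodule MyApp.PostIndexTest do
  use MyApp.ConnCase

  # EctoFixtures inserts everything named by the tag (parents before
  # children) and hands the records to the test on the :data key.
  @tag fixtures: :posts
  test "lists every fixture post", %{conn: conn, data: data} do
    conn = get(conn, post_path(conn, :index))
    assert length(json_response(conn, 200)["data"]) == length(data.posts)
  end
end
```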
Now, Elixir 1.3, I'm not sure when that's due out, but I actually made some commits to ExUnit to allow module attributes to be collected in addition to just @tag. If you're not familiar with module attributes, you've seen them right there: @tag is a module attribute. The @tag attribute works in ExUnit because, during the compilation phase, whenever the test macro collects it, it deletes the @tag module attributes that currently exist; that's why this tag can be completely separate from that tag. But in addition, module attributes can be set to accumulate, so you could do @foo bar here and again there, and then within setup, or on the context object, you'd be able to grab that data, all accumulated properly. So what will happen in Elixir 1.3, and EctoFixtures will take advantage of it, and hopefully other libraries will use it too, is that we're going to have a @fixtures module attribute, and we'll be able to do this and keep accumulating it. If you want to do @fixtures dogs, we can then pass options off of that to customize what the data set is. We can't currently do that here: if we did @tag fixtures: dogs, it would blow away the previous value of fixtures, because we can only insert new keys. So that's coming; actually, I don't know when it's coming, but hopefully soon, because it'd be pretty cool.

How much time do I have? Oh, okay. So these are some of the libraries we've written at DockYard, which I've been pulling out and extracting like a madman over the past week to get them ready. They're in very early development, but I think they're good enough to start playing with; in fact, some of them only exist in PRs right now: assert_relationship and all the assert_data stuff in Voorhees exists within a pull request, and the tag-module-attribute stuff for EctoFixtures exists within pull requests. We're at an early stage right now for Phoenix development. I hear from a lot of people who are interested in running Phoenix applications, but they're coming from other frameworks and places where it's safe for them to be, right? They have this huge ecosystem of tools. The lessons we're learning at DockYard, we want to share with people; one way we can really make Phoenix successful is if everyone's doing this, if everyone's communicating, coming back, and trying to build the best tools possible. But we also should not always be trying to go back and reinvent what exists in other languages. One of the lessons I've taken away from this work is that composability is king; composability is really one of the key features I like in the libraries we're building out. However, you should not feel the need to force composability: when you're building out libraries and abstractions, you should keep composability in mind for what the API may look like, but if you feel like you're forcing everything down composability's throat, then perhaps it's not the right thing to do.

I'll end with this: links to everything. The repo I was showing off, if you want to go through it, dissect it, pull out anything you want to use, go for it; it's on my personal GitHub, bcardarella, it's right on there. And the other libraries are here: JaSerializer, Inquisitor, EctoFixtures, AuthTestSupport, ValidField, Voorhees, and then Comeonin for the encryption. With the exception of JaSerializer and Voorhees, they're all available in the DockYard organization. That's all I've got; thank you very much.
Info
Channel: Confreaks
Views: 7,659
Rating: 4.9058824 out of 5
Id: zoP-XFuWstw
Length: 37min 48sec (2268 seconds)
Published: Wed Mar 16 2016