Building a Course Platform in a Weekend - Mahmoud Abdelwahab - (Next.js Conf 2021)

Captions
- There are many ways to make money online, and one way is to sell your knowledge. Maybe you have deep expertise in a certain topic, or you just have a unique perspective. So you decide, you know what? I'm gonna write a book, film a video course, or do both. Then you go to a course hosting platform, upload your content, and start selling. Now, this approach works. However, as developers, we always get this urge to build our own solutions. This is me acting on that urge. Now, to not fall into the trap of building my own course platform and never releasing my course, I gave myself a deadline: one weekend. One weekend to build the following features. Users should be able to view free content on my website, so they can get familiar with my style and know what to expect from every piece of content that I publish. They can also see the landing page for my course and click buy, which redirects them to a checkout page where they can enter their credit card info. They can then use the same email they used during checkout to log into the platform and view protected content.

Now, to build this project, we'll be using a lot of tools. The first one is Next.js. In case you don't know what Next.js is, it is a full-stack framework that has a fantastic developer experience. It allows you to build your frontend using React, and it makes it super easy to build your API. It has a file-system based router, so each file in the pages folder is automatically a route. There's also support for dynamic routes. So for example, when you're building a blog, you want a page that shows each individual post. You can have a single page that displays different content depending on the URL params, which is super neat. Now, to build your API using Next.js, each file in the pages/api folder is automatically an API endpoint. What's also cool is that when you deploy Next.js, each one of those endpoints becomes a serverless function (there's a small sketch of this below), so you can scale your app very easily without needing to do much work.

Now, the final feature that we'll be leveraging when building this course platform is static generation. Next.js supports different data fetching strategies. You can fetch data at build time, so you serve your users just static HTML, which is super fast and can be cached. There's also server-side rendering, so with each request the page gets rendered on the server and then sent back to the user; this is great for SEO. And you can also do client-side rendering. Maybe you have a page where it's not super important for data to be kept up to date, like an admin dashboard, and it's fine if you just show some loading spinners. For the course platform, we'll be leveraging all of these depending on the type of page we want to display, but the main focus is on static generation.

Now, since I have control over my platform, I don't just want a regular blog. I want a blog with interactive components, and this can be done by using MDX. MDX is markdown for the component era. It allows you to embed JSX code in your markdown files, and this JSX actually gets rendered. So you get the awesome authoring experience of writing markdown, but at the same time you can have interactive components in your blog posts. Maybe it's a data visualization, or a component where readers can tweak values. The possibilities are endless.
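To make the API-routes point above concrete, here is a minimal sketch of a hypothetical pages/api/hello.ts file. The file name and response body are not from the talk; they just illustrate how a file under pages/api becomes an endpoint (and, when deployed to Vercel, a serverless function).

```ts
// pages/api/hello.ts: a hypothetical example, the file path is the route (/api/hello).
import type { NextApiRequest, NextApiResponse } from "next";

export default function handler(req: NextApiRequest, res: NextApiResponse) {
  // When deployed, this handler runs as its own serverless function.
  res.status(200).json({ message: "Hello from /api/hello" });
}
```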
This can help you craft a better user experience where readers can develop a deeper understanding of a certain topic, or help you communicate your ideas better. Now, to add support for MDX, I'll be using next-mdx-remote. The way it works is that I'll have a folder in the root of my project called content, and inside of it there are two folders: one that contains the public posts, so my blog, and another folder that contains the lessons that will be behind the paywall, where users have to pay so that they can log in and view them.

Now, in the pages folder I have a file called blog.tsx, and this file is responsible for rendering the page that displays the list of all blog posts. The way it works is I'm using getStaticProps, fetching all posts from the file system, and passing them as props. Then I'm just mapping over all the posts and displaying them one by one: the title, the description, and the categories. For the individual blog post, the logic is actually almost the same. I'm using a dynamic route, so I have a folder called blog, and inside this folder there's a [slug].tsx file. In this file I'm also using getStaticProps, but the difference is I'm grabbing the slug from the URL params and then getting the data for the requested blog post. And this way we have a blog. This blog is just public right now, anyone is able to read the content, and it's built using MDX, so we can embed React components if we want (a sketch of this page follows below).

Now, when it comes to styling, there are a lot of options out there. You can use CSS files, you can use Sass, you can use a CSS framework like Bootstrap, you can use CSS-in-JS like styled-components, or you can use a utility-first framework like Tailwind. Personally, I'm a big Tailwind CSS fan, and in case you're not familiar, here's how it works: you build your UI by combining utility classes. A utility class is a single-purpose class that does just one thing. So for example, you can have a class that adds margin, a class that adds border radius, a class that adds box shadow, and so on and so forth until you cover all CSS properties. There are many. Now, in the beginning my experience with Tailwind wasn't smooth at all. I had been using CSS-in-JS for a while, and just seeing how Tailwind works was like, oh, I need to learn a lot of stuff, and they're ruining my HTML. But after I actually got used to the API, I've gotten so much more productive, and this is for multiple reasons. The first reason is that I don't need to worry about naming classes anymore, because sometimes I just sit there thinking, okay, how should I name my classes? Also, onboarding is just, there's no onboarding. If I see a project that uses Tailwind, I can immediately contribute because I'm familiar with the API; there's a convention. And the final thing, and to me this is one of Tailwind's best features, is that it has great defaults. Out of the box you have a design system. This means you have consistent values for things like margin, padding, font sizes, colors, everything. But at the same time, there are no constraints. You can configure everything: the generated classes, the values, it's just super flexible. And to me, I don't see myself writing CSS any other way.
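Here is the kind of [slug].tsx page referenced above, as a minimal sketch using next-mdx-remote. The content/posts path, the .mdx extension, and the bare-bones rendering are assumptions; the real page also deals with frontmatter, categories, and custom MDX components.

```tsx
// pages/blog/[slug].tsx: a minimal sketch of the dynamic MDX blog post page.
import fs from "fs";
import path from "path";
import type { GetStaticPaths, GetStaticProps } from "next";
import { serialize } from "next-mdx-remote/serialize";
import { MDXRemote, MDXRemoteSerializeResult } from "next-mdx-remote";

type Props = { source: MDXRemoteSerializeResult };

// Assumed location of the public posts inside the "content" folder.
const POSTS_DIR = path.join(process.cwd(), "content", "posts");

export const getStaticPaths: GetStaticPaths = async () => {
  // One path per .mdx file in the public posts folder.
  const slugs = fs
    .readdirSync(POSTS_DIR)
    .filter((file) => file.endsWith(".mdx"))
    .map((file) => file.replace(/\.mdx$/, ""));
  return { paths: slugs.map((slug) => ({ params: { slug } })), fallback: false };
};

export const getStaticProps: GetStaticProps<Props> = async ({ params }) => {
  // Grab the slug from the URL params and load the matching MDX file.
  const slug = params?.slug as string;
  const raw = fs.readFileSync(path.join(POSTS_DIR, `${slug}.mdx`), "utf8");
  const source = await serialize(raw);
  return { props: { source } };
};

export default function BlogPost({ source }: Props) {
  // MDXRemote renders the compiled MDX, so JSX embedded in the post actually runs.
  return <MDXRemote {...source} />;
}
```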
So now we have a working blog that supports interactive components thanks to MDX, and the styling is taken care of using Tailwind. Now it's time to add authentication, and for that I'll be using NextAuth, which is an open-source solution that makes it super easy to add authentication to Next.js apps. There's support for many authentication providers, so you can log in with Google, GitHub, Facebook, Twitter, Discord; most likely NextAuth has it or will eventually add it. You can also bring your own database, so you can have session-based authentication. And finally, what's super cool is that there is support for passwordless authentication using magic links: a user enters their email, and they're sent an email with a login link they can use to access the platform. And this is exactly what I need.

Now, to set up NextAuth, we're going to create an API endpoint that handles all the authentication logic for us: things like signing in, signing out, sending users the emails that contain login links, and handling other authentication providers, because we can have multiple ones. To do that, in the pages/api folder we'll have a new folder called auth, and in this folder we'll have a file where we configure NextAuth (sketched below). In this file, the first thing I'm doing is importing NextAuth from next-auth, and then I'm doing a default export, calling NextAuth and passing in a configuration object. I'm then specifying a providers array. This is an array of authentication providers for signing in, and you can have multiple ones when you include them in this file. So you can have a GitHub provider: you import it and include it, and you also need information like the client ID and the client secret.

Now, for setting up passwordless authentication using email, NextAuth has two requirements. The first is an email sending service, so that we can actually send the emails that contain the login links, and the second is a database. For the email sending service, you can use whatever service you want as long as it supports the SMTP protocol, which is the protocol used for sending emails. I'm using Twilio SendGrid: I created an integration, specified that it's SMTP, and grabbed the following information: the host and the port; then I created a new password and passed in the username. I also specified a from email. I used my own personal email, but you could use something like noreply@yourdomain.com.

So far, authentication will actually not work because we haven't included a database yet. So we're going to do that first and then come back to this authentication portion. When it comes to working with databases, I've always been intimidated by them. I started out as a frontend developer, so this layer of the stack was just so abstract to me. Which database should I pick? What are the differences between each database? How do I actually work with these databases, and how can the design of my database evolve over time? I had so many questions and things were very confusing to me, until I started using Prisma. Prisma is a next-gen ORM that makes it easy to work with databases. If you're not familiar with what ORMs are, ORMs, Object-Relational Mappers, allow you to use a programming language to work with the database, instead of using the database's native query language.
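Circling back to the NextAuth configuration file described above, a minimal sketch could look like the following. This assumes next-auth v4-style provider imports (older versions used Providers.Email and Providers.GitHub instead), and the environment variable names are placeholders.

```ts
// pages/api/auth/[...nextauth].ts: a minimal sketch of the NextAuth configuration.
import NextAuth from "next-auth";
import EmailProvider from "next-auth/providers/email";
import GitHubProvider from "next-auth/providers/github";

export default NextAuth({
  providers: [
    // Passwordless sign-in: NextAuth emails the user a magic login link
    // through any SMTP service (SendGrid in the talk). Env var names are assumed.
    EmailProvider({
      server: {
        host: process.env.EMAIL_SERVER_HOST,
        port: Number(process.env.EMAIL_SERVER_PORT),
        auth: {
          user: process.env.EMAIL_SERVER_USER,
          pass: process.env.EMAIL_SERVER_PASSWORD,
        },
      },
      from: process.env.EMAIL_FROM, // e.g. noreply@yourdomain.com
    }),
    // An optional extra provider, as mentioned in the talk.
    GitHubProvider({
      clientId: process.env.GITHUB_ID as string,
      clientSecret: process.env.GITHUB_SECRET as string,
    }),
  ],
});
```

As the talk notes, the email provider won't actually work until the database and adapter are wired up, which is where it goes next.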
Now, the added benefit is that you don't need to constantly context-switch between two different languages when you're building your backend. For example, you're writing your business logic in JavaScript or TypeScript, and you don't need to go, oh, now I need to think in SQL. You can just keep writing code, and this makes you more productive. Plus, since you're working with a programming language, the editor is on your side, so you get things like auto-completion and type safety, and the overall developer experience when working with ORMs is better than just using the database's native query language. Prisma has support for different database providers: PostgreSQL, MySQL, SQLite, Microsoft SQL Server, and MongoDB, which is in preview. Which database I'll pick, I'll go over that in a bit, but first let's add Prisma to the project.

The first thing you need to do is install the Prisma CLI as a development dependency. To do that, just run npm install prisma --save-dev. After that, you can use the prisma init command, which will set up a basic Prisma project for you, so just run npx prisma init. In the root folder of your project, you'll find a newly created folder called prisma with a schema.prisma file in it. This is where you define your data model and design the schema of your database. This file uses the Prisma schema language, so to get things like syntax highlighting, auto-formatting, and auto-completion, make sure you install the Prisma extension for VS Code.

Now, again, the reasons we need a database are, A, to store the information for users who purchase the course, and B, because it's a requirement for having email authentication with NextAuth. Thankfully, NextAuth has an adapter for Prisma that makes it seamless to have session-based authentication and to add support for logging in with email. The first thing we're doing is defining a User model. A model in your schema.prisma file is mapped to a table in your database, so we're going to have a table with all users, and each user will have the following fields (see the sketch below). A user has a unique id of type String, and this will be randomly generated for us by Prisma. Each user also has a name and an email, both of type String and both optional; that's what the question mark means. However, the email has to be unique, so we can't have two users in the database with the same email. Then we have an emailVerified field, which is used to determine when the user verified their email. There's also an image field, which is again of type String and optional, so that the user can have a profile image. And finally we have a field called accounts, which is an array of type Account, referencing the Account model, and a field called sessions, which is an array of type Session, another model that we have down here. So what we're saying is that a user can have many accounts and many sessions.

Now, the reason a user can have many accounts is because, say for example, we add support for logging in with GitHub, logging in with Google, and logging in with Twitter. If a user uses each one of those authentication providers once, an account will be created for each of them, and we want all of these accounts to be associated with one user. NextAuth takes care of that.
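For reference, here is a sketch of the User model just described, in the Prisma schema language. The @default(cuid()) choice is an assumption about how the "randomly generated" id is produced, and the Account and Session models (which follow NextAuth's Prisma adapter schema) are omitted for brevity.

```prisma
// prisma/schema.prisma (excerpt): the User model described above.
model User {
  id            String    @id @default(cuid()) // randomly generated for us
  name          String?                        // optional
  email         String?   @unique              // optional, but must be unique
  emailVerified DateTime?                      // when the user verified their email
  image         String?                        // profile image
  accounts      Account[]                      // a user can have many accounts
  sessions      Session[]                      // ...and many sessions
}
```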
Now, the reason a user can have many sessions is so that they can log into the platform multiple times. For the database, I decided to go with PlanetScale. It is a serverless MySQL database that has built-in support for Prisma. The fact that it's a serverless database means you won't run into the connection pooling problem. This is a problem you face when using serverless functions, like Next.js API routes, to access server-based databases: each request opens a connection to the database, so during traffic spikes you quickly exhaust the database's connection limit, which leads to failed requests and slow performance.

PlanetScale introduces a unique workflow for working with databases, where you have different branches. The way it works, you have your main production branch, and whenever you want to update this branch's database schema, you create a development branch, apply the changes to that branch, and then make a deploy request. This way you get non-blocking schema migrations, and your database won't slow down during the migration process. So I went to PlanetScale, created an account, and created a new database. Then, in the database settings, I checked "automatically copy migration data" and picked Prisma as my migration framework. After that, I promoted the main branch to production.

Now I'm going to use Prisma Migrate, which is part of the Prisma ORM, to actually create the tables in my database. What's also cool about Prisma Migrate is that it keeps your Prisma schema and your database schema in sync. Locally, I'm going to use the PlanetScale CLI to create two development branches from the production branch. I'll do that using the pscale branch create command (the commands are sketched below). I'll call the first branch initial-setup and the second one shadow; the shadow branch is needed to make Prisma Migrate work properly. Then, in two other terminals, I connect to these two database branches using the pscale connect command. I also add the two database connection strings to my .env file, so that Prisma can connect to each of those database branches.

Now, PlanetScale doesn't support foreign key constraints, so we have to specify in our schema.prisma file that we don't want to generate any. First I set the provider to mysql, and I also added a shadowDatabaseUrl, which is loaded from my .env file. Then I enabled the referentialIntegrity preview feature flag, and in the datasource block I added the referentialIntegrity property and set it to prisma. And now we can actually create our migration. In a third terminal, run npx prisma migrate dev --name to name the migration, and since this is the first one, I'll call it init. As you can see, a migrations folder has been created for us and we can see the generated SQL. The next step is to create a deploy request. To do that, we run the pscale deploy-request command, and if we go to PlanetScale and refresh, we'll see an open deploy request. All we need to do is add the changes to the deploy queue, wait for it, and it will be deployed.

So now I'm back in the file where we configure NextAuth, and I need to make a couple of changes. The first thing I've done is install Prisma Client. Prisma Client, which is part of the Prisma ORM, is an auto-generated, type-safe query builder. That's what we'll actually use to send queries to the database.
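The PlanetScale branching workflow described above boils down to a handful of CLI commands. As a sketch, where the database name course-platform and the local ports are assumptions:

```bash
# Create the two development branches described in the talk.
pscale branch create course-platform initial-setup
pscale branch create course-platform shadow

# In two separate terminals, proxy local connections to each branch.
pscale connect course-platform initial-setup --port 3309
pscale connect course-platform shadow --port 3310

# With DATABASE_URL and SHADOW_DATABASE_URL pointing at those proxies in .env,
# create and apply the first migration.
npx prisma migrate dev --name init

# Open a deploy request so the schema change can be merged into main.
pscale deploy-request create course-platform initial-setup
```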
Now, the next thing I've done is install NextAuth's Prisma adapter. In the NextAuth configuration object, all I did was add adapter: PrismaAdapter, create a new Prisma Client instance by saying const prisma = new PrismaClient(), and pass it to the adapter. The next thing we want to do is check: is the user who's trying to log in in my database? If they are, we'll send them an email with a login link; otherwise, we won't send them anything. To do that, there's a function called sendVerificationRequest, and we can add our own logic there that sends a query to the database, and for that we'll be using Prisma Client.

All I need to do is say const user = await prisma. and if I hit Ctrl+Space, I get auto-completion from VS Code, and what I see here is actually coming from my schema.prisma file: these are my models. So I can say .user. and hit Ctrl+Space again, and I can see the different available functions I can use. These are functions for doing CRUD operations: create, read, update, and delete. What we want is to find a user by their email, so I'll use the findUnique function. Next to it, I can see how to use it: I need to pass an object and specify a filter by saying where, followed by another object. So let's do that. I say findUnique, pass an object, hit Ctrl+Space, and it says where; if I hit Ctrl+Space again, I can see that I can filter either by the id or the email, because both of these fields are unique. I'm going to say email, because that's what's coming from the identifier right here. Then I need to check if this query returns a user. If it doesn't, we just return and don't execute the logic that actually sends the email. So all I need to do is say if (!user) return, and that's it. We've added logic that checks whether this user purchased the course or not (see the sketch below).

The next step is to add support for payments, and whenever a user pays, they will be added to the database, so that when they try to log in, this logic will execute and they'll be able to access the protected content. To be able to accept payments from users on my website, I'll be using Stripe, which is an online payment processor. What's cool about Stripe is that it has developer-friendly APIs, but most importantly, it gives you a hosted checkout experience out of the box. You don't need to mess with credit cards and credit card validation, and it saves you a ton of time. To keep things simple, I only have one product. I'm already logged in, so I go to my dashboard, and in the products tab I can see that I already have a product. You can add a product where you specify all the details, like the price, and maybe even accept recurring payments. I'm just going to make it a simple $5.

So now I'm on the landing page for my course, and when I click buy now, what happens is I send a POST request to an API endpoint where we create a checkout session. If I go back to my pages/api folder, I have a file called checkout.ts, and this is just straight from Stripe's docs: we're defining a serverless function that creates a session for us.
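Here is the sendVerificationRequest check referenced above, sketched out. The nodemailer transport, env var names, and email copy are assumptions; the idea from the talk is just the findUnique gate, since only paying customers ever end up in the User table.

```ts
// A sketch of the custom sendVerificationRequest: only send the magic link
// if the email already belongs to someone in the database.
import { PrismaClient } from "@prisma/client";
import nodemailer from "nodemailer";

const prisma = new PrismaClient();

export async function sendVerificationRequest({
  identifier, // the email typed into the login form
  url,        // the magic login link generated by NextAuth
}: {
  identifier: string;
  url: string;
}) {
  // The Stripe webhook only writes paying customers to the User table,
  // so "exists in the database" doubles as "bought the course".
  const user = await prisma.user.findUnique({ where: { email: identifier } });
  if (!user) return; // not a customer: don't send anything

  // Send the login link over SMTP (SendGrid in the talk).
  const transport = nodemailer.createTransport({
    host: process.env.EMAIL_SERVER_HOST,
    port: Number(process.env.EMAIL_SERVER_PORT),
    auth: {
      user: process.env.EMAIL_SERVER_USER,
      pass: process.env.EMAIL_SERVER_PASSWORD,
    },
  });
  await transport.sendMail({
    to: identifier,
    from: process.env.EMAIL_FROM,
    subject: "Your login link",
    text: `Sign in here: ${url}`,
  });
}
```

In the talk, this logic lives inside the email provider's configuration, so hooking it up would look roughly like EmailProvider({ ..., sendVerificationRequest }).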
So what I have is the price, and this price comes from my dashboard. We're creating a session and specifying the payment method types, the mode, what happens on success, and what happens on cancel. So for example, we can say, okay, when the payment is successful, redirect them to a page that says, yay, you bought the course, and we can create a kind of onboarding experience. And yeah, this is how simple it is. You're going to need the Stripe secret key, which you can also find in your dashboard, and that's it (there's a sketch of this endpoint below).

Now, the final piece of the project: in the event of a successful payment, we want Stripe to let us know the info of the user who just paid. This way we can grab that info and save it to the database, and when the user tries to log in, they'll be able to log in and access the protected content. This can be done through webhooks. To configure webhooks, in your Stripe dashboard there is a developers page, and on the left there's a webhooks tab. Here you can specify which types of events you want to receive and where Stripe should send them. All you need to do is click add endpoint, and you have one of two options: either you specify the URL of your API endpoint, like the deployed version, or you can test in a local environment using the Stripe CLI, which is super nice. Then you select the different events you want to listen to, and for us, the one we're interested in is charge.succeeded. Now, I don't need to do all of this because in my app I already created an API route and called it webhook; that will be the endpoint. And I created a deployment to Vercel so that I can actually test it. If I go to this endpoint right here, the hosted endpoint, I can see a log of different charges: some of them failed, some of them succeeded, and I can click individual ones and find all of this information. For us, the most important info is the user's email, which we can find here, it's officialcampo@gmail.com.

Now, there's one final thing: we want to secure our /api/webhook endpoint. That's because anyone could just send us an event saying, hey, I paid for this. So we want to make sure the event is actually coming from Stripe, and this is done by including a signing secret in the request. In the logic, which we'll look at right away, we'll have a check: is the secret included in the POST request we're receiving? If it is, then this event is actually coming from Stripe.

So now I'm in the /api/webhook file, and here's what we're doing. I imported Stripe, and I also imported micro. This will be used for parsing the request, because by default Next.js does body parsing; we're disabling Next.js's body parsing and doing it ourselves, because the request needs to be parsed a certain way. We're also importing Prisma, and we're creating a new Stripe instance by passing in the Stripe secret key, which comes from our .env file, and we're also specifying the API version. Then we're setting the webhook secret to the signing secret, which also comes from our .env file. Now, in the handler function, the first thing we're doing is checking the request method. We only want to receive POST requests.
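Before the webhook handler walkthrough continues below, here is a minimal sketch of the checkout.ts endpoint described above, following the pattern from Stripe's docs. The price ID, redirect URLs, and API version are placeholders.

```ts
// pages/api/checkout.ts: a sketch of the endpoint that creates a Stripe Checkout session.
import type { NextApiRequest, NextApiResponse } from "next";
import Stripe from "stripe";

const stripe = new Stripe(process.env.STRIPE_SECRET_KEY as string, {
  apiVersion: "2020-08-27", // pin to the version that matches your installed stripe package
});

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  if (req.method !== "POST") {
    return res.status(405).end("Method Not Allowed");
  }
  // Create a hosted checkout session for the single $5 product.
  const session = await stripe.checkout.sessions.create({
    mode: "payment",
    payment_method_types: ["card"],
    line_items: [{ price: "price_XXXXXXXX", quantity: 1 }], // price ID from the Stripe dashboard
    success_url: `${req.headers.origin}/welcome`, // e.g. an onboarding page
    cancel_url: `${req.headers.origin}/course`,
  });
  // The frontend uses this id to redirect the buyer to Stripe's hosted checkout page.
  res.status(200).json({ id: session.id });
}
```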
We don't want someone to just send us a GET request. Then we're saying buf = await buffer(req), so that we can parse the raw request, and sig, for signature, equals the stripe-signature request header. Then we're constructing the event using three pieces of information: the parsed request, the signature, which is included in the request headers, and the webhook secret. So this is now the event. We check the type of the event, making sure it's charge.succeeded, and then we write a Prisma Client query to save the user to the database (the full handler is sketched below).

If I expand it, what's happening is I'm grabbing the charge from event.data.object, and I'm doing a Prisma Client query: await prisma.user.upsert. What upsert does is create a new record in the database, and if there's an existing one, it won't throw an error, because we can't have two users with the same email. This way we can also accept payments multiple times from the same user, which is cool. So we're saying await prisma.user.upsert, with a create: if we don't have a record for the user with the email coming from charge.billing_details.email, we create one, but if the email already exists, we just update the email, and if we had other fields, we could update them here too. And where, because we're finding by the email. That's really it. Then I'm just finishing with res.status(200).send("This works").

So this is our course platform, and that's it. You just saw how to use a bunch of tools together to build a real project. Now, maybe you don't want to build your own course platform, and that's completely fine. To me, what's important is that you see the full picture and how the different pieces fit together. Maybe you're a frontend developer and you've been curious about full-stack development; using Next.js API routes along with Prisma and PlanetScale is a great choice. Or maybe it's the opposite: you're a backend developer looking for a great frontend framework, and using Next.js makes a lot of sense. Now, there's a reason I picked these tools: all of them offer a great developer experience, and at the same time, they have great developer communities. So if you have any questions, if you run into an issue, if you want to ask for help, you will find someone to help you. And to me, that's just super important. [music]
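To recap the webhook handler walked through above, here is a minimal sketch. Env var names are assumptions, and error handling is kept to the minimum the talk mentions: the method check plus signature verification.

```ts
// pages/api/webhook.ts: a sketch of the Stripe webhook handler described above.
import type { NextApiRequest, NextApiResponse } from "next";
import Stripe from "stripe";
import { buffer } from "micro";
import { PrismaClient } from "@prisma/client";

const stripe = new Stripe(process.env.STRIPE_SECRET_KEY as string, {
  apiVersion: "2020-08-27", // pin to the version that matches your installed stripe package
});
const prisma = new PrismaClient();
const webhookSecret = process.env.STRIPE_WEBHOOK_SECRET as string;

// Next.js must not parse the body: Stripe needs the raw bytes to verify the signature.
export const config = { api: { bodyParser: false } };

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  if (req.method !== "POST") {
    return res.status(405).end("Method Not Allowed");
  }

  // Verify that the event really came from Stripe using the signing secret.
  const buf = await buffer(req);
  const sig = req.headers["stripe-signature"] as string;
  let event: Stripe.Event;
  try {
    event = stripe.webhooks.constructEvent(buf, sig, webhookSecret);
  } catch {
    return res.status(400).send("Invalid signature");
  }

  if (event.type === "charge.succeeded") {
    const charge = event.data.object as Stripe.Charge;
    const email = charge.billing_details.email;
    if (email) {
      // Upsert so repeat purchases from the same email don't violate the unique constraint.
      await prisma.user.upsert({
        where: { email },
        create: { email },
        update: { email },
      });
    }
  }

  res.status(200).send("This works");
}
```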
Info
Channel: Vercel
Views: 2,380
Id: qEBEo76gKK0
Length: 27min 23sec (1643 seconds)
Published: Wed Oct 27 2021