Build and sell your own API $$$ (super simple!)

Captions
Hello everyone! So today I'm going to show you how to build an API and make passive income from it as a developer. I'm going to show you how to do this step by step, by making a Node.js application that uses Express as well as the packages Axios and Cheerio, with a beginner approach, making it as accessible for as many people to follow along as possible. By the end of this tutorial you'll have the knowledge of how to build an API that you can make money off and sell, and we will end by going through the steps of how to list it on the RapidAPI marketplace.

As a developer, when you launch your API on the RapidAPI platform, you can essentially sell access to it to those who want to utilize what you have made. This access comes in tiers that you can set yourself, allowing you to have full control over how you monetize what you have built. And as it is the largest hub for APIs out there at this moment in time, the footfall will be in our favor, meaning that you could take your API idea from a simple form of passive income to a full-blown startup, depending on how much time you want to dedicate to it. So what are we waiting for? Let's do it.

The only prerequisite I ask of you before starting this video is a basic understanding of JavaScript — or, if you're feeling adventurous, please follow along anyway. There will not be a huge amount of code involved, and you have my permission to take what I have built and change it to fit your needs.

But first, let's refresh ourselves on what an API is. API stands for Application Programming Interface. APIs allow technologies to essentially talk with each other, and are essential to so many services that we rely on today. They are behind most apps we use on a day-to-day basis, as they can shape the information passed between one technology and another, and they can even connect things such as our microwaves or cars to the internet. APIs are everywhere. As a developer, you might use TikTok's API to get a live TikTok feed onto your website, or even
use them in a two-way stream to get, post, or delete data from a movie system, for example. There's a reason why these words are popping up: these are the most popular HTTP request methods. In fact, I can use them to either GET data from this endpoint, POST new data to that endpoint, edit data with a PUT request to this particular endpoint, or DELETE all the data at this endpoint if I wish. An endpoint is essentially an address that points to a specific chunk of the data that we are working with, and this is exactly what we will be building today: we are going to be deciding what happens, and what kind of data comes back to us, when we visit a certain endpoint that we construct ourselves. If this is the first time you are seeing the word API and you have not come across one before, I do have a tutorial on them that goes into much more detail than this refresher, which I invite you to try. However, if you feel comfortable, or would like to carry on, let's get to setting up our project.

Okay, so I hope you're ready. In this tutorial I'm going to be building an API that tells us climate change news from various publications all over the world, in one place, and people can choose to purchase this information from us. We can do this with any publications. I'm going to go with the topic of climate change, but if you want to go with the topic of crypto, that is completely up to you. Now, I did mention that we will be using the RapidAPI platform, so please go ahead and sign up. I'm just going to sign up with Google, select the account that I want to associate with RapidAPI, and wait for that to complete. I'm just going to fill in my name, Ania Kubów, organization N/A, and click Done. Great. So here are all the APIs at our disposal — as you will see, there are lots and lots of APIs. However, I want to create my own API, so I'm just going to go ahead and click here, and then leave it at this point. Now it's time to start coding our API. So
here we are. I'm just going to start off by creating a blank project using WebStorm — please feel free to use whatever code editor you wish — and just create an empty directory, just like this, so we can start completely from scratch. Okay, before we get going, I also just want to make sure that everyone watching has Node.js installed on their machine. Node is an open-source server environment; we will be using it to create our own server, or in other words, backend. It is free and allows us to use the JavaScript language in order to create our backend, so I am a big fan. Now, I am using a Mac, so I would just click right here in order to download this onto my machine; however, here are some other options for installing it, so just choose whichever one applies to you. I already have this installed, so once that is done for you, let's get up our terminals and check that it has worked.

To check this has worked, I am simply going to type node -v — -v is for version. This will show me the version of Node that I have installed on my computer. This version is very important: if you are watching this sometime in the future and for some reason this tutorial is not working, one of the reasons could be that the dependencies we are going to use, and their versions, might not be compatible with your Node version. This is also the case when working with other projects, not just this one, so keep that in mind as you progress on your journey to becoming a web developer. Okay, great. You can also easily switch your Node versions by installing the version that you need using nvm install and then the version. So that is now done — we are now using Node version 0.10.32. That is something you can do if you need to use a different Node version. For now, let's go back to using the first one that we had, so I'm going to go nvm use and the version that we had, which is 14.17.6. And great — once again we are now using this version of Node, the one we just downloaded. Now that we have
that, we need to install one more thing if we are on a Mac, and that is Homebrew. Homebrew is a free and open-source software package management system that we will be working with and installing packages with in the next section, so we need it in order to do that. Okay, great, let's carry on.

Now it's time to get coding. The first thing we need to do is run npm init. This will trigger initialization and spin up a package.json file. We are doing this so that we can install packages, or modules, into our project to use. If you want to have a look at the thousands and thousands of packages at our disposal as developers, just visit npmjs.com. For example, if we search for a popular package — let's say React — we can see exactly what it does, how to install it, and how many times it is being installed into projects all over the world. Okay, so that is where you can find all the packages. Now let's go back to our terminal.

As a general rule, any project that uses Node.js will need to have a package.json file. The package.json file does a lot more than just hold our packages and the versions of them that we need, so if you'd like to know more about it, please pause here and google "beginner's guide to using npm". However, if you're comfortable with that, let's carry on. So let's go ahead and create the package.json file that we have been talking so much about, using the command npm init, making sure that we are in the directory that we just created — okay, so in our project. So this is how we do it: once again, we have just written the command npm init, and now we are prompted to answer these questions. Do we want to call our package climate-change-api? Yes, I'm just going to click Enter. This is the first version that we are building. The description, for now, I'm going to leave blank. The entry point is going to be an index.js file that we create. The test command we're going to leave blank, and I'm just going to leave all the rest blank for now and click Enter. Now let's look in our climate-change-api directory.
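For reference, the generated package.json should look roughly like this — the values mirror the answers given to the prompts above, and yours will differ if you answered differently:

```json
{
  "name": "climate-change-api",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "",
  "license": "ISC"
}
```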
We can see a package.json file has been created with all the keys that we were asked to fill out. As you can see, climate-change-api has been generated, version one, we left the description blank, we have said that our entry point is going to be an index.js file, and we have one script. The author we left blank — we can give it an author name, I'm going to put Ania Kubów — and that's it for now. This is essentially where we're going to see all the packages that we're going to need to create our web scraper. Great. So I'm just going to minimize that and create an index.js file. Once again, just in the root of my climate-change-api project, I'm going to make a new file — this is going to be a JavaScript file — and I'm going to call it index. So now we have the point of entry for our app. Okay, this is essentially our server; we're going to write the code to create our server in this file.

Now, the first package that we are going to install — I'm going to show you the package by just opening up npmjs right here. The package that I want to install is called Cheerio. So here is the package that we are going to need. Cheerio is a package that we will be using to essentially pick out HTML elements on a webpage. It works by parsing markup, and it provides an API for traversing and manipulating the resulting data structure. Cheerio's selector implementation is nearly identical to jQuery's, so if you know jQuery, this will be familiar to you. So now that we know what we will be using it for, let's get to using it to pick out elements from a webpage — specifically, this webpage right here. Okay, so this is the webpage that I'm going to be scraping, and the things I want to scrape are the titles of the articles as well as the URLs. So if we inspect this page and just gravitate to here, and then pick out, for example, this one — I would say I probably want the div with the class item header, and then, looking at the h3 tag with this class name, as well as the
URL that comes with it, so I can build my climate change API that will give us information about climate change from various sources all over the world. Okay, great. So hopefully that makes sense — hopefully I've explained what we are going to be doing and why we're using Cheerio to do it. Let's go ahead and install Cheerio. So it tells me that to install, I need to use this command — npm i, where i is for install. I'm just going to go back into WebStorm, get up my terminal, and type npm i — I could type npm install, that is totally up to you — then the package cheerio, just like it told us to on npmjs.com, and click Enter, making sure that we are in the project, in the climate-change-api directory. Okay, that is pretty important. So that has now been installed, and there we go, we can see that a dependency has shown up: we have just installed cheerio, along with its version. Okay, so this is the version that we are working with for this tutorial. If for whatever reason you are getting a different version and that might be causing issues, please go ahead and just replace that here with the correct version, and then run npm i again. npm i will essentially install all the dependencies that you see in this object right here and then generate a package-lock.json file. So in this package-lock.json file, you will see that we have indeed gone to the npm registry and installed cheerio. Okay, so here we go, we can see cheerio has been installed for us. Great. So once again, a package-lock.json file has been generated because we ran npm install, and that installed all these dependencies, and that is what generated the package-lock.json file.

Great. The next package that we are going to need to install is a package called Express.js — I'm just going to search for that in here as well. Express is essentially a backend framework for Node.js. We're going to install it in order to listen to paths and
listen out to our port, to make sure that everything is working. Okay, what I mean by this is that if we visit a certain path or URL, it will execute some code, and it will listen out on the port we define. But enough talking — let me show you what I mean. So once again, I'm just going to copy this command, go back to my project, clear the terminal using Command+K, and run npm install express, and just wait for that to install; then it should show up in our dependencies, just right here. Wonderful. So again, this is the version of Express I am using for this tutorial; if for whatever reason your code isn't working, it could be down to the version that you installed.

Okay, we have a few more packages to install. The next package I want to install is a package called Axios. Axios is a promise-based HTTP client for the browser and Node.js. Axios makes it easy to essentially send HTTP requests to REST endpoints and perform CRUD operations — this means that we can use it to get, post, put, and delete data. It is a very popular package, and one that I use quite a lot on a day-to-day basis as a developer. So once again, I'm just going to copy that, go in here, and install it into my project so that it shows up in our dependencies. Wonderful. And those are the three packages that we need for this project. So hopefully you've got to this point, and hopefully you understand how to install packages, or dependencies, into your project. I feel it's now time to carry on and get to some actual coding.

So I am just going to start by going into our index.js file and defining the port we want to open up our server on. This can be whatever you wish; I'm going to choose to run this on port 8000, just like so. And then I'm going to write some code — this is just some standard syntax for listening out on the port to make sure it's running. But before we do that, we actually need to initialize Express. So first off, let's get Express: I'm going to use the package express. Again, this is just standard
syntax — we need to get the package, and I'm going to save it as the const express. Okay, so this is just something that you will see being done here as well, in the setup: const axios = require('axios'). So this is again something that we will need to do — I'm going to copy that in order to use Axios in our backend and put it here as well. And again for Cheerio: const cheerio = require('cheerio'). So that's all three of our packages done.

Now, to initialize Express — well, I'm going to show you how to do this. What I am going to do is essentially get express and call it. So whatever the package comes with, I'm saving as express, and then I'm calling it, so it releases all its wonderful energy, all its packages, and all the stuff that it comes with, and I'm going to save this as app so we can use it further on. Okay, so once again, this line calls the express function — it calls it and puts a new Express application inside the app variable, to start a new Express application. So we have required this as express up here, and we are now calling it and saving it as app, so that we can use it in the rest of our project as we please, with all of the powers that Express comes with — things like app.use, or app.get; there's a lot that it comes with, essentially a lot of power.

Great, so now that we have that, let's get to using Express — using what we stored it as, which is app — in order to get our port up and running. So first off, we listen out on the port: I'm going to use app.listen, pass through the PORT, and then use a callback, and I'm just going to console.log out, in our backend, 'server running on PORT' and then also the PORT. Okay, so this is what we need to write in order to get a message showing us that everything is running fine on our server. However, we also need to write a start script, so I'm just going to go back to package.json.
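Before we do, here is the shape of the index.js written so far. To keep this sketch runnable here without any packages installed, a tiny stand-in for express() is defined inline; in the real project you delete that stand-in and use `const express = require('express');` instead (along with the axios and cheerio requires):

```javascript
// Stand-in factory so this sketch runs without the real package installed.
// In the actual project, replace this function with:
//   const express = require('express');
function express() {
  const routes = {};
  return {
    // register a handler for GET requests on a path (used for routing later)
    get(path, handler) { routes[path] = handler; },
    // "start" the server: invoke the callback and hand back some state
    listen(port, callback) { callback(); return { port, routes }; },
  };
}

const PORT = 8000;

const app = express(); // calling express() returns a new application object

const server = app.listen(PORT, () =>
  console.log(`server running on PORT ${PORT}`)
);
```

With the real Express installed, the same two calls — express() to create the app, app.listen(PORT, callback) to start it — work identically; the stand-in exists purely so the pattern is visible in isolation.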
Under scripts, I'm going to get rid of this test script — we don't really need it — and I'm going to write start. So I'm writing a script for running the start command, and I'm just going to use nodemon index.js. Nodemon will essentially listen out for any changes to index.js. So now let's run our backend: I'm just going to use npm run start, and there we go — server is running on port 8000. So our backend is working; our app is listening for any changes made on port 8000. Wonderful. This is looking good. We have now officially used Express — Express comes with this listen, and we are using it to listen out on port 8000, or PORT as we defined it; we could have defined it as whatever we wish, we chose 8000. So this is now working. Let's carry on.

Now, the first thing that I want to do is just start scraping our first webpage. So what I'm going to do is write a path — we'll do some routing. Once again, I'm going to use Express: I'm going to use app.get, just like so, and then I'm going to pass through a path. So if, for example, I just passed through the home page like that, that is a home page path, and this is the syntax for routing; then res.json, and let's just write something: 'Welcome to my climate change news API', and click Save. Remember, nodemon is listening out for any changes — okay, nodemon is restarting due to changes, it's starting index.js. So now if we visit port 8000 — I'm just going to get rid of this — localhost:8000: 'Welcome to my climate change news API'. So that is working. What we have done here is listen out for any time we visit the home page, and when we visit it, we get this response — this JSON response, 'Welcome to my climate change news API'. Okay, we've passed through a request, we've passed through a response. This is just a helper that WebStorm is giving us — I've not typed this out; that is a helper, as is this — it's telling us that this is a request and this is a response. I can change this — I can make the path /ania-is-great — and that
just means — let's save that — that now, if I visit the home page, there's nothing there, and if I visit /ania-is-great — if I spell it correctly — 'Welcome to my climate change news API'. So that is working. I'm just going to change mine back to the home page. Hopefully you see now how that works. Great. But this is an API — we need to actually get some, you know, interesting data coming back, and I want to scrape the internet to get data from certain news articles. So let's go ahead and do that.

To do this, I'm going to keep that as is, and once again, let's write app.get — exactly the same syntax as above — and pass through a request and response, just like we did before. This time let's say: if we visit /news, well, then I'm going to use axios.get, and I want to essentially visit this URL. Okay, I want to visit this URL, so let's grab it and just paste it in. I also want to wait for that to return, because this is essentially going to return a promise — it's returning something back to us. So once that return comes back, I'm going to do some chaining. If you don't know much about async JavaScript, I do have a course on this that I really recommend; it's a five-part series. So we're going to do some chaining with this: we're visiting this URL, and then the response that comes back to us — well, I want to save that response. So const — let's save it as html — and I'm just going to save it as response.data, okay, just like so. So now if we console.log html — and I'm just going to visit this path, being sure to save that page — let's visit /news, go back here, and see what comes back to us. Whoops, that's not meant to be here; let's delete that, save, and visit this again. Enter. Okay, so now you will see the HTML of this website coming back to us. Okay, so this is essentially everything from The Guardian's page that we visited — this page, to be exact — and it's coming back to us. That's all I've done. However, this is great,
but we need to carry on — we actually need to pick out the elements, and I'm going to show you how to do that. To do that, we're going to use Cheerio — and actually, we can just get rid of this. Cheerio has some commands that will help us: load comes with the package cheerio, and I'm going to pass through the html — the response.data we saved as html — into cheerio.load, and let's save the result as the dollar sign. So now, essentially, that's allowed us — or, you will see, this is going to allow us — to pick out elements. This is the syntax for using the Cheerio package: we take the dollar sign, or essentially what we have defined here as the dollar sign, and we use it. So I'm going to use it like so, and then I am going to look for any a tags that contain anything to do with climate. So what this means — just making sure that this is contains — is that it's looking on this webpage right here and finding any elements that have the a tag, for example this one right here, and checking whether it contains anything to do with climate. For example, this one doesn't; but this a tag contains the word climate, so that should be picked out. So that's exactly what I am doing: I'm going through all the HTML and, using this syntax, looking for a tags that contain the word climate. Okay, and then we just pass through the html, and then for each one that comes back — I'm assuming there's more than one — I'm just going to write function, so I'm passing through a function, essentially a callback function. And for each one that comes back to us, I'm going to grab its text, because, once again, I want the text that's inside the a tag. Okay, I want to grab that, so let's save it as something — I'm going to save it as the title. So I'm going to go into each of them, get the text, and save it as the title. So we have decided this is going to be the title of the data that comes back to us. And next we need the
URL. So what do we want the URL to be? Well, once again, from whatever comes back to us, I want to grab its href. So if we look in here, here's the href. Again, for every a tag that we find, I want to grab the href attribute, and I would do so by essentially getting the attribute, like so: attr('href'). Okay, great, so that's what we're doing. And then let's push it into its own array — this is some JavaScript work. articles is what I'm going to call my array, and I'm just going to save it — we can put it up here, actually, outside of this app.get, so const articles, so it's global. We take the articles array and push in a new object that we create, and the object that I'm pushing in is going to have the title and the url. And then, well, let's just see what this looks like. I'm going to display the articles in the actual browser using res.json. So we're pushing this object into the articles array, and we're going to display the articles in the browser when we visit /news. And then let's also catch errors — again, this is some chaining; if you know asynchronous JavaScript, this will make sense to you; if not, once again, I do have a tutorial on this, a five-part series on asynchronous JavaScript. So that is how we catch any errors. Okay, so now let's visit /news again, and there we go — we are getting our data back. If it looks like this, it's because we need a JSON viewer, really. So if you don't have the Chrome extension JSON Viewer, here it is — it essentially just makes everything a lot more readable. I'm just going to add it to Chrome, like so. Okay, so that is being added, and once it has finished, the icon will be visible — and there we go, that is now a lot more readable for us. This is looking great: every time we found an a tag that contained the word climate, we essentially created an object — we picked out the title of whatever was in that a tag, and we picked up its href.
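Stripped of the Express and Cheerio plumbing, the per-tag transformation the /news handler performs boils down to this. The matched anchors are simulated here as plain objects so the logic runs on its own; in the real handler, the text comes from the matched element's .text() and the href from .attr('href'), and the sample titles and URLs below are made up for illustration:

```javascript
// Each matched <a> tag contributes one article object: its text becomes
// the title, its href attribute becomes the url, and each object is
// pushed into a shared articles array — exactly what res.json then sends.
function buildArticles(matchedAnchors) {
  const articles = [];
  matchedAnchors.forEach((anchor) => {
    articles.push({ title: anchor.text, url: anchor.href });
  });
  return articles;
}

// Two hypothetical matches from the scraped page:
const sample = [
  { text: 'Climate crisis: what the latest report says',
    href: 'https://www.theguardian.com/environment/some-article' },
  { text: 'How climate change affects oceans',
    href: 'https://www.theguardian.com/environment/another-article' },
];

console.log(buildArticles(sample));
```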
So now I have an array full of titles and URLs from The Guardian page that we scraped. Okay, so how cool is that? We've officially created our first scraping tool. But I really want to make this a lot meatier — I really want to give it a lot more value for anyone who wants to purchase my climate change news API — so I'm going to scrape from a lot of different websites, and I'm going to show you how. So this was part one: we've essentially learned how to scrape a website to retrieve an array full of anything that mentions the climate crisis, or the word climate — the title and URL. But now I'm going to loop that in with a lot of others, so let's do it.

First off, I'm actually going to create an array of newspapers that I want to scrape. So we've got our articles here; actually, above here, I'm going to define the newspapers — const newspapers — and I'm just going to make an array. I'm actually going to put through an array of objects, and you'll see why: I'm not just going to pass through the URLs, I'm going to pass through the name of each publication as well. So these are some that I found before. Let's go with The Times — the address that we want to scrape is going to be this URL right here — and I'm just going to paste a few more that I have. We already have The Guardian; let's also have The Telegraph. So The Times, The Guardian, and The Telegraph, all on specific climate change pages, so there's a lot more data for us. We can start with three — let's do three, and then we'll add a bunch more later. So here are the three newspapers that we want to scrape. Now I'm going to essentially edit my function right here to loop over all three publications, and I'm going to show you how. This time I am actually going to write the function outside of this — I'm going to use this as a template and do it outside, maybe up here actually. So this time, I want to do this for each of the newspapers: every newspaper in my newspapers array.
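The array described above looks something like this. The names and URLs here are illustrative placeholders — substitute the real climate-section addresses of whichever publications you choose:

```javascript
// One object per publication: a short name we can route on later, plus the
// address of its climate page. Names and URLs below are illustrative.
const newspapers = [
  { name: 'thetimes',  address: 'https://www.thetimes.co.uk/environment/climate-change' },
  { name: 'guardian',  address: 'https://www.theguardian.com/environment/climate-crisis' },
  { name: 'telegraph', address: 'https://www.telegraph.co.uk/climate-change' },
];

console.log(newspapers.map((newspaper) => newspaper.name));
```

Storing objects rather than bare URL strings is what makes the later steps possible: the name becomes a route parameter, and (as we'll see) the object can also carry extra per-site details.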
So for each one of these — for each newspaper in newspapers, I think that's what we called it — and this is why I said JavaScript was a good prerequisite, because if you don't know JavaScript this can be extremely confusing; but if you just want to take this code and use it, then please just listen to me talk through it anyway. So for each item in my newspapers array — there are three items in there, one, two, three — I want to use axios.get. We're just using what we did before, so this is a good refresher, and then I need to pass through the URL. So I'm going to pass through the newspaper address — newspaper, singular — because for every item, for every newspaper in our newspapers array (we can call this whatever we wish; we could call it dog, it doesn't matter — we are just saying, for each item in our newspapers array), I want to get the address, making sure to spell address the same way that we did above. So I'm passing through the URL just like we did here, only now by looping over the array of newspapers. And then, just like before, we get the response; once again we take the response.data, save it as html, and pass it through into cheerio.load, saving the result as the dollar sign. So that's the same — let's carry on.

Now, once again — because I have looked at all these different newspapers — I've actually found newspapers that work with this style, and I've sort of adjusted things accordingly. Any time you find an a tag, it just so happened that each a tag had some text in it, and it also happened that each a tag had the href — but that's expected from an a tag — so I can actually reuse this. So once again, for each of the three, I'm finding any a tag that contains the word climate — let's just make sure to close this off — and once again, what I am doing is looking at whatever comes back. Okay, whatever comes back is this, and I'm getting the
text from that a tag and saving it as title; and once again I'm looking at each of the a tags — so this is the a tag — and I'm getting the attribute, attr('href'), and whatever this value is, I'm going to save it as url. Okay, and once we have that, I'm going to push it into articles — articles.push — and I'm going to make a new object which has a title and a url, but this time also a publication. Okay, so this time I'm actually going to have the source: the newspaper address. Okay, so whatever newspaper we pass through, I'm going to get its address — oh, actually, we should probably have the newspaper name instead; that would make more sense. Let's have the newspaper name show up as the source. Okay, so now, instead of having all this, I'm just going to delete that and simply return the articles, because this function will run and we're collecting all the articles. So what I want to appear here, in the JSON in the browser — once again, using res.json — is the articles. So now, if we visit /news, we get articles from The Times — we get the source, the URL, The Times, the title — this is looking great. We also get The Guardian articles, so we scraped The Guardian, and we also get The Telegraph. Okay — however, this does not look like the correct URL; let's just double-check why. So we are visiting this URL and we are scraping it. Once again, let's go here — and once again, I have decided that I want to search for any a tags in here. So, for example, this will probably be an a tag — yes, here is an a tag that has a URL, but this URL is incomplete; it seems that it doesn't have a base. And then I'm looking inside it for some text — anything inside this parent a tag — and there is our title. So this is where I found that we need to make an adjustment: this is not the full URL, because if we click on it, it will not take us anywhere. So our API is broken, and people would not like that.
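One way to repair such relative links — the approach taken next — is to give each site an optional base URL that gets prepended to every scraped href. A minimal sketch of that rule (completeUrl is a name made up for this sketch):

```javascript
// Sites whose hrefs are relative get a base that is prepended; sites that
// already return absolute URLs get an empty base and pass through unchanged.
const completeUrl = (base, href) => (base || '') + href;

// Relative href from the Telegraph gains its base:
console.log(completeUrl('https://www.telegraph.co.uk', '/climate-change/some-article'));
// Absolute href from the Guardian is left untouched:
console.log(completeUrl('', 'https://www.theguardian.com/environment/some-article'));
```

The `base || ''` guard means newspapers that never set a base at all still work, since undefined falls back to the empty string.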
So all I'm going to do is go back here and pass through a base. This one doesn't need a base, so I'm going to leave it empty; this one also doesn't need a base; The Telegraph, however, needs a base, and I'm just going to pass through the base, which is essentially https://www.telegraph.co.uk. Okay, so now that we have that base, it just means that the URL is not going to be just the url — it's going to be the newspaper base, if that exists, plus the url. Okay, so we're grabbing the url but appending the newspaper base in front of it. That is looking good — let's try it out. Let's refresh this, and now, if we visit the Telegraph section, you will see we have created a new URL, one that includes https://www.telegraph.co.uk. Wonderful. So there we have it — we have completed step two: we have taken three newspapers and scraped all the articles about climate change from them, along with their titles and the URLs, should we want to visit those articles. Wonderful.

Now, this is looking good; however, I do want to make another route, and that is to get information from just one newspaper. So I'm going to show you how to do that, just under here. Once again, we're going to use app.get, just like so, and this time the path I want to pass through — sorry — is going to be /news/ and then, if we want to visit a particular newspaper, a newspaperId. This is the syntax to do it — bear with me, because I think this is best explained by showing. You need this colon, and then whatever you pass through — I want it to return something, so I'm just going to pass through a response this time, making sure to put async right in front of it, and then I want to grab whatever newspaperId I pass in after /news/. I can do so if I console.log the request. Okay, now let's just go ahead and visit /news/ania-is-great once again and click Enter. Nothing will show up here, but if we visit our console, you
Nothing will show up in the browser, but in our console you will see that the request, everything that comes back in it, has essentially been console logged out for us, and in the params you will see that the newspaperId is anya-is-great. Okay, so this whole thing right here, all this text, is the request. If I go .params, that will get this part, and then .newspaperId, because that is what we've called it up here. I can call it whatever I wish; I could call it dog, and in that case it would come back as dog equals anya-is-great. So now if I console.log that, and once again visit something else, anya-is-awesome, and scroll down here: anya-is-awesome. Hopefully that makes sense: the colon just means it's an identifier we are passing through. It could be whatever you wish, you could call it xxx, but essentially it's going to be saved under params when we visit the page. So now that we have that, I'm going to use it to my advantage and save req.params.newspaperId as the newspaper that we want to visit, so I'm just going to put const newspaperId. If we visit, say, /news/thetimes, I just want information from The Times; that's all I want. So I'm going to go axios.get and once again pass through a URL. Well, we know that we want the URL for The Times, so we're going to use our newspapers array, and I'm going to use some JavaScript to filter that array to find the newspaper. For each newspaper, so each of the three items in my newspapers array, I'm going to check whether the newspaper's name equals the newspaperId (you don't have to make it strict equality). So if I pass through /news/thetimes and that matches The Times, well, I want to get its address.
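The filter step can be sketched like this; the entries mirror the tutorial's newspapers array, though the exact addresses here are assumptions for illustration:

```javascript
// Assumed shape of the global newspapers array; the addresses are illustrative.
const newspapers = [
  { name: 'thetimes', address: 'https://www.thetimes.co.uk/environment/climate-change', base: '' },
  { name: 'guardian', address: 'https://www.theguardian.com/environment/climate-crisis', base: '' },
  { name: 'telegraph', address: 'https://www.telegraph.co.uk/climate-change', base: 'https://www.telegraph.co.uk' },
];

// filter() keeps every item whose name matches the id; there should only
// ever be one match, so the route can later take the first element.
function findNewspaper(newspaperId) {
  return newspapers.filter(newspaper => newspaper.name === newspaperId);
}

console.log(findNewspaper('thetimes'));
```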
To get it back, let's save it as const newspaper. I'm just going to comment this out for now, and if I console.log newspaper, I expect that if I visit /news/thetimes (oops, making sure to spell it exactly the same), nothing will happen in the browser at the moment, but over here we get back the object from our array. We get back the whole object, and we want to go into it; because filter returns an array and there's only ever going to be one item in it (hopefully), I'm just going to go into the first item and grab the address. So I'm going into that item, grabbing the address, and saving it as newspaperAddress. Okay, so we have now hopefully got our newspaper address; let's see if that works on another newspaper. Let's now visit /news/guardian, and great, we are getting back the URL of The Guardian when we pass through the ID of guardian. Wonderful, so now that we have that, let's carry on. We can use axios.get to pass through the URL, the newspaper address of whatever path we visit based on the identifier, and then let's do some chaining. Once that comes back with a response, we're going to get the response data and save it as const html. Now cheerio comes in to do its work, because we're going to call cheerio.load (making sure to spell cheerio correctly, like the package), pass through the html, and save the result as the dollar sign so we can use it. Once again I'm going to collect all the articles here; you can call this whatever you wish, but that might be confusing, so let's call it specificArticles, just to differentiate it from the other articles array that we have globally. So let's set specificArticles to an empty array, and this time, once again, we are going to look for any a tag that contains the word climate on the page we are visiting.
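The chaining just described can be sketched as follows, with the network call stubbed out so the shape is runnable on its own; in the real route the stub is simply axios.get(newspaperAddress):

```javascript
// Stub standing in for axios.get: resolves with a { data } object, the same
// shape axios returns. The url argument is ignored by this stub.
function fetchPage(url) {
  return Promise.resolve({ data: '<html><a href="/a">climate article</a></html>' });
}

fetchPage('https://www.theguardian.com/environment/climate-crisis')
  .then(response => {
    const html = response.data;       // axios puts the body on response.data
    // const $ = cheerio.load(html);  // in the real app, cheerio takes over here
    console.log(html.includes('climate')); // prints true
  });
```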
We pass that selection through, and for each item that comes back to us, so for each a tag containing the word climate, let's write a callback function (oops, we have to make sure these two variable names are the same). Okay, so there we go: it contains climate, and making sure this is the dollar sign, I'm going to get its text and save it as const title. I'm also going to get the URL by going into the element and getting the attribute href, and now we need to create an object and push it into specificArticles. So I'm going to use push and create an object, which is going to have the title and the URL. The URL is also going to need a base, I think, so if there is a base to be added, this is how you would do it: we go into our newspapers array again and filter by newspaper, so for each of the three items, if the newspaper's name matches the newspaperId, just like we did before, we go into the object and get its base. What shall we save this as? Let's save it as newspaperBase, just like so. Okay, so if a newspaper base exists, we're going to add it right before the URL, and once again the source, well, we know that is just the newspaperId. Great, this is looking good, and then of course we want to display this in the browser, so I'm going to use res.json to display the specific articles. Once again we're just going to catch any errors; this is the syntax for doing so, console logging them. Now if we visit this page we will get all the news articles from The Guardian, and if we go /news/thetimes (making sure to spell it exactly as we did in here), we get all the newspaper articles from The Times, and if we just go back to /news we get all the news articles. And there we go, we've done it: we've created an API. We've figured out a way to get all the newspaper articles, and then, by going forward slash and using the ID, to get specific articles, which we will be explaining in the documentation when we build out our API listing.
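Putting the pieces together, here is one sketch of the single-newspaper logic. The axios and cheerio calls are injected as parameters (fetchHtml, loadHtml) purely so the logic can be exercised without hitting the network; in the real route they are axios.get(address).then(res => res.data) and cheerio.load(html), and the result goes out via res.json:

```javascript
// Sketch of the /news/:newspaperId logic; field names mirror the tutorial,
// while fetchHtml and loadHtml are stand-ins for axios and cheerio.
async function getSpecificArticles(newspaperId, newspapers, fetchHtml, loadHtml) {
  // Find the matching newspaper and pull out its address and base.
  const newspaper = newspapers.filter(n => n.name === newspaperId)[0];
  const html = await fetchHtml(newspaper.address); // axios.get(...) in the real app
  const $ = loadHtml(html);                        // cheerio.load(html) in the real app
  const specificArticles = [];

  // Every <a> whose text contains "climate" becomes one article object.
  $('a:contains("climate")').each(function () {
    const title = $(this).text();
    const url = $(this).attr('href');
    specificArticles.push({
      title,
      url: newspaper.base ? newspaper.base + url : url, // prepend base if one exists
      source: newspaperId,
    });
  });
  return specificArticles; // the route would send this with res.json(specificArticles)
}
```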
We get just the specific articles, great. Before we move on to put this on RapidAPI, I'm actually going to add a lot more newspapers in here, so I'm just going to paste a few that I made earlier. Okay, so now we have a lot more data coming back to us; once again I'm just going to visit /news, and ta-da, we're getting lots and lots more data, up to 730 lines of data to be exact. Wonderful, this is looking good, let's carry on. Before we move on I'm just going to format this a little better, just so it looks a lot neater: I'm going to select the text, optimize imports, rearrange code, and run. And great. Okay, it turns out we don't actually need this, so let's get rid of it. Now, to prep for deploying onto Heroku, I need to include the nodemon package in the project itself, as right now we have it installed globally on our machines, but Heroku doesn't know that. So there we go: npm install nodemon. And finally I need to change the port options for Heroku, so give it an option like this, and great, this is looking wonderful. Again, you will be able to get this from my source code, which I will share; please feel free to use it for educational purposes or to build your own API. Okay, so now let's go to the RapidAPI platform. Here we are, back on our RapidAPI dashboard. Let's actually call our API something: I'm going to call it Climate Change Live, with a description along the lines of "an API showing all the latest climate change news around the world", as we do have Australian articles in there and some American ones, so it's quite worldly. For the category I might say Data, or News; let's go with News, I think that's probably most appropriate. The owner of this API will be me, that is correct, and I'm just going to keep it as UI and add the API, just like so. Okay, so here we are; we need to add a base URL. Well, for this I'm going to deploy my app onto Heroku, which should be relatively painless.
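The port change mentioned above can look like this; Heroku injects the port through the PORT environment variable, and 8000 is only an assumed local fallback:

```javascript
// Heroku tells the dyno which port to bind via process.env.PORT, so the app
// must not hard-code one; 8000 here is the local-development fallback.
const PORT = process.env.PORT || 8000;

// In the real app:
//   app.listen(PORT, () => console.log(`server running on port ${PORT}`));
console.log(PORT);
```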
So all I'm going to do is ask you to head over to Heroku, where you should see all the projects I've posted so far, and I'm just going to go ahead and create a new one: give it a name, let's go with climate-change-api, choose the region, and just click create app. There we go; now we should have a lot of information on how to deploy this using Heroku. Okay, so if you haven't downloaded the Heroku command line interface, please go ahead and do so now; I'm just going to do that with you here. I'm going to copy this command, go into my terminal, create a new tab, and run brew tap heroku/brew && brew install heroku. Now, if you are a Mac user like me, you're also going to have to install Homebrew, so please go ahead and do that simply by copying the command here and pasting it into your terminal. I already have it installed, so I'm ready to continue; this is only for Mac users, and if you have another machine, please do use the equivalent. Okay, so that is installing, and that should be ready to go. Now that we have the command line interface installed, let's carry on with the instructions given to us. Once that has finished downloading, I'm just going to run heroku login and then press any key to open it up. This is just to make sure that we are logged in, so I'm going to click log in here, just like so, and we are now logged in; we can close this page and continue. Now we're going to run git init, and that has initialized an empty git repository in my project. Okay, so making sure that we are inside the climate-change-api project, let's carry on; I'll take the next command given to us right here and just paste it in, as we are simply going through all the commands provided. Great. Now, before we move on, I'm going to add a .gitignore file so that we don't upload any node modules when we start adding files: I'm just going to go ahead and create .gitignore, and what I am saying in it is that I want to ignore the node_modules folder from being committed, and just save that.
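The ignore file just created, together with the deploy commands the Heroku page walks through, looks roughly like this; the app name climate-change-api is an example, and the exact remote-setup command comes from your own app's deploy instructions:

```shell
# .gitignore: keep installed dependencies out of the repository
echo "node_modules" > .gitignore

# Assumed deploy sequence, matching the steps described above
git init
heroku git:remote -a climate-change-api
git add .
git commit -m "final commit"
git push heroku master
```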
Okay, so now we can carry on: I need to add the files, then commit the files with a message, let's say final commit, and push it to master, just like so. And there we go, the build has succeeded, so let's go ahead and check it out. Great, and there we have it: we have deployed our site, it is now live, and all our data is live right here. The next thing I'm going to do is go back to RapidAPI and continue. Now that we have done that, I'm just going to upload an image that I want to represent my API, and under website I'm going to use the URL that we have just created; we have deployed to this URL, this is where our app is going to sit, so I'm going to use that right here and type it in. The next thing we're going to do is add some endpoints. So what shall we call the first endpoint? Well, I'm going to create my first REST endpoint, so just go ahead and click here, and let's call this Get All Climate Change News, with a description saying that this endpoint will return all news about climate change from all over the world. Now, for the endpoint itself, we're just going to specify what we did before: to get all the news articles we had /news, and this is indeed a GET request, as we're going to get that data. That is all we have to do; we don't have any parameters to worry about on this occasion, and then we have all these options available to us. We are using Node.js and we are using axios, as it has picked out, so this is all we really need to do for now; however, if you do want to see the others, please go ahead and explore them. Okay, so for the next endpoint I'm going to start off with the newspaper ID, so I'm going to go /news, and then we can't use :newspaperId, as that will not work here; as you can see, we need to put it in curly braces, just like it is prompting us to do, so that's what I'm going to do.
There we go: newspaperId has already been picked up as my parameter, so that is looking good. Let's go ahead and put in an example value; we know that one of them is guardian, so I'm just going to put in guardian, and you can see that has populated right here, so go ahead and save that. We can now see some example responses showing up, and we can also add mock responses if we wish, so if you want to fill those out, please go ahead and do so here; I'm going to do that now. This is all really useful when people visit RapidAPI and see our API, as they can see what kind of responses to expect when they hit a certain endpoint; it allows people to get an idea of what your API can provide and what kind of value it gives. And of course we have the plans and pricing. Here are the public and private plans, and this is sort of what it looks like; the Basic one is free, so, for example, I can say that we have unlimited requests, or change the monthly requests on the free option to be 1,000 per month. We can also have a Pro plan, so let's go ahead and add one; once again this is going to apply to all endpoints, and you can change the quota type and quota limit, and set overages as well. Let's say that we charge 0.1 extra for overages. You can do whatever you like; let's just go ahead and save that. This one is a little bit higher, but not super high; I'm not going to charge anyone an extravagant amount for this, but these are the options that you have. Okay, so this is looking good; let's go ahead and publish it. I'm going to make the API visibility public, and there we go: here is my API, live on RapidAPI. You can see the two endpoints that we have: we can get all the news, and we can also get the news by newspaper ID. We of course have to tell people which newspaper IDs are available, and that is all done in my API
documentation, which has actually been generated for us nicely by RapidAPI. Here you can see all the example responses, and you also have the dropdown that we saw earlier, so if you want to make the request with Node.js, it gives you the code ready for you to copy and put in your project, and there's a bunch of other languages too, so whatever you feel most comfortable with is available as an option. Okay, so hopefully you've enjoyed this tutorial. I hope you now have your own API that you can sell and pass on to people, pass on to friends, to make some money from it. Again, please feel free to charge as much as you feel comfortable with when selling your API. For now, that's it from me; thanks very much. All the code and source code will be available in the description, as well as links to RapidAPI, so please go ahead and check those out.
Info
Channel: Code with Ania Kubów
Views: 181,130
Rating: 4.9254236 out of 5
Keywords: what is an api, restful api, rest api, rapid api, rapidapi, rapidapi hub, startup idea, selling api, software development, express routing, cheerio, nodejs, web scraper, news api, build news api, climate change api, ania kubow, coding, coding tutorial, nodejs tutorial, nodejs project, backend development
Id: GK4Pl-GmPHk
Length: 59min 10sec (3550 seconds)
Published: Wed Oct 13 2021