Set up Postgres and MongoDB in Django | Docker, DB Routers, Logging

Captions
What's up everyone, welcome to the last video of 2021. In this episode we're going to talk about how to add multiple databases to a Django project.

The first question is: why do we even need this? Let's first describe the problem and then give a solution. Assume we have an application that runs really time-consuming, heavy tasks behind the scenes. Say we pass a task to a Celery distributed queue and it starts sending emails to a ton of recipients, let's say a thousand users. Meanwhile there can be errors, warnings, or critical messages that we miss during the process: maybe the SMTP server goes down, maybe there is a disconnection, maybe an email address is not valid. There are many things that can go wrong during a long-running task, and if we don't capture logs we will never know what's going on in the environment. Users can end up with a crashing application, and we will never know where the problem came from. That's where logging comes in, and we have to keep these logs somewhere.

Say we don't have MongoDB and we are only using Postgres. Postgres is a relational database, and we want to keep useful, structured data there, not logs, because there will be tons of logs and they will clutter the whole database. Imagine a really long message with a stack trace: writing that constantly puts a heavy load on Postgres, and at some point things may crash or data transfer will slow down. Writing logs every few seconds into Postgres while other users are registering or writing genuinely useful data to the same database is a bad approach; everything gets mixed up. At this point we can use MongoDB to separate the log objects from Postgres and route all the log data flow to MongoDB. MongoDB is a non-relational database, so keeping our logs there is a better approach than keeping everything in Postgres. That balances the data flow between the two databases, and our application will work like a charm.

I will go through the accompanying blog post as we build. Feel free to follow the URL in the description, or navigate to the blog directly — you will find this post there. Before starting, I hope you have a basic understanding of Docker, Celery, Redis, and of course Django, because I'm not planning to explain in detail what every configuration does; I just want to show you how to implement multiple databases. That background still matters, because we will also configure MongoDB inside the compose file, so you need to understand what those configurations are.

Let's start. Open your code editor; I created a new folder named my_project and will keep all files inside it. First, create a new folder named app — that's going to be our app — and inside it add a Dockerfile.
Copy the Dockerfile configuration from the blog post. We are basically pulling a Python image; the extra packages are needed for building the Postgres driver, and there are a couple of environment variables that prevent Python from writing cache files. Then we set the working directory to the app folder, copy everything into it inside the container, and finally install requirements.txt. Which means we have to create another file named requirements.txt, so let's see what our requirements are: copy and paste.

We are going to use Celery as the distributed task queue; Django, as always; then djongo — I'll come back to this in a second — which is the package we need if we want to use MongoDB in a Django project, together with pymongo; psycopg2, which is the Postgres driver (it has a really hard name to read); redis, the message broker Celery will use; and Faker, a fake data generator.

Now let me explain the extra requirement I'm pulling from my GitHub repository. It is simply a fork of djongo with some small refactoring, because djongo has problems with newer versions of Django: it throws NotImplemented-style exceptions, and judging from Stack Overflow and the GitHub issues, people are struggling with this. That's why I decided to make a small change and point the requirements at the fork, to keep everything smooth. You can do the same — actually you don't have to, and I highly recommend keeping everything exactly as in this video to avoid errors — but if you decide to use a different version of Django, not 3.1.3 but something newer, then you have to follow these steps.

In the original djongo repository, navigate to setup.py and scroll down to the install_requires block: it declares support for Django from version 2.1 upwards, but we have to adjust the upper bound. In my fork I updated setup.py so that the upper bound matches the Django version we will actually use; if you decide on a different version, put that version as the upper bound instead. The second file to change is requirements.txt: the original pins Django to at most 3.1.4, and I changed it to 3.1.3, the version used in this project. Do the same with whatever version you are using, then point the requirement in your project at your fork, and hopefully it will not throw any errors — but again, I highly recommend keeping it the same as in this video. For more information, check the blog post; it goes into more detail in case I missed something while talking. That's it for the djongo configuration.
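Purely as an illustration of the kind of change described above — not the actual contents of the fork, which you should check on GitHub — the relevant part of djongo's setup.py might end up looking roughly like this after pinning the upper bound to the Django version used here:

```python
# setup.py (illustrative sketch only; the real file in the djongo fork may differ)
from setuptools import setup, find_packages

setup(
    name="djongo",
    packages=find_packages(),
    install_requires=[
        "sqlparse",
        "pymongo",
        # upper bound pinned to the Django version this project uses (3.1.3);
        # adjust it if you run a different Django release
        "django>=2.1,<=3.1.3",
    ],
)
```

The matching requirements.txt line in the fork is adjusted the same way, and the project's own requirements.txt then installs djongo from the fork's URL instead of PyPI.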
Let's continue with the next step: creating the docker-compose file. In total we will have five services: MongoDB, Postgres, the app (which is the Django project itself), Celery as the distributed task queue, and Redis, the message broker that Celery requires. Simply copy all these configurations from the post, create a new file outside of the app directory named docker-compose.yml, and paste everything there.

I will not go through this in detail, assuming you already have a basic understanding of how Docker and compose services work — that was the prerequisite I mentioned before starting — because otherwise, explaining every line would make this video very long. A couple of things worth mentioning: for the Mongo service we fetch the official mongo image from Docker Hub and name the container the same as the image; there are some required environment variables we need to set, and we pull those sensitive credentials from an .env file (which we will create in a second) instead of keeping them inside docker-compose.yml. There is a volume configuration to share data between the container and the host, and we expose the default MongoDB port. The same pattern applies to the other services: Postgres, Redis, our Django app, and finally Celery, the distributed task queue.

The next step is creating the environment file. Copy the content from the post, create another file named .env, and paste it in; we keep all our credentials inside this file.

Now it's time to create our Django project. Copy the command, open up a terminal, and run it. That will create the Django project inside the app directory, with the project itself also named app, and as you see it holds all the necessary root configuration: settings, urls, and manage.py. Good — that's our Django project, and the structure now looks exactly like the one in the post.

Next we have to integrate Postgres with Django, and for that we need to update the database configuration inside settings.py. Open settings.py and scroll down to the DATABASES configuration. By default Django uses SQLite; if you see a db.sqlite3 file among the project files, simply delete it, because we are not going to use it. Replace the default configuration with the following, and import the os module at the top so the settings can read environment variables. The default database is going to be Postgres — we name it default — and it holds the Postgres credentials, which we pull from the .env file; this is how Django will be authorized to connect to our database. A sketch of what this looks like is below.
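A minimal sketch of that default DATABASES entry. The environment variable names (POSTGRES_DB, POSTGRES_USER, and so on) are assumptions on my part; they need to match whatever is defined in the .env file and passed through docker-compose:

```python
# app/settings.py (sketch; env variable names are assumptions)
import os

DATABASES = {
    # "default" is the alias the database router will later treat as Postgres
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": os.environ.get("POSTGRES_DB"),
        "USER": os.environ.get("POSTGRES_USER"),
        "PASSWORD": os.environ.get("POSTGRES_PASSWORD"),
        "HOST": os.environ.get("POSTGRES_HOST", "postgres"),  # compose service name
        "PORT": os.environ.get("POSTGRES_PORT", "5432"),
    },
    # a second "nonrel" entry for MongoDB will be added later in this walkthrough
}
```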
Next, let's create a core app. Navigate into the app directory, run the startapp command from the post, and it will create another app named core. Before going further, let me explain why we just created a new app. The reason is that sometimes Postgres finishes launching after Django has already tried to connect, which makes Django throw an exception saying it can't connect to the Postgres database. This happens a lot, even though we configured depends_on for the app service, and as you can see on Stack Overflow and in GitHub issues, developers hit this frequently. Some developers create shell scripts to force Django to wait for Postgres, but we will achieve the same thing with a custom management command that pauses Django, waits for Postgres, and then lets startup continue. That's why I created the core app: inside it we will create a directory named management, and inside that another directory named commands, and we will place our custom commands there. That is the approach the official Django documentation recommends, so we are going to follow it.

So, with the core app created, create a new folder named management (Ubuntu's permission system was being annoying here, so it took me a couple of tries), and inside it an __init__.py to mark it as a Python package; then a new folder named commands, also with its own __init__.py. Our custom command file is going to be named wait_for_db.py. Now navigate to the blog post and copy the content of the command — by the way, I originally saw this approach on Stack Overflow; I will put the link in the description of this video, and you can find more explanation about the code there.

As you see, this is a Django command that pauses execution until the database is available. We take the connection named default from django.db.connections — why default? Because in our settings the Postgres database runs under the default alias. We check whether Postgres is reachable; if not, an OperationalError is raised, meaning the database isn't up yet, so we wait one second with time.sleep(1) and try again until the database connection is available. A sketch of the command is below.
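A minimal sketch of such a command, following the pattern just described (the exact code in the post may differ slightly):

```python
# core/management/commands/wait_for_db.py (sketch)
import time

from django.core.management.base import BaseCommand
from django.db import connections
from django.db.utils import OperationalError


class Command(BaseCommand):
    """Django command that pauses execution until the database is available."""

    def handle(self, *args, **options):
        self.stdout.write("Waiting for database...")
        db_conn = None
        while not db_conn:
            try:
                # "default" is the Postgres alias from DATABASES in settings.py
                db_conn = connections["default"]
                db_conn.cursor()  # actually try to talk to the database
            except OperationalError:
                self.stdout.write("Database unavailable, waiting 1 second...")
                time.sleep(1)
        self.stdout.write(self.style.SUCCESS("Database available!"))
```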
Since we added a custom command, we have to run it somewhere, and that place is the app service in docker-compose: update its command block so that it first waits for the database, then runs migrate (to apply any migration files we have made), and only after that launches the Django server. Copy that part from the post, go to the app service in docker-compose, update the command block, and save.

Now let's switch to the blog post again: next is the integration with MongoDB. Inside settings.py we extend the DATABASES configuration and add MongoDB under a new alias named nonrel. So default is the alias Postgres runs under, and nonrel is the alias MongoDB will run under. We will use these aliases in the database router, which I'll explain in a second: if a model belongs to nonrel, it should be written to MongoDB; if it belongs to default, it is written to Postgres. The remaining keys are simply required settings, but the main thing is the ENGINE, which is going to be djongo. As I said at the beginning of the video, djongo handles all the SQL-to-MongoDB translation behind the scenes while keeping the Django ORM unchanged: the way we use the ORM does not change in any way, and thanks to djongo we can create anything in MongoDB through it.

The next step — and this is the other main point of the entire project — is setting up the database router. What do we do with a database router? We have to separate our models: as I said, the logs should go to MongoDB, while the other models must go to Postgres. If we don't set up any database router, all models, including the logs, will go to the default database, which is Postgres. So we need to tell Django: if the model being written is Log, write it to MongoDB. That's exactly where the database router comes in.

Let's copy the NonRelRouter class from the post and add it inside a utils directory: create a directory named utils inside the core app, add an __init__.py to mark it as a Python package, then create db_routers.py and paste the class in.

Before we go on, let me show you the automatic database routing documentation: these are the methods a database router can provide, and we are going to use three of them — db_for_read, db_for_write, and allow_migrate. In db_for_read we check whether the model name is in our set of nonrel models. Assume we have a model named Log: we list it in that set in lowercase, as a string, and if we had another model, say User, that should also live in MongoDB, we would add "user" the same way. So if the model name is log, we return the nonrel database — nonrel is MongoDB, as we defined in settings — otherwise we return default, which is Postgres. The same logic applies to db_for_write: reads cover filtering and selecting, writes cover inserts and updates, but the check is identical — if the model is a nonrel model such as Log, return MongoDB, else Postgres. Finally, allow_migrate: migrations are only for Postgres, because MongoDB doesn't need migrations since it's non-relational, so we only migrate the Postgres models. If the database is nonrel or the model name is in the nonrel models set, we return False, meaning MongoDB models are never migrated; if the model belongs to Postgres, we return True, meaning it should be migrated. That's it for the database router.

Once that's done, navigate to the blog post again: now we need to tell Django to use our custom database router, and to achieve that we copy the DATABASE_ROUTERS line and add it to settings.py, right after the DATABASES configuration, giving the path to our custom router: core.utils.db_routers.NonRelRouter. By doing this we tell Django to route the incoming data flow to the proper database using our custom router. A sketch of the router is below.
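A minimal sketch of such a router, following the logic just described; the set of nonrel model names and the module path match this walkthrough, though the post's exact implementation may differ slightly:

```python
# core/utils/db_routers.py (sketch)

class NonRelRouter:
    """Route models listed in nonrel_models to MongoDB, everything else to Postgres."""

    # lowercase model names that should live in the "nonrel" (MongoDB) database
    nonrel_models = {"log"}

    def db_for_read(self, model, **hints):
        if model._meta.model_name in self.nonrel_models:
            return "nonrel"
        return "default"

    def db_for_write(self, model, **hints):
        if model._meta.model_name in self.nonrel_models:
            return "nonrel"
        return "default"

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        # MongoDB needs no migrations; only migrate models that belong to Postgres
        if db == "nonrel" or model_name in self.nonrel_models:
            return False
        return True

# In app/settings.py the router is registered as described above:
# DATABASE_ROUTERS = ["core.utils.db_routers.NonRelRouter"]
```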
Finally for the infrastructure, we have to set up Celery and Redis. Since we are building a very small, very simple application just to exercise all these pieces, we need a deliberately heavy task — generating lots of posts, maybe a hundred or a thousand — and to run it we set up Celery and Redis so the heavy task can be pushed onto a queue; then we will be able to test the application. Inside the root of the Django project, create a file named celery.py and paste the code from the post. Following this configuration, Celery will discover all tasks across the project — that's what autodiscover_tasks is for. Then we update the project's __init__.py as shown in the post, and next we add the Celery configuration to settings.py: I'll paste it at the end of the file. These are just Celery settings such as the task time limit, ignoring results, and the broker URL, which points at Redis. Let me check the environment file: there is the Celery broker URL pointing at Redis, and also a Redis channel URL — we don't actually need the channel one, the broker URL is enough, so you could remove it, but to keep things identical to the video just leave it as it is.

With Celery and Redis configured, we need to add our models. The first model is going to be Log, as I said. Create a new directory named models inside the core app, mark it as a Python package with an __init__.py, and then create a file for the nonrel model and paste the model from the post. As you see, it imports models from djongo under an alias — something like `from djongo import models as mongo_models` — just to separate it from the regular models defined in django.db; giving it an alias keeps the two apart. The class is named Log, and if you remember, we listed "log" inside the database router's nonrel models, so whenever a new Log object is created anywhere in the code, the database router will route it to MongoDB instead of the default database, Postgres. For the Log model we have an _id field — with MongoDB the id field is defined explicitly, like `_id = models.ObjectIdField()` — a message field, which holds the log message and is naturally a TextField, and created/updated timestamps so we know when the object was created. We also indicate that this model uses the nonrel database, set the Meta ordering on the created date, and the string representation returns self.message. Save it and let's continue.

Then we need the Postgres model: create another models file and paste it in. It's simply a Post model with a title and a description field, nothing complicated. Next, import both models inside the package's __init__.py, so that we can write `from core.models import Log` without mentioning the individual module names. A sketch of both models is below.
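A sketch of the two models just described. The field names follow the walkthrough, but the file names and the exact djongo field classes are my assumptions; check the post for the authoritative version:

```python
# core/models/nonrel_models.py (sketch) -- stored in MongoDB via the "nonrel" alias
from djongo import models as mongo_models


class Log(mongo_models.Model):
    _id = mongo_models.ObjectIdField()            # explicit MongoDB id field
    message = mongo_models.TextField()            # the log message itself
    created_at = mongo_models.DateTimeField(auto_now_add=True)
    updated_at = mongo_models.DateTimeField(auto_now=True)

    class Meta:
        ordering = ["-created_at"]                # order by creation date (direction assumed)

    def __str__(self):
        return self.message


# core/models/postgres_models.py (sketch) -- stored in Postgres via the "default" alias
from django.db import models


class Post(models.Model):
    title = models.CharField(max_length=255)      # required: no blank=True, which the demo task exploits
    description = models.TextField()

    def __str__(self):
        return self.title
```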
Finally — I keep saying finally, but it feels endless — we add the Celery tasks. Copy the code from the post into a new file named tasks.py inside the core app; this will hold the Celery tasks, the time-consuming, heavy work that needs to run in the background. We import logging, get a logger, and attach an extra handler for the Log model. The reason is that once a log record is produced at some point during the process, we want it written to MongoDB right away; by adding a custom handler, the log message is written to MongoDB the moment it is produced, so we never create Log objects manually — the handler creates them automatically. We will create that logging handler right after this step.

Then we have the shared_task decorator: if we are planning to hand a function to Celery, we have to decorate it — you should already know this if you are familiar with Celery. Inside the task we use Faker, the fake data generator, to produce some fake sentences and text, and randint gives us a random number of posts between 5 and 100. Then we iterate, and for every number that divides evenly by five (remainder zero) we set the title to None. Why? Because, if you look at the Postgres model, the title is required — we didn't add blank=True — so a None title raises an exception, and we catch that exception and create a log for it. In other words, whenever the remainder is zero the task deliberately produces an error; some of the posts will keep failing until the whole run is done, and that lets us confirm the log objects are being written to MongoDB. Save it and let's continue with core/utils/log_handlers.py.

Time to create the log handler. Inside the utils directory, create another file named log_handlers.py, paste the code from the post, and save: it saves log messages to MongoDB. This is the custom logging handler we attached in tasks.py. What's going on in it? The class inherits from logging.Handler, and the interesting part is the emit method: its docstring says "do whatever it takes to actually log the specified logging record", and we can implement it however we like. What we want, of course, is to save the log right away after it's produced — to create the Log object inside MongoDB. So emit simply calls the ORM: Log.objects.create with the record's message. Once the Log object is created, it goes through the database router, db_for_write kicks in, checks whether the model name is in the nonrel models — it is, since it's the Log model — and returns the nonrel database, which is MongoDB, not Postgres. That's how the log message ends up inside MongoDB. A combined sketch of the handler and the task follows below.
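A compact sketch combining the two files just described: the handler that persists log records to MongoDB and the Celery task that deliberately produces failures. Names such as MongoLogHandler and create_random_posts are placeholders of mine; the post may use different ones:

```python
# core/utils/log_handlers.py (sketch)
import logging

from core.models import Log


class MongoLogHandler(logging.Handler):
    """Write every emitted log record straight into MongoDB via the Log model."""

    def emit(self, record):
        # the database router sends Log writes to the "nonrel" (MongoDB) database
        Log.objects.create(message=self.format(record))


# core/tasks.py (sketch)
import logging
import random

from celery import shared_task
from faker import Faker

from core.models import Post
from core.utils.log_handlers import MongoLogHandler

logger = logging.getLogger(__name__)
logger.addHandler(MongoLogHandler())


@shared_task
def create_random_posts():
    fake = Faker()
    number_of_posts = random.randint(5, 100)
    for number in range(1, number_of_posts + 1):
        # every fifth post gets a None title, which violates the required field
        title = None if number % 5 == 0 else fake.sentence()
        try:
            Post.objects.create(title=title, description=fake.text())
        except Exception:
            # the handler above writes this message to MongoDB immediately
            logger.exception("Post number %s failed", number)
```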
The next step is creating the views. Copy the view from the post and paste it: we simply return a JsonResponse — I don't want to configure any templates — just so the response tells us the process has started. From tasks we import create_random_posts, and to actually hand it to Celery we call it with the delay method: create_random_posts.delay(). That way the task is passed to Celery and the whole flow kicks off: generating random posts and catching errors. Next, configure the URLs: update urls.py inside the project folder as shown in the post, and that's it.

Finally, run makemigrations through docker-compose. My first attempt failed — I had misspelled the file name of the Postgres models, so let's rename it to postgres_models and try again. On the second run Django complained that the app label isn't in INSTALLED_APPS; sure enough, core wasn't included in INSTALLED_APPS in settings, so add it and save. Hopefully it won't throw anything now. Running makemigrations once more reports "no changes detected in core" — the migrations had already been created once the filename was fixed; the earlier error was purely the misspelled models file name (the code itself was fine, only the file name was wrong). So the migration file is created for both Log and Post, but the Log one will not be migrated, because migrations should not apply to MongoDB — and now you will see that in action.

Run docker-compose up in detached mode. I forgot to create an admin user, so let's also create a superuser, so the admin interface can show us that the created objects are actually present in the tables. Before that, let's make sure each container is running successfully — oh my, there are a lot of containers here; I think we started them multiple times. Run docker-compose down to stop all of them, confirm there is no running container, and bring everything up again in detached mode. Everything is up.

Navigate to the admin and log in. We also have to register our models so they show up on the admin page: inside admin.py, import Log and Post from the models and register both. Now we can see them in the admin — no logs and no posts yet. Now navigate to the home path: Celery automatically starts the task, and the Log objects start being created. If we check again, you will see the random posts have been generated and there are also logs inside MongoDB — for example, post number 50 failed, and the message is recorded there.

Lastly, I want to prove that these objects really were written inside MongoDB. I'll switch to VS Code, where I have the MongoDB extension installed and already connected to our MongoDB instance, and you can see there are 11 documents — we have 11 logs and 11 documents, which proves the Log objects were created inside MongoDB, not in Postgres; you can see the individual documents there. And of course the Post objects were created inside Postgres. A small recap sketch of the view, URL, and admin wiring follows below.
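For reference, a minimal version of the view, URL wiring, and admin registration described above; the view name and URL pattern are placeholders of mine, not necessarily the ones used in the post:

```python
# core/views.py (sketch)
from django.http import JsonResponse

from core.tasks import create_random_posts


def generate_posts(request):
    create_random_posts.delay()   # hand the heavy task to Celery instead of blocking the request
    return JsonResponse({"status": "Post generation started"})


# app/urls.py (sketch)
from django.contrib import admin
from django.urls import path

from core.views import generate_posts

urlpatterns = [
    path("admin/", admin.site.urls),
    path("", generate_posts),     # hitting the home path triggers the Celery task
]


# core/admin.py (sketch)
from django.contrib import admin

from core.models import Log, Post

admin.site.register(Log)
admin.site.register(Post)
```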
Whew, that was it — that was a really big project. I hope you understood everything; there are many details I may have skipped while explaining the flow, but believe me, going through all of it in depth would take forever. The main purpose was to see the big picture: the infrastructure, how multiple databases — MongoDB and Postgres together — are integrated with Django, how to configure the database router, the settings, and especially docker-compose, where we run services like Postgres alongside Redis and Celery, which let us run time-consuming tasks in the background. All of it together makes a really powerful application that can handle a lot of traffic while we monitor what's going on simply by reviewing the logs. And we also reduce the load on Postgres, our main relational database, which should hold only the genuinely necessary data.

The source code is available on GitHub; I've put the link in the description of this video. Also, please check out the blog post — you will find a lot more detail there. That's it — thank you for watching, and happy new year! This was the last video of 2021; see you in the new year.
Info
Channel: Developer Timeline
Views: 3,285
Id: FxfzN9txtgE
Length: 36min 45sec (2205 seconds)
Published: Sat Jan 01 2022