Airflow Docker: Run Airflow 2.0 in a Docker Container

Video Statistics and Information

Captions
Hello everyone, this is coder2j. Today we are going to learn how to install Apache Airflow using Docker.

So what is Docker? Docker is a platform that uses OS-level virtualization to deliver software in packages that are isolated from their environment. With Docker we can solve the "it works on my machine but not on yours" problem. In a nutshell, using Docker we can make sure that when a coding project works in my development environment, it also works in the deployment environment, no matter what the OS or hardware differences between the two environments are, as long as Docker is running on both of them. Doesn't that sound great? Let's get started.

Let's open VS Code, create a project in my desktop directory and open it. I'm going to name it airflow_docker. Let's open the terminal and check whether we are in the right directory with the command pwd. As we can see, we are inside our project directory. That's great.

Then I'm going to open Apache Airflow's official website and look for the documentation. We click the Quick Start section. Instead of running Airflow locally, we are going to install it with Docker, so we click "Running Airflow in Docker". Before we actually do the installation, we have to install Docker and Docker Compose on our laptop. No worries, it's just as simple as installing any other software. If you are running a Mac or a Windows laptop, all you need to install is the Docker Desktop application. You can find the download links for both Windows and macOS in the description. Once you've downloaded the file, just double-click it and follow the installation steps.

OK, as I already have it installed, I'm going to launch the Docker Desktop application. It might take a minute or more depending on how powerful your laptop is. Once it has started, we can see the Docker icon in the menu bar. When clicked, it indicates the status of Docker. When you see the green text that says "Docker Desktop is running", we can go back to VS Code and check the Docker and Docker Compose versions with the commands docker --version and docker-compose --version. If you see the version output, it means you have a working Docker and Docker Compose.

Now we should have all the preparation work done. Let's go back to the Airflow documentation and download the official docker-compose YAML file by copying the curl command, pasting it in the terminal and pressing enter to execute it. Once it succeeds, we can see the docker-compose file has been downloaded into our project directory.

Let's open the YAML file. We can see it defines many services and composes them together. By default, the YAML file configures Airflow with the CeleryExecutor. To keep things simple, we are going to use the LocalExecutor. Therefore, I'm going to change the core executor from CeleryExecutor to LocalExecutor. We don't need the Celery result backend or the Celery broker URL for the LocalExecutor, so we delete them. Redis is only needed for Celery, so we don't need it either: delete its dependency and its service definition. We also don't need the Celery worker and Flower, so we remove them as well. That's it. We save the YAML file and are ready to go. I suggest you pay close attention to these steps and watch back and forth to avoid missing any of them. You can also find a GitHub repository link in the description below, from which you can get the final version of the YAML file.

OK, next step: we need to create folders for dags, logs and plugins, which are quite self-explanatory, holding the Airflow DAGs, logs and customized plugins. Just copy the command, paste it in the terminal and execute it. We can see all the folders have been created successfully under our project directory.
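For reference, the preparation steps described so far boil down to a handful of shell commands. This is a rough sketch based on the Airflow 2.0 quick-start; the exact compose file URL depends on the Airflow version you target, so treat it as an assumption and copy the command from the official documentation instead:

    # download the official docker-compose file (URL/version is an assumption; copy it from the docs)
    curl -LfO 'https://airflow.apache.org/docs/apache-airflow/2.0.1/docker-compose.yaml'

    # create the folders Airflow expects next to the compose file
    mkdir -p ./dags ./logs ./plugins

    # then edit docker-compose.yaml: switch the executor and drop the Celery-only pieces
    #   AIRFLOW__CORE__EXECUTOR: LocalExecutor
    #   remove AIRFLOW__CELERY__RESULT_BACKEND and AIRFLOW__CELERY__BROKER_URL
    #   remove the redis, airflow-worker and flower services (and the redis dependency)

The environment variable and service names above are taken from the official compose file for Airflow 2.0; if you use a newer release, check the file you downloaded rather than relying on this listing.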
Next, we echo the Airflow user ID and group ID into a .env file. This is only necessary when you are running a Linux OS; I'm using macOS, so I just skip this step.

Then we are going to initialize the database with the command docker-compose up airflow-init. We can see that it downloads all the necessary Docker images and sets up an admin user with airflow as both the username and the password. Once you see that airflow-init exited with code 0, the database initialization is complete.

Next comes the most exciting step: we are going to run Airflow with the command docker-compose up -d, where -d means detached mode, running the containers in the background. Let's check which containers are running with the command docker ps. We can see from the output that there is an Airflow webserver, an Airflow scheduler and a Postgres database.

Let's open the browser and go to 0.0.0.0 on port 8080 to check our Airflow webserver. In the meantime, we can also open the Docker dashboard, where we can likewise see all the running containers of our Airflow project. We type the username and password airflow to log in. Boom, we can see that Airflow is running properly in Docker, together with all the example DAGs. Let's pick the first one and start it. When we click the refresh button, we can see the tasks have been scheduled and executed. In the end, the successful DAG and task runs are marked in dark green. That's awesome. Congratulations, you have got Airflow running successfully in Docker.

Thanks for watching. If you have any questions or suggestions, feel free to leave them in the comments section. If you enjoyed this tutorial, consider subscribing, clicking thumbs up or sharing it with your friends and colleagues. I will talk to you in the next one.
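To recap, the run-and-verify steps from the walkthrough map roughly onto the commands below. This is a sketch assuming the default port and the docker-compose.yaml from the Airflow 2.0 quick-start; the .env line is only needed on Linux:

    # Linux only: pass the host user/group IDs so files written to ./dags and ./logs are owned by you
    echo -e "AIRFLOW_UID=$(id -u)\nAIRFLOW_GID=0" > .env

    # initialize the metadata database and create the airflow/airflow admin user
    docker-compose up airflow-init

    # start the webserver, scheduler and postgres in the background (detached mode)
    docker-compose up -d

    # list the running containers
    docker ps

    # then open http://localhost:8080 (or http://0.0.0.0:8080) and log in with airflow / airflow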
Info
Channel: coder2j
Views: 37,423
Keywords: airflow, airflow tutorial, apache airflow, apache airflow tutorial, airflow tutorial python, airflow for beginners, apache airflow tutorial for beginners, airflow tutorial for beginners, airflow 101, airflow docker, airflow introduction, airflow explained, apache airflow use cases, airflow example, python, etl, datapipeline, dataengineer, data engineer tutorials, airflow docker compose, airflow docker setup, apache airflow docker tutorial
Id: J6azvFhndLg
Length: 8min 24sec (504 seconds)
Published: Tue Mar 02 2021