Easily deploy a Laravel application with Docker

Video Statistics and Information

Captions
Hey everyone, Andrew here. I've received a lot of recommendations from those who have watched my Laravel Docker tutorials for a video showing how to deploy a Laravel app built with Docker to a production environment, so that's what I'll be showing in this video. Today you'll see two different ways of deploying a Laravel app to an external server and getting your site live in a matter of minutes with Docker and Docker Compose. So let's get started.

What I have here is a super basic Laravel application that I threw together for this demonstration. It's a list of posts, and I can click on a title and see the content for an individual post. There's a database seeder and a factory class for the posts that I can call with artisan as well. So it's pretty simple, but it uses a lot of the basic components that a standard Laravel app might have.

In order to deploy this app we'll be making use of Docker Compose, just like we use with our local environment. However, if we take a look at our current docker-compose file, there's a bit of an issue with the setup: running this one exposes the Nginx web server on port 8080, and we can see that in the browser when we visit our development site. This works great for our local installation, but if we deployed it like this we'd need to append the 8080 port to the end of the domain in order to access the site. What we need to do here is expose the port that our web server and browser naturally expect: 80. We could do that in this file, but I'd prefer to keep my local environment set to 8080, so I'm going to copy this file over to a new file called docker-compose.prod.yml and adjust the port accordingly. This way we have a separation between our local development environment and our production environment. The other thing we'll need to adjust is the MySQL passwords. Since this application will be available to the public, we should use more secure passwords than just "secret". Okay, that's better. Everything else looks good and should stay the same.

We're going to be deploying this on a DigitalOcean droplet. It's a service that I've been using for years on all of my side projects, and I genuinely enjoy what they offer. The steps from here on out will work the same if you decide to use a different host, as long as you're able to SSH into the server and run sudo commands. I'm going to create a new droplet with the latest LTS version of Ubuntu, 18.04.3. It'll be the standard flavor and the smallest size, which should be plenty powerful for our application. The location looks good, I'll use my SSH keys, and we'll set the hostname to docker-laravel-01. All right, let's wait for this to spin up.

And it's ready. We can copy the IP address, and we can verify that it's up by SSHing into it as root. While we're in here we can complete our first step: adding a non-root user with sudo privileges. We'll call this user laravel and set a password for them, skipping through the rest of the details asked. Then we can add them to the sudo group using usermod -aG followed by sudo and the username that we selected. Okay, perfect. Let's back out now and log back in to refresh that user, and create a new folder in their home directory that our site will live in. We'll just call it the same as our development directory: laravel-docker-test. Now let's exit this session and talk about our deployment options.
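For reference, the server-prep steps just described boil down to roughly the following on Ubuntu 18.04; the username and directory name are the ones used in the video, and the exact invocations are an approximation of what's shown on screen:

    # as root on the fresh droplet
    adduser laravel                  # create the non-root user and set its password
    usermod -aG sudo laravel         # add the new user to the sudo group
    exit                             # log back in as the laravel user, then:
    mkdir ~/laravel-docker-test      # directory the site will live in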
For this first method we're going to use rsync. This command is available natively on macOS and Linux systems, but on Windows you'll either have to install a third-party program or use an alternative like FTP or cwRsync. The syntax for rsync is pretty simple: it's rsync, followed by any options, then a source location and the destination. You can use this to copy between folders locally, or from a local machine to a remote machine, which is how we'll be deploying this site. So let's do it: rsync, and our options are -a, which sends all files and folders recursively throughout our entire source directory; -v, verbose, giving us a good amount of console output as it runs; -z, which sends the files compressed, saving bandwidth and transfer time; and -h, which transforms any byte values into more human-readable formats. Since we're in our laravel-docker-test directory on the local machine, that's our source, and our destination is prefixed with the username and the IP address of our server, followed by a colon and then the complete path to our remote laravel-docker-test directory. Let's hit enter and watch it go.

All right, everything's transferred over. It's quite a lot, but that's because I didn't use something like a filter rule, which could have excluded the node_modules directory since that's only used for our local asset development. Oh well, let's press on. SSH back into the server, and since all of our files are here, we can get started installing Docker. I'll be following this article from Brian Hogan, and we'll just go through it step by step. Our prerequisites are met: we have a server running Ubuntu 18.04 and a non-root user. I didn't set up a firewall for this demonstration, but it's recommended to keep your unused ports closed off from the outside world. Okay, first things first: update the package list using sudo apt update, then install some prerequisite packages. Next, add the GPG key for the official Docker repository, add the Docker repository to our apt sources, update the package list yet again, and then ensure that we're installing from the Docker repo instead of the default Ubuntu repo. Okay, that all looks right. Finally, the last thing to do is actually perform the installation. Once that's finished, we can verify the Docker service is up and running using systemctl. Perfect.

The second part of this installation is getting Docker Compose set up, and I'll be using this guide from Melissa Anderson. By the way, if you'd like to check them out, both this and the previous Docker installation guide are linked below. Okay, our prerequisites look good, so we'll just go down the list of commands like in the previous tutorial: first, grab the release file using curl; then set the permissions so it's executable; and finally verify the installation using docker-compose --version. That's honestly all there is to it.
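A rough sketch of the rsync invocation described above; the IP address is a placeholder, and the --exclude flag is an optional extra (not used in the video) that would skip the node_modules directory:

    # run from inside the local laravel-docker-test directory
    rsync -avzh --exclude 'node_modules' \
        ./ laravel@YOUR_SERVER_IP:/home/laravel/laravel-docker-test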
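The installation steps from those two guides amount to roughly the commands below. The repository line targets Ubuntu 18.04 ("bionic"), and the Compose version number is only an example; check the project's releases page for a current one:

    # Docker Engine
    sudo apt update
    sudo apt install apt-transport-https ca-certificates curl software-properties-common
    curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
    sudo add-apt-repository "deb [arch=amd64] https://download.docker.com/linux/ubuntu bionic stable"
    sudo apt update
    apt-cache policy docker-ce           # confirm the candidate comes from the Docker repo
    sudo apt install docker-ce
    sudo systemctl status docker         # verify the service is running

    # Docker Compose (standalone binary)
    sudo curl -L "https://github.com/docker/compose/releases/download/1.25.5/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
    sudo chmod +x /usr/local/bin/docker-compose
    docker-compose --version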
So let's navigate to our site's directory on the server and get started. Just like when we bring Docker up locally with Docker Compose, we can use docker-compose up with the -d flag so it keeps running in the background. But there's a catch: here we're using a separate production docker-compose.prod.yml file with the modified ports from earlier, and if we just ran docker-compose up, it would use the default docker-compose.yml. To get around this we can pass the -f flag first and specify the compose file or files we want to use, in our case docker-compose.prod.yml, and then proceed with the rest of the command as usual: up -d --build. Oh, it seems like we ran into a problem. Well, we know that Docker is running, we just saw the service earlier. Let's use docker info and see what it returns. Okay: permission denied while trying to connect to the Docker daemon socket.

It turns out what's causing this is that the laravel user we created earlier isn't part of the docker group. We can fix this with a few simple commands: first create the group if it doesn't already exist, then add our laravel user to it using usermod -aG docker, and to refresh these changes, log out and log back in as the laravel user. All right, time to see if that worked. Perfect, docker info is displaying the information we're expecting. Now it's time to try to bring up our Docker Compose network on the server. Just like before, we use docker-compose with the -f flag and our production file, then up -d --build. Everything seems to have spun up fine, so that's great.

There are just a few tweaks we need to make before we can actually visit our application. We changed the MySQL database password earlier, so we'll have to open up our app's .env file and make the change there as well. Okay. Now if we go to our site's folder on the server, we can run docker-compose ps and view all the containers that are running for the application. Everything is contained within these Docker containers; we haven't installed PHP or Nginx or anything like that on the actual server. So if we need to run a migration, for instance, we'll need to rely on our artisan container. That's exactly what I'm going to show you now, because we need to get our database prepped and seeded for the application. By using docker-compose run with the --rm flag, we can specify a container to act as its namesake command, and it will destroy itself automatically once it's complete. So for the migration, docker-compose run --rm artisan migrate takes care of that, and the migration finishes just like it would if PHP were installed on the server itself. I mentioned earlier that I set up factory and seeder classes for the posts, so let's put those to use: docker-compose run --rm artisan db:seed, with the post seeder class specified, seeds our database with 20 random posts. Okay, it looks like that's finished, so now we can visit the IP address of our server, and we should be able to see our app. Beautiful: our posts are all here, and we can even click on them and visit the individual pages. There were no additional dependencies to install or software to configure on the server, and because it's running in detached mode, we can exit our SSH session completely and the site is still up and running.

So with this deployment method, what if we wanted to make a change to our site? How would we go about doing that? Well, let's change this section here on an individual article. We'll go into our Blade template and make the adjustment. We can see that change in the local version but not on the deployed production one, which is expected. Just like when we initially deployed our site files, we can use rsync to send over the changed file or files to our server; by default it should only send the files that have been modified since our initial deployment. Let's see what happens. Okay, it looks like our change has been published successfully, although unfortunately, even though I didn't record it, this process did mess up the permissions and the .env file on the server, and they needed fixing.
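The permission fix and the production bring-up described above would look roughly like this; the group commands follow the video's description of them, and the compose file name matches the one created earlier:

    sudo groupadd docker                 # only needed if the group doesn't already exist
    sudo usermod -aG docker laravel      # add the laravel user to the docker group
    # log out and back in so the new group membership takes effect, then:
    docker info                          # should now print daemon info instead of "permission denied"
    docker-compose -f docker-compose.prod.yml up -d --build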
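And a sketch of the throwaway containers used for the migration and seeding. The seeder class name (PostSeeder) is a guess based on the description in the video, so substitute whatever your seeder is actually called, and the -f flag is repeated here so the production compose file is used:

    docker-compose -f docker-compose.prod.yml ps                       # list the running containers
    docker-compose -f docker-compose.prod.yml run --rm artisan migrate
    docker-compose -f docker-compose.prod.yml run --rm artisan db:seed --class=PostSeeder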
Moving on, I'd like to talk about a second method of deployment, and it's the one that I prefer. Before we do that, we'll have to bring down our current production site and start over. SSH into the server, and we can run docker-compose down to bring down the container network. That's verified by going to the server's IP address in the browser, refreshing, and seeing that our application doesn't exist anymore: there's nothing responding on port 80, and if we list our containers with docker-compose ps, it returns an empty table. We'll also go ahead and remove the entire laravel-docker-test directory, including its contents, and recreate it as a blank directory.

Okay, I've exited the session and we're back on our local machine. This time I'm going to deploy with Git and GitHub. We'll initialize the root directory as a Git repo by running git init, then commit everything using add and commit -am with a simple message. Right now we don't have a remote to push that commit to, so let's set up a new repository on GitHub. I'll call it video-laravel-docker-deployment and add a short description. This is a public repository, so you'll be able to check it out if you'd like, but this method works for both public and private repos. Our next step in pushing out this repo is setting the origin, and then finally we can run the push. If we refresh the repo page on GitHub, we can see our site's entire source code here. You'll notice, though, that there's no vendor directory, and there's also no .env file. This is intentional, and part of the default .gitignore that Laravel comes with: PHP vendor dependencies are handled by Composer on each environment, and the .env file might contain API keys or passwords that shouldn't be committed regardless of a repo's visibility. That's fine, though; we can handle both of those from the server.

Speaking of the server, let's get this deployed. We'll SSH back into our server, navigate to the laravel-docker-test directory, and clone the entire repo into the current directory using the SSH method. Oh: "could not read from remote repository". Using this method, GitHub compares SSH keys from the machine making the request (our server) against the GitHub repository; one doesn't exist for our server, so we're denied read access. Adding one is a breeze. Let's head back to our repository and go to Settings, then Deploy keys. Any remote machine that needs access to our repo should have an SSH key added here. We'll title this one simply "remote server", and we'll need a key to paste into this box. Your server might already have a key pair set up in your user's home .ssh directory, but this one doesn't, so I'll create one now: ssh-keygen -t rsa takes care of that. I'll save it to the default location and leave the passphrase blank. Then I can use cat to display the public key, that's the id_rsa.pub file, and paste it into the appropriate text box on GitHub. Let's save that, go back to our server, and try the git clone again. It works; all of our files are now available in the site directory.

Remember that missing .env file from before? Let's take care of that now. I'll just copy and paste the one from our local development environment onto the server, making sure to change the MySQL password to the one we set in our production docker-compose file. That's looking good, so it's time to bring up our Docker container network again, using the -f flag to specify the production file and running up -d --build. Let's visit our site. Ah, that's right, we're missing our vendor directory. That's another problem easily solved with Docker Compose: just like artisan, composer is another container that we've included in our network for use with this application. So instead of running composer install the way we would if it were set up locally, we can use docker-compose run --rm composer install to utilize that container and have it do the work for us. We'll also run our artisan migrations and database seeding at this point.
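For reference, the deploy-key setup just described: generate a key pair on the server, print the public half, and paste it into the repository's Deploy keys page. The repository URL below is a placeholder:

    ssh-keygen -t rsa                    # accept the default path, leave the passphrase blank
    cat ~/.ssh/id_rsa.pub                # copy this into GitHub > Settings > Deploy keys
    # then, from inside the empty site directory on the server:
    git clone git@github.com:YOUR_USER/video-laravel-docker-deployment.git .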
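And the server-side commands for filling in what Git intentionally left out, again as an approximation of what's shown on screen (the -f flag and the seeder class name are assumptions, as before):

    docker-compose -f docker-compose.prod.yml up -d --build
    docker-compose -f docker-compose.prod.yml run --rm composer install
    docker-compose -f docker-compose.prod.yml run --rm artisan migrate
    docker-compose -f docker-compose.prod.yml run --rm artisan db:seed --class=PostSeeder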
Let's see if our application is working now. Perfect, everything has been deployed successfully and our app is working just as expected. But what if, like with our last deployment, we wanted to make a change to our app? Well, let's go back and revert what we did previously. After saving it, we'll commit these changes in Git and push that commit up to GitHub. Once that's done, we can go back into our server, into our site's directory, and run git pull. That's it: it pulls in the changes we've made, and they're instantly available on our app.

And that's all for now. You've learned how to deploy a Dockerized Laravel application to an external server and get it running using Docker Compose, and you've used rsync and GitHub to initially deploy a site and then make changes to that same codebase. As always, if you have any questions, please feel free to reach out to me in the comments or on my Twitter, which I've linked below. Huge thanks to my Patreon subscribers and everybody else who supports these videos. Thanks for watching.
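To recap the update workflow from this second method, the round trip looks roughly like this; the branch name and commit message are placeholders:

    # locally
    git add .
    git commit -m "Tweak the post view"
    git push origin master

    # on the server
    cd ~/laravel-docker-test
    git pull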
Info
Channel: Andrew Schmelyun
Views: 36,657
Rating: 4.9144893 out of 5
Keywords: docker, laravel, docker deployment, docker tutorials, web development, web dev, web dev tutorials, laravel tutorials, laravel deployment, laravel docker
Id: G5Nk4VykcUw
Length: 21min 21sec (1281 seconds)
Published: Wed May 20 2020