AWS Tutorial - AWS Database Migration Service (DMS) - Migrate data from MySQL to S3

Video Statistics and Information

Captions
Hello all, and welcome to this tutorial on AWS Database Migration Service, or DMS. Today we will discuss how to migrate data from a MySQL server to an S3 bucket.

Before we go further, a couple of things to keep in mind. First, not all AWS DMS replication instances are free, so if you try this tutorial yourself later, you can potentially incur charges for the DMS service and the replication instances. Second, if you do this tutorial on your own, make sure you clean up your resources afterwards to avoid any future surprise charges.

This tutorial requires a certain amount of prep, and I have already completed all the steps listed on this slide so that we can move straight into the actual database migration. The prep involves setting up both the source and the target data stores: creating a database and a table in MySQL and inserting a few records. Let us review the prep steps.

First, create an empty S3 bucket; I have named mine my-dms-bucket-for-csv in this tutorial, but you are free to use any other bucket name. Just ensure the bucket is empty, although even if you have something in it, it should not typically matter.

Second, launch a MySQL RDS instance in your default VPC or in a custom VPC. For this tutorial I launched mine in the default VPC, but you can launch it in a custom VPC as well. When launching your MySQL RDS instance, ensure that you use the free tier, the db.t2.micro instance type, and MySQL version 5.7.x. The MySQL version is very important, because DMS does not work with any version higher than 5.7.
It is typically compatible with versions 5.5, 5.6, and 5.7.

The next thing, after launching our MySQL RDS instance, is to launch an Amazon Linux EC2 instance in the same VPC, and preferably the same AZ as well. In the security group for this EC2 instance, ensure that inbound ports 22 and 3306 are open. Port 3306 is the port used by MySQL, and since this instance is going to be a client communicating with our MySQL server, that port must be open; port 22 is for us to SSH in to create the database and tables, insert records, and check how many records are there. That is just basic client-server interaction.

Next, we need to modify our RDS security group. The RDS server we launched either creates its own security group or uses an existing one; either way, you need to grant access to the EC2 instance's security group and to the VPC's default security group. You need the security group of the EC2 instance from step three because that instance is going to connect to our MySQL server as a client, and you need the default VPC security group because the replication instance we will launch is typically associated with it. If you create a separate security group for your replication instance, ensure that it can communicate with RDS on port 3306, just like this rule right here.

After the basic setup, SSH into your EC2 instance and switch to the root user (the command for that is sudo su), then install the MySQL client with sudo yum install mysql. Once the client is installed, create a database called library and a table named books, and insert a few records into the books table.
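The security-group and client-setup steps described above could be sketched with the AWS CLI as follows. The security group IDs here are placeholders, not values from the video, so treat this as an illustration of the rules rather than copy-paste commands.

```shell
# Open SSH (22) to the EC2 client, and MySQL (3306) on the RDS security
# group for both the EC2 client's group and the VPC default group.
# sg-EC2CLIENT / sg-RDS / sg-DEFAULT are placeholder IDs.
aws ec2 authorize-security-group-ingress \
    --group-id sg-EC2CLIENT --protocol tcp --port 22 --cidr 0.0.0.0/0
aws ec2 authorize-security-group-ingress \
    --group-id sg-RDS --protocol tcp --port 3306 --source-group sg-EC2CLIENT
aws ec2 authorize-security-group-ingress \
    --group-id sg-RDS --protocol tcp --port 3306 --source-group sg-DEFAULT

# On the EC2 instance itself, as root:
sudo yum install -y mysql
```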
What I have done is put all of these statements in a file called mysql.txt; it is right here on GitHub, at the URL shown. I will also post the URL in the description of this video, so you can download the file from GitHub and use it for your own tutorial.

Let us switch to mysql.txt and look at the file. As you see, the first line creates the database library, and you can use this command to create the database. The next thing we do is create the books table; this is its schema. As you see, it has four columns, and the primary key is the book ID. After creating the books table, insert these five records into it. As you see, we have five different books: MySQL, Oracle, SQL Server, DB2, and Aurora. It is a pretty simple table; I kept it simple, guys, so that you can focus on the replication rather than on the database, the table, and its contents. Again, I will have mysql.txt posted at this GitHub URL, so you can download it from there when you are practicing.

Those are the prep steps, so let us move on to the actual steps where we begin the data migration. Before we go there, I am going to switch to my AWS account and review a couple of things. The first thing I want to review is the S3 bucket. My bucket is right here; as you see it has the same name and it is empty, so the bucket is there and our target is present. The next thing is to ensure the source is present. This is my RDS MySQL instance, right here, mydms2018, and I believe the instance is available; we should be able to see the instance details shortly. The instance endpoint is especially important.
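Since mysql.txt itself is not shown line by line, here is a sketch of what it contains based on the description above: the database, a four-column books table keyed on the book ID, and the five rows. The column names beyond the book-ID primary key, and all values except the titles, are illustrative assumptions, not confirmed by the video.

```sql
-- Sketch of mysql.txt; column names other than the book-ID primary key
-- and all values except the titles are illustrative assumptions.
CREATE DATABASE library;
USE library;

CREATE TABLE books (
    book_id     INT NOT NULL,
    title       VARCHAR(100),
    published   INT,
    description VARCHAR(255),
    PRIMARY KEY (book_id)
);

INSERT INTO books VALUES (1, 'MySQL',      1995, 'Open-source relational database');
INSERT INTO books VALUES (2, 'Oracle',     1979, 'Commercial relational database');
INSERT INTO books VALUES (3, 'SQL Server', 1989, 'Microsoft relational database');
INSERT INTO books VALUES (4, 'DB2',        1983, 'IBM relational database');
INSERT INTO books VALUES (5, 'Aurora',     2014, 'AWS MySQL-compatible database');
```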
So here it is; scroll down. This is the engine, as you see it is 5.7, a db.t2.micro instance, it is available, and this is our endpoint, on port 3306. Again, my default security group and my EC2 security group are able to communicate with this RDS instance on port 3306; let us quickly check that as well, to ensure our setup is right and complete. The security group should come up shortly... and there it is. This is the inbound tab, and there you go: this is my EC2 security group and this is my default VPC security group, and they can communicate on port 3306. Perfect.

I have also copy-pasted my MySQL server endpoint right here; this is the endpoint we will use to connect from the client. And this is my DB name, username, and password; as you see they are all the same. That is just to ensure I do not forget them, guys; you do not want to spend time debugging only to find out your password was wrong. Coming back to our RDS instance, the rest of the default properties are all right here. So our server, our source, is up as well.

Finally, this is my EC2 instance... no, sorry, that was a security group; let me quickly switch to my EC2 instances. This should come up shortly. I have one EC2 instance, which is actually my client, where the MySQL client has already been installed; it is up and running right here, and this is the public IP, so you can copy it and PuTTY into it.

What I have done is connect to my EC2 instance right here, as you see. Let me clear everything; I will switch to the root account and then we will connect to the MySQL server. The command to connect is mysql, with -h followed by the full DNS name of the MySQL server.
Then -P (capital P) for the port and -u for the username, and it will prompt you for the password. So I am going to copy this command and paste it right here... and again my password is mydms2018... and there it is, I am connected.

Let us quickly see what databases are on the server: SHOW DATABASES. As you see, the library database is right there, so USE library, and the database has been changed. We can check what tables are in this database; as you see, the books table is present. Then SELECT * FROM books, and as you see we have our five records. We can see the schema of the table as well: the book ID, title, published, and description columns, and all of the records are there. In short, we are all set: our source server is up, our target S3 bucket is there, our MySQL client is connected, and we have a database and a table up and running.

Let me go back to the presentation... okay, there it is. So now let us begin with the actual steps for Database Migration Service. The first thing we need to do is create an IAM role that provides read/write access to S3 (remember, we are going to create a bunch of folders and a file in S3) and also access to our RDS. That is the first step, guys, so let us go ahead and do that. I am going to switch back to my AWS console and over to IAM. Here we will create a new role, and since this is for demo purposes and I just want to ensure the role has the right amount of access, I am simply going to give it full access to S3, and to RDS as well. So I am going to create a new role right here; it is going to be for DMS, so select DMS, then click Next: Permissions. I am going to keep this simple, guys; again, the focus here is not on security, the focus is on database migration. We are going to select S3.
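The client-side connection and sanity checks above look roughly like this; the endpoint host name is a placeholder, since the real one is only visible on screen in the video.

```shell
# Connect to the RDS MySQL endpoint (placeholder host; -p prompts for the password)
mysql -h mydms2018.xxxxxxxxxxxx.us-east-1.rds.amazonaws.com -P 3306 -u mydms2018 -p

# Then, at the mysql> prompt:
#   SHOW DATABASES;        -- library should be listed
#   USE library;
#   SHOW TABLES;           -- books should be listed
#   SELECT * FROM books;   -- should return the five inserted rows
```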
Basically, I am just going to use one of the available managed policies instead of creating an inline policy with a custom set of permissions. I will review, give it the name mysql-to-s3, and create the role. The role should be created in a few moments; sometimes it takes a while... there it is. Our role has been created, so let us review it.

We also need to provide access to RDS. As you see, the role has access to S3, because it is going to write there, and I am also going to attach one of the available managed policies for RDS, because DMS needs to connect to RDS and have permission to read the database and the table and select data out of it. I am just going to provide full access at this time. The policy has been attached, and as you see the role has two policies... oh, it looks like I selected the wrong one; I selected a Lex policy by mistake, sorry guys. Okay, there it is: now I have AmazonRDSFullAccess. Perfect. And this is the ARN for my role; I will copy it and stick it in my notepad somewhere, because we will be using this ARN when we set up the endpoints and the actual migration task. So our role has been created; that was the first step.

The next step is to go to AWS Database Migration Service and create the replication instance. Ensure that you create the instance in the smallest size, dms.t2.micro; note that even if you select the smallest size, you can still incur charges for the replication instance. Only a few instances are free, but you can still potentially incur charges. Let me switch now and go to Database Migration Service. The first thing we will do is create a replication instance, so click on Replication instances.
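The same IAM role could be created from the CLI; this is a sketch under the video's assumption that broad managed policies are acceptable for a demo (in production you would scope the permissions down).

```shell
# Trust policy letting DMS assume the role
cat > dms-trust.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Principal": { "Service": "dms.amazonaws.com" },
    "Action": "sts:AssumeRole"
  }]
}
EOF

aws iam create-role --role-name mysql-to-s3 \
    --assume-role-policy-document file://dms-trust.json
aws iam attach-role-policy --role-name mysql-to-s3 \
    --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess
aws iam attach-role-policy --role-name mysql-to-s3 \
    --policy-arn arn:aws:iam::aws:policy/AmazonRDSFullAccess

# Note the role ARN; the S3 target endpoint will need it.
aws iam get-role --role-name mysql-to-s3 --query Role.Arn --output text
```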
Here is the Replication instances page. As you see, only instances labeled free comply with the free DMS program guidelines, and you can visit that page for details. I have tried this several times and typically do not see a free instance, but feel free to explore it, and if you are able to get a free instance, that is perfect.

So I am going to go ahead and create the replication instance now. I will name it dms-replication-instance and use the same as the description, and ensure that dms.t2.micro is selected. It is going to be in my default VPC, it is not going to be Multi-AZ, and you can leave it publicly accessible; that is fine. Under Advanced, ensure that your allocated storage is around 8 GB. It is going to be in my default VPC, I will put it in us-east-1a, and the security group I am going to associate is the default security group; remember, we modified our RDS security group to ensure that the EC2 and default security groups can communicate over port 3306. Let us go ahead and create this replication instance. It generally takes a while to create; I am going to refresh, and as you see it is creating right now.

The next thing we need to do, after the replication instance is created, is to create a source endpoint for our MySQL database and test it, and after that create a target endpoint for our S3 bucket and test that endpoint as well. So I am going to switch back, and we are going to create a couple of endpoints now. We can create the endpoints, guys, but we will not be able to test them until the replication instance is available. It is still taking some time, so I am going to click on Endpoints and try to create an endpoint... yes, right there, it is coming up. It is going to be a source endpoint.
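As a CLI sketch, the replication instance created above corresponds to roughly:

```shell
aws dms create-replication-instance \
    --replication-instance-identifier dms-replication-instance \
    --replication-instance-class dms.t2.micro \
    --allocated-storage 8 \
    --availability-zone us-east-1a \
    --no-multi-az \
    --publicly-accessible

# Wait until the status is "available" before testing endpoints.
aws dms describe-replication-instances \
    --query 'ReplicationInstances[].ReplicationInstanceStatus'
```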
As you see, it gives you an option right here to select an RDS DB instance, so check this option. It automatically selects our RDS MySQL server, fills in an endpoint identifier for the source, sets the source engine to MySQL, and pulls in our server's DNS name. The port is 3306, which is the right port. We are not going to encrypt, so SSL stays off. Again my username is mydms2018; I will copy it over, and since everything is the same, the password too. Let us check whether there are any advanced options we need... I think we are good with the advanced options right here. For VPC, this is going to be my default VPC, and the replication instance is the one we just created. I believe the instance is up and running now, so it is giving us the option to run the test. Let us run this test and check if it is successful.

Okay, the replication instance is still not active; only once it is active will it allow me to test this endpoint. So I am going to give it a few minutes and pause my recording, and once the instance is up and allows me to test the endpoint, I will resume recording. ...The instance is up and running now; as you see, it is available right here. We should be able to test this now, so click on Run test. Testing the endpoint also generally takes a while; hopefully all our parameters are correct and the test is successful. If the test fails, we have to figure out which parameters are incorrect. Generally, guys, if you are trying this yourself and the endpoint connection test fails, just check the DNS name of your server, the port, the username, and the password; those are generally the four parameters you tend to goof up. Okay, our connection was tested successfully and the source has been created, so we are going to go ahead and click Save.
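The equivalent CLI sketch for the source endpoint and its connection test (the server name and the ARNs in angle brackets are placeholders):

```shell
aws dms create-endpoint \
    --endpoint-identifier mysql-source \
    --endpoint-type source \
    --engine-name mysql \
    --server-name mydms2018.xxxxxxxxxxxx.us-east-1.rds.amazonaws.com \
    --port 3306 \
    --username mydms2018 \
    --password mydms2018

# Test the endpoint through the replication instance
aws dms test-connection \
    --replication-instance-arn <replication-instance-arn> \
    --endpoint-arn <source-endpoint-arn>
```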
As you see, our source endpoint is now active. The next step for us, as you see over here, is to create the target endpoint for our S3 bucket and test that endpoint, so let us go ahead and do that. Click on Create endpoint; this one is a target. We are going to give it the name s3-target, and the target engine over here is going to be S3. We need the service access role ARN, so I am going to copy the ARN of our role right here and paste it in, and the bucket name we had was my-dms-bucket-for-csv. I am not going to give any folder there; in the advanced options, just leave everything blank at this time. The VPC is my default VPC, the instance is the replication instance we just created, and click on Run test.

It takes a couple of minutes to complete this endpoint test; hopefully it is successful. Again, if this part is not successful or you get any errors, definitely check your bucket name and check that the role has read/write access to S3. That was one of the main reasons I was getting a lot of errors when I tried this earlier, which is why I decided to just go ahead and give full access to the role. Okay, the connection was tested successfully, so our target endpoint is also active and connected.

After we have created our endpoints, the next step is to create the actual task to migrate the data. This task will migrate the data from MySQL, which is our source, to our S3 bucket, which is our target endpoint. After we have created the task, we need to run it, and if it runs successfully, it should create a CSV file in our S3 bucket under the folder library and a subfolder called books. So let us click on Tasks right here, create a new task, and give it the name mysql-to-s3. This is our replication instance right there.
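And the S3 target endpoint as a CLI sketch, using the role ARN noted earlier (placeholders in angle brackets):

```shell
aws dms create-endpoint \
    --endpoint-identifier s3-target \
    --endpoint-type target \
    --engine-name s3 \
    --s3-settings "ServiceAccessRoleArn=<role-arn>,BucketName=my-dms-bucket-for-csv"

aws dms test-connection \
    --replication-instance-arn <replication-instance-arn> \
    --endpoint-arn <target-endpoint-arn>
```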
This is the source endpoint; as you see it selected our MySQL source endpoint and the S3 target endpoint, and the migration type is "Migrate existing data". We check "Start task on create" because we want this task to start immediately. For the target, "Drop tables on target" is fine; anyway, this is S3, and you cannot do much on S3, since it is not actually a table but essentially a folder and a file. So the "Drop tables on target" option is just fine. We do not have any LOBs, so I am not going to include LOB columns.

Then, for our table mappings right here, we are going to use the guided version. The name of the schema, library, comes up on its own because the endpoint is connected; the name of the table is books, and the action is Include, so click Add selection rule. Basically, the task is going to look for the schema library and the table books, migrate all the data from that books table into a CSV, and put that CSV in our S3 bucket. So go ahead and click Create task.

After our task is created, remember that we checked the "Start task on create" option, so it should start immediately. Here you can see the overview, and here you can see the actual table statistics: how many records were read and unloaded into the CSV in our S3 bucket. Keep refreshing this; as you see right here, it is still creating right now. Once it is in a running state, we will be able to see the statistics... as you see, it is starting now. There is nothing in the table statistics yet; the moment it is in the running state, we should see something, namely how many records were moved. Remember, we have five records, and all five should be extracted and copied to a CSV file. It is still starting... okay, there it is. I think it has already run; the page just had not refreshed correctly.
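The guided selection rule built above boils down to a small table-mappings JSON document; here is a CLI sketch of the same task (the ARNs in angle brackets are placeholders):

```shell
# Selection rule: include library.books
cat > mappings.json <<'EOF'
{
  "rules": [{
    "rule-type": "selection",
    "rule-id": "1",
    "rule-name": "include-library-books",
    "object-locator": { "schema-name": "library", "table-name": "books" },
    "rule-action": "include"
  }]
}
EOF

aws dms create-replication-task \
    --replication-task-identifier mysql-to-s3 \
    --source-endpoint-arn <source-endpoint-arn> \
    --target-endpoint-arn <target-endpoint-arn> \
    --replication-instance-arn <replication-instance-arn> \
    --migration-type full-load \
    --table-mappings file://mappings.json
```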
As you see, for the library schema and the books table it shows completed; it has loaded five records in total. It was running and then went directly to Load complete. Awesome.

If we go back now to the S3 bucket, this is our bucket right here, and we should be able to see the library folder, the subfolder called books, and in that subfolder our CSV file. So click on our bucket right here: as you see, this is our library folder, under library is books, and under books is our CSV file. There it is. If you want, you can download this file and try to open it. Let me try; it has probably gone to my Downloads folder... there it is, my CSV right there. You can open it in Excel, and it should have all the records. As you see, all five records have been extracted successfully to our CSV file.

So that is pretty much it from me, guys. This was a quick tutorial on AWS Database Migration Service, and you can migrate from pretty much any database to any database. This one was MySQL to S3; you can even go the reverse route, from S3 to MySQL (I will have a separate video created for that), or from MySQL to SQL Server, Postgres, Aurora, or potentially any other database. Hopefully this tutorial was helpful, cleared your doubts, and laid a good foundation around Database Migration Service. The steps are the same; a few properties will change here and there depending on the type of database, but the task and the whole process of migrating data from one database to another is the same.

Okay guys, thank you very much. Please post your comments; if you liked my video, do like it, and please subscribe to my channel so you will be the first to know the moment I post new videos. Thank you and have a nice day. Bye bye.
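As a closing note, the S3-side verification shown at the end can also be done from the CLI. DMS full-load output is written under <schema>/<table>/ and the first output file is conventionally named LOAD00000001.csv, though the exact file name in the video is not shown:

```shell
# List the task output and stream the CSV to stdout for a quick look
aws s3 ls s3://my-dms-bucket-for-csv/library/books/
aws s3 cp s3://my-dms-bucket-for-csv/library/books/LOAD00000001.csv -
```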
Info
Channel: NamrataHShah
Views: 21,103
Keywords: MySQL, S3, DMS, Database Migration Services, AWS, Certification, AWS DMS, Migrate data from MySQL to S3, Associate, Professional, AWS Data Migration Service
Id: ncBG_fVCOQc
Length: 32min 13sec (1933 seconds)
Published: Thu Nov 01 2018