Best practices for data migrations to Microsoft 365

Captions
Ryan: All right, good morning, good afternoon, and welcome, everyone, to our event today. Today's event is Best Practices for Data Migration to Microsoft 365. We have a lot of really great content for you today, and we're thrilled to have all of you here. I'm joined by my colleague Joshua Vadish. Joshua is a project manager within the OneDrive and SharePoint customer engineering migration team; he is an expert in all things SharePoint, OneDrive, and everything we're going to be talking about today, and I'm excited for you to meet him in a few moments. The goal of today is to talk all about migrating large data sets to OneDrive and SharePoint. This is a Teams live event, and if you've not joined one before, it is similar to a normal Teams meeting but differs in a few ways. The primary thing to remember is that the main way you can interact with us today is through the Q&A tab, in the top right of your screen: just click on it and ask us any questions in that space. A few folks will be responding to those questions via text, and Joshua will also be responding to some verbally. Importantly, all of today's webinar will be recorded, so you'll be able to watch back the presentation in full, or any sections you might have missed. With that, let's kick things off. I'm going to bring Joshua onto the stage. Joshua, when you're ready, the floor is yours.

Joshua: Thank you, Ryan. I'm excited to be here; this is one of my favorite topics, so I echo what Ryan said: thank you, everyone, for coming. I have a lot to cover, so I apologize if I talk fast. A little introduction of myself: I came over to Microsoft in the acquisition of the Mover software, almost a year and a half ago.
I worked at Mover for seven to eight years, helping build the migration tool and leading their white glove managed service, so I was there for all the largest projects that came through Mover and led to that acquisition. Last week, in a similar webinar, we touched on how to choose the tool within Microsoft that's best for your migration; this is a continuation of that: the best practices for starting and running your migration project, to get all your data over to M365 in the smoothest and quickest way possible, keeping your security, keeping the metadata that can and needs to be kept, and of course keeping your folder structure and all that user-owned data, so your users can quickly and safely use the data in M365 in a similar manner to the source.

We start with some early best practices you can think about before you actually do any lifting and shifting of data. We'll talk about scanning for early planning, so you understand what you're up against; about effective communication during the migration to your end users and all stakeholders; and about how to set up and execute your migration for maximum speed and as little user impact as possible. Of course, you want your users safe and sound in the source while the migration is happening, using the data as expected.

So we start with scan and prepare, just so you know what you're up against, and that begins with scanning your environment. This gives you a heads-up on a few things you really need to start worrying about. There are tools from Microsoft that can help you scan your data, and Mover is certainly one of them: for free, if it's cloud to cloud, you can connect your cloud storage source and use Mover's free scanning tool. We suggest you scan all the users and their data, so you know exactly how many files each of them has, and you can start anticipating the file and folder issues that might be sticking points, prior to the migration.

From those results, resolve your items with long paths. In a scan, certainly using Mover, you'll see an output report showing how many files are owned by each user; you can scan every single user whether you know they have data or not, and it's important to understand these things. One of the biggest things it shows is the files that exceed Microsoft's long-path limitation of roughly 400 characters: the path to any particular file in OneDrive and SharePoint can only be so long, and unfortunately a lot of competing cloud storage providers allow you to exceed this. These files would fail if you attempted to upload them into M365 due to that limitation, so you want to deal with them beforehand. The scan shows not only which files exceed Microsoft's long-path limit, but also which users own them and the full path to each; furthermore, Mover's reporting specifically is able to tell you which parent folders to remediate to fix the most issues at once. We suggest leveraging these reports to clean up the files that exceed the limit prior to migration, so they don't fail. One suggestion here: communicate this to your users. Our customers often take one of two approaches. An administrator can do it on behalf of the users: with access to the source users' accounts, they take the report from the scan process and manually remediate based on the suggestions the report gives.
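A minimal sketch of that kind of long-path report, assuming the scan export is just a list of (user, path) rows. The 400-character figure is approximate, and the common-parent heuristic here is an illustration, not Mover's actual logic:

```python
from collections import Counter

# Illustrative limit; Microsoft documents the decoded path limit
# for OneDrive/SharePoint at roughly 400 characters.
PATH_LIMIT = 400

def over_limit(rows, limit=PATH_LIMIT):
    """Return the (user, path) rows whose full path exceeds the limit."""
    return [(user, path) for user, path in rows if len(path) > limit]

def suggest_parent(paths):
    """Suggest the parent folder shared by the most failing paths,
    so one rename or move fixes the largest number of files."""
    parents = Counter()
    for path in paths:
        parents[path.rsplit("/", 1)[0]] += 1
    return parents.most_common(1)[0][0] if parents else None
```

Pointing remediation at the parent folder shared by the most failing files is what lets one rename or move fix many files at once, which is the idea behind the report's suggestions.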
But we've seen better results when remediation is put into the hands of the users themselves. Because this report clearly shows which users own the affected files, and it's in CSV format, you can easily filter it and send out only the rows relevant to each user you're communicating with. We also say: give them a deadline to remediate. This is important because, without a deadline, when you're doing the velocity migration (actually lifting and shifting the data, or copying and pasting it, in the case of Microsoft's software), any files the users haven't fixed are going to fail. A deadline also gives you the ability to hold back the affected users' transfers while you run everyone who isn't affected by the long-path limitation: do all those other transfers, hold back those users, and tell them, for example, "by Friday at this time you must have all of these fixed; if you haven't, we're not going to be able to upload that data and you'll likely need to migrate it manually."

We suggest using Mover's scan reporting. If you're doing a cloud-to-cloud migration, partners in our space such as SkySync and Metalogix can do these migrations too, but Mover offers this free scanning tool on top of its free migration service, within Microsoft and owned by Microsoft. So we certainly say use Mover's reporting: it's free, it's easy to use, and it gives you the information on these long-path limitations.

This helps you avoid errors that increase migration time. The reality is that any software that receives a failure trying to download or upload data is going to retry, and every retry attempt adds time to your migration. If you can avoid these errors before you hit velocity, before you're actually migrating the data, you decrease the time the software spends retrying those files.

A good real-world example of this: Expedia was announced as a migration customer for Microsoft at Ignite 2020, and we have a public case study coming soon. Expedia was a great example because we were able to give them the list of users whose files were affected; they did exactly as we suggested, put it in the hands of the users with a deadline, and by the time of migration they had no files exceeding the long-path limitation. We did not see one failure due to this. Of course, any tool you use will surface these failures if you don't get to them in time, but as always: avoid errors if you can, and if you know about an issue and know there's a way to avoid it, why not do it?

Here's an example of Mover's reporting showing this. I'm not going to hover on it too long, and we can make it available to anyone who asks, but you can see it's broken out into which user owns the file, the path, and the common parent folder we suggest fixing to reduce that path length.

This is my favorite topic, and I think the one most people ask me about: data throughput. It really comes down to data distribution, how many files each user owns, and how much concurrency you can get in the transfer; we'll talk about that in more detail shortly. From the scanning, you learn how many files each user owns. This is probably the most important information you can get from the scan results, because you want to know who owns the most data in order to optimize for time. Most cloud storage providers limit their API per user, which means we're only able to request data from any particular user in your source so quickly,
but one user's rate limit in that source cloud storage provider doesn't affect any of your other users. This means you can stack all your user transfers on top of each other and get great concurrency across the data, but how fast any one transfer goes is limited by how much data its owner has. Here's a number you can hold on to: roughly one file per second per user can be downloaded and uploaded. We don't like to talk about file size (package size is important, and we'll get to it), but we don't see an impact on migration time when a file is, for example, five gigabytes versus one gigabyte. What really matters is the overhead of downloading and uploading the individual files, so the more files any given user has, the longer that transfer takes, at one file per second per user. So once you've learned who owns what, we suggest splitting up ownership of that data, either to users no longer in use in your source or to a service account. This gives you additional lanes to migrate the data, additional API calls to play with, because they're per user. If Joshua owns one million files, we don't want his transfer to take one million seconds: we'll take half of it and put it into a service account, so we have one million files split between two users that we can run in parallel, each with their own one-file-per-second-per-user limit, and both transfers are down to 500,000 seconds instead of one million. The scan tells you who owns the most. Microsoft's standard is that no user should own more than 400,000 items, purely for the sake of migration speed; this is not a hard limitation, just a best practice.
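The arithmetic behind that rule of thumb is easy to sketch. The one-file-per-second rate and the 400,000-item ceiling are the guideline figures from the talk, not hard API numbers:

```python
FILES_PER_SECOND_PER_USER = 1.0   # rule-of-thumb rate from the talk, not a hard API limit
MAX_ITEMS_PER_LANE = 400_000      # suggested per-user ceiling, for speed only

def transfer_seconds(item_count, rate=FILES_PER_SECOND_PER_USER):
    """Rough wall time for one account's transfer at ~1 file/sec/user."""
    return item_count / rate

def lanes_needed(item_count, ceiling=MAX_ITEMS_PER_LANE):
    """How many accounts (the user plus service accounts) the data should
    be split across so each parallel lane stays under the ceiling."""
    return -(-item_count // ceiling)  # ceiling division

# The talk's example: one user owning 1,000,000 files.
single = transfer_seconds(1_000_000)            # one lane: about 1,000,000 seconds
lanes = lanes_needed(1_000_000)                 # 3 lanes to keep each under 400,000 items
per_lane = transfer_seconds(1_000_000 / lanes)  # per-lane time once split
```

Splitting the million files in half, as in the two-account example, halves the wall time; the 400,000-item guideline just caps how long any single lane can run.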
Any user above that should have their data redistributed to keep each transfer below 400,000 items, and thus below 400,000 seconds. We talked about package size a little; this is going to be really important: package size should be roughly 200 to 250 megabytes, and users should own at least that much data to get optimum transfer speed. One more point to bring up: you get the highest throughput on evenings and weekends. You've split up your data, you're getting the additional lanes of concurrency, each user is getting one file per second, but the healthier your tenant is and the less activity on it, the faster it can upload; the same goes for the source and how much it can download. Your transfers will run during working hours without affecting the users, but you'll get the highest throughput on evenings and weekends, so keep that in mind. Mover specifically has technology to analyze your tenants during your migration to make sure you're getting the best throughput based on the health of your tenant; that happens automatically if you're using the Mover software.

Another best practice: remove unnecessary tasks. When you're managing a migration, you're basically creating transfer paths, and when you're doing cloud to cloud these are typically user to user, as you can see here. But you don't want to include users that don't own data. We talked about scanning and saw which users had the most data; what about the users that own none? If we included those users, knowing from the scan that they own nothing, they'd waste server slots that could be used on somebody who does own data, and bootstrap time: the software has to start the transfer and
run listings on both sides, see that there is no data, and then close the transfer, which takes time from somebody else's transfer. The server can only hold so many transfers; by default this is 10 using Mover, though that can be ramped up by asking support. You don't want to waste time and slots on somebody who doesn't own data. If you're using a cloud-to-cloud migration tool, most of them, and certainly all within Microsoft, can rewrite your permissions, so the data will still be re-shared to those users; they just have nothing to migrate, so there's no reason to create a transfer for them. Empty transfers also bloat migration reporting. Reporting is extremely important for communicating the migration to your users and stakeholders, and you don't want to confuse anybody with "successful" transfers that owned nothing; that could confuse users and stakeholders, and it makes the reporting harder to go through because of the unnecessary tasks in there. And of course those unnecessary tasks consume unnecessary API calls against the tenant that could be used on somebody else.

Ryan: Sorry, before planning and communication, do we have questions? I was just going to throw one at you, Joshua. This one is about investigating links within the files: does the scanning software go to the detail of investigating links within files, Excel for instance?

Joshua: Unfortunately, no. All the scan is doing is counting your files; it sees the names and sizes of the files, but not their contents, so it's not going to see your links. No tool we have can do that, as far as I'm aware. This is something we'd suggest the customer raise with their source cloud storage provider. None of our migration software will recreate any links for you; if we're talking about sharing links, you can request that from your source cloud storage provider. If we're talking about internal links, like a link within an Excel document, that of course carries over as part of the contents of the file; Mover does no modification to the actual contents of the file, and no migration tool I know of makes any modifications or changes to that. Hopefully that answers the question.

Ryan: I think it did. There was a second question, and then I think we'll move on. It combines what Eric shared last week with what you're sharing today: does the throughput guidance apply to Migration Manager? In other words, when Mover makes its way into Migration Manager, does everything you're saying still hold?

Joshua: Yes, absolutely. One of the best features of the integration happening right now, bringing Mover into M365 post-acquisition (which we didn't really talk about in this meeting), is that it's actually making scanning better. Scanning will be wrapped into your admin console with Migration Manager, and it's going to happen automatically: when you load up your Box access in the admin console, the scan runs on your tenant. It will be easier and more streamlined, give you the same information, and once the integration is done, better reporting can be added with the new technology.

So: planning and communication. I always say a migration is 90% planning and communication; once it's in the hands of the software, it's really easy to
just track and manage the project. We suggest building a project plan and communicating it to your stakeholders: determine who's migrating, plan the redistribution of your data (which I won't touch on much more, since I've already spoken about it), and decide how to map your users.

Use a project plan. This isn't too difficult; we're using Planner in this example, which is included in M365. Set up all your tasks, assign each task to an owner, and give it a deadline. Assigning an owner and giving a deadline really works for planning a project like this. Remember to check the plan, communicate the plan, and hold weekly status updates on the plan, but be agile: be willing to move things around if you have to. This is a great resource for us, and we can actually help you with the tasks we use internally for migrations; if people are interested, I can share this project plan with you and you can turn it into your own. Here's an example of a Gantt chart we used for the same customer, just another visual way of seeing the plan. We love being organized over here, and when we do our white glove service we definitely suggest using a project plan.

Now you need to communicate your plan, and this scares a lot of people. But understand: if you're using any of Microsoft's tools (Mover, Migration Manager, SPMT), the migration is not going to have an impact on your users in the source. So it's safe to communicate to your users and say: we're planning this migration, it's not going to affect you, here's what it's going to look like, here's our cutover period (an important part of the timeline to address), here's how long we think it's going to take, here's what the delta-pass weekend is going to look like, and then here's your onboarding to M365.

Of course, management needs to understand costs and benefits, and users need to understand how this is going to affect them: why are we migrating your data, where is it being migrated, what are the benefits, and how does it impact me, such as sharing links, or limitations on files that can't migrate, depending on your source. With a Google source, as an example, there are some proprietary file types that don't migrate; make sure the users understand those. Then of course the benefits: there are a ton of benefits to being in M365, as we all know, so spell out what they are and how users can expect to use them. And how disruptive is this change going to be? You want to talk about timeline here, because there will be a period where you have to do a cutover and delta pass. Typically this is just a weekend, and it's the only time we say users can't touch the data, because you want to get the new and modified data over before the cutover period, so they need to know about it. Certainly, when the world goes back to normal and everyone's back in the office: we've had clients put up posters, something on the fridge, saying "we're migrating on this day, get ready."

You also want to provide do's and don'ts. There are certain things your users should and shouldn't be doing during a migration, mainly major structural changes: don't move folders around, don't rename folders. The delta pass works on a differential comparison: you do that first pass of data, then eventually, on the weekend I spoke about when you do your cutover, you do the incremental delta. If users have made major structural changes in between, that causes duplication of data, so it really needs to be communicated to your users; I'll show you a template in a quick second. And communicate the final delta pass and cutover date, with those posters up around the office if it helps.
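A toy version of that differential comparison shows why renames cause duplication. The decision rule here (match on path, then compare timestamps) is an assumption about how delta passes behave generally, not Mover's actual algorithm:

```python
def delta_actions(source, destination):
    """Decide what an incremental (delta) pass would do per path.
    source and destination map path -> last-modified timestamp."""
    actions = {}
    for path, mtime in source.items():
        if path not in destination:
            actions[path] = "copy"      # new item, or a renamed/moved one
        elif mtime > destination[path]:
            actions[path] = "recopy"    # modified since the first pass
        else:
            actions[path] = "skip"      # unchanged
    return actions
```

Renaming a folder changes every path underneath it, so nothing matches the first pass and the whole subtree lands in the "copy" bucket again: that is exactly the duplication the do's and don'ts warn about.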
Then prepare your support staff. Make sure support is aware of the common questions that are going to be coming, and certainly Microsoft can help you with that. I touched on these a little: "where's my data?", common concerns about sharing that might come up during a migration, and the differences between the source and destination, so users know exactly, structurally, where their data is. To use an example: Google Drive has "Shared with me"; Box doesn't. Cloud storage providers are different, so prepare your support staff for those questions, and prepare your users so they don't have to ask them.

Here's a sample communication template, which we can share after the webinar. It gives examples, sets timeline expectations, provides the do's and don'ts, and provides information on unsupported files; it can be customized based on your source and destination. It also visually communicates the differences between the environments (as you can see here, a slight difference in how shared data lands between Box and OneDrive), and it gives you that guidance for post-migration support.

The next best practice is extremely important: mapping your users accurately. This matters for two reasons. The first is obvious: you need to know you're putting the data into the right place. If you're taking data from a user, say Ry Burke, you want to make sure you're putting it into Ry Burke's destination account, because ultimately the software will take the data from whatever path you tell it to and put it in whatever path you tell it to, so you want that to be accurate. But it's also important for re-sharing permissions. Mover is able to auto-pair as long as the names match exactly, but some of our partners' software may not be able to, and you may have users that aren't exact matches: they don't have a provisioned user yet, or they went through a name change. You need to fix the ones that don't have exact matches for the sake of permissions as well, because the software needs to know who to re-share the data with as it copies. If Ry Burke had access to someone else's data, we need to know the destination UPN to re-share that data. As you can see here, most of the users match, so we'll do the sharing and the mapping of data properly, but you'll need to tell us when they don't match and who the matching user is on the destination side; the Mover tool and Migration Manager will give you opportunities to do that and alert you to the users that don't match. This will also show you which users still need to be activated: the software is not able to access a user that hasn't been provisioned and doesn't have an active OneDrive, so this helps you determine which users you still need to activate in order to do that proper pairing. We do recommend provisioning your SharePoint Online sites and libraries and your OneDrive users prior to the migration, so you can do this mapping properly. And don't change your UPNs during the migration; you can see how important they are. If you're planning on changing your UPNs, wait until after, or do it before. There is management around what happens if you have to do it mid-migration, and we can walk you through it, but it complicates the project a little more.

Then decide on a destination folder name. This is important to us as well: when you're mapping your CSV, deciding where you're taking the data from and where you're putting it, we suggest migrating into a destination folder that you simply add to the end of the path on the destination side. This helps keep pre-existing data separated from migration data, which helps your communication.
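The auto-pairing and mismatch flagging can be sketched like this. The matching rule (compare the part of the UPN before the @, case-insensitively) and the account names are illustrative assumptions, not how Mover actually pairs users:

```python
def build_mapping(source_users, destination_upns, overrides=None):
    """Auto-pair source users to destination UPNs when the local part of
    the name matches exactly; everything else must come from a manual
    override, mirroring how a mapping CSV gets corrected by hand."""
    overrides = overrides or {}
    by_local = {upn.split("@")[0].lower(): upn for upn in destination_upns}
    mapped, unmatched = {}, []
    for user in source_users:
        local = user.split("@")[0].lower()
        if user in overrides:
            mapped[user] = overrides[user]
        elif local in by_local:
            mapped[user] = by_local[local]
        else:
            unmatched.append(user)  # name change, or not yet provisioned
    return mapped, unmatched
```

Anything in the unmatched list is exactly what the tools alert you about: a name change to resolve by hand, or a destination user still waiting to be provisioned.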
It also helps prevent those major structural changes from interfering with the delta pass: you can say in your communication, "don't touch the Migrated from Box folder" (or Migrated from Google, or whatever it may be) "until this date," so it doesn't mess with the delta pass. That's exactly what we're talking about here: migrating into a migration folder. The risk with pre-existing data is the merging of folders; that's really the major risk. The delta pass works by differential comparison, as I mentioned, so if you have two folders named the same thing in both source and destination, unique folders that hold different content, and you try to migrate to the same place in the structure, those folders are going to merge, and then files are compared on timestamp differences to determine whether they need to be migrated. So again, we say put it in a migration folder; it keeps things safe, and when the migration is done your users can easily take it out using the Move function in OneDrive and SharePoint.

Next, organizing for a OneDrive and SPO split. Often our customers choose some data to go into OneDrive, and the shared or departmental data to go into SharePoint Online: users' own data into OneDrive, department and shared data into SharePoint Online team site libraries. You have to plan for this sort of migration, because typically, when you're doing a root-level user transfer to OneDrive, it copies everything within the root, but users may have folders you want in SharePoint Online instead. Identify these through the reporting off your scans; Mover has individual reports that break down what data is owned by your users. Then get that data out of the path of the OneDrive transfer. Say the Sales folder is owned by Joshua right now, but we don't want it migrated to Joshua's OneDrive: we need to get that Sales folder out of the path of his OneDrive transfer. Again, this is about changing ownership to a service account; then you can map a transfer from that service account straight to the SharePoint Online library you want, copying only the folder or folders you know should go to that particular library. This keeps duplicates to a minimum, mitigating your risk of duplicate data, and it also helps you organize and track OneDrive transfers versus SharePoint Online transfers, so you really know when the group data is done versus the privately owned data.

Autodesk was a great example of this. We have a public case study; I apologize I couldn't get the link before this presentation, but we'll make sure to share it with you afterward. They were able to go through each user, determine what data they wanted in SharePoint Online team sites and what they wanted in OneDrive, and they did exactly this: they created a service account for each unique library, changed the ownership of the Box data from users to those service accounts, and then we simply mapped those service accounts in Box to their SharePoint Online libraries. It was very straightforward, and we were able to track them separately, so they could tell the users "your personal transfer is done," and tell the department admins "the department data is done." It took organization and labor from Autodesk to determine all that, but understanding your data is an extremely important part of the migration. Questions on that topic?
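The service-account pattern from the Autodesk example can be sketched as a routine that splits one user's top-level folders into OneDrive and SharePoint Online transfer rows. The "svc-" account naming and the row format here are invented for illustration, not any tool's real schema:

```python
def split_transfers(user, folders, spo_map):
    """Given one user's top-level folders and a map of folder -> SPO
    library, emit transfer rows: mapped folders go via a service account
    to their library, everything else stays in the user's OneDrive."""
    rows = []
    for folder in folders:
        if folder in spo_map:
            library = spo_map[folder]
            rows.append((f"svc-{library}", folder, f"spo:{library}"))
        else:
            rows.append((user, folder, f"onedrive:{user}"))
    return rows
```

Because OneDrive and SPO rows name different owning accounts, the two groups of transfers can be run and tracked separately, which is what let Autodesk report "personal done" and "department done" independently.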
was gonna throw three questions at you but ryan if yours was more tactical um i don't want to overwrite that i'll read i'll read the first and then you can go with the the other two so the first question that uh that i wanted to read out is um one person asks when migrating from a large file store what's the correct balance between migrating just all the files over versus having users decide like which files manually uh to move over is there a correct approach to find the right balance there like move everything or remove just some of the stuff yeah ultimately you want to think about throughput right so again that kind of comes down to ownership of data um or if we're talking about a file share how the data is distributed between servers um or between that file share so ultimately yes you if you can do cleanup beforehand you should do that um yeah this all comes down to throughput and how long your migration is going to take if you don't care do it all do it all at once and do it all and just let it run we're going to have per transfer line again it really depends on how your source is structured one file per second per trend as long as your file server is able to open the floodgates and allow a massive amount of download without limitation then you're not going to then it's not going to matter how many concurrent transfers you can run as long as you know that source isn't going to give back offs you can give you can load onto mover or any migrating software massive amount of transfers and data and do it all at once so it really is going to come down to the limitations of your particular uh file share cloud services are easy to predict you know and then we've predicted roughly one file per second per user a little different on a file share um harder to predict so we typically say you know keep each user or home drive or whatever it may be uh down to that similar number i gave before 400 000 items um so yeah i mean it's personal preference just understanding those 
Understanding the limitations matters too: Microsoft and Mover do not control how fast the data can be downloaded, or any limitations on downloading it, so it will certainly come down to that. But we very much say clean up your data beforehand, because that makes for a cleaner and quicker migration. So hopefully that answers the question. Yeah, I think it did, and that context of knowing your source and destination and what they can handle is really great advice. The other two are fairly tactical given what you showed. One person asks: is there a project plan template they can download to quick-start things in their own organization? Yep — we don't formally have anything here at Microsoft. Anything I've shown is just what I've used internally, what my team has used, but we're willing to share it. It's obviously not a formal, marketing-approved document, but of course we can work on that. Anything I have here that you've seen, I can provide links to, and what you've seen here we've used historically, even at Mover prior to the acquisition, for our project management. I need to make one correction: we didn't use Planner for the one I showed, we used Microsoft Project. And we do have common task lists that I can send that are relevant for most, if not all, migrations. Gotcha. So maybe, Eric Davis, who posted that — if you want to follow up with us for direct information, we can reach out and contact you, and we'll pass that along to Joshua. The next one, I think, gets back to the previous section, but it's important, so I just want to double-check: they were asking, when you move content that has characters in the file name that aren't allowed — and fortunately that list is a lot smaller than it was a couple of years ago — or there's something wrong with the file
name, or the path is too long, how does the tool recreate that? Yeah — there are some characters that Microsoft just doesn't accept. I can't speak to other tools, but I will speak to how Mover handles this: Mover will automatically strip those invalid characters from your file name during migration. It knows it won't be able to upload the file because of an invalid character — it knows Microsoft's API will reject it, and that would otherwise show up as a failure in your log. The developers of Mover knew the only remediation was manual removal of those characters, so they wrote it into the software to do it programmatically. What happens is that any character the software knows Microsoft would reject is stripped and replaced with an underscore, and you see that name change in your log — a quick line on that file saying the name changed, showing the new version. So that allows the software to upload the file, but it does technically modify the file's name, and that's why it's logged. Great. There have been a couple of other questions, but for the sake of time we'll save one or two for the end. Next section? Yeah — and we are near the end. We're at the meat of it now: running your migration. You've determined how much data your users own; that helped you determine who's migrating, how you had to split up data, and ways to increase throughput. At this point you have everything defined and planned: you know exactly which users are migrating, who they map to, and, for permission rewriting, who they map to at both source and destination. Now you can start submitting those transfers to run your migration. First and foremost, we suggest running a pilot; then we say notify Microsoft of your migration.
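The remediation just described can be sketched roughly as follows — the character list is based on SharePoint Online's documented file-name restrictions, and Mover's actual implementation may differ:

```python
import re

# Characters SharePoint Online rejects in file names (per Microsoft's
# documented restrictions). Mover's exact list may differ; this is a
# sketch of the same strip-and-replace remediation idea.
INVALID = r'["*:<>?/\\|]'

def sanitize(name: str) -> tuple:
    """Replace invalid characters with underscores; report if changed."""
    cleaned = re.sub(INVALID, "_", name)
    return cleaned, cleaned != name

new_name, changed = sanitize('Q4 "final" report: v2?.docx')
```

When `changed` is true, a real tool would write the old and new names to the migration log, which is exactly the "name changed to" line the transcript describes.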
I'm going to talk about migrating in a big-bang approach, about using reporting and rerunning your errors, and finally about what the cutover looks like. For running a pilot: we suggest this because you want your stakeholders and your leadership to know what the migration will look like, how it will impact users, how long it will take, and what the end-user experience will be. It will help you with training as well, when you do your Microsoft 365 training. We say choose users who are willing to cooperate, because you might need to wipe out that pilot, and you might need feedback from users on what the data looked like when it landed and whether the permissions are still there. Only include users who own data, because you'll just confuse users who don't. Don't include users who own a lot of data, because, as we talked about, at one file per second per user that will take too much time for a test. Do include users who are sharing data with each other, so you can see those collaborations being reset during the migration. Now, one confusion that comes up a lot: your pilot users do not need to cut over to Microsoft 365 early and separately from everyone else. This is a common misconception. Once the data is there, we can sit on it, and when the main migration happens it will just do a differential comparison — that delta pass — for those users; or you can wipe out the data and simply redo them. That takes time, but it's free using Mover, so there's really no risk in it. And run as many pilots as you need for management buy-in. If you're doing this yourself on Mover, as an example, you can do as many of these as you want — the tool is designed to let you pick and choose which transfers you're running. You can load them all up, tag your pilot users, filter by them, and really track them that way. So run as many pilots as you need. FastTrack, who uses
the Mover software — that is, the FastTrack service that will run your migration for you under certain conditions — will certainly allow you to run as many pilots as you need as well. We also recommend notifying Microsoft if your migration is over a hundred terabytes, and there is an official path for this: you yourself, the customer, or your accounts team if you're working with one, can submit a request — a form that I believe we share later in this presentation — alerting Microsoft so they can prepare your tenant for what's about to happen. This simply ensures the health of the tenant, so nothing prevents Mover from uploading as much data as possible. It's a short form asking when you're migrating, how much data you're migrating, and when you expect to be done. So we do recommend doing this if you have over 100 terabytes; it helps keep your destination tenant healthy and an optimal space to upload that data. Now, this is the scariest one for a lot of people: running your migration as a big bang. Often I have to convince customers that this is okay, but the biggest reason we suggest it is that it provides the highest concurrency for the quickest migration. The way most software I'm aware of is built — certainly Mover and Migration Manager — is that you can load up all your transfers at once, all of them. Say you have 10,000 users: you can load up all 10,000 transfers. Mover in particular will limit how many of those run at any given time, just to avoid hammering the source and destination servers, but as one finishes, the next takes its place, and they keep doing that while you're sleeping, or while you're working, and you can check in for status updates. Often customers come to us and say, hey, this pilot or this migration isn't going as fast as we expected, and we found
that they just weren't submitting enough jobs — we would see, oh, you weren't submitting data the last couple of days. You need to be constantly submitting data, so we recommend just throwing it all in there. The reason this scares a lot of customers is that they believe support will be hammered on the day of cutover if you're doing your entire organization. We found that not to be the case, from customer feedback — Autodesk certainly gave this feedback in their case study. They overstaffed that first week, and they actually laughed about it with us afterward; they were expecting more support questions. We found that people ask the colleague next to them, maybe on a different team — the water-cooler chat, everyone hanging out at the water cooler — that's where questions get asked, because we're more comfortable asking our friends and colleagues things like, where did my data end up? So a big bang gives you a quick migration: there are no breaks in between, no stopping to plan your next phase or schedule your next group. It also avoids sharing concerns. If you did your migration in groups — departments being the most common example, and you certainly can do it that way — what if your data is shared between those groups? You certainly don't want someone editing data that already had its delta and cut over a week prior in a different group, because there won't be another delta picking up those changes. So it's important to understand that if you have groups sharing with each other but you're migrating in separate groups, there will be concerns with those collaborations that you need to worry about. And, as I said, the biggest point is that concurrency is the key to throughput. Mover will constantly monitor the health of the migration to make sure you can run the maximum number of concurrent transfers, or you can request higher.
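The scheduling model described — load everything up front, cap how many run at once, and let each finished transfer make room for the next queued one — is essentially a bounded worker pool. A minimal sketch, with a placeholder standing in for the real per-user transfer:

```python
from concurrent.futures import ThreadPoolExecutor

# Sketch of the big-bang scheduling model: queue every transfer at once,
# cap concurrency, and as one finishes the next queued one starts.
def run_transfer(user: str) -> str:
    # Placeholder for a real per-user transfer job.
    return f"{user}: done"

users = [f"user{i}" for i in range(250)]
MAX_CONCURRENT = 100  # illustrative cap; tune to what the source tolerates

with ThreadPoolExecutor(max_workers=MAX_CONCURRENT) as pool:
    results = list(pool.map(run_transfer, users))
```

The point of the cap is the same one Mover enforces: the queue can hold every user in the organization, but only `MAX_CONCURRENT` jobs hit the source and destination servers at any moment.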
If you're going through FastTrack, you can request that from FastTrack; if you're self-serving on the Mover tool for free, you can request it from Mover support, so that you're running up to 100 transfers at once, which will give you a speedy migration. Use migration reporting: we very much say use the reporting you get from your software. We're proud of Mover's, and of course it will be wrapped into Migration Manager when that integration is done, because it gives you a full report of all the failed files when your migration finishes — you can see exactly why each file failed, what the file was, and who owned it, all in one CSV. We have other reports that show you the status of the migration: basically a color-coded report out of the UI you would see if you're self-serving, with information you can print out and share with your stakeholders and leadership. So we certainly say use the consolidated reports, though you can also get individual user reports that go right down to that granularity. Keep in mind, with Mover specifically — I apologize, I should have put this on the slide — those logs are only available for 90 days, for security purposes. Best practice: rerun your errors. At this point you've run your migration, you've done a big-bang approach, and you're using reporting to see which transfers have failed files; that's clearly shown in the status and with the color coding I mentioned, so a transfer with any failures will end up in error, or yellow. We recommend rerunning these consistently until you get a consistent number of failed files, because often we see back-off failures that can be remediated just by rerunning the transfer. Any time you rerun a transfer, it does a differential, incremental comparison and only migrates
what hasn't already been migrated, or what's been updated, based on timestamp. So if you see a yellow transfer, rerun it, because there's a good chance that failure in the log is a back-off that clears with a simple rerun. If the same file doesn't clear up, you know it's a permanent problem, not a transient error, and then you can really start digging into it and asking support what's going on with that file. Transient errors are something you want to avoid reporting to your stakeholders — you don't want to report something you could fix just by rerunning. And, as I said, rerun until you get a consistent number of failed files. Mover shows you how many failed files a transfer has during its run, so you can clearly see that number decrease or hold steady; that consistency is what we look for. Again, transient errors are bloat: you want to avoid migration-reporting bloat, because you need to share that report and download it from Microsoft's servers, and the larger it is, the harder all of that becomes. Another optimization built into the software: Mover will automatically rerun your transfers up to two times if they complete with an error. We thought about this and handle it for you — we know there may be transient errors, so the software simply kicks the transfer off again, and again it does that differential comparison. It does this up to two times, which helps you see that consistency, or spot a transient error. Now, the final delta and the cutover. This is the end of your migration: you've done your planning, you've done your communication, you've defined your migration with a CSV and uploaded it to the software that's actually doing the migration, you've done your big-bang approach, and you've done your first pass over all of the data.
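The rerun-until-consistent advice can be sketched as a small loop that keeps rerunning while the failed-file count is still shrinking, which is how transient back-offs get separated from permanent failures. The fake transfer below is purely illustrative:

```python
# Sketch of the rerun pattern: rerun a transfer until the failed-file
# count stops shrinking, separating transient back-offs (which clear on
# rerun) from permanent failures (which need investigation).
def rerun_until_stable(run_once, max_reruns: int = 2):
    """run_once() returns the number of failed files for one pass."""
    failures = run_once()
    for _ in range(max_reruns):
        if failures == 0:
            break
        again = run_once()
        if again >= failures:      # no improvement: treat as permanent
            return again
        failures = again
    return failures

# Fake transfer: transient errors clear over reruns, 2 permanent remain.
passes = iter([10, 4, 2, 2])
remaining = rerun_until_stable(lambda: next(passes))
```

Whatever count the loop settles on is what's worth escalating to support or reporting to stakeholders; everything above it was just back-off noise.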
This first pass, if you're doing a big-bang approach with Mover, is a copy-and-paste: it does not affect users still using Box, Google, Dropbox, Egnyte, any other cloud storage provider you're leaving, or your file share. But you will eventually need to do a delta to pick up new and modified files if users are working in the data while you migrate. So this is the point where you communicate to your users that they cannot use the data after a certain time — usually Friday evening, based on their time zone — and then you run, or FastTrack runs if you're using the FastTrack service, a delta pass to pick up those new and modified files. This is very quick: it's just a listing of the data, comparing timestamps and names between source and destination to determine what needs to be copied over. We recommend running a full delta prior to that weekend to gauge the timing of the final delta — ideally you schedule your final delta a little after the main migration is done, with some breathing room. Then you can determine whether it will take just overnight or, depending on the size of your migration, two or even three nights, so you know whether you'll need that Sunday or Monday morning to complete it. Do that full delta, figure out how long it takes, and then you can tell your users how long they'll be without access to their data over the weekend. You of course want to prevent users from editing or adding data during or after this delta, because it is the final one: any changes will not be copied over unless you do another delta. Communicate the day and its boundaries with notice — that one's obvious — and plan to use the entire weekend, even if the migration finishes Saturday morning or Saturday night, which is common; give yourself the entire weekend just to validate that everything went well.
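The delta pass just described — list source and destination, compare names and timestamps, and copy only what's missing or newer at the source — can be sketched like this:

```python
# Sketch of a delta pass: given listings of source and destination as
# {relative path -> last-modified timestamp (epoch seconds)}, return the
# paths that are missing at the destination or newer at the source.
def delta(source: dict, destination: dict) -> list:
    return sorted(
        path for path, mtime in source.items()
        if path not in destination or mtime > destination[path]
    )

src = {"a.docx": 100, "b.xlsx": 200, "new.pptx": 300}
dst = {"a.docx": 100, "b.xlsx": 150}   # b.xlsx is stale, new.pptx missing
to_copy = delta(src, dst)
```

This is also why the delta is so much faster than the first pass: it's mostly listing and comparing metadata, and the amount of actual copying is proportional to how much changed, not to the total data set.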
That's actually one of the benefits of running a delta prior to the weekend: you validate that it will go well, that you've handled all the errors you need to, and that you know what to expect. The expectation, then, is that first thing Monday morning your users are working in the data as expected and as planned. So that's the end of it — I apologize if it was a lot. I'll give you some resources here, and we'll also share them, but feel free to use this email address. Give it to your accounts team if you have questions on migrations, or if you need migration consulting and you're uncomfortable reaching out to me directly; they can reach out to me and we can get you migration expertise and advice as needed. That's what I'm here for. Joshua, there are two final questions that I think will be nice to wrap up on — and certainly all the resources and your availability through email are a wonderful offer — and then I've got one announcement based on the most common non-content question. But first, the content: is there any way to check the last-accessed time on a file, as opposed to modified or created? Similar to what you were saying about not bringing everything if you don't have to — maybe they're asking, can I filter by criteria like last modified or created? Yeah, unfortunately no, not built into anything we own. There's no filtering option — no filter by timestamp, no filter by file type. These are features we're thinking of adding in the future, once that integration is done. The other one is specific to Box, and it has a nuance around working with external users. I'll front-load with: certainly always understand the destination — if you want to share externally once files are in Microsoft 365, it of course supports that. But there's a nuance when they're migrating off of Box and they have files and folders that are shared with external users.
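On the last-accessed question: while nothing like this is built into the tools, on a file share you can build your own staleness report before migrating and use it to drive cleanup. A sketch using only the standard library — the one-year cutoff is an arbitrary example:

```python
import os
import time

# No built-in timestamp filter exists in the migration tools, but on a
# file share you can build your own pre-migration staleness report.
def stale_files(root: str, older_than_days: int = 365) -> list:
    """Paths under root not modified within the given window."""
    cutoff = time.time() - older_than_days * 86400
    stale = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) < cutoff:
                stale.append(path)
    return stale
```

Note this uses the modified time (`getmtime`); true last-accessed times (`getatime`) are often unreliable because many file systems disable access-time updates for performance.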
Is there a way to map that during the migration, and, in the context of setting up the destination with the right permissions, how would you think about things that have already been shared with external users? Yeah — I hate to say it, but this is one of the toughest topics in a migration: we don't have any internal way to really help with it. None of the tools Microsoft owns will re-share externally. Partly, of course, those users would need a Microsoft 365 account to share to, but it's also a security concern. The way most migration services handle permissions is simply by reading access on the source, pairing it to a destination user, and writing that same access on the destination. If Mover were to do this for external users, from a security perspective it would send out notifications to those external users that data has been shared with them — and you may not want that to happen; it might be old data, and people would regain access to it. So I hate to be that guy, but the only advice we really have on external sharing is: understand it in the source, then manually set it in the destination, either at the folder level pre-migration — on a parent folder or your SharePoint site — or on the data at the destination. If we're talking about Box, Box can get you a report of all folders that are shared externally and who those external users are, but unfortunately it's manual work from there. Whether you have your users re-share their external data post-migration as needed — this is a great time for cleanup of your data, from a security perspective — or write a script: we've had customers write a script on the destination side, once they understood which folders were shared with which externals, that would go through and re-share the data to them. So there's nothing programmatic and internal that I can think of that will do it for you, but Box certainly can help with that report.
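The scripted approach some customers took can be sketched in outline: parse the external-sharing report into a per-folder worklist, then drive the destination re-sharing from it. The column names below are assumptions about what a Box external-collaboration export might contain, not its actual format:

```python
import csv
import io

# Hypothetical external-sharing report; column names are assumptions
# for illustration, not Box's real export schema.
report_csv = """folder_path,collaborator_email,access_level
/Sales/Partners,pat@fabrikam.com,viewer
/Sales/Partners,lee@fabrikam.com,editor
/Legal/External,pat@fabrikam.com,viewer
"""

# Group collaborators by folder into a re-share worklist. Each entry is
# a manual (or scripted) re-share to perform on the destination side.
worklist = {}
for row in csv.DictReader(io.StringIO(report_csv)):
    worklist.setdefault(row["folder_path"], []).append(
        (row["collaborator_email"], row["access_level"]))
```

The actual re-sharing step is deliberately omitted here: as discussed above, it happens on the destination side, by hand or via the destination's own sharing APIs, under your own security review.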
The software itself won't transfer those external permissions, though. Yeah, I think what you said is important: it goes back to the planning and the execution. Once you know your source and destination, and that we don't change permissions, you can certainly set up the destination to adhere to how you want your files shared; then hopefully it's just A to B, and then activate. So, the last thing — Joshua, I'd love to get your final thoughts: what's the top thing somebody should think about before moving into Microsoft 365? But first, the one broader note: the question was asked a number of times whether this session is being recorded and whether the previous session was recorded. Absolutely, they were, and we're going to make them really easy to find. On the original blog post that Ankita published we'll have both links, or if you go to the original link for each session, we'll make sure the video shows up there as well, so you can access it on demand. But final thought from you, Joshua, and then we'll let Ryan take us out. Yeah — I would say the most important thing is understanding the differences between source and destination: how you use the source, how you're planning to use Microsoft 365, and whether those match. You'll want to think about integrations with third-party apps — will those third-party apps work in Microsoft 365? Will users have access to the same data they had access to before? Will they be able to use the data in the same way? And are we getting features we didn't have before that we want? So I would say understanding those major differences and setting your users up for them is one of the most important things. The differences between the environments — how the data is exposed, how the data is shared, and how users get to that data — should really be understood, because from there, once
you've done your planning and you're actually executing the migration, that's the easy part. The hard part is the user management — all that change management you have to do that doesn't even involve the software. So I would say that's the most important part: understanding how your users use the data, the limitations of source and destination, and the differences between the environments, and then, of course, the change management and communication with your users around all of it. From there, it's just loading jobs and doing status updates. Awesome. Joshua, thank you so much for all this great information, and thank you all for attending today. We appreciate your questions and your engagement — it's been really great hearing about your questions, concerns, and the things you'd like to learn more about as far as OneDrive, SharePoint, and data migration go. So with that, thank you all for joining. We look forward to seeing you at a future webinar, and wherever you are in the world, take care and have an excellent day. Thanks so much.
Info
Channel: Microsoft 365 Community
Views: 1,830
Keywords: SharePoint, community, Microsoft, Teams, OneDrive, Migrations, SharePoint Migration Tool, Mover, Migration Manager
Id: dEPtoWBAYKk
Length: 55min 54sec (3354 seconds)
Published: Wed Apr 14 2021