ZFS Essentials: Auto-Converting Folders to Datasets on Unraid

Captions
Hi there guys, and welcome back to another video. This video is going to be looking at ZFS on Unraid. In the last video we converted the cache pool, formatting it with ZFS and creating our first zpool, so now that we've got our first zpool, it's time to have a look at what we can do and how we can use it.

First, let's have a look at what's on this zpool. I've got the appdata share, the domains share and the system share. Now, when Unraid creates a share on a zpool, it doesn't create it as a regular directory; it creates it as a ZFS dataset. You may be wondering what the difference is. A regular directory on your computer is the basic version of a storage option: it holds files and folders without any frills, and it doesn't do anything else. A ZFS dataset is like the premium upgrade. In addition to storing files and directories, it comes packed with extra features: it can recall previous versions of its contents (called snapshots), it offers encryption for advanced security, and it supports compression to make better use of your storage space. Basically, it's a more powerful and flexible way to manage data, just as a premium option often provides more features than the basic one.

Now, how do we know whether a folder is a regular folder or a dataset? That's a good question. One way is to open up a terminal and type zfs list, which lists all of the datasets on the server. Here I can see the name of my pool, cache, and my three datasets: appdata, domains and system. But if I minimize that for a moment and look inside my appdata, I've got all of these folders storing app data for the individual containers running on this server, and if I bring up the terminal again, none of those folders inside appdata are listed as datasets. Keep that in mind.
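For reference, this is the command used in the video; the output below is an illustrative mock-up assuming the pool and shares from this walkthrough (your sizes and layout will differ):

```bash
# List every ZFS dataset on the server
zfs list
# Example output (sizes are made up for illustration):
# NAME             USED  AVAIL  REFER  MOUNTPOINT
# cache            100G   350G    96K  /mnt/cache
# cache/appdata     20G   350G    20G  /mnt/cache/appdata
# cache/domains     60G   350G    60G  /mnt/cache/domains
# cache/system      20G   350G    20G  /mnt/cache/system
```

Anything that does not appear in this list, such as the individual container folders inside appdata, is just a regular directory.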
Now I'll show you another way to look at datasets and folders. I'm going to go across to the Apps tab and install a plugin called ZFS Master, which gives us additional functionality when using ZFS on Unraid. While it's installing, I want to give a big thank you to the author, IkerSaint, for making this awesome plugin. And while I'm here in Community Applications, I'm also going to install the User Scripts plugin, as we're going to need that later on; a big thanks to its author, Squid, for making that super useful plugin.

With that done, back on the Main tab, under ZFS Master the plugin lists all of the zpools we have, and clicking Show Datasets reveals my three datasets: appdata, domains and system. One other thing this plugin can do, under Actions, is take a snapshot. For those of you who aren't sure what a snapshot is: when you take a snapshot of a dataset, it freezes the contents of that dataset at a moment in time, so if you ever want to roll back, you can easily roll back to that snapshot. As the word implies, a snapshot is like taking a photograph, except instead of just looking at it, you can actually go back in time to when it was taken. There are other things we can do with snapshots too, such as cloning a dataset from a snapshot, but we'll look at that in a future video.

So I'm going to click the button to take a snapshot, and there's an option to recursively create snapshots of all the descendant datasets, which I'll tick before clicking Snapshot. With the snapshot created, the dataset is highlighted blue, and on the right-hand side there's a camera icon showing how many snapshots we've got; at the moment I have one. If I go to Actions and then Snapshots Admin, I can see that single snapshot. Now, I did ask it to snapshot all of the descendant (child) datasets, but remember, everything inside appdata is a regular folder, not a dataset, so there was nothing else to snapshot.

Let's go to the Dashboard for a moment. Under Docker containers I've got my Emby server running, and if I click Web UI it's all working nicely. But now imagine I go into my Shares tab, into appdata, into the Emby folder, and carelessly delete a load of stuff, which is pretty much going to break Emby. Sure enough, when I go back to the Dashboard, restart Emby and open the Web UI, it's asking me to set up a brand-new server, because I deleted the app data and it's been recreated with the defaults. So let's close that, stop the container, and go back to the Main tab, then Actions, then Snapshots Admin, where I can roll back the snapshot. One big caveat at the moment: rolling back this snapshot will roll back all of the containers to how they were at that moment in time, because the snapshot covers the whole of the appdata share. Anyway, let's roll it back; we can see it's successful. Now I can go back to the Dashboard, start the Emby server, open the Web UI, and we're back to exactly how we were before: my Emby server is set up correctly and there's my user that I can sign in with. Great.
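Under the hood, the plugin is doing the equivalent of standard ZFS commands. A minimal sketch of the same snapshot-and-rollback cycle from the terminal, assuming the pool and dataset names from this video and a made-up snapshot name:

```bash
# Snapshot appdata; -r also snapshots any descendant datasets
zfs snapshot -r cache/appdata@before-delete

# Confirm the snapshot exists
zfs list -t snapshot

# Roll the whole dataset back to that point in time
# (if snapshots newer than this one exist, zfs rollback -r will
#  destroy them on the way back, so check the list first)
zfs rollback cache/appdata@before-delete
```

This also shows why the caveat above exists: a rollback applies to the entire dataset, so snapshotting all of appdata means rolling back every container's data at once.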
Okay, so now we're finally getting to the purpose of this video. What we really need is the ability to take individual snapshots for each container, and that means converting all of the folders inside appdata into datasets. Before we do that, though, let's go back to the Main tab and look at some more functions of the ZFS Master plugin. We can create more datasets here: if I click Create, the first part is the name of the pool, which for me is cache, so if I start typing I can select appdata and then make a new dataset; let's call it test. Here I can also set compression (lz4, for instance), set a quota if I wanted to, and even encrypt the dataset. If you remember, at the beginning of the video I described datasets as the premium option; well, this is where you'd set all of those special features. You don't have to set any of them, and if you just want a regular folder you don't need to create a dataset at all. Anyway, I'm creating a dataset, so with that done I click Create, it says the dataset was successfully created, I click OK, and now the child dataset is listed under the main dataset.

But if we browse inside appdata, we see all of the folders plus the dataset called test, and there's no way to tell the difference just by looking. I've got a little script that will look at a dataset and tell you which entries are folders and which are datasets. To install it, head across to my GitHub, click Repositories, then 'list datasets or folders', click the script, then click Raw. I'm on Windows, so Ctrl+A and Ctrl+C copies it all. We installed the User Scripts plugin earlier, so go to Settings, then User Scripts, and add a new script; let's call it 'folder or dataset'. Hover over the cog, go to Edit Script, delete the default #!/bin/bash line and paste the script in. Scrolling up to the top, the line that says source dataset needs the name of the pool and the dataset name; for me the pool is cache and the dataset I want to look at is appdata. We can double-check that's correct by opening the Main tab in a new tab: scrolling down to ZFS Master, the pool name is cache and the dataset name is appdata. With that done, I click Save Changes and then run the script, and it tells us the datasets (there's the test one we created) and the regular folders. This script can be quite useful when you're not sure which entries inside a share are regular folders and which are datasets.
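If you just want the general idea rather than the script itself, the check boils down to asking zfs about each top-level directory. A minimal sketch of that approach (this is not the script from the video, just an illustration using the pool and share names from this walkthrough):

```bash
#!/bin/bash
# For each top-level entry in appdata, ask ZFS whether a dataset
# with that name exists; if not, it is a regular folder.
for dir in /mnt/cache/appdata/*/; do
  name=$(basename "$dir")
  if zfs list "cache/appdata/$name" &>/dev/null; then
    echo "dataset: $name"
  else
    echo "folder:  $name"
  fi
done
```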
Okay, so now it's time to convert all of these folders into datasets. Doing this manually would take forever, so again I'm going to use a script. Head across to my GitHub once more and you'll see a script called unraid zfs auto dataset update. Click on it and you'll find a detailed readme all about the script. Please do take the time to read through it; this script manipulates data on your server, so it's important to know exactly what it does. Basically, the script looks at the top-level directories inside a dataset, checks whether each one is a regular folder or a dataset, and converts any regular folders into ZFS datasets.

Inside the script are various variables. It can be set to run as a dry run, which means the script runs in a test scenario and shows you what it would do, so you can be sure you want to run it for real. Then there's the source pool, which for me is cache, and the source dataset, which for me is appdata. The script will stop all running containers before the conversion happens, because we obviously don't want data being manipulated while a container is using it. Underneath that there's another variable, containers to keep running, where we can list, in quotes and separated by spaces, all of the containers we don't want the script to shut down. The first time you run the script you'd probably leave this empty, but after one run all of your appdata folders will be datasets, while any container you install later will go into a plain folder again. So it's good to have this script run every night to convert any newly installed containers, and if you list all of the already-converted containers here, it doesn't need to shut them down; that way, if you're watching a movie when it runs, it won't stop your Plex or Emby container. There's a similar pair of settings for VMs, in case you want to run this on your domains share: set whether to stop VMs, and, on exactly the same principle, list the VMs you want to keep running. Then there's cleanup=yes, which is the normal setting; if you set it to no, the original source files are left alongside the new datasets and you have to clean up manually afterwards. Finally, replace spaces=no: some folder names contain spaces, and you can have those spaces replaced with underscores, because some software doesn't like dataset names with spaces in them, although ZFS itself is absolutely fine with them.

So those are the variables we're going to set. The working principle of the script is this: it first stops any running services that need to be stopped; it renames each original directory (the Emby folder, for example, gets a temporary name); it creates a dataset with the original name, so for Emby a dataset simply named after the folder; it copies all of the data from the renamed folder into the new dataset; if cleanup is set to yes, it cleans up those temporary files; and finally it restarts the services it stopped. Job done.

Okay, let's set this up. Click the script's name, go to Raw again, select everything and copy it to the clipboard. Back on the server, add a new script, which I'll call 'auto convert to dataset', get rid of the default #!/bin/bash line and paste the script in. Scrolling up to the top: the first thing I'm going to do is run this as a dry run, so make sure the first time you run it, dry run is set to yes. The source pool name for me is cache, the dataset whose folders I want converted is appdata, yes I want it to stop any running containers, I don't need it to stop any VMs, and cleanup I'm going to have as yes. That's all the variables, so now I click Save Changes. For the first, dry run pass I'm going to run the script in the foreground, and the output shows everything that would have happened if I'd let the script run normally; this all looks great to me, so I click Done. Now I'll edit the script, set dry run to no, and this time run it in the background. The reason is that if you've got a lot of app data it can take a long time, and if you run it in the background it doesn't matter if you leave the page or shut down your browser; it will carry on running. So let's click Run In Background, then Done, and we can watch what's happening by clicking the log: it stopped the Emby server, which was the only container running, and now it's going through converting all of the folders into datasets. I'm going to pause the video here and come back when it's done.
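As a rough illustration of that rename, create, copy, cleanup cycle, here is a minimal manual equivalent for a single folder. This is not the script itself: the emby name and the _temp suffix are assumptions for the example, and the real script additionally stops containers, checks for errors, and restarts services:

```bash
# Illustrative manual conversion of ONE appdata folder to a dataset
mv /mnt/cache/appdata/emby /mnt/cache/appdata/emby_temp       # rename the original folder out of the way
zfs create cache/appdata/emby                                 # create a dataset with the original name
rsync -a /mnt/cache/appdata/emby_temp/ /mnt/cache/appdata/emby/   # copy the data into the new dataset
rm -rf /mnt/cache/appdata/emby_temp                           # cleanup step (the script's cleanup=yes)
```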
Okay, so the script is now finished. We can see in the log that it went through and converted all of these folders into datasets, and for each one it tells you the size of what it converted. We can see it restarted the Emby server, and at the bottom it tells you which folders were successfully converted to datasets. With the script finished, I'm going to run the other script again just to have a look inside, and now it shows that the contents are all datasets and there are no folders present. Perfect. Now that we've converted everything, let's have a look inside ZFS Master: you can see we've got all of these child datasets, so this time, if I take a snapshot and recursively create snapshots for all of the descendant datasets, each of these individual datasets now gets its own snapshot, and the parent dataset has two because of the one we took earlier.

Now, something to notice with that test dataset I created earlier: if you try to delete a dataset with the normal file manager, it actually doesn't work; it's still there. The easiest way to get rid of a dataset using a GUI is the ZFS Master plugin, but looking under Actions we can only see Take Snapshot. That's because we need to enable destructive mode: go to Settings, scroll down to ZFS Master, and set destructive mode to yes. With that done, back on the Main tab, scroll down to the test dataset, click Actions, then Destroy; I'm going to check 'force' and 'recursively destroy all children and dependents', and that deletes the dataset. Now I'll quickly create another test dataset, called test2, and show you how to delete it from the command line. Open a terminal and type zfs destroy, with -r for recursive (so any snapshots in it are deleted as well) and -f for force, then a space and the name of the pool and dataset, which for me is cache/appdata/test2. Okay, that dataset is now destroyed. Obviously, on the command line it's much easier to make a mistake: if I hadn't been careful and had left off the test2 at the end, I could have deleted the whole of my appdata, so always be really careful when using the command line.

Next, let's install another container; it doesn't matter which one, so we'll just pick the UniFi Controller. With that installed, if we go back to the Main tab, notice that we can't see the UniFi Controller listed as a dataset. That's as expected; if we go to Settings, User Scripts and run the folder check quickly, we see all the original datasets we've already converted, plus a folder for the new container. By default, when Unraid installs a container it creates its app data as a folder, even if that folder is inside a dataset; maybe that will change in the future, but for now that's how it is. So I'll click Done and edit my conversion script, and this time I'm going to fill in the names of the Docker containers I want to make sure don't get stopped. For me the main one is the Emby server, so I copy its name, go back to the script, and where it says keep containers running I paste in Emby server, because I don't want that one to stop. Here you'd put in all of your containers that have already been converted; for the sake of this video I'm only adding the one. I click Save Changes and again run it in the background. It finished really quickly this time: the first thing it did was stop the UniFi Controller, it skipped everything that was already a dataset, and then it converted the UniFi appdata folder into a dataset.

Okay, so the next thing to do, in my opinion, is to set this to run every night on a schedule. I'm going to choose Custom, which takes a cron schedule, and to help me generate the correct format I'm going to use crontab guru. The default it shows means it runs at 04:05: the first field is the minute, then the hour, the day of the month, the month, and the day of the week. I want mine to run at 2 a.m., so I put a zero in the first field and a 2 in the second, and I can see that will run every night at 2 a.m., which is perfect for me. I copy that, paste it into the cron schedule on the server, and click Apply. So now, every night, this script will run and convert any new containers into datasets, and then I can easily snapshot them and roll things back individually should I need to.
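For reference, a cron schedule has five fields in this order. A minimal example showing the 2 a.m. schedule used in this video:

```bash
# minute  hour  day-of-month  month  day-of-week
# 0       2     *             *      *
# i.e. run every night at 02:00
0 2 * * *
```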
Okay, so that really brings us to the end of this video. In the next video I'm going to show you some scripts that will automatically snapshot anything we want, keeping snapshots on a retention policy, for example seven daily snapshots, four weekly snapshots and three monthly snapshots. We'll also see how to send those snapshots, replicating a dataset to another zpool, which can be on the same server or another server, wherever you need it to go. I find ZFS replication super useful because it allows us to back up data while it's live and running: all of our app data can be backed up while the containers are using it, no need to shut anything down, and the same goes for VMs. But like I said, that's the next video. If you enjoyed this video, please give it a thumbs up, hit that like button, and share it with anyone you think might find it useful. As always, a huge thank you to all of my patrons and supporters out there; without you guys I really wouldn't be able to make these videos. Anyway, it's time for me to go now, but whatever you're up to for the rest of the day, I hope it's good, and I'll catch you in the next video.
Info
Channel: Spaceinvader One
Views: 27,314
Id: pLFaDnTVpuM
Length: 23min 11sec (1391 seconds)
Published: Sat Jul 01 2023