How I Automated a Supply Chain with Machine Learning, AWS, and Python

Video Statistics and Information

Captions
Hello everyone, and welcome back to the channel. Today I want to go over how I automated a supply chain using Amazon Web Services, machine learning, and a Python script. I want this to be a brief overview of how I did it, not necessarily a tutorial, so if you have questions about what I did specifically, please let me know in the comments below. If this video gets enough traction, I'll be sure to make a full tutorial on how I did everything, so if you're interested in that, please hit the like button.

There are four main steps that I followed to implement a supply chain machine learning algorithm. The first is the data: everything you do in machine learning comes from the data. If you don't have a data collection system, there's no way you'll ever be able to use that data to make predictions, so the first requirement is that you have a way to collect data in your working environment. The second step is data cleaning, or feature engineering as machine learning engineers call it. This means separating the good data from the bad data: you have to understand your system well enough to use only the good data when training your model. The third step is picking the actual machine learning method you want to use. There are a ton of different algorithms out there, and you have to find one that fits your problem. The fourth step is visualization of your results: you need to be able to look through the output of your machine learning algorithm, find the results that are important to you, and act on them very quickly. I'll try to cover all these steps briefly in this video; if you want a full tutorial, please let me know down below.

The first thing we need to look at is our data collection. For me, the data I had working in the supply chain was the consumption of a raw material, given by the part numbers up here, for each month, denoted by the month in the index column. So, for example, part number 136 in the month of July 2011 consumed 41,360. Essentially I have a time series for each of these part numbers, and I have this data for multiple years and a lot of part numbers. Zooming out, we can see just how big this data set is. The way we got this data was through SAP: we just exported the monthly raw material consumption for each part number and put it into this spreadsheet.

Step number two, the data cleaning and feature engineering: since we're working in a supply chain, if there's any demand, we need to be able to cover it. We don't want to fall below any amount of our predicted demand in the future; we want complete customer availability and to be ready to make whatever product our customer wants. So even though some of these columns are very sparsely used, we still need to keep them in our set. Although a machine learning algorithm may not be too accurate in predicting such a sparse time series, we still need to give it a chance.
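As a rough illustration of those first two steps, here is a minimal sketch of how the exported spreadsheet might be loaded and reshaped in Python. The file name and column layout are assumptions, not the exact ones from the video:

```python
# A minimal sketch of the data-prep step, assuming the monthly SAP export
# is an Excel sheet with a "Month" index column and one column per part
# number (file name and column names are hypothetical).
import pandas as pd

df = pd.read_excel("raw_material_consumption.xlsx",
                   index_col="Month", parse_dates=True)

# Keep every part number, even the sparsely used ones -- for full
# customer availability we don't drop any columns, just fill gaps.
series = {part: df[part].fillna(0) for part in df.columns}

# One monthly time series per part number, e.g. part 136 in July 2011.
print(series[df.columns[0]].head())
```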
For the third step, picking the algorithm, I picked the DeepAR forecasting algorithm through AWS. I'll try to very briefly explain why using a graphic on the screen, but please know that there's a lot more that goes into this selection. Compared to other machine learning algorithms, DeepAR from AWS actually takes into account every time series in the set. So instead of using an algorithm like ARIMA, which takes in one time series and predicts off it alone, DeepAR takes in every time series in the set and makes one big model to predict the outcomes of each of them: you're using one singular model.

For the visualization of the data, I used a Jupyter notebook. In a Jupyter notebook you're able to execute Python code and collaborate with others, which is exactly what I needed for this project; that's why I picked it. There are other options you can pick for this as well; Google Colab is one you might consider too.

So now let's get into the specifics: how can we do it? The first thing we need to do is load our Excel data into an Amazon S3 bucket. Amazon S3 is just a simple storage service, which means you put files in it and can then access those files from other Amazon services. So I created a bucket and placed the Excel sheet into it. Next we need to use Amazon's machine learning platform, which is called SageMaker. In SageMaker I created a notebook that will hold my Python code. These notebooks are just Jupyter notebooks, so we can collaborate with other users in them. The way our data will flow is that we take the data from the S3 bucket we just filled, feed it into SageMaker, and then take the output data from SageMaker and feed that back into an Excel sheet placed in that same S3 bucket. Once that Excel sheet is in an S3 bucket, we're able to download it and use it in our own Python script to manipulate the values and draw graphs.

Opening up the notebook, we'll look at the code a little bit, but if you want a full tutorial on this, be sure to like the video. Essentially we import all the packages we want to use, and then we import our data: we say which bucket it's in, specify where the file sits within the bucket, and load it in. We also designate an output path, which says where we want the output of our run to go.

Next we had to set some hyperparameters. Here I've set the frequency to monthly using the character "M" (I'll talk about this at the end of the video, but there are some improvements to be made if you're trying to do this yourself). I have a prediction length of five, which says I want to predict five months into the future, and a context length of twelve. The context length is just how many periods you want to look back, so I'm saying I want to look back twelve months into the past. However, these algorithms automatically take seasonality into account, so twelve months is more than enough for this application.

Once we load the data in, we just need a simple check to make sure it's been imported correctly; here I've done that by plotting a graph. Next we can create the time series. There are a few more complex operations happening in here, but we won't go through them in depth; all we're doing is formatting our Excel data into a format that SageMaker can read. Once we have the data in that form, we need to specify our estimator. An estimator is just the type of machine learning algorithm you want to use; here we're specifying DeepAR, with the hyperparameters I've set for this project. Obviously, if you train for more epochs, your model is most likely going to be better, and that's one change I would make. The data I was working with was very large, and we found that everything past 80 epochs wasn't necessarily beneficial: since we were looking at thousands and thousands of pounds of these raw materials, we just needed to be in the ballpark. Additionally, there's a lot more work that went into these hyperparameters, so be sure to look at them whenever you're trying to apply this yourself.
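For reference, here is a hedged sketch of what that estimator setup could look like with the current SageMaker Python SDK (v2). The bucket paths and instance type are placeholders; only the hyperparameter values (monthly frequency, prediction length 5, context length 12, roughly 80 epochs) come from the video:

```python
# A sketch of the DeepAR training setup, assumed to run inside a
# SageMaker notebook. "my-bucket" and the paths are placeholders.
import boto3
import sagemaker
from sagemaker.estimator import Estimator

session = sagemaker.Session()
region = boto3.Session().region_name
image_uri = sagemaker.image_uris.retrieve("forecasting-deepar", region)

estimator = Estimator(
    image_uri=image_uri,
    role=sagemaker.get_execution_role(),
    instance_count=1,
    instance_type="ml.c5.xlarge",
    output_path="s3://my-bucket/deepar/output",  # where results land
    sagemaker_session=session,
)

estimator.set_hyperparameters(
    time_freq="M",        # monthly data
    prediction_length=5,  # predict 5 months into the future
    context_length=12,    # look back 12 months
    epochs=80,            # the video saw diminishing returns past ~80
)

# Training data must already be in S3 as JSON Lines: one
# {"start": "...", "target": [...]} object per part-number series.
estimator.fit({"train": "s3://my-bucket/deepar/train/data.json"})
```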
When you use these Amazon algorithms, they create an endpoint. An endpoint just spins the model up so that you can make live predictions from it. Here we're defining the predict function we're going to use; this is just copied and pasted from Amazon, and then our prediction routines are modified a little for our specific case. Once we execute our predict function, we get back graphs that look like this: the blue is the target, which is the actual values we put in, and the prediction median is what the model predicted to happen in that month.

Lastly, since I didn't do a risk assessment for these items, I tried to have different levels of the raw material that I would need at different confidence levels. Although I didn't denote this specifically in the Python script, if there was a critical raw material that I knew we often ran out of, I would use the 90% confidence level here instead of the 75% and purchase at that level. All you need to do here is compare two Excel sheets using Python; I have examples of how to compare two Excel sheets in other videos, and I'll link those in the description below. If your predicted consumption is higher than your inventory level, that should trigger you to make a purchase order.

Since I was only in this role for a few months before moving on to my next opportunity, there are a few things I never got the chance to implement, but if you're trying to implement this yourself, I'd encourage you to think about them. In this example we started from the raw material consumption; a better way to use a machine learning algorithm to decide what levels to purchase at is to look at the customer orders. You should try to predict the level at which those orders will come in, and once you've predicted the orders you'll have, relate them to the bill of materials and purchase your raw materials based on that. Another thing we didn't cover in the script is the lead time of the raw materials: if your lead time is long, you should have a way to trigger a PO much further in advance, and unfortunately this algorithm didn't take that into account. Thirdly, as I mentioned earlier in the video, a risk assessment was never done, which means we couldn't truly automate this system, because the algorithm doesn't know which components we actually run out of. Lastly, the granularity of these calculations wasn't the best. In reality, if you're running a supply chain, you should use the finest granularity you can in your calculations: I used monthly consumption because that's all I had, but with daily consumption your algorithm will be much more exact.

That's all for this video. I'll link all the services I used in the description below. Remember, if you want a full tutorial on how I did this step by step, leave a like on this video, and if you have any questions or comments in the meantime, be sure to post them in the comments. Until next time!
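To make the purchase-order check described above concrete, here is a minimal sketch of comparing the two Excel sheets. The file names, quantile columns (p75/p90), the on-hand column, and the critical-parts list are all hypothetical, not taken from the video:

```python
# A sketch of the final step: compare forecast consumption against
# current inventory and flag purchase orders. Uses the 90% quantile
# for critical parts and the 75% quantile otherwise, mirroring the
# confidence levels mentioned in the video. All names are assumed.
import pandas as pd

forecast = pd.read_excel("predicted_consumption.xlsx", index_col="Part")
inventory = pd.read_excel("inventory_levels.xlsx", index_col="Part")

critical_parts = {"136"}  # parts we know we often run out of

for part in forecast.index:
    quantile = "p90" if part in critical_parts else "p75"
    needed = forecast.loc[part, quantile]
    on_hand = inventory.loc[part, "OnHand"]
    if needed > on_hand:
        print(f"Part {part}: forecast {needed} > inventory {on_hand}"
              " -> raise a purchase order")
```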
Info
Channel: Derrick Sherrill
Views: 26,773
Rating: 4.98 out of 5
Keywords: supply chain management, machine learning, automate supply chain, amazon web services, python programming, S3, SageMaker, DeepAR, Machine Learning Algorithm, scm, ml, aws, Derrick Sherrill
Id: x01N1kIQhUs
Length: 8min 28sec (508 seconds)
Published: Sat Apr 20 2019