R you Ready to Python? An Introduction to Working with Land Remote Sensing Data in R and Python

Captions
Okay, good morning, afternoon, or evening, everybody. I'd like to welcome you to the "R you Ready to Python? An Introduction to Working with Land Remote Sensing Data in R and Python" Earthdata webinar. This is your host, Jennifer Brennan. While everybody is logging in, we have two optional polls at the bottom left and middle portions of your page, and we certainly appreciate your feedback on those polls. I've got 2 p.m. Eastern Daylight Time, so we are going to go ahead and get started. What I'd like to do first is go over a few logistics related to this webinar. To ensure the best audio experience, the conference has been placed in silent mode; however, if you have any issues or any questions, please enter them into the Q&A pod located in the lower right-hand corner of your screen. This works like a chat. As for the webinar being recorded: it is being recorded, and the recording will be posted to our online Adobe Connect webinar catalog as well as to our NASA Earthdata YouTube channel within a few days of completion, and I will make all presentation files available for you to download at the end of the webinar. Regarding timing, this webinar is one hour long. We've allocated 45 minutes to the presentation and live demonstrations, with an additional 15 minutes for the question-and-answer period. After our speakers finish their presentations, we'll move to a final set of polling questions and then transition directly to the Q&A period. You will have an opportunity to ask your questions throughout all portions of the webinar, with the exception of the live demonstrations, by using the Q&A pod; again, this works like a chat. Questions will not be answered using the raise-hand function; it has been disabled. One final note: depending upon the volume of questions received today, we will extend the Q&A period an additional 15 minutes, to 3:15 p.m.
Eastern Time, for those of you who are able to stay on the line. What I'd like to do next is pull up today's agenda. During the first half of the webinar, our speakers will provide an introduction to the Land Processes DAAC; they'll show you how to navigate the NASA LP DAAC website and locate e-learning resources. We'll then transition to two short video tutorials that show you how to download and execute both an ASTER Level 1T script and a VIIRS script. The second half of today's webinar will be spent on live demonstrations using both R and Python in Jupyter Notebook. Today we have two speakers. The first speaker is Cole Krehbiel, a remote sensing scientist at the NASA LP DAAC, and our second speaker today is Aaron Friesz, a data scientist at the LP DAAC. With that, it is my pleasure to introduce our first speaker, Cole Krehbiel. Thank you, Jennifer. Good afternoon, everyone. Today Aaron and I will be providing information and live demonstrations using R and Python to work with land remote sensing datasets archived at NASA's Land Processes Distributed Active Archive Center, or LP DAAC. The LP DAAC is located in Sioux Falls, South Dakota, at the USGS Earth Resources Observation and Science Center, or EROS. It is one of 12 NASA DAACs and is part of NASA's Earth Science Data Systems program. Being the land-processes-oriented DAAC, the LP DAAC archives datasets including variables such as surface reflectance, land cover, and vegetation indices of the land surface. What is the LP DAAC, and what do we do? The LP DAAC is made up of over 40 staff working as part of an interdisciplinary team, from scientists like us to software engineers to science communication specialists. We archive Earth science remote sensing data, distribute and provide open access to the data, offer user support, and conduct outreach to the user community; this webinar is a good example of those last three functions. Our user community includes over one hundred and seventy
thousand users, representing 126 countries. The products that we archive include mission products such as ASTER, MODIS, and the newly released NASA VIIRS products, as well as derivative products from NASA missions developed by PIs in the Earth science community. As Jennifer mentioned, we will begin by showing videos highlighting the e-learning and tutorial sections of the LP DAAC website and how to find and execute our readily available Python and R scripts. Later we will go into live demos of general uses for working with remote sensing data in R and in Python. At the end of this webinar we hope that you will know where to find our available tutorials and how to execute them, and have a better understanding of how to work with land remote sensing data in R and in Python. R and Python are both great candidates for working with remote sensing data for multiple reasons. First, they are efficient: they are quick, easy-to-use, higher-level programming languages; they offer appealing new-user development environments such as RStudio or Jupyter Notebook; and they are reproducible. On a similar note, they also allow users to batch process and automate some of the steps needed to pre-process and process satellite data. This is a huge benefit if you are working with many files, where the typical point-and-click manner of performing these steps in GIS or remote sensing software may be extremely time-consuming. R and Python are also efficient because you can do everything, from downloading and pre-processing to processing and analyzing the data, all in the same place. And of course we would be remiss not to mention that both are freely available. Moreover, both are widely used within the remote sensing and GIS community, but also in many other disciplines, which means that when you do encounter issues there is good documentation and a large user base to help you overcome those problems. In order to download data from the LP DAAC Data Pool you will first need to provide your NASA Earthdata Login
account credentials. The LP DAAC Data Pool is where we store all LP DAAC land remote sensing data, which we can see here. If we click on VIIRS, we see a list of directories showing the four VIIRS surface reflectance products that were recently made available. If you go into a directory and actually try to download a file, you will be prompted for your NASA Earthdata Login username and password. If you do not have a NASA Earthdata Login account, you will need to go to the link provided, set up an Earthdata account, and authorize the LP DAAC Data Pool; it is free and very quick and easy to get signed up. Here is an overview of the data products that we will use in today's presentation. We will begin by showing how to convert ASTER Level 1 Precision Terrain Corrected Registered Radiance at the Sensor, or AST_L1T, data from digital numbers to top-of-atmosphere reflectance. We will specifically be looking at the ASTER visible and near-infrared, or VNIR, bands available at 15-meter resolution. Next we will look at the Visible Infrared Imaging Radiometer Suite, or VIIRS, and how to deal with HDF-EOS5 files. As I mentioned, VIIRS surface reflectance data recently became available on the LP DAAC Data Pool. We will focus on the VNP09A1 product today, but there are actually scripts available to process all four VIIRS datasets; VNP09A1 is an eight-day, global, one-kilometer product. We will wrap up our presentation by showing how to leverage AppEEARS quality services to access quality information from one of our MODIS vegetation indices products, MOD13A2 Version 6. Now I will hand it off to Aaron, who will show where to find LP DAAC e-learning materials and how to download them, with a quick video demonstration of the website. This is the LP DAAC website, lpdaac.usgs.gov. This is where you would come to find the datasets that we have, any tools that we have, and any resources that we might offer regarding the usage of our datasets. If we go into this User Resources drop-down
and click E-Learning, you're taken to our e-learning section, where we have webinar material, tutorials, and video tips to help you use our data. You can click on these individual tabs to filter the material, and you can also filter the material by using this search dialog. As I start typing in "VIIRS," I get one tutorial in return; I can click this link, and it takes me to a landing page. All of the scripts and tutorials that we develop have a landing page, and they follow a pretty strict structure. We have an Objective section that gives you the idea and purpose behind the script or tutorial; a Prerequisites section that gives you an idea of the systems we've tested it on, the language versions we've tested, and the dependencies you would need to download in order to use it; and then a Procedures section that gives you a quick start on executing one of our scripts. We normally have an R script or Python script, and this gives you the first steps to execute those scripts. Then we have a Related Tutorials section, where you can get links to any materials related to this specific tutorial or script. We're going to click into this Python script here, which takes us to our public-facing Git repository. In this case we're in a directory that contains both the Python and R scripts; most of our repositories aren't quite structured like this, but for this particular one they're both in the same directory. If you're familiar with Git, you can copy the data or the documents over to your machine in a couple of ways: you can clone the repository to your machine, or you can use the download function, which will download a zip file of the entire repository onto your machine. But in many cases you're interested in a single file, so in this case we're going to click into the Python script to get the text representation, and then we will go up to the right-hand corner and click the raw
file, do a right-click, and choose Save As. What we want to do here is remove that .txt extension and change it to a .py extension. Then we'll click Save, go see where it downloaded to, and there you go: you have your Python script from our Git repository. So that's how you interact with our landing pages for tutorials and how to download scripts from our repository. Now I'll pass it back to Cole to give you a demonstration of this process. As Aaron mentioned, we can find LP DAAC tools and tutorials on our e-learning page; you can sort by type of presentation or search by keyword. These tutorials come fully automated, meaning that you simply need to download them, point to the files that you would like to be processed, and they will batch process the files for you. Here I'm going to type "ASTER L1T" into the search bar, and we see that we have a couple of tutorials on how to convert ASTER L1T from HDF to GeoTIFF, so we'll click on that first tutorial there. This is our "How to Convert ASTER L1T Radiance to Top of Atmosphere Reflectance" tutorial. This data recipe demonstrates how R and Python scripts can be used to convert ASTER L1T data from digital numbers to radiance, and from radiance to top-of-atmosphere reflectance. The scripts will do this for each of the visible, near-infrared, and shortwave-infrared bands in a given HDF file. Note that if you are interested in using the ASTER thermal bands, you should check out our "How to Reformat and Georeference ASTER L1T HDF Files" tutorial. ASTER L1T data are stored as unscaled digital numbers; however, if users are interested in measuring biophysical properties of the surface by calculating metrics such as vegetation indices, they will need to first convert the data to reflectance. The script will output GeoTIFFs for each ASTER band in the original digital number format, as radiance, and as top-of-atmosphere reflectance. We have created both R and Python versions of this tutorial.
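A minimal NumPy sketch of the two conversions the recipe performs, DN to radiance and radiance to top-of-atmosphere reflectance. The UCC, ESUN, day-of-year, and solar zenith values here are illustrative placeholders, not the per-band constants the LP DAAC scripts look up:

```python
import numpy as np

# Illustrative placeholder constants; the LP DAAC scripts select the
# unit conversion coefficient (UCC) by band and gain, and ESUN by band.
ucc = 0.676           # W/m^2/sr/um per DN (placeholder)
esun = 1848.0         # exo-atmospheric solar irradiance (placeholder)
doy = 172             # day of year, used for Earth-Sun distance
solar_zenith = 34.16  # degrees, taken from the file metadata

def dn_to_radiance(dn, ucc):
    """Convert digital numbers to at-sensor radiance; DN 0 is fill."""
    dn = dn.astype(np.float64)
    rad = (dn - 1.0) * ucc
    rad[dn == 0] = np.nan  # propagate fill as NaN
    return rad

def radiance_to_toa(rad, esun, solar_zenith_deg, doy):
    """Convert at-sensor radiance to top-of-atmosphere reflectance."""
    # Earth-Sun distance in AU (simple day-of-year approximation)
    d = 1.0 - 0.01672 * np.cos(np.radians(0.9856 * (doy - 4)))
    return (np.pi * rad * d ** 2) / (esun * np.cos(np.radians(solar_zenith_deg)))

dn = np.array([[0, 50], [120, 255]], dtype=np.uint8)
toa = radiance_to_toa(dn_to_radiance(dn, ucc), esun, solar_zenith, doy)
```

The scripts run this same band-by-band logic over entire raster arrays rather than the tiny sample array shown here.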
And here are the testing environments: all testing was done on the Windows operating system, using R version 3.3, and for Python we tested both Python 2.7 and Python 3.4. There's also a list of libraries you'll need in order to run the scripts. Moving down, we see the procedures for both the R and Python scripts. First we will copy the script. I've already downloaded an ASTER L1T file, and I've also already started an R session where we can open the newly downloaded script; I will show you how to change the input directory and execute the entire script in a second. Here I will click on the R script, and I'm brought to the Git repository that Aaron just showed everyone. I will download this script by clicking on the raw file in the upper right-hand corner of the page, and once I'm taken to the raw file I will right-click and select Save As. I will change this to "All Files," and the important thing here is to change the file extension to reflect that this is an R script, which is .R, and then we can go ahead and save it. Now I will move over to my RStudio console. We can click Open and navigate to where we downloaded our new R script, and here we have our file. We will look deeper into what this script actually does in the ASTER L1T live demo coming up, where we will work interactively with ASTER L1T data in R, but the big thing here is to change this input directory to wherever you have your ASTER L1T file stored, like so. The next thing I'm going to do is press Ctrl+A, which will highlight the entire script, and then we can go to the upper right-hand corner again and select Run, so now R is running the entire script. Now I will move over to a directory where I previously ran the script, and here we can see an example of the ASTER L1T output files, including data for each band as digital numbers, radiance, and top-of-atmosphere reflectance. Lastly, I want to show you the actual results: here we can see Band 1 top-of-atmosphere reflectance,
and we can see our data is spatially referenced with the correct UTM zone applied. Now I will pass it back to Aaron, who will show us another readily available script from the LP DAAC that works with VIIRS data. Yes, so we put together some scripts and drafted some tutorials on how to reformat and georeference VIIRS surface reflectance HDF-EOS5 files. To get to this, we will start on our e-learning page and start typing "VIIRS" to filter our list of materials and get to our tutorial link. We'll click there and be taken to the landing page for these scripts, where we can see additional information regarding the purpose and usage of the scripts. For the objective: VIIRS data is our newest dataset, and it's stored as HDF-EOS5 files, and because of this a lot of the programs that our users typically use haven't quite caught up to being able to visualize these data in a spatial manner. This script will convert each science dataset within the VIIRS file into a GeoTIFF so that it's easily brought into a GIS or remote sensing program. These files will be georeferenced and ready for any analysis that you want to throw at them. Currently we have four separate surface reflectance products, and this script will actually work on all four. Moving down to the Prerequisites section: we've developed this script on a Windows machine and haven't really tested it on Linux as of yet. For R we've tested in version 3.3, and in Python we've tested 2.7 and 3.4, and you can see the dependencies below. For the procedures, we always ask you to download the data first. For the R script, again you're going to have to go into the script and change that input directory variable to the folder where you saved the data. For the Python script, we've actually wrapped it up so that you can execute it from the command line without having to go into the script and change anything.
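That command-line pattern, running `python script.py <directory>` against a folder of granules, can be sketched like this. The `*.h5` pattern and the function name are assumptions for illustration, not the actual LP DAAC script:

```python
import sys
from pathlib import Path

def find_granules(in_dir, pattern="*.h5"):
    """Return every HDF5 granule in the given directory.

    The script acts on a whole directory rather than a single file,
    matching the behavior described in the demo.
    """
    directory = Path(in_dir)
    if not directory.is_dir():
        raise SystemExit("Not a directory: %s" % in_dir)
    return sorted(directory.glob(pattern))

if __name__ == "__main__" and len(sys.argv) > 1:
    # Usage: python script.py C:\path\to\viirs\data
    for granule in find_granules(sys.argv[1]):
        print("Processing %s ..." % granule.name)  # conversion would go here
```

Passing a file path instead of a directory would fail the `is_dir` check, which mirrors the note in the demo that the script won't act on an individual file.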
And so we're going to click into that Python script, go to the public repository, and click on the script on this screen, which takes me to the text version. I'll click the raw file, right-click, and Save As, and now I will change the extension from .txt to .py, and then we'll go to that folder. You'll notice that we have saved that VIIRS script in this directory; I've also gone out and downloaded an HDF-EOS5 file previously, specifically this VIIRS VNP09A1 product. Now we are going to very simply execute the script from the command line and create some GeoTIFFs for each science dataset within the VIIRS file. I'm going to go down and open up my command prompt, very simply type in "python," and then drag and add the Python script and its path to my command window, and then drag and add my file directory. Something to note here: I copied over the file name, but the script won't act on an individual file like this, so you have to take the path down to the directory where the data is stored, and it will execute on all the files stored within that directory. I'll hit Enter; the script typically runs pretty fast. So now we have an output location, and we'll go there and take a look at what we got for output. These are all of the science datasets that are within the VIIRS file, and we can take a look at them in R or in ArcGIS. I'll pull up a GeoTIFF; it's the M7 band, which is the near-infrared band, and you can see that it comes up on the screen referenced correctly. To prove that, we'll take a look at the properties: you can see that the dimension information, the cell size information, and the projection information have all been applied, and it's georeferenced, so you can now perform your analysis on it. So now you know how to execute the script from the command line with Python to get these georeferenced GeoTIFFs, and now I will hand it back over to Cole,
who will start the live portion of the webinar, showing how to process and interact with ASTER L1T files in R. All right, as Aaron mentioned, now I will show you how to work with ASTER L1T data more interactively in R. To do so, I have created an R Markdown document, or R Notebook. This is an ideal format for demonstrating how to work with data in R because it allows me to break the code into code blocks that I can run while explaining what each specific block is doing. In this live demo I will show you how to set up your working environment, define functions, read metadata, georeference raster arrays, and define coordinate reference systems; we will also look at how to subset specific datasets and apply our previously defined functions on rasters. Items one through six are all covered in our readily available ASTER L1T DN-to-reflectance script that we showed earlier today. For today's demo I will go a few steps further and show how you can take the output results from that script and visualize and plot ASTER data, including how to generate an RGB composite, how to calculate NDVI, how to visualize histograms and calculate statistics, and ultimately how to export your results to GeoTIFFs. So we'll start here by setting up our environment. To execute a code block in an R Notebook, we click on this green arrow here, so I'll go ahead and run that. We start by loading the necessary packages, setting our current working directory, creating our output directory, and then creating a list of ASTER L1T files in our working directory; for this demo we just have one ASTER L1T file located in the directory. Next we will set up our calculations. There are two main calculations: the first from DN to radiance, and the second from radiance to top-of-atmosphere reflectance. However, in order to perform these calculations we will need to define certain variables beforehand, including the unit conversion coefficient, or UCC. The UCC is set by band and by gain designation, so in order to set up
this list of values, we create what's called a data frame in R. I'll run this block here, and below we can see the output, where we set things up by band and by gain designation, which we will later use to call and select a specific value in order to calculate from DN to reflectance. Next we can define functions for our calculations. We can run this, and we've set up our calculations, which we will later use to pass over the entire raster array in order to convert from digital numbers to top-of-atmosphere reflectance. Next we will read the file metadata; I'll run this here. The important call here comes from GDAL: this GDALinfo call reports back all of the file metadata for the ASTER L1T HDF file, and there's a lot of it. We can see here over 400 lines in this very large string object, and it's all very good information, but we only need a few key bits of it in order to do what we're looking to do today, and grep can help us with that. Grep, or regular expressions, is an extremely helpful pattern-matching function. In the code block below we search the very large string object in order to find the solar direction, or solar zenith angle, that is used in the conversion from radiance to reflectance. I'll run this here: first we call grep with the specific text string we're interested in, and it returns where in that very large string object the match is located. Next we index into the metadata object, and it gives us the information provided on that line. We take it one step further and use substring and as.numeric in order to convert that to a number, arriving at 34.16, which is a value we can then input into a function in order to perform our calculations. The reason I spent some time explaining all that is because it's actually very important: using grep and subsetting strings is a key step not only in looking at ASTER L1T data in R, but in
looking at all sorts of remote sensing data in both R and Python. Here is another example of how we use grep, to search for the gain designation for each ASTER band; here we can print out the gain for Band 1, which for this particular granule is normal. Next we can define our coordinate reference system, so I'll run this here. Basically, we are searching for our upper-left and lower-right bounding-box coordinates in order to set up our extent, and we will also search for the specific UTM zone that is used for this ASTER L1T file. Then we use the extent call from the raster package in order to actually define that extent, which we can see output below. Next we compile our coordinate reference system, or CRS, string to attach projection information to the rasters. If we run this and input the UTM zone that we grabbed from the metadata, we see our proj4 string here with the correct UTM zone attached. Next, let's get a list of the science datasets using the get_subdatasets call from the gdalUtils package. If we run this, we are presented with all the specific datasets within the ASTER L1T file, including the three that we will look at today, bands 1, 2, and 3N, the visible and near-infrared bands. Next we can set up each of the visible and near-infrared bands, so I'll run this below. We also generate output file names, extract the band names for each file, and attach the output directory, so in our output here we have our output directory, our original file name, and this is for Band 1. Below is where we actually extract the specific datasets using GDAL, set the coordinate reference system, and define the extent. The big call here is gdal_translate, which opens the ASTER L1T HDF file, extracts the specific subdataset we're interested in, and exports it as a GeoTIFF. We can also attach the CRS by using the raster call here, and then define the extent using extent, and there we can see the extent for our raster layer. Next is where we actually
convert DN to top-of-atmosphere reflectance, so I'm going to go ahead and run this. We begin with some basic logic that defines the UCC value based on the high, normal, or low gain designation for each band, and then eventually we use the calc function in order to run our function over the entire raster array; the reason this block is so long is that we're doing it for all three bands. Again, I want to reiterate: everything above can be found in our ASTER L1T conversion scripts on the LP DAAC website. Now let's look at an example of how you could further apply your results from those tutorials. We're going to begin with a basic plot function, and this comes from the raster package. We can run this, and here we're just running plot on our reflectance from Band 1, and we can see the entire ASTER L1T image; this is an area of agricultural fields located in Saudi Arabia. Next we want to actually crop this image, so we'll run this here: we set up a sub-extent and then use the crop function from the raster package to crop all of our arrays. A big thing here is to set your fill value equal to NA so it's ignored in the plotting and statistics. Then we can go ahead and do a basic plot again, and here we've zoomed in on those irrigated agricultural fields. Next we're going to stack the three visible and near-infrared bands; I'm going to start this beforehand because it does take a little bit of time. Here we're taking all three rasters and stacking them on top of each other using the stack call from the raster package, and then plotRGB, which also comes from the raster package, allows us to input the three-band composite and define which band goes to red, green, and blue. Here we can see our output, and now we really see those bright green agricultural fields against the surrounding desert environment. Next we will define and compute NDVI using Band 2 (red) and Band 3N (near-infrared) reflectance, and then
we can plot our results, so I'll run this here. We begin by defining our function of two bands, then we pass each of the Band 3N and Band 2 raster arrays into the calculate-NDVI function and plot the results. Here we can see, in shades of green, the higher NDVI from agricultural fields compared to the surrounding desert environment. If we want to stretch the image, a good way to determine where those bounds lie is by plotting the histogram, so here we call the hist function on our raster array; you can define other things, like the main title, "Distribution of NDVI." Here we can see that there's a high concentration of desert pixels near zero, and then there are some irrigated agricultural areas with higher NDVI. We use that information to set the zlim parameter in the plot call. We can run that, and here we've set those specific bounds so that we can stretch the data a little bit better; we can also set things like the maximum number of pixels that will be plotted, and here we can see our NDVI image. Next I just wanted to quickly show you how to calculate some basic statistics. Here we are going to use cellStats, which again comes from the raster package in R, to calculate statistics on our entire raster array. If we run this, we call cellStats and then select whichever statistic you're interested in; here I've selected mean, min, max, and standard deviation, and then we can print out our results. Last but definitely not least is to export our results to GeoTIFFs. I'll run this: we set up our output file names and then use the writeRaster function in order to export them to GeoTIFFs. Then, very quickly, I just wanted to show you the results: here is the RGB image that we just created in R, and here is the NDVI image that we just created in R with a different color map applied to it, and that higher NDVI in ASTER is a good indication of the health of the
vegetation in this particular scene. All right, without further ado, I'm going to pass it back over to Aaron, who's going to show us how to work with VIIRS surface reflectance data in Python. Okay, so I'm going to show you this tutorial in Jupyter Notebook. Jupyter is an interactive programming platform that allows you to display both code and rich text elements; it's great for tutorials, so I'm going to kick off a session right here. We created this VIIRS tutorial to demonstrate how to process and interact with the VIIRS data, so it's an extension of the previous script that I just showed. In this tutorial we'll bring in some VIIRS data, show you how to bring it into your Python environment and interact with it, and do some simple plotting as well, so let's click into it. This tutorial can actually be executed on all four products that we have: the A1, G1, H1, and the CMGs. Toward the end, with the mapping functions, you might have to tinker a little bit, but for the most part, to visualize and interact with the data, you don't really need to do much but change the input directory. We're going to start off by calling in the modules that we'll need to interact with these datasets, starting with importing the h5py library, which is essential for reading in HDF5 files, plus some common geospatial libraries as well. In this cell block here (sorry for the jumping), I am specifying the directory where all of my data is stored, and I'm creating a list object. Printing out that list object, I see that I only have one file in there; if I had multiple files, they would all show up here. In this code block I am specifying a specific file: in this case I'm using index 0, which means I want the first file in that list, and then I'll use this to add to the input directory name so
that I can set up a path to the file and pull it in using that h5py library. This block will create an output directory; any GeoTIFFs that I create and any graphs that I create will be dumped in here. Something to note: the VIIRS tile data is on the same sinusoidal projection as our MODIS collection, so what this will do, later on in the script, is specify whether we're dealing with a tile product or a CMG product and assign the proper projection depending on which it is. So let's dive into the data: let's pull in that infile location that we specified earlier, and now we have the HDF5 object in my workspace. I'm going to strip the name off and remove the extension; this will be used in outputs, so if I want to output some GeoTIFFs I would tack this onto the front along with the layer name. Diving into the file a little bit: these VIIRS files have two main groups, an HDFEOS group and an HDFEOS INFORMATION group. Inside the HDFEOS group there are more groups, called ADDITIONAL and GRIDS. This GRIDS group is where all of the science datasets within the VIIRS file are stored, along with some attributes that we'll need later on. The HDFEOS INFORMATION group has a StructMetadata.0 element within it, and this really gives you the global metadata: it will give you the origin in lat/long, it will give you the dimensions, and it will give you some other useful information. This is the kind of big metadata object that you saw in Cole's presentation, and we'll print it out in a little bit. This little code block shows you all of the groups in their hierarchical order, and you can see in this GRIDS group we have all the science datasets listed out here. Moving on to that StructMetadata.0 object: we're digging into it and creating a list of string
We're going to use that list to get the information we need in order to georeference our output images. You can see we have things like the upper-left point in meters, the lower-right point, and projection information; all of this can be used to make decisions or create outputs. We'll take that object, do some string manipulation, and pull out the upper-left longitude and upper-left latitude coordinates so I can get my origin point. In this block I am pulling out an element called the grid name, which is used to set the projection. Depending on this grid name I can determine whether it's a CMG or a tiled product: if it has CMG within the name, it will be assigned the geographic coordinate reference system; otherwise, it will be assigned the sinusoidal projection. That's used in the outputs. This code block will list every science dataset that's within this VIIRS file, and we will use this information, either as an index or otherwise, in some further examples of simple plotting. So we're going to call in the matplotlib library, which is a pretty common library in Python for plotting. We're specifically interested in the M5, M4, and M3 spectral bands, which correspond to red, green, and blue, and we're going to plot these. I am referring here to that dataset object, and I'm pulling out those RGB objects. Each object actually contains both data and attributes, so I can specify which I want to look at and what pieces I want to pull out to use later on. Specifically, from the attributes I want to pull out the scale factor and the fill value, and I'll apply those to the data and then plot the results. For these bands they are the same: the scale factor is the same between bands, and the fill value is the same as well. So here I am pulling the data values out of that array object and applying the scale factor for all three of those bands.
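Applying those two attributes looks roughly like this; the raw numbers, the scale factor, and the fill value below are invented example values, so check the actual attributes on your file rather than relying on these:

```python
import numpy as np

# Raw digital numbers as they might come out of an h5py dataset, plus
# example attribute values (both numbers are illustrative, not the real
# VNP09 attributes).
raw = np.array([[2000, 4000], [-28672, 8000]], dtype=np.int16)
scale_factor = 0.0001
fill_value = -28672

# Mask the fill value to NaN, then apply the scale factor to everything
# else, mirroring the order of operations in the tutorial.
band = np.where(raw == fill_value, np.nan, raw * scale_factor)
print(band)
```

The same two lines are repeated per band; since the attributes match across the three RGB bands here, the one scale factor and fill value can be reused.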
Then, in the last part, I am creating a NumPy data stack of the RGB bands, which I will then plot, and in the block after that I am specifying the fill value for that data stack. So when I do a plot, here is the true-color image using the RGB bands. Another thing you might want to do is create the NDVI index. To do that I need the near-infrared band and the red band, so again I am pulling the near-infrared band out of that dataset list, taking the data out of that object, and applying the scale factor to it; that's the scaled NIR array. Then I'm going to apply the no-data value to that array, and in this case I apply the no-data value to the red array as well. When I take this NDVI object, which is near-infrared minus red divided by near-infrared plus red, I get this image down here. Okay, so we can do a little bit more to make this prettier: we can restrict the values that are shown. We do a histogram to see where the values fall, and we make a decision depending on what we see in that histogram; we can then use that information to create our final image. In this image we're adding a title to give an idea of where this tile is located, we are restricting the data values that are shown based on what we saw in the histogram, and we're applying that yellow-to-green colormap again. So that is our final output here. I do a savefig, which saves it as a PNG, so I can have some distributable images, in this case not georeferenced. But this next block, which is commented out, would actually run through each of the science datasets that we specified up above; it would use all the georeferencing information that we pulled from the metadata and apply that to the output to create a GeoTIFF. So if you uncommented this and executed it, it would cycle through all of those science datasets.
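The NDVI arithmetic just described, near-infrared minus red over near-infrared plus red, is a one-liner once the scaled arrays exist; the reflectance values below are invented:

```python
import numpy as np

# Scaled red and near-infrared reflectance arrays, with NaN already
# standing in for fill/no-data pixels (values are made up).
red = np.array([[0.05, 0.10], [np.nan, 0.08]])
nir = np.array([[0.40, 0.30], [0.35, np.nan]])

# NDVI = (NIR - red) / (NIR + red); NaNs propagate, so a no-data pixel
# in either band stays no-data in the result.
ndvi = (nir - red) / (nir + red)
print(np.round(ndvi, 3))
```

Using NaN for no-data also means matplotlib leaves those pixels unfilled when the array is plotted.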
You would get a GeoTIFF for each of the science datasets within. So that's the VIIRS script. Now I've jumped right into the quality tutorial that we wanted to show. This is based off of a web service that we developed a little over a year ago, probably, and it's a service that is designed to decode quality values. Anybody who's familiar with MODIS quality knows that it can be a pain to actually decode the quality layers, and we tried to make it a little bit easier for our users by developing this service. Because this is a web service, we need to import the requests module so that we can make service calls over the Internet, and then I import the pandas library just to create some pretty tables at the end, to separate the data and make it more legible. So let's kick this one off; sorry again for the jumping. Okay, so there are different pieces to the service, depending on what you have built onto the service URL here, which is within the AppEEARS API that we have developed. If I add quality to the service URL, I will get all of the products with layers that have quality associated with them, and this is only for our MODIS collection at this time. For a quick view of the output, after running that service I can filter it down using some indexing, and this is what I get: a list of dictionaries that have all sorts of key/value pairs I can use to do some further filtering. Maybe I just want information about a particular product that I'm interested in; in this case I want some information on the MYD13A2 version 5 product. I can do that in two ways. First, I can use the list that I just created and perform some filtering on it, saying that for this particular key, this product and version, I want the value associated with it. From that output I get two layers, the EVI layer and the NDVI layer, and both of them have the same associated quality layer.
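The shape of that first filtering approach can be sketched like this. The base URL and the dictionary key names are assumptions for illustration (the real service sits in the LP DAAC AppEEARS API), and the response is hard-coded here so the example runs without a live requests.get call:

```python
# Hypothetical base URL; the real quality service lives in the AppEEARS API.
base_url = "https://example.gov/api/quality"

# Hard-coded stand-in for the JSON the service would return: a list of
# dictionaries with key/value pairs (key names are illustrative).
mock_response = [
    {"ProductAndVersion": "MYD13A2.005", "Layer": "NDVI",
     "QualityLayers": "1 km 16 days VI Quality"},
    {"ProductAndVersion": "MYD13A2.005", "Layer": "EVI",
     "QualityLayers": "1 km 16 days VI Quality"},
    {"ProductAndVersion": "MOD09GA.006", "Layer": "sur_refl_b01",
     "QualityLayers": "QC_500m"},
]

# Option 1 from the walkthrough: filter the list down to one product and
# version, keeping the layer names and their associated quality layer.
myd13 = [d for d in mock_response
         if d["ProductAndVersion"] == "MYD13A2.005"]
for d in myd13:
    print(d["Layer"], "->", d["QualityLayers"])
```

With the live service you would replace `mock_response` with `requests.get(base_url).json()` against the real endpoint.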
It's the same for both of these layers, and you'll see that with most of our products, but that quality layer name is really what I'm after. You can get the same result by using the service: along with quality, I can tack on the product name with its version, and I get the same exact result back from our service. Again, we're after this quality layer name that you can see right there, which I will then add to the service URL to make another request. The result of that request is basically all of the possible values that you could run into in a quality layer. That isn't very helpful on its own, but if you take it a step further and add an actual integer value to the end of the service URL, you will get the decoded values for that individual pixel value. To make that a little prettier, we throw it into a pandas DataFrame; you'll notice, for example, that for adjacent cloud detection it says no, and for MODLAND it says good quality. If you're familiar with working with MODIS quality, you'll notice that this is out of order. The service actually returns the values in order, but Python rearranges them alphabetically, so to keep the proper order we use this code block and throw the data into another DataFrame, and now you can see it is in its proper order for each of its bit words: MODLAND followed by VI usefulness, and so on. I can then further drill into that data container and request just the MODLAND bit. But you're not limited in Python to decoding only a single integer value; we've actually taken this and made it operational over entire arrays. We've done that for a Python tool we developed for ArcGIS that will take in an array of data and create an output image that has been completely decoded. So this is just a quick example of the MODIS quality service, which you can then take, expand, and use as you wish.
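Under the hood, decoding a packed quality integer is bit shifting and masking. This standalone sketch assumes the common MODIS VI quality layout (MODLAND QA in bits 0-1, VI usefulness in bits 2-5); confirm the bit-field offsets in the product user guide before relying on them:

```python
def decode_bits(value, start, length):
    """Extract `length` bits from `value`, beginning at bit `start`
    (least-significant bit is bit 0)."""
    return (value >> start) & ((1 << length) - 1)

# An arbitrary example pixel value from a VI Quality layer.
quality = 2116

# Assumed bit-field layout: MODLAND QA in bits 0-1, VI usefulness in
# bits 2-5 (check the product documentation for the real offsets).
modland = decode_bits(quality, 0, 2)
vi_usefulness = decode_bits(quality, 2, 4)
print(modland, vi_usefulness)
```

Applied with NumPy's vectorized shift and mask operators, the same function works unchanged over an entire array, which is what the array-decoding tool does.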
So I think that's about all we have to demonstrate, and I will hand it back over to Jennifer to open it up for questions and finish things up. Okay, thank you Aaron, thanks everybody. At this point we'll move to the final set of polling questions; we'll give these just two minutes or so, and then from there we'll move directly to the Q&A period. All right, thanks everybody, we'll give these just a couple of minutes and then we'll move forward to the questions and answers. Okay everybody, we'll give the polling questions just another 30 seconds or so, and then we'll move directly to the Q&A period. Okay everybody, let's move to the Q&A period; if you have questions, feel free to type those in here. For those of you who asked whether the presentations would be available for download: yes. In the lower right-hand corner of your screen you will see the first three demos with the e-learning tutorials and the two script demos for ASTER Level 1T. This webinar is being recorded, and the recording links will be sent out to all participants and registrants within a couple of days of completion. You've also got the slide deck down below; if you click on any of the files you'll be prompted to download. Let me just check and see if we have any questions here. Anybody have any questions? Okay, let's see here. Okay: do these functions work on a Mac, either Cole or Aaron? You know, we've had a couple of issues with Mac users. Between the two of us we don't actually have a Mac to test on, but there seem to be more issues when you're looking at the metadata. The actual importing of the data values doesn't seem to be affected, but depending on your operating system, or even your GDAL version, you could get a different, compacted list of metadata, and so how you actually string-subset that stuff kind of varies between the two operating
systems, and so that's kind of been our experience at this point. Yeah, it should work on a Mac. The biggest thing is, if you are having issues on a Mac, I would check your GDAL version before you come to us with questions; that's the biggest source of differences in how the file metadata actually gets read, the version of GDAL that's installed. Okay, thanks. Our next question: does the quality service contain the quality layers in a config file, or does it query the metadata to determine if a variable is a quality variable? No, you have to explicitly specify which quality layer you're talking about. The service draws from some more-or-less lookup tables that we developed beforehand for each quality layer, and it queries those tables to give you the correct output for a quality value. Okay, thank you Aaron. Are there additional questions, anybody? Okay, while we're waiting to see if there are additional questions: when the Q&A period is over, I will leave the virtual meeting space open an additional ten minutes or so, depending on when we finish, so people have an opportunity to download the files, but we will log off from the audio component of the webinar. However, if you think of any questions or have any additional comments that you'd like to convey to us and the speakers, please enter those into the Q&A pod. Here's an additional question, one second: do you have any scripts for geolocation? I'd have to ask you to be more specific on that one. For level 2 data? For level 2, no scripts right now. No. Okay, thank you. And our next question: how do these tools compare with ENVI? We've never done a direct comparison with ENVI. What Python and R offer is, kind of, fewer clicks; it's more streamlined. The upfront effort to get something developed and operational for yourself is probably more than what you would experience with ENVI, but after you get that workflow set up, it's pretty easy to
slightly modify it and execute on a different set of data. As far as performance, we haven't done any direct testing. Okay, thank you. And a further clarification on the earlier question regarding the scripts for geolocation: the reference was to Level 2 swath data. Yeah, we do not have any scripts for geolocation of Level 2 swath data at this time. Okay, thank you. And the next question: have you tried these scripts on the new Landsat Collection 1 images? No, we have not tried these scripts on Landsat. Okay, and we have an additional question here: can these scripts be adapted to other satellite data? For sure, the lessons within can definitely be applied and modified. You probably won't be able to do an exact match, but especially for the HDF files, if you know how to parse the hierarchy of an HDF file, it's pretty easy to extend that to other HDF files. Generally, HDF-EOS5 is a NASA standard, so those files technically should have a very close structure; you should be able to navigate a dataset stored at one location in the same manner that you navigate a dataset from here. I'll just add to that: absolutely, the workflow itself should be very similar. Say you took the VIIRS script and you were interested in working with MODIS data; you should be able to look at what's going on in the VIIRS script and adapt it to your specific need for whichever MODIS product you're interested in, particularly in terms of the workflow of setting up your environment. Okay, thank you. Are there any additional questions right now? Let me scroll down again to see. Oh yes, there is. Okay, so the next question: R has memory issues with large files; what's the best option to handle them? To be honest, my best answer would be, if you're having issues working with very large files in R, I would consider moving to Python. Not only
can you handle a larger file in Python, but it will be a lot faster than working in R. If that is not an option, I would recommend splitting your files into smaller tiles. I've actually had to do this before, where I'll split a single raster array into multiple rasters, bring one in at a time, run all my calculations on it, remove it from memory, and then load the second tile and keep going that way. So those are one or two potential options. Okay, great. And what I've done right here is type into the Q&A pod two resources for end-user data inquiries, script questions, or any kind of question you have with respect to the data holdings: the User Services office email, which is LPDAAC@usgs.gov, and our more centralized Earthdata support system, where you can submit a ticket to support@earthdata.nasa.gov. Are there additional questions? All right, I'll leave the space open for all of you. So thanks to our speakers, Cole and Aaron, and thanks to all of you for listening in on today's webinar. Thank you everyone. All right, thanks everybody. All right, bye-bye now.
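As a footnote to the memory question above: the tile-at-a-time strategy described there can be sketched as below. The array here is a tiny in-memory stand-in; in practice each block would be read from disk (for example with h5py or GDAL) rather than sliced from an array that is already loaded:

```python
import numpy as np

# Tiny stand-in raster; a real workflow would read each row block from
# disk instead of holding the whole array in memory.
raster = np.arange(16, dtype=float).reshape(4, 4)
block_rows = 2

results = []
for r0 in range(0, raster.shape[0], block_rows):
    block = raster[r0:r0 + block_rows, :]   # bring one tile in
    results.append(block.mean())            # run the calculation on it
    del block                               # drop it before the next tile

print(results)
```

Only one block is resident at a time, so peak memory is bounded by the block size rather than the full raster.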
Info
Channel: NASA Earthdata
Views: 18,857
Rating: 4.9292035 out of 5
Keywords: webinar, land processes, Python, data, MODIS, services, EOSDIS, Earthdata, tutorial, LP DAAC, NASA, Earth science, VIIRS, NDVI, ASTER, land surface temperature, surface reflectance
Id: jDgn1ktZpBU
Length: 60min 42sec (3642 seconds)
Published: Mon May 01 2017