RUS Webinar: Glacier velocity with Sentinel-1 - CRYO01

Captions
Good afternoon everybody, my name is [inaudible] and I will be guiding you through this webinar. We will be looking at monitoring glacier velocity with Sentinel-1 using offset tracking, and our study area is the Petermann Glacier in northwestern Greenland.

Let me first say a few words about the outline of this webinar. If you have attended previous webinars, you know how this goes: first I will say a few words on RUS Copernicus (if you are already familiar with it, we apologise, but this is for the benefit of new users), then I will briefly introduce the study area, then Sentinel-1, which we will be using for the exercise, then the exercise itself, and finally a Q&A session. This should all take approximately one to one and a half hours depending on the Q&A. I would encourage you to submit any questions during the webinar using the question panel in your GoToWebinar panel. We are usually quite a lot of people, so if everybody posts their questions during the Q&A session we might not have time to answer them all; we will therefore try to answer questions as they come in during the exercise as well.

First, as I said, I will briefly introduce RUS. It stands for Research and User Support for Sentinel core products, and it is an initiative funded by the European Commission and managed by the European Space Agency. Its main goal is to provide free and open scalable platforms, in the form of virtual machines, to users who wish to exploit Sentinel data for research purposes, or simply to learn how to use the data for R&D and so on. The environments come in the form of virtual machines, as you will see today, because I will be using one for the exercise. They are pre-installed with a suite of open-source toolboxes such as SNAP, QGIS and many others, as well as development environments such as Python, Eclipse, R and so on. Apart from providing the virtual machines, we also provide a specialised remote sensing helpdesk: if you have any issues with processing the data, or you don't know which data to use, and so on, the remote sensing helpdesk (again, a free service from the European Commission and ESA) can advise you on how to proceed. We also use the virtual machines to provide training activities such as webinars and face-to-face events. We organise webinars such as this one approximately once a month; we always advertise them about two weeks in advance on our web page, on our Twitter account, on Facebook and so on (I will show you the RUS web page a little later). We also run face-to-face events, also announced on the web page, either as part of conferences or as standalone events of up to two days, and they usually include theory and practical exercises with the virtual machines. Here you can see the two web pages: the first one is where you can request a virtual machine, and the second one is the training web page.

Now let me go to the web pages quickly and show you. First, let's have a look at the RUS portal web page, where you can register and apply for a virtual machine if you are interested. This webinar is also available as a training kit on the virtual machines, so if you wish to repeat it, I will give you a code at the end of the exercise that you can use to request a virtual machine with this specific training kit. You also get a step-by-step guide
included in the training kit, so you can use it to practise this webinar and many others. Just to show quickly: here you can read more about what RUS is, its purpose and main aim; here you can read more about the offer (as I said before, the RUS offer is always free); here you can read more about the computing environments, the software installed by default on the virtual machines, and the limitations that apply to the duration, processor number, disk space and so on; and here you can find other resources to learn by yourself. Here you can register and create your account, and then you can proceed to log in. I won't go through all the steps, but I do need to log in in order to open the virtual machine on which I will run this training, so just give me a second. Very well. Once you have your account and you log in, you get this new web page where you can see your profile, dashboard and trainings. From your dashboard you can request any RUS service or virtual machine; you can see that I already requested mine and I have one here. From here you can access it, request training kits to be uploaded, get support from the helpdesk, chat with the support desk, and many other things.

OK, now let me quickly introduce the second web page, which you have probably already seen, because that is where you registered for this webinar: the RUS training web page. If you go to the trainings you can see the upcoming trainings; at the moment I believe we have one event, a face-to-face event in December taking place in Denmark. Unfortunately that one is already fully booked, but new trainings will be coming up soon and you will see them on this web page. You can also go to the past sessions: for example, this was our last webinar, from October, and you can find the recording there, because the entire webinar is recorded and you can play it on the web page or on our YouTube channel. You can also see a summary of the Q&A session, since that part is not recorded on the video, so you can get all the information provided during the workshop as well. You can also have a look at our e-learning, some news, and so on.

Now let's go back to the presentation, very quickly. I was mentioning our YouTube page: this is our YouTube channel, called RUS Copernicus Training, where you can find all the recorded webinars (this is a somewhat outdated screenshot, so there are many more available now). You can also find three more videos covering how to download Sentinel data, how to request a RUS virtual machine, and how to register for RUS Copernicus, plus other information.

That was just a short intro to RUS, so let's move on to a short description of our study area. Today, as I said, we will be looking at the Petermann Glacier in northwest Greenland; we can see it here on the map. It is a large marine-terminating glacier and one of the largest remaining floating ice shelves in the northern hemisphere. It had approximately 70 kilometres of floating ice shelf, and it has undergone quite rapid changes in the last ten years or so. In August 2010 approximately a quarter of the floating ice shelf, about 260 square kilometres, broke off, as we can see here, and another event occurred in July 2012, when approximately 130 square kilometres, roughly twice the size of Manhattan, tore off, again reducing the size quite significantly. We will monitor the velocity of the glacier, how fast it moves, using Sentinel-1. So now, a few words about the Sentinel-1 mission.
The mission comprises two twin polar-orbiting satellites in the same orbit, phased approximately 180 degrees from each other. They are called Sentinel-1A and Sentinel-1B, and they carry an active C-band sensor with a wavelength of approximately 5.4 centimetres. The mission has a very short revisit time, approximately one day or less at high latitudes, and the repeat time, meaning the satellite acquires from exactly the same orbit with exactly the same geometry, is six days. It provides all-weather data acquisition, because it is an active sensor as I said before, and it offers four different imaging modes. The video on the right side of the screen shows a pass in Interferometric Wide swath mode over the Petermann Glacier, which is exactly what we will be using for this training.

All right, that is the last slide I have, so let's move on to the exercise. I will go back to the web page and access my virtual machine from my dashboard. OK, so this is what our virtual machines look like: basically a remote desktop that you access via your internet browser, so the only prerequisite to use this machine is a reasonable internet connection. It comes pre-installed with a lot of software, but you can also install any software you wish, provided you have a valid licence or it is open source. There are some limitations on the proprietary software that can be installed on the machine, given by security measures and so on, but if you are a RUS user and have any specific questions, you can ask us.

Today we will be using SNAP, which I will open here. SNAP stands for the SeNtinel Application Platform, and it is software developed specifically by ESA to process satellite data, in this case Sentinel data. It contains toolboxes dedicated to Sentinel-1, -2 and -3; today we will mostly be using the Sentinel-1 toolbox, of course, since that is the data we will be using for our exercise. If you are familiar with SNAP, none of this will be a surprise for you; if you are not, let me introduce it shortly. There is the Product Explorer, where you can see all the loaded products, and down here you have quite useful panels such as Navigation, which shows you where the open view is located on the full image, the World Map, which shows you the location of the acquisition you have open, and Colour Manipulation, for example.

So let's open some images to see how it looks in practice. Today we will be using two images from the 9th and 21st of September 2017, acquired over the glacier. If I go to the World Map I can see the location of my two images. You can note that they are 12 days apart, which means they were acquired in exactly the same geometry by the same platform: both were acquired by Sentinel-1A. If I wanted a shorter time between them, I would use a Sentinel-1B acquisition, which is six days apart; that is very valuable, especially for very fast moving glaciers, or very fast moving phenomena in general. OK, so these are our two images. They were both acquired in Interferometric Wide swath mode, and they are both Ground Range Detected (GRD) products, a Level-1 product produced by ESA and available globally. Each one contains four bands: amplitude bands in HH and HV polarisation, and intensity bands in the same polarisations. The intensity is derived from the amplitude, which is why it is only saved here as a virtual band: nothing is physically saved, it is calculated on the fly. You can see the two polarisations, so it is a dual-pol acquisition. Let's open the Intensity HH band just to have a quick look at the image. This might take a little while, because even though my machine is actually quite strong and has 32 gigabytes of RAM, the image is rather large. There we go: this is our full Sentinel-1 acquisition in Interferometric Wide swath mode.
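As a side note, the "virtual" Intensity band mentioned above is simply the amplitude squared, computed on the fly. A minimal sketch of that relationship (the DN values here are made up for illustration; this is not SNAP code):

```python
import numpy as np

# Made-up digital numbers for an Amplitude_HH band of a GRD product.
amplitude_hh = np.array([[10.0, 20.0],
                         [30.0, 40.0]])

# The Intensity band is virtual: nothing is stored on disk; it is
# simply the square of the amplitude, evaluated on the fly.
intensity_hh = amplitude_hh ** 2
```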
This brighter area here corresponds to the Greenland ice sheet, here we have the Nares Strait, here we have the top of the glacier and then also the water surface, and these parts here correspond to the bedrock. If you compare this image with the one on the map, you can see that it appears to be flipped, mirrored horizontally. This is because the scene was acquired during a descending pass, meaning the satellite was moving from north to south, and the satellite is always right-looking, which means the first pixel acquired was right here. This image is still in radar geometry, and SNAP always displays such images with the first acquired pixel in the upper left corner, which is why the image appears mirrored, with the first acquired pixel in the upper left corner here. We will deal with this in the next steps.

Now let's move on to the processing. We need to apply identical pre-processing steps to both of our scenes. There are quite a number of steps that you need to perform on Sentinel-1 data, on this particular type of SAR data, before you can use it for your final analysis; of course the steps differ depending on the analysis you are performing, but in this case we have several steps we need to perform. Since it would be quite time-consuming to run each step one by one on each of the images, we will use a very handy tool in SNAP called Batch Processing, which you can find under Tools → Batch Processing. The input to Batch Processing is a processing graph, so first we need to build our processing chain, including all the processing steps we wish to perform in this stage, and then we can run it on both of our images.

When I open the Graph Builder, you can see that I only have two operators, the Read operator and the Write operator, with a tab corresponding to each of them at the bottom of the window, where I can set my parameters. The first thing we need to do when we start using Sentinel-1 data, for the absolute majority of applications, is to update the orbit files. The orbit state vectors provided in the metadata of the SAR product are generally not very accurate; precise orbit files are made available within days to weeks after the generation of the product, and when you are processing the data, SNAP can automatically download this updated orbit information and update the metadata of the product as you downloaded it. So we can first add this operator: right-click, go to Add → Radar → Apply-Orbit-File, and there we have the Apply Orbit File operator, and we can move on.

Next we add the thermal noise removal operator: again right-click, then Radar → Radiometric → ThermalNoiseRemoval. What is thermal noise removal? Thermal noise in SAR imagery is the background energy generated by the receiver itself; it skews the radar reflectivity towards higher values and hampers the precision of the radar reflectivity estimates. In the Level-1 product we are using, the metadata provides a noise look-up table for each measurement data set, and we can use this look-up table to correct for the thermal noise.

The next step is calibration: again, go to Radar → Radiometric → Calibration.
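To make the last two steps concrete, here is a toy, per-pixel sketch of what thermal noise removal and sigma-nought calibration do. The noise (eta) and calibration (A) values below are invented for illustration; in reality SNAP reads them from look-up tables in the product annotations, and the exact formulation is defined in the Sentinel-1 product specification:

```python
import numpy as np

# Toy digital numbers (DN) from a GRD amplitude band, plus illustrative
# values for the noise LUT (eta) and the sigmaNought calibration LUT (A).
# All numbers here are made up.
dn = np.array([120.0, 250.0, 400.0])
eta = np.array([900.0, 900.0, 900.0])      # thermal noise power per pixel
A_sigma = np.array([600.0, 600.0, 600.0])  # sigmaNought calibration values

# Thermal noise removal: subtract the noise power from DN^2 (clip at zero).
power = np.maximum(dn**2 - eta, 0.0)

# Radiometric calibration to sigma0 (linear units).
sigma0 = power / A_sigma**2
```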
Why calibration? A typical SAR processing chain, which produces the Level-1 images we are now using, does not actually include radiometric corrections, so a significant radiometric bias is still present in the data. Radiometric correction is necessary for the pixel values to truly represent the radar backscatter of the reflecting surface. If we wish to compare SAR images acquired with different sensors, or with the same sensor but at different times, in different modes, or processed by different processors, we generally always need to apply calibration to avoid large errors.

Now we need to connect our graph: just right-click and choose Connect Graph, and our graph is created. At this point you could change all the parameters here, but we will not do that; we will set the parameters in the Batch Processing menu instead. So we can now save the graph: click Save, store it somewhere, for example as "my_graph", and close the dialog. Now I go to Batch Processing. Here there is a tab for input/output parameters, and I can use this icon to load all the products currently open in SNAP, which is of course our two images, and then click Refresh to load the properties of each product, so I can see the product type, the acquisition date, the relative orbit (track) and so on. Here I can choose the output directory where I wish to save my outputs, and here, quite importantly, there is an option to keep the source product name for the output product. This matters if, for example, you are saving the outputs in the same folder as your input products: in that case your input products would be overwritten, because the outputs keep the same names. In our case we are saving them in a different folder, so I can leave this selected, and I can load the graph I just created. There we go; I now have all the tabs available here.

In Apply-Orbit-File we can finally set the parameters; actually, here I will not change anything and will keep the default options. You can basically choose between two different orbit file types that are available online; we will use the precise ones. The restituted orbits are generally a little less precise than the precise orbits, but they are available sooner; since our acquisitions are more than a year old, the precise orbits are of course available. We leave everything else at the defaults and go to thermal noise removal. In the ThermalNoiseRemoval tab we again don't really have to change anything; we can select the polarisations we want to process further. In our case we will only be using the HH polarisation, so I click on it, and only HH will be processed. There are two options here, to remove or to re-introduce thermal noise; we of course want the remove option (there may be some rare cases in which you would want the re-introduce option, but not for us). Then we have Calibration: you can see that only the HH polarisation is available here, because I limited it in the previous operator, and I have three output options. We will use Sigma0, the radar backscatter coefficient, which is what we will be working with.

Then you would click Run. This will take a couple of minutes, so I will not actually run the process, as I have the data pre-processed for this purpose; I will just close the dialog and open the pre-processed products. You would simply click Run and see how long it takes, around two minutes or so. Close, and let me open the products; there we go. Once the processing finished, this is exactly how my SNAP would look: I would have two additional products created here, and they would have the same names as the inputs, because remember, I kept the "keep source product name" option.
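For reference, the same chain can be stored as a SNAP graph XML and run outside the GUI with the gpt command-line tool. The sketch below is illustrative only: the input file name is a placeholder and parameter names can vary between SNAP versions, so treat it as an outline rather than a ready-made graph:

```xml
<graph id="Graph">
  <version>1.0</version>
  <node id="Read">
    <operator>Read</operator>
    <sources/>
    <parameters>
      <file>S1A_IW_GRDH_...zip</file>
    </parameters>
  </node>
  <node id="Apply-Orbit-File">
    <operator>Apply-Orbit-File</operator>
    <sources><sourceProduct refid="Read"/></sources>
    <parameters>
      <orbitType>Sentinel Precise (Auto Download)</orbitType>
    </parameters>
  </node>
  <node id="ThermalNoiseRemoval">
    <operator>ThermalNoiseRemoval</operator>
    <sources><sourceProduct refid="Apply-Orbit-File"/></sources>
    <parameters>
      <selectedPolarisations>HH</selectedPolarisations>
      <removeThermalNoise>true</removeThermalNoise>
    </parameters>
  </node>
  <node id="Calibration">
    <operator>Calibration</operator>
    <sources><sourceProduct refid="ThermalNoiseRemoval"/></sources>
    <parameters>
      <selectedPolarisations>HH</selectedPolarisations>
      <outputSigmaBand>true</outputSigmaBand>
    </parameters>
  </node>
  <node id="Write">
    <operator>Write</operator>
    <sources><sourceProduct refid="Calibration"/></sources>
    <parameters>
      <file>target.dim</file>
      <formatName>BEAM-DIMAP</formatName>
    </parameters>
  </node>
</graph>
```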
If I open the Bands folder, I can see that I only have a single band here, Sigma0_HH. Let's simply open it: it actually does not look much different visually from our original data, but we can use it for our further processing.

In the next step we need to coregister our data and apply the offset tracking method. Again we will build a graph, but this time we will use the Graph Builder itself to build and apply the graph, because we will input both of our products. So: we have one Read operator here, and we add another one, so we can read both products. Then we add the DEM-Assisted-Coregistration operator, found under Radar → Coregistration, and connect both Reads to it. Image coregistration is the process of geometrically aligning two or more images so that corresponding pixels represent an identical area on the Earth's surface; that is what we are trying to do here. It is possible to coregister two or more products using only the orbit state vectors that we updated in the previous step; however, for the purpose of offset tracking we need a more precise coregistration, so we are using DEM-assisted coregistration, which, as the name suggests, also uses a digital elevation model to help improve the coregistration accuracy.

In the next step we will add a Subset, since we are not really interested in the full extent of our image, and the full scene also increases the processing time. Go to Raster → Geometric → Subset to add the Subset operator; I will not set its parameters down here just yet, I will first add all the processing steps. The next step we need is the Offset Tracking, available under Radar → SAR Applications → Offset Tracking; I will explain a bit more about offset tracking once we are setting its parameters. To finish the graph we add one more Write operator, Write(2), then we connect our Subset to the Offset Tracking operator, the Offset Tracking to the second Write operator, and also the Subset directly to the first Write. This is because we want one product that contains the original two Sigma0 bands, and separately the result of the offset tracking. The offset tracking result contains a vector file with all the ground control points, with their location in the first image and in the second image and the estimated velocity between the two positions. Since we wish to retain this vector data, we cannot use the band merge operator here, because the vector would be lost during the merge; instead we are going to stack the two products later, and for now we export the offset tracking result separately, together with the subsetted and coregistered original data.

OK, so let's go through the parameters. In the first Read we set product number 3, which is the first pre-processed product, from the 9th of September; in Read(2) we choose product number 4, the one from the 21st of September. Then we go to DEM-Assisted-Coregistration. Here we can leave all the defaults; we just need to change the digital elevation model. You can see it is actually screaming at me here in red letters that the entire image is outside the SRTM valid area: SNAP tends to automatically use SRTM 3-second resolution, but unfortunately SRTM is not available at such high latitudes. So we need to use a different digital elevation model; in our case we will use ACE30. These DEMs are downloaded automatically by SNAP, so you don't need to have them stored on your computer or pre-download them in any way, at least for the ones marked Auto Download; for the ones that are not, like the SRTM 1Sec Grid, you have to download them yourself and put them in a specific location in the SNAP auxiliary data folder. For us, we will just use the auto-download ACE30, and
then there are two more parameters to set here: the DEM resampling method and the resampling method of the image, plus the tile extension and whether to mask out areas with no elevation. The areas with no elevation in our case would of course be the sea surface, but we can leave this option as it is, and we will also leave all the defaults for the other settings.

Then we can go to the Subset. In the Subset operator you have two options for defining the subset: pixel coordinates or geographic coordinates. For our case we will use pixel coordinates, because at this point, once we have performed the DEM-assisted coregistration, we only have one product, resampled to exactly the same grid, so if we wish to subset it we can use the pixel coordinates directly; we do not need to compare multiple different products. However, if you have multiple products that are not resampled to the same grid, for example several Sentinel-1 products that are not georeferenced, and you wish to subset the exact same area, I would advise you to always use the geographic coordinates, which are given as well-known-text (WKT) polygons and can be visualised on the map here. Those are absolute geographic coordinates, as opposed to the relative pixel coordinates, which do not correspond to any physical geographic location. Now I will just enter the coordinates we want to use; that of course always depends on your area of interest. For us, I minimised the extent to make the image a little smaller. Then we can move on to the Offset Tracking.

You can now see that I have two bands here, one corresponding to the master image, the first acquisition on the 9th of September, and one to the 21st of September, and we will use these two to perform the offset tracking. Offset tracking is a method that estimates the motion of features between two acquisitions through cross-correlation on selected ground control points in coregistered images. We have the master and the slave image, the movement is computed based on the offsets estimated by the cross-correlation, and the velocities computed on the GCP (ground control point) grid are interpolated to create a velocity map. This method is very commonly used for glacier motion estimation. We need to fill in a few parameters to perform it.

First we have the output grid, the ground control point grid I mentioned before. At the moment it has a 40-pixel spacing, which corresponds to 400 metres; we will increase this to 60 pixels, or 600 metres, in both the azimuth and range directions. This is essentially a balance between the level of detail and the smoothness of our output product, and 600 metres is sufficient for us. It is always a trade-off between smoothness and level of detail, and it very much depends on the level of detail you are trying to achieve; however, with more detail you will also get more outliers and more wrongly estimated velocities.

The next thing to set is the registration window size, right here. The size of the registration window depends on the maximum velocity of the glacier, which you should generally find out prior to any processing from literature or historical data, and on the period between the acquisitions. In our case this is 12 days, and the maximum speed of the Petermann Glacier is approximately 5 metres per day, which means the glacier surface will shift by a maximum of about 60 metres between the two acquisitions. Here we have the window set to 128 pixels, which corresponds to 1280 metres.
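The window-size reasoning above is easy to check with a few lines. The 10 m ground pixel spacing is an assumption consistent with the numbers quoted in the webinar (40 px = 400 m, 128 px = 1280 m):

```python
# Values quoted in the webinar for the Petermann case.
pixel_spacing_m = 10.0        # assumed IW GRD ground pixel spacing
days_between = 12             # 9 -> 21 September 2017
max_speed_m_per_day = 5.0     # from literature / historical data

# Largest plausible surface displacement between the two acquisitions.
max_shift_m = max_speed_m_per_day * days_between

# Size of the chosen registration window on the ground; it must
# comfortably contain the largest plausible shift.
window_pixels = 128
window_m = window_pixels * pixel_spacing_m
```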
This more than generously covers the 60-metre shift we expect over the 12 days. The last setting we will use is the maximum velocity. I have already mentioned that for the Petermann Glacier this is approximately 5 metres per day, and we set it here in order to filter out falsely high values, basically the outliers.

Just to explain how the offset tracking processing is performed: for each point in the user-specified GCP grid in the master image, the operator computes the corresponding pixel position in the slave image using normalised cross-correlation. If the computed offset between the master and slave GCP positions exceeds the maximum offset derived from the user-specified maximum velocity, the GCP point is marked as an outlier. We then perform a local average of the offsets on valid GCP points, i.e. those not classified as outliers, and fill the holes caused by the outliers by computing a local weighted average for the missing points. Then the velocities for all points of the GCP grid are computed from their offsets, and finally the velocity for each pixel in the master image is interpolated from the velocities at the GCP points. I hope that was coherent.

Then we can write the outputs. For the offset tracking output we will use the Write(2) operator; we can leave the default name here. You can see that the processor always attaches a suffix to the name: we have a "_stack" created by the coregistration and a "_velocity" created by the offset tracking operator. The other Write operator is writing our subsetted product from the beginning of the chain; the order of the operators down here follows the order in which they were added, so the first Write operator belongs to the subset branch, and we can see that it has no velocity, just the two stacked Sigma0 bands.
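The procedure just described can be illustrated with a toy, integer-pixel version of offset tracking. This is a sketch of the idea only, not SNAP's actual implementation; the image, pixel spacing and time span are synthetic:

```python
import numpy as np

def ncc(a, b):
    """Normalised cross-correlation between two equally sized windows."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return 0.0 if denom == 0 else float((a * b).sum() / denom)

def track_offset(master, slave, row, col, win=5, search=4):
    """Find the integer (drow, dcol) shift of the window around (row, col)
    that best matches between master and slave, by exhaustive NCC search."""
    h = win // 2
    ref = master[row - h:row + h + 1, col - h:col + h + 1]
    best, best_off = -2.0, (0, 0)
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            r, c = row + dr, col + dc
            cand = slave[r - h:r + h + 1, c - h:c + h + 1]
            score = ncc(ref, cand)
            if score > best:
                best, best_off = score, (dr, dc)
    return best_off

# Build a toy "master" scene with a bright feature, and a "slave" scene in
# which everything is shifted 2 pixels down and 1 pixel right.
rng = np.random.default_rng(0)
master = rng.normal(0, 0.1, (40, 40))
master[18:23, 18:23] += 5.0                  # bright feature
slave = np.roll(np.roll(master, 2, axis=0), 1, axis=1)

drow, dcol = track_offset(master, slave, 20, 20)

# Convert the pixel offset to a velocity, as the Offset Tracking step does.
pixel_spacing_m = 10.0   # assumed ground pixel size
days = 12.0              # time between the two acquisitions
shift_m = float(np.hypot(drow, dcol)) * pixel_spacing_m
velocity_m_per_day = shift_m / days

# GCPs whose implied velocity exceeds the user-set maximum are outliers.
max_velocity = 5.0
is_outlier = velocity_m_per_day > max_velocity
```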
we just have two stacks which okay so we can now click run again this process is very very computationally heavy so it would take approximately 20 minutes for the data that I have here with the computer that I have and so in this case if you re running this webinar on a computer that does not have 32 gigabytes of RAM you have to be patient because this is going to take a long time to process due to the cross-correlation steps and the courageous tration as well so we would click run button my case I will just close you can also save the graph prior to running it or after running it to be used for for your future studies as you can of course always change the parameters and apply this to a different study area so I will now close and just open the pre-processed data there we go so what was my processing has finished I have two new products here so they're as the index five and six velocity and the simple stack here and the velocity so now we can just have a look at the velocity product so if I open it you can see the ground control point grid overlaying the image and the ground control point grid is saved here in the vector data point so this is the current control points but it's actually this one that one called velocity because for each of these points we also have the offset and estimated velocity so now I can actually turn it off just a little bit better visualization so I can do that in layer manager here I can see the product structure and I can go to vector data and select velocity now I can have a better look on my on my estimated velocity and I can go to color manipulation just to see my maximum and minimum values so I can see that I have some outlying values that are minimum here of 0.23 and then I have some maximum values that I detected in this acquisitions are approximately 4 meters per day corresponding to the red value is here and they're floating so now the next step what we want to do is we want to actually stack the products so we want to add these 
Now the next step: we want to stack the products, adding bands so that we have one overall product that contains the velocity band, the velocity vector and also the two original Sigma0 HH bands. We can do this in many different ways, but in order to keep our velocity vector, which would otherwise disappear, we will use the Band Maths. If you right-click on the product here, you can take the first option, Band Maths, and here we change the name to Sigma0 HH, similar to the original name, just to keep track. I want you to deselect the Virtual option here: if the band were virtual, the data would not be saved physically in my data set and would always be calculated on the fly, which would not be possible for what I am about to do, because I will be using data from a different product. Then I go to Edit Expression, and as I said, since I will be using data from a different product, I have to change the data source here to the second product, number six. This only works for products that have exactly the same grid and the same extent; only then can you use bands from different products to calculate an output. In this case we will use the first band, from the 9th of September, which is already selected here, and we can see the software can find it: it tells us there are no errors, so we click OK, and OK again. I can see that this band has now been added to my velocity product. It always gets overlaid by the vector again; you can turn that off, but I don't have to do it now. I will add the second band as well: I go again to Band Maths, again deselect Virtual, in the expression again change product 5 to product 6, choose the second band, and OK. Now I have one product here which contains my input data as well as my velocity band and velocity vector, and this is the one that I will use for any further processing. The next step that we need
to apply is the terrain correction. Why the terrain correction? Our data are still in radar geometry (you can see that the image is still flipped), and due to the topographical variations of the scene and the tilt of the satellite sensor, distances can be distorted in the image, so we apply a terrain correction to compensate for this distortion and project the scene to a geographic projection. We can do this by going to Radar, Geometric, Terrain Correction, and we choose the Range Doppler Terrain Correction. Here we choose our input product, number 5, the one that contains all our bands and the vectors; we choose the output name, and we set the processing parameters. We can see all three bands are available here. We also choose the digital elevation model used to correct for the terrain distortions; again SNAP automatically chooses the SRTM, but as I said before, that would not work here, so again we will use the GETASSE30. The rest of the parameters we do not have to change: there are options for the DEM and image resampling methods, which we leave as they are, and there is also an option for the pixel spacing, so if you wish to change it from the current 10 metres to, say, one hundred, you can do that to reduce the resolution of the image. The last thing we can set is the map projection. In our case we will use UTM with the World Geodetic System from 1984, and we will use the automatic option, which determines the UTM zone from the location of our data; in this case it chooses zone 21. We can also mask out the areas without elevation, which here removes the ocean. Finally, you can choose which other bands should be present in your output: in our case we just choose the selected source bands and do not add any others, though you could also include the digital elevation model, the latitude and longitude, the incidence angle and so on.
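As a rough sense of scale for why the DEM matters here: in side-looking radar geometry, a point at elevation h that is geocoded assuming zero height ends up displaced on the ground by roughly h / tan(θ), where θ is the local incidence angle. A back-of-the-envelope sketch (this is the standard flat-Earth approximation; the numbers are purely illustrative):

```python
import math

def geolocation_shift(height_m, incidence_deg):
    """Approximate horizontal mislocation of a point at elevation height_m
    if it were geocoded assuming zero elevation, for a side-looking radar
    with the given local incidence angle (flat-Earth approximation)."""
    return height_m / math.tan(math.radians(incidence_deg))

# A 500 m high point seen at a 35-degree incidence angle is misplaced
# by roughly 700 m on the ground if no DEM is used.
shift = geolocation_shift(500.0, 35.0)
print(round(shift))
```

Over the steep fjord walls around Petermann this error would be far larger than the 10 m pixel spacing, which is why the terrain correction needs an actual elevation model rather than a flat surface.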
Once we have all our parameters set, I click Run again. This would take approximately two minutes, not very long if you are doing it on your own, but for webinar purposes not really what we want, so I will close this and again load the processed images from a saved session. Using these session files you can basically save the set of products opened in your SNAP; if you are experienced with QGIS, it is somewhat like a project in QGIS, except that it unfortunately does not save any visualizations or opened views, only the loaded products. So if you know, for example, that you need to open the same products every time to perform different things, you can use this. Sorry, I clicked on the wrong function; let me open this one instead. There we go, sorry for that. So I have now closed the two original products and the two preprocessed products, and what we are left with is products five, six and seven, seven being the one we have just processed and applied the terrain correction to, hence the Terrain Correction suffix. If I go to the bands I can see all my three bands, and in the vector data I can see the velocity, which is important. Now let's open the Sigma0 HH band. You can see that the image is now shifted: the product is projected in UTM. We can close the velocity vector again, because that is what is making a little bit of a mess here, and now our image is finally oriented correctly; if I look at a map, we are approximately somewhere here, facing sort of northwest. We can now play with the visualizations. What I can do is load the velocity map over my original data: you go to the Layer Manager and use this plus sign to add an image band or a band from another product, and in this case it gives you the option to choose from the other bands present in the product.
So I can choose the velocity here and click Finish, and it will overlay the bands on top of each other. Sometimes you have to move the image around a little in order for it to be loaded completely into SNAP. Oh, now I lost the top again, sorry about that; I can go to Navigation here and click Zoom All. SNAP generally loads products in tiles, and with a big product like this it sometimes creates some problems. OK, now I finally have our product, and you can see that the visualization applied to the velocity layer basically gives the zero values total transparency, so I actually see the higher values corresponding exactly to the glacier, to the ice shelf, to the area with the highest velocities or highest movement, and also to the tributaries on the side here. This is one way you can visualize the data; you could also overlay it on Sentinel-2 data, for example, for an optical background, and so on. Of course SNAP is not an ideal tool to visualize your data; it is not made for that, it is made for processing. If you wish to export the data and visualize them in different software such as QGIS, ArcGIS or any other, you can do that. At the moment we have the data saved in the BEAM-DIMAP format, which is the native format of SNAP, but you can always export your data into other formats. For example, if I click on the product here (I always select the whole product, so I click on the one here) and go to Export, I can choose from a high number of formats, such as GeoTIFF. If I click on it, I can also subset my data and produce a subset: if I don't want all of the bands and so on, I can do a spatial subset, a band subset and many others. I do not show these steps here, but it is just for you to know that SNAP is not ideal for visualization and you can easily transfer the data to different software.
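The transparency trick used here, making zero-velocity pixels see-through so the backscatter underlay shows, can be reproduced after export as well; for example, with an exported band loaded as a NumPy array, masking the zeros before plotting has the same effect. A small synthetic sketch (the array contents are made up):

```python
import numpy as np

# Synthetic "velocity" band: zeros outside the glacier, positive on it.
velocity = np.zeros((5, 5))
velocity[1:4, 2:5] = [[0.5, 1.2, 2.0],
                      [0.8, 3.1, 4.0],
                      [0.4, 1.0, 1.5]]

# Mask the zero (no-motion / no-data) pixels so an underlying backscatter
# image would show through wherever there is no measured velocity.
masked = np.ma.masked_equal(velocity, 0.0)

print(masked.count())        # number of visible (non-transparent) pixels
print(float(masked.max()))   # fastest flow in the scene, m/day
```

Plotting libraries such as matplotlib render masked elements of such an array as transparent, mirroring what SNAP does in the Layer Manager overlay.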
Now the last thing that we will do during this training is to see how good our velocity estimate actually is: how well does the offset tracking on these two images perform compared to other data sources that are out there? I have prepared a file that contains data from two sources. One of them is data from the Centre for Polar Observation and Modelling (CPOM) data portal; these are near-real-time ice velocity maps, produced by offset tracking as well, on Sentinel-1 images between two consecutive acquisitions, in this case meaning acquisitions of both Sentinel-1A and Sentinel-1B, so the period between the acquisitions used for the offset tracking is six days. The data that I will be using here are from almost exactly the same period as our input images: they cover the 15th to the 21st of September 2017, whereas our results correspond to the 9th to the 21st. The second data set contains values extracted from the ENVEO Greenland ice velocity map for 2016/2017. Those are derived also using feature tracking on Sentinel-1, and like our product the ENVEO velocities use a 12-day period between images, but the final product is generated by pixel-wise averaging of velocities estimated from image pairs acquired between the 23rd of December 2016 and the 22nd of February 2017, so it does not correspond to exactly the same period that we have here, and it is resampled to a 250-metre grid. OK, so how can we compare our values? I prepared a CSV file that we can load into SNAP: we click Vector, Import, Vector from CSV, and select the Petermann glacier velocity file. It is basically a table that has the locations of the points and the parameters corresponding to them, such as the velocity values for each of the data sets. I will use a predefined CRS: I need to tell the software which projection my data are in.
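Conceptually, the imported comparison file is just a table of WGS84 (EPSG:4326) longitude/latitude points with one velocity attribute per reference data set. A minimal parse of such a table (the column names and values here are made up, not the actual training-kit file):

```python
import csv, io

# A tiny stand-in for the comparison file: WGS84 lon/lat track points with
# one velocity column per reference data set (hypothetical column names).
text = """lon,lat,cpom_vel,enveo_vel
-60.85,80.55,3.9,3.8
-60.70,80.62,3.1,3.0
-60.55,80.70,2.4,2.5
"""

points = [
    {"lon": float(r["lon"]), "lat": float(r["lat"]),
     "cpom": float(r["cpom_vel"]), "enveo": float(r["enveo_vel"])}
    for r in csv.DictReader(io.StringIO(text))
]

print(len(points), points[0]["cpom"])
```

Because a CSV carries no projection information of its own, the importer has to be told the CRS explicitly, which is exactly the dialog SNAP shows next.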
My data are in WGS84 lat/lon, which has the EPSG code 4326, the geographic lat/lon projection based on the World Geodetic System from 1984, and I click OK, and OK again. Now it asks how I want to import the point data, that is, how to interpret them: I can leave them unchanged, which means they will be imported as a plain point data set; I can interpret the points as vertices of a single line; or I can interpret each point as a track point, which is the option we will select. There we go: here we have our track points corresponding to the flow of the glacier, basically a flow profile along the flow direction of the glacier, and we can now compare it with our data set. For the comparison we can use the profile plot tool here. I click on the profile plot; sorry, first I actually need to click here on the velocity band, in order to tell the profile plot which input data I want to use. Then I also want to use a region of interest, which in this case will not be the velocity grid but the Petermann Glacier velocity points that I just imported, so it will use the exact same points as my comparison data set. I also want to use correlative data from that same data set, comparing first with the CPOM six-day velocities, so I select that field, and I do not want to compute in-between points. We can now see the two data sets compared. The blue one corresponds to the one we just derived; it is a little bit smoother, because the values for the pixels are interpolated between GCP points that are further apart, 600 metres, compared to the CPOM values, which I believe are on a 100-metre grid. There is also much higher variability in the CPOM data set than in our data, but otherwise they correspond quite well to one another.
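The same profile comparison can be done numerically outside SNAP, for example by interpolating the reference profile onto our sample positions and computing a root-mean-square error. A hedged sketch with synthetic numbers (not the actual CPOM or webinar values):

```python
import numpy as np

# Synthetic along-profile distances (km) and velocities (m/day) for the
# two data sets; values are illustrative only.
dist_ours = np.array([0.0, 2.0, 4.0, 6.0, 8.0])
vel_ours = np.array([0.4, 1.1, 2.6, 3.5, 3.9])

dist_ref = np.array([0.0, 1.0, 3.0, 5.0, 7.0, 8.0])
vel_ref = np.array([0.5, 0.8, 1.9, 3.0, 3.8, 4.0])

# Interpolate the reference profile onto our sample positions, then compare.
ref_on_ours = np.interp(dist_ours, dist_ref, vel_ref)
rmse = float(np.sqrt(np.mean((vel_ours - ref_on_ours) ** 2)))
print(round(rmse, 2))
```

A small RMSE relative to the several-metres-per-day signal is what "correspond quite well" means quantitatively; the same approach works for the ENVEO comparison below, just with a different reference column.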
Now let's compare with the ENVEO data set, which we can select here. We can see the data correspond very well, even though they cover different time periods: our data are from September 2017, whereas the red ENVEO data set is from the winter of 2016/2017, but you can see that the velocities are really very similar. So our method seems to work very well for this estimation, and it is relatively simple; you can now basically take the data and process them further, create maps, and so on. This is where I will end. You can of course do many other steps, such as visualizing the data in QGIS; some of those steps are outlined in the step-by-step guide that is available if you register for the virtual machine, and if you request this training kit you will also receive the guide, which contains some additional steps on how to visualize the data in QGIS with arrows and so on. This is the end of the exercise, and I hope you enjoyed it. Thank you for your attention; we will now close this webinar. Thank you very much, and have a nice afternoon.
Info
Channel: RUS Copernicus Training
Views: 4,189
Rating: 5 out of 5
Keywords:
Id: HjCpMstVTVo
Channel Id: undefined
Length: 56min 53sec (3413 seconds)
Published: Mon Nov 12 2018