Virtual Workshop 2021: Session 5, Talk 2: Bathymetry and coral/seagrass mapping in Google Earth Engine

Captions
I just wanted to show everybody how we can map shallow-water environments in the coastal zone. I'm aware that there's a variety of technical people here, GIS or remote sensing analysts as well as policy makers, and the point of this workshop is to show you a relatively simple way to do this from pretty much wherever you are in the world, on whatever laptop you have.

Before we begin, a bit about where I've come from in this project. I've previously done a lot of work on the mapping of Turneffe, which was the main topic of my master's project, where I led the habitat mapping using satellites. From that work I moved on to map the entire Belizean shelf, and that work is due to come out later this year. Some of you may also know of the projects we're doing out in the Caribbean with shoreline change mapping in EO4SD; I've been doing that for NOC, so you may have seen this data as well. If you're interested in shoreline change (we've got a map here showing our main locations around the world), just head over to the data portal and you can see what we've been producing.

So, what we're going to do today: we're going to discuss what Google Earth Engine is and explore the Code Editor platform it uses. This is slightly different from mainstream GIS software like ArcMap or QGIS; it's a little more technical and uses scripting. Don't be daunted by that: I'm very new to coding and scripting myself, so I've made this a simple-to-follow workflow that you can run online. We're going to go through a simple classification, with cloud and land masking similar to what Tim just went through, and a median composite. We'll touch on satellite-derived bathymetry, because this is going to be one of the inputs to the classification. We're going to build some training data, then actually carry out the classification, and also an accuracy assessment. The main thing here is just to show you how we can do these things and give you a baseline understanding. After this presentation, hopefully in the next week or two, I will share with you the instruction manual, which is a really detailed step-by-step guide, so those of you who are more technical can work through it in your own time. This talk is going to be about 55 minutes, and there's a Q&A if anyone has any questions afterwards, so hopefully we can finish a bit earlier if this session doesn't go on as long as the other ones.

So, what is Google Earth Engine? This is quite a new and, I think, revolutionary part of satellite data analysis: a cloud-based geospatial processing platform. What does that mean? A lot of the time when you're using satellite data you have to download all the data onto your laptop and do your analysis there, and if you don't have a good laptop it can be quite painful; you could be sitting there for quite a while waiting for the processing. Google Earth Engine skips all of that pain by letting you use Google's own servers, top-of-the-range servers where you can process large amounts of data that are all held online. They also have some really nice tools where, if you're working on satellite data with other people, you can share your work routines online as well; you'll see that when we go through it soon. Today we're looking at Sentinel-2 data, from one of the most common optical satellites that people use to monitor the Earth, and on Google Earth Engine they have surface reflectance data, which is what we're using today: data corrected for the atmosphere.

Because I want this presentation to be more interactive, I encourage you to visit the first link that I sent you in the chat, which is called "Explore Sentinel-2 images". If you just open that up... Tim, can you see this Chrome window or not? "No, I'm afraid not at the moment." OK, I'm just going to have to stop the share and reshare this. Right, so if you follow this link I'll talk you through exactly what I'm seeing on the screen, and for those of you who haven't signed up to Google Earth Engine, don't worry, you can just watch me. Here we've got a map of the US, and we're just going to explore some Sentinel-2 data. On the bottom left we've got "select a point on the map", so I encourage you to zoom in to wherever you want in the world, wherever you're interested in or living. In this case I'm going to go to Belize, because that's where my example is, so I'll click to place a point on the map. Then I'm going to select the target date; you can choose whatever you like, but I'm just going to go with the one that's already here, March the 17th. For point three I'm going to select an interval of six months, and then a maximum cloud cover of 60. If you click on "browse images", there's a panel on the right, which you can enlarge by scrolling your cursor in the middle of the screen, and this shows all the satellite data at this point (or at your point, if you've picked a different place on the Earth) around the target date. As you can see, this is quite a handy tool for scanning through lots of data in a short space of time. You'll notice there are quite a few images here that are a bit cloudy; this one in particular is quite good, so if we want a more detailed look we can add it to the map, zoom in and have a little explore. It's quite a handy tool if you just want to visualize some satellite data of your area.

I'm going to bring this back to the PowerPoint now, so if you just come back to the PowerPoint with me as well, we'll take you to the next stage. "We're in presenter view again, I'm afraid." Sorry, we're back on presenter view. OK, let's just try this. "There you go." Is that right? "Yeah, that's grand." Lovely. Hopefully you're all back with me now. As we noticed, there were quite a few images in there with lots of cloud, and this can be a real pain if you're trying to use satellite data within a certain time period. To create some usable satellite data that we can map, we can go through a stream like this: we gather the archive; we might instantly remove these two images because they've got loads of cloud in them; then we mask the remaining cloud in all the available images and merge them all together into a median composite, which I'll discuss with you in a second. This example on the right is a median composite of data from 2020 over the Belizean shelf.

I touched on creating a median composite there; what does that mean? On the left here I've got five satellite images, and I'm picking out one single pixel, a 10 by 10 metre pixel. The values in the middle represent the brightness of that single pixel in a single band for each image: for image one we've got a value of 261, then a value of 267, and so on. You'll notice that image 5 has a really high pixel value, and this represents the brighter areas, which are the cloud. When we take a median composite, we effectively put all these values for a single pixel in order and take the middle value, so in this case it will be 261.
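The per-pixel median logic just described can be sketched in a few lines of plain Python. This is an illustration of the idea only, not actual Earth Engine code (in the Code Editor you would call `median()` on an `ee.ImageCollection`); the two pixel values not spelled out in the talk are assumed to be 250 and 254 so that the arithmetic matches the numbers quoted.

```python
# Minimal sketch: per-pixel median composite across an image stack.
# Each "image" is a 2-D list of pixel values for a single band.
from statistics import median

def median_composite(images):
    """Take the per-pixel median across a list of equally sized images."""
    rows, cols = len(images[0]), len(images[0][0])
    return [[median(img[r][c] for img in images) for c in range(cols)]
            for r in range(rows)]

# The five single-pixel values from the talk; image 5 is bright cloud.
# (250 and 254 are assumed values for the images the talk skips over.)
stack = [[[261]], [[267]], [[254]], [[250]], [[11073]]]
print(median_composite(stack))  # the middle value, 261, not the cloudy 11073
```

Sorting [250, 254, 261, 267, 11073] and taking the middle entry gives 261, the benthic signal, whereas the mean of this stack is 2421, dragged upwards by the single cloudy observation.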
Now, another way of merging this data together would be to take a mean, but if we did that we'd end up with a value of 2421, which is not what we want; we want the bottom cover, the benthic habitat cover, which is around 250 here, so a mean is just not appropriate.

If I bring you to another example and we change our pixel to somewhere else in the map, you'll see in the middle column we've got a different mixture of pixel values: three values, for images one, two and five, all around 3,800. So when we come to take the median on the right, once we put them in order, we're going to extract a value of 3798, which is obviously not what we want, because that is cloud. How do we combat this? We're going to have to mask out these areas of cloud and get rid of these high values, so that we just end up with 254 and 213, and we can use that data in the final composite.

So how do we create the mask? When you download any satellite data, what comes with it is a very basic cloud mask, shown on the left here: these zones of blue are what ESA provide with the Sentinel-2 data, and as you can see it's not adequate; there are large amounts of cloud here that aren't masked out. More recently they've developed a cloud probability layer, which they provide alongside the imagery. We've got a short video on the right here showing the cloud probability layer, and we just use a threshold to say anything above a certain value is cloud. If we apply this to all of the images in our analysis, we can then produce a much more effective and smoother median composite.

We're looking at shallow-water environments, so, similar to what Tim was going through earlier, we're going to need to develop a land mask as well. For that I'm just using the near-infrared band.
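The cloud-probability masking amounts to discarding any observation whose cloud probability exceeds a threshold before taking the median. A minimal plain-Python sketch of that idea, again not Earth Engine code; the 65% threshold and the per-image cloud probabilities below are assumed for illustration (the talk does not give specific values):

```python
# Sketch: mask out pixels whose cloud probability exceeds a threshold,
# then take the median of the surviving (clear) values for that pixel.
from statistics import median

CLOUD_PROB_MAX = 65  # illustrative threshold, in percent

def masked_median(values, cloud_probs, threshold=CLOUD_PROB_MAX):
    """Median of values whose matching cloud probability is below threshold."""
    clear = [v for v, p in zip(values, cloud_probs) if p < threshold]
    return median(clear) if clear else None

# The talk's second example: three cloudy observations near 3,800 would
# dominate a naive median; masking leaves the true seabed values.
pixels = [3798, 254, 3811, 213, 3805]
probs  = [98,   5,   96,   3,   99]   # assumed cloud probabilities
print(median(pixels))                 # 3798 -> cloud, the wrong answer
print(masked_median(pixels, probs))   # 233.5 -> median of 254 and 213
```

With the cloudy values masked, only 254 and 213 survive, and their median is a plausible benthic value rather than cloud.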
This is the band on the left here, and this is what it looks like: the darker areas are the water, where the radiation from the sun is being absorbed and none is reflected back to the satellite, which is why it looks a lot darker. When you compare that to land and put a histogram together, you can see there are lots of water pixels with very low reflectance, so we can quite easily say this is land on the right of the histogram and this is water on the left, and that's going to be our threshold, so we can restrict our analysis to the shallow-water environments.

So how do we actually go about analyzing the satellite data and producing these products to then make a classification at the end? This is where Google Earth Engine really comes into its own. This is what it looks like (it should be the same in any internet browser), and you've got four main windows. The window on the top left is similar to the file explorer on your computer: you have a series of files, which are the scripts I've built in this case and which we're going to go through in a minute, and also your Assets, which are your datasets, or your maps at the end of it. You then have a central window, which is your script, where you run a lot of your tools. On the right we have a Console, where you print your outputs, and Tasks, where you export data to your computer. At the bottom is the map, where you can visualize your outputs and have a play with your data to see what's going on.

To make it easier, I've created a series of scripts that all look fairly similar. This is an example, the Create Land Mask script. Right at the top of the script we've got the imports, which can be your own data or data that we create in the map view, and for each script you'll have a couple of requirements: the imports needed.
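The land-water thresholding described above can be sketched in plain Python: keep pixels whose near-infrared reflectance sits above some cutoff as land. The 0.08 cutoff below is purely illustrative (in practice you would read it off the histogram between the two peaks); this is not the actual script's code.

```python
# Sketch: build a land/water mask by thresholding a near-infrared band.
# Water absorbs NIR, so water pixels sit at the low end of the histogram.

NIR_THRESHOLD = 0.08  # assumed surface-reflectance cutoff

def land_mask(nir_band, threshold=NIR_THRESHOLD):
    """Return True for land pixels, False for water pixels."""
    return [[value > threshold for value in row] for row in nir_band]

nir = [[0.02, 0.03, 0.21],
       [0.01, 0.05, 0.35]]
print(land_mask(nir))
# [[False, False, True], [False, False, True]]
```

The water pixels (False) are what the shallow-water analysis keeps; the land pixels (True) are masked out.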
As you can see here, for this land mask script to run we've got two requirements: one is a point of interest somewhere in the world, and the second is a boundary for the area you want to look at. The second half of the script window is your parameters: the green, commented-out section lists all the parameters we're going to need, such as the start and end date, the maximum cloud probability (that's the threshold I was talking to you about earlier), a near-infrared threshold, and some output files.

Now that I've shown you that, if you've signed up you can follow me via the second link in the chat box. I'm just going to stop sharing and reshare my browser, and this will open up the same window that you'll have if you follow the link, so it should look similar to this. I'll give you a few seconds if you are following along. Also, if anyone has any questions about what I'm doing, or if I haven't explained something well, feel free to put something in the chat box and I'll try to answer where I can.

For those of you who have opened this window, you'll see exactly the same format we went through a minute ago, and on the left there's a "users" section; it should show your username and the mapping workshop that we're going through today. I'd like you to click on "Visualize Sentinel 2", then click the Run button in the scripts window in the middle. At the minute we haven't defined a point of interest, so, just like we did before, zoom in to anywhere in the world (in this case I'll head back to Belize), click on the point marker tool in the map view window on the left, and click on a point in the map wherever you're interested in. You'll then see an import appear in the scripts window; in this case it says "geometry", but we want to rename that to "poi", for point of interest. This is the only requirement for this script to run, plus a few parameters that I've already put in there: we're just looking at some satellite data from 2020, and we're using a cloud threshold of 40, so any images in the collection above 40% cloud won't be included. We then run this, and in a minute you'll see the image that we've created. As you can see, we've got a fairly nice satellite image here, though if you're in a cloudy area of the world you might see some cloud artifacts still in the image; this is why we need the masking process I mentioned.

Next, go to your file-explorer-style window in the top left and you'll see "Sentinel 2 Difference". Click on that (we'll abandon our changes for now), and again we need to put our point of interest in, so head to the map view, place a point (I'm going to point at Belize again), rename it "poi", and run the tool again. This one takes a bit longer, but you'll see that on the left we've got the same image that we had before, without any mask applied to the images, and in a minute you'll see an image on the right where we've masked every image in the collection and then taken the median composite. You can then use the central slider to see the difference between the two. While this is still loading, you can see that the left side has still got clouds in it; you can still see these little artifacts, which make the features on the ground harder to distinguish. If I slide this across you'll see the difference between the two: in the image on the right you'll start to see more clearly defined features on the ground, and in this case we can see seagrass in a bit more detail. This is the product that we want to work with.

Now, if you have been working through this, come back to the Zoom call and I'll explain the next few scripts.
They need a few more inputs and it's a bit more complicated. We've got a couple more scripts on the left-hand side, and two of these are bathymetry. These are similar to the methods Tim described earlier: one is the Stumpf method that Tim went over, and the other is Lyzenga (1985). Both of these require some bathymetry data, and here that's just some bathymetry points we've collected from lidar data. Once you've uploaded your bathymetry points, you can run the tool and it will give you a couple of bands to explore, which will be exported to your Assets over here. I've done this previously, and here we've got the bathymetry layers, so we can have a quick look at those as well if I just make this a little bit bigger. In the map view we've got some bathymetry data; it's not very well picked out here, but we can stretch this data a bit more and you start to see some features. This is the same area that Tim was working through before.

This is one of the main inputs that goes into our next stage, which is creating the training data. This script requires the land mask layer as well, but I'm not going to go over that because it's all covered in the instruction manual, so feel free to go into more detail there afterwards, in your own time. This script just creates some training data for the classification. As you can see there's a lot going on here, but some of it is just repeated, because we're collecting some data for each class. Again we've got a point of interest, the bathymetry that we just calculated, some high-resolution data that I've loaded in (you might have some drone data that you can also put in there), and then we've got the four main classes that I'm going through today: seagrass, sand, coral and deep water. Now, if I just pull this up so we can see the map view, I'm going to run this tool.
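For context on the Stumpf method mentioned above: it estimates depth from the ratio of log-transformed blue and green reflectances, calibrated against known depths (such as the lidar points described here). A minimal plain-Python sketch of the relationship, with calibration coefficients and reflectance values that are purely illustrative rather than taken from the talk:

```python
# Sketch of the Stumpf band-ratio method: depth scales roughly linearly
# with the ratio of log-transformed blue and green reflectances.
# m1 and m0 must be calibrated against known depths (e.g. lidar points);
# the default values below are purely illustrative.
import math

def stumpf_depth(blue, green, m1=50.0, m0=40.0, n=1000.0):
    """Relative depth from the blue/green log ratio (Stumpf-style)."""
    ratio = math.log(n * blue) / math.log(n * green)
    return m1 * ratio - m0

# Green attenuates with depth faster than blue, so the ratio (and the
# estimated depth) grows as the water gets deeper:
print(stumpf_depth(blue=0.12, green=0.10))  # shallower water
print(stumpf_depth(blue=0.06, green=0.03))  # deeper water
```

The constant n simply keeps both logarithms positive over typical reflectance values; the linear fit of m1 and m0 is where the uploaded bathymetry points come in.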
You can see that I've created some polygons; it just takes a few seconds to load. As James mentioned in the talk on Friday, this is a method called expert labelling. It's advised that you do this with a mixture of high-resolution imagery and ground-truth data, so you might have a series of points or photos that you've taken on the ground; if we had a drone survey of this whole island, for instance, you might also have some points here that you can load in and look at to discriminate what's what. In this example I'm just going to add another polygon to our training data, for the sand class. If I click this polygon tool, I can quite easily see that this area is sand, and I'm quite confident we can use it to build our training data for the classification; it's as simple as adding the polygon, and there it is. You might do this all around the study area you're looking at, and it's advised that you cover quite a lot of areas, because it's important to reduce the amount of spatial autocorrelation that James highlighted in the talk on Friday; you want lots of different polygons spread around the image to increase the reliability of your classification.

Once you've drawn all of your polygons, you press the Run button, and this will extract a series of random points from all these polygons. The Tasks tab in the top right will then show you a couple of datasets, the training and validation data that you're going to use for the classification. Once you run these tasks, the datasets will be exported to your Assets (your file-explorer-style window) and appear there; as you can see, we've got some CME workshop training and validation data. This brings me on to the final stage of the process, which is the classification (let's abandon those changes).
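The random-point extraction and training/validation split can be sketched in plain Python. This is purely illustrative, not the actual script's logic: the real tool samples points from polygons drawn in the map view, and, as the talk notes, validation data should ideally come from an independent source rather than the same polygons.

```python
# Sketch: draw labelled sample points and split them into training and
# validation sets, as the script does before exporting to Assets.
import random

def sample_and_split(labelled_points, train_fraction=0.7, seed=42):
    """Shuffle labelled (x, y, class) points and split train/validation."""
    rng = random.Random(seed)          # fixed seed for reproducibility
    pts = list(labelled_points)
    rng.shuffle(pts)
    cut = int(len(pts) * train_fraction)
    return pts[:cut], pts[cut:]

# Illustrative points drawn from a "sand" and a "seagrass" polygon:
points = ([(x, 0, "sand") for x in range(10)]
          + [(x, 1, "seagrass") for x in range(10)])
train, validation = sample_and_split(points)
print(len(train), len(validation))  # 14 6
```

The 70/30 split fraction is an assumption for the sketch; whatever the real split, the key point is that the validation points are held out of training so the accuracy assessment is not trivially inflated.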
This script is probably the one that takes the longest, so I'm going to bring you back to my presentation shortly. Again, this requires a point of interest, your land mask, the training and validation data, and the bathymetry layers that we mentioned, plus some parameters similar to the ones we mentioned before. When you run the tool you'll come to a classification which will look like this; let me just bring you back to the presentation. We should be in presenter view now. Can you see that, Tim? "Yeah, that's grand." Lovely.

So this is the training and validation data that we've used for each of the zones, and this is the area we were just looking at: the red points are your training data and the blue your validation data. It's important to say that, for simplicity, I've taken all of the data from the same source, but ideally you'd want to go out and pick some validation points either from a completely separate polygon dataset or from individual photos you've taken, which can then be used to assess the accuracy. As I said, here we're just looking at the points in blue to validate the data. This is the data we've been working with, and this is the classification that comes out about five minutes after I run the tool. If I flick between the two, you can get a pretty good idea of what's what, especially in the coral reef areas. But to the north of Belize City there's some turbid water, and the map is actually picking up some coral there which doesn't exist; that's because these turbid waters reflect a similar signature to the actual coral reef areas that you can see at the top right of the map, where the reef is. So when you come to use these maps, it's really important to discuss their limitations, even when you have accuracy statistics like the ones in the top right. I'm just going to bring these up in the next window and make them a bit bigger.
This map is reporting an overall accuracy of 0.9, or 90%. But, as we mentioned, there's a fair amount of the area north of Belize City which really isn't what the map says it is on the ground, and that's because we built our ground-truth dataset from these polygons of really high-confidence areas. As I showed you, we built a polygon of sand where it was quite obvious there was going to be sand, and that can give some misleading results in the accuracy assessment. The way to get around that is, obviously, to flag these limitations, and also, if you're going out to do some surveys (maybe you've got a drone survey), it's important to go to places where you're maybe not so confident about what's on the seabed; you can then incorporate those in your ground-truth dataset, and the accuracy should be more representative of the truth. Another common way to assess how well your map has been produced is through a class confusion matrix, which compares what you can actually see (your validation data) against what the map is producing. I'm not going to go into too much detail about that, but as I said before, we've got this instruction manual coming out very soon, so if you're interested then definitely follow that.

So that's it from me. Hopefully you've learned a bit about how to use these tools, and if not, go ahead and use the instruction manual to work through what we've done. If you've got some existing data or bathymetry data, you can edit the tools online that I've been through today and apply them to your own area, and hopefully you can get some nice results. And if you've made some good maps and are confident with them, I'd love to hear from you, so please send me an email, or send one to the CME team, and we can keep in touch. Thank you.
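As a footnote to the accuracy discussion in the talk, the class confusion matrix and overall accuracy can be sketched in a few lines of plain Python. The class labels and point counts below are illustrative, not the talk's actual validation results:

```python
# Sketch: a class confusion matrix and overall accuracy, comparing
# validation labels (what you can see) against mapped labels (what the
# classifier produced). Classes here: seagrass, sand, coral, deep water.
from collections import Counter

def confusion_matrix(truth, predicted):
    """counts[(true_class, predicted_class)] over all validation points."""
    return Counter(zip(truth, predicted))

def overall_accuracy(truth, predicted):
    """Fraction of validation points where map and truth agree."""
    correct = sum(t == p for t, p in zip(truth, predicted))
    return correct / len(truth)

truth     = ["sand", "sand", "seagrass", "coral", "deep water", "sand"]
predicted = ["sand", "coral", "seagrass", "coral", "deep water", "sand"]

cm = confusion_matrix(truth, predicted)
print(cm[("sand", "coral")])               # 1 sand point mapped as coral
print(overall_accuracy(truth, predicted))  # 5 of 6 points correct
```

The off-diagonal entries of the matrix are exactly the kind of confusion the talk describes, such as turbid water being mapped as coral, which a single overall-accuracy figure can hide.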
Info
Channel: NOC news
Views: 411
Rating: 5 out of 5
Keywords: National Oceanography Centre, Oceanography, Research, Science, NOC, Natural Environment Research Council, NERC, CME Programme
Id: TMzOq7kPy8M
Length: 33min 31sec (2011 seconds)
Published: Thu Apr 15 2021