Visualising data in NetCDF format

Hello and welcome to EUMETSAT. My name is Mark Higgins. In this series of videos we're going to walk through how you can discover, download, manipulate, and visualise some of the amazing Copernicus Sentinel-3 data. In the following videos we're going to introduce you to some of the EUMETSAT product experts, who will walk you through how to use free and open tools like BRAT, SNAP, and QGIS to really get under the skin of these data - to manipulate and visualise them. We really hope that you will enjoy and use this data in your work, or for your curiosity about the Earth and the marine environment.

Welcome to EUMETSAT. My name is Daniel Lee. I work here as a software and data format engineer, and we are standing in front of the Copernicus control room, where some very exciting things are happening: they're processing and controlling data from the Copernicus satellites, which is coming down from space. Exactly that data is going to be the topic of this video. We'll be accessing that data and looking at how you can visualise and process it, as well as data which has been produced on the basis of the observations of these satellites.

The topic of today's video is data stored in the netCDF format. This is a very common scientific data format, and it's widely used across the various scientific communities because it's very flexible and expressive. You can imagine it like HTML: you can use HTML to express any web page that you like, and netCDF can be used to describe any scientific data. This also means that it's hard to make a program that can interpret any netCDF file, and so for this reason all of the Copernicus data that I'll be showing today - and in fact all Copernicus data in general - adheres to certain conventions which tell users, and also software, specific places they can look within the file to discover what data is there, what it's all about, and so on. The conventions that are really important are the ACDD - the Attribute Convention for Data Discovery, which helps you find data - and the Climate and Forecast conventions, also known as the CF conventions. Together these make it very easy to discover and also process data formatted in netCDF.

Today we'll also be making use of several software tools. There is the GDAL library - the Geospatial Data Abstraction Library - which we will use to process geospatial data, and the netCDF tools. We'll be using the Panoply visualisation tool from NASA, and we'll also be using QGIS, a package for visualising and processing geospatial data. Finally, we'll be using several Python packages inside a Jupyter notebook: most specifically the SciPy stack, including NumPy and matplotlib; xarray, for processing netCDF data; and the SciTools packages, for plotting this data in a geospatial context. All of these resources are linked in the text below the video.

We'll be working today with data from three different Copernicus services. The first one is CMEMS, the Copernicus Marine Environment Monitoring Service. This is a service focused on monitoring the health of our oceans and providing information which is useful to anybody who has anything to do with the ocean - these might be fishermen or people involved in maritime shipping, but in fact it could also be just normal people like you or me. The second portal that we'll be accessing data from is the Copernicus Online Data Access (CODA) portal. This is hosted at EUMETSAT, and it provides access to satellite data as well as level 2 products. These are on the same viewing geometry as the satellite sees when it's flying over the Earth, but rather than recording what the satellite actually saw - which is light at various wavelengths, or radiation - they contain geophysical variables which have been derived from what the satellite viewed. The third portal that we're accessing data from is the Copernicus Atmosphere Monitoring Service, also known as CAMS, and CAMS is a great portal to find information about the quality of the Earth's atmosphere. The data here is mostly forecasts and analyses related to the atmosphere's composition, and various variables which help us to understand what's going on there. If you would like to know more about these different portals and the services themselves, let us know in the comments.

For the focus of this video, we will be looking at netCDF data which I've already pulled down from these portals, and I'm going to show it to you with the software tools I mentioned before. The first tool I'd like to show you today is Panoply. It's a tool designed to let us open netCDF files, inspect their contents, and visualise them - especially if they are geophysical scientific data - on a plot or as various diagrams. For that, I've downloaded some data from CMEMS. Let's take a look at it. The data is stored on my hard drive already, and as you can see it's a file with a long file name and a .nc file ending, which tells me it's a netCDF file. Because Panoply is already installed on my computer, all I need to do is open it. If I opened Panoply directly, I could choose a file that I want to work with; in this case I opened Panoply with the file in question, and so it's already open as soon as Panoply starts.

On the left-hand side of my screen you can see the contents of the file, and something you'll notice about these contents is that they're structured like a folder containing files: we have the file name at the very top, and then the variables contained within the file below it. So the file holds multiple variables, all stored in a way that relates them to each other - they might share things such as their location on the Earth, or the time for which they're valid. The variables within this file are temperature, eastward velocity, northward velocity, and the sea surface height. On the right-hand pane we can see metadata about the file itself.
If we scroll all the way to the bottom of the right-hand pane, we can see the so-called global file attributes. This section contains attributes which apply to the entire dataset. For example, here we have the title of the dataset - "hourly mean fields from global ocean physics analysis and forecast updated daily" - and we can find out who produced the data, where to find them on the web, what conventions the data adheres to, as well as some information about what time this data refers to and where it refers to, both above the Earth's centre and horizontally on the Earth's surface. If we want to see the metadata for the individual variables, that's also available in the same view - we just scroll to the top. What's also interesting, though, if you don't want to look at all that metadata at once, is to click on a single variable in the left-hand pane; that limits the display to only the metadata related to that variable. For example, if I click on zos, which is the sea surface height, I can see that it is stored with certain dimensions and certain coordinates, and I can also find out the units and any scaling and offset values relevant for viewing this dataset.

Now that I've chosen the sea surface height variable, I'm going to create a plot of it. If I click the Create Plot button, a window opens which asks me what type of plot I'd like. The plot that's most interesting to me right now is the longitude-latitude plot, because that displays the variable on the Earth's surface as a map, which is what we're used to viewing. When I click Create, the plot is created in the background, and here we can see a beautiful plot of the sea surface height. Something very interesting about Panoply is that the data is already labelled. Panoply doesn't just understand netCDF - it specifically understands geospatial scientific netCDF data, and for that reason it has interpreted what is in this plot (the sea surface height), supplied a colour map that is roughly appropriate for this type of variable, labelled it along with its units, and provided some basic statistics about the distribution of the variable we're looking at. It also gives us an array of tabs down at the bottom that we can use to customise this map. For example, I might navigate to the labels and set a subtitle for this map - "CMEMS data" - and I might also change the map scale, or the projection of the map that I want to use. There are various other things I can add to this map, including country borders, rivers, and so on.

So Panoply is really great for getting a first look at your data, because it understands your data and can show it to you in a way that you, as a scientist or researcher, are familiar with. What it cannot do, however, is display multiple datasets together. If I want to compare the sea surface height with the sea temperature, for example, I can't do that in the same map. Let's take a look at what that would look like: if I choose the temperature and create a plot of it, I get a new plot which I can display next to the old plot, but I can't combine them. So while this might be useful for getting a first feel for your data, it's definitely not enough to analyse it numerically and find relationships within the data that you could use in your work or publish in a report. For that we need another tool.

QGIS is a program designed for generic processing of geospatial data. It's not designed specifically for scientific data, but it can work fine with scientific data - it may be that in some cases you have to do a little bit of work to prepare the data so that QGIS can understand it, which is not the case for Panoply, because Panoply is designed specifically for that data. Back to the example of combining sea surface height and sea surface temperature: we can do that in QGIS quite easily. I'm going to open my CMEMS folder, and here we'll see I have the same file, and it's represented to me in the same way, hierarchically.
I have the file name and the various variables which are stored in it. If I double-click on sea surface height, the first thing I'm presented with is a question: which coordinate reference system should be used to interpret this data? The reason I'm asked this is that the CF conventions, to which our scientific data adheres, don't present the coordinate reference system to QGIS in a way that it understands. There are many types of coordinate reference systems, and they determine how to interpret the coordinates associated with data. For some local maps you might have coordinates in metres, or even in feet, miles, or kilometres, but this is not often the case for global data - global data normally uses latitude and longitude to record the position of an item on the Earth's surface. Because I know that's the case for this data, and also because it's a safe assumption for global data, I'm going to choose the classic global projection for my data, which is WGS84. This is the reference system used most often for data which applies to the entire globe.

When I click OK, the display that pops up is now a map of the Earth - it seems. I recognise that it's a map of the Earth because I can see the continents basically punched out of the outline of the sea surface height, and this looks like it's in the correct position. I can also add the temperature on top of that, via the same dialogue, and when I do, it's also added to the map and I can toggle back and forth between them. So you can see that they are indeed different maps, and they do indeed lie on top of each other. If I want to verify in a quick and easy way that the maps are positioned correctly, I can add some more data from another dataset. To do that, I'm going to pull in some polygons which represent the land coverage of the Earth. This is a free dataset from Natural Earth, and it's simply a map of the continents. When I add that to the map, you can see immediately that the continents lie exactly where we have holes in our ocean data - which makes perfect sense, because continents are in places where there's no ocean. They all lie perfectly on top of each other, so it looks like we've successfully combined three datasets from two different disciplines, and in this case we might continue to work with them. That's great: what we've just done is visualise three different datasets from two different data sources, all together, and we can see that in QGIS they're all lying on top of each other.

Just as a quick note: QGIS has a lot of options to further process raster and polygon data, or any generalised vector data. For example, we could do silly things like adding together the sea surface height and the sea surface temperature; more relevantly, we could take the eastward and northward velocity datasets and use them to compute the actual velocity vectors and speeds. These things are all possible within QGIS, and because it has so many options, that's well beyond the scope of this video. What I would like to show, though, is how to set a different colour map, because that makes the data a lot more attractive. For any given dataset - and these in particular are raster data - you can access the display options with a simple double-click. What I have here is the symbology tab for the sea surface temperature: it's only one band of data, and currently it's displayed in greyscale. If I select singleband pseudocolour, I have an array of different colour scales that I can select and customise. I'm not going to change this right now - I'll just use a basic different colour ramp - and what you can immediately see is that the data looks different. For people who really work with colours a lot, you might be able to recognise more detail, more features, within this data. So that's really good.

Now, what happens if we want to add additional data? I'm going to add to the map some data from the CODA portal.
Adding this CODA data, which is stored in the satellite viewing geometry, to the map might seem like a straightforward process. I select the CODA data on my hard drive, and I see again a netCDF file, stored with a number of different variables. This is more complex data, and for the sake of simplicity I'm going to choose the sea surface temperature variable, because it combines well with the data I already have here. If I double-click on the sea surface temperature, I'm again asked which coordinate reference system should be used to interpret this data, and if I apply the global coordinate reference system, WGS84, as I did for the previous data, the result looks rather confusing at first glance. In fact we see almost nothing, except for this little bit off in the corner. If we right-click on the dataset and select "Zoom to layer", you'll notice that the layer is actually off in an impossible position - it's below the South Pole, and that just doesn't work. Now, I'm showing you this mistake because it might happen to you, and I want you to be able to find out why it's happening. So let's do some data archaeology.

I'm looking at the data, and I know - because I selected the data previously - that this should be displaying a region over the Mediterranean Sea, and I should be able to see some of Italy and some of Greece. Looking at the data, I see that if I mirror it on two axes - so north becomes south and west becomes east - it looks a lot like Italy, and that looks a lot like Greece. So this is the first thing that's suspicious to me. The second suspicious thing is that if I move my mouse around the map, I can see down here the coordinates my mouse is over, and these coordinates, if I go to the top-left corner of the data as located on the Earth, seem to begin at zero-zero degrees. Because we're looking at local data that is stored the same way the satellite saw it, it wouldn't make sense for it to be stored as decimal degrees like the other data, but it is telling that 0, 0 maps to a place on the Earth - and the reason no data is displayed there is probably that that position within our data isn't actually over the ocean, whereas the data only contains information about the ocean.

There's more archaeology we can do from the terminal command line. I've opened a terminal, and I'm going to navigate to the CODA folder. If I view the contents of the folder, I see the netCDF files contained there, and I'm going to use GDAL - the Geospatial Data Abstraction Library - to display some information about them. gdalinfo is a good tool to dump some basic information about what a file contains, and if I execute it on the file, I will see at the very bottom of the exhaustive information it outputs something impossible: the corner coordinates which bound the data go to 512 degrees, and 512 degrees can't exist on the Earth. So something is definitely wrong here. If I open the same dataset in Panoply, just to make sure the data is not corrupted in some way, I see that Panoply is able to display it just fine. That tells me I can trust the data much more than I might have thought initially. Now I've opened the data in Panoply, and I see roughly the same as I saw before. I'm going to create a plot of the sea surface temperature, which is the variable I opened in QGIS just a few minutes ago. When I do that, the plot opens, and luckily the data isn't located somewhere far away from Earth in an impossible place - instead it is located where I thought it should be, as you can see on the map. I'm going to change the map now so that we can see it a bit better, by zooming in on it, and for that I'm going to use a different projection: I'm going to take the global view that I have and transform it into another coordinate system so that we can see the data better. So now I'm zooming through all the available projections that Panoply has to offer - as you can see, there are quite a few.
If I select Transverse Mercator, the map doesn't look quite as descriptive as before. The reason is that Transverse Mercator is a projection for local data - it fits our data well, but it's not positioned in the correct place right now. I'm moving the centre of the map to 20 degrees east and 36 degrees north, and when the map updates you can see that there's our data, and it's in the correct position. That means that somehow Panoply can interpret the data whereas QGIS cannot. How is it doing that?

We can find out more by using the ncdump tool, which is a tool for viewing the contents of a netCDF file. If I execute ncdump on the file I'm interested in, it will give me all of the data contained in the file - and that's quite a bit - so I'm going to use the -h flag, which prints only the header, and I'm going to pipe the output to the program less, which will let us scroll through it. Now I can see the contents of the file as text, and if I search for sea surface temperature, I jump to the variable in question and can see a lot of metadata associated with it. This, by the way, is the same view that's presented to you in Panoply. We can see that sea surface temperature has a couple of interesting things for us: it has an add_offset, which we'll need to interpret the data later on; it has a scale_factor; and it also has coordinates, which are "lon lat". What it's referring to with those coordinates are actually other variables contained within the data, and what they hold are the actual coordinates where the data is located on the Earth's surface. What we were looking at in QGIS were the coordinates within the array of data as stored on the hard drive - and of course, as a scientist, I'm much more interested in where my data comes from on the Earth than in where my data comes from on my hard drive. So let's go back and use GDAL to extract those coordinates and apply them to the data. The tool to do that with is gdalwarp.
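Before typing it out, here is roughly what the invocation will look like, sketched in Python for easy scripting later - the file name is a placeholder, not the actual file from the video:

```python
import shlex

# Hypothetical file and variable names - substitute your own CODA download.
nc_file = "coda_sst_example.nc"
variable = "sea_surface_temperature"

# GDAL addresses a variable inside a netCDF file with the
# NETCDF:<file>:<variable> subdataset syntax; -geoloc tells gdalwarp
# to georeference using the lon/lat geolocation arrays rather than
# the raw array indices.
cmd = ["gdalwarp", "-geoloc", f"NETCDF:{nc_file}:{variable}", "sst.tif"]

print(shlex.join(cmd))
```

Building the command as a list like this makes it easy to hand to subprocess.run later, if you want to automate the conversion.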
So I'm going to run gdalwarp. I'm going to use the parameter -geoloc, which tells it to use the geolocation arrays, and now I need to tell gdalwarp what data to interpret and how. I say that this is a netCDF file, I provide the name of the netCDF file, and I also provide the variable I'm interested in, which is the sea surface temperature. Then I need to provide an output file name - I'm going to call it sst.tif, for sea surface temperature, to produce a GeoTIFF file. When I hit Enter, gdalwarp processes the data and gives us a bit of output so that we can track it, and now it's done.

If we return to QGIS, we'll be able to see that output. First I remove the sea surface temperature layer from my map, because it's not useful the way it's loaded, by selecting "Remove layer" - QGIS asks me to confirm that. Now, when I scroll down within my folder, I can see that the new dataset is available: sst.tif. Double-clicking it asks me again which coordinate reference system to use, and when I click OK, using WGS84, I don't see something located far away from the Earth - instead I see a tiny little fleck of colour in the Mediterranean Sea. Let's zoom in on that: I right-click on the dataset, select "Zoom to layer", and when the map updates I can see that the data is indeed located where I expect it to be, and it looks good.

One thing should be noted here: if you look at the map legend, you'll see that the data spans the range from minus 299 to positive 40,420, which doesn't make sense. We know from the metadata that the data is stored in kelvins, and those would not be possible temperatures to find in the ocean on Earth - in fact, anywhere. The reason for this is the aforementioned scale_factor and add_offset: if we want to interpret this data, we need to apply those two factors in order to unpack the data and show it in sensible units. If I were to do that, the most likely choice would be the raster calculator, which is available in QGIS but outside the scope of this video.
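The unpacking the raster calculator would perform follows the CF conventions' packed-data rule, unpacked = packed * scale_factor + add_offset. A minimal NumPy sketch, with made-up packed values and attribute values for illustration:

```python
import numpy as np

# Packed values as stored on disk (e.g. 16-bit integers), with
# hypothetical scale_factor and add_offset attribute values.
packed = np.array([1200, 1350, 1500], dtype=np.int16)
scale_factor = 0.01
add_offset = 283.0

# CF unpacking rule: unpacked = packed * scale_factor + add_offset
sst_kelvin = packed * scale_factor + add_offset
print(sst_kelvin)  # -> [295.  296.5 298. ], plausible ocean temperatures in K
```

Tools like Panoply and xarray apply this rule for you automatically when the attributes are present, which is why they show sensible values straight away.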
What we've just seen is QGIS and its power for combining multiple datasets from different disciplines, showing them together, and also possibly processing them to make great maps or great data. QGIS is very, very powerful if you're willing to sit in front of your computer and click something together - it's very interactive and easily approachable. But sometimes you have a workflow that you've already defined: you know how it works, and you want to program it so that you can produce data on a daily, hourly, or minutely basis. For that, QGIS isn't the best platform. Instead, I'm going to show you an example of how to do something similar in Python, and this can be executed completely without human interaction.

The environment I'll be using for this is a Python notebook, because notebooks are a good way to produce reproducible research. If I give you a notebook, it contains an explanation of the steps that are going on, and then the actual code which is executed in the course of the research. What you see in the notebook is not only the code but also the results, and it's executable on your side as well - which means it's as big a revolution as if I could give you all of the materials in my lab, and the lab itself, so that you could reproduce my experiments. Very, very exciting. I have a Jupyter notebook server running on my computer at this moment, so all I need to do to interact with it is navigate to it in my web browser. The data we'll be working with is from the Copernicus Atmosphere Monitoring Service - what you see on my screen right now is the CAMS portal, which I used to download the data. The data is stored on my hard drive, and I have a Jupyter notebook which I've prepared for you, with explanations and code, that will be accessing that same data. This notebook is available on the web, so you can download it and play with it yourself, and change it to match your needs. Let's get started.

This is a notebook that we'll be executing in Python 3, which is the future of Python - and if you worry about using Python 3, please don't. Python 2 was the main Python version for a long time, but now most libraries, and especially all scientific libraries of relevance, have been ported to work with Python 3, so this really is the way to go. We're going to start by importing all the tools that we need, and that's this block right here. It imports NumPy; the pyplot module of matplotlib; xarray, which is used for accessing netCDF data; and the crs module of Cartopy - crs stands for coordinate reference systems. NumPy and matplotlib are both from the SciPy stack of technologies; xarray is a package designed for working with netCDF data - big arrays, multi-dimensional datasets; and Cartopy is from the SciTools family of libraries, designed for working with geospatial data in a scientific context - we're going to use it for mapping. I execute that cell by hitting Ctrl+Enter, and I'm not expecting any output here, because all of these libraries are installed on my machine; all that's happening is that I'm gaining access to them within the context of this notebook.

Now that we've imported all the libraries we need, in the next cell we're going to import the data we want to work with. We'll do that by calling the xarray function open_dataarray. We pass into this function the name of the file we want to access - it's stored in the CAMS folder on my machine, and it's a netCDF file, as you can recognise from the file name - and we assign the result to this variable right here, which we're calling no2, because this is nitrogen dioxide data. In the next line I just print out a summary of this no2 object, so that we can see how the computer views it. What we see is a summary of the data we just loaded: it has three dimensions - latitude, longitude, and time.
We also see the content of the data itself - namely, total column nitrogen dioxide, with units of kilograms per square metre. That's great: it shows that our data can be understood by xarray and presented to us. Now I'd like to make some very basic plots with this.

The first thing I'm going to do is plot the total column nitrogen dioxide over a point on the Earth's surface. There are multiple ways to do this; I'm going to show you two. The first is to select by index. You call the object which contains the data - that's the no2 object - and you call its isel method, which stands for "index select", passing in the dimensions and the positions within those dimensions that interest you. Those are latitude and longitude, and I've passed the number 0 for both, which tells Python to pull out the first position within those arrays. So it's going to pull out all the NO2 data located at that position and assign it to the variable first_point. We could do this, and when I execute the code it runs without any problems, but it's actually not very interesting for us from a scientific standpoint, because isel uses the position within the arrays, and as scientists we're normally interested in the position on the planet Earth.

In order to select the data according to those criteria, I'm going to use another method. Again I access the no2 object, but this time I call the sel method - not isel, but sel - and this selects based on values that I pass into the call. I say I would like all data located at latitude 50 degrees and longitude 8 degrees, and I'm assigning that to a variable named darmstadt, because that's where I live and work, so it's interesting for me. In the next line down, I plot that data. Let's execute that now. What you see is that the call was executed and a nice plot is produced.
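As an aside, the difference between selecting by index and selecting by coordinate value - isel versus sel - can be illustrated with a plain NumPy sketch; the axis and values below are invented for illustration:

```python
import numpy as np

# Synthetic 1-D "latitude" axis and matching data values.
lats = np.array([40.0, 45.0, 50.0, 55.0])
values = np.array([1.1, 2.2, 3.3, 4.4])

# Select by index (conceptually what isel does): position 0 in the array.
by_index = values[0]

# Select by coordinate value (conceptually what sel does): find the
# position where the latitude equals 50 degrees, then take that value.
by_label = values[lats == 50.0][0]

print(by_index, by_label)  # -> 1.1 3.3
```

xarray does exactly this bookkeeping for you, across all dimensions at once, which is why sel is the natural choice when you think in Earth coordinates rather than array positions.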
The plot is a line showing the evolution of NO2 values for the total atmospheric column above Darmstadt for the month of June this year, and we can see that the axes are correctly labelled: at the very top we have the position this was extracted from - longitude 8, latitude 50 - on the y-axis we have the total column nitrogen dioxide, and on the x-axis we have time. We can recognise a certain cyclical pattern throughout the month, which is interesting.

We can also compare two different places on the Earth in the same plot, as follows. I'm doing much the same as before: I access the no2 object and call the sel method, I pass in a latitude of 50, but instead of passing a single longitude value, I pass a list of values, 1 and 8. I've chosen these values because they're approximately the positions of EUMETSAT in Darmstadt and ECMWF in Reading, both of which participate in many of the Copernicus services. And I choose the method "nearest" - that means that if exactly those coordinates are not contained in my data, xarray will select the nearest available coordinates. This may not be exactly the position I specified, but it will be close enough for our purposes. I could store the result of this call directly in a variable, as I did in the previous call, but instead I'm going to use it directly, without storing it anywhere: I call its plot method, and I tell the plot method that I want a line plot and that the x-axis should be the time dimension. When I execute that, what we see are two lines. The plot is much the same as before, but rather than giving me the longitude coordinate at the very top, it gives me the longitude coordinates in a legend, with the colour corresponding to each, so that I can find them on the plot. What I can see here is that I have two differently coloured lines - one colour corresponds to Darmstadt, the other to Reading - and it looks like in the month of June this year, Reading had less nitrogen dioxide in its atmosphere than Darmstadt did.
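The nearest-neighbour behaviour of sel(method="nearest") can also be emulated in plain NumPy, which makes clear what "nearest available coordinates" means; the grid spacing and target values below are invented for illustration:

```python
import numpy as np

# Hypothetical regular longitude axis at 0.5-degree spacing.
lons = np.arange(-10.0, 10.0, 0.5)

def nearest_index(coords, value):
    # Index of the coordinate closest to `value` - conceptually what
    # sel(..., method="nearest") does for each requested label.
    return int(np.abs(coords - value).argmin())

# 1 degree east is on the grid; 8.2 shows snapping to the nearest point.
for target in (1.0, 8.2):
    i = nearest_index(lons, target)
    print(f"requested {target} -> got {lons[i]}")
```

Without method="nearest", asking for a label that isn't exactly on the coordinate axis raises an error, which is why the option matters for real-world positions.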
What if we want to get a feel for the data? There are lots of different things we can do, because now the data is available to us in Python, so we can apply the full array of Python libraries to process it - and that includes a lot of scientific software we can access and use right away, without needing to write it ourselves. I'm going to show you a contrived example: it doesn't make much sense for many applications, but it shows the power of what we've done here. I'm going to call the NumPy library, and from that library I'm accessing the log function, which computes the logarithm of an array of data that I pass into it. I pass in our no2 dataset and plot the result, and here we see the plot: a histogram of the logarithm of all of the NO2 data stored in our dataset. So in this one line of code, we've iterated over all of the data stored there - for all time steps, for all positions on the planet - computed the logarithm, and made a histogram of it all. That's very exciting if you're a data researcher.

But what if you want to see the data on a map - if you're interested in the spatial relationships within the data? We can create a map of it within Python, too, and display it in this notebook. What I've done here is access the no2 object and use the isel method - selecting by index, as we said before - and I'm selecting by position within the time index; that's why this argument is set to time=0. So it's pulling out the first time step in the data, for all locations on the Earth, and I'm doing this because oftentimes you're interested in the oldest or the newest data in your time series. I create a plot of that, and that's actually enough to create a map of that time step of the variable I'm looking at. But in order to make the map easier to understand, I've passed an additional argument into it: robust=True. What does this do?
this do? It removes outliers from the creation of the color map, and this will allow us to see some more detail, because outliers can have a big impact on how the color map is stretched. I'm removing them so that we can recognize more in our map. Now when I execute this line of code, you can see a very basic map of the earth. We see the total column nitrogen dioxide with a legend on the right side; on the left hand side the y-axis is labeled as latitude and on the bottom the x-axis is labeled as longitude, and we see which specific time is in question here, interpreted in a way that we can understand. This makes a kind of strange map, because the zero coordinate is zero degrees east, which lies over England. That means that on the left side we have the very edge of Europe and on the very right side we have again the edge of Europe, and the map is centered over the Pacific Ocean, which is fine if you're an oceanographer or if that's your point of interest, but it's not the map that most people will be used to dealing with. That doesn't bother us right now. What we can see is that over Asia and Australia, and also over the Americas, there seems to be a higher concentration of NO2 over areas which are heavily settled by humans, which matches everything that we know about how NO2 develops in the atmosphere. So that's great for getting a first look at this global data on a map. Now, I'm from EUMETSAT and I'm interested in satellites, so what I'd like to do in the next step is to look at the data much as it might be seen by a satellite. I'm going to create a new projection of the earth which will seem as if we're looking down at the earth from above, and I'm going to do this using matplotlib; that's in this cell right here. I call matplotlib and from matplotlib I call the axes function, and I pass it as an argument that I'd like to use a specific projection. What projection do I want to use? I want to use a projection from the SciTools family of
libraries. This is from cartopy, and it's the orthographic projection, which just means it's the projection as if I were looking at the earth from above, and I want to center this map at 8.7 degrees east and 49.9 degrees north. This is roughly the position of Darmstadt, so we'll be looking at ourselves from above, so to say. We take the result and we assign it to the variable ax, which is shorthand for axes, and in the next line we call the coastlines method of this object to add coastlines to it. This will give us a better feel for what we're looking at. Now in the last step we're going to add the data from our data set to these axes, so to the plot, and we do that by calling our NO2 data set and using the isel method, and now we're going to select the last item along our time dimension, so this is the most recent data which is stored in this data set. We call the plot function of the result, and in the plot function we do two things: first we tell it to transform the data from the Plate Carrée projection, the regular latitude-longitude grid that many people are used to looking at, onto our new projection of the earth, and secondly we do the same thing as we did before, we set robust to true to remove outliers, and this will help us to recognize details and features in our data. I just executed that code, and if I scroll down, what I see is a map of the earth. Everything is transformed correctly to be round and on a globe, it is the correct time point, and it's labeled correctly with a legend, so that's great. Now one last thing I'd like to show, and this is something about the power of working with Python and working with notebooks, is that I can go back to any point in this file, change things, and re-execute the whole thing. In this example I'm going to set the orthographic view to zero, zero, which is the same view as our Meteosat satellites see; they're centered over the prime meridian and the equator. I set that to zero, zero, but I could just as well change something
at the very beginning of the script. For example, I might change which data set is read, and that would change what's plotted; no further work would be required in order to completely change what I've done. Now if I go back to the beginning and say "run all cells", for example, all of the cells are executed, you can see the results are the same as before as they pop up, and you can also see that the final view which is produced is different: this is the view that we see from Meteosat. As a researcher, that is huge, because it often happens that you have something that you need to change in your workflow, and this prevents you from having to repeat every subsequent step. You just change it in the script, re-execute, and you're fine. This is great for producing reproducible research, and it's also great for prototyping workflows that you might want to apply in production. So that's it for now. What we've seen today is visualizing data from the Copernicus Atmosphere Monitoring Service (CAMS), the Copernicus Online Data Access (CODA), and the Copernicus Marine Environment Monitoring Service (CMEMS), and we've learned to use several tools to visualize and analyze our data: Panoply for basic visualization; GDAL, ncdump and the netCDF tools in general for diagnosing issues with our data; QGIS for combining datasets; and finally several Python libraries embedded in the context of the Jupyter notebook. I hope you're as excited as I am to work with the Copernicus data and that this video has been helpful for you and your work. Keep an eye on our channel for further educational videos, and see you soon. [Music]
Info
Channel: EUMETSAT
Views: 42,942
Keywords: EUMETSAT, Meteorology, Weather, Climate, Europe, Meteosat, Metop
Id: XqoetylQAIY
Length: 39min 56sec (2396 seconds)
Published: Thu Sep 20 2018