Multispectral and Hyperspectral Imaging for Plant Sciences

Captions
Good morning everybody, and thank you for joining us today for this webinar on spectral imaging in plant sciences. I'm going to concentrate mostly on multispectral imaging, and you'll see a lot of examples later using the VideometerLab 4 instrument you can see here, but I will start off by talking a bit about the background to spectral imaging and where it comes from. I'll tell you a little about the kind of hyperspectral imaging hardware that Analytik can supply from Headwall Photonics, and I'll show you a few examples of agricultural, or precision agriculture, applications for hyperspectral imaging. Then I'll move on to multispectral imaging versus hyperspectral, talk a little more about the VideometerLab that you can see here and some of the applications we've developed with it specifically in plant sciences, and I'll show you a few practical things.

So, to start the presentation: they say a picture is worth a thousand words, it's a well-known phrase, and by analogy a spectral image is worth a thousand pictures. We can see here an example of a hyperspectral data cube, or hypercube as they're sometimes known. In the image you can see a body of water, perhaps a town of some sort just here, and some coastline, and behind each pixel in this image is an entire spectrum, collected for each pixel. There's a huge amount of information in there that we can use to tell a lot about the scene.

So what is spectral imaging in the first place? Well, it's a hybrid of spectroscopy and imaging. These two techniques, spectroscopy and imaging, have each been around for well over a hundred years. On the left-hand side you can see a spectrograph of light collected from an exoplanet, a planet in another solar system, and we can tell a lot about the atmosphere of that planet just by looking at the light collected when it passes in front of its home star: all these gaps in the spectrum tell us a lot about the chemical composition of its atmosphere. And of course here we have an old-timey black-and-white image. Imaging has been around for a long, long time, and ever since it came about we've been using images to measure things, photogrammetry. Imaging has developed from black and white into red-green-blue colour images designed for our own eyes, and the next evolution up from that, of course, is spectral imaging. Spectral imaging is much, much better, if you like, than RGB colour, which in turn is much better than monochrome: we're getting more and more information with each evolution in the chain.

So what we get with a spectral image is the best of both worlds, the best of spectroscopy and imaging. With spectroscopy we get a lot of information about chemical identity, but we lose any information about spatial location: we just measure one spot. With RGB or monochrome imaging we have spatial location information, but we don't really have any information about chemical identity. A spectral image, on the far right, gives us both the chemical identity and the spatial location in the image. This technique has a few names depending on your background: it's known as hyperspectral imaging, or imaging spectroscopy, sometimes chemical imaging, depending on what field you're in, but they can all be grouped together and they all use pretty much the same kind of hardware. Here's an example of a hyperspectral imaging spectrometer from Headwall Photonics, and I'll go over a bit more later about how that works.
Multispectral imaging is really the same idea as hyperspectral imaging, the same technique, except it's much more cost-effective and a lot easier to analyse, and I'll tell you in a bit why that is. But just to take a quick step backwards and go back to basics, apologies to those of you who are well familiar with this, but just to cover all bases: here we have an instrument on the left called the HandHeld 2, from our supplier ASD. What we have is a probe with an integrated light source, just a quartz tungsten halogen bulb, and a fibre-optic cable to collect light that reflects off the leaf. It illuminates the leaf over a certain area with the light source, we choose an area somewhere in the middle to measure, and we collect the light that comes back into the fibre-optic probe, back up into the unit where it's analysed. It covers the visible and near-infrared range, and we get an awful lot of information from that. This is a widely used technique in all sorts of areas. You get a spectral signature for that leaf: you can see here the green line represents a green leaf and the red line a red leaf, and you can tell whether it's red or green from the spectral signature.

The drawback, I guess, of point spectroscopy is that you are only collecting information from one averaged area. That spot could be very small, just a millimetre across, or it could be quite large, a couple of centimetres across, but it's always going to be the average of everything in that area. The idea with imaging spectroscopy, or hyperspectral imaging, is that we're taking many, many data points over the area. Here we can imagine we've used the ASD HandHeld to take an 8x8 picture, an image eight pixels wide and eight pixels deep, with a spectrum behind each pixel. The only drawback to doing it that way is that it would be really tedious and slow, so luckily we can do it with specialist hardware instead.

Now, the idea of hyperspectral imaging has actually been around for decades, mostly employed on satellites and airborne vehicles. What goes on is that the Sun acts as the light source, and the satellite or airborne vehicle moves over the scene and collects data as it goes. This image on the right is from the NASA JPL AVIRIS instrument, where AVIRIS stands for Airborne Visible/Infrared Imaging Spectrometer, and the kind of data it collects looks like this. These are two data cubes collected with AVIRIS: on the left we have a scene with perhaps some buildings down in the bottom left, a body of water, a road, and behind each of these pixels is a very high-resolution spectrum, a spectral signature for the area of ground that the pixel covers. The width of the image is determined by how wide your sensor is, and the length can be as long as you like, because as you fly over the scene it just collects more and more data, adding line by line to the data cube a bit like an inkjet printer, and it builds up the image that way.

Just a quick word about the actual data you get from a hyperspectral camera: you have to analyse it using multivariate statistics in specialist software packages, things like ENVI, which has been around for a very long time, others like Gerbil, or you can use MATLAB, and it involves very big data files. You're talking gigabytes and gigabytes of data per data cube, requiring very powerful computers and quite complex analysis.
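To make the data cube idea concrete, here is a minimal numpy sketch (not vendor software, and with made-up dimensions) of how a push-broom cube is built up line by line and then sliced in the two ways described next: the full spectrum behind one pixel, or the whole scene at one wavelength.

```python
# A minimal numpy sketch (dimensions are invented) of a push-broom data cube
# being assembled line by line, and the two ways of slicing it.
import numpy as np

n_lines, n_pixels, n_bands = 500, 1000, 300        # hypothetical flight over a scene
wavelengths = np.linspace(400, 1000, n_bands)      # nm, a typical VNIR range

# Each scan line arrives as a (pixels x bands) frame; stacking the lines builds
# the cube, much like the inkjet-printer analogy above.
lines = [np.random.rand(n_pixels, n_bands) for _ in range(n_lines)]  # placeholder data
cube = np.stack(lines, axis=0)                     # shape: (lines, pixels, bands)

# View 1: the full spectral signature behind a single pixel.
spectrum = cube[250, 500, :]

# View 2: the whole scene at one wavelength, e.g. green light near 550 nm.
band_550 = int(np.argmin(np.abs(wavelengths - 550)))
green_image = cube[:, :, band_550]                 # a single monochrome band image
```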
Just to reiterate, there are two ways to look at a data cube. Here we have, again, an image of some fields with a bit of woodland on the left. You can look at one pixel across all the wavelengths, which gives a nice spectral signature for that one pixel, or you can look at all the pixels, the whole area, but at one single wavelength: this image, for example, could be specifically at green, 550-nanometre light. So there are two ways of looking at a data cube.

And where is this being used nowadays? Well, precision agriculture is the dominating application, really. It's used for looking at crop stresses, so nutrients, water, disease and pests, soil characterisation, vegetation colour and yield estimation. But it's a very widely applicable, very versatile technique. It's used for environmental monitoring: that's largely what the satellites are doing when they carry hyperspectral or multispectral cameras, and they'll be flown over disaster areas, such as in Canada at the moment, where the AVIRIS aircraft has been used to look at the wildfires going on. It can also be used for food quality, authenticity and contamination: the Food Standards Agency is at the moment very interested in the potential of hyperspectral and multispectral imaging for looking at food authenticity in particular. It has applications in pharmaceuticals as well, looking at coatings on tablets or mixtures of powders, and you can actually mount these cameras on a production line; they're very well suited to looking at moving webs. For example, the satellite or plane has to fly over the scene to collect the image, but you could equally have a product moving under the camera instead. And finally, of course, military and defence applications. In fact, until the past few years military and defence was the main market for these types of cameras, since they obviously had the money to spend and the top-secret applications, but precision agriculture has now overtaken military and defence by a long margin and is really the main driver for this type of imaging.

Here we can see an example of a Headwall Photonics hyperspectral imaging spectrograph. As you can see it's very compact, quite small actually, and down in the bottom right we have a schematic of how the Headwall design works. You have a very fine slit here, which defines the width of the sensor, and we collect the image one line at a time; that line may be, say, a thousand pixels wide, so you'll get a thousand-pixel-wide image. Here we have the light path for one pixel coming into the system: it hits a focusing mirror and goes to a grating. This is where Headwall's expertise lies, in very high-quality, very carefully designed diffraction gratings. It's very much like a CD: when you see the reflection off a CD, it's diffracting the light. The grating separates the light into its red, green and blue components, which are then focused onto a sensor array, a fairly normal sensor array really, a bit like you'd find in your phone. These are the types of sensors that have for a long time been put on satellites and airborne craft, and over the years, with drones coming along, Headwall have put a lot of work into developing what is really an incredible piece of engineering. This is called the Nano-Hyperspec; you can see on the left here a golf ball just next to it, so you can see just how tiny it is. All the focusing optics are inside here, with a very high signal-to-noise ratio and very high quality, but it's tiny.
This orange box on top is actually an integrated GPS and inertial sensor, and it also has a 480, I think 500, gigabyte data-collection hard drive or flash drive on board. So when you're talking about size, weight and power considerations for a drone such as we have here, this is incredible: a tiny device with its own integrated data collection and GPS. Headwall Photonics worked with Leica Geosystems, whom you may know from the microscopes, to develop this drone here, this orange beauty, which has everything on it that you need: the Nano-Hyperspec, the inertial sensor, LiDAR, and the data-collection unit, which is the tiny orange box on top. A great advantage of this system is that you can enter the GPS coordinates of a polygon, a square in your field that you'd like to collect data from, press go, and the drone will fly off and only start recording data once it's over the area you specified. So you gain a great deal of time by collecting just the data you want and cutting out all the complexity of getting rid of data you don't need. It also does auto-rectification, which I'll show you in a second. It's a very good integrated package, ready to go.

Moving on to some examples of what these hyperspectral cameras are useful for: here we have a hyperspectral data cube collected for the ESA FLEX programme. At the top you can see the red-green-blue simulation of what it would look like to our eyes, since with all that information we can reproduce an RGB image, and we can see it's a mixture of fields with crops, soil, roads and all sorts. From this data, after processing, we can highlight and quantify areas in the image and show exactly where the crops are growing and where there's just bare soil, and depending on how well those crops are growing, or how healthy they are, perhaps colour-code them. You see these false-coloration images quite a lot when you're dealing with spectral imaging: once you've collected the data you process it and output a false-colour, false-contrast image with a lot of information there to make decisions on.

Another example, perhaps, is from the Ariana project. On the left we can see what a scene would look like to us: just a patchwork of green and brown, where you can make out some roads, trees, fields and so on. But using this kind of powerful post-processing on these hypercubes, or data cubes, we can start to classify and quantify the areas in the image: whether they're grassland, woodland, roads, playing fields, buildings, all that sort of thing. This, for example, could be very useful to the military for finding camouflaged tanks, things that might look invisible to our eyes because they're a certain shade of green, but in the hypercube, with the processing, you can actually highlight and find whatever is in there.

Bringing it back down to the ground, this slide is really to illustrate two points. First of all, in the last few years, the last five years or so, the cost of the equipment and its ease of use have improved dramatically, making it much more widely available to research groups interested in doing this down in the field or in the lab. This image here is right up close and personal with some wheat ears: they've used a hyperspectral imager to take an image and then quantify the degree of Fusarium infection on these wheat ears.
You could then use this calibration data to feed into a hyperspectral drone flying over the field, so that you could quantify how badly affected by Fusarium your entire field is. Another point to note is that you get a degree of infection: it's quantified, not just a yes or no of whether it's infected; we can tell how infected it is. So these researchers have indicated a degree of infection in their picture and colour-coded it, again with the false-contrast, false-coloration imagery that you always see. Another nice application down on the ground is in chlorophyll content: here the researchers have used an imaging spectrograph to image leaves up close, and based on the spectral signatures of different pixels in the image you can retrieve a lot of information about the chlorophyll content.

So now I'm going to start to move on to talking about multispectral imaging, and I'll just quickly point out some differences and similarities. It is the same idea: you have an image with many spectral data points behind each pixel, or if you like, a data cube made of many images. With hyperspectral you might have anything from two hundred to a thousand images in the data cube; multispectral imaging is just the idea that you have far fewer of those images, far fewer spectral data points in your data cube. The data files are therefore a lot smaller: in hyperspectral imaging you'd have gigabytes and gigabytes of data, whereas with multispectral it may only be a few hundred megabytes. The analysis is therefore quite a bit easier with multispectral imaging, and there is dedicated software out there, particularly for the VideometerLab, which I'll show in a minute. With hyperspectral imaging you do get these very high-resolution spectral signatures, all that data, which is absolutely fantastic and lets you retrieve a lot of analysis, whereas with multispectral imaging you get much lower-resolution spectral signatures.

Now, you might think it's therefore a poor man's hyperspectral, or that it's not as good in some way, but you'd be quite mistaken; it's horses for courses. Just because it's multispectral doesn't mean it's really any worse. It may actually be an advantage, because it's much easier and much quicker to analyse, and in a lot of situations you don't need 90% of the information in a hyperspectral data cube; you might only need what's in the multispectral data cube to tell you the same thing. I'll illustrate that now with an example. Say we have a leaf, and we know this leaf contains a mixture of chlorophyll a and chlorophyll b, and we've spent time in the lab using a spectrometer to find the absorption or reflectance spectra of each molecule separately, chlorophyll a and chlorophyll b, and we can see them here as nice smooth spectra. With multispectral imaging, what we can do is interrogate, or sample, those spectra at certain points. Here are just four points along the spectra, and we can see that chlorophyll a and chlorophyll b are actually quite well separated at just those points: they each have their own unique pattern, their own unique spectral signature, even though it's a low-resolution, four-data-point signature. They're quite distinct, and we can chemically identify and spatially locate those separate, very closely related molecules over the surface of a leaf. So although we've drastically cut down the amount of data we're collecting, we're still getting a lot of analysis value from it.
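As a hedged illustration of that sampling idea (the wavelengths and reflectance values below are invented for the example, not measured pigment data), a pixel can simply be assigned to whichever low-resolution reference signature it lies closest to:

```python
# A minimal sketch: even a four-point sampling of two reference spectra can be
# enough to tell two closely related pigments apart at each pixel.
import numpy as np

sample_bands = [430, 460, 640, 660]            # nm, hypothetical sampling wavelengths

# Hypothetical reference reflectance values at those bands (illustrative only).
signatures = {
    "chlorophyll_a": np.array([0.20, 0.55, 0.60, 0.15]),
    "chlorophyll_b": np.array([0.45, 0.25, 0.30, 0.50]),
}

def classify_pixel(pixel_spectrum):
    """Assign a pixel to the closest reference signature (Euclidean distance)."""
    names = list(signatures)
    dists = [np.linalg.norm(pixel_spectrum - signatures[n]) for n in names]
    return names[int(np.argmin(dists))]

# A pixel measured at the same four bands.
print(classify_pixel(np.array([0.22, 0.50, 0.58, 0.18])))   # -> "chlorophyll_a"
```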
To take it back to the example I showed you earlier of Fusarium infection on a wheat ear: we can do the same sort of thing with the VideometerLab. It's just a multispectral imaging system, with only 19 data points in each spectrum, but here we can see a Petri dish (the dish itself has been masked out) containing grains of barley and Fusarium-infected barley, and we can run an analysis to find, locate and quantify the degree of infection on these barley samples very quickly and very easily. We can train the algorithms, save them along with the light settings, and use them again and again in a very objective and repeatable manner.

Here's an actual application over in Denmark, the Morten group I think it is. I'll play this video, but first I'll point out that we have the integrating sphere of the VideometerLab here, a little conveyor belt that's feeding grain into the imaging area just underneath, and a hopper that you can fill up. What happens is that it vibrates and distributes grain onto the conveyor belt, which passes under the sphere and is imaged. The blue background of the conveyor belt is like a green screen, if you like, that can be masked out easily by the software. At the moment it's pausing between each image acquisition just so you can see clearly what's going on, but in reality the image acquisition time for a VideometerLab is anywhere between five and about ten seconds, depending on how many of the LEDs you need to use. You might not need all 19, you might only use five or six, and so the image acquisition time will come down and the data file size will go down too. Once the grain has gone past on the conveyor belt it's simply collected at the end. Here you can see the entire system: normally the VideometerLab on its own is all you need, and you would manually feed Petri dishes underneath and take the images, but with this you can get a much higher-throughput system going.

Here's a brief schematic of the VideometerLab 4. What we've got is an integrating sphere: all that is is a hollow sphere, about a foot to a foot and a half across, painted white on the inside, with LEDs all around the inside circumference, 19 LED wavelengths in all. The sphere is motorised to raise and lower, so it goes up and down: you put your sample here at the bottom, the sphere descends and encloses it, the LEDs shine in turn for a very specific time depending on your light setup, and the camera up top collects the reflected light from your sample. Very easy, quite intuitive, very easy to understand. Ignore for the moment this emission filter wheel; I'll explain what that's about. It doesn't come as standard, but it can be very useful. The LEDs provide what's called active illumination, so we can very precisely control the lighting in here. We're not reliant on the Sun, for example, so it doesn't matter if it gets cloudy, or whether it's noon or sunset: we can very precisely control the lighting conditions, from UV at 375 nanometres all the way up to near infrared at 970 nanometres. There's a 6-megapixel camera up the top, and that acquires an image each time one of the LEDs shines.

Just a quick note: a lot of people think, oh, 6 megapixels, that's not that great, is it, because my iPhone has a 15-megapixel camera. It does, but it's not comparing like for like: in a 15-megapixel iPhone camera you've actually got roughly 5 megapixels each for red, green and blue.
They're separated at the sensor itself, which has a filter printed on top of it to collect red, green and blue light separately and create an image that looks pretty to us. What we have with the VideometerLab is a 6-megapixel camera acquiring an image at each wavelength, so if you're going to compare them, this is in effect like a 114-megapixel camera: a 6-megapixel image at each of 19 different wavelengths. Once all 19 LEDs have fired and an image has been collected for each, you have 19 images per data cube, or if you like, 19 data points per pixel. The filter wheel, which I'll quickly mention, is an optional extra that allows you to look at multispectral fluorescence: it cuts out the reflected light and only lets through fluorescence at wavelengths above that of the LED. That can add, in effect, up to 27 extra multispectral fluorescence images, and at 27 plus 19 you can have a 46-band, 46-image data cube from this setup.

For those who are interested, and you can download this presentation later, these are the wavelengths of the LEDs in the VideometerLab. They start in the UVA, so that could be used for looking at fluorescence or GFP excitation, and they go at intervals (they don't have to be regular intervals) through the visible and into the near infrared. It stops at 970 because of the limitations of silicon sensors: with the VideometerLab and a lot of hyperspectral cameras you'll find that they often go from 400 nanometres to around a thousand nanometres, because above a thousand nanometres, in the infrared, silicon becomes transparent and you have to move to InGaAs sensors, a different technology, to look at light further into the infrared. But we can get a lot of information just using what's called the VNIR, the visible and near-infrared region, between the UV at around 375 nanometres and about a thousand nanometres.

So why would you want to use the VideometerLab for multispectral imaging in the first place? Well, it allows you to do very high-throughput, semi-automated workflow imaging using multispectral imaging, which is much better than normal colour imaging; it is always non-destructive, objective and very repeatable, and it is in fact very flexible and general-purpose: general-purpose quantification of "features", in quotes, within a Petri-dish-sized image area. The image it takes of a Petri dish is as if you're looking at the dish from about arm's length. And we can measure and quantify both spatial and spectral features: spatial features can be area, size, shape and length, and spectral features could be the colour or indeed the chemical content of the sample.

So who's actually using this? In the plant science arena there is one at the John Innes Centre up in Norwich; they've had it about a year and are still developing applications for it. Wageningen University in the Netherlands has had one for a few years, as have groups in France and Holland, and pretty soon Rothamsted Research will have theirs installed, in a month or two's time; it's just being built now. As some examples of non-plant-science customers, just in the UK we have Boots, who use it to look at make-up and make-up wear; BP Castrol, who use it for metal corrosion experiments; the Home Office, which has one in its experimental research labs to look at fingerprint development and fingerprint analysis; and Procter & Gamble, who have one in Egham for looking at counterfeit packaging. So widely different applications, really very versatile.
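Returning briefly to the data volumes mentioned earlier, a rough back-of-envelope sketch (with my own assumption about bit depth, not vendor figures) shows why a 19-band cube stays at a few hundred megabytes while a several-hundred-band hyperspectral cube runs to gigabytes:

```python
# Back-of-envelope data volumes; bit depth and band counts are assumptions.
pixels_per_band = 6_000_000       # roughly a 6-megapixel sensor
bytes_per_value = 2               # assuming 16-bit values; actual bit depth may differ

multispectral_bytes = pixels_per_band * 19 * bytes_per_value
hyperspectral_bytes = pixels_per_band * 300 * bytes_per_value   # hypothetical 300-band cube

print(f"19-band cube:  ~{multispectral_bytes / 1e6:.0f} MB")    # ~228 MB
print(f"300-band cube: ~{hyperspectral_bytes / 1e9:.1f} GB")    # ~3.6 GB
```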
Even in plant science it's very hard to say what a certain research group might want to use it for, because it has all sorts of applications, so I'll go on and talk a bit about applications in the plant science arena. These are groups abroad that are using the VideometerLab; it has very wide applications, and you can have a look on Google Scholar: circadian chlorophyll rhythms, cultivar discrimination, screening seeds, transgenic seeds. There's quite a good recent example in a plant genetics journal on using these multispectral images for seed bank and gene bank accessions, so we can relate the genes and the phenotype very closely and build a really valuable database of multispectral images that you can collect easily and store away for future use.

Some general application examples: for grain, there's variety and disease detection, lots of malting applications, and germination and hydration. In food you can look at quality control, authenticity and contaminant detection; as I mentioned, the FSA, the Food Standards Agency, is very interested in this at the moment, and the Laboratory of the Government Chemist, LGC, is trialling a VideometerLab to look at food authenticity in various scenarios. Pharmaceuticals has become a very popular application, where you can look at things like coating and API distribution in tablets. Materials and surface grading: I've used corrosion as an example, but really any kind of surface analysis is applicable. A nice little trick is agar plate colony counting: we can see a couple of images here, perhaps a bit small to see, of an agar plate with two different kinds of colonies, and it can tell the difference between the two, count them separately and quantify them separately. Forensic analysis is another application: of course the Home Office has one for fingerprints, but we could also look at bodily fluids, or questioned documents, counterfeit documents, inks, that sort of thing. And because we are looking at the visible and near-infrared region, we can use the information to look at colour as we would perceive it, so CIELAB colour quantification and simulation of different lighting conditions, if that's important to you.

Now let's quickly move on to a few actual images taken with the VideometerLab, just to illustrate the sort of data you get from it. This is a purple snapdragon; it's just resting in a roll of Sellotape here to hold it upright. This is the RGB, red-green-blue, image generated, so what it would look like to our eyes, and these are the 19 images taken at each individual wavelength, starting in the top left with the UV, then blue, going through the green and visible red and into the infrared region. This one has been false-coloured with a jet coloration to enhance the contrast for our eyes, and you can make it into an animated GIF and see this hotspot right in the middle, an area that is highly reflective in UV and blue light, whereas as you go into the infrared, as you can see there at the end of the GIF, the whole flower reflects equal amounts of infrared light over its whole surface; there's nothing to distinguish.
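For anyone wanting to reproduce that kind of false-colour single-band display, a minimal matplotlib sketch (with a placeholder array standing in for one of the 19 wavelength images) would be:

```python
# A minimal sketch of a false-colour ("jet") single-band display; `band_image`
# is a placeholder, not real VideometerLab data.
import numpy as np
import matplotlib.pyplot as plt

band_image = np.random.rand(600, 800)    # stand-in for, say, the 970 nm image

plt.imshow(band_image, cmap="jet")       # false colour to enhance contrast for our eyes
plt.colorbar(label="reflectance (arbitrary units)")
plt.title("Single-band image, false-coloured")
plt.show()
```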
A nice example: a lot of people use Arabidopsis as a model organism. Here we have a transgenic Arabidopsis plant on the left, a GWD genotype, and I don't know what GWD stands for, and on the right a wild type. To our eyes they both look green; we can't really tell any difference. But in the software, what I can do is highlight groups of pixels together: on the left-hand side all these red pixels have been highlighted as one example group, and on the right-hand side all the blue pixels are grouped together, and we can look at their spectral signatures. We can see that the GWD phenotype, in red here, although its spectral signature has a very similar shape to that of the wild type in blue, reflects far more light in the infrared region, the wavelengths above about 700 nanometres, and we can't see that; it's invisible to our eyes. So we could use that as an indicator when screening plants to see if we've been successful in transforming them, to see whether they are transgenic or not.

One of the most fruitful applications for the VideometerLab has always been grain and seed analysis. Here's an example where we simply have some wheat and barley grains in a dish. First we use the software to teach it what grain in general looks like, so it can separate grain from the background; wheat and barley are the same to the software at this point. But we can also teach it to tell the difference between wheat and barley. It's just an example application, of course, since we can tell them apart ourselves, but it shows what the software can do. So we've got wheat and barley highlighted in red or blue, and once we've trained the software it can isolate the grain from the background, a bit like a green screen when you're watching the weather, then separate the grains out and line them up in what's called the blob tool. What it's done here is isolate each grain in the dish, line them all up in a grid and generate lots and lots of information on them: things like the area, size, shape, length, the colour coordinates, hue, intensity, all of those. We can add to these features as well: we could create an index of Fusarium infection, for example, a score between zero and one of how badly infected each grain is, and that number would appear down here. We can then create a PDF report or export the data to Excel for further analysis, and we can line the grains up, colour-code them, even classify them.

This is actually what the John Innes Centre has been using it for, or rather developing; they haven't published anything yet, so it's still in development with Videometer, where they have the VideometerLab instrument. They feed it a dish of germinated wheat, and it will take an image, separate the grains out, line them up, and then classify each grain on whether it thinks it's germinated or not. I think here blue is germinated, green is ungerminated and red is upside down, or maybe blue is upside down, you can't quite tell, but each one has been classified. Although this may not be 100% accurate compared with a technician physically counting them in a Petri dish, it is a lot, lot quicker, and the Videometer doesn't get bored, nor does it change its own criteria for what counts as germinated. As long as it's close enough to the true answer, say within 5% or even 10%, it means you can get through thousands of samples in a couple of weeks as opposed to a couple of months.
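As a rough open-source analogue of the blob analysis just described (this uses scikit-image, not the Videometer blob tool itself, and the threshold and sizes are placeholders), grains can be separated from a uniform background and measured one by one:

```python
# Separate bright grains from a uniform background and measure each blob.
import numpy as np
from skimage.measure import label, regionprops

nir_band = np.random.rand(512, 512)      # stand-in for a band where grains are bright
mask = nir_band > 0.8                    # crude background removal, the "green screen" step

labelled = label(mask)                   # one integer label per connected blob/grain
for grain in regionprops(labelled):
    if grain.area < 20:                  # ignore specks of noise
        continue
    # area, length and shape features, analogous to the blob tool's output table
    print(grain.label, grain.area, grain.major_axis_length, grain.eccentricity)
```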
Another nice application could be in pathology, this time in leaves rather than the seeds I showed earlier. Here we have a leaf, and this was actually taken at the John Innes Centre quite a while back, where we know that the top half of the leaf has been deliberately inoculated with a pathogen. What we've done is cut out a small section and again highlighted pixels in the software to give it two conditions, two classes: above is infected and below is not infected. We've then trained what's called an nCDA algorithm. It's just one of these multivariate statistical transformations; you don't have to understand how it works, you can just use the drop-down boxes and create these sorts of models in about a minute, it's very simple. What it can then do is go through the rest of the image, classify the remaining pixels, and turn the RGB image into a transformed image showing an index of infection, here. We can see that the infection has spread a long way over the top half, and you can just about see a little damage from the inoculation. In fact, if I go back to here, you can see the points where it's been inoculated highlighted; that's very hard to see in an RGB image, you can't really see it at all, but it's very obvious once you've built these models.

Another application is root disease. We've got some very clear agar with some seedlings growing in it, and they have diseased roots here. Again, all I've done is train an algorithm, this nCDA, to tell the difference between root disease and anything else, since only the root disease is important and interesting here, and again use the blob tool to pull out those areas and generate a whole load of data on them. You could imagine doing a time series of this, taking images every half day or so, tracking how disease or infection spreads through the roots, then exporting it all to Excel and keeping the data for further analysis later.

Finally, a rather nice example of high-throughput seed analysis is with oilseed rape and charlock, or wild mustard, seed. What we have here is a Petri dish full of oilseed rape and charlock seeds. The charlock seeds are a little bit smaller, you can see that, but it's really hard to tell exactly which one is oilseed rape and which is charlock, and therefore how much we should pay the farmer for this. Under the blue LED, this is at 450 nanometres, blue light, we can see there's really no difference; we can't see what's going on. If we use infrared light, though, this is at 780 nanometres, beyond what we can see, those charlock seeds really light up: they reflect a lot more infrared light than the oilseed rape does. This is the same image but false-coloured with the jet coloration, just to enhance the contrast, and the charlock really stands out like a light bulb. We can see, for example, charlock seeds that are the same size as an oilseed rape seed, so you might mistake them for oilseed rape, but in 780-nanometre light we can tell the difference, it's night and day. We can then build a model based on this and run it in high throughput, per Petri dish, or with the auto-feeder you saw earlier we could feed it oilseed rape plus admixture and then classify and quantify the area of oilseed rape and admixture, rather than doing this manually each time.

Once you've developed these models and saved them with the light settings, they're reproducible every time: you load the light settings, load the image analysis recipe, and create what's called a session. We have a session window here; it's just a double-click icon on the desktop, the window opens, you press start and it will just start taking images, or it will simply look at a folder full of images you already have and then analyse them using the algorithms you've built in.
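To sketch the general shape of that trained-model workflow (using scikit-learn's linear discriminant analysis as a stand-in for the nCDA transform, which is an assumption rather than the vendor's implementation, and with placeholder spectra throughout), pixel spectra labelled as two classes train a model that is then applied to every pixel of a new image:

```python
# Train a discriminant model on labelled pixel spectra, then score a whole image.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Training data: 19-band spectra of pixels the user highlighted as each class.
infected_pixels = np.random.rand(200, 19)
healthy_pixels = np.random.rand(200, 19)

X = np.vstack([infected_pixels, healthy_pixels])
y = np.array([1] * len(infected_pixels) + [0] * len(healthy_pixels))

model = LinearDiscriminantAnalysis().fit(X, y)

# Apply the model to every pixel of a new 19-band image to get a continuous
# "index of infection" map that can be false-coloured or thresholded.
h, w, bands = 600, 800, 19
cube = np.random.rand(h, w, bands)
infection_index = model.decision_function(cube.reshape(-1, bands)).reshape(h, w)
```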
So we can see here we've got a mixture of oilseed rape and charlock: it's distinguished the oilseed rape from the charlock and quantified each one separately. I've got a very short video here of one of the session models in action. I've sped it up by two times, so this is twice the speed it normally goes at, and it's actually just looking at a folder full of images I already have, rather than taking images live from a VideometerLab, but that is of course what you could do. I'll just press play: we go up to open the session manager, press F12 to start, it looks into the folder full of images, and you can see what it's doing is analysing each image, spitting out a colour-photograph equivalent and then an analysed, transformed image at the bottom to show the areas it is counting as oilseed rape versus admixture, and it's generated a table of results that is also exported to an Excel sheet, auto-saved in the same folder as all the images. So it can be a very fast, very easy way of grading admixture in oilseed rape, or similar situations.

So, coming towards the end of our presentation, thanks very much for listening, and to reiterate: the VideometerLab 4, and spectral imaging in general, is a very powerful technique. We've got high-throughput plant and seed phenotyping that's available to technicians and non-experts now, that's where we've got to, so rather than the expensive NASA kind of systems, we can all do it ourselves now and get a very nuanced, human-like analysis. There are very sophisticated tools for both the power user and the novice: technicians, for example, can use those session recipes without having to build models; they can just press go and it will generate the analysis results. It's very fast, it's non-destructive, which is great because you can then do all your other tests to your heart's content if you want to do destructive testing, and you don't really need any sample preparation at all either.

There are various links in this presentation, and you can download it later and click these links. I'll just quickly point out these videos you can watch: if you click on these links in the presentation it will take you to our YouTube channel, where you can see all the videos we've uploaded if you click on Uploads. The one at the top left at the moment, the most recently uploaded, is a really short, about one-and-a-quarter-minute, video on plant and crop phenotyping with the VideometerLab specifically, and that's me here in my earphones; it's just a nice, simple software demonstration. Rather than showing you right now, I think you can have a look at that and see the sort of results and how easy the software is to use, and please forward it on to any colleagues you think might be interested. If you have application ideas, please do get in touch. If you want to see a bit more in-depth analysis with the software, have a look at this oilseed rape and charlock analysis video. It's actually a playlist; if you go to Created Playlists you can see it here at the top right. It's a four-video series, each about five or ten minutes long, that starts from the very basics, oilseed rape and charlock in a dish, and takes you through the basics of getting data and generating models. I hope you find it interesting. Okay, so rather than go on to the hardware and software options and post-sales support, those slides are there if you want to have a look at them later.
I think what I'll do now is open the floor to any questions people might have. If you'd like to have a look over on the right-hand side of your screen, you should see a question panel where you can type some questions, so please go ahead. Okay, I'll just open the questions.

So we have a question here from Oliver: what about 96-well plates? We have done some imaging with the VideometerLab 4 with plates. Sometimes you need to mess around with the focal plane, but you usually get some quite good results, so it can be used with 96-well plates. The only drawback sometimes is that the wells can be deep, so the imaging focal plane might be a bit out of focus, but it's perfectly possible. It might well be better suited to hyperspectral imaging, where you can set the focus at the bottom of the plate and scan in a push-broom manner: you can have these lab versions, a starter kit, where we have illumination and a translation stage, basically a conveyor belt that moves the sample underneath, and you collect a hyperspectral data cube. I will also mention a slightly separate technique that we have, which is Raman spectroscopy for 96-well plates: we have an instrument from Digilab where you can put in a 96-well plate and it will give you Raman spectra for each well. Perhaps it's worth having a bit of a chat later, Oliver, to talk a bit more about what you'd want to image in these plates.

We have another question here from Jeju. If you would like to try out or have a demo of any of this equipment from Headwall or the VideometerLab, please just get in touch with us. The European sales manager for Headwall Photonics is Scott, who is based up in Glasgow, and he's the one with the demonstration equipment, so we would be able to arrange a demonstration at some point depending on what your application is, whether it's airborne or in the lab. So please just get in touch with us and we can talk about that separately. Email addresses and all the contact details are in this presentation, which you'll be able to download; we'll be sending out a link after the presentation has ended, with a link to the slides and a link to the recording as well, so you can watch it again or send it to your colleagues at any time.

We've got one more question here: how long does it take to acquire the data cubes? It takes roughly 5 to 10 seconds with the VideometerLab. With the Headwall gear, if you had a starter kit down in your lab and you were collecting data cubes of something in the lab, it would take about the same time as well; the setup is a little more complex, but you should be able to set up once and then collect data cubes within 30 seconds or so. Just checking for questions here.

Okay, we have another question: how open is the device, and can you connect it to a computer? The VideometerLab will run with any PC; it has its own dedicated acquisition and analysis software, I showed you some of that and you can see some of it in the video, and it just needs a fairly powerful computer with a lot of RAM to work. Similarly, for the Headwall equipment you will need a very powerful computer, often with some quite specific hardware: network cards, frame grabbers, that sort of thing. Analytik can supply that computer as well and guarantee it will work with the Headwall device, but you are welcome to use your own computers as long as they've got enough RAM, enough processing power and, crucially, those certain bits of hardware like the frame grabbers that are needed to work with the Headwall cameras.
The Headwall cameras actually come with a few different kinds of camera link, a base link or full Camera Link, depending on your hardware. Okay, does anybody else have any final questions? No? Okay, that's fine. Thanks, everybody, for attending our webinar today on spectral imaging for plant and crop sciences. I hope you found it interesting and informative. Please do get in touch with us any time if you'd like more information about the technique or what it could be used for. If you have a particular application in mind, it's probably possible; it's a pretty amazing technique, and we very much enjoy discussing this sort of thing with researchers. So I hope to hear from you all soon. Thanks very much, take care, and goodbye.
Info
Channel: Analytik
Views: 14,961
Keywords: spectral imaging, multispectral imaging, hyperspectral imaging, videometerlab, analytik, videometer, plant phenotyping, germination scoring, chlorophyll content, seed & leaf pathology
Id: SlYYmAKQBLM
Length: 51min 56sec (3116 seconds)
Published: Wed Jun 01 2016