How to Build a 3D Interactive App in Python: Point Cloud Feature Extraction Tutorial (Part 2)

Video Statistics and Information

Captions
Hey friends, welcome back to part two of this tutorial on feature extraction and building an interactive app for 3D point clouds in Python. Last time we covered everything up to pre-processing, point cloud feature extraction, neighborhood definition, and the relative features; if you missed that, I encourage you to check out the other video, which covers feature extraction in full. Now we get to the really fun part. First we automate everything: we take what we did, loop it, and scale it to the full point cloud. Then we dive into step eight, visualization and interaction: visualizing in Python and creating thresholds that we will use to extract parts of the data based on the features we computed. After that, we export part or all of the point cloud together with its features, so that we can reuse it later in a segmentation workflow. I will show you use cases and applications for labeling your dataset, and the goal is to be able to leverage these features for your 3D machine learning projects and for creating 3D AI models. If that sounds good to you, let's get started: we are already at step seven.

The next stage is to automate, so that we can run everything on the full point cloud, prepare the feature matrix, and then visualize. To automate on the full dataset, we prepare a feature matrix, and here I am giving you a nice trick that helps compress the number of libraries we need. You could do this very easily with pandas as well, but I will show you how to use structured arrays with NumPy. We create a dictionary, DT, that holds names and formats: every format is a float, and the names are the nine features we compute on the point cloud: planarity, linearity, omnivariance, verticality, nx, ny, nz, d_high, and d_low. I then initialize an empty NumPy array with the length of the point cloud (say 100,000 points) and the dtype DT, which creates and initializes all the columns, and I set every feature to NaN (not a number). If we check the features array now, we see planarity, linearity, and so on, and what is really nice is that we no longer need numeric indexes to select an element: we can use the name of the feature directly, for example features["planarity"], and it extracts that column much like a DataFrame would. This makes the code clearer and guarantees that we select the right feature.

Now let's loop and automate. The first thing is to compute the number of points that we will use in the loop. We time the loop, and for each index in range(pts_number) (around 500,000 points here), we extract the neighborhood, run the PCA we defined before, and store the result in two small variables. Then we pass the featuring function (the PCA featuring from part one) and make sure it fills our features: features["linearity"][index], and so on. Finally, if the number of neighbors is above two (so at least three), we make the selection and compute the feature_high and feature_low values as we did before.
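For reference, here is a minimal, self-contained sketch of this step, under stated assumptions: the feature names follow the video, but the k-nearest-neighbor search (via SciPy here), the exact eigenvalue formulas, and the random stand-in points are illustrative choices, not necessarily the author's part-one functions.

```python
import numpy as np
from scipy.spatial import KDTree

# stand-in point cloud; replace with the (N, 3) array loaded in part one
points = np.random.rand(1000, 3)

# structured-array "feature matrix": one named float column per feature
feature_names = ["planarity", "linearity", "omnivariance", "verticality",
                 "nx", "ny", "nz", "d_high", "d_low"]
DT = {"names": feature_names, "formats": [np.float32] * len(feature_names)}
features = np.empty(len(points), dtype=DT)
for name in feature_names:
    features[name] = np.nan                    # initialize every column to NaN

tree = KDTree(points)
k = 20                                         # neighborhood size used in the video

for i in range(len(points)):
    _, idx = tree.query(points[i], k=k)        # k nearest neighbors of point i
    nbrs = points[idx]
    eigval, eigvec = np.linalg.eigh(np.cov(nbrs.T))
    eigval = np.clip(eigval, 0, None)          # guard against tiny negative values
    l1, l2, l3 = eigval[::-1]                  # eigenvalues in descending order
    if l1 <= 0:
        continue
    features["linearity"][i] = (l1 - l2) / l1
    features["planarity"][i] = (l2 - l3) / l1
    features["omnivariance"][i] = (l1 * l2 * l3) ** (1.0 / 3.0)
    nx, ny, nz = eigvec[:, 0]                  # normal = eigenvector of smallest eigenvalue
    features["nx"][i], features["ny"][i], features["nz"][i] = nx, ny, nz
    features["verticality"][i] = 1.0 - abs(nz)
    features["d_high"][i] = nbrs[:, 2].max() - points[i, 2]   # distance to highest neighbor
    features["d_low"][i] = points[i, 2] - nbrs[:, 2].min()    # distance to lowest neighbor

print(features["planarity"][:5])               # columns are selected by name
```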
At the end of the loop we time and print the computation, and that is it for the looping. I went quickly here, so don't hesitate to dig deeper to understand what I did. Executing this takes around two minutes, and then we have the computation on the full point cloud: it took a little under two minutes to compute all the features.

Now it is time to check what that means with PyVista. This is a very quick way to plot everything: I just change the feature each time, reuse the line from before, and render every point as a sphere. The first feature is the distance to the high point, and you can already see that you will be able to leverage it to extract the trees quite nicely. The second is the planarity: all the ground is mostly planar, the trees much less so, and the houses depend on which spot you look at. There is some noise, which means we could adjust the scale: we used 20 nearest neighbors for all of these PCA-based features, which may not be the best strategy, and a radius search might give something better. Next is the verticality. It is a bit redundant with planarity, and note that I forgot to take the absolute value of the normal, so the one-minus term produces a duplicated pattern; that could be fixed easily, but I will leave it as is. It still separates the ground a bit more distinctly than planarity, so we could use it to keep only the ground points. Then the omnivariance: at first it looks like something went wrong, but it is just that a few outlier points skew the distribution. For example, I can set the log-scale option to true, comment out the rest, and check again: the outliers are still there and they still annoy us, but they no longer dominate the rendering. That is it for the first visualization pass.
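As a hedged sketch of this quick rendering step: the names pcd_pv and d_high are taken from the video, the random arrays are only stand-ins for the real data, and the keyword values are illustrative.

```python
import numpy as np
import pyvista as pv

points = np.random.rand(1000, 3)          # stand-in for the real point cloud
d_high = np.random.rand(1000)             # stand-in for the computed feature

pcd_pv = pv.PolyData(points)              # wrap the points as a PyVista object
pcd_pv["d_high"] = d_high                 # attach the feature as a named scalar field

# one-liner: render the points as spheres, colored by the chosen feature;
# for heavily skewed features such as omnivariance, log_scale=True can help
pcd_pv.plot(scalars="d_high", render_points_as_spheres=True, point_size=5)
```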
Now I want to show you something super interesting with PyVista, which is what I showed at the beginning: the ability to use a threshold. We create a kind of little app to threshold the features and segment the cloud with histogram thresholding, using the features that you computed. The first thing to do, and it is important to understand this, is that we have to pass the features into our PCD_pv variable and give them relevant names, for example "d_high", and the same for the low one; for each we pass the corresponding scalar field from our feature set, feature_high and feature_low. Before executing, we can check feature_high: yes, there is the odd NaN, but we have plenty of values. To see how this behaves, we cannot call plot directly; we need to set up a plotter object with pv.Plotter(), and then we add an interactive thresholding tool to show everything.

For that we call p.add_mesh_threshold on the plotter (it is a mesh object even though we have no triangles, only points) and pass a bunch of parameters: the first is our PCD_pv variable, the second is the scalars we want to use, in this case "d_high" (or "d_low"; let's keep the high one), then we give a title such as "Distance to high point (m)", set all_scalars to true, still render the points as spheres, increase the point size, and finally call show. Are you ready for what happens? I press that and, beautiful, you get your scene with the ability to filter it based on exactly the distance to the high point. That is very interesting because we could use it to filter out exactly the points we want to keep. Let's do the same with d_low: in that case we have the distance to the low point, which means we could get rid of all the ground points very easily. You also get the values and the color ramp; imagine what you can do with only that. It makes it very convenient to label your dataset, to segment, or simply to explore the data quickly.

Now there is a trick that I spent a lot of time figuring out for you: interactive selection and persistence, meaning a selection made after thresholding that you can keep working with afterwards. To do that, we do the same thing except that we add one specific line, enable_mesh_picking, before the show call. Now I can threshold, for example to get rid of the ground, then press "p", and all the remaining points are stored in an object that persists when I close the window. We can then work with that element by calling p.picked_mesh: the last picked object is saved as the selection. I am not sure this works in every situation, and you cannot stack selections, but you can export the selection, and that is what I do here. I define a selection path, results/segmentation_live.obj, create a new plotter, add the picked mesh (the selection) to that plotter only, and export the object as OBJ (I could not export to PLY here, only OBJ); because of this rather odd structure, it is the only way I have found so far that works straight out of the box. When I first executed it I made a mistake, the error said "unable to open", because the path should be results/segmentation_live.obj. Now, if I go into results, I have segmentation_live.obj, which I can open with MeshLab or another external tool, and as you can see the ground is gone and only the part I kept is there. So it works: we have the selection coming out of the thresholding, and that is marvelous. The last thing I want to show is loading that file back and visualizing it, so you know you can make the round trip: PCD_pv2 reads the selection path, and I plot it with lighting; as you can see, we have our selection rendered in red. Marvelous, beautiful.
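A hedged sketch of this threshold-and-pick workflow, reusing the pcd_pv object from the previous snippet; the title, point size, and output path are illustrative, and the results folder must already exist.

```python
import pyvista as pv

p = pv.Plotter()
p.add_mesh_threshold(
    pcd_pv,                               # point cloud with the scalar fields attached
    scalars="d_high",
    title="Distance to high point (m)",
    all_scalars=True,
    render_points_as_spheres=True,
    point_size=5,
)
p.enable_mesh_picking()                   # press "p" in the window to pick the thresholded part
p.show()

selection = p.picked_mesh                 # the last picked object persists after the window closes
if selection is not None:
    exporter = pv.Plotter()
    exporter.add_mesh(selection)          # add only the selection to a fresh plotter
    exporter.export_obj("results/segmentation_live.obj")

# round trip: read the exported selection back and display it
pcd_pv2 = pv.read("results/segmentation_live.obj")
pcd_pv2.plot(color="red", lighting=True)
```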
Now let's move on to step nine; we are getting close to the end. This step is about merging the features with the coordinates and saving the results, and there are a few things to take care of. The first is to import a specific little NumPy module called recfunctions; you will see why in a moment. We then initialize, the same way we did for the features, a structured array holding our X, Y and Z coordinates, because that is the only efficient way to combine two structured arrays. So I create, not a DataFrame, but a structured NumPy array matching the structured feature array, and I concatenate them: first I initialize an empty coordinates array, then I fill it with the X, Y and Z of the points in the PV variable, and after that the featured point cloud is obtained by merging the coordinate array and the feature array with the rfn merge function that I imported. The last stage is to save (not plot) with NumPy as an ASCII .txt file in the results folder; I call it pcd_with_features, pass the merged variable, set the format to float and the delimiter to a comma, and for the header I do something that looks a bit odd: I build a string joined by commas from the names in DT2 and DT. When that finishes, the file sits in the results folder, which is nice. If I now go into the results folder, I have my pcd_with_features file, which I can open directly in CloudCompare, with X, Y, Z, planarity, linearity, omnivariance and verticality columns.
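A hedged sketch of this merge-and-save step, reusing the points and features arrays from the earlier snippets; the file name, delimiter, and float format are illustrative choices.

```python
import numpy as np
from numpy.lib import recfunctions as rfn

# structured array for the coordinates, mirroring the feature-array layout
DT2 = {"names": ["x", "y", "z"], "formats": [np.float32] * 3}
coords = np.empty(len(points), dtype=DT2)
coords["x"], coords["y"], coords["z"] = points[:, 0], points[:, 1], points[:, 2]

# field-wise concatenation of the two structured arrays
pcd_features = rfn.merge_arrays((coords, features), flatten=True)

# ASCII export that CloudCompare can read: one comma-separated column per field
header = ",".join(pcd_features.dtype.names)
np.savetxt("results/pcd_with_features.txt", pcd_features,
           fmt="%1.4f", delimiter=",", header=header, comments="")
```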
This is wonderful, truly a feat: if you made it all the way here, absolute kudos to you; you are among the few people able to do this in just a handful of lines of code, instead of relying on existing software that covers only a limited part of what you need. In CloudCompare I apply the import, skipping the first line and extracting the scalar-field names from it, and as you can see I now have my point cloud with the features that we computed, which is an absolute wonder. The RGB already existed, but now we also have the scalar fields (I dropped the RGB in this view): planarity; linearity, where you can identify the pole really nicely; omnivariance, where I have to filter a little so you can see it (I remove a noisy point with Edit > Scalar fields > Filter by value, then export), and you can see the omnivariance is high in this area, whereas something different is happening in the trees, so that will also be useful; verticality, also very nice; and d_high and d_low, which are beautiful.

From there, you can use these features to very quickly create a labeled dataset; given the time we have already spent, that will be for the next episode. You can also combine these engineered features with machine learning, not deep learning but classical 3D machine learning, to predict classes based on the labeled data you build with this thresholding method. This is an absolute blast, and all of it serves an application of ground and tree classification, which will be the objective. For example, it is very easy to use the planarity, or the verticality combined with a Z criterion, to filter out all the points on the ground, and then use d_high and d_low to extract everything else. I can show you what I did in a matter of minutes, truly: I saved it in the resources and will load it so you can see what result to expect. This is my classified result, produced in about one minute with this thresholding method: I have my trees, the ground, the man-made objects, and the noise. Once you have that, you have a labeled dataset that you created yourself, and you can feed it to an SVM, a random forest, or another classifier to train an AI model that does the heavy lifting for you on the next unseen dataset. That is super powerful.

So that's it: we went through the full set of feature extraction steps within this ten-step process, and I hope you enjoyed it. If you believe this is helpful, please subscribe to the channel to get more of it, leave comments, and share what you would like to see next, because I am trying my best to give you robust approaches with Python and 3D tech in general, to help you solve real challenges in your professional or learning life. I hope you enjoyed it, and see you in the next video, bye-bye.

And that's a wrap on this part two, covering feature extraction, visualization and interaction within Python, with PyVista and only three libraries in total. I truly hope it gave you a lot of value. You now have a very powerful mechanism, or workflow, whichever term you prefer, to attack really cutting-edge problems linked with 3D point cloud datasets. If it helped you in any way, don't hesitate to subscribe to the channel (I am trying to push more videos in this vein) or to leave a comment and tell me what you would like me to teach or shoot a video about, and I will do my best to deliver on it. Keep pushing the frontiers of what we can do with 3D data, and let's see each other in the next video. [Music] Bye-bye.
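The captions above mention training an SVM or random forest on the labels produced by thresholding; as a hedged sketch of that follow-up step (the file names, the label source, and the column layout are assumptions, not the author's code), it could look like this:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# features exported earlier: x, y, z followed by the nine engineered features
data = np.loadtxt("results/pcd_with_features.txt", delimiter=",", skiprows=1)
X = data[:, 3:]                                   # drop the coordinate columns
y = np.loadtxt("results/labels.txt")              # per-point labels from the thresholding session (assumed file)

clf = RandomForestClassifier(n_estimators=100, n_jobs=-1)
clf.fit(X, y)

# apply the trained model to a new scene prepared with the same feature pipeline
new_scene = np.loadtxt("results/new_scene_with_features.txt", delimiter=",", skiprows=1)
predictions = clf.predict(new_scene[:, 3:])
```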
Info
Channel: Florent Poux
Views: 916
Keywords: Point Cloud, LiDAR, 3D, Data Science, Modelling, Geospatial, Python, 3D Python, Vectorization, Segmentation, AI
Id: hIgRhew2V1Y
Length: 19min 42sec (1182 seconds)
Published: Thu Mar 21 2024