PDAL installation, pipelines and a bare-earth model

Video Statistics and Information

Captions
Hello guys, in this video I'm going to show you how to install PDAL, create a simple pipeline, and produce a GeoTIFF representing a digital terrain model using just the ground points of a classified LiDAR point cloud file.

PDAL is short for Point Data Abstraction Library. It is similar to GDAL, the Geospatial Data Abstraction Library used to process and handle raster and vector data, or to MDAL, the Mesh Data Abstraction Library for handling unstructured mesh data. PDAL is a C++ BSD-licensed library for translating and manipulating point cloud data. Its simplicity (despite the intricacies underneath its so-called pipelines) makes it an easy, powerful, free and open source option for processing LiDAR data. By the way, there is currently a crowdfunding project led by Lutra Consulting, North Road and Hobu aimed at integrating PDAL into QGIS; the integration will include a visualization and styling tool for point cloud data.

So let's get started. The quickest and easiest way to install PDAL is in a conda environment. To start, just download the appropriate version of conda for your machine. For this example I'm using Miniconda, which is a small version of Anaconda that includes conda, Python and other basic packages; with Anaconda, the full version, you get more than 700 packages for scientific computing. In my case I'm going to download the Windows version, choose the Miniconda installer, and take the 64-bit build. We want Miniconda because for PDAL we only need conda itself, but with Miniconda you can still install the packages that come with Anaconda, either separately or all at once.

With conda you can now set up a new working environment and install PDAL inside it. It is recommended never to work in your base install, or base environment. An environment is like a micro operating system in which you can install packages and run programs and commands. Conda is an open source package management and environment management system, and as such it helps you find and
install packages, so you don't have to worry about any of this. Once the installation is complete you can close the installer and start working right away. On Linux or macOS, after installing Miniconda, open a new terminal to start working. On Windows, open the Miniconda shell by searching for it in the search box of your taskbar.

First, let's check that the installation is up and running: just type conda info and a bunch of information should appear. The next step is to create the environment itself. You can create as many environments as you wish; the one below is for working with PDAL, but the same command creates environments for any other purpose. In this case I will create an environment called tutorial. To do that, type conda create -n tutorial -y and press Enter. The -n flag gives the name of the environment, and -y tells conda that you do not want to be asked for confirmation at any prompt.

Now let's install PDAL inside your environment. Once you have created the environment, you just have to activate it to start working inside it: type conda activate followed by the environment name, in this case tutorial. You will notice that the environment name now appears in parentheses at the beginning of the command line. To install PDAL, type conda install -c conda-forge pdal -y inside your working environment. The -c flag names the channel, or repository, from which conda will fetch the PDAL packages; conda-forge is a community-led GitHub organization containing repositories of conda recipes, including one for PDAL. Again, -y tells conda not to ask for confirmation. Press Enter to start installing PDAL and wait until the installation finishes. Finally, you can ensure that PDAL
works by listing the available drivers: inside your working environment (in this case tutorial), just type pdal --drivers.

The first obvious step is to create a working folder where you can save all your files. You can do this directly on the command line or with your operating system's file manager. In this case I'm going to create a new folder called tutorial inside my Documents folder, and then change directory into it. You can do the same thing using the Windows file manager.

PDAL uses the concept of pipelines to describe the reading, filtering and writing of point cloud data. A pipeline consists of a chain of processing elements arranged so that the output of each element is the input of the next; the name is by analogy with a physical pipeline. A PDAL pipeline is represented in a JavaScript Object Notation file, better known as JSON, an open-standard file and data interchange format. JSON uses human-readable text to store and transmit data objects consisting of attribute-value pairs and array data types, and you can describe almost anything with it by following the standard formatting; for instance, you could describe an individual's attributes. For PDAL, the structure may be either a JSON object with a key called pipeline, or directly a JSON array.

To create a pipeline you just have to write a few lines into a text file. On Windows you can use Notepad or Notepad++. On a Mac you can use TextEdit; just make sure that you switch the format to plain text every time, so the file is saved as Unicode (UTF-8) and you can give it whatever extension you wish. On a Mac you can also download a text editor similar to Notepad++. Copy the following text into a text editor, convert it to plain text if necessary, and save it as test.json. Obviously,
you want to save it inside your new folder. As you can see, if you are using Notepad++ the text is highlighted according to the extension you chose when saving. I'll leave a link in the description of this video to download a copy of this file.

This file is one big array; you can tell because it starts and finishes with square brackets. Inside the array there are several objects, each defined inside curly brackets as a collection of name-value pairs. As you may notice, every object has a key called type. In PDAL the type corresponds to what they call drivers, and each driver is a specific function or command. Our first object uses a driver called readers.ept; PDAL will interpret this object as a task to import data from an online source file. Rather than taking the whole dataset, though, it will focus on defined boundaries, given as a pair of arrays: the first array holds the minimum and then the maximum x coordinates, and the second the minimum and then the maximum y coordinates. These coordinates are expressed in the EPSG coordinate reference system of the dataset. This LiDAR file has already been classified.

I want to produce a hillshade of the bare-earth model. To do that, we create another object, where I filter the data using a driver called filters.range, with the classification limits ranging from 2 to 2.
That is, this will filter the point cloud and keep only classification number 2, which in LiDAR terms always refers to the ground. The pipeline runs each driver object separately but in sequence, so from this filtering step I can create another object, with the driver writers.las, to generate a new LiDAR file containing only ground points. Notice that I gave the filtering object a tag key, with the value classified; I can refer to that tag in the next object to connect the two. First I define a filename for the new LiDAR file, which will be stored in the working folder alongside test.json; then, using the key inputs, I reference the object tagged classified; then I tag this new object in turn, as writers_las; finally, the driver processes the data. The last object does something similar: its inputs reference the object tagged writers_las, and its filename is a GeoTIFF raster called test.tiff. The rest of its options belong to the driver type writers.gdal, which will generate the GeoTIFF.

Now that you know more or less what constitutes the pipeline, it's time to run it. You just have to invoke the command inside your conda (or Miniconda) environment. Go back to the command line; remember that we are working in the environment tutorial, inside the folder tutorial, and PDAL is already installed, so just make sure the JSON file you created is inside the tutorial folder. Now type pdal pipeline followed by test.json. You can run it as is, but in my opinion it is very handy to also add --debug, so you can follow the whole creation process and check for errors. Then press Enter. With the logging you can see what the program is doing: PDAL is running the
pipeline and has started the readers.ept driver. The query bounds are the bounds we defined, plus the z values of the model; we didn't specify those, so the program leaves them unbounded. It also tells us that this pipeline is running in stream mode, which means the points are processed in sequence rather than all at once.

Query bounds are really important when you work with EPT files, because most of these files are huge. Let me show you where I got this EPT file. This web page, from USGS and Entwine, lists about 1,280 resources totalling something like 21 trillion points. We are using a dataset from this region: it is called USGS LPC South Platte River Lot 5, so I'll copy that name and filter the list. As you can see, this dataset alone has 33 billion points, so unless you have a very powerful computer and don't mind waiting all day for the data to arrive, it is very useful to restrict the query with boundaries. The link for the file is available by copying the URL in the EPT column, and there are other renderings as well; for instance, if I click on Potree I can visualize the layer in the browser.

Let me check whether the run is done. Okay, it's finished, and we can see that it processed about 6 million points. Now let's see what it created. Looking in the folder, the program wrote two files: a LAS point cloud and a DEM raster in GeoTIFF format. We can visualize the GeoTIFF in QGIS. I already have QGIS open, so I'll just drag the layer in. This is a normal DEM file; let's visualize it as a hillshade. Go to the styling panel and change the renderer from Singleband gray to Hillshade, and there you go. I'm going to add a background layer so you can see more or less what it
means to classify and, in this case, keep just the ground. I'll turn off the test layer and zoom in here. For instance, here you have a bunch of houses and trees; in the digital terrain model the trees are absent and most of the house features are gone. Here you can see the difference.

And that's it. If you want to visualize the point cloud itself, you can use tools such as plas.io. To show you quickly: go to the plas.io web page and accept storing files on this device (this visualization tool only works in Chrome). Then browse your computer: I'll go to Documents, into the folder called tutorial, select the file we created, test.las, and open it. Wait a second, and there you go: this is the point cloud you created, without trees or houses. As you can see it is very quick. There are other options we can explore in another video. So that's it, and thanks for watching.
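For reference, the conda setup walked through above boils down to a handful of commands (the environment name tutorial is just the example used in the video):

```shell
# Check that conda itself is installed and working
conda info

# Create a fresh environment named "tutorial" (avoid installing into base)
conda create -n tutorial -y

# Activate it; the prompt will show (tutorial)
conda activate tutorial

# Install PDAL from the community-maintained conda-forge channel
conda install -c conda-forge pdal -y

# Verify the install by listing PDAL's available drivers
pdal --drivers
```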
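The pipeline described in the video can also be generated programmatically. The sketch below, using only Python's standard library, writes a test.json in the shape discussed above; the EPT URL, bounds and resolution are placeholders to replace with your own values, while the driver names (readers.ept, filters.range, writers.las, writers.gdal) and the Classification[2:2] limit syntax follow PDAL's conventions:

```python
import json

# Hypothetical EPT source and bounds; substitute your own values.
pipeline = [
    {
        "type": "readers.ept",
        "filename": "https://example.com/ept.json",        # placeholder EPT URL
        "bounds": "([-105.02, -105.00], [39.72, 39.74])",  # ([xmin, xmax], [ymin, ymax])
    },
    {
        "type": "filters.range",
        "limits": "Classification[2:2]",  # keep only ground points (class 2)
        "tag": "classified",
    },
    {
        "type": "writers.las",
        "filename": "test.las",
        "inputs": ["classified"],  # consume the tagged filter output
        "tag": "writers_las",
    },
    {
        "type": "writers.gdal",
        "filename": "test.tiff",
        "inputs": ["writers_las"],
        "resolution": 1.0,     # placeholder cell size
        "output_type": "idw",  # interpolation method for the raster
        "gdaldriver": "GTiff",
    },
]

# Write the pipeline to test.json in the working folder
with open("test.json", "w") as f:
    json.dump(pipeline, f, indent=4)
```

Running pdal pipeline test.json --debug on the result should behave the same as the hand-written file.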
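Since getting the query bounds right is what keeps an EPT request small, here is a tiny helper (a hypothetical convenience function, not part of PDAL) that formats min/max coordinates into the pair-of-arrays string that the bounds option expects:

```python
def format_bounds(xmin: float, xmax: float, ymin: float, ymax: float) -> str:
    """Format query bounds as a pair of arrays: first the min/max x
    coordinates, then the min/max y coordinates."""
    return f"([{xmin}, {xmax}], [{ymin}, {ymax}])"

print(format_bounds(-105.02, -105.0, 39.72, 39.74))
# -> ([-105.02, -105.0], [39.72, 39.74])
```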
Info
Channel: ishibaro
Views: 1,992
Keywords: pdal, qgis, gdal, geotiff, point cloud, lidar, usgs, pipelines, json, conda, miniconda
Id: QI4zqmaEbc4
Length: 27min 23sec (1643 seconds)
Published: Tue Sep 15 2020