Thursday 4 June 2015

Implicit Objects and the Marching Cubes Algorithm (implementation in C++)

Hello,

today I am going to talk about the algebraic representation of objects and the Marching Cubes Algorithm. This tutorial's code is available at:
https://github.com/Art-n-MathS/ImplicitObjects

The code is part of the open source software DASOS, which is able to reconstruct forests from full-waveform LiDAR data. For more information about DASOS please look at: http://miltomiltiadou.blogspot.co.uk/2015/03/las13vis.html

Please note that some of the explanations were copied from my paper:
Miltiadou, M., Warren, M.A., Grant, M., Brown, M., 2015. Alignment of Hyperspectral Imagery and full-waveform LiDAR data for Visualisation and Classification purposes, 36th International Symposium of Remote Sensing of the Environment, ISPRS Archives
Available at: http://www.int-arch-photogramm-remote-sens-spatial-inf-sci.net/XL-7-W3/1257/2015/isprsarchives-XL-7-W3-1257-2015.pdf

Implicit Representation of Objects


Numerical implicitization was introduced by Blinn in 1982. It allows the definition of complex objects, which can easily be modified, without involving a large number of triangles.

A function f(X) defines an object when, for every n-dimensional point X, the following conditions apply:

f(X) = a, when X lies on the surface of the object
f(X) > a, when X lies inside the object
f(X) < a, when X lies outside the object

where,
X = an n-dimensional point. It usually lies in Euclidean space, defining the x, y, z position of the point
f(X) = the function that defines the object of interest. In the coding example it is the function of a sphere
a = the isovalue, which defines the boundary (isosurface) of the object. f(X) is equal to a iff X lies on the surface of the object.
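
To make these conditions concrete, here is a minimal sketch of an implicit sphere in C++, assuming an isovalue of a = 0; the function name and signature are illustrative and do not necessarily match the repository's code.

#include <iostream>

// f(X) = r*r - (x*x + y*y + z*z), so that
// f(X) > 0 inside the sphere, f(X) = 0 on its surface and f(X) < 0 outside.
double sphere(const double x, const double y, const double z, const double r)
{
   return r*r - (x*x + y*y + z*z);
}

int main()
{
   const double r = 1.0;
   std::cout << sphere(0.0, 0.0, 0.0, r) << "\n"; //  1 -> inside
   std::cout << sphere(1.0, 0.0, 0.0, r) << "\n"; //  0 -> on the surface
   std::cout << sphere(2.0, 0.0, 0.0, r) << "\n"; // -3 -> outside
   return 0;
}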

Marching Cubes Algorithm:


Even though numerical implicitization is beneficial for reducing storage memory and for rendering the same object at various resolutions, visualising numerical/algebraic objects is not straightforward, since they contain no discrete values. This problem can be addressed either by ray-tracing or by polygonisation. In 1983, Hanrahan suggested a ray-tracing approach, which used the equation derived from the ray-surface intersection to depict the algebraic object in an image. The Marching Cubes algorithm was later introduced for polygonising implicit objects.

The Marching Cubes algorithm constructs surfaces from implicit objects using a search table. Assume that f(X) defines an algebraic object. At first the space is divided into cubes, named voxels. Each voxel is defined by eight corner points, which lie either inside or outside the object. By enumerating all the possible cases and linearly interpolating the intersections along the edges, the surface of the implicit object is constructed (Lorensen and Cline, 1987).
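
As an illustration of the search-table idea (not the actual implementation in the repository), the eight inside/outside tests of a voxel can be packed into an 8-bit index, which is then used to look up the intersected edges and the triangulation of that case:

#include <array>
#include <functional>

struct Vec3 { double x, y, z; };

// Classify the eight corners of one voxel against the isovalue a and pack
// the results into an 8-bit index (0-255), one bit per corner.
unsigned int cubeIndex(const std::array<Vec3,8> &corners,
                       const std::function<double(const Vec3&)> &f,
                       const double a)
{
   unsigned int index = 0;
   for (unsigned int i = 0; i < 8; ++i)
   {
      if (f(corners[i]) >= a)     // corner lies inside (or on) the object
      {
         index |= (1u << i);      // set bit i
      }
   }
   // The index selects one of the 256 possible corner configurations in the
   // search table, which lists the intersected edges to be triangulated.
   return index;
}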

In 2021, a study was published that investigates the performance of six different data structures for extracting an iso-surface (creating a 3D polygonal model) from voxelised data (Miltiadou et al., 2021). It was shown that Integral Volumes was the fastest of the six, but it consumes the most memory; each data structure has its own pros and cons. Additionally, in that paper a new category of data structures, "Integral Trees", was introduced.


For more information about the Marching Cubes algorithm please look at the following blogpost:
http://paulbourke.net/geometry/polygonise/



Example Output:


The output .obj file, visualised in MeshLab


Code


This tutorial's code is available at: https://github.com/Art-n-MathS/ImplicitObjects

In this example, a sphere is defined using an implicit function, polygonised using the Marching Cubes Algorithm and then exported into an .obj file (a minimal export sketch is given below, after the run instructions).

Requirements:
- gmtl library
- C++11
- qmake

Compile:
$: qmake
$: make clean
$: make

Run:
$: ./ImplicitObjects
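
For completeness, here is a rough sketch of the final export step, i.e. writing a triangle mesh to a Wavefront .obj file; the vertex and triangle containers below are assumptions and do not match the classes used in the repository.

#include <fstream>
#include <string>
#include <vector>

struct Vertex   { double x, y, z; };
struct Triangle { unsigned int v0, v1, v2; }; // zero-based vertex indices

void exportObj(const std::string &filename,
               const std::vector<Vertex> &vertices,
               const std::vector<Triangle> &triangles)
{
   std::ofstream out(filename.c_str());
   // One "v x y z" line per vertex.
   for (unsigned int i = 0; i < vertices.size(); ++i)
   {
      out << "v " << vertices[i].x << " "
                  << vertices[i].y << " "
                  << vertices[i].z << "\n";
   }
   // One "f i j k" line per triangle; .obj indices are one-based.
   for (unsigned int i = 0; i < triangles.size(); ++i)
   {
      out << "f " << triangles[i].v0 + 1 << " "
                  << triangles[i].v1 + 1 << " "
                  << triangles[i].v2 + 1 << "\n";
   }
}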


References

Miltiadou, M.; Campbell, N.D.F.; Cosker, D.; Grant, M.G. A Comparative Study about Data Structures Used for Efficient Management of Voxelised Full-Waveform Airborne LiDAR Data during 3D Polygonal Model Creation. Remote Sens. 2021, 13, 559. https://doi.org/10.3390/rs13040559

Blinn, J.F., 1982. A Generalization of Algebraic Surface Drawing. ACM Trans. Graph., pp. 235-256

Hanrahan, P., 1983. Ray tracing algebraic surfaces. ACM SIGGRAPH Computer Graphics, Vol 17, No. 3.

Lorensen, W.E., & Cline, H.E., 1987. Marching Cubes: A high resolution 3D surface construction algorithm. ACM SIGGRAPH Computer Graphics, Vol 21, No. 4, pp. 163-169


Miltiadou, M., Warren, M.A., Grant, M., Brown, M., 2015. Alignment of Hyperspectral Imagery and full-waveform LiDAR data for Visualisation and Classification purposes, 36th International Symposium of Remote Sensing of the Environment, ISPRS Archives






Monday 27 April 2015

Parsing a file into words (C++ Programming)

Parsing files usually seems difficult, so I found this way of parsing a file into words within a few lines, and it works perfectly. Keeping it here makes it more accessible the next time I need it. ;p

Let's assume that you have the following file that needs to be parsed:

My name is Milto
and I like Capoeira! :)

Then the following code will read the file and print each word on a separate line.


#include <iostream>
#include <fstream>
#include <iterator>
#include <string>
#include <vector>

int main(void)
{
   std::string filename = "input.txt";
   std::ifstream mystream(filename.c_str());
   if(!mystream)
   {
      std::cerr << "File \"" << filename << "\" not found.\n";
      return 1;
   }
   // istream_iterator<std::string> reads whitespace-separated words;
   // the default-constructed iterator acts as the end-of-stream sentinel.
   std::istream_iterator< std::string > it(mystream);
   std::istream_iterator< std::string > sentinel;

   // Copy all the words of the file into a vector in one go.
   std::vector< std::string > words(it,sentinel);

   // Print each word on its own line.
   for(unsigned int i=0; i< words.size();++i)
   {
      std::cout << words[i] << "\n";
   }

   return 0;
}

Compiling and running the code gives the following output:

$: g++ parsing.cpp -o out
$: ./out
My
name
is
Milto
and
I
like
Capoeira!
:)


You may download this example code from here:
https://www.dropbox.com/s/2chkdvjf2m0uhcp/parseFileIntoWords.zip?dl=0

Monday 23 March 2015

DASOS - Open Source Software for processing full-waveform LiDAR and hyperspectral images


What is DASOS?

DASOS is open source software that aims to ease the handling of full-waveform LiDAR data. Its name was derived from the Greek word δάσος (=forest). For any publication using the software please cite the following paper: Open source software DASOS: efficient accumulation, analysis, and visualisation of full-waveform lidar


The aim of this software is to enhance the visualisation and classification of forested areas using coincident full-waveform (FW) LiDAR data and hyperspectral images. It uses either full-waveform LiDAR alone or both datasets to generate metrics understandable by foresters, as well as 3D virtual models.

Influenced by Persson et al. (2005), voxelisation is an integral part of DASOS. The intensity profile of each full-waveform pulse is accumulated into a voxel array, building up a 3D density volume. Combining multiple pulses into a voxel representation produces a more accurate representation, which confers greater noise resistance and further opens up possibilities for vertical interpretation of the data. The 3D density volume is then aligned with the hyperspectral images using a 2D grid, similar to Warren et al. (2014), and both datasets are used in visualisations and classifications.
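
As a rough sketch of the voxelisation idea described above (the class layout and parameter names are assumptions, not DASOS's actual interface), each waveform sample adds its intensity to the voxel it falls into, gradually building up the 3D density volume:

#include <cmath>
#include <vector>

class DensityVolume
{
public:
   DensityVolume(int nx, int ny, int nz, double voxelLen,
                 double minX, double minY, double minZ)
     : m_nx(nx), m_ny(ny), m_nz(nz), m_len(voxelLen),
       m_minX(minX), m_minY(minY), m_minZ(minZ),
       m_intensities(static_cast<std::size_t>(nx) * ny * nz, 0.0) {}

   // Add one waveform sample: intensities of overlapping pulses accumulate,
   // which is what makes the volume more robust to noise.
   void addSample(double x, double y, double z, double intensity)
   {
      const int i = static_cast<int>(std::floor((x - m_minX) / m_len));
      const int j = static_cast<int>(std::floor((y - m_minY) / m_len));
      const int k = static_cast<int>(std::floor((z - m_minZ) / m_len));
      if (i < 0 || j < 0 || k < 0 || i >= m_nx || j >= m_ny || k >= m_nz)
      {
         return; // sample falls outside the volume of interest
      }
      m_intensities[(static_cast<std::size_t>(k) * m_ny + j) * m_nx + i] += intensity;
   }

private:
   int m_nx, m_ny, m_nz;
   double m_len, m_minX, m_minY, m_minZ;
   std::vector<double> m_intensities;
};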

There are three main functionalities of DASOS:

  • the generation of 2D metrics aligned with hyperspectral images,
  • the construction of 3D polygonal meshes, and
  • the characterisation of objects using feature vectors. 

Aligned Metrics

Here is a list of the available metrics:

From FW LiDAR (LAS1.3 or Pulswaves file formats):
- Non-Empty Voxels
- Density
- Thickness
- First Patch
- Last Patch
- Lowest Return
- Average Height Difference (works as an edge detection algorithm)
- AGC intensity


A visual explanation of the available full-waveform LiDAR metrics is given below:



There are also the following hyperspectral metrics (derived from .bil and .igm files):
- Hyperspectral Mean
- NDVI (a short sketch of the formula is given after this list)
- A single hyperspectral band
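
For reference, NDVI is computed per pixel from the red and near-infrared reflectance values as NDVI = (NIR - Red) / (NIR + Red). A minimal sketch is given below; which bands of the .bil file are treated as Red and NIR is sensor dependent and not shown here.

// Normalised Difference Vegetation Index for a single pixel.
double ndvi(const double red, const double nir)
{
   const double sum = nir + red;
   if (sum == 0.0)
   {
      return 0.0; // avoid division by zero over no-data pixels
   }
   return (nir - red) / sum;
}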




Polygonal Meshes

The following example is from the New Forest and was generated at one meter resolution. The first one was generated using FW LiDAR data and three user-defined bands of the hyperspectral images. The second one was generated with only the FW LiDAR data.




The following video was also rendered in Maya using a polygonal mesh exported from DASOS:




List of Feature Vectors


This is useful for characterising objects inside the 3D space (e.g. trees). For each column of the voxelised FW LiDAR data, information about its local area is exported. The exported format is .csv for easy usage in common statistical packages like R and MATLAB.

In the following images there are two output examples. The top one contains processed information about the data and the second file contains the raw intensity values. Additionally, each line is a feature vector.



Related Links

The full user guide is available at:
https://github.com/Art-n-MathS/DASOS/blob/master/DASOS_userGuide_v2.pdf

The windows executable and the code are available at:
https://github.com/Art-n-MathS/DASOS

To add your own customised 2D metrics, you may read the following tutorial:
http://miltomiltiadou.blogspot.co.uk/2016/07/how-to-add-metrics-to-dasos.html

For any questions regarding the usage of the software please use the following Google Group:
https://groups.google.com/forum/#!forum/dasos---the-native-full-waveform-fw-lidar-software

Updates about DASOS can be found @_DASOS_ on Twitter.

For more information about the algorithms please refer to the related paper:
https://www.researchgate.net/publication/334069759_Open_source_software_DASOS_efficient_accumulation_analysis_and_visualisation_of_full-waveform_lidar


Related Papers:


Acknowledgements

Special thanks are given to my industrial supervisor Dr. Michael Grant, who has supported me throughout my studies.

I would also like to thank my initial and current supervisors, Dr. Matthew Brown and Dr. Neill D.F. Campbell, and all the people who occasionally got involved: Dr. Mark Warren, Susana Gonzalez Aracil, Dr. Daniel Clewley and Dr. Darren Cosker.

This project is funded by the Centre for Digital Entertainment and Plymouth Marine Laboratory.

The data used were collected by the NERC Airborne Research and Survey Facility (ARSF). Copyright is held by the UK Natural Environment Research Council (NERC).


Please note that this text was taken and modified from the EngD thesis of Milto Miltiadou, which was submitted to the University of Bath in 2017.