Abstract—This paper is concerned with generating a continuous implicit representation of a robot's workspace from sparse point cloud data. We adopt a Gaussian Process (GP) framework to model the underlying workspace surfaces and propose a non-parametric formulation which allows us to capture the non-functional relation between the ground plane and elevation, something that is not possible with, for example, terrain mapping algorithms. The point clouds we process are typically vast, so naive application of a Gaussian Process is computationally intractable. We therefore use a mixture-of-GPs model in which individual GP support sets are chosen via segmentation in both laser and appearance space. We provide results that highlight the robustness of our algorithm, and apply our framework to semantically guide resampling of the workspace.
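The motivation for the mixture-of-GPs model is that exact GP regression scales cubically with the number of training points, so fitting one GP to a full scan is infeasible. A minimal sketch of the idea, using NumPy only: partition the data into segments (standing in for the paper's laser/appearance segmentation, which is not reproduced here), fit an independent GP with a squared-exponential kernel per segment, and route each query point to the GP of its segment. Kernel hyperparameters and the 1-D setting are illustrative assumptions, not the paper's actual configuration.

```python
import numpy as np

def rbf_kernel(a, b, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d2 = (a[:, None] - b[None, :]) ** 2
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def gp_predict(x_train, y_train, x_test, noise=1e-2):
    """Posterior mean of standard GP regression at x_test."""
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    K_star = rbf_kernel(x_test, x_train)
    return K_star @ np.linalg.solve(K, y_train)

def mixture_gp_predict(x, y, labels, x_test, test_labels):
    """Predict with one independent GP per segment (support set).

    `labels` assigns each training point to a segment; in the paper these
    support sets come from segmentation in laser and appearance space,
    whereas here they are simply given.
    """
    y_pred = np.empty_like(x_test)
    for seg in np.unique(labels):
        train_mask = labels == seg
        test_mask = test_labels == seg
        if test_mask.any():
            y_pred[test_mask] = gp_predict(
                x[train_mask], y[train_mask], x_test[test_mask]
            )
    return y_pred
```

Because each GP only inverts the covariance of its own support set, the cost drops from cubic in the total point count to cubic in the (much smaller) segment size, which is what makes the approach viable on vast point clouds.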

Gaussian Process support sets chosen by image segmentation shown in: (Left) beam space manifold, parameterised by time and beam angle; (Middle) beam space unwrapped; (Right) 3D Euclidean plot of the laser data. All figures are coloured according to the support regions.


  • [PDF] M. Smith, I. Posner, and P. Newman, “Generating Implicit Surfaces from Lidar Data,” in Towards Autonomous Robotic Systems, Plymouth, UK, 2010.
    [Bibtex]

    @inproceedings{SmithEtAl-TAROS10,
    Address = {Plymouth, UK},
    Author = {Mike Smith and Ingmar Posner and Paul Newman},
    Booktitle = {Towards Autonomous Robotic Systems},
    Keywords = {Efficient Large-Scale 3D Reconstruction, conference_posner},
    Month = {August},
    Pdf = {http://www.robots.ox.ac.uk/~mobile/Papers/SmithPosnerNewman_TAROS2010.pdf},
    Title = {Generating Implicit Surfaces from Lidar Data},
    Year = {2010}}