Shader-based LIDAR Simulation
AlphaPixel, together with Skew Matrix Software, contributed training and consulting in support of Ball's development of a realtime system allowing joystick-driven 'flight' of an airborne LIDAR mapping sensor over large-area landscapes built from USGS survey data. AlphaPixel subsequently provided similar support for the development of a shader-based implementation of material surface properties, allowing radiometrically accurate modeling of objects' response to LIDAR under varying lighting conditions.
The image depicts the LIDAR software performing realtime processing of OpenSceneGraph (OSG)-based simulated inputs, emulating flight of the LIDAR instrument on a helicopter over Ball Aerospace's headquarters building in Broomfield.
The small inset at the bottom is the view of the landscape from OSG, with a color-coded hazard overlay. The grayscale image is the raw "range image" frame as seen by the LIDAR, or in this case as produced by the simulator. The rainbow image is a point cloud of the 3D scene rendered in OpenGL from a different perspective.
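To illustrate the relationship between the two views, here is a minimal sketch (not the actual Ball/AlphaPixel code) of how a per-pixel range image can be back-projected into the kind of 3D point cloud shown in the rainbow view. It assumes a simple pinhole camera looking down -Z with a symmetric vertical field of view; the function name and parameters are hypothetical.

```python
import math

def range_image_to_points(ranges, fov_y_deg, width, height):
    """Back-project a per-pixel range image into 3D points.

    Illustrative sketch: assumes a pinhole camera looking down -Z
    with a symmetric vertical field of view. 'ranges' is row-major,
    one float per pixel, giving the distance from the sensor along
    that pixel's ray (0 or negative means no return).
    """
    aspect = width / height
    half_h = math.tan(math.radians(fov_y_deg) / 2.0)
    half_w = half_h * aspect
    points = []
    for row in range(height):
        # Normalized device y in [-1, 1], +y up
        ny = 1.0 - 2.0 * (row + 0.5) / height
        for col in range(width):
            nx = 2.0 * (col + 0.5) / width - 1.0
            r = ranges[row * width + col]
            if r <= 0.0:
                continue  # no LIDAR return for this pixel
            # Unit ray through the pixel, scaled out to the measured range
            d = (nx * half_w, ny * half_h, -1.0)
            norm = math.sqrt(d[0] ** 2 + d[1] ** 2 + d[2] ** 2)
            points.append((r * d[0] / norm, r * d[1] / norm, r * d[2] / norm))
    return points
```

Every recovered point lies exactly at its measured range from the sensor origin, which is why the point cloud can then be re-rendered from any other perspective.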
This second image depicts simulated flight over a synthetic lunar landscape as viewed through a realtime flash LIDAR sensor driving a Ball Aerospace hazard detection algorithm. The lunar terrain is rasterized through an OpenSceneGraph-based simulator that uses GLSL shaders to model how the flash LIDAR sensor would see the terrain. The resulting simulated input data is fed to the hazard detection software in lieu of actual (unobtainable!) lunar testing.
Acceptable landing areas (green) are determined in realtime from a combination of slope, local obstacles, and vehicle footprint.
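As a rough illustration of combining slope, local obstacles, and vehicle footprint (this is a hedged sketch, not Ball's proprietary algorithm; the function name, thresholds, and grid representation are all assumptions), landing-site classification over an elevation grid might look like:

```python
import math

def classify_landing(grid, cell_size, footprint_cells, max_slope_deg, max_rough):
    """Mark cells of a square elevation grid as safe/unsafe to land on.

    Illustrative sketch: slope is estimated per cell from central
    differences of the height field; 'roughness' (local obstacles) is
    the elevation spread within the vehicle footprint centered on the
    cell. A cell is safe only if both stay under the thresholds.
    """
    n = len(grid)
    half = max(footprint_cells // 2, 1)
    safe = [[False] * n for _ in range(n)]
    for i in range(half, n - half):
        for j in range(half, n - half):
            # Local slope from central differences on the height field
            dzdx = (grid[i][j + 1] - grid[i][j - 1]) / (2.0 * cell_size)
            dzdy = (grid[i + 1][j] - grid[i - 1][j]) / (2.0 * cell_size)
            slope = math.degrees(math.atan(math.hypot(dzdx, dzdy)))
            # Roughness: elevation spread under the vehicle footprint
            patch = [grid[r][c]
                     for r in range(i - half, i + half + 1)
                     for c in range(j - half, j + half + 1)]
            rough = max(patch) - min(patch)
            safe[i][j] = slope <= max_slope_deg and rough <= max_rough
    return safe
```

A flat patch classifies as safe (green), while a boulder anywhere inside the footprint, or a steep local gradient, marks the cell unsafe.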
See a video of the lunar LIDAR hazard detection software in operation: