Drone LiDAR: TrueView Accuracy Testing with Caltrans

Updated: May 4

Author: Lewis Graham

GeoCue Group Inc. has been involved in testing survey-grade instruments on small Unmanned Aerial Systems (sUAS or, more popularly, “drones”) with the California Department of Transportation (Caltrans) for the past several years. We have had the privilege of working with Dr. Riadh Munjy, the lead investigator on this project, on some deep questions concerning system accuracy. Dr. Munjy is the chair of the Civil & Geomatics Engineering Department at California State University, Fresno, and has been a primary contributor to the current state of the art in photogrammetry and laser scanning for many years.

Test Site Overview

As part of this research project, Dr. Munjy’s group at Cal State has laid out a fantastic aerial sensor test site at the San Joaquin Experimental Range (SJER). On my first visit to this range some time ago, I was looking for military weapons! After a bit of research, I realized the “Range” in SJER is cattle range! SJER is primarily an agricultural test site jointly managed by Cal State Fresno and a few Federal and State agencies. The sUAS test site comprises about 75 acres surrounding the SJER facilities cluster, providing some nice planar surfaces (sloped roofs) for examining LIDAR noise and strip-to-strip conformance. The terrain is high desert with some isolated trees and scrub brush, and the relief over the sUAS test area is about 35 meters. The sUAS test range is shown in Figure 1 with the layout of 3D control/check points shown as magenta triangles. I have also overlaid a set of rough 5 foot contours generated from a first-pass ground classification (automatic ground classification and contour generation performed in True View EVO) to give you an idea of the terrain relief – gently rolling.

Figure 1: San Joaquin Experimental Range sUAS Test Site


Steve Riddell (Algorithms lead for True View products and other duties as assigned!!) and I traveled to the range with a True View 410 and a True View 620 in hand in the last week of October. These systems are True View 3D Imaging Sensors (3DIS®) that comprise a LIDAR laser scanner and dual photogrammetric cameras. We brought along a DJI M600 Pro as our flight platform (I like the M600 Pro because, among other reasons, we can carry the batteries on a plane as cabin luggage).

We flew the True View 410 on October 26th and the True View 620 the next day (our True View 620 spent 24 nail-biting hours making trips around the West as lost luggage!). We flew parallel east-west flight lines with 50% overlap (scan data clipped to 40° off-nadir) and no cross strips; we were flying a pattern specified by the research group.

We processed the data to a colorized point cloud in our True View EVO post-processing software. EVO drives Applanix POSPac (desktop or cloud) to perform the trajectory solution, so you can stay in our wizard-driven software for the entire post-processing workflow. We processed the data using the Position and Orientation System (POS) solution only – we did not inject any ground control points. We were able to measure both vertical and horizontal accuracies using the tools built into True View EVO for ASPRS-compliant accuracy assessment.
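The core of a vertical accuracy assessment of this kind is simple: difference the LiDAR-derived elevations against surveyed check point elevations and summarize the residuals. A minimal sketch of that arithmetic is below; the function name and the residual values are illustrative placeholders, not the Caltrans data set, and True View EVO's actual implementation is certainly more involved.

```python
# Hedged sketch of an ASPRS-style vertical accuracy summary.
# Residuals are (LiDAR-derived elevation - surveyed check point elevation).
import math

def vertical_accuracy_stats(residuals_cm):
    """Return (mean error, standard deviation, RMSE) of vertical residuals."""
    n = len(residuals_cm)
    mean = sum(residuals_cm) / n
    # Population standard deviation about the mean error
    sd = math.sqrt(sum((r - mean) ** 2 for r in residuals_cm) / n)
    # RMSE is taken about zero, not about the mean
    rmse = math.sqrt(sum(r * r for r in residuals_cm) / n)
    return mean, sd, rmse

# Illustrative residuals at five check points (cm)
mean, sd, rmse = vertical_accuracy_stats([3.1, -1.2, 4.0, 2.2, -0.5])
```

Note that the RMSE is computed about zero while the standard deviation is computed about the mean error; keeping both lets you separate systematic bias from noise, which matters for the debiasing discussion later.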

Accuracy Assessment Summary

A summary of vertical accuracy is contained in Table 1 below. I will report on horizontal accuracy in a future True View Bulletin. It is important to note the conditions of the accuracy assessment below:

  • All control was set with a reference independent of the base station used for positioning the True View 3D Imaging Systems

  • Inverse Distance Weighted (IDW) interpolation with an influence radius of 2 ft was used for probing the LIDAR surface (note that ASPRS suggests a TIN, but a TIN is not a good idea with dense data – the subject of another article)

  • Five check points were withheld for the True View 410 and four for the True View 620 because those points were occupied by a base station or had overhead interference. The check points in Table 1 are after this withholding.

  • The nominal flight altitude was 75 meters above ground level (AGL)

  • A terrain following flight plan was used and thus the AGL remained relatively close to 75 m, regardless of ground height

  • The mission was flown at a speed of 5 m/s

  • The LIDAR data are not debiased.

  • The LIDAR data were limited to scan angles of ±40° (80° cross-track swath) when geocoded

  • The LIDAR data were not clipped or conditioned in overlap regions

  • No post-geocoding corrections (e.g. “TerraMatch”) were applied to the data

  • Neither sensor was calibrated after shipment to California from Huntsville
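The IDW probing mentioned in the conditions above can be sketched in a few lines: weight each LiDAR ground return within the influence radius by the inverse of its horizontal distance to the check point, and take the weighted mean of the elevations. This is a generic illustration, not True View EVO's implementation; the function name, the distance power of 2, and the 2 ft radius expressed as ~0.61 m are my assumptions.

```python
# Hedged sketch of inverse-distance-weighted (IDW) surface probing.
import math

def idw_probe(points, x, y, radius=0.61, power=2.0):
    """Estimate the ground elevation at (x, y) from LiDAR points (px, py, pz)
    within `radius` (~2 ft in meters) using inverse-distance weighting."""
    num = den = 0.0
    for px, py, pz in points:
        d = math.hypot(px - x, py - y)
        if d > radius:
            continue          # outside the influence radius
        if d < 1e-9:
            return pz         # exact hit: use the point's elevation directly
        w = 1.0 / d ** power
        num += w * pz
        den += w
    return num / den if den else None  # None if no points fall in the radius

# Illustrative probe: three nearby ground returns around a check point
z = idw_probe([(0.1, 0.0, 10.0), (0.0, 0.2, 10.2), (-0.3, 0.1, 10.1)], 0.0, 0.0)
```

The probed elevation always falls between the lowest and highest contributing returns, which is one reason IDW behaves better than a TIN on dense, slightly noisy point clouds: it averages across many returns instead of honoring three of them exactly.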

Table 1: Network Accuracy Results (All units in cm)


I find these results quite remarkable. We advertise the True View 410 as a 5 cm Network Accuracy system and here we see 4.18 cm RMSE with no debiasing! Similarly, we observe 1.95 cm Network Accuracy with the True View 620, again with no debiasing. As we know, if the standard deviation of the mean (SDOM) is small relative to the mean error, we can justify using check points to debias the data. Since RMSE² = Mean² + StdDev², we can easily determine the accuracy of the above tests should we debias.

Debiased Network Accuracy (in cm):

  • True View 410 = 2.33 cm (RMSE)

  • True View 620 = 1.90 cm (RMSE)
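The arithmetic behind those debiased numbers follows directly from the identity RMSE² = Mean² + StdDev²: subtracting the mean error (the bias) from every residual leaves the standard deviation as the new RMSE. A quick sketch, using the 4.18 cm and 2.33 cm True View 410 figures from above (the implied ~3.47 cm mean error is derived from them, not reported separately):

```python
# Debiasing arithmetic from the identity RMSE^2 = Mean^2 + StdDev^2.
import math

def debiased_rmse(rmse, mean_error):
    """RMSE remaining after the mean error (bias) is removed from all residuals."""
    return math.sqrt(rmse ** 2 - mean_error ** 2)

# True View 410: 4.18 cm RMSE with an implied ~3.47 cm mean error
# debiases to roughly the advertised 2.33 cm figure.
tv410 = debiased_rmse(4.18, 3.47)
```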

The other remarkable feature to note is the total residual error range for the two sensors. For example, the True View 410 exhibits a total error range of 10.8 cm. If we assume this range is distributed about the mean, the maximum excursion at any check point is less than 5.4 cm. I find this quite remarkable for data collected from an sUAS!


Now you may be thinking, “the True View 410 data are nearly as accurate as the 620 for a whole lot less money. Why would I get a True View 615 or 620?” The answer involves other important characteristics of the data – primarily hard-surface deviation (“noise”), low-energy return detection (e.g. detecting power lines) and the ability to penetrate vegetation. These factors are all much better with the True View 615/620.

This study once again adds support to the idea of rapid workflows for data production. We literally processed these data from sensor ingest to a contour map (which requires ground classification) in two hours’ time. Talk about high return on investment!