The fundamental core of the SUE profession is a complete understanding of the principle of uncertainty. The SUE professional, whether a licensed Professional Engineer, a Professional Land Surveyor, or another qualifying professional, must understand this principle. These professionals operate under the relevant statutes in their respective states and sign off on the SUE deliverable, having assessed that each segment of a utility is, in their judgment, represented by the appropriate quality level of uncertainty. Each utility segment depicted on the final drawing is assigned a quality level, from QL-D to QL-A, under ASCE 38-02, Standard Guideline for the Collection and Depiction of Existing Subsurface Utility Data.

Geophysical methods stand second on the uncertainty scale at QL-B, surpassed only by exposing the utility through vacuum excavation or other means of safe excavation. Even QL-A does not represent absolute certainty that the utility segment is properly identified; as any SUE professional can tell you, detection and exposure sometimes yield the unexpected!

One of the most recent advancements in geophysical methods involves GPR array systems for large-scale or highly complex SUE investigations. As with any relatively new technology not yet universally integrated into an established practice such as SUE, its value must be demonstrated. This extends from the professional SUE provider to the end client paying for the deliverable. The degree of certainty that deploying the method adds value to the final deliverable must be apparent throughout the process. Certainty, by definition according to Merriam-Webster, is "the quality or state of being certain especially on the basis of evidence." Evidence from SUE professionals and researchers who have integrated these platforms into their projects is compelling enough that ASCE 38 and its upcoming updated version will include 3D depiction and collection methods such as multi-channel/array GPR.

Knowledge is inextricably linked to the reduction of uncertainty. Two-dimensional GPR is universally accepted and understood in the SUE profession, and the concept of the array is generally known. However, the very idea of 3D imaging seems like a quantum leap to the everyday GPR practitioner. Nothing could be more certain than this: it is not a quantum leap but rather a natural progression.

3D GPR Array Processing Demystified

In the world of geophysics as applied to SUE, one can get lost in the technical jargon involved, from field acquisition to the final 3D deliverable. The very word "geophysics" is scary to those who may not have enjoyed taking a mandatory physics class in high school! Remember, a GPR array is a GPR system. The difference from the systems used for everyday utility designation is a data density that is impossible to replicate with a 2D system. This is because true arrays use combinations of transmitters and receivers to saturate the ground with the GPR signal, as opposed to the single pair used in standard 2D systems. If one looks at the data from an individual channel in any array, it simply looks like a typical 2D GPR profile (Figure 1).

Figure 1. Raw 2D array data from two adjacent channels. Note the hyperbolas that appear in the data as if two 2D scans were performed side by side. Data were collected with an ImpulseRadar Raptor array at 35 mph in favorable GPR soils.

If one were asked to choose the most influential differentiator between processing and viewing ordinary 2D scans and 3D data, it would be signal migration. It is difficult in print media to visualize the concept of migration; however, it is nothing more than computationally moving the observed/recorded reflected GPR signal to its proper location in the 3D radargram. Because the array has many receivers, the energy from a reflector (i.e., a utility) is recorded at more than one receiver at a time. If the receivers are at different distances from the point of reflection, the arrival times will differ, and hence in the simple radargram, the farther away the receiver, the "deeper" that energy will be represented. Think of a shotgun firing at an object below and away from you (please do not try this at home), with a horizontal paper target on the ground surface large enough to capture all the pellets that ricocheted off the object below. If you recorded the time for each pellet to pierce the paper, those that ricocheted at a high angle upward would arrive first, while those at lower angles and farther down range would strike the paper farther from the point of impact and much later. But the fact is, all the pellets hitting the paper ricocheted off the same point.
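The geometry behind this analogy can be sketched in a few lines of Python. The function below is illustrative only (the names, depth, and velocity are assumptions, not taken from any GPR software): it computes the two-way travel time from an antenna at horizontal offset x to a single buried point reflector, showing why one point draws out a hyperbola in the radargram.

```python
import math

def two_way_time(x, x0=0.0, depth=1.0, v=0.1):
    """Two-way travel time (ns) from an antenna at offset x (m) to a
    point reflector at (x0, depth) in soil with velocity v (m/ns)."""
    return 2.0 * math.sqrt(depth ** 2 + (x - x0) ** 2) / v

# Sample arrival times as the antenna passes over the reflector: the
# echo is earliest directly above it and later off to either side --
# the familiar hyperbola seen in a 2D profile.
times = [two_way_time(x * 0.1) for x in range(-10, 11)]
apex = min(times)  # earliest arrival, directly over the reflector
```

With these assumed values (depth 1 m, v = 0.1 m/ns), the apex arrives at 20 ns, while a trace 1 m to the side records the same reflector at roughly 28 ns, i.e., seemingly "deeper."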

If we captured those pellets in the paper and moved them back (migrated them) to the point where they ricocheted, we could cluster them about the object. Well, voila! We do this very thing with the reflected GPR signal, clustering it all about the point source in the subsurface.

Figure 2a. Unmigrated data depicting classic hyperbolas from potential utilities (Condor post-processing software).

This is exactly how we migrate the energy from those important hyperbolas you see every day with your 2D systems (in favorable soils and, in the case of a utility, perpendicular to the GPR traverse) into a point: the tails/legs are the scattered pellets, and the apex is the actual point of impact. Figures 2a and 2b show how selected targets are migrated to cluster the signal at the point of origin. An important component of performing this operation effectively is the velocity of the signal in the soil. The beauty of migration is that the best result is only obtained with the most accurate velocity, which in turn yields the most accurate depths to the utilities and other objects designated in the data volume.
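To make the pellet-gathering concrete, here is a toy diffraction-summation (Kirchhoff-style) migration in Python. It is a minimal sketch under stated assumptions, not the algorithm inside Condor or any commercial package: every candidate image point sums the amplitudes recorded along the diffraction hyperbola that point would have produced, so a hyperbola of echoes collapses back toward its apex. Note that the velocity v is the critical input, exactly as described above.

```python
import math

def migrate(section, dx, dt, v):
    """Toy diffraction-summation migration of a 2D GPR section.

    section[trace][sample] holds amplitudes; dx is the trace spacing (m),
    dt the sample interval (ns), and v the soil velocity (m/ns). Each
    candidate image point accumulates the amplitudes recorded along the
    diffraction hyperbola that point would have produced.
    """
    n_tr, n_s = len(section), len(section[0])
    image = [[0.0] * n_s for _ in range(n_tr)]
    for ix in range(n_tr):            # candidate lateral position
        for iz in range(n_s):         # candidate two-way-time sample
            z = v * iz * dt / 2.0     # depth implied by that sample
            total = 0.0
            for jx in range(n_tr):    # sum along the hyperbola
                t = 2.0 * math.sqrt(z ** 2 + ((jx - ix) * dx) ** 2) / v
                js = round(t / dt)
                if js < n_s:
                    total += section[jx][js]
            image[ix][iz] = total
    return image

# Synthetic section: one point reflector 1 m deep under trace 10.
n_tr, n_s, dx, dt, v = 21, 60, 0.1, 0.5, 0.1
section = [[0.0] * n_s for _ in range(n_tr)]
for jx in range(n_tr):
    t = 2.0 * math.sqrt(1.0 + ((jx - 10) * dx) ** 2) / v
    section[jx][round(t / dt)] = 1.0  # the spread-out hyperbola of echoes
image = migrate(section, dx, dt, v)
```

Running this, the migrated image peaks at trace 10, sample 40, the true reflector position, where the energy from all 21 traces stacks coherently; at every other candidate point the hyperbola samples fail to line up and the sums stay smaller.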

Figure 3. Migrated hyperbolas from Figure 2 depicted as a depth slice 2 feet below the road surface (Condor post-processing software).

Remember, with a GPR array the data form a 3D volume, so if we migrate/cluster all the signal back to where it came from in 3D space, across multiple swaths of the array down the street, an "image" of the actual utility or other feature is realized (Figure 3). Of course, this is not an image in the sense of a photograph, but one can assume with a high degree of certainty that the feature is a utility. One does not have to be a geophysicist to make a professional judgment as to whether the image is a utility. The degree of uncertainty in making this judgment is quite low.
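As a rough sketch of how a plan-view depth slice is pulled from a migrated volume (the function, units, and velocity here are illustrative assumptions, not the workflow of any particular package): the requested depth is converted to a two-way travel time using the migration velocity, and the nearest time sample is read out across the whole grid.

```python
def depth_slice(volume, depth_ft, dt_ns, v_m_per_ns):
    """Plan-view slice of a migrated 3D GPR volume at a given depth.

    volume[y][x][sample] holds amplitudes; dt_ns is the sample interval
    and v_m_per_ns the soil velocity used during migration. The depth
    (in feet, as on SUE drawings) is converted to a two-way travel time
    and the nearest sample is read out across the grid.
    """
    depth_m = depth_ft * 0.3048            # feet -> meters
    t = 2.0 * depth_m / v_m_per_ns         # two-way travel time (ns)
    js = round(t / dt_ns)                  # nearest time sample
    return [[trace[js] for trace in row] for row in volume]

# Tiny synthetic volume: a linear feature (a "pipe") at about 2 ft
# depth running across the grid at row y = 2.
ny, nx, ns = 5, 5, 30
volume = [[[0.0] * ns for _ in range(nx)] for _ in range(ny)]
for x in range(nx):
    volume[2][x][24] = 1.0  # sample 24 ~ 2 ft for v = 0.1 m/ns, dt = 0.5 ns
plan = depth_slice(volume, depth_ft=2.0, dt_ns=0.5, v_m_per_ns=0.1)
```

The resulting slice shows a bright line across row 2 and nothing elsewhere: the plan-view "image" of the pipe, analogous to what a depth slice such as Figure 3 presents at full scale.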

GPR Array Processing Software is the Key to Value

It is clear a GPR array can deliver consistent, reasonable images of utilities using time-tested processing routines, in favorable soil conditions and where the target geometry is detectable at the frequency of the system. Even in marginal or poor soils, these same routines create images of trench features, although direct detection of the utility itself may not be achieved. However, the degree of uncertainty as to whether a utility exists at that location is greatly diminished.

Until recently, the highest degree of uncertainty in using 3D arrays involved how, and for how long, the data are processed, owing to the complexity of the software. Understanding how soil conditions affect GPR array performance is not wholly different from succeeding at designation with your 2D system, so that level of uncertainty can easily be managed with test scans or knowledge from having worked in the area before. As mentioned earlier, the real uncertainty is how the collection of these additional data affects the cost, and the overall value of the additional information to the project.

Experience with arrays has proven that the highest-cost component of a 3D GPR array survey is not the field data collection. Arrays today can survey at up to posted speed limits with RTK GPS, allowing miles of data to be collected in a fraction of an hour, and the need for traffic control is eliminated in most instances, further reducing costs. Because these new systems have made collection so fast, until recently the processing could easily account for half of the additional cost of implementing the 3D array itself.

This rapid and efficient data collection creates a large volume of data that, when processed, allows the extraction and designation of numerous subsurface utilities and utility features such as vaults, trench lines, buried valves, and manholes. The effort expended to get to the finish line until recently translated to roughly 6 to 10 times the field data collection time. The billable rate for a senior person with the knowledge and skill to perform these processes tends to be at the higher end of a company's spectrum; hence, the cost increases substantially, and the real or perceived value to the client may be difficult to convey.

A factor in the time expended to process the data was software tailored to geophysicists, who are accustomed to programs requiring a deep knowledge of signal processing. These programs paid scant attention to simplifying workflows or automating where possible without compromising the quality of the output. This exactly mirrors the trend in 2D GPR, where early adopters in the SUE industry endured very expensive systems with overly complex user interfaces and cumbersome hardware ill-suited to efficient data collection in streets and utility corridors. Recent advancements in GPR post-processing software are reducing processing time by up to 50 percent! As with any technology, this will continue to trend downward, but for now it is certainly a quantum leap.

The maturation and refinement of 3D processing software for GPR arrays is fundamentally reducing uncertainty about the value of 3D GPR arrays by reducing the overall cost of the extra dimension. This is certain to please SUE professionals and their clients.

This article was originally published in dp-PRO, Fall 2020.