David M. Abbott, Jr.
AIPG Ethics Columnist
Denver, Colorado, USA
Honesty is the principal geoscience ethical principle (Abbott, 2000). “The only ethical principle which has made science possible is that the truth shall be told all the time. If we do not penalize the false statements made in error, we open up the way, don’t you see, for false statements by intention. And of course, a false statement of fact, made deliberately, is the most serious crime a scientist can commit.” (C.P. Snow, The Search, 1959). Snow’s statement concisely summarizes the importance of honesty in science. The Search is a novel in which the protagonist, a young physicist, publishes results that are based on an assistant’s work that was not checked. When others attempt to repeat the results, errors in the original work are discovered. The young physicist’s previously promising career is ended because he has violated the honesty rule as quoted above. Verification of geoscience data and recognition that geoscience models, while very useful, have inherent limitations that should be described in papers and reports are, or should be, a fundamental part of any geoscience study or project.
Verifying the Data Used in a Model
Models of geologic processes are very useful in assisting geoscientists in organizing their data and interpretations. However, a model is only as good as the data put into it, so the first thing a geoscientist should do is verify that the data collected are in fact representative of the data type being collected (rock, soil, water, air, geophysical, etc.) and that the data collected are repeatable. Only then should the data be added to a model. The mining industry has for many years required descriptions in technical reports of the sampling processes, including the Quality Assurance/Quality Control (QA/QC) procedures employed to verify the reliability of samples used in the reports.
Roden and Smith (2001) point out that:
The key message that needs to be remembered in the area of field sampling is that errors introduced at this stage of the data generation process are, in most instances, the largest errors introduced into a program and that these errors cannot be rectified in the subsequent processing of the sample. Errors created in the field can only be rectified in the field.
Roden and Smith (2014) provide a detailed discussion of the various elements in a QA/QC program. Abbott (2007 & 2008) summarizes QA/QC procedures. Ensuring that the sampling and analytical procedures are repeatable is vital. Abbott (2009) describes the lack of repeatability of the Los Angeles Abrasion Test (ASTM C 131, AASHTO T 96) that for many years was one of the standard tests used by highway departments and others to characterize the suitability of an aggregate deposit for road construction. Swiger and Boll (2009) provide a detailed description of the problems involved in aquifer sampling.
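One common QA/QC check for repeatability is comparing routine samples against their field duplicates. The following is a minimal sketch of such a check, computing the relative percent difference (RPD) between duplicate pairs; the assay values and the 20% tolerance are hypothetical, and real programs set limits appropriate to the analyte and concentration range.

```python
# Sketch of one common QA/QC repeatability check: the relative percent
# difference (RPD) between a routine sample and its field duplicate.
# The assays and the 20% tolerance below are hypothetical illustrations.

def rpd(a: float, b: float) -> float:
    """Relative percent difference between two duplicate assays."""
    mean = (a + b) / 2.0
    if mean == 0:
        return 0.0
    return abs(a - b) / mean * 100.0

def flag_duplicates(pairs, tolerance=20.0):
    """Return the index and RPD of every duplicate pair exceeding tolerance."""
    return [(i, round(rpd(a, b), 1))
            for i, (a, b) in enumerate(pairs)
            if rpd(a, b) > tolerance]

# Hypothetical Cu assays (ppm): (routine sample, field duplicate).
pairs = [(105.0, 98.0), (250.0, 150.0), (12.0, 11.5)]
print(flag_duplicates(pairs))  # the 250/150 pair fails: RPD = 50.0%
```

Pairs that fail the tolerance signal that the sampling or analytical procedure is not repeatable at that site and, as Roden and Smith note, must be rectified in the field.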
The need to demonstrate reliability applies not only to chemically analyzed samples but also to geophysical and other types of data collection. Are your instruments properly calibrated, maintained, and cleaned? For example, one uranium exploration program that was finding no anomalous radiation was calibrating its detectors at the program's field camp, which was assumed to be away from any deposit. In fact, the camp sat on top of a large uranium deposit, so the calibration checks set the radiation background far too high.

I once discovered that a base station I was using for a ground magnetic survey, which was chosen for repeated magnetic readings to measure diurnal variations, was for some reason a point at which magnetic readings fluctuated dramatically over short time intervals and was not in fact a reliable base station. I learned that taking repeated readings over an extended period was required to determine whether a location would make a good base station.

The drill hole survey program used at a field site I visited involved determining the azimuth and inclination of the drill holes using a wooden rod inserted into the collar of the hole. All drill holes were designed with a northeast azimuth. However, the top few feet of one hole had been bent roughly 90° during removal of the rig. At this hole, the rod used to measure the azimuth pointed roughly southwest and at an odd inclination. The error was easily spotted, but the data required checking to spot the error.
Sampling the water content of snow pack presents another kind of sampling reliability problem. The traditional method of snow pack sampling is to regularly sample selected sites over the winter by collecting a sample of the snow pack from top to bottom with a tube sampler that is then weighed to determine water content. The problem with this sampling approach is that a site 10 feet away can yield very different results because of differences in exposure to sun and/or wind. Science News (2017) reported on a NASA-led program to address this problem in the measurement of the water in the snow pack.
The preceding examples demonstrate that QA/QC procedures should be included in any data collection program. Once one can demonstrate the reliability of one’s data, then the data can be used in a modeling program.
Models Involve Assumptions and Interpretations
Reed (2007) points out that "'Modeling' refers to the process of creating a spatial array of estimates." The modeled parameter being estimated may be a surface (topography, structure contour, etc.) in a 2-dimensional "grid model," or a 3-dimensional block model of a mineral deposit, an aquifer, several stratigraphic layers, etc. Computer programs now allow for the rapid creation and rotation of block models in clear pictures that appear to show reality. But they are not reality. Adams (1985) noted that "All models involve assumptions and interpretations of what is known at the time, and they are only as reliable as the data and science on which they are based. They are valuable springboards for exploration, new intuition, and improvement. However, no model is beyond continued improvement and challenge."
Reed (2007) cautions,
A healthy level of skepticism must be employed when using computer software to compute resource volumetrics. The algorithms or methods used to create the volumetric models have limitations that may be acceptable for one type of deposit while being completely inappropriate for another. For example, a sand and gravel deposit requires an approach that is completely different than the methods used to evaluate a phosphate reserve. The best way to avoid misuse is to always compare “slices” through the models with borehole logs that show the original data. These cross-sections are used to make sure that the model “honors” the data. Just as importantly, cross-sections should be evaluated to make sure that the modeling conforms to the expected geology.
Rahn (2000) and Oreskes (2000) point out that the limitations of computer models are frequently insufficiently discussed in the presentation of modeling results. This insufficient discussion of the limitations of models is a form of dishonesty: the omission of important facts needed to fully understand the statements made. Among the major limitations of computer modeling are the following:
- Models can never be validated; they can only be invalidated by comparison with actual data (Rahn, 2000).
- Models can be tweaked to fit the actual data—does the tweaking hide a bad model or improve a good model? (Oreskes, 2000).
- 3D models are interpretations, not the “truth.”
- Geology is heterogeneous and non-linear—if the mathematics conflict with geology, suspect the math.
In addition to avoiding false and misleading statements (including misleading by omission), scientists have the obligation to acknowledge what we don't know, as highlighted by the following two quotations.
- “Scientists frequently do not properly acknowledge the limits of what they really know and the uncertainties involved” (De Freitas, 2000).
- “It’s a kind of scientific integrity, a principle of scientific thought that corresponds to a kind of utter honesty—a kind of leaning over backwards. For example, if you’re doing an experiment, you should report everything that you think might make it invalid—not only what you think is right about it: other causes that could possibly explain your results; and things you thought of that you’ve eliminated by some other experiment, and how they worked—to make sure the other fellow can tell they have been eliminated.” (Feynman, 1974 & 1999).
Reed's comments are particularly pertinent because Reed is a developer of computer modeling software and uses his software in geological consulting. He is not deceived by his "pretty pictures."
Different models of the same data lead to different conclusions. This is well-illustrated for the public by the various models for forecasting hurricane storm tracks that are presented by television weather forecasters. Figure 1 is an example. The forecast storm tracks vary widely although most predict a turn to the northeast following landfall in South Carolina.
Figure 1: Different storm tracks for Tropical Storm Bonnie based on different forecasting models, http://derecho.math.uwm.edu/models/archive/2016/al022016/al022016_2016052906.png.
Figure 2 presents two different contour maps constructed from the same gridded data but using two different Surfer™ contouring algorithms: triangulation and inverse distance to the fourth power (ID4) (Abbott, 2004). Both maps show that the low point is in the middle, or somewhat right of the middle, of the top of the map and that there is a lumpy high ridge near the bottom of the map. Beyond that, the maps are quite different.
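The sensitivity of a gridded surface to the choice of algorithm is easy to demonstrate. The sketch below implements the textbook inverse-distance-weighting formula (it is not Surfer's implementation, and the three data points are invented for illustration) and shows that simply changing the distance exponent changes the estimated value at the same location from the same data.

```python
# Sketch of inverse-distance weighting (IDW), illustrating how an algorithm
# parameter changes the interpolated surface even though the input data are
# identical. Textbook formula only, not Surfer's implementation; the sample
# points are invented for illustration.

def idw(points, x, y, power):
    """Inverse-distance-weighted estimate at (x, y) from (px, py, value) points."""
    num = den = 0.0
    for px, py, v in points:
        d2 = (x - px) ** 2 + (y - py) ** 2
        if d2 == 0:
            return v  # exactly on a data point
        w = 1.0 / d2 ** (power / 2.0)
        num += w * v
        den += w
    return num / den

data = [(0.0, 0.0, 10.0), (10.0, 0.0, 50.0), (0.0, 10.0, 30.0)]

# Same data, same location, different exponent -> different estimate.
for p in (2, 4):
    print(f"ID{p} estimate at (2, 2): {idw(data, 2.0, 2.0, p):.1f}")
```

A higher exponent weights the nearest data point more heavily, pulling the estimate toward it; neither answer is "the truth," which is exactly the point of Figure 2.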
Figure 2: Two different contour maps using the same gridded data and different contouring algorithms.
Geach (2016) presents a similar example of using the same data to arrive at different conclusions. Geach states that “we must understand what happens in the ‘black box’ if we are to ensure the quality of our outputs. … This is most important when applying computer based algorithms to produce geological surfaces that are spatially incomplete and difficult to refine.” Geach illustrated his point by using three different interpolation techniques on the same data set to determine the volume between a geologic surface and the upper contact of an underlying formation. Table 1 presents Geach’s results.
Ordinary kriging and the inverse distance weighting function produced similar results, while the radial basis function produced a volume over twice that of the other methods. Which result is the best estimate is something the geoscientist working with the data will have to determine, and the basis for the decision should be explained. Alternatively, the geoscientist could present all three results with a discussion of the relative strengths and weaknesses of the algorithms used in this case.
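The effect Geach describes can be reproduced in miniature: integrate an interpolated thickness surface over a grid and the volume depends on the interpolator chosen. The sketch below uses two deliberately simple schemes (nearest-borehole and inverse distance squared) rather than the kriging, IDW, and radial basis functions Geach compared; the borehole data and grid are invented for illustration.

```python
# Sketch showing how the interpolation method alone changes a volume
# estimate, in the spirit of Geach's comparison. The boreholes, grid, and
# both interpolators are invented illustrations, not Geach's algorithms.

def nearest(points, x, y):
    """Thickness of the nearest borehole (piecewise-constant surface)."""
    return min(points, key=lambda p: (x - p[0]) ** 2 + (y - p[1]) ** 2)[2]

def idw2(points, x, y):
    """Inverse-distance-squared weighted thickness (smooth surface)."""
    num = den = 0.0
    for px, py, t in points:
        d2 = (x - px) ** 2 + (y - py) ** 2
        if d2 == 0:
            return t
        num += t / d2
        den += 1.0 / d2
    return num / den

def volume(points, interp, nx=20, ny=20, cell=5.0):
    """Integrate interpolated thickness over an nx-by-ny grid of square cells."""
    total = 0.0
    for i in range(nx):
        for j in range(ny):
            x, y = (i + 0.5) * cell, (j + 0.5) * cell
            total += interp(points, x, y) * cell * cell
    return total

# Hypothetical boreholes: (x, y, unit thickness in m).
holes = [(10, 10, 2.0), (80, 20, 6.0), (40, 70, 1.0), (90, 90, 8.0)]
print("nearest-borehole volume:", round(volume(holes, nearest)))
print("IDW (power 2) volume:   ", round(volume(holes, idw2)))
```

Both runs "honor" the borehole data exactly at the hole locations, yet they report different volumes; the choice between them is the professional judgment, and its basis, that the text says must be explained.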
Mathematical proof of exponential decline
When the production of hydrocarbons from a well is plotted on semi-log paper over time, the production trend, particularly after about six months, frequently falls along a straight line. Straight-line plots on semi-log paper reflect the applicability of the exponential function or change by a constant percentage amount per unit of time.
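The arithmetic behind that straight line can be sketched briefly. If q(t) = q_i·e^(−Dt), then ln q = ln q_i − Dt is linear in t, so the nominal decline rate D is just the semi-log slope between any two production rates. The well rates below are invented for illustration.

```python
# Minimal sketch of why a straight line on semi-log paper implies
# exponential decline: q(t) = qi * exp(-D*t), so ln(q) = ln(qi) - D*t
# is linear in t. The production rates below are hypothetical.
import math

def decline_rate(q1, q2, dt):
    """Nominal decline rate D from two rates measured dt time units apart."""
    return math.log(q1 / q2) / dt

def forecast(qi, D, t):
    """Exponential-decline forecast q(t) = qi * exp(-D*t)."""
    return qi * math.exp(-D * t)

# Hypothetical well: 1000 bbl/month initially, 800 bbl/month a year later.
D = decline_rate(1000.0, 800.0, 12.0)            # per month
print(f"D = {D:.4f} per month")                  # ~1.86% per month
print(f"q(24 months) = {forecast(1000.0, D, 24.0):.0f} bbl/month")  # 640
```

The formula is only as good as its assumptions, which is the point of the anecdote that follows: real reservoirs violate every one of them.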
I once listened while a petroleum engineer mathematically proved that exponential decline should occur if one assumed that the reservoir was (1) isotropic, (2) homogeneous, (3) had a uniform thickness, and (4) had infinite horizontal dimensions. The problem is that these assumptions are never valid.
Figure 3 illustrates the reality of fluid flow in an oil reservoir rock. The rock is a Lower Cretaceous Dakota Group sandstone, which is a common reservoir rock in the Denver Basin. This outcrop, in a road cut southwest of Denver, is saturated with a fossil oil deposit (Abbott and Noe, 2016, Stop 6). During the erosion and uplift of this unit, uranium- and other metal-bearing waters entered and flowed down gradient. When the oxidizing, uranium- and other metal-bearing water encountered the reducing environment of the fossil oil reservoir, a roll front uranium deposit formed along with related trace element features, one of which is the ilsemannite (a blue molybdenum oxide) tongue outlined in Figure 3. The ilsemannite tongue represents a 2-dimensional cut through a geochemical cell resulting from ground water flow into and through the sandstone. The irregular shape of the tongue illustrates the reality of fluid flow, which is very difficult, if not impossible, to accurately and precisely model.
Figure 3: Ilsemannite tongue in Dakota Group sandstone. (Photo by D.M. Abbott)
Professional judgment is required
Geologic conclusions are, in the final analysis, expressions of judgment predicated upon knowledge and experience. A geologic conclusion, however, purports to be more than an arbitrary determination: it is reached as a consequence of method, usually based on a model. The model used must be the one best adapted to dealing with the questions asked about the property in question. If a non-standard model is used, the reasons why the standard ("best practice") model is inappropriate and the basis for the model used must be fully described (Abbott, 2004 & 2014). Discussion of the limitations of the model(s) used is an important part of describing what you know and what you don't know as the result of your geologic work (Abbott, 2001). This is the only basis for judging the validity of geologic work. Although different professionals may arrive at different conclusions, they should be able to determine whether another professional arrived at differing conclusions in a scientifically sound manner.
Ultimately, we must recognize that the degree of honesty required of us as geoscientists is difficult to achieve but must be pursued with diligence. It is not enough to avoid conscious lies or deception; we must strive to avoid the subtle deception of being dazzled by the beautiful drawings of our models. We must describe what we don't know as much as what we know. We must ensure that the limitations of "best practices," "standard approaches," and models are understood by all!
Abbott, David M., Jr., 2000, Honesty: the principal geoscience ethical principle (abs): Geological Society of America Abstracts with Programs, v. 32, no. 7, p. A293.
———, 2001, Honesty—Reporting on What Failed as Well as What Works in Professional Ethics & Practices column 67: The Professional Geologist, August 2001, p. 27-28.
———, 2004, Are scientific honesty and “best practices” in conflict?: European Geologist, No. 18, December 2004, p. 34-38; reprinted in The Professional Geologist, July-August 2005, p. 46-51.
———, 2007, Assuring the reliability of your sampling results: The Professional Geologist, Nov/Dec 2007, p. 33-38.
———, 2008, Assuring the reliability of your sampling results in Professional Ethics & Practices column 121: The Professional Geologist, May/Jun 2009, p. 36-38.
———, 2009, Assuring the reliability of your sampling results: the LA Abrasion Test in Professional Ethics & Practices column 114: The Professional Geologist, Mar/Apr 2008, p. 20-21.
———, 2014, Use professional judgment: know the assumptions underlying a best practice: 2014 Transactions of the Society for Mining, Metallurgy & Exploration, v. 366, p. 421-425.
Abbott, David M., Jr., and Noe, David C., 2016, Consequences of living with geology: a model field trip for the general public, in Keller, S.M., and Morgan, M.I., eds., Unfolding the geology of the west: Geological Society of America Field Guide 44, p. 355-376, doi: 10.1130/2016.0044(15).
Adams, Samuel S., 1985, Using geological information to develop exploration strategies for epithermal deposits in Berger, B.R. and Bethke, P.M., eds, Geology and geochemistry of epithermal systems: Society of Economic Geologists Reviews in Economic Geology V. 2, p. 273-298.
De Freitas, Chris, 2000, Uncertainty in Science: What should we believe?: The Professional Geologist, January 2000, p. 5-7.
Feynman, Richard P., 1974 (1999), Cargo Cult Science: Some Remarks on Science, Pseudoscience, and Learning How Not to Fool Yourself in Engineering and Science (Caltech's magazine) and reprinted in Jeffrey Robbins, ed., 1999, The Pleasure of Finding Things Out: the best short works of Richard P. Feynman: Helix Books, Perseus Books, Cambridge, MA, p. 205-216. Also available on the web.
Geach, Martin, 2016, 3D models: stepping back: Geoscientist, www.geolsoc.org.uk/Geoscientist/June-2016/3D-models-stepping-back.
Oreskes, Naomi, 2000, Why believe a computer? Models, measures, and meaning in the natural world in Schneiderman, J.S., ed., The Earth around us: W.H. Freeman, p. 70-82.
Rahn, Perry H., 2000, Proof, validity, and some legal advice: The Professional Geologist, November 2000, p. 7-8.
Reed, James P., 2007, Volumetric analysis & three-dimensional visualization of industrial mineral deposits in Cappa, James A., ed., Proceedings of the 43rd Forum on the geology of industrial minerals: Colorado Geological Survey Resources Series 46 p. 202-523.
Roden, S., and Smith, T., 2001, Sampling and analysis protocols and their role in mineral exploration and new resource development in Edwards, A.C., ed., Mineral resource and ore reserve estimation—the AusIMM guide to good practice: the Australasian Institute of Mining and Metallurgy, Melbourne, p. 73-78.
———, 2014, Sampling and analysis protocols and their role in mineral exploration and new resource development in Mineral resource and ore reserve estimation—the AusIMM guide to good practice, 2nd ed.: the Australasian Institute of Mining and Metallurgy, Melbourne, p. 53-60.
Science News, 2017, Snow science supporting our nation’s water supply: https://www.sciencedaily.com/releases/2017/02/170216130306.htm.
Swiger, Nick, and Boll, Jan, 2009, Groundwater sampling to achieve aquifer representativeness: The Professional Geologist, Nov/Dec 2009, p. 42-49.
Snow, C.P., 1959, The Search: Charles Scribner’s Sons, 342 p.; also available in a 2008 paperback.