Quality and sampling error quantification for gold mineral resource estimation

Simon C. Dominy1, Saranchimeg Purevgerel2 & Kim H. Esbensen3

1 Camborne School of Mines, Cornwall, UK and Novo Resources Corporation, Perth, Western Australia. s.dominy@e3geomet.com

2 MSA Global LLC, Ulaanbaatar, Mongolia. p.saranchimeg@msaglobal.net

3 KHE Consulting, Copenhagen, Denmark. khe.consult@gmail.com

Sampling is a vital component during all stages of the mine value chain. It includes the sampling of in situ material and broken rock for geological, metallurgical and geoenvironmental purposes. Sampling errors are defined in the context of the Theory of Sampling (TOS), where incorrect actions may lead to uncertainty and create a significant overall sampling + measurement error.1–3 The TOS breaks down this error into a series of contributions along the full value chain (the planning-to-assay measurement process). Errors are additive throughout this pathway, unavoidably exacerbating risk.2,4–6 After collection, sampling errors also occur throughout all subsequent downstream processes, contributing to uncertainty in test work and in any decisions made thereon. Across the full mine value chain, the sum of these errors generates both financial and intangible losses. In essence, poor-quality, non-representative sampling increases project risk and may consequently lead to incorrect project valuation. There is hardly any other application field where this is as critically important as for Gold mineral resource estimation, because of the very low grades and the extremely irregular mineralisation heterogeneities encountered (Figure 1).

Figure 1. For optimal sampling error quantification for Gold mineral resource estimation, no efforts are spared: Reverse Circulation grade control drilling at the Novo Resources Corporation Beatons Creek project in Western Australia.

Sampling—the first critical success factor in the mine value chain

The data produced must be fit-for-purpose to contribute to mineral resources/ore reserves reported in accordance with the 2017 PERC7 or other international codes. Quality assurance/quality control (QA/QC) is critical to maintaining data integrity through documented procedures, sample security, and monitoring of precision, accuracy and contamination. Samples and their associated assays are key inputs into important decisions throughout the mine value chain.

The TOS was first developed in the 1950s by Dr Pierre Gy to deal with sampling challenges in the mining industry, though it has far wider applications today.1–3 The TOS provides critical guidelines for reducing sampling errors, Table 1.

Table 1. Definition of TOS sampling errors.

Quality assurance and quality control

Quality assurance and quality control are the key components of a quality management system.8–10 Quality assurance is the collation of all actions necessary to provide adequate confidence that a process (e.g. sampling, test work and assaying) will satisfy the pertinent quality requirements. While QA deals with prevention of problems, QC aims to detect these – in time. Quality control procedures monitor both precision and accuracy of samples and data, as well as possible sample contamination during preparation and assaying. Throughout any mineral resource sampling programme, QA/QC is a key activity for confirming that samples are fit-for-purpose.

Protocols should be set up to cover field collection, laboratory preparation and analysis. During grade control, QA/QC should include field duplicates and certified reference material (CRM) submission, e.g. a minimum of three CRMs at a range of grades, together with blanks. Laboratory QA/QC shall include internal CRMs, pulp duplicates, umpire sample submission, pulp screen tests and contamination tests. In particular, duplicate field samples provide a measure of the variability of the entire sampling and analysis process. Best practice QA/QC is a very comprehensive framework, Table 2.

Documentation of sample collection and laboratory activities is an important part of QA/QC, as is appropriate staff training and monitoring. It is the opinion of the present authors that quality samples only follow from well-trained and experienced personnel. Companies should ensure that all staff involved in sampling activities are appropriately trained in sampling and, during their first few months, have adequate mentoring (sampling QA). This will be additional to other standard operational and safety training. Proper training shall be facilitated by well-written and illustrated documentation, see examples in Reference 3.

Table 2. Best practice QA/QC for a Gold grade control sampling programme for sound resource estimation.

a Applies to any sample type collected.

b Applied to linear and drill samples; KPIs are based on sample type and expected mass.

c Laboratory crusher or reverse circulation (RC) rig rejects.

d Dependent upon nature of ore and assay method. For samples assayed via screen fire assay (SFA), a high precision would be expected for undersize fraction.

e Recommendation to have a minimum of three CRMs at grades ranging across cut-off, ROM and high-grade. For any batch of (say) 20–30 samples, three key CRMs should be added. Note that by their very need to be homogeneous, CRMs do not bear coarse “nuggety” gold, but they can be matrix-matched by being quartz-dominated, sulphide-bearing, among others. The laboratory will also insert its own CRMs. CRMs used in the SFA process will simply be fire assayed. Action is required if the ±3SD limit is breached, usually re-assay of the entire batch where possible.
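As an aside, the CRM action-limit check described in this footnote is easily automated. The sketch below is illustrative only: the function name, grades and standard deviation are our assumptions, applied to the common ±3SD action limit.

```python
# Hypothetical CRM control check: flag results falling outside the
# certified mean +/- 3 standard deviations. All values illustrative.

def crm_in_control(assay_g_t, certified_g_t, certified_sd, n_sd=3):
    """Return True if the CRM assay lies within certified +/- n_sd * SD."""
    return abs(assay_g_t - certified_g_t) <= n_sd * certified_sd

# Example: a ROM-grade CRM certified at 3.50 g/t Au with SD 0.10 g/t
print(crm_in_control(3.72, 3.50, 0.10))  # within +/- 0.30 g/t -> in control
print(crm_in_control(3.85, 3.50, 0.10))  # breach -> re-assay the batch
```

In practice each CRM result would be plotted on a control chart rather than just pass/fail tested, but the action limit is the same.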

f Blanks provide a measure of contamination. They should be inserted after expected high-grade and/or visible gold-bearing samples. If substantial visible gold is present, two separate blanks should be placed after the sample. One blank should be added together with the three CRMs per batch. The laboratory will also place blanks into the sample stream.

g Test involves screening, or use of an autosizer on the pulp, to ensure 95 % passing. All samples should pass or the entire batch should be reground.

h A barren flush may be inserted after each and every sample for coarse gold ores, with the barren flush itself assayed. For fine gold ores, a rate of 1 in 50 is appropriate, increasing to 1 in 20 for coarse gold ores. Careful management of coarse gold ores is required. It is suggested that laboratories include a “wash” after visibly high-grade (e.g. visible gold-bearing) samples. However, if the ore bears notable coarse gold, then cleaning is best undertaken after each sample, given that even low-grade samples can bear coarse gold particles.

i Monthly submission of samples (typically pulps), including standards and duplicates, is sufficient to provide a check of primary laboratory results. This is especially important where an on-site laboratory is being used, as it provides independent confirmation of the results. Where SFA, LeachWELL (LW) or pulverise-and-leach (PAL) is used, there may be no pulp residues to submit. In this case, coarse rejects can be used. Umpire samples (e.g. pulps or coarse duplicates) should be supplied to the mine and submitted by mine staff to the umpire laboratory. In some cases, the laboratory (mine or off-site) may submit umpire samples as part of its internal QA/QC.

j HARD is half the absolute difference of the pair divided by the pair mean; acceptable HARD values differ for fine versus coarse gold. HARD can be expressed as RSV, where HARD = (√2/2) × RSV, e.g. ±10 % HARD is ±14 % RSV.
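These duplicate-pair statistics can be sketched in a few lines of Python; the pair values and function names below are illustrative assumptions, not part of any protocol.

```python
import math

def hard(a, b):
    """Half absolute relative difference (HARD) of a duplicate pair, in %."""
    return 100.0 * 0.5 * abs(a - b) / ((a + b) / 2.0)

def hard_to_rsv(hard_pct):
    """Convert HARD to RSV via HARD = (sqrt(2)/2) * RSV."""
    return hard_pct * math.sqrt(2.0)

# Illustrative duplicate pair (g/t Au): original vs field duplicate
h = hard(2.10, 2.54)
print(round(h, 1), "% HARD =", round(hard_to_rsv(h), 1), "% RSV")
```

In a working QA/QC system this calculation would be applied to every duplicate pair and the distribution of HARD values monitored against the KPI for the relevant sample type.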

k It is important to ensure that enough QA data is collected, particularly during a small sampling programme. The rate of insertion of CRMs, blanks etc. may need to be increased beyond the nominal 1 in 20 to achieve a minimum of 10 results.
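The arithmetic behind this footnote is simple: for a programme of n samples, a “1 in r” insertion rate yields roughly n/r QC results, so r must not exceed n/10. A minimal sketch (hypothetical helper name, illustrative programme sizes):

```python
# For n samples, a "1 in r" insertion rate yields about n // r QC results;
# to guarantee at least min_results, r can be at most n // min_results.
# Hypothetical helper for illustration only.

def insertion_rate_for_min_results(n_samples, min_results=10):
    """Largest workable '1 in r' rate giving at least min_results inserts."""
    return max(1, n_samples // min_results)

print(insertion_rate_for_min_results(300))  # 1 in 30 suffices
print(insertion_rate_for_min_results(120))  # needs 1 in 12, tighter than 1 in 20
```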

Training and mentoring should be linked to continuous quality improvement programmes, where protocols are internally and externally audited at least annually. On-going supervision and periodic re-training are strongly recommended, and should always in part be based on practice at the rock face/in the core shed, not only in the classroom.

Quantifying errors along the full sampling value chain

The results of duplicate sampling programmes document the magnitude of errors across the full sampling value chain, Table 3. These generally show that a large component of the total error is introduced during sample collection, especially during primary sampling. As a result, undertaking excessive efforts to reduce errors during preparation and analysis will not necessarily result in a substantive uncertainty reduction. In contrast, the collection of larger, high-quality field samples (for example, using a higher number of increments in composite samples) will result in significant error reduction, provided that other protocols are optimised appropriately.
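The additivity of errors noted above is additivity of variances: stage RSVs combine as the square root of the sum of squares. The short Python sketch below (all stage values hypothetical) shows why trimming an already small laboratory error barely moves the total, while reducing the dominant field error does:

```python
import math

# Stage errors add as variances, so total RSV = sqrt(sum(RSV_i^2)).
# Illustrative stage RSVs only: field, preparation, analysis (%).

def total_rsv(stage_rsvs):
    """Combine independent stage RSVs into a total RSV (%)."""
    return math.sqrt(sum(r * r for r in stage_rsvs))

baseline = total_rsv([40.0, 15.0, 10.0])       # field error dominates
better_lab = total_rsv([40.0, 15.0, 5.0])      # halve the analytical RSV
bigger_sample = total_rsv([25.0, 15.0, 10.0])  # reduce the field RSV instead

print(round(baseline, 1), round(better_lab, 1), round(bigger_sample, 1))
```

With these hypothetical figures, halving the analytical error improves the total by under 1 %, whereas attacking the field error improves it by over 13 %.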

Test work from a Gold vein deposit exemplifies the impact of sampling error through comparison of chip vs channel samples, Table 4.13 Seventy-five sample triplicates (chip, hand-cut channel and saw-cut channel) were collected from around a 40 m × 20 m stope block (Figure 2). The mineralisation was known to have a moderate variability, containing visible gold up to 1.5 mm in size. The test block was sampled from faces located every 1.5 m along its upper and lower drives and two raises. After cleaning, a reference line was drawn across each face centre and the different types of samples were collected systematically from the bottom up: chip sample, hand-cut channel and saw-cut channel. The sample delimitation dimensions were estimated and designed to achieve a theoretical sample support of 3 kg/m. All samples were subsequently prepared and assayed in identical fashion, via a total sample preparation and screen fire assay route. The FSE for this highly optimised protocol was effectively zero. A QA/QC programme was applied, with all CRMs and blanks within expectation.

These results show a marked reduction in RSV and nugget effect between the three sample sets. The rigorous laboratory protocol and QA/QC indicate that errors within the laboratory were at a minimum. Therefore, the remaining variability relates to the in situ nugget effect and sample collection. The dominant error for the channel samples relates to the in situ nugget effect, given that sampling error was minimal. The dominant difference between the chip and saw-cut channel samples relates to sampling error. These results corroborate many previous findings, showing that saw-cut channel samples provide the best sample quality. Most importantly, this experiment substantiates the critical role of empirical total sampling/preparation/analysis error quantification.

Stage-wise error evaluation

More detailed error evaluations can be undertaken covering each key stage along the sampling value chain. Thus Table 5 shows the results of such an analysis for two contrasting Gold ore types (termed mesothermal and epithermal). In both cases the highest stage error again turned out to be the field RSV, at 42 % and 34 %, respectively.

Table 3. Distribution of errors across stages of a gold sampling programme.
a Potential component error range as determined from duplicate sample (pair) analysis;11
b Maximum recommended FSE distribution across the sampling stages;12
c Maximum recommended other TOS error proportions across the sampling stages;12 
RC: Reverse Circulation.

For the epithermal system (no coarse gold), all stage errors were found to be reasonable and did not require further action (Figure 3).

For the mesothermal system (coarse gold, i.e. “nugget” gold), both the field and analytical RSVs were deemed high. To improve on this situation, an attempt was made to reduce the field RSV by taking a larger split at the rig (up from 2 kg to 4 kg) and assaying the entire 4 kg by a more precise analytical method (LeachWELL). Based on initial duplicates from the revised protocol, the field RSV was reduced to 36 % and the analytical RSV to 4 %, now acceptable for a coarse gold mineralisation.

There is a need for, and a clear advantage in, moving towards full quantification of errors for objective QC assessment, where a first step is the application of the RSV sampling + analysis variability characteristic as defined in DS3077.3,14 Resolution of individual relative errors across the complete sampling, preparation and analysis stages can be gained from simple duplicate sample pairs, as evidenced by Table 5.
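As a sketch of this first step: the RSV of DS3077 is simply the relative standard deviation of replicate results, RSV = 100 · s / mean. The replicate grades below are hypothetical.

```python
import statistics

# RSV (%) from a series of replicate results: 100 * s / mean,
# using the sample standard deviation. Replicate grades (g/t Au)
# are illustrative only.

def rsv(results):
    """Relative standard deviation (RSV) of replicate results, in %."""
    return 100.0 * statistics.stdev(results) / statistics.mean(results)

replicates = [2.9, 3.4, 2.6, 3.8, 3.1]
print(round(rsv(replicates), 1), "% RSV")
```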

Table 4. Empirical example: comparison between chip and channel sample replicates. Mean grades cut and diluted to stope width. Reconciled stope head grade 13.7 g/t Au.
Figure 2. Underground sampling. Collection of optimal saw-cut channel sample. Left: cutting channel “delimitation” slots; right: sampling (“extraction”) of the channel material.
Figure 3. Left: logging and marking diamond drilling cores for sampling (at former Castlemaine Goldfields Ltd Wattle Gully project in Victoria, Australia). Right: sample preparation: diamond drilling (DD) core ready for cutting.

Gold – always special

For Gold resource estimation, special issues abound compared to many other materials and commodities. Deliberately strenuous practical measures are therefore recommended to reduce the risk of sample tampering. These could include maintaining increased security between the sample site (e.g. mine face or drill rig) and sample transport, and carefully recording who has access to samples between collection and shipping, with a copy of that record retained.

Table 5. Stage-wise error estimation for two contrasting Gold ore types (RC = Reverse Circulation drilling; DD = Diamond Drilling). All figures rounded to the nearest whole %.
a Rig duplicate
b Core half duplicate.

Conclusions

Geologists and analytical chemists must acknowledge the systematic rigour of the TOS framework and should readily be able to appreciate the help offered by proper management of all associated errors.

The empirical estimation of errors across all stages of the complete “from-lot-to-aliquot” pathway has been demonstrated above, and the value of the critical information gained has been laid out in no uncertain terms. Where samples are analysed to support any resource estimate, a QA/QC programme must be introduced to ensure continuous quality information on both sampling and assaying. Written protocols and procedures, staff training, periodic auditing of protocols and people, and re-training are all required. DS307714 provides a framework for producing transparent protocols for the specific sampling pathway. There are many QA/QC frameworks that can be applied; more on this latter issue in later Sampling Columns.

References

[1] P.M. Gy, Sampling of Particulate Materials: Theory and Practice. Elsevier (1982).

[2] F.F. Pitard, Theory of Sampling and Sampling Practice. CRC Press (2019). https://doi.org/10.1201/9781351105934

[3] K.H. Esbensen, Introduction to the Theory and Practice of Sampling. IM Publications Open (2020). https://doi.org/10.1255/978-1-906715-29-8

[4] R.C.A. Minnitt, “Sampling: the impact on costs and decision making”, J. South Afr. Inst. Min. Metall. 107, 451–462 (2007).

[5] S.C. Dominy, “Importance of good sampling practice throughout the gold mine value chain”, Min. Tech. 125, 129–141 (2016). https://doi.org/10.1179/1743286315Y.0000000028

[6] J.-M. Rendu, Risk Management in Evaluating Mineral Deposits. Society for Mining, Metallurgy and Exploration (2017).

[7] PERC, Pan-European Standard for Reporting of Exploration Results, Mineral Resources and Reserves. The Pan-European Reserves and Resources Reporting Committee (PERC) (2017).

[8] M.A. Vallée, “Sampling quality control”, Explor. Min. Geol. 7, 107–116 (1998).

[9] M.Z. Abzalov, “Quality control of assay data: a review of procedures for measuring and monitoring precision and accuracy”, Explor. Min. Geol. 17(3–4), 131–144 (2008). https://doi.org/10.2113/gsemg.17.3-4.131

[10] A. Simon and G. Gosson, “Considerations on quality assurance/quality control and sample security”, in Proceedings of the Sampling Conference, Perth, Australia, 27–29 May 2008. Australasian Institute of Mining and Metallurgy, pp. 135–140 (2008).

[11] C.R. Stanley and B.W. Smee, “Strategies for reducing sampling errors in exploration and resource definition drilling programmes for gold deposits”, Geochem. Explor. Environ. Anal. 7, 329–340 (2007). https://doi.org/10.1144/1467-7873/07-128

[12] F.F. Pitard, “Guidelines for acceptable allotted sampling uncertainty”, in Proceedings of the World Conference on Sampling and Blending. Gecamin, Santiago, pp. 89–98 (2013).

[13] S.C. Dominy, H.J. Glass, L. O’Connor, C.K. Lam, S. Purevgerel and R.C.A. Minnitt, “Integrating the Theory of Sampling into underground mine grade control strategies”, Minerals 8(6), 232 (2018). https://doi.org/10.3390/min8060232

[14] DS3077. Representative Sampling – Horizontal Standard. Danish Standards Foundation (2013). www.ds.dk

Glossary


A

Aliquot

An aliquot is the ultimate sub-sample extracted in a 'Lot-to-Aliquot' pathway for analysis. By analogy, process analytical technology involves the extraction of virtual samples, which are defined volumes of matter interacting with a process analytical instrument.

Analysis

Analysis is the systematic examination and evaluation of the ultimate sub-sample of chemical, biological, or physical substance (Aliquot) to determine its composition, structure, properties, or presence of specific components.

Analytical Bias

Analytical bias is a systematic deviation of measured values from true values.  An analytical bias can arise from multiple sources, including instrument calibration errors, sample preparation techniques, operator method, or inherent methodological limitations. Unlike random errors, which fluctuate unpredictably, analytical bias consistently skews results in a particular direction. Identifying and correcting this bias is crucial to ensure the accuracy and reliability of analytical data (bias correction).

Analytical Precision

Analytical precision refers to the degree of agreement among repeated analyses of the same aliquot under identical conditions. It reflects the consistency and reproducibility of the results obtained by a given analytical method. High precision indicates minimal random analytical error and close clustering of analytical results around an average. Precision does not necessarily imply accuracy, as a method can be precise yet still yield systematically biased results. 

C

Composite Sampling

Composite sampling extracts a number (Q) of Increments, established to capture the Lot Heterogeneity. Composite sampling is the only way to represent heterogeneous material. A composite sample is made by aggregating the Q increments subject to the Fundamental Sampling Principle (FSP). The required number of increments Q for the requested Representativity can be carefully established to make sampling fit-for-purpose.

Compositional Heterogeneity (CH)

Compositional heterogeneity is the variation between individual fundamental units of a target material (particles, fragments, cells, ...). CH is an intrinsic characteristic of the target material to be sampled.

Correct Sampling Errors (CSE)

CSE are the errors that cannot be eliminated even when sampling correctly (unbiased) according to the Theory of Sampling (TOS). CSE are caused by Lot Heterogeneity and can only be minimised. There are two Correct Sampling Errors (CSE):

  1. Fundamental Sampling Error (FSE)
  2. Grouping and Segregation Error (GSE)

Crushing

Crushing is the term used for the process of reducing particle size. Other terms are grinding, milling, maceration and comminution. Particle size reduction changes the Compositional Heterogeneity (CH) of a material. Composite Sampling and crushing are the only agents with which to reduce the Fundamental Sampling Error (FSE).

D

Data Format

Data must be reported as the measurement results and the Measurement Uncertainties stemming from sampling and analysis. Note that MUSampling and MUAnalysis are expressed as variances.

Data = Measurement +/- (MUSampling ; MUAnalysis)

Example: 375 ppm +/- (85 ppm ; 18 ppm)

Note that the uncertainties 85 ppm and 18 ppm are the square roots of MUSampling and MUAnalysis.
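A minimal sketch of this data format (variable names ours, figures from the example above), showing how the reported uncertainties are square roots of the underlying variances:

```python
import math

# MU_sampling and MU_analysis are variances; the reported uncertainties
# are their square roots, and MU_total is the sum of the two variances.
mu_sampling = 85.0 ** 2   # ppm^2
mu_analysis = 18.0 ** 2   # ppm^2
mu_total = mu_sampling + mu_analysis

u_sampling = math.sqrt(mu_sampling)  # 85 ppm
u_analysis = math.sqrt(mu_analysis)  # 18 ppm
u_total = math.sqrt(mu_total)        # combined uncertainty, ppm

print(f"375 ppm +/- ({u_sampling:.0f} ppm ; {u_analysis:.0f} ppm), "
      f"combined {u_total:.1f} ppm")
```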

Data Uncertainty

Distributional Heterogeneity (DH)

Distributional heterogeneity is the variation between groups of fundamental units of a target material. Groups of units manifest themselves as Increments used in sampling. DH is an expression of the spatial heterogeneity of a material to be sampled (Lot).

DS3077:2024

This standard is a matrix-independent standard for representative sampling, published by the Danish Standards Foundation. This standard sets out a minimum competence basis for reliable planning, performance and assessment of existing or new sampling procedures with respect to representativity. This standard invalidates grab sampling and other incorrect sampling operations, by requiring conformance with a universal set of six Governing Principles and five Sampling Unit Operations. This standard is based on the Theory of Sampling (TOS).

webshop.ds.dk/en/standard/M374267/ds-3077-2024

Dynamic Lot

A dynamic lot is a moving material stream where sampling is carried out at a fixed location. For both Stationary Lots and Dynamic Lots, sampling procedures must be able to represent the entire lot volume guided by the Fundamental Sampling Principle.

F

Fractionation

Fractionation is a way of processing a Lot or Sample before sampling (or sub-sampling). Fractionation separates materials/lots into fractions according to particle properties, e.g. size, density, shape, magnetic susceptibility, wettability, conductivity, intrinsic or introduced moisture, ...

Fundamental Sampling Error (FSE)

FSE results from the impossibility of fully compensating for inherent Compositional Heterogeneity (CH) when sampling. FSE is always present in all sampling operations but can be reduced by adherence to TOS' principles. Even a fully representative, non-biased sampling process will be unable to materialise two samples with identical composition due to Lot Heterogeneity. FSE can only be reduced by Crushing (followed by Mixing / Blending), i.e. by transforming the lot into a different material system with smaller particle sizes.
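For orientation, Gy's classical formula estimates the relative variance of the FSE from material properties and the sample and lot masses. The Python sketch below implements the textbook form s²_FSE = f·g·c·l·d³·(1/Ms − 1/ML); all parameter values are entirely illustrative assumptions, not from any real deposit.

```python
import math

# Gy's formula for the relative variance of the FSE:
#   s2_FSE = f * g * c * l * d^3 * (1/Ms - 1/ML)
# f: particle shape factor; g: granulometric factor;
# c: mineralogical composition factor (g/cm^3); l: liberation factor;
# d: nominal top particle size (cm); Ms, ML: sample and lot mass (g).

def fse_relative_sd(f, g, c, l, d_cm, m_sample_g, m_lot_g):
    """Relative standard deviation of the FSE (as a fraction)."""
    var = f * g * c * l * d_cm ** 3 * (1.0 / m_sample_g - 1.0 / m_lot_g)
    return math.sqrt(var)

# e.g. a 2 kg split from a 1 t lot of ore crushed to 1 cm top size
sd = fse_relative_sd(f=0.5, g=0.25, c=2000.0, l=0.1, d_cm=1.0,
                     m_sample_g=2000.0, m_lot_g=1_000_000.0)
print(f"{100 * sd:.0f} % relative SD")
```

The cubic dependence on top size d explains why crushing is such an effective lever on FSE.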

Fundamental Sampling Principle (FSP)

The Fundamental Sampling Principle (FSP) stipulates that all potential Lot Increments must have the same probability of being extracted to be aggregated as a Composite Sample. Sampling processes in which certain areas, volumes, parts of a Lot are not physically accessible cannot ensure Representativity.

G

Global Estimation Error (GEE)

The GEE is the total data estimation error, the sum of the Total Sampling Error (TSE) and the Total Analytical Error (TAE).

Governing Principles

Six Governing Principles (GP) describe how to conduct representative sampling of heterogeneous materials:

1) Fundamental Sampling Principle (FSP)

2) Sampling Scale Invariance (SCI)

3) Principle of Sampling Correctness (PSC)

4) Principle of Sampling Simplicity (PSS)

5) Lot Dimensionality Transformation (LDT), and

6) Lot Heterogeneity Characterisation (LHC).

Grab Sampling

Process of extracting a singular portion of the Lot. Grab sampling cannot ensure Representativity for heterogeneous materials. Grab sampling results in a sample designated a Specimen.

Grouping and Segregation Error (GSE)

The GSE originates from the inherent tendency of Lot particles, or fragments hereof, to segregate and/or to group together locally to varying degrees within the full lot volume. This spatial irregularity is called the Distributional Heterogeneity (DH). There will always be segregation and grouping of Lot particles at different scales. GSE plays a significant role in addition to the Fundamental Sampling Error (FSE). Unlike FSE however, the effects from GSE can be reduced in a given system state by Composite Sampling and/or Mixing / Blending. GSE can in practice be reduced significantly but is seldom fully eliminated.

H

Heterogeneity

Heterogeneity refers to the state of being varied in composition. It is often contrasted with homogeneity, which implies complete similarity among components, which is a rare case. For materials in science, technology and industry heterogeneity is the norm. Heterogeneity applies in various contexts, such as populations of non-identical units, bulk materials, powders, slurries and biological systems, where multiple distinct components coexist.

Heterogeneity, in the context of the Theory of Sampling, is described using three distinct characteristics: Compositional Heterogeneity (CH), Distributional Heterogeneity (DH) and Particle-Size Heterogeneity (PH).

 

Heterogeneity Testing (HT)

Heterogeneity tests are used for optimising sampling protocols for a variable of interest (analyte, feature) with regards to minimising the Fundamental Sampling Error (FSE).

Experimental approaches available are the 50-particle method, the heterogeneity test (HT), the sampling tree experiment (STE) or the duplicate series/sample analysis (DSA), and the segregation free analysis (SFA).

Recently, sensor-based heterogeneity tests have been introduced which bring the advantage of cost-effective analysis of large numbers of single particles.

Homogeneity

An assemblage of material units with identical unit size, composition and characteristics. There are practically no homogeneous materials in the realm of technology, industry and commerce (mineral resources, biology, pharmaceuticals, food, feed, environment, manufacturing and more) of interest for sampling. With respect to sampling, it is advantageous to consider that all materials are in practice heterogeneous.

I

Incorrect Delimitation Error (IDE)

The principle for extracting correct Increments from processes is to delineate a full planar-parallel slice across the full width and depth of a stream of matter (Dynamic Lot). IDE results from delineating any other volume shape. When a sampling system or procedure is not correct relative to the appropriate Increment delineation, a Sampling Bias will result. The resulting error is defined as the Increment Delimitation Error (IDE). Similar IDE definitions apply to delineation and extraction of increments from Stationary Lots.

Incorrect Extraction Error (IEE)

Increments must not only be correctly delimitated but must also be extracted in full. The error incurred by not extracting all particles and fragments within the delimitated increment is the Increment Extraction Error (IEE). IDE and IEE are very often committed simultaneously because of inferior design, manufacturing, implementation or maintenance of sampling equipment and systems.

Incorrect Preparation Error (IPE)

Adverse sampling bias effects may occur for example during sample transport and storage (e.g. mix-up, damage, spillage), preparation (contamination and/or losses), intentional (fraud, sabotage) or unintentional human error (careless actions; deliberate or ill-informed non-adherence to protocols). All such non-compliances with the criteria for representative sampling and good laboratory practices (GLP) are grouped under the umbrella term IPE. The IPE is part of the bias-generating errors ISE that must always be avoided.

Incorrect Sampling Errors (ISE)

There are four ISE, which result from an inferior sampling process. These ISE can and must be eliminated.

  1. Incorrect Delimitation Error (IDE) aka Increment Delimitation Error
  2. Incorrect Extraction Error (IEE) aka Increment Extraction Error
  3. Incorrect Preparation Error (IPE) aka Increment Preparation Error
  4. Incorrect Weighing Error (IWE) aka Increment Weighing Error
Incorrect Weighing Error (IWE)

IWE reflects specific weighing errors associated with collecting Increments. For process sampling, IWE is incurred when extracted increments are not proportional to the contemporary flow rate (dynamic 1-dimensional lots) at the time or place of extraction. IWE is often relatively easily dealt with through appropriate engineering attention. Increments, and Samples, should preferentially represent a consistent mass (or volume).

Increment

Fundamental unit of sampling, defined by a specific mass or correctly delineated volume extracted by a specified sampling tool.

L

Lot

a) A Lot is made up of a specific target material to be subjected to a specified sampling procedure.

b) A Lot is the totality of the volume for which inferences are going to be made based on the final analytical results (for decision-making). Lot size can range from being extremely large (e.g. an ore body, a ship) to very small (e.g. a blood sample).

c) The term Lot refers both to the material as well as to lot size (volume/mass) and physical characteristics. Lots are distinguished as stationary or dynamic lots. A stationary lot is a non-moving volume of material, a dynamic lot is a material stream (Lot Dimensionality). For both stationary and dynamic lots, sampling procedures must address the entire lot volume guided by the Fundamental Sampling Principle (FSP).

Lot Definition

Lot Definition describes the process of defining the target volume, which will be subjected to Sampling.

Lot Dimensionality

TOS distinguishes Lot volume according to the dimensions that must be covered by correct Increment extraction. This defines the concept of 'lot dimensionality', an attribute which is independent of the lot scale. Lot dimensionality is a characterisation to help understand and optimise sample extraction from any lot at any sampling stage. There are four main lot types: 0-, 1-, 2- and 3-dimensional lots (0-D, 1-D, 2-D and 3-D lots).

Lots are classified by subtracting the dimensions of the lot that are fully 'covered' by the salient sampling extraction tool in question. The higher the number of dimensions fully covered in the resulting sampling operation, the easier it is to reduce the Total Sampling Error (TSE).

Lot Dimensionality Transformation (LDT)

By the Governing Principle Lot Dimensionality Transformation (LDT), stationary 0-D, 2-D and 3-D lots can in many cases advantageously be transformed into dynamic 1-D lots, enabling optimal sampling. However, the application of LDT has practical limits, as some lots cannot be transformed (e.g. a body of soil, a mine resource or biological cells). The optimal approach for such cases is penetrating one dimension with complete increment extraction (usually height), turning a 3-D lot into a 2-D lot.

Lot Heterogeneity

The lot heterogeneity is the combination of Compositional Heterogeneity, Distributional Heterogeneity and Particle-Size-Heterogeneity.

Lot Heterogeneity = CH + DH + PH

Lot Heterogeneity Characterisation

Lot Heterogeneity Characterisation is the process of assessing the magnitude of Lot Heterogeneity when approaching a new sampling project. Logically, it is impossible to design a sampling procedure without knowledge of the heterogeneity of the target material. There are two principal procedures for determining Lot Heterogeneity: the Replication Experiment (RE) for Stationary Lots and Variographic Characterisation (VAR) for Dynamic Lots. Heterogeneity Tests determine Compositional Heterogeneity as the irreducible minimum obtainable Sampling Variance, excluding all other Sampling Error effects.

M

Mass-Reduction

Mass-reduction is a physical process that divides a given quantity into manageable sub-samples. Mass-reduction must ensure that these sub-samples are representative of the original quantity (Representative Mass Reduction / Sub-sampling).

Measurement

The total process of producing numerical data about a Lot, including sampling and analysis, is called Measurement. Analogously, sensor-based analytical technology combines virtual sampling and signal processing. For both types of measurement the principles and rules of the Theory of Sampling apply.

Measurement Uncertainty (metrological term) (MU)

MU expresses the variability interval of values attributed to a quantity measured. MU is the effect of a particular error, e.g. a sampling error or an analytical error, or of combined effects (see MUTotal).

MUsampling reflects the variability stemming from sampling errors

MUanalysis reflects the variability stemming from analytical errors

MUtotal is the effective variability stemming from both sampling and analysis

MUtotal = MUsampling + MUanalysis
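The additive decomposition above can be estimated in practice by comparing replicate 'Lot-to-Aliquot' determinations against repeated assays of a single aliquot. The following minimal sketch, using hypothetical illustrative gold grades (not data from any real project), shows the arithmetic: variances add, so the sampling contribution is obtained by difference.

```python
import statistics

# Hypothetical replicate data (illustrative values only, g/t Au):
# - 'total': independent 'Lot-to-Aliquot' determinations (sampling + analysis)
# - 'analysis': repeated assays of one single aliquot (analysis only)
total = [2.10, 1.75, 2.60, 1.90, 2.35, 1.60, 2.80, 2.05, 1.95, 2.40]
analysis = [2.02, 2.08, 1.97, 2.05, 2.00, 2.03, 1.99, 2.06]

var_total = statistics.variance(total)        # s2 estimating MUtotal
var_analysis = statistics.variance(analysis)  # s2 estimating MUanalysis

# Variances are additive, so the sampling contribution is the difference
# (clamped at zero in case analytical scatter dominates a small data set):
var_sampling = max(var_total - var_analysis, 0.0)

print(f"s2_total    = {var_total:.4f}")
print(f"s2_analysis = {var_analysis:.4f}")
print(f"s2_sampling = {var_sampling:.4f}")
```

Typically the sampling variance dominates by a wide margin, which is why optimisation effort pays off first at the sampling stages rather than in the laboratory.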

Mixing / Blending

Mixing and blending reduce Distributional Heterogeneity (DH) before sampling/sub-sampling. N.B. Forceful mixing is a much less effective process than commonly assumed.

P

Particle-Size-Heterogeneity (PH)

PH is the compositional difference due to assemblages of units with different particle sizes (or particle-size classes).

Pierre Gy

The founder of the Theory of Sampling (TOS), Pierre Gy (1924–2015), single-handedly developed the TOS from 1950 to 1975 and spent the following 25 years applying it in key industrial sectors (mining, minerals, cement and metals processing). In the course of his career he wrote nine books and gave more than 250 international lectures on all aspects of sampling. In addition to developing TOS, he also carried out a significant amount of practical R&D. He never worked at a university; he was an independent researcher and consultant for nearly his entire career, a remarkable scientific life and achievement.

Precision

Precision is a measure of the variability of quantitative results: the larger the variability, the poorer the precision. In practice, precision is quantified as the statistical variance s2 of the quantitative results (the square of the standard deviation).

Primary Sample

The initial mass extracted from the Lot. The Primary Sample is the product of Composite Sampling and consists of Q Increments. Both the mass of the Primary Sample and the number of increments extracted influence the sampling variability. As the primary sampling stage often has by far the largest impact on MUtotal, optimisation always starts at this stage.

Principle of Sampling Correctness (PSC)

The Principle of Sampling Correctness (PSC) states that all of TOS' Incorrect Sampling Errors (ISE) must be eliminated; otherwise a detrimental Sampling Bias will be introduced.

Principle of Sampling Simplicity (PSS)

PSS states that sampling along the Lot-to-Aliquot pathway can be optimised separately for each (primary, secondary, tertiary ...) sampling stage. Since the Primary Sampling stage is often the dominant source of sampling error, optimisation logically always begins at this stage.

Process Periodicity Error (PPE)

PPE is incurred if short-, mid- or long-term periodic process behaviour is not corrected for, in which case it may contribute to a sampling bias.

A process sampling strategy must use a sampling frequency high enough to uncover such behaviours; as a minimum, the sampling frequency must always be greater than twice the highest periodic frequency encountered.

Process Sampling Errors (PSE)

PSE come into effect when Dynamic Lots are being sampled without compensating for process trends or periodicities (Process Trend Error and Process Periodicity Error).

Process Trend Error (PTE)

PTE occurs if mid- to long-term process trends are not corrected for, in which case they may contribute to a Sampling Bias. PTE and the Process Periodicity Error (PPE) may, or may not, occur simultaneously, depending on the specific nature of the process to be sampled.

Q

Q

The number of Increments composited into a Sample.

R

R

R is the number of replications of a series of independent, complete 'Lot-to-Aliquot' measurements, made under identical conditions in a Replication Experiment.

Replication Experiment (RE)

The Replication Experiment (RE) consists of a series of independent, complete 'Lot-to-Aliquot' analytical determinations, made under identical conditions. The number of replications is termed R. The RE provides an estimate of MUsampling + MUanalysis.
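A common way to summarise an RE is the Relative Sampling Variability (RSV), the relative standard deviation of the R determinations expressed as a percentage, as used in DS3077. The sketch below uses hypothetical illustrative grades; the R value and the data are assumptions for demonstration only.

```python
import statistics

# Hypothetical Replication Experiment: R = 10 independent, complete
# 'Lot-to-Aliquot' determinations (illustrative g/t Au values).
replicates = [1.85, 2.40, 1.60, 2.95, 2.10, 1.75, 2.55, 1.90, 2.25, 2.65]

R = len(replicates)
mean = statistics.mean(replicates)
s = statistics.stdev(replicates)  # standard deviation of the R results

# Relative Sampling Variability: reflects MUsampling + MUanalysis combined.
rsv = 100.0 * s / mean

print(f"R = {R}, mean = {mean:.3f} g/t, s = {s:.3f} g/t, RSV = {rsv:.1f} %")
```

Note that the RSV cannot separate the sampling and analytical contributions on its own; it quantifies their combined effect, which is precisely what a stakeholder experiences as effective data quality.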

Representative Mass Reduction – Subsampling

Representative Mass Reduction (RMR), also known as sub-sampling. TOS demonstrates why Riffle-Splitting and Vezin-sampling are the only options leading to Representative Mass Reduction.

Representativity

A sampling process is representative if it captures all intrinsic material features of a Lot, e.g. composition, particle-size distribution and physical properties (such as intrinsic moisture). Representativity is a characteristic of a sampling process in which the Total Sampling Error and Total Analytical Error have been reduced below a predefined threshold level, the acceptable Total Measurement Uncertainty.
Representativity is the prime objective of all sampling processes. The representativity status of an individual sample cannot be ascertained in isolation, removed from the context of its full sampling-and-analysis pathway. The characteristic 'representative' can only be accorded to a sampling process that complies with all demands specified by TOS (DS3077:2024).

S

Sample

An extracted portion of a Lot that can be documented to be the result of a representative sampling procedure (non-representatively extracted portions of a Lot are termed Specimens).

Sampling

Sampling is the process of collecting units from a Lot (the sampling procedure or sampling process). There are only two principal types of sampling procedure: Grab Sampling and Composite Sampling.

Sampling Accuracy

Closeness of the analytical result of an Aliquot to the true concentration of the Lot. N.B. "sampling accuracy" = "sampling + analytical accuracy".

Sampling Bias

The Sampling Bias is the difference between the true Lot concentration and the average concentration from replicated sampling. Such a difference is a direct function of the Lot Heterogeneity and is therefore inconstant; it changes with each additional sampling event and can therefore not be corrected for. This is in contrast to the Analytical Bias, for which correction is often carried out.

Sampling Error Management (SEM)

SEM determines the priorities and tools for all sampling procedures in the following order:

  1. Elimination of Incorrect Sampling Errors (ISE) (unbiased sampling)
  2. Minimisation of the remaining Correct Sampling Errors (CSE)
  3. Estimation and use of s2(FSE) is only meaningful after complete elimination of ISE
  4. Minimisation of Process Sampling Errors

Sampling Manager

The Sampling Manager is the Legal Person accountable for ensuring that all sampling activities are conducted in accordance with scientifically valid principles to achieve representative results. They are responsible for managing the design, implementation, and evaluation of sampling protocols while balancing constraints such as material variability, logistics, and resource limitations. This role requires expertise in the Theory of Sampling (TOS), leadership, project management and stakeholder communication skills.

Sampling Precision

The Sampling Precision is the variance of a series of analytical determinations, for example from a Replication Experiment (RE). Sampling precision always includes the Analytical Precision, since all analysis is based on an analytical Aliquot, which is the result of a complete 'Lot-to-Aliquot' sampling pathway. Therefore, sampling precision = sampling + analysis precision.

Sampling Protocol

Document explaining the undertakings necessary for the sampling process. It contains the tools and procedures applied from Lot-to-Aliquot.

Sampling Scale Invariance (SSI)

The Principle of SSI states that all Sampling Unit Operations (SUO) can be applied identically at all sampling stages; only the scale of the sampling tools differs.

Sampling Uncertainty

Sampling Uncertainty reflects the difficulty of collecting a representative sample due to Lot Heterogeneity: the more heterogeneous the material, the higher the uncertainty associated with any sample attempting to represent the whole Lot.

Sampling Unit Operations (SUO)

A Sampling Unit Operation is a basic step in the 'Lot-to-Aliquot' pathway. Five practical SUOs cover all necessary practical aspects of representative sampling: Composite Sampling, Crushing, Mixing/Blending, Fractionation, and Representative Mass Reduction – Subsampling.

Secondary Sample

A secondary sample is the product of Representative Mass Reduction - Subsampling from a Primary Sample. Identical nomenclature applies for further Representative Mass Reduction steps (Tertiary...).

Specimen

A specimen is a portion of a larger mass/volume (Lot) extracted by a non-representative sampling process. Grab Sampling results in a specimen.

Stakeholder

A Stakeholder is any entity with an interest in the results of sampling and analysis. Data representing stationary or flowing heterogeneous materials are requested by different parties with a multitude of differing objectives. Stakeholders can be internal or external: commercial organisations, public authorities, research and academia, or non-governmental organisations.

Stationary Lot

A Stationary Lot is a non-moving volume of material where sampling is carried out at multiple locations, each resulting in an Increment. For both Stationary Lots and Dynamic Lots, sampling procedures must address the entire Lot volume, guided by the Fundamental Sampling Principle (FSP).

T

Theory of Sampling (TOS)

The Theory of Sampling (TOS) is the necessary-and-sufficient framework of Governing Principles (GP), Sampling Unit Operations (SUO) and Sampling Error Management (SEM) rules, together with the normative practices and skills needed to ensure representative sampling procedures. TOS is codified in the universal standard DS3077:2024.

Total Analytical Error

TAE is manifested as the Measurement Uncertainty resulting from analysis alone (MUanalysis). TAE includes all errors occurring during assaying and analysis (e.g. matrix effects, analytical instrument uncertainty, maintenance and calibration), as well as human error.

Total Measurement Uncertainty

Whereas Measurement Uncertainty (MU) traditionally addresses only the analytical determination, e.g. concentration = 375 ppm ± 18 ppm (MUanalysis), the Theory of Sampling (TOS) stipulates reporting analytical results with uncertainty estimates from both sampling and analysis. This gives users of analytical data the possibility to evaluate the relative magnitudes of MUsampling vs. MUanalysis, enabling a fully informed assessment of the true, effective data quality involved. A complete data uncertainty statement must have this format:

MUtotal = MUsampling + MUanalysis

The Total Measurement Uncertainty (MUtotal) is the most important factor determining data quality.

Total Sampling Error (TSE)

The Incorrect Sampling Errors (ISE) and Correct Sampling Errors (CSE) add up to the effective Total Sampling Error (TSE). TSE causes the total uncertainty resulting from material extraction along the 'Lot-to-Aliquot' sampling pathway (MUsampling).

Total Uncertainty Threshold

The acceptable Total Measurement Uncertainty, which must include the Sampling Measurement Uncertainty (MUSampling) and Analytical Measurement Uncertainty (MUAnalysis).

U

V

Variographic Characterisation (VAR)

Variography is a variability characterisation of a dynamic 1-dimensional Lot. A variogram describes variability as a function of increment-pair spacing (in time). Variography is also applied in geostatistics to describe variability as a function of the spacing/distance between analyses.
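An experimental variogram for a 1-D increment series can be computed directly from the defining sum of squared pair differences. The sketch below, using a hypothetical process-stream series at a constant sampling interval, works on dimensionless relative deviations from the series mean; real TOS variograms often use mass-weighted heterogeneity contributions instead, which is a refinement not shown here.

```python
import statistics

def experimental_variogram(series, max_lag):
    """Experimental variogram of a 1-D increment series.

    V(j) = 1 / (2 * (N - j)) * sum_i (h[i+j] - h[i])^2,
    computed on dimensionless relative deviations
    h[i] = (a[i] - mean) / mean.
    """
    mean = statistics.mean(series)
    h = [(a - mean) / mean for a in series]
    n = len(h)
    v = {}
    for j in range(1, max_lag + 1):
        diffs = [(h[i + j] - h[i]) ** 2 for i in range(n - j)]
        v[j] = sum(diffs) / (2 * (n - j))
    return v

# Hypothetical analyses of a process stream, taken at a constant interval:
series = [2.1, 2.3, 1.9, 2.6, 2.4, 2.0, 2.8, 2.5, 2.2, 3.0, 2.7, 2.4]
for lag, value in experimental_variogram(series, 4).items():
    print(f"V({lag}) = {value:.4f}")
```

The intercept of the variogram extrapolated to lag zero (the "nugget effect") estimates the minimum possible sampling-plus-analysis error, while the shape at increasing lags reveals process trends and periodicities.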