to capture variability across the orebody. Their method-
ology suggests leveraging large datasets from inexpensive
tests to generate spatially detailed models, which can then
guide more targeted, costly testing. Similarly, Doll (2018)
advocates a three-stage iterative sampling approach, which
begins with broad initial variability sampling, narrows
down redundant parameters, and culminates in an opti-
mized program with minimal but strategically distributed
tests. This process ensures that the density of geometal-
lurgical data aligns with the heterogeneity of the orebody,
achieving cost-efficiency without compromising reliability.
Both approaches highlight the critical balance between
sampling density and precision to support robust through-
put forecasts and mine planning.
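To make the staged logic concrete, the following minimal sketch reduces a broad stage-one variability program to a small final program. The redundancy test (pairwise correlation) and the variability-weighted allocation rule are illustrative assumptions, not the specific mechanics published by Doll (2018).

```python
import numpy as np

def staged_sampling_program(stage1, budget, corr_threshold=0.9):
    """Illustrative three-stage reduction of a sampling program.

    stage1: dict of parameter name -> 1-D array of broad stage-1
            variability measurements (one value per sample).
    budget: total number of tests available for the final stage.
    """
    names = list(stage1)
    data = np.column_stack([stage1[n] for n in names])

    # Stage 2: drop parameters whose information is already carried
    # by a retained parameter (high pairwise correlation = redundant).
    corr = np.corrcoef(data, rowvar=False)
    keep = []
    for i in range(len(names)):
        if all(abs(corr[i, j]) < corr_threshold for j in keep):
            keep.append(i)

    # Stage 3: spread the remaining budget in proportion to each
    # retained parameter's coefficient of variation, so the most
    # heterogeneous properties receive the most tests.
    cv = np.array([data[:, i].std() / abs(data[:, i].mean()) for i in keep])
    alloc = np.maximum(1, np.round(budget * cv / cv.sum())).astype(int)
    return {names[i]: int(a) for i, a in zip(keep, alloc)}
```

The point is the shape of the workflow rather than the specific thresholds: broad measurement first, redundancy pruning second, variability-weighted allocation last.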
The analysis draws from 154 feasibility studies pub-
lished over the past decade, encompassing various commod-
ities, project sizes, and regions worldwide. By evaluating
the frequency of comminution tests performed, the circuit
design criteria applied, and the associated capital expendi-
tures, this study examines trends in sampling practices and
quantifies the extent to which insufficient comminution testing compromises throughput forecasts and undermines the accuracy of NPV estimates.
This paper considers the ratio of geochemical assay
samples to comminution test samples. From an uncon-
strained resource modelling perspective, this ratio would be
considerably lower than that generally observed in feasibil-
ity studies. Our study investigates how this ratio impacts
the predictive accuracy of throughput forecasts based on
ore competence, particularly in cases where ore hardness
variability is significant (Bueno et al., 2015; Lane et al., 2010). The assay frequency for a project is defined by geological practice and the resource modelling approach, based on
the “structural view” of the ore body as it emerges during
the exploration and resource analysis process. The aim is to
allocate value to “blocks” within the prospect based on ele-
mental grades. Elemental grade variation is linked to geo-
logical processes that are deposition dependent, structurally
related or alteration dependent. The ore characteristics that determine AG/SAG mill throughput may be related to the same or to a different set of geological characteristics. For example,
on one project, the gold grade was directly related to frac-
ture frequency and, hence, directly related to SAG mill
throughput, independent of “rock type.” On other projects, where the mineralisation is disseminated, there may be no relationship between grade and AG/SAG mill throughput.
When geochemical assays or other logged data can be used
to map ore competence, the number of comminution test
samples required for high confidence in resource model-
ling is low. Where there is low variability in competence
(within experimental error of the test), the value added by
high-volume comminution tests is also low. The challenge
is to optimise the comminution sampling intensity so that
risks to project NPV are identified in a timely manner,
thus allowing time to mitigate revenue loss through capital
expenditure or altered mining practice.
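The screening logic described above can be illustrated with a short sketch: fit a simple linear proxy for a comminution index from assay or logging variables on the tested subset, then compare the residual scatter against test reproducibility. The variable names and the default error value here are hypothetical.

```python
import numpy as np

def competence_proxy_check(assays, comp_index, test_error=0.5):
    """Check whether assay/logging data already explain ore competence.

    assays:     (n_samples, n_features) assay or logging variables for
                the samples that also have comminution results.
    comp_index: (n_samples,) measured competence index.
    test_error: reproducibility of the comminution test in the same
                units; the default is an illustrative placeholder.
    """
    X = np.column_stack([np.ones(len(comp_index)), assays])
    beta, *_ = np.linalg.lstsq(X, comp_index, rcond=None)
    resid = comp_index - X @ beta

    # If scatter about the proxy model sits within test error, extra
    # comminution samples add little: competence can be mapped from
    # the far denser assay database instead.
    rmse = float(np.sqrt(np.mean(resid ** 2)))
    return rmse, rmse <= test_error
```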
In this paper, we examine the impacts of a lack of com-
minution testing and provide actionable recommendations
for improving sampling, testwork, and design methods.
The proposed improvements are essential to enhancing the
operational efficacy of AG/SAG mill-based projects with
more accurate throughput forecasts, ultimately supporting
better-informed NPV calculations.
DATA SOURCES AND EVALUATION
Data Source
The data for this study were sourced from OPAXE, a sub-
scription-based database offering a wide range of mining-
related documents, including feasibility studies conducted
between 2014 and 2022. This dataset enabled us to cross-
reference feasibility data with actual operational perfor-
mance, forming a comprehensive basis for analysing the
relationship between throughput variability and its impact
on Net Present Value (NPV). A total of 154 feasibility stud-
ies were selected from the OPAXE database for detailed
analysis in this paper.
Feasibility Study Data and Performance Prediction
Pertinent data were extracted from the 154 feasibility studies. These included:
• the number of samples used for comminution testing
• the types of tests conducted with these samples
• the number of geochemical assays
• project capital cost
• project operating cost
• forecast plant throughput
These data enabled an analysis of how the amount of test data correlated with project size and with the accuracy of throughput forecasting.
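A minimal sketch of that analysis follows. The file and column names are hypothetical stand-ins for the extracted dataset; rank correlation is used because sample counts are heavily right-skewed across projects of very different sizes.

```python
import pandas as pd

# File and column names are hypothetical stand-ins for the dataset
# extracted from the 154 feasibility studies.
studies = pd.read_csv("feasibility_studies.csv")

# Normalise testing intensity by project scale: comminution samples
# per Mt/a of forecast throughput, and the assay-to-comminution
# ratio discussed earlier in this paper.
studies["tests_per_mtpa"] = (
    studies["n_comminution_samples"] / studies["forecast_mtpa"]
)
studies["assay_to_comminution"] = (
    studies["n_assays"] / studies["n_comminution_samples"]
)

# Spearman rank correlation is robust to the heavy right skew of
# sample counts across projects of very different sizes.
cols = ["tests_per_mtpa", "assay_to_comminution",
        "capex_musd", "abs_throughput_error_pct"]
print(studies[cols].corr(method="spearman"))
```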
A series of filters was applied to ensure a robust analysis and to exclude projects that could compromise the study's quality. Projects without an AG/SAG mill-based grinding
circuit in their flowsheet, such as heap leach operations,
were excluded. Additionally, reports lacking essential infor-
mation were removed, including those that only referenced
the testing of bulk composites without detailing the num-
ber of samples or test campaigns conducted. After applying
these filters, the total number of feasibility studies included
in the analysis was reduced from over 200 to 154.
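A sketch of how such screening might be coded is shown below; the column names and flags are hypothetical, as the raw extraction schema is not published here.

```python
import pandas as pd

# Hypothetical screening of the raw extract down to the studies
# analysed; column names are illustrative assumptions.
raw = pd.read_csv("opaxe_extract.csv")

# Keep only AG/SAG mill-based grinding circuits (drops heap leach
# and other non-milling flowsheets).
has_agsag = raw["grinding_circuit"].str.contains("AG|SAG", na=False)

# Drop reports that give no usable sample counts, e.g., those that
# only reference bulk composites without sample numbers.
has_counts = (raw["n_comminution_samples"].notna()
              & (raw["n_comminution_samples"] > 0))
bulk_only = raw["bulk_composite_only"].fillna(False).astype(bool)

filtered = raw[has_agsag & has_counts & ~bulk_only]
print(f"{len(raw)} studies screened, {len(filtered)} retained")
```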