CHAPTER 12

Operational Geometallurgy
Dean David

MAusIMM, Process Consultant, GRD Minproc Limited, Level 14, 140 St Georges Terrace, Perth WA
6000. Email: dean.david@minproc.com.au
Dean gained a Bachelor of Applied Science (Metallurgy) from the SAIT (now the University of South Australia)
in 1982. For 14 years he worked for CRA (now Rio Tinto), including six years at Argyle Diamonds in both
project and operational roles. Dean joined JKTech as a consultant in 1996, then managed JKTech Consulting
from 1999 to 2003. He joined GRD Minproc as their Process Manager in 2003 and switched to Process
Consultant (with GRD Minproc) in 2005. Dean has conducted many test work, pilot plant and process
implementation projects and managed day-to-day plant operations. Dean's experience covers hundreds of
projects across the globe and across the mineral spectrum. Expertise areas include geometallurgy,
beneficiation, comminution, classification, physical separations (flotation, gravity, magnetic), and sample
selection culminating in test program design and interpretation leading to plant design and optimisation.

Introduction
Definitive Issues
Available Data
Making Process Sense of the Databases
Planning
Troubleshooting
Expansions
Data Analysis

ABSTRACT
The orebody is the only asset that a mine has at its disposal to generate
revenue. Nature rarely provides neat and consistent orebodies so it is
essential that process engineers understand orebody variability and how this
interacts with the proposed mine plans. A fundamental difficulty in
understanding the orebody is that most of the available information is
geological in nature and not readily accessed by process engineers. To make
matters worse the metallurgical information is typically sparse and
disconnected from the geology. This chapter is an attempt to unlock some
geological mysteries and provide tools for the process engineer to link the
geological and metallurgical data to their maximum advantage, especially
when attempting to optimise base metals flotation plants.

INTRODUCTION
Geometallurgy can be defined as the marriage of geological and
metallurgical concepts in order to provide useful inputs to process
design and operational planning. The focus of geometallurgy is
entirely on the process outcomes but the basis of geometallurgy
should be firmly in the geology.
Orebodies are naturally occurring phenomena with little
consistency from location to location. Orebodies situated within a
few hundred metres can be as geologically different as orebodies
on opposite sides of the earth. Within a single orebody it is possible
to have primary igneous rock sources, igneous intrusions, faulted
zones, metamorphosed alterations, oxidised zones, alluvial
accumulations and much more. From a geological viewpoint the
various aspects of each orebody can be defined with scientific
definitions and statistical accuracy using well-established
conventions. As with any science, the terminology used in geology
is specific to the discipline and meets its own requirements.
The consequent problem for the process engineer is interpreting
the science with the aim of finding a place for the geological
information within the metallurgical world. It must always be
remembered that geological data is primarily aimed at satisfying
geological needs and not process needs and, therefore, much of the
available geological information is not going to be used by the
process engineer. The process engineer must carefully and
selectively examine the relationship between geological data and
process response before committing to a geometallurgical strategy.

DEFINITIVE ISSUES
Geometallurgy, like any study topic, requires a problem definition
before embarking upon an analysis. The process engineer must
define problems and issues that, if solved, generate value for the
organisation. Geometallurgy becomes important when it is
identified that the key to understanding or solving a process
problem lies in an improved understanding of the source and
nature of the ore being fed to the plant.
Examples of geologically based issues that regularly arise for
process engineers in base metal sulfide operations include:

inability to predict and manage semi-autogenous grinding


(SAG) mill feed rate variability,

inability to predict and manage concentrate grades,


inability to predict and manage minor elements in concentrates,
simultaneous control of plant feed grade and throughput rate,
planning for throughput expansions, and
planning for new oretypes or orebodies to be treated in an
existing plant.

The economic importance of an issue is determined by
estimating the value that could be realised if the issue was
eliminated or addressed successfully. The potentially redeemable
value becomes the basis for justifying the resources that need to
be invested to solve the problem. For example, stabilising an
erratic SAG mill feed rate typically results in an average
throughput benefit of five per cent. A five per cent throughput
benefit achieved through operational stabilisation typically
translates to a five per cent revenue improvement with little or no
real capital or operating cost penalty. Additional revenue at little
or no cost equals profit. Assuming a two year payback is
acceptable to management it should be justifiable to invest seven
to ten per cent of annual revenue in a single solution that provides
a five per cent throughput benefit by stabilising the SAG mill feed
rate. However, this is unrealistic and all investment decisions are
tempered by risk assessment and by other projects competing for
attention and the same slice of the profit cake. For example, on a
plant having an annual revenue of A$100 M it may be realistic to
expect management to commit A$1 M to a project with good
potential to increase SAG mill throughput by five per cent.
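As a rough illustration of this reasoning, the figures quoted above can
be run through a few lines of Python. The values are the example
numbers from this paragraph, not a costing method.

```python
# Illustrative arithmetic only; the revenue, benefit and payback figures
# are the example values quoted in the text, not site data.
annual_revenue = 100e6        # A$/a, example plant
throughput_benefit = 0.05     # five per cent gain from stabilising SAG feed
payback_years = 2             # payback period acceptable to management

extra_revenue_per_year = annual_revenue * throughput_benefit
theoretical_max_spend = extra_revenue_per_year * payback_years

print(f"Extra revenue per year:    A${extra_revenue_per_year:,.0f}")
print(f"Theoretical maximum spend: A${theoretical_max_spend:,.0f}")
# In practice risk and competing projects temper this figure, so a
# commitment closer to A$1 M is the realistic expectation quoted above.
```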
Establishing an economic case is relatively easy compared to
providing the practical pathway to solving the problem. Often,
the opportunity to apply a large sum of resources to a problem
leads to the implementation of an expensive and generic
off-the-shelf solution. On occasions this is appropriate and
successful but in many instances it is not, and it could even have
an overall negative economic impact. A greater degree of success
is often achievable by understanding the problem both in detail and in
breadth, and then applying a solution fit for the purpose at hand.
The first step to understanding the problem in detail is defining
the goal. In this example the goal is stabilised SAG mill feed rate.
What are the definitions of instability and stability in this
context?
Instability, at its worst, involves stop-start operation where it
should be continuous. More commonly instability is seen as a
boom-bust operational cycle where the plant is pushed to its
throughput limits and then beyond, resulting in the need to
significantly reduce throughput to bring the circuit back under
control. Often the boom-bust throughput variations exceed
15 per cent of average feed rate and can be as large as 30 per
cent. The average throughput that results is usually more than ten
per cent below the maximum achievable in the circuit. As well as
a loss of feed tonnes, the other major effects of this instability are
erratic flotation feed stream properties including F80 values, per
cent solids and volumetric flow. The economic effects are a loss
of feed tonnage coupled with a loss of flotation performance
resulting in less than optimal recoveries and grades. A general rule
for troubleshooting any process is that any upstream instability
that has the definite potential to cause problems should be
addressed before attempting to solve apparent in-stream or
downstream problems.
Stability is then defined by smooth operation of the plant.
However, when is a process operating smoothly? Smooth
operation of a ball mill circuit may be seen as continuous
operation with the feed rate varying by one to two per cent on an
hourly basis. However, operation of a SAG mill is usually
accepted as being smooth if hourly throughput variations are
within four to eight per cent. The reason for the difference is that
the SAG mill throughput rate has a greater dependence on ore
properties than does the ball mill.
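One way to make these bands operational is to screen the hourly feed
rate record against them. The sketch below is illustrative only; the
variation measure (hour-to-hour change relative to the mean) and the
thresholds are assumptions based on the indicative figures above.

```python
# Minimal sketch: classify hourly feed rate variation against assumed
# stability bands (per cent of mean feed rate).
def feed_rate_stability(hourly_rates, circuit="SAG"):
    mean = sum(hourly_rates) / len(hourly_rates)
    changes = [abs(b - a) / mean * 100
               for a, b in zip(hourly_rates, hourly_rates[1:])]
    worst = max(changes)
    limit = 8.0 if circuit == "SAG" else 2.0   # per cent, indicative only
    return worst, "smooth" if worst <= limit else "unstable"

print(feed_rate_stability([1450, 1500, 1320, 1510, 1190], circuit="SAG"))
```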
The next step is to determine what is causing the problem.
There are a number of possible causes for an unstable SAG mill
feed rate, including:

• inappropriate manual control actions,
• inappropriate automatic control actions,
• poor feed rate control capability,
• poor or no control of multiple stockpile reclaim feeders,
• lack of variable mill speed capability, and
• poor grinding circuit equipment design.

None of these examples relate primarily to the orebody
geology and each should be dealt with using proven process
operation, process design and engineering practice. If any of
these issues is a problem then elimination may achieve the target
throughput improvement without entering the realm of
geometallurgy. These are relatively easy problems to solve as the
technical solutions are readily available from vendors,
engineering companies and experienced process engineers.
If, however, the engineering problems are addressed but the
feed stability problem persists then the primary cause of
variability is probably related to either the hardness or the
competence of the ore. Achieving feed rate stability will then
only be possible if a method is available to predict and manage
the hardness and competence of the ore that is delivered to the
plant for processing. A prerequisite for understanding what is
being delivered to the plant is an understanding of what makes an
ore competent or incompetent with respect to SAG milling and
what makes it hard or soft with respect to fine grinding.
The operating process engineer will have available a number of
historically developed devices for understanding the
comminution properties of the ore ranging from simple to
complex. The earliest of these methods would have been
developed by process engineers during the design phase of the
project. Subsequently, operating process engineers would have
arrived at their own conclusions about the main drivers of plant
throughput and developed new or modified models. The job of
the process engineer is to determine which of the available
predictive methods provides real information about the ore
properties and which methods are ineffective.
The ultimate proof of the method is in its use. The definitive
question is: 'Is it possible, using the chosen method, to
successfully predict the SAG mill feed rate ahead of time for a
range of plant feed situations?' If not then the method is either
wrong or incomplete. If the method can successfully predict feed
properties then there is probably a management issue that is
causing the feed variability, such as poor implementation of the
predictive method or poor control over ore sources and feed
blending.
On the assumption, often true, that the prediction of ore
properties is ineffective in practice, it is then necessary to
determine if a predictive method can be developed at all. It is here
that the process engineer enters the realm of geometallurgy.


AVAILABLE DATA
The starting point for any geometallurgical investigation is the
geological model for the orebody and especially its relationship
to the mine model. The geological model contains drill data while
the mine model contains ore and waste blocks derived from the
drill data. Together this is the biggest geometallurgical asset for
the process engineer but, for some, it can be the biggest liability
in such an investigation. Two characteristics of drill data libraries
make them difficult for process engineers to work with: their size
and their complexity.
Geological data libraries (drill libraries) are much larger than
the databases process engineers usually work with, such as tables
of metallurgical test results. Geological databases can contain
data derived from 20 000 m of drilling during the early
exploration phase of a project and during the operational phases
the database can grow to more than 200 000 m. At the simplest
level the database usually contains multi-element assays for every
metre or two metres of drill intersection. However, the database is
also likely to contain information related to lithology (rock type),
alteration characteristics, colour, mineralisation, veining,
competence (RQD), point load index (PLI, a measure of rock
strength), specific gravity (SG), positional survey and any number
of other measures depending on the ore type involved.
The size of such a database makes it difficult to manage and
presents challenges when it comes to isolating useful information.
The tools used to manage drill databases are geologically based
and rarely suit the requirements or experience of process
engineers. Another complicating factor in the operational stage of
a project is that a large portion of the database may be irrelevant
as it represents ore that has already been mined.
The complexity of the database is the second negative factor
for process engineers. The complexity in terms of the multiplicity
of data sets has been mentioned previously. More problematic
than the number of data sets is the variety of bases on which those
data sets are arranged. For example, the assays will be determined
on a regular one or two metre interval basis but the lithology is
likely to be provided as irregular intervals, the lengths of which
relate to the extent of each lithological unit. Within the geological
database there may be five or more different data set tables, each
with their own individual interval set. Further complication can
happen if the geological model has been built up by a number of
successive companies or geologists, each with their own ideas
about what data is important and what is not.
To make the drill library database accessible to process
engineers it is best to use geological software to generate a single
suitable database. Packages such as GenSys can de-survey
geological data sets so that all properties can be reported on a
common spatial coordinate (ie XYZ) basis rather than on a
multiplicity of hole and depth bases. The ideal outcome is for each
individual record of the geometallurgical database to have a unique
spatial address and include every measured data type that has any
chance of being useful. The process engineer can then analyse the
single database using appropriate tools such as spreadsheets.
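As an indication of what such a consolidated record can look like, the
sketch below merges assay intervals with collar information and assigns
approximate XYZ coordinates using a simple straight-hole (tangent)
assumption. It is a minimal illustration in Python with pandas; real
de-surveying, as performed by geological packages, honours the full
downhole survey, and all hole names and values here are hypothetical.

```python
# A minimal sketch, assuming pandas and straight-hole geometry.
import numpy as np
import pandas as pd

collars = pd.DataFrame({
    "hole": ["DD001", "DD002"],
    "x": [1000.0, 1060.0], "y": [5000.0, 5020.0], "z": [350.0, 352.0],
    "dip": [-60.0, -90.0], "azimuth": [45.0, 0.0]})

assays = pd.DataFrame({
    "hole": ["DD001", "DD001", "DD002"],
    "from_m": [10.0, 12.0, 8.0], "to_m": [12.0, 14.0, 10.0],
    "cu_pct": [0.55, 0.72, 0.31]})

df = assays.merge(collars, on="hole")
mid = (df["from_m"] + df["to_m"]) / 2            # interval midpoint depth
dip = np.radians(df["dip"]); azi = np.radians(df["azimuth"])
df["X"] = df["x"] + mid * np.cos(dip) * np.sin(azi)
df["Y"] = df["y"] + mid * np.cos(dip) * np.cos(azi)
df["Z"] = df["z"] + mid * np.sin(dip)            # dip is negative downwards
print(df[["hole", "from_m", "to_m", "cu_pct", "X", "Y", "Z"]])
```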
An accessible data set is only useful to the process engineer
when coupled with additional information about the orebody and
how it is being processed. In an operating mine the economic
limits of the orebody will be well understood by the geologists
and the plant ore delivery schedule is defined by the short-term,
medium-term and long-term mine plans.
A spectrum of mine-related data will exist with relevance
ranging from the imminent to the distant future. The available
mine planning data probably falls into the time-based categories
of imminent, short term and long term, as described below.

• Imminent – ore that is ready to be processed within the next
  week and is likely to be either blasted, ready for blasting or
  in stockpiles ahead of the primary crusher. This ore is the
  subject of the daily production planning processes on site.
  This ore can be well defined in terms of actual grade through
  assay of blasthole drill cuttings. Comminution is either based
  on the test result for the closest available metallurgical
  sample or based on assumptions about the properties of ore
  types. Factors that impede or blur the process engineer's
  understanding of imminent ores include medium-term and
  long-term ROM stockpiles, blending stockpiles for plant feed,
  lack of a blasthole sample program, lack of blasting (ie free
  dig ore), poor in-fill drilling coverage, multiple or numerous
  ore sources (typically a problem for complicated underground
  mines) and a high degree of short distance variability in the
  orebody (as may be seen in vein type deposits).

• Short term – each mining operation has a short-term planning
  process to make decisions about the upcoming waste and ore
  allocations. The basis for short-term decision-making is
  usually in-fill drilling conducted during the construction and
  operational phases of the project. In-fill drilling is designed
  to fill the gaps between the exploration drillhole data and
  allow final ore/waste decisions to be made about each block in
  the mine model. It is likely that mine block sizes (the volume
  of ore considered to have similar properties and assumed to be
  mined as a unit) will be reduced to better match the upcoming
  production schedule. The in-fill drilling data set will
  certainly provide grade and problem element data and, in some
  cases, it may provide comminution data. The mine block
  properties are recalculated using all available drill
  information and blocks are then designated as ore or waste.
  The most significant complicating factor influencing the
  process engineer's understanding of short-term planning data
  is the mathematics used to derive the block properties. It is
  essential that the process engineer understand the source of
  the data underpinning short-term planning and understand how
  it has been manipulated on its way to becoming block data.
  Manipulations can include truncation of high-grade spikes,
  data smoothing, data kriging (especially for exploration data
  that is used to define block properties because new
  measurements have not been conducted during in-fill drilling;
  comminution data often falls into this category) and data
  exclusions. The ideal data sources for the process engineer
  are the short-term planning block predictions and the raw
  drill data used to define the block. The full extent of
  smoothing is then evident and the ore variability expectations
  can be assessed by the process engineer.

• Long term – the long-term mine plan is essentially an
  extension of the final operational plan arising out of the
  exploration phase. The ore included in the long-term plan is
  usually based more on geological orebody models than on high
  density drill data. The presence or absence of ore in a
  particular location, its associated grade and recovery, its
  lithology and alteration characteristics are all much less
  certain than the data available for ore that is to be
  processed within the week or mined within the month. Much of
  the ore will not even classify as proven under geological
  reporting regulations. The long-term plan is typically updated
  annually, firstly to reflect new drilling that is designed to
  convert probable ore into proven ore and secondly to reflect
  actual mining progress.
  The long-term plan is most useful for gaining an understanding
  of the overall orebody and mine shapes, documenting the
  proportions of the major ore types that are expected in future
  years and understanding trends with time and location for
  critical measures such as grade, impurities, hardness and
  competence.


When using block model data it is important to note a number
of mathematical facts:
1. The properties assigned to an ore block are dependent
upon the measured properties of a few drill intersections.
It is possible that there are no drill intersections within a
particular mining block.
2. Geostatistical techniques, especially kriging, are used to
estimate the distribution of geological properties between
drill holes.
3. Some ore block properties may have been calculated
using complex equations that may have an unintended
outcome.
4. Some ore block properties will be dependent upon the
location of the block relative to a geological interface (eg
above or below water table, oxidised versus fresh ore
zone, etc).
5. Some of the block properties, or even the underlying data,
may have been filtered or truncated to avoid illogical or
extreme outcomes.
A good check when first encountering a block model is to
look at both the model and the underlying geological data in
the same orebody location and ensure that there is a
reasonable relationship between the two. In general, the
predictive methods are reliable but it is always advisable to
verify information, especially where important decisions are
being made.
The weakest link in the available data for geometallurgical
analysis is invariably the metallurgical data set. It is rarely, if ever,
practical to analyse the orebody metallurgically to anywhere near
the data density that is available geologically. For example, an
orebody of 100 Mt with 10 000 m of intersections identified
within ore has a geological data density of 100 samples per
million tonnes of ore. Assuming there are three major lithologies
that have been tested metallurgically and each lithology has had
50 variability samples tested for flotation properties and
grindability, the metallurgical data density is only 1.5 samples per
million tonnes of ore. At 5 Mt/a plant throughput (for example)
each year's production is effectively based on the results from 7.5
metallurgical test samples compared to reliance on 500 metres of
drill intersections.
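The comparison reduces to simple arithmetic, reproduced below as a
quick check of the figures quoted above (assuming one assayed sample
per metre of intersection, as the text's figures imply).

```python
ore_tonnes   = 100e6      # 100 Mt orebody
drill_metres = 10_000     # metres of intersections within ore
met_samples  = 3 * 50     # three lithologies, 50 variability tests each
plant_rate   = 5e6        # t/a

geo_density = drill_metres / (ore_tonnes / 1e6)   # samples per Mt
met_density = met_samples / (ore_tonnes / 1e6)    # samples per Mt
print(geo_density, met_density)                   # 100.0, 1.5
print(plant_rate / 1e6 * met_density)             # 7.5 tests per year of feed
print(plant_rate / 1e6 * geo_density)             # 500 m of core per year of feed
```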
One ore block may represent a few days to a few weeks of ore
so the block model is typically not useful for short-term planning.
Usually a subset of the block model is populated with infill drill
data and then smaller mining blocks are chosen for medium-term
planning purposes. A subset of the medium-term plan can then be
populated with blasthole assay data for production grade control
purposes. In a few instances the metallurgical database is
continually updated with preproduction testing. Generally, the
testing gap between geology and process continues to widen as
the project advances. Fortunately the amount of available process
data begins to catch up once production begins. It is at this point
where the adequacy, or otherwise, of the metallurgical predictions
is revealed.
It is important for the process engineer to become familiar with
the planning processes and the available data sources on site.
Perhaps more importantly, the process engineer should become
acquainted with the owners of the various databases, the
individual geologists and mine planners. Databases always have
quirks and skeletons so it is best to have an expert assist you to
navigate to the data sources essential for your problem solving.
It is not necessary, or desirable, for process engineers to become
users of mining or geological software. Usually specific data is
required to solve a problem and it is preferable to extract the
necessary data from the sources and analyse it off-line using
familiar tools such as spreadsheets. Although it may be useful to
learn how to use the specialised software it is not normally
efficient. All modern geological and mining software has excellent
data export and interchange capabilities that should be exploited.

MAKING PROCESS SENSE OF THE DATABASES


The disparity between the geological and process databases
presents a problem and an opportunity. The problem is that the
metallurgical database is comparatively small and it is not going
to get larger very quickly. The opportunity that exists is the
potential to link the databases together and make some
metallurgical use of the geology.
Linking the metallurgical database to the geological database cannot
progress without a critical mass of metallurgical data. There is no
real point proceeding if all that is available is a few tests on each
of the major ore types. It is especially problematic if these few
tests do not reveal unique characteristics about the ore types. In
many operations there is a library of process test data that is
available for interpretation but may not be gathered together in
one place. The starting point should be the available data and
making the most of it rather than embarking on a new test
program.
One criticism of existing metallurgical data that is often heard
is that because the test was done many years ago the results may
not be reliable. In a few isolated instances, and only because of
specific circumstances, this may be a valid criticism. However, in
general this is a foolish first position to have as you may be
ignoring the most valuable data resource you have. Saying that
test data is unusable because of age implies that one of the
following is true:

• the test procedure used was at fault or is irrelevant to the
  existing flow sheet,
• the operator that conducted the test was at fault,
• the ore tested has no relevance to current or future operations,
• the way the ore was tested (ie inappropriate blending)
  devalues the results, or
• there is no information about what the sample represents
  geologically.
In the absence of a clear problem it is advisable to assume all
past test results are valid until proven otherwise. Collect all past
test data and bring it into the geometallurgical analysis.
Historical metallurgical data is often characterised by its
diversity of form. Each process engineer or laboratory
superintendent is likely to have had their own ideas about the test
methodology, test products generated, the assays conducted and
the calculations performed. It is essential to get back to raw data,
especially test conditions, product masses, process times and
original assay data. It is also worth noting that there may be
comments by operators that are relevant and that even
identification of the operator can be a valuable piece of data.
Once the body of test data is identified the metallurgical
database can be built. Although time consuming it is usually
better to be comprehensive rather than selective when compiling
the database. Critical factors for interpreting the test work often
emerge during the analysis process so it is better to include all
possible raw data. It is not necessary to incorporate calculated
data as this can be derived as required. The metallurgical database
should also include reference data such as test numbers, dates,
sample numbers and ore types. As the aim is to link back to the
geological database is also essential to incorporate any links that
may exist back into the geological database such as drill hole
numbers and downhole depths. Try to standardise the data so that
it is possible to analyse the metallurgical database using
spreadsheet or database tools. It is recognised that this can be a
daunting task where an extensive library of tests exists. However,
the usefulness of the metallurgical database alone is enough to
justify the investment, regardless of the multiplied benefits once
the geometallurgical linkages are established.
Establishing linkages to the geological database is entirely a
case by case process. Both the metallurgical database structure
and the geological database structure are likely to be unique so it
is not possible to list the methods by which they can be linked.
However, a number of aspects of each database are standard and
provide a starting point.
Geological databases typically contain:

• drill hole numbers;
• references to properties (assays, lithology, etc) by depth;
• hole locations (collar positions);
• hole directional information (dip, downhole survey, etc); and
• ore properties, possibly including:
  - assays,
  - lithology,
  - alteration,
  - SG,
  - magnetic properties,
  - radiometric properties,
  - mineralisation (ore and gangue minerals),
  - RQD,
  - fracture frequency (FF), and
  - point load index (PLI), etc.


In order of priority, the most important information after the
positional data is the set of chemical assays followed by the
lithology. Where there is limited comminution data from
metallurgical testing, then drill core structural information (RQD,
FF) and the strength data (PLI) become more important.
Note that the above description is typical of a database that has
not been subject to metallurgical influence. If detailed
metallurgical testing has been conducted on the orebody for
geometallurgical purposes some of the following may appear
either in the drill database or in the mining block model:

• drop weight index (DWI),
• SAG power index (SPI),
• Bond (or equivalent) work indices,
• liner abrasion indices (eg Ai),
• ore abrasion index (ta),
• mill feed F80 predictions,
• recovery predictions for products,
• concentrate grade predictions,
• SAG throughput rate (in an existing operation), and
• achievable grind P80 (with existing mills), etc.
If such information does exist then a few checks need to be
made to ensure the data is useful in an operational sense. For
example, in an existing operation it should be relatively simple to
check if the geometallurgical predictions are useful to the
operators and planners. If the data is ignored by the operators
then it is either a poor predictor of performance or systems have
not been put in place to ensure that the metallurgical information
is made available to those that need it. The latter is often the case
when the data has been collected during the design period for the
purpose of selecting major equipment but no operationally useful
systems were implemented at plant start-up.
If it is found that the geometallurgical predictions do not match
operational reality then there is something wrong with either the
data that is being used, the interpretation method or the method of
distributing the data across the geological databases.
Examples of inappropriate use of metallurgical data include:
• the use of Bond ball mill work indices to predict SAG or AG
  mill throughput rates;
• the adoption of unverified relationships between ore
  measurements and operational outcomes;
• inappropriate assignment of metallurgical ore properties to
  geological categories, such as lithologies, where there is no
  proven linkage between the two; and
• predictive methods that ignore key factors, for example
  prediction of copper concentrate grade based on head grade
  without reference to copper mineralogy.
In a modern operational environment there is usually enough
data available to verify or discredit these predictive systems. If
not, then systems should be implemented and instruments
installed to allow, at a minimum, the verification of the predictive
information that is meant to drive decision-making in both the
mine and the plant. The saying 'if you can't measure it, you can't
manage it' is especially true when it comes to geometallurgy.
Unfortunately, in many instances, it appears that measurement is
being done but because the measurements or the interpretation
are inappropriate the consequential management is misdirected.
Where there is no metallurgical data in the geological
information systems the geometallurgist must endeavour to
insert some in a useful manner. The success of establishing
geometallurgical links will depend on the type, quantity and
quality of the metallurgical data, the spatial nature of the
metallurgical samples and the ability to link the metallurgical
data to geological properties and locations.
Here is a theoretical example of establishing a set of useful
geometallurgical linkages for a copper mine, having mixed
copper sulfide mineralogy and using a SAG/Ball mill
comminution circuit, which experiences significant swings in
throughput rate.

• Decide what metallurgical predictions are necessary for
  effective management of the process. In this example the
  following predictions are required:
  - plant feed rate based on both SAG and ball mill
    limitations,
  - copper concentrate grade,
  - copper recovery, and
  - gold recovery.
  Note that fundamental metallurgical inputs such as feed grade
  and ore type are assumed to be already available because they
  are geological and mining model outputs rather than based on
  specific metallurgical test work or interpretation.
• Decide what measurements are useful for predicting plant
  performance.
  - JK Axb has been proven to be an effective predictor of
    SAG throughput at the site so a test such as DWI can be
    used to economically establish a metallurgical database.
  - BWI has been shown to be useful for predicting ball mill
    performance so either a full BWI test or a simple modified
    BWI will be used.
  - Copper concentrate grade is strongly related to the copper
    mineralogy so either directly measuring the mineralogical
    mix or inferring the mineralogical mix from the Cu:S ratio
    are to be used. As the geologists have included copper
    sulfide mineralogical proportion estimates in the database
    and sulfur assays are available, both methods of predicting
    concentrate grades will be used and the most appropriate
    will be chosen after a period of verification.
  - Copper recovery is also strongly related to mineralogy and
    it is known that recoveries are lower when chalcocite and
    covellite are present. It is also related to oxidation,
    which is a function of depth, alteration and ore type.
    Copper recovery is also well predicted by simple laboratory
    flotation tests on samples. As copper recovery has been
    highly variable and prediction has been only moderately
    successful from the geology, a simple rougher flotation
    test will be carried out to estimate recovery.
  - Gold recovery is variable and difficult to predict. No
    current methods are reliable. The rougher flotation test
    will also be used to attempt to predict gold recovery.
• Choose calculation methods for the predictions.
  - The plant feed rate prediction will be based on a
    calculation that determines if the SAG or ball mill is
    going to limit production based on DWI and BWI, applying
    operating constraints such as available milling power,
    maximum acceptable flotation feed F80 values and maximum
    acceptable flotation circuit feed rate. The prediction of
    SAG mill feed rate will be made using the DWI and BWI
    measurements (see the sketch following this example).
    · It has been found that DWI is related to lithology and
      to depth so each lithology will be assigned a base DWI
      value which will be increased in proportion to depth.
    · Similarly, the BWI value has been found to be related
      to lithology and depth with the exception that one
      lithology is considerably harder in the north than the
      south. This lithology will be divided into north and
      south types and each will have its own base BWI and
      depth relationship.
    · As a first estimate the SAG feed will be the minimum
      throughput rate based on DWI versus available SAG power
      and BWI versus available ball mill power. The estimate
      will be capped to maximum and minimum throughput rates.

  - An established relationship between plant feed Cu:S ratio
    and final concentrate grade will be applied to the
    geologically based block model Cu and S assays. A second
    prediction will be based on the mineralogical information
    logged by the geologists. Inconsistencies will be flagged
    for potential investigation. Copper concentrate grade
    predictions will therefore be made in two independent
    databases, the assay database and the mineralogy database
    (see the sketch following this example).
    · In the assay database a column will be inserted which
      applies a calibrated theoretical relationship between
      the Cu:S ratio and final concentrate grade.
    · In the mineralogy database a column will be added that
      uses the stoichiometric copper proportion of each
      mineral to calculate an overall Cu concentrate grade.
    · Both grade predictions will be carried through the
      predictive process and reported separately.
  - For 30 days the plant feed will be sampled hourly to
    provide shift composites. Shift composites will be
    subjected to a standard grind and flotation test. The
    flotation test Cu and Au recoveries will be compared to
    the shift recoveries in an attempt to develop copper and
    gold flotation recovery predictors. In addition the ore
    source for the shift or period will be noted. Additional
    flotation tests will be conducted using freshly drilled
    diamond core samples to firm up the relationship with ore
    from known locations and geological characteristics.
  - Copper recovery predictions will initially be made in the
    assay database on the basis of constant tails grade. This
    relationship will be updated following the results of the
    planned flotation test program.
  - Gold recovery predictions will also be made in the assay
    database on the basis of constant tails grade and will be
    subsequently updated.

• Implement the predictive framework.
  - The predictions can be implemented in one of two locations.
    The first, and most comprehensive, is in the geological
    database. The second, and most convenient, is in the mining
    block model.
  - Adding the predictions into the geological model requires
    that one or more of the many data tables be expanded to
    include the predictive calculations. The data table to be
    expanded depends upon the inputs to the calculations. For
    example, if the calculations are assay based then the assay
    table should be expanded. If the calculations are ore type
    based then it may be necessary to expand the lithology
    table. Adding the predictive framework to the geological
    model invokes considerable complexity because the useful
    process outcomes of the predictive calculations will not be
    needed in the geological model but in the form of
    short-term planning outputs from the mine block model.
  - Adding the predictive framework to the block model is
    simpler because the block model is numerically smaller than
    the geological model and has been smoothed to eliminate
    problematic data. The block model divides the orebody into
    mineable quantities of ore (blocks) to which properties are
    assigned on the basis of internal or nearby drill hole
    data. Including the desired predictions in the block model
    requires that the necessary information is carried through
    from the geological model for the calculations.
    · For SAG mill feed rate prediction it is necessary to
      carry DWI, BWI and SG data. It is also necessary to
      carry lithology as this determines the equation to be
      applied to the data.
    · For copper concentrate grade prediction it is necessary
      to carry per cent Cu, per cent S and copper mineralogy.
    · For copper and gold recovery prediction it is necessary
      to carry the per cent Cu and g/t Au values.
  - Add the necessary calculations into the block model set-up
    so that the predictions are made and can be used
    immediately in block value calculations.
Assuming the set-up is mathematically valid and there is enough
data to make the predictions statistically useful it is now
possible to add the desired predictions of SAG throughput, copper
concentrate grade, copper recovery and gold recovery to the
short-term planning outputs. Equally important is setting up a
comparison to see how well the predictions match the actual
production outcomes.
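To make the shape of such a predictive framework concrete, the sketch
below implements a feed rate estimate (limiting mill from DWI and BWI,
capped to plant limits) and the two concentrate grade estimates (Cu:S
ratio and mineral stoichiometry) in Python. Every number in it - the
base indices, depth gradients, mill powers and the Cu:S calibration -
is a hypothetical placeholder rather than a calculation used at any
real operation; only the stoichiometric copper contents of the minerals
are standard values.

```python
# Hypothetical sketch of the predictive framework described above.
BASE_DWI = {"porphyry": 5.0, "skarn": 7.5}     # assumed base index per lithology
BASE_BWI = {"porphyry": 14.0, "skarn": 17.0}   # kWh/t, assumed
CU_IN_MINERAL = {"chalcopyrite": 34.6, "chalcocite": 79.8,
                 "covellite": 66.5, "bornite": 63.3}   # per cent Cu, stoichiometric

def predict_feed_rate(lith, depth_m, sag_power_kw=12_000, ball_power_kw=14_000,
                      t_min=800.0, t_max=2200.0):
    """Limiting throughput (t/h) from DWI- and BWI-based estimates, capped."""
    dwi = BASE_DWI[lith] * (1 + 0.0005 * depth_m)   # assumed depth hardening
    bwi = BASE_BWI[lith] * (1 + 0.0003 * depth_m)
    sag_tph = sag_power_kw / (1.2 * dwi)            # assumed kWh/t calibration
    ball_tph = ball_power_kw / (0.75 * bwi)
    return max(t_min, min(t_max, sag_tph, ball_tph))

def conc_grade_from_cu_s(cu_pct, s_pct, a=8.0, b=14.0):
    """Assay-based estimate: assumed calibrated line against the Cu:S ratio."""
    return a + b * (cu_pct / s_pct)

def conc_grade_from_mineralogy(copper_mineral_fractions):
    """Mineralogy-based estimate from stoichiometric Cu of the logged minerals."""
    return sum(frac * CU_IN_MINERAL[m]
               for m, frac in copper_mineral_fractions.items())

block = {"lith": "porphyry", "depth_m": 250, "cu_pct": 0.60, "s_pct": 1.50,
         "minerals": {"chalcopyrite": 0.7, "bornite": 0.3}}
print("feed rate, t/h:", round(predict_feed_rate(block["lith"], block["depth_m"])))
print("conc grade (Cu:S), %Cu:",
      round(conc_grade_from_cu_s(block["cu_pct"], block["s_pct"]), 1))
print("conc grade (minerals), %Cu:",
      round(conc_grade_from_mineralogy(block["minerals"]), 1))
```

In practice each of these functions would correspond to an added column
in the relevant data table and would be re-calibrated against the
verification data described above.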


PLANNING
The most obvious use of the predictive data is for short, medium
and long-term metallurgical planning. Applying the predictions to
the long-term planning process should reveal future constraints
on throughput or production that will need to be accommodated.
Predictions of this nature feed into the capital planning process
and also provide early warning of systemic problems such as
hardening ore or changing mineralogical mixes.
Using such tools the process engineer can participate more
fully in the long-term planning of the overall project. For
example, hardening ore without a compensating grade change
will lead to a revenue reduction that must be counteracted.
Counteraction measures range from plant optimisation to plant
expansions or even to new plant construction. Marketability of
the concentrate is influenced by its grade, another of the possible
predictions. A trend to lower grade concentrates may lead to early
warnings for customers, a reduction in revenue expectation,
planning for increased throughput or sourcing of new customers
for the concentrate.
Equally important can be the educational aspect that arises
from geometallurgical planning. Provided the predictions are
shown to be reliable the mining engineers and geologists learn
what is important to the process plant and are able to contribute
more effectively to the overall site optimisation. For example, the
geologists may change the way they look at the orebody and
report ore characteristics. Alternatively, the mining engineers may
introduce, or modify, blending practices to smooth the process
plant operation by avoiding the delivery of undesirable ore
packages.
Without an integrated predictive geometallurgical system it is
difficult to participate fully in the site planning process beyond
the fundamental requirement of grade and quality targets.


TROUBLESHOOTING
Troubleshooting is an ongoing responsibility of production
process engineers. Any system that assists in the troubleshooting
process not only makes life easier but also improves the company
bottom line by solving problems faster. An integrated predictive
geometallurgical system would facilitate rapid troubleshooting by
providing a consistent basis for sourcing and analysing data. In
addition, a reliable integrated system would have a significant
benefit of having established trust with the mining and geological
departments and also with the site management. Not only would
it be possible to generate the necessary data to build a case for
change, but it should also be possible to get rapid agreement on
implementation.
More typically there is no available system where data can be
instantly sourced to back up metallurgical arguments. In these
situations it is necessary to carry out ad-hoc geometallurgical
investigations to complement a particular investigation. Examples
may include tracking down periodic recovery drops, reducing
concentrate grade variability or tracking down the source of an
impurity such as arsenic or fluorine.
An ad-hoc geometallurgical investigation would typically
follow a similar pattern to that used to set up an integrated
system. Even though the particular investigation is unique it is
advisable to set up generic tools that can be used in future
investigations. Begin by tracking back as reliably as possible
from the symptom of the problem to the source. For example,
high arsenic in concentrate is related to high arsenic in feed and
arsenic in feed may be the result of any one of a number of
geological features, such as particular ore types or alteration
zones. Tracing back from the problem to the source allows the
issue to be managed by flagging the geological source of the

Flotation Plant Optimisation

EXPANSIONS
Plant expansions are special cases where geometallurgy is
essential to achieving the intended outcome. Many expansions
have been less than successful because the ore to be treated did
not meet the expectations of the designers. This is a direct result
of a poor understanding of the future orebody and the future ore
properties.
The first rule with expansions is to check all assumptions that
drive the process design. If the expansion is throughput related it
is essential to check that tests have been carried out on samples
representative of the ore that is to be treated by the expanded
plant. In the absence of supporting data it is not adequate to
assume that because the ore blend will be 30 per cent X and
70 per cent Y that the properties can be predicted from past
experience. Take multiple samples of X and Y from the ore zones
to be treated by the expanded plant and subject them to all the
necessary tests to prove the design assumptions.
To check if this additional geometallurgical expense is
necessary it is relatively simple to repeat the financial
calculations with ten per cent lower throughput than predicted or
five per cent lower recovery.
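A sketch of that check, assuming entirely illustrative production,
grade, recovery and price figures, is given below; it is not a project
valuation, only a way of seeing how sensitive the revenue case is to
optimistic assumptions.

```python
# Illustrative sensitivity check only; all figures are assumptions.
def annual_revenue(tonnes, head_grade, recovery, payable_price_per_t_metal):
    return tonnes * head_grade * recovery * payable_price_per_t_metal

base     = annual_revenue(8e6, 0.006, 0.88, 7000)          # 8 Mt/a at 0.6% Cu
low_tput = annual_revenue(8e6 * 0.90, 0.006, 0.88, 7000)   # 10% lower throughput
low_rec  = annual_revenue(8e6, 0.006, 0.88 - 0.05, 7000)   # 5 points lower recovery

print(f"Base case revenue:        A${base / 1e6:,.1f} M")
print(f"10 per cent lower tonnes: A${low_tput / 1e6:,.1f} M")
print(f"Lower recovery:           A${low_rec / 1e6:,.1f} M")
```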
An expansion decision must be based on a reasonable
guarantee of a return from the ensuing revenue stream. The same
principles that are applied to exploration companies, proving the
presence of the ore and establishing its grade, are the first steps in
justifying an expansion. In most cases this is well established.
However, in some cases the ore in question may lie in the inferred
category and will need to be elevated to a higher status before
proceeding. Note that if the ore is inferred then it follows that all
available ore properties are inferred.


If the expansion is based on treating future ores according to
established lithological categories then check that lithology has
been a useful indicator of process properties in the past. If this
has not been the case then the future will be no different to the
past and it will be essential that metallurgical tests are conducted
on samples of the future ores.
When carrying out a metallurgical test program on the future
ore there will be a set of obvious tests to perform and there will
be some less obvious tests. Perform the obvious tests
(comminution, flotation, abrasiveness, etc) and then carry out
some speculative tests, such as full elemental analysis or
mineralogical analysis of concentrates. For reference it is best to
carry out the same tests on current production samples. These
types of tests are simple and inexpensive checks designed to
identify if there are any new problems that may arise from the
future ore. If problems are confirmed or indicated by the tests
then a more comprehensive test program may be required to
identify mitigation strategies.
Another possibility in preparing for an expansion is to conduct
a plant trial on the future ore. This is highly desirable if it is
known that the future ore differs from the past ore. The existing
plant is the ideal pilot plant facility. The ability to conduct a
meaningful plant trial is contingent upon ore access and ore
volume availability. A plant trial has additional benefits such as
providing concentrate samples for customer evaluation and
providing a clearer evaluation of plant areas that are dependent
upon concentrate availability, such as regrinding, thickening and
filtration. Another less obvious benefit is that the plant operators
are exposed to the future ore. It is usually the operators that
notice operational effects first and the crew should be well
prepared before and debriefed after the trial.
At the implementation stage of an expansion project the major
risk areas must be mitigated to the satisfaction of the disciplines
on site and to the satisfaction of the management team. If the
factors listed below have been established definitively then the
expansion has been underpinned geometallurgically:

• future ore tonnage and grade established,
• future ore metallurgical response is known,
• acceptable concentrates can be sourced from future ore,
• future ore throughput properties are known and have been
  used in comminution design, and
• the variability of the future ore is understood.


DATA ANALYSIS
Effective and efficient analysis of geometallurgical data is
essential to any geometallurgical exercise. It is often necessary to
manipulate very large databases and spreadsheets and summarise
them into meaningful sources of information. It is also necessary
to correlate properties from the geological and metallurgical
realms so that trends and indicators can be established. It is in the
data sets that the important information resides, the difficult part
is extracting it. Even more difficult can be communicating the
information to others.
The breadth of available geological data has been described
previously. Also described has been the necessity to develop
analysis tools useful to process engineers. Unfortunately it is not
possible to provide a one-size-fits-all solution to geometallurgical
data analysis as the forms of geological and mining data are as
diverse as the orebodies they describe.
The first step is to organise the data into a form that can be
manipulated meaningfully. One method (that preferred by the
author) is to arrange all the available geological data into a single
spreadsheet table against a common set of depth intervals. In
addition the data is de-surveyed to convert drillhole numbers and
depths into 3D spatial (ie X, Y, Z) coordinates. This is best done
with the assistance of a geologist, and with direction as to which
data is required the task can be completed in hours or minutes.
Once the database is available it is then possible to begin detailed
analysis.
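A minimal illustration of the kind of table this produces is sketched
below, assuming pandas and hypothetical hole, interval and lithology
codes: the irregular lithology intervals are mapped onto the regular
assay intervals so that every record carries both sets of properties.

```python
# A minimal sketch, assuming pandas; names and codes are hypothetical.
import pandas as pd

assays = pd.DataFrame({
    "hole": ["DD001"] * 4,
    "from_m": [0, 2, 4, 6], "to_m": [2, 4, 6, 8],
    "cu_pct": [0.10, 0.45, 0.80, 0.62]})

lith = pd.DataFrame({
    "hole": ["DD001", "DD001"],
    "from_m": [0.0, 3.2], "to_m": [3.2, 8.0],
    "lith_code": ["OXI", "QFP"]})

def lith_at(hole, depth):
    """Lithology logged at a given downhole depth (interval midpoint)."""
    rows = lith[(lith["hole"] == hole) &
                (lith["from_m"] <= depth) & (lith["to_m"] > depth)]
    return rows["lith_code"].iloc[0] if len(rows) else None

mid = (assays["from_m"] + assays["to_m"]) / 2
assays["lith_code"] = [lith_at(h, d) for h, d in zip(assays["hole"], mid)]
print(assays)
```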
Spreadsheet packages, such as Microsoft Excel, contain many
useful statistical and data manipulation tools. Probably the
most useful tools for manipulating large and unwieldy
geometallurgical databases are the Pivot Table tool and the
AutoFilter tool. Excel tutorials will not be provided here but all
readers that deal with large data sets of any kind are strongly
encouraged to become familiar with these tools. They have the
ability to extract data and trends from million-cell databases with
a few keystrokes.
Before making use of the data it is essential that each column of
data be checked for unusual features or errors. For example,
sometimes numerical data is actually present as text in the
spreadsheet. Often text information has a minimum length and
what appears to be a three letter code is three letters followed by
ten spaces. Such data anomalies confuse the analysis and the
analyser. Check for data artefacts inserted by the geologists, for
example representing missing data with 9999. Check for entry
errors such as a mix of ppm values in a per cent column or per cent
values greater than 100. There are infinite possibilities for
problematic data in databases. The general rule is if unexpected
results appear during analysis then go back and check if the data
that you thought you were incorporating is in the appropriate form.
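The checks described above lend themselves to a few lines of code. The
sketch below, assuming pandas and a deliberately faulty hypothetical
extract, strips padded text, converts numbers stored as text and flags
sentinel values; it is illustrative rather than exhaustive.

```python
import pandas as pd

# Hypothetical extract containing the kinds of problems described above.
df = pd.DataFrame({
    "lith_code": ["QFP       ", "OXI", "QFP       "],   # padded text codes
    "cu_pct":    ["0.45", "0.80", "-9999"],             # numbers stored as text
    "as_ppm":    [120.0, -9999.0, 35.0]})               # sentinel for missing data

for col in df.columns:
    if df[col].dtype == object:
        df[col] = df[col].str.strip()                    # remove trailing padding
        converted = pd.to_numeric(df[col], errors="coerce")
        if converted.notna().mean() > 0.9:               # mostly numeric: convert
            df[col] = converted
    if pd.api.types.is_numeric_dtype(df[col]):
        df.loc[df[col] == -9999, col] = float("nan")     # sentinel -> missing
        if col.endswith("_pct") and (df[col] > 100).any():
            print("Suspicious values (ppm in a per cent column?) in", col)

print(df)
```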
It is usually necessary to add additional data columns to the
geological databases. For example, ore and waste are rarely
defined in the geological data because these are mining terms. A
data column that classifies the intervals into simple ore and waste
sets according to a reasonable grade cut-off level is easily added.
A consequent sort on ore then removes all the material that
process engineers are not interested in.
In banded or vein orebodies it may be necessary to re-interpret
the ore and waste definitions to match the reality of mining
limitations. For example, ensure that 1 m waste intervals within
an ore band are considered as ore because they will not be
selectively excluded. Similarly, if the minimum mining cut width
is 4 m in a vein deposit then include some waste either side of
significant ore intersections. These limitations are best discussed
with the miners before being implemented in the database. Once
implemented it is essential that a reality check is carried out to
ensure that the effect of the data manipulation is consistent
whenever it is invoked. Check between ten and 20 instances
through the database of any new data manipulations.
In databases compiled by different geological teams over time
there are likely to be duplicate assay columns or lithology
columns. Decisions must be made as to how this is reduced to a
single consistent column describing that property.
Another common manipulation is to round the X, Y, Z data into
useful intervals. For example, it is much more useful to know that
ore is within the depth range RL 100 to RL 150 m than it is to
know that the ore RL is precisely 123.27 m. This then allows the
database to be analysed by depth slices or slices in the north or
easterly directions. It also allows the ore to be examined as
blocks. Blocks defined by this rounding method are usually much
larger than the mining model blocks and are best distinguished by
the term Metblock.
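A minimal sketch of both additions, assuming pandas, a hypothetical
cut-off grade and a 50 m Metblock size, is given below.

```python
# Hypothetical ore/waste flag and Metblock coordinates.
import pandas as pd

df = pd.DataFrame({
    "X": [10432.7, 10457.1, 10519.3],
    "Y": [20891.2, 20910.8, 20877.4],
    "Z": [123.27, 98.40, 151.90],
    "cu_pct": [0.62, 0.18, 0.95]})

CUT_OFF = 0.30        # per cent Cu, assumed economic cut-off
BLOCK = 50            # Metblock size in metres, assumed

df["is_ore"] = df["cu_pct"] >= CUT_OFF
for axis in ["X", "Y", "Z"]:
    df[f"{axis}_block"] = (df[axis] // BLOCK) * BLOCK   # floor to block origin

ore = df[df["is_ore"]]                                   # drop the waste records
print(ore.groupby(["X_block", "Y_block", "Z_block"])["cu_pct"].mean())
```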
Another useful addition to the database is to indicate those drill
intersection samples that have been utilised in the relevant
metallurgical test programs. Again, this can be tedious, but it
provides an excellent basis for evaluating the representative
nature of past sampling and is a guide to selection of samples for
future test programs.
If it is not possible to correct data to something meaningful,
then a number of alternatives exist. One is to eliminate the
offending data record or column by deletion (or removal to a
place so it is then known to have been removed). Another option
is to fix the data, such as by averaging the corresponding values
for the records above and below. Another is to move a decimal
point if it is unequivocal that the reported value is not real but is
the result of a typographical slip. Any manipulations should be
noted and the information returned to the geologists. This is
usually valuable information that can lead to improvements in the
integrity of the geological database.
Once verified the database is ready to use. A number of
guidelines are useful here:
1. Identify the ore that is important to the metallurgical issue
   being addressed.
   Analysing ore that will never be mined or will be mined
   15 years from now is rarely useful. Identify, with the
   assistance of mining engineers, the ore that impacts on the
   decision or problem at hand and then concentrate on it. One
   method may be to add an additional column to the database
   and flag the important ore with a one or a TRUE. The data
   can then be sorted on this basis. Another alternative is to
   assign a future year number to ore intervals but this is much
   more complicated.

2. Identify the properties that are of most importance to the
   metallurgical issue.
   These properties then become the subject of data analysis
   and guide the analysis to the important aspects of the data
   set. For example, in a copper flotation plant the copper
   grade is obviously important but so is the sulfur grade, the
   arsenic grade and probably the gold grade. Using the data
   tools mentioned above, it is then simple to make these
   measurements the subject of the analysis.

3. Use the data tools to focus on the important ore and the
   important properties.
   Using pivot tables it is possible to quickly construct a table
   constrained to ore in the relevant time frame and examine
   average Cu, Au, S and As grades and identify the number of
   intervals identified in each category. It is also possible to
   view this data in plan on the orebody by arranging it within
   the rounded X and Y data axes categories. It is then possible
   to step down through the orebody by selecting an individual
   RL value or a set of RL values.
Once the data is in this analysis form it is then possible to
examine the orebody from a process perspective. For example:
• trends of arsenic with location, depth or ore type can be
  quickly extracted;
• trends in Cu:S with time can be examined; and
• specific locations for extracting new metallurgical test
  samples can be identified and a click will instantly drill
  down into the database and show the relevant drill
  intersections in that particular Metblock.
It is also possible to use graphical and statistical techniques to
search for correlations between factors, for example plot per cent
As against per cent Au on an X/Y plot to see if gold and arsenic
are related. If a relationship is suspected within a particular ore
type then use the pivot table drill down facility to extract the
relevant records then plot the specific data comparison again.
Tools such as correlation tables can be used to search large data
sets for significant linear correlations between large numbers of
pairs of data columns. Multiple linear regression can be applied
to see if a particular property is a function of a number of others.
Regression tools are particularly useful when investigating the
relationship between the limited metallurgical data set and the
extensive geological data set. Assuming the drill samples that
were subject to metallurgical tests have been correctly identified,
it is possible to extract the geological data for those intervals from
the database, calculate the averages of the interval properties and
correlate them to the measured metallurgical properties. In this
manner it may be possible to identify geological predictors of
metallurgical properties.
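A minimal sketch of that last step is shown below, assuming numpy and a
small hypothetical set of metallurgical results matched to averaged
geological properties for the same intervals; simple correlation
coefficients and a multiple linear regression are computed.

```python
import numpy as np

# Hypothetical geological averages for tested intervals: PLI, RQD, Cu:S.
geo = np.array([[2.1, 55, 0.40],
                [3.4, 72, 0.35],
                [4.0, 80, 0.55],
                [2.8, 60, 0.30],
                [3.9, 85, 0.50]])
bwi = np.array([13.5, 15.8, 16.9, 14.2, 16.5])     # measured BWI, kWh/t

# Linear correlation of each geological measure with the measured BWI.
for name, col in zip(["PLI", "RQD", "Cu:S"], geo.T):
    r = np.corrcoef(col, bwi)[0, 1]
    print(f"correlation of {name} with BWI: {r:+.2f}")

# Multiple linear regression: BWI ~ intercept + PLI + RQD + Cu:S.
X = np.column_stack([np.ones(len(bwi)), geo])
coeffs, *_ = np.linalg.lstsq(X, bwi, rcond=None)
print("regression coefficients:", np.round(coeffs, 3))
```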
Ideally the metallurgical test samples will have been made up
of contiguous intervals of single lithologies; this is much more
useful than single test samples being formed as composites from
across the orebody. Especially problematic is when the only
available data relates to composites formed from across the
orebody and from a mix of lithologies. Both these situations
make it almost impossible to establish any linkages between
geological and metallurgical properties.
One of the most rewarding outcomes for the process engineer
from such an analysis is the deeper understanding that is gained
about the orebody. With this understanding the process engineer
is well equipped to contribute to discussions on the future of the
operation and to converse meaningfully and factually with the
geologists and mining engineers on site. If nothing else is gained
then the improved communication lines across the disciplines are
usually worth the effort.
