As a metallurgist, you’re always prioritising. There are a hundred different improvements you could make in the plant, and you’ve got to decide: what will get me the most bang for my buck? We think the answer lies in taking a closer look at the data you already have. You probably spend your days in a sea of spreadsheets, processing data, building daily reports, populating downtime logs, and pulling together metal accounting. We think there are better ways to handle some of those tasks, ways that make better use of your time and help you optimise your plant.
If your plant is consistently falling just short of production targets, or your costs have inched higher in recent years, the solution may already be hidden in your existing data. The key is to use software tools that give you a richer view of your data to uncover hidden inefficiencies. Here’s why peeling back those layers matters—and how it can drive significant improvements.
The limits of surface-level data analysis
Excel is widely used in mineral processing plants for basic trend analysis and reporting, and it does that job well. But it won’t identify subtle, recurring inefficiencies, and you can miss the nuanced details that are critical to continuous improvement.
“You might know the crushing circuit is your bottleneck, but you might not realise that Conveyor 6 is having multiple trips on under-speeds throughout the day, the week, the month, and that’s causing 5 per cent of your downtime,” says Josh Hirsch, Senior Metallurgist at Mipac.
“Having that data more readily available means you can actually allocate your resources and you can communicate that to the maintenance team.”
You might already know what the major issues are, but with the right depth of data you can pick up a few more of those nuances, and they can cumulatively translate into big wins.
“At one plant, we had all the downtime reporting being done in a VBA tool, and once we automated it and brought it into a more robust system, we started to realise that we had a level switch causing 30-second delays at regular intervals. It wasn’t visible in our previous analysis because we were only looking at the largest single events of the day,” says Josh.
A closer look at the data layers, analysing downtime by count and duration rather than just by headline events, revealed a recurring problem they hadn’t noticed before. The issue? A single level switch. The fix? Quick and simple. The impact? Significant.
This story highlights an important truth: small, frequent problems often have a cumulative impact that rivals—or even exceeds—the losses caused by major events. Josh explains that their unnoticed 30-second delays added up to a staggering two hours of lost production time weekly. Over the course of a year, that’s the equivalent of 104 hours of lost production.
To put it into perspective: recovering two hours of weekly production at a small copper mine could be worth over two million dollars a year, and that’s before accounting for the additional costs of the downtime itself. But insights like these aren’t easy to identify without software that allows for detailed analytics and flexible views of the data.
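To sanity-check that figure, here’s a minimal back-of-envelope sketch in Python. Every plant parameter in it (throughput, head grade, recovery, copper price) is an illustrative assumption, not a figure from the plant in Josh’s story:

```python
# Back-of-envelope value of recovering two hours of production per week.
# All plant parameters below are assumed for illustration only.
THROUGHPUT_TPH = 400       # ore milled, tonnes per hour (assumed)
HEAD_GRADE = 0.006         # 0.6% Cu head grade (assumed)
RECOVERY = 0.90            # overall recovery (assumed)
CU_PRICE_PER_T = 9_000     # USD per tonne of copper (assumed)

hours_per_year = 2 * 52                                      # two hours a week
cu_tonnes_per_hour = THROUGHPUT_TPH * HEAD_GRADE * RECOVERY  # tonnes Cu per hour
value = hours_per_year * cu_tonnes_per_hour * CU_PRICE_PER_T
print(f"Recovered production value: ${value:,.0f} per year")
# -> roughly $2.0M with these assumed numbers
```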
The benefit of peeling back the layers
Once you identify small inefficiencies, deeper analysis can highlight long-term improvement opportunities. For instance, frequent but minor fluctuations in pump speed, when analysed over time, might reveal subtle wear in the equipment that affects throughput. Correcting these small issues improves process stability and reduces the likelihood of major breakdowns.
You might identify a recurring minor delay during a specific shift, or when processing a particular ore type. Small, ongoing adjustments to the flotation parameters can then improve recovery rates incrementally, leading to more consistent production.
Tools designed for mineral processing plants, like the MPA software suite, enable you to interpret the same data in multiple ways, revealing insights that would otherwise remain buried.
For instance, instead of looking at downtime only by total duration, you can analyse it along several dimensions (sketched in the example after this list):
- Frequency: Do the same small events recur often enough to add up over time?
- Short event trends: Are short interruptions clustered around specific shifts, equipment, or processes?
- Cumulative impact: How do small, frequent issues compare to occasional major ones in terms of overall losses?
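As a concrete illustration, here’s a minimal pandas sketch of that multi-view analysis. The event log, column names, and durations are all hypothetical, loosely modelled on the level-switch story above:

```python
import pandas as pd

# Hypothetical one-day downtime log; in practice this would come from
# your historian or downtime reporting system.
big_events = pd.DataFrame({
    "cause": ["SAG mill trip", "Conveyor 6 under-speed"],
    "duration_min": [95.0, 22.0],
})
# The level switch trips for ~30 seconds at regular intervals all day.
level_switch = pd.DataFrame({
    "cause": ["Level switch"] * 40,
    "duration_min": [0.5] * 40,
})
events = pd.concat([big_events, level_switch], ignore_index=True)

# View 1: headline events -- the largest single stoppages of the day.
print(events.nlargest(3, "duration_min"))

# View 2: count and cumulative duration by cause. Forty half-minute
# trips (20 minutes) now show up alongside the big single events.
by_cause = (events.groupby("cause")["duration_min"]
                  .agg(count="count", total_min="sum")
                  .sort_values("total_min", ascending=False))
print(by_cause)
```

A headline-only view never surfaces the level switch; the count-and-total view puts its twenty minutes a day right next to the major events.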
Mining operations like Ok Tedi, Northern Star’s Pogo, Glencore’s Kamoto Copper and Mount Isa, and Anglo American’s Quellaveco are examples of plants seeing the benefit of software digitalisation.
Tools like Golden State let them visualise every area of their plant processes live on one screen. With a live view, they can immediately spot when a plant area is operating outside of the ideal operating range.
Software now lets them detect production deviations early, too. Once they’ve spotted that a plant area is operating outside the ideal range, they can drill down to identify the specific process causing the deviation.
These software systems help uncover opportunities and often lead to quick fixes that make a big difference to production.
As Josh puts it, “It was only because we had detailed analytics and access to the data that the problem was found. Without that depth, we would still be chasing our tails, looking at the biggest but infrequent downtime events and missing the frequent smaller ones.”
This doesn’t mean you need more data sources or entirely new systems. It’s about getting the most out of the data you already have. And it may free up hours in your day to dedicate to the improvements that matter most.
Cumulative impact of small issues
Small inefficiencies, like frequent interruptions from minor equipment malfunctions or slight variations in ore, often accumulate over time and can cause significant long-term losses. For example, minor under-recovery because flotation conditions aren’t ideal might seem negligible in the short term, but when it’s compounded over weeks, months, or years, it leads to major losses in overall production.
Likewise, a small fluctuation in particle size distribution can go unnoticed when you view it on a daily basis. But through deeper data analysis, it becomes obvious that these small variations consistently impact recovery and cost the plant millions of dollars per year. Adjusting the mill to account for these fluctuations is a simple process, and with the right depth of data on hand it can be a quick win.
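As a sketch of what that deeper analysis can look like, the snippet below correlates a year of daily grind size (P80) against recovery. The data is synthetic, built so the effect is invisible day to day but clear over the full window; the column names and all the numbers are illustrative assumptions:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
days = 365

# Synthetic daily data: recovery drifts down slightly as the grind coarsens.
p80 = rng.normal(106, 4, days)                              # grind size, microns
recovery = 89 - 0.15 * (p80 - 106) + rng.normal(0, 0.8, days)
df = pd.DataFrame({"p80_um": p80, "recovery_pct": recovery})

# Day to day, the effect is lost in the noise...
print(df.head())

# ...but over a year the relationship stands out.
print(f"P80 vs recovery correlation: {df['p80_um'].corr(df['recovery_pct']):.2f}")

# Rough slope: recovery points lost per micron of coarsening.
slope = np.polyfit(df["p80_um"], df["recovery_pct"], 1)[0]
print(f"~{abs(slope):.2f} % recovery lost per micron of P80 increase")
```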
How to get the most from existing data
Your plant already collects vast amounts of data, but it probably isn’t analysed to its full potential. If you focus on improving how data is visualised, you and your team can uncover deeper insights.
Start by creating dashboards that provide a more granular breakdown of key metrics like ore grade, reagent use, and recovery by shift or by specific equipment. By breaking down performance data into smaller, more specific categories, operators can spot trends and issues that might otherwise be missed in a high-level report.
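Before committing a view like that to a dashboard, you can prototype the breakdown in a few lines. The records, metric names, and the night-shift gap below are all hypothetical:

```python
import pandas as pd

# Hypothetical shift-level flotation records; a dashboard would pull
# these from your historian, LIMS, or metal accounting system.
data = pd.DataFrame({
    "shift":           ["day", "night", "day", "night", "day", "night"],
    "recovery_pct":    [89.2, 87.1, 89.5, 86.8, 89.1, 87.4],
    "reagent_g_per_t": [41, 46, 40, 47, 42, 45],
})

# The plant-wide average looks healthy...
print(data[["recovery_pct", "reagent_g_per_t"]].mean())

# ...but the by-shift breakdown exposes a consistent night-shift gap.
print(data.groupby("shift")[["recovery_pct", "reagent_g_per_t"]].mean())
```

The same split by equipment, ore type, or crew turns a single headline number into a set of questions worth asking.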
Excel might feel like a safe, familiar choice, but it can also hold you back from seeing the full potential of your operation. By diving deeper into your existing data, you can uncover hidden inefficiencies, fix them faster, and hit your production targets more consistently.
Read more about Mipac's software