‘Trust, but verify’: Ivy Chen urges peers to think hard about AI


Richard Roberts

Top image: (Left to right) Jacqui Coombes speaks with John Kirkman, Paul Hodkiewicz, Ivy Chen and Mark Stickells
No rush to hand over resource modelling keys, Perth conference hears

What role for people as mining and consulting houses train machine learning models and AI to relieve bottlenecks in the industry’s “new era of data abundance”? “We have the capacity to think outside the box a lot more than a machine that lives in a box”.

Ivy Chen spoke for many in the room at the world’s biggest gathering of resource geologists when she suggested, “trust, but verify” – the English translation of the Russian proverb, “doveryai, no proveryai” – should remain the industry’s mantra as geoscience data gathering and geological modelling technologies infused with AI reset resource estimation workflows and maybe paradigms.

“Don’t let the computer have the last word,” said Chen, a respected geologist and company director in Australia and a key figure in the industry’s efforts to improve corporate governance around resource reporting over the past decade.

She was speaking on a panel discussing technologies and trends shaping “next-generation” resource estimation – literally the bedrock of mines and projects – including modelling software fed with a mindboggling array of high-resolution, multivariate data and run on supercomputers. The panel, moderated by Dr Jacqui Coombes, also featured leaders in key connected fields: Pawsey Supercomputing Research Centre CEO and 2024 Member of the Order of Australia recipient Mark Stickells; veteran geologist and Samarah Solutions principal Paul Hodkiewicz; and Enterprise Transformation Partners managing director John Kirkman, probably one of the few genuine experts in Industry 4.0 mining interoperability.

The second Mineral Resource Estimation Conference organised by the Australasian Institute of Mining and Metallurgy (AusIMM) again drew more than 500 predominantly geoscience professionals from over 20 countries to Perth, Western Australia.

Mineral resource estimation was the “critical conduit between geosciences and mine planning, forming the nexus of the mining value chain by translating geological data into actionable insights that define mine design, operational strategy and ultimately cash flow and value creation”, the conference heard. But resource estimation workflows are straining to keep up with the vast influx of complex data, and the assimilation of AI into resource estimation has opened up exciting new frontiers and maybe a Pandora’s box or two (depending on how full your glass is).

“Think about AI coming and helping humans to do simulations, mine planning, etc,” another top name in the field, BHP’s Dr Ilnur Minniakhmetov, told delegates. “I’m thinking about AI helping you to automate processes and then thinking a level a bit higher than what we do right now.”

Rio Tinto digital integrated mine research lead Emet Arya said: “Our orebodies are getting deeper and lower grade with more complicated mineralogy. It’s like going from the gravity law of Newton to Einstein’s relativity. It is complicated … so I think we need advanced geostatistics and we need AI and we need more accurate models.”

How do people stay “meaningfully” engaged in the MRE link in mining’s value chain? “I think the one thing which AI hasn’t managed to do yet is we haven’t managed to teach AI to be that cynical, grumpy, black-hat resource geo that some of us have a reputation of being,” Chen said. “It’s this grumpy, cynical black hat that’s going to test anything that machine learning or AI throws at us.

“We will ask, are you sure? What’s that based on? And we need to keep doing that. That’s our human advantage.

“We can try and teach that machine as much as we know – we will have really good training data sets and models – but beyond that, there’s still that subconscious processing that we all call gut instinct that is almost impossible to replicate in a synthetic intelligence, I believe.

“For now, anyway. I could be proven wrong probably in two or three years because of the speed that these machines are evolving. But I think we just have to be us – the nasty, sneaky, suspicious us that just doesn’t trust other people too much and says no, why? How? Who?”

The conference heard that while AI had more of the limelight at the inaugural MREC, even threatening to take over the event’s celebrated Parker Challenge resource modelling contest, it now seemed more settled in the co-pilot role it had assumed, at least commercially, in consumer software products and some new mining products. “Machine learning is here supporting our resource models,” Glencore principal resource geologist Dhaniel Carvalho said on a separate panel.

“We will do critical thinking. They will drive,” Minniakhmetov added. Even in the face of relentless cost pressures and cost-alleviating (aka job-cutting) steps in the industry? “The answer is, what is the cost of error?” he said. A ChatGPT misdirection on a school assessment didn’t have the same consequences as a multi-million or billion-dollar misstep around high-grade mineralisation. “So I don’t think a manager [in cost-cutting mode] will decide that is the right thing to do. We’ll learn from our mistakes but I don’t think that will happen because [the] cost of error is very high.”

Freeport-McMoRan chief geologist Rachel Rapprecht said: “It’s not a silver bullet. You need the geologic input, you need the evaluation and you need to critically look at the output and critically consider what you’re going to input. I think there’s also a component of geologists maybe not being comfortable with a black box machine learning algorithm and wanting to understand how the geology influences the final result that we’re working with. I know from conversations with others that having a black-boxy solution makes people kind of uncomfortable.”

Newmont Corporation head of global resource management Arja Jewbali also believes the complexity argument works both ways.

“Models are not static,” she said. “We’re mining the deposit. We’re constantly collecting information and deposits are not stationary. You’re constantly evaluating how the model is performing. Are your parameters right? Is your domaining right? I feel like there’s a lot of complexity in the work that we do.

“Maybe I have a lack of vision but I don’t see machine learning and AI are there yet.”

The conference heard mining’s general struggle to unlock greater value from data, partly due to partitioning across its value chain and partly to its status as an afterthought in the Big-Tech-osphere, was being compounded by demands for more complex models that served a faster-moving business calculus.

Coombes said AI, quantum computing and supercomputing could address the bottlenecks and “elevate resource estimation practices to meet the demands of modern mining”. She believes “we are at a pivotal nexus where we can fundamentally reimagine resource estimation”. But the tools and industry thinking needed to evolve.

Stickells, whose government and university-backed Pawsey centre hosts Australia’s most powerful supercomputer, said he doubted the country’s past claim to be a world superpower in mining software was still valid.

“I spent most of the 2010s in the energy and resources space … and I remember a line about 70% of the world’s mining software [coming out of Australia],” he said. “I think those numbers have changed. It’s about time we write another chapter.

“The software stack in this industry is generally very powerful and very strong in certain areas but I think fragmented [in others]. The growing strain of data volumes that is overwhelming many sectors is also being felt in this industry, and the challenge is to turn that data deluge that is coming from sensors, satellites [and] simulations … into tools that are useful for your industry.

“[For the local industry] I think there’s a great opportunity to write a new chapter.”
