Collaboration key to faster geology modelling advances

Richard Roberts

Top image: AusIMM conference panel members (left to right) Dhaniel Carvalho, Kathleen Hansmann, Sean Horan, Alex Boucher and Michelle Carey
Need for ‘open innovation’ creates challenges, opportunities

Technology won’t hold back geological modellers in their quest for speed or their ability to deal with increasingly complex datasets, the AusIMM’s first Mineral Resource Estimation conference heard.

“I don’t think technology is the barrier,” Imdex product management and marketing chief Michelle Carey told the conference in Perth, Western Australia.

“It really does come down to what’s going to help you release the most value … and what are you most willing to change to achieve that.”

A panel of industry leaders, convened to discuss the “next frontier for geology software” and its potential impact on mineral resource estimation, canvassed the changes leading vendors were making as they adjusted to market demands and to shifts in the fast-moving wider information technology world.

“Commercial [mining] products are limited in their rate of change due to the need to maintain and support applications and existing customer workflows,” the session preamble read.

“On the other hand, coding and open-source applications are allowing skilled geologists to customise and build innovative new workflows.”

Carey said she did not envy the main geological and mining software incumbents as they grappled with the industry’s readiness for, and speed of, cloud migration, and with increasing market pressure for “openness”, meaning the sharing of application programming interfaces (APIs) and other intellectual property in the name of progress.

The conference heard about advanced models, simulations and other activities generating terabytes of data and requiring more dynamic transmission and review channels. BHP head of geoscience Campbell McCuaig said the “fire hoses of data coming at us [was] … a real challenge”.

“We’re [Imdex] one of the many groups increasingly deploying sensors in the field and creating big streams of sensor data that we’re putting into cloud infrastructure,” Carey said.

“Those streams of data are being consumed by a lot of the software products that you guys use in your day-to-day. Now we do have some [software] capability ourselves. One of the main reasons we have some of that capability is because we’re creating challenges that we also have to be part of solving.

“We’re saying we want to allow real-time decision making, which can mean you need real-time modelling, [so] we’re creating some capability to do that.

“Everything that Imdex does is in the cloud … and there aren’t really technological barriers to us being in the cloud.

“[However] if you’re one of the long-term incumbent software companies I think the challenge is … you’re so embedded in the processes and workflows and systems and day-to-day lives of resource companies that it can be very hard for them to allow components of your system to be in the cloud.

“There’s a resistance to change.”
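None of the panellists detailed an implementation, but the kind of “real-time modelling” Carey describes can be illustrated with a minimal sketch: updating estimate statistics incrementally as each sensor reading streams in, rather than waiting for a batch model re-run. The class, field names and grade values below are hypothetical, not drawn from any vendor’s product.

```python
# Illustrative sketch only: online (streaming) update of grade statistics
# using Welford's algorithm, so each new sensor reading refines the
# estimate immediately without storing or re-processing the full stream.

class StreamingGradeStats:
    """Running mean and sample variance of a stream of grade readings."""

    def __init__(self):
        self.n = 0        # readings seen so far
        self.mean = 0.0   # running mean
        self._m2 = 0.0    # running sum of squared deviations

    def update(self, grade: float) -> None:
        self.n += 1
        delta = grade - self.mean
        self.mean += delta / self.n
        self._m2 += delta * (grade - self.mean)

    @property
    def variance(self) -> float:
        return self._m2 / (self.n - 1) if self.n > 1 else 0.0


# Usage: each downhole reading updates the model state on arrival,
# instead of contributing to a nightly batch re-run.
stats = StreamingGradeStats()
for reading in [1.2, 0.9, 1.5, 1.1, 1.3]:  # g/t Au, made-up values
    stats.update(reading)
print(round(stats.mean, 2), round(stats.variance, 3))  # → 1.2 0.05
```

The point of the sketch is the shape of the workflow, not the statistic itself: any model whose state can be updated per-reading can sit directly on a cloud sensor stream.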

Other panellists, including Datamine’s geology product manager Kathleen Hansmann and Seequent (Bentley Systems) technical solutions director Alex Boucher, agreed the market appetite for gains from software and other technology was expanding rapidly. Constraints were seldom IT-related.

Sean Horan, principal resource geologist with SLR Consulting in Canada and a noted programmer, said cloud computing was “something that’s been on our lips for a long time”.

“It’s becoming a reality and hopefully in the near future it will be more widely adopted.

“There’s also optimisation of algorithms.

“There are solutions out there.

“I think fundamentally we need to address some core issues and I think that one of the biggest things we have to focus on are the platforms.

“Our models are getting bigger and bigger and more complex and we sort of have one shot at doing these models because they take so long to run. We’re talking about terabytes of data with simulations.

“So we really need to focus on speeding up these processes … [and then] I think we need to work on visualisation and accessibility.”

Boucher said the “elasticity of the cloud” meant big data collaboration was “no big deal”.

“A lot of limits disappear,” he said.

“We have a nice opportunity here to think about … If we started today, would we do things the same way? Would we have the same process, the same workflow, the same gate?

“As we get new technology … the algorithms, the infrastructure, the elastic cloud in particular – a lot of different things that were not available before – how can we revisit the workflow we have?”

Hansmann, credited with a pivotal role in guiding the direction of Datamine’s resource modelling software, said the industry leader had built up a “huge amount of technical debt” over its decades in the market.

“We’re constantly refactoring stuff,” she said.

“We always have multiple refactoring projects going on and you won’t see a big bang where suddenly everything’s a lot faster.

“But slowly we’re going through and refactoring different processes and ensuring that they work in exactly the same way and they’re just written on more modern platforms and they’re written better and they are faster.

“The big difference between now and 20 years ago is that we have the capability to produce a new model every single day, and make sure that everything you have is fed with the most live data that you’re able to use [and] leveraging all the modern technology to make sure that you’re sharing stuff effectively. It doesn’t matter that your mine is thousands of kilometres away. The data is all there and it’s all accessible and shareable.

“So I think it’s about making sure that [user] workflows support that.

“I don’t know when the next breakthrough moment is going to be where technology is going to change a lot. Maybe AI is going to have something to do with that.

“But it’s generally just about communication and partnership between different groups working together to get a better solution.”

Asked about “better integration” of competing software products to benefit users in a market that is probably worth US$500 million a year, Hansmann said, “we should all be able to speak to each other”.

“We are all just software and at the end of the day it’s just data being passed between,” she said.

“I think we’re all doing what we can to make that easier for everyone.

“Recently we’ve been providing our API to Leapfrog [Seequent].

“We had a lot of common users and there’s struggles getting files between them. So it’s important that we do this.”
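Hansmann’s point that “it’s just data being passed between” can be sketched in a few lines. Where no direct API link exists, packages typically fall back on round-tripping data through a neutral format; the drillhole collar fields and values below are hypothetical, and neither vendor’s actual API is shown.

```python
# Illustrative sketch only: exchanging a tiny drillhole collar table
# between two packages via CSV, a neutral format commonly used when a
# direct API connection is unavailable.

import csv
import io

collars = [
    {"hole_id": "DH001", "x": 4500.0, "y": 10250.0, "z": 310.5, "depth": 120.0},
    {"hole_id": "DH002", "x": 4550.0, "y": 10300.0, "z": 308.2, "depth": 95.5},
]

# Export: the "sending" package writes the table.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["hole_id", "x", "y", "z", "depth"])
writer.writeheader()
writer.writerows(collars)

# Import: the "receiving" package reads it back, restoring numeric types.
buf.seek(0)
received = [
    {k: (v if k == "hole_id" else float(v)) for k, v in row.items()}
    for row in csv.DictReader(buf)
]

assert received == collars  # nothing lost in transit
```

A shared API replaces this file hand-off with direct calls, but the underlying exchange is the same: structured records moving from one package’s data model into another’s.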

On the sidelines of the conference, Maptek veteran geologist and senior technical sales specialist Steve Sullivan said companies differed in their policies on, and preparedness for, working together.

“Sharing APIs from our perspective is not a big deal. We have an API and a software development kit, SDK, that is available to all of our customers but also all of our competitors,” he said.

“So it is quite open.

“There’s always going to be some connection required to join data and processes together.

“That’s why we have an API that allows people and competitors and other application developers to access that.

“We do have some competitors that we work with very closely and we share APIs and updates. We have other competitors who don’t want to talk to us after 10 years of discussions.”

Sullivan said web-based software developments offered benefits to both users and vendors.

“One of the constraints [in the market] is that a lot of our mining communities are very remote,” he said.

“It’s good to go [to] the cloud but don’t assume that sites have got the bandwidth that we might have in the cities because once you get out in some of these remote locations, unless you’re on some of these satellite systems like Starlink, you just can’t get the connectivity that you need to access your data.

“That’s one of the limitations we have at the moment.”
