can now be installed relatively quickly. “In the utilities sector,
we can go from original data to a pilot application in under
four weeks,” says Castaldini. “In the oil and gas sector, we
are currently looking at three months, but we expect that
to decrease as we co-develop applications and get a better
understanding of the sector needs.”
They can do so in an efficient, cost-effective manner. “Many
aspects of the enabling infrastructure necessary for big data
analytics have matured,” says Smith. “For instance, the economies
of the Cloud now allow us to store, move and analyse data
much more efficiently. Historically, this data has been stranded
in isolated databases, unable to be related to give big picture
insights. Oil and gas professionals have done a lot of work mining
data and making operational improvements. What we are offering
is taking this analysis one step further.”
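Smith's point about relating previously stranded data can be made concrete with a small sketch. The well IDs, dates and volumes below are invented purely for illustration; the point is the cross-silo join itself, not any particular product or dataset.

```python
import pandas as pd

# Production volumes and maintenance events that, in the scenario Smith
# describes, would historically have lived in two separate databases.
production = pd.DataFrame({
    "well_id": ["W-001", "W-001", "W-002", "W-002"],
    "date":    ["2015-06-01", "2015-06-02", "2015-06-01", "2015-06-02"],
    "oil_bbl": [850, 420, 910, 905],
})
maintenance = pd.DataFrame({
    "well_id":    ["W-001"],
    "date":       ["2015-06-02"],
    "event_type": ["pump repair"],
})

# Relating the two silos on well and date gives the 'big picture' view:
# which production dips coincide with recorded maintenance events.
combined = production.merge(maintenance, on=["well_id", "date"], how="left")
combined["has_event"] = combined["event_type"].notna()
print(combined.groupby("has_event")["oil_bbl"].mean())
```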
Companies are beefing up their data management
infrastructure. “When the concept of the DOF first arose, there was
an impression that since there was so much data coming in, data
quality wouldn’t be an issue,” says Jason Medd, Senior Product
Manager at Informatica, a data software provider. “That proved to
be a false assumption.”
For over a decade, Informatica has been working with oil
and gas companies to install data governance and data quality
processes that fuel sophisticated data management and analytics.
Working with open industry standards established by the
non-profit PPDM organisation, Informatica recently launched a
data quality accelerator. The new system allows a company to
consolidate data, improve quality, and enhance the ability of
business units to gain value across the entire well life cycle.
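As a rough illustration of the rule-based checking that such data quality processes automate, the sketch below validates two invented well header records against simple rules: identifier present, dates consistent, coordinates in range. The rules and field names are assumptions made for this sketch, not the PPDM standard or Informatica's product.

```python
from datetime import date

# Two toy well header records; the second deliberately breaks several rules.
wells = [
    {"uwi": "100012345678W500", "spud_date": date(2014, 3, 1),
     "completion_date": date(2014, 5, 20), "latitude": 56.7, "longitude": -111.4},
    {"uwi": "", "spud_date": date(2015, 1, 10),
     "completion_date": date(2014, 12, 1), "latitude": 95.0, "longitude": -111.4},
]

def quality_issues(record):
    """Return the list of rule violations for a single well header record."""
    issues = []
    if not record["uwi"]:
        issues.append("missing unique well identifier")
    if record["completion_date"] < record["spud_date"]:
        issues.append("completion date precedes spud date")
    if not -90 <= record["latitude"] <= 90:
        issues.append("latitude out of range")
    if not -180 <= record["longitude"] <= 180:
        issues.append("longitude out of range")
    return issues

for rec in wells:
    print(rec["uwi"] or "<no UWI>", quality_issues(rec))
```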
New communications networks and processes are springing
up. In mid-2015, BP announced an agreement to license GE
software that will connect all of BP’s oil wells to the internet in
order to optimise production globally. According to industry estimates, it
costs an average of US$3 million per week when an offshore well
goes down. BP, which has 6,000 producing wells around the world,
plans to capture, store, contextualise and visualise data in real
time in order to drive efficiency and performance.
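One way to picture what capturing and contextualising well data in real time can mean in practice is a rolling baseline that flags readings drifting away from a well's recent behaviour. The sketch below is an illustrative toy with invented pressure values and thresholds, not BP's or GE's actual software.

```python
from collections import deque
from statistics import mean, stdev

def make_monitor(window=12, sigma=3.0):
    """Return a checker that flags a reading far from the rolling baseline."""
    history = deque(maxlen=window)

    def check(pressure_psi):
        alert = False
        if len(history) == window and stdev(history) > 0:
            baseline, spread = mean(history), stdev(history)
            alert = abs(pressure_psi - baseline) > sigma * spread
        history.append(pressure_psi)
        return alert

    return check

monitor = make_monitor()
readings = [2100, 2095, 2102, 2098, 2101, 2099, 2103, 2097, 2100, 2102, 2098, 2101, 1650]
for i, psi in enumerate(readings):
    if monitor(psi):
        print(f"reading {i}: possible upset, pressure {psi} psi")
```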
Over the horizon
What will the longer-term future bring? ‘Code halo’ is a term
that is used to describe the information that surrounds people,
organisations and devices. It is generated by clicks, swipes, views,
interactions and transactions that together generate a ‘virtual self’ made
of code. “We have them as consumers; halos that follow us,” says
Haynes-Gaspar. She notes that code halos can also exist in the
industrial internet. “You could have a virtual best operator that
helps you understand how to achieve better uptime for your assets,
regardless of where they are. It would be like a digital twin.”
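A ‘virtual best operator’ can be loosely pictured as a benchmark built from the best-performing assets in a fleet. The short sketch below, with invented asset names and uptime figures, simply ranks each asset against the fleet's best observed uptime; it is an illustration of the idea, not GE's implementation.

```python
# Invented uptime figures for four assets in a fleet.
uptime_pct = {"well_A": 94.1, "well_B": 88.5, "well_C": 97.2, "well_D": 91.0}

# The 'virtual best operator' here is just the best uptime seen in the fleet.
best_asset, best_uptime = max(uptime_pct.items(), key=lambda kv: kv[1])
print(f"fleet benchmark: {best_asset} at {best_uptime}% uptime")

for asset, uptime in sorted(uptime_pct.items(), key=lambda kv: kv[1]):
    gap = best_uptime - uptime
    if gap > 0:
        print(f"{asset}: {gap:.1f} percentage points below the benchmark")
```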
Bit Stew envisions a time when computers will do most of the
work. “Ten years from now, we will have come full circle, where
software-defined applications help to automate a closed loop,”
says Castaldini. “The software will go from solving a higher degree
of problems to reaching a higher degree of solutions. Humans will
always be involved, but they will be able to focus on high value
problems.”
Regardless, no one doubts that the advent of big data is
launching a revolution that will benefit operators. Not only will it
offer the opportunity to extend the operational life of equipment, it
will also give operators an understanding of their business that allows
them to leverage assets in ways previously unimaginable. “It’s going
to be quite interesting as industry understands how artificial
intelligence and analytics converge,” says Haynes-Gaspar.