We know we need to be cleverer with our data, especially given the cost of traditional enterprise storage.
Our customers are telling us that one of their key drivers for data analytics isn’t just seeing what they can use for better efficiency, but also deciding what they can put in their archive freezer.
I read a report from International Data Corporation (IDC) which, unsurprisingly, warns of the growth in data: a tenfold increase by 2020.
What was interesting in this research was that by 2020 only 35% of data will be useful, up from 22% today.
The main data challenge for our customers right now is understanding what is good data and what is not, and then making sense of the good data in practice.
The opportunity in data is really exciting, both within single organisations and across different organisations. I find cross-sector references useful because sometimes we can be insular.
Looking across verticals and learning from each other is innovation – not reinventing the wheel but applying theories in new ways to meet new challenges.
Over the last few years, the Scottish public sector has been doing some great things with data. We were involved in a project with a local council and its authorities to bring different data sets together, analysing them to help identify children who were at risk.
Bringing data sets together is powerful, as long as you have the right environment and the right security in which to store them.
I really like this example from the pharma industry, and it lends itself well to oil and gas if you consider the level of capital expenditure; drug research is expensive.
A vaccine plant costs $600 million to build and one batch of vaccine can take anything from 2 to 14 years to complete.
Components may have to be stored at exactly -8 degrees for a year or more, and with any slight variation from the regulator-approved manufacturing process, all materials must be discarded.
Merck is known for its vaccine development; it developed the first MMR vaccine. In the summer of 2012, it was seeing higher-than-usual discard rates on certain vaccines.
From data insights, Merck ascertained that certain characteristics of the fermentation phase of vaccine production were closely tied to yield, and that these could be controlled.
This kind of discovery wouldn’t have been possible without the technology used to store and analyse big data. Since adopting data analytics, Merck’s research and development effort has produced more new drug approvals than any other pharma company globally.
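For readers who like to see the mechanics, here is a minimal sketch of the kind of analysis involved: ranking controllable process parameters by how strongly they correlate with batch yield. To be clear, this is not Merck’s actual method, and every column name and figure below is invented for illustration.

```python
# Illustrative only: rank hypothetical fermentation parameters by how
# strongly each correlates with batch yield, using invented data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 500  # synthetic batch records

batches = pd.DataFrame({
    "fermentation_temp": rng.normal(37.0, 0.5, n),   # degrees C
    "fermentation_ph": rng.normal(7.0, 0.2, n),
    "agitation_rpm": rng.normal(120.0, 10.0, n),
})
# Simulate a yield that is driven mainly by fermentation temperature.
batches["yield_pct"] = (
    20.0
    + 2.0 * batches["fermentation_temp"]
    + rng.normal(0.0, 1.0, n)
)

# Correlate each parameter with yield and rank by absolute strength;
# the strongest candidates are the ones worth controlling first.
ranking = (
    batches.drop(columns="yield_pct")
    .corrwith(batches["yield_pct"])
    .abs()
    .sort_values(ascending=False)
)
print(ranking)
```

In real production data the relationships are rarely this clean, which is exactly why the storage and analytics platform underneath matters.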
Microsoft’s publication “Global Enterprise Big Data Trend: 2013” found that oil and gas companies also want big data analytics.
More specifically, they want predictive analytics, expanded data storage and functions to analyse unstructured data, in order to support exploration & development, drilling & completions, production & operations, and enterprise security.
I think the energy industry is already good at monitoring and collecting data. The key is to optimise its use to build further business benefit.
Prediction and optimisation are really achievable in oil and gas; Shell are a good example here. They’ve fitted machinery with sensors that collect performance data, allowing engineers to stay one step ahead in knowing when parts need to be replaced, reducing downtime risk and overheads.
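As a simple illustration of the idea (a sketch of the general technique, not Shell’s actual system, with all the numbers invented), you can fit a trend to a wear-related sensor reading and estimate how long until it crosses a maintenance threshold:

```python
# Illustrative predictive-maintenance sketch: fit a linear trend to
# synthetic vibration readings and estimate the operating hours left
# before the trend crosses a maintenance threshold. All values invented.
import numpy as np

THRESHOLD = 7.1  # hypothetical vibration level (mm/s) that triggers a part swap

rng = np.random.default_rng(1)
hours = np.arange(0.0, 500.0, 10.0)  # time since installation
vibration = 2.0 + 0.008 * hours + rng.normal(0.0, 0.1, hours.size)

# Fit a straight line (vibration = slope*hours + intercept) to the readings.
slope, intercept = np.polyfit(hours, vibration, 1)

if slope > 0:
    current_level = slope * hours[-1] + intercept
    hours_left = (THRESHOLD - current_level) / slope
    print(f"Estimated {hours_left:.0f} operating hours until maintenance is due.")
else:
    print("No upward wear trend detected.")
```

The real systems are of course far more sophisticated, but the principle is the same: let the data tell you when a part is approaching the end of its life, before it fails.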
The energy industry is the most data-intensive industry we work with. The advantage of being so data-heavy is being able to report what’s happening, analyse why it happened, predict what might happen and, finally, automatically optimise operations based on data insights.
The more we can leverage data to do the heavy lifting, the more we can focus our attention on the industry’s bigger questions.
Richard Higgs is the chief executive of Brightsolid