Stream real-time data to unlock analytics

Client Snapshot

An upstream oil and gas company wanted to stream real-time data from its assets to enable predictive analytics, data science and operational intelligence. However, the disparate platforms and formats in which that data was stored made integration nearly impossible. Blueprint provided the strategic and technical expertise to better connect the company to its external data vendors, as well as to the data itself.

Work Summary

The problem

A leader in hydrocarbon exploration and production held a great deal of production data across its various systems and platforms, but years of acquisitions and software overhauls had left it with an ineffective data storage architecture: dozens of disparate data sources with little way to communicate with one another. Teams struggled to perform even simple analyses based on historical or real-time data.

One scenario involved looking back at historical production data to determine the effect new oil wells had on existing wells when constructed nearby or when tapping into the same reserves. Another was more forward-looking: ingesting and processing real-time data streaming from production devices to identify devices that were losing pressure or running hot, with the goal of using AI/ML to perform predictive analytics and identify problematic wells earlier.
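As a rough illustration of that second scenario (a minimal sketch, not the client's actual implementation), a streaming check that flags devices losing pressure or running hot could look like the following Python; the field names and operating thresholds are hypothetical:

```python
# Illustrative sketch only -- field names and thresholds are hypothetical,
# not the client's actual schema or implementation.
from dataclasses import dataclass
from typing import Iterable, Iterator


@dataclass
class DeviceReading:
    device_id: str
    pressure_psi: float
    temperature_f: float


# Hypothetical operating limits, for demonstration purposes.
MIN_PRESSURE_PSI = 150.0
MAX_TEMPERATURE_F = 210.0


def flag_anomalies(readings: Iterable[DeviceReading]) -> Iterator[str]:
    """Yield a warning for each reading that is losing pressure or running hot."""
    for r in readings:
        if r.pressure_psi < MIN_PRESSURE_PSI:
            yield f"{r.device_id}: low pressure ({r.pressure_psi:.1f} psi)"
        if r.temperature_f > MAX_TEMPERATURE_F:
            yield f"{r.device_id}: high temperature ({r.temperature_f:.1f} F)"


if __name__ == "__main__":
    # Sample batch of streamed readings from two hypothetical wells.
    sample = [
        DeviceReading("well-017", pressure_psi=120.0, temperature_f=185.0),
        DeviceReading("well-042", pressure_psi=310.0, temperature_f=232.5),
    ]
    for alert in flag_anomalies(sample):
        print(alert)
```

In a production pipeline, checks like these would run continuously over the incoming stream and feed the AI/ML models rather than a simple rule set.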

“It’s good and bad. It’s good because you have a system for everything, and it’s bad because you have a system for everything,” a Blueprint software development lead said. “Getting systems to talk to each other and/or integrating the data becomes a real challenge.”

The Blueprint Way

As Blueprint worked with this company to assess its disparate data storage systems and platforms, one third-party source stood out as a focus for quick ROI. This vendor had experience working with the data platform that collected and managed the data for all above-ground operations, and it also offered an Azure-specific connector for streaming data directly into a Data Lake in the cloud. Blueprint worked with the company and the vendor to install and configure the connectors, and to debug the system once they were in place. Blueprint then architected and configured the process by which that production data would stream into the Data Lake, where the desired data science, enhanced analytics and in-depth reporting occur. Blueprint was also able to create subsets of the desired data and push them to the company's own data warehouse, enabling business intelligence analytics alongside data from multiple other systems.
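To make the shape of that pipeline concrete, here is a minimal PySpark Structured Streaming sketch, assuming readings land in an Azure Data Lake Storage container; the paths, schema and aggregation below are illustrative assumptions, not the client's actual configuration:

```python
# Illustrative sketch only -- paths, schema and formats are assumptions,
# not the client's actual pipeline. Assumes PySpark with access to an
# Azure Data Lake Storage Gen2 (abfss://) container.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import (
    StructType, StructField, StringType, DoubleType, TimestampType,
)

spark = SparkSession.builder.appName("production-stream-sketch").getOrCreate()

# Hypothetical schema for the readings the vendor's connector lands in the lake.
schema = StructType([
    StructField("well_id", StringType()),
    StructField("pressure_psi", DoubleType()),
    StructField("temperature_f", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Read the raw production data as it arrives in the Data Lake (hypothetical path).
raw = (
    spark.readStream
    .schema(schema)
    .json("abfss://raw@examplelake.dfs.core.windows.net/production/")
)

# Curate a subset for downstream BI: hourly averages per well.
curated = (
    raw.withWatermark("event_time", "1 hour")
    .groupBy(F.window("event_time", "1 hour"), "well_id")
    .agg(
        F.avg("pressure_psi").alias("avg_pressure_psi"),
        F.avg("temperature_f").alias("avg_temperature_f"),
    )
)

# Write the curated subset back to the lake; a separate batch or JDBC job
# could then load it into the data warehouse.
query = (
    curated.writeStream
    .outputMode("append")
    .format("parquet")
    .option("path", "abfss://curated@examplelake.dfs.core.windows.net/hourly/")
    .option("checkpointLocation",
            "abfss://curated@examplelake.dfs.core.windows.net/_checkpoints/hourly/")
    .start()
)
query.awaitTermination()
```

In the actual engagement, the vendor's Azure connector handled ingestion into the Data Lake, and the curated subsets were what Blueprint pushed on to the company's data warehouse for cross-system business intelligence.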

“That’s kind of the Holy Grail – seeing all your data under one roof, and that is easier said than done sometimes,” the developer said. “For this project, we played analyst, we played connector of people and we played integrator and implementer.”

Impact
