Five insights into building a great data platform can help energy, chemical, utility, and basic-materials companies get it right.

For any sizeable company, a state-of-the-art data and analytics platform is no longer an option but a necessity. Such a platform acts as a central repository for all data, distills them into a single source of truth, and supports the scaling up of sophisticated digital- and advanced-analytics programs that translate data into business value. Companies without one risk leaving serious value on the table. For a utility, for instance, a data and analytics platform can cut costs by up to 15 percent in some operational and maintenance areas, while savings in oil and gas companies’ upstream activities can run even higher, at up to 20 percent.
Unfortunately, most energy and materials companies find themselves lagging, in one way or another, behind early digital adopters such as the retail, travel, and financial-services industries. Although many of these companies have run sophisticated data systems for specific functions for years (seismic imaging and processing in oil and gas, for instance, or flow visualization in transmission), they often struggle to connect disparate systems and data in an easily scalable way. Many have experienced frustrating complexity, some have suffered painful failures, and a few are still trying to figure out the first steps in building their own data platforms.
Overcoming setbacks like these, however, could make all the difference in staying competitive for energy and materials companies. Robust data and analytics platforms can help them capture substantial value from improved efficiency and uncover new revenue and growth opportunities. Through our work supporting companies from many industries in such efforts, we have identified five key—and sometimes counterintuitive—insights into the best way to build your platform quickly to drive real business value.
Insight 1: Ensure everything you do starts delivering impact within six months
In developing and prioritizing use cases for your data and analytics platform, the single most important criterion is speed to business impact. The companies that obsess over value from the start and constantly push for high-impact, quick wins are those that do best at winning over skeptics and keeping up the momentum of change.
Many platform-building programs share a common problem: value capture begins only at the end, after all the pieces are in place. One industrial company invested $15 million in big data infrastructure, spanning servers, storage, networking, and software, only to find that the investment would not start paying off for at least three years. Cases like this can jeopardize the survival of a project.
A mining company took a different approach when it developed a data-visualization tool to give executives a consolidated, real-time view of performance in all its production processes at all its sites. Realizing that building out the tool and populating it with data flows would be a lengthy endeavor, the company proceeded site by site. Data visualization for the first site went live after only a few months, enabling the site manager to detect and resolve issues as they emerged rather than waiting for them to surface in the next monthly report. As the company extends the tool to each new location, that site can begin generating impact immediately, helping to sustain stakeholder support while the build-out continues.
Being strategic about achieving quick wins early on and capturing benefits at each stage of platform building gives companies the staying power to see long projects to completion, when exponential value can be realized.
Ten-second takeaway
Make “rapid return on investment” your watchword when selecting use cases. Leading companies achieve substantial impact fast with their platforms by being clear on which high-impact, quick wins to prioritize.
Insight 2: Use existing data to build in bite-size chunks
We have found that two common obstacles hold many companies back from building their new data and analytics platforms. The first is the paralysis that sets in when organizations contemplate the gargantuan effort they assume is required. Our experience shows that no such big-bang effort is needed; instead, success comes from building a platform iteratively, piece by piece.
One mining company had data architecture spanning more than 200 separate systems—some siloed, and some incompatible. This created version-control issues for data and meant there was no single source of truth. In addition, visibility into the data inventory was poor. The company couldn’t clean up all its systems and data before building its new data architecture because the time and effort required would have jeopardized the business. Instead, it worked iteratively, taking critical use cases across the enterprise (such as digitizing asset-maintenance work flows) and building its new platform block by block. This enabled it to use existing data and metrics to bring new applications to life quickly and capture impact.
Similarly, the vice president of digital strategy at one utility initially intended to build the company’s new data and analytics platform in one big-bang effort. However, the platform had to support many critical “keeping the lights on” functions for which there was no acceptable margin for error. It soon became evident that the only sensible way to proceed was to build piece by piece, tackling noncritical applications first so that elements of the platform could be tried and refined. Blocks supporting critical applications could then be built and tested thoroughly to ensure a smooth migration from the existing platform to the new one.
The second obstacle that many companies run into is the perceived need for a massive amount of data and data cleaning before any meaningful algorithms can be run. Many energy and materials companies are still at an early stage in determining their data structures, defining their data governance, and capturing a single source of truth for their data sets. They do need adequate volumes of the right data, but the good news is that they probably already have what they need for the highest-impact use cases—namely, good-quality data for their most important business metrics.
Many companies don’t realize they have a wealth of data at their disposal; others feel overwhelmed by the task of mining the data. One company doubted it had sufficient data to support certain use cases in finance, such as handling customer inquiries. With our support, it identified readily available commercial data that could be validated by the finance team, and that data proved more than adequate for achieving its goal of reducing customer hold times.
At the same time, executives often assume their data will require thorough cleaning before they can be used. For sure, some data cleaning will be needed—nobody starts out with clean data—but it is likely to be much less than expected for the initial prized data sets. Even without master data management, there are easy ways to process available data to derive high-quality output. One OEM had to deduplicate data from more than 90 enterprise-resource-planning (ERP) instances, a task that many in the organization feared would take years. Instead, an expert team using existing algorithms is expected to complete the work in months.
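As a rough illustration of how far off-the-shelf tooling can go, the sketch below flags likely duplicate records using only Python’s standard library. The vendor records and match threshold are hypothetical, and a deduplication effort on the scale of 90 ERP instances would add blocking, normalization rules, and human review, but the core technique is this simple fuzzy comparison.

```python
import difflib


def similarity(a: str, b: str) -> float:
    """Rough similarity score in [0, 1] between two name strings."""
    return difflib.SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()


def find_likely_duplicates(records, threshold=0.9):
    """Flag record pairs that likely refer to the same entity.

    records: list of (record_id, name) tuples pooled from many ERP instances.
    Pairwise comparison is O(n^2) -- fine for a pilot, not for millions of
    rows, where a blocking step (grouping by postal code, say) would be added.
    """
    matches = []
    for i, (id_a, name_a) in enumerate(records):
        for id_b, name_b in records[i + 1:]:
            score = similarity(name_a, name_b)
            if score >= threshold:
                matches.append((id_a, id_b, round(score, 2)))
    return matches


# Hypothetical vendor master records pooled from two ERP instances.
records = [
    ("ERP1-0001", "Acme Industrial Supply Inc."),
    ("ERP2-0417", "ACME Industrial Supply, Inc"),
    ("ERP1-0002", "Borealis Chemicals GmbH"),
]
print(find_likely_duplicates(records))  # flags the two Acme records as one likely pair
```

The point is not this particular algorithm but that usable deduplication can start from tools a team already has, rather than waiting on a multiyear master-data program.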
Ten-second takeaway
Build your platform piece by piece. Start with the data you already have and tackle the use case or end-to-end work flow that has the greatest opportunity for impact.
Insight 3: Deploy analytics only to solve real business problems
Don’t lose sight of the point of a data and analytics platform: to create more value for your company. We have seen companies bring in analytics teams that end up working in silos with little or no input from the people who know the business inside out and understand its priorities. Without the right direction setting, these teams often focus on the wrong problems, leading to waste and lackluster results. With that in mind, we have two pieces of advice for energy and materials companies.
First, use your new platform to augment and improve what you are already doing. For one large chemical company, that meant focusing its digital and analytics efforts on actions such as debottlenecking processes and developing performance-visualization tools. Starting with a clear focus is crucial.
Second, make sure you embed your newly developed tools and capabilities in your existing functions and processes. The real power comes when new skills and technologies are combined with existing institutional knowledge.
Take the sports and leisure company seeking to improve its algorithms, models, and data architecture and extract maximum value from analytics. It had a large group working to make the best use of collective knowledge and a separate group undertaking analytics and research. To get both groups working together, leaders reengineered operations to include analytics in daily activities and give end users ownership and a say in how it was used. Instead of producing lengthy reports, the analytics group worked with end users to identify what information they needed and where data visualization might help them do their jobs better. That is an approach that energy and materials companies would do well to emulate.
The more companies approach analytics as a way to improve their current work and teams, the faster people will adopt new tools and use them to drive impact.
Ten-second takeaway
Plan and deploy your analytics platform to help you perform better. Apply it to real problems from day one.
Insight 4: Invest twice as much in your talent, culture, and processes as in tools
Any successful technology transformation depends on having the right organizational elements in place. Obvious though that sounds, a surprising number of companies make the mistake of procuring new technologies without ensuring they have the talent, culture, and operating models to implement them effectively. We would caution leaders to beware of the “shiny toy” syndrome. Solving performance issues takes more than a new tool, however sophisticated it might be.
Overlooking gaps in talent, capabilities, or operating models wastes effort and money and merely defers problems to some future date. Take the global chemical company that hoped a new software package would resolve low productivity, high costs, and low morale in its maintenance-planning unit. Six months in, the effort had failed: the underlying process issues had not been addressed, so the tool never had a realistic chance of delivering the expected improvements.
Another common failure mode is investing in talent half-heartedly. “Let’s start with a data scientist and see how it goes” is something we often hear but never see work. Rare, for example, is the data scientist who has the domain knowledge to be credible with colleagues in the business, let alone deep experience in data engineering or in writing, deploying, and maintaining production-quality code on big data stacks. In any case, writing the code and performing the quality assurance should never be done by the same person. A small pilot might scrape by without proper QA capabilities, but a production system never could. So, where talent investment is concerned, it is all or nothing.
These cautionary tales are far from isolated examples in our experience of working with energy and materials companies. To avoid making similar mistakes, it is worth bearing in mind four hard-won lessons.
First, adopt an agile operating model with cross-functional teams led by a product owner from the business. Take small steps, experiment in quick cycles, and course-correct immediately as needed. Second, capabilities are critical. Monitor and review the number of data scientists, data engineers, platform architects, and analytics translators you have, and make sure you recruit, develop, and redeploy people in line with your strategic vision. Third, drive data governance from the top down. Push it to the lowest level of accountability consistent with your operating model, and track it rigorously. Last, ensure people buy into the adoption of advanced analytics and embrace it as part of a new culture: one that is data driven and accepts, even celebrates, failure when it happens early and yields useful lessons. Though often overlooked, that is a key part of adopting an agile approach.
Ten-second takeaway
Ensure you focus on people and processes, not just tools. No matter how advanced a tool is, it will be worthless without the talent and structures to manage and use it.
Insight 5: Democratize data across your business to catalyze innovation from within
The companies that innovate fastest and best often crowdsource ideas from employees who know their organization, industry, market, and customers inside out. You don’t have to be Google or Spotify to do this; energy and materials companies can take a leaf out of the tech giants’ books. A well-designed data and analytics platform can, for instance, provide easy access to reliable data for use in “hackathons” that allow employees to come up with creative ideas, draw on institutional expertise, and have their voices heard. We have seen new products evolve from first concept to early prototype in a matter of days with this approach.
Two simple rules for good design are that all data should be owned by the enterprise, not a business unit or process owner, and that most, if not all, employees should be able to see what data are available and access them through self-service tools. For instance, a data scientist in a utility could develop a model for customer churn and design a simple front end to access the results. Another data scientist could refine the model by adding new data attributes from an external source. Then, a user-experience designer could introduce graphics and messages to improve the user experience. As the model is scaled up, new users could ask for additional features and benefits, and a process of continuous improvement would kick in. In time, an innovation culture would take hold, enabling the organization to unleash the latent creativity and resourcefulness of its workforce, deepen employee engagement, and create more value for the business. Before this can happen, companies need to ensure that data can be accessed without the multiple gatekeepers and obstacles found in most siloed organizations.
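To make that progression concrete, here is a minimal sketch, in Python with scikit-learn, of the kind of first-pass churn model the scenario above describes. The data file, column names, and features are hypothetical stand-ins for whatever attributes the platform’s self-service tools actually expose.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical self-service export from the data platform.
df = pd.read_csv("customer_extract.csv")

# Assumed attribute names; a second data scientist could refine the model
# simply by appending new columns (external demographics, weather, and so on).
features = ["tenure_months", "monthly_bill", "service_calls"]

X_train, X_test, y_train, y_test = train_test_split(
    df[features], df["churned"], test_size=0.2, random_state=42
)

# A deliberately simple baseline model; the value is in the iteration loop,
# not in starting with something sophisticated.
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("Holdout AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```

A lightweight front end over the model’s scores is then enough for end users to act on, and each subsequent contributor, as described above, iterates on the same shared artifact rather than building a parallel one.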
To get the right data to the right users in the right context at the right time, sound data governance is essential. Take customer records in an ERP platform as an example. Until recently, the ERP vendor controlled the definition of customers, users could access records only through the platform’s front-end tools, and any company-specific data elements were add-ons without a natural home in the platform. To work around these limitations, technology teams tended to combine ERP data with other databases or even to make full copies of the data.
But such workarounds can cause version-control problems and other issues. For instance, a developer accessing data through an application programming interface (API) could be presented with a customer record that is misleading because it is no longer connected to the ERP platform. This copying and repurposing of data causes fragmentation and underlines the importance of data governance. If companies want to nurture innovation by broadening access to data, they need to identify a “golden” source of truth for every data domain, whether it be a customer, product, field asset, or any other area critical to decision making.
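In code, the discipline is straightforward. The sketch below (in Python, with a hypothetical endpoint URL) shows a consumer resolving customer records through the one designated golden service at read time, rather than keeping a local copy that can silently drift out of date.

```python
import requests

# Hypothetical golden-source service for the customer domain.
GOLDEN_CUSTOMER_API = "https://data.example.com/golden/customers"


def get_customer(customer_id: str) -> dict:
    """Fetch the authoritative customer record at read time.

    No caching and no local copies: every consumer sees the same record,
    so there is nothing to drift out of sync with the ERP platform.
    """
    resp = requests.get(f"{GOLDEN_CUSTOMER_API}/{customer_id}", timeout=10)
    resp.raise_for_status()
    return resp.json()
```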
When it comes to getting data to users, tools such as API managers, data-discovery tools, and software-development kits are readily available for a relatively modest outlay. However, companies also need to get the right users (meaning their business experts) on board and persuade them to adopt new ways of exploring and testing ideas. That is the only way to generate high-quality data, derive meaningful insights, and capture value. Smart companies embrace design thinking and bring users in to participate in developing applications and tools from the very beginning. When products and services are developed with user input into every aspect of design and performance, they stand a much better chance of being easy and enjoyable to work with and of gaining enthusiastic adoption among target groups.
Ten-second takeaway
Gear your platform to democratizing data. Make data available to all your employees, teach them how to use the data, and see what unsuspected value they can unlock for your organization.
Once upon a time, companies had to choose between making heavy investments in data infrastructure that didn’t start delivering benefits for a year or two and doing “skunkworks” data projects that drove short-term value but lacked staying power. Now, though, organizations can both have their cake and eat it—namely, drive value in months while simultaneously building a robust enterprise data platform that will serve the company for years. While doing this is no easy matter, the benefits of success considerably outweigh the costs and risks.
About the author(s)
Adrian Booth is a senior partner in McKinsey’s San Francisco office, Jeff Hart is a partner in the Houston office, and Stuart Sim is an associate partner in the New Jersey office.