What is the Best Strategy to Deliver on the Promise of a Data Utility?

Published in Industry Insights

Published on April 21st, 2016

By John Randles, CEO, Bloomberg PolarLake

For years, the data management industry has been talking about utilities and the potential for a shared service. The basic premise is very simple and the analogy with the electric industry is useful. In the early days of the electricity market, each factory built its own power generation capability. In time, two challenges were solved so that electricity could be supplied far more cheaply and more reliably. They were:

  • Electric generation at scale
  • Reliable, low-cost distribution

Once these challenges were met, each factory could concentrate on its core business while retaining deep knowledge of how to use and apply electricity.

It’s useful at this point to reflect on how this experience can inform our own path.

  • Where is the financial services industry on its journey to build a true data management utility?
  • What variety of approaches is the market taking?
  • What are the chances of success with each of these approaches?

Firms have embraced a variety of approaches to building a data utility. These range from traditional outsourcing to application hosting, delivered through everything from hosting partnerships to public cloud solutions. All are described in data utility language, but the services differ in important ways. Moreover, none of them addresses the core issue of industrial scale.

To gain a clearer picture, it is worth picking through the important differences between them.

Business Process Outsourcing (BPO) as a Data Utility

With traditional outsourcing wrapped in the nom du jour (“Data Utilities”), staff change employers and the in-house application is supported by a third party, in what is purely a balance sheet and labor arbitrage exercise. The staff from the user firm are usually exited quickly (one supplier in the space told me that even a particularly useful and knowledgeable employee will be kept for a maximum of six months). The only place a conversation like this can be entertained is at the largest global banks; elsewhere, such a dramatic upheaval is simply not worth the effort, and firms cannot even pretend to see the benefits. Very few have tried this option, and those who haven’t get queasy at the mere thought of it, with the result that it is dropping off the list of viable options. The risks are enormous and the proven benefits nonexistent. The biggest issues: after a while, there is a complete loss of control and a hollowing out of the firm’s knowledge and intellectual property.

Internal Systems Repurposed as Data Utilities

Another approach involves the internal applications getting a lick of paint and then being dubbed “Data Utilities.” At this point, more than one firm has declared that they are going to use another firm’s “Security Master” or “Pricing Master” application. One of the biggest challenges with this approach is that applications with a very spotty internal reputation are now expected to solve the problem for the industry: a tall order and something for which these applications were never designed. Furthermore, this approach is often mixed with a third-party solution, the plan being that these firms and applications will all migrate together over a multi-year program while keeping the lights on at home on an old application. The best bit is that everyone is going to try to agree on standardized requirements both internally and across the firms. This, in turn, means that nobody is responsible for the core product’s vision and capability. The committee decides.

Juggling on a unicycle while someone is throwing rocks is perhaps the best way to describe the heroic efforts involved in attempting to build a data utility from this starting point. It is a very tall order: easy to declare victory on, but very hard to deliver value from.

The Software Vendor Mixing with the Public Cloud as a Data Utility

A third approach is to take an existing application and run it in a partner’s data center or on a public cloud. This is by far the cheapest and nastiest version of what we have seen out there. “It’s on my box, not yours. Period.” is the value proposition. It delivers the client essentially none of the benefits you would expect from a data utility (think electricity: central processing at industrial scale). “Am I sharing common processing with other clients?” No. “Am I gaining economies of scale in IT Operations support?” No. “Is there any Data Operations component to this?” No. “And, by the way, am I running any other core business application in a public cloud? We’re not doing that, are we?”

This is what I call the Zero-Dollar Data Utility strategy: spend $0 on R&D, Operations, Service and Support and end up with a data utility. Unfortunately, with this approach, firms are still building separate electricity factories and there is zero sharing of capability. We have just moved our unique power generation capability into someone else’s building, but it is all still our responsibility. We are still in the unique local power generation business.

Fulfilling the Promise of a Data Utility

So, what is the best strategy to deliver on the promise of a data utility? Think back to the electric utility analogy: the most important elements are:

  • Central Generation at Scale
  • Distribution

For a data utility to operate at real scale, it needs a purpose-built platform in which all clients share appropriate resources through true multi-tenancy and which processes billions of data records per day. At scale, the data utility needs to control its own resources (hardware, software and security), so a private, fully owned and operated cloud is essential. A true data utility also requires very broad access to data across all data sets (Reference, Pricing, Corporate Actions, Index, Legal Entity, Positions, Trades, etc.). And, finally, the data utility needs to be proven operationally.

In the future, the distribution side should consist of web-based workstations together with file-, API- and message-based delivery across resilient networks, all of which address the second piece of the puzzle: distribution.
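To make the multi-tenancy and distribution ideas above a little more concrete, here is a minimal, purely hypothetical sketch in Python. The record fields, tenant identifiers and channel names are illustrative assumptions, not a description of any actual Bloomberg PolarLake interface: each record carries a tenant identifier so that shared processing can serve many clients on one platform, and the same cleansed record can then be delivered over a file, API or message channel.

    # Purely illustrative sketch (assumed names, not an actual PolarLake API):
    # tenant-scoped records shared on one platform, delivered over several channels.
    from dataclasses import dataclass

    @dataclass
    class DataRecord:
        tenant_id: str      # the client the record is scoped to (multi-tenancy)
        domain: str         # e.g. "Reference", "Pricing", "Corporate Actions"
        instrument_id: str  # illustrative identifier
        payload: dict       # cleansed attributes produced by the shared utility

    def distribute(record: DataRecord, channel: str) -> str:
        """Route one cleansed record to a delivery channel: file, API or message."""
        if channel == "file":
            return f"queued {record.instrument_id} for {record.tenant_id} end-of-day file"
        if channel == "api":
            return f"served {record.instrument_id} to {record.tenant_id} via request/response API"
        if channel == "message":
            return f"published {record.instrument_id} update on {record.tenant_id} message topic"
        raise ValueError(f"unknown channel: {channel}")

    # The shared platform processes records for many tenants; only delivery differs.
    record = DataRecord("client-a", "Pricing", "XS0123456789", {"price": 101.25})
    print(distribute(record, "api"))

The point of the sketch is simply that shared, tenant-aware processing plus multiple delivery channels is what separates a genuine utility from an application moved onto someone else’s hardware.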

The final, and often overlooked, element is the data utility users at the client’s site. We need to truly understand and appreciate where they fit in. In the same way that factories 100 years ago invested in the use of electricity rather than in its generation, in our century the investment will be in the use of data, not in the generation of clean data.

Users of data management utilities will be far more advanced and evolved data practitioners; they will certainly not be an outsourced commodity that is simply no longer required. These experts will be looking for opportunities in data, helping the firm stay ahead of regulations, finding operational efficiencies and helping the business move faster through better use of data: all the actions that firms cannot take today because they are stuck in a world of electricity generation, not electricity use!

With the industrialization of electricity, expertise in the use of electricity followed centralized generation. In the same way, we see our clients evolving into sophisticated data analysts delivering high-value insights to their firms. They are certainly not the traditional data operators of old.

Learn more about the Bloomberg PolarLake Data Utility