Addressing the Data Barriers to Better Business: No 1. Complexity

Published in Industry Insights

Published on July 1st, 2015

To create a platform fostering an in-depth understanding of current data management issues, Bloomberg PolarLake earlier this year convened Chief Data Officers (CDOs), Heads of Operations and Chief Technologists from more than 100 firms at a series of roundtables hosted in 13 financial hubs around the globe. Attendees represented a diverse mix of organisations, from regional banks to global asset management firms.

Here, in the first of a series of posts, Bloomberg PolarLake sets out the key challenge consistently identified by this diverse but united group as a ‘data barrier’ to better business – increasing data complexity.

Listening to CDOs at more than 100 diverse firms from the four corners of the globe, one thing is clear: these professionals and the firms they represent face an unprecedented, never-ending slew of challenges – doing more with less, improving data quality, meeting the growing demands of regulators and responding to internal business needs – all at the same time.

While many of these issues are well worn, during the course of our CDO roundtable series it became apparent that one associated but relatively untold data barrier is consistently causing concern and frustration across firms, industries and geographies – increasing data complexity.

Data complexity was described in many forms, fuelled by many factors: increasing business demand for multiple niche vendor sources; new regulatory requirements demanding secondary sources and evidence of data governance and lineage; the growing demand to integrate internal data sources and the proliferation of customised standards, to name but a few.

Add to that the growing complexity of the local hardware and software estate, where EDM databases, risk data warehouses, order management and accounting systems must now all be in sync, all with clean data and all open to change on demand – it’s no wonder firms are struggling to keep up.

Many organisations have implemented local commercial Enterprise Data Management (EDM) systems to meet a specific business need and have achieved initial success, a single project at a time. This approach solves tactical problems, but most organisations have yet to implement a horizontal strategy.

More often, within a single firm, different regions, asset class groups, trading strategies, risk managers and regulators are all making simultaneous and sometimes competing demands. The resulting need for massive parallelism in on-premise data management solutions spans design, development, testing and release management. In other words, real enterprise discipline is required. This is inevitably not what the local solution deployed three, four or even five years ago was designed to do – it was implemented to solve a singular problem, and it did so well. But the goal posts have shifted.

Perhaps the single greatest cause of increased complexity, and the greatest challenge for many firms in this new world, is that the number of vendor and internal data feeds that must be managed and mapped to the many consuming applications has increased exponentially. Complexity has grown to include demands such as managing vendor connectivity, cross-vendor data dictionaries, vendor notifications, matching, ongoing acquisition process monitoring, vendor performance management, error correction and cleansing – and the list goes on.

What’s more, firms that are managing the complexities around internal feeds today are effectively on their own. Once the relevant software is installed, as far as the software vendor is concerned, the data management process is solely in the hands of the client. Some vendors publish feed adaptor upgrades to help with this process, but the upgrades themselves tend to be so cumbersome, error prone and labour intensive that users often choose to ignore upgrade prompts and muddle through themselves, or react well past the effective date. Other vendors don’t even pretend to offer an upgrade path, leaving firms to work it out themselves with only basic training.

In all of these cases, the CDO and his or her teams are looking after all connections directly – big five vendors, rating agencies, index providers, custodian and broker feeds, internal feeds, exchange feeds and so on – not only those into the EDM system but also those that bypass it and therefore need additional direct mappings.

With all that said, it’s no surprise the consistent message we heard from the CDO roundtables was, “Please simplify this situation for us – now.” So, how can CDOs’ wishes be met?

Simplifying data management is the challenge at the heart of Bloomberg PolarLake’s managed service model. Our goal is to take the complexity out of data management, to provide operational control and auditability of data, to deliver rapid data and process integration and, ultimately, to help firms address the data barriers preventing more transparent, efficient and value-driven business operations.


Related content

Register to learn more about Bloomberg PolarLake and receive our White Paper “Simplifying Data Management”