Reference Data Integration Goes Prime Time
John Randles, CEO, PolarLake
In this time of massive industry crisis, why should Financial Institutions be concerned with Reference Data Integration? The answer is simple: cost and risk, everybody's focus right now.
This industry is estimated by Aite to be spending $2.5bn in 2008 on Reference Data integration projects alone. That does not include EDM systems, vendor data feeds etc. Just integration. The main reason the number is so big is that a complex, constantly changing integration problem is being addressed with processes and tools suited to static, simple one-to-one mappings. Integration change is a constant in the Reference Data world.

As for risk: when large amounts of data are being integrated into multiple risk, accounting and order management systems, how can the Business be sure that a brittle integration infrastructure is getting the right data, in the right format, to the right place, at the right time? Control. That is the key to Reference Data Integration: control of the costs and control of the day-to-day operations of the system. This is what gives everyone confidence that what was expected to be done was actually done.
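The gap between tools built for static one-to-one mappings and a constantly changing integration problem can be made concrete in a few lines. Below is a minimal sketch in Python; all field names and the feed layout are illustrative assumptions, not any vendor's actual schema.

```python
# A minimal sketch (field names purely illustrative) of why tooling built
# for static one-to-one mappings turns brittle under constant feed change.

# Brittle: every renamed column or new feed format means a code change,
# a developer, and a trip through the development queue.
def map_security_hardcoded(record: dict) -> dict:
    return {
        "instrument_id": record["ISIN"],
        "issuer": record["IssuerName"],
        "currency": record["Ccy"],
    }

# Declarative: the mapping lives in configuration that a Data Analyst can
# review and amend; the engine below never changes as feeds change.
FEED_MAPPING = {
    "instrument_id": "ISIN",
    "issuer": "IssuerName",
    "currency": "Ccy",
}

def map_security(record: dict, mapping: dict) -> dict:
    return {target: record[source] for target, source in mapping.items()}

if __name__ == "__main__":
    feed_record = {"ISIN": "US0378331005", "IssuerName": "Apple Inc", "Ccy": "USD"}
    assert map_security(feed_record, FEED_MAPPING) == map_security_hardcoded(feed_record)
    print(map_security(feed_record, FEED_MAPPING))
```

The second form is what turns an integration change from a development task into a configuration change, which is the heart of the cost argument above.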
At a recent conference roundtable the Chairman asked an audience of Business Professionals from the major Banks what the vendor community could do to help them manage their Reference Data more effectively. The term vendor in this case was the broadest possible, including data vendors, EDM vendors and associated services companies. The answer, which came from a Global Bank and with which all of the Banks present concurred, was that integration is both the key to, and the most difficult area of, Reference Data infrastructure.
While that answer may not be obvious to everyone, the challenge described was one of complexity. The key message coming from the Banks is that the rate of change in the Reference Data world (vendor feed formats, new vendor feeds, direct market access, new and changing downstream systems, new and changing asset classes, increasing complexity of business rules) is what differentiates the Reference Data integration challenge from a static integration problem that changes far less frequently (such as a payments system). The only constant in the world of Reference Data is change. Thinking has also moved beyond a single centralised database to a greyer world in which a federated, virtual golden copy is more realistic for the larger organisations. This federation is achieved through integration.
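What a federated, virtual golden copy means in practice can be sketched briefly. In the Python below, the stores, attributes and precedence rules are all assumptions made up for illustration, not any product's design: the point is that the golden record is never materialised in one master database but composed on demand by integration logic.

```python
# A sketch of a "federated, virtual golden copy": a resolver assembles a
# golden record on demand from several stores, applying a per-attribute
# precedence order. Store names, attributes and rules are illustrative.

SECURITY_MASTER = {"US0378331005": {"issuer": "Apple Inc", "currency": "USD"}}
RISK_STORE      = {"US0378331005": {"currency": "USD", "rating": "AA+"}}
VENDOR_CACHE    = {"US0378331005": {"issuer": "APPLE INC", "sector": "Technology"}}

# Per-attribute source precedence: the first store holding a value wins.
PRECEDENCE = {
    "issuer":   [SECURITY_MASTER, VENDOR_CACHE],
    "currency": [SECURITY_MASTER, RISK_STORE],
    "rating":   [RISK_STORE],
    "sector":   [VENDOR_CACHE],
}

def golden_copy(instrument_id: str) -> dict:
    """Compose a virtual golden record without ever materialising it."""
    record = {}
    for attribute, sources in PRECEDENCE.items():
        for store in sources:
            value = store.get(instrument_id, {}).get(attribute)
            if value is not None:
                record[attribute] = value
                break
    return record

if __name__ == "__main__":
    print(golden_copy("US0378331005"))
```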
The key to Reference Data Integration is moving the responsibility beyond the traditional IT developer role. Pure IT dominance of the solution leads to that much-used and much-hated term in IT Departments, "the development queue", where integration projects run sequentially through the IT development process.
The primary challenge in Reference Data Integration is to get the people who know the data and the systems best of all, the Business and Data Analysts, fully engaged in the integration process, rather than sending Word and Excel requirements documents to IT and hoping for the best. Traditionally even the simplest level of Business Analyst involvement has been very difficult to achieve, because IT did the integration with generic tools that involved a great deal of line drawing and the coding of complex conditionality.
The future of Reference Data Distribution will allow Business Analysts to take primacy in defining the business rules in an intuitive, Business-focused data environment; Excel comes to mind as the best analogy. Business Semantics, the raw meaning and context of the data, will take precedence over technology considerations and the next flavour of SOA, ESB, EAI, J2EE, .Net etc. The IT and development staff will then be in a position to look after messaging, sequencing, transactional behaviour, connectivity, testing etc., the areas where IT departments excel, while Business Analysts define the semantic business rules of integration. This will help IT and Business work in partnership, without the tension caused by the traditional approach of sequentially coding each and every Reference Data Integration task.

Only when this cooperative, dynamic approach to integration is adopted will the vision of a federated Golden Copy of Reference Data, with a rapid response to business change, be achieved. If the industry does not address this challenge, the new legacy systems will be the integration spaghetti surrounding the very data warehouses that have been invested in to solve this problem.
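To make the Excel analogy concrete, here is a minimal sketch of that division of labour: the Business Analyst owns a flat, spreadsheet-like rule table (modelled here as CSV), while IT owns the small engine that executes it. The column names, rule semantics and sample values are illustrative assumptions, not any product's rule language.

```python
import csv
import io

# Analyst-owned rules in an Excel-like table: each row either copies a
# source field to a target field, or assigns a literal value when a
# condition on the record matches. All names and values are illustrative.
RULES_CSV = """target,source,condition_field,condition_value,literal
asset_class,SecType,,,
market_tier,,Exchange,NYSE,Tier 1
market_tier,,Exchange,AIM,Tier 2
"""

def load_rules(text: str) -> list:
    return list(csv.DictReader(io.StringIO(text)))

def apply_rules(record: dict, rules: list) -> dict:
    out = {}
    for rule in rules:
        # Conditional rules only fire when the record matches.
        if rule["condition_field"] and record.get(rule["condition_field"]) != rule["condition_value"]:
            continue
        # Either copy a source field or assign the analyst-supplied literal.
        out[rule["target"]] = record.get(rule["source"]) if rule["source"] else rule["literal"]
    return out

if __name__ == "__main__":
    record = {"SecType": "Equity", "Exchange": "NYSE"}
    print(apply_rules(record, load_rules(RULES_CSV)))
    # -> {'asset_class': 'Equity', 'market_tier': 'Tier 1'}
```

The design point is the separation: a new conditional mapping becomes a new row in a table reviewed by the people who understand the data, not another pass through the development queue.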