September 02, 2011

At the same time, external forces continue to exert pressure through organizational change (M&A, restructuring, etc.), demanding flexibility in the systems that support financial organizations. The unknown outcomes of various regulatory changes (the impact of the Dodd-Frank Act, for example) mean that banks have to architect for perpetual change. And here's the rub: all of this has to happen in a way that can handle very high volumes, which demands high performance. Big Data must be processed in an acceptable period, and that requirement applies to the complete end-to-end process, from data collection to the finished analysis and reports. Banks that cannot quickly standardize data and process it in real applications to support analysis will be left behind as forward-looking organizations secure a competitive advantage.

Information in Action

The challenge is therefore not the analysis, insight and mining -- it is "actioning" the data within business operations to drive sustainable value and a leading market position. Banks have traditionally run their businesses in silos -- by product, by acquisition, by geography, etc. So there is a built-in challenge to integrate and standardize the information held within each silo across global organizations.

One could argue that most banks still lack the application assets necessary to exploit Big Data: they might not have the analytic staff traditionally best equipped to drive insights from information. And of course there's the reality that many banks are saddled with legacy infrastructure that makes the exploitation of customer data (transactions, products, balances, risk profiles, location, demographics, segmentation, costs, etc.) very difficult.

In order to put information into action without expanding departments to handle manual processes, banks need to create data and process transparency, enable experimentation and replace some human decision-making with executable models and rules. That's right: when it comes to Big Data, people simply can't make smart decisions about massive volumes of data without systems and technologies in place that manage the processes by which that data reaches them. Never was the expression "I can't see the forest for the trees" more appropriate.
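To make that idea concrete, here is a minimal sketch in Python of what replacing a manual check with executable rules can look like. The fields, thresholds and actions are purely illustrative assumptions, not a description of any bank's policy or of a particular vendor's platform.

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    amount: float          # transaction value in the bank's base currency
    country: str           # ISO country code of the counterparty
    customer_segment: str  # e.g. "retail", "private", "corporate"

# Each rule pairs a predicate with the action to take when it fires.
# These thresholds and actions are hypothetical examples only.
RULES = [
    (lambda t: t.amount > 10_000, "escalate_for_review"),
    (lambda t: t.country not in {"US", "GB", "DE"}, "enhanced_due_diligence"),
    (lambda t: t.customer_segment == "retail" and t.amount > 2_500, "notify_relationship_manager"),
]

def decide(txn: Transaction) -> list:
    """Return the actions triggered by the executable rules for one transaction."""
    return [action for predicate, action in RULES if predicate(txn)]

if __name__ == "__main__":
    sample = Transaction(amount=12_000, country="KY", customer_segment="private")
    print(decide(sample))  # ['escalate_for_review', 'enhanced_due_diligence']
```

The point of encoding decisions this way is that the rules are transparent, auditable and changeable without re-staffing a department, which is exactly what high-volume data flows require.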

Banks also face pressure to innovate products and deliver new business models and services, which means their use of data has to become more sophisticated. Data must be used to analyze variability in performance so that business users within banks can discern the root causes and effects of customers' decisions. When banks successfully enable this capability, their leaders can manage performance to higher levels.

This might mean adopting a powerful allocations system that attributes revenue to products, marketing drivers, branches and customers. Such a system lets the organization allocate granular costs across all of those dimensions to get a true picture of customer and product profitability, so 'what if' questions about profit drivers can be answered and business rules and processes can be tailored to improve service and profit.
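As a rough illustration of the mechanics, the sketch below assumes a single shared cost pool and transaction volume as the allocation driver; the figures are invented. A real allocations engine would cascade many pools across products, branches and customers, but the proportional arithmetic is the same in spirit.

```python
# A minimal cost-allocation sketch: spread a shared cost pool across
# products in proportion to an activity driver, then derive profitability.
# All figures and the choice of driver are illustrative assumptions.

revenue = {"mortgages": 500_000, "cards": 320_000, "deposits": 180_000}
driver  = {"mortgages": 1_200, "cards": 9_500, "deposits": 6_300}  # e.g. transactions processed
shared_cost_pool = 400_000  # operations cost to be allocated

total_driver = sum(driver.values())
allocated_cost = {p: shared_cost_pool * v / total_driver for p, v in driver.items()}
profit = {p: revenue[p] - allocated_cost[p] for p in revenue}

for product in revenue:
    print(f"{product:10s} revenue={revenue[product]:>10,.0f} "
          f"allocated cost={allocated_cost[product]:>10,.0f} "
          f"profit={profit[product]:>10,.0f}")
```

Changing the driver or splitting the pool lets analysts test 'what if' scenarios about profit drivers directly against the numbers rather than against intuition.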

A Holistic Approach

When dealing with the challenges of Big Data, most banks are forced to use multiple products from a variety of vendors. But most solution stacks have significant gaps, requiring the use of a set of disparate technologies. This results in a systems landscape that is opaque, complex, rigid and very costly to maintain.

Given this fragmented software stack, banks need to look at enabling technologies that can work with their existing technology and applications landscape. To ensure that the strategy delivers value, IT and business decision-makers need to collaborate on a holistic approach to Big Data.

The best solutions are not always the most obvious. Great systems deliver immediate value to the business while measuring up to the standards of the IT team. Adopting solutions that put Big Data into action will help close the gap between business and IT users while creating competitive advantage for those banks with clear enough vision to see the big picture.

Martin Redington is the senior vice president of product management at Microgen, an Enterprise Application Platform (EAP) provider that helps the world's largest banks and other global organizations solve the challenges associated with Big Data. Prior to Microgen, Mr. Redington ran the consulting services at OST Business Rules, which Microgen acquired in 2002. He has more than 20 years of experience in managing mission-critical projects with tight deadlines and budgets; 12 of these years have been related to business rules and enterprise application platform technologies.