Data & Analytics
Commentary
Julien Courbe

The Devil Is in the Data

Data must be a key priority when embarking upon transformation initiatives.

Regulatory requirements, changing customer expectations, and aging systems have strained many companies' technology infrastructures, and the employees who use them, to the breaking point. Financial institutions have responded by creating bolt-on applications that help achieve business and operational goals in the short term. But these short-term solutions have created long-term challenges by compromising the accuracy of information and fragmenting the data landscape. The impact is being felt across the board, not only in finance, but also in the IT department, business units, and ultimately, in the organization's bottom line. Specifically:

·  Business units become more disconnected and less efficient

·  Employee productivity suffers

·  Institutions may be unable to respond accurately to reporting requirements

·  Market share may erode

A growing number of financial institutions recognize the need to develop sustainable infrastructures that address current and future challenges. Many of these institutions are focused largely on the technology involved. But the devil is in the data that flows through the infrastructure. It is essential to resolve infrastructure and data granularity issues from the front office to the back office, embarking on sweeping cross-divisional change initiatives that put data at the top of the transformation agenda.

In order to enable more effective, data-driven decision making, financial institutions should consider the following data management framework:

  1. Data quality. Continually enhance data through rigorous metrics-based analysis and rectification.
  2. Data governance. Develop policies and procedures to support a uniform approach to data management across all organizational levels.
  3. Data delivery. Leverage automation to ensure that systems can efficiently provide accurate data (both standard and ad hoc) across business units.
  4. Data architecture. Develop an architecture that enables multiple data types and rapid accessibility and that also minimizes data redundancy.
  5. Physical environment. Ensure that the hardware and software technology utilized to load, store, and report enterprise information is robust and comprehensive enough to support enterprise information delivery.
  6. Security. Design user and data classifications that protect information, enhance transparency and audit logging, and enable segregation of duties.
  7. Organization. Ensure that resources can be reallocated as needed to support efficient delivery and management of data across sectors, business units, and other organizational boundaries.
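To make the first element of the framework concrete, a "metrics-based analysis" of data quality can be as simple as computing completeness and validity rates per batch and flagging batches that fall below a threshold. The sketch below is illustrative only; the field names, records, and thresholds are hypothetical and not taken from the article.

```python
# Hypothetical sketch of metrics-based data quality analysis (framework item 1).
# Field names, sample records, and the 0.95 threshold are assumptions for
# illustration, not part of the framework itself.

def quality_metrics(records, required_fields):
    """Compute completeness and validity rates for a batch of records.

    Completeness: share of records with every required field populated.
    Validity: share of records whose 'balance' field is numeric.
    """
    total = len(records)
    if total == 0:
        return {"completeness": 0.0, "validity": 0.0}
    complete = sum(
        1 for r in records
        if all(r.get(f) not in (None, "") for f in required_fields)
    )
    valid = sum(
        1 for r in records
        if isinstance(r.get("balance"), (int, float))
    )
    return {"completeness": complete / total, "validity": valid / total}


accounts = [
    {"id": "A1", "balance": 1200.0, "branch": "NY"},
    {"id": "A2", "balance": None,   "branch": "SF"},  # missing balance
    {"id": "A3", "balance": 310.5,  "branch": ""},    # missing branch
]
metrics = quality_metrics(accounts, ["id", "balance", "branch"])
print(metrics)

# A simple rectification trigger: batches below threshold go to remediation.
needs_rectification = min(metrics.values()) < 0.95
print("needs rectification:", needs_rectification)
```

In practice these per-batch metrics would be tracked over time, so that rectification effort is directed at the feeds and fields that degrade most, rather than at data scrubbing across the board.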

The framework enables financial institutions to boost employee productivity by focusing staff on meaningful data analysis rather than data scrubbing and error resolution; improve regulatory compliance by providing more accurate, timely responses to regulatory reporting requirements; and fuel revenue growth by giving decision makers the information they need to better understand customers, market dynamics, and competitors when exploring strategic initiatives.

Julien Courbe is a Partner and PwC's Financial Services Technology Leader, as well as US Asset Management Advisory Leader. In his role as Financial Services Technology Leader, he frequently covers technology-enabled business transformations, cost reduction programs, and risk ...

Comments
KBurger
User Rank: Strategist
7/16/2014 | 10:22:37 AM
Re: Which comes first?
It's an age-old problem, and even as more organizations focus on "buy" or "rent" (outsource) rather than build, it seems that even with the best intentions to simplify and streamline, it doesn't take much to create new silos.
Becca L
User Rank: Author
7/15/2014 | 10:22:09 PM
Re: Which comes first?
Great observation, Kathy. Companies are busy building up these Frankenstein-like systems (adding on smaller systems and project-driven applications) rather than overhauling at the enterprise level. The more investment companies make in these smaller, manageable projects, the harder it will be to back out. Hopefully these small projects will fit into an enterprise overhaul, but my impression is that it rarely goes so smoothly.
KBurger
User Rank: Strategist
7/14/2014 | 2:39:41 PM
Which comes first?
It seems like there is something of a chicken/egg consideration that many banks may have overlooked. They are dealing with an imperative to do more with data (related to compliance, growth initiatives, etc.) but without the right foundation/architecture they just perpetuate the problems Julien outlines. But it seems like addressing the problems is difficult because of a lack of shared information, too. I guess this is one reason why many advise approaching data management (or big data, analytics, etc.) via smaller, more manageable projects rather than an enterprise approach -- although of course the hope is that the smaller projects will be done in a way to enable an eventual enterprise approach.
Byurcan
User Rank: Author
7/11/2014 | 9:42:45 AM
Data
Good points. Data is meaningless without the ability to analyze and make sense of it.