London-based HSBC ($2.5 trillion in assets) already has a Basel II program through which it has worked with regulators and internal compliance staff to provide the information called for by the Basel II accords. To support this effort, the bank developed a Basel II risk reporting warehouse that consolidates all risk information associated with its client base.
The bank hired Peter Serenita away from JPMorgan Chase, where he was one of the industry's few chief data officers, in July 2009 to head an entity data program. The program oversees day-to-day operations for client data onboarding and account opening in the global banking and markets business, as well as a change management effort to improve the client onboarding process and the management of client data. According to Serenita, the ongoing, multiyear program is putting processes, procedures and controls in place to cleanse data and ensure that it stays clean.
Getting client data right is critical to credit risk management, continues Serenita, who is now global head of entity services for global banking and markets at HSBC. "You have to have consistent client data in order to be able to assess the exposure you have with your clients," he says.
The complex nature of counterparty relationships provides a special challenge to client data management for risk assessments. "The [credit] crisis has highlighted all the connectivity beyond the initial risk with the initial counterparty," Serenita points out. "A counterparty can play several different roles -- that's why I use the word 'entity' rather than 'client.' An entity might be a client, but that same entity might be an issuer of paper or a guarantor, or serve some other role."
Part of the entity services group's job is to understand all the different roles entities play in their transactions with HSBC. The group, according to Serenita, will collect globally consistent information about entities and the relationships they have with the bank and with each other and present this data to the risk technology and operations group.
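The entity-centric view Serenita describes can be modeled as a record that carries a set of roles and a set of relationships, rather than a single "client" flag. The sketch below is a minimal illustration of that idea; the type names, IDs and relationship labels are hypothetical, not HSBC's actual data model.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class Role(Enum):
    # The roles Serenita mentions: the same entity may be a client,
    # an issuer of paper, or a guarantor on someone else's exposure.
    CLIENT = auto()
    ISSUER = auto()
    GUARANTOR = auto()

@dataclass
class Entity:
    entity_id: str
    legal_name: str
    roles: set[Role] = field(default_factory=set)
    # Maps a related entity's ID to the nature of the relationship.
    related_entities: dict[str, str] = field(default_factory=dict)

# One legal entity appearing in several roles across the bank's books
# (illustrative data only).
acme = Entity("E-001", "Acme Holdings")
acme.roles.update({Role.CLIENT, Role.ISSUER})
acme.related_entities["E-002"] = "guarantor_of"
```

Keeping roles as a set on one entity record, instead of duplicating the entity per role, is what lets a risk system aggregate all exposure to the same legal entity.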
While Serenita is not yet at the point of purchasing software for this project, he says two types of technology should prove helpful to the work. One is "golden copy" data management software, which maintains a single central data repository and distributes data to multiple users and applications in a consistent but flexible way. (Two vendors of this type of software are New York-based GoldenSource and New York-based Asset Control.) The other is data analysis software that can sift through data, observe commonalities and divergences, and report on the divergences, allowing data analysts to focus their attention where it's needed. This software also can be used for ongoing data cleansing and data quality efforts, Serenita notes. (Armonk, N.Y.-based IBM's SPSS and Billerica, Mass.-based Harte-Hanks' Trillium are two examples of this type of software.)
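The core of the divergence-reporting approach can be sketched in a few lines: group records for the same entity arriving from different source systems, then report only the fields on which the sources disagree. This is an illustrative sketch, not the logic of any of the vendor products named above; the field names and record layout are assumptions.

```python
from collections import defaultdict

def find_divergences(records, key="entity_id"):
    """Group records by entity key and report fields whose values
    disagree across source systems, so analysts see only exceptions."""
    grouped = defaultdict(list)
    for rec in records:
        grouped[rec[key]].append(rec)

    report = {}
    for entity, recs in grouped.items():
        fields = set().union(*(r.keys() for r in recs)) - {key, "source"}
        diffs = {
            f: sorted({str(r.get(f)) for r in recs})
            for f in fields
            if len({str(r.get(f)) for r in recs}) > 1
        }
        if diffs:  # only entities with at least one divergent field
            report[entity] = diffs
    return report

# Illustrative records: two branches hold the same entity with
# inconsistent country codes but a matching identifier.
records = [
    {"entity_id": "E-001", "source": "london", "country": "GB", "lei": "ABC123"},
    {"entity_id": "E-001", "source": "hongkong", "country": "UK", "lei": "ABC123"},
]
report = find_divergences(records)
# The country field diverges; the consistent lei field is not reported.
```

In practice this kind of exception report is what drives the ongoing cleansing work: consistent fields pass through silently, and analysts spend their time only on the records that conflict.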
Serenita offers an example of how these technologies can enhance credit risk management: If a bank has a global client with a $100 million credit line at each of three international locations -- such as Hong Kong, London and New York -- each location would measure and monitor its own credit line, and the information would automatically be rolled up into a master view of credit risk. With a global view of the client and global credit limits, the bank can approve an excess in one location by reducing the unused limit in another, which better utilizes the overall credit and helps both the bank and the client.
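The reallocation mechanic in Serenita's example can be made concrete: a draw above one location's limit is approved by shifting unused limit from other locations, so the global limit never grows. The function and figures below are a hypothetical sketch of that arithmetic, not an actual credit system.

```python
def approve_excess(limits, usage, location, requested):
    """Approve a draw above one location's limit by reallocating unused
    limit from other locations; the global limit stays unchanged."""
    global_limit = sum(limits.values())
    new_usage = usage[location] + requested
    excess = new_usage - limits[location]

    if excess <= 0:  # fits within the local limit, no reallocation needed
        usage[location] = new_usage
        return True

    headroom = sum(limits[l] - usage[l] for l in limits if l != location)
    if excess > headroom:  # not enough unused limit elsewhere
        return False

    # Shift unused limit from other locations to cover the excess.
    for l in limits:
        if l == location or excess <= 0:
            continue
        take = min(limits[l] - usage[l], excess)
        limits[l] -= take
        limits[location] += take
        excess -= take

    usage[location] = new_usage
    assert sum(limits.values()) == global_limit  # invariant: global limit fixed
    return True

# Serenita's scenario: $100 million at each of three locations ($MM).
limits = {"Hong Kong": 100, "London": 100, "New York": 100}
usage = {"Hong Kong": 90, "London": 20, "New York": 50}

# Hong Kong needs $30MM more than its remaining $10MM of local room.
approved = approve_excess(limits, usage, "Hong Kong", 30)
```

After the call, Hong Kong's limit has grown to $120 million at the expense of another location's unused capacity, while the global $300 million limit is untouched -- the master-view benefit Serenita describes.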
"I see a renewed vigor across the industry for data management that goes across security data, pricing data and compliance data," says Serenita. "There's a recognition that to be in the global marketplace and serve global clients, this is becoming more and more important -- to the regulators and everyone else."