July 10, 2013

As financial institutions grapple with evolving business landscapes and increased demand for information, finding optimal ways to store, organize and monetize the ever-increasing crush of data they possess is of crucial import. How effectively banks can turn the so-called "big data" they process on a daily basis into better business decisions will be crucial for the industry going forward, said Andy Hirst, senior director of Industry Marketing for SAP.

Hirst, speaking on big data innovation at the International SAP Conference for Financial Services this week, noted that banks are inundated with new sources of information constantly.

"The industry has been analyzing structured information for many years, but the new growth now is in unstructured data," he said.

Additionally, the way businesses want to look at and analyze data has been greatly influenced by consumer use of mobile devices, Hirst said. "We're used to seeing information presented beautifully in these great graphic ways," he added.

Further, he noted that internet and retail companies that use data for highly targeted marketing efforts, such as Amazon or Google, have raised customer expectations. This confluence of the new consumer experience and the desire to see real-time results on mobile devices has spurred new technologies, such as in-memory computing, aimed at helping financial institutions make better use of their data. Hirst believes in-memory solutions will help banks not only process data faster but, more importantly, turn that new capability into tangible value, or as Hirst puts it, "What is the business value of being able to do something faster today than yesterday?"

To that end, Hirst offered several key takeaways for banks looking to get the most out of the big data they have access to. First, he advised them to take advantage of predictive analytics, "not looking at how the world was, but how it will be." The faster a bank can analyze data, the better its predictive value, Hirst noted, and as a result the industry must move from batch to real-time processing.

Hirst also advised financial institutions to "maintain one copy, not dozens" of their data. The more data is copied and moved, the less reliable it becomes, he added.

Banks also should focus not simply on using more data, but on using more diverse data. This includes not only internal data, but external information, such as social data, he said. Also of note is the need to realize that this process is not just a science, but an art as well. "It's about putting humans and data together to get the most insight," he said.

Ultimately, said Hirst, faster data processing and sophisticated analytics are crucial for banks to achieve a 360-degree view of the customer, develop true relationship-based pricing, and answer the question, "How do banks make money off the data being held in their back systems right now?"

ABOUT THE AUTHOR
Bryan Yurcan is associate editor for Bank Systems and Technology. He has worked in various editorial capacities for newspapers and magazines for the past 8 years.