2017-01-27

Keyword tags: Big Data, Cloud Storage, Data Storage Management, Storage Strategy

Solving the Data Glut at Financial Institutions

Financial institutions (FIs) have, for decades, hoarded information about customers, often collecting the same details from the same customer more than once. Much of a customer’s information was previously stored on paper. But while digital has made it more economical to store customer records, processes haven’t changed much. Just go to your bank in Hong Kong and say you want to open a new account, and see how fast the bank clerk will whip out a blank application form and ask you to fill it in.

Just to be clear, you’ve been banking with the same bank for years! Like their brethren in government, financial institutions hoard information. It used to be in stacks of paper.

These days, the impetus to hoard is even stronger because of the drive to better understand the customer, and the amount of data to store is exponentially greater as the sources expand to include everything from voice, video and social media to, most recently, IoT.

But how can IT cope with this impetus to hoard when traditional models of compute and storage have not changed significantly in years – when architecting the data center is still based on square feet (or meters) of floor space?



Yes, the Monetary Authority of Singapore (MAS) has said cloud is now a good thing, but did you read the regulator’s updated guidelines on outsourcing? When you’ve figured out that you can move ahead, read the document “Financial Institutions and Cloud Computing – What’s on the Horizon” from Mayer Brown so you at least know where to start.

As most old techies will tell you, data centers are architected for predictability – whether it is performance, availability or resilience. About the only thing that data center owners can fully guarantee is security, and they cover themselves nicely in the fine print of the operational manual for using the data center’s services.

But with technology developments moving faster than the procurement processes at most financial institutions can keep up with, how do you avoid installing obsolete technology?

The answer may lie with what we are reading about lately – software-defined everything. The “everything” is in reference to the fact that vendors are now marketing a lot of their offerings as software-defined: software-defined compute, software-defined networks, software-defined storage, etc. Like Hollywood, the tech industry is at times creative, and at other times fixated on solving the same problem by modifying what already exists.

Drivers of innovation

Chwee Kan Chua, assistant vice president for Big Data & Analytics and Cognitive Computing at IDC, said the first step for any banking organization is to assess the type of data and the purpose of the analytics application. Not all data are relevant; data lakes can quickly become data graveyards.

But what is the right storage mix to meet the needs of the business? Huawei recommends a 40/60 split of SSDs to HDDs, although the initial design will likely be driven by the use case.
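To put that guideline in concrete terms, here is a back-of-the-envelope sketch in Python. The 40/60 ratio is the figure quoted above; the 500 TB target and the helper function itself are purely illustrative assumptions, not a Huawei sizing tool.

def split_capacity(total_tb, ssd_ratio=0.4):
    """Split a usable-capacity target between SSD and HDD tiers."""
    return {"ssd_tb": total_tb * ssd_ratio,
            "hdd_tb": total_tb * (1 - ssd_ratio)}

# Hypothetical 500 TB target split along the suggested 40/60 line
print(split_capacity(500))  # {'ssd_tb': 200.0, 'hdd_tb': 300.0}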

Although hybrid media configurations are increasingly becoming the norm, Huawei observed that flash appears to be where the greatest benefit to financial institutions lies, particularly as digital transformation initiatives seek to mine that elusive truth about the customer using big data applications.

Phil Davis, Hewlett Packard Enterprise (HPE) vice president of storage for Asia Pacific and Japan, warned that there is no one-size-fits-all formula for calculating storage requirements, but there are various factors retail banks should consider, related to how much data they have now and what their expected growth rate is.

The way forward?

Dan Brassington, head of the Big Data Practice for Southeast Asia at Dell EMC Services, suggested that organizations begin with an initial sizing assessment, taking note of the data types, initial volume, daily ingestion rate, and then what is being done with the data on a daily basis.

“Think what analytical workloads or queries users are applying and how the data changes during its analytics journey within the organization. Always plan for three years and don’t forget DR. Remember, the more data one can store, the greater insights one can generate; data is an organization’s biggest asset,” he added.
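As a rough illustration of that kind of sizing exercise, the sketch below works forward from an initial volume and a daily ingestion rate over a three-year horizon, then doubles the figure for a DR copy. The numbers and the simple linear-growth assumption are ours for illustration, not Dell EMC’s methodology.

def estimate_storage_tb(initial_tb, daily_ingest_gb, years=3, dr_copies=1):
    """Initial data plus ingestion over the planning horizon,
    with one extra copy of everything per DR site."""
    ingested_tb = daily_ingest_gb * 365 * years / 1024.0
    primary_tb = initial_tb + ingested_tb
    return primary_tb * (1 + dr_copies)

# Hypothetical bank: 50 TB today, 200 GB of new data per day, one DR site
print(round(estimate_storage_tb(50, 200), 1))  # ~527.7 TB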



HPE’s Davis said many retail banks start their big data application journeys by analyzing customer purchasing patterns to create offers based on how and where their customers shop, adding in social interests, location data and consumer sentiment.

“For these workloads, it is generally a factor of how far back they want to review. The banks can take their existing data sources, for example core banking apps, over a period of months and add in external factors like social media or location data; this will give them guidance on how much storage they need to provision when getting started,” he elaborated.

Banks also need to consider how often their data will be ‘read’. Big data apps comb data looking for that combination of behaviors or patterns that might create an opportunity, making them read intensive – with large sets of data being processed and analyzed often.

In addition, “big data apps for retail banking are becoming more real-time in nature. Real-time analytics on large data sets requires a platform that offers phenomenal read performance, which can only be offered by an all flash storage platform,” said Davis.

Finally, banks need to plan ahead for how they can scale. Once the lines of business get a taste of what is possible with big data applications, they want more very quickly. This is where storage architecture really matters.

“IT organizations need to ensure that the infrastructure can scale with the demand from the business. Having a storage architecture that allows you to simply scale from a few dozen terabytes to a few petabytes while maintaining consistent service levels will help to ensure success with big data projects,” explained Davis.

To cloud or not to cloud

For more than a decade, banks in Asia have decidedly kept cloud at arm’s length, simply because regulators put what amounted to a large “stop” sign on anything cloud. Recently, the revised outsourcing guidelines from the Monetary Authority of Singapore have re-opened the case for financial institutions to look at the application of cloud in the industry, and with that a need for IT architects to relook at their storage requirements, particularly where the application is big data analytics.

But there is a catch: applications, systems and databases were designed and built around decades-old ways of doing business. This includes how data is created, stored, archived and managed.

Tong Cai Chen, IBM ASEAN sales specialist, noted that the selection of storage deployment designs is bound by the limitations of traditional monolithic storage designs and their connectivity options.

“While the science is to build a system to analyze across large, diverse data sets in the most efficient manner, the art is to ensure that the data storage design is highly optimized to match data profiles with the most efficient deployment method,” he said.



He also commented that while keeping data on premises is believed to provide more predictable performance, better security and better visibility of data, it comes at the price of high capital costs and sacrifices scalability and ease of operations.

Off-premise deployments allow alternative infrastructure investment options under an OPEX model and make data available for access from anywhere.

He also observed that with the various on-cloud application options at one's disposal, off-premise deployments with premium cloud vendors provide access to powerful capabilities to complement the storage systems.

“What is becoming an emerging model is to have data available in a hybrid cloud deployment model. Businesses would then be able to deploy based on the most optimized design for the various profiles of data to be stored and analyzed. Such flexibility in deployment is something software defined storage can provide to help give businesses the differentiating edge needed to harness the full value potential of their data,” he added.

Huawei cautioned that although cloud is becoming a reality in the industry, the software that banks use and the processes they follow are still fairly fluid.

Dell EMC’s Brassington said IT architects need to think about workloads and how data is being ingested, analyzed and provisioned for both users and applications.

“Think of a hot, cold and warm data strategy when planning storage requirement for on/off premise. There is no one size fits all, therefore it is important to understand and identify the correct approach based on the needs. Key factors to keep in mind are: management and control of data; latency and availability; and cost efficiencies,” he recommended.
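As a toy illustration of what such a hot/warm/cold rule might look like in practice, the sketch below routes datasets to a tier by recency and read frequency. The thresholds and tier mappings are arbitrary assumptions for illustration, not a Dell EMC recommendation.

def tier_for(days_since_last_read, avg_reads_per_day):
    """Route a dataset to a storage tier by recency and read frequency."""
    if days_since_last_read <= 7 and avg_reads_per_day >= 10:
        return "hot"   # e.g. all-flash, lowest latency, likely on premises
    if days_since_last_read <= 90:
        return "warm"  # e.g. hybrid SSD/HDD or nearline cloud storage
    return "cold"      # e.g. object or archive storage, lowest cost per TB

# Hypothetical datasets: (name, days since last read, average reads per day)
for name, days, reads in [("card_transactions", 0, 500),
                          ("quarterly_reports", 30, 2),
                          ("closed_accounts_2009", 400, 0.01)]:
    print(name, "->", tier_for(days, reads))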

The glut will continue

As we move into 2017, industry observers predict that artificial intelligence will be the new area of interest across the spectrum of financial services and insurance. Some quarters suggest that the industry will finally launch applications that take advantage of the Internet of Things.

While the certainty of these predictions is debatable, what is clear is that regardless of what new innovations come to market, the data glut will continue and the industry must become smarter in how it architects the creation, storage, management and protection of data in all its forms, regardless of importance.
