Three Steps to Data Nirvana

Virtusa Corp

It’s been thirteen years since British mathematician Clive Humby coined the phrase “data is the new oil”. Today the phrase is so overused it has become a cliché, but for banks and other financial institutions the analogy runs even deeper: data is the new water. It is such an essential resource that no modern financial firm could hope to survive without a constant supply. And, like water, data must be sanitized before it can be used. Just as dirty water can make you very ill, unclean data can cause a host of problems, from a poor experience for customers speaking to staff, to wasted resources when departments unnecessarily purchase data already held elsewhere in the organization. Given that banks spend more than any other industry on big data and analytics solutions (nearly $17bn in 2016), it is all the more important that the data feeding those solutions is viable.

Top tips for better data

With this in mind, there are some key steps banks can take to ensure that the data they’re using is up to scratch and that they’re able to use it to make decisions with confidence:

  1. Everything comes as standard: Organizing data into a standard, readable format is an essential step towards better data management, particularly for firms trying to track thousands of complicated data streams at once. Modern banks look to gain insights into their customers by analyzing everything from spending habits to social media activity in order to offer more personalized services. While this information is potentially of great value, it can only be used effectively when viewed in context. That means all the data gathered on a given customer, both structured and unstructured, must be viewable simultaneously, which is only feasible if the data is standardized first (see the first sketch after this list).

  2. Verify what you’re told: Once the data has been standardized, the next step is to make sure it is accurate. Given that around a third of all customer data is believed to be incorrect, banks need to implement stringent quality checks for all the customer data they hold; after all, there is no point launching a personalized ad campaign if you have the wrong mobile number and street address. Checking the accuracy of such data can involve expensive third-party resources and databases, but the cost of inaccurate data can easily be much higher. Not only does inaccurate data create waste, it can also damage a bank’s reputation if the bank is publicly getting things wrong all the time (see the second sketch after this list).

  3. Get it together: The final step, once the data has been standardized and verified, is to centralize it so that staff can quickly spot duplicate records or data that is no longer accurate. Centralizing data provides a single view of the customer, so that whenever staff contact them they have the latest information and can offer the best advice and customer service. On top of this, given how often banks need to purchase data from third parties, centralization helps ensure money isn’t wasted buying data that another department has already purchased (see the third sketch after this list).
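
To make step 1 concrete, here is a minimal Python sketch of standardization: two hypothetical source systems hold the same customer in different shapes, and each feed is mapped onto one common schema so the records can be analyzed side by side. The field names, formats, and `standardize_*` functions are illustrative assumptions, not any real bank’s schema.

```python
from datetime import datetime

# Hypothetical raw records from two source systems; the field names and
# formats are assumptions for illustration, not a real bank schema.
crm_record = {"CustName": "Jane Doe", "Phone": "(555) 010-4477", "DOB": "03/21/1985"}
core_banking_record = {"customer_name": "DOE, JANE", "tel": "5550104477", "dob": "1985-03-21"}

def standardize_crm(rec):
    """Map a CRM-style record onto the common schema."""
    return {
        "name": rec["CustName"].title(),
        "phone": "".join(ch for ch in rec["Phone"] if ch.isdigit()),
        "dob": datetime.strptime(rec["DOB"], "%m/%d/%Y").date().isoformat(),
    }

def standardize_core(rec):
    """Map a core-banking-style record onto the same schema."""
    last, first = rec["customer_name"].split(", ")
    return {
        "name": f"{first.title()} {last.title()}",
        "phone": rec["tel"],
        "dob": rec["dob"],  # already ISO 8601
    }

# Once both feeds share one schema, they can be analyzed together.
print(standardize_crm(crm_record))
print(standardize_core(core_banking_record))
```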
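
For step 2, the sketch below shows the flavor of rule-based quality checks that flag obviously bad values. The `quality_issues` helper and its regexes are invented for illustration; real verification would typically also draw on the third-party resources mentioned above.

```python
import re

# Toy validation rules; these patterns are illustrative assumptions only.
PHONE_RE = re.compile(r"^\d{10}$")      # ten-digit national number
POSTCODE_RE = re.compile(r"^\d{5}$")    # simple US-style ZIP code

def quality_issues(record):
    """Return a list of data quality problems found in one record."""
    issues = []
    if not record.get("name"):
        issues.append("missing name")
    if not PHONE_RE.match(record.get("phone", "")):
        issues.append("invalid phone number")
    if not POSTCODE_RE.match(record.get("postcode", "")):
        issues.append("invalid postcode")
    return issues

customers = [
    {"name": "Jane Doe", "phone": "5550104477", "postcode": "10001"},
    {"name": "", "phone": "555-0104", "postcode": "1000"},  # fails all checks
]

for c in customers:
    problems = quality_issues(c)
    status = "OK" if not problems else ", ".join(problems)
    print(f"{c['name'] or '<unknown>'}: {status}")
```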
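
And for step 3, a minimal sketch of centralization: departmental extracts are merged into a single golden record per customer, with duplicates resolved by keeping the most recently updated copy. The `centralize` function, customer IDs, and dates are assumptions for illustration.

```python
from datetime import date

# Hypothetical extracts held separately by two departments.
marketing = [
    {"customer_id": "C001", "phone": "5550104477", "updated": date(2018, 6, 1)},
]
lending = [
    {"customer_id": "C001", "phone": "5559998888", "updated": date(2019, 2, 14)},
    {"customer_id": "C002", "phone": "5551112222", "updated": date(2019, 1, 3)},
]

def centralize(*sources):
    """Merge records into one view per customer, keeping the newest copy."""
    master = {}
    for source in sources:
        for rec in source:
            key = rec["customer_id"]
            # A duplicate: keep whichever copy was updated most recently.
            if key not in master or rec["updated"] > master[key]["updated"]:
                master[key] = rec
    return master

golden = centralize(marketing, lending)
for cust_id, rec in sorted(golden.items()):
    print(cust_id, rec["phone"], rec["updated"])
```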

The start of the journey

Sadly, there is no silver bullet that can magically solve all the data management issues a modern enterprise faces, but standardizing, verifying and centralizing data resources are the first steps any firm should take when tackling the problem.

Today banks are investing in a host of incredibly powerful technologies as part of grand digital transformation strategies, including AI, smart devices, and cloud migration. All of these are underpinned by data, so banks need to make sure they can walk before they run by conducting a full review of their data infrastructure and data management capabilities. In particular, banking CIOs should review their digital strategy thoroughly so that, before undertaking any major data project, they are clear on what data infrastructure upgrades will be needed to make the project a success. Otherwise they could find themselves drowning in the deep waters of bad data.