Legacy systems – still the big barrier to banking innovation

21 December, 2015 (14:41) | Blog | By: admin

By Banking Tech


Nimish Shah is banking sector lead at Talend

Outdated legacy IT systems are a major stumbling block for traditional UK high street banks as they look to fight back against their often more agile rivals, widely known as ‘challenger banks’, which, unhindered by complex, unwieldy IT infrastructures, are typically better positioned to innovate, writes Nimish Shah.

In a recent survey by Talend, nearly half of the banking industry professionals polled cited the limits of legacy systems as the biggest IT challenge facing the sector, with 43% also naming it the main barrier to realising the benefits of big data analytics.

Often, too, it is banks’ existing infrastructure that holds them back from making optimum use of their most valuable asset: their data. Some 56% of respondents blamed legacy systems for a lack of data integration, while 45% said that the quality of their data was preventing ‘real-time insights for the business’.

Extracting insight from key data is crucial for banks if they want to retain customers in a world where the industry is increasingly pushing to make switching easier. Recent years have seen the establishment of the Current Account Switch Service, specifically set up to make the process more straightforward, together with an ongoing push by the Competition and Markets Authority to make it easier for customers to take charge of their accounts. Adding to the uncertainty for the big banks, challenger organisations are achieving ever greater successes.

Indeed, KPMG’s inaugural Challenger banking benchmarking report, The Game Changers, recently revealed that based on its research, challengers are outperforming the Big Five banks in terms of growth.

However, there are clear signals that the banks know what they need and want to do to stay ahead of the pack and perhaps unsurprisingly, big data plays a key role in that. More than three-quarters (76%) of the banking professionals agreed that the industry has a clear understanding of the benefits of big data, and more than half (51%) see it as a way to innovate faster and more effectively against industry rivals. Equally, nearly a third (32%) believe that big data can help counteract economic pressures facing the banking industry.

This is all positive, but banks still face the familiar gap between vision and reality. There is a clear divide between the growing understanding of the potential benefits big data can bring and the real-world implementations that actively utilise it. Today, for example, just 30% of junior managers and professionals claim that their organisation is well advanced in its big data initiatives.

So what is continuing to hold uptake levels back? The survey highlighted the integration of multi-channel data (referenced by 34% of the sample), managing the data explosion (33%) and gaining actionable insight from customer data (25%) as critical areas of concern. The ongoing shortage of expert resources is another key issue, a problem exacerbated by the continuing complexity of the big data environment. As data science has evolved, the technology required has become increasingly advanced, and the pool of engineers capable of using it in commercial applications – or even understanding it – has not grown accordingly. Today, 28% of respondents see their lack of in-house skills as a major barrier to big data adoption. The truth is that there are few staff with big data skills on the market, and so those with the knowledge command a steep premium.

Fortunately, technology can help to close the talent gap. To start seeing genuine value from big data projects, banks must embrace technology platforms that simplify the process, reduce the need for complex coding, integrate with legacy systems and place graphical tools into the hands of users outside the IT department. Only by doing so will banks begin to excel in their big data applications and ensure that 2015/2016 is the timeframe in which big data projects start to show real returns.

