Let’s start with the problem with BCBS 239. It’s simple: there is so much hype and commentary about what’s involved, about implementation approaches and the role of technology, that many simply do not know where to start. With this in mind, I have penned this brief article to shine a light on what I believe is a practical and straightforward set of steps for putting a plan in place.
The global financial crisis highlighted that most large banks lacked appropriate facilities to aggregate, assess and analyse risks in a timely and flexible fashion. While many believed that self-regulation, enhanced systems and improved management practices offered the solution, the Basel Committee on Banking Supervision felt differently and delivered BCBS 239, a new regulatory standard aiming to strengthen banks’ risk data aggregation capabilities and internal risk reporting practices.
As a health warning, these are my views and my perspective. It’s a personal perspective based on many years’ experience of dealing with financial service organisations, the risk agenda and understanding the role of data to support predictive analysis of all types.
A little context…
The requirements issued in January 2013 by the Basel Committee on Banking Supervision pose a real and serious challenge for many global financial institutions. There isn’t a week that goes by without another ‘budget busting’ change programme kicking off with the principal aim of pacifying the regulator.
It is no secret that a number of large financial institutions are struggling to cope with historic regulatory initiatives such as Basel II. Many are already ‘swamped’ before they can begin to contemplate the endless raft of new regulatory ‘noise’.
‘Noise’ and ‘distraction’ are seemingly dragging these organisations into ever-increasing depths of regulatory difficulty. What fascinates me is how, time and time again, these organisations (and banks in particular) struggle to get their house in order. In the main, it appears that they repeatedly fail to take a common sense approach and do what is actually right for the business itself. My observation here is that doing what is right for the business will mean the regulatory agenda is satisfied as a by-product of good business practice.
And onto some facts…
So, we have an answer, let’s look a little closer at why this is the case and what it all means.
Enterprise risk management, liquidity risk management, performance management and regulatory compliance are now interrelated obligations. This wasn’t always the case, and it means there is an urgent need to improve risk management data capabilities and architecture so that all stakeholders get a clear and consistent view, a single source of truth, of the bank’s aggregated risk exposures. Data adequacy is a critical success factor within a unified data framework spanning, as a minimum, risk and finance, plus treasury for liquidity risk purposes.
A path ‘badly’ trodden…
In my opinion, the majority of banks seem to be approaching the regulatory agenda completely the wrong way round, back to front, so to speak. The traditional focus appears firmly on addressing the regulator’s expectations and then that’s it: the bare minimum needed, and it’s considered ‘job done’, at which point the attitude adopted seems to be ‘we’ll sit back until the next set of regulations hit us’.
We all know that common sense does not come easy in this sector. However, the seemingly obvious lack of application of standard commercial and business acumen in getting to grips with these regulatory issues begs the question ‘Where has all the common sense gone?’.
So, for the avoidance of doubt, I’ll make my point again, in my opinion, banks should be looking to get their house in order and the rest is straightforward, and I will explain why below.
Issues… The root cause!
A recent survey by the Basel Committee’s Working Group on SIB Supervision (WGSS) highlighted the primary weaknesses, namely banks’ continued reliance on manual workarounds, the lack of a consolidated view of risk, fragile risk systems, unsatisfactory risk data governance, opaque definitions of data ownership, weak controls around data quality assurance, and insufficiently documented policies and procedures around risk data aggregation.
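To make the ‘weak controls around data quality assurance’ point concrete, here is a minimal sketch of the kind of automated check that replaces a manual workaround. The field names (‘exposure’, ‘owner’, ‘as_of’) and rules are illustrative assumptions of mine, not part of the standard or any survey:

```python
# Illustrative sketch: automated data-quality checks on risk records.
# Field names and rules are hypothetical examples, not from BCBS 239.

def check_record(record):
    """Return a list of data-quality failures for one risk record."""
    failures = []
    if record.get("owner") is None:
        failures.append("no documented data owner")
    if record.get("exposure") is None:
        failures.append("missing exposure value")
    elif record["exposure"] < 0:
        failures.append("negative exposure")
    if record.get("as_of") is None:
        failures.append("no as-of date")
    return failures

records = [
    {"id": "R1", "exposure": 125.0, "owner": "credit-risk", "as_of": "2015-03-31"},
    {"id": "R2", "exposure": None, "owner": None, "as_of": "2015-03-31"},
]

# Map each record id to its list of failures; a clean record maps to [].
issues = {r["id"]: check_record(r) for r in records}
```

The point is not the checks themselves but that they are codified, repeatable and owned, rather than applied by hand in a spreadsheet.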
These problems have a complex genesis, driven by the proliferation of organisational information and data silos. Applications sprung up throughout the institutions, serving specific vertical functions. As new requirements and regulations came into effect or were identified, many enterprises built new platforms to address them, whether around enterprise risk management, operational risk and compliance, enterprise performance management, or customer insight.
In these environments, there was often manual or poor sharing of data and a lack of consistency, which limited visibility while significantly increasing complexity.
Whether these problems originate in the proliferation of siloed systems or in the sheer complexity of the business organisation, the target should be the same: building a single source of truth for risk data that is accurate, clear, complete, and readily accessible. It should also be reconciled with finance data so all stakeholders in the bank can rely on the figures and reports derived from it.
Combined, this approach creates a common sense lexicon or a single language that can bring closer and more successful alignment of risk and finance operations moving forward.
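As a sketch of what that single source of truth, reconciled with finance, can mean in practice, the following aggregates exposures from two hypothetical siloed systems and compares the total against a finance ledger figure. The system names and numbers are entirely invented for illustration:

```python
# Illustrative sketch: aggregate risk exposures from siloed systems
# and reconcile the total against the finance ledger.
# System names and figures are hypothetical.

from collections import defaultdict

credit_system = {"CounterpartyA": 100.0, "CounterpartyB": 250.0}
trading_system = {"CounterpartyA": 40.0, "CounterpartyC": 60.0}

# Build the consolidated view of exposure per counterparty.
aggregated = defaultdict(float)
for silo in (credit_system, trading_system):
    for counterparty, exposure in silo.items():
        aggregated[counterparty] += exposure

total_risk = sum(aggregated.values())
finance_ledger_total = 450.0  # figure reported by finance

# A break between risk and finance totals is flagged for investigation,
# not manually adjusted away.
reconciled = abs(total_risk - finance_ledger_total) < 1e-6
```

In a real bank the same principle applies at the level of data models and reconciliation controls rather than a dozen lines of code, but the discipline is identical: one aggregated figure, agreed with finance, that everyone reports from.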
So a little deeper in, why does my analysis suggest this to be the case?
Goals, pressure and lack of knowledge
CROs and CFOs are now under pressure to transform their risk and finance functions accordingly. But the questions everyone seems to be asking are: how do we actually do this, what methodology should we use for the transformation, and is it actually going to guarantee success?
For many, transformation activities are failing at huge cost and no-one wants to be the next in line. Critically, there appears to be little appetite out there to do what is best, meaning that the scale of functional or process change required to fix the core is regarded as simply too unpalatable.
The utopian end state
Most organisations are struggling to determine a target end state architecture and a granular unified data model that satisfies the need. So far, the larger consultancies have shown that they do not have the wholesale skills and granularity required to get close to a set of requirements that the business will agree to, let alone start to implement.
Their business model is not based on implementation per-se and this is why my observations stand close scrutiny.
It should, therefore, come as no surprise that I consider the problem here to be right in front of your eyes. The majority of online media and consulting ‘thought leadership’ regarding BCBS 239 seemingly appears to simply ‘regurgitate’ the BIS paper itself. Article upon article talks about ‘satisfying the regulators’ and that’s it.
I have yet to discover an implementation approach anywhere in these articles (at least one that makes sense), and no one is talking about achieving genuine transformational outcomes for the money spent. I am ashamed that the consultancy world is failing spectacularly in this key transformation arena.
The technical solution that vendors are most frequently touting is a common data warehouse, a one-size-fits-all regulatory answer, but the solutions themselves are not up to scratch. In my experience, when pressed, the large vendors will admit this fact themselves.
Where do we go then? Bespoke is too scary. And how should we manage these vendors if we do in fact go with them? If you’re following my earlier advice from above, and you are looking closer… just pause and ask yourself the question, ‘is Bespoke really too scary?’.
As a result of the above, the effect is that banks are overly focused on purely satisfying the BCBS 239 regulation itself. In reality, it would appear that no-one has realised that if implemented correctly there is huge value in getting BCBS 239 right! Am I getting through yet?
The principle here is ‘business first and the regulatory agenda second’. If your business processes, technology and data were optimised in the first place (delivered through a transformation design methodology as outlined below), then achieving BCBS 239, and in fact the bulk of the regulatory agenda, would become a ‘walk in the park’. An effective risk management strategy demands adequate data; this is a given, and not something done just to satisfy the regulator.
The appealing thing for me about BCBS 239 is not that the requirements represent a new set of obligations for banks, but that, most critically, they actually present an opportunity to strengthen existing initiatives that address inadequacies in risk management functions, or to start new transformative changes that get banks operating where they should be.
Banks should be ‘lean, mean, money making machines’, running slick operations that seek to maximise financial reward for minimum and managed risk at every opportunity.
Fear not, there are a number of very appealing, genuinely realisable business benefits that are on offer here, but few have woken up to this yet.
And the related benefits and business outcomes…
In my experience and that of my colleagues at ACS, whichever way you want to look at this (implement BCBS 239 to get your business right, or get your business right and BCBS 239 is satisfied), there are some key benefits that you can realise if you implement BCBS 239 correctly.
- A clear and comprehensive view of risk exposure
- Lower losses due to improved data quality and risk insight
- Lower capital needs due to accurate asset / liability classification
- Improved revenue through client insight and product differentiation
- Reduced operational risk through less manual effort (and hence lower capital)
- Lower costs through improved STP rates
- Improved loss prevention reducing operational risk cost
- Improved data management and reduced reconciliations leading to lower compliance and operational costs
- Improved risk adjustment capabilities increasing overall business value
- Improved limits management and efficient capital allocation
- Lower provisioning
Okay, I think I get it now… So, how do I go about it?
So the question is how do we go about all of this? The answer is very straightforward and lies in the adoption of a transformation design methodology that utilises granular integrated capability (process) models across finance, risk and treasury (a subject I will cover in more detail in future articles).
Unsurprisingly, given the level of consulting fatigue in the market, many organisations are convinced that a transformation is exactly what they don’t need. To those organisations that are still struggling to muster an approach of any sort, or those that are not quite sure whether they are on the right path, I would advocate performing a risk data self-assessment which, done correctly, will give you the answer you need. My approach to getting this diagnostic done is outlined below.
It isn’t rocket science. It’s simple and practical with dramatic outcomes. The above illustration should be self-explanatory, but please feel free to contact me for any query or further information if needed.
Once you have performed this ‘self-assessed diagnostic’ and have a holistic view of the projects required to address the gaps, then generally speaking you are looking at a transformation, but call it what you may. (The word ‘transformation’ has been used far too haphazardly in recent years, particularly in technology. My perspective here is simple: there is no such thing as a technology transformation; businesses transform, and technology is there to support the business!)
Either way, the result of the diagnostic is generally a ‘sign-posted’ amount of change that may present a daunting challenge to both the business and technology functions, and a transformation design methodology is generally required to deliver the change.
Again, this is common sense. In short, it’s all about granular process architecture, an integrated capability (process) model across finance, risk and treasury, which is actually where the relevant subject matter expertise is pertinent. The methodology allows for the change to be delivered in a very logical way that provides the business with true visibility of the implementation as it progresses through the life cycle. The result is that the business has improved control over the entire implementation process.
Financial institutions are not adequately addressing BCBS 239. This is true of most other regulatory initiatives too. The data driven approaches that most banks are considering in addressing BCBS 239 are a mere ‘cop out’ and are there, at best, to do little more than satisfy the regulator.
So banks are missing a treat, they should be looking to get their house in order and reap the huge benefits that are on offer. The nightmare regulatory agenda will then become a thing of the past.
Banks should think smart. They should be seizing the opportunity to build the capability required to manage the risks of the business they are conducting, and the underlying business processes needed to support that capability. The prospect is scary because I see few talking common sense here, and when the chips are down, the big consultancies have failed them spectacularly.
The route to success is actually very straightforward. My advice is to get your risk data self-assessment performed in the sequence I have outlined above. Work out an internal model that provides you with the capability (and satisfies the regulator as to your capability) to manage risk exposures effectively, expand business operations and manage additional business risks. Then, follow a transformation design methodology that provides you with genuine transformational outcomes, delivered through a granular, integrated capability model across finance, risk and treasury.
If you have any queries regarding this article then please feel free to contact me here.
Written by Matt Good