Smart management of liquidity and capital is vital to identifying and executing today’s banking strategies. Fallout from the financial crisis has also heightened regulatory scrutiny of liquidity and capital adequacy, with further pressure likely to come from the implementation of Basel III.
Identifying and managing risks across the enterprise, at the same time as demonstrating transparency and control to investors, regulators and other key stakeholders, are core management tasks. For banks, effective decision-making demands faster, more frequent and more consistent information about risk exposures, the capital needed to support them and the resulting impact on present and future profitability. It also requires the ability to generate risk-adjusted performance measures (e.g., return on risk-adjusted capital) at a granular level, be this by customer, country, asset or sector.
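As a rough illustration of the granular, risk-adjusted measures referred to above, the sketch below computes return on risk-adjusted capital (RORAC) per business segment. The figures, segment names and field names are invented for illustration; this is not a prescribed calculation methodology.

```python
# Minimal sketch: return on risk-adjusted capital (RORAC) at segment level.
# RORAC = net income / risk-adjusted (economic) capital.
# All figures and names below are hypothetical.

def rorac(net_income: float, risk_adjusted_capital: float) -> float:
    """Return on risk-adjusted capital as a fraction (e.g. 0.15 = 15%)."""
    if risk_adjusted_capital <= 0:
        raise ValueError("risk-adjusted capital must be positive")
    return net_income / risk_adjusted_capital

# Hypothetical segment-level figures (in millions).
segments = {
    "retail_uk":      {"net_income": 120.0, "economic_capital": 800.0},
    "corporate_emea": {"net_income":  95.0, "economic_capital": 950.0},
}

for name, s in segments.items():
    ratio = rorac(s["net_income"], s["economic_capital"])
    print(f"{name}: RORAC = {ratio:.1%}")
```

The same calculation can be cut by customer, country, asset or sector once capital and income are allocated at that level of granularity.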
Success depends on comprehensive and integrated risk and financial information – but the reality for many (financial and non-financial institutions alike) is that current systems, processes and data may be obscuring this joined-up view. So, what are the barriers to aligning risk and finance and what’s the best way to overcome them?
Many banks have turned to technology to help integrate risk and finance reporting, often at great cost and with mixed results. A more pragmatic and cost-effective approach is to get the data right first.
To tackle the urgent task of developing a risk and finance data framework, some organisations have overhauled their reporting systems. Data warehouses are being used to bring together the required operational data. While this approach can enhance data mining and analytical capabilities, it can often fall down through poor data capture. The old adage holds true: ‘rubbish in, rubbish out’.
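One practical response to the ‘rubbish in, rubbish out’ problem is to validate records at the point of capture rather than patching them downstream in the warehouse. The sketch below shows what such a check might look like; the field names, rules and tolerances are invented for illustration.

```python
# Sketch: validating trade records at source, before they reach the
# data warehouse. Fields and rules are hypothetical examples.

REQUIRED_FIELDS = {"trade_id", "counterparty_id", "notional", "currency"}
VALID_CURRENCIES = {"GBP", "USD", "EUR"}

def validate_at_source(record: dict) -> list:
    """Return a list of quality issues; an empty list means the record is clean."""
    issues = [f"missing field: {f}" for f in REQUIRED_FIELDS - record.keys()]
    currency = record.get("currency")
    if currency is not None and currency not in VALID_CURRENCIES:
        issues.append(f"unknown currency: {currency}")
    notional = record.get("notional")
    if notional is not None and (not isinstance(notional, (int, float)) or notional <= 0):
        issues.append("notional must be a positive number")
    return issues

clean = {"trade_id": "T1", "counterparty_id": "C1",
         "notional": 1_000_000.0, "currency": "GBP"}
dirty = {"trade_id": "T2", "notional": -5}

print(validate_at_source(clean))  # []
print(validate_at_source(dirty))
```

Rejecting or flagging the dirty record at capture is far cheaper than discovering it later through a failed reconciliation in group reporting.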
Others have compounded the problem by building adjustments into their group-level reporting processes to help iron out some of the deficiencies in the source data. While this can often be a short-term necessity, the adjustments tend to multiply in the face of mounting regulatory or business demands. The transparent connection with the underlying data is often lost, which exacerbates the problem and creates a vicious circle of data issues.
Without improvements in the quality and completeness of data captured at source, changing the processes and systems will have little impact. Risk and finance cannot resolve the difficulties on their own. The entire organisation must recognise that good data and the decisions that flow from it are everyone’s responsibility.
We think that putting data management rather than technology at the centre of the solution can really make a difference – see the diagram above, which outlines the core elements of this. The key is to look at what internal and external customers need (‘demand perspective’) rather than what the business currently produces (‘supply perspective’), to make sure that investment in new architecture is driven by a strategically aligned business outcome. For example, prioritising improvements in the quality and availability of risk modelling data used for Basel II can deliver significant capital-reduction benefits for a relatively limited investment.
It makes sense to focus initially on the elements of data directly controlled by risk and finance – such as counterparty hierarchies, financial charts of accounts and legal entity and management structures, as well as transactional data from the front office. Next in line are all the other data elements owned by areas outside of these functions, but used by them to complete their reporting outputs. Examples include product classifications and customer hierarchies, instrument and market data, and other sales and marketing attributes. These are important when establishing robust risk-adjusted performance and economic capital metrics, which require capital, revenue and cost allocations across relevant business dimensions.
The data policy sets out the specific data elements required by risk and finance to meet their modelling and reporting requirements. It then defines clear ownership and the requirements for the quality, consistency and supply of each data set. This has clear advantages over the alternative approach that focuses on the business outputs (i.e., reports produced), before identifying the detailed data elements required to produce them.
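To make the demand-led approach concrete, a data policy entry might be encoded along the following lines. This is a minimal sketch: the schema, field names and example values are assumptions for illustration, not a standard.

```python
# Sketch: one policy entry per data element, starting from the business
# demand and pinning down ownership, quality and supply. Schema is invented.

from dataclasses import dataclass

@dataclass(frozen=True)
class DataPolicyEntry:
    element: str               # e.g. "counterparty_hierarchy"
    owner: str                 # originator accountable for the data at source
    consumer: str              # end user within risk or finance
    quality_threshold: float   # minimum acceptable completeness/accuracy (0-1)
    supply_frequency: str      # how often the data set must be refreshed

policy = [
    DataPolicyEntry("counterparty_hierarchy", "front_office",
                    "credit_risk", 0.98, "daily"),
    DataPolicyEntry("chart_of_accounts", "group_finance",
                    "finance_reporting", 1.00, "monthly"),
]

# The policy is demand-led: each entry begins with what risk and finance
# need, not with whatever reports the business currently produces.
for entry in policy:
    assert 0.0 <= entry.quality_threshold <= 1.0
```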
Governance procedures to support the data policy include assigning data ‘owners’ – both the end user within risk and finance and the originator of the data at source. Ownership by end users ensures the reporting outputs are responsive to current and evolving needs. It also means the producers of data – typically the front office – can be encouraged to address poor quality data capture, for example through the introduction of data quality chargeback mechanisms or penalties. This helps the front office see the benefits to them personally and the wider business of adopting a different approach to data.
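A data quality chargeback mechanism of the kind mentioned above could work broadly as follows. The tolerance and charge parameters here are entirely hypothetical; any real scheme would be negotiated between the front office and the consuming functions.

```python
# Sketch of a data quality chargeback: the originating desk pays for
# records failing validation beyond an agreed tolerance.
# Tolerance and charge values are invented for illustration.

def data_quality_charge(records_submitted: int,
                        records_failing: int,
                        tolerance: float = 0.02,
                        charge_per_failure: float = 5.0) -> float:
    """Charge levied on the data originator for excess quality failures."""
    allowed = int(records_submitted * tolerance)
    excess = max(0, records_failing - allowed)
    return excess * charge_per_failure

# A desk submitting 10,000 trades with 350 failing validation:
# 200 failures fall within the 2% tolerance; the remaining 150 are charged.
print(data_quality_charge(10_000, 350))  # 750.0
```

Because the charge scales with the failure rate, the desk sees a direct, personal cost in poor data capture – which is the behavioural point of the mechanism.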
Identifying the data elements will allow you to put together a data dictionary, which sets out your data standards. This starts with the business demand for each element and what is needed to meet it, then maps the required data back to its sources and sets out how it is used, derived, controlled and verified. It not only defines data’s meaning, but also its relationship to other data, origin, use and format. The mapping process will help to streamline supply by identifying and ironing out any logjams, duplication or inconsistencies. This information will then feed into any data remediation process required over the short term.
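A data dictionary entry of the shape described above might look like the sketch below. The keys and example content are assumptions for illustration; the point is that each element carries its meaning, demand, sources, derivation, relationships, format and controls in one place, which makes duplicated or inconsistent supply easy to spot.

```python
# Sketch of a data dictionary entry; keys and values are illustrative only.

data_dictionary = {
    "counterparty_rating": {
        "definition": "Internal credit rating assigned to a counterparty",
        "business_demand": "Basel capital calculation; credit risk reporting",
        "sources": ["front_office_booking", "credit_workflow"],
        "derived_from": ["pd_model_output"],
        "related_to": ["counterparty_id", "probability_of_default"],
        "format": "ordinal scale 1-22",
        "controls": ["four-eyes approval", "annual re-rating check"],
    },
}

def multi_sourced(dictionary: dict) -> list:
    """Flag elements fed by more than one source -- candidates for
    streamlining or deduplication in the supply chain."""
    return [name for name, entry in dictionary.items()
            if len(entry["sources"]) > 1]

print(multi_sourced(data_dictionary))  # ['counterparty_rating']
```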
The completion of a data policy and data dictionary triggers the development of a logical data model for risk and finance, which establishes how the data is to be structured and how each data element is related to another. This data model then provides the foundation for all process and system change.
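As a simplified illustration of what a fragment of such a logical data model might express, the sketch below relates front-office transactions to counterparties and legal entities. The entities, attributes and identifiers are invented; a real model would be far richer.

```python
# Sketch of a logical data model fragment for risk and finance.
# Entity and attribute names are hypothetical.

from dataclasses import dataclass

@dataclass
class LegalEntity:
    entity_id: str
    name: str

@dataclass
class Counterparty:
    counterparty_id: str
    parent: LegalEntity        # counterparty hierarchy rolls up to a legal entity

@dataclass
class Transaction:
    trade_id: str
    counterparty: Counterparty  # front-office transactional data
    account: str                # maps to the financial chart of accounts
    notional: float

group = LegalEntity("LE-001", "Example Holdings")
cpty = Counterparty("CP-042", group)
trade = Transaction("T-9001", cpty, "1200-loans", 5_000_000.0)

# Because the relationships are explicit, any trade can be traced to its
# counterparty and up to the owning legal entity.
print(trade.counterparty.parent.name)  # Example Holdings
```

Making these relationships explicit is what allows subsequent process and system change to be built on a stable foundation rather than reverse-engineered from reports.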
In parallel with establishing good data governance and design approaches, it’s important to start fixing data and resolving any issues as soon as possible, while making sure these fixes are aligned and controlled through appropriate business governance.
Good data is the lifeblood of any business and requires effective management like all other assets. Fixing the data first helps address the key organisational issues at the heart of today’s market demands for increased transparency and accuracy of reporting. It also gives a solid basis to drive more benefits from any future technology solutions, as well as making them significantly cheaper and less risky to implement.