Put data first – it’s the lifeblood of business

Integrating finance and risk reporting is an imperative for many organisations. Focusing on data and not technology is the key to fast results that will pay off in the longer term, argue Paul Morton and Symon Dawson.

Smart management of liquidity and capital is a vital element in the identification and execution of today’s banking strategies. Fallout from the financial crisis has also heightened regulatory scrutiny of liquidity and capital adequacy, with further pressure likely to come from the implementation of Basel III.

Identifying and managing risks across the enterprise, at the same time as demonstrating transparency and control to investors, regulators and other key stakeholders, are core management tasks. For banks, effective decision-making demands faster, more frequent and more consistent information about risk exposures, the capital needed to support them and the resulting impact on present and future profitability. It also requires the ability to generate risk-adjusted performance measures (e.g., return on risk-adjusted capital) at a granular level, be this by customer, country, asset or sector.
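As an illustration of how such a measure works, return on risk-adjusted capital (RORAC) divides net income by the risk-adjusted (economic) capital supporting it, and can be computed at whatever granularity the data allows. The sketch below assumes a simple record-per-segment layout; the figures and segment names are hypothetical.

```python
# Illustrative sketch: return on risk-adjusted capital (RORAC) by segment.
# Segment names and figures are hypothetical, not real data.

exposures = [
    {"segment": "Corporate lending", "net_income": 120.0, "risk_adjusted_capital": 900.0},
    {"segment": "Retail mortgages",  "net_income": 45.0,  "risk_adjusted_capital": 600.0},
]

def rorac(net_income: float, capital: float) -> float:
    """RORAC = net income / risk-adjusted (economic) capital."""
    return net_income / capital

for e in exposures:
    print(f"{e['segment']}: {rorac(e['net_income'], e['risk_adjusted_capital']):.1%}")
```

The same calculation could equally be cut by customer, country, asset or sector – provided capital, revenue and cost can be allocated consistently to those dimensions, which is precisely what integrated risk and finance data makes possible.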

Success depends on comprehensive and integrated risk and financial information – but the reality for many (financial and non-financial institutions alike) is that current systems, processes and data may be obscuring this joined-up view. So, what are the barriers to aligning risk and finance and what’s the best way to overcome them?

Have you got the information you need?

Within many organisations, risk and finance are still worlds apart, making it very difficult to generate integrated information and insights. Why is this? In our experience, there are a number of contributing factors:
  • Reporting data, processes and systems for the risk and finance functions have generally been developed independently and have struggled to keep pace with regulatory pressures, including IFRS, Basel II and Sarbanes-Oxley. There has been a tendency towards short-term, tactical solutions that has hampered attempts to develop integrated systems.
  • Back office processes and systems have suffered from underinvestment, combined with a focus on delivering solutions at the lowest cost and with minimum disruption.
  • Short-term front office requirements have set the agenda. The need to quickly implement a new product or bring on board a new client has encouraged the tendency to adopt tactical measures. A lack of front-office accountability for data quality and management has reduced the pressure to address systems problems.
  • Even global organisations with the scale to allow them to make the most of the cost and efficiency benefits of enterprise-wide developments have struggled to reconcile the divergent regulatory and statutory requirements across their businesses, and have been forced to implement multiple local solutions.
  • Delivery of the required reports has depended on manually intensive workarounds that can be expensive and prone to error.

The solution?

Many banks have trusted to technology to help integrate risk and finance reporting, often at great cost and with mixed results. A more pragmatic and cost-effective approach is to get the data right first.

To tackle the urgent task of developing a risk and finance data framework, some organisations have overhauled their reporting systems. Data warehouses are being used to bring together the required operational data. While this approach can enhance data mining and analytical capabilities, it can often fall down through poor data capture. The old adage holds true: ‘rubbish in, rubbish out’.

Others have compounded the problem by building adjustments into their group-level reporting processes to help iron out some of the deficiencies in the source data. While this can often be a short-term necessity, the adjustments tend to multiply in the face of mounting regulatory or business demands. The transparent connection with the underlying data is often lost, which exacerbates the problem and creates a vicious circle of data issues.

Without improvements in the quality and completeness of data captured at source, changing the processes and systems will have little impact. Risk and finance cannot resolve the difficulties on their own. The entire organisation must recognise that good data and the decisions that flow from it are everyone’s responsibility.
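One practical way to improve quality at source is to validate records at the point of capture, rather than patching them downstream. The minimal sketch below assumes a dict-based trade record; the required fields and rules are illustrative, not a standard.

```python
# Minimal sketch: validate a record at the point of capture, so defects
# are caught at source rather than adjusted away in group reporting.
# Field names and rules are assumptions for illustration.

REQUIRED_FIELDS = ["counterparty_id", "product_code", "notional", "currency"]

def capture_errors(record: dict) -> list:
    """Return a list of validation errors; an empty list means the capture is clean."""
    errors = [f"missing {f}" for f in REQUIRED_FIELDS if not record.get(f)]
    if record.get("notional") is not None and record["notional"] <= 0:
        errors.append("notional must be positive")
    return errors

clean_trade = {"counterparty_id": "C1", "product_code": "IRS",
               "notional": 1_000_000, "currency": "GBP"}
print(capture_errors(clean_trade))
```

Rejecting or flagging a record the moment it fails such checks puts responsibility where the data originates, rather than leaving risk and finance to repair it later.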

Pragmatic approach

We think that putting data management rather than technology at the centre of the solution can really make a difference – see the diagram above, which outlines the core elements of this approach. The key is to look at what internal and external customers need (‘demand perspective’) rather than what the business currently produces (‘supply perspective’), so that investment in new architecture is driven by a strategically aligned business outcome. For example, prioritising improvements in the quality and availability of the risk modelling data used for Basel II can deliver significant capital reduction benefits for a relatively limited investment.

It makes sense to focus initially on the elements of data directly controlled by risk and finance – such as counterparty hierarchies, financial charts of accounts and legal entity and management structures, as well as transactional data from the front office. Next in line are all the other data elements owned by areas outside of these functions, but used by them to complete their reporting outputs. Examples include product classifications and customer hierarchies, instrument and market data, and other sales and marketing attributes. These are important when establishing robust risk-adjusted performance and economic capital metrics, which require capital, revenue and cost allocations across relevant business dimensions.

Data policy

The data policy sets out the specific data elements required by risk and finance to meet their modelling and reporting requirements. It then defines clear ownership and the requirements for the quality, consistency and supply of each data set. This has clear advantages over the alternative approach that focuses on the business outputs (i.e., reports produced), before identifying the detailed data elements required to produce them.
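To make this concrete, a single policy entry might record the element, its owner, its quality requirements and who consumes it. The dict-based representation below is a hedged sketch; the field names and thresholds are illustrative assumptions, not a prescribed format.

```python
# Hypothetical sketch of one data policy entry. All field names,
# systems and thresholds are illustrative assumptions.

counterparty_rating_policy = {
    "data_element": "counterparty_internal_rating",
    "owner": "Credit Risk",
    "source_system": "front-office origination platform",  # assumed source
    "quality_requirements": {
        "completeness": "100% of rated counterparties",
        "timeliness": "refreshed within 1 business day of review",
        "consistency": "single rating scale across legal entities",
    },
    "consumers": ["Basel capital models", "expected-loss reporting"],
}

def meets_policy(record: dict, policy: dict) -> bool:
    """Minimal check: the element named in the policy is present and non-null."""
    return record.get(policy["data_element"]) is not None
```

Even a skeleton like this makes ownership and expectations explicit per data set, which is the point of starting from the data elements rather than from the reports.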

Data governance

Governance procedures to support the data policy include assigning data ‘owners’ – both the end user within risk and finance and the originator of the data at source. Ownership by end users ensures the reporting outputs are responsive to current and evolving needs. It also means the producers of data – typically the front office – can be encouraged to address poor quality data capture, for example through the introduction of data quality chargeback mechanisms or penalties. This helps the front office see the benefits, both to themselves and to the wider business, of adopting a different approach to data.
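A chargeback mechanism of the kind mentioned above can be as simple as allocating a cost per defect back to the desk that originated the failing records. The sketch below is a hypothetical illustration; the charge rate and desk names are assumptions.

```python
# Hypothetical data quality chargeback: desks that originate records
# failing validation are charged per defect. The rate is an assumed
# internal figure for illustration only.

CHARGE_PER_DEFECT = 50.0  # assumed internal charge, in currency units

def chargeback(defects_by_desk: dict) -> dict:
    """Return the charge allocated to each front-office desk."""
    return {desk: n * CHARGE_PER_DEFECT for desk, n in defects_by_desk.items()}

print(chargeback({"Rates desk": 12, "FX desk": 3}))
```

The mechanism matters less than the incentive it creates: once defects carry a visible cost, the front office has a direct reason to fix capture at source.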

Data dictionary

Identifying the data elements will allow you to put together a data dictionary, which sets out your data standards. This starts with the business demand for each element and what is needed to meet it, then maps the required data back to its sources and sets out how it is used, derived, controlled and verified. It not only defines data’s meaning, but also its relationship to other data, origin, use and format. The mapping process will help to streamline supply by identifying and clearing any logjams, duplication or inconsistencies. This information will then feed into any data remediation process required over the short term.
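In practice, a dictionary entry carries exactly the attributes described above: meaning, source, format, derivation, relationships and controls. The dataclass below is a hedged sketch of one such entry; the element chosen and all of its attribute values are illustrative assumptions.

```python
from dataclasses import dataclass

# Hedged sketch: one data dictionary entry, with the attributes the text
# describes. The example element and its values are assumptions.

@dataclass
class DictionaryEntry:
    name: str          # business name of the data element
    definition: str    # meaning in business terms
    source: str        # system of record it maps back to
    data_type: str     # format
    derived_from: list # upstream elements it is calculated from
    related_to: list   # other dictionary entries it links to
    controls: str      # how it is verified

exposure_at_default = DictionaryEntry(
    name="exposure_at_default",
    definition="Expected exposure to a counterparty at the point of default",
    source="credit risk engine",  # assumed source system
    data_type="decimal, reporting currency",
    derived_from=["drawn_balance", "undrawn_commitment", "credit_conversion_factor"],
    related_to=["counterparty_id", "facility_id"],
    controls="reconciled periodically to the general ledger",
)
```

Recording `derived_from` and `related_to` explicitly is what makes duplication and inconsistency visible when entries are compared across the dictionary.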

Data model

The completion of a data policy and data dictionary triggers the development of a logical data model for risk and finance, which establishes how the data is to be structured and how each data element is related to another. This data model then provides the foundation for all process and system change.
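A fragment of such a logical model might relate counterparties, facilities and transactions, with links out to the legal entity structure, product classifications and the chart of accounts mentioned earlier. The entity and field names below are assumptions for illustration only.

```python
from dataclasses import dataclass

# Illustrative fragment of a logical data model for risk and finance,
# showing how data elements relate. Entities and fields are assumptions.

@dataclass
class Counterparty:
    counterparty_id: str
    legal_entity: str       # links to the legal entity structure

@dataclass
class Facility:
    facility_id: str
    counterparty_id: str    # each facility belongs to one counterparty
    product_code: str       # links to product classification data

@dataclass
class Transaction:
    transaction_id: str
    facility_id: str        # each transaction draws on one facility
    gl_account: str         # links to the financial chart of accounts
    amount: float
```

Because every subsequent process or system change is validated against this model, relationships agreed here – for instance, that a transaction always resolves to a facility and a general ledger account – become constraints the technology must honour.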

Data improvement initiatives

In parallel with establishing good data governance and design approaches, it’s important to start fixing data and resolving any issues as soon as possible, while making sure these fixes are aligned and controlled through appropriate business governance.

Benefits of ‘data-first’

The key benefits of adopting this pragmatic approach include:
  • Faster results – putting data first enables companies to realise most of the business benefits quickly, without having to wait for a fully integrated finance and risk architecture to be developed.
  • Improved return on investment in technology – by ensuring that any future technology investment is closely aligned with business requirements and is using ‘good’ data, it is more likely to deliver the expected benefits.
  • Increased agility – demonstrating to stakeholders that data quality and the timeliness of processes are improving gives them greater confidence in the organisation’s ability to meet tightening reporting deadlines.
  • Greater insight – by focusing on data quality, fewer manual adjustments will be needed, which makes management information faster and cheaper to produce, as well as more accurate and focused on business needs.

Data at the centre

Good data is the lifeblood of any business and requires effective management like all other assets. Fixing the data first helps address the key organisational issues at the heart of today’s market demands for increased transparency and accuracy of reporting. It also gives a solid basis to drive more benefits from any future technology solutions, as well as making them significantly cheaper and less risky to implement.


Paul Morton and Symon Dawson are both advisory partners at PwC.