In the latest Informed Comment, Barnett Waddingham's Julie Walker gives five reasons why all schemes, employers and administrators should care about scheme data quality.
Long before 'future-proofing' was spoken of, members whose data is still accumulating today had their pension records written up on colour-coded index cards by the ladies in the typing pool.
Truly fit-for-purpose pension technology was slow to evolve, but the data kept on coming, and every five years or so the regulatory environment threw up something so fundamentally different – such as equalisation or auto-enrolment – that records had to be pulled apart and reassembled in a slightly different order.
There are many reasons to improve scheme data quality, and not only because the regulator has said so. My top five focus on the impact on key stakeholders in the pension scheme.
Benefit uncertainty for members
Pension calculations should never be based on a best guess. Accurate data are integral to every aspect of benefit calculations and payments. Once errors are incorporated into pension records, they are likely to compound and remain undetected for years.
It is impossible to quantify how many incorrect pensions or transfer values have already been paid, but the total must surely run to many millions of pounds across the entire pension population.
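To illustrate the compounding point, here is a minimal, purely hypothetical sketch: the pension amount, the 2% recording error, the 2.5% annual increases and the 20-year period are all assumptions chosen for illustration, not figures from any real scheme. It simply shows how a small overstatement at retirement, carried through annual increases, quietly accumulates into a material overpayment for a single member.

```python
# Illustrative sketch only: every figure below is a hypothetical assumption,
# not drawn from any real scheme data.
# It shows how a small error in a pension record, left uncorrected, grows
# once annual increases are applied to the wrong starting amount.

correct_pension = 10_000.00   # assumed correct pension at retirement (per year)
recorded_pension = 10_200.00  # assumed record overstated by 2% due to a data error
increase_rate = 0.025         # assumed annual pension increase of 2.5%
years_in_payment = 20         # assumed period over which the error goes undetected

cumulative_overpayment = 0.0
for year in range(years_in_payment):
    paid = recorded_pension * (1 + increase_rate) ** year  # what the scheme pays
    due = correct_pension * (1 + increase_rate) ** year    # what was actually due
    cumulative_overpayment += paid - due

print(f"Cumulative overpayment after {years_in_payment} years: "
      f"£{cumulative_overpayment:,.2f}")
# Roughly £5,100 overpaid to one member from a £200-a-year record error.
```

Multiply that kind of figure across a membership of thousands and the scale of the problem becomes clear.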
Administration transfers and costs
Pension administration is a fluid industry, and data errors often come to light when trustees appoint new administrators who carry out a thorough due diligence process.
Trustees generally claim ignorance of any pre-existing data issues, and the new administrator is often left to pick up the pieces, and much of the cost, of correcting historic issues.
The consequence of ignoring underlying data quality issues is that schemes will receive more complaints from members, all of which take time and money to resolve.
Higher funding costs for trustees and employers
Incorrect data, and the increased likelihood of incorrect benefits being used in the actuarial valuation, inevitably lead to either over- or understated calculations. This has a direct impact on deficit calculations and ongoing funding plans.
In 2009 the National Audit Office uncovered overpayments in the region of £90m across five public sector schemes, directly traceable to errors in contracting-out pension records. Private sector schemes do not have the luxury of absorbing data errors on this scale.
GMP reconciliations for HMRC
HM Revenue & Customs is the key external partner for every pension scheme, with a particular dependence on the National Insurance Services to Pensions Industry's role in dealing with guaranteed minimum pensions – a tricky, sticky, difficult-to-calculate statutory benefit clogging up the arteries of many pension schemes.
With the clock ticking on the countdown to 2018, when NISPI closes the door on GMP queries forever, trustees have very little time to reconcile their records.
As these reconciliation exercises compare data going back as far as 1978, the results often highlight severe failings in record-keeping, and trustees are likely to end up accepting the GMPs held by HMRC because they cannot prove the accuracy of their own figures either way.
A faster, cheaper PPF transfer or buyout
The prescriptive requirements for data content and quality when schemes wind up or go through Pension Protection Fund assessment mean that large parts of the wind-up or assessment budget are eaten up by data and benefit interrogation and rectification.
With current favourable economic conditions putting more trustees in a position to consider buyout, they will not want the process held up by inaccurate or incomplete data.
One of the key outcomes of the regulator’s record-keeping guidance and reviews has been to encourage a focus on the accuracy of data as well as its presence.
Data are integral to every aspect of pensions. Ultimately the benefit to all stakeholders is the ability to state confidently that the benefits members are entitled to have been correctly calculated, based on correct and robust data.
Julie Walker is an associate at Barnett Waddingham