By Jane Fuller, former financial editor of the Financial Times and co-director of the Centre for the Study of Financial Innovation thinktank
The IASB and FASB differ over how best to switch from an incurred loss model for loans to an expected loss one. While the IASB's is arguably the 'least bad' option, it will be a case of seeing which works best.
Goodbye convergence, hello competition. Now that the US has backed away from adopting International Financial Reporting Standards (IFRS), the latest transatlantic duel is over how to switch from an incurred loss model for loans to an expected loss one.
As chair of a committee responding to the plan from the International Accounting Standards Board (IASB), I felt a definite steer towards its 'deterioration approach'. So it was hard to give the 'lifetime loss approach' proposed by its US peer, the Financial Accounting Standards Board (FASB), a fair hearing.
This is a pity because the FASB version appears simpler. Its 'current expected credit loss model' offers a single measurement objective: assessing expected losses (EL) over the life of the loan. So on day one there is no impediment to recognising any losses, whereas the IASB model entails booking only 'a portion' of them, effectively those expected over a 12-month horizon.
As the loan progresses, expectations are reassessed and adjustments made to the loan loss allowance. A bank that expands its lending by making more loans and/or extending their maturity will have bigger upfront losses.
Objections to this include that a day-one loss is a nonsense. What management in its right mind – and let's assume chastened bankers are now closer to that – would lend at an immediate loss? Is it right that a growing bank has to book bigger upfront losses? Is there a perverse incentive to keep loans to a short maturity?
The IASB suggests there is no reason to make a growing lender look less profitable than one in a steady state. The obvious counter is that the growing bank is more risky – and that should be reflected in the accounting.
It should be remembered that the IASB made itself vulnerable to US divergence by proposing a confusing ‘three-bucket’ approach to impairment. The deterioration model still has a trigger that switches loans from one bucket, where only a portion of EL are provided for, to another that allows the full lifetime losses. But the trigger sounds rather fussy – ‘a sufficient deterioration in credit quality’.
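The day-one contrast between the two proposals can be made concrete with a back-of-envelope sketch. Everything here is invented for illustration – the probability of default, loss given default and loan figures are not drawn from either standard, and both boards' proposals involve far more judgement than a constant annual default rate – but it shows why the FASB model front-loads a much bigger allowance, and how the IASB's deterioration trigger switches a loan from the 12-month portion to full lifetime losses.

```python
def lifetime_el(annual_pd, lgd, exposure, years):
    """Expected loss over the loan's whole life, assuming (for illustration)
    a constant annual probability of default (PD) and loss given default (LGD)."""
    survival = (1 - annual_pd) ** years          # chance the loan never defaults
    return exposure * lgd * (1 - survival)

def twelve_month_el(annual_pd, lgd, exposure):
    """Expected loss from defaults within the next 12 months only."""
    return exposure * lgd * annual_pd

def iasb_allowance(annual_pd, lgd, exposure, years, deteriorated):
    """IASB deterioration model: book only the 12-month portion until credit
    quality has deteriorated sufficiently, then the full lifetime losses."""
    if deteriorated:
        return lifetime_el(annual_pd, lgd, exposure, years)
    return twelve_month_el(annual_pd, lgd, exposure)

# Hypothetical 5-year loan: 2% annual PD, 40% loss given default
exposure, annual_pd, lgd, years = 1_000_000, 0.02, 0.40, 5

fasb_day_one = lifetime_el(annual_pd, lgd, exposure, years)        # ~38,432
iasb_day_one = iasb_allowance(annual_pd, lgd, exposure, years,
                              deteriorated=False)                  # 8,000

print(f"FASB day-one allowance: {fasb_day_one:,.0f}")
print(f"IASB day-one allowance: {iasb_day_one:,.0f}")
```

On these made-up numbers the FASB allowance is nearly five times the IASB one on day one; the gap closes only when the trigger – 'a sufficient deterioration in credit quality' – moves the loan into the lifetime bucket.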
Forecasting full lifetime losses at the outset of a loan is also fuzzy, so you have to pick which of the approaches offers better information about credit quality and is less easily gamed.
The principle should be that the accounting reflects economic reality; indeed, that is what the incurred loss model did. Banks are cyclical. They make a profit on a loan until it goes sour: the cliff edge is there. This can be anticipated with the help of experience – the EL idea – and postponed through forbearance, but it is not a smooth business.
Since the incurred loss model was used as an excuse for foot-dragging on loss recognition, the move to EL has broad support. But it should not provide an opportunity for a return to ‘general provisions’ that can later be fed back in to flatter profits.
The FASB promises that investors will receive plenty of information about changes in credit quality through the lenders’ regular reassessment of loss expectations. But this still means the analysis of profits will be done through the prism of movements in and out of the provision pool, at a remove from the actual performance of the loans.
There is a suspicion, denied by the FASB, that prudential regulators have applied pressure for more upfront provisioning. Accounting should remain neutral in this. It is bank boards, prompted by much tighter prudential requirements, that need to ensure enough profits are retained to absorb expected – and unexpected – losses.
So the IASB’s hybrid looks the least bad option. We are back in a world of competing standards, so let’s see which works best.
This post first appeared in Accounting and Business UK, December 2012