1 Introduction
The 2007–9 global financial crisis brought to light several cases of financial accounting misreporting in the banking industry. The push by the U.S. Securities and Exchange Commission (SEC) to step up its policing of accounting fraud led to a surge of cases and investigations. For example, First National Community Bancorp's methodology for determining the amount of impairment on securities was found not to comply with Generally Accepted Accounting Principles (GAAP) and, hence, the bank understated losses in 2009 and 2010 (SEC 2015). Similarly, Bank of America Corp. failed to deduct certain realised losses when it calculated and reported its regulatory capital, meaning that capital was overstated in its SEC filings from 2009 to 2013 (SEC 2014). In the same vein, Fifth Third Bancorp failed to record substantial losses during the crisis by not properly considering a portion of its non-performing loan portfolio (SEC 2013).
In several instances, auditors failed to alert the authorities to considerable discrepancies in the books of banks. The Federal Deposit Insurance Corporation (FDIC) sued PricewaterhouseCoopers (PwC) for $1 billion for not detecting a massive accounting fraud that brought down Colonial Bank in 2009. PwC was blamed for missing huge holes in Colonial's balance sheet and for never detecting the multibillion-dollar fraud at Taylor Bean & Whitaker Mortgage Corp., Colonial's largest client. Furthermore, two KPMG auditors received suspensions for failing to scrutinise loan loss reserves at TierOne Bank, which also went bankrupt during the crisis. Along the same lines, just eight months prior to the demise of Lehman Brothers, Ernst & Young's auditors remained silent about the repurchase transactions that significantly disguised the bank's leverage.
This paper evaluates the financial statements of U.S. banks by employing a mathematical law established by Frank Benford (Benford 1938) to detect data manipulation. If an accounting figure is manipulated upwards to increase the digit in the first position by one, a higher-than-anticipated proportion of low digits and a lower-than-expected frequency of high digits will occur in the second position. This describes an uncommon digital pattern that violates Benford's Law and indicates upward data manipulation; if the converse holds true, the data have been manipulated downwards. An upward manipulation pattern suggests that when an accounting figure starts with the digit eight or nine, it is more likely to be rounded up to a figure whose first digit is one. In the context of downward manipulation, the value of an accounting figure whose first digit is one is lowered to a figure that starts with the digit nine.
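The digit frequencies implied by Benford's Law follow directly from its logarithmic formula: the probability of first digit d is log10(1 + 1/d), and each second-digit probability is obtained by summing over all possible first digits. A minimal Python sketch (our illustration, not code from the paper):

```python
import math

def benford_first_digit(d: int) -> float:
    """Expected frequency of digit d (1-9) in the first position."""
    return math.log10(1 + 1 / d)

def benford_second_digit(d: int) -> float:
    """Expected frequency of digit d (0-9) in the second position,
    summed over all possible first digits k = 1..9."""
    return sum(math.log10(1 + 1 / (10 * k + d)) for k in range(1, 10))

# Low digits dominate the first position (P(1) is about 30.1%, P(9) about
# 4.6%), while second-digit frequencies are flatter (P(0) about 12.0%,
# P(9) about 8.5%), which is why second-position deviations are subtle
# but still detectable in large samples.
print([round(benford_first_digit(d), 4) for d in range(1, 10)])
print([round(benford_second_digit(d), 4) for d in range(10)])
```

A figure rounded up from a leading eight or nine to a leading one leaves too many low second digits relative to these benchmarks, which is the footprint the tests below look for.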
Despite substantial progress in the areas of financial misreporting and accounting fraud, existing methods and metrics have several deficiencies that limit their usefulness. The use of Benford's Law is deemed superior to other traditional approaches as it overcomes some of the key concerns surrounding them. Prior work has utilised Benford's Law to identify accounting fraud and data manipulation in corporate firms (e.g., Alali and Romero 2013). Prior literature has also applied Benford's Law in the realms of financial audits (Durtschi et al. 2004), taxes (e.g., Nigrini 1996), macroeconomic indicators (e.g., Gonzales-Garcia and Pastor 2009), and interest rates (e.g., Ashton and Hudson 2008). However, no paper has yet applied the Law to identify irregularities in financial reporting in the banking industry, making this a key innovation of our study.
Banks are special entities from various perspectives and, as such, their accounting books require special attention. The banking sector is one of the most heavily regulated sectors in the economy, and a key regulatory measure is the capital adequacy requirement that provides loss absorbency. Troubled banks have incentives to overstate equity and regulatory capital, particularly during economic downturns (e.g., Vyas 2011; Huizinga and Laeven 2012). Furthermore, since banks' capital requirements are linked to the riskiness of their portfolios, bank accountants face incentives to overstate the quality of assets and understate the relevant risks and potential losses. Banks' high leverage provides additional incentives to remain undercapitalised, since the benefits of issuing new equity primarily accrue to creditors, and also to make excessively risky investment decisions (Jensen and Meckling 1976). These moral hazard incentives may be exacerbated for larger banks that expect to be bailed out when insolvent. Moreover, banks are inherently opaque (Morgan 2002; Flannery et al. 2013): their lending decisions are based on private information about borrowers and projects that is not available to those outside the bank (e.g., Diamond 1984). Additionally, in contrast to corporate firms, banks are engaged in a broad array of trading activities that produce non-interest income and make banks relatively more complex (Morgan 2002; Laeven 2013). In this context, accruals for banks reflect different considerations than those that drive accruals for corporate firms (Cohen et al. 2014). Lastly, since banks finance illiquid loans with liquid deposits, they have incentives to show to depositors and other creditors that their portfolios are highly liquid.
To capture the complexity of the functions that banks perform, as presented above, and the likely manipulation in their relevant accounting figures, we shed light on a wide array of balance sheet and income statement variables used by U.S. authorities to monitor the performance and soundness of banks. We analyse the components of the CAMELS rating system that capture capital adequacy, asset quality, earnings strength, and the level of liquidity, which managers have inherent incentives to manipulate. This more comprehensive analysis distinguishes our paper from the bulk of the Benford's Law literature that focuses on the manipulation of earnings and earnings-related figures of corporate firms (e.g., Carslaw 1988; Thomas 1989; Niskanen and Keloharju 2000; Van Caneghem 2002). The paper also differs from the banking literature that explores how managers use accounting discretion to manage earnings by employing various loan loss provisioning discretion models (e.g., Ahmed et al. 1999; Beatty et al. 2002; Cohen et al. 2014).
Poor accounting data quality and weak disclosure practices, combined with possible data manipulation in the banking sector, have contributed to the propagation and prolongation of the recent crisis and may plant the seeds of the next financial turmoil. Against this backdrop, an additional innovation of our paper is that we test for data manipulation both prior to the outbreak of the crisis and during the crisis. The crisis offers an excellent research environment that enables us to turn the spotlight on any discrepancies in financial reporting across banks in different financial conditions and to explore how Benford's Law can be used to document those discrepancies.
We report several interesting results, which are all robust to a number of sensitivity and placebo tests. Banks utilise loan loss provisions to manipulate earnings and interest income upwards throughout the two examined periods regardless of their level of soundness. In the case of bailed-out banks, non-interest income is also found to be manipulated upwards in both periods. Together with loan loss provisions, failed and bailed-out banks resort to a downward manipulation of their allowance for loan losses and non-performing loans. Furthermore, manipulation is found to be more prevalent in distressed banks, which have stronger incentives to conceal their financial difficulties. Importantly, manipulation is both magnified during the crisis period and also expanded to affect regulatory capital. Overall, banks utilise data manipulation to disclose an artificially improved picture of their performance.
The rest of the paper proceeds as follows. Section 2 describes the theoretical underpinnings of Benford's Law, presents the relevant literature, and clarifies how Benford's Law addresses the limitations of some previous methods. The data and the tools we employ in our empirical analysis are described in Section 3. Section 4 presents our main empirical results, Section 5 performs a series of robustness checks, and Section 6 concludes.
5 Robustness Analysis
We carry out a number of different tests to validate the robustness of our findings. We start by testing the sensitivity of our results to a set of alternative CAMELS components. Book equity capital (EQUITY) is used to measure capital adequacy; asset quality is measured by non-performing loans (NPL) and by allowances for loan losses (ALLOW); earnings strength is based on total earning assets (EARNAS); and liquidity is captured by federal funds purchased and securities sold under agreements to repurchase (REPOS). Variables and data sources are described in Table 2.
The results for the pre-crisis period in Table 7 reveal the occurrence of positive (negative) and significant proportional deviations in digits zero and one (eight and nine) for EARNAS across the three groups of banks. A downward bias is reported in ALLOW for the groups of distressed banks, as there are more nines and fewer zeros than anticipated in the second digit; deviations are significant at the 5% level. The goodness-of-fit chi-square test produces significant results for both EARNAS and ALLOW. No indication of manipulation is found for bank capital (EQUITY) or liquidity (REPOS) in the pre-crisis period, which corroborates our baseline results.
Table 7
Robustness test: Digital analysis of the second position over the pre-crisis period
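The two test statistics used throughout this section follow standard digital-analysis practice: a per-digit z-statistic (with a continuity correction) comparing each observed second-digit proportion with its Benford expectation, and a chi-square goodness-of-fit statistic over all ten digits (nine degrees of freedom). The Python sketch below is our own illustration under these standard formulations; the function names and the second-digit extraction via scientific notation are not the paper's code:

```python
import math
from collections import Counter

def benford_second(d: int) -> float:
    """Benford expectation for digit d (0-9) in the second position."""
    return sum(math.log10(1 + 1 / (10 * k + d)) for k in range(1, 10))

def second_digits(values):
    """Second significant digit of each non-zero value."""
    out = []
    for v in values:
        mantissa = f"{abs(v):e}".split("e")[0].replace(".", "")
        out.append(int(mantissa[1]))
    return out

def digit_z_statistics(values):
    """Per-digit z-statistics for the second position, with the usual
    continuity correction of 1/(2n)."""
    digits = second_digits(values)
    n, counts = len(digits), Counter(digits)
    zs = {}
    for d in range(10):
        p_obs, p_exp = counts[d] / n, benford_second(d)
        se = math.sqrt(p_exp * (1 - p_exp) / n)
        zs[d] = (abs(p_obs - p_exp) - 1 / (2 * n)) / se
    return zs

def chi_square_stat(values):
    """Goodness-of-fit chi-square over the ten second-position digits."""
    digits = second_digits(values)
    n, counts = len(digits), Counter(digits)
    return sum((counts[d] - n * benford_second(d)) ** 2 / (n * benford_second(d))
               for d in range(10))
```

Absolute z values above 1.645, 1.96, and 2.576 correspond to the 10%, 5%, and 1% significance levels, while the chi-square statistic is compared against the critical values of a chi-square distribution with nine degrees of freedom.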
To the extent that loan loss allowances are included in the regulatory capital of banks, managers have incentives to manipulate ALLOW downwards with the purpose of demonstrating that a strong capital cushion has been set aside to absorb shocks in a financial turmoil. Under Statement of Financial Accounting Standards No. 5 (FASB 1975), entitled 'Accounting for Contingencies', when credit losses are probable and can be reasonably estimated, an expense called loan loss provisions and a contra-asset (to earning assets outstanding) called loan loss allowances should be recorded on banks' accounting books. That is, by manipulating loan loss provisions downwards, as documented in our baseline analysis, banks are inclined to follow the same or similar strategies with loan loss allowances and, by doing so, are able to artificially increase the volume of earning assets.
Turning to the crisis period, the results in Table 8 reveal an upward manipulation of EARNAS for all three groups of banks; manipulation is stronger compared to the pre-crisis period. Regarding asset quality, the z-statistics demonstrate significantly negative deviations in digit zero and significantly positive deviations in digit nine for both ALLOW and NPL for the two groups of distressed banks. The chi-square test confirms the statistical validity of these deviations. The downward manipulation in NPL can be explained by the direct relation between non-performing loans and loan loss provisions: higher levels of non-performing loans imply troubles in the loan portfolio of banks, and these troubles are normally reflected in higher loss provisions, as documented in our baseline analysis (Kanagaretnam et al. 2010). Interestingly, we find no indication of manipulation in EQUITY. This means that banks are mainly interested in appearing to be sufficiently capitalised by solely manipulating their regulatory capital (TIER1) upwards. Lastly, by applying Benford's Law to the first position of EARNAS, ALLOW, and NPL, we corroborate the rather subtle manipulation patterns detected in our baseline analysis.
Table 8
Robustness test: Digital analysis of the second position over the crisis period
As an additional robustness test, we consider Nigrini's (1996) Distortion Factor (DF) model, which indicates whether data are overstated or understated and also estimates the extent of manipulation. The DF model compares the mean of the actual numbers in a data set with the mean of the expected numbers under Benford's Law. Since a data set that closely approximates the Law has no unique mean, as it may consist of relatively large or small numbers, Nigrini (1996) suggests moving the decimal point of each actual number so that every number falls into the interval [10,100). More concretely, each number below 10 is expanded to a number within the range [10,100), while each number of 100 or above is collapsed to a number within that range. For example, the number 4.29 expands to 42.9, while the number 1040 collapses to 10.4. A comparison is then made between the mean of the set of actual numbers scaled to the [10,100) range and the mean of the set of expected numbers that conform to the Law.
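The decimal-point shift can be written compactly: dividing a positive number by 10 raised to the floor of its base-10 logarithm maps it to [1, 10), and multiplying by 10 then places it in [10, 100). A minimal Python sketch of this scaling step (our illustration, assuming strictly positive inputs):

```python
import math

def scale_to_decade(x: float) -> float:
    """Move the decimal point of a positive number x so that the
    result falls in the interval [10, 100)."""
    return abs(x) / 10 ** math.floor(math.log10(abs(x))) * 10

print(scale_to_decade(4.29))    # expands to ~42.9
print(scale_to_decade(1040))    # collapses to ~10.4
print(scale_to_decade(0.0007))  # expands to ~70.0
```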
The Actual Mean (AM) of the n collapsed or expanded data values is:
$$ AM=\frac{\sum X}{n}, $$
(6)
where X stands for the collapsed or expanded data values, and n is the number of observations of the examined variable. The Expected Mean (EM) of Benford's distribution is:
$$ EM=\frac{90}{n\left({10}^{\frac{1}{n}}-1\right)} $$
(7)
We can now calculate DF as follows:
$$ DF=\frac{\left( AM- EM\right)100}{EM} $$
(8)
Equation (8) reflects the average percentage manipulation of the examined data. When DF is positive (negative), an upward (downward) manipulation is detected. Since AM and EM are the means of n random variables, the distribution of DF approaches the normal distribution according to the central limit theorem. Hence, the z-statistic that tests the null hypothesis that AM equals EM can be computed for a relatively large n.
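Putting Equations (6)-(8) together, the DF model can be implemented in a few lines. The sketch below is illustrative: the scaling, AM, EM, and DF follow the formulas above, while the accompanying z-statistic uses the sample standard deviation of the scaled data as a large-sample simplification (Nigrini's original derivation works with the theoretical moments of the Benford distribution):

```python
import math

def distortion_factor(values):
    """Nigrini's Distortion Factor model.

    Returns (DF in percent, z-statistic). Positive DF flags upward
    manipulation, negative DF downward manipulation.
    """
    # Collapse/expand every non-zero value into [10, 100).
    scaled = [abs(x) / 10 ** math.floor(math.log10(abs(x))) * 10
              for x in values if x != 0]
    n = len(scaled)
    am = sum(scaled) / n                     # Actual Mean, Eq. (6)
    em = 90 / (n * (10 ** (1 / n) - 1))      # Expected Mean, Eq. (7)
    df = (am - em) * 100 / em                # Distortion Factor, Eq. (8)
    # Large-sample z-statistic via the CLT; the sample standard deviation
    # here is a simplification of Nigrini's theoretical-moment derivation.
    s2 = sum((x - am) ** 2 for x in scaled) / (n - 1)
    z = (am - em) / math.sqrt(s2 / n)
    return df, z

# A sample whose mantissas are exactly logarithmically spaced conforms to
# Benford's Law, so its DF should be numerically zero.
benford_sample = [10 ** (i / 1000) for i in range(1000)]
df, z = distortion_factor(benford_sample)
print(round(df, 6), round(z, 6))
```

The check at the end exploits the fact that EM in Equation (7) is exactly the mean of n geometrically spaced points in [10, 100), so a perfectly Benford-conforming sample yields DF of zero and manipulated data pull DF away from zero in the direction of the manipulation.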
To enhance the validity of this robustness test, we account for possible selection bias in our sample of banks. Therefore, we exclude the distressed banks that were not expected to be allowed to fail, as they were systemically important. The managers of those banks had, in principle, little incentive to resort to manipulation. On the same day that the U.S. Treasury launched TARP/CPP, nine of the largest banks, which together accounted for approximately 55% of U.S. banks' assets, were 'arm-twisted' by the authorities to participate in the programme. Those banks were: Bank of America, Citigroup, JP Morgan Chase, Wells Fargo, Morgan Stanley, Goldman Sachs, Bank of New York Mellon, State Street, and Merrill Lynch. We also exclude Washington Mutual since its largest subsidiary was a thrift institution and not a commercial or savings bank. By excluding these banks from our sample, we remove the impact of extreme values and outliers that may affect the observed frequencies and the relevant digital analysis.
Table 9 presents the DF values in percentage terms for all ten variables under scrutiny for the three groups of banks over the two periods. The reported signs and magnitudes of DF confirm the results obtained in our baseline analysis. Total earning assets (EARNAS) are significantly distorted upwards by 7.41%, 6.93%, and 6.53% for the non-distressed, failed, and bailed-out banks, respectively, in the pre-crisis period. The relevant percentages in the crisis years are 7.82%, 9.30%, and 8.81% for the three banking groups, respectively. A very similar distortion pattern is documented for total interest income (INTINC). Furthermore, a significant upward distortion of 7.42% in the pre-crisis period and 8.94% in the crisis period is documented for the non-interest income (NINTINC) of bailed-out banks, which is also in line with our baseline findings. We further confirm that manipulation is magnified and expanded after the outbreak of the crisis. All banks are found to be involved in an upward manipulation of core regulatory capital during the crisis period. Indeed, TIER1 is significantly manipulated upwards, and the extent of manipulation is higher for the distressed institutions. Moreover, loan loss allowances (ALLOW) are significantly distorted downwards by failed and bailed-out banks in the pre-crisis period (−5.94% and −6.20%, respectively). The magnitude of distortion is enhanced in the crisis period (−8.53% and −7.64%, respectively). Loan loss provisions (LLP) are also found to be distorted downwards by all banks across the two time periods. The reported distortion is significant at the 5% level and is stronger in the crisis period. As regards non-performing loans (NPL), these are manipulated downwards by the set of distressed banks not only in the crisis period (as evidenced in the baseline analysis), but also in the pre-crisis period.
Table 9
Distortion factor model. Each cell reports the DF value (%) followed by its z-statistic; the ten columns correspond to the variables described in Table 2.
Panel A: Pre-crisis period |
Non-distressed banks (obs = 6,302) | 1.61 0.49 | 1.33 1.06 | -0.78 -0.50 | -2.38 -1.46 | -3.19 -1.72* | 7.41 1.78* | 5.31 2.14** | -1.01 -0.47 | 0.75 1.27 | 0.89 0.55 |
Failed banks (obs = 448) | -1.05 -0.67 | 2.11 1.18 | -2.95 -1.80* | -5.94 -2.27** | -3.51 -1.96** | 6.93 2.40** | 7.89 2.35** | 1.18 1.50 | -1.08 -0.79 | 1.64 1.30 |
Bailed-out banks (obs = 815) | 1.89 1.04 | 1.97 0.85 | -3.76 -1.93* | -6.20 -1.91* | -6.63 -2.38** | 6.53 2.68*** | 6.21 2.14** | 7.42 1.86* | -1.33 -1.10 | 0.82 1.25 |
Panel B: Crisis period |
Non-distressed banks (obs = 6,302) | 1.90 0.52 | 5.89 1.74* | -1.38 -0.72 | -3.58 -1.60 | -3.85 -1.94* | 7.82 1.97** | 5.12 2.08** | -0.75 -0.50 | 0.38 1.45 | 1.38 1.06 |
Failed banks (obs = 448) | -2.19 -0.58 | 7.41 1.85* | -6.44 -1.91* | -8.53 -2.42** | -4.98 -2.31** | 9.30 3.17*** | 8.75 2.48** | 1.39 1.28 | -0.96 -0.59 | 2.16 1.48 |
Bailed-out banks (obs = 815) | 2.30 1.21 | 8.09 2.27** | -6.79 -2.04** | -7.64 -2.10** | -7.53 -2.17** | 8.81 2.80*** | 6.93 3.61*** | 8.94 2.31** | -2.00 -1.29 | 0.77 1.14 |
To further enhance the robustness of our results, we confirm the validity of the crisis effect and of the distressed-banks effect by carrying out two placebo tests. Lastly, the ability of Benford's Law to predict data manipulation is corroborated by an out-of-sample prediction analysis.
6 Conclusions
Manipulation may transmit noise to the operation of the banking sector. It distorts the expectations of shareholders and other investors about individual bank valuations and the overall financial conditions in the sector. It also misleads authorities in their task of identifying and addressing any problems in the operation of banks and of unravelling any sources of instability for the whole industry. We utilise Benford's Law to test whether and to what extent a set of fundamental balance sheet and income statement data were manipulated prior to and during the crisis.
Several interesting findings are reported in our baseline analysis and are corroborated by robustness checks. Banks, regardless of their financial condition, appear to utilise loan loss provisions to manipulate earnings and interest income upwards throughout the two periods. For bailed-out banks, non-interest income, which is a key income component in their business model, is also manipulated upwards in both periods. Furthermore, distressed (bailed out and failed) banks resort to the downward manipulation of allowances for loan losses and non-performing loans and this, in combination with the delay in loan loss provisions, enables them to artificially increase their reported earnings. As such, manipulation is more evident in distressed institutions. Moreover, manipulation is magnified in the crisis period for all the examined banks. It also encompasses regulatory capital.
In sum, banks’ data manipulation decreases the reliability of accounting information, erodes confidence, and undermines their credibility. Manipulation yields a distorted view of the financial conditions and the health of banks, which may have the effect of increasing regulatory forbearance. It is, therefore, crucial for authorities and other outsiders to find ways to reduce manipulation. Our results call for a more in-depth evaluation of the quality of the bank accounting information by applying a Benford’s Law-type analysis, which can assist authorities in deterring manipulation both under normal financial conditions, and during financial debacles when the phenomenon is exaggerated.
Acknowledgements
We thank the Organising Committee of the 23rd MFS Conference 2016 for awarding Nikolaos I. Papanikolaou with the Best Young Researcher Award. We also thank the participants in the FMA European Meeting 2015 as well as the Scientific Committee members for nominating the paper for one of the two Best Paper Awards. Special thanks to the participants in the XXIV International Rome Conference on Money, Banking and Finance and to those in the Financial Intermediation Network of European Studies (FINEST) 2016 and 2018 Workshops for their valuable comments and suggestions. We are indebted to Nicola Cetorelli, Franco Fiordelisi, William (Bill) Megginson, Marco Pagano, and George Pennacchi for their insightful discussions, which helped us to improve the quality of our work. Furthermore, the paper has benefited from comments by Olivier De Jonghe, Bill Francis, Claudio Giannotti, Dimitrios Gounopoulos, Iftekhar Hasan, Saverio Stentella Lopes, Phillip Molyneux, Daniele Angelo Previati, Kostas Tsatsaronis, Piet Usselmann, and Stefano Zedda. Also, we would like to thank Aineas Mallios and Mounir Shal for their assistance in data collection and refinement. We also thank the Faculty of Management of Bournemouth University for financial support via the Mid-Career Seedcorn Funding scheme. Our sincere thanks to an anonymous reviewer of the Journal of Financial Services Research for their incisive comments. Special thanks are extended to Christopher Hartwell and Tim Lloyd for their professional assistance in proofreading the paper.
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.