3.1 Data collection and keywords
There are only three providers of international citation data: Web of Science (WoS), Scopus and Google Scholar (GS), all of which are commercial. In this paper, we aim to find the journals most cited in research under the umbrella of management accounting. To do this, we first identify management accounting research by searching the databases for articles matching selected keywords. Then, these articles’ bibliographies are analysed. We extract the cited sources from the bibliographies; they are the foundation for the final journal ranking. In this context, a source means a journal.
WoS (formerly The Institute for Scientific Information (ISI)) was established in 1960 and is now owned by Thomson Reuters. Its indices go back to 1945. Scopus was established by Elsevier in 2004 and covers more journals than WoS, but its indices only go back to 1960. It is important to note that coverage of each journal is not necessarily complete. For instance, Scopus indexes all volumes of AOS and MAR, while WoS only indexes MAR from 2008 and AOS from 1981. GS is Google’s academic search engine. In addition to indexing journals, GS also indexes non-scholarly documents such as government reports and student theses. However, there is no transparency regarding exactly what GS covers.
The first challenge is to define management accounting. As Robert Scapens stated in his final editorial in MAR: (1) ‘We deliberately avoided defining “management accounting” as this could restrict the development of the journal.’ (2) ‘I have taken the view that the scope of management accounting is defined by the papers which are in the journal.’ (Scapens 2014: 246). Therefore, in this article, we proceed from an understanding of management accounting in the vein of the German Controlling, the Swedish Ekonomistyrning, and the Danish and Norwegian Økonomistyring. These concepts are much broader than management accounting and embrace performance management, management control, and performance measurement (see for instance Schäffer (2013), Günther (2013) and Berg (2015)). In addition, the terms management accounting and management control are sometimes used interchangeably (Chenhall 2003). Furthermore, Ferreira and Otley (2009) use the construct of performance management systems as superior to, and embracing, management control and hence management accounting as well. Thus, research under the heading performance management may well be management accounting. Also, management control has numerous analytical conceptualizations (Strauss and Zecher 2013; Berens 2014). The title of the present journal itself also indicates that management control should be included. Furthermore, we include performance measurement, both because it is part of the CIMA Official Terminology and because the journal Management Accounting Research dedicated a special issue to the topic (MAR 25(2)). Moreover, according to Franco-Santos and Otley (2018: 696), performance management systems involve performance planning, measures, targets, incentives and other means of control. We also acknowledge that performance management and performance measurement have no straightforward interpretation; see for instance Bourne et al. (2018).
As noted above, in German-speaking countries ‘controlling’ is widely accepted as a general label for management accounting and management control systems (Günther 2013). However, searching Google Scholar for ‘controlling’ returns 4.90 million hits, including articles in Physical Letters and Nature on topics far from management accounting. Searching ‘controller’ returns 4.88 million hits, many related to micro-controllers, i.e. small computers on a single integrated circuit. Thus, we added the prefix ‘business’, as in ‘business control’ and ‘business controlling’. We excluded personifications such as management accountant, business controller, CFO, or financial manager, as such job titles are numerous and not used consistently.
One option might be to include terms such as ‘budgeting’, ‘costing systems’ or ‘the balanced scorecard’, which fits with the reasons for applying management accounting offered by organizational economics (Samuel 2018). However, selecting terms at this level of detail may be a foolhardy exercise. As noted by Ax and Bjørnenak (2007), 250 (59.2%) of the concepts listed in the 12th edition of Cost Accounting (Horngren et al. 2005) are new compared to those in the first edition (Horngren 1982). Which search terms to include, and which to exclude, in order to obtain a good data set is thus a central and challenging question. In this paper, we chose generic umbrella terms rather than a system’s type of use, level of sophistication, or organizational outcomes (Günther and Heinicke 2019). Yet, we acknowledge that where we draw the line for search terms will influence the outcome.
The eleven search terms are presented in Table 1:
Table 1 Keywords (search terms)
  Management accounting
  Managerial accounting
  Managerial economics
  Management accounting and control
  Cost management
  Cost accounting
  Management control
  Performance management
  Performance measurement
  Business control
  Business controlling
We performed topical searches in the Science Citation Index Expanded (SCI-EXPANDED) and Social Science Citation Index (SSCI) in WoS for the years 1945 through 2018, refining the search to articles only. In Scopus, we conducted document searches, also only covering articles published between 1960 and 2018. In GS, we also searched the selected terms for the time span 1945–2018. Including only full years makes searches reproducible. Yet, reproducing searches in GS is difficult as the index changes rapidly.
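The combined search can be expressed as a single boolean query built from the keyword list in Table 1. As a sketch (the `TS=` field tag is WoS topic-search syntax, covering title, abstract and keywords; Scopus uses `TITLE-ABS-KEY(...)` instead), the query string can be assembled programmatically:

```python
# Build a WoS-style topic-search query from the eleven keywords in Table 1.
# TS= (topic search) covers the title, abstract and keyword fields in WoS.
KEYWORDS = [
    "management accounting",
    "managerial accounting",
    "managerial economics",
    "management accounting and control",
    "cost management",
    "cost accounting",
    "management control",
    "performance management",
    "performance measurement",
    "business control",
    "business controlling",
]

def build_topic_query(keywords):
    """Join quoted phrases with OR into one topic-search expression."""
    quoted = ['"{}"'.format(k) for k in keywords]
    return "TS=({})".format(" OR ".join(quoted))

query = build_topic_query(KEYWORDS)
print(query)
```

Quoting each phrase forces an exact-phrase match, so ‘management control’ does not also match every article containing the two words separately.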
3.2 Issues regarding searching databases and downloading data
The data collection is comprehensive and time-consuming, and several potential sources of error are present. The searches conducted in WoS and Scopus look for the search terms in the title, abstract, and keyword fields. This kind of limited metadata search is not possible in GS; the only options are to search anywhere in the article or only in the title field, and to include any kind of document found in the index, not only articles. The metadata quality in GS is simply poor, as demonstrated for instance by Jacsó (2010).
Searching only the title field for our keywords in GS retrieved 15,500 articles; searching anywhere retrieved 17,300. As many articles do not use our search terms in their titles, we chose the anywhere-in-the-article results as a starting point, as this includes more relevant documents without too many irrelevant ones. However, when limiting the search year by year to trace the development of citations, we experienced a rather odd outcome. Limiting the search to the timespan 1945–2018 returned 629,000 articles, i.e. more than the search not limited by years. In addition, the year-by-year searches for 1945–2018 add up to more than 260,000 articles, and searching the eleven terms one by one adds up to 1,962,120 hits. Yet, GS does not allow more than nine search terms at a time. This clearly demonstrates that GS should be used with the utmost caution for controlled searches.
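Anomalies of this kind can be detected mechanically: in a consistent index, partitioning a search into disjoint year slices can never yield more hits in total than the unpartitioned search. A minimal sketch of such a sanity check, using illustrative counts rather than the actual GS figures:

```python
def partition_exceeds_total(total_hits, partition_hits):
    """Flag the anomaly observed in GS: the parts summing to more than
    the whole. For disjoint slices of a consistent index,
    sum(partition_hits) <= total_hits must always hold."""
    return sum(partition_hits) > total_hits

# Illustrative numbers only (hypothetical, not the GS counts reported above):
total = 17_300
per_year = [400] * 74  # 74 year slices, 1945-2018
print(partition_exceeds_total(total, per_year))  # 29,600 > 17,300 -> True
```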
Indeed, the data quality of GS has been thoroughly criticised; see for instance Jacsó (2005, 2006, 2010). This, Aguillo (2012) argues, means that citation data from GS is not comparable to data from traditional bibliometric databases (WoS and Scopus). However, as Stuart (2014) says, GS ‘should be seen as providing additional, complementary metrics’ (p. 130) based on citations from more diverse publications or formats previously underrepresented in bibliometric analyses. In addition, Harzing and Alakangas (2016) found that all three citation databases ‘provide sufficient stability of coverage’ (p. 802) and included GS in their study. This, and the fact that GS is very popular among researchers, are our reasons for including GS.
However, there are also issues regarding WoS and Scopus. Historically, storing data used to be expensive; the data was therefore compressed. This means, for instance, that journals are represented by abbreviations, but the use of abbreviations is not consistent or universal. For instance, we found 10 abbreviations representing the journal Strategic Finance in the WoS data set. Altogether, we found 36 journals with more than one abbreviation. There is no list of abbreviations available from the database providers; abbreviations must therefore be checked manually by the researcher.
To overcome this problem, we constructed one thesaurus file for each database to merge the data. By searching for the abbreviations in the citation data source, we were able to verify the full titles. However, journal titles are not spelled consistently; for instance, ‘organisation’ and ‘organization’ are used interchangeably. Furthermore, some journals have changed their name several times over the years; Strategic Finance is on its sixth name since the 1920s. This is also handled in the thesaurus files. Nevertheless, we acknowledge that we have almost certainly overlooked some abbreviations in the current paper.
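A VOSviewer thesaurus file is a tab-separated table with two columns, label and replace by, mapping each variant (abbreviation, spelling, or former name) to one canonical journal title. The merging step can be sketched as follows; the abbreviation variants shown are illustrative examples, not our full thesaurus:

```python
import csv
import io

# A VOSviewer-style thesaurus: tab-separated, with "label" and "replace by"
# columns mapping each variant to one canonical journal title.
THESAURUS_TSV = """label\treplace by
strateg financ\tStrategic Finance
strategic financ\tStrategic Finance
account organ soc\tAccounting, Organizations and Society
accounting organizations and society\tAccounting, Organizations and Society
"""

def load_thesaurus(tsv_text):
    """Parse the two-column thesaurus into a variant -> canonical dict."""
    reader = csv.DictReader(io.StringIO(tsv_text), delimiter="\t")
    return {row["label"].lower(): row["replace by"] for row in reader}

def canonical_title(cited_source, thesaurus):
    """Map a cited-source string to its canonical journal title;
    unknown variants pass through unchanged."""
    key = cited_source.strip().lower()
    return thesaurus.get(key, cited_source.strip())

thesaurus = load_thesaurus(THESAURUS_TSV)
print(canonical_title("STRATEG FINANC", thesaurus))  # Strategic Finance
```

Letting unknown variants pass through unchanged is deliberate: an unmapped abbreviation then surfaces in the output, where it can be spotted and added to the thesaurus.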
We must also state that none of the databases is complete; each one represents only part of the available research. WoS and Scopus provide complete lists of what they index. This kind of transparency is not offered by GS. WoS and Scopus also manually control their metadata. What is especially important to us is that they provide keywords, both from the author(s) and their own. In this way, non-English articles may be indexed with keywords such as management accounting. However, the adequacy of such keywords depends on the indexer. WoS and Scopus also provide duplicate control, i.e. one article is counted only once. If GS finds the same article in multiple places on the web, for instance at ResearchGate, the institutional archive and the publishing house, one article may be counted three times.
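The duplicate problem can be illustrated with a simple de-duplication sketch: records are keyed on DOI where available, otherwise on a normalized title. The record structure and field names here are illustrative, not the databases' actual export format:

```python
def dedupe(records):
    """Keep one record per article, keyed on DOI if present, else on a
    whitespace-normalized lower-case title. Copies of the same article
    found at, e.g., ResearchGate, an institutional archive, and the
    publisher's site then collapse to one entry."""
    seen, unique = set(), []
    for rec in records:
        key = rec.get("doi") or " ".join(rec["title"].lower().split())
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

records = [
    {"title": "Management Control Systems", "doi": "10.1000/x1"},
    {"title": "Management  Control Systems", "doi": "10.1000/x1"},  # publisher copy
    {"title": "Management Control  Systems", "doi": None},          # archive copy
]
print(len(dedupe(records)))  # 2: the DOI-less copy cannot be matched safely
```

The residual duplicate shows why metadata quality matters: without a shared identifier, even a careful title match cannot safely merge every copy.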
Another challenge is the reliability of the specific searches. When searching for management accounting, do we get management accounting research? A topic search in WoS based only on management accounting returned 985 articles, in itself a very low number. All but eight were placed within the field of management accounting upon manual inspection. The following quote illustrates that the search-term method is not bulletproof: ‘These deal primarily with the mechanisms and epidemiology of the disease, with papers dealing with depression management accounting for the fastest growing group of citation classics’ (Lipsman and Lozano 2011, p. 39). For the other search terms and databases, we only randomly checked for this kind of incorrect indexing; a check that led us to expel the term financial management (FM). Scopus returned 90,366 articles labelled FM, approximately twice the sum of all the other keywords together. A random control revealed many irrelevant articles, especially medical ones. However, the same was not the case for Web of Science. In addition, we checked the most cited authors, all of whom made sense for the area being studied (Kaplan, Chenhall, Simons, Neely, Otley, Langfield-Smith, Fornell, Eisenhardt, DiMaggio among others). We also extracted data for each of the search terms individually and compared them with the main search (all eleven search terms together). This allowed us to see whether some of the search terms had to be considered too generic.
It is possible to download citation data directly from WoS and Scopus. In WoS, the restriction is set to citation data from 500 articles per download. For us, this meant downloading 28 separate files to obtain all the data. In Scopus, the restriction is set to 2000 hits per download, so it is necessary to conduct multiple searches, each resulting in at most 2000 hits. We did this by limiting the searches to time spans, requiring 26 searches/downloads for the full data set.
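Splitting the Scopus searches into time spans so that each stays below the 2000-hit cap can be done greedily over per-year hit counts. A sketch of this partitioning, with hypothetical year counts:

```python
def split_into_spans(hits_per_year, cap=2000):
    """Greedily group consecutive years into (first_year, last_year, hits)
    spans whose summed hit counts stay at or below the per-download cap."""
    spans, start, running = [], None, 0
    for year in sorted(hits_per_year):
        n = hits_per_year[year]
        if start is not None and running + n > cap:
            spans.append((start, prev, running))
            start, running = None, 0
        if start is None:
            start = year
        running += n
        prev = year
    if start is not None:
        spans.append((start, prev, running))
    return spans

# Hypothetical per-year hit counts:
hits = {1990: 800, 1991: 900, 1992: 700, 1993: 600, 1994: 1900}
print(split_into_spans(hits))
```

A single year exceeding the cap would still form its own over-limit span; in that case the search must be subdivided further, for instance by keyword.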
To download data from GS, one must use third-party software, as GS lacks an application programming interface (API). We applied Publish or Perish (PoP) (Harzing 2007); an alternative is programming in Python (Stuart 2014, p. 130). In GS, i.e. PoP, the download restriction is set to the 1000 most cited articles, which makes it impossible to download a full data set. Furthermore, neither PoP nor Python programming includes the citation data we need in the available downloads. In addition to these shortcomings, Jacsó finds that GS is ‘especially inappropriate for bibliometric searches, for evaluating publishing performance and impact of researchers and journals’ (2010, p. 175). As a result, GS was excluded from further analyses.
The data from WoS and Scopus were further analysed in a spreadsheet. Converting data can be a source of errors and omissions. Finally, we used the program VOSviewer to sum up the citation data, merge the journal data using the thesaurus files, and efficiently find the most cited sources.
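Conceptually, the final tallying step is a frequency count over the canonicalized cited sources. A minimal sketch of what VOSviewer does for us here, with an illustrative list of cited journals:

```python
from collections import Counter

def rank_sources(cited_sources):
    """Count how often each (already canonicalized) journal is cited and
    return (journal, citations) pairs in descending order of citations."""
    return Counter(cited_sources).most_common()

# Illustrative cited-source list, after thesaurus-based canonicalization:
cited = [
    "Accounting, Organizations and Society",
    "Management Accounting Research",
    "Accounting, Organizations and Society",
    "The Accounting Review",
    "Accounting, Organizations and Society",
]
print(rank_sources(cited)[0])  # ('Accounting, Organizations and Society', 3)
```

This is why the thesaurus step matters: if abbreviation variants were not merged first, one journal's citations would be scattered across several rows of the ranking.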