
2022 | OriginalPaper | Chapter

An Analysis of Methodologies, Incentives, and Effects of Performance Evaluation in Higher Education: The English Experience

Authors : Giovanni Barbato, Matteo Turri

Published in: Governance and Performance Management in Public Universities

Publisher: Springer International Publishing

Abstract

The chapter illustrates the English systems for research evaluation (the Research Excellence Framework, REF) and teaching evaluation (the Teaching Excellence and Student Outcomes Framework, TEF) as NPM-driven mechanisms aimed at increasing public accountability and efficiency. Particular attention is given to the methodology of assessment, the incentives provided through the evaluation mechanism, and the actual and potential effects generated on both academics and universities. These two experiences are analyzed through the lens of Management Control Theory (MCT), which argues that, to be effective, each evaluation system must be designed according to the nature of the activity or task under evaluation. Two dimensions are particularly relevant in this regard: (i) the degree of measurability and attributability of the output, and (ii) the knowledge of the cause–effect relation, or transformation process, producing the output. Different configurations of these two dimensions lead to different types of evaluation (input, process, or output evaluation). Regarding the REF, it is highlighted how peer review reflects both a process-based and an output-based evaluation, since the quality of the research outputs under assessment should be judged according to the consistency of the theoretical arguments and research methodologies employed to develop each output. However, when this process shifts towards increasing attention to merely quantifiable indicators of research outputs, unintended consequences may arise. Concerning the TEF, the dominant focus of its metrics-based system on measurable dimensions of teaching outputs may fail to grasp more qualitative, not directly measurable aspects that remain crucial in discriminating between satisfactory and partial performances.


Footnotes
1
The evaluation exercise has taken different names over time. Between 1986 and 1989 it was the Research Selectivity Exercise (RSE), and from 1992 to 2008 the Research Assessment Exercise (RAE). Since 2014 it has been named the Research Excellence Framework (REF). In this chapter we use only the term REF to refer generally to the research evaluation exercise in England.
 
2
Research England replaced the Higher Education Funding Council for England (HEFCE) through the 2017 Higher Education and Research Act. The funding agency was previously called the Universities Funding Council (UFC) between 1989 and 1992 and the University Grants Committee (UGC) before 1989.
 
3
The set of quality assurance processes implemented by the Quality Assurance Agency (QAA) operates alongside the TEF. This chapter, however, focuses only on the TEF, since it is the main evaluative system that counterbalances the REF.
 
4
As a result of the 2017 Higher Education and Research Act, the Office for Students has replaced HEFCE as the regulatory body for teaching.
 
5
The NSS records the opinions of final-year students on several aspects of their degree programs.
 
6
For example, if an HE provider presents a total value of 2.5 based on its core metrics, it should be assigned, at the end of the first step, the "Gold" rating.
 
Metadata
Title
An Analysis of Methodologies, Incentives, and Effects of Performance Evaluation in Higher Education: The English Experience
Authors
Giovanni Barbato
Matteo Turri
Copyright Year
2022
DOI
https://doi.org/10.1007/978-3-030-85698-4_3
