
The Development and Validation of a Computer-Based Test of English for Young Learners: Cambridge English Young Learners

Chapter in: Assessing Young Learners of English: Global and Local Perspectives

Part of the book series: Educational Linguistics (EDUL, volume 25)

Abstract

This chapter summarises the rationale for the development and validation work that took place over the 2.5 years before the launch of the computer-based (CB) format of the Cambridge English Young Learners (YLE) tests. Several rounds of trials were carried out cyclically, in a number of locations across various countries, to ensure that data were collected from a representative sample of candidates in terms of geographical location, age, L1, language ability, familiarity with the YLE tests, and experience of using different computer devices (PC, laptop and tablet). Validity evidence is presented from an empirical study that used a convergent mixed-methods design to explore candidate performance in, and reaction to, the CB YLE tests. Regression analyses were conducted to investigate which individual test-taker characteristics contribute to candidate performance on the CB YLE tests. The results indicate that CB delivery presents a genuine choice for candidates, in line with the Cambridge English ‘bias for best’ principle. Positive feedback from trial candidates, parents and examiners suggests that CB YLE tests offer a contemporary, fun and accessible alternative to the paper-based (PB) YLE tests for assessing children’s English language ability.



Author information

Correspondence to Szilvia Papp.

Appendices

Appendix A: Candidate Questionnaire (English Version)

CB YLE CANDIDATE QUESTIONNAIRE

Dear CB YLE Candidate

We would like to ask you a few questions about you and the YLE test you took. Your responses will allow us to make sure the test is working properly. Your answers will be strictly confidential. Please tick ☑ the boxes or WRITE or DRAW your answers as appropriate.

1.

What is your full name?

Surname

 

Given name

 

2.

What is your candidate ID?

 

3.

What is your mother tongue?

 

4.

Are you a boy or a girl?

Boy

Girl

5.

How old are you?

5 or younger

6

7

8

9

10

11

12 or older

6.

Which school did you take the test at?

7.

How many years have you been learning English?

8.

How often do you use computers?

Every day

Once or twice a week

Only at weekends

9.

Where do you use computers?

In school

In English classes

At home

10.

What do you use computers for?

English homework

Email or chat with friends in English

Anything else?

……………………

11.

Which type of computer do you use most at home?

Desktop (PC/Mac)

Tablet

Laptop

12.

Do you prefer taking tests on paper or on computer?

On paper

No difference

On computer

 

Listening and Reading & Writing tests on COMPUTER

  

Yes

Not sure

No

1.

I knew how to change the volume in the listening test.

2.

It was easy to write my answers on the computer.

3.

It was easy to select my answers on the computer.

4.

It was easy to colour my answers on the computer.

5.

It was easy to move between questions.

6.

It was easy to move between tasks.

7.

I had enough time to answer all the questions on the computer.

8.

I looked at the timer to know how much time I had left in the test.

9.

I was worried when I saw the timer counting down on the computer.

10.

I liked the pictures in the test on the computer.

11.

The examples at the start of each task helped me understand what to do in the test on the computer.

12.

I liked taking the Listening and Reading & Writing tests on the COMPUTER.

Speaking test on COMPUTER

  

Yes

Not sure

No

13.

I understood how to check the microphone in the speaking test.

14.

I knew when to start speaking in the speaking test on the computer.

15.

I knew when to stop speaking in the speaking test on the computer.

16.

I used the clock to help me know how much time I had to speak in the speaking test on the computer.

17.

I had enough time to think about my answers in the speaking test on the computer.

18.

I had enough time to give my answers in the speaking test on the computer.

19.

I felt nervous taking the speaking test on the computer.

20.

I liked speaking to a computer.

Anything else you would like to WRITE or DRAW about the test on the COMPUTER?

Your preferences about the YLE test on PAPER or COMPUTER

  

On paper/face-to-face with examiner

No difference

On computer

 

21.

I found the test easier …

 

Why?

22.

I prefer listening …

 

Why?

23.

I prefer speaking …

 

Why?

24.

I can read more easily and more quickly …

 

Why?

25.

I can write more easily and more quickly …

 

Why?

26.

I preferred taking the test …

 

Why?

Anything else you would like to WRITE or DRAW about the test on PAPER or COMPUTER?

Thank you for your time.

Appendix B: CB YLE Observer Checklist

CB YLE OBSERVER CHECKLIST

Dear YLE Examiner, Test Administrator, Usher, or Teacher,

We would like to ask you a few questions about the CB YLE tests you have observed. Your responses will allow us to make sure the test is working properly. Your answers will be strictly confidential. Please tick ☑ the boxes or WRITE your answers as appropriate.

1.

What is your name?

2.

What is your examiner ID (if relevant)?

3.

What is your mother tongue?

4.

Are you male or female?

Male

Female

5.

How old are you?

18-20

21-25

26-30

31-35

36-40

41-50

51-60

61+

6.

In which centre/location did you observe the test?

7.

How many years have you been examining/administering/preparing learners for YLE?

Examining

Administering

Preparing learners

…… years

…… years

…… years

8.

How often do your students use computers?

Every day

Once or twice during the week

Only at weekends for homework assignments

9.

Where do your students use computers?

In school

In English classes

At home for homework

10.

What do your students use computers for?

English homework

Email or chat with friends in English

Anything else?

  

……………………

11.

Which type of computer do your students use most at school?

Desktop (PC/Mac)

Tablet

Laptop

  

12.

Do your students prefer taking tests on paper or on computer?

On paper

Not sure

On computer

13.

Which level of CB YLE tests have you observed?

Starters

Movers

Flyers

14.

What type of computer were the candidates using during the test you have observed?

Desktop (PC/Mac)

Tablet

Laptop

Your observations on the CB YLE Speaking test

  

Yes

Not sure

No

27.

The candidates checked the microphone in the speaking test.

28.

The candidates understood clearly what they had to do in the speaking test on the computer.

29.

The candidates knew when to start talking in the speaking test on the computer.

30.

The candidates knew when to stop talking in the speaking test on the computer.

31.

The animations were helpful for the candidates to know how and when to start talking.

32.

The animations were helpful for the candidates to know how and when to finish talking.

33.

The candidates checked the timer to see how much time they had to speak.

34.

I noticed some candidates rushing their answer in response to the timer.

35.

The candidates had enough time to think about their answers in the speaking test on the computer.

36.

The candidates had enough time to give their answers in the speaking test on the computer.

37.

I noticed candidates were nervous while taking the speaking test on the computer, e.g. they hesitated, looked confused or distracted.

38.

The candidates seemed to like speaking to a computer.

39.

Lack of human examiner support did not prevent candidates from providing responses.

Your observations on the CB YLE Listening and Reading & Writing tests

  

Yes

Not sure

No

40.

The candidates changed the volume in the listening test.

41.

The candidates understood what they needed to do in the Listening test on the computer.

42.

The candidates understood what they needed to do in the Reading and Writing test on the computer.

43.

The candidates were able to click/tap to write their answers on the computer.

44.

The candidates were able to select their multiple choice answers on the computer.

45.

The candidates were able to colour their answers on the computer.

46.

The candidates were able to move easily between questions.

47.

The candidates were able to move easily between tasks.

48.

The candidates had enough time to answer all the questions in the Listening test on the computer.

49.

The candidates had enough time to answer all the questions in the Reading and Writing test on the computer.

50.

The onscreen timer in the Listening and Reading/Writing tests made candidates anxious.

51.

The examples/model answers helped candidates answer the questions in the test on the computer.

52.

The candidates liked taking the Listening and Reading & Writing tests on the computer.

Candidate preferences about the YLE test on PAPER or COMPUTER

  

On paper / face-to-face with examiner

No difference

On computer

  

 

53.

Candidates find the YLE test easier …

 

Why?

54.

Candidates prefer listening …

 

Why?

55.

Candidates prefer speaking …

 

Why?

 

56.

Candidates can read more easily and more quickly …

 

Why?

57.

Candidates can write more easily and more quickly …

 

Why?

58.

Candidates prefer taking the YLE test …

 

Why?

Any other comments about what you observed in the CB YLE Speaking test?

Any other comments about what you observed in the CB YLE Listening and Reading & Writing tests?

Any other observation about candidate preferences on PAPER-based or COMPUTER-based YLE tests?

Thank you for your time.

Appendix C: Effect Plots from Regression Analyses (Figs. 7, 8, 9, 10, and 11)

Fig. 7 STARTERS – Model 1 effect plot

Fig. 8 FLYERS – Model 2 effect plot

Fig. 9 FLYERS – Model 3 effect plot

Fig. 10 FLYERS – Model 4 effect plot

Fig. 11 FLYERS – Model 5 effect plot
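The chapter's actual regression models and data are not reproduced on this page, but the idea behind an effect plot can be sketched: fit an ordinary least squares model of test score on test-taker characteristics, then compute predicted scores across the range of one predictor while holding the others at their sample means. The sketch below uses entirely synthetic data; the variable names, coefficients and values are invented for illustration and do not come from the study.

```python
import numpy as np

# Synthetic stand-ins for candidate records (NOT the study's data):
rng = np.random.default_rng(42)
n = 200
age = rng.integers(6, 13, size=n).astype(float)       # candidate age in years
comp_use = rng.integers(0, 3, size=n).astype(float)   # 0=rarely, 1=weekly, 2=daily
score = 20 + 1.5 * age + 2.0 * comp_use + rng.normal(0, 3, size=n)

# Ordinary least squares: score ~ intercept + age + computer-use frequency
X = np.column_stack([np.ones(n), age, comp_use])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)

# "Effect plot" values for age: predicted scores across the age range,
# holding computer use fixed at its sample mean
ages = np.arange(6, 13, dtype=float)
X_effect = np.column_stack([np.ones(len(ages)), ages,
                            np.full(len(ages), comp_use.mean())])
effect = X_effect @ beta
```

Plotting `effect` against `ages` (with confidence bands from the coefficient covariance) would yield a figure of the kind shown in Figs. 7, 8, 9, 10 and 11.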

Copyright information

© 2016 Springer International Publishing Switzerland

Cite this chapter

Papp, S., & Walczak, A. (2016). The development and validation of a computer-based test of English for young learners: Cambridge English Young Learners. In M. Nikolov (Ed.), Assessing young learners of English: Global and local perspectives (Educational Linguistics, vol. 25). Springer, Cham.

  • DOI: https://doi.org/10.1007/978-3-319-22422-0_7

  • Print ISBN: 978-3-319-22421-3

  • Online ISBN: 978-3-319-22422-0