Online proficiency testing and intensive course improvement of English instruction at a technology university in Taiwan

Hsiu-Ching Tso
Gloria Shu-Mei Chwo


Since 2010, Hungkuang Technological University in Taiwan has undertaken an increasingly intensive program of twice-yearly online English testing of freshman and sophomore students, using both the Bridge and Full versions of the Test of English for International Communication (TOEIC) in mock and real forms. A preliminary analysis of 15,613 individual test scores for reading, listening, or both yielded a range of relevant findings, including the following. (1) Overall mean proficiency on the real Full TOEIC was 111 for listening and 75 for reading, well below the institutional target of a minimum total score of 350; progress between the freshman and sophomore years was also uncertain, suggesting some loss of motivation on the part of the students. (2) Differences between departments revealed the impact of extra English work undertaken in some departments: Physical Therapy students, for example, scored better on reading than on listening (Bridge TOEIC), possibly owing to the English-language medical textbooks they use and a student self-help reading club. (3) Bridge and Full TOEIC scores correlated well with each other; however, the actual equivalences found among our lower-proficiency students did not match published conversion tables. (4) The mock tests proved to be poor predictors of real test performance, which counts against using them to reduce the cost of taking the real tests. Based on these findings, an intensive English program (IEP) was proposed to improve on the limited English progress observed. Following implementation of the IEP, the average score improvement was 123.5 and the highest score was 800.
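Finding (3) rests on a correlational analysis of paired Bridge and Full TOEIC scores. The following sketch is not the authors' code, and the scores are invented for illustration only; it simply shows how Pearson's r could be computed for matched score pairs from the same students:

```python
# Hypothetical paired scores for eight students (illustrative only,
# not the study's actual data): Bridge TOEIC vs. Full TOEIC totals.
bridge = [96, 110, 124, 132, 140, 118, 104, 128]
full = [225, 280, 340, 365, 400, 310, 250, 355]

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Covariance numerator and the two standard-deviation factors
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)

print(f"Pearson r = {pearson_r(bridge, full):.3f}")
```

A high r here would indicate that the two tests rank students similarly, which is compatible with the study's further observation that the score *equivalences* (conversion-table mappings) may still be off for lower-proficiency students.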

How to Cite
Tso, H.-C., & Chwo, G. S.-M. (2019). Online proficiency testing and intensive course improvement of English instruction at a technology university in Taiwan. Interdisciplinary Research Review, 14(2), 8–15.

