Posts by Language Testing
Language Testing Volume 43 Issue 2 is now available! In this issue: 2 original articles, 2 systematic reviews, 1 brief report, and 2 book reviews. Check out our spotlight on the systematic reviews in this thread ⬇️. Full issue is available here: journals.sagepub.com/toc/ltja/43/2
These two systematic reviews embody two encouraging developments that strengthen reporting standards in #systematicreview & #metaanalysis research. 1) Both publications conduct evaluations of #methodologicalquality, which is rare in #appliedlinguistics & #languageassessment research.
Yanxin Wang & Shangchao Min used the COSMIN Checklist in their review of human–machine correlations for automated speech evaluation. doi.org/10.1177/0265...
Chao Han et al. developed a bespoke tool for this purpose in their review of rater-mediated rubrics for interpreting, available as supplementary material on #OSF. doi.org/10.1177/0265...
2) Just as importantly, both systematic reviews also include the #PRISMA checklist, and PRISMA flow diagrams are now a standard for all #evidencesyntheses in the journal, supporting transparent reporting. Congratulations to the authors for setting a strong research precedent!
Language Testing is delighted to announce that Slobodanka Dimova has taken on the role of Test Review Editor! Slobodanka previously served as Book Review Editor for the journal and now steps into this essential position. 1/4
Check out our full index of reviews here (all open access!): shorturl.at/leHsD Language Testing remains committed to widening access to high‑quality professional evaluations of tests and strengthening assessment literacy across the field. 2/4
We're sincerely grateful to Troy Cox for his dedicated service as outgoing Test Reviews Editor. He established a new test review task force, developed genre‑specific reviewer guidelines, & implemented a more systematic, consultative process for selecting future test reviews. 3/4
We look forward to continuing our collaboration with Troy as a member of the Editorial Board, and offer our warmest congratulations once again to Slobodanka on her new role! 4/4
Check out Lynda Taylor's piece in #LanguageTesting on what it means to be a member of a professional association in our field. doi.org/10.1177/0265... (1/2)
Alongside modelling the "Letter to the Editor" publication format, she raises timely questions about membership engagement, voluntary service & the sustainability of professional associations. New Year’s resolutions anyone? (2/2)
Language Testing Volume 43 Issue 1 is now available! This issue contains 1 editorial, 2 original articles, 1 brief report, and 1 viewpoint. Check out our posts on the Virtual Special Issue and Letter to the Editor, also in this issue! journals.sagepub.com/toc/ltja/43/1
This pioneering Virtual Special Issue by Salomé Villa Larenas on the #GlobalSouth is consequential in bringing to light regional disparities in our journal and field and in shifting the focus to underrepresented contexts. doi.org/10.1177/0265... (1/4)
The Virtual Special Issue synthesizes & revives previously published work in #LanguageTesting that deserves renewed attention. We look forward to the further contributions that this piece inspires! (2/4)
Language Testing is committed to equalizing opportunity & enhancing diversity & inclusion of our authors, readers, reviewers, Editorial team & scholarly community. We congratulate Salomé Villa Larenas on illuminating the state of play & challenges – crucial to effecting change! (3/4)
Also check out our Vodcast that directly feeds into this article on the Language Testing YouTube channel, where scholars from Latin America give their perspectives on barriers and enablers to publishing: www.youtube.com/watch?v=xP23... (4/4)
The call for Language Testing Special Issue 2028 proposals is now open! Submission deadline: February 20, 2026. For information about how to submit a proposal, please read the “Language Testing – Special Issue Instructions for Guest Editors” document: tinyurl.com/bp7nyx9v
Now available in OnlineFirst, Chao Han, Mengting Jiang and Qionglu Chen present a systematic review and meta-analysis of rubrics used in the assessment of language interpreting, focusing on the intention, development, design, practice, and utility of the rubrics. doi.org/10.1177/0265...
Language Testing Volume 42 Issue 4 is now available! The Special Issue of 2025: Advancing language assessment for teaching and learning in the era of the artificial intelligence #AI revolution: Promises and challenges journals.sagepub.com/toc/ltja/42/4
Guest Editors Eunice Eunhee Jang and Yasuyo Sawaki introduce the Special Issue: doi.org/10.1177/0265...
The Special Issue features 7 original articles. Erik Voss investigates the performance and transparency of traditional machine learning methods vs. neural network models for scoring English essays in the TOEFL11 learner corpus. doi.org/10.1177/0265...
Liam Hannah et al. examine whether including prosody in oral reading fluency assessment can reduce scoring bias, improve diagnostic efficacy, and enhance prediction of reading comprehension across language backgrounds. doi.org/10.1177/0265...
Andrew Runge and colleagues introduce an innovative interactive writing task in which test-takers receive LLM-generated follow-up prompts designed to help them elaborate on their initial responses and address relevant themes. doi.org/10.1177/0265...
Yasuyo Sawaki and colleagues compare LLM and writing instructor checklist-based evaluations of written summaries generated by Japanese undergraduate English learners. doi.org/10.1177/0265...
Shungo Suzuki and colleagues introduce an L2 speaking assessment program that utilizes ML scoring and a conversational AI agent to provide contextualized diagnostic feedback on lexical use to learners. doi.org/10.1177/0265...
Ekaterina Voskoboinik et al. leverage LLMs for automatic assessment of L2 speech in Finnish & Finland Swedish and explore the viability of LLM-generated responses to enhance automated scoring of responses from learners at less common proficiency levels. doi.org/10.1177/0265...
Ikkyu Choi and Jiyun Zu propose a new method for generating bias-free language assessment content, ensuring that the content is free from systematic relationships between demographic entities and their attributes. doi.org/10.1177/0265...
The Editorial by @lukeharding.bsky.social discusses contrasting outlooks on the future of language assessment in the age of AI and highlights themes in the special issue articles that spark optimism for the field’s development in terms of ethics and responsibility. doi.org/10.1177/0265...
This article in Language Testing is now free to read & download for a year! Erik Voss investigates the performance and transparency of traditional machine learning methods vs. neural network models for scoring English essays in the TOEFL11 learner corpus. doi.org/10.1177/0265...
This article in Language Testing is now free to read & download for a year! Haerim Hwang and Hyunwoo Kim present a new open-source application for measuring syntactic complexity in L2 Korean production. doi.org/10.1177/0265...
Free access until September 9, 2026.