INVESTIGATION OF WRITING ASSESSMENT LITERACY OF UKRAINIAN UNIVERSITY TEACHERS
DOI: https://doi.org/10.17721/2663-0303.2019.4.02

Keywords: criteria for assessment, rating scales, teacher assessment practices, writing assessment literacy, writing tasks

Abstract
The rating of students’ writing has been a long-standing concern in L2 large-scale standardized and classroom-based assessment. Several studies have tried to identify how raters make scoring decisions and assign scores in order to ensure the validity of writing assessment. The current paper addresses the writing assessment practices of Ukrainian university teachers and how they approach rating scales and criteria, in an attempt to understand the culturally specific challenges of teachers’ writing assessment in Ukraine. To investigate the issue, this study analyses survey results obtained from 104 university teachers of English. The survey consisted of 13 questions that provided insight into current practices in the assessment of writing, such as frequency of assessment, use of rating scales, the rater’s profile, criteria of assessment, feedback and rewriting, and training in the assessment of writing. The survey responses show that assessment in Ukraine is not regulated by a common standard, and thus the approach to assessing students’ writing is often intuitive. A frequent practice is that teachers tend to rely on errors, as observable features of the text, to justify their rating decisions. Consequently, by shifting focus onto the surface features of writing, grammar mistakes in particular, the teachers underrate such criteria as “register”, “compliance with textual features” and “layout”. Additionally, the data reveal contradictory findings about the writing assessment literacy of the teachers questioned. Even though most teachers claim they apply scales while rating, many admit they cannot tell the difference between holistic and analytic scales. Moreover, the results indicate that feedback is not yet a meaningful interaction between a Ukrainian teacher and a learner.
Therefore, the results of the study demonstrate the need to improve writing assessment practices, which could be achieved by providing training and reorientation to help Ukrainian teachers develop a common understanding and interpretation of task requirements and scale features.

References
Alaei, M. M., Ahmadi, M., & Zadeh, N. S. (2014). The impact of rater’s personality traits on holistic and analytic scores: Does genre make any difference too? Procedia - Social and Behavioral Sciences, 98, 1240–1248.
Aslim-Yetis, V. (2019). Evaluating essay assessment: Teacher-developed criteria versus rubrics. Intra/inter reliability and teachers’ opinions. Croatian Journal of Education, 21(1), 103–155. https://doi.org/10.15516/cje.v21i1.2922
Attali, Y. (2016). A comparison of newly-trained and experienced raters on a standardized writing assessment. Language Testing, 33(1), 99–115. https://doi.org/10.1177/0265532215582283
Bijani, H. (2010). Raters’ perception and expertise in evaluating second language compositions. The Journal of Applied Linguistics, 3(2), 69–89.
Brookhart, S. M. (2011). Educational assessment knowledge and skills for teachers. Educational Measurement: Issues and Practice, 30(1), 3–12. https://doi.org/10.1111/j.1745-3992.2010.00195.x
Chan, S. H. C. (2013). Establishing the validity of reading-into-writing test tasks for the UK academic context (Doctoral thesis). University of Bedfordshire.
Cho, D. (2008). Investigating EFL writing assessment in a classroom setting: Features of composition and rater behaviors. The Journal of Asia TEFL, 5(4), 49–84.
Coombe, C., Folse, K., & Hubley, N. (2007). A practical guide to assessing English language learners. Ann Arbor, MI: University of Michigan Press.
Crusan, D., Plakans, L., & Gebril, A. (2016). Writing assessment literacy: Surveying second language teachers’ knowledge, beliefs, and practices. Assessing Writing, 28, 43–56. https://doi.org/10.1016/j.asw.2016.03.001
Eckes, T. (2012). Operational rater types in writing assessment: Linking rater cognition to rater behavior. Language Assessment Quarterly, 9, 270–292. https://doi.org/10.1080/15434303.2011.649381
Fahim, M., & Bijani, H. (2011). The effects of rater training on raters’ severity and bias in second language writing assessment. Iranian Journal of Language Testing, 1(1), 1-16.
Ghanbari, B., Barati, H., & Moinzahed, A. (2012). Rating scales revisited: EFL writing assessment context of Iran under scrutiny. Language Testing in Asia, 2(1), 83–100.
Hamp-Lyons, L. (2007a). The impact of testing practices on teaching. In International handbook of English language teaching (pp. 487–504). Boston, MA: Springer.
Hamp-Lyons, L. (2007b). Worrying about rating. Assessing Writing, 12(1), 1–9. https://doi.org/10.1016/j.asw.2007.05.002
Jeong, H. (2015). Rubrics in the classroom: Do teachers really follow them? Language Testing in Asia, 5(6). https://doi.org/10.1186/s40468-015-0013-5
Johnson, J. S., & Lim, G. S. (2009). The influence of rater language background on writing performance assessment. Language Testing, 26(4), 485–505. https://doi.org/10.1177/0265532209340186
Keh, C. L. (1990). Feedback in the writing process: A model and methods for implementation. ELT Journal, 44(4), 294–304.
Kim, A.-Y., & di Gennaro, K. (2012). Scoring behavior of native vs. non-native speaker raters of writing exams. Language Research, 48(2), 319–342.
Knoch, U. (2007). ‘Little coherence, considerable strain for reader’: A comparison between two rating scales for the assessment of coherence. Assessing Writing, 12(2), 108–128.
Knoch, U. (2011). Rating scales for diagnostic assessment of writing: What should they look like and where should the criteria come from? Assessing Writing, 16, 81–96.
Lee, I. (2010). Writing teacher education and teacher learning: testimonies of four EFL teachers. Journal of Second Language Writing, 19, 143–157.
Lim, G. S. (2011). The development and maintenance of rating quality in performance writing assessment: A longitudinal study of new and experienced raters. Language Testing, 28, 543–560.
Lumley, T. (2002). Assessment criteria in a large-scale writing test: What do they really mean to the raters? Language Testing, 19(3), 246–276. https://doi.org/10.1191/0265532202lt230oa
Mellati, M., & Khademi, M. (2018). Exploring Teachers’ Assessment Literacy: Impact on Learners’ Writing Achievements and Implications for Teacher Development. Australian Journal of Teacher Education, 43(6). http://dx.doi.org/10.14221/ajte.2018v43n6.1
Nemati, M., Alavi, S. M., Mohebbi, H., & Masjedlou, A. P. (2017). Teachers’ writing proficiency and assessment ability: The missing link in teachers’ written corrective feedback practice in an Iranian EFL context. Language Testing in Asia, 7(21). https://doi.org/10.1186/s40468-017-0053-0
Skar, G. B. & Jølle, L. J. (2017). Teachers as raters: An investigation of a long-term writing assessment program. L1-Educational Studies in Language and Literature, 17, 1-30. https://doi.org/10.17239/L1ESLL-2017.17.01.06.
Truscott, J. (1999). The case for “The case against grammar correction in L2 writing classes”: A response to Ferris. Journal of Second Language Writing, 8(2), 111–122.
Weigle, S. C. (2002). Assessing writing. Cambridge: Cambridge University Press.
License
Copyright (c) 2019 Olga Kvasova, Tamara Kavytska, Viktoriia Osidak
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
Copyright policy according to the terms of the license: Creative Commons "Attribution-NonCommercial" 4.0 International (CC BY-NC 4.0).
Authors who publish their articles in "Ars Linguodidacticae" (Open Access Journal) retain the following rights:
- The authors retain the copyright of their article and grant the Ars Linguodidacticae journal the right of first publication of the manuscript under the Creative Commons Attribution-NonCommercial (CC BY-NC 4.0) license, which allows others to freely distribute the published work with mandatory reference to the author of the original work and to its first publication in the Ars Linguodidacticae journal. A note on the retention of copyright appears on the title page of the article.
- The authors reserve the right to enter into separate contracts for the non-exclusive distribution of their article as published in Ars Linguodidacticae (e.g., placing the article in electronic libraries, archives and catalogs or publishing it as part of institute collections and monographs), provided that a full reference to the first original publication in Ars Linguodidacticae is given.
- The policy of the "Ars Linguodidacticae" journal allows and encourages authors to post their manuscript online both before and during editorial processing, as this promotes productive scholarly discussion and has a positive effect on the speed and dynamics of citation of the article.
The editorial board reserves publishing rights to:
- the collated original articles and the entire issue of the journal;
- the design of the journal and original illustrative and supplementary materials;
- reprints of the journal in printed and electronic form.
For more information, please read the full text of the CC BY-NC 4.0 Public License.