Handwriting vs. Typewritten

This is some research I conducted at the request of my Principal.

Word Processed vs. Handwritten Essays: What Does the Research Say?

Our school implements a traditional final exam period: over the course of three days, students are excused from classes and sit for timed exams. In some cases, the final exam counts as much as the grade for an entire quarter and is worth up to 33% of the student’s semester grade.

As our school reexamines its assessment practices, I was asked to review current research contrasting handwritten and word-processed final essays. Ours is a one-to-one school, so all students have access to a personal laptop throughout their school careers. The school is concerned about the validity of handwritten essays in assessing students whose work all semester has been word processed.

There are other questions the school wishes to investigate, as well. If given the choice to handwrite or type, are students who choose to handwrite their essays at a disadvantage when the essays are scored? Do students perform differently when handwriting vs. typing their answers? What are students’ concerns when choosing to handwrite or type?

The University of Edinburgh conducted a study in which students were given the choice between handwriting and typing their essays. Typed and handwritten versions of each essay were produced and graded by a group of four graders. Surprisingly, the researchers found a small but noticeable grading bias toward handwritten essays: students who chose to word process their essays received scores that were several points lower, a result that has been confirmed in a number of other studies (Russell & Tao, 2004; MacCann, Eastment, & Pickering, 2002; Sweedler-Brown, 1991).

Powers, Fowles, Farnum, and Ramsey (2014) offer two possible explanations in their report on a similar study performed at Rio Hondo College. First, when grading a typed essay, readers may inadvertently expect a higher level of polish simply because the essay is typed, rather than reading it for what it is: a rough draft produced under time pressure and anxiety. Second, the authors noted a “Reader Empathy Assessment Discrepancy,” in which readers tended to identify more closely with the writer when the work was handwritten, citing a “closer identification with the writer’s voice.”

This suggests that, to ensure handwritten and typed essays are graded equivalently, schools should either require that all examinations be handwritten or all typed, or, in cases where students are given the choice, educate graders about the presentation effect. Mogey and Sarab (2006) found that making graders aware of this type of bias reduced its effect. It is also interesting to note that printing essays in a cursive font seemed to reduce, but not eliminate, the bias (Powers et al., 2014). When asked whether they wanted the option to choose between handwriting and typing, only students who were faster handwriters felt the choice was necessary (Mogey et al., 2010).

A second difference between handwritten and typed essays is the quantity of the finished output. Thomas, Paine, and Price (2003) found that students who typed their essays produced considerably more words than students who chose to handwrite, a finding confirmed by Mogey, Paterson, Burk, and Purcell (2010), although another study found the difference to be very small, only nine words more on average (Horkay et al., 2006). In schools where most assignments are completed on the computer, students rarely handwrite their work. Connelly, Dockrell, and Barnett (2005) found that first-year undergraduates had a handwriting fluency level similar to what would be expected of an 11-year-old child. Consequently, requiring students to handwrite their final essays when they have typed all their previous work calls the very validity of the assessment into question.

A number of studies also show that students who are very familiar with technology do better on typed tasks than those who are not (Wolfe, Bolton, Feltovich, & Bangert, 1996; Wolfe, Bolton, Feltovich, & Niday, 1996). This suggests that entering ninth graders who have not had the same access to technology will be at a disadvantage until they develop their word-processing skills (not just typing, but editing and rearranging text as well).

The biggest question of all is how we might use computers more effectively to assess what students have learned. The word-processed essay is essentially a substitution-level task (Puentedura, 2014) and is subject to many of the same limitations that handwritten essays face. We should look into using computers to generate and display animations, screencasts, video clips, live links, and other digital artifacts that demonstrate learning in ways a handwritten essay cannot. We might also examine the value of placing time constraints on the task of demonstrating learning, and consider piloting untimed tests.

In the end, the goal of the final exam should be to gather the information that matters most to us about our students’ learning. Horkay et al. (2006) suggest that the mode you choose depends on what you want to know: do you want to know whether students write well on paper, whether they write well digitally, or how well they write in the mode of their choice? How relevant is the ability to write well on paper? Once we answer those questions for ourselves, the path should be clearer.

References

Horkay, N., Bennett, R. E., Allen, N., Kaplan, B., & Yan, F. (2006). Does It Matter if I Take My Writing Test on Computer? An Empirical Study of Mode Effects in NAEP. Journal of Technology, Learning, and Assessment, 5(2).

MacCann, R., Eastment, B., & Pickering, S. (2002). Responding to free response examination questions: computer versus pen and paper. British Journal of Educational Technology, 33(2), 173–188. http://doi.org/10.1111/1467-8535.00251

Mogey, N., & Sarab, G. (2006). Essay exams and tablet computers: Trying to make the pill more palatable. Paper presented at the 10th CAA Conference, Loughborough, UK.

Mogey, N., Paterson, J., Burk, J., & Purcell, M. (2010). Typing compared with handwriting for essay examinations at university: Letting the students choose. ALT-J, 18(1), 29–47. http://doi.org/10.1080/09687761003657580

Powers, D. E., Fowles, M. E., Farnum, M., & Ramsey, P. (2014). Will they think less of my handwritten essay if others word process theirs? Effects on essay scores of intermingling handwritten and word-processed essays. ETS Research Report Series, 1992(2), i–15. http://doi.org/10.1002/j.2333-8504.1992.tb01476.x

Puentedura, R. R. (2014). SAMR: A contextualized introduction. Retrieved November.

Russell, M., & Tao, W. (2004). The influence of computer-print on rater scores. Practical Assessment.

Sweedler-Brown, C. O. (1991). Computers and assessment: The effect of typing versus handwriting on the holistic scoring of essays. Research and Teaching in Developmental Education. http://doi.org/10.2307/42801814

Thomas, P., Paine, C., & Price, B. (2003). Student experiences of remote computer-based examinations. Paper presented at the 7th CAA Conference, July 2003, Loughborough, UK.

Wolfe, E. W., Bolton, S., Feltovich, B., & Bangert, A. W. (1996). A Study of Word Processing Experience and its Effects on Student Essay Writing. Journal of Educational Computing Research, 14(3), 269–283. http://doi.org/10.2190/XTDU-J5L2-WTPP-91W2

Wolfe, E. W., Bolton, S., Feltovich, B., & Niday, D. M. (1996). The influence of student experience with word processors on the quality of essays written for a direct writing assessment. Assessing Writing, 3(2), 123–147. http://doi.org/10.1016/S1075-2935(96)90010-0

© Douglas Kiang 2020