Sunday, April 26, 2009

MA Dissertation IV

Way (2008) notes that two studies (Russell & Haney, 1997; Russell & Plati, 2001) show higher performance in computer-mode tests, whilst two others (Way & Fitzpatrick, 2006; Bridgeman & Cooper, 1998) show lower performance compared to handwritten mode. He comments: "These findings may have to do with the keyboarding skills of the students involved in the studies."

It would be nice to eliminate that "may" by correlating the test score with a score on a simple keyboard skill test. There are a number of such tests online, for example at TypingMaster, which also offers a downloadable version. (It occurs to me that if I were expecting teachers in, say, the Czech Republic, to conduct this research on my behalf, under the auspices of FEB, I could make a wee TBL lesson out of it, so that it's more teacher- and learner-friendly.)
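The correlation itself would be trivial to compute once the scores were collected. A minimal sketch, with entirely invented numbers standing in for typing-test and essay scores, of the Pearson correlation I have in mind:

```python
# Sketch only: the data below are invented for illustration,
# not taken from any of the studies cited in this post.

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two score lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = sum((x - mean_x) ** 2 for x in xs) ** 0.5
    sd_y = sum((y - mean_y) ** 2 for y in ys) ** 0.5
    return cov / (sd_x * sd_y)

# Hypothetical typing speeds (words per minute) and computer-mode
# essay scores for six learners.
typing_wpm = [18, 25, 31, 40, 47, 55]
essay_score = [11, 12, 14, 15, 17, 18]

print(round(pearson_r(typing_wpm, essay_score), 3))  # prints 0.992
```

A strong positive r on real data would lend weight to Way's keyboarding-skills explanation; a weak one would count against it.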

Bridgeman, B., & Cooper, P. (1998). Comparability of scores on word-processed and handwritten essays on the Graduate Management Admission Test. Paper presented at the annual meeting of the American Educational Research Association, San Diego, CA.

Russell, M., & Plati, T. (2001). Effects of computer versus paper administration of a state-mandated writing assessment. TCRecord. Retrieved December 12, 2007, from

Russell, M., & Haney, W. (1997). Testing writing on computers: An experiment comparing student performance on tests conducted via computer and via paper-and-pencil. Education Policy Analysis Archives, 5(3). Retrieved December 12, 2007, from

Way, W. D., & Fitzpatrick, S. (2006). Essay responses in online and paper administrations of the Texas Assessment of Knowledge and Skills. Paper presented at the annual meeting of the National Council on Measurement in Education, San Francisco, CA.

Way, W. D., Davis, L. L., & Strain-Seymour, E. (2008). The validity case for assessing direct writing by computer. A Pearson Assessments & Information white paper. [incomplete ref]