Rank: 2-3 (Herrington and Moran are suspicious enough to be opposed to the decision to implement Writeplacer, but they do see its upsides: cost and efficiency.)
Argument: The researchers grant Writeplacer's cost-effectiveness as a placement-exam scorer (or, at least, they see how it is cheaper and more efficient than hand-scoring), but they are troubled, for multiple reasons, by the removal of human assessors.
Assumptions: This is an interview-based case study of the implementation of Writeplacer Plus, used as an entrance-level sorting device for newly registered students at Valley College. The researchers interviewed administrators, teachers, and students about the use of Writeplacer. In this situation, Writeplacer was used as a one-time test (meant to act as a filter), so the administrators see no problem with it: since it functions as a filter, it never operates within the classroom itself.
Points of Interest:
- The core (and fascinating) problem the researchers isolate is that the students are writing not to humans but to computers, a problem that extends beyond this study to the greater realm of automated essay scoring (AES) (114). They wonder how students' knowledge of this different audience could affect their rhetorical practice.
- Administrators and faculty hold distinctly opposed opinions about Writeplacer: the former think of it as a great "filter" (124); the latter are suspicious and don't want to be blindsided (when they hand-scored the exams, they could get an idea of what kind of writing they would have to work with). Students were more likely to shift between those positive and negative poles. Overall, they were split: some approved, some disapproved, but all wanted actual human teachers in their courses.
- No matter what, essentially everyone involved was in favor of keeping AES out of the actual classroom, even though administrators justified Writeplacer Plus as a one-time "filter."