Nuanced assessments: More than the final grade
This post is republished from Into Practice, a biweekly communication of Harvard’s Office of the Vice Provost for Advances in Learning.
Howell Jackson, James S. Reid Jr. Professor of Law, experiments with end-of-semester exams and writing assignments to create opportunities for meaningful, formative feedback through skills practice, reflection, and peer collaboration.
The benefits: Jackson has experimented with multiple choice questions, most recently in Introduction to Securities Regulation, requiring students to select options such as “Clearly no” and “Probably yes” and then write short essays elaborating on a few of the questions they found particularly challenging or problematic. This nuanced approach affords opportunities to explore and evaluate different skill sets: “Grading these responses tells you an awful lot about which of the students are really thinking hard and understand the doctrinal complexities we have explored in class.”
The challenges: Some customization is required when designing for specific pedagogical goals, as no single method suits all content or course structures. The time required to evaluate different assessments is also a consideration: while long-form essays may provide the greatest opportunity for personalized feedback, Jackson assigns them only in courses of 30-40 students or fewer. The short essays accompanying multiple choice exams also require more time to grade.
Takeaways and best practices
- Exams and assignments as skills practice. When designing multiple choice and essay questions, Jackson bears in mind common tasks and challenges that young legal associates will face, for example, evaluating the plausibility of legal interpretations. “What I’m hoping to test is, ‘Are they beginning to think about the law and sort information in ways they’ll be expected to do in practice?’”
- Solicit feedback via testimonial or outcome. With any new assessment approach, Jackson collects feedback from students about their experience. “Sometimes with multiple choice questions, I think the answer is clear, but I discover from the students’ short essay responses that several other answers are plausible. That gives me a chance to adjust my grading, including removing or rescoring questions based on student responses. If I did strict multiple choice—that is, without additional short essays—I might never have known that a question didn’t work.” He also reviews exam performance for evidence of question biases (e.g., gender).
- Peer review promotes top effort. In courses where research papers are assigned, students post comments on at least four papers written by other members of the class. “Students are a little bit more compulsive when they know their classmates are going to evaluate their papers.” This online interaction exposes students to topics not covered in class, and provides another evaluative opportunity.
Bottom line: Thoughtfully crafted questions can increase student learning through varied feedback opportunities. According to Jackson, the time investment to try different approaches is worth it: “I tell my students that a new method of evaluation could be a complete disaster, probably will end up being just fine, but might just possibly evolve into an innovation that will revolutionize the way the law is taught around the country. That’s the kind of experimentation we should be doing at Harvard.”