Online analytic reading and writing tutorial dissemination and evaluation

Awardees: Meira Levinson (HGSE), Sherry Deckman (HGSE), Abena Mackall (HGSE)

Summary: Awardees plan to use usage data and impact assessments to further develop their online tutorial on analytic reading and writing.

Meira Levinson, Professor of Education, and Sherry Deckman, Ed.M.’07, Ed.D.’13, created an online Analytic Reading and Writing Tutorial in 2008-2009 that they used in their courses with hundreds of students. With these funds, they intended to develop the tutorial further and to build a formal evaluation tool to capture data on student use and learning.

As with any multimedia tool, the team encountered some setbacks over the course of the project. When Deckman and Christina Dobbs, another student who had been hired to help develop the evaluation tools, graduated in 2013, Abena Mackall, Ed.D. ’17, took up the work, primarily on the assessment-writing portion of the project. The process of developing assessment methods for the online tutorial yielded some unexpected insight: Levinson says that the assessment model helped her take a closer look at the tutorial’s learning objectives. Based on these observations, she estimates that she has made significant revisions to 20% of the slides and moderate revisions to another 10-15%. Deckman had narrated the original online segments in her own voice, but so many slides were revised during this process that Levinson rerecorded the entire tutorial as narrator. The team has since hired a T.I.E. intern to handle the technical work of the online modules so that Mackall can concentrate exclusively on the content and the assessment.

The assessment model that Levinson and Mackall devised this past year puts students in the position of offering advice to a virtual “student” in their writing group. Students who have used the tutorial can thus demonstrate their mastery of the new ideas and skills by guiding a less experienced peer through the analytic reading and writing process. Because it is a writing tutorial that tries to teach quite complex skills, Levinson noted, the assessment has to be self-conducted. But Levinson and Mackall wanted it to be more rigorous than purely attitudinal measures (“How well do you feel you understand…”). They therefore created the character of Matt, who asks his “writing partner” for help in reading for, planning, and writing an analytic paper. Students respond to Matt’s questions via multiple-choice answers, and both correct and incorrect responses are crafted to provide additional writing tips and support. Their initial pilot of two assessment modules, with 10 students, “told us we were on the right track,” according to Levinson.
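The report does not describe how the tutorial is implemented, but as a purely illustrative sketch, a feedback-bearing multiple-choice item of the kind described might be structured as follows. The character name Matt comes from the project; all identifiers, the sample question, and the feedback text here are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Option:
    text: str        # answer choice shown to the student
    correct: bool    # whether this is the intended answer
    feedback: str    # writing tip shown for this choice, right or wrong

@dataclass
class AssessmentItem:
    prompt: str            # Matt's question to his "writing partner"
    options: list[Option]  # every option carries targeted feedback

    def respond(self, choice: int) -> str:
        """Return the feedback attached to the chosen option."""
        opt = self.options[choice]
        verdict = "Correct." if opt.correct else "Not quite."
        return f"{verdict} {opt.feedback}"

# Hypothetical item: Matt asks how to begin planning his analytic paper.
item = AssessmentItem(
    prompt="Matt: I've finished the reading. What should I do first?",
    options=[
        Option("Start writing the introduction.", False,
               "Drafting too early can lock in a weak thesis; plan first."),
        Option("Identify the claim you want to argue and the evidence for it.", True,
               "An analytic paper is organized around a claim supported by evidence."),
    ],
)
print(item.respond(1))
```

The design point this sketch captures is the one Levinson describes: feedback is attached to every option, not just the correct one, so even a wrong answer teaches something about the analytic writing process.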

Levinson stressed that the process has been slow going, in part because the project realistically ranks a fourth or fifth priority behind her primary responsibilities, and in part because of the limitations of the software tool in which they developed the tutorial. Levinson had hoped to create an authentic assessment in which students themselves could highlight, annotate, and write. The team considered the edX annotation tool, but felt it was a bit clunky in its early stages, so they ultimately went with a multiple-choice format. “There’s bad multiple choice, and there’s better multiple choice,” Levinson remarked. They have put considerable effort into making the multiple-choice assessment as meaningful and comprehensive as possible. She reiterated that she is very happy with the tool and is enthusiastic about exploring its impact.