Student Peer Assessment: Peter Graf Interview

An interview with Peter Graf

Peter Graf, professor of psychology at the University of British Columbia, understands the value of personalized feedback for student learning. However, such feedback can be challenging to deliver in his large classes of more than 250 students. With the support of the Teaching and Learning Enhancement Fund, he developed a student peer review project that motivates students and develops their critical thinking.

Please tell us about your project.

I teach introductory psychology classes, Psych 101 and Psych 102. Each class has about 250 to 300 students. I wanted to give students in these large classes an opportunity to develop their writing skills, hone their ability to read critically, compare their work to the work of peers, and engage more fully in the process of learning to write and to give feedback on written work. We all find it much easier to find flaws in the work of others, and we learn from the flaws we discover.

I ask students to write a 1,000- to 1,200-word essay on a topic that connects psychology with something that happens in real life. Students submit the essay to a platform that we use for circulating essays to reviewers. They also submit their essay to TurnItIn for an originality check.

Then, every essay is assigned to six anonymous reviewers. The reviewers have two weeks to complete their reviews. The teaching staff review the reviews to make sure they are valid [and consider questions such as]: Are the reviews consistent with the rubric? Is the written feedback both critical and constructive?

The median of the peer grades is used for essay grading. Students have the right to challenge the grade they get and ask the instructor to regrade it. About three out of 300 students end up asking for that.
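The grading scheme described above can be sketched in a few lines. The grade values below are hypothetical, chosen only to illustrate why a median is a reasonable choice here:

```python
# Minimal sketch of the grading scheme: each essay receives several peer
# grades, and the median becomes the essay's grade. Values are made up.
from statistics import median

peer_grades = [78, 82, 85, 70, 90, 88]  # six anonymous reviewers (example values)
essay_grade = median(peer_grades)
print(essay_grade)
```

Unlike a mean, the median is robust to a single unusually harsh or generous reviewer, which helps make the resulting grade defensible to students.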

What motivated you to initiate the project?

As an undergraduate student, I learned the most in small classes, where there was a strong emphasis on writing and presentations. Students in large classes typically don’t have such opportunities.

If we want students to learn to read critically and to express themselves more effectively, we must create opportunities for developing and practicing the relevant skills. At some point, Catherine Rawn [Professor of Teaching in the Department of Psychology] and I got together and said, “What can we do to change this introductory course in such a way that we give students more opportunity for critical reading, critical writing?” That’s why we got into peer assessment.

However, it was clear right from the beginning that students were quite reluctant about, or unhappy with, this kind of assignment. Students don’t know how to do peer assessments and believe that assessment can be done only by instructors.

This showed up in two ways. First, obviously, we got complaints. That isn’t a big issue, because you just make sure students have the opportunity to have their paper regraded by the TA. Second, some students simply switched classes (sections). That’s when we knew we needed to address the problem.

How did you do it?

In the past, Catherine and I have addressed that fear in a class through readings and so on. But, quickly, we could see that part of that fear comes from the fact that students just don’t know what peer assessment is.

That’s why the peer assessment training workshop came about. The workshop introduces students to peer assessment, requires them to become familiar with the assignment description and the scoring rubric, and allows them to practice peer assessment skills.

Students learn by doing. You need to get them doing something. So, students have to become familiar with the assignment description and with the rubric. They have to practice by completing peer reviews on sample essays using the rubric.


Did you have the support you needed for the project?

We had great support in connection with the peer assessment training workshop through the Teaching and Learning Enhancement Fund. The project changed a lot, even from the grant application to the actual way of developing the workshop. We had lots and lots of help, and that made all the difference.

What were some of the key outcomes of the project?

Once students have done the peer assessment training workshop, their attitude toward peer assessment changes. They no longer think it’s inappropriate to be assessed by peers, or for their grades to be determined by a peer’s assessment. They also lose the conviction that they themselves would be good at doing peer assessments but that their peers would not be.

Also, students are more engaged with the feedback provided by reviewers — they question the validity of the feedback. Students should get the experience of disagreeing with reviews and criticizing the reviews that they received. You never get that experience if you’re graded by a teacher. I think it’s useful if we can increase our students’ awareness of the subjectivity of written and spoken words.

A significant outcome, in my view, is that students are better equipped to assess and predict the quality of their own work — the difference between what grade they expect for their work and the grade they receive is reduced by around 50 per cent after completing the workshop. The evidence we have shows that this narrowing of the gap between expected and actual performance is due in part to essay grades being higher by about five per cent, and in part to students lowering their expectations.

So, the peer assessment workshop makes students write better quality papers, and it makes them have fewer unjustified assumptions about the quality of their work.

How did the project impact learners or how you teach?

Students now start becoming more actively involved in looking at the feedback, and saying, “Hey, this is what the assignment was all about, and this is what I did.”

I do a lot more learning by doing. Learning is really something that requires activity. Today, in my classroom, there’s far more opportunity [for students to be active], either in the classroom or online.

What lessons have you learned that you want to share with your colleagues?

The biggest challenge is probably the administration. It’ll take a bit of effort to learn to set it up. But, ultimately, it’ll run smoothly, and it’s fail-safe.

The other thing is that students have lots of fear about being assessed by other students. So some initial hand-holding, explaining why peer assessment is being done, is important. Share with your students that peer assessment:

  • Helps students understand the audience, the readers of their writing.
  • Helps students develop the skills to read critically and to write critically, to make critical constructive comments on peers’ work. This is a life skill you need anytime you go into a job.
  • Helps students develop the skill to deal with a diversity of reviews, so that eventually they can produce a better product.

Finally, I would really advise anybody to assure students that the median is, in fact, a fair grade. Again, just telling them is probably not enough. The thing to do is have them grade their own essay against the same rubric and see whether they arrive at the same grade. Once you take all these steps, I think you’re doing a great job.

What are the future plans for this work? Any changes that you want to make?

I would like it if the platform (used for allocating essays to reviewers and for collecting reviews) were integrated with TurnItIn. A single gateway would make administering the process a lot easier.

I’ll probably make this a slightly heavier component in the future and make sure that everything gets fine-tuned. I want to start doing the peer assessment with a spoken presentation and a poster. Every so often, I teach Psych 333, the third-year memory course, which has only about 120 students. Students in that class typically write longer essays. They are also expected to do a PowerPoint presentation on one or two research reports or to present them on a poster. Next time I teach 333, I’ll probably do that with posters.


How UBC faculty have incorporated Student Peer Assessment

Silvia Bartolic

Silvia introduced SPA as a way of sharing her sociology students’ work with their peers. She explains the challenges and lessons she found along the way.

James Charbonneau

James explains how peer evaluation in his physics class, which began as a community-building exercise, evolved into the student peer review platform ComPAIR, and why safety matters in peer assessment.

Kevin Chong

Peer review is one of the foundational pieces of creative writing. Kevin shares how he brought this traditionally small-workshop practice to large lecture classes, and why it matters to developing writers.

Peter Graf

Personalized feedback for student learning can be challenging to deliver in large classes. But beyond that, Peter sees peer assessment as an opportunity for students to develop important critical reading and self-assessment skills.

Misuzu Kazama

It’s far more common for peer review to be applied to writing tasks than spoken ones. Can language students give each other good feedback on a spoken assessment? Misuzu developed a project with real-world context to find out.

Kelly Allison & Marie Nightbird

Interpersonal communication is a key skill for social work students. After using informal peer feedback to develop those skills for many years, Kelly and Marie share how they formalized the process to gather insights and improve the student experience.

Jennifer Walsh Marr

From a starting point of investigating accountability in group work, Jennifer’s peer assessment project led to more student-centred teaching, and a better sense of community for her students.
