Criterion | 3 | 2 | 1 | 0
---|---|---|---|---
Commitment | Attended all scheduled team meetings or notified the team of absence. | Missed team meetings, with notification, often enough to be problematic. | Missed one or more team meetings without notifying the team. | Regularly missed team meetings without notifying the team.
Participation | Contributed to project planning, implementation, testing, and presentations. | Did not contribute to one of the following: planning, implementation, testing, and presentation. | Did not contribute to two of the following: planning, implementation, testing, and presentation. | Did not contribute to three or more of the following: planning, implementation, testing, and presentation.
Communication | Reports clearly on what has been accomplished, what is in progress, and what stands in the way, thereby facilitating progress. | Is sometimes unclear about what has been done, what is in progress, and what stands in the way, creating minor impediments to progress. | Is regularly unclear about what has been done, what is in progress, and what stands in the way, creating significant impediments to progress. | Communication patterns directly disrupt team progress.
Technical contributions | High-quality technical contributions that facilitate the team's success. | High-quality technical contributions that do not directly facilitate the team's success. | Low-quality technical contributions that frequently require rework by other team members. | Low-quality technical contributions that inhibit the team's success.
Attitude and Leadership | Listens to, shares with, and supports efforts of others, and actively tries to keep the team together. | Listens to, shares with, and supports the efforts of others. | Frequently fails to listen, share, or support teammates. | Displays an antagonism that inhibits team success. |
I have been conducting the evaluations on paper and transcribing the data into a spreadsheet for analysis. I realized last academic year that the analog form was preventing me from using self- and peer-evaluations more frequently. I suspect that if teams had to give and receive this feedback more regularly, more teams would be able to identify and mitigate collaboration problems. I decided to require self- and peer-evaluations for each iteration of my Game Programming and Advanced Programming courses, in part to push me to explore a streamlined, digital solution.
My first stop was Blackboard, which my university requires, at minimum, for entering students' final grades. I found no support for the kind of group-restricted peer evaluations that interest me.
Enter Google Forms. If you have not used it before, it is an easy-to-use tool for creating forms, distributing them, and analyzing the results via integration with Google Sheets. I created a template that converts the tabular rubric into a series of five questions on a zero-to-three scale. To use the form for a particular class, I copy the list of student identifiers from Blackboard and paste it into the first two questions: the selection of the evaluator and the subject under evaluation. I could not find a way to make the template publicly readable without also making it publicly editable, so if you want to try the form yourself, contact me and I'll share it with you via Google Docs; you can then make your own copy.
Once the students have completed the form, I use a pivot table to summarize the results. Specifically, I add a row for the subject of evaluation ("evaluatee"), then add values for the five rubric categories, summarizing each by median. Because the pivot table is itself a sheet, I can add a new column that sums the five medians to produce each student's contribution score.
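For anyone who would rather script this step, here is a minimal pandas sketch of the same computation. The file name and column names are hypothetical placeholders for however you export the form responses.

```python
import pandas as pd

# Hypothetical CSV export of the form responses; the column names
# below stand in for the actual question titles in the form.
responses = pd.read_csv("responses.csv")
categories = ["commitment", "participation", "communication",
              "technical", "attitude"]

# Mirror the pivot table: one row per evaluatee, with the median of
# each rubric category across all evaluations that student received.
medians = responses.groupby("evaluatee")[categories].median()

# Mirror the extra column: sum the five medians into a contribution score.
medians["contribution"] = medians[categories].sum(axis=1)
```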
Getting this information back to the student is still a manual process. My current approach is to make a new column in Blackboard's grade book and then sort the grade book table by student identifier to match the sequence in the pivot table. I step through, student by student, entering the contribution score and, in the comments field, the five medians in a simple, space-separated format. While this is somewhat tedious, it does force me to look at each line and identify whether I should intervene with a team or an individual.
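If the transcription ever becomes too tedious, the hypothetical `medians` frame from the sketch above could generate paste-ready lines, sorted by student identifier to match the grade book ordering:

```python
# One line per student: contribution score, then the five medians in
# the space-separated format used in the comments field.
for student, row in medians.sort_index().iterrows():
    comment = " ".join(str(row[c]) for c in categories)
    print(f"{student}\t{row['contribution']}\t{comment}")
```

That said, the manual pass has the side benefit described above: it forces a look at each student's numbers.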
The form cannot validate that each student completes it exactly once for each peer. Another quick pivot table makes this easy to verify, however: add a row for each evaluator, add a column for each evaluatee, add the evaluatee data, and summarize by "COUNTA". Anyone who has submitted more than one evaluation of the same person stands out in the table, and each row's total should equal the number of people in that student's team. This technique let me recognize that a few students had submitted multiple evaluations of a partner, presumably by mistake, since the duplicated data matched.
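The same verification is straightforward in pandas, again under the assumptions of the earlier sketch; a crosstab plays the role of the COUNTA pivot:

```python
# Count evaluations per (evaluator, evaluatee) pair,
# as COUNTA does in the Sheets pivot table.
counts = pd.crosstab(responses["evaluator"], responses["evaluatee"])

# Any cell above 1 is a duplicate evaluation of the same person; each
# row total should equal the evaluator's team size (self-evaluation
# included).
print(counts[(counts > 1).any(axis=1)])  # evaluators with duplicates
print(counts.sum(axis=1))                # per-evaluator totals vs. team size
```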
This approach presumes that students will be honest in identifying themselves, but the whole self- and peer-evaluation system already presumes honesty. I talk to my students about this: I assume they will be honest with me because I have been honest with them.
Hopefully these notes will be useful to others who are thinking about how to conduct peer evaluations. If nothing else, this will help me remember how I handled the first iteration so that I can replicate it in the second!