Friday, December 18, 2020

Reflecting on CS445/545 Human-Computer Interaction, Fall 2020

This is the third in my sequence of end-of-semester reflections [1,2], and today's topic is my Human-Computer Interaction class. This was the course that I ended up designing twice this past summer. The end result was a great experience. Out of my three different courses in Fall 2020, I think this is the one where I was most consistently impressed with students' dedication to tackling the ideas of the course. And how many ideas there were! With the removal of face-to-face discussions, and knowing the limitations of discussion boards, I presented a wider variety of ideas to the students this semester than in the past. I will proceed by giving some brief thoughts on these additions.

One of the most obvious, sustained additions was Open Mind Platform, which I wrote about in October. It went as I had hoped: I saw evidence that students learned from the content of Open Mind itself but also thought critically about its implications for HCI design. I have also been in touch with the team that manages the platform, and I have reviewed some revisions that they are making to further improve the teacher's experience. Their dedication to improving Open Mind excites me, and I am glad that I was able to play a part in it. In fact, I recently added Open Mind Platform as an Achievement in my sketch of CS222 plans for Spring 2021, which will be the topic of another day's blogging.

I think I made good use of Scott Klemmer's lectures, balancing them against several of my own lectures that gave my perspective on various HCI topics. Whereas Klemmer's presentations were generic, I was able to focus mine on the specific assignments and tasks that I had in mind for my students. I do not know when next I will be asked to teach this course, but I will need to consider how I might reuse some of this course design for face-to-face classes.

Another valuable addition was Nielsen's Usability Heuristics. Given the challenges of empirical testing in the face of a global pandemic, performing heuristic evaluation was much more practicable. Nielsen's approach combined well with the ideas in Norman's Design of Everyday Things, Gestalt principles, and the accessibility heuristics we explored, showing students that there were multiple valid and useful ways to evaluate a system.

I am pleased with how the final project went. I assigned the students a challenge that I myself faced earlier this year: taking a traditionally in-person Student Showcase from the CCSC Midwest conference and making it an online asynchronous event. Rather than just throw them at the problem, I scaffolded a series of six weekly assignments that walked through a reasonable, human-centered process. Only two weeks out of the six were spent on building, which I think surprised many of the students. The technical artifact was not the primary outcome of the process but rather a secondary one: the primary outcome was the report that students built on week by week. In the end, these ranged from about 15 to 50 pages, depending on students' use of images and their attention to detail. After reading their final project submissions, I recorded a response video in which I encouraged them to consider the extrinsic value of this document, given that it provides clear evidence to potential employers that they themselves can follow best practices of design—not just parrot a few definitions from a textbook.

The final project was not without problems. Some students fell behind, and there was little recourse for them. I provided verbal instruction on how to progress; for example, if someone was unable to get a build working, I told them to reach out to classmates and continue the project using someone else's build. However, no one did that: they either made something up or gave up on the project. It is worth noting that the view counts on my weekly feedback videos were noticeably lower than the course enrollment, so I am not even sure that the students in trouble knew that I had given them instruction.

The best of the reports explicitly compared their results to the one that I used. I did not require this, but I was surprised that only a few of the students thought to use my solution as a benchmark. I was a little disheartened to see how many students praised their solutions based on limited understanding and testing, but most seemed to be honest about their solutions' strengths and shortcomings.

While planning for this semester, I went back and read my post about the Fall 2019 offering, which regular readers may recall was particularly challenging. Seeing the high-quality work of these students, I wish I had had them in a semester with a properly community-engaged project! The CCSC Midwest Student Showcase was a fine case study, but it was still primarily about my work as opposed to a community member's. Ah well, if my biggest regret is that I could not get these students working on something even more public, then I think that means it was a great semester.
