Monday, July 17, 2023

Summer Course Planning: CS414 Game Studio 1

Last semester, I taught my department's new game preproduction class for the first time. I wrote a few blog posts about it [1,2,3,4,5], but it seems I didn't write a summarizing essay. It was a great experience, and I have been tasked with teaching our new games capstone sequence in the coming academic year. It is a two-course Game Studio sequence. In the future, it will be interdisciplinary, but because my department is one catalog ahead of our collaborators, our vanguard is entirely Computer Science majors.

I was glum after having killed my summer side project about a week and a half ago, and so it was originally with some reluctance that I moved into doing some university work. Once I got into it, though, I quite enjoyed pulling my plans together for the first course of the game studio sequence, CS414. I do enjoy this work, but it was also good to have some time away from it.

I posted my course plan as a GitHub repository rather than following my custom of the last several years of making a web site for the course. I was inspired in part by Robert Talbert's repositories (example), which I came across after reading through some of his alternative grading posts. His presentation of EMRF grading stuck in my head for potential use, although I decided against using it in CS414. It was definitely faster to type up a course plan as a directory of Markdown files, but the result is unarguably less elegant than my usual course sites. Consider my CS315 Game Programming site, where JavaScript functions remove the redundancy necessary for allowing both presentation and download of checklists, or CS222, where custom components and stylesheets make achievements look enticing. In my mind, the jury is still out, and I'm not sure which approach I will take for the third course I need to prep this summer. (Also, an unrelated discussion on the SIGCSE mailing list brought me to a coherent argument to stop using GitHub, and now that's stuck in the back of my head as well.)

As part of my preparatory work, I read the second half of Lemarchand's text that we had used for the preproduction class. I decided that we will continue to follow his process, taking the conventional production approach of dividing the rest of our time into an alpha phase, a beta phase, and postproduction. I'm following his advice on the proportion of time to devote to each of these activities, which has the alpha phase wrapping up about a week before the end of the Fall semester and the beta phase finishing two weeks before Spring Break in the subsequent course. I contemplated applying instead the Scrum-based approach to planning that I have used in my one-semester immersive game studio classes, but this is a good opportunity for us to follow Lemarchand's advice from the trenches and also for me to learn something new.

I spent a lot of time thinking about and tinkering with the best process and tools to recommend to my students. There were two key concepts that kept coming to mind. The first is that I want to gain the benefit of how I use two-week sprints in my usual game studio courses. We do this by planning out a two-week increment and estimating the time required for all the relevant tasks. I manage the backlog grooming, story articulation, and prioritization; the students determine which stories to address, the tasks required, and the estimated hours for each task. The students update their hours after every work session, and this lets me draw a burndown chart that plots ideal, steady progress against actual progress. At the end of each sprint, we talk about how it went, using a retrospective format to update the methodology. This approach gives students ownership over the execution of the project, forces them to communicate with each other, gets them into a mode of reflective practice, and makes them think about estimation. It has the disadvantage of not actually being agile, as many methodologists have pointed out: committing to stories for a sprint is the polar opposite of embracing change. This has the corresponding issue that students (and sometimes I!) confuse estimates with commitments; this sometimes manifests as students becoming martyrs for the project, but more often it manifests as students not meeting their goals because they haven't managed their time prudently.
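The bookkeeping behind that burndown chart is simple enough to sketch. Here is a minimal illustration in Python; the function name and all of the numbers are hypothetical, not drawn from any particular team or tool:

```python
# Sketch of the burndown bookkeeping described above (hypothetical data).
# Students estimate hours per task at sprint planning, then update
# remaining hours after each work session.

def burndown(total_estimate, daily_remaining, sprint_days):
    """Return (ideal, actual) series for plotting a burndown chart.

    total_estimate: sum of task estimates at sprint planning
    daily_remaining: remaining hours recorded at the end of each day so far
    sprint_days: length of the sprint in working days
    """
    # Ideal line: steady, linear progress from the estimate down to zero.
    ideal = [total_estimate * (1 - day / sprint_days)
             for day in range(sprint_days + 1)]
    # Actual line: the full estimate at day zero, then the recorded values.
    actual = [total_estimate] + list(daily_remaining)
    return ideal, actual

ideal, actual = burndown(40, [38, 33, 30, 30, 24], sprint_days=10)
print(ideal[:3])   # [40.0, 36.0, 32.0]
print(actual[-1])  # 24
```

Plotting the two series on the same axes gives the familiar picture: when the actual line sits above the ideal line for several days running, the team is behind and a conversation is warranted.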

The second concept that I've wrestled with is that I want to suggest to them an appropriate level of tool support that will help them succeed without getting in their way. This led to a lot of research and consideration. This is not an exhaustive list, but I want to capture a few notes here so that I can find them later.

  • Taiga: Disadvantage: In Scrum mode, forces the use of user stories and story points for tracking work. I do not want to use story points, and I would like to be able to track tasks outside of stories (which is something Scrum permits).
  • HacknPlan: Disadvantage: Enabling hour estimates also enables hour logging, which is something I don't want to use. It has no built-in way to track changing estimates over time: changing the estimate of a task or story is a lossy operation.
  • GitHub Projects: Everything is an "issue," which is not ontologically right, and it has no way to articulate something like a story comprising tasks. (There seems to be such a feature in beta for public repositories, and I think that's being used in Godot's project trackers, but it looks like I do not have access to it.)
  • Lemarchand's Google Sheets Templates: Require manual copying of data across sheets, producing redundancies that any good project management system should eliminate.
  • Yoda: I could not make this work for private repositories.
In the end, I chose HacknPlan. I made a video a few years ago about how to incorporate a manually-updated burndown chart into HacknPlan. Unfortunately, there was a part of the process that I forgot to put into the video: where do the data come from? Since HacknPlan's estimate changes are lossy, one needs to get the daily data from HacknPlan into the spreadsheet. To do this, use the information ("i") button on a HacknPlan board, which shows the current number of work items and hours remaining.

Lemarchand's presentation of the production process leaves a gaping hole in moving from putting together a schedule to executing the work. He points out that if you have more than a few weeks of work, it should be broken down into smaller chunks; I am sure he is thinking of something like Scrum sprints here. What, exactly, is the relationship between the original schedule, the macro document, and sprints, though? 

Here's a small example to illustrate how this becomes complicated. For preproduction, one of my teams articulated the task to model a spaceship; it's a top-priority task and is estimated to take four hours. For the purposes of the alpha milestone, however, it doesn't have to be completely finished: it only has to be representative, enough to allow feedback to be collected and taken into account before polishing. What, then, happens to that original work item on the schedule? A four-hour, high-priority modeling task that has to be done before the end of the project is actually at least two different modeling tasks, with two different priorities, potentially happening at different times or even done by different people. One way to manage this is to decompose the task into two that sit in different user stories, each with different conditions of satisfaction: the first could specify that the model be "good enough for alpha," for example.
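To make that decomposition concrete, here is a small Python sketch. The class, the task names, and the estimates and priorities are all hypothetical; the point is only that one scheduled item becomes two tasks with distinct priorities and conditions of satisfaction:

```python
# Sketch of splitting one scheduled work item into two story-scoped tasks,
# as described above. All names and numbers are hypothetical.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    estimate_hours: float
    priority: int          # lower number = higher priority
    satisfies: str         # condition of satisfaction

# The original schedule entry: one four-hour, high-priority task.
scheduled = Task("Model spaceship", 4, priority=1,
                 satisfies="finished model in the shipped game")

# Decomposed into two tasks that live in different user stories, with
# different priorities and different conditions of satisfaction.
alpha_task = Task("Model spaceship (blockout)", 1.5, priority=1,
                  satisfies="good enough for alpha playtest feedback")
polish_task = Task("Model spaceship (polish)", 2.5, priority=3,
                   satisfies="final art quality for the release build")

# The pieces still account for the original estimate on the schedule.
assert alpha_task.estimate_hours + polish_task.estimate_hours \
    == scheduled.estimate_hours
```

Whether the tracking tool can represent this split gracefully is exactly the tooling question from the list above.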

Coincidentally, I recently re-watched Allen Holub's 2015 NoEstimates talk. It's worth watching, and one of the most important points he makes is that projections are more valuable than estimates. He also advocates (elsewhere, I think) eliminating sprints in favor of simply doing the most important thing next. This inspired me to try something new: abandon Scrum-style sprints in favor of a more agile alpha phase. The students will be responsible for slicing the problem into stories and tasks, and we will use their progress early in production to project whether or not they will meet their goal. I created a spreadsheet template to facilitate this, using data about the number of work items remaining and the number of hours remaining to forecast when the work will be complete. Here are two charts that it outputs based on three weeks of completely fabricated data.


I have scheduled biweekly studio meetings where the teams will be required to interpret these projections and determine what they mean for their project goals. I am eager to see how this compares to two-week Scrum sprints, especially since several of the students have already experienced that approach to game project management. Incidentally, I am happy to receive feedback on my use of the forecast function in the spreadsheet. I've never used it before, and I'm not a statistician.
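For what it's worth, the spreadsheet's FORECAST function is a simple least-squares linear fit. Here is a rough Python sketch of the same projection, using fabricated numbers; the function name and data are mine, not part of the template:

```python
# Sketch of the linear projection behind a spreadsheet FORECAST call:
# fit a least-squares line to (day, hours_remaining) and solve for the
# day on which the fitted line crosses zero. Data are fabricated.

def forecast_completion(days, remaining):
    """Least-squares fit of remaining work over time; returns the
    projected day at which remaining work reaches zero, or None if
    the trend is not downward."""
    n = len(days)
    mean_x = sum(days) / n
    mean_y = sum(remaining) / n
    slope = (sum((x - mean_x) * (y - mean_y)
                 for x, y in zip(days, remaining))
             / sum((x - mean_x) ** 2 for x in days))
    intercept = mean_y - slope * mean_x
    if slope >= 0:
        return None  # no downward trend: work is not burning down
    return -intercept / slope  # day where slope * day + intercept == 0

# Three weeks of fabricated weekly snapshots of hours remaining.
days = [0, 7, 14, 21]
remaining = [120, 105, 88, 70]
print(forecast_completion(days, remaining))  # ≈ 50.6 (days)
```

The same fit over the count of work items remaining gives the second chart; the two projections disagreeing is itself a useful discussion prompt for a studio meeting.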

The final piece of the course design puzzle is grading. My first draft of the course involved some intentional scaffolding and deadlines around things like telemetry and playtesting. As I began to work more toward an agile, student-driven approach, I felt a corresponding desire to remove such requirements and replace them with mentoring. That is, rather than demand a priori that teams have game metrics set up by week four, I will have them read and discuss the book and then consider for themselves when these need to be in place. That may sound like inviting foolishness, but Lemarchand provides some very good, clear checklists around things like metrics and testing. Whereas the portion about managing work is light, these other sections are extremely detailed. Indeed, it's clear from reading the book that perhaps the most important, most stressful role on the project will be formal testing coordinator.

This resulted in a minimalist grading system. Keep in mind that this is a small class, and I have had all of these students in at least one class prior; most I have had in two or three other courses. The alpha milestone is satisfactory if it meets the following criteria, the first of which is a meta-criterion.

  • The project meets the Alpha Milestone criteria described in Lemarchand Chapter 28.
  • At least one formal playtest was conducted, with the results analyzed and presented in a studio meeting, during the alpha phase.
  • Game metrics are incorporated as per the checklist in Lemarchand Chapter 26 (page 275). The integration, analysis, visualization, and use of feedback to refine the game design have been presented to the studio during the alpha phase.
  • An appropriate bug tracking process is established, documented, and put in use.
  • Coding standards are established, documented, and followed.
  • The milestone is presented at the Alpha Milestone Review.
That is most of the semester's work. For individual student grades, I took a specs-based approach:
Meeting these criteria earns a D or better grade:
  • Your team's alpha milestone satisfies its requirements before the final exam period ends.
  • You complete one of your one-on-one meetings with the professor.
Meeting all the previous criteria as well as the following earns a C or better grade:
  • Your team's alpha milestone satisfies its requirements before the final exam period begins.
  • You have completed both of your one-on-one meetings with the professor.
Meeting all of the previous criteria as well as the following earns a B or better grade:
  • Your team's alpha milestone satisfies its requirements by the published deadline.
  • You have successfully started your beta production phase before the start of finals week.
  • You have resolved all issues arising from your one-on-one meetings with the professor.
Meeting all of the previous criteria as well as one of the following earns an A grade:
  • You have served as a playtest coordinator and ensured that all of the requirements and goals of the playtest are met.
  • Another team member gives you a commendation for excellence via email to the professor. Each student may give only one such commendation.

There are few enough students that one-on-one meetings will not be onerous for me, and they will give an opportunity for honest conversation outside of the studio and out of teammates' hearing. The teams should be able to hit the milestone as long as they are paying attention, since they themselves control the scope of their project. I thought about using something like achievements to reward someone for being a playtest coordinator, but I ended up just slotting it into one of two A options. I haven't used a commendation system before, but with such a small group, and where all the work is really transparent to the whole studio, I want to give this a try. I fully expect pairs of students to agree to give each other commendations, and that's fine; they still cannot satisfy the alpha milestone requirements unless more than one student picks up the playtest coordinator role.

That's a good summary of my work last week in preparing for this course. I am not exactly in a rush for the semester to start, but if it started tomorrow, I'd be happy to start working with these students.
