We are now two sprints into the CS315 game programming class. I have not updated this blog as often as I intended, but I suppose that means I've been busy, which may even imply that I've been productive. The advantage of being healthy in a house full of sick people is that everyone else is sleeping, and I still have the energy to do some writing.
I am a little surprised to see I haven't actually written much about this class since before the semester started. We are in the middle of the fifth week of the semester, but let me start by explaining what happened in the first week.
One week of training
The students were introduced to the "course" by being told that it was not a conventional course at all. Rather, the students were in an orientation week for their positions working for 3:15 Studio, and their project was the implementation of a game based on Morgan's Raid. They were told that the orientation and training would last one week, and we'd move from there right into production. I was also explicit in saying that they were expected to work ten hours per week. This is something I've started telling all of my classes, based on the arithmetic that 40 hours per week is "full time" and 12 credit hours is "full time": that ratio implies roughly 3.3 hours per credit hour, so a three-credit course works out to ten hours per week. With this class, though, it was not a recommendation to allocate that much time but a statement that this was the amount of work required. That is, from day one, this was presented as a situated learning experience, in which the students comprise one studio under my direction.
The rest of the first meeting consisted of introducing the game design, the stakeholders, and the historical context. The students were given a quick introduction to Mercurial and the MercurialEclipse plugin, and they were shown how to upload an image to a shared repository. Their first assignment, then, was to upload a portrait to a shared repository to help us get to know each other's names.
The second day (Wednesday) was a breakneck introduction to test-driven development, entity system architectures, pair programming, and Scrum. This was done in a lecture-oriented style, primarily as a means to maximize throughput. I anticipated — and experience has confirmed — that not much of this would be retained by the students, but it did help them build a mental framework for the level of rigor expected of them. It also resulted in the studio shrinking from thirty students to twenty-five, which I see as a positive step: students who were not willing to make the investment elected to choose another course. (It is an unfortunate phenomenon that, even at the 300-level, students mistake "Game programming" for a light and fun elective, when in fact it is grueling and fun.)
The third and final meeting of the week (Friday) was mostly devoted to team formation and task allocation. Based on the number of registered students, I directed them to form four teams. The teams were formed primarily along pre-existing social networks, at least according to my informal observations. At one point, there were three complete teams formed, but a headcount revealed that two students had not actually been incorporated into a team. Two of the teams shed members, who joined these two students to form the final team, which I will refer to later as "that team I mentioned before."
For the first sprint, I wanted to give the students an "easy win" in order to build morale, and I wanted to ease them into Scrum, entity systems, and TDD. Based on this motivation, I set up a product backlog consisting of user stories that were all at the task level. That is, I had pre-sliced problems into tasks achievable by pairs of students. These were, in a sense, not proper product backlog items, since many had to do with software architecture rather than anything that has value to the stakeholders, but to me this was a reasonable accommodation under the circumstances.
Sprint One: Two weeks
The first sprint was a success. All of the teams delivered their product backlog items (PBIs), modulo some miscommunication and minor revisions. Most of the tasks were fairly straightforward implementation tasks, the sort that would take me under an hour but that I expected would take the students much longer, given how steep the learning curve was. It's worth mentioning that the team I mentioned before happened to pick up the most difficult of these tasks. Given that the team was a hodgepodge of students rather than a motivated group of friends, I was concerned about how they would handle it, but this team came together in an exemplary way to solve their problems. (More on that in a later blog post, perhaps.)
There was a case around the middle of the sprint where some students broke the build — did I mention we're also using CruiseControl for continuous integration? I took the opportunity to shame them on the studio mailing list, and in that message, I reminded everyone to be careful about what code they push. I emailed these students privately and thanked them for being good sports about being the first to break the build, dooming them to be my sacrificial lambs for the good of the team. I knew these guys, and I knew they could take it. Good job, men. Since then, the only real "breaking" of the build has involved images that were too large and non-portable Java code that works on only some operating systems.
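The post doesn't show the offending code, but one common species of non-portable Java — the kind that runs fine on a student's Windows machine and breaks on everyone else's — is a hard-coded path separator. A minimal, purely illustrative sketch (all names are hypothetical):

```java
import java.io.File;

// Hypothetical illustration of OS-specific code that can pass on one machine
// and break the build on another: hard-coded path separators.
public class PortablePaths {
    // Fragile: assumes backslash separators, so the path is wrong on
    // Linux or Mac build machines.
    static String fragilePath(String dir, String name) {
        return dir + "\\" + name;
    }

    // Portable: let java.io.File assemble the path for the current platform.
    static String portablePath(String dir, String name) {
        return new File(dir, name).getPath();
    }

    public static void main(String[] args) {
        System.out.println(fragilePath("assets", "map.png"));  // backslash everywhere
        System.out.println(portablePath("assets", "map.png")); // correct separator per OS
    }
}
```

Loading resources through the classpath (`Class.getResourceAsStream`) avoids the problem entirely, which is the kind of fix a continuous integration server running a different OS tends to force on a team.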
The production portion of the sprint ended on a Thursday, and the next day we held a combined Sprint Review and Sprint Retrospective. This is really a place where the University's scheduling system is a challenge, since we had to do both in the space of 50 minutes. We could have used more time, but we made the most of the time we had. The Sprint Review revealed one of the major weaknesses of the product backlog design: because the PBIs were primarily architectural, I was the only one of the three Product Owners who could appreciate that progress had been made. The programmers and I were delighted just to get a window open with anything in it, but I think my collaborators expected a little bit more. I assured them that this was a necessary step to get the students up to speed.
The Sprint Retrospective was amazing. We had two timeboxes: ten minutes to discuss what went well, and ten minutes to discuss what could be improved. We could have kept going with our list of successes, and I could feel the excitement and morale in the air. As for the things to improve, many were very simple to address. We left the meeting on a high, helped along by Concannon's chocolate chip cookies. One of the important tips I got from Clinton Keith's book was to take the time to celebrate at the end of a sprint, and with a mix of under- and over-age students, cookies seemed like the safest bet.
After this meeting, I met with my co-PI and we reprioritized the product backlog. Moving into the next sprint, we consciously revised the PBI format to deal strictly with those things that add value to the product owners.
These are the burndown charts for the four teams for Sprint 1.
I used Scrum a few semesters ago in game programming, and since then I've studied it more and applied it with more care. My recollection, however, is that it took us several sprints to get anything approaching the steadiness shown in some of these burndown charts. Some of the spikes are interesting: they mark places where teams thought they had finished their tasks, then discovered that someone else had pushed code that broke or overwrote theirs (casualties of learning Mercurial), and then had to clean up the repository and get their solutions back in place.
Sprint Two: A short, 1.5-week sprint
Sprint 2 ended today, and it was a short one due to weekend conferences: the other lead investigator on the project and I would both be away, so we couldn't run a two-week sprint as planned. Instead, we decided to make Sprint 2 a short sprint followed by a slightly longer Sprint 3.
I had overestimated the diligence of my students when, the weekend before the Sprint Planning meeting for Sprint 2, I emailed some comments on the revised product backlog and sprint backlog formats. As mentioned above, I had revised the product backlog into items that required slicing by the students, as opposed to pre-sliced tasks, and so I had also revised the Sprint Backlog format so that a PBI could be copied in with its individual tasks listed below it. At 2pm we started Sprint Planning, and it quickly became clear that the students had not appreciated this difference. Teams started committing to PBIs before taking the time to see that these were much heavier PBIs that needed breaking down. I had hoped they would notice this on their own, but when I saw that teams were also changing the sprint backlog format back to the old one, I interrupted and explained what they should be doing: pull down a PBI as a team, break it into tasks, commit to and estimate those tasks, and only then commit to another PBI if the team has excess capacity.
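That planning flow can be sketched in code. This is purely illustrative — the studio presumably tracked its backlogs in shared documents, not software, and every name and number below is hypothetical — but it captures the rule the teams missed: a PBI must be sliced into estimated tasks before it is committed, and further PBIs are pulled only while capacity remains.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical model of the Sprint Planning flow described above.
class Task {
    final String name;
    final int hours; // the team's estimate
    Task(String name, int hours) { this.name = name; this.hours = hours; }
}

class ProductBacklogItem {
    final String story;
    final List<Task> tasks = new ArrayList<>();
    ProductBacklogItem(String story) { this.story = story; }
    int estimatedHours() {
        return tasks.stream().mapToInt(t -> t.hours).sum();
    }
}

class SprintBacklog {
    private final int capacityHours; // e.g., 10 hours/week x team size x sprint length
    private final List<ProductBacklogItem> committed = new ArrayList<>();

    SprintBacklog(int capacityHours) { this.capacityHours = capacityHours; }

    int remainingCapacity() {
        return capacityHours
            - committed.stream().mapToInt(ProductBacklogItem::estimatedHours).sum();
    }

    // Commit a PBI only after it has been sliced into tasks, and only if it fits.
    boolean commit(ProductBacklogItem pbi) {
        if (pbi.tasks.isEmpty() || pbi.estimatedHours() > remainingCapacity()) {
            return false;
        }
        committed.add(pbi);
        return true;
    }
}

public class SprintPlanningSketch {
    public static void main(String[] args) {
        ProductBacklogItem map = new ProductBacklogItem("Show the county map");
        map.tasks.add(new Task("load tile images", 6));
        map.tasks.add(new Task("render the grid", 8));

        SprintBacklog sprint = new SprintBacklog(20); // hypothetical team capacity
        System.out.println(sprint.commit(map));           // true: sliced, and it fits
        System.out.println(sprint.remainingCapacity());   // 6 hours left for another PBI
    }
}
```

The point of the `tasks.isEmpty()` guard is exactly what went wrong in the meeting: committing to an unsliced PBI is rejected, because without task estimates a team cannot know whether the item fits its capacity.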
The unfortunate result of this was that, rather than having a contiguous block of high-priority PBIs selected for the sprint, it was a bit of a toss-up. There was not time in our 50-minute timebox to start over, and so some rather important items were left unclaimed. Also, the teams shied away from high story point PBIs for fear of complexity in a short sprint, and I'm not sure that was unwise.
I felt a little more tension this sprint as more students had to use pieces created outside their team. Also, as I worked with teams, I noticed the overall quality of the code getting worse. It's not so bad as to be irredeemable, but there are some code smells that would make a seasoned professional reach for a handkerchief. The problem, of course, is again that this is a team of novices who cannot yet recognize code smells.
Even though production was supposed to stop yesterday, I helped one team pull off a last-minute victory before today's Sprint Review and Retrospective. The review went well, and my other two Product Owners agreed that it was good to see something actually working, rather than just looking at code that only I could appreciate. There were a few more loose ends this sprint: not all of the PBIs were completed according to their conditions of satisfaction, although the gaps were fairly minor. One team had committed to PBIs on the product backlog that it had neglected to copy to its Sprint Backlog, but I think they were embarrassed enough by this that they won't do it again. Much of this was fallout from the rough Sprint Planning meeting.
During the Sprint Retrospective, the students were much more pensive. No longer were they champing at the bit to shout out the first positive thing that came to mind; instead, their contributions were more considered. This is natural, I think, and it shows a maturation of the studio. I was glad to see that inter- and intra-team communication, although mentioned as a success in Sprint 1, was still listed here, because communication is the key to keeping this studio together and making the project succeed.
We are currently discussing via our mailing list a few ways to address shortcomings in the system. One of the positive outcomes of this is that now, I think the students are all hungry for more information about software design and refactoring. They are feeling the real pain of real software development. This is very different from other teaching experiences—even my other class—where I have tried to teach the solution without the students really feeling the problem. It looks like we'll have some Community of Practice meetings in which I will lead code reviews or refactoring/redesign sessions. The students are hungry for it, and that should help them absorb some rich ideas.
Because I have them, here are the Sprint 2 burndown charts. They are in the same team order as the previous set, so feel free to compare. My only significant comment about them is that, even though the teams were slicing up PBIs themselves for the first time (and many of these students had never sliced work, or even worked on a team, before), the charts are still remarkably steady.
Onward to Sprint 3
While I am at the 2010 Consortium of Computing Sciences in Colleges Midwest conference, my students will be responsible for running the Sprint Planning meeting for Sprint 3. I have sorted out the highest priority PBIs, though I still need to do some cleaning up further down the backlog (which, incidentally, currently contains about 85 PBIs). I am confident that they will do a fine job in my absence.
This has been a fantastic teaching opportunity for me. I have invested more time and emotional energy into this project than any other venture since my doctoral dissertation, and I think the students appreciate being involved in this experience. My one niggling fear is that the whole experience cannot be replicated without expending this herculean effort each time, but I will leave the sustainability of immersive learning for another post.