Thursday, December 30, 2010

CS315: What We Learned

During the "final exam" meeting for CS315, I conducted an exercise similar to the one at the end of CS222. I gave the team 30 minutes to list all of the significant things they learned during their 315 Studio experience, and this resulted in 78 items. The list of 78 was consolidated to 76 to remove redundancy. I gave each student three stickers, which they used to vote on the items they thought were most significant. When we isolated the top 10% by votes, we ended up with this list:
Interestingly, only two of these—the top two—are explicit learning objectives in the course syllabus. The rest deal with the practice of developing a game. Just below these top vote-getters were similarly useful tools: Mercurial, Javadoc, and shell scripting.

The surprising thing to me is that there were almost no items listed that dealt with team communication. At the end of each sprint, we held a Sprint Retrospective, during which time the team discussed what went well and what did not go well. Team communication (within teams and across teams) was listed every sprint as something that was going well, and so I assumed that given the opportunity, they would articulate team communication as a lesson learned. I mentioned my surprise in our debriefing after the exercise, and one of the students astutely observed that interpersonal communication was considered a "background" skill, something that was taken for granted. Given more time, I would have liked to have explored this theme, but we were already running late.

I am left with this question: what does it mean that students consider interpersonal communication to be different from other skills developed in the course of a project? Is this an artifact of the university infrastructure, an effect of this particular course design, or just human nature to consider it so? There were no explicit team-building or communication-enhancing activities: there was just the project, run through Scrum, and the associated reflective retrospectives. This might be enough to dismiss the point, since students were not explicitly exposed to interpersonal communication as a subject of study, except that "coding for reuse" was also not separately identified, and this made the top six.

Fall CS222 Redux

I'm teaching CS222: Advanced Programming again in Spring. Fall was the first offering of the course, and in the grand scheme of things, I am happy with the results. However, I think there are many opportunities for improvement. The course had two major components:
  1. Daily writing assignments, many of which were based on readings of Effective Java, relating the lessons of that book to current and past programming tasks.
  2. A few small individual programming assignments early, a two-week pair project at the early-middle, and a six-week team project delivered in two milestones.
In keeping with the "2" theme, there are two key observations I have upon reflecting on the course. The first is fairly simple: when we spent some time reflecting on how we learned, reading Effective Java did not come up. I spoke with some students after the meeting who admitted that for most of the class, they saw this as busywork, but towards the end, they could see more how it related to their work. Some students complained that the readings weren't tied to lecture and that they didn't like the "textbook," which betrays the fact that they didn't really get the point: it's not a "textbook," and it was supposed to affect practice, not waste time being reiterated in lecture.

The other observation is more confounding. In the six-week project, there was a milestone deliverable and presentation at week three. I explained to the class that it was purely a formative assessment: it would let them get feedback and see what their grade would be if they turned this in. In addition, the students were given a checklist of the features and ideas I expected to see in the project (such as automated unit tests and evidence of best practices a la Effective Java). Many of the milestone deliverables lacked required elements, and so their formative evaluation reflected this. I was explicit in my feedback about what was lacking. In the final deliverable, at the end of the semester, most projects still lacked those elements that were lacking at week three. That is, there is no evidence that any team actually used the formative evaluation to influence their practice!

In the final analysis, another interesting and perhaps predictable trend emerges: students' grades on the writing assignments were not significantly different from their grades on the programming assignments.

Here are some ideas I'm kicking around for revising the course. I welcome your comments.
  • More programming projects. They should be programming all the time.
  • Require programming projects to have accompanying technical reports that explicitly connect the artifact to theory.
  • Rather than provide goalposts for students to show competency in specific skills, provide a list of concepts and allow students to demonstrate competency in their own way. That is, provide them with a checklist of sorts that they keep throughout the semester, and that I check off when I believe they have shown adequate competency in an area. The course grade then would be directly and transparently derived from this document.
  • Rather than assign specific sections of Effective Java as I find them interesting, require students to browse the book and understand its structure, then require them to incorporate relevant areas into their programs throughout the semester. That is, replace assigned reading-and-writing assignments with student-directed inquiry.
  • Introduce Mercurial very early in the semester and require all projects to be hosted. This way, I can easily pull up any student's project during lecture to use as an example without fiddling with switching laptops or strange computer configurations.
  • Mandate some kind of individual accountability into team projects, such as modified Sprint Burndown charts. This would help students learn to identify tasks and estimate effort in addition to increasing the transparency of team operation.
The main problem for which I do not yet have a solution comes down to the teams' need for individualized consultation balanced against teams' dominant interests in their own projects. The students develop their own project ideas, and so each project is different. Hence, the concepts and tools that one team needs to succeed will be different from another's. For example, I had one team in Fall who were using Android: they would have benefited from more time spent with me discussing how to architect Android applications, but this would probably not have been motivating to any other team. By itself this is not a problem, since I could add "work days" to the class schedule, during which time I would circulate among teams and provide more guidance. (Note that office hours inevitably do not coincide with student working hours: they're in class while I'm available to help, and when they're working, I'm at home.) There is an opportunity for students to share their individual findings through class presentations, but I have not yet found a good mechanism for adequately incentivizing both presenters and the audience. I do not want to introduce inauthentic artifacts purely for assessment, such as graded quizzes or audience feedback, but I have a hard time conceiving of an authentic incentive that could be used.

For what it's worth, I'm still quite on the fence regarding adding "work days" to the course meeting schedule. The class does not have a scheduled lab component, but maybe it needs one. On this issue, I have a hard time separating my knowledge about teaching and learning, my intuition, and my own K-12 and undergraduate experience. Added into the mix, I am not sure how many of these problems are really due to structural issues beyond my control, namely the fact that this is one of several disconnected learning experiences that students are engaged in. However, this would certainly help with my sense that I was talking too much in class.

Saturday, December 11, 2010

CS Ed Week

Today marks the end of 2010 CS Education Week. I could enumerate all the reasons why this is significant, but it's certainly easier for you to just follow that link and see for yourself.

In honor of CS Education Week, I want to reflect a little on how the field of CS Education has impacted me personally and professionally. I have always had a penchant for teaching, possibly because both of my parents were trained to be teachers. (Not that it's the training that matters per se, but that they both had interest and aptitude in the pursuit.) I had several teaching experiences prior to entering graduate school, and I knew that it was something for which I had a passion. However, entry into the academy is not through a focus on teaching, but rather on the individual intellectual pursuits represented by the doctorate. I spent seven years at the University at Buffalo, first on my masters and then on my doctorate. Although I would occasionally talk with my advisor about my interest in teaching, he sagely recommended avoiding teaching until the doctorate was complete: both are at least full-time jobs, and one who begins a lectureship tends to have much more difficulty finishing the dissertation.

I taught a few courses while at UB, but it wasn't until I became an Assistant Professor at Ball State University that I could really focus on what it meant to be a good teacher. I found it challenging to keep up with my work on JIVE, partially due to the distance from UB and partially due to the stresses of the new job, since BSU is not a research-focused institution: with no grad students to work under me, I was unable to keep the pace of research and development.

My interest in design patterns and games led me to explore the intersection of these ideas with students, and this led to my first CS education publication, "Computer Games as a Motivation for Design Patterns," which I presented at SIGCSE 2007. This was my first time attending the conference, and it was an eye-opening experience. I learned more about education research and realized how many more opportunities I could make for significant research, but most importantly, I was inspired by being surrounded by a thousand CS professors who care deeply about student learning. Most professors are good folks who want their students to learn, but the SIGCSE community is different: these are scholars who have devoted their lives to helping make computer science education better. I'm proud and humbled to be among their ranks.

I have also become involved in the Consortium for Computing Sciences in Colleges, Midwest Region. I think I first went to this conference in 2006, inspired primarily by the idea of bringing a team of undergraduates to the programming competition, even though I didn't know any of the three members of the team. Since then, this has become one of my favorite annual trips, gathering several of our high-achieving students and spending two days chatting about research, education, and life. The regional conference serves a role similar to the international SIGCSE conference: it brings together a vibrant community of dedicated faculty from around the Midwest, and it's always reinvigorating to spend time with them. In fact, this is now my second year as the publicity chair for the conference and my first year as an at-large member of the regional steering committee.

There is a lot of room for improvement in higher education, and so there is a lot of room for improvement in Computer Science education. Thank you to all the scholars who have gone before me and the ones who will come afterwards—this year is my sixth as a professor, and I am proud to have seeded two alumni into CS graduate school who I know will be excellent professors. Thank you to organizations like ACM and IEEE who promote computing. Thanks to CCSC for their support of regional conferences, especially with tight travel budgets being the new norm. Thanks to CSTA for their work in K-12, where there is the greatest need to help students see the value of computational thinking, regardless of their future careers. Thanks to you, dear reader, for considering the value of computing and computer science education.

Thursday, December 9, 2010

CS222: What we learned and how we learned it

I did something experimental today in our last meeting of CS222: Advanced Programming. Regular readers may recall that this was the first offering of the course, not just by me, but by the university; following the SIGCSE mailing lists, I think other departments are starting to see the need for such a course as well. I decided to devote most of our 75-minute meeting time today to a student-directed analysis of what we learned, inspired by the structure that Michael Goldsby used when he led our six-hour Future of Education Task Force meeting.

First, I asked the students to list anything that they learned during the semester that was incident upon the CS222 experience. I told them that it didn't have to be something that was explicitly listed in the syllabus or in our meetings, but anything that was somehow related. I recorded these on a large self-stick easel pad, and as we filled up each sheet, I had my undergraduate teaching assistant post it around the room. We filled eight sheets (plus one item on a ninth) with 75 items in just under 30 minutes.

After listing the 75 items, I distributed sticker sheets and asked each student to put a sticker by each of the three items that were most important or valuable to them. I briefly explained that this was purely subjective—that they were free to define "important" and "valuable" for themselves. Based on the distribution, we made the cut-off at four stars, giving us a consensus on the following items as most valuable:

  • Team programming
  • Test-driven development
  • Use of libraries (software)
  • Refactoring
  • UML
  • Design patterns
From these, I asked the students to consider how they learned these. It took a bit of prompting to construct this second list, but we ended up with 13 items. For example, the first item offered was, "by writing code." I asked for more information about the kind of situation the student meant, because many of our ideas are reified in code. He clarified that he meant, "by writing code that uses these ideas." I pushed a little harder into the kind of situation he was describing, and we ended up with, "by writing code that uses these ideas in the final project." Not all of these were articulated as well as I would have hoped, but this might reflect the most interesting part: that the students did not have the vocabulary to describe activities that they thought were useful to their learning.
Each student was given two stars for these sheets, and these four rose to the top:
  • Lecture-based example that was built upon in assignments
  • Looking at code as a group in class
  • By writing code that uses these ideas in the final project
  • Demonstrations of practice
Disturbingly missing from the entire list of thirteen is any mention of books or the Internet. It seems the focus of students' thinking about learning is classroom-based, despite the emphasis this semester on reflective practice, metacognition, and explicitly learning how to use external (i.e., non-self, non-university) resources. I do not want to jump to conclusions about this, since as mentioned above, the students clearly lacked a vocabulary for describing their learning experiences.

Wednesday, December 8, 2010

MythTV Upgrade

I have just completed an upgrade to my MythTV box. I originally set it up in 2007, basing it on Mandriva 2007.1, and it has been running mostly seamlessly since then. I have had occasional problems with session management, in which multiple frontends would start upon boot, but after some tinkering this became a matter of routine maintenance. More recently, however, I've been interested in leveraging the streaming options of my Netflix subscription, and unfortunately, they do not support streaming to Linux. I understand that this must be due to contracts with the content providers, who insist upon DRM. I wish there were a better solution to this, but I honestly don't have one, and the more pragmatic issue was that I was disappointed with my inability to stream video to my family room TV.

A few days ago, I was browsing the Web and came across a mailing list post from earlier this year in which the author describes how he configured Netflix streaming in MythTV by running Windows XP within VMWare Player. I usually use VirtualBox for all my virtualization needs, but the mythtv-users thread suggested that there are impassable audio barriers with VirtualBox that don't show up with VMWare Player. I have a spare Windows XP license, so I installed it on my Myth box about two weeks ago. There is a sense in which it worked, but it was painfully slow: the machine had only 512MB RAM on a Sempron 3000, and trying to do anything with VMWare Player caused hard drive thrashing with swap access.

Poking around my closet of abandoned hardware, I found a case from my previous desktop machine and booted it up. Finding everything in working order, I picked up a 1TB drive in a Newegg Thanksgiving deal and proceeded to transfer the Myth hardware to the other box. This one had 2GB RAM and an Athlon 64 3700+, a vast improvement in memory and a significant improvement in processor.

Unfortunately, getting the new system installed was not as seamless as I hoped. I tried many different distributions and each one ended up with some kind of problem. The odd thing is that moving the old hard drive into the new shell worked just fine, and I was able to check that the hardware configuration was working. However, Mandriva 2007.1's ALSA drivers were too old to work with VMWare player, and so the audio was garbled when streaming video.

After quite a bit of tinkering, I started doing more diagnostics on the installation media themselves, and I found each to have an error. It appears that the burner on my workstation cannot accurately burn 700MB CD-ROMs. Who knew? When I took the ~700MB image and put it on a blank DVD instead of a blank CD (and swapped the graphics card, which may or may not have made a difference but definitely reduced noise), the installation went more smoothly. I still had to specify "nomodeset" as a kernel parameter in order to get to an installer, but now I have a nice shiny Mythbuntu 10.10 installation working great.

Two unexpected changes from the old installation: First, my USB wifi device worked automagically, without having to download any extra drivers or anything. Huzzah! Second, my StreamZap remote control was automatically recognized as a keyboard, but it was also configurable through the Mythbuntu control center. The odd result was that I was getting double input for the four arrow keys. After some digging online, I discovered that I could just comment out the arrow keys in the ~/.lirc/mythtv configuration file, and now it's working fine.
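For anyone hitting the same doubled-input problem, the change amounts to disabling lirc's bindings for keys that the kernel already delivers as keyboard events. A hypothetical excerpt of ~/.lirc/mythtv (the remote and button names here are illustrative, not copied from my actual file):

```
# Arrow-key stanzas commented out: the StreamZap already arrives as a
# keyboard, so leaving these bindings active produced doubled keypresses.
# begin
#     remote = Streamzap_PC_Remote
#     button = UP
#     config = Up
# end
```

The same treatment applies to the DOWN, LEFT, and RIGHT stanzas.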

One of the nicest features I originally set up was for the machine to wake itself up to make a recording and then shut itself down afterwards. This way it doesn't have to be an always-on machine. On the Mandriva 2007.1 installation, this was enabled through nvram-wakeup, and because my motherboard was not in the database, this required a good deal of tinkering to get set up correctly. By switching to Mythbuntu and a newer motherboard, it is all done now with ACPI calls. Specifically, I followed the instructions for configuring ACPI wakeup on the mythtv wiki, and this worked like a charm.
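For the curious, the ACPI approach boils down to writing an epoch timestamp into the RTC's wakealarm file in sysfs. A minimal sketch, assuming a kernel that exposes the clock at /sys/class/rtc/rtc0 and root privileges (the wake time is illustrative; in a real setup the backend computes it from the recording schedule):

```shell
#!/bin/sh
# Sketch of the sysfs RTC wakealarm method described on the MythTV wiki.
# Assumes /sys/class/rtc/rtc0 exists; writing to it requires root.
ALARM=/sys/class/rtc/rtc0/wakealarm

# The wake time must be expressed as seconds since the epoch;
# GNU date does the conversion.
wake=$(date -d 'tomorrow 03:00' +%s)

if [ -w "$ALARM" ]; then
    echo 0       > "$ALARM"    # clear any previously armed alarm
    echo "$wake" > "$ALARM"    # arm the RTC to power the box on
fi
echo "requested wakeup at epoch $wake"
```

No motherboard-specific database entry is needed, which is exactly what makes this so much less fiddly than nvram-wakeup was.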

Installing Windows on VMWare Player was not a problem, but hooking it up with the MythTV frontend was not as easy as described in the mailing list. I had an odd situation: if I opened a terminal through the desktop environment (XFCE), then I could run vmplayer with no trouble. It generated some warnings, but it started and ran without issue. However, if I launched a terminal from mythwelcome or via the mythfrontend button, running vmplayer would generate the same error messages but then do nothing. After many fruitless attempts to fix this, I asked the resident Unix expert in the department, and he suggested I use printenv in both terminals and diff the results. I had been trying to do something similar but in a much more awkward way—always nice to learn a new *nix command! After a few failed attempts, I discovered that by unsetting GTK_PATH, I could start vmplayer consistently.
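His suggestion generalizes to any case where a program behaves differently in two shells. A rough sketch (the file names are my own invention; sorting first keeps diff's output lined up):

```shell
#!/bin/sh
# Capture each terminal's environment to a file; run the first capture in
# the working terminal and the second in the misbehaving one.
printenv | sort > /tmp/env-working.txt
printenv | sort > /tmp/env-broken.txt

# Lines marked < or > are variables that differ between the two shells.
if diff /tmp/env-working.txt /tmp/env-broken.txt; then
    echo "environments are identical"
fi

# Once the offending variable is found, test the hypothesis by launching
# with it removed, e.g.:
#   env -u GTK_PATH vmplayer
```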

So there you have it! I have a machine that boots faster, runs quieter, and allows me to watch streaming Netflix movies from the comfort of my living room. It also has four times the hard drive space, so my son can record as many Dinosaur Train episodes as he wants without my wife's America's Test Kitchens needing to be deleted to make room. My original build four years ago was supposed to be built of spare parts, but it ended up costing me about $400 due to my hardware being faulty or having the wrong interfaces. This revision only required purchasing a bigger drive, and the rest was accomplished with existing hardware and elbow grease. Thanks to the Myth community for all the excellent software and resources, thanks to Spencer for the Unix help, and thanks to Paul for the TV tuner cards that allowed me to build it at all the first time.

Tuesday, December 7, 2010

Future of Education: Inside and Outside the System

This is a continuation of my series of posts (1,2) on the Future of Education Task Force at Ball State University.

I have had two very productive meetings with my working group within the task force, which consists of three faculty and a student. We worked individually on some ideation, and at our first meeting, we collected and sorted our ideas into a laundry list. In our second meeting, about a week later, we restructured our list and removed redundancies as well as elements that are clearly outside our sphere of influence.

The most exciting deliverable from our working group is a pair of prototypes for improving higher education. The two are orthogonal, one being "inside" the current system and the other being "outside" of it. I will start with the latter.

Institution within an Institution

This prototype is based on the observation (a la disruptive innovation) that institutions tend to be self-preserving regardless of the need for disruption. The premise is that an "inner institution" be created, in the spirit of a spin-off company, that would be able to experiment with different organizational structures. Easy examples include current models of credit hours, faculty loading, and core curricula. In order to succeed, the inner institution would need to have timeboxed experiments and measurable plans for impacting the rest of the institution. It would be a test-bed for allowing faculty and students to deeply explore a design-thinking-oriented solution to place-based higher education.

I can see echoes of my discipline and preferences in this model, of which I was a key designer. In software development, there is a dangerous tendency to talk about what might be the best solution rather than sitting down and exploring it. It's one of the reasons I like test-driven development: you start by considering how the module will be used, and then build the module to match, via rapid prototyping with short feedback loops. This applies the same principles to higher education.

I find this model appealing because of its obvious connections to principles of agile software development and design thinking. In software development, we know that we cannot know all of the users and their needs a priori, and so we use agile methods and feedback to continuously improve. Rather than engaging in endless debates—as faculty are wont to do—this structure would allow instead for real experimentation and measurement.

There are many dangers with this kind of approach, of course. As with the connection between a research unit and the rest of a company, the pressures of the controlling organization may quash the potential of the experiments, and there is a risk of irrelevancy of the experimental structures to the rest of the institution. The students involved would have to trade expectations and establishment for an experimental undergraduate experience, and I would not blame parents for being hesitant to send their own kids into such a situation.

Themed Institutes

The working group could not come up with a compelling name for this concept, so for the sake of discussion, I will use "institute". The premise is that the university would identify a small number of prominent, post-disciplinary tensions or problems, such as "Digital Culture and Ethics" or "Capitalism and Sustainability". Faculty from any department could apply to be part of the institute. It merits repeating that these institutes would be post-disciplinary by definition, so they could never fit within an existing department.
The institute would be timeboxed: it would be created and exist for a fixed period of time, after which it would be dissolved.

The faculty proposal would include a description of the learning experiences he or she could offer that further the institute's inquiry. This would practically require that faculty collaboratively propose participation in order to identify how various scholarly traditions would intersect within credited learning experiences. Students would be recruited directly into the institute, potentially directly from high schools and freshmen-level experiences. The students would therefore be part of the institute just as faculty are, and the mixed cohort would grow and learn together.

Students would need degrees, of course, and it would be challenging (though perhaps appealing) to offer a degree in "Digital Culture and Ethics". Hence, another aspect of faculty participation would be organizing plans for students to earn a degree within the faculty member's home department through the institute. Again, this encourages pre-participation negotiation among faculty and departments. This could take the individual pain of course articulation from VBC fellows and make it a shared responsibility in which no one department has the hammer over any one faculty member.

Any prototype for institutional change is going to run into the problem of limited resources. By and large, faculty resources are allocated through departments. Themed institutes may be conceived as transient departments, and depending on how much of a faculty member's load is associated with the institute, they would require facilities and equipment to match. The timeboxing of institutes mitigates some of the risks of allowing resources to follow individuals, even if they are working outside of the department in which they hold tenure.

As a next step, our prototypes and part of our laundry list will be shared with the upper administration, who will provide feedback for the next round of prototypes. We already have a meeting planned for early January in which the task force chairs will share their feedback with us, and I will in turn relay that information to you, dear reader.

Allow me to reiterate that these are not plans: they are only prototypes. They are ideas that we have articulated for the express purpose of learning from them and then throwing them away. The hope in any prototyping process is that each build is a little closer to solving the problem, even if the problem itself may not have been well-identified at the outset. I hope that I have articulated the prototypes clearly enough to foster consideration and discussion.

Tuesday, November 30, 2010

Design thinking graphic

Design thinking.

I needed this image for a proposal I am writing. I can't tell you how many times I've sketched this. It's not quite as pretty as the one I had been using, but it saves me space in the attributions, and it should be reproducible in greyscale without trouble.

Creative Commons License
Design Thinking by Paul Gestwicki is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.

Tuesday, November 9, 2010

Future of Education: Challenge Statements

About two weeks ago, I wrote about my six-hour meeting with the BSU Future of Education Task Force. One of the primary outputs of this meeting was a list of ten challenge statements. There's no secret recipe involved, so here they are, in no particular order:

  1. How might Ball State University foster learning in college?
  2. How might BSU help students embrace serendipity and uncertainty?
  3. How might BSU assist/encourage faculty to modernize their teaching?
  4. How might BSU provide more opportunities for experiential or immersive learning?
  5. How might Ball State University make the educational experience more relevant and interesting for students?
  6. How might BSU increase classroom engagement and interaction?
  7. How might BSU give students more choice so they can personalize their education?
  8. How might Ball State University get faculty to accept the opportunity costs of change?
  9. How might BSU align awards and incentives to change teaching and assessment?
  10. How might BSU increase awareness of resources to help faculty improve their teaching?

So, dear reader, how might we?

Thursday, November 4, 2010

Safe Fail

Regular readers will remember that in Spring 2010, I taught a Human-Computer Interaction class (CS345/545) in which students worked in teams to develop Android applications. The inspirational goal—stated at the start of the semester and mentioned many times during—was to release novel applications on the Android Market. Out of seven teams, none actually released their applications on the Market during the semester, and in fact, many did not exhibit core functionality by the end of the 15 weeks of development. One team did add some spit-and-polish over the Summer and has since published theirs: it's called Elemental: Periodic Table, and it has 1/2 star more than my own Connect the Dots. (I think there's a good reason for that, and it has to do with expectations management, but I'll write about it another day.)

There is a significant contrast between CS345/545 and the current CS315 course (game programming, a.k.a. 3:15 Studio). This has been kicking around the back of my head for a while, but I have not done any rigorous reflection on it. However, this is all just set-up for the story, so let me carry on.

I was in the lab the other day with some of 3:15 Studio, and things were going well. I turned to one of my students who was also in CS345/545, and I mentioned that I wished that the Android development had been as productive. This particular fellow is a good student, but it's fair to say that his project tanked last Spring. His response to me was not one of regret or criticism, but rather, he observed that it was better to fail at a big team-based semester project last Spring than to fail at the Morgan's Raid project or his senior capstone.

What phenomenal wisdom! In my own idealistic and individual analysis, I had been thinking about the failure as being a problem that needed to be solved. This student recognized that the learning still happened. Specifically, it was learning from failure, which might be the only kind of learning that matters.

Wednesday, November 3, 2010

Students owning space

Let me tell you why this sign makes me happy.

A few weeks ago, we had a ScrumMasters Community of Practice meeting in 3:15 Studio (which consists of the students in my CS315 class). There are four teams within the studio, each with its own ScrumMaster. The goals of this CoP meeting were to facilitate the ScrumMastering process and identify any inter-team issues.

The student ScrumMasters mentioned that, especially towards the end of the sprint, the lab gets quite crowded. The University scheduled CS315 for 2–3PM MWF, and so this is the time that everyone in the studio is working on the project. About 1/4 work in the classroom where the course is officially scheduled, and 3/4 move up to the lab, which is much more conducive to small-group work. At the CoP meeting, the students suggested putting up signs stating that priority is given to CS315 students at 2–3PM MWF. However, this is technically what some might call "unauthorized use of the lab as a teaching space," and so the students don't really have preference during this time slot, not in the letter of the law anyway. I recommended that the students come up with signs that convey what they want, and then work with the department chair to have them approved.

When I first brought up official signage with the department administration, they were not too keen on it, mostly because we don't really have an official lab usage policy. The machines are configured so that only our majors can sign on, but otherwise, there is no real regulation of the space. However, when I informed the administration that it was the students who wanted to put up signs—not me—they became more interested in the process.

One of the student ScrumMasters drafted a reasonable poster on Monday, and I gave him some feedback from my perspective as a faculty member. He made some edits and sent it to the chair, who made some further changes and then approved its posting. Today, the student posted the notice. Perhaps to him and the other students, it was a mundane occurrence, but to me it was significant. It was a sign that the students took ownership of this collaborative work space, that they felt they were stakeholders in departmental resources, and that they could define how these resources are used for the betterment of their education.

Kudos to you, 3:15 Studio, for being exemplars of learning in the campus community!

Tuesday, November 2, 2010

A growing dissatisfaction with 222

There are plenty of things to keep me busy, but I find myself drawn mostly to the need to reflect on some recent activities in my CS222: Advanced Programming class. I have written a little about the course design previously, but as a refresher, it's a brand new course in the curriculum, designed as a bridge from the foundations courses to the project-oriented upper-division courses.

I have a growing dissatisfaction with the execution of CS222, and I think this is in stark contrast to my other course, CS315: Game Programming. 315 is based entirely on studio-based learning: it is project-oriented learning to the nth degree, with all 25 students collaborating with me and external stakeholders on a real project. Perhaps the best way for me to express the difference between these two courses is to show what I did in the last meetings of these two courses.

In CS315, where students are in their 10th week of the project:

  • Worked on research for 5-10 minutes while the students had their "daily" stand-up meetings.
  • Got new markers for the lab where many of the students work.
  • Found paper for the printer in the lab so that students could print up some signs on lab usage, which they proposed to the department chair as official statements.
  • Helped a student in the lab, who is in my other class, with his project and talked about D&D a little with him and some 315-ers.
  • Gave some tips on how to deal with in-game dialog box images and text, facilitating conversation between members of different teams.
  • Looked over a group's shoulders and recommended a refactoring to a monstrous method.

In CS222, where students are in the second week of a six-week project:
  • Talked for 5-10 minutes about the importance of voting, the history of public education and its role in forming an educated electorate, and the difference between indoctrinating patriotism and respecting freedom. (It is election day.)
  • Showed past and current examples of an MSF-style risk matrix and Scrum backlog, explaining that the teams may use either one--or something else--as long as they are deliberate in their decision and reflect on it, and they can write a reflection on it later. (5-10 minutes)
  • Gave a walkthrough of how I use Google's guava library. This took about 10-15 minutes, partially due to unexpected system problems with Eclipse, probably related to the distro change and multiple Eclipse installs I currently have on my netbook.
  • The remaining 35 minutes was used by teams to evaluate each others' physical prototypes. As teams evaluated each others' interface designs, I posted some links to the course Web site. Only one group asked me for feedback on their design, and then others came to ask about other things. The best of these was the question of whether I thought a specific project was even worth pursuing, and I explained (to the student's satisfaction) that it wasn't me they should ask, but their prospective users.

315 is a joy. It is by and large an amazing experience to watch students learn to work together and learn from each other in a wholly legitimate way, a sort of organized chaos.

In 222, by contrast, I keep finding myself talking much more than I want to. It's not usually until I'm done that I realize how long I have been talking. We can take the guava libraries as an example: when I first learned about them, I was floored, and I wished that someone had shown them to me ages ago. In my mind, it would have only taken someone showing me a minimal example for me to see why I should be using this in all my Java projects. Yet, when I showed the students what I do with it, I got very little excitement from them. Was it because they do not have the perspective I have to realize what a useful library it is, or was it a failure of my presentation? These two are hard for me to disassociate, but the fact remains that maybe the whole thing would have been better organized as a workshop than as a lecture. This is not without its own complications, not the least of which is the amount of planning that's required for minimal payoff.

Since day one in 222, we've had assignments due twice a week, at each meeting. This idea was gleaned from the SIGCSE mailing list as a way of ensuring students are doing the reading and keeping up. I will be teaching the course next semester, and I need to re-evaluate this structure in light of the students' lack of programming prowess: I'm not sure they're seeing the forest for the trees on issues of design, since many still struggle with basic program structure. Be that as it may, right now, the students are working on a six-week project, delivered in two three-week increments. I am continuing to have them read items from Effective Java, but now instead of relating them to old projects or other experiences, they are relating these to their current project. For example, they are currently reading about minimizing the scope of variables (Item 45), and I asked them to evaluate their current project--in its current state--in light of this design idea. It's too early to know if this is positively impacting student learning, but my fear is that I might be giving them too many diverse ideas for them to deeply learn from their project experience. As soon as a student sees the homework as busywork rather than a learning experience, it loses almost all value to the student.
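To make the reading concrete, Item 45's advice boils down to declaring each variable at the point of first use. Here is a tiny sketch of my own (not an excerpt from the book or from any student's project) contrasting the two styles:

```java
public class ScopeDemo {

    // Wider scope than necessary: i outlives the loop,
    // inviting accidental reuse further down the method.
    public static int sumBefore(int[] values) {
        int total = 0;
        int i;
        for (i = 0; i < values.length; i++) {
            total += values[i];
        }
        return total;
    }

    // Minimal scope: the loop variable exists only within the loop.
    public static int sumAfter(int[] values) {
        int total = 0;
        for (int value : values) {
            total += value;
        }
        return total;
    }

    public static void main(String[] args) {
        System.out.println(sumAfter(new int[] {1, 2, 3, 4})); // prints 10
    }
}
```

Both methods compute the same sum; the difference is only in how much of the method can see each variable.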

Maybe writing this, then, has helped me to see the problem: some students are not seeing the tips in Effective Java as the brilliant ideas they are, and so the well-intentioned assignments become busywork instead of learning experiences, and so both the assignment and the project suffer rather than producing the synergy I try to promote. If that's true, then it becomes a problem of motivation management, the perennial problem of education. I think that I will schedule time in class, perhaps next Tuesday, to engage students in a discussion of this whole process. I get the feeling that many of them do not feel like they can give me an honest evaluation of the class in person, so I may have to find a way to get them brainstorming, maybe a SWOT analysis, and come up with communal ideas.

As always, no answers, but this blog is not a place for answers: it's a place to try to hunt down the questions, and the questions within questions. As always, I'm open to ideas from the community. My plan for now is to maximize the learning in the time we have left this semester (about four weeks), and then use Winter Break to do a more significant analysis of the structure, goals, and delivery of 222.

Tuesday, October 26, 2010

Future of Education Task Force

As you may have read in the Muncie Star Press, I am on the Future of Education task force here at Ball State University. Contrary to the headline, competition from online institutions is not the primary motivation for the task force. Rather, our mission—as articulated by the task force co-chair—is:
Students coming to Ball State now and in the future are more technologically inclined, how do we tap into their expectations and mindsets to create a more relevant and better learning community?
I suggested that this be edited into a more relevant form, to wit:

How do we create a learning community?
Be that as it may, we had a major meeting yesterday, running from 9am until 3pm. The meeting was expertly facilitated by task force member and Entrepreneurship Center director Michael Goldsby. I would like to share here some of the highlights from this meeting.

We began with a SWOT analysis. Michael referred to all of the strengths, weaknesses, opportunities, and threats explicitly as "facts," and he used continuous numbering over the facts. This strikes me as significant, since I would have naively numbered strengths differently from the weaknesses, but using continuous numbering—and even continuous pagination—emphasized the fact that the ideas are more important than their categories. Several times, a task force member would say a single word or phrase that is known on campus, such as "The Working Well Program." Michael challenged with the simple question, "Why?" That is, he raised the level of discourse by forcing us out of academically-comfortable buzzwords towards meaningful facts. (For example, "The Working Well Program promotes physical and mental health among faculty and staff," a strength of the institution.)

A critical part of effective SWOT, as with brainstorming in general, is supporting divergent thinking by postponing criticism. This can be challenging for any group, and we were no exception. At several points, we devolved into discussion over points rather than supporting the creative brainstorming process. The facilitator was not always able to pull us back out. I have to admit, there was one point where I challenged a fellow task force member on a cited weakness, and it was completely unproductive to the process. I guess it's OK, though, since I did it because of my circumstances and everyone else did it because of who they are.

One of the task force members is Michael O'Hara, whose opinions on higher education I greatly respect even though we don't completely agree. He has a propensity for expressing complex ideas using just the right words, perhaps an artifact of his being both an academic and an actor. During the SWOT analysis, he championed the perspective of higher education as a humanistic endeavor rather than a utilitarian one. This point—the conflict of utilitarianism and humanism in higher education—is a critical aspect of the current debates, and I hope to return to it in later writings.

Another theme of the SWOT was the "placeness" of Ball State University. It is important for us to leverage our brick-and-mortar nature, given that much of our competition is from purely digital institutions. Martha Hunt pointed out that sustainability plays well into this discussion, that having a physical place allows for serious and grounded academic discourse on respect for physical space. As a Computer Scientist, I tend to abstract space and work with intangibles, and so it was good to have a Landscape Architect to bring us down to a discussion of what space really means.

The most challenging and intellectual elements were contributed by Matthew Wilson, whose scholarship as a humanities-minded geographer is directly tied to both virtual and physical space. In my opinion, the most meaningful challenge of the whole SWOT analysis was Matt's call for post-disciplinary thinking. Like the utilitarian vs. humanistic dichotomy, the call for post-disciplinarity cuts to the heart of much of higher education's problems. When a fellow task force member asked for my thoughts on post-disciplinarity, I pointed out that we are in disciplines, but problems, by and large, are not.

After the divergent SWOT analysis, which generated 200 facts in about three hours, we moved to a convergent mode in which each task force member had to select at most five facts of the greatest importance. The highest vote-getters moved to the next round, which consisted of an articulation of challenge statements of the form, "How might Ball State University...?" Similar to the divergent-then-convergent SWOT, we started with some 25 challenge statements, each member selected three, and the highest vote-getters moved forward. The final step involved the creation of a "map" of challenge statements, connected to their antecedents. As one who is well-versed in graph theory, visual rhetoric, and graph drawing, I saw this last step as particularly subject to manipulation and misinterpretation, since we assumed a planarity in the articulation of a wicked problem.

I have many notes on the SWOT analysis in my blue notebook, which could be attributed to the fact that it was in the morning and was done with free coffee on hand. Upon reflection, though, I did find the SWOT the most interesting step, and the rest logically followed. I am not completely convinced that I would not have come up with the same output without investing a full collaborative day into it, but that's not really the point: the point is that now the task force has a shared experience that forms a collectively-owned starting point for the prototyping process.

Speaking of which, prototyping is next. Task force members are being put into teams to iterate through prototypes on the Future of Education. I am eager to see whom the task force chairs assemble into groups, though in any case, I plan on using some of my software development tools to assist: definitely user stories and possibly risk matrices. Assuming time allows, I will share my prototypes here, for my continued exercise in reflective practice and to invite commentary from the community.

Tuesday, October 12, 2010

Screencasting in Mandriva

I spent too much time today making a short screencast in order to demonstrate a specific property of Eclipse and Java. I could not find any clear recommendations on what software to use on Mandriva, my preferred Linux distribution, at least not beyond what was given on Wikipedia's page comparing screencast software.

I ended up using recordMyDesktop with the Qt frontend, both of which were in the standard Mandriva repositories. You can select an area of the screen to record via the main window, but there is also a tray icon. Initiating a recording from the tray icon seems to always grab the whole screen, regardless of what is selected. Additionally, the selection only seems to work if you choose the box and then hit record on the main window. Stopping that recording and then clicking record a second time results in a full-desktop recording, which is not what I wanted. These complications caused me to throw away three reasonable takes, because it was easier to recreate them than to figure out how to crop a whole video. (Although if someone knows how to do that on Mandriva, let me know.)

With an acceptable take--although I forgot one important point, but am not willing to try yet again--I uploaded the file to YouTube. recordMyDesktop produces Ogg Theora video output with Ogg Vorbis audio, which I think is great as a supporter of open formats. YouTube happily accepted the upload, recognizing it as Theora+Vorbis, and did its magical processing. The end result, unfortunately, had completely garbled video, just some colored flecks on the screen, while the audio worked fine.

A little scouring of the Web, and I discovered DeVeDe, a Linux application for mastering DVDs that can also be used to convert to AVI video format. After installing the requisite codecs, I was able to convert the ogv into avi, and I uploaded that to YouTube. Somehow YouTube got confused and thought I had cancelled the upload, so I had to do this twice, but the final one stuck.

This worked for me, but if anyone knows of a simpler and more streamlined process, I'm open to ideas.

Angry Unicorns

As I sit in office hours with no students around, I will share with you, dear reader, a tale of higher education, motivation, and an angry unicorn.

My students in CS222 are working in pairs on a two-week project, a little RSS analyzer. This was the halfway point, and we used our meeting today for each team to give a status report. After giving my introduction, I sat down so that student teams could, at their pace and discretion, come up and give their status reports. The first team came up rather quickly and did a fine job. After they collected their peer evaluation forms and sat down, there was a delay while the rest of the students considered whether or not to go next. I started a doodle, which turned into a nearly-complete rendition of Strong Bad:
As the second group of presenters finally came to the front, I informed the class that I was one leg away from completing a sketch of Strong Bad, and that I didn't know what would happen if I completed Strong Bad, but I was sure it wouldn't be good.

We spent the next 50 minutes or so hearing status reports, at which time there was another lull, although by my estimation, about 20% of the class had not yet presented. So I started sketching again...
Once complete, I informed the class that I had completed my angry unicorn, and so I assumed that no one else wanted credit for presenting today. This was the first time I made any statement about the presentations being "worth credit," although the students knew that I was jotting down evaluations using the same rubric as their peers, and that I would email these later.

Immediately upon mentioning credit, there was hubbub and three more groups got up to give status reports. One of the groups had very little done, but they gave a great, honest report. Another "group" was missing half its members, but the sole attendee gave a great synopsis of the team's status.

Most likely, it was mention of credit that made these last few get up and present, but I like to think it was the threat of angry unicorns.

Friday, October 8, 2010

All my thoughts, or, three-in-one

This morning, I attended a meeting where one of the agenda items was for each attendee to take five minutes to share:
  • The present state of higher education
  • The various responses to this state
  • How these responses point to a new vision for higher education
  • Readings to support this
Turns out, we didn't do this part of the agenda.

I'd hate for my hastily scribbled notes from before 8AM this morning to be wasted, so for you, dear reader: all of my ideas.

Java Workshop

About three weeks ago, I floated the idea of a Java Workshop to my CS222 class, and enough liked the idea for me to pursue it. Last Tuesday, I gave a two-hour evening workshop on the fundamentals of Java. Here is a synopsis of the major points we covered. The point of this post is primarily to refresh the ideas among those who attended, but others may still be interested to see what I consider as the Java fundamentals. It may be worth noting that this was explicitly about the language and environment of Java, not best practices or even software development.

  • "Java" refers to a language, an API, and a VM. It's good to be aware of this overloading of the word so that you can see how the whole system fits together, especially with the proliferation of languages that run on the JVM.
  • Java has primitive types and reference types. Primitive types are just boxes that hold values, but reference types are for objects, the fundamental building blocks of OO systems.
  • All of the primitive types have similarly-named reference types. For example, there's int and there's Integer. The difference between these reveals an important insight into object-oriented software design. Whereas an int variable is just a box that holds a value, an Integer instance is a representation of the idea of a number. Consider the following code.
    int x = 5;
    Integer y = 5;
    Here, x is just a box that holds an integer, and so we can say x=7, and that means we're putting a different value in the box. On the other hand, y is five. We cannot say y.setValue(7) any more than we can assert that 5=7 in mathematics.
  • It's good to recognize that y=5 is not free: it's using autoboxing to convert an int primitive into an instance of Integer. While it's true that premature optimization is the root of all evil, you shouldn't write code that creates unnecessary objects either.
  • It is critical that a Java developer builds a functionally-correct mental model of classes, objects, static, methods, and fields. A blog post is not the right way to help you, dear reader, to identify the problems in your own mental model. The good news is that my services are for hire. ;)
  • There are four access control modifiers that you should know so that you can read code: public, private, protected, and package-private (the default, which has no keyword). However, you should really only ever use public and private, so you just saved half your allocated brainspace on this topic.
  • To understand classes, abstract classes, and interfaces, I used an example like the following.
    public interface InningListener {
      public void strikeOut();
      public void walk();
      public void balk();
      public void beaned();
      public void hit();
    }

    public abstract class AbstractInningListener implements InningListener {
      public void strikeOut() {}
      public void walk() {}
      public void balk() {}
      public void beaned() {}
      public void hit() {}
    }

    public class Inning {
      private final List<InningListener> listeners = new ArrayList<InningListener>();
      private int strikes = 0;

      public void addInningListener(InningListener listener) {
        listeners.add(listener);
      }

      public void strike() {
        strikes++;
        if (strikes == 3) {
          fireStrikeOutEvent();
        }
      }

      private void fireStrikeOutEvent() {
        for (InningListener listener : listeners) {
          listener.strikeOut();
        }
      }
    }

    public class Demo {
      public static void main(String[] args) {
        Inning inning = new Inning();
        inning.addInningListener(new AbstractInningListener() {
          public void strikeOut() {
            System.out.println("Strike out!");
          }
        });
        for (int i = 0; i < 3; i++)
          inning.strike();
      }
    }
    A few highlights of this example:
    • This is an example of the Observer design pattern.
    • The abstract class provides default empty implementations of the methods in the interface so that implementors—like the one in Demo—can provide implementations of only those methods that they care about. This keeps the code from getting cluttered up with no-op implementations.
    • This pattern can be found throughout java.awt.event. I tend to do something a little more idiosyncratic involving inner interfaces and inner classes, but that's for another day.

  • If you're doing I/O, it's good to know that the InputStream/OutputStream family of classes is for byte-level I/O, the Reader/Writer family is for character-encoded content, and the nio libraries are for blazing fast access to hardware buffers and so can be ignored until you need them. To me, the critical classes to know are BufferedReader and PrintWriter. BufferedReader has the amazingly-useful readLine method, allowing line-by-line access to a stream, and a PrintWriter behaves exactly like System.out.

  • The classloader trick will load a resource from the classpath whether you are running in Eclipse, from the command line, from an executable jar, in a servlet, or on Java WebStart.

  • Here's a slapdash tour of the collections API:
    • List<T> defines a sequence. There are two useful implementations, LinkedList<T> and ArrayList<T>, which come with the usual caveats implied by their names.
    • Set<T> defines a set, in the discrete mathematics sense. The two common implementations are TreeSet<T> and HashSet<T>, but you should use the latter because you're always providing a useful hashCode method in all of your Java classes—or else you're doing it wrong.
    • Map<K,V> defines a dictionary, an associative array, a lookup table—choose your favorite mental model. Again, we have TreeMap<K,V> and HashMap<K,V>, and you should use the latter and override hashCode in all of your classes.
    • Finally, use guava.
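For reference, the classloader trick mentioned in the workshop notes can be sketched like this. The resource name "levels/intro.txt" is a hypothetical placeholder for whatever file you bundle on your classpath:

```java
import java.io.InputStream;

public class ResourceLoader {

    // Look up a named resource on the classpath. This works the same way
    // whether the program was launched from Eclipse, the command line, an
    // executable jar, a servlet container, or Java WebStart.
    public static InputStream open(String name) {
        return Thread.currentThread().getContextClassLoader().getResourceAsStream(name);
    }

    public static void main(String[] args) {
        // "levels/intro.txt" is a made-up name; open returns null if the
        // resource cannot be found on the classpath.
        InputStream in = open("levels/intro.txt");
        System.out.println(in == null ? "missing" : "found");
    }
}
```

The key point is to resolve the name against the classpath rather than against the current working directory, which changes depending on how the program was launched.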
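And since readLine got a special mention above, here is a minimal sketch of my own (not from the workshop itself) of the BufferedReader/PrintWriter pairing, using in-memory streams so it is self-contained:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.PrintWriter;
import java.io.StringReader;
import java.io.StringWriter;

public class LineNumberer {

    // Read the input line by line with readLine, writing each line back out
    // with a line-number prefix via a PrintWriter.
    public static String numberLines(String text) throws IOException {
        BufferedReader reader = new BufferedReader(new StringReader(text));
        StringWriter buffer = new StringWriter();
        PrintWriter writer = new PrintWriter(buffer);
        String line;
        int lineNumber = 1;
        while ((line = reader.readLine()) != null) {
            writer.println(lineNumber++ + ": " + line);
        }
        writer.flush();
        return buffer.toString();
    }

    public static void main(String[] args) throws IOException {
        System.out.print(numberLines("strike one\nstrike two"));
    }
}
```

The same loop works unchanged if the BufferedReader wraps a FileReader or an InputStreamReader around a network stream.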

I think that's everything we covered, except for some of the more specific questions. I hope this post is helpful to someone looking for a bird's-eye view of Java. As I mentioned at the workshop, when I started using Java in 1998, there was not much to learn and there was basically one place to go to get started. Now, I imagine that novices can be easily overwhelmed by the immensity of just the standard libraries. By the way, the good news here is that the best place to go for an introduction to Java—as long as you already know something about programming—is still The Java Tutorials.

At what point intervention?

I find myself very nervous about this afternoon's 3:15 Studio meeting. I have a feeling that some key PBIs are going to be missing, but there's a sense in which that is fixed through Scrum by carrying them over with high priority to the next Sprint. My deeper concern is that the developers are losing steam, in large part because they have been myopically looking at the PBIs rather than keeping one eye on future needs. Alistair Cockburn put it best when he described software development as a cooperative game with two goals: first, meeting the needs of the current iteration; second, setting up for the next one. In looking over students' shoulders, eavesdropping on discussions, and directly working with them, I see the students expending all their effort on the first goal and almost none on the second goal.

There are various things I can do to help as a Product Owner who is also the lead software architect. I could invest time in developing a robust technical design document that describes the pieces of the architecture that are not yet in place. I could run through the product backlog and come up with reasonable estimations for every PBI so that I can make a product burndown chart to show studio velocity.

The key dilemma I keep coming back to, however, is this: at what point do I just sit down and write some code? The game is already based on an entity architecture kernel that I provided, and I am one of the principal stakeholders in the success of the project. I see some of the students seriously struggling with code that, to me, is at most a two-hour job. There is value in having students struggle through these problems so that they can develop problem-solving skills and learn to deal with frustration. On the other hand, one of the best ways to learn is by working with someone more talented. At what point is it better for me to just show them how to write it rather than trying to coax it out of them? For that matter, how do I know when it is better for the project for me to add modules or refactor modules students have already created? These questions make me nervous, and I think it's because there's no "undo" button for teaching.

In my mind, I keep coming back to the Confluence project, where I worked right alongside Carrie and Josh to build that system, and it was good: the system worked and they learned. For that project, I had release time to develop it, and both students received independent study credits, and I don't have any doubt that we did the right thing. 3:15 Studio is formed out of a "regular" class, with an order of magnitude more students. Does that mean it is any different?

Wednesday, October 6, 2010

Civilization, War, and Learning

Those of you who follow my Facebook posts have probably figured out by now that I broke down and bought Civilization V. It's the latest by Sid Meier, whose Pirates! is one of the best games I have ever played, but that's a post for another time. For those who are not familiar with the Civilization series, these are turn-based strategy games that follow the development of civilization from ancient times up to the modern age. Although they are a hallmark of PC turn-based strategy gaming---and I love turn-based strategy games---I had never played any game in the series until now.

Last night, I finished my first game of Civ V. I used the default settings, which were for beginning players, and I happened to be playing as the British Empire. My civilization developed much faster than the opposition, probably more because of the low difficulty setting than any clever strategy of mine. There are multiple ways to win the game peacefully: scientific leadership, diplomacy, and cultural development. Because my civilization was advancing so much more rapidly than my opponents, I took the classic gamer's optimization approach: military victory. After all, why wait for my inevitable cultural victory when I can just send my superior army to crush the opposition and annex their land to mine, thereby winning the game?

As my army crushed the opposition, the in-game music turned from pleasantly ignorable and ambient into rather dark and sad compositions. Every time I declared war and sent my troops over their borders, the music set the mood not of a glorious conquest, but of grief and despair. As my navy bombarded coastal cities, they caught fire, and I could hear the screams of civilians. As I destroyed the last city of the Indian empire, Gandhi appeared on the screen and congratulated me on destroying a kind and peaceful people. As I defeated my last opponent, I received the end-game screen, an image of WWI-era soldiers marching over a barren land, and again the music set a somber tone.

Here is a great example of the power of learning in games, following the themes in Koster's Theory of Fun. The game is clearly designed to be fun, and there is a sense in which my military victory was enjoyable, but even more than that, it made me think. The formal elements of the game (the rules) combined elegantly with the dramatic elements (the music and story) to produce a unique experience. Most games I play follow a single hero who single-handedly destroys the vanilla-evil opposition, and there was much rejoicing. In Civ, I actually felt guilty for destroying these other civilizations just so that I could "win."

I hope to be able to draw on this in my current efforts with the Morgan's Raid game (as well as the potential Underground Railroad project), where there is a similar lesson we wish to subtly teach: although the player takes the role of John Hunt Morgan, he is neither inherently a hero nor a villain, but the Civil War was undeniably an ugly and painful thing. We have an interesting design constraint in the game as well. Since it is designed for 4th grade students to play in schools, it's likely that it will be played without music, in crowded labs or during free time. It was the somber music of Civ that moved me the most; I can only wonder what would have happened if I had been playing without my speakers on, since I can never go back to my first experience.

Wednesday, September 29, 2010

Demon JHM: Final boss of Morgan's Raid

I stopped by the lab to see if any of the 3:15 Studio team members needed anything, and the room was empty except for this.

I think I have to go rewrite the design document now.

Sunday, September 26, 2010

Contemplating creative inquiry: a call for partners

One of the unique features of Ball State is the Virginia Ball Center for Creative Inquiry, known locally as "the VBC." The center supports seminars that: explore the connections among the arts, humanities, sciences, and technology; create a product to illustrate the interdisciplinary study; and engage the community in a public forum. The faculty and students who spend a semester at the VBC are completely committed to the project: the faculty member is relieved of all other teaching and university service responsibilities, and the students earn full-time credit for their participation. The seminars themselves are held off the main body of campus, in the beautiful and inspiring Kitselman Center.

I am considering applying to run a seminar in the next academic year, having talked the possibilities over with a few key people, including my department chair and the director of the center. A VBC seminar would be the natural evolution of my work on game design and development. All of my previous projects share a common shortcoming, namely that they have been shoehorned into 3-credit experiences under conventional university scheduling constraints. A VBC seminar is an opportunity to collaborate with students in a legitimate studio environment. In practical terms, we would really be an indie game studio.

I would like to use the seminar to create a game, following principles of agile software development. The two missing specifics are the community partners and the game itself. One compelling idea is to build upon my experience with the Morgan's Raid project to make another explicitly educational game for school-age children, potentially on the Underground Railroad. I already have collaborators and potential community partners for such a project, and creating software for K-12 has a high impact factor. Another option is to make something subversively educational. I agree with Raph Koster's theory that fun and learning are intrinsically tied, and it would be a great design exercise to make something superficially "simply fun" that, in fact, teaches 21st-century values.

This post is both a reflection and a call for partners. If you are part of an organization that has a compelling story to tell, or are in the games industry and want to explore strategic partnerships on community-focused developments, or you have a crazy idea that just might work, let me know. You can leave comments here of course, or you can email me (firstname.lastname at gmail). VBC application materials are due at the end of the calendar year, so there's time to bounce ideas around.

Thanks for reading. By the way, if you're a BSU student who will be around next year, think about participating and spread the word. I want the cream of the multidisciplinary crop!

Wednesday, September 22, 2010

Two Sprints with 3:15 Studio

We are now two sprints into the CS315 game programming class. I have not updated this blog as often as I intended, but I suppose that means I've been busy, which may even imply that I've been productive. The advantage of being healthy in a house full of sick people is that everyone else is sleeping, and I still have the energy to do some writing.

I am a little surprised to see I haven't actually written much about this class since before the semester started. We are in the middle of the fifth week of the semester, but let me start by explaining what happened in the first week.

One week of training

The students were introduced to the "course" by being told that it was not a conventional course at all. Rather, the students were in an orientation week for their positions working for 3:15 Studio, and their project was the implementation of a game based on Morgan's Raid. They were told that the orientation and training would last one week, and we'd move from there right into production. I was also explicit in saying that they were expected to work ten hours per week. This is something I've started telling all of my classes, based on the arithmetic that 40 hours per week is "full time" and 12 credit hours is "full time." With this class, though, it was not a recommendation to allocate that much time but a statement that this was the amount of work required. That is, from day one, this was presented as a situated learning experience, in which the students comprise one studio under my direction.

The rest of the first meeting consisted of introducing the game design, the stakeholders, and the historical context. The students were given a quick introduction to Mercurial and the MercurialEclipse plugin, and they were shown how to upload an image to a shared repository. Their first assignment, then, was to upload a portrait to a shared repository to help us get to know each other's names.

The second day (Wednesday) was a breakneck introduction to test-driven development, entity system architectures, pair programming, and Scrum. This was done in a lecture-oriented style, primarily as a means to maximize throughput. It was expected — and experience has shown — that not much of this was retained by the students, but it did help them build a mental framework of what level of rigor was expected. This resulted in the studio size decreasing from thirty to twenty-five, which I see as a positive step: students who were not willing to make the investment elected to choose another course. (It is an unfortunate phenomenon that, even at the 300-level, students mistake "Game programming" for a light and fun elective, when in fact, it is grueling and fun.)
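The test-first habit from that breakneck lecture can be sketched with a plain `assert`: write the failing assertion first, then just enough code to make it pass. This is a hypothetical `Position` component in the entity-system spirit, not code from the project itself:

```java
// A minimal sketch of the test-first habit introduced during orientation,
// using a hypothetical Position component from an entity system.
// The test is written before the class exists:
//
//     assert new Position(0, 0).translate(2, 3).getX() == 2.0;
//
// and then just enough code is written to make it pass.
public class Position {
    private final double x;
    private final double y;

    public Position(double x, double y) {
        this.x = x;
        this.y = y;
    }

    // Moving yields a new Position rather than mutating this one,
    // which keeps the component trivial to test.
    public Position translate(double dx, double dy) {
        return new Position(x + dx, y + dy);
    }

    public double getX() { return x; }

    public double getY() { return y; }
}
```

The point of the exercise is less the class itself than the rhythm: red, green, refactor, repeated until it feels natural.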

The third and final meeting of the week (Friday) was mostly devoted to team formation and task allocation. Based on the number of registered students, I directed them to form four teams. The teams were formed primarily based on pre-existing social networks, at least according to my informal observations. At one point, there were three complete teams formed, but a headcount revealed that two students had not actually been incorporated into a team. Two of the teams shed members to join these two and form the final team, which I will reference later as "that team I mentioned before."

For the first sprint, I wanted to give the students an "easy win" in order to build morale, and I wanted to ease them into Scrum, entity systems, and TDD. Based on this motivation, I set up a product backlog consisting of user stories that were all at the task level. That is, I had pre-sliced problems into tasks achievable by pairs of students. These were, in a sense, not proper product backlog items, since many had to do with software architecture and not anything that has value to the stakeholders, but to me this was a reasonable accommodation in my circumstances.

Sprint One: Two weeks

The first sprint was a success. All of the teams delivered their product backlog items (PBIs) modulo some miscommunication and minor revisions. Most of the tasks were fairly straightforward implementation tasks, the sort that would take me under an hour but that I expected to take the students much longer, since the learning curve was so steep. It's worth mentioning that the team I mentioned before happened to pick up the most difficult of these tasks. Given that the team was made up of a hodgepodge of students as opposed to a motivated group of friends, I was concerned as to how they would handle it, but this team came together in an exemplary way to solve their problems. (More on that in a later blog post, perhaps.)

There was a case around the middle of the sprint where some students broke the build — did I mention we're also using CruiseControl for continuous integration? I took the opportunity to shame them on the studio mailing list, and in that message, reminded everyone to be careful about the code they push. I emailed these students privately and thanked them for being good sports about being the first to break the build, dooming them to be my sacrificial lambs for the good of the team. I knew these guys, and I knew they could take it. Good job, men. Since then, the only real "breaking" of the build has dealt with images being too large and non-portable Java code that works on only some operating systems.
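For readers wondering what "non-portable Java" looks like in practice, a common culprit is a hard-coded path separator. This is a hypothetical illustration, not code from the project:

```java
import java.io.File;

// Hypothetical sketch of platform-dependent code that can pass on one
// machine and break the build on another.
public class AssetPath {
    // Non-portable: "images\\sprite.png" resolves only on Windows.
    public static String windowsOnly(String dir, String name) {
        return dir + "\\" + name;
    }

    // Portable: let the runtime supply the separator for the host OS.
    public static String portable(String dir, String name) {
        return dir + File.separator + name;
    }
}
```

Mismatched filename case ("Images" vs. "images") is another frequent offender, since Windows filesystems typically ignore case while Linux filesystems do not.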

The production portion of the sprint ended on a Thursday, and the next day we held a combined Sprint Review and Sprint Retrospective. This is really a place where the University's scheduling system is a challenge, since we had to do both in the space of 50 minutes. We could have used more time, but we made the most of the time we had. The Sprint Review revealed one of the major weaknesses of the product backlog design: because the PBIs were primarily architectural, I was the only one of the three Product Owners who could appreciate that progress had been made. The programmers and I were delighted just to get a window open with anything in it, but I think my collaborators expected a little bit more. I assured them that this was a necessary step to get the students up to speed.

The Sprint Retrospective was amazing. We had two timeboxes: ten minutes to discuss what went well, and ten minutes to discuss what could be improved. We could have kept on going with our list of successes, and I could feel the excitement and morale in the air. As for the things to improve, many were very simple to address. We left the meeting on a high — helped by Concannon's chocolate chip cookies. One of the important tips I got from Clinton Keith's book was to take the time to celebrate at the end of a sprint, but with a mix of under- and over-age students, cookies seemed like the safest bet.

After this meeting, I met with my co-PI and we reprioritized the product backlog. Moving into the next sprint, we consciously revised the PBI format to deal strictly with those things that add value to the product owners.

These are the burndown charts for the four teams for Sprint 1.

I used Scrum a few semesters ago in game programming, and since then I've studied it more and been applying it with more care. However, my recollection is that it took us several sprints to get anything approaching the level of steadiness shown in some of these burndown charts. Some of the bumps are interesting, since they show places where teams thought they had finished their tasks, then found that someone else had pushed code that broke or overwrote theirs—casualties of learning Mercurial—and had to clean up the repository and get their solutions back in place.
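For readers unfamiliar with burndown charts: each daily point is simply the total estimated effort remaining on open tasks, so a flat or rising segment means work was rediscovered or added back. A minimal sketch, with a hypothetical `Task` representation:

```java
import java.util.List;

// Sketch of how each point on a burndown chart is computed: once per day,
// sum the estimates of the tasks that remain open.
public class Burndown {
    public static class Task {
        final int estimateHours;
        boolean done;

        Task(int estimateHours) {
            this.estimateHours = estimateHours;
        }
    }

    public static int remainingHours(List<Task> tasks) {
        int total = 0;
        for (Task t : tasks) {
            if (!t.done) {
                total += t.estimateHours;
            }
        }
        return total;
    }
}
```

Plotting that daily sum against the sprint calendar gives the downward-sloping line the teams were chasing; a task that gets reopened pushes the line back up.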

Sprint 2: A 1.5-week short sprint

Sprint two ended today, and it was a short sprint due to weekend conferences: the other lead investigator on the project and I would both be away, and so we couldn't do a two-week sprint as planned. Instead, we decided to have Sprint 2 be a short one followed by a slightly longer Sprint 3.

I had overestimated the diligence of my students when, the weekend before the Sprint Planning meeting for Sprint 2, I emailed some comments on the revised product backlog and sprint backlog formats. As mentioned above, I revised the product backlog into items that required slicing by the students as opposed to pre-sliced tasks, and so I had also revised the Sprint Backlog format so that a PBI could be copied in and individual tasks listed below it. At 2pm we started the Sprint Planning, and it quickly became clear that the students had not appreciated this difference. Teams quickly started committing to PBIs, before they had taken the time to see that these were much heavier PBIs that needed breaking down. I hoped they would see this, but when I saw that teams were also changing the sprint backlog format to match the old one, I interrupted and explained what they should be doing: pulling down a PBI as a team, breaking it into tasks, committing to and estimating for those tasks, and then only committing to another PBI if there was excess capacity within the team.

The unfortunate result of this was that, rather than having a contiguous block of high-priority PBIs selected for the sprint, it was a bit of a toss-up. There was not time in our 50-minute timebox to start over, and so some rather important items were left unclaimed. Also, the teams shied away from high story point PBIs for fear of complexity in a short sprint, and I'm not sure that was unwise.

I felt a little more tension this sprint as more students had to use pieces created outside their team. Also, as I worked with teams, I noticed the overall quality of the code was getting worse. It's not so bad as to be unredeemable, but there are some code smells that would make a seasoned professional reach for a handkerchief. The problem is, of course and again, that it's a team of novices who cannot recognize code smells.

Even though production was supposed to stop yesterday, I helped one team pull off a last-minute victory before the Sprint Review and Retrospective today. The review went well, and my other two Product Owners agreed that it was good to see something actually working, rather than just looking at code that only I could appreciate. There were a few more loose ends this sprint: not all of the PBIs were completed according to their conditions of satisfaction, although this was fairly minor. One team had committed to PBIs on the product backlog that it had neglected to copy to the team's Sprint Backlog, but I think they were embarrassed enough by this that they would not do it again. Much of this was fallout from the rough Sprint Planning meeting.

During the Sprint Retrospective, the students were much more pensive. No longer were they chomping at the bit to shout out the first positive thing that came to mind: instead, their contributions were more considered. This is natural, I think, and it shows a maturation of the studio. I was glad to see that inter- and intra-team communication, although mentioned as a success in Sprint 1, still was listed here, because communication is the key to keeping this studio together and making the project succeed.

We are currently discussing via our mailing list a few ways to address shortcomings in the system. One of the positive outcomes of this is that now, I think the students are all hungry for more information about software design and refactoring. They are feeling the real pain of real software development. This is very different from other teaching experiences—even my other class—where I have tried to teach the solution without the students really feeling the problem. It looks like we'll have some Community of Practice meetings in which I will lead code reviews or refactoring/redesign sessions. The students are hungry for it, and that should help them absorb some rich ideas.

Because I have them, here are the Sprint 2 burndown charts. They are in the same team order as the previous set, so feel free to compare. My only significant comment about them is that, despite the fact that the teams were slicing up PBIs themselves for the first time—and many of them had never done this, or worked on a team at all, before—these are still remarkably steady.

Onward to Sprint 3

While I am at the 2010 Consortium for Computing Sciences in Colleges Midwest conference, my students will be responsible for running the Sprint Planning meeting for Sprint 3. I have sorted out the highest priority PBIs, though I still need to do some cleaning up further down the backlog (which, incidentally, currently contains about 85 PBIs). I am confident that they will do a fine job in my absence.

This has been a fantastic teaching opportunity for me. I have invested more time and emotional energy into this project than any other venture since my doctoral dissertation, and I think the students appreciate being involved in this experience. My one niggling fear is that the whole experience cannot be replicated without expending this herculean effort each time, but I will leave sustainability of immersive learning for another writing.

Tuesday, September 7, 2010

A parenting reaction to a teaching problem

Something happened in CS222 today that surprised me. Specifically, I did something that surprised me. More specifically, I did something twice that surprised me. What did I do? I scolded my students.

I always have students close the door to the classroom since it's a noisy hallway. I have mentioned to the students many times that they should not come in late, because it is disrespectful to me and to their peers. I try to be transparent about my pedagogy, my constructivist teaching philosophy, and the idea that knowledge is created within a community of practice. Today, I was in the middle of a brief lesson on seemingly innocuous programming language features that, when misused, can lead to wicked bugs, when a student came into class late. It was about 8 minutes into class, and I cracked, pointing to him and shouting, "You need to stop coming to class late. It's driving me crazy!" Honestly, I don't know if that particular student had come in late before or not, but I was so irritated that I did not care. In retrospect, I hope "you" was interpreted in the plural, but I doubt that it was.

About seven minutes later, I had just finished my little impromptu lesson on the importance of using caution when modifying core language features (like Smalltalk's "new" method or operator overloading in C++), when a student let out a loud and bored-sounding sigh. I rolled with that one, saying something intended to imply my irritation. About three minutes later, the same voice let out the same sigh --- this time while I was in the middle of a demonstration --- and I lost it. I don't remember exactly what I said, but I know I pointed in that region of the room and commanded the person to stop it.

Needless to say, this did not set the right tone for the class. Later on, I told a funny story that involved repeatedly flipping off the classroom, and that seemed to cheer them up. Nothing gets students back like a little vulgarity.

In both instances, it was an immediate emotional pang that drove my reaction. It's the feeling I get when students dehumanize me or their classmates, treating the class as if they're watching a video. It's a feeling of being dehumanized and devalued. Further, it's a clear sign that the student is not trying to be part of a learning community. It's a terrible feeling.

Later on, I asked the students some questions about a piece of code I had just written, and none answered my question nor my request for questions about it. I knew that the non-participatory atmosphere was partially my own fault, so I had them form small groups that would be responsible for sharing either their best answer or their best question. As the students were working in small groups, it gave me time to start reflecting on what had happened, and I realized very quickly that I was treating these students --- the latecomer and the disinterested sigher --- the way I would treat my three-year-old son: I was scolding them, telling them to align their behaviors with accepted practice of politeness and social norms.

This was a personal revelation. In my ten-or-so years of teaching, I have had little trouble talking about technology and methodology, but I have had a much harder time talking about social norms. With a three-year-old, who is naturally testing his boundaries, I am now accustomed to monitoring his actions at home: when his antics shift from impish fun to dangerous or rude, it has to be nipped in the bud. This was exactly my reaction to the students in class: without even pausing to think about it, my "Dad Response" kicked in, and I scolded these students in pure authoritarian style.

As I continue to reflect on these actions and my reactions, I am having a hard time determining if I overreacted or not. Coming to class late is disrespectful to the whole learning community, and beyond that, it is very distracting. I made an example of the latecomer, and I like to rationalize my response as being a matter of fixing broken windows. I suppose the proof of the pudding is in the eating, and we'll see if students stop coming in late. If not, then it was neither worth embarrassing this one student nor worth establishing a negative atmosphere. Would it have been better to put off the day's lesson to have a roundtable discussion about why tardiness is rude, or why emitting sighs of ennui is impolite? I have a hard time imagining that to be the case. I am tempted to spend a few minutes on Thursday telling the students how I feel. Did they know that professors have feelings? It might be cathartic for me, but would it be a valuable use of our limited time together? Worse, would I just be preaching to the choir?