Thursday, December 30, 2010

CS315: What We Learned

During the "final exam" meeting for CS315, I conducted an exercise similar to the one at the end of CS222. I gave the team 30 minutes to list all of the significant things they learned during their 315 Studio experience, and this resulted in 78 items. The list of 78 were consolidated to 76 to remove redundancy. I gave each student three stickers which they used to vote on the items they thought were most significant. When we isolated the top 10% by votes, we ended up with this list:
Interestingly, only two of these—the top two—are explicit learning objectives in the course syllabus. The rest deal with the practice of developing a game. Just below these top vote getters were similarly useful tools: Mercurial, Javadoc, and shell scripting.

The surprising thing to me is that there were almost no items listed that dealt with team communication. At the end of each sprint, we held a Sprint Retrospective, during which time the team discussed what went well and what did not go well. Team communication (within teams and across teams) was listed every sprint as something that was going well, and so I assumed that, given the opportunity, they would articulate team communication as a lesson learned. I mentioned my surprise in our debriefing after the exercise, and one of the students astutely observed that interpersonal communication was considered a "background" skill, something that was taken for granted. Given more time, I would have liked to explore this theme, but we were already running late.

I am left with this question: what does it mean that students consider interpersonal communication to be different from other skills developed in the course of a project? Is this an artifact of the university infrastructure, an effect of this particular course design, or just human nature to consider it so? There were no explicit team-building or communication-enhancing activities: there was just the project, run through Scrum, and the associated reflective retrospectives. This might be enough to dismiss the point, since students were not explicitly exposed to interpersonal communication as a subject of study, except that "coding for reuse" was never an explicit subject of study either, and it still made the top six.


Fall CS222 Redux

I'm teaching CS222: Advanced Programming again in the Spring. Fall was the first offering of the course, and in the grand scheme of things, I am happy with the results. However, I think there are many opportunities for improvement. The course had two major components:
  1. Daily writing assignments, many of which were based on readings of Effective Java, relating the lessons of that book to current and past programming tasks.
  2. A few small individual programming assignments early, a two-week pair project at the early-middle, and a six-week team project delivered in two milestones.
In keeping with the "2" theme, there are two key observations I have upon reflecting on the course. The first is fairly simple: when we spent some time reflecting on how we learned, reading Effective Java did not come up. I spoke with some students after the meeting who admitted that for most of the class, they saw the readings as busywork, but towards the end, they could better see how they related to their work. Some students complained that the readings weren't tied to lecture and that they didn't like the "textbook," which betrays the fact that they didn't really get the point: it's not a "textbook," and it was supposed to affect practice, not waste time being reiterated in lecture.
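To make "affect practice" concrete, here is a minimal sketch of the kind of connection I hoped the readings would provoke, using a GridPosition class invented for illustration rather than taken from any student project: one of the book's recurring lessons, overriding equals and hashCode together, applied directly to project code instead of being recited in lecture.

    /*
     * A hypothetical value class from a student project. The readings were
     * meant to prompt changes like this one: equals and hashCode overridden
     * together, as Effective Java advises.
     */
    public final class GridPosition {
        private final int row;
        private final int column;

        public GridPosition(int row, int column) {
            this.row = row;
            this.column = column;
        }

        @Override
        public boolean equals(Object other) {
            if (this == other) {
                return true;
            }
            if (!(other instanceof GridPosition)) {
                return false;
            }
            GridPosition that = (GridPosition) other;
            return row == that.row && column == that.column;
        }

        @Override
        public int hashCode() {
            // Combine the significant fields so that equal objects hash
            // equally and behave correctly in HashSet and HashMap.
            int result = 17;
            result = 31 * result + row;
            result = 31 * result + column;
            return result;
        }
    }

The hope was that students would notice spots like this in their own projects and connect them back to specific items in the book, which is a very different activity from hearing the items summarized in lecture.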

The other observation is more confounding. In the six-week project, there was a milestone deliverable and presentation at week three. I explained to the class that it was purely a formative assessment: it would let them get feedback and see what their grade would be if that milestone had been the final submission. In addition, the students were given a checklist of the features and ideas I expected to see in the project (such as automated unit tests and evidence of best practices a la Effective Java). Many of the milestone deliverables lacked required elements, and their formative evaluation reflected this. I was explicit in my feedback about what was missing. In the final deliverable, at the end of the semester, most projects were still missing the very elements flagged at week three. That is, there is no evidence that any team actually used the formative evaluation to influence their practice!

In the final analysis, another interesting and perhaps predictable trend emerges: students' grades on the writing assignments were not significantly different from their grades on the programming assignments.

Here are some ideas I'm kicking around for revising the course. I welcome your comments.
  • More programming projects. They should be programming all the time.
  • Require programming projects to have accompanying technical reports that explicitly connect the artifact to theory.
  • Rather than provide goalposts for students to show competency in specific skills, provide a list of concepts and allow students to demonstrate competency in their own way. That is, provide them with a checklist of sorts that they keep throughout the semester, and that I check off when I believe they have shown adequate competency in an area. The course grade then would be directly and transparently derived from this document.
  • Rather than assign specific sections of Effective Java as I find them interesting, require students to browse the book and understand its structure, then require them to incorporate relevant areas into their programs throughout the semester. That is, replace assigned readings and writing assignments with student-directed inquiry.
  • Introduce Mercurial very early in the semester and require all projects to be hosted. This way, I can easily pull up any student's project during lecture to use as an example, without fiddling with switching laptops or strange computer configurations.
  • Build some kind of individual accountability into team projects, such as modified Sprint Burndown charts (sketched below). This would help students learn to identify tasks and estimate effort in addition to increasing the transparency of team operation.
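To be concrete about what I mean by a modified burndown, here is a minimal sketch of the bookkeeping behind one, in Java with a hypothetical Task class invented for illustration (not anything I have actually assigned): each team member owns tasks with effort estimates, and the chart is just the remaining estimated hours per person, recomputed as work is logged each day.

    import java.util.ArrayList;
    import java.util.List;

    // A task with an owner and an estimate; remaining hours shrink as work is logged.
    class Task {
        final String owner;
        final String description;
        double remainingHours;

        Task(String owner, String description, double estimatedHours) {
            this.owner = owner;
            this.description = description;
            this.remainingHours = estimatedHours;
        }
    }

    public class Burndown {
        private final List<Task> tasks = new ArrayList<Task>();

        public void addTask(Task task) {
            tasks.add(task);
        }

        // Total remaining estimated hours for one team member; plotting this
        // value each day gives that member's individual burndown line.
        public double remainingFor(String owner) {
            double total = 0;
            for (Task task : tasks) {
                if (task.owner.equals(owner)) {
                    total += task.remainingHours;
                }
            }
            return total;
        }
    }

Keeping the data per owner, rather than only per team, is what would make the chart a tool for individual accountability rather than just team tracking.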
The main problem for which I do not yet have a solution comes down to the teams' need for individualized consultation balanced against each team's dominant interest in its own project. The students develop their own project ideas, and so each project is different. Hence, the concepts and tools that one team needs to succeed will be different from another's. For example, I had one team in Fall who were using Android: they would have benefited from more time spent with me discussing how to architect Android applications, but this would probably not have been motivating to any other team. By itself this is not a problem, since I could add "work days" to the class schedule, during which time I would circulate among teams and provide more guidance. (Note that office hours inevitably do not coincide with student working hours: they're in class while I'm available to help, and when they're working, I'm at home.) There is an opportunity for students to share their individual findings through class presentations, but I have not yet found a good mechanism for adequately incentivizing both presenters and the audience. I do not want to introduce inauthentic artifacts purely for assessment, such as graded quizzes or audience feedback, but I have a hard time conceiving of an authentic incentive that could be used.

For what it's worth, I'm still quite on the fence regarding adding "work days" to the course meeting schedule. The class does not have a scheduled lab component, but maybe it needs one. On this issue, I have a hard time separating my knowledge about teaching and learning, my intuition, and my own K-12 and undergraduate experience. Adding to the mix, I am not sure how many of these problems are really due to structural issues beyond my control, namely the fact that this is one of several disconnected learning experiences that students are engaged in. However, work days would certainly help with my sense that I was talking too much in class.

Saturday, December 11, 2010

CS Ed Week

Today marks the end of 2010 CS Education Week. I could enumerate all the reasons why this is significant, but it's certainly easier for you to just follow that link and see for yourself.

In honor of CS Education Week, I want to reflect a little on how the field of CS Education has impacted me personally and professionally. I have always had a penchant for teaching, possibly because both of my parents were trained to be teachers. (Not that it's the training that matters per se, but that they both had interest and aptitude in the pursuit.) I had several teaching experiences prior to entering graduate school, and I knew that it was something for which I had a passion. However, entry into the academy is not through a focus on teaching, but rather on the individual intellectual pursuits represented by the doctorate. I spent seven years at the University at Buffalo, first on my master's and then on my doctorate. Although I would occasionally talk with my advisor about my interest in teaching, he sagely recommended avoiding teaching until the doctorate was complete: both are at least full-time jobs, and one who begins a lectureship tends to have much more difficulty finishing the dissertation.

I taught a few courses while at UB, but it wasn't until I became an Assistant Professor at Ball State University that I could really focus on what it meant to be a good teacher. I found it challenging to keep up with my work on JIVE, partially due to the distance from UB and partially due to the stresses of the new job, since BSU is not a research-focused institution: with no grad students working under me, I was unable to keep the pace of research and development.

My interest in design patterns and games led me to explore the intersection of these ideas with students, and this led to my first CS education publication, "Computer Games as a Motivation for Design Patterns," which I presented at SIGCSE 2007. This was my first time attending the conference, and it was an eye-opening experience. I learned more about education research and realized how many more opportunities for significant research were open to me, but most importantly, I was inspired by being surrounded by a thousand CS professors who care deeply about student learning. Most professors are good folks who want their students to learn, but the SIGCSE community is different: these are scholars who have devoted their lives to helping make computer science education better. I'm proud and humbled to be among their ranks.

I have also become involved in the Consortium for Computing Sciences in Colleges, Midwest Region. I think I first went to this conference in 2006, inspired primarily by the idea of bringing a team of undergraduates to the programming competition, even though I didn't know any of the three members of the team. Since then, this has become one of my favorite annual trips, gathering several of our high-achieving students and spending two days chatting about research, education, and life. The regional conference serves a role similar to the international SIGCSE conference: it brings together a vibrant community of dedicated faculty from around the Midwest, and it's always reinvigorating to spend time with them. In fact, this is now my second year as the publicity chair for the conference and my first year as an at-large member of the regional steering committee.

There is a lot of room for improvement in higher education, and so there is a lot of room for improvement in Computer Science education. Thank you to all the scholars who have gone before me and the ones who will come afterwards—this year is my sixth as a professor, and I am proud to have seeded two alumni into CS graduate school who I know will be excellent professors. Thank you to organizations like ACM and IEEE who promote computing. Thanks to CCSC for their support of regional conferences, especially with tight travel budgets being the new norm. Thanks to CSTA for their work in K-12, where there is the greatest need to help students see the value of computational thinking, regardless of their future careers. Thanks to you, dear reader, for considering the value of computing and computer science education.

Thursday, December 9, 2010

CS222: What we learned and how we learned it

I did something experimental today in our last meeting of CS222: Advanced Programming. Regular readers may recall that this was the first offering of the course, not just by me, but by the university; following the SIGCSE mailing lists, I think other departments are starting to see the need for such a course as well. I decided to devote most of our 75-minute meeting time today to a student-directed analysis of what we learned, inspired by the structure that Michael Goldsby used when he led our six-hour Future of Education Task Force meeting.

First, I asked the students to list anything they learned during the semester that arose from the CS222 experience. I told them that it didn't have to be something explicitly listed in the syllabus or covered in our meetings, but anything that was somehow related. I recorded these on a large self-stick easel pad, and as we filled up each sheet, I had my undergraduate teaching assistant post it around the room. We filled eight sheets with 75 items (the last item spilling onto a ninth sheet) in just under 30 minutes.


After listing the 75 items, I distributed sticker sheets and asked each student to put a sticker by the three items that were the most important or valuable to them. I briefly explained that this was purely subjective—that they were free to define what "important" and "valuable" are themselves. Based on the distribution, we made the cut-off at four stars, giving us a consensus on the following items as most valuable:

  • Team programming
  • Test-driven development
  • Use of libraries (software)
  • Refactoring
  • UML
  • Design patterns
From this list, I asked the students to consider how they learned these things. It took a bit of prompting to construct this second list, but we ended up with 13 items. For example, the first item offered was, "by writing code." I asked for more information about the kind of situation the student meant, because many of our ideas are reified in code. He clarified that he meant, "by writing code that uses these ideas." I pushed a little harder into the kind of situation he was describing, and we ended up with, "by writing code that uses these ideas in the final project." Not all of these were articulated as well as I would have hoped, but this might reflect the most interesting part: that the students did not have the vocabulary to describe activities that they thought were useful to their learning.
Each student was given two stars for these sheets, and these four rose to the top:
  • Lecture-based example that was built upon in assignments
  • Looking at code as a group in class
  • By writing code that uses these ideas in the final project
  • Demonstrations of practice
Disturbingly missing from the entire list of thirteen is any mention of books or the Internet. It seems the focus of students' thinking about learning is classroom-based, despite the emphasis this semester on reflective practice, metacognition, and explicitly learning how to use external (i.e., non-self, non-university) resources. I do not want to jump to conclusions about this, since as mentioned above, the students clearly lacked a vocabulary for describing their learning experiences.

Wednesday, December 8, 2010

MythTV Upgrade

I have just completed an upgrade to my MythTV box. I originally set it up in 2007, basing it on Mandriva 2007.1, and it's been running mostly seamlessly since then. I have had occasional problems with session management, in which multiple frontends would start upon boot, but after some tinkering this became a matter of routine maintenance. However, I've more recently been interested in leveraging the streaming options of my Netflix subscription, and unfortunately, they do not support streaming to Linux. I understand that this must be due to contracts with the content providers, who insist upon DRM. I wish there were a better solution to this, but I honestly don't have one, and the more pragmatic issue was that I was disappointed with my inability to stream video to my family room TV.

A few days ago, I was browsing the Web and came across a mailing list post from earlier this year in which the author describes how he configured Netflix streaming in MythTV by way of running Windows XP within VMWare Player. I usually use VirtualBox for all my virtualization needs, but the mythtv-users thread suggested that there are impassable audio barriers with VirtualBox that don't show up with VMWare Player. I have a spare Windows XP license, so I installed it on my myth box about two weeks ago. There is a sense in which it worked, but it was painfully slow: the machine only had 512MB RAM on a Sempron 3000, and trying to do anything with VMWare Player caused the hard drive to thrash with swap access.

Poking around my closet of abandoned hardware, I found a case from my previous desktop machine and booted it up. Finding everything in working order, I picked up a 1TB drive on a Newegg Thanksgiving deal and proceeded to transfer the Myth hardware to the other box. This one had 2GB RAM and an Athlon 64 3700+, a vast improvement in memory and a significant improvement in processor.

Unfortunately, getting the new system installed was not as seamless as I hoped. I tried many different distributions and each one ended up with some kind of problem. The odd thing is that moving the old hard drive into the new shell worked just fine, and I was able to check that the hardware configuration was working. However, Mandriva 2007.1's ALSA drivers were too old to work with VMWare player, and so the audio was garbled when streaming video.

After quite a bit of tinkering, I started doing more diagnostics of the installation media themselves, and I found each to have an error. It appears that the burner on my workstation cannot accurately burn 700MB CD-ROMs. Who knew? When I took the ~700MB image and put it on a blank DVD instead of a blank CD (and swapped the graphics card, which may or may not have made a difference but definitely reduced noise), the installation went more smoothly. I still had to specify "nomodeset" as a kernel parameter in order to get to an installer, but now I have a nice shiny Mythbuntu 10.10 installation working great.

Two unexpected changes from the old installation: First, my USB wifi device worked automagically, without having to download any extra drivers or anything. Huzzah! Second, my StreamZap remote control was automatically recognized as a keyboard, but it was also configurable through the Mythbuntu control center. The odd result was that I was getting double input for the four arrow keys. After some digging online, I discovered that I could just comment out the arrow keys in the ~/.lirc/mythtv configuration file, and now it's working fine.

One of the nicest features I originally set up was for the machine to wake itself up to make a recording and then shut itself down afterwards. This way it doesn't have to be an always-on machine. On the Mandriva 2007.1 installation, this was enabled through nvram-wakeup, and because my motherboard was not in the database, it required a good deal of tinkering to get set up correctly. With Mythbuntu and the newer motherboard, it is all done with ACPI calls. Specifically, I followed the instructions for configuring ACPI wakeup on the MythTV wiki, and this worked like a charm.

Installing Windows on VMWare Player was not a problem, but hooking it up with the MythTV frontend was not as easy as described in the mailing list. I had an odd situation: if I opened a terminal through the desktop environment (XFCE), then I could run vmplayer with no trouble. It generated some warnings, but it started and ran without issue. However, if I launched a terminal from mythwelcome or via the mythfrontend button, running vmplayer would generate the same warnings but then do nothing. After many fruitless attempts to fix this, I asked the resident Unix expert in the department, and he suggested I use printenv in both terminals and diff the results. I had been trying to do something similar but in a much more awkward way—always nice to learn a new *nix command! After a few failed attempts, I discovered that by unsetting GTK_PATH, I could start vmplayer consistently.

So there you have it! I have a machine that boots faster, runs quieter, and allows me to watch streaming Netflix movies from the comfort of my living room. It also has four times the hard drive space, so my son can record as many Dinosaur Train episodes as he wants without my wife's America's Test Kitchens needing to be deleted to make room. My original build four years ago was supposed to be built of spare parts, but it ended up costing me about $400 due to my hardware being faulty or having the wrong interfaces. This revision only required purchasing a bigger drive, and the rest was accomplished with existing hardware and elbow grease. Thanks to the Myth community for all the excellent software and resources, thanks to Spencer for the Unix help, and thanks to Paul for the TV tuner cards that allowed me to build it at all the first time.

Tuesday, December 7, 2010

Future of Education: Inside and Outside the System

This is a continuation of my series of posts (1,2) on the Future of Education Task Force at Ball State University.
I have had two very productive meetings with my working group within the task force, which consists of three faculty and a student. We worked individually on some ideation, and at our first meeting, we collected and sorted our ideas into a laundry list. In our second meeting, about a week later, we restructured our list and removed redundancies as well as elements that are clearly outside our sphere of influence.

The most exciting deliverable from our working group comprises a pair of prototypes for improving higher education. The two are orthogonal, one being "inside" the current system and the other being "outside" of it. I will start with the latter.

Institution within an Institution

This prototype is based on the observation (a la disruptive innovation) that institutions tend to be self-preserving regardless of the need for disruption. The premise is that an "inner institution" be created, in the spirit of a spin-off company, that would be able to experiment with different organizational structures. Easy examples of structures to experiment with include the current models of credit hours, faculty loading, and core curricula. In order to succeed, the inner institution would need to have timeboxed experiments and measurable plans for impacting the rest of the institution. It would be a test-bed for allowing faculty and students to deeply explore a design-thinking-oriented solution to place-based higher education.

I can see echoes of my discipline and my preferences in this model, of which I was a key designer. In software development, there is a dangerous tendency to talk about what might be the best solution rather than sitting down and exploring it. It's one of the reasons I like test-driven development: you start by considering how the module will be used, and then build the module to match, via rapid prototyping with short feedback loops. This prototype applies the same principles to higher education.
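For readers outside of software development, here is a minimal sketch of that test-first rhythm in Java, using JUnit and a ScoreKeeper class invented purely for illustration: the test states how the module will be used before the module exists, and the implementation is then written to satisfy it.

    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    // The test comes first: it records how the module will be used.
    public class ScoreKeeperTest {
        @Test
        public void addingPointsAccumulatesTheTotal() {
            ScoreKeeper keeper = new ScoreKeeper();
            keeper.add(10);
            keeper.add(5);
            assertEquals(15, keeper.getTotal());
        }
    }

    // The simplest implementation that makes the test pass; it can now be
    // refactored freely, with the test acting as a safety net.
    class ScoreKeeper {
        private int total;

        public void add(int points) {
            total += points;
        }

        public int getTotal() {
            return total;
        }
    }

The feedback loop is deliberately short: write a failing test, make it pass, clean up, and repeat. The inner institution is meant to work the same way, with timeboxed experiments standing in for the tests.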

I find this model appealing because of its obvious connections to principles of agile software development and design thinking. In software development, we know that we cannot know all of the users and their needs a priori, and so we use agile methods and feedback to continuously improve. Rather than engaging in endless debates—as faculty are wont to do—this structure would allow instead for real experimentation and measurement.

There are many dangers with this kind of approach, of course. As with the connection between a research unit and the rest of a company, the pressures of the controlling organization may quash the potential of the experiments, and there is a risk of irrelevancy of the experimental structures to the rest of the institution. The students involved would have to trade expectations and establishment for an experimental undergraduate experience, and I would not blame parents for being hesitant to send their own kids into such a situation.

Themed Institutes

The working group could not come up with a compelling name for this concept, so for the sake of discussion, I will use "institute". The premise is that the university would identify a small number of prominent, post-disciplinary tensions or problems, such as "Digital Culture and Ethics" or "Capitalism and Sustainability". Faculty from any department could apply to be part of the institute. It merits repeating that these institutes would be post-disciplinary by definition, so they could never fit within an existing department.
The institute would be timeboxed: it would be created and exist for a fixed period of time, after which it would be dissolved.

A faculty member's proposal would include a description of the learning experiences he or she could offer to further the institute's inquiry. This would practically require that faculty collaboratively propose participation in order to identify how various scholarly traditions would intersect within credited learning experiences. Students would be recruited directly into the institute, potentially straight from high schools and freshman-level experiences. The students would therefore be part of the institute just as faculty are, and the mixed cohort would grow and learn together.

Students would need degrees, of course, and it would be challenging (though perhaps appealing) to offer a degree in "Digital Culture and Ethics". Hence, another aspect of faculty participation would be organizing plans for students to earn a degree within the faculty member's home department through the institute. Again, this encourages pre-participation negotiation among faculty and departments. This could take the individual pain of course articulation away from VBC fellows and make it a shared responsibility in which no one department has the hammer over any one faculty member.

Any prototype for institutional change is going to run into the problem of limited resources. By and large, faculty resources are allocated through departments. Themed institutes may be conceived as transient departments, and depending on how much of a faculty member's load is associated with the institute, they would require facilities and equipment to match. The timeboxing of institutes mitigates some of the risks of allowing resources to follow individuals, even if they are working outside of the department in which they hold tenure.



The next step is that our prototypes and part of our laundry list will be shared with the upper administration, who will provide feedback for the next round of prototypes. We already have a meeting planned for early January in which the task force chairs will share their feedback with us, and I will in turn relay that information to you, dear reader.

Allow me to reiterate that these are not plans: they are only prototypes. They are ideas that we have articulated for the express purpose of learning from them and then throwing them away. The hope in any prototyping process is that each build is a little closer to solving the problem, even if the problem itself may not have been well-identified at the outset. I hope that I have articulated the prototypes clearly enough to foster consideration and discussion.