Wednesday, April 29, 2020

Reflection on teaching CS222 in Spring 2020

The most shocking thing about this past semester's CS222 class was how very normal it was. Just after the first iteration of the final project, campus closed down, but the overall arc of the semester was the same as it usually is.

I used the first three weeks to get the students started in Clean Code, TDD, and OOP as I usually do, transitioning into the two-week project from there. The one wrinkle in this plan was that the person teaching the prerequisite course did not introduce any GUI programming, which I didn't know until the third week of the semester or so. I expected them to have some exposure to it, and I thought about pulling back on some of the two-week project requirements. After reviewing them, though, and knowing how the purpose of the two-week project is more about setting them up for the final project than anything else, I decided to keep it as-is.

Their lack of GUI experience did not end up being insurmountable for such a simple project as the two-week project. The bigger impediment was that I started with the assumption that we would use GSON to parse some JSON data, but the JSON data we were getting from Wikipedia was arduous to parse in this way. It wasn't until near the end of the two-week project that I discovered and tinkered with JsonPath and found it much more fit for purpose. I need to remember, if I do a project like this again, to use JsonPath from the beginning.

Right after the two-week project, teams formed around four project ideas: a D&D character generator, a simple card game, and two variations on comparing data about professional NBA basketball players. RPG character creation tools show up periodically in the class, but I never really engaged this group with one of my favorite issues: the philosophy of character generation. Like every other team, they made something with no real soul, as far as I am concerned. It's not that the idea was bad: their killer feature was that the program filled out a PDF for you to print in the end. It's just that I find this kind of thing uninteresting from a design point of view. It was odd to have two separate groups doing NBA data analysis, but this was convenient for sharing tips between teams. The card game team essentially took a drinking game and digitized it, and there's nothing wrong with that.

The coincidence of two NBA teams brings me to another lesson learned. I could have had more interaction with these teams, and fostered more convenient interteam interactions, if I had set up a shared Slack server for all of us. Some of the teams used Discord and some just used text messages, but there was no convenient way for me to either monitor their discussions or get teams talking to each other. I talked about this a bit in my Game Studio reflection yesterday, and I think I could adopt a similar kind of idea here.

Once campus closed, I made the decision to adopt an asynchronous, distributed approach: the teams had to keep working together on their projects, but I did not want to do remote lectures or require "attendance." Normally, what I would do in class during the last six weeks of the semester is set up exercises to help them succeed in the final project; this semester, I was able to accomplish those goals through graded assignments rather than in-class exercises. It's not the same, of course, but I feel like it worked well enough.

There were corners that I cut, of course. The one I am saddest about is the discussions of design theory and human-centric computing. I would normally do more in-class exercises to help the students think about what it means to be user-centered vs engineer-centered, to reflect on their own processes, and to think about usability. We covered the minimal amount required by the syllabus, but I do feel like this is an area where sometimes students start to gain some insight into professional development or into the nature of design.

One of the things I had kind of forgotten about, or at least had not steeled myself for, was how very disappointing this class can be around week 12. This is the end of the second iteration of the projects, and I tend to be overwhelmed with a sense of confusion. I spend hours typing up feedback for the students and creating video tutorials (some public on YouTube, others privately shared). When I look at what the students submit, it makes me question the whole enterprise. Why bother writing up grading criteria and advice when students do not follow them? Still, weeks later, it puzzles me, in a similar way to how students don't understand checklists in CS315.

One idea that's crossed my mind this semester is that I found myself describing the course as "giving students enough rope to hang themselves with." There's something both true and morbid about that. It's a course with a lot of freedom, and irresponsible students will cause themselves trouble—and worse, cause trouble for their teams. However, I don't think you can really address the course's essential questions without this freedom: you cannot explore what it means to be a professional unless you have the freedom to choose to act like one.

My colleague Huseyin Ergin taught the other section of CS222 this semester, in what I think may just be his second time teaching it. He tried something new this semester, where every Friday he does a tutorial on a different topic, and students complete it with a concrete deliverable. In a sense, he has made every Friday into a lab day. I think this is a great idea, in part because CS222 is the first programming-oriented class without a lab in the curriculum. By putting in these tutorial days, he's not just teaching practical skills: he is helping transition them toward being more independent but without such a steep drop-off. I need to talk to him about how that went, and I'll think about whether that's something I want to bring in.

I'm not scheduled to teach CS222 in the Fall. I'll have three different courses, so I feel like I have to once again abandon this course to memory. The blog is useful for this process, since I can come back and read these notes next time I teach it. However, it is a bit frustrating to think about improving a course but then have to just set it aside.

Those are my notes from the semester. The students are taking their final exams tomorrow, so I can post an addendum if there are any surprises. Otherwise, it's time for me to transition to thinking about what I'm doing this Summer and how I'll prepare for Fall.

Tuesday, April 28, 2020

What We Learned in CS222: Spring 2020 Edition

For many years now, when teaching CS222, I have started the final exam by having the students make a list of what they learned. This small, reflective, timeboxed exercise is conveniently executed during a conventional final exam slot, and we normally then transition to compiling our lists. This compilation process starts out with students simply reciting what they have on their lists, but inevitably, one person's comment spurs the memory of another, and new ideas emerge that were not written down. It also can get a bit silly, in a very good way.

I wanted to keep as much of this exercise intact as I could despite the closing of campus. I ended up creating a final assignment for the students, due on the last regular meeting day of class, in which they each had to post a list of what they learned to a Canvas discussion board. Canvas discussion boards have an interesting feature in which you can prevent students from seeing others' posts until they themselves post, and this seemed the right option for the assignment. The result is that each student got to share an unadulterated list for their classmates to see. Unfortunately, this lost the fun of compiling the list together, but at least it got the reflective part.

I went through all of the posts and compiled a spreadsheet to count what was most common. Of course, there was a possibility of interpretive error here, but I think on the whole that it is representative. I only found one item among all those shared that I couldn't really classify at all. In the end, I counted 73 distinct learning outcomes. Here are the top six, with their number of occurrences:

  • Clean Code (9)
  • Single-Responsibility Principle (8)
  • Test-Driven Development (8)
  • Model-View Separation (7)
  • GitHub (7)
  • Making GUIs in JavaFX (6)

This is not significantly different from a normal semester's outcome. The ones most often mentioned tend to be the generic ones, such as "Clean Code," whereas specific Clean Code practices or theories are less frequently represented (and counted separately). Curiously, items relating to distributed development were sparse. Many people wrote about working on a team, but only one explicitly mentioned remote collaboration.

That's enough blogging for today, since I already wrote up my reflection on the Spring 2020 Game Studio this morning. I expect I will be back tomorrow with a reflection on the CS222 course.

Reflecting on the Spring 2020 CS490 Game Production Studio

I remember that there were many times in the past months where I thought, "I really should be blogging more about the CS490 class," and yet, I was surprised when I looked through the archives and saw that I hadn't really written anything at all. The only reference is in my post about how we moved to distributed teams. That being the case, let me lay a little foundation here.

I have been teaching a game production studio course in the Spring semester for several years. This has been done through the university's Immersive Learning program, which connects faculty, students, and community partners to build solutions to interesting problems. I have been having students apply to the experience so that I could get the best mix of talent. This led to my feeling rather guilty in Spring 2019, when I had over thirty people apply for the ten or so positions I could fill. This told me that there was much more demand than I had supply. On one hand, it's good for the partners to get the best students they can on the project; on the other hand, this means a lot of students who want to learn about multidisciplinary game production do not have the opportunity.

In response to this, and as part of my continued consideration of how to scale up games scholarship at Ball State, I changed the format of CS490 for Spring 2020. This time around, it was not an Immersive Learning class—no community partner meant lower stakes. I decided to welcome any students who wanted to join, as long as they were past the foundations courses of their major, and that I would let them pursue whatever creative games projects they desired.

I ended up with twenty-one students in the course. A few came from my Spring courses on game programming and game design, but most were new to the domain. I gave them a crash course and starter methodology, mentored them through a pitch and selection process, and we were off to the races. Four teams formed around four ideas.

I seeded each team with a starter methodology similar to the one I have been using for some time. I added a "student producer" role, and to support them, I tried to make some of the parts more explicit. This documentation, along with other pieces I shared during the semester, is in a GitHub repository. My plan was to make this public and reusable by future teams, but I think I should do an audit on it first.

The intention was that each team would be autonomous, and indeed, they were—almost to a fault! Whereas I usually am acting as a producer and director on the teams, I was now removed from some of the day-to-day details. This left me feeling out of the loop, and I am sure I missed some opportunities to help. The biggest danger was certainly second-order ignorance: the students don't know what they don't know. When I am embedded in a team, I can detect the signs of problems and move to identify or correct them. Outside of the teams, I suspect it is impossible for me to write down for students, in a clear and actionable way, all the possible things I might notice as potential problem indicators.

We moved forward in two-week sprints. The teams all struggled with the concept that the sprint should produce an executable release. However, I don't think they know that they struggled with this. Yes, each sprint produced something that ran, but it was months before anyone produced anything that was remotely testable. Indeed, the biggest failing of the semester was that, as far as I can tell, none of the teams ever really playtested their games. In a high-functioning team, like the best of my past Immersive Learning teams, we produce something that we can play with other people each sprint—ideally conducting the playtesting within the sprint, since we don't have separate "design sprints." As a result, the games all contain what I would consider significant impediments to playability that a modicum of playtesting would have made obvious.

The shift to distributed development went fairly smoothly for this class. There was an uptick in the use of our existing Slack. Teams did their daily (MWF) stand-up meetings via Zoom, which I then also used to host the Sprint Review meetings. Zoom's remote control feature was valuable for when I was working with students on technical problems as well, although for the Sprint Review, it was really just about screen sharing.

By the end of the semester, all the teams had been able to build playable games that matched their original vision. Some took a slow and steady route to get there, and others had to twist, turn, and reboot their efforts, but they all got there. I am proud of their work, and any criticisms I have offered here are primarily aimed at helping me be a better mentor.

Each of the teams made some kind of public release, even if it's just a public GitHub repository. I'm happy to share those links here:

  • Adventuring Supply Company, a game about running a shop for fantasy adventurers while paying off a debt to the dragon mafia
  • Axil, a game about growing plants in space
  • Teddy Bear Wars, a 2D platformer in which the demo level hints at a strange mystery
  • Title Pending, a still unnamed combination rhythm game / deckbuilder

During our final Sprint Review meeting yesterday, I invited the students to share any observations they had from the semester. The results were interesting, so I want to summarize a few of them here. One student mentioned how, when they had gone to distributed development, they originally decided to become lax on synchronous development, and they saw productivity plummet as a result. They quickly changed direction and went back to enforcing that the team should work together at 9-11 MWF, just as we had done in person, and their productivity shot back up. Another student noticed that "group productivity" was higher before the campus closure, but that afterward, "individual productivity" went up. That is, if I understand it correctly, individual team members had to be more accountable to their work rather than relying on the collective in the room to solve problems. Several students offered closing comments about how much they learned and how they enjoyed this as being a very different kind of university experience.

At this point, I want to turn from talking about what happened toward what I should consider doing differently next time.

One of the mistakes I made early in the semester was that we formed teams before I set up a common Slack server for the studio. In the meantime, each of the teams had set up their own communication channels—primarily Discord. Once I realized this, I created the Slack as a place to have all the teams in one place: this made it clear that we are one studio with multiple teams, and it also allowed me to track all the teams in one place, watching for places where I could offer encouragement or advice. I requested that the teams dismantle their other, ad hoc communication systems, but one of the teams did not. They continued using Discord because it was more familiar to them, despite its being inferior for our purposes. I suspect, as any reasonable person might, that this was also a matter of pride: they felt ownership over the tool they liked and so assumed it was the right tool, the same way people do for operating systems and text editors. In any case, I share this vignette here as a reminder to myself to set up the Slack first, before the semester even starts, to make it clear where we go to share information with the team.

We did not have a good system for assigning grades to students. I deployed my usual approach here, which was to require individual reflective submissions at the end of each iteration. These took different forms and offered different perspectives, but they were also easy to game. For example, if I asked a student to share something they contributed over the past two weeks and a reflection on what it means, anybody can talk about some small contribution: it doesn't mean the contribution was valuable, that it was communicated well, or that it mattered. In some cases, students who clearly were doing a bang-up job were just overwhelmed with other obligations and forgot (or deprioritized) the submission of the reflection. Clearly, if I just used these, it would be insufficient and inaccurate. However, the intuition I would normally use when embedded in a team runs the risk of being unfairly biased with distributed teams. Indeed, I am not even entirely sure how I want to assign grades even now!

A different approach is clearly needed to deal with managing multiple teams, whether they are distributed or not. Last year, I wrote about the idea of paths for students; I like that idea, but I have not pursued it. Something more like contract grading or specifications grading may be in order. This approach would also let me make it more obvious which concrete elements of the methodology I value the most. For example, all the teams this year were supposed to be using burndown charts to track their progress, but only one did, and only that team reaped the benefits. The other producers either forgot or didn't care. If this were part of a grading specification for the student producer, or the team writ large, then you can be sure they would do it. Similarly, if there were a grading specification that said you had to playtest with people outside your team and document the results, you can be sure students would do it with much more gusto than if I continue to give them periodic reminders.

The problem with this is, of course, that you can actually make a great game without using burndown charts or without using a particular form of playtesting. Is this course about making a great game or learning my particular process? I cannot rightly tell the teams that they have authority and then also say they have to do things my way. This past semester, though, I leaned toward a libertarian approach and the result was something like a tragedy of the commons: students who I am sure did not carry their weight are grazing on the high grades of those who did.

Another option is to use peer evaluations. During the final meeting, one of the students mentioned another professor's course that uses a scale of percentage contribution, which he found useful. I took issue with that, since I'm not sure that percent contribution makes sense on a multidisciplinary team: what is a percent of the project anyway? The low-hanging fruit here is, of course, for me to just use the self- and peer-evaluation strategy that I have used in CS222 for years. Really, I could just deploy this almost as-is and see how it goes. Thinking back on the semester, I think it was mostly inertia that prevented me from doing it. The switch to distributed development was, in retrospect, probably the right time to do this. So, if nothing else, I can try this as a starting point next time around.

As I reminded my students yesterday, the game production studio course is hands-down my favorite thing to teach at Ball State. I am grateful to be able to do it. I hope to run another studio in Spring 2021, and I am continuing to invest energy in finding ways to scale up these experiences without sacrificing quality. Thanks for reading!

Monday, April 27, 2020

Infinitely scrolling starfields in Godot Engine 3.2.1

In my previous post, I mentioned that I wanted to create infinitely-scrolling starfields in Godot Engine, but I could not get it working within the time limits of the jam. Yesterday, I spent some time investigating different ways to accomplish this, and I was able to come up with a workable solution. This morning, I started recording a video tutorial about it, but I found certain steps hard to explain, and so I stepped back and looked at alternatives. Turns out, there was a much easier way to do it than I was going to record. Here, I want to share both approaches.

First Attempt: Using Shaders

It seemed to me that a solution should lie in Godot's Parallax Background feature, which I had never used before. As I perused tutorials during the jam, it looked to me like the background could only be used over a fixed area—not an infinite plane. This got me thinking about the classic, old-school game programming problem: should I move the player or should I move the world?

The structure of my solution, then, was this:
  • Keep a single copy of the background image under the camera
  • As the player moves, move the world in the opposite direction, thereby keeping equilibrium.
  • Pan the background using a custom shader to give the illusion of motion

This approach works. Here's a screenshot demonstrating the scene layout:
The Background is a TextureRect showing the starfield, and under it in the tree is a Node2D called World. Actors can move within this world, but it won't affect the background.

The script that brings these together is this:

 extends Node2D

 export var speed : float = 200

 func _process(delta: float):
      # Calculate movement direction
      var direction = Vector2()
      if Input.is_action_pressed("ui_up"):
           direction += Vector2(0, -1)
      if Input.is_action_pressed("ui_down"):
           direction += Vector2(0, 1)
      if Input.is_action_pressed("ui_right"):
           direction += Vector2(1, 0)
      if Input.is_action_pressed("ui_left"):
           direction += Vector2(-1, 0)

      # Determine velocity from the direction, speed, and elapsed time
      var velocity = direction.normalized() * speed * delta

      # Update the player's position based on its velocity
      $World/Player.position += velocity

      # Update the world's offset based on the player's velocity
      $World.position -= velocity

      # Track the background shader with the world's offset
      $Background.material.set_shader_param("offset", -$World.position / $Background.texture.get_width())

The background has a custom shader that I lightly modified from a common approach I found online. Incidentally, it is the first shader I have ever written.
 shader_type canvas_item;

 uniform vec2 offset;

 void fragment() {
      vec2 shifted_uv = UV;
      shifted_uv += offset;
      vec4 color = texture(TEXTURE, shifted_uv);
      COLOR = color;
 }

This is simply taking in the offset and shifting the UVs by the appropriate amount. Note that the TextureRect has to be set with a Stretch Mode of Tile for this to work.

An advantage of this approach is that all the in-game actors are still expressed in game coordinates. The obvious disadvantage of this approach is the need to have a separate World node—an organizational layer separate from the background. This complicates the process of adding nodes, since you cannot just add them to the root as one might expect, but rather to the World branch of the root.

Using Parallax Background and Layers

As I was tinkering with ParallaxLayers, I found this:

"Useful for creating an infinite scrolling background" you say? 

I had seen this configuration option earlier but had dismissed it. I did not want to mirror anything, so that couldn't possibly be what I want. Turns out, it is exactly what I want. Simply adding a mirroring value equal to the size of the texture created an infinitely-scrollable view of the starfield.
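For future reference, the whole setup amounts to something like this. This is a sketch, not my actual project code: the node names and the starfield asset path are illustrative.

```gdscript
extends Node2D

func _ready():
    var background = ParallaxBackground.new()
    var layer = ParallaxLayer.new()
    var sprite = Sprite.new()
    sprite.texture = preload("res://starfield.png")  # assumed asset path
    sprite.centered = false
    layer.add_child(sprite)
    # Repeat the texture whenever the camera scrolls past its edges,
    # giving an infinite starfield with no custom shader.
    layer.motion_mirroring = sprite.texture.get_size()
    background.add_child(layer)
    add_child(background)
```

The same thing can be done entirely in the editor, of course, by adding a ParallaxBackground with a ParallaxLayer child and setting the layer's Motion > Mirroring to the texture size in the inspector.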

This led to my next question: Why is it called "mirroring"? In computer graphics, "mirroring" is a wrapping setting that can be given to a texture to determine how it is rendered when the UV coordinates exceed the normal [0,1] range. This is clearly explained in the official OpenGL documentation. This figure, taken from the official documentation, demonstrates the four wrapping options:

As you can see, mirroring ... well, it mirrors, to no big surprise. When I look at what the Motion Mirroring does in Godot Engine's ParallaxLayer, I don't see any mirroring at all—neither in the OpenGL texture wrapping sense nor in the informal, colloquial sense. I even turned to the source code of ParallaxLayer for a hint. That just took me to the source for rendering_service_canvas, which justifies its need for mirroring by cyclically referring back to the ParallaxLayer's need for mirroring. Nowhere did I see any code that actually looked like it mirrored anything. It looks to me like it's just the word the developers used to describe this feature, but I suggest that this is simply the wrong word.

If "mirroring" is the wrong word, what is the right word? It's probably something more like "wrapping", which is the one that OpenGL uses, or "repeating", which is what it looks like it's doing when it's working right.

I was glad to find this feature, and hopefully by using it and writing this post, I will remember it next time I need it. At the very least, perhaps Google will bring me back to this post when I search for help. For a while there, it looked like I was going to make my first Godot Engine video tutorial for my series. I'm glad I found the easy answer before posting a video about my convoluted alternative, although I still wish the feature had a better name.

Sunday, April 26, 2020

Fam Jam #2: Rocket and UFO Game

After the March Fam Jam, the boys were clamoring to do it again. "We should do one a month," they shout, because shout is the default volume level with four boys in the house. I love their enthusiasm and that the joy they get from the experience overrides both the essential and the accidental frustrations.

TL;DR: We had fun making a game, and you can play it online right now (as long as you have a keyboard). Here's a gameplay video for those who cannot play it directly:

The night before the jam, I did a little bit of prep work. I knew we would end up using Godot Engine since that's what #1 Son has been spending a lot of time with. I set up a repository on GitHub, copying over some of the structure and utilities from Canning Heroes Special Edition. I completed that project since the last Fam Jam, and in so doing, learned more about best practices for Godot Engine projects and how to script builds and deployments. Whereas Joe Johnson Gets Captured used the repository docs folder to publish on GitHub Pages, and hence had the problem that pushing and deploying to master were the same process, Rocket and UFO Game takes a more elegant approach of using the gh-pages branch.

Around 7:30 in the morning, with my morning cup of tea, we set a deadline of 8PM to finish the jam. We formally assigned #3 Son to be the Creative Director. He had been asking for days—maybe weeks—if he would be in charge next. I cautioned him that we had to have modest scope, as we only had one day. He asked if we could extend the deadline—this kid has already learned something about estimation! I reminded him that we had just set the 8PM deadline, and #1 Son commented that the deadline is a crucial part of the jam process; he has done a few jams now and clearly has gotten his head into what it's all about.

Our Creative Director paused as he tried to simplify his design ideas, and then he said he wanted a game in which you control a rocket that is always moving forward, with a force field, that shoots UFOs that drop chickens. Makes sense to me! I expected his design to include more technical details of rocketry, since this is the kid who devours any information about NASA, the Apollo missions, rockets, and so on. He's currently saving up his allowance money to buy a fancy Saturn V model kit. I am glad he chose something a bit more modest and conventional.

I dug into some of the rocket controls while my wife and #1 Son made some delicious sourdough pancakes for breakfast. They used sourdough starter that is literally older than I am, gifted to us from a family friend. In my first pass, I kept track of velocities and changed the position of the rocket on each tick, but then something I read reminded me of the built-in move_and_X methods. Using move_and_collide instead clearly was the best option, and so that's how I approached the rest of the pieces.
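For my own notes, the shape of that approach is roughly this. It is a sketch rather than the jam's actual code: the speed value and the input handling are illustrative.

```gdscript
extends KinematicBody2D

export var speed: float = 300.0

func _physics_process(delta):
    # Read input as a direction vector.
    var direction = Vector2(
        Input.get_action_strength("ui_right") - Input.get_action_strength("ui_left"),
        Input.get_action_strength("ui_down") - Input.get_action_strength("ui_up")
    )
    # Unlike manually adding velocity * delta to position each tick,
    # move_and_collide stops at obstacles and reports what was hit.
    var collision = move_and_collide(direction.normalized() * speed * delta)
    if collision:
        print("Collided with ", collision.collider.name)
```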

After breakfast, we made a more formal plan of who was working on what, and the day proceeded with excitement and creativity. Here are a few vignettes:

I wanted an infinite scrolling field, but we could not sort out how to make that work, and so we ended up just bounding space. I would like to investigate how a background could be infinitely scrolled under a camera, since it seems to me it must be possible.

#2 Son had a blast using bfxr to make the sound effects. He also made a song for the game in LMMS, but when we put it in, the intensity of the music crowded out the sound effects, and it was hard to tell what was going on. We moved that song to the title screen instead.

I wrote a short loop for the main gameplay, with the idea that I'd hand it over to #2 Son to add some leads, but clearly there was a mismatch. I think I assumed he had a bit more savvy around how music and songs were structured than he does, so I just took this over. Ironically, then, my first song was terrible. It was in Dm, and that was just wrong for a game with space chickens. I threw that away and did a similarly-voiced song in E major. The B part of the song was, again, terrible, so in the interest of time, I tossed it, and we're left with a 12-bar loop that is similar in tone and repetitiousness to Goofball's.

It may seem like a small thing, but I think the flames that come from the rocket are wonderful. #2 Son took a stop-motion animation class with BSU animation professor Brad Condie last Fall, and one of the most interesting things I saw him do was make a crackling campfire. I had never really thought about anything like that before: he insisted it was relatively easy, but like so many things, it's all about knowing the tools of the trade. I encouraged him to work with #4 Son to collaboratively draw a few frames of flame animation for the rocket. We got those scanned and put into the game, and I think they are great.

A disagreement with my wife in the morning made me reflect on how she was too peripheral to the process, so I did what any loving husband would do: I taught her to use git. We set her up with ssh keys and a GitHub account and downloaded Godot Engine onto her machine. I gave her a primer on how to interact with the version control system, and in no time, she was able to not just make content, but integrate and test it herself. Once she got her sea legs, I explained to her that in my multidisciplinary game studio classes, my rockstar artists will make sense of version control, which empowers them to be first-rate creative contributors to the project, whereas my timid members will avoid learning about version control and, as a result, only ever be on the periphery of the project.

Incidentally, I (re-)learned two important facts about git during this process:

  • Using git pull --rebase will avoid those annoying merge commits.
  • git clean -n and git clean -f will clean out junk such as post-merge detritus from a repository.
Later in the day, my wife wanted to approach the title screens by designing them in Inkscape and then exporting them as images, as she had done in the March FamJam. I explained how parts like the instructions should really be done in-engine so that we can change the text easily, and how even the logo of the game, if it is only plain text, could also be done in-engine. Both of these reduce iteration time, which happened to be the concept that sparked our little disagreement in the morning. She had already become proficient with git and could run the game herself, so I gave her a quick tutorial on how to design a UI within Godot Engine. She has a lot of experience and talent with desktop publishing applications, and so she took to it like a fish to water. Later in the day, she even tried to copy some of the scripts that my son and I had in place to drive the menu system, in an attempt to integrate the credits screen herself. We had to help a bit here because of the unintuitive signals system, but I give her credit for the confidence.

It was exciting to see how #1 Son was an essential developer on this project. He's been doing jams with me for a few years now, but he has often been something of a legitimate peripheral participant. The last few weeks, he has been excitedly working on some side projects in Godot—projects I have been mostly outside of, aside from the occasional bugfixing tip. For Rocket and UFO Game, he handled a lot of the code himself. Some of this was a bit awkward, such as his first attempt at programming the UFOs. He knew he wanted them to follow a path, so he made them into PathFollow2D objects themselves, with Sprite children. This meant that the UFO scene started with a warning, since PathFollow2D nodes need to be children of Path2D nodes. He was unconcerned but also could not think of another way to do it. I showed him how, instead, we could make the UFO into a plain old Area2D node, allowing it to be tested independently, and then programmatically create the PathFollow2D node as a parent of the UFO when it is spawned. We plan on doing a code review later today to see if there are other places where he can learn some new ideas.

By supper, we had everything in place except for the force fields. I was concerned that, with only about two hours until our deadline, we should probably focus on some balance and polish issues. The rest of the team really wanted to put in force fields, and so we decided to go for it. #1 Son really did all the development on this feature, while I fought with a frustrating bug in the sound system. The takeaway on that one is to make sure you use AudioStreamPlayer and not AudioStreamPlayer2D for your music. That's the kind of kick-self-in-pants idea that is obvious only after spending an hour trying to track down the bug.

We pushed the final version of the game to GitHub at 7:58PM, coming in just under our self-imposed 8PM deadline. We played a few rounds, the high score being, I think, 24 chickens. After that, it was time to get the littler ones to bed, and the rest of us played our worst game of Just One that we've ever played. I think we were all tired and, perhaps, had used up our luck for the day.

That's the story of Rocket and UFO Game. I hope you enjoyed reading about the experience. I definitely encourage you to try something like this with your friends and family. One thing that has struck me is that it's much more fun to make something in a tight timebox than to start something more grandiose and fizzle out. I'm not sure yet what that means for my summer, since I'm still having a hard time getting my head around the impact of my constraints. In any case, thanks for reading, and enjoy Rocket and UFO Game. All the source code is available on GitHub under the GPL v3, so feel free to study it, tinker with it, and learn from it. 

Monday, April 20, 2020

Ludum Dare 46: Goofball

This past weekend, I participated in Ludum Dare 46. The theme was "Keep it alive," and as soon as I read it around 9:30 on Friday night, my mind went to breakout-style games. I've always had a fondness for this genre, but I've never built one.

Here's a gameplay video that shows the result of my efforts:
Here are some essential links:
The rest of the post will provide some narrative about how the project came about.


When my wife saw what I was working on, she said, "This used to be the only kind of game." It's true that my inspiration for this project draws from games of my youth: the 1980s and the Commodore 64.

One of my favorite games on the C64 was Krakout. Check out this video of a chap playing the first several levels of the game, and tell me if that's not some of the best SID music you have ever heard. It amazes me what some of these guys could do with three-voice polyphony. On top of that, Krakout had the trippy option to scroll the background with the ball and, of course, some kind of destructible floating heads. I didn't go back to watch the video until after the jam, when I was writing this post. Now I have that music running in the background, and I'm digging it.

I thought about it a bit Friday night and got up Saturday ready to dig in. I decided to use Unreal Engine 4 for a few reasons. One reason was that there were some specific engine features I wanted to explore. Another was that I was hoping to encounter some ideas for wrapping up my semester's work on tutorial videos that explore Computer Science concepts in UE4. That project has taken a back seat to just trying to keep the lights on this semester. Fortunately, my playlists exist outside of the semester timeframe, so I can continue to explore the project beyond this semester.

What Didn't Make It In

A Little English

One of the ideas I wanted to explore from a game design point of view was the effect of torque or spin on the ball. I have never seen a breakout-style game that does this, so I thought it would be fun. My hope was that I could get this all done using UE4's physics engine. I tinkered with it for maybe two hours Saturday morning and made little headway, but here's some of what I learned.

I was able to get a ball launched so that when it bounced off of a wall, it was given appropriate spin. However, this also slowed it down, because of restitution and friction. Restitution was easy to change without hindrance, but when I removed the friction from the surfaces, the ball no longer gained any spin. In retrospect, this is sensible: of course it is the friction of the collision surfaces that imparts a rotation. In a pure, frictionless, elastic collision, we would expect no tangential force to be involved at all. I could not reconcile the desire for spin with the lack of friction, and I had no gameplay at all at this point, so I abandoned this line of research.

Later in the day, I read about the potential of using a physics motor to drive the forward momentum of the ball, perhaps in a way that would increase its energy on each bounce instead of reducing it. However, by that time I had decided I could not afford any more effort along these lines.

Instead, I switched to writing my own collision logic. I have always had a hard time remembering the fundamentals of linear algebra: I feel like I am always learning it from scratch. Some research turned up the formulae I needed for bouncing off of arbitrary surfaces, and I found this StackOverflow answer to be particularly useful.

After programming the basic bounce logic, I jumped back into trying to get some spin on the ball. I spent too long on this without it yielding any fruit, but I do want to share a small success. At one point, I determined that the spin I wanted to add depended on how the original vector and the resulting bounce vector intersected. The dot product determined the new angle, but how could I sort out whether I should impart a clockwise or counterclockwise spin? Cross product! Yes, friends, I remembered something from linear algebra and deployed it successfully. Huzzah! Unfortunately, I was sinking way too much time into this little bit of polish, and at lunchtime I still didn't have a playable game, so this all went into a dead branch in git.
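For the curious, the math can be expressed in a few lines. This is an illustrative Python sketch rather than the actual jam code, which lived in UE4, with a bounce off a horizontal floor as the worked example:

```python
def reflect(d, n):
    """Reflect direction vector d off a surface with unit normal n: d - 2(d.n)n."""
    dot = d[0] * n[0] + d[1] * n[1]
    return (d[0] - 2 * dot * n[0], d[1] - 2 * dot * n[1])

def spin_sign(incoming, outgoing):
    """z-component of the 2D cross product: positive means the bounce turns
    counterclockwise, negative means clockwise, which is the trick described above."""
    return incoming[0] * outgoing[1] - incoming[1] * outgoing[0]

# A ball moving down-right hits the floor (normal pointing up)...
bounce = reflect((1, -1), (0, 1))   # -> (1, 1)
# ...and the cross product says to impart counterclockwise spin.
assert spin_sign((1, -1), bounce) > 0
```

The dot product alone cannot distinguish the two directions of rotation because it is symmetric; the sign of the cross product's z-component is what breaks the tie.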

Destructible Meshes

I had read about UE4's destructible meshes but never used them, and I thought this would be a fun way to use the 3D engine to enhance an essentially 2D gameplay experience. Making destructible bricks was not too difficult, and I quickly had them exploding nicely into shards. Unfortunately, these shards were then bouncing off of the gameplay bounds, but only sometimes: some shards passed through, and some bounced off at very high speeds. I fiddled with the collision settings so that the bricks collided with the ball but not with the bounds, but then the result was that, even if I turned off collision at the time of explosion, some still collided with the ball. It seems to me that the handling of collision on these shards is inconsistent. I followed some advice about tweaking the size settings so that they would not collide at all, but I did not find the engine simulation to match the suggestion. Of course, I was doing destructible meshes in an unconventional way, and so the best I can figure is that what I wanted to do is not what this technology was designed to do.

I looked into the new Chaos destruction engine, which I had seen mentioned in a livestream months ago. I knew plugins were available for it, but I didn't realize that you have to do a 4.23 source build for them to work. That put a mercifully quick end to this exploration.

This was another Saturday morning exploration of polish that left me a little smarter but without any kind of playable game. Over lunch, I expressed my frustration over not following the advice to get something working first, and then add polish. My justification is that the whole project was really just an exploration of polish: of course I knew I could make a breakout game, but the crux of it was making it with these particular ideas in place. My wife said, "Learning is slow." I responded with such vigor that she thought I was mad, but I was really just emphatic about how right she was. Her perspective was that perhaps a game jam was not the time to learn something new but to deploy what you already know. Well, maybe next jam I'll try that approach, but I do tend to use them as a timeboxed exploration of something new.

Level Sequences

For all the time I've spent in UE4, I've never really designed levels. My major UE4 release, Kaiju Kaboom, uses procedurally-generated levels. I've never used the level sequencer or matinee to create in-game cinematic effects, and I thought this would be fun. In Goofball, I did explore this for doing some fun camera work, but in the end, everything I was trying to accomplish I could do better with other techniques. For the camera, it was a simple Set View Target with Blend node. I was hoping to make the main game camera do a sort of figure-eight movement that gets more intense with each level, but I ran out of time before I could explore that.

Spline Meshes

I originally had simple stretched cubes to establish the game area, and as I was replacing placeholders with real assets, I realized that this might be a good opportunity to explore spline meshes. When I first learned about spline meshes—probably years ago—I was blown away. I had never really thought about how I would model something like a highway before. Again, because I don't really do any level design as such, I never came across a need for them. It got me thinking, what if I designed the boundary of the game with a spline rather than a static model?

This was driven by whimsy: unlike torque, destructible meshes, and level sequences, I did not come into the jam wanting to explore spline meshes. I was able to get a simple and ugly spline mesh laid down, but what really surprised me was that I was getting collision from the path of the control lines rather than the rendered path of the geometry. I am still puzzled by this, but I realized pretty quickly that this was not core to the project and just dropped it. Perhaps I can take a closer look at this in a future project.

Gameplay Cues and Decorators

Along the lines of finding topics to pursue in a tutorial video, I had hoped to get Goofball to the point where I could explore either simple gameplay cues (as I wrote about a few weeks ago) or the use of the Decorator pattern for stacking powerups. Unfortunately, neither worked out. I never got beyond simple ad hoc handling of visual effects, and without any powerups at all, there was no need for decorators. Still, I think both are interesting topics, and I need to consider whether it's worth making a video tutorial on these. I would have to spend a few more post-LD46 hours with the code to see if I've left it in an amenable state.

What Did Make It In

Now, let's turn our attention to some of the cool things that are in the game.

Fisheye Lens

One of the main drivers for this project was to play with visual effects, and chief among these was the idea of a fisheye lens effect. You know what I'm talking about: Waughmp Waughmp. I didn't have any idea how to do this, so I started a-Googling. There were a few mentions of FOV tweaking, but I saw some cautions against that and never tried it. I went right for what seemed to be the consensus approach: a post-processing volume. This was exciting to me since this was another thing I had never done.

Emre Karabacak's tutorial was helpful in getting me set up with a minimum working material. My next step was to consider how to stack the effects. Timelines are a useful way in UE4 to play a canned animation, such as running the fisheye intensity up and down over half a second or so. What I wanted was something more like curve addition or scaling. While the intensity is rising, the ball may hit another brick, which should raise the ceiling and extend the curve. I looked into UE4 curve assets (again, something I had never used), but I did not see a way to operate on them, only to evaluate them. 

I ended up using the magic of FInterpTo with a pair of values. The target intensity indicates the peak of the curve, and every time there's a ball collision, that intensity increases; the target also falls toward zero over time. The interpolated intensity always moves toward the target intensity. The result is exactly what I wanted: a peak that can rise arbitrarily high but falls off, and a visual effect that tracks it.

The standard tutorial code one finds online for a fisheye material has it centered in the camera, but as I played with my game, I realized it could be interesting to track the ball's position with the effect. This also presented me with a good challenge: had I learned enough of the UE4 material editor, and did I understand the fisheye demo code well enough, to make these changes? It took little time and it looked great, so I was happy when this all came together. The basic idea was to get the ball's position, project it to screen space, and then normalize this to put it into UV space. This is then a modification to the RadialGradientExponent's center parameter.
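The screen-to-UV step is just a normalization. Here is a minimal Python sketch of the idea (the function name and the 1920x1080 viewport are illustrative; in the game this was done with material editor nodes, not code):

```python
def screen_to_uv(screen_pos, viewport_size):
    """Map a screen-space position in pixels into [0, 1] UV coordinates."""
    return (screen_pos[0] / viewport_size[0], screen_pos[1] / viewport_size[1])

# A ball projected to the horizontal center, a quarter of the way down,
# of a 1920x1080 viewport...
center = screen_to_uv((960, 270), (1920, 1080))  # -> (0.5, 0.25)
# ...becomes the center parameter fed to the RadialGradientExponent node.
```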

I knew early on in the project that I wanted the profusion of visual effects to be the main source of difficulty escalation in the game. Most breakout games do this with level design, but I wanted to tinker with technology rather than clever arrangements of blocks. One of the last features I added was that, as the player clears boards, the amount of intensity added by ball bounces increases by 50%. This is really disorienting to me as a player, and I think it delivers on the "goofball" promise. (Here's a treat for my readers: Numpad * kills all the blocks on the screen. Use this a few times in a row to ramp up the level, and then check out the insane visuals. Enjoy!)

Having dropped the physics engine and the destructible meshes, I do wonder whether I could have done this effect with reasonable performance in Godot and built the game for HTML5. This certainly would have made it easy to share. I barely know the syntax of GLSL shader programming, but I wonder if my time spent in the UE4 material editor would give me a good grounding for getting into it. That's a challenge for another day.


The last time I composed any music on my computer was for Please, Sir, my entry to Ludum Dare 28 in (yikes!) 2013. It seems I did not write a blog post about it, but I found my GitHub repository. I remember writing the soundtrack in Rosegarden, which I chose in part because it was most like the Cakewalk composition software I used back in the 1990s to record an album of MIDI tunes. I used to spend a lot more time making music than I do these days, and I miss it sometimes. You can't do everything... but you can do game jams!

One of the things I love about Ludum Dare is that it pushes you to be a renaissance man. The Ludum Dare compo requires you to make the whole thing yourself: programming, design, art, music. A few years ago, they added the Ludum Dare jam, which allows you to work in teams and use others' assets. Given that there are basically infinitely many jams going on across the Internet now, I don't know why someone would want to come to Ludum Dare to do that. While the Ludum Dare site describes the compo as "hard mode," to me, that's really what makes it stand apart from the rest.

In any case, I was excited to get into LMMS and figure it out. I mentioned before that this was the tool that I recently learned about and then coached my son into using for our March FamJam. Here was my chance to write something for it myself.

One aspect of the tool that surprised me is that every beat/bassline track has the same instruments. This is not at all obvious to me from the interface: it looks to me that my song can have any number of beat/basslines, so why should they be different? This cost me some time as I worked on one beat, set it aside, worked on another beat, and removed instruments from it. Then, back to the original beat, and things are missing.

After making better sense of the interface, I laid down a beat and a single repeating melody, which was just a simple doodle in A minor. My plan was to come back to it and add some more panache, but I ran out of time. Well, what actually happened was that the time I was going to use to add some more audio interest was instead spent helping my son publish his own Ludum Dare entry, but that was a good use of my time.

The number of samples and synths in LMMS is overwhelming. However, it also became a great source of sound effects. All of the sounds in Goofball are pulled from LMMS. My original plan was to tinker with something like bfxr, but as I worked with LMMS, it seemed crazy not to use it.

Color Palette

I did not have a particular look in mind for the game going into it. While the rest of the family was engaged with Ludum Dare participation, my wife was working on a puzzle:

The pieces were spread across the kitchen table, and the color scheme caught my eye. I started replacing placeholder materials with some drawn from this palette, and I dropped the background to pure black. I'm not sure what colors I would have ended up with if my wife hadn't been working on this puzzle, but I was pleased to find such convenient inspiration.


I wasn't the only one working on a Ludum Dare 46 project. As I mentioned above, #1 Son made his first official entry to the jam. He's a teenager with his own GitHub account, and so I hope this was a good experience for him. There doesn't seem to be any minimum age for Ludum Dare accounts, but the infrastructure around it benefits from things like email and Web-based repositories. #2 Son also worked on a game in Construct, although he said he did not want to actually submit it to the Compo. However, this morning, he mentioned that he wouldn't mind sharing it with family and friends, so folks in those categories can watch my Facebook page for links later. #3 Son also made an entry using Kodu. He really wanted to share it, but I don't think it's quite the right time for him to do so, and I'm not sure I can export a game out of Kodu anyway. Still, I love his spirit. #4 Son spent the weekend mostly watching his brothers make games. I suspect that one of these days his interest in reading will kick in, and then there will be no stopping him.

That's the story of Ludum Dare 46 from my perspective. I hope you found something interesting to take away from it. Thanks for reading!

Monday, April 13, 2020

Final Project Licensing

This morning, my game studio teams had their final project planning meetings. One of the things I urged them to discuss was the licensing of their projects. It's also something that I bring up with my CS222 students at this point in the semester. Normally, in both courses, I just take a few minutes to lay the groundwork, give my recommendations, and answer questions. I have been trying to make as many things asynchronous as I can in order to accommodate the upheaval in students' schedules, and so I took a few hours today to transcribe my thoughts on final project licensing. I wrote this specifically for my CS222 students, who will be assigned to consider this, but I also tried to keep it accessible for other uses as well.

The "live" version of my introduction to licensing is in a private GitHub repository for my students. In the spirit of no wasted writing—and also to help me find it later—I decided to also share it here. This also gives an opportunity for you, dear reader, to share your feedback with me about it. I originally composed it in Markdown, but I ran it through a converter to post it here on my blog.

EDIT: Based on feedback received from a colleague over email, I have added a clarification in the "live" version that students have available.

Without further ado, here is the first draft of the final project licensing tips I wrote up for CS222 final projects.

Final Project Licensing Considerations


I am not a lawyer. I am providing you with the results of my scholarly investigation into intellectual property. If you need legal advice, get a lawyer.
By default, you own the copyright on whatever you create. This means that you have the right to decide who gets to make copies of the work: it is literally the right to copy. In the United States, you don't have to do anything special to have the copyright. If you draw a doodle on a napkin, you own the copyright on it. You can register your copyrights with the federal government, which can be useful in the case of copyright dispute. However, remember that the copyright itself is automatic. 
For the final projects, your team co-owns the copyright to the project. This has the implication that if one of you wants to do something particular with the project, such as put it up on GitHub or try to commercialize it, you need the whole team to agree. If one member of the team does not agree, and in the absence of any other legally-binding agreement, there's really nothing else to be done about it. 
There are certain circumstances under which the university claims ownership to student-created work. An easy example is if the university paid you to make it, such as is the case for employees of the Digital Corps. This is called "work-for-hire" and it's very common: if your job is to write code for an employer, then they own the copyright—not you. The more nebulous circumstance is when the university has put significant resources into the development of your work. They have changed the articulation and specification of this rule during my years here, but in general it is meant to include cases where students benefit significantly and extraordinarily from input of faculty or access to equipment. It is important to note that for this class, I designed our interactions so that the university would have no clear claim on the intellectual property. You defined the problem, you designed the solution. While it is true that I evaluated the structure and process of your solution and helped you approach the learning objectives of the course, the projects themselves are yours.


A license determines what rights a user has with regard to a software system. This includes, but is not limited to, the rights of the user to investigate the source code of the software or to decompile or disassemble it. You probably have seen End User License Agreements (EULA) when you run software for the first time. Maybe you've even scrolled through them before clicking "Accept". These are licenses, and they are an important part of the organizational structures around software development companies. EULAs generally are examples of proprietary licenses. This means that the user has no rights to inspect the source code of the programs they are running. Proprietary licenses are not the only kind of licenses. 
Licenses do not just apply to end users. As a developer, your use of third-party libraries is also bound by the terms of their licenses. Make sure you understand the licenses of the libraries you add to your applications.


Before we can get into the non-proprietary options for licensing, it is important to understand freedom. Richard Stallman famously defined free software as software that preserves the following freedoms:
  • The freedom to run the program as you wish, for any purpose (freedom 0).
  • The freedom to study how the program works, and change it so it does your computing as you wish (freedom 1). Access to the source code is a precondition for this.
  • The freedom to redistribute copies so you can help others (freedom 2).
  • The freedom to distribute copies of your modified versions to others (freedom 3). By doing this you can give the whole community a chance to benefit from your changes. Access to the source code is a precondition for this.
Software that upholds these freedoms is said to be "free software." Note that "free" here means "free as in freedom." Critics of Stallman's choice of words have suggested that we use "libre software" instead, to make it clear that this is not about economic cost. This is also called the distinction between "Free as in speech vs. free as in beer." 
A common misconception of Stallman's position is that he wants software to be given away, but that could not be farther from the truth. Stallman advocates selling software for as much as the market will bear. One of his key observations is that "the market" here is different from "the market" in, say, sub-Saharan Africa. It would be permissible under a free software license for you to buy software and then to share it with your disadvantaged brother—a form of charity that is generally disallowed by proprietary licenses.
Free software is distinct from open source software. Free software is about licensing, but open source is about development methods. Open source software is developed "in the open" for everyone to see. Rather than doing their development behind closed doors, open source software developers keep their code publicly readable. Part of the philosophy of open source is that many eyes make all bugs shallow.
Many open source projects are also free, such as the Linux kernel, but not all open source projects are free, and not all free software is open source. However, because these often go together, you see the combination of the two referenced as FOSS, meaning Free and Open Source Software.

Common FOSS Licenses

There are a few licenses that I recommend for FOSS-licensed student project work. The Expat License (a.k.a. MIT License) is a simple and permissive license; if you've never read a license before, start here, because it's short and to the point. The GNU General Public License (GPL) is a free software license that clearly upholds the four software freedoms. This observation is echoed by a project by GitHub that helps people choose the right license for their work.
The GPL is the one I recommend to student teams because I think it aligns best with the mission of the university. The GPL says that your users can study, share, and modify the software, as long as their changes are also shared back with the community. This feels like a good choice for projects created at a public university, where the public is a stakeholder in our collective success. I like the idea that others can learn from the things you have made, and that what they learn from it is also shared back in a virtuous positive feedback loop. 
There are many more free and open source licenses. The Free Software Foundation provides a useful list of them, organizing them by whether they respect software freedom or not. However, GPL and Expat ("MIT License") are two you see frequently in use, along with Apache 2, which is essentially similar to Expat. 
Note that in none of these licensing cases do you give up copyright, nor do they restrict you from entering into other license agreements. For example, just because you release a version of your project under the GPL does not mean that you cannot also sell a version of it without GPL clauses to a customer: you own the copyright and you dictate the terms. Licenses are for the users or customers, not for the creators.

Public Domain

Besides proprietary and FOSS licenses, you have the option to release your copyright to the project and put it in the public domain. This means that anyone can do literally anything they like with the work. Note that if you choose to go this route, I recommend that you release your software under CC0 1.0 Universal (CC0 1.0), which is essentially an international form of the American public domain designation. One does not normally use Creative Commons licenses for software, but CC0 1.0 is an exception.

Considerations for Future Employment

You have spent a lot of time and energy on your project this semester, and there are extrinsic benefits to this effort. In particular, you can put it to work to help you land a job or an internship. I recommend that, at the end of the semester, you migrate your repository from our private class organization to a public presence on GitHub. This will allow you to share that link with potential employers, who can then see the fruits of your labor. That is, rather than just see that you earned some grade in some course, they can actually see the work you did. 
Of course, before you make a public location for your project, you should choose how you want to license it. Choosing your license removes any ambiguity from those reading your code and from within your team. GitHub provides a convenient Web-based interface for adding the requisite license files to your repository. Keep in mind that the libraries you use may place restrictions on how you can license your own work. For example, if you use a library that is licensed under the GPL, then by the terms of that license, your work must also be licensed under the GPL. 
Of course, it should go without saying that you need to make sure you have followed all the licenses of the libraries and assets you have used in your project! I recommend that you list all of your assets and dependencies in your project's documentation, including the licenses under which you used them—including those that you are using under public domain, just to cover all your bases. This, in addition to the project itself, shows potential employers that you are serious about the business of software development.


For more about software freedom, I recommend Free Software, Free Society: Selected Essays of Richard M. Stallman.
For more about the history of intellectual property and copyright and their impact upon creativity, innovation, and society, I recommend Free Culture by Lawrence Lessig. This book establishes the case for the important Creative Commons movement.

Saturday, April 11, 2020

Getting students in CS222 thinking about SRP and reasons to change

It has been challenging to determine which aspects of CS222 to maintain and which to drop this semester. In a normal semester, we would have done more in-class code review, more live programming, and more discussion of human-centered design; this semester, I have pivoted toward encouraging teams to make sustained effort on their projects and to meet with me to discuss their progress. There is a sense in which it is more project-focused, but there's another sense in which it is less project-centered: we have not been able to use the projects themselves for as many fruitful discussions as I would normally like.

I have tried to change some of the in-person activities into distributed, asynchronous exercises for the teams to complete. This past week, I put up what I thought was a clever assignment to help the teams think more critically about the Single Responsibility Principle (SRP). I know that thinking in terms of good object-oriented design is hard for them, and SRP is the rule to which we always return. I have emphasized—even before the switch to online—that the "responsibility" of SRP is a responsibility to someone. This has been a useful perspective for me, gleaned from watching lectures by Robert C. Martin (RCM). The assignment, then, had two parts: to watch one of RCM's presentations and then do an SRP analysis activity:

For Friday, Apr. 10:
  • Watch this presentation on the Single Responsibility Principle. In the presentation, he talks frequently about SQL, which is a language for interacting with relational databases. His point is not about databases per se but about dependencies generally. It would be fine, for example, to think about making data requests over the Web—as we did in the two-week project—when he talks about SQL and databases. Indeed, when he says that “databases should be a detail”, you can think about JavaFX as being in the same category: an implementation detail that belongs on the periphery, not at the center.
  • Work with your team to analyze the classes in the current build of your final project. For each class, identify to whom it is responsible; that is, identify its reasons for change, using the framework that Martin established in the video. Of course, we want each class to have a single responsibility, but in practice, it may not. Determining whether each has one responsibility or more is an intended outcome of this assignment. Document your findings in your team document and submit a link to the corresponding section.

I added that clause about SQL because most of my students would not have seen or heard of SQL at this point in the curriculum. In the second part, I used essentially the same verbiage I have been using all semester to talk about SRP.

I was surprised, when I reviewed the assignment submissions, that no one met my criteria for satisfaction. Indeed, none of them talked about people or roles in their discussion of responsibilities at all. All of them talked about the classes in their program as being responsible for certain things ("This is responsible for creating a URL," "This is responsible for comparing playing cards," etc.) but not about responsibilities to people in a hypothetical organization. When I wrote the question, I suspected they might struggle with this, since they may not have a good frame of reference for how organizations function. What I did not expect is that they would also completely miss the part about identifying reasons for change. Despite this being explicit in the question, something I discussed multiple times in class, and a central point of the recorded lecture, none of the four teams addressed it in their submissions at all.

After reading and commenting on their submissions, I wrote up the following announcement for them on Canvas this morning. It may help to know that of the four projects, two involve reading online data sources about the NBA, one is a card game using conventional playing cards, and one is a D&D character generator.
It seems that everyone had trouble with Friday's (4/10) assignment. This announcement provides some additional clarifications. It's the kind of thing that would work better as a conversation than a lecture, so as always, feel free to post questions in the comments below this announcement.

First, make sure you watch the required video. Listen carefully to what RCM is saying about responsibilities and change. We're not reinventing the wheel here: we're following what he has to say about it. Listening to a talk like this is like reading a good book: you have to understand each piece, each sentence, each idea. Give it your full attention. If you don't understand it, go back and listen again, the same way you would go back and read a difficult sentence or paragraph in a text. Just because the author or speaker keeps going doesn't mean you have to! When (not if) you encounter something you don't understand, switch to research mode: check your notes, write an essay, ask a question, find other sources. This is the scholarly approach that you should come to understand as part of your university education.

Once that's done, recognize that RCM says that responsibilities are people, but what he really means by that is roles in the organization. In a large organization, there is specialization; in a small organization, like our teams, there has to be cross-functionality. Also, since all team members co-own the code, it would not be right to say "This code is Jim's responsibility," because that's not true: the code is collectively owned, so it's everyone's responsibility.

What RCM is talking about is reasons for change. What role in an imagined organization would request changes?

To help you out, here are some ideas you can draw on:
  • A change to the user-interface could be requested by the user-interface designer.
  • A change to the code that generates connections to your data source (e.g. URLs) could be requested by the database administrator (DBA).
  • A change to the domain model could be requested by:
    • In a game, the game designer
    • In a productivity application, the business manager. (This is the role that potentially has the most variability in title, but I'll go with this one for now. The main idea is that it's making decisions about the core business logic: the pure model-layer of the application.)
With that in mind, look again at the current state of your projects. For each class, why would it have to change, and who would make that request? For example, rearranging the elements of the UI or changing the flow of interaction would be the UI designer. Changing the data source because you don't have a license to use the one you had been using would be the DBA. Adding another suit to a deck of cards or changing the racial bonuses for halflings would be the game designer, while changing the definition of what constitutes a basketball team would be the business manager.

For what it's worth, I've never given this particular assignment in asynchronous form before: it's been an exercise we've done in class. It's interesting to me that people had trouble with it, because if you had trouble with it now, that implies that perhaps you didn't really understand SRP when we first covered it months ago. I encourage you to think about this, because it has a strong connection to our essential questions.

I encourage the teams to resubmit the 4/10 assignment. Working through this will help you gain a better understanding of OOP in general and SRP in particular. It will equip you to discuss OOP much more clearly and pointedly than many people can. For example, in my game development circles I see a lot of discussion of the role of OOP in game development; the truth is, many of the participants have a very shallow understanding of what OO really is. Then, of course, they fight about it, because it's the Internet.

The second paragraph is really about how to watch a lecture. I suspect many of them don't know how to do this, in the same way that they don't know how to read for understanding. They may know how to skim, and how to have a video playing while doing something else, but it is clear from their lack of questions that they don't know how to read for understanding in any serious sense. Knowing that, I figured they probably don't know how to listen for understanding either. In fact, I'm sure they don't, since I have seen them listen to lectures before and fall into the "I will remember this" fallacy even when I explicitly point it out.

I decided to share this here in part to try to follow the "no wasted writing" rule. I don't think the problem I encountered is localized or idiosyncratic. However, I also wanted to leave my future self some breadcrumbs. I am not always opposed to the idea that students should have an assignment that I expect them to stumble on the first time. In fact, research shows this is a better way for them to learn than simply having them do something easy. However, the next time I give this kind of assignment, I think I will have to provide more scaffolding around the idea of "a software development organization." That's probably out of the reach of people who are still trying to figure out how to program. It might be interesting to take this assignment and spin it narratively: to name these people in the organization, provide examples of the kinds of changes they would ask for, and then have the students write their own stories based around changes to their projects. In particular, this might help get them thinking about what "reason for change" really means, since I suspect many of them lack the maturity to understand that real software is not just fire-and-forget. The nine-week final project in CS222 helps get them to understand this, but it's a journey.

I also wanted to share this because I wonder what you, the reader, think about the way I have described the roles in a software development organization. Are these good titles to get the students thinking about? I intentionally omitted technical management, who would ask for unit tests or refactoring, because that's already a part of the required methodology—which you wouldn't know from looking at three of the four projects, but that's a topic for another day.
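To make the mapping from classes to roles concrete, here is a sketch in Python of the kind of split I was hoping the students would see. The class and method names are hypothetical, and the data source is stubbed rather than being a real NBA API:

```python
# Before: one class with two reasons to change. The DBA would ask for
# changes to how data is fetched; the UI designer would ask for changes
# to how it is presented. Two roles, so two responsibilities.
class PlayerReport:
    def fetch_stats(self, player_id):
        # Stubbed network call; in reality this would build a URL and
        # query a data source, so the DBA owns changes here.
        return {"points": 21, "rebounds": 8}

    def format_stats(self, stats):
        # Presentation logic: the UI designer owns changes here.
        return f"PTS {stats['points']} / REB {stats['rebounds']}"


# After: each class answers to exactly one role.
class StatsGateway:
    """Responsible to the DBA: how we talk to the data source."""

    def fetch_stats(self, player_id):
        return {"points": 21, "rebounds": 8}  # stubbed network call


class StatsFormatter:
    """Responsible to the UI designer: how results are presented."""

    def format_stats(self, stats):
        return f"PTS {stats['points']} / REB {stats['rebounds']}"
```

The point is not these particular names but that, after the split, each class has exactly one role that would request changes to it.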

Monday, April 6, 2020

Canning Heroes: Special Edition

Some of you may remember that in Spring 2019, I had a crack team of undergraduates who developed Canning Heroes—an award-winning, multiplayer, cooperative game about food preservation. This was part of an immersive learning collaboration with Minnetrista. Canning Heroes was developed as a museum installation, playable on an all-in-one touchscreen intended for installation in a tabletop display. This constraint gave the team necessary, laser-like focus on one specific set of hardware, but it had the unfortunate side-effect of making it hard for anyone else to ever see or play the game. We posted it online for people to check out, but if you don't have a 1920x1080 Windows touchscreen, you really don't get the intended experience.

Coincident with the severity of the pandemic becoming more apparent, I was talking with our partners at Minnetrista about whether Canning Heroes could be made more available somehow. It was written in Unreal Engine 4, which does support deployment to multiple platforms, but distributing those applications is non-trivial—especially for Mac. I also remembered that much of the code and the asset design assumed the game would always run at one resolution, further complicating any attempts to port it to other hardware configurations. I imagine that many museums and cultural institutions are looking for ways to reach out to people who cannot visit their facilities in person, but at the time, I reported that it was possible but not easy to build the game for other platforms.

A little over a week ago, this story crossed another one that has been bouncing around my mind: that, as I wrote about in my March Fam Jam post, I have been interested in learning Godot Engine. The Fam Jam game was a good one-day project, but it didn't give me the opportunity to look into some of the real details of Godot. This led me, on the weekend of March 28–29, to look into what it would take to make an HTML5 version of Canning Heroes using Godot Engine.

I'm glad to tell you that the exploration has been fruitful. I have been able to reuse the original assets and design to make a Web-playable version of Canning Heroes that, for lack of something punnier, I am calling Canning Heroes: Special Edition. Like the original, it's free software licensed under the GNU GPL, and I just flipped the switch to make the repository public this morning. You can play the game and browse the source online.

One of the beneficial side-effects of the shift to online courses is that I've been able to put my head down and get this port done in just over a week. I have still had many meetings with my students, but I haven't had any other distractions to get in my way. This effort clearly fits into the research, teaching, and service aspects of my job as a professor, so I feel like this was fruitful scholarship. I worked on it for several hours every day, including several evenings last week as well as a few this past weekend. I would estimate the whole effort took about 60 hours, but I did not track my hours. I thought about using this as an opportunity to learn HackNPlan as well, but this seemed like busywork since I already had a reference implementation to guide my work. In fact, I recorded a video of myself playing from my desktop machine (awkward without a touch interface) as well as this photo of the credits screen so that I could use them for reference.

For the rest of this blog post, I want to highlight a few interesting things from this week of development. Note that the current build as of this post is Build 14.

Tween Pool

I started in with washing carrots, just like Happy Accident Studio did in the original. I knew that the carrot minigame contained everything that all the others needed, so it seemed like a good place to start. I knew there was an easy cap on the number of carrots on the screen at once, but I could also foresee that as I dealt with more and more moving pieces, I would need to manage more dynamically-defined animations. Since I didn't know where exactly on the screen I would spawn the vegetables or where I wanted them to go, I used Tween nodes, which are easily configurable through scripting in a way that AnimationPlayers seem not to be. However, I wasn't sure exactly how many of them I would need, nor was I really sure how expensive they were to create, so I decided to implement a quick pooling solution.

The code for it is pretty simple. I made a scene that can be instanced into any other one, but it was really designed to be used by the Game "shell". It responds to requests for tween instances and returns free ones or, if none are free, makes a new one. When a tween is finished, it goes back into the queue of free ones.
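The actual implementation is a GDScript scene managing Tween nodes, but the pooling idea itself is language-agnostic. Here is a minimal sketch in Python with hypothetical names; in the real code, reclamation is triggered by the tween's completion signal rather than an explicit call:

```python
class TweenPool:
    """Object pool: hand out free instances, create on demand, reclaim when done."""

    def __init__(self, factory):
        self._factory = factory  # called when no free instance exists
        self._free = []          # reclaimed instances awaiting reuse
        self.created = 0         # how many instances we ever built

    def acquire(self):
        # Prefer a reclaimed instance; only build a new one when the
        # pool is empty, so the population grows to peak demand and stops.
        if self._free:
            return self._free.pop()
        self.created += 1
        return self._factory()

    def release(self, instance):
        # In the Godot version this happens when the tween signals that
        # it has finished; here the caller returns it explicitly.
        self._free.append(instance)
```

A released instance is handed back out on the next request, so repeated acquire/release cycles never allocate more than the peak number of simultaneous animations.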

I embedded that code using a Gist. For many years, I've used this formatter, which works but has always struck me as a bit of a kludge. I'm not so sure about the Gist approach either, since I cannot actually see the code while editing in the rich-text editor, and in the HTML editor, it is of course just a script.

In any case, back to the approach: I admit that it is likely a case of YAGNI, but it gave me a good opportunity to think about the implications of the node structure on a classic design pattern.

Thinking in Scenes

I mentioned in my Fam Jam post how I think that "scene" is a bad metaphor for what one is building using Godot Engine. I still think that's the case, but it's also true that the longer you work in the engine, the less that word has any conventional meaning: it becomes simply a tag for the thing you do in the editor.

There's nothing fundamentally novel about Godot's tree of nodes. In UE4, a level is really a tree, and a good OO approach would have "branches" which are actors that aggregate other actors. I wonder, though, if seeing the world as a single tree can help learners understand dependency injection better than when it is only a mental model. It makes me wonder what it would be like to scaffold a learning experience between the two engines. That is, would learning something like Godot help someone pick up on how to write high quality UE4 code, or even PlayN code—two other elegantly-designed systems that leave a lot of architectural power in the hands of the developer?

Build Scripting and Build Numbers

Once I had my tools in place, I knew I needed to move from manually uploading builds to automating the process. Godot's command line makes it easy to invoke an export configuration, and so from there, it was mostly just remixing the scripts I have for publishing other applications to GitHub Pages: build the artifact, put it into a new repository that points to the gh-pages branch of the main repository, and push away.
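The steps above can be sketched in Python. The preset name, paths, and commit message below are hypothetical; the real script is specific to this project's repository layout:

```python
import subprocess


def publish_commands(preset="HTML5", out_dir="build/web", repo_dir="gh-pages"):
    """Return the commands the publish step would run, in order."""
    return [
        # Invoke the named export configuration from the command line.
        ["godot", "--export", preset, f"{out_dir}/index.html"],
        # Stage and commit the artifact in the repo tracking gh-pages.
        ["git", "-C", repo_dir, "add", "--all"],
        ["git", "-C", repo_dir, "commit", "-m", "Publish new build"],
        # Push to the branch that GitHub Pages serves.
        ["git", "-C", repo_dir, "push", "origin", "gh-pages"],
    ]


def publish():
    for cmd in publish_commands():
        subprocess.run(cmd, check=True)  # stop on the first failure
```

Separating command construction from execution also makes the pipeline easy to inspect or dry-run before letting it touch the repository.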

As I was debugging the HTML5 build, I had some trouble knowing which version of the application I was running on the Web. When you push to GitHub Pages, you cannot instantaneously get those files through a Web query. Also, it was not at all clear to me whether the HTML5 exporter in Godot had any kind of cache checks like the ones I use in Polymer to ensure that the most recent version of the application is loaded. I decided that a good, conventional solution would be to show the build number right on the UI of the game. You can see the build number in the bottom left of this screenshot.
I'm not sure I've used build numbers like this before, despite it being bog-standard practice. I did a little searching on the Web to see if anyone had documented approaches for doing this within Godot Engine, but I didn't find any. I put together a quick test to confirm that I could read a build number from a TextFile resource (whose API documentation, you may notice, is a bit sparse). From there, I created this BuildNumber script that simply loads the text resource's content into an integer and makes it available. The script is used by the MainMenu scene to create the label.
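The idea behind the BuildNumber script can be sketched in Python. The real version is GDScript loading a TextFile resource, and the file name here is hypothetical:

```python
def read_build_number(path="build_number.txt"):
    """Load the build number from a plain-text resource, defaulting to 0."""
    try:
        with open(path) as f:
            return int(f.read().strip())
    except (OSError, ValueError):
        # Missing or malformed file: fall back to an obvious sentinel
        # rather than crashing the main menu.
        return 0


def build_label():
    """Format the number the way a main-menu label might display it."""
    return f"Build {read_build_number()}"
```

The build script bumps the number in the text file on each export, so the label on the main menu always tells you which artifact you are actually running.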

With that done, it was easy to be sure which version of the application I was testing.


A fair part of my learning last week involved reading documentation and poking around the Web for tips and tricks. The Godot Engine documentation is really quite nice, being both technically detailed and approachable. There are some ideas whose explanation is unfortunately spread across multiple, non-interlinked pages, but with enough patience, one can find what one needs. The popularity of the engine among novices means that it's easy to find quick answers to common problems as well.

I am glad that I came across the GDQuest Best Practices: Godot GDScript page. It gave me a solid framework on which to build the application skeleton, which is important since it's too hard to refactor once things are in place.

As with the word "scene", working with the idea that whitespace is significant gets easier with time. However, I still think it's a serious language design problem. This is best illustrated by the fact that the API encourages the use of long, untyped parameter lists but the style guide calls for short lines of code. You end up, then, in a position where you're using whitespace both for significance of blocks and also to make single lines readable. While it's true that you can scan around and try to figure out what a broken line might mean, it also means you cannot use indentation unambiguously—which was, as I understand it, the point of this design choice in the first place. Compare this to C-style languages, where if you see someone using a fluent interface and arranging it cleverly with whitespace, there's no doubt what's happening: it's not surrounded in braces, after all. Not to beat a dead horse, but some of the complications of long methods could be fixed by breaking them down into more functions, but without good refactoring support, all such changes are potentially game-breaking. Combine this with the idea that scripts link to each other through names of methods (strings) rather than function pointers, and you get the potential to silently break distal parts of the system.

It's not all pedantry, though. I'm seriously thinking about the pedagogic implications of learning programming through GDScript. Parts of it are appealing, like the speed with which you can get something up and running. Other parts are disconcerting. Consider this phenomenon: when you use the assignment operator, it may assign a value to a variable, or it may invoke a mutator method. At the calling site, you don't know which. As a developer, I get the convenience of that: essentially, I can think of every assignment as going through a mutator, and only sometimes does it matter. However, what if I'm a complete novice trying to understand the semantics of assignment? Whoa. Suddenly I have to either go with the pedagogy of little white lies or understand function call semantics and object scope in order to debug what look like assignment operations. Let's say I make sense out of that: what practices will I carry forward into languages like C++ or C#? Sure, C# does something similar with properties, but then we can kick it up a notch: if all we're doing is adding accessors and mutators to private fields, then we're not actually doing good OO modeling at all! We're repeating the sins of the 1990s all over again and still never teaching learners to do anything besides procedural design.
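Python's properties exhibit the same phenomenon as GDScript's setget, so the confusion can be demonstrated outside of Godot. Here is a hypothetical class where an "assignment" quietly runs code:

```python
class Jar:
    def __init__(self):
        self._fill = 0
        self.mutations = 0  # counts how often the "assignment" ran code

    @property
    def fill(self):
        return self._fill

    @fill.setter
    def fill(self, value):
        # What looks like plain assignment at the call site actually
        # invokes this method, which can clamp, count, or trigger effects.
        self.mutations += 1
        self._fill = max(0, min(100, value))


jar = Jar()
jar.fill = 150  # reads like assignment, but the setter clamps the value
```

A novice reading `jar.fill = 150` has no syntactic cue that a method ran, which is exactly the debugging trap described above.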

But I digress. Hopefully you can see that there are some interesting considerations for someone who does both teaching and practice, and there are many opportunities for research on teaching and learning.

Differences from the Original

I tried to keep Special Edition as close as possible to the original in terms of gameplay. The only feature I added was the ability to go fullscreen, which, combined with screen scaling, was amazingly easy in Godot Engine. I kept the assets, even the ones that I thought were a bit underproduced by the original team, such as the red dots. Yes, they call your attention to the thing you can chop, but they're not very visually interesting. I should mention that it was a joy to use most of the assets: dropping in those lovely carrots or that jaunty music let me go much faster than if I were building up a game from scratch or had to scour the Web or my hard drive for assets.

I made some changes to the "About" page to provide a bit more context about the game. Again, since the original was made for a specific physical location, it could get away with some shorthand.

There are a few things in the original that I do not have in the Special Edition. The most obvious omissions are the illustrations on the main and end pages and the tutorials. The illustrations are really just a matter of laziness: they don't do anything, and I know I can add them; I just haven't done it yet. Who knows, maybe by the time you read this, I will have gone in and added them. I remember the original team talking about wanting to make them more interactive, like having the little jar people boogie or react to being touched, but that's probably out of scope: I would likely just make them play the "clink" sound, which is what they do in the original.

The tutorials are a more serious omission, since our playtesting showed that they really helped players understand what they were supposed to do. I have learned both the Tween and AnimationPlayer APIs, and so I am confident that I could add animations to demonstrate how each minigame works. This would take much more time than, for example, dropping in the illustrations. It would also be time that I'm not sure would be productive for me. For now, then, I'm leaving this on the cutting room floor.

Wrapping Up

I believe those are all the major points I wanted to share about last week's learning adventure. I'm glad I did it, and I hope that the result is something that can bring some joy to other people. I am feeling more confident now with Godot Engine. If I don't get too rusty too quickly, I think I could use it productively in a summer project or a jam. This week, I need to return to making some video tutorials for my YouTube channel. I have considered whether any of the things I've mentioned above might make good tutorials, but I can't shake the feeling that something like pooling or build scripts is really better explored in text: the devil's in the details for such things, and it's hard to get at those in a bite-sized video. 

Thanks for reading! If you have any questions, let me know.