Sunday, May 21, 2023

Noodles

There are many reasons to come to this blog, but the most important one is probably the recipes. Here is where I share with you important information, such as the ratio of chocolate to cream when making truffles and how many arbitrary things I've thrown into alcohol to make drinks. Crucially, this is a place where I can come and find things that I might otherwise forget. Since it is an Internet Law that one must share a story before a recipe, I will give you an example.

I have fond memories of homemade chicken soup. When I was sick, my Polish grandmother would make chicken soup with big, soft noodles. I asked her how to make these noodles, and her answer was basically that they are just noodles—you just add flour and eggs and water until it's right. That is undoubtedly the proper way to make them, but it led to many years of my unsuccessful attempts to recreate something like her noodles. For some time, I was making something more like pasta, which I'd roll out and cut. These were made using a cup of flour, an egg, a pinch of parsley, and just enough water to hold it together. This was based on a combination of my grandmother's advice and online recipes for kluski.

A recent issue of Milk Street magazine featured a recipe for Hungarian dumpling noodles called "nokedli." It turns out I had remembered the name incorrectly, which explains why my arbitrary combinations of letters were returning no search results. The article claims they are like German spaetzle, which certainly puts them in the same category as a comforting Polish noodle. I put these into my next soup, and I'm sure these are much more like what my grandmother meant with her advice. What I mix up now is batter rather than dough, and I roughly flick it into the boiling water off the back of a cutting board rather than carefully cutting and separating pieces before dropping them in.

Look at that, only three paragraphs so far and not a single ad. You're welcome. Here's the ratio that makes it all happen:

  • 2 cups of flour
  • 2 eggs
  • 1 cup of water

Mix it up, cover it, let it stand for half an hour or so, and that's really all there is to it. The magazine's recipe also calls for 1 tsp. salt, 1/2 tsp. pepper, and 1/2 tsp. nutmeg. These are fine things to add, though last night I made the noodles without them, and I do not think anyone noticed. Since I'm throwing them into soup, they pick up the flavor of the broth. If they were served as a standalone carb, they would probably want spices or butter.

Saturday, May 13, 2023

Awaiting multiple signals in Godot GDScript

I ran into a problem with my summertime project where I want to wait for multiple coroutines or signals to complete. These happen to be related to animations. Depending on the situation, multiple different animations may play together, but I don't want to advance until all of them are complete. I have used some other asynchronous programming libraries that have built-in support for this, but alas, Godot has no such abstraction. 

I built this utility class that fills the bill. As with my previous post, I decided to share the code here since the "live" version is locked away in a private repository.

## Aggregates multiple awaited signals or coroutine calls, emitting
## [signal finished] once every one of them has completed.
class_name CompositeSignal
extends Node

signal finished

## The number of awaited items that have not yet completed.
var _remaining : int

func add_signal(sig: Signal) -> void:
    _remaining += 1
    await sig
    _remaining -= 1
    if _remaining == 0:
        finished.emit()


func add_call(object, method_name: String, argv: Array = []) -> void:
    _remaining += 1
    await object.callv(method_name, argv)
    _remaining -= 1
    if _remaining == 0:
        finished.emit()
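
For context, here is roughly the shape of how such a class gets used; this is an illustrative sketch, and the node paths and method names are placeholders, not the ones from my actual project:

```gdscript
# Hypothetical usage; the node paths and method names are made up for illustration.
func _play_transition() -> void:
    var composite := CompositeSignal.new()
    add_child(composite)
    composite.add_signal($AnimationPlayer.animation_finished)
    composite.add_call($Curtain, "fade_out")  # fade_out is assumed to be a coroutine
    await composite.finished
    composite.queue_free()
```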

This was developed test-first using GUT. Here's the test code. There's not actually a test here to demonstrate that the two add_ techniques work together, but that is exactly how my integration uses them in practice.

extends GutTest

## The amount of time a timer can wait that is safe for a unit test to run
const _TIMER_DURATION := 0.1

var _executions := 0

func before_each()->void:
    _executions = 0


func test_add_call(p=use_parameters([[1],[3]])):
    var composite = autofree(CompositeSignal.new())
    var number_to_add :int = p[0]
    for i in number_to_add:
        composite.add_call(self, "_run_coroutine")
    await wait_for_signal(composite.finished, 2, "Signal should have returned")
    assert_eq(_executions, number_to_add, \
        "There should have been %d invocations." % number_to_add)


func _run_coroutine()->void:
    await get_tree().create_timer(_TIMER_DURATION).timeout
    _executions += 1


func test_add_signal():
    var generator :SignalGenerator = autofree(SignalGenerator.new(self))
    
    var composite = autofree(CompositeSignal.new())
    composite.add_signal(generator.a_signal)
    generator.run()
    
    await wait_for_signal(composite.finished, 2, "Signal should have fired by now.")
    assert_true(generator.completed, "Expected to wait for generator state to update")


class SignalGenerator:
    signal a_signal
    
    var context
    var completed := false
    
    func _init(context_p):
        context = context_p
    
    func run():
        await context.get_tree().create_timer(_TIMER_DURATION).timeout
        completed = true
        a_signal.emit()

Wednesday, May 10, 2023

Running Godot unit tests using GUT with a pre-commit hook

I've been tinkering with a project idea for months, creating and discarding over a dozen variations. I was hoping to have a prototype or vertical slice complete before the start of summer, but I was unable to do so. This current variation is promising, but I ran into a situation where a lot of the code I had laid down was hard to test. In particular, I needed to check the interaction of subsystems that were probabilistic, and that's not only difficult to do manually, it's also prone to regression. Time for unit tests!

I'm using Godot Engine 4.0, and so I brought in GUT and started redesigning my code to be more testable. This alone is almost certainly worth the effort. Tests lose their value if they are not executed, and so I started looking into automatic ways to run my tests. Godot Engine has no such system built in, but I am using git, and so I investigated pre-commit hooks. This is something I've known about, and I think I may have even tinkered with before, but I don't believe I have ever incorporated them into a project. My previous work with automation used IDEs with built-in capability to run tests on commit, or I used server-side integration tests to check my work.

Someone who does a lot of shell scripting and git hooks would probably laugh at how simple this is. However, I want to share the script here in part because the "live" version is currently hidden away in a private repository. Here is the executable script that I put into my project's .git/hooks directory:

#!/bin/sh
if godot -d -s --path project addons/gut/gut_cmdln.gd -gdir=res://test/ -gprefix="" -gsuffix="_test.gd" -gexit
then
    echo "Tests passed"
else
    cat <<\EOF
Unit tests failed. Commit aborted.
EOF
    exit 1
fi

This works pretty well. One downside is that it has to pop up a window during the tests. I would like to find a way to prevent that. Also, I could probably get away with having the tests run only when I'm committing to master rather than on feature branches.
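
A sketch of that branch restriction, assuming the standard git plumbing; I have not actually wired this into my hook yet:

```shell
#!/bin/sh
# Skip the test suite on feature branches; only run it when committing to master.
branch=$(git symbolic-ref --short -q HEAD)
if [ "$branch" != "master" ]; then
    exit 0  # not on master: allow the commit without running tests
fi
# ... then run the GUT command from the hook above ...
```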

Wednesday, May 3, 2023

Reflecting on CS222 Spring 2023 and the problem of participation

This was my second semester using Dart and Flutter in CS222, and I have no regrets about the platform and language change. I still have many lingering questions about how best to introduce the topics and pace them, and I have some related research questions as well: Does learning null safety make students better or worse at managing null values when using a non-null-safe language? Does Flutter's declarative UI approach make students more or less able to learn other UI frameworks—as well as game engines—later?

The other major change that was made just this semester was the separation of assigned non-project work into "activities" and "challenges," where the latter can be resubmitted for a higher grade a la mastery learning but the former are fixed in time. From a philosophical point of view, I think this also worked, in that it captured in words something that was previously only in my mind. The presentation of this distinction to the students can be clarified, and it probably should be, given the small but non-zero number of students who tried to resubmit "activities."

There were three projects this semester, and all of the teams did a fine job bringing the programs together. Unfortunately, all three also dropped the ball at the end when it came to rigor. Some had made changes to their code that broke their unit tests, while others had no indication of understanding TDD despite having reasonable model-view separation. Put another way, I think they got caught up in cowboy coding at the end, trying to make it work rather than trying to make it right. This is a strong temptation to students, and while it's frustrating, it is also an excellent assessment of whether or not they learned the topics of the class. The central topic of the course is, essentially, that software development should be done right. I don't think there are any traps here besides the ones the students set for themselves, but I will continue to try to find ways to keep the presentation of expectations clear. I still suspect that very few students actually read and review the requirements, preferring to do what they want rather than what I ask.

A much more troubling observation from the semester is the very low level of participation in the mastery grading aspect of the class. I haven't kept rigorous notes, but I remember feeling like participation was low last semester. This may be a trick of memory, though, given how many other people on campus are talking about low engagement rates. I sat with my spreadsheet this morning to try to quantify participation, and I have included the table below.

Score              1       2       3       4       5       6       7       8
3                 10      10       4       4       3       7       3       5
2                  0       1       6       4       4       1       1       1
1                  1       1       3       0       1       0       4       1
0                  3       2       1       6       6       6       6       7
% Satisfactory    71.43%  78.57%  71.43%  57.14%  50.00%  57.14%  28.57%  42.86%

The columns after the first are the challenges. Following triage grading, a student can earn 3, 2, 1, or 0 points for each: 3 means it's basically correct, 2 means it is middling, 1 means it is incorrect, and 0 means it was not turned in. The "% Satisfactory" row counts the scores of 2 or 3.
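
As a sanity check on my spreadsheet work, the bottom row of the table can be recomputed from the counts, assuming fourteen students and "satisfactory" meaning a score of 2 or 3:

```python
# Counts of each score per challenge, transcribed from the table above.
counts = {
    3: [10, 10, 4, 4, 3, 7, 3, 5],
    2: [0, 1, 6, 4, 4, 1, 1, 1],
    1: [1, 1, 3, 0, 1, 0, 4, 1],
    0: [3, 2, 1, 6, 6, 6, 6, 7],
}
STUDENTS = 14  # every column sums to the class size

satisfactory = [
    f"{(threes + twos) / STUDENTS:.2%}"
    for threes, twos in zip(counts[3], counts[2])
]
print(satisfactory)
# ['71.43%', '78.57%', '71.43%', '57.14%', '50.00%', '57.14%', '28.57%', '42.86%']
```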

The table tells an important story of the semester. The first challenges, which were quite easy, were completed successfully by almost everybody. Not all of these were successfully completed on the first try, of course, but they were done. Almost immediately, by week two, there's a significant decline in correctness. This is followed, between weeks two and three, by a significant decline in participation. By the fourth challenge, a plurality of students did not submit anything at all. They also did not take the opportunity to resubmit this work despite many reminders and even admonitions. To be clear, especially at the end of the second iteration of the final project, I pointed out to them the low rates of participation, and I posed the question, "How will you be able to apply principles for chapters you haven't read?"

It's not a rhetorical question, but no one was able to answer it. I am really baffled. It should be no surprise that "good students" did the reading, submitted things, got some feedback, and then showed some understanding of the concepts. I really don't know what happened with the other ones. Did they forget, time and again? It's possible, since they also don't write anything down. Did they not care? Also possible: maybe they see the class as an impediment to be avoided rather than an adventure to be undertaken. 

Here's the real kicker: 5% of the students' grades comes from participation in the achievement system. The achievements are meant to be fun explorations of new ideas that clearly connect the concepts of the class with students' interests. Many are designed to help them become better learners. This semester, only two students submitted any achievements at all. Those two earned 10 and 12 points out of a possible 12. No one else submitted any, nor did they even ask about them, despite repeated reminders. What makes a student look at a part of a class like this and simply reject it? The conventional wisdom for teachers is that if you value something, put points on it, but when 12 of the 14 students won't even pursue the points, you know there must be something else going on.

Without answers to those questions, I am left unsure of what kinds of change might improve participation. I have kicked around some ideas of a major overhaul to the course, using only badges and gated progressions. Given my other goals for the summer, though, I don't think I will have the spare cycles to commit to something like that.

Monday, May 1, 2023

Ludum Dare 53: Mr. Delivery Man

Ludum Dare 53 took place the weekend before final exams. This is a great time for a professor to step away and do some creative exploration, but it's not a good time for my students. Truly, it's hard to find a good time for students to jam since it's not like the rest of their classes hold off for a bit so they can spend 48 hours making something, despite the fact that this might be the single best way to get better at making things.

The theme for Ludum Dare was "Delivery," and at first I was a little disappointed. It felt a little like people were voting on the worst part of RPG quest grinding. After some discussions, I realized there are a few different interpretations, and I look forward to exploring more of the games during the scoring period. For my own game, though, one of my first ideas is what really stuck: a delivery person who manhandles packages. The result is Mr. Delivery Man.

I read the theme Friday night but could not set to work until Saturday morning. It did not take me long to get a minimally playable proof-of-concept implemented: drop a package and kick it as far as you can. I sketched some character ideas on paper, and then I pulled out my Wacom Bamboo to draw them up in Gimp. I feel much more comfortable sketching in pencil than I do doodling with my digital tablet. Every time I do something like this for a jam, I feel like I should invest some more time in getting fluent with the tools. Then I think about how long that would take and deprioritize it. Still, I wish I had more confidence. I watch my art majors with their tablets, and they fluidly add layers, sketch ideas, and move on. All my ideation is on paper, which makes the digital production feel weighty, leading to a lot of reworking. It's not like you'd know it from the results, either. They are only good enough.

A digitally-enhanced photograph of a goofy pencil sketch of some character ideas

I tinkered with trying to add a knee joint to get something with a QWOP feel, but I ended up preferring the straight-leg kick. I also spent a little time trying to animate the character as if he was holding the package before dropping it, but this proved tiresome and didn't significantly add to the gameplay. The result is that he's got a bit of Homestar Runner magic in how he holds that box in the air.

Recording the sound effects was some of the most fun I had on the project. I did a little quick research on how to Foley the sound of glass breaking. I took a break from that to make an afternoon coffee. Opening the cupboard, I was inspired to clink some mugs together. Perfect. I recorded myself clinking mugs together for the glass sound, and then I punched a spare board game box for the impact sounds. Conveniently, the box had some bits in it, which gave it a nice depth of sound. It's subtle, but it's quite nice. Whenever the box is struck, one of four random glass sounds and one of four random impact sounds are mixed together, so even though it's a small set of sounds, it still feels audibly satisfying.

The music also brought me great joy. Tritones play on tremolo strings before dropping the box, and this adds lovely tension to the very silly experience. There is a moment of silence as it falls. I experimented with playing audio through the fall and the kick, but this was much less fun than just bringing in the music after the box came to a rest. I think the polka is just right for a successful kick, and the sad trombones came out as I desired. I had to look into LMMS' controller tracks and pitch variation for the trombones. This feature didn't work quite like I wanted, since I wanted to mix square and interpolated forms; there may be a way to do this, but I couldn't find it. I was able to get a satisfactory sound by adding enough control points to a curved wave.

By the end of Saturday, I wanted to get a Web build up to make sure everything was working as intended. Here is where the headaches began. My usual continuous integration configuration makes use of the godot-ci action, but it does not currently support Godot 4.0. There is an open issue about this that actually had some discussion contemporaneously with Ludum Dare 53. The latest post points to an alternative approach. I looked into that, and after a lot of frustration and back-and-forth with GitHub Actions (push, wait, check, fix, repeat), I ran into a more troubling issue: Godot 4.0 Web builds require SharedArrayBuffer. I had just encountered SharedArrayBuffer on Friday when working with my game studio team: a subset of them were trying to fix an audio stuttering problem, and we had discovered that we could address it by enabling the "Threads" option on the HTML5 export. This worked fine on our test machine but then failed in integration. This is because SharedArrayBuffer requires custom HTTP headers, and GitHub Pages does not allow such a thing. All of this came after I had just added on-screen buttons for all the controls, assuming the Web build would work when deployed, but now I had to consider the real possibility that this would be strictly a native game. The biggest problem with this is that it's much harder to integrate with the Ludum Dare review community if you don't have a Web build. Indeed, one of the threads I just saw on the LD site was asking for a filtering feature since the author did not want to run any non-Web games.

Somewhere on my journey through many search results, I saw someone mention that itch.io had recently added custom header support. Whether this was specifically to support Godot 4.0 games or a beneficial coincidence, I do not know. A quick test showed that, indeed, this worked just fine. While I prefer just having all my jam projects in one place, and that place being GitHub, it was very easy to get the version on itch.io playable. I set up a Makefile to automate the generation of the Web build and its zipped contents, but I have not yet gone so far as to automate uploading to itch.io using butler. If I have to continue using itch.io as a platform for jam games or side projects, I will have to look into it.

It was midday on Sunday before I had the deployment issues sorted out, and I still had the energy to do a little bit more with the game. Something that had inspired me early on in the project is that just kicking the boxes around is a fun toy, and I figured I could use achievements to reward players for trying different things. Mr. Delivery Man then became my first game with achievements. I think that they serve the design purpose very well, and this sentiment is echoed by the early comments on the game during the Ludum Dare rating period. The software system behind the achievements is not terribly clever: each achievement is an object with a function that can tell whether the achievement has been met based on a game record, and after each play, the list is checked to see whether any new ones have been met. I had hoped that after implementing my first achievement system, I would have a better idea about how I would want to do it next time. The truth is that it was just sort of pieced together, and I'm not sure I gained any great insight into implementation patterns.
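
A minimal sketch of that shape, with hypothetical names; the real achievement definitions and game-record fields in the project are different:

```gdscript
# Illustrative sketch; class names, fields, and predicates are placeholders.
class Achievement:
    var title : String
    var _predicate : Callable  # Takes a game record, returns true if earned.

    func _init(title_p: String, predicate: Callable):
        title = title_p
        _predicate = predicate

    func is_met(record) -> bool:
        return _predicate.call(record)


# After each play, check the full list against the latest game record.
func _check_achievements(record, achievements: Array, earned: Array) -> void:
    for achievement in achievements:
        if not achievement in earned and achievement.is_met(record):
            earned.append(achievement)
```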

That's the story of Mr. Delivery Man. He was almost named Delivery Dan in an homage to great characters in both Red Meat comics and Firesign Theatre, but leaving him anonymous ended up feeling right. Once again, here is the Ludum Dare page for the project; while you are there, consider checking out Delivery Destroyer and Messenger, which were created by two of my sons.