Wednesday, September 18, 2019

Revisiting the state analysis of Every Extend

Back in 2007, my first Computer Science education research paper was published in the proceedings of SIGCSE. The paper provides an overview of how game programming can be used as a motivating context for learning design patterns. One of the examples in the paper, which I have used for years, is the behavior of the player-controlled bombs in the inimitable Every Extend, which I use to explain the State Design Pattern. In the paper, I argue that the player's pawn can be described by a state machine like the following.
In yesterday's meeting with my CS315 Game Programming students, I was inspired to create a UE4 tutorial video about using state machines for state management. In particular, I wanted to show how an enumeration can be used to define the possible states, and then behavior can be switched depending on the current state. This is not, of course, the State Design Pattern, but it's a heck of a lot better than an ad hoc collection of Boolean variables. It also aligns with Michael Allar's Gamemakin' UE4 Style Guide, which is mandatory for my students and which says:
Do not use booleans to represent complex and/or dependent states. This makes state adding and removing complex and no longer easily readable. Use an enumeration instead.
To prepare for the video, I have spent a few hours the last two days building a simple playable framework similar to Every Extend. Over a decade ago, I did a similar exercise in Java to produce EEClone. This time, I did it in UE4. (A bit of an aside: Every Extend is like my kata for new game programming frameworks. I use its essential gameplay as a case study to understand how a game engine functions. Of course, then, when I first started learning UE4, I tried making an Every Extend clone, but it went quite poorly. After a few years of working with UE4, it's become a lot easier to get the game up and running!)

With my gameplay framework in place, I started working on the diagram shown above to use in my series of game programming tutorial videos. However, as I looked at the diagram, I realized I had a rather fundamental problem: these are not the states of the player bomb. It's true that these are, abstractly, states of the game, but they are not the states of a single stateful object. Instead, what I had programmed pointed toward the following state model:
That is, a player bomb is created, which has it play a spawning animation. When that is completed, then the player becomes vulnerable to collisions but can also press Space to explode. However, unlike in the initial analysis, there are no other states after this. If the player presses Space or is hit by an obstacle, the player's bomb is destroyed: that actor—that object—no longer exists. It is the game mode that either counts down to respawn or waits until a chain of explosions is complete and then counts down to respawn. These states are not part of the player pawn.

This observation shoots a hole in my plan to make a video using the Every Extend example. I still want to make a video about state analysis, but I need to find a better example—one where a single actor's state is interesting enough to be modeled but simple enough to fit into a tutorial video. Please feel free to share your suggestions in the comments.

Monday, September 16, 2019

Muncie DevFest 2019

I was invited to speak at DevFest Muncie 2019, an event hosted by my friend and colleague Chris Turvey, who runs the Muncie Google Developer Group. I've known about DevFest for a few years, but this was the first time I had the chance to attend. My oldest son and I went together, and it was fun to share the day with him. The morning speakers included talks about playing music over a 300-baud modem on a C128, how to design custom circuit boards, smart cities, developing experimental audio systems using JavaScript and Firebase, along with my own humble contribution, a brief discussion of Kaiju Kaboom.

One of the reasons I'm writing this post is simply to share a link to my slides. Like most of my presentations, the slides will not make a whole lot of sense without the stories to support them, but I want to make them available nonetheless. The slide showing my sales drew hearty applause when I explained that there is no animation or anything—that's just the chart.

In the afternoon, my son and I joined a workshop introducing Particle products and services. It was the first time I have gotten my hands dirty with IoT technology or, really, with anything in that hardware hacker space. The integration of their technology is really amazing to me, since I understand at least the theory of how each part works. And yet, despite this, it took us over an hour to get an LED to blink on a board. As for me, I think I'll stay in softwareland, but it does seem that my son's interest was piqued. Thanks to the sponsorship of Particle, we were able to take the boards home with us, so as soon as I get them rigged up to my home Wi-Fi network, he should be able to tinker some more.

Thanks to Chris and the rest of his co-organizers, as well as the MadJax staff and the other sponsors. It was a nice day to share with developers of all experience levels from the region.

Sunday, September 8, 2019

Painting Lord of the Rings: Journeys in Middle Earth

My brother got me excited enough about Lord of the Rings: Journeys in Middle Earth (JiME) that I pre-ordered it not long before it was scheduled to be released. The idea at the time was that it would make a great summer painting project, and that my sons and I could get into the campaign in those quiet summer evenings. Turns out, I did not get as much painting done this summer as I hoped to. Instead, almost all my creative energy went into Kaiju Kaboom. It wasn't until today that I finally finished painting the JiME miniatures.
Box Art
Several of the techniques and color palettes that I used were taken from Sorastro's series. For most of the mob characters, I took a new approach to speed them up: I zenithally primed them via airbrush and then used thin paints in single coats, followed by a wash to darken the recesses. This let the zenithal highlights show through the paint. One thing I learned with this approach is that one has to be much more careful with each coat than I normally am with a first coat. When I take more time with each layer, it matters less if the first one or two run into each other, since they will be touched up later. With this approach, though, a little slip of the brush can contaminate one area with the wrong color. There were some parts of the following figures where I added manual highlights and shades, but they were primarily done with single thin coats and washes.

Ruffians
Goblin Scouts

Orc Marauders
Orc Archers
Getting through all these was pretty quick. The results are certainly adequate, though a bit unsaturated and washed-out looking. You can probably tell at a quick glance that I had some fun mixing different flesh tones for both the humans and the goblins & orcs.

I painted the wargs in a similar way, just spending a bit more time on some of the highlights. I also added glowing red eyes. Now that I look at the eyes, I'm a bit ambivalent about it as a visual effect. It seems to be a common design in illustrations of the wargs. I spent a few minutes trying to determine if it was canonical or not, but it's still inconclusive.
Wargs
The wights were mostly one color, and I gave them a little more attention than the orcs and goblins, though still working rather rapidly. These were the only ones in the set where I added some weathering, namely, mud on the cloaks. I did this with drybrushing, although in retrospect, I wonder if I should have just painted it on more boldly. I still struggle with weathering. I love watching painters make figures look like they are part of their environment, but I personally feel lost somewhere between not being good at it and not wanting to risk ruining an otherwise good paint job.
Wights
You cannot actually see the weathering so well from the front, so here's a rear view.
Wight, Back
You cannot see it so much there either—hence my comment about drybrushing perhaps being the wrong path here. That said, I am pretty happy with the weathering on the sword and armor: they look old and corroded.

The last of the villains is the Big Bad of this set: the Hill Troll. I like the way it turned out. I used wet blending to get nice transitions between its two skin tones, and some purple glazes add nice color variation around the face and neck.
Hill Troll, Front

Hill Troll, Back
With the villains out of the way, I moved on to the heroes. I'll show them in the order I painted them, starting with the elves.
Legolas, Front
Legolas, Back
Elena, Front
Elena, Back
Legolas and Elena both use essentially the same palette since I figured they probably have the same tailor. For basically all the heroes, my standard approach was to use basecoat, wash, and highlight for everything but the cloaks. The cloaks were basecoated, then the shadows and highlights painted in using two-brush blending.

Given Tolkien's elven names like Legolas, Glorfindel, Elladan, and Luthien, I wonder who at FFG decided that a great name for an extended-universe elf would be "Elena"? So mysterious! So otherworldly!

Berevor, Front 
Berevor, Back
I started Berevor and Aragorn at the same time, giving them the same base skin tone. I intended to do a similar palette-sharing as with Legolas and Elena. However, as I started working with Berevor, I found I was content to just buckle down and work on this one figure. I think she turned out quite nicely. Although she is not an original Tolkien character, I have a special fondness for her as she is the ranger in Middle Earth Quest, which I finished painting earlier this year. It's interesting to compare the figures to see how Fantasy Flight Games' philosophy regarding miniatures has changed to match popular demand. The MEQ figure is fine for what it is, but the JiME one is much more interesting. It also benefits from being a larger scale, about a head taller.

Aragorn, Front
Aragorn, Back
I tried to make Aragorn look a little more mature, his hair starting to grey. I think I have conceived of Aragorn as progressively older as I have gotten older. It's been over a decade (maybe two) since I've read Lord of the Rings. So many things to read and so little time. Anyway, I was partially inspired by watching Ghool's Quick Tip video on painting graying hair. I didn't follow his approach, but the video did get me thinking about trying to get that aesthetic.

I will mention here that all the heroes above have very similar earthy tones, being almost entirely greens and browns. I kept each one internally consistent so that the same colors are used, and I think this helps them look coherent without being too bland. I dressed up Aragorn a bit by painting the trim on his leather accessories in multiple colors, which may draw too much attention when rangering around Bree-land. I figure that for his sculpture, he wore his dress leathers.

Bilbo, Front
Bilbo, Back
Bilbo's red vest and gold buttons gave a much-needed reprieve from the greens and browns of his companions. I should mention that I still haven't played the game, and I find myself quite curious how their story will get Bilbo Baggins—only three feet tall!—to do any adventuring after his experiences in The Hobbit.

Gimli, Front

Gimli, Back
Dwarves. I swear, there must be some kind of curse upon them, or maybe sculptors just really don't like sculpting them. The card art for Gimli has him wearing a very reasonable chainmail jerkin with leather bracers and belt. The miniature, on the other hand, has... well, I guess it's a leather shirt over chainmail jammies, and a plate mail belt, with massive steel pauldrons and bracers. On his right arm, you can even see that the sculptor had to run the bracer and pauldron together, which just goes to show how unreasonable this would be in combat. I understand that sculpting to scale must be really hard, but look at the size of his hands! It's all very strange. Yet, I have a suspicion one of my sons will be very excited to play an axe-swinging goblin-smiter, so I'll have to get used to seeing him on the table.

For the many metallic parts of this figure, I busted out my P3 Armor Wash, which really is like magic for this kind of figure. The highlights on the hair are just zenithal highlights showing through thinned paint. I thought about darkening his hair, but basically all the other heroes have such dark hair that I wanted someone besides Bilbo to have different colored locks.

I based all the figures in this set using the same technique. I prepared some rocks by cutting pieces off of a cork trivet that I saved from the trash many years ago; these pieces I painted a neutral grey and set on a corner of my painting desk. I started basing with an application of Vallejo Brown Earth basing paste, embedding some rocks into it. I had ordered the basing paste after watching Sorastro's Star Wars Legion painting series. When I first applied it to one of the ruffians, I was shocked and disappointed in how red it appeared. I decided to work with it to see what I could do. After drybrushing the rocks, I mixed a wash using black and sepia inks, and this served to tone it down a bit. Then I applied flock using my usual approach: dab some glue, sprinkle some flock, wait for it to dry, repeat with a different material. After finishing my test model, I wondered what would happen if I just mixed all my flocking material together, so I did.
Mixed Flock
That container holds a mix of black tea, burnt grass fine turf, and medium green fine turf. I sprinkled this on to the other figures over dabs of thinned white glue—except for the wights, which used a less green mixture—and it saved me a bit of time and mess. After varnishing, I then added the green static grass. The results were pretty quick and look fine for a wilderness setting. At least, I think they will; I will find out for sure when we finally get the game to the table.

Here's a picture to help compare the size difference of the Hill Troll and the other characters. The Big Bad here is not as big as some of them I have painted, yet I think he's plenty big to be intimidating.

Aragorn and the Hill Troll

Finally, here are some group shots.

Fellowship of the Fanfic Prequel
A Medley of Villainy
Thanks for reading!

Thursday, September 5, 2019

How I might teach using the Working Software Cycle

I've been out of teaching CS222 for several semesters now, but it still occupies my mind from time to time. This afternoon, I read the latest blog post by Ron Jeffries, which concludes with this:

There is a particular trick to the working software cycle, as I prefer to do it. It goes like this:
  1. Start with working software. Ideally an “End Card”, ideally bug-free, but something that runs. Your existing legacy product will do if that’s all you’ve got.
  2. Select a very simple next feature to do. No, that’s too big, even simpler. There you go.
  3. Write an automated example that will fail until that feature works, and then succeed. If it requires more than one example, go back to #2.
  4. Start programming: Run the example; add some code; run the example; every time the example runs, refine and improve the code a bit. When you’re satisfied with the example and code quality, that bit of feature is ready to ship.
  5. You’re back to working software, with a new feature. Go to 2.

It made me think about the nine-week project in CS222 and its three three-week iterations. Students always struggle with getting into a productive rhythm of software development. Next time I teach the course, I need to think about Jeffries' idea instead. That is, rather than ask them to write up plans for what to have done in three weeks, have them come up with something to have done, say, by tomorrow. Make the loop even tighter so they can both expose their misunderstandings about planning and feel what it's like to be productive. (Of course, they should also start in a bug-free condition, which is also a challenge for them!)

Also, putting this together makes me think I should write more often. We're wrapping up the third week of the semester, and I feel like I've been thrashing. That usually means it's time to schedule reflection and writing time and hold myself accountable to the schedule.

Friday, July 26, 2019

Summer 2019 Course Revisions

My main focus this summer was my first commercial game project, but I have always kept in the back of my head that I had some serious work to do to get ready for Fall semester. In fact, I was feeling a bit stressed about it a few weeks ago and took some time off of Kaiju Kaboom to sketch out plans for my game programming and game design classes. My third class for Fall is scheduled to be CS445/545 Human-Computer Interaction, but it is right on the edge of being underenrolled; if it doesn't make minimum enrollment, the dean's office will cut it and I'll be deployed somewhere else with very little notice. However, because my time to work on course prep is drawing to a close, I devoted the day to sorting out as much of that course as I can.

In today's blog post, I'll give some highlights for my three courses. This will be shorter than in some previous years, so I'm condensing it all into one.

CS315: Game Programming

I'm excited to be teaching game programming again, since this year we actually did get new machines in the lab—machines that are capable of running UE4. Last Fall's course went well, and so I am keeping most of the plan as-is, but trying to keep a pedagogic eye on how I can use in-class examples and workshops to drive some of the lessons home. I expect that we will do three to four mini-projects followed by a larger team project. This year, I have dropped the achievements, since I think the course is already quite full of places where students can make meaningful decisions about what to pursue. I am keeping both the specifications grading and the project reports, both of which served their purposes last year. I have been tempted to set up some kind of team role system or other accountability system for the final project, to prevent the case where one student carries the rest, but I am still unsure how to do this; perhaps I should turn that question around to the students and have them contribute to setting the rules.

Here is the draft course plan. Only the first of the mini-projects is posted, but I expect to re-use my progression from last Fall, where we went essentially from 1D to 2D to 3D across three projects.

CS439: Introduction to Game Design

I do not have an immersive learning project lined up for this academic year. Instead, I am using the year to try to intentionally explore how I can integrate some of the work I've done through immersive learning into formalized Computer Science department offerings. One step in this direction is offering a version of my game design course—which I have taught for several years as an honors colloquium—as a Computer Science elective that anyone can take. The easiest way to do this was with our "seminar" course, although this is not ideal for marketing the course, since it still shows up in the catalog as a 400-level CS elective. Still, I look forward to teaching this and seeing how the audience compares to the honors colloquium.

One way that I have made this "computer science-y" is to require our intro programming course as a prerequisite. This is not because I expect to do much programming, but rather because I want to be able to draw upon metaphors of computational thinking when looking at games system design. This seemed like a good idea at the time, but I have questioned this the more I've worked on it. Indeed, in my random sketching of how I would consider proposing this as a formal service course in my department, I've strongly considered dropping any prerequisite.

As for the course structure, here is the draft course plan. It is based strongly on what I have done in previous years. Even though the examples in the free online text we use are showing their age, I like both the presentation and the price. Once students build a core vocabulary, they can make use of the exhaustive supplemental information that is discoverable online. Without a community partner via immersive learning, the students will be working on projects of their own design without external constraints, which I haven't done in a class like this since roughly 2008. I'm eager to see what they pursue. As before, we'll spend the first half of the semester studying fundamentals and then the second half of the semester building projects. Some of the students who have enrolled are ones with whom I really enjoy working, and so I'm looking forward to spending time with them again too.

CS445/545: Human-Computer Interaction

The last academic year, I taught this course both semesters in a collaboration with the David Owsley Museum of Art. I had a fruitful meeting with their education director several weeks ago as we debriefed the experience. I am glad to say that we are continuing our collaboration, but we are narrowing the focus toward one specific problem: helping visitors navigate the physical museum. This means that my students won't have to do so much problem discovery, but I think that's OK. They really struggled with the idea of finding a legitimate problem vs. inventing a problem and then justifying their work. I think this new focus will help them get into the solution design part of the course, which is really more important for our single, elective course on HCI.

Knowing that this will be a relatively low-enrollment class allows me to treat it as a studio class. We will start with some common readings and structured exercises, but then I would like to move quickly into tackling this navigability problem, using my familiar tactics of just-in-time teaching and reflective practice to have a meaningful learning experience. The draft course plan only lays out activities for the first three weeks or so of class, after which I can work with the students to assess our situation and move forward as needed. It does mean there is kind of a hole around the grading policy of the course, and I hope that this does not cause the students any undue stress. My plan is to work with them to develop a methodology that embeds assessments into it, which I think they will enjoy and learn from.

A word about the sites

Careful readers may have noticed that my course web sites have undergone a visual overhaul. This is related to my learning lit-element, as I wrote about earlier this summer. Whereas my sites were previously based on the polymer starter kit, now I am using the PWA starter kit prerelease. I had to do a bit of finagling to get it to work on our departmental Apache server, but once that was done, I could easily replicate it across the three sites.

Tuesday, July 23, 2019

Kaiju Kaboom!

I am pleased to announce the release of Kaiju Kaboom, my first commercial game project. You can find it on Google Play, exclusively playable on Daydream.

Most summers, I set my own work and creative goals rather than teach classes or work on grant-funded projects. During the Spring semester, I decided that one of my summer goals would be to create a game from scratch and release it commercially. The outcomes of this plan are objectively measurable, and as of last night, I have met them, and I feel good about it.

The game

The game itself is an expansion of the concept that my son and I developed for Global Game Jam. Inspired in part by Terror in Meeple City, the player takes on the role of a kaiju who returns to its island home, only to find it infested with people. Of course, the people are meeples, the buildings are made from wooden blocks, and you are a miniature giant monster, but this both adds to the charm and simplifies the asset development for a novice modeler. The biggest change to the game is moving it to mobile VR via Google Daydream, but I also added randomized levels, meeples that fight back, and four different kaiju powers, three of which are unlocked through the high score system.

I spent some time around finals week building a proof of concept to ensure that the game would be enjoyable and within scope, and having a positive experience with that, I devoted the summer to it. I did not use any formal task tracking tools. Instead, I have a pad of paper on my desk where I would either write down features to explore, sketch geometric or software solutions, or record defects to address. I worked roughly eight to ten hours each day, sometimes up to twelve, and about half days on Saturday, with a few gaps for family trips. I showed an early build to some friends and family around Memorial Day, and this confirmed to me that the core gameplay was really enjoyable.

The tech

Kaiju Kaboom was built using Unreal Engine 4. I started the project in Blueprint and initially just added C++ for the parts that were faster to write in textual code than they were to prototype in Blueprint. In retrospect, there are parts of the core game loop that I left in Blueprint which I probably should have done in C++, because the Blueprint can get hard to follow. I don't think there's any real performance hit from using Blueprint here; the real problem is that Blueprint makes it harder to control modularity, so it's hard to intuit the dependencies after walking away from a subsystem for a while.

Getting up and running on Daydream was not much trouble, and the documentation is good. However, the Google VR branch of UE4 is clearly no longer being maintained. This led to a problem as I was getting ready to release, in which the game crashed on a friend's device. By scouring the web, reading semi-related threads, and trying various approaches, I was able to finagle a build system that worked for both of us. I continued to do my regular feature development in UE4.22.3. For releases, however, I downloaded and built UE4.22 from source, replacing its GoogleVR libraries with those from the 4.20 release—the most recent—of the googlevr-unreal fork. My friend also had some troubles that I thought might be due to texture formats, so the build includes both ASTC and ETC2 formats, even though only the former is needed for my own Pixel 2.

The null business plan

At this point, the thoughtful reader may ask, "Why did you target Google Daydream?" The simple answer is that I have one and it seemed like fun, both as a technical challenge and through that proof-of-concept I built in early May. Note that mine was a commercial project not because I wanted to make money but because I wanted to see what was involved in making it a commercial project. I know literally one other person who has a Google Daydream setup: my friend Chris, who was my lone tester. If I can sell ten copies, I'll buy some nice beer. I'm in the blessed and enviable position where I can invest a summer in learning and improving my skills without it having to be economically profitable. As I alluded to before, another benefit of Daydream is that because it's low-powered for VR, I could get away with my own limited asset-creation skills, whereas I know I could not make something by myself in two months that would be competitive for, say, the Vive.

I did borrow an Oculus Go from a friend, in hopes that it would be easy to release on both platforms. Unfortunately, the logic of my character was tightly coupled with the GoogleVR components, and I didn't see a clean way to separate these. I would consider buying one of these $200 headsets and doing a port if I had any belief that my little hobby project could make at least that much back plus porting time, but right now, I don't think I can put that much more effort into it, since I need to transition into "preparing for Fall semester" mode.

The last two or three days, as I've geared up for the public release, I have thought that perhaps I should have planned on doing a more modern release, where the first part of the game is free, and people who like it can pay to unlock the rest. One reason for doing this is so that I could explore both the libraries and the platform support for DLC and licensing, which would be interesting. I decided to stick with my original plan, augmenting it slightly by releasing the game under the GPL. This means that if someone buys the game, they get access to the source code as well, and they are free to learn from it just as I did. I don't know if that will have any real impact, but it feels like the right thing to do, given my situation.

The learning

Originally, I had it in my head that I would try to make all the assets of the game from scratch in true artisan fashion. Along the way, I did use a few public domain sound effects, and as my available time quickly ran out, I added CC-BY music from indie- and jam-favorite Kevin MacLeod. I have written two other posts to detail some specifics about things I've learned through this project: importing Blender animations into UE4 and using UE4 LiveCoding for rapid unit testing. I did not keep a rigorous log of other things I've learned, and some things I had to learn twice because they did not stick in my head. Here, however, I am going to try a brain dump, in part so that I can come back here later when I need to re-learn something again.

Blender

  • UV Unwrapping
    • I found this video in particular to explain well how to unwrap and texture paint, a technique I used to mark out regions of a texture for manual painting in Gimp.
  • Normal maps
    • The technique involves building high-poly and low-poly versions of a model in the same file, exporting the low-poly one for UE4 (by selecting it and exporting only the selection), then baking the normal map from the high-poly version and applying it to the low-poly mesh. I used this on an early version of the blimp.
  • Smooth shading instead of normal maps
    • An easier way to do what I wanted with normal maps was to simply select Smooth Shading from the Tools menu in Blender, and to make sure this was exported to FBX.
  • Boolean modifiers
    • These modifiers enable constructive solid geometry; for example, the union of a plane and a sphere was used to generate the blimp's fins.
  • Keyframes in video editing
    • Just as explained in this video, I had been doing gamma cross fade effects, but now can do more robust effects by keyframing. Right-click on a property, insert keyframe. I'm still not sure how to see all the keyframes and move them around, but this was good enough for me to compose the scaling and fading effects of the trailer.

Gimp

  • Filters→Render contains all kinds of neat stuff to play with visual effects. I used the Lava and Cell Noise filters to generate quick but effective textures for the fireball and acid spit spheres.
  • Multiple monochrome images can be packed into the RGBA channels of a single image using the Colors→Components→Compose feature. Make each image into a separate layer, and then this tool allows you to directly compose the layers into the RGBA channels.
  • Two ways to add shadows behind text:
    • A "soft" shadow can be had by copying a text layer, darkening it, and then blurring it. I did this at some point in the past, though I cannot remember where.
    • A harder shadow is possible by using Filters→LightAndShadow→DropShadow.

UE4

  • AI vs Physics
    • An actor cannot be driven both by a behavior tree and by physics simulation. Of course not, he says in retrospect, that would make no sense.
    • There are actually two meeple actor types in the game: AIMeeple and PhysicsMeeple. AIMeeple is controlled by one of two behavior trees, depending on whether they are armed or not. When they are hit, they convert in-place to a PhysicsMeeple.
  • AnimGraph
    • I had done a little with AnimGraphs for an unreleased tech demo last Fall, but very little of that stuck in my head, in part because I may very well have been doing things wrong. For Kaiju Kaboom, AnimGraphs are used in a more conventional way, driving the animation states of both AI and Physics meeples.
  • Landscape Materials
    • The landscape has sand and grass layers, and it fades toward white with higher altitude.
  • Cascade Particles
    • I was hoping to learn Niagara, but it is not supported on Daydream. Still, I was able to learn enough of Cascade to make my own simple particle effects, such as the fireball explosions and acid spit splash.
  • Following a Spline
    • It's a fundamental technique in level building, but I haven't really done much level building before. The blimp's flight path is specified by a closed spline.
  • Efficient Materials
    • I learned some more about how to balance GPU and CPU processing. For example, my original implementation of the blimp's propellers involved using a rotating movement component for each. After watching some videos (possibly one of Tharle VFX's shader maths videos), I moved this to be computed by a shader using world position offsets.
    • Several of my materials use customized UVs so that they are computed per vertex instead of per pixel.
    • I have a few "master materials" and use a lot of material instance constants. I understand this to be more efficient, although I didn't go so far as to measure the difference.
  • Visualizing Shader Complexity
    • As mentioned above, I haven't really done a lot of level design in UE4, so I've never really needed to inspect my scenes for performance problems. On this project, I tinkered with the different editor view modes, such as shader complexity, to find areas that were not performing as I expected. It turns out everything was performing as expected, but it was still neat to see.
  • Unit tests
    • I knew it was possible to write unit tests for C++ code in UE4, but I had never done it before. TDD seemed like the best way to approach my high score table: unlike experimental gameplay code, it had very well-defined rules. Here is the source code for the unit test. The testing library assumes all the tests are in one method, and so I have used macros to give a fluent layer on top of that assumption. This is not as robust as something like chai, but it's much easier to read than the alternative.
    • I did not have a continuous integration system, so I had to run the tests by hand, which meant I rarely ran them. If I were working with a team on a larger-scale project, I would definitely invest in CI.
  • Functional tests
    • As with unit tests, I have known for some time that it was possible to do automated testing through the UE4 session frontend, but I never took the time to really figure it out. In this project, I created a test level containing four automation tests arranged in a 2×2 grid to ensure that falling columns and floors damage AI meeples and physics meeples.
    • In my original approach, I was spawning the relevant actors in the PrepareTest event of the Blueprint. Later I realized that it was much more effective for me to create these actors as child actors within the construction script, since this allowed me to see and manipulate them in the level. 
    • The result is fun to watch, but the approach was somewhat hamstrung by the fact that I was not using continuous integration, as mentioned above.

Wrapping Up

It seems like there's no end of good ideas that I did not have time to put into the game, ranging from minor quality-of-life improvements to major new features. The spirit of this project is inspired by Jeff Vogel from Spiderweb Software, who taught me this wonderful expression: It's better than good—it's good enough! I'm pleased to have this project more-or-less wrapped up for the summer. I have started doing some course prep, but I have more of that and other university-related business to take care of this summer, as well as family obligations. 

I've also done much of this work with an injured middle finger on my right hand, which, combined with the intensity of this project, has left me well behind on my summer painting goals. Now that Kaiju Kaboom is out in the wild, it may be time to clean up the office and get the paints out.

Thanks for reading! If you know of someone with a Daydream who you think might enjoy throwing boulders at meeples, please let them know about the game. That is, pretty much, my marketing plan.

Thursday, July 4, 2019

UE4 Live Coding for Unit Testing

TL;DR: Use the new Live Coding feature in UE4.22 as a workaround for the fact that automation unit tests cannot be Hot Reloaded.

My goal for the day was to write a high score system for my summer project, which I am creating in Unreal Engine 4.22. As I started thinking about the requirements, I realized that TDD would be a good approach. Of course, I've spent more hours diving into how to make this work than I would have spent on a brute force approach, but I'm hoping I'll recoup the investment in the future.

The Automation Technical Guide provided enough scaffolding for me to write a single unit test. I then took some time to pull this out into its own module. This seemed like a good idea, although developing with multiple modules was also new to me. Orfeas Eleftheriou's blog post was instrumental in pulling my unit tests into their own module.

I remembered reading months ago that unit tests in UE4 are not Hot Reloaded, and this is confirmed as expected behavior in UE-25350. The frustrating fallout of this is that the editor has to be reloaded in order to see changes made in the C++ test implementation. This slow and tedious process is anathema to good TDD rhythm. I came across Vhite Rabbit's workaround, but I could not get it to jibe with the modular decomposition I gleaned from Eleftheriou.

Then I remembered that 4.22 shipped with an experimental new Live Coding feature, which promises to allow changes from C++ to be brought into a running game session, as long as they are not structural changes. I wondered, if this works for running games, would it work for unit tests?

The happy answer here is Yes. I had to restart the editor because Live Coding cannot be used after any Hot Reloading. The editor opens with a separate window that seems to be managing the Live Coding feature. I went into my unit test and turned its simple "return false" into "return true", hit the magic Ctrl-Alt-F11 combo, waited just a few seconds for the Live Coding system to run, and then re-ran my test. Sure enough, now the test passes.

I have essentially no progress on a high score feature after the morning's work, but hopefully by documenting my findings here, I can help others move forward in their unit testing adventures.