Saturday, June 25, 2011
I have mentioned this Summer's digital archaeology simulation project in other posts. We started the project last Monday, and the team has set up a blog where we are sharing some reflections. Check it out at http://digitalarchaeologyproject.blogspot.com/.
The posts there are being used to foster reflective practice and metacognition. They are relatively unstructured, the students having no real guidelines or requirements for posting other than using it to share their thoughts and feelings. There have been some interesting bits shared there so far, and we've asked each of our eight undergraduates to post there about once a week.
Sunday, June 19, 2011
Relying on technology
From USA Today:
A five-hour computer outage that virtually shut down United Airlines Friday night and early Saturday is a stark reminder of how dependent airlines have become on technology.
Airplanes, for example.
Thursday, June 16, 2011
A Ruby DSL for creating entity system components in Java
Have you ever been working with entity system architectures in Java and wished for a quicker way to prototype components? Well, you've come to the right place.
After working with entity systems for about a year, I have an idiosyncratic way of defining my components. Keep in mind that components are just dumb data objects, not proper objects in the OO sense, and I'm roughly following Adam Martin's approach. Here's the way I would define a position component for two-dimensional floating-point space:
public class Position {
private float x;
private float y;
public float x() { return x; }
public float y() { return y; }
public Position x(float x) { this.x=x; return this; }
public Position y(float y) { this.y=y; return this; }
}
I've skipped documentation for the sake of brevity. I like to keep my data private, even though it's exposed through accessors and mutators—I just can't bring myself to make public instance fields in Java. The naming convention for accessors and mutators is a Smalltalk approach, using the same name for both accessor and mutator, which can lean designs towards fluent API design instead of pushbutton (see Fowler's DSL book for more on this). The mutator returns this to promote fluent API design as well, to permit code like the following.
Position p = new Position().x(10).y(15);
Now if I'm feeling really spiffy, I'll use a builder:
Position p = new Position.Builder().x(10).y(15).build();
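(A minimal sketch of such a nested Builder, which is my illustration rather than code from the project, could be as simple as the following, reusing the fluent mutators from above.)
public static class Builder {
    private final Position position = new Position();
    // Delegate to the fluent mutators already defined on Position.
    public Builder x(float x) { position.x(x); return this; }
    public Builder y(float y) { position.y(y); return this; }
    public Position build() { return position; }
}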
but that's not my point.
My point is that this is almost all boilerplate code. All I really want to say is that there's a component called Position and it has two floats, x and y. So maybe what I really want to be able to write is something like this:
Component.build :Position do
float :x
float :y
end
The title of the post gives it away, but this is a little Ruby DSL that I hacked together this afternoon, just to see if I could. I've been meaning to learn Ruby for some time. I have read quite a bit about Ruby and its use in DSLs, but never actually created one. This time around, I used Jonathan Magen's tutorial as a reference.
The Component module is the starting point, and it uses what I take to be a fairly standard approach, sending blocks of code from the calling context to be executed within the object context:
module Component
def self.build(name, &block)
builder = ComponentBuilder.new(name)
builder.instance_eval(&block)
builder.build()
return builder
end
end
The one custom piece is the build method on the ComponentBuilder. What this does (perhaps not clearly indicated by the name, in retrospect) is tell the constructed object to dump out a Java source file based on the content of the block. The ComponentBuilder looks like this:
class ComponentBuilder
def initialize(name)
@name = name
@floats = []
end
def float(x)
@floats << x
end
def build
File.open(@name.to_s + ".java", "w") do |theFile|
theFile.syswrite("public class #{@name.to_s} implements Component {\n")
unless @floats.nil?
@floats.each do |var|
theFile.syswrite <<BLOCK
private float #{var};
public float #{var}() { return #{var}; }
public #{@name.to_s} #{var}(float #{var}) {
this.#{var} = #{var};
return this;
}
BLOCK
end
end
theFile.syswrite("}")
end
end
end
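To make that concrete: tracing through build, the Position block from earlier in the post should produce a Position.java roughly like this (reconstructed by hand, so treat the whitespace as approximate):
public class Position implements Component {
  private float x;
  public float x() { return x; }
  public Position x(float x) {
    this.x = x;
    return this;
  }
  private float y;
  public float y() { return y; }
  public Position y(float y) {
    this.y = y;
    return this;
  }
}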
Here's a very quick walkthrough for those readers who do not know Ruby. The initialize method is the constructor, and it's setting the instance variable @name to the argument's value and initializing an empty array for @floats. The next method is called float, which is not a reserved word in Ruby, and it appends its argument to the @floats array. The build method opens the appropriately-named file, writes the class declaration, and then iterates through the @floats field, dumping out the field, accessor, and mutator definition for each one.
There were two specific pieces of Ruby syntax that I had to learn to make this work: here documents and expression substitution in strings. The oddly named "here documents" are multiline string literals. In my program, they run from <<BLOCK to the recurrence of BLOCK. This is a really useful feature for a program that is essentially filling templates and dumping them out. C# has something similar, but Java—my usual production language—does not. Note that you can do it with Groovy, though it appears the proposal to put this in 1.7 hasn't been approved. Regardless, expression substitution in strings is simple and elegant: within a string literal, put an expression within #{...}, and it is evaluated and interpolated into the string at runtime.
In order to make my program a bit more robust, I added configuration information to control the Java package and path to the destination as well as per-class and per-field embedded Javadoc, passed as optional parameters (via variable-length argument lists). The biggest problem I had with this was creating directories automatically from relative paths, but in the end, I pulled it off with a little custom method shown below.
def defensive_makedir(dir)
array = dir.split('/')
accum = '.'
array.each do |partial|
accum = accum + "/" + partial
unless File.directory? accum
Dir.mkdir accum
end
end
end
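I won't reproduce all of that configuration handling here, but to give a flavor of it, a call with those optional parameters might look something like this (the parameter names are illustrative guesses rather than the exact ones I used):
Component.build :Position,
  :package => 'edu.example.game.components',
  :doc => 'A position in two-dimensional floating-point space.' do
  float :x, :doc => 'Horizontal coordinate'
  float :y, :doc => 'Vertical coordinate'
end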
To integrate this into Eclipse, I wrote a simple Ant build script. Honestly, I wrote this very early in the experiment, but the process of debugging Ruby through Eclipse and Ant was quite cumbersome, and once I hopped over to a console and emacs, I was able to iterate much faster. To incorporate this into the build process in Eclipse, hop over to your project preferences and check out the "Builders" section; note that this requires the JRuby jar to be on the build path.
<project name="FunWithComponents" default="make_components">
<property name="lib" location="${basedir}/vendor/lib"/>
<property name="jruby.jar" location="${lib}/jruby.jar"/>
<property name="ruby.src" location="${basedir}/ruby"/>
<property name="generated.src" location="${basedir}/gen"/>
<target name="make_components">
<java jar="${jruby.jar}" fork="true" dir="${ruby.src}">
<arg value="${ruby.src}/make_components.rb"/>
</java>
</target>
</project>
I enjoyed working on this and learned a bit about Ruby, both in terms of the language and how to express myself within it. I acknowledge that because I do so much work in Java, it can be hard to switch paradigms or idioms, and so it's good to actually make something useful in another language from time to time. However, it also makes me wonder if all that trouble is actually worth it when I can immediately imagine how to support a Java-based solution like the following.
public static void main(String[] args) {
ComponentFactory.instance().build("Position")//
.withFloat("x")//
.withFloat("y")//
.build();
}
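Filling in that sketch is not hard, of course; a minimal, hypothetical ComponentFactory along those lines (my illustration here, mirroring what the Ruby builder emits, not code from the project) might be:
import java.io.FileWriter;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

public class ComponentFactory {

    private static final ComponentFactory INSTANCE = new ComponentFactory();

    public static ComponentFactory instance() { return INSTANCE; }

    public Builder build(String componentName) { return new Builder(componentName); }

    public static class Builder {
        private final String name;
        private final List<String> floats = new ArrayList<String>();

        private Builder(String name) { this.name = name; }

        public Builder withFloat(String field) {
            floats.add(field);
            return this;
        }

        // Dump a Java source file, mirroring what the Ruby ComponentBuilder writes.
        public void build() {
            StringBuilder src = new StringBuilder();
            src.append("public class ").append(name).append(" implements Component {\n");
            for (String f : floats) {
                src.append("  private float ").append(f).append(";\n");
                src.append("  public float ").append(f).append("() { return ").append(f).append("; }\n");
                src.append("  public ").append(name).append(" ").append(f).append("(float ").append(f)
                   .append(") { this.").append(f).append(" = ").append(f).append("; return this; }\n");
            }
            src.append("}\n");
            try {
                FileWriter out = new FileWriter(name + ".java");
                out.write(src.toString());
                out.close();
            } catch (IOException e) {
                throw new RuntimeException(e);
            }
        }
    }
}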
Monday, June 13, 2011
Morgan's Raid Postmortem
Morgan's Raid enjoyed its public release at the Conner Prairie Educator Event last Thursday. With that event, the project is essentially "done," modulo the inevitable maintenance. For the past few weeks I've been sketching out notes for a postmortem in the style of Game Developer magazine, and now I'd like to share these with you. In truth, I vary a bit from their style, looking more at structural elements of the team than technological decisions, but this is natural given the dual nature of the project, which was designed to create something significant while also meeting learning objectives for the developers. (As I think about this, it would be interesting to write one in their style too, but that's a task for another day.)
What went well
Scrum
The application of Scrum was excellent, especially as we refined our process in the Spring. The students and I appreciated the public nature of the task board, and they healthily struggled with what it meant for a team (not an individual) to commit to a task. The impact of the burndown charts was evident in how students discussed their work and our progress, and the sprint renegotiation process was relatively smooth. The sprint reviews and retrospectives gave the team a chance to analyze and discuss the product and the processes, and their empowerment helped fuel their continued motivation.
Our modifications to Scrum were relatively minor. Because the team only met three times per week, we did not have daily stand-ups, but our "once-per-workday" stand-ups were sufficient. One aspect that we completely dropped was the product burndown chart. With our already low number of collaborative, collocated working hours, I decided against adopting formal planning poker sessions. In the Fall, I made up story points based on my experience as a developer and a teacher, and in the Spring, we skipped these entirely. I do not doubt that they would have been an academically interesting feature to keep, but aside from academic intrigue, I think I made the right call to slim Scrum down a bit.
Due to scheduling issues we could not control, our artist could not meet with the developers in the Fall. As a result, his work was rather undirected and his contributions relatively few. In the Spring, he was able to meet with the developers, creating a real cross-functional team. The results were amazing, and I need to be sure to promote cross-functional teams in future experiences. This is not directly related, but I'm sure that our separation of "designers" and "developers" in the Scourge of Orion project was a major contributor to its lack of productivity.
Playtesting
Our external playtesting experiences were fundamental to the success of the game, and I express again my gratitude to those teachers and schools who shared their time with us. I know that I learned a lot by seeing how the end users interacted with our various prototypes, but more importantly, I could see how it helped the developers build empathy for the end users.
It was especially useful to playtest with different demographics. "Indiana fourth grade students" encompasses a wide variety of skills and dispositions, and even in our relatively limited playtesting, we were able to witness this range. I suspect that this perspective not only helped the team to design the game but also to develop empathy for the challenges of the elementary school system.
Every child who played Morgan's Raid gave us some feedback, and often they had feature requests. Some were clearly outside of our scope, such as adding multiplayer support and leaderboards. Many others were based on the students' experiences with school and educational software. At some point later, I plan to write a more detailed piece on why there are no quizzes in Morgan's Raid, despite this suggestion being proffered by many playtesters. What was more salient to this post is that some of the team members jumped on these recommendations, an unspoken assumption being that we should provide what the intended end users wanted. There's a deeper anthropological lesson here that I tried to explain to the team, that while the users are always right, they also aren't experts. This brings us back to the design issue mentioned earlier, that experience design is difficult, and it's important that someone have the power to veto.
Development methodology: Entity system architecture and Mercurial
I have written before about the entity system architecture, and I list it here as one of the great successes of the project. Adopting this architecture allowed us to separate tasks rather seamlessly, certainly better than I had experienced before in working with teams of undergraduates.
The fact that this was a team of amateurs is significant. The ideals of proper object-orientation can and should lead to modular, robust, scalable software systems. However, because the students lack experience, perspective, and technical understanding, it is insufficient to give them a software design task and expect production-quality results. Even with the entity system architecture, there are parts of the code that are ugly to behold, unnecessarily coupled across disparate systems. That said, the fundamental idea of the architecture—decoupling behaviors into components, which are acted upon by systems, and combining components into entities—was simple enough that it could be explained with a few examples and then applied by the team.
Maybe I'm cheating by putting two in one here, since the version control system is orthogonal to the software architecture, but I'm going to mention it here anyway. This was my first time using a distributed version control system, and Mercurial more than proved itself. The students were able to learn it quickly and became quite savvy with the workflow. It was a little distressing that even at the end of development, some team members were mad at Mercurial for making them merge, where I would have liked them to adopt the more enlightened perspective that merging is a necessary part of teamwork, and Mercurial made it infinitely easier than doing it by hand.
Spring Space
The use of a dedicated space in Spring was fantastic. This is not easy to come by in a university setting, and on behalf of the entire team and our stakeholders, I am grateful to my department for granting us the use of the room. Everything one would expect to improve was improved. While the Google Docs approach to managing the Scrum artifacts was reasonably effective when working with the 25-member team in Fall, it pales in comparison to the power of three giant whiteboards and piles of sticky notes in the Spring.
The incredible impact of dedicated space has made me an evangelist for more responsible use of university resources. It got to the point where, on the Future of Education Task Force, people would just put "space" down for me in any list of things we need to change. Space is a deeply political issue on any college campus, but one important lesson I learned is that I should pay more attention to it during the planning stage of any future activity. If I can find any partner who is willing to pony up space as part of a grant proposal, for example, I'll jump on it.
Team formation
It was great to see how the team gelled, especially the smaller group in the Spring. They really were a functional independent game development team in every way that matters. There was some work that I did to try to foster this sense of team, such as adopting agile methodologies, bringing goodies for Sprint Review and Retrospective meetings, and hosting board game sessions. These definitely helped, but I also could tell that the students were hungry to make something important, to do something that mattered beyond a credited learning experience. Even if we all had our own idiosyncratic views of what Morgan's Raid should be, we shared this vision of being able to work together to make something significant that none could do on his or her own.
The only distressing part about this success is that the team is now gone. Usually, at the end of fifteen weeks, I am ready to push my students out to face their next challenges. With this team, a part of me wishes I could keep them here so that we can continue to face challenges together. I built an emotional bond with these undergraduates. They were my teammates and coworkers, my partners in a difficult endeavor, my colleagues, and my friends, and I will miss them.
What didn't go well
Code reviews were not in our methodology
We did not have formal code reviews, and in retrospect, our code quality suffered because of it. It is easy at the time to ignore reviews in favor of "productivity," but I think this was a classic managerial blunder. I am not certain what methodology we could have used in the Fall to promote this, but we certainly could have inserted some more formal code reviews in the Spring. This would have directly helped us meet some learning objectives while improving the code base, since as I and others have said before, software development is a learning process.
Because of the generally poor code quality from Fall, I made sure to introduce a formal verification step into the task approval process. When I had the time and energy, this meant that I would not just check that the task was working, but that the underlying implementation was reasonable; however, working under our significant time pressure, I often did not have the time and energy.
In truth, I do not have much experience with code reviews. Sure, I've graded countless assignments, but these have primarily been summative assessments, not formative assessments. This is not an excuse but rather an opportunity for me to include code reviews into a future experience, so that my students and I can learn together how best to use them in our environment.
No technical specification
I invested a significant amount of time in Summer 2010 working the results of the Spring design colloquium into a functional design specification. This took the form of a wiki on Google Sites, and it was shared with all the team members and product owners. This was a colossal effort, since the output from Spring was nowhere near the quality we expected, and so really I was redesigning the whole game from disparate prototypes—and even then, what I wrote up was not that good and ended up being significantly changed before shipping. This does not mean it was wasted effort: it was still a necessary step along the way. However, because this took all of my spare cycles in Summer 2010, I was out of time when it came to the technical specification. I wrote up a few notes about methodology (inspired by Cockburn's excellent book that I read the same Summer), but for the most part, the software architecture was only in my head.
I suspect that the lack of a technical specification was a major factor in the relatively low productivity of the Fall semester. I believe that the productivity of Spring would not have been possible without the learning of Fall, but at the same time, and as pointed out by a team member the other day, much of the Fall was undirected learning. That is, the students were learning, but they didn't always know where they were going with this. Being able to rely on, for example, a software architecture specification, could have helped them see how their little piece was actually going to make a big difference.
Undergraduates, by and large, do not have the perspective necessary to design large software systems. After all, having wide perspectives and the ability to integrate across disciplines is an outcome of higher education, not an input. I'm sure a technical specification would not have improved Fall's productivity by an order of magnitude, but it would have helped a little. Most importantly, it would have helped the students see, from early in the process, the value of documentation. About two-thirds of the way through the Fall semester, we did a massive architectural overhaul, and to get this done, I turned to the de facto standard, UML. Once the students understood how this language was part of my bigger process, they could see the value in it and showed interest in learning it. I suspect this transformative experience could have been brought about earlier through a more formal technical specification.
IP
Dealing with the university's intellectual property policies has been an exercise in frustration. I suspect that my institution is like a lot of others in higher education, where tight budgets are causing the administration to jump on the commercialization bandwagon, and yet the staff has little or no experience dealing with software innovations. As a case in point, I have yet to meet anyone at the university who seems to understand software licensing, and all of the forms and procedures are aligned with physical invention.
In the university's defense, I put off dealing with IP until we were 90% done with the project. I just didn't want to deal with it, because I knew it would be a quagmire. Sure enough, it was, and still is, since some of our issues are still not sorted out: over a month ago, I was told that I would receive official written documentation on some issues, and it still has not arrived. I suppose I should have started the internal processes earlier.
On the positive side, all of the programmers have agreed to license their code under an OSI-approved license, and so I hope to make a source release of the game and announce official licensing policy by the end of the month.
The takeaway on the IP front is that, in future projects, I should deal with it earlier, ideally during the proposal process. This would leave a traceable paper trail and an explicit acknowledgement by the funding agency of the intended licensing strategy, and we would only have to go through the current headaches if things changed.
Undergraduate game designers
I hesitate to mention this here since several of my undergraduates may read this, but understand that I write this with respect and affection.
We put too much faith in undergraduate designers. Game design is hard, and I posit that legitimate educational game design is even harder. I think what we asked of our Spring design colloquium class may have been impossible for students at that level of experience. Not only were they lacking game design knowledge specifically, they had essentially no prior experience with design thinking at all.
I recently took the time to reflect on the game designs I have come up with during my lifetime. I've always been a hobbyist, and specifically, I remember writing up rules for a tabletop RPG that got some playtesting when I was an undergraduate, and I sketched out at least one CCG rulebook that was circulated among my friends in grad school. When I look back at these, I made classic amateur mistakes, prioritizing the wrong elements and trying to do big design up front rather than incremental and iterative design. That is, I knew nothing formally of design thinking. As I reflect on what my students designed, in the design colloquium and during the evolutionary design of the development process, I see the same kinds of amateur mistakes.
For those who were on the team for more than one semester, I'm sure that they saw how I made changes to cope with these observations. The design colloquium was very open-ended, but in the Fall and Spring, I positioned myself as the design authority. While anyone could pitch any idea to me—and indeed the great design successes of the game are primarily student-generated—I was the gatekeeper, and there were many directions that I forced to be shut down. As benevolent dictator, I needed to ensure that energy was being spent fruitfully, even if it was perceived as less glamorous by the team.
Trying again to look forward, I think this will have to be the perspective I take in future endeavors. We were probably not critical enough in the Spring, and there wasn't enough time in a three-credit course for the students to develop healthy levels of design self-criticism.
Attention management
In the Spring, we had an excellent space in RB368, and I am sure that the success of the project rested on having access to the space. That being said, I'm not sure that I managed it correctly. As I have read more about agile software development spaces, I realize that we did not have a good management system in place for dealing with distraction. Our shared workspace was regularly interrupted by both related and unrelated events. I know there were several times when I was in the zone and then got broken out of it by well-intended interruptions.
For many students (at least it is said), working in a high-distraction environment is their modus operandi. The sad result is that they have no idea how much productivity they are losing, since they are not intentionally managing their attention. This would have been a good learning opportunity for them to see what being productive really felt like. Next time around, such as next Spring's VBC seminar, I will be more explicit about working conditions. I am confident that this can be done without negatively impacting the social bonds of the team, and in fact, I suspect it will lead to an increase in mutual respect.
Sunday, June 12, 2011
State Design Pattern in Unity with C#
Following up on my initial investigation into game engines, I decided that the best way to get to know Unity would be to build something with it. A few years ago, I wrote a clone of Every Extend called EEClone, which is described in this JERIC paper (ACM DL). I figured a new spin on EEClone would give me a good point of comparison, so I set out to recreate it in Unity.
As I mentioned earlier, one of the strengths of Unity is that it supports a variety of scripting languages, including C#. This intrigued me the more I thought about it, since it meant that it should be straightforward for me to apply my knowledge of Java design patterns in Unity with just a few syntactic changes. Specifically, I decided to dive into the state design pattern, which time and again I have found to be useful in practice. Just as with Java, a quick search of the Web for advice on how to implement state machines in Unity comes up with the classic procedural approach. The costs and benefits of the OO and procedural approaches are documented in the aforementioned paper; for this post, I will focus on the mechanics rather than justification.
Without further ado, this is what the game looks like. The video shows two player-initiated explosions followed by one obstacle collision.
The goal was to explore how the pattern is reified within Unity and C#, not to faithfully recreate EEClone, and so I went with a slightly simpler state diagram than in the Java version:
The astute reader will surely notice where I have cut corners. For example, PlayingState transitions directly to AwaitingRespawnState after the player sets off the bomb, without waiting for the explosion chain to terminate. Clever handling of explosion chains and scoring was beyond the scope of this example.
The state diagram can be converted into an implementation following the Gang of Four approach, yielding this structure:
Within Unity, this entire structure (modulo the external MonoBehavior) is contained in one source file, StatefulPlayer.cs. The class diagram reveals the two defining characteristics of the state design pattern:
- Each state is represented as an object. In this case, each is defined as a unique class, with an abstract class holding common behaviors and implementations, the result of refactoring.
- States are responsible for their own transitions.
To illustrate these points, consider AwaitingRespawnState, the point of which is simply to wait for three seconds to elapse before going back to SpawningState:
class AwaitingRespawnState : AbstractState {
public AwaitingRespawnState(StatefulPlayer p) : base(p) {}
private float elapsedTime;
public override void Update() {
elapsedTime += Time.deltaTime;
if (elapsedTime > 3) {
player.transform.renderer.enabled = true;
player.EnterState(new SpawningState(player));
}
}
public override void OnEnter() {
elapsedTime=0;
}
}
The constructor passes its argument to the superclass' constructor, where that reference is kept in a protected field, player. The Update method overrides the default empty implementation from AbstractState and is responsible for keeping track of elapsed time. You can clearly see that when this accumulated time exceeds three seconds, we go back to a new SpawningState.
There is one point where I had to "cheat" a bit, and that is with animation termination. The transition from SpawningState to PlayingState should be triggered by the end of the spawn-in animation. However, I could not find a way in Unity for a general-purpose end-of-animation observer. I could have set a timer and hardcoded it to go off at the same time as the animation, but this would clearly be fragile to change. My solution was to manually insert an event at the end of the animation, and to have this event call the AnimationFinished method of the player—indeed, that is its raison d'être. This method is also used at the end of the death animation, as shown in the state diagram above.
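To tie those pieces together: I haven't shown AbstractState or the delegation plumbing inside StatefulPlayer above, so here is a rough sketch of how they could fit, reconstructed from the description rather than copied from the real StatefulPlayer.cs:
using UnityEngine;

abstract class AbstractState {
    protected StatefulPlayer player;
    protected AbstractState(StatefulPlayer p) { player = p; }
    // Default empty implementations; concrete states override only what they need.
    public virtual void Update() {}
    public virtual void OnEnter() {}
    public virtual void AnimationFinished() {}
}

class StatefulPlayer : MonoBehaviour {
    private AbstractState state;

    void Start() { EnterState(new SpawningState(this)); }

    // Unity calls this every frame; the player just forwards to the current state.
    void Update() { state.Update(); }

    public void EnterState(AbstractState next) {
        state = next;
        state.OnEnter();
    }

    // Called by the events placed at the end of the spawn and death animations.
    public void AnimationFinished() { state.AnimationFinished(); }
}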
It is laudable that Unity promotes rapid prototyping with its asset-centric development model, which allows you to go in and fiddle with values with very few restrictions. Yet, I find it frustrating to have to deal with literals rather than expressions: without such abstraction, one is left thirsty for more refactoring. My approach for handling animation termination is effective, but it does mean that the entire state model has a dependency on two animations, a dependency that cannot be determined by static analysis of the C# code itself. This is an architectural weakness, but it is the cost one pays for using tools like Unity that prevent one from having to hand-code each and every animation.
Saturday, June 11, 2011
Books that influenced my teaching practice
I've been thinking these last two days about the books I have read in my six years as a professor, and how these books have affected my practice. When I started at Ball State University, I was like a lot of freshly-minted Ph.D.s: I had spent the last several years focused on research with just a little bit of teaching experience, and I had never really been mentored in effective teaching practices, not beyond some TA training just before starting grad school.
Hindsight is imperfect, but if a new CS faculty member were to ask me today what to read in order to develop a better understanding of teaching, here's what I would say. They're listed in the order I read them.
Holub on Patterns. This is one of my favorite books on patterns. It takes the Gang of Four patterns and presents them in the context of two case studies, and these come after two of the best chapters on the philosophy of object-oriented design that I have ever read. The fact that the patterns are shown in collaboration is significant: one of my lessons from grad school was that it was very hard to teach or learn the patterns in isolation, because that's not how they emerge in practice. While his book is not explicitly about teaching, the idea of holistic education arises every time I design a learning experience. I think I read it at a very influential time as well: as I was finishing my doctorate and thinking about what kind of professor I wanted to be.
Scholarship Reconsidered and Scholarship Assessed. I read the Boyer and Glassick et al. books when serving as chair of my department's Promotion and Tenure Committee. These should be required reading for anyone interested in becoming a professor, and university administration should be required to re-read them every three years. Boyer famously describes a taxonomy for scholarship, but I think it's Glassick's book that forced me to think about the extent to which my teaching was scholarly activity.
The Pragmatic Programmer. I wish I could say, "'Nuff said" and be done with it, but that is insufficient. All computer science professors who have not read this book should do so now, especially if involved in curriculum design and outcomes assessment. Reading this book is as close as you can get to having an expert advise on the practice and applications side of the curriculum.
Disrupting Class got me seriously thinking about the educational complex more than any other book. It's easy to find flaws in the system, but rather than dwell on these, the book addresses fundamental concepts of innovation and growth. It raises the important point that organizations protect themselves at all costs, and that disruption can only be achieved by making a market where there was not one before. Several of the ideas from this book influenced my concept of the university sandbox, a place to foster the reimagining of higher education, as documented in the Future of Education Task Force report.
Pragmatic Thinking and Learning is a brilliant presentation on how people learn, focusing on software developers in particular. I would love to know what a non-developer thinks of the book. I found it to be general enough to extract several pieces into completely unrelated teaching experiences, but I also have the benefit of sharing a jargon and mindset with the author and the intended audience.
How People Learn. By the time I read this book earlier this year, I knew most of the big ideas already, having come across them in other readings or through SIGCSE and CCSC:MW. The piece of the puzzle I had not previously considered was the role of culture in learning and the vast, undocumented differences that can exist within superficially homogenous groups. I have always known that my game-related examples in CS120, for example, appeal to subcultures in the class, and so I would try to balance them against other examples, such as economics or sports-related. However, before reading this book, I hadn't considered the deeper issues of how people communicate and express understanding.
This is far from an annotated bibliography, but reading Spinuzzi's eight-year anniversary blog post got me thinking that I should be sharing more about the books I read, and providing more critical evaluations of them. If nothing else, it will help me remember how I've grown and changed through reflective practice, and hopefully it will also help provide a corpus of related work as I continue to document my research.
Friday, June 10, 2011
Predator and Pray
A funny thing happened at the Conner Prairie Educator Event. I was there with some of the Morgan's Raid team to show off the game to elementary school teachers. The six of us were standing at our table a little before the event was to begin, when an attractive 40-ish woman and her presumed husband approached our group and asked if any of us were suffering from chronic pain or allergies. Needless to say, we were a little dumbfounded. The mind cannot help but wonder why someone would ask such a thing. The amazing automatic pattern-matching machine in my head figured she was a masseuse with healing crystals or otherwise had some snake oil to sell.
She was quite insistent, and no one in my group was biting. Never one to back away from the unusual, I laughed and said she should see me in the Fall, when my allergies act up. She smiled and told us that she had felt that there was some pain among us, and then she asked if she could pray for me.
This was not quite what I expected, but assuming they were not worshipers of some deranged devil-god, I figured it wouldn't hurt, so I said yes. Their follow-up question, whether they could lay on hands, was the obvious next step.
She proceeded to invoke the Holy Spirit, asking for it to cleanse me of my allergies so that I might never have to take Claritin again, because God had created our bodies to enjoy nature and not to fight against it. Then, the Holy Spirit told her that I was also going to have a promotion soon, and she asked for God's grace in this as well. Before and after the clearly spoken part of the blessing, the man and woman whispered something I could not hear but assume to have been speaking in tongues. At some point, the third in their party---whom I assume to be the mother of one or the other of these two---took a photograph, which made me wonder if I'd show up on candid camera.
They were pleased with themselves, and I thanked them for the blessing. They were clearly fervent in their beliefs and felt that they had done a Very Good Thing (tm). Since I'm not opposed to the power of prayer, I then asked them to pray for a friend of mine in need, which they promised to do, and off they went.
This was an interesting experience, which of course got some chuckles from the Morgan's Raid team. As for my allergies, I'll admit that they're annoying, but I would not pray to have them removed. I just don't think that's how prayer works. Allergies are part of the human condition. Everybody has their own problems, not because we're bad or sinful, but because we're here. How about prayers of thanks for modern science, from which comes Zyrtec? It didn't bother me that they were so cocksure that my allergies would never bother me again, although I would not be surprised if I keep spending a quarter of the year on antihistamines, as I have for over ten years.
The addition of a promise of promotion, though, was too much. Assuming I stay a professor, I will probably get exactly one more promotion in the rest of my career, and it will only come if I chase it. Given that I haven't considered applying for promotion, I find it very unlikely that the Provost is currently writing a memorandum of spontaneous promotion for me. This kind of unsubstantiated, faith-based claim is not only demonstrably wrong, it is manipulative. I'm cynical enough to be relatively unaffected by it (aside from the inspiration to write about it here), but this kind of behavior can wreak havoc on people with low wisdom, as we gamers might say. Maybe, at this point, I should have called them out, let them know that this "Holy Spirit thinking I will be promoted" was BS. What would it have gained? As it is, I held my tongue and gave myself time to consider and reflect on the experience. I'm not sure which is worse: benevolent charlatanism or cowardice in the face of the same.
I'll be sure to post a follow-up once it's ragweed season to let you know how it turns out.
Saturday, June 4, 2011
An afternoon with game engines
After an exciting morning at the XFC, I had an iced coffee and decided to dig into a few game engines. I am looking for technology to support up to three upcoming activities: the digital archaeology simulation mentioned a few days ago, my Fall game programming class (which so far has no plan), and my Spring VBC seminar. This is an informal story about my afternoon explorations, more about impressions than any deep review.
I also remembered having seen some really high-quality tutorials for UDK (the Unreal Development Kit), including great video tutorials, but all of those links on the UDK documentation pages are dead. I'll have to come back to it when I'm feeling inspired and the Web site is fixed. I love the idea of using UDK, but I'm afraid of the learning curve.
Working without version control is practically a non-starter for any significant development effort. Spending ten grand on software licenses is definitely a non-starter. This is a shame, since Unity seems to have pulled ahead in the engine wars for projects the scale I am considering. I may have to invest more time into trying some of the version control workarounds that folks have posted so that we can get away with the free version, since there's literally nothing else in the Pro version that I expect I would need.
As always, I am open to your suggestions.
Unity
I started by working with Unity. I had looked at it briefly before, but had not really done anything with it. I was immediately infuriated (yes, infuriated) by this: I keep my Windows start menu at the top of the screen, where it belongs despite its not being the default position. Unity slams all of its dialog boxes to the top of the screen, and they come up under the taskbar, where I cannot move or close them. Ugh. In Linux, I would just alt-drag to move any window; here, I temporarily had to set the taskbar to autohide just to tinker with Unity.
The documentation was high quality, and it didn't take me long to make this:
You can run the "game" and use the keyboard to move the grey box around the blue field. Not too shabby. Note that the scripting language is JavaScript, which is nice since I have passing familiarity with it, and there's a high likelihood that students will have at least seen it before. If not, it's close enough to Java or C# to make progress fairly quickly.
With this impressive start, I began looking at version control, because you should always use version control. Unity's own product page lays it out for you: to use an external (i.e., third-party) version control system, you need the $1500 Unity Pro license (as opposed to the $0 free version), and for an extra $500 per seat you can use their Asset Server. The jump from $0 per seat to $1500 per seat is pretty steep. There are workarounds on various forums, but the long and short of it seems to be that you need the Pro version to make this work.
Torque 2D / Game Builder
Next I decided to look at Torque, which I had never actually used, despite having contacted their friendly customer support people and glanced through one of their books. The main reason is that there's a timed 30-day trial, and I had never set aside serious time to evaluate it. My first impression was very good, especially because I am keen on 2D games: that's one less D required for art resources, and my teams are almost always heavy on coders and light on artists.
I started the Fish tutorial, and as soon as I had to start tinkering with scripts, I got worried. First, despite this being an official and apparently popular tutorial, the scripts and folder structure in the tool were completely different from those in the tutorial: names, directories, functions, all different. I struggled through to a point where it appeared I could start doing some OO scripting, but the tutorial had diverged too far from the tool for me to care to go any further. Second, the scripting language is significantly different from what my students and I are used to. Just as an example, functions require "%this" to be passed explicitly as the first parameter, and yet they can be invoked via dot-notation on objects:
// Definition
function FishClass::setSpeed(%this)
{
   %this.speed = getRandom(%this.minSpeed, %this.maxSpeed);
}

// Invocation
%this.setSpeed();
This is not a deal-breaker, but it is quite ugly, and I found the quality of the documentation generally to be inferior to Unity's. I will mention one excellent feature of the Torque tutorial: each step was followed by a link of the form "If you don't know how to do that, click here," and clicking it expanded a frame with step-by-step details on how to access the corresponding feature.
I need the documentation to be in sync with the tool if I'm going to throw this at students, especially on a short time-frame project. A good experiment, but time to move on to...
UDK
The mac daddy of readily available, commercial-quality game engines, with incredibly university-friendly licensing terms for non-commercial use: basically, no cost. The only real cost of UDK is that it's a bit complicated. After tinkering with Unity and Torque, and glancing again at good old GameMaker (which I taught to kids in grades 2-8 a few Summers ago), UDK is clearly a different kind of beast. I had played with it a little many months ago, and opening it back up, I thought maybe I could recreate the jolly good "move the box with the keyboard" game. No such luck. In fact, I couldn't seem to intuit how to do anything at all. It was late in the day, and by this point I had had enough interruptions to wreck my flow. I brought up some videos and tutorials that I'm pretty sure I had read before, and I remembered bits and pieces of how the engine works. I also remembered my initial response to UDK the first time I used it: clearly, this is a powerful tool that we could use to build something very tight, but throwing it at a team of undergraduates with no background in any of these ideas would be very dangerous.
Also, I remembered having seen some really high-quality tutorials, including great video tutorials, but all of those links on the UDK documentation pages are now dead. I'll have to come back to it when I'm feeling inspired and the Web site is fixed. I love the idea of using UDK, but I'm afraid of the learning curve.
Slick
I can't shake the feeling that we could always just do it in Slick. The biggest problem we had with Slick on Morgan's Raid was that it was hard to gain momentum, since we had to build so much ourselves. However, I will have nearly half of the Morgan's Raid Spring team on the two big projects (digital archaeology and VBC), and it's tempting to just build on what we had. The complication, which I will discuss in an upcoming post mortem on the project, is that students don't have the experience to do software architecture, and so leaving core engine decisions up to them can have further productivity-impacting consequences.
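To illustrate what "doing it ourselves" means, here is a rough sketch of the kind of minimal skeleton Slick hands you: a box you move with the arrow keys, roughly equivalent to my Unity experiment above. The class name, window size, and speed are made-up example values, and I'm writing this from memory of the Slick API, so treat it as a sketch rather than working project code.

import org.newdawn.slick.AppGameContainer;
import org.newdawn.slick.BasicGame;
import org.newdawn.slick.GameContainer;
import org.newdawn.slick.Graphics;
import org.newdawn.slick.Input;
import org.newdawn.slick.SlickException;

// A minimal Slick game: a box that moves with the arrow keys.
public class BoxGame extends BasicGame {

    private float x = 100, y = 100;

    public BoxGame() {
        super("Box Demo");
    }

    @Override
    public void init(GameContainer container) throws SlickException {
        // Load images, sounds, and fonts here.
    }

    @Override
    public void update(GameContainer container, int delta) throws SlickException {
        // delta is the number of milliseconds since the last update call.
        Input input = container.getInput();
        float distance = 0.2f * delta;
        if (input.isKeyDown(Input.KEY_LEFT))  x -= distance;
        if (input.isKeyDown(Input.KEY_RIGHT)) x += distance;
        if (input.isKeyDown(Input.KEY_UP))    y -= distance;
        if (input.isKeyDown(Input.KEY_DOWN))  y += distance;
    }

    @Override
    public void render(GameContainer container, Graphics g) throws SlickException {
        g.fillRect(x, y, 32, 32);
    }

    public static void main(String[] args) throws SlickException {
        AppGameContainer app = new AppGameContainer(new BoxGame());
        app.setDisplayMode(640, 480, false);
        app.start();
    }
}

Even this much requires the team to own the main method, the update and render hooks, and eventually state management and resource loading, which is exactly the momentum problem I mentioned.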
Something like conclusions
It was a great afternoon of tinkering, though not without frustrations. Still, it is good to be reminded of what it feels like to be thrown into an overwhelming interface: it maintains sympathy for the experience of the undergraduate computer science student.
Working without version control is practically a non-starter for any significant development effort. Spending ten grand on software licenses is definitely a non-starter. This is a shame, since Unity seems to have pulled ahead in the engine wars for projects of the scale I am considering. I may have to invest more time in the version control workarounds that folks have posted so that we can get away with the free version, since there's literally nothing else in the Pro version that I expect I would need.
As always, I am open to your suggestions.
Thursday, June 2, 2011
Morgan's Raid Extravaganza
I uploaded the beta version of Morgan's Raid the other day and did some work to beautify the Web site. Play the beta or find out more at https://sites.google.com/site/morgansraidgame/
The latest Emerging Media Update includes a feature on Morgan's Raid. Find out more at http://emergingmediainitiative.com/updates/may-2011/. The Update includes the following video in which I talk about the project and make a push for broader changes in higher education to promote this kind of work.