This is a look back at the development of my second One Game a Month entry, Hackey: the Hackening, developed using Unity 3D. You can find details about #1gam here. Hackey is available to play in your browser, here, or on Google Play, here. My #1gam profile is here.
The Key Word: Neon
This month’s optional #1gam keyword was “Neon,” conjuring ideas of hacking and cyberpunk cities and Tron lightcycles. For some time, I’d had an idea for a game where you move from node to node in a pseudo network, battling your way to some goal, which felt like a good fit for the theme word, so that’s what I went with. Easy!
Glows are Hard!
So first up, I tried messing with some pretty lines on the screen. As it turns out, getting a nice glow on things is quite a feat! I learnt that a glow has to sit somewhere between visible enough to make a difference and not so visible that it no longer looks like a glow. I spent some time tweaking the glow effects for the lines and nodes. I suspect that the glow falloff rate isn’t linear, but rather has some kind of curve to it; there are probably whole books written on the subject. In the end, I tweaked things until it felt right, and left it at that.
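For the curious, here is a minimal sketch of the intuition, assuming a simple exponential falloff (the actual shader maths in Unity will differ; the function names and `sharpness` parameter are made up for illustration):

```python
import math

# Sketch: comparing linear vs exponential glow falloff. Not Unity code,
# just illustrating why a non-linear falloff "feels" more like a glow:
# a bright core with a long, soft tail, instead of a flat ramp to zero.

def linear_falloff(distance, radius):
    """Intensity drops in a straight line, hitting zero at the radius."""
    return max(0.0, 1.0 - distance / radius)

def exponential_falloff(distance, sharpness=4.0):
    """Intensity decays exponentially: bright centre, soft tail."""
    return math.exp(-sharpness * distance)

for d in [0.0, 0.25, 0.5, 1.0]:
    print(f"d={d:.2f}  linear={linear_falloff(d, 1.0):.2f}  "
          f"exp={exponential_falloff(d):.2f}")
```

At half the radius, the exponential curve has already dropped well below the linear one, which is roughly the "core plus halo" look you expect from neon.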
Colours are Hard!
I had great difficulty picking colours that matched the “neon pinkey-purple glowey” effect I held in my mind, which surprised me, given the simple nature of the design (i.e. dots and lines). Again, I spent a lot of time tweaking the colours, but was unable to pick a pink or purple that looked nice. I also had to consider the different colour schemes used by “owned”, “unowned” and “enemy” nodes and lines; I couldn’t work pink or purple into this scheme, and in the end opted for simple green (owned), blue (unowned) and red (enemy) colours.
While I wasn’t able to achieve the pink-purple effect, I am happy with how the game looks. Also, it leaves pink and purple available for more crazy effects or abilities, should I ever implement them.
A Strong Focus on UI and User Experience
After last month’s game, where I completely failed to implement any sort of UI and the controls felt pretty… shitty, I set myself a goal of a much better user experience in this month’s game. This meant resurrecting some half-baked UI code I had developed in previous projects that allows state-based control of the UI and game world.
I spent a lot of time (re)designing a state-based UI framework to control the GUI and player input, which involved many failed design attempts and flip-flopping between ideas. While I’m fairly happy with the end result, the real test will be how easily it slots into future projects – I want to spend minimal time tinkering with non-game code, moving forward.
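The core idea can be sketched in a few lines. This is a minimal, hypothetical version of a state-based UI controller (the state names and handler methods are invented for illustration, not Hackey’s actual framework, which is in C#):

```python
# Sketch of a state-based UI controller: each state owns its input
# handling, and returns the name of the next state to transition to.

class UIState:
    def enter(self): pass
    def exit(self): pass
    def handle_input(self, event): pass   # return next state name, or None

class MenuState(UIState):
    def handle_input(self, event):
        if event == "start":
            return "playing"              # request a transition

class PlayingState(UIState):
    def handle_input(self, event):
        if event == "pause":
            return "menu"

class UIStateMachine:
    def __init__(self):
        self.states = {"menu": MenuState(), "playing": PlayingState()}
        self.current = "menu"

    def dispatch(self, event):
        next_state = self.states[self.current].handle_input(event)
        if next_state:
            self.states[self.current].exit()
            self.current = next_state
            self.states[self.current].enter()

ui = UIStateMachine()
ui.dispatch("start")
print(ui.current)  # playing
```

The payoff is that input routing and GUI visibility decisions live in one place per state, instead of being scattered through the game code as flags.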
I also attempted to design a “model adapter” framework, to separate presentation code from game logic, which is generally good practice in programming. Again, this involved lots of iteration, but ended up with something very similar to the Android SDK’s ListAdapter design (hooray for day jobs). I think this could use some further tweaking. It’s good enough for now, but the tinkerer’s job is never done!
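For readers unfamiliar with the ListAdapter pattern, here is a rough sketch of the idea with hypothetical names (the real game code is C# and more involved; this just shows the shape):

```python
# Sketch of a ListAdapter-style "model adapter": the presentation layer
# asks the adapter for display-ready views, and never touches the game
# model directly, keeping rendering concerns out of game logic.

class Node:                         # game-logic side
    def __init__(self, name, owner):
        self.name, self.owner = name, owner

class NodeAdapter:                  # presentation side
    """Maps model objects to display strings, much like Android's
    ListAdapter maps data items to Views."""
    def __init__(self, nodes):
        self.nodes = nodes

    def count(self):
        return len(self.nodes)

    def get_view(self, position):
        node = self.nodes[position]
        colour = {"player": "green", "none": "blue", "enemy": "red"}[node.owner]
        return f"{node.name} [{colour}]"

adapter = NodeAdapter([Node("alpha", "player"), Node("beta", "enemy")])
print(adapter.get_view(1))  # beta [red]
```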
Reflecting on the above paragraphs, it seems I spent a lot of time this month on “frameworkey” things and coding “for the future,” which risks killing small projects: if too much time goes into gold-plating your awesome custom frameworks, you’re probably not spending much time on game logic, and the end result will be boring! Since I intend to use these mini-frameworks in future projects, I was happy to lose time to them; still, I was mindful that it cost me some game features.
Sound is Hard!
Initially, I wasn’t happy with how Unity handles sound, especially when it comes to 3D positioning. I had placed some audio sources on game objects and moved the camera around in the scene, but the audio coming out of the speakers just didn’t seem “right.” The audio was too soft and the left-right positioning (panning) seemed off. So, I read some docs on how Unity handles sound, and what all of the crazy parameters mean, tweaked my project, and am now happy with the result. So basically, RTFM.
Publishing and Marketing
I was amazed at the quality of user feedback received from both Newgrounds and Kongregate in the first few days of the game being listed! I attribute it to good social design of the websites, whose purpose is to bring visibility to new games. The feedback slowed down after the first few days, but the number of views and plays for each is still creeping upwards and totals around 1500 between them. In terms of the content of the feedback, I was glad to see that there were a few feature suggestions and a few annoyances being reported, both of which let me know what to work on next and what to devote development attention to. There was also a nice scattering of compliments, which let me know the concept works as a game!
As for Google Play, given that there are a gajillion apps already on the Play store, and having some prior experience releasing business apps, I didn’t expect an influx of downloads. However, I was surprised to see a steady trickle of downloads, maybe 4 per day. To me, this is pretty good! Somehow people are finding Hackey on the Play store. The retention rate is also quite good, with just under half of the downloaders still having Hackey installed on their devices. As of writing (1st April 2014), there are 43 total installs, with 21 active installs (i.e. people who haven’t uninstalled).
Based on user feedback from Kongregate and Newgrounds, I implemented an inline tutorial. I call it “inline” because it is a non-blocking, non-popup, hint-style tutorial; e.g., it tells you to “drag this node to here.” I think it works quite well! It can get annoying when you replay the game, though, as it can’t be disabled and tends to reposition the camera, which is frustrating. I would have liked to add an option to turn it off, but considered that lower priority than some other features.
Regarding tutorials, I had great difficulty creating tutorial logic that is “fool proof”; for example, reacting to a user who doesn’t want to follow the tutorial but achieves the desired results anyway. The first step in the tutorial asks the user to capture a node; the tutorial had to guide the player through the steps (“drag this… to here”) while being aware that the player might decide to do other things, such as turn the “tutorial” node into a Watchtower, rendering it unusable for capturing and therefore breaking the tutorial. In this case, the tutorial has to instruct the user to “right-click to cancel this node” so the node becomes available for capturing again. Meanwhile, the player may have just captured the targeted node anyway, using yet another node! The tutorial logic has to cover all of this while allowing the player freedom; otherwise the tutorial interferes with the player’s desire to… play the game.
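One way to think about this (a simplified sketch, with invented state names, not the actual Hackey code) is to key each hint off the current game state rather than off a scripted sequence of actions, so any route to the goal counts:

```python
# Sketch: goal-based tutorial hints with a recovery path. The hint is
# derived from game state each frame, so a player who breaks the
# "script" (e.g. builds a Watchtower on the tutorial node) gets steered
# back, and a player who captures the node some other way just finishes.

def tutorial_hint(state):
    """Return the hint to display this frame, or None when done."""
    if state["captured"]:
        return None                                  # goal met by any means
    if state["tutorial_node_is_watchtower"]:
        return "Right-click to cancel this node"     # recovery path
    return "Drag this node to here"                  # happy path

state = {"captured": False, "tutorial_node_is_watchtower": True}
print(tutorial_hint(state))  # Right-click to cancel this node
```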
From a technical standpoint, I used Unity’s “coroutines” to implement the tutorial steps, which worked great. I got the idea from a great article about doing exactly that in iOS apps: Await in the Land of iOS – Scripting Users.
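Unity coroutines are IEnumerator methods that yield once per frame, which maps almost one-to-one onto Python generators, so the shape of the technique can be sketched like this (the world dict and step names are invented for illustration):

```python
# Sketch of coroutine-driven tutorial steps. Each yield is "wait one
# frame"; the sequence reads top-to-bottom like a script, which is
# exactly why coroutines suit tutorials.

def wait_until(predicate):
    """Yield every 'frame' until the predicate becomes true."""
    while not predicate():
        yield

def tutorial(world):
    world["hint"] = "Drag this node to here"
    yield from wait_until(lambda: world["node_captured"])
    world["hint"] = "Now build a Watchtower"
    yield from wait_until(lambda: world["watchtower_built"])
    world["hint"] = None

world = {"node_captured": False, "watchtower_built": False, "hint": None}
steps = tutorial(world)

next(steps)                       # frame 1: first hint shown
world["node_captured"] = True     # player completes step 1
next(steps)                       # frame 2: hint advances
print(world["hint"])              # Now build a Watchtower
```

The win is that the "wait for the player, then advance" control flow lives in one linear function instead of a tangle of per-frame flag checks.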
In the end, I was reasonably happy with the unobtrusiveness of the tutorial, if not completely happy with the code, which was a bit icky. I later refactored the code to a much more coder-friendly style after handling some more pressing gameplay issues.
Deploying to Android
The decision to deploy to Android was based upon the game being very touch-friendly in its original design, in that the user drags things around to get stuff done. I thought, hey, let me just turn on touch events and see what happens! Surprisingly, it worked quite well, due to the automatic touch-to-mouse event simulation Unity performs. Unfortunately, it simulated right- and middle-mouse dragging via multi-finger drags, which I found quite cumbersome, and tweaking it meant doing some actual work. Sigh!
Implementing “proper” touch controls via a Unity plugin, Input.Touches, took a few nights, but the end result is nice and allowed me to deploy to Android.
With the touch issues out of the way, I turned my attention to the horrible performance on tablets – the game did not run well at all, dropping to <10 FPS towards the end of a level when a lot of nodes have been exposed. This was a failure of planning on my part, due to my developing the game on a super-powerful (as they all are) desktop machine, then expecting similar performance from much-less-powerful mobile devices.
After some experimenting (never optimise without measuring!), I discovered that the slowdowns were mainly due to my using multiple separate particle systems for node bodies and glows, each of which incurred its own “draw call.” When developing games in Unity, you want to keep draw calls as low as possible. I switched the node rendering to a single quad polygon per node, which is much faster and sees almost no drop in FPS, even with 500 nodes on-screen. Node special effects still use particle systems; however, few special effects are visible at any one time, so performance is maintained.
The lines between nodes were also causing slowdowns, for much the same reason – draw calls. I fixed this by packing all lines into a single non-continuous line, which is rendered with one draw call. Yippee!
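The packing itself is simple; here is a rough sketch of the idea (not Vectrosity’s actual API, and the function and data names are invented): instead of one line object, and hence one draw call, per connection, every segment’s endpoints are flattened into a single list that a "discrete" line renderer can draw in one call.

```python
# Sketch: pack every node-to-node segment into one flat endpoint list,
# [a1, b1, a2, b2, ...], so all lines render as a single non-continuous
# line in one draw call rather than one call per segment.

def pack_segments(connections, positions):
    """Flatten (a, b) node-name pairs into one list of endpoint pairs."""
    points = []
    for a, b in connections:
        points.append(positions[a])
        points.append(positions[b])
    return points

positions = {"n1": (0, 0), "n2": (1, 0), "n3": (1, 1)}
connections = [("n1", "n2"), ("n2", "n3")]
print(pack_segments(connections, positions))
# [(0, 0), (1, 0), (1, 0), (1, 1)]
```

Because consecutive point pairs are treated as independent segments, the single line doesn’t have to be continuous, which is what makes this trick work for an arbitrary network graph.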
Lastly, the in-game node numbers were each incurring their own draw call, despite my best efforts to batch them into a single one; for some reason, the texts simply weren’t being detected as similar and therefore weren’t considered for “draw call batching.” After some Googling, I found a fix whereby rendering the texts with a separate camera allows them to be batched, though I don’t know why. It worked, and the FPS returned to a comfortable 40-60!
The optimisations performed are another case of RTFM and knowing the ins and outs of your tools, specifically “draw call batching.”
I’m happy with the final result, and especially happy with it being available to play online and on Android devices. As with last month’s game, I would love to continue development, adding features and polish, but must resist the urge! On to the next game!
Audio: “9 sound effects” from OpenGameArt, attributed to “Michel”, with only this link to go on. Thanks Michel, whoever you are!
Unity Plugins Used
Vectrosity for the vector lines – an excellent package. Distributed as a DLL, making it fit perfectly into my project setup. Eleventy-billion stars!
Input.Touches for the touch controls. The code is a mess but it gets the job done!