Case Study: Online Play Games
Developing games for English language learners, and a debugging tool to make the in-house QA process more accurate and efficient.
Overview
The Online Play games were created to improve student engagement in English language lessons. Research had shown that additional games to accompany a course was consistently among teachers' top three requests for extra content.
The games typically paired a simple game mechanic (jumping over obstacles, finding hidden objects, etc.) with question-and-answer rounds based on the language point or learning objective being taught in that part of the course.
A games bank, built on the Phaser game engine, was created to let us scale games out to multiple courses by swapping the underlying content to fit each course and level.
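For illustration, the content-swapping idea can be sketched as a typed per-course content file. The shape below is hypothetical rather than the actual schema we shipped:

```ts
// Hypothetical shape of a per-course content file in the games bank.
// The same game code is reversioned for a new course or level by
// pointing it at a different file like this, plus the matching assets.
interface GameContent {
  course: string;               // e.g. "everybody-up"
  level: number;                // course level this content targets
  questions: Array<{
    prompt: string;             // question used in the Q&A rounds
    answers: string[];          // candidate answers shown to the learner
    correctIndex: number;       // index of the right answer
    audioKey?: string;          // optional spoken prompt
  }>;
  assets: {
    images: Record<string, string>;  // asset key -> file path
    audio: Record<string, string>;
  };
}
```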
In addition to this, I created a debugging tool to improve the QA process and build confidence that the shipped games would work correctly.
You can play these games from one of the courses on the Everybody Up student website. Select a level, then the games section to start playing.
Approach
As one of two user interface developers working on these games, I was responsible for:
Adding animations and behaviours to the code
Identifying bugs and fixing the code
Preparing assets (sprites, audio, JSON settings)
Loading game assets to create new instances of each game (a short loading sketch follows this list)
Testing and QA
Building an in-engine debugging tool
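A minimal sketch of the asset-loading step, assuming Phaser 3 and the hypothetical content-file shape above; paths and keys are illustrative, not the real project structure:

```ts
import Phaser from 'phaser';

// Loads a per-course content file, then queues the assets it references.
// Files added while the loader is running are picked up in the same pass.
class LoaderScene extends Phaser.Scene {
  constructor() {
    super('loader');
  }

  preload() {
    // Fetch the per-course content file first...
    this.load.json('content', 'content/everybody-up-level-1.json');

    // ...then queue its assets as soon as the JSON arrives.
    this.load.on('filecomplete-json-content', () => {
      const content = this.cache.json.get('content');
      for (const [key, file] of Object.entries<string>(content.assets.images)) {
        this.load.image(key, file);
      }
      for (const [key, file] of Object.entries<string>(content.assets.audio)) {
        this.load.audio(key, file);
      }
    });
  }

  create() {
    // All assets are in; hand over to the game scene proper.
    this.scene.start('game');
  }
}
```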
Challenges and their mitigations
- Many games needed to be created in a short time frame.
Mitigations:
Created a debugging tool to improve efficiency and accuracy.
Created documentation to support the development and reversioning of each game.
Created training documents for the business-as-usual handoff to the production department.
- Games had to be balanced for younger learners so they were neither too easy nor too hard.
Mitigations:
User testing with young learners.
Generous time allowances for language learners.
- As with any development, there were plenty of bugs to address.
Mitigations:
Testing was undertaken across supported browsers and devices.
Bugs were triaged, replicated, and fixed before being extensively retested.
Documentation was maintained in response to these changes.
Games we created
- Art Studio
- Double Trouble
- Maze
- Mega Munch
- Smash Beach
- Smash Space
- Super Sam
- Switch
- Treasure Hunt
Debugging Tool
I developed an in-game debugging tool which supported all the Online Play games.
The main benefit of the debugging tool was that game elements could be run in-engine, so obvious problems with the assets could be spotted quickly, rather than discovered by trial and error while playing through the games repeatedly in the hope of happening upon a problem.
The debugging tool inspected the content of assets (audio, images, text, data files), and was used as a supporting tool to find errors in the content and games themselves.
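The real tool ran in-engine and covered more asset types; purely to illustrate the kind of check involved, here is a minimal Node.js sketch assuming the hypothetical content shape above:

```ts
import * as fs from 'fs';
import * as path from 'path';

// Sketch of a content check: verify that every asset a content file
// references exists on disk, and that the question data is self-consistent.
function checkContentFile(contentPath: string): string[] {
  const errors: string[] = [];
  const content = JSON.parse(fs.readFileSync(contentPath, 'utf8'));
  const baseDir = path.dirname(contentPath);

  const allAssets: Record<string, string> = {
    ...content.assets.images,
    ...content.assets.audio,
  };

  // Report any referenced file that is missing on disk.
  for (const [key, assetPath] of Object.entries(allAssets)) {
    if (!fs.existsSync(path.resolve(baseDir, assetPath))) {
      errors.push(`Missing asset "${key}": ${assetPath}`);
    }
  }

  // Report questions whose correct-answer index is out of range.
  for (const q of content.questions ?? []) {
    if (q.correctIndex < 0 || q.correctIndex >= q.answers.length) {
      errors.push(`Bad correctIndex for question "${q.prompt}"`);
    }
  }
  return errors;
}

// Usage: log any problems found in one course's content file.
console.log(checkContentFile('content/everybody-up-level-1.json'));
```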
Testing with young learners
Testing was undertaken with young learners to establish not only what types of games they were interested in playing, but also whether the games we were developing were usable.
Our research showed that the most successful games were those with replayability and a sense of competition, such as Super Sam and Mega Munch, so recommendations for future game development would focus on these traits to further raise engagement.
How success was measured
As a result of my input, we were able to produce scalable games that were deployed across several international primary courses.
The in-engine debugging tool saved a substantial amount of QA time and allowed us to deliver bug-free games with a high degree of confidence.