“Are We There Yet?” - An Android-powered tangible educational geography game
With the vast availability of Android devices in the world today, people have developed a number of different ways to communicate with and use their devices. These methods, however, tend to follow traditional patterns of interaction. To demonstrate the flexibility of the Android platform for developing a tangible user interface, we have developed a game that uses several Android sensors to give the user a real sense of feedback. The game allows users to travel to a country by means of directional and tilt sensors and explore the world.
“Are We There Yet?” is an educational game that tasks users with finding the locations of cities across the globe. The game makes use of the accelerometer and rotation vector sensors on Android phones, allowing the user to control the game entirely via movement and rotation. We used the Google Maps API as our mapping engine. This provides a powerful, ready-made framework without taking up too much memory, which was a problem with earlier versions of the game that used our own map image.
The aim of the game is to move the visible portion of the map onscreen to within a circle around the destination city. The user tilts the phone forwards and backwards to control the scroll speed of the map. The direction of motion is determined by the direction in which the player is pointing the phone: for instance, if the phone is pointed north, the map scrolls along the north-south axis. Upon finding the destination city, a new destination is chosen at random from a list stored in a text file.
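The control scheme above can be sketched as a small mapping from device orientation to a scroll vector. This is an illustrative reconstruction, not the game's actual code: the `azimuth`/`pitch` angle conventions are assumed to follow Android's `SensorManager.getOrientation()` (radians, azimuth 0 = north), and `MAX_SPEED` is a tuning constant invented here.

```java
// Sketch of the tilt-and-rotation control scheme: forward/backward tilt
// (pitch) sets scroll speed, compass heading (azimuth) sets direction.
public final class ScrollControl {
    static final float MAX_SPEED = 20f; // pixels per frame; hypothetical tuning value

    /**
     * Converts device orientation into a map scroll vector.
     * @param azimuth rotation about the vertical axis, radians (0 = north)
     * @param pitch   forward/backward tilt, radians
     * @return {dx, dy} scroll offset in screen pixels
     */
    public static float[] scrollVector(double azimuth, double pitch) {
        // Forward tilt controls speed; the sign of the tilt flips the travel direction.
        double speed = MAX_SPEED * Math.sin(pitch);
        // Azimuth selects the axis of motion: pointing north scrolls north-south.
        double dx = speed * Math.sin(azimuth);
        double dy = speed * Math.cos(azimuth);
        return new float[] { (float) dx, (float) dy };
    }
}
```

With the phone held flat (pitch 0) the map stands still; tilting it fully forward while pointing north yields a pure north-south scroll.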
During the development of this project, Google and Nintendo teamed up to create a similar Google Maps-based game for Android devices using the popular Pokémon franchise: 150 Pokémon were hidden throughout the map, and the player had to navigate around the world to find them (Statt 2014).
Two concurrent prototypes were created to test two different proposed implementations.
The canvas prototype of the application made use of the Android SurfaceView, a View (the Android base class for user-interface components) dedicated to drawing graphics onto the screen.
This prototype took image files placed in the resources folder and decoded them into Bitmap objects that could be drawn onto the SurfaceView using a Canvas, which holds the drawing calls.
The Canvas was then used to draw the map background, and an arrow was drawn on top of that background.
A separate Thread was used to keep the graphics drawing without taking processing time away from the user-interface thread, so the application would not become unresponsive.
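The dedicated drawing thread can be sketched as below. In the real prototype the frame body would obtain a Canvas via `SurfaceHolder.lockCanvas()` and release it with `unlockCanvasAndPost()`; here a stand-in `Runnable` frame callback takes its place so the loop structure itself is visible and the sketch stays self-contained.

```java
// Minimal sketch of the render-thread pattern used by the canvas prototype:
// a flag-controlled loop on its own Thread, kept off the UI thread.
import java.util.concurrent.atomic.AtomicBoolean;

public final class RenderLoop implements Runnable {
    private final AtomicBoolean running = new AtomicBoolean(false);
    private final Runnable drawFrame; // stand-in for the lockCanvas/draw/post cycle
    private final Thread thread = new Thread(this, "render");

    public RenderLoop(Runnable drawFrame) {
        this.drawFrame = drawFrame;
    }

    public void start() {
        running.set(true);
        thread.start();
    }

    public void stop() {
        running.set(false);
        try {
            thread.join(); // wait for the drawing loop to exit cleanly
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }

    @Override public void run() {
        while (running.get()) {
            drawFrame.run(); // draw map background, arrow, and score overlay
        }
    }
}
```

Shutting the loop down through a flag plus `join()` mirrors the usual SurfaceView lifecycle, where drawing must stop before the surface is destroyed.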
The Canvas was also used to draw a rectangle in the corner of the screen, where text displaying the score would be drawn.
Problems were encountered with this prototype, as the limited amount of memory available to Android applications made loading the large map image problematic. One alternative fix would have been to use multiple smaller image files and stitch them together programmatically inside the application itself. This also proved problematic, and the prototype was eventually scrapped.
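The stitching alternative considered above could have looked roughly like the following. The sketch uses desktop Java's `BufferedImage` so it runs anywhere; on Android the equivalent would be `Bitmap.createBitmap()` plus `Canvas.drawBitmap()`, and it would face the same heap pressure, since the stitched result is as large as the single image it replaces.

```java
// Illustrative sketch: combine a row-major grid of equally sized map tiles
// into one large image by drawing each tile at its grid offset.
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public final class TileStitcher {
    /** Stitches a row-major grid of equally sized tiles into one image. */
    public static BufferedImage stitch(BufferedImage[][] tiles) {
        int tileW = tiles[0][0].getWidth();
        int tileH = tiles[0][0].getHeight();
        int rows = tiles.length;
        int cols = tiles[0].length;
        BufferedImage out =
                new BufferedImage(cols * tileW, rows * tileH, BufferedImage.TYPE_INT_ARGB);
        Graphics2D g = out.createGraphics();
        for (int r = 0; r < rows; r++) {
            for (int c = 0; c < cols; c++) {
                // Each tile lands at its (column, row) offset in the combined image.
                g.drawImage(tiles[r][c], c * tileW, r * tileH, null);
            }
        }
        g.dispose();
        return out;
    }
}
```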