Phidget Powered Tangible Interface for Google Earth


Google Earth is a fantastic tool for viewing satellite imagery from all over the world. The addition of new 3D terrain and height mapping further improves this experience. It accommodates user-uploaded models and can be a powerful tool for visualising architecture and places of interest, both new and historical. We propose a new interface for use in museums and exhibitions, providing the same experience as a keyboard and mouse, but through a deliberately limited, tangible user interface.


“Are We There Yet?” is an educational game which tasks users with finding the locations of cities across the globe. The game makes use of the accelerometer and rotation vector sensors on Android phones, allowing the user to control the game entirely through movement and rotation. We used the Google Maps API as our mapping engine. This provides a powerful, ready-made framework with a modest memory footprint; earlier versions of the game, which rendered our own map image, ran into memory problems.
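As a rough illustration of the sensor setup this implies, the sketch below registers a listener for the rotation vector sensor and derives heading and tilt angles from it. The class name GameActivity is our own label for illustration, not necessarily the name used in the project source.

    import android.app.Activity;
    import android.content.Context;
    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;
    import android.os.Bundle;

    public class GameActivity extends Activity implements SensorEventListener {
        private SensorManager sensorManager;
        private final float[] rotationMatrix = new float[9];
        private final float[] orientation = new float[3]; // azimuth, pitch, roll

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            sensorManager = (SensorManager) getSystemService(Context.SENSOR_SERVICE);
            // Listen to the rotation vector sensor at a game-friendly rate.
            sensorManager.registerListener(this,
                    sensorManager.getDefaultSensor(Sensor.TYPE_ROTATION_VECTOR),
                    SensorManager.SENSOR_DELAY_GAME);
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            if (event.sensor.getType() == Sensor.TYPE_ROTATION_VECTOR) {
                // Convert the rotation vector into azimuth/pitch/roll (radians).
                SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
                SensorManager.getOrientation(rotationMatrix, orientation);
                // orientation[0] = azimuth (compass heading), orientation[1] = pitch (tilt).
            }
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    }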

The aim of the game is to navigate an icon representing the player into a circle around the destination city. The user tilts the phone forwards and backwards to control the speed of the icon, while the direction of motion is determined by the direction the phone is pointing. For instance, if the phone is pointed north, the icon moves along the north-south axis, travelling north when tilted forwards. Upon reaching the destination city, a new destination is chosen at random from a list stored in a text file.
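A minimal sketch of what this per-frame update might look like follows, assuming the azimuth and pitch values from the sensor code above. The constants, the step signature, and the pickNewDestination() helper are all hypothetical, chosen only to show the mechanic.

    import android.location.Location;
    import com.google.android.gms.maps.model.LatLng;

    // Called once per frame with the latest azimuth (heading) and pitch, in radians.
    LatLng step(LatLng position, LatLng destination, double azimuth, double pitch) {
        final double SPEED_SCALE = 0.05;      // degrees of latitude per frame at full tilt (assumed)
        final float ARRIVAL_RADIUS_M = 50000; // 50 km circle around the destination (assumed)

        // Tilting the phone forwards (negative pitch) increases speed.
        double speed = Math.max(0, -pitch) * SPEED_SCALE;

        // Move along the compass heading: pointing north moves the icon north.
        double dLat = Math.cos(azimuth) * speed;
        double dLng = Math.sin(azimuth) * speed
                / Math.cos(Math.toRadians(position.latitude));
        LatLng next = new LatLng(position.latitude + dLat, position.longitude + dLng);

        // Check whether the player is inside the circle around the destination.
        float[] distance = new float[1];
        Location.distanceBetween(next.latitude, next.longitude,
                destination.latitude, destination.longitude, distance);
        if (distance[0] < ARRIVAL_RADIUS_M) {
            pickNewDestination(); // hypothetical helper: next city from the text file
        }
        return next;
    }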

Literature Review

During the development of this project, Google and Nintendo teamed up to create a similar Google Maps-based game for Android devices, using the popular Pokémon franchise. 150 Pokémon were hidden throughout the map, and the player had to navigate around the world to find them (Statt 2014).


Early Prototype

The early prototype for the application made use of the Android SurfaceView, a View (the base class for Android user interface and interaction components) that provides a dedicated surface for drawing graphics onto the screen.
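A skeleton of such a SurfaceView subclass might look like the following; GameView is our name for it, not necessarily the one used in the prototype.

    import android.content.Context;
    import android.view.SurfaceHolder;
    import android.view.SurfaceView;

    public class GameView extends SurfaceView implements SurfaceHolder.Callback {
        public GameView(Context context) {
            super(context);
            // Receive callbacks when the drawing surface is created or destroyed.
            getHolder().addCallback(this);
        }

        @Override
        public void surfaceCreated(SurfaceHolder holder) {
            // Safe to start the drawing thread here (see below).
        }

        @Override
        public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) { }

        @Override
        public void surfaceDestroyed(SurfaceHolder holder) {
            // Stop the drawing thread before the surface disappears.
        }
    }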

The prototype took in image files that were placed in the resources folder, and decoded them into Bitmap objects that could be drawn onto the SurfaceView using a Canvas, which holds the drawing calls.
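Decoding the images might be done as follows; the drawable resource names are placeholders of our own.

    import android.graphics.Bitmap;
    import android.graphics.BitmapFactory;

    // Decode drawable resources into Bitmaps once, up front,
    // rather than on every frame.
    Bitmap mapBitmap = BitmapFactory.decodeResource(getResources(), R.drawable.world_map);
    Bitmap arrowBitmap = BitmapFactory.decodeResource(getResources(), R.drawable.arrow);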

The Canvas was then used to draw the map background, and an arrow was drawn on top of that background.
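The per-frame drawing could then be a method on the view, along these lines; the position and heading fields are assumed for illustration.

    import android.graphics.Canvas;

    // Called from the drawing thread with a locked Canvas.
    void doDraw(Canvas canvas) {
        // Map background first, then the player's arrow on top.
        canvas.drawBitmap(mapBitmap, 0, 0, null);
        canvas.save();
        // Rotate the arrow around its centre to match the player's heading.
        canvas.rotate(headingDegrees, arrowX, arrowY);
        canvas.drawBitmap(arrowBitmap,
                arrowX - arrowBitmap.getWidth() / 2f,
                arrowY - arrowBitmap.getHeight() / 2f, null);
        canvas.restore();
    }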

A separate Thread was used to keep the graphics being drawn without taking processing time away from the user interface thread, so the application would not become unresponsive.
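A typical pattern for such a loop, and roughly what the prototype's thread may have looked like, locks the canvas, draws a frame, and posts it back to the screen.

    import android.graphics.Canvas;
    import android.view.SurfaceHolder;

    public class DrawThread extends Thread {
        private final SurfaceHolder holder;
        private final GameView view;
        private volatile boolean running = true;

        public DrawThread(SurfaceHolder holder, GameView view) {
            this.holder = holder;
            this.view = view;
        }

        @Override
        public void run() {
            while (running) {
                Canvas canvas = null;
                try {
                    // Lock the surface, draw a frame, then post it to the screen.
                    canvas = holder.lockCanvas();
                    if (canvas != null) {
                        view.doDraw(canvas);
                    }
                } finally {
                    if (canvas != null) {
                        holder.unlockCanvasAndPost(canvas);
                    }
                }
            }
        }

        public void shutdown() {
            running = false;
        }
    }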


Our Prototype
