There are five basic defining properties that a system must satisfy to be considered a tangible user interface\cite{Kim_2008}. Here we assess each one in turn to establish whether the system we developed is a tangible user interface.
This property can be generalised to a physical-to-digital mapping; a computer mouse is a good example, with its clear mapping of physical movement to digital movement. Our system demonstrates such a mapping in every action performed, so we believe it meets this criterion for input. However, our Phidget controller provides no physical output, so the criterion is only partially met.
Unfortunately, one limitation of our system is that devices cannot be operated concurrently. This is a software limitation, and given more time the system could be developed to allow concurrent usage.
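One way the concurrency limitation could be addressed in software is to service each device's events on its own worker thread, fed by a queue. The sketch below is a minimal illustration of that idea only; the \texttt{DeviceWorker} class, device names, and per-event actions are hypothetical, and in a real revision the Phidget event callbacks would enqueue the events.

```python
import queue
import threading

class DeviceWorker:
    """Services one device's events on a dedicated thread, so several
    devices can run concurrently (hypothetical sketch; real Phidget
    event callbacks would feed the queue)."""

    def __init__(self, name, action):
        self.name = name
        self.action = action          # callable applied to each event
        self.events = queue.Queue()
        self.results = []
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def _run(self):
        while True:
            event = self.events.get()
            if event is None:         # sentinel: shut the worker down
                break
            self.results.append(self.action(event))

    def stop(self):
        self.events.put(None)
        self._thread.join()

# Two devices handling events at the same time, independently.
slider = DeviceWorker("slider", lambda v: v / 1000)  # normalise raw value
button = DeviceWorker("button", lambda v: v > 0)     # pressed / released

slider.events.put(500)
button.events.put(1)
slider.stop()
button.stop()
print(slider.results, button.results)   # [0.5] [True]
```

Because each worker blocks only on its own queue, one device's handler can never stall another's, which is the behaviour the current single-device design lacks.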
Each individual component achieves one specific task, and only that task. We therefore consider this objective met.
Our components are currently unaware of their spatial surroundings. Future revisions could easily address this by attaching sensors to the main interface kit, enabling more interaction with the physical environment.
At present, each of our devices serves only one specific purpose and cannot be reconfigured spatially. Again, this is something that could readily be addressed in future developments of the system.
Given the current system's hardware limitations, we believe it does not yet constitute a tangible interface. With proper concurrency support, it could satisfy most of the criteria. Furthermore, to make the system a truly representational tangible interface, we could embed the RFID tags within physical objects representing locations in the world. Users could then interact directly with real-world models, and be taken to their counterparts in Google Earth by scanning them.
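The tag-to-location step of this proposal could be sketched as follows. One established way to steer Google Earth externally is to write a small KML document that a self-refreshing NetworkLink reloads; the snippet below builds such a document from a scanned tag ID. The tag IDs, location names, and coordinates in \texttt{TAG\_LOCATIONS} are placeholder assumptions, not values from our system.

```python
# Hypothetical mapping from RFID tag IDs to world locations; both the
# tag IDs and the coordinates here are illustrative placeholders.
TAG_LOCATIONS = {
    "4d004a1b22": ("Eiffel Tower", 48.8584, 2.2945),
    "4d004a1b23": ("Sydney Opera House", -33.8568, 151.2153),
}

def kml_for_tag(tag_id):
    """Build a KML Placemark with a LookAt for the scanned tag; Google
    Earth can load this via a refreshing NetworkLink and fly to it."""
    name, lat, lon = TAG_LOCATIONS[tag_id]
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2"><Placemark>\n'
        f'  <name>{name}</name>\n'
        '  <LookAt>\n'
        f'    <latitude>{lat}</latitude>\n'
        f'    <longitude>{lon}</longitude>\n'
        '    <range>1000</range>\n'
        '  </LookAt>\n'
        '</Placemark></kml>\n'
    )

print(kml_for_tag("4d004a1b22"))
```

In the full system, the Phidget RFID reader's tag-detected event would call \texttt{kml\_for\_tag} and write the result to the file the NetworkLink watches, completing the physical-to-digital loop described above.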