Topic report plan - Using sensor fusion in an Android application to determine a user's direction

Our bachelor thesis investigates the possibility of constructing an Android application that simulates sound sources at GPS coordinates using smartphone sensors. Many issues and problems must be overcome to achieve this, one of them being to correctly determine the direction that the application user is facing.

Research question

How can Android smartphone sensors be combined to determine, in real time and with sufficient accuracy, the direction that the user of the smartphone is facing?

Plan of action

This article will cover three distinct parts: understanding the theory associated with the problem, implementing one possible solution, and evaluating that solution.

Literature study

Information about the sensors can be found in the Android API documentation. Furthermore, many similar sensor fusions have been implemented before (although usually not between GPS and gyroscope), so a study of sensor fusion tutorials would be appropriate.
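
As a concrete starting point for the literature study, the sketch below shows one way of reading a compass azimuth from the accelerometer and magnetometer through the standard Android SensorManager API. It is only a minimal illustration of the sensors involved, not the fusion method proposed in this thesis; the class name CompassActivity, its fields and the chosen sampling rate are placeholders.

    // Hedged sketch: obtaining a compass azimuth from accelerometer + magnetometer.
    // Class name and fields are placeholders for illustration only.
    import android.app.Activity;
    import android.hardware.Sensor;
    import android.hardware.SensorEvent;
    import android.hardware.SensorEventListener;
    import android.hardware.SensorManager;
    import android.os.Bundle;
    import android.util.Log;

    public class CompassActivity extends Activity implements SensorEventListener {
        private SensorManager sensorManager;
        private final float[] accel = new float[3];          // latest accelerometer reading
        private final float[] magnet = new float[3];         // latest magnetometer reading
        private final float[] rotationMatrix = new float[9];
        private final float[] orientation = new float[3];

        @Override
        protected void onCreate(Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
            sensorManager.registerListener(this,
                    sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
                    SensorManager.SENSOR_DELAY_GAME);
            sensorManager.registerListener(this,
                    sensorManager.getDefaultSensor(Sensor.TYPE_MAGNETIC_FIELD),
                    SensorManager.SENSOR_DELAY_GAME);
        }

        @Override
        public void onSensorChanged(SensorEvent event) {
            if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
                System.arraycopy(event.values, 0, accel, 0, 3);
            } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
                System.arraycopy(event.values, 0, magnet, 0, 3);
            }
            // Combine gravity and geomagnetic readings into a rotation matrix,
            // then extract the azimuth (rotation around the vertical axis).
            if (SensorManager.getRotationMatrix(rotationMatrix, null, accel, magnet)) {
                SensorManager.getOrientation(rotationMatrix, orientation);
                float azimuthDegrees = (float) Math.toDegrees(orientation[0]);
                // Heading relative to magnetic north, in degrees.
                Log.d("CompassActivity", "azimuth=" + azimuthDegrees);
            }
        }

        @Override
        public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    }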

Implementation

Suggest and implement a solution based on knowledge from the preceding literature study. The implementation will be described in the final article.
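
One candidate that the implementation could build on, assumed here purely as an illustration rather than the thesis' final method, is a simple complementary filter: the gyroscope's vertical-axis rate is integrated for smooth short-term heading changes, while an absolute reference (compass azimuth or GPS bearing) slowly corrects the accumulated drift. The class name HeadingFilter, its weight constant and its parameter names are hypothetical.

    // Hedged sketch of a complementary filter for heading; not the final design.
    public class HeadingFilter {
        private static final float GYRO_WEIGHT = 0.98f; // assumed trust in the gyro estimate
        private float headingDegrees = 0f;              // fused heading, 0..360 degrees

        /**
         * @param gyroRateZDegPerSec rotation rate around the vertical axis (deg/s)
         * @param dtSeconds          time since the previous gyroscope sample
         * @param referenceDeg       compass azimuth or GPS bearing (deg)
         */
        public float update(float gyroRateZDegPerSec, float dtSeconds, float referenceDeg) {
            // Short term: integrate the gyroscope rate for a smooth heading estimate.
            float gyroEstimate = normalize(headingDegrees + gyroRateZDegPerSec * dtSeconds);
            // Long term: pull towards the absolute reference, using the shortest
            // signed angular difference so the 0/360 wrap is handled correctly.
            float error = normalize(referenceDeg - gyroEstimate + 180f) - 180f;
            headingDegrees = normalize(gyroEstimate + (1f - GYRO_WEIGHT) * error);
            return headingDegrees;
        }

        private static float normalize(float degrees) {
            return ((degrees % 360f) + 360f) % 360f;
        }
    }

A higher GYRO_WEIGHT gives a smoother but more drift-prone heading; the value 0.98 is only an assumed starting point that would have to be tuned during the experiments.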

Experimentation

Test the solution in the Run For Life application and evaluate whether the method gives a direction more authentic than one using only the GPS bearing (or the compass value).
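
For the GPS-bearing baseline in that comparison, the bearing reported by the standard Location object can be logged as sketched below. The listener class BearingLogger is hypothetical, and the GPS bearing is only meaningful while the user is moving.

    // Hedged sketch: logging the raw GPS bearing used as the comparison baseline.
    import android.location.Location;
    import android.location.LocationListener;
    import android.os.Bundle;
    import android.util.Log;

    public class BearingLogger implements LocationListener {
        @Override
        public void onLocationChanged(Location location) {
            if (location.hasBearing()) {
                // Bearing in degrees east of true north, reported only while moving.
                Log.d("BearingLogger", "GPS bearing=" + location.getBearing());
            }
        }
        @Override public void onStatusChanged(String provider, int status, Bundle extras) { }
        @Override public void onProviderEnabled(String provider) { }
        @Override public void onProviderDisabled(String provider) { }
    }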
