Augmented Reality for Open Field Navigation

Topic Description

Augmented reality navigation apps are appearing for a wide range of applications, from museum guides and pedestrian navigation to industrial-scale implementations for ship bridges and aircraft cockpits. However, the basic design principle often seems to be "we can so we shall", with an underlying assumption that it is quite acceptable to force users to change the way they do things. There also seems to be an assumption that technologies that work in one environment, such as short-range pedestrian navigation, will be equally applicable to others, such as small boat navigation at sea. There is often little evidence either of seeking to understand the specific requirements of target user groups or of identifying the performance limitations inherent in specific technologies [3].

The majority of current navigation apps focus on pathfinding (following predefined route segments such as roads and footpaths) through urban environments, often towards predefined points of interest [1,2,5]. These applications combine accurate position determination with three-dimensional models of the local environment to allow registration of the 3D model with the device's camera image.
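As a simplified illustration of what such registration involves, the sketch below projects a landmark's position into the pixel coordinates of a device camera. It assumes an idealised pinhole camera held level and pointing along a known compass heading, with positions in a local east-north-up frame in metres; the function and parameter names are hypothetical, not from any particular AR toolkit.

```python
import math

def project_landmark(landmark_enu, camera_enu, heading_deg, f_px, cx, cy):
    """Project a landmark (east, north, up, in metres) into pixel
    coordinates (u, v) for a level camera pointing along the given
    compass heading, with focal length f_px and principal point (cx, cy).
    Returns None if the landmark is behind the camera."""
    de = landmark_enu[0] - camera_enu[0]
    dn = landmark_enu[1] - camera_enu[1]
    du = landmark_enu[2] - camera_enu[2]
    h = math.radians(heading_deg)
    # Rotate world offsets into the camera frame (x right, y up, z forward).
    x = de * math.cos(h) - dn * math.sin(h)
    z = de * math.sin(h) + dn * math.cos(h)
    if z <= 0:
        return None  # behind the camera: not drawable
    return cx + f_px * x / z, cy - f_px * du / z

# Example: a 50 m landmark 1 km due north, camera facing north.
print(project_landmark((0, 1000, 50), (0, 0, 0), 0, 1000, 640, 360))
# → (640.0, 310.0): centred horizontally, 50 px above the principal point
```

In practice the device pose comes from noisy GPS and compass/IMU readings, which is exactly why the registration errors discussed in [3] and [4] matter: at long range, small heading errors displace the projected overlay by many pixels.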

For open-field navigation, such as on a boat or in open country, the requirements can be more challenging. Not only may there be fewer constraints on direction of travel (few predefined route segments for pathfinding), but the registration process may be much more difficult, primarily because of the distances involved. For example, when seeking to identify the entrance to a harbour, navigation decisions may need to be made at a distance of several miles, from which the whole of the visible coast may subtend significantly less than a degree of vertical arc. This is compounded by the difficulty of stabilising the image on a device that is handheld on a boat moving with the waves. Furthermore, visibility may be reduced to a range comparable with, or even less than, the distance to the coast. And, of course, enhanced position fixing beyond standard satellite technology is not available [4].
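The scale of the angular-subtense problem can be checked with a few lines of small-angle geometry. The figures below (a 50 m coastal feature viewed from 5 nautical miles) are illustrative assumptions, not from the sources above.

```python
import math

def vertical_subtense_deg(height_m: float, distance_m: float) -> float:
    """Angle in degrees subtended by a vertical feature of the given
    height when viewed from the given horizontal distance."""
    return math.degrees(math.atan2(height_m, distance_m))

# A 50 m coastal feature seen from 5 nautical miles (1 nm = 1852 m):
angle = vertical_subtense_deg(50, 5 * 1852)
print(f"{angle:.2f} degrees")  # → 0.31 degrees
```

A feature subtending about a third of a degree occupies only a handful of pixels in a typical phone camera's field of view, which makes both recognition and stable overlay placement genuinely hard.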

These additional complexities open the opportunity to investigate a range of questions surrounding the identification, representation and tagging of landmarks that could be used to help provide visual cues to the intended goal (such as a harbour entrance). This research should address fundamental questions about what influences people's ability to recognise landmarks at a distance, how navigators actually use landmarks, and technical issues about the construction and evaluation of prototype software.

The research will involve the development and evaluation of enhancements to current AR idioms, laboratory investigations, and in-the-wild studies using eye-tracking devices and prototype AR environments. The likely contexts for these studies will include yacht navigation in coastal waters, but may also extend to open-country trekking and potentially to drone piloting.

Skills Required:

  • Background in HCI and software development for mobile devices.
  • An appreciation of the challenges implicit in open field navigation - particularly yacht navigation in coastal waters - would be an advantage.

Background Reading:

[1] Basiri, Winstanley, Amirian 2013, "Landmark-based pedestrian navigation", http://www.geos.ed.ac.uk/~gisteac/proceedingsonline/GISRUK2013/gisruk201...

[2] Basiri, Amirian, Winstanley, Marsh, Moore and Gales 2016, "Seamless Pedestrian Positioning and Navigation Using Landmarks", Journal of Navigation 69 (01) January 2016, pp 24-40 http://dx.doi.org/10.1017/S0373463315000442

[3] Blum, Greencorn and Cooperstock 2013, "Smartphone sensor reliability for augmented reality applications", in Mobile and Ubiquitous Systems: Computing, Networking, and Services (MobiQuitous 2012), pp 127-138.

[4] Bowers, Morse, King, "The challenge of (mis-)identifying landmarks - Augmented reality issues for visual navigation", submitted to PACM Interact. Mob. Wearable Ubiquitous Technol., https://drive.google.com/file/d/1biKSzH3GdQpnqwhrLpc70fWQbHI4eZY8/view?u...

[5] Chan, Baumann, Bellgrove and Mattingley, 2012 "From Objects to Landmarks: The Function of Visual Location Information in Spatial Navigation", Front Psychol. 2012; 3: 304 http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3427909/
