AR User Interfaces

How can we take advantage of both screen space and world space to present, organize, and manipulate information?

  • Team: Self Directed
  • Notable Tools: Blender, Unity, C#
UI pattern displaying the relative scale of real time earthquake data in 3D

2D to 3D: Push To World UI Pattern

Digital screen-space UI is largely constrained to flat layers. Conversely, most Augmented Reality elements float isolated in world space. I wanted to look into ways to take advantage of both. You can read more about this in another post.

To illustrate this, I came up with a ‘push to world’ UI interaction using information that benefited from being shown in 3D, in this case live earthquake data from the USGS. Seeing certain types of data in 3D gives us a better sense of scale and proportion. Seismic magnitudes are measured on a logarithmic scale, so visualizing the information spatially can give us a better sense of the relationships involved.
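To give a sense of why the logarithmic scale matters for the visualization, here is a minimal C# sketch of one way to map magnitudes to visual size. The class and its mapping choice are my own assumptions for illustration, not the project’s actual code; the underlying fact is that each whole-magnitude step corresponds to roughly 32× (10^1.5) more radiated energy.

```csharp
using UnityEngine;

// Hypothetical helper: maps a USGS magnitude to a visual scale factor.
// Each whole-magnitude step represents roughly 32x (10^1.5) more energy,
// so a direct linear mapping would hide most of that range.
public static class MagnitudeScale
{
    // Relative radiated energy compared to a magnitude-0 event.
    public static float RelativeEnergy(float magnitude) =>
        Mathf.Pow(10f, 1.5f * magnitude);

    // A compressed scale (cube root of energy) that still conveys the
    // non-linear jump between, say, M4 and M7 without the largest event
    // dwarfing the scene entirely.
    public static float VisualScale(float magnitude) =>
        Mathf.Pow(RelativeEnergy(magnitude), 1f / 3f);
}
```

With this particular compression, a magnitude 7 event still renders about 32× larger than a magnitude 4 event, which is what makes the scale difference legible in 3D.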

This required some primitive procedural mesh generation and world / screen space management, which I hacked together.
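The two pieces described above can be sketched roughly as follows. This is a minimal assumption-laden version, not the project’s actual implementation: the anchor transforms, component name, and easing choice are all hypothetical, but the Unity mesh and transform APIs are standard.

```csharp
using UnityEngine;

// Sketch of a 'push to world' element: a procedurally built mesh that
// animates from a screen-space anchor out to a world-space anchor.
public class PushToWorld : MonoBehaviour
{
    public Transform screenAnchor; // child of a screen-space rig (assumption)
    public Transform worldAnchor;  // placed in the environment (assumption)
    public float pushDuration = 0.5f;

    float t;
    bool pushed;

    // Primitive procedural mesh generation: a single quad built from
    // explicit vertices and triangle indices.
    public static Mesh BuildQuad(float size)
    {
        var mesh = new Mesh();
        float h = size * 0.5f;
        mesh.vertices = new[]
        {
            new Vector3(-h, -h, 0), new Vector3(h, -h, 0),
            new Vector3(-h,  h, 0), new Vector3(h,  h, 0)
        };
        mesh.triangles = new[] { 0, 2, 1, 2, 3, 1 };
        mesh.RecalculateNormals();
        return mesh;
    }

    public void Push() => pushed = true;

    void Update()
    {
        if (!pushed) return;
        t = Mathf.Clamp01(t + Time.deltaTime / pushDuration);
        // Ease the element from the flat screen layer out into world space.
        transform.position = Vector3.Lerp(
            screenAnchor.position, worldAnchor.position,
            Mathf.SmoothStep(0f, 1f, t));
    }
}
```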

A proof of concept for a UI pattern that shows real time earthquake data.

You can especially get a sense of scale at higher magnitudes.

A proof of concept for a UI pattern that shows earthquake data of high magnitudes

In hindsight, moving toward the camera is too intense, and a ‘recede only’ interaction would be much less jarring. Having other shortcuts to rotate the visualization in world space would let people be more economical with their physical movements.
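A rotation shortcut of the kind mentioned could look something like the sketch below. The input bindings and step size are assumptions; in an actual AR build these would map to gestures or controller input rather than keys.

```csharp
using UnityEngine;

// Sketch: rotate the world-space visualization around its vertical axis
// so the viewer can inspect it without physically walking around it.
public class RotateShortcut : MonoBehaviour
{
    public float degreesPerStep = 30f;

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.LeftArrow))
            transform.Rotate(Vector3.up, -degreesPerStep, Space.World);
        if (Input.GetKeyDown(KeyCode.RightArrow))
            transform.Rotate(Vector3.up, degreesPerStep, Space.World);
    }
}
```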

This project led to other ideas currently in progress.