In what new ways can we interact with information in spatial MxR / AR settings through optically adaptive typography?
Having already experimented with variable fonts, I wanted to explore how AR could enable new kinds of interaction with typography.
Earlier collaborations with Erik van Blokland involved experiments with interpolating type (as a mesh) based on user distance. Erik drew text that accounted for the limitations of a simple set of constructed meshes. Here we explored the concept of signage that could highlight relevant details based on someone’s proximity to it.
I collaborated with CJ Dunn to create a variable font type specimen with a distance-based optical axis. We spent extra time thinking about how best to showcase multiple weights and words in an AR space.
Download it for iOS here.
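The core of a distance-based optical axis is a clamped mapping from camera distance to an axis value. A minimal sketch of that idea in Python, with illustrative ranges rather than the values CJ and I actually shipped:

```python
def optical_size_for_distance(distance_m, near=0.3, far=3.0,
                              axis_min=8.0, axis_max=72.0):
    """Map camera-to-text distance (meters) to an 'opsz' axis value.

    Farther text gets a larger optical size so strokes stay legible.
    All parameter values here are illustrative assumptions.
    """
    # Clamp distance to [near, far] and normalize into [0, 1]
    t = (min(max(distance_m, near), far) - near) / (far - near)
    return axis_min + t * (axis_max - axis_min)
```

In the live specimen this value would be fed to the text renderer every frame as the `opsz` variation setting.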
I received some numerals drawn by David Jonathan Ross and explored panels of text that changed based on distance. Originally we wanted to look into adjusting optical sizes.
After I tried making them larger when the user was farther away (shifting to a more legible optical size), David suggested making them larger when the user was closer instead. That landed us on a panel of numbers that each read out their distance to the user. Walking closer produced an engaging cascading effect.
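Each panel needs two things per frame: the live distance to show, and a scale that grows as the viewer approaches. A hedged sketch of that per-panel update (the function name and constants are hypothetical, not from David's scene):

```python
def panel_label_and_scale(distance_m, base_scale=1.0, min_d=0.5):
    """Return the text a panel shows and how large to draw it.

    The label is the panel's live distance to the viewer; the scale
    grows as the viewer gets closer -- the inverse of the usual
    optical-size move. Values are illustrative.
    """
    d = max(distance_m, min_d)               # avoid blowing up near zero
    scale = base_scale * (min_d / d + 1.0)   # closer -> larger
    label = f"{d:.1f} m"
    return label, scale
```

Because neighbouring panels sit at slightly different distances, their scales update at slightly different rates, which is what produces the cascading effect as you walk toward the wall.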
Distance can also drive a number of visual effects. This concept made use of vertex shaders to create ripples in the text.
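The ripple amounts to displacing each vertex by a sine wave whose phase travels outward from the viewer. In the project this math ran in a vertex shader; here is a CPU-side sketch of the per-vertex displacement, with illustrative parameter values:

```python
import math

def ripple_offset(vertex_dist_to_viewer, time_s,
                  amplitude=0.02, wavelength=0.5, speed=1.5):
    """Per-vertex displacement (meters) for a distance-driven ripple.

    A sine wave travels outward from the viewer across the text mesh;
    amplitude, wavelength, and speed here are assumed values, not the
    shader's actual constants.
    """
    phase = (vertex_dist_to_viewer / wavelength - speed * time_s) * 2.0 * math.pi
    return amplitude * math.sin(phase)
```

In the shader version, this offset is added to each vertex position along the mesh normal, so glyph outlines bend without the text losing its place on the surface.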
Another exploration involved adapting the typeface based on viewing angle for better readability. Thanks to Bianca Berning I was able to use Dalton Maag’s Venn typeface, which features a width axis.
Using a width axis allowed the typeface to remain more readable at wide angles while also staying on the same plane – something billboarding doesn’t solve.
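The compensation is essentially undoing foreshortening: glyphs on a plane viewed at angle θ appear roughly cos(θ) times narrower, so the width axis can scale up by roughly 1/cos(θ) to counter it. A minimal sketch, assuming a 100–200 axis range and a clamp angle that are not Venn's actual values:

```python
import math

def width_axis_for_angle(view_angle_deg, base_width=100.0,
                         max_width=200.0, max_angle_deg=75.0):
    """Counter foreshortening with a variable-font width axis.

    Widens glyphs by ~1/cos(theta) as the viewing angle grows,
    clamped so the axis value stays in range. Axis numbers are
    illustrative assumptions.
    """
    theta = math.radians(min(abs(view_angle_deg), max_angle_deg))
    return min(base_width / math.cos(theta), max_width)
```

Because the glyphs themselves widen, the text stays on its original plane, which is exactly what billboarding (rotating the whole text to face the viewer) can't do.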
From there, I looked at letting the text adapt to two axes of angle, for cases where type lies on a flat surface that people can walk around or rotate. It took a lot of trial and error to get the right angle measurements from the reader’s view of the text.
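One way to measure those two angles is to decompose the direction from the text to the viewer against the text plane's own axes: one angle about the text's "up" axis (side-to-side) and one about its "right" axis (front-to-back). This is a sketch of that measurement under those assumptions, not the project's exact code:

```python
import math

def dot(a, b):
    """Dot product of two 3-vectors given as tuples."""
    return sum(x * y for x, y in zip(a, b))

def view_angles_deg(to_viewer, right, up, normal):
    """Split the text-to-viewer direction into two tilt angles.

    `right`, `up`, and `normal` are the text plane's unit basis
    vectors; `to_viewer` points from the text toward the reader.
    Returns (side_angle, tilt_angle) in degrees; (0, 0) means the
    reader is looking head-on.
    """
    side = math.degrees(math.atan2(dot(to_viewer, right),
                                   dot(to_viewer, normal)))
    tilt = math.degrees(math.atan2(dot(to_viewer, up),
                                   dot(to_viewer, normal)))
    return side, tilt
```

Each angle can then drive its own compensation, e.g. a width axis for side-to-side viewing and a vertical adjustment for tilt.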
I designed a custom variable font with proportional measurements that would sit well within the interface.
I also explored adapting the typeface’s color based on ambient light. Using Sandrine Nugue’s Infini typeface, I set the type’s color to respond to ambient light estimates gathered from ARKit and Unity.
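ARKit's light estimation reports an ambient intensity (around 1000 in a well-lit environment), which can be remapped to a text color so contrast holds up in dim rooms. A hedged sketch of that remapping; the thresholds and colors are assumptions, not the values from the Infini scene:

```python
def text_color_for_light(ambient_intensity, lo=100.0, hi=1000.0,
                         dark=(0.1, 0.1, 0.1), light=(0.95, 0.95, 0.95)):
    """Interpolate an RGB text color from an ambient light estimate.

    In a bright room the text darkens; in dim light it brightens to
    preserve contrast. Thresholds and endpoint colors are illustrative.
    """
    # Clamp the estimate to [lo, hi] and normalize into [0, 1]
    t = (min(max(ambient_intensity, lo), hi) - lo) / (hi - lo)
    # t -> 0: dim room, light text; t -> 1: bright room, dark text
    return tuple(l + t * (d - l) for d, l in zip(dark, light))
```

In Unity this color would be written to the text material each frame from the session's current light estimate.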
Another set of explorations looked into manipulating typography through tangible controls.
Together, these proofs of concept led to wider-reaching hypotheses about the nature of typography in MxR/AR.
You can hear how they contribute to a larger thesis in a talk I gave at Dynamic Font Day, here: “Breaking Boxes: Typography and Augmented Reality”.
A breakdown of applications for distance- and angle-adaptive typography can be found here: “Approaching Spatially Adaptive Type”.