2021 WMU Graphic Design Thesis

The Future of UX/UI in Wearable Tech
Designer: Emma Wiest
Thesis Statement

How might a designer consider, analyze, and design for the problems and opportunities brought by a new, intrusive technology while balancing innovation with the needs and comfort of the user?

Essentially, my project explores the UX/UI of AR/MR-enabled smart contact lenses, so I’m looking closely at the organization, interaction, and interface design of contact lenses that can project an interface onto a tiny display contained within the lens. The environment around the user would still be visible while using the lens, and the interface would be able to recognize and interact with that environment. A user would control the interface with the movement and focus of their eyes, much like a mouse cursor on a computer screen.
Left: dramatization of smart contact lenses. Right: UI in daytime.
Use Case Scenario videos
Audience + Goals
My primary audience is fellow designers, because I want them to start considering how this kind of technology will affect their users and how they can solve the unique problems caused by unique technologies. My secondary audience is people who are interested in this technology and its advancement, because they will ultimately be the early adopters of new tech, and as its first users they will have a say in a product’s functionality.

What does the future of UX/UI look like? With this project, I want to start a conversation, and these are the kinds of questions I’m hoping my audience will start asking themselves and their fellow designers. What are some considerations and implications of different kinds of technology we could be seeing in the near future? How do we adapt the design and organization of technology today to keep up with new software and hardware?
Research
Early in my research, while I was still unsure what the medium I would be designing for would look like, I came across a Fast Company article on these lenses, made by a company called Mojo. They are augmented-reality-enabled smart contact lenses that also enhance vision so the user can focus on something microscopically close to the surface of the eye. This technology became the basis for my UX/UI.
Left and middle: UI AR examples. Right: product projection timeline. Images © Mojo Lens.
When I started thinking about the UX, I decided to replace Mojo's simple traffic-weather-music functionality, generating a huge list of possible functions for these lenses. Ultimately, I decided to go with functionality very similar to that of a smartwatch. The market research for smartwatches already exists, so rather than deciding from scratch what to include, I could pull from the most successful content of smartwatches and fitness trackers and focus on the interface itself. So I took a deep dive into the functionality of smartwatches and other habit-tracking apps, and looked at what could be translated and adapted for a lens.
Functionality from other wearables and apps that influenced my designs. Images © Apple Inc (top and bottom middle), Fabulous (bottom right), and Todoist (bottom left).
Process
One of the biggest problems I set out to solve was how a user would activate, interact with, and deactivate parts of the interface using only their eyes. Mojo Lens uses directional navigation: depending on which periphery you move your eyes into, a different function appears. Moving your eyes into the upper periphery, which basically amounts to a half eye-roll, would activate a weather app, for example. However, as soon as you moved your eyes out of that periphery, the app would disappear. The biggest issue with that approach is how uncomfortable it is to hold your eyes in that position for an extended period of time. So I organized my interface with functions on the top, right, and left peripheries that, when focused on, open a screen in the center of the lens, letting you move your eyes back down while that screen stays active. The screen can then be dismissed either by shifting focus back out to your environment or by looking down to the lower periphery to exit.
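The interaction flow above can be sketched as a tiny state model. This is purely illustrative Python of my own, not code from the thesis or from Mojo; the region names and the dwell-to-activate behavior are assumptions made for the sketch.

```python
# Hypothetical model of the gaze navigation described above: focusing on a
# top/left/right periphery opens a screen that stays active until the user
# explicitly dismisses it by looking down or refocusing on the environment.
# Region and function names are illustrative, not from any real product.

PERIPHERY_APPS = {"up": "dashboard", "left": "settings", "right": "apps"}

class LensUI:
    def __init__(self):
        self.active_screen = None  # no screen open at start

    def focus(self, region):
        """Update UI state for a gaze landing in the given region."""
        if region in PERIPHERY_APPS and self.active_screen is None:
            # Dwelling on a periphery opens its screen in the center of the lens.
            self.active_screen = PERIPHERY_APPS[region]
        elif region in ("down", "environment") and self.active_screen:
            # Looking to the lower periphery (or back out at the world) exits.
            self.active_screen = None
        # Any other gaze (e.g. back to center) leaves the screen active.
        return self.active_screen

ui = LensUI()
ui.focus("up")      # opens the dashboard
ui.focus("center")  # eyes return to a comfortable position; screen persists
ui.focus("down")    # dismisses the screen
```

The key design difference from a purely directional scheme is that the opened screen persists after the gaze leaves the periphery, which is exactly what avoids the sustained half eye-roll.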
Using this directional navigation, I began to map out functions and assign them to locations:
Which I then refined to a final user interaction map:
At this point, I developed user personas to help me get a better idea of the age range I was targeting and what their primary needs would be. Below is a synopsis of each persona, their needs, and how they came into contact with the technology.
After that, I started developing the UI. Below, you can see my original icons and design ideas for screens. One of the biggest UI challenges was designing an interface that would be placed in an unknown and potentially high contrast environment. I started out with an entirely white interface that used hefty drop shadows to bring a sense of depth and contrast to separate the UI from the environment around it. The icons here represent the different app functions, and would appear in their respective periphery for the user to focus on to activate a function. In this iteration of the UX, looking left would activate the settings icon, up for dashboard, right for apps, and down to exit if the user is already within the interface.
With my v1 ready to go, I brought the interface to a few potential users to test. 
User testing was instrumental in my design process and helped me make decisions about usability and aesthetics. Granted, this was not a traditional user test, since I was unable to replicate the exact product experience, but with a combination of traditional screen testing and augmented reality through Adobe Aero, I was able to roughly simulate what it would be like to use the lens.

After my first round of testing, I added a low-opacity background to the screen to solve some contrast issues, which basically functions like wearing low-tint sunglasses while you’re using the interface.

After the second round, I added more navigation features, including a “return to last screen” navigation area that replaced the universal exit point whenever the user was not actively inside the interface. I also switched my typeface from Avenir to Proxima Nova for better legibility in lowercase copy.

I added back arrows to all screens more than one step away from the menu to ease navigation. This is also where I changed my hover state from a simple opacity change to a combined opacity and scale change for increased visibility.

Below is an example of the v1 draft (left) of the UI compared to the final version:
I then had to decide on the best way to communicate my findings and show my interaction design. I decided to create video mockups in Adobe After Effects, which can be viewed at the top of this project. Each video corresponds to one of the user personas. Below are some sample storyboards I created to communicate my vision for the videos:
Results
I learned a lot from this process. It gave me the opportunity to look into the psychology behind wearable technology and how designers are solving those problems in the present day. I also got to examine the philosophy of the UX process beyond the traditional web design steps; I had to really understand it to be able to apply it to this new medium. I enjoyed the challenge of considering the future, and of addressing it with a technology that has such incredible implications for accessibility.

I would love to continue further research on this topic, and within AR/MR technology in general. What other applications, maybe more widespread or relevant to societal issues, could this technology have that I didn’t explore?
Sources
Ariel, Galit. How AR can make us feel more connected to the world | TEDWomen. (2018). https://www.ted.com/talks/galit_ariel_how_ar_can_make_us_feel_more_connected_to_the_world#t-6772

Batchu, Vamsi. Designing better Apple watch apps. (2020). Retrieved 11 February 2021, from https://uxdesign.cc/designing-amazing-apple-wearable-apps-ef941d11a166

Caraban, Ana. The '23 ways to nudge' framework: Designing technologies that influence behavior subtly | ACM Interactions. (2021). Retrieved 19 January 2021, from https://interactions.acm.org/archive/view/september-october-2020/the-23-ways-to-nudge-framework
Davis, Meredith. Bridging digital and physical experiences. (2021). Retrieved 19 January 2021, from https://www.aiga.org/aiga-design-futures/bridging-digital-and-physical-experiences/

Fink, Charlie, et al. Convergence: How the World Will Be Painted with Data. (2019).

Fuzzy Math. UX Design Principles for Wearables. (2015). Retrieved 11 February 2021, from https://fuzzymath.com/wp-content/uploads/2015/08/Fuzzy-Math-UX-Design-Principles-for-Wearables.pdf

MacDowall, Jason. Charlie Fink (Author) on Inspiring the Next Generation of AR/VR Creators — The AR Show. (2020). Retrieved 19 January 2021, from https://www.thearshow.com/podcast/103-charlie-fink

Michaels, Mary. The Apple Watch Case Study | Human Factors International. (2015). Retrieved 11 February 2021, from https://humanfactors.com/downloads/whitepapers/apple_watch_case_study

Moggridge, Bill. Designing Interactions | The MIT Press. (2007).

Norman, Don. The Design of Everyday Things: Revised and Expanded Edition | Basic Books. (2013).

OMG, AR + VR IRL. (2020). Adobe Max Conference 2020. Retrieved 11 February 2021, from https://www.adobe.com/max/2020/sessions/omg-ar-vr-irl-s7001.html

Pimmel, Kim. Creative Ideas for Augmented Reality with Adobe Aero. (2020). Adobe Max. Retrieved 11 February 2021, from https://www.adobe.com/max/2020/sessions/creative-ideas-for-augmented-reality-with-adobe-ae-s6704.html

Quick Look Gallery - Augmented Reality - Apple Developer. (2021). Retrieved 11 February 2021, from https://developer.apple.com/augmented-reality/quick-look/

Razvan, G. Designing A User Experience For Wearable Devices | Usability Geek. (2015). Retrieved 11 February 2021, from https://usabilitygeek.com/wearable-devices-user-experience/

Ross, Philip. Designing Behavior in Interaction: Using Aesthetic Experience as a Mechanism for Design | International Journal of Design. (2021). Retrieved 19 January 2021, from http://www.ijdesign.org/index.php/IJDesign/issue/view/24

Sullivan, Mark. The making of Mojo, AR contact lenses that give your eyes superpowers. (2020). Retrieved 19 January 2021, from https://www.fastcompany.com/90441928/the-making-of-mojo-ar-contact-lenses-that-give-your-eyes-superpowers

Tall, Tidjane. Augmented Reality vs. Virtual Reality vs. Mixed Reality – An Introductory Guide. (2021). Retrieved 19 January 2021, from https://www.toptal.com/designers/ui/augmented-reality-vs-virtual-reality-vs-mixed-reality

Valdivia, Gabriel. Immersive design: the next 10 years of interfaces. (2018). Retrieved 11 February 2021, from https://uxdesign.cc/immersive-design-the-next-10-years-of-interfaces-16122cb6eae6
