NASA Spacesuit User Interface 2022-2023

Named finalists for designing an AR spacesuit information display for lunar exploration.

MY ROLE

Co-lead: project management
Co-lead: UI/UX design
NASA JSC onsite participant

TIMEFRAME

Sept 2022 - May 2023

TOOLS

HoloLens 2
Figma
Unity

Introduction

NASA SUITS (Spacesuit User Interface Technologies for Students) is an annual challenge that invites students to create new interpretations of what an augmented reality heads-up display (HUD) interface for astronauts on future NASA Artemis missions could look like.

Our Team

LEADERSHIP

Jessica Young, Ashley Fan, Michael Wang

FACULTY LIAISON

Michael Lye

DESIGN

Bill Xi, Bryce Yao, Dan Luo, Dong Yoon Shin, Keya Shah, Linlin Yu, Pei-Jung Hsieh, Ryan Lee

DEVELOPMENT

Jamie Chen, Danielle Kim, George Xu, Martin Ma, Julius Beberman

Process Overview

Problem Statement

How might we create an intuitive and clear spacesuit display that helps astronauts navigate and complete tasks on the moon?

3 Main Design Objectives

Visual accessibility

The interface must account for extreme lighting conditions on the Moon by enhancing, not obstructing, the field of vision.

Tactile accessibility

Special consideration must be given to tactile interactions due to limited hand mobility while wearing a spacesuit.

Visual clarity

Task clarity and visual hierarchy are high-priority considerations for minimizing potential accidents.

4 Main Design Features

Egress

Egress marks the beginning of the astronauts’ lunar journey. The interface should ensure proper sequencing and performance of procedures while minimizing the risk of human error.

Navigation

This component should efficiently guide astronauts across the lunar site, help them carry out mission tasks at marked points of interest, and bring them back to the lander safely while closely monitoring terrain anomalies.

Rover Commanding

An astronaut should be able to intuitively and precisely direct the autonomous rover from its original location to a point of interest. There should also be a recall function.

Geological Sampling

Lunar rock sample information, such as sample number, lithology, and timestamp, is collected by an astronaut using an RFID hand tool. The collected information is saved in a dedicated section of our interface.

Final Design

1. Easy-access palm menu

DESIGN OBJECTIVE - TACTILE ACCESSIBILITY

Palm menu [shown on the right]: because of astronauts' limited hand mobility, we made the main menu accessible with a simple motion: flipping your hand.

For a clearer recording, the palm menu functions in the GIFs below are demonstrated by clicking on-screen buttons.

EVA menu opening

Accessing palm menu
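
For readers curious how this could be wired up in Unity, here is a minimal sketch of a palm-flip menu toggle built on MRTK2's HandConstraintPalmUp solver. The object names are placeholders, and our actual project setup differed in the details.

```csharp
using Microsoft.MixedReality.Toolkit.Utilities.Solvers;
using UnityEngine;

public class PalmMenuToggle : MonoBehaviour
{
    [SerializeField] private GameObject menuRoot;              // root of the EVA palm menu UI (placeholder name)
    [SerializeField] private HandConstraintPalmUp palmSolver;  // MRTK2 solver attached to the menu object

    private void OnEnable()
    {
        // Show the menu when a palm-up hand is detected; hide it when the hand flips back down.
        palmSolver.OnHandActivate.AddListener(ShowMenu);
        palmSolver.OnHandDeactivate.AddListener(HideMenu);
    }

    private void OnDisable()
    {
        palmSolver.OnHandActivate.RemoveListener(ShowMenu);
        palmSolver.OnHandDeactivate.RemoveListener(HideMenu);
    }

    private void ShowMenu() => menuRoot.SetActive(true);
    private void HideMenu() => menuRoot.SetActive(false);
}
```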

2. Egress task guidance

DESIGN OBJECTIVE - CLARITY

✦ Procedures on the task list turn green with a check mark once the UIA switch has been confirmed flipped.

✦ Before a user can progress to the next task, a confirmation of "proceed to procedure X of 9" appears.

✦ The task list's EVA information can be accessed at any time from the palm menu [shown in the GIF above].

UIA panel layout

UIA switch toggle & interface response

Kelly Mann testing egress on-site
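
A rough sketch of how a single egress step could be confirmed against telemetry is shown below. The Telemetry helper and its field names are hypothetical stand-ins for the telemetry stream our developers polled, not NASA's actual API.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical stand-in for the telemetry client polled over the network in the real build.
public static class Telemetry
{
    public static bool GetSwitchState(string switchId) => false; // stub for the sketch
}

public class EgressStepMonitor : MonoBehaviour
{
    [SerializeField] private string switchId;   // UIA switch tied to this procedure step
    [SerializeField] private Image checkmark;   // check icon shown once the step is confirmed
    [SerializeField] private Text stepLabel;    // task list entry, e.g. "procedure 3 of 9"

    private bool confirmed;

    private void Update()
    {
        if (confirmed) return;

        // Poll telemetry until the switch is confirmed flipped, then mark the step complete.
        if (Telemetry.GetSwitchState(switchId))
        {
            confirmed = true;
            checkmark.enabled = true;
            stepLabel.color = Color.green;  // procedure turns green on the task list
        }
    }
}
```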

3. Rover Commanding

DESIGN OBJECTIVE - VISUAL ACCESSIBILITY

✦ Click the rover command button to drop a rover point of interest; this directs the rover to begin moving. [demonstrated in the GIF on the right]

✦ The rover return button is found on the palm menu.

Rover on the move!

Dropping rover waypoint & self-centering
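
The sketch below illustrates the waypoint-and-recall flow in Unity-style C#. RoverLink is a hypothetical wrapper around the rover commanding calls made through the challenge server, not the challenge's actual API.

```csharp
using UnityEngine;

// Hypothetical stand-in for the rover commanding interface.
public static class RoverLink
{
    public static void SendTarget(Vector3 target) { /* network call in the real build */ }
    public static void Recall() { /* network call in the real build */ }
}

public class RoverCommander : MonoBehaviour
{
    [SerializeField] private GameObject waypointPrefab;  // marker dropped at the point of interest

    private GameObject currentWaypoint;

    // Called after the rover command button is pressed and the user taps a spot on the terrain.
    public void DropWaypoint(Vector3 worldPosition)
    {
        if (currentWaypoint != null)
        {
            Destroy(currentWaypoint);
        }

        currentWaypoint = Instantiate(waypointPrefab, worldPosition, Quaternion.identity);
        RoverLink.SendTarget(worldPosition);  // the rover begins moving toward the marker
    }

    // Bound to the rover return button on the palm menu.
    public void RecallRover() => RoverLink.Recall();
}
```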

4. Navigation map and compass

DESIGN OBJECTIVE - VISUAL & TACTILE ACCESSIBILITY

The compass stays at the top of the user's field of vision. When it isn't directly needed, it hovers slightly above the field of view; when desired, a slight head tilt is all that is required to bring it into direct line of sight.

✦ Placing a waypoint or hazard marks that specific location on the map. [demonstrated in the GIF on the right]

Placing a waypoint
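
Below is a simplified sketch of the above-the-gaze compass behavior: the compass follows the head but sits a few degrees above the forward gaze, so a small upward head tilt brings it into direct view. The angle and distance values here are illustrative, not the ones we shipped.

```csharp
using UnityEngine;

public class CompassPlacement : MonoBehaviour
{
    [SerializeField] private float distance = 1.5f;         // meters in front of the user (illustrative)
    [SerializeField] private float elevationDegrees = 18f;  // how far above the gaze the compass hovers (illustrative)

    private void LateUpdate()
    {
        Transform head = Camera.main.transform;

        // Rotate the head's forward vector upward so the compass sits just above the field of view.
        Vector3 direction = Quaternion.AngleAxis(-elevationDegrees, head.right) * head.forward;

        transform.position = head.position + direction * distance;
        transform.rotation = Quaternion.LookRotation(transform.position - head.position); // face the user
    }
}
```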

5. Geological sampling session confirmation

DESIGN OBJECTIVE - TACTILE ACCESSIBILITY & CLARITY

✦ Enter or exit a geological sampling session through the palm menu.

✦ Sample rock data is collected using an RFID scanner. [demonstrated in the GIF on the right]

✦ To prevent a session from being ended accidentally, the user must confirm before the session closes and the interface returns to the navigation home screen.

Using RFID scanner on a sample rock
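
The sketch below models the sampling-session flow. The SampleRecord fields mirror the data listed above (sample number, lithology, timestamp), while the class and method names are illustrative rather than our production code.

```csharp
using System;
using System.Collections.Generic;
using UnityEngine;

[Serializable]
public class SampleRecord
{
    public int sampleNumber;    // sample #
    public string lithology;
    public DateTime scannedAt;  // timestamp
}

public class SamplingSession : MonoBehaviour
{
    [SerializeField] private GameObject endSessionDialog;  // "end session?" confirmation UI

    private readonly List<SampleRecord> samples = new List<SampleRecord>();

    // Called when the RFID hand tool reports a scanned sample tag.
    public void OnSampleScanned(int sampleNumber, string lithology)
    {
        samples.Add(new SampleRecord
        {
            sampleNumber = sampleNumber,
            lithology = lithology,
            scannedAt = DateTime.UtcNow
        });
    }

    // Ending a session only opens the confirmation dialog, so a stray tap cannot exit the session.
    public void RequestEndSession() => endSessionDialog.SetActive(true);

    // Called by the dialog's confirm button; the interface then returns to the navigation home screen.
    public void ConfirmEndSession() => endSessionDialog.SetActive(false);
}
```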

Watch our team present our design exit pitch!

Research Key Insights

As the only returning member from the 2020-2022 design team, I had a few preliminary responsibilities before we started this year’s SUITS challenge:

Compile research & interviews

Analyze previous year's design

Main insights

We interviewed specialists in pertinent fields, including astronauts, geological scientists, cartographers, XR specialists, and UI/UX designers. These insights served as research starting points for our 2022-2023 challenge and helped determine what further research the team should conduct.

James H. Newman

Former NASA Astronaut

Steve Swanson

Retired NASA Astronaut

Jim Head

Geological Sciences

James Russell

Earth, Environmental, and Planetary Sciences

Peter H. Schultz

Geological Sciences

Jonathan Levy

Cartographer

Isabel Torron

UX Designer

Alejandro Romero

VR/UX Specialist

Iteration Techniques

We co-leads divided the design team into four sub-teams, each corresponding to one of the four main design features: Egress, Navigation, Rover Commanding, and Geological Sampling. Each sub-team was responsible for the research, iteration, and design of its feature; however, all members worked cohesively as one large team during meetings to share and combine ideas.

My main responsibilities as a co-lead consisted of managing design team meetings, overseeing interface and asset creation, and facilitating effective communication and collaboration between the design and development teams.

Overview of iteration process

Iteration 1

[Research] Discussed features and tested HoloLens capabilities. Reviewed the shortcomings of the 2020-2022 design.

✦ Prototyped using MRTK2 provided toolkits.

Iteration 2

[User testing #1] Modified interface to solve issues in visual readability and feature consolidation.

✦ Developed on the MRTK2 Unity API using Figma Bridge.

Iteration 3

[User testing #2] Modified more features to improve accessibility and clarity even further.

✦ Discovered the shortcomings and overall limitations of MRTK2.

Iteration 4

[User testing #3] Created custom assets to further improve readability and contrast.

✦ Unity processing and latency improvements.

Iteration 5

[HITL field testing #1] Tested whether the color scheme worked well in lunar-like environments.

✦ Worked on issues with the GPS system and the telemetry stream server.

Iteration 6

[HITL field testing #2] Discovered key pain points of the existing prototypes: glitching, icon clarity, etc.

✦ Made sure the MVP could be delivered.

Preliminary user flows

Initial user flows were created with two main ideas in mind:
✦ Incorporating research insights from the sub-teams on which features were plausible to build and which current or emerging technologies we could draw inspiration from.
✦ Iterating on which features to incorporate, based on last year's testing results, NASA's feature requirements, and any additional features we deemed necessary or appropriate to include.

Our whole team ideated together and came to a few initial design decisions: 

Universal Functions

Functions shared across all screens (e.g., compass, main menu) should be cohesive.

Hand triggers

To address limited tactile mobility, the same actions should be triggered by the same hand movements.

Order of Operations

Our design should follow the given sequential order (egress first, etc.) rather than being freeform.

Initial experience testing

✦ Interviewed 5 adults, aged 18-50.
✦ Tested in Figma's desktop prototype mode with scripted interview questions.

Cheeny Celebrado-Royer

Assistant Professor, RISD

Leah Beeferman

Assistant Professor, RISD

Matthew Bird

Senior Critic, RISD ID

Testing insights

Visibility contrast

Overall design needs higher color contrast and larger font sizes.

Icon clarity

Some icons were difficult to identify. Icons should look more cohesive.

Feature consolidation

Merge rover command feature into the navigation map screen.

Revised user flow

Design system

Typography

Color Palette

Icon set

Main components

Human-In-The-Loop [HITL] Field Testing

Interface testing on HoloLens

Tests were conducted in pitch darkness with extreme light sources, on uneven terrain at two local parks.

Design-wise, we tested the accessibility of our interface [mainly color contrast and font size] in harsh lighting conditions and refined our design accordingly.

Test Week! - NASA Johnson Space Center [JSC]

As challenge finalists, our RISD SUITS team was invited to Houston, TX, to present and test our design at the JSC facilities.

ON-SITE TEAM

Jessica Young (team co-lead, UI/UX design)
Michael Wang (team co-lead, operations)
Linlin Yu (UI/UX design)
Danielle Kim (development)
Martin Ma (development)

TESTING DURATION

May 18 2023 - May 23 2023
(1 week)

LOCATION

NASA Johnson Space Center
Houston, TX

Design Testing & Evaluation 1 - June 19

Our on-site testing was conducted at the JSC Rockyard, and our design was evaluated by Skye Ray. Testing was sectioned into three components: briefing, testing, and debriefing.

BRIEFING

Our developers connect our design to the HoloLens and NASA's telemetry stream while the rest of our team briefs our evaluator, Skye Ray, on our design features.

TESTING

Our design has successfully loaded in the HoloLens! Team member Danielle Kim accompanies and guides our NASA evaluator through the egress procedure.

TESTING

Using our navigational features, Skye Ray finds the geological sampling site. An RFID scanner [held in hand] is used to scan rocks and collect geological information.

TESTING

Our evaluator then moves on to the rover command section. Here, he must direct the rover [in the middle of the photo] to and from the designated location using our interface.

DEBRIEFING

Skye Ray shares general observations, pain points, and positives he came across while using our interface. We then use his valuable feedback for our next round of testing!

Post-debrief design revisions

REVISION 1 - VISUAL ACCESSIBILITY

Clarified asset imagery

✦ Edited unclear icons.
‍✦ Created new buttons for newly added features.

REVISION 2 - VISUAL ACCESSIBILITY & CLARITY

Added coordinates to map

✦ Added coordinates to the bottom left of the map to specify which point of interest is being clicked.

REVISION 3 - CLARITY

Egress confirmation UX clarified

✦ Task progress bar added.
✦ “Loading” icons added.
✦ "Proceed to next step” display added.

REVISION 4 - CLARITY

Geo sampling display revised

✦ Revised sample information display.
✦ Created clearer scan completion indicator.
✦ Added confirmation for ending geological sampling session.

Design Testing & Evaluation 2  - June 22

Our next on-site testing was also conducted at the JSC Rockyard. This time, our evaluator was Kelly Mann.

BRIEFING

Our team briefs our new evaluator, Kelly Mann, on our design features. We also equip him with our handmade high-beam light hardware.

BRIEFING

Our team member Danielle Kim teaches Kelly Mann the hand motion that triggers our palm menu feature before testing begins.

TESTING

Our evaluator flips a switch on the provided UIA panel. This action should prompt our interface to confirm the step and proceed with egress.

TESTING

Kelly Mann drops a point of interest and commands the rover to maneuver toward it. The rover begins moving toward its destination.

DEBRIEFING

Our team heads to the debriefing site to meet with Kelly Mann, listen to his feedback, and discuss potential further revisions.

Moving forward

Photo Feature

Geological sampling would benefit from a way to store photos of samples.

Voice Note Recording

Implementing a voice note recording feature could be beneficial for navigation.

Asset differentiation

As Skye Ray pointed out, some haptic and visual assets looked too similar.

Intercom

For easier communication between team members during on-site testing.

Reflection

Multi-disciplinary collaboration

Learning to work cohesively was as complex as it was fun. Coming together weekly to share ideas was exciting, as our diverse backgrounds brought many fresh ideas to the table.

Leadership

Co-leading was an unforgettable experience. I enjoyed learning how to facilitate weekly team work sessions, sharing design feedback, and just having a great time with my team! 

Design: adding in redundancies

Sometimes your design won’t work as intended! We learned that implementing redundancies for functions is a great backup in case a function fails during testing.

Design: AR considerations

The interface presents itself differently, spatially and visually, in Figma than when seen in the HoloLens. In the future, it would be productive to begin HoloLens prototype testing earlier.

Publication & Proposal

Published 2020-2022 MIT Space CHI Paper

As a member of the 2020-2022 SUITS team, I wrote and proofread multiple sections of this paper. It proved significant in how we decided to proceed with certain UI features and UX interactions in the 2022-2023 SUITS challenge.

This paper was accepted to and published at SpaceCHI, MIT's Human-Computer Interaction for Space Exploration conference.

2022-2023 NASA SUITS proposal

Along with the other co-leads, Michael Wang and Ashley Fan, I oversaw the creation of our project proposal. This included researching, writing, guiding, and proofreading sections of the paper.

Our proposal was accepted by NASA, allowing our team to proceed in the challenge as finalists.