IRIS - NASA SUITS Challenge 2024

Student Organization

Collaborative Lab for Advancing Work in Space

Role

UX Design Team Lead

Timeline

October 2023 - May 2024

Platform

Microsoft HoloLens 2 and Web

CLAWS is a top University of Michigan software development project team that competes each year in the NASA SUITS Challenge, building an augmented reality heads-up display for astronauts and a supporting web-based mission control center. For the 2024 challenge, I led the group's 12-member UX Design team in crafting our project, IRIS. In May 2024, I traveled to Houston to test and present the project to NASA at the Johnson Space Center. After the project's successful completion, I was chosen to serve as the team's Project Manager for the following school year.

Final project video submitted to NASA

Overview

NASA SUITS Challenge 2024

NASA SUITS (Spacesuit User Interface Technologies for Students) challenges university teams to build augmented reality spacesuit information displays. These holographic interfaces are intended to inspire NASA as the agency develops technology to assist astronauts exploring the lunar surface during future Artemis missions.

Each year, NASA releases a new SUITS Mission Description, and teams from across the country apply by submitting a detailed proposal outlining their planned applications. NASA selects the top 10 teams based on how well each proposal addresses the mission's challenges. Selected teams have the opportunity to test their applications at the Johnson Space Center's rockyard during SUITS Test Week in May. For the 2024 challenge, the mission was set on Mars, and teams were required to create augmented reality software for astronauts' helmets and a Mars-based local mission control center (LMCC).

CLAWS Team

CLAWS (Collaborative Lab for Advancing Work in Space) is a 90-member software development project team at the University of Michigan that creates advanced space exploration technology. The organization comprises eight subteams: AR, Web, AI, UX Design, Hardware, Research, Business, and Content. CLAWS has competed in the NASA SUITS Challenge since 2018.

My Role

I led CLAWS's 12-member UX Design subteam, overseeing the entire application design process. I spent the first five weeks of the project teaching design members our design systems, AR design principles, and the SUITS project guidelines. Throughout the year, I helped create the interfaces in Figma, working to ensure each feature was feasible to develop, easy to use, and consistent in function and styling. I worked closely with the team's developers so designs were properly implemented, and I communicated with NASA about effectively meeting the mission requirements. When our project was evaluated at the Johnson Space Center rockyard, I guided the NASA design evaluator through using the application.

Project Goals

  • Design an astronaut heads-up information display and Local Mission Control Center (LMCC) interface.

  • Build out cohesive AR and web design systems and work with designers on consistent system use across features.

  • Center the interface design process on maximizing astronaut safety by reducing the cognitive load of mission tasks.

  • Work with the AR, Web, and AI development teams to ensure designs are properly implemented.

  • Administer a user testing program for the developed application, reflecting the mission tasks provided by NASA.

  • Test and present the final application with NASA engineers at the Johnson Space Center in May 2024.

Final Result

AR Interaction Mediums

Based on interviews with NASA astronauts, testing experience from using the team's previous applications, and the environmental conditions of Mars, the design team decided to use voice control and direct touch as the primary interaction mediums for the IRIS AR application.

Direct Touch: Allows the astronaut to use their hands to press buttons, providing a quick, direct, and familiar interaction system. Direct hand control is possible because the thin Martian atmosphere allows for slim-fitting spacesuit gloves. The astronaut can also use the HoloLens's hand-ray system and a pinch gesture to select buttons farther away.

Voice Control: Powered by IRIS's voice assistant, VEGA (Voice Entity for Guiding Astronauts). This system is necessary in addition to touch control because astronauts are often holding tools or working with their hands; if the touch-tracking system fails during the mission, the voice assistant also serves as a crucial fallback. To assist with usability, every button on the interface has a text label instructing the user on what to say to select it. VEGA also accepts complex, multi-step commands, letting the astronaut accomplish desired interface actions more quickly.
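
To illustrate the labeling convention, here is a minimal TypeScript sketch of how a spoken transcript could be matched against on-screen button labels and split into sequential steps. The names and matching rules are hypothetical, chosen for illustration; the real headset app is built in Unity with MRTK3, and VEGA's actual pipeline is more sophisticated.

```typescript
// Hypothetical sketch: match a spoken transcript against on-screen button
// labels, splitting multi-step commands on "then" (e.g. "open map then
// start navigation"). Not VEGA's actual implementation.
interface LabeledButton {
  label: string;        // the text label shown with the button
  activate: () => void; // the interface action the button triggers
}

function handleTranscript(transcript: string, buttons: LabeledButton[]): string[] {
  const log: string[] = []; // actions taken, listed back to the user for awareness
  for (const step of transcript.toLowerCase().split(" then ")) {
    const match = buttons.find((b) => step.includes(b.label.toLowerCase()));
    if (match) {
      match.activate();
      log.push(`Selected "${match.label}"`);
    } else {
      log.push(`No button matched "${step.trim()}"`);
    }
  }
  return log;
}

// Example: "Open map then start navigation" activates both buttons in order.
```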

AR - Main View

The default view of IRIS was designed to be as unobtrusive and streamlined as possible while providing maximum utility to the astronaut.

Main Menu: A six-button menu lines the top of the astronaut's field of view, allowing them to quickly launch apps for the six primary functions of IRIS. The positioning balances being high enough to leave the user's view of the surrounding terrain largely unobscured while keeping the functions quickly accessible by touch, hand ray, or voice.

Notifications: Compact alerts appear in the open space at the top left of the user's view. Each has a standardized format of an icon, title field, and body text, allowing information to be scanned quickly.
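
As a rough illustration of that standardized format, a notification payload might be typed as follows in TypeScript; the field names and severity levels are hypothetical, not IRIS's actual schema.

```typescript
// Hypothetical shape for a standardized IRIS notification.
// Every alert carries the same three visible parts (icon, title, body)
// so information can be scanned quickly.
type NotificationSeverity = "info" | "warning" | "critical";

interface IrisNotification {
  icon: string;                   // icon identifier, e.g. "oxygen" or "comms"
  title: string;                  // short title field
  body: string;                   // one or two lines of body text
  severity: NotificationSeverity; // could drive color and persistence
  timestamp: number;              // ms since epoch, for ordering alerts
}
```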

Mini Map: The map shows the surrounding terrain, nearby waypoints, and the location of the companion astronaut.

Task Widget: Informs the astronaut of their current task and allows them to quickly mark the entry as complete. This frequently utilized element is positioned within the astronaut's immediate reach.

LMCC - Layout

The mission control center utilizes a two-monitor setup. The left screen's call column persistently displays both astronauts' video feeds and accompanying call controls. The other two-thirds of the screen allows the LMCC operator to monitor and control the astronauts' mission, with features accessed through the persistent menu at the top of the screen. The tabbed approach gives the user a focused, full-sized view of each page's content and controls. The right monitor contains a full-sized map showing the location of all mission assets and active routes.

Tasklist

AR: The simple task view was designed to allow the astronaut to scan through all their tasks and subtasks. Selecting any of the task buttons opens a pop-up with the full task description. The progressive disclosure approach was necessary to comply with the HoloLens' large minimum button size requirement and to have the AR experience remain streamlined and focused.

LMCC: The mission control center tasklist view provides the operator with an overview of both astronauts' tasks. The two-column layout lets the operator select any task to view its detailed description. The bar at the top of the page groups all the relevant controls, enabling the creation of new tasks and allowing existing tasks to be edited or deleted.
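
As an illustration of the task data the two views could share, here is a minimal TypeScript sketch of a task record with subtasks and the completion toggle exposed on the AR task widget. The names and fields are hypothetical, not the project's actual data model.

```typescript
// Hypothetical task record shared by the AR tasklist and the LMCC editor.
interface Subtask {
  id: string;
  title: string;
  done: boolean;
}

interface MissionTask {
  id: string;
  astronautId: "eva1" | "eva2"; // which astronaut the task is assigned to
  title: string;                // short label shown on the AR task button
  description: string;          // full text shown in the detail pop-up
  subtasks: Subtask[];
  done: boolean;
}

// Marking a task complete, as the AR task widget's button would.
function markComplete(task: MissionTask): MissionTask {
  return {
    ...task,
    done: true,
    subtasks: task.subtasks.map((s) => ({ ...s, done: true })),
  };
}
```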

Navigation

AR: The navigation screen allows the astronaut to select a waypoint and begin navigation. A large map is also featured, giving the user awareness of nearby companions and waypoints. When a waypoint is selected, the app shows a route preview.

Web: The LMCC view provides the operator with a list of all waypoints and companions. Location details can be quickly viewed in the right details pane after an item is selected from the left list. The web design system's horizontal command bar element is utilized in Navigation, allowing the user to edit a waypoint's details or send a route to AR.

Route Details: The route preview screen provides time and distance details for the selected waypoint. The consumable depletion estimates are key in ensuring the astronaut does not venture too far from the habitat.
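
To make the idea concrete, here is a rough TypeScript sketch of how travel time and oxygen use for a route preview could be estimated. The walking speed, consumption rate, reserve margin, and names are all assumptions for illustration, not IRIS's actual model.

```typescript
// Illustrative route-preview math with assumed constants.
const WALK_SPEED_MPS = 1.1;      // assumed suited walking pace, m/s
const O2_RATE_PSI_PER_MIN = 0.3; // assumed suit oxygen draw while walking

interface RoutePreview {
  etaMinutes: number;     // one-way travel time
  o2RoundTripPsi: number; // oxygen needed to get there and back
  safe: boolean;          // true if remaining oxygen covers the round trip
}

function previewRoute(distanceMeters: number, o2RemainingPsi: number): RoutePreview {
  const etaMinutes = distanceMeters / WALK_SPEED_MPS / 60;
  const o2RoundTripPsi = 2 * etaMinutes * O2_RATE_PSI_PER_MIN;
  // Keep a 25% reserve so the astronaut never cuts the return leg close.
  return { etaMinutes, o2RoundTripPsi, safe: o2RoundTripPsi * 1.25 <= o2RemainingPsi };
}
```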

Navigation Mode: The focused view hides non-vital interface elements, providing a clear view of the surrounding terrain during navigation. A small persistent notification is displayed in the top left of the view for the duration of the journey, providing up-to-date route information. Arrow guides line the path toward the target waypoint.

Geological Sampling

AR: Numbered icons identify rock samples in the astronaut's surroundings and on the mini map. A pop-up menu allows the astronaut to create a new sample and add details. The focused sampling-mode view hides the main menu, letting the astronaut concentrate on the geological sampling process.

Web: The LMCC samples database allows the mission operator to view sample details and data visualizations.

UIA Egress

As the astronaut prepares to leave the airlock, holographic arrows guide them through the UIA Egress procedure to refill the suit's oxygen and water supplies. This is an improvement on designs from previous years, which included only text instructions.

Voice Assistant

VEGA shows a live transcription of the astronaut's spoken command, providing critical confirmation that the speech was heard correctly. VEGA also lists the interface actions it takes, giving the user full awareness of how the interface is being controlled.

Design Systems

The team used Microsoft's MRTK3 (Mixed Reality Toolkit 3) design system for the AR application, with extensive additions and modifications including a custom text-label system and button-alignment tweaks.

We started with Microsoft's Fluent Design system as the base for the LMCC web interface and modified most components to fit the mission control center's needs. This included creating new elements, such as the sidebar list and command bar, that are utilized across most of the web features.
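
For flavor, here is a minimal sketch of how a command bar along those lines could be composed with Fluent UI React. It assumes the LMCC is a React app; the items shown are hypothetical and the real project's components differ.

```typescript
// Hypothetical Navigation command bar built on Fluent UI React.
import * as React from "react";
import { CommandBar, ICommandBarItemProps } from "@fluentui/react";

const items: ICommandBarItemProps[] = [
  { key: "edit", text: "Edit Waypoint", iconProps: { iconName: "Edit" },
    onClick: () => console.log("open waypoint editor") },
  { key: "send", text: "Send Route to AR", iconProps: { iconName: "Send" },
    onClick: () => console.log("push route to the headset") },
  { key: "delete", text: "Delete", iconProps: { iconName: "Delete" },
    onClick: () => console.log("remove waypoint") },
];

export const NavigationCommandBar: React.FC = () => (
  <CommandBar items={items} ariaLabel="Navigation commands" />
);
```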

Research

NASA Astronaut Interviews

The CLAWS UX Design subteam has conducted several interviews with former NASA astronauts and current NASA engineers over the past several years. This experience, along with extravehicular activity details found in space agency archives and information provided directly by NASA, allowed us to construct a detailed persona for an astronaut on a Martian spacewalk mission.

NASA Astronaut on Mars Persona

Segment Description:

NASA astronauts exploring the Martian surface. The priority is maximizing astronaut safety by providing assistive information and situational awareness. The thin Martian atmosphere allows for thin spacesuit gloves, enabling direct touch interactions. Astronauts are often holding tools or using their hands for tasks, and they prefer their view of the surroundings to be as unobscured by interface elements as possible.

Needs:

  • View their current task and mark it as complete.

  • Communicate with mission control over text and voice.

  • Photograph and log details for geological samples.

  • View a vitals dashboard and receive alerts if any values fall out of a desired range (see the sketch after this list).

  • Navigation assistance to waypoints.

  • Guidance through the airlock departure procedure.

  • Quick and hands-free interface operation options.

  • Unobscured view during terrain traversal.
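
As a minimal sketch of that alerting need, the check below flags any reading outside a nominal range. The ranges, field names, and units are assumptions for illustration, not NASA's actual telemetry spec.

```typescript
// Hypothetical vitals range check: return a warning for any value
// that falls outside its nominal range.
interface VitalRange {
  name: string;
  min: number;
  max: number;
  unit: string;
}

const NOMINAL_RANGES: VitalRange[] = [
  { name: "heartRate", min: 50, max: 160, unit: "bpm" },
  { name: "suitPressure", min: 3.5, max: 4.5, unit: "psi" },
  { name: "o2TimeLeft", min: 30, max: Infinity, unit: "min" },
];

function checkVitals(readings: Record<string, number>): string[] {
  const alerts: string[] = [];
  for (const r of NOMINAL_RANGES) {
    const value = readings[r.name];
    if (value === undefined) continue; // no reading yet for this vital
    if (value < r.min || value > r.max) {
      alerts.push(`${r.name} out of range: ${value} ${r.unit} (nominal ${r.min}-${r.max})`);
    }
  }
  return alerts;
}
```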

Analyzing Previous Applications

The team examined SUITS Challenge application designs from previous years, created by CLAWS and by teams from several other universities. By conducting heuristic evaluations and analyzing feedback from testing, we identified common design approaches and recurring issues.

CLAWS NOVA Application (2023)

CLAWS HOSHI Application (2022)

CLAWS ATLAS Application (2021)

Findings

  • Direct touch control is the quickest interaction system, but it cannot be used when astronauts' hands are full.

  • A voice control guide listing out potential commands was very helpful during previous years' rockyard testing.

  • Provide astronauts with multiple interface interaction mediums, especially touch control and voice control.

  • Vitals data visualizations should be organized, and the system state should be simple to understand at a glance.

  • Astronauts prefer when their view of the surrounding terrain is unobscured in order to navigate safely.

  • A mini-map is helpful in allowing the astronaut to know where nearby waypoints and companions are.

Process

Sketches and Low Fidelity

Messages Whiteboard Sketch

Early Vitals AR Layout

3D Map Feature Testing

Throughout the year, I oversaw the team's 12 designers as they worked through the research and design process, ensuring design elements were consistent across the 16 AR and web features that were created for the IRIS project.

To start the design process, UX members interfaced with developers about any technical limitations the designs would need to work around. Afterward, we decided on the specific functionality each feature would have and sketched out potential layouts on paper and whiteboards. The designers then moved into Figma to create wireframes and low-fidelity prototypes.

As the designs progressed, all the designers provided feedback on each other's work. This was invaluable in ensuring the consistent implementation of the design system and patterns across features. When designs were completed, they were handed off to developers with documentation providing implementation notes.

Testing Process

After each feature was developed, the design team members were provided with working prototypes. For the web mission control center, developers shared a link to a testing website; for the headset side of the application, they loaded the prototype onto the HoloLens 2. Testing the interfaces on-device was vital in ensuring designs were implemented as intended and usable in practice.

The design team created a testing script that mirrored the mission we would be performing at the Johnson Space Center’s rockyard during SUITS Test Week. Using this process, we were able to find issues with the application and have the developers make needed adjustments before we had to present in May 2024.

Design Challenges

  • Ensuring the feature details on the AR helmet side and supporting web mission control center side were aligned.

  • Proper implementation of the AR and web design system by twelve different UX design team members.

  • Syncing the design and development timelines so every feature completed development by the end of the year.

  • Working with NASA to set up a testing procedure that accurately simulated the mission and rockyard environment.

Test Week at NASA's Johnson Space Center

Testing at the JSC Rockyard

NASA Design Evaluator Testing our IRIS Application at the Rockyard

Development Leads Coding at the Rockyard

Local Mission Control Center

(Credit: NASA JSC STEM Office)

Over two days, CLAWS tested our IRIS application at the Johnson Space Center Rockyard in Houston, Texas. Each day, a NASA engineer used our headset software and guidance from our Local Mission Control Center to conduct a simulated Martian spacewalk.

Our augmented reality IRIS application proved very successful, helping the design evaluator quickly complete each step of the mission, which included navigating between waypoints, logging rock samples, and using tools to repair an equipment tower. We also received positive feedback on our mission control center interface, with the NASA evaluators praising the easy-to-use design and the large astronaut camera feeds on the display.

Poster Presentation to NASA Engineers at the JSC Cafeteria

CLAWS Team Photo at Test Week (Credit: NASA JSC STEM Office)

Presenting IRIS to NASA Panel and other SUITS Teams

We received feedback on how to improve certain experiences for the next year's challenge. The advice centered on adding more interaction redundancies, simplifying the rock sampling process, and improving software stability. As UX Design Lead, I documented all testing feedback during the evaluation.

The following year, I was appointed to serve as the new CLAWS President and Project Manager. Over the past several months, I have expanded the team to 90 members and implemented changes to the software design and development process to ensure a more polished final product. Our testing timeline has also been greatly extended this year, which will allow us to make design improvements earlier and more efficiently.

Noah Feller
