
Augmented Reality Interface for NASA Lunar Missions

Validated AR technology as a visual, interactive tool that supports and automates major astronaut tasks in space.

Time: 1.5 years

Team: 16-20 Designers & Engineers 

Role: Team Co-Lead (managing both the design and software teams), Design Lead (led design efforts & trained members)

Client: NASA

The Product

AR task tools, information displays, and performance guides.

 

Including guided navigation on the lunar surface, a vitals and suit-controls display, geological sampling recording tools, and a search-and-rescue messaging system.

 

The Challenge

Problem: How can we give astronauts autonomy to complete their tasks in space without the help of Mission Control?  

NASA Spacesuit User Interface Technologies for Students (SUITS) selected our team to develop Augmented Reality (AR) software on the HoloLens.

Milestones


Understanding the user

We talked to AR, voice, and navigation designers; NASA designers; Brown planetary geoscience professors; and astronauts.


1. "The Moon is Difficult to Navigate"

  • Dangerous terrain with tripping hazards.

  • An environment with no distinctive features for landmark recognition.

James H. Newman

Former NASA astronaut 


2. "Space Suit Limits Movements"

  • Whole-body movements are required to accomplish simple tasks.

  • Small hand gestures require a large amount of effort.

Steve Swanson

Retired NASA astronaut 

 

Astronauts' main objectives on the moon

1. Navigation

- Long-range: point A to point B

- Short-range: terrain mapping & obstacle avoidance

2. Geological Sampling

- Taking samples

- Documentation

3. Staying Alive: Vitals

- Vitals and suit status

4. Staying Alive: Rescue

- Emergency navigation & communication

 

Designs

1. Automated Interactions

Navigation Selection

  • Automatically opens the next feature: navigation tools are launched as soon as the route is selected (sketched below).

Initial Design Before Iterations

  • The initial design required more hand activations and caused more physical exhaustion for the astronauts.
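To make the automated interaction above concrete, here is a minimal Unity C# sketch of the idea: confirming a route immediately activates the navigation tools, with no extra hand activation. The component and field names are hypothetical for illustration, not our production code.

```csharp
using System;
using UnityEngine;

// Hypothetical sketch of the "automated interactions" idea: selecting a route
// immediately activates the navigation tools, so no second gesture is needed.
public class RouteSelector : MonoBehaviour
{
    // Raised when the astronaut confirms a destination (e.g., via gaze + pinch).
    public event Action<Vector3> RouteSelected;

    public void ConfirmRoute(Vector3 destination)
    {
        RouteSelected?.Invoke(destination);
    }
}

public class NavigationTools : MonoBehaviour
{
    [SerializeField] private RouteSelector routeSelector;
    [SerializeField] private GameObject pathwayVisuals; // 3D pathway, compass indicator, etc.

    private void OnEnable()  => routeSelector.RouteSelected += LaunchNavigation;
    private void OnDisable() => routeSelector.RouteSelected -= LaunchNavigation;

    // Navigation launches automatically; the astronaut never taps a second button.
    private void LaunchNavigation(Vector3 destination)
    {
        pathwayVisuals.SetActive(true);
        // ...start guidance toward `destination` here...
    }
}
```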

2. Did More With Less

Navigation Mode

  • Designed tools that combine multiple features into one


Mini-map provides short-range and long-range direction but takes up space


Arrow provides short-range direction


Red Direction Indicator added to existing compass to indicate long-range direction


The pathway replaced the arrow and mini-map to provide comprehensive short-range direction.

The pathway also shows approximate closeness to the destination with a color change (a rough sketch of this follows the comparison below).

Initial Design

After Iterations

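A minimal sketch of the proximity color change, assuming the pathway is drawn with a Unity LineRenderer; the field names and the distance threshold are assumptions for illustration, not our actual implementation.

```csharp
using UnityEngine;

// Illustrative sketch: tint the 3D pathway from a "far" color to a "near" color
// as the astronaut approaches the destination.
public class PathwayProximityTint : MonoBehaviour
{
    [SerializeField] private LineRenderer pathway;     // the rendered 3D pathway
    [SerializeField] private Transform user;           // HoloLens camera / astronaut head
    [SerializeField] private Transform destination;
    [SerializeField] private float farDistance = 50f;  // meters at which the path reads as "far"
    [SerializeField] private Color farColor = Color.cyan;
    [SerializeField] private Color nearColor = Color.green;

    private void Update()
    {
        float distance = Vector3.Distance(user.position, destination.position);
        float t = Mathf.InverseLerp(farDistance, 0f, distance); // 0 = far, 1 = at destination
        Color tint = Color.Lerp(farColor, nearColor, t);

        pathway.startColor = tint;
        pathway.endColor = tint;
    }
}
```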

3. Unified Elements

Vitals Display & More

  • Found that more than two interfaces floating and following the user caused spatial confusion.

  • Unified the interfaces into a single switch menu (sketched below).
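A minimal Unity C# sketch of the switch-menu idea: a single controller keeps exactly one panel active at a time, so only one interface ever follows the user. The panel names are illustrative, not our production code.

```csharp
using UnityEngine;

// Hypothetical sketch of the "unified elements" idea: one menu follows the user
// and switches which panel (vitals, suit controls, sampling, messaging) is visible.
public class SwitchMenu : MonoBehaviour
{
    [SerializeField] private GameObject[] panels; // vitals, suit controls, sampling, messaging
    private int activeIndex = 0;

    private void Start() => Show(activeIndex);

    // Hooked up to the menu's tab buttons; only one panel is ever shown.
    public void Show(int index)
    {
        activeIndex = index;
        for (int i = 0; i < panels.Length; i++)
            panels[i].SetActive(i == index);
    }
}
```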

Initial Design

After Iterations


4. Cleared the Field of View

Emergency Messaging & Geological Sampling 

  • Interfaces can obscure the user's field of view. They were redesigned to open partially visible on the left of the viewport, leaving the space straight ahead clear.

Design Drafts

After Iterations


Experimented with interface opening positions to maximize the open space in front of the user.


Testing showed that the edge of the frame doesn't really exist in AR space: an interface that opens to the left and is partially cut off is still perceived by the user as a complete interface. This frees up the space directly in front of the user.
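As a rough illustration of this placement idea, the Unity C# sketch below opens a panel offset toward the user's left rather than dead center; the component name and offset values are assumptions for illustration only.

```csharp
using UnityEngine;

// Illustrative sketch (assumed names, not our actual code): when an interface opens,
// place it offset to the user's left so it sits partially in view and the space
// straight ahead stays clear.
public class OpenToTheLeft : MonoBehaviour
{
    [SerializeField] private Transform head;          // main camera (user's head)
    [SerializeField] private GameObject panel;        // e.g., geological sampling form
    [SerializeField] private float distance = 1.2f;   // meters in front of the user
    [SerializeField] private float leftOffset = 0.6f; // shift toward the left edge of view

    public void Open()
    {
        // Anchor the panel relative to where the user is looking right now,
        // shifted toward the left edge of the field of view.
        Vector3 position = head.position
                         + head.forward * distance
                         - head.right * leftOffset;

        panel.transform.position = position;
        panel.transform.rotation = Quaternion.LookRotation(position - head.position);
        panel.SetActive(true);
    }
}
```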

5. Set Separate Use Cases for Each Plane of Interaction

Divided features into three interaction levels on the HoloLens


  • Registered at a set location in the environment, for guidance while performing a task at that spot.

  • Sticky interface registered in the environment but following the user's view, for decision-making interfaces.

  • Registered to the HUD, following the user's view, for frequently used display tools.
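The three behaviors above roughly correspond to world-anchored, tag-along, and head-locked placement. Below is a plain-Unity C# sketch of those modes for illustration; on the actual HoloLens build this would more likely rely on MRTK solvers or spatial anchors, and the names and smoothing values here are assumptions rather than our shipped code.

```csharp
using UnityEngine;

// Rough sketch of the three placement behaviors described above, in plain Unity.
public class InterfacePlacement : MonoBehaviour
{
    public enum Mode { WorldAnchored, TagAlong, HeadLocked }

    [SerializeField] private Mode mode = Mode.TagAlong;
    [SerializeField] private Transform head;        // main camera (user's head)
    [SerializeField] private float distance = 1.5f; // meters in front of the user
    [SerializeField] private float followSpeed = 3f; // lag for the "sticky" tag-along

    private void LateUpdate()
    {
        switch (mode)
        {
            case Mode.WorldAnchored:
                // Stays exactly where it was placed in the environment; nothing to do.
                break;

            case Mode.TagAlong:
                // Drifts toward a point in front of the user: reachable for
                // decision-making interfaces without being rigidly glued to the view.
                Vector3 target = head.position + head.forward * distance;
                transform.position = Vector3.Lerp(transform.position, target,
                                                  followSpeed * Time.deltaTime);
                // Keep the panel oriented toward the user.
                transform.rotation = Quaternion.LookRotation(transform.position - head.position);
                break;

            case Mode.HeadLocked:
                // Pinned to the HUD for frequently used displays (e.g., vitals readout).
                transform.position = head.position + head.forward * distance;
                transform.rotation = head.rotation;
                break;
        }
    }
}
```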

Initial Design Before Iterations

  • Users can only focus on one distance plane at a time; they miss elements in the far environment while interacting with interfaces in front of them, and vice versa.

Design Process

💡 1. Learned to eliminate

We couldn't do everything. I learned to ask myself and the team: is this function or feature necessary to achieve our goals? If not, we set it aside in our future-features bank.

💡 2. Learned to adopt structures for project management

I found that for this project of 20 members, narrowing the types of decisions each member has to make creates a more efficient and less confusing experience for everyone.

Example: We separated key functions into teams, created a separate user flow for each, and distributed responsibilities by dividing the tasks on those user flows.


💡 3. Learned to implement early and fast!

We learned quickly that too many false assumptions were made when we didn't test our designs on the device early. I adjusted our timeline so that small adjustments were tested on the HoloLens before we refined the design details.


Testing AR by simulating floating interfaces with cut-out paper


Software behaves entirely differently on the HoloLens than it did in Unity.


Figma Mockup

Designs behave entirely differently in Unity than they did in Figma.

 

NASA Testing Recording from the HoloLens

Recovered footage from the HoloLens during testing in the simulated lunar environment at NASA

A week of testing & software iterations at NASA Johnson Space Center


Core leadership team at JSC in Houston! My co-lead and I, with our software leads


NASA evaluator Skye tested our software while performing mock astronaut tasks in the simulated lunar environment.

Working on code well past midnight at the hotel.


Showcasing our designs to NASA engineers


Creating an interface controls panel as a backup after we encountered poor lighting on our first dry run at the test site (the HoloLens doesn't perform well in poor lighting). Thankfully, we didn't have to use it later.


Validations from NASA


Very intuitive and clear user interfaces

"The buttons worked well, I can see the tools on top of each other, and the layout was intuitive. The vitals screen looks clear."

Just the right tools needed

"Having an additional map that opens up is helpful on top of the 3d Line(pathway) "

"The line(3d pathway) was pretty cool, it led me right to the destination."

Fast learning curve and an easily understood structure

"I like having a menu that you can collapse to. Having it on the left just made sense."

 

Moving Forward...

1. Explore AR interactions that we haven't played with! Actions such as pinning an element to a user's body part.

"It would have helped if the map could be pinned to my lap."

2. Design more feedback!

"It would help to have distance markers along the path, and more precise distance summaries during navigation". 

3. Design for more scenarios

"When key consumables from the vitals screen are in low supply, a way to highlight that to the astronauts through a change in color or an alert would be helpful."

Thank You!

Thank you! I am very excited to share this work, and I would welcome any criticism and feedback from you!