The Outpost


Six robots have crash-landed on a beach on a distant planet, and they have one job to do: chill out and socialize while they wait for help to arrive.

Grab your VR headset and visit the far reaches of space in this new Social VR experience where each participant controls a unique procedurally generated robot avatar.

Client

SIGGRAPH 2020

Year

2020

Category

Social VR Storytelling

Live Project

View Now


Background

The Outpost is a remote Social VR experience for 2-6 participants who are completely immersed together as robots on a tropical beach, set in the future on a faraway planet. As a participant, you sip tropical drinks, play games, and chat with other SIGGRAPH attendees in this strange and wondrous refuge.

Process

This project was a collaborative effort of the brilliant minds at the NYU Future Reality Lab, led by Prof. Ken Perlin (the father of Perlin noise in computer graphics!). It was an honor to work under his guidance on this incredible storytelling project alongside the artists in residence and PhD students at the lab. My role focused on designing and implementing an interactive lobby menu that lets participants select their desired spaceship or room for the experience.


GitHub Link: https://github.com/autotoon/TheOutpost
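To give a rough sense of the kind of logic the lobby menu involves, here is a minimal sketch: a list of selectable destinations and a selection handler. The names, types, and callback here are my own illustrative assumptions, not code from the repository above.

// Hypothetical lobby menu model: destinations the participant can pick.
// Structure and names are illustrative assumptions, not the project's actual code.

interface LobbyOption {
  id: string;
  label: string;
  kind: 'spaceship' | 'room';
}

class LobbyMenu {
  constructor(
    private options: LobbyOption[],
    private onSelect: (option: LobbyOption) => void
  ) {}

  // Called when the participant points at an entry and confirms it.
  select(id: string): void {
    const choice = this.options.find(o => o.id === id);
    if (choice) this.onSelect(choice);
  }
}

// Example with two made-up destinations.
const menu = new LobbyMenu(
  [
    { id: 'ship-1', label: 'Spaceship A', kind: 'spaceship' },
    { id: 'beach', label: 'Beach Outpost', kind: 'room' },
  ],
  option => console.log(`Joining ${option.label}`)
);
menu.select('beach'); // logs "Joining Beach Outpost"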


Challenge

The initial menu design featured a watch interface that users could access throughout the Social VR experience: within the lobby, while interacting with other robot participants, and as a time tracker in the spaceship. The intent was to create a seamless, persistent interface, but during testing a major usability issue emerged. Participants struggled to locate the menu intuitively, as they were unaware that they needed to turn their wrist to bring it into view. This caused confusion and required frequent external guidance, disrupting the immersive experience.


Result

Observing user behavior during testing led to a quick and effective solution. I noticed that the first thing participants did upon entering the experience was check their arms to see whether they had transformed into a robot. I leveraged this natural interaction to trigger the menu: whenever users looked at their palms, the menu would automatically appear. This intuitive design eliminated the confusion, making the experience seamless and effortless. I also added an instruction section during scene loading so that participants do not miss this interaction.
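The trigger itself boils down to a simple orientation check: show the menu when the palm roughly faces the headset. The sketch below only illustrates that idea in TypeScript; the helper names, the threshold, and the tiny vector-math layer are my assumptions, not the project's actual implementation (see the GitHub repository for the real code).

// Minimal sketch of a palm-gaze menu trigger (illustrative only).

type Vec3 = { x: number; y: number; z: number };

const sub = (a: Vec3, b: Vec3): Vec3 => ({ x: a.x - b.x, y: a.y - b.y, z: a.z - b.z });
const dot = (a: Vec3, b: Vec3): number => a.x * b.x + a.y * b.y + a.z * b.z;
const normalize = (v: Vec3): Vec3 => {
  const len = Math.sqrt(dot(v, v)) || 1;
  return { x: v.x / len, y: v.y / len, z: v.z / len };
};

// Show the menu when the palm is turned toward the participant's head:
// the palm normal and the hand-to-head direction should roughly align.
function shouldShowPalmMenu(
  palmNormal: Vec3,   // unit normal pointing out of the palm
  handPosition: Vec3, // hand (or wrist) position in world space
  headPosition: Vec3, // headset position in world space
  threshold = 0.7     // cosine of the allowed angle (~45 degrees); an assumed value
): boolean {
  const toHead = normalize(sub(headPosition, handPosition));
  return dot(normalize(palmNormal), toHead) > threshold;
}

// Example: palm facing straight up, headset directly above the hand -> menu shows.
const show = shouldShowPalmMenu(
  { x: 0, y: 1, z: 0 },
  { x: 0, y: 1.2, z: -0.3 },
  { x: 0, y: 1.7, z: -0.3 }
);
console.log(show); // true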

