Adapting a design system to work for the Metaverse

Design systems enable developers and designers to rapidly develop products that are consistent across all platforms. Many existing design standards could be applied directly in the Metaverse, but many other concepts, unique to 3D environments, required a lot of definition.

I have had great success using design systems when developing software products. They enable developers and designers to rapidly develop engaging products that are consistent across all platforms.

I’m advising a company enabling hybrid digital workspaces—including VR environments—and I wondered if we could extend the principles we established in our 2D design system to enable the same productivity boosts we experienced when building for other platforms. We learned that there are many places where existing design standards could be directly applied in the Metaverse. But many other concepts, unique to 3D environments, required a lot of definition.

This article will discuss the lessons we learned when adapting our design system to guide Metaverse design and development.

High-level considerations

2D and 3D experiences both benefit from following specifications based on fundamental design principles. These constraints guide product design and development towards consistent, positive user experiences and guard against overwhelming customization.

Standards are guidelines, not strict mandates, so there were times when the existing standards didn't fit our needs. This was okay and expected. We allowed exceptions when needed and, crucially, we made sure to clearly document each exception so everyone involved could understand why the change was made.

Basing our Metaverse standards on fundamental design principles ensured these experiences would integrate well with our existing products. Our web, mobile, and print experiences are all developed using the same fundamental design language.

Presence

The goal in designing a good Metaverse experience was ensuring the environment felt natural, the user was comfortable in our space, and the user could meaningfully experience the content (instead of escaping to the real world to deal with navigation or visual distractions). The term used in VR for the feeling that you are really there is “presence,” a concept formalized in Mel Slater’s theory of presence.

As Mark Zuckerberg told investors on a recent earnings call:

"The defining quality of the metaverse is presence, which is this feeling that you're really there with another person or in another place. Creation, avatars, and digital objects are going to be central to how we express ourselves, and this is going to lead to entirely new experiences and economic opportunities."

The goal of our Metaverse-related design specification was to establish patterns that ensured the user felt comfortable and could interact with the content in a natural way. We learned that we could use design specifications to reinforce the illusion of self-embodiment and to establish interaction patterns that preserved the illusion of physical interaction.

Which 2D specifications translated directly?

We discovered many of our traditional design standards could be applied directly to 3D environments. After all, the Metaverse is still a visual medium built by code that must be implemented by developers.

Design tokens

We created consistent, human-readable design tokens for various colors, dimensions, and typography specifications. This made the design and development process much simpler because there was a limited number of standard tokens used. These tokens quickly became the language used in our design mock-ups, allowing our team to communicate using consistent language.

By ensuring the design specifications all used the same semantic names, we reduced the likelihood that custom values would be introduced.

YES: AccentColor, SideMargin, Headline4
NO: #bada55, 16px, Montserrat/14px/Bold
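
To make this concrete, here is a minimal sketch of how tokens like these might be expressed in TypeScript. The token names mirror the examples above; the values and structure are illustrative placeholders, not our actual brand definitions.

```typescript
// Illustrative token map: components reference token names and never
// hard-code raw values. The raw values live only here.
export const tokens = {
  color: {
    AccentColor: "#bada55",
  },
  spacing: {
    SideMargin: 16, // px
  },
  typography: {
    Headline4: { family: "Montserrat", size: 14, weight: "bold" },
  },
} as const;

// Usage: element.style.marginLeft = `${tokens.spacing.SideMargin}px`;
```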

Color

We didn’t support every color in the spectrum; instead, we defined a limited color palette with a more manageable number of values. We had success using just a few options tied to specific tokens.

Using a limited number of brand-specific colors drove a consistent aesthetic across all the platforms we supported. This ensured our Metaverse experiences matched our 2D or print media branding.

Since the colors were centralized and consistent, we could easily switch them to rebrand. This control helped keep our applications accessible since we could ensure each color met minimum contrast requirements.
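
A contrast check like this can be automated. The sketch below uses the WCAG 2.x relative-luminance and contrast-ratio formulas; the sample colors are placeholders, the helper names are hypothetical, and the 4.5:1 body-text threshold comes from WCAG's AA level.

```typescript
// Sketch of an automated contrast gate over palette tokens, using the
// WCAG 2.x relative-luminance and contrast-ratio formulas.
function channel(c: number): number {
  const s = c / 255;
  return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
}

function luminance(hex: string): number {
  const [r, g, b] = [1, 3, 5].map((i) => parseInt(hex.slice(i, i + 2), 16));
  return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b);
}

/** WCAG contrast ratio between two "#rrggbb" colors (ranges 1 to 21). */
export function contrast(a: string, b: string): number {
  const [hi, lo] = [luminance(a), luminance(b)].sort((x, y) => y - x);
  return (hi + 0.05) / (lo + 0.05);
}

// Enforce the AA minimum of 4.5:1 for body text, e.g. in a palette test:
console.assert(contrast("#ffffff", "#0a5800") >= 4.5);
```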

However, we discovered that the limited range of colors we used for 2D designs did not necessarily translate for immersive spaces. Metaverse spaces involve lighting and appeal to our intuitive need to grasp the depth of our surroundings. We needed to support a wider range of colors for immersive environments.

Selection of color palettes needed to be done with the intent and mood of a room in mind. For 3D spaces, we used the same basic colors specified in our design system, but allowed designers to adjust their brightness or saturation. This allowed more variation, but kept the overall aesthetic consistent with our branding.
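
A minimal sketch of how that constraint might look in code, assuming colors are handled as HSL values; the clamp bounds are hypothetical, not our real spec values.

```typescript
// Room variants may vary saturation/lightness of a brand color within
// clamped bounds; hue never changes, so the palette stays on-brand.
interface Hsl { h: number; s: number; l: number } // h: 0-360, s/l: 0-100

const clamp = (v: number, lo: number, hi: number) =>
  Math.min(hi, Math.max(lo, v));

export function roomVariant(base: Hsl, dS: number, dL: number): Hsl {
  return {
    h: base.h,                     // hue locked to the brand palette
    s: clamp(base.s + dS, 30, 90), // hypothetical saturation bounds
    l: clamp(base.l + dL, 20, 85), // avoid pure-black/white extremes
  };
}
```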

Lighting in immersive environments sets the mood, so we set minimum and maximum brightness levels to manage it. Lighting impacts text legibility and the app’s overall accessibility, so we watched it closely. We used a narrower range of soft and harsh contrasting and complementary colors to create strong focal points when highlighting certain content.

Typography

The 2D benefits of creating a small number of typography styles translate directly to the Metaverse, and we didn’t change anything. This didn’t directly affect presence, but it did make the development process much simpler.

Spacing

It was important to set our spatial system’s range with a memorable base number and document clear expectations about how it’s used. This resulted in our layouts aligning to a grid, which is visually pleasing to the user.

We used a Base8 system (allowing only dimensions divisible by 8). We chose it because it matches many browsers' base font size of 16px (8x2), and because many popular screen sizes are divisible by 8 on at least one axis.

Base8 measurements are always divisible by 2, so we avoided the scaling issues that produce a 0.5px offset in Base5 systems. Elements with 0.5px offsets display edges that appear blurred due to the antialiasing of that half pixel.
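
A minimal sketch of what a Base8 scale and an on-grid guard might look like; the token names and values are illustrative.

```typescript
// A Base8 spacing scale: every dimension is a multiple of 8.
export const space = {
  Space1: 8,
  Space2: 16,
  Space3: 24,
  Space4: 32,
  Space6: 48,
  Space8: 64,
} as const;

// Lint-style guard that flags off-grid dimensions in designs or code.
export const onGrid = (px: number): boolean => px % 8 === 0;

console.assert(onGrid(space.Space3)); // 24 sits on the grid
console.assert(!onGrid(20));          // 20 would be rejected
```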

Standard for embedded resources (videos and images)

When creating Metaverse environments, customers tended to want irregular shapes for media to fit oddly shaped spaces. We struggled to adjust our 3D spaces to accommodate uniquely shaped media. Standardizing media sizes simplified our designs so that we weren’t trying to fit an infinite number of shapes into our 3D environments.

The simplest way to constrain this was to define a limited number of supported aspect ratios:

  • 9:16 (16:9)
  • 3:4 (4:3)
  • 1:1

Ensuring all media fit one of these limited formats simplified implementation and eliminated a ton of rework since we no longer had to adjust our environments to accommodate media of all sizes.
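
One way to implement this constraint is to snap incoming media to the nearest supported ratio at ingest time; the helper below is an illustrative sketch, not our production logic.

```typescript
// Snap arbitrary media to the nearest supported aspect ratio so the
// 3D environment only ever has to frame three shapes.
const SUPPORTED = [16 / 9, 4 / 3, 1] as const; // plus portrait inverses

export function snapAspect(width: number, height: number): number {
  const r = width / height;
  const target = r >= 1 ? r : 1 / r; // normalize portrait to landscape
  const nearest = SUPPORTED.reduce((best, cand) =>
    Math.abs(cand - target) < Math.abs(best - target) ? cand : best
  );
  return r >= 1 ? nearest : 1 / nearest; // restore orientation
}

snapAspect(1920, 1080); // -> 16/9, already supported
snapAspect(500, 400);   // -> 4/3, letterboxed or cropped to fit
```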

Which specifications were unique to Metaverse design?

We discovered a variety of design categories we used to help establish the user’s presence in the virtual world. This section will introduce the main categories we identified.

Animation

It was critical that animations obeyed physics and moved in a natural way. In practice, this meant objects never moved linearly, since strictly linear motion looks unnatural to the human eye.

The same basic animation principles we used in other media were just as important in our Metaverse designs. We specified that all animations must use easing curves and run for an appropriate duration (generally 200ms to 500ms).
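
As an illustration, a standard cubic ease-in-out curve satisfies this rule. The curve below is one common choice, not a mandate from our spec, and the duration clamp reflects the 200ms to 500ms guidance above.

```typescript
// Duration bounds from the spec above.
const MIN_MS = 200;
const MAX_MS = 500;

/** Cubic ease-in-out: slow start, fast middle, slow stop. */
function easeInOutCubic(t: number): number {
  return t < 0.5 ? 4 * t ** 3 : 1 - (-2 * t + 2) ** 3 / 2;
}

/** Interpolate a value over time with natural (non-linear) motion. */
export function animate(
  from: number, to: number, elapsedMs: number, durationMs: number
): number {
  const d = Math.min(Math.max(durationMs, MIN_MS), MAX_MS);
  const t = Math.min(elapsedMs / d, 1);
  return from + (to - from) * easeInOutCubic(t);
}
```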

In the Metaverse, users perceived unnatural physics much more negatively than they did in web media.

Audio

In Metaverse environments, using audio effectively was a critical part of the user experience.

Spatial

Spatial audio is reactive to the user’s position in space. In short, sound volume is a function of distance. This means the closer a user is to a content panel or another user, the higher the audio volume is.

An effective way to give the user a sense of direction is to make a louder noise come from one direction. As an example: If your friend is standing to your left and speaking to you, your left ear will hear the sound slightly louder and sooner than your right ear. Lack of good spatial audio can make an environment feel flat.

Defining this detail was quite complicated. For instance, to get good 360-degree sound, we needed to consider the shape of the room and “reflect” the sound based on this.
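
On the web, the Web Audio API's PannerNode covers much of this behavior. The sketch below handles distance falloff and directional (HRTF) cues; the room-shape reflections noted above would need additional work (for example, a ConvolverNode with a room impulse response) and are omitted. The parameter values are illustrative.

```typescript
// Distance falloff plus directional cues via the Web Audio API.
const ctx = new AudioContext();
const panner = new PannerNode(ctx, {
  panningModel: "HRTF",     // per-ear timing/level cues give direction
  distanceModel: "inverse", // volume falls off as distance grows
  refDistance: 1,           // full volume within ~1m
  maxDistance: 30,          // effectively inaudible across a large room
  rolloffFactor: 1,
});
// Wire source -> panner -> output, e.g.:
// micStream.connect(panner).connect(ctx.destination);

// Update the speaking avatar's position as it moves; the listener
// (ctx.listener) tracks the local user's camera.
function moveSource(x: number, y: number, z: number) {
  panner.positionX.value = x;
  panner.positionY.value = y;
  panner.positionZ.value = z;
}
```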

Ambient audio

This is a sound that plays quietly throughout the experience to establish and reinforce the mood. As its name suggests, it should add ambiance, not distract from the overall experience. Ambient audio can help avoid unnatural silence when other feedback is not present. We discovered that it is best to fade ambient audio in gradually, rather than blasting the user. When the user muted, we preferred the audio to go silent instantly. We avoided loud audio that impeded hearing other content, and always muted ambient audio when other media was playing.
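
A minimal sketch of that fade-in and instant-mute behavior using a Web Audio GainNode; the three-second ramp and the 0.2 target level are illustrative values.

```typescript
// Fade ambient audio in gradually; cut it instantly on mute.
const audioCtx = new AudioContext();
const ambient = new GainNode(audioCtx, { gain: 0 });
// ambientSource.connect(ambient).connect(audioCtx.destination);

function fadeInAmbient(seconds = 3) {
  const now = audioCtx.currentTime;
  ambient.gain.cancelScheduledValues(now);
  ambient.gain.setValueAtTime(0, now);
  ambient.gain.linearRampToValueAtTime(0.2, now + seconds); // quiet target
}

function muteAmbient() {
  ambient.gain.cancelScheduledValues(audioCtx.currentTime);
  ambient.gain.setValueAtTime(0, audioCtx.currentTime); // instant silence
}
```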

Audio feedback

We used sound triggers to guide the user in immersive environments. For example, when the user closed a door, we played a confirming sound to provide them another level of feedback. This type of feedback was often better than visual methods used in 2D design.

Avatar

How the user is represented in the virtual space directly impacts their presence. When the avatar successfully mimics their real-world movements, the experience gets even better. The user feels more like the avatar when its movement is realistic and intuitive, reinforcing that all-important sense of presence.

Limitations of avatars

Getting an avatar to successfully mimic real-world interactions is very difficult, and when it’s not done correctly, it frustrates users and takes them out of the experience. Poor digital representations can fall into the uncanny valley, giving the user a sense of unease.

If fully representative avatars were not available, an excellent compromise was to use a small profile video or image. This was a natural and comfortable extension to how users regularly represent themselves in video conferencing. We adorned the profile images with informative badges indicating location or mute status.

Navigation

When we started to discuss camera control and navigation, we quickly realized this was an exceedingly complicated subject. In fact, we decided that it needed to be its own specification, not just a section in our design spec.

We decided to define the following details:

Camera perspective

We needed to define which camera perspectives our Metaverse would support: a first-person view, a third-person/dollhouse view, or a 2D overworld map. Each of these perspectives required a lot of detailed description.

Control layout

We needed to define how the user controlled the camera and their movement within the environment. We quickly learned that intuitive controls were important. We discovered there are many different control mechanisms, and the “best” one varies quite a bit between users. For example, we supported standard WASD key commands, but some users preferred to use the arrow keys or the mouse to move. These specifications were further complicated because we could not depend on the user having access to certain controls. For example, mobile devices required virtual joysticks when a keyboard was unavailable, as sketched below.
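
A minimal sketch of that idea: map every physical input onto a small vocabulary of logical actions, so WASD, arrow keys, and a touch joystick all drive the same locomotion code. The names here are hypothetical.

```typescript
// Many physical inputs, one logical action vocabulary.
type Action = "forward" | "back" | "left" | "right";

declare function move(action: Action): void; // the app's locomotion handler

const keyBindings: Record<string, Action> = {
  KeyW: "forward", ArrowUp: "forward",
  KeyS: "back",    ArrowDown: "back",
  KeyA: "left",    ArrowLeft: "left",
  KeyD: "right",   ArrowRight: "right",
};

window.addEventListener("keydown", (e) => {
  const action = keyBindings[e.code];
  if (action) move(action);
});

// A touch UI's virtual joystick would emit the same Action values,
// so the locomotion code never knows which device produced the input.
```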

Locomotion and navigation

Locomotion and navigation were important to ensuring a positive user experience. The ease with which users could move and the motion simulated as they moved were critical to avoiding motion sickness.

Shortest distance

We needed to consider how easy it was for a user to move around our environment. If it was tedious to walk from one side of the room to the other, the user would become frustrated. We defined maximum distances between rooms, and prioritized how the user would flow through our environments to minimize the distance travelled.

Lessons we learned

In adapting our design system for the Metaverse, we learned that many of the fundamental reasons traditional specifications work are universal and translate directly to 3D design and development. This was not surprising: we were building on well-researched, established design patterns, so we expected them to translate well.

We discovered many additional ways to help our users feel present, and we used specifications to establish patterns that ensured our users had positive experiences. When designing experiences for the Metaverse, we discovered that empathy for the user is very important, and it will be critical to the success of products targeting the platform.
