Companion describes an interaction and design approach for future extended reality technology on head-mounted displays. The operating-system prototype explores how future user interfaces could behave in a real, physical space in order to integrate into the environment as well as possible. This includes new approaches to ‘positioning’ options in a space, user-centered functionality with gesture control, a hand-bound menu concept and contemporary approaches in UX and UI regarding system utilisation and visual layout.
This webpage provides a deep dive into the different elements and concepts of companion. The whole system was created as part of my MultiMediaArt master’s graduation project at the Salzburg University of Applied Sciences.
Nominee nextRealityContest 2021 – Young Talent.
To create a haptic XR experience, companion allows the real world to be overlaid one-to-one with its digital mirror image. For this purpose, two points are set in the physical space, which can be targeted with the VR controllers. Once the two points are entered into the system, the digital space is adjusted congruently in position, rotation and scale to the physical environment. Walls, tables and chairs are thus made tangible through their digital counterparts.
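The alignment step can be understood as a two-point similarity transform: a uniform scale, a yaw rotation about the vertical axis and a translation that map the digital reference points onto their measured physical counterparts. The sketch below is an illustrative reconstruction of that idea, not code from the prototype; the function names are invented.

```python
import math

def align_space(d1, d2, p1, p2):
    """Return a function mapping digital-space points onto physical space,
    given two digital reference points (d1, d2) and the two physical points
    (p1, p2) targeted with the controllers. Points are (x, y, z) tuples;
    the rotation is restricted to yaw so the floor stays level."""
    # Horizontal vectors between the reference points in each space.
    dvx, dvz = d2[0] - d1[0], d2[2] - d1[2]
    pvx, pvz = p2[0] - p1[0], p2[2] - p1[2]
    # Uniform scale from the ratio of horizontal distances.
    s = math.hypot(pvx, pvz) / math.hypot(dvx, dvz)
    # Yaw angle that rotates the digital direction onto the physical one.
    yaw = math.atan2(pvz, pvx) - math.atan2(dvz, dvx)
    cy, sy = math.cos(yaw), math.sin(yaw)

    def transform(pt):
        # Translate to the digital origin, rotate about y, scale, re-anchor.
        x, y, z = pt[0] - d1[0], pt[1] - d1[1], pt[2] - d1[2]
        rx, rz = x * cy - z * sy, x * sy + z * cy
        return (p1[0] + s * rx, p1[1] + s * y, p1[2] + s * rz)

    return transform
```

Applied to every vertex of the digital room model, such a transform makes walls and tables line up with their virtual twins.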
The digital overlay also opens up the possibility of manipulation. As an example, companion allows the user to switch between different, predefined environments. On the one hand, this serves as a showcase of the potential of future XR devices: reality can be changed and adapted to the respective preferences of the user. On the other hand, a new paradigm emerges, especially in prototyping. In design and user-experience work, environments can be created that cover the most diverse use cases and serve both as templates and as verification tools.
One UX principle of the presented prototype is a new approach to user-centered design in extended reality environments. To make an easily understandable, usable and scalable proposal, companion takes elements from widely known screen design and merges them with the depth and felt tangibility of a three-dimensional, digital reality. The basic structure of the system consists of two element types.
A handmenu serves as the metaphorical home screen of the entire system. The hands not only serve as the main interaction tool but also carry all other entry points into the companion system. To make the experience soothing but also functional, the handmenu is divided into several sections, which can be accessed through different push and gesture interactions. The visual above shows a simplified handmenu model, which can also be placed in AR.
To display application-specific content, companion uses a window structure. All windows have one thing in common: the multidimensional bar as the central handler of window manipulation. Based on this concept, different window sizes and types can be derived in the future, designed according to content, preferences and concept.
How could UI-elements behave in this future world?
Not a simple question to answer. Companion therefore explores what the positioning of interfaces could look like. Regarding the different positioning and behavior possibilities, the system distinguishes between three versions of automatic interface placement.
This option describes the behavior of the interface directly in the user’s field of view. The distance between the interface window and the head remains constant, and the rotation follows the user. This version is particularly interesting when the user wants to retrieve quick information or immerse themselves completely in the XR world. Testing also showed this option to be the most suitable at window startup.
The window detaches itself from the user’s head and floats in space, similar to a hologram. Nevertheless, the rotation follows the direction of the user’s gaze. In addition, the position can be manipulated by simply grabbing the window and moving it to the desired location. This option is especially useful for multi-window setups or for placing different windows in dedicated locations in the room.
Surfaces surround us. Companion can use these surfaces as interface holders. In this way, the haptic, digital world is populated with various anchors to which the respective windows can jump, driven by the user’s gaze. This not only provides haptic feedback from the real world when pressing UI elements, but also lets the interface move through the space. The content thus remains constantly visible while integrating itself into the environment.
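The three placement behaviours can be summarised as a small per-frame update rule. The mode names, the nearest-anchor simplification and the function signatures below are assumptions made for this illustration; the prototype’s actual logic lives in Unity.

```python
import math

# Illustrative mode constants; names are invented, not from the prototype.
HEAD_LOCKED = "head_locked"        # window keeps a fixed offset to the head
WORLD_FLOATING = "floating"        # window stays put, only rotates to face the user
SURFACE_ANCHORED = "anchored"      # window jumps to a surface anchor near the gaze

def face_user(window_pos, head_pos):
    """Yaw angle (radians) that turns a window toward the user's head."""
    dx = head_pos[0] - window_pos[0]
    dz = head_pos[2] - window_pos[2]
    return math.atan2(dx, dz)

def update_window(mode, window_pos, head_pos, head_forward, distance, anchors=None):
    """One per-frame placement step; returns the new window position."""
    if mode == HEAD_LOCKED:
        # Keep the window at a fixed distance along the gaze direction.
        return tuple(h + distance * f for h, f in zip(head_pos, head_forward))
    if mode == SURFACE_ANCHORED and anchors:
        # Simplified: snap to the anchor nearest the user (the prototype
        # uses the gaze ray instead).
        return min(anchors, key=lambda a: sum((ai - hi) ** 2
                                              for ai, hi in zip(a, head_pos)))
    # Floating: the position is left unchanged; only the rotation
    # (via face_user) follows the user's gaze.
    return window_pos
```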
Furthermore, the prototype allows changing the behavior of surface-positioned windows. Here, a selection between two modes is possible:
The selection of all introduced modes can be made via different interaction types. This multi-way approach is based on the experimental user-centric design philosophy the system explores. Companion supports ‘classic’ button presses. In addition, there are two new action options. On the one hand, the particular mode can be selected with the window-specific multidimensional bar.
Beyond that, companion also offers distance selection of windows directly in the handmenu options. When this component is active, windows can be controlled by direct buttons in the handmenu as well as by dedicated hand gestures.
Companion uses a responsive, user-controlled system to place different anchors directly on any surface. Anchor-specific objects are predefined and oriented especially towards large, flat planes in the room (e.g. tables, walls, doors). The user can create an unlimited number of anchors and place them at the desired locations in the room. An edit mode also offers the option to delete, reposition and – depending on the background color – invert already-set anchors to ensure better contrast. Companion thus offers maximum flexibility and customisability.
The multidimensional bar serves as the central interaction element within companion’s window system. The element acts as an easily understandable control component at the bottom of each content window. By simply grabbing and dragging the element, the user can determine the behavior of the particular window as well as close it permanently.
The manipulated axes are closely modelled on natural human interaction patterns. In addition, when designing the respective directions of movement, attention was paid to the physical-haptic surface, which can block certain axes when interfaces are placed directly on it.
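One possible reading of that axis mapping is sketched below. The action names, the drag threshold and the rule that a surface blocks the depth axis are illustrative assumptions, not values or behaviour taken from the prototype.

```python
def interpret_bar_drag(dx, dy, dz, on_surface=False, threshold=0.15):
    """Map a drag offset of the multidimensional bar (metres) to a window
    action. The 0.15 m threshold and the action names are invented."""
    if on_surface:
        # A physical-haptic surface behind the window blocks the depth axis.
        dz = 0.0
    axes = {"horizontal": dx, "vertical": dy, "depth": dz}
    # The dominant axis of the drag decides which behavior is triggered.
    axis, value = max(axes.items(), key=lambda kv: abs(kv[1]))
    if abs(value) < threshold:
        return "none"
    if axis == "vertical" and value < 0:
        # Dragging the bar down closes the window permanently.
        return "close"
    return f"move_{axis}"
```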
Controlling elements by hand gestures might be the biggest potential of sensor-heavy XR devices creating an extended reality environment. Not only can users navigate the system faster, the interaction also feels far more natural and comes with a flat learning curve.
Companion integrates this new idea of gestures and offers the user an easily explorable way of controlling the system. The baseline concept of the gesture system is a separation of the two hands.
In line with the system elements used, the left-hand gestures control the handmenu itself by recognising the opening and closing of the hand. The right hand – mostly used for direct manipulation of all sorts – only activates gesture detection in specific modes of the system and serves as position tracker and mover.
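The hand-role split can be sketched as a small dispatcher. Curl values (0 = finger fully extended, 1 = fully curled), the threshold and the event names below are assumptions for this sketch, not values from the prototype’s hand tracking.

```python
def hand_is_open(finger_curls, open_threshold=0.25):
    """A hand counts as open when every finger is mostly extended.
    The 0.25 threshold is an invented example value."""
    return all(curl < open_threshold for curl in finger_curls)

def dispatch_gesture(hand, finger_curls, gesture_mode_active=False):
    """Left hand drives the handmenu; the right hand only emits gestures
    while a gesture-enabled mode is active (event names are illustrative)."""
    if hand == "left":
        return "open_handmenu" if hand_is_open(finger_curls) else "close_handmenu"
    if hand == "right" and gesture_mode_active:
        return "manipulate"
    # Outside gesture modes the right hand is only tracked, not interpreted.
    return "ignore"
```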
Probably one of the most striking visual features are the carrier elements of the UI. All are designed with a blurred-glass translucency effect. This is often referred to as glassmorphism and is nowadays seen especially in web design. In a future augmented or extended reality setup, however, there will be the problem of covering or occluding the real world with interface elements. The glassmorphism style makes it possible to use carrier planes for different UI elements and give them a frame without ripping the user out of the “real world” or other virtual environments. The user can perceive what is behind the element and therefore stays a little more connected to the surrounding space.
Until now, we have been used to app icons being hidden behind a two-dimensional screen. With the new medium of XR, however, new potentials and patterns arise that invite us to leave two-dimensional design behind and add depth to even small objects. Companion operates within this space and uses – although often only recognizable on closer inspection – genuinely extruded elements.
Another consideration is the visual layout itself. As a designer, you normally use grid systems to give the layout a distinctive appearance. Companion not only uses a two-dimensional grid invented by Karl Gerstner, but also pushes the grid into the third dimension by adding depth to all large elements. As a result, the UI feels more touchable and real. The grid also enables a high degree of flexibility in the visual design. For example, the columns that run into each other allow layers and UI elements to be placed on top of each other. This is especially interesting in a three-dimensional world, where depth can be considered a new paradigm in this design sphere.
Although we are in a walkable digital world, there are situations in which symbols work better without the third dimension. For this reason, various icons were designed for the entire system, strongly adapted to their respective areas of application in terms of line width and style. Especially with today’s XR HMDs, the hardware is often a limiting factor, e.g. in display sharpness, even in the peripheral area of the field of view. Elements adapt to this limitation – as far as possible – to create a better overall visual image and improve usability.
Companion is a concept that is now tangible through futurological extended reality technology. The process of creation, however, did not take place only in this reality. There were many “classical” applications and methods on the way to the prototype. For example, various creative approaches – design thinking, imaginary brainstorming and various mind maps – were experimented with. Especially here, the still often elusive concept of this new reality became clear.
In addition, approaches outside the 3D program can also be found in the practical conception. For example, sizes and connections of the system were prototyped in an analog, lo-fidelity manner with pen and paper. The hand menu in particular underwent a number of design iterations in order to strike a balance between aesthetics and usability. Furthermore, a dozen low- to high-fidelity prototypes, wireframes and flowcharts were created, on the one hand to show further scalability, and on the other to surface problems in a possible user journey.
My personal drive arose especially from one conviction: mankind is – I am sure – on the verge of the next digital transformation. Extended reality (XR) will play a significant part in this new technological revolution.
Despite all the difficulties in the public discourse, I believe there is enormous potential in extended reality on head-mounted or wearable devices. For example, the new type of medium does away with the fixed screen and centers itself entirely around the user, in the spirit of an even more human-centered design. This type of technology can and will fundamentally change our interaction with the digital sphere and among us humans in the future.
I am convinced that, as a master’s student of MultiMediaArt – with companion as my graduation project – my colleagues and I will significantly shape the content, function and perception of this technology in the coming years. Although the technology is still in its infancy in some parts, now is the time to apply our knowledge of aesthetics, interaction behaviour and design to the development of new products, to create a sustainable, inclusive and natural visual language and interaction that benefits all users. Companion is therefore my personal first step towards getting my hands, as a UX designer, on this new reality and offering a proposal for a solution.
The prototype was created in Unity. Special features are the use of the Universal Render Pipeline to achieve certain effects in the visual design language, as well as the integration of Microsoft’s Mixed Reality Toolkit, which was originally developed for the HoloLens. The current target device, however, is an Oculus Quest 2, mostly running companion via Oculus Air Link.
© Lukas Kröninger, 2021