It’s safe to say that Virtual Reality (VR) has been gathering a lot of momentum lately. There are strong contenders on the high end, such as the HTC Vive and the Oculus Rift, but the growing popularity of VR is also strongly linked to the expanding range of devices that can run VR applications. Google Cardboard and Daydream are prime examples of solutions that take advantage of a standard smartphone and its capabilities to make VR an actual "reality" for the masses. If you own a smartphone that is a couple of years old and can fold a few pieces of cardboard together, you are more than qualified to get started with VR.
At Nuxeo, we have a knack for technology and are always on the lookout for opportunities to explore new possibilities with the latest tools. We can envision VR opening up a new world of possibilities for a richer Digital Asset Management (DAM) experience. In this alternative reality you wouldn’t be limited to the typical two-dimensional point-and-click approach. Imagine being able to look at 3D models of your digital assets rendered in the VR environment! To get there, new interaction concepts for DAM have to be explored and put to the test, and that’s where we came in.
The challenge was to develop a solution with "VR and Digital Asset Management" as the main theme and show it to everyone present at the kickoff.
The rules for the hackathon submissions were very simple and not too restrictive, allowing both prototype and presentation entries. So, even if programming isn’t your thing, you could still share your ideas. This truly was an event for everyone.
For the developers who decided to dive into the subject, we suggested technologies such as the Google VR SDK, which provides a complete set of APIs, tools, and samples for developing VR applications on smartphones that support Google Cardboard and/or Daydream. With that, we got down to work.
With a whole new dimension to explore, new challenges arise when it comes to user experience. Depth perception, spatial audio cues and auxiliary controllers can all blend together to contribute to a richer and more intuitive interactive experience.
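Depth perception, for example, comes from stereo rendering: the scene is drawn twice per frame, once per eye, with the two virtual cameras offset horizontally by the interpupillary distance (IPD), and the brain fuses the resulting parallax into depth. Below is a minimal, SDK-agnostic sketch of that camera offset; the function names and the ~64 mm default IPD are illustrative assumptions, not taken from any particular SDK:

```python
# Minimal sketch: per-eye camera positions for stereo VR rendering.
# The scene is rendered once per eye, with the two virtual cameras
# separated by the interpupillary distance (IPD). Names and the
# 0.064 m default are illustrative, not from any particular SDK.

IPD_METERS = 0.064  # rough average adult interpupillary distance (~64 mm)

def eye_positions(head_pos, right_dir, ipd=IPD_METERS):
    """Return (left_eye, right_eye) camera positions.

    head_pos  -- (x, y, z) position of the head centre
    right_dir -- unit vector pointing to the viewer's right
    """
    half = ipd / 2.0
    left = tuple(h - half * r for h, r in zip(head_pos, right_dir))
    right = tuple(h + half * r for h, r in zip(head_pos, right_dir))
    return left, right

# Head at eye height, looking down -z, with +x to the viewer's right.
left, right = eye_positions((0.0, 1.7, 0.0), (1.0, 0.0, 0.0))
```

Each eye's view matrix is then built from its own position, and the renderer draws the scene into the left and right halves of the screen, one image per eye.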
Beyond the well-known requirements for tricking the human brain into feeling that it is in a real world rather than a simulation (low head-tracking latency, a wide field of view, high image resolution, and a steady frame rate), we are dealing with a considerably different form factor than the traditional 2D canvas, one we are still exploring as we learn to design realistic and immersive user experiences.
Some progress has already been made in identifying those challenges, and if you feel like trying VR development, the Designing for Google Cardboard guide is a good place to start: it covers best practices and patterns not only from a technical point of view but also with physiological considerations in mind.
The hackathon submissions covered a wide range of solutions for browsing, visualization, and user interaction: floating asset grids, panels, and viewports surrounded by immersive worlds, as well as live previews of assets, including 3D models rendered directly in the VR environment. The user interaction approaches were just as diverse, comprising gaze-triggered widgets, Bluetooth controllers, speech recognition, and even patterns such as search by example and direct asset manipulation.
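Gaze-triggered widgets typically rely on dwell selection: a target activates only after the user's gaze ray has rested on it for a fixed period, which avoids accidental "clicks" from a passing glance. Here is a hedged, SDK-agnostic sketch of that pattern; the class and function names, the spherical hotspot shape, and the 1.5-second dwell time are all illustrative assumptions rather than part of any of the submissions:

```python
DWELL_SECONDS = 1.5  # gaze must rest on a widget this long to trigger it

def ray_hits_sphere(origin, direction, center, radius):
    """Ray-sphere intersection test; `direction` must be a unit vector."""
    oc = [c - o for o, c in zip(origin, center)]
    t = sum(d * v for d, v in zip(direction, oc))   # projection along ray
    closest = [o + t * d for o, d in zip(origin, direction)]
    dist2 = sum((c - p) ** 2 for c, p in zip(center, closest))
    return t > 0 and dist2 <= radius * radius

class GazeWidget:
    """A spherical hotspot that fires a callback after a dwell period."""

    def __init__(self, center, radius, on_trigger):
        self.center, self.radius = center, radius
        self.on_trigger = on_trigger
        self.dwell = 0.0  # seconds the gaze has rested on this widget

    def update(self, eye_pos, gaze_dir, dt):
        """Call once per frame with the head pose and frame time `dt`."""
        if ray_hits_sphere(eye_pos, gaze_dir, self.center, self.radius):
            self.dwell += dt
            if self.dwell >= DWELL_SECONDS:
                self.dwell = 0.0
                self.on_trigger()
        else:
            self.dwell = 0.0  # looked away: reset the timer

# Usage: stare at a menu hotspot 2 m in front of the viewer for ~1.67 s.
hits = []
menu = GazeWidget(center=(0.0, 0.0, -2.0), radius=0.5,
                  on_trigger=lambda: hits.append("opened"))
for _ in range(100):  # 100 frames of steady gaze at 60 fps
    menu.update((0.0, 0.0, 0.0), (0.0, 0.0, -1.0), 1.0 / 60.0)
```

In a real application the widget would also render a progress reticle while `dwell` accumulates, so the user can see that a selection is about to happen and look away to cancel it.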
You can see some of these solutions on GitHub:
This event opened up the discussion about VR and its possibilities when combined with DAM. The results and the feedback both exceeded our expectations!
What will the future hold for VR? Could it be the next big thing for DAM? If so, we are certainly ready to rise to the occasion!