June 4, 2019
Since Oculus announced the Quest's shipping date in April, I have been eagerly awaiting the arrival of our headset. Its inside-out approach to tracking is intriguing, and with the prospect of no more umbilical cord attached to my head, there is a lot to be excited about.
Everyone who has tried Virtual Reality with us has been really excited by what the technology is capable of, not only now, but what it has the potential to become. A huge drawback thus far has been the cost of the device, coupled with the need for a beefy gaming computer to run the software. This is why the Quest has everyone, including myself, so excited. Without the need for external computing, the barrier to entry is dramatically reduced.
The plan is to port all of our existing applications to the Quest, without giving up too much of the quality and complexity in our scenes. Given that we are essentially porting to a mobile processor, this is going to be a big challenge.
I have been working with the Oculus Rift and HTC Vive for the past three years using the Unity Engine. It has been great having the performance of a desktop computer powering the headset, but sometimes it is still a struggle to get certain experiences to run smoothly even on those devices. I am going to need to double down on optimization, for both models and code, if I want these applications running smoothly on the Quest.
Before I begin this daunting task, I want to play around with a third-party application, VRidge by Riftcat, which I have been following ever since I got my first Google Cardboard. The software renders the game on your desktop computer and streams it over Wi-Fi to another device, be it a phone in a Google Cardboard or a Gear VR. This is a great solution for running high-performance experiences on cheap, portable headsets.
After playing around with VRidge, I am quite impressed. They managed to include full SteamVR compatibility, including its room-scale boundary system and 6 DOF controllers. That cannot have been an easy accomplishment. Unfortunately, this is clearly still a beta. The tracking was nearly flawless (apart from a little latency, which I can forgive), but the quality of the visual stream was lackluster. It suffered from frequent dropped frames and image compression whenever the Wi-Fi signal degraded even a little. I am going to keep an eye on the project in hopes that they overcome these shortcomings, but for now I can't recommend it to clients as a solution.
It looks like I better open Unity and get porting!
Connect with iTRA to discuss your next project.