
Unistation 2112 Virtual Reality Demo

by DigitalDoyle

 

I’ve been a developer of interactive experiences for decades, and I’ve been fortunate to be involved in many very cool and unusual projects over the years, but I have never been more excited about a technology than I am about virtual reality. Late last year I bought an Oculus Rift and the Touch controllers for the express purpose of getting up to speed with VR development. Virtual reality and its cousin, augmented reality, are going to be the next big technological waves. I’ve been able to ride every major tech wave as it has broken, and I think this one may have the greatest impact on society and industry of them all. We’ll see.

VR development uses many of the same skills as other types of interactive development, so to teach myself the VR-specific things I needed to know, I chose to build a proof-of-concept application: essentially a virtual trade show product demo. For the product, I picked something I know inside and out: a unique computer workstation that I designed, engineered, and fabricated, the ultimate desktop. I call the concept the Unistation; this particular model is the 2112, and it’s where I work every day. I have several models, and you can find out more about them here.

The 2112 has complex geometry and functionality, so building a VR demo that allowed realistic interaction with it was a perfect challenge for quickly ramping up my skills. I used the Unity game engine as my development platform and found it accessible and familiar once I got into it. It’s pretty amazing what you can create with powerful game engines like Unity and Unreal. I intend to learn Unreal next, since it’s good to be able to develop in both.

I am also using the excellent VRTK (Virtual Reality Toolkit) for the VR locomotion and interactions. Very highly recommended! If you’re doing VR development in Unity, you’d do well to check it out: great toolkit, great support, and it dramatically cuts development time. I wrote up some thoughts about VRTK here. One of the many very cool things about VRTK is that you can develop your app once and have it work on both the Oculus Rift and the HTC Vive.
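To give a sense of how little code VRTK requires, here is a minimal sketch, assuming VRTK 3.x, of making an object grabbable; the class name MakeGrabbable is my own, and it assumes a scene that already has the VRTK_SDKManager set up with VRTK_InteractTouch and VRTK_InteractGrab on the controllers:

```csharp
// Sketch (VRTK 3.x): make a GameObject grabbable with the controllers.
// Assumes VRTK_SDKManager, VRTK_InteractTouch, and VRTK_InteractGrab
// are already configured in the scene.
using UnityEngine;
using VRTK;

public class MakeGrabbable : MonoBehaviour
{
    void Awake()
    {
        // A collider and a rigidbody are needed for touch/grab to work.
        if (GetComponent<Collider>() == null) gameObject.AddComponent<BoxCollider>();
        if (GetComponent<Rigidbody>() == null) gameObject.AddComponent<Rigidbody>();

        var interactable = gameObject.AddComponent<VRTK_InteractableObject>();
        interactable.isGrabbable = true; // enable grabbing via the controller's grab button
    }
}
```

In practice you’d usually add VRTK_InteractableObject in the Inspector rather than in code, but the point is the same: one component and a checkbox, and the object is grabbable on both headsets.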

For this demo, I rebuilt most of the 3D models I created years ago when I designed and built the 2112. I updated all the surfaces and textures and learned how Unity triangulates the polygons when you import models (everything becomes triangles). My goal for this phase 1 demo was to get basic functionality up and running: locomotion, being able to reach out and grab and move things, and triggering interactions with the virtual models. I was able to accomplish those goals, as you’ll see in the above video.

In phase 2 I’ll be developing an interactive UI that will let users change the colors and textures of the different parts of the 2112, and maybe even let them take a virtual photo or short video and post it to social media. Being able to change colors and textures and see those changes immediately in a virtual world would be, I think, a very useful capability, and one with applications well beyond this demo.
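The color-and-texture idea maps directly onto Unity’s standard Renderer/Material API. Here is a hypothetical sketch of what the phase 2 customizer might look like; PartCustomizer and its method names are my own, not anything from the actual project:

```csharp
// Hypothetical sketch: swap the color or texture of one part of a model
// at runtime, using Unity's built-in Renderer/Material API.
using UnityEngine;

public class PartCustomizer : MonoBehaviour
{
    public Renderer targetPart; // assign the part's renderer in the Inspector

    public void SetColor(Color color)
    {
        // Accessing .material creates a per-instance copy of the material,
        // so changing it here leaves the other parts untouched.
        targetPart.material.color = color;
    }

    public void SetTexture(Texture newTexture)
    {
        targetPart.material.mainTexture = newTexture;
    }
}
```

A VR UI (buttons or a color palette the user can point at) would simply call SetColor or SetTexture, and the change is visible in the headset on the very next frame.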

I know that many, if not most, people see the potential for VR mainly in entertainment and games, but I think this technology is also perfect for things like architectural visualization, product demonstration, and marketing. Lots of opportunities will be coming down the pike pretty quickly, and I intend to be in a position to meet those needs.

And for those curious about how I recorded my experience in the Oculus Rift HMD for the video above: I used ShadowPlay, a capture application that comes free with the Nvidia GeForce Experience software you normally install to support an Nvidia video card. It works great, but I found that the frame rate of the resulting .mp4 files isn’t a constant 59.94 frames per second. It varies quite a bit, and that causes issues when importing the footage into Adobe After Effects. I got errors when importing some of the files, and when I tried the Interpret Footage option in After Effects, for some unknown reason it shortened the clips and lopped off the end of some of the footage: the last minute or more of a 13-minute capture, for example. But I found that I could conform the footage to 59.94 fps in Premiere, export it at that frame rate, and have it work perfectly in After Effects.
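If you don’t have Premiere handy, the free ffmpeg tool can do a similar conform, re-encoding variable-frame-rate captures to a constant frame rate. A sketch, where capture.mp4 is a placeholder filename:

```
# Re-encode a variable-frame-rate capture to a constant 59.94 fps
# (60000/1001); -vsync cfr duplicates or drops frames to hit the target
# rate, and -c:a copy passes the audio through untouched.
ffmpeg -i capture.mp4 -vsync cfr -r 60000/1001 -c:a copy capture_cfr.mp4
```

The resulting file should then import into After Effects as a normal constant-frame-rate clip.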

I have several ideas for VR games and experiences, and am very much looking forward to bringing them to virtual life.

