How to create your own iOS VR experience.

iOS, a good operating system for visualizing 3D configurators, ships with excellent native frameworks that speed up development; UIKit and SceneKit are just two of the many frameworks used in daily work. Unfortunately, a framework with native support for VR applications is not yet available. In this blog post we explain how we combined existing frameworks into our own VR experience.

Defining the key aspects of VR.

Before we can begin development, we need to understand what VR is.

VR is a computer-generated simulation of a 3D image or environment that a person can interact with in a seemingly real way using special equipment, such as a VR headset with a screen inside. The human senses play a crucial role in making the experience feel realistic.

From this we can conclude that we need something that handles the input and output of our simulated environment, the headset-specific details, and stereo audio rendering. To cover this functionality, we use the Google VR SDK for iOS.

The Google VR SDK is designed for Cardboard, an accessible and affordable VR platform that supports both Android and iOS. It enables immersive VR experiences by combining data from the phone's sensors to predict the user's head position in both the real and virtual worlds. Combined with an easy-to-use VR headset, it is ideal for a great VR experience.

We also need a tool that can process the content in the environment (e.g. models with textures, rigs and animations). Fortunately, iOS has a native 3D framework, SceneKit. It even comes with a WYSIWYG editor for 3D content integrated into Xcode.

Architecture design.

Now that we have our two main components, all that remains is to bring them together. Let's look for similarities.

Both frameworks are built on OpenGL. Google VR exposes its OpenGL context, and SceneKit can render into an existing one. That makes bridging them child's play.
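As a rough sketch of what that bridge looks like (assuming a Swift project with the Google VR SDK set up; exact signatures may vary between SDK versions):

```swift
import SceneKit
import OpenGLES

// Sketch: let SceneKit render into the OpenGL ES context that the
// Google VR SDK draws with. Inside the Cardboard draw callbacks the
// SDK's context is already current, so we can simply pick it up.
let glContext = EAGLContext.current()

// SCNRenderer can draw a SceneKit scene into an existing GL context.
let sceneRenderer = SCNRenderer(context: glContext, options: nil)
sceneRenderer.scene = SCNScene()
```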

Last but not least, we developed our own custom renderer to pass information between the Google VR SDK and native SceneKit. This information covers everything related to head tracking (e.g. the user's position in the virtual world). Here we also define the initializers, programs, shaders and renderers for our custom drawing objects. All of this is done in OpenGL so that it works efficiently alongside native SceneKit.
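A minimal sketch of such a renderer is shown below. GVRCardboardViewDelegate and GVRHeadTransform come from the Google VR iOS SDK (typically pulled in via a bridging header); the exact Swift spellings of the delegate methods are our assumption and may differ per SDK version.

```swift
import SceneKit
import OpenGLES

// Sketch of a custom renderer that sits between the Google VR SDK
// and SceneKit. Google VR types come in via the project's bridging
// header; delegate method spellings may vary between SDK versions.
class VRSceneRenderer: NSObject, GVRCardboardViewDelegate {

    let scene = SCNScene()
    var eyeRenderers: [SCNRenderer] = []  // left eye, right eye, center

    func cardboardView(_ cardboardView: GVRCardboardView,
                       willStartDrawing headTransform: GVRHeadTransform) {
        // The SDK's GL context is current here: create one SceneKit
        // renderer per eye, all sharing the same scene.
        let context = EAGLContext.current()
        eyeRenderers = (0..<3).map { _ in
            let renderer = SCNRenderer(context: context, options: nil)
            renderer.scene = scene
            return renderer
        }
    }

    func cardboardView(_ cardboardView: GVRCardboardView,
                       prepareDrawFrame headTransform: GVRHeadTransform) {
        // Per-frame work (e.g. updating the camera from head tracking)
        // goes here, before the individual eyes are drawn.
    }
}
```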

To recap: the Google VR SDK passes the head tracking information to our custom renderer. The custom renderer forwards this information, along with the video and custom objects, to three SCNRenderer instances: one for each eye, plus one for the center view used in Magic Window mode. Together they render the scene the user experiences and enjoys.
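The per-eye fan-out can look roughly like the method below, which extends the VRSceneRenderer sketch above (it additionally needs QuartzCore for CACurrentMediaTime). The GVREyeType case names are written as in the Objective-C header and may be imported differently in Swift.

```swift
func cardboardView(_ cardboardView: GVRCardboardView,
                   drawEye eye: GVREyeType,
                   with headTransform: GVRHeadTransform) {
    // Map the SDK's eye type onto our renderer array; the names follow
    // the Objective-C header (kGVRLeftEye, kGVRRightEye, kGVRCenterEye).
    let index: Int
    if eye == kGVRLeftEye {
        index = 0
    } else if eye == kGVRRightEye {
        index = 1
    } else {
        index = 2  // center eye, used in Magic Window mode
    }

    // The SDK tells us which part of the screen belongs to this eye.
    let viewport = headTransform.viewport(forEye: eye)
    glViewport(GLint(viewport.origin.x), GLint(viewport.origin.y),
               GLsizei(viewport.size.width), GLsizei(viewport.size.height))

    // Render the shared SceneKit scene for this eye.
    eyeRenderers[index].render(atTime: CACurrentMediaTime())
}
```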

Now comes the magic.

Time to create our world in VR. First we take the Google VR SDK's GVRCardboardView and grab its OpenGL context so we can pass it to our custom renderer.
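In a view controller, this wiring might look like the following sketch (VRSceneRenderer is the custom renderer from above; property names follow the SDK but may differ per version):

```swift
import UIKit

// Sketch: host view controller that owns the Cardboard view and our
// custom renderer.
class VRViewController: UIViewController {

    let vrRenderer = VRSceneRenderer()

    override func viewDidLoad() {
        super.viewDidLoad()

        let cardboardView = GVRCardboardView(frame: view.bounds)
        cardboardView.delegate = vrRenderer   // our custom renderer
        cardboardView.vrModeEnabled = true    // false = Magic Window mode
        view.addSubview(cardboardView)
    }
}
```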

Then we create an SCNScene and assign it to the three SCNRenderers created from the CardboardView's context. Into this SCNScene we can import any model or object via the visual editor (preferably one with a .dae extension).
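Loading the content is then plain SceneKit; the asset path below is just a placeholder, and eyeRenderers is the array from the renderer sketch above.

```swift
// Sketch: load a Collada model from the app bundle and hand the scene
// to each of the three eye renderers.
if let scene = SCNScene(named: "art.scnassets/model.dae") {
    for renderer in eyeRenderers {
        renderer.scene = scene
    }
}
```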

In our custom renderer we defined functions to read an embedded video and to draw a 360-degree sphere onto which the video is projected. The renderer also includes an extension of the GVRHeadTransform class, which passes the head tracking information so we can rotate or move our objects based on the user's current head position.
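A common way to sketch this in SceneKit is to texture the inside of a large sphere with an AVPlayer and to read the head pose from GVRHeadTransform. headPoseInStartSpace is the SDK call; everything else here is an assumption for illustration.

```swift
import SceneKit
import AVFoundation
import GLKit

// Sketch: a 360-degree video sphere. The camera sits at the center,
// so the sphere is mirrored on X to render its inside correctly.
func makeVideoSphere(player: AVPlayer) -> SCNNode {
    let sphere = SCNSphere(radius: 50)
    sphere.firstMaterial?.diffuse.contents = player
    sphere.firstMaterial?.isDoubleSided = true

    let node = SCNNode(geometry: sphere)
    node.scale = SCNVector3(-1, 1, 1)  // flip so the texture faces inward
    return node
}

// Sketch: apply the Cardboard head pose to the SceneKit camera node.
// Depending on the SDK's matrix conventions you may need to invert
// the head pose before using it as a camera transform.
func updateCamera(_ cameraNode: SCNNode, from headTransform: GVRHeadTransform) {
    cameraNode.transform = SCNMatrix4FromGLKMatrix4(headTransform.headPoseInStartSpace())
}
```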

With our 360-degree projected video and imported objects, we can slide our smartphone into a VR viewer, sit back, watch, and enjoy our own virtual world.

What's next?

With the advent of ARKit, we will definitely be looking for native alternatives, as Apple could blow our solution right out of the water if they decide to drop OpenGL support in favor of their own Metal API. But until then, this homegrown solution serves its purpose.

3DMaster