I have a web application that renders a 3D scene using WebGL. I am wondering whether it is possible to show this scene on an Oculus Rift. How difficult is it?
There are two major components to rendering to the Rift: head tracking and distortion.
Distortion is normally accomplished by the Oculus SDK using either OpenGL or Direct3D, but it's possible to implement it in JavaScript. You can see an example of this here. That page uses a pre-baked set of distortion vertices pulled out of the Oculus SDK, suitable for use with the DK1 model.
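For a rough idea of what that distortion step involves, here is a simplified barrel-distortion fragment shader kept as a string in JavaScript, the way you might feed it to a full-screen post-process pass. This is only a sketch: the real SDK uses a pre-computed distortion mesh per eye, and the k1 coefficient below is a made-up placeholder, not an official Rift parameter.

// Simplified barrel distortion for a full-screen quad (sketch only; the real
// Oculus distortion uses a per-eye mesh and chromatic-aberration correction).
var distortionFragmentShader = [
  'uniform sampler2D tScene;  // the rendered eye texture',
  'uniform float k1;          // distortion coefficient (placeholder value)',
  'varying vec2 vUv;          // UVs supplied by a pass-through vertex shader',
  'void main() {',
  '  vec2 c = vUv - 0.5;                       // move origin to the lens center',
  '  float r2 = dot(c, c);                     // squared distance from the center',
  '  vec2 warped = c * (1.0 + k1 * r2) + 0.5;  // push samples outward (barrel)',
  '  gl_FragColor = texture2D(tScene, warped);',
  '}'
].join('\n');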
Head tracking is significantly more difficult, because it requires access to the hardware, or to the runtime that talks to the hardware. Mozilla is working on a set of APIs for accessing head tracking, and possibly for fetching distortion parameters, but they appear far from stable.
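To make that concrete, here is a minimal sketch of polling head orientation through the early experimental API shape those builds exposed (navigator.getVRDevices() and PositionSensorVRDevice). The surface was unstable, so treat the exact names as approximate rather than a stable contract.

// Sketch: read the Rift's head orientation via the experimental WebVR API
// found in early Firefox/Chrome VR builds. The API was a moving target, so
// the names below may not match whatever build you are running.
function startHeadTracking(onOrientation) {
  if (!navigator.getVRDevices) {
    console.warn('No experimental WebVR support in this browser');
    return;
  }
  navigator.getVRDevices().then(function (devices) {
    var sensor = devices.filter(function (d) {
      return d instanceof PositionSensorVRDevice;  // the head tracker
    })[0];
    if (!sensor) { return; }

    (function poll() {
      var state = sensor.getState();               // latest tracker sample
      if (state.orientation) {
        onOrientation(state.orientation);          // quaternion {x, y, z, w}
      }
      requestAnimationFrame(poll);
    })();
  });
}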
Three.js appears to have examples that aim to support the experimental VR APIs, as well as examples that use something called oculus-rest.
Basically it boils down to this: if you want to do VR with JavaScript, you're either going to have to roll your own solution, try to hit a moving target, or have patience.
Overview
While I realize this is an old question, and there's a lot more info out there now, I'm going to post an answer anyway, as the previously posted answers are not up to date.
I've created a basic Plunker that illustrates what you need to do to get three.js, WebVR, and the Oculus Rift working together. Note: I couldn't quite get fullscreen to work under Plunker, but if you run it with an Oculus Rift (OR) under a Firefox Nightly build you should see that head rotation works. You should be able to get the full OR VR experience if you run it outside of Plunker.
I think another good app to refer to is RiftSketch. This is what I first used to learn how to get the OR to work under a browser (this is actually the app that was written by the original poster of this question).
Here are the relevant WebVR snippets that differ from a standard three.js app:
this.controls = new THREE.VRControls(this.camera);  // feeds head-tracking data into the camera
this.effect = new THREE.VREffect(this.renderer);    // renders the scene once per eye for the HMD
this.effect.setSize(this.width, this.height);
this.vrManager = new WebVRManager(this.renderer, this.effect);  // handles switching between normal and VR mode
and in the render function:
this.controls.update();  // pull the latest head pose into the camera

if (this.vrManager.isVRMode()) {
  this.effect.render(this.scene, this.camera);    // stereo render for the HMD
} else {
  this.renderer.render(this.scene, this.camera);  // plain mono render
}
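For context, here is a minimal sketch of how those pieces fit together in a complete page script. It assumes the four support libraries listed in the next section are loaded alongside three.js; the cube scene and the animate name are just illustrative.

// Minimal three.js + WebVR setup (sketch; assumes three.js, VRControls.js,
// VREffect.js, webvr-polyfill.js and webvr-manager.js are already loaded).
var scene = new THREE.Scene();
var camera = new THREE.PerspectiveCamera(75, window.innerWidth / window.innerHeight, 0.1, 1000);
var renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

var controls = new THREE.VRControls(camera);         // head tracking -> camera pose
var effect = new THREE.VREffect(renderer);           // per-eye stereo rendering
effect.setSize(window.innerWidth, window.innerHeight);
var vrManager = new WebVRManager(renderer, effect);  // normal/VR mode switching

camera.position.z = 3;
scene.add(new THREE.Mesh(
  new THREE.BoxGeometry(1, 1, 1),
  new THREE.MeshNormalMaterial()
));

function animate() {
  requestAnimationFrame(animate);
  controls.update();                // pull the latest head pose
  if (vrManager.isVRMode()) {
    effect.render(scene, camera);   // stereo render for the HMD
  } else {
    renderer.render(scene, camera); // plain mono render
  }
}
animate();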
Modules Needed
The other thing you need to do is supply the following four libraries (in addition to three.js):
- VRControls.js
- VREffect.js
- webvr-manager.js
- webvr-polyfill.js
VRControls.js and VREffect.js are available from the three.js library under 'examples/js/controls' and 'examples/js/effects' respectively.
The other two can be obtained from the webvr-boilerplate GitHub repo.
Update: I now recommend you obtain all four libraries from the webvr-boilerplate GitHub repo, as three.js doesn't seem to have the latest versions.
You can choose to access the WebVR API directly as described here, but I think it's much easier to use the support libraries.
Final Words
You basically do not have to deal with the Oculus Rift SDK at all. The only people who need to call the OR SDK API directly are Unity engine developers and Mozilla API developers.
WebVR creates a common API that attempts to present a standardized interface for all HMD devices such as Cardboard, OR, and (presumably, in the future) Samsung, HTC Vive, Leap Motion, et al. If you choose to use VRControls and VREffect, you have an additional layer of API to make it even easier. It's basically just a bunch of boilerplate. In the end I don't think you really gain much understanding about what's going on behind the scenes. You basically just set it up once and never touch it again.
Once you have OR support in place, developing your app is pretty much just like developing any other three.js app.
Instead of using third-party software to get access to the tracking data, I would recommend basing your implementation on the experimental WebVR APIs that are available in custom builds of Firefox and Chrome.
Provided that WebVR, and VR in general, gets enough traction, this is probably the safest bet.
How difficult it would be depends on your experience. That said, I found a library that connects the Rift to the web (assuming that's the direction you're going) that may be of some assistance: Oculus Bridge
From the site: "The goal of this project is to provide a flexible, simple way to access the tracking data and display configuration for the Oculus Rift for use with webGL or any other browser-based content."
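Conceptually, Oculus Bridge pairs a small desktop application with a browser-side JavaScript library and streams the tracking data into the page. The sketch below only shows the general shape of consuming such a stream over a WebSocket; the URL, port, and message format are invented placeholders for illustration and are not Oculus Bridge's actual API, and camera is assumed to be a three.js camera.

// Purely illustrative: consume head-tracking data pushed by a local helper
// application over a WebSocket. The endpoint and message format below are
// invented for this sketch and do NOT match Oculus Bridge's real protocol.
var socket = new WebSocket('ws://localhost:9005');  // placeholder endpoint

socket.onmessage = function (event) {
  var msg = JSON.parse(event.data);
  if (msg.type === 'orientation') {
    // e.g. apply the received quaternion to a three.js camera
    camera.quaternion.set(msg.x, msg.y, msg.z, msg.w);
  }
};

socket.onclose = function () {
  console.warn('Tracking connection lost; is the bridge application running?');
};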