From a development standpoint, dropping the prefab into the scene was a breeze. After some selective renaming of cameras and connecting a few scripts, we were right there in it - riding the Tumbleweed Express. You can see from the footage that it threw off our projectile launch point to a spot above the caboose's turret (probably the result of interaction between our script, which calculates projectiles from a third-person mouse orbit, and the "neck-height" magic of the Oculus prefab) - and of course our reticle disappeared since it's still keyed to the center of the screen, and the GUI is pretty much lost. But the resulting feel? AMAZING. The depth really brings out the beauty of the Tumbleweed world and integrates well with our aiming mechanic (some of us don't want to go back to a mouse). The "aim-with-your-eyes" approach does make it difficult to switch back and forth between the front and back of the train, so we'll get Sneaky Pete working on some sort of lever to bring the turret around faster than your neck can move.
Like many backers, we were disappointed to learn that "Unity Support" means "Unity Support (with Pro License)" but are grateful for the extension of the pro trial to four months. Since upgrading the team to full licenses is somewhere in the range of college tuition or a new car, we experimented a bit with rolling our own dual-camera prefabs. Could we write our own scripts to read the Oculus as a controller and simply output a 1280x800 view from two cameras? Splitting the screen between two cameras is a pretty simple trick: under the camera's Inspector settings, use the Normalized View Port Rect to set the left camera's W to 0.5 with X set to 0, and the right camera's W to 0.5 with X set to 0.5 (i.e. make each camera half as wide as the screen, and have the right camera's view start at the midpoint of the screen). It's easy to see how this could be used in lots of games for handling multi-player on a single screen - we've sketched a quick script version of the setup below. With some key commands to adjust the distance and rotation of the cameras, we were able to bring certain parts of the field of view into focus manually, but getting the whole thing in sync seems to depend on the "fisheye" lens distortion you can see in the increasingly recognizable Oculus-view videos. The difference between the plugin and our roll-our-own dual-camera setup was striking, so much appreciation to the Oculus team for doing the heavy lifting! If you're reading this: thanks for keeping the SDK builds coming - we're hoping you're in talks with Unity to find a solution that will make it possible for indie devs to use it without breaking the bank.
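For anyone who'd rather do the split from a script instead of the Inspector, here's a minimal sketch. The class name and the two camera fields are just placeholders of our own - none of this comes from the Oculus plugin, it's plain Unity:

```csharp
using UnityEngine;

// Minimal sketch: split the screen between two cameras at startup.
// leftEye / rightEye are whatever two cameras you drag in via the Inspector.
public class SplitScreenRig : MonoBehaviour
{
    public Camera leftEye;
    public Camera rightEye;

    void Start()
    {
        // Same idea as the Inspector settings described above:
        // each camera covers half the screen width (W = 0.5), and the
        // right camera's viewport starts at the horizontal midpoint (X = 0.5).
        leftEye.rect  = new Rect(0f,   0f, 0.5f, 1f);
        rightEye.rect = new Rect(0.5f, 0f, 0.5f, 1f);
    }
}
```

Swap the widths for heights (or quarter the rect) and you've got the familiar two- and four-way splits for couch multiplayer.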
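The key-command calibration was equally unglamorous: nudge the two eye cameras apart or together, and toe them in or out, until the stereo pair fuses. Something along these lines - the key bindings and step sizes here are arbitrary placeholders, not settings we'd stand behind:

```csharp
using UnityEngine;

// Sketch of runtime calibration keys for a hand-rolled stereo rig.
// Assumes the two eye cameras are children of a shared rig transform.
public class StereoCalibration : MonoBehaviour
{
    public Transform leftEye;
    public Transform rightEye;
    public float separationStep = 0.005f; // meters per key press (placeholder)
    public float toeInStep = 0.25f;       // degrees per key press (placeholder)

    void Update()
    {
        // Widen or narrow the inter-camera distance.
        if (Input.GetKeyDown(KeyCode.RightBracket)) AdjustSeparation(separationStep);
        if (Input.GetKeyDown(KeyCode.LeftBracket))  AdjustSeparation(-separationStep);

        // Rotate the cameras toward or away from each other.
        if (Input.GetKeyDown(KeyCode.Equals)) AdjustToeIn(toeInStep);
        if (Input.GetKeyDown(KeyCode.Minus))  AdjustToeIn(-toeInStep);
    }

    void AdjustSeparation(float delta)
    {
        leftEye.localPosition  += Vector3.left  * (delta * 0.5f);
        rightEye.localPosition += Vector3.right * (delta * 0.5f);
    }

    void AdjustToeIn(float delta)
    {
        leftEye.localRotation  *= Quaternion.Euler(0f, delta, 0f);
        rightEye.localRotation *= Quaternion.Euler(0f, -delta, 0f);
    }
}
```

This gets individual regions looking right, but without the lens-distortion shader the edges of the view never quite line up - which is exactly the heavy lifting the official plugin does for you.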