Owlchemy Labs Teases New In-Engine MR Tech

Owlchemy Labs, the studio known for the genre-defying game Job Simulator, has cooked up a new way of doing mixed reality that not only promises to be more realistic, but is sure to grab the attention of VR streamers and content creators alike. They're calling it 'Depth-based Realtime In-app Mixed Reality Compositing'. It sounds complex, but it seems to simplify the entire production pipeline.
 
Green screen VR setups have littered expos ever since Northway Games teased mixed reality integration in Fantastic Contraption earlier this year. Requiring little more than a green sheet, an external camera, and a few different bits and bobs (Northway published a step-by-step guide), the results are easy to see:

The video above, however, is the result of extensive polishing and after effects like rotoscoping to correctly occlude items, making it appear that the player is in 3D space instead of flatly sandwiched between the foreground (the contraption) and the background (the virtual environment).

Owlchemy Labs recently teased a new in-engine method of putting you in the middle of the action, correctly occluded, which promises to eliminate extra software like Adobe After Effects or compositing software like OBS from the equation.
 
They do it by using a stereo depth camera to record video and depth data simultaneously, then feeding the stereo data in real time into Unity via a custom plugin and a custom shader that cuts out and depth-sorts the user directly in the engine renderer. This method requires you to replace your simple webcam with a 3D camera like the ZED 2K stereo cam, a $500 dual RGB camera setup that importantly doesn't use infrared sensors (like Kinect), which can interfere with VR positional tracking. But if you're pumping out mixed reality VR footage daily, the time savings (and the admittedly awesome-looking results) may be worth the initial investment.
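To illustrate the general idea (this is not Owlchemy's actual plugin or shader, which haven't been released), here is a minimal sketch of depth-based compositing: each camera pixel is kept or discarded by comparing its measured depth against the virtual scene's depth buffer, so the player can both occlude and be occluded by in-game geometry. The function, array layouts, and depth cutoff are assumptions for illustration only.

```python
import numpy as np

def composite_depth_based(camera_rgb, camera_depth, scene_rgb, scene_depth,
                          max_person_depth=3.0):
    """Composite a real camera feed into a rendered scene by comparing depths.

    camera_rgb   : (H, W, 3) color frame from the stereo depth camera
    camera_depth : (H, W) per-pixel depth in meters from that camera
    scene_rgb    : (H, W, 3) rendered virtual scene, aligned to the camera view
    scene_depth  : (H, W) virtual scene depth buffer in meters
    max_person_depth : assumed cutoff (hypothetical) that crops away the
                       real-world background behind the player
    """
    # "Cutout" step: keep only camera pixels close enough to plausibly be
    # the player (a real pipeline would also denoise and feather edges).
    is_player = camera_depth < max_person_depth

    # "Depth sort" step: the camera pixel wins only where it is nearer than
    # the virtual geometry, so in-game objects correctly occlude the player.
    camera_in_front = camera_depth < scene_depth

    show_camera = (is_player & camera_in_front)[..., None]
    return np.where(show_camera, camera_rgb, scene_rgb)
```

In a real-time setup this comparison would run per pixel on the GPU inside the engine's shader, but the occlusion logic is the same.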
 
Owlchemy says you’ll be able to capture footage with either static or full-motion tracked cameras, and do it all from one computer. Because the method doesn’t actually require a VR headset or controllers, you can technically capture a VR scene with multiple, non-tracked users.
 
“Developing this pipeline was a large technical challenge as we encountered many potentially show-stopping problems, such as wrangling the process of getting 1080p video with depth data into Unity at 30 FPS without impacting performance such that the user in VR can still hit 90 FPS in their HMD,” writes Owlchemy. “Additionally, calibrating the camera/video was a deeply complicated issue, as was syncing the depth feed and the engine renderer such that they align properly for the final result. After significant research and engineering we were able to solve these problems and the result is definitely worth the deep dive.”
 
The studio says it still needs more time to complete the project, but they “have plans in the works to be able to eventually share some of our tech outside the walls of Owlchemy Labs.” We’ll be following their progress to see just how far-reaching it becomes.
