With ARKit on iPhones, it is possible to track both the real-world position and rotation of the device. Using a prototyping tool called ZigSim (https://1-10.github.io/zigsim/), I am able to send the information from the iPhone's ARKit over the network via OSC. The data is brought into TouchDesigner for review and conversion (i.e. quaternion rotation to degrees, etc.), then piped over to Blender. The information from TD is connected directly to the virtual camera's own parameters for XYZ location and rotation.
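For reference, here is a minimal sketch of what the Blender end of that pipe can look like. This assumes python-osc is installed in Blender's Python and that the OSC Out CHOP in TD is sending to port 9000; the /cam/pos and /cam/rot addresses and the channel order are placeholders for whatever your CHOP channels are actually named, not something ZigSim or TD dictates:

```python
# Sketch: receive OSC from TouchDesigner and drive Blender's camera with it.
# Assumptions: python-osc is available in Blender's Python, TD sends to port
# 9000, and "/cam/pos" (x, y, z in metres) and "/cam/rot" (degrees, already
# converted from quaternions in TD) are hypothetical address patterns.
import threading
from math import radians

import bpy
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import ThreadingOSCUDPServer

latest = {"pos": (0.0, 0.0, 0.0), "rot": (0.0, 0.0, 0.0)}

def on_pos(address, x, y, z):
    latest["pos"] = (x, y, z)

def on_rot(address, rx, ry, rz):
    latest["rot"] = (rx, ry, rz)

dispatcher = Dispatcher()
dispatcher.map("/cam/pos", on_pos)
dispatcher.map("/cam/rot", on_rot)

# Run the OSC server on a background thread; never touch bpy data from it.
server = ThreadingOSCUDPServer(("0.0.0.0", 9000), dispatcher)
threading.Thread(target=server.serve_forever, daemon=True).start()

def apply_to_camera():
    """Copy the latest tracked values onto the scene camera on the main thread."""
    cam = bpy.data.objects["Camera"]
    cam.location = latest["pos"]
    cam.rotation_euler = [radians(a) for a in latest["rot"]]
    return 1 / 60  # re-run roughly 60 times per second

bpy.app.timers.register(apply_to_camera)
```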
And it works really well!
This is the beginning step in making the tools needed for Virtual Production (perhaps the most vital step, actually). I can now use my iPhone to interactively film 3D scenes as if I were actually in the scene filming it with my phone. It makes the process of capturing 3D scenes more intuitive. It also yields a “hand-held” feel to the footage. If that’s not the goal, adding a Lag CHOP before the OSC Out CHOP in TD, and enabling clamp acceleration, will make the motion of the virtual camera much smoother. Also, turning off Y-rotation on the virtual camera will make sure the horizon is always straight in your shots.
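If you are curious what that smoothing amounts to, here is a rough Python sketch of the same idea: an exponential (one-pole) low-pass filter applied per channel. The `smoothing` constant is a made-up stand-in for the Lag CHOP's lag time, not a real parameter of it, so tune it to taste:

```python
# Rough equivalent of lagging each camera channel before it is sent to Blender.
class LagFilter:
    def __init__(self, smoothing=0.9):
        self.smoothing = smoothing  # 0 = raw, closer to 1 = smoother but laggier
        self.value = None

    def __call__(self, sample):
        # First sample initialises the filter; after that, ease toward new samples.
        if self.value is None:
            self.value = sample
        else:
            self.value = self.smoothing * self.value + (1.0 - self.smoothing) * sample
        return self.value

# One filter per camera channel (translation and rotation).
filters = {name: LagFilter(0.9) for name in ("tx", "ty", "tz", "rx", "ry", "rz")}
smoothed_tx = filters["tx"](1.234)  # feed raw ARKit samples in, get eased values out
```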
The next step is mounting the phone to a DSLR and using it to track the location of the camera.