We present an interactive mobile application that supports the design, AR-guided capture, and post-processing (stabilization, take management, and rough-cut assembly) of cinematic shots on mobile devices, unifying filmmaking stages that traditionally require separate tools and personnel.
Our shot planning interface represents scenes as parallel foreground and background layers suspended in 3D space. Users design and previsualize shots by manipulating viewport rectangles on each layer. Camera position, field of view, zoom, and focus are then derived from these viewports. The Camera Feed panel shows the resulting view that the user would see through the viewfinder during filming. In the app, users can replace these abstract layers with real photos from location scouting. During capture, the shot plan drives augmented reality guidance with live object tracking and segmentation, helping the camera operator match the planned framing while the app automatically adjusts zoom and focus in real time (see our video above). Try the presets below to see how different viewport combinations produce distinct cinematic effects.
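To make the viewport-to-camera derivation concrete, here is a minimal sketch under a simple pinhole camera model with axis-aligned parallel layers. The function name and parameterization are hypothetical illustrations, not the app's actual implementation: given the depths of two layers and the widths of the viewport rectangles drawn on them, similar triangles determine both the camera's position along the viewing axis and its horizontal field of view.

```python
import math

def camera_from_viewports(z_fg, w_fg, z_bg, w_bg):
    """Hypothetical sketch: recover camera depth and horizontal FOV
    from viewport widths on two parallel layers (pinhole model).

    z_fg, z_bg: layer depths along the camera axis (z_bg > z_fg)
    w_fg, w_bg: viewport widths on those layers (w_bg > w_fg,
                since a wider slice of the farther layer is visible)
    """
    # Similar triangles: on each layer, w = 2 * (z - z_cam) * tan(hfov / 2).
    # Two layers give two equations in the unknowns z_cam and tan(hfov / 2).
    t = (w_bg - w_fg) / (2.0 * (z_bg - z_fg))  # tan(hfov / 2)
    z_cam = z_fg - w_fg / (2.0 * t)            # camera depth on the axis
    hfov_deg = 2.0 * math.degrees(math.atan(t))
    return z_cam, hfov_deg
```

Widening the background viewport while holding the foreground viewport fixed pulls the derived camera closer and widens the field of view, which is one way a plan can encode a dolly-zoom-style effect.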
Additional prototype clips (720p, compressed for the web). Use the controls to play.
These scenes were recorded using the same scale-invariant shot plan.
More coming soon…
This work was partially supported by a National Science Foundation Faculty Early Career Development Grant under award #2340448. We thank our study participants and testers, especially Shamus Li and Xinrui Liu, for their help and feedback in developing our app.