Baby Yoda
Client
NYU
Year
Category
Virtual Production
Live Project
View Now
Inspired by the potential of real-time filmmaking, we explored an experimental workflow using Unreal Engine 4 (UE4) and an iPad to create an interactive virtual camera system. By linking UE4's virtual camera to the iPad's physical camera, we enabled real-time panning within a virtual environment. Our focus was on bringing Baby Yoda to life with motion capture, combining creative 3D modeling with technical integration.
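The core idea of the camera link can be sketched as a small data model: the iPad streams orientation deltas, and those deltas drive a virtual camera's yaw and pitch. This is an illustrative standalone sketch, not the actual UE4 virtual camera integration; the `CameraRig` class and `apply_pan` method are hypothetical names.

```python
# Hypothetical sketch: map iPad orientation deltas onto a virtual camera's
# rotation. In the real project this link went through UE4's virtual camera
# tooling; this standalone model just shows the pan/tilt bookkeeping.

from dataclasses import dataclass

@dataclass
class CameraRig:
    yaw: float = 0.0    # degrees, left/right pan
    pitch: float = 0.0  # degrees, up/down tilt

    def apply_pan(self, d_yaw: float, d_pitch: float,
                  sensitivity: float = 1.0,
                  pitch_limit: float = 80.0) -> None:
        """Apply a device-orientation delta; wrap yaw and clamp pitch
        so the virtual camera can never flip upside down."""
        self.yaw = (self.yaw + d_yaw * sensitivity) % 360.0
        self.pitch = max(-pitch_limit,
                         min(pitch_limit, self.pitch + d_pitch * sensitivity))

rig = CameraRig()
rig.apply_pan(30.0, -10.0)
print(rig.yaw, rig.pitch)  # 30.0 -10.0
```

Clamping the pitch mirrors what a tripod-mounted camera does physically: panning is unbounded, but tilt has hard limits.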
We started by designing Baby Yoda from scratch, adapting a 3D model onto MetaHuman to achieve realistic proportions and movement. Custom textures were created to match the character's look, giving the model an authentic visual appeal. Motion capture data from Mixamo was applied to the rigged Baby Yoda, enabling fluid and lifelike animation. The iPad's camera was then connected to UE4, allowing the virtual sequencer to sync with real-world movements. Mounted on a tripod, the iPad acted as a dynamic camera, seamlessly panning within the digital set.
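The Mixamo-to-MetaHuman retargeting step above boils down to mapping one skeleton's bone names onto another's. A minimal sketch of that idea, assuming a handful of representative bone names (in practice this is done inside UE4's retargeting tools, not hand-written code, and the full mapping covers the whole skeleton):

```python
# Illustrative sketch of the bone-name remapping behind retargeting Mixamo
# motion capture onto a MetaHuman-style skeleton. Only a few representative
# bones are mapped here; the real pipeline used UE4's retargeting tools.

MIXAMO_TO_METAHUMAN = {
    "mixamorig:Hips": "pelvis",
    "mixamorig:Spine": "spine_01",
    "mixamorig:Spine1": "spine_02",
    "mixamorig:Neck": "neck_01",
    "mixamorig:Head": "head",
}

def retarget_track_names(tracks: dict) -> dict:
    """Rename animation tracks from Mixamo bone names to MetaHuman bone
    names, dropping any track whose bone has no known counterpart."""
    return {MIXAMO_TO_METAHUMAN[bone]: keys
            for bone, keys in tracks.items()
            if bone in MIXAMO_TO_METAHUMAN}

clip = {"mixamorig:Hips": [(0.0, (0, 0, 90))], "mixamorig:UnknownBone": []}
print(retarget_track_names(clip))  # {'pelvis': [(0.0, (0, 0, 90))]}
```

Dropping unmapped bones is the simplest policy; a production retarget would instead flag them so animation data is never silently lost.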
One of the key challenges was achieving a smooth and responsive connection between the iPad and UE4. Ensuring that camera movements translated naturally within the virtual space required precise calibration and real-time synchronization. Additionally, integrating motion capture onto the custom-rigged Baby Yoda model posed difficulties in maintaining realism, especially when blending animations with the environment.
To enhance tracking accuracy, we optimized the link between UE4 and the iPad's camera, fine-tuning it to reduce latency for a more immersive experience. Adjustments to Baby Yoda's rigging and weight painting helped refine the motion capture results, making movements feel more organic. By leveraging UE4's sequencer, we enabled real-time cinematic panning, allowing for a smooth transition between physical and virtual perspectives. This experiment pushed the boundaries of interactive filmmaking, demonstrating how traditional cinematography techniques can merge with immersive virtual production.
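The latency/jitter trade-off described above can be illustrated with a classic exponential moving average: a little smoothing lag in exchange for much steadier camera motion. This is a generic sketch of the technique, not the project's actual filter, and the `alpha` value is illustrative.

```python
# Minimal sketch of smoothing a noisy camera stream (e.g. yaw samples from
# the iPad) with an exponential moving average. Lower alpha = smoother but
# laggier; higher alpha = more responsive but jitterier.

def smooth(samples, alpha=0.3):
    """Exponentially smooth a stream of 1-D camera values."""
    out, prev = [], None
    for s in samples:
        prev = s if prev is None else alpha * s + (1 - alpha) * prev
        out.append(prev)
    return out

jittery = [0.0, 10.0, 0.0, 10.0]
print(smooth(jittery))  # values oscillate far less than the raw input
```

Tuning `alpha` is exactly the calibration work the paragraph describes: too much smoothing and the virtual camera visibly trails the physical pan; too little and sensor noise shows up as shake.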