Camera Tracking and Solving
- Software: Blender 2.70
Blender's Practical VFX Pipeline
In this course I will show you some of the major stages of a Visual Effects (VFX) pipeline by doing a fun little project: compositing a 3D monster into some live action footage. The scope of this course is specifically teaching you the VFX skills, so we won't be focusing much on modeling and animation. Instead, we are going to grab a pre-modeled and pre-rigged character from Blendswap, made by Theory Animation: George. The source files for this course are available to everyone under the CC-BY-SA license. Download them HERE. Thank you, Theory Animation!

First I will cover some housekeeping to get familiar with the file structure, as well as some small tasks like fixing a few bone weights and modifying the shaders of our monster. Since we are not going to cover the whole animation process, I will use a pre-made run cycle. This cycle needs some minor cleanup, so I'll go over that real quick. It's perfectly fine to follow the course with your own run cycle; if you are interested in creating one yourself, a great resource for making George run is the tutorial by his creators at Theory Animation!

After all the cleaning and fixing, it's time for the fun stuff: Camera Tracking! We will be using a hand-held shot I recorded, tracking it, then solving and undistorting it with the Blender camera tracker, and finally lining it up perfectly with our 3D scene.

In Part 5 we are going to make this course even more interesting by doing a small fire simulation! We will set George's horn on fire and render it with Cycles, which has only been possible since the 2.72 release.

When our match-moving and simulation tasks are all done, it's time for rendering and compositing, to which I have dedicated three parts: setting up the render layers and passes, combining it all correctly in the compositor, and applying effects such as light wraps, motion blur, and color correction to realistically blend our rendered content into the live action footage.