Using the Camera Tracker – Round 2

In this Blender 2.5 video tutorial we give you another run-through of the Camera Tracker system that has been implemented in the GSoC 2011 Tomato branch.


Discussion

32 Responses to “Using the Camera Tracker – Round 2”
  1. Posts: 1
    Steve Elkins says:

    *Feature Request* Use camera tracking to “UV Unwrap” a room or object directly from the average of all the movie clip stills. Steps: 1. Take a movie clip of a room or object. 2. Make the mesh in Blender with camera tracking. 3. Paint the mesh with the movie stills!

  2. Posts: 2
    cameron newham says:

    Actually, I’ve spotted a very big omission from the match moving as it currently stands. The distortion adjustment is currently useless because it doesn’t undistort the images – it only applies the undistortion to the calculated points (i.e. when calculating the camera positions).

    This means that when you overlay your movie on the Blender 3D view you will be compositing a distorted view with an undistorted one, and the matching won’t work perfectly.

    If you want perfect compositing, you would need to batch-process all the movie frames through an undistort filter and then use those frames in Blender.

    However, the distortion in most of the samples I’ve seen so far (including my own) is rather low, so one can get away with it.

    It would also be nice if they provided a reverse calculation function – i.e. given the solved camera position, calculate the 3D position of a tracked point. As it currently stands, doing a recalc for a new point recalculates the camera path from all the points, which can adversely affect the result if the point you want to calculate is only present in a small number of frames.
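    The distinction cameron draws – undistorting the solver's point coordinates rather than the pixels – can be sketched as inverting a polynomial radial model by fixed-point iteration. This is only an illustration of the idea; the coefficients k1, k2 are hypothetical and this is not Blender's internal API:

```python
import numpy as np

def undistort_points(points, k1, k2=0.0, iterations=10):
    """Invert the radial model x_d = x_u * (1 + k1*r^2 + k2*r^4)
    by fixed-point iteration; points are (N, 2) normalized coords."""
    pts = np.asarray(points, dtype=float)
    und = pts.copy()
    for _ in range(iterations):
        # radius is measured on the current undistorted estimate
        r2 = np.sum(und**2, axis=1, keepdims=True)
        und = pts / (1.0 + k1 * r2 + k2 * r2**2)
    return und
```

    Applying this to a handful of track coordinates per frame is cheap, which is presumably why undistorting points was implemented before undistorting whole images.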

    • Posts: 6
      Guest Guest says:

      That is not an omission, it’s just work in progress. ;)
      But in the meantime Sergey has already implemented two new nodes in the compositor, with which you can either undistort the footage or distort the CG.
      https://img.skitch.com/20110922-x2bbbpt8jdwfd31yrpxfhc69tk.jpg
      Unfortunately the nodes are currently rather slow, but at least they more or less work.
      There are also plans to implement a function to undistort the proxies so that the 3D viewport uses the correct undistorted images.
      :)

      • Posts: 2
        cameron newham says:

        Thanks for the update. That’s cool – I was expecting them to include undistort image at some stage.

        In the meantime I suggest using PanoTools (either directly or via the PS plugin) to batch undistort the images before using them in Blender.
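        The batch step suggested above could also be sketched in plain NumPy as a remap: for each output pixel, forward-distort its coordinate to find which source pixel to sample. The k1 coefficient and the intrinsics fx, fy, cx, cy are placeholder values you would take from your own calibration or solve, not anything Blender exposes:

```python
import numpy as np

def undistort_image(img, k1, fx, fy, cx, cy):
    """Resample img through a one-parameter radial model.
    Intrinsics (fx, fy, cx, cy) and k1 are hypothetical inputs."""
    h, w = img.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    xn = (xs - cx) / fx          # normalized, undistorted output coords
    yn = (ys - cy) / fy
    r2 = xn ** 2 + yn ** 2
    factor = 1.0 + k1 * r2       # forward-distort to locate the source pixel
    src_x = np.clip(np.rint(xn * factor * fx + cx), 0, w - 1).astype(int)
    src_y = np.clip(np.rint(yn * factor * fy + cy), 0, h - 1).astype(int)
    return img[src_y, src_x]     # nearest-neighbour resample
```

        Looping this over an image sequence with any frame loader gives you the undistorted frames to bring back into Blender; a production tool would use bilinear sampling rather than nearest-neighbour.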

  3. Posts: 18
    Mads Alber says:

    Thanks so much for this, I have been looking forward to another update. I hope you will keep making tutorials on the tracking and on combining footage with animation :D

  4. Posts: 2
    Chris Walker says:

    Are there any plans in the works to include the ability to track panning shots? Considering how prevalent panning shots, or just long shots in general, are, it would seem a serious misstep not to include a way to track them. (Not to mention incredibly frustrating for me, considering I do love my pans and oners.)

  5. Pingback: Kleiner Herbstgruß :) » www.noemis.de

  6. Posts: 4
    Makaco Studios says:

    Thanks for the tutorial… it’s good!

  7. Posts: 24

    Nice tutorial. I’m kinda new to Blender and I was wondering what you would use this camera tracking for. I also wanted to know if there is a more beginner-friendly tutorial on camera tracking. Thanks!

  8. Posts: 7
    Stephen Johnson says:

    Track lines are not showing up in the 3D viewport?
    What’s going on?

  9. Posts: 6

    Does anyone know if it is possible to set this up with Cycles? I keep trying, but end up having to render it out with a green screen behind it and using the chroma key to composite the two image sequences together. Is it even possible to set this up with Cycles?

  10. Posts: 1
    jeremy Smith says:

    Great tutorial – I have been following since the first one. Thanks for all of the effort. The results look fantastic. Here is my result: http://youtu.be/THDSvN1xi8k I think it tracked well; I am experimenting with different tracking objects.

  11. Posts: 1
    Anton Krug says:

    Hi Sebastian. Nice tutorials. I have a Canon PowerShot SX1 IS which shoots 1080p at 30fps with a 40Mbit/s bitrate, so there are no visible compression artefacts. The video looks clean with good contrast (when I shoot in good conditions, so no low-light shots), so it could be nice for tracking. But I realised the CMOS sensor has a pretty huge rolling shutter effect, almost 100% – it’s so bad it can hardly get worse. With small motions it’s OK, but when I move rapidly or shake the camera I can notice it. So I will practise very slow, steady and smooth camera motions and make some cheap steadicam. There are also tricks to lock it at certain shutter speeds and at a wider aperture than normal (in video it’s normally almost fully automatic, so this could remedy it a little). But even in the best case, when the effect is not visible to me, there will still be some skew present (I don’t know how big – maybe a pixel, or subpixel). How much will that affect the camera tracking? Having seen how small pixel differences produce huge differences in the solution, I’m afraid that even with these tools and tricks the rolling shutter will still be big enough to ruin my camera tracking options. What do you think – where is the line beyond which camera tracking is not worth it, or even possible?
    Thanks for any info. Anton.
    PS: Upgrading the camera is not an option, so if this camera can’t do it I won’t do a project that I wanted to start.
