How to make puppets in Blender

(This is a work in progress.)

Blender as a puppet theatre

I love traditional puppetry—hand puppets, rod puppets, marionettes, bunraku puppets, shadow puppets… But I don't always have the materials, skill, or space to create the puppets of my dreams. Filming puppets can be a complex operation. You have to build an appropriate stage, light it, wire up microphones, and keep everything looking good while the puppeteer stays out of sight. Even though shabby-looking puppet shows can be charming and fun, there are too many points of failure for a single performer to do a complex multi-character scene.

With Blender, I can instead create a virtual stage and virtual puppets. The overall effect gives the impression of real puppets, but the process is unencumbered by the laws of physics (except if we want the laws of physics). It's also possible to perform multiple characters asynchronously, use multiple cameras, and generally take advantage of what 3D rendering programs have to offer.

Modeling principles

Different types of real-life puppets

  • hand puppets / sock puppets
  • rod puppets
  • shadow puppets
  • marionettes
  • multi-person bunraku or theatrical puppets


Designing for performance

  • keep the polygon count low, because you need the viewport to play back at a high enough fps to capture the performance

Rigging principles


Performance capture tools


  • Use the Video Sequence Editor to import audio. You may want to place the audio with a few seconds of silence before and after, to give you some breathing room so you don't overwrite your capture if the playback starts looping.
  • There's also a script I use to stop the playback from looping:

    import bpy

    def stop_playback(scene):
        # Stop at the end of the (preview) range instead of looping.
        end = scene.frame_preview_end if scene.use_preview_range else scene.frame_end
        if scene.frame_current == end:
            bpy.ops.screen.animation_cancel(restore_frame=False)

    # Run the check on every frame change during playback.
    bpy.app.handlers.frame_change_pre.append(stop_playback)
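The audio placement described above (a few seconds of lead-in silence before the strip) can be sketched as a tiny helper. This is a hedged sketch: the three-second pad, strip name, and file path are illustrative choices of mine, not values from the text.

```python
def lead_in_start_frame(fps, pad_seconds=3):
    """Frame at which the sound strip should start so there are a few
    seconds of silence before it (Blender scenes start at frame 1)."""
    return 1 + int(fps * pad_seconds)

# Inside Blender, the strip could then be added through the VSE API, e.g.:
#   scene = bpy.context.scene
#   if scene.sequence_editor is None:
#       scene.sequence_editor_create()
#   scene.sequence_editor.sequences.new_sound(
#       "dialogue", "//dialogue.wav", 1, lead_in_start_frame(scene.render.fps))

print(lead_in_start_frame(24))  # 73
```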

Eevee / Viewport

As of version 2.8 or so, Blender has a realtime rendering engine (Eevee) and a path-tracing engine (Cycles). Using Eevee makes it possible to capture a performance of a puppet rig in close to real time. I don't have a high-end graphics card, but I can still get a decent capture framerate by setting the viewport to solid mode.

Auto Keyframe

Automatic keyframe insertion. If you press the play button and start transforming an object, keys will be inserted at the document Frame Rate. (Note that this is not the same as the Step value.) If you want a very stop-motion look, set the Frame Rate to 6 fps. If you want a live-puppet look, keep it at 24 fps (or higher, if you're into that sort of thing).
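As a rough sketch of how the Frame Rate choice changes key density (the helper name is mine, and it assumes one key lands per frame during playback):

```python
def key_times(fps, seconds=1.0):
    """Times (in seconds) at which auto-keyframe lands keys during playback,
    assuming one key per frame at the scene frame rate."""
    frames = int(round(fps * seconds))
    return [i / fps for i in range(frames)]

# 6 fps leaves only six poses per second -- a stop-motion feel --
# while 24 fps gives the smooth density of a live puppet.
print(len(key_times(6)), len(key_times(24)))  # 6 24
```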


The quickest and easiest way to capture live movement in Blender is with the mouse. An optical mouse is good enough to catch a lot of small movements. With Auto Keyframe switched on, select your control bone and press the spacebar (or whichever key you have set to start playback). Move the control while listening to the audio.

  • The movement is going to be relative to your view of the controller, so experiment first with how to orient your view so that your mouse movement drives the controller in the right direction in 3D space.
  • The movement is also going to be relative to your zoom level. If you're zoomed way out, your mouse movements cause much bigger controller movement than if you're zoomed in. So if you want very small, subtle movement, zoom in more. If you want big, exaggerated movement, zoom out.
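The zoom effect can be sketched with a simple pinhole-style model. This is an illustrative assumption of mine, not Blender's actual view mapping; the function name and parameters are hypothetical.

```python
import math

def mouse_to_world_delta(dx_px, dy_px, view_distance,
                         fov_deg=50.0, viewport_height_px=1080):
    """Map a screen-space mouse delta (pixels) to a world-space delta.
    A pinhole model: the farther the view is from the target (zoomed out),
    the more world-space distance each pixel covers."""
    # World-space height visible at the view target.
    world_height = 2.0 * view_distance * math.tan(math.radians(fov_deg) / 2.0)
    units_per_pixel = world_height / viewport_height_px
    return dx_px * units_per_pixel, dy_px * units_per_pixel

# Zoomed out (view distance 10), the same 100 px drag moves the controller
# ten times farther than when zoomed in close (view distance 1).
print(mouse_to_world_delta(100, 0, 10.0)[0] /
      mouse_to_world_delta(100, 0, 1.0)[0])
```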

Pen tablet


TouchOSC experiments

Puppetry vs. Animation

Puppetry                                  Animation
- portraying character through movement   - portraying character through movement
- "realtime" performance                  - gradually composed performance
- high fidelity, low precision*           - high precision, low fidelity*

Puppetry has certain advantages over animation. A very simple puppet can feel more alive than a super-complex animated character (in 2D or 3D).

[crow.jpg] Fig. 1: Mister Personality

\* What do I mean by "fidelity"? How closely the movement gets at its "truth". It's abstract. There are subtle movements we make with our bodies that are too minuscule to be practical to animate, but they're the vital quality of seeing a real, living being. There's a lot of noise in our natural movements, and animation (especially computer animation) is optimized around getting rid of most of that noise and doing everything in clean curves and arcs.
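To make the "noise" point concrete, here's a minimal sketch: a moving average standing in for the curve cleanup that animation workflows favor. The function names and numbers are mine, purely for illustration.

```python
import random

def smooth(samples, window=5):
    """Simple moving average -- a stand-in for the curve simplification
    and clean arcs that animation workflows are optimized for."""
    half = window // 2
    out = []
    for i in range(len(samples)):
        lo, hi = max(0, i - half), min(len(samples), i + half + 1)
        out.append(sum(samples[lo:hi]) / (hi - lo))
    return out

def jitter(xs):
    """Total frame-to-frame variation -- a crude measure of 'noise'."""
    return sum(abs(b - a) for a, b in zip(xs, xs[1:]))

random.seed(0)
raw = [0.5 + random.uniform(-0.1, 0.1) for _ in range(100)]  # noisy "captured" motion
clean = smooth(raw)
# Smoothing removes the frame-to-frame jitter -- along with the texture
# that makes captured movement feel alive.
print(jitter(clean) < jitter(raw))
```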

straight-ahead animation vs. pose-to-pose

  • It's not just that you use straight-ahead for fast action and pose-to-pose for everything else.
  • Straight-ahead is really good for conveying physical energy or force.

Stop-motion animators traditionally work straight-ahead

Animation in CG settings has been keyframe/pose-to-pose oriented

  • The rig is complicated and has too many controls to do straight-ahead animation with.

Performance capture brings the spontaneity and embodied aspects of straight-ahead animation and puppetry into the CG animation context.

  • For certain types of acting, doing a quick puppet style animation is much less complicated than trying to key out a long sequence with sound sync.

Applying animation techniques to the rough "footage"

space switching for overlapping action


