These are a pair of chromakey plugins for the Blender sequencer. The jKEY plugin extracts the key and removes spill; jMERGE can then be used to composite the jKEY result onto the background using colour matching & light wrapping.
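The Vlahos-style key construction mentioned in the disclaimer below can be sketched roughly like this. This is a hedged guess at the general technique for a green screen, not the plugins' actual code; the gain parameter `k` and the spill rule are simplifications of my own.

```python
def green_key_alpha(r, g, b, k=1.0):
    """Vlahos-style matte for a green screen: a pixel is treated as
    background to the degree that green dominates the other channels.
    k is a tuning gain; alpha is clamped to [0, 1]."""
    return max(0.0, min(1.0, 1.0 - k * (g - max(r, b))))

def suppress_spill(r, g, b):
    """Very simple green-spill suppression: clamp the green channel
    down to the larger of red and blue."""
    return r, min(g, max(r, b)), b
```

On a clean screen pixel like (0.1, 0.9, 0.1) the alpha goes toward 0, while a foreground pixel where green does not dominate clamps to fully opaque.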

Disclaimer: I have created these plugins from ideas found in various sources on the net and in my head. As green/blue/chroma key techniques are generally heavily patented, I can't be certain that the methods I've used in these plugins can be used for commercial work without violating some of those patents (the base key construction and spill removal are mostly from Petro Vlahos' now-expired patents that Ultimatte was based on). My own interest is purely hobby-based and I don't intend to profit from these plugins in any way.

You can download the source code and precompiled Windows DLLs for these plugins here:

Find out what all the parameters do and how to use these plugins.

Optical Flow

This is a plugin for the Blender sequencer that implements the Horn & Schunck optical flow algorithm with a multi-resolution approach for better detection of larger motions. The plugin also has two examples of optical flow field usage: a blend between two frames & motion blur.

You can download the source code and a precompiled Windows DLL for the plugin here:

Optical flow is the motion of pixels from frame to frame. (more info @ Wikipedia)
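The Horn & Schunck scheme mentioned above can be sketched as follows. This is a single-resolution sketch of the classic iteration (the plugin additionally runs it coarse-to-fine for larger motions), not the plugin's actual code; the derivative estimates and the wrap-around neighbourhood average are simplifications.

```python
import numpy as np

def horn_schunck(I1, I2, alpha=1.0, iters=100):
    """One level of Horn & Schunck optical flow: iteratively refine a
    per-pixel flow field (u, v) that balances brightness constancy
    (Ix*u + Iy*v + It = 0) against smoothness (weight alpha)."""
    I1 = np.asarray(I1, dtype=np.float64)
    I2 = np.asarray(I2, dtype=np.float64)
    # crude image derivatives: spatial from the first frame, temporal
    # as the plain frame difference
    Ix = np.gradient(I1, axis=1)
    Iy = np.gradient(I1, axis=0)
    It = I2 - I1
    u = np.zeros_like(I1)
    v = np.zeros_like(I1)

    def avg(f):
        # 4-neighbour average of the flow (wraps at borders for brevity)
        return (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
                np.roll(f, 1, 1) + np.roll(f, -1, 1)) / 4.0

    denom = alpha ** 2 + Ix ** 2 + Iy ** 2
    for _ in range(iters):
        ubar, vbar = avg(u), avg(v)
        t = (Ix * ubar + Iy * vbar + It) / denom
        u = ubar - Ix * t
        v = vbar - Iy * t
    return u, v
```

For a linear intensity ramp shifted one pixel in x, the iteration converges to a uniform flow of (1, 0), which is what the brightness-constancy constraint predicts.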

So... what does this plugin actually do?

Image Based Lighting

IBL means recreating the lighting conditions of a real-world place from an HDR image. I wrote a Python script which takes the text output of HDR Shop's lightgen plugin and creates appropriately coloured lamps in Blender.
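The parsing half of such a script could look roughly like this. This is a hypothetical sketch, not the original script: I'm assuming each lightgen line holds a direction vector plus an RGB colour ("x y z r g b"), which may not match the actual lightgen output format, and I've left out the Blender lamp creation itself since the Python API has changed so much between versions.

```python
def parse_lightgen(text, distance=10.0):
    """Turn lightgen-style text into a list of lamp specs.  Each lamp is
    placed `distance` units from the origin along its light direction,
    ready to be turned into Blender lamp objects by a separate step."""
    lamps = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith('#'):
            continue  # skip blanks and comment lines
        x, y, z, r, g, b = map(float, line.split())
        n = (x * x + y * y + z * z) ** 0.5  # normalise the direction
        loc = (distance * x / n, distance * y / n, distance * z / n)
        lamps.append({'location': loc, 'color': (r, g, b)})
    return lamps
```

A second pass over the returned list would then create one lamp per entry with the given location and colour.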

Note: I'm not sure if the script still works, as the Python API might have changed considerably since I wrote it.


In late 2002 school started to feel boring and I got a terrible urge to really learn to code. After many "hello worlds", this was the project that got me hooked on programming.

A relatively simple raytracer with multiple lamps, each with colour, intensity and range. Spheres & tris as objects, with location, rotation, colour and reflection properties. User-defined camera location and supersampling for antialiasing. All parameters were read from a text file and the output was written to BMP images.
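The geometric core of a raytracer like that is the ray-primitive intersection test. Here is a sketch of the sphere case in Python rather than the original C++; the function name and interface are my own illustration, not the original code.

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Return the distance t along the ray to the nearest hit in front
    of the camera, or None if the ray misses the sphere.  Solves the
    quadratic |origin + t*direction - center|^2 = radius^2 for t."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    cx, cy, cz = center
    # vector from sphere centre to ray origin
    lx, ly, lz = ox - cx, oy - cy, oz - cz
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (dx * lx + dy * ly + dz * lz)
    c = lx * lx + ly * ly + lz * lz - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None                       # ray misses the sphere
    t = (-b - math.sqrt(disc)) / (2.0 * a)
    return t if t > 0.0 else None         # ignore hits behind the origin
```

Shading then evaluates each lamp's colour, intensity and range at the hit point, and reflections recurse with a new ray from that point.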

Thinking back, it was slow as hell and not really that high-tech at all, but oh, the joys of getting proper camera mapping, shading and reflections working after countless sleepless nights learning C++ and object-oriented programming :)