jKEY & jMERGE
These are a pair of chromakey plugins for the Blender sequencer. The jKEY plugin is used to extract the key and remove spill; jMERGE can then be used to blend the jKEY result into the background using color matching & light wrapping.
Disclaimer: I have created these plugins from ideas found on various sources on the net and in my head. As the green/blue/chroma key techniques are generally heavily patented I can't be certain that the methods I've used in these plugins can be used for commercial work without violating some of those patents (the base key construction and spill removal are mostly from Petro Vlahos' now expired patents that Ultimatte was based on). My own interests are purely on a hobby basis and I don't intend to profit from these plugins in any way.
You can download the source code and precompiled Windows dlls for these plugins here:
Find out what all the parameters do and how to use these plugins.
jKEY Settings
jKEY can have one or two inputs: the first input is always the foreground footage, and the optional second input is interpreted as a garbage matte, using max(r,g,b) of every pixel as its value.
[note: jKEY also keeps the original foreground alpha channel, and if a garbage matte is specified, min(alpha_foreground, value_garbage) is used as the resulting initial key]
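The initial-key combination described in the note can be sketched in a few lines of per-pixel pseudocode (pure Python; the function and parameter names are mine, not the plugin's):

```python
def initial_key(fg_pixel, garbage_pixel=None):
    """Combine the foreground alpha with an optional garbage matte.

    fg_pixel: (r, g, b, a) floats in 0..1
    garbage_pixel: (r, g, b) floats in 0..1, or None if no matte input
    """
    r, g, b, a = fg_pixel
    if garbage_pixel is None:
        return a
    # The garbage matte's value is max(r, g, b) of the matte pixel;
    # the initial key is min(foreground alpha, matte value).
    value = max(garbage_pixel)
    return min(a, value)
```

A black garbage-matte pixel therefore forces the key to zero regardless of the foreground alpha.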
General:
- View: what is shown as output: 0=original footage, 1=base key, 2=detail key, 3=combined key, 4=final result
- Green: is the footage on green or blue screen (default is selected=green)
- Color: a specific sample of the background color to determine its variance from pure green/blue
- DV-Blur: tries to reduce the effects of DV video's chroma subsampling (use if your footage is from a DV source): 0=do nothing, 1=NTSC-DV 4:1:1, 2=PAL-DV 4:2:0
Base Key:
- White & Black: just like Photoshop's Levels command, determines what is pure white/black in the key
- Tune: moves the key inward/outward
- Blur: softens the base key
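The White & Black controls behave like a Levels remap: everything below the black point becomes 0, everything above the white point becomes 1, with a linear ramp in between. A minimal sketch (Tune and Blur are not modeled here):

```python
def levels(x, black, white):
    """Photoshop-style Levels remap for one key value in 0..1.
    Values <= black map to 0, values >= white map to 1, linear in between.
    Assumes black < white."""
    if white <= black:
        return 1.0 if x >= white else 0.0
    t = (x - black) / (white - black)
    return min(1.0, max(0.0, t))
```

Raising the black point eats into soft key edges; lowering the white point fills the inside of the foreground, which is exactly how the workflow below uses these sliders.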
Detail Key:
- Luma: you can use either the luma or key-color channel to extract the detail key (default is selected=luma)
- White & Black: same as with the base key
Combined Key:
- White & Black: as with the base & detail key
- Tune: as with base key
- Blur: as with base key
Spill Removal:
- Value: how much spill to remove
- A&B: parameters to control from where spill is removed
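A classic way to suppress green spill is to clamp the green channel toward the larger of red and blue; this sketch shows the idea with the Value parameter, while the plugin's A & B controls (which shape where spill is removed) are not modeled here:

```python
def remove_spill(r, g, b, amount=1.0):
    """Green-screen spill suppression sketch for one pixel (0..1 floats).
    'amount' plays the role of the Value parameter: 0 = no change,
    1 = green fully clamped to max(r, b)."""
    limit = max(r, b)
    if g > limit:
        # Pull excess green back toward the red/blue ceiling.
        g = g - amount * (g - limit)
    return r, g, b
```

For blue screens the same idea applies with the blue channel clamped toward max(r, g).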
jMERGE Settings
jMERGE requires two inputs, the first is the jKEY result (or any other footage with an alpha channel), the second input is the background footage.
[note: all of the views of jMERGE (with the exception of 2=background) contain the original alpha channel to be used for further compositing]
- View: what is shown as output: 0=composite without correction, 1=foreground, 2=background, 3=corrected foreground, 4=final composited result
- Correct: use color correction on foreground (default is unselected=off)
Foreground & Background:
- Highlight sample: the color that should be neutral white in the foreground/background
- Midtone sample: the color that should be a neutral gray in the foreground/background
- Shadow sample: the color that should be a neutral black in the foreground/background
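One way to read these controls: per channel, the foreground is remapped so that its neutral samples land on the background's. A linear two-point fit per channel captures the idea; the midtone sample (which would add e.g. a gamma term) is omitted from this sketch, and the function name is mine:

```python
def match_channel(x, fg_black, fg_white, bg_black, bg_white):
    """Remap one foreground channel value so the foreground's neutral
    shadow/highlight samples land on the background's neutral samples.
    All values are 0..1 floats for a single channel (r, g or b)."""
    if fg_white == fg_black:
        return x  # degenerate samples, leave the channel untouched
    # Normalize against the foreground's neutral range...
    t = (x - fg_black) / (fg_white - fg_black)
    # ...then rescale into the background's neutral range.
    return bg_black + t * (bg_white - bg_black)
```

Applying this to r, g and b separately removes a color cast: if the foreground's "white" sample is greenish, the green channel gets scaled down relative to the others.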
Edge Blend & Light Wrap:
- Mode: how to add the edge blend / light wrap to the foreground: 0=normal (blend), 1=add, 2=screen
- Relative: how much the brightness of the background affects the edge blend / light wrap, 0=no relation, 1=linear relation, 2=square relation
- Value: how much to add the edge blend / light wrap
- Size: how much the edge blend / light wrap extends inside the foreground
- Blur: how much the background is blurred in the edge blend / light wrap
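The light-wrap idea is to leak a blurred copy of the background onto the foreground's edges. A per-pixel, per-channel sketch, assuming the Size/Blur controls have already produced an edge mask and a blurred background (both are inputs here, not computed):

```python
def light_wrap(fg, bg_blurred, alpha, edge, value=0.5, mode=0):
    """Light wrap / edge blend sketch for one channel (0..1 floats).

    fg:         foreground channel value
    bg_blurred: pre-blurred background channel value
    alpha:      foreground key at this pixel
    edge:       mask (0..1) nonzero just inside the foreground edge
    value:      overall strength (the Value parameter)
    mode:       0=normal blend, 1=add, 2=screen (as in the plugin's list)
    """
    w = value * edge * alpha  # only wrap where there is foreground
    if mode == 1:             # add
        return fg + w * bg_blurred
    if mode == 2:             # screen
        return 1.0 - (1.0 - fg) * (1.0 - w * bg_blurred)
    return fg * (1.0 - w) + bg_blurred * w  # normal blend
```

The Relative control would additionally scale `w` by the background's brightness (linearly or squared), which this sketch leaves out.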
Workflow example
Prework:
- fire up Blender & divide the screen for sequencer work
- we add our foreground footage to the sequencer..
- ..and a quickie garbage matte..
- ..and the jKEY plugin to the footage
jKEY:
- our footage is on greenscreen so let's leave the Green button selected
- let's pick a sample from the green screen into the Color control, with a color & shade that matches most of the background around the foreground
- and set DV-Blur to PAL-DV (=2) since our footage is exactly that
- to edit our base key we change the View to Base Key (=1)
- and move the White & Black sliders to get a nice basic key that fills the inside of the foreground
- and to get the detail key we change the View to Detail Key (=2)
- It seems that the green channel gives better edge detail than the luma channel so we unselect Luma
- and move the White & Black sliders to get a nice detail key for the edges of the foreground
- Now we can view our combined key by changing the View to Combined Key (=3)
- now we could adjust the combined key's white & black points but I think it looks just fine so we'll leave it as it is
- so we're happy with our key and change the View to Result (=4)
- then we adjust the Spill Removal Value so that all of the background color has disappeared from the edges of the foreground
- and adjust the "B" parameter so that the foreground color returns to normal
In Between:
- we add the background footage to the sequencer
- add the jMERGE plugin to the jKEY plugin and the background footage
jMERGE:
- we change the View to Foreground (=1)
- and try to find samples of a neutral highlight, midtone and shadow for the foreground color controls
- then we change the View to Background (=2)
- and again try to find samples of a neutral highlight, midtone and shadow for the background color controls
- finally we change the View to Result (=4)
- and select Correct to see the effect of the color correction
- then we can set the Edge Blend and Light Wrap Values to something and play with the other parameters :)
- and we're done!
General chromakey tips
- Lock your exposure! If your camera can do this then do it. Otherwise the automatic exposure control is going to move your blue/green background's exposure in and out, resulting in worse keys over time.
- Do a garbage matte! Doing a quick prekey around the foreground subject will help you a lot, especially if your green/blue screen isn't exactly Hollywood quality :)
- Use progressive video! Nothing's worse for keying than interlaced video. If your camera can't do progressive scan, at least deinterlace the footage in the sequencer by choosing FilterY in the strip properties.
- Background first! Have the background already done when you start shooting the foreground footage; this way you know better how to light the foreground, from what angles to shoot, etc.
- Take some distance! Get the foreground subject away from the green/blue screen to reduce spill. You can also get away with quite awful and uneven green/blue screens if you shoot from such a distance that you have to zoom in a bit. This widens the aperture of your camera and allows the screen to be defocused and thus smoothed.
Blender tips
- Set the render resolution to the foreground footage resolution to get rid of nasty interpolation artifacts.
- In Windows, to get DV or any type of video directly into Blender without converting to raw AVI, you can use Wax2 to frameserve the video in RGB24 format to Blender. Even when the FFMPEG addition allows DV to be imported directly into Blender, Wax2 is useful for rotoing garbage mattes etc.
- You can zoom the plugin property box with the + key to allow better slider control by dragging.
Optical Flow
This is a plugin for the Blender sequencer that implements the Horn & Schunck optical flow algorithm with a multi-resolution approach for better detection of larger motions. The plugin also has two examples of optical flow field usage: a blend between two frames & motion blur.
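The core of the Horn & Schunck method is an iterative per-pixel update: the flow is pulled toward the neighbourhood average (smoothness) and corrected along the image gradient (brightness constancy). One iteration step for a single pixel:

```python
def hs_update(ix, iy, it, u_avg, v_avg, alpha=1.0):
    """One Horn & Schunck iteration step for a single pixel.

    ix, iy: spatial image derivatives at the pixel
    it:     temporal derivative between the two frames
    u_avg, v_avg: flow averaged over the pixel's neighbourhood
    alpha:  smoothness weight (larger = smoother flow field)
    Returns the updated (u, v) flow for this pixel.
    """
    denom = alpha * alpha + ix * ix + iy * iy
    # How badly the averaged flow violates brightness constancy,
    # normalized by the gradient magnitude and the smoothness term.
    t = (ix * u_avg + iy * v_avg + it) / denom
    return u_avg - ix * t, v_avg - iy * t
```

The multi-resolution approach the plugin adds runs these iterations on downscaled copies of the frames first, so motions larger than a pixel or two can still be found.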
You can download the source code and precompiled Windows dll for the plugin here:
Optical flow is the motion of pixels from frame to frame. (more info @ Wikipedia)
So.. what does this plugin do actually?
Plugin setup
The plugin has to be applied to two copies of the same clip, one frame apart, as shown in the image. (Blender's current sequencer plugin system doesn't allow arbitrary frame access, so two inputs of the same sequence with differing times are needed.)
With nice settings (& with a bit of luck :) the optical flow field is obtained for every frame pair. The sequence in question here undergoes rather uniform motion, but there are still a few mistakes (so there's something to work on still :). A "flow key" is shown at the lower left corner for easier interpretation of the flow field.
Explanation
You have a video clip, right? And you want to get an image between 2 frames.. right? So what do you do..? Blend the frames with 50-50 alphas and get something like this image??!! Not good. But if you use the data provided by the optical flow plugin instead, your results can be like the next image.
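The difference between a plain 50-50 blend and a flow-based blend can be sketched in 1-D (nearest-neighbour sampling and my own names keep it short; a real implementation works in 2-D with interpolated sampling):

```python
def flow_blend_1d(frame_a, frame_b, flow, t=0.5):
    """Interpolate between two 1-D 'frames' at time t using per-pixel flow.

    flow[x] is the motion (in pixels) from frame A to frame B. The output
    pixel at x pulls from A at x - t*flow and from B at x + (1-t)*flow,
    then cross-fades. With flow = 0 everywhere this degenerates to the
    plain alpha blend.
    """
    n = len(frame_a)
    out = []
    for x in range(n):
        f = flow[x]
        xa = min(n - 1, max(0, round(x - t * f)))
        xb = min(n - 1, max(0, round(x + (1 - t) * f)))
        out.append((1 - t) * frame_a[xa] + t * frame_b[xb])
    return out
```

With a correct flow field, moving detail stays sharp in the in-between frame instead of ghosting as it does in the 50-50 blend.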
The image has errors in the areas the above flow field suggests, but otherwise it's rather good, eh?
You can also compare the above result with the real in-between image. (The original video was interlaced, optical flow was calculated from odd-fields-duplicated video, and the real in-between is the even field between the odd fields.)
This image shows the other possibility of the plugin: motion blur.
Image Based Lighting
IBL means recreating the lighting conditions of a real-world place from an HDR image. I wrote a Python script which takes the text output of HDR Shop's LightGen plugin and creates the appropriately coloured lamps in Blender.
[lightgen2blender1.1.py] - not sure if the script still works as the py-api might have changed considerably since I wrote this.
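A rough sketch of what such a script does, assuming LightGen emits one light per line as a direction vector plus an RGB intensity (the exact column layout of LightGen's output may differ, so check your file first; the Blender part is shown only as comments since the bpy API has changed over the years):

```python
def parse_lightgen(text):
    """Parse LightGen-style text output into (direction, rgb) pairs.

    Assumes each non-empty line holds six floats: dx dy dz r g b.
    Lines that don't fit this layout are skipped.
    """
    lamps = []
    for line in text.splitlines():
        parts = line.split()
        if len(parts) < 6:
            continue
        vals = [float(p) for p in parts[:6]]
        lamps.append((tuple(vals[:3]), tuple(vals[3:6])))
    return lamps

# In a current Blender one would then create a lamp per entry, roughly:
#   import bpy
#   for direction, rgb in parse_lightgen(open("lights.txt").read()):
#       light = bpy.data.lights.new("lightgen", type='SUN')
#       light.color = rgb
#       # ...create an object for the light and orient it along 'direction'
```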
jTrace
In late 2002 school started to feel boring and I started having a terrible urge to really learn to code. After many "hello worlds" this was the project that got me hooked on programming.
A relatively simple raytracer with multiple lamps with colour, intensity and range. Spheres & tris as objects with location, rotation, color and reflection properties. User defined camera location and supersampling for antialiasing. All parameters were read from a text file and output was done into bmp images.
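The heart of a raytracer like this is the ray-sphere intersection test; a minimal version (in Python rather than the original C++, names my own):

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Nearest ray-sphere intersection: solve |o + t*d - c|^2 = r^2.

    origin, direction, center: 3-tuples of floats; 'direction' is assumed
    to be normalized. Returns the smallest t > 0, or None on a miss.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    # Quadratic t^2 + b*t + c = 0 (the t^2 coefficient is 1 since |d| = 1).
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None
```

From the hit point you get the surface normal (hit minus center, normalized) for shading, and a mirrored direction for the reflection rays.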
Thinking back, it was slow as hell and not really that high tech at all, but oh the joys of getting proper camera mapping, shading, and reflections working after countless sleepless nights learning C++ and object-oriented programming :)