Post DOF vs Rendered DOF

It’s far more common to render 3D images without Depth-Of-Field (DOF), as the render is quicker and you keep some flexibility in compositing. That isn’t always the best choice, though: large ZBlurs within Nuke can take a heck of a long time to render. In fact, depending on your scene and your renderer, it’s often quicker to render DOF in 3D than it is to apply it as a post process in 2D.

Post DOF

  • Flexibility in compositing. The effect can be adjusted as needed.
  • Quicker to render.
  • Can be inaccurate – there are no physically based parameters (although this is largely dependent on the plugin used). The effect is driven by the artist.
  • Large blurs are slow to render.
  • Prone to artifacts. Can’t handle certain situations at all without lots of hackery.

Rendered DOF

  • No Flexibility in compositing.
  • Slower to render.
  • Accurate. Physically based parameters.
  • Requires more pixel samples in order to avoid noisy renders.
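
“Physically based parameters” here means the blur falls out of real camera settings rather than an artist-driven dial. As an illustration only (not 3Delight’s implementation), a minimal sketch of the thin-lens circle of confusion those parameters drive:

```python
def circle_of_confusion(focal_mm, fstop, focus_mm, subject_mm):
    """Thin-lens circle of confusion diameter (in mm) for a subject at
    subject_mm when the lens is focused at focus_mm."""
    aperture = focal_mm / fstop  # aperture diameter in mm
    return abs(aperture * (subject_mm - focus_mm) / subject_mm
               * focal_mm / (focus_mm - focal_mm))
```

Widening the aperture (lower f-stop) or moving the subject away from the focus plane grows the blur disc, which is exactly what the renderer’s extra pixel samples have to resolve.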

The following render was done at 2048×1556 and was rendered without any DOF. The total render took 83 seconds. The Depth AOV was rendered using a Zmin filter with a filterwidth of 1×1 in order to avoid anti-aliasing in the render.
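
The reason for the Zmin filter is worth spelling out: depth samples should never be blended across an edge, or you get depths that exist nowhere in the scene. A toy Python comparison (my own illustration):

```python
def filter_depth(samples, mode="zmin"):
    # A depth AOV should keep the nearest sample rather than average,
    # because an averaged depth across an edge is a made-up value.
    if mode == "zmin":
        return min(samples)
    return sum(samples) / len(samples)  # box-style averaging, for contrast

# Four sub-pixel depth samples straddling an edge: a near object at z=1
# in front of a background at z=100.
edge_samples = [1.0, 1.0, 100.0, 100.0]
```

With averaging, the pixel would report z=50.5 – a depth belonging to neither surface – and ZBlur would defocus it incorrectly; Zmin reports z=1.0.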


I also rendered the same image with DOF on.


Unfortunately my plan to show the difference between the two full-resolution images was put on hold by Nuke taking far too long to render the full-resolution ZBlur. I gave up after 10 minutes and decided to concentrate on a particular region instead.

Crop region used.

The following 1:1 crop demonstrates the difference between Post DOF and Rendered DOF.

Keep in mind that the Nuke time for the Post DOF was only for the crop area you’re seeing above – it was taking too long to render the full image with Post DOF. As you can see, the Post DOF breaks down quite heavily in some places. The rendered DOF image did take longer to render, but it’s much more accurate and its read time in Nuke is less than a second.

Observations…

  • The rendered DOF spent less time ray-tracing and more time sampling the image. This was due to the increased pixel samples needed to reduce noise, and the higher focus factor.
  • With pixel samples at 3×3 the DOF render took 57 seconds, faster than the 83 seconds it took to render the non-DOF version, although the final result was unacceptable. For less extreme blurs, pixel samples could be set as low as 6×6.
  • Focus Factor parameter in 3Delight helps speed up DOF renders by reducing the shading rate in areas of high blur with very little perceivable difference.
  • Despite some noise, the end result is much more visually pleasing than the ZBlur result in Nuke.


Colour Temperature in Maya

For a while I’ve wanted to implement colour temperature control into my lighting workflow but I’ve never been able to figure out how it’s calculated. Then I came across this site, which has already mapped out blackbody temperatures to normalised sRGB values.

Using this as a starting point I mapped out the values into a SL function…

color blackbodyfast( float temperature; )
{
	// 16 sRGB blackbody samples covering 1000K to 10000K in 600K steps.
	uniform color c[16] = 
		{
		(1,0.0401,0),(1,0.1718,0),(1,0.293,0.0257),(1,0.4195,0.1119),(1,0.5336,0.2301),
		(1,0.6354,0.3684),(1,0.7253,0.517),(1,0.8044,0.6685),(1,0.874,0.8179),(1,0.9254,0.9384),(0.929,0.9107,1),
		(0.8289,0.8527,1),(0.7531,0.8069,1),(0.6941,0.77,1),(0.6402,0.7352,1),(0.6033,0.7106,1)
		};
	// Remap 1000K-10000K to a 0-1 spline parameter.
	float amount = smoothstep ( 1000, 10000, temperature );
	// Catmull-rom needs the end knots duplicated to interpolate through them.
	color blackbody = spline ( "catmull-rom", amount, c[0],
		c[0],c[1],c[2],c[3],c[4],c[5],c[6],c[7],c[8],c[9],
		c[10],c[11],c[12],c[13],c[14],c[15],
		c[15]);
	return blackbody;
}

Rather than map every temperature value from 1000K to 40000K, I decided just to deal with 1000K to 10000K using the CIE 1964 10-degree Colour Matching Functions – chosen only because of the later date of 1964, as I couldn’t see (nor greatly understand) the difference between the colour matching functions. The original function I wrote, called blackbody, used every value of the kelvin scale from 1000K to 10000K, which resulted in an array of 90 values. The modified one above uses every 6th value, bringing the array size down to 16. In my tests I didn’t notice a speed difference using 90 values, but looking at a comparison of the two functions I couldn’t see enough visual difference to bother with the full 90 steps.
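
For readers without a RenderMan compiler to hand, here’s the same lookup sketched in Python – a hand-rolled Catmull-Rom evaluator standing in for SL’s spline() call, using the table values from the shader above:

```python
SRGB_BLACKBODY = [
    (1, 0.0401, 0), (1, 0.1718, 0), (1, 0.293, 0.0257), (1, 0.4195, 0.1119),
    (1, 0.5336, 0.2301), (1, 0.6354, 0.3684), (1, 0.7253, 0.517),
    (1, 0.8044, 0.6685), (1, 0.874, 0.8179), (1, 0.9254, 0.9384),
    (0.929, 0.9107, 1), (0.8289, 0.8527, 1), (0.7531, 0.8069, 1),
    (0.6941, 0.77, 1), (0.6402, 0.7352, 1), (0.6033, 0.7106, 1),
]

def smoothstep(a, b, x):
    t = max(0.0, min(1.0, (x - a) / (b - a)))
    return t * t * (3.0 - 2.0 * t)

def catmull_rom(knots, u):
    # Mimics SL's spline(): the caller duplicates the first and last
    # knots, leaving len(knots) - 3 curve segments.
    nseg = len(knots) - 3
    s = min(max(u, 0.0), 1.0) * nseg
    i = min(int(s), nseg - 1)
    t = s - i
    return tuple(
        0.5 * (2 * b + (-a + c) * t + (2 * a - 5 * b + 4 * c - d) * t * t
               + (-a + 3 * b - 3 * c + d) * t ** 3)
        for a, b, c, d in zip(knots[i], knots[i + 1], knots[i + 2], knots[i + 3]))

def blackbody_fast(temperature):
    # Duplicate the endpoints, as the SL version does.
    knots = [SRGB_BLACKBODY[0]] + SRGB_BLACKBODY + [SRGB_BLACKBODY[-1]]
    return catmull_rom(knots, smoothstep(1000.0, 10000.0, temperature))
```

At 1000K this returns the first table entry (1, 0.0401, 0) and at 10000K the last (0.6033, 0.7106, 1), matching the clamped behaviour of the shader.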

Blackbody temperature comparison in sRGB. Temperature is mapped to T coordinate.

There is a slight peak where the warm and cool colours meet in the 90 step version. It’s a bit more obvious looking at the image in linear light.

Blackbody temperature comparison in Linear. Temperature is mapped to T coordinate.

Because the values are in sRGB, they need to be converted to Linear before getting used in the shader. The SL used in the main body of my test surface looks something like this…

#include "colour.h"

surface blackbody_srf(
	uniform float temperature = 5600;
#pragma annotation temperature "gadgettype=intslider;min=1000;max=10000;step=100;label=Temperature;"
)
{
	color blackbody = blackbodyfast (temperature);
	blackbody = sRGB_decode(blackbody);
	Oi = Os;
	Ci = blackbody * Oi;
}
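
sRGB_decode comes from the colour.h header included above, which isn’t shown; presumably it applies the standard sRGB-to-linear transfer function. A Python sketch of that conversion, for reference:

```python
def srgb_decode(c):
    # Standard sRGB -> linear transfer function (IEC 61966-2-1),
    # applied to a single channel value in 0-1.
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

def srgb_decode_colour(rgb):
    # Decode each channel of an (r, g, b) tuple.
    return tuple(srgb_decode(ch) for ch in rgb)
```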

Used in a light shader the output looks something like this…

Blackbody temperature. Light intensity is the same throughout. sRGB.

The only problem now is that 3Delight doesn’t show a preview of light shader or more importantly the colour temperature in the AE settings for my light.

To get around this I decided to implement an expression which changes the colour of the Maya light that my 3Delight shader is attached to. Because MEL doesn’t have a spline function like SL does, I had to improvise using animation curves. First up, the MEL to create the three curves needed to build the RGB colour temperature.

$red = `createNode animCurveTU`;
$green = `createNode animCurveTU`;
$blue = `createNode animCurveTU`;

setKeyframe -itt "spline" -ott "spline" -t 1 -v 1 $red ;
setKeyframe -itt "spline" -ott "spline" -t 10 -v 1 $red ;
setKeyframe -itt "spline" -ott "spline" -t 11 -v 0.929 $red ;
setKeyframe -itt "spline" -ott "spline" -t 12 -v 0.8289 $red ;
setKeyframe -itt "spline" -ott "spline" -t 13 -v 0.7531 $red ;
setKeyframe -itt "spline" -ott "spline" -t 14 -v 0.6941 $red ;
setKeyframe -itt "spline" -ott "spline" -t 15 -v 0.6402 $red ;
setKeyframe -itt "spline" -ott "spline" -t 16 -v 0.6033 $red ;

setKeyframe -itt "spline" -ott "spline" -t 1 -v 0.0401 $green;
setKeyframe -itt "spline" -ott "spline" -t 2 -v 0.172 $green;
setKeyframe -itt "spline" -ott "spline" -t 3 -v 0.293 $green;
setKeyframe -itt "spline" -ott "spline" -t 4 -v 0.4195 $green;
setKeyframe -itt "spline" -ott "spline" -t 5 -v 0.5336 $green;
setKeyframe -itt "spline" -ott "spline" -t 6 -v 0.6354 $green;
setKeyframe -itt "spline" -ott "spline" -t 7 -v 0.7253 $green;
setKeyframe -itt "spline" -ott "spline" -t 8 -v 0.8044 $green;
setKeyframe -itt "spline" -ott "spline" -t 9 -v 0.874 $green;
setKeyframe -itt "spline" -ott "spline" -t 10 -v 0.9254 $green;
setKeyframe -itt "spline" -ott "spline" -t 11 -v 0.9107 $green;
setKeyframe -itt "spline" -ott "spline" -t 12 -v 0.8527 $green;
setKeyframe -itt "spline" -ott "spline" -t 13 -v 0.8069 $green;
setKeyframe -itt "spline" -ott "spline" -t 14 -v 0.77 $green;
setKeyframe -itt "spline" -ott "spline" -t 15 -v 0.7352 $green;
setKeyframe -itt "spline" -ott "spline" -t 16 -v 0.7106 $green;

setKeyframe -itt "spline" -ott "spline" -t 2 -v 0 $blue;
setKeyframe -itt "spline" -ott "spline" -t 3 -v 0.0257 $blue;
setKeyframe -itt "spline" -ott "spline" -t 4 -v 0.1119 $blue;
setKeyframe -itt "spline" -ott "spline" -t 5 -v 0.2301 $blue;
setKeyframe -itt "spline" -ott "spline" -t 6 -v 0.3684 $blue;
setKeyframe -itt "spline" -ott "spline" -t 7 -v 0.517 $blue;
setKeyframe -itt "spline" -ott "spline" -t 8 -v 0.6685 $blue;
setKeyframe -itt "spline" -ott "spline" -t 9 -v 0.8179 $blue;
setKeyframe -itt "spline" -ott "spline" -t 11 -v 1 $blue;

rename $red "colourTemperatureRed";
rename $green "colourTemperatureGreen";
rename $blue "colourTemperatureBlue";
The resulting animation curves.

Then the next stage was to create an expression which linked the outputted colour temperature to the light colour.

float $r, $g, $b;
if (will_point_lgt1.colourType > 0)
{
	$temp = will_point_lgt1.temperature;
	$amount = `smoothstep 1000 10000 $temp`;
	$c = 1 + (15 * $amount); // keys run from time 1 to 16
	$r = `getAttr -t $c colourTemperatureRed.output`;
	$g = `getAttr -t $c colourTemperatureGreen.output`;
	$b = `getAttr -t $c colourTemperatureBlue.output`;
}else{
	$r = will_point_lgt1.lightColourR;
	$g = will_point_lgt1.lightColourG;
	$b = will_point_lgt1.lightColourB;
}
point_lgtShape.colorR = $r;
point_lgtShape.colorG = $g;
point_lgtShape.colorB = $b;
Previewing the light inside Maya. The Maya-specific settings of this light are ignored in the final render.
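
The temperature-to-curve-time remapping in the expression above can be sanity-checked in isolation. A Python sketch of it (the 1–16 key range matches the animCurves created earlier):

```python
def temperature_to_key_time(temperature, first_key=1.0, last_key=16.0):
    # smoothstep(1000, 10000, temp) remaps kelvin to 0-1; that amount
    # then has to land on the animCurve's keyed time range.
    t = max(0.0, min(1.0, (temperature - 1000.0) / 9000.0))
    amount = t * t * (3.0 - 2.0 * t)
    return first_key + amount * (last_key - first_key)
```

1000K lands on the first key, 10000K on the last, and the smoothstep eases the response near both ends of the range.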

OpenEXR and 3Delight

The OpenEXR format has a number of features which are super handy for CG animation and VFX, such as saving the image data in either half or full floating point, setting data-windows and adding additional metadata to the file. 3Delight for Maya allows you to use all these features, but doesn’t cover how to use them in the documentation (at least I couldn’t find mention of it).

In order to gain access to these features you need to add an extra attribute called “exrDisplayParameters” to the render pass. In the example below, my render pass is called “beauty”.

addAttr -dt "string" -ln exrDisplayParameters beauty;
setAttr "beauty.exrDisplayParameters" -type "string"
"-p \"compression\" \"string\" \"zip\" -p \"autocrop\" \"integer\" \"1\" ";

The string attribute should end up looking like so in the attribute editor…

-p "compression" "string" "zip" -p "autocrop" "integer" "1"

The above sets the compression type to zip and also tells 3Delight to autocrop the image when it’s rendered. Auto-crop shrinks the bounding box of the data-window (or ROI, region-of-interest) so that it only contains non-black pixels (I believe it does this based on the alpha channel). This allows Nuke to process the image quicker, as it only calculates information within that data-window. See this tutorial on Nuke Bounding Boxes and how to speed up your compositing operations.
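
To make the autocrop behaviour concrete, here’s a toy sketch of how such a data-window might be computed from the alpha channel – my guess at the mechanism, not 3Delight’s actual code:

```python
def autocrop_window(alpha):
    # alpha: a 2D list of per-pixel alpha values.
    # Returns (xmin, ymin, xmax, ymax) of the smallest window that
    # contains every pixel with non-zero alpha, or None if all empty.
    coords = [(x, y) for y, row in enumerate(alpha)
              for x, a in enumerate(row) if a > 0.0]
    if not coords:
        return None
    xs = [x for x, _ in coords]
    ys = [y for _, y in coords]
    return (min(xs), min(ys), max(xs), max(ys))
```

Anything outside that window never has to be touched by downstream compositing operations, which is where the Nuke speed-up comes from.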

The basic syntax of the parameter string is easy enough to understand, the three arguments passed to the -p flag are name, type and value.

-p "[name]" "[type]" "[value]"

You can also add additional metadata to the header of the EXR render. For example you may wish to include things such as

  • Project, scene and shot information.
  • Characters or creatures in the shot.
  • Model, texture, animation versions used.
  • Maya scene used to render the shot.
  • Focal length, Fstop, Shutter Angle, Filmback size.

3Delight already includes some metadata with the EXR, so you don’t need to add information for the following…

  • Near and far clipping planes.
  • WorldToCamera and WorldToNDC matrices. The Nuke Python documentation has info on how you can use this data to create cameras in Nuke.

You can add this metadata using the “exrheader_” prefix and then the name of your attribute. The following will add three metadata attributes called “shutter”, “haperture” and “vaperture”.

-p "exrheader_shutter" "float" "180" -p "exrheader_haperture" "float" "36" -p "exrheader_vaperture" "float" "24"

While the following will add the project name “ussp” and the maya scene name that was used to render the shot…

-p "exrheader_project" "string" "ussp" -p "exrheader_renderscene" "string" "h:/ussp/bes_0001/scenes/lighting_v01.ma"

The easiest way to get information from your scene to this parameter pass is to set up a Pre-Render MEL script in your render-pass along the lines of…

string $sceneName = `file -q -sn`; //Grab the name of the current scene.
string $projectName = `getenv "PROJECT"`; //This assumes you have an environment variable called "PROJECT" with the project name setup already.
string $parameters = "";
$parameters += (" -p \"exrheader_renderScene\" \"string\" \"" +  $sceneName + "\" ");
$parameters += (" -p \"exrheader_projectName\" \"string\" \"" + $projectName + "\" ");
setAttr ($pass + ".exrDisplayParameters") -type "string" $parameters;

See the 3Delight documentation for more information on what types of metadata you can add to the EXR.

Custom Shader UI in 3Delight

When you create a custom SL shader in 3Delight, it’ll automatically create a shader which looks like this in Maya. The following UI doesn’t look very useful – the names we’ve called our variables vary in how descriptive they are – which isn’t very helpful if others are going to be using this shader.

This is based on a shader with the following SL code.

surface ui_example_srf
(
	string texmap = "";
	float blur = 0;
	float usebake = 1;
	float numsamples = 16;
	float doRefl = 0;
	color diffuseColour = color (0.5);
)
{
	// SHADER DOESN'T DO ANYTHING //
}

3Delight does however provide a method of creating nice-looking shader UIs. You can use #pragma annotations in your shader source code to describe how each parameter should be presented.

#pragma annotation texmap "gadgettype=inputfile;label=Texture Map;hint=Texture Map"
#pragma annotation blur "gadgettype=floatslider;label=Blur;min=0;max=1;hint=Blur the Texture Map"
#pragma annotation usebake "gadgettype=checkbox;label=Use Bake;hint=Use Bake"
#pragma annotation numsamples "gadgettype=intslider;min=1;max=256;label=Samples;hint=Number of samples to use."
#pragma annotation doRefl "gadgettype=optionmenu:gather-env:occlusion-env:ptc-env;label=Reflection Method;hint=Reflection Method"
#pragma annotation diffuseColour "gadgettype=colorslider;label=Diffuse Colour;hint=Diffuse Colour."

This will create a shader that looks like this. The hint will be displayed either in the Maya status line or as a tool-tip if you hover the cursor over the UI element.
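
The annotation value is just a semicolon-separated list of key=value fields. A quick Python sketch of parsing one such string, useful if you ever generate these annotations from a tool:

```python
def parse_annotation(annotation):
    # Split a "gadgettype=floatslider;label=Blur;min=0;max=1" style
    # annotation string into a dict of UI hints.
    fields = {}
    for field in annotation.split(";"):
        if field:
            key, _, value = field.partition("=")
            fields[key] = value
    return fields
```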

You can place the #pragma lines anywhere in your SL file. To see them you will need to re-compile the shader, then reload it inside Maya by right-clicking on the shader, selecting “reload shader”, and then selecting the shader in either the Assignment Panel or the Outliner.

Custom Passes in 3Delight

Quite often in CG, when working with multiple passes, you want to do different things depending on what type of pass you’re rendering. The two more common ways to do this are to either create two different shaders and change the assignment for each pass, or to write a shader which is able to figure out what type of pass it is and act appropriately.

For example, in a bake pass you typically want to bake out attributes which aren’t dependent on the position or direction of the camera – this includes things like diffuse/ambient shading, subsurface scattering and ambient occlusion data – which can all be baked out and reused on objects. Only when the position or direction of the objects or lights changes does the scene need to be re-baked.

In order to control this in 3Delight you can use the RiMel commands to set up custom RIB commands which can be read by your shaders. The following MEL commands are placed inside a PreWorldMEL attribute on the render pass itself.

RiOption -n "user" -p "pass" "string" "bake";

RiAttribute -n "cull" -p "hidden" "integer" "0";
RiAttribute -n "cull" -p "backfacing" "integer" "0";
RiAttribute -n "dice" -p "rasterorient" "integer" "0";

This will output the following commands into the RIB file.

Option "user" "string pass" [ "bake" ]
Attribute "cull" "integer hidden" [ 0 ]
Attribute "cull" "integer backfacing" [ 0 ]
Attribute "dice" "integer rasterorient" [ 0 ]

Then within the shader I can query what pass I’m currently rendering and do something appropriate for that type of pass.

uniform string passtype = "";
option( "user:pass", passtype );
if (passtype == "bake")
{
    // Do something here
}

By default 3Delight also exports the name of the render pass to an attribute called delight_renderpass_name – this is the name of your render pass inside Maya. You can query that name using the following RSL. Obviously this method depends heavily on what you call your passes – the following shading code wouldn’t work if the render pass was called anything other than “bake” – for example if you wanted multiple bake passes within the same scene, like separating animated bakes (which require multiple frames) from static bakes (things which don’t move and can be stored in one frame).

uniform string passtype = "";
option ("user:delight_renderpass_name",passtype);
if (passtype == "bake")
{
	// Do something here	
}
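
One way around that exact-name dependence is to match on a naming convention instead of a single pass name. A hypothetical convention, sketched in Python:

```python
def is_bake_pass(pass_name):
    # Treat any pass following a "bake" naming convention
    # (bake, bake_static, bake_anim, ...) as a bake pass, so the
    # shader logic isn't tied to one exact render pass name.
    return pass_name == "bake" or pass_name.startswith("bake_")
```

The equivalent prefix test in the shader would let bake_static and bake_anim passes share the same baking branch without any reassignment.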