Linux for CG – Setting Up The Command Line

The command line (or Unix shell) seems like one of the most archaic ways possible to interact with a computer file-system. However, most animation and VFX studios require that artists know the basics of interacting with the command line.

There are different types of command lines available, two of the more common are bash and tcsh.

Setting up the tcsh prompt

I primarily use tcsh, so I’ll start with setting it up so it’s more user-friendly. When you start using the default command line, it normally looks pretty dull…

I have my shell set up like so…

It contains a bit more information: my username (will), hostname (will-A8HE), the time and date, as well as the directory I’m currently in. I’ve also put the prompt (>) on a new line, which gives me more real estate when dealing with long paths or filenames.

You can control this behaviour by setting up a .cshrc file (also sometimes called .mycshrc). This is a text file which stores information about how you want your shell set up. The first part of my .cshrc looks like this…

# Colors! This makes it easier to edit the colours on the prompt.
set     red="%{\033[1;31m%}"
set   green="%{\033[0;32m%}"
set  yellow="%{\033[1;33m%}"
set    blue="%{\033[1;34m%}"
set magenta="%{\033[1;35m%}"
set    cyan="%{\033[1;36m%}"
set   white="%{\033[0;37m%}"
set     end="%{\033[0m%}" # This is needed at the end... 🙁

set prompt="\n%B${green}`whoami`@`hostname`${white}%b %P %D/%W/%y ${blue}%// \n ${white}> ${end}"

# Clean up after ourselves...
unset red green yellow blue magenta cyan white end

This sets up the prompt, and because setting up colours is quite awkward, we’ve set some variables to make things easier. The hash (#) is used to add comments to the .cshrc file; commented lines are ignored by the shell.

The set prompt line can be broken down as follows..

  • \n Start a newline.
  • %B Make the text bold.
  • ${green} Make the text green.
  • `whoami` List my username.
  • `hostname` List my hostname (computer name).
  • ${white} Make the text white.
  • %b Make the text not-bold.
  • %P Time in 24 hour format.
  • %D Date.
  • %W Month.
  • %y Year.
  • ${blue} Make the text blue.
  • %/ The current working directory.
  • \n Start a newline.
  • ${white} Make the text white.
  • ${end} End colour.


For a more complete listing of all the ways you can modify the prompt, see this page. Next up, a few variables to set how the shell behaves.

set filec
set autolist
set color
set colorcat
set nobeep

The first of these (filec) enables filename completion, (autolist) lists the possible options for filename completion, and (nobeep) stops the command line beeping at you. If you start typing the name of a file and press tab, it either completes the name of the file or lists what options you have available. For example, if there is only one file in the current directory which starts with ‘a’ and you press ‘a’ then ‘tab’, it will complete the filename for you – saving you typing. If you have two filenames starting with ‘a’, like ‘andy’ and ‘andrew’, it will complete ‘and’ and list both filenames; press ‘y’ then ‘tab’ and it’ll complete ‘andy’, press ‘r’ then ‘tab’ and it will complete ‘andrew’.
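The completion behaviour described above can be sketched in Python – a minimal illustration of the matching logic, not how tcsh actually implements it:

```python
import os.path

def complete(prefix, names):
    """Extend the prefix as far as it is unambiguous (like filec) and
    return the remaining candidates (like autolist)."""
    matches = [n for n in names if n.startswith(prefix)]
    if not matches:
        return prefix, []
    return os.path.commonprefix(matches), matches

# 'a' + tab extends to 'and' and lists both candidates...
print(complete("a", ["andy", "andrew"]))     # ('and', ['andy', 'andrew'])
# ...while 'andy' is unambiguous and completes fully.
print(complete("andy", ["andy", "andrew"]))  # ('andy', ['andy'])
```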


The alias command can be used to set up custom commands. For example, I often set up commands like ‘myren’ or ‘myscripts’ to quickly jump to my render folders or my mel/python scripts folder.

alias ll 'ls -lh --color=auto'
alias lr 'ls -ltrh --color=auto'
alias duho 'du -h --max-depth 1'
alias resource 'source ~/.cshrc'
alias remwhite 'rename "y/ /_/" *'
alias copy 'rsync --progress'
alias mydev 'cd ~/Develop'
alias myren 'cd /prj/$PROJECT/$SHOT/maya/renders'

The first two list the contents of the directory: (ll) lists them in alphabetical order, (lr) lists them by time with the most recently modified file at the bottom. (duho) reports the disk usage of the current directory. (resource) re-reads the .cshrc so the shell picks up any changes or new aliases I’ve added. (remwhite) removes whitespace from any file names in the current directory. (copy) is similar to (cp) in that it copies files from one place to another, however rsync gives you a progress meter so you know how long the copy is going to take. (mydev) changes to my local Develop directory, while (myren) relies on environment variables called $PROJECT and $SHOT being set; this lets me change to the correct render folder depending on what project and shot I happen to be working on.
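As an aside, the remwhite behaviour is easy to replicate in Python if you ever need it inside a pipeline tool rather than the shell – a rough sketch:

```python
import os

def remwhite(directory="."):
    """Replace spaces with underscores in the file names of a
    directory, like the remwhite alias (rename "y/ /_/" *)."""
    for name in os.listdir(directory):
        if " " in name:
            os.rename(os.path.join(directory, name),
                      os.path.join(directory, name.replace(" ", "_")))
```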

Most studios have their own environment variables set up; you can list all the environment variables available using the (env) command in the shell. To set environment variables you use (setenv); how long they persist depends on where you set them. In my .cshrc I have these set up to affect any shell I open.

# Setup env variables for OpenEXR viewers.
setenv CTL_DISPLAY_CHROMATICITIES "red 0.6400 0.3300 green 0.3000 0.6000 blue 0.1500 0.0600 white 0.3127 0.3290"

If you want to affect just the current shell you’re in, you can type these out or use aliases to set variables.

# Set which project and shot I'm working on.
alias setshot 'setenv PROJECT \!:1; setenv SHOT \!:2;'

(\!:1) and (\!:2) are command-line arguments passed to the alias. Now when I type…

setshot myproj myshot

I now have two env variables called $PROJECT and $SHOT that my (myren) alias can reference.

# So this command...
myren
# Is the same as running...
cd /prj/myproj/myshot/maya/renders
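The same environment variables can be picked up from a pipeline script – a hypothetical Python helper mirroring what the myren alias does:

```python
import os

def render_path():
    """Build the path the myren alias points at from the $PROJECT
    and $SHOT environment variables that setshot defines."""
    return "/prj/{}/{}/maya/renders".format(os.environ["PROJECT"],
                                            os.environ["SHOT"])

os.environ["PROJECT"] = "myproj"
os.environ["SHOT"] = "myshot"
print(render_path())  # /prj/myproj/myshot/maya/renders
```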

My final .cshrc file looks something like this…

# Colors!
set     red="%{\033[1;31m%}"
set   green="%{\033[0;32m%}"
set  yellow="%{\033[1;33m%}"
set    blue="%{\033[1;34m%}"
set magenta="%{\033[1;35m%}"
set    cyan="%{\033[1;36m%}"
set   white="%{\033[0;37m%}"
set     end="%{\033[0m%}" # This is needed at the end... 🙁

# Set up the prompt...
set prompt="\n%B${green}`whoami`@`hostname -s`${white}%b %P %D/%W/%y ${blue}%// \n ${white}> ${end}"

# Clean up after ourselves...
unset red green yellow blue magenta cyan white end

# Set shell behaviour...
set filec
set autolist
set color
set colorcat
set nobeep

# Setup aliases...
alias ll 'ls -lh --color=auto'
alias lr 'ls -ltrh --color=auto'
alias duho 'du -h --max-depth 1'
alias resource 'source ~/.cshrc'
alias remwhite 'rename "y/ /_/" *'
alias copy 'rsync --progress'

# Setup environment variables.
setenv CTL_DISPLAY_CHROMATICITIES "red 0.6400 0.3300 green 0.3000 0.6000 blue 0.1500 0.0600 white 0.3127 0.3290"

# Setup alias for which Project and Shot I'm working on.
alias setshot 'setenv PROJECT \!:1; setenv SHOT \!:2;'

Linux for CG – APT

APT (Advanced Packaging Tool) is a collection of tools to install and manage software packages on Debian and its various offshoots (for example Knoppix and Ubuntu). Much like installing software on other operating systems, a package can contain multiple executables (programs).

Some useful packages for CG work include openexr, which is used as the example below. In order to install a package, these are the typical steps one needs to go through.

Note that APT requires superuser (administrator) privileges, which means you have to prefix commands with sudo. If you’re an artist in a large CG studio it’s likely you won’t be a superuser, so these instructions really only apply if you’re setting up Linux at home or in a small studio environment.

The first thing you need to do is get an up-to-date list of all the available packages from the internet.

> sudo apt-get update

The next step is to search for the names of the packages you’re interested in. APT allows you to do simple word searches for packages (searching doesn’t require superuser privileges)…

> apt-cache search [package]

Once you’ve found the package you’re after…

> sudo apt-get install [package]
ie... sudo apt-get install openexr

If the package exists, it will be downloaded and installed on your computer, ready to go – no need to restart the computer, although sometimes you may need to open a new shell to pick up any changes. Once that’s done, you can upgrade any existing packages on your system using…

> sudo apt-get upgrade

As an illustrated example, to install the openexr commands you’d run the following series of commands…

> sudo apt-get update
> apt-cache search openexr
> sudo apt-get install openexr

To check that the OpenEXR commands installed correctly, try running either exrmaketiled or exrstdattr on the command line.
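The same check can be scripted with Python’s standard library – shutil.which does the $PATH lookup the shell does (whether the tools are found of course depends on your install):

```python
import shutil

# shutil.which returns the full path of a command if it is on $PATH,
# or None if it isn't -- a scripted version of the manual check above.
for tool in ("exrmaketiled", "exrstdattr"):
    path = shutil.which(tool)
    print("%s: %s" % (tool, path if path else "not found"))
```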




OSL – BlackBody Colour

This node outputs a normalised colour given an input temperature in degrees Kelvin. This can be used as an input for lights to give the effect of different colour temperatures.

Download Here

BlackBodyColour Controls
  • Temperature: Temperature in degrees Kelvin.


OSL – Constant Surface

This shading node acts as a constant surface shader. It can be used to output flat colour without any shading (diffuse, specular, etc) on an object.

Download Here

Constant Controls
  • EmissionOn: When set to 0 it will only compute the InColour for camera rays. This prevents it from bleeding light onto surrounding surfaces. When set to 1 it will behave as OSL intended and act as an emissive light, which bleeds light onto surrounding surfaces.
  • InColour: The input colour. Can be constant or texture.

Note that when EmissionOn is set to 1 it will take a lot longer to compute than when EmissionOn is set to 0.

OSL – ColourSwitch

This shading node does a binary switch between two input colours. Potentially useful if you want to test out two different textures or colour settings and don’t wish to rewire your shading network.

Download Here

ColourSwitch Controls
  • Which: Which colour input to use. 0 = InColour0 and 1 = InColour1.
  • InColour0: Input Colour 0
  • InColour1: Input Colour 1


OSL – Exposure

This shading node applies an exposure adjustment to the input colour. This allows you to manipulate the colour using photographic stops.

Download Here

Exposure Controls
  • Active: Disable the effect by setting this to 0. This allows you to disable the adjustment without having to rewire your shading network.
  • InColour: The input colour. This can be any type of colour (texture, constant, etc)
  • Exposure: Measured in stops. Each stop will increase (double) or decrease (halve) the brightness.
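The stop arithmetic is simple – a sketch of the adjustment such a node applies (the node’s exact implementation may differ):

```python
def apply_exposure(colour, stops):
    """Scale each channel by 2**stops: +1 stop doubles the values,
    -1 stop halves them."""
    gain = 2.0 ** stops
    return [c * gain for c in colour]

print(apply_exposure([0.2, 0.4, 0.1], 1))   # [0.4, 0.8, 0.2]
print(apply_exposure([0.2, 0.4, 0.1], -1))  # [0.1, 0.2, 0.05]
```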

Post DOF vs Rendered DOF

It’s far more common to render 3D images without Depth-Of-Field (DOF), as it renders quicker and offers some flexibility in compositing. However that isn’t always the best choice, as large ZBlurs within Nuke can take a heck of a long time to render. In fact, depending on your scene and your renderer, it’s often quicker to render DOF in 3D than it is to apply it as a post process in 2D.

Post DOF

  • Flexibility in compositing. Can adjust effect as needed.
  • Quicker to render.
  • Can be inaccurate – no physically based parameters (although this is largely dependent on the plugin used). The effect is driven by the artist.
  • Large blurs are slow to render.
  • Prone to artifacts. Can’t handle certain situations at all without lots of hackery.

Rendered DOF

  • No Flexibility in compositing.
  • Slower to render.
  • Accurate. Physically based parameters.
  • Requires more pixel samples in order to avoid noisy renders.

The following render was done at 2048×1556 and was rendered without any DOF. The total render took 83 seconds. The Depth AOV was rendered using a Zmin filter with a filterwidth of 1×1 in order to avoid anti-aliasing in the render.

Click to see full resolution.

I also rendered the same image with DOF on.

Click to see full resolution.

Unfortunately my plan to show the difference between the two full resolution images was put on hold by Nuke taking far too long to render the full resolution ZBlur. I gave up after 10 minutes so decided to concentrate on a particular region.

Crop region used.

The following 1:1 crop demonstrates the difference between Post DOF and Rendered DOF.

Keep in mind that the Nuke time for the Post DOF was only for the crop area you’re seeing above – it was taking too long to render the full image with Post DOF. As you can see, the Post DOF breaks down quite heavily in some places; while the rendered DOF image did take longer to render, it’s much more accurate and the read time of the image is less than a second in Nuke.


  • The rendered DOF spent less time ray-tracing and more time sampling the image. This was due to the increase in pixel samples needed to get less noisy results, and the higher focus factor.
  • With pixel samples at 3×3 the DOF render took 57 seconds – faster than the 83 seconds it took to render the non-DOF version – although the final result was unacceptable. For less extreme blurs, pixel samples could be set as low as 6×6.
  • Focus Factor parameter in 3Delight helps speed up DOF renders by reducing the shading rate in areas of high blur with very little perceivable difference.
  • Despite some noise, the end result is much more visually pleasing than the ZBlur result in Nuke.



Salamander Sculpt

After many years I’ve finally bitten the bullet and decided to learn ZBrush in earnest. This is the first of my sculpting endeavours…

Blocking out the main features. I’m not entirely sure on the nostrils at the moment, which is why they’re rather faint.

I’m referring to it as a salamander at the moment, but so far I’ve primarily been working without reference, so it’s kind of morphed into a hippo-lizard hybrid.

Head of a hippo, body of a lizard. Let’s say I was originally aiming for a creature which was more gelatinous than well-formed. 😉


A Recipe for Creating Environment Maps for Image Based Lighting

This is the recipe I use for creating environment maps for use in image based lighting. While the example I’m going to use specifically involves a chrome ball, a lot of this also applies to environment maps captured by taking panoramic photos.

Goals and Flow

The two main goals of this technique are to…

  1. Maintain consistent and high-quality results.
  2. Make things as easy and automated as possible.

The first goal requires that we use image formats which allow floating-point colours and image processing techniques that degrade the image as little as possible.

In terms of the balance between consistency and quality, I’d prefer to sacrifice quality in order to maintain consistency – this mainly becomes a problem when dealing with colour-spaces.

The second goal is to make things as uncomplicated and simple as possible. It’d also be nice to make as much of this as automated as possible so that large batches of images can be processed with minimal fuss.

If I was a bit more sorted, my workflow would look something like this: the raw image gets converted into a working image, and that gets converted into whatever output format I’m aiming for.

Ideal workflow.

However, I’m not entirely keen on bringing raw images directly into Nuke at the moment, primarily because I’m not happy with the results, so I’ve added an additional step to the process. This involves converting the raw image to an intermediate image, which at this stage means exporting the image as a 16-bit TIF with a gamma-encoded colour-space.

Current workflow.

So that means we’re aiming to use formats like OpenEXR, or if push comes to shove, 16-bit TIF. We’re also going to try to keep any colourspace conversions or resampling of the images to a bare minimum.

The Ingredients

  • Adobe Lightroom – This is my personal preference, but you’re probably able to get similar (or perhaps even better) results using other raw converters.
  • The Foundry’s Nuke – This works well with processing large batches of images and has good colour support. It also has a handy little node for converting mirror ball images into lat-long images.
  • J_Ops for Nuke – Primarily for the J_MergeHDR node, but it also contains J_rawReader, which allows you to read camera raw images within Nuke.

Preparing in Lightroom

The first task after importing your images is to zero out any default tonal adjustments made by Lightroom; for this I apply the General – Zeroed preset in the Develop module.

Zeroed preset applied in Lightroom.

From here I export with the following settings…

  • Format: TIF
  • Compression: None
  • Colourspace: sRGB
  • Bit-Depth: 16bits per component
  • Image resizing: None

With regards to the colourspace, I’ve chosen sRGB because it’s the easiest colourspace to deal with. Ideally I’d like to use ProPhoto as it has a larger colour gamut, but I’m still working on the finer details of using ProPhoto within Nuke.

Hopefully the ACES colour-space will become more common in the future as it has a much larger colour gamut and is linear, but at this stage software support for it is limited.

In Nuke

Once you’ve brought in all the images you exported from Lightroom, the first thing to do is crop the image to the boundaries of the chrome ball. It’s best to get the crop as tight as possible.

Cropping in Nuke, the radial node is used to visualize the crop by overlaying a semi-transparent circle on top.

I use a radial node in order to visualise the crop and make sure things are lining up. You can also copy the settings from the radial node onto the crop node.

You can copy values by clicking and dragging from one curve icon to another.

A couple of little tips here. The first is to use whole pixel values (ie… 2350) for your crop rather than sub-pixel values (ie… 2350.4). The reason is that Nuke will resample the image if you use sub-pixel values – and if you’re not careful when resampling an image you can lose quality and introduce either softening or sharpening.

The second tip is for maintaining a perfect square when cropping. Click in the area.y attribute on the radial node and press the = key. In the expression editor that pops up, enter…

area.t - (area.r - area.x)

Now when you adjust the top and side edges, the bottom edge will adjust itself automatically so that it maintains a square 1:1 ratio.
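In plain numbers, the expression just subtracts the crop’s width from its top edge – for example:

```python
def square_bottom(area_t, area_r, area_x):
    """The Nuke expression area.t - (area.r - area.x): place the
    bottom edge so the crop's height equals its width."""
    return area_t - (area_r - area_x)

# A crop 200 px wide (x=100, r=300) with its top at t=500 gets its
# bottom edge at y=300, giving a square 200x200 region.
print(square_bottom(500, 300, 100))  # 300
```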

Merging into an HDR image

Once I’ve set up the crop on one image, it’s just a matter of copying the same crop node onto all the other images and plugging all of those into a J_MergeHDR node.

Cropped chrome ball images plugged into a MergeHDR node. Click for larger image.
MergeHDR node settings.

The first thing to do is click on the Get Source Metadata button to read the EXIF information off the images. The second thing to do is to set the target EV. You can either do this by setting the target ISO, Aperture and Shutter settings or by clicking on the EV Input checkbox and then manually setting a target EV value (I’ve set it to 12 in the above image).

Using the EV values we can also match exposures between images shot with different ISO, Aperture and Shutter settings.

The EV values can be used to match exposures on two images shot with different ISO, Aperture and Shutter settings.

In the example above we can use the difference between the two EV values (5.614 and 10.614) in order to match the exposure on one to the other. The difference between the two is approximately 5 stops (10.614 – 5.614 = 5), so if we apply an exposure node to the brighter image and set it to -5 stops, we can get a pretty good exposure match between two images. Although the example below is perhaps a bit extreme – as there are plenty of clipped values – in certain areas the exposures match up pretty well.

Where this potentially comes in useful is matching reference photography where automatic settings were used. If you don’t want to figure out the differences yourself, you can plug a MergeHDR node into each image and then set the target EV on all the MergeHDR nodes to the same value.
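The arithmetic can be sketched in Python. The formula below is the standard EV = log2(N²/t) with an offset for ISO relative to 100; the sign convention on the ISO term is an assumption on my part and may differ from J_MergeHDR’s readout:

```python
import math

def ev(aperture, shutter, iso=100):
    """EV = log2(N^2 / t), offset for ISO relative to 100.
    (ISO sign convention assumed -- check against your tools.)"""
    return math.log2(aperture ** 2 / shutter) - math.log2(iso / 100.0)

# Two images 5 stops apart, as in the example above: a -5 stop
# exposure (a gain of 2**-5 = 1/32) matches the brighter to the darker.
stops = round(10.614 - 5.614, 3)
print(stops, 2.0 ** -stops)  # 5.0 0.03125
```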

Applying an exposure node to the over exposed image and setting it to -5 stops.

From Chrome Ball to Lat-Long

The penultimate step in the puzzle is to convert the chrome ball into a lat-long image. This is easy using the SphericalTransform node in Nuke.

SphericalTransform node plugged into a MergeHDR node.

The settings to use are…

  • Input Type: Mirror Ball
  • Output Type: Lat-Long Map
  • Output Format: Any 2:1 image format (ie… 4096×2048, 2048×1024, 1024×512, 512×256)
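The 2:1 ratio comes from the mapping itself: the horizontal axis spans 360° of longitude while the vertical spans only 180° of latitude. A sketch of the pixel-to-direction mapping (the axis conventions here are my assumption; Nuke’s SphericalTransform may orient things differently):

```python
import math

def latlong_to_direction(u, v):
    """Map normalised lat-long coordinates (u, v in [0, 1]) to a
    unit direction: u covers 360 degrees, v covers 180."""
    phi = 2.0 * math.pi * u   # longitude
    theta = math.pi * v       # angle down from the top pole
    return (math.sin(theta) * math.cos(phi),
            math.cos(theta),
            math.sin(theta) * math.sin(phi))

# The middle row of the image (v = 0.5) looks out along the horizon.
x, y, z = latlong_to_direction(0.0, 0.5)
```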

Exporting from Nuke

The very last step is to write it out as an EXR and make sure the colourspace is linear.


Colour Temperature in Maya

For a while I’ve wanted to implement colour temperature control into my lighting workflow but I’ve never been able to figure out how it’s calculated. Then I came across this site, which has already mapped out blackbody temperatures to normalised sRGB values.

Using this as a starting point I mapped out the values into a SL function…

color blackbodyfast( float temperature; )
{
	// 16 sRGB blackbody samples from 1000K to 10000K (values elided).
	uniform color c[16] = { ... };
	float amount = smoothstep( 1000, 10000, temperature );
	color blackbody = spline( "catmull-rom", amount,
		c[0], c[1], c[2], c[3], c[4], c[5], c[6], c[7],
		c[8], c[9], c[10], c[11], c[12], c[13], c[14], c[15] );
	return blackbody;
}

Rather than map every temperature value from 1000K to 40000K, I decided just to deal with 1000K to 10000K, using the CIE 1964 10-degree Colour Matching Functions – only because of the later date of 1964; I couldn’t see (nor greatly understand) the difference between the colour matching functions. The original function I wrote, called blackbody, used every value of the kelvin scale from 1000K to 10000K, which resulted in an array of 90 values. The modified one above uses every 6th value, which brings the array size down to 16. In my tests I didn’t notice a speed difference using 90 values, but comparing the two functions I couldn’t see enough visual difference to bother with the full 90 steps.
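For reference, a single Catmull-Rom segment – the interpolation behind SL’s spline() call above – can be written out directly. This is the textbook form, sketched in Python:

```python
def catmull_rom(p0, p1, p2, p3, t):
    """Evaluate one Catmull-Rom segment between p1 and p2 for
    t in [0, 1]; the curve passes through p1 (t=0) and p2 (t=1)."""
    return 0.5 * ((2 * p1) +
                  (-p0 + p2) * t +
                  (2 * p0 - 5 * p1 + 4 * p2 - p3) * t ** 2 +
                  (-p0 + 3 * p1 - 3 * p2 + p3) * t ** 3)

print(catmull_rom(0.0, 0.25, 0.75, 1.0, 0.0))  # 0.25
print(catmull_rom(0.0, 0.25, 0.75, 1.0, 1.0))  # 0.75
```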

Blackbody temperature comparison in sRGB. Temperature is mapped to T coordinate.

There is a slight peak where the warm and cool colours meet in the 90 step version. It’s a bit more obvious looking at the image in linear light.

Blackbody temperature comparison in Linear. Temperature is mapped to T coordinate.

Because the values are in sRGB, they need to be converted to Linear before getting used in the shader. The SL used in the main body of my test surface looks something like this…

#include "colour.h"

#pragma annotation temperature "gadgettype=intslider;min=1000;max=10000;step=100;label=Temperature;"

surface blackbody_srf(
	uniform float temperature = 5600; )
{
	color blackbody = blackbodyfast( temperature );
	blackbody = sRGB_decode( blackbody );
	Oi = Os;
	Ci = blackbody * Oi;
}
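The sRGB_decode() call corresponds to the standard sRGB-to-linear transfer function, shown here per channel in Python (the shader header’s exact implementation may differ slightly):

```python
def srgb_decode(c):
    """Standard sRGB -> linear conversion for one channel."""
    if c <= 0.04045:
        return c / 12.92
    return ((c + 0.055) / 1.055) ** 2.4

print(srgb_decode(0.0))  # 0.0
print(srgb_decode(1.0))  # 1.0
```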

Used in a light shader the output looks something like this…

Blackbody temperature. Light intensity is the same throughout. sRGB.

The only problem now is that 3Delight doesn’t show a preview of the light shader – or, more importantly, the colour temperature – in the AE settings for my light.

To get around this I decided to implement an expression which changes the colour of the Maya light that my 3Delight shader is attached to. Because MEL doesn’t have a spline function like SL does, I had to improvise using animation curves. First up, the MEL to create the three curves I need for the RGB colour temperature.

$red = `createNode animCurveTU`;
$green = `createNode animCurveTU`;
$blue = `createNode animCurveTU`;

setKeyframe -itt "spline" -ott "spline" -t 1 -v 1 $red ;
setKeyframe -itt "spline" -ott "spline" -t 10 -v 1 $red ;
setKeyframe -itt "spline" -ott "spline" -t 11 -v 0.929 $red ;
setKeyframe -itt "spline" -ott "spline" -t 12 -v 0.8289 $red ;
setKeyframe -itt "spline" -ott "spline" -t 13 -v 0.7531 $red ;
setKeyframe -itt "spline" -ott "spline" -t 14 -v 0.6941 $red ;
setKeyframe -itt "spline" -ott "spline" -t 15 -v 0.6402 $red ;
setKeyframe -itt "spline" -ott "spline" -t 16 -v 0.6033 $red ;

setKeyframe -itt "spline" -ott "spline" -t 1 -v 0.0401 $green;
setKeyframe -itt "spline" -ott "spline" -t 2 -v 0.172 $green;
setKeyframe -itt "spline" -ott "spline" -t 3 -v 0.293 $green;
setKeyframe -itt "spline" -ott "spline" -t 4 -v 0.4195 $green;
setKeyframe -itt "spline" -ott "spline" -t 5 -v 0.5336 $green;
setKeyframe -itt "spline" -ott "spline" -t 6 -v 0.6354 $green;
setKeyframe -itt "spline" -ott "spline" -t 7 -v 0.7253 $green;
setKeyframe -itt "spline" -ott "spline" -t 8 -v 0.8044 $green;
setKeyframe -itt "spline" -ott "spline" -t 9 -v 0.874 $green;
setKeyframe -itt "spline" -ott "spline" -t 10 -v 0.9254 $green;
setKeyframe -itt "spline" -ott "spline" -t 11 -v 0.9107 $green;
setKeyframe -itt "spline" -ott "spline" -t 12 -v 0.8527 $green;
setKeyframe -itt "spline" -ott "spline" -t 13 -v 0.8069 $green;
setKeyframe -itt "spline" -ott "spline" -t 14 -v 0.77 $green;
setKeyframe -itt "spline" -ott "spline" -t 15 -v 0.7352 $green;
setKeyframe -itt "spline" -ott "spline" -t 16 -v 0.7106 $green;

setKeyframe -itt "spline" -ott "spline" -t 2 -v 0 $blue;
setKeyframe -itt "spline" -ott "spline" -t 3 -v 0.0257 $blue;
setKeyframe -itt "spline" -ott "spline" -t 4 -v 0.1119 $blue;
setKeyframe -itt "spline" -ott "spline" -t 5 -v 0.2301 $blue;
setKeyframe -itt "spline" -ott "spline" -t 6 -v 0.3684 $blue;
setKeyframe -itt "spline" -ott "spline" -t 7 -v 0.517 $blue;
setKeyframe -itt "spline" -ott "spline" -t 8 -v 0.6685 $blue;
setKeyframe -itt "spline" -ott "spline" -t 9 -v 0.8179 $blue;
setKeyframe -itt "spline" -ott "spline" -t 11 -v 1 $blue;

rename $red "colourTemperatureRed";
rename $green "colourTemperatureGreen";
rename $blue "colourTemperatureBlue";

The resulting animation curves.

The next stage was to create an expression linking the output colour temperature to the light colour.

float $r, $g, $b;
if (will_point_lgt1.colourType > 0)
{
	float $temp = will_point_lgt1.temperature;
	float $amount = `smoothstep 1000 10000 $temp`;
	float $c = 16 * $amount;
	$r = `getAttr -t $c colourTemperatureRed.output`;
	$g = `getAttr -t $c colourTemperatureGreen.output`;
	$b = `getAttr -t $c colourTemperatureBlue.output`;
}
else
{
	$r = will_point_lgt1.lightColourR;
	$g = will_point_lgt1.lightColourG;
	$b = will_point_lgt1.lightColourB;
}
point_lgtShape.colorR = $r;
point_lgtShape.colorG = $g;
point_lgtShape.colorB = $b;

Previewing the light inside Maya. The Maya-specific settings of this light are ignored in the final render.