Frequently Asked Questions

This is a list of some of the more specific frequently asked questions about the assignment. Please also see the roadmap if you have more general questions about how to approach the assignment.

Problems compiling the code under Windows

This program has a significant number of dependencies. To get everything working, please download:

I'm pretty sure this will only work under the latest version of Visual Studio (VS6's template support is quite poor). If you see warnings like the following,

C:\boost_1_31_0\boost\thread\exceptions.hpp(29): warning C4275: non dll-interface class 'std::logic_error' used as base for dll-interface class 'boost::lock_error'

you can safely ignore these (at least according to this thread). To stop them from appearing, open the project settings, go to C/C++ -> Advanced, and add "4275" to the "Disable Specific Warnings" box.

How can I use the same random sample location on the light for both the getDirection and getAttenuation functions?

This is indeed a problem with the code organization. In the current ray tracer, Material::shade makes two separate calls to getDirection and getAttenuation, and it is not clear how to share the randomly chosen light position across these two calls (in a deterministic ray tracer, this is a perfectly reasonable breakdown, of course). There are two ways I can see to fix this: one is a very simple hack, and the other is more principled but takes a little bit of work.

Method 1: the hack

This is not really a multithreaded ray tracer (although it is fairly close). Since you can assume that only one ray tracing thread is running at a time, it seems reasonable to simply add a mutable member variable to the area light class (which in this implementation is actually an emissive Square) and store the sampled light position in that variable.

Method 2: the more principled way

You are of course welcome to make a change in the Light base class, collapsing the two functions into a single getAllLightInformation function (or whatever you want to call it) that returns a struct containing all the relevant information. The reason this takes a little work is that you have to propagate the change down the hierarchy -- but an easy way to handle this is to have the default getAllLightInformation implementation simply call the getDirection and getAttenuation functions that have already been declared. This is how the reference implementation does it.

When I sat down and implemented the assignment myself, I realized the second way would be better, but I am reluctant to release a patch changing things because most people have already written their own lighting code and I don't want to trample on that. Note that the original version of the ray tracing code I started with actually had three functions -- one to get the distance attenuation, one to get the direction, and a third to determine shadowing -- which have been collapsed into the two that you see now.

How should we handle transparency, reflectivity, and the Fresnel equations?

A number of people have asked how the Fresnel term applies to the reflective sphere. Getting the correct Fresnel behavior for metals is fairly complex; for this assignment, you should just assume that the reflective sphere is purely reflective in the standard ray-tracing sense (i.e., it always reflects rays about the surface normal). You can either always reflect a ray about the normal and add its contribution scaled by kr, or use the Russian roulette/path tracing approach and reflect a ray with probability kr, adding in its value unscaled.

Using the Fresnel term: At left, refraction without the Fresnel term; at right, the Fresnel term is used.

For the refractive sphere, you should use the Fresnel term in the following way. Compute the portion of the light that is reflected using the Schlick approximation to the Fresnel equations for unpolarized light (recall that you can compute F_0 by simply plugging 0 in for theta in the full Fresnel equations). Then, as above, you can use either Russian roulette or recursive ray tracing to determine what to do next.

A simple description of the algorithm using a fully Russian roulette path tracing approach follows:

mu_0 = unif(0, 1)
if( mu_0 < kr )
   reflect a ray
else if( mu_0 < kr + kt )
   F_r = Fresnel term
   mu_1 = unif(0, 1)
   if( mu_1 < F_r )
      reflect a ray
   else
      refract a ray
else if( mu_0 < kr + kt + kd )
   diffusely scatter
else
   absorb
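The pseudocode above translates directly into C++; in this sketch the enum and function names are my own, kr + kt + kd is assumed to be at most 1, and any remaining probability mass is absorption.

```cpp
#include <random>

enum class ScatterEvent { Reflect, Refract, Diffuse, Absorb };

ScatterEvent chooseEvent(double kr, double kt, double kd,
                         double fresnelR,          // F_r at the hit point
                         std::mt19937& rng) {
    std::uniform_real_distribution<double> unif(0.0, 1.0);
    double mu0 = unif(rng);
    if (mu0 < kr)
        return ScatterEvent::Reflect;              // mirror reflection
    if (mu0 < kr + kt) {
        // Transmissive lobe: split between reflection and refraction
        // according to the Fresnel term.
        return (unif(rng) < fresnelR) ? ScatterEvent::Reflect
                                      : ScatterEvent::Refract;
    }
    if (mu0 < kr + kt + kd)
        return ScatterEvent::Diffuse;              // diffusely scatter
    return ScatterEvent::Absorb;
}
```

Because each path terminates by roulette rather than by scaling, the contribution of whichever ray you do trace is added in unscaled, as described above.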

In the 2-pass algorithm, how do I use the photon map estimate for computing diffuse interreflection?

In our breakdown of the rendering equation, we have separated diffuse interreflection from the other parts (direct illumination, etc.). In some ways, this is the trickiest part to compute. We are going to use photon mapping to accelerate it in the following way, suggested by Henrik: once a ray sampled for diffuse interreflection hits another diffuse surface, we take the irradiance estimate directly from the photon map and do not allow the ray to reflect any further.

There are some key points here:

Now that we have an estimate of the irradiance I(yi, yi → x), we can use it to estimate the contribution that diffuse-diffuse interreflection makes to the radiance I(x, x → eye). Note that in this case we must include an extra scaling factor of π, cancelling out the 1/π term in the diffuse BRDF; a derivation of why this is can be found in this document.
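Concretely, with the diffuse BRDF kd/π and the extra factor of π described above, the interreflection contribution reduces to (kd/π) · π · E = kd · E, a componentwise product per color channel. A tiny sketch (the Color type and function name are illustrative):

```cpp
struct Color { double r, g, b; };

// kd is the diffuse reflectance; irradiance is the photon-map estimate E.
// The pi in the 1/pi diffuse BRDF cancels the extra scaling factor of pi,
// leaving a simple componentwise product.
Color diffuseInterreflection(const Color& kd, const Color& irradiance) {
    return Color{ kd.r * irradiance.r,
                  kd.g * irradiance.g,
                  kd.b * irradiance.b };
}
```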


Questions? Contact Christopher Twigg.