Assignment 4: Global Illumination and Path Tracing

Due November 29, 2017 at 11:59pm on myCourses
Worth 20%


Overview

You will build path tracers capable of rendering realistic images with global illumination effects on surfaces. Your implementations will account for both the direct and indirect illumination in a scene.


Task 1: Implicit Path Tracing (40 points)

In this first exercise, you will implement a naive implicit path tracer. Recall the hemispherical form of the rendering equation discussed in class: \begin{equation} L(\mathbf{x}, \omega) = L_e(\mathbf{x}, \omega) + \int_{\mathcal{H}^2} f_r(\mathbf{x}, \omega', \omega) L(r(\mathbf{x}, \omega'), -\omega') \cos\theta' \, \mathrm{d}\omega', \label{path} \end{equation}

where $r(\mathbf{x}, \omega')$ is the ray tracing function that returns the closest visible point from $\mathbf{x}$ in direction $\omega'$. We know that equation \eqref{path} can be approximated by a single-sample Monte Carlo estimator as: \begin{equation} L(\mathbf{x}, \omega) \approx L_e(\mathbf{x},\omega) + \frac{f_r(\mathbf{x}, \omega', \omega) L(r(\mathbf{x}, \omega'), -\omega') \cos\theta'}{p(\omega')}. \end{equation}
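As a reminder (and as a sanity check for your estimator), the uniform and cosine-weighted hemisphere strategies you will support in this task correspond to the densities \begin{equation} p_{\mathsf{uniform}}(\omega') = \frac{1}{2\pi}, \qquad p_{\mathsf{cosine}}(\omega') = \frac{\cos\theta'}{\pi}, \end{equation} so with cosine-weighted sampling the $\cos\theta'$ term cancels and each bounce contributes $\pi \, f_r(\mathbf{x}, \omega', \omega) \, L(r(\mathbf{x}, \omega'), -\omega')$.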

The updated basecode includes a new integrator definition in include/nori/path.h. This class has various input properties:

<integrator type="path">
        <boolean name="isExplicit" value="false"/>
        <string name="termination" value="russian-roulette"/>
        <float name="termination-param" value="0.2"/>
        <string name="direct-measure" value="solid-angle"/>
        <string name="indirect-measure" value="hemisphere"/>
        <string name="indirect-warp" value="cosine-hemisphere"/>
</integrator>

The path integrator has three additional methods: PathIntegrator::implicitLi(), PathIntegrator::explicitLi() and PathIntegrator::stopPath(). You will implement the implicit and explicit path tracing logic in the first two functions, whereas the stopPath function defines the stopping criterion for your path construction. Depending on how you implement your path tracing algorithms (i.e., recursively or in a loop), you may need extra logic/variables to keep track of, for instance, the current number of vertices in the path you are constructing. Feel free to add any additional methods to the integrator that you may need to structure your particular algorithm.
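For illustration only, one way to do this bookkeeping is to carry the current depth as an extra parameter of a recursive helper. The sketch below follows the stock Nori Integrator::Li() signature, but the m_isExplicit member and the extra depth parameter are assumptions, not part of the actual basecode:

// Hypothetical structure only -- member and parameter names are assumptions.
Color3f PathIntegrator::Li(const Scene *scene, Sampler *sampler, const Ray3f &ray) const {
    // Dispatch on the "isExplicit" XML property and start the path at depth 0.
    return m_isExplicit ? explicitLi(scene, sampler, ray, 0 /* depth */)
                        : implicitLi(scene, sampler, ray, 0 /* depth */);
}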

Biased Path Termination (40 points)

Implement PathIntegrator::implicitLi() for both uniform and cosine-weighted recursive indirect lighting sampling distributions. Each path should bounce m_terminationParam times from the eye; clamping the maximum path length will introduce bias in your estimator. Note, for example, that m_terminationParam = 0 should yield an image where only the pixels overlapping the emitters are non-zero, whereas m_terminationParam = 1 should generate an image with only direct illumination.
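For reference only, here is a rough sketch of what a recursive implicitLi() might look like. The Intersection, Warp, Frame and BSDF calls follow the stock Nori API, but the function signature, the emitter query (getEmitter()->eval(...)) and the m_indirectWarp / m_terminationParam members are assumptions, so adapt it to your basecode:

// Hypothetical sketch of implicit path tracing with a clamped number of bounces.
Color3f PathIntegrator::implicitLi(const Scene *scene, Sampler *sampler,
                                   const Ray3f &ray, int depth) const {
    Intersection its;
    if (!scene->rayIntersect(ray, its))
        return Color3f(0.0f);

    // Emitted radiance at the hit point (the only term left once the path is clamped).
    Color3f Le(0.0f);
    if (its.mesh->isEmitter())
        Le = its.mesh->getEmitter()->eval(its, -ray.d);   // assumed emitter interface

    if (depth >= (int) m_terminationParam)                // biased clamp on the path length
        return Le;

    // Sample an indirect direction in the local shading frame.
    Point2f u = sampler->next2D();
    Vector3f wiLocal;
    float pdf;
    if (m_indirectWarp == "cosine-hemisphere") {
        wiLocal = Warp::squareToCosineHemisphere(u);
        pdf     = Warp::squareToCosineHemispherePdf(wiLocal);
    } else {
        wiLocal = Warp::squareToUniformHemisphere(u);
        pdf     = Warp::squareToUniformHemispherePdf(wiLocal);
    }
    if (pdf <= 0.0f)
        return Le;

    // Evaluate the BSDF and recurse along the sampled direction.
    BSDFQueryRecord bRec(its.toLocal(-ray.d), wiLocal, ESolidAngle);
    Color3f fr = its.mesh->getBSDF()->eval(bRec);
    Ray3f next(its.p, its.toWorld(wiLocal));
    return Le + fr * Frame::cosTheta(wiLocal)
              * implicitLi(scene, sampler, next, depth + 1) / pdf;
}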

Once you're done with this task, you can test your implementation on scenes/hw4/cornellbox.xml. If your (potentially recursive) path construction is implemented correctly, your rendered image should look something like this:

Cornell box rendered with implicit path tracing and two bounces, at 256 spp.

Notice how the colour of the walls bleeds onto the side of the boxes: your first global illumination effect!


Task 2: Explicit Path Tracing (40 points)

The reason why the image you just rendered is noisy is that we are blindly tracing paths, hoping to eventually hit a light. We can take advantage of improved importance sampling schemes from Assignment 3 to arrive at a more effective estimator: now, every time light scatters at a surface point, we will split our estimator to compute both a direct and indirect illumination estimate. In other words, at every bounce we will sample both a direct and an indirect contribution at the intersection point, while being careful to avoid double counting of the same transport contributions (as discussed in class): \begin{equation} L(\mathbf{x},\omega) = L_e(\mathbf{x},\omega) + L_{\mathsf{dir}}(\mathbf{x},\omega) + L_{\mathsf{ind}}(\mathbf{x},\omega). \end{equation}

This estimator performs explicit direct illumination estimation at every path vertex, and implementing this explicit path tracing algorithm is the goal of this next task.

Base Explicit Path Tracing Integrator (35 points)

You can use your DirectIntegrator::Li() as a starting point to implement PathIntegrator::explicitLi(). The key difference between explicit and implicit path tracing is that the direct and indirect lighting contributions are decoupled, meaning that they are sampled separately. To avoid double counting, an indirect ray needs to be re-sampled if it intersects a light, since such a ray contributes direct transport that the explicit direct lighting estimate already accounts for.
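As a rough, hypothetical skeleton of how the per-vertex split could be organized (the emitter query and the sampleDirect() helper below are placeholders, not basecode functions):

// Hypothetical skeleton: emitted + direct + indirect at each path vertex.
Color3f PathIntegrator::explicitLi(const Scene *scene, Sampler *sampler,
                                   const Ray3f &ray, int depth) const {
    Intersection its;
    if (!scene->rayIntersect(ray, its))
        return Color3f(0.0f);

    // Emission seen directly through this ray. With the re-sampling rule above,
    // indirect rays never land on an emitter, so this only triggers at the eye vertex.
    Color3f Le(0.0f);
    if (its.mesh->isEmitter())
        Le = its.mesh->getEmitter()->eval(its, -ray.d);   // assumed emitter interface

    // Direct term: sample the emitters (solid-angle or area measure) and shade,
    // reusing the logic of DirectIntegrator::Li(). Placeholder helper:
    Color3f Ldir = sampleDirect(scene, sampler, its, -ray.d);

    // Indirect term: sample a hemisphere direction as in implicitLi(); if the sampled
    // ray hits a light, re-sample it to avoid double counting, then recurse with
    // explicitLi(scene, sampler, next, depth + 1) weighted by f_r * cos(theta) / pdf.
    Color3f Lind(0.0f);
    if (!stopPath(depth, sampler)) {
        // ... indirect sampling and recursion go here ...
    }

    return Le + Ldir + Lind;
}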

Implement PathIntegrator::explicitLi(), where both subtended solid angle and area sampling are allowed for direct lighting, and both uniform and cosine-weighted distributions are allowed for indirect lighting. Below are reference images for different numbers of indirect bounces, rendered at 256 spp, with solid angle sampling used for direct lighting and cosine-weighted sampling used for indirect lighting.

0-bounce (emitter only)
1-bounce (direct illumination only)
2-bounces of global illumination

Russian Roulette Path Termination (5 points)

Artificially truncating path lengths to a fixed depth introduces bias. To avoid this problem, you will implement a Russian roulette termination method that probabilistically terminates your path construction. Use the termination property to branch on this feature, and m_terminationParam to query the RR termination probability.
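A minimal sketch of what stopPath() could look like, assuming that m_termination and m_terminationParam are members that simply mirror the termination and termination-param XML properties:

// Hypothetical sketch -- branch on the termination mode from the scene file.
bool PathIntegrator::stopPath(int depth, Sampler *sampler) const {
    if (m_termination == "russian-roulette") {
        // Terminate with probability q = m_terminationParam, independent of depth.
        return sampler->next1D() < m_terminationParam;
    }
    // Otherwise, clamp the path to a fixed number of bounces (biased).
    return depth >= (int) m_terminationParam;
}

If a path survives the roulette, remember to divide its indirect contribution by the survival probability $1 - q$ so that the estimator remains unbiased in expectation.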


Task 3: Rectangular Area Light (15 points)

Your last task is to implement a rectangular area light. There are two approaches you can use.

Approach 1: Create a New Shape (15 points)

Create a new shape Rectangle in src/rectangle.cpp and include/nori/rectangle.h and implement all methods inherited from Shape. You will have to analytically derive the intersection of a ray with your (clipped) plane. The implementation of your rectangle should support the following parameters: center, width, height and surface normal. You can refer to PBRT for more details, if necessary.
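As a sketch of the derivation (the Shape interface here is a guess, since the actual rayIntersect() signature depends on your basecode): intersect the supporting plane first, then clip against the rectangle extents expressed in a frame built around the normal.

// Hypothetical sketch -- assumes members m_center (Point3f), m_width, m_height (float)
// and m_normal (Vector3f), and a rayIntersect() signature chosen for illustration.
bool Rectangle::rayIntersect(const Ray3f &ray, float &t, Point2f &uv) const {
    float denom = ray.d.dot(m_normal);
    if (std::abs(denom) < 1e-8f)
        return false;                                  // ray (nearly) parallel to the plane

    // Intersect the supporting plane: dot(o + t*d - c, n) = 0.
    t = (m_center - ray.o).dot(m_normal) / denom;
    if (t < ray.mint || t > ray.maxt)
        return false;

    // Clip against the rectangle extents, expressed in a local frame around the normal.
    Frame frame(m_normal);
    Vector3f local = ray(t) - m_center;
    float u = local.dot(frame.s), v = local.dot(frame.t);
    if (std::abs(u) > 0.5f * m_width || std::abs(v) > 0.5f * m_height)
        return false;

    uv = Point2f(u / m_width + 0.5f, v / m_height + 0.5f);
    return true;
}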

Note that you only need to implement Rectangle::samplePosition() and Rectangle::pdfPosition() since only area sampling is supported in this case. Simply throw a NoriException for subtended solid angle sampling to avoid unimplemented virtual function errors.
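For the sampling side, a uniform point on the rectangle has constant density $1/(\text{width}\times\text{height})$ with respect to surface area. A hypothetical sketch, with the same assumed members as above and illustrative signatures:

// Hypothetical sketch -- signatures and members are assumptions.
void Rectangle::samplePosition(const Point2f &sample, Point3f &p, Normal3f &n) const {
    Frame frame(m_normal);
    p = m_center + (sample.x() - 0.5f) * m_width  * frame.s
                 + (sample.y() - 0.5f) * m_height * frame.t;
    n = Normal3f(m_normal);
}

float Rectangle::pdfPosition() const {
    return 1.0f / (m_width * m_height);   // uniform density with respect to surface area
}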

Approach 2: Implement a Mesh Light (15 points + 5 Bonus Points)

Begin by familiarizing yourself with the Mesh class to see how vertices, faces and normals are stored. Next, add Mesh::samplePosition() and Mesh::pdfPosition() and implement them.

You may find the DiscretePDF class located in include/nori/dpdf.h useful to implement the sampling step. We suggest that you use this class to build a discrete probability distribution that will allow you to pick a triangle proportional to its surface area (relative to the entire mesh's surface area). Once a triangle is chosen, you can (uniformly) sample a barycentric coordinate $(\alpha,\beta,1-\alpha-\beta)$ using the mapping \begin{equation} \begin{pmatrix} \alpha \\ \beta \end{pmatrix} = \begin{pmatrix} 1 - \sqrt{1 - \xi_1} \\ \xi_2 \sqrt{1 - \xi_1} \end{pmatrix}, \end{equation}

where $\xi_1,\xi_2 \in [0,1)$ are uniform random variables.

A scene-dependent precomputation is necessary to build the discrete probability distribution, and this can be performed in the Mesh::activate() function, which is automatically invoked by the XML parser. To add a rectangle to your scene, simply create a unit square in a .obj file and attach an area light to it.
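A possible sketch, assuming the stock Nori Mesh accessors (getTriangleCount(), surfaceArea(), getIndices(), getVertexPositions()), a samplePosition() signature chosen for illustration, and a new DiscretePDF member m_dpdf that you would add yourself:

// Hypothetical sketch of mesh area sampling with nori::DiscretePDF.
void Mesh::activate() {
    /* ... keep the existing activate() body ... */

    // Build a discrete distribution proportional to triangle surface areas.
    m_dpdf.clear();
    for (uint32_t i = 0; i < getTriangleCount(); ++i)
        m_dpdf.append(surfaceArea(i));
    m_dpdf.normalize();
}

void Mesh::samplePosition(Sampler *sampler, Point3f &p, Normal3f &n) const {
    // Pick a triangle with probability proportional to its area.
    uint32_t idx = (uint32_t) m_dpdf.sample(sampler->next1D());

    // Sample barycentric coordinates using the mapping above.
    Point2f xi = sampler->next2D();
    float alpha = 1.0f - std::sqrt(1.0f - xi.x());
    float beta  = xi.y() * std::sqrt(1.0f - xi.x());

    // Interpolate the position of the sampled point; pdfPosition() is then simply
    // 1 / totalSurfaceArea, the uniform density over the whole mesh.
    const MatrixXu &F = getIndices();
    const MatrixXf &V = getVertexPositions();
    Point3f v0 = V.col(F(0, idx)), v1 = V.col(F(1, idx)), v2 = V.col(F(2, idx));
    p = alpha * v0 + beta * v1 + (1.0f - alpha - beta) * v2;
    n = Normal3f((v1 - v0).cross(v2 - v0).normalized());   // geometric normal
}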

Rendering the Cornell Box (5 points)

Render your final Cornell Box scene with a rectangular ceiling light, using explicit path tracing with Russian Roulette termination. Below is a reference image that was rendered to convergence.

Cornell Box rendered with explicit path tracing and Russian Roulette termination with probability $q = 0.25$, at 512 spp.

If you implemented the rectangle light using a mesh, you'll have to modify the XML file to use your mesh light. Use the center point of the plane to define an appropriate translation; if you use a unit square mesh object, its width and height can be used to define your non-uniform scaling factors.


What to Submit

Finished? Submit your modified files, along with any new files if you used the first approach for the rectangle light. Render the 3 final scenes, run the given script, and submit!