Assignment 0: Nori Preliminaries

Due September 27th, 2017 at 11:59pm on myCourses
Worth 5%


About

Nori is a minimalistic ray tracer written in C++, developed by Wenzel Jakob, that runs on Windows, Linux and macOS. The code we provide as a foundation for the homework assignments in ECSE 689 is a customized fork of Nori tailored to our course content and programming assignments.

While Nori provides a significant amount of scaffolding to simplify the development of a full-fledged renderer, the code you will start with doesn't actually generate any impressive output: it loads a scene and saves a rendered image as an OpenEXR image — but any actual rendering code is missing, and so the output image consists of only black pixels. Over the course of the semester, your task will be to extend this system into a relatively complete physically-based renderer. The programming assignments will guide you incrementally through this process.

This first assignment statement is particularly detailed, including compilation instructions for our modified Nori base code as well as a "getting started" guide. Despite its somewhat daunting length, the tasks you will actually have to complete here are quite small.

Core features

Our Nori base code provides many features that would be tedious to implement from scratch, including:

References

You may find the following general references useful, especially as the course progresses:

Permissible reference sources

Feel free to consult additional references when completing the homework assignments, but remember to cite them in your submission JSON file.

When asked to implement feature $X$, we ask that you not rely on (or even search for) existing implementations in other renderers. You will likely not learn much by doing so, and we will be on the lookout for it. Feel free to instead refer to PBRT, which provides many implementation details. If in doubt about this rule or any of the references you intend to use, ask the instructor.


Instructions

There will be five programming assignments this semester, each of which will help you progressively build a fully-functioning physically-based renderer. The assignments are to be completed individually, although you are welcome to consult your colleagues and the instructor. Each assignment has a strict deadline and submission instructions, both outlined in its handout.

It is your responsibility to convince us that you have implemented the assignments correctly. We strongly urge students to start working on the assignments as early as possible. Building your own advanced renderer is often very rewarding: use all the resources at your disposal, and don’t forget to have fun during the process!


Setting up a C++ compiler and building the base code

Linux / macOS

Begin by installing the CMake build system. On macOS, you will also need to install a reasonably up-to-date version of Xcode along with the Xcode command line tools. On Linux, any recent version of GCC or Clang will work. Navigate to the Nori folder, create a build directory, and start cmake-gui as follows:

$ cd path-to-nori
$ mkdir build
$ cd build
$ cmake-gui ..

Set the generator to "Unix Makefiles", then press the Configure and Generate buttons.

After the Makefiles are generated, simply run make to compile all dependencies and Nori itself.

$ make -j 4

This can take quite a while; the command above compiles the base code using four cores in parallel (feel free to adjust this number depending on the number of cores available on your system). Note that you will probably see many warning messages while the dependencies are compiled; you can ignore them.

Tip: it's a good idea to set the build type to Release unless you are tracking down a particular bug. The debug version runs much slower (by a factor of 50 or more).

Windows / Visual Studio 2017

Begin by installing Visual Studio 2017 (versions older than 2013 won't work) and a reasonably recent (≥ 3.x) version of CMake. Note that you can get a free student license of Visual Studio via Microsoft Imagine. Start CMake and navigate to the location where you extracted Nori.

Be sure to select the Visual Studio 2013 (or later) 64-bit compiler. It is also generally a good idea to choose a build directory that is different from the source directory.

After setting up the project, click the Configure and Generate buttons. This will create a file called nori.sln—double-click it to open Visual Studio.

Then open the generated Visual Studio project. It's a good idea to set the build mode to Release unless you are tracking down a particular bug; the debug version runs much slower (by a factor of 50 or more).

The Build->Build Solution menu item will automatically compile all dependency libraries and Nori itself; the resulting executable is written to the Release or Debug subfolder of your chosen build directory. Note that you will probably see many warning messages while the dependencies are compiled — you can ignore them.


A high-level overview of the base code

The Nori repository consists of the base code files (left table) and several dependency libraries (right table) that are briefly explained below.

Main structure

Directory Description
src A directory containing the main C++ source code
include/nori A directory containing header files with declarations
ext External dependency libraries (see the table to the right)
scenes Example scenes and test datasets to validate your implementation
CMakeLists.txt A CMake build file which specifies how to compile and link Nori
CMakeConfig.txt A low-level CMake build file which specifies how to compile and link several dependency libraries upon which Nori depends. You probably won’t have to change anything here.

External libraries

Directory Description
ext/openexr A high dynamic range image format library
ext/pcg32 A tiny self-contained pseudorandom number generator
ext/filesystem A tiny self-contained library for manipulating paths on various platforms
ext/pugixml A light-weight XML parsing library
ext/tbb Intel’s Threading Building Blocks for multi-threading
ext/tinyformat Type-safe C++11 version of printf and sprintf
ext/hypothesis Functions for statistical hypothesis tests
ext/nanogui A minimalistic GUI library for OpenGL
ext/nanogui/ext/eigen A linear algebra library used by nanogui and Nori
ext/zlib A compression library used by OpenEXR

Let's begin with a brief overview of the most important dependencies:

Eigen

When developing any kind of graphics-related software, it's important to be familiar with core mathematics support libraries responsible for basic linear algebra types, such as vectors, points, normals, and linear transformations. Nori relies on Eigen 3 for this purpose. We don't expect you to understand the inner workings of this library, but we do recommend that you at least take a look at the helpful tutorial provided on the Eigen web page.

Nori provides a set of linear algebra types that are derived from Eigen's matrix/vector class (see, e.g., the header file include/nori/vector.h). This is necessary since we will be handling various quantities that require different treatment when undergoing homogeneous coordinate transformations and, in particular, we must distinguish between positions, vectors, and normals. The main subset of types that you will most likely use are:

where the number in the type indicates the dimension and the subsequent character denotes the underlying scalar type (i.e. integer or single precision floating point).

pugixml

The pugixml library implements a tiny XML parser that we use to load Nori scenes. The format of these scenes is described below. The XML parser is fully implemented for your convenience, but you may have to change it if you wish to extend the file format.

pcg32

PCG is a family of tiny pseudorandom number generators with good performance, developed by Melissa O'Neill. The full implementation of the pcg32 generator (one member of this family) is provided in a single header file, ext/pcg32/pcg32.h. You will be using this class as a source of (pseudo-)randomness starting with the second programming assignment.

Hypothesis test support library

Certain programming assignments might require you to run statistical hypothesis tests in order to verify the correctness of your algorithms. You can think of these as unit tests with an extra twist: suppose that the correct result of a given computation is known to be a constant $c$. A normal unit test would check that the $c'$ computed by your implementation satisfies $|c-c'|<\epsilon$ for some small constant $\epsilon$ (i.e., to allow for floating-point rounding errors, etc.). However, rendering algorithms usually rely on stochastic/random processes (e.g., Monte Carlo algorithms) and, in practice, the values $c'$ that you will compute can be quite different from the reference $c$, which makes it tricky to choose a suitable constant $\epsilon$.

A statistical hypothesis test, on the other hand, analyzes the computed value and an estimate of its variance, and tries to assess how likely it is that the difference $|c - c'|$ is due to random noise rather than an actual implementation bug. When it is extremely unlikely (usually $p<0.001$) that the error could be attributed to noise, the test reports a failure.

OpenEXR

OpenEXR is a standardized file format for storing high dynamic range images. It was originally developed by Industrial Light and Magic and is now widely used in the movie industry and for rendering. The directory ext/openexr contains the open source reference implementation of this standard. You will probably not be using this library directly but through Nori's Bitmap class implemented in src/bitmap.cpp and include/nori/bitmap.h to load and write OpenEXR files.

NanoGUI

The NanoGUI library creates an OpenGL window and provides a small set of user interface elements (buttons, sliders, etc.). We use it to show the preview of the image being rendered.

Intel Thread Building Blocks (TBB)

The tbb directory contains Intel's Threading Building Blocks, a library for parallelizing various kinds of programs, similar in spirit to OpenMP and Grand Central Dispatch on macOS. You will see in the course that rendering often requires significant amounts of computation, but that this computation is easy to parallelize. We use TBB because it is more portable and flexible than the aforementioned platform-specific solutions. The basic rendering loop in Nori (in src/main.cpp) is already parallelized, so you will probably not have to read up on this library.

Scene file format and parsing

Take a moment to browse through the header files in include/nori. You will generally find all important interfaces and their documentation in this place. Most header files also have a corresponding .cpp implementation file in the src directory. The most important class is called NoriObject — it is the base class of everything that can be constructed using the XML scene description language. Other interfaces (e.g. Camera) derive from this class and expose more specialized functionality (e.g. to generate an outgoing ray from a camera).

Nori uses a very simple XML-based scene description language, which can be interpreted as a kind of building plan: the parser creates the scene step-by-step as it reads the scene file from top to bottom. The XML tags in this document are interpreted as requests to construct certain C++ objects, including information on how to put them together.

Each XML tag is either an object or a property. Objects correspond to C++ instances that will be allocated on the heap. Properties are small bits of information that are passed to an object at the time of its instantiation. For instance, the following snippet creates a red diffuse BSDF:

<bsdf type="diffuse">
    <color name="albedo" value="0.5, 0, 0"/>
</bsdf>

Here, the <bsdf> tag will cause the creation of an object of type BSDF, and the type attribute specifies what specific subclass of BSDF should be used. The <color> tag creates a property of name albedo that will be passed to its constructor. If you open up the C++ source file src/diffuse.cpp, you will see that there is a constructor which looks for this specific property:

Diffuse(const PropertyList &propList) {
    m_albedo = propList.getColor("albedo", Color3f(0.5f));
}

The piece of code that associates the "diffuse" XML identifier with the Diffuse class in the C++ code is a macro found at the bottom of the file:

NORI_REGISTER_CLASS(Diffuse, "diffuse");

Certain objects can be nested hierarchically. For example, the following XML snippet creates a mesh that loads its contents from an external OBJ file and assigns a red diffuse BRDF to it.

<shape type="obj">
    <string name="filename" value="sphere.obj"/>

    <bsdf type="diffuse">
        <color name="albedo" value="0.5, 0, 0"/>
    </bsdf>
</shape>

Implementation-wise, this kind of nesting will cause a method named addChild() to be invoked within the parent object. In this specific example, this means that Shape::addChild() is called, which roughly looks as follows:

void Shape::addChild(NoriObject *obj) {
    switch (obj->getClassType()) {
        case EBSDF:
            if (m_bsdf)
                throw NoriException(
                    "Shape: multiple BSDFs are not allowed!");
            // Store pointer to BSDF in local instance
            m_bsdf = static_cast<BSDF *>(obj);
            break;
            // ...
    }
}

This function verifies that the nested object is a BSDF, and that no BSDF was specified before; otherwise, it throws an exception of type NoriException.

The following property types are currently supported within the XML description language:

<!-- Basic parameter types -->
<string name="property name" value="arbitrary string"/>
<boolean name="property name" value="true/false"/>
<float name="property name" value="float value"/>
<integer name="property name" value="integer value"/>
<vector name="property name" value="x, y, z"/>
<point name="property name" value="x, y, z"/>
<color name="property name" value="r, g, b"/>
<!-- Linear transformations use a different syntax -->
<transform name="property name">
    <!-- Any sequence of the following operations: -->
    <translate value="x, y, z"/>
    <scale value="x, y, z"/>
    <rotate axis="x, y, z" angle="deg."/>
    <!-- Useful for cameras and spot lights: -->
    <lookat origin="x,y,z" target="x,y,z" up="x,y,z"/>
</transform>

The top-level element of any scene file is usually a <scene> tag, but this is not always the case. For instance, some of the programming assignments may ask you to run statistical tests on BRDF models or rendering algorithms, and these tests are also specified using the XML scene description language, as follows:

<?xml version="1.0"?>
<test type="chi2test">
    <!-- Run a χ2 test on the microfacet BRDF model (@ 0.01 significance level) -->
    <float name="significanceLevel" value="0.01"/>

    <bsdf type="microfacet">
        <float name="alpha" value="0.1"/>
    </bsdf>
</test>

The integrator class

In Nori, rendering algorithms are referred to as integrators because they generally solve a numerical integration problem. The remainder of this section explains how to create your first (toy) integrator to visualize the surface normals of objects.

A skeleton for this integrator is provided in include/nori/integrators/normals.h and src/integrators/normals.cpp:

#pragma once

#include <nori/integrators/integrator.h>

NORI_NAMESPACE_BEGIN

class NormalIntegrator : public Integrator {
public:
    NormalIntegrator(const PropertyList &props);

    Color3f Li(const Scene *scene, Sampler *sampler, const Ray3f &ray) const;

    std::string toString() const;
};

NORI_NAMESPACE_END

The corresponding implementation file looks as follows:

#include <nori/integrators/normals.h>
#include <nori/core/scene.h>
#include <nori/shapes/shape.h>

NORI_NAMESPACE_BEGIN

NormalIntegrator::NormalIntegrator(const PropertyList &props) {
}

Color3f NormalIntegrator::Li(const Scene *scene, Sampler *sampler, const Ray3f &ray) const {
    // ECSE689: Add calculations to display an object's normals and return the result

    return Color3f(0.f, 1.f, 0.f);
}

std::string NormalIntegrator::toString() const {
    return tfm::format(
        "NormalIntegrator[]"
        );
}

NORI_REGISTER_CLASS(NormalIntegrator, "normals");
NORI_NAMESPACE_END

To try out this integrator, we need to add these files to the CMake build system. This is already done for you in CMakeLists.txt, but keep in mind that any file you create needs to be included in this compilation file. You can now compile, and if everything goes well, CMake will create an executable named nori (or nori.exe on Windows) which you can open from the command line.

Finally, create a small test scene in the scenes folder with the following content and save it as test.xml:

<?xml version="1.0"?>
<scene>
    <integrator type="normals">
        <string name="myProperty" value="Hello!"/>
    </integrator>

    <camera type="perspective"/>
</scene>

First, some text output should be visible on the console:

$ ./nori ../scenes/test.xml

Property value was : Hello!

Configuration: Scene[
  integrator = NormalIntegrator[
    myProperty = "Hello!"
  ],
  sampler = Independent[sampleCount=1]
  camera = PerspectiveCamera[
    cameraToWorld = [1, 0, 0, 0;
                     0, 1, 0, 0;
                     0, 0, 1, 0;
                     0, 0, 0, 1],
    outputSize = [1280, 720],
    fov = 30.000000,
    clip = [0.000100, 10000.000000],
    rfilter = GaussianFilter[radius=2.000000, stddev=0.500000]
  ],
  medium = null,
  envEmitter = null,
  meshes = {
  }
]

Rendering .. done. (took 93.0ms)
Writing a 1280x720 OpenEXR file to "test.exr"

The Nori executable echoed the property value we provided, and it printed a brief human-readable summary of the scene. The rendered scene is saved as an OpenEXR file named test.exr.

Second, a solid green window pops up. This is the image we just rendered! The slider at the bottom can be used to change the camera exposure value.

Visualizing OpenEXR files

A word of caution: many tools exist for visualizing OpenEXR images, but not all of them behave as you might expect. Adobe Photoshop, HDRITools by Edgar Velázquez-Armendáriz, and HDRView by Wojciech Jarosz all display them correctly, but Preview on macOS, for instance, tonemaps these files in an awkward and unclear way.

When in doubt, you can also use Nori as an OpenEXR viewer: simply run it with an EXR file as a command line parameter, like so:

$ ./nori ../scenes/test.exr

Task 1: Surface normal integrator (30 points)

Let's now build a more interesting integrator which traces some rays against the scene geometry. Your first task is to modify your normal integrator to display the surface normals of a sphere mesh. To do so, you will have to modify NormalIntegrator::Li.

Nori calls this function after the rendering engine generates rays through each pixel: for each such ray, the Li function of the integrator instantiated from the scene file is invoked. The ray parameter passed to Li is a camera ray through a given pixel; in your NormalIntegrator, you will intersect the scene with this ray to find the surface closest to the ray origin along its direction. For camera rays (i.e., the rays passed to Li), this intersection corresponds to the closest surface visible from the viewer. If an intersection is found, simply return the component-wise absolute value of the shading normal at the intersection point, interpreted as a color. Have a look at the Intersection structure in include/nori/shapes/shape.h to see how to retrieve the information you need.

To run your renderer and (hopefully) see the result of your work, invoke nori on the file scenes/hw0/Tests/sphere-mesh.xml, and you should get the image below.

Sphere mesh scene rendered using a surface normal integrator. Notice the tessellation on the surface.


Task 2: Analytic sphere shape (70 points)

In this second exercise you will implement a new shape called Sphere, which will provide support for analytic spheres in your scene. The new class Sphere is derived from Shape. We provide you with the skeleton of the class in src/shapes/sphere.cpp. Your task is to implement the missing functions Sphere::rayIntersect and Sphere::updateIntersection in that file, as well as quadratic in src/core/math.cpp.

Ray-sphere intersection

The function Sphere::rayIntersect tests whether a ray with origin $\mathbf{o}$ and direction $\mathbf{d}$ intersects a sphere centered at $\mathbf{c}$ with radius $r$. The solution can be found algebraically: recall that the surface of a sphere is the set of points $\mathbf{p} \in \mathbb{R}^3$ that satisfy $\|\mathbf{p} - \mathbf{c}\|^2 = r^2$, and that points along a ray are defined parametrically as $\mathbf{r}(t)= \mathbf{o} + t\mathbf{d}$. Substitute the ray equation into the sphere equation and solve for $t$. The sign of the discriminant of the resulting quadratic polynomial (in $t$) tells you whether the ray intersects the sphere. If it does, return the smaller of the two real positive roots to obtain the nearest intersection point. The function returns true if and only if the ray intersects the sphere within the parametric ray bounds ray.mint and ray.maxt, in which case it also sets the output parameter $t$ to the nearest parametric intersection distance. At this point, we will not use the IntersectionQueryRecord.

Shading information

The function Sphere::updateIntersection must be called when the closest intersection has been found. You are required to populate the reference Intersection& its with information about the intersection event: the parametric intersection distance, the intersection point, the geometric and shading coordinate frames, and the corresponding $uv$ coordinates from the surface parameterization. For the $uv$ coordinates, use the spherical coordinates of the intersection point, linearly scaled to fit in $[0,1] \times [0,1]$. To validate your results, compare the analytic sphere rendered in the scene scenes/hw0/Tests/sphere-analytic.xml with your previously rendered tessellated mesh sphere. You should notice that your new sphere is much smoother! You can also check against the reference images in scenes/hw0/References.

Analytic sphere scene rendered using a surface normal integrator. The surface is silky smooth.


What to submit

Submission script

Download the assignment script submission.tar.gz on myCourses. Throughout the course, you will reuse this script to submit all your rendered images. It allows the graders to directly compare your output with theirs using a horizontal slider. The structure of the script is as follows:

File Description
imgs/ Directory containing all your rendered images in PNG format (in order)
config.json Configuration file containing all your student information (e.g. name, ID)
template.html HTML template that will be filled with your data
submit.py Script that generates your final submission HTML file
submission.html Example submission file

Notice that the example submission file does not contain any reference images, only a dark overlay; this is just a filler for the actual references that will be used to grade your work.

This script relies on BeautifulSoup in conjunction with html5lib, both of which you may have to install from the command line (e.g., through pip or conda).

To create a submission, first convert your rendered EXR images to PNG format and add them to the imgs/ folder. This can be done with Preview on macOS, with Adobe Photoshop (with an extra plugin), or with ImageMagick on Windows. Then, fill in your student information in the JSON file and run the script. This will create a self-contained file submission.html: all rendered images are inlined inside the HTML in Base64:

$ python submit.py 
imgs/1.png added to template
imgs/2.png added to template
imgs/3.png added to template
Assignment 1 (submission.html) successfully built!

Files to hand in

Render scenes/hw0/Final/finalhw0.xml and submit your HTML file submission.html, as well as your implemented normals.cpp, sphere.cpp and math.cpp files packaged into a .zip or .tar archive. You should not have to modify any other files to complete this assignment. As such, if we need to run your code, we will test it by replacing the unimplemented src/integrators/normals.cpp, src/shapes/sphere.cpp and src/core/math.cpp in the assignment source code with your implementations; make sure this builds and runs before submitting!