Assignment 2: Simulating Realistic Cameras

Due: Thursday, Mar 13 2008 at 11:59pm


Most computer graphics systems assume a pinhole camera model, which images the entire scene in focus. In contrast, real cameras use multi-lens systems with finite apertures that produce interesting effects such as depth of field (DOF), nonperspective projection, and vignetting. In this assignment, you will extend pbrt with support for a more realistic camera model that accurately simulates these effects. We will provide you with specifications of real lens systems with various focal lengths, each composed of multiple lens elements. You will build a RealisticCamera class that simulates the traversal of light through the lens system and generates camera rays for the ray tracer. With this simulator, you can experiment with settings available on a real camera, such as changing the focus, aperture size, and focal length.

Step 1: Download the starter code and scenes

Click the link below to download a zip file containing the starter code, several pbrt scene files / textures, and the camera specification files:

Unzip the archive and copy the source code to the \pbrt-1.03\cameras folder, the project files to the corresponding folder on your platform, and the scene and camera files to your \scene folder.

Understanding the starter code

Look through the files and make sure you understand the basic code structure and interface functions. The parameters passed into the RealisticCamera class include standard Camera class members and some additional ones:

  • focaldistance: The distance at which the camera is focused. For example, setting this to 4000 means the camera is focused 4 meters from the origin of the lens system.
  • focallength: The focal length of the lens system. This is only used to set the proper aperture size of the lens system.
  • fstop: Relative aperture size (typically varies between 1.4 and 32). The effective aperture size is focallength / fstop.
  • filmdiagonal: Film size (typically 35mm).
  • specfile: Camera specification file.

Note that all units are in millimeters (mm). You can find examples of these parameters in the provided scene files.
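As an illustration of how these parameters relate, the effective aperture and the physical film dimensions can be derived directly (a minimal sketch; the helper names here are ours, not part of the starter code):

```cpp
#include <cmath>

// Effective aperture diameter in mm, per the fstop definition above:
// e.g. a 50mm lens at f/2 has a 25mm aperture.
inline float ApertureDiameter(float focallength, float fstop) {
    return focallength / fstop;
}

// Physical film width/height in mm, derived from filmdiagonal and the
// image resolution's aspect ratio.
inline void FilmDimensions(float filmdiagonal, int xres, int yres,
                           float *width, float *height) {
    float d = std::sqrt(float(xres * xres + yres * yres));
    *width  = filmdiagonal * xres / d;
    *height = filmdiagonal * yres / d;
}
```

These two quantities are all you need from the scene-file parameters: the aperture diameter clips rays at the stop, and the film dimensions map raster sample coordinates to physical film positions.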

Understanding the camera specification file (.dat)

You should read this paper to understand the description of a camera file. Again, all units are in millimeters. File parsing has been provided to you in the starter code through the lensSystem::Load() interface function. The data read from the file are stored in lensElement structures.
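In this style of lens description, each line of the .dat file specifies one spherical surface: its radius of curvature, its axial position or spacing, the index of refraction of the glass behind it, and the diameter of the element. A plausible in-memory representation looks like the following sketch (the actual field names in the starter code's lensElement may differ):

```cpp
// One spherical lens surface as read from a .dat line (all units mm).
// A radius of 0 conventionally marks the aperture stop.
struct LensElementSketch {
    float radius;    // radius of curvature of the spherical surface
    float zpos;      // z position of the surface along the optical axis
    float n;         // index of refraction of the medium behind the surface
    float aperture;  // diameter of the element (clips rays that miss it)
};
```

When tracing, a ray is tested against each surface in turn: a sphere intersection (or a plane intersection for the stop), an aperture clip, then refraction into the next medium.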

In pbrt, a camera's viewing direction is assumed to be along the positive z-direction in camera space. Therefore, you should assume that the lens elements are placed along the z-axis in camera space, and your camera should face directly down the z-axis. For convenience you may assume otherwise, but remember to make the corresponding conversion when transforming a camera ray to world space.

The scene files (.pbrt)

There are 8 scenes in total provided to you. The first set of four scenes (cones-xxxx.pbrt) demonstrates four different lens assemblies. Below are sample images produced for these four scenes:

  • cones-dgauss.jpg: 50mm double gauss
  • cones-wide.jpg: 22mm wide angle
  • cones-fisheye.jpg: 10mm fisheye
  • cones-telephoto.jpg: 250mm telephoto

The second set of scenes (dof-xxx.pbrt) use the same camera lens (50mm double gauss) but vary the aperture size and focal distance, so you can experiment with camera focusing and depth of field effects. Note: for 491K students, as you are not required to implement camera focusing, the three dof-f2.8-xxx scenes will end up looking the same; however, you should observe the change in depth of field between dof-f2.8.pbrt and dof-f16.pbrt.

  • dof-f2.8.jpg: f/2.8 (less DOF)
  • dof-f16.jpg: f/16 (more DOF)
  • dof-f2.8-near.jpg: f/2.8 (near focus)
  • dof-f2.8-far.jpg: f/2.8 (far focus)

Step 2: Implement the RealisticCamera class

You only need to make changes in the three provided source files: realistic.cpp and lenses.h/.cpp. These files build into a pbrt plugin, realistic.dll (or the equivalent shared library on Linux). Your main task is to implement the RealisticCamera::GenerateRay function. GenerateRay takes a sample position on the film plane (given by sample.imageX and sample.imageY) and returns a stochastic ray from the camera out into the scene. To the rest of pbrt, your camera appears just like any other camera; if you have any questions about how it works, examine any available camera class in the \cameras folder. In addition, you may need to read part of Chapter 6 in the textbook to get familiar with how the camera class works.

Here are the specific steps you should take (all computation should happen in camera space):

  1. Compute the position of the sample on the film plane. To do so, use sample.imageX and sample.imageY (which vary from 0 to the film resolution) and the film's physical size (filmdiagonal).
  2. The color of a pixel is proportional to the irradiance received at that pixel. This value is an estimate of all light reaching the pixel from the world and through the lens system. The easiest way to sample light paths is to fire a random ray at the back (rightmost) lens element, and trace the ray through the lenses until it exits the front element of the lens system. In order to pick a stochastic sample on the lens element, you should use the ConcentricSampleDisk() function. This function returns a uniformly random sample on a unit disk. You should pass in sample.lensU and sample.lensV as the first two parameters, and the disk sample result will be returned in the last two parameters of that function. The code for tracing a ray through the lens system should be implemented in lenses.h/.cpp.
  3. Some rays will terminate while passing through the lenses, either because they fail to intersect a lens element or because they hit the aperture stop. If the trace succeeds, the generated ray (i.e. its origin and direction) should be returned in the Ray* ray parameter passed to GenerateRay. Note that you must call CameraToWorld to transform this ray to world space before returning. Also remember to set ray->mint and ray->maxt properly. You can follow the example in any other camera class (e.g. perspective.cpp).
  4. GenerateRay also needs to return a weight proportional to the geometric factor in the irradiance estimate. Initially you can return 1.0 for all rays that successfully pass through the lens system. However, doing so will produce a biased rendering result. The proper weight should be assigned according to the projected solid angle of each sampled ray. For rays that fail to pass through the lens system, you should return a weight of 0.0.
  5. Render each of the provided scenes using your realistic camera simulator. Example images are given below. Note that in order to get a noise-free image, you need to use many samples, hence the rendering could take quite a while to finish.
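Steps 1 and 2 above can be sketched in a self-contained form. The types, helper names, and sign conventions below are ours, not pbrt's; in your implementation you would use sample.imageX/imageY, ConcentricSampleDisk, and the starter code's lens data instead:

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };
static const float kPi = 3.14159265f;

// Step 1: map a raster-space film sample to a physical point (in mm) on
// the film plane at z = filmZ. The sign flips model the image being
// inverted through the lens; check them against the starter code.
Vec3 FilmSample(float imageX, float imageY, int xres, int yres,
                float filmdiagonal, float filmZ) {
    float diag = std::sqrt(float(xres * xres + yres * yres));
    float w = filmdiagonal * xres / diag;
    float h = filmdiagonal * yres / diag;
    return Vec3{ -(imageX / xres - 0.5f) * w,
                 -(imageY / yres - 0.5f) * h,
                 filmZ };
}

// Step 2: pick a uniform point on the back lens element, modeled as a
// disk of radius backR at z = backZ. pbrt's ConcentricSampleDisk would
// replace this simple polar mapping; both are uniform over the disk.
Vec3 BackElementSample(float lensU, float lensV, float backR, float backZ) {
    float r = backR * std::sqrt(lensU);
    float theta = 2.0f * kPi * lensV;
    return Vec3{ r * std::cos(theta), r * std::sin(theta), backZ };
}
```

The initial ray then runs from the film point toward the disk sample; normalize its direction before handing it to your lens tracer.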


  • For debugging you should use a smaller image resolution (such as 64x48) and a small number of samples (such as 4 to 16). For final-quality rendering, you should use the default resolution and at least 256 samples.
  • If pbrt reports 'Not-a-number radiance value returned', the camera ray you generated is probably invalid. Check the origin and direction of the camera ray to make sure they are valid.
  • While tracing a ray through the lenses, you need to test the intersection of the ray with each lens surface. You also need to determine how a ray refracts when it intersects a lens surface. The refraction is computed based on Snell's law; a review of the basic formula can be found here.
  • Use special rays for checking your lens tracing function. For example, a ray along the z-axis (i.e. one that starts at the film center and goes toward the lens center) is guaranteed to pass through the lens system. If it fails, something is wrong with your tracing code.
  • Very Important: The final-quality rendering can take a long time to complete, so make sure you leave enough time instead of rushing at the last moment!
  • [Update] A drawlens program is provided for you to visually debug your ray-lens tracing code. The program is provided as is. You should read the README.txt to understand how to use it: Download
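The Snell's-law step mentioned above is easiest in vector form: for a unit incident direction d, a unit normal n oriented against the incoming ray, and eta = n_incident / n_transmitted, the refracted direction is eta*d + (eta*cosI - cosT)*n, with total internal reflection when the discriminant goes negative. A sketch with our own stand-in vector type:

```cpp
#include <cmath>

struct V3 { float x, y, z; };
static float Dot(const V3 &a, const V3 &b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

// Refract unit direction d at a surface with unit normal n (oriented
// against d), with eta = n_incident / n_transmitted. Returns false on
// total internal reflection; these names are ours, not the starter code's.
bool Refract(const V3 &d, const V3 &n, float eta, V3 *t) {
    float cosI  = -Dot(d, n);                      // > 0 when n faces the ray
    float sin2T = eta * eta * (1.0f - cosI * cosI); // Snell: sinT = eta * sinI
    if (sin2T > 1.0f) return false;                // total internal reflection
    float cosT = std::sqrt(1.0f - sin2T);
    float k = eta * cosI - cosT;
    *t = V3{ eta * d.x + k * n.x, eta * d.y + k * n.y, eta * d.z + k * n.z };
    return true;
}
```

Good sanity checks: at normal incidence the direction is unchanged for any eta, and the returned direction always has unit length.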

Step 3: (691MM only) Focusing

So far, the film plane has been placed at a fixed location specified by the camera definition file. Your task here is to implement a 'focusing' capability for your camera class so it can focus at any distance in the scene. In other words, you need to adjust the camera's film plane position (filmPlane) based on the focaldistance parameter provided in the scene file. There are two ways to achieve this goal, and you can implement either:

  1. The first method is to fire a paraxial ray that starts on the z-axis at focaldistance and travels toward the lens: use your existing lens tracing code to find where the ray intersects the z-axis on the other (right) side of the lenses. Then simply place the film plane at that intersection point. Note that you should keep this ray close to the z-axis, or it may fail to pass through the lenses. This method essentially uses inverse tracing to figure out the film plane position.
  2. The second method is to adjust the distance based on Section 3.3 of the Kolb paper. To do so, you need to 1) find the two principal planes and focal lengths by firing parallel rays from the left and right and computing their intersections with the z-axis; and 2) solve a quadratic equation to compute the distance to adjust.

If successful, your code should produce different results for the following three scenes: dof-f2.8.pbrt, dof-f2.8-far.pbrt, dof-f2.8-near.pbrt. The same camera is used in all these scenes but is focused at a different focaldistance.


For this assignment, you should pack the following files in a single .zip file and email it to

  1. Your modified realistic.cpp, lenses.h and lenses.cpp.
  2. One image (in EXR, JPG, or PNG format) for each of the 8 scenes provided (you must render them with at least 256 samples).
  3. Submit any other images you generated and let us know what other cool things you did.


This assignment will be graded on a 0-4 scale. Partial grades (such as 3.5) will be given based on partially complete implementation.

  • 0: Little or no work was done.
  • 1: Significant effort was put into the assignment, but the code does not work or fails to produce any meaningful image.
  • 2: The code produces images, but they have significant artifacts, such as a wrong field of view, a large amount of noise / black spots, significant distortion, missing geometry, etc.
  • 3: The code produces images that are almost correct, but with slight artifacts such as noise, lack of vignetting, incorrect depth of field, etc.
  • 4: All requirements met. All images are generated correctly without obvious artifacts. For 691MM students, the camera focus must work correctly in order to get 4 points.

Topic revision: r7 - 2008-03-12 - RuiwanG
