Forcing extreme supersampling with POV-Ray

I recently worked on a project that involved rendering images with the POV-Ray raytracer.  In the particular scene I was rendering, every pixel was expected to be mostly black with a very small, very bright white spot, so I needed very high supersampling for each output pixel to come out as the correct shade of gray.

POV-Ray's built-in anti-aliasing only supersamples a pixel when an initial set of rays differs by more than a threshold.  Unfortunately, that doesn't work for my scene: the initial rays almost always come back the same color (black), so the algorithm decides that further sampling is not necessary.

Another way to approach the problem is to render at a larger resolution and reduce the final image.  However, that does not work either because of clamping.  As I said, most values would be black, but a few would be very bright.  Let’s say we have 8 rays that return black and one ray that returns a value of 1000.  When that gets saved to the image, the white value gets clamped to 255.  The reduced image would then have a value of 255/9 or 28 for that pixel.  Using the correct value of 1000 would give 1000/9 or 111.  I tried getting around the clamping issue by saving to a high dynamic range format such as EXR.  Either POV-Ray or ImageMagick (which I used to reduce the images) was still clamping values, though.
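
As a quick sanity check of that arithmetic, the same comparison can be written in POV-Ray's scene description language (this is just the calculation printed to the debug stream, not part of the scene):

// Average of already-clamped samples vs. average of the raw samples,
// for 8 black samples and one sample of 1000 on a 0-255 scale.
#declare clamped_average = (8 * 0 + min(1000, 255)) / 9; // about 28
#declare true_average    = (8 * 0 + 1000) / 9;           // about 111
#debug concat("clamped: ", str(clamped_average, 0, 1),
              "  true: ", str(true_average, 0, 1), "\n")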

In the end, I used a new feature in the POV-Ray beta called a mesh camera.  The basic idea is to create a mesh in the scene with each polygon mapping to a pixel in the output image.  To supersample, I used multiple meshes, each offset by a small amount.  The values from the corresponding polygons in each mesh can then be averaged before being output.  Without further ado, here is the code I used:

// Build the camera mesh: one triangle per output pixel, all lying in the
// z = -1 plane, so the camera behaves like an orthographic camera
#declare ca_mesh =
  mesh {
    #local px_inc = 1; // Distance between vertices in the mesh
    #local py_inc = 1;
    #local row_count = 0;
    #while (row_count < image_height)
      #local col_count = 0;
      #local d_y = row_count * py_inc;
      #while (col_count < image_width)
        #local d_x = col_count * px_inc;
        // One face per pixel; the mesh camera shoots one ray per face
        triangle {
          <d_x, d_y, -1>
          <d_x + px_inc, d_y + py_inc, -1>
          <d_x + px_inc, d_y, -1>
        }
        #local col_count = col_count + 1;
      #end
      #local row_count = row_count + 1;
    #end
  }

// Our sample grid per pixel is sampleCount by sampleCount.
// In other words, sampleCount is the square root of the number of samples per pixel.
#declare sampleCount = 3; // example value: 3x3 = 9 samples per pixel; increase for heavier supersampling
camera {
  mesh_camera {
    sampleCount * sampleCount
    0 // distribution #0 averages values from multiple meshes as described
    #local i = 0;
    #while(i < sampleCount)
      #local j = 0;
      #while(j < sampleCount)
        mesh {
          ca_mesh
          // shift this copy of the pixel mesh by a different sub-pixel offset
          translate <i / sampleCount - .5, j / sampleCount - .5, 0>
        }
        #local j = j + 1;
      #end
      #local i = i + 1;
    #end
  }
}
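
One convenient detail: the mesh is built from POV-Ray's built-in image_width and image_height variables, so it automatically matches whatever resolution you render at.  A typical invocation is just something like the following (the scene file name is only a placeholder):

povray +W800 +H600 mesh_camera_scene.pov

No anti-aliasing option (+A) is needed, since all of the supersampling happens inside the camera.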

5 Comments

  1. Posted January 23, 2011 at 6:02 am

    I was directed to your posting, and would like to know if you’d like to make a contribution (scene file and write-up) that we can include with the POV-Ray distribution.

  2. Posted January 29, 2011 at 10:11 am

    Sure, I’d love to. I have something started, but it’ll take a little while to smooth out the wrinkles.

  3. Posted June 25, 2011 at 5:54 am

    Hi!
    Sorry if this is an obvious tip, but as I understand it, you could use these lines in an INI file:

    Antialias=On
    Sampling_Method=1
    Antialias_Depth=16
    Antialias_Threshold=0

    to make the anti-aliasing non-adaptive. Setting Antialias_Threshold to zero makes POV-Ray always trace the maximum number of rays.

    Yours Alex.

  4. Posted June 29, 2011 at 10:07 am

    That is close, but the result is very noisy for some reason. I tried forcing jitter off with -J, just to be sure, but that wasn’t the problem. Maybe my mesh camera version was blurring the scene, and it really is noisy because of the texture map. I’m afraid I can’t give out the actual scene file, because it relates to research that hasn’t been published yet.

    I think I had originally tried on the command line with +A0, but that doesn’t work. Apparently it must be +A0.0.

    Thanks for the feedback.

  5. Posted April 13, 2014 at 7:41 pm

    Hey, my name is Miaoyu and I am a student. I have read your article. I am doing an experiment to study the luminance of each final pixel; my goal is to get the luminance of the pixels.
    I set the output file type to OpenEXR. I found that the standard solar irradiance at 700 nm is 1.42666, and then applied a conversion coefficient.
    That is, I converted the irradiance to illuminance and used that as my light source input. My whole POV-Ray code is below:
    #version 3.7;
    global_settings{ assumed_gamma 1.0 }
    #include "spectral.inc"
    #declare SpectralWavelength =700;

    #declare camera_za=0;
    #declare camera_aa=0;
    #declare camera_h=4267.2;
    #declare camera_dist=4267.2+435;

    #declare camera_y = camera_dist*cos(radians(camera_za));
    #declare camera_LL = camera_dist*sin(radians(camera_za));
    #declare camera_x = camera_LL*sin(radians(camera_aa));
    #declare camera_z = camera_LL*cos(radians(camera_aa));

    camera
    {
    angle 7.54371
    location <camera_x, camera_y, camera_z>
    look_at
    rotate
    right x
    up y
    }

    #declare sun_za = 31.65;
    #declare sun_aa = 187.89;
    #declare sun_dist = 152589828000;

    #declare sun_y = sun_dist*cos(radians(sun_za));
    #declare sun_LL = sun_dist*sin(radians(sun_za));
    #declare sun_x = sun_LL*sin(radians(sun_aa));
    #declare sun_z = sun_LL*cos(radians(sun_aa));

    light_source {
    <sun_x, sun_y, sun_z>
    SpectralEmission(3.99750132)
    }

    #declare geom_file_name = "pov-xyz.txt";
    #declare spec_file_name = "pov-ref.txt";

    sphere
    {

    ,
    1.55
    finish {
    ambient 0
    emission 0
    specular ref
    }
    }

    #declare n = n + 1;
    #end
    #debug concat(str(n,15,2),"\n")

    I have read the OpenEXR source; it says the Y channel stands for luminance. How can I set that in POV-Ray?
    Above is all my code; do I need to set additional output options or statements to get my luminance output? I would appreciate your help.
