• Finishing the ray tracer

    Done! Or sort of, at least... I did not manage to fix that last problem where the reflection looks a bit off compared to the book, but I decided to go on with the rest anyway, since the problem only shows up in the special case where the radius of a sphere is negative (the book's trick for getting the look of a completely transparent glass ball). Anyway, here is the final result!
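
    For reference, the trick from the book is to put a second sphere with a negative radius inside a glass sphere: the outward normal is computed as (p - center) / radius, so the negative radius flips the normals inward and the ball renders as a thin, completely transparent shell. In the generated scene code it would look something like this (the Sphere struct and material constant are illustrative, not the project's actual names):

      // A hollow glass ball, as in the book: an inner sphere sharing the
      // outer sphere's center but with a negative radius. The outward normal
      // is (p - center) / radius, so negating the radius flips it inward.
      spheres[0] = Sphere(vec3(-1.0, 0.0, -1.0),  0.5,  MAT_DIELECTRIC);
      spheres[1] = Sphere(vec3(-1.0, 0.0, -1.0), -0.45, MAT_DIELECTRIC);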

    Image showing three larger spheres, one with a Lambertian material, one with a dielectric material and one with a metallic material. They are surrounded by smaller spheres of various materials.
    Final render.

    I couldn't generate the additional randomly placed spheres inside the shader, so I had to add some JavaScript that generates a string defining each sphere, which I then append to the string literal containing the shader code. I also added some basic interactivity to the ray tracer so the user can pan around and zoom in and out of the scene. Previously I rendered the image once, but to get the interaction to work I had to rewrite the code a little so that it's continuously re-rendered and updated as the user interacts with it. By first rendering to a texture and then ping-ponging between two textures, as described in an earlier post, I can let the image get smoother over time when it's kept still; when the user moves the camera the scene is "reset" and becomes grainier before becoming smooth again.
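
    As a rough illustration, the JavaScript ends up emitting GLSL statements along these lines, which get appended to the shader source (the Sphere struct, material constants and values are made up for the example):

      // Generated on the JavaScript side and spliced into the shader string.
      spheres[0] = Sphere(vec3(-3.9, 0.2, -2.1), 0.2, MAT_LAMBERTIAN);
      spheres[1] = Sphere(vec3( 1.7, 0.2,  0.4), 0.2, MAT_METAL);
      spheres[2] = Sphere(vec3( 0.6, 0.2,  1.3), 0.2, MAT_DIELECTRIC);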

  • Living in the material world

    I've added materials other than the Lambertian reflection model to the code. The first one is a metal material that reflects the light that hits it. It has a variable for controlling fuzziness, which basically means that incoming rays are reflected back with some randomness added to their direction.
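
    In shader terms the scatter step ends up looking roughly like this (randomInUnitSphere() stands in for whatever random-direction helper the shader actually uses):

      // Metal scattering: mirror-reflect the incoming ray, then perturb it by
      // a random offset scaled by the fuzz factor. randomInUnitSphere() is a
      // placeholder for the shader's random-direction helper.
      vec3 scatterMetal(vec3 incoming, vec3 normal, float fuzz, inout float seed) {
          vec3 reflected = reflect(normalize(incoming), normal);
          // fuzz = 0.0 gives a perfect mirror; larger values blur reflections.
          return reflected + fuzz * randomInUnitSphere(seed);
      }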

    Metallic spheres
    Sphere with Lambertian reflection at the center alongside two metal spheres displaying varying degrees of fuzziness.

    The other material is a dielectric, i.e. one where incoming light is both reflected and transmitted through the material. I'm not quite sure, however, that I've managed to get it to work properly. The reflection in the sphere looks different from the example image in the book, or at least it does once I add the Schlick approximation. The only difference I can think of right now is the function for generating random numbers, but it's possible that there is something else I'm missing.
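
    For reference, the Schlick approximation itself is tiny; translated to GLSL it's just:

      // Schlick's approximation of the Fresnel reflectance: the probability
      // that a ray reflects off the dielectric rather than refracting into it.
      float schlick(float cosine, float refractionIndex) {
          float r0 = (1.0 - refractionIndex) / (1.0 + refractionIndex);
          r0 = r0 * r0;
          return r0 + (1.0 - r0) * pow(1.0 - cosine, 5.0);
      }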

    Dielectric spheres
    Sphere on the left with dielectric material.
  • Fixed the Lambertian reflection!

    Finally, I managed to get the Lambertian reflection to work! I found a working GLSL shader implementation of the weekend ray tracer that I could compare my code to, which helped a lot. First off, I replaced the functions I had for generating pseudo-random numbers with the one from that code; troubleshooting that kind of code is not easy, so using something I know works seemed like a better idea. I then compared their code with mine and found two major issues. One was the way I was iterating over the ray hits.

    In the Ray Tracing in One Weekend book this is done recursively, but recursion isn't allowed in WebGL's shading language, and my iterative version had some problems, so I had to change it slightly. I was trying to sum the effect of each iteration on the color in one variable and then multiply the color by the final result, but now I instead set the color to all white at the start and multiply it by 0.5 on each iteration where the ray hits something.

    The second problem was in the same function: when a ray did not register a hit, I set the color to the color of the background (or the sky), when I should of course multiply the accumulated color by the background color.
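
    Put together, the bounce loop now looks roughly like this (HitRecord, hitWorld(), skyColor() and randomInUnitSphere() are stand-ins for the actual names in my code):

      vec3 color = vec3(1.0);                // start all white
      for (int i = 0; i < MAX_BOUNCES; i++) {
          HitRecord hit;
          if (hitWorld(ray, hit)) {
              // Each diffuse bounce absorbs half the light...
              color *= 0.5;
              ray = Ray(hit.point, hit.normal + randomInUnitSphere(seed));
          } else {
              // ...and a miss multiplies in the sky color instead of
              // overwriting everything accumulated so far.
              color *= skyColor(ray.direction);
              break;
          }
      }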

    Lambertian reflection
    Lambertian reflection!

    Something else I've done is to move the antialiasing loop out of the shader code. Instead I draw the result to a texture that I can render to the canvas. I have another texture that I read from inside the shader, and I switch between these two textures so that in each iteration I'm reading the previously rendered result from one of them and drawing to the other, "ping-ponging" between the textures. On each iteration I add a little less of the new rendering and a little more of the old, and since I'm always adding a little bit of randomness to the rays that are cast, the image gets finer and finer. I got the idea from this.
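
    Inside the shader the blend itself is just a mix() between the new sample and the previous texture; a sketch, with made-up names for the uniforms:

      // Blend this frame's noisy sample with the accumulated result read from
      // the other texture. uSampleWeight shrinks over time (roughly
      // 1.0 / frameCount), so each new frame contributes a little less.
      vec3 previous = texture2D(uPreviousFrame, uv).rgb;
      gl_FragColor = vec4(mix(previous, newSample, uSampleWeight), 1.0);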

  • Frustration

    I've been working for the last two days on the rendering of diffuse materials, but I'm struggling to get it to render properly. Right now I'm getting a lot of artefacts and I can't quite figure out why.

    Frustration
    Lambertian reflection?

    I've been looking at others who have done something similar, but the only real difference I can find is that they all seem to render several passes to a texture and then take the average of those. I'm basically doing the same thing, just in a for-loop inside the fragment shader, and I'm not sure what the actual difference would be, but I can't come up with any other way of proceeding right now.

  • More spheres

    I added some basic support in the shader for more than one sphere. It took me a while to figure out that you have to add an "out" (or "inout") qualifier to a function's parameters if the function is supposed to change them, similar to pass-by-reference in C++.
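
    A minimal sketch of how this looks, using the ray-sphere intersection from the book (the struct and names here are illustrative):

      struct Ray { vec3 origin; vec3 direction; };

      // Solves the quadratic for a ray-sphere intersection. The hit distance
      // is written through the "out" parameter; without the qualifier, t
      // would be a local copy and the caller would never see the result.
      bool hitSphere(vec3 center, float radius, Ray ray, out float t) {
          vec3 oc = ray.origin - center;
          float a = dot(ray.direction, ray.direction);
          float b = 2.0 * dot(oc, ray.direction);
          float c = dot(oc, oc) - radius * radius;
          float discriminant = b * b - 4.0 * a * c;
          if (discriminant < 0.0) {
              t = -1.0;
              return false;
          }
          t = (-b - sqrt(discriminant)) / (2.0 * a);
          return t > 0.0;
      }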

    Now with more ground
    Now with more ground.
  • Rendering a sphere

    I've been trying for some time to get something to work that I saw in another implementation of a ray tracer, by Evan Wallace. He renders the output from the fragment shader to a texture, and on the next render he switches to another texture, using the results from the first to create a more detailed render. This means that the quality gets better and better as time goes on, unless you move the scene, in which case the accumulated result is thrown away and rendering starts from scratch. I couldn't get this to work right now, so I've started by implementing the Ray Tracing in One Weekend code directly in the fragment shader and just rendering it to a rectangular surface.

    A blue sky
    A blue sky.

    I had to make some small changes to get things to work in WebGL, but transferring the code has been very straightforward so far. To calculate a ray going from the camera through each pixel, I just use the gl_FragCoord variable in the fragment shader to get the window-relative x and y coordinates of the pixel, which I divide by the window width and height respectively.
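
    In the shader this amounts to just a few lines; a sketch, with made-up uniform names for the resolution and the book's camera vectors:

      precision mediump float;

      uniform vec2 uResolution;  // canvas width and height in pixels
      uniform vec3 uOrigin;      // camera position
      uniform vec3 uLowerLeft;   // lower-left corner of the image plane
      uniform vec3 uHorizontal;  // image-plane basis vectors
      uniform vec3 uVertical;

      void main() {
          // Normalize gl_FragCoord to [0, 1] and build the ray through
          // this pixel, as in the book's camera setup.
          vec2 uv = gl_FragCoord.xy / uResolution;
          vec3 direction = uLowerLeft + uv.x * uHorizontal + uv.y * uVertical - uOrigin;
          // Visualize the ray direction for now; the real tracing comes later.
          gl_FragColor = vec4(normalize(direction) * 0.5 + 0.5, 1.0);
      }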

    A red ball in the sky
    A fiery red ball in the sky.

    I've also made use of some of the built-in functions in GLSL, like mix(), instead of writing my own linear interpolation function.
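
    For example, the book's blue-to-white background gradient becomes a one-liner with mix():

      // The sky gradient from the book: blend from white at the horizon to
      // light blue at the top, based on the y component of the ray direction.
      vec3 skyColor(vec3 direction) {
          float t = 0.5 * (normalize(direction).y + 1.0);
          return mix(vec3(1.0), vec3(0.5, 0.7, 1.0), t);
      }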

    A sphere colored by its surface normals
    A sphere colored according to the directions of its surface normals.

    The next step will be to set up JavaScript classes for handling spheres and making the code a bit more general.

  • Hello world

    I started working on rendering an image in WebGL today and got this "hello world" example working.

    Rainbow colored image
    "Hello world!" Graphics edition

    So far I've been learning how WebGL works from the site WebGL Fundamentals, and I've adapted the code from the examples there to reproduce the image from section 2.2 in Ray Tracing in One Weekend. It's really nice that WebGL handles all the interpolation and that I "only" have to hand it the positions and color data.
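
    A minimal sketch of what that division of labor looks like (attribute and varying names are my own):

      // Vertex shader: forward the per-vertex position and color. WebGL
      // interpolates vColor across the triangle before the fragment stage.
      attribute vec4 aPosition;
      attribute vec3 aColor;
      varying vec3 vColor;

      void main() {
          gl_Position = aPosition;
          vColor = aColor;
      }

      // Fragment shader: receive the already-interpolated color.
      precision mediump float;
      varying vec3 vColor;

      void main() {
          gl_FragColor = vec4(vColor, 1.0);
      }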

    I also found a nice library from the Khronos Group for debugging WebGL, which lets me log all the WebGL function calls and any errors that occur.

  • Initial setup

    This is my first entry on this blog. I'll be trying to write down notes here on my development of a ray tracer in WebGL for the course 'Computer graphics and interactions' at KTH. I've set this site up so that you can see the current state of the project on the front page and read about it in this blog.

    Currently the canvas on the front page is only rendering a black screen, but the next step is to start rendering something a little more interesting.