The end result at a normal resolution looks all right now, although there's always something which could be improved when snooping with a magnifier 😮‍💨
Posts by Nelarius
Finally, I switched to the sigma values directly from the SVGF paper and fixed some noise getting through the luminance edge stopping weight.
Noticed that Q2RTX further scales its depth term with the inverse à-trous stepwidth. Interestingly, this also had a positive impact on the result.
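One plausible reading of that scaling, as a minimal sketch: an SVGF-style depth edge-stopping term where the depth difference is normalized by a screen-space depth gradient, with the extra inverse-stepwidth factor applied on top. Parameter names and the sigma default are my own illustrative choices, not from Q2RTX.

```cpp
#include <cmath>

// Hypothetical sketch of a depth edge-stopping weight in the spirit of SVGF.
// The depth delta is normalized by the expected depth change over the pixel
// distance, then additionally scaled by the inverse a-trous stepwidth.
float depthWeight(float depthP, float depthQ,
                  float depthGradient,  // expected depth change per pixel at p
                  float pixelDistance,  // |p - q| in pixels
                  float stepwidth,      // current a-trous step (1, 2, 4, ...)
                  float sigmaZ = 1.f) {
    const float expected = std::fabs(depthGradient) * pixelDistance + 1e-4f;
    const float term = std::fabs(depthP - depthQ) / (sigmaZ * expected);
    return std::exp(-term / stepwidth);  // inverse-stepwidth scaling
}
```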
Then I used the depth gradient expression from chrismile.net/blog/2024/sv... This also helped with artifacts along edges, and yields much better results than finite differences.
First: do albedo demodulation/remodulation before and after the wavelet transform loop. This had a big impact on shadow appearance.
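The demodulation step can be sketched like this: divide the surface albedo out of the noisy color so the filter operates on (mostly) illumination, then multiply it back in afterwards. The epsilon guard and the exact math here are my own illustrative choices.

```cpp
#include <array>
#include <cmath>

using Vec3 = std::array<float, 3>;

// Divide out the albedo before filtering, so the wavelet passes smooth
// illumination rather than surface texture. Epsilon avoids dividing by
// zero on black albedo.
inline Vec3 demodulate(const Vec3& color, const Vec3& albedo) {
    Vec3 out{};
    for (int i = 0; i < 3; ++i) out[i] = color[i] / (albedo[i] + 1e-4f);
    return out;
}

// Multiply the albedo back in after the wavelet transform loop.
inline Vec3 remodulate(const Vec3& illumination, const Vec3& albedo) {
    Vec3 out{};
    for (int i = 0; i < 3; ++i) out[i] = illumination[i] * (albedo[i] + 1e-4f);
    return out;
}
```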
Write your own denoiser, they said. What could possibly go wrong?
Turns out, a lot of things. I spent a lot of time pixel-peeping at 1/4 resolution renders and found all kinds of artifacts. Luckily github.com/NVIDIA/Q2RTX exists, and I was able to find problems in my pseudo-SVGF implementation.
The adjustable sky-view altitude can make for nice-looking skies, although it can look a bit dorky in the viewport
Some kind of #voxel RGB temple.
Haven't posted about my #voxel project in a while, but I finally got around to implementing orthographic projection
#screenshotsaturday #raytracing
Cool, so that's what you've been up to with all these color quantization skeets! Looking forward to seeing the book; there's definitely not much material about this online.
Rewrote my voxel raytracing experiment to use spherical harmonics. Probes are now spawned per air voxel adjacent to a diffuse face (as opposed to per diffuse face), which simplifies probe allocation a lot and hopefully leads to an easier probe LOD system in the future.
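The allocation rule above can be sketched roughly like this: walk the chunk's occupancy grid and spawn one probe per air voxel that touches at least one solid face. The flat-array layout and function names are my own illustrative assumptions, not the project's actual code.

```cpp
#include <cstdint>
#include <vector>

// Hypothetical sketch: spawn one probe per *air* voxel with at least one
// solid (diffuse) face neighbor. `solid` is a flat dim^3 occupancy grid;
// the return value holds the flat indices of voxels that get a probe.
std::vector<int> allocateProbes(const std::vector<uint8_t>& solid, int dim) {
    auto at = [&](int x, int y, int z) -> bool {
        if (x < 0 || y < 0 || z < 0 || x >= dim || y >= dim || z >= dim)
            return false;  // out of bounds counts as air
        return solid[(z * dim + y) * dim + x] != 0;
    };
    std::vector<int> probes;
    for (int z = 0; z < dim; ++z)
        for (int y = 0; y < dim; ++y)
            for (int x = 0; x < dim; ++x) {
                if (at(x, y, z)) continue;  // only air voxels get probes
                const bool touchesSolid = at(x - 1, y, z) || at(x + 1, y, z) ||
                                          at(x, y - 1, z) || at(x, y + 1, z) ||
                                          at(x, y, z - 1) || at(x, y, z + 1);
                if (touchesSolid) probes.push_back((z * dim + y) * dim + x);
            }
    return probes;
}
```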
And the final result, with the original 8spp noisy image for reference again.
Pixel peepers will see all sorts of problems with the filtered result, and the edge stopping function could be improved using tricks from the SVGF paper. Still, it's better than waiting for a 2048 spp image to converge
The biggest difference is that the à-trous algorithm is run iteratively, with the previous c' being used as input for the next run. Each iteration, the gap between neighboring pixels q doubles. The linked paper used 5 iterations.
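The iteration scheme above can be sketched like this, shown in 1D for brevity and without the edge-stopping weights (the kernel taps are the standard cubic B-spline weights; everything else here is my own simplification):

```cpp
#include <cmath>
#include <utility>
#include <vector>

// Minimal 1D sketch of the iterative a-trous scheme: each pass convolves
// with a 5-tap B-spline kernel whose taps are spaced `1 << iteration`
// pixels apart, and the output feeds the next pass.
std::vector<float> atrousFilter(std::vector<float> c, int iterations) {
    const float kernel[5] = {1.f / 16, 4.f / 16, 6.f / 16, 4.f / 16, 1.f / 16};
    const int n = static_cast<int>(c.size());
    for (int it = 0; it < iterations; ++it) {
        const int step = 1 << it;  // gap between sampled neighbors doubles
        std::vector<float> next(c.size(), 0.f);
        for (int p = 0; p < n; ++p) {
            float sum = 0.f, wSum = 0.f;
            for (int k = -2; k <= 2; ++k) {
                const int q = p + k * step;
                if (q < 0 || q >= n) continue;  // skip out-of-bounds taps
                sum += kernel[k + 2] * c[q];
                wSum += kernel[k + 2];
            }
            next[p] = sum / wSum;  // renormalize by the taps actually used
        }
        c = std::move(next);
    }
    return c;
}
```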
Finally, moving to the à-trous wavelet algorithm yields still better results. The filter looks similar to the previous one, with the spatial Gaussian replaced with a cubic B-spline.
This particular filter is inspired by a bilateral filter (course notes: people.csail.mit.edu/sparis/bf_co...), as well as the edge stopping function from the À-Trous wavelet filter paper (jo.dreggn.org/home/2010_at...)
The edges are now preserved, but the spatial blur is still not sufficient for removing the noise.
The Gaussian blur can be improved by applying an edge stopping function. It's a Gaussian blur with extra weight multipliers controlling for color and normals, not just distance from the center pixel.
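A minimal sketch of that weight product: a spatial Gaussian multiplied by color- and normal-based falloff terms, so contributions across edges are suppressed. The sigma values here are illustrative placeholders, not tuned values from the posts.

```cpp
#include <array>
#include <cmath>

using Vec3 = std::array<float, 3>;

inline float sqLength(const Vec3& a, const Vec3& b) {
    float s = 0.f;
    for (int i = 0; i < 3; ++i) s += (a[i] - b[i]) * (a[i] - b[i]);
    return s;
}

// Edge-stopping weight for neighbor q relative to center pixel p: the
// spatial Gaussian is multiplied by color and normal similarity terms,
// so the blur stops at color edges and geometric creases.
float edgeStoppingWeight(float sqDist,  // |p - q|^2 in pixels
                         const Vec3& colorP, const Vec3& colorQ,
                         const Vec3& normalP, const Vec3& normalQ) {
    const float sigmaSpatial = 4.f, sigmaColor = 0.1f, sigmaNormal = 0.1f;
    const float wSpatial =
        std::exp(-sqDist / (2.f * sigmaSpatial * sigmaSpatial));
    const float wColor = std::exp(-sqLength(colorP, colorQ) / sigmaColor);
    const float wNormal = std::exp(-sqLength(normalP, normalQ) / sigmaNormal);
    return wSpatial * wColor * wNormal;
}
```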
This results in a blurred image, which is a start!
First, ran the image through a Gaussian blur. The new color c' is the average over neighboring pixels.
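That first step can be sketched like this, shown in 1D for brevity (the real filter is 2D; radius and sigma here are arbitrary illustrative values):

```cpp
#include <cmath>
#include <vector>

// Minimal 1D sketch of the starting point: the new color c' at each pixel
// is a normalized Gaussian-weighted average over a (2*radius+1) neighborhood.
std::vector<float> gaussianBlur(const std::vector<float>& c, int radius,
                                float sigma) {
    const int n = static_cast<int>(c.size());
    std::vector<float> out(c.size(), 0.f);
    for (int p = 0; p < n; ++p) {
        float sum = 0.f, wSum = 0.f;
        for (int k = -radius; k <= radius; ++k) {
            const int q = p + k;
            if (q < 0 || q >= n) continue;  // clamp at image borders
            const float w = std::exp(-(k * k) / (2.f * sigma * sigma));
            sum += w * c[q];
            wSum += w;
        }
        out[p] = sum / wSum;  // renormalize by the weights actually used
    }
    return out;
}
```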
Worked on a homemade ray tracing denoiser. Ended up with something in the spirit of the À-Trous wavelet and SVGF papers, which works quite well for simple, diffuse #voxel meshes.
8 spp images, before and after. Details of the journey from Gaussian blur to À-Trous wavelet denoiser in thread 🧵
#raytracing
Added a quick and dirty VOX importer to my #voxel project. Since I recently played through Monument Valley, I checked out Ephtracy's Monument Valley files first
The planetary sunrises and sunsets you get with the Unreal sky model are probably overkill for the #voxel renderer, but they make for nice #screenshotsaturday material 💪
The new model gives the transmittance at any point in the sky, so the sun turns convincingly red as it approaches the horizon.
Integrated the Unreal Engine sky model into the #voxel editor. The sunsets didn't look right in the Hosek-Wilkie model (left). With the new sky model, sunsets are now exactly how I wanted them
Pardon the fireflies, I need to get proper denoising set up ✨
#raytracing
At a quick glance I thought you had gone down some kind of hardcore atmospheric rendering rabbit hole
The sky is an implementation of Hillaire's model: sebh.github.io/publications...
The atmosphere is rendered into a small 2D angular-coordinate lookup table using the raymarching integrators from section 3. The LUT is small enough that it can be updated in real time on the CPU.
The images show a 2D look-up table of scattered light.
The sun position and camera altitude can be played around with in real time. But zooming to space and back just looks so good
Good morning from an altitude of 40 km!
Putting off other tasks in the #voxel project to work on a much nicer ray-marched sky for the ray tracer.
#screenshotsaturday #raymarching #raytracing
One unexpected benefit of the color palette is storing just an offset index in the voxel. Voxels shrank from 4 bytes to a single byte.
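The palette trick can be sketched like this: voxels store a one-byte index into a shared 256-entry color table instead of a packed 4-byte color. Struct and function names are my own illustrative choices.

```cpp
#include <array>
#include <cstdint>

// A chunk carries one shared 256-entry palette of packed RGBA colors.
struct Palette {
    std::array<uint32_t, 256> colors;
};

// A voxel is now just its palette index: 1 byte instead of 4.
using Voxel = uint8_t;

inline uint32_t voxelColor(const Palette& palette, Voxel v) {
    return palette.colors[v];
}

static_assert(sizeof(Voxel) == 1, "voxels shrank to a single byte");
```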
Added a simple SIMD optimization in my RLE algorithm, yielding a 6x #voxel chunk compression speed-up
#screenshotsaturday
Mandatory view of the scene in the #raytracing mode ✨