There are some pretty nice test cases in this glTF issue which I'm looking forward to adding to the test suite: github.com/KhronosGroup...
Posts by Pablo Delgado
Currently implementing Gaussian splatting in guc (glTF to USD converter) so that it can be released as soon as the extension comes out of RC status.
I also hooked it up to the USD animation system, so that it matches playback and can be controlled with the usdview slider.
This weekend I added support for the MaterialX <time> and <frame> nodes to Gatling. Procedural materials can now be animated over time, such as this clock authored by Roberto Ziche:
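For context, the MaterialX standard library's <time> node outputs the current scene time in seconds and <frame> the current frame number. A minimal animated nodegraph might look like the following sketch (illustrative names, not the clock asset from the post):

```xml
<?xml version="1.0"?>
<materialx version="1.39">
  <!-- Drive a scalar from scene time; feed this into any float input. -->
  <nodegraph name="NG_pulse">
    <time name="t" type="float" />
    <sin name="wave" type="float">
      <input name="in" type="float" nodename="t" />
    </sin>
    <output name="out" type="float" nodename="wave" />
  </nodegraph>
</materialx>
```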
Since the renderer is used as a Hydra render delegate, this is the concern of the host application (usdview). I just pass it some CPU memory.
In general, it was nice getting to know Metal a bit - and I can't wait to write some user-land code on my MacBook Air! Btw: the Vulkan backend is ~3.8k LOC and the Metal one ~2.2k LOC.
As these functions don't have access to intrinsics, we need to pass them along as part of their signatures, which the function tables are parameterized with. Here's an example of how the ray generation shader signature ends up looking with two ray payloads:
Closest-hit shaders and miss shaders need to be emulated. They are compiled with their entry points being "visible" and are invoked from visible function tables (VFTs) based on the traversal result. For each payload type we have 1x IFT and 2x VFTs.
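Conceptually, the emulation boils down to indexing per-payload function tables with the traversal result. Here's a CPU-side Python analogy of that dispatch (hypothetical names and shapes, not Gatling's actual GPU code, where the tables hold Metal function handles):

```python
# CPU-side analogy of Metal visible function table (VFT) dispatch.
# Per payload type there are two tables: closest-hit and miss shaders.

def closest_hit_shadow(payload, hit):
    payload["occluded"] = True

def miss_shadow(payload, hit):
    payload["occluded"] = False

def closest_hit_radiance(payload, hit):
    payload["color"] = hit["albedo"]

def miss_radiance(payload, hit):
    payload["color"] = (0.0, 0.0, 0.0)  # background

# One pair of tables per payload type, indexed by a shader index
# (the analog of an SBT record offset).
VFT_CLOSEST_HIT = {"shadow": [closest_hit_shadow], "radiance": [closest_hit_radiance]}
VFT_MISS = {"shadow": [miss_shadow], "radiance": [miss_radiance]}

def trace_ray(payload_type, payload, hit, shader_index=0):
    """Invoke the emulated shader based on the traversal result."""
    table = VFT_CLOSEST_HIT if hit is not None else VFT_MISS
    table[payload_type][shader_index](payload, hit)
    return payload
```

A hit result routes into the closest-hit table, no hit into the miss table; the payload type picks which pair of tables applies.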
An important difference between the graphics APIs is that Metal only has 'intersection' shaders that are invoked for non-opaque geometry. This is what I map any-hit shaders to. Their function addresses are stored in an intersection function table (IFT), similar to an SBT.
On the API side I use metal-cpp, and for shaders I partially implement the GLSL_EXT_ray_tracing extension in SPIRV-Cross. It's a bit hacky and only supports the features I need, but it's content-agnostic overall. Ideally it's transitional until Slang or KosmicKrisp's compiler can be used.
Finally finished Gatling's Metal backend! Here's a teaser of NVIDIA's USD / MDL sample scene 'Attic' running on macOS. What's special about the backend is probably that it uses the same GLSL code as the Vulkan backend, complete with hardware ray tracing. How does that work?
My "No Graphics API" blog post is live! Please repost :)
www.sebastianaaltonen.com/blog/no-grap...
I spent 1.5 years doing this. Full rewrite last summer and another partial rewrite last month. As Hemingway said: "The first draft of anything is shit."
An impressively thorough blog post on motion blur rendering by Alex Gauggel:
gaukler.github.io/2025/12/09/n...
This scene from WireWheelsClub looks like this out of the box - no editing was required.
Performance is not a priority right now, but it's good enough with this simple lighting on an RTX 2060 at WQHD.
There are still some minor issues to resolve, including getting rid of static state (for multiple viewports) and implementing support for orthographic cameras.
Spent some time this weekend updating Gatling’s Blender integration to the latest 5.0 release.
A cloud rendered using jackknife transmittance estimation and the formula used to do so.
Ray marching is a common approach to GPU-accelerated volume rendering, but gives biased transmittance estimates. My new #SIGGRAPHAsia paper (+code) proposes an amazingly simple formula to eliminate this bias almost completely without using more samples.
momentsingraphics.de/SiggraphAsia...
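The paper's exact estimator is in the linked article; for intuition only, the classic (Quenouille) jackknife removes the leading O(1/n) bias of a nonlinear estimator, and transmittance from ray marching is exactly that: an exp of an averaged optical depth. A generic sketch of jackknife bias correction, not the paper's formula:

```python
import math

def jackknife_correct(samples, estimator):
    """Classic jackknife bias correction: n*full - (n-1)*mean(leave-one-out).

    'estimator' maps a list of samples to a scalar estimate."""
    n = len(samples)
    full = estimator(samples)
    loo = [estimator(samples[:i] + samples[i + 1:]) for i in range(n)]
    return n * full - (n - 1) * sum(loo) / n

def naive_transmittance(densities):
    # exp of an unbiased mean optical depth is a biased transmittance
    # estimate, because E[exp(-X)] != exp(-E[X]).
    return math.exp(-sum(densities) / len(densities))
```

Averaged over many trials, the corrected estimate lands measurably closer to exp(-E[density]) than the naive one for the same sample count.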
Btw, a few weeks ago I released a small plugin for usdview. It allows inspection of UsdShade networks using a custom Qt node graph. It’s open source, lightweight and simple to install.
The Pixar RenderMan team has a paper out at HPG 2025 this week about the architecture of RenderMan XPU. There's a lot of interesting details in the paper- definitely a worthwhile read!
diglib.eg.org/bitstreams/d...
Source code on GitHub: github.com/pablode/cg-a...
Small weekend experiment: ported an LCD shader from Blender to #MaterialX (originally authored by PixlFX)
We just posted a recording of the presentation I gave at DigiPro last year on Hyperion's many-lights sampling system. It's on the long side (30 min) but hopefully interesting if you like light transport! Check it out on Disney Animation's website:
www.disneyanimation.com/publications...
www.youtube.com/watch?v=BUpD...
Implemented USD point instancer primvar support in my toy renderer the last few days. Can now render this 2D Gaussian Splatting (2DGS) scene with a single mesh & MaterialX material.
(Created by Oliver Markowski in Houdini; see link below ⬇️)
Thanks to a lot of colleagues' great work, happy to share Vulkan samples for RTX Mega Geometry. They should run on all RTX GPUs using today's new drivers
github.com/nvpro-sample...
github.com/nvpro-sample...
github.com/nvpro-sample...
github.com/nvpro-sample...
Three different examples of the Chiang Hair BSDF in MaterialX v1.39.2, rendered in NVIDIA RTX.
Highlighting one of the key contributions in MaterialX v1.39.2: Masuo Suzuki at NVIDIA contributed the Chiang Hair BSDF, seen below in NVIDIA RTX. It opens the door to authoring cross-platform, customizable hair shading models in MaterialX and OpenUSD.
github.com/AcademySoftw...
Wrote a blog post about my development process and the tools I’ve built to develop the Spark codecs:
ludicon.com/castano/blog...
Lastly, implemented proper support for AOVs including those used for picking (primId, instanceId, elementId). (asset: standard shader ball) (4/4)
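The picking AOVs mentioned above can be thought of as integer ID images sampled under the cursor; a minimal Python sketch of the lookup (hypothetical buffer layout, not Gatling's actual API):

```python
# Resolve a mouse pick from integer ID AOVs.
# Each AOV is a row-major flat list of ints, one value per pixel.

def pick(aovs, width, x, y):
    """Return the (primId, instanceId, elementId) under pixel (x, y)."""
    idx = y * width + x
    return tuple(aovs[name][idx] for name in ("primId", "instanceId", "elementId"))
```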