
Posts by Mike Dereviannykh

I will be presenting in the 10:30 session in room 208-209. I will share thoughts about implicit/generative and explicit appearance representations. See you there if that sounds interesting!

8 months ago 2 1 0 0

In less than an hour we'll present "Neural Two-Level Monte Carlo Real-Time Rendering" at SIGGRAPH 2025 in the "Best of Eurographics" session

Please join us, it'll be fun! 😊

🏠 Room 208-209
🕜 10:30-11:30

8 months ago 3 0 0 0
Video

Physically-based differentiable rendering enables inverse rendering, but handling visibility is hard. Our SIGGRAPH 2025 paper uses quadrics to importance sample silhouette edges, outperforming all existing unidirectional differentiable path tracers.
momentsingraphics.de/Siggraph2025...

8 months ago 41 7 1 1

Done, thank you so much for the info!

8 months ago 1 0 0 0

And it's a fantastic coincidence that I'm already next to Vancouver and don't need to suffer from jet lag a second time... a horrible thing

8 months ago 1 0 0 0
Preview
Presentation - SIGGRAPH 2025 Conference Schedule

Session time and location for your schedule:
s2025.conference-schedule.org/presentation...

8 months ago 0 0 0 0

Big news! The Eurographics Association invited us to present our work "Neural Two-Level Monte Carlo Real-Time Rendering" at #SIGGRAPH2025 🎉
Super honored - my first SIGGRAPH!

Let's discuss neural & real-time rendering, grab a coffee, or just hang out - feel free to send a DM

8 months ago 8 0 2 0
Post image

Just joined Reality Labs at Meta to do some real-time neural rendering research. Unfortunately without a 9-digit compensation package, but WIP 😁

I'm in the Redmond office, but have already visited Seattle. If you wanna grab ☕ - DMs are open

8 months ago 2 0 0 0

Currently, the issue with efficient spatial encoding is more or less resolved (iNGP or the new GATE by @boksajak.bsky.social )

But that's not the case for the directional domain at all, despite its importance for unbiased rendering, Cache-Based Resampling, and the Two-Level Monte Carlo estimator
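To make the "spatial encoding" part concrete, here is a minimal toy sketch of the idea behind multiresolution hash-grid encodings in the spirit of iNGP: hash the integer grid corners at several resolutions, fetch small learned feature vectors, and interpolate. All sizes, primes, and the random (untrained) tables here are illustrative assumptions, not the actual implementation.

```python
import numpy as np

# Toy 2D multiresolution hash-grid encoding (iNGP-style sketch).
# Real implementations run on GPU with feature tables trained online.

rng = np.random.default_rng(0)

NUM_LEVELS = 4       # grid resolutions 16, 32, 64, 128
TABLE_SIZE = 2**14   # hash table entries per level
FEAT_DIM = 2         # learned features per entry

# One random (untrained) feature table per level.
tables = [rng.normal(size=(TABLE_SIZE, FEAT_DIM)) for _ in range(NUM_LEVELS)]
PRIMES = np.array([1, 2654435761], dtype=np.uint64)

def hash_coords(ij):
    """Spatial hash of integer grid coords -> table index."""
    h = ij.astype(np.uint64) * PRIMES
    return int((h[0] ^ h[1]) % TABLE_SIZE)

def encode(x):
    """Encode a point x in [0,1]^2 into concatenated per-level features."""
    feats = []
    for level, table in enumerate(tables):
        res = 16 * 2**level
        p = x * res
        base = np.floor(p).astype(np.int64)
        w = p - base                       # bilinear interpolation weights
        acc = np.zeros(FEAT_DIM)
        for di in (0, 1):
            for dj in (0, 1):
                corner = base + np.array([di, dj])
                weight = (w[0] if di else 1 - w[0]) * (w[1] if dj else 1 - w[1])
                acc += weight * table[hash_coords(corner)]
        feats.append(acc)
    return np.concatenate(feats)

z = encode(np.array([0.3, 0.7]))
print(z.shape)  # (8,) = NUM_LEVELS * FEAT_DIM
```

The point of the hash is that table size stays fixed while the finest grids get arbitrarily dense; collisions are left for the training signal to resolve.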

9 months ago 2 0 0 0
Post image Post image

To be completely honest, it's an equal-time comparison:

3 SPP vs 1 SPP + 25 neural resamples

I believe we should invest more resources in more rapid adaptivity of neural caches and more aggressive quantization, so we can deliver it to production real-time rendering

9 months ago 2 0 1 0

It's cool work that deserves a lot of attention from the real-time rendering community!

Previously I had a bit of time to conduct similar experiments on top of our NIRC, since each additional neural sample costs just pure tensor FLOPs, ~1.5 ms on a 4080

1 spp vs 1 spp + 25 cache resamples

9 months ago 2 0 1 0

I found it cool to hear the motivation for High-Frequency Learnable Encoding for NRC/NIRC/Neural Ambient Occlusion from the perspective of Kernel Machines!

Classics 😊

9 months ago 5 1 0 0

Hmmm, 1 min sounds amazing, I didn't expect such speed!

10 months ago 0 0 0 0

Congrats to you and to the whole team!

I've got a question: the volumetric blocks basically don't have any transparency, do they? And do you rely on pruning to make them look right because of that?

10 months ago 0 0 1 0

Thanks 🥳

11 months ago 0 0 0 0
Post image Post image Post image Post image

I was very honoured to receive one of the two Eurographics Young Researcher Awards 2025 yesterday!

This is the result of the work of many people: mentors, collaborators, students, and friends who trusted me and taught me so much along the way!

11 months ago 38 4 9 0

Huge congrats, Valentin!

100% Deserved

11 months ago 1 0 1 0
Preview
Neural Two-Level Monte Carlo Real-Time Rendering Efficient real-time global illumination rendering using Neural Incident Radiance Cache combined with Two-Level Monte Carlo. By Mikhail Dereviannykh, Dmitrii Klepikov, Johannes Hanika, and Carsten Dach...

If you've missed the paper, just check it out:
mishok43.github.io/nirc/

And feel free to reach out to me if you want to help push forward neural rendering for real-time applications

11 months ago 0 0 0 0

We've received an "Honorable Mention" at Eurographics 2025 for our work "Neural Two-Level Monte Carlo Real-Time Rendering" in London! 🥳

Huge thanks to everyone who supported me along the way, and to the EG chairs, committee, and organizers for this recognition

11 months ago 4 0 1 0
Post image

That's what I mean by the lack of compute

11 months ago 0 0 0 0

If you're interested in something like AlphaEvolve but focused on CG, GameDev, or offline rendering, feel free to reach out. I've been leading research in this space with strong results so far.

But we need support with compute, expertise, and maybe even engineering

11 months ago 1 0 1 0

And of course, huge thanks to my amazing co-authors Dmitrii Klepikov, Johannes Hanika, and Carsten Dachsbacher, and to industry friends @kaplanyan.bsky.social , Sebastian Herholz, @momentsingraphics.bsky.social , who helped me along the way!

11 months ago 1 0 0 0
Preview
Neural Two-Level Monte Carlo Real-Time Rendering Efficient real-time global illumination rendering using Neural Incident Radiance Cache combined with Two-Level Monte Carlo. By Mikhail Dereviannykh, Dmitrii Klepikov, Johannes Hanika, and Carsten Dach...

But there are so many other cool questions that we tried to cover

So please check out our webpage, demo videos, and the paper itself: mishok43.github.io/nirc/

📍 I'll be at Eurographics next week in London - if you're around and want to talk rendering, AI, or just grab coffee, DM me!

11 months ago 1 0 1 0
Post image

In equal-time comparisons, NIRC achieves surprisingly good results in both the biased and unbiased cases

But yeah... variance may increase next to foliage, brush, and trees 🌿 - still the eternal pain in CG 😅

11 months ago 0 0 1 0
Post image

Using Two-Level Monte Carlo, we can debias NIRC while still cutting variance, thanks to fast cache sampling: dozens of cache samples for the cost of 1 real path

It works like a (neural) control variate, but doesn't introduce any architectural constraints! No need to train Normalizing Flows on the fly
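The control-variate-style structure of a two-level estimator is easy to show in a toy 1D setting: a cheap approximate integrand g (standing in for the neural cache) carries most of the signal, and a few expensive samples of the residual f - g restore unbiasedness. The functions and sample counts below are made up for the demo, not the paper's setup.

```python
import numpy as np

# Two-level Monte Carlo sketch: E[f] = E[g] + E[f - g],
# with g cheap (many samples) and f - g expensive (few samples).

rng = np.random.default_rng(1)

f = lambda x: np.sin(3 * x) ** 2 + 0.3 * x   # "true" expensive integrand
g = lambda x: np.sin(3 * x) ** 2             # cheap cache-like approximation

N_CHEAP, N_EXP, TRIALS = 25, 1, 20000

def naive(n):
    """Plain MC with n expensive samples of f."""
    return f(rng.random(n)).mean()

def two_level():
    # Level 1: many cheap samples of the cache g.
    cache = g(rng.random(N_CHEAP)).mean()
    # Level 2: few expensive samples of the residual f - g (debiasing term).
    x = rng.random(N_EXP)
    return cache + (f(x) - g(x)).mean()

naive_est = np.array([naive(N_EXP) for _ in range(TRIALS)])
tl_est = np.array([two_level() for _ in range(TRIALS)])

print("naive     mean/std:", naive_est.mean(), naive_est.std())
print("two-level mean/std:", tl_est.mean(), tl_est.std())
# Both means agree (unbiased); the two-level std is noticeably smaller.
```

Crucially, nothing is assumed about g's architecture: any cache that is cheap to sample works, which is the "no architectural constraints" point above.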

11 months ago 1 0 1 0
Post image

Another positive scalability property: deeper MLPs do improve quality here

Downside: not all scenes benefit from it (especially with a high-variance MC estimator; this must be researched further)

11 months ago 0 0 1 0
Post image

And we get basically classical Monte Carlo integration, but over the neural domain!

And it scales pretty well with the number of neural samples!

11 months ago 0 0 1 0
Post image

NIRC amortizes iNGP costs via task reformulation: from outgoing to incident radiance

1. Use a hash grid at the surface point → get a latent light representation
2. Sample incoming directions via the BSDF
3. Decode radiance using MLPs (per direction)

The more directions, the more we leverage GPU tensor FLOPS 💥
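The three steps above can be sketched as follows. Everything here is a toy stand-in under stated assumptions: a fake spatial encoding instead of a real hash grid, cosine-weighted sampling as the "BSDF", and an untrained two-layer MLP decoder; shapes and names are illustrative, not the paper's architecture.

```python
import numpy as np

# NIRC-style evaluation sketch: encode position once, then decode
# radiance for many sampled directions in one batched matmul.

rng = np.random.default_rng(2)
LATENT, HIDDEN = 16, 32

# Untrained decoder weights (in practice trained online).
W1 = rng.normal(0, 0.1, (LATENT + 3, HIDDEN))
W2 = rng.normal(0, 0.1, (HIDDEN, 3))

def spatial_encode(p):
    """Step 1 stand-in: 'hash-grid lookup' at surface point p -> latent."""
    rng_p = np.random.default_rng(abs(hash(tuple(np.round(p, 4)))) % 2**32)
    return rng_p.normal(size=LATENT)

def sample_dirs_cosine(n):
    """Step 2 stand-in: cosine-weighted hemisphere sampling (diffuse BSDF)."""
    u1, u2 = rng.random(n), rng.random(n)
    r, phi = np.sqrt(u1), 2 * np.pi * u2
    return np.stack([r * np.cos(phi), r * np.sin(phi), np.sqrt(1 - u1)], axis=1)

def decode_radiance(latent, dirs):
    """Step 3: per-direction MLP decode. Batching all directions into one
    matmul is why more directions amortize the fixed encoding cost."""
    x = np.concatenate(
        [np.broadcast_to(latent, (len(dirs), LATENT)), dirs], axis=1)
    h = np.maximum(x @ W1, 0.0)      # ReLU
    return np.maximum(h @ W2, 0.0)   # non-negative RGB radiance

latent = spatial_encode(np.array([0.1, 0.2, 0.3]))   # paid once per point
dirs = sample_dirs_cosine(25)                        # 25 neural samples
radiance = decode_radiance(latent, dirs)
print(radiance.shape)  # (25, 3)
```

Note how the encoding cost is paid once per shading point while the per-direction work is pure dense matmuls, matching the "pure tensor FLOPs per extra neural sample" claim in the posts above.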

11 months ago 0 0 1 0
Post image

Inspired by NRC + iNGP's adaptivity from the amazing Thomas Müller, Christoph Schied, Jan Novák, Alex Evans, et al., but we found key limits:
– Up to 70% of time spent on iNGP → memory-bound
– Deeper MLPs ≠ significantly better quality → poor FLOPs scaling
– Biased for specular & detailed BSDFs with normals

11 months ago 1 0 1 0
Neural Two-Level Monte Carlo Real-Time Rendering
Neural Two-Level Monte Carlo Real-Time Rendering YouTube video by Mike Derevyannykh

🚨 CG Paper, EG 2025

As scenes & lighting in games grow in complexity, we introduce the Neural Incident Radiance Cache (NIRC), a real-time, online-trainable cache that:

🚄 Costs just ~1 ms/neural sample at 1080p
☘️ Decreases MC variance
🥳 Saves on bounces
www.youtube.com/watch?v=Y791...

11 months ago 30 9 2 3