
Posts by Max Liani

I thought it’s the other way around: time runs slower when traveling at high velocity, so a server in orbit should run fewer cycles than a server on Earth. Am I seeing this backwards?

4 days ago 1 0 1 0

It’s multi-modal :)
When you count that, the amount of non-verbal sensory input is huge.

2 weeks ago 1 0 0 0

Thank you. I’ll write a blog post about the method.

2 weeks ago 0 0 0 0

Yes, it’s 12 bytes of indices and flags per line, storing the context of the expansion loop: which node, which parent, which siblings, whether the tree node is expanded… stuff like this.
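A hypothetical sketch of what such a 12-byte per-line cache entry could look like; the field names and layout are my guess, not the actual implementation:

```cpp
#include <cstdint>

// Guessed layout for a 12-byte per-line entry in the flattened
// tree-view cache: indices identify the node and its neighbours,
// flags record expansion state. Names are illustrative only.
struct TreeLineCache {
    uint32_t nodeIndex;     // which scene node this line refers to
    uint32_t parentIndex;   // which node is its parent
    uint16_t siblingOrder;  // position among its siblings
    uint16_t flags;         // e.g. bit 0: is the tree node expanded?
};
static_assert(sizeof(TreeLineCache) == 12, "entry should stay at 12 bytes");
```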

2 weeks ago 0 0 0 0

It’s a good point; I don’t have an implementation for multi-selection reparent, and it doesn’t change the selection.

2 weeks ago 1 0 0 0

With that many nodes it takes a noticeable fraction of a second to invalidate and rebuild the cache when the scene structure changes, or when a node is expanded/collapsed, changing the flattened list.

2 weeks ago 3 0 1 0

I had to create a flat cache of the lines drawn, and that lets me jump to the visible band of TreeNode lines I need. I still rely on the nested structure, so it isn't a flat list... but a few hops in the cache reconstruct the nesting context to initialize ImGui::TreePush/TreePop.
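The "jump to the visible band" part can be sketched as plain arithmetic over the flat cache, assuming a fixed line height; the function and names here are illustrative, not the app's actual code:

```cpp
#include <algorithm>
#include <cstdint>

// Given the scroll offset and viewport height in pixels, compute the
// [first, last) range of cached lines that are actually on screen.
// Everything outside this band can be skipped.
struct VisibleBand { int64_t first; int64_t last; };

VisibleBand visibleBand(double scrollY, double viewportH,
                        double lineH, int64_t lineCount) {
    int64_t first = static_cast<int64_t>(scrollY / lineH);
    int64_t last  = static_cast<int64_t>((scrollY + viewportH) / lineH) + 1;
    first = std::clamp<int64_t>(first, 0, lineCount);
    last  = std::clamp<int64_t>(last, 0, lineCount);
    return {first, last};
}
```

With millions of lines in the cache, only the few dozen inside the band need real TreeNode calls; a few hops up the parent indices then rebuild the TreePush/TreePop nesting for the first visible line.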

2 weeks ago 2 0 1 0
Video

Alright, I made some further improvements to the tree view, which now draws in constant time (with respect to scene complexity). Here is a test with 16 million transform nodes viewable in the scroll area. Without screen recording, the whole app renders at 8kHz.

2 weeks ago 15 1 2 0

It’s probably the sensible thing to do. Thank you. I do support left-click on the first selection line, followed by shift-left-click to complete the selection over a continuous range of lines.
Maybe shift-left-click-drag would be the natural thing for marquee.

2 weeks ago 1 0 0 0

I recall in Maya left-click-drag is always select; for drag&drop one has to use middle-click-drag. Never liked that much either, it's not very laptop-friendly.

2 weeks ago 0 0 1 0

The typical implementation requires one to click-drag on empty space to start a selection marquee. Somehow that feels ok for a file explorer window, but I find it annoying and limiting in a scene hierarchy tree view.

2 weeks ago 1 0 1 0
Video

I played a bit with the UX of interacting with the tree view. The common operations are selection and re-parent. I want to control both with the same gesture. So, if the drag&drop is mostly vertical -> reparent. If it widens to the side, it becomes marquee selection.
Yes, no?

2 weeks ago 8 1 2 0

Dear ImGui offers a convenient ImGuiListClipper to draw only a range of a long list. In this case I am not relying on that because my data structure is not a flat list. For widgets out of view, I call a cut-down version of TreeNode I call TreeNodeSkip, and that feels good enough for now.

2 weeks ago 0 0 0 0

That's a test with 1 million lines in the tree view, which is likely much more than one can reasonably expect to interact with. I am happy my take on multi-selection works and feels zippy.

2 weeks ago 0 0 1 0
Video

I am not sure why I am doing this, but I further optimized the scene tree view, as it was getting a bit slow and clunky to operate when I had many nodes expanded. It's not cheap; it's still the most expensive part of the program to draw.

2 weeks ago 9 0 1 0

I have had this in the background today. A mostly silent stream with the occasional com, making me jump from time to time 😆

2 weeks ago 4 0 0 0

It should also work with rectangular textures, as long as the area in pixels of each mip is a power of 2, or with non-power-of-two sizes, as long as the data is padded to align to power-of-two boundaries.

3 weeks ago 1 0 0 0

Take number & mask if your mip 0 is the 1x1 resolution, or number & ~mask if mip 0 is the high resolution. Multiply that by the pixel size in bytes.
If my mental calculations are correct, you should get your offset.
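A sketch of this trick for a square power-of-two texture, assuming the mips are stored smallest-first (mip 0 = 1x1); this is my reading of the posts, not tested code from the renderer:

```cpp
#include <cstdint>

// Sum of all mip areas for a 2^k x 2^k texture: 1 + 4 + 16 + ... + 4^k.
// In binary this is the 0b...010101 pattern, precomputed once.
uint64_t totalMipArea(uint32_t log2Size) {
    uint64_t sum = 0;
    for (uint32_t i = 0; i <= log2Size; ++i)
        sum += uint64_t(1) << (2 * i);
    return sum;
}

// Byte offset of the mip whose resolution is res x res, with mips
// stored smallest-first. res*res is a single set bit; subtracting 1
// flips on all the bits below it, so masking the precomputed sum
// keeps exactly the areas of the mips stored before this one.
uint64_t mipOffsetBytes(uint64_t total, uint64_t res, uint64_t pixelBytes) {
    uint64_t mask = res * res - 1;
    return (total & mask) * pixelBytes;
}
```

For an 8x8 texture the precomputed sum is 1+4+16+64 = 85; masking with 63 keeps 1+4+16 = 21 pixels of smaller mips before the 8x8 level.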

3 weeks ago 1 0 1 0

I haven’t tried this, but if the texture is power of 2, you can pregenerate a number which is the sum of the mip areas in pixels: 1+4+16+64… Then, from the resolution of the current mip level (say 8x8 = 64, which should be a single bit), subtract 1 to flip on all the bits below. Use that as the mask.

3 weeks ago 1 0 2 0
Post image

Here is how I sample this light:

3 weeks ago 12 0 0 0

I read your paper when you posted it a while ago, it’s great stuff!
A difference perhaps is that this is not a spherical rectangle; it’s closer to an orthogonal spherical crop. Those derivations on a different starting function are beyond my abilities.

3 weeks ago 0 0 1 0
Video

It's not a uniform distribution. Well, it is relatively uniform if at least one of the angles is small. It creates 4 singularities when both angles approach π. But that is outside the intended use because there are easier ways to cover the entire hemisphere :D

Here is the result in render.

3 weeks ago 12 0 1 0
Video

A long time ago in a galaxy far away (circa 2011) I came up with a distant rect light for my lighting crew. I can't remember how I did it, but I am trying to recreate it.

Here is a test for the light sample distribution in the hemispherical domain.

3 weeks ago 32 1 3 0

Sometimes you want the character to feel like it's moving through the environment, so practical lights and indirect illumination cover that. Sometimes you need to highlight a feature and keep it stable -> distant lights.

3 weeks ago 2 0 0 0

What are distant lights good for?
Not for the sun; you get much better results with an HDR map. They are good for fill, rim, and modeling lights, especially when characters move around in the sequence and you need to maintain shot-to-shot consistency.

3 weeks ago 5 0 1 0
Video

After how many years? I added code for distant area lights. Some call it "directional". The control "scale compensation" is meant to preserve the light power (radiant flux), so as the emissive area increases, the intensity decreases.
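The "scale compensation" described can be sketched as keeping radiant flux constant: since flux = intensity × area, intensity must fall as the emissive area grows. The function name is mine, not the app's:

```cpp
// Preserve radiant flux when the user scales the light's emissive
// area: flux = intensity * area, so as area grows the intensity
// shrinks by the same factor, keeping flux constant.
double compensatedIntensity(double baseIntensity, double baseArea,
                            double area) {
    return baseIntensity * (baseArea / area);
}
```

Scaling a light to 4x its original area with compensation on would drop the intensity to a quarter, so the render's overall exposure stays put.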

3 weeks ago 27 0 1 0

It looked fun for a minute, but it turned sour quickly.
I don’t know what to do with those people. It’s not worth showing them, it only empowers them. Answering with patience is pointless (I tried, he didn’t reply).
I wish they’d find a better way to channel their eagerness.

3 weeks ago 10 0 0 0

Likely I could preallocate those resources. It would complicate some code, like the fallback if initialization fails. Some other denoisers won’t be able to tell how much memory they need until initialized.

4 weeks ago 1 0 0 0

Everything is better when workflow is simple and repeatable.

4 weeks ago 0 0 1 0

Thanks. A big GPU helps :)

4 weeks ago 2 0 1 0