What absolute buffoonery! Balloons only generate static electricity when rubbed on hair.
Posts by Real Time VFX Mike
We've been over this: the shambler needs hair to generate static electricity for his lightning attack.
A little behind the scenes. All shadows are baked and the lighting is all very much faked.
It looks like the animated wallpaper plugin I was using no longer works with Windows 11, but this should work with Wallpaper Engine and I'll look into making it work as a screen saver. Hopefully this looks ok compressed.
I will be returning with NEW music for the Prodeus DLC!
This works a lot better than I expected it to. #screenshotsaturday #indiedev
You could treat the midpoint like a point in a bezier curve. Then that value will be guaranteed at that point in the gradient. You would have to generate some handles that make sense based on where you are moving the midpoint.
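The midpoint-as-bezier-point idea can be sketched in a few lines. This is a hypothetical Python illustration (the function and parameter names are mine, not from any gradient editor): solving for the quadratic bezier control handle is the "generate handles that make sense" step, and it guarantees the gradient hits the chosen value wherever the midpoint has been moved.

```python
def gradient_sample(t, v0, v1, t_mid, v_mid):
    """Evaluate a quadratic bezier from v0 to v1 whose value at t = t_mid
    is guaranteed to be v_mid. The control handle p1 is solved from
    B(t_mid) = v_mid, so moving the midpoint regenerates the handle."""
    u_mid = 1.0 - t_mid
    # B(t_mid) = u_mid^2*v0 + 2*t_mid*u_mid*p1 + t_mid^2*v1  ->  solve for p1
    p1 = (v_mid - u_mid * u_mid * v0 - t_mid * t_mid * v1) / (2.0 * t_mid * u_mid)
    u = 1.0 - t
    return u * u * v0 + 2.0 * u * t * p1 + t * t * v1
```

The endpoints stay pinned at v0 and v1 no matter where the midpoint goes, since the bezier only interpolates its first and last points exactly.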
Looks cool but why not just put it all in 1 shader? /s
If I shared with you I'd have to share with everyone.
IT at game studios, "Let me just calibrate this $3000 monitor for you."
People buying a tv, "Gimme the biggest, brightest, cheapest screen you got!"
Bacon-wrapped filet with gorgonzola cream sauce and roast cauliflower and carrots. I'm participating!
Week-old pineapple and a vodka coke. #foodish
Does the triplanar 1-sample dither mess up mip maps? Sometimes when using branching logic in shaders it makes the ddx/ddy derivatives go crazy for that pixel quad.
Light probe version is on the back burner as it is kinda ugly, but it does run faster as it doesn't require any filtering after being drawn to the screen.
Also added some rejection algorithms to the temporal blending. The base blending amount is 3%, but if the camera is moving a lot, the light is changing a lot, or areas are being exposed or hidden, the blending will increase to mitigate ghosting artifacts.
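The rejection heuristic above can be sketched roughly like this. Only the 3% base is from the post; the scale factors and the disocclusion short-circuit are made-up illustrative constants, not the actual implementation.

```python
def temporal_blend_weight(base=0.03, motion=0.0, light_delta=0.0, disoccluded=False):
    """Start from a 3% blend toward the new frame and raise it whenever the
    history is likely stale, to cut ghosting. Scale factors are arbitrary."""
    w = base
    w = max(w, min(1.0, motion * 0.5))       # fast camera -> trust new frame more
    w = max(w, min(1.0, light_delta * 2.0))  # lighting changed -> refresh faster
    if disoccluded:
        w = 1.0                              # no valid history for exposed areas
    return w

def temporal_blend(history, current, w):
    """Standard exponential moving average used for temporal accumulation."""
    return history * (1.0 - w) + current * w
```

With the defaults a stable scene keeps 97% of its history each frame, while a disoccluded pixel drops its history entirely.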
Lighting changes:
Bleeding the lit voxelized geometry instead of just black for light blocking. (more light propagation)
Generating mips for voxel textures for softer cone samples. (more stable)
Below is a view of what a light cone sees vs. what the scene looks like.
There is probably more that can be done with the filtering. I want to try doing 1 sample per pixel at full res and then breaking the screen into 64 tiles for broad filtering with better cache hits. I saw a research paper about this for AO but I can't find it now.
Perf changes:
Switched from SSAA to MSAA for voxelization.
Skipped distance field as it took too long to generate.
Added a low-res bilateral pre-filter before upsample.
Finally got this brighter and running at 110-120 fps on a 1080 GTX at 1080p. #screenshotsaturday #unity3d #gamedev #shaders
Still getting around 90 fps on 1080 GTX. The filtering and cone traces on the screen are expensive so switching to a light probe volume technique will hopefully speed things up and brighten stubborn dark areas.
SEGI is voxel cone trace-ish. As cones get larger it samples coarser cascades.
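The cascade selection SEGI-style cone tracing does can be sketched as: the cone's footprint radius grows with distance, and each cascade doubles the voxel size, so the cascade index grows with log2 of the footprint. The voxel size, cascade count, and function names here are illustrative assumptions, not SEGI's actual values.

```python
import math

def cone_radius(distance, half_angle):
    """Footprint radius of the cone at a given distance along its axis."""
    return distance * math.tan(half_angle)

def cascade_index(radius, base_voxel_size=0.25, num_cascades=6):
    """Pick the cascade whose voxel size covers the cone footprint.
    Each cascade doubles the voxel size, so the index grows as log2."""
    if radius <= base_voxel_size:
        return 0
    return min(num_cascades - 1, int(math.log2(radius / base_voxel_size)))
```

Near the cone apex this reads the finest cascade; far samples fall through to the coarsest one, which is what keeps wide cones cheap.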
Some more GI work. Removing all the biases and magic numbers I can find. Made things a little harsher/blotchier, but it is more predictable in more situations. I think a light probe volume might be better for evenly spreading more light. #screenshotsaturday #gamedev #unity3d #shaders
Some more updates on SEGI integration. Lights with shadows affecting the GI, custom emissive material, unfiltered GI samples, filtered GI samples. Running at 95 fps at 1080p on a 1080 GTX. Hoping to get it up to 120 fps. #screenshotsaturday #gamedev #unity3d #shaders
Also updated the voxel visualization shader. It uses the larger cascades like a sparse voxel octree to guess if there is anything in the lower cascades. Some of the normals are gray due to both sides of the cloth getting rendered to the same voxel. And there is a little light leaking from the sun.
Nah, that guy was obviously trying to mess with the EQ. V-shaped or death.
You can also add it to just a part of a character and make a cool invisibility cloak.
Posting some old stuff. Here is an active camo effect I have used on a few projects. This keeps the object opaque and pulls the background colors inwards.
Write up: vfxmike.blogspot.com/2018/06/opaq...
Repo: github.com/SquirrelyJon...
#vfx #unity3d #madewithunity #gamedev #indiedev #tutorial
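The "pull the background colors inwards" trick can be illustrated as a tiny UV offset. This is my own sketch of the idea, not code from the linked repo: samples of the grabbed background are pulled toward the object's screen-space center, so the object stays opaque but appears to bend the scene behind it.

```python
def camo_uv(screen_uv, center_uv, strength=0.2):
    """Offset the background sample toward the object's screen-space center,
    giving the refraction-like active camo look while staying opaque."""
    su, sv = screen_uv
    cu, cv = center_uv
    return (su + (cu - su) * strength, sv + (cv - sv) * strength)
```

At strength 0 the object is invisible against an unmoved background; raising strength exaggerates the inward pull at the silhouette.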
I should note that some of the light code was taken from the PipLight project which is all about static depth maps on lights. Untouched for 8 years. bitbucket.org/Zuntatos/pip...
Using the voxelized scene I can draw shadows for all the lights in 1 pass to an atlas of 64 lights. These lights and their shadows get injected in the next voxelization pass and are included in the GI. These shadows can also be used when lights get too far away to warrant a full shadow pass.
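An atlas of 64 lights naturally packs as an 8x8 grid of tiles. A minimal sketch of the tile-UV mapping, assuming a square atlas and row-major tile order (the layout details are my assumption, not stated in the post):

```python
def atlas_uv(light_index, u, v, tiles_per_row=8):
    """Map one light's local shadow-map UV into its tile of an 8x8 atlas
    holding 64 lights. Row-major tile order is assumed."""
    tx = light_index % tiles_per_row
    ty = light_index // tiles_per_row
    scale = 1.0 / tiles_per_row
    return ((tx + u) * scale, (ty + v) * scale)
```

The same mapping works in both directions: the single shadow pass writes each light into its tile, and the distant-light fallback reads it back out.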
During the voxelization pass I render the surface normal to another buffer; this is the only data needed for the second bounce pass, and it is also handy later when you need "soft" normals of the scene.
Normally the scene is voxelized in one pass by telling a camera to render, and then voxelized again in a light bounce pass. That second camera render can be skipped by executing a command buffer that draws a series of planes that sample the voxelized scene to do the bounce pass.
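The plane trick works because one camera-facing quad per depth slice is enough to touch every voxel in the volume. A sketch of picking the slice depths (Unity's actual CommandBuffer calls are omitted; this only shows the slice math):

```python
def bounce_slice_depths(depth_resolution):
    """Normalized depths at which to draw one plane per voxel slice,
    centered in each slice so every voxel is written exactly once."""
    return [(z + 0.5) / depth_resolution for z in range(depth_resolution)]
```

Drawing these planes through a command buffer replaces the second full scene render, since the planes only need to read the already-voxelized data.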