I hope all the little touches you put in your work may be appreciated like this one day
Posts by Samperson
holy shit you summoned
Hey!! Again massive compliments to you and the team 🫡
Just followed, I'll reach out once I get a minute!
Wack question, but does anyone know anyone on the Minecraft Create team that'd be down to talk about building stable game physics? The Create: Aeronautics mod is one of the most impressive things I've ever seen on this front, genuinely. Flex after flex after flex.
youtu.be/SO8tpWfZjGk
also this
i love playing in the outline, pulling in a bunch of scattered concepts, finding the most important-feeling load-bearing bits and fleshing them out as needed, mashing 'em together like action figures
i think the most taxing editing/writing job i've ever done was the last 10 minutes of the bass reeves doc
which is hilarious because it's the least-edited feeling bit of that entire project
i think i have several dozen versions of that timeline, all "test" "rev5" "v3", none "final"
either way i accomplished what i set out to do, and even if everything i did was a dead end, i still successfully mapped out a little more of whatever creative space that project is operating in. which is progress!
every tip i've seen has been a variant of "decide that you're just fucking around, and make sure your insecurities get the memo"
which is rad, because that's great advice
sometimes i'll set a 15min-1hr timer, and just set the goal of "do tinkering"
i can spend time. then i hit a stride, or don't!
you can't decide the quality of what you make - you can only control what your hands do, so the active goal has to be within reach of that
every project i've ever finished had the word "test" in the filename
I think being pessimistic is smart! And yeah, it's very publicly available - I only bothered to start really digging in after seeing it working, including on social media posts.
www.pangram.com
Right???? Like every bullshit alarm is going off looking at it from the outside
and to be clear, this is a service that detects whether text was human-written or AI-generated
like this is CRAZY, especially because, again, it seems to not be bullshit
their training process is Weird but makes. sense. sick as hell that this exists and works, especially at the current base model scaling plateau we're at, tho
i'm reading their technical report but "we simply trained the AI on GPT-generated text" feels too easy for the level of accuracy they seem to have - not just in test cases, but in the real world
arxiv.org/pdf/2402.14873
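for anyone curious, the core framing in that report (a supervised classifier trained on paired human/AI text) can be sketched in miniature — this is a toy bag-of-words logistic regression with made-up example sentences, nothing like Pangram's actual architecture or training data, just the shape of the idea:

```python
import math
from collections import Counter

# Toy sketch of "train a detector on AI-generated text":
# bag-of-words logistic regression trained by SGD on log-loss.
# Corpora below are invented; a real detector needs a model that
# captures style, not just vocabulary.

HUMAN = ["i went to the store and got eggs",
         "the cat sat on the mat all day"]
AI = ["in conclusion, we delve into the multifaceted topic",
      "let us delve deeper into this rich tapestry"]

def features(text):
    return Counter(text.split())

weights, bias = Counter(), 0.0

def score(text):
    # raw logit: higher means "more AI-like"
    return sum(weights[w] * c for w, c in features(text).items()) + bias

def train(epochs=200, lr=0.5):
    global bias
    data = [(t, 1.0) for t in AI] + [(t, 0.0) for t in HUMAN]
    for _ in range(epochs):
        for text, label in data:
            p = 1 / (1 + math.exp(-score(text)))  # sigmoid
            err = label - p                        # gradient of log-loss
            for w, c in features(text).items():
                weights[w] += lr * err * c
            bias += lr * err

train()
```

after training, the AI examples score higher than the human ones on this tiny set — which is exactly why "we simply trained on GPT text" feels too easy to me: the surprising part isn't the classifier, it's that it apparently generalizes to the real world.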
okay how does pangram labs work
i always assumed text would be too low-signal to accurately detect AI past a certain point, but this seems to genuinely mostly work? for real?
does it have to detect generation in a model-specific way based on token sequences or some shit
help
the medium of games is really special because unlike books, songs, or films, they can't be about anything
yeah alright close enough, welcome back
Tried a hack to do it in world space, which sorta-works, but float precision means that you need to keep tweaking the far clip offset or you'll get a hole in the back - which feels silly when I could handle it directly in clip space? But maybe the real problem is going back to object space at all.
So in short: I think you're right that my loose grasp of the flow between these spaces is the problem
I tried a more 'correct' variant running through view space to get back to object space, but this cancels out the clip space z alterations.
(the inverse z version ifdef'd out correctly inverts the z of all points... in object space haha)
I assume that since this is starting from a float4 clip space value, this w shouldn't need to be set to 1 - that said, not working in NDC space could be a problem?
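To sanity-check the w question for myself I wrote it out in plain Python with a GL-style projection matrix (Unity's conventions vary by platform, and all the numbers here are made up) — the key is that clip space only becomes NDC after dividing by w, and going back the other way you apply the inverse matrix and then divide by the *new* w again:

```python
def mat_vec(M, v):
    # 4x4 matrix times column vector
    return [sum(M[i][j] * v[j] for j in range(4)) for i in range(4)]

def inverse(M):
    # Gauss-Jordan inverse of a 4x4 matrix with partial pivoting
    n = len(M)
    A = [row[:] + [float(i == j) for j in range(n)] for i, row in enumerate(M)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[pivot] = A[pivot], A[col]
        p = A[col][col]
        A[col] = [x / p for x in A[col]]
        for r in range(n):
            if r != col:
                f = A[r][col]
                A[r] = [x - f * y for x, y in zip(A[r], A[col])]
    return [row[n:] for row in A]

# GL-style perspective: fov 90, aspect 1, near 1, far 10 (made up)
P = [[1, 0, 0, 0],
     [0, 1, 0, 0],
     [0, 0, -11/9, -20/9],
     [0, 0, -1, 0]]

obj = [0.5, -0.25, -2.0, 1.0]          # point with w = 1
clip = mat_vec(P, obj)                  # float4 clip-space position
ndc = [c / clip[3] for c in clip[:3]]   # perspective divide -> NDC (float3)

# going back from NDC: rebuild a float4 with w = 1, apply the inverse
# matrix, THEN divide by the resulting w — skipping that second divide
# is what mangles the float4 -> float3 step
v = mat_vec(inverse(P), ndc + [1.0])
obj_rec = [x / v[3] for x in v]
```

so starting from a raw float4 clip value you're right that w doesn't need to be forced to 1 — but the moment you've divided down to NDC (or anything derived from it), the trip back needs that re-divide.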
This also results in a strange behavior where it like, halves the object's visual position in world space? (video in next)
This is amazingly helpful, thank you so much
Just to run through each of these - TransformObjectToHClip does take a float3 and handle that 1 natively, unless I'm missing something? (This is from SpaceTransforms.hlsl)
I really appreciate you taking the time to mess with this!
Seems like I just end up recreating my original code every time haha, so maybe this is the way to go.
Your idea led me to try using the Screen Position (mode = raw) node to do the math in the graph, but going back to object space was still my issue
Hm. Maybe just manually specifying the far clip and doing CamPos + diffVec*farDist through a method like this is the way to go.
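fwiw the CamPos + dir * far idea checks out as vector math — minimal sketch in Python rather than HLSL, with made-up camera values. One wrinkle: a constant distance along the ray lands you on a *sphere* of radius farDist, so if you want the planar far clip you scale by the angle between the ray and the view direction:

```python
import math

def normalize(v):
    m = math.sqrt(sum(x * x for x in v))
    return [x / m for x in v]

def far_plane_point(cam_pos, cam_forward, vertex_ws, far_dist):
    # Ray from the camera through the vertex's world position.
    d = normalize([a - b for a, b in zip(vertex_ws, cam_pos)])
    # Scale so the point lands on the planar far clip (depth measured
    # along cam_forward == far_dist), not on a sphere of radius far_dist.
    t = far_dist / sum(a * b for a, b in zip(d, cam_forward))
    return [c + t * x for c, x in zip(cam_pos, d)]

# made-up numbers: camera at origin looking down +z, far plane at 10
p = far_plane_point([0, 0, 0], [0, 0, 1], [1, 0, 2], 10.0)
```

nice thing about this route is it never touches the projection matrix at all, so the float-precision hole-in-the-back problem from the clip-space round trip just doesn't come up.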
I'm trying to think of the most reliable way to get the world position of the far plane at each vertex in-shader, without the busted transformation I had originally.
I was pulling the matrix constants/functions by directly referencing SpaceTransforms.hlsl, I'm not sure if I misunderstood something there or selected the wrong one?
Yeah :(
I wish I understood where I was going wrong in the vertex displacement function. It looks super straightforward, not sure how to begin debugging what's going wrong.
My best guess is I misunderstand something in the Clip->Object space transformation (going float4 to float3)?