Noo not RT on 6502! I'm crazy but I'm not crazy!
But maybe mad hacks with the tube Co Pro and trying a tile based blob abomination to draw filled polys fast enough. 6502 is wild west.
But in testing the idea I accidentally wrote a compute raytracer for my laptop.
Well... I have a 6502ish renderer that can manage that, if you'll allow the tube coprocessor?
With a potential blob optimisation if you don't mind triangles with more or less than three sides.
The modern renderer tears through 16K verts. I think I can go faster with Vulkan - this is handmade OpenGL.
Mistakes were made. The reference deferred renderer is 200μs - the raytracing is ten times slower! Looks like there is an opportunity to optimise after all!
Modern GPUs are faster than you think.
I've now written a tumbling cobra at every tech level from 6502 assembly to compute raytracing.
Holy smokes when did hardware get so good? 200μs render is... five thousand frames per second? Yeah no.
I'm thinking I need more evil plans and stuff. Any ideas?
Welcome to town!
OK I tried some code and slept on it.
Drawing triangle-ish blobs without precise edges should be faster if you don't have a GPU, but there are problems.
Depth gets weird, and there's an image stability problem: one frame is faster because of the errors, but different errors every frame is an ocular nightmare.
kicking ass is back in town.
This week in mad code plans... writing a barycentric approximator!
You know triangles? What if you were bad at drawing them but it was faster. Sure they are a bit blobby but those blobs are a couple of pixels and the GPU isn't fill rate bound.
Yes I have gone wrong.
It's a simple games project - OpenGL, SDL, basic rendering, physics, and gameplay logic.
It's representative of the code you'd see in mainstream game dev. A fair amount smaller, less complex, fewer dependencies, so I'd hope it needs a lot less context.
It's a personal games project in C++, ~1% the size of a project from my professional life based on number of classes and lines of code.
I've run out of context in a conversation about a small fraction of that code. RAG and Cline help, but straight-up context needs to improve 10x or 100x.
My 2020s AI journey has come to an end. As a long term professional game developer who cares about the human experience more than anything, I've come to a dead end with the technology.
I'll get back to writing about games now :)
The full goodbye is here.
substack.com/home/post/p-...
Yea, I had hoped that would steer me toward a task it can excel at.
Ask for a horoscope, no precision or accuracy required. Meets expectations.
Ask for a json or yaml config, precision required. Results incorrect.
Ask for C++, precision and reasoning required, solved the wrong problem badly.
Well, my AI coding adventures have been fun but are coming to a close, and I'm working on the final write-up.
An 80B Qwen with 256K context is woefully insufficient to help with simple tasks in code ~1% the size of a real game.
I've still been unable to find anything it can help with.
Gonna make a hookup app for geeks called "pull request" #geekjoke
You and me baby ain't nothing but yaml,
So let's do it like they do it on the properties panel.
If you like languages where whitespace matters please reach out. We can help you.
If you design a language where whitespace matters you are going to hell. Straight to Hell. To the boiler room of Hell. All the way down.
Python and YAML, I'm looking at you.
I hope my post reflects that I am not finding use, so I can't rightly fathom your question or respond to it honestly. And if I have not made it clear, I have no "most useful".
Unless you are a bot, which would rather prove my point that modern AI appears to struggle with reading and comprehension.
It currently is not the right tool, or I've not found a job it's good at. Personally this isn't a good start.
At a professional level ... once it acts as a force multiplier, helping teams communicate and iterate, then it'll get adopted. Until then it's just an expensive way to lose expertise.
I suspect the only way to fight ensloppification is to boycott companies that are using AI to make a product worse. "Vote with your wallets" isn't much, but it's the tool I have.
I remain committed to working out how to use this tool to make things (specifically games) better. It's harder than you'd think.
My AI coding "assistant" just gave me an inappropriate singleton that I'm going to have to remove. And it uses strings where it should very obviously use integers.
If it makes bad decisions like this, whose code was it trained on?
I can identify and reject these, but would an inexperienced developer?
This is pretty much my biggest problem with the application of technology right now.
Answering a question I didn't ask - no matter how quickly - provides no value. Doesn't matter if it's customer support or agentic programming.
If it's not what I asked for then it's worth nothing.
I'm a software engineer and the field moves quickly. I learned OpenGL from the Red Book and have been constantly learning since. AI trained on old code scraped from the internet has limited value.
AI that learns manners online is not going to end well. If this is the future we are in deep Barney.
Just for context here, it'd given me consistently wrong instructions for longer than it took to second-screen 2024's fourth-part-of-a-trilogy Kingdom of the Planet of the Apes.
Nearly three hours of insisting a deprecated API works, and me demonstrating that it doesn't so I can learn how to use AI.
This AI agent has suggested that perhaps we should try again later when I have more patience or try something else.
An alternative would be that it's less wrong all the time.
There are only so many times I can call out a liar who responds with "You're absolutely right!" before I give up.
Bluesky doesn't really allow for essay answers, so I put this week's AI thoughts onto Substack, where I try, once again, to find value in this new technology.
It was an interesting journey, if anybody can tell me what the destination is I'd appreciate it.
rogerbennett1.substack.com/p/2020s-ai-a...
In my AI experiments, it's great at making things that are trivial copy-paste but really fails with anything new.
Typical game development happens in a private repo that it hasn't trained on.
I don't use AI for work work, but my attempt was comparable.
It got to the wrong answer really fast, though.
The evidence speaks for itself!
I'm on a train so no AI programming adventures this week, but I am able to sit and develop some human-made code.
I did encounter the phrase "Slop jockey" as a pejorative for AI programmers.
This week I'm trying to pull every trick to not allocate memory, and reduce, reuse, recycle my footprint.
Nothing else matters?
I can't always tell what Gizmo has been up to. But IDK.
This lil ginger boy has it all going on.
The economic angle is weird.
Looking at a $20/month subscription vs a $2000 GPU.
A hundred months of rental shouldn't be cheaper than purchase, so the economics of it have to be broken.
And the local GPU is worse than the subscription.