It’s Code Red
Posts by grumpycat (leslie-alexandre d.)
Trump TS post featuring an illustration of himself as a Christian faith healer
I’m not sure it has broken through to the general public that the president is a megalomaniac crazy person. Hopefully posts like this help.
Update your gemma-4 chat template
- dense 4B: huggingface.co/google/gemma...
- MoE 26B: huggingface.co/google/gemma...
Thanks for the find
Quick test of unsloth/gemma-4-26B-A4B-it-GGUF (llama.cpp building function)
1. only 8k context size in the end
2. very precise, follows instructions well
3. only 4.6GB of VRAM (experts on CPU)
The task was pretty simple, but I like how correctly it follows style, instructions, and patterns.
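A sketch of how such a run could be launched. The model filename is hypothetical, and the `--cpu-moe` flag assumes a recent llama.cpp build that supports keeping MoE expert tensors in system RAM:

```shell
# Hypothetical invocation; filename and flag availability depend on your build.
# --cpu-moe keeps the MoE expert weights in system RAM, leaving only the
# shared/dense layers on the GPU (hence the low VRAM figure above).
llama-server \
  -m gemma-4-26B-A4B-it-Q4_K_M.gguf \
  -c 8192 \
  -ngl 99 \
  --cpu-moe
```

This is a config fragment, not a verified command line; check `llama-server --help` on your build.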
finally.
copilot is drunk. But soon enough it will be okay.
woah
damrnelson.github.io/github-histo...
thanks, you don't want to know how much I've "invested" in keebs.
Here's an HHKB Rama Thermal with tactile switches (~62g) and a GMK Monokai keycap set.
git ESC is jealous.
What a joke. I almost believed it 😳
There is also @loves.brussels list atproto.brussels/atproto-apps
I have some other ones, I'll post them tomorrow from the computer.
🖖 Access to what? I didn't follow :x
Verify:
- node_modules/plain-crypto-js
- /tmp/ld.py
- axios manifest and version
Block:
- packages.npm.org
- sfrclak.com:8000
www.stepsecurity.io/blog/axios-c...
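A hedged sketch of the verify/block steps above. The package, file, and host names come from the post; the exact commands are assumptions, and blocking via `/etc/hosts` is just one blunt option:

```shell
# 1. Look for the dropped package and loader script named in the advisory.
ls node_modules/plain-crypto-js 2>/dev/null && echo "COMPROMISED: plain-crypto-js present"
ls /tmp/ld.py 2>/dev/null && echo "COMPROMISED: /tmp/ld.py present"

# 2. Check which axios version the lockfile actually resolved.
npm ls axios

# 3. Null-route the suspicious hosts (hedged: a firewall rule is cleaner).
printf '0.0.0.0 packages.npm.org\n0.0.0.0 sfrclak.com\n' | sudo tee -a /etc/hosts
```

This is an incident-response checklist, not a remediation; follow the linked StepSecurity write-up for authoritative indicators.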
It's clearly corruption, everybody knows it at this point.
Oh, GMK?
Microsoft is busy laying off and shilling its stock with AI replacing humans, I guess /s
I'm preparing something at my level with @frenchsky.app I'll begin with the PDS and everything related to it before moving on. Which topics come to mind when you talk about infrastructure?
tuuurboQuant tq3_0, perfect for the gpu poor
experimental implementation(s) in llama.cpp
> github.com/ggml-org/lla...
> github.com/ggml-org/lla...
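Assuming the experimental type from the PRs above lands, quantizing would follow the usual llama.cpp flow. The filenames and the `tq3_0` type name are taken from the post, not from a released build:

```shell
# Hypothetical: the tq3_0 type is experimental and may not exist in your build.
# The usual flow is llama-quantize <f16 input> <output> <type>.
./llama-quantize model-f16.gguf model-tq3_0.gguf tq3_0
```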
Local first Fill-in-the-Middle (FIM) with llama.cpp
> leaf.eagleusb.com/3mhv6sz2pf22b
#llm #localllama
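A minimal FIM request against a local llama-server, assuming a server is already running on port 8080 with an infill-capable model (e.g. a Qwen2.5-Coder GGUF); the field names follow the llama.cpp server's `/infill` endpoint:

```shell
# Sketch: ask the model to fill in the middle between a prefix and a suffix.
# Requires a running llama-server with an infill-capable model loaded.
curl -s http://127.0.0.1:8080/infill \
  -d '{
    "input_prefix": "def add(a, b):\n    return ",
    "input_suffix": "\n\nprint(add(1, 2))\n",
    "n_predict": 16
  }'
```

Editors like Zed then wire this kind of request into their edit-prediction loop.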
Just the beginning.
Today was a Fill In the Middle day. llama.cpp is custom built, ready to get some Zed's edit predictions locally.
Sweep Next-Edit is quite fast indeed (based on Qwen2.5-Coder).
huggingface.co/sweepai/swee...
> TIL `❯ git restore [--theirs | --ours] path/*`
The tricky part is that the meanings of `ours` and `theirs` are swapped during a rebase, as noted in the manpage.
It's a real pattern.
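A self-contained demo of that swap. The repo, branch, and file names are made up; it assumes git ≥ 2.28 for `git init -b`. During a rebase, HEAD sits on the upstream side, so `--theirs` picks the commit being replayed, i.e. your own branch:

```shell
# Demo of the ours/theirs swap during rebase (all names hypothetical).
set -e
dir=$(mktemp -d)
cd "$dir"
git init -q -b main demo && cd demo
git config user.email demo@example.com
git config user.name demo

echo base > f.txt && git add f.txt && git commit -qm base
git switch -qc feature
echo feature > f.txt && git commit -qam feature
git switch -q main
echo main > f.txt && git commit -qam main

git switch -q feature
git rebase main || true        # conflict on f.txt, rebase stops
git restore --theirs f.txt     # "theirs" = the commit being replayed,
                               # i.e. OUR feature branch's version
cat f.txt                      # → feature
```

So the counterintuitive part: mid-rebase, `--ours` would have given you `main`'s content.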