If you think Microsoft breaking Windows is a new thing, think again 🤭
They've killed their own widget platform 6 times in 30 years. Each one died from a different spectacular failure.
But the last iteration might actually be done right
Read the full history:
xakpc.dev/windows-widg...
Posts by Pavel Osadchuk
apparently most current LLMs suck in C#
it figures
I'm almost at the point where I could use my own LLM chat app to develop my own LLM chat app
art by @ironlily.bsky.social
I'm building my own LLM client as a native Windows app because every app from major vendors is painfully slow
as for today's background, I chose the coolest art from @ironlily.bsky.social
The main feature of my 🪟 app is done: a truly instant quick chat.
Here's a comparison of desktop apps from different providers versus what you can get when cutting out all the noise (next post ⬇️)
I think it's good, what do you think?
Today I decommissioned the last piece of my late startup
This .NET Framework 4 web app, hosted on Azure, ran for 8 years total, the last 5 almost entirely on its own
Now that the last user no longer needs it, it can finally rest. Thanks for all your work, small app 🫡
Why does copilot sometimes suck? Here's example N+1:
I simply asked it to add docs.
`inheritdoc` is OK, but what's with this `min` method that doesn't even exist in C#?
As soon as I expanded my ChatGPT subscription, I set myself a recurring task
So far it's not bad, so I'll create more 'summarize news' tasks
*weekly limit
o3-mini-high was quite good, but I spent my 50-message daily limit 😁
o3-mini - not sure yet if it's even on the same level as Claude
The same code was produced by ChatGPT o3-mini (not high) and Claude 3.5 Sonnet
Both produced the result after 2-3 steps of dialog with clarifications
Let me try some LLM clients from the Windows Store
Oh, okay...
Instead of going for a walk, I'll add shallow support for ollama.
I'll regret it later, but for now it's quite fun
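"Shallow" Ollama support mostly means talking to its local REST API. Here's a minimal sketch (in Python for brevity, the app itself is C#), assuming Ollama's default endpoint at localhost:11434 and its /api/generate route; the "llama3" model name in the usage example is just illustrative:

```python
import json
from urllib import request

# Ollama serves a REST API on this port by default
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    # stream=False asks Ollama for a single JSON object
    # instead of a line-delimited stream of tokens
    return {"model": model, "prompt": prompt, "stream": False}

def generate(model: str, prompt: str) -> str:
    """Send one non-streaming completion request to a local Ollama server."""
    data = json.dumps(build_payload(model, prompt)).encode()
    req = request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# usage (requires a running Ollama instance with the model pulled):
# print(generate("llama3", "Say hi in five words"))
```

Real streaming support would read the response line by line instead, which is where the "I'll regret it later" part comes in.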
And before I wrap for a day, some funny shit
Because why not, it's my app in the end
You probably have around 10GB of RAM and most likely a GPU of some kind on your Windows machine.
Your chats with LLMs should NOT take several seconds to load a couple of messages. This is the speed we need 🎞👀
(forgot to shut off some BGM I was listening to while coding, so enjoy)
You know your project is good👌
when you add the Microsoft.Extensions.Hosting NuGet package to it
Test your app with proper prompts
Starting to look like an actual app
Enterprise grind is done for this week; back to building desktop apps
The R1 model is funny to follow through all of that
But wait!
But wait, there is more
I kinda gave up on 🦋 lately, but I haven't given up on building stuff.
Here is an early version of a Windows app (the ugliest one): an LLM app that outperforms both the Copilot and ChatGPT native apps in speed on the same 4o-mini model.
I might even ship it soon with BYOK or with some kind of sub
A couple of TRAXX P160 AC2
Hiking is my greatest discovery this year
Day spent... interesting
Now that's a remarkable achievement
/s
Danube is marvelous
I think Carson is not here, so there is no one to push htmx content
The last 9-5 day this year just ended
it's time to lie down and do nothing for a day or so