GitHub Copilot now shows ads in pull requests. Microsoft charges $19/month for it and still injects sponsored content into your code reviews. This is what happens when AI companies can't figure out monetization without falling back on surveillance-era ad patterns.
Posts by Niko from Axrisi Turazashvili
It's dropped a bit since then, I guess :)
Google’s new Colab MCP server is more powerful than it first sounds.
I used it to let an AI agent build, run, debug, and submit a real quantum chemistry notebook from scratch inside Colab.
Article on @dev.to :
dev.to/axrisi/i-str...
#AI #MCP #GoogleColab #Colab #QuantumComputing #IBM #Python
Wrote a full walkthrough on deploying a highly available cluster on bare metal with Canonical MicroCloud. 4 nodes, Ceph distributed storage, software-defined networking, 30-minute setup. No Terraform needed.
check it out on @dev.to
dev.to/axrisi/stop-...
Meta kills Instagram DM encryption May 8, citing low adoption. Meanwhile they train AI on platform data and their ad revenue depends on data signals. Instagram DMs are where purchase intent lives. They're not removing encryption because nobody used it. They're removing it because it's in the way.
Studied how AI apps actually make money. 7 models keep showing up: subscription tiers, usage-based, freemium, intent-based ads, API marketplace, outcome-based, and hybrid stacking. The winners combine 2-3 of these. Broke it all down in a LinkedIn carousel.
A user asked Gemini about Japanese grammar. The output had a JSON block with a real name, email, and phone number that wasn't theirs. They shared the page publicly to prove it. Gemini pulled PII into an unrelated conversation. At Vexrail we stay strict no-PII, intent-based only.
MCP server count: ~30 in December, ~80 in January, 200+ now. Amazon Ads, Stripe, and Cloudflare all have production MCP servers. The protocol adoption curve looks like early REST APIs, but compressed into months instead of years. Agent-to-service communication is having its HTTP moment.
Meta's AI glasses are sending intimate video clips to annotators in Nairobi. Bathroom visits, people undressing, sex. Workers say "we see everything." Seven million units sold on the promise of a personal AI assistant. The surveillance ad model has a camera strapped to your face now.
We're moving from "learn to code" to "learn to build."
The abstraction layer rises with each generation. The skill shifts from how to implement to what to build and why.
Tools get better at writing code. The human skill is knowing what code to write.
Fork, experiment, break things. That's how open source was meant to be used.
Your fork doesn't need to be perfect or become a PR. It just needs to teach you something.
The source is open. Use it.
Write error messages for the person debugging at 2 AM.
Include: what happened, why it might have happened, what to try next.
"Something went wrong" helps nobody. Every error message is documentation.
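A minimal sketch of that what/why/next structure in Python (the config-loading scenario and names are illustrative, not from any particular codebase):

```python
class ConfigLoadError(Exception):
    """Raised with a message written for the person debugging at 2 AM."""


def load_config(path):
    """Load a config file, failing with a what/why/what-next error message."""
    try:
        with open(path) as f:
            return f.read()
    except FileNotFoundError:
        raise ConfigLoadError(
            f"Could not load config from '{path}'. "              # what happened
            "The file may not exist, or the service may be "      # why it might
            "running from the wrong working directory. "          # have happened
            "Try: verify the path, or set it explicitly in "      # what to try next
            "your startup flags."
        ) from None
```

Compare the traceback this produces with a bare `FileNotFoundError`: the reader gets a hypothesis and a next step, not just a stack trace.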
Next wave of dev tools will be context-aware.
Your IDE already knows your project structure, dependencies, conventions, git history.
We're moving from tools that write code to tools that understand your codebase. That's a massive difference.
Open source sustainability isn't just about funding.
It's about reducing maintainer burden: better docs, automated triage, clear contribution guides, shared decision-making.
Don't just donate. Reduce the burden.
Database queries slow? Add observability before indexes.
Know which queries are actually slow, how often they run, and whether it's the query or infrastructure.
You can't optimize what you can't measure.
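One lightweight way to get that measurement before touching indexes: wrap query execution so you record how often each query runs and log the ones that cross a latency threshold. A sketch in Python (the 100 ms threshold and the `observed` helper are assumptions to tune, not a standard API):

```python
import logging
import time
from collections import Counter
from contextlib import contextmanager

log = logging.getLogger("db")
query_counts = Counter()  # how often each query shape runs

SLOW_MS = 100  # assumed threshold; tune per service


@contextmanager
def observed(query):
    """Time a query, count its executions, and log it when it's slow."""
    query_counts[query] += 1
    start = time.perf_counter()
    try:
        yield
    finally:
        elapsed_ms = (time.perf_counter() - start) * 1000
        if elapsed_ms >= SLOW_MS:
            log.warning("slow query (%.1f ms, run %d times so far): %s",
                        elapsed_ms, query_counts[query], query)
```

Usage: `with observed("SELECT * FROM orders WHERE user_id = %s"): cursor.execute(...)`. After a day of traffic you know which queries are slow *and* hot, which is the list worth indexing.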
Developer tooling is the real competitive advantage for engineering orgs.
Better tooling = faster shipping, fewer bugs, faster onboarding, less toil.
The team that ships fastest with fewest bugs wins. That's a tooling problem, not a hiring problem.
Best way to learn a new technology? Contribute to a project that uses it.
Tutorials teach syntax. Real codebases teach patterns, tradeoffs, and how things break.
One real contribution beats ten tutorial projects.
Three things I look for in every code review:
1. Is the intent clear? 2. Are edge cases handled? 3. Is there test coverage?
Everything else is style preference. Automate that with linters. Focus review energy on what prevents incidents.
Vibe coding is real. AI generates working code fast.
But production still needs: edge case handling, security, performance at scale, maintainability.
Speed without rigor is technical debt on autopilot. Use the tools, don't skip the engineering.
If you maintain an OSS project, your README is your landing page.
Clear problem statement. Quick start under 2 min. One practical example. Links to docs and community.
Your code might be brilliant, but if the README doesn't sell it, nobody will find out.
Most microservices architectures would be better as a well-structured monolith.
Network calls replacing function calls, distributed tracing archaeology, orchestrating 15 deploys.
Don't pay the distributed systems tax until you need to.
AI agents are moving from demos to production.
The bottleneck isn't the model — it's reliability. Retries, fallbacks, observability, guardrails.
Teams shipping real AI agents have the best engineering around the model, not just the best model.
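The retries-plus-fallback part of that engineering can be sketched in a few lines. This is a minimal illustration, assuming `primary` and `fallback` are hypothetical callables wrapping two model endpoints:

```python
import time


def call_with_retries(primary, fallback, attempts=3, base_delay=0.5):
    """Call the primary model with exponential backoff; fall back if it keeps failing."""
    for attempt in range(attempts):
        try:
            return primary()
        except Exception:
            if attempt == attempts - 1:
                break  # primary exhausted; drop to fallback
            time.sleep(base_delay * (2 ** attempt))  # 0.5s, 1s, 2s, ...
    return fallback()
```

Real guardrails add jitter, retry only on retryable errors, and emit metrics on every attempt, but the shape is the same: the agent's reliability lives in this wrapper, not in the model.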
The most underrated open source contributions aren't code.
Fixing docs, triaging issues, reviewing PRs — this is what keeps projects alive.
You don't need to write code to make an impact. Start small, stay consistent.
Your CI pipeline should be under 5 minutes.
Fast feedback loops change everything — shorter cycles, faster shipping, fewer broken builds.
If it's slow, fix it before adding new features. Your team will thank you.
#DeveloperExperience #AICoding #DevTools #CI
"Fix Your Tools" on HN today. The argument: most developer productivity problems aren't about finding better tools, they're about properly configuring the ones you already have. Same applies to AI coding assistants. Default settings are rarely optimal for your workflow.
What is the hardest non-technical problem you have faced as a founder or indie builder? For me it was banking. Not code, not product, not users. Getting a bank account.
The part of WebMCP nobody talks about: the declarative API. You annotate a standard HTML form with toolname and tooldescription attributes. The browser generates a JSON schema automatically. No JavaScript required. Agents get structured tools from plain HTML. That's elegant.
Anthropic's new Sonnet 4.6 "approaches Opus-level intelligence" with big improvements in computer use - navigating spreadsheets, filling forms. Agents that can reliably interact with structured interfaces are getting closer. MCP + better computer use is a strong combo.
Someone built zclaw - a personal AI assistant under 888KB running on an ESP32 (232 points on HN). A full AI agent on a $4 microcontroller. The gap between "AI needs a data center" and "AI runs on anything" is closing faster than most people realize.