
Posts by Sasha Kotlyar

Zimbabwe Birds: The iconic stone sculptures are all finally home after 137 years. For centuries, the prized sculptures, central to national identity, have been kept outside Zimbabwe's borders.

"Zimbabwe's iconic stone birds were taken by colonialists. Finally, they're all back home"

2 days ago 119 29 0 0

I don’t think the problem is that everyone is treating it like an enterprise customer solution. I think the problem is that everyone is trying to play the “capture user credentials to lock them into our ecosystem” game. That’s why 1Password, Chrome, and Apple all want my passkey every single time.

2 days ago 10 1 2 0

"Pancreatic cancer mRNA vaccine shows lasting results in an early trial: Scientists caution that more research is needed, but nearly all of the patients who responded to the personalized vaccine are still alive six years later."

2 days ago 9697 2888 156 600

I’m sorry he got so blackout drunk they had to ask for SWAT door busting equipment?!

3 days ago 3680 938 65 110
Massive Attack, Tom Waits - Boots on the Ground

The new Massive Attack x Tom Waits track and video are worth a listen and watch.

4 days ago 386 148 11 26
A post on the life pro tips subreddit, saying:

LPT: If your Windows PC has been acting up or crashing and you don't know why, search for "View reliability history". It gives you a visual timeline of every app crash and hardware failure without needing to be a tech expert.
[Computers] Most people try to dig through the "Event Viewer", which is a nightmare to read. Instead, just hit the Windows key and type "Reliability". Open "View reliability history". You'll see a graph with red "X" symbols for every day your computer had an issue. Click on a specific day, and it will tell you exactly which program crashed or which Windows update caused the conflict.
It's the fastest way to find out if your "blue screen" was caused by a specific driver or just a buggy app you installed yesterday. It turns "My computer is broken" into "Oh, this specific update failed."


Did you know about reliability monitor on Windows? 👀
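For anyone who wants to skip the Start-menu search described above, Reliability Monitor can also be opened directly with the standard perfmon switch (Windows only; this command is not from the post itself):

```shell
# Launch Windows Reliability Monitor directly from Run (Win+R) or a terminal
perfmon /rel
```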

4 days ago 54 11 7 2

How would I know, why should I care

1 week ago 0 0 0 0
A photograph of a processed meat package (think sausage meat)
It is labeled Vegetarian Ham - Chicken Flavour


The more I read, the worse it gets...

1 week ago 13 3 0 0

Watched Rango for the first time tonight. I didn't have high expectations, but it was way more entertaining than I imagined! Highly recommend.

2 weeks ago 0 0 0 0

I feel like “the world passed peak gas-powered car sales in 2017” is not a widely known fact

2 weeks ago 3071 1042 30 15
Sickle cell disease has just been cured for the first time in New York. For 21 years, Sebastien Beauzile lived with a disease that shaped every part of his existence.

Awesome! techfixated.com/sickle-cell-...

2 weeks ago 0 0 0 0

Afroman rules. (And he just won this case)

1 month ago 819 152 10 6
Hard 'N Phirm "Pi" Dir: Keith Schofield

Happy Pi Day! www.youtube.com/watch?v=Xanj...

1 month ago 0 0 0 0
Code snippet demonstrating a simple popover with a toggle button, showcasing HTML attributes for functionality.


🆕 The Popover API is Baseline, no JavaScript needed

Tooltips, dropdowns, and menus with just HTML attributes.

⋅ popover attribute on any element
⋅ popovertarget to wire the trigger
⋅ Accessible by default, no ARIA hacks

Learn more 👇
developer.mozilla.org/en-US/docs/...
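As a minimal sketch of the markup the post describes (the element id and text are made up for illustration):

```html
<!-- popovertarget wires the button to the popover by its id -->
<button popovertarget="tips">Toggle tips</button>

<!-- The popover attribute makes this element a popover:
     hidden by default, toggled with no JavaScript -->
<div id="tips" popover>Hello from a popover!</div>
```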

1 month ago 163 25 0 1
Preview
The Most Important Software Development "Tool" in the Age of AI. The key to producing reliable software today is simple, but seldom easy.

Been a while since I blogged. sep.com/blog/verific...

1 month ago 0 1 0 0

Since all worlds have them, who do you think invented them...?

1 month ago 134 10 14 0

It's a Mac 🙃

1 month ago 1 0 1 0

The first time I saw this pop up in my feed, a quote followed by my birth year, I checked my pulse three times to make sure I hadn't missed a rather essential announcement.

1 month ago 297 52 13 0
Google quantum-proofs HTTPS by squeezing 15kB of data into 700-byte space. Merkle Tree Certificate support is already in Chrome. Soon, it will be everywhere.

arstechnica.com/security/202...

TLS is going to be interesting over the next few years.

1 month ago 3 1 0 0

I'm attempting to curate my digital music library, with proper tagging and everything. It's not going well. Best tool I could find, beets, can't handle my SACD rips at all, and has hardcoded paths in the DB, making it not portable. This is frustrating.

1 month ago 0 0 0 0
Post image
1 month ago 5863 1885 174 110
Acme Weather

The Dark Sky people sold out once already. I don't trust them to not do it again.

acmeweather.com/blog/introdu...

1 month ago 0 0 0 0
Post image

I just did the dumbest thing of my entire career to prove a much more serious point.

I tricked ChatGPT and Google, and made them tell other users I’m a competitive hot-dog-eating world champion

People are using this trick on a massive scale to make AI tell you lies. I’ll explain how I did it

2 months ago 4848 2139 87 303

This is why we don’t give up.

2 months ago 1958 269 12 11

So far, I've been enjoying using Lyrion Music Server with both physical Squeezebox hardware and software-based players. It's not sexy, but it just works. Cross-player sync, Spotify integration, Internet radio, etc., just... work.

Legacy doesn't have to be a bad word. It can mean stability.

2 months ago 0 0 0 0

love a bit of continvouclous morging

2 months ago 830 231 38 44
Post image

BREAKING: @barackobama.bsky.social clarifies his position on aliens after his answer during the speed round of our interview went viral.

2 months ago 8377 1216 367 214
I have been sick with COVID all week and missed Mon and Tues due to this. On Friday, while working from bed with a fever and very little sleep, I unintentionally made a serious journalistic error in an article about Scott Shambaugh.

Here’s what happened: I was incorporating information from Shambaugh’s new blog post into an existing draft from Thursday.

During the process, I decided to try an experimental Claude Code-based AI tool to help me extract relevant verbatim source material. Not to generate the article but to help list structured references I could put in my outline.

When the tool refused to process the post due to content policy restrictions (Shambaugh’s post described harassment), I pasted the text into ChatGPT to understand why.

I should have taken a sick day because in the course of that interaction, I inadvertently ended up with a paraphrased version of Shambaugh’s words rather than his actual words.

Being sick and rushing to finish, I failed to verify the quotes in my outline notes against the original blog source before including them in my draft. 

Kyle Orland had no role in this error. He trusted me to provide accurate quotes, and I failed him.

The text of the article was human-written by us, and this incident was isolated and is not representative of Ars Technica’s editorial standards. None of our articles are AI-generated; it is against company policy, and we have always respected that.

I sincerely apologize to Scott Shambaugh for misrepresenting his words. I take full responsibility. The irony of an AI reporter being tripped up by AI hallucination is not lost on me. I take accuracy in my work very seriously and this is a painful failure on my part.

When I realized what had happened, I asked my boss to pull the piece because I was too sick to fix it on Friday. There was nothing nefarious at work, just a terrible judgement call which was no one’s fault but my own.

—Benj Edwards, February 15, 2026


Sorry, all, this is my fault; and speculation has grown worse because I have been sick in bed with a high fever and unable to reliably address it (still am sick).

I was told by management not to comment until they did. Here is my statement in images below

arstechnica.com/staff/2026/0...

2 months ago 438 59 76 99

Finished the Lower Decks Warp Your Own Way interactive graphic novel. It is SO GOOD! I wasn't sure what to expect, but I can confidently say that everyone who enjoyed Lower Decks should get it. The Hugo award was well deserved.

2 months ago 0 0 0 0