I thought it already did, and that’s why it was dangerous.
Wow that adder was honestly impressive. Great article overall.


On paper, I like this solution better than every app/site developer having to hack together (or outsource) their own age verification system. But I’m sure it opens up a ton of potential problems. And if it were open source, someone could just fork it and make a version that always says “yes,” so unfortunately it’ll never be FOSS.


Are they coming to harvest the RAM from my computer?


Humans are historically pretty good at offloading mental capacity to some sort of tool in order to tackle larger and more complex problems. Consider solving a math problem mentally. Compare that to the kind of problem you would be able to solve with a pen and paper. Then consider what you could do with a pen and paper and a calculator. An LLM purports to be all of that, and more, for any subject. It doesn’t matter that the results are often horrifically wrong once people have offloaded the entirety of their mental capacity to the magic box and refocused their attention somewhere else.
This article was so well-written that I was briefly surprised to encounter the term “nerfed” in the middle. I guess it’s common parlance in tech circles at this point.