I do find some use for LLMs when I need to make a genuine natural language query:
Is there a name/theory describing a shift in the political spectrum (left or right) that results in what were once moderate left/right views becoming considered “extreme”?
Turns out there is a thing called the Overton Window 🤷‍♀️
Though, not sure I agree with Lehman right now:
The most common misconception is that lawmakers themselves are in the business of shifting the Overton window. That is absolutely false. Lawmakers are actually in the business of detecting where the window is, and then moving to be in accordance with it.
I see reposts are a thing on Instagram now. Yet more content I can’t choose not to see.
I was in Barcelona recently and saw an apparently unofficial femicide memorial, just there, by the side of the street. We need more of this. Especially in the UK. Our politicians have got everyone frothed up about migrants they’ll likely never see or meet, while the biggest crime is literally right next door.
Just had a classic LLM loop: “How do I do this? / You could use this setting / I can’t find that setting / That’s because it doesn’t exist.”
See also https://electrek.co/2026/03/17/former-uber-self-driving-chief-tesla-fsd-crash-supervision-problem/:
Tesla is asking humans to supervise a system that is specifically designed to make supervision feel pointless. As he puts it, an unreliable machine keeps you alert, and a perfect machine needs no oversight, but one that works almost perfectly creates a trap where drivers trust it just enough to stop paying attention.
The 33-year-old Columbia University protester had been held in an immigration detention centre for a year.
It’s ludicrous, but I genuinely hope the law eventually catches up. Assuming it is allowed to…
Why are the problems here not manifestly obvious to all involved?
For me, this is the most damaging result of our short-sighted rush towards #AI “productivity”:
More recently, heads said parents had been using AI to generate lengthy, legalistic complaints that required increasing amounts of time to administer.
This is a completely understandable use of an LLM, and it’s going to cost more time than an LLM will ever save. Unless you use an LLM to respond, in which case we’ll just have LLMs burning resources achieving nothing.
Shameful