I do find some use for LLMs when I need to make a genuine natural language query:

Is there a name/theory describing a shift in the political spectrum (left or right) that results in what were once moderate left/right views becoming considered “extreme”?

Turns out there is a thing called the Overton Window 🤷‍♀️

Though, not sure I agree with Lehman right now:

The most common misconception is that lawmakers themselves are in the business of shifting the Overton window. That is absolutely false. Lawmakers are actually in the business of detecting where the window is, and then moving to be in accordance with it.

I was in Barcelona recently and saw an apparently unofficial femicide memorial, just there, by the side of the street. We need more of this. Especially in the UK. Our politicians have got everyone frothed up about migrants they’ll likely never see or meet, while the biggest crime is literally right next door.

Just had a classic LLM loop: “How do I do this?” / “You could use this setting.” / “I can’t find that setting.” / “That’s because it doesn’t exist.”

Replied to https://rubyquartzglasses.me.uk/2026/02/4321/ by Phil
Humans are terrible at sustained vigilance for rare events in high-volume streams. – https://mastodon.online/@pseudonym/116135917950981989

See also https://electrek.co/2026/03/17/former-uber-self-driving-chief-tesla-fsd-crash-supervision-problem/:

Tesla is asking humans to supervise a system that is specifically designed to make supervision feel pointless. As he puts it, an unreliable machine keeps you alert, and a perfect machine needs no oversight, but one that works almost perfectly creates a trap where drivers trust it just enough to stop paying attention.

Read Short tempers and legal threats: UK teachers report rise in problem parents by Richard Adams

For me, this is the most damaging result of our short-sighted rush towards “productivity”:

More recently, heads said parents had been using AI to generate lengthy, legalistic complaints that required increasing amounts of time to administer.

This is a completely justified use of an LLM, and it’s going to cost more time than an LLM will ever save. Unless you use an LLM to respond, in which case we’ll just have LLMs burning resources to achieve nothing.

If I hadn’t let the RAF recruiter convince me to apply on an officer track JUST because I was a graduate, I reckon I’d have seen at least four “hot” combat deployments and three or four “very warm” airlifts by now. Well, assuming I survived being deployed to Afghanistan on probably my first ever tour. Weird thinking about it.