LLM Exhaustion
AI Use at Work Is Causing “Brain Fry,” Researchers Find, Especially Among High Performers
Developer’s Honest Assessment of AI at Work Rattles the Official Narrative
At work there's more and more of a push to use generative AI for basically anything. Reading emails, writing emails, writing code, you name it. Part of this is coming from the very top, where bespectacled schmucks who've never written any meaningful code in their lives have been taken in by the promises of Microsoft, Anthropic, and OpenAI. But part of it as well is being driven by some of my fellow colleagues, who see AI tools as a way of doing more, faster.
I didn't get a smart phone until 2017, and since then I've been kind of appalled at how it's rewired my brain: I'm no longer content with boredom. I can feel it burning in my pocket, and at any moment, I feel like I want to take it out, check my email, check Bluesky, check Reddit. Everything people say about smart phones is true: not just how convenient they are and how many capabilities they offer you, but how they mess with your concentration and focus. Almost ten years later, I think back. Would I be more productive without a smart phone? In some ways, no. In many ways, yeah.
And that's how I feel about the current LLM-for-everything craze that seems to be gripping the world. I've seen the demos. In some cases, it's been impressive (if nothing else, it's excellent at generating web apps using well-known patterns), but in many that I've tried, it feels like absolute trash. Claude Code is the current hotness, but at work, we're stuck using Microsoft Copilot. Everything you read about it being less capable seems to be generally true. But that hasn't stopped some of my colleagues from making heavy use of it. We've already seen a bunch of LLM-generated code cause issues in prod.
Tech CEOs say the era of 'code by AI' is here. Some software engineers are skeptical
Within a month or so of use we had our first production incidents that were traced back to Copilot code and lack of testing. And I think that goes hand-in-hand: there's an assumption of correctness, or maybe it just makes lazy developers lazier. I remember at my first company, the amount of rigour that went into development. You had to have a test plan. It had to be reviewed by a senior developer. The test plan had to pass. Then you passed it off to QA, who had their own test plan.
The company set extremely aggressive metrics: no more than 2% of your stories could be returned by QA per year, Or Else. That feels laughable now, doesn't it? That company is long gone, but many of its key figures formed another one locally. They talk about all the AI they're using on LinkedIn, while also talking about how AI use in their domain doesn't meet stringent regulatory standards (oh, the doublespeak!). We've gone from Joel on Software blog posts about the importance of code quality, to just letting AI write everything for you, and seeing what happens.
What a world.
I've been writing code for a long time, since 1995, when a boy in my ninth grade typing class showed me how to write some commands in BASIC. PRINT. GOTO. COLOR. FOR. That quick lesson set something off in my brain, immediately. I knew I wanted to write code. I wanted to make things. I wanted to enjoy the process of doing this.
Which is why the current moment is so exhausting. Dullards tell me I should want to use AI for everything, that it's always a productivity multiplier, and that it's definitely not going to cause me headaches as I deal with my colleagues' fucked-up AI-generated code. I'm a senior developer now - have been for around ten years - and I'm closer to the end of my career than the start. That feels like a blessing. If this is the direction we're supposed to be going, I want out. I want to retire and work on my projects. I want to work on them myself. Because what people who haven't written a serious amount of code don't realize is, the joy isn't in what you've made, but in the process of making it.
We'll see where this all goes. I expect the bubble will pop, that companies will collapse or be bought out (the way Claude is supposedly better than anything, I'll bet against Microsoft and OpenAI), and that the survivors will raise prices more and more to recoup their costs, until LLM use is seen as prohibitively expensive and developers are instructed to produce "person-created code", or some other term yet to be coined. This is exactly what we've seen with cloud computing: companies and governments were sold the promise of cheaper maintenance, freed from the tyranny of managing an on-prem data center. Only, it turns out that cloud is expensive, that your budget lives and dies by Amazon and Microsoft corporate decision-making, and that costs can spiral out of control very quickly. Many companies are moving to a hybrid approach, or even back to on-prem.
I think that's where we'll be going with LLM use. The costs will get prohibitive. Use will be restricted. Maybe at that point, companies will start hiring more junior devs again, because let's be honest, senior developers have to come from somewhere. The older I get, the more I see the corporate world as a series of hype cycles. I just wish that those in charge, the ones who make the industry-shaping decisions, would take a step back and start to see the same. But in the meantime, I have (I hope) another ten or fifteen years to go before I'm done. Hopefully the former. I want to wake up, kiss my wife, walk the dogs. Work on my forever-project. And take enjoyment in what my years of practice have let me make myself.