Always leave the code a little better than you found it.
First make the change easy, then make the easy change.
These were the quotes to live by for a generation of programmers. You wouldn’t necessarily launch a grand refactoring initiative, but if you touched a file to fix a bug, you’d also clean up the nearby mess, tidy a confusing variable name, break up a giant function, delete a comment that had long since become a lie. It was a form of hygiene. It kept the codebase from decaying.
It wasn’t always easy. If you encountered a tangled, incomprehensible function, updating it was a pain. You had to spend an hour, head in hands, tracing through the logic, building a mental model just to add one small feature. But that pain was a powerful incentive. It screamed “REFACTOR ME!” The right thing and the easy thing were aligned. And pain enforced taste.
AI lowers that pain. The default is to prompt your favourite AI assistant: “add this shiny new feature”. Each “just get it done” change is locally rational. You no longer need to spend a day understanding all the invariants. AI makes bad code tolerable. “Understanding” is no longer a prerequisite for “shipping”.
But tolerable code compounds. The nested conditionals grow another level deep. The variable that used to mean one thing now means three. Modules couple in subtle ways. Every future change gets harder. You keep moving, until one day you can’t.
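A minimal sketch of what that compounding looks like, with an entirely hypothetical order-handling function and made-up names rather than code from any real project:

```python
def process_order(order, status):
    """Hypothetical handler after several "just get it done" patches."""
    # "status" started life as a shipping flag; it now also carries the
    # payment state and, in one branch, a human-readable error message.
    if order is not None:
        if status == "paid":
            if order.get("items"):
                if order.get("customer", {}).get("active"):
                    # Each new feature added one more level; none was unwound.
                    status = "shipped"
                else:
                    status = "inactive customer"  # now an error message
        elif status == "pending":
            status = "retrying payment"  # now a payment state
    return status
```

Every branch here shipped fine on its own; the cost only shows up when someone has to make the next change.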
Perhaps in a few years AI will be so good that it won’t matter. Or perhaps this is a silent crisis. What do you think?