After the watershed
Shaping the post-hype AI world
I’m very excited about AI. I’m also absolutely sick to death of it. Cognitive dissonance? Can confirm.
My feelings about it run the gamut from delight at what it has managed to find out or solve for me – much faster and more effectively than I ever could have – to towering, dystopian visions of a technology‑enslaved humanity where the flatlining pulse of critical thinking makes the dodo look like it’s alive and kicking.
For all its myriad capabilities, AI can’t yet write the history of its own r/evolution with hindsight, so let’s examine where we’re at. AI isn’t just a tool, it’s a mirror. One that reflects our anxieties, our aspirations, and our blind spots. And right now, that reflection is… complicated at best.
And, of course, this isn’t the whole story. There are the other, heavier conversations that need to be had about AI: the ecological cost of training models the size of small nations, the privacy minefields, the potential for criminal misuse, the weaponisation risks, the labour displacement, the geopolitical arms race… All of that matters. But this piece isn’t trying to solve those Gordian knots. (Later pieces may. Stay tuned.) I’m limiting myself to something smaller and more intimate: how AI is reshaping our behaviour, our expectations and the strange cultural weather system we now inhabit.
More than anything, we seem to be revolting – quietly or otherwise – against the force‑feeding of AI‑everywhere‑ness that started in 2022 and has barely let up. Let’s pause there for a second. Not even four years have passed since the launch of ChatGPT, and already we have to concentrate hard to remember what life was like before it, even if we never actively use it. It’s already suffused every conceivable stratum of technology, society and culture to some degree: the answer to a problem we’re only beginning to articulate.
Cheap knock-off
For all the force‑fed hype, most people still see AI as cheap, second‑rate, or vaguely shameful – the cheating, soulless, cut‑price version of creativity. It’s the creative equivalent of instant noodles: technically edible, capable of keeping you alive, but nothing anyone brags about having for dinner.
And it really is force‑feeding. Every shiny new gadget is pitched on its AI abilities. Yet the data is unequivocal: most of us simply don’t care, and we certainly don’t buy products because they have AI. If they happen to have it, fine. If it’s actually useful, wow – amazing. On the whole, whatevs.
In another twist, even when AI makes things easier, it can make them harder. A recent study found that adopting AI workflows can actually create more work, more burnout and more stress. It’s a paradox reminiscent of expanding motorways: add more lanes and you don’t reduce traffic, you invite more of it. AI accelerates output, which raises expectations, which increases workload. The marketing‑driven promise of a Valhalla of efficiency becomes a trap.
If ever there was an industry with technology in its very DNA, it’s gaming. Yet even in that tech‑worshipping temple of entertainment (the largest of all, let’s not forget, with worldwide earnings far outstripping those of films and music combined) the pushback was loud and clear when Nvidia announced its latest DLSS AI technology – a system that boosts game performance by rendering at a lower resolution and using AI to upscale the result, and in its newer versions to generate entire frames. It may look amazing and perform brilliantly (the jury is still out), but it’s a telling cultural barometer that even tech‑up‑to‑the‑eyeballs, hardware‑obsessed gamers rolled their eyes very hard at what they perceived as a cheap shortcut: inauthentic and – oh, the irony – ‘not real’. Why? Because it was perceived as AI replacing craftsmanship and artistry, and whether that perception is accurate is effectively beside the point.
AI as friction
I have personally found AI useful in an unassuming outpost of productivity that I would never have considered a practical application: it’s become my digital pause button. My ADHD brain benefits massively from being able to delegate its constant ‘How should I…’ questions to my AI bot of choice. This does three things:
It interrupts my natural impulse to react immediately and forces me to articulate the problem carefully.
It then gives me non‑emotional, non‑reactive feedback, free of human ego, urgency or agenda.
And that, in turn, pushes me to think critically. This is an AI, and it can – and does – make mistakes. I have to evaluate its suggestions rather than blindly accept them.
This process is enough to calm my racing brain and help me make better decisions. Not because it’s making them for me, but because it makes me slow down, take a step back, and examine the issue. I’m not outsourcing my thinking, I’m outsourcing my panic. More often than not, the answer it gives me is the one I’d already arrived at in the process of formulating my query. (Which then makes me slightly nervous that it’s just confirming my bias, but still…)
And on we go
So where does this quagmire of contradictions leave us?
Maybe the real question isn’t so much what AI will do to us, but what we will do with it. Who do we want to be in a world where the boundaries between real and unreal, skilled and automated, human and machine are increasingly nebulous?
My expectation is that AI will make real skill more valuable, not less. When the baseline becomes automation, human discernment becomes the premium. Craft, nuance, originality – all the stuff that can’t be templated – will become the real differentiators. In a world flooded with synthetic content, authenticity becomes a prized resource rather than an outdated or superseded one.
I also suspect that AI will ultimately force us to become more discerning, more intentional and more awake. When the world becomes harder to trust, the only option left to us, ultimately, is to read it more critically and fastidiously. Curiosity over convenience. Discernment over automation. Agency over autopilot.
AI won’t decide who we become, but it is making it damned hard to avoid looking in the mirror. And that may not be a bad thing.