
Getting my hands dirty – literally – is something I genuinely enjoy. The problem? I’m spectacularly inept when it comes to anything involving construction skills.
Demolition and destruction? I’m your person. But when it comes to building, it’s best to assign me the humble task of handing over tools or cold drinks. Maybe I’ll hold something heavy in place. Wield an axe? Absolutely. Operate a chainsaw? Not a chance. Assembling IKEA furniture? Barely. Mounting a television on a wall? That’s just an insurance claim waiting to happen.
As the BC Lottery Corporation wisely reminds us: Know your limit, play within it.
What’s brought all this to mind – aside from five days spent in the rugged wilds of British Columbia – is artificial intelligence. AI has crept quietly into our lives, becoming just another tool of convenience. Except it isn’t just that.
Here’s the danger: AI gives people the illusion that they’re experts. That specialized help is no longer necessary. That context, nuance, and experience can be outsourced. But AI is far from infallible. It’s not a replacement for research, judgment, or professional knowledge.
Can AI draft a letter or write a news release? Sure. But don’t expect soul, personality, or passion. The results are often flat – like they were written by a committee. Back in my editor days, I was careful not to over-edit stories to the point where they lost the writer’s unique voice. I believed a piece should carry the energy of its creator—byline or not. That’s what gave our paper vitality.
And here’s the kicker: AI is frequently wrong.
A BBC study reported by The Guardian in February found that over half of AI-generated answers from ChatGPT, Copilot, Gemini, and Perplexity had “significant issues.” The New York Times echoed this, warning that while AI is growing more powerful, its hallucinations—yes, that’s the technical term—are getting worse.
“Hallucinations” is an apt word. Because that’s the risk we face: letting fiction pass for fact. In an age where misinformation already floods our social feeds, relying solely on AI means we risk amplifying errors and embedding them into our narratives.
So, does AI have a role? Absolutely – but it’s a starting point, not a final destination.
AI is an incredible tool for gathering information, sorting data, and even offering insights. But it can’t think like a human. It can’t draw cultural parallels or recognize subtle differences that matter deeply in real-world contexts. It won’t know that an idea that works brilliantly in Toronto might be a complete non-starter in Kelowna.
This is where your humanity comes in.
Writing, designing, engineering – whatever your craft – you must inject your knowledge, your experience, your you into the final product. Use AI to gather raw material, then fact-check, personalize, and shape it into something that reflects your expertise.
Sound self-serving, coming from a writer warning against the overuse of AI? Maybe. But the consequences are real. Just look at the lawsuits dismissed because AI generated completely false citations.
Whether you’re in the wilderness or behind a keyboard, knowing your limits isn’t weakness. It’s wisdom.