AI makes language effortless—but thinking works best when it is effortful. Thus, beware the “chat trap”: casual use of generative AI can quietly soften judgement by replacing framing, definition, and trade-offs with fluent prose.
In an age of cheap text and AI-generated plausibility, leaders are not misled by too little information but by too much of the wrong kind. In this piece I explore why attention has become a governance risk—and why learning what to ignore now matters as much as what to know.
The em-dash has fallen under suspicion—treated as a tell-tale sign of artificial writing rather than what it has always been: a mark of care, rhythm, and thought in motion. It should return to good standing so we can recover linguistic standards we seem oddly eager to abandon.
A 2025 review of authority, trust, coherence and attention—plus a strategic outlook for 2026 on AI governance, provenance, regulation and decision quality. What to prioritise, what to ignore, and why clarity beats theatre.
A practical playbook for human-centred AI: redesigning workflows, building capability, governing judgement, and sustaining talent pipelines so AI amplifies human agency rather than hollowing out the organisation.
Most companies use AI, but few achieve real impact. In Part I of this series I explore why people—not tools—determine AI maturity, and why the next serious organisations will dominate the space between pilots and profit.
In an age when machines can mimic thought, the real question is who stands behind the words. A reflection on authorship, judgement, and the human presence that gives writing its authority.
AI models often mirror our beliefs, rewarding us with agreeable but shallow answers. This sycophancy flatters rather than challenges, eroding judgement and candour. To gain true value, leaders must set incentives that favour truth over comfort, design prompts that demand trade-offs, and treat AI as a sparring partner rather than a flatterer.
Clicks are dying, and with them provenance, nuance, and the economics of ideas. AI summaries reward skim over substance. We need friction, attribution, and long-form thinking—treating summaries as aperitifs, not meals—so leaders decide from sources, not headlines.
Afraid of being left off the AI bandwagon, many boards are blindly investing in technology they barely understand, driven more by hype than strategy. Mimicry, not discernment, has become the default. The result? Strategic incoherence, wasted billions, and millions of people thrown unnecessarily out of work.