The board pack arrived at 11:12 p.m. the night before the offsite: 386 slides, three annexures, and an AI-generated executive summary that was—superficially—excellent. The language was confident, the charts were neat, and the recommendations were crisp enough to look decisive.
In the first session the following day, a director asked where one particular statistic came from. The room stalled. Someone searched. No one could find the original source. After some awkward digging, it turned out the number had been lifted from a commentary piece citing another commentary piece, which cited—eventually—a paywalled report from a different industry and a different year. The statistic wasn't merely wrong; it was irrelevant padding masquerading as relevant data.
The organisation had not been misled by a lack of information. It had been misled by too much low-grade information, processed too quickly, with too little discrimination.
In an economy where the marginal cost of producing words has collapsed, attention is no longer a personal productivity matter. It is a governance problem and a strategic risk. Critical thinking still matters, but it is increasingly deployed too late—after attention has already been captured by what should never have been considered. What leaders now need is a complementary competence: critical ignoring—the deliberate, defensible refusal to engage with information that does not deserve the privilege of shaping judgement.
Why Critical Thinking Buckles Under Volume
The current, almost obsessive, exhortation to "be data-driven" often functions as a moral instruction: if you are overwhelmed, it is because you don't have the right data to keep up. Yet research on information overload is as old as it is unromantic: when the volume and complexity of information exceed available cognitive capacity and time, decision quality deteriorates. No ifs, no buts. At any given moment we have finite capacity, and when the garbage coming in exceeds our capacity to absorb and process it, far from garbage coming out, we simply go mute on the subject.
Herbert Simon (1916–2001) put it more sharply: 'a wealth of information creates a poverty of attention'.
This is the first executive mistake: treating attention as an infinitely scalable capability. It is not.
The second mistake is assuming that critical thinking, usually misunderstood and ill-defined as it is, is the appropriate first response to abundance. In high-noise environments, critical thinking becomes a costly luxury. Every minute spent painstakingly evaluating a low-value claim is a minute not spent on higher-order work: clarifying assumptions, building causal explanations, or reading primary sources.
Worse, engagement itself is a kind of concession. Exposure increases familiarity, and familiarity is often misread as truth. The cognitive problem is not simply that bad information exists, but that it competes for attention using emotional triggers—outrage, curiosity, indignation—designed to pre-empt deliberation.
This is why the "just teach people to think critically" agenda is increasingly inadequate. It assumes that the mind can patiently appraise everything presented to it. That assumption belonged to a bounded information environment. It does not belong to industrial-scale content production.
The good news, as friend of the show Herbert Simon observed back in the early '70s, is that we can solve for the excess of information by allocating our 'attention efficiently among the overabundance of information sources that might consume it.' A key part of this allocation process is critical ignoring.
Critical ignoring begins where classical critical thinking curricula often refuse to begin: before engagement, at the moment of selection.