The Philosopher Who Warned Us First

In 1950, MIT mathematician Norbert Wiener published 'The Human Use of Human Beings,' followed in 1964 by 'God and Golem, Inc.' In these works, Wiener described with striking precision the danger of deploying machines optimized for goals humans don't fully control. He wrote explicitly: 'If we use, to achieve our purposes, a mechanical agency with whose operation we cannot interfere effectively... we had better be quite sure that the purpose put into the machine is the purpose which we really desire.' He called the field 'cybernetics' and warned that automation without value alignment could destabilize society in ways no engineer intended. His books were reviewed, discussed briefly, then largely shelved. By the 1990s, philosophy departments had stopped assigning them. Then in 2014, Oxford...

Mental Models

Discourse Analysis

Popular framing: Nick Bostrom identified the AI alignment problem in 2014, catalyzing a new field of existential risk research that mobilized billions in funding and serious scientific attention for the first time.

Structural analysis: The alignment problem was clearly articulated by Norbert Wiener in 1950 and 1964 with near-identical core claims. The 64-year gap between formulation and institutional uptake reflects not a discovery event but a social cascade: the idea required celebrity amplification, proximity to visible technology, and elite network endorsement before it could travel beyond its originating discipline. The 'discovery' narrative is a post-hoc construction enabled by recency bias and the availability heuristic: Bostrom's book is vivid and recent, while Wiener's books are out of print and unassigned. Feedback loops are central here in a second sense as well: Wiener's core insight was that the alignment problem is a control-theory problem, not a programming problem.

The gap matters because it reveals a structural failure in how societies process warnings about emerging technologies: the quality and timing of the warning are far less determinative than the social position of the warner and the cultural readiness of the audience. If alignment researchers treat 2014 as year zero, they will systematically miss insights from earlier technological transitions, such as nuclear risk, automation anxiety, and cybernetics, that contain hard-won wisdom about exactly the dynamics they are trying to model.
