If I were to suggest that the progress of human civilisation is slowing, most people would doubtless disagree. Social-media feeds brim daily with new triumphs in artificial intelligence, persuading even seasoned technologists that a looming “singularity” will arrive in the 2040s. As an engineer, I lean on ChatGPT for coding, literature searches and language polishing. Yet, despite AI’s recent leaps, I cannot see humanity about to vault into an era of breakneck innovation. On the contrary, I fear we may be drifting towards a new period of stagnation, one not unlike the medieval centuries.
A classic signal of genuine technological prosperity is a surge in jobs linked to the breakthrough. During the First and Second Industrial Revolutions (roughly 1750-1850 and 1870-1914), new factories demanded workers, technicians, engineers and managers. Enterprises and capital poured in; erstwhile peasants found higher-paid roles; even colonised regions benefited, however unevenly, from incoming investment and goods.
By contrast, the digital revolution has not generated employment on a comparable scale, even as start-ups and tech behemoths post record profits. Countless papers and consultancy white papers forecast which occupations AI will displace; few outline convincing new career paths it will create. Microsoft already reports that AI writes roughly 30% of the code in its products, and the proportion is climbing. In blunt terms, software engineers are busily building tools that may render many of them redundant.
In past upheavals, obsolete trades usually gave way to broader opportunities. Carriages vanished, but drivers and mechanics for trains and motor-cars flourished. Today’s AI boom, I suspect, will not replicate that pattern. A handful of elite engineers, versed in advanced mathematics and cognitive science, can sustain an entire AI division. Such expertise demands years of schooling that will deter the majority of school-leavers; the cohort can never rival the legions of traditional developers.
For new graduates, the employment gate is narrowing further as firms automate basic tasks. Advertisements for “entry-level” roles in finance, IT and manufacturing increasingly stipulate several years of experience—even, absurdly, in technologies launched mere months earlier. The prospect is unsettling: a generation of well-educated young adults could find themselves excluded from the workforce the moment they leave university, with diminishing hope of entry later.
Beyond unemployment looms a subtler danger: a drain on creativity. Enthusiasts contend that AI will eventually devise transformative solutions, yet they overlook a foundational distinction. Human insight often emerges from leaps of imagination—intuitive, even illogical. An AI model, by design, distils patterns from existing data; its “original” music, prose or code is an elaborate summary of what it has already ingested.
Consider authoritarian states such as China, Iran or North Korea, where access to external information is curtailed. Students there frequently begin to question official narratives despite exposure only to state material. A large language model trained exclusively on the same material would simply parrot the propaganda; it cannot raise the meta-question, “Does this make sense?”, unless a human explicitly prompts it to do so. True artificial general intelligence would require spontaneous self-questioning—a capability we barely understand in ourselves, let alone in silicon.
Historically, civilisation advanced not merely through incremental refinement but through imaginative leaps: the steam engine, impressionist painting, quantum theory. If businesses, engineers and artists default to AI for answers, we may see fewer such leaps. A model cannot dream what it has never “seen”.
Finally, we should remember that the digital economy alone produces no tangible goods. Real wealth grows from making physical things better or inventing new ones: lighter aeroplanes, more efficient turbines, affordable solar panels. An exquisitely rendered virtual meal in the metaverse, however lucrative to its owners, will not nourish the diner. Likewise, no matter how vividly XR depicts the Martian landscape, no Martian minerals will arrive on Earth unless a spacecraft physically lands and returns. AI can optimise research, but its advances still rest on the gritty realities of materials science and manufacturing.
None of this makes me a nostalgist yearning for “the good old days”. ChatGPT is indispensable to my daily workflow. Yet I do not expect AI to birth an unprecedented idea unsupported by existing evidence. Human minds can ignite with inspirations no one has ever imagined; computers cannot. That spark in the human brain remains the ultimate engine of civilisation—and we neglect it at our peril.