The tech industry has a long history of one recurring story: a Big New Thing comes along; it's actually pretty good in some important ways; some of its proponents oversell it; this becomes a big deal; eventually those who bought into the hype are let down by the reality; there's a backlash; and finally the sane folk who knew what it was from the get-go can settle down to making good use of it, without having to fight off the stupid attempts to make it do more than any reasonable person would expect of it. The only really surprising thing is how many people airbrush that history out of their memory and then go on to repeat the whole sordid, stupid mess when the next Big New Thing comes along. And the next. And… I think you probably get the idea.
The current example (I'm writing this in autumn 2025) is large language models (LLMs), a type of machine-learning system that does some genuinely useful things pretty well, and I'm sure folk will be using them for a fair while hereafter; but you've probably heard some much more enthusiastic hype about them. I've addressed that elsewhere, so won't get into specifics here.
As the late lamented splendid Dævid Allen said in a song by the band Gong, "nothing is new that's under the sun, except for what we have forgotten" (well, OK, I do think that's a slight overstatement, but not by much, and anyway the sentiment is sound) and, indeed, modern AI hype has a lot in common with the hype around the many technologies that have come before it, albeit turning some of it up to eleven. So here's a quick list of significant past examples, maybe in roughly reverse chronological order, but I'm making no promises. Hopefully I'll find time at some point to expand on each of these in some sections below but, for now, I need somewhere to note this down while I'm working on something else, so I don't digress there to talk about it.
Web 3, and crypto-currencies. (Please stop abbreviating that to crypto; it makes real cryptographers cry or get angry, and we depend on their goodwill for internet trade of any kind to remain viable.) It's called Web 3 because there had previously been a Web 2.0 – which meant corporations raking in the spondooliks on the back of content generated by their users – and what a lovely little shit-storm that's degenerated into.
Graphical user interfaces and the desktop metaphor. I never know whether to cringe or laugh when I see or remember that scene where Scotty goes all super-speed on a Macintosh – one of those moments when Hollywood's wilful unrealism about tech transcends its usual awfulness and rises to a form of side-splitting poetry. And, again: no, Microsoft did not invent them (nor did Apple, who famously got the idea from Xerox PARC). Microsoft were a bit slow to jump on the bandwagon, but they knew how to leverage one monopoly to make another. The original terminology, windows, icons, mouse, pointer – with acronym WIMP – has since had folk's understanding of its first word irreparably changed by a trademark that should never have been granted, because it covered a generic term. Literally every GUI has windows (no, not Windows™, just plain windows) in it.
Personal computers. Folk say PC as if it meant Microsoft PC, but Microsoft didn't even invent them. It just provided the software (which bore a striking resemblance to a hobbyist's QDOS) for an IBM product – itself a late-in-the-game attempt to get in on The Big New Thing before it made IBM irrelevant; IBM are, of course, still selling mainframes among other things, so they really needn't have panicked, but hindsight is famously easier than foresight. Microsoft then leveraged their copyright (which is hilarious, given how contemptuous they were of copyright law when ripping off QDOS) to monopolise the market, aided and abetted by Steve Jobs's control obsession, which kept the Mac from becoming an open platform. Openness was what really made MS-DOS such a roaring success – and was, I guess, probably originally IBM's idea rather than Microsoft's.
Compilers – which translate a language intelligible to human programmers into the executable binary files that computers can run – and interpreters, which make sense of such a language directly, carrying it out as instructions. These days, compiler and interpreter technology is pretty mature. Both do a respectably good job and support a wide range of programming languages, from things a reasonably awake teenager can learn to use through to things that specialists of diverse varieties can use to do real work efficiently.
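To make that distinction concrete, here's a toy sketch (my own illustration, not anything from the history above): a few lines of Python that interpret simple arithmetic written in a form humans find readable, acting on it directly rather than translating it into a binary the way a compiler would.

    # A toy interpreter (an illustration only): it reads arithmetic
    # such as "1 + 2 * 3", parses it into a syntax tree, then walks
    # that tree performing each operation directly. A compiler would
    # instead translate the tree, once, into an executable binary for
    # the computer to run later.
    import ast
    import operator

    # The handful of operations this toy language supports, mapped
    # from syntax-tree node types to the functions that do the work.
    OPS = {
        ast.Add: operator.add,
        ast.Sub: operator.sub,
        ast.Mult: operator.mul,
        ast.Div: operator.truediv,
    }

    def interpret(text):
        """Evaluate an arithmetic expression by walking its syntax tree."""
        def walk(node):
            if isinstance(node, ast.Constant):
                return node.value
            if isinstance(node, ast.BinOp) and type(node.op) in OPS:
                return OPS[type(node.op)](walk(node.left), walk(node.right))
            raise ValueError("syntax this toy doesn't understand")
        return walk(ast.parse(text, mode="eval").body)

    print(interpret("1 + 2 * 3"))  # prints 7

A real interpreter does the same dance for a whole programming language; a compiler does the parsing and analysis once, then writes out machine code to be run as often as you like.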
Notice that none of these has ever gone away; indeed, most of them have gone on to be even more … whatever each is, even though some folk have been foretelling the deaths of some of them for some time. (I dunno, maybe Arcade Games are a thing of the past? But my guess is they're still out there, somewhere, loved by the folk that love them; and I grouped them with a bunch of other stuff that's definitely still going strong, and it was the whole bundle, taken together, that was the Big Thing. Game arcades were just a commercial implementation detail.) Admittedly, some of them have vanished from view to continue their lives hiding inside something else, but I assure you they're still there, delivering the thing they're actually good for (to at least someone), day in, day out, for better or for worse.
Each of them had folk heralding it as revolutionary and saying it was going to change the world – which, in fact, it did, just nowhere near as much as the hypesters would have had you believe. I could probably add the telephone, the telegraph, the printing press and so much more, right back to the wheel, the plough and fire for that matter, but I'm sticking to tech because it's what I know, including some of what folk actually said about its Big New Things before they quietly got integrated into the background of our lives so thoroughly that we take them entirely for granted.

Written by Eddy.