I don't know if it's selection/survivor bias, but every time I watch a video about computers from the 60s and 70s, I am amazed how spot on they are with the trajectory of the technology.
Take this CAD demo from MIT back in 1963 showing features that I commonly use today: https://youtu.be/6orsmFndx_o
Then the 80s and 90s rolled in, and computers entered the mainstream. Imagination got a bit too wild with movies like Electric Dreams (1984).
Videos like this make me think that our predictions of AI superintelligence are probably pretty accurate. But just like with this machine, the reality may end up looking different.
JdeBP · 38m ago
It definitely is survivorship bias. Go and watch videos from the retrocomputing enthusiasts. There are loads of branches in computing history that are off-trajectory in retrospect, inasmuch as there can be said to be a trajectory at all.
Microdrives. The Jupiter Ace. Spindle controllers. The TMS9900 processor. Bubble memory. The Transputer. The LS-120. Mattel's Aquarius. …
And while we remember that we had flip-'phones because of communicators in 1960s Star Trek we forget that we do not have the mad user interfaces of Iron Man and that bloke in Minority Report, that the nipple-slapping communicators from later Star Trek did not catch on (quelle surprise!), that dining tables with 3-D displays are not an everyday thing, …
… and that no-one, despite it being easily achievable, has given us the commlock from Space 1999. (-:
* https://mastodonapp.uk/@JdeBP/114590229374309238
The Transputer failed as an implementation, but all modern server/workstation CPUs have followed the Transputer model of organizing the CPU interfaces, starting with some of the later DEC Alpha models, followed by the AMD Athlon and then by all others.
Unlike the contemporaneous CPUs and many later CPUs (which used buses), the Transputer had 3 main interfaces: a memory interface connecting memory to the internal memory controller, a peripheral interface and a communication interface for other CPUs.
The same is true for the modern server/workstation CPUs, which have a DRAM memory interface, PCIe for peripherals and a proprietary communication interface for the inter-socket links.
Having inherited designers from DEC Alpha, AMD adopted this interface organization early (initially using variants of HyperTransport both for peripherals and for inter-CPU communication), while Intel, as always, was the last to adopt it. They were eventually forced to do so (in Nehalem, i.e. a decade after AMD) because their obsolete server CPU interfaces were hurting performance too much.
pcblues · 15m ago
The Jupiter Ace was unreal, but only from a computer science perspective. You had to know a lot to program in Forth, which was the fundamental language of that white but Spectrum-looking dish of a PC, in spite of a manual that read like HGTTG. Critically, it didn't reward you from the start of your programming journey like Logo or BASIC did, and it didn't have the games of the ZX Spectrum. I knew a person who tried to import and sell them in Australia. When I was young, he gave me one for free as the business had failed. RIP IM, and thanks for the unit!
https://80sheaven.com/jupiter-ace-computer/
Second Edition Manual: https://jupiter-ace.co.uk/downloads/JA-Manual-Second-Edition...
That's one of the reasons why touchscreen smartphones dominated the market in less than a decade. They made the dream of "real-time videotelephony from a rectangle" come true, a dream that had been present in literature and culture for around a hundred years.
Cthulhu_ · 1h ago
And yet, while 90's (and earlier) TV was talking breathlessly about video communication, it feels like it just "snuck in" to our daily lives when webcams and e.g. Skype became mainstream, and it never felt magical. Of course, the demos were tightly scripted and stifled.
extraisland · 28m ago
I saw a BBC archive video about AMSTRAD. AMSTRAD owned a PC manufacturer called Viglen. In the archive, the CEO of Viglen was having a video call with someone offsite, presumably on what looked like Windows 3.11. This was 1996: https://youtu.be/XX53VbgcpQ4?t=793
In the same video the salesman was selling a Pentium 75MHz machine.
People had seen the tech working in some form on TV for some time. It just wasn't mainstream.
undebuggable · 1h ago
Skype marked the first major milestone. The software and network parts were "simply working", but the hardware parts (CRT displays, headsets, and webcams) were still plasticky and tacky.
smokel · 2h ago
One might also take on the more cynical perspective and be disappointed that we are still stuck with these early achievements.
FCOL most of us are now happy to have our AI overlords type out software on 80 column displays in plain ASCII because that is what we standardized on with Fortran.
AlbertoGP · 42m ago
That man, Rex Malik, participated in (among other things) the 1982 BBC series “The Computer Programme” (https://en.wikipedia.org/wiki/The_Computer_Programme), typically in a small section at the end of an episode, but also as narrator in other parts, and he is credited as “Programme Adviser”:
Episode 1 - “It’s Happening Now”: https://www.youtube.com/watch?v=jtMWEiCdsfc
Episode 4 - “It’s on the Computer”: https://www.youtube.com/watch?v=UkXqb1QT_tI
Episode 5 - “The New Media”: https://www.youtube.com/watch?v=GETqUVMXX3I
Episode 10 - “Things to Come”: https://www.youtube.com/watch?v=rLL7HmbcrvQ
I used to take home a terminal from work in the mid 70s. Same principle, but portable. It had two rubber cups which the two ends of the phone handset would push into, and after dialing up I was ready to go.
I felt space age.
bigtones · 47m ago
Acoustic coupler for the win!
kstenerud · 1h ago
I laughed at the first scene, where he's placed a machine with a rather loud fan next to his bed, one that also periodically goes CHUNKA-CHUNKA-CHUNKA-CHUNKA!
It's also interesting to note his lack of adeptness at typing (sign of the times, I suppose).
hilbert42 · 3h ago
I wonder what that kid ended up doing for a profession and what he thinks of today's computers.
That BBC news report is interesting as it puts about 60 years of tech/computing progress into perspective.
Now extrapolate 60 years hence: right, the mind just boggles.
KevinMS · 1h ago
I wonder why, if they were doing a story about the future, they didn't find somebody with a CRT display instead of those horrendous teletypes.
lopis · 13m ago
I think that in 1967 an affordable computer terminal was no more than a two-way fax machine. Being able to drive a CRT sounds significantly harder than driving a typewriter.