I don't know if it's selection/survivorship bias, but every time I watch a video about computers from the 60s and 70s, I'm amazed at how spot-on they are about the trajectory of the technology.
Take this CAD demo from MIT back in 1963 showing features that I commonly use today: https://youtu.be/6orsmFndx_o
Then the 80s and 90s rolled in, and computers as a concept entered the mainstream. Imaginations got a little too wild with movies like Electric Dreams (1984).
Videos like this make me think that our predictions of AI superintelligence are probably pretty accurate. But just like this machine, the reality may look different.
That's Ivan Sutherland though. He's one of the living legends of computing.
His doctoral advisor was Claude Shannon, and his students include the founder of Adobe, the founder of SGI, and the creators of both Phong and Gouraud shading.
I always find it amusing that Phong was actually the guy's given name, but Vietnamese name ordering isn't the same as in the US, so everyone thought it was his surname and just rolled with it.
Yes! He is greatly missed among his colleagues. I went to an event in Utah in 2013 and multiple people spoke quite fondly of him, including Blinn and Jim Clark.
When they were showing old photos of Ivan's VW Bug that they had taken measurements of, there was an obvious pause of grief whenever he appeared in one.
There are videos of these on YouTube. I sat next to Sutherland in the room, btw.
That is second only to the time I showed up early to an event for a buffet breakfast, when it was just me and Woz in the room, and I talked to him for an hour not realizing who he was.
Actually, my bullshit flags went up, thinking "this guy sure likes to tell fantastic, exaggerated stories!"
Kinda like talking to, say, Harrison Schmitt without knowing who he is and thinking "You? Landed on the moon? Sure, old man. Set foot on the moon... then you were a senator? Alright."
NB: Sutherland co-directs a lab on asynchronous logic with a colleague at Portland State. As the site says: "Please visit us when you are in the neighborhood!" The lab takes summer students, too, although Portland State is broke, so don't expect compensation.
It definitely is survivorship bias. Go and watch videos from the retrocomputing enthusiasts. There are loads of branches in computing history that are off-trajectory in retrospect, inasmuch as there can be said to be a trajectory at all.
Microdrives. The Jupiter Ace. Spindle controllers. The TMS9900 processor. Bubble memory. The Transputer. The LS-120. Mattel's Aquarius. …
And while we remember that we had flip-'phones because of communicators in 1960s Star Trek we forget that we do not have the mad user interfaces of Iron Man and that bloke in Minority Report, that the nipple-slapping communicators from later Star Trek did not catch on (quelle surprise!), that dining tables with 3-D displays are not an everyday thing, …
… and that no-one, despite it being easily achievable, has given us the commlock from Space 1999. (-:
The Transputer failed as an implementation, but all modern server/workstation CPUs have followed the Transputer model of organizing the CPU interfaces, starting with some later models of the DEC Alpha, followed by the AMD Athlon and then by all the others.
Unlike the contemporaneous CPUs and many later CPUs (which used buses), the Transputer had three main interfaces: a memory interface connecting external memory to the internal memory controller, a peripheral interface, and a communication interface for other CPUs.
The same is true for the modern server/workstation CPUs, which have a DRAM memory interface, PCIe for peripherals and a proprietary communication interface for the inter-socket links.
By inheriting designers from the DEC Alpha, AMD adopted this interface organization early (initially using variants of HyperTransport for both peripherals and inter-CPU communication), while Intel, as always, was the last to adopt it. They were eventually forced to (with Nehalem, i.e. a decade after AMD), because their obsolete server CPU interfaces were costing too much performance.
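To make the comparison concrete, here is a minimal sketch (my own illustration, not from the comment above) of the interface split being described; the labels simply restate the text rather than exact product specifications:

    # Illustrative only: the three-way interface split described above, expressed as data.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class CpuInterfaces:
        memory: str                # dedicated path to RAM via an on-chip controller
        peripherals: str           # dedicated path to I/O devices
        other_cpus: Optional[str]  # dedicated link(s) to neighbouring CPUs

    # 1980s Transputer: three separate interfaces, no shared system bus.
    transputer = CpuInterfaces(
        memory="internal memory controller -> external memory",
        peripherals="peripheral interface",
        other_cpus="communication links to other Transputers",
    )

    # Typical modern server CPU: the same split, with newer names.
    modern_cpu = CpuInterfaces(
        memory="DRAM memory interface (integrated controller)",
        peripherals="PCIe",
        other_cpus="proprietary inter-socket links",
    )

    # The Transputer's bus-based contemporaries instead hung everything off one shared bus.
    bus_based_cpu = CpuInterfaces(
        memory="shared system bus",
        peripherals="shared system bus",
        other_cpus="shared system bus",
    )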
The Jupiter Ace was unreal, but only from a computer science perspective. You had to know a lot to be able to program in Forth, which was the fundamental language of that white but Spectrum-looking dish of a PC, in spite of a manual that read like The Hitchhiker's Guide to the Galaxy. Critically, it didn't reward you from the start of your programming journey like Logo or BASIC did, and it didn't have the games of the ZX Spectrum. I knew a person who tried to import and sell them in Australia. When I was young, he gave me one for free, as the business had failed. RIP IM, and thanks for the unit!
>There are loads of branches in computing history that are off-trajectory in retrospect, inasmuch as there can be said to be a trajectory at all.
Vectrex. Jaz drives. MiniDisc. 8-track. CB Radio.
The more I notice, the less I feel there is a discussion to be had over this distinction.
The sci-fi predictions all came true - and many of them have since come and gone again, which is to say that the achievement of turning speculation into reality becomes irrelevant the moment a replacement technology arrives.
Star Trek's communicators did catch on - among the content-creation segment - but on the other hand, we also got the 'babel fish'-like reality of EarPods...
I think each step in the never-ending march of technology seems fantastic at first, but becomes mundane and banal the moment another fantasy is realised.
That's one of the reasons why touchscreen smartphones dominated the market in less than one decade. They made the dream of "real-time videotelephony from a rectangle" come true, a dream which had been present in literature and culture for around a hundred years.
I read and watched quite a bit of sci-fi (including from the golden age) as a kid in the early 90s and don't recall such a dream. What media exactly did I overlook?
The Jetsons, Back to the Future, 2001: A Space Odyssey, even more distant predictions [1]? That's off the top of my head, without even searching thoroughly.
So the claim is that video calls are the interesting thing, not having a touchscreen interface or having the display cover an entire surface or being a single unfolded piece? (I had been struggling to understand what was special about "from a rectangle", since non-smart phones have had raster displays for much longer.)
Yeah, that really just never figured into my visions of "the future" as anything significant, I have to say. And even nowadays it doesn't seem like people often make video calls, given the opportunity. They'd rather take advantage of those displays to doomscroll social media, or see amazingly crisp (and emoji-filled!) text in a messaging app.
The point is that it was a predicted future for 100 years and now it's actually here. Gen Z makes video calls all the time in situations where their elders would make ordinary voice calls.
And yet, while 90s (and earlier) TV was talking breathlessly about video communication, it feels like it just "snuck into" our daily lives when webcams and e.g. Skype became mainstream, and it never felt magical. Of course, those demos were tightly scripted and stilted.
When I first got our lab's two SGI Indys' webcams pointed at each other's respective coffee pots over ISDN, with 30 km between them, there was definite magic.
The same when I sat in the hills of Griffith Park with a Ricochet modem and a TiBook, wondering how much ssh'ing and CUSeeMe I'd be able to do before the batteries ran out.
Once these kinds of activities became integrated into a laptop, the magic of all the past's predictions of the future definitely became atmospheric.
Yes, it was a regular tool for determining whether there was still power to various racks around SoCal. The reason it was still in use was that those racks were scattered across various locations around SoCal and nobody had the budget to switch to something else (plus, CuSeeMe binaries for SGI were a thing...).
Hah! I thought we were the only ones. I was using it to watch the screen of a machine with no other out-of-band monitoring, in a server room in 2002, mostly because "I've been using it forever and it still works".
Yes indeed, in fact my early productive use of videoconferencing mostly didn't involve humans, but rather - as you say - out-of-band monitoring of devices and systems.
On occasion it was nice to know when some tech was also in the closet, in case I knew their # and could get them to flick a switch or two, on my behalf, in lieu of the 1 or 2 hour bike ride (depending on traffic) I'd have had to endure to use my own fingers...
I saw a BBC archive video about Amstrad. Amstrad owned a PC manufacturer called Viglen. In the archive, the CEO of Viglen was having a video call with someone offsite, presumably on what looked like Windows 3.11. This was 1995.
Skype marked the first major milestone. The software and network parts "just worked", but the hardware parts (CRT displays, headsets, and webcams) were still plasticky and tacky.
One might also take on the more cynical perspective and be disappointed that we are still stuck with these early achievements.
FCOL most of us are now happy to have our AI overlords type out software on 80 column displays in plain ASCII because that is what we standardized on with Fortran.
(I've never seen "FCOL" before and had to look it up. For onlookers: "for crying out loud", apparently.)
We aren't stuck with the terminal and CLIs. We stick with them, because they actually do have value.
80 columns is a reasonable compromise length, once you've accepted monospace text, that works with human perception, visual scanning of text etc. But many programmers nowadays don't feel beholden to this; they use any number of different IDEs, and they have their linters set varying maximum line lengths according to their taste, and make code windows whatever number of pixels wide makes sense for their monitor (or other configuration details), and set whatever comfortable font size with the corresponding implication for width in columns. (If anything, I'd guess programmers who actually get a significant amount of things done in terminal windows — like myself — are below average on AI-assisted-programming adoption.) Meanwhile, the IDE could trivially display the code in any font installed on the system, but programmers choose monospace fonts given the option.
As for "plain ASCII", there just isn't a use for other characters in the code most of the time. English is dominant in the programming world for a variety of historical reasons, both internal and external. Really, all of the choices you're talking about flow naturally from the choice to describe computer programs in plain text. And we haven't even confined ourselves to that; it just turns out that trying to do it in other ways is less efficient for people who already understand how to program.
Monitors, keyboards, programming in textual representations, all seem quite unnatural. They were all the result of incremental technical progress, not the result of an ideal thought process. Just look at the QWERTY layout, and the limited number of people actually able to do programming.
If one reads science fiction novels from the 1970s, this is typically not the way people envisioned the 21st century.
I agree that the solutions have value, but I'm certain that we are stuck in a local optimum, and things could have been wildly different.
That's my "unpopular opinion" too. As I look at computer history, it amazes me how many things from the 1970s we still use. We are stuck at a local maximum due to the historical trajectory. Languages, terminal windows, editors, instruction sets, operating systems, CLIs, ...
The first person with a home computer in the UK, not just a terminal, was probably computer music experimenter Peter Zinovieff, who bought a DEC PDP-8/S for his studio in the late 1960s, for the insane cost of around £80,000 (inflation-adjusted to today).
In addition, DEC made its name in the 1960s by selling computers at unprecedentedly low prices. A complete PDP-8/S system was quoted at $25,000 in 1965 [0], equivalent to over a quarter of a million dollars today, for a computer that barely had an instruction set. These days we can buy supercomputers for five of today's dollars.
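A quick back-of-the-envelope check of that conversion; the CPI figures below are approximate annual averages and are my assumption, not anything from the comment:

    # Rough CPI-based inflation check for the $25,000 (1965) figure above.
    PRICE_1965 = 25_000
    CPI_1965 = 31.5      # approx. US CPI-U, 1965
    CPI_NOW = 320.0      # approx. US CPI-U, mid-2020s

    price_now = PRICE_1965 * CPI_NOW / CPI_1965
    print(f"~${price_now:,.0f} in today's dollars")  # roughly $254,000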
That man, Rex Malik, participated in (among other things) the 1982 BBC series “The Computer Programme” (https://en.wikipedia.org/wiki/The_Computer_Programme), typically in a small section at the end of an episode, but also as narrator in other parts, and he is credited as “Programme Adviser”; several episodes are linked further down.
I used to take home a terminal from work in the mid-70s. Same principle, but portable. It had two rubber cups that the two ends of the phone handset pushed into, and after dialing up I was ready to go.
I had one of those; dunno what happened to it... Fun story: I was living in Boston at the time, and there was too much line noise on the phone system for 300 baud, but 110 baud worked like a charm.
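For anyone wondering what those speeds mean in practice, here is a tiny worked example; the framing assumptions (11-bit character frames at 110 baud, 10-bit frames at 300 baud) are the classic Teletype/modem conventions and are mine, not the commenter's:

    # At these speeds 1 baud = 1 bit/s, so throughput is simply baud / bits-per-character.
    # 110 baud terminals classically sent 11-bit frames (1 start + 7 data + 1 parity + 2 stop);
    # 300 baud modems typically used 10-bit frames (1 start + 8 data + 1 stop).
    def chars_per_second(baud, bits_per_frame):
        return baud / bits_per_frame

    print(chars_per_second(110, 11))  # 10.0 characters per second
    print(chars_per_second(300, 10))  # 30.0 characters per second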
Fascinating to see how much of his personal information was computerized – bank account status, personal diary, stocks. How did internetworking with his bank work? Was his stuff securely stored?
I laughed at the first scene, where he's placed next to his bed a machine with a rather loud fan, that also periodically goes CHUNKA-CHUNKA-CHUNKA-CHUNKA!
It's also interesting to note his lack of adeptness at typing (sign of the times, I suppose).
The terminal in the video was a rebadged KSR-33, which was common as a computer console. A few people at MIT and Bolt Beranek and Newman had them at home at that time. A KSR-33 went for about $1000, according to Perplexity. There were few video terminals available then; the most common were the IBM 2260 series, a character-mode device that I remember as being very clunky. But you couldn't have used one at home: it relied upon a very clunky control unit and therefore couldn't be used remotely.
One additional example of the technology of that time. In 1968, I was a computer science student, and found myself called upon to arrange a demonstration of remote computing. The university at that time had no timeshared computing facility, so we used IBM's Call/360 service. The terminal was an IBM 1052 (big clunky printing terminal) with an acoustic coupler. To move this across campus, we arranged for a truck with 2 or 3 people to put the thing on a dolly, put it into the truck, and move it into the student union building. Later that day, the truck, and the helpers, came back and we reversed the process.
I think that in 1967 an affordable computer terminal was not much more than a two-way fax machine. Being able to drive a CRT sounds significantly harder than driving a typewriter.
Sutherland also ran the pioneering firm Evans & Sutherland, a graphics research company, starting in the 1960s. They produced things like the https://en.wikipedia.org/wiki/Line_Drawing_System-1
He was a key person during the Utah school of computing's most influential years - when Newell's famous teapot came out, for instance.
Saying his predictions were right on is kinda like saying Jony Ive's predictions about what smartphones would look like were accurate.
https://en.wikipedia.org/wiki/Bui_Tuong_Phong
I'm not a big fan of Schmitt's climate change denialism, but yeah, he did walk on the moon.
https://arc.cecs.pdx.edu
* https://mastodonapp.uk/@JdeBP/114590229374309238
https://80sheaven.com/jupiter-ace-computer/
Second Edition Manual: https://jupiter-ace.co.uk/downloads/JA-Manual-Second-Edition...
[1] - https://news.artnet.com/app/news-upload/2021/09/1280px-Franc...
https://youtu.be/XX53VbgcpQ4?t=793
In the same video, the salesman was selling a Pentium 75 MHz machine, so it must have run on a PC of similar specification.
People had seen the tech working in some form on TV for some time. It just wasn't mainstream.
By the mid-70s, Zinovieff's studio had turned into this:
https://www.thewire.co.uk/audio/tracks/listen_peter-zinovief...
[0] https://en.wikipedia.org/wiki/PDP-8
Episode 1 - “It’s Happening Now”: https://www.youtube.com/watch?v=jtMWEiCdsfc
Episode 4 - “It’s on the Computer”: https://www.youtube.com/watch?v=UkXqb1QT_tI
Episode 5 - “The New Media”: https://www.youtube.com/watch?v=GETqUVMXX3I
Episode 10 - “Things to Come”: https://www.youtube.com/watch?v=rLL7HmbcrvQ
I felt space age.
That BBC news report is interesting as it puts about 60 years of tech/computing progress into perspective.
Now extrapolate 60 years hence from today, and the mind just boggles.
I really like my ThinkPad!
The old Teletype in question was a Baudot machine with a 60 mA current loop, rather than the ASCII and 20 mA loop of the Model 33.