
Pine Wu 2023-06-29 18:51:57

i would like to have a bsky invite too 🙏

Ivan Reese 2023-06-29 19:17:45

I have 2 invites I can offer — DM if you want one

Ivan Reese 2023-06-29 22:07:06

#past-company

When the industry switched over from mainframes and minicomputers to personal computers, how big a regression was the performance of a typical PC versus that of a mainframe / mini at the time? (Keeping this question vague because I'm not quite sure how to phrase it. Feel free to interpret it however you like.)

Eli Mellen 2023-06-29 22:16:27

This is anecdotal because I’m an Ive-era iMac baby, but my understanding was that it sometimes was a speed-up: because of resource sharing on many mainframes, you couldn’t reliably get at all the power available.

Ivan Reese 2023-06-29 22:20:43

Yeah! That's sort of what made me wonder. Like, the mainframe had to time share across a bunch of jobs submitted by users, so presumably things slowed down when the number of people submitting jobs went up. So there's, like, a relative improvement in perceived performance when switching to PC. (And, AFAIK, a reduction in per-user cost.)

But in absolute terms? Like, if you had time on the mainframe when nobody else was around… how much fun was that? Like, for how long after someone got a PC would they still be tempted to stay late and submit jobs to the mainframe because it was that much faster?

Ivan Reese 2023-06-29 22:21:44

(I'm reminded of my early days of doing 3d animation in high school, where I got to use around a dozen computers in the lab as an overnight render farm — the inverse of timesharing on a mainframe!)

Konrad Hinsen 2023-06-30 06:24:42

This is not an easy question to answer, because it depends a lot on what you want to do, and in particular where the performance bottlenecks are. The biggest performance difference was in mass storage. Processing datasets larger than core memory was common practice on mainframes, but impossible on the first PCs. But for interactive use on small tasks, the PC was the best choice from day one.
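To make "processing datasets larger than core memory" concrete: the classic pattern is out-of-core (streaming) processing, where only one chunk of the data is ever resident at a time. A minimal sketch in Python; the filename and one-number-per-line layout are hypothetical, not from the discussion above:

```python
# Out-of-core processing: compute the mean of a numeric dataset far
# larger than RAM by streaming it one record at a time, so memory
# use stays constant regardless of file size.
# "measurements.txt" and its layout are hypothetical placeholders.
total = 0.0
count = 0
with open("measurements.txt") as f:
    for line in f:            # only one line resident at a time
        total += float(line)
        count += 1
print(total / count if count else 0.0)
```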

Konrad Hinsen 2023-06-30 06:29:37

Anecdata: In 1983, my high school had a few Video Genie computers (Tandy TRS-80 clones, with a Z80 processor) and a few terminals for remote access to the IBM mainframe of a nearby research center. We wrote and ran games on the Z80, which we couldn't have done on the mainframe, where response times for a simple command were about a minute (I couldn't try at night!). On the other hand, I used the mainframe for doing my math homework, solving linear equations in six unknowns (in APL) with the same response time of a minute, which I found impressive. The next day, I decided that with that machine, I could go beyond the scale of my math homework. 50 unknowns, no problem - still one minute. The Z80 machine couldn't have handled those for lack of memory.
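For what it's worth, Konrad's one-minute job really is a one-liner in APL: the dyadic ⌹ ("domino") operator solves A x = b directly. A rough modern equivalent, sketched in Python/NumPy with a randomly generated 50-unknown system (the original coefficients aren't given), finishes in well under a millisecond:

```python
import numpy as np

# Hypothetical stand-in for the homework problem: a random,
# well-conditioned 50x50 linear system A x = b.
rng = np.random.default_rng(seed=1983)
A = rng.standard_normal((50, 50)) + 50 * np.eye(50)
b = rng.standard_normal(50)

# The whole "one minute on a 1983 mainframe" computation.
# APL's equivalent is the single expression  b ⌹ A .
x = np.linalg.solve(A, b)

print(np.allclose(A @ x, b))   # True: the solution checks out
```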

Jack Rusher 2023-06-30 08:01:29

Echoing Konrad Hinsen, “it depends.” I went from 8-bit home machines to VAXes, and I was initially disappointed because there were no graphics on the DEC VT-100 terminals. But there was so much more memory (not to mention disk space!) available that I could compute more interesting things, even if I had to do it nearly blind. When personal computers started to get popular, they had ~640K of memory while VAXes could hold 128MB of RAM, so off-hours compute jobs continued to run much better on the bigger machines. OTOH, the Sun-2/Sun-3 workstations had 8MB/up to 32MB of RAM, plus a super nice display (for the times), which often made them feel much better in practical terms.

Dave Liepmann 2023-06-30 11:01:52

I spent my early career (circa late 2000s) deploying to various AS/400s, which are midrange machines. Up-front caveat: I used them for the most boring business-y work: querying databases, moving save files around, modifying some job queues; no computationally intensive tasks. I also worked from a modern laptop and only used the greenscreen for specific tasks. However, I spent a lot of time with people for whom an AS/400 terminal was their primary workspace.

  1. My impression of user interaction was: snappy. Power users absolutely flew from screen to screen. Navigation and most tasks were almost always instantaneous (~0.1s); blips into the ~1s range were rare enough to be notable. For the most part, job priority and time slicing just worked. The ceiling for fluid human-computer interaction in the greenscreen always seemed higher than in modern operating systems. I saw three reasons: the rich API surface of keyboards, the ubiquity & consistency of keybindings, and being blessed with the constraint of text screens. I feel we genuinely lost something here moving to mouse- and icon-oriented systems, and with the speciation of UI commands across applications. I've seen several greenscreen->GUI modernization projects where the actual user experience regressed severely, despite prettier screenshots, in terms of responsiveness and access.
  2. My memory is vague when it comes to how long program compilation took. A few seconds? (The odd architecture of the tech I worked on made this a non-issue: we would develop in an IDE on our laptops, compile part of the program locally, and then send the program object to the AS/400 to compile another part and bundle it for execution. Local compilation (seconds to minutes) and transfer time (some seconds) by far dominated this workflow — live programming it was not.)
  3. Only tangentially related: I always like to mention that these machines are shockingly reliable. I worked with many dozens of them, my company was involved with hundreds more, and I never even heard of a crash or a machine getting into a bad OS state. Absolutely rock solid.
Paul Tarvydas 2023-06-30 23:27:31

I crashed all of Toronto IBM with a simple APL program (graphics - bar charts using RGB primary colours, no less (unheard of at the time)).

I crashed all of UofT computing with an assembler homework assignment (it took a few runs before they figured out that it was my fault).

I bought/owned a UNIX[sic] Nabu system. Before Linux was born.

Going from mainframe to Vax/780 to home-built S100 was essentially a speed + memory + disk + $’s issue. I graduated from cassette tapes to 8" floppies when I could afford it on my student allowance.

Yet, something about desktop PCs was /different/ from mainframes. Desktop PCs allowed one to think in terms of /many/ computers instead of /timesharing/ a behemoth. Then came uucp, Napster, p2p, etc. I doubt that VisiCalc or HyperCard or blockchain or the internet could have been conceived of without the presence of cheap(er) home PCs.

To me, it’s not a question of brute strength but of ubiquity.

To me, computing hardware in 2023 is very, very different from computing hardware from 1950 onwards.

IMNSHO, CompSci is out to lunch. “SICP Considered Harmful”??? The Reality of 2023 hardware is vastly different from the Reality of 1950 hardware.

Kevin Greer 2023-07-01 01:02:16

"Like, if you had time on the mainframe when nobody else was around…" I never had time on the first mainframe that I used. I submitted my program on punch cards and got back a printout a week later, hopefully with the results rather than a syntax error. So I have no idea of how fast it was. I think that it was common for many mainframes to be batch oriented, so for them, the question doesn't even make sense. But when I got my 68000 Amiga computer, it had the same processor as the WiCat and SUN minicomputers I was using, but I had it all to myself. The big difference was that they had 16X as much memory and disk drives instead of floppies.