24 July 2020

Past is Prologue

When I returned (circa 1990) to things database after a brief sojourn in the land of freelance photography, my first significant role was as IT director for a local mechanical contractor. The main project was to replace two ageing minicomputer-hosted applications: one for the contracting/construction side and the other for the wholesale/distribution side. It turned out that both ran on TI-990 CPUs, one the original full-board version, the other the more recent (at the time) chip version (8086 packaged).

The search for a single package designed to handle both types of business turned up nothing, which wasn't surprising, since separate applications had been forced on them already. That left the fallback position: separate applications dedicated to each arm of the business, but written to the same platform. And by platform one means, at a minimum, the same machine, but in the better case, the same infrastructure as well. Being RDBMS fluent, that meant both applications using the same database. In the end there were two candidates: one hosted on the AS/400 (a machine with an 'integrated' version of DB2), the other on the RS/6000 using the Progress 4GL/database.

The RS/6000 was chosen. I visited the company a few years ago, and they're still running the systems. I guess it worked out OK.

At the time, the RS/6000 was IBM's foray into RISC and microprocessor/*nix systems. Their version of Unix was AIX, although Linux is now a significant part of the business. These early RS/6000 machines weren't single-chip CPUs, but multi-chip designs.

Why drag up 30-year-old history? Because Intel is repeating it. Not that Intel is the first to step away from monolithic chip construction, but it is kinda interesting. Who was it who said that history doesn't repeat itself, but it rhymes?
[N]o matter what happens, Intel will be transforming. At a minimum they will be transforming from a company that relies on monolithic dies to a company that embraces multi-chip packages,

I'd be willing to bet the motivation today is the same as 30 years ago: there was no way to reliably put so much function into a single chip and still get a reasonable yield of units that actually worked. As semiconductor engineering and manufacturing improved over the next 25 or so years, billions and billions of transistors could be etched onto a single silicon die, and nearly all the functions of a PC ended up on that one die. But, since we're now getting down to feature sizes defined by not-so-many atoms, the problem isn't likely to succumb to 'smarter engineering'. You can't fool Mother Nature.
