02 January 2014

It's Not a Joke, It's a Pun

It's now 2014, and the (amateur) punditocracy finally shows some signs of understanding Amdahl. Here's exhibit A. It's been a bit less than a decade since Intel started shipping multi-core (as distinct from multi-chip package) chips. One would think that's plenty of time to have figured out the issue. It's not a new problem, either, what with the likes of Thinking Machines having found the market wanting when it tried to build and sell such machines as servers back in the 1980s.
However, there's a catch. While there are some "embarrassingly parallel" workloads out there, they're usually not all that common in the consumer space. Why? The truth is that many applications and algorithms aren't well suited to parallelization (i.e. being coded in a fashion that allows the work to be distributed across a huge number of chips), so at the end of the day the vast majority of consumer applications (whether it's PC games or mobile apps) benefit far more from having more performance on 1 or 2 cores than less performance per core but more cores.

Gee, ya think?

A single core with ever-increasing clocks was the key to superior performance for decades, until Intel and the rest discovered that physics made it clear The End of the Road (a great unknown novel, by the way; if you don't know Barth, you've missed out majorly) had been reached. So far as meaningful new performance, i.e. gains linear with respect to historical increases, was concerned: not so much.
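
For the arithmetic-minded, Amdahl's point is easy to put numbers on: the speedup from N cores is capped by the serial fraction of the work. A back-of-envelope sketch (the 50% parallel fraction is just an illustrative assumption, not a measured workload):

    def amdahl_speedup(parallel_fraction, cores):
        """Upper bound on speedup when only part of the work parallelizes."""
        serial = 1.0 - parallel_fraction
        return 1.0 / (serial + parallel_fraction / cores)

    # Suppose only half of a consumer workload parallelizes.
    for n in (2, 4, 8, 64):
        print(n, round(amdahl_speedup(0.5, n), 2))
    # 2 cores -> 1.33x, 4 -> 1.6x, 8 -> 1.78x, 64 -> 1.97x.
    # No core count ever gets you past 2x; the serial half rules.

Which is why piling on cores does nothing for the apps in the quote above, while a faster core helps everything.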

Which brings us to New Year's Predictions. No Cringely am I (haven't bothered to follow him, either one, in years), but what the hell. I've been goaded, so here are two.

From the Revolutionary guys, I was moved to comment:
Nope. The big thing in 2014 will be the embrace of in-database analytics, with PL/R-ish functionality in all the major databases. SAP/HANA is the prototype. Oracle is a bit behind, but does claim to have something that sounds like PL/R. DB2 is still doing SPSS from its side, alas; although they do have a Netezza (née Postgres) implementation for the Z machines.

As corps begin to realize that Big Data is a limited phenomenon, they'll also see the advantage of providing analytics in their applications, and that is most easily done from within the database engine. While it's been a bit more than two years since the Triage piece, and I despair every now and again that the deciders (in the C suites and pundit thrones) will ever get it, reality appears to be bopping dem boys upside the head. Good for reality.
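
For the curious, here's roughly what that looks like: a minimal sketch using PostgreSQL's PL/R extension from Python. The connection string, table, and column names are hypothetical; the point is that the R code (the median) executes inside the database engine, next to the data.

    import psycopg2

    # Hypothetical DSN; assumes PostgreSQL with the PL/R extension installed.
    conn = psycopg2.connect("dbname=sales user=analyst")
    cur = conn.cursor()

    # Define an R function inside the database; PL/R hands the
    # float8[] argument to R as a vector named arg1.
    cur.execute("""
        CREATE OR REPLACE FUNCTION r_median(float8[]) RETURNS float8 AS
        'median(arg1)' LANGUAGE 'plr';
    """)

    # The median is computed server-side; one number crosses the wire.
    cur.execute("SELECT r_median(array_agg(amount)) FROM orders;")
    print(cur.fetchone()[0])
    conn.commit()

The analytic lives where the data lives, which is the whole argument.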

And the second: the return of very old-fashioned host/terminal computing. One might argue that The Cloud is already a manifestation of same, but the point of the prediction is a turning away from client-centric coding (including the "managing transactions in the client" mentality) to server-centric data management, relegating the client device to display/input duties.
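
In code terms, the shift looks something like this (same hypothetical table and DSN as above): the client-centric habit drags all the rows across the wire and computes locally; the host/terminal style asks the engine for the answer and merely displays it.

    import psycopg2

    conn = psycopg2.connect("dbname=sales user=analyst")
    cur = conn.cursor()

    # Client-centric: ship every row to the client, then compute there.
    cur.execute("SELECT amount FROM orders;")
    total_client_side = sum(row[0] for row in cur.fetchall())

    # Server-centric: the engine does the work; the client only displays.
    cur.execute("SELECT sum(amount) FROM orders;")
    total_server_side = cur.fetchone()[0]

Same answer either way; the difference is where the work (and the data) stays.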

So, come back in a year (but keep those cards and letters coming), and we'll see if intelligent life in IT has poked its spiny little head above the grass.
