Another not-so-happy coincidence. It's been a while since I visited Nick Carr's blog, and on returning we get two related posts.
First, the FAA admits that pilots don't fly much anymore.
Second, programmers don't write much code anymore.
The first shouldn't be too surprising, while the second impinges more directly on this endeavor. I've recounted recollections (without a cite I can claim, alas) of studies done in, I'll guess, the early 90s showing that GUI-fied applications (Word, you loathsome bastard) deflect the user's attention from content to eye candy. And thus earn my enmity. The comment stream is nearly as interesting as Nick's post and the article it links to.
What I conclude from my experience, the post, and the comments:
1) Even coders who acknowledge that there might be a problem with automating coding don't get that code structure is a function of the code generator. While the discussion was largely directed at IDEs, the more insidious problem is with application frameworks: RoR, Struts, and the like. The IDE/framework is a crutch. The application of "higher level thinking" isn't in the use of such crutches, but in their construction. Tool maker versus tool user, once again.
2) None see the extension to The Great Recession, which is that "higher level thinking" is just a code word for "let's stop doing real brain work, and all get rich doing finance thingees". That's possibly a larger indictment of the quant squad, but it's the coders' fault, too (much of Wall Street runs on C++). One must not forget the Li issue, as told by Salmon. The last three decades have largely been about devolving from productive employment (actual rocket scientists) to leech employment (automated financial despoliation), all in the name of "higher level thinking". Other than process-size improvements, which are largely an engineering matter rather than science or discovery, not much can be said for "higher level thinking" having moved civilization forward. Are Apple's round-cornered-rectangle patents progress? Are such "inventions" what we should be showering with the maximal rewards (enforced by the legal system, of course)? One comment mentions chip layout, but skips over the relevant part: one can still find photos of chips being taped out (1970s), with real mylar and real hands; these days one uses builders that offer up ever larger blocks of circuits, which an "engineer" (for the few still doing such work) Lego-blocks (verb form) together. How those circuits actually work, and whether there's any alternative, is knowledge that's gone. Whether this Lego blocking truly constitutes "higher level thinking" is not obvious to me. There's a reason nearly everything on a platform looks the same. Apple, in particular, asserted as a marketing meme that a "seamless experience" justified uniformity across applications. As biologists well know: monoculture leads to doom. This is not to assert we should all revert to various assemblers (and there aren't nearly as many as there were in, say, 1980), but it's equally true that most OO coding (mentioned by more than one comment) isn't OO at all, just olde style FORTRAN function/data with cuter syntax (a minimal sketch of the contrast follows this list). Even the OO revolution was a sham.
3) Much of the defense of the status quo involved citing the likes of Instagram. Facebook, Twitter, Google (even), and such are toys, not tools or anything actually useful. The great minds of the younger generation worry about keeping "in touch" with friends. Somehow, I don't think Einstein had that as a priority. Nor did Crick and Watson.
4) Humans have run up against an immovable object: there's little left to discover in the real world. We've found all the land, resources, and physical laws. We're at the edge of miniaturization, down to nano-object construction. There is no warp drive or time travel. There is no new continent to exploit (modulo, possibly, Africa), at least in the previously-unknown sense. We haven't had a plague in centuries, so we're well past the heel of the population hockey stick, again. Once more, the rocket scientists of today are in finance, devising ever more opaque methods for extracting moolah for themselves from the productive economy.
5) Oddly, none see the explosion of healthcare.gov as indicative of the bloat and inefficiency engendered by such automatons, which by their nature are code-centric rather than data-centric. (There's a disputed NYT report that the system, per se, is 500 million lines of code. Not likely. There could well be that much including all the foreign systems it connects to, though.) Back in the old days, it was the WinTel monopoly: Intel needed Microsoft to make ever more greedy software to justify buying a new machine with a fancier x86 chip, while Microsoft needed ever more bounteous CPUs to run the clusterfuck it called Windows. Whether ARM or Intel or whoever will find a needy compatriot is an open question. That we can see the physical limit of node shrinking is not in question (and no, nanotech isn't the answer; once you get to the single atom, as we nearly have, the next step is inside the atom, which straps your ass to quantum mechanics, and that's nowhere you want to be). Given my predilection, the continued ascendancy of coding, as opposed to declarative referential integrity (DRI) in the RDBMS, is the highest mark of IT's regression (a small sketch of the difference follows this list). While the Kiddies have all their new languages, they've reverted to the paradigm of COBOL/VSAM development that Granddaddy used. Where COBOL was invented, intentionally, so that (non-programmer) managers could read and understand what the application did, today's IDE/framework puts the coder in that same frame of ignorance. Just jam a bunch of them code blocks together. This is intellectual regression, not progression. And they haven't a clue that they're flying blind.
6) None mentioned Knuth.
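As promised in item 2, here's a minimal, hypothetical sketch of "function/data with cuter syntax" versus code that actually owns its behaviour. The names (AccountRecord, Account, withdraw) are mine, invented for illustration; they come from none of the posts or comments.

```python
from dataclasses import dataclass


# --- procedural style dressed up in class syntax ---------------------------
@dataclass
class AccountRecord:          # just a data block; no behaviour of its own
    balance: float


def withdraw(rec: AccountRecord, amount: float) -> None:
    # every caller re-implements (or forgets) the business rule
    if amount > rec.balance:
        raise ValueError("insufficient funds")
    rec.balance -= amount


# --- the same idea with the invariant owned by the object ------------------
class Account:
    def __init__(self, balance: float) -> None:
        self._balance = balance   # reachable only through the methods below

    def withdraw(self, amount: float) -> None:
        if amount > self._balance:
            raise ValueError("insufficient funds")
        self._balance -= amount
```

The first half is FORTRAN with prettier spelling: a passive record plus free functions that any caller can misuse or bypass. The second keeps the rule where the data lives.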
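And, for item 5, a hedged little sketch of data-centric versus code-centric integrity, using nothing but Python's bundled sqlite3. The tables and columns are made up for the example; the point is that the REFERENCES and CHECK clauses are enforced by the engine for every program that ever touches the data, while the code-centric alternative is an if-test that each application must remember to repeat.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")   # SQLite enforces FKs only when asked

conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT NOT NULL)")
conn.execute("""
    CREATE TABLE orders (
        id          INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customers(id),  -- the DRI part
        amount      REAL    NOT NULL CHECK (amount > 0)
    )
""")

conn.execute("INSERT INTO customers (id, name) VALUES (1, 'Alice')")
conn.execute("INSERT INTO orders (customer_id, amount) VALUES (1, 9.95)")   # fine

try:
    # no such customer: the constraint, not application code, rejects it
    conn.execute("INSERT INTO orders (customer_id, amount) VALUES (42, 5.00)")
except sqlite3.IntegrityError as e:
    print("rejected by the database:", e)
```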
30 November 2013