31 January 2016

Strange Bedfellow

Greg Mankiw isn't on my list of favorite pundits. If forced to assign him to a list, it would certainly be Plutocrat Panderer. But then he goes and writes this piece. What makes the piece amusing is its backhanded swipe at the puffed-up quants.
So if looking at contemporaneous economic conditions is not a reliable way to judge presidents, how should they be graded?
Similarly, a better way to judge presidents is by the policies they pursue, not the outcomes over which they preside. This task is harder than merely looking at unemployment, inflation and the growth of gross domestic product. It requires having a view about what policies are best at fostering prosperity and acknowledging that the experts are often divided on that question.

IOW, if not the proximate data, then what? Well, how well did he reward his friends and punish his enemies? Policy changes, in other words.

29 January 2016

Days of Wine and Roses

Since defining Data Science has become at least as lucrative a job as doing Data Science, I suppose I should do my part. Let's start with science, from the Wiki of course.
To be termed scientific, a method of inquiry is commonly based on empirical or measurable evidence subject to specific principles of reasoning. The Oxford English Dictionary defines the scientific method as "a method or procedure that has characterized natural science since the 17th century, consisting in systematic observation, measurement, and experiment, and the formulation, testing, and modification of hypotheses."

A bit of history. Back in the mid-60s, before I ever got involved, there were IBM and the 7 dwarves. If one designed, built, or sold such machines one was almost always an EE, preferably from a top-tier engineering school. Likewise, if one programmed such machines. Programming was mostly in each machine's assembler. Among real science majors, the hierarchy started:
1 - math
2 - physics
3 - electrical engineering
X - everything else in any order, since they don't matter to the first 3

This became a problem, since "doing computers" was the au courant field, much as the web and big data and data science and such are today. The problem was that very few could survive an EE curriculum. The demand for some degree that was "computers" was high, but the number who qualified for the then-appropriate degrees was small. So, just as sub-prime and ALT-A mortgages were invented to fulfill the demand for MBS, so was the computer science degree. Not smart enough for EE? Want to do computers? OK, you can write programs, just sign here.

The gag continues. We've seen NoSql created just because the average kiddie koder can't grok simple set theory, so toss out ACID and DRI and such and return to the thrilling days of yesteryear with COBOL and VSAM, only now it's java/PHP/C# and xml. Or whatever file type is hot.

Data science is quite the same. Your average quant wannabe can't grok stat or OR or maths, so let's create a new discipline that's lite on the tech, but heavy on the buzzwords. Thus, data science. Many of the skills attributed to data science used to be carried out by admin assistants: data cleaning, data entry, and other drudge tasks. Now, these are the critical skills of the data scientist.

The nature of science is to discover previously unknown aspects of God's world. Humans don't create scientific artifacts, we find them lying around, often hidden under millennia of ignorance. Priestley didn't invent oxygen, just found it lying around, after some experimental effort to isolate it. And so on for the rest of the periodic table. Einstein found relativity by asking a really simple question: what happens if this tram departs from the clock tower as fast as light? Nothing more than that. The resulting maths are, to be fair, a tad intimidating. And he had help with that bit.

What science, then, is there in data science? What previously unknown aspect of God's world has been found by the efforts of data science? None that have crossed my path. Nor will there be.

Old wine in new bottles. Or, as the child observed, he has no clothes.

26 January 2016

"A Man's Got to Know His Limitations" [update]

Today we muse on Apple and Data Science. Apple's quarterly is due after market close today, and more and more tutorials/explanations/paeans to Data Science appear. Let's see what they may have in common.

Apple has been on a growth spurt ever since the iPhone came into being. Those with short memories may have forgotten that it was a sick puppy at birth. Teeny display. Barely 3G capable. Exclusive to the lousiest carrier in the country. But the Apple crowd bought into the reality distortion field meme that the iPhone was somehow different and better. "They" concluded that Apple had made the first smartphone. Not hardly. What Apple had was a lock, for a time, on cap touchscreens. Kind of hard for anyone else to make a phone with one. Skip happily forward to today, and the iPhone accounts for about 2/3 of company revenue and an even larger share of net income (including spend on that "ecosystem").

The nature of today's quarterly has been written about with tonnes of virtual ink. Mostly, "channel checking" says that iPhone sales, at best, are flat for the current quarter and next; at worst, both fall a bit. The compromise is that the current quarter meets expectations, but guidance for next is down.

But, isn't this just the strong suit of Data Science? Accumulating unstructured data, analysing with arcane Bayesian functions, and confirming the priors? What could be easier? Wait... Cook has, in the past, tut-tutted the analysts for extrapolating from their channel checking to conclusions about Apple. Could he be right again, this time?
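That "confirming the priors" quip can be made concrete with a toy Bayes update. All the numbers below are hypothetical, invented for illustration (nothing from Apple's actual filings): when the prior is strong and the channel-check evidence is weak, the posterior barely moves.

```python
# Toy Bayesian update: a strong prior meets weak channel-check evidence.
# All numbers are hypothetical, for illustration only.

prior = {"flat": 0.9, "down": 0.1}          # analyst's prior beliefs
likelihood = {"flat": 0.4, "down": 0.6}     # P(observed channel data | hypothesis)

# Bayes' theorem: posterior is proportional to prior * likelihood
unnorm = {h: prior[h] * likelihood[h] for h in prior}
total = sum(unnorm.values())
posterior = {h: unnorm[h] / total for h in unnorm}

print(posterior)  # the weak evidence barely dents the 0.9 prior
```

Even though the data favored "down" by 6-to-4, the posterior on "flat" is still about 0.86. Garbage priors in, priors out.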

The stock market is an insiders' game first, last, and always. Corporations are under no obligation to report material issues other than in filings, and, on the whole, each company decides the meaning of "material". Prominently displaying pretty financial results, aka non-GAAP, while burying the legal numbers in footnotes is a common tactic. Shareholders in biopharma companies get blindsided every morning. Nearly all meaningful financial data is in quarterly/annual filings. Every blue moon there might be a separate filing on a material issue, but often only good news and mostly not direct financial information. Something like "we told our suppliers to reduce deliveries by 30% this and next quarter"? Doesn't happen. Management, or anyone else for that matter (just ask Martha Stewart), isn't supposed to go out and trade company instruments knowing such non-public information. Since Apple is a significant component in mutual funds, hedge funds, indexes, and ETFs, one might wonder whether Tim et al. are allowed to trade those instruments? Clearly, the NASDAQ has been down because Apple has been down; some people think so, anyway. Research it if you're interested.

If Data Science were all such magic that real stat can't do, wouldn't we already know what Apple will say a bit after 4:00 Eastern?

Well, the verdict is in: the channel check looks mostly right. Moreover, many have pooh-poohed Beijing's renminbi rumba as no big deal. Not so much, it turns out. Not so many rich Chinese, either.

21 January 2016

Fools on Errands

Wanting one's cake and eating it too is a common syndrome in humans. In corporations, it's epidemic. With all the chatter about the US's healthcare expenditures being the highest, while general population health is nowhere near that, high amongst that chatter is that "it cost $X billion to get a new drug to market!!" I'm among those who blame drug companies for continuing to throw good money after bad. After all, if they didn't, the CxO class wouldn't be able to keep their million-dollar sinecures.

Today Alkermes reports its latest trial bit the big weenie. Or, as Adam Feuerstein put it:
Alkermes argued Thursday that hints of efficacy in the failed clinical trials means ALKS 5461 isn't dead yet. A third phase III study in depression is still underway.
"We are steadfast in our commitment to developing new medicines for serious CNS conditions where there is a clear and compelling need for new treatment options for patients and their families," said Richard Pops, Alkermes CEO, in a statement.

IOW, "I'll keep beating this dead horse until my second house in the Hamptons is finished". Oh, and the company is really in Waltham, MA, but inverted to Ireland in 2011. So, you the taxpayer get to pay for that house even more directly. Failure couldn't happen to a better bunch of knuckleheads.

20 January 2016

Retro Progress

There's a new book out that I guess I have to read, and Eduardo Porter has a bit of a review, and reporting, on it. I'm curious to see whether Gordon ties his thesis to science and engineering reaching an asymptote, or just crunches later data against previous. Porter's commentary suggests the latter.
Americans like to think they live in an era of rapid and unprecedented change, but this kind of comparison -- pitting the momentous changes of the mid-20th century against the seemingly more modest progress of our present era -- raises a critical question about the nation's future prosperity.

Of course, Gentle Ben gives sort of backhanded support to Gordon's thesis:
Ben S. Bernanke, the former chairman of the Federal Reserve who is now at the Brookings Institution, points out that long-term interest rates have been declining for a very long time. This is in response partly to the accumulation of savings in China and other developing economies, which have been buying Treasury bonds hand over fist. But it also suggests that investors, whether they realize it or not, may agree with Professor Gordon's proposition.

"People who invest money in the markets are saying the rate of return on capital investments is lower than it was 15 or 30 years ago," Mr. Bernanke said. "Gordon's forecast is not without some market reality."

My take on the surge in progress through the mid-20th century continues to rest on two points:
1) our collective scientific and engineering knowledge of the real world became nearly complete; today's science is either sub-atomic or cosmological and is highly unlikely to give us new products
2) the 50s and (less so) 60s were led by public and private leaders still imbued with a wartime socialism, creating a large middle class, and thus a large TAM for product

Someone observed that software was going to eat the world. Maybe so, but that works only so long as the exchange rate betwixt coding and farming/manufacturing remains asymmetric. That will only last so long. I guess many haven't noticed that 2nd and 3rd world coders have eviscerated much of the coding middle class.

There's no question that productivity has burgeoned over the last couple of decades, and as automation continues to displace labor in the production of physical widgets people want to buy, the reckoning will come: as the upper X% become the TAM for widgets, prices have to rise, since the capex to make those widgets can't be laid off the way humans can.

Two new data points from today's news.

- Total CPI dipped to -0.1% (Briefing.com consensus 0.0%) in December
- "There has been concern that consumers are less enthusiastic about the feature and performance gains with the latest iPhone 6s/6s+ models," UBS continued. "Indeed, more consumers appear to be opting for last year's iPhone 6 models that are priced $100 lower." here

The (1 - X%) aren't spending much, and when they do, "good enough" rules. Of course, The Giant Pool of Money continues to demand ever higher returns on government instruments. IOW, a transfer of wealth from taxpayers (only the little people pay taxes) to the X%.

18 January 2016

A Bit of a Problem

If you like what OPEC means for oil prices, you'd love what the gold standard would do to financial markets.
-- Michael Feroli/2011

Well, Bitcoin is back in the news, and not, it seems, in a pleasant way. (There was a time, perhaps still is, when bitcoin was called the new gold.)

For those who've read history, 19th century America was a 100 years' war, of sorts. Those with wealth waged war on those without it. Deflation/depression/panics were the order of the day. The country was in one most of the time. Those who had specie liked that, since they got a "return" on their positions without any investment risk. What could be better? And some wonder why I've concluded that the same motive and incentive™ exists with today's cash holders: corporations, hedgies, and 1% individuals. The Giant Pool of Money is still out there, and growing. Really new innovation in the consumer arena (the only one that matters, supply side ideation notwithstanding) is stalling out. Not least because battery chemistry has run up against the limits of the periodic table. A really Godzilla depression is the holders' best chance of risk-free return. American housing didn't work out so well.

So, here's why bitcoin is worse than gold:
Like many of the programmers who took an early interest, Mr. Hearn admired the rule-bound nature of the system. Only 21 million Bitcoins would ever be created. And the distribution of new Bitcoins was clearly laid out, relying on mathematical algorithms that left no room for human meddling.

Two points to keep in mind:
1) with a strict limit on the amount of "specie" in circulation, deflation in prices has to result from any expansion of an economy -- this is the 19th century experience
2) bitcoin, by design, is a pyramid scheme
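Point 1 is just quantity-theory arithmetic. A back-of-the-envelope sketch, with stylized numbers and velocity assumed constant: the identity MV = PQ means that with the money stock M fixed and real output Q growing, the price level P has nowhere to go but down.

```python
# Quantity theory identity: M * V = P * Q  =>  P = M * V / Q
# Stylized numbers: fixed money stock, constant velocity, growing output.

M = 21_000_000       # fixed "specie" supply (bitcoin's hard cap, say)
V = 10.0             # velocity of money, held constant
Q = 100_000_000.0    # real output in year 0

for year in range(4):
    P = M * V / Q
    print(f"year {year}: output {Q:,.0f} -> price level {P:.3f}")
    Q *= 1.03        # economy grows 3% a year
```

Three years of modest growth and the price level has already fallen roughly 8%. That's the 19th century in four lines.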

OK, 2) isn't obvious. The fundamental meme of a pyramid scheme is two points:
1) first entrants get (nearly) all profits
2) late entrants bear (nearly) infinite cost

That's bitcoin in a nutshell. The mining of bitcoin is, by (not necessarily intentional) design, more expensive for each succeeding unit, so that the first 100 bitcoin were orders of magnitude cheaper to acquire than the next 100 will be from today.
According to my calculation, a single Bitcoin transaction uses roughly enough electricity to power 1.57 American households for a day.
The Bitcoin protocol will continue to increase the difficulty of the cryptopuzzles to keep rewards constant, continuing the arms race until the last block is mined.

That's a pyramid scheme.
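The arithmetic behind the 21 million cap and the rising per-coin cost is easy to check from the protocol's published constants: the block reward halves every 210,000 blocks, so total issuance is a geometric series converging on just under 21 million, while the coins minted per unit of mining work shrink by half each era. A sketch (ignoring satoshi-level rounding):

```python
# Bitcoin issuance: 50 BTC per block at launch, halving every 210,000 blocks.
# Geometric series: 210_000 * 50 * (1 + 1/2 + 1/4 + ...) -> 21,000,000.
# Satoshi-level rounding ignored; after 33 halvings the reward is effectively zero.

BLOCKS_PER_HALVING = 210_000
INITIAL_REWARD = 50.0

total = sum(BLOCKS_PER_HALVING * INITIAL_REWARD / 2**era for era in range(33))
print(f"total issuance: {total:,.2f} BTC")   # just under 21,000,000

# Read the other way: coins minted per block of work halve each era,
# i.e. the cost per coin roughly doubles.
for era in range(4):
    print(f"era {era}: {INITIAL_REWARD / 2**era} BTC per block of work")
```

First entrants mined 50 coins per block; today's entrants mine a fraction of that for vastly more electricity. Hence the pyramid.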

Now, the current report deals with enabling cheaper mining and network management. Needless to point out: the early entrants who got their bitcoin on the cheap don't want to lower the cost today; that's giving away the farm. And, needless to continue, the point of a specie currency is that it levels the field for all participants. And, of course, the incumbents retaliated against the liberators (bitcoin XT).

When that last bitcoin is mined, not in our lifetimes,
But since the last Bitcoin block is projected to be mined around the year 2140, adopting Bitcoin as a major (or world) currency anytime in the next few decades would just exacerbate anthropogenic climate change by needlessly increasing electricity consumption until it's too late.
life won't be so nice. Given the restriction, so far, on coinage rate and the increasing cost of mining, bitcoin may both drive deflation (should it ever gain specie status, and long before coinage exhaustion, since new bitcoin approach infinite cost well before then) and end the global warming problem by consuming all electricity. Ain't capitalism grand?

14 January 2016

Buddy, Can You Spara Digm?

It is difficult to get a man to understand something, when his salary depends upon him not understanding it.
-- Upton Sinclair

Celko's latest SQL tutorial on simple-talk sat around for a while, nearly off the front page, until the inevitable plaint about DRI not being embraced by the kiddie koder korp:
Joe, your analysis is excellent, but one thing I've seen continually is that many many installations simply refuse to use DRI, and it's enforced via standards.

Many large shops have very little control over the knowledge level of their developers, and they opt to "keep things simple" by disallowing any background processes like DRI to enforce data integrity.

These shops are content to enforce RI via code, often with spectacular failures as a result.

I'd appreciate some feedback on ways to evangelize for this to potential clients.

Common arguments against using DRI include too much background overhead, additional complexity that new developers may not see, and insufficient documentation once the DRI is in place.

Thus encouraged, I allowed as how this is not out of ignorance:
Allen Holub (my coding heeero) dealt with the issue, although not specific to RDBMS, in his "Bank of Allen" series on OOD/OOP in the mid to late 90s. here: http://www.drdobbs.com/what-is-an-object/184410076

"The only recourse is to change the ROMs in every ATM in the world (since there's no telling which one Bill will use), to use 64-bit doubles instead of 32-bit floats to hold account balances, and to 32-bit longs to hold five-digit PINs. That's an enormous maintenance problem, of course. "

Holub illustrates the problem with client code doing what should be done on the server/datastore, so it's not specific to RDBMS, but I am among those who contend that the RM/RDBMS is the first implementation of "objects": they are data and method encapsulated. The coder class of folks simply refuse to cede control/LoC to the datastore. Too much moolah to lose. Their managers are clueless, and desirous of keeping their bloated headcount, so they demur.
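The point about putting integrity in the datastore is easy to demonstrate. A minimal sketch with SQLite through Python's stdlib (hypothetical customers/orders tables, invented for illustration): one declarative foreign key plus a CHECK constraint does what pages of client-side validation code try, and routinely fail, to do.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite requires opting in to DRI

conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT NOT NULL)")
conn.execute("""
    CREATE TABLE orders (
        id INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customers(id),  -- the DRI
        amount REAL NOT NULL CHECK (amount > 0)
    )""")

conn.execute("INSERT INTO customers VALUES (1, 'Alice')")
conn.execute("INSERT INTO orders VALUES (1, 1, 9.99)")   # fine

try:
    conn.execute("INSERT INTO orders VALUES (2, 42, 5.00)")  # orphan row
except sqlite3.IntegrityError as e:
    print("engine rejected it:", e)   # no client code required, ever
```

Every client, in every language, written by every future hire, gets the same guarantee for free. That's the moolah the coder class is protecting.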

As I began this missive, I found additional comments, one from Celko himself. I feel so proud.

But the issue demands further discussion. After all, the RM is more than 45 years old (measuring from Codd's IBM restricted paper of 1969), and Oracle's first version went commercial in 1979, a decade later (depending how you measure each "event"). Why should Celko, or even such a nobody as I, need to conduct SQL/RM tutorials to the IT community? Do random physicians post here and there on the benefits of anatomy to other physicians? Of course not. Shouldn't they all have learned the RM/SQL and such in school? Shouldn't development managers have seen the advantages to smaller data footprint, stronger integrity (ACID), and such? Yet we see NoSql re-invent the VSAM/IMS paradigms of flat-files and hierarchy. And out-of-process transaction control in the manner of CICS (1969)!!

I later responded that the reason for avoiding the RM/RDBMS in practice is not technical, but behavioral. Once again, those pesky motive and incentive™ hold sway over technical benefit or productive efficiency in the soul of a new manager. Motive: keep adding more same-skill coders, who I know how to hire and boss around, or at least think I do. Incentive: the more of these widget coders I have, the more power and money I get. And that is the key to understanding why NoSql came to be. If one is careful not to rock the dinghy one is currently in, with a little luck, one can grow it to a President Class aircraft carrier (have to be US, of course). It is another attempt to co-opt power in the development realm by client-centric coders. Bureaucracy wins, since the currency of bureaucracy is power, and that power rests in the size of budget and headcount. The manager with the largest has the most power. Any manager who changes to or adopts a paradigm that needs significantly fewer developers or maintenance (no one to run around changing ROMs every few weeks, for instance) is doomed.

The quality of the product is irrelevant, so long as it isn't so lousy as to cause losing lawsuits. "Good enough" means that the bureaucracy follows the path of least resistance, which in turn means that development follows from the established skill set. Academic, and self-directed, IT is language-centric. One takes courses in many languages and algorithms. One does not take many courses in understanding and implementing the RM; it's assumed to be "obvious" and thus of no interest. The plain fact that SQL and existing engines do only part of the job set out by Codd for the RM is taken to mean that the RM is flawed, rather than the true state of affairs: both the language and the engines were built to be "good enough" based on 1980s tech. Bah.

In the beginning, Codd didn't specify implementation, and purposely so. Random access storage was expensive, relative to tape, and batch processing was still the norm. While not quite as ancient as Celko, the first system I was involved in building was an MIS (remember that acronym?) for a state program running on an outsourced (and clapped out) 360/30 with a couple of 2311 disk drives and a COBOL compiler. DB2 didn't yet exist, not even Oracle version 2. The coders' heads were cemented in the FOR loop paradigm that persists to this day. And, of course, the FOR loop works best if the data is processed in some single order, RBAR. Thus, simple sorted files on things like SSN or Order Number or whatever were/are the norm. The 2311, by the way, was used as if it were holding sequential files from a tape drive, of course. Three tape sort/merge done on disk. The advantage: the disk was faster at sequential scans and only one "job" was run at a time. Batch multi-programming came later.

There's the notion that the RM exists to support multi-user systems, primarily some sort of terminal interaction. But such were, at best, rare in 1969. TSO didn't get to IBM mainframes until 1971, and then you had to run MVT. Codd devised the RM in the context of IMS batch processing on OS/360, which, if memory serves, didn't support file sharing across partitions (with a narrower definition than used today), which means applications. In other words, the benefits of the RM, at that time, were construed within each application silo. It wasn't until much later that having ACID meant that one could have fewer (one?) datastores supporting multiple applications from one engine.

With desktop sized machines having 10s of gigabytes of memory, 100s of gigabytes of SSD storage and a hefty Xeon cpu for under $10K, why hasn't the canard that "joins are too sloooooooooow!!" (and DRI checking, and so on) been tossed into the dustbin it deserves? Simply because managers are too timid to change paradigms. With a datastore-centric paradigm, client style coders wouldn't have much more to do than write screen I/O routines, and those can be (and are) generated from the schema. The web has been characterized as a paradigm shift, by some. Yet, from a development standpoint, it is reactionary; terminal batch processing, older than terminal time-sharing, by a lot. The web engendered yet another round of client-centric code. The early http based innterTubes didn't support persistent connections (too slow and not enough of them available from web servers), so the disconnected local/browser code attempted to do all edit/integrity processing and ship the result to the server. javascript grew like Topsy. Of course, data collisions happened all the time. Blame went to the database, of course.
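The "joins are too sloooooooooow" canard is also testable on any modern machine. A sketch, again with SQLite and hypothetical tables invented for illustration: the declarative join returns exactly the same answer as the RBAR loop, in one statement whose access path is the engine's problem, not the coder's.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL);
    INSERT INTO customers VALUES (1, 'Alice'), (2, 'Bob');
    INSERT INTO orders VALUES (1, 1, 10.0), (2, 1, 5.0), (3, 2, 7.5);
""")

# RBAR: row-by-agonizing-row, the FOR-loop paradigm
totals_rbar = {}
for cid, name in conn.execute("SELECT id, name FROM customers"):
    for (amount,) in conn.execute(
            "SELECT amount FROM orders WHERE customer_id = ?", (cid,)):
        totals_rbar[name] = totals_rbar.get(name, 0) + amount

# Set-based: one declarative statement; the optimizer picks the plan
totals_set = dict(conn.execute("""
    SELECT c.name, SUM(o.amount)
    FROM customers c JOIN orders o ON o.customer_id = c.id
    GROUP BY c.name"""))

print(totals_rbar == totals_set)  # same answer, a fraction of the code
```

On two tables the difference is cosmetic; on a hundred million rows, with the engine free to use indexes and hash joins, it isn't.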

Mentioned recently are the GE commercials where a 20-something coder defends his decision to write industrial control programs in the face of Angry Birds colleagues. Why can't the RDBMS vendors make similar? Maybe someday.

13 January 2016

Da Wave Boss, Da Wave!!! [update]

Here comes the Money Tsunami again!!!!!!
[briefing.com 13 January, 2016 1:35 pm]
Elsewhere, at the top of the hour, the Fed's $21 bln 1-year note reopening was met with robust demand, showing the highest bid-to-cover since December of 2014 and the highest indirect bid since February 2011. The auction drew a high yield of 2.09% on a bid-to-cover of 2.77.
This just in from Al Sleet, the Hippy Dippy Forecaster.
Expect a 90% chance of inversion over that Manhattan Island and the Treasury building down in DC. It's an unusual inversion, being it happens at two different places at once. But, hey, I know how that is. Half the time my brain is in Haight-Ashbury and my body is in Fresno. Be prepared for being hot and cold at the same time. Don't spend it all in one place!

08 January 2016

The Tyranny of Average Cost, part the eighth

Yet another example of The Tyranny: the assumption that auto-cars are safer.
"This technology will be disruptive to the insurance industry," Mr. Albright [principal of actuarial and insurance risk practice at KPMG] said. "There will be winners, and there will be losers. There will be fewer companies than there are today. But the question is, Who will survive?"

Of course, the balance of the reporting alludes to reduced cost and revenue, both being passed on to consumers. When was the last time an oligopoly did that? Will the insurers lay off 60% of their bureaucracies? As is continually reported, and studied by academics for decades, the "high value" FIRE jobs are least susceptible to productivity increase. Nearly pure labor cost.

And, of course to my cynical mind:
"The insurance industry is historically data-driven," [Joe Schneider of KPMG] said. "There's been an actual person behind the wheel of every car for 100 years, and all of a sudden saying the rules are going to be different going forward, that's a very difficult situation to wrap your head around."

Always remember one of Dr. McElhone's dicta: "thou shalt not predict outside the range of the data". Insurance does it all the time, of course. Weather disasters, if nothing else, are determined by today's conditions. Historical data may tell you that there really is global warming, and that coastal cities are at increasing risk, but it can't tell you when the next event will happen, nor where. Moreover, the current iteration of auto-cars does have accidents.
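The dictum is easy to illustrate with toy data and plain least squares (numbers invented for the purpose): a straight line fit to a gently curving series looks tolerable inside the observed range and falls apart the moment you step outside it.

```python
# Fit y = a + b*x by least squares to curved data observed on x = 0..10,
# then compare in-range error against an extrapolation to x = 20.

xs = list(range(11))
ys = [x * x for x in xs]          # the true process is curved

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
    sum((x - mx) ** 2 for x in xs)
a = my - b * mx                   # fitted line: y = 10x - 15

in_range_err = max(abs((a + b * x) - y) for x, y in zip(xs, ys))
extrap_err = abs((a + b * 20) - 20 * 20)

print(f"worst error inside the data: {in_range_err}")   # 15.0
print(f"error at x = 20:             {extrap_err}")     # 215.0
```

Doubling the range of x multiplied the error by more than fourteen. Now substitute "hurricane frequency" or "auto-car crash rate" for y.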

And what happens when Joe Sixpack gets pissed that his GoogleZonSoftie mobile stays in the slow lane where it's supposed to be? Or he's behind one of them running at the speed limit on a rural two-lane where no human has ever gone so slow? Pull out, floor it, and head on into a soccer mom in a van with a dozen soccer kiddies? Who's at fault, then?

Eye of the Beholder

Now, this is really cool. Two aspects of this tech impress at first blush (I've just now found and read it).
1) it relies on optical tech that's been known for centuries: total internal reflection.
In 1604, the German scientist Johannes Kepler suggested that the intensity of light from a point source varied inversely with the square of the distance from the source, that light can be propagated over infinite distance, and that the speed of propagation of light is finite. He also described the phenomenon of human vision thereby explaining long and shortsightedness. In 1610, he presented an explanation of the principles involved in the convergent and divergent lens of microscopes and telescopes. He also discovered total internal reflection, but was not able to explain it. In 1621, Willebrord Snell discovered the relationship between the angle of incidence and angle of refraction when light passes from one transparent medium to another.
2) it is implemented by those socialist, backward Europeans: Zeiss of Germany. Touché.
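The total internal reflection that Kepler observed but couldn't explain drops straight out of Snell's law, n1 sin θ1 = n2 sin θ2: when light tries to leave a denser medium, there's an incidence angle past which no refracted ray can exist. A quick check for ordinary glass (n about 1.5) into air:

```python
import math

# Snell's law: n1 * sin(theta1) = n2 * sin(theta2).
# Total internal reflection occurs when sin(theta2) would exceed 1,
# i.e. past the critical angle theta_c = asin(n2 / n1).

n_glass, n_air = 1.5, 1.0
theta_c = math.degrees(math.asin(n_air / n_glass))
print(f"critical angle, glass -> air: {theta_c:.1f} degrees")

# Optical fiber (and tech like the Zeiss design above) works by keeping
# light inside the dense medium at angles steeper than theta_c.
```

Hit the boundary at anything shallower than about 42 degrees from the normal and the glass becomes a perfect mirror. Centuries-old physics, as advertised.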

07 January 2016

The Stag Party

So, today's version of the what and why of 'secular stagnation'. It's not too bad, but like all the others I've seen, it ignores the three underlying causes/drivers of the condition:
1) stagnation, whether called panic/depression/recession, is always caused by flagging demand. Demand need not diminish; it need only slow its expansion. The CxO class does not invest in physical capital without certainty that there exists unmet demand.
2) there has been a kind of stagnation since the 1990s. The dot.bomb was followed by the housing mania, and both were driven by the same force: demand for high returns on risk-free instruments. There was a time when the CxO class brayed that Treasury interest rates were depriving them of capital. These days the CxO class is demanding Treasury interest greater than they can, or will, generate through organic growth of their businesses' hard assets.
3) moreover, in the example, the piece doesn't mention the distribution problem: as more of the national income concentrates, demand slackens simply because the few can eat only so many carrots or send so many tweets.

The answer is that $10,000 in saving gets turned into investments through financial markets. This is one of the main functions of financial markets: to find a way to turn saving into productive investments.
At which point the text should distinguish between hard assets and fiduciary instruments, but doesn't. Bad juju. Both the dot.bomb and housing mania were, mostly for the former, and entirely for the latter, just fiduciary. Despite attempts to 'impute' value to consumer durables, particularly housing when prices are skyrocketing, there isn't any marketable output from such 'investment'.

Classical economists did not believe this could happen. They argued that financial markets would always find a way to equate saving and investment.

They also didn't conceive of such things as credit default swaps. They always assumed that investment was ultimately physical. Again, physical capital and fiduciary instruments are *not* interchangeable. Especially instruments disconnected from underlying assets. Financial engineering?

06 January 2016

The Funniest Story Ever Told

Those who cannot remember the past are condemned to repeat it.
-- George Santayana

Every now and again, not nearly so much recently as in years past, I visit Artima in search of code nuggets. Most often these are horse droppings, being as how they're the fruit of client-centric coders. Today brings one of the funniest posts I've ever seen.
the switch we made on the Brackets project from Scrum to Kanban

For those who've not been regular readers, a reminder: years ago I took some workshops with W. Edwards Deming, widely regarded as the savior of Japanese manufacturing after WWII. His notion of QA amounted to: do it right the first time, and monitor production with rudimentary descriptive stat measures. It worked. The Japanese named a QA award after him. At about the same time Toyota developed a production process they called Kanban. In the 70s and 80s, when America still made some stuff, it was (along with the related JIT concept) promoted as the Magic Bullet.

It seems kiddie koders have re-invented yet another wheel. I suppose they'll lay claim to ACID once that becomes de rigueur in NoSql files. The gall.

Who Controls the Food Controls the People

Some things never seem to grow organically in the brain. Sometimes it has to be read to be understood. Among all the recent assaults on community and freedom, a Borg-fascist future, this post is truly chilling. Like most folks, I have been aware of the mechanization and corporatization of farming, Monsanto's efforts in particular, but, not being a farmer or agronomist by training, only with a cursory effort at best.
At a congressional hearing on big data in agriculture last October, the President of the Missouri Farm Bureau said that "farmers should understand what will become of the data collected from their operation", including who has access to it and for what purposes it can be used. From the farmer's perspective, they "must do everything we can to ensure producers own and control their data, can transparently ascertain what happens to the data, and have the ability to store the data in a safe and secure location." It will certainly be interesting to see how this plays out, particularly between developed and developing nations.

One aspect of change in farming over the last few decades was what has been called The Green Revolution:
The Green Revolution spread technologies that already existed, but had not been widely implemented outside industrialized nations. These technologies included modern irrigation projects, pesticides, synthetic nitrogen fertilizer and improved crop varieties developed through the conventional, science-based methods available at the time.

Somehow or another, Green has morphed into meaning sustainable, which makes sense. Unfortunately, the earlier Green Revolution is the furthest thing from sustainable. In other words, it means turning the rest of the planet's farm land into what we have now, which one wag (no cite that I can recall) called "dead dirt, used only to hold up the plants". Stripped of its nutrients, and dependent on manufactured fertilizers, largely derived from petro. Eat or drive your car.

The folly of Wall Street in driving the US (and Western economies generally) into FIRE and other non-producing activities (I blame Maslow in the final analysis, but that's another episode) leads to the ultimate question: "how many tweets for that bag of carrots?" Just as those who snipe at the Arab oil states with the pithy "you can't eat oil", so too for video game code. Speaking of which, there's been a series of GE TeeVee commercials in which a 20-something coder announces to family and friends that he'll (not a girl, alas) be programming for GE, doing machine control and such, all the while pestered by another 20-something kiddie koder showing off his latest website/game/toy. It only took 20 years for corporate America to realize the issue.

05 January 2016

A Matrix of Hessians

If you're interested in being a corporate quant, drop off your morals at the door. It's no secret that I regularly warn wannabe (retail) financial quants (who think they'll get rich from their quant skills) that they're playing against the House, which is dealing from a stacked deck. Motive and incentive drive the data, not the other way round, modulo some consumer advert testing, which is boring as all hell. It's not often that malfeasance in corporate quant work is outed as dramatically as now.

Today's news of Takata's evil is stunning, even to one as deeply cynical as I am. If the data doesn't "prove" the current corporate profit scheme is within bounds; well just fudge the data.
"Happy Manipulating!!!" a Takata airbag engineer, Bob Schubert, wrote in one email dated July 6, 2006, in a reference to results of airbag tests. In another, he wrote of changing the colors or lines in a graphic "to divert attention" from the test results and "to try to dress it up."

That last line is of particular importance, since it displays the underhandedness of some quants serving the hand that feeds them. The pie chart is widely despised in the academic quant community but, along with Excel, is widely used in business. Why? Because a narrative, either oral or written, can be developed to spin the underlying data in the spinner's desired direction, since the pie chart is ambiguous from the get-go.

It gets worser. The Takata quants went so far as to camouflage the meaning of distributions.
"I showed all the data together, which helped disguise the bimodal distribution," Mr. Schubert wrote. "Nothing wrong with that. All the data is there. Every piece," he added. But then he suggested using "thick and thin lines to try and dress it up, or changing colors to divert attention."

One might wonder what Hadley Wickham would have to say about that. Or William Cleveland. Or, God protect them, Edward Tufte.

A picture may be worth a thousand words, but I see many dead people in the picture. How many is stat sig?

As a GM engineer admitted:
a bimodal distribution showed the parts being tested were not consistent -- generally a requirement for meeting quality standards for automotive safety products.

For those, esp. the financial and micro quants, who think that data is judgment-free: guess again.

04 January 2016

Don't Go East Young Man! [update]

What some have been saying for years, "Don't go East, young man!!", is playing out today. The Asian markets, led by China, have taken a minor (one hopes) nosedive. Can another strategic devaluation from Beijing be far behind? All that anti-American off-shore profit by American corporations out to stick it to the working class going up in rubble dust? We'll see. But Beijing has done it before, and recently enough to be remembered. One hopes.
Well, that didn't take long: a couple of days.
On Thursday morning, China's central bank set the rate for the renminbi at 6.5646 to the dollar, its lowest point in almost six years.

"People are worried about whether they are using currency depreciation to stimulate growth," said Steven Sun, head of China Strategy and Hong Kong and China equity research at HSBC.

Well, D'oh!!!! Of course they are. Like every slave/indentured economy, China depends on being able to export to Blue States, worldwide. To the extent that the US kills off its Blue States, China's only option is to crater its currency to keep pace with the Blue States' diminishing buying power. Quants and pols (esp. of The Right) never want to admit that motive and incentive trump (hehe!) data whenever they conflict.