29 March 2013

The China Syndrome, Part 2

"I know the vibration was not normal."
-- Jack Godell/1979

Guess what else isn't normal? The Chinese are indulging in interest rate inflation the same way we did: bidding up housing with a flood of moolah. Since the collapse of the US bubble, the Chinese have been left to create their own bubble. Good for them. One might wonder whether their quants are putting heavy thumbs on their scales, too.

Remember, this mess all started with Greenspan cratering interest rates in 2001 (not in one swell foop, of course, but the Yellow Brick Road was well paved that early). A significant impetus to move funds from "risk free" instruments such as US Treasuries to housing was simply that industrial investment had become troublesome, in the sense that, at least in silicon based companies, the payback period exceeds the productive life. That had been an anecdotal assertion. Finally, I found a source that admits it:
Over the past ten years, the semiconductor industry's capital requirements have grown faster than its cash operating profits. As a result, the payback period on invested capital has nearly doubled, from about 0.7 years to about 1.4 years over that same time period (Refer to Appendix for underlying methodology). Product lifecycles in the industry have remained generally unchanged at 18-24 months, however. Soon, the industry will reach a point at which investment is no longer profitable from a cash net present value (NPV) perspective: the payback period on invested capital will exceed the timeframe over which the underlying assets are productive.

In other words, it becomes impossible to actually earn a return on real investment. Oops. Absent profitable real investment, fiduciary investment has the world as its oyster. The problem, of course, is that, ultimately, fiduciary returns are only possible if the underlying payers are themselves beneficially accommodated. This lack of accommodation is why the US housing bubble burst: median incomes weren't rising to pay the extra vig. Oops. The Chinese will find that their moolah is just as wasted on their condos as ours was on Florida McMansions. Couldn't happen to a nicer bunch of folks.
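The arithmetic in that quote is worth making concrete. Here's a minimal sketch, with hypothetical numbers (mine, not the source's), of why a payback period longer than an asset's productive life kills the cash NPV:

```python
# Illustrative only: hypothetical investment figures, not the report's.
def payback_period(investment, annual_cash_flow):
    """Years of cash flow needed to recoup the investment."""
    return investment / annual_cash_flow

def npv(investment, annual_cash_flow, productive_life_years, discount_rate):
    """Net present value of cash flows earned over the asset's productive life."""
    pv = sum(annual_cash_flow / (1 + discount_rate) ** t
             for t in range(1, productive_life_years + 1))
    return pv - investment

# A line whose payback (1.4 years) fits inside a two-year product life:
# NPV is positive. Stretch the payback past the productive life, and it isn't.
print(payback_period(140, 100))          # 1.4 years
print(round(npv(140, 100, 2, 0.10), 1))  # positive: life (2 yr) exceeds payback
print(round(npv(240, 100, 2, 0.10), 1))  # negative: payback (2.4 yr) exceeds life
```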

I'll mention, again, the NPR piece, "The Giant Pool of Money". The best, and rather early, popular explanation.

27 March 2013

Quo Vadis Data Science?

Insanity: doing the same thing over and over, yet expecting a different result.

"Sherman, set the WABAC for 1964".

The mid-60s to the end of the decade were a watershed period in the computer world. 1964 saw the announcement of System/360 from IBM. The 360 was IBM's attempt to merge the business class machines with the scientific class machines (each class being built on a disparate architecture), producing one machine architecture that covered the entire circle of need. Turns out, on that measure the 3x0 machines have been an abject failure. CDC, thence Cray, took the science side of things; only the 370/158 had much credibility in the lab. Since then, highly parallel multi-processor RISC machines have ruled, some even from IBM; none on the 360 ISA. As commercial machines, they've decimated the Seven Dwarfs. They began the road to coding as king, installing COBOL on the throne; where it still sits.

Then came DEC and the minicomputer. For anyone interested in computing, Tracy Kidder's "The Soul of A New Machine" should be read. While about a DG machine designed in the late 70s, the book describes how a new machine was built from discrete parts. It took real engineering to do this.

Finally, we get Dr. Codd in 1969 (publicly, 1970), who presented a math based approach to data. Whether or not his paradigm has yet been assimilated remains an open question. Most, even within the industry, conflate SQL with the RM. On the whole, one hears "we must denormalize for speed" more than any other assertion.

Real engineering and math were a problem for all those young men who wanted to "do computers". One first needed a BSEE. In the hierarchy of curricula, the most difficult (win/place/show) were/are: math, physics, electrical engineering. More folks wanted in than could cut the brain mustard. What to do, what to do?

What was done: invent a new curriculum, lighter on the maths, but still, "do computer". We did what the USofA turns out to do best: dumb something down. Thus was born Computer Science. The first such department, depending on how one defines CS, is likely Carnegie Institute of Technology (today, Carnegie Mellon University). Easy on the maths, heavy on the coding. Here's a proposal from 1994 which intended to drop COBOL from the curriculum. Was COBOL ever part of CS, you ask? Well, yes, yes it was, as late as 1982. The link should take you to page 96. Look at the faculty advert second down from the large IMS/Upjohn ad in the top left of the page. Yes, COBOL was the basis of CS. Anyone who's had the experience of Enterprise Java has seen loads of COBOL, in that other syntax. Funny, the way a baby chewing razor blades is funny.

So now we have much the same thing going on. Those who can't handle the rigors of operations research and statistics and mathematical analysis need a safe harbor, where they can claim to "do data". The wiki article says, "Data science requires a versatile skill-set. Many practicing data scientist commonly specialize in specific domains such as marketing, medical, security, fraud and finance fields." "Swell", to quote Dirty Harry. Marketing and finance, professional shills and thieves? More to the point: these are endeavors defined by rules made up by humans, and subject to change by human fiat. It's all soft serve, no hard science. I guess there's a bright future for data science. Just don't ask too many questions. There's a reason the financial quants brought the world's economy to its knees: it made them scads of money if they could pull it off, so they did. The "laws" of commerce were flouted or changed to suit the situation, thus destroying the rigor of the analysis.

From "Introduction to Data Science", by Jeffrey Stanton:
Data Science refers to an emerging area of work concerned with the collection, preparation, analysis, visualization, management, and preservation of large collections of information. Although the name Data Science seems to connect most strongly with areas such as databases and computer science, many different kinds of skills - including non-mathematical skills - are needed.

IOW, "we don't need no education". That could have been pulled from a 1970 job advert for an EDP position (ADP, if it were for the US government); it's just what COBOL applications did/do.

For a particularly trenchant take on the situation:
So why many scientists find data-driven research and large data exciting? It has nothing to do with science. The desire to have datasets as large as possible and to create giant data mines is driven by our instinctive craving for plenty (richness), and by boyish tendency to have a "bigger" toy (car, gun, house, pirate ship, database) than anyone else. And whoever guards the vaults of data holds power over all of the other scientists who crave the data.
The comments get a bit cranky with him.

The Big Data meme is increasingly melded with the Data Science meme. The blind leading the blind. Gad.

While this missive was in progress, the Sunday NY Times offered a darkly pessimistic take on Big Data.
[David C. Vladeck, a professor of law at Georgetown University] offers this example: Imagine spending a few hours looking online for information on deep fat fryers. You could be looking for a gift for a friend or researching a report for cooking school. But to a data miner, tracking your click stream, this hunt could be read as a telltale signal of an unhealthy habit -- a data-based prediction that could make its way to a health insurer or potential employer.

In other words, Big Data is valuable only when it's misused: stealing from the ignorantly passive users by finding a valuable needle in the haystack of that data tsunami. There is a Big Brother, but it ain't the Damn Gummint; it's rapacious capitalists. Go looking for a fat fryer, and your health premium doubles. In order to pay for all this Big Data deep mining, the miners have to find ways to monetize what little flecks of gold they find. Since insurance is either required (automobiles; Progressive has a nefarious program ongoing) or necessary (health), it is fertile ground to plow.

Dumbing down, again. It's the 1-2-3/Excel-ization of data, yet again. Anybody, with an hour or two of training, will be able to do deep, detailed analysis. Yeah, right. The inappropriate quants (bailed out math/physics grad students) gave us The Great Recession (abetted, one should acknowledge, by a certain laxity of purpose in government). The US Senate report is especially chilling. One has to wonder what Data Scientists will concoct.

Mike Holmes has a running promo for his new, here in the US, episodes of "Holmes Inspection". In the promo he's sitting in a work area (which looks CGI, and thus fake), and talking about his father's advice. "If you can't do it right, don't touch it. Get the hell out and do something else." The Data Scientists may end up a use case.

20 March 2013

No SQL Left Behind

One of the blessed ironies of the Bush II reign was "No Child Left Behind". This was clearly an attempt to punish and de-fund inner city (read: minority) schools. But the purpose, if actually carried out, would have been detrimental to the Right Wing agenda. Enter, the true zealots who condemn the teaching of critical thinking in schools (and here, very juicy).

So, what's all that got to do with SQL? Well, it's the motivation. The NCLB folks don't want the kiddies to actually think, but to repeat fables (i.e., The Bible and Shit Dad Says) by rote, and obediently. SQL, though not so much as the RM (but you've read that story here a few times), is a math based approach to data and its control. NoSql, on the other hand, bows in devotion to the "skills" of kiddie koders who are sure they can type their way out of any corner.

This all comes to mind, again, because even some SQL folks, who should know better, are talking about "integrating" NoSql into RDBMS: Oracle, DB2, SQL Server (not MS), PostgreSQL.

I guess we'll be genuflecting to Chamberlin any day now.

19 March 2013

Stop Having So Many Kids

Among my favorite authors is Paul Theroux, but only his travel stuff; not so much his fiction. C-SPAN2/BookTV covers many, if not all, book festivals, generally live (thence repeated). Within the last year or so, he appeared at one such festival, blurbing a book, I suppose; I don't recall, because the most important thing he had to say, in response to a question from the audience, was: "stop having so many kids!". It might even have been, "damn many".

It is an article of faith, and data, among economists that as a group/society/country moves up the development pyramid, breeding recedes. The typical explanation, based on US experience at least, is that as farming transitioned from subsistence farming to commercial family plots to industrial farming on mega-acreage, fewer kids were needed to provide both child and, later, adult labor. One might also note that the migration from farm to city reduced the *need* for kids; modulo welfare gaming, if you believe in the existence of such. I don't, but that's another episode.

An R builder, Markus Gesmann, has been working on googleVis for some time now, and has just released another version. googleVis is an R package that enables motion charts, via Google Chart Tools, from R in the browser. A link to googleVis proper is in the posting.

If you toddle off to his blog, you'll see his demo. As it happens, the data he uses is fertility and life expectancy (the latter a common surrogate for development). QED.

17 March 2013

How Many Shoes Does a Centipede Have? [update]

Well, yes, the centipede drops another shoe. Today's Times has an evaluation of the Levin committee's report on JP Morgan's whale watch. Here's the report. One wonders whether Google is in league with The Banksters, since the report doesn't appear in the results. I had to follow many links to get to it. (The Times piece does have a direct link, though.) For future reference, this is the URL for that subcommittee.

The report runs over 300 pages, so I'll (for now; there may be updates to the missive as I work through) drop some cute quotes from Morgenson's piece.
BE afraid.
That's the takeaway for both investors and taxpayers in the 307-page Senate report detailing last year's $6.2 billion trading fiasco at JPMorgan Chase. The financial system, thanks to dissembling traders and bumbling regulators, is at greater risk than you know.

What we care about here is to what extent the quants were either co-opted or active participants in the shenanigans. Defenders will assert, as expected, that the quants were either just following orders or were kept in the dark.
... risk limits created by the bank to protect itself were exceeded routinely ...

Morgenson goes on to re-state the obvious:
Remember that this is a report examining JPMorgan Chase, the bank that enjoys the best reputation among its peers. One can only wonder: if JPMorgan Chase traders think nothing of misrepresenting the value of their trades to minimize losses, what are the financial world's lesser players up to?

Dimon was particularly trenchant in defending himself and his bank during The Great Crash. From some years ago:
Instead of simply trusting his traders, Dimon put himself through a tutorial, so that he would understand the complex trades the bank was exposed to. And rather than run its mortgage machine at full throttle for as long as possible, Dimon reined in lending earlier than did others and warned his shareholders of looming trouble.

Maybe he didn't, really? Could be. He certainly didn't do an Ahab imitation this time.

So, we have this later from the report:
Hoping to understand JPMorgan's practice of relaxing its valuation method on the troubled investment portfolio, Mr. Levin asked of Mr. Braunstein: "Is it common for JPMorgan to change its pricing practices when losses start to pile up in order to minimize the losses?"

After a bit of back and forth, Mr. Braunstein said: "No, that is not acceptable practice."

Here's where it gets a bit mathy. Risk assessment in the bankster crowd relies on gauging tomorrow's chance of collapse based on historic fluctuations in whatever sector/industry/company/etc. is under review. That's a perfectly acceptable stat methodology when the object of investigation is some natural process. With natural processes, barring events such as asteroids taking out most of the Yucatan, there's so much going on in the environment around the process we're interested in that the causes of fluctuations in the process under review have all been blended into the historic data. Think: Brownian motion (financial quants have been known to invoke BM in their work; if only they had a brain). Human driven processes, where events are the result of specific human decisions, just don't work that way.

Since they say they're doing quant, rather than policy, they look for a number to quantify risk. Turns out, they use a measure of variability in the historic data. It's an attempt.
Normal practice at the bank and across the industry is to value these kinds of derivatives at the midpoint between the bid and offer prices available in the market. But in early 2012, as it became apparent that JPMorgan's big trades at the chief investment office were going bad, the bank began valuing the portfolio well outside the midpoint. This reduced its losses.

In other words, just as a butcher who doesn't know you (or does, and still hates you) puts his thumb on his side of the scale, Morgan leaned full force on the scale. The losses weren't really reduced, only the reporting said so.
In March, however, all 18 deviated, and 16 were at the outer bounds of price ranges. In every case, the prices used by the bank understated its losses.
...
Then, in April 2012 alone, risk limits were exceeded 160 times.

That's one angry butcher. Sweeney Todd on meth.
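The butcher's thumb is just arithmetic. A toy sketch, with hypothetical numbers (not the report's actual positions), of marking at the midpoint versus the favorable edge of the bid-offer spread:

```python
# Hypothetical prices and notional for illustration; not the report's figures.
def reported_loss(cost_basis, mark, notional):
    """Loss reported on a position valued at 'mark' (price per unit of notional)."""
    return (cost_basis - mark) * notional

bid, offer = 94.0, 96.0
midpoint = (bid + offer) / 2          # industry practice: mark at 95.0
cost_basis, notional = 100.0, 1_000_000

loss_at_mid  = reported_loss(cost_basis, midpoint, notional)
loss_at_edge = reported_loss(cost_basis, offer, notional)

# Marking at the favorable edge "reduces" the reported loss by a million
# dollars. The economic loss is unchanged; only the reporting said so.
print(loss_at_mid - loss_at_edge)     # 1000000.0
```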

Levin goes in for the kill. Morgan had done its own investigation:
"You just told us that shifting pricing practices to minimize losses is not acceptable," Mr. Levin said. "Did you say that in your report? Did you say that's what happened?"

"I don't believe we called that out in the report," Mr. Cavanagh answered.

Here's why all this matters.
JPMorgan, don't forget, is the largest derivatives dealer in the world. Trillions of dollars in such instruments sit on its and other big banks' balance sheets. The ease with which the bank hid losses and fiddled with valuations should be a major concern to investors.

Mama, don't let your boys grow up to be quants.

[update]
From page 94 of the report:
Step by step, the bank's high paid credit derivative experts built a derivatives portfolio that encompassed hundreds of billions of dollars in notional holdings and generated billions of dollars in losses that no one predicted or could stop. Far from reducing or hedging the bank's risk, the CIO's Synthetic Credit Portfolio functioned instead as a high risk proprietary trading operation that had no place at a federally insured bank.

From page 155 of the report:
The CIO's lead quantitative analyst also pressed the bank's quantitative analysts to help the CIO set up a system to categorize the SCP's trades for risk measurement purposes in a way designed to produce the "optimal" -- meaning lowest -- Risk Weighted Asset total. The CIO analyst who pressed for that system was cautioned against writing about it in emails, but received sustained analytical support in his attempt to construct the system and artificially lower the SCP's risk profile.

From page 168 of the report (quoting Morgan documentation):
"The Firm calculates VaR to estimate possible economic outcomes for its current positions using historical simulation, which measures risk across instruments and portfolios in a consistent, comparable way. The simulation is based on data for the previous 12 months. This approach assumes that historical changes in market values are representative of the distribution of potential outcomes in the immediate future. VaR is calculated using a one day time horizon and an expected tail-loss methodology, and approximates a 95% confidence level. .. "

In other words: "we expect today to be just like yesterday".
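For the curious, the quoted methodology amounts to something like this sketch: one-day historical-simulation VaR and an expected tail loss, at 95% confidence, over a trailing year. The P&L data here are synthetic; the point is that the model sees nothing but the past twelve months.

```python
import random

def historical_var(daily_pnl, confidence=0.95):
    """One-day VaR by historical simulation: the loss at the (1 - confidence)
    quantile of the trailing daily P&L. Assumes, as the bank's document says,
    that historical changes are representative of the immediate future."""
    losses = sorted(daily_pnl)                    # worst outcomes first
    cutoff = int(len(losses) * (1 - confidence))
    return -losses[cutoff]

def expected_tail_loss(daily_pnl, confidence=0.95):
    """Average loss beyond the VaR cutoff (a.k.a. expected shortfall)."""
    losses = sorted(daily_pnl)
    cutoff = int(len(losses) * (1 - confidence))
    tail = losses[:max(cutoff, 1)]
    return -sum(tail) / len(tail)

# 252 trading days of synthetic P&L -- which is the whole problem:
# the model only ever sees the past year.
random.seed(42)
pnl = [random.gauss(0, 1_000_000) for _ in range(252)]
print(round(historical_var(pnl)))
print(round(expected_tail_loss(pnl)))
```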

Here's where it gets interesting (and sounds familiar?). The Whale Pod decided that the risk model used by Morgan was too pessimistic, so they'd do their own, but
Although Mr. Stephan [an experienced quant having built VaR models for Morgan] remained employed by the CIO in a risk management capacity, he was not the primary developer of the new VaR model; instead, that task was assigned to Patrick Hagan, the CIO's senior quantitative analyst who worked with the CIO traders. Mr. Hagan had never previously designed a VaR model.

So, sing the bridge again:
Mama, don't let your boys grow up to be quants.

16 March 2013

Chris Farley

When was the last time you watched "Saturday Night Live" end-to-end? For me, a couple of decades. I time it for "Weekend Update", and that's it. In fact, I missed the entirety of Farley's tenure. It wasn't until he died, when news feeds showed clips of his bits, that I found out about Matt Foley. I gather the character was intended to be a sarcastic take on motivation, in general.

ESPN had pretty much non-stop programming on NFL free agency Thursday, and I caught Herm Edwards saying (from memory): "Just because you pay a guy millions more dollars, doesn't mean he's going to run faster". Of course, knuckleheads don't get that. And there are a lot of knuckleheads running sports teams. Just as there are running most business.

Whatever one uses to measure accomplishment becomes the accomplishment. Mammon tends to evict all else. Artists, the real ones, naturally don't view the world quite that way. And building database applications is as much about art (although it's mostly science), aka creative activity, as anything. Sales, especially life insurance, is an activity which is purely about generating cash. There is no real product, just some promise to pay if some bad event befalls; and that only if the insurance company can't manage to weasel out of paying by invoking some very fine print. After all, the profit comes from *not* delivering the goods; so to speak.

Whether coincidence or purposeful, the NY Times has an article today about Google's largess, and contrasts this specifically with Yahoo!'s Marissa's fiat.
That said, [Teresa Amabile, a business administration professor at Harvard Business School and co-author of "The Progress Principle"] added, "...none of this matters unless people feel they have meaningful work and are making progress at it. In over 30 years of research, I've found that people do their most creative work when they're motivated by the work itself."

I don't know whether it's still true, but one of the hallmarks of MicroSoft's allure was a private office for everyone. Google is into the 20-something "social" meme, what with all the open concept and games and such. I've never been able to concentrate all that well in the middle of the Grand Central concourse at rush hour. Say what you will about MicroSoft, and I've never had much nice to say, but they've produced a far larger number of useful software applications than Google, which remains a glorified advert agency.

12 March 2013

Get Some Giddy Up in Your Gallop

Among the earliest sites to promote SSD adoption was AnandTech. So, today he/they are reporting on the effect of moving to SSD as the primary data store. From what I recall of previous discussions on the site, they've not done much to normalize the databases, so this is just a measure of raw horsepower. Impressive, nonetheless.
The final benchmark however does give us some indication of what a more random enterprise workload can enjoy with a move to SSDs from a hard drive array. Here the performance of our new array is nearly 20x the old HDD array.

09 March 2013

Me Talk Gooder [update]

Professor Henry Higgins, through many trials and tribulations, eventually taught Eliza Doolittle how to speak the King's English. Through sheer luck, I've found a professor willing to take on (to what full extent, I don't yet know) the NoSql crowd. And his second installment.
While I'll be the first to admit that I use colorful language and metaphors, my perspectives typically come straight out of Distributed Systems 101. Half the reason why I started this blog is that there is a big discrepancy between what people who have studied distributed systems know, and what is being said in the marketing materials of various vendors.

Boy howdy. The NoSql folks just want to talk their own Pig Latin dialect.

[update]
Oh well. Upon further investigation, the professor asserts that he (and minions) have made a better NoSql datastore.

[update, 1]
Have none of these folks read, at least, Gray & Reuter (really, Weikum & Vossen)? Too mathy, I suppose. The Dark Ages, redux.

08 March 2013

Good News, Bad News

It's the nature of (semi-)democratic politics for those parties out of power to complain that Good News isn't really (for some conspiracy that *they* would never engage in) so Good, and that Bad News (ditto) is really worse. And so it is today. The monthly jobs report was released earlier today. If not already, certainly by the end of the day, the Republicans/Libertarians/Free Market Capitalists will be crying foul.

Let's have a look at some numbers.

The headline grabber is that the "official" unemployment rate for February was 7.7%, which was last the case just before Obama was inaugurated: December, 2008. Non-farm payrolls were up more than predicted: 165-170K predicted, 236K reported. So, all is good, yes? Well, maybe not.

As stated a few times here: these numbers come from sample surveys, not censuses, and the "official" unemployment rate has been, and continues to be, criticized as too positive. The "official" number is called U-3, and it is down .6 point YoY. U-6, the most encompassing measure, is down from 15.0 to 14.3, .7 point, but a much lower percentage improvement. The ratio of employed to population (an arguably more accurate measure) is unchanged at 58.6%. So, how did the rate go down? The measured labor force decreased by about 2,000,000. Ain't algebra fun?
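The algebra, for the doubters: the rate is the unemployed over the labor force, so remove the same headcount of dropouts from both and the rate falls with not one job created. Illustrative figures, roughly scaled to the period (in millions):

```python
def u3_rate(unemployed, labor_force):
    """The 'official' rate: unemployed as a percentage of the labor force."""
    return 100 * unemployed / labor_force

# Roughly scaled, illustrative figures (millions), not the BLS releases:
labor_force, unemployed = 155.5, 12.3
print(round(u3_rate(unemployed, labor_force), 1))   # ~7.9

# Two million people stop looking and drop out of the measured labor force.
# No one found a job, yet the "official" rate falls:
dropouts = 2.0
print(round(u3_rate(unemployed - dropouts, labor_force - dropouts), 1))  # ~6.7
```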

You can peruse Table B-1 for clues to where the jobs are growing (hint: it ain't those who live in 90210 land). The right wingers should be happy. Hated Gummint employment is down, again.

Is the situation any better? The global numbers say no. Wall Street has been hogging the trough of QE money, with little effect on Main Street. Those with a Left Wing agenda, and those who've seen the movie before, said from the outset that Obama's stint in the White House wouldn't help much if the only lever used was monetary policy. Monetarism, when it isn't used to actively punish Main Street, ends up punishing Main Street anyway, since it is a matter of pushing on a string. That can't work. As the Prosperity Through Austerity folks in Europe are demonstrating, efforts to force the victims of the Great Recession to suffer even larger losses only lead to greater concentration of wealth, and thus power.

The numbers tell the story. And the story is that those out of power have managed, in part because those in power are feckless, to increase the pressure on the necks of the 99%. Makes one wonder whether quants have devolved to being mouthpieces, just as economists did some decades ago.

05 March 2013

Hickey & Boggs

Some years after the end of "I Spy", Culp and Cosby made a noir film, called "Hickey & Boggs". I remember seeing it, but nothing about it beyond the noir feel of the film, particularly the ending. WikiPedia has more info. Which brings us, mnemonically, to the Higgs Boson. While not quite so noir, also mysterious. If you're not addicted to right wing anti-intellectualism, head to today's NY Times Science Times, which is a Tuesday section, normally devoted to various news of science. Not this week. The entire section, seven of eight pages (the last page is always the astronomical data), tells the story of the Higgs Boson.

There's a sidebar, which has this tidbit:
If that should happen -- tomorrow or billions of years from now -- a bubble would sweep out through the universe at the speed of light, obliterating the laws of nature as we know them.

The quote refers to a discovery made around 1998: not only is the universe expanding, which had been known since Hubble's interpretation of red-shift in 1929, but the expansion is accelerating. This discovery led to the "creation" of dark energy, the assumed source of the energy necessary to motivate the acceleration (dark matter had been conjured earlier, to account for galactic rotation). At the macro-level, Newton's laws are still true, so an accelerating universe requires a power source, and a rather massive one at that.

The accepted theory has been that dark energy's push is sufficient to overcome gravity forever, so the universe will sort of end with dying stars as isolated islands out of sight of each other. There is no big crunch and recycled big bang. The Higgs Boson, if it's really the way it looks, means that we don't end with cold, empty space, but with a slurpy. Call it the Big Bubble.

04 March 2013

Seven Percent Solution

Nicholas Meyer wrote a Sherlock Holmes book, which became a movie. The point of the story was that Holmes was addicted to cocaine (a 7% solution), and sought out Freud to be cured. He was, and returned to playing his violin as amusement. Violin Memory, SSD maker FOTM, has been making bits of noise now and again for the last few years. They're, maybe, going the IPO route; stories have been appearing for some months. Today brings yet another story, this time that Violin and Toshiba, their NAND supplier, are preparing to take on EMC/Fusion-io in a cage match.
"The old approach of disk storages has a zone of waste; you have to pay for much more capacity than you really need to get the performance you want," Basile said. "Our systems reduce that zone of waste."

Now, that could mean Violin is aiming to educate clients in the Normal Forms, and thus smaller data footprints. A quick search came up with this older piece, that kinda, sorta alludes to that.
At the heart of DataQuick's business lies a series of SQL Server 2000/2005/2008 databases composed of aggregated, standardized, and normalized public real estate information from across every region of the United States.

Normalization is your best friend.

03 March 2013

Seeing the Trees for the Forest

A three-peat of grist for both the quant and macro-policy mills, all in today's NY Times (that's a surprise). Both versions of this endeavor have mentioned the importance of the Bretton Woods agreement to both the growth of the US economy, and thence its collapse post-OPEC oil embargoes. That's covered in a book review relating how Bretton Woods got done.
America was unscathed at home and far stronger in 1945 than in 1941. Owner of the bulk of the world's gold bullion, a stunning 20,000 metric tons, the United States wanted to emerge from World War II with fixed conversion rates assured by a dollar tied to gold. And it went without saying that it was bent on becoming the globe's financial capital.

In other words: the USofA ran international finance. One effect was to create a blue collar middle class, never seen before in human history, which supported not only production domestically, but provided a market for the rebuilding economies of both the Allies and the Axis Powers. We gave with one hand (Marshall Plan), and taketh away with the other (Bretton Woods).
Although America's global dominance has long since melted away, no substitute is yet strong enough to shove the dollar aside, in part because of Bretton Woods.

Next, The Great Divide, a continuing series, muses on the notion of consumption versus capital. I don't agree with much of the argument, but it's impossible to ignore this:
A consumption bias, economists argue, is not a bad thing, as it leads to cheaper goods for Americans. And after all, someone has to do the consuming -- otherwise, whom would the Germans and Chinese export to?

Of course, no one.

Which leads to Friedman's take on a climate change report. I've been unable to track down reporting I read ages ago, which asserted that the dissolution of the USSR wasn't Gorbachev's policy, but rather a rebellion driven by extended crop failures. This always made more sense. Now, we have reporting that crop failures motivated revolts in Arabia.
The numbers tell the story: "Bread provides one-third of the caloric intake in Egypt, a country where 38 percent of income is spent on food," notes Sternberg. "The doubling of global wheat prices -- from $157/metric ton in June 2010 to $326/metric ton in February 2011 -- thus significantly impacted the country's food supply and availability." Global food prices peaked at an all-time high in March 2011, shortly after President Hosni Mubarak was toppled in Egypt.

Consider this: The world's top nine wheat-importers are in the Middle East: "Seven had political protests resulting in civilian deaths in 2011," said Sternberg. "Households in the countries that experience political unrest spend, on average, more than 35 percent of their income on food supplies," compared with less than 10 percent in developed countries.

Everything is linked: Chinese drought and Russian bushfires produced wheat shortages leading to higher bread prices fueling protests in Tahrir Square. Sternberg calls it the globalization of "hazard."

For the macro-policy folks: it's the distribution. For the quants: policy trumps data every time; more specifically policy responses to bad things happening in the world. Whether climate change is a perpetual Black Swan herd (yes, I just looked it up) or not, quants need look out the window to know what's going on.

Or not. Phil Factor relates:
I remember when a 'rocket scientist' friend of mine in the City of London worked out, a while ago, that the price of sugar futures contracts correlated directly with the weather in Chicago.
...
Why? Chicago was the center of the bulk of sugar futures trading, at the time. The traders merely looked out of the windows and reacted instinctively.

There be dragons.

02 March 2013

Mr. Bayes, Tear Down This Strawman!

A long time ago, in a city far, far away, I worked in a unit of four folks dedicated to quants. We were named The Office of Analytic Methods, a small outpost in a small Federal agency on E St NW in DC. Among the four was an ABD in stats who talked much of Bayes. Now, this was at a time when stat packs ran only on mainframes, and had names such as SPSS, SAS, BMDP, PSTAT. I'd wager that no one has heard of the last two. I was of the frequentist mind (as such are called today), and viewed Bayesian methods as shown, on occasion, in the preamble to this endeavor (cite long lost, and from memory):
If you're so sure of your prior, then just state your posterior and be done with it.

The Bayesian approach asserts that one should start with an assertion about some parameter's value -- that assertion is the prior -- then collect some data, and finally see whether the data are reason enough to revise the initial assertion. The revised assertion, when one is done, is the final answer, and is called the posterior.
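The mechanics can be sketched in a few lines. A minimal example using the conjugate Beta-Binomial pair -- the numbers (a flat prior, 8 heads in 26 flips) are illustrative, not taken from any of the sources above:

```python
# Prior -> data -> posterior, via the conjugate Beta-Binomial update.
# Illustrative sketch: flat Beta(1, 1) prior on a coin's heads probability.
from scipy.stats import beta

a_prior, b_prior = 1, 1      # Beta(1, 1): every value of p equally credible
heads, flips = 8, 26         # the data

# Conjugate update: add heads to a, tails to b.
a_post = a_prior + heads
b_post = b_prior + (flips - heads)

posterior = beta(a_post, b_post)
print(f"posterior mean: {a_post / (a_post + b_post):.3f}")  # 9/28, about 0.321
print(f"95% credible interval: {posterior.interval(0.95)}")
```

The posterior here is a full distribution over the parameter, which is precisely the extra specificity Bayesians claim over a reject/fail-to-reject verdict.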

Being somewhat open minded, I decided to have a new look at Bayes, and therefore got "Doing Bayesian Data Analysis: A Tutorial with R and BUGS". It's been well reviewed, and has all those cute doggies on the cover (not explained, either). The first half of the book is built on the Bernoulli and binomial distributions; a lot of coin flipping. Chapter 11 gets to the heart of the matter, "Null Hypothesis Significance Testing" (NHST, for short).

Those who embrace Bayes (nearly?) universally object to the usual statistical testing and confidence interval estimation, because they're based on testing whether values are equal, as assumed. That is the Null Hypothesis: that two means are equal, for example. We assume that two samples (or one sample compared to a known control) have the same value for the mean, and set about testing whether the data support that equality. Depending on what the data say, we either reject the null hypothesis or fail to reject it. Even if we reject it, we don't get to say that the true mean (in this example) is the sample mean. This lack of specificity bothers some. Bayesians say we can do better.

The NHST amounts to proof by contradiction, an age-old maths proof method, but one strongly objected to by Bayesians.
This chapter explains some of the gory details of NHST, to bring mathematical rigor to the preceding comments and to bring rigor mortis to NHST.

The crux of his argument is that the same sample can accept or reject the null hypothesis, depending on the sampling plan. He puts a sinister spin on this fact:
We have just seen that the NHST confidence interval depends on the covert intentions of the experimenter.

That word, "covert," and similar terms appear with some frequency in the discussion. In one sense, the NHST does accept in one case and reject in the other, but there's nothing covert going on. What happens is that one framing of the experiment is binomial (flip a fixed number of times), while the other is negative binomial (flip until a fixed number of heads). Since the distributions differ, the test, using the same data, can give different results. In the strawman presented, it does. Of course.
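The point is easy to make concrete. A hypothetical of my own (these numbers aren't the book's): two experimenters see identical data -- 3 heads in 12 flips -- but one fixed n = 12 in advance (binomial), while the other flipped until the 3rd head (negative binomial). The one-tailed p-values against "the coin is fair" differ:

```python
# Same data, two stopping rules, two p-values. Illustrative numbers,
# not the book's. One-tailed test of H0: p = 0.5 against "too few heads".
from scipy.stats import binom

heads, flips = 3, 12

# Plan A: n = 12 fixed in advance. p-value = P(X <= 3 | n=12, p=0.5).
p_fixed_n = binom.cdf(heads, flips, 0.5)

# Plan B: flip until the 3rd head. p-value = P(N >= 12), i.e. the
# probability of at most 2 heads in the first 11 flips.
p_fixed_heads = binom.cdf(heads - 1, flips - 1, 0.5)

print(f"fixed-n p-value:     {p_fixed_n:.4f}")      # ~0.073: don't reject at 0.05
print(f"fixed-heads p-value: {p_fixed_heads:.4f}")  # ~0.033: reject at 0.05
```

Nothing covert: the stopping rule is part of the experimental design, and the two designs imply different distributions over the same observed counts.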

Later, he constructs a real strawman. He tells a tale of flipping flat-headed nails.
We flip the nail 26 times, and find it comes up heads 8 times. We conclude, therefore, that we cannot reject the null hypothesis that the nail comes up heads or tails 50/50. Huh? This is a nail we're talking about. How can you not reject the null hypothesis?

Think about that for a second. He's conflating a null hypothesis about a coin, asserted to be symmetrical in three dimensions and of homogeneous density, with a nail which has neither characteristic. This is flippin' silly.
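For what it's worth, the arithmetic in the tale does come out as he says -- assuming the usual two-tailed exact binomial test is what he has in mind, 8 heads in 26 flips just misses the conventional 0.05 threshold:

```python
# Two-tailed exact binomial test of H0: p = 0.5, for 8 heads in 26 flips.
# Assumes this is the test the book's example intends.
from scipy.stats import binomtest

result = binomtest(8, n=26, p=0.5)
print(f"p-value: {result.pvalue:.4f}")  # ~0.076 > 0.05: cannot reject 50/50
```

So the "cannot reject" verdict is real; the silliness lies in applying a fair-coin null to a nail in the first place, not in the test's arithmetic.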

So, I remain a frequentist.

01 March 2013

Stuffing the Corporate Mattress

In an earlier post, I mentioned that US corporations are sitting on something just south of $2,000,000,000,000. This matters, because it demonstrates that corporations have all the money they need to invest in physical plant and equipment. The shibboleth of government deficits stealing money from the hungry capitalists is bunk (well, if it's a shibboleth, *it is* bunk). So, what's been the history of capital investment, and profit/retained earnings? Is the current crop of Daddy Warbucks more, or less, biased toward purely fiduciary activities as against physical investment?

Where's the data?

I haven't yet found a time series of corporate profit and physical investment (and likely won't), but here's a snippet from Bloomberg, not your average left-wingnut organ.
Corporations are holding more than $1.7 trillion in liquid assets, reflecting uncertainty over future policies. They're investing in capital projects only 80 percent of their available internal funds, according to data compiled by Bloomberg. Though that figure is up about one-third from late-2009, that ratio has been below 80 percent only once since the end of 1958.

American capitalists still view declining wages as the route to profit. Killing the middle class is quite the same as killing the golden goose. In past times, even Henry Ford understood this. That such behaviour leads to Depression doesn't get through; or maybe it does. If you're sitting on $1.7 trillion, Dee Flation is your best buddy.