27 February 2014

Bitcoin, the Supernova of Deflation

Wherein we discuss the confluence of quants, computers, macroeconomics, and SQL.

Herbalife is a Ponzi scheme, although it remains legal. So is bitcoin. Whether either will survive 2014 is up for debate. This piece discusses bitcoin.

For those who're lazy (or don't like my prose), here's the punch line: should bitcoin's promoters get their way, i.e. bitcoin becomes the reserve currency of record, the world will fall into a permanent deflationary spiral. The sole winners will be the early buyers (Winklevii, I'm talking to you). Just as they are in a Ponzi scheme. There is historic precedent, for those who bother to look. The 19th century USofA was a morass of falling prices and wealth concentration, and for the obvious reason: gold-standard currency depends on native lodes of the metal, and new lodes appeared only a very few times. The entire amount of gold extracted from the planet for all time has been described as about enough to fill an Olympic swimming pool. The Wiki says a 21 meter cube. Near enough.

Here's an academic-ish review. On balance, the 19th century was volatile to a degree no one alive today, or even their grandparents, has experienced first hand. And there was more deflation and depression than in the 20th, by a long shot.

By now, the Mt. Gox fiasco is well known. What isn't known, as I type, is whether the entirety of deposits (said to be 750,000 bitcoin) has vanished. The metaphor of bitcoin "mining" to gold mining was intentional. The difference is that the amount of recoverable gold is unknown, while with bitcoin it is fixed and published: 21 million of the little boogers, 2.5/minute and soon even less. There are reported to be 12 million already mined. 9 million to go, and a long way to get there: 2140. Unlike gold, which submits to standard mining techniques (pretty much) once located and whose total is unknown, bitcoin mining is intentionally non-linear in time and in the number of miners, and issues from a single source. The notion that such a snail's pace of currency generation can support expanding global commerce is somewhere between silly and evil. Making value independent of governments, sovereignty, geology, and place of residence (i.e., external factors which individuals can't easily change) sounds like an egalitarian motive. But the implementation is decidedly autocratic. The gating factor is bitcoins/unit time; this is fixed, until it gets smaller. The reason mining has become so compute intensive is that the gating factor is a constant, regardless of the amount of mining attempted (the machine increases difficulty as the number of attempts increases, to hold static the number of bitcoins released: a pure negative feedback loop); not the move from 50 to 25 per unit time. Moreover, by rewarding timeliness rather than actual effort, the game is perverse. Kind of like being born 7 feet tall with a deadly 18 footer; pure chance yielding massive reward.
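That feedback mechanism fits in a few lines. The 2016-block window, 10-minute target, and 4x clamp are the published protocol parameters; the rest is a toy model, not the actual client code:

```python
# Toy model of bitcoin's difficulty retarget. The protocol holds coin
# issuance roughly constant per unit time, so difficulty must scale
# with total hash power.
TARGET_SECONDS = 2016 * 600  # 2016-block window at 10 minutes/block

def retarget(difficulty, actual_seconds):
    """New difficulty after a window that took actual_seconds."""
    ratio = TARGET_SECONDS / actual_seconds
    ratio = max(0.25, min(4.0, ratio))  # protocol clamps to 4x either way
    return difficulty * ratio

# Double the hash power: the window finishes in half the time, so
# difficulty doubles and each miner's reward per hash is halved.
d = retarget(1.0, TARGET_SECONDS // 2)
```

Note what's missing: effort. Total coins released per unit time never enters the formula as a variable, only as the constant being defended.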

As with the archetype Ponzi, the later you join the harder it is to succeed. Herbalife, but with a much larger cost of entry.

Which all reminds me of the taunt from grad school (voiced by MBA wannabes to the rest of us): "if you're so smart, why ain't ya rich?" Well, here's the other side:
At the Tokyo office building housing Mt. Gox, bitcoin trader Kolin Burges said he had picketed outside since Feb. 14 after traveling from London in an effort to get back $320,000 he has tied up in bitcoins with Mt. Gox.

Bitcoin is a Ponzi scheme, and makes little effort to hide the fact; its promoters have just used veiled language. At the beginning, mining was easy, with few computers involved, so the originators could make all they wanted. Soon enough, Godzilla-sized computers are needed. In short order, mining becomes nearly impossible on a cost/benefit basis. It's already reached the point that the cost of the electricity alone to run the required computing power exceeds the value of the mined coins. The system, by design, adjusts to the amount of mining by making the mining process more difficult. Thus, early miners, who were few in number, got lots of bitcoin with minimal effort. Since the release rate is what's fixed, more or less, at a point in time, the effort/reward ratio skyrockets as more mining is attempted. It's as if God goes about hiding the gold as more miners enter the lode. How Darwin of him. And so Ponzi. Today's serious miners are betting that bitcoin's value will move inversely to national currency (mostly, the US buck): that deflation will occur globally, and raise the purchasing power of bitcoin. Gold bugs have the same point of view.

The SQL part? Mt. Gox has been hacked with SQL injection before. And current reporting says that the missing bitcoins dribbled out over the years, not in a single-event robbery. No Ocean's 666. I certainly wouldn't bet against the possibility that the SQL injection vector has been open since it was first admitted.
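For the record, the injection vector is as mundane as it sounds. A minimal sketch with Python's bundled sqlite3; the schema and figures are invented for illustration, not Mt. Gox's actual tables:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE wallets (user TEXT, balance REAL)")
conn.execute("INSERT INTO wallets VALUES ('alice', 750000.0)")

# Vulnerable: user input is pasted into the SQL string, so a crafted
# "name" rewrites the query itself.
def balance_unsafe(name):
    return conn.execute(
        "SELECT balance FROM wallets WHERE user = '%s'" % name).fetchall()

# Safe: the driver binds the value; input can never become SQL.
def balance_safe(name):
    return conn.execute(
        "SELECT balance FROM wallets WHERE user = ?", (name,)).fetchall()

evil = "nobody' OR '1'='1"
# balance_unsafe(evil) returns every row; balance_safe(evil) returns none.
```

A decade-old, fully solved problem, which is rather the point.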

As to the macroeconomic effect of bitcoin uber alles? The two main effects: wealth would concentrate to a degree not seen since the pharaohs, and deflation would become the norm. Why, one might ask? Well, for the same reason as in the 19th century: with a fixed amount of legal tender, any expansion in output has to be accommodated within that currency store. Price * output == money supply (holding velocity constant). Price, in the aggregate, has to fall. Falling prices are an incentive to hoard, and thus a disincentive to produce. Death spiral. Of course, it can be argued that if output *does* increase (the necessary condition for deflation), then no harm, no foul; it's just a wash. History says it's never been a wash. The Fortune X00 are hoarding trillions of $$$ as we speak, CPI/PPI flip-flop around the 0 mark, and we've not yet entered the bitcoin spiral. The issue is that deflation must, necessarily, reduce output growth, through hoarding and wealth concentration.
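The arithmetic, for the skeptical. A toy of that identity with velocity pinned at 1 and the money supply frozen, the numbers invented:

```python
# Quantity identity: money * velocity == price * output.
# With money and velocity fixed, price must fall as output grows.
MONEY, VELOCITY = 100.0, 1.0

def price_level(output):
    return MONEY * VELOCITY / output

p0 = price_level(100.0)                # baseline price level: 1.0
p10 = price_level(100.0 * 1.03 ** 10)  # after a decade of 3% growth
# p10 is about 0.74: a quarter of the price level gone. Deflation,
# by construction, as long as the currency store can't expand.
```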

The bitcoin mining supply rule is intentionally asymptotic to 0, and is well into its declining stage due to the release criteria: every 10 minutes, 25 bitcoins appear. Since this is a reward not for effort but for timeliness, the proportional return on effort is inverse to the total amount of mining computer power employed. And, as we know, that has skyrocketed. Just as in real-world gold rushes: the more miners, the sadder; all but a handful waste not only time and energy, but funds which could have been put to productive use. One might as well spend the moolah on Powerball. The odds might even be better. In real gold rushes, those that made money were the ones who sold eggs and beer to the miners. All that gold, and only so many eggs and kegs of beer. One might see the correlation in the move from cpu to gpu to asic processing in mining: the ones making the money are the egg and beer guys.
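The asymptote is easy to verify from the published schedule: 210,000 blocks per era, the reward starting at 50 and halving each era, a geometric series converging to 21 million. A sketch (the real protocol truncates at the satoshi, which shaves the total slightly; this is the idealized series):

```python
# Sum the published issuance schedule: 210,000 blocks per era,
# reward starting at 50 BTC and halving each era.
def total_supply(eras=33):
    total, reward = 0.0, 50.0
    for _ in range(eras):
        total += 210_000 * reward
        reward /= 2.0
    return total

supply = total_supply()  # asymptotically 210000 * 50 * 2 = 21,000,000
```

Half of everything there will ever be was emitted in the first era alone. The early-buyer arithmetic is right there in the loop.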

Tulips, anyone?

25 February 2014

Catch Me, If You Can

Yesterday, I tossed in a metaphor to the growing realization that there just might be limits to making "new" tech. Specifically, that:
Maybe vendors should include a phone life's supply of Prozac?

You know, with the stasis of tech, we enter a Dark Age, where little ever changes, the archetype of economic and social depression? A sarcastic metaphor?

So, what should show up in my inbox today, you may ask? This (I use Thunderbird, and only directly download headers. Cuts down on nonsense, so I'd advise you don't actually follow any links. I munged it just in case.)
Get Help for Depression SignsofDepression.z@jn.newmailonline.XXXXX

Look under the bed!!! Look under the bed!!!

24 February 2014

Boris Godunov

A recent itch, motivated in large part by the carping from the smartphone gallery, is that Moore's Law nears repeal. Unless some new physics becomes evident.

Announcements for the latest Samsung Galaxy phone abound. This is from AnandTech. Pay special attention to the comments. It's beginning to dawn on folks that limits exist. Maybe vendors should include a phone life's supply of Prozac?

The first comment to get it:
We're entering the era of "good enough" in the smartphone space. If you've been happy so far you'll be happy for plenty of time ahead.
-- Mondozai

Then we have (pity the poor man):
How fast the market was moving from 2007 to 2009 led me to believe we would be seeing some true innovation and mind blowing things by now. Even Apple is doing incremental upgrades every year. Better cameras and a faster soc's, this has turned into the PC industry.
-- agent2099

Sounds like a financial sector quant. Just because one thinks today looks mostly better than yesterday doesn't mean tomorrow will look mostly better than today. Well, until stasis has won, so the days run into each other like a pea soup fog. Of course.


(Another comment, elsewhere, that gets elevated to essay. Kind of a selfie, I guess. The context: whether any of the other semiconductor companies can, or even should try to, match Intel. From the quant's point of view: what should be the measure of RoI, and how should it be measured? If tech is close to, or has even reached, stasis, then justification for buying more of existing producer goods rests solely on fulfilling unmet demand. With global wealth/income continuing to concentrate, is there meaningfully growing unmet demand? Has a black swan touched down on Wall Street?)

The larger issue, and certainly not imminent for the momo or day trading types, is what becomes the $$$ sink for The Giant Pool of Money? It's still out there. And now we've got more of it, in the form of giant cash balances in the Fortune X00. The right wing Americans (mostly the financial sector, with a dog in this fight) insist that we're spending too much on consumption and not investing enough (they want those fees). Fact is, except for healthcare and semis, most "investment" is things like housing and finance and share buybacks. None of which produce anything. The Fortune X00 and the existing Giant Pool of Money are still sitting out there, looking for baksheesh. But without "better" producer goods to buy into, return on investment falls. As it has been falling, and without any help from the Fed. That's why the housing market was targeted by the Giant Pool of Money: it was seen as low risk, high payoff (relative to what real physical investment was providing); iow, low return on real investment had already arrived by 2003, and is still here.

The return on physical investment is a function of technological progress; iow, there's no reason to add an open hearth steel furnace, only a better one. If there isn't....? You see where this is going? For much of the last 60 years, that's been semis. If semis also reaches a place of tech stasis, how to earn from investment? Housing pays off only if owners can earn more, since housing doesn't produce anything. Since they weren't earning more, the house of cards collapsed. If industry, too, hits a wall of stasis, what do we do then?

[Looked at another way: the owners of the Giant Pool of Money seek to extort high return from Treasuries, i.e. taxpayers, rather than actually building out infrastructure. As the captains of industry seek to avoid making real investment due to perceived low real returns, then Treasuries' return must fall, too. Demanding X% from the Government when industry only returns Y% (less than X%, of course) with real investment perverts the system. Kind of moral hazard. Without expanding technology, and unmet demand, i.e. moolah in the hands of the many, capital loses value. And, no, rate of time preference is not the gating element, tech is.]

Have a nice day.

22 February 2014

Hair Brained

The 2008 FOMC transcripts are released. As you can see, they're extensive. More text than my single set of eyeballs really wants to endure. Fortunately, the NYT has plenty of eyeballs, and has gone through them. And published a number of pieces.

What's missing from the writeups is any reference by those within the Fed to the disconnect between house prices and incomes as impetus to the crisis. If I survive chopping yet more ice from the driveway, I'll have a go at the transcripts and update. For now, here are my takeaways.

- Yellen is smarter than Bernanke and Paulson
- Quants are dumb as a sack of hair
- Fed members from fascist states are as fascist as one might expect

Here's the list of articles, in descending order of interest. Although I find them all interesting.

The Fed's Actions in 2008: What the Transcripts Reveal (The online version is much more extensive than the dead trees I read this AM.)

Fed Misread Crisis in 2008, Records Show

As Crisis Loomed, Yellen Made Wry and Forceful Calls for Action

Fed Fretted Over Reaction to Demise of Lehman

Reporter's short notes

Transcripts timeline:
Yellen: For example, East Bay plastic surgeons and dentists note that patients are deferring elective procedures. [Laughter] Reservations are no longer necessary at many high-end restaurants. And the Silicon Valley Country Club, with a $250,000 entrance fee and seven-to-eight-year waiting list, has seen the number of would-be new members shrink to a mere thirteen. [Laughter]
Which simply means that she looked out the window to discover that it really was raining, despite what the weathermen were saying. Anecdotal evidence is still evidence.

Yellen piece:
What the transcripts show is a woman who was constantly pushing her peers -- and also cleverly cajoling them -- to do more to help ordinary households, not just financial institutions. At the same time, she urged her colleagues to look at the flaws in the banks that caused the crisis in the first place. "I don't believe in gradualism in circumstances like these," Ms. Yellen said in March 2008, months before the situation came to a boil.

Fed Misread article:
The Fed's understanding of the crisis, however, was clouded by its reliance on indicators that tend to miss sharp changes in conditions. The government initially estimated, for example, that the economy expanded in the first half of 2008 because it basically assumed that some economic trends, like the pace of business creation, had continued apace.
The transcript for that meeting contains 129 mentions of "inflation" and five of "recession."
Some Fed officials have argued that the Fed was blind in 2008 because it relied, like everyone else, on a standard set of economic indicators.

Lehman piece:
Today, critics of the Treasury and the Fed say that the our-hands-were-tied argument may be an excuse, used after the fact, as a shield from criticism that they were negligent and miscalculated badly.

"It was a post-incident rationalization," Harvey R. Miller, a partner at Weil, Gotshal & Manges, said in an interview on Friday.
"Although Fed officials discussed and dismissed many ideas in the chaotic days leading up to the bankruptcy, the Fed did not furnish to the F.C.I.C. any written analysis to illustrate that Lehman lacked sufficient collateral to secure a loan," the [FCIC] report noted.

Reporters' Notes:
"While there are tales of woe, none of the 30 C.E.O.'s to whom I talked, outside of housing, see the economy trending into negative territory," said Richard Fisher of the Dallas Fed in January. "They see slower growth. Some of them see much slower growth. None of them at this juncture -- the cover of Newsweek notwithstanding, a great contra-indicator, which by the way shows 'the road to recession' on the issue that is about to come out -- see us going into recession."
And while the stock market might drop in the short term, [Jeffrey M. Lacker, then and now the Richmond Fed president] added, there was a "silver lining" to Lehman's collapse. "I don't want to be sanguine about it, but the silver lining to all the disruption that's ahead of us is that it will enhance the credibility of any commitment that we make in the future to be willing to let an institution fail and to risk such disruption again."
Life is so much more comfortable in that 1% bubble.

In sum, then: no understanding that mortgages had outstripped, and were outstripping, incomes. Utter subservience to quants, who didn't have a clue. And, save Yellen, little concern except for their clients on Wall Street. Not a banner year. As far as a cure goes, no indication that the Fed folks, as a whole, understood they were the last, but wrong, bastion against complete collapse. The correct approach, of course, was/is fiscal policy to restore demand, but the Republicans were and are steadfast against "giveaways" to the unworthy. That left the Fed, but recovery monetary policy is never any more than pushing on a string, as we have seen. Generating a contraction, to kill dat ole debbel inflation, is yanking the gallows' rope. Works like a charm; Volcker was a blessed savior for killing inflation caused by OPEC oil price hikes (not domestic profligacy). Even now, corporate America considers the Fed constraining. Total ingrates.

21 February 2014


Ok, I know I should enter some blogging-and-commenting twelve step program, but maybe tomorrow.

Anyway, one of the many postings, many places, chewing on the WhatsApp buying led me to post the following comment (it starts with a snip from a previous comment):
-- younger users will move from them extremely fast if something better comes along.

The point, of course, is that Zuck (and all the other advert shifting folks) is simply chasing the fungible whims of hormone overloaded kiddies. They don't even comprehend good, better, best; only different. And, if 55 coders could make something different that filled a whim, some other bunch of 55 coders will shortly do so, again. Flush $19 billion down the crapper. Again.

Someone (possibly Carr) described the current situation in "high tech" and "innovation" as a computerized putting-out, or cottage, industry; the original was based on a relatively cheap bit of technology (most often that newfangled sewing machine) dispersed in homes. The workers often got just subsistence wages, if that. All those HuffPo scribes and Seeking Alpha pundits, for example. The difference being that if the fickle finger of fate dubs you, you're very rich. For creating a bit of software infotoyment. An economy and society built on sand.

20 February 2014

Viva la Difference [update]

Here's an interesting quote (from here):
Does that ring a bell? That kind of trade is similar to transactions in derivative products known as credit default swaps that played a key role in the financial crisis. Credit default swaps allowed investors to bet on the health of housing-related securities; with Bitcoin, they're betting on the health of Mt.Gox.

That's almost correct. And, of course, it was the quants that went hog wild ("But I was just following orders!") with CDSs. There is a difference: with CDSs, there's no limit on how many can bet on the underlying entity's failure. Kind of like the craps table: one shooter, but lots of bettors. The Mt. Gox situation is simply discounting the instrument, much as a stock will crater on bad news. Unless, of course, these Mt. Gox-ians are emulating "The Producers" by selling more than 100% of what they hold. Wouldn't that be fun?

"It took me less than 12 hours of programming to do this, and I didn't have to get approval from anyone," Jones said. "It's an uncertain time, and I think there's money to be made and lost."

If that sounds like financial anarchy, I'd bet you're quite correct. I wonder how many survivalists have BitCoins next to their Rands and M-16s?


Mt. Gox is officially dead: here

And in the same Top Stories box, we find that gold is (allegedly) manipulated by some London banks. "We make money the old fashioned way, we cheat."

19 February 2014

REM [update]

It's the end of the world as we know it
(It's time I had some time alone)
And I feel fine, fine
-- REM

Yes, yet another pot shot at the Zuck for dropping $4, $12, $16, $19 billion (depending on how you count) for something called WhatsApp. For the record, Forbes says that's for 55 coders. Lottery jobbing is the new normal.

I've added the following quote (from here) to top of page. Read the article, too.
"Advertising has us chasing cars and clothes, working jobs we hate so we can buy s**t we don't need," Koum, WhatsApp's CEO, wrote on Twitter in August 2011, quoting Tyler Durden, the main character from the movie Fight Club.

Thereby proving that Zuck/FB are scared shitless that something else will steal the advert business. At one time, the micro-payment meme was a movement. Not so much anymore. Recent data breaches here, there, and everywhere make it even less likely that the process will be implemented.

A web kiddie with my jaundiced view of the innterTubes. I am not alone!!! Rand be damned.

17 February 2014

I Have a Code in My Node

Humpty Dumpty sat on a wall,
Humpty Dumpty had a great fall.
All the king's horses and all the king's men
Couldn't put Humpty together again.

Now, consider what the results might be if Humpty's wall were set by the Yellow Brick Road that we've been traveling. You remember, the one to Oz where the RM (and its rococo manifestation, the SQL database) vanquishes all of its historic (and neo-) pretenders? If we anthropomorphize Intel (and, face it, all semi makers) as Humpty, what do we see? Well, the fall is the delay in getting 14nm parts out the door; and, a bit less so, no move to 450mm wafers (300mm production began by 2001). The innterTubes blogosphere is awash in argument: it's an unplanned, unexpected, unadmitted failure to execute on promised delivery; or it's just a pause by design, much as pauses used to be part of Cape Canaveral launches.

This growing discussion of where we're at and where we're going in the realm of VVVVVVLSI (don't go to WikiPedia, I made that up) does matter to our trip down the Yellow Brick Road. If we're entering a period, perhaps prolonged if not permanent, of technological stasis, how will computing be impacted? The meme of Moore's Law being true forever and ever, Amen is false. In the near term, we can see that the line in the sand betwixt the Newton world and the Heisenberg world is a few short steps away. For myself, I don't see the cpu living in a quantum world. Whether some other element can be fabricated smaller than where that line's at and still make a deterministic cpu? Carbon? Maybe. Maybe not.

Of equal import: is there any unmet demand for cycles? Is there (or soon to be) an equivalent to the Wintel duopoly, where Intel needed a cycle hog to generate demand for the next chip and MicroSoft needed a source of more cycles to keep its piggy software running?

That Intel/AMD/ARM face unmet demand for cycles, which can only be satisfied with smaller feature size (node) and greater transistor budgets, may well now be a wishful myth. Clearly, the halcyon days of the Intel/Microsoft Pentium/Windoze&Office symbiotic duopoly are way in the rearview mirror. The iPhone hasn't turned out to be the interminable cycle hog that Office was, alas (bandwidth, well, yeah). While M$ could graft ever more esoteric editing/publishing functions (99.44% of which users never used, of course) onto Word and add applications to the Office umbrella, all the while writing to the next Intel cpu as Bill said (well, maybe), "if Windows is slow, we'll let the hardware fix it", smartphone vendors haven't had such a clear field.
Even so, running Windows on a PC with 512K of memory is akin to pouring molasses in the Arctic. And the more windows you activate, the more sluggishly it performs.

The camera (which has what to do with making phone calls?) and innterTube games (ditto) and such. But organic expansion through the device? Not so much. Is there some other device/platform which the semi makers can leverage in a similar way? That's the real elephant in the room. We may be running out of potable water, but we have a surfeit of transistors and no thirst to slake. As with economics generally, supply doesn't make its own demand; rather, it is subservient to it. American auto companies, for decades, implemented planned obsolescence. And it worked, until it didn't. That approach stopped working in the PC world, and looks to be staggering in the smartphone world, too. Lots of content XP users. Smartphones, too?

To date, transistor budgets have been put to use by cramming existing functions into fewer, one in the SoC case, chips. For a given wafer size, a node shrink is valuable for one of two (both, in the best case scenario) reasons: the increased budget per unit area supports better implementation of the cpu/whatever or the increased budget means greater chip yield (the chip budget is static) per wafer. In the former case, value redounds to the semi maker through improvements in end user devices. In the latter case, more devices are shipped to end users; i.e. there's unmet consumer demand for the devices. The former has been played as a zero-sum game for the semi makers; in truth nothing new, just M&A at nm scale. The latter looks to be shaky. PC sales continue to spiral down. Apple refuses to expand its target market. Samsung continues to ship more devices. Only those two appear to make any moolah shipping phones/tablets. Stasis and zero-sum game can be considered two sides to the same coin.
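The yield arithmetic behind the second case, roughly. A crude sketch with made-up die sizes; real yield math also loses dies to the wafer edge, scribe lines, and defects, all of which matter a great deal:

```python
import math

def dies_per_wafer(wafer_mm, die_mm2):
    """Crude upper bound: wafer area / die area, no edge or defect loss."""
    return math.floor(math.pi * (wafer_mm / 2) ** 2 / die_mm2)

# A full node shrink roughly halves die area for the same transistor
# budget, so the same 300mm wafer cuts about twice the chips.
before = dies_per_wafer(300, 100)  # hypothetical 100 mm^2 die
after = dies_per_wafer(300, 50)    # same design, shrunk
```

Twice the chips per wafer is only worth the fab cost if someone out there wants to buy the second half. That's the unmet-demand question again.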

Even apart from the SoC fork in the road, the cpu core specifically has used a dramatically decreasing proportion of the transistor budget over the last decade. Makes one (me in particular) wonder whether Intel made the wrong decision lo those many years ago: 1) use the growing budget to process microcode down to a real RISC core, or 2) build out the X86 ISA in pure silicon. They went with 1, as we all know.

One of the common notions: as feature size gets smaller, less power is used and less heat is generated. Not quite true.
Shrinking transistors boosted speeds, but engineers found that as they did so, they couldn't reduce the voltage across the devices to improve power consumption. So much current was being lost when the transistor was off that a strong voltage--applied on the drain to pull charge carriers through the channel--was needed to make sure the device switched as quickly as possible to avoid losing power in the switching process.

One take on the issue.
With Dennard scaling gone and the benefits of new nodes shrinking every generation, the impetus to actually pay the huge costs required to build at the next node are just too small to justify the cost. It might be possible to build sub-5nm chips, but the expense and degree of duplication at key areas to ensure proper circuit functionality are going to nuke any potential benefits.

This related piece has the graph that tells all.
This expanded version of Moore's law held true into the mid-2000s, at which point the power consumption and clock speed improvements collapsed. The problem at 90nm was that transistor gates became too thin to prevent current from leaking out into the substrate.
The more cores per die, the lower the chip's overall clock speed. This leaves the CPU ever more reliant on parallelism to extract acceptable performance. AMD isn't the only company to run into this problem; Oracle's new T4 processor is the first Niagara-class chip to focus on improving single-thread performance rather than pushing up the total number of threads per CPU.
The fact that transistor density continues to scale while power consumption and clock speed do not has given rise to a new term: dark silicon. It refers to the percentage of silicon on a processor that can't be powered up simultaneously without breaching the chip's TDP.

There just aren't that many embarrassingly parallel problems. Well, there is one....
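The dark silicon arithmetic is worth working once. A toy model with made-up scaling numbers (transistor budget doubling per node, but per-transistor switching power falling only 40%, where Dennard scaling would have delivered the full 50%):

```python
# Toy dark-silicon model: how much of the die fits inside a fixed TDP
# after several generations of imperfect power scaling.
def powered_fraction(generations, budget_growth=2.0, power_scale=0.6):
    """Fraction of the die that can light up at once within the TDP."""
    frac = 1.0
    for _ in range(generations):
        frac /= budget_growth * power_scale  # power grows 20% per node
    return min(1.0, frac)

f = powered_fraction(4)  # roughly 0.48: over half the die goes dark
```

The exact numbers are invented; the direction isn't. Every node where power scales worse than area, the dark fraction compounds.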

Let's continue with this very recent piece.
In 1987, 20 of the top 20 semiconductor companies owned their own leading-edge fabs. In 2010, just seven did. As the cost of moving to new processes skyrockets, and the benefits shrink, there's going to come a time when even Intel looks at the potential value and says, "Nah."

"the benefits shrink"? No wintel monopoly. Sniff.

So, in the end, if we can see both limits, demand for computing power and supply of computing power, is this a Good Thing or Bad Thing? Are we slipping and sliding on Humpty's yellow guts, unable to progress? Or did we stroll past him before he knockered his noggin? Well, in 1969, when Codd wrote his first paper, computing was in a period of stasis. Relatively speaking. IBM was on the verge of knocking off the Seven Dwarfs, with the 360 machines victorious. That ISA is still around today, and only recently expanded beyond 31 bits (no, not a typo). Could be, the bird has now come back home to roost. The RM was invented to solve the maintenance and performance problems of a known present and future computing platform: the 360 mainframe.

If I, and these pundits, are right, then the future of computing rests on a static compute-power base, much as it did in 1969 with the 360. Only more so now. How, then, to maximize those cpus? Embarrassingly parallel problems become the preferred approach. Some problems are, some aren't. Here's where the power of the RM comes into play. With flatfile datastores, by whatever name, performance will be limited both by byte-bloat and by RBAR throughput. Unlike when Bill was wrestling with Windows, we won't be able to code to the next generation of chip. The RM yields the minimum data footprint and inherently parallel processing, and with SSDs and lots of DRAM (which is a simpler structure), should become the new norm. So to speak.
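The RBAR (row by agonizing row) versus set-based contrast, in miniature. A sketch with Python's bundled sqlite3; the table is invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [(i, float(i)) for i in range(1000)])

# RBAR: drag every row into application code and loop over it.
def total_rbar():
    return sum(amount for _id, amount in conn.execute("SELECT * FROM orders"))

# Set-based: one declarative statement. The engine is free to use
# indexes, stay close to the data, and parallelize where it can.
def total_set():
    return conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
```

Same answer, but the second form tells the engine *what*, not *how*, which is exactly the freedom a parallel machine needs.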

13 February 2014

Bayes At the Moon

There's been an increased (hopefully, seasonal like some Sam Adams beer and the Groundhog) run of p-value bashing mashed up with Bayes Uber Alles meme-ing.

Enough already. If Bayesian analysis were all that much better at avoiding the "yesterday was like the day before, today is like yesterday, so tomorrow will be like today" pit which run-of-the-mill quants fell into when they gave us The Great Recession, then the Bayesians would have been trumpeting their superior analyses. I didn't see any. And, from a structural point of view, Bayesians are more likely to fall into the pit: the emphasis on "knowing" a prior, aka "yesterday's data". He's an old guy who died centuries ago; let him be.
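The structural complaint is easy to demonstrate. A beta-binomial toy, all numbers invented: a prior stuffed with yesterday's calm data barely budges when today turns ugly.

```python
# Posterior mean of a Beta(a, b) prior after observing good/bad days.
def posterior_mean(prior_a, prior_b, good, bad):
    return (prior_a + good) / (prior_a + prior_b + good + bad)

# Prior built from "yesterday": 999 calm days, 1 bad one.
# Today's evidence: 10 straight bad days.
complacent = posterior_mean(999, 1, 0, 10)  # ~0.99: still serene
agnostic = posterior_mean(1, 1, 0, 10)      # ~0.08: alarmed
```

The math is fine; the sin is in where the prior came from. If the prior *is* yesterday's data, Bayes just formalizes the pit.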

10 February 2014

Love Actuarily

The AOL brouhaha has finally been outed. We think. The company, or at least its CEO, hasn't the foggiest idea what insurance means. Hire an actuary or two. I might even volunteer; kind of like a HuffPo contributor. No, while I'm a nice guy (mostly), I'm not a Fellow.
Most large employers are self-insured for their workers' health coverage, given the savings such plans can yield over traditional group insurance. Self-funding means that an employer pays for health care rather than buying an insurance policy for their workers. Such plans now cover 60 percent of private-sector workers with health insurance--an estimated 100 million Americans.

That story doesn't cite an explicit source for the 60% number, but it does make some sense. When I was with Jack Anderson, back in the 1980s, he did just that for the staff; not that the staff numbered more than a dozen. I was only there for a short time, and not a regular staff member, but during staff meetings, Spear would tell folks that they just had to keep pestering Jack to get the money back. Apparently, staff had to pay first and get reimbursed. In the case of AOL, or any such large employer (and AOL ain't all that large, in the wider scheme of things), it appears they got either no advice, or no accurate advice, from a health actuary.
AOL, the parent of the Huffington Post news site, is a media and Internet company with a workforce that may be younger--and healthier--than most employers'. If so, its annual claims could be even more predictable than a more age-diverse employee pool.

Ignoring the fact that it's mostly young folks who have the babies. Babies are expensive, even when not at $1 million a drop. Kids, once down the chute, are too.
Per capita spending on children's health care rose to $2,123 in 2010, an 18.6 percent increase from 2007. Spending on health care for infants and toddlers was disproportionately high. Although children under 3 years comprised 17 percent of the covered child population, 31.4 percent of the total children's health care dollars was spent on them in 2010.

Having the rug rat alone costs a bunch. And, if you're a Right Wingnut of course, then there should be a birth penalty. This from the Family Values & We Don't Want No Birth Control cabal. Irony is lost on such folks.
It's a funny gripe for a number of reasons. First, as The Washington Post's Ezra Klein pointed out, high-deductible health care plans, or "health savings accounts," were a central tenet of Republican health care thinking in the days before Obamacare. The high-deductible complaint is even more hypocritical with regard to pregnant women, given that prenatal care is one of the key issues at hand in the whole Obamacare debate--and one some Republicans have consistently knocked as a stupid benefit.

Keeping 80 year olds alive sedated to Oz for another month is a valid discussion. But the younger set chews up a considerable amount of money. For the AOLs of the world, restricting employment to a class not mentioned in a long time, DINKs (Double Income, No Kids), who don't smoke or drink or do drugs (you know, Mormons that don't breed), is a way to fatten up the bottom line. Folks like me, fur instance.

06 February 2014

Thank You Thing [updated]

If you're not a regular reader of DailyTech, and you're also enamored of The Internet of Things, then you owe it to yourself to read this latest report on the Target problem. It assembles a number of reports; read them too.

As is usually the case, the greedy and incompetent sock puppets running American capitalism are looking for scapegoats. An obvious scapegoat in the damn Gummint hasn't yet turned up.
...Target made a critical error in that it reportedly offered no separation between its store cash registers and its computerized heating and cooling controls. According to Mr. Krebs, once hackers obtained access to the heating and cooling controls they basically received administrator privileges on cash registers sufficient to install malware programs silently.
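The reported error, in networking terms, is a segmentation failure: the HVAC controllers and the point-of-sale registers sat where one could reach the other. A minimal sketch of the kind of audit that would have caught it follows; the hosts and ports are invented for illustration, and this probes TCP reachability only, not any actual Target topology.

```python
import socket

def can_reach(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds.

    Run from a machine on the HVAC segment: on a properly
    segmented network, every probe toward the register subnet
    should come back False.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, unreachable, or timed out
        return False

# Hypothetical register IPs, purely for illustration:
# for h in ["10.20.1.10", "10.20.1.11"]:
#     if can_reach(h, 443):
#         print(f"segmentation hole: HVAC segment can reach {h}")
```

Nothing fancy; the point is that a one-afternoon script, run from the vendor-facing side of the network, would have shown the hole.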

As usual, lip service to the "guests".
"I know that it is frustrating for our guests to learn that this information was taken and we are truly sorry they are having to endure this," said Gregg Steinhafel, chairman, president and chief executive officer, Target.

Which gets me pondering. Some years ago I worked, briefly, on the other side of Pennsylvania for a mechanical contractor that did commercial jobs, stores and malls and such; one that was a Trane dealer. Then, and perhaps still, I don't know, Trane had independent dealers who nevertheless had to be exclusive to Trane. Looking at one of Fazio's off sites (the main site is erroring, surprise), it claims to be a Trane dealer, although it lists competitors. Whether Fazio is a maintenance contractor rather than a bid/installation shop? Again, don't know. In any case, such commercial HVAC tends to be bundled by the vendor, Trane in this instance, with an innterTubes control system included. While Fazio is named in the DailyTech piece, the manufacturer is the deep pockets. Were I to guess who built the software to run a Trane (or any commercial HVAC) system over the innterTubes, it would be Trane. Or they contracted it from a software house. But it almost surely wasn't Fazio. Whoever built the HVAC and the innterTubes software for Target's stores, and I'd bet it was Trane/Carrier or the like, is the next stop down the rabbit hole.

Well, curiosity got the better of me, so off to research HVAC control systems. Turns out they're both HVAC-manufacturer systems and systems bought in by said manufacturers (and possibly vendors). Turns out there's a "standard", called BACnet, which many (most?) of these systems follow. And, wouldn't you know, a decade ago the Gummint did a threat study of these innterTubes-aware systems:
Considering these attack vulnerabilities and scenarios it is clear that the typical BCS is not a desirable target. System resources are limited (storage space, CPU power, common OS and software packages, etc.), and valuable information is limited to the BCS system itself (configuration data, router tables) but no financial or personal information. However, this may change: as the BCS is connected to more and more service providers - giving access to more information either stored locally or providing a secured path to outside service providers' networks; and as the overall intelligence contained on the BCS network increases to accommodate smarter distributed controls and sensors. It is with this in mind that this document has been prepared, and for this reason that we look at general IT threats.
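To get a feel for how little is hidden here: BACnet/IP devices answer a broadcast "Who-Is" request with an "I-Am", which is how anything on the segment gets enumerated. Below is a sketch of that frame, with byte values following the BACnet/IP framing as I understand it; treat the exact octets as my reading of the spec, not gospel.

```python
import socket

BACNET_PORT = 47808  # 0xBAC0, the well-known BACnet/IP UDP port

def whois_frame() -> bytes:
    """Build an unrestricted BACnet/IP Who-Is broadcast frame."""
    bvlc = bytes([0x81, 0x0B, 0x00, 0x0C])              # BACnet/IP, Original-Broadcast-NPDU, total length 12
    npdu = bytes([0x01, 0x20, 0xFF, 0xFF, 0x00, 0xFF])  # version 1, global broadcast dest, hop count 255
    apdu = bytes([0x10, 0x08])                          # unconfirmed request, service choice 8 = Who-Is
    return bvlc + npdu + apdu

def broadcast_whois() -> None:
    """Fire the Who-Is at the local broadcast address; any BACnet
    device on the segment should answer with an I-Am."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
        s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        s.sendto(whois_frame(), ("255.255.255.255", BACNET_PORT))
```

Twelve bytes of UDP and everything on the wire introduces itself. That's the protocol working as designed; the threat study's point was that "as designed" assumed a network nobody hostile could reach.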

OK, so that was in 2003. Must be better today, right? Well, maybe not:
The fact of the matter is that for IT our BAS box (controller) is a pain in the A@@. Half the time it isn't LDAP compliant, we sneak our network into the building like some third-rate ninja, and then it sits on a self-created bastardized network that resembles something between bailing wire hooked into a hub and a Sub-Saharan DSL line. As if that wasn't enough to make our IT counterparts cry uncle, even when IT does finally get to run some SNMP trapping and network monitoring on our devices we refuse to let them patch our systems because of Java or Windows .Net compliance. Look Mr. IT I know Java 3.0 has issues and you're using Java 7.x but if you upgrade Java on my box our User Interface won't run.

The Internet of Things is looking more like Frankenstein's assembled body.

More to come, I'd wager.

Yet Another Acronym

It's kind of amusing to watch the 20-something "techy" types sit in a garret with their $2K quad-core PC for a year or two, building yet another simplistic infotoyment web app. Unlike Hewlett and Packard, who used their garage to build "something" real, all of these "techy" types are churning out YAAP (Yet Another Ad Platform; such an apt acronym [so far as I know as I type, it's my invention]). Sooner or later: A) no one makes a "better" YAAP and/or B) there's no one making anything but YAAP, and thus no more unmet need for ads.

A Case of the Vapours

For those who don't read the bidness pages with regularity, Twitter reported after market yesterday, and has been tanking in yesterday's aftermarket and today's premarket. To the tune of 25%. Seems that the legion of twitterites will be shrinking. Taken with Facebook losing all those cash-spending teens, mayhaps the future of a social media driven economy is not just vapid, but vaporous.

04 February 2014

You Get a RetroGrade of F

Today's NYT brings yet another context-free Brooks column, wherein he makes yet more claims of a rosy future. Sort of.

He starts by trashing previously gold-standard smarts, then poses the rhetorical query:
But what human skills will be more valuable?

So, what are his candidates?
... people who can recognize and alertly post a message on Twitter about some interesting immediate event

Say what? This rises above understanding thermodynamics, or partial differential equations? I guess so.
Technology has rewarded graphic artists who can visualize data, but it has punished those who can't turn written reporting into video presentations.

I suppose he loves pie charts. It appears he hasn't gotten the memo, both from smart management and smart worker bees: no more PowerPoint decks. Being a total reactionary, Brooks is out of the loop.

Anyway, here's his ordered list. I won't duplicate the paragraphs devoted to each. If you've the stomach, go for it.
First, it rewards enthusiasm.
Second, the era seems to reward people with extended time horizons and strategic discipline.
Third, the age seems to reward procedural architects.
(Seems to be, fourth) So a manager who can organize a decentralized network around a clear question, without letting it dissipate or clump, will have enormous value.
Fifth, essentialists will probably be rewarded.

Mostly, you can't really be a Twitterite (under First) and have any of the other qualities. Immersed in social media is now defined as "work"? Perpetually distracted, but long term focused? The second most complete oxymoron, after happily married (OK, I know...).

If that sounds like a thumbnail sketch of Dubya: yes, yes it does. The future belongs, and the mightiest rewards go, to content-free cheerleaders? That's the future? Oh my.

Those of a certain age, or who know someone of a certain age, particularly those who saved old "Popular Science" and "Popular Mechanics" issues, may remember that both magazines were known for predicting how life would be so much better in 50 years, at the millennium. Automation generally, and computers specifically, would relieve *all of us* of the drudge work, leaving us to be creative and leisurely for yet longer lives. I don't have such a stash, only memories of the issues. But here's one site with some examples.

This site is more completist.

What none of these deal with, I suppose by Utopian assumption, is that animate processes lead to concentration, not dispersion. Name any significant human problem of the day, and you're inevitably led back to the divergence of the enfranchised and the disenfranchised. Inequality is the prime result of animate progress. The problem is that, unlike lower forms of life which obey external rules of survival, some of us get to make the rules which the rest of us must endure. The dystopian wins over the utopian; the future is always darker than you can imagine.