21 August 2019

Book Larnin - part the second

A couple of, one might think, disparate threads woven together today. We'll start with a long piece in the NYT delving into The Manchurian President's hitman at the Fed opening the barn door so the horses can run wild. All in the name of loosening the reins on our fantastic capitalists.
Some of the changes, seemingly incremental and technical on their own, could add up to a weakening of capital requirements installed in the wake of the crisis to prevent the largest banks from suffering the kind of destabilizing losses that imperiled the United States economy.

You know, the bankers would never put the rest of us at risk so they can make more than the market rate of return. Now would they?
The tinkering is being driven by Randal K. Quarles, the Fed's vice chair for supervision, whom Trump nominated in 2017...

We all know, of course, that he has our best interests (he he) at heart. Here's a short bio:
As [George W. Bush's] Under Secretary, Quarles led the Department's activities in financial sector and capital markets policy, including coordination of the President's Working Group on Financial Markets, development of administration policy on hedge funds and derivatives, regulatory reform of Fannie Mae and Freddie Mac, and proposing fundamental reform of the U.S. financial regulatory structure.
[my bold]

In other words, an architect of the regime which crashed the world's economy.

Which brings us to Gary Marcus, who's been in the news recently for his growing angst over what AI is and what AI can do. This is not hot off the presses, of course.
And when I look at current AI through a perspective of human cognitive development — how children learn — I'm very dissatisfied by the state of AI. There's no AI that's remotely as clever as my four year old or my six year old.

Which reminds me of something my Pappy included in his war stories (WWII). For most of the War he was in North Africa, Algiers. He was in the Signal Corps building aircraft beacons. His unit was housed in a commandeered mansion, complete with houseboy. Said houseboy was somewhere around 10 and, according to Pappy, spoke about 6 languages. I recall hearing this when I was in high school, trying to learn languages. For a long time I was puzzled by this, but eventually the answer dawned on me: a child learning language isn't learning *a* language at all. The child is learning communication with humans, associating sounds with concrete things and even more abstract concepts. As such, each discrete language, from the adult point of view, is just the sounds needed to communicate with certain adults; it's all one language to the child. So the sounds for the Brits are different from those for the French and from those for the Arabs. And so on ad infinitum. It's only later, when language becomes text, that difficulty arises. Can AI do that? I do wonder.

And of course, the coup de grâce:
I have been saying for several years that deep learning is shallow, that it doesn't really capture how things work or what they do in the world. It's just a certain kind of statistical analysis. And I was really struck when Yoshua Bengio, one of the fathers of deep learning, kind of reached the same conclusion. I used to say this and people looked at me funny and got mad at me. And then suddenly here was one of the leaders of the field noticing the same thing.
[my emphasis]
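"Just a certain kind of statistical analysis" is easy to see in miniature. Here's a hypothetical sketch of my own (plain numpy, every name and number mine, not Marcus's or anyone's actual system): a tiny network trained to fit sin(x) from sampled (x, y) pairs. It interpolates nicely inside the range it has seen and falls apart outside it; at no point does it know anything about what sine *is*.

```python
# Toy illustration of deep learning as curve fitting: a two-layer network
# trained by gradient descent to approximate sin(x) on [-pi, pi]. It learns a
# statistical mapping from inputs to outputs -- nothing about what sine "is".
import numpy as np

rng = np.random.default_rng(0)

# Training data: the network only ever sees (x, y) pairs.
x = rng.uniform(-np.pi, np.pi, size=(256, 1))
y = np.sin(x)

# One hidden layer of 32 tanh units, one linear output.
W1 = rng.normal(0, 0.5, size=(1, 32))
b1 = np.zeros((1, 32))
W2 = rng.normal(0, 0.5, size=(32, 1))
b2 = np.zeros((1, 1))

lr = 0.1
for step in range(20000):
    # Forward pass.
    h = np.tanh(x @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y                       # gradient seed for mean-squared error

    # Backward pass (plain chain rule).
    dW2 = h.T @ err / len(x)
    db2 = err.mean(axis=0, keepdims=True)
    dh = err @ W2.T * (1 - h ** 2)
    dW1 = x.T @ dh / len(x)
    db1 = dh.mean(axis=0, keepdims=True)

    # Gradient descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

# Inside the training range the fit is decent; outside it, the "knowledge" evaporates.
test = np.array([[0.5], [3 * np.pi]])   # one in-range point, one far outside
print(np.hstack([test, np.tanh(test @ W1 + b1) @ W2 + b2, np.sin(test)]))
```

That's the whole trick, just scaled up by a few billion parameters. Shallow, as the man says.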

Another recent piece tells the tale of how all that big data gets built to train AI. It's not high tech.

In all, we find that events drive data, not the other way around. Surprise, surprise.
