At JPMorgan Chase & Co., a learning machine is parsing financial
deals that once kept legal teams busy for thousands of hours.

The program, called COIN, for Contract Intelligence, does the mind-numbing
job of interpreting commercial-loan agreements that, until the project
went online in June, consumed 360,000 hours of work each year by lawyers
and loan officers. The software reviews documents in seconds, is less
error-prone and never asks for vacation.

While the financial industry has long touted its technological
innovations, a new era of automation is now in overdrive as cheap
computing power converges with fears of losing customers to startups.
Made possible by investments in machine learning and a new private cloud
network, COIN is just the start for the biggest U.S. bank. The firm
recently set up technology hubs for teams specializing in big data,
robotics and cloud infrastructure to find new sources of revenue, while
reducing expenses and risks.

The push to automate mundane tasks and create new tools for
bankers and clients -- a growing part of the firm’s $9.6 billion
technology budget -- is a core theme as the company hosts its annual investor day on Tuesday.

Behind the strategy, overseen by Chief Operating
Officer Matt Zames and Chief Information Officer Dana Deasy, is an
undercurrent of anxiety: Though JPMorgan emerged from the financial
crisis as one of few big winners, its dominance is at risk unless it
aggressively pursues new technologies, according to interviews with a
half-dozen bank executives.

Redundant Software
That was the message Zames had for Deasy when he joined the firm from BP Plc in
late 2013. The New York-based bank’s internal systems, an amalgam from
decades of mergers, had too many redundant software programs that didn’t
work together seamlessly.

“Matt said, ‘Remember one thing above
all else: We absolutely need to be the leaders in technology across
financial services,’” Deasy said last week in an interview. “Everything
we’ve done from that day forward stems from that meeting.”

After visiting companies including Apple Inc. and Facebook Inc.
three years ago to understand how their developers worked, the bank set
out to create its own computing cloud called Gaia that went online last
year. Machine learning and big-data efforts now reside on the private
platform, which effectively has limitless capacity to support their
thirst for processing power. The system already is helping the bank
automate some coding activities and making its 20,000 developers more
productive, saving money, Zames said. When needed, the firm can also tap
into outside cloud services from Amazon.com Inc., Microsoft Corp. and International Business Machines Corp....MUCH MORE

True artists are possessed by a potent need to express themselves, and that yearning is exponentially stronger for the great ones who will do whatever it takes to put their art out into the world.

If you’d confiscated van Gogh’s paintbrushes, he would have scrawled wavy landscapes on the wall with his own blood. If you’d taken away Mozart’s piano, he would have banged on pots to create a melody. If you’d banned Shakespeare from the theater, he would have staged his plays in the fields.

Mike Mayo, the bank analyst who sparred with JPMorgan Chase & Co. executives at prior Investor Day presentations, showed up at the lender’s 2017 event less than 24 hours after his former employer, CLSA Ltd., shut down its U.S. equity-research operation. “Mike Mayo, free agent analyst,” he said when introducing himself Tuesday, before asking JPMorgan Chief Financial Officer Marianne Lake a question about profitability from deposits....

According to Morningstar, as of June 2016, the assets in smart beta
exchange-traded products totaled $490 billion. BlackRock forecasts smart
beta using size, value, quality, momentum, and low-volatility will reach
$1 trillion by 2020 and $2.4 trillion by 2025. This annual growth rate
of 19% is double the growth rate of the entire ETF market. Are factors the
cure-all for our investment needs? Or are they like the “active
management” that everyone wanted instead of passive index funds
in the 1970s?
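
The growth rates quoted above are easy to sanity-check with a little compound-interest arithmetic. A minimal sketch, using only the figures cited ($490 billion in mid-2016, $1 trillion by 2020, $2.4 trillion by 2025):

```python
# Implied compound annual growth rate (CAGR) of smart beta assets,
# using the figures cited: $490B (mid-2016) -> $1T (2020) -> $2.4T (2025).

def cagr(start_value, end_value, years):
    """Compound annual growth rate between two values."""
    return (end_value / start_value) ** (1 / years) - 1

to_2020 = cagr(490, 1000, 4)    # mid-2016 to 2020, roughly 4 years
to_2025 = cagr(1000, 2400, 5)   # 2020 to 2025, 5 years

print(f"2016-2020 implied CAGR: {to_2020:.1%}")  # ~19.5%
print(f"2020-2025 implied CAGR: {to_2025:.1%}")  # ~19.1%
```

Both legs work out to roughly the 19% annual rate the BlackRock forecast implies.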

No one then wanted to be just average.
This ironically gave many investors below-average returns as they used the
same information to compete against one another. Superior
performance was usually due more to luck than to skill. But Bill McNabb, CEO of
Vanguard, points out that passive index funds have been in the
top quartile of long-term performance.

Factor-based investors and
advisors now think they have an advantage. They base this belief on
the results of theoretical asset pricing models, many of which have
failed empirically.

Asset pricing models
look at long-term long/short returns without taking into account the
price impact of trading. Factors that looked good on paper may be lacking in robustness, pervasiveness, persistence, or
intuitiveness. Let us see.

Does Size Matter?
The small cap size premium was the first identified factor. Banz wrote about it in 1981. His results were influenced by extreme outliers from the 1930s.

Looking at more recent history, the oldest small cap index is the Russell 2000.
It started in January 1979. Here is the Russell 2000 annual return and
volatility over the life of the index compared to the S&P 500 index.

Russell 2000 underperformed the S&P 500 by 1.3% annually and had a
substantially higher standard deviation. The Russell 2000 thus
underperformed on both a risk-adjusted and a non-risk-adjusted basis.[1]

Here is a chart comparing the Sharpe ratios of all small and large cap
stocks over a longer period of time. Small cap stocks usually failed to show
significantly higher risk-adjusted profits than large cap stocks.
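
The risk-adjusted comparison being made here is the Sharpe ratio: excess return divided by volatility. A small illustration with made-up inputs (not the chart's actual data) shows how a higher raw return can still mean a lower risk-adjusted one:

```python
# Sharpe ratio: (return - risk-free rate) / volatility, all annualized.
# The inputs below are illustrative, not the chart's actual figures.

def sharpe_ratio(annual_return, annual_volatility, risk_free_rate=0.03):
    return (annual_return - risk_free_rate) / annual_volatility

# Small caps: slightly higher return, much higher volatility.
small_cap = sharpe_ratio(annual_return=0.11, annual_volatility=0.20)
large_cap = sharpe_ratio(annual_return=0.10, annual_volatility=0.15)

print(f"small cap Sharpe: {small_cap:.2f}")  # 0.40
print(f"large cap Sharpe: {large_cap:.2f}")  # 0.47
```

The extra percentage point of return doesn't compensate for the extra five points of volatility, which is the Russell 2000's problem in a nutshell.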

In the table below, long-only small caps slightly outperformed large caps
globally since 1982. But small caps have underperformed large caps in the
U.S. since 1926. Where is the outperformance that Banz talked about?...

This guest post is from Michael O’Sullivan, chief investment
officer for Europe and emerging markets at Credit Suisse Wealth
Management…
__________
Well before he was celebrated
for his work on the link between unemployment and inflation, Bill
Phillips built an extraordinary machine that pumped coloured water
through glass vessels in order to demonstrate how money flows around an
economic system. Levers in the machine permitted users to simulate the
effect on the system of fiscal policy changes and such was the intuitive
appeal of the machine that major universities like Harvard and Oxford
ordered their own versions.

In today’s algorithmic, QE-driven markets such a machine might seem out
of place, but now that the Greek situation has reared its head again,
and with some politicians in
Europe campaigning on the basis of euro exit, policymakers in Brussels
could do a lot worse than build their own version of Phillips’ machine.
Indeed, such a contraption might aid a much needed period of
introspection into the workings of the euro and may shine light on the
areas missed by the rather tame 2015 Five Presidents Report.

In the spirit of Phillips’ machine, picture the euro-zone as a system
where liquidity passes through nineteen different vessels, at different
speeds and causing very disparate pressures on each of them. A few, like
Greece, are too weak economically and politically to withstand these
pressures.

More broadly, it is now clear that in the absence of
counterbalancing mechanisms or safety valves, a common monetary policy
can have a different impact on say, the Portuguese economy as compared
to the Finnish one. This much was clear in the early 2000s when a
monetary policy that was set for a recovering German economy granted
negative real interest rates to Ireland and Spain, where banks and
consumers then deployed cheap cash with vigour.
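
The one-policy, many-pressures point can be caricatured in code. A toy in the spirit of Phillips' hydraulic machine, with entirely made-up drain rates standing in for each economy's transmission mechanism:

```python
# Toy Phillips machine: a single common liquidity inflow (one monetary
# policy) feeds vessels that drain at different rates (different
# transmission mechanisms). The rates are invented, not calibrated.

def simulate(drain_rates, inflow=1.0, steps=50):
    """Each step: add the common inflow, then drain each vessel at its
    own rate. Returns the final liquidity level in each vessel."""
    levels = [0.0] * len(drain_rates)
    for _ in range(steps):
        levels = [(level + inflow) * (1 - rate)
                  for level, rate in zip(levels, drain_rates)]
    return levels

# A fast-transmission economy passes liquidity on; a sluggish one
# bottles it up, building far more 'pressure' from the same policy.
fast, slow = simulate([0.5, 0.1])
print(round(fast, 2), round(slow, 2))  # 1.0 vs roughly nine times that
```

Same inflow, roughly a ninefold difference in pressure: the negative-real-rate episode in Ireland and Spain in miniature.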

In this respect, the engineers of the euro need to figure out how to accommodate a single
monetary policy within different economies. The Commission should
consider the inveterate Greek crisis as a warning to resist the urge to
further expand euro membership and smother its adherents with fiscal
uniformity. Instead, they must cleverly redevelop the way it is
engineered so that it is harmonious within its existing member economies
and the transmission mechanisms within these.

There are at least two ways in which this can be done, the most important being fiscal policy. The
principal suggestion here is to revise the restrictive set of fiscal
rules to give more emphasis to a framework where national governments
have fiscal flexibility. Importantly, they would be bound by
the advice of independent fiscal councils and through restrictions on
debt issuance, to run fiscal policies that are on balance
countercyclical to euro-zone monetary policy and that are economically
productive rather than political in their aim. Using fiscal policy to
complement monetary policy would help to ensure the euro-zone’s
economies, especially the smaller ones, do not become ‘too hot’ or ‘too
cold’ in the fashion of Bill Clinton’s ‘Goldilocks’ economy.

This would also demand plenty of policy innovation at the macro-prudential
level and euro-zone countries like Ireland could fine tune their
structural recovery in this way. Doing so in a structured way, rather
than, say, populist ‘moral suasion’, makes more sense and would permit
greater policy engagement in how the moving parts in a small open
economy work together. Ideally, a body like the national central bank
should oversee the transmission of lending to households, in a way not
unlike it did before the euro. Additionally, instead of harmonising
fiscal rules, the EU may want to harmonise the processes required to set
up a new business, or create a quicker pan-European approach to this.

There is a second engineering challenge facing the euro version of the
Phillips machine, which is that very different levels and forms of
indebtedness mean that a common monetary policy flows in distinct paths
across euro-zone economies, or in some cases doesn’t flow at all. Here
the EU might look to China where the authorities have ordained a program
of financial restructuring that privatises much of the debt held by
local and national authorities....MORE

After more than 40 years of observing inventive artificial neural
systems at work (i.e., Creativity Machines), I have found that they are
susceptible to the very same cognitive pathologies as the brain. In
fact, as such forms of synthetic psychology are pushed toward higher
levels of creativity, they become more likely to exhibit the classic
psychopathologies, such as schizophrenia, manic depression, and various
attention disorders.

Finally getting up the nerve to solo publish within a culture well
outside physics and computer science, I took my chances with Elsevier’s
journal Medical Hypotheses, and was pleasantly surprised. After all,
neurobiology obeys physics. So, if the biologists consider this the
proverbial “spherical chicken in vacuum” then they should reconsider its
power in providing a streamlined, bottom-up perspective on both the
mechanics of seminal cognition and what can go wrong with that process.
True, insanity and creativity have been known to positively correlate
with one another for centuries, but I am offering the nuts-and-bolts
model, using amazingly simple mathematical principles.

The AI researcher Dr. Stephen Thaler has given an interview recently in
which he claims that his AI research will lead to sentient, cognizant
"creativity machines" within 5 years.

The research continues to accelerate.

Consciousness appears to be more like an intensive rather than extensive
property/behavior of the brain. It’s sort of like the gas law equation,
PV = nRT, with P and T being intensive and n and V extensive. So,
consciousness is intensive, but we as humans deny simpler forms of
consciousness, while fearing the scaled up version attainable via
machine intelligence.
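
The gas-law analogy can be made literal: scale the amount of gas and its volume together and the intensive quantities don't move. A quick check with standard values (the mapping to consciousness is, of course, only the metaphor above):

```python
# Ideal gas law PV = nRT: n and V are extensive (they scale with the
# size of the system); P and T are intensive (they don't).

R = 8.314  # gas constant, J/(mol K)

def pressure(n, volume, temperature):
    """P = nRT / V."""
    return n * R * temperature / volume

p1 = pressure(n=1.0, volume=0.0224, temperature=273.15)  # ~1 atm
p2 = pressure(n=2.0, volume=0.0448, temperature=273.15)  # system doubled

# Doubling n and V together leaves the intensive property P unchanged.
print(abs(p1 - p2) < 1e-6)  # True
```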

An Imagination Engine is a trained artificial neural network that is
stimulated to generate new ideas and plans of action through a very
amazing effect that is an outgrowth of scientific experiments conducted
in 1975 by our founder, Dr. Stephen Thaler. In these initial
experiments, neural networks were trained upon a collection of patterns
representing some conceptual space (i.e., examples of either music,
literature, or known chemical compounds), and then the networks were
internally 'tickled' by randomly varying the connection weights joining
neurons. Astonishingly...MORE
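
The 'tickling' described there is easy to caricature: take a small fixed network, jitter its connection weights with noise, and its output wanders around the trained pattern. A toy sketch of that idea only, not Thaler's actual Imagination Engine:

```python
import math
import random

# Toy version of the weight-perturbation idea: a single neuron stands
# in for a trained network and is repeatedly 'tickled' by adding noise
# to its connection weights, yielding variations on its learned output.
# Illustrative only -- not Thaler's architecture.

def forward(weights, x):
    """Two inputs, one tanh output; weights[2] is the bias."""
    return math.tanh(weights[0] * x[0] + weights[1] * x[1] + weights[2])

base_weights = [0.8, -0.5, 0.1]  # stands in for trained weights
pattern = (1.0, 0.5)             # stands in for a learned concept

random.seed(42)
baseline = forward(base_weights, pattern)
variants = [forward([w + random.gauss(0, 0.2) for w in base_weights],
                    pattern)
            for _ in range(5)]

# The jittered outputs cluster near, but differ from, the original:
print(round(baseline, 3), [round(v, 3) for v in variants])
```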

For the naysayers, the practical results were good enough to land some Department of Defense and NIST funding.
Creative crazies, sounds about right.

BofAML Explains Why The Ag Economy Isn't Likely To Get Much Better In 2017

The fact that farm incomes have come under increasing
pressure over the past couple of years should come as little surprise to
our readers (for those who missed our latest update, see: "Midwest Farm Bubble Continues Collapse As Farm Incomes Expected To Crash In 2017").
Unfortunately, at least according to Bank of America's Global Ag
Chemical team led by Steve Byrne, farmers shouldn't expect a reprieve
any time in the near future.

As BAML points out, the grain commodity farmers of the U.S. are locked in a vicious cycle, the result of which is a perpetually oversupplied market. To summarize the key takeaways, farmers continue to plant so long as cash profits are positive (because depreciation isn't a real cost and who cares about returns on capital anyway...silly finance people) while
yield growth continues to outpace demand growth which leaves markets
perpetually oversupplied and commodity prices well below what would be
required to provide a normalized profit level for farmers.

Meanwhile, since farmers seem to be incapable of unilaterally reducing
supply, an external supply shock (e.g. a weather-related event) seems to
be the only hope of the industry ever normalizing again.

With that, here is a little more detail on the vicious ag cycle per BAML...
Yield growth per acre continues to average 1-2% per annum...

Yields continue to improve with no sign of abatement as seed
technology improves and farmers utilize better information technology
(precision ag) to gain better understanding of acreage and maximize
yield potential. While weather can disrupt yields year-to-year,
directionally yields have improved at a 1-2% CAGR for corn, soy and wheat since 2000.
In our view, this will continue to place deflationary pressure on crop
prices longer-term, particularly given the extent to which global yields
trail yields in more developed ag economies.

...which continues to drive new record highs in production despite an already weak pricing environment.

Global corn production is similarly heading for a new record
high in 2016/17, up 7% YoY and driven mostly by an almost equally big
rise in yields. The US 2016/17 crop that was just harvested
looks especially strong. Concerns over whether ear filling was impeded
by the hot and dry summer weather are now fading as the harvest is done
and the USDA revised up its yield estimate by 1% to 11.01mt/ha in
November. Meanwhile, in LatAm farmers are currently planting for
the 2016/17 harvest and production looks even stronger, up 26% on
presumed yield normalization and exacerbated by a 7% increase in
acreage.

Meanwhile, global corn demand is expected to recover somewhat in 2016/17, but nowhere near the expected 7% supply increase.

Global corn demand growth slowed to just 2% per annum in the past two years, due to a drop in global pork production. Corn
is the staple diet of the world’s more than 1bn pigs. The decline in
pork production was mainly caused by an environmental crackdown in the
Chinese farming sector, and the country’s pork production fell by 3% in
2015 and another 5% likely in 2016.

Then in March 2016, China ended its domestic corn price floor, giving
relief to pig farmers, and corn demand started picking up again. Corn
demand from pig production will continue to rise structurally in the
years to come on the ramp-up of new modern mega farms in Northern China.
Overall global corn demand can recover to 3% growth this market
year (2016/17) and hold up at 2-3% growth annually in the years to
come, in our view. However, we have started to see signs of
slowing feed demand as elevated corn prices have led to substitution to
other feeds, in some instances. Global feed demand levels will be key in
determining the aggregate corn demand picture.

Monday, February 27, 2017

Monday morning at the Mobile World Congress trade show in Barcelona, SoftBank (9984JP) chairman Masayoshi Son gave a keynote to answer the question “Why do I spend so much money?”

Son, who sounds like he’s been taking campaign cues from Donald Trump,
emphasized the huge scale of his efforts. His firm’s investment fund,
called the “Vision Fund,” has $100 billion committed to it, which is
more than the global total of venture capital money of $65 billion. And
he spent $32 billion to acquire ARM Holdings, which makes chips that go into smartphones, cars, and much more.

Why all the spending? Son asked the audience if they knew about “The Singularity,”
the principle that machine intelligence will overtake human
intelligence. Son believes it’s real — not just that, he knows when it’s
happening.

“I calculated 20 years ago that it would be 2018” when the cross-over
happens, he told the packed auditorium. “I recently rechecked the
calculation and it was still 2018. If it’s off by a year or two, I don’t
care.”

“I believe it’s coming, in the next thirty years, so that’s why I’m in a hurry,” he said, “to bring the cash.”

Son went through an explanation of human IQ and machine IQ. Human IQ
is on average 100. He joked that probably his audience was a bit above
that. Then he said geniuses such as Einstein have 200 IQs. Then he said that machine intelligence will have a 10,000 IQ.

Humans have about 30 billion neurons, or binary connections, he said. The billions of transistors inside
chips would seem to be about at the point of overtaking the complexity
of those 30 billion binary connections in the human brain, if his
calculations are right.
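
Son's crossover arithmetic can be reproduced in the rough. Assuming his 30-billion-connection figure and textbook Moore's-law doubling from the Intel 4004's 2,300 transistors (the assumptions are ours; Son didn't publish his method), the crossover lands within a year or so of his 2018:

```python
# Back-of-envelope reproduction of Son's crossover claim. Assumptions:
# Son's 30bn 'binary connections' figure, and transistor counts that
# double every two years from the Intel 4004 (2,300 transistors, 1971).

BRAIN_CONNECTIONS = 30e9

def transistors(year, base_year=1971, base_count=2300, doubling_years=2.0):
    """Projected transistors per chip under idealized Moore's law."""
    return base_count * 2 ** ((year - base_year) / doubling_years)

year = 1971
while transistors(year) < BRAIN_CONNECTIONS:
    year += 1

print(year)  # 2019 -- within Son's "off by a year or two, I don't care"
```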

What happens is that ARM will ship all the chips for that, he said confidently.

“They run 99% of smartphones,” he pointed out. “ARM will ship a
trillion IoT [Internet of Things] chips in the next decade,” he said.
So, Son, who controls Sprint (S), said, “we will have a trillion customers” to connect all that. “Okay?”

Okay. He observed that in a few years’ time, the average pair of sneakers will have more chip IQ than the average human....MUCH MORE

Graphics chipmaker Nvidia (NVDA) saw its shares tumble for the second straight day on Friday, but reassuring comments from Wall Street analysts helped the stock recover some lost ground.

Nvidia stock dropped as much as 4.8% to 95.70 in morning trading on the stock market today. But shares clawed their way to a gain of 1% to 101.46 at the close.

On Thursday, Nvidia plummeted 9.3% to 100.49 after BMO Capital Markets and Nomura cut their ratings and price targets on the stock, saying it was overvalued. The stock sliced through its 50-day moving average, a key support level.

On Friday, Mizuho Securities and UBS reiterated their buy ratings on Nvidia, saying the pullback had created a buying opportunity.

IBD'S TAKE: Nvidia stock has shown signs of hitting a climax top. It touched an all-time high of 120.92 on Feb. 7, and at that point had been up 358% in the past year.

"We would be buyers with the recent 15% pullback in NVDA post earnings," Mizuho analyst Vijay Rakesh said in a report Friday. "Gaming and Deep Learning will continue to be long-term sustainable trends as new (game) titles get released and enterprise transforms from agility and speed to a smarter, efficient, predictive enterprise."

"We keep a buy rating on Nvidia as we expect data center sales to more than double over the next couple years to $2 billion-plus and a 15% growth CAGR in the core gaming GPU business," UBS's Chin said in a report.

A Model of Technological Unemployment, Oxford University Discussion Paper No. 819, February 2017

In the past 15 years a new area of the economics literature has emerged to
explore the consequences of technological change on the labour market.
This is the ‘task-based’ literature. The account that emerges is
optimistic about the prospects for labour in the 21st century. The
central argument of this paper is that this particular optimism is
unjustified.

In this paper I try to make two contributions. The
first is to show that the literature’s current conception of how the
latest machines operate and the capabilities that this implies -- what
is known as the ‘ALM hypothesis’ -- is incorrect. Put simply, the ALM
hypothesis implies that while machines can perform ‘routine’ tasks, they
cannot perform ‘non-routine’ tasks. Tasks are ‘routine’ when human
beings find it straightforward to explain how they perform them (rather
than because they are boring or dull). The ALM hypothesis argues that
because human beings cannot easily articulate the rules they follow when
performing ‘non-routine’ tasks, it is therefore hard to write a set of
rules for a machine to follow to perform these tasks. As a result, it is
claimed that these ‘non-routine’ tasks cannot readily be automated.

The problem is that the ALM hypothesis is dated. It assumes that the only
way to automate a task is to understand, articulate, and replicate the
way a human being performs that task. But recent technological advances
in processing power, data retrieval and storage capabilities, and
algorithm design mean that this constraint no longer holds. It is no
longer necessary to replicate human thinking and reasoning processes in
order to outperform human beings....MORE
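
The paper's point, in miniature: modern systems don't need articulated rules, they generalize from examples. A toy example-based 'learner' (a one-nearest-neighbour classifier, our illustration, not the paper's model):

```python
# A rule-free learner in miniature: 1-nearest-neighbour classification.
# Nobody writes down *why* the examples carry their labels; the machine
# simply copies the label of the closest observed case.

def nearest_neighbor(examples, query):
    """examples: list of ((x, y), label) pairs. Returns the label of
    the example closest (squared Euclidean distance) to `query`."""
    def dist2(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    return min(examples, key=lambda ex: dist2(ex[0], query))[1]

# Observed decisions stand in for human performances of a task whose
# rules the humans themselves could not articulate.
examples = [((0, 0), "reject"), ((0, 1), "reject"),
            ((5, 5), "approve"), ((6, 5), "approve")]

print(nearest_neighbor(examples, (1, 1)))  # reject
print(nearest_neighbor(examples, (5, 4)))  # approve
```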

Dr Camilla Clark recruited 48 patients with frontotemporal dementia from
the dementia clinic at University College London. The patients' friends
and family were asked to rate their friend or relative's enjoyment of
different kinds of comedy.

These included slapstick comedy such as Mr Bean, satirical comedy such as Yes, Minister or absurdist comedy such as Monty Python
and examples of inappropriate humour. Dr Clark found that the dementia
patients preferred slapstick humour to satirical, when compared to 21
healthy people of a similar age....MORE

Speculators swell appetite for metal as price of scarce material up by 50%

Suppliers to Tesla and other
electric-car makers are scrambling to secure shipments of the key
battery material cobalt after a group of hedge funds amassed a large
stockpile of the scarce metal.

In a bold wager on higher prices, half a dozen funds, including Switzerland-based Pala Investments and China’s Shanghai
Chaos, have purchased and stored an estimated 6,000 tonnes of cobalt,
worth as much as $280 million, according to the investors, traders and
analysts.

The stockpile is equivalent to 17
per cent of last year’s production of the metal. Increasing use of
batteries with chemical forms of the metal by Chinese electric-car
makers, alongside ambitious plans by the likes of Elon Musk’s Tesla,
have created a fertile backdrop for speculators hoping to profit from
swelling appetite for cobalt, which boosts lithium-ion batteries’ power.

They are betting that demand for
electric vehicles will exceed market expectations and push up the
price as battery makers such as Panasonic, which makes battery cells for Tesla, rush to lock up supplies of the material.

Global demand

Global demand is already expected
to outstrip supply this year by 900 tonnes, according to commodity
consultancy CRU. It estimates demand will grow 20 per cent a year for
the next five years, thanks to buying from the electric car industry
whose production grew by 41 per cent last year and which accounts for
half of annual consumption.
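
CRU's figures compound quickly. Taking the 20% annual demand growth at face value, and inferring last year's production from the article's own "6,000 tonnes equals 17 per cent of production" (our inference from the text's numbers, not CRU data):

```python
# What the article's numbers imply. Inputs: a 6,000t stockpile equal to
# 17% of last year's production, and CRU's 20%/yr demand growth for five
# years. Supply response is deliberately ignored in this sketch.

def grow(value, rate, years):
    """Compound `value` at `rate` per year for `years` years."""
    return value * (1 + rate) ** years

production = 6000 / 0.17          # implied production: ~35,300 tonnes
demand_in_5y = grow(production, 0.20, 5)

print(round(production))          # ~35294
print(round(demand_in_5y))        # ~87823: roughly 2.5x today's supply
```

Five years of 20% growth is a factor of about 2.5, which is why a stockpile worth 17 per cent of one year's output is a meaningful bet.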

The price of cobalt, mined almost exclusively in the Democratic Republic of Congo,
is already up by more than 50 per cent since November, to $21 a
pound. Prices rose to a peak of about $50 a pound in 2007, before
dropping to a low of $10 in 2015....MORE

We have a silly habit of dropping breadcrumbs as we journey along the way, here's the intro to that 2016 piece:

I figured there were at best two thousand people in the whole world who
knew or cared about the back story and real import of what was going on
so I'd just drop it as an Easter egg for the cognoscenti and other
assorted electric vehicle/conflict mineral/African warlord/Elon
Musk/extractive industry/Génocidaire hunter/U.S. political corruption watchers to find.

Well now that cat's out of the bag.

Big kudos to the FT's Henry Sanderson for recognizing one hell of a
story and a small request for the Financial Times: Can you tell us what
the old ENRC is up to these days?

"...As we get closer to being a profitable company, we will be able to
afford more and more fun things. For example, as I mentioned at the last
company talk, we are going to hold a really amazing party once Model 3
reaches volume production later this year. There will also be little
things that come along like free frozen yogurt stands scattered around
the factory and my personal favorite: a Tesla electric pod car roller
coaster (with an optional loop the loop route, of course!) that will
allow fast and fun travel throughout our Fremont campus, dipping in and
out of the factory and connecting all the parking lots. It’s going to
get crazy good 😊"

A year ago, Lawrence Summers’ perceptive warnings
about the possibility of secular stagnation in the world economy were
dominating global markets. China, Japan and the Eurozone were in
deflation, and the US was being dragged into the mess by the rising
dollar. Global recession risks were elevated, and commodity prices
continued to fall. Fixed investment had slumped. Productivity growth and
demographic growth looked to be increasingly anemic everywhere.

Estimates of the equilibrium real interest rate in many economies were
being marked down. It seemed possible that the world economy would fall
into a “Japanese trap”, in which nominal interest rates would be
permanently stuck at the zero lower bound, and would therefore not be
able to fall enough to stimulate economic activity.

Just when the sky seemed to be at its darkest, the outlook suddenly
began to improve. Global reflation replaced secular stagnation as the
theme that dominated investor psychology, especially after Donald
Trump’s election in November. Why has secular stagnation lost its mass
appeal, and has it disappeared forever? Was it all a case of crying
wolf?

Lawrence Summers has always made it clear that in his mind secular
stagnation was a hypothesis, not a proven reality, especially in the US.
He and others have argued that the combination of very low global GDP
growth, alongside falling real interest rates, could be caused by two
factors: (i) inadequate global demand, stemming from low business
investment, high savings rates in Asia, wide disparity in income
distribution and rising risk aversion; and (ii) inadequate global
supply, stemming from falling productivity growth, and slowing growth in
the labour force....MORE

There's a good chance we'll be referring back to this piece so I wanted to have it easily available.
From Backchannel:

The Applied Machine Learning group helps Facebook see, talk, and understand. It may even root out fake news.

When asked to head Facebook’s
Applied Machine Learning group — to supercharge the world’s biggest
social network with an AI makeover — Joaquin Quiñonero Candela
hesitated.

It was not that the Spanish-born scientist, a self-described “machine
learning (ML) person,” hadn’t already witnessed how AI could help
Facebook. Since joining the company in 2012, he had overseen a
transformation of the company’s ad operation, using an ML approach to
make sponsored posts more relevant and effective. Significantly, he did
this in a way that empowered engineers in his group to use AI even if
they weren’t trained to do so, making the ad division richer overall in
machine learning skills. But he wasn’t sure the same magic would take
hold in the larger arena of Facebook, where billions of people-to-people
connections depend on fuzzier values than the hard data that measures
ads. “I wanted to be convinced that there was going to be value in it,”
he says of the promotion.

Despite his doubts, Candela took the post. And now, after barely two years, his hesitation seems almost absurd.

How absurd? Last month, Candela addressed an audience of engineers at a New
York City conference. “I’m going to make a strong statement,” he warned
them. “Facebook today cannot exist without AI. Every time you use Facebook or Instagram or Messenger, you may not realize it, but your experiences are being powered by AI.”

Last November I went to Facebook’s mammoth headquarters in Menlo Park to
interview Candela and some of his team, so that I could see how AI
suddenly became Facebook’s oxygen. To date, much of the attention around
Facebook’s presence in the field has been focused on its world-class
Facebook Artificial Intelligence Research group (FAIR), led by renowned
neural net expert Yann LeCun. FAIR, along with competitors at Google,
Microsoft, Baidu, Amazon, and Apple (now that the secretive company
is allowing its scientists to publish), is one of the preferred
destinations for coveted grads of elite AI programs. It’s one of the top
producers of breakthroughs in the brain-inspired digital neural
networks behind recent improvements in the way computers see, hear, and
even converse. But Candela’s Applied Machine Learning group
(AML) is charged with integrating the research of FAIR and other
outposts into Facebook’s actual products—and, perhaps more importantly,
empowering all of the company’s engineers to integrate machine learning
into their work.

Because Facebook can’t exist without AI, it needs all its engineers to build with it.

***

My visit occurs two days after
the presidential election and one day after CEO Mark Zuckerberg
blithely remarked that “it’s crazy” to think that Facebook’s circulation
of fake news helped elect Donald Trump. The comment would turn out to be
the equivalent of driving a fuel tanker into a growing fire of outrage
over Facebook’s alleged complicity in the orgy of misinformation that
plagued its News Feed in the last year. Though much of the controversy
is beyond Candela’s pay grade, he knows that ultimately Facebook’s
response to the fake news crisis will rely on machine learning efforts
in which his own team will have a part.

But to the relief of the PR person sitting in on our interview, Candela
wants to show me something else—a demo that embodies the work of his
group. To my surprise, it’s something that performs a relatively
frivolous trick: It redraws a photo or streams a video in the style of
an art masterpiece by a distinctive painter. In fact, it’s reminiscent
of the kind of digital stunt you’d see on Snapchat, and the idea of
transmogrifying photos into Picasso’s cubism has already been
accomplished.

“The technology behind this is called neural style transfer,” he explains.
“It’s a big neural net that gets trained to repaint an original
photograph using a particular style.” He pulls out his phone and snaps a
photo. A tap and a swipe later, it turns into a recognizable offshoot
of Van Gogh’s “The Starry Night.” More impressively, it can render a
video in a given style as it streams. But what’s really different, he
says, is something I can’t see: Facebook has built its neural net so it
will work on the phone itself.
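
In the classic formulation of neural style transfer (Gatys et al.), "style" is captured by Gram matrices of a network's feature maps, and an image is optimized until its Gram matrices match the style image's. A minimal pure-Python sketch of that statistic only, not Facebook's on-device implementation:

```python
# Gram-matrix style statistic from classic neural style transfer.
# A feature map with C channels over N spatial positions is summarized
# by the C x C matrix of channel-to-channel correlations; matching
# these matrices is what makes a generated image 'stylistically' close.

def gram_matrix(features):
    """features: list of C channels, each a list of N activations.
    Returns G where G[i][j] = sum_k f_i[k] * f_j[k] / N."""
    C, N = len(features), len(features[0])
    return [[sum(fi * fj for fi, fj in zip(features[i], features[j])) / N
             for j in range(C)] for i in range(C)]

def style_loss(g_style, g_generated):
    """Mean squared difference between two Gram matrices."""
    C = len(g_style)
    return sum((g_style[i][j] - g_generated[i][j]) ** 2
               for i in range(C) for j in range(C)) / (C * C)

# Two tiny 2-channel, 4-position feature maps (made-up activations).
style_feats = [[1.0, 0.0, 1.0, 0.0], [0.0, 1.0, 0.0, 1.0]]
gen_feats   = [[1.0, 0.2, 0.9, 0.1], [0.1, 1.0, 0.0, 0.8]]

loss = style_loss(gram_matrix(style_feats), gram_matrix(gen_feats))
print(loss)  # small but nonzero; optimization would drive it toward 0
```

In the real thing the feature maps come from a trained convolutional network and the loss is minimized by gradient descent on the generated image's pixels.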

That isn’t novel, either — Apple has previously bragged
that it does some neural computation on the iPhone. But the task was
much harder for Facebook because, well, it doesn’t control the hardware.
Candela says his team could execute this trick because the group’s work
is cumulative — each project makes it easier to build another, and
every project is constructed so that future engineers can build similar
products with less training required — so stuff like this can be built
quickly. “It took eight weeks from us to start working on this to the
moment we had a public test, which is pretty crazy,” he says....MORE

Now that the world has become addicted to portable
electronics, billions of people have come to see the companies providing
these gadgets as the most innovative, and the people who head those
companies as the most exalted, of all time. “Genius” is a starter category in this discussion.
But clever and appealing though today’s electronic gadgets may be, to
the historian they are nothing but the inevitable fifth-order
elaborations of two fundamental ideas: electromagnetic radiation, the
theory of which was formulated by James Clerk Maxwell in the 1860s, and miniaturized fabrication, which followed Richard Feynman’s 1959 dictum that “there’s plenty of room at the bottom.”

Maxwell was a true genius. The history of science offers few examples
of work as brilliant as unifying electricity, magnetism, and light as
aspects of a single phenomenon: electromagnetic waves. As Max Planck put
it, “in doing so he achieved greatness unequalled.”

In late 1879 and early 1880, David Edward Hughes actually transmitted
and received those invisible signals but did not publish his results.
And in 1883, Thomas Edison also came very close to the actual use of
such waves with his patent for an apparatus “showing conductivity of
continuous currents through high vacuo.” He displayed it in 1884 at the International Electrical Exhibition in Philadelphia, then abandoned it as a mere curiosity.

That is why the second-order elaboration of electromagnetism came
only between 1886 and 1888, when Heinrich Hertz deliberately generated
and received electromagnetic waves whose frequencies he accurately
placed “in a position intermediate between the acoustic oscillations of
ponderable bodies and the light-oscillations of the ether.”

The third-order elaboration began with the first broadcasts, by Oliver
J. Lodge and Alexander S. Popov, in 1894 and 1895, and it continued with
the first transatlantic transmission, by Guglielmo Marconi in 1901; the
first long-distance transmission of voice and music, by Reginald A.
Fessenden in 1906; and the invention of vacuum tubes—John A. Fleming’s
diode in 1904, Greenleaf W. Pickard’s point-contact diode (cat’s whisker) in 1906, and Lee de Forest’s triode in 1907....MORE

We show from a simple model that a country's technological development can be
measured by the logarithm of the number of products it makes. We show that much
of the income gap among countries is due to differences in technology, as
measured by this simple metric. Finally, we show that the so-called Economic
Complexity Index (ECI), a recently proposed measure of collective knowhow, is
in fact an estimate of this simple metric (with correlation above 0.9).

1 Introduction

The standard approach to economic growth and development simplifies a country’s whole production to three aggregates (GDP, labor and capital), thus disregarding its complexity. Complexity of production has to do with the diversity of products a country makes, which is itself a manifestation of the diversity of productive knowledge by which many products can be made, namely the various skills and technical knowledge applied by workers or automated by machines. Products differ precisely by the amount of knowledge involved in their production, which goes from zero for natural resources sold in the raw to maximum values for highly complex products such as aircraft.

It is along this line of thought that a literature emerged, notably by Hausmann and Hidalgo, which links complexity of production to economic development [1-3]. Rich countries make various products, especially complex products, while poor countries make fewer and more rudimentary ones. In fact the mere number of products a country makes, or its diversification, indicates its development. Though basic, this opposes the long tradition in economics that links international prosperity to the specialization of countries. Hausmann and Hidalgo propose a more elaborate metric called the Economic Complexity Index (ECI) to quantify the amount of productive knowledge (or knowhow) that underlies a country’s production. ECI is therefore, to use a more traditional term, a measure of a country’s technology, if technology is taken to mean precisely the sum of practical knowledge within a society. Similarly, we can define the technological sophistication of a product by the amount of knowhow involved in its production. This is measured by the Product Complexity Index (PCI) in the authors’ theory. In fact ECI and PCI are jointly computed, based on the idea that an economy’s technology is reflected in the products it makes and, vice versa, a product reflects the technologies of the economies making it. A reformulation of the same idea was suggested by Caldarelli et al., which we shall also consider [4-6]. There, the metrics are named Country Fitness and Product Complexity.

Our goal in this paper is to propose a simpler and more natural measure of technology: the logarithm of diversification. This metric derives from the following basic combinatorics. First, a product is but some transformed natural resources, namely some raw materials to which is applied a set of knowhow to turn it into a valuable outcome. Second, and more fundamentally, knowledge comes in discrete units (or ‘bits’) that combine to make more and more sophisticated knowledge.

Therefore with k units of knowhow, a country can potentially make 2^k products, whose sophistications range from zero for natural resources (sold in the raw) to k. Thus, we can estimate the total amount of knowhow k involved in a country’s production by its log-diversification (up to a scaling constant). Only, bits of knowledge don’t combine randomly: a collection of ideas is productively relevant only when it forms a coherent set of productive knowledge (namely when the ideas can be put together to transform a raw material).

So we shall develop a more realistic (yet still simple) model of this combinatorics of knowhow. The point remains, however: log-diversification is the natural measure of technology. We show that this simple metric explains much of the income differences among countries. Finally, we show theoretically and empirically that ECI is in fact an estimate of this metric, in standardized form, while Fitness is linked to it by construction. But first we develop a simple conceptual framework and describe the data used throughout...
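The paper's counting argument fits in a few lines of code: with k bits of knowhow a country can potentially make 2^k products, so k is recovered as the log of diversification. The diversification counts below are hypothetical, chosen only to make the estimate concrete; they are not data from the paper:

```python
import math

# Hypothetical diversification counts (number of distinct products a
# country makes); illustrative only, not data from the paper.
diversification = {"country_A": 1, "country_B": 32, "country_C": 1024}

# With k units of knowhow a country can potentially make 2**k products,
# so the knowhow stock is estimated (up to a scaling constant) by the
# logarithm of diversification.
knowhow = {c: math.log2(d) for c, d in diversification.items()}
print(knowhow)  # {'country_A': 0.0, 'country_B': 5.0, 'country_C': 10.0}
```

A country selling only raw natural resources (one "product") scores zero knowhow, while each doubling of the product range adds one unit, which is the sense in which the metric is logarithmic.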

The first time we explained that one of the biggest risks
facing a world in which the dollar is the reserve currency is a global
USD shortage, was in mid-2009, when we wrote "How The Federal Reserve Bailed Out The World."

At the time, the IMF calculated that just ahead of the
financial crisis, "major European banks’ US dollar funding gap had
reached $1.0–1.2 trillion by mid-2007. Until the onset of the crisis,
European banks had met this need by tapping the interbank market ($432
billion) and by borrowing from central banks ($386 billion), and used FX
swaps ($315 billion) to convert (primarily) domestic currency funding
into dollars." The IMF then extrapolated that "were all liabilities to
non-banks treated as short-term funding, the upper-bound estimate would
be $6.5 trillion."

Since then the shortage, which some have dubbed a potential
multi-trillion dollar margin call, has only grown, and it became a prominent
issue back in March of 2015, when this phenomenon was used to explain
why the cross-currency swap had plunged to multi-year lows. As JPM explained at the time,
"the fx basis reflects the relative supply and demand for dollar vs.
foreign currency funds and a very negative basis currently points to
relative shortage of USD funding or relative abundance of funding in
other currencies. Such supply and demand imbalances can create big
shifts in the fx basis away from its actuarial value of zero."
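JPM's description can be turned into a stylized calculation: the basis is the spread that reconciles the FX forward with the two interest rates under covered interest parity. The numbers below are invented for illustration only; they are not market data:

```python
# Stylized cross-currency basis under covered interest parity (CIP).
# All numbers are invented for illustration; they are not market data.
spot = 1.10       # USD per EUR, spot
forward = 1.13    # USD per EUR, 1-year forward
r_usd = 0.02      # 1-year USD interest rate
r_eur = 0.00      # 1-year EUR interest rate

# CIP with a basis b on the non-dollar leg:
#   forward / spot = (1 + r_usd) / (1 + r_eur + b)
# Solving for b:
basis = spot * (1 + r_usd) / forward - 1 - r_eur

# A negative basis means dollars are expensive to borrow via FX swaps,
# i.e. a relative shortage of USD funding.
print(round(basis * 1e4, 1), "bp")  # -70.8 bp
```

With no funding imbalance the basis sits at its "actuarial value of zero"; the very negative readings of 2015 are exactly the dollar-shortage signal the article describes.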

Fast forward a year and
a half later, when none other than the Bank for International
Settlements, or the "central banks' central bank," warned last November that
it was no longer the VIX that was the widely accepted barometer of
market "fear", it was now the dollar's turn to become the global fear
gauge: "just as the VIX index was a good summary measure of the
price of balance sheet before the crisis, so the dollar has become a
good measure of the price of balance sheet after the crisis. The mantle of the barometer of risk appetite and leverage has slipped from the VIX, and has passed to the dollar."

Now, in an exhaustive, 70 minute interview, submitted by Patrick Ceresna at MacroVoices.com,
another prominent analyst who has been closely tracking the global
dollar shortage, Alhambra Partners' Jeffrey Snider sat down with Erik
Townsend to explain - once again - why this is such a critical topic,
even if it comes at a time of unprecedented global complacency (it's
amazing what record high stock prices will do to concerns - or lack
thereof - about the future).

As Snider puts it, while most other risk indicators imply smooth sailing, "there is 'something' weird going on" when it comes to dollar funding and global imbalances of the world's reserve currency, i.e., dollar shortage.

In the interview, among the many topics covered, are

Understanding the Eurodollar Money Market

Swap Spreads and Interbank Hierarchy

Dimensions in the Eurodollar Futures and Eurodollar Money Supply

Why does the World Need So Many Dollars?

How the Eurodollar market supplanted the Bretton Woods System

U.S. Dollar and the Dollar Funding Gap

Reflation Trade Debunked

Interest Rates Trapped

Failing Global Currency System

While we urge readers to listen to the full interview below, here are some of the highlights, starting with "why the Dollar shortage is a symptom of an inherently unstable system."

As Snider explains, "the dollar shortage isn't so much the
shortage per se, it’s the fact that it's a symptom of what is an
inherently unstable system." He notes that "the reason banks are
withdrawing from the system is that it just is no longer tenable" and
"so there has to be some kind of – whether you want to look at it like
another Bretton Woods – conference, a global monetary system, a global
monetary get together where people start to analyze solutions to the
problem as they are rather than keep trying to apply band aids that are
not going to work. "

But, he concludes, "step one of that task is to actually
recognize the problem as it is, and so doing more stimulus or doing more
QE isn't going to solve anything; it isn't going to do anything, just like prior
QEs and prior stimulus haven't done anything either, because the problem
is an unstable system."...MUCH MORE

Readers who have followed the Uber story over the last few years, especially readers of Izabella Kaminska at FT Alphaville, know that despite posting on the lurid details from time to time (us more than she), our focus, and hers, has been on the business/finance/econ aspects of Uber, although the political economy and other social science stuff can't help appearing, because what Kalanick built was in his own image.

"From all our legends, mythology, and history (and who is to know
where mythology leaves off and history begins – or which is which), the
first radical known to man who rebelled against the establishment and
did it so effectively that he at least won his own kingdom – Lucifer."

-Page ix of Rules for Radicals.

That's Alinsky, seemingly quoting himself, and the way I read it he's
saying the Devil challenged authority and won his own kingdom;
that emulating the methods of Satan, using any means fair or foul, including lying, cheating and stealing, is the way to get riches and power.

And that was the moment when I stopped thinking of Uber as frat boys
making stupid boob jokes and started thinking of them as nasty little
political operatives.

If you're into this kind of stuff, Rule 12 appears to be the approach Uber management favors:

RULE 12: "Pick the target, freeze it, personalize it, and polarize it."
Cut off the support network and isolate the target from sympathy. Go
after people and not institutions; people hurt faster than institutions.
(This is cruel, but very effective. Direct, personalized criticism and
ridicule works.)

I should note we are fans of Alinsky's tactical brilliance, oftentimes struggling to resist employing rule #5:

#5 Ridicule is man’s most potent weapon. It’s hard to counterattack ridicule, and it infuriates the opposition, which then reacts to your advantage....

So yeah, although the focus has been on the quantifiable, the soft science stuff is there as well and may be the thing that takes Uber down. At least that's the charitable interpretation, that Kalanick, blinded by hubris, didn't see the flaws in the business plan.

The less favorable interpretation is that he knew all along and kept pushing in the hope that magic would happen.
That would be a fraud.

Anyhoo, here's one of the best automotive websites on the net talking Uber.

From Jalopnik:

If there is one quote that sums up the ethos of Uber,
it might be this cut from the company’s firebrand CEO Travis Kalanick:
“Stand by your principles and be comfortable with confrontation. So few
people are, so when the people with the red tape come, it becomes a
negotiation.” But after a month marked by one disaster after another,
it’s hard to see how Uber’s defiant, confrontational attitude hasn’t
blown up in its face. And those disasters mask one key, critical issue:
Uber is doomed because it can’t actually make money.

After a discombobulated 2016, in which Uber burned through more than
$2 billion, amid findings that rider fares only cover roughly 40 percent
of a ride, with the remainder subsidized by venture capitalists, it’s
hard to imagine Kalanick could take the company public at its stunning
current valuation of nearly $70 billion.

Yet even when those factors are removed, it’s becoming more evident
that Uber will collapse on its own. Barring a drastic shift in the
company’s business—an implausible rollout of self-driving car fleets across the U.S., an increase of fares
by three-fold, or a complete monopolization of the taxi and
ride-hailing markets—Uber’s lifeline is shrinking. Its business model
could collapse if one court case, and there are many, goes against it. Or perhaps more pressing, if it simply runs out of cash.
That
Kalanick quote about confrontation may be as innocuous as a random
sound bite, but it’s representative of the ride-hailing giant’s
methodology since its founding in 2009: a perpetual resistance to
regulatory oversight; a belief that, ultimately, an unfettered market is
the key to prosperity.

At first glance it seems like Kalanick’s libertarian ideals have paid off. Most recently valued at a reported $69 billion, Uber has captured a majority of the ground transportation market and flipped the taxi industry—a sector Kalanick once famously and snidely referred to as the Big Taxi Cartel—on its head. His philosophy mirrors the mindset of one of his favorite authors, the laissez-faire Ayn Rand. In 2012, Kalanick proffered
that Uber’s battle against government regulations has an “uncanny
resemblance” to the Randian philosophy. A billionaire fighting The
System—and prevailing. It’s a good story for those who find truth in Atlas Shrugged.

Uber’s long had skeptics, and it’s not innovative to paint Kalanick,
40, as the boogeyman of Silicon Valley, where unseemly savants exist in
vast supply.

The precarious moment in the company’s eight-year history falls on
Kalanick’s lap. It’s his baby after all—a startup founded on seemingly
nothing more than a vague idea, without much regard for the workforce to
make it possible, or even a clear idea of what business model it
actually wants to pursue. Uber has jumped from one idea to the next:
UberX, UberEats, autonomous cars, and now flying cars, of all things.

“Instead of just focusing on being a good taxi company for the digital age… it’s blowing all sorts of money [on] self-driving cars and China and now India. The company just so much reflects the megalomania of Travis Kalanick and whatever he thinks he’s doing.”

The impact of Uber’s death would probably be as much a rebuke of Kalanick’s vision, one running on a scatterbrained dream rather than a solid business model and philosophy, as any you could muster.

It would also be devastating for some. The livelihood of 11,000 employees across the world
rests on Kalanick’s decision to submit to that philosophy—which, at its
core, is a ruthless way of doing business. At the very least, drivers
in the pre-Uber market could earn a decent living. Conversely, for
example, Uber drivers taking advantage of a new “vehicle solution” pilot
program in Boston — renting cars by the hour through Zipcar — will earn less than Massachusetts’ minimum wage. How innovative.

The Contractor Problem
One
of the biggest issues that has left Uber’s business model hanging in
the balance is its resistance to classifying its drivers—there are reportedly 600,000
in the U.S.—as employees, not contractors. If Uber is a house of cards,
this is a key part of the foundation that, once removed, would demolish
the structure.

Indeed, the company has said
reclassifying drivers could “force Uber to restructure its entire
business model.” The result of its opposition to readjust has been
entirely expected. Without the perks and protections that an employee may enjoy—health care, benefits, gasoline and work reimbursements, vehicle maintenance, all of which could reportedly total as much as $730 million—complaints
from drivers have piled up, ranging from low pay to new services like
UberEats (a loathed food delivery service that’s reportedly
set to lose over $100 million annually) and UberPOOL, its carpool
option which increases the company’s take per-ride, lowers the take-home
pay for drivers, and is understood to be quite a drag for drivers and
passengers alike. Drivers themselves said as much in a recent, disastrous question-and-answer session with Uber’s president....MUCH MORE

If interested, we have an awful lot of posts on Uber; here's the Google search of the site:

These
technologies all have staying power. They will affect the economy and
our politics, improve medicine, or influence our culture. Some are
unfolding now; others will take a decade or more to develop. But you
should know about all of them right now.