If agriculture is to continue to feed the world, it needs to become more like manufacturing, says Geoffrey Carr. Fortunately, that is already beginning to happen

TOM ROGERS is an almond farmer in Madera County, in California’s Central Valley. Almonds are delicious and nutritious. They are also lucrative. Californian farmers, who between them grow 80% of the world’s supply of these nuts, earn $11 billion from doing so. But almonds are thirsty. A calculation by a pair of Dutch researchers six years ago suggested that growing a single one of them consumes around a gallon of water. This is merely an American gallon of 3.8 litres, not an imperial one of 4.5 litres, but it is still a tidy amount of H2O. And water has to be paid for.

Technology, however, has come to Mr Rogers’s aid. His farm is wired up like a lab rat. Or, to be more accurate, it is wirelessed up. Moisture sensors planted throughout the nut groves keep track of what is going on in the soil. They send their results to a computer in the cloud (the network of servers that does an increasing amount of the world’s heavy-duty computing) to be crunched. The results are passed back to the farm’s irrigation system—a grid of drip tapes (hoses with holes punched in them) that are filled by pumps. The system resembles the hydroponics used to grow vegetables in greenhouses. Every half-hour a carefully calibrated pulse of water based on the cloud’s calculations, and mixed with an appropriate dose of fertiliser if scheduled, is pushed through the tapes, delivering a precise sprinkling to each tree. The pulses alternate between one side of the tree trunk and the other, which experience has shown encourages water uptake. Before this system was in place, Mr Rogers would have irrigated his farm about once a week. With the new little-but-often technique, he uses 20% less water than he used to. That both saves money and brings kudos, for California has suffered a four-year-long drought and there is social and political, as well as financial, pressure to conserve water.

Mr Rogers’s farm, and similar ones that grow other high-value but thirsty crops like pistachios, walnuts and grapes, are at the leading edge of this type of precision agriculture, known as “smart farming”. But it is not only fruit and nut farmers who benefit from being precise. So-called row crops—the maize and soyabeans that cover much of America’s Midwest—are being teched up, too. Sowing, watering, fertilising and harvesting are all computer-controlled. Even the soil they grow in is monitored to within an inch of its life.
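For readers who want to see the mechanics, a minimal sketch of such a little-but-often controller appears below. It is illustrative only: the target moisture level, the litres-per-deficit figure and the emitter flow rate are assumptions, not Mr Rogers’s actual settings, and a real system would fold in weather forecasts and fertiliser dosing as well.

```python
# A sketch of a little-but-often irrigation controller, assuming made-up
# values for the target soil moisture, the litres needed per unit of
# deficit and the drip-emitter flow rate.

TARGET_MOISTURE = 0.32   # volumetric water content the grower aims for (assumed)
LITRES_PER_POINT = 2.0   # litres per tree per 0.01 of moisture deficit (assumed)
EMITTER_FLOW_LPH = 8.0   # drip-tape flow per tree, litres per hour (assumed)

def pulse_minutes(moisture_reading: float) -> float:
    """Convert one soil-moisture reading into a pulse length for one tree."""
    deficit = max(0.0, TARGET_MOISTURE - moisture_reading)
    litres_needed = (deficit / 0.01) * LITRES_PER_POINT
    return min(30.0, 60.0 * litres_needed / EMITTER_FLOW_LPH)  # cap at the half-hour slot

def schedule(readings: dict[str, float], half_hour: int) -> dict[str, tuple[str, float]]:
    """Each half-hour, water the side of every tree opposite the last pulse."""
    side = "east" if half_hour % 2 == 0 else "west"
    return {tree: (side, pulse_minutes(m)) for tree, m in readings.items()}

print(schedule({"row3-tree17": 0.310, "row3-tree18": 0.315}, half_hour=5))
```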

People will want to eat better than they do now

Farms, then, are becoming more like factories: tightly controlled operations for turning out reliable products, immune as far as possible from the vagaries of nature. Thanks to better understanding of DNA, the plants and animals raised on a farm are also tightly controlled. Precise genetic manipulation, known as “genome editing”, makes it possible to change a crop or stock animal’s genome down to the level of a single genetic “letter”. This technology, it is hoped, will be more acceptable to consumers than the shifting of whole genes between species that underpinned early genetic engineering, because it simply imitates the process of mutation on which crop breeding has always depended, but in a far more controllable way.

Understanding a crop’s DNA sequence also means that breeding itself can be made more precise. You do not need to grow a plant to maturity to find out whether it will have the characteristics you want. A quick look at its genome beforehand will tell you.

Such technological changes, in hardware, software and “liveware”, are reaching beyond field, orchard and byre. Fish farming will also get a boost from them. And indoor horticulture, already the most controlled and precise type of agriculture, is about to become yet more so.

In the short run, these improvements will boost farmers’ profits, by cutting costs and increasing yields, and should also benefit consumers (meaning everyone who eats food) in the form of lower prices. In the longer run, though, they may help provide the answer to an increasingly urgent question: how can the world be fed in future without putting irreparable strain on the Earth’s soils and oceans? Between now and 2050 the planet’s population is likely to rise to 9.7 billion, from 7.3 billion now. Those people will not only need to eat, they will want to eat better than people do now, because by then most are likely to have middling incomes, and many will be well off.

The Food and Agriculture Organisation, the United Nations’ agency charged with thinking about such matters, published a report in 2009 which suggested that by 2050 agricultural production will have to rise by 70% to meet projected demand. Since most land suitable for farming is already farmed, this growth must come from higher yields. Agriculture has undergone yield-enhancing shifts in the past, including mechanisation before the second world war and the introduction of new crop varieties and agricultural chemicals in the green revolution of the 1950s and 1960s. Yet yields of important crops such as rice and wheat have now stopped rising in some intensively farmed parts of the world, a phenomenon called yield plateauing. The spread of existing best practice can no doubt bring yields elsewhere up to these plateaus. But to go beyond them will require improved technology.

This will be a challenge. Farmers are famously and sensibly sceptical of change, since the cost of getting things wrong (messing up an entire season’s harvest) is so high. Yet if precision farming and genomics play out as many hope they will, another such change is in the offing.

ONE way to view farming is as a branch of matrix algebra. A farmer must constantly juggle a set of variables, such as the weather, his soil’s moisture levels and nutrient content, competition to his crops from weeds, threats to their health from pests and diseases, and the costs of taking action to deal with these things. If he does the algebra correctly, or if it is done on his behalf, he will optimise his yield and maximise his profit.

The job of smart farming, then, is twofold. One is to measure the variables going into the matrix as accurately as is cost-effective. The other is to relieve the farmer of as much of the burden of processing the matrix as he is comfortable with ceding to a machine.

An early example of cost-effective precision in farming was the decision made in 2001 by John Deere, the world’s largest manufacturer of agricultural equipment, to fit its tractors and other mobile machines with global-positioning-system (GPS) sensors, so that they could be located to within a few centimetres anywhere on Earth. This made it possible to stop them either covering the same ground twice or missing out patches as they shuttled up and down fields, which had been a frequent problem. Dealing with this both reduced fuel bills (by as much as 40% in some cases) and improved the uniformity and effectiveness of things like fertiliser, herbicide and pesticide spraying.
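In miniature, the matrix looks something like the sketch below: an assumed yield-response surface, a couple of input costs, and a search for the most profitable combination. Every coefficient and price in it is invented for illustration; real agronomic models are fitted from field trials and are far richer.

```python
# A toy version of the farmer's "matrix": choose water and nitrogen
# levels to maximise profit under an assumed yield-response model.
# All coefficients and prices here are invented for illustration.

import itertools

PRICE = 150.0          # $ per tonne of grain (assumed)
WATER_COST = 0.5       # $ per mm of irrigation per hectare (assumed)
N_COST = 1.1           # $ per kg of nitrogen per hectare (assumed)

def expected_yield(water_mm: float, n_kg: float) -> float:
    """Diminishing-returns response surface, tonnes per hectare (assumed)."""
    return (2.0 + 0.02 * water_mm - 0.00002 * water_mm**2
                + 0.03 * n_kg   - 0.00008 * n_kg**2)

def profit(water_mm: float, n_kg: float) -> float:
    return (PRICE * expected_yield(water_mm, n_kg)
            - WATER_COST * water_mm - N_COST * n_kg)

# Brute-force search over a coarse grid of input levels.
best = max(itertools.product(range(0, 501, 25), range(0, 301, 10)),
           key=lambda wn: profit(*wn))
print(best, round(profit(*best), 2))
```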

Since then, other techniques have been added. High-density soil sampling, carried out every few years to track properties such as mineral content and porosity, can predict the fertility of different parts of a field. Accurate contour mapping helps indicate how water moves around. And detectors planted in the soil can monitor moisture levels at multiple depths. Some detectors are also able to indicate nutrient content and how it changes in response to the application of fertiliser.

All of this permits variable-rate seeding, meaning the density of plants grown can be tailored to local conditions. And that density itself is under precise control. John Deere’s equipment can plant individual seeds to within an accuracy of 3cm. Moreover, when a crop is harvested, the rate at which grains or beans flow into the harvester’s tank can be measured from moment to moment. That information, when combined with GPS data, creates a yield map that shows which bits of land were more or less productive—and thus how accurate the soil and sensor-based predictions were. This information can then be fed into the following season’s planting pattern.
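The bookkeeping behind such a yield map is simple enough to sketch. In the toy version below, grain-flow readings stamped with GPS positions are binned into grid cells and converted into tonnes per hectare; the cell size, swath width and ground speed are illustrative assumptions, not figures from any particular harvester.

```python
# Sketch of yield-map assembly: grain-flow readings stamped with GPS
# positions are binned into grid cells and converted to per-cell yield.
# Cell size, swath width and speed are assumed values.

from collections import defaultdict

CELL_M = 10.0          # side of a grid cell, metres (assumed)
SWATH_M = 6.0          # header width, metres (assumed)
SPEED_MPS = 2.0        # harvester ground speed, metres per second (assumed)

def yield_map(readings):
    """readings: iterable of (x_m, y_m, grain flow in kg per second)."""
    cells = defaultdict(lambda: [0.0, 0.0])  # cell -> [kg gathered, m2 covered]
    for x, y, flow in readings:
        cell = (int(x // CELL_M), int(y // CELL_M))
        cells[cell][0] += flow * 1.0            # kg gathered in a 1-second tick
        cells[cell][1] += SWATH_M * SPEED_MPS   # area covered in that tick
    # kg per m2 times 10 gives tonnes per hectare
    return {c: 10.0 * kg / area for c, (kg, area) in cells.items()}

print(yield_map([(3, 4, 1.8), (5, 4, 2.1), (14, 4, 1.2)]))
```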

Farmers also gather information by flying planes over their land. Airborne instruments are able to measure the amount of plant cover and to distinguish between crops and weeds. Using a technique called multispectral analysis, which looks at how strongly plants absorb or reflect different wavelengths of sunlight, they can discover which crops are flourishing and which not. Sensors attached to moving machinery can even take measurements on the run. For example, multispectral sensors mounted on a tractor’s spraying booms can estimate the nitrogen needs of crops about to be sprayed, and adjust the dose accordingly. A modern farm, then, produces data aplenty. But they need interpreting, and for that, information technology is essential.
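The best-known such measure, the normalised difference vegetation index (NDVI), captures the principle: healthy vegetation reflects near-infrared light strongly and absorbs red light, so the normalised difference between the two bands is a proxy for vigour. The reflectance values below are invented for illustration.

```python
# Multispectral analysis in miniature: NDVI compares near-infrared and
# red reflectance. Healthy plants reflect strongly in the near-infrared
# and absorb red, so higher values suggest more vigorous growth.
# The reflectance values below are invented.

def ndvi(nir: float, red: float) -> float:
    return (nir - red) / (nir + red)

for label, nir, red in [("thriving crop", 0.50, 0.08),
                        ("stressed crop", 0.30, 0.15),
                        ("bare soil",     0.25, 0.20)]:
    print(f"{label}: NDVI = {ndvi(nir, red):.2f}")
```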

Platform tickets

Over the past few decades large corporations have grown up to supply the needs of commercial farming, especially in the Americas and Europe. Some are equipment-makers, such as John Deere. Others sell seeds or agricultural chemicals. These look like getting larger still. Dow and DuPont, two American giants, are planning to merge. Monsanto, another big American firm, is the subject of a takeover bid by Bayer, a German one. And Syngenta, a Swiss company, is being bid for by ChemChina, a Chinese one.

Business models are changing, too. These firms, no longer content merely to sell machinery, seed or chemicals, are all trying to develop matrix-crunching software platforms that will act as farm-management systems. These proprietary platforms will collect data from individual farms and process them in the cloud, allowing for the farm’s history, the known behaviour of individual crop strains and the local weather forecast. They will then make recommendations to the farmer, perhaps pointing him towards some of the firm’s other products.

But whereas making machinery, breeding new crops or manufacturing agrochemicals all have high barriers to entry, a data-based farm-management system can be put together by any businessman, even without a track record in agriculture. And many are having a go. For example, Trimble Navigation, based in Sunnyvale, at the southern end of Silicon Valley, reckons that as an established geographical-information company it is well placed to move into the smart-farming market, with a system called Connected Farms. It has bought in outside expertise in the shape of AGRI-TREND, a Canadian agricultural consultancy, which it acquired last year.

By contrast, Farmobile of Overland Park, Kansas, is a startup. It is aimed at those who value privacy, making a feature of not using clients’ data to improve its products, as many farm-management systems do. Farmers Business Network, of Davenport, Iowa, uses almost the opposite model, acting as a co-operative data pool. Data in the pool are anonymised, but everyone who joins is encouraged to add to the pool, and in turn gets to share what is there. The idea is that all participants will benefit from better solutions to the matrix.

Some firms focus on market niches. iTK, based in Montpellier, France, for example, specialises in grapes and has built mathematical models that describe the behaviour of all the main varieties. It is now expanding into California.

Thanks to this proliferation of farm-management software, it is possible to put more and more data to good use if the sensors are available to provide them. And better, cheaper sensors, too, are on their way. Moisture sensors, for example, usually work by measuring either the conductivity or the capacitance of soil, but a firm called WaterBit, based in Santa Clara, California, is using a different technology which it says can do the job at a tenth of the price of the existing products. And a sensor sold by John Deere can spectroscopically measure the nitrogen, phosphorus and potassium composition of liquid manure as it is being sprayed, permitting the spray rate to be adjusted in real time. This gets round the problem that liquid manure, though a good fertiliser, is not standardised, so is more difficult than commercial fertiliser to apply in the right quantities.

Things are changing in the air, too.
In a recapitulation of the early days of manned flight, the makers of unmanned agricultural drones are testing a wide range of designs to find out which is best suited to the task of flying multispectral cameras over farms. Some firms, such as Agribotix in Boulder, Colorado, prefer quadcopters, a four-rotored modern design that has become the industry standard for small drones, though it has limited range and endurance. A popular alternative, the AgDrone, built by HoneyComb of Wilsonville, Oregon, is a single-engine flying wing that looks as if it has escaped from a 1950s air show. Another, the Lancaster 5, from PrecisionHawk of Raleigh, North Carolina, vaguely resembles a scale model of the eponymous second-world-war bomber. And the offering by Delair-Tech, based in Toulouse, France, sports the long, narrow wings of a glider to keep it aloft for long periods.

Even an endurance drone, though, may be pushed to survey a large estate in one go. For a synoptic view of their holding, therefore, some farmers turn to satellites. Planet Labs, a firm in San Francisco, provides such a service using devices called CubeSats, measuring a few centimetres across. It keeps a fleet of about 30 of these in orbit, which it refreshes as old ones die by putting new ones into space, piggybacking on commercial launches. Thanks to modern optics, even a satellite this small can be fitted with a multispectral camera, though it has a resolution per pixel of only 3.5 metres (about ten feet). That is not bad from outer space, but not nearly as good as a drone’s camera can manage.

Satellite coverage, though, has the advantage of being both broad and frequent, whereas a drone can offer only one or the other of these qualities. Planet Labs’s constellation will be able to take a picture of a given bit of the Earth’s surface at least once a week, so that areas in trouble can be identified quickly and a more detailed examination made.

The best solution is to integrate aerial and satellite coverage. That is what Mavrx, also based in San Francisco, is trying to do. Instead of drones, it has an Uber-like arrangement with about 100 light-aircraft pilots around America. Each of the firm’s contracted planes has been fitted with a multispectral camera and stands ready to make specific sorties at Mavrx’s request. Mavrx’s cameras have a resolution of 20cm a pixel, meaning they can pretty much take in individual plants.

The firm has also outsourced its satellite photography. Its raw material is drawn from Landsat and other public satellite programmes. It also has access to these programmes’ libraries, some of which go back 30 years. It can thus check the performance of a particular field over decades, calculate how much biomass that field has supported from year to year and correlate this with records of the field’s yields in those years, showing how productive the plants there have been. Then, knowing the field’s biomass in the current season, it can predict what the yield will be. Mavrx’s method can be scaled up to cover entire regions and even countries, forecasting the size of the harvests before they are gathered. That is powerful financial and political information.
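Mavrx has not published its models, but the underlying idea is simple enough to sketch: regress a field’s past yields on its past growing-season biomass estimates, then apply the fitted line to the current season. Every number below is invented.

```python
# A sketch of biomass-to-yield forecasting: fit a simple least-squares
# line to a field's historical (biomass, yield) pairs, then predict
# this season's yield from this season's biomass estimate.
# The data are invented; Mavrx's actual model is not public.

history = [  # (biomass index from the satellite archive, yield in t/ha)
    (3.1, 7.9), (2.4, 6.2), (3.6, 8.8), (2.9, 7.4), (3.3, 8.1),
]

n = len(history)
mean_b = sum(b for b, _ in history) / n
mean_y = sum(y for _, y in history) / n
slope = (sum((b - mean_b) * (y - mean_y) for b, y in history)
         / sum((b - mean_b) ** 2 for b, _ in history))
intercept = mean_y - slope * mean_b

this_season_biomass = 3.0
print(f"forecast: {intercept + slope * this_season_biomass:.1f} t/ha")
```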

A truly automated, factory-like farm, however, would have to cut people out of the loop altogether. That means introducing robots on the ground as well as in the air, and there are plenty of hopeful agricultural-robot makers trying to do so.

At the University of Sydney, the Australian Centre for Field Robotics has developed RIPPA (Robot for Intelligent Perception and Precision Application), a four-wheeled, solar-powered device that identifies weeds in fields of vegetables and zaps them individually. At the moment it does this with precise, and precisely aimed, doses of herbicide. But it, or something similar, could instead use a beam of microwaves, or even a laser. That would allow the crops concerned to be recognised as “organic” by customers who disapprove of chemical treatments.

For the less fussy, Rowbot Systems of Minneapolis is developing a bot that can travel between rows of partly grown maize plants, allowing it to apply supplementary side dressings of fertiliser to the plants without crushing them. Indeed, it might be possible in future to match the dose to the plant in farms where individual plants’ needs have been assessed by airborne multispectral cameras.

Robots are also of interest to growers of fruit and vegetables that are currently picked by hand. Fruit-picking is a time-consuming business which, even though the pickers are not well rewarded, would be a lot faster and cheaper if it were automated. And robot pickers are starting to appear.

The SW6010, made by AGROBOT, a Spanish firm, uses a camera to recognise strawberries and work out which are ripe for the plucking. Those that are have their stems severed by blades and are caught in baskets before being passed on by a conveyor belt for packing by a human operator sitting on the robot. In the Netherlands, researchers at Wageningen University are working on a robot harvester for larger produce such as peppers.

All these devices, and others like them, still exude a whiff of the Heath Robinson. But robotics is developing rapidly, and the control systems needed to run such machines are getting better and cheaper by the day. Some think that in a decade or so many farms in rich countries will be largely robot-operated.

Yet others wonder just how far farmers will let their farms be robotised. Self-guiding agricultural machinery such as that sold by John Deere is all but robotic already. It is like an airliner, in which the pilot usually has little to do between landing and take-off because computers do the work for him.

Yet Deere has no plans to hand over complete control to the cloud, because that is not what its customers want.

Tunnel vision

If total control still seems some way off in outdoor farming, it is already close for crops grown in an entirely artificial environment. In a warren of tunnels beneath Clapham, in south London, Growing Underground is doing exactly what its name suggests. It is rearing around 20 types of salad plants, intended for sale to the chefs and sandwich shops of the city, in subterranean voids that began life as second-world-war bomb shelters.

In many ways, Growing Underground’s farm resembles any other indoor hydroponic operation. But there is one big difference. A conventional greenhouse, with its glass or polycarbonate walls, is designed to admit as much sunlight as possible. Growing Underground specifically excludes it. Instead, illumination is provided by light-emitting diodes (LEDs). These, in the minimalist spirit of hydroponics, have had their spectra precisely tuned so that the light they emit is optimal for the plants’ photosynthesis.

As you would expect, sensors watch everything—temperature, humidity, illumination—and send the data directly to Cambridge University’s engineering department where they are crunched, along with information on the plants’ growth, to work out the best regimes for future crops.

For now Steven Dring, Growing Underground’s boss, is confining output to herbs and vegetables such as small lettuces and samphire that can be brought to harvestable size quickly. He has reduced the cycle for coriander from 21 to 14 days. But tests suggest that the system also works for other, chunkier crops. Carrots and radishes have already been successfully grown this way, though they may not command a sufficient premium to make their underground cultivation worthwhile. But pak choi, a Chinese vegetable popular with trendy urbanites who live in inner-London suburbs like Clapham, is also amenable. At the moment growing it takes five weeks from start to finish. Get that down to three, which Mr Dring thinks he can, and it would be profitable. The firms that make the LEDs could also be on to a good thing. Mr Dring’s come from Valoya, a Finnish firm. In Sweden, Heliospectra is in the same business. Philips, a Dutch electrical giant, has also joined in. In conventional greenhouses such lights are used to supplement the sun, but increasingly they do duty in windowless operations like Mr Dring’s. Though unlike sunlight they do not come free, they are so efficient and long-lasting that their spectral advantages seem clinching.

This kind of farming does not have to take place underground. Operations like Mr Dring’s are cropping up in buildings on the surface as well. Old meatpacking plants, factories and warehouses the world over are being turned into “vertical farms”. Though they are never going to fill the whole world’s bellies, they are more than a fad. Rather, they are a modern version of the market gardens that once flourished on the edge of cities—in places just like Clapham—before the land they occupied was swallowed by urban sprawl. And with their precise control of inputs, and thus outputs (see Brain scan, below), they also represent the ultimate in what farming could become.

Crops of the future: Tinker and tailor

PLANT breeders are understandably excited about manipulating botanical genomics. But it is a crop’s phenotype—its physical instantiation—that people actually eat, and this is the product of both genes and environment.

Optimising phenotypes by manipulating the environment is the task Caleb Harper has set himself. Dr Harper is the founder of the Open Agriculture Initiative (OAI) at the Massachusetts Institute of Technology’s Media Lab. At first sight, that seems odd. The Media Lab is an information-technology laboratory, best known for having helped develop things like electronic paper, wireless networks and even modern karaoke machines. It is very much about bits and bytes, and not much hitherto about proteins and lipids.

However, environmental information is still information. It informs how a plant grows, which is what interests Dr Harper. As he once put it, “people say they like peppers from Mexico. What they actually like is peppers grown in the conditions that prevail in Mexico.” He reckons that if you can replicate the conditions in which a botanical product grew, you can replicate that product. But this means you have to understand those conditions properly in the first place.

To help with this, he and his colleagues at the OAI have developed what they call the Personal Food Computer: a standardised tabletop device that can control illumination, carbon-dioxide levels, humidity, air temperature, root-zone temperature, and the acidity and dissolved-oxygen content of water delivered to the roots, as well as its nutrient content and any other aspect of its chemistry.

Plant phenotypes are monitored during growth by web cameras linked to software that detects leaf edges and colour differences and by sensors that can detect areas of active photosynthesis. After harvesting they are examined by lidar (the optical equivalent of radar) to record their shape in detail, and by gas chromatography-mass spectrometry to understand their chemical composition.

The idea is that Personal Food Computers can be built by anyone who chooses to, and form part of an “open science” network that gathers data on growing conditions and works out those conditions’ phenotypic effects. Of particular interest are matters such as flavour and astringency that are governed by chemicals called secondary metabolites. These are often parts of plant-defence mechanisms, so in one experiment the computers are looking at the effect of adding crushed arthropod exoskeletons to the water supply, which may mimic attack by insects or mites. The hope is that this will change flavours in controllable ways.

Though Dr Harper is from a rural background, his career before the OAI was conventionally Media Lab-like. In particular, he designed environmental-control systems for data centres and operating theatres—keeping heat, humidity and so on within the tight limits needed for optimal function. But the jump from controlling those environments to controlling miniature farms was not enormous.

Some three dozen Personal Food Computers already exist and about 100 more are under construction the world over. This geographical dispersion is important. Dr Harper’s goal, as his view on Mexican peppers suggests, is to decouple climate from geography by building a “catalogue of climates”. That would allow indoor urban farms to be programmed to imitate whatever climate was required in order to turn out crops for instant local consumption. This would certainly appeal to those who worry about “food miles”—the cost in terms of carbon dioxide of shipping edible items around the world. How it will go down with farmers in places whose climates are being imitated in rich-country cities remains to be seen.
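What a “climate recipe” might look like as data is easy enough to imagine. The sketch below is a guess at the general shape, not the OAI’s actual recipe format; every field name and setpoint is an illustrative assumption.

```python
# A "climate recipe" sketched as data: a set of environmental setpoints
# a food computer tries to hold. Field names and values are illustrative
# assumptions, not the OAI's actual format.

mexico_pepper = {
    "humidity_pct":      65,
    "co2_ppm":           450,
    "light_hours":       13,
    "root_zone_temp_c":  24.0,
    "water_ph":          6.3,
    "nutrient_ec_ms_cm": 1.8,   # electrical conductivity as a nutrient proxy
}

def out_of_band(sensor: str, reading: float, recipe: dict, tol: float = 0.05) -> bool:
    """Simple controller check: is a reading more than 5% off its setpoint?"""
    target = recipe[sensor]
    return abs(reading - target) > tol * target

print(out_of_band("co2_ppm", 510, mexico_pepper))  # True: vent or stop dosing CO2
```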

The founder of the Open Agriculture Initiative at MIT’s Media Lab is building a “catalogue of climates” to help plants grow better

Farms need better products. Genomic understanding will provide them

C4 SOUNDS like the name of a failed electric car from the 1970s. In fact, it is one of the most crucial concepts in plant molecular biology. Plants have inherited their photosynthetic abilities from bacteria that took up symbiotic residence in the cells of their ancestors about a billion years ago. Those bacteria’s descendants, called chloroplasts, sit inside cells absorbing sunlight and using its energy to split water into hydrogen and oxygen. The hydrogen then combines with carbon dioxide to form small intermediate molecules, which are subsequently assembled into sugars. This form of photosynthesis is known as C3, because these intermediates contain three carbon atoms. Since the arrival of chloroplasts, though, evolution has discovered another way to photosynthesise, using a four-carbon intermediate. C4 photosynthesis is often more efficient than the C3 sort, especially in tropical climes. Several important crops that started in the tropics use it, notably maize, millet, sorghum and sugar cane.

C4 photosynthesis is so useful that it has evolved on at least 60 separate occasions. Unfortunately, none of these involved the ancestors of rice, the second most important crop on Earth, after wheat. Yet rice, pre-eminently a tropical plant, would produce yields around 50% bigger than at present if it took the C4 route. At the International Rice Research Institute in Los Baños, outside Manila, researchers are trying to show it how. The C4 Rice Project, co-ordinated by Paul Quick, is a global endeavour, also involving biologists at 18 other laboratories in Asia, Australia, Europe and North America. Their task involves adding five alien enzymes to rice, to give it an extra biochemical pathway, and then reorganising some of the cells in the plant’s leaves to create special compartments in which carbon dioxide can be concentrated in ways the standard C3 mechanism does not require. Both of these things have frequently happened naturally in other plants, which suggests that doing them artificially is not out of the question. The team has already created strains of rice which contain genes plucked from maize plants for the extra enzymes, and are now tweaking them to improve their efficacy. The harder part, which may take another decade, will be finding out what genetic changes are needed to bring about the compartmentalisation.

Genome editing resembles the natural process of mutation

The C4 Rice Project thus aims to break through the yield plateaus and return the world to the sort of growth rates seen in the heady days of the Green Revolution. Other groups, similarly motivated, are working on making many types of crops resistant to drought, heat, cold and salt; on inducing greater immunity to infection and infestation; on improving nutritional value; on making more efficient use of resources such as water and phosphorus; and even on giving to plants that do not have it the ability to fix nitrogen, an essential ingredient of proteins, directly from the air instead of absorbing it in the form of nitrates. Such innovations should be a bonanza. Unfortunately, for reasons both technical and social, they have so far not been. But that should soon change.

The early days of genetically engineered crops saw two huge successes and one spectacular failure. The successes were the transfer into a range of plants, particularly maize, soyabeans and cotton, of two types of gene. Both came from bacteria. One protected its host from the attentions of pesky insect larvae. The other protected it from specific herbicides, meaning those herbicides could be used more effectively to keep fields free from weeds. Both are beloved of farmers.

The spectacular failure is that neither is beloved of consumers. Some are indifferent to them; many are actively hostile. Even though over decades there has been no evidence that eating genetically modified crops is harmful to health, and little that they harm the environment, they have been treated as pariahs.

Since people do not eat cotton, and soyabeans and maize are used mainly as animal fodder, the anti-GM lobby’s impact on those crops has been muted. But the idea of extending either the range of crops modified or the range of modifications available has (with a few exceptions) been thought commercially too risky to try. Moreover, transgenics, as the technique of moving genes from one species to another is called, is haphazard. Where the moved gene will end up is hard to control. That matters, for genes work better in some places than others.

Spell it for me

The search has therefore been on for a better way than transgenics of doing things. And one is now emerging that, its supporters hope, may kill both the technical and the social birds with a single stone.

Genome editing, as this approach is known, tweaks existing DNA in situ by adding, subtracting or substituting a piece that may be as small as a single genetic “letter” (or nucleotide). That not only makes the technique precise, it also resembles the natural process of mutation, which is the basis of the variety all conventional plant-breeding relies on. That may raise fewer objections among consumers, and also holds out the hope that regulators will treat it differently from transgenics.

After a couple of false starts, most researchers agree that a technique called CRISPR/Cas9, derived from a way that bacteria chop up the genes of invading viruses, is the one that will make editing crop genomes a realistic prospect. Transgenic technology has steered clear of wheat, which is eaten mainly by people. But DuPont’s seed division, Pioneer, is already trying to use CRISPR/Cas9 to stop wheat from self-pollinating, in order to make the development of hybrids easier. Similarly, researchers at the Chinese Academy of Sciences are using it to try to develop wheat plants that are resistant to powdery mildew, a serious hazard.
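The targeting logic that makes CRISPR/Cas9 programmable is compact enough to sketch: the standard Cas9 enzyme cuts only next to an “NGG” motif (the protospacer-adjacent motif, or PAM), guided by the 20 DNA letters immediately upstream of it. The scanner below finds candidate sites in a made-up sequence; real guide design also screens candidates against the rest of the genome to avoid off-target cuts.

```python
# Cas9 can only cut next to a short "PAM" motif (NGG for the standard
# enzyme), with the 20-letter guide sequence immediately upstream.
# This scans a toy DNA string for candidate target sites; the sequence
# itself is made up.

import re

def cas9_targets(dna: str):
    """Yield (guide start, guide, pam) for every NGG site with room for a guide."""
    # Lookahead so overlapping NGG motifs are all found.
    for m in re.finditer(r"(?=([ACGT]GG))", dna):
        pam_start = m.start(1)
        if pam_start >= 20:
            yield pam_start - 20, dna[pam_start - 20:pam_start], m.group(1)

toy = "ATGCGTACCGTTAGCAAGGTTACGATCCGGATTACAGGCTAGCTAGG"
for pos, guide, pam in cas9_targets(toy):
    print(pos, guide, pam)
```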

Not all current attempts at agricultural genome editing use CRISPR/Cas9. Cibus, in San Diego, for example, employs a proprietary technique it calls the Rapid Trait Development System (RTDS). This co-opts a cell’s natural DNA repair mechanism to make single-nucleotide changes to genomes.

RTDS has already created one commercial product, a form of rape resistant to a class of herbicides that conventional transgenics cannot protect against. But at the moment CRISPR/Cas9 seems to be sweeping most things before it—and even if it stumbles for some reason, other bacterial antiviral mechanisms might step in.

Whether consumers will accept genome editing remains to be seen. No one, however, is likely to object to a second rapidly developing method of crop improvement: a souped-up breeding technique called genomic selection.

Genomic selection is a superior version of marker-assisted selection, a process which has itself been replacing conventional crop-breeding techniques. Both genomic selection and marker-assisted selection rely on recognising pieces of DNA called markers found in or near places called quantitative trait loci (QTLs). A QTL is part of a genome that has, because of a gene or genes within it, a measurable, predictable effect on a phenotype. If the marker is present, then so is the QTL. By extension, a plant with the marker should show the QTL’s phenotypic effect.

The difference between conventional marker-assisted selection and the genomic version is that the former relied on a few hundred markers (such as places where the DNA stuttered and repeated itself) that could be picked up by the technology then available. Now, improved detection methods mean single-nucleotide polymorphisms, or SNPs (pronounced “snips”), can be used as markers. A SNP is a place where a single genetic letter varies in an otherwise unchanging part of the genome, and there are thousands of them.

Add in the enormous amounts of computing power available to link SNPs with QTLs—and, indeed, to analyse the interactions between QTLs themselves—and the upshot is a system that can tell a breeder which individual plants are worth raising to maturity, and which should then be crossed with each other to come up with the best results.

Crop strains created this way are already coming to market. AQUAmax and Artesian are drought-tolerant strains of maize developed, respectively, by DuPont and Syngenta. These two, intriguingly, are competitors with another drought-tolerant maize strain, DroughtGard, developed by Monsanto using the transgenic approach.
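Reduced to a toy, the calculation looks like the sketch below: each SNP marker carries a small estimated effect on yield, and a seedling’s genomic estimated breeding value is the sum of the effects of the marker alleles it carries. The effects and genotypes here are invented; real programmes estimate them from thousands of SNPs and large training populations.

```python
# Genomic selection in miniature: score each seedling by summing the
# estimated effects of the SNP alleles it carries (an additive model),
# then keep the top scorers for field trials and crossing.
# Effects and genotypes are invented for illustration.

snp_effects = {"snp1": 0.12, "snp2": -0.05, "snp3": 0.30, "snp4": 0.08}

def gebv(genotype: dict[str, int]) -> float:
    """Genomic estimated breeding value; genotype maps marker -> allele count (0-2)."""
    return sum(snp_effects[s] * n for s, n in genotype.items())

seedlings = {
    "A": {"snp1": 2, "snp2": 0, "snp3": 1, "snp4": 2},
    "B": {"snp1": 0, "snp2": 2, "snp3": 2, "snp4": 0},
    "C": {"snp1": 1, "snp2": 1, "snp3": 0, "snp4": 1},
}

ranked = sorted(seedlings, key=lambda k: gebv(seedlings[k]), reverse=True)
print([(k, round(gebv(seedlings[k]), 2)) for k in ranked])
```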

Genomic selection also offers opportunities for the scientific improvement of crops that seed companies usually neglect. The NextGen Cassava Project, a pan-African group, plans to zap susceptibility to cassava mosaic virus this way and then systematically to improve the yield and nutritional properties of the crop. The project’s researchers have identified 40,000 cassava SNPs, and have now gone through three generations of genomic selection using them. Besides making cassava resistant to the virus, they also hope to double yields and to increase the proportion of starch (and thus the nutritional value) of the resulting strains. If modern techniques can similarly be brought to bear on other unimproved crops of little interest to the big seed companies, such as millet and yams, the yield-bonuses could be enormous.

For the longer term, some researchers have more radical ambitions. A manifesto published last year by Donald Ort, of the United States Department of Agriculture’s Agricultural Research Service, and his colleagues proposes not merely recapitulating evolution but actually redesigning the photosynthetic process in ways evolution has not yet discovered. Dr Ort suggests tweaking chlorophyll molecules in order to capture a wider range of frequencies and deploy the resulting energy more efficiently. He is also looking at improving the way plants absorb carbon dioxide. The result, he hopes, will be faster-growing, higher-yielding crops.

Such ideas are controversial and could take decades to come to fruition. But they are not fantasy. A combination of transgenics (importing new forms of chlorophyll from photosynthetic bacteria), genome editing (to supercharge existing plant enzymes) and genomic selection (to optimise the resulting mixture) might well be able to achieve them.

Those who see this as an unnatural, perhaps even monstrous approach to crop improvement should recall that it is precisely what happened when the ancestors of modern plants themselves came into existence, through the combination of a bacterium and its host and their subsequent mutual adjustment to live in symbiosis. It was this evolutionary leap which greened the Earth in the first place. That something similar might re-green it is at least worth considering.

IN THE basement of a building on a wharf in Baltimore’s inner harbour, a group of aquaculturists at the Institute of Marine and Environmental Technology is trying to create an artificial ecosystem. Yonathan Zohar and his colleagues hope to liberate the raising of ocean fish from the ocean itself so that fish farms can be built inland. Fresh fish, served the day it comes out of the brine (even if the brine in question is a judicious mixture of tap water and salts), would thus become accessible to millions of landlubbers who must now have their fish shipped in from afar, deep-frozen. Equally important, marine-fish farmers would no longer have to find suitable coastal sites for penning stock while it grows to marketable size, exposing the crowded animals to disease and polluting the marine environment.

People have raised freshwater fish in ponds since time immemorial, but farming species such as salmon that live mainly in saltwater dates back only a few decades, as does the parallel transformation of freshwater aquaculture to operate on an industrial scale. Now fish farming is booming. Human consumption of farmed fish has overtaken that of beef. Indeed, one way of supplying mankind with enough animal protein in future may be through aquaculture. To keep the boom going, though, technologists like Dr Zohar must become ever more inventive.

His ecosystem, which is about to undergo commercial trials, constantly recycles the same supply of brine, purified by three sets of bacteria. One set turns ammonia excreted by the fish into nitrate ions. A second converts these ions into nitrogen (a harmless gas that makes up 78% of the air) and water.

A third, working on the solid waste filtered from the water, transforms it into methane, which—via a special generator—provides part of the power that keeps the whole operation running. The upshot is a closed system that can be set up anywhere, generates no pollution and can be kept disease-free. It is also escape-proof. That means old-world species such as sea bream and sea bass, which cannot now be grown in America because they might get out and breed in the wild, could be delivered fresh to the table anywhere.

Besides transforming the design of fish farms, Dr Zohar is also working on extending the range of species that they can grow. He has spent decades studying the hormone system that triggers spawning and can now stimulate it on demand. He has also examined the needs of hatchling fry, often completely different from those of adult fish, that must be met if they are to thrive. At the moment he is trying to do this for one of the most desirable species of all, the bluefin tuna. If he succeeds, and thus provides an alternative to the plummeting wild populations of this animal, sushi lovers around the world will be for ever in his debt.
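A back-of-the-envelope mass balance shows the scale of the job those bacteria do. The coefficients below are textbook-style assumptions, not figures from Dr Zohar’s system.

```python
# Rough nitrogen balance for a closed recirculating fish farm: fish
# excrete ammonia in proportion to the protein they are fed; nitrifiers
# turn it into nitrate; denitrifiers turn nitrate into nitrogen gas.
# All coefficients are assumed, textbook-style values.

FEED_KG_PER_DAY = 100.0
PROTEIN_FRACTION = 0.45      # high-protein marine feed (assumed)
N_IN_PROTEIN = 0.16          # nitrogen fraction of protein (standard factor)
EXCRETED_AS_AMMONIA = 0.6    # share of feed nitrogen excreted as ammonia (assumed)

n_fed = FEED_KG_PER_DAY * PROTEIN_FRACTION * N_IN_PROTEIN
ammonia_n = n_fed * EXCRETED_AS_AMMONIA   # handled by the first set of bacteria
nitrate_n = ammonia_n                     # step 1: ammonia -> nitrate
n2_released = nitrate_n                   # step 2: nitrate -> harmless N2 gas

print(f"feed nitrogen: {n_fed:.1f} kg/day")
print(f"converted to nitrate, then released as N2: {n2_released:.1f} kg/day")
```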

Gone fishin’

Fish farmers used to dream of fitting their charges with transgenes to make them grow more quickly. Indeed, over the past couple of decades researchers have treated more than 35 fish species in this way. They have often been spectacularly successful. Only one firm, though, has persisted to the point of regulatory approval. AquaBounty’s transgenic Atlantic salmon, now cleared in both America and Canada, has the desirable property of rapid growth. Its transgene, taken from a chinook salmon, causes it to put on weight all year round, not just in spring and summer. That halves the time the fish will take to reach marketable size. Whether people will be willing to eat the result, though, is an experiment in its own right—one that all those other researchers, only too aware of widespread public rejection of transgenic crops, have been unwilling to conduct.

That may be wise. There is so much natural variation in wild fish that conventional selective breeding can make a big difference without any high-tech intervention. Back in 2007 a report by researchers at Akvaforsk, now part of the Norwegian Institute of Food, Fisheries and Aquaculture Research (NOFIMA), showed that three decades of selective breeding by the country’s salmon farmers had resulted in fish which grew twice as fast as their wild progenitors. Admittedly starting from a lower base, those farmers had done what AquaBounty has achieved, but without the aid of a transgene.

If conventional selection can yield such improvements, it is tempting not to bother with anything more complicated. Tempting, but wrong. For, as understanding of piscine DNA improves, the sort of genomic selection being applied to crops can also be applied to fish.

Researchers at SalmoBreed of Bergen, in Norway, have employed it not to create bigger, faster-growing fish but to attack two of fish farming’s banes—infestation and infection. By tracking SNPs (single-nucleotide polymorphisms, a variation of a single genetic letter in a genome used as a marker) they have produced varieties of salmon resistant to sea lice and also to pancreas disease, a viral illness. They are now looking into a third problem, amoebic gill disease. In Japan, similar work has led to the development of flounders resistant to viral lymphocystis, trout immune to “cold-water” disease, a bacterial infection, and amberjack that evade the attentions of a group of parasitic worms called the monogenea.

Altering nature, then, is crucial to the success of fish farming. But nurture can also give a helping hand, for example by optimising what is fed to the animals.

As with any product, one key to success is to get costs down. And here, environmental and commercial considerations coincide.

A common complaint by green types is that fish farming does not relieve as much pressure on the oceans as it appears to, because a lot of the feed it uses is made of fish meal. That simply transfers fishing pressure from species eaten by people directly to those that get turned into such meal. But fish meal is expensive, so researchers are trying to reduce the amount being used by substituting plant matter, such as soya. In this they have been successful. According to a paper published last year by researchers at NOFIMA, 90% of salmon feed used in Norway in 1990 was fish meal. In 2013 the comparable figure was 30%. Indeed, a report published in 2014 by the European Parliament found that fish-meal consumption in aquaculture peaked in 2005.

It’s a gas

Feeding carnivores like salmon on plants is one way to reduce both costs and environmental harm. Another, which at first sight seems exotic, is to make fish food out of natural gas. This is the proposed business of Calysta, a Californian firm. Calysta feeds the gas—or, rather, its principal component, methane—to bacteria called methanotrophs. These metabolise the methane, extract energy from it and use the atoms thus liberated, along with oxygen from water and nitrogen from the air, to build their bodies. Calysta then turns these bodies into protein pellets that are sold as fish food, a process that puts no strain at all on either sea or field.

Even conventional fish foods, though, are low-strain compared with feed for farm animals. Because fish are cold-blooded, they do not have to eat to stay warm. They thus convert more of their food into body mass. For conservationists, and for those who worry whether there will be enough food in future to feed the growing human population, that makes fish a particularly attractive form of animal protein.

Nevertheless, demand for the legged and winged sort is growing too. Novel technologies are therefore being applied to animal husbandry as well. And some imaginative researchers are even trying to grow meat and other animal products in factories, cutting the animals out of the loop altogether.

IF THE future of farming is to be more factory-like, some might argue that the treatment of stock animals such as chickens and pigs has led the way. Those are not, though, happy precedents. Crop plants, unsentient as they are, cause no welfare qualms in those who worry about other aspects of modern farming. Even fish, as long as they are kept healthy, rarely raise the ire of protesters. Birds and mammals are different. There are moral limits to how they can be treated. They are also individually valuable in a way that crop plants and fish are not. For both these reasons, they are worth monitoring one at a time.

Cattle, in particular, are getting their own private sensors. Devices that sit inside an animal’s rumen, measuring stomach acidity and looking for digestive problems, have been available for several years.

They have now been joined by movement detectors such as that developed by Smartbell, a small firm in Cambridge, England. This sensor hangs around a cow’s neck, recording its wearer’s movement and transmitting that information to the cloud. An animal’s general activity level is a good indication of its fitness, so the system can give early warning of any trouble. In particular, it immediately shows when its wearer is going lame—a problem that about a fifth of British cattle suffer at some point in their lives—even before an observant farmer might notice anything wrong. If picked up early, lameness is easily treated. If permitted to linger, it often means the animal has to be destroyed.
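One plausible way such a collar might flag early lameness is sketched below: compare each day’s activity with the animal’s own recent baseline and alert on a sharp drop. The threshold, window and step counts are illustrative assumptions, not Smartbell’s actual algorithm.

```python
# Sketch of activity-based lameness detection: alert when an animal's
# daily movement falls well below its own trailing baseline.
# Thresholds and data are illustrative; not Smartbell's actual method.

from statistics import mean

def flag_lameness(daily_steps: list[int], window: int = 7,
                  drop: float = 0.7) -> bool:
    """Alert if today's activity is below `drop` times the trailing mean."""
    if len(daily_steps) <= window:
        return False  # not enough history to set a baseline yet
    baseline = mean(daily_steps[-window - 1:-1])
    return daily_steps[-1] < drop * baseline

cow_174 = [4100, 3900, 4200, 4050, 3980, 4150, 4020, 3900, 2600]
print(flag_lameness(cow_174))  # True: a sharp drop against her own baseline
```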

Movement detectors can also show if a cow is ready for insemination. When she is in oestrus, her pattern of movement changes, and the detector will pick this up and alert her owner. Good breeding is crucial to animal husbandry, and marker-assisted genomic selection will ensure that the semen used for such insemination continues to yield better and better offspring. What is less clear—and is actively debated—is whether genome editing has a role to play here. Transgenics has given an even wider berth to terrestrial animals than it has to fish, and for the same reason: wary consumers. Some people hope, though, that this wariness will not apply to animals whose DNA has merely been tweaked, rather than imported from another species, especially if the edits in question will improve animal welfare as well as farmers’ profits.

Following this line of thinking, Recombinetics, a firm in St Paul, Minnesota, is trying to use genome editing of the sort now being employed on crops to create a strain of hornless Holstein cattle. Holsteins are a popular breed for milking, but their horns make them dangerous to work with, so they are normally dehorned as calves, which is messy, and painful for the animal. Scott Fahrenkrug, Recombinetics’ founder, therefore had the idea of introducing into Holsteins a DNA sequence that makes certain beef cattle hornless. This involved deleting a sequence of ten nucleotides and replacing it with 212 others.

Bruce Whitelaw at the Roslin Institute, in Scotland, has similarly edited resistance to African swine fever into pigs, by altering a gene that helps regulate immune responses to this illness to make it resemble the version found in warthogs. These wild African pigs have co-evolved with the virus and are thus less susceptible to it than are non-African domesticated animals. Randall Prather at the University of Missouri has similarly created pigs that cannot catch porcine reproductive and respiratory syndrome, an illness that costs American farmers alone more than $600m a year. And at the International Livestock Research Institute in Nairobi, Steve Kemp and his colleagues are considering editing resistance to sleeping sickness, a huge killer of livestock, into African cattle. All this would make the animals healthier and hence happier as well.

Not all such work is welfare-oriented, though. Dr Fahrenkrug has also been working on a famous mutation that increases muscle mass. This mutation, in the gene for a protein called myostatin, is found naturally in Belgian Blue cattle. Myostatin inhibits the development of muscle cells. The Belgian-Blue mutation disrupts myostatin’s structure, and thus function. Hence the animals’ oversize muscles. Two years ago, in collaboration with researchers at Texas A&M University, Dr Fahrenkrug edited the myostatin gene of a member of another breed of cattle to do likewise.

Where’s the beef?

There may, though, be an even better way to grow muscle, the animal tissue most wanted by consumers, than on animals themselves. At least two groups of researchers think it can be manufactured directly. In 2013 Mark Post of Maastricht University, in the Netherlands, unveiled the first hamburger made from muscle cells grown in laboratory cultures. In February this year a Californian firm called Memphis Meats followed suit with the first meatball.

Dr Post’s original hamburger, which weighed 140 grams, was assembled from strips of muscle cells grown in Petri dishes. Including all the set-up costs, it was said to have cost €250,000 ($350,000), or $2.5m a kilogram. Scaling up the process will bring that figure down a lot. This means growing the cells in reactor vessels filled with nutrient broth. But, because such cells are supposed to be parts of bodies, they cannot simply float around in the broth in the way that, for example, yeast cells used in biotechnology can. To thrive, they must be attached to something, so the idea is to grow them on small spheres floating in the vessels. Fat cells, which add juiciness to meat, would be cultured separately.

Do this successfully, Dr Post reckons, and the cost would fall to $65 a kilogram. Add in technological improvements already under way, which will increase the density of muscle cells that can be grown in a reactor, and he hopes that Mosa Meat, the firm he has founded to exploit his work commercially, will have hamburger mince ready for sale (albeit at the pricey end of the market) in five years’ time.
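The arithmetic behind those figures can be checked in a couple of lines; the roughly 38,000-fold reduction implied by the two price points is derived here, not quoted from any source.

```python
# A quick check of the quoted figures: a 140-gram burger at $350,000
# works out at $2.5m a kilogram, and the $65-a-kilogram target implies
# a roughly 38,000-fold cost reduction from scale-up.

burger_cost_usd = 350_000
burger_mass_kg = 0.140

cost_per_kg = burger_cost_usd / burger_mass_kg
print(f"prototype: ${cost_per_kg:,.0f} per kg")        # ~$2.5m per kg
print(f"reduction needed: {cost_per_kg / 65:,.0f}x")   # ~38,000x
```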

Meanwhile researchers at Clara Foods, in San Francisco, are developing synthetic egg white, using transgenic yeast to secrete the required proteins. Indeed, they hope to improve on natural egg white by tweaking the protein mix to make it easier to whip into meringues, for example. They also hope their synthetic white will be acceptable to people who do not currently eat eggs, including vegans and some vegetarians.

Technology will transform farmers’ lives in both the rich and the poor world

ONE of the greatest unsung triumphs of human progress is that most people are no longer working on the land. That is not to demean farming. Rather, it is to praise the monumental productivity growth in the industry, achieved almost entirely by the application of technology in the form of farm machinery, fertilisers and other agrochemicals, along with scientifically improved crops and livestock. In 1900 around 41% of America’s labour force worked on a farm; now the proportion is below 2%. The effect is less marked in poorer countries, but the direction of travel is the same. The share of city-dwellers in the world’s total population reached 50% in 2007 and is still rising relentlessly, yet the shrinking proportion of people living in the countryside is still able to feed the urban majority.

No crystal ball can predict whether that will continue, but on past form it seems perfectly plausible that by 2050 the planet will grow 70% more food than it did in 2009, as the Food and Agriculture Organisation (FAO) says it needs to. Even though some crops in some parts of the world have reached a productivity plateau, cereal production increased by 11% in the six years after the FAO made that prediction. The Malthusian fear that population growth will outstrip food supply, now 218 years old, has not yet come true.

Yet just as Thomas Malthus has his modern-day apologists, so does his mythical contemporary, Ned Ludd. Neo-Luddism is an ever-present threat that can certainly slow down the development of new technologies—as has indeed happened with transgenics. But while it is fine for the well-fed to be prissy about not eating food containing genetically modified ingredients, their fears have cast a shadow over the development of transgenic crops that might help those whose bellies are not so full. That is unconscionable. With luck, the new generation of genome-edited plants, and maybe even animals, will not provoke such a reaction.

Regardless of whether it does, though, some other trends seem near-certain to continue into the future. Precision agriculture will spread from its North American heartland to become routine in Europe and those parts of South America, such as Brazil, where large arable farms predominate. And someone, perhaps in China, will work out how to apply to rice the sort of precision techniques now applied to soyabeans, maize and other crops.

The technological rationale for precision suggests farms should continue to consolidate, though in an industry where sentiment and family continuity have always played a part that pure economics would call irrational, this may not happen as fast as it otherwise would. Still, regardless of the speed at which they arrive, these large holdings will come more and more to resemble manufacturing operations, wringing every last ounce of efficiency out of land and machinery.

Such large-scale farms will probably continue to be served by large-scale corporations that provide seeds, stock, machines and management plans. But, in the case of the management plans, there is an opening for new firms with better ideas to nip in and steal at least part of the market.

Other openings for entrepreneurs are available, too. Both inland fish farming and urban vertical farming—though niche operations compared with Midwestern soyabean cultivation or Scottish sea-loch salmon farms—are waves of the future in the service of gustatorially sophisticated urbanites. And in these businesses, the idea of farm as factory is brought to its logical conclusion.

It is in the poorer parts of the world, though, that the battle for full bellies will be won or lost; and in Africa, in particular, the scope for change is both enormous and unpredictable. Though the problems of African farming are by no means purely technological—better roads, better education and better governments would all help a great deal—technology nevertheless has a big part to play.

Organisations such as the NextGen Cassava Project, which apply the latest breeding techniques to reduce the susceptibility of crops to disease and increase their yield and nutritional value, offer Africans an opportunity to leap into the future in the way they did with telephony, bypassing fixed-line networks and moving straight to mobiles. Crops could similarly jump from 18th- to 21st-century levels of potential in a matter of years, even if converting that potential into productivity still requires the developments listed earlier.

Looking further into the future, the picture is hazier. Large-scale genetic engineering of the sort needed to create C4 rice, or nitrogen-fixing wheat, or enhanced photosynthetic pathways, will certainly cause qualms, and maybe not just among the neo-Luddites. And they may not be needed.
It is a general technological truth that there are more ideas than applications, and perfectly decent ones fall by the wayside because others have got there first. But it is good to know that the big ideas are there, available to be drawn on in case other yield plateaus threaten the required rise in the food supply. It means that the people of 2050, whether they live in Los Angeles, Lucknow or Lusaka, will at least be able to face whatever other problems befall them on a full stomach.

Bugs in the system

Bacteria and fungi can help crops and soil

MICROBES, though they have a bad press as agents of disease, also play a beneficial role in agriculture. For example, they fix nitrogen from the air into soluble nitrates that act as natural fertiliser. Understanding and exploiting such organisms for farming is a rapidly developing part of agricultural biotechnology.

At the moment, the lead is being taken by a collaboration between Monsanto and Novozymes, a Danish firm.

This consortium, called BioAg, began in 2013 and has a dozen microbe-based products on the market. These include fungicides, insecticides and bugs that liberate nitrogen, phosphorus and potassium compounds from the soil, making them soluble and thus easier for crops to take up.

Last year, researchers at the two firms tested a further 2,000 microbes, looking for species that would increase maize and soyabean yields. The top-performing strains delivered a boost of about 3% for both crops.

In November 2015 Syngenta and DSM, a Dutch company, formed a similar partnership, and in April of that year DuPont bought Taxon Biosciences, a Californian microbes firm. Hopeful start-ups abound, too. One such is Indigo, in Boston. Its researchers are conducting field tests of some of its library of 40,000 microbes to see if they can alleviate the stress on cotton, maize, soyabeans and wheat induced by drought and salinity. Another is Adaptive Symbiotic Technologies, of Seattle. The scientists who formed this firm study fungi that live symbiotically within plants. They believe they have found one, whose natural partner is panic grass, a coastal species, which confers salinity-resistance when transferred to crops such as rice.

The big prize, however, would be to persuade the roots of crops such as wheat to form partnerships with nitrogen-fixing soil bacteria. These would be similar to the natural partnerships formed with nitrogen-fixing bacteria by legumes such as soyabeans. In legumes, the plants’ roots grow special nodules that become homes for the bacteria in question. If wheat roots could be persuaded, by genomic breeding or genome editing, to behave likewise, everyone except fertiliser companies would reap enormous benefits.

Today’s Outside the Box is a little bit different – which, considering that most Outside the Box pieces can be classified as a little bit different, is not that unusual; but this one needs to come with a warning label that you may find it a tad wonkish. It’s from my friend Chris Whalen of the Kroll Bond Rating Agency. When I want to understand something about banks, Chris is one of my go-to guys.

In Chris’s latest memo he talks about the push to increase the capital levels of the eight largest US banks. He is critical of that effort in that it doesn’t address the real issues. He highlights the fact that even if we do increase the capital requirements of the largest banks, that doesn’t mean we won’t have problems with them in the next crisis.

It wasn’t insufficient capital that got the banks into trouble the last time around. If we don’t sufficiently address the issues that hurt the banks and the economy back then, there can be no assurance that there won’t be problems of a similar nature next time, even with increased capital. This is worth thinking about as you ponder the risks to your portfolio that will come with the next downturn. You can’t assume there will not be problems with US banks. Maybe there won’t be, but I wouldn’t ignore the risk. Good management is more important than capital.

I write this introduction from 32,000 feet, flying back to Dallas. I had several great meetings while in New York; but the highlight was dinner last night with Art Cashin; Jack Rivkin, a longtime PaineWebber partner and now the brains at Altegris; Peter Boockvar of the Lindsey Group; Rich Yamarone, chief economist at Bloomberg; Lakshman Achuthan, the guiding light at ECRI; and Vikram Mansharamani, a Yale professor and author of Boombustology. These are the proverbial smartest guys in the room, and I posed a series of questions to them about the timing of the next recession, their thoughts on the upcoming election, and the economy in general.

I’ll take up their range of predictions and consensus regarding the recession call in this weekend’s letter, and go into some of the risks these gentlemen see, as well as dive into more of my notes from the conference.

My decision to skip games three and four of the NBA finals and hold out for a game six, knowing it could be the closer and hoping to watch the Cavaliers win it from a box seat down front, now looks a bit suspect. I may not be making that trip unless LeBron and the Cavs get their act together and sweep the Warriors at home. After those first two games, I think I’ll hold off booking the tickets.

You have a great week. I think I’ll call a few more friends and get some additional takes on a recession. Just for giggles and grins.

Your thinking-about-portfolio-risk analyst,

John Mauldin, Editor
Outside the Box

Large Bank Risk: Liquidity, Not Capital, Is the Issue

“Credit means that a certain confidence is given, and a certain trust reposed. Is that trust justified? And is that confidence wise? These are the cardinal questions.”

– Walter Bagehot, Lombard Street (1873)

Summary

· Kroll Bond Rating Agency (KBRA) notes that since the 2008 financial crisis and the passage of the Dodd-Frank legislation two years later, global financial regulators have been pushing a deliberate agenda to increase the capitalization of large banks. Despite the fact that the 2008 financial crisis was not caused by a lack of capital inside major financial institutions, raising capital levels has become the primary policy response among many of the G-20 nations.

· KBRA believes that using higher capital to change bank profitability and, indirectly, corporate behavior is a rather blunt tool for the task of ensuring the stability of financial markets. Part of the problem with using capital as a broad prescription for avoiding rescues of large financial institutions, aka “too big to fail” or TBTF, is that this approach explicitly avoids addressing the actual cause of the problem, namely errors and omissions by major banks that undermined investor confidence.

· One of the key fallacies embraced by regulators and policy makers is the notion that higher capital levels will help TBTF banks avoid failure and that, even in the event of failure, a large bank will not require public support. KBRA believes there is no evidence that higher levels of capital would have prevented the “run on liquidity” that caused a number of depositories and non-banks to fail starting in 2007.

Discussion

Since the 2008 financial crisis and the passage of the Dodd-Frank legislation two years later, global financial regulators have been pushing a deliberate agenda to increase the capitalization of large banks. The objective of this increase in capital, we are told, is to make public rescues of the largest banks less likely and to change their corporate behavior. Despite the fact that the 2008 financial crisis was not caused by a lack of capital inside major financial institutions, raising capital levels has become the primary policy response among many of the G-20 nations and the prudential regulators who oversee global banks.

Most recently, Federal Reserve Board Governor Daniel Tarullo revealed on Bloomberg TV (June 2, 2016) that he is “quite confident” that the eight largest U.S. banks will get hit with an additional capital surcharge that will translate into a “significant increase” in capital. However, he noted that there will be “some offsets in other parts of the stress tests so that it won’t be just a straight addition of the surcharge.” Tarullo opined that he doesn’t think the charge will go into effect for the next round of tests, and instead there might be a “phase in.”
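
To put rough numbers on what a surcharge means in practice, here is a minimal sketch in Python. The 4.5% minimum and 2.5% conservation buffer are standard Basel III figures; the 2% surcharge and the $1.5trn balance-sheet base are invented for illustration and are not KBRA’s or the Fed’s numbers.

```python
# Illustrative arithmetic only: how a capital surcharge translates into
# dollars of required equity for a hypothetical bank. The 4.5% minimum
# and 2.5% conservation buffer are standard Basel III figures; the 2%
# surcharge and the $1.5trn RWA base are invented for this example.

def required_cet1(risk_weighted_assets: float,
                  minimum: float = 0.045,
                  conservation_buffer: float = 0.025,
                  surcharge: float = 0.0) -> float:
    """Common equity requirement, in dollars, for a given RWA base."""
    return risk_weighted_assets * (minimum + conservation_buffer + surcharge)

rwa = 1_500e9  # hypothetical risk-weighted assets

base = required_cet1(rwa)
bumped = required_cet1(rwa, surcharge=0.02)
print(f"without surcharge: ${base / 1e9:,.0f}bn")
print(f"with 2% surcharge: ${bumped / 1e9:,.0f}bn")
print(f"additional equity: ${(bumped - base) / 1e9:,.0f}bn")
```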

Lawmakers and federal regulators have made a number of other changes in the regulation of US banks that affect asset allocation and risk taking, including a greater emphasis on liquidity and an end to principal trading. Policy makers have explicitly ruled out direct punishment for individual or institutional instances of fraud; thus we are left with an indirect approach that punishes the creditors, shareholders and customers of the largest banks. President Obama formed a “Financial Fraud Enforcement Task Force” in November 2009 to “hold accountable those who helped bring about the last financial crisis,” but the Obama administration has generally chosen to pursue institutions over individuals when it comes to fraud prosecutions.

“More equity may get [bank] boards to care more,” argues Dr. Anat Admati of Stanford University, but KBRA believes that using higher capital to change bank profitability and, indirectly, corporate behavior is a rather blunt tool for the task of ensuring financial stability.

Part of the problem with using capital as a broad prescription for avoiding rescues of large financial institutions, aka “too big to fail”, is that this approach explicitly avoids addressing the actual cause of the problem, namely errors and omissions by the officers and directors of major banks that undermined investor confidence. A combination of poor loan underwriting, excess risk taking in the trading and investment portfolios, deliberate acts of deceit, a systemic failure to disclose the true extent of bank liabilities, and/or acts of securities fraud actually caused the failure of, or the need to rescue, institutions such as Wachovia Bank, Washington Mutual, Lehman Brothers, Bear, Stearns & Co., American International Group (NYSE:AIG) and Citigroup (NYSE:C), to name but a few. These rescues or events of default were driven by a sharp decline in the liquidity available to these obligors and led to the wider financial crisis of 2008 and beyond.

Thus when regulators and policy makers sign on to the idea of higher capital levels as a solution for TBTF, are we not all effectively burying our collective heads in the sand? In mid-2008, when Wachovia was receiving inquiries from bond investors about early redemption of long-term debt, the bank’s stated level of balance sheet capital was not at issue. Instead, investors, counterparties, and corporate/institutional depositors were concerned that they no longer understood or trusted the bank’s asset quality and financial statements, and therefore backed away from any risk exposures with the bank. This is also why the Federal Reserve Board and Treasury chose to conceal the true condition of Wachovia from the FDIC, as former FDIC Chairman Sheila Bair documents in her 2013 book.

Meeting creditors’ demands for payment requires holding liquidity--cash, essentially, or close equivalents. But neither individual institutions, nor the private sector as a whole, can maintain enough cash on hand to meet a demand for liquidation of all, or even a substantial fraction of, short-term liabilities... [H]olding liquid assets that are only a fraction of short-term liabilities presents an obvious risk. If most or all creditors, for lack of confidence or some other reason, demand cash at the same time, a borrower that finances longer-term assets with liquid liabilities will not be able to meet the demand.
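
The mechanism described in that passage can be made concrete with a toy model. The sketch below, with invented balance-sheet numbers, asks only whether liquid assets cover a given redemption shock; note that equity capital never enters the survival test.

```python
# Toy bank-run model: a bank funding long-term assets with short-term
# liabilities survives a redemption shock only if its liquid assets
# cover the cash demanded. Equity capital appears nowhere in the
# survival test; solvency and liquidity are different questions.

from dataclasses import dataclass

@dataclass
class Bank:
    liquid_assets: float            # cash and near-cash, $bn
    illiquid_assets: float          # long-term loans, $bn
    short_term_liabilities: float   # deposits, repo, CP, $bn

    def survives_run(self, redemption_rate: float) -> bool:
        """True if liquid assets cover the fraction of short-term
        creditors demanding cash at once."""
        demand = redemption_rate * self.short_term_liabilities
        return self.liquid_assets >= demand

# Hypothetical, apparently well-capitalised bank: only a sliver of
# the balance sheet is actually liquid.
bank = Bank(liquid_assets=40.0, illiquid_assets=460.0,
            short_term_liabilities=400.0)

for rate in (0.05, 0.10, 0.20):
    print(f"{rate:.0%} of creditors demand cash -> "
          f"{'survives' if bank.survives_run(rate) else 'fails'}")
```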

There are two basic reasons why the current fixation with higher capital levels should be a cause for concern among policy makers. First, there is no evidence that higher levels of capital would have prevented the “run on liquidity” which caused a number of large depositories and non-banks to fail starting in 2007. Reckless and questionable financial decisions, characterized for example by a failure to properly evaluate the creditworthiness of borrowers, were the proximate causes of an erosion in investor confidence which ultimately caused these firms to collapse. (See Whalen, Richard Christopher, “The Subprime Crisis: Cause, Effect and Consequences,” Networks Financial Institute Policy Brief No. 2008-PB-04 (2008), http://ssrn.com/abstract=1113888)

Careful observers of the banking scene in the 2000s noted that names such as Washington Mutual and Countrywide Financial were starting to contract in terms of sales volumes and access to liquidity as early as 2005. The originate-to-sell mortgage production models used by these and other banks depended crucially on access to stable market funding and a steady supply of new paper. In mid-2007, when Bank of America (NYSE:BAC) announced a partial rescue for its largest warehouse customer, Countrywide, the mortgage bank led by Angelo Mozilo was already doomed because of ebbing loan volumes and liquidity. More non-bank than commercial bank, Countrywide funded half of its balance sheet from non-deposit, market sources.

Second, significantly higher capital levels and other regulatory constraints reduce the profitability of banks and limit credit expansion. The fact that the U.S. banking industry was able to fund the post-crisis cleanup internally by diverting income is a remarkable achievement, yet the response from policy makers has been to take deliberate actions that make banks less profitable and less able to fund future losses.

Moreover, higher capital levels have negative effects on capital formation and credit creation that may work against the broader goals of financial stability and economic growth. Witness the declining bank lending volumes in the US residential mortgage market. Banks which cannot achieve sufficient equity returns to retain investors will, over time, either shrink or discontinue businesses altogether in order to survive. Under the current regulatory regime, banks in the G-20 nations are effectively being turned into utilities which take little or no credit risk and thus do not support economic activity.
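
The profitability arithmetic here is mechanical: return on equity is return on assets times leverage, so raising the equity a bank must hold against the same assets lowers the leverage multiple and, with it, the return available to shareholders. A minimal sketch, with invented numbers:

```python
# Return on equity as return on assets divided by the equity ratio
# (equivalently, ROA times leverage). Doubling required equity against
# the same assets halves the return to shareholders. All numbers are
# invented for illustration.

def roe(return_on_assets: float, equity_ratio: float) -> float:
    """ROE = ROA * (assets / equity) = ROA / equity ratio."""
    return return_on_assets / equity_ratio

roa = 0.009  # 0.9% return on assets, an illustrative figure

for equity_ratio in (0.06, 0.09, 0.12):
    print(f"equity at {equity_ratio:.0%} of assets -> "
          f"ROE ~ {roe(roa, equity_ratio):.1%}")
```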

Not only do higher capital levels and other forms of punitive regulation reduce the availability of credit from depositories, but these strictures will tend to force consumers and businesses to seek out credit from unconventional sources that may actually increase systemic risk to the financial system. The proliferation of various types of non-bank lenders purporting to offer “new” business models is a familiar response to increased regulation and tougher prudential standards. Many of these lenders rely on originate-to-sell models similar to those used to originate subprime mortgages in the 2000s. JPMorgan Chase & Co. Chief Executive Officer Jamie Dimon, for example, says marketplace lenders might find that their sources of funding evaporate during a downturn. (Hugh Son et al., “Dimon Says Online Lenders’ Funding Not Secure in Tough Times,” Bloomberg News, May 11, 2016.)

Capital vs. Confidence

One of the key fallacies embraced by bank regulators is the notion that higher capital levels will help TBTF banks avoid failure and that, even in the event of failure, a large bank will not require public support. First and foremost, banks fail not because they run out of capital, but because a lack of confidence results in a diminution of the liquidity available to the enterprise.

Indeed, during and after the 2008 financial crisis, with the notable exception of Citigroup and AIG, U.S. banks as a group did not require government support and consumed little capital in resolving failed institutions. Instead, banks diverted current income to fund loan loss provisions and FDIC insurance premiums. Using data from the FDIC, Chart 1 shows provisions, net charge-offs, and pretax income for all U.S. banks since 1990.

Note that the sharp drop in industry operating income in 2008-2009 included the cost of pre-paying several years of FDIC insurance premiums. Not only did the U.S. banking industry fund the clean-up of most failed banks privately and without taxpayer support, but the financial crisis turned out to be an issue of reduced income rather than capital impairment. Though hundreds of banks did fail because of loan losses, the balance sheets of these institutions were marked to market and absorbed by the surviving banks, which largely used income rather than capital to manage the resolution process. Indeed, at no point did any major bank “run out of capital”; the institutions that did fail stumbled long before, for lack of cash liquidity, and were sold by the FDIC.
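
A stylised example may help here. The sketch below, with invented figures, shows the accounting point: losses are charged against provisions funded out of current income, and equity is only depleted once losses exceed what income can absorb.

```python
# Stylised loss-absorption waterfall: loan losses are charged first
# against provisions funded from current earnings; equity capital is
# only depleted once losses exceed pre-provision income. All figures
# are invented for illustration.

def year_end_capital(capital: float,
                     pre_provision_income: float,
                     loan_losses: float) -> float:
    """Equity after charging losses against income first."""
    residual = loan_losses - pre_provision_income
    return capital - max(residual, 0.0)

capital = 100.0   # starting equity, $bn
income = 25.0     # pre-provision operating income, $bn

for losses in (10.0, 25.0, 40.0):
    end = year_end_capital(capital, income, losses)
    print(f"losses ${losses:.0f}bn -> year-end capital ${end:.0f}bn "
          f"({'income-funded' if end == capital else 'capital hit'})")
```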

Ultimately, market liquidity is a function of investor confidence, not capital. Cash flowed into the largest banks in the weeks after the failure of Lehman Brothers because the banks were big and investors believed these banks would receive government support. Liquidity is the key determinant of whether a bank or nonbank fails. Indeed, for most credit professionals surveyed by KBRA, credit spreads, ratings and other dynamic market indicators are far more important measures of particular counterparty risk than static, backward-looking measures of balance sheet capital.
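
As a rough illustration of why a market spread is informative: under the common “credit triangle” approximation, the annual spread is roughly the default probability times loss-given-default, so an observed spread can be inverted into an implied default rate. The figures below are hypothetical, and the approximation ignores risk premia and liquidity effects.

```python
# Rough "credit triangle" inversion: spread ~ PD * LGD, so an observed
# market spread implies a default probability in a way a static capital
# ratio cannot. Approximation only; real pricing embeds risk premia and
# liquidity effects. Numbers are hypothetical.

def implied_pd(spread_bps: float, lgd: float = 0.6) -> float:
    """Annualised default probability implied by a credit spread."""
    return (spread_bps / 10_000) / lgd

for spread in (50, 150, 400):   # calm, stressed, crisis-level spreads
    print(f"{spread:>3} bps spread -> implied PD ~ {implied_pd(spread):.1%}")
```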

In Chart 2, we show total capital vs. loss reserves since 1990. Again, aside from accounting adjustments and some large bank resolutions in the 2008-2009 period (see circle in Chart 2), the U.S. banking industry has continued to build capital steadily. When Wachovia Corp was acquired by Wells Fargo at the end of 2008, the target charged off its entire loss reserve and equity capital in Q3 2008, resulting in a substantial write-down of doubtful assets and the creation of a loss reserve for the acquirer. As the FDIC noted at the time, this transaction involving the fourth-largest U.S. bank holding company skewed the aggregate industry data during that reporting period.

Conclusion

In his famous exchange with attorney Samuel Untermyer over a century ago, John Pierpont (“JP”) Morgan stated the problem of bank solvency correctly and for all time. In those days, bear in mind, the Fed did not exist and JPMorgan & Co was the de facto central bank. Because Morgan was not a member of the New York Clearinghouse, other banks had to stand in line inside the bank’s lobby to transact business:

Untermyer: “Is not commercial credit based primarily upon money or property?”

Morgan: “No, sir. The first thing is character.”

Untermyer: “Before money or property?”

Morgan: “Before money or property or anything else. Money cannot buy it ... because a man I do not trust could not get money from me on all the bonds in Christendom.”

The chief flaw with the current regulatory focus on capital, KBRA believes, is that it ignores important qualitative factors involved with the ownership and management of banks that ultimately determine corporate behavior. When banks and non-banks decided to underwrite and sell bonds based upon subprime mortgages in the 2000s, the level of balance sheet capital was not at issue. Merely raising the level of capital required for banks may provide the illusion of progress in the minds of many policy makers, but for investors the most basic issue involved in any counterparty risk assessment comes down to trust.

Managing the liquidity of a bank or non-bank involves not just cash and collateral, but also reputation and transparency. Measuring the static level of capital on a bank’s balance sheet may provide some comfort as to enhanced financial stability. Managing liquidity, however, is a dynamic task that defies easy quantification but is, at day’s end, crucial to maintaining financial and economic strength. By focusing much of the attention of regulators and policy makers on the static issue of capital, KBRA believes, we are not addressing the true qualitative, behavioral issues that undermined investor confidence in all types of financial institutions and led to the 2008 financial crisis.

“If you know the other and know yourself, you need not fear the result of a hundred battles.”

– Sun Tzu

We are travelers on a cosmic journey, stardust, swirling and dancing in the eddies and whirlpools of infinity. Life is eternal. We have stopped for a moment to encounter each other, to meet, to love, to share. This is a precious moment. It is a little parenthesis in eternity.