Flying an airplane today is not what it used to be: flying an Airbus A380 means working with a complex computer system and software that make things easier under routine conditions and hair-raising when things go awry. Finance is no different as digitization takes over. This paper attempts to capture how markets, financial advising and trading are being transformed by technology, essentially through the application of mechanical and electrical engineering principles, with the underlying mathematics captured in sophisticated software. Many changes are thus affecting the stock markets, financial advising and trading, and there is a glut of information on these subjects. The main objective of this paper is to sort out the most important elements among these changes so that professionals in the field can have a clear picture of their changing operating world. The secondary objective is to provide key material to faculty members who teach ‘Securities Analysis’ and ‘Advanced Finance’.

Note: this is a compilation paper that makes use of authoritative information from credible sources as well as the insights of the two authors; it is thus similar in character to a reader, an anthology of selected texts. Due credit is given to the authors and sources.

Introduction

Computer applications have had a tremendous impact on all human undertakings and were bound to have a significant effect on the stock markets, financial advising and trading. Recent changes are, to say the least, dramatic. The stock markets constitute a network, a vast array of complex machinery best understood when looked at from the angle of mechanical engineering. The dynamics of the markets via trading is essentially an electronic ‘game’ applying principles of signal analysis. Financial engineering is a blend of mechanical and electronic engineering principles, with the ‘quants’ in command.

Traditional stock markets and trading

A market in the traditional sense is basically a physical place where those interested in selling and buying meet and conduct an exchange transaction. The New York Stock Exchange is well known, and Wall Street is world famous as the physical location where stocks are traded. The financial instruments themselves have changed over the years, as have the trading methods.

Major factor impacting operating modalities

Significant change in operating modalities was brought about by ‘securitization’, which in fact impacts more than the stock markets. Securitization is the process of taking an illiquid asset, or group of assets, and, through financial engineering, transforming it into a security.

First, a regulated and authorized financial institution originates numerous mortgages, which are secured by claims against the various properties the mortgagors purchase. Then, all of the individual mortgages are bundled together into a mortgage pool, which is held in trust as the collateral for an MBS. The MBS can be issued by a third-party financial company, such as a large investment banking firm, or by the same bank that originated the mortgages in the first place. Mortgage-backed securities are also issued by aggregators such as Fannie Mae or Freddie Mac.

Regardless, the result is the same: a new security is created, backed by the claims against the mortgagors’ assets. This security can be sold to participants in the secondary mortgage market. This market is extremely large, providing a significant amount of liquidity to the group of mortgages, which otherwise would have been quite illiquid on their own.

Furthermore, at the time the MBS is being created, the issuer will often choose to break the mortgage pool into a number of different parts, referred to as tranches. These tranches can be structured in virtually any way the issuer sees fit, allowing the issuer to tailor a single MBS for a variety of risk tolerances. Pension funds will typically invest in high-credit rated mortgage-backed securities, while hedge funds will seek higher returns by investing in those with low credit ratings.
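The tranching described above can be illustrated with a toy cash-flow ‘waterfall’. The sketch below uses entirely hypothetical figures and tranche names; real MBS structures involve far more elaborate payment rules.

```python
# A simplified cash-flow "waterfall": a mortgage pool's monthly
# collections are paid out to tranches in order of seniority, so any
# shortfall is absorbed by the most junior tranche first.

def distribute(collections, tranches):
    """Pay each tranche its scheduled amount, senior first; return payouts."""
    payouts = {}
    remaining = collections
    for name, due in tranches:  # tranches listed senior -> junior
        paid = min(due, remaining)
        payouts[name] = paid
        remaining -= paid
    return payouts

# Hypothetical pool: 95 units collected against 100 scheduled.
pool_tranches = [("senior", 70.0), ("mezzanine", 20.0), ("equity", 10.0)]
print(distribute(95.0, pool_tranches))
```

The junior tranche absorbs the entire shortfall, which is why pension funds favour the senior, higher-rated pieces while hedge funds chase the riskier, higher-yielding ones.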

The following article by Karl Flinders gives a comprehensive and forward-looking picture of how technology has been impacting the markets:

Technology has contributed to a bang and a crash at the London Stock Exchange and created an invisible world where billions of pounds change hands in milliseconds. But with EU red tape altering the financial sector’s landscape, technology’s evolutionary journey at the London Stock Exchange is far from over.

Nestling in Paternoster Square in the shadow of St Paul’s Cathedral, the London Stock Exchange, which makes its money through charging investors fees for trading shares and selling market data, is a technology pacesetter.

For a trading venue, the faster and more efficiently it can carry out a deal and the more up to date information it can store and retrieve, the more attractive it is to investors. These investors want to buy or sell shares quickly, to prevent changes in price during the transaction. Accurate market data is also important for investors to make informed choices.

Share trading took centre stage almost 300 years after share prices were published twice a week on a 10-by-4-inch sheet of paper and distributed from Jonathan’s Coffee-house in London. The year 1986 saw what is known as the financial sector’s Big Bang.

It was the end of October 1986 when the Stock Exchange Automated Quotation system replaced the trading floor. This screen-based quotation system was used by brokers to buy and sell stock rather than meeting face to face.

Technology’s major impact

The shortening of the period between a trade being initiated and completed, or the reduction of latency as it is known, is the ultimate aim of any stock exchange worth its salt.

The Big Bang of 1986 did this and more. “It brought significant benefits to both institutional and private investors, with private investors gaining low-cost independent access to the market through the proliferation of new services,” says Robin Paine, chief technology officer at the London Stock Exchange.

Cheap and efficient trading is what securities traders wanted and that is what they got. Volumes transacted saw unprecedented increases, with the average daily number of trades going through the ceiling.

The trading floor where dealers met remained, and was used in emergencies while the technology was in its infancy. However, this soon became a thing of the past as electronically-generated trading volumes rose unabated.

Just before the Big Bang’s meteoric impact, the average number of daily trades at the London Stock Exchange was 20,000, amounting to about £700m worth of shares changing hands. After the introduction of automated trading the figure went up to a daily average of 59,000 trades a few months later.

In 1987 the London Stock Exchange was transacting as much business in a month as it did in a whole year before 1986, with an average daily value of £1bn. Today, the average daily number of shares traded is 566,000, with an average daily value of £16.6bn.

These figures would be impossible to reach without technology that can reduce the time taken to complete a deal and handle massive volumes.

“Without technology, exchanges could not accommodate the increased transaction flows that are generated both by the proliferation of end investors, and by electronic trading, algorithms and low latency,” says Bob McDowall, analyst at TowerGroup.

The stock market crash

But the technological transformation was not plain sailing. No major technological advance with such a deep impact on how an industry operates can be introduced without a hitch.

It was no exception: the stock market crash of 20 years ago, which saw share prices plummet, was more than a hitch, and was partly a result of the immaturity of the new technologies introduced in the Big Bang.

Trading in certain stocks could not be stopped and spiralled out of control. Eventually stocks across the world lost billions of pounds in value, and the London Stock Exchange lost 23% of its value in a single day.

McDowall says that although technology and the automation of selling did not cause the 1987 crash, technology did contribute to the velocity of the fall in share prices.

“The technology at that time lacked refinement to react to a wider range of factors beyond the share prices themselves,” he says.

Technology went through a quick facelift after the City woke up the morning after 1987’s Black Monday.

McDowall said the exchange had to introduce circuit breakers very quickly into the markets. These limited the velocity at which share prices could fall, before a halt was called to trading in the particular stock.
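The circuit breakers McDowall describes can be sketched as a simple rule: halt trading in a stock once its price has fallen by more than a set fraction from a reference price. The function and the 10% threshold below are hypothetical illustrations; real exchange rules are tiered and time-dependent.

```python
def circuit_breaker_tripped(reference_price, last_price, halt_threshold=0.10):
    """Return True when the fall from the reference price reaches the
    threshold, i.e. when trading in the stock should be halted."""
    drop = (reference_price - last_price) / reference_price
    return drop >= halt_threshold

# An 11% fall trips a hypothetical 10% breaker; a 5% fall does not.
print(circuit_breaker_tripped(100.0, 89.0))  # True
print(circuit_breaker_tripped(100.0, 95.0))  # False
```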

Algorithmic trading

These circuit breakers became more important with the proliferation of algorithmic trading. It is not humanly possible to manually transact the number of trades done on the stock exchange today. To reach these levels there must be a certain level of automation. Hence computers are today initiating many trades using algorithms.

Algorithmic trading, or “algo trading” as it is known in the financial sector, relies on computer systems to buy shares automatically when predefined market conditions are met.
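As a sketch of the ‘predefined market conditions’ idea, the toy rule below signals a buy when a short-term moving average of prices sits above a long-term one. The rule, window lengths and prices are purely illustrative assumptions; production algorithms are vastly more sophisticated.

```python
def should_buy(prices, short=5, long=20):
    """Toy algo-trading condition: signal a buy when the short-term
    moving average of prices is above the long-term one."""
    if len(prices) < long:
        return False  # not enough history to evaluate the condition
    short_avg = sum(prices[-short:]) / short
    long_avg = sum(prices[-long:]) / long
    return short_avg > long_avg

print(should_buy(list(range(1, 26))))      # steadily rising prices -> True
print(should_buy(list(range(25, 0, -1))))  # steadily falling prices -> False
```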

This method of trading is the future, says Paine. “The markets will continue to be further digitised with the proliferation of algorithms set to increase. About half of all volume on the exchange now is electronically generated and we believe this trend will continue.”

The rest is generated by manual intervention where traders submit orders using an interactive screen.

Jonas Rodny, senior communications manager at the Nordic Exchange, said although it is difficult to be precise about levels of algorithmic and automated trading at the exchange, these are responsible for a significant amount of transactions.

“Our assumption is that both algorithmic and automated trading are growing very rapidly, currently accounting for at least a fifth of the overall trading volume on the Nordic Exchange and possibly quite a lot more,” he says.

The Nordic Exchange was created in 2006 by integrating the exchanges in Stockholm, Copenhagen, Helsinki, Iceland, Tallinn, Riga and Vilnius. OMX operates the Nordic Exchange and has a technology arm that develops technology for the exchange as well as licensing technology out to others.

The London Stock Exchange

Given the technological advancement in the 1980s and the resulting metamorphosis of the London Stock Exchange, it is no surprise that the company takes technology so seriously.

In 2003 the exchange instigated its Technology Roadmap, and after four years the exchange’s all-singing, all-dancing core trading platform Tradelect was launched.

Since its July launch the platform has set record after record in terms of the volumes and the values traded. In August this year the exchange processed a record £17.62bn of transactions in one day on Tradelect.

But there is no time to sit back and watch in a sector where technological innovation can so dramatically impact a company’s financial performance.

Constant innovation is essential if the exchange is to be able to compete with an increased number of trading venues. To this end the London Stock Exchange’s Technology Roadmap II has already been initiated.

Rodny said the Nordic Stock Exchange’s heritage is built on technological innovation, and the challenges it faces are twofold. Exchanges need to be able to provide sufficiently low latency to support more frequent and faster trading, which allows investors to seize market opportunities more quickly.

“The other key challenge arises from the fact that the continuous increase in volumes puts further constraints on capacity, not just at exchange level, but along the entire transaction chain,” says Rodny.

The future of European exchanges

Recent forces driving innovation at the exchanges across Europe stem from the European Union’s Markets in Financial Instruments Directive (Mifid). This piece of pan-European red tape has introduced more competition in the stock trading sector.

Mifid has compelled EU nations to remove what is known as the concentration rule that states that all trades must go through local exchanges. This has been the case for some time in the UK, but now it is happening across Europe and will inevitably lead to the creation of more alternative trading and reporting venues.

Two projects known as Boat and Turquoise have been created to offer trade reporting and execution facilities, respectively, on the back of Mifid. Turquoise was set up by Citigroup, Credit Suisse, Deutsche Bank, Goldman Sachs, Merrill Lynch, Morgan Stanley and UBS as an alternative trading venue, while Boat was developed by a consortium including many of the above-mentioned banks to offer a trade reporting venue.

KPMG consultant Lee Epstein says Mifid has opened up the stock trading and reporting sector to new players because it becomes more attractive for them to be able to work across Europe. “Before this you had so many different rules across Europe it was difficult,” he says.

A fragmented market

He says the introduction of alternative trade execution and reporting venues following Mifid will fragment the market, and technology will be important to differentiate venues.

Nemone Wynn-Evans, head of market development at UK-based exchange Plus Markets, says innovation will focus on succeeding in an increasingly fragmented market which increases competition and introduces new challenges.

“The impact of fragmentation and the lowering of transaction costs will mean huge volume increases in transaction data, and in particular market data,” says Wynn-Evans.

Technical innovation is required to be able to use all this data to optimise trades, she says. “The challenge of data volumes is not just an issue for investors, but also for surveillance functions and regulators.”

Plus Markets, which is a Recognised Investment Exchange in the UK, is currently installing new trading and market surveillance technology in conjunction with OMX to expand its stock coverage and enable algorithmic trading.

McDowall agrees that continuous innovation is essential. “It is an important factor if it provides business innovation combined with greater efficiency, speed of execution and reduction in costs.”

Rodny says innovation around speed, capacity and flexibility are important. “Capacity to take care of the increased volumes, speed in order to provide algo trading and flexibility to be able to integrate trading across asset classes and across markets.”

So in a computerised environment where high speed, high volume trading is critical, technology has a strong hand to play.

Add to this the need to retain massive amounts of data and be able to access it efficiently and you have a boardroom that appreciates the value of technology and will not shy away from investing in it.

The London Stock Exchange is an example of how a centuries-old organisation can meet today’s business challenges through an acute focus on technology innovation.

Technology allows us to have a picture of what is going on in stock markets around the world, and it is of course possible for investors to invest in various markets. With high-speed communication, what happens in one major market may obviously influence what happens in other markets, and time differences are a major factor in ‘market interconnection’. A deeper question is whether our prevailing models of these technology-driven markets are sound at all, a question taken up in the following review of a recent book on the financial crisis.

How wrong were we? After several years of commentary on the causes of the financial crisis, we still struggle to plumb the full depths of the event. We have tossed up, like so much confetti, a variety of culprits, both human and systemic, many of which undoubtedly played some role and had some complicity: from executive compensation to lack of transparency to Alan Greenspan to Congress to credit default swaps. Journalists have been excoriated for missing what was apparently obvious; homeowners blamed; Wall Street pilloried; economists accused of intellectual dishonesty. We have chewed over questions of structure, size, capital, leverage, risk. We have put the Zeitgeist (blame the ‘60s!) in the chair, probed trade imbalances, decried the presence of greed and taken refuge in irrational impulses. And yet, there is a strong sense that we are just swirling pieces of a jigsaw puzzle across the table. There remains a feeling that perhaps we were wrong in some deeper way.

Enter Beyond Mechanical Markets: Asset Price Swings, Risk and the Role of the State, a book published earlier this year by two economics professors, New York University’s Roman Frydman and the University of New Hampshire’s Michael D. Goldberg, that has elicited remarkably little discussion in the U.S. (it’s done better in Europe, but that’s another story). Beyond Mechanical Markets is a serious piece of work that’s based on research the pair has been doing for some time; while it’s “about” the financial crisis, its core ideas transcend that episode. It takes aim at a dominant macroeconomic impulse that, in popular terms (if anything seriously economic can be “popular”), encompasses the rational-expectations hypothesis. There have been several popular books that have taken aim at that set of ideas, from Justin Fox’s The Myth of the Rational Market to Yves Smith’s Econned: How Unenlightened Self Interest Undermined Democracy and Corrupted Capitalism. And Tufts’ Amar Bhidé’s recent “A Call for Judgment: Sensible Finance for a Dynamic Economy” touches on many aspects of this critique, but looks at it more from an organizational perspective: how rational expectations and efficient markets became embodied in deeply flawed risk management techniques like the Black-Scholes options-pricing model and value-at-risk tools.

Frydman and Goldberg’s thesis deals with more fundamental macroeconomic matters: To what extent can we predict the future? Is there a mechanical causal link that we can ever truly identify and quantify between past and future? They gather and deploy their intellectual confederates: Frank Knight, John Maynard Keynes, Friedrich Hayek, Karl Popper. They argue that rational expectations is one method, certainly a ubiquitous one, based on what they call a “fully predetermined model,” in which market players act as robots and markets operate as a kind of machine; another predetermined approach, they argue, is the New Keynesian school, that is, the formalization into mathematical models of Keynes’ “General Theory” of 1936; a third includes some of the more mechanical tendencies of the behavioral school. “To portray individuals as robots and markets as machines,” they write, “contemporary economists must select one overarching rule that relates asset prices and risk to a set of fundamental factors such as corporate earnings, interest rates and overall economic activity, in all time periods. Only then can participants’ decision-making process ‘be put on a computer and run.’” These models assume individuals possess “perfect” knowledge of how available information will affect future prices and risk. The causal factors need never change.

Once an economist assumes market participants have equal access to information, the rational-market model implies that prices reflect the “true” prospects of the underlying assets nearly perfectly. “Economists and many others thought that the theory of the rational market provides the scientific underpinning for their belief that markets populated by rational individuals set asset prices correctly on average. In fact, the theory is a proverbial castle in the air: it rests on demonstrably false premises that the future unfolds mechanically from the past, and that market participants believe this as well.”

From this base the pair argues a number of related points. Again, the rational-expectations hypothesis posits mechanical, fully predetermined, Newtonian markets. But many players in the markets are, in fact, rational, in the sense that they act in “reasonable” ways. Rational players do not just automatically use one model (and investors, in the real world, differ in approach, self-interest and interpretative emphasis); they recognize that their information is imperfect and that they are constantly buffeted by what Frydman and Goldberg call “nonroutine” change, such as innovations, perturbations of the Zeitgeist or, for that matter, revolutions and earthquakes. One of the great challenges for believers in mechanical markets is what the pair call “long-lasting asset swings” and what we often loosely and promiscuously characterize as bubbles. Ironically, to explain asset swings, many economists end up arguing that investors have been seized by bouts of irrationalism, crowd psychology and momentum trading or fooled by “informational problems, poor incentives, and inadequate competition,” allowing assets to diverge from intrinsic values, as determined by the model. The market, from a predetermined perspective, loses its moorings and has to eventually be reeled back by harsh reality. That belief that outside factors have marred the perfect operation of the market machine has been buttressed by some adherents of behavioral economics (which ironically helped undermine rational expectations in the first place) who replace predetermined market relations with predetermined psychological factors. The result is the same: The market, in a sense, loses its mind until its painful return to rationality.

To be sure, they note, lack of transparency, lousy incentives and psychological factors contribute to market problems and to the destructive result, a misallocation of capital and a painful correction. But even if they did not exist, they argue, assets would still swing because of the inevitability of imperfect knowledge. Participants know prices are growing excessive. But Frydman and Goldberg are arguing for a kind of middle way between two extremes and opposing tendencies in economics: the first, that markets allocate capital nearly perfectly; the second, that markets and participants are irrational, grossly inefficient at allocating capital and prone to a succession of bubbles. Each demands a different role for the state: in the first, a hands-off attitude to upswings in asset prices; in the second, a readiness to massively intervene. Getting your mind around where Frydman and Goldberg are going requires a sensitivity to terms and definitions. They are not arguing that prediction, for example, is impossible, but that “precise” prediction is. Forecasting can be successful, particularly over the short term and, over the longer term, by understanding what they call “qualitative and contingent” regularities or trends “in driving price swings.”

Both their market diagnosis and remedy sail a course between these extremes. Frydman and Goldberg dedicate much of the heart of this book to refuting the notion that asset swings represent a departure from reality. True, they argue, psychological factors such as confidence and optimism play a role in driving the market throughout the cycle, underpinned by fundamental considerations. The difference is that their notion of what is fundamental shifts over time as investors react to nonroutine change. In the late ‘90s, when the great upswing in prices of tech stocks was forming, there were good reasons for investors (and the pair discuss at some length the interaction of short-term speculators and longer-term value speculators) to be optimistic, even as prices exceeded historical market benchmarks: there was great optimism about technology; interest rates, inflation and unemployment were low; productivity was high; and despite some disturbing episodes (the Mexican default, the Asia Crisis, the Russian default, Long-Term Capital’s failure), America and the liberal West emerged relatively unscathed and seemingly in control. Similarly, a host of economic fundamentals — low interest rates, low unemployment, low inflation — fed the rise of housing prices. And in both cases, belief in rational markets — that any action to flatten those swings, or to prick a hypothetical bubble, would produce “distortions” worse than letting them play out — demanded a passive role from regulators. Frydman and Goldberg believe that long-lasting asset swings are inherent in how asset markets allocate capital. However, because market participants must base their trading decisions on imperfect knowledge, asset price swings can sometimes become excessive and lead to misallocations of capital.

How might that be done? This brings us to what they call “restoring the market-state imbalance.” They lay out a scheme in which regulators, such as the Federal Reserve or the Financial Stability Oversight Council, monitor markets and carefully and discreetly employ a variety of techniques — based on what they call Imperfect Knowledge Economics, or IKE — to try to dampen asset swings that exceed, either on the high end or the low, a wide range of values based on historical benchmarks. This is a kind of economics analogue to regulation by principle, seeking to reach beneficial outcomes through flexible, empirical response to dynamic conditions. Although they lay out a number of ways this kind of equity analogue to monetary policy might be implemented (much of their earlier work on IKE focused on foreign exchange markets), the scheme sometimes seems sketchy. It downplays the difficult technical and political task of regulators going into the markets to deflate what may, or may not, be dangerous asset swings. They agree with Ben Bernanke that regulators can easily move too soon, thus stifling, say, useful technological innovations. But they admit that more analysis needs to be done to give regulators better tools to pinpoint the best moment to act. And they generally ignore the regulatory-capture problem, which extends well beyond the fact that regulators embraced the orthodoxy of rational expectations over the past few decades. Rational expectations may have seemed to regulators to be true — it certainly was a seductive idea — but it also feeds regulatory desires to lead a peaceful life, to preside over prosperous times and to attain a comfortable retirement.
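The idea of dampening swings that exceed a wide range of values based on historical benchmarks can be caricatured in a few lines. Everything here, the band width, the benchmark and the return labels, is a hypothetical simplification for illustration, not Frydman and Goldberg's actual IKE machinery.

```python
def swing_assessment(price, benchmark, band=0.30):
    """Classify a price against a wide band around a historical
    benchmark (e.g. a long-run average valuation measure)."""
    upper = benchmark * (1 + band)
    lower = benchmark * (1 - band)
    if price > upper:
        return "above range"   # candidate for dampening on the high side
    if price < lower:
        return "below range"   # candidate for support on the low side
    return "within range"      # no intervention signalled

print(swing_assessment(140.0, 100.0))  # above range
print(swing_assessment(95.0, 100.0))   # within range
```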

Beyond Mechanical Markets is not an economics text heavy with math (the approach of their earlier book on IKE, Imperfect Knowledge Economics: Exchange Rates and Risk, was); it hearkens back to the narrative method of economics that arguably reached its apex with Keynes. Unfortunately, Frydman and Goldberg lack the elegance of Keynes, though they’re hardly alone. The book demands some sweat equity in readers and it assumes a more-than-passing familiarity with the substance of economic ideas and history; it has a circular quality, pounding home points, then shifting the perspective, and pounding them again. That said, it marshals a powerful argument that’s bolstered by empirical reality: the eternal failures of mechanical forecasting; the sheer difficulty of beating the market with consistency; the unforeseeable ways that history unfolds. The belief in precise prediction resembles a kind of utopian project, a tower of economic Babel. At bottom, the pair makes a philosophical point that Knight, Keynes and Hayek (ironic, they comment, given that rational expectations came out of Chicago, where Hayek taught) offered many decades ago: the combination of men and events, particularly in these manmade constructs called markets, certainly improves our ability to price assets (and to forecast) over that of an individual or bureaucracy. But that inclusion of freely determined humanity (or humanity that believes it has free will, which is the same thing) conspires to erode any simple, mechanical or guaranteed relation between past and future. They quote Popper: “Quite apart from the fact that we do not know the future, the future is objectively not fixed. The future is open: objectively opened.” At bottom, they’re trying to thread the needle in the ancient free will versus determinism argument.

Will they succeed? Will anything change? Not quickly. As they admit, the power of fully predetermined models may have actually increased because of the crisis. Economic pundits continue to speak with great certainty, and these issues are complex, nuanced and often hidden. Besides, the insurrection Frydman and Goldberg argue for is far greater than just an overthrow of rational expectations; it’s an entire economic world view that claims the power to accurately predict, forecast and capture market reality. Generally, the classic response of an orthodoxy (or what Thomas Kuhn famously called a paradigm) is to ignore any threat, not only out of fear of what might be lost (tenure, prizes, careers), but out of incomprehension; to the predetermined model builders, Frydman and Goldberg’s argument must literally seem like babble. That may well be the best explanation for the fact that these issues and this book can barely generate a debate in the United States.

Central banks are known to play a key role during economic downturns and when the stock markets face unusual challenges. They intervene through the widely known quantitative easing, which injects huge amounts of money into the markets through the purchase of bonds. A recent article shows a central bank going further and propping up share prices directly. In this context the market price of stocks is likely to bear little relation to the traditional link between price and company valuation. This kind of approach is likely to become more common and is thus a significant factor impacting the market. The following article gives an account of this type of intervention:

TOKYO (BLOOMBERG) – They may not realize it yet, but Japan Inc.’s executives are increasingly working for a shareholder unlike any other: the nation’s money-printing central bank.

While the Bank of Japan’s name is nowhere to be found in regulatory filings on major stock investors, the monetary authority’s exchange-traded fund purchases have made it a top 10 shareholder in about 90 per cent of the Nikkei 225 Stock Average, according to estimates compiled by Bloomberg from public data.

It’s now a major owner of more Japanese blue-chips than both BlackRock, the world’s largest money manager, and Vanguard Group, which oversees more than US$3 trillion (S$4.06 trillion).

To critics already wary of the central bank’s outsized impact on the Japanese bond market, the BOJ’s growing influence in stocks risks distorting valuations and undermining efforts to improve corporate governance.

Proponents, meanwhile, say the purchases provide a much-needed boost to investor confidence. With the Nikkei 225 down 8.3 per cent this year and inflation well below official targets, a majority of analysts surveyed by Bloomberg predict the BOJ will boost its ETF buying – a move that could come as soon as Thursday.

“For those who want shares to go up at any cost, it’s absolutely fantastic that the BOJ is buying so much,” said Shingo Ide, chief equity strategist at NLI Research Institute in Tokyo. “But this is clearly distorting the sanity of the stock market.”

Under the BOJ’s current stimulus plan, the central bank buys about 3 trillion yen (S$36.53 billion) of ETFs every year. While policy makers don’t disclose how those holdings translate into stakes of individual companies, estimates can be gleaned from publicly available central bank records, regulatory filings by companies and ETF managers, and statistics from the Investment Trusts Association of Japan. The BOJ declined to comment on Bloomberg’s findings.

The estimates reveal a presence in Japan’s top firms that’s rivaled by few others, with the BOJ ranking as a top 10 holder in more than 200 of the Nikkei gauge’s 225 companies. The central bank effectively controls about 9 per cent of Fast Retailing Co, the operator of Uniqlo stores, and nearly 5 per cent of soy sauce maker Kikkoman Corp. It has an estimated shareholder rank of No. 3 in both Yamaha Corp, one of the world’s largest makers of musical instruments, and Daiwa House Industry Co, Japan’s biggest homebuilder.

If the BOJ accelerates its ETF purchases this week to an annual rate of 7 trillion yen – the pace predicted by Goldman Sachs – the central bank could become the No. 1 shareholder in about 40 of the Nikkei 225’s companies by the end of 2017, according to Bloomberg calculations that assume other major stakeholders keep their positions unchanged. It could hold the top ranking in about 90 firms using HSBC Holdings’ estimate of 13 trillion yen.

While the BOJ’s ETF buying has come under fire from opposition lawmakers, Governor Haruhiko Kuroda has repeatedly defended the programme, saying as recently as last week the purchases aren’t big relative to the size of Japan’s stock market. At an estimated 8.6 trillion yen as of March, the BOJ’s holdings amount to about 1.6 per cent of the total capitalization of all companies listed in Japan. That compares with about 5 per cent held by the nation’s Government Pension Investment Fund. The central bank’s use of large-cap ETFs means its positions are concentrated, with less impact on the thousands of Japanese companies outside benchmark indexes.

State intervention in stock markets has worked out well for some countries. The US government spent US$245 billion to prop up banks during the global financial crisis in 2008, earning a profit of about US$30 billion on their investments as the industry recovered. At the height of the Asian Financial Crisis in August 1998, Hong Kong bought HK$118 billion (S$20.59 billion) of local shares to defend its currency peg, helping to fuel a rally that allowed it to dispose of the entire stake within five years.

In Japan, there’s little sign that BOJ share purchases have inflated Japanese valuations to dangerous levels. The Nikkei 225 trades at 16 times estimated earnings for the next 12 months, in line with the MSCI World Index. Over the past five years, the Japanese gauge has fetched an average premium of 14 per cent.

Still, the longer the BOJ’s buying persists, the bigger the risk that market prices will detach from fundamentals. Assuming Goldman Sachs’s prediction for more stimulus proves correct, the BOJ could end up owning a quarter of Mitsumi Electric Co, a supplier to Apple Inc, and 21 per cent of Fast Retailing by the end of 2017, estimates compiled by Bloomberg show.

With such large stakes sitting in index-tracking ETFs that lack a mandate to scrutinize company performance, the BOJ’s intervention could also hamper attempts to improve Japan’s corporate governance, according to Nicholas Benes, representative director of the Board Director Training Institute of Japan.

“The reality of index ETFs is that their commissions are very low and they cannot spend much on engagement or analysis for proxy voting,” Benes said.

The central bank said in December that it plans to buy additional ETFs that weigh holdings based on metrics that include research spending and employee wage growth, but the BOJ hasn’t started those purchases yet because the funds don’t exist.

While bulls have cheered the BOJ’s efforts to lift share prices, the central bank is bound to reverse its intervention at some point, a potential source of instability that Sumitomo Mitsui Trust Bank Ltd says is increasingly on the minds of long-term investors.

“Of course, you can argue that we’re in abnormal times so we have abnormal measures,” said Ayako Sera, a Tokyo-based market strategist at Sumitomo Mitsui. “The biggest question in the future will be: What happens when the BOJ exits?”

Robo-advisor: software that invests the resources of someone who does not want to go through the traditional method of financial advising; it is also lower in cost than the latter and arguably offers less self-interest and more objectivity.

Of course, one also has to focus on the particular stock and bond of a specific company, and the guiding principle here is market value added: a calculation that shows the difference between the market value of a company and the capital contributed by investors (both bondholders and shareholders). In other words, it is the sum of all capital claims held against the company (the market value of its debt plus the market value of its equity) less the capital invested. A company always has the choice between issuing stock (equity) and debt (in essence, bonds), and often combines the two.

The guiding theory here is the Modigliani-Miller theorem (of Franco Modigliani and Merton Miller), a theorem on capital structure that arguably forms the basis for modern thinking on the subject. The basic theorem states that under a certain market price process (the classical random walk), in the absence of taxes, bankruptcy costs, agency costs, and asymmetric information, and in an efficient market, the value of a firm is unaffected by how that firm is financed. Since the value of the firm depends neither on its dividend policy nor on its decision to raise capital by issuing stock or selling debt, the Modigliani-Miller theorem is often called the capital structure irrelevance principle.

The key Modigliani-Miller theorem was developed in a world without taxes. However, if we move to a world where there are taxes, where the interest on debt is tax deductible, and ignoring other frictions, the value of the company increases in proportion to the amount of debt used; the source of the additional value is the taxes saved by issuing debt instead of equity. Modigliani was awarded the 1985 Nobel Prize in Economics for this and other contributions. Miller was a professor at the University of Chicago when he was awarded the 1990 Nobel Prize in Economics, along with Harry Markowitz and William F. Sharpe, for their “work in the theory of financial economics,” with Miller specifically cited for “fundamental contributions to the theory of corporate finance.”
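A small worked example can make the two calculations concrete. All figures below are hypothetical, chosen only for illustration; they do not describe any real company:

```python
# Market value added (MVA) and the tax-adjusted Modigliani-Miller result,
# computed on hypothetical figures.

def market_value_added(market_value: float, capital_invested: float) -> float:
    """MVA = market value of debt and equity minus capital contributed."""
    return market_value - capital_invested

def levered_value(unlevered_value: float, tax_rate: float, debt: float) -> float:
    """With deductible interest, firm value rises by the debt tax shield t * D."""
    return unlevered_value + tax_rate * debt

# A firm worth 150m in the market after investors contributed 110m:
mva = market_value_added(150_000_000, 110_000_000)   # 40m of value created

# The same idea for capital structure: unlevered value 100m, 40m of debt,
# 21% tax rate (an assumed rate), so the tax shield adds 0.21 * 40m = 8.4m:
v_l = levered_value(100_000_000, 0.21, 40_000_000)   # 108.4m
```

In the frictionless baseline theorem the `tax_rate` term would be zero and the levered and unlevered values would coincide, which is exactly the irrelevance principle.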

The following case study shows a complex reality emerging; one should not make broad generalizations. It is drawn from “Fantasy Maths Spins Losses Into Profits” by Gretchen Morgenson of The New York Times:

Companies, if granted the leeway, will surely present their financial results in the best possible light. And of course they will try to persuade investors that the calculations they prefer, in which certain costs are excluded, best represent the reality in their operations.

Call it accentuating the positive, accounting-style.

What’s surprising, though, is how willing regulators have been to allow the proliferation of phony-baloney financial reports and how keenly investors have embraced them. As a result, the practice of major public companies reporting results that are not based on generally accepted accounting principles, or GAAP, has grown from a modest problem into a mammoth one.

According to a recent study in The Analyst’s Accounting Observer, 90 percent of companies in the Standard & Poor’s 500-stock index reported non-GAAP results last year, up from 72 percent in 2009.

Regulations still require corporations to report their financial results under accounting rules. But companies often steer investors instead to massaged calculations that produce a better outcome.

I know, I know — eyes glaze over when the subject is accounting. But the gulf between reality and make-believe in these companies’ operations is so wide that it raises critical questions about whether investors truly understand the businesses they own.

Among 380 companies that were in existence both last year and in 2009, the study showed, non-GAAP net income was up 6.6 percent in 2015 compared with the previous year.

Under generally accepted accounting principles, net income at the same 380 companies in 2015 actually declined almost 11 percent from 2014.

Another striking fact: Thirty companies in the study generated losses under accounting rules in 2015 but magically produced profits when they did the math their own way. Most were in the energy sector, which has been devastated by plummeting oil prices, but health care companies and information technology businesses were also in this group.

How can a company turn losses into profits? By excluding some of its costs of doing business. Among the more common expenses that companies remove from their calculations are restructuring and acquisition costs, stock-based compensation and write-downs of impaired assets.

Creativity abounds in today’s freewheeling accounting world. And the study found that almost 10 percent of the companies in the S.&P. 500 that used made-up figures took out expenses that fell into a category known as “other.” These include expenses for a data breach (Home Depot), dividends on preferred stock (Frontier Communications) and severance (H&R Block).

But these are actual costs, notes Jack T. Ciesielski, publisher of The Analyst’s Accounting Observer. “Selectively ignoring facts can lead to investor carelessness in evaluating a company’s performance and lead to sloppy investment decisions,” he wrote. More important, he added, when investors ignore costs related to acquisitions or stock-based compensation, they are “giving managers a free pass on their effectiveness in managing all shareholder resources.”

Lynn E. Turner was the chief accountant of the S.E.C. during the late 1990s, a period when pro forma figures really started to bloom. New rules were put in place to combat the practice, he said in an interview, but the agency isn’t enforcing them.

For example, Mr. Turner said, some companies appear to be violating the requirement that they present their non-GAAP numbers no more prominently in their filings than figures that follow accounting rules.

“They just need to go do an enforcement case,” Mr. Turner said of the S.E.C. “They are almost creating a culture where it’s better to beg forgiveness than to ask for permission, and that’s always really bad.”

As it happens, the commission is in the midst of reviewing its corporate disclosure requirements and considering ways to improve its rules “for the benefit of both companies and investors.”

This would seem to be a great opportunity to tackle the problem of fake figures. But such work does not appear to rank high on the S.E.C.’s agenda.

Kara M. Stein, an S.E.C. commissioner, expressed concern about this in a public statement on April 13. Among the questions the S.E.C. was not asking, she said: “Should there be changes to our rules to address abuses in the presentation of supplemental non-GAAP disclosure, which may be misleading to investors?”

With the presidential election looming, Mr. Ciesielski said it was unlikely that any meaningful rule changes on these types of disclosures would emerge anytime soon. That means investors will remain in the dark when companies don’t disclose the specifics on what they are deducting from their earnings or cash flow calculations.

Consider restructuring costs, the most common expense excluded by companies from their results nowadays.

“Why shouldn’t companies say, ‘This is a restructuring program that is going to take us four years to complete, and here are the numbers,’” Mr. Ciesielski said in an interview. “Restructuring programs cost cash. Why not face up to it and be real about what you’re forecasting? If everybody did that consistently, that would be a dose of reality.”

Mr. Turner, the former S.E.C. chief accountant, agreed. What investors need, he said, is a clearer picture of all items — both costs and revenues — that companies consider unusual or nonrecurring in their operations. These details should appear in a footnote to the financial statements, he said.

“We need to require the disclosure of both the good and the bad,” Mr. Turner said. “If you have a large nonrecurring revenue item, you need to disclose that as well as a nonrecurring expense. Then you should require auditors to have some audit liability for these items.”

Of course, some of the fantasy figures highlighted by companies are worse than others. Excluding the impairment of an asset, Mr. Ciesielski said, is “not the worst crime being committed. But when you’re backing out litigation expenses that go on every quarter, that’s a low-quality kind of adjustment, and those are pretty abhorrent.”

The bottom line for investors, according to Mr. Ciesielski and Mr. Turner, is to ignore the allure of the make-believe. Real-world numbers may be less heartening, but they are also less likely to generate those ugly surprises that can come from accentuating the positive.

Trading of stocks has changed tremendously in the recent past, spurred mainly by technological developments. Today, ordinary investors have access to trading platforms, and the cost of using them has fallen steadily.

Basically, there are two methods of stock market trading. The traditional way of trading occurs in an open outcry manner on the stock exchange floor of the stock market. Modern stock trading is conducted via electronic exchanges and all occurrences take place in real time online.

On the stock exchange floor, the stock market trading atmosphere is chaotic and noisy. The stock market is filled with hundreds of people gesturing, shouting and rushing around when the stock market is open. Stock traders are seen chatting on phones, entering data into computer terminals and watching the consoles closely.

With online stock market trading, computer networks are used as opposed to trading off the stock market floor. A large network of computers is employed to match sellers and buyers in the electronic market instead of using human stock brokers. Although this method is not as bustling and exciting as the stock market exchange floor, it is quicker and more effective.

To start traditional stock trading on the floor, a person asks the broker to purchase a given number of shares on the market. Once the request is made, the broker’s order department forwards the order to the floor clerk. The clerk then alerts a trader to locate another trader who will sell the shares the investor wants. The deal closes when the two traders agree on a price, with notification sent back the same way. Ultimately, the broker gets in touch with the investor to tell him the final price for the shares. The entire process may take a while, depending on the current market and the stocks involved. After a few days, the investor finally receives a confirmation in the mail.

Investing electronically is much faster and far less complicated. Computers match the buying and selling of stock in real time. Savvy investors have the distinct advantage of instant updates on stock trade happenings.

According to Investopedia, active trading is the act of buying and selling securities based on short-term movements to profit from the price movements on a short-term stock chart. The mentality associated with an active trading strategy differs from the long-term, buy-and-hold strategy. The buy-and-hold strategy employs a mentality that suggests that price movements over the long term will outweigh the price movements in the short term and, as such, short-term movements should be ignored. Active traders, on the other hand, believe that short-term movements and capturing the market trend are where the profits are made. There are various methods used to accomplish an active-trading strategy, each with appropriate market environments and risks inherent in the strategy. Here are four of the most common types of active trading and the built-in costs of each strategy. (Active trading is a popular strategy for those trying to beat the market average. To learn more, check out How To Outperform The Market.)

Day Trading
Day trading is perhaps the most well-known active-trading style. It’s often considered a pseudonym for active trading itself. Day trading, as its name implies, is the method of buying and selling securities within the same day. Positions are closed out within the same day they are taken, and no position is held overnight. Traditionally, day trading is done by professional traders, such as specialists or market makers. However, electronic trading has opened up this practice to novice traders. (For related reading, also see Day Trading Strategies For Beginners.)

Position Trading
Some actually consider position trading to be a buy-and-hold strategy and not active trading. However, position trading, when done by an advanced trader, can be a form of active trading. Position trading uses longer term charts – anywhere from daily to monthly – in combination with other methods to determine the trend of the current market direction. This type of trade may last for several days to several weeks and sometimes longer, depending on the trend. Trend traders look for successive higher highs or lower highs to determine the trend of a security. By jumping on and riding the “wave,” trend traders aim to benefit from both the up and downside of market movements. Trend traders look to determine the direction of the market, but they do not try to forecast any price levels. Typically, trend traders jump on the trend after it has established itself, and when the trend breaks, they usually exit the position. This means that in periods of high market volatility, trend trading is more difficult and its positions are generally reduced.

Swing Trading
When a trend breaks, swing traders typically get in the game. At the end of a trend, there is usually some price volatility as the new trend tries to establish itself. Swing traders buy or sell as that price volatility sets in. Swing trades are usually held for more than a day but for a shorter time than trend trades. Swing traders often create a set of trading rules based on technical or fundamental analysis; these trading rules or algorithms are designed to identify when to buy and sell a security. While a swing-trading algorithm does not have to be exact and predict the peak or valley of a price move, it does need a market that moves in one direction or another. A range-bound or sideways market is a risk for swing traders. (For more on swing trading, see our Introduction To Swing Trading.)

Scalping
Scalping is one of the quickest strategies employed by active traders. It includes exploiting various price gaps caused by bid/ask spreads and order flows. The strategy generally works by making the spread or buying at the bid price and selling at the ask price to receive the difference between the two price points. Scalpers attempt to hold their positions for a short period, thus decreasing the risk associated with the strategy. Additionally, a scalper does not try to exploit large moves or move high volumes; rather, they try to take advantage of small moves that occur frequently and move smaller volumes more often. Since the level of profits per trade is small, scalpers look for more liquid markets to increase the frequency of their trades. And unlike swing traders, scalpers like quiet markets that aren’t prone to sudden price movements so they can potentially make the spread repeatedly on the same bid/ask prices. (To learn more on this active trading strategy, read Scalping: Small Quick Profits Can Add Up.)

Costs Inherent with Trading Strategies
There’s a reason active trading strategies were once only employed by professional traders. Not only does having an in-house brokerage house reduce the costs associated with high-frequency trading, but it also ensures better trade execution. Lower commissions and better execution are two elements that improve the profit potential of the strategies. Significant hardware and software purchases, in addition to real-time market data, are required to successfully implement these strategies. These costs make successfully implementing and profiting from active trading somewhat prohibitive for the individual trader, although not altogether unachievable.

The Bottom Line
Active traders can employ one or many of the aforementioned strategies. However, before deciding on engaging in these strategies, the risks and costs associated with each one need to be explored and considered. (For related reading, also take a look at Risk Management Techniques For Active Traders.)

Algorithmic trading (automated trading, black-box trading, or simply algo-trading) is the process of using computers programmed to follow a defined set of instructions for placing a trade in order to generate profits at a speed and frequency that is impossible for a human trader. The defined sets of rules are based on timing, price, quantity or any mathematical model. Apart from profit opportunities for the trader, algo-trading makes markets more liquid and makes trading more systematic by ruling out emotional human impacts on trading activities.

Consider a trader who follows two simple instructions: buy 50 shares of a stock when its 50-day moving average goes above the 200-day moving average, and sell the shares when the 50-day moving average goes below the 200-day moving average. Using this set of two simple instructions, it is easy to write a computer program which will automatically monitor the stock price (and the moving average indicators) and place the buy and sell orders when the defined conditions are met. The trader no longer needs to keep a watch for live prices and graphs, or put in the orders manually. The algorithmic trading system does it for him automatically, by correctly identifying the trading opportunity. (For more on moving averages, see: Simple Moving Averages Make Trends Stand Out.)
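The two-instruction strategy can be sketched directly in code. This is a minimal illustration, not a production system; the window lengths are shortened (5 and 20 bars instead of 50 and 200 days) so that a toy price series is enough to trigger a signal:

```python
# Moving-average crossover sketch: 'BUY' when the short average crosses above
# the long average, 'SELL' on the reverse cross, 'HOLD' otherwise.

def moving_average(prices, window):
    """Simple moving average of the last `window` prices (None until enough data)."""
    if len(prices) < window:
        return None
    return sum(prices[-window:]) / window

def crossover_signal(prices, short=5, long=20):
    """Return the signal for the latest price bar, comparing it with the previous bar."""
    prev_short = moving_average(prices[:-1], short)
    prev_long = moving_average(prices[:-1], long)
    cur_short = moving_average(prices, short)
    cur_long = moving_average(prices, long)
    if None in (prev_short, prev_long, cur_short, cur_long):
        return 'HOLD'  # not enough history yet
    if prev_short <= prev_long and cur_short > cur_long:
        return 'BUY'   # short average just crossed above the long average
    if prev_short >= prev_long and cur_short < cur_long:
        return 'SELL'  # short average just crossed below the long average
    return 'HOLD'
```

A flat price history followed by a jump upward produces a BUY: `crossover_signal([100.0] * 20 + [110.0])` returns `'BUY'`, since the 5-bar average reacts to the jump before the 20-bar average does.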

Algo-trading provides the following benefits:

Trades executed at the best possible prices

Instant and accurate trade order placement (thereby high chances of execution at desired levels)

Algo-trading is used in many forms of trading and investment activities, including:

Mid- to long-term investors or buy-side firms (pension funds, mutual funds, insurance companies) who purchase stocks in large quantities but do not want to influence stock prices with discrete, large-volume investments.

Algorithmic trading provides a more systematic approach to active trading than methods based on a human trader’s intuition or instinct.

Algorithmic Trading Strategies

Any strategy for algorithmic trading requires an identified opportunity which is profitable in terms of improved earnings or cost reduction. The following are common trading strategies used in algo-trading:

Trend Following Strategies:

The most common algorithmic trading strategies follow trends in moving averages, channel breakouts, price level movements and related technical indicators. These are the easiest and simplest strategies to implement through algorithmic trading, because they do not involve making any predictions or price forecasts. Trades are initiated based on the occurrence of desirable trends, which are easy and straightforward to implement through algorithms without getting into the complexity of predictive analysis. The above-mentioned example of the 50- and 200-day moving averages is a popular trend-following strategy. (For more on trend trading strategies, see: Simple Strategies for Capitalizing on Trends.)

Arbitrage Opportunities:

Buying a dual-listed stock at a lower price in one market and simultaneously selling it at a higher price in another market offers the price differential as risk-free profit, or arbitrage. The same operation can be replicated for stocks versus futures instruments, as price differentials do exist from time to time. Implementing an algorithm to identify such price differentials and place the orders captures these opportunities in an efficient manner.

Index Fund Rebalancing:

Index funds have defined periods of rebalancing to bring their holdings to par with their respective benchmark indices. This creates profitable opportunities for algorithmic traders, who capitalize on expected trades that offer 20-80 basis points profits depending upon the number of stocks in the index fund, just prior to index fund rebalancing. Such trades are initiated via algorithmic trading systems for timely execution and best prices.

Mathematical Model Based Strategies:

Many proven mathematical models, like the delta-neutral trading strategy, allow trading on a combination of options and the underlying security, where trades are placed to offset positive and negative deltas so that the portfolio delta is maintained at zero.
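The delta-offsetting arithmetic is simple to illustrate. The call delta of 0.5 and the 100-share contract size below are assumed figures for the sketch, not data from any real option chain:

```python
# Delta-neutral sizing sketch: choose an option position whose delta
# exactly cancels the delta of a stock holding.

shares = 100                # long stock: delta of +1 per share
call_delta = 0.5            # assumed per-share delta of the chosen call
contract_size = 100         # shares covered by one option contract

# Short enough calls that their negative delta offsets the stock's:
contracts_to_short = shares / (call_delta * contract_size)

# Resulting portfolio delta (long stock minus short-call delta):
portfolio_delta = shares - contracts_to_short * call_delta * contract_size

print(contracts_to_short, portfolio_delta)  # 2.0 0.0
```

In practice the option delta drifts as the underlying moves, so a delta-neutral book has to be rebalanced continually; that rebalancing is precisely what the algorithm automates.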

Trading Range (Mean Reversion):

A mean reversion strategy is based on the idea that the high and low prices of an asset are a temporary phenomenon and that the price reverts to its mean value periodically. Identifying and defining a price range, and implementing an algorithm based on it, allows trades to be placed automatically when the price of the asset breaks in and out of its defined range.
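A minimal sketch of the range idea follows, using a band of k standard deviations around the trailing mean; the choice of k = 2 is an illustrative assumption, not a recommended parameter:

```python
# Mean-reversion band sketch: signal when the latest price leaves a band
# of k standard deviations around the mean of the preceding prices.
import statistics

def mean_reversion_signal(prices, k=2.0):
    """'BUY' below the lower band, 'SELL' above the upper band, else 'HOLD'."""
    history, last = prices[:-1], prices[-1]
    mean = statistics.fmean(history)
    stdev = statistics.stdev(history)
    if last < mean - k * stdev:
        return 'BUY'    # price unusually low: bet on reversion upward
    if last > mean + k * stdev:
        return 'SELL'   # price unusually high: bet on reversion downward
    return 'HOLD'
```

For a history oscillating between 99 and 101, a print of 105 falls far above the band and yields `'SELL'`, while 95 yields `'BUY'`.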

Volume Weighted Average Price (VWAP):

The volume weighted average price strategy breaks up a large order and releases dynamically determined smaller chunks of the order to the market using stock-specific historical volume profiles. The aim is to execute the order close to the Volume Weighted Average Price (VWAP), thereby obtaining a favorable average price.

Time Weighted Average Price (TWAP):

The time weighted average price strategy breaks up a large order and releases dynamically determined smaller chunks of the order to the market using evenly divided time slots between a start and end time. The aim is to execute the order close to the average price between the start and end times, thereby minimizing market impact.
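The slicing step of a TWAP schedule is easy to sketch: split the parent order into near-equal child orders, one per time slot. A real implementation would also schedule and route the child orders; this sketch shows only the division:

```python
# TWAP slicing sketch: divide a parent order into `slots` near-equal
# child orders whose sizes differ by at most one share.

def twap_slices(total_shares: int, slots: int):
    """Return a list of child-order sizes summing to total_shares."""
    base, remainder = divmod(total_shares, slots)
    # Spread the remainder over the first few slots.
    return [base + 1 if i < remainder else base for i in range(slots)]

print(twap_slices(10_000, 6))  # six child orders summing to 10,000
```

A VWAP schedule would use the same division but weight each slot by the stock's historical volume profile instead of evenly.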

Percentage of Volume (POV):

Until the trade order is fully filled, this algorithm continues sending partial orders, according to the defined participation ratio and according to the volume traded in the markets. The related “steps strategy” sends orders at a user-defined percentage of market volumes and increases or decreases this participation rate when the stock price reaches user-defined levels.

Implementation Shortfall:

The implementation shortfall strategy aims at minimizing the execution cost of an order by trading off the real-time market, thereby saving on the cost of the order and benefiting from the opportunity cost of delayed execution. The strategy will increase the targeted participation rate when the stock price moves favorably and decrease it when the stock price moves adversely.

Beyond the Usual Trading Algorithms:

There are a few special classes of algorithms that attempt to identify “happenings” on the other side. These “sniffing algorithms,” used, for example, by a sell-side market maker, have the in-built intelligence to identify the existence of any algorithms on the buy side of a large order. Such detection through algorithms will help the market maker identify large order opportunities and enable him to benefit by filling the orders at a higher price. This is sometimes identified as high-tech front-running. (For more on high-frequency trading and fraudulent practices, see: If You Buy Stocks Online, You Are Involved in HFTs.)

Technical Requirements for Algorithmic Trading

Implementing the algorithm using a computer program is the last part, clubbed with backtesting. The challenge is to transform the identified strategy into an integrated computerized process that has access to a trading account for placing orders. The following are needed:

Network connectivity and access to trading platforms for placing the orders

Access to market data feeds that will be monitored by the algorithm for opportunities to place orders

The ability and infrastructure to backtest the system once built, before it goes live on real markets

Available historical data for backtesting, depending upon the complexity of rules implemented in algorithm

Here is a comprehensive example: Royal Dutch Shell (RDS) is listed on the Amsterdam Stock Exchange (AEX) and the London Stock Exchange (LSE). Let’s build an algorithm to identify arbitrage opportunities. Here are a few interesting observations:

AEX trades in euros, while LSE trades in pounds sterling

Due to the one-hour time difference, AEX opens an hour earlier than LSE; both exchanges then trade simultaneously for the next few hours, and trading continues only on LSE during the last hour after AEX closes

Can we explore the possibility of arbitrage trading on the Royal Dutch Shell stock listed on these two markets in two different currencies?

Requirements:

A computer program that can read current market prices

Price feeds from both LSE and AEX

A forex rate feed for GBP-EUR exchange rate

Order placing capability which can route the order to the correct exchange

Back-testing capability on historical price feeds

The computer program should perform the following:

Read the incoming price feed of RDS stock from both exchanges

Using the available foreign exchange rate, convert the price from one currency to the other

If there exists a large enough price discrepancy (discounting the brokerage costs) leading to a profitable opportunity, then place the buy order on lower priced exchange and sell order on higher priced exchange

If the orders are executed as desired, the arbitrage profit will follow
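The decision step of this program can be sketched in a few lines. The quotes, FX rate, and cost threshold below are hypothetical figures; `min_edge_eur` stands in for the brokerage costs that the discrepancy must exceed:

```python
# Cross-listing arbitrage check sketch: compare the two RDS quotes in a
# common currency and decide which venue to buy on and which to sell on.

def arbitrage_legs(price_aex_eur, price_lse_gbp, gbp_to_eur, min_edge_eur=0.05):
    """Return (buy_venue, sell_venue) if the discrepancy beats costs, else None."""
    lse_in_eur = price_lse_gbp * gbp_to_eur   # convert the London quote to euros
    if price_aex_eur + min_edge_eur < lse_in_eur:
        return ('AEX', 'LSE')   # buy cheap in Amsterdam, sell dear in London
    if lse_in_eur + min_edge_eur < price_aex_eur:
        return ('LSE', 'AEX')   # the reverse trade
    return None                 # discrepancy too small to cover costs
```

For example, with AEX at EUR 27.00, LSE at GBP 24.00, and a rate of 1.15 EUR per GBP, the London quote converts to about EUR 27.60 and the function signals buying on AEX and selling on LSE. The open-position risk discussed next arises when only one of those two legs executes.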

Simple and easy! However, the practice of algorithmic trading is not that simple to maintain and execute. Remember, if you can place an algo-generated trade, so can the other market participants. Consequently, prices fluctuate in milli- and even microseconds. In the above example, what happens if your buy trade gets executed but the sell trade doesn’t, because the sell prices change by the time your order hits the market? You will end up sitting with an open position, making your arbitrage strategy worthless.

There are additional risks and challenges: for example, system failure risks, network connectivity errors, time-lags between trade orders and execution, and, most important of all, imperfect algorithms. The more complex an algorithm, the more stringent backtesting is needed before it is put into action.

The Bottom Line

Quantitative analysis of an algorithm’s performance plays an important role and should be examined critically. It’s exciting to go for automation aided by computers with a notion to make money effortlessly. But one must make sure the system is thoroughly tested and the required limits are set. Analytical traders should consider learning programming and building systems on their own, to be confident about implementing the right strategies in a foolproof manner. Cautious use and thorough testing of algo-trading can create profitable opportunities.

High-frequency trading (HFT) is a program trading platform that uses powerful computers to transact a large number of orders at very fast speeds. High-frequency trading uses complex algorithms to analyze multiple markets and execute orders based on market conditions. Typically, the traders with the fastest execution speeds will be more profitable than traders with slower execution speeds. As of 2009, it is estimated more than 50% of exchange volume comes from high-frequency trading orders.

High-frequency trading became most popular when exchanges began to offer incentives for companies to add liquidity to the market. For instance, the New York Stock Exchange has a group of liquidity providers called supplemental liquidity providers (SLPs), which attempt to add competition and liquidity for existing quotes on the exchange. As an incentive, the NYSE pays a fee or rebate for providing said liquidity. As of 2009, the SLP rebate was $0.0015 per share. Multiply that by millions of transactions per day and you can see where part of the profits for high-frequency trading comes from.
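A back-of-envelope calculation makes the point; the daily share volume below is an assumed figure for illustration, not NYSE data:

```python
# Rough illustration of liquidity-rebate income. The volume is hypothetical.

rebate_per_share = 0.0015       # SLP rebate as of 2009, per share
shares_per_day = 5_000_000      # assumed shares traded per day by one firm

daily_rebate = rebate_per_share * shares_per_day
print(daily_rebate)  # 7500.0 : dollars per day from rebates alone
```

Tiny per-share amounts thus become meaningful revenue only at the enormous volumes that high-frequency firms transact, which is why execution speed matters so much.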

The SLP was introduced following the collapse of Lehman Brothers in 2008, when liquidity was a major concern for investors.

Companies can issue stock publicly or privately and can switch from one to the other. Stock that is public trades on the open markets, whereas private stock is exchanged over the counter. Companies at times buy back their stock, a process mostly perceived as beneficial to both the companies and the investors. Trading mostly takes place in the open, but at times companies operate in dark pools, with the implied complexity and possible lack of transparency.

“The dark alleys”

“Dark pool liquidity refers to the amount of trading activity that occurs directly between parties without the use of an exchange, thereby keeping the transaction private.

Dark pool liquidity usually is created by institutions. For example, let’s assume that Company XYZ and Company ABC are pension funds in California and Oregon, respectively. Company XYZ wants to sell 2 million shares of McDonald’s (MCD) to Company ABC. However, the trade is so large that investors might regard the transaction as a sell signal on MCD, which could tank the stock and make further sales more difficult for the seller. Thus, the two companies decide to do the trade off the exchange (i.e., in a “dark pool”). The transaction costs might also be lower.

Dark pool liquidity provides anonymity. It also provides a way to avoid destabilizing the markets if a trade is particularly large, and it can increase liquidity in the markets by increasing the ease with which buyers can buy and sellers can sell. However, dark pools are controversial because they prevent all investors and market participants from knowing the true prices at which specific securities are valued in all arm’s-length transactions. Given that dark pool trading reportedly constitutes 20% of all market volume, according to some sources, the controversy is bound to continue or increase with its popularity.”

An iceberg order is a type of order placed on a public exchange. The total amount of the order is divided into a visible portion, which is reported to other market participants, and a hidden portion, which is not. When the visible part of the order is fulfilled, a new part of the hidden portion of the same size becomes visible.

As an example, suppose that a market participant places an order on the London Stock Exchange to buy 1,000 shares of stock AAAA at no more than 120p per share, with a visible portion of 100 shares. Other traders will see a buy order for 100 shares at 120p, with the other 900 remaining ‘dark’ or hidden. If someone places an order to sell 100 shares at 120p, the visible portion will be fulfilled. A new visible order to buy 100 shares at 120p will appear on the order book, and an order to buy 800 shares will remain hidden.
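The refresh mechanics in this example can be made concrete with a short sketch. The class and method names below are illustrative, not any exchange's actual API; the quantities mirror the worked example above.

```python
class IcebergOrder:
    """Minimal sketch of the iceberg-order mechanics described in the text."""

    def __init__(self, total_qty, visible_qty, limit_price):
        self.limit_price = limit_price
        self.visible = min(visible_qty, total_qty)  # portion shown on the book
        self.hidden = total_qty - self.visible      # portion kept 'dark'
        self.peak = visible_qty                     # size of each refresh slice

    def fill(self, qty):
        """Execute qty against the visible portion, then refresh from the hidden part."""
        executed = min(qty, self.visible)
        self.visible -= executed
        if self.visible == 0 and self.hidden > 0:
            refresh = min(self.peak, self.hidden)
            self.visible += refresh
            self.hidden -= refresh
        return executed

# Worked example from the text: buy 1,000 shares at 120p, visible portion 100.
order = IcebergOrder(total_qty=1000, visible_qty=100, limit_price=120)
order.fill(100)                      # a 100-share sell at 120p hits the order
print(order.visible, order.hidden)   # prints: 100 800
```

After the fill, a fresh 100-share slice becomes visible and 800 shares remain hidden, exactly as in the London Stock Exchange example.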

Iceberg orders are allowed under the MiFID rules, which enforce transparency of securities trading in the European Union, through the ‘order management’ waiver to pre-trade transparency. This waiver means that orders can be hidden from the market if this is done to facilitate trading strategies which could be accomplished without hidden orders. In the case of iceberg orders, this means that instead of placing an iceberg order, a market participant could place a normal limit order and replace it with a new order for the same amount each time it is fulfilled. Since there is no difference in what the other market participants would see, the iceberg order simply makes it easier to carry out this strategy automatically, and does not give any unfair advantage.

Exchanges decide which among several orders at the same price level should be carried out first by using time priority, or ‘first come, first served’. In the case of iceberg orders, the visible portion has normal priority but the hidden portion has lower priority than any visible order. This means that an order placed after an iceberg order will execute after the visible portion but before the hidden portion. For exchanges governed by MiFID this system of priorities is a legal requirement. The method of determining priority between the hidden portions of different iceberg orders varies from exchange to exchange. [1]

Perhaps the major thrust of the changes going on can be captured by what Robinhood is doing. For removing barriers to stock trading, Robinhood was named one of the Most Innovative Companies of 2016. Robinhood’s Vlad Tenev and Baiju Bhatt dream of a world where a new generation can invest with the ease of ordering an Uber. Founded in Palo Alto in 2014 by the former Stanford roommates, Robinhood is an app-based stock brokerage that offers commission-free trading. The various fees attached to trading stocks often deter young people from entering the market, but by eliminating fees and streamlining the process, Robinhood hopes to open up trading to a new demographic. The tool is the fastest-growing brokerage in history, with hundreds of thousands of customers and more than $2 billion in transactions. In 2015, it raised $50 million in funding and won an Apple Design Award, the first finance app to do so. Now the company is letting developers build its functionality into existing products like StockTwits and Quantopian, which could revolutionize trading. Next up is international expansion: Robinhood already has 20,000 people on the waitlist for its upcoming launch in Australia.

Robert Shiller’s research concludes that the cyclically adjusted price/earnings ratio (CAPE) has some ability to predict 10-year-ahead real (i.e., inflation-adjusted) S&P 500 returns. CAPE is the current level of S&P 500 divided by average real earnings over the last 10 years. Thus CAPE tells you how many dollars the S&P 500 costs per average real dollar of earnings. Historically, when CAPE has been above its historic average of about 17, future 10-year real stock returns have usually been below average, and vice versa. Although today’s minuscule interest rates suggest the justified price/earnings level today should be above average, today’s CAPE of about 23 still suggests below-average returns.
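The CAPE definition above reduces to a one-line computation. The function below follows the text's definition exactly; the index level and earnings series are hypothetical numbers chosen so the result lands near the "about 23" figure cited.

```python
# Sketch of the CAPE calculation as defined in the text: the current S&P 500
# level divided by the average of real (inflation-adjusted) earnings over the
# last 10 years. All numbers below are illustrative, not actual market data.
def cape(index_level, real_earnings_10y):
    """Cyclically adjusted price/earnings ratio."""
    avg_real_earnings = sum(real_earnings_10y) / len(real_earnings_10y)
    return index_level / avg_real_earnings

# Hypothetical inputs: index at 1,900, ten years of real earnings per share.
real_earnings = [75, 78, 80, 82, 83, 84, 85, 86, 86, 87]  # average = 82.6
print(round(cape(1900, real_earnings), 1))  # prints: 23.0
```

A reading of 23 against a historical average of about 17 is what, per Shiller's regressions, points to below-average real returns over the following decade.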

Eugene Fama and Ken French’s research concludes that the Baa-Aaa interest-rate spread has some ability to predict long-horizon stock returns. This spread represents the additional yield demanded by investors for bearing the lower credit quality of triple-B bonds instead of the highest quality triple-A bonds. The argument for this spread’s predictability is that it should move with the unobservable equity risk premium, which is the additional return investors demand for bearing the risk of stocks compared to bonds. Historically, when this spread has been above average, future 5- and 10-year real stock returns have usually been above average, and vice versa. Today’s Baa-Aaa spread of 1.05% is slightly below its average since 1919, but slightly above its median spread. Thus, this spread suggests long-horizon stock prospects are about average. Shiller and Fama recently shared the Nobel Prize in economics.

To repeat, CAPE and Baa-Aaa spread have shown some ability to predict long-horizon stock returns. The explanatory power of the regressions is about 30%. To put this in perspective, suppose someone was trying to predict whether stock returns the next 10 years will be above average or below average. Historically, these metrics made the right prediction about 65% of the time, but missed about 35% of the time. These forecasts are far from perfect.

Separately, no one has the ability to predict short-term returns like the market correction that occurred in late August. Attempts to do so usually depress investors’ long-term returns. Short-term volatility is the price investors must pay to garner stocks’ larger long-run average returns.

How do I suggest that you use these forecasts?

Before this correction, I had a 50% stock exposure, which is about 5% below the stock allocation recommended for a typical investor my age (63), and thus reflected somewhat below-average stock prospects. Such tilts of 5% to 10% away from my normal asset allocation are examples of tactical asset allocation. Should the S&P 500 enter a bear market (20% below its prior peak), then I probably will rebalance back to 55% stocks, since future long-run return prospects after such a market decline would be better than they were before the decline.

I believe this type of tactical rebalancing with the stock allocation deviating slightly around a normal stock allocation makes sense. It encourages investors to buy when prices are low and to sell when prices are high, while avoiding strong deviations from their normal stock allocation. History suggests that such tactical asset allocation can enhance long-run portfolio returns compared to a fixed-weight strategy with periodic rebalancing.
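The tilt-and-rebalance rule described above can be sketched as a simple decision function. The thresholds (a 5-point tilt, a 20% drawdown trigger, a 55% normal allocation) mirror the text; the function itself is one illustrative reading of the rule, not the author's exact procedure.

```python
# Sketch of the tactical asset allocation rule: hold a small tilt below the
# normal stock weight while prospects look below average, and rebalance back
# to normal once a bear market (20% below the prior peak) improves prospects.
def target_stock_weight(normal_pct, drawdown_from_peak,
                        tilt_pct=5, bear_threshold=0.20):
    """Return the tactical stock allocation (in percent) for a given drawdown."""
    if drawdown_from_peak >= bear_threshold:
        return normal_pct             # prospects improved: back to normal weight
    return normal_pct - tilt_pct      # below-average prospects: stay tilted down

print(target_stock_weight(55, 0.05))  # prints: 50 -> tilted, as before the correction
print(target_stock_weight(55, 0.25))  # prints: 55 -> rebalanced after a bear market
```

The rule is deliberately contrarian: it raises the stock weight precisely when prices have fallen, which is what makes the deviations small and disciplined rather than market-timing bets.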

The following abstract shows the essence of how to capture the new reality of the financial markets:

This thesis describes a method of modeling financial markets by utilizing concepts from mechanical vibration. The models developed represent multi-degree-of-freedom, mass-spring systems. The economic principles that drive the design are supply and demand, which act as springs, and shareholders, which act as masses. The primary assumption of this research is that events cannot be predicted but the responses to those events can be. In other words, economic stimuli create responses in a stock’s price that are predictable, repeatable and scientific. The approach to determining the behavior of various financial markets encompassed techniques such as the Fast Fourier Transform and discretized wavelet analysis. The research developed in three stages: first, an appropriate model of causation in the stock market was established; second, a model of steady-state properties was determined; third, experiments were conducted to determine the most effective model and to test its predictive capabilities on ten stocks. The experiments were evaluated based on the model’s hypothetical return on investment. The results showed a positive gain on capital for nine out of the ten stocks and supported the claim that stocks behave in accordance with the natural laws of vibration. As scientific approaches to modeling the stock market are beginning to develop, engineering principles are proving to be among the most relevant and reliable means of financial market prediction.

Many have tried predicting the stock market, but very few have succeeded. It is nearly impossible to predict the market over a long period of time, but with the correct mathematical algorithms, and if other major factors that affect the stock market remain unchanged, we can predict how a stock will act from its previous behavior. For the past six months, Professor Humi and I have been working on the design of a MatLab program that would be able to predict the price of a stock. We constructed two programs; however, one seems to provide a better prediction than the other. Both programs use many mathematical algorithms to predict the price of the stock.

The program that provided the better prediction was Program 1. This program first took the best-fit curve of the actual price and extrapolated it for the number of days over which we wished to predict the price of the stock. We then took a Fast Fourier Transform of the actual price, cleaned out noise, and took the Inverse Fourier Transform in order to remove small fluctuations in the stock. Once we had the cleaned Inverse Fourier Transform, we extrapolated it over the same number of days as the best-fit curve and added the two components together. These two components provided the prediction for a certain number of days. The prediction we computed was fairly accurate once small modifications were made. Certain stocks gave a reasonable prediction without any modifications, while other predictions needed to be adjusted by a certain percentage (10%); after that small modification was made, the prediction was reasonable again. From this project we realized that predicting the stock market is very difficult due to everyday changes in the economy, which we cannot predict. We nevertheless attempted to predict the stock market under ideal conditions, and in some measure we succeeded.

Source: https://www.wpi.edu/Pubs/E-project/Available/E-project-022808-142909/unrestricted/FullIQPReport7.pdf
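The two-component scheme just described (an extrapolated best-fit curve plus an FFT-denoised cyclical component) can be sketched in NumPy. The students' MatLab code is not reproduced in the text, so this is a plausible reconstruction under stated assumptions: a polynomial trend for the best-fit curve, a crude low-pass filter that keeps only the lowest-frequency FFT bins, and periodic extension of the cleaned cycle over the forecast window.

```python
import numpy as np

def predict(prices, horizon=5, trend_degree=2, keep=4):
    """Forecast `horizon` days ahead as (extrapolated trend) + (denoised cycle).

    Assumptions (not from the original report): polynomial trend fit,
    low-pass filtering by zeroing all but the `keep` lowest-frequency bins,
    periodic extension of the cleaned residual.
    """
    prices = np.asarray(prices, dtype=float)
    n = len(prices)
    t = np.arange(n)

    # 1) Best-fit curve of the actual price, extrapolated `horizon` days ahead.
    coeffs = np.polyfit(t, prices, trend_degree)
    future_t = np.arange(n, n + horizon)
    trend_future = np.polyval(coeffs, future_t)

    # 2) FFT the detrended price, zero out high-frequency "noise" bins,
    #    and inverse-FFT to recover a cleaned cyclical component.
    residual = prices - np.polyval(coeffs, t)
    spectrum = np.fft.fft(residual)
    spectrum[keep:n - keep] = 0          # crude low-pass filter (keeps conjugate pairs)
    cleaned = np.fft.ifft(spectrum).real

    # Extend the cleaned cycle periodically over the forecast window,
    # then add the two components together, as in the report.
    cycle_future = cleaned[future_t % n]
    return trend_future + cycle_future

# Usage on synthetic data: an upward trend plus a 12-day cycle plus noise.
rng = np.random.default_rng(0)
t = np.arange(60)
history = 100 + 0.5 * t + 3 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.5, 60)
forecast = predict(history, horizon=5)
print(forecast.shape)  # prints: (5,)
```

Even on this idealized synthetic series, accuracy depends heavily on the filter cutoff and the trend degree, which is consistent with the report's observation that some stocks needed a further ad hoc adjustment of about 10%.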