Big Data Keeps Getting Bigger

The combination of big data and algorithms is one of the hottest topics in competition law globally at the moment, and it shows no signs of slowing down, attracting ever greater scrutiny from regulators worldwide.

This post focuses on some recent developments in this area, including a new report by the JFTC and an upcoming OECD round table on algorithms and collusion.

What is an algorithm?

An algorithm is essentially a self-contained, step-by-step set of instructions to be performed. We encounter algorithms all the time in our everyday lives: every time you run a Google search or book an Uber, for example. While algorithms can process vast amounts of data in fractions of a second, things can also go badly wrong. Just consider the use of pricing algorithms by two suppliers of a now out-of-print biology book, The Making of the Fly, which drove the price up to $23.6 million, as explained here.
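The dynamic behind that incident can be sketched in a few lines of code. According to public reporting of the episode, one seller repriced daily to just under its rival's price while the rival repriced to well above it; the multipliers below (0.9983 and 1.270589) are the figures reported at the time, and the starting price is purely illustrative.

```python
def simulate_repricing(price_a: float, price_b: float, days: int):
    """Two naive repricing algorithms reacting only to each other.

    Seller A undercuts slightly (0.9983 x B's price); Seller B prices
    at a premium (1.270589 x A's price). Because 0.9983 * 1.270589 > 1,
    each daily cycle compounds prices upwards instead of converging.
    """
    for _ in range(days):
        price_a = round(0.9983 * price_b, 2)
        price_b = round(1.270589 * price_a, 2)
    return price_a, price_b

# Starting from an illustrative $35.54 list price, a month of daily
# repricing pushes both sellers into the tens of thousands of dollars.
a, b = simulate_repricing(35.54, 35.54, 30)
```

Neither algorithm checked its output against any notion of a sensible price, which is the point: automated repricing with no sanity bounds can produce absurd outcomes with no human intent at all.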

JFTC – Report on Data and Competition Policy

On 6 June 2017, a study group from the Japan Fair Trade Commission (JFTC) released a Report on Data and Competition Policy (see here). The Report concluded that the collection and use of big data should be subject to Japan’s competition law. It acknowledged that the “Accumulation and utilization of data, in itself, promotes competition and creates innovation”, but cautioned that issues such as “business combinations that could lead to a restriction of competition including monopoly, unjust collecting (or exploitation) of data from consumers or small or medium sized enterprises and unjust data ‘hoarding’” should be addressed under Japan’s competition law.

The Report also noted that most of these problems could be dealt with under the current framework in Japan.

Specifically, the Report noted that mergers involving data accumulation should be reviewed more closely, with consideration given to the potential for the merged entity to collect and use the data to acquire a dominant position or reduce competition in a “data market”.

This echoes recent comments by Guillaume Loriot, Director of Information, Communication and Media at the European Commission, who said that competition policy must carefully assess big data in merger control. He suggested that the aggregation of data sets will be problematic where a merger strengthens the market power of the merging companies, or where combining two data sets raises barriers to entry for those who need that type of data in order to operate.[1]

OECD Roundtable – “Algorithms and collusion”

The issue of big data and algorithms has also caught the attention of the OECD. Later this month, the OECD will hold a roundtable on “Algorithms and collusion” as part of its wider work streams on the digital economy. Topics to be addressed at the roundtable include whether anti-trust agencies should reconsider the traditional concepts of “agreement” and “tacit collusion”, and whether any anti-trust liability can be imposed on algorithm creators and users. For the background paper and other presentations and papers, click here.

Other recent developments

This follows on the heels of other recent developments, including:

In Italy, the Anti-Trust Authority, together with the Communications and Data Protection Authorities, will examine how big data is collected and managed in order to identify any potential competition concerns. The inquiry also aims to define a regulatory framework that fosters competition in digital markets while protecting privacy and consumers. For more information, click here.

In America, members of the US Federal Trade Commission (FTC) have recently made diverging public comments:

In a recent speech, the Acting FTC Chairman, Maureen Ohlhausen, said “there is nothing inherently suspect about using computer algorithms to look carefully at the world around you before participating in markets” and “using algorithms in ways that do not offend traditional anti-trust norms is unlikely to create novel liability scenarios”. Ms Ohlhausen also noted that American anti-trust agencies are well equipped to deal with potentially anti-competitive algorithms.

In contrast, in a recent speech, FTC Commissioner, Terrell McSweeny, said that “the rise of pricing algorithms and AI software will require changes in our enforcement practices”. In relation to the use of algorithms for pricing, Ms McSweeny said that “the consumer welfare effects are not as straightforward. Pricing algorithms raise three issues from a competition perspective. First, they may increase the effectiveness of overt collusion. Second, they may facilitate coordinated interaction in the absence of a traditional “agreement” between competitors. And third, they may enable price discrimination strategies that lead to higher prices for certain groups of customers.” Ms McSweeny also questioned whether algorithms will enable firms to “solve” their unique prisoner’s dilemma without reverting to overt collusion.

Recent prosecutions by the DOJ and the CMA of online sellers of posters who used commercially available algorithms to give effect to price-fixing agreements. The algorithms made the agreements easier to implement and police, with internal emails in one case saying “logistically it is going to be difficult to follow the pricing effectively on a daily basis so I’m looking into re-pricing [sof]tware”.[2]

In December 2016, the European Commission approved Microsoft’s acquisition of LinkedIn in a decision that considered the combination of the parties’ databases (see here for the European Commission’s decision).[3]

As algorithms become ever more ubiquitous, so too will investigations and matters in this area, and we await the ACCC’s first public move.

About The Author

Tim Gargett is a Senior Associate in the Melbourne Competition and Regulatory team who advises clients on all aspects of competition (including merger control), regulatory, consumer, advertising and general commercial and contractual law, with extensive experience engaging with the ACCC.