
Data and Antitrust in the United States

The Regulatory Crossroads

31 January 2019

Recent practices in the collection and use of data have raised antitrust concerns. The authors of this article discuss the issue and conclude that rapid changes in markets and products driven by new technology may necessitate the adoption of different regulatory tools to adequately evaluate markets and check potential antitrust violations.

Companies’ utilisation of big data has drawn scrutiny from legislators and increased pressure on regulators to act, especially in light of the recent misuse of data by third parties. The push for regulation of big data in the United States has leaned toward issues arising under consumer protection law; however, recent data collection and use practices have also raised antitrust concerns. Antitrust enforcement will likely be most active in the realm of merger control and algorithmic pricing, but that scope may expand as regulatory concerns continue to be raised.


Competitive concerns over data are tied to the increasing role it plays in business and in every area of life. It is the age of big data. Companies are now working with vast datasets defined by the “four Vs”—volume, velocity, variety, and value. Volume indicates the amount of data collected, velocity involves the speed at which data can be collected, and variety refers to the breadth of data collected. Value relates to the socioeconomic value derived from the use of such data. Sometimes a fifth “V”—veracity, which relates to the reliability of the data—is also included.

Big data is now key to how many companies operate, which raises questions for antitrust enforcement. The issues facing regulators in this new environment include the evaluation of data as a barrier to market entry, the competitive significance of data, the assessment of exclusionary conduct by market leaders regarding access to data, and the method of evaluating big data for competition issues in the context of online platforms, which operate in multi-sided markets. Additionally, under the theory of network effects, market-leading online platforms can leverage data from their users to provide more attractive services, creating difficult or even insurmountable barriers for competitors entering the market.

These concerns continue to be vigorously debated by regulators, legislators and practitioners as regulation attempts to keep pace with rapidly changing technology.


While U.S. antitrust regulators have not been as proactive as some of their European counterparts in investigating the utilisation of big data, they have indicated that data is already analysed in the review of mergers and acquisitions. The U.S. approach treats data as an input in merger analysis, evaluating whether tying up such data in a merger or acquisition would impact the market. Antitrust regulators have focused not only on a merger’s effect on access to data but also on its effect on the data involved and the use of such data.

For instance, in 2008 the Federal Trade Commission (FTC) blocked a merger between two competitors using publicly available data for their products. The FTC argued that competition between the two companies spurred their development of innovative analytical tools for utilising this publicly available information. The FTC raised concerns that the merger would remove the incentive for the intense competition that had led to valuable information services tools tailored to meet their customers’ needs. Although U.S. regulators attest that traditional tools are sufficient to analyse mergers involving data, practitioners question whether current antitrust tools adequately identify and address potential antitrust violations in data-rich industries.


Furthermore, U.S. antitrust regulators have indicated a willingness to pursue another use of big data—algorithmic pricing. In 2015, the Department of Justice (DOJ) prosecuted two e-commerce sellers for using price-fixing algorithms that set prices for posters sold online. In that case, the sellers had reached an explicit agreement to collude; however, developments in technology raise the possibility of pricing algorithms facilitating not only explicit collusion and price-fixing conspiracies but also tacit collusion between competitors. Tacit collusion would be made possible by pricing algorithms capable of constantly assessing and adjusting prices at the product level for each customer. The potential misuse of pricing algorithms in facilitating collusion means that regulators may need to take a more nuanced approach to market definition in the future.
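To illustrate the concern, the following sketch—purely hypothetical and not drawn from the DOJ case—shows two simulated sellers each independently running a simple "match the lowest rival price" rule. With no communication or agreement between them, the repeated automated responses lock both sellers into identical prices, a minimal version of the parallel-pricing dynamic that worries regulators:

```python
# Hypothetical illustration: two sellers whose repricing bots each match the
# lowest rival price (subject to a cost floor). No agreement exists, yet the
# automated feedback loop produces identical, stable prices.

def match_lowest(rival_prices, floor):
    """Reprice to the lowest observed rival price, never below the cost floor."""
    return max(min(rival_prices), floor)

def simulate(p_a, p_b, floor_a, floor_b, rounds=20):
    """Run alternating repricing rounds and record each round's price pair."""
    history = []
    for _ in range(rounds):
        p_a = match_lowest([p_b], floor_a)  # seller A reacts to B
        p_b = match_lowest([p_a], floor_b)  # seller B reacts to A
        history.append((p_a, p_b))
    return history

history = simulate(p_a=12.0, p_b=10.0, floor_a=8.0, floor_b=8.0)
print(history[-1])  # prints (10.0, 10.0): both sellers converge immediately
```

Even this naive rule yields uniform pricing without any conspiracy; more sophisticated algorithms that punish undercutting could, in principle, sustain prices above competitive levels—which is why enforcers distinguish the explicit agreement in the 2015 case from purely tacit algorithmic coordination.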


Although U.S. regulators have not been as proactive in regulating big data as their European counterparts, the increasing importance of big data, not just as a tool but as a key input for various industries, warrants regulators' attention. Rapid changes in markets and products due to new technology may necessitate the adoption of different regulatory tools to adequately evaluate markets and check potential antitrust violations.

This article was also published in Pratt's Privacy & Cybersecurity Law Report (June 2018, Vol. 4, No. 5).