Competitive Edge: Antitrust enforcers need reinforcements to keep pace with algorithms, machine learning, and artificial intelligence
Antitrust and competition issues are receiving renewed interest, and for good reason. So far, the discussion has occurred at a high level of generality. To address important specific antitrust enforcement and competition issues, the Washington Center for Equitable Growth has launched this blog, which we call “Competitive Edge.” This series features leading experts in antitrust enforcement on a broad range of topics: potential areas for antitrust enforcement, concerns about existing doctrine, practical realities enforcers face, proposals for reform, and broader policies to promote competition. Terrell McSweeny has authored this month’s contribution.
The octopus image, above, updates an iconic editorial cartoon first published in 1904 in the magazine Puck to portray the Standard Oil monopoly. Please note the harpoon. Our goal for Competitive Edge is to promote the development of sharp and effective tools to increase competition in the United States economy.
Algorithmic price fixing isn’t science fiction. The U.S. Department of Justice’s Antitrust Division and the United Kingdom’s Competition and Markets Authority have already brought their first cases in which competitors agreed to use specific pricing algorithms for the sale of posters online. These cases did not stretch the traditional antitrust framework for price fixing because humans, not machines, formed the underlying agreement. But as technology becomes more powerful and autonomous, some competition experts are raising concerns about whether analog antitrust doctrines can keep pace. The debate is far from settled, but it is increasingly clear that 21st century regulators are going to need technological expertise to aid them in making enforcement decisions.
Competition regulators in major markets around the world are actively assessing whether technology requires changes to their antitrust enforcement frameworks. Here in the United States, the Federal Trade Commission is wrapping up a series of public hearings on “Competition and Consumer Protection in the 21st Century” by focusing on algorithms, artificial intelligence, and predictive analytics. It plans to examine ethical and consumer protection issues associated with the use of these technologies and how competitive dynamics are affected by them.
In their most basic form, algorithms are instructions that computers follow to process data and solve problems. They are essential building blocks of our digital lives, and they are frequently used to set prices. Increasingly sophisticated pricing algorithms can quote different, more personalized prices to different people based on information about them. Algorithms can also help consumers quickly and easily locate and compare prices of products. Personalized pricing based on a customer’s ability to pay, expected individual demand, and other data points can improve efficiency and benefit consumers, though it doesn’t always work that way. For instance, studies find that people are shown higher prices on mobile devices than on desktop computers, or higher prices depending on how far they are from a store location. Some antitrust experts worry that the pricing algorithms that are increasingly common in both digital and analog markets might facilitate coordination—either expressly or tacitly—thereby minimizing competition on price to the detriment of consumers while remaining undetected by antitrust enforcers.
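To make the idea concrete, here is a minimal, hypothetical sketch of a personalized pricing rule of the sort described above. The signals it uses (mobile device, distance from a store, an estimated-demand score) echo the examples in the paragraph, but every function name, weight, and price below is invented for illustration; no real pricing system is being described.

```python
# Hypothetical sketch of a personalized pricing rule. All signals and
# weights are invented for illustration; real pricing algorithms are
# far more complex and data-driven.

BASE_PRICE = 20.00  # assumed list price for the product


def personalized_price(on_mobile: bool, miles_from_store: float,
                       estimated_demand: float) -> float:
    """Quote a price from a few customer signals.

    estimated_demand is a 0-1 score standing in for the "expected
    individual demand" data point mentioned in the text.
    """
    price = BASE_PRICE
    if on_mobile:
        price *= 1.05              # higher prices shown on mobile devices
    if miles_from_store > 10:
        price *= 1.10              # fewer nearby alternatives, less pressure
    price *= 1 + 0.2 * estimated_demand  # willingness-to-pay adjustment
    return round(price, 2)


# Two shoppers looking at the same product can see different prices:
print(personalized_price(on_mobile=True, miles_from_store=15,
                         estimated_demand=0.8))   # distant mobile shopper
print(personalized_price(on_mobile=False, miles_from_store=2,
                         estimated_demand=0.1))   # nearby desktop shopper
```

The point of the sketch is only that the quoted price is a function of individual data points, which is also why, as the next paragraph argues, truly personalized pricing makes a collusive agreement hard to police.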
It is important for competition enforcers to study changes in technology that affect competition, but it doesn’t necessarily follow that pricing algorithms will collude or that they will be used in collusive schemes. If pricing algorithms are truly personalized—that is, quoting different prices for different people based on a number of different data points—then collusion is unlikely since it will be nearly impossible for would-be conspirators to discipline “cheaters,” or those competitors who are deviating from the agreement.
There are two key concerns that antitrust regulators must grapple with regarding pricing algorithms, particularly in highly concentrated industries. The first has to do with technical capabilities. As the use of algorithms becomes more common, will regulators be able to understand and detect when algorithms are being used to collude? The second concern has to do with pricing algorithms automatically and independently gravitating to higher prices without human intervention or agreement. Such conduct might be hard to detect and address under existing law.
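The second concern, prices drifting upward without any human agreement, can be illustrated with a deliberately simple toy model. Suppose two sellers independently adopt the same hypothetical rule: keep your own price, but never undercut what the rival charged last period. The rule and the numbers below are invented for illustration and are not drawn from any real case or system.

```python
# Toy illustration of algorithms independently gravitating to a higher
# price. Two sellers run the same invented rule with no communication
# and no agreement of any kind.


def match_upward(own_price: float, rival_last_price: float) -> float:
    """One seller's rule: keep your price, but never undercut the rival."""
    return max(own_price, rival_last_price)


def simulate(p_a: float, p_b: float, periods: int = 5):
    """Run both sellers' rules against each other's public prices."""
    history = [(p_a, p_b)]
    for _ in range(periods):
        # Both sellers update simultaneously from last period's prices.
        p_a, p_b = match_upward(p_a, p_b), match_upward(p_b, p_a)
        history.append((p_a, p_b))
    return history


print(simulate(8.0, 10.0))
```

Starting from prices of 8.0 and 10.0, the lower-priced seller matches up in the first period and both then sit at 10.0 indefinitely. Whether existing law reaches this kind of outcome, reached without any agreement, is precisely the open question.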
Much of the current antitrust debate also is focused on whether regulators properly understand and address the role of data in digital markets. Data’s significance as a competitive asset depends on the facts. Some data, for example, are public or can be obtained from data brokers for a fairly nominal cost. And some data can be nonrivalrous, meaning they can be used by many companies at the same time. But other data are proprietary and can operate as a barrier to entry. Antitrust agencies have proven relatively capable of addressing competition issues around data, but the demands on agencies to engage in highly technical, fact-based examinations are only likely to increase as data become more important in the world of predictive analytics and artificial intelligence.
Against this backdrop, it is essential for antitrust agencies to rely not only on legal and economic expertise but also on technological expertise. While antitrust frameworks have proven relatively adaptable, a key question is whether the agencies themselves have the capabilities required for the digital age. Some regulators are already incorporating technologists into their work. Brazil, for example, has a technology lab. Similarly, the European Union’s Commissioner for Competition, Margrethe Vestager, has suggested that the Directorate General for Competition create its own algorithms to detect when collusion is taking place.
In the United States, the Federal Trade Commission created a position for a chief technologist in the FTC chair’s office in 2011 and expanded its research and technology capabilities with the creation of the Office of Technology Research and Investigation, or O-Tech, in 2015. But that office is currently housed in the agency’s Bureau of Consumer Protection, suggesting its work and resources are mostly directed toward the agency’s consumer protection mission. The Federal Trade Commission should consider creating an independent, fully staffed office for the chief technologist, or even a Bureau of Technology, to build the technological expertise it requires and to support its competition mission.
—Terrell McSweeny is a former commissioner of the Federal Trade Commission and previously held senior positions in the White House, the U.S. Department of Justice, and the U.S. Senate. She currently is a partner at the law firm Covington & Burling LLP.