The Competition and Markets Authority (CMA) has announced the launch of an investigation into the suspected sharing of competitively sensitive information among competing hotel chains using the software of a third-party data analytics provider. Businesses using or considering price‑optimisation or algorithmic decision‑making tools to make commercial decisions should be aware of rapidly evolving regulatory scrutiny and, in particular, they should be alert to the possibility of competition authority challenge.
Published: 3 March 2026
Author: Manu Mohan
Pricing algorithms powered by data analytics and predictive modelling are playing an increasingly significant role in sectors such as real estate, insurance, transportation, and education. However, in parallel, competition authorities in the UK, EU, and US are increasingly focused on how such technologies may facilitate coordinated pricing or reduce independent decision‑making.
The CMA, in its draft Annual Plan for 2026–2027 and as part of implementing its 2026–2029 Strategy, has identified deterring algorithmic price collusion as a priority area. CMA officials have publicly acknowledged that algorithmic pricing is an “area of focus and concern” and that the CMA is “watching and learning” from US cases.
Two US Department of Justice (DOJ) proposed settlements from late 2025 now offer practical guideposts for both developers and users of pricing tools, signalling how regulators are likely to approach the design, procurement, and deployment of algorithmic pricing systems. Key learnings are that:
- the technical capabilities of software can enhance competitors’ ability to optimize cartel gains, monitor real-time deviations, and minimize incentives to cheat
- competing users who rely on the same algorithm, particularly one trained on competitively sensitive, non-public data, may be found to have engaged in unlawful coordination, even if each retains discretion to override the algorithm’s recommendations
- direct communication between competitors is not required to establish collusion. It may suffice that a software provider proposes a coordinated pricing model to multiple firms, each aware that its competitors were also being invited to participate, and that those firms adopted and adhered to the model, creating a common understanding among the competitors that prices would rise collectively
- agreements to fix list prices or starting prices (e.g., advertised list prices), even if final transaction prices vary, can distort the competitive process and may be treated as unlawful
- even if an algorithm fails to raise or stabilize prices, the collective delegation of pricing decisions to a common algorithm can itself constitute unlawful coordination
- features such as “auto accept” or default settings that encourage users to outsource pricing decisions to the algorithm, make it easy to bulk-accept recommendations, or make overrides administratively burdensome may be viewed as reducing independent judgment and increasing the risk of coordinated pricing
- guardrails in an algorithm that increase recommended prices or resist downward pricing pressure, such as a “sold-out” mode that stops recommending lower prices once a target figure is reached (even if units remain unsold), or a “revenue protection” mode that limits availability to drive up prices, could be seen as artificially constraining competition and contributing to coordinated price increases
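To make the guardrail concern above concrete, the following is a minimal, purely hypothetical sketch of how a “sold-out” mode can resist downward pricing. Every name, threshold, and the overall structure are illustrative assumptions for exposition; they are not drawn from any vendor’s actual product or from the DOJ settlements.

```python
# Hypothetical sketch of a "sold-out" guardrail. All names and values
# are illustrative assumptions, not taken from any real pricing tool.

def recommend_price(demand_based_price: float,
                    revenue_to_date: float,
                    revenue_target: float,
                    floor_price: float) -> float:
    """Return a price recommendation, applying a guardrail that stops
    recommending lower prices once a revenue target has been met."""
    if revenue_to_date >= revenue_target:
        # "Sold-out" mode: hold the recommendation at or above the
        # floor even if units remain unsold, resisting downward
        # pricing pressure that open competition would otherwise exert.
        return max(demand_based_price, floor_price)
    return demand_based_price
```

The compliance concern is visible in the `max()` branch: once the target is reached, the tool never passes a lower demand-based price through to the user, which is the behaviour regulators may read as artificially constraining competition.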
Against this backdrop of heightened regulatory attention and recent enforcement actions, there are several practical measures that both developers and users of pricing algorithms may wish to consider.
Recommended best practices
Developers
- avoid training models on non-public, competitively sensitive data such as rivals’ pricing, costs or customer information
- ensure each customer’s model operates autonomously, without centralized coordination, sharing of information, or enforcement of common pricing logic across users
- maintain transparency in model design by avoiding “black box” logic that limits oversight, and keep clear documentation explaining how recommendations are generated and updated
- apply restrictions on the use of any non‑public information both for model training and runtime optimisation
- use historical or backward-looking data (e.g., data that is at least 12 months old) to train model algorithms
- prevent the use of any market‑survey data that includes non‑public, competitor‑specific pricing or operational information
- train sales and support teams to avoid statements implying access to other users’ competitively sensitive data, or references to a shared commercial purpose across users, in customer presentations, marketing materials or discussions
- implement audit trails and enable independent review of data sources, model updates and algorithmic performance to detect and address potential compliance risks
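The data-hygiene and audit practices above can be sketched as a simple intake filter. This is a minimal illustration under stated assumptions: the record field names (`is_public`, `observed_on`, `id`) and the 12-month cutoff are hypothetical choices for the example, echoing the backward-looking-data point in the list, not a prescribed compliance standard.

```python
# Sketch of a training-data intake filter reflecting the practices
# above: exclude non-public or competitor-specific records, keep only
# data at least 12 months old, and log every exclusion for audit
# review. Field names and the cutoff are illustrative assumptions.
from datetime import date, timedelta

MIN_AGE = timedelta(days=365)  # "at least 12 months old"

def filter_training_records(records, today=None):
    """Return (kept, audit_log): records eligible for model training,
    plus an audit trail of excluded records and the reason why."""
    today = today or date.today()
    kept, audit_log = [], []
    for rec in records:
        if not rec.get("is_public", False):
            audit_log.append((rec["id"], "non-public source"))
        elif today - rec["observed_on"] < MIN_AGE:
            audit_log.append((rec["id"], "too recent"))
        else:
            kept.append(rec)
    return kept, audit_log
```

Keeping the exclusion log alongside the kept records gives the independent reviewers mentioned above something concrete to inspect: which data sources were rejected, and on what ground.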
Users of pricing algorithms
- verify whether the algorithm uses data or logic derived from competitors’ commercially sensitive information, and request documentation on data sources, training methods and compliance safeguards
- maintain sufficient human oversight and avoid full automation or features that may undermine independent commercial judgment
- establish internal guidance on the extent of reliance on algorithmic pricing, including escalation protocols for when outputs appear to mirror market-wide patterns or competitor behaviour
- understand the functionality, and calibrate the use, of features such as “sold-out” mode or “revenue protection” mode that may inhibit price reductions or align the behaviour of competing users
- provide targeted training to internal data science, commercial pricing and procurement teams to help them recognize risks of algorithmic coordination
- avoid participation in vendor-hosted meetings or training sessions where market analyses, non‑public trends, pricing strategies, or operational benchmarks may be discussed with other competing users
- avoid any language during procurement or implementation discussions that could imply commercial alignment or shared objectives with competing users of the algorithm
- conduct regular reviews of algorithmic outcomes to detect unexplained price uniformity or parallel pricing across competitors
As the regulatory landscape continues to evolve, businesses may wish to review existing practices and ensure appropriate governance measures are in place. Regular assessment of data inputs, human and system behaviour, and internal controls can support compliance efforts and help ensure that pricing tools operate in a manner consistent with competition law expectations.