Agentic Commerce: Contractual Boundaries and Consumer Rights Under UK Law
Recent developments in agentic commerce have raised important questions for both UK consumers and e‑commerce platforms. A particular area of focus is the use of AI assistants capable of browsing, comparing prices, and, in certain instances, automating checkout processes. Major platforms, including Amazon, have sought to contractually restrict these tools through blanket clauses in their Terms of Use, as highlighted in the recently announced litigation initiated by Amazon against Perplexity.
While the commercial rationale for such contractual prohibitions is understandable, these restrictions also raise consumer rights considerations that a court would need to weigh in any litigation. At the time of this update, there have been no reported instances of Amazon customers formally challenging these contractual limitations. Nonetheless, the issue underscores the broader tension between platform control and the legitimate expectations of consumers in a market increasingly shaped by AI‑driven tools.
Automated Access Restriction in Context
To illustrate the type of clause at issue, e-commerce platforms often prohibit automated access through provisions such as the following:
“You may not extract and/or re‑utilise parts of the content of our Services without our express written consent. In particular, you may not utilise any data mining, robots, or similar data gathering and extraction tools to extract (whether once or many times) for re‑utilisation any substantial parts of the content of any of our Services, without our express written consent...”
This language typically forms part of the standard licence and access terms that govern how consumers may interact with an e-commerce platform and its services. In practical terms, the clause effectively bars the use of automated tools to access or repurpose a platform's content without explicit written permission.
Regulatory Framework: Unfair Terms under the CRA 2015
From a UK consumer law perspective, two key considerations arise under Section 62 of, and Schedule 2 to, the Consumer Rights Act 2015, which govern unfair terms in consumer contracts. These provisions make clear that a contract term which, contrary to the requirement of good faith, causes a significant imbalance in the parties’ rights and obligations to the detriment of the consumer may be deemed unfair.
1. Limitation of Consumers’ Legitimate Rights
One possible consideration is the potential limitation of consumers’ legitimate rights. A consumer’s legitimate rights can reasonably include the ability to use an online platform efficiently and conveniently. Blanket prohibitions on agentic commerce or AI‑assisted shopping may prevent consumers from using personal automation tools to streamline purchases, compare prices, or manage activity across accounts. Such restrictions could constitute a disproportionate limitation and raise questions as to whether they qualify as unfair under Section 62 of the Consumer Rights Act 2015.
2. Limitation of Reasonable Use of Services
The second consideration relates to the limitation of reasonable use of services. UK courts assess whether contract terms unreasonably restrict a consumer’s ability to use a service in normal, expected ways, such as browsing, purchasing, or comparing products. A blanket ban on AI shopping agents could prevent consumers from enhancing ordinary shopping behaviours, like using tools to identify the best deal or manage purchases efficiently. While platforms may justify restrictions to protect system security, prevent fraud, or maintain operational integrity, a total prohibition may be disproportionate for small‑scale, legitimate consumer activity.
3. Potential Drafting Defects
From a contractual drafting perspective, the Automated Access Restriction clause, as currently formulated, may be overly broad and ambiguous when applied to AI shopping agents. While the clause prohibits “the use of data mining, robots, or similar data gathering and extraction tools to extract…any substantial parts of the content…,” it does not, considered alone, explicitly address AI agents acting on behalf of a consumer to browse, compare, or complete purchases. For this purpose, it is assumed that agentic commerce tools do not necessarily involve data mining or content extraction, and are intended to perform the same actions a human user would ordinarily execute, albeit in an automated and more efficient manner. This distinction underscores the potential for differing interpretations of the clause when applied to legitimate, user-directed AI activity.
Furthermore, the clause fails to distinguish between fully autonomous bots and AI tools that operate under the direction of a user, creating ambiguity over whether such conduct falls within its scope. This lack of specificity could be a critical point of defence, as courts often require contractual prohibitions to be clear and precise to be enforceable, particularly when the clause implicates innovative technologies or user-directed automation. Consequently, the clause’s broad and somewhat undefined language leaves it vulnerable to interpretation challenges and may not unequivocally capture the type of agentic activity performed by AI shopping assistants.
Key Takeaways
The Importance of Proportionality: In our view, proportionality is key. UK courts are likely to weigh the consumer’s legitimate use and convenience against the platform’s security and operational concerns. A blanket ban on AI‑assisted shopping, particularly one expressed through broad prohibitions on “robots,” “data mining,” and similar tools, could be vulnerable to challenge as an unfair term, especially where AI use is limited to personal convenience and does not threaten platform integrity. Consumers seeking to automate shopping in limited, non‑disruptive ways may therefore have grounds to question the enforceability of such clauses under UK law.
This publication is based on the authors' independent analysis, observations, and experience advising clients on regulatory and compliance matters. It is provided solely for informational purposes. The views expressed herein do not constitute legal advice or an official recommendation, nor do they represent the position of any institution or client. Readers should seek specific professional advice before relying on any part of this publication.

Olu A.
LL.B. (UNILAG), B.L. (Nigeria), LL.M. (UNILAG), LL.M. (Reading, U.K.)
Olu is a Partner at Balogun Harold.
olu@balogunharold.com
Kunle A.
LL.B. (UNILAG), B.L. (Nigeria), LL.M. (UNILAG), Barrister & Solicitor (Manitoba)
Kunle is a Partner at Balogun Harold.
k.adewale@balogunharold.com