After Anthropic: What Will the AI Licensing Market Look Like and What Does It Mean for the Industry?

The $1.5 billion settlement in Bartz v. Anthropic is not just the end of a single dispute. It marks the beginning of a new phase, one in which it is becoming clear that the AI industry will have to rely on an organized system of content licensing rather than informal, and often unlawful, "scraping from the internet".

This text examines how the global market for licenses for training AI models is changing and what consequences this has for technology companies, publishers, and authors.

A few years ago, most AI models were trained on what was often called “open web data”, with a very flexible interpretation of what was in fact lawfully available. With the first major lawsuits (against OpenAI, Meta, Anthropic, and others), and especially after Anthropic’s settlement, the industry began to reorient itself toward formal licenses.

Several licensing models are now emerging:

  • Direct licensing – An AI company concludes individual agreements with publishers, media houses, databases, or collective management organizations;
  • Revenue-sharing models – Publishers and authors receive a percentage of the revenue that the AI company generates by using their content;
  • Bundled deals – Licensing is part of a broader cooperation package (marketing, joint projects, exclusive content);
  • Collective licensing – Organizations representing authors and publishers offer AI companies blanket licenses for a large corpus of works.

This shift from informal use to structured licensing is a consequence of the fact that case law has shown that ignoring copyright can lead to extremely costly outcomes.

Forecasts show that markets related to AI intellectual property and licensing are growing extremely quickly. Business analyses point to:

  • annual growth of several tens of percent in the market for AI licenses and datasets,
  • a multi‑year trend of rising value for “data as an asset”,
  • expansion of specialized companies that act as intermediaries between AI firms and rights holders.

In other words: what until recently was understood as “free input” for AI systems is now becoming a serious budget item – but also a new opportunity for authors and publishers to monetize their catalogues.

On the side of rights holders, sophisticated tools that use artificial intelligence to protect content are being developed:

  • content recognition systems (neural fingerprinting) that can identify a work even when it has been modified or partially adapted,
  • tools for tracking the use of works on the internet and in large models,
  • integration of technologies for proving the origin and authenticity of content (provenance, blockchain, standards for digital signing and traceability).

These systems enable automated detection of unauthorized use and, in the ideal scenario, automatic linking of each use to the appropriate remuneration and payment model.
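As a rough illustration of how such content recognition can work in principle (this is a generic sketch, not any vendor's actual fingerprinting algorithm), a text can be reduced to a set of hashed word n‑grams ("shingles") and two texts compared by the overlap of their fingerprint sets, so that a partially modified copy still scores well above zero:

```python
import hashlib


def shingles(text: str, k: int = 5) -> set[str]:
    """Split text into overlapping word k-grams ("shingles")."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(max(1, len(words) - k + 1))}


def fingerprint(text: str, k: int = 5) -> set[int]:
    """Hash each shingle into a compact integer; the set is the fingerprint."""
    return {
        int.from_bytes(hashlib.sha256(s.encode()).digest()[:8], "big")
        for s in shingles(text, k)
    }


def similarity(a: str, b: str, k: int = 5) -> float:
    """Jaccard similarity of two fingerprint sets, from 0.0 to 1.0."""
    fa, fb = fingerprint(a, k), fingerprint(b, k)
    return len(fa & fb) / len(fa | fb) if fa | fb else 0.0
```

Production systems use far more robust techniques (neural embeddings, locality‑sensitive hashing), but the principle is the same: a modified or partially adapted work still shares enough fingerprint fragments with the original to be flagged for review.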

One real risk is that the shift to expensive and regulated licensing will lead to a situation in which only the largest companies (those with billions of dollars in funding and revenue) can afford all the necessary licenses and potential settlements.

This can have several consequences:

  • increased concentration of the AI model market in the hands of a few global players,
  • a more difficult market entry for smaller startup companies that do not have the capital to pay for licenses and bear legal risk,
  • the possibility that innovation will move into “niches” where there is less need for protected content (for example, exclusively public domain content, synthetic data).

On the other hand, if standardized and transparent licensing models with reasonable prices and collective agreements are developed, even smaller players can gain lawful access to high‑quality datasets. The key challenge will be establishing a balance between protecting rights holders and preserving competition in the AI market.

For AI companies, the practical implications are clear:

  • Licensing is no longer “nice to have”, but a mandatory part of the business model;
  • In‑house legal teams and external advisers must be involved as early as the dataset‑planning stage;
  • Documenting sources, keeping records of licenses, and complying with contractual restrictions are becoming key elements of compliance.

For publishers and authors, the message is equally direct:

  • It is time to engage actively in negotiations with AI companies, either directly or through associations and collective management organizations;
  • High‑quality catalogues, especially those with a long tradition and recognizable brands, will be in demand;
  • It is necessary to think through one’s own licensing terms: lump‑sum fees, a percentage of revenue, exclusivity, time limits, and similar arrangements.

Viewed from this angle, Anthropic’s settlement is not just a “penalty”. It is a signal to the market that the era of unregulated use of other people’s works is over and that the future lies in structured, contract‑based regulation of relations between the AI industry and the creative sector.
