

A government committee has recommended a new hybrid system to regulate how AI developers use copyrighted material to train their models. The central idea is a mandatory blanket licence that allows AI companies to use copyrighted content while ensuring that creators are paid through an organised collection system. The proposal aims to support AI development while guaranteeing fair compensation for the human creators whose works are essential to modern generative AI.
Why this matters
The recommendation comes at a time when generative AI is rapidly transforming many industries. These systems offer major benefits, but they also depend on huge amounts of copyrighted material that is often used without clear permission.
To address this issue, the Department for Promotion of Industry and Internal Trade (DPIIT) set up a committee on April 28, 2025. The committee was asked to study the legal issues raised by AI systems, review whether existing copyright law is sufficient, and prepare a paper for consultations with all relevant stakeholders.
Different opinions
During the consultations, the tech and AI industry strongly supported a broad Text and Data Mining (TDM) exception that would let AI developers use copyrighted material freely. Some groups backed a TDM exception with an opt-out option for creators. In contrast, publishers, media organisations, and creators opposed these exceptions and argued for a voluntary licensing model under which creators decide whether to license their works. The committee also studied global approaches in the United States, Japan, the UK, Singapore, and the European Union, and noted a similar case currently before the Delhi High Court.
Why other models were rejected
The committee reviewed several regulatory models, including voluntary licensing, extended collective licensing, statutory licensing, a full TDM exception, and TDM with opt-out, but found that none of them suited India’s needs. It concluded that a blanket TDM exception would weaken copyright protection and deny creators fair payment. The opt-out approach, it found, fails in practice: smaller creators may not know how or when to opt out, and once content has been scraped, control cannot be regained. Opt-out systems also shift the burden onto creators, reduce data quality, and require heavy transparency measures that could slow innovation. At the same time, licensing each work individually would be too costly and slow, especially for startups that need quick access to large and diverse datasets.
The hybrid model: mandatory blanket licence
To overcome these issues, the committee proposes a mandatory blanket licence that would let AI developers use all legally accessed copyrighted content for training without seeking each creator’s permission. In exchange, creators would receive compensation through a statutory right to payment. A central non-profit organisation, formed by creators and approved by the government, would collect payments from AI developers and distribute royalties to both members and registered non-members. Royalty rates would be set by a government-appointed body and could be reviewed by the courts. The committee believes this system would offer easy access to training data, lower costs, fewer compliance burdens, fair payment for creators, better AI quality, and a level playing field for both large and small AI companies. By acting as a single-window system, it would simplify access for developers while ensuring efficient royalty payments to creators.
NASSCOM dissented from this proposal and argued instead for a TDM exception covering both commercial and non-commercial uses, combined with opt-out choices and safeguards, saying this would support innovation more effectively.