Hollywood and creative industries are fighting back against AI infringement, with one startup offering proprietary technology to detect suspected misuse of copyrighted material within generative AI models.
LightBar, a research platform focused on AI training data, positions itself as the evidence layer that makes studios’ legal threats actionable.
The platform says it runs “research campaigns” in which users generate outputs from structured prompts designed to probe specific models for studio intellectual property.
“When lesser-known or minor characters appear accurately across generations, that pattern strengthens the signal of what may have been included in training data,” LightBar told Decrypt.
Submissions are processed through what the company describes as a proprietary analysis engine that measures “percentage likeness, distinctive character traits, and prominence.”
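LightBar has not published how that engine works, so the sketch below is only a rough illustration of the recurrence idea described above: it tallies how often hypothetical lesser-known characters, identified here by a toy name-and-trait keyword check rather than any real analysis, appear across a batch of generated outputs and reports each one’s prevalence.

```python
# Illustrative only: a toy tally of how often lesser-known characters
# surface across a batch of generated outputs. Character names, trait
# keywords, and the keyword check are hypothetical stand-ins for
# whatever proprietary analysis LightBar actually performs.
from collections import Counter

MINOR_CHARACTERS = {
    # hypothetical entries: character -> distinctive trait keywords
    "Example Sidekick": ["green scarf", "pocket watch"],
    "Background Villain": ["scarred cheek", "silver cane"],
}

def character_hits(output_text: str) -> set[str]:
    """Return the characters whose name and at least one trait appear."""
    text = output_text.lower()
    hits = set()
    for name, traits in MINOR_CHARACTERS.items():
        if name.lower() in text and any(t in text for t in traits):
            hits.add(name)
    return hits

def prevalence(outputs: list[str]) -> dict[str, float]:
    """Fraction of generations in which each character appears accurately."""
    counts = Counter(hit for out in outputs for hit in character_hits(out))
    return {name: counts[name] / len(outputs) for name in MINOR_CHARACTERS}

if __name__ == "__main__":
    sample_outputs = [
        "A figure in a green scarf follows Example Sidekick through the station.",
        "The hero ignores the Background Villain leaning on a silver cane.",
        "An unrelated scene with no recognizable characters.",
    ]
    print(prevalence(sample_outputs))
```

The intuition is the one LightBar describes: famous characters could plausibly be reproduced from widely available fan material, but obscure characters rendered accurately across many generations are a harder pattern to explain away.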
Over the past week, major studios have moved to formalize their infringement claims through legal channels. Industry groups and unions have also set out positions that treat certain AI outputs as potential violations of copyright and contractual rights.
The Walt Disney Company sent a cease-and-desist letter to ByteDance over its Seedance 2.0 video model, alleging unauthorized use of copyrighted characters, according to a report from Axios on Friday.
Following Disney’s move, Paramount Pictures also sent a cease-and-desist letter to ByteDance over Seedance 2.0, citing alleged intellectual property infringement, Variety reported Saturday.
“ByteDance respects intellectual property rights, and we have heard the concerns regarding Seedance 2.0,” a ByteDance spokesperson told Decrypt. “We are taking steps to strengthen current safeguards as we work to prevent the unauthorized use of intellectual property and likeness by users.”
The letters show rights owners converting infringement concerns into formal enforcement pressure.
At the same time, labor groups are asserting that certain AI-generated outputs implicate consent and compensation rights under existing contracts and law.
SAG-AFTRA, the U.S. union representing performers across film, television, and radio, also said it stands with the studios in condemning Seedance 2.0, adding that the alleged infringement includes unauthorized use of performers’ voices and likenesses.
The Motion Picture Association, which represents the major Hollywood studios, meanwhile urged ByteDance to halt Seedance 2.0, saying the model uses copyrighted works without authorization.
LightBar said it is in active discussions with studios as they consider potential legal or licensing action related to Seedance 2.0 and other AI models, with the goal of helping “shift the conversation and the leverage back in their favor.”
The company said the results are compiled into analyses that “outline the methodology, similarity metrics, and representative examples to support further review.”
“The current wave of disputes makes one thing clear: attribution and evidence are becoming the battleground of the AI economy,” Ram Kumar, core contributor at AI and blockchain infrastructure firm OpenLedger, told Decrypt.
Documenting model outputs “absolutely strengthens a studio’s negotiating position, but only if that documentation is structured, time-stamped, and cryptographically verifiable,” Kumar said.
Creating verifiable logs that connect prompts to outputs and specific model versions can convert resemblance into quantifiable proof, strengthening a rights holder’s position in court or licensing talks, even when the underlying training data cannot be directly traced, Kumar explained.
“In the long run, this won’t just affect disputes,” he said. “It will shape how future AI systems are built: with transparent reward pathways, accountable execution, and verifiable contribution tracking embedded at the protocol level.”
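Kumar did not describe a specific implementation. Purely as a sketch of the idea, the snippet below shows one way such a log could be structured: each record ties a prompt, an output, and a model version to a UTC timestamp, and a SHA-256 hash chained over the previous entry means that editing or reordering any record breaks verification. The field names and hashing scheme here are assumptions, not an industry standard.

```python
# Sketch only: a hash-chained evidence log linking prompts, outputs, and
# model versions. The record fields and SHA-256 chaining are illustrative
# assumptions, not a description of any vendor's actual system.
import hashlib
import json
from datetime import datetime, timezone

def append_entry(log: list[dict], prompt: str, output: str, model_version: str) -> dict:
    """Append a time-stamped record whose hash covers the previous entry."""
    prev_hash = log[-1]["entry_hash"] if log else "0" * 64
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "model_version": model_version,
        "prompt": prompt,
        "output_sha256": hashlib.sha256(output.encode()).hexdigest(),
        "prev_hash": prev_hash,
    }
    record["entry_hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    log.append(record)
    return record

def verify(log: list[dict]) -> bool:
    """Recompute every hash; any edited or reordered entry fails the check."""
    prev_hash = "0" * 64
    for record in log:
        body = {k: v for k, v in record.items() if k != "entry_hash"}
        if record["prev_hash"] != prev_hash:
            return False
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != record["entry_hash"]:
            return False
        prev_hash = record["entry_hash"]
    return True

if __name__ == "__main__":
    log: list[dict] = []
    append_entry(log, "Draw the lesser-known sidekick in a green scarf", "<generated output>", "video-model-2.0")
    print("chain valid:", verify(log))
```

A log like this does not prove what data a model was trained on, but it does fix when a given prompt produced a given output from a specific model version, which is the kind of structured, time-stamped record Kumar argues strengthens a studio’s position.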