Hollywood’s stars have tried lawsuits, strikes, and moral appeals to stop AI from swallowing their faces. Now they’re trying something more practical: a price tag.

RSL Media, the organization behind existing machine-readable licensing standards, is expanding its standard to cover AI use of identities and creative works. The proposal would create a programmatic way for AI systems to check whether they're allowed to use someone's likeness — and pay for it if they are.

This isn’t a petition or a press release. It’s infrastructure. The idea is to bake licensing directly into the technical plumbing that AI systems use to access content, making unauthorized use detectable and billable by default. No lawyers, no cease-and-desist letters — just a machine-readable file that says “yes, but it costs this much.”

According to The Register, Hollywood A-listers are backing the proposed spec, which aims to give performers and creators a mechanism to collect payment when AI models replicate their face, voice, or creative output.

There’s a neat symmetry here. The people whose likenesses were scraped to train generative models are now proposing that the solution to that exploitation is more infrastructure — not a ban, not a boycott, but a standard. A piece of code that says: you can use my face, but the machine has to read the terms first.

Whether AI companies will adopt a standard that obligates them to pay for what they’ve been taking for free remains the open question. Technical standards only work when the relevant parties agree to implement them, and the AI industry’s track record on asking permission is, to put it gently, underwhelming.

Still, the approach is notable for its pragmatism. Rather than fighting to put the genie back in the bottle, the proposal assumes the genie is here to stay — and tries to install a turnstile on the way out.
