OpenAI's Sora may have been leaked by a group of early testers

A group of artists protested OpenAI's treatment of Sora early access participants by publishing a front-end tool on Hugging Face that let anyone generate videos with the model, criticizing the company for allegedly exploiting artists through unpaid labor and restrictive access to the AI video generation platform.

by Ellie Ramirez-Camara

This Tuesday, a group of artists, including at least two members of OpenAI's Sora early access program, published a project on the Hugging Face platform that essentially worked as a front end allowing anyone to generate 10-second 1080p video clips with an optimized version of Sora. In a letter posted alongside the front end, the group explained that they had released the tool to protest OpenAI's practices, claiming the company was exploiting artists for unpaid labor and using them to "art wash" its image without properly compensating them for their work.

For instance, the letter explained that any Sora output had to be approved by OpenAI before it could be widely circulated, and that only selected creators would get the opportunity to screen their works. The group considered this unfair compensation, especially coming from a company valued at $150 billion that stands to profit immensely from the marketing and PR impact of the Sora-generated works it chooses to promote. More broadly, the group worried that OpenAI is more concerned with PR stunts than with the creator community.

Although OpenAI never officially confirmed that the artists' project gave the general public access to the Sora API, the tool was quickly deactivated, and OpenAI paused access for all artists participating in the early access program while it investigated the incident. An OpenAI spokesperson has since emphasized that creators in the early access program have no obligations other than using the model responsibly and refraining from sharing confidential information. However, the scope of both "responsible use" and "confidential information" was conveniently left unspecified. The spokesperson also highlighted OpenAI's continued commitment to supporting artists through grants, events, and other programs.

In February, OpenAI shared its Sora research advances with a splashy announcement. However, as the year progressed, the company went nearly silent about Sora, except to share that it would provide early access to red teamers for security assessments and to creatives for feedback. More generally, the company hinted that it would not release the model to a broader audience until it fully understood Sora's performance in real-world use. In a recent Reddit AMA, OpenAI chief product officer Kevin Weil confirmed that unresolved scaling and safety issues had delayed Sora's release.

