
Copyright violations, Sam Altman shoplifting and more



On Tuesday, OpenAI published Sora 2, the latest version of its video and audio generation model, which it promised would be the “most powerful imagination engine that has ever been built.” Less than a day after publication, most people’s imaginations seem to be dominated by copyrighted material and existing intellectual property.

In tandem with the publication of its latest model, OpenAI also dropped a Sora app that lets users generate and share content. While the app is currently invite-only, many videos have already made their way to other social platforms for anyone to see. The videos that have escaped OpenAI’s walled garden feature plenty of well-known characters: Sonic the Hedgehog, Solid Snake, Pikachu.

There seem to be at least a few types of content that won’t appear in OpenAI’s videos. Users have reported that the app refuses to create videos featuring Darth Vader or Mickey Mouse, for example. This restriction appears to be the result of OpenAI’s new approach to copyrighted material, which is pretty simple: “We’ll use it unless we’re expressly told not to.” The Wall Street Journal reported earlier this week that OpenAI has approached film studios and other copyright holders to inform them that they have to opt out if they don’t want their content appearing in Sora-generated videos. Disney did exactly that, per Reuters, so its characters should be off-limits for user-generated content.

However, that doesn’t mean the model wasn’t trained on this content. Earlier this month, The Washington Post showed how the first version of Sora was rather clearly trained on copyrighted material that the company did not ask permission to use. For example, the Post was able to create a short video clip mimicking the Netflix show “Wednesday,” right down to the on-screen font and a model who looks suspiciously like Jenna Ortega’s take on the title character. Netflix told the publication that it did not provide any content for training.

The outputs of Sora 2 show that it was clearly fed its fair share of copyrighted material. For example, users have been able to generate scenes from “Rick and Morty,” complete with reasonably accurate-sounding voices and art styles. (The model does seem to struggle in places, though; some details are noticeably out of place.)

Other videos at least try to get a little creative with how they use copyright-protected characters. For example, users have thrown Ronald McDonald into an episode of “Love Island” and created a fake video game that teams up Tony Soprano from The Sopranos with Kirby from, well, Kirby.

Interestingly, not all of the potential copyright infringement comes from users explicitly asking for it. For example, one user gave Sora 2 the prompt “a sweet young woman riding through a world of flowers with a dragon, in a studio ghibli style, saturated, rich colors” and it spat out what looks like an anime-style version of The NeverEnding Story. Even when users don’t actively prompt the model to create derivative art, it doesn’t seem able to help itself.

“People are eager to engage with their family and friends through their own ideas as well as stories, characters, and worlds they love, and we see new opportunities for creators to deepen their connection with fans,” an OpenAI spokesperson told Gizmodo. “We are working with rights holders to understand their preferences for how their content appears in our ecosystem, including Sora.”

There is another genre of popular and possibly legally dubious content that has taken off with Sora 2 users: the Sam Altman Cinematic Universe. OpenAI claims users can’t generate videos that use the likeness of other people, including public figures, unless those figures upload their likeness and give explicit permission. Altman apparently gave his OK (which makes sense, given he’s the CEO and was featured in the company’s fully generated promotional video for Sora 2’s launch), and users are making the most of access to his image.

One user claimed to have the “most popular” video in the Sora social app, which shows Altman getting caught shoplifting GPUs from Target. Others have turned him into a Skibidi Toilet, a cat, and, perhaps most fittingly, a shameless thief stealing creative materials from Hayao Miyazaki.

There are also some questions about the likenesses of non-people in these videos. How does Target feel about the video of Altman at Target, for example, regarding the use of its logo and branding? Another user put their own likeness into an NFL game that clearly appears to use the logos of the New York Giants, the Dallas Cowboys, and the NFL itself. Is that considered kosher?

OpenAI obviously wants people to give the app their likeness, as it creates a lot more opportunities for engagement, which currently seems to be its main currency. But the Altman examples seem revealing in terms of limits: it’s hard to imagine many public figures submitting to the humiliation ritual of letting other people control their image. Worse, imagine the average person ending up in a video in which they commit a crime, and the potential social fallout they might face.

An OpenAI spokesperson said that Altman made his likeness available to everyone, and that users who verify their likeness in Sora can decide who is allowed to use it: only the user, friends, selected friends, or everyone. The app also gives users the option of seeing every video in which their likeness was used, including unpublished ones, and they can revoke access or remove a video featuring their image at any time. The spokesperson also said that videos contain metadata indicating that they were created with Sora and are provided with watermarks.

Of course, there are some caveats. The fact that a video can be deleted from Sora doesn’t mean an exported version can be deleted. The watermark could also be cropped out. And most people don’t check the metadata of videos to verify authenticity. We’ll have to see what the fallout looks like, but there will be a fallout.


