This might be OpenAI's biggest announcement since the launch of ChatGPT. An AI-powered video generator that creates videos from just a couple of sentences? Meet Sora, OpenAI's latest brainchild, which is bound to change the way videos are created on the internet forever. And yes, judging by the examples shown so far, it would not be wrong to say that Sora is crazy good.

Sora holds immense potential for filmmakers, designers, and artists.

Sora can make videos up to a minute long while staying faithful to the user's prompt. Right now, Sora is being tested by red teamers who are assessing it for potential harms and risks, and it's also in the hands of visual artists, designers, and filmmakers. Their feedback will help improve Sora, especially for creative work.

Sora stands out because it can handle complex scenes with multiple characters, specific types of motion, and detailed backgrounds. The model understands not only what the user has asked for, but also how those subjects exist and behave in the real world. It also has a deep understanding of language, which lets it create videos with characters that express vivid emotions, and it can keep the same characters and visual style consistent across different parts of a video.

However, Sora isn't perfect. It may struggle to simulate the physics of a complex scene or miss the mark on cause and effect: a character might take a bite out of a cookie, for example, yet the cookie shows no bite mark afterward. It can also mix up spatial details like left and right, and it may have trouble with precise descriptions of events that unfold over time.

OpenAI is sharing Sora's development early to get feedback and give a peek at what AI might do next. It seems a lot of safety measures will be put into place before a wider release, because technology like this is, to put it mildly, prone to misuse. Also, keep in mind that producing longer videos won't be cheap, since it will require a lot of processing power from OpenAI's servers. Regardless, one thing is for sure: the future of generative AI is more than just exciting, it is inviting!

By the way, if you are wondering how to use Sora, know that access is still limited and a wider rollout will take some time.
