
OpenAI launched a new model, GPT-4o, yesterday, which will roll out to the public over the next few weeks. It offers the premium features of GPT-4 along with a refreshed web UI. During the announcement, OpenAI’s CTO Mira Murati demonstrated some of the new model’s capabilities. Let’s take a look.


GPT-4o is more efficient and can listen and see better

According to the company, GPT-4o takes “a step towards much more natural human-computer interaction”. The new model can process text, images, and audio, and can assist users across all three. Voice mode now works more seamlessly, with faster responses and better understanding. Previously, voice mode chained three separate models together, one each for transcription, intelligence, and text-to-speech output, which added delays to responses. GPT-4o handles all of that natively in a single model.

Using your phone’s camera, you can now share information with the model and ask questions about it easily through voice mode. Reportedly, the new model can respond to voice inputs in as little as 232 milliseconds, similar to human response time in conversation. The model can also respond in a variety of tones matching user preferences, and it is better and faster than GPT-4 Turbo at understanding non-English languages. GPT-4o can now also work as an interpreter.

Notably, GPT-4o will also be available through the API, so developers can build and power AI apps using the new model’s capabilities.
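As a rough illustration of what that looks like, here is a minimal sketch of a chat-completion request to GPT-4o using only Python’s standard library. The endpoint and payload shape follow OpenAI’s public chat completions API; the prompt and helper function name are illustrative, not from the announcement.

```python
import json
import urllib.request

# OpenAI's chat completions endpoint; GPT-4o is addressed by the
# model identifier "gpt-4o".
API_URL = "https://api.openai.com/v1/chat/completions"

def build_request(prompt: str, api_key: str) -> urllib.request.Request:
    """Build an HTTP request for a single-turn GPT-4o chat completion."""
    payload = {
        "model": "gpt-4o",  # the new model's API identifier
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )

# Sending the request requires a real API key, e.g.:
# with urllib.request.urlopen(build_request("Hello", api_key)) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

In practice, most developers would use OpenAI’s official client libraries instead, but the underlying request is just JSON over HTTPS as shown here.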

While the new model’s features are available for free, paying subscribers will get up to five times the usage limits of free users.

The company has also launched a ChatGPT app for Apple’s macOS desktops. The macOS app offers deeper integration into the platform, as OpenAI wants to make it easier for users to fold the tool into their workflows. With a keyboard shortcut (Option + Space), users will be able to reach the tool’s conversation page instantly.
