With OpenAI’s Release of GPT-4o, Is ChatGPT Plus Still Worth It?

Barret Zoph, a research lead at OpenAI, was recently demonstrating the new GPT-4o model and its ability to detect human emotions through a smartphone camera when ChatGPT misidentified his face as a wooden table. After a quick laugh, Zoph assured GPT-4o that he’s not a table and asked the AI tool to take a fresh look at the app’s live video, rather than a photo he shared earlier. “Ah, that makes more sense,” said ChatGPT’s AI voice, before describing his facial expression and potential emotions.

On Monday, OpenAI launched a new model for ChatGPT that can process text, audio, and images. In a surprising turn, the company announced that this model, GPT-4o, would be available for free, no subscription required. It’s a departure from the company’s previous rollout of GPT-4, which was released in March of last year only for those who pay OpenAI’s $20-per-month subscription to ChatGPT Plus. With this release, many of the features that were previously gated off to paying subscribers, like memory and web browsing, are now rolling out to free users as well.

Last year, when I tested a nascent version of ChatGPT’s web browsing capability, it had flaws but was powerful enough to make the subscription seem worthwhile for early adopters looking to experiment with the latest technology. Now that OpenAI’s freshest AI model, as well as previously gated features, is available without a subscription, you may be wondering whether that $20 a month is still worthwhile. Here’s a quick breakdown to help you understand what’s available with OpenAI’s free version versus what you get with ChatGPT Plus.

What’s Available With Free ChatGPT?

To reiterate, you don’t need any kind of special subscription to start using the OpenAI GPT-4o model today. Just know that you’re rate-limited to fewer prompts per hour than paid users, so be thoughtful about the questions you pose to the chatbot or you’ll quickly burn through your allotment of prompts.

In addition to limited GPT-4o access, non-paying users are receiving a major upgrade to their overall user experience, with multiple features that were previously just for paying customers. The GPT Store, where anyone can release a version of ChatGPT with custom instructions, is now widely available. Free users can also use ChatGPT’s web browsing tool and memory features, as well as upload photos and files for the chatbot to analyze.

What’s Still Gated to ChatGPT Plus?

While GPT-4o is available without a subscription, you may want to keep ChatGPT Plus for two reasons: access to more prompts and newer features. “You can use the model significantly more on Plus,” Zoph tells WIRED. “There’s a lot of other exciting, future things to come as well.” Compared to non-subscribers, ChatGPT Plus subscribers can send GPT-4o five times as many prompts before having to wait or switch to a less powerful model. So, if you want to spend a decent amount of time messaging back and forth with OpenAI’s most powerful option, a subscription is necessary.

Although some of the previously exclusive features for ChatGPT Plus are rolling out to non-paying users, the splashiest of updates are still offered first behind OpenAI’s paywall. The impressive voice mode that Zoph demonstrated on stage is arriving sometime over the next couple of weeks for ChatGPT Plus subscribers.

In OpenAI’s demo videos, the bubbly AI voice sounds more playful than previous iterations and is able to answer questions in response to a live video feed. “I honestly think the ways people are going to discover use cases around this is gonna be incredibly creative,” says Zoph. During the presentation, he also showed how the voice mode could be used to translate between English and Italian. After the presentation, the company released another video showing speech translation working in real time.
