After launching its latest model, GPT-4o mini, OpenAI CEO Sam Altman acknowledged the need for a “naming scheme revamp” for ChatGPT and its various versions. On July 18, OpenAI introduced the new model, describing it as “our most cost-efficient small model.”
Altman shared a post on his X account (formerly Twitter), stating, “15 cents per million input tokens, 60 cents per million output tokens, 82% MMLU, and fast.” He emphasized, “Most importantly, we think people will really, really like using the new model.”
You guys need a naming scheme revamp so bad
— Shea (@concept_central) July 18, 2024
While most users expressed appreciation for the product, one user joked about the names of the ChatGPT models, which have proliferated as OpenAI’s lineup has grown. Responding to Mr. Altman’s post, Shea commented, “You guys need a naming scheme revamp so bad.” Acknowledging the suggestion, the entrepreneur replied, “lol yes we do.”
One of OpenAI’s most efficient AI models, GPT-4o mini is compact and offers minimal latency (the time it takes to generate a response). OpenAI says the API currently supports text and vision (image processing), with support for text, image, video, and audio inputs and outputs to come. “The model features a context window of 128K tokens, supports up to 16K output tokens per request, and has knowledge up to October 2023. Enhanced by the improved tokenizer shared with GPT-4o, managing non-English text is now even more cost-effective,” the blog post highlighted.
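To put the quoted prices in perspective, here is a back-of-the-envelope sketch of what a single request might cost at the rates from Altman’s post (15 cents per million input tokens, 60 cents per million output tokens). The helper function and example token counts are illustrative, not part of OpenAI’s API:

```python
# Rough cost estimate for GPT-4o mini at the publicly quoted prices.
# (estimate_cost is a hypothetical helper, not an OpenAI API call.)

INPUT_PRICE_PER_MILLION = 0.15   # USD per 1,000,000 input tokens
OUTPUT_PRICE_PER_MILLION = 0.60  # USD per 1,000,000 output tokens

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated cost of one request, in USD."""
    return (input_tokens / 1_000_000) * INPUT_PRICE_PER_MILLION \
         + (output_tokens / 1_000_000) * OUTPUT_PRICE_PER_MILLION

# Example: a request that fills the 128K-token context window and
# uses the full 16K-token output limit mentioned in the blog post.
cost = estimate_cost(128_000, 16_000)
print(f"${cost:.4f}")
```

Even a maximally sized request at these rates comes out to a few cents, which is the point of the “most cost-efficient small model” framing.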