OpenAI released a new application programming interface (API) for its recently launched o1-pro artificial intelligence (AI) model on Wednesday. The API is designed to give developers responses that outperform the company's previous AI models by using increased computational power. That enhanced capability comes at a cost, however, making this OpenAI's most expensive API offering to date. Notably, ChatGPT Pro subscribers can use the o1-pro mode without paying additional fees, although rate limits apply.
OpenAI’s o1-Pro Gets an API
In a recent post on X, formerly known as Twitter, OpenAI Developers announced the release of the o1-pro API. The launch follows numerous requests from developers eager to access the model’s advanced capabilities. The newly released API is claimed to generate “consistently better responses” by leveraging greater computational resources than the earlier o1 model.
Although the o1-pro model was previously available only as a feature within ChatGPT, OpenAI has not disclosed benchmark scores for it. The company has indicated that the API supports vision, function calling, and structured outputs, and that it is compatible with both the Responses and Batch APIs. The model accepts text and image inputs but not audio, and it generates text output only.
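For readers who want a sense of what calling the model looks like, the following is a minimal sketch that assumes the official openai Python SDK and a valid API key; the prompt text is purely illustrative, and only the model identifier "o1-pro" and the Responses API come from OpenAI's announcement.

```python
# Minimal sketch: calling o1-pro via the Responses API using the official
# openai Python SDK. Assumes OPENAI_API_KEY is set in the environment and
# that the account is on a paid usage tier with access to the model.
from openai import OpenAI

client = OpenAI()  # picks up OPENAI_API_KEY from the environment

response = client.responses.create(
    model="o1-pro",  # reasoning model exposed through the Responses API
    input="Summarise the trade-offs of using a reasoning model for code review.",
)

# output_text collects the generated text from the response object.
print(response.output_text)
```

A batch workload would instead go through the Batch API, which the article notes the model also supports.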
According to the company's documentation, the o1-pro model features a context window of 200,000 tokens, a knowledge cutoff of October 2023, and support for reasoning tokens. This advanced functionality comes at a premium, with the API costing $150 (approximately Rs. 12,900) per million input tokens and $600 (around Rs. 51,800) per million output tokens. Currently, the API is accessible only to developers on paid usage tiers 1-5.
For context, the o1-mini and o3-mini models are priced significantly lower, at $1.10 (around Rs. 94) per million input tokens and $4.40 (approximately Rs. 380) per million output tokens. Developers should also budget for additional expenses when using o1-pro: because it is a reasoning model, the tokens it generates while reasoning internally are billed alongside its output. This has raised concerns among developers, who pay for the model's chain of thought (CoT) without being able to inspect it.
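To make the pricing concrete, here is a small back-of-the-envelope calculation using the rates listed above; the token counts are hypothetical, and it is assumed that reasoning tokens are billed at the output-token rate.

```python
# Illustrative arithmetic only: estimating one request's cost at the listed
# o1-pro rates ($150 per 1M input tokens, $600 per 1M output tokens).
INPUT_RATE = 150 / 1_000_000   # USD per input token
OUTPUT_RATE = 600 / 1_000_000  # USD per output token (assumed to include reasoning tokens)

input_tokens = 10_000   # hypothetical prompt size
output_tokens = 5_000   # hypothetical visible output plus reasoning tokens

cost = input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE
print(f"Estimated cost: ${cost:.2f}")  # $1.50 + $3.00 = $4.50
```

At these rates, even a modest request can cost a few dollars, which is roughly two orders of magnitude more than the same token counts on o1-mini or o3-mini.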
The ChatGPT Pro subscription, priced at $200 (roughly Rs. 17,270) per month, gives subscribers access to the o1-pro model through a dedicated mode within the platform. While that access is subject to rate limits, subscribers do not pay any extra fees to use it.