Here’s a demo app that shows how to connect your app to Google Calendar and fetch upcoming events: github.com/openai/openai-…
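For a feel of the core call, here’s a minimal sketch (not the demo’s actual code) that lists upcoming events with google-api-python-client, assuming you already have authorized OAuth credentials:

# Sketch: list the next 10 events on the primary calendar.
# Assumes `creds` is an already-authorized google.oauth2 Credentials object.
import datetime
from googleapiclient.discovery import build

def upcoming_events(creds, max_results=10):
    service = build("calendar", "v3", credentials=creds)
    now = datetime.datetime.utcnow().isoformat() + "Z"  # RFC 3339 timestamp
    result = service.events().list(
        calendarId="primary",
        timeMin=now,
        maxResults=max_results,
        singleEvents=True,
        orderBy="startTime",
    ).execute()
    return result.get("items", [])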
With the Conversations API, you can now store context from Responses API calls (messages, tool calls, tool outputs, and other data). Easily render past chats, then let your users pick up where they left off (just like in ChatGPT).
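Roughly, the flow looks like this (a minimal sketch with the Python SDK, assuming conversations.create, the conversation parameter on responses.create, and conversations.items.list; model name and prompts are placeholders):

# Sketch: create a conversation, attach Responses API calls to it, then resume it later.
from openai import OpenAI

client = OpenAI()

# Create a conversation to hold messages, tool calls, and tool outputs.
conversation = client.conversations.create()

# First turn: the response's context is stored on the conversation.
client.responses.create(
    model="gpt-5",  # placeholder model name
    conversation=conversation.id,
    input="Plan a three-day trip to Kyoto.",
)

# Later, even in a new session: pick up where the user left off.
client.responses.create(
    model="gpt-5",
    conversation=conversation.id,
    input="Swap day two for a day trip to Nara.",
)

# Render past chats by listing the conversation's items.
items = client.conversations.items.list(conversation.id)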
Today we’re announcing Open Responses: an open-source spec for multi-provider, interoperable LLM interfaces, built on top of the original OpenAI Responses API.
✅ Multi-provider by default
✅ Useful for real-world workflows
✅ Extensible without fragmentation
Build agentic systems without rewriting your stack for every model: openresponses.org
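The practical upshot: a Responses-shaped client can point at any conforming provider just by swapping the base URL. A minimal sketch of that idea (the endpoint URL and model name below are hypothetical placeholders, not part of the spec):

# Sketch of the interoperability idea: same Responses-style request, different provider.
from openai import OpenAI

client = OpenAI(
    base_url="https://models.example.com/v1",  # hypothetical provider exposing an Open Responses endpoint
    api_key="PROVIDER_API_KEY",
)

response = client.responses.create(
    model="example-model",  # hypothetical model name
    input="Summarize the last three tool calls.",
)
print(response.output_text)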
You can now get more Codex usage from your plan and credits with three updates today:
1️⃣ GPT-5-Codex-Mini — a more compact and cost-efficient version of GPT-5-Codex
2️⃣ 50% higher rate limits for ChatGPT Plus, Business, and Edu
3️⃣ Priority processing for ChatGPT Pro and Enterprise
GPT-5-Codex-Mini offers roughly 4x more usage than GPT-5-Codex, with a slight capability tradeoff from the more compact model.
Available in the CLI and IDE extension when you sign in with ChatGPT, with API support coming soon.
Select GPT-5-Codex-Mini for easier tasks or to extend usage when you’re close to hitting rate limits.
Codex will also suggest switching to it when you reach 90% of your limits, so you can work longer without interruptions.