GPT-5.2 Pro responses are not displayed in Capacities
Thomas Wagner
I am using my own OpenAI API key in Capacities with the GPT-5.2 Pro model. Very short prompts (e.g. “Reply with OK”) return a visible response in Capacities. However, as soon as the prompt becomes slightly longer (e.g. “Say 100 words”), no response is displayed in the UI.
According to the OpenAI usage logs, the requests are successfully processed: the model is called, input and output tokens are consumed, and there are no API errors. This confirms that the request reaches OpenAI and a response is generated. The issue therefore appears to be on the Capacities side, where longer GPT-5.2 Pro responses are not rendered or are dropped.
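For comparison, a standalone request with the same key and prompt outside Capacities can show whether the full text arrives intact. The sketch below uses the OpenAI Python SDK's Responses API; the model identifier `gpt-5.2-pro` and the 600-second timeout are assumptions on my part, not the exact parameters Capacities uses.

```python
# Minimal repro sketch outside Capacities (assumptions: `openai` Python SDK,
# model id "gpt-5.2-pro", OPENAI_API_KEY set in the environment).
from openai import OpenAI

# Generous client timeout, since Pro-class models can take a while to answer.
client = OpenAI(timeout=600)

response = client.responses.create(
    model="gpt-5.2-pro",    # assumed API identifier for GPT-5.2 Pro
    input="Say 100 words",  # the prompt that shows no output in Capacities
)

# If this prints the full answer, the drop happens in the client integration,
# not on the OpenAI side.
print(response.output_text)
```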
Other models work correctly with the same setup and API key. This results in ongoing API costs for GPT-5.2 Pro without any usable output in Capacities.
Please investigate whether there is a parsing, length, streaming, or timeout issue in the Capacities integration specific to GPT-5.2 Pro.
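One possible angle for the streaming/timeout hypothesis: Pro-class models can take a long time before emitting the first output token, so a short read timeout or an unhandled stream event type on the client side could leave the UI empty even though the response completes. Below is a hedged sketch of a streaming check, again assuming the Python SDK and the `gpt-5.2-pro` identifier; it only illustrates the idea and says nothing about how Capacities actually handles the stream.

```python
# Streaming sketch (same assumptions as above). It measures the delay before
# the first output token, which is where a short client-side timeout would fail.
import time
from openai import OpenAI

client = OpenAI(timeout=600)
start = time.monotonic()
first_token_at = None

stream = client.responses.create(
    model="gpt-5.2-pro",   # assumed API identifier
    input="Say 100 words",
    stream=True,
)

for event in stream:
    # The Responses streaming API emits typed events; text arrives as deltas.
    if event.type == "response.output_text.delta":
        if first_token_at is None:
            first_token_at = time.monotonic() - start
        print(event.delta, end="", flush=True)

print()
if first_token_at is None:
    print("No output text events were received.")
else:
    print(f"First output token arrived after {first_token_at:.1f}s.")
```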
Capacities v1.57.24; Windows v10; Chrome v138.0.7204.251 (WebKit v537.36); Desktop (Electron); 14.1.2026, 20:37:22; en (Europe/Berlin, UTC+1)
Beth marked this post as under review.