When using the OpenAI_Text-ChatGPT compound (Aximmetry 2026.2.0) with a third-party OpenAI-compatible proxy (PackyAPI), the compound fails to output any text even though the API request succeeds and tokens are consumed.
My Issues
Protocol Mismatch (Legacy vs. Chat): For certain model families, the factory compound defaults to the legacy /completions endpoint, which triggers the error "field messages is required".
Workaround: I enabled Raw JSON Override with a structured Chat message array, which successfully bypassed this error.
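For reference, the payload I used is shaped like the snippet below. This is an illustrative sketch only: the model name, roles, and prompt text are placeholders, not Aximmetry defaults, and the Raw JSON Override field just receives the serialized string.

```python
import json

# Hypothetical chat-style body for the Raw JSON Override field.
# Model name and messages are placeholders for illustration.
payload = {
    "model": "gpt-5.2",
    "stream": False,
    "messages": [
        {"role": "system", "content": "You are a virtual studio assistant."},
        {"role": "user", "content": "Introduce the next segment."},
    ],
}

raw_json = json.dumps(payload, indent=2)
print(raw_json)
```

Supplying a `messages` array this way is what satisfies the "field messages is required" check on the chat endpoint.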
The "Null Content" & Token Consumption Paradox: Even with HTTP status 200 and confirmed token usage (e.g., completion_tokens: 23), the Response Text returns {"content": null}.
Analysis: Modern models (like GPT-5.2/5.5) often return the actual answer in a reasoning_content field instead of the standard content field. The current Aximmetry compound appears to parse only the choices[0].message.content path, so any text placed elsewhere is silently dropped.
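The fallback logic I am asking for could look roughly like this. It is a sketch of what an external parser (or a patched compound) would do, assuming the proxy follows the OpenAI response shape; the field names content, reasoning_content, and the legacy text are taken from the responses described above.

```python
import json

def extract_text(response_body: str) -> str:
    """Pull the answer out of a chat-completions response, falling back
    to reasoning_content and the legacy /completions text field when
    content is null. Illustrative sketch, not Aximmetry's parser."""
    data = json.loads(response_body)
    choice = data.get("choices", [{}])[0]
    message = choice.get("message", {})
    # Try the standard field first, then the reasoning field, then legacy text.
    for value in (message.get("content"),
                  message.get("reasoning_content"),
                  choice.get("text")):
        if value:
            return value
    return ""

# Example response where content is null but reasoning_content holds the answer.
body = json.dumps({
    "choices": [{"message": {"content": None,
                             "reasoning_content": "Hello from the model."}}]
})
print(extract_text(body))  # -> Hello from the model.
```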
Forced Streaming Behavior: Despite setting "stream": false in the Raw JSON, the proxy often returns chat.completion.chunk objects as an SSE stream. The compound apparently cannot aggregate these chunks into a single string when the proxy's server-side settings override or ignore the stream parameter.
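The aggregation step that seems to be missing can be sketched as follows. This assumes OpenAI-style SSE framing (`data: {...}` lines terminated by `data: [DONE]`) and is a standalone illustration, not part of any Aximmetry module.

```python
import json

def aggregate_sse(raw: str) -> str:
    """Concatenate delta fragments from a chat.completion.chunk SSE stream
    into one string. Sketch assuming OpenAI-style 'data: {...}' lines."""
    parts = []
    for line in raw.splitlines():
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip blank lines and comments between events
        data = line[len("data:"):].strip()
        if data == "[DONE]":
            break
        delta = json.loads(data)["choices"][0].get("delta", {})
        # Some proxies put text in reasoning_content instead of content.
        parts.append(delta.get("content") or delta.get("reasoning_content") or "")
    return "".join(parts)

sample = (
    'data: {"choices":[{"delta":{"content":"Hel"}}]}\n'
    'data: {"choices":[{"delta":{"content":"lo"}}]}\n'
    'data: [DONE]\n'
)
print(aggregate_sse(sample))  # -> Hello
```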
Questions
1. Custom JSON Parsing: Is there a way to modify the compound's internal parser to look for reasoning_content or choices[0].text without breaking the entire linked compound?
2. Base URL & Stream Handling: How can we force the compound to handle non-standard streaming responses from proxies that don't strictly adhere to the stream: false flag?
3. Best Practice for Proxies: Does Aximmetry recommend using the HTTP Request module (single node) as a replacement for the OpenAI_Text-ChatGPT compound when dealing with third-party providers, to ensure full control over the JSON structure?