OpenAI Assistant with stream response template not working.

When I use the OpenAI Assistant without the stream response, I get the chat response from OpenAI, but when I use the AI Assistant with Stream Response node, I get a generic response. All other values and parameters are kept consistent. What am I missing? I started with this template: https://buildship.com/templates/assistant-retrieval
OpenAI Assistant with Retrieval - BuildShip
Make an Assistant that has access to the files you upload in the Assistant playground, using a no-code visual backend and AI workflow builder.
AI Support — 20h ago
AI Support Bot Information
<@947736809973121045> you can react on the relevant answer (message) with a ✅ in this thread when you think it has been solved, by the bot or by a human! Anyone can react on this message with a ❌ if the GPT bot is unhelpful or hallucinating answers. Please note: Team members will review and answer the questions on best efforts basis.
AI Support — 20h ago
AI Response
If the AI Assistant with Stream Response node returns a generic response instead of the expected chat response, while all other settings and parameters are consistent, here are a few troubleshooting steps:

1. Workflow and node testing mode: Streaming Assistants won't work in Test Mode. Ship the workflow and then test it from your client-side (frontend) application or with BuildShip's chat widget, making sure streaming is enabled in the widget as well.

2. Streaming configuration: Double-check that streaming is enabled for the specific node or use case; some configurations require streaming to be turned on explicitly.

3. Client-side handling: Make sure your client-side implementation can process data that arrives incrementally rather than as a single complete response (see the sketch after this list).

4. API keys and permissions: Confirm that the correct OpenAI API key and permissions are set. Incorrect permissions or keys can produce unexpected results even when they work in other configurations.

5. Instruction overrides: If you have modified any instructions, make sure they do not conflict with the default behavior expected from the Playground or another non-streaming environment.

6. BuildShip widgets: Verify that any BuildShip widget you are testing with is configured to use streaming; some widgets do not stream without explicit configuration.

If these steps do not resolve the issue, confirm that your streaming setup and node configuration match the BuildShip environment, try the integration in a different environment if possible, or seek further assistance through BuildShip's support.
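For point 3, here is a minimal client-side sketch of consuming a streamed text response with the standard fetch/ReadableStream APIs. The endpoint URL and request body shape are placeholders, not the actual BuildShip contract; adjust them to match your shipped workflow's trigger.

```typescript
// Minimal sketch: read a streamed text response chunk by chunk.
// The URL and payload below are hypothetical placeholders for your
// shipped BuildShip workflow endpoint.
async function streamAssistantReply(prompt: string): Promise<void> {
  const response = await fetch("https://<your-project>.buildship.run/assistant-stream", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ message: prompt }),
  });

  if (!response.ok || !response.body) {
    throw new Error(`Request failed: ${response.status}`);
  }

  // Read the body incrementally instead of awaiting response.json(),
  // which would block until the whole stream has finished.
  const reader = response.body.getReader();
  const decoder = new TextDecoder();

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    // Append each decoded chunk to the UI as it arrives.
    console.log(decoder.decode(value, { stream: true }));
  }
}
```

If your frontend instead parses the full response once (e.g. `await response.json()`), a streamed reply can look truncated or generic, which matches the symptom described above.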
BuildShip AI Support