Johnny Hao
Johnny Hao2w ago

Please improve AI Assistants Streaming template

I have tested the templates for both the Streaming Chatbot and the Streaming API. At a high level they work for demo purposes, but neither meets the requirements for a robust solution.

1. AI Chatbot Streaming: it doesn't parse the threadId, so the chat session does not retain the previous context.
2. AI Assistant API: the output only shows the message (stream) with no threadId metadata, and the stream part is not JSON.

I have successfully implemented the AI Assistant in my project and it's robust. I'm currently exploring the streaming approach; it's not a deal breaker, but it would be nice to have for the UI experience. The current template/node only produces a response like:

    {
      "stream": {
        "_events": {},
        "_readableState": {
          "highWaterMark": 16384,
          "buffer": [],
          "bufferIndex": 0,
          "length": 0,
          "pipes": [],
          "awaitDrainWriters": null
        }
      },
      "threadId": "thread_vfGivteoekohrQUgz75OTzan"
    }

With the Set Header approach it produces the actual message, but the threadId is missing. Please suggest! Also, some of the older BuildShip tutorials I found are outdated.
Solution:
I finally figured it out. Here is the solution, in case you don't get JSON output: you need to manually modify the node's code. The BuildShip engineers left good comments there. Just uncomment the line stream.push("data: [DONE]\n\n"); and change the parameter to "Server Sent Events".
6 Replies
AI Support
AI Support2w ago
AI Response
To address your concerns regarding the AI Chatbot Streaming and Assistant API:

1. AI Chatbot Streaming: to maintain chat sessions and retain previous context, storing the threadId is crucial. You can store the threadId in a BuildShip table or your preferred database and pass it back on subsequent requests, so the conversation builds on previous messages and context is not lost between sessions. This is the same approach used for storing thread IDs for the OpenAI Assistant.
2. AI Assistant API output stream: to handle output that only shows the message stream and is not in JSON format, make sure your implementation splits the stream into parts that are individually JSON-serializable and recombines them when processing (see the sketch below). This may require custom handling depending on how the stream is structured in your response.

For both approaches, consider adding robust session management and parsing so contextual data is stored and handled properly. If you need further detailed instructions, reach out through BuildShip's in-app support feature.
BuildShip AI Support
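For what it's worth, here is a minimal client-side sketch of point 2, assuming the workflow returns a Server-Sent Events response where each frame is a `data:` line and the stream ends with `data: [DONE]`. The endpoint URL and frame contents are placeholders, not BuildShip's documented API.

```ts
// Hedged sketch: read an SSE response with fetch and split it into
// individually parsable "data:" frames.
async function readSseStream(url: string, body: unknown): Promise<string[]> {
  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  if (!res.body) throw new Error("Response has no body to stream");

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  const messages: string[] = [];
  let buffer = "";

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    // SSE frames are separated by a blank line ("\n\n").
    const frames = buffer.split("\n\n");
    buffer = frames.pop() ?? ""; // keep the trailing partial frame

    for (const frame of frames) {
      const data = frame.replace(/^data:\s*/, "").trim();
      if (data === "[DONE]") return messages; // end-of-stream sentinel
      messages.push(data); // each frame is now a standalone chunk (JSON or text)
    }
  }
  return messages;
}
```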
Gaurav Chadha
Gaurav Chadha2w ago
Hi @Johnny Hao, are you testing the streaming directly in the BuildShip test panel? Streaming isn't supported there. You can test it using the Chatbot Trigger, which has a built-in UI that shows streaming correctly and doesn't require a separate Set Header approach. Rest assured, we'll update the docs and tutorials for this in the coming week.
Johnny Hao
Johnny HaoOP2w ago
Thanks! The Chatbot Trigger works, but it doesn't retain the threadId. The bot treats every message I send as a new dialog.
Gaurav Chadha
Gaurav Chadha2w ago
The thread ID is expected to be passed in the request body. You can confirm this by going to Connect > Get Data. This will load the latest request schema, and if you expand the body you'll see that the threadId field is empty.
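As an illustration only (the field names `message` and `threadId` are assumptions here; check Connect > Get Data for your workflow's actual schema), a client could capture the threadId from the first response and send it back on every follow-up turn:

```ts
// Persist this per chat session (e.g. in localStorage or a database row).
let threadId: string | undefined;

async function sendMessage(endpoint: string, message: string) {
  const res = await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    // On the first call threadId is undefined; later calls pass it back
    // so the assistant keeps the conversation context.
    body: JSON.stringify({ message, threadId }),
  });
  const data = await res.json();
  threadId = data.threadId ?? threadId; // remember it for the next turn
  return data;
}
```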
Solution
Johnny Hao
Johnny Hao2w ago
I finally figured it out. Here is the solution, in case you don't get JSON output: you need to manually modify the node's code. The BuildShip engineers left good comments there. Just uncomment the line stream.push("data: [DONE]\n\n"); and change the parameter to "Server Sent Events".
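To make the shape of that fix concrete, here is a hedged sketch of roughly what the modified node ends up doing. This is illustrative TypeScript, not the actual BuildShip node code, and emitting the threadId as its own frame is an assumption.

```ts
import { Readable } from "stream";

// Sketch of the idea behind the fix: every chunk is pushed as an SSE "data:"
// frame, the stream ends with the "data: [DONE]" sentinel, and the response
// type is set to Server Sent Events so the client receives parsable frames
// plus a clear end-of-stream marker.
function toSseStream(chunks: AsyncIterable<string>, threadId: string): Readable {
  const stream = new Readable({ read() {} });

  (async () => {
    // Sending the threadId as its own frame gives the client something to
    // store for follow-up requests (assumption, not the template's behavior).
    stream.push(`data: ${JSON.stringify({ threadId })}\n\n`);
    for await (const chunk of chunks) {
      stream.push(`data: ${JSON.stringify({ content: chunk })}\n\n`);
    }
    stream.push("data: [DONE]\n\n"); // the line the solution says to uncomment
    stream.push(null); // signal end of stream
  })().catch((err) => stream.destroy(err as Error));

  return stream;
}
```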
