jiefu95
jiefu95 · 2mo ago

API call to OpenAI with streaming

I'd like to build a streaming assistant, but the prebuilt OpenAI streaming node has some limitations (for example, I can't get a JSON response), so I would rather make my own API call with the "stream": true parameter, as specified in the OpenAI documentation (roughly the request shown below). I tried doing this in BuildShip but it isn't working. Is there any way to make an API call to OpenAI with streaming?
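For reference, the kind of request body I mean (per the OpenAI documentation) looks roughly like this; the model name is just an example:

```ts
// Example chat completions payload with streaming enabled, as described in the OpenAI docs.
const body = {
  model: "gpt-4o-mini",                            // example model name
  messages: [{ role: "user", content: "Hello" }],
  stream: true,                                    // tokens come back as server-sent events
};
```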
2 Replies
AI Support
AI Support · 2mo ago
AI Support Bot Information
<@588336837970624514> you can react to the relevant answer (message) with a ✅ in this thread when you think it has been solved, whether by the bot or by a human! Anyone can react to this message with a ❌ if the GPT bot is unhelpful or hallucinating answers. Please note: team members will review and answer questions on a best-effort basis.
AI Support
AI Support · 2mo ago
AI Response
I couldn't find any pre-built nodes or direct support in BuildShip for making an OpenAI API call with streaming set to true. However, you can build a custom integration to achieve this. Here's how you could proceed outside of the pre-built options:
1. HTTP Request Node: Use an HTTP Request node to call the OpenAI API directly. When configuring this node, you need to:
   - Set the Content-Type header to application/json.
   - Include the stream: true parameter in your request payload.
   - Handle the streamed response accordingly within your BuildShip workflow.
2. Custom Node: If you need a more tailored interaction, consider leveraging BuildShip's capability to generate a custom node (a rough sketch of the underlying call follows below). Refer to the BuildShip documentation on creating custom nodes, which may give you more flexibility for your specific use case.
If you are unable to accomplish this due to limitations within BuildShip, consider reaching out to their support team for further assistance.
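For reference, here is a minimal sketch of the underlying call in TypeScript, using plain fetch rather than any BuildShip-specific API. It sends a chat completions request with stream: true and reads the server-sent-events response chunk by chunk. The function name, model name, and the way the result is returned are illustrative assumptions; adapt the input/output handling to however your custom node receives parameters and emits output.

```ts
// Sketch: call OpenAI chat completions with stream: true and read the SSE stream.
// Assumes a runtime with a global fetch (Node 18+ or similar).
async function streamChatCompletion(apiKey: string, prompt: string): Promise<string> {
  const response = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: "gpt-4o-mini",                          // example model name
      messages: [{ role: "user", content: prompt }],
      stream: true,                                  // ask OpenAI to stream tokens back
    }),
  });

  if (!response.ok || !response.body) {
    throw new Error(`OpenAI request failed with status ${response.status}`);
  }

  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  let fullText = "";

  // The streamed body is a series of "data: {...}" lines (server-sent events);
  // each JSON chunk carries a delta with the next piece of the assistant's reply.
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });

    const lines = buffer.split("\n");
    buffer = lines.pop() ?? "";                      // keep any partial line for the next read

    for (const line of lines) {
      const trimmed = line.trim();
      if (!trimmed.startsWith("data:")) continue;
      const payload = trimmed.slice("data:".length).trim();
      if (payload === "[DONE]") return fullText;
      const delta = JSON.parse(payload).choices?.[0]?.delta?.content;
      if (delta) fullText += delta;                  // forward `delta` here for live streaming
    }
  }
  return fullText;
}
```

Buffering the partial line between reads matters because an SSE line can be split across network chunks; parsing only complete lines avoids JSON errors mid-stream.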
BuildShip AI Support