OpenAI Stream Response is empty
Hello, the OpenAI Stream Response is not outputting anything - could I get some help please?
- Also, the node continuously has the spinning loading circle, as you see in the screenshot
- When I test the Stream Response node individually, I get Test Failed: Cannot read properties of undefined (reading 'on')
Hey @Ethan Tan, as the description of the node states, this node doesn't support node testing.
To receive a stream from this workflow's endpoint, you have a few options:
- For testing, you can simply change the request method from POST to GET, and open the endpoint in a browser. Most browsers are able to handle stream responses without the user having to configure anything.
- To continue using it with the POST method, you need to specially handle the response in the client that's receiving the stream. For example, here's how we do it in our chat widget: https://github.com/rowyio/buildship-chat-widget/blob/cc95bedb6563175b4e348b4c3b065d47ef59ae0b/src/index.ts#L261 (see lines 261 to 291).
- Finally, if you are okay with using the OpenAI Assistant and Firestore, you may also try using the stream-to-Firestore node (look up OpenAI Assistant (Stream Response To Firestore) in the node library). The response is streamed straight to your Firestore docs, which you can listen to for changes as the streamed text comes in.
Thanks @nithinrdy - I understand the node can't be tested individually
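For reference, the client-side handling mentioned earlier (reading a streamed POST response chunk by chunk, as the linked widget code does) boils down to a loop like the one below. This is only a minimal sketch, not BuildShip's or the widget's exact code; in a real client the stream would come from `(await fetch(url, { method: "POST", body })).body` rather than being constructed locally.

```typescript
// Minimal sketch of consuming a streamed HTTP body on the client.
// In a real client, `stream` would be `response.body` from a fetch call;
// here a locally built ReadableStream stands in so the sketch is runnable.
async function readStream(stream: ReadableStream<Uint8Array>): Promise<string> {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  let text = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    // `stream: true` correctly handles multi-byte characters
    // that get split across chunk boundaries.
    text += decoder.decode(value, { stream: true });
  }
  return text + decoder.decode(); // flush any buffered bytes
}

// Stand-in for a server-sent stream: three encoded text chunks.
const encoder = new TextEncoder();
const fakeStream = new ReadableStream<Uint8Array>({
  start(controller) {
    for (const chunk of ["Hello", ", ", "stream"]) {
      controller.enqueue(encoder.encode(chunk));
    }
    controller.close();
  },
});

readStream(fakeStream).then((text) => console.log(text)); // prints "Hello, stream"
```

In a UI you would typically append each decoded chunk to the page as it arrives instead of accumulating it, which is what makes the response feel "live".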
When I test the whole workflow, though, it seems nothing is output from the stream?
This is the code:
The node doesn't have a standard "response" either. It directly pipes the stream through to the HTTP response body, which is why it doesn't show up in the logs. Unfortunately, there's no way around it; the node, by its very nature, does not return a standard response object.
So testing this node isn't possible, nor is there a way to see the text stream in the logs. This is an inherent downside of the streaming nodes.
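To illustrate the point above (a hypothetical sketch, not BuildShip's actual implementation): a streaming node effectively pipes a readable stream straight into the response, so the "result" only ever exists as bytes on the wire, never as a return value the logs could capture. It also hints at why individual testing fails with "Cannot read properties of undefined (reading 'on')" - without a live HTTP response, there is no stream object to attach listeners to.

```typescript
import { Readable, PassThrough } from "node:stream";

// Illustrative only: pipe a token stream directly into the HTTP response.
// Nothing is returned, so there is no response object for logs to record.
function pipeToResponse(tokens: string[], res: NodeJS.WritableStream): void {
  Readable.from(tokens).pipe(res); // chunks flow out as they arrive
}

// A PassThrough stands in for the real `http.ServerResponse`.
const res = new PassThrough();
let sent = "";
res.on("data", (chunk) => (sent += chunk));
res.on("end", () => console.log(sent)); // prints "Hello world"
pipeToResponse(["Hello", " ", "world"], res);
```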
Oh I see - so looking at the code and setup above, does it look like it should be working correctly?
Yep, it does look correct. If you think you've accidentally edited something, you can try replacing it with a new node from the library to get back the original :)
Thank you - oh and how should the REST API node be set up please, in terms of the Body/Header etc?
The default configuration should work fine, are you facing any issues with the REST API trigger?
It may be fine - I'm just attempting to find the problem. The workflow is called correctly now from Vapi, but on the Vapi side they said they don't receive any output from Buildship
@nithinrdy Is there any way you can see the logs of a particular workflow run?
@nithinrdy following up on this
Sorry about the delay. Are you looking for something apart from the workflow logs? That's the place to look for workflow execution logs -- timestamps could help you find the specific run you're looking for.
Thanks @nithinrdy - I see the log but for example in this one:
- The request is being received correctly (called from Vapi.ai)
- I can't tell what, if anything, is being output from the Stream Response or the Return node. It looks like nothing? Vapi says they receive nothing on their side
How else can we diagnose this? Is there something you can see from your side?
Since the response is piped directly to the response body, I don't think it'll show up in the logs.
The easiest way to check if the stream is working is to change the REST API trigger's method to GET and open the workflow endpoint in Chrome. If you see a response there, that implies the workflow is working as intended.
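That same GET sanity check can also be scripted. The sketch below is self-contained, so it spins up a local stand-in server that streams a few chunks; against the real workflow you would fetch your endpoint URL instead (the handler here is purely illustrative, not the workflow's actual behavior).

```typescript
import { createServer } from "node:http";

// Local stand-in for the workflow endpoint: streams a few chunks, then ends.
const server = createServer((req, res) => {
  res.writeHead(200, { "Content-Type": "text/plain" });
  for (const chunk of ["one ", "two ", "three"]) res.write(chunk);
  res.end();
});

server.listen(0, async () => {
  const { port } = server.address() as { port: number };
  // With the trigger set to GET, a plain fetch (or a browser tab) is
  // enough to see the streamed body.
  const body = await (await fetch(`http://localhost:${port}/`)).text();
  console.log(body); // prints "one two three"
  server.close();
});
```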
One other thing you may try is using the OpenAI Streaming Assistant instead. It returns a stream object which, although it isn't the actual text response, does show up clearly in the logs.