Ethan Tan - 5mo ago

OpenAI Stream Response is empty

Hello, the OpenAI Stream Response node is not outputting anything - could I get some help please?
- The node continuously shows the spinning loading circle, as you can see in the screenshot.
- When I test the Stream Response node individually, I get: Test Failed: Cannot read properties of undefined (reading 'on')
11 Replies
nithinrdy - 5mo ago
Hey @Ethan Tan, as the description of the node states, this node doesn't support node testing. To receive a stream from this workflow's endpoint, you have a few options:
- For testing, you can simply change the request method from POST to GET and open the endpoint in a browser. Most browsers are able to handle stream responses without the user having to configure anything.
- To continue using the POST method, you need to handle the response specially in the client that's receiving the stream (a rough sketch is shown below). For example, here's how we do it in our chat widget: https://github.com/rowyio/buildship-chat-widget/blob/cc95bedb6563175b4e348b4c3b065d47ef59ae0b/src/index.ts#L261 (see lines 261 to 291).
- Finally, if you are okay with using the OpenAI Assistant and Firestore, you may also try the stream-to-Firestore node (look up OpenAI Assistant (Stream Response To Firestore) in the node library). The response is streamed straight to your Firestore docs, which you can listen to for changes as the streamed text comes in.
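A minimal sketch of that client-side handling, for illustration - the endpoint URL and the body key are placeholders, not the widget's exact code:

const response = await fetch("https://YOUR-BUILDSHIP-ENDPOINT", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ userRequest: "Hello!" }), // body keys depend on your workflow inputs
});

// Read the response body incrementally instead of waiting for the request to finish
const reader = response.body.getReader();
const decoder = new TextDecoder();
let text = "";

while (true) {
  const { done, value } = await reader.read();
  if (done) break;
  text += decoder.decode(value, { stream: true }); // append each chunk as it arrives
  console.log(text);
}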
Ethan Tan - 5mo ago
Thanks @nithinrdy - I understand the node can't be tested individually. When I test the workflow, though, it seems nothing is output from the stream? This is the code:
import { Readable } from "stream";
import OpenAI from "openai";

// Stream data from readStream (the OpenAI completion) to writeStream (the HTTP response)
function streamer(writeStream, readStream) {
  return new Promise(function (resolve) {
    let response = "";

    readStream.on("content", (delta) => {
      // The "content" event emits the text delta for each chunk;
      // forward it to the writeStream and accumulate the full response
      writeStream.push(delta);
      response += delta;
    });

    readStream.on("end", () => {
      // Signal end-of-stream to the client and resolve with the full text
      writeStream.push(null);
      resolve(response);
    });

    readStream.on("error", () => {
      // On error, push an error marker, end the stream, and resolve
      writeStream.push("err...");
      writeStream.push(null);
      resolve("");
    });
  });
}
export default async (
  { userRequest, systemPrompt, openaiSecret, model, temperature },
  { logging, req: ctx = {} },
) => {
  // Create a new Readable stream for writing the response
  let writeStream = (ctx.body = new Readable());
  writeStream._read = function () {}; // No-op _read; data is pushed in manually
  writeStream.pipe(ctx.res); // Pipe the writeStream to the response object to send data to the client
  ctx.type = "text/undefined-content";

  // Set response headers
  Object.assign(ctx.response.headers, {
    "Transfer-Encoding": "chunked",
    Connection: "keep-alive",
  });

  // Initialize the OpenAI API client
  const openai = new OpenAI({
    apiKey: openaiSecret,
  });

  try {
    // Open a streaming chat completion; .stream() returns the stream object
    // directly and implies stream: true
    const completion = openai.beta.chat.completions.stream({
      model,
      temperature,
      messages: [
        {
          role: "system",
          content: systemPrompt,
        },
        {
          role: "user",
          content: userRequest,
        },
      ],
    });

    // Stream data from the completion to the writeStream
    const response = await streamer(writeStream, completion);
    return response;
  } catch (error) {
    // Log any errors and return an empty response
    logging.log(error);
    return "";
  }
};
nithinrdy - 5mo ago
The node doesn't have a standard "response" either. It pipes the stream directly into the HTTP response body, which is why it doesn't show up in the logs. Unfortunately, there's no way around this: by its very nature, the node does not return a standard response object. So testing this node isn't possible, nor is there a way to see the text stream in the logs. This is an inherent downside of the streaming nodes.
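If you just need to confirm what text was generated, one possible workaround (my suggestion, not a feature of the stock node) is to log the accumulated text once the stream finishes - the streamer function in the code above already collects it:

// Debugging tweak inside the node's export (an assumption, not stock behavior):
// `streamer` resolves with the full accumulated text once the stream ends,
// so logging it makes the generated text visible in the workflow logs.
const response = await streamer(writeStream, completion);
logging.log("Streamed text:", response);
return response;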
Ethan Tan - 5mo ago
Oh I see - so looking at the code and setup above, does it look like it should be working correctly?
nithinrdy - 5mo ago
Yep, it does look correct. If you think you've accidentally edited something, you can try replacing it with a new node from the library to get back the original :)
Ethan Tan - 5mo ago
Thank you - oh, and how should the REST API node be set up, in terms of the Body/Headers etc.?
nithinrdy - 5mo ago
The default configuration should work fine; are you facing any issues with the REST API trigger?
Ethan Tan - 5mo ago
It may be fine - I'm just trying to find the problem. The workflow is now called correctly from Vapi, but on the Vapi side they say they don't receive any output from Buildship.
@nithinrdy Is there any way you can see the logs of a particular workflow run?
@nithinrdy following up on this
nithinrdy - 5mo ago
Sorry about the delay. Are you looking for something apart from the workflow logs? That's the place to look for workflow execution logs; timestamps could help you find the specific run you're looking for.
Ethan Tan - 5mo ago
Thanks @nithinrdy - I see the log, but for example in this one:
- The request is being received correctly (called from Vapi.ai).
- I can't tell what, if anything, is being output from the Stream Response or the Return node. It looks like nothing? Vapi says they receive nothing on their side.
How else can we diagnose this? Is there something you can see from your side?
nithinrdy - 5mo ago
Since the response is piped directly into the response body, I don't think it'll show up in the logs. The easiest way to check whether the stream is working is to change the REST API trigger's method to GET and open the workflow endpoint in Chrome. If you see a response there, the workflow is working as intended. One other thing you could try is using the OpenAI Streaming Assistant instead. It returns a stream object which, although it isn't the actual text response, does show up clearly in the logs.
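If you'd rather keep the POST method while testing, a small Node script can also confirm whether chunks actually arrive incrementally - the endpoint URL and body key below are placeholders:

// Quick diagnostic (assumes Node 18+, which ships a global fetch; run as an ES module)
const res = await fetch("https://YOUR-BUILDSHIP-ENDPOINT", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ userRequest: "Say hello" }),
});

// Node's web ReadableStream is async-iterable; if streaming works,
// chunks should print incrementally rather than all at once at the end.
for await (const chunk of res.body) {
  process.stdout.write(Buffer.from(chunk).toString());
}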