nithinrdy
BuildShip + Rowy
Created by Brian on 9/13/2024 in #❓・buildship-help
"undefined's stream has ended" in OpenAI Assistant (Stream Response)
gotcha, we'll look into this
9 replies
BuildShip + Rowy
Created by Brian on 9/13/2024 in #❓・buildship-help
"undefined's stream has ended" in OpenAI Assistant (Stream Response)
Hey @Brian, sorry about the confusion, this error isn't because of a missing/non-existent thread ID. As for why a thread is created despite the assistant not having been found, it's simply because the assistant is designed to first generate a thread, and then plug it into an assistant instance to generate responses/conversations. So, to clarify, the assistant creating a new thread is normal when you do not supply a thread ID of your own, and the error that you see is simply because the assistant ID you provided for the node does not correspond to an existing assistant instance (over on platform.openai.com). Could you please double-check and make sure that you're using the right assistant ID? Could you also make sure that your assistant instance is that of the V2 Assistant (that's what the node is designed to work with)?
9 replies
BuildShip + Rowy
Created by Thomas on 6/26/2024 in #❓・buildship-help
Switch Node: one case for several conditions
No description
5 replies
BuildShip + Rowy
Created by Savannah on 5/22/2024 in #💬・general
Savannah - I'm not sure how to use variable, bu...
Hey @Savannah, try removing the quotes in front of and at the end of the variable.
1 replies
BuildShip + Rowy
Created by Ethan Tan on 5/19/2024 in #❓・buildship-help
OpenAI Stream Response is empty
One other thing you may try is using the OpenAI Streaming Assistant instead. It returns a stream object which, although it isn't the actual text response, does show up clearly in the logs.
19 replies
BuildShip + Rowy
Created by Ethan Tan on 5/19/2024 in #❓・buildship-help
OpenAI Stream Response is empty
Since the response is piped directly to the response body, I don't think it'll show up in the logs. The easiest way to check if the stream is working is to change the REST API trigger's method to GET, and open the workflow endpoint in Chrome. If you see a response there, that implies the workflow is working as intended.
19 replies
BuildShip + Rowy
Created by Ethan Tan on 5/19/2024 in #❓・buildship-help
OpenAI Stream Response is empty
Sorry about the delay. Are you looking for something apart from the workflow logs? That's the place to look for workflow execution logs -- timestamps could help you find the specific run you're looking for.
19 replies
BuildShip + Rowy
Created by Murilo Ravani on 5/20/2024 in #❓・buildship-help
OpenAI Assistant v2 vs gpt-4o
Hey there @Ravani, if you'd like to use the new 4o model, please replace the assistant node in your workflow with a new copy of the assistant from the node library. The one you're using is probably the older V1 assistant. The one currently in the node library is built to work with the V2 assistant.
4 replies
BuildShip + Rowy
Created by Ethan Tan on 5/19/2024 in #❓・buildship-help
OpenAI Stream Response is empty
The default configuration should work fine, are you facing any issues with the REST API trigger?
19 replies
BuildShip + Rowy
Created by Abrar on 5/19/2024 in #❓・buildship-help
query
Hey @Abrar, if you're talking about using a package as part of a node in your workflow, you do not actually need to manually install packages. BuildShip takes care of this behind the scenes. This is all you need to do:
1. Add the required ES6 import statement(s) to your script (Image 1).
2. Head to the Info section of the node editor and look for the "NPM Packages" section. Here you can configure your package (i.e. pick the version you'd like) (Image 2).
Hope this helps.
5 replies
BuildShip + Rowy
Created by Holden on 5/19/2024 in #💬・general
Holden - Do I need to adjust anything within Bu...
Hey @Holden, that's right, all you need to do to start using the 4o model is to configure your assistant over on platform.openai.com. Do make sure you're using the latest version of the assistant though, which is built to work with the OpenAI Assistant V2 API (the description of the node should tell you if it's compatible with V2 or not). If this is not the case, please replace the node with a new one from the node library.
2 replies
BuildShip + Rowy
Created by HocusPocus on 5/19/2024 in #💬・general
HocusPocus - hey guys, I want to integrate the ...
hey there @HocusPocus, as the error message indicates, the user prompt is missing. The template expects the user prompt to come in from the HTTP Request Body, via the message property. Could you check whether that's the property name you're sending, and if not, try renaming the property to message and see if that works?
5 replies
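For reference, here's a minimal sketch of a request payload shaped the way the template expects, with the user prompt under the message property. The endpoint URL is a placeholder, not a real workflow address.

```typescript
// Build the request body the template expects: the user prompt lives
// under the `message` property.
const payload = JSON.stringify({ message: "Hello, assistant!" });

// A client would send it to the workflow endpoint like so
// (placeholder URL, shown commented out):
// await fetch("https://<your-workflow-endpoint>", {
//   method: "POST",
//   headers: { "Content-Type": "application/json" },
//   body: payload,
// });

// Round-trip check that the property name matches:
const parsed = JSON.parse(payload);
console.log(parsed.message); // "Hello, assistant!"
```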
BuildShip + Rowy
Created by Ethan Tan on 5/19/2024 in #❓・buildship-help
OpenAI Stream Response is empty
Yep, it does look correct. If you think you've accidentally edited something, you can try replacing it with a new node from the library to get back the original :)
19 replies
BuildShip + Rowy
Created by Ethan Tan on 5/19/2024 in #❓・buildship-help
OpenAI Stream Response is empty
So testing this node isn't possible, nor is there a way to see the text stream in the logs. This is an inherent downside of the streaming nodes.
19 replies
BuildShip + Rowy
Created by Ethan Tan on 5/19/2024 in #❓・buildship-help
OpenAI Stream Response is empty
The node doesn't have a standard "response" either. It directly pipes the stream through to the HTTP response body, which is why it doesn't show up in the logs. Unfortunately, there's no way around it: the node, by its very nature, does not return a standard response object.
19 replies
BuildShip + Rowy
Created by Ethan Tan on 5/19/2024 in #❓・buildship-help
OpenAI Stream Response is empty
Hey @Ethan Tan, as the description of the node states, this node doesn't support node testing. To receive a stream from this workflow's endpoint, you have a few options:
- For testing, you can simply change the request method from POST to GET, and open the endpoint in a browser. Most browsers are able to handle stream responses without the user having to configure anything.
- To continue using it with the POST method, you need to specially handle the response in the client that's receiving the stream. For example, here's how we do it in our chat widget: https://github.com/rowyio/buildship-chat-widget/blob/cc95bedb6563175b4e348b4c3b065d47ef59ae0b/src/index.ts#L261 (see lines 261 to 291).
- Finally, if you are okay with using the OpenAI Assistant and Firestore, you may also try using the stream-to-Firestore node (look up OpenAI Assistant (Stream Response To Firestore) in the node library). The response is streamed straight to your Firestore docs, which you can listen to for changes as the streamed text comes in.
19 replies
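To illustrate the POST option, here's a minimal sketch of reading a streamed text response on the client with the standard ReadableStream API (the same approach the chat widget linked above uses). The stream here is simulated locally with hard-coded chunks; against a real workflow you'd get it from `(await fetch(endpoint, {...})).body`.

```typescript
// Read a stream of UTF-8 text chunks into a single string.
async function readTextStream(stream: ReadableStream<Uint8Array>): Promise<string> {
  const reader = stream.getReader();
  const decoder = new TextDecoder();
  let text = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    // `stream: true` handles multi-byte characters split across chunks.
    text += decoder.decode(value, { stream: true });
  }
  return text;
}

// Simulated stream standing in for a workflow response body:
const encoder = new TextEncoder();
const fakeBody = new ReadableStream<Uint8Array>({
  start(controller) {
    for (const chunk of ["Hello, ", "streamed ", "world!"]) {
      controller.enqueue(encoder.encode(chunk));
    }
    controller.close();
  },
});

console.log(await readTextStream(fakeBody)); // "Hello, streamed world!"
```

In a real client you'd append each decoded chunk to the UI as it arrives rather than waiting for the full string.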
BuildShip + Rowy
Created by Elder Marx on 5/17/2024 in #💬・general
Elder Marx - I'm trying to download a file thro...
If you need to make a file public within a workflow, you can do so using the "Generate Public Download URL" node.
4 replies
BuildShip + Rowy
Created by Elder Marx on 5/17/2024 in #💬・general
Elder Marx - I'm trying to download a file thro...
Hey @Elder Marx, could you try making the file public? Head to BuildShip storage (in the settings), open the options menu for the file in question, and select "Get Public URL". This will make the file public and copy its URL to your clipboard (the URL will be the same as what's in your image, just that the file will be made public).
4 replies
BuildShip + Rowy
Created by meirk on 5/13/2024 in #❓・buildship-help
streaming assistant template is returning this error:
Because the standard assistant template returns JSON, one property carries the message while the other carries the thread ID, like so:
{
"message": "...",
"threadId": "..."
}
But in the case of a stream response, it's not possible to return JSON like that, because the response is a stream of text. So there needs to be a different way to return the thread ID when streaming, and returning it through a response header is one such way.
10 replies
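A minimal sketch of that header approach: the streamed text travels in the response body while the thread ID rides in a header. The header name "x-thread-id" and the thread ID value are illustrative placeholders, not necessarily what the template actually uses.

```typescript
// Placeholder thread ID for illustration.
const threadId = "thread_abc123";

// The workflow-side response: a text stream as the body, with the
// thread ID attached as a header (Node 18+ provides Response globally).
const body = new ReadableStream<Uint8Array>({
  start(controller) {
    controller.enqueue(new TextEncoder().encode("streamed assistant text..."));
    controller.close();
  },
});

const response = new Response(body, {
  headers: { "x-thread-id": threadId },
});

// The client reads the header up front, then consumes the stream:
console.log(response.headers.get("x-thread-id")); // "thread_abc123"
console.log(await response.text()); // "streamed assistant text..."
```

The key point is that headers are available as soon as the response starts, so the client can store the thread ID for follow-up messages while the body is still streaming.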