OpenAI + Twilio - Request Timeout
We are building a WhatsApp bot that integrates Twilio with OpenAI, using BuildShip.
The issue we are facing is that when Twilio makes an HTTP request to BuildShip, it expects an answer within 10 seconds, otherwise it times out and treats the request as failed. However, the call we then make to OpenAI takes more than 10 seconds to return.
I'd like BuildShip to return a status 200 to Twilio as soon as it receives the request and then continue executing the other nodes. We tried returning the 200 in parallel, but it didn't work.
How can I return a status 200 on a call and keep executing the other nodes?
Solution
You could execute a second workflow from a node in the first one, then return the 200, and make that second workflow the main one, i.e. move the OpenAI call into it so the first workflow only acknowledges Twilio.
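For reference, here is a minimal sketch of the same acknowledge-then-process pattern outside BuildShip, using Node/TypeScript with Express and the `openai` and `twilio` SDKs. The route path, model name, and environment variable names are assumptions; in BuildShip the two halves correspond to the two workflows described above.

```typescript
// Acknowledge the Twilio webhook immediately, then do the slow OpenAI call
// in the background and send the reply via Twilio's REST API.
// Route path and model name are placeholders (assumptions), not BuildShip internals.
import express from "express";
import OpenAI from "openai";
import twilio from "twilio";

const app = express();
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
const twilioClient = twilio(
  process.env.TWILIO_ACCOUNT_SID,
  process.env.TWILIO_AUTH_TOKEN
);

app.post("/whatsapp", express.urlencoded({ extended: false }), (req, res) => {
  // Respond with empty TwiML right away so Twilio never hits its timeout.
  res.type("text/xml").send("<Response/>");

  const from = req.body.From as string; // the user's WhatsApp number
  const to = req.body.To as string;     // your Twilio WhatsApp number
  const incoming = req.body.Body as string;

  // Continue the slow work after the 200 has already been sent.
  void (async () => {
    try {
      const completion = await openai.chat.completions.create({
        model: "gpt-4o-mini", // assumption: any chat model works here
        messages: [{ role: "user", content: incoming }],
      });
      const reply =
        completion.choices[0]?.message?.content ?? "Sorry, no reply.";

      // Send the answer back as a new outbound WhatsApp message.
      await twilioClient.messages.create({ from: to, to: from, body: reply });
    } catch (err) {
      console.error("Background OpenAI/Twilio step failed:", err);
    }
  })();
});

app.listen(3000, () => console.log("Listening on :3000"));
```

The key point is the same as in the BuildShip setup: the webhook handler only acknowledges the request, and everything slow happens after the response has gone out, with the reply delivered through a separate outbound message.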