Git Maxd · 5mo ago

GitMaxd - I’ve been exploring the product and h...

I’ve been exploring the product and subscribed after some quick wins. Since then I’ve run into a few issues, mostly around storage, which is likely me misunderstanding the process. I’ve simplified the workflow down to: take a screenshot of a URL -> upload the image to BuildShip Storage -> return the public URL of the image. But it doesn’t work. I’ve read and watched all of the storage- and image-related docs and YouTube videos, and I’m certain I’m following the instructions.
I don’t get any errors, and I am returned a URL, but no image is displayed at that URL. I can see the image is created in local storage (Settings), and there are no errors in the upload step. It’s unclear what the problem could be: there’s no error, but there’s also no image, or at least none displayed at the returned URL. Any thoughts? It’s an AI-generated endpoint, so I don’t love the idea of burning more credits generating another AI endpoint, as I’ve already tried that a couple of times. It feels like there’s a problem with default public storage, or is there a step in the flow I’m missing?
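Roughly, the simplified flow does the equivalent of the sketch below. This is illustrative only; the helper types, and the fixed object path in particular, are my assumptions about what gets wired up, not BuildShip’s actual node code.

```ts
// Illustrative sketch of the simplified workflow (not BuildShip's actual node code).
// The storage helpers are passed in as parameters, so nothing here pretends to be a real SDK call.
type Uploader = (path: string, data: Uint8Array, contentType: string) => Promise<void>;
type PublicUrlResolver = (path: string) => string;

export async function screenshotToPublicUrl(
  screenshotPng: Uint8Array,        // output of the "screenshot a URL" step
  upload: Uploader,                 // the "upload to storage" step
  publicUrl: PublicUrlResolver      // the "generate public URL" step
): Promise<string> {
  const objectPath = "screenshots/latest.png";   // assumption: a fixed path, reused on every run
  await upload(objectPath, screenshotPng, "image/png");
  return publicUrl(objectPath);                  // the returned URL points at that same object
}
```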
10 Replies
Gaurav Chadha · 5mo ago
Hi @GitMaxd, are you using the Screenshot URL template with a Generate Public URL node?
[image attachment]
Git Maxd · 5mo ago
That is what I was using, but I don’t have the “generate uniq…” step (I’ve circled it in the screenshot). What is this step called in the templates? Sorry, I’m away from a computer at the moment and can’t test until the morning.
Git Maxd · 5mo ago
[image attachment]
Gaurav Chadha · 5mo ago
Yes, it is required to generate a unique ID each time a URL is generated; otherwise your image gets overwritten at the same reference URL, which is why it fails to display. You can try adding this node or cloning the template. Let me know if it doesn’t work.
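In other words, the missing piece is just giving every upload its own object path. A minimal sketch of the idea (illustrative only, not the exact code inside the template’s node):

```ts
import { randomUUID } from "node:crypto";

// Sketch of what the "Generate Unique ID" step contributes (illustrative, not the template's code):
// a fresh object path per run, so a new screenshot never overwrites the previous object,
// and the public URL always points at the image that was just uploaded.
export function uniqueScreenshotPath(): string {
  return `screenshots/${randomUUID()}.png`;   // e.g. "screenshots/3f2a9c1e-8b7d-....png"
}
```

The upload and public-URL steps then both use that same unique path instead of a fixed filename.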
Git Maxd · 5mo ago
Really appreciate it. I’ll give it a go in the morning and update here.
While I have your attention: are there any good resources for learning more about integrating custom API endpoints? I have some LangServe FastAPIs that I’d like to add, but I’ve stumbled a bit trying to get things working properly. I’ve tried the AI a few times without much luck and think I’m not prompting it well. I’ve watched all your vids and don’t recall seeing anything specific to that kind of integration. I’m sure I just missed it, but a link would be amazing. Also, what is that template called? The screenshot cuts off at “generate uniq”… I’m assuming it auto-completes to “ID” or something similar.
Gaurav Chadha · 5mo ago
[image attachment]
Git Maxd · 5mo ago
That's all it was. I added that node and the issue is resolved. Thanks for your help, I appreciate it.

It would be amazing to be able to assign a prompt (or question) to a node that is pulled in from LangChain Hub (langsmith.com). Use case: prompt refinement is very loose in BuildShip. There's no real way to iterate on or version-control prompts on nodes, only the nodes themselves. LangSmith makes it possible to rapidly iterate and test prompts against a variety of LLMs. Prompt management could be handled entirely inside LangSmith Hub and then assigned to the prompt field, pulling in the latest version each time it is updated. I understand there's a latency trade-off here, but as a LangSmith Hub user I can assure you it is very low, and I would happily put up with it for the benefit of rapid prompt development and history.
Gaurav Chadha · 5mo ago
Thanks for sharing. cc @gerard, something we could explore to enhance prompt engineering?
gerard · 5mo ago
Yeah, sure. We can call endpoints from LangChain the same way we make calls to Replicate models. We can chat more about it @GitMaxd if you want.
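For anyone following along: a LangServe route is just a REST endpoint, so a custom node can usually hit it with a plain fetch. A minimal sketch, assuming a chain mounted at an illustrative URL and LangServe’s standard /invoke contract (the URL, auth, and input shape here are assumptions):

```ts
// Sketch: calling a LangServe (FastAPI) route from a custom node.
// Assumptions: the chain is mounted at https://example.com/my-chain and follows
// LangServe's usual /invoke contract of {"input": ...} in and {"output": ...} out.
export async function invokeLangServe(input: unknown): Promise<unknown> {
  const res = await fetch("https://example.com/my-chain/invoke", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ input }),
  });
  if (!res.ok) {
    throw new Error(`LangServe call failed with status ${res.status}`);
  }
  const data = await res.json();
  return data.output;   // LangServe wraps the chain's result in an "output" field
}
```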
Git Maxd · 5mo ago
Well, it’s just that the more I enjoy working with BuildShip, the less I enjoy working with prompts inside of BuildShip. There’s nothing bad about having a text box to paste into; that’s what I’m doing now. It would just be a nice convenience, and it could maybe introduce some LangSmith tracing at the same time, given how closely the two are coupled.

I think that ultimately this feature would have zero impact on your sales or on me sticking around. It’s just something I’ve grown quite fond of and wanted to share with you guys. And you’d get some good attention from 🦜.

With that said, I don’t think it would be super hard to integrate. It’s an API key (secret) and a Prompt Hub name/ID: ‘from langchain import hub; prompt = hub.pull("gitmaxd/synthetic-training-data")’. It would be slick.
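For completeness, the JS/TS equivalent that a node could run at execution time looks roughly like this. It is a sketch that assumes the langchain npm package is available and a LangSmith API key is exposed to the node as a secret or environment variable; the function name is illustrative.

```ts
// Sketch: pulling a Hub prompt at run time so edits made in LangSmith show up on the next execution.
// Assumes the "langchain" npm package and a LangSmith API key in the environment (e.g. LANGCHAIN_API_KEY).
import { pull } from "langchain/hub";
import type { ChatPromptTemplate } from "@langchain/core/prompts";

export async function latestHubPrompt(): Promise<string> {
  const prompt = await pull<ChatPromptTemplate>("gitmaxd/synthetic-training-data");
  // Render to plain text so it can be dropped into a node's prompt field
  // (pass the prompt's template variables here if it has any).
  return prompt.format({});
}
```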