Using Corporate Proxy with Langchain, Langsmith, OpenAI etc... #22591
Replies: 5 comments 2 replies
-
Hello, @thibaudbrg! I'm here to help you with your Langchain issue. I'm a bot that can assist you with bugs, questions, and becoming a contributor. Let's tackle this challenge together! To configure your Python project using Langchain, Langsmith, and various LLMs to forward requests through your corporate proxy, you need to set up the proxy settings for each component. Here are the steps to achieve this:
By following these steps, you should be able to forward requests through your corporate proxy for Langchain, Langsmith, and various LLMs. For more details on configuring proxies for specific components, you can refer to the relevant sections in the Langchain documentation and source code.
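As an illustration of the environment-variable approach the bot alludes to, here is a minimal sketch. The `http_proxy`/`https_proxy`/`no_proxy` variables are the de-facto convention honored by `requests`, `httpx`, and therefore most Langchain integrations; the proxy URL below is a placeholder, not a real endpoint:

```python
import os
import urllib.request

# Placeholder corporate proxy URL -- replace with your real proxy.
PROXY = "http://proxy.corp.example:8080"

# Set the conventional proxy variables *before* any HTTP client is created.
# requests, httpx, and the OpenAI SDK all read these from the environment.
os.environ["http_proxy"] = PROXY
os.environ["https_proxy"] = PROXY
os.environ["no_proxy"] = "localhost,127.0.0.1,.corp.example"

# The standard library sees the same settings, which is a quick sanity check:
print(urllib.request.getproxies())
# e.g. {'http': 'http://proxy.corp.example:8080', 'https': '...', 'no': '...'}
```

The key detail is ordering: the variables must be in the environment before the client objects are constructed, since most libraries read them once at client creation time.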
-
So I did indeed manage to make OpenAI work with the proxy simply by setting that up. But my question remains for Langsmith.
-
This does not really solve it, because I would still need the Langsmith components to pick up my new configuration. The problem needs a more general solution. Another example: while trying to use ChatOllama, I need to pass both the proxy and the corporate certificate. This cannot be achieved, as there do not seem to be any parameters for it.
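Until components expose such parameters, one commonly suggested workaround is to configure both the proxy and the corporate CA bundle process-wide via environment variables, before any client is constructed. This is only a sketch: it assumes the underlying HTTP library honors these variables (`requests` reads `REQUESTS_CA_BUNDLE`, `httpx` reads `SSL_CERT_FILE`), and the paths and URL are placeholders:

```python
import os

# Placeholders -- substitute your real proxy URL and corporate CA bundle path.
PROXY = "http://proxy.corp.example:8080"
CA_BUNDLE = "/etc/ssl/certs/corp-ca.pem"

# Proxy: the de-facto convention read by requests/httpx-based clients.
os.environ["http_proxy"] = PROXY
os.environ["https_proxy"] = PROXY

# Corporate certificate: requests honors REQUESTS_CA_BUNDLE,
# httpx (used by the OpenAI SDK) honors SSL_CERT_FILE.
os.environ["REQUESTS_CA_BUNDLE"] = CA_BUNDLE
os.environ["SSL_CERT_FILE"] = CA_BUNDLE

# Only after this point should clients such as ChatOllama be constructed,
# so they pick these settings up at creation time.
```

Whether a given integration actually respects these variables depends on its HTTP stack, which is exactly why a first-class parameter would be the cleaner, more general solution.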
-
Hello, any update on this? I also have the proxy problem with LangSmith. When I try to use LangSmith with my LangChain project from within the organization, I get the following error:
Thank you!
-
Not to forget: `no_proxy`.
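For completeness, `no_proxy` excludes internal hosts from proxying, and the standard library exposes a helper that shows the matching behavior. The hostnames below are placeholders:

```python
import os
from urllib.request import proxy_bypass_environment

# Placeholder values: the corporate proxy plus internal hosts to exclude.
os.environ["http_proxy"] = "http://proxy.corp.example:8080"
os.environ["no_proxy"] = "localhost,127.0.0.1,.corp.example"

# Hosts matching a no_proxy entry bypass the proxy...
print(bool(proxy_bypass_environment("intranet.corp.example")))  # True
# ...while external hosts still go through it.
print(bool(proxy_bypass_environment("api.openai.com")))         # False
```

Leading dots in `no_proxy` entries match whole subdomains, which is handy for keeping intranet traffic off the proxy.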
-
Checked other resources
Commit to Help
Example Code
Description
Hi everyone,
First of all, thank you for your awesome work at Langchain.
I'm building a rather complex Python project, enhanced with AI, using Langchain, Langsmith, and various LLMs (I mostly use OpenAI, Claude, and Ollama + llama3-70B). The core logic of the app is built with the LangGraph framework, which clearly makes the code conceptually more readable and convenient for everyone.
I'm trying to move the code to my corporate desktop, but I'm running into issues with the corporate proxy.
I'd like to be able to forward any request the code makes to any endpoint (OpenAI, Claude, or Langsmith) through my corporate proxy.
Even though I've seen several attempted solutions for OpenAI, they don't really work for me and don't solve the problem in general. I'm looking for an easy way to pass my `http_proxy` and `https_proxy` URLs to Langchain so that it automatically uses them when requesting any endpoint (say, the OpenAI, Claude, or Langsmith APIs).
In fact, for OpenAI alone, I tried setting `OPENAI_PROXY` or even `http_proxy` and `https_proxy` in the `.env` file, but without success (#1423). I also tried passing the variables directly to the `ChatOpenAI()` instance (#8896). For Langsmith, I have no idea.
To give you a better understanding of my code, here are some highlights. My `.env` looks like this:
Thanks for your help!
System Info
Python 3.11.9
Windows 10
requests 2.31.0