docs/tutorials/qa_chat_history/ #27956
Replies: 23 comments 15 replies
-
It may be a silly question, but please help me:
-
The import code: vector_store = InMemoryVectorStore(embeddings). I think you wrongly added a "_" that will cause an import error.
-
Hello!
-
Hi there,
-
Hi
-
The memory saver resends all the previous messages (Human and AI messages) to the LLM so the LLM has context. Wouldn't it be easier and consume fewer resources if we could use the multi-turn chat session from the LLM itself, like this: https://ai.google.dev/gemini-api/docs/text-generation?lang=node#chat?
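The checkpointer does replay the full message list on every turn, so prompt size grows with conversation length, whereas a server-held session only receives the new message each turn. A pure-Python sketch of that cost difference (the 4-characters-per-token estimate and both cost functions are illustrative assumptions, not LangChain or Gemini APIs):

```python
# Illustrative sketch: resending full history each turn (MemorySaver-style)
# vs. a hypothetical server-side session that only receives the new message.

def tokens(text: str) -> int:
    # Rough token estimate, assumed for illustration only.
    return max(1, len(text) // 4)

def client_side_history_cost(turns: list[str]) -> int:
    """Each turn resends every prior message, so cost grows quadratically."""
    total = 0
    history: list[str] = []
    for msg in turns:
        history.append(msg)
        total += sum(tokens(m) for m in history)  # full history sent each turn
    return total

def server_side_session_cost(turns: list[str]) -> int:
    """A server-held session only receives each new message once."""
    return sum(tokens(m) for m in turns)

turns = ["hello there", "what is RAG?", "and how do I add chat history?"]
print(client_side_history_cost(turns) > server_side_session_cost(turns))  # True
```

The trade-off: resending history keeps the client in control of exactly what the model sees (and lets you trim or summarize it), while server-side sessions tie the conversation state to one provider.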
-
Great tutorials, thanks team. Do the Thanks again.
-
Great tutorials, thanks team. I am a nerd, so I drew a graph to help me understand how it works. Hope it helps you too:
-
Great tutorial, thank you. I am trying the "create_react_agent" method with an Anthropic model (using ChatAnthropic from the langchain library). When I ran the message from the tutorial, the following is the error I received: == Error message starts == But when I tried with OpenAI GPT-4o, it worked just fine. What changes should I make when I try it with an Anthropic model? The code follows:
== Code starts ==
sonnet_3_5_model = ChatAnthropic(
memory = MemorySaver()
memory_config = {"configurable": {"thread_id": "def234"}}
inputs = {"messages": [("user", "What's your name?")]}
for s in agent_executor.stream(inputs, stream_mode="values", config=memory_config,):
== Code ends ==
-
Hi, may I ask how to fix the error below: NotImplementedError. It seems there is a problem with the function 'query_or_respond'? I used the same code as this tutorial. Any comments, ideas, or possible solutions are welcome, thanks~
-
StateGraph generate step produces an empty "Ai Message"
^^^ Where is the response? ^^^ I checked the log output of the
The response
-
How would one pass metadata filtering arguments at input time to this? Basically, I want the user to be able to select a "source" to make the retrieval process more accurate.
-
https://api.python.langchain.com/en/latest/tools/langchain.tools.retriever.create_retriever_tool.html makes a reference to this tutorial. How does
-
AttributeError: module 'mlflow' has no attribute 'trace'
-
I am getting a bit lost with all this LangGraph agent stuff. For my app, I just want to add a few documents from an API and do Q&A over them. That's it. What's the best way of doing that? I managed to create some direct responses without LangGraph, but now I want to add streaming + history.
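For a simple Q&A app, history can live in a plain list and streaming can be an ordinary generator loop; no agent framework is required. A minimal pure-Python sketch (the `fake_llm_stream` function is a hypothetical stand-in for any streaming chat-model call, e.g. a model's `.stream(messages)` method):

```python
from typing import Iterator

def fake_llm_stream(messages: list[dict]) -> Iterator[str]:
    """Stand-in for a real streaming model call; yields chunks of a reply."""
    reply = f"echoing turn {sum(1 for m in messages if m['role'] == 'user')}"
    for word in reply.split():
        yield word + " "

def chat_turn(history: list[dict], user_input: str) -> str:
    """Append the user message, stream the reply, append it to history."""
    history.append({"role": "user", "content": user_input})
    chunks = []
    for chunk in fake_llm_stream(history):  # forward each chunk to the UI here
        chunks.append(chunk)
    answer = "".join(chunks).strip()
    history.append({"role": "assistant", "content": answer})
    return answer

history: list[dict] = []
chat_turn(history, "What do the docs say?")
chat_turn(history, "And with more detail?")
print(len(history))  # 4 messages: two user turns, two answers
```

The retrieved documents would just be prepended to `history` as a system message before each call; LangGraph mainly adds value once you need persistence across processes, branching, or tool loops.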
-
TypeError: Type is not msgpack serializable: ToolMessage, when running "Stateful management of chat history".
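A checkpointer can only persist what its serializer understands, so one workaround is reducing message objects to plain dicts before they reach the saved state. A pure-Python sketch of that conversion (the `ToolMessage` dataclass here is a stand-in, not the LangChain class):

```python
from dataclasses import dataclass, asdict

@dataclass
class ToolMessage:  # stand-in for langchain_core.messages.ToolMessage
    content: str
    tool_call_id: str

def to_serializable(msg) -> dict:
    """Reduce a message object to primitives a msgpack-style serializer accepts."""
    return {"type": type(msg).__name__, **asdict(msg)}

msg = ToolMessage(content="retrieved 3 docs", tool_call_id="call_1")
print(to_serializable(msg))
```

In practice this particular error is often a version mismatch between langgraph and langchain-core, so upgrading both packages together is usually worth trying before any workaround.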
-
If you are not using ChatGPT but an open-source LLM, you will have a problem with this line of code: llm_with_tools = llm.bind_tools([retrieve]). The error message will be something like: ollama._types.ResponseError: registry.ollama.ai/library/gemma3:4b does not support tools (status code: 400). Using this LangChain reference: https://python.langchain.com/v0.2/docs/integrations/chat/ollama/#tool-calling I was able to fix it; I simply wrote: from langchain_ollama import ChatOllama
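Not every local model supports tool calling, so it can help to fail fast with a clear message before reaching `bind_tools`. A pure-Python sketch of that guard (the capability table below is an illustrative assumption; check the Ollama model library for each model's real capabilities):

```python
# Illustrative guard: refuse to bind tools to a model assumed not to support them.
# This table is made up for the example; it is not an Ollama or LangChain API.
TOOL_CAPABLE = {"llama3.1": True, "qwen2.5": True, "gemma3:4b": False}

def check_tool_support(model: str) -> None:
    """Raise a clear error instead of a late HTTP 400 from the server."""
    if not TOOL_CAPABLE.get(model, False):
        raise ValueError(f"{model} does not support tools; pick a tool-capable model")

check_tool_support("llama3.1")  # passes silently
try:
    check_tool_support("gemma3:4b")
except ValueError as e:
    print(e)
```

Per the reference linked above, the actual fix is to construct the model with langchain_ollama's ChatOllama and choose a model that supports tool calling; the guard just surfaces the mismatch earlier.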
-
The code is working fine for me, but the output doesn't contain any tool message. I wonder why that could be.
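A Tool Message only appears when the model actually emits a tool call; if the model answers directly (as some local models do), the tool node is skipped and no ToolMessage is ever added to the state. A pure-Python sketch of the per-message printing the tutorial's "==== ... ====" output comes from (the message classes here are stand-ins, not the LangChain ones):

```python
# Stand-in message classes; in LangChain these come from langchain_core.messages.
class HumanMessage:
    def __init__(self, content):
        self.content = content

class AIMessage:
    def __init__(self, content, tool_calls=()):
        self.content = content
        self.tool_calls = tool_calls

class ToolMessage:
    def __init__(self, content):
        self.content = content

def pretty_print(messages):
    """Print a header per message so tool steps are visible in the transcript."""
    lines = []
    for m in messages:
        lines.append(f"==== {type(m).__name__} ====")
        lines.append(m.content)
    return "\n".join(lines)

msgs = [
    HumanMessage("what is RAG?"),
    AIMessage("", tool_calls=[{"name": "retrieve"}]),   # model requested the tool
    ToolMessage("retrieved context"),                   # tool node actually ran
    AIMessage("RAG combines retrieval with generation."),
]
print(pretty_print(msgs))
```

So if the transcript shows only Human and AI messages, check whether the AI message's tool_calls list is empty; an empty list means the model never asked for the retriever.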
-
Hello, the code is working fine, but I'm not able to see a section like "==== Tool Message ====". In this code I use Ollama and the Polish model "Bielik".
-
What should I do to become professional in RAG? Could anyone with experience tell me the right way?
-
It should have a link to deploy this app in the "What's next" section.
-
docs/tutorials/qa_chat_history/
This guide assumes familiarity with the following concepts:
https://python.langchain.com/docs/tutorials/qa_chat_history/