Description
How are you running AnythingLLM?
Desktop
What happened?
I had AnythingLLM desktop 1.8.5-r2 installed. It worked properly with LM Studio as both the LLM and the embedding model provider. While attempting to add an MCP server to AnythingLLM, I upgraded to the latest version (1.9.0). Although my configuration remained intact, the system stopped working: I could still select an LLM, so there was some communication between LM Studio and AnythingLLM, but as soon as I started to chat I got the following mysterious error message: "An error occurred while streaming response. network error". Once this message appeared, AnythingLLM could no longer list LM Studio models at all until the application was restarted.

After I managed to fix my MCP configuration issue (the full path was required in the command field for proper operation), I had to reinstall the older 1.8.5-r2 version of AnythingLLM to get it working properly with LM Studio as both the LLM and embedder provider. The old version now works properly and can also use the MCP server in agent mode (@agent).
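For reference, this is roughly what my working MCP configuration looks like after the fix. It is only a sketch assuming the standard `mcpServers` JSON format used by AnythingLLM's MCP config file; the server name, package, and paths are examples, not my actual setup. The point is that the `command` field needs the absolute path to the executable (e.g. the full path to `npx`), not just the bare command name:

```json
{
  "mcpServers": {
    "filesystem": {
      "command": "/opt/homebrew/bin/npx",
      "args": ["-y", "@modelcontextprotocol/server-filesystem", "/Users/me/Documents"]
    }
  }
}
```

With only `"command": "npx"`, the server failed to start for me; with the full path it works in 1.8.5-r2.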
Please investigate the cause of this functional regression in the upgrade before too many users run into the same issue.
My platform is a Mac Studio M3 Ultra, if that matters.
Are there known steps to reproduce?
No response