Thank you for your attention to this issue. Please let me know if additional info is needed or how I can be helpful.
Thank you, @ekzhu, for the follow-up. Indeed, this is a model compatibility issue. I am willing to work on a fix, but only within the 0.2 context and with DBRX models. From the 0.4 autogen_ext it's not clear that DBRX is supported, so I will need to pursue other avenues. Thanks.
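For context, one possible avenue in 0.4 would be pointing autogen_ext's OpenAI-compatible client at the Databricks serving endpoint, since Databricks model serving exposes an OpenAI-compatible API. This is only a hypothetical sketch, not a confirmed supported path; the workspace URL, token variable, and model_info values below are placeholders:

```python
# Hypothetical sketch: targeting a Databricks model serving endpoint with the
# 0.4 autogen_ext OpenAI-compatible client. Not confirmed against the docs;
# URL, token variable, and model_info values are placeholders.
import os
from autogen_ext.models.openai import OpenAIChatCompletionClient

client = OpenAIChatCompletionClient(
    model="databricks-meta-llama-3-70b-instruct",
    base_url="https://<workspace-host>/serving-endpoints",  # placeholder workspace URL
    api_key=os.environ["DATABRICKS_TOKEN"],                 # placeholder token variable
    # Non-OpenAI models need explicit capability metadata.
    model_info={
        "vision": False,
        "function_calling": True,
        "json_output": False,
        "family": "unknown",
    },
)
```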
What happened?
Python Code
This code is taken exactly from the following example:
https://microsoft.github.io/autogen/0.2/docs/notebooks/agentchat_groupchat/
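For readers without the notebook open, the linked example is roughly the following (a condensed sketch reconstructed from the notebook; the exact system messages, config loading, and task prompt may differ slightly):

```python
import autogen

# Load model configuration (the notebook reads this from an OAI_CONFIG_LIST file).
config_list = autogen.config_list_from_json("OAI_CONFIG_LIST")
llm_config = {"config_list": config_list, "cache_seed": 42}

# A user proxy that can execute the code written by the assistant agents.
user_proxy = autogen.UserProxyAgent(
    name="User_proxy",
    system_message="A human admin.",
    code_execution_config={"last_n_messages": 2, "work_dir": "groupchat", "use_docker": False},
    human_input_mode="TERMINATE",
)
coder = autogen.AssistantAgent(name="Coder", llm_config=llm_config)
pm = autogen.AssistantAgent(
    name="Product_manager",
    system_message="Creative in software product ideas.",
    llm_config=llm_config,
)

# GroupChat routes messages among the agents; the manager drives speaker selection.
groupchat = autogen.GroupChat(agents=[user_proxy, coder, pm], messages=[], max_round=12)
manager = autogen.GroupChatManager(groupchat=groupchat, llm_config=llm_config)

user_proxy.initiate_chat(
    manager,
    message="Find a latest paper about gpt-4 on arxiv and find its potential applications in software.",
)
```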
To Reproduce
Expected behavior
The agents should write code and look up articles on arXiv.
Note that this code works successfully when using OpenAI models such as GPT-4o. However, when running with DBRX (databricks-meta-llama-3-70b-instruct), a BadRequestError is generated. The DBRX foundation models work fine in most AutoGen examples, but not for GroupChat.
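For reference, a sketch of the kind of config_list entry used to target the Databricks endpoint in the 0.2 setup above (the workspace URL and token variable are placeholders; the exact values from the failing run are not shown here):

```python
import os

# Placeholder configuration for a Databricks-served model; Databricks model
# serving exposes an OpenAI-compatible API under /serving-endpoints.
config_list = [
    {
        "model": "databricks-meta-llama-3-70b-instruct",
        "api_key": os.environ["DATABRICKS_TOKEN"],                  # placeholder token variable
        "base_url": "https://<workspace-host>/serving-endpoints",   # placeholder workspace URL
    }
]
llm_config = {"config_list": config_list, "cache_seed": 42}
```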
Which packages was the bug in?
V0.2 (autogen-agentchat==0.2.*)
AutoGen library version.
Python dev (main branch)
Other library version.
No response
Model used
databricks-meta-llama-3-70b-instruct
Model provider
None
Other model provider
No response
Python version
3.10
.NET version
None
Operating system
None