feat: add providers #2
Open

SevenFo wants to merge 30 commits into FoundationAgents:main from SevenFo:feature/providers-split-integration (base: main)

Changes from all commits (30 commits)
cd37d92  Update pre-commit configuration (SevenFo)
bfd8295  feat: migrate Anthropic provider to use metagpt-core dependency (SevenFo)
556bda6  update tests and setup.py (SevenFo)
3f42de8  Feature: Migrate Ark provider from main repo to MetaGPT-Ext (SevenFo)
5f4afa8  feat: migrate AWS Bedrock provider from MetaGPT to MetaGPT-Ext (SevenFo)
9a6e3b5  Update unit tests (SevenFo)
b16f358  Update setup.py to exclude packages under the tests directory (SevenFo)
4170416  Add Azure OpenAI provider implementation (SevenFo)
731cf04  feat: migrate AWS Bedrock provider from MetaGPT to MetaGPT-Ext (SevenFo)
78a7f14  feat: migrate DashScope provider from MetaGPT main repository (SevenFo)
e14fa0e  feat: migrate google-gemini-provider from MetaGPT to MetaGPT-Ext (SevenFo)
bf7a2bf  run pre-commit (SevenFo)
e85c4fa  run pre-commit (SevenFo)
7b41f43  run pre-commit (SevenFo)
bb41eee  run pre-commit (SevenFo)
fc0cf44  run pre-commit (SevenFo)
0a117af  run pre-commit (SevenFo)
5cabf4c  Merge anthropic provider split (SevenFo)
637ccca  Merge ark provider split (SevenFo)
f87dfc4  Merge azure openai provider split (SevenFo)
a6a302d  Merge bedrock provider split (SevenFo)
abcbed0  Merge dashscope provider split (SevenFo)
7182b4f  Merge google gemini provider split (SevenFo)
6af9c0d  remove redundant directory (SevenFo)
4d8cade  Merge bedrock provider split (SevenFo)
290bc9f  refactor: remove redundant UTF-8 encoding declarations (SevenFo)
ff91ed8  feat: add asyncio.run to all provider example code (SevenFo)
3279e53  Unify requirements-test.txt (SevenFo)
8004440  Fix the example-code dependencies of the anthropic provider (SevenFo)
9cb6ec5  update requirements.txt of azure-openai provider (SevenFo)
Pre-commit configuration:

@@ -1,4 +1,4 @@
-default_stages: [ commit ]
+default_stages: [ pre-commit ]

 # Install
 # 1. pip install metagpt[dev]
New file (@@ -0,0 +1,59 @@):

# Python
__pycache__/
*.py[cod]
*$py.class
*.so
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
*.egg-info/
.installed.cfg
*.egg

# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
.hypothesis/
.pytest_cache/

# Test logs
test_logs/

# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/

# VS Code
.vscode/
*.code-workspace

# PyCharm
.idea/
*.iml
*.iws
*.ipr

# Jupyter Notebook
.ipynb_checkpoints
New file (@@ -0,0 +1,44 @@):

# MetaGPT Provider Anthropic

This package provides Anthropic (Claude) integration for MetaGPT.

## Installation

```bash
pip install metagpt-provider-anthropic
```

## Usage

```python
import asyncio

from metagpt.provider.anthropic import AnthropicLLM
from metagpt.core.configs.llm_config import LLMConfig


async def main():
    config = LLMConfig(
        api_type="anthropic",
        api_key="your-api-key",
        model="claude-3-opus-20240229",
    )

    # Initialize the Anthropic LLM
    llm = AnthropicLLM(config)

    # Ask a question
    response = await llm.aask("What is artificial intelligence?")
    print(response)


# Run the async function
if __name__ == "__main__":
    asyncio.run(main())
```

## Configuration

The following configuration parameters are supported:

- `api_type`: Must be set to "anthropic" to use this provider
- `api_key`: Your Anthropic API key
- `model`: The Claude model to use (default: "claude-3-opus-20240229")
- `base_url`: Optional base URL for the Anthropic API
5 changes: 5 additions & 0 deletions
provider/metagpt-provider-anthropic/metagpt/provider/anthropic/__init__.py

#!/usr/bin/env python

from metagpt.provider.anthropic.base import AnthropicLLM

__all__ = ["AnthropicLLM"]
89 changes: 89 additions & 0 deletions
provider/metagpt-provider-anthropic/metagpt/provider/anthropic/base.py

#!/usr/bin/env python

from anthropic import AsyncAnthropic
from anthropic.types import Message, Usage

from metagpt.core.configs.llm_config import LLMConfig, LLMType
from metagpt.core.const import USE_CONFIG_TIMEOUT
from metagpt.core.logs import log_llm_stream
from metagpt.core.provider.base_llm import BaseLLM
from metagpt.core.provider.llm_provider_registry import register_provider


@register_provider([LLMType.ANTHROPIC, LLMType.CLAUDE])
class AnthropicLLM(BaseLLM):
    def __init__(self, config: LLMConfig):
        self.config = config
        self.__init_anthropic()

    def __init_anthropic(self):
        self.model = self.config.model
        self.aclient: AsyncAnthropic = AsyncAnthropic(api_key=self.config.api_key, base_url=self.config.base_url)

    def _const_kwargs(self, messages: list[dict], stream: bool = False) -> dict:
        kwargs = {
            "model": self.model,
            "messages": messages,
            "max_tokens": self.config.max_token,
            "stream": stream,
        }

        if self.use_system_prompt:
            # If the model supports system prompts, extract the prompt and pass it separately
            if messages[0]["role"] == "system":
                kwargs["messages"] = messages[1:]
                kwargs["system"] = messages[0]["content"]  # set the system prompt here

        if self.config.reasoning:
            kwargs["thinking"] = {"type": "enabled", "budget_tokens": self.config.reasoning_max_token}

        return kwargs

    def _update_costs(self, usage: Usage, model: str = None, local_calc_usage: bool = True):
        usage = {"prompt_tokens": usage.input_tokens, "completion_tokens": usage.output_tokens}
        super()._update_costs(usage, model)

    def get_choice_text(self, resp: Message) -> str:
        if len(resp.content) > 1:
            self.reasoning_content = resp.content[0].thinking
            text = resp.content[1].text
        else:
            text = resp.content[0].text
        return text

    async def _achat_completion(self, messages: list[dict], timeout: int = USE_CONFIG_TIMEOUT) -> Message:
        resp: Message = await self.aclient.messages.create(**self._const_kwargs(messages))
        self._update_costs(resp.usage, self.model)
        return resp

    async def acompletion(self, messages: list[dict], timeout: int = USE_CONFIG_TIMEOUT) -> Message:
        return await self._achat_completion(messages, timeout=self.get_timeout(timeout))

    async def _achat_completion_stream(self, messages: list[dict], timeout: int = USE_CONFIG_TIMEOUT) -> str:
        stream = await self.aclient.messages.create(**self._const_kwargs(messages, stream=True))
        collected_content = []
        collected_reasoning_content = []
        usage = Usage(input_tokens=0, output_tokens=0)

        async for event in stream:
            event_type = event.type
            if event_type == "message_start":
                usage.input_tokens = event.message.usage.input_tokens
                usage.output_tokens = event.message.usage.output_tokens
            elif event_type == "content_block_delta":
                delta_type = event.delta.type
                if delta_type == "thinking_delta":
                    collected_reasoning_content.append(event.delta.thinking)
                elif delta_type == "text_delta":
                    content = event.delta.text
                    log_llm_stream(content)
                    collected_content.append(content)
            elif event_type == "message_delta":
                usage.output_tokens = event.usage.output_tokens  # update the final output_tokens

        log_llm_stream("\n")
        self._update_costs(usage)
        full_content = "".join(collected_content)
        if collected_reasoning_content:
            self.reasoning_content = "".join(collected_reasoning_content)

        return full_content
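The streaming handler in `base.py` folds SDK events into three accumulators: text deltas, thinking deltas, and token usage. A minimal standalone sketch of that accumulation logic, using plain dicts in place of the SDK's typed event objects (the event shapes below are illustrative stand-ins mirroring the handler above, not the real `anthropic` types):

```python
# Hypothetical event sequence shaped like the handler's three event kinds
events = [
    {"type": "message_start", "usage": {"input_tokens": 10, "output_tokens": 1}},
    {"type": "content_block_delta", "delta": {"type": "thinking_delta", "thinking": "Let me think."}},
    {"type": "content_block_delta", "delta": {"type": "text_delta", "text": "Hello"}},
    {"type": "content_block_delta", "delta": {"type": "text_delta", "text": ", world"}},
    {"type": "message_delta", "usage": {"output_tokens": 7}},
]


def accumulate(events):
    """Fold a stream of events into (text, reasoning, usage), as the handler does."""
    text, reasoning = [], []
    usage = {"input_tokens": 0, "output_tokens": 0}
    for ev in events:
        if ev["type"] == "message_start":
            # initial usage snapshot arrives with the first event
            usage.update(ev["usage"])
        elif ev["type"] == "content_block_delta":
            d = ev["delta"]
            if d["type"] == "thinking_delta":
                reasoning.append(d["thinking"])
            elif d["type"] == "text_delta":
                text.append(d["text"])
        elif ev["type"] == "message_delta":
            # final output token count supersedes the initial snapshot
            usage["output_tokens"] = ev["usage"]["output_tokens"]
    return "".join(text), "".join(reasoning), usage


print(accumulate(events))  # → ('Hello, world', 'Let me think.', {'input_tokens': 10, 'output_tokens': 7})
```

The same three-way split explains why the provider stores `reasoning_content` separately from the returned `full_content`.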
8 changes: 8 additions & 0 deletions
provider/metagpt-provider-anthropic/metagpt/provider/anthropic/utils.py

#!/usr/bin/env python

"""Utility functions for the Anthropic provider."""


def get_default_anthropic_model():
    """Return the default Anthropic model."""
    return "claude-3-opus-20240229"
New file (@@ -0,0 +1,4 @@):

pytest>=7.3.1
pytest-asyncio>=0.21.0
pytest-mock>=3.10.0
pytest-cov>=4.1.0
New file (@@ -0,0 +1,2 @@):

anthropic>=0.15.0
metagpt-core>=1.0.0
New file (@@ -0,0 +1,32 @@):

#!/usr/bin/env python

from setuptools import find_namespace_packages, setup

with open("README.md", "r", encoding="utf-8") as fh:
    long_description = fh.read()

with open("requirements.txt", "r", encoding="utf-8") as f:
    required = f.read().splitlines()

setup(
    name="metagpt-provider-anthropic",
    version="0.1.0",
    description="Anthropic (Claude) provider for MetaGPT",
    long_description=long_description,
    long_description_content_type="text/markdown",
    author="MetaGPT Team",
    author_email="[email protected]",
    url="https://github.com/geekan/MetaGPT-Ext",
    packages=find_namespace_packages(include=["metagpt.*"], exclude=["*.tests", "*.tests.*", "tests.*", "tests"]),
    install_requires=required,
    python_requires=">=3.9",
    classifiers=[
        "Development Status :: 3 - Alpha",
        "Intended Audience :: Developers",
        "License :: OSI Approved :: MIT License",
        "Programming Language :: Python :: 3",
        "Programming Language :: Python :: 3.9",
        "Programming Language :: Python :: 3.10",
        "Programming Language :: Python :: 3.11",
    ],
)
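The `include`/`exclude` arguments passed to `find_namespace_packages` above are fnmatch-style patterns. A simplified sketch of how that filtering behaves on hypothetical package names (this mimics the pattern semantics, not setuptools' actual directory walker):

```python
from fnmatch import fnmatch


def select(packages, include, exclude):
    """Keep packages matching any include pattern and no exclude pattern."""
    kept = [p for p in packages if any(fnmatch(p, pat) for pat in include)]
    return [p for p in kept if not any(fnmatch(p, pat) for pat in exclude)]


# Hypothetical candidate package names
candidates = ["metagpt.provider.anthropic", "metagpt.provider", "tests", "metagpt.provider.tests"]

print(select(
    candidates,
    include=["metagpt.*"],
    exclude=["*.tests", "*.tests.*", "tests.*", "tests"],
))  # → ['metagpt.provider.anthropic', 'metagpt.provider']
```

This shows why the setup ships only the `metagpt.*` namespace packages while test packages stay out of the wheel, matching the "exclude packages under the tests directory" commit.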
Empty file.
10 changes: 10 additions & 0 deletions
provider/metagpt-provider-anthropic/tests/mock_llm_config.py

#!/usr/bin/env python
"""
Mock LLM configurations for testing
"""

from metagpt.core.configs.llm_config import LLMConfig

mock_llm_config_anthropic = LLMConfig(
    api_type="anthropic", api_key="xxx", base_url="https://api.anthropic.com", model="claude-3-opus-20240229"
)
New file (@@ -0,0 +1,10 @@):

[pytest]
testpaths = tests
python_files = test_*.py
python_classes = Test*
python_functions = test_*
addopts = -xvs
log_file = test_logs/pytest.log
log_file_level = DEBUG
log_file_format = %(asctime)s [%(levelname)8s] %(message)s (%(filename)s:%(lineno)s)
log_file_date_format = %Y-%m-%d %H:%M:%S
67 changes: 67 additions & 0 deletions
provider/metagpt-provider-anthropic/tests/req_resp_const.py

#!/usr/bin/env python
"""
Default request & response data for provider unittest
"""

from anthropic.types import (
    ContentBlockDeltaEvent,
    Message,
    MessageStartEvent,
    TextBlock,
    TextDelta,
)
from anthropic.types import Usage as AnthropicUsage

from metagpt.core.provider.base_llm import BaseLLM

# Common test data
prompt = "who are you?"
messages = [{"role": "user", "content": prompt}]
resp_cont_tmpl = "I'm {name}"
default_resp_cont = resp_cont_tmpl.format(name="GPT")


# For Anthropic
def get_anthropic_response(name: str, stream: bool = False) -> Message:
    if stream:
        return [
            MessageStartEvent(
                message=Message(
                    id="xxx",
                    model=name,
                    role="assistant",
                    type="message",
                    content=[TextBlock(text="", type="text")],
                    usage=AnthropicUsage(input_tokens=10, output_tokens=10),
                ),
                type="message_start",
            ),
            ContentBlockDeltaEvent(
                index=0,
                delta=TextDelta(text=resp_cont_tmpl.format(name=name), type="text_delta"),
                type="content_block_delta",
            ),
        ]
    else:
        return Message(
            id="xxx",
            model=name,
            role="assistant",
            type="message",
            content=[TextBlock(text=resp_cont_tmpl.format(name=name), type="text")],
            usage=AnthropicUsage(input_tokens=10, output_tokens=10),
        )


# For llm general chat functions call
async def llm_general_chat_funcs_test(llm: BaseLLM, prompt: str, messages: list[dict], resp_cont: str):
    resp = await llm.aask(prompt, stream=False)
    assert resp == resp_cont

    resp = await llm.aask(prompt)
    assert resp == resp_cont

    resp = await llm.acompletion_text(messages, stream=False)
    assert resp == resp_cont

    resp = await llm.acompletion_text(messages, stream=True)
    assert resp == resp_cont
45 changes: 45 additions & 0 deletions
provider/metagpt-provider-anthropic/tests/test_anthropic_api.py

#!/usr/bin/env python
"""
Test for the Anthropic (Claude) provider
"""

import pytest
from anthropic.resources.completions import Completion

from metagpt.provider.anthropic import AnthropicLLM

from tests.mock_llm_config import mock_llm_config_anthropic
from tests.req_resp_const import (
    get_anthropic_response,
    llm_general_chat_funcs_test,
    messages,
    prompt,
    resp_cont_tmpl,
)

name = "claude-3-opus-20240229"
resp_cont = resp_cont_tmpl.format(name=name)


async def mock_anthropic_messages_create(
    self, messages: list[dict], model: str, stream: bool = True, max_tokens: int = None, system: str = None
) -> Completion:
    if stream:

        async def aresp_iterator():
            resps = get_anthropic_response(name, stream=True)
            for resp in resps:
                yield resp

        return aresp_iterator()
    else:
        return get_anthropic_response(name)


@pytest.mark.asyncio
async def test_anthropic_acompletion(mocker):
    mocker.patch("anthropic.resources.messages.AsyncMessages.create", mock_anthropic_messages_create)

    anthropic_llm = AnthropicLLM(mock_llm_config_anthropic)
    resp = await anthropic_llm.acompletion(messages)
    assert resp.content[0].text == resp_cont

    await llm_general_chat_funcs_test(anthropic_llm, prompt, messages, resp_cont)
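For the streamed branch, the mock above returns an async generator that the provider consumes with `async for`. A self-contained sketch of that pattern, with hypothetical string chunks standing in for SDK events:

```python
import asyncio


async def fake_stream():
    # Async generator standing in for a mocked messages.create(stream=True) result
    for chunk in ["I'm ", "Claude"]:
        yield chunk


async def consume() -> str:
    # Mirrors how the provider iterates the stream and joins collected deltas
    parts = []
    async for chunk in fake_stream():
        parts.append(chunk)
    return "".join(parts)


print(asyncio.run(consume()))  # → I'm Claude
```

Returning the generator (rather than awaiting it) is what lets the patched `create` stand in for both the streaming and non-streaming call paths in the test.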
Review comment: Is pre-commit not installed? The formatting looks off.

Reply: This was run through `pre-commit run --all-files`, so it should be fine, right?