Add support for using Alibaba's Qwen as the AI model #223
Conversation
Tested it and confirmed it runs end-to-end with qwen.
Summary of Changes
Hello @OliverPanda, I'm Gemini Code Assist¹! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!
The core goal of this pull request is to broaden the language-model options of the TradingAgents framework by integrating several leading Chinese large language models, giving users a wider choice and a better localized experience. The changes cover not only the underlying model adapters but also a Chinese-localized command-line interface and detailed project documentation, with the aim of improving the framework's overall usability and maintainability.
Highlights
- Chinese LLM integration: Integrates the mainstream Chinese large language models Alibaba Tongyi Qianwen (Qwen), Baidu ERNIE, Zhipu AI (GLM), and Moonshot Kimi, greatly expanding the AI capabilities of the TradingAgents framework.
- LLM adapters and core configuration updates: Adds adapter classes for each Chinese model and updates the core configuration and LLM initialization logic so that different LLM providers can be selected and loaded dynamically via environment variables.
- CLI localization and enhancements: The CLI prompts and options have been fully translated into Chinese, and selection of the Chinese LLMs has been added, improving the interactive experience for Chinese-speaking users. Log and report files are now written as UTF-8 to avoid garbled output.
- Project documentation and structure: Adds a detailed Chinese architecture guide (GUIDE.md) and a project structure document (项目结构说明.md) that explain the framework's design, module responsibilities, and how to run it, making the project easier for developers to understand and use.
Using Gemini Code Assist
The full guide for Gemini Code Assist can be found on our documentation page; here are some quick tips.
Invoking Gemini
You can request assistance from Gemini at any point in your pull request by creating an issue comment (i.e. a comment on the pull request page) using either /gemini <command> or @gemini-code-assist <command>. Below is a summary of the supported commands.
| Feature | Command | Description |
|---|---|---|
| Code Review | `/gemini review` | Performs a code review for the current pull request in its current state. |
| Pull Request Summary | `/gemini summary` | Provides a summary of the current pull request in its current state. |
| Comment | `@gemini-code-assist` | Responds in comments when explicitly tagged, both in issue comments and review comments. |
| Help | `/gemini help` | Displays a list of available commands. |
Customization
To customize the Gemini Code Assist for GitHub experience, repository maintainers can create a configuration file and/or provide a custom code review style guide (such as PEP-8 for Python) by creating and adding files to a .gemini/ folder in the base of the repository. Detailed instructions can be found here.
Limitations & Feedback
Gemini Code Assist may make mistakes. Please leave feedback on any instances where its feedback is incorrect or counterproductive. You can react with 👍 and 👎 on @gemini-code-assist comments. If you're interested in giving feedback about your experience with Gemini Code Assist for GitHub and other Google products, sign up here.
You can also get AI-powered code generation, chat, and code reviews directly in the IDE at no cost with the Gemini Code Assist IDE Extension.
Footnotes
1. Review the Privacy Notices, Generative AI Prohibited Use Policy, Terms of Service, and learn how to configure Gemini Code Assist in GitHub here. Gemini can make mistakes, so double-check it and use code with caution. ↩
Code Review
This pull request centers on integrating Chinese AI large language models (in particular Alibaba Tongyi Qianwen), adding adapters for Qwen, ERNIE, GLM, and Kimi. It also adds extensive Chinese comments and documentation, and the CLI has been localized into Chinese, greatly improving the experience for Chinese-speaking developers. This is an important feature extension that lets the project tap into the domestic Chinese AI ecosystem.
However, the core implementation has some serious problems:
- The embedding implementations for several models are incomplete or incorrect; in particular, a hash-based pseudo-embedding is used as a fallback, which will severely degrade the accuracy of core features such as memory retrieval.
- The API endpoint URLs for some models (e.g. ERNIE) are constructed incorrectly, which may cause API calls to fail.
- The new model adapters have only rudimentary implementations of streaming output and tool binding, and need further work.
It is recommended to fix the serious embedding and API-endpoint issues first so that the new models work correctly, and then to flesh out the adapters so they take full advantage of the models' capabilities.
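To see why a hash-based fallback breaks retrieval, here is a minimal, self-contained sketch. The `simple_embedding` below is a hypothetical stand-in for the PR's `_get_simple_embedding` (whose exact hashing scheme may differ): a one-character edit produces a completely unrelated vector, so cosine similarity carries no semantic signal for memory lookup.

```python
import hashlib
import math

def simple_embedding(text: str, dim: int = 8) -> list:
    # Hypothetical stand-in for a hash-based pseudo-embedding: derive a
    # deterministic vector from the MD5 digest of the text.
    digest = hashlib.md5(text.encode("utf-8")).digest()
    return [digest[i % len(digest)] / 255.0 for i in range(dim)]

def cosine(a: list, b: list) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Two near-identical sentences hash to unrelated vectors: the avalanche
# effect of the digest destroys any notion of semantic closeness.
v1 = simple_embedding("buy AAPL today")
v2 = simple_embedding("buy AAPL today!")
```

Any retrieval ranked by `cosine` over such vectors is effectively random, which is why the review flags this fallback as a serious correctness issue rather than a cosmetic one.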
```diff
 def get_embedding(self, text):
-    """Get OpenAI embedding for a text"""
-    response = self.client.embeddings.create(
-        model=self.embedding, input=text
-    )
-    return response.data[0].embedding
+    """Get embedding for a text using the configured model"""
+    # Dispatch to a provider-specific API based on the backend URL
+    if "dashscope.aliyuncs.com" in self.config["backend_url"]:
+        return self._get_qwen_embedding(text)
+    elif "baidu" in self.config["backend_url"]:
+        return self._get_ernie_embedding(text)
+    else:
+        # For other models, use a simplified embedding (fixed vector)
+        return self._get_simple_embedding(text)
```
```python
response = requests.post(
    f"{self.config['backend_url'].replace('/chat/completions', '/embeddings')}",
    headers=headers,
    json=data,
    timeout=30
)
```
The way the Tongyi Qianwen embedding API endpoint URL is built, `f"{self.config['backend_url'].replace('/chat/completions', '/embeddings')}"`, is risky and likely incorrect. Deriving the URL by string replacement is fragile and breaks as soon as the chat API's URL path changes. According to Alibaba's DashScope documentation, embeddings have their own dedicated service address. Use the official, standalone embedding API endpoint URL directly to ensure stability and correctness.
```diff
-response = requests.post(
-    f"{self.config['backend_url'].replace('/chat/completions', '/embeddings')}",
-    headers=headers,
-    json=data,
-    timeout=30
-)
+response = requests.post(
+    "https://dashscope.aliyuncs.com/api/v1/services/embeddings/text-embedding/text-embedding",
+    headers=headers,
+    json=data,
+    timeout=30
+)
```
```python
data["stop"] = stop

# Send the request
url = f"{self.base_url}/{self.model_name}?access_token={access_token}"
```
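For context on the `access_token` query parameter used above: ERNIE requests authenticate with a short-lived token obtained from Baidu's OAuth `client_credentials` flow. A minimal sketch of building that token request (the key arguments are placeholders, and this helper is illustrative, not the PR's code):

```python
def build_token_url(api_key: str, secret_key: str) -> str:
    # Baidu issues access tokens via its OAuth client_credentials flow;
    # api_key/secret_key here are placeholder credentials.
    return (
        "https://aip.baidubce.com/oauth/2.0/token"
        "?grant_type=client_credentials"
        f"&client_id={api_key}&client_secret={secret_key}"
    )
```

POSTing to this URL returns JSON containing an `access_token` that is then appended to each chat or embedding call.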
```python
def _get_ernie_embedding(self, text):
    """Get an ERNIE embedding"""
    try:
        # ERNIE embedding API call
        # (simplified implementation for now)
        return self._get_simple_embedding(text)
    except Exception as e:
        print(f"⚠️ ERNIE embedding call failed: {e}")
        return self._get_simple_embedding(text)
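Rather than routing both paths to the pseudo-embedding, the call could target Baidu's dedicated embedding endpoint. A hedged sketch, assuming the `embedding-v1` path and response shape from Baidu's Qianfan documentation (verify against the current docs before use; this is not the PR's code):

```python
ERNIE_EMBEDDING_URL = (
    "https://aip.baidubce.com/rpc/2.0/ai_custom/v1/"
    "wenxinworkshop/embeddings/embedding-v1"
)

def get_ernie_embedding(text: str, access_token: str) -> list:
    # Assumed API shape: embedding-v1 accepts a batch of inputs and
    # returns one vector per input under data[i]["embedding"].
    import requests  # third-party HTTP client, imported lazily

    resp = requests.post(
        ERNIE_EMBEDDING_URL,
        params={"access_token": access_token},
        json={"input": [text]},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["data"][0]["embedding"]
```

Falling back to a pseudo-embedding only on a logged error, instead of unconditionally, would keep memory retrieval meaningful whenever the real API is reachable.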
```
┌──────────────────────────────────────────────────────────────┐
│                  TradingAgents Architecture                  │
├──────────────────────────────────────────────────────────────┤
│ Data Layer                                                   │
│   Yahoo Finance · FinnHub data · Reddit data · Google News   │
│   · technical-indicator computation                          │
├──────────────────────────────────────────────────────────────┤
│ Agent Layer                                                  │
│   Market / fundamentals / news / social-media analysts ·     │
│   risk management                                            │
│   Bull researcher · bear researcher · research manager ·     │
│   trader · risk manager                                      │
├──────────────────────────────────────────────────────────────┤
│ Coordination Layer                                           │
│   Graph construction · state management · conditional logic  │
│   · signal processing                                        │
├──────────────────────────────────────────────────────────────┤
│ Decision Layer                                               │
│   Debate mechanism · risk assessment · final decision        │
└──────────────────────────────────────────────────────────────┘
```
```diff
 BASE_URLS = [
-    ("OpenAI", "https://api.openai.com/v1"),
-    ("Anthropic", "https://api.anthropic.com/"),
-    ("Google", "https://generativelanguage.googleapis.com/v1"),
-    ("Openrouter", "https://openrouter.ai/api/v1"),
-    ("Ollama", "http://localhost:11434/v1"),
+    # Chinese models with free tiers (recommended)
+    ("🇨🇳 Tongyi Qianwen (Qwen) - strong in the finance domain", "qwen", "https://dashscope.aliyuncs.com/compatible-mode/v1"),
+    ("🇨🇳 ERNIE - largest free quota", "ernie", "https://aip.baidubce.com/rpc/2.0/ai_custom/v1/wenxinworkshop/chat"),
+    ("🇨🇳 Zhipu AI (GLM) - from Tsinghua University", "glm", "https://open.bigmodel.cn/api/paas/v4"),
+    ("🇨🇳 Moonshot Kimi - strong long-context handling", "kimi", "https://api.moonshot.cn/v1"),
+    # International models
+    ("🌍 OpenAI - GPT series", "openai", "https://api.openai.com/v1"),
+    ("🌍 Anthropic - Claude series", "anthropic", "https://api.anthropic.com/"),
+    ("🌍 Google - Gemini series", "google", "https://generativelanguage.googleapis.com/v1"),
+    ("🌍 OpenRouter - multi-model aggregator", "openrouter", "https://openrouter.ai/api/v1"),
+    ("🌍 Ollama - local deployment", "ollama", "http://localhost:11434/v1"),
 ]
```
```python
# Tuned configuration parameters
config["max_debate_rounds"] = 1  # fewer API calls
config["online_tools"] = False   # use offline data to reduce API calls
```
```diff
 )
 else:
-    raise ValueError(f"Unsupported LLM provider: {self.config['llm_provider']}")
+    raise ValueError(f"不支持的AI模型提供商: {self.config['llm_provider']}")
```
```python
def _stream(
    self,
    messages: List[BaseMessage],
    stop: Optional[List[str]] = None,
    run_manager: Optional[CallbackManagerForLLMRun] = None,
    **kwargs: Any,
) -> Iterator[ChatGeneration]:
    """Streaming generation (not supported yet)"""
    # For models without streaming support, yield generations that each
    # contain the complete response
    result = self._generate(messages, stop, run_manager, **kwargs)
    for generation in result.generations:
        yield generation
```
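True incremental streaming is feasible for providers that expose an OpenAI-compatible endpoint (as Qwen's compatible mode does). As a sketch of the client-side half, assuming the standard `data: {...}` server-sent-events chunk format of such endpoints (the function name and format assumption are mine, not the PR's code), content deltas can be extracted like this:

```python
import json

def iter_sse_content(lines):
    """Yield content deltas from OpenAI-compatible SSE lines.

    Assumes the common chunk format: each `data:` line carries JSON with
    choices[0].delta.content, and the stream ends with `data: [DONE]`.
    """
    for line in lines:
        line = line.strip()
        if not line.startswith("data:"):
            continue  # skip keep-alives and blank separator lines
        payload = line[len("data:"):].strip()
        if payload == "[DONE]":
            return
        chunk = json.loads(payload)
        delta = chunk["choices"][0]["delta"]
        if "content" in delta:
            yield delta["content"]
```

Wiring this to a `requests` response with `stream=True` (and yielding chunk objects instead of buffering the full reply) would let the adapter stream tokens instead of faking it with a single complete generation.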
```python
def bind_tools(self, tools, **kwargs):
    """Bind tools to the model (simplified implementation)"""
    # For the Chinese models, tool binding is simplified:
    # return self and let the caller handle tool invocation
    return self
```
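Returning `self` means the model never actually sees the tool schemas. For OpenAI-compatible backends such as Qwen's compatible mode, the tools can instead be serialized into the request body so the model can emit tool calls. A hedged sketch (the helper, tool, and model names are illustrative, not the PR's code):

```python
def to_openai_tool(name: str, description: str, parameters: dict) -> dict:
    # OpenAI-compatible APIs expect each tool as a
    # {"type": "function", "function": {...}} entry.
    return {
        "type": "function",
        "function": {
            "name": name,
            "description": description,
            "parameters": parameters,
        },
    }

# Hypothetical request body showing where bound tools would travel.
request_body = {
    "model": "qwen-plus",  # illustrative model name
    "messages": [{"role": "user", "content": "Get the AAPL price"}],
    "tools": [to_openai_tool(
        "get_price",
        "Fetch the latest price for a ticker",
        {
            "type": "object",
            "properties": {"ticker": {"type": "string"}},
            "required": ["ticker"],
        },
    )],
}
```

A fuller `bind_tools` would store these serialized schemas on a copy of the model and merge them into every chat request, so the upper layers don't have to special-case the Chinese providers.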
|
Would it not make sense to use the litellm adapter here too, so that other models can be implemented or used quickly? https://github.com/BerriAI/litellm
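To illustrate the suggestion: litellm can route to any OpenAI-compatible backend through its generic `openai/` driver, which would cover Qwen's compatible mode without a bespoke adapter. A sketch under those assumptions (the `qwen-plus` model name and `DASHSCOPE_API_KEY` variable are placeholders; litellm must be installed separately):

```python
import os

def qwen_completion(prompt: str):
    # Sketch only: routes through litellm's generic OpenAI-compatible
    # driver to the DashScope endpoint listed in this PR.
    from litellm import completion  # third-party; pip install litellm

    return completion(
        model="openai/qwen-plus",
        api_base="https://dashscope.aliyuncs.com/compatible-mode/v1",
        api_key=os.environ["DASHSCOPE_API_KEY"],
        messages=[{"role": "user", "content": prompt}],
    )
```

Swapping providers would then be a matter of changing the `model` string rather than writing a new adapter class per vendor.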