When I first started using openclaw, providers and models were all set up through commands; there are too many commands to remember easily. When you have personalized needs, you can edit the configuration file directly to add models and other customizations.
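A hand-edited config file is easy to break with a stray comma, so it is worth checking that the file still parses as JSON before restarting openclaw. A minimal sketch; the path ~/.openclaw/openclaw.json is my assumption, not something openclaw documents here, so substitute your actual config location:

```python
import json
import pathlib

def check_config(path):
    """Parse the file as JSON and report the first syntax error, if any."""
    text = pathlib.Path(path).read_text(encoding="utf-8")
    try:
        json.loads(text)
        return "OK"
    except json.JSONDecodeError as e:
        return f"line {e.lineno}: {e.msg}"

# Hypothetical path; openclaw's real config location may differ on your machine.
# print(check_config(pathlib.Path.home() / ".openclaw" / "openclaw.json"))
```

If the file is broken, the report points at the offending line so you can fix it before the app tries to load it.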
What this post demonstrates:
Model selection: two Qwen models from the Qwen provider, local and cloud models served through a locally deployed ollama, and three models (glm, minimax, deepseek) from the Unicom Cloud provider.

Configuring channels
Besides the default openclaw-tui channel that openclaw itself provides, there are two newly added ones, a bot and a Feishu bot, both set up via the one-click installers on their respective official sites.
Bot: https://q..com/bot/openclaw/login.html, or see my earlier article.
Feishu bot: https://www.feishu.cn/content/article/, or see my earlier article; just run npx -y @larksuite/openclaw-lark-tools install in a terminal and scan the QR code to log in.

For ollama usage, see my earlier article.
To add models under the single ollama provider, refer to the following:
"ollama": {
  "baseUrl": "http://127.0.0.1:11434/v1",
  "apiKey": "ollama-local",
  "api": "ollama",
  "models": [
    {
      "id": "deepseek-v3.2:cloud",
      "name": "deepseek-v3.2:cloud",
      "reasoning": true,
      "input": ["text"],
      "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
      "contextWindow": ,
      "maxTokens": 
    },
    {
      "id": "minimax-m2.5:cloud",
      "name": "minimax-m2.5:cloud",
      "reasoning": true,
      "input": ["text"],
      "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
      "contextWindow": ,
      "maxTokens": 
    },
    {
      "id": "qwen:1.8b",
      "name": "qwen:1.8b",
      "input": ["text"],
      "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
      "contextWindow": 32768
    },
    {
      "id": "qwen3:4b",
      "name": "qwen3:4b",
      "reasoning": true,
      "input": ["text"],
      "cost": { "input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0 },
      "contextWindow": 
    }
  ]
},
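Adding another model to this provider means appending one more object with the same fields to the models array. A sketch in Python, using the field names from the block above; the model id llama3:8b and the 8192 context window in the example are placeholders I made up, not models from this post:

```python
import json

# The provider block as a Python dict, trimmed to one existing model for brevity.
provider = {
    "baseUrl": "http://127.0.0.1:11434/v1",
    "apiKey": "ollama-local",
    "api": "ollama",
    "models": [
        {
            "id": "qwen:1.8b",
            "name": "qwen:1.8b",
            "input": ["text"],
            "cost": {"input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0},
            "contextWindow": 32768,
        },
    ],
}

def add_model(provider, model_id, context_window, reasoning=False):
    """Append a model entry shaped like the existing ones; local models cost 0."""
    provider["models"].append({
        "id": model_id,
        "name": model_id,
        "reasoning": reasoning,
        "input": ["text"],
        "cost": {"input": 0, "output": 0, "cacheRead": 0, "cacheWrite": 0},
        "contextWindow": context_window,
    })

# Hypothetical model id and context window, purely for illustration.
add_model(provider, "llama3:8b", 8192)
print(json.dumps(provider["models"][-1], indent=2))
```

The printed entry can be pasted straight into the models array of the config file.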
All models
The default model is qwen-portal/coder-model.
There are three providers; refer to this configuration:
"agents": {
  "defaults": {
    "model": { "primary": "qwen-portal/coder-model" },
    "models": {
      "qwen-portal/coder-model": { "alias": "qwen" },
      "qwen-portal/vision-model": {},
      "ollama/minimax-m2.5:cloud": { "alias": "ollama-cloud" },
      "ollama/deepseek-v3.2:cloud": { "alias": "ollama-cloud2" },
      "ollama/qwen:1.8b": { "alias": "ollama-local-1.8b" },
      "ollama/qwen3:4b": { "alias": "ollama-local-4b" },
      "unicom-cloud/MiniMax-M2.5": { "alias": "unicom" },
      "unicom-cloud/DeepSeek-V3.1": { "alias": "unicom2" },
      "unicom-cloud/glm-5": { "alias": "unicom3" }
    },
    "workspace": "/root/.openclaw/workspace",
    "compaction": { "mode": "safeguard" },
    "maxConcurrent": 4,
    "subagents": { "maxConcurrent": 8 }
  }
}
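The alias entries above let you refer to a long provider/model id by a short name. Conceptually the lookup is just a scan of the models map; a minimal sketch under that assumption (this is my illustration of the mapping, not openclaw's actual resolution code), using a few aliases from the config above:

```python
# A slice of the agents.defaults.models map from the config above.
models = {
    "qwen-portal/coder-model": {"alias": "qwen"},
    "qwen-portal/vision-model": {},
    "ollama/minimax-m2.5:cloud": {"alias": "ollama-cloud"},
    "unicom-cloud/MiniMax-M2.5": {"alias": "unicom"},
}

def resolve(name):
    """Return the full model id for an alias, or the name itself if it is already an id."""
    for model_id, cfg in models.items():
        if model_id == name or cfg.get("alias") == name:
            return model_id
    raise KeyError(f"unknown model or alias: {name}")

print(resolve("unicom"))                    # short alias -> full id
print(resolve("qwen-portal/vision-model"))  # entries without an alias still resolve by id
```

This is why an alias like unicom is enough to select unicom-cloud/MiniMax-M2.5 when switching models.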
If reprinting, please keep the source: https://51itzy.com/kjqy/244172.html