RAkira (@Acecrow) posted in [openclaw help request] LLM request failed: network connection error. (openclaw deployed on Ubuntu under WSL + local Ollama model on Windows)
Hoping the gurus here can help; the situation is as follows:
Error message:
[image]
The lobster (openclaw) side: deployed on Ubuntu under WSL.
The model side: hosted locally by Ollama on Windows.
Right now, from inside WSL:
```
curl http://127.0.0.1:11434
Ollama is running
```
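A note on that check: a bare curl against the root URL only proves that something is answering on port 11434 from inside WSL. Here is a minimal deeper sketch, assuming a default WSL2 NAT setup where the Windows host IP is the nameserver in /etc/resolv.conf (this assumption breaks under mirrored networking or a custom resolv.conf):

```bash
# Hit the actual Ollama API rather than just the TCP port:
# GET /api/tags lists the locally available models.
curl http://127.0.0.1:11434/api/tags

# Under WSL2 NAT, 127.0.0.1 inside WSL is the WSL VM itself, not Windows.
# The Windows host is usually the nameserver written into /etc/resolv.conf.
WIN_HOST=$(awk '/^nameserver/ {print $2}' /etc/resolv.conf)
curl "http://${WIN_HOST}:11434/api/tags"
```

If the first curl returns a model list but openclaw still fails, the endpoint itself is fine and the problem is on openclaw's side (for example a proxy environment variable such as http_proxy intercepting its requests); if only the second one works, the config below is pointing at the wrong address.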
But the openclaw TUI shows this:
```
run error: LLM request failed: network connection error.
```
The models section of the openclaw config file is as follows:
"models": {
"mode": "merge",
"providers": {
"ollama": {
"baseUrl": "http://127.0.0.1:11434",
"api": "ollama",
"apiKey": ...