Multi-Model Advisor: a council of models for decision-making
A Model Context Protocol (MCP) server that queries multiple Ollama models and combines their responses, providing diverse AI perspectives on a single question. This creates a "council of advisors" approach where Claude can synthesize multiple viewpoints alongside its own to provide more comprehensive answers.
```mermaid
graph TD
    A[Start] --> B[Worker Local AI 1 Opinion]
    A --> C[Worker Local AI 2 Opinion]
    A --> D[Worker Local AI 3 Opinion]
    B --> E[Manager AI]
    C --> E
    D --> E
    E --> F[Decision Made]
```
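As the diagram shows, each worker model answers independently and the results are handed to a manager for synthesis. Below is a minimal TypeScript sketch of that fan-out step against Ollama's `/api/generate` endpoint; the helper names are illustrative, not the project's actual source:

```typescript
// Illustrative sketch of the council fan-out; not the project's real code.
const OLLAMA_API_URL = process.env.OLLAMA_API_URL ?? "http://localhost:11434";

interface ModelAnswer {
  model: string;
  response: string;
}

// Ask one Ollama model a question, optionally with a per-model system prompt.
async function askModel(
  model: string,
  question: string,
  system?: string
): Promise<ModelAnswer> {
  const res = await fetch(`${OLLAMA_API_URL}/api/generate`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model, prompt: question, system, stream: false }),
  });
  const data = (await res.json()) as { response: string };
  return { model, response: data.response };
}

// Query every worker model in parallel and collect their opinions.
async function askCouncil(
  models: string[],
  question: string
): Promise<ModelAnswer[]> {
  return Promise.all(models.map((m) => askModel(m, question)));
}
```

Querying the workers in parallel keeps latency close to that of the slowest single model rather than the sum of all of them.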
To install multi-ai-advisor-mcp for Claude Desktop automatically via Smithery:
```bash
npx -y @smithery/cli install @YuChenSSR/multi-ai-advisor-mcp --client claude
```
Clone this repository:

```bash
git clone https://github.com/YuChenSSR/multi-ai-advisor-mcp.git
cd multi-ai-advisor-mcp
```

Install dependencies:

```bash
npm install
```

Build the project:

```bash
npm run build
```
Install required Ollama models:

```bash
ollama pull gemma3:1b
ollama pull llama3.2:1b
ollama pull deepseek-r1:1.5b
```
Create a `.env` file in the project root with your desired configuration:

```
# Server configuration
SERVER_NAME=multi-model-advisor
SERVER_VERSION=1.0.0
DEBUG=true

# Ollama configuration
OLLAMA_API_URL=http://localhost:11434
DEFAULT_MODELS=gemma3:1b,llama3.2:1b,deepseek-r1:1.5b

# System prompts for each model
GEMMA_SYSTEM_PROMPT=You are a creative and innovative AI assistant. Think outside the box and offer novel perspectives.
LLAMA_SYSTEM_PROMPT=You are a supportive and empathetic AI assistant focused on human well-being. Provide considerate and balanced advice.
DEEPSEEK_SYSTEM_PROMPT=You are a logical and analytical AI assistant. Think step-by-step and explain your reasoning clearly.
```
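For context, here is a hedged sketch of how such a `.env` could be consumed at startup, assuming the `dotenv` package; the environment variable names match the file above, but the helper function is hypothetical:

```typescript
import "dotenv/config"; // load the .env file into process.env

// Parse the comma-separated default model list,
// e.g. "gemma3:1b,llama3.2:1b,deepseek-r1:1.5b".
const defaultModels = (process.env.DEFAULT_MODELS ?? "")
  .split(",")
  .map((m) => m.trim())
  .filter(Boolean);

// Map a model name to its persona prompt (hypothetical helper).
function systemPromptFor(model: string): string | undefined {
  if (model.startsWith("gemma")) return process.env.GEMMA_SYSTEM_PROMPT;
  if (model.startsWith("llama")) return process.env.LLAMA_SYSTEM_PROMPT;
  if (model.startsWith("deepseek")) return process.env.DEEPSEEK_SYSTEM_PROMPT;
  return undefined;
}
```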
Locate your Claude for Desktop configuration file:

- macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
- Windows: `%APPDATA%\Claude\claude_desktop_config.json`
Edit the file to add the Multi-Model Advisor MCP server:

```json
{
  "mcpServers": {
    "multi-model-advisor": {
      "command": "node",
      "args": ["/absolute/path/to/multi-ai-advisor-mcp/build/index.js"]
    }
  }
}
```
Replace `/absolute/path/to/` with the actual path to your project directory.
Restart Claude for Desktop
Once connected to Claude for Desktop, you can use the Multi-Model Advisor in several ways:
You can see all available models on your system:
```
Show me which Ollama models are available on my system
```
This will display all installed Ollama models and indicate which ones are configured as defaults.
Simply ask Claude to use the multi-model advisor:
```
What are the most important skills for success in today's job market?
You can use gemma3:1b, llama3.2:1b, deepseek-r1:1.5b to help you.
```
Claude will query all default models and provide a synthesized response based on their different perspectives.
The MCP server exposes two tools:

- `list-available-models`: shows all Ollama models on your system
- `query-models`: queries multiple models with a question

When you ask Claude a question referring to the multi-model advisor, Claude invokes the `query-models` tool behind the scenes. Each model can have a different "persona" or role assigned, encouraging diverse perspectives.
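To illustrate, a tool like `query-models` could be registered with the official MCP TypeScript SDK roughly as follows. This is a sketch under assumptions (the `askCouncil` helper is the hypothetical fan-out function from the earlier sketch), not the project's actual implementation:

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

// Hypothetical fan-out helper from the earlier sketch.
declare function askCouncil(
  models: string[],
  question: string
): Promise<{ model: string; response: string }[]>;

const server = new McpServer({ name: "multi-model-advisor", version: "1.0.0" });

server.tool(
  "query-models",
  { question: z.string() }, // input schema for the tool
  async ({ question }) => {
    const models = (process.env.DEFAULT_MODELS ?? "").split(",");
    const answers = await askCouncil(models, question);
    // Return one labelled text block per model so Claude can synthesize them.
    return {
      content: answers.map((a) => ({
        type: "text" as const,
        text: `[${a.model}]\n${a.response}`,
      })),
    };
  }
);

await server.connect(new StdioServerTransport());
```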
If the server can't connect to Ollama:

- Make sure Ollama is running (`ollama serve`)
- Verify that `OLLAMA_API_URL` in your `.env` matches the address Ollama is listening on (a quick check is sketched below)
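One quick way to confirm connectivity is to query Ollama's `/api/tags` endpoint, which returns the same model list as `ollama list`. This standalone check is illustrative only, not part of the project:

```typescript
// Quick connectivity check against a local Ollama instance.
const url = process.env.OLLAMA_API_URL ?? "http://localhost:11434";

try {
  const res = await fetch(`${url}/api/tags`); // lists installed models
  const { models } = (await res.json()) as { models: { name: string }[] };
  console.log(`Ollama reachable; ${models.length} model(s) installed:`);
  for (const m of models) console.log(`  - ${m.name}`);
} catch (err) {
  console.error(`Could not reach Ollama at ${url}:`, err);
}
```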
If a model is reported as unavailable:

- Pull it with `ollama pull <model-name>`
- Run `ollama list` to check which models are installed
- Use the `list-available-models` tool to see all available models

If the tools don't appear in Claude:

- Check that the path in `claude_desktop_config.json` is absolute and correct
- Restart Claude for Desktop
If responses are slow or missing, some of the chosen models may be too large for your machine's available memory. Try specifying a smaller model (see Basic Usage above) or upgrading your memory.
This project is licensed under the MIT License. For more details, please see the LICENSE file in this repository.
Contributions are welcome! Please feel free to submit a Pull Request.