A Python-powered Model Context Protocol (MCP) server and client that use Wolfram|Alpha via its API.
Seamlessly integrate Wolfram Alpha into your chat applications.
This project implements an MCP (Model Context Protocol) server designed to interface with the Wolfram Alpha API. It enables chat-based applications to perform computational queries and retrieve structured knowledge, facilitating advanced conversational capabilities.
Also included is an example MCP client that uses Gemini via LangChain, demonstrating how to connect large language models to the MCP server for real-time interaction with Wolfram|Alpha's knowledge engine.
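At its core, the server exposes an MCP tool that forwards a natural-language query to the Wolfram|Alpha API and returns the result to the calling model. The sketch below illustrates that pattern with the official `mcp` Python SDK; the tool name, the `httpx` dependency, and the choice of the v2 query endpoint are illustrative assumptions, not the repository's actual `src/core/server.py`:

```python
# Illustrative sketch only: the tool name, the httpx dependency, and the
# v2 query endpoint are assumptions, not the repository's actual server.py.
import os

import httpx
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("WolframAlphaServer")

@mcp.tool()
async def query_wolfram_alpha(query: str) -> str:
    """Send a natural-language query to Wolfram|Alpha and return the JSON result."""
    async with httpx.AsyncClient() as client:
        resp = await client.get(
            "https://api.wolframalpha.com/v2/query",
            params={
                "input": query,
                "appid": os.environ["WOLFRAM_API_KEY"],
                "output": "json",
            },
        )
        resp.raise_for_status()
        return resp.text

if __name__ == "__main__":
    mcp.run(transport="stdio")  # serve over stdio, as MCP hosts expect
```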
- **Wolfram|Alpha integration** for math, science, and data queries.
- **Modular architecture**: easily extendable to support additional APIs and functionalities.
- **Multi-client support**: seamlessly handles interactions from multiple clients or interfaces.
- **MCP client example** using Gemini (via LangChain).
```bash
git clone https://github.com/ricocf/mcp-wolframalpha.git
cd mcp-wolframalpha
```
Create a `.env` file based on the example:

```env
WOLFRAM_API_KEY=your_wolframalpha_appid
GeminiAPI=your_google_gemini_api_key
```

`GeminiAPI` is optional: it is only needed if you use the client method described below.
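The code can then pick these values up at startup. A minimal sketch, assuming the `python-dotenv` package is used to load the file:

```python
# Sketch of reading the .env values at startup; assumes python-dotenv.
import os

from dotenv import load_dotenv

load_dotenv()  # loads variables from .env in the working directory

WOLFRAM_API_KEY = os.environ["WOLFRAM_API_KEY"]  # required by the server
GEMINI_API_KEY = os.getenv("GeminiAPI")          # optional, client only
```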
Install dependencies:

```bash
pip install -r requirements.txt
```
To use with the VS Code MCP server, create a `.vscode/mcp.json` file in your project root, using `configs/vscode_mcp.json` as a template.
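As a rough illustration of the expected shape (the shipped `configs/vscode_mcp.json` template is authoritative, and the exact keys can vary between VS Code versions), such a file typically looks like:

```json
{
  "servers": {
    "WolframAlphaServer": {
      "command": "python3",
      "args": ["/path/to/src/core/server.py"]
    }
  }
}
```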
To use with Claude Desktop:
```json
{
  "mcpServers": {
    "WolframAlphaServer": {
      "command": "python3",
      "args": [
        "/path/to/src/core/server.py"
      ]
    }
  }
}
```
This project includes an LLM client that communicates with the MCP server.
```bash
python main.py
```
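Under the hood, an MCP client of this kind launches the server as a subprocess over stdio, discovers its tools, and routes the LLM's tool calls to it. The sketch below shows that protocol flow using the official `mcp` SDK directly, rather than the Gemini/LangChain wiring in `main.py`; the server path, tool name, and arguments are assumptions for illustration:

```python
# Minimal stdio client sketch using the official mcp SDK; the server path and
# the tool name "query_wolfram_alpha" are assumptions for illustration.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    params = StdioServerParameters(command="python3", args=["src/core/server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()          # MCP handshake
            tools = await session.list_tools()  # discover the server's tools
            print("tools:", [t.name for t in tools.tools])
            result = await session.call_tool(
                "query_wolfram_alpha", {"query": "integrate x^2 dx"}
            )
            print(result.content)               # structured tool output

asyncio.run(main())
```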
To build and run the client inside a Docker container:
```bash
docker build -t wolframalpha -f .devops/llm.Dockerfile .
docker run -it wolframalpha
```
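The container still needs the API keys at runtime. One way to provide them (an assumption; the Dockerfile may already handle configuration differently) is to pass the `.env` file explicitly:

```bash
docker run -it --env-file .env wolframalpha
```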