FastMCP-based server that exposes tools to query a T5G dashboard for cards, cases, bugs, escalations, issues, and enriched case data.
- Python 3.9+
- pip
pip install -r requirements.txt
Set the base URL for the dashboard API used by the server.
export DASHBOARD_API="https://your-dashboard.example.com/api"
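The snippet below is a minimal sketch of how server.py might read this variable at startup; the fail-fast behavior when it is unset is an assumption, not documented behavior.

```python
import os

# Sketch only: stop early if DASHBOARD_API was not exported before launch.
DASHBOARD_API = os.environ.get("DASHBOARD_API")
if not DASHBOARD_API:
    raise RuntimeError("DASHBOARD_API must be set before starting the server")
```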
- Using the helper script:
./run.sh
- Or directly with fastmcp:
fastmcp run server.py:mcp --transport http --port 8000
The server will listen at http://localhost:8000/mcp by default.
A minimal chatbot is provided that can optionally use an LLM (OpenAI) with tool calling to invoke the MCP tools. Without an LLM, it falls back to keyword routing.
# optional if using LLM-backed chat
export OPENAI_API_KEY=sk-...
export OPENAI_MODEL=gpt-4o-mini
# ensure server is running first (see above)
python chatbot.py --mcp-url http://localhost:8000/mcp
# to run without an LLM and rely on keyword routing
python chatbot.py --no-llm
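The keyword routing fallback is not spelled out here; the sketch below shows one plausible shape for it, with a hypothetical keyword-to-tool table and `route` helper that may not match what chatbot.py actually does.

```python
from typing import Optional

# Hypothetical mapping from message keywords to MCP tool names.
KEYWORD_TOOLS = {
    "card": "get_cards",
    "case": "get_cases",
    "bug": "get_bugs",
    "escalation": "get_escalations",
    "issue": "get_issues",
}

def route(message: str) -> Optional[str]:
    """Return the first tool whose keyword appears in the message, else None."""
    text = message.lower()
    for keyword, tool in KEYWORD_TOOLS.items():
        if keyword in text:
            return tool
    return None
```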
This repo includes a simple client to invoke server tools over HTTP.
python client.py -d cards
python client.py -d cases
python client.py -d bugs
python client.py -d details
python client.py -d escalations
python client.py -d issues
python client.py -d full_case_data
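If you would rather call the tools from your own code than through client.py, fastmcp also ships a Python client; the snippet below is a sketch that assumes the default URL from the run instructions above.

```python
import asyncio

from fastmcp import Client

async def main():
    # Connect to the running server over HTTP and list the available tools.
    async with Client("http://localhost:8000/mcp") as client:
        tools = await client.list_tools()
        print([tool.name for tool in tools])
        # Call one of the tools documented below.
        result = await client.call_tool("get_cards")
        print(result)

asyncio.run(main())
```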
Server tools are defined in server.py and can be called via MCP or the provided CLI; a rough sketch of one tool definition follows the list below.
- `get_cards`: Returns all JIRA cards as a dict.
- `get_cases`: Returns all customer cases as a dict.
- `get_bugs`: Returns JIRA bugs keyed by case number.
- `get_details`: Returns extended case details keyed by case number.
- `get_escalations`: Returns a list of escalated cases.
- `get_issues`: Returns issues keyed by case number.
- `get_full_case_data`: All the information from get_cards, but keyed by case rather than by JIRA card.
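As a rough illustration of how one of these tools might be declared, the sketch below assumes a requests-based call to a hypothetical /cards endpoint under DASHBOARD_API; only the tool name, the server object name `mcp`, and the dict return type come from this README.

```python
import os

import requests
from fastmcp import FastMCP

# "mcp" matches the object referenced by `fastmcp run server.py:mcp` above.
mcp = FastMCP("T5G Dashboard")

DASHBOARD_API = os.environ["DASHBOARD_API"]

@mcp.tool()
def get_cards() -> dict:
    """Return all JIRA cards as a dict."""
    # Hypothetical endpoint path; the real request logic in server.py may differ.
    response = requests.get(f"{DASHBOARD_API}/cards", timeout=30)
    response.raise_for_status()
    return response.json()
```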