@rajivmehtaflex
Last active July 8, 2025 16:41
Local MCP Scripting with AICHAT-UTILS-DESCRIPTION
#!/bin/bash
# Install VSCode Extensions
./code --install-extension ms-python.python
./code --install-extension ms-toolsai.jupyter
./code --install-extension github.copilot
./code --install-extension janisdd.vscode-edit-csv
./code --install-extension ms-python.debugpy
./code --install-extension ms-python.vscode-pylance
./code --install-extension ms-toolsai.jupyter-keymap
./code --install-extension ms-toolsai.jupyter-renderers
./code --install-extension ms-toolsai.vscode-jupyter-cell-tags
./code --install-extension ms-toolsai.vscode-jupyter-slideshow
curl https://gist.githubusercontent.com/rajivmehtaflex/d8f83f7b286fd224d66177f40c162076/raw/203b57e93c61c04290442ba5ee44cdec7edff43d/install_brew.sh | sh
brew install aichat jq argc uv
git clone https://github.com/sigoden/llm-functions.git
cd llm-functions && touch tools.txt agents.txt
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.40.3/install.sh | bash
export NVM_DIR="$HOME/.nvm"
[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh" # This loads nvm
[ -s "$NVM_DIR/bash_completion" ] && \. "$NVM_DIR/bash_completion" # This loads nvm bash_completion
nvm install --lts
npm install -g task-master-ai
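
To confirm the toolchain installed cleanly, a quick PATH check like the following can help (a sketch; extend the list with any other commands you rely on):

```shell
# Report which of the required CLIs are on PATH
for tool in aichat jq argc uv node npm; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "$tool: ok"
  else
    echo "$tool: MISSING"
  fi
done
```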

MCP Local Server Setup Guide

A complete step-by-step guide to create a local MCP (Model Context Protocol) server from scratch.

Prerequisites

  • Python 3.12 or higher
  • uv package manager installed (see the uv installation guide)

Step-by-Step Setup

1. Create Project Directory

mkdir mcp-local-gen
cd mcp-local-gen

2. Initialize UV Environment

uv init -p 3.12
uv sync

3. Install Required Dependencies

# Install MCP with CLI support
uv add "mcp[cli]"

# Install logging library
uv add loguru

# Install Groq API client (optional - for web search functionality)
uv add groq
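
After these commands, the pyproject.toml should contain a dependencies block roughly like the following (exact version pins are placeholders; yours will reflect whatever uv resolved):

```toml
[project]
name = "mcp-local-gen"
requires-python = ">=3.12"
dependencies = [
    "groq",
    "loguru",
    "mcp[cli]",
]
```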

4. Create the MCP Server File

Create quick-mcp.py with the following content:

# quick-mcp.py
from mcp.server.fastmcp import FastMCP
from loguru import logger
from groq import Groq
import os

logger.add("./file.log")

# Create an MCP server
mcp = FastMCP("Demo")

# Add an addition tool
@mcp.tool()
def sum(a: float, b: float) -> float:
    """Add two numbers"""
    logger.info(f"Summing {a} and {b}")
    return a + b

# Add a dynamic greeting resource
@mcp.resource("greeting://{name}")
def get_greeting(name: str) -> str:
    """Get a personalized greeting"""
    logger.info(f"Getting greeting for {name}")
    return f"Hello, {name}!"

# Add web search tool using Groq (optional)
@mcp.tool()
def web_search_groq(user_query: str, temperature: float = 0.7, max_tokens: int = 2000, top_p: float = 0.94) -> str:
    """Get response from Groq's compound-beta model for web search queries"""
    try:
        logger.info(f"Processing web search query: {user_query[:100]}...")
        client = Groq(api_key=os.getenv("GROQ_API_KEY", "your-api-key-here"))
        completion = client.chat.completions.create(
            model="compound-beta",
            messages=[{"role": "user", "content": user_query}],
            temperature=float(temperature),
            max_completion_tokens=int(max_tokens),
            top_p=float(top_p),
            stream=False,
            stop=None,
        )
        response = completion.choices[0].message.content
        logger.info("Successfully generated web search response")
        return response
    except Exception as e:
        logger.error(f"Error in web_search_groq: {str(e)}")
        return f"An error occurred: {str(e)}"

if __name__ == "__main__":
    logger.info("Starting MCP server...")
    # Initialize and run the server
    mcp.run(transport='stdio')

5. Set Up Environment Variables (Optional)

If using the web search functionality:

# Create .env file
echo "GROQ_API_KEY=your-actual-api-key-here" > .env
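
Note that the server reads GROQ_API_KEY with os.getenv but does not itself load .env, so either export the variable in your shell before launching, or load the file at startup. A minimal stdlib-only loader is sketched below (the python-dotenv package is the more common choice; this helper is an assumption, not part of the original server):

```python
import os

def load_env_file(path=".env"):
    """Minimal .env loader: KEY=VALUE lines; blanks and '#' comments ignored."""
    if not os.path.exists(path):
        return
    with open(path) as fh:
        for line in fh:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            # Do not clobber variables already set in the environment
            os.environ.setdefault(key.strip(), value.strip())

load_env_file()
```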

6. Create Claude Desktop Configuration

Add this entry under the mcpServers key in your Claude Desktop configuration file (claude_desktop_config.json):

{
  "mcpServers": {
    "mcp-maths-tools": {
      "command": "uv",
      "args": [
        "--directory",
        "/path/to/your/mcp-local-gen/",
        "run",
        "quick-mcp.py"
      ]
    }
  }
}

Note: Replace /path/to/your/mcp-local-gen/ with the actual absolute path to your project directory.

7. Test the Server

Run the server directly:

uv run quick-mcp.py

Or test with MCP CLI:

uv run --with "mcp[cli]" mcp run quick-mcp.py

8. Verify Installation

The server should start and you should see:

  • Log messages in the console
  • A file.log file created with detailed logging
  • The server listening on stdio transport
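
As a quick sanity check of the logging output, the tail of file.log can be inspected programmatically (a small helper sketch; the file name matches the logger.add call in the server listing):

```python
from pathlib import Path

def tail_log(path="file.log", n=5):
    """Return the last n lines of the log file, or a hint if it is absent."""
    p = Path(path)
    if not p.exists():
        return ["no log file yet -- has the server been started?"]
    return p.read_text().splitlines()[-n:]

for line in tail_log():
    print(line)
```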

Available Tools and Resources

Tools

  • sum(a: float, b: float) - Adds two numbers with logging
  • web_search_groq(user_query: str, ...) - Web search using Groq API

Resources

  • greeting://{name} - Dynamic greeting resource for any name

Project Structure

mcp-local-gen/
├── quick-mcp.py        # Main MCP server implementation
├── pyproject.toml      # Python project configuration
├── uv.lock            # UV lock file
├── file.log           # Server logs
├── .env               # Environment variables (optional)
└── README.md          # This guide

Troubleshooting

Common Issues

  1. Import errors: Ensure all dependencies are installed with uv sync
  2. Path issues: Use absolute paths in Claude Desktop configuration
  3. API key errors: Set GROQ_API_KEY environment variable if using web search
  4. Permission errors: Ensure the project directory is accessible

Debug Commands

# Inspect the dependency tree
uv tree

# Upgrade dependencies and refresh the lock file
uv lock --upgrade

# Reinstall environment
uv sync --reinstall

Next Steps

  1. Customize the tools and resources for your specific needs
  2. Add error handling and validation
  3. Implement additional MCP features like prompts
  4. Deploy to a production environment if needed
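
As an illustration of item 2, input validation for the addition tool might look like this (validated_sum is a hypothetical name, not part of the original server; FastMCP typically surfaces a raised exception to the client as a tool error):

```python
def validated_sum(a, b):
    """Add two numbers after rejecting non-numeric input (bool is excluded deliberately)."""
    for name, value in (("a", a), ("b", b)):
        if isinstance(value, bool) or not isinstance(value, (int, float)):
            raise ValueError(f"{name} must be a number, got {type(value).__name__}")
    return float(a) + float(b)
```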

=================== AICHAT CONFIG =====================

Example aichat config.yaml (location varies by platform; on Linux typically ~/.config/aichat/config.yaml):

model: mistral:mistral-small-latest
clients:
- type: gemini
api_key: <KEY>
- type: openai
api_key: <KEY>
- type: openai-compatible
name: mistral
api_base: https://api.mistral.ai/v1
api_key: <KEY>
---------TOOLS.md------------
prompt: Always use tools whenever the requested information can be obtained from a tool
---
use_tools: all
---
=================== COMPOSIO-LANGCHAIN =====================
from langchain.agents import create_openai_functions_agent, AgentExecutor
from langchain import hub
from langchain_openai import ChatOpenAI
from composio_langchain import ComposioToolSet, Action, App
import os

os.environ["OPENAI_API_KEY"] = "<key>"
os.environ["COMPOSIO_API_KEY"] = "<key>"

llm = ChatOpenAI(model="gpt-4o-mini")
prompt = hub.pull("hwchase17/openai-functions-agent")
composio_toolset = ComposioToolSet()
tools = composio_toolset.get_tools(actions=['GMAIL_SEND_EMAIL'])
agent = create_openai_functions_agent(llm, tools, prompt)
agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True)

task = "send mail to [email protected] with subject: test and body: test"
result = agent_executor.invoke({"input": task})
print(result["output"])
=================== COMPOSIO-ACTION =====================
from composio import ComposioToolSet, Action
import os

os.environ["COMPOSIO_API_KEY"] = "<key>"

tool_set = ComposioToolSet(entity_id="default")
response = tool_set.execute_action(
    action=Action.GITHUB_STAR_A_REPOSITORY_FOR_THE_AUTHENTICATED_USER,
    params={"owner": "rajivmehtaflex", "repo": "g-sample-copilot"},
    entity_id="default",
)
print(response)
print("done")
=================== MARKITDOWN-MCP =====================
Objective: convert any format -> a Markdown file

uv add markitdown-mcp
uvx markitdown-mcp --sse --port 8889

"markitdown-mcp-server": {
  "url": "http://localhost:8889/sse"
}
=================== GITHUB-MCP =====================
"github": {
  "command": "docker",
  "args": [
    "run",
    "-i",
    "--rm",
    "-e",
    "GITHUB_PERSONAL_ACCESS_TOKEN",
    "ghcr.io/github/github-mcp-server"
  ],
  "env": {
    "GITHUB_PERSONAL_ACCESS_TOKEN": "<YOUR_TOKEN>"
  }
}
