- Marlene - Senior Developer Advocate, Python on Azure team at Microsoft
- Active in Python community (ACM volunteer, Python Software Foundation board)
- Gwen - Developer Advocate, same team as Marlene
- 4 years at Microsoft, YouTube content creator
- Model Context Protocol (MCP) with Python
- Hands-on demos and practical examples
- Building MCP servers from scratch
- MCP Fundamentals - What is MCP and how does it work?
- In-depth MCP Features - Tools, Resources, and Prompts
- Live Demos - Building MCP servers step by step
- Complete Example - Study Buddy application
- Advanced Example - AI Research Hub with multiple MCP servers
- Live coding demonstrations
- GitHub repository with tutorials
- Real-time Q&A
- GitHub Copilot (with Agent Mode)
- Claude Code
- Cursor
- Cline
- Many others emerging rapidly
- How to provide more context to LLMs?
- How to connect to databases and external resources?
- How to perform specific actions beyond just code generation?
- Created by: Anthropic (makers of Claude)
- Purpose: Open protocol that standardizes how applications provide context to LLMs
- Analogy: "Like USB-C but for AI"
- Similar to: REST for web APIs
- Standardized way to connect AI tools with external resources
- Extensible and community-driven
- Reduces context confusion and improves AI responses
- Environments where MCP interactions happen
- Examples: VS Code, Claude Desktop, other AI tools
- Tools that interact with MCP servers
- Examples: GitHub Copilot, Claude, other AI assistants
- Expose functionality, tools, and resources
- Built by developers to provide specific capabilities
Host → Client → Server (the host application runs one or more clients; each client maintains a one-to-one connection to a server)
- Controlled by: The model/LLM
- Purpose: Functions that can be invoked
- Examples: Retrieving data, sending messages, updating database records
- Controlled by: The application
- Purpose: Expose information or data to the model
- Examples: Files, documents, database records, API responses
- Controlled by: The user
- Purpose: Predefined templates for specific tasks
- Examples: Document Q&A, output formatting, transcript summaries
```shell
pip install mcp
# or, with uv (in a project that declares mcp as a dependency)
uv sync
```
```python
from mcp.server.fastmcp import FastMCP

# Create server instance
mcp = FastMCP("your-server-name")

# Your tools, resources, and prompts go here

if __name__ == "__main__":
    mcp.run()
```
- Create a `.vscode/mcp.json` file
- Configure server settings and paths
```json
{
  "servers": {
    "learn-python-mcp": {
      "command": "uv",
      "args": ["--directory", ".", "run", "server.py"]
    }
  }
}
```

Note: VS Code's `.vscode/mcp.json` uses a top-level `"servers"` key; the `"mcpServers"` key is the Claude Desktop config format.
- Server name: Identifier for your MCP server
- Command: How to run your server (uv, python, etc.)
- Args: Arguments passed to the command
- Directory: Working directory for the server
- Predefined templates for specific tasks
- Can be static or dynamic
- Accessible via `/` slash commands in VS Code chat
```python
@mcp.prompt()
async def generate_topics(level: str) -> str:
    return f"Generate five Python topics for someone with {level} experience"
```
- Consistent welcome messages
- Complex prompt templates
- Reusable instruction sets
- Python functions that LLMs can invoke
- Enable specific actions and operations
- Decorated with `@mcp.tool()`
```python
from mcp.server.fastmcp import Context
from mcp.types import SamplingMessage, TextContent

@mcp.tool()
async def generate_exercises(topic: str, level: str, ctx: Context) -> str:
    # Build the prompt text (the talk's version fetched a stored prompt template)
    prompt_text = f"Create a {level}-level Python exercise about {topic}"
    # Use the context to call the client's LLM automatically (sampling)
    result = await ctx.session.create_message(
        messages=[
            SamplingMessage(
                role="user",
                content=TextContent(type="text", text=prompt_text),
            )
        ],
        max_tokens=500,
    )
    return result.content.text if isinstance(result.content, TextContent) else str(result.content)
```
- File-like data sources (read-only)
- Identified by URIs: `protocol://host/path`
- Reduce token usage and provide context
```python
import json

@mcp.resource("progress://{username}")
async def get_study_progress(username: str) -> str:
    # {username} in the URI template binds to the function parameter
    with open("study_progress.json", "r") as f:
        data = json.load(f)
    return json.dumps(data.get(username, {}))
```
- Configuration files
- Database records
- Document collections
- Image assets
- Allows client to automatically call LLMs
- Enables autonomous workflows
- Reduces manual intervention
```python
# Use the context to automatically invoke the client's LLM (sampling)
result = await ctx.session.create_message(
    messages=[SamplingMessage(role="user",
                              content=TextContent(type="text", text=prompt_text))],
    max_tokens=500,
)
```
- Seamless user experience
- Better for agentic coding workflows
- Reduces need for manual prompt triggering
- Generate Python exercises based on skill level
- Track learning progress
- Provide personalized study materials
- Prompt: Generate topic suggestions
- Tool: Create and store exercises
- Resource: Access progress data
- Dynamic difficulty adjustment
- Progress tracking
- Exercise database management
- Automate research workflows
- Integrate multiple MCP servers
- Streamline paper and code discovery
- Research topic initialization
- Search Hugging Face papers
- Find GitHub repositories
- Compile research dashboard
- Generate summary reports
- Custom Research Hub MCP
- Hugging Face MCP
- GitHub MCP
```json
{
  "servers": {
    "ai-research-hub": {
      "command": "uv",
      "args": ["run", "server.py"]
    },
    "github": {
      "type": "http",
      "url": "https://api.githubcopilot.com/mcp/"
    },
    "huggingface": {
      "type": "http",
      "url": "https://huggingface.co/mcp"
    }
  }
}
```
- Only include tools you need
- Deselect unused functionality
- Organize by project requirements
- Access via GitHub Copilot → Tools → MCP Servers
- Select/deselect specific tools
- Organize by relevance to current project
- Fewer tools = better performance
- Reduce context confusion
- Select tools relevant to current task
- Custom server tools
- Third-party MCP servers
- Built-in integrations
- Documentation: MS Docs MCP for better technical answers
- Development: Database connectivity and API integration
- Research: Paper discovery and code implementation finding
- Productivity: Task automation and workflow orchestration
- Growing ecosystem of community-built servers
- GitHub repositories with pre-built functionality
- Easy to extend and customize
- Use pseudo-code style natural language
- Be specific about expected outputs
- Include clear instructions and context
- Create a `.github/copilot-instructions.md` file
- Explicitly mention MCP server usage
- Provide context about your project
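A hypothetical `.github/copilot-instructions.md` along these lines tells Copilot when to reach for your servers (server names are examples):

```markdown
# Copilot Instructions

This project uses MCP servers. When answering:

- Use the `learn-python-mcp` server's tools to generate exercises.
- Read study progress from its resources before suggesting topics.
- Prefer the GitHub MCP server for repository searches.
```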
- MCP works best in Agent Mode
- Agent Mode enables autonomous tool calling
- Ask Mode requires manual tool selection
- Solution: Ensure proper MCP configuration
- Solution: Restart VS Code after config changes
- Solution: Check file paths and permissions
- Solution: Limit selected tools
- Solution: Use specific, clear prompts
- Solution: Optimize resource usage
- Solution: Use Copilot instructions
- Solution: Be explicit about MCP server usage
- Solution: Structure prompts effectively
- Visit the tutorial repository: aka.ms/learnmcppython
- Set Up Development Environment: Install uv, Python, VS Code
- Try Basic Examples: Start with prompts, then tools, then resources
- Implement the Study Buddy application
- Build your own simple MCP server
- Integrate multiple MCP servers
- Contribute to the community
- GitHub repository with complete examples
- Documentation and best practices
- Community Discord and forums
- Created by: Anthropic, adopted by AI community
- Evolution: Rapidly evolving with community input
- Contribution: Open source and community-driven
- David Soria Parra (Twitter/X: active MCP updates)
- Microsoft Python team
- Growing developer community
- Standardization across AI tools
- Expanded server ecosystem
- Enhanced integration capabilities
- MCP standardizes AI tool integration
- Three core concepts: Tools, Resources, Prompts
- Python implementation is straightforward and powerful
- Agent Mode provides the best experience
- What workflows could you automate with MCP?
- Which external resources would benefit your AI interactions?
- How could MCP improve your development productivity?
- Follow us for more Python and AI content
- Check out the GitHub repository
- Join the MCP community discussions
- Tutorial Repository: aka.ms/learnmcppython
- MCP Documentation: Official Anthropic docs
- Python Community: ACM, Python Software Foundation
- Microsoft Python Team: Azure Python resources
- UV: Modern Python package manager
- Rich: Terminal UI library
- FastMCP: Python MCP server framework
- VS Code: Primary development environment
- Python Discord servers
- MCP developer forums
- GitHub discussions and issues