Role: You are an expert AI-assisted software engineer. Your task is to analyze a given codebase and generate a set of robust, best-practice AGENTS.md files to help other AI coding agents understand and contribute to this project effectively.
Objective: Create one AGENTS.md file at the project root and, where appropriate, additional AGENTS.md files in key subdirectories (like packages in a monorepo or distinct modules like /frontend and /api). These files will serve as the primary instruction set for AI agents.
Process:
Analyze Codebase Structure:
Scan the entire file tree to identify the project's overall architecture.
Identify the primary package.json, pyproject.toml, go.mod, pom.xml, build.gradle, requirements.txt, or similar dependency/build file to understand the project's main language, framework, and build system.
Identify major subdirectories that represent distinct parts of the project (e.g., /packages/*, /services/, /apps/web, /docs). These are candidates for nested AGENTS.md files.
Generate Root AGENTS.md:
Create a file named AGENTS.md in the project's root directory.
This file must contain global instructions applicable to the entire project.
Populate it with the following sections, inferring content from the codebase (especially build files, CI/CD configs like .github/workflows, and linter configs like .eslintrc, pyproject.toml, checkstyle.xml, etc.).
Crucially: All commands and filepaths must be accurate for the technology stack you identify.
# AGENTS.md

## 1. Project Overview & Structure
* **Description:** [Brief, one-sentence description of the project's purpose.]
* **Primary Tech:** [e.g., "Python, Django, and React", "Java and Spring Boot", "Go microservices", "TypeScript monorepo with pnpm"]
* **Key Directories:**
  * `[e.g., /src]`: [Purpose of directory]
  * `[e.g., /tests]`: [Purpose of directory]
  * `[e.g., /docs]`: [Purpose of directory]
  * `[e.g., /scripts]`: [Purpose of directory]

## 2. CRITICAL DOCUMENTATION MANDATES
**This is the most important set of rules for this project. Adhere to them strictly.**
* **Rule 1: Document All New Code:** All new code (files, functions, classes, methods) **must** be comprehensively documented *before* the task is considered complete.
* **Rule 2: Comprehensive Standard:** "Comprehensively documented" means:
  * **File Level:** A header comment explaining the file's purpose and contents.
  * **Function/Method Level:** Docstrings (e.g., JSDoc, Python docstrings, JavaDoc) that detail:
    * A clear description of what the function does.
    * All `@param` (inputs): name, type, and description.
    * All `@return` (outputs): type and description of the return value.
    * All `@throws` or error states: descriptions of any errors that can be thrown.
* **Rule 3: Update Documentation with Code:** When you modify *any* existing code, you **must** update its corresponding documentation to reflect the changes. This is non-negotiable.
* **Rule 4: DO NOT DELETE (Critical):** **NEVER** delete any documentation, comments, or docstrings unless you are also deleting the code block, function, or file that it documents.
* **Task: Document Existing Code:** As you work, if you encounter any undocumented code, your first priority is to apply the comprehensive documentation standard (Rule 2) to it *before* making other changes.
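As an illustration of the Rule 2 standard in a Python codebase (the module and function names here are hypothetical, not from any particular project), a fully documented file might look like:

```python
"""payments.py

Purpose: Create charges against the payment gateway (illustrative module).
"""


def create_charge(amount_cents: int, currency: str = "usd") -> dict:
    """Create a charge and return the gateway's charge record.

    Args:
        amount_cents: Charge amount in the currency's smallest unit.
        currency: ISO 4217 currency code (defaults to "usd").

    Returns:
        A dict describing the created charge.

    Raises:
        ValueError: If amount_cents is not positive.
    """
    if amount_cents <= 0:
        raise ValueError("amount_cents must be positive")
    return {"amount": amount_cents, "currency": currency, "status": "created"}
```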
## 3. Setup & Build Commands
Provide exact, copy-pasteable commands.
* **Install Dependencies:** `[e.g., poetry install, npm install, mvn install, go mod tidy]`
* **Build Project:** `[e.g., mvn package, npm run build, go build ./...]`
* **Run Development Server:** `[e.g., python manage.py runserver, npm run dev, go run main.go]`

## 4. Testing Commands
* **Run All Tests:** `[e.g., poetry run pytest, npm run test, mvn test, go test ./...]`
* **Run Specific Test File:** `[e.g., poetry run pytest path/to/test.py, npm run test -- path/to/test.js, mvn -Dtest=MyTestClass test]`

## 5. Coding Conventions & Style
* **Documentation:** Adhere *strictly* to the rules in the `CRITICAL DOCUMENTATION MANDATES` section.
* **Style Guide:** [e.g., "Follow PEP 8", "Follow Google Java Style Guide", "ESLint rules in .eslintrc.json"]
* **Linter Command:** `[e.g., poetry run ruff check ., npm run lint]`
* **Formatter Command:** `[e.g., poetry run ruff format ., npm run format, go fmt ./...]`
* **Naming:** [e.g., "Python classes are PascalCase", "Java methods are camelCase", "Go funcs are PascalCase for exported"]
* **Framework Rules:** [e.g., "All Django models must be in a `models.py` file", "All React state logic must use custom hooks"]

## 6. Architecture & Design Patterns
* **Key Rules:**
  * [e.g., "DO NOT write business logic in controllers; use Service classes in `/services`."]
  * [e.g., "All database access must go through the Repository layer."]
  * [e.g., "Environment variables are loaded from `.env`. See `.env.example` for a template."]

## 7. Codebase Evolution (New Modules)
* **Mandate:** If your task involves creating a new, significant module, package, or service (e.g., a new directory in `/apps`, `/packages`, or `/services`), you **must** create a new `AGENTS.md` file in that new directory's root.
* **Content Standard:** This new `AGENTS.md` file must:
  1. Start by stating it inherits from the root: `This file contains rules specific to this module. It inherits all rules from the root AGENTS.md.`
  2. Include an overview of the new module's purpose and tech stack.
3. List all module-specific commands (e.g., `npm run test --filter new-package`).
4. Define any module-specific conventions or rules that add to or override the root file.
## 8. Git & PR Workflow
* **Branch Naming:** `[e.g., feat/feature-name, fix/bug-name, docs/topic-name]`
* **Commit Messages:** [e.g., "Must follow Conventional Commits standard"]
* **PR Checklist:**
  * [e.g., "Title is descriptive and follows conventions."]
  * [e.g., "All documentation is updated per mandates."]
  * [e.g., "If new modules were added, they include their own `AGENTS.md` file."]
  * [e.g., "All tests pass."]
  * [e.g., "Linter passes."]
Generate Nested AGENTS.md (If Applicable):
For each major subdirectory identified in Step 1 (e.g., /frontend or /api), create a specific AGENTS.md file inside it.
This file should only contain rules that are specific to that module and override or add to the root file.
Example for a Python backend's sub-directory AGENTS.md:
# AGENTS.md (for /api)
This file contains rules specific to the `/api` service. It inherits all rules from the root `AGENTS.md`.
**CRITICAL: All documentation mandates from the root `AGENTS.md` apply here.**

## 1. Project Overview
* **Description:** The Django REST Framework API.
* **Tech:** Python, Django, PostgreSQL.

## 2. Commands
* **Run Dev Server:** `poetry run python manage.py runserver 8000`
* **Run Tests:** `poetry run pytest api/`
* **Run Migrations:** `poetry run python manage.py migrate`

## 3. Coding Conventions
* **Serialization:** Use Django REST Framework serializers, defined in `serializers.py` for each app.
* **Authentication:** All views must use token-based authentication from `rest_framework.authtoken`.
Final Review:
Ensure all commands are precise and wrapped in backticks (`).
Keep descriptions concise (use bullet points, not prose).
Do not duplicate information. Link to external docs (CONTRIBUTING.md, wikis) if more detail is needed, but provide the essential commands and rules directly.
Comprehensive prompts for AI-assisted software maintenance, documentation, and refactoring.
⸻
📘 Meta Context Template
Use this before any specialized task to set the AI’s frame of reference.
You are an expert full-stack engineer, DevOps practitioner, and technical writer.
You will perform your assigned task with discipline and completeness, following idiomatic best practices for the language/framework in question.
Rules:
- Preserve functional intent unless explicitly instructed otherwise.
- Optimize for clarity, maintainability, and performance.
- Document both _what_ and _why_.
- Output in clean, well-formatted Markdown or code blocks.
- Always include a summary of what was changed, why, and how to validate correctness.
- When you make assumptions, list them explicitly.
- If a task risks breaking behavior, provide an automated rollback or feature-flag plan.
Context:
- Provide repo root path and branch to operate on.
- Provide file access context or paste relevant files inline if not accessible.
- Provide target language/framework when applicable.
⸻
🧩 1. Generate or Update AGENTS.md
Generic Prompt
Create or update a complete AGENTS.md file at the repository root.
Include every automation and agent that interacts with the codebase.
For each agent include:
- Name and short description
- Purpose and scope (what it automates)
- Trigger(s) and frequency (push, schedule, manual)
- Entry point (script, workflow file, target command)
- Required environment variables and example values
- Output artifacts (reports, builds, releases)
- Dependencies (system, language packages, external services)
- Fail behavior and retry strategy
- Security considerations (secrets, scopes)
- How to run locally (developer steps)
- Troubleshooting steps and common failure modes
- Example logs and how to interpret them
- Contact/ownership (team or role responsible)
If repository contains multiple subprojects:
- Create a top-level AGENTS.md referencing per-subproject AGENTS.md with relative links
- Generate a per-subproject AGENTS.md in each subproject
- Add a summary table in root AGENTS.md listing each agent and where it lives
Output:
- A single README-style Markdown file with Table of Contents and anchors.
- For complex agents, include example YAML/JSON snippet of its configuration.
Python / FastAPI / Django
Document:
- Celery / RQ / Dramatiq workers: queues, concurrency, broker URL, routes
- Background tasks (fastapi.BackgroundTasks), APScheduler cron jobs
- Alembic or Django migrations: revision naming convention, migration workflow
- pytest / tox / pre-commit hooks: commands and expected outputs
- Sentry or other error reporting configuration
- Docker/Compose service definitions used in CI or local dev
- Python version managers in use (pyenv, asdf)
- Virtualenv or venv conventions, requirements.txt / Pipfile / poetry.lock usage
Add samples:
- celery worker launch command
- sample pytest invocation with marker usage
- local dev docker-compose command for agents
JavaScript / TypeScript
Document:
- npm / yarn scripts (what each script does)
- Build pipelines (webpack, esbuild, vite)
- Linting and formatting: eslint, prettier, husky, lint-staged
- CI agents (GitHub Actions, CircleCI) and deployment agents (Vercel, Netlify)
- Serverless functions and schedule triggers (cron via provider)
- Storybook or visual regression test runners
Add examples:
- GitHub Actions workflow snippet for test + build + deploy
- Vercel/GitHub environment variable usage
Go
Document:
- go:generate directives and where they run
- Build and test agents, Docker build agents
- Release tagging and binary packaging agents
- Use of goreleaser or similar tools
Add examples:
- goreleaser config sample
- recommended go build/test matrix in CI
Java / Spring Boot
Document:
- Maven/Gradle tasks used in CI pipelines
- JUnit runner integration and coverage reporting
- Database migration agents (Flyway/Liquibase)
- Container build and Java-specific JVM tuning for agents
Add examples:
- sample Jenkinsfile/GitHub Actions matrix for JDK versions
C# / .NET
Document:
- dotnet build, dotnet test tasks
- NuGet package publishing agents
- Azure DevOps or GitHub Actions agents for container builds
- test coverage generation tools (coverlet)
⸻
🔧 2. Refactor Codebase
Generic Prompt
Refactor the entire codebase for clarity, maintainability, and performance while preserving external behavior.
Perform:
1. Analyze the project structure and dependency graph.
2. Identify duplication, monolithic modules, and long methods/classes.
3. Apply idiomatic patterns for the language/framework.
4. Break large files into smaller modules where appropriate.
5. Replace anti-patterns and deprecated APIs.
6. Improve tests and create tests where missing for changed code.
7. Update imports, package/module names, and public APIs as needed — expose changes in CHANGELOG or migration guide.
Deliverables:
- Patch/diff for modified files.
- Summary of changes with rationale.
- How to run tests and validate behavior.
- A changelog entry or migration notes for breaking changes.
- Suggested follow-up refactors.
Python
Refactor specifics:
- Enforce PEP8 naming and PEP257 docstrings.
- Add type hints and gradually introduce mypy strictness (annotate stubs).
- Replace large functions with smaller helpers and classes.
- Introduce context managers for resource lifecycles.
- Extract service/business logic from view/controllers.
- Break monolithic modules into packages.
- Replace bare except: clauses with specific exceptions.
- Introduce patterns: repository/service/factory as needed.
Provide code diffs and tests to validate behavior.
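A minimal sketch of the bare-`except` refactor mentioned above (the function and payload are illustrative, not from a real codebase):

```python
import json

# Before: a bare `except:` hides every failure mode, including
# KeyboardInterrupt and SystemExit.
#
#     try:
#         value = json.loads(raw)
#     except:
#         value = None
#
# After: catch only the exceptions this call can actually raise.


def parse_payload(raw):
    """Parse a JSON payload, returning None on malformed or missing input."""
    try:
        return json.loads(raw)
    except (json.JSONDecodeError, TypeError):
        # JSONDecodeError: malformed text; TypeError: raw is not str/bytes.
        return None
```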
JavaScript / TypeScript
Refactor specifics:
- Convert to ES modules (import/export) or to TypeScript where missing.
- Replace callbacks with async/await.
- Break large components into smaller subcomponents (React).
- Add typing with TypeScript interfaces/types.
- Centralize API clients and error handling.
- Move environment-sensitive config into runtime env vars.
- Remove unused code and dead exports.
Go
Refactor specifics:
- Ensure packages export minimal public API.
- Use interfaces to decouple implementations.
- Handle errors clearly: wrap with fmt.Errorf("%w") where helpful.
- Avoid large god packages; split responsibilities.
- Use package-level tests and interface-based testing for mocks.
Rust
Refactor specifics:
- Clarify ownership/borrowing lifetimes.
- Replace heap allocations where unnecessary.
- Use idiomatic Result/Option error flows.
- Add proper module docs and example tests.
Java / Spring Boot
Refactor specifics:
- Apply SOLID: single responsibility for controllers/services/repositories.
- Introduce DTOs for serialization boundary.
- Move configuration to profiles and avoid environment-specific code in business logic.
- Use constructor injection over field injection.
C# / .NET
Refactor specifics:
- Use dependency injection patterns and interfaces.
- Avoid synchronous I/O on async methods.
- Ensure correct implementation of IDisposable when needed.
Swift
Refactor specifics:
- Prefer value types (structs) when appropriate.
- Use Swift concurrency (async/await) safely and reduce forced unwraps.
- Move large view logic into ViewModels (MVVM) for SwiftUI.
- Apply SwiftLint rules.
⸻
🧪 3. Create or Update Unit Tests
Generic Prompt
Create or update comprehensive unit tests for the entire codebase.
Guidelines:
- Use native or most appropriate testing framework.
- Cover public APIs and core business logic.
- Include positive, negative, and edge cases.
- Mock external dependencies (network, DB, file system).
- Write fixtures/factories for repeated setup.
- Use deterministic seeds for randomness.
- For async code: test concurrency, timeouts, cancellation behavior.
- Add test running instructions and CI integration.
- Output a coverage report and highlight gaps under 90%.
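The guidelines on mocking external dependencies and deterministic seeds can be sketched together in one small pytest-style test (all names here are hypothetical; a stub is passed in place of a real network/DB fetcher):

```python
import random


def pick_greeting(fetch_names):
    """Greet a randomly chosen name from the injected fetcher.

    `fetch_names` stands in for an external dependency (network, DB),
    so tests can pass a plain lambda instead of patching modules.
    """
    return f"Hello, {random.choice(fetch_names())}!"


def test_pick_greeting_with_stub():
    random.seed(1234)  # deterministic seed, per the guideline above
    # Positive case: stubbed dependency, result is one of the inputs.
    assert pick_greeting(lambda: ["Ada", "Grace"]).startswith("Hello, ")
    # Edge case: a single name is always chosen.
    assert pick_greeting(lambda: ["Ada"]) == "Hello, Ada!"
```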
Python (pytest)
- Use pytest for tests; leverage pytest fixtures and parametrized tests.
- For async: use pytest-asyncio.
- Use monkeypatch or unittest.mock for dependency isolation.
- Create conftest.py for shared fixtures.
- Add tox / GitHub Actions matrix for Python versions.
- Example test skeleton:
def test_function_behavior():
    result = module.function_under_test(args)
    assert result == expected
JavaScript / TypeScript (Jest / Vitest)
- Use Jest or Vitest, enable ts-jest if TypeScript present.
- Mock fetch/axios with jest.mock or msw (Mock Service Worker) for integration-like tests.
- For React: use @testing-library/react for component behavior.
- Snapshot tests for complex UI render output.
- Add test scripts to package.json and CI test steps.
Go
- Use table-driven tests.
- Use testing.T and subtests.
- Use testcontainers-go for DB dependent code or sqlite3 in memory.
- Benchmarks: go test -bench .
Rust
- Use cargo test, integration tests under tests/ dir.
- Test both Ok and Err cases for Result-returning functions.
Java
- JUnit 5 tests, Mockito for mocking.
- Integration tests via @SpringBootTest with test profile and embedded DB (H2).
C#
- xUnit or NUnit with Moq for mocking.
- Configure test projects in solution file and CI.
Swift
- XCTest for unit tests.
- UI tests with XCUITest where applicable.
⸻
📚 4. Generate or Update Documentation (Per-file and Inline)
Generic Prompt
Add or update comprehensive per-file and inline documentation for the repository.
For each file:
- Provide a top-level docstring/header describing:
- Purpose of the file
- Key responsibilities
- External dependencies and side-effects
- Example usage (minimal)
- For each public class/function:
- Description of what it does
- Parameters and types
- Return types and possible error/exception cases
- Side effects, thread-safety, concurrency notes
- Complexity or performance characteristics (if relevant)
- Inline comments:
- Explain why complex logic exists (rationale)
- Note non-obvious edge cases and assumptions
- Use language-appropriate doc formats:
- Python: Google/Numpy/PEP-257 style docstrings
- JS/TS: JSDoc
- Go: godoc comments
- Java: Javadoc
- C#: XML docs
- Rust: /// rustdoc
- Swift: /// Swift doc
- Generate a documentation index mapping files to their top-level summary.
Example Python File Docstring
"""
module_name.py
Purpose: Provide utilities for X that are used by Y and Z.
Dependencies: requests, pydantic
Usage:
from module_name import important_function
important_function(...)
"""
Example JSDoc
/**
 * Does something.
 * @param {string} id - Unique identifier.
 * @returns {Promise<object>} Resolves to the object.
 */
async function fetchThing(id) { ... }
⸻
📘 5. Generate Comprehensive README.md
Generic Prompt
Generate a full README.md for the repository.
Sections to include:
- Project title and concise tagline
- Overview: purpose and key ideas
- Architecture summary and component diagram (mermaid optional)
- Quick start (local dev): prerequisites and commands
- Installation and setup (containerized and non-containerized)
- Environment variables and .env.example explanation
- Build and run instructions
- Testing (unit, integration, E2E)
- CI/CD: what runs and what to expect
- Contributing guidelines (link to CONTRIBUTING.md)
- API reference / CLI usage (short examples)
- Troubleshooting and common issues
- Roadmap and how to propose changes
- License and acknowledgments
Style:
- Use concise, practical documentation with examples and commands.
- Provide copy-paste command blocks for common tasks.
- Use a Table of Contents for long READMEs.
- Provide links to AGENTS.md, CONTRIBUTING.md, CHANGELOG.md, and other reference docs.
Framework-specific notes
• React/Next.js/Vue: Include build and development commands, static export steps, preview URLs for Vercel/Netlify.
• Django/FastAPI/Flask: Include migrations, how to run dev server, creating superuser, sample curl requests.
• Go/Rust/Java/C#: Include binary build instructions, example env vars, and system requirements.
⸻
⚙️ 6. Generate or Update .gitignore, Makefile, and .env.example
Generic Prompt
Create or update standard support files for the repo.
Deliver:
- .gitignore: include language- and framework-specific sections
- Makefile: targets for setup, build, test, lint, clean, run, package
- .env.example: list every environment variable used in code with placeholder and comment
Make sure to:
- Use canonical patterns for languages (node_modules, .next, `__pycache__`, bin, obj, target)
- Provide comments in Makefile describing each target
- Provide safe default values in .env.example (e.g., 127.0.0.1 placeholders)
Examples by stack
# Python
## .gitignore:
venv/
__pycache__/
*.pyc
.env
## Makefile:
.PHONY: install test lint run clean
install:
	pip install -r requirements.txt
test:
	pytest -q
lint:
	black --check . && flake8
run:
	uvicorn app.main:app --reload
clean:
	rm -rf __pycache__ .pytest_cache build
## .env.example:
DATABASE_URL=postgres://user:pass@localhost:5432/dbname
SECRET_KEY=replace_me
DEBUG=true
# Node.js
## .gitignore:
node_modules/
dist/
.env.local
## Makefile:
.PHONY: install dev build test lint
install:
	npm ci
dev:
	npm run dev
build:
	npm run build
test:
	npm run test
lint:
	npm run lint
## .env.example:
NODE_ENV=development
PORT=3000
API_URL=http://localhost:8000
# Go
## .gitignore:
bin/
vendor/
## Makefile:
.PHONY: build test fmt vet run clean
build:
	go build -o bin/app ./cmd/app
test:
	go test ./...
fmt:
	go fmt ./...
vet:
	go vet ./...
run:
	go run ./cmd/app
clean:
	rm -rf bin
## .env.example:
PORT=8080
DATABASE_URL=postgres://user:pass@localhost:5432/db
# Java
## .gitignore:
target/
*.iml
.idea/
## Makefile:
mvn-clean:
	mvn clean
build:
	mvn package -DskipTests
test:
	mvn test
run:
	mvn spring-boot:run
## .env.example:
SPRING_DATASOURCE_URL=jdbc:postgresql://localhost:5432/db
SPRING_PROFILES_ACTIVE=local
# C#
## .gitignore:
bin/
obj/
.vs/
## Makefile (or scripts):
build:
	dotnet build
test:
	dotnet test
run:
	dotnet run --project src/MyApp
## .env.example:
ASPNETCORE_ENVIRONMENT=Development
CONNECTION_STRING=Server=localhost;Database=mydb;User Id=sa;Password=pass;
⸻
🧱 7. Generate a Codebase Summary (Project Map)
Generic Prompt
Analyze the repository and produce a project map.
Deliver:
- Directory tree (2–4 levels deep)
- One-line description for each top-level file/directory
- Graph of key cross-module dependencies
- Unreferenced or orphan files
- Suggested reorganization for modularity (if applicable)
- Mermaid diagram (optional) showing major components and data flow
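The depth-limited directory-tree deliverable can be sketched with a few lines of stdlib Python (a rough helper, not a prescribed tool):

```python
import os


def project_tree(root, max_depth=3):
    """Return a depth-limited directory listing, one indented line per entry."""
    lines = []
    root = root.rstrip(os.sep)
    base_depth = root.count(os.sep)
    for dirpath, dirnames, filenames in os.walk(root):
        depth = dirpath.count(os.sep) - base_depth
        if depth >= max_depth:
            dirnames[:] = []  # prune: do not descend past max_depth
            continue
        indent = "  " * depth
        lines.append(f"{indent}{os.path.basename(dirpath)}/")
        for name in sorted(filenames):
            lines.append(f"{indent}  {name}")
    return lines
```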
Output example:
/src
├─ api/ # REST endpoints; maps to controllers
├─ db/ # ORM models and migrations
├─ services/ # Business logic and domain services
├─ workers/ # Background jobs and schedulers
├─ tests/ # Unit and integration tests
└─ README.md # Project overview and dev instructions
⸻
🧠 8. Codebase Quality Audit
Generic Prompt
Run a thorough quality audit of the codebase and produce a prioritized report.
Audit items:
- Linter/style violations
- Complexity hotspots (high cyclomatic complexity)
- Duplication and DRY violations
- Long methods or large classes
- Exception/error handling issues
- Performance hotspots and IO/CPU bottlenecks
- Security concerns (insecure configs, hardcoded secrets)
- Test coverage and flaky tests
- Dependency health and license issues
- CI/CD reliability and flaky pipeline steps
Output:
- Executive summary
- Sections: Critical / High / Medium / Low findings
- Reproduction steps and examples for each finding
- Automated fixes where possible (e.g., style fixes via formatter)
- Suggested timelines and impact estimates for fixes
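For the complexity-hotspot item, a crude per-function cyclomatic estimate can be computed from the AST alone (a heuristic sketch: 1 + branch points, which over- or under-counts some constructs and nested functions):

```python
import ast


def cyclomatic_complexity(source):
    """Approximate cyclomatic complexity per function: 1 + branch points."""
    branch_nodes = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                    ast.BoolOp, ast.IfExp)
    scores = {}
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            # ast.walk(node) visits the function body, including nested defs,
            # so this is a hotspot signal rather than an exact metric.
            score = 1 + sum(isinstance(n, branch_nodes)
                            for n in ast.walk(node))
            scores[node.name] = score
    return scores
```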
⸻
🧮 9. Dependency Health and Upgrade Plan
Generic Prompt
Produce a dependency audit and upgrade plan.
Tasks:
- Enumerate all dependencies (runtime, dev, system)
- For each dependency: current version, latest stable version, release notes link
- Indicate known CVEs or advisories
- Propose upgrade batches (safe upgrades, upgrades requiring testing, breaking changes)
- Provide a test plan for each upgrade (smoke tests, integration tests)
- If relevant: provide a fork/patch strategy for unmaintained libraries
⸻
🔐 10. Security and Secrets Audit
Generic Prompt
Perform a security scan focused on secrets and insecure patterns.
Tasks:
- Search repo for high-entropy strings and common secret patterns (API keys, private keys)
- Flag committed secrets and suggest rotation steps
- Detect insecure TLS usage, hardcoded credentials, or permissive CORS origins
- Audit Dockerfiles for root usage and unnecessary packages
- Check CI for exposed secrets and permissive job runners
- Evaluate authentication flows, authorization checks, and input validation
- Produce an immediate remediation checklist and mid-term security roadmap
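The high-entropy search in the first task is commonly implemented with Shannon entropy over long token candidates; a minimal stdlib sketch (threshold and token pattern are arbitrary starting points, not a vetted scanner):

```python
import math
import re


def shannon_entropy(s):
    """Bits of entropy per character of s."""
    probs = [s.count(c) / len(s) for c in set(s)]
    return -sum(p * math.log2(p) for p in probs)


def flag_candidate_secrets(text, threshold=4.0):
    """Flag long base64-ish tokens whose per-char entropy exceeds threshold."""
    tokens = re.findall(r"[A-Za-z0-9+/=_\-]{20,}", text)
    return [t for t in tokens if shannon_entropy(t) > threshold]
```

Real scanners (gitleaks, trufflehog) add context rules to cut false positives; this sketch only shows the core signal.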
⸻
💬 11. Generate or Update CONTRIBUTING.md
Generic Prompt
Create or update CONTRIBUTING.md with developer-friendly onboarding and contribution guidelines.
Include:
- Local dev setup (clone, prerequisites, make targets)
- Branch naming standards and PR process
- Commit message conventions (Conventional Commits recommended)
- Testing requirements and how to run them
- Linting and formatting enforcement (pre-commit, CI)
- Code review checklist (maintainability, security, tests)
- How to report bugs and propose features
- Process for handling sensitive security reports (private email, security.md)
⸻
🧰 12. Generate or Update CHANGELOG.md
Generic Prompt
Create or align CHANGELOG.md to "Keep a Changelog" format.
Requirements:
- Use semantic versioning headers (Unreleased, vX.Y.Z)
- Sections: Added, Changed, Deprecated, Removed, Fixed, Security
- For each release, include release date and short descriptions
- If available, link PRs or commits
- Provide guidance on release notes generation for maintainers
⸻
🧪 13. Create or Update Integration Tests
Generic Prompt
Create or update integration/e2e tests that validate end-to-end behavior.
Requirements:
- Use realistic test fixtures or testcontainers for DB/backend services
- Ensure environment setup and teardown is idempotent
- Mock external upstreams where appropriate, or use recorded fixtures
- Tag tests: smoke, integration, regression, flaky
- Add CI gates for smoke/integration tests that must pass before release
Example frameworks:
• FastAPI/Flask/Django: use TestClient and transactional DB rollbacks
• Express/Node: use supertest + test doubles for external APIs
• React/Next: Playwright/Cypress for UI flows
• Go/Java/C#: API-level tests with in-memory DBs or Docker containers
⸻
🪄 14. Add or Update Type Definitions
Generic Prompt
Ensure codebase is strongly typed per language capabilities.
Tasks:
- Python: Add PEP 484 type hints and mypy config; add typed stubs where necessary
- TypeScript: Migrate JS files to .ts/.tsx; introduce strict compilerOptions
- Go: Ensure exported functions and structs are typed (Go is typed by default), add interface types for mocks
- Java/C#/Swift/Rust: add/generate interface/trait definitions or generics where helpful
Deliver:
- Type coverage report and list of untyped/unresolved signatures
- Suggested incremental strategy for adding types
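For the Python task above, `typing.Protocol` is one way to type a dependency without importing its concrete class (names below are invented for illustration; mypy checks call sites structurally):

```python
from typing import Protocol


class Repository(Protocol):
    """Structural type: anything with get(item_id) -> dict satisfies it."""

    def get(self, item_id: int) -> dict: ...


class InMemoryRepository:
    """Concrete implementation used here only to exercise the Protocol."""

    def __init__(self) -> None:
        self._items: dict[int, dict] = {}

    def get(self, item_id: int) -> dict:
        return self._items.get(item_id, {})


def fetch_name(repo: Repository, item_id: int) -> str:
    """Typed call site: mypy verifies repo structurally matches Repository."""
    return repo.get(item_id).get("name", "unknown")
```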
⸻
🧩 15. API Documentation Generator
Generic Prompt
Generate or refresh API documentation.
Deliverables:
- OpenAPI 3.0 (YAML/JSON) or RESTful Markdown docs
- Endpoint list with methods, params, request/response examples
- Authentication/authorization model
- Error codes and sample payloads
- Link to source code locations for handlers/controllers
- Reconciliation with existing swagger/autogenerated docs
Framework specifics:
• FastAPI: extract and tidy auto-generated OpenAPI schema, add examples
• Django REST Framework: ensure serializers and viewsets have docs, generate swagger
• Express: use JSDoc/Swagger annotations or convert route docs to OpenAPI YAML
• Go: annotate handlers for swagger using go-swagger or swaggo
⸻
📊 16. Generate Developer Metrics Dashboard
Generic Prompt
Produce developer metrics and health dashboard data in JSON/Markdown.
Metrics:
- Lines of code per module
- Test coverage per module
- Last modified dates
- Commit frequency and contributors per file/module
- Static complexity metrics (cyclomatic complexity)
- Top 10 churn files
- Open PR age and average review time
Deliver:
- JSON data and a human-readable summary
- Suggested improvement actions (reduce churn, add tests)
⸻
🔄 17. Documentation Synchronization Agent
Generic Prompt
Scan the codebase for discrepancies between code and docstrings/README/AGENTS.md.
Tasks:
- Identify functions changed without docstring updates
- Flag README examples that no longer match CLI/API
- Update AGENTS.md if automation configs changed
- Produce a "docs-to-code" diff and generate patch suggestions
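One piece of the docs-to-code diff can be automated directly from the AST: flag parameters a function's docstring never mentions (a rough substring heuristic; it ignores keyword-only/positional-only args and will miss renamed mentions):

```python
import ast


def undocumented_params(source):
    """Map each function name to parameters its docstring never mentions."""
    gaps = {}
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
            doc = ast.get_docstring(node) or ""
            missing = [a.arg for a in node.args.args
                       if a.arg != "self" and a.arg not in doc]
            if missing:
                gaps[node.name] = missing
    return gaps
```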
⸻
🔎 18. Language-Specific Deep Reviews
Python Deep Review
Thoroughly audit Python project for:
- Async correctness (detect sync blocking calls inside async def)
- Type coverage and mypy issues
- Circular import hotspots
- Resource leaks (open files, DB sessions)
- Performance bottlenecks: recommend profiling hooks
- Caching opportunities and memoization
- Secure handling of user input
Deliver diffs, tests, and sample profiling output suggestions.
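The async-correctness check can be started with a small AST pass that flags known blocking calls inside `async def` bodies (the deny-list below is a tiny illustrative sample, not exhaustive):

```python
import ast

# Illustrative deny-list; a real audit would cover many more calls.
BLOCKING_CALLS = {"time.sleep", "requests.get", "requests.post"}


def blocking_calls_in_async(source):
    """List dotted calls inside async def bodies that would block the loop."""
    findings = []
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.AsyncFunctionDef):
            for call in ast.walk(node):
                if (isinstance(call, ast.Call)
                        and isinstance(call.func, ast.Attribute)
                        and isinstance(call.func.value, ast.Name)):
                    dotted = f"{call.func.value.id}.{call.func.attr}"
                    if dotted in BLOCKING_CALLS:
                        findings.append(f"{node.name}: {dotted}")
    return findings
```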
JavaScript/TypeScript Deep Review
Thoroughly audit JS/TS project for:
- Unhandled Promise rejections
- Suspicious use of any/unknown in TS
- Large bundle entry points; recommend code splitting
- Memory leaks in long-lived client sessions (e.g., event listeners not removed)
- Security issues (unsafe eval, DOM insertion)
Deliver: refactors, webpack/vite suggestions, test additions.
Go Deep Review
Thoroughly audit Go project for:
- Goroutine leaks and unbuffered channel blocking
- Error wrapping and handling consistency
- Unnecessary allocations in hot paths
- Improve concurrency patterns and add benchmarks
Deliver: go test -bench changes, suggested code replacements.
Rust Deep Review
Thoroughly audit Rust project for:
- Unsafe blocks and FFI safety
- Lifetime annotation issues and unnecessary clones
- Compile-time speedups (feature gating, workspace layout)
- Crate dependency review for bloat
Deliver: suggested code edits and cargo bench targets.
Java Deep Review
Thoroughly audit Java/Spring project for:
- Transaction boundary issues
- Thread pool and resource management
- Serialization vulnerabilities
- JUnit test isolation and integration stability
C# Deep Review
Thoroughly audit C#/.NET project for:
- Async/await misuse leading to thread starvation
- IDisposable usage and resource cleanup
- LINQ performance in hot paths
- DI lifecycle mismatches (Transient vs Scoped vs Singleton)
Swift Deep Review
Thoroughly audit Swift project for:
- Memory cycles and retain cycles in closures
- Swift concurrency misuse and actor boundaries
- Excessive UI recomputation in SwiftUI
- Performance hotspots in rendering loops
⸻
🚀 19. Project Bootstrap Prompt (Universal)
Generic Prompt
Bootstrap a new project using the given language/framework with best-practice scaffolding.
Deliver:
- Directory structure
- Starter README.md with setup and test commands
- .gitignore, .env.example, Makefile/package.json
- Basic CI pipeline (GitHub Actions template)
- AGENTS.md seed describing the pipeline
- Basic unit test and linter config
- Example "Hello world" endpoint or CLI
- Instructions to extend templates and best practices
Examples
Python / FastAPI:
- src/
- app/
- main.py (FastAPI app)
- api/
- models/
- services/
- tests/
- pyproject.toml or requirements.txt
- Makefile with install, run, test, lint
- GitHub Actions: Python version matrix [3.11, 3.10, 3.9]
- AGENTS.md: describe pytest, black, mypy, alembic migrations
Node / Express:
- src/
- index.js
- routes/
- controllers/
- services/
- package.json with scripts: start, dev, test, lint
- ESLint + Prettier config
- GitHub Actions CI
React / Next.js:
- pages/ or app/
- components/
- styles/
- package.json scripts for dev, build, start, lint, test
- Vercel/GH Actions example for deploy preview
Go:
- cmd/app/main.go
- internal/
- pkg/
- Makefile with build, test, lint
- GitHub Actions for go versions
⸻
🧾 20. Full Repository Audit & Regeneration
Generic Prompt
Perform a complete audit and regeneration of documentation, agents, and hygiene files for this repository.
Deliver:
- Regenerated README.md
- AGENTS.md
- CONTRIBUTING.md
- CHANGELOG.md
- .gitignore
- .env.example
- Makefile or package.json scripts
- API docs (OpenAPI/Swagger or Markdown)
- Test coverage report and suggested remediation patches
- Security checklist and secrets remediation
- Dependency upgrade plan
Also provide:
- A single ZIP archive manifest (list of files changed/created)
- A migration or rollout plan for deploying updates without downtime
- Verification steps and smoke tests to run after changes
⸻
Appendix — Pocket Prompts (Quick-calls)
These are short, repeatable prompts you can paste into an agent for quick actions.
Make AGENTS.md for Python project
Make a full AGENTS.md for this Python repo. Include Celery, alembic, pytest, pre-commit, Docker Compose, Sentry configs. Provide run commands and troubleshooting steps.
Refactor repository for TypeScript strict mode
Migrate project to TypeScript strict mode: add tsconfig strict settings, convert JS files incrementally, add typing stubs, and produce a list of remaining any types. Provide changes as patches.
Add tests for file X
Create unit tests for file path/to/file. Use the repo's testing framework and mocking strategy. Cover edge cases and include a conftest/fixture for repeated setup.
Create PR with changes
Create a set of changes for X and produce a ready-to-open PR description: summary, why, testing steps, risk, and rollback.
⸻
Usage Notes & Best Practices
Always run the provided tests and linters locally before opening a PR.
When the agent suggests a refactor: insist on tests for the changed surface area.
Version control: use topic branches for any large overhaul and protect main with CI gates.
Start small: prefer incremental typed/behavioral changes rather than giant sweeping refactors.
For security-sensitive repos: never accept automated commits that change secrets or rotate keys without human sign-off.