Codex Documentation Review and Build Commands
description: Build high-impact documentation grounded in best practices
argument-hint: TYPE=<tutorial|how-to|reference|explanation>

Use this when drafting or overhauling docs. It works for net-new guides, branch-specific updates, or sweeping improvements. Pass TYPE if you already know the Diataxis bucket.

1. Define intent & success

  • Audience: skill level, prerequisites, and the “job to be done.”
  • Outcome: what readers can do immediately afterward (e.g., “deploy your first webhook”).
  • Funnel placement (Matt Palmer Rule 2): decide how the doc answers What/Why, Quickstart, and Next steps.

2. Outline with Diataxis + structure (Matt Palmer Rules 2-3)

  • Tutorials = end-to-end build; How-tos = focused tasks; Reference = exhaustive facts; Explanations = concepts/why.
  • Sketch navigation: headings as informative sentences, table of contents, and cross-links to landing/next guides.
  • Ensure each section opens with the takeaway sentence (OpenAI “takeaways up front”).

3. Draft for skimmability & clarity (OpenAI guidelines)

  • Short paragraphs, topic-first sentences, bullets/tables for option lists, bold only the key command/result.
  • Use imperative voice, avoid left-branching sentences, remove vague pronouns (“this” → concrete noun).
  • Favor precise terms (“input”, “max token limit”), avoid jargon or explain it inline.

4. Design the experience for humans and agents (Matt Palmer Rules 1 & 4-8)

  • Keep critical info in text (not just images). Provide copy-ready commands and self-contained code.
  • Embed automation hooks: mention or add llms.txt, MCP endpoints, or repo templates if relevant (see the llms.txt sketch after this list).
  • Add “Next steps” or decision tables so agents and humans can follow recommended paths.
  • Note background tasks: link/lint checks, Vale rules, or CI required for this doc.
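
As a rough illustration of the automation-hook idea, the sketch below builds a plain-text llms.txt index from a local docs folder so agents can discover pages. The `docs/` layout, the site URL, and the output format are assumptions for this example, not a Fern or repo convention.

```python
# Sketch: generate a minimal llms.txt-style index so agents can discover pages.
# Assumptions: markdown pages live under docs/, the published site root is
# https://example.com/docs, and the first "# " heading is the page title.
from pathlib import Path

DOCS_DIR = Path("docs")                      # assumed local docs folder
SITE_ROOT = "https://example.com/docs"       # assumed published base URL

def first_heading(md_path: Path) -> str:
    """Return the first level-1 heading, falling back to the file name."""
    for line in md_path.read_text(encoding="utf-8").splitlines():
        if line.startswith("# "):
            return line[2:].strip()
    return md_path.stem.replace("-", " ").title()

lines = ["# Documentation index for LLM agents", ""]
for page in sorted(DOCS_DIR.rglob("*.md")):
    rel = page.relative_to(DOCS_DIR).with_suffix("")
    lines.append(f"- [{first_heading(page)}]({SITE_ROOT}/{rel.as_posix()})")

Path("llms.txt").write_text("\n".join(lines) + "\n", encoding="utf-8")
print(f"Wrote llms.txt with {len(lines) - 2} entries")
```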

5. Use Fern components intentionally

  • Structure/navigation: Accordion, Step, Tab, Sticky table, Anchor.
  • Emphasis/context: Callout, Aside, Badge, Tooltip, Card.
  • Actions/resources: Button, Copy, Download, Files, Frame, Icon.
  • API depth: Code block, Endpoint request/response/schema, Runnable endpoint, Parameter field.
  • Pick components that reduce cognitive load; mention in the draft where each should appear if authoring outside Fern.

6. Content completeness & safety

  • Cover prerequisites (access, keys, roles), setup, common failure modes, and rollback steps.
  • Ensure code samples follow safe patterns (no hard-coded secrets, least-privilege scopes); see the sample sketch after this list.
  • Include performance/limit callouts, pricing or quota notes, and links to reference tables.
  • When reusing snippets, verify versions and update dates to prevent context rot.
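
A minimal sketch of what a doc-friendly, copy-ready sample can look like: no inlined secret, an explicit failure when the key is missing, and the narrowest scope requested. The environment variable name, endpoint URL, and scope are placeholders, not a specific product’s API.

```python
# Sketch of a safe doc sample: the key comes from the environment, the script
# fails loudly without it, and it requests only the scope it needs.
import os
import requests

api_key = os.environ.get("WEBHOOK_API_KEY")  # never hard-code the key in docs
if not api_key:
    raise SystemExit("Set WEBHOOK_API_KEY before running this sample.")

response = requests.post(
    "https://api.example.com/v1/webhooks",        # hypothetical endpoint
    headers={"Authorization": f"Bearer {api_key}"},
    json={"url": "https://example.com/hooks/orders", "scope": "orders:read"},
    timeout=10,
)
response.raise_for_status()
print("Webhook created:", response.json().get("id"))
```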

7. Validation checklist

  • Run or dry-run every command/code block; capture actual outputs or annotate expectations.
  • Test copy buttons, tabs, runnable endpoints, and internal/external links; a link-check sketch follows this list.
  • If the doc ships in a branch, confirm navigation (docs.yml, versions, tabs) and CI previews pass.
  • Record outstanding verifications the reviewer must perform (e.g., “needs screenshot once feature flag enabled”).
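
One way to make the link check repeatable is a small script like the sketch below, which pulls external URLs out of local markdown and reports failures. The `docs/` path and the markdown-only scope are assumptions; anchors, copy buttons, and CI previews still need the platform’s own tooling.

```python
# Sketch: report broken external links in local markdown before review.
import re
from pathlib import Path
import requests

URL_PATTERN = re.compile(r"\((https?://[^)\s]+)\)")  # markdown link targets

failures = []
for page in Path("docs").rglob("*.md"):
    for url in URL_PATTERN.findall(page.read_text(encoding="utf-8")):
        try:
            status = requests.head(url, allow_redirects=True, timeout=10).status_code
        except requests.RequestException as exc:
            failures.append((page, url, str(exc)))
            continue
        if status >= 400:
            failures.append((page, url, f"HTTP {status}"))

for page, url, reason in failures:
    print(f"{page}: {url} -> {reason}")
print(f"{len(failures)} broken link(s) found")
```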

8. Handoff

Deliver:

  1. Draft text (or updated file) with component annotations.
  2. Section-by-section status (done, needs SME review, blocked).
  3. Suggested reviewers: SME, tech writer, agent/automation owner.
---

description: Review documentation changes for structure, clarity, and agent-readiness

Use this prompt when you need to QA documentation work in a branch or across the repo. Default to the git diff against main, but if $FILES is provided, review only those paths.

1. Confirm scope & setup

  • Identify doc type (landing, quickstart, tutorial, API reference, conceptual) and intended audience.
  • Map the doc to the Diataxis quadrant (Tutorial / How-to / Reference / Explanation). Flag misaligned content.
  • Note any Fern, Mintlify, Sphinx, or other docs-framework usage so you can enforce component standards.

2. Structural & navigational review (Matt Palmer rules 1-3)

  • Funnel check: does the doc clearly answer What is this?, How do I start fast?, Where do I go next?
  • Ensure logical headings, working ToC, and forward/backward links. No critical info trapped in images.
  • Verify the doc fits the surrounding navigation (products, versions, tabs) and keeps outdated pages removed/redirected.
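
If the docs platform keeps navigation in a docs.yml, a quick cross-check like the sketch below can catch pages that never made it into the nav. The file layout and the simple “path” keys are assumptions about that config, not the platform’s exact schema; adjust the lookup to match the real file.

```python
# Sketch: flag markdown pages that are missing from docs.yml navigation.
# Assumes nav entries reference pages via "path: ..." values.
from pathlib import Path
import yaml  # PyYAML

def collect_paths(node, found):
    """Recursively gather every 'path' value from the nav structure."""
    if isinstance(node, dict):
        if "path" in node:
            found.add(Path(node["path"]).as_posix())
        for value in node.values():
            collect_paths(value, found)
    elif isinstance(node, list):
        for item in node:
            collect_paths(item, found)

nav_paths: set[str] = set()
collect_paths(yaml.safe_load(Path("docs.yml").read_text(encoding="utf-8")), nav_paths)

for page in sorted(Path("docs").rglob("*.md*")):  # .md and .mdx
    if page.as_posix() not in nav_paths:
        print(f"Not in navigation: {page}")
```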

3. Writing-quality audit (OpenAI “What makes documentation good”)

  • Skimmability: informative titles, short paragraphs, topic sentences up front, bullets/tables where helpful, bold sparingly for emphasis.
  • Clarity: simple sentences, no left-branching constructions, minimal pronouns like “this/that” without nouns, consistent terminology/casing.
  • Empathy: avoid prescriptive “you probably…”, prefer precise terms (“input” vs “prompt”), keep examples self-contained and dependency-light.

4. Content coverage & accuracy (Matt Palmer rules 4-8 + OpenAI “Be broadly helpful”)

  • Ensure tutorials/how-tos show end-to-end success paths, surface common pitfalls, and never teach bad habits (e.g., leaking API keys); a secret-scan sketch follows this list.
  • Confirm code/config samples run as-is, use secure defaults, and highlight constraints (rates, limits, auth).
  • Validate that quickstarts truly run in minutes (one command when possible) and that “Next steps” or related links exist.
  • Check for outdated data, broken links, or missing screenshots. If the doc references automation/CI, confirm scripts exist or note gaps.
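
To make the “never teach bad habits” check concrete, the sketch below scans doc samples for obvious inlined credentials. The patterns are illustrative only; a real review should also run a dedicated secret scanner in CI.

```python
# Sketch: flag likely hard-coded credentials inside doc code samples.
import re
from pathlib import Path

SUSPECT_PATTERNS = [
    re.compile(r"(api[_-]?key|secret|token|password)\s*[:=]\s*['\"][^'\"]{8,}['\"]", re.I),
    re.compile(r"Bearer\s+[A-Za-z0-9._-]{20,}"),
]

for page in sorted(Path("docs").rglob("*.md*")):
    for lineno, line in enumerate(page.read_text(encoding="utf-8").splitlines(), 1):
        for pattern in SUSPECT_PATTERNS:
            if pattern.search(line):
                print(f"{page}:{lineno}: possible hard-coded credential -> {line.strip()}")
```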

5. Agent- and Fern-readiness

  • Docs targeting agents should expose structured metadata: llms.txt, schema snippets, MCP hooks, etc., when applicable.
  • For Fern pages, confirm component usage follows intent:
    • Accordion, Step, Tab, Sticky table for navigation/structure
    • Callout, Aside, Badge, Tooltip for emphasis and context
    • Card, Button, Copy, Download, Files, Frame, Icon for UX polish
    • Code block, Endpoint request/response/schema, Runnable endpoint, Parameter field for API accuracy
  • Flag opportunities where a component would improve scanning or interactivity.

6. Output format and handoff

Produce and deliver:

  1. Blocking issues – numbered list with file path + line (if known) and required fix.
  2. Non-blocking polish – bullets for style nits or optional improvements.
  3. Validation notes – mention checks performed (links, build, lint) or what still needs verification.

Keep the tone direct and actionable, and reference the relevant guideline (OpenAI, Matt Palmer rule #, Fern component) when calling out issues.
