DocCheck
Most project documentation starts life accurate…
and then quietly drifts out of sync.
Dependencies change.
Tests move.
Practices evolve.
The docs don’t.
DocCheck is a CLI tool designed to tackle that problem head-on by generating and validating living documentation — documentation that reflects how a project actually works today, not how we wish it worked.
The idea started (unsurprisingly) with vibe-coding projects using Claude Code. But the approach is broader than that:
- Why not apply the same idea to README.md files?
- Why not replace reams of Confluence pages with a single accurate, validated page?
- What if documentation could almost look after itself?
The problem DocCheck is trying to solve
Traditional documentation is aspirational.
It describes the intended architecture, the ideal workflow, and the best practices the team agreed six months ago.
Reality, meanwhile, keeps moving.
DocCheck is built around a simple but opinionated idea:
Documentation should reflect reality, not aspiration.
Instead of asking humans to remember to update docs, DocCheck inspects the project itself and validates that what’s written down still matches what’s actually happening in the codebase.
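As a sketch of what that kind of validation could look like: the snippet below compares dependencies named in a doc against a requirements.txt-style manifest. The "Dependencies:" doc convention and the parsing are illustrative assumptions, not DocCheck's actual implementation.

```python
def documented_dependencies(doc_text: str) -> set[str]:
    """Extract dependency names from a 'Dependencies:' line in the docs.
    (Hypothetical doc convention, used only for this sketch.)"""
    for line in doc_text.splitlines():
        if line.lower().startswith("dependencies:"):
            return {d.strip() for d in line.split(":", 1)[1].split(",") if d.strip()}
    return set()

def actual_dependencies(requirements: str) -> set[str]:
    """Read package names from requirements.txt-style text."""
    return {
        line.split("==")[0].strip()
        for line in requirements.splitlines()
        if line.strip() and not line.startswith("#")
    }

def drift(doc_text: str, requirements: str) -> dict[str, set[str]]:
    """Report packages the project uses but the docs omit, and vice versa."""
    doc, real = documented_dependencies(doc_text), actual_dependencies(requirements)
    return {"undocumented": real - doc, "stale": doc - real}
```

Running `drift("Dependencies: requests, flask", "requests==2.31\nhttpx==0.27\n")` would flag httpx as undocumented and flask as stale.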
What DocCheck is
At its core, DocCheck is a single-purpose CLI tool that:
- Bootstraps documentation from an existing project
- Detects when documented practices diverge from reality
- Helps keep documentation accurate as the project evolves
It’s designed to be useful for both humans and AI assistants, acting as a reliable, up-to-date source of project context.
By default, DocCheck can generate and validate a CLAUDE.md file, but the same approach can be applied to other formats such as README.md or project-specific documentation files.
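A minimal sketch of what that bootstrapping step might look like. The extension-to-language mapping and the skeleton sections are assumptions for illustration, not DocCheck's actual scan logic or output format.

```python
from collections import Counter
from pathlib import Path

# Hypothetical mapping, used only for this sketch.
LANGUAGES = {".py": "Python", ".ts": "TypeScript", ".go": "Go", ".rs": "Rust"}

def detect_stack(root: Path) -> list[str]:
    """Guess the tech stack from file extensions, most common first."""
    counts = Counter(
        LANGUAGES[p.suffix] for p in root.rglob("*") if p.suffix in LANGUAGES
    )
    return [lang for lang, _ in counts.most_common()]

def bootstrap(root: Path, name: str) -> str:
    """Generate a skeleton CLAUDE.md from a quick scan of the project."""
    stack = ", ".join(detect_stack(root)) or "unknown"
    return (
        f"# {name}\n\n"
        f"## Tech stack\n{stack}\n\n"
        "## Architecture\n<!-- TODO: fill in or validate -->\n"
    )
```

The point of the sketch is the shape of the workflow: scan first, then emit a document whose claims were derived from the project rather than typed in from memory.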
Project goals
Immediate goals
- Generate an initial documentation file by scanning an existing project
- Detect and report documentation drift
- Dogfood the tool on itself and across my existing project portfolio
Longer-term ideas
- A web-based interview tool for initial generation
- CI/CD integration for automated validation
- Lightweight templates for different project types
- Potential expansion beyond Claude-specific use cases
Nothing fancy up front — prove the value first.
What “good” documentation looks like
Based on experimentation so far, useful living documentation typically includes:
- Project overview and current status
- Tech stack and key dependencies
- Architecture and structural decisions
- Development practices and workflows
- Domain knowledge and terminology
- AI-specific guidance and gotchas
- Quality gates and expectations
DocCheck doesn’t enforce this structure — it infers and validates it based on how the project actually behaves.
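One way such a validation pass could work is to check which of those areas a doc actually covers. The expected topic names below simply mirror the list above and are an assumption about how matching might be done:

```python
# Hypothetical expected topics, mirroring the list above.
EXPECTED = [
    "Project overview", "Tech stack", "Architecture",
    "Development practices", "Domain knowledge",
    "AI-specific guidance", "Quality gates",
]

def missing_sections(doc_text: str) -> list[str]:
    """Return expected topics that no markdown heading in the doc mentions."""
    headings = [
        line.lstrip("# ").lower()
        for line in doc_text.splitlines()
        if line.startswith("#")
    ]
    return [
        topic for topic in EXPECTED
        if not any(topic.lower() in h for h in headings)
    ]
```

A report of missing topics is a prompt, not a verdict: a project might legitimately skip a section, which is why inferring rather than enforcing matters.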
How success is measured
DocCheck succeeds if:
- It can bootstrap and validate itself
- It works across my existing projects (BlogLog, Jironaut, Worthsmith, and the wider Curious Coach Tools suite)
- It meaningfully reduces context-switching cost
- The documentation it produces is trusted rather than ignored
Meta note
The initial documentation content used to seed DocCheck has been created through conversation analysis rather than project scanning. It represents the ideal that the tool itself should eventually be able to generate automatically by inspecting a real codebase.