Background #
Like most developers working with coding agents, I've found that some processes produce better results than others. After discussing workflows with other developers, I've discovered a fascinating variety of approaches.
I recently came across a post on Simon Willison's blog referencing Peter Steinberger, in which Peter discusses the evolution of his workflow: he went from handwriting a fairly rigid spec to riffing with Claude to come up with a plan.
My workflow is similar to Peter's, but I'm trying to go back to writing formal specs that live alongside the code. I modeled this on Python Enhancement Proposals (PEPs).
Process #
I maintain a specs/ folder where ALL features live as markdown files (git-ignored for shared projects). I use separate specialized agents: spec writer, spec reviewer, coding agent, code reviewer, bug filer, etc.
Key practice: I run each agent in a separate session, closing it immediately after each task before handing off to the next agent. This produces better results, I believe because claims about completeness from the previous session are gone, allowing the new agent to focus purely on finding improvements.
Novel aspect: Each specification lives in the codebase as a deliverable alongside the code. This keeps all feature context easily available, makes work easy to grade against original intentions, and preserves design decisions and process.
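Concretely, a project tree ends up looking something like this (tests/ and docs/ are the directories referenced by the /code command below; src/ and the QEP file names are placeholders):

```
my-project/
├── src/
├── tests/
├── docs/
└── specs/
    ├── qep-001-first-feature.md
    └── qep-002-another-feature.md
```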
Slash commands #
/spec #
Create a new QEP (Quest Enhancement Proposal) in specs/ directory.
Goals:
$1
Steps:
1. [ ] Find the next available QEP number by checking specs/qep-*.md files (see the sketch after this list)
2. [ ] Create specs/qep-NNN-slug.md with proper template
3. [ ] Fill in Title, Number, Status (Draft), Author, Created date
4. [ ] Add Motivation, Proposal, and Rationale sections based on description
5. [ ] Include Examples section with Quest code samples
6. [ ] Add Implementation Notes section
7. [ ] Add References section if applicable
8. [ ] Open the file for user to review and edit
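Step 1 is mechanical enough to sketch. This is only an illustration of the lookup the agent performs, not part of the command itself; the zero-padded three-digit numbering and the helper name are assumptions:

```python
import re
from pathlib import Path

def next_qep_number(specs_dir: str = "specs") -> int:
    """Find the highest existing QEP number in specs/qep-*.md and add one."""
    numbers = []
    for path in Path(specs_dir).glob("qep-*.md"):
        match = re.match(r"qep-(\d+)", path.stem)
        if match:
            numbers.append(int(match.group(1)))
    return max(numbers, default=0) + 1

# With specs/qep-001-foo.md and specs/qep-002-bar.md present, this returns 3,
# so the new file would be specs/qep-003-<slug>.md.
```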
Template structure:
- Title: QEP-NNN: Feature Name
- Metadata: Number, Status, Author, Created
- Sections: Motivation, Proposal, Rationale, Examples, Implementation Notes, References
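For reference, a freshly created QEP looks roughly like this (number, author, and date are placeholders, and the exact formatting of the metadata block is up to you):

```markdown
# QEP-NNN: Feature Name

- Number: NNN
- Status: Draft
- Author: Your Name
- Created: YYYY-MM-DD

## Motivation

## Proposal

## Rationale

## Examples

## Implementation Notes

## References
```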
/spec-review #
Review QEP specification $1 in specs/ for completeness and quality.
Review checklist:
1. [ ] Read the QEP-$1 document from specs/
2. [ ] Verify proper formatting and metadata (Number, Status, Author, Created); see the sketch after this command
3. [ ] Check Motivation section clearly explains the problem
4. [ ] Verify Proposal section has clear, concrete design
5. [ ] Review Rationale section explains design decisions
6. [ ] Ensure Examples section has working Quest code samples
7. [ ] Check Implementation Notes for technical feasibility
8. [ ] Verify consistency with existing Quest language design
9. [ ] Check for conflicts with other QEPs or existing features
10. [ ] Suggest improvements or clarifications needed
11. [ ] Recommend status change if appropriate (Draft → Accepted → Implemented)
Provide detailed feedback on:
- Clarity and completeness
- Technical soundness
- Backward compatibility concerns
- Edge cases or gotchas
- Documentation needs
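The metadata check in step 2 can also be expressed as a quick sketch. It assumes the metadata appears as Key: Value lines, as in the /spec template; the function name and the strictness of the checks are my own:

```python
from pathlib import Path

REQUIRED_FIELDS = {"Number", "Status", "Author", "Created"}
VALID_STATUSES = {"Draft", "Accepted", "Implemented"}

def check_metadata(qep_path: str) -> list[str]:
    """Scan a QEP file for Key: Value lines and report metadata problems."""
    problems = []
    found = {}
    for line in Path(qep_path).read_text().splitlines():
        key, sep, value = line.strip().lstrip("- ").partition(":")
        if sep:
            found[key.strip()] = value.strip()
    for field in sorted(REQUIRED_FIELDS - found.keys()):
        problems.append(f"missing metadata field: {field}")
    status = found.get("Status")
    if status and status not in VALID_STATUSES:
        problems.append(f"unexpected status: {status}")
    return problems
```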
/code #
Begin coding $1
1. [ ] Identify scope and nature of the problem
2. [ ] Run the full test suite. If there are failures, halt and ask for them to be resolved before continuing.
3. [ ] Write initial implementation
4. [ ] Write comprehensive tests in tests/ and run the individual test file to confirm implementation
5. [ ] Run the full test suite to ensure no new bugs were introduced.
6. [ ] Update docs/ as necessary
7. [ ] Update README.md / CLAUDE.md if making a fundamental language change (not necessary for bug fixes)
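Putting it all together, a feature moves through the commands roughly like this, each in a fresh session as described above (the QEP number and the feature itself are hypothetical):

```
# Session 1: draft the spec
/spec Add string interpolation to Quest

# Session 2: review it
/spec-review 012

# Session 3: implement it
/code qep-012-string-interpolation
```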