jie-worldstatelabs/harness-webapp (public)
End-to-end webapp build flow with a planner / generator / evaluator triangle and sprint-contract loops — operationalizing Anthropic's harness-design patterns.
/stagent:start --flow=cloud://jie-worldstatelabs/harness-webapp <task_description>
(Paste in Claude Code and replace <task_description> with your brief.)
spec: inline · interruptible · transitions: approved → sprint-plan
Runtime config (canonical): workflow.json → stages.spec
Purpose: Planner. Expand the user's brief into an approved product spec — features, user flows, tech stack, AI integration points, and the success bar — that the downstream sprint-plan subagent can decompose deterministically.
Output artifact: write to the absolute path provided in your I/O context
Valid results this stage writes: pending (spec drafted, awaiting user approval), approved (user has explicitly confirmed)
This is an interruptible stage — the stop hook allows natural pauses for Q&A.
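The canonical runtime config for this stage lives in workflow.json under stages.spec. The exact schema is not shown on this page; a hypothetical sketch, assuming the fields mirror the properties stated here (inline mode, interruptibility, valid results, and the approved → sprint-plan transition), might look like:

```json
{
  "stages": {
    "spec": {
      "mode": "inline",
      "interruptible": true,
      "results": ["pending", "approved"],
      "transitions": { "approved": "sprint-plan" }
    }
  }
}
```

All field names above are assumptions for illustration; consult the repo's workflow.json for the real schema.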
First, create state.md for the current epoch with result: pending and a placeholder body (e.g. "Spec draft in progress."). This is required so the stop hook knows the stage is in-progress.

Next, read the setup_context run_file at the absolute path shown in your I/O context. It contains the original brief / mode hint. If it is empty or {}, treat the brief as missing and ask the user.

You are the Planner in a Planner → Generator → Evaluator triangle. Your single artifact must contain enough signal that the Generator (sprint-build) and the Evaluator (sprint-eval, final-eval) can do their jobs without re-interviewing the user.
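A minimal placeholder artifact, assuming the state.md frontmatter carries the result: field described above (the full frontmatter schema is not shown on this page), could look like:

```markdown
---
result: pending
---
Spec draft in progress.
```

On approval, the same file is overwritten with the final spec body and result: approved.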
If the brief is thin, ask up to 5 short clarifying questions, one at a time. Useful axes:
Stop asking once you can draft a coherent spec.
Present the spec to the user. Then iterate until they explicitly approve. Suggested shape:
# Product Spec — <Working title>
## Problem & Outcome
<1-2 sentences — who is this for, what changes for them>
## Tech Stack
- Frontend: <framework / language>
- Backend: <runtime / framework>
- Data: <database, file storage>
- AI: <model(s), where they're called from>
- Deployment: <target>
## Features (must-have)
1. <feature> — <one-line description>
2. ...
## Features (nice-to-have)
- <feature>
## User Flows
1. <flow name>: <step → step → step>
2. ...
## AI Integration Points
- <feature> → <model> → <prompt contract / output schema>
## Success Bar (drives final-eval)
- Functional: <every must-have feature works end-to-end>
- Performance: <e.g. p95 page load < 2s>
- Accessibility: <e.g. WCAG AA on primary flows>
- Browser/device matrix: <list>
## Out of Scope
- <explicit exclusions>

Ask: "Does this spec look right? Any features, flows, AI points, or success criteria to change?" Iterate until the user explicitly approves.
Each revision: overwrite the artifact body, keep result: pending.
Once the user explicitly approves, overwrite the artifact one last time with the final spec body and change the frontmatter to result: approved.
The main loop reads the artifact's result: and calls update-status.sh to advance — do NOT call it yourself from this stage file.
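The hand-off above can be sketched as follows. This is a hypothetical illustration of how a main loop might read the artifact's result: field out of the YAML frontmatter; the real loop and the arguments update-status.sh takes are not shown on this page:

```shell
#!/bin/sh
# Hypothetical state.md — frontmatter shape assumed from this page.
artifact="state.md"
cat > "$artifact" <<'EOF'
---
result: approved
---
# Product Spec — Demo
EOF

# Print only the first frontmatter block, then pull out the result: value.
result=$(sed -n '/^---$/,/^---$/p' "$artifact" | sed -n 's/^result: *//p')
echo "$result"

# The main loop would then advance the state machine, e.g.:
#   ./update-status.sh "$result"   # exact arguments are an assumption
```

Note the stage file itself never calls update-status.sh; only the main loop does, based on the value extracted here.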