Audience: Buyers, Sellers, Support
Prerequisites: Active order; at least one milestone/deliverable in scope
Outcomes: A cleanly scoped dispute with objective, verifiable evidence
Open (UI)
Orders → Dispute → Open → choose a reason (`non_delivery`, `late_delivery`, `poor_quality`, `payment_issue`) → describe what happened, what was expected, and which criteria were missed → submit → upload evidence.
Open (API)
```shell
curl -X POST "$API_BASE/api/disputes" -b cookies.txt \
  -H 'Content-Type: application/json' \
  -d '{
    "orderId": "ord_123",
    "reason": "poor_quality",
    "details": "Typography and spacing differ from the signed spec; see annotated PDF."
  }'
# → { "disputeId":"dsp_456", "status":"Open" }
```
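The response can be parsed to capture the new dispute ID for the evidence-upload step. A minimal sketch using `sed`, assuming only the response shape shown above (swap in `jq -r '.disputeId'` if `jq` is available):

```shell
# Extract disputeId from the documented open-dispute response so the
# evidence upload can reference it. The response literal below is the
# example shown in this guide, not a live API call.
response='{ "disputeId":"dsp_456", "status":"Open" }'
dispute_id=$(printf '%s' "$response" | sed -n 's/.*"disputeId":"\([^"]*\)".*/\1/p')
echo "$dispute_id"   # → dsp_456
```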
Evidence standards (what “good” looks like)
Contract excerpts: exact clause/acceptance criteria (annotated PDF/page refs)
Deliverable artifacts: submitted files or diffs vs prior versions
Objective measures: test reports (Lighthouse, CI), screenshots with timestamps, logs
Timeline: submission dates, due date, in-app conversation links (not external chats)
Impact statement: a succinct description of the business impact and cost
Upload (API)
```shell
curl -X POST "$API_BASE/api/disputes/dsp_456/evidence" -b cookies.txt \
  -F 'files=@contract_excerpt.pdf' \
  -F 'files=@annotated_mockup.png' \
  -F 'files=@lighthouse_report.json' \
  -F 'notes=Sections 3.1 and 4.2 of the contract; M1 acceptance criteria unmet.'
```
Standards & limits
Filenames: `ord123-m1-criteria-1.pdf` (short, numbered)
Defaults: 10 MB/file, 6 files/request, AV scan + MIME checks
If redacting PII, state what was redacted
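The documented defaults (10 MB per file, server-side MIME checks) can be verified locally before uploading, which avoids round-trip rejections. A sketch, assuming the limits above; `MAX_BYTES` and `preflight` are local names, not API fields:

```shell
# Preflight a file against the documented 10 MB limit and report its
# MIME type (the server runs its own AV + MIME checks regardless).
MAX_BYTES=$((10 * 1024 * 1024))

preflight() {
  f="$1"
  size=$(wc -c < "$f")
  mime=$(file --brief --mime-type "$f" 2>/dev/null)
  if [ "$size" -gt "$MAX_BYTES" ]; then
    echo "REJECT $f: $size bytes exceeds 10 MB"
  else
    echo "OK $f (${mime:-unknown}, $size bytes)"
  fi
}

# Example with a small placeholder file, which passes the size check:
printf 'sample' > /tmp/ord123-m1-criteria-1.pdf
preflight /tmp/ord123-m1-criteria-1.pdf
```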
Anti-patterns
External links to mutable documents; upload the artifact itself instead
Wall-of-text rants; keep facts high, emotion low
QA checklist
Evidence appears under correct party/org; AV/MIME checks enforced
Timeline shows timestamps + actors; attachments downloadable
Runbook: “File rejected”
Convert/compress; verify MIME; split archives; re-upload with checksums for large sets.
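For large evidence sets, the checksums mentioned above can be produced as a manifest that ships alongside the files and lets the other side verify integrity after re-upload. A sketch; the directory and filenames are placeholders (`sha256sum` is standard on Linux; use `shasum -a 256` on macOS):

```shell
# Build a checksum manifest for an evidence set, then verify it.
mkdir -p /tmp/evidence
printf 'contract excerpt' > /tmp/evidence/ord123-m1-criteria-1.pdf
printf 'test report'      > /tmp/evidence/ord123-m1-criteria-2.json

cd /tmp/evidence
sha256sum *.pdf *.json > manifest.sha256
cat manifest.sha256

# After re-upload, confirm the set arrived intact:
sha256sum --check manifest.sha256
```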