Project Definition Pattern
Also known as:
Clarifying project vision, success criteria, and next physical action before work begins prevents rework and keeps execution aligned.
[!NOTE] Confidence Rating: ★★★ (Established) This pattern draws on Agile Project Management and Design Thinking.
Section 1: Context
Work begins in a state of half-clarity. A corporate team inherits a mandate without understanding what “success” looks like in measurable terms. A government agency receives budget to solve a problem nobody has defined. An activist campaign gains energy around a cause but no one has named the threshold that marks victory. A software team receives a feature request and immediately opens a sprint without asking what done actually means.
In each case, the system is fragmenting—not visibly broken, but vital energy leaking outward into rework, misalignment, and stakeholder friction. The commons is intact but its stewards are not moving as one organism. Project definition is the pattern that catches this fragmentation before the decay accelerates. It surfaces the assumption-gaps that will otherwise compound through weeks of execution. It is not about perfection or lengthy planning; it is about reaching minimum viable clarity—enough shared understanding that the next action is obvious and the finish line is visible.
This pattern works in domains where action costs resources (time, money, attention, social capital). It prevents the slow hemorrhage of a system working at cross-purposes with itself.
Section 2: Problem
The core conflict is Action vs. Clarity.
One force says: Begin immediately. Start experimenting, learning, shipping. Every hour spent defining is an hour not spent doing. Clarity emerges from action, not from planning tables.
The other force says: Clarify first. Misalignment compounds. A team working toward different finish lines burns fuel for nothing. A project without success criteria cannot know if it succeeded.
When unresolved, the tension produces real breakage:
Projects without definition consume resources without generating measurable value. A corporate digital transformation initiative spends $2M and produces deliverables that stakeholders don’t use because nobody asked them what success looked like. A government program distributes funds but cannot prove it solved the problem it was funded to address. An activist campaign mobilises hundreds of people around competing visions of what victory means, fragmenting energy. A software team ships features that don’t address the user’s actual problem because “requirements” were an email thread, not a shared frame.
But over-definition kills vitality too. Elaborate project charters that nobody touches again. Success metrics so rigid they prevent adaptation. Planning that delays action so long the moment passes. This is why the tension matters: definition must be just enough, just in time—enough to prevent rework, not so much that it becomes theatre.
The core conflict: Should we act fast or think clearly? The pattern shows these are not opposites.
Section 3: Solution
Therefore, invest 4–8 hours in a structured definition session that locks in vision, success criteria, and one clear next physical action—then protect execution from scope creep with those definitions as guardrails.
The mechanism is simple: shared understanding before distributed work. When all stewards hold the same frame of what done looks like and why it matters, their autonomous decisions align without constant coordination. This is composability at the cognitive level.
In living systems terms, you are establishing the root network before the plant grows. The roots do not determine the shape the plant will take—the plant will adapt to soil, light, season. But without roots, the plant has nowhere to draw nourishment. Project definition is that root system.
Design Thinking provides the shape: Frame the problem genuinely (what is actually being solved, for whom, why now?), not the assumed solution. Agile provides the tempo: Definition happens before work begins, but it is not waterfall documentation—it is a shared artifact the team will live inside and adjust as learning accelerates.
The shift: from “we will figure it out as we go” (which fragments into competing interpretations) and from “we will plan everything perfectly” (which locks you into assumptions that break on contact with reality) to “we will clarify enough to move with coherence, then execute and adapt.”
This is how you prevent rework without preventing learning. The vision and success criteria are fixed; the path to them is not. This is the pattern’s vital contribution: it keeps the commons’ stewards moving toward the same horizon even as they navigate different terrain.
Section 4: Implementation
1. Convene the steward circle (4–8 people, 4 hours). Gather those with decision authority, domain knowledge, and skin in the outcome. Not everyone—that dilutes clarity. Just enough to represent the full shape of the work. Schedule a single block; fragmenting the session across days produces incoherence.
2. Frame the genuine problem, not the assumed solution. Begin with Why does this project exist now? not What features do we build? Write one paragraph answering: What state of the world are we trying to change? For whom? What would be true if we succeeded? What would be different? This is not marketing language—it is the honest diagnosis. In corporate language: What business outcome are we chasing? In government: What public value are we stewarding? In activist work: What systemic shift are we manifesting? In tech: What user problem are we solving?
3. Name 3–5 success criteria that are measurable and bounded. Not “improve engagement.” Increase weekly active users from 12% to 35% of total registered users within 6 months. Not “better health outcomes.” Reduce wait time from 3 weeks to 5 days for initial screenings in rural clinics. Not “raise awareness.” Achieve 60% recognition of the campaign message among target demographic by month 4. Not “ship on time.” Deliver the authentication module with zero critical bugs in production, deployed by sprint 8.
These criteria become the truth-telling system. They are not wishes. They are the finish line you will actually measure against.
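To make "measurable and bounded" concrete, a criterion can be captured as structured data rather than prose. The sketch below is a hypothetical illustration, not part of the pattern itself; the class and field names (`SuccessCriterion`, `baseline`, `target`, `deadline`) are assumptions.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class SuccessCriterion:
    """One finish-line metric: where we are, where we must be, and by when."""
    metric: str      # what is measured, e.g. "weekly active users (% of registered)"
    baseline: float  # current value
    target: float    # value that counts as success
    deadline: date   # bounded in time, not open-ended

    def is_met(self, observed: float) -> bool:
        # Direction of improvement is inferred from baseline vs. target:
        # a target above the baseline means higher is better, and vice versa.
        if self.target >= self.baseline:
            return observed >= self.target
        return observed <= self.target

# "Increase weekly active users from 12% to 35% of registered users within 6 months"
wau = SuccessCriterion("weekly active users (%)", 12.0, 35.0, date(2026, 6, 30))
print(wau.is_met(36.5))  # True: a measured 36.5% clears the 35% target
```

A criterion that cannot be written this way, with a baseline, a target, and a deadline, is probably a wish rather than a finish line.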
4. Identify three categories of constraint and assumption. What money, time, people, or technology are you certain you have? What are you assuming you will have? What dependencies are external (waiting on approvals, partner orgs, market conditions)? Write these down. They will shift, but this makes the shifts visible.
5. Define the next physical action in specific, unambiguous language. Not “begin planning.” In 48 hours, the research lead interviews five current users about their pain with the existing system. Not “set up infrastructure.” The DevOps engineer spins up staging environment using template X; product owner verifies access by Friday. This action should take 1–5 days. It should be the smallest thing that moves you toward learning something true about your assumptions.
6. Write one problem frame—one document, one page. Include: Why (the genuine problem), What (the vision), How we measure (3–5 success criteria), What we have/assume/depend on, Next action. Commit this to shared space. It becomes your reference when scope creep arrives (and it will).
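The one-page frame can also live as a small structured artifact in shared space, so that scope-creep debates reference one source of truth. A minimal sketch under assumed names (`ProblemFrame`, `check`); nothing here is prescribed by the pattern itself.

```python
from dataclasses import dataclass

@dataclass
class ProblemFrame:
    """One page, one source of truth: the five parts of the definition."""
    why: str                            # the genuine problem, not the assumed solution
    vision: str                         # what state of the world changes if we succeed
    success_criteria: list[str]         # 3-5 measurable, bounded finish lines
    constraints: dict[str, list[str]]   # keys: "have", "assume", "depend_on"
    next_action: str                    # specific, unambiguous, 1-5 days of work

    def check(self) -> list[str]:
        """Flag frame-level gaps before kickoff, not after."""
        issues = []
        if not 3 <= len(self.success_criteria) <= 5:
            issues.append("expected 3-5 success criteria")
        for key in ("have", "assume", "depend_on"):
            if key not in self.constraints:
                issues.append(f"missing constraint category: {key}")
        return issues

frame = ProblemFrame(
    why="Rural patients wait 3 weeks for an initial screening",
    vision="Screenings feel routine, not rationed",
    success_criteria=[
        "Wait time from 3 weeks to 5 days by Q3",
        "Zero clinics reporting a backlog over 10 patients by Q4",
        "80% patient satisfaction on first contact by Q4",
    ],
    constraints={
        "have": ["2 FTE clinical staff"],
        "assume": ["budget renewal in Q2"],
        "depend_on": ["ministry approval"],
    },
    next_action="Research lead interviews five rural patients within 48 hours",
)
print(frame.check())  # []: the frame is complete
```

Committing a file like this to version control gives the team a diffable record of how the frame evolved, which is exactly the visibility step 4 asks for.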
Context-specific callouts:
- Corporate: Frame the problem in business language (revenue, risk mitigation, capability building), but include the human outcome too. Success criteria must be tied to quarterly business reviews so they do not become orphaned. Next action: assign a sponsor and a core team lead; they own the frame.
- Government: Name the public value explicitly. Success criteria must align with the stated agency mission and be auditable. Build in a checkpoint with oversight bodies at the definition stage. Next action: brief your governance structure before full-team kickoff; resolve authority questions upfront.
- Activist: Frame the problem in terms of systemic change, not just campaign activity. Success criteria might include: X% of target population shifts on Y issue or Policy Y is amended by date Z. Engage base leadership in definition; they are stewards too. Next action: test the message with 3 community groups; document their feedback.
- Tech: Frame the problem as the user's job-to-be-done, not a feature. Success criteria must include reliability and performance, not just launch date. Build the definition session into sprint zero. Next action: acceptance criteria are written and reviewed by engineering and product together; the team sketches architecture before code.
Section 5: Consequences
What flourishes:
Execution accelerates because ambiguity vanishes. Teams make decisions locally without constant escalation—they know the finish line. Rework drops because work is aligned to actual success criteria, not interpreted briefs. Stakeholder friction eases; everyone can measure progress against the same frame. New capacity emerges: the team can innovate on how to reach the criteria without renegotiating what the criteria are. In Agile terms, you have created the container for velocity.
Trust strengthens. When stewards see that definition led to aligned action and measurable progress, they believe in the pattern. The next project gets defined faster. The commons develops a shared rhythm: clarity → execution → learning → adjustment.
What risks emerge:
Rigidity. If the definition becomes sacred text instead of a living reference, the system hardens. Conditions change; success criteria that made sense in month one may become obsolete in month four. Watch for teams defending a definition that has been falsified by reality. The pattern degrades from resilient compass to brittle orthodoxy.
Definition without power. If definition creates clarity but stewards lack authority to act on it, the pattern becomes performance—beautiful documents, no movement. This surfaces an ownership problem, not a definition problem, but it will feel like definition failed.
Incomplete steward circle. If key stakeholders are absent from definition, the clarity you generate is incomplete. Hidden stakeholders emerge mid-execution with different success criteria. This is why the right people must be in the room: not everyone, but not too few.
False precision. Success criteria can be gamed: metrics that look clean but measure the wrong thing. A software team ships on the defined date but the feature is fragile. A government program hits its target numbers but ignores the population most in need. This is a deeper pattern—choosing what to measure—but definition makes it visible.
Given the commons assessment (resilience: 3.0, ownership: 3.0), this pattern does not by itself build resilience or distribute ownership. It coordinates around existing structures. If your commons lacks resilient decision-making or true co-ownership, definition will work in service of those gaps rather than healing them.
Section 6: Known Uses
Spotify’s Squad Model (Tech): In the early 2010s, Spotify scaled from dozens to hundreds of engineers. They faced a choice: waterfall coordination or autonomous squads. They chose definition-then-autonomy. Each squad received a mission (not a feature list): Improve search relevance for underheard artists. Success criteria were set at the squad level and refreshed quarterly. Teams were then free to choose technology, process, and implementation. The pattern: tight definition of what problem, loose definition of how to solve it. Result: velocity and coherence coexisted. Squads shipped faster than competitors that carried three times the coordination overhead.
Cape Town’s Water Crisis Response (Government): In 2018, Cape Town faced Day Zero—the day municipal water would run dry. The city government gathered the water department, emergency services, community leaders, and NGOs. In a single intensive session, they defined: What is our actual problem? (not just scarcity, but distribution equity and consumption behavior). What does success look like? Concrete metrics: Reduce daily consumption from 1,200 million to 500 million liters within 100 days. What is our next action? Immediate public communication campaign plus restriction enforcement. This single definition session created alignment across competing interests. When the crisis demanded daily decisions, all parties moved from the same frame. Day Zero was averted. The pattern worked because governance was genuinely shared in that definition moment.
Black Lives Matter Activist Campaigns (Activist): Early BLM campaigns succeeded when they answered, upfront: What change are we manifesting? (policy, narrative shift, organizational accountability). How will we measure it? (policy passed, media coverage shift, stated commitments from institutions). What is the first action? (particular protest, petition, direct action). Campaigns that skipped definition—that began with energy but no clarity on finish—fragmented into competing tactics and burned out faster. Campaigns that locked in definition maintained momentum because participants could orient their individual actions toward a shared vision. This is composability at the activist level: decentralized action, centered purpose.
Section 7: Cognitive Era
In an age of AI-assisted analysis and distributed intelligence, project definition shifts in three ways:
Speed of reframing increases. AI tools can now generate multiple problem frames in hours—user research synthesis, stakeholder perspective mapping, scenario modeling. Practitioners can test whether their definition is robust by having AI play adversary: Here are your success criteria. What edge cases break them? What populations do they ignore? This compresses definition time. The risk: speed becomes a substitute for depth. A definition generated in 2 hours by a tool and a practitioner feels legitimate but may not have absorbed the actual lived context. The pattern still requires human stewardship.
Success criteria become more measurable. Real-time data streams, sensor networks, and automated monitoring mean you can now measure things that were once opaque. A government program can track not just outputs (funds distributed) but outcomes (lives changed). An activist campaign can measure narrative shift through social listening. A tech team can instrument every user interaction. The risk: If it can be measured, should it be? Optimizing for what you can measure often means optimizing away the things that matter most (belonging, dignity, agency). Definition in the AI era must explicitly name what matters but is hard to quantify, then protect those things from optimization.
Steward circles expand or dissolve. Distributed decision-making tools and AI-generated insights mean definition sessions can include more voices without becoming chaos. Or, conversely, definition can happen without human stewardship at all—an algorithm determines the problem and success criteria based on data patterns. This is a risk: definition without democracy. The pattern depends on genuine steward participation. AI can accelerate definition, but it cannot replace the human act of choosing what matters.
The tech context translation becomes more crucial: Software projects ship on time when success criteria are established first. In an era where AI can generate code to spec, the rate-limiting step is not implementation—it is clarity. Teams with tight, well-tested definitions will ship faster. Teams that outsource definition to tools or skip it entirely will ship broken things that technically meet criteria nobody should have been optimizing for.
Section 8: Vitality
Signs of life:
- The team references the definition naturally. In standup, a developer asks: Does this decision move us toward our success criteria or away? Not because it is required, but because the frame is genuinely useful. The definition has become a compass, not a document.
- Scope creep is debated, not accepted. When a new request arrives, the team asks: Is this in scope for our defined success? If not, what does it displace? This is healthy friction. Scope is no longer a passive thing that happens to you.
- Learning shapes adjustment without abandoning the frame. By week three, you discover your assumption about user behavior was wrong. The team does not abandon the criteria; they adjust the approach and learn faster. The frame holds even as the path changes.
- New stewards can join and quickly understand why the work matters. A team member reads the definition and immediately knows the heartbeat of the project. Onboarding is faster because the why is explicit.
Signs of decay:
- The definition is never looked at again. It was written in a workshop, committed to a folder, and is now four months old. Changes accumulate undocumented. Nobody references the success criteria. The document became ritual, not guidance.
- Success criteria are treated as suggestions, not guardrails. Deadlines slip, metrics are reinterpreted retroactively, nobody checks at the end whether you actually succeeded. The frame lost its force.
- Stewards still argue about what the project is actually for. Two months in, people still hold different ideas of success. Definition happened but did not create shared understanding. This often means definition was imposed, not co-created—stakeholders complied rather than committed.
- The team optimizes for what they can measure and loses sight of what matters. You hit your metrics but nobody actually uses what you built. You achieved your targets but the people affected are worse off. The frame became a cage.
When to replant:
Replant this pattern when the system’s vitality begins to decline—when rework accelerates, when stewards lose alignment, when you sense energy leaking into coordination overhead. Do not wait for catastrophic failure. Also replant when conditions change materially (new stakeholder, budget shift, market change); the old definition may be obsolete. The right moment is when you can feel the commons fragmenting. That is when clarity becomes urgent again.