The Comfort of Process and the Cost of Certainty
Imagine a flowchart where every box technically clears the threshold to move forward, if only barely. Each decision point represents a 49/51 judgment call, supported by information that may be only 60 percent reliable but presented with professional confidence.
By the time the final decision emerges, it looks authoritative. The process was followed. The analysis is thorough. The conclusion appears inevitable.
What is harder to see is how many marginal assumptions are stacked on top of one another. What looks solid is often built on a foundation of sand.
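The compounding the passage describes can be made concrete with simple arithmetic. The sketch below uses the 51 percent confidence and 60 percent reliability figures from the text; the five-step decision chain and the independence of the two factors are assumptions for illustration, not claims from the article.

```python
# Illustration: how marginal, barely-justified decisions compound.
decision_confidence = 0.51  # each box "technically clears the threshold"
info_reliability = 0.60     # the supporting information's reliability

# Probability a single decision is both well-founded and correct,
# treating the two factors as independent (a simplifying assumption).
p_step = decision_confidence * info_reliability

steps = 5  # an assumed chain of five decision points
p_chain = p_step ** steps

print(f"Single decision: {p_step:.3f}")
print(f"{steps}-step chain: {p_chain:.4f}")
```

Under these assumptions a single decision is sound only about 31 percent of the time, and the full chain falls below one percent, even though every individual step "cleared the threshold."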
The Department of Defense is full of good people. Highly educated, deeply experienced, and serious about the mission. If intelligence and expertise were enough, the Pentagon would be among the most effective organizations in the world.
And yet, it often isn’t.
Programs slip years behind schedule. Costs rise faster than projections. Strategies emerge fully staffed, briefed, and defensible only to prove fragile once exposed to real‑world friction. Reform efforts arrive in predictable cycles, each promising to fix what the last one could not.
The usual explanations are familiar: leadership failure, bureaucratic inertia, political interference. Each plays a role. None fully explains the pattern.
The deeper issue is more uncomfortable. The Pentagon repeatedly pursues “optimal” solutions inside systems that cannot support optimality.
Small organizations run on judgment. Leaders see most of the system, make tradeoffs directly, and adjust quickly when reality intrudes. Feedback loops are short. Authority is clear. Failure is visible and rapidly corrected.
That model breaks down as organizations scale.
Once complexity exceeds what any individual, or even a small group, can reasonably comprehend, judgment must be distributed. Decisions are decomposed into parts, delegated across functions, and recombined through layers of coordination and review. Process replaces proximity.
This shift is not a flaw. It is necessary for the continued operation of a large, complex organization like the Pentagon.
Over time, however, process stops being a support mechanism and becomes a substitute for decision‑making. What began as a way to inform leaders quietly turns into a way to decide on their behalf.
In the US military, this tendency is amplified. The institution values standardization, repeatability, and control for good reason. Lives depend on doing many things exactly right, every time. As a result, vast amounts of knowledge are codified into doctrine, manuals, and instructions designed to eliminate ambiguity.
For well‑defined tasks, this works extremely well.
The problem arises when those same mechanisms are applied to decisions that are not well‑defined, repeatable, or binary.
Process is often treated as neutral – an objective tool that simply ensures consistency. In reality, process encodes culture.
Every approval chain, checklist, and review board quietly answers questions about what an organization values, what risks it fears, and which failures are unacceptable. Over time, these preferences harden. Process becomes a cultural artifact as much as a technical one.
In the Department of Defense, process tends to optimize for survivability inside the institution. Decisions that are defensible, auditable, and compliant are rewarded. Decisions that create exposure, even if strategically sound, are not.
The result is an organization exceptionally good at producing decisions that survive scrutiny, but far less effective at producing decisions that survive change.
Strategic planning, acquisition tradeoffs, and force design choices are probabilistic. They depend on incomplete information, contested intelligence, and predictions about adversary behavior, technology maturation, and political constraints years into the future.
Yet these decisions are often forced through frameworks designed for certainty.
When the environment shifts, as it inevitably does, the system does not fail at a single point. It loses resilience everywhere at once.
Optimization assumes stability and predictability: clear objectives, fixed constraints, reliable feedback.
The Department of Defense operates in none of those conditions.
It rarely has first‑mover advantage. It must commit resources years before outcomes can be measured. It manages a budget measured in trillions while answering simultaneously to Congress, the public, allies, industry, and adversaries.
In that environment, “optimal” solutions are less about finding the best answer and more about constructing the most defensible one.
Process delivers confidence. It does not deliver adaptability.
The most effective organizations design processes that support judgment rather than replace it. They are explicit about uncertainty. They clarify where authority resides. They stress‑test assumptions instead of burying them under confidence.
For the Department of Defense, this means shifting from asking “What is the optimal solution?” to asking “What decision can survive change?”
The solution is not to eliminate process. Large organizations like the Department of Defense cannot function without it. The danger lies in confusing procedural rigor with decision quality. In complex environments, process should serve as a decision-support tool rather than a substitute for professional judgment.
That requires deliberately defining decision boundaries. Routine, compliance-driven activities benefit from standardization. Strategic, operational, and risk-based decisions do not.
Institutions must also reassess incentives. Leaders are currently rewarded for adherence and punished for deviation, even when deviation produces better outcomes. Evaluation systems should assess whether decisions were reasonable given the information available at the time, not whether every procedural step was followed perfectly.
Organizations must institutionalize “good-enough” decision-making. Time-boxed analysis, minimum viable decision thresholds, and feedback-driven refinement should be designed into processes themselves. In dynamic environments, speed, adaptability, and intent alignment matter more than theoretical optimality.
Leaders should always have the option to break the process when appropriate.
Finally, feedback mechanisms must, above all, reward decision quality and learning over procedural perfection. Failure should trigger analysis rather than blame.
In complex systems, success is not about being right on paper. It is about building decisions resilient enough to survive contact with reality.
Biography:
Joshua Cobb is a U.S. Army officer with experience in operational command and the defense acquisition enterprise. He writes on decision-making, organizational behavior, and how large institutions operate under uncertainty.
Disclaimer: The views expressed are those of the author and do not necessarily reflect the official policy or position of the Department of the Air Force or the U.S. Government.