Why 'Optimal' Is The Most Dangerous Word In Business
By Hisham Eltaher

Key Takeaways

  1. The Stability Trap: Over-optimization for predictability eliminates the flexibility needed to adapt to unexpected events, making systems brittle.
  2. Goodhart's Law: When metrics become targets, they cease to measure what they were intended to, leading to unintended consequences.
  3. Hidden Value Judgments: The term 'optimal' masks subjective choices that often favor profit over safety or resilience.
  4. Latent Risks: Aggressive optimization creates hidden vulnerabilities that perform well under normal conditions but fail catastrophically when stressed.
  5. Design for Resilience: True organizational strength comes from building flexibility and redundancy, not just efficiency.

The Unseen Dangers of Our Quest for Perfection

The modern business ideal is a relentless drive for optimization. We chase efficiency, streamline processes, and demand data-driven perfection in every decision. But a dangerous paradox lies at the heart of this quest: the single-minded pursuit of an ‘optimal’ state can silently create the conditions for catastrophic failure.

The tragedy of the Boeing 737 MAX is not just a story of a technical glitch; it is a story about the hidden risks of optimization. Its failure reveals how a culture obsessed with efficiency can systemically erode safety, judgment, and resilience. This was a failure born from a profound confusion: the belief that an optimized model of the world is the same as the world itself. By mistaking the map for the territory, Boeing’s culture of efficiency created the very instability it sought to eliminate. This post will unpack four counter-intuitive takeaways from this disaster, revealing the profound dangers of an efficiency-obsessed culture.

Note

346: lives lost in the two Boeing 737 MAX crashes linked to optimization-driven design flaws.


The Stability Trap: How Perfect Predictability Makes Us Brittle

The core paradox of complex systems is that extreme stability is not strength, but fragility. When organizations over-optimize to eliminate all deviation and variation in pursuit of perfect predictability, they also eliminate the flexibility needed to adapt to unexpected events. This drive to standardize removes the crucial ‘slack’—the adaptive capacity—that allows a system to absorb shocks and adjust under stress.

Just-in-Time (JIT) inventory management is a classic example. While it optimizes for reduced storage costs and improved cash flow, it creates extreme vulnerability to any supply chain disruption. A single delay in receiving goods can halt the entire production process because the system has been stripped of the very inventory buffers that would allow it to absorb a shock. In systems theory, this is known as ‘tight coupling’—a state where components are so interconnected that a failure in one part rapidly cascades through the entire system.
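To make the cost of removing slack concrete, here is a minimal simulation sketch in Python. The setup is invented for illustration (a one-part-per-day line, a 10% chance that a shipment slips by one day, a 250-day run) and is not modeled on any real supply chain:

```python
import random

random.seed(42)  # reproducible illustration

def run_factory(days: int, buffer_size: int, delay_prob: float = 0.10) -> int:
    """Toy JIT line: one part is consumed per day and one shipment is due
    per day. A delayed shipment slips to the next day. The JIT policy never
    holds more than `buffer_size` parts, so the buffer is the only slack."""
    stock = buffer_size
    delayed = 0
    produced = 0
    for _ in range(days):
        stock += delayed                    # yesterday's late shipment arrives
        if random.random() > delay_prob:
            stock += 1                      # today's shipment is on time
            delayed = 0
        else:
            delayed = 1                     # today's shipment slips a day
        if stock > 0:                       # build one unit if a part is on hand
            stock -= 1
            produced += 1
        stock = min(stock, buffer_size)     # JIT: carry no extra inventory
    return produced

for buffer in (0, 1, 3):
    print(f"buffer={buffer}: produced {run_factory(250, buffer)} of 250 units")
```

In this toy model, the zero-buffer line loses a unit of output on most delay days, while a buffer of even one part absorbs every one-day delay completely. Real disruptions last longer than a day, which is exactly why the trade-off between carrying costs and adaptive capacity is a design decision rather than a free efficiency win.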

This is what makes over-optimization so dangerous: a system that is perfectly tuned for expected conditions becomes brittle and incapable of responding when circumstances inevitably change. It performs exceptionally well until the moment it fails, and then it fails completely.

Note

$200–400M: potential cost to Boeing if simulator training had been required for the 737 MAX.

“Resilience is not the elimination of variability. Resilience is the capacity to adjust to variability when it occurs.”

Goodhart’s Law: When a Metric Becomes a Target, It Lies

Goodhart’s Law is captured in a simple but powerful adage: “When a measure becomes a target, it ceases to be a good measure.” In simple terms, once a metric is used for control or reward, people will start to optimize for the metric itself, often in ways that undermine the original goal the metric was supposed to represent.
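The dynamic is easy to reproduce in the abstract. Here is a small Python sketch (all numbers invented) in which an agent splits a fixed effort budget between genuine work and gaming the measurement; both raise the reported metric, but only genuine work advances the underlying goal:

```python
# Toy model of Goodhart's Law. An agent has 10 units of effort to split
# between real work and gaming the measurement. All numbers are invented.

def reported_metric(real: int, gaming: int) -> float:
    return real + 1.5 * gaming     # gaming moves the metric more cheaply

def true_value(real: int, gaming: int) -> float:
    return float(real)             # only real work serves the actual goal

# Optimize the metric, exactly as an incentive scheme instructs.
splits = [(real, 10 - real) for real in range(11)]
best = max(splits, key=lambda split: reported_metric(*split))

print("effort split (real, gaming):", best)        # (0, 10)
print("reported metric:", reported_metric(*best))  # 15.0
print("true value delivered:", true_value(*best))  # 0.0
```

Before the metric became a target, it correlated with the goal; once it is the target, the cheapest way to move it wins. That is the abstract shape of ‘minimal pilot retraining’ standing in for ‘safe aircraft’.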

The Boeing 737 MAX case is a tragic illustration of this principle. The company’s “target” was to design the new plane as a minor update to the previous 737 model. Hitting this target was critical because it would avoid the need for costly and time-consuming pilot simulator training. To achieve this, Boeing systematically downplayed—and in some cases, willfully concealed—the significance of the new and powerful Maneuvering Characteristics Augmentation System (MCAS), a flight control system that fundamentally changed how the plane handled. Boeing not only discounted concerns from its own engineers but also withheld certain information about MCAS from regulators. The financial pressure to hit this metric was immense.

“If simulator training had been needed, Boeing would have owed Southwest Airlines between $200 to nearly $400 million dollars.”

In focusing on the proxy of avoiding retraining costs, Boeing lost sight of its actual goal: ensuring safety. The target was met, but the outcome was the loss of 346 lives in two crashes.

The Value Judgment: “Optimal” Is Never Objective

The term “optimal” sounds impartial, mathematical, and objective. In reality, it always hides a subjective value judgment. In any optimization problem, a choice is made about what to maximize (the objective) and what to treat as a minimum requirement (a constraint). These are not neutral choices; they are reflections of an organization’s priorities and values.

Consider car safety. If a car company’s true, singular objective were to maximize safety, the car would never move. A parked car is the safest car. Instead, safety is treated as a constraint on the primary objectives of profit, speed, and performance. This hidden value judgment often prioritizes predictable, stable performance—the very state described in the ‘Stability Trap’—over less quantifiable but more critical values like resilience, adaptability, and true safety.
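A small sketch makes the hidden choice explicit. Both formulations below are ‘optimal’ over the same toy design space; the only difference is which value is the objective and which is demoted to a constraint. The profit and safety functions and both thresholds are invented:

```python
# Two formulations of the same toy car-design problem. Which answer counts
# as "optimal" depends on which value is the objective and which is merely
# a constraint. Every function and threshold here is invented.

speeds = range(0, 201, 5)          # candidate top speeds, km/h

def profit(s: int) -> float:
    return float(s)                # toy assumption: faster cars sell better

def safety(s: int) -> float:
    return 1.0 - s / 250.0         # toy assumption: faster cars are less safe

# Formulation A: maximize profit, subject to a safety floor.
design_a = max((s for s in speeds if safety(s) >= 0.5), key=profit)

# Formulation B: maximize safety, subject to a profit floor.
design_b = max((s for s in speeds if profit(s) >= 60), key=safety)

print(f"A (profit objective): speed={design_a}, safety={safety(design_a):.2f}")
print(f"B (safety objective): speed={design_b}, safety={safety(design_b):.2f}")
```

Both designs come out of an optimizer and both can be defended as ‘optimal’; which one ships is a value judgment made before any mathematics begins.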

This same logic was at play with the Boeing 737 MAX. In chasing the metric of minimal retraining costs, the company clearly prioritized its own goals of profitability and speed-to-market. Critical safety features that could have alerted pilots to sensor malfunctions—the very issue that triggered the MCAS failures—were sold as optional upgrades. This decision created a latent risk that became tragically real; the airlines involved in both fatal crashes had not purchased them.

“Boeing’s decisions were predominantly driven by concerns about profitability. As a for-profit business, it answers to its shareholders and the safety and priority of the flying public became secondary.”

The language of optimization was used to make a choice driven by profit appear to be a neutral, evidence-based decision, effectively putting corporate interests over public safety. This is a form of ethical obfuscation.

Note

1: the number of sensors MCAS read on any given flight, creating a single point of failure.

The Hidden Time Bomb: How Optimization Creates “Latent Risk”

Aggressive optimization creates “latent risks”—hidden vulnerabilities where a system performs exceptionally well under normal conditions but is catastrophically fragile when the conditions it was optimized for break down. The fundamental challenge is that optimization success creates ‘observability blindness’: smooth, efficient performance under normal operations masks the danger lurking just beneath the surface.

This ‘observability blindness’ is a direct consequence of Goodhart’s Law in action; by focusing exclusively on the target metric—cost savings and minimal pilot retraining—the organization became blind to the accumulating risk in the underlying system. This concept directly explains the design of the 737 MAX’s MCAS system. In the name of efficiency, cost-saving, and minimizing the perceived scope of the plane’s updates, the system was designed with a critical flaw.

“When it was rolled out, MCAS took readings from only one sensor on any given flight, leaving the system vulnerable to a single point of failure.”

This design choice was a ticking time bomb. It was a latent risk created directly by the relentless drive to optimize for cost and expediency. The system worked perfectly as long as that one sensor worked perfectly. But when it inevitably failed, the result was catastrophic, as the flawed design had removed any redundancy or buffer against error.
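The value of redundancy here is plain probability. Below is a hedged back-of-the-envelope comparison, assuming independent sensor faults and a purely hypothetical per-flight fault probability (not a real angle-of-attack sensor failure rate):

```python
# Back-of-the-envelope reliability arithmetic for the redundancy point.
# p is a hypothetical per-flight probability that one sensor feeds bad
# data; faults are assumed independent. Real certification analysis is
# far more involved than this sketch.

p = 1e-3

single_sensor = p                            # any fault propagates straight through
two_of_three = 3 * p**2 * (1 - p) + p**3     # voting fails only if >= 2 sensors fail

print(f"single sensor fault rate: {single_sensor:.1e}")  # 1.0e-03
print(f"2-of-3 voting fault rate: {two_of_three:.1e}")   # ~3.0e-06
```

Under these toy assumptions, simple two-out-of-three voting cuts the bad-data rate by roughly three orders of magnitude. The exact figure is not the point; the shape of the trade is. Redundancy buys reliability at the cost of the very ‘inefficiency’ that aggressive optimization is built to eliminate.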


Conclusion: Designing for Resilience, Not Just Efficiency

The common thread through these four dangers is a fundamental error: mistaking the model for reality. When we over-optimize, we treat metrics as goals, assume stability is strength, mask subjective values with objective language, and trust smooth performance as a sign of safety. In each case, we fall in love with a simplified, predictable map and become blind to the complex, variable territory it is meant to represent. The wisdom for any organization is not to abandon efficiency, but to recognize its limits and build in the flexibility, redundancy, and slack that allows for true resilience.

As you reflect on your own work, consider this: In our drive to make our work and lives perfectly efficient, what crucial ‘slack’ are we eliminating, and what hidden risks are we creating without realizing it?


References

  1. Arafat, J., Moniruzzaman, K. M., Hossain, S., & Tasmin, F. (2024). Detecting and preventing latent risk accumulation in high-performance software systems. arXiv. https://arxiv.org/
  2. Goodhart, C. A. E. (1975). Problems of monetary management: The UK experience. Papers in Monetary Economics. Reserve Bank of Australia.
  3. Hoskin, K. (1996). The ‘awful idea of accountability’: Inscribing people into the measurement of objects. In Accountability: Power, ethos and the technologies of managing (pp. 265–282). International Thomson Business Press.
  4. Kuczynski, J., Wang, C., Glass, M., & Hoffman, F. (2021). Boeing 737 MAX: A case study of failure in a supply chain using system of systems framework. Issues in Information Systems, 22(1), 51–62.
  5. Laufer, B., Gilbert, T. K., & Nissenbaum, H. (2023). Optimization’s neglected normative commitments. In Proceedings of the 2023 ACM Conference on Fairness, Accountability, and Transparency (FAccT ’23) (pp. 1269–1280). ACM. https://doi.org/10.1145/3593013.3594084
  6. Liu, J., & Gea, H. (2018). Robust topology optimization under multiple independent unknown-but-bounded loads. Computer Methods in Applied Mechanics and Engineering, 334, 367–384.
  7. Obermeyer, Z., Powers, B., Vogeli, C., & Mullainathan, S. (2019). Dissecting racial bias in an algorithm used to manage the health of populations. Science, 366(6464), 447–453. https://doi.org/10.1126/science.aax2342
  8. Reality Drift Working Papers Series. (2025). The optimization trap: How efficiency culture erodes meaning. Figshare. https://figshare.com/
  9. Taylor, A. (2024). Just-in-Time (JIT) inventory management: Benefits & risks. Cleverence. https://cleverence.com/
  10. Thomas, C. (2025). When stability fails: Why over-optimization creates organizational brittleness. Compliance Week. https://www.complianceweek.com/
