The Bridge Wobbled. They Saw It. They Fixed It Wrong.
On July 1, 1940, the Tacoma Narrows Bridge opened in Washington state. It was an engineering marvel: a suspension bridge whose 2,800-foot center span was the third longest in the world at the time, behind only the Golden Gate and George Washington bridges. The design was also unusually daring: the deck was thinner and more flexible than on any previous bridge of its scale, giving the structure a distinctive, almost delicate appearance. Traffic was modest at first but growing.
Almost immediately, engineers noticed that the bridge oscillated in the wind. It wobbled. It swayed. The movement was visible, dramatic, and unnerving to drivers. We have footage of this motion, and it’s genuinely disturbing to watch—a bridge that looks alive, moving in ways bridges shouldn’t move.
The engineers faced a choice: close the bridge, investigate the instability, and redesign the structure, or keep it open and manage the motion. They chose the latter. They made repairs, adding wind bracing and stiffening certain sections, but they continued operating the bridge. These fixes addressed the symptoms of the oscillation without correcting the aerodynamic instability that caused it.
On November 7, 1940, just over four months after opening, the bridge’s center span catastrophically failed and collapsed into Puget Sound. Remarkably, no one was killed; the only casualty was a dog trapped in a car abandoned on the deck. But the failure revealed something profound about human psychology: how commitment to a chosen design path, combined with sunk costs and confirmation bias, can prevent engineers from recognizing that the solution they’re implementing addresses the symptom rather than the disease.
The Financial and Reputational Investment Preventing Course Correction
The Tacoma Narrows Bridge cost $6 million to build, approximately $110 million in today’s dollars. The engineer responsible for the design was Leon Moisseff, one of the most celebrated bridge designers in the world. He had been a key consulting engineer on the Golden Gate Bridge, which opened to universal acclaim. Tacoma Narrows was his next triumph.
Closing the bridge would mean admitting the design had fundamental flaws. It would mean writing off the $6 million investment. It would damage Moisseff’s reputation and the reputations of every engineer and administrator who had approved the design. The financial and psychological costs of confronting the problem directly were enormous.
Cost to build Tacoma Narrows Bridge in 1940
The bridge cost $6 million in 1940—roughly $110 million in today’s dollars. This financial sunk cost influenced the decision to continue rather than redesign.
This is the sunk cost fallacy in its purest form: the decision to continue investing in a failing project because of prior investments, rather than objectively evaluating the project’s future viability. The $6 million was gone. Whether the engineers continued to operate the bridge or closed it had no bearing on that cost. Yet psychologically, the existence of that sunk cost made it harder to admit the design was fundamentally flawed.
Confirmation Bias: Looking for Reasons the Bridge Was Safe
Once the engineers noticed the oscillation, they faced a psychological fork in the road: Is this motion a sign of fundamental instability, or is it a normal characteristic of a flexible suspension bridge? The answer depended on how they looked at the evidence.
The bridge’s deck was stiffened by shallow, solid plate girders rather than the deep open trusses used on earlier suspension bridges, a novel choice at the time. The resulting flexibility had been presented as an advantage: the deck would move with the wind rather than resist it, distributing forces more evenly. When engineers observed the wobbling, they didn’t ask: “Did our design assumptions predict this motion?” They asked: “Is this motion acceptable given our design philosophy?” The second question contains an implicit assumption that the design is sound.
This is confirmation bias: searching for evidence that supports your hypothesis while ignoring evidence that contradicts it. The engineers looked for reasons the motion was acceptable rather than asking whether the motion revealed a flaw in their understanding. They consulted precedent (other suspension bridges move in the wind), filed the behavior under reassuring categories (ordinary wind-induced movement rather than a symptom of aerodynamic instability), and concluded that the motion was a normal feature rather than a warning sign.
Time from opening to catastrophic failure
The bridge opened on July 1, 1940, and collapsed on November 7, 1940: 129 days of operation and visible instability.
The Invisible Enemy: Aeroelastic Flutter
What the engineers didn’t understand was aeroelastic flutter: a self-excited instability in which a structure’s own motion alters the aerodynamic forces acting on it, so that the wind feeds energy into each swing faster than the structure can dissipate it and the oscillations grow with every cycle. The thin, flexible deck that was supposed to distribute forces more smoothly actually created the conditions for this feedback. Above a critical wind speed, the bridge behaved less like a static structure resisting the wind and more like an airplane wing on the verge of flutter: once the twisting motion started, it amplified itself.
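One way to see the mechanism is a minimal single-degree-of-freedom sketch, not a model of the actual bridge (whose failure involved a coupled torsional mode): an oscillator whose net damping turns negative above a critical wind speed. The symbols here are illustrative, not values from the 1940 structure.

```latex
% Illustrative one-degree-of-freedom sketch of a flutter-type instability.
% x(t): deck motion, m: modal mass, k: stiffness, c_s: structural damping,
% c_a(U): motion-induced aerodynamic term that grows with wind speed U.
\[
  m\,\ddot{x} + \bigl(c_s - c_a(U)\bigr)\,\dot{x} + k\,x = 0
\]
% When the wind is fast enough that c_a(U) > c_s, the net damping is
% negative and a small disturbance grows exponentially instead of dying out:
\[
  x(t) \propto e^{\sigma t}\cos(\omega t),
  \qquad
  \sigma = \frac{c_a(U) - c_s}{2m} > 0 .
\]
```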
The engineers knew oscillation was occurring. They didn’t know why. In the absence of clear understanding, they fell back on precedent and reassurance. The Golden Gate Bridge moved in the wind, they reasoned. Suspension bridges have always flexed. Therefore, this motion is probably fine.
But Tacoma Narrows was different. Its thin, shallow deck made it far more vulnerable to aerodynamic forces. Its design parameters (the dimensions, materials, and proportions that made it beautiful and economical) had inadvertently created a structure primed for self-excited oscillation. The engineers had built an efficient machine for converting wind energy into destructive motion.
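To make that “machine” concrete, here is a small numerical sketch under the same assumptions as the equation above, with made-up illustrative numbers: the same tiny disturbance decays below the critical wind speed and grows relentlessly above it.

```python
# Illustrative sketch only: a single oscillator with made-up numbers,
# not a model of the real Tacoma Narrows deck. Net damping turns
# negative above a critical wind speed, so the wind pumps energy in.

m, k = 1.0, 40.0            # modal mass and stiffness (arbitrary units)
c_struct = 0.10             # structural damping

def c_aero(wind_speed):     # assumed motion-induced aerodynamic term
    return 0.03 * wind_speed

def peak_displacement(wind_speed, x0=0.01, dt=0.001, t_end=120.0):
    """Integrate m*x'' + (c_struct - c_aero)*x' + k*x = 0 (semi-implicit Euler)."""
    c_net = c_struct - c_aero(wind_speed)
    x, v, peak = x0, 0.0, abs(x0)
    for _ in range(int(t_end / dt)):
        a = -(c_net * v + k * x) / m
        v += a * dt
        x += v * dt
        peak = max(peak, abs(x))
    return peak

for u in (1.0, 3.0, 6.0):   # below, near, and above the critical wind speed
    print(f"wind speed {u:3.1f}: peak displacement {peak_displacement(u):.3f}")
# Below the critical speed the disturbance decays; above it the same tiny
# disturbance grows every cycle until, in a real structure, something fails.
```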
“Band-Aid” Solutions That Deepened Commitment
When public concern about the wobbling bridge grew, engineers implemented remedies. They ran tie-down cables from the deck to concrete anchors on shore (cables that soon snapped and had to be reinstalled), added inclined cable stays, and installed hydraulic buffers at the towers. Each measure was designed to reduce the visible oscillation without addressing the underlying aerodynamic instability.
These fixes worked, temporarily. The bridge wobbled less obviously. But they hadn’t solved the problem. They had created a false sense of security while the fundamental instability persisted. Worse, each fix represented an additional investment—financial and psychological—in the idea that the bridge could be saved through modification rather than redesign.
The sunk cost fallacy and confirmation bias worked together here: each unsuccessful repair deepened the commitment to the idea that the bridge was fundamentally sound and just needed adjustment. The engineers were caught in what psychologists call an “escalation of commitment”—the tendency to invest more resources in a failing enterprise after an initial investment, driven by the desire to recover the original investment.
Major repair attempts before collapse
Engineers made repeated stabilization attempts between July and November 1940, from tie-down cables to hydraulic buffers, each adding cost and deepening institutional commitment to the bridge.
The Illusion of Control Through Technical Tinkering
A critical psychological pattern emerges here: when confronted with a complex problem, organizations tend to reach for technical solutions—repairs, modifications, adjustments—rather than asking fundamental questions about their assumptions. Tinkering creates an illusion of control. Each repair represents progress. The engineers were doing something, taking action, addressing the problem. That activity itself became reassuring, even though the activity was addressing symptoms rather than causes.
Kahneman’s research on overconfidence shows that people tend to believe they understand systems better than they actually do. The engineers understood the mechanics of suspension bridges—cable systems, load distribution, static forces. They didn’t understand the aerodynamics of thin, flexible structures. But expertise in one domain can create false confidence in related domains.
By the time the bridge failed on November 7, engineers were still convinced that the repairs would eventually solve the problem. They were looking at four months of operational data, searching for evidence that the bridge was becoming more stable. They weren’t looking at the fundamental physics that would eventually produce catastrophe.
The Visible Failure: Why Engineers Couldn’t See What Everyone Could See
The most psychologically interesting aspect of Tacoma Narrows is that the problem was visible. Drivers reported the wobbling. Pedestrians were frightened. The motion was dramatic enough to be filmed. The evidence of a problem was not hidden. It was not subtle. It was obvious.
Yet the engineers—the people with the expertise and authority to interpret that evidence—couldn’t see it as a fundamental problem. Instead, they framed it as a characteristic of the design, something that could be managed through careful repair and operation. This represents a profound failure of translation between visible evidence and expert interpretation.
The footage of the Tacoma Narrows Bridge oscillating has become iconic precisely because it shows something engineers built, operating in a way that violates intuitions about how structures should behave. The bridge looks alive, unstable, dangerous. That intuition was correct. But the engineers had trained themselves to interpret the same phenomenon as manageable.
Psychological Commitment and the Impossibility of Turning Back
At some point between July and November 1940, the decision point shifted. Initially, closing the bridge and redesigning it was theoretically possible. After months of operation, apparently successful repairs, increased traffic, and further investment, closing the bridge became psychologically unthinkable. The organization had committed itself to a course of action.
This is what researchers call “psychological ownership.” Once you have built something, invested in it, defended it against critics, the thing becomes part of your identity. Admitting the design is fundamentally flawed becomes admitting a fundamental failure of judgment. For Leon Moisseff, one of the world’s most celebrated bridge designers, that psychological cost would have been enormous.
The bridge failed not because the problem was unknown or invisible, but because the psychological commitment to the design path had become so strong that alternative interpretations of the evidence were dismissed. The engineers were rational people operating within an irrational system—one where the sunk costs and commitment had distorted their perception of reality.
Fatalities from the collapse
The collapse killed no one; the only casualty was a dog trapped in a car abandoned on the deck. The failure nonetheless reshaped suspension bridge engineering for decades.
The Deeper Lesson: Commitment as Blindness
Tacoma Narrows reveals a principle that modern engineering has only partially learned: commitment to a chosen design path can be more dangerous than incompetence. A competent engineer working under pressure to justify prior investment and unwilling to acknowledge fundamental error can cause more damage than a frankly incompetent engineer working without commitment.
The solution to this problem isn’t obvious. Organizations can’t operate by constantly questioning their core designs. They need to commit to courses of action. But they also need mechanisms that allow fundamental assumptions to be examined when evidence suggests they’re wrong.
The Tacoma Narrows crossing was rebuilt after the failure. The replacement span, opened in 1950, used deep open stiffening trusses and gaps that let wind pass through the deck, and its design was vetted in wind tunnel tests. Modern suspension bridges are built with an explicit understanding of aerodynamic stability. But this knowledge was purchased with the collapse of an elegant bridge, knowledge that could have been gained earlier through better engineering judgment and less psychological attachment to the original design.
The bridge wobbled. Everyone could see it wobble. But the people with the authority to stop the wobble had committed themselves to a design they couldn’t admit was flawed. That commitment killed the bridge—and nearly killed the engineering community’s confidence in suspension bridge design for a generation.
