From Artillery to Assets

The mid-twentieth century brought a surge of fervor for rational, mathematical decision-making, heavily influenced by World War II’s rigorous demands. Techniques born in the high-stakes environment of operations research (OR), such as optimizing bomb fragmentation for maximum impact or using linear programming for efficient shipping, soon found their way into finance. This new scientific approach meant abandoning the old financial world of empirical research and “rules of thumb” in favor of pure theory built on simplifying assumptions. The shift paved the way for “Statistical Man,” a hyper-rational economic actor who made choices by weighing potential outcomes probabilistically.

WWII Operations Research

Military optimization techniques like linear programming migrate to financial modeling

The Mathematical Foundation of Choice

The foundation of Statistical Man was built upon the insights of mathematician John von Neumann and economist Oskar Morgenstern. Their work demonstrated that rational individuals ought to maximize “expected utility” by assigning a numerical value (utility) to each outcome and weighting it by its probability. The central appeal of this new quantitative finance was clear: it defined rational behavior in the face of uncertainty. The framework was embraced by economists like Jacob Marschak, who asserted that to be an “economic man,” one must be a “statistical man.” This mathematical rigor allowed theorists to dispense with the long-standing assumption of perfect foresight, replacing it with measurable, probabilistic risk.
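The expected-utility calculus can be sketched in a few lines. This is a minimal illustration, not drawn from the original text: the lottery payoffs, probabilities, and logarithmic utility function are all assumed for the example.

```python
import math

def expected_utility(outcomes, probs, utility):
    """Weight the utility of each outcome by its probability and sum."""
    return sum(p * utility(x) for x, p in zip(outcomes, probs))

# A risk-averse "statistical man" with log utility (an assumed choice)
# compares a sure $100 against a 50/50 gamble between $50 and $150.
u = math.log
sure_thing = expected_utility([100], [1.0], u)
gamble = expected_utility([50, 150], [0.5, 0.5], u)

# Both options have the same expected *payoff* ($100), but the gamble
# has lower expected *utility*, so the rational actor takes the sure thing.
print(sure_thing > gamble)  # True
```

The example shows why the framework replaces perfect foresight with probability: the actor never needs to know which outcome will occur, only the odds and how much each outcome is worth.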

The Geometric Core of Modern Portfolio Theory

Balancing Risk and Covariance

Harry Markowitz, a graduate student mentored by Marschak and the statistician Jimmie Savage, recognized that traditional investment advice was logically inconsistent. If investors truly sought only the “best at the price,” they would put all their capital into a single security, yet diversifying was universally accepted as prudent. Markowitz’s 1952 breakthrough was defining risk not merely as the volatility of an individual security, but as the risk of the portfolio as a whole. This calculation required assessing the covariance—the extent to which different securities moved together. The goal was to assemble an “efficient” portfolio that delivered the maximum return for a given level of risk, a problem suited to the mathematical programming tools pioneered during wartime, though one that required extending linear programming into quadratic programming, since portfolio variance is quadratic in the asset weights.
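Markowitz’s covariance insight can be made concrete with two assets. This is a minimal numerical sketch; the two hypothetical securities, their 20% volatilities, 8% expected returns, and the -0.5 correlation are all assumed for illustration.

```python
def portfolio_stats(w1, mu1, mu2, sigma1, sigma2, rho):
    """Expected return and variance of a two-asset portfolio.
    var = w1^2*s1^2 + w2^2*s2^2 + 2*w1*w2*cov, with cov = rho*s1*s2."""
    w2 = 1.0 - w1
    mean = w1 * mu1 + w2 * mu2
    cov = rho * sigma1 * sigma2
    var = (w1 * sigma1) ** 2 + (w2 * sigma2) ** 2 + 2 * w1 * w2 * cov
    return mean, var

# Two assets, each 8% expected return and 20% volatility, correlation -0.5.
mean, var = portfolio_stats(0.5, 0.08, 0.08, 0.20, 0.20, -0.5)
solo_var = 0.20 ** 2  # variance of holding either asset alone

# The 50/50 portfolio keeps the full 8% return while variance falls
# from 0.04 to 0.01: diversification is driven by covariance.
print(mean, var, solo_var)
```

The payoff of the math is visible in the numbers: neither security got safer, yet the portfolio did, which is exactly why risk must be measured at the portfolio level.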

1952

Harry Markowitz publishes “Portfolio Selection,” founding Modern Portfolio Theory and revolutionizing risk assessment

Arbitrage and Irrelevance

The theoretical development of modern finance rapidly moved toward even bolder simplifying assumptions. Franco Modigliani and Merton Miller (Policy and Critique lens) launched an assault on empirical finance by tackling the crucial question of capital cost. They introduced the M&M Propositions, arguing that in a world of rational investors and perfect, frictionless markets, the firm’s mix of debt and equity—its capital structure—is irrelevant to its market value. This theory rested on the belief that if two identical income streams were priced differently because of capital structure, rational investors (arbitrageurs) would exchange one for the other until prices equalized. This reliance on arbitrageurs to correct mispricings became a core tenet of rational finance. Simultaneously, the CAPM (Capital Asset Pricing Model), developed independently by Jack Treynor and William Sharpe (Technological History lens), built upon Markowitz’s work by quantifying the precise risk investors should be rewarded for. They determined that the only relevant risk, measured by “beta,” was the asset’s sensitivity to overall market movements; all other, idiosyncratic risks could be diversified away.
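CAPM’s beta reduces to a single ratio: the covariance of the asset’s returns with the market’s, divided by the market’s variance. The sketch below uses made-up return series and an assumed 3% risk-free rate and 5% market premium, purely for illustration.

```python
def beta(asset_returns, market_returns):
    """beta = cov(asset, market) / var(market)."""
    n = len(asset_returns)
    mean_a = sum(asset_returns) / n
    mean_m = sum(market_returns) / n
    cov = sum((a - mean_a) * (m - mean_m)
              for a, m in zip(asset_returns, market_returns)) / n
    var = sum((m - mean_m) ** 2 for m in market_returns) / n
    return cov / var

market = [0.02, -0.01, 0.03, 0.01, -0.02]
asset = [0.04, -0.02, 0.06, 0.02, -0.04]  # moves twice as hard as the market

b = beta(asset, market)  # 2.0 for this constructed series
# CAPM: required return = risk-free rate + beta * market risk premium.
risk_free, market_premium = 0.03, 0.05
required_return = risk_free + b * market_premium
print(b, required_return)
```

Because the asset here moves in lockstep at twice the market’s amplitude, its beta is exactly 2 and CAPM demands twice the market premium in compensation; any jitter uncorrelated with the market would drop out of beta entirely, which is the model’s claim that only market risk is priced.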

The Rise of the Scientific Consultant

While these theories were brilliant abstractions, their immediate impact on Wall Street was minimal; traditional finance firms were resistant to the academic math. The intellectual edifice nonetheless provided powerful new tools for analyzing markets. The quantitative side of finance flourished, leading to the establishment of organizations like the Cowles Foundation and the Merrill Foundation, dedicated to applying science to financial reality. This new scientific approach allowed firms like Arthur D. Little to pioneer risk consulting, recognizing that where academic models assumed uncertainty away, clients needed practical tools for managing it. The ability to quantify risk in terms of beta, expected utility, and covariance gave rise to financial engineers who began advising the newly dominant pension funds, laying the cultural groundwork for the eventual wholesale adoption of rational market theories.

The Unassailable Logic of Theory

The triumph of the theoretical approach meant that finance—the “business school version of economics”—was fundamentally transformed from a practical, empirical field to a deductive, model-driven science. The elegance and internal consistency of CAPM and the M&M propositions made them almost instantly compelling to younger scholars, especially in an era tired of the institutionalists’ lack of cohesive theory. The core philosophical victory was Milton Friedman’s argument (Policy and Critique lens): the relevance of a theory lay not in the realism of its assumptions (like perfect rationality), but in its predictive usefulness. This logical shield allowed the new science of finance to grow unchallenged, preparing the academic stage for the arrival of the ultimate market doctrine: the Efficient Market Hypothesis.