
In a world we often understand through the lens of diminishing returns—where the first slice of pizza is always the best—lies a counter-intuitive yet powerful force: increasing returns. This is the principle that more can lead to even more, where success actively breeds further success. While classical economics relies on the stabilizing nature of diminishing returns to predict market equilibrium, it struggles to explain why certain technologies dominate, why industries cluster in specific regions, and why history seems to cast such a long shadow over our economic and social systems. The concept of increasing returns fills this gap, offering a framework to understand these complex, self-reinforcing dynamics.
This article explores the strange and wonderful world of increasing returns. First, in "Principles and Mechanisms," we will dissect the core theory, examining the positive feedback loops, non-convexities, and mechanisms like network effects that drive this phenomenon, and we will uncover its profound implications for market stability and the concept of path dependence. Following this, the "Applications and Interdisciplinary Connections" chapter will showcase the principle's vast reach, revealing how it shapes everything from evolutionary biology and the origin of the sexes to technological lock-in and the persistent inequalities in our societies.
In our everyday experience, we are intimately familiar with the law of diminishing returns. The first slice of pizza is heavenly; the tenth, not so much. The first hour of study is productive; the fifth is a slog. We intuitively feel that as we do more of something, the benefit we get from each additional bit tends to decrease. This principle of negative feedback—where more leads to less—is a cornerstone of classical economics. It’s a stabilizing force, a cosmic governor that keeps things in balance, ensuring that markets settle into a predictable, efficient equilibrium.
But what if the world isn’t always so well-behaved? What if there are situations where the opposite is true? What if, sometimes, the more you have of something, the more you get? This is the strange and wonderful world of increasing returns. It’s a world of positive feedback, of virtuous cycles, where success breeds success. Imagine a small snowball rolling down a hill. As it picks up more snow, it gets bigger. Because it’s bigger, it picks up even more snow, even faster. Its growth is self-reinforcing.
We can capture this idea with a simple mathematical picture. Imagine a company trying to decide its production scale, q. In a world of diminishing returns, its profit might look like a simple, well-behaved hill, with a single peak. But with increasing returns, the landscape changes. Suppose the firm's profit, π(q), has a component that grows like q² at first, but is eventually tamed by a congestion effect, like −q³. The profit function might look something like π(q) = q² − q³ − F, where F is some initial setup cost.
At the very beginning, for a tiny production scale q, the positive q² term dominates. The more you produce, the faster your profit grows. This is increasing returns in action! However, if you try to produce nothing (q = 0), your profit is negative due to the setup cost: π(0) = −F. The function actually has a local minimum at q = 0. To become profitable, the firm can’t just dip its toes in the water; it has to take a leap of faith to a sufficiently large production scale, say q*, to overcome the initial cost and reach the true peak of the profit hill. This non-concave profit landscape, with its initial valley at zero and a peak further out, is the signature of increasing returns. It tells us that getting started is hard, but once you gain momentum, the system works to your advantage.
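We can make this landscape concrete in a few lines of code, using the illustrative profit function π(q) = q² − q³ − F (a form assumed here purely for illustration: a quadratic term for increasing returns at small scale, a cubic congestion term, and a fixed setup cost F):

```python
# Illustrative non-convex profit landscape: pi(q) = q**2 - q**3 - F.
F = 0.05  # assumed fixed setup cost

def profit(q):
    # q**2: increasing returns at small scale
    # -q**3: congestion that eventually tames growth
    return q**2 - q**3 - F

# Scan a grid of production scales for the break-even point and the peak.
grid = [i / 1000 for i in range(1001)]
peak_q = max(grid, key=profit)
break_even = next(q for q in grid if profit(q) >= 0)

print(f"profit at q=0:       {profit(0):+.3f}")  # negative: the setup-cost valley
print(f"break-even scale q:  {break_even:.3f}")  # the 'leap of faith' threshold
print(f"profit-maximizing q: {peak_q:.3f}")      # the true peak, at q = 2/3 here
```

The firm earns nothing but losses until it clears the break-even scale, after which profit climbs all the way to the peak: the valley-then-hill shape described above.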
This idea of self-reinforcing momentum isn’t just a mathematical curiosity. It’s a powerful force in technology, economics, and even in our social institutions. But where does it come from? What are the engines that drive this positive feedback? There are several key mechanisms.
First, there are high upfront or fixed costs. Think about building a railroad, designing a new microprocessor, or developing a life-saving drug. The initial investment—the research, the design, the physical infrastructure—is enormous. The first product off the assembly line has effectively cost billions of dollars. But once the system is in place, the cost of each subsequent unit (the marginal cost) can be very low. This means that the more you produce, the more you spread out that massive initial cost, and the more profitable each new sale becomes. The high fixed cost in our simple profit model represents exactly this barrier.
Second, there are learning effects. As a society, we get better at doing things the more we do them. When a new, complex technology is introduced, the first users are pioneers, navigating a buggy and poorly understood system. But as the user base grows, a collective intelligence develops. Best practices are established, training courses are created, and the technology itself is refined based on user feedback. In a complex health system, for instance, once a particular payment model is adopted, medical schools might begin teaching its specific compliance rules, and professional associations will invest in building the required expertise, making the system more efficient and entrenched over time.
Third, and perhaps most powerfully, are coordination and network effects. Many products and systems are only valuable if other people use them too. A single telephone is a paperweight; a billion telephones form a global communications network. The value of a social media platform, a video game console, or an operating system depends directly on the size of its network of users. This creates an incredibly powerful positive feedback loop. As more people adopt a technology, it becomes more valuable, which in turn attracts even more adopters. A dynamic model of technological competition shows this clearly: the utility, or attractiveness, of a technology can be directly proportional to its current market share. Winner takes all, or at least winner takes most.
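A toy simulation shows how this share feedback plays out. The specific choice rule below, where a newcomer's adoption probability depends on the square of current market share, is an assumption of this sketch: it is one simple way to make the feedback stronger than linear, which is what produces outright lock-in rather than mere drift.

```python
import random

def adopt_race(seed, steps=10_000):
    """Sequential adoption of technologies A and B. Each newcomer picks A
    with probability proportional to the *square* of A's current market
    share (an assumed rule: stronger-than-linear network feedback)."""
    rng = random.Random(seed)
    a, b = 1, 1  # one early adopter of each technology
    for _ in range(steps):
        share_a = a / (a + b)
        p_a = share_a**2 / (share_a**2 + (1 - share_a)**2)
        if rng.random() < p_a:
            a += 1
        else:
            b += 1
    return a / (a + b)

# Identical starting conditions, different random histories:
finals = [adopt_race(seed) for seed in range(8)]
print([round(s, 3) for s in finals])
```

Every run starts at a perfect 50/50 split, yet each one ends with one technology holding nearly the whole market; which one wins differs from run to run, decided by early chance.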
Finally, these effects are amplified by adaptive expectations and complementary assets. Once a technology or standard pulls ahead, people expect it to become the winner. They therefore invest their time and money in that technology, making it a self-fulfilling prophecy. An entire ecosystem of supporting products and services—complementary assets—grows around the dominant standard. Consider a health system's payment model. A Fee-For-Service (FFS) system isn't just an abstract rule; it's a universe of interconnected parts: electronic health records designed around billing codes, insurance products based on FFS actuarial models, and even patient expectations of how care should be delivered. This entire infrastructure co-evolves with the standard, each part reinforcing the others. To change the standard is not to flip a single switch, but to rewire an entire, deeply interconnected web.
What is the ultimate consequence of these powerful, self-reinforcing mechanisms? It is that history takes on a profound and inescapable importance. The path a system takes early in its life can determine its fate for a very long time. This is the concept of path dependence.
In a world governed by diminishing returns, history doesn't matter so much. If you make a wrong turn, the stabilizing forces of negative feedback will gently guide you back to the optimal solution. But in a world of increasing returns, small, contingent events can be magnified and cemented into place. A lucky break, a clever marketing decision, or even a random accident can give one competitor an early lead. Increasing returns kick in and amplify that lead, until it becomes an insurmountable advantage. The system becomes locked in to a particular path.
The classic example is the QWERTY keyboard layout. It was designed in the 19th century to slow typists down to prevent the keys on mechanical typewriters from jamming. It is far from the most efficient layout possible. Yet, it gained an early foothold. Millions learned to type on it, manufacturers built their equipment for it, and typing software was designed for it. The switching costs—the collective effort of retraining hundreds of millions of people and retooling an entire industry—are now so immense that we are locked in to a suboptimal standard.
The health system scenarios provide a perfect illustration of this process. An early choice to adopt a Fee-For-Service payment model triggers a cascade of self-reinforcing events: investments are made, expertise is developed, and expectations are set. Over time, the cost of switching to an alternative, like global budgets, becomes prohibitively high. The system is on a path that it cannot easily leave. Dynamic models of competition make this crystal clear: when increasing returns are strong, the system will have multiple stable equilibria (e.g., technology A dominates, or technology B dominates). Between them lies an unstable "tipping point." The final outcome depends entirely on which side of this tipping point the system happens to start. A small, early nudge one way or the other can decide the winner for generations.
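The tipping-point picture can be sketched with a toy share dynamic. The equation ds/dt = s(1−s)(s−0.5) used below is an illustrative assumption, not a model from the scenarios themselves; it is simply the minimal dynamic with two stable equilibria (s = 0 and s = 1) separated by an unstable tipping point at s = 0.5.

```python
def step(s, dt=0.1):
    """One Euler step of the toy share dynamic ds/dt = s*(1-s)*(s-0.5):
    stable equilibria at s=0 and s=1, unstable tipping point at s=0.5."""
    return s + dt * s * (1 - s) * (s - 0.5)

def run(s0, steps=2000):
    """Iterate the dynamic from initial market share s0."""
    s = s0
    for _ in range(steps):
        s = step(s)
    return s

print(round(run(0.49), 3))  # just below the tipping point: collapses toward 0
print(round(run(0.51), 3))  # just above: converges toward 1
```

A starting share of 0.49 versus 0.51, a difference of two percentage points, produces completely opposite long-run outcomes: the small early nudge decides everything.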
Here we arrive at the most profound, and perhaps unsettling, implication of increasing returns. The neat, elegant world of classical economics, with its predictable march toward a single, efficient equilibrium, relies on a crucial assumption: convexity. This is the mathematical embodiment of diminishing returns. It ensures that supply and demand curves are smooth and well-behaved, that profit landscapes have only one peak, and that the "invisible hand" of the market can do its job without any trouble.
But increasing returns shatter this convex world. As we saw, they create non-convexities—profit functions with valleys to be crossed and cost functions that don't always slope upwards. What does this do to the beautiful machine of the market? It can cause it to sputter and break.
Consider an economy where a firm has this kind of non-convex technology. Because of the high fixed costs and initial increasing returns, the firm doesn't smoothly increase its production as the market price rises. Instead, it does nothing until the price hits a critical "entry price," at which point it suddenly jumps into the market with a large volume of output. Its supply curve is not a smooth line, but a discontinuous leap.
When this discontinuous supply curve meets a standard, downward-sloping demand curve, strange things can happen. Sometimes, the curves intersect cleanly, and we get a single, unique equilibrium price, just as the textbooks promise. But sometimes, they might cross in a way that the potential equilibrium price is lower than the firm's entry price, meaning the firm would never produce at that price. Or perhaps the demand is never high enough to justify the firm entering the market at all. In these cases, no equilibrium exists. The market fails to find a price that can balance supply and demand. In yet other scenarios, if demand is completely flat below the entry price, there might be an entire continuum of prices where both supply and demand are zero, and the market is stuck in a zero-trade limbo.
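A minimal sketch makes the non-existence case concrete. The entry price, jump size, and demand parameters below are all illustrative assumptions, chosen to land in the no-equilibrium regime:

```python
def supply(p, entry_price=1.0, q_jump=2.0):
    """Non-convex firm: produces nothing below its entry price, then jumps
    into the market with a large output (a stylized discontinuous supply)."""
    return q_jump if p >= entry_price else 0.0

def demand(p, a=1.5, b=1.0):
    """Standard downward-sloping demand: q = max(a - b*p, 0)."""
    return max(a - b * p, 0.0)

# Scan prices for a market-clearing point where supply equals demand.
prices = [i / 1000 for i in range(3001)]
clearing = [p for p in prices if abs(supply(p) - demand(p)) < 1e-9]
print("market-clearing prices:", clearing)  # empty list: no equilibrium exists
```

Below the entry price the firm supplies nothing while demand is positive; above it, the firm's output jump overshoots what the market will absorb. Supply and demand never meet. With a lower entry price or stronger demand the curves would cross cleanly, so the existence of equilibrium here hinges entirely on the parameters.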
This is a startling conclusion. The presence of increasing returns means we can no longer take for granted that markets will find a single, optimal outcome. The economy can become unpredictable. It can get locked into inefficient paths. The technology that comes to dominate is not necessarily the best, but perhaps just the luckiest. This doesn't mean markets don't work; it means they are far more complex, interesting, and historically contingent than we might have first imagined. They are not just simple machines, but complex, evolving systems, where the echoes of the past shape the possibilities of the future.
Now that we have tinkered with the engine of increasing returns and seen how it runs, let us take it for a drive. Let us explore the world with this new lens and see what landscapes it has carved, what structures it has built, and what futures it might create. You may be surprised to find that this single principle—that success breeds success—is one of nature's most versatile and powerful tools. Its handiwork is visible everywhere, from the architecture of living creatures to the very shape of our global economy.
We often imagine evolution as a slow, grinding process of gradual change. But the history of life is also a story of spectacular leaps in organization: the leap from single cells to multicellular organisms, or the leap from one type of gamete to two distinct sexes. How does nature make such jumps? It turns out that increasing returns provide the impetus, turning a gentle slope of improvement into a launch ramp for innovation.
Consider one of the earliest moments of animal evolution: the emergence of the first multicellular creatures, like sponges. Imagine a simple clump of identical cells. Each cell must do everything: find food, digest it, and contribute to the whole. It is a jack-of-all-trades, but a master of none. What if some cells specialized in creating a water current to capture food (becoming "collar cells"), while others specialized in digesting and distributing that food (becoming "amoebocytes")? This is a division of labor. Common sense suggests this is more efficient, but why?
The secret lies in increasing returns. The effectiveness of a group of pumping cells can be greater than the sum of its parts. Two cells pumping in a coordinated way can generate a current that is more than twice as strong as a single cell's, thanks to the physics of fluid dynamics. This cooperative effect means the return on investment in "pumping machinery" is convex—or, in our language, it shows increasing returns. A model of this exact process shows that if the efficiency gain from this specialization is large enough to overcome the metabolic cost of organizing the two cell types, evolution will favor the specialized, more complex sponge body plan. The simple, undifferentiated ball of cells becomes unstable, and the population is driven to a new, higher state of organization. Increasing returns provide the selective "push" needed to climb the ladder of complexity.
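The condition can be illustrated with a toy calculation; all exponents, capacities, and overhead costs below are assumptions chosen purely for illustration, not parameters from the model itself:

```python
def specialized_colony(n, overhead=1.0):
    """Half the cells become dedicated collar cells; coordinated pumping has
    increasing returns (current ~ k**1.5). The other half digest, each
    processing up to 2 units of food. Maintaining two cell types costs
    `overhead` in metabolic terms (all numbers illustrative)."""
    pumpers = digesters = n // 2
    current = pumpers ** 1.5        # superlinear: coordination synergy
    capacity = 2.0 * digesters      # digestion bottleneck
    return min(current, capacity) - overhead

def generalist_colony(n):
    """Every cell splits its effort: half-effort pumping with no synergy
    (linear returns), half-effort digestion."""
    current = 0.5 * n               # no coordination, linear returns
    capacity = 2.0 * (0.5 * n)
    return min(current, capacity)

print(generalist_colony(8), specialized_colony(8))   # large colony: specialists win
print(generalist_colony(2), specialized_colony(2))   # tiny colony: not worth it
```

For the eight-cell colony, specialization nets more food despite its organizational overhead; for a two-cell clump it does not. The division of labor pays only once the superlinear gain outweighs the fixed cost of organizing two cell types, which is exactly the threshold condition described above.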
This same logic helps solve an even more fundamental puzzle: the origin of the two sexes, male and female. Most people don't think to ask why there are sperm and eggs. Why not just one type of generic gamete? Some simple organisms are in fact "isogamous," producing one type of gamete. So how did "anisogamy"—the divergence into small, mobile sperm and large, stationary eggs—arise?
Imagine an ancient, externally fertilizing organism living in a patchy environment, where mates and resources are clustered in sparse oases. Each organism has a fixed energy budget to produce gametes. It faces a trade-off: it can make many small gametes or a few large ones. A large gamete is well-provisioned, giving the future zygote a better start in life. A small gamete has few resources but can be made in vast numbers, and the energy saved can be invested in mobility to search for a mate.
Here, the principle of increasing returns enters in two places. First, provisioning a zygote has diminishing returns; the first bit of food is life-saving, but after a certain point, extra food helps less and less. Second, and crucially, searching for a rare oasis in a vast space has increasing returns. A small investment in mobility might be completely wasted if the gamete runs out of fuel before finding a mate. But once the investment in mobility is large enough to cross the barren gaps between patches, the probability of success shoots up dramatically.
This asymmetry in returns creates what is called disruptive selection. The middle ground—a medium-sized, moderately mobile gamete—is a terrible compromise. It is not provisioned well enough to give its offspring a major advantage, nor is it mobile enough to be an effective searcher. Selection relentlessly pushes the population to two extremes: one strategy gives up mobility entirely to maximize provisioning, producing large, stationary "eggs." The other strategy jettisons all excess baggage to invest everything in mobility, producing tiny, stripped-down, hyper-mobile "sperm." The existence of two sexes is not a historical accident; it is a profound and logical consequence of the mathematics of search and provisioning in a non-uniform world.
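A toy fitness calculation in this spirit shows why the extremes beat the middle. The functional forms and every constant below are assumptions of this sketch, not the model's actual parameters:

```python
import math

def zygote_fitness(s1, s2, budget=1.0):
    """Two parents each split a fixed budget into gametes of sizes s1, s2.
    Zygote survival rises with total provisioning but saturates (diminishing
    returns); the chance the gametes meet rises sharply only once the
    smaller gamete is cheap enough to be numerous and mobile (increasing
    returns to search). All forms and constants are illustrative."""
    n1, n2 = budget / s1, budget / s2            # how many gametes each makes
    provision = (s1 + s2) / (s1 + s2 + 0.5)      # saturating survival benefit
    smallest = min(s1, s2)
    # sigmoidal search success: steep payoff once the searcher is small enough
    encounter = 1.0 / (1.0 + math.exp(40 * (smallest - 0.1)))
    return n1 * n2 * encounter * provision

iso = zygote_fitness(0.25, 0.25)    # isogamy: two medium, compromised gametes
aniso = zygote_fitness(0.02, 0.48)  # anisogamy: tiny "sperm" + large "egg"
print(f"isogamy:   {iso:.2f}")
print(f"anisogamy: {aniso:.2f}")
```

The medium-sized pair fails on both counts: too big to be numerous and mobile, too small to provision the zygote lavishly. The extreme pair wins by a wide margin, which is the disruptive selection described above.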
This same logic, where small early advantages are magnified into large, stable differences, does not just operate over evolutionary eons. We see it sculpting our own societies, technologies, and economies, often in ways that create profound and persistent challenges.
Consider the stubborn problem of inequality in access to healthcare. Imagine a health system with two arms: a prestigious, well-funded cardiovascular hospital that primarily serves a wealthy population, and a network of underfunded community clinics that serve a poorer, minority population. A new budget becomes available. Where should the system invest? A simple analysis might suggest investing where the need is greatest—in the community clinics. Yet, time and again, we see resources flowing to the already-advantaged "centers of excellence."
This phenomenon can be explained by path dependence and increasing returns. The specialty hospital already has a massive installed base of capital, equipment, and highly trained staff. This creates a kind of "gravity well." The agglomeration of talent and resources can create its own increasing returns—a new star surgeon is more productive when surrounded by other experts and top-tier equipment. More formally, the organization faces immense friction if it tries to shift its path. Moving resources away from the established center involves real switching costs, political battles, and resistance from those who have a stake in the status quo (a kind of institutional "sunk cost" fallacy). A formal model shows that a rational, margin-maximizing administrator, even without any personal bias, will almost always choose to invest where the marginal financial return is highest, which is in the already-successful hospital. This creates a self-reinforcing loop, or path dependence, where the initial disparity in resources becomes entrenched and magnified over time. The system's structure perpetuates racial and economic inequality, not necessarily because of overt prejudice, but because it is locked into a path shaped by the logic of increasing returns.
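The allocation logic can be sketched in a few lines, assuming an illustrative production function Kᵃ with a > 1, so that the marginal return on a new dollar grows with the installed base:

```python
def marginal_return(capital, alpha=1.3):
    """Marginal product of one more unit of budget, d(K**alpha)/dK.
    With increasing returns (alpha > 1), it grows with installed capital."""
    return alpha * capital ** (alpha - 1)

big, small = 10.0, 1.0   # initial installed capital: hospital vs clinics
for _ in range(20):      # each year, one budget unit goes where the margin is highest
    if marginal_return(big) >= marginal_return(small):
        big += 1.0
    else:
        small += 1.0

print(big, small)  # the gap widens: every single dollar flows to the big center
```

Because the well-capitalized hospital starts ahead, its marginal return is always higher, so a purely margin-maximizing allocator sends it every dollar of every budget round. The initial disparity is not merely preserved; it compounds.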
This dynamic of "lock-in" extends to our entire technological infrastructure. Think of the QWERTY keyboard, which persists despite the existence of arguably more efficient layouts. It gained an early lead, and the network of keyboards, trained typists, and training programs created a feedback loop that made it unbeatable. The same is true for the competition between different energy technologies.
The process of technological improvement itself is a prime example of increasing returns. The more we produce a technology like solar panels or batteries, the more we learn about how to produce it efficiently. This is called "learning-by-doing." Each new unit produced lowers the cost of all future units. This positive feedback is why the cost of solar power has plummeted by over 90% in the last decade. But this wonderful engine of progress comes with a profound and unsettling consequence. As one model of an energy economy demonstrates, the presence of these dynamic increasing returns makes the future fundamentally unpredictable.
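Learning-by-doing is usually formalized as Wright's law: unit cost falls as a power of cumulative production. Here is a small sketch, with a 20% cost reduction per doubling assumed for illustration (a figure in the range often quoted for solar photovoltaics):

```python
import math

def unit_cost(cumulative, c0=1.0, learning_rate=0.20):
    """Wright's law: each doubling of cumulative production cuts unit cost
    by `learning_rate`. The 20% default is an assumed, illustrative figure."""
    b = -math.log2(1 - learning_rate)   # the "experience exponent"
    return c0 * cumulative ** (-b)

for doublings in (0, 1, 5, 10):
    cost = unit_cost(2 ** doublings)
    print(f"{doublings:2d} doublings of output -> unit cost {cost:.3f}")
```

Ten doublings of cumulative output at a 20% learning rate cut costs by roughly 89%, the same order of magnitude as the observed fall in solar prices. Note the feedback: production lowers cost, lower cost spurs demand, and demand drives more production.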
Because the system is path-dependent, a small, temporary policy nudge—like a carbon tax or a renewable energy subsidy—can potentially shift the entire economy from a fossil-fuel trajectory to a renewables trajectory. However, it also means our economic models are extraordinarily sensitive. Different plausible assumptions about the rate of learning can lead to wildly different forecasts about our energy future, even when all the assumptions are consistent with historical data. Even worse, the system may possess multiple possible futures, or equilibria, all of which are internally consistent. The economy could get locked into a high-cost, low-innovation path or a low-cost, high-innovation path, and the model alone cannot tell us which one will prevail. The presence of increasing returns means the economy is not a predictable machine but a complex, adaptive system whose future is not just unknown, but perhaps unknowable.
The principle of increasing returns even governs the invisible world of information, ideas, and influence. In our hyper-connected world, understanding how ideas spread through social networks is more important than ever.
Imagine trying to start a viral trend. You can "seed" your idea by convincing a few key influencers. A simple model of influence spread might assume diminishing returns: if you already have a large group of influencers on board, adding one more gives you less of a boost, because their audience likely overlaps with the others. In the language of network science, this influence function is "submodular."
But what if the process has reinforcement? What if hearing a message multiple times makes a person more likely to adopt it? A fascinating model explores this very question in a network with feedback loops, where person A can influence B, and B can influence A back. In this more realistic scenario, the law of diminishing returns breaks down. Instead, we find increasing marginal returns. Seeding two connected influencers together creates a powerful synergy. They reinforce each other's message, creating an echo chamber that is far more effective than the sum of their individual efforts. The marginal value of adding an influencer to your campaign grows as the group of influencers gets larger. This shatters submodularity and explains the explosive, winner-take-all dynamics of viral phenomena, financial manias, and political polarization. Success doesn't just breed success—it breeds it at an accelerating rate.
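One standard way to get this reinforcement effect is a threshold cascade, in which a node adopts only after at least two of its neighbors have. The tiny network below is a minimal sketch of the idea, not the specific model described above:

```python
def spread(seeds, edges, threshold=2, rounds=10):
    """Threshold cascade: a node adopts once at least `threshold` of its
    neighbors have adopted (one exposure is not enough: reinforcement)."""
    neighbors = {}
    for u, v in edges:
        neighbors.setdefault(u, set()).add(v)
        neighbors.setdefault(v, set()).add(u)
    active = set(seeds)
    for _ in range(rounds):
        newly = {n for n in neighbors
                 if n not in active
                 and len(neighbors[n] & active) >= threshold}
        if not newly:
            break
        active |= newly
    return len(active)

# Two influencers A and B who share an audience of four followers:
edges = [("A", x) for x in "cdef"] + [("B", x) for x in "cdef"]

f_empty = spread([], edges)          # no seeds: nothing spreads
f_a     = spread(["A"], edges)       # A alone: converts nobody
f_ab    = spread(["A", "B"], edges)  # A and B together: the audience tips

print(f_a - f_empty, f_ab - f_a)     # marginal gains: 1 then 5
```

Submodularity would require the second seed to add no more than the first (f_a − f_empty ≥ f_ab − f_a). Here the first influencer adds 1 while the second adds 5: the echo-chamber synergy described above, and a direct violation of diminishing marginal returns.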
From the first division of labor in a humble sponge, to the bifurcation of life into sperm and egg, to the stubborn inequalities in our cities and the dizzying trajectory of our technology, the signature of increasing returns is everywhere. It is the engine of creation, the architect of complexity, and the source of the universe’s fascinating, and often frustrating, unpredictability. It teaches us that history matters, that small, contingent events can have large and lasting consequences, and that the world we inhabit is not a smooth, linear machine but a bubbling, path-dependent process of perpetual self-creation. To grasp this one principle is to gain a far deeper appreciation for the intricate and interconnected tapestry of our world.