The Social Capital Stall Behind America’s Gerontocracy – Palladium

Štefan Štefančík/Man In A Hoodie

Modern finance is defined by the ‘savings glut’ problem: some groups—the wealthiest 0.1%, multinational corporations, and the central banks of export-dependent countries—relentlessly accumulate financial assets. Over time, this pushes interest rates lower, but it also reduces consumption. This is both a cause and a symptom of economic stagnation. But the problem goes deeper.

Running in parallel to the financial savings glut is one in social capital. Like other forms of capital, people accumulate social capital throughout their lives—trust, responsibility, banked favors, institutional competence, and markers of social status. In a healthy system, social capital compounds over time, but new entrants can always get some; think of a company where the CEO is trusted to make good decisions, but new workers also have a realistic chance of promotion over time, which requires investment in them by more established players. In a social capital savings glut, we would expect the people in charge to stay in charge, to hoard their reputational capital by avoiding risks, and to accept stagnation as the price of stability.

There is, in fact, evidence of this glut across many domains. In business, the average birth year of incoming Fortune 500 CEOs has stayed the same from 2005 through 2019. In politics, the two major party presidential candidates are 77 and 74. They’ve been well-known public figures since the 1980s, a generation ago. Trump was getting a full-length profile in New York Magazine in 1980. Biden, likewise, was prominent enough to pursue the Democratic nomination in 1988, and even received a single vote at the 1984 convention. While they’re opposites in terms of how much their nominations represented a party’s desire to rally behind its establishment, what they have in common is that they’ve been public figures for so long that their name recognition is self-fulfilling. The Senate Majority Leader is 78, and the House Majority Leader is 80. In entertainment, the people remain young, but the most valuable brands are old: of the 20 top-grossing movies in 2019, two were remakes, nine were sequels (including one sequel to a remake), seven were part of existing franchises, and only two were originals. Even ostensibly timeless parts of popular culture are determined by older people: today’s most popular Christmas songs are overwhelmingly ones released in the 1950s and ‘60s.

At one level, this could be viewed as a positive development. Perhaps we’ve gotten so good at identifying talented leaders that the same natural talents will be at the top of their career for decades. Perhaps Marvel Comics and Star Wars have fully fleshed out all of the compelling narratives the world needs. Christmas music, too, may be a solved problem. But it creates a strange syncopation of ambition. In a healthy society, people in their 20s, 30s, and 40s can believe that they could be at the top of their career with ten years of hard work. But it’s disconcerting for this thought to be most realistic for people over 65.

In more measurable domains like economics, it’s easy to sketch out how this problem could come about. The economist Thomas Piketty points out in Capital in the Twenty-First Century that any time the real return on wealth exceeds the real growth rate of the economy, this leads to large and growing wealth inequality. If r > g can explain continuous wealth accumulation, a similar model could explain the continuous accumulation of status and influence. In economics, r > g implies that if someone is already wealthy, and passively invests their money in a diversified portfolio, their share of economic output will rise over time. The equivalents ripple throughout society: in a company, management’s power rises over time; in the arts, the famous get gradually more fame and funding; in politics, each party’s leading lights get more airtime and influence through simple compounding.
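As a minimal sketch of the r > g mechanism (the rates, starting share, and horizon below are illustrative assumptions, not figures from the text), the whole dynamic reduces to one compounding ratio:

```python
# Illustrative sketch of the r > g dynamic; all numbers are assumptions.
def wealth_share(initial_share, r, g, years):
    """Share of total output held by a group whose wealth compounds at r
    while the economy as a whole grows at g."""
    return initial_share * ((1 + r) / (1 + g)) ** years

# A group holding 10% of output, earning 5% real returns in a 2%-growth
# economy, holds roughly 43% after fifty years of passive investing:
share = wealth_share(0.10, 0.05, 0.02, 50)

# If r equals g, the share never moves, no matter how long you wait:
flat = wealth_share(0.10, 0.03, 0.03, 50)
```

The same arithmetic applies to any compounding advantage—airtime, funding, banked favors—so long as the incumbent’s "return" on status exceeds the system’s overall growth rate.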

American political history offers some evidence for this. As Piketty notes in Capital, crises tend to destroy accumulated wealth faster than they destroy the economy as a whole. When a country loses a war, big business often loses its factories, and the political allies of the wartime elite have their power drastically curtailed. Even a victorious outcome can be wealth-depleting; for example, the First and Second World Wars were immensely destructive to the British elites. The British military lost 12% of its soldiers and 17% of its officers in WWI, including 20% of the officers who attended Eton, and the survivors had to deal with massive tax increases to pay for it all. After WWII, decades of taxes and inflation further depleted the upper class’s resources, while the end of Britain’s colonial system meant that ambitious elites had fewer opportunities to seek status and wealth abroad in Britain’s former holdings. Meanwhile, the workers still had 24 hours a day to sell, just as they did before the conflict, so as accumulated capital shrank, their share of the wealth rose.

Wars are an important mechanism for redistributing political status to the young. After the Revolutionary War, the first five presidents had served in the military in some capacity. Even John Adams, while not in uniform, had been chairman of the Continental Congress’ Board of War. Six of the seven presidents elected after the Civil War had served in the military, and every post-WWII president did the same until the election of Bill Clinton in 1992.

So, one way to understand the high average age of American politicians today is to note that the U.S. has not been involved in many recent conflicts where the conflict was seen as unambiguously righteous and resulted in a clear victory. Mass mobilization is an inherently leveling force, and the chaos of wartime means leaders are quickly identified, regardless of their backgrounds.

There are other avenues for ambitious people to make their mark, but these fields generally don’t represent a direct path to national leadership. Working for an investment bank, a consulting company, or a growing tech company is a path to rapid career advancement—but precisely because these organizations are so optimized for profits, they’re bad at influencing policy. The story of large tech companies is a story of growth that crushes every obstacle until the only one left is regulation, at which point the company has to quickly recruit a bevy of expensive advisors. For example, Uber hired Obama campaign manager David Plouffe on the way up, and then Obama’s Attorney General, Eric Holder, on the way down.

There are a handful of successful technology entrepreneurs who have made a splash politically, but in general, the way tech CEOs participate in the political process is by testifying before Congress. As Mark Zuckerberg once said, “If you want to change the world, the best thing to do is to build a company.”

What’s the mechanism? It may have something to do with the complementarity between specialists and generalists. A national politician or a CEO needs to be able to address many different topics. Ergo, politicians have to have a view on foreign policy, taxes, healthcare, the environment, law enforcement, and social issues. And generally, those views are backed up by more specialized staffers who help them craft coherent policy proposals and bullet-proof them against obvious rejoinders. A CEO has to recruit, deal with investors, make strategic decisions around product launches and cancellations, and negotiate mergers and acquisitions. And, like politicians, these CEOs need a staff of narrower experts to help them deal with complex tradeoffs.

For anyone who is considering their immediate options, being a specialist who reports to a generalist is the ideal trade: it means having a significant impact on a subset of the world, instead of the risky choice of having fewer constraints and zero impact. This tends to make established leadership self-perpetuating. Job security at the top makes specialization a safer choice for everyone else trying to enter the game, and that specialization reinforces the top’s job security by removing potential competitors.

And this, too, has compounding effects. People build up social capital during their careers—a balance sheet of favors owed and favors earned. In general, that balance sheet’s assets and liabilities expand over time, so career leverage rises with seniority. In economics, stability raises the returns from leverage; an unstable company or a volatile economy can’t support heavily-indebted companies and individuals, but any system where change is gradual is one where it’s safe to borrow. Early-stage biotech companies and R&D-driven semiconductor designers typically have little or no debt, while a mature company selling packaged food or cigarettes can afford to borrow immense sums, knowing that most of their customers will keep coming back at a predictable pace.
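A hedged sketch of that point (the cash flows and interest rate below are hypothetical numbers chosen for illustration): a borrower’s safe debt load is bounded by its worst year, not its average one, so stable earners can carry far more leverage:

```python
# Hypothetical numbers: safe leverage is set by worst-case cash flow.
def max_safe_debt(cash_flows, interest_rate):
    """Largest debt whose interest is covered even in the worst year."""
    return min(cash_flows) / interest_rate

stable   = [100, 98, 102, 101, 99]   # mature packaged-food-style earnings
volatile = [100, 10, 180, 40, 170]   # early-stage R&D-style earnings

# Both streams average 100 per year, but safe borrowing capacity differs
# by nearly an order of magnitude:
stable_debt   = max_safe_debt(stable, 0.05)    # 98 / 0.05 ≈ 1960
volatile_debt = max_safe_debt(volatile, 0.05)  # 10 / 0.05 ≈ 200
```

The social-capital analogue is the same: a career in a stable hierarchy can safely run on borrowed favors, while a volatile one cannot.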

Financial systems have a self-correcting way to deal with excess leverage driven by long periods of stability: they crash. The economist Hyman Minsky described a cycle where investors accustomed to high returns have to borrow ever larger sums to achieve those returns, and eventually the magnitude of this borrowing makes the entire system brittle enough that a small disruption can topple it. One of the other points Minsky makes, though, is that there’s a twilight period to these bubbles, where the primary driver of growth is the existence of leverage itself. In the subprime bubble, this was exemplified by hot real estate markets where a disproportionate share of the workforce was involved in construction and housing finance. In the telco/tech bubble of the 90s, it was when telecom network operators borrowed money and spent it on telecom equipment, or when dot coms raised money based on the performance of companies like Yahoo and AOL, and used that money to buy banner ads from Yahoo and AOL.

The analogous bubble-amplifying behavior in politics is to agree with senior politicians—only more so. This creates a dynamic where the collective behavior of ambitious young members of a campaign or administration tends to push that administration’s policies further from the center. Ultimately, national political leaders become more partisan than the base, but much less partisan than their staffs. It’s an unstable coalition that naturally breeds elitism. One illustrative example of this is the case of Jonathan Gruber, a healthcare economist at MIT. Gruber built simulations of the impacts of healthcare policies, and was one of the minds behind both Romneycare in Massachusetts and Obamacare nationwide. Gruber worked off two premises: first, that healthcare policies have to redistribute money from healthy people to sick people if they’re going to provide broader care at a given cost, and second, that voters don’t like this idea. Both are close to tautological. What Gruber also did, though, was admit this on video, attributing the complexities of the Affordable Care Act to “the stupidity of the American voter.” Gruber’s bracing cynicism is part of why he’s a successful wonk and a terrible politician. When his remarks were publicized, the PR blowback against the ACA was immense.

Unfortunately, the self-correction takes a long time. Unlike companies, there isn’t a quoted market tracking the value of a politician’s social capital, so there’s no way for the market to visibly crash—and there are few rewards for contrarian short bets. The payoff from not backing Jeb Bush in 2016 is tiny: saving a bit of money, and saving some faint embarrassment. But there isn’t an easy way to structure an asymmetric bet that a given politician’s chances are wildly inflated, except by supporting the underdog. And any system where specialists thrive is one where they’re essential. An outsider who wins power has to surround themselves with insiders to wield it, and those insiders refract the information they share with their boss and the interpretation they apply to policies.

The gap between branding and implementation is reflected throughout the modern system. Trump’s judicial picks are pre-vetted the same way George W. Bush’s were: by the Federalist Society. Large companies tack left on social issues while staying to the right on economic ones. They’d prefer controversy over their position on hot-button culture war topics to a discussion of where they’re shifting jobs and paying taxes. It’s the bullfighter’s logic: by waving the red cloth, you distract the charging animal and protect what really matters.

The status savings glut is self-perpetuating, and its most visible impact is the steady graying of authority figures. Advances in lighting, cosmetics, unobtrusive teleprompters, and perhaps stimulants have allowed politicians to remain videogenic for decades after they hit retirement age. With few limits on how long someone can remain a public figure, other than extreme senility or death, the social status savings glut can persist until the institutions themselves break down. In the case of companies, that can happen at a rapid pace; it only took a decade for IBM to go from setting the pace in its industry to veering close to bankruptcy.

But for governments, stagnation can continue for a long time before a sufficiently jarring reaction forces a change. That means the resulting crisis can become all the more destructive when it finally breaks.

Byrne Hobart works in the financial services industry and writes one of the top newsletters on Substack called The Diff. He has worked at research companies, a hedge fund, and a cryptocurrency startup.
