Monday, July 29, 2013

Wall Street Lays Another Egg

(by Niall Ferguson, Vanity Fair, December 2008)

This year we have lived through something more than a financial crisis. We have witnessed the death of a planet. Call it Planet Finance. Two years ago, in 2006, the measured economic output of the entire world was worth around $48.6 trillion. The total market capitalization of the world’s stock markets was $50.6 trillion, 4 percent larger. The total value of domestic and international bonds was $67.9 trillion, 40 percent larger. Planet Finance was beginning to dwarf Planet Earth.

Planet Finance seemed to spin faster, too. Every day $3.1 trillion changed hands on foreign-exchange markets. Every month $5.8 trillion changed hands on global stock markets. And all the time new financial life-forms were evolving. The total annual issuance of mortgage-backed securities, including fancy new “collateralized debt obligations” (C.D.O.’s), rose to more than $1 trillion. The volume of “derivatives”—contracts such as options and swaps—grew even faster, so that by the end of 2006 their notional value was just over $400 trillion. Before the 1980s, such things were virtually unknown. In the space of a few years their populations exploded. On Planet Finance, the securities outnumbered the people; the transactions outnumbered the relationships.

New institutions also proliferated. In 1990 there were just 610 hedge funds, with $38.9 billion under management. At the end of 2006 there were 9,462, with $1.5 trillion under management. Private-equity partnerships also went forth and multiplied. Banks, meanwhile, set up a host of “conduits” and “structured investment vehicles” (SIVs—surely the most apt acronym in financial history) to keep potentially risky assets off their balance sheets. It was as if an entire shadow banking system had come into being.

Then, beginning in the summer of 2007, Planet Finance began to self-destruct in what the International Monetary Fund soon acknowledged to be “the largest financial shock since the Great Depression.” Did the crisis of 2007–8 happen because American companies had gotten worse at designing new products? Had the pace of technological innovation or productivity growth suddenly slackened? No. The proximate cause of the economic uncertainty of 2008 was financial: to be precise, a crunch in the credit markets triggered by mounting defaults on a hitherto obscure species of housing loan known euphemistically as “subprime mortgages.”

Central banks in the United States and Europe sought to alleviate the pressure on the banks with interest-rate cuts and offers of funds through special “term auction facilities.” Yet the market rates at which banks could borrow money, whether by issuing commercial paper, selling bonds, or borrowing from one another, failed to follow the lead of the official federal-funds rate. The banks had to turn not only to Western central banks for short-term assistance to rebuild their reserves but also to Asian and Middle Eastern sovereign-wealth funds for equity injections. When these sources proved insufficient, investors—and speculative short-sellers—began to lose faith.

Beginning with Bear Stearns, Wall Street’s investment banks entered a death spiral that ended with their being either taken over by a commercial bank (as Bear was, followed by Merrill Lynch) or driven into bankruptcy (as Lehman Brothers was). In September the two survivors—Goldman Sachs and Morgan Stanley—formally ceased to be investment banks, signaling the death of a business model that dated back to the Depression. Other institutions deemed “too big to fail” by the U.S. Treasury were effectively taken over by the government, including the mortgage lenders and guarantors Fannie Mae and Freddie Mac and the insurance giant American International Group (A.I.G.).

By September 18 the U.S. financial system was gripped by such panic that the Treasury had to abandon this ad hoc policy. Treasury Secretary Henry Paulson hastily devised a plan whereby the government would be authorized to buy “troubled” securities with up to $700 billion of taxpayers’ money—a figure apparently plucked from the air. When the measure was rejected by Congress 11 days later, there was panic. When a modified version was passed four days after that, there was more panic. Now it wasn’t just bank stocks that were tanking. The entire stock market seemed to be in free fall as fears mounted that the credit crunch was going to trigger a recession. Moreover, the crisis was now clearly global in scale. European banks were in much the same trouble as their American counterparts, while emerging-market stock markets were crashing. A week of frenetic improvisation by national governments culminated on the weekend of October 11–12, when the United States reluctantly followed the British government’s lead, buying equity stakes in banks rather than just their dodgy assets and offering unprecedented guarantees of banks’ debt and deposits.

Since these events coincided with the final phase of a U.S. presidential-election campaign, it was not surprising that some rather simplistic lessons were soon being touted by candidates and commentators. The crisis, some said, was the result of excessive deregulation of financial markets. Others sought to lay the blame on unscrupulous speculators: short-sellers, who borrowed the stocks of vulnerable banks and sold them in the expectation of further price declines. Still other suspects in the frame were negligent regulators and corrupt congressmen. This hunt for scapegoats is futile. To understand the downfall of Planet Finance, you need to take several steps back and locate this crisis in the long run of financial history. Only then will you see that we have all played a part in this latest sorry example of what the Victorian journalist Charles Mackay described in his 1841 book, Extraordinary Popular Delusions and the Madness of Crowds.

Nothing New

As long as there have been banks, bond markets, and stock markets, there have been financial crises. Banks went bust in the days of the Medici. There were bond-market panics in the Venice of Shylock’s day. And the world’s first stock-market crash happened in 1720, when the Mississippi Company—the Enron of its day—blew up. According to economists Carmen Reinhart and Kenneth Rogoff, the financial history of the past 800 years is a litany of debt defaults, banking crises, currency crises, and inflationary spikes. Moreover, financial crises seldom happen without inflicting pain on the wider economy. Another recent paper, co-authored by Rogoff’s Harvard colleague Robert Barro, has identified 148 crises since 1870 in which a country experienced a cumulative decline in gross domestic product (G.D.P.) of at least 10 percent, implying a probability of financial disaster of around 3.6 percent per year.

If stock-market movements followed the normal-distribution, or bell, curve, like human heights, an annual drop of 10 percent or more would happen only once every 500 years, whereas in the case of the Dow Jones Industrial Average it has happened in 20 of the last 100 years. And stock-market plunges of 20 percent or more would be unheard of—rather like people a foot and a half tall—whereas in fact there have been eight such crashes in the past century.

The most famous financial crisis—the Wall Street Crash—is conventionally said to have begun on “Black Thursday,” October 24, 1929, when the Dow declined by 2 percent, though in fact the market had been slipping since early September and had suffered a sharp, 6 percent drop on October 23. On “Black Monday,” October 28, it plunged by 13 percent, and the next day by a further 12 percent. In the course of the next three years the U.S. stock market declined by a staggering 89 percent, reaching its nadir in July 1932. The index did not regain its 1929 peak until November 1954.

That helps put our current troubles into perspective. From its peak of 14,164, on October 9, 2007, to a dismal level of 8,579, exactly a year later, the Dow declined by 39 percent. By contrast, on a single day just over two decades ago—October 19, 1987—the index fell by 23 percent, one of only four days in history when the index has fallen by more than 10 percent in a single trading session.

This crisis, however, is about much more than just the stock market. It needs to be understood as a fundamental breakdown of the entire financial system, extending from the monetary-and-banking system through the bond market, the stock market, the insurance market, and the real-estate market. It affects not only established financial institutions such as investment banks but also relatively novel ones such as hedge funds. It is global in scope and unfathomable in scale.

Had it not been for the frantic efforts of the Federal Reserve and the Treasury, to say nothing of their counterparts in almost equally afflicted Europe, there would by now have been a repeat of that “great contraction” of credit and economic activity that was the prime mover of the Depression. Back then, the Fed and the Treasury did next to nothing to prevent bank failures from translating into a drastic contraction of credit and hence of business activity and employment. If the more openhanded monetary and fiscal authorities of today are ultimately successful in preventing a comparable slump of output, future historians may end up calling this “the Great Repression.” This is the Depression they are hoping to bottle up—a Depression in denial.

To understand why we have come so close to a rerun of the 1930s, we need to begin at the beginning, with banks and the money they make. From the Middle Ages until the mid-20th century, most banks made their money by maximizing the difference between the costs of their liabilities (payments to depositors) and the earnings on their assets (interest and commissions on loans). Some banks also made money by financing trade, discounting the commercial bills issued by merchants. Others issued and traded bonds and stocks, or dealt in commodities (especially precious metals). But the core business of banking was simple. It consisted, as the third Lord Rothschild pithily put it, “essentially of facilitating the movement of money from Point A, where it is, to Point B, where it is needed.”

The system evolved gradually. First came the invention of cashless intra-bank and inter-bank transactions, which allowed debts to be settled between account holders without having money physically change hands. Then came the idea of fractional-reserve banking, whereby banks kept only a small proportion of their existing deposits on hand to satisfy the needs of depositors (who seldom wanted all their money simultaneously), allowing the rest to be lent out profitably. That was followed by the rise of special public banks with monopolies on the issuing of banknotes and other powers and privileges: the first central banks.
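The credit-creating power of fractional-reserve banking can be made concrete with a stylized example. Below is a minimal sketch (Python); the 10 percent reserve ratio and the assumption that every loan is re-deposited in the banking system are illustrative, not figures from the text:

```python
# Stylized illustration of fractional-reserve money creation.
# Assumptions (for illustration only): a 10% reserve ratio and
# full re-deposit of every loan back into the banking system.

RESERVE_RATIO = 0.10
initial_deposit = 100.0

total_deposits = 0.0
deposit = initial_deposit
for _ in range(100):                          # iterate until the residual is negligible
    total_deposits += deposit                 # each re-deposit adds to bank money
    lent_out = deposit * (1 - RESERVE_RATIO)  # bank keeps 10%, lends out the rest
    deposit = lent_out                        # the loan is spent and re-deposited

print(f"Total deposits created:  ${total_deposits:,.2f}")
print(f"Theoretical limit (1/r): ${initial_deposit / RESERVE_RATIO:,.2f}")
# Both approach $1,000: a $100 cash deposit can support $1,000 of bank money.
```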

With these innovations, money ceased to be understood as precious metal minted into coins. Now it was the sum total of specific liabilities (deposits and reserves) incurred by banks. Credit was the other side of banks’ balance sheets: the total of their assets; in other words, the loans they made. Some of this money might still consist of precious metal, though a rising proportion of that would be held in the central bank’s vault. Most would be made up of banknotes and coins recognized as “legal tender,” along with money that was visible only in current- and deposit-account statements.

Until the late 20th century, the system of bank money retained an anchor in the pre-modern conception of money in the form of the gold standard: fixed ratios between units of account and quantities of precious metal. As early as 1924, the English economist John Maynard Keynes dismissed the gold standard as a “barbarous relic,” but the last vestige of the system did not disappear until August 15, 1971—the day President Richard Nixon closed the so-called gold window, through which foreign central banks could still exchange dollars for gold. With that, the centuries-old link between money and precious metal was broken.

Though we tend to think of money today as being made of paper, in reality most of it now consists of bank deposits. If we measure the ratio of actual money to output in developed economies, it becomes clear that the trend since the 1970s has been for that ratio to rise from around 70 percent, before the closing of the gold window, to more than 100 percent by 2005. The corollary has been a parallel growth of credit on the other side of bank balance sheets. A significant component of that credit growth has been a surge of lending to consumers. Back in 1952, the ratio of household debt to disposable income was less than 40 percent in the United States. At its peak in 2007, it reached 133 percent, up from 90 percent a decade before. Today Americans carry a total of $2.56 trillion in consumer debt, up by more than a fifth since 2000.

Even more spectacular, however, has been the rising indebtedness of banks themselves. In 1980, bank indebtedness was equivalent to 21 percent of U.S. gross domestic product. In 2007 the figure was 116 percent. Another measure of this was the declining capital adequacy of banks. On the eve of “the Great Repression,” average bank capital in Europe was equivalent to less than 10 percent of assets; at the beginning of the 20th century, it was around 25 percent. It was not unusual for investment banks’ balance sheets to be as much as 20 or 30 times larger than their capital, thanks in large part to a 2004 rule change by the Securities and Exchange Commission that exempted the five largest of those banks from the regulation that had capped their debt-to-capital ratio at 12 to 1. The Age of Leverage had truly arrived for Planet Finance.

Credit and money, in other words, have for decades been growing more rapidly than underlying economic activity. Is it any wonder, then, that money has ceased to hold its value the way it did in the era of the gold standard? The motto “In God we trust” was added to the dollar bill in 1957. Since then its purchasing power, relative to the consumer price index, has declined by a staggering 87 percent. Average annual inflation during that period has been more than 4 percent. A man who decided to put his savings into gold in 1970 could have bought just over 27.8 ounces of the precious metal for $1,000. At the time of writing, with gold trading at $900 an ounce, he could have sold it for around $25,000.

Those few goldbugs who always doubted the soundness of fiat money—paper currency without a metal anchor—have in large measure been vindicated. But why were the rest of us so blinded by money illusion?
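Both of those figures can be reproduced with simple arithmetic. A minimal sketch (Python), using only numbers quoted in the paragraph above:

```python
# Back-of-envelope check of the paragraph's claims (inputs from the text).

# 1. An 87% decline in purchasing power between 1957 and the article's
#    publication in 2008 implies average annual inflation of just over 4%.
years = 2008 - 1957                                   # 51 years
implied_inflation = (1 / (1 - 0.87)) ** (1 / years) - 1
print(f"Implied average inflation: {implied_inflation:.1%}")  # ~4.1%

# 2. $1,000 of gold bought in 1970 (27.8 oz) valued at $900/oz today.
ounces = 27.8
print(f"Value at $900/oz: ${ounces * 900:,.0f}")              # ~$25,020
```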

Blowing Bubbles

In the immediate aftermath of the death of gold as the anchor of the monetary system, the problem of inflation affected mainly retail prices and wages. Today, only around one out of seven countries has an inflation rate above 10 percent, and only one, Zimbabwe, is afflicted with hyperinflation. But back in 1979 at least 7 countries had an annual inflation rate above 50 percent, and more than 60 countries—including Britain and the United States—had inflation in double digits.

Inflation has come down since then, partly because many of the items we buy—from clothes to computers—have gotten cheaper as a result of technological innovation and the relocation of production to low-wage economies in Asia. It has also been reduced because of a worldwide transformation in monetary policy, which began with the monetarist-inspired increases in short-term rates implemented by the Federal Reserve in 1979. Just as important, some of the structural drivers of inflation, such as powerful trade unions, have also been weakened.

By the 1980s, in any case, more and more people had grasped how to protect their wealth from inflation: by investing it in assets they expected to appreciate in line with, or ahead of, the cost of living. These assets could take multiple forms, from modern art to vintage wine, but the most popular proved to be stocks and real estate. Once it became clear that this formula worked, the Age of Leverage could begin. For it clearly made sense to borrow to the hilt to maximize your holdings of stocks and real estate if these promised to generate higher rates of return than the interest payments on your borrowings. Between 1990 and 2004, most American households did not see an appreciable improvement in their incomes. Adjusted for inflation, the median household income rose by about 6 percent. But people could raise their living standards by borrowing and investing in stocks and housing.

Nearly all of us did it. And the bankers were there to help. Not only could they borrow more cheaply from one another than we could borrow from them; increasingly they devised all kinds of new mortgages that looked more attractive to us (and promised to be more lucrative to them) than boring old 30-year fixed-rate deals. Moreover, the banks were just as ready to play the asset markets as we were. Proprietary trading soon became the most profitable arm of investment banking: buying and selling assets on the bank’s own account.

There was, however, a catch. The Age of Leverage was also an age of bubbles, beginning with the dot-com bubble of the irrationally exuberant 1990s and ending with the real-estate mania of the exuberantly irrational 2000s. Why was this?

The future is in large measure uncertain, so our assessments of future asset prices are bound to vary. If we were all calculating machines, we would simultaneously process all the available information and come to the same conclusion. But we are human beings, and as such are prone to myopia and mood swings. When asset prices surge upward in sync, it is as if investors are gripped by a kind of collective euphoria. Conversely, when their “animal spirits” flip from greed to fear, the bubble that their earlier euphoria inflated can burst with amazing suddenness. Zoological imagery is an integral part of the culture of Planet Finance. Optimistic buyers are “bulls,” pessimistic sellers are “bears.” The real point, however, is that stock markets are mirrors of the human psyche. Like Homo sapiens, they can become depressed. They can even suffer complete breakdowns.

This is no new insight. In the 400 years since the first shares were bought and sold on the Amsterdam Beurs, there has been a long succession of financial bubbles. Time and again, asset prices have soared to unsustainable heights only to crash downward again. So familiar is this pattern—described by the economic historian Charles Kindleberger—that it is possible to distill it into five stages:

(1) Displacement: Some change in economic circumstances creates new and profitable opportunities.
(2) Euphoria, or overtrading: A feedback process sets in whereby expectation of rising profits leads to rapid growth in asset prices.
(3) Mania, or bubble: The prospect of easy capital gains attracts first-time investors and swindlers eager to mulct them of their money.
(4) Distress: The insiders discern that profits cannot possibly justify the now exorbitant price of the assets and begin to take profits by selling.
(5) Revulsion, or discredit: As asset prices fall, the outsiders stampede for the exits, causing the bubble to burst.

The key point is that without easy credit creation a true bubble cannot occur. That is why so many bubbles have their origins in the sins of omission and commission of central banks.

The bubbles of our time had their origins in the aftermath of the 1987 stock-market crash, when then novice Federal Reserve chairman Alan Greenspan boldly affirmed the Fed’s “readiness to serve as a source of liquidity to support the economic and financial system.” This sent a signal to the markets, particularly the New York banks: if things got really bad, he stood ready to bail them out. Thus was born the “Greenspan put”—the implicit option the Fed gave traders to be able to sell their stocks at today’s prices even in the event of a meltdown tomorrow.

Having contained a panic once, Greenspan thereafter had a dilemma lurking in the back of his mind: whether or not to act pre-emptively the next time—to prevent a panic altogether. This dilemma came to the fore as a classic stock-market bubble took shape in the mid-90s. The displacement in this case was the explosion of innovation by the technology and software industry as personal computers met the Internet. But, as in all of history’s bubbles, an accommodative monetary policy also played a role. From a peak of 6 percent in February 1995, the federal-funds target rate had been reduced to 5.25 percent by January 1996. It was then cut in steps, in the fall of 1998, down to 4.75 percent, and it remained at that level until June 1999, by which time the Dow had passed the 10,000 mark.

Why did the Fed allow euphoria to run loose in the 1990s? Partly because Greenspan and his colleagues underestimated the momentum of the technology bubble; as early as December 1995, with the Dow just past the 5,000 mark, members of the Fed’s Open Market Committee speculated that the market might be approaching its peak. Partly, also, because Greenspan came to the conclusion that it was not the Fed’s responsibility to worry about asset-price inflation, only consumer-price inflation, and this, he believed, was being reduced by a major improvement in productivity due precisely to the tech boom.

Greenspan could not postpone a stock-exchange crash indefinitely. After Silicon Valley’s dot-com bubble peaked, in March 2000, the U.S. stock market fell by almost half over the next two and a half years. It was not until May 2007 that investors in the Standard & Poor’s 500 had recouped their losses. But the Fed’s response to the sell-off—and the massive shot of liquidity it injected into the financial markets after the 9/11 terrorist attacks—prevented the “correction” from precipitating a depression. Not only were the 1930s averted; so too, it seemed, was a repeat of the Japanese experience after 1989, when a conscious effort by the central bank to prick an asset bubble had ended up triggering an 80 percent stock-market sell-off, a real-estate collapse, and a decade of economic stagnation.

What was not immediately obvious was that Greenspan’s easy-money policy was already generating another bubble—this time in the financial market that a majority of Americans have been encouraged for generations to play: the real-estate market.

The American Dream

Real estate is the English-speaking world’s favorite economic game. No other facet of financial life has such a hold on the popular imagination. The real-estate market is unique. Every adult, no matter how economically illiterate, has a view on its future prospects. Through the evergreen board game Monopoly, even children are taught how to climb the property ladder.

Once upon a time, people saved a portion of their earnings for the proverbial rainy day, stowing the cash in a mattress or a bank safe. The Age of Leverage, as we have seen, brought a growing reliance on borrowing to buy assets in the expectation of their future appreciation in value. For a majority of families, this meant a leveraged investment in a house. That strategy had one very obvious flaw. It represented a one-way, totally unhedged bet on a single asset.

To be sure, investing in housing paid off handsomely for more than half a century, up until 2006. Suppose you had put $100,000 into the U.S. property market back in the first quarter of 1987. According to the Case-Shiller national home-price index, you would have nearly tripled your money by the first quarter of 2007, to $299,000. On the other hand, if you had put the same money into the S&P 500, and had continued to re-invest the dividend income in that index, you would have ended up with $772,000 to play with—more than double what you would have made on bricks and mortar.

There is, obviously, an important difference between a house and a stock-market index. You cannot live in a stock-market index. For the sake of a fair comparison, allowance must therefore be made for the rent you save by owning your house (or the rent you can collect if you own a second property). A simple way to proceed is just to leave out both dividends and rents. In that case the difference is somewhat reduced. In the two decades after 1987, the S&P 500, excluding dividends, rose by a factor of just over six, meaning that an investment of $100,000 would be worth some $600,000. But that still comfortably beat housing.
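Expressed as compound annual growth rates, the gap is easier to see. A quick sketch (Python) using only the endpoint values quoted above, over the 20 years from the first quarter of 1987:

```python
# Implied compound annual growth rates (CAGR) from the figures in the text.

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate implied by a start and end value."""
    return (end / start) ** (1 / years) - 1

YEARS = 20  # Q1 1987 to Q1 2007

print(f"Housing    (100k -> 299k): {cagr(100_000, 299_000, YEARS):.1%}")  # ~5.6%
print(f"S&P + divs (100k -> 772k): {cagr(100_000, 772_000, YEARS):.1%}")  # ~10.8%
print(f"S&P alone  (100k -> 600k): {cagr(100_000, 600_000, YEARS):.1%}")  # ~9.4%
```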

There are three other considerations to bear in mind when trying to compare housing with other forms of assets. The first is depreciation. Stocks do not wear out and require new roofs; houses do. The second is liquidity. As assets, houses are a great deal more expensive to convert into cash than stocks. The third is volatility. Housing markets since World War II have been far less volatile than stock markets. Yet that is not to say that house prices have never deviated from a steady upward path. In Britain between 1989 and 1995, for example, the average house price fell by 18 percent, or, in inflation-adjusted terms, by more than a third—37 percent. In London, the real decline was closer to 47 percent. In Japan between 1990 and 2000, property prices fell by more than 60 percent.

The recent decline of property prices in the United States should therefore have come as less of a shock than it did. Between July 2006 and June 2008, the Case-Shiller index of home prices in 20 big American cities declined on average by 19 percent. In some of these cities—Phoenix, San Diego, Los Angeles, and Miami—the total decline was as much as a third. Seen in international perspective, those are not unprecedented figures. Seen in the context of the post-2000 bubble, prices have yet to return to their starting point. On average, house prices are still 50 percent higher than they were at the beginning of this process.

So why were we oblivious to the likely bursting of the real-estate bubble? The answer is that for generations we have been brainwashed into thinking that borrowing to buy a house is the only rational financial strategy to pursue. Think of Frank Capra’s classic 1946 movie, It’s a Wonderful Life, which tells the story of the family-owned Bailey Building & Loan, a small-town mortgage firm that George Bailey (played by James Stewart) struggles to keep afloat in the teeth of the Depression. “You know, George,” his father tells him, “I feel that in a small way we are doing something important. It’s satisfying a fundamental urge. It’s deep in the race for a man to want his own roof and walls and fireplace, and we’re helping him get those things in our shabby little office.” George gets the message, as he passionately explains to the villainous slumlord Potter after Bailey Sr.’s death: “[My father] never once thought of himself.… But he did help a few people get out of your slums, Mr. Potter. And what’s wrong with that? … Doesn’t it make them better citizens? Doesn’t it make them better customers?”

There, in a nutshell, is one of the key concepts of the 20th century: the notion that property ownership enhances citizenship, and that therefore a property-owning democracy is more socially and politically stable than a democracy divided into an elite of landlords and a majority of property-less tenants. So deeply rooted is this idea in our political culture that it comes as a surprise to learn that it was invented just 70 years ago.

Fannie, Ginnie, and Freddie

Prior to the 1930s, only a minority of Americans owned their homes. During the Depression, however, the Roosevelt administration created a whole complex of institutions to change that. A Federal Home Loan Bank Board was set up in 1932 to encourage and oversee local mortgage lenders known as savings-and-loans (S&Ls)—mutual associations that took in deposits and lent to homebuyers. Under the New Deal, the Home Owners’ Loan Corporation stepped in to refinance mortgages on longer terms, up to 15 years. To reassure depositors, who had been traumatized by the thousands of bank failures of the previous three years, Roosevelt introduced federal deposit insurance. And by providing federally backed insurance for mortgage lenders, the Federal Housing Administration (F.H.A.) sought to encourage large (up to 80 percent of the purchase price), long (20- to 25-year), fully amortized, low-interest loans.

By standardizing the long-term mortgage and creating a national system of official inspection and valuation, the F.H.A. laid the foundation for a secondary market in mortgages. This market came to life in 1938, when a new Federal National Mortgage Association—nicknamed Fannie Mae—was authorized to issue bonds and use the proceeds to buy mortgages from the local S&Ls, which were restricted by regulation both in terms of geography (they could not lend to borrowers more than 50 miles from their offices) and in terms of the rates they could offer (the so-called Regulation Q, which imposed a low ceiling on interest paid on deposits). Because these changes tended to reduce the average monthly payment on a mortgage, the F.H.A. made home ownership viable for many more Americans than ever before. Indeed, it is not too much to say that the modern United States, with its seductively samey suburbs, was born with Fannie Mae. Between 1940 and 1960, the home-ownership rate soared from 43 to 62 percent.

These were not the only ways in which the federal government sought to encourage Americans to own their own homes. Mortgage-interest payments were always tax-deductible, from the inception of the federal income tax in 1913. As Ronald Reagan said when the rationality of this tax break was challenged, mortgage-interest relief was “part of the American dream.”

In 1968, to broaden the secondary-mortgage market still further, Fannie Mae was split in two—the Government National Mortgage Association (Ginnie Mae), which was to cater to poor borrowers, and a rechartered Fannie Mae, now a privately owned government-sponsored enterprise (G.S.E.). Two years later, to provide competition for Fannie Mae, the Federal Home Loan Mortgage Corporation (Freddie Mac) was set up. In addition, Fannie Mae was permitted to buy conventional as well as government-guaranteed mortgages. Later, with the Community Reinvestment Act of 1977, American banks found themselves under pressure for the first time to lend to poor, minority communities.

These changes presaged a more radical modification to the New Deal system. In the late 1970s, the savings-and-loan industry was hit first by double-digit inflation and then by sharply rising interest rates. This double punch was potentially lethal. The S&Ls were simultaneously losing money on long-term, fixed-rate mortgages, due to inflation, and hemorrhaging deposits to higher-interest money-market funds. The response in Washington from both the Carter and Reagan administrations was to try to salvage the S&Ls with tax breaks and deregulation. When the new legislation was passed, President Reagan declared, “All in all, I think we hit the jackpot.” Some people certainly did.

On the one hand, S&Ls could now invest in whatever they liked, not just local long-term mortgages. Commercial property, stocks, junk bonds—anything was allowed. They could even issue credit cards. On the other, they could now pay whatever interest rate they liked to depositors. Yet all their deposits were still effectively insured, with the maximum covered amount raised from $40,000 to $100,000, thanks to a government regulation two years earlier. And if ordinary deposits did not suffice, the S&Ls could raise money in the form of brokered deposits from middlemen. What happened next perfectly illustrated the great financial precept first enunciated by William Crawford, the commissioner of the California Department of Savings and Loan: “The best way to rob a bank is to own one.” Some S&Ls bet their depositors’ money on highly dubious real-estate developments. Many simply stole the money, as if deregulation meant that the law no longer applied to them at all.

When the ensuing bubble burst, nearly 300 S&Ls collapsed, while another 747 were closed or reorganized under the auspices of the Resolution Trust Corporation, established by Congress in 1989 to clear up the mess. The final cost of the crisis was $153 billion (around 3 percent of the 1989 G.D.P.), of which taxpayers had to pay $124 billion. But even as the S&Ls were going belly-up, they offered another, very different group of American financial institutions a fast track to megabucks. To the bond traders at Salomon Brothers, the New York investment bank, the breakdown of the New Deal mortgage system was not a crisis but a wonderful opportunity. As profit-hungry as their language was profane, the self-styled “Big Swinging Dicks” at Salomon saw a way of exploiting the gyrating interest rates of the early 1980s.

The idea was to re-invent mortgages by bundling thousands of them together as the backing for new and alluring securities that could be sold as alternatives to traditional government and corporate bonds—in short, to convert mortgages into bonds. Once lumped together, the interest payments due on the mortgages could be subdivided into strips with different maturities and credit risks. The first issue of this new kind of mortgage-backed security (known as a “collateralized mortgage obligation”) occurred in June 1983. The dawn of securitization was a necessary prelude to the Age of Leverage.

Once again, however, it was the federal government that stood ready to pick up the tab in a crisis. For the majority of mortgages continued to enjoy an implicit guarantee from the government-sponsored trio of Fannie, Freddie, and Ginnie, meaning that bonds which used those mortgages as collateral could be represented as virtual government bonds and considered “investment grade.” Between 1980 and 2007, the volume of such G.S.E.-backed mortgage-backed securities grew from less than $200 billion to more than $4 trillion. In 1980 only 10 percent of the home-mortgage market was securitized; by 2007, 56 percent of it was.

These changes swept away the last vestiges of the business model depicted in It’s a Wonderful Life. Once there had been meaningful social ties between mortgage lenders and borrowers. James Stewart’s character knew both the depositors and the debtors. By contrast, in a securitized market the interest you paid on your mortgage ultimately went to someone who had no idea you existed. The full implications of this transition for ordinary homeowners would become apparent only 25 years later.

The Lessons of Detroit

In July 2007, I paid a visit to Detroit, because I had the feeling that what was happening there was the shape of things to come in the United States as a whole. In the space of 10 years, house prices in Detroit, which probably possesses the worst housing stock of any American city other than New Orleans, had risen by more than a third—not much compared with the nationwide bubble, but still hard to explain, given the city’s chronically depressed economic state. As I discovered, the explanation lay in fundamental changes in the rules of the housing game.

I arrived at the end of a borrowing spree. For several years agents and brokers selling subprime mortgages had been flooding Detroit with radio, television, and direct-mail advertisements, offering what sounded like attractive deals. In 2006, for example, subprime lenders pumped more than a billion dollars into 22 Detroit Zip Codes.

These were not the old 30-year fixed-rate mortgages invented in the New Deal. On the contrary, a high proportion were adjustable-rate mortgages—in other words, the interest rate could vary according to changes in short-term lending rates. Many were also interest-only mortgages, without amortization (repayment of principal), even when the principal represented 100 percent of the assessed value of the mortgaged property. And most had introductory “teaser” periods, whereby the initial interest payments—usually for the first two years—were kept artificially low, with the cost of the loan backloaded. All of these devices were intended to allow an immediate reduction in the debt-servicing costs of the borrower.
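The payment shock built into such loans is straightforward to quantify. Here is a hedged sketch (Python); the loan size, teaser rate, reset rate, and terms below are illustrative assumptions rather than figures from the article:

```python
# Illustrative teaser-rate payment shock. All specific numbers here are
# assumptions for the sake of example, not figures from the article.

def amortized_payment(principal: float, annual_rate: float, years: int) -> float:
    """Standard fixed monthly payment on a fully amortizing loan."""
    r = annual_rate / 12                    # monthly interest rate
    n = years * 12                          # number of monthly payments
    return principal * r / (1 - (1 + r) ** -n)

principal = 200_000.0                       # assumed loan: 100% of assessed value
teaser_rate = 0.04                          # assumed 2-year interest-only teaser
reset_rate = 0.095                          # assumed fully indexed adjustable rate

teaser_payment = principal * teaser_rate / 12            # interest-only payment
reset_payment = amortized_payment(principal, reset_rate, 28)  # 28 years remaining

print(f"Teaser payment:   ${teaser_payment:,.0f}/month")   # ~$667
print(f"Payment at reset: ${reset_payment:,.0f}/month")    # ~$1,704
```

Under these assumptions the monthly bill more than doubles at reset, which is exactly the backloading the paragraph describes.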

In Detroit only a minority of these loans were going to first-time buyers. They were nearly all refinancing deals, which allowed borrowers to treat their homes as cash machines, converting their existing equity into cash and using the proceeds to pay off credit-card debts, carry out renovations, or buy new consumer durables. However, the combination of declining long-term interest rates and ever more alluring mortgage deals did attract new buyers into the housing market. By 2005, 69 percent of all U.S. householders were homeowners; 10 years earlier it had been 64 percent. About half of that increase could be attributed to the subprime-lending boom.

Significantly, a disproportionate number of subprime borrowers belonged to ethnic minorities. Indeed, I found myself wondering, as I drove around Detroit, if “subprime” was in fact a new financial euphemism for “black.” This was no idle supposition. According to a joint study by, among others, the Massachusetts Affordable Housing Alliance, 55 percent of black and Latino borrowers in Boston who had obtained loans for single-family homes in 2005 had been given subprime mortgages; the figure for white borrowers was just 13 percent. More than three-quarters of black and Latino borrowers from Washington Mutual were classed as subprime, whereas only 17 percent of white borrowers were. According to a report in The Wall Street Journal, minority ownership increased by 3.1 million between 2002 and 2007.

Here, surely, was the zenith of the property-owning democracy. It was an achievement that the Bush administration was proud of. “We want everybody in America to own their own home,” President George W. Bush had said in October 2002. Having challenged lenders to create 5.5 million new minority homeowners by the end of the decade, Bush signed the American Dream Downpayment Act in 2003, a measure designed to subsidize first-time house purchases in low-income groups. Between 2000 and 2006, the share of undocumented subprime contracts rose from 17 to 44 percent. Fannie Mae and Freddie Mac also came under pressure from the Department of Housing and Urban Development to support the subprime market. As Bush put it in December 2003, “It is in our national interest that more people own their own home.” Few people dissented.

As a business model, subprime lending worked beautifully—as long, that is, as interest rates stayed low, people kept their jobs, and real-estate prices continued to rise. Such conditions could not be relied upon to last, however, least of all in a city like Detroit. But that did not worry the subprime lenders. They simply followed the trail blazed by mainstream mortgage lenders in the 1980s. Having pocketed fat commissions on the signing of the original loan contracts, they hastily resold their loans in bulk to Wall Street banks. The banks, in turn, bundled the loans into high-yielding mortgage-backed securities and sold them to investors around the world, all eager for a few hundredths of a percentage point more of return on their capital. Repackaged as C.D.O.’s, these subprime securities could be transformed from risky loans to flaky borrowers into triple-A-rated investment-grade securities. All that was required was certification from one of the rating agencies that at least the top tier of these securities was unlikely to go into default.

The risk was spread across the globe, from American state pension funds to public-hospital networks in Australia, to town councils near the Arctic Circle. In Norway, for example, eight municipalities, including Rana and Hemnes, invested some $120 million of their taxpayers’ money in C.D.O.’s secured on American subprime mortgages.

In Detroit the rise of subprime mortgages had in fact coincided with a new slump in the inexorably declining automobile industry. That anticipated a wider American slowdown, an almost inevitable consequence of a tightening of monetary policy as the Federal Reserve belatedly raised short-term interest rates from 1 percent to 5.25 percent. As soon as the teaser rates expired and mortgages were reset at new and much higher interest rates, hundreds of Detroit households swiftly fell behind in their mortgage payments. The effect was to burst the real-estate bubble, causing house prices to start falling significantly for the first time since the early 1990s. And the further house prices fell, the more homeowners found themselves with “negative equity”—in other words, owing more money than their homes were worth.

The rest—the chain reaction as defaults in Detroit and elsewhere unleashed huge losses on C.D.O.’s in financial institutions all around the world—you know.

Drunk on Derivatives

Do you, however, know about the second-order effects of this crisis in the markets for derivatives? Do you in fact know what a derivative is? Once excoriated by Warren Buffett as “financial weapons of mass destruction,” derivatives are what make this crisis both unique and unfathomable in its ramifications. To understand what they are, you need, literally, to go back to the future.

For a farmer planting a crop, nothing is more crucial than the future price it will fetch after it has been harvested and taken to market. A futures contract allows him to protect himself by committing a merchant to buy his crop when it comes to market at a price agreed upon when the seeds are being planted. If the market price on the day of delivery is lower than expected, the farmer is protected.

The earliest forms of protection for farmers were known as forward contracts, which were simply bilateral agreements between seller and buyer. A true futures contract, however, is a standardized instrument issued by a futures exchange and hence tradable. With the development of a standard “to arrive” futures contract, along with a set of rules to enforce settlement and, finally, an effective clearinghouse, the first true futures market was born.

Because they are derived from the value of underlying assets, all futures contracts are forms of derivatives. Closely related, though distinct from futures, are the contracts known as options. In essence, the buyer of a “call” option has the right, but not the obligation, to buy an agreed-upon quantity of a particular commodity or financial asset from the seller (“writer”) of the option at a certain time (the expiration date) for a certain price (known as the “strike price”). Clearly, the buyer of a call option expects the price of the underlying instrument to rise in the future. When the price passes the agreed-upon strike price, the option is “in the money”—and so is the smart guy who bought it. A “put” option is just the opposite: the buyer has the right but not the obligation to sell an agreed-upon quantity of something to the seller of the option at an agreed-upon price.
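The asymmetry described here, the right but not the obligation, is what gives options their value. A minimal sketch of the two payoffs at expiration (Python; the strike and prices are arbitrary examples):

```python
# Payoffs at expiration for the two basic option types described above.
# S is the price of the underlying at expiration; K is the strike price.

def call_payoff(S: float, K: float) -> float:
    """Buyer may buy at K: worth something only if the market price is higher."""
    return max(S - K, 0.0)

def put_payoff(S: float, K: float) -> float:
    """Buyer may sell at K: worth something only if the market price is lower."""
    return max(K - S, 0.0)

K = 100.0  # arbitrary strike price for illustration
for S in (80.0, 100.0, 120.0):
    print(f"S={S:>5}: call={call_payoff(S, K):>5}  put={put_payoff(S, K):>5}")
# The call is "in the money" at S=120, the put at S=80;
# at S=100 both expire worthless, and the holder simply walks away.
```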

A third kind of derivative is the interest-rate “swap,” which is effectively a bet between two parties on the future path of interest rates. A pure interest-rate swap allows two parties already receiving interest payments literally to swap them, allowing someone receiving a variable rate of interest to exchange it for a fixed rate, in case interest rates decline. A credit-default swap (C.D.S.), meanwhile, offers protection against a company’s defaulting on its bonds.

There was a time when derivatives were standardized instruments traded on exchanges such as the Chicago Board of Trade. Now, however, the vast proportion are custom-made and sold “over the counter” (O.T.C.), often by banks, which charge attractive commissions for their services, but also by insurance companies (notably A.I.G.). According to the Bank for International Settlements, the total notional amounts outstanding of O.T.C. derivative contracts—arranged on an ad hoc basis between two parties—reached a staggering $596 trillion in December 2007, with a gross market value of just over $14.5 trillion.

But how exactly do you price a derivative? What precisely is an option worth? The answers to those questions required a revolution in financial theory. From an academic point of view, what this revolution achieved was highly impressive. But the events of the 1990s, as the rise of quantitative finance replaced preppies with quants (quantitative analysts) all along Wall Street, revealed a new truth: those whom the gods want to destroy they first teach math.

Working closely with Fischer Black, of the consulting firm Arthur D. Little, M.I.T.’s Myron Scholes invented a groundbreaking new theory of pricing options, to which his colleague Robert Merton also contributed. (Scholes and Merton would share the 1997 Nobel Prize in economics.) They reasoned that a call option’s value depended on six variables: the current market price of the stock (S), the agreed future price at which the stock could be bought (L), the time until the expiration date of the option (t), the risk-free rate of return in the economy as a whole (r), the probability that the option will be exercised (N), and—the crucial variable—the expected volatility of the stock, i.e., the likely fluctuations of its price between the time of purchase and the expiration date (s). With wonderful mathematical wizardry, the quants reduced the price of a call option to the Black-Scholes formula:
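In its standard form, with C denoting the price of the call and N(·) the cumulative normal distribution function, the formula reads:

$$C = S\,N(d_1) - L\,e^{-rt}\,N(d_2), \qquad d_1 = \frac{\ln(S/L) + \left(r + s^2/2\right)t}{s\sqrt{t}}, \qquad d_2 = d_1 - s\sqrt{t}$$

The value of the call rises with the current price S, the volatility s, the time t, and the rate r, and falls with the strike price L.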

Feeling a bit baffled? Can’t follow the algebra? That was just fine by the quants. To make money from this magic formula, they needed markets to be full of people who didn’t have a clue about how to price options but relied instead on their (seldom accurate) gut instincts. They also needed a great deal of computing power, a force which had been transforming the financial markets since the early 1980s. Their final requirement was a partner with some market savvy in order to make the leap from the faculty club to the trading floor. Black, who would soon be struck down by cancer, could not be that partner. But John Meriwether could. The former head of the bond-arbitrage group at Salomon Brothers, Meriwether had made his first fortune in the wake of the S&L meltdown of the late 1980s. The hedge fund he created with Scholes and Merton in 1994 was called Long-Term Capital Management.

In its brief, four-year life, Long-Term was the brightest star in the hedge-fund firmament, generating mind-blowing returns for its elite club of investors and even more money for its founders. Needless to say, the firm did more than just trade options, though selling puts on the stock market became such a big part of its business that it was nicknamed “the central bank of volatility” by banks buying insurance against a big stock-market sell-off. In fact, the partners were simultaneously pursuing multiple trading strategies, about 100 of them, with a total of 7,600 positions. This conformed to a second key rule of the new mathematical finance: the virtue of diversification, a principle that had been formalized by Harry M. Markowitz, of the Rand Corporation. Diversification was all about having a multitude of uncorrelated positions. One might go wrong, or even two. But thousands just could not go wrong simultaneously.

The mathematics were reassuring. According to the firm’s “Value at Risk” models, it would take a 10-sigma (in other words, 10-standard-deviation) event to cause the firm to lose all its capital in a single year. But the probability of such an event, according to the quants, was 1 in 10^24—or effectively zero. Indeed, the models said the most Long-Term was likely to lose in a single day was $45 million. For that reason, the partners felt no compunction about leveraging their trades. At the end of August 1997, the fund’s capital was $6.7 billion, but the debt-financed assets on its balance sheet amounted to $126 billion, a ratio of assets to capital of 19 to 1.

There is no need to rehearse here the story of Long-Term’s downfall, which was precipitated by a Russian debt default. Suffice it to say that on Friday, August 21, 1998, the firm lost $550 million—15 percent of its entire capital, and vastly more than its mathematical models had said was possible. The key point is to appreciate why the quants were so wrong.

The problem lay with the assumptions that underlie so much of mathematical finance. In order to construct their models, the quants had to postulate a planet where the inhabitants were omniscient and perfectly rational; where they instantly absorbed all new information and used it to maximize profits; where they never stopped trading; where markets were continuous, frictionless, and completely liquid. Financial markets on this planet followed a “random walk,” meaning that each day’s prices were quite unrelated to the previous day’s, but reflected no more and no less than all the relevant information currently available. The returns on this planet’s stock market were normally distributed along the bell curve, with most years clustered closely around the mean, and two-thirds of them within one standard deviation of the mean. On such a planet, a “six standard deviation” sell-off would be about as common as a person shorter than one foot in our world. It would happen only once in four million years of trading.
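The once-in-four-million-years figure can be checked directly against the bell curve. A minimal sketch (Python with SciPy; the 252-trading-days-per-year convention is an added assumption):

```python
# How rare is a six-standard-deviation daily sell-off if returns are normal?
from scipy.stats import norm

p = norm.sf(6)                      # one-sided tail probability beyond 6 sigma
days_between = 1 / p                # expected trading days between such events
years_between = days_between / 252  # assuming ~252 trading days per year

print(f"P(move < -6 sigma) = {p:.2e}")                              # ~9.9e-10
print(f"Expected once every {years_between:,.0f} years of trading")  # ~4 million
```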

But Long-Term was not located on Planet Finance. It was based in Greenwich, Connecticut, on Planet Earth, a place inhabited by emotional human beings, always capable of flipping suddenly and en masse from greed to fear. In the case of Long-Term, the herding problem was acute, because many other firms had begun trying to copy Long-Term’s strategies in the hope of replicating its stellar performance. When things began to go wrong, there was a truly bovine stampede for the exits. The result was a massive, synchronized downturn in virtually all asset markets. Diversification was no defense in such a crisis. As one leading London hedge-fund manager later put it to Meriwether, “John, you were the correlation.”

There was, however, another reason why Long-Term failed. The quants’ Value at Risk models had implied that the loss the firm suffered in August 1998 was so unlikely that it ought never to have happened in the entire life of the universe. But that was because the models were working with just five years of data. If they had gone back even 11 years, they would have captured the 1987 stock-market crash. If they had gone back 80 years they would have captured the last great Russian default, after the 1917 revolution. Meriwether himself, born in 1947, ruefully observed, “If I had lived through the Depression, I would have been in a better position to understand events.” To put it bluntly, the Nobel Prize winners knew plenty of mathematics but not enough history.

One might assume that, after the catastrophic failure of L.T.C.M., quantitative hedge funds would have vanished from the financial scene, and derivatives such as options would be sold a good deal more circumspectly. Yet the very reverse happened. Far from declining, in the past 10 years hedge funds of every type have exploded in number and in the volume of assets they manage, with quantitative hedge funds such as Renaissance, Citadel, and D. E. Shaw emerging as leading players. The growth of derivatives has also been spectacular—and it has continued despite the onset of the credit crunch. Between December 2005 and December 2007, the notional amounts outstanding for all derivatives increased from $298 trillion to $596 trillion. Credit-default swaps quadrupled, from $14 trillion to $58 trillion.

An intimation of the problems likely to arise came in September, when the government takeover of Fannie and Freddie cast doubt on the status of derivative contracts protecting the holders of more than $1.4 trillion of their bonds against default. The consequences of the failure of Lehman Brothers were substantially greater, because the firm was the counter-party in so many derivative contracts.

The big question is whether those active in the market waited too long to set up some kind of clearing mechanism. If, as seems inevitable, there is an upsurge in corporate defaults as the U.S. slides into recession, the whole system could completely seize up.

The China Syndrome

Just 10 years ago, during the Asian crisis of 1997–98, it was conventional wisdom that financial crises were more likely to happen on the periphery of the world economy—in the so-called emerging markets of East Asia and Latin America. Yet the biggest threats to the global financial system in this new century have come not from the periphery but from the core. The explanation for this strange role reversal may in fact lie in the way emerging markets changed their behavior after 1998.

For many decades it was assumed that poor countries could become rich only by borrowing capital from wealthy countries. Recurrent debt crises and currency crises associated with sudden withdrawals of Western money led to a rethinking, inspired largely by the Chinese example.

When the Chinese wanted to attract foreign capital, they insisted that it take the form of direct investment. That meant that instead of borrowing from Western banks to finance its industrial development, as many emerging markets did, China got foreigners to build factories in Chinese enterprise zones—large, lumpy assets that could not easily be withdrawn in a crisis.

The crucial point, though, is that the bulk of Chinese investment has been financed from China’s own savings. Cautious after years of instability and unused to the panoply of credit facilities we have in the West, Chinese households save a high proportion of their rising incomes, in marked contrast to Americans, who in recent years have saved almost none at all. Chinese corporations save an even larger proportion of their soaring profits. The remarkable thing is that a growing share of that savings surplus has ended up being lent to the United States. In effect, the People’s Republic of China has become banker to the United States of America.

The Chinese have not been acting out of altruism. Until very recently, the best way for China to employ its vast population was by exporting manufactured goods to the spendthrift U.S. consumer. To ensure that those exports were irresistibly cheap, China had to fight the tendency for its currency to strengthen against the dollar by buying literally billions of dollars on world markets. In 2006, Chinese holdings of dollars reached $700 billion. Other Asian and Middle Eastern economies adopted much the same strategy.

The benefits for the United States were manifold. Asian imports kept down U.S. inflation. Asian labor kept down U.S. wage costs. Above all, Asian savings kept down U.S. interest rates. But there was a catch. The more Asia was willing to lend to the United States, the more Americans were willing to borrow. The Asian savings glut was thus the underlying cause of the surge in bank lending, bond issuance, and new derivative contracts that Planet Finance witnessed after 2000. It was the underlying cause of the hedge-fund population explosion. It was the underlying reason why private-equity partnerships were able to borrow money left, right, and center to finance leveraged buyouts. And it was the underlying reason why the U.S. mortgage market was so awash with cash by 2006 that you could get a 100 percent mortgage with no income, no job, and no assets.

Whether or not China is now sufficiently “decoupled” from the United States that it can insulate itself from our credit crunch remains to be seen. At the time of writing, however, it looks very doubtful.

Back to Reality

The modern financial system is the product of centuries of economic evolution. Banks transformed money from metal coins into accounts, allowing ever larger aggregations of borrowing and lending. From the Renaissance on, government bonds introduced the securitization of streams of interest payments. From the 17th century on, equity in corporations could be bought and sold in public stock markets. From the 18th century on, central banks slowly learned how to moderate or exacerbate the business cycle. From the 19th century on, insurance was supplemented by futures, the first derivatives. And from the 20th century on, households were encouraged by government to skew their portfolios in favor of real estate.

Economies that combined all these institutional innovations performed better over the long run than those that did not, because financial intermediation generally permits a more efficient allocation of resources than, say, feudalism or central planning. For this reason, it is not wholly surprising that the Western financial model tended to spread around the world, first in the guise of imperialism, then in the guise of globalization.

Yet money’s ascent has not been, and can never be, a smooth one. On the contrary, financial history is a roller-coaster ride of ups and downs, bubbles and busts, manias and panics, shocks and crashes. The excesses of the Age of Leverage—the deluge of paper money, the asset-price inflation, the explosion of consumer and bank debt, and the hypertrophic growth of derivatives—were bound sooner or later to produce a really big crisis.

It remains unclear whether this crisis will have economic and social effects as disastrous as those of the Great Depression, or whether the monetary and fiscal authorities will succeed in achieving a Great Repression, averting a 1930s-style “great contraction” of credit and output by transferring the as yet unquantifiable losses from banks to taxpayers.  Either way, Planet Finance has now returned to Planet Earth with a bang. The key figures of the Age of Leverage—the lax central bankers, the reckless investment bankers, the hubristic quants—are now feeling the full force of this planet’s gravity.

But what about the rest of us, the rank-and-file members of the deluded crowd? Well, we shall now have to question some of our most deeply rooted assumptions—not only about the benefits of paper money but also about the rationale of the property-owning democracy itself.  On Planet Finance it may have made sense to borrow billions of dollars to finance a massive speculation on the future prices of American houses, and then to erect on the back of this trade a vast inverted pyramid of incomprehensible securities and derivatives. But back here on Planet Earth it suddenly seems like an extraordinary popular delusion.

Sunday, July 21, 2013

Poker News

 
2015 46th Annual World Series Of Poker - Event #68: No-Limit Hold'em Main Event
Wednesday, July 8, 2015 1:19 AM PST
 
John Gorsuch Leads Largest Day 1 Field Ever Into Day 2 Of Main Event
Records fell today on the final Day 1 of the 2015 World Series of Poker Main Event as 3,963 players bought in for a shot at being this year's Champion. This is the largest single flight in Main Event history, and it means that a total of 6,420 players have entered the Main Event. There were 2,765 who bagged chips tonight. Over the course of all three starting flights, 4,389 players advanced to Day 2.  After registration officially closed, tournament staff announced the payout information. The 6,420 entries created a prize pool of $60,348,000. The top 1,000 places will be paid, with a min-cash worth $15,000. All players who make the November Nine will earn seven-figure payouts, and the winner will take home $7,680,021. Full payouts are available here.
It was 2014 Main Event Champion Martin Jacobson who got today's festivities underway: having recovered sufficiently from the illness that ruled him out of starting Day 1a, he delivered the "Shuffle Up and Deal" announcement after the banner commemorating his victory last year was unveiled. Jacobson's day did not get any better, as he was unable to gain any traction before falling shortly after the dinner break to Garrett Greuner.
John Gorsuch ended the day as the chip leader, the Virginia native bagging up 198,100 at the close of play. Joining Gorsuch at the top of the counts are Zarik Megerdichian (180,400), James Juvancic (166,350), Timo Pfutzenreuter (150,075), Jeff Griffiths (140,400) and Craig Varnell (140,000).
There were many former WSOP Main Event Champions in the field today, with a number of them making it through to Day 2. 2013 Champion Ryan Riess leads that particular field with 108,800 at the close of play, and he will be joined on Day 2C by Jamie Gold (81,000), Joe Hachem (63,500), Peter Eastgate (22,100), and Phil Hellmuth (79,725). Hellmuth registered during the dinner break today and found himself seated on the same table as none other than Phil Ivey, with Ivey "getting the best of Hellmuth to double up" at one point. The champions will be joined by Daniel Negreanu (73,825), Allen Cunningham (65,025), Bertrand "Elky" Grospellier (22,375) and Michael Mizrachi (64,350), who all made it to Day 2 without incident.
Not everybody ran well enough to make Day 2, though, with former Main Event Champions Joe Cada, Greg Merson and Jerry Yang joining Jacobson on the rail. Yang suffered a horrible cooler early on Day 1, running a set of sevens into a set of aces to be eliminated, and Merson also found himself on the wrong side of a cooler when he ran pocket kings into pocket aces before dinner. They were joined on the rail by some of poker's finest, such as Patrik Antonius, Jimmy Fricke, Mike Sexton and Jennifer Harman, who were all eliminated by the end of the day.
It wasn't just some of the game's best players who found success today, as a number of celebrities and sports personalities also took their seats. Most successful was former MLB first-round draft pick Wade Townsend, who bagged up 146,000 going into Thursday's Day 2. He will be joined by actors and comedians Ray Romano (33,375) and Brad Garrett (46,050), who dodged bullets and avoided sharks to finish today's play and continue in the Main Event.
Unfortunately for some of the celebrities, today was not their lucky day. Ex-NFL star Richard Seymour battled valiantly throughout the day, but he was eliminated at the beginning of Level 5. Also spotted in today's field was Aaron Paul, best known for playing Jesse Pinkman in Breaking Bad. Coincidentally, there was a player named Walter White registered for Day 1a who bagged up 29,975 chips, but Paul was unable to join him, busting early when he ran a set into a flopped flush. NBA center/power forward Earl Barron of the Phoenix Suns took his seat today, but the 7'0" big man busted at the very end of Day 1.
All players who survived today will now take a day off before returning on Thursday for Day 2c here at the Rio. Tomorrow will see the return of the players who survived Days 1a and 1b, as they converge on the World Series of Poker at 12pm local time for another grueling five two-hour levels, playing through to Day 3, which is scheduled for Friday.  Join us here at WSOP.com as we bring you all the action from the first of the Day 2s as the remaining players continue their quest to become this year's WSOP Main Event Champion!
 
 
 
How Poker Lost Its Soul
(By Colson Whitehead, Grantland, 11 June 2014)
Colson Whitehead is a novelist and essayist. His writing has appeared in a number of publications, such as the New York Times, The New Yorker, New York Magazine, Harper's and Granta.  In his new book, “The Noble Hustle: Poker, Beef Jerky, and Death,” Colson Whitehead recounts his 2011 trip to the World Series of Poker. Whitehead was reporting a piece for Grantland, which paid for him to enter the competition. Whitehead trained for several weeks in Atlantic City before making the trip.
On my first Vegas trip in ’91, we stumbled on a wonderland.  It was a grubby spot on Fremont Street, just past the Four Queens and Binion’s, embedded in an outcropping of souvenir shops. The House of Jerky. I knew Slim Jims, those spicy straws of processed ears and snouts. This was something else entirely. We squinted in joyful bafflement before the rows of clear plastic pouches filled with knobs of dark, lean meat, seasoned and cured. Li’l baggie of desiccant at the bottom for freshness. The jerkys reminded me of tree bark, which we peel ’n’ eat in times of drought and on major holidays. We walked the aisles. The flavors were ordinary, yes. Pepper, teriyaki, barbecue. But the ark-ful of proteins was miraculous: beef, Alaskan salmon, buffalo, turkey, alligator, venison, ostrich.
The proprietor was a middle-aged Asian man named Dexter Choi. That one man’s singular vision could beget such bounty! It was America laid out before us, dangling on metal rods set into scuffed particle board. Complete with wide open spaces, for the store had a modest inventory. Dried fruit. Nuts. But mostly jerky.  Mr. Choi remained unmoved by our oh-snaps and holy-cows. The House of Jerky was kitsch to us, but we stood inside the man’s desert dream that day. You know there was a hater chorus when he shared his plans. “Forget about jerky, Dexter, study for the electrician’s licensing exam.” “Sure jerky is a low-calorie, high-sodium snack, Dexter, but when are you going to get your head out of the clouds?” “Look at these lips, Dexter—will your dried muscle-meat ever kiss you like I do?”
He endured. To build a House of Jerky is to triumph against the odds, to construct a nitrate-filled monument to possibility and individual perseverance. Dexter Choi was an outlaw. He faced down fate and flopped a full house.  Maybe things could have improved re: foot traffic, but I couldn’t help but be moved. From that day on, beef jerky was synonymous with freedom and savory pick-me-ups between meals. We bought a few bags of that sweet bark for our drive into Death Valley and continued on our journey.
How could I foresee that this cowboy snack would become a symbol of corporate poker, indeed the commercialization of all Las Vegas? Beef jerky was now the leathery, mass-produced face of modern poker. Meat snacks generated $1.4 billion a year in business, with Jack Link’s a major player. Started in the 1880s by an immigrant named Chris Link, who served up smoked meats and sausages to Wisconsin pioneer folk, Jack Link’s was now the fastest-growing meat snack firm in the world, with a hundred different products sold in forty countries. “More than a century has passed,” the Our History page of their site announced, “but the Link family principles and traditions remain the same: hard work, integrity, and a commitment to earn consumer respect by delivering the best-tasting meat snacks in the world.”
Respect them I did. Since 2008, the company had been an official sponsor of the Main Event—the official name of the thing is “The World Series of Poker Presented by Jack Link’s Beef Jerky.” I had, in effect, been walking around in a big plastic bag ever since I stepped in the Rio. Explained the chronic suffocating feeling.  The company’s red and black logo mottled the ESPN studio in the Amazon Room, vivid on the clothing of sponsored players like cattle brands. Jack Link’s “Messin’ with Sasquatch” commercials were a mainstay of poker TV programming, featuring their mascot Sasquatch as he was humiliated by golfers, campers, and frat boys before putting a Big Foot up their asses. The mascot’s meaning? Despite the death of the frontier, and the stifling monotony of modern life, the Savage still walks among us. That, or Betty White was unavailable.
Watch any of ESPN’s coverage and you’ll encounter “Jack Link’s Beef Jerky Wild Card Hand,” in which host Norman Chad tries to divine the contents of a hand through betting patterns. The “hole-card cam” was a clutch innovation behind poker’s populist boom, allowing viewers to see the players’ hands. Before we pierced that veil, televised poker was like watching a baseball game with an invisible ball—i.e., even more boring than watching regular baseball. The hole-card cam allowed for simultaneous commentary—just like real sports! The fans participated in the spectacle, second-guessing, pitting their own calculations against the pros’ moves. They learned. They got better. They started playing in the events they watched on TV.
Poker as million-dollar theater, hence the upgrade from Johnny Moss’s engraved silver cup to diamond-encrusted bracelets. I was implicated in this big-biz operation. Grantland, the magazine that sent me, was owned by ESPN. ESPN was owned by Disney. Which is why they had trouble finding my check. It was floating around the accounting office of Caesars, which was owned by Harrah’s, who owned the WSOP.  At registration, I’d kept mentioning ESPN and Grantland as my benefactors, when the check was cut by Disney. We were all confused.  People asked if I’d be able to keep the money if I cashed at the WSOP. Yes—that had been made clear to me. I wasn’t getting paid for the article. My compensation was them paying my entrance fee. Haggle with a lowly freelancer over winnings? Peanuts to the parent corp. I was writing for an entity owned by the company that made millions and millions off WSOP coverage. My words were an advertisement, is one way of looking at it. Raise awareness of the game. Inspire some misfit kid to take up poker. Spread the gospel far and wide.  Grantland. ESPN. Disney. It was all in the family.  The House always wins.


The One-Minute Guide to Poker
(By Tom Chiarella, Esquire magazine, October 2010)

The World Series of Poker Europe started this week in London. Here are a few words of advice from our writer at large, who played in the World Series of Poker and taught a poker class for almost ten years at DePauw University.   http://www.esquire.com/features/world-series-of-poker-advice-1010

Bet. Bet hard. Poker is a game of courage and reaction. When in doubt, don't call; raise. You'll get a reaction. If someone comes over the top and puts you all-in, then you've paid a little to learn a lot. When you get beat — and you will — do not button up and stop betting.

You can ignore the first rule and win. But if you do not completely understand its principles, then you should not sit at a poker table.

Be a bastard. Drive people crazy. When they hate you, you can take advantage.

Know your outs. Play enough, practice enough so you can understand whether you have three cards that can help you or fifteen.
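
As a quick illustration of that arithmetic (an editorial sketch added here, not part of Chiarella's piece), the snippet below compares the exact chance of hitting your outs with the "rule of 4 and 2" shortcut players use at the table:

```python
# A back-of-the-envelope sketch of what "knowing your outs" means numerically,
# using Texas Hold'em card counts: 47 unseen cards after the flop, 46 after the turn.

def exact_hit_chance(outs: int, cards_to_come: int) -> float:
    """Exact probability of hitting at least one out."""
    if cards_to_come == 2:                      # flop -> river
        miss = (47 - outs) / 47 * (46 - outs) / 46
    else:                                       # turn -> river
        miss = (46 - outs) / 46
    return 1 - miss

def rule_of_4_and_2(outs: int, cards_to_come: int) -> float:
    """The common shortcut: outs x 4% with two cards to come, x 2% with one."""
    return outs * (0.04 if cards_to_come == 2 else 0.02)

for outs in (3, 9, 15):                         # gutshot-ish, flush draw, monster draw
    print(f"{outs:>2} outs: exact {exact_hit_chance(outs, 2):.0%}, "
          f"shortcut {rule_of_4_and_2(outs, 2):.0%}")
```

The shortcut tracks the true odds closely for small draws and overshoots a little for the fifteen-out monsters, which is exactly why the three-versus-fifteen distinction matters.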

Think of your chips as weapons: When you have a lot of them, they are a gangster's stockpile of gats. When you have a few, they are samurai swords. Both dangerous, but in different ways.

Here's a saying: When men draw to flushes, they leave Vegas on buses.

Opt for the home game; it's the best way to enjoy poker. Play dealer's choice, serve good sandwiches, and set the stakes so no one is making a living at beating down his friends.

Shut up. Did someone draw out a runner-runner flush against your set of aces? Get used to it.


Things You Should Never Do at the Poker Table
• Splash the pot.
• Ask if it's your turn.
• Ask to see a losing hand.
• Ask for advice.
• Offer advice.
• Suggest a round of shots.
• Blame the dealer.
• Throw in your watch.
• Weep.
 

Poker Becomes A Sport For Young American Males
(By Daniel de Vise, Washington Post, October 10, 2011)

Eric Froehlich was the son of a chemical engineer and smart enough to win admission to the Thomas Jefferson High School for Science and Technology in Fairfax County and then the University of Virginia.  And then he discovered online poker.  Long identified with saloons, cigars and Mississippi riverboats, poker in recent years has found an unlikely home: in dormitory rooms, on the computer screens of clever young men. Froehlich won a major World Series of Poker tournament in 2005 at 21, making him the youngest winner of a coveted poker “bracelet,” until he was eclipsed by three players who were younger still.  “Gamblers are no longer gangsters with guns,” said Justin Vingelis, 22, a poker player who graduated in May from James Madison University. “They are nerds with calculators.”
At many colleges, the campus culture largely embraces poker. Last November, engineering students at the University of Maryland hosted their fourth annual Casino Night, an evening of card-playing and networking. A student charity at U-Va. holds annual “Hold ’em for Hunger” tournaments.  But the federal government has been less indulgent. In April, the Justice Department shut down three leading poker sites and charged their owners with bank fraud and money-laundering. In a civil lawsuit filed last month, federal prosecutors alleged the owners of one site, Full Tilt Poker, pocketed more than $300 million in player deposits.  Full Tilt insiders “lined their own pockets with funds picked from the pockets of their most loyal customers,” Preet Bharara, the U.S. attorney in Manhattan, said in a statement.  Among the losers is Vingelis, an online poker player from the Fairfax suburb of Burke. Vingelis joined Full Tilt at 18, when he got his first debit card. He bet a dollar or two at a time and reckons he made about $2,000 playing poker on the site before it shut down.  “From my conversations with some friends, we are all resigned to the fact that our money is gone,” he said.

Poker has become a game of the young. The last three winners of the annual World Series of Poker “Main Event,” poker’s top tournament, were 22, 21 and 23 years old, respectively, and males.  A 2010 survey by the Annenberg Public Policy Center at the University of Pennsylvania found 16 percent of college-age males — 1.7 million young men — gambled on the Internet at least once a month, primarily by playing poker. A 2008 Florida study found college students twice as likely to gamble as older adults.  Poker sites cater to college students. One promises: “$30,000 guaranteed! You’ve been studying, this is the exam!” Another asks, “How would you like to have $10,000 shaved off of this year’s tuition?”  To those who study gambling as an addiction, this predilection for poker puts thousands of college students in peril. Students with gambling problems are more likely to run up credit card debt, take drugs, get bad grades and steal, studies have found.  “It doesn’t just mean money,” said Jeffrey Derevensky, a youth gambling researcher at McGill University in Canada. “It can mean time. It can mean theft. It can mean kids dropping out of school.”

Yet, a college gambling task force found in 2008 that only 22 percent of colleges have a written policy on gambling. “And in a lot of cases, that might be simply, ‘There shall be no gambling on campus,’ ” said Christine Reilly, senior research director at the National Center for Responsible Gaming.  Today’s inquisitive teens discover poker for the same reasons their parents played chess and pondered the Rubik’s Cube. Poker is a particularly cerebral form of gambling, one that rewards not luck but math and logic. For Froehlich, poker was like a puzzle, one “that’s very much not solved and very much never will be solved.”  Froehlich, now 27, started playing poker when he enrolled at the University of Virginia in 2002. His inspiration wasn’t Wild Bill Hickok or Steve McQueen’s “Kid” but rather Magic: The Gathering, a card game created by a mathematics professor and popular with the Dungeons & Dragons crowd.  Poker “was just something I did in the dorms,” Froehlich said. It wasn’t meant to be a career. But Froehlich didn’t much like school. When he took some time off to care for his ailing mother, he started spending his free time playing poker. By 22, Froehlich had won two major tournaments and a lot of money, and “it almost seemed selfish not to go the poker route.” He is now a successful professional, playing in live poker tournaments.
Froehlich and other young players trace the explosion of collegiate poker to the ascent of Chris Moneymaker, an accountant who won poker’s biggest tournament in 2003 after qualifying through an online poker site. Poker became a marquee event on ESPN, a network already popular with young men, who could now tune in to watch other young men win stacks of money.  “You’re talking about a group of people who are very impulse-prone, who can make poor decisions,” said Stephen McDaniel, a gambling expert at U-Md.

The past decade has seen an evolution of gambling to “gaming,” a triumph of euphemism amid a wave of legislation to legalize and destigmatize wagering. Forty-eight states and the District now permit citizens to gamble legally, according to Keith Whyte, executive director of the National Council on Problem Gambling in Washington. Utah and Hawaii are the lone holdouts. Poker, in particular, has become viewed as a sort of athletic event, although the U.S. government still views online poker as a crime.  “There’s been a massive cultural shift toward acceptance and accessibility and availability of gambling in America,” Whyte said. “And that certainly filters down to the kids.”

In this climate, some gambling experts say legalized online poker is inevitable. Supporters say legalization would cleanse an industry tarnished by scandal. Because online sites operate outside U.S. law, they are largely unregulated.  “Those people who got swindled by Full Tilt, they’d be much less likely to be swindled if it was legal,” said Rep. Barney Frank (D-Mass.), a prominent supporter of legal gambling.  But some veteran players say online poker will never be safe. They point to the ease with which poker sites have been hacked and the outcomes manipulated.  “The poker sites up to now have not had the software capability to detect collusion,” said Robert Turner, a professional player and poker security expert based in Downey, Calif. Turner said he supports legalized poker, but not until the industry can catch cheats.  Many fewer students are playing poker online since the federal crackdown.  “Poker’s already risky,” said Matthew Baker, 22, a graduate student at Virginia Tech. “Now, whether you win or lose, you still might not get your money. And it’s really hard to say which site is going to be trustworthy.”  But Baker wouldn’t bet against the online poker industry. “I feel like somebody’s going to fill the void,” he said.

 

The Lumberjack Fells Some Giants In Las Vegas
(By J. Freedom duLac, Washington Post, Nov. 9, 2009)

Somebody pinch Darvin Moon. Seriously.  The self-employed lumberjack from Western Maryland's panhandle is down to the final two in the 2009 World Series of Poker Main Event, and will play late Monday night for the $8,546,435 first-place prize against Joe Cada, a young, cocksure poker pro from Shelby Township, Mich.  Moon, a self-effacing, self-taught amateur from Oakland, Md., whose ruddy, jowly face seems to have been kissed by kismet, is living every poker player's dream, having outlasted and out-lucked nearly the entire field of 6,494 players.  "Pretty awesome, isn't it?" Moon said.  Yet as Saturday blurred into Sunday at the tournament's final table, Moon's eyes appeared to be sealed shut as he sat onstage at the Rio Hotel and Casino's Penn and Teller Theater -- even with massive pots forming and millions of dollars in prize money at stake in poker's marquee event.  Lost in the reverie of his storybook run, perhaps?  Not exactly.  "Fell asleep a little bit," Moon admitted afterward. "I was zoned out."  And that was just 13 hours into a grueling session that began at 1 p.m. Saturday and wouldn't end until Cada eliminated Frenchman Antoine Saout, in third place, at around 6 a.m. Sunday. It was the longest final table in Main Event history, and it's to be continued: Moon and Cada will return Monday at 10 p.m. local time to play heads-up Texas Hold 'Em until one player has all of the chips.
Cada has a nice head start in the showdown, with nearly 136 million of the chips. Moon, the chip leader when the nine-player final table convened Saturday, has 58,850,000 chips -- roughly the same number he had at the start of the day.  "I lost 80,000 chips, but I'm about 4 million wealthier in the real green [stuff]," he said after what was either an early breakfast or an incredibly late midnight snack Sunday morning. Moon and his wife, Wendy, left Vegas in July with a $1,263,602 check, the minimum each of the final nine players would win. He is now guaranteed at least the second-place payout of $5,182,928.  Before he left the theater Sunday morning, Moon stopped to get a close-up of the cash bundles placed near the final table. "I want to count this to make sure it comes back on Monday, when I take it all home," Moon said to the guards.  Both Moon and Cada got lucky at the final table, winning big hands in which they were mathematical underdogs, including the one on which Moon eliminated Phil Ivey, one of the world's most famous and feared poker players. (Ivey had A-K; Moon had A-Q and won when the dealer turned over one of the three queens remaining in the deck -- much to the dismay of Ivey's fans and fellow poker pros, who were skeptical of Moon and critical of some of his plays. Though it wasn't just the poker purists getting on Moon's case; after he'd tried to bluff Saout and got caught, the logger's own mother, sitting next to the stage, told him: "If I'd a done that, you'd tell me I made a donkey play." Moon laughed.) 

The two survivors couldn't be more different.  Cada, 21, is a brash Internet poker whiz who began learning the game online as a teenager; he has already had enough success playing cards that he's been able to buy his own house. He looks like he just swept in from an Abercrombie ad shoot, and he's in line to become the youngest champ in Main Event history. He'd probably be the first winner, too, to have worn a watch with a king-of-hearts card on its face during the final table.  Moon, 46, is a laconic poker hobbyist who plays in low-stakes tournaments at the local Elks Lodge and American Legion. He doesn't use the Internet, doesn't have an e-mail address and keeps telling everybody he's dumb. He looks like a redheaded fire hydrant with meat hooks for arms and swears he's returning to the woods to work once he's done at the World Series.  He's the accidental folk hero baffled by his own celebrity. "The last time I signed an autograph, I was getting out of jail," he said Saturday night, after signing a dozen of them for fans. "That part is weird to me. I'm no different than them."  Sure enough, there was Moon rubbing elbows with his people, hoi polloi (Greek for "hillbillies"?), at the Hold 'em Bonus table game early Saturday, when he woke up at 3 a.m. after going to bed too early the night before. He stopped in the casino again during the final-table dinner break, and passersby were incredulous that somebody sitting at the Main Event final table would be sitting there, placing $15 and $20 bets.  He almost stopped in the casino again Sunday, after the epic overnight marathon. Instead, he kept on going, to the 1,100-square-foot suite with the Jacuzzi and the two-sink bathroom and the flat-screen TVs and the bed that was calling his name.  "My head is spinning," he said. "I could sleep for two days."  If he does, he just might awaken as the new world champion of poker.


Poker Pros, Fans Going All In
(By Steve DiMeglio, USA Today)

As the first winner of the World Series of Poker 36 years ago, poker legend Johnny Moss won a silver cup. Back then, a vote among the players decided the winner.  No cash changed hands.  "When the tournament was over, we gave the cup to Moss and then had a few drinks and started playing poker again," poker legend Doyle "Texas Dolly" Brunson said. "Those were the days. Boy, have times changed."  Have they ever. No longer consigned to the dark back rooms of casinos, poker is sizzling in the mainstream, having ridden twin tidal waves of exposure to the masses: the Internet and television.  The hottest happening is the WSOP, the longest continual tournament (eight weeks) in poker. The WSOP's eminent enticement begins Friday: the $10,000 buy-in Texas Hold'em main event that runs through August 10. First place could be worth $10 million.  As Las Vegas began baking July 16, 2005, with much of Sin City still asleep and the charity worker, magician, truck driver, law student, bar owner, former loan officer and vociferous poker professional having been vanquished from the final table of the largest, most lucrative tournament in poker history, pistol- and shotgun-armed security guards brought 17 boxes of cash from a fortified vault at Binion's Gambling Hall & Hotel and dumped the currency on a table.
 
Looking on as 850 pounds of $100 bills worth $7.5 million swelled toward the ceiling were the last two survivors of the 5,619 players in the World Series of Poker's No-Limit Texas Hold'em main event — Steven Dannenmann, a CPA and mortgage broker from Maryland, and Joseph Hachem, a chiropractor-turned-poker pro from Australia.  Six hands into heads-up action and eight days after the tournament started, Hachem's straight bested Dannenmann's aces to win the mother of all poker tournaments.  Just shy of 7 a.m., with an Australian flag draped over his right shoulder, Hachem received from tournament organizers the coveted white-gold and diamond bracelet awarded to every winner of a WSOP event, along with $7.5 million, a first-place reward worth more than the purse for the entire field ($7,154,642) in the 2005 British Open that weekend.  "I just wanted to make sure it was all mine," said Hachem, 40. "I remember thinking my wife could go shopping now."
Super Bowl of poker

This year, the WSOP is on track to smash every tournament record for entrants and prize money. For instance, the women's no-limit hold'em event attracted 1,128 players, nearly double the number who played in last year's Ladies Event. Tournament No. 17, a no-limit Texas Hold'em event with a $1,000 buy-in, had 2,891 players, the second-largest live poker tournament in history. The H.O.R.S.E. tournament, which consisted of five different poker games, attracted 143 players who each put up $50,000 to play, the largest entry fee ever for a poker tournament.  As of Wednesday, 6,965 had entered the main event; more than 7,500 are expected to put up the $10,000 entry fee. All 208 poker tables on 45,000 square feet of convention-room floor at the Rio All-Suite Hotel & Casino just west of the Las Vegas Strip will be in play. All of poker's best players are here, as well as numerous celebrities and sports figures, including Ben Affleck, James Woods, Shannon Elizabeth, Laura Prepon and "Spider-Man" himself, Tobey Maguire.
 
"There's nothing like this tournament," poker pro Cyndy Violette said. "It's our Super Bowl, World Series, Kentucky Derby all rolled into one."  Chris "Jesus" Ferguson, the 2000 WSOP main event champion, added: "It's the most painful tournament to get knocked out of because it's so much fun and you have to wait a year for it to start again."  The WSOP invites the proletariat in- anyone who pays an entry fee for a tournament and is at least 21 can play. This year, more than 30,000 players from more than 50 countries have played in the event's 45 tournaments, trying to win some of the nearly $150 million in prize money.  "You think you're prepared for it, but when you get to the WSOP, it's overwhelming," said poker pro Robert Williamson III. "Every day, you walk in to a sea of people and poker tables. Then there are thickets of people gathered around the sea of people playing."

 "Last year, as I looked at all the poker tables and players, I was thinking about all the old-timers who helped evolve this thing," Brunson said. "All of them are dead and gone, but to see what has happened to the game, thinking about those old guys, brought tears to my eyes."  The WSOP's long run to its lofty status started with a poker marathon in 1949. Nick "The Greek" Dandalos asked Benny Binion, owner of the Horseshoe Casino in Las Vegas, to arrange the biggest poker game of all time. Binion called Moss, regarded as the best player in the world, bankrolled him, and Moss played Dandalos for five months outdoors in front of the casino. The two played every type of poker for pots of hundreds of thousands of dollars until Dandalos decided he'd had enough. He rose from the table and solemnly said, "Mr. Moss, I have to let you go."  The epic battle, however, stayed with Binion, especially after he saw large crowds gathering every day to watch. The game gave Binion the inspiration to later start the World Series of Poker.  It took Binion a long time to realize his idea, but 21 years later, Binion gathered 35 of the game's best players for an event he called the World Series of Poker.

The Moneymaker era
The WSOP slowly grew, both in participants and in new events, with a majority of poker variants represented. In 1978, Bobby Baldwin, now CEO of Mirage Resorts, defeated 41 players to win the main event and $210,000.  "Poker was less popular," Baldwin said. "But the 42 players in the main event were very, very good, and the play was very, very intense. It was difficult to win that tournament, but it's more difficult to win now. It's easier to beat 41 players than it is 7,000."  By 1988, there were 167 players in the main event. A decade later, there were 350, and in 2002 there were 631.

Then along came Chris Moneymaker. While the name helped, Moneymaker's Cinderella journey from a $39 online satellite tournament to a $2.5 million payday for winning the main event in 2003 captured America's imagination and jump-started the poker craze. TV ratings were the highest ever, sparking numerous poker shows and hordes of poker websites.  The WSOP's entrants jumped, too. Moneymaker, a former accountant, beat 838 players. The following year, 2,576 showed up, then 5,619 last year.  "Moneymaker caught a lot of people's attention, and they discovered that you don't need exceptional physical skills to play poker," said poker pro Lyle Berman. "I can't dunk a basketball; I'm just not big enough. I can't beat Tiger Woods in golf. But in poker, in the WSOP, you can beat the best players in the world.  "And poker is such a great game," Berman added. "It's you and your brain against everybody else. The deck doesn't know who you are."




What Poker Teaches Us
(By Matthew Rousu, The Federalist, November 15, 2013)
 
Organizations that oppose gambling will often claim that gambling has no benefits.  This isn’t true.  Beyond the enjoyment we experience, many forms of gambling can teach useful skills.  Blackjack, for example, teaches us about odds, variance, and money management.  Placing bets on horse racing can also teach people an enormous amount about odds and probabilities, as betting on different horses offers different payouts for winning. Even those with limited mathematical backgrounds quickly learn that betting $5 on a horse with 14-1 odds will pay them a $70 profit for a win.  Similar skills can be learned with sports betting.  While these and some other forms of gambling can provide some skill development, none offers the opportunity to develop real-world skills like poker.  It seems fitting that the most glamorous of all gambling games can teach us so much.  After all, Mark Twain spoke eloquently about poker and it’s been played regularly in the Oval Office by many presidents.  It is estimated that 70-80 million Americans play poker.  While some play for low stakes and some play for high stakes, Americans love this game that combines instinct, mathematical ability, psychology, and luck.
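
To make that horse-racing arithmetic concrete, here is a tiny sketch (an illustration added here, not Rousu's) of how fractional odds translate into payouts:

```python
# A minimal sketch of fractional-odds arithmetic: odds of a-b mean a units
# of profit for every b units staked, with the stake returned on top.

def profit(stake: float, a: int, b: int = 1) -> float:
    """Profit on a winning bet at fractional odds a-b (e.g. 14-1)."""
    return stake * a / b

stake = 5.0
win = profit(stake, 14)        # the article's example: $5 at 14-1
print(win, win + stake)        # 70.0 in profit, 75.0 returned in total
```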
 
Scholars studying poker have used many different approaches, but consistently find that poker is a game that requires significant skill.  University of Chicago researchers studied results from the biggest annual poker competition, the World Series of Poker.  They found that highly skilled players had a return on investment (ROI) of over 30 percent, whereas other players had a negative ROI of 15 percent.  In a study I published in Gaming Law Review and Economics, my colleague Michael Smith and I found that poker players who spent more time studying poker earned more money.  Given that no amount of studying would help win a game of luck like the lottery, this provides evidence that poker is a game predominantly based on skill.  Research from the University of Pennsylvania found that the amount of skill required wasn’t unlike the skill required in golf.  That poker is a game that rewards skill isn’t questioned among serious scholars. Beyond rewarding skill, though, poker can also teach individuals some incredibly valuable skills.
Math
First, a person who plays a significant amount of poker can quickly learn a significant amount of statistics, mathematics, and probabilities.  Poker players soon learn the importance of calculating what are called “pot odds” – the amount of money a player could win versus the amount he or she would have to risk.  A basic knowledge of pot odds, combined with how often a player thinks he or she will win, can help in making optimal decisions.  For example, if you’re in a hand where you estimate you’ll win 1/3 of the time, the decision on whether to call an opponent’s bet will come down to the odds.  If you have to call a $10 bet (i.e., risk $10 to stay in the hand), you’ll find it unprofitable to call if the total you can win is less than $20, but profitable if the amount you can win is more than $20.  The decision to call can’t be made in isolation – you would need to estimate both the odds of winning and the amount you can win to make the correct decision.
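
As a hedged illustration of that break-even arithmetic (a sketch added here, not Rousu's), the snippet below computes the expected value of a call and reproduces the $10/$20 example:

```python
# A minimal sketch of the pot-odds decision described above: calling is
# profitable exactly when the expected value of the call is positive.

def call_ev(p_win: float, to_call: float, amount_if_win: float) -> float:
    """EV of calling: win `amount_if_win` with probability p_win,
    lose `to_call` the rest of the time."""
    return p_win * amount_if_win - (1 - p_win) * to_call

# The article's example: a 1/3 chance of winning, facing a $10 call.
for pot in (15, 20, 25):
    print(f"win ${pot}: EV of calling = {call_ev(1/3, 10, pot):+.2f}")
# win $15: -1.67 (fold), win $20: +0.00 (break even), win $25: +1.67 (call)
```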
The math in poker can get quite advanced.  You can examine the probability that a player holds a particular hand given the variety of possible hands you’d expect him/her to hold.  A serious poker player will learn the equivalent of at least one college-level statistics course through playing.  There are far more benefits from poker than just learning statistics, however.
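
As one hedged example of that kind of calculation (a sketch added here, not from the article), simple combinatorics lets you weigh how likely a particular holding is within a range of hands:

```python
# A minimal sketch of hand-reading combinatorics: counting the card
# combinations consistent with each holding you put an opponent on.

from math import comb

# Unconditionally: 6 of the C(50, 2) = 1,225 two-card hands you can't see
# (you hold two cards yourself) are pocket aces.
print(comb(4, 2) / comb(50, 2))          # ~0.0049, about one hand in 204

# Conditioned on a narrow assumed range of {AA, KK, AK}:
range_combos = {"AA": comb(4, 2), "KK": comb(4, 2), "AK": 4 * 4}
p_aa = range_combos["AA"] / sum(range_combos.values())
print(p_aa)                              # 6 / 28, roughly 21%
```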
Strategy
Poker players also learn about strategic interactions.  Poker rewards those who can outthink their opponent and can take different pieces of information and synthesize them to make correct decisions.  This may seem intuitive, but some relatively deep concepts are learned by those who play poker, including advanced concepts in game theory like mixed strategies and exploitative strategies.  In game theory, a mixed strategy is one in which a player varies his or her actions rather than taking the same action every time.  A simple example of a mixed strategy can be seen in baseball.  A pitcher won’t want to throw a fastball every time, as the hitter will plan for a fastball and have more success.
Similarly, the pitcher won’t want to throw a curveball every time, as the hitter will simply plan for a curveball and have more success.  The optimal strategy for a pitcher is a mixed strategy, where some percentage of the time a fastball is thrown, and some percentage of the time a curveball is thrown.  In poker, players intuitively learn how to randomize their actions when determining how often to bluff.  A player who bluffs too often gives opponents an easy decision (always call the player’s bets).  Players who never bluff give their opponents easy decisions as well, as the opponents can simply fold to bets.  A poker player learns through experience that randomizing his or her play is important.  An exploitative strategy is one where you realize an opponent is making an inferior choice, and you vary your play to exploit that poor choice.  A poker player seizes upon these opportunities.  Once again, the skills learned at the poker table are transferable to the real world.
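
The bluffing version of that mixed strategy can be made precise. In the standard toy model from poker game theory (a textbook result, not an example from Rousu's article), a river bettor picks a bluffing frequency that makes the opponent's call exactly break even:

```python
# A minimal sketch of an indifference-based mixed strategy: if you bet
# `bet` into a pot of `pot`, a caller risks `bet` to win `pot + bet`.
# Bluffing with the share of your bets computed below makes the caller's
# EV zero, so neither always-calling nor always-folding exploits you.

def indifferent_bluff_share(bet: float, pot: float) -> float:
    """Fraction of bets that are bluffs at the caller's indifference point:
    p * (pot + bet) = (1 - p) * bet  =>  p = bet / (pot + 2 * bet)."""
    return bet / (pot + 2 * bet)

print(indifferent_bluff_share(100, 100))   # pot-sized bet -> 1/3 bluffs
print(indifferent_bluff_share(50, 100))    # half-pot bet  -> 1/4 bluffs
```

With a pot-sized bet this yields the familiar ratio of one bluff for every two value bets.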
Emotional maturity
A third skill people can learn from poker is how to handle emotional swings.  Anybody who’s played poker for any amount of time knows that you’ll have to deal with both winning and losing.  Keeping your emotions in check, and handling both wins and losses calmly, is important.  Poker also gives you plenty of opportunities to deal with both good luck and bad luck.  Handling swings in fortune is crucial to success at the table, and is an important life skill as well.
A book titled “The Poker Mindset,” written by Ian Taylor and Matthew Hilger, addresses the psychological aspects of poker and how a successful player handles them.  These skills are so essential for success in other fields that Taylor and Hilger’s book, while written for poker, is being used by economic analysts and in a college course on legal negotiations.  These are fields where emotional control is essential, and they are turning to advice from poker players.
Money management
A fourth skill that can be learned from poker is money management.  In poker, running out of money means losing the ability to play poker.  Players know that just because they have money, they can’t spend it freely if they want to “stay in business” (keep playing poker), and that it’s good to have reserve funds available.  Players who wish to play regularly learn money management skills that are useful not only to business owners but to everybody.  After all, every financial advisor will tell people to have an emergency fund. It takes discipline to maintain an emergency fund and not be tempted to spend it.
The argument from those who oppose gambling is short-sighted.  It ignores the enormous leisure benefits people can gain from certain games, like blackjack, betting on horse-racing, and especially poker.  Companies, universities, stock-market traders, and others rely on wisdom from poker to help train their workers and students.  The benefits go far beyond simple recreation.
That being said, this isn’t the best argument to keep the government from prohibiting gambling.  Even if there were no benefits from gambling, one could argue that our government shouldn’t restrict it.  After all, is it the government’s right to tell adults how to live their lives?  For those of us who think it isn’t, the better case to keep the government away from gambling prohibitions is the argument that we deserve freedom.