Although gold prices hit a new high in mid-January, Americans, by and large, remain wary of gold. They don’t quite “get it.” This incomprehension is different from the way Americans fail to “get,” for example, bitcoin (as few seem to). Many do understand gold as a safe haven, one that has stood the test of time, war, crises, and inflation. Some also understand that no gold proponent advocates harking back to a mythical 19th-century gold heyday (one that did not exist, certainly not consistently), recommends re-issuing minted gold currency, or favors reverting to any kind of bimetallism (the 19th-century norm).
That said, the “barbarous relic” view tends to persist. The prevailing assumption is that gold simply has no place in a modernized (read: central bank-controlled) economy. Making matters more complex is the question of what counts as gold and what does not. The recent proliferation of gold derivatives (“paper gold,” ETFs, certificates, outright bogus gold), along with questions about Chinese and Russian buying, depleted reserves, and actual supply, makes the subject opaque and abstract. In light of this confusion, a basic overview of the role of gold in an economy, in both classic and modern terms, is in order:
The Complexity of the Age of Gold Standards
At the beginning of the modern economic era, in the later 19th century, a pure “gold standard” was never consistently applied. Its rise to preeminence as ‘the’ pillar of sound economic theory nonetheless rested on gold’s role as a hedge against inflation and against Unsound Money: paper money easily manipulated to reckless credit whims. In this regard, the European central banks of the day were excellent watchdogs.
Great Britain led the charge on this score. The country was on a full gold standard de jure from 1816 onwards, but de facto (the nuance is important) since 1717. Ironically, this came about after Sir Isaac Newton, as Master of the Mint, did not depreciate gold enough while the country was on a bimetallic standard, like most of the world. According to the legend, Newton set the official silver price of the gold guinea at twenty-one shillings and inadvertently drove the newly re-minted, full-bodied silver coins out of Britain, an illustration of Gresham’s law. This left only worn silver coins to circulate as a means of payment alongside overvalued gold coins. Yet this error in judgment established the gold standard in practice in the country, later codified into law following the Napoleonic wars. This, in turn, influenced other nations, especially Germany and Japan, to adopt the gold standard as a means of emulating British success.
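To see why Gresham’s law took hold, consider a stylized arbitrage; the ratios below are illustrative round numbers, not Newton’s exact figures. If the English mint rates an ounce of gold at 15.2 ounces of silver while continental markets rate it at only 15.0, anyone can melt or export full-bodied silver coin, buy gold abroad, and have it coined in London at a profit:

\[
15.2\ \text{oz silver (mint price of gold in England)} \;-\; 15.0\ \text{oz silver (price of gold abroad)} \;=\; 0.2\ \text{oz silver gained per ounce of gold imported}
\]

Repeated at scale, the trade drains full-weight silver coin from circulation and leaves the overvalued gold (and worn silver) behind, which is precisely the outcome described above.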
A consistent international gold standard proper dates from roughly the 1870s to 1914. The U.S. adopted a de facto gold standard when it resumed specie (metal coin in mass circulation) payment on greenbacks in 1879, and a de jure one with the Gold Standard Act of 1900.
The features of this old gold standard, both European and American, were as follows: a) the national monetary unit was defined in terms of a given quantity of gold; b) the central bank or Treasury stood ready to buy and sell gold at the resulting fixed price in terms of the national currency; c) gold was freely coined, and gold coins formed a significant part of the circulating medium (which, among gold-standard countries, implied fixed exchange rates). It was not a flawless system, of course, and it did not ensure price stability. Variations were due mostly to fluctuations in gold production and the gold-rush boom cycles that followed the Civil War.
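To make the fixed-exchange-rate point concrete, here is a back-of-the-envelope parity calculation using approximate classical-era mint prices (roughly \$20.67 and about £4.25 per troy ounce of fine gold; the figures are rounded):

\[
\frac{\$20.67\ \text{per oz of gold}}{\pounds 4.25\ \text{per oz of gold}} \;\approx\; \$4.86\ \text{per pound}
\]

So long as each country honored its parity, the market dollar-pound rate could drift from this figure only by about the cost of shipping and insuring bullion, the so-called gold points.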
The Age of Fiat Currency
After World War I, a different international landscape emerged, and the “gold standard” took on multiple meanings, a fact that eventually led to the standard’s degradation and abrogation. When the U.S. and France displaced England as the chief post-war creditor countries, the pre-war gold-standard mechanism to which the now-depreciated currencies had been anchored no longer existed. Only America was left with a full gold standard; Great Britain and France had a gold-bullion standard, and other countries (Germany, primarily) had a gold-exchange standard.
(A full gold standard fixes the currency to a specified quantity of gold. A gold-exchange standard, then dominant in the 1920s, meant that foreign governments held U.S. dollars and British pounds as reserves rather than gold itself, since those currencies were exchangeable for gold. A gold-bullion standard means that gold coins do not circulate, but the authorities agree to sell bullion on demand at a fixed price in exchange for the circulating currency.)
Under these competing standards, unbalanced trade relationships began to overwhelm the international economy. The U.S. manipulated its own domestic credit policies to ease credit and exchange-standard controls, a factor that culminated in the crash of 1929 and the international financial crisis of 1931.
The contemporary era of gold begins when President Franklin Delano Roosevelt took the country off the full gold standard in 1933, meaning, among other things, that creditors lost the right to demand payment in gold and individual households were required to turn in their gold holdings. Under Bretton Woods (1944), gold was effectively sidelined: domestic convertibility was illegal, and gold’s monetary use was tightly constrained in favor of the dollar. President Nixon’s closing of the gold window in 1971 was, in effect, a postscript to Bretton Woods.
Today, investors still view gold as an insurance policy of sorts, but, as I have written here, the gold market is itself vulnerable to manipulation through short-term rigged trades and long-arm central bank interventions.
“In the absence of the gold standard, there is no way to protect savings from confiscation through inflation. There is no safe store of value,” Alan Greenspan was quoted as saying on this site, after he left the Fed and returned to being the Old Greenspan. That reversion demonstrates what any economy needs for clarity and common sense about the true value of money, which has been, in many times and places, the standard of gold itself.