This post was written with contributions from Diego Andrade. Diego is a Gold Strategist in the Global SPDR Business.
Investing is a profoundly personal exercise. We all have individual viewpoints and goals that shape our investment decisions. As a veteran of more than 40 years in the financial industry, I have seen the influence of personal preference play out through the lens of gold investing. Some investors would never think to build a portfolio without gold, while others would never think to build one with it. My job as a gold expert is to ensure investors are fully educated on gold and its potential role in a portfolio, so they can make an informed decision about how to approach the precious metal in their own investments.
That is why comments Warren Buffett made about gold at Berkshire Hathaway’s annual meeting struck me, and I felt they needed to be placed in context. After all, when the Oracle of Omaha speaks, investors listen, and I want them to be adequately informed.
In May 2018, Buffett asked his audience to compare the returns from investing $10,000 in gold in 1942 with investing that same amount in an S&P 500® Index fund.1 He then revealed that the S&P 500 investment would be worth $51 million today, compared with roughly $400,000 for gold. This comparison, with no additional insight, would cause almost any investor to question a gold investment. After all, using Buffett’s numbers, we’re talking about a 12.06% compound annual growth rate (CAGR) for stocks vs. a 5.04% CAGR for gold, which results in what looks like a lost investment opportunity worth millions of dollars over 76 years.
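For readers who want to check the arithmetic themselves, CAGR is simply the constant annual return that grows a starting value into an ending value over a given horizon. Here is a minimal Python sketch; the dollar figures are Buffett’s, while the roughly 75-year horizon is my assumption, since it reproduces the quoted rates:

```python
def cagr(start_value: float, end_value: float, years: float) -> float:
    """Compound annual growth rate: the constant yearly return that
    turns start_value into end_value over the given number of years."""
    return (end_value / start_value) ** (1 / years) - 1

# Buffett's example: $10,000 invested in 1942 in the S&P 500 vs. in gold.
# A horizon of roughly 75 years reproduces the CAGRs quoted above.
years = 75
print(f"S&P 500: {cagr(10_000, 51_000_000, years):.2%}")  # -> 12.06%
print(f"Gold:    {cagr(10_000, 400_000, years):.2%}")     # -> 5.04%
```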
But it’s important to look under the hood when analyzing return figures. Below are three reasons I believe Buffett’s premise is misleading and fails to give investors enough context to understand why this is not an apples-to-apples comparison.
1. The history of the spot price of gold
From the ancient kingdom of Lydia in the seventh century BCE to the modern US Treasury, gold has played an influential role in international monetary systems. Gold was established as the sole basis for redeeming paper currency in 1900, when President William McKinley signed the Gold Standard Act. It set the value of gold at $20.67/oz and, most importantly, it meant that for the next 71 years the spot price of gold ($/oz) would be controlled by a branch of the US government.2
By March 1942 the Federal Reserve had adjusted the spot price of gold to $33.85/oz. Two years later, the Bretton Woods agreement was signed, pegging the US dollar to gold at $35/oz. The system lasted until President Nixon ended it on August 15, 1971, by which time the Federal Reserve had raised gold’s price to $43.28/oz.3
Why is this important?
As the chart below shows, in the 29 years from March 1942 until August 1971, the Federal Reserve (Fed), not the free market, controlled the price of gold. During this time the Fed raised the price of gold by $9.43/oz, equivalent to a 0.65% CAGR. Over that same period, the S&P 500, which is priced by the free market, delivered a 13.82% CAGR.
But comparing returns between the S&P 500 and gold starting around the time President Nixon removed the US dollar from the gold standard provides more of an apples-to-apples comparison. Nixon’s policy move meant that free markets, not the US Fed or any other central bank, would control the pricing of gold, the same way free markets dictate the value of the S&P 500. Looking at that same initial $10,000 investment in gold and the S&P 500, measured from the second quarter of 1971, shows the gold investment would have appreciated to $311,640, while the S&P 500 investment would stand at $1,097,900. This is equivalent to a 10.62% CAGR for the S&P 500 vs. a 7.61% CAGR for gold.4
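The same hypothetical cagr helper from the sketch above roughly reproduces this second comparison. The ~47-year horizon (Q2 1971 through early 2018) is my approximation, and small differences from the quoted rates come down to the exact start and end dates:

```python
# Post-Bretton Woods comparison: Q2 1971 through early 2018 (~46.75 years).
years = 46.75
print(f"S&P 500: {cagr(10_000, 1_097_900, years):.2%}")  # ~10.6% (quoted: 10.62%)
print(f"Gold:    {cagr(10_000, 311_640, years):.2%}")    # ~7.6% (quoted: 7.61%)
```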