IN PART I of this article, I discussed Carl von Clausewitz and his important study of policy and strategy, On War, published in 1832. Clausewitz was reviewing the Napoleonic Wars through the lens of policy and strategy, but his ideas have broader scope than just the martial arena.
In his discussions of the Napoleonic Wars, Clausewitz was, of course, reviewing military history. But he was not writing simply for the sake of allowing his readers to enjoy the vicarious experience of war, the whiz and zip of flying bullets and the acrid smell of gunpowder clogging their nostrils.
In Part I of this article, I mentioned that according to Clausewitz, policy, strategy, operations, and tactics are a continuum. I summarized one of Clausewitz’s key points as being that while a failure of tactics or operations can be overcome further up the chain, a failure of policy or strategy will doom the entire chain.
I advised you to keep the “failure of policy” and “failure of strategy” ideas in mind because we see these a lot, throughout history and certainly in current events.
A Few Words on Politics, Policy, Strategy, and Energy
I am discussing all of this because the United States needs to start thinking, and thinking very hard, about developing a “national energy strategy” in a pure sense. In Part I and in the preceding paragraphs here, I began to refer to policy and strategy in the context of Clausewitz and of warfighting.
I am using Clausewitz as the rhetorical guide to the concepts of policy and strategy. The only reason that I am now using Clausewitz in connection with energy strategy is that his application of policy and strategic concepts to the study of the Napoleonic Wars helps to crystallize thinking about the basic strategic terms. That is, the strategic concept is applicable to energy issues as well.
If you follow the continuum of Clausewitz, then politics and policy come before strategy. So perhaps we have to ask what is the appropriate energy policy for the United States? Or moving further up the line, what are the politics of energy?
These are loaded and emotional questions, particularly in a nation with the energy consumption history of America.
I have been discussing this back and forth over the months with my colleague at Agora Financial, Dan Denning, who made a good point. Dan noted that up until the present time, no U.S. national energy policy has had any true sense of strategic urgency, let alone any staying power.
Sure, it has always been fun and political sport to beat up on oil companies, going back to the days of Titusville. But a long-term national energy strategy? Not in America, and for a lot of reasons.
Back to the Future in the 1970s
The only time that energy became a fundamental issue of U.S. policy was back in the 1970s. That was when U.S. oil production peaked, imports started to accelerate upward, and energy prices rose dramatically after the 1973 Yom Kippur War.
Due to a nominal “embargo” against the United States, sponsored by the Arab nations of the Organization of Petroleum Exporting Countries (OPEC), the politics of energy became more than apparent in 1974, embodied in public dissatisfaction with long lines for scarce gasoline.
What passed for a U.S. energy policy followed the gasoline lines, mostly in terms of new federal laws and regulations concerning energy use and production. For example, there was bipartisan political support for energy conservation measures such as tax breaks for insulating homes, and even mandates requiring automakers to increase the fuel efficiency of their products. The United States also determined to establish its Strategic Petroleum Reserve.
Also, there was federal sponsorship of research into alternative energy systems such as coal conversion and oil shale development, among many other energy options. A lot of things happened in America in the 1970s with respect to energy conservation, and a lot of those things were at what could be characterized as a “strategic” level, and basically good for the nation over the long haul.
Still, much of what occurred in the 1970s seems in retrospect to have been a response to clever lobbying at the federal political level by well-connected interest groups, and U.S. knee-jerk reactions to singular events and geopolitical machinations by other parties in other nations.
That is, what passed for U.S. energy strategy was along the lines of maintaining a happy, yet receding, status quo, not the creation of a continuous process of influencing and shaping events toward an energy future that would be marked by radical change.
The events at Three Mile Island, for example, stand alone as a sharp, high-G turning point in the U.S. collective view of nuclear power, a view that has taken almost two generations to start to change.
Looking back, it seems as though little of either the politics or policy toward energy in the United States during the 1970s was premised on a clear understanding of the underlying physical reality: the impending decline in conventional oil production.
As far back as 1956, the great geologist M. King Hubbert made public his mathematically determined prediction of the “peaking” of domestic U.S. oil production, and his prediction came true for the United States in 1970 (although it only became clear in hindsight).
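As an aside, the mathematical heart of Hubbert's argument was that cumulative production from a finite resource tends to follow a logistic curve, so annual production rises to a single peak and then declines symmetrically. The shape of that argument can be sketched in a few lines of code; the parameters below are purely illustrative round numbers, not Hubbert's actual published fit:

```python
import math

def hubbert_production(t, q_max, k, t_peak):
    """Annual production under a logistic (Hubbert-style) model.

    q_max  : ultimate recoverable resource (total barrels)
    k      : steepness of the logistic curve
    t_peak : year of peak production
    """
    e = math.exp(-k * (t - t_peak))
    # Derivative of the logistic cumulative-production curve:
    # production is highest exactly at t_peak and symmetric around it.
    return q_max * k * e / (1.0 + e) ** 2

# Illustrative parameters only: roughly 200 billion barrels of
# ultimate U.S. recovery, with the peak placed in 1970.
params = dict(q_max=200e9, k=0.07, t_peak=1970)

p_1960 = hubbert_production(1960, **params)
p_1970 = hubbert_production(1970, **params)
p_1980 = hubbert_production(1980, **params)

# The 1970 value exceeds both flanking decades, and the curve
# is symmetric about the peak year.
print(p_1970 > p_1960 and p_1970 > p_1980)
```

The point of the sketch is not the particular numbers but the logic: once the ultimate recoverable total and the curve's steepness are estimated, the peak year falls out of the arithmetic, which is what made Hubbert's 1956 prediction testable in the first place.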
As the 1970s progressed, this Hubbert school of thought was distinctly a minority viewpoint, certainly much less prevalent than even the current Peak Oil school. After leaving the employ of Shell Oil, Hubbert worked for the U.S. Geological Survey, lectured at many schools, and was for a time a visiting professor at MIT.
Yes, Hubbert’s argument was out there, but was a distant voice in the wilderness. Hubbert’s seminar at no less an institution than Harvard University in 1977 attracted all of about a dozen people, one of whom was me.
In the 1970s, whenever some people would speak of limits to the ultimate levels of oil production, others as often as not would sneer and label the discussion derisively “Club of Rome” thinking. In all fairness, U.S. leadership and policymakers had other things to worry about, like surviving the Cold War, and losing the Vietnam War.
And it is not out of line to note that the festering cultural Marxism within much of U.S. politics in the 1970s foreclosed intelligent thinking at many levels (both then and now, I should add). But that is another discussion for another time.
It is interesting that one of the most prescient discussions of the world’s energy future was presented in the 1970s not by any senior leader or policymaker of the U.S. political or business establishment, Democrat or Republican, but by the great science fiction writer Isaac Asimov.
Best known for such classic science fiction works as the Foundation series and I, Robot, Asimov gave his prophetic lecture in 1974 entitled “The Future of Humanity.” Here is the link to the rather astonishing text if you are interested: http://www.asimovonline.com/oldsite/future_of_humanity.html.
Back to the Future in the 1980s and Beyond
The political and policy debate of the 1970s led in the 1980s to what passed for a national political consensus (well, what passes for “national consensus” in a vast nation like America) that no long-term national energy mobilization was necessary. Simply put, the “energy policy” was to drill for oil at home and buy more oil from abroad. (For the record, I was working for Gulf Oil Co. back then, and we drilled like gangbusters.)
When all else fails, the policy was to have a strong Navy and powerful expeditionary ground force and Air Force that could dominate events in the Persian Gulf. QED, right?
Looking back at it, the lack of any sense of urgency in crafting some sort of U.S. energy policy is not incomprehensible. By the 1980s, America had a diversified supply chain of domestic and offshore oil suppliers and the price of crude oil was serendipitously falling due to the opening up of new oil provinces like Alaska and the North Sea.
For a good many years, America had seemingly found its happy means to preserve the beloved status quo, seen in a relatively low price for gasoline at the pump. The voters were calm. The underlying politics was ill informed, if not delusional.
Despite the price spike in oil occasioned by the Gulf War of 1991, this fanciful U.S. view of its seemingly secure — and cornucopian — energy base lasted through the decade of the 1990s as well, during the “waking up from history” period and into the new millennium. Over time, the shale oil projects of the 1970s were shuttered.
The SUVs of the 1990s took to the roads by storm. And like the gold-laden galleons of another age, the oil tankers docked at U.S. ports with their precious cargoes of black treasure sloshing in the holds and pressing down upon the keels.
From a political and policy standpoint, it seemed that the factors behind any increase in energy prices in the 1980s, 1990s, and into the 2000s were temporary and political, not permanent and physical. Price spike?
It was the market at work. It was the manipulations of evil oil-traders, or some one-off military event like the “Tanker War” of the mid-1980s in the Persian Gulf. Give it time, and “Old Mr. Market” would get those pesky prices back down. To the extent that there was an energy strategy in all of this, it was to perpetuate the myths of the past.
And anyway, strategy or no, the traditional economic view in America is that “free markets” are better at allocating the resources of society than government regulation or planning. This despite the fact that the U.S. dollar is a “managed” currency, the price and quantity and quality of which are set by the Federal Reserve.
And this despite the fact that U.S. federal and state governments have heavily regulated most large-scale business activities for over a century, down to the former Interstate Commerce Commission setting the prices and tariffs for bus tickets.
And of course, up to 40%, by some estimates, of U.S. gross domestic product is the result of government spending at the federal, state, and local levels. So how “free” is such a free market, anyhow? To paraphrase a former U.S. president, “I guess it depends on what your definition of the word ‘free’ is.”
Is the “Free Market” a Strategy?
Americans have a lot of confidence in the so-called “free market” as a mechanism to permit the nation to mobilize and meet challenges. But I have to wonder out loud if this is based mostly on wishful thinking and outdated nostalgia about what occurred in the run-up to and during World War II. Like dried-up has-been movie stars recalling their glory days on the silver screen of old, Americans have read too many of their own press releases about being the “arsenal of democracy.”
Or is this kind of self-promotion just a reflection of a national characteristic, originating in a nation and culture that uses the mighty U.S. dollar as its currency, a currency that is also fortuitously the world’s “reserve currency”? After all, it is easy to write checks if nobody ever comes to the teller window to cash them.
During World War II, America had tremendous wealth in both natural resources and capital stock, including being almost self-sufficient in oil production. (America was a net oil exporter until 1943.) Politically, the war allowed the central government to, in essence, hijack the national industrial base and orient it toward fighting a war on two fronts, on opposite sides of the planet. It was an immense undertaking.
From a policy standpoint, the key challenge for the American war planners was to figure out the most efficient ways to reorient national production to support the war policy and defeat the Axis powers.
Lack of resources, such as energy resources and other raw materials, labor, and especially credit, was not a significant factor in the American war effort. (And when it comes to modern military affairs, the U.S. Department of Defense still tends to get what it wants, particularly if the item is manufactured in a range of states and congressional districts. This is another topic for another time.)
I am by no means arguing that World War II was easy. The Second World War was an immense national effort, probably unique in all of human history, and certainly in U.S. history. But there were no real constraints or limitations at the level of domestic war production.
After all, there were no fleets of German bombers flattening U.S. cities, as was the case in many other countries in the world. And there was plenty of energy with which to keep the lights on and the furnaces burning down at the mills.
By 1944, the United States could summon its allies at Bretton Woods, N.H., to forge (some say, to rubber stamp) the future monetary arrangements for the world, with the U.S. dollar as the linchpin. It is not overstatement to say that by mid-1943 it was all but a foregone conclusion that U.S. warfighting policy, and its implementing strategy, would prevail.
Japan’s Adm. Isoroku Yamamoto understood and believed this as far back as 1940 and in so many words famously said almost exactly that to Japan’s premier that year. The bottom line was that the U.S. economy had the capacity to provide the theater commanders and generals and admirals — not to mention the fighting elements of numerous allies, from Britain to China to the Soviet Union — with the tools they needed to win the war. And so it happened.
The most difficult questions, certainly for the U.S. political and military leadership at senior levels, revolved around where and when to deploy the blood and treasure of the nation most effectively in a military sense. It was the hardest of calculus. For example, the invasion of Europe was delayed due to the need for landing craft in the Pacific Theater.
So hundreds of thousands of Russian troops died on their own German front, and Stalin fumed and raged while the D-day invasion of France was postponed again and again. But at the same time, the United States could send 1,000-plane air raids against German cities, tying up a million German air defense troops and tens of thousands of German guns and aircraft.
For an example from the Pacific Theater, the invasion of Iwo Jima occurred because the Army Air Forces wanted that particular island as a strategic base on which battle-damaged and fuel-short B-29s, valuable airplanes full of highly trained aircrews, could land on the way back from bombing Japan.
Thus, thousands of Marines died capturing Iwo Jima from fierce Japanese resistance. By the grim math of war, eventually, the deaths of these Marines saved the lives of more thousands of U.S. aircrew. As I said above, this was a hard sort of calculus.
At this D-day and long-range bombing level of strategic and operational planning, there were damn few clear-cut choices and no easy ones. But one of the few luxuries available to the war planners was not to have to worry much about from where the next barrels of black oil would come.
We will continue this discussion in Part III of this article. Thank you for reading Whiskey & Gunpowder.
Until we meet again…

Byron W. King
April 10, 2006
Byron King is the editor of Outstanding Investments, Byron King's Military-Tech Alert, and Real Wealth Trader. He is a Harvard-trained geologist who has traveled to every U.S. state and territory and six of the seven continents. He has conducted site visits to mineral deposits in 26 countries and deep-water oil fields in five oceans. This provides him with a unique perspective on the myriad of investment opportunities in energy and mineral exploration. He has been interviewed by dozens of major print and broadcast media outlets including The Financial Times, The Guardian, The Washington Post, MSN Money, MarketWatch, Fox Business News, and PBS Newshour.