Bourgeois society stands at the crossroads, either transition to Socialism or regression into Barbarism.
- Rosa Luxemburg, "Junius Pamphlet" 1916

Sunday, July 3, 2011

On the Confederate victory at Gettysburg: July 3, 1863

Little Round Top
Gettysburg battlefield, Pennsylvania
From http://commons.wikimedia.org/wiki/File:View_from_Little_Round_Top_over_Gettysburg_Battlefield.jpg


The destruction of Philadelphia by Robert E. Lee's Army of Northern Virginia on September 11, 1863 effectively ended the War of Secession. Fighting between Lee's forces and remnants of General Winfield Scott Hancock's Army of the Potomac would continue for another month, with minor skirmishes ranging across Pennsylvania. But the burning of the city where the Declaration of Independence had been signed triggered the final dissolution of the American union, four score and seven years after its birth. Five days after Philadelphia, word reached Washington that France and Britain had extended official diplomatic recognition to the Confederate States of America. The Anglo-French decision had been reached prior to the Philadelphia catastrophe, but news of European recognition sealed the union's fate. On September 17, a delegation of Congressional leaders met with President Lincoln at the White House. Following that meeting, Lincoln announced his resignation from office. Vice-President Hannibal Hamlin of Maine was sworn in two hours later as the seventeenth President of the United States. It was Hamlin who signed the peace accords with Jefferson Davis in Richmond, Virginia, on October 30, 1863. 

The consequences of Confederate independence continue to be felt. The territory of the former United States at the outbreak of war in 1861 is now home to twenty-one independent nation-states. The old Confederacy is the least of them all. The secession of Florida in 1972 reduced the rump Confederate states to a shell of their former greatness, with only Georgia, Alabama, and Mississippi remaining as members. Today the Pacific Republic, anchored by California and the Vancouver States and allied with Tokyo, contends with the Texas-led Concordium of the Central Republics for economic and military supremacy in North America. Pacific dominance in information technology is more than counter-balanced by the C.C.R.'s control of oil fields in its Mexican and Venezuelan territories.  

The autocratic empire of New York stands in glittering isolation from the other North American powers, its attention long since turned eastward across the Atlantic. The admirals, field marshals and industrial managers in Manhattan are content to supervise a thriving Bismarckian realm extending from Pittsburgh to Charleston. New York's alliance with the trans-European Kriegstadt, dating from the Great War of 1913-1922, remains strong on paper. Whether the troops and naval squadrons of New York will continue to participate in Berlin's counter-guerrilla operations against Russian insurgents remains a subject of fierce debate within the Alliance.  

The other shabby autocracies and principalities of North America fend for themselves as best they can. The Mormons have their Utopian theocracy in Deseret. The Dominion of New England recently withdrew from the British Commonwealth at last, but faces internal strife that could be its undoing.  Various separatist parties, having abandoned armed resistance in the 1980s, are now close to forming a coalition in the parliament at Boston. Meanwhile, the Great Lakes Confederation remains a squabbling hodge-podge of corrupt little cornfield republics. The Dixie Mountain states fight their endless, bloody wars for God and feudal rights. And so on. 

All of this because of what began on an April day in 1861 at Fort Sumter. Or perhaps it all started in the presidential election of 1860, which unexpectedly brought an obscure extremist party to power in Washington, led by an ill-prepared, mediocre lawyer from Illinois. Or perhaps today's North America first took shape at the Constitutional Convention of 1787, which failed so completely to forge a durable, functioning political union of the newly independent colonies. 

Some novelists and film-makers, rather implausibly, trace today's situation to the Battle of Gettysburg, the enormous bloodletting across three days in July 1863. The triumph at Gettysburg led to further victories for Robert E. Lee at the Susquehanna and Pottsville in August. And on to Philadelphia, the city where the Union ended and began. No one can deny that Lee's spectacular success in the summer of 1863 made him immortal. Military glory ensured his election as the second President of the Confederate States of America. It did not, however, save him from assassination in Birmingham on November 22, 1868. By then, Lee's brief presidency had already been a failure, as the great military conqueror proved unable to reconcile feuding delegations in the Confederate Congress. Two years later, the coup against President Longstreet by General Nathan Bedford Forrest would signal the beginning of the end for the Confederacy. 

Its former antagonist in the North fared no better. The disputed presidential elections of 1868 and 1872 degenerated into open warfare among contending military factions. Commanders who fought alongside one another in the War for the Union now waged bitter, bloody combat in the farmlands and cities of the United States. General Custer's seizure of Washington on June 25, 1876, ended the fighting. His summary execution of Generals Grant and Sherman on the national centennial made another war -- and the final breakup of the Northern Union -- inevitable. 

From a certain simple point of view, the bitterness and squalor of the century and a half since 1863 might, conceivably, be traced to a few key moments in that fateful year. If Lee's daring decision to divide his forces at Chancellorsville had backfired, maybe the Confederate victory at Gettysburg two months later would not have happened. If Union General George Meade had not abandoned the left flank of his lines at Gettysburg on July 2, maybe Longstreet's Charge would not have swept Union forces from Cemetery Ridge on the following day. Perhaps Meade might have held the left flank on July 2 if not for the surrender of the 20th Maine at Little Round Top that day. Maybe the destruction of the United States of America happened because a professor of rhetoric from Bowdoin College, one Joshua Lawrence Chamberlain, found himself commanding men in battle at a critical moment in North American history. If Chamberlain had rallied his regiment at Little Round Top, instead of surrendering, maybe everything afterward would have been different, as some writers of the Lost American Cause have insisted. Maybe Meade wouldn't have panicked when Confederate guns gained the summit of the hill Chamberlain lost. Maybe he wouldn't have ordered the withdrawals that led to Longstreet's Charge the next day. Maybe, if, only, might, would, should.

The urge to seek a single, decisive turning point in North American history is tempting. But turning points are the stuff of storytelling and myth. Historians are less apt to romanticize the course of human events. Their job is simply to seek explanation, by clearly defined chains of cause and effect. Unfortunately, events in the century and a half since the War of Secession (or the War for the Union, take your pick) do not easily yield to clear, simple answers. The chains of cause and effect exist in an ocean of possibility and time. In the end, it's beyond our understanding, except in the most tentative of ways. 


We know that we are here, now. In this time and place. How this happened matters less than what we do today. For our descendants, it will be the same, looking back at the decisions we made. Mystic chords of memory will not offer simple understanding or solace. Only questions. We do the best we can, and go on. There is no other choice.


* * * * * 




Acknowledgments: 

I first encountered a depiction of "Longstreet's Charge," in an alternate Battle of Gettysburg, in the book Gettysburg: An Alternate History, by Peter G. Tsouras.  My use of that phrase is freely borrowed from that book, which portrays a very different outcome at Gettysburg than the one I sketched in this post. In Tsouras' account, Longstreet's Charge ends in an unspeakable catastrophe for the Army of Northern Virginia, on a scale far greater than the nightmare charge led by George Pickett in our universe. The bloody slaughter of Longstreet's men triggers a furious Union counterattack that breaks the Confederate Army, resulting in the capture of Robert E. Lee and the ending of the Civil War two years early. 


In the fiction of Harry Turtledove, I first pondered the idea of a Confederate victory in the American Civil War leading to German triumph in World War I. In Turtledove's alternate history and mine, there is no unified American republic to reinforce the exhausted Western Allies in 1917-1918. Thus, the Kaiser's Germany succeeds in its bid for European hegemony. The Bolshevik Revolution, the rise of Hitler, and the Cold War never happen. The twentieth century world becomes a very different place. 


I owe my sketch of fratricidal warfare in North America in part to the novel The Difference Engine, by William Gibson and Bruce Sterling. In that book, British inventor Charles Babbage builds a functioning computer in the 1830s, inaugurating a revolution in information technology during the Victorian Age. British technological supremacy leads to overwhelming global dominance, with Her Majesty's diplomats and generals successfully instigating the breakup of the fragile union of American states by the 1850s. 


Confederate victory in the Civil War is probably, along with Nazi conquest of the Western Allies, the most popular alternate history in fiction. Like all good fictional tropes, it is subject to endless, fascinating variation. It was fun to add one more version to the mix.

Saturday, July 2, 2011

Malthus was not wrong: further lessons on bad interpretations of politics and history

Costa Pinto biofuel production facility, 
Piracicaba, Brazil, October 19, 2008
From http://commons.wikimedia.org/wiki/File:Panorama_Usina_Costa_Pinto_Piracicaba_SAO_10_2008.jpg


[I've updated this post a bit since its initial appearance, revising and extending certain sections to clarify and bolster the argument a bit. I hope. -Ed M. 7/3/2011]

I've been thinking a great deal about a recent visit from an old friend, who happens to be a member in good standing of the mainstream, Foreign Affairs-abiding, academic community of foreign policy scholars.

I talked a lot with my friend about peak oil, climate change, and ecological ideas more generally, as applied to the study of politics and international affairs. I tried to make the case that any valid interpretation of contemporary politics has to place at its center the unfolding ecological holocaust. Any useful understanding of politics or policy has to acknowledge ecological collapse as the basic motive force of human history, driving and defining all other current events. Any interpretation that ignores the unfolding ecological holocaust, or depicts it as but one policy concern among many, is intellectually bankrupt. Or so I tried to imply, assert, bluster, argue, and cajole, in conversations with my foreign policy buddy, over burritos and booze.

A lot of his responses boiled down to asking whether a crisis of the immensity I described actually existed. For example, my friend employed the venerable rhetorical strategy of invoking Malthus to show, supposedly, that assertions of imminent resource scarcity have always been wrong. Malthus was wrong about scarcity, therefore today's arguments about ecological scarcity are wrong as well. See my earlier post on this subject for more details.

The Malthus assertion, obviously, is a logical fallacy. It is also factually untrue, as my friend Steve -- an ecologist by trade -- pointed out in a comment on my earlier post. Malthus was not wrong. His basic argument about maximum limits on human food supplies, and therefore human population, was in fact correct. Yes, really.

More on that in a moment. For now, I want to digress just a bit. I want you academics in the humanities and social sciences reading this to ponder my statement for a second. Thomas Malthus was in fact correct. That means the little bit of folk wisdom you've been nodding to ever so sagely in graduate seminars for lo these many years -- we all know Malthus was wrong -- is one hundred percent crap. You should think about that, when you are trying to teach your undergraduates and your graduate students. Something you have routinely believed for your entire professional career is, in fact, completely wrong, with no basis in fact. Not just misguided, not just subject to differing interpretations, but wrong. I will offer evidence for that further down in this essay. For now, I just want to point out to you academic types that there are probably other things in your field of interest which seem self-evidently true but have no actual basis in fact.

In the medical profession, practitioners are a little less complacent. This became clear to me during my years (which ended just yesterday) working in administrative support for physicians. Medicine has tried in recent years to institutionalize a corrective mechanism for blind acceptance of past practice. Such acceptance had been the norm for decades. Physicians had long been indoctrinated to believe that inherited wisdom from their mentors was more or less true. All those little tricks of patient care that you learned in medical school and residency were thoughtlessly applied out in the real world of the hospital and the clinic.

Then, beginning in the 1990s, a movement swept through Western medical professions, arguing for a new way of evaluating conventional medical wisdom. This movement called itself by the term evidence-based medicine. It held that doctors shouldn't simply do something because it had always been accepted as true. Instead, medical practice and belief should be constantly re-evaluated in the light of hard evidence. Professional mechanisms of promotion and recognition should encourage such habits.

You would think that medicine had been doing that all along -- millions of pages of scientific medical journals and textbooks, after all, had filled library and hospital bookshelves since the Victorian age. Yet, in reality, doctors out in their practices tended to ignore accumulating bodies of new evidence and simply do what they were taught, especially in the non-glamorous types of medicine. Sure, once organ transplants became big news, a family doctor out in Everytown, New Mexico, would take that into account in patient care. But, in addressing everyday aches, pains, and breakdowns, the old practices tended to prevail. Because it's easier to do something the same way than it is to constantly worry about whether or not you're doing it right.

That started to change, in the last two decades or so, as more and more doctors accepted the premise of evidence-based medicine -- that they should constantly check their beliefs and methods against an updated knowledge base. Computers and the internet, naturally, made this much easier to do. When I worked at the University of New Mexico School of Medicine, much of my daily work involved helping doctors in isolated rural towns try to access online databases of the latest medical information. These doctors actually gave a shit whether the information they used to treat their patients was like, you know, true. Evidence-based medicine, they realized, wasn't just good practice, it was also an ethical obligation.

Today, information long thought to be self-evidently true is much more likely to be questioned. Cough syrup, for example, turns out to be no better than a placebo, and beta blocker medications have been revealed as ineffective in treating a heart attack. Yes, some doctors think evidence-based medicine goes too far, and they grouse about fancy medical schools insisting that every little method has to have a vast body of statistical evidence backing it up ("I don't need no stinking statistics to tell me that jumping out of an airplane will kill you"). Such criticism is mostly misguided and irrelevant. Evidence-based medicine is here to stay.

In my experience, the humanities and social sciences -- most notably history and political science -- are much less sophisticated and systematic about re-evaluating their internalized, ossified conventional wisdom. Oh, to be sure, professors in those fields constantly publish in their little sub-specialized journals, like the American Historical Review, the American Political Science Review, Diplomatic History, and International Security. It is fashionable, in those journals, to publish some contrarian screed or other, boldly asserting that a bit of inherited academic wisdom is grievously wrong (Hitler did not deliberately seek war, slavery did not cause the Civil War, etc. etc. etc.). But each little contrarian screed, sad to say, is just one more bit of intellectual flotsam in a vast sea of muck. A contrarian screed in history or political science does not, for the most part, serve as a peer-review corrective to the knowledge base. We know this, because contrarian screeds are just as likely to appear on any side or angle of an interpretive or empirical question. The Russians caused the Cold War; no, the Americans did; no it was a combination of both; no, asking who's to blame is not the right way to approach the topic. Blah, blah, blah. The goal isn't progress in knowledge. It's endless proliferation of viewpoints, which is not the same thing.

There are occasional exceptions. Historians revised their beliefs about slavery in the American South, for example, when new research began to show that it was immensely profitable and not, as conventional wisdom had it, on the way to extinction. In this case, progress in knowledge actually happened. For once.

Mostly, though, some new bit of counter-intuitive information in a journal of history or political science doesn't serve to correct old beliefs. Instead, it functions as one more item on a curriculum vitae on the way to tenure. Or the hope of tenure, in an academic industry dominated by adjunct slave labor. The true, original function of academic journals -- recording the steady accumulation and refinement of knowledge -- has long since been abandoned in the humanities and social sciences. Those journals mainly disgorge an endless series of contradictory viewpoints, with no real effort at synthesis or consolidation or correction.

This stands in rather horrifying contrast to the situation in the natural sciences, such as physics, chemistry, and biology. Academic journals in those fields record an ever-expanding, consolidated, and verified body of reliable knowledge about the universe. Yes, yes, yes, careerism and mediocrity prevail in the natural sciences, too. Spare me. That doesn't change the fact that those disciplines produce actual knowledge with real world applications. They have given us stem cell treatments and microprocessors. Historians and political scientists have given us a continual vomit of worthless and forgotten verbiage, like "post-revisionism" or "hegemonic stability theory."

Or, far worse, they have given us empty ideological blather masquerading as knowledge -- the foremost example being neoclassical economics. Only slightly less prominent are such favorite delusions of policy-makers as theories of counter-insurgency, nuclear deterrence and "the democratic peace." Such blithering finds its way into government policy-making all the time, to serve as a ready-made justification for the agenda of powerful elites. As a description of reality, these ideas are useless at best, destructive at worst.

They also tend to be, on the whole, a patchwork quilt of assertions, distortions, and lies. The above examples are notorious. But the myth of Thomas Malthus as Chicken Little takes the cake, because it is so simple to debunk. Malthus was not wrong. He argued that there were absolute physical limits to food production, and therefore to human population.
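Malthus's core claim -- population tends to grow geometrically while food supply grows only arithmetically, so food eventually caps population -- can be sketched as a toy model. The numbers below are illustrative assumptions of my own, not historical data or a demographic model:

```python
# Toy sketch of Malthus's argument (illustrative numbers, not data):
# population tends to grow geometrically, food supply arithmetically,
# so the food supply eventually caps the population.

def malthus_sketch(pop=100.0, food=200.0, growth=1.03, food_step=2.0, years=500):
    """Yearly (population, food) pairs under Malthus's two assumptions.

    Population multiplies by `growth` each year but can never exceed what
    the food supply sustains (here: one unit of food per person); the food
    supply gains only a fixed increment `food_step` per year.
    """
    history = []
    for _ in range(years):
        pop = min(pop * growth, food)  # geometric growth, capped by food
        food += food_step              # arithmetic growth in food supply
        history.append((pop, food))
    return history

history = malthus_sketch()
# Early on, population grows freely; after a few decades it can only
# track the slow arithmetic crawl of the food supply.
print("year  50:", history[49])
print("year 500:", history[-1])
```

Whatever starting values you pick, the geometric curve overtakes the arithmetic one in finite time, and from then on population is pinned to the food line. That, and nothing fancier, is the "absolute limit" Malthus described.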

He was right. How do we know this? Consider the following figure, which I borrowed from the Global Change curriculum at the University of Michigan:


"Years BP" = Years Before Present

This graph shows the changes in human population dating back 500,000 years. You will notice that prior to the industrial revolution, the line makes exceedingly little upward progress, and in fact never goes higher than a certain point. Only at the outset of the industrial revolution does the line shoot upward in a sudden, exponential explosion of population.

Why? What changed? The easy answer: industrialization. Well, what made industrialization possible? There is one answer, and only one. Fossil fuels. That's it. Not Western intellectual traditions, not democratic institutions, not easy flow of capital. Nope, sorry, ladies and gentlemen. We owe industrialization to the mere existence of coal, oil, and natural gas. They alone provided, or ever could provide, the necessary supplies of cheap, abundant, easily accessible energy. Without that energy, the huge upward curve of economic growth after 1750 would not have happened. End of story. Without the upward curve of economic growth, there would be no gigantic leap in technology for industry and agriculture. Without that technology, expanding food production and the gigantic infrastructure to distribute the massive supplies of food would not have existed. Therefore, the upward curve of population you see on the above graph would not have happened. It would be physically impossible without the necessary technology and energy. Nothing trumps physical reality. Nothing. Fossil fuels alone made exponential population increase physically possible.

What is it about the physical reality of fossil fuels that makes them the sole enabling factor for the industrial revolution and associated population growth? As Steve the ecologist has pointed out, fossil fuels have unique physical properties. They provide an enormous bang for the buck, in terms of energy released per unit of fuel. No other source in nature even comes close. Nuclear fission appears to do so, but in actuality, it doesn't. Nuclear power plants are serviced by infrastructure made possible only by fossil fuels. Plus, uranium and other nuclear fuels are limited in supply. So the actual net energy yield of nuclear power is dramatically less than it appears to be. The fact that our civilization is rapidly depleting its stocks of fossil fuels and of uranium means the days of nuclear energy are numbered. Just like the days of our civilization itself.

The relatively flat curve on the above graph, prior to the industrial revolution, means that there is, in fact, an absolute limit on human population. Without fossil fuels, the natural carrying capacity of the Earth won't allow human populations to grow beyond those of the early 1700s. That's why the population growth curve in the graph above is so flat. The planet's inherent ecology and resources won't allow us to build enough infrastructure and grow enough food to support a population in the billions. We have 500,000 years' worth of evidence for this hypothesis. Fossil fuels granted a temporary, one-time-only exemption to the basic rule. No other energy source can match their yield or their ease of extraction, production, and distribution. Now they are going away. Oil first, natural gas and coal eventually. Ergo, the natural ceiling on human population will return.

Malthus was right.

Mainstream academics and policy-makers desperately want this not to be true. Everything about our current way of life depends on escaping natural limits to energy supplies and resources. Such an escape was the essence of the industrial revolution, which initiated two hundred plus years of seemingly unlimited economic growth, punctuated only by occasional episodes of depression and recession. We've been in such an episode since about 2008. Conventional wisdom says we'll grow out of it, as we always have. And perhaps we will, but the resumption of growth will prove, in the long run, to be transitory. And the long run might turn out to be very short. World production of crude oil has been flat since about 2005. It's not a coincidence that the global economy has been stagnant in that time, very nearly collapsing into the abyss during the catastrophic financial meltdown of September 2008. World oil production might begin to creep upward again, or coal and natural gas might begin to substitute for oil in various economic sectors, or certain alternative energy technologies (e.g. electric cars) might for a while stave off the impact of petroleum depletion. But substitution, like everything else in nature, has limits. The reckoning will come.

The people who run human societies don't want to face what Al Gore, in a different context, called an inconvenient truth. Everything about the organization of today's societies depends on growth. To question this is unthinkable. And so, instead, we look for a magic bullet. Some miraculous formula that will allow growth to continue without limit. Human beings thought they'd found such a magic bullet once before. Fossil fuels allowed human populations after 1800 to grow exponentially, in defiance of the natural limits described -- accurately -- by Thomas Malthus. Can't we find a way to do that again?

My academic historian friend, the one who visited recently, would like to think so. He casually invoked Malthus to dismiss my worries about resource limits. That attempt offers just one example of the desperate yearning among mainstreamers for a magic bullet. In grasping for one, my friend pointed to Brazil's ethanol industry as evidence against my worries about peak oil. Brazil has simply grown sugarcane for fuel, thereby substituting ethanol and other biofuels for petroleum. Presto! Problem solved. No need to worry about peak oil.

Except my friend is grievously, catastrophically misinformed about the success of Brazil's biofuel industry. Like physicians before the advent of evidence-based medicine, he would rather accept prevailing dogma uncritically than examine the available evidence. Brazil's shift to biofuels has been an unmitigated economic and human disaster. Converting finite agricultural land area to growing transportation fuel -- in Brazil and around the world -- has helped drive world food prices through the roof. Brazil fuels its cars and trucks by starving human beings. Even worse, climate change is destroying the finite area of agricultural land. This makes biofuels even more unconscionable. And also untenable. Brazilian biofuel prices have lurched periodically upward as weather disruption devastates global agriculture. With climate change blasting farmland and water supplies into oblivion via heat and storms, it makes no sense to substitute for Saudi oil by using scarce land for gasoline instead of food.

But never mind. Mainstreamers would rather have their magic bullet. They would rather not face the inconvenient truth -- that human beings must begin dramatically reducing their energy use. We must, of necessity, accept non-growth-based economics, also known as steady state economics or ecological economics. Economist Herman Daly has helped pioneer this emerging discipline, in preparation for the day when his colleagues abandon neoclassical fantasies of endless growth.

Historians, political scientists, and government officials would do well to follow Daly's example. Growth-based economics is about as useful as theology and metaphysics. Like those fanciful branches of pseudo-knowledge, abstract social science theories of endless growth are very elaborate and interesting, but useless as a guide to behavior in the real world.


Real-world examples from nature of growth without constraints are not comforting. Ecology offers numerous examples of populations overshooting their resource base, then crashing. A starker example comes from the science of medicine. Physicians don't need evidence-based medicine to know the name for a living mass that grows without limit. Such a mass is called cancer.

Cancer eventually destroys its host. Growth-based economics will do the same. It is destroying human civilization today. It is annihilating humanity's resource base by consuming resources faster than they can be replenished or substitutes invented (the literature on peak oil, for example, illustrates the physical impossibility of finding bang-for-the-buck alternatives to fossil fuels). The same resource consumption is destroying the climate and biosphere. Destruction of these finite support systems will eventually force human populations back within the natural limits to growth, whether we return to those limits voluntarily or not.