Why Socialism?

Hunter S. Thompson got a lot right in his Atlantic obit of Tricky Dick, maybe more than I’ll ever get right in all my writing. Specifically, he said, reporters had traditionally gotten it wrong when it came to covering one of the Greatest American Crooks: “Some people will say that words like scum and rotten are wrong for Objective Journalism – which is true, but they miss the point. It was the built-in blind spots of the Objective rules and dogma that allowed Nixon to slither into the White House in the first place.”

Thompson got a lot right, but with a big proviso. Now, I wouldn’t presume to lecture the late Dr. of Gonzo on astute observation. While, as a friend of mine observed, Hunter S. was an immature degenerate who blew his brains out while talking on the phone with his wife and may or may not have participated in gang rape in his reporting on the Hell’s Angels, he was a literary genius. But I have to say I think he missed the point himself. In the final paragraph of the obit, perhaps Nixon’s defining epitaph, Thompson disdains the sullying of the office of the American president – the ironic symbol itself, the ultimate power emblem, of the dark political arts that over the course of 227 years have sullied the very idea of America.

“Nixon,” Thompson wrote, “will be remembered as a classic case of a smart man shitting in his own nest. But he also shit in our nests, and that was the crime that history will burn on his memory like a brand. By disgracing and degrading the Presidency of the United States, by fleeing the White House like a diseased cur, Richard Nixon broke the heart of the American Dream.”

This is a profound reflection; Nixon weaseled his way into the position he did because of a press weakened by the confines of Traditional Objectivity as USA Today would define it. But so did many other presidents, and the ones who didn’t slunk into power through other proverbial loopholes, like Obama did with his complicit following, or like both Roosevelts and both Bushes via Nepotism Boulevard.

The American Dream itself has blind spots written into it, burned like their own invisible sort of brand in our constitutional obsession and capitalistic addictions. The dirty work is done behind oaken doors hung with “EXECUTIVE SESSION” signs, but we all know about it. Still we act disgusted.

Richard Nixon was not some horrific experiment-gone-wrong in the laboratory of democracy, no bizarre anomaly in the annals of Western leadership. He was only unique in that he got caught doing something all other presidents have done. But that wasn’t where we focused, and it wasn’t where we litigated. The investigation was swept into the spotlight by a couple of scrappy beat reporters at The Washington Post who uncovered the vast underworld of corporate and political villainy at home and abroad by looking into a seemingly isolated, blue-collar incident. We were only able to see the bigger picture of Nixon’s entire preceding lineup through the lens of a break-in, and we quickly forgot that bigger picture anyway.

Watergate is a household term; the Pentagon Papers, not so much.

We’re all complicit, even if it’s only by our own small ballot-box participations in the Great Machine, including me.

Since I enlisted in the World’s Greatest Fighting Force, a keystone of said apparatus, a year and a half ago, I’ve been trying to reconcile this chasm. It’s been especially tangible of late, like the stinging metallic tingle you get in your sinuses when you hit your head. It’s driven me a little batty, to the point I’m not sure if it shows in my social and professional interactions.

In a Navy C school lesson several weeks ago about a certain maintenance check on the weapons system I’m learning for what will become my job on the aircraft carrier USS John C. Stennis, an instructor was reading instructions for a procedure. He pointed out the blinking indications we should see on the equipment if everything works properly, and the conversation turned to just how fucking formidable the technology is. Here’s how it works in action: On floating cities in the middle of the world’s deep blue, through dozens of tons of steel cabinetry containing millions of dollars’ worth of electronic, hydraulic and pneumatic circuitry, signals dart to provide target coordinates to one of eight nearly 600-pound missiles, which takes off at something around three times the speed of sound. A few seconds later the missile detonates, destroying a threat to the ship that is generally flying through the air at sheer velocity itself. It’s just one of the many fire control systems the Navy’s FC technicians maintain and operate. Ours is strictly defense-based, unlike the Tomahawk missiles capable of targeting small items well past 1,000 miles away. But, NATO Sea Sparrow techs invariably agree, it’s as much a part of our show of force out there in the world.

Don’t fuck with us.

“Speak softly, and carry a big stick,” our instructor quoted a former resident of 1600 Pennsylvania Avenue, referring to how the system intimidates. “Teddy Roosevelt said that. He was a very wise man – and kind of a bad-ass,” the instructor told us.

His enthusiasm incorporates a level of institutional knowledge that has been difficult to find elsewhere in my Navy experience. He’s more assured in the historical lexicon than the average sailor, who’s generally more interested in beating the next level in the Halo video game series. But his fluency in the Official Record notwithstanding, I winced internally at the praise heaped on the former president.

I’d been reading a lot of literature advocating the abolition of the imperial structure lately, most notably late historian and World War II veteran Howard Zinn’s incredible account of American imperialism “A People’s History of the United States,” and had grown queasy at any fond rumination of American presidents in general.

I wondered if any of my six classmates in the laboratory could sense my apprehension, but they were probably focused on the lesson at hand. The test went well; we went about our day; we always did; we always do. We salute, request permission and go ashore.

The next day, my wife and infant daughter and I visited a naval medical center. Waiting for medical staff to call us for 7-month-old Harper’s checkup, Hailey and I noticed a man amble in, mid-70s I guessed, wearing a faded blue T-shirt, tattered jeans, a pair of tan moccasins and a “Navy Veteran” ball cap. He sat in the seats across from us. His hair was tousled salt-and-pepper, his chin a field of unwieldy stubble, his eyes knowing, gray, friendly. He started making wink-and-twinkle jokes: “That’s a real beautiful little boy you got there,” he said about Harper, who wore her sparse dishwater hair in a small ponytail that stuck straight into the air from atop her head. “Those are such beautiful brown eyes,” he added about Harper’s deep blue optics.

The old man wanted to impart his naval acumen to me, the young third-class petty officer across the way. It would turn out he was a retired command master chief, the third-highest enlisted rank, behind only fleet or force master chiefs and the Master Chief Petty Officer of the Navy.

His pious caretaking Betty Crocker wife filled his prescription at the desk behind him while he informed me about ship passageways. I had no initial way of telling how true it all was, but he told me he suffered from early onset dementia from “hitting my head too many times in the doorways designed for 5-foot-8 guys.” He’d been my height, he told me, 6-foot-1, but had diminished several inches, curled over like a question mark, over the cruel Civilian-World years.

His stories started slowly and shakily.

He was in the Navy during its heyday, was one of the first CMCs under a program instituted in the mid-‘90s to facilitate better communication between the enlisted ranks and commanders. He imparted knowledge of how the chain of command should work in all situations – the CMC always goes through the next person in the chain, the executive officer, never straight to the captain. It’s just as it works at any level of rank; always go one rung at a time.

He gave me a history lesson on his final ship, the USS San Jacinto, one of the first Aegis-class cruisers in the Navy. His first act as CMC was to commission the San Jacinto in Houston in 1988, with then-Vice President George H.W. Bush in attendance. Bush was there because 44 years earlier he’d flown from the deck of an aircraft carrier of the same name in support of air missions during WWII. The to-be president of the United States had been shot down near Chichi-jima in the Pacific, and had a harrowing tale of escape.

Anyone with a tenuous grasp on American politics knows Bush went on to become a Republican director of Central Intelligence and a politician who strongly supported the military mission of the United States, for whatever that’s worth.

The master chief held a grin and a nostalgic bravado describing his guidance of the future First Lady, Barb, about the San Jacinto as it became the 56th Navy cruiser off the coast of ‘Murca-town. It was a celebration. The residents of Houston “treated my soldiers to burgers, beer; they really treated my sailors well.”

One last piece of advice as his wife approached, his medication in hand, and he stood to leave: If I ever hit my head on a ship, I should get it logged in my medical record – the benefits require rigorous documentation.

He showed me his tattered blue VA photo ID confirming his former position, “so you know I’m not bullshittin’ you.” He shook my hand, but didn’t like the grip he had and readjusted to give my palm the engineman’s grip he’d used in the Navy before he became a CMC.

“That’s better; I used to turn wrenches, you know,” he said. He winked, let go of my hand and walked out the door.

“Have a good evening, master chief,” I said.

The interaction lasted no more than 10 minutes, but I already liked the guy. If I were still a drinker, I could have sat down for a beer with him, played a game of pool, heard more sea stories. I generally feel the same way about the far younger C school instructor who’d quoted Roosevelt. Their Americana is the truest, the bluest out there.

But with their laudatory commentary on the men who helmed the warship contraption – our military-industrial complex, which you can’t criticize without losing a piece of your humanity – my distaste for Americana only intensified. The idea of accomplishment by most American politicians, who’ve always held the American ideal of capitalist imperialism higher than the health of the body politic, grew more caustic. Roosevelt, Bush, everyone before, in between and after – they were just muscles tightening the bony clutches of American expansionism, turning our green world ugly.

Teddy’s Big Stick

ROOSEVELT, A REPUBLICAN, is held among liberals and intellectuals, more than most, as one of the greatest American presidents, the truest of progressives, though he’d later be overshadowed by his fifth cousin, FDR. Teddy was hailed as a progressive champion of the betterment of the working class. But he was, in essence, a man who climbed the ranks to the top echelons of power – assistant secretary of the Navy, then vice president, then Most Powerful Man in the World – another apparition of the imperial erectile hyper-function that has always defined American foreign policy.

I’ll quote heavily from Zinn, so here I go: “Theodore Roosevelt wrote to a friend in the year 1897: ‘In strict confidence … I should welcome any war, for I think this country needs one.’”

The letter, one of the most direct and honest mandates for the military-industrial complex, was written during the Roosevelt ascendancy, and at a time of particular gloom among Americans. The country had been in an economic depression as American corporate empire-building sputtered to a halt at the Pacific Ocean. The economy had relied disproportionately on railroad expansion, in which the tycoons, in bed with the political structure, exploited and stole from American Indians and poor farmers just trying to earn an honest day’s wage. At shining sea, they could go no farther. They’d wasted away their most important resource: frontier. Thousands of businesses and hundreds of banks closed, and unemployment remained higher than 10 percent from 1893 to 1898.

Inheriting the mess in 1897, the newly elected Republican William McKinley was desperate for new boundaries to push. He found them on the lush islands of Cuba, Puerto Rico, Guam and the Philippines, where the United States had invested millions in railroads, tobacco and mining. It was widely reported that McKinley didn’t want war with Spain, which would be marketed as retribution for Spain’s treatment of an already mobilized local rebellion in Cuba. McKinley didn’t want war – he needed it. A war would mean two big things: First, strategic expansion of the American marketplace abroad. Industry needed this because of an overabundance of American industrial and agricultural production for which there was no release valve. Second, the remobilization of the war economy, which, with Twentieth Century wars to come, would establish itself as a vital and clever gimmick in keeping America’s imperialist id as secure and largely unconscious as it has been since.

So a mysterious explosion that destroyed the battleship USS Maine – which sat just off the coast of Havana “as a symbol of American interest in Cuban events,” according to Zinn – offered a convenient catalyst for public support for war with Spain, after establishment newspapers in New York City blamed the explosion on Spanish forces.

Needless to say, America won the conflict and, in a $20 million settlement with Spain, annexed Puerto Rico, Guam and the Philippines, the last of which allowed the States a theater in the burgeoning marketplace of China. The nearly immediate insurrection of the Filipinos in 1899 did not last, as Americans brutally quelled it, allegedly massacring the dissenters. As for Cuba, America forced upon it a deal that effectively enslaved it economically to the United States and allowed the American military to set up permanent shop there. This foreshadowed the establishment of the naval base at Guantanamo Bay, where a large number of innocent dark men the U.S. has accused of terrorist affiliation are on hunger strike and are having hoses shoved through their nasal passages as conduits for food to be pumped into their stomachs.

In the swirling clamor, Teddy Roosevelt, an Army colonel whose Rough Riders would orchestrate much of the violence in the American expansion in Cuba, got busy.

He was named assistant secretary of the Navy in 1897 by McKinley, was elected governor of New York in 1898 and was elected McKinley’s vice president in the 1900 presidential contest. McKinley was assassinated a year later, and Roosevelt was sworn in. He would serve two terms.

Though Roosevelt’s administration waged no war, he was arguably one of the most prolific imperialists in the history of the United States. He continued and strengthened the policies of McKinley and McKinley’s predecessor, Grover Cleveland, both rabid imperialists, by creating what’s known as the “Roosevelt corollary to the Monroe Doctrine.” The policy states that, because America had among the strongest assets – read power – of all hemispheric powers, it therefore had a moral responsibility to intervene in economic policy in nearby Latin America. This would spark a century of exponentially more frequent coups d’état in the southerly parts of our side of the world to strengthen American capitalism and quell peoples’ movements.

Roosevelt’s “big stick.”

The Roosevelt administration tried to force Colombia to lease a part of the Panamanian isthmus to the United States for $10 million, plus an annual $250,000. The administration was planning to pay a contractor, the New Panama Canal Company, $40 million to build a massive canal through the isthmus for a new trade route. When the Colombian government refused, the administration financially and militarily augmented a Panamanian uprising that facilitated America’s business interests, and, over a decade, the Panama Canal became reality. It set the stage for Bush 41’s funtivities in Panama eight and a half decades later.

Privately hostile to workers’ movements, Roosevelt also personally sought to quell the socialist uprising during his administration at home. Zinn reports Roosevelt’s response to an article written in a radical newspaper by socialist leader Eugene Debs in which Debs called for a general strike if the government cracked down on the socialist movement: “Theodore Roosevelt, after reading this, sent a copy to his Attorney General … with a note: ‘Is it possible to proceed against Debs and the proprietor of this paper criminally?’”

Roosevelt didn’t have time for a crowd of 3,000 that represented the hundreds of thousands of child laborers exploited by corporations. When they marched on Oyster Bay to seek his advice on how to abolish child labor, he wouldn’t see them. He opposed a 1910 Supreme Court opinion that a “workmen’s compensation rule was unconstitutional because it deprived corporations of property without due process of law,” Zinn writes, because Roosevelt believed it would be a lightning rod for the Socialist Party. The Supreme Court had decades earlier spelled out the concept of corporate personhood under the Constitution’s Fourteenth Amendment.

Publicly, he worked to throw progressive movements a bone here and there, requiring food companies to accurately label their products and calling on Congress to limit the powers of large corporations. But this had an ancillary effect, much like all seeming progressive policy in the United States, which had before and has since worked in the interest of capital growth. As Zinn notes, Roosevelt made these slight concessions in private meetings with corporate representatives who promised him the veneer of progressivism in exchange for assurance that Roosevelt would guarantee the health of industry.

Progressive is how Roosevelt is seen, same as it ever was.

“The Environmental President”

THOUGH NOT QUITE so fondly regarded as either Roosevelt, H.W. is also seen in a warm light, but that’s no accomplishment when the historical lens is muddied by the leadership of his son, one of the worst presidents in history.

As president, 41 approved policy that, on the surface, looks progressive. He signed the Clean Air Act in 1990, and used it to coin himself “the environmental president.” He promised no new taxes, inviting us to read his lips. His military ventures, most notably the Gulf War, are seen as smashing successes, as liberations of countries in need. Accordingly, he relished a record approval rating between 89 and 91 percent in 1991.

But, as with Roosevelt, when you start picking apart the details – and here it’s certainly a little easier – the picture is not so rosy. His biggest environmental initiative, the Clean Air Act, was a lukewarm solution to a stunningly difficult problem (especially in terms of politics), and in subsequent years Congress gutted it financially and the administration withdrew support.

Zinn writes:

… two years after it was passed, it was seriously weakened by a new rule of the Environmental Protection Agency that allowed manufacturers to increase by 245 tons a year hazardous pollutants in the atmosphere.

Furthermore, little money was allocated for enforcement. Contaminated drinking water had caused over 100,000 illnesses between 1971 and 1985, according to an EPA report. But in Bush’s first year in office, while the EPA received 80,000 complaints of contaminated drinking water, only one in a hundred was investigated. And in 1991 and 1992, according to a private environmental group, the Natural Resources Defense Council, there were some 250,000 violations of the Safe Drinking Water Act (which had been passed during the Nixon administration).

Shortly after Bush took office, a government scientist prepared testimony for a Congressional committee on the dangerous effect of industrial uses of coal and other fossil fuel in contributing to “global warming,” a depletion of the earth’s protective ozone layer. The White House changed the testimony over the scientist’s objections, to minimize the danger (Boston Globe, Oct. 29, 1990). Again, business worries about regulation seemed to override the safety of the public.

At international conferences to deal with the perils of global warming, the European Community and Japan proposed specific levels and timetables for carbon dioxide emissions, in which the United States was the leading culprit. But, as the New York Times reported in the summer of 1991, “the Bush administration fears that … it would hurt the nation’s economy in the short term for no demonstrable long-term climatic benefit.” Scientific opinion was quite clear on the long-term benefit, but this was not as important as “the economy” – that is, the needs of corporations.

Sound familiar? H.W.’s entire administration, despite the forgetfulness of hindsight – it’s not 20/20, it appears – smacks of neoconservatism.

Bush enshrined the main policy tenet of neoconservatism in his speech at the 1988 Republican National Convention when he made a statement that played a big role in his losing a second term to Slick Willy four years later: “Read my lips: no new taxes.” Taken by itself, this little gem promised one dangerous third of the neoconservative agenda. The other two-thirds – keeping existing effective tax rates static or shrinking them and rescinding the government’s essential responsibility to regulate businesses – were promised in other parts of the same speech. The address must have done his boss, the Gipper, proud. But, much like his predecessor, who had claimed office twice by promising to deregulate and detax, Bush felt the embarrassment such unyielding rhetoric always invites. Facing a Republican Congressional base recalcitrant over Bush’s tacit support of a mixed approach of cutting spending but raising taxes to pay for Reagan’s $220 billion deficit, the president was forced to sign a bill crafted and passed by the Democratic majority of both chambers that focused more heavily on tax increases.

The Bible is full of truisms, and sayeth, “Pride goeth before a fall” – Proverbs 16:18. Clinton, the Great Talker, beat Bush out of a second term, and everyone blamed 41’s RNC lips.

Bush signed a ceremonial version of the North American Free Trade Agreement, which facilitated what politicians, including his sweet-talking successor, hailed as a “liberalization” of markets. The law, which dissolved a number of tariffs and trade barriers between the three northernmost North American countries, has allowed large corporations to amass ever larger fortunes, a typical pitfall of regional trade agreements. That expansion was predictably attended by exploitation of the poverty endemic to a developing country like Mexico, contributing to worsening squalor.

But his administration’s biggest economic eulogy would write itself militarily. It was his martial guard, much of which he picked up from the Reagan administration (and in turn bequeathed to his son), that began to formulate and codify the notion of a perpetual war economy. In truth, Bush’s military ventures were either self-serving or foolish or both, and always catastrophic. In December 1989, the same year Bush took office, Bush’s Defense Secretary, Yertle the Turtle, noticed a game brewing down south in America’s imperial-corporate corridor, Panama. According to Cheney’s http://www.defense.gov profile:

Panama, controlled by General Manuel Antonio Noriega, the head of the country’s military, against whom a U.S. grand jury had entered an indictment for drug trafficking in February 1988, held Cheney’s attention almost from the time he took office. Using economic sanctions and political pressure, the United States mounted a campaign to drive Noriega from power. In May 1989 after Guillermo Endara had been duly elected president of Panama, Noriega nullified the election outcome, incurring intensified U.S. pressure on him. In October Noriega succeeded in quelling a military coup, but in December, after his defense forces shot a U.S. serviceman, 24,000 U.S. troops invaded Panama. Within a few days they achieved control and Endara assumed the presidency. U.S. forces arrested Noriega and flew him to Miami where he was held until his trial, which led to his conviction and imprisonment on racketeering and drug trafficking charges in April 1992.

All true, but the Establishment writer who penned this article neglected to note some important context. According to this quick Guardian rundown, in the years leading to his ouster, Noriega had served the U.S. government well by facilitating its war against the socialist Sandinista government in Nicaragua.

Why, after all, wouldn’t we fight against socialism in our backyard? The Sandinistas established high literacy rates, universal health care and gender equality. So Reagan sold arms to Iran against America’s own policy and used the revenue to fund the Contras’ kidnap, rape, mutilation and murder of Nicaraguan civilians to spread capitalism. Human Rights Watch reported: “This is an important change from a human rights perspective, because the contras were major and systematic violators of the most basic standards of the laws of armed conflict, including by launching indiscriminate attacks on civilians, selectively murdering non-combatants, and mistreating prisoners.” But, hey, Reagan was an affable guy, so what the hell? That, though, is a deep rabbit-hole.

The point is, we the taxpayers paid Noriega through the CIA because his military junta allowed the U.S. to gather intelligence on the Nicaraguan government from outposts on Panamanian soil. But when Noriega became too dangerous a political liability for even a United States Republican to support, the Bush administration came up with a number of reasons to suddenly halt support and CIA payments for Noriega. The stated reasons were that Noriega’s military government threatened the lives of the 35,000 American citizens living in Panama; that Noriega somehow posed a threat to the Carter-Torrijos Treaty, which would give back the Panama Canal to Panama in 1999; that a cessation of support would stanch the flow of narcotics through Panama. So Bush invaded Panama to extract the at-large Noriega and to install Guillermo Endara – who’d run unsuccessfully against Noriega in the previous election – a leader we considered friendlier to American imperialist interests. During the month-long occupation in December 1989, 24 U.S. soldiers died, about 200 Panamanian soldiers lost their lives and an intensely argued-over number of Panamanian citizens were killed. The United States told us everything was OK because only a few hundred Panamanians died in the name of American fascism. Noriega’s associates claimed the number was much higher. Noriega was imprisoned in the U.S. and did nearly two decades of hard time in Florida. He has since been extradited around the globe, to France for money laundering and then to Panama for murder. According to the State Department, “Panama remains a transshipment crossroads for illicit trafficking due to its geographic location and the presence of the canal.” The people who died in the conflict, which the Bush administration called “Operation Just Cause,” are still dead.

Operation Just Cause was a simple jaunt into the by-then familiar territory of the Browner parts of the Western Hemisphere, where America had become very comfortable installing its own dictators and regressive capitalistic policies. (I’ll not go into the region’s entire history here, but I’ll suggest Naomi Klein’s “The Shock Doctrine.” It is fantastic journalism.)

The American intervention in the Middle East was, effectively if not intentionally, more ambitious. The Gulf War, known in Bush White House parlance as “Operation Desert Storm,” is seen in the nation’s classrooms through the lens of Official Narrative as the epitome of just intervention.

The Bush administration started marketing intervention in Iraq mainly as a necessary exercise of America’s moral authority in mitigating the human rights abuses perpetrated on Kurds and Iranians by Saddam Hussein’s military. Under that banner, the same one the next Bush would fly for Operation Iraqi Freedom, Western forces flew their terrifying planes and drove their mighty tanks in what became an iconic flexing of military muscle into the Iraqi desert, fighting, firing and bombing their way to Baghdad’s doorstep, but no farther.

In the end, it was simply a show of power. Why? Zinn writes:

Although in the course of the war Saddam Hussein has been depicted by U.S. officials and the press as another Hitler, the war ended short of a march into Baghdad, leaving Hussein in power. It seemed that the United States had wanted to weaken him, but not to eliminate him, in order to keep him as a balance against Iran. In the years before the Gulf War, the United States had sold arms to both Iran and Iraq, at different times favoring one or the other as part of the traditional “balance of power” strategy.

Zinn and a number of other journalists and historians also write that U.S. involvement in the Gulf conflict was intended to help secure Bush a second term. But those two benefits were accompanied by a third, which represented what was truly at stake for American corporatism: Saudi oil. Tricky Dick Jr. met several times with King Fahd to assure him that, in exchange for permission to choreograph Operation Desert Storm from Saudi soil, Saddam would be no more. Of course, though Hussein would be weakened, his grip was not wrested loose until the younger Bush’s presidency. (Cheney, who for a time took the helm of one of the biggest oilfield services companies in the world, Halliburton, has since become a frequent houseguest in the Saudi kingdom – during his vice presidency, as well as more lately to discuss items that are unclear – you know, probably just catching up with old friends.)

But the bigger meaning of the Gulf War for many Iraqis, who were hardly liberated, was terror. According to Bloomberg Businessweek:

Although Cheney said shortly after the 1991 Gulf War that “we have no way of knowing precisely how many casualties occurred” during the fighting “and may never know,” Daponte had estimated otherwise: 13,000 civilians were killed directly by American and allied forces, and about 70,000 civilians died subsequently from war-related damage to medical facilities and supplies, the electric power grid, and the water system, she calculated.

In all, 40,000 Iraqi soldiers were killed in the conflict, she concluded, putting total Iraqi losses from the war and its aftermath at 158,000, including 86,194 men, 39,612 women, and 32,195 children.

The carnage was especially palpable in Fallujah, where military personnel I know, nearly a decade and a half later, yelled racial slurs against Iraqis as they loaded bombs bound for the city in 2004. As Jeremy Scahill writes in “Blackwater – The Rise of the World’s Most Powerful Mercenary Army”:

During the 1991 Gulf War, Fallujah was the site of one of the single greatest massacres attributed to “errant” bombs during a war that was painted as the dawn of the age of “smart” weaponry. Shortly after 3:00 p.m. on the afternoon of February 13, 1991, allied war planes thundered over the city, launching missiles at the massive steel bridge crossing the Euphrates River and connecting Fallujah to the main road in Baghdad. Having failed to bring the bridge down, the planes returned to Fallujah an hour later. “I saw eight planes,” recalled an eyewitness. “Six of them were circling as if they were covering. The other two carried out the attack.” British Tornado warplanes fired off several of the much-vaunted laser-guided “precision” missiles at the bridge. But at least three missed their supposed target, and one landed in a residential area some eight hundred yards from the bridge, smashing into a crowded apartment complex and slicing through a packed marketplace. In the end, local hospital officials said more than 130 people were killed that day and some 80 others were wounded. Many of the victims were children. An allied commander, Capt. David Henderson, said the planes’ laser system had malfunctioned. “As far as we were concerned, the bridge was a legitimate military target,” Henderson told reporters. “Unfortunately, it looks as though, despite our best efforts, bombs did land in the town.” He and other officials accused the Iraqi government of publicizing the “errant” bomb as part of a propaganda war, saying, “We should also remember the atrocities committed by Iraq against Iran with chemical warfare and against [its] own countrymen, the Kurds.” As rescue workers and survivors dug through the rubble of the apartment complex and neighboring shops, one Fallujan shouted at reporters, “Look what Bush did! For him Kuwait starts here.”

Whether or not it was an “errant” bomb, for the decade that followed the attack, it was remembered in Iraq as a massacre and would shape the way Fallujans later viewed the invading U.S. forces under the command of yet another President Bush.

All that is, of course, not to mention the horrific Gulf War Syndrome, a nasty cocktail of symptoms like chronic fatigue, diarrhea and joint pain suffered by more than a third of the American veterans of that conflict. No one knows what caused it, but evidence points to the U.S. military’s use of chemicals in warfare and its ancillary activities.

The Run of the Mill

I REALLY COULD go on about how horrible Bush senior was, but it’s depressing, so I’ll get to the point: Bush, like Nixon and Roosevelt, was typical. Before Bush, Dwight Eisenhower oversaw the respective 1953 and ‘54 CIA coups against Iranian Prime Minister Mohammad Mosaddegh, who’d pissed off the Anglo-Iranian Oil Company – later British Petroleum – when he tried to nationalize Iranian oil, and Guatemalan President Jacobo Arbenz Guzman, who’d made an enemy of the United Fruit Company by challenging its agricultural monopoly in the country. These coups, far from the first American overthrows of foreign governments, were special because they helped establish the CIA as the go-to agency for installing Western democracies – Western trade sanctuaries, if we’re being honest – in foreign countries. Every president since has had a lot of fun with it.

The United States and its corporate fiends have invaded, staged military coups, financed the restructuring of leftist political and economic infrastructure by influencing academia, secretly installed dictators, sent conservative economic advisory groups or otherwise intervened in all but three of the 20 Latin American countries not owned by France: Venezuela, Paraguay and Colombia. The last was the only Latin American country to support the second Bush administration’s war on terror.

Since the turn of the twentieth century, the United States has taken the Roosevelt Corollary and applied it worldwide. Just since 1945, the end of World War II – to say nothing of the other types of meddling we love – the United States has bombed 17 countries outside our hemisphere. It wouldn’t be difficult to imagine that America, in its 237-year existence, has tried in some way to implement its corporatist itinerary in all 194 countries the State Department recognizes, as well as in the territories it claimed in its westward invasion of North America. We’re certainly not alone here; Great Britain is known to have invaded nine of every 10 countries in the world.

Don’t look to the liberal wing for progress. After H.W., Clinton helped implement a raft of oppressive policies that, intentionally or not, appeased his corporate underwriters like the Martin Marietta Corporation. NAFTA ushered in pain for poor people and gain for the rich, widening the wealth gap. It also made it cheap and easy for multinationals to buy cheaper labor in other countries and bring the goods home. His vast derivatives deregulation, the Commodity Futures Modernization Act, signed weeks before he left office, helped Wall Street bankrupt America – a disaster for which liberals, sleight-of-tongue masters that they are, disingenuously blamed Bush. The legislation ensured Clinton fancy post-presidential digs doing what he did best: talking to cooing crowds from behind a lectern. He’s since advocated, at elegant confabs, a lower corporate tax rate in America.

W. – well, we all know about W., but Jonathan Chait of New York Magazine has an apt digest of just how much he sucked. Rex Nutting, of The Wall Street Journal’s sister site MarketWatch, puts it this way:

Bush had all the luck of Jimmy Carter, the attention to detail of Ronald Reagan, the adaptability of Lyndon Johnson, the abiding respect for the Constitution of Richard Nixon, the humility of Teddy Roosevelt, the rhetorical skills of Calvin Coolidge, the fiscal restraint of Franklin Roosevelt, the cronyism of Warren Harding, and the overreaching idealism of Woodrow Wilson.

Of course, the newest denizen of the white monstrosity at the exchange of Money and Power is no better. The wealth gap is huge, bigger than at any time under Bush. Obama has, in his latest grand statement on the matter, sworn fealty to those looking to boost their corporate profits by exacerbating climate change – the proliferation of natural gas – and failed to mention the real clincher: that we must, must, make some concessions in our standard of living if we are to solve the climate crisis. Yes, it’s a signature issue, and yes, he’s botching it. His justice department is wasting vast amounts of public money in going rabidly after drugs – a nice way of saying it’s going rabidly after young black men, which Obama himself, as he notes, was 35 years ago – while ignoring the systemic causes of America’s drug problems. And of course, Obama has expanded nearly every Bush national security program that he promised to scale back during his campaigns. Obama’s military-industrial complex is doing just fine, thank you.

A short jaunt through the test-beds of economic reductionism would have you believe otherwise, but capitalism in America is not on the downswing. It’s being codified in our nation’s infrastructure by the very corporations it benefits, written parasitically into the American government until we can’t tell where one begins and the other ends. Republicans might sound a little crazier, but it’s happening on both ends of the political spectrum.

Capitalism: The Auction-Block of Government

The first Bush may have sat a little more to the right of most Democrats, but not by much and only in symbolic ways. Clinton expanded his policies. Bush Jr. combined those policies with Dick Cheney’s dream of privatizing services that are, by their very essence, meant to be administered publicly; they’re too important, as journalist Naomi Klein says, to leave to a marketplace whose loyalties are fickle in everything but commerce. Because when you take away the notion of such services as fundamentally public enterprises, their administration is no longer accountable to the public it affects. And corporations are just doing what corporations do, so they’re immune to criticism too.

Scahill writes of former spy Robert Richer, later CEO of Total Intelligence, one of the many private security contractors made so successful by Bush’s war on terror:

In 2007 Richer told [The Washington] Post that now that he is in the private sector, foreign military officials and others are more willing to give him information than they were when he was with the CIA. He recalled a conversation with a general from a foreign military during which Richer was surprised at the potentially “classified” information the general revealed. When Richer asked why the general was giving him the information, he said the general responded, “If I tell it to an embassy official I’ve created espionage. You’re a business partner.”

Privatization of government takes away the public element, and the public is no longer a shareholder. He’s simply an onlooker, blind until he’s been fleeced and the thief has made off through a labyrinth of corporate alleyways with his booty of public treasure. And maybe that’s the point. Maybe the Cheneys and Bushes and Rumsfelds and Bremers, the Obamas and Clintons and Pelosis and Reids are needy in their own way: they need the needy to disappear, sequestered to scrap squabbles among themselves, biting at each other’s throats, as the diminishing middle class does at the jugular of the burgeoning poor, while the bourgeoisie leans back in a lawn chair next to a temperature-regulated pool with an umbrella-capped cocktail, and watches.

But then again, we’re not just bystanders. Bystanders have a responsibility to intervene, which we’ve abdicated to many of the very structures – government-funded watchdog groups and corporate media – that dangle the scraps. We squander rare opportunities to right ourselves when independent groups of intellectuals warn something’s really wrong. We disdain intellectualism, we bristle at facts, we kill the messenger. That revulsion informs our worldview. We trust gut feelings that the earth self-heals from the worst wounds, so we can leave the light on. We let our most venal, reflexive proclivities get the best of us, and always when it’s most important that we don’t.

Historically, Americans have supported efforts to advance Western imperialism. By propping up the power structures, we cling desperately, with brittle fingernails, to the idea that if we scrape by, take our lumps until such time as we can win the Powerball or patent one of those ideas rolling about our brains, we’ll not have to worry anymore. Don’t fret, Margaret, I’ll be coming into some money soon. It’s fed by America’s profound flaw, the alcoholic’s penchant to “never trust a man who doesn’t drink.” David Simon called it the “callow insecurity that accompanies any cry of ‘America, right or wrong’ or ‘America, love it or leave it.’”

In his sweeping novel “East of Eden,” John Steinbeck wrote of how, during World War I, residents of his home region, the Salinas Valley, treated an innocent, sad old German man who’d migrated to the valley and had a thick accent, among others:

One Saturday night, they collected in a bar and marched in a column of fours out Central Avenue, saying, “Hup! Hup!” in unison. They tore down Mr. Fenchel’s white picket fence and burned the front of his house. No Kaiser-loving son of a bitch was going to get away with it with us. And then Salinas could hold up its head with San Jose.

Of course that made Watsonville get busy. They tarred and feathered a Pole they thought was a German. He had an accent.

We of Salinas did all of the things that are inevitably done in a war, and we thought the inevitable thoughts. We screamed over good rumors and died of panic at bad news. Everybody had a secret he had to spread obliquely to keep its identity as a secret. Our pattern of life changed in the usual manner. Wages and prices went up. A whisper of shortage caused us to buy and store food. Nice quiet ladies clawed one another over a can of tomatoes.

This is just how we’ve collectively acted toward brown people since 9/11. It’s how we act toward everyone who looks similar to a person who’s done something wrong. It’s how we act toward people who look like those our betters have told us have done something wrong, even when they haven’t.

All the leaders named above, every president not named, and, with a few notable exceptions, the vast majority of their cabinet members have always worked in the interest of parasitic corporations or an arcane bourgeoisie before they’ve worked in the interest of the body politic that voted them into office – and they’re praised for it. It’s written dramatically, and with a certain degree of permanence, into revisionist grade- and high-school curricula.

Howard Zinn writes in “A People’s History of the United States”:

The treatment of heroes (Columbus) and their victims (the Arawaks) – the quiet acceptance of conquest and murder in the name of progress – is only one aspect of a certain approach to history, in which the past is told from the point of view of governments, conquerors, diplomats, leaders. It is as if they, like Columbus, deserve universal acceptance, as if they – the Founding Fathers, Jackson, Lincoln, Wilson, Roosevelt, Kennedy, the leading members of Congress, the famous Justices of the Supreme Court – represent the nation as a whole. The pretense is that there really is such a thing as “the United States,” subject to occasional conflicts and quarrels, but fundamentally a community of people with common interests. It is as if there really is a “national interest” represented in the Constitution, in territorial expansion, in the laws passed by Congress, the decisions of the courts, the development of capitalism, the culture of education and the mass media.

You’d almost prefer apathy in public school education, if only so the lies would go in one ear and out the other, wasted on the winds; death would come by pure complacency and not the worse thing we have.

All the problems engendered by this modus revolve around one concept: our addiction to capitalism. Every significant military- and foreign-policy action taken by the United States – with very few exceptions, like the American signing of the Geneva Conventions or our participation, hesitant and ceremonial though it always is, in intergovernmental firesides on how to deal with climate change – is in the interest of expanding American market share in the world economy. It’s always about money. We measure American vitality always against the size of our marketplace with manic, nebulous metrics like gross domestic product, jobless claims and fluctuations in the economic confidence index. We measure these instead of the vibrancy of our culture or the conditions under which we are truly happy or the quality of our compassion toward one another. Money has come to embody our culture, our happiness, our compassion – there is nothing outside of it. Otherwise happy couples divorce over it, and the prescription always comes down to personal frugality: Do well by your money, and you’ll be happy; do poorly by it, and no matter how well you handle other aspects of life, you’ll be screwed. “Money makes the world go ‘round,” they tell us, and we smile and unquestioningly nod. The more fundamental question of whether we should have a social system with money as its foundation – especially the one of which Americans have proven themselves to be incredibly poor, egocentric and greedy stewards – is never asked, never thought of.

This frenzied, rabid struggle for money above all things necessitates the mistreatment of others. Take climate change. The conservative establishment, in its never-ending lip service to deregulation, must deny the fact of human-caused temperature shifts. If they don’t, they’ll be forced to admit the problem requires regulation, which threatens not only the profits of their biggest investors but their religious and social views. In a recent interview with PBS’s Bill Moyers, Klein addressed the recent drop in public belief that global warming exists – and that, if it does, humans cause it:

Climate change is, I would argue, the greatest single free-market failure. This is what happens when you don’t regulate corporations and you allow them to treat the atmosphere as an open sewer. So it isn’t just, OK the fossil fuel companies want to protect their profits; it’s that this science threatens a worldview. And when you dig deeper, when you drill down into those statistics about the drop in belief in climate change, what you see is that Democrats still believe in climate change in the 70th percentile. That whole drop-off in belief has happened on the right side of the political spectrum. … People who have very strong conservative political beliefs cannot deal with this science because it threatens everything else they believe.

To keep the status quo, we’ll drown nations. Climate is only the most important thing, but we have backups. We’ll also go after national heroes, telling them they’ve aided and abetted the enemy, for whatever that’s worth. We wield the traitor brush as if it’s some original, profound and horrifying observation that, holy shit, terrorists are able to buy technology that connects them to the leaked information. So I’d love to know precisely why, since Bradley Manning and Edward Snowden and Julian Assange represent such a boon for terrorists, prosecutors have not cited any specific instance in which terrorists used the information the leakers disclosed to conduct violence. The truth is, establishment forces must do everything they can to degrade efforts at transparency because those efforts portend public dissent against the war machine and its surveillance state.

Even so, we turn blind eyes, toe the party line and vote for whom the media expects us to. But the culprit of our social ills is quite clear when the backrooms in which our laws are written are monitored by the most courageous journalists.

Why Socialism?

The next leap is to identify a way to neutralize the problem. A number of philosophers and writers look to the historical precedent of nations that have been successful – at least to the point America or another Western empire has intervened – in their experiments with egalitarianism.

These people are quickly written off by conservative intellectuals in predictable ways.

Read Steven Plaut, a professor of economics at Israel’s University of Haifa. He wrote in 2011 in FrontPage Mag – whose masthead motto ominously warns, “Inside Every Liberal Is A Totalitarian Screaming to Get Out” – about the Scandinavian countries, which liberals uphold as examples of good socialism. He says Scandinavia has low rates of poverty not because it is relatively socialistic, but because Scandinavians, almost as a rule, are stalwart partisans of parsimony who put their fine backs and strong shoulders to good work, producing capital:

The interesting question is whether the low poverty rates there are thanks to the economic system or thanks to Scandinavians being hard-working thrifty disciplined people.  That Scandinavians are hard-working is evident from the fact that in spite of enormous benefits in Sweden for the unemployed and for those who do not work, creating incentives to avoid work, Sweden has a labor force participation rate that is one of the highest in Europe.

One way to test our question is to examine Scandinavians who do not live in Scandinavia.  There is a large Scandinavian population that lives in the bad-old-selfish-materialist-capitalist United States.  Well, it turns out that Scandinavians living under its selfish capitalism also have remarkably low poverty rates.  Economists Geranda Notten and Chris de Neubourg have studied Scandinavians living in the US and in Sweden and compared their poverty rates.  They estimate the poverty rate for Scandinavians living in the United States as 6.7%, half that of the general U.S population.  Using measures and definitions of poverty like those used in the US, the same analysts calculate the poverty rate in Sweden using the American poverty threshold as an identical 6.7% (although it was 10% using an alternative measure).   So low poverty among Scandinavians seems to be because Scandinavians work, whether or not Scandinavian “socialism” can be said to work.

But Plaut’s thesis rings hollow – as do those that logically flow from it – when you work past the conceit in his theorem.

To say that poverty is low in Scandinavia simply because Scandinavians have an excellent work ethic – which I’m sure they do – is to say that America has a higher poverty rate because the people who fall into that category are moochers. But more than 90 percent of entitlement benefits meant for the poor in America go to the elderly, most of whom worked until late in life, to the disabled or to working families. An infinitesimal portion of benefits in America goes to people who drift on the waves of the social welfare system. Perhaps the Swedes work hard because they have productive, socially administered means of procuring decent employment. Contrast that with the vast – sometimes unquantifiable, because many of these budgets are “dark” – amount of American taxpayer money that goes toward subsidies for, say, the war machine, the oil industry or the health care apparatus.

Plaut examines a number of statistics about poverty among migrants – Scandinavians who travel to America and don’t fall into the poverty trap, while “moochers” to Scandinavia’s south travel to Scandinavia and still live in squalor. But he fails to notice the dynamic of white gentry that feeds poverty in racially diverse countries like the U.S. or most European countries. Scandinavia has an incredibly homogeneous population. According to the CIA, the resident population of Finland is 93.4 percent Finnish and 5.6 percent Swedish; Norway is 94.4 percent Norwegian; and, according to Eurostat, Sweden is 87.7 percent Swedish. In America, whites face far less hardship in finding jobs than their ethnic counterparts. Historically, American policy has also disproportionately favored whites in building a base for success – in acquiring a set of bootstraps by which to pull themselves up. For example, post-Civil War Scandinavian immigrants used the federal homestead acts, which gave portions of public land to private homesteaders but originally excluded blacks, to found their successful futures. It would be reductive to say Scandinavia is without race problems; indeed, it needs profound self-reflection to right many ethnic wrongs. But America has a far more racially divided history than Scandinavia, and if you’ve been reading the front pages, racism in America is not over by a long shot. Read in this light, Plaut’s column becomes a disingenuous denial of the existence of the American poverty trap, especially in ethnic communities.

Plaut cites a nationmaster.com digest of poverty statistics by country that focuses on the percentage of the population living below the poverty line, ranked from the highest poverty rates to the lowest. He notes that Switzerland, a capitalist paradise (which, incidentally, provides universal health care and requires many of its men to own a gun), comes in at 146 of 153, beating out all the Scandinavian countries. But he fails to note that Switzerland is beaten by three positions by notoriously socialist Ireland, whose government controls health care, education, banks and many businesses, and whose public spending policies are credited with a turnaround from its recent economic recession. The rest of the countries that best Switzerland are an amalgam of economies of different types: France, Austria, Malaysia, Lithuania, China and Taiwan. It seems Switzerland is, in a way, its own Plaut anathema. Maybe he needs a different yardstick. Indeed, let’s talk other metrics of success, like free access to good health care and education, excellent job benefits like mandatory maternity leave, and 100 percent literacy rates – all proven correlates of better economic vitality. Maybe those, when juxtaposed against the abysmal records of the United States, would be convincing.

Twice in the column, Plaut writes about real evidence of social dynamics in real places as if they existed in a hypothetical universe, a typical obfuscation tactic for those on the right. He notes the inconsistent protocols across different economies for measuring poverty – which could equally suggest that, measured by different standards, Scandinavia’s poverty rate is higher than it seems through a more Western lens. He writes:

The definition of ‘poverty’ and its measurement are both highly problematic, and both vary dramatically, making inter-country comparisons difficult. In all countries there are serious problems with the measures. Wealthy people are sometimes counted as part of the population below the poverty line, as long as their current income happens to be low.  Examples are retired people and students.  The poverty statistics are based on reported incomes, meaning that lots of people living high on the hog are counted as poor because they do not report their income at all to the tax authorities, earning income from the “shadow economy.”  Poverty is generally measured by income, not consumption.  It is often measured as a percent of median income, not by material hardship, or by the rather silly “Gini coefficient.”  If every single person discovered a petroleum well in his yard, poverty rates would not change much.

OK, Justice Scalia, poverty statistics are flawed – just like every other kind. That’s why studies based on data and statistics carry error margins. But by Plaut’s logic, Scandinavia might also have a smaller poverty rate, relative to Western countries, than reported. If we’re calling into question the statistical accuracy of the evidence upon which we base our suppositions, why bother to draw a conclusion in the first place?

Here, too, we find the same problem: Scandinavian countries, while far more socially progressive than many other Western democracies, are, as Plaut points out, not at all entirely socialistic. He seems to imply, by citing Scandinavians critical of socialism, that if corporations in Scandinavia were on an even longer leash, poverty might be completely eliminated. Again, we must lean on the if-then-might argument to cover our absence of science – an argument in which the opposite conclusion is equally valid.

All this leads to a magnificent crescendo in the interest of discrediting socialism, and it’s a common conservative talking point on Scandinavia: “The conclusion can only be one thing.  The low poverty rate among Scandinavians in Scandinavian countries is thanks to the fact that Scandinavians work.  It is NOT because socialism works!”

Plaut sets Scandinavia apart as an aberration of the philosophy: Scandinavians are nothing like those goddamn Browns who tried to run their own countries to America’s south and to Asia’s west. They don’t, somehow, look to mooch off the waning faction of their society that produces the wealth.

The conservatives are perplexed; how does this work? The conclusion can only be one thing …

And, boom and bust, on we go.

This hypothesis ignores that Venezuela’s Chavez, Chile’s Allende, Iran’s Mosaddegh and so on brought their people up from poverty, establishing vibrant local economies at the expense of multinational corporations. They placed ownership of their countries back in the hands of the people. America neutralized or ostracized them. Where we could, we showed up with squadrons of militants to hack the head off the socialist Hydra, and conservative philosophers cauterized the wound. We never let true socialism work, and where we couldn’t help it, we isolated it like a leper. So, Dr. Plaut, we can’t say it doesn’t work. And by any remote indication, it would work far less destructively than the corporatism upon which we base our lives.

But Plaut’s only the mainstream; the tributaries of capitalism touch all the backwoods indigenous who like to think themselves populist while railing against Occupy Wall Street and the “orgies in the street” Glenn Beck’s henchmen swear they saw. The arguments against socialism from conservatives of every type, even the humanitarians, abound.

Men and women; blacks, Hispanics, Asians and whites; LGBT people; old and young alike should have the freedom to work hard in the craft of their choice while making meaningful contributions to society. That’s the libertarian argument for capitalism over socialism. We should rely solely on charity and church to mitigate poverty or soothe mental illness; they’ll get it done faster and at half the cost. Libertarians say the simple liberalization of markets – the abolition of government, if we’re blunt about it – would hand power back to the people and eliminate the problems our form of government poses. It’s true that pure liberalization of the marketplace would jettison the festering sinkhole of corporate welfare, but that’s as far as it would go. We’d hope some group of wealthy eccentrics would be crazy enough to put up the billions to maintain and revise the transportation system. We’d cross our fingers and wait for publicly funded projects like the Internet to be realized. We’d wonder: if only there were a mechanism for corporate oversight, companies might face consequences when they dump cyanide in water supplies, enslave children under the pretense of lifting them from poverty, or indiscriminately harvest whatever they feel necessary to bolster the bottom line. I’m not writing in hypotheticals or hyperbole. Multinationals do this as second nature in other countries, where every day is a tax holiday, every worker is low-wage and every river is a sewer.

Still others say socialism is the problem, that America’s pendulum is stuck left. This argument, often made in depressingly vapid earnest, rests on a fundamental misunderstanding of the basic definitions of economic philosophies. Consider the prose of American Thinker’s Peter Ferrara, who founds his entire hypothesis on, and frames all his dubious data in, the notion that Barry is a Marxist, as Red as they come. Perhaps Ferrara should buy a dictionary and reference it before going to work.

I’ll use the same example to which most neoconservatives first turn when they call Obama a socialist: the Affordable Care Act, or “Obamacare,” a law originally dreamed up by a consortium of conservative economists. Former health care executive J.D. Kleinke explains in The New York Times:

The president’s program extends the current health care system — mostly employer-based coverage, administered by commercial health insurers, with care delivered by fee-for-service doctors and hospitals – by removing the biggest obstacles to that system’s functioning like a competitive marketplace.

Chief among these obstacles are market limitations imposed by the problematic nature of health insurance, which requires that younger, healthier people subsidize older, sicker ones. Because such participation is often expensive and always voluntary, millions have simply opted out, a risky bet emboldened by the 24/7 presence of the heavily subsidized emergency room down the street. The health care law forcibly repatriates these gamblers, along with those who cannot afford to participate in a market that ultimately cross-subsidizes their medical misfortunes anyway, when they get sick and show up in that E.R. And it outlaws discrimination against those who want to participate but cannot because of their medical histories. Put aside the considerable legislative detritus of the act, and its aim is clear: to rationalize a dysfunctional health insurance marketplace.

This explains why the health insurance industry has been quietly supporting the plan all along. It levels the playing field, expanding the potential market by tens of millions of new customers. Hardly a government takeover of health care. Basically, the law simply ensures the health insurance industry a clientele by requiring Americans to buy its product, with the compromise that health insurers can’t refuse anyone – all without addressing the real problem in American health care: rampant price gouging in health care administration. Corporate welfare, fascism Western style, at its finest.

The conservatives use climate change as another non-starter for progress. Obama’s vaunted speech on the world’s biggest problem was predictably maligned by the crazies not as too timid but as too aggressive, even though the president, as he did with health care, basically promised the fossil fuel industry a windfall in moving forward.

Under the guise of progress, such is Obama Policy.

This is the opposite of socialism, which Encyclopedia Britannica defines as “social and economic doctrine that calls for public rather than private ownership or control of property and natural resources.” Meaning not that everyone would own each other’s furniture, but that everyone would own a say in the way the most important resources – water and food and energy, education and information and health care – are gathered and distributed.

Of course, like any ideology, the idea of socialism deserves criticism, and it can’t be realized without generous flexibility. But the point isn’t to nitpick the particulars of history, in which, yes, many evil men have used the pretext of socialism to establish one-man rule. It’s to bring the definition of society back to its roots – to redistribute ownership and stewardship of the public domain from the hands of a wealthy few back to the public. This requires reflection on, and acknowledgement of, socialism’s flaws.

Responsible socialism would not aim to completely eliminate poverty and all other social ills, as Plaut says its proponents claim. The best socialists claim no such thing. Just as responsible gun control activists say their proposals would reduce gun violence – as such laws have in many other countries – rather than eliminate it, socialists would simply reduce poverty and other dangerous social paradigms while implementing safety mechanisms for those who fall through the cracks.

No sane person argues for a rigid economy that is planned down to the cent in every respect. Naomi Klein notes that, though she indicts free-market philosophy with her journalism, she doesn’t think a fundamentalist socialist economy would work:

I think that mixed economies work better than a fundamentalist market system. And I’m not a utopian, and I don’t believe that it’s perfect, and there’s still gonna be violence, there’s still gonna be repression, there’s still gonna be poor people. But by acceptable U.N. measures of a standard of living, what we see is countries that have a mixed economy, i.e. have markets, people are able to go shopping … but also have social protections that identify areas that are too important to leave to the market – whether it’s education, health care – the minimal standard of life that everybody must have.

Under socialism, where we the people own the information, where we own the government, Nixon would have been rightly castigated and jailed for withholding the conversations on the Camp David tapes from the public. No salute at his funeral barked from steel and gunpowder would have been tolerated. The backroom deal Obama struck with the health insurance industry’s top lobbyist to require Americans to buy insurance in exchange for its support for the Affordable Care Act would have been subject to scrutiny and criticism beforehand, and a decision made among the masses. Ed Snowden and Bradley Manning, possibly the two most important public servants of our time, would not be in their respective hiding place and prison cell for exposing the corruption they did.

The world is not perfect; it never will be. That’s why Zinn places an asterisk on straight utopianism. He makes perhaps the most eloquent argument for an egalitarian society, a redistribution of true wealth – not of those small encoded green sheets of paper to which most Americans are so enslaved they can’t imagine a United States without them, but of the power with which to oversee our own policymaking, absent the festering corporate influence that has diseased our polity:

With the Establishment’s inability either to solve severe economic problems at home or manufacture abroad a safety valve for domestic discontent, Americans might be ready to demand not just more tinkering, more reform laws, another reshuffling of the same deck, another New Deal, but radical change. Let us be utopian for a moment so that when we get realistic again it is not that “realism” so useful to the Establishment in its discouragement of action, that “realism” anchored to a certain kind of history empty of surprise. Let us imagine what radical change would require of us all.

The society’s levers of power would have to be taken away from those whose drives have led to the present state – the giant corporations, the military, and their politician collaborators. We would need – by a coordinated effort of local groups all over the country – to reconstruct the economy for both efficiency and justice, producing in a cooperative way what people need most. We would start in our neighborhoods, our cities, our workplaces. Work of some kind would be needed by everyone, including people now kept out of the work force – children, old people, “handicapped” people. Society could use the enormous energy now idle, the skills and talents now unused. Everyone could share the routine but necessary jobs for a few hours a day, and leave most of the time free for enjoyment, creativity, labors of love, and yet produce enough for an equal and ample distribution of goods. Certain things would be abundant enough to be taken out of the money system and be available – free – to everyone: food, housing, health care, education, transportation.

Socialism pits compassion against the egoisms of capitalism that inform and perpetuate our status quo. If we could take a deep breath and think big toward our fellow man, then equality, literacy, free education, happiness, et al. could win out over hierarchy, corporate welfare, money, patriarchy.

Socialism, if we let it, would neutralize the mechanisms of manipulation and obfuscation by which we’ve accrued our distresses at the intersection of money and power, by entrusting the traffic direction of both vast avenues to the people.

It could provide an administrative infrastructure and an existential tolerance to relieve the problems of socioeconomic stratification our brothers and sisters face daily.

It would create an environment in which we could mobilize ourselves to solve the biggest challenges we face, most notably climate change.

Most importantly, it could pit our honesty against our addiction. Like the alcoholic who’s quit drinking, but is in a perpetual state of getting better, we could stop lying to ourselves.

But I’m getting ahead of myself. All this requires a structural overthrow. I hope I’m there to chronicle that battle.


What We Should Instead Celebrate this Fourth

I submitted an essay last July 4 to a Navy organization called Morale, Welfare and Recreation that organizes funtivities and other similar contests, under the bullshit prompt, “What does Independence Day mean to you?” My essay went like this:

Leftist sailors have much to be conflicted about. I’m one, and I’m here to explain:

Before I joined the Navy, I was conflicted about serving an organization that orchestrated campaigns with which I sometimes disagreed. Our war efforts seemed unwarranted and productive only for those who stood to enjoy financial gain from them. I was frustrated with how our leaders handled military business.

But, after I dithered, interjected and finally enlisted in the face of a fat, ugly personal finance disaster, I looked past the military’s surface-level iniquities and justified them as necessary. And I got over my hesitation.

Not because they’re just a small part of something that’s making my life stable again.

Here’s the bigger thing: even as I gather mental sticky notes adding to an already respectable collection of items I disagree with, I’m also inundated by a cascade of reasons America should appreciate its military.

There at the top of the list is that biggest idea, which says if America didn’t have its military, it wouldn’t have what makes it itself. Some people call it “freedom” or “independence,” but it’s headier than what a single word can describe.

Whatever it is, it ebbs and flows in our hearts and heads until, every July 4, we pour it all out and it culminates in a deafening crescendo of fireworks in night skies over separate American towns.

Separate, but united in the same black night.

We, as sailors, whether we’re liberal, conservative, Christian, or we worship the Flying Spaghetti Monster, take it more seriously than the average American.

We’re the ones who stood at parade rest in the sloppiest ranks of boot camp while officers invoked messages of patriotism to Lee Greenwood’s formerly cheesy-sounding “Proud To Be An American,” and we felt it for the first time. We felt that idea creeping from our stomachs up through our hearts and throats and out onto our skin in thick goose bumps.

Some people take it for granted. Just another day. They might even say: because Hedge is a liberal and he sometimes disagrees, he’s not allowed to be patriotic. But that’s the whole point. They’re free to say that because we’re here. We can agree to disagree, as the cliché goes.

Of course, it’s more profound than just a few instances when people disagreed with me and exercised their freedom of speech; that’s only a small example.

The idea that we’re free to view the world in the way that makes the most sense expands into a collective school of thought, until we realize we actually sync in a bigger way than the ways we fight. Thus, we become a fluid, formidable singularity. Discord, hesitation and insecurity cease to matter. We’re the same country again.

Before I joined, I was very conflicted. I still am.

But now, I know that’s what all those fireworks late on Independence Day represent to me: the American freedom to think, feel and speak out, which citizens of too many other countries can’t do.

Some of that’s true, I guess. It won me an iPad, from which I sometimes edit this blog. But something’s happened since. I’ve read quite a bit, and reshaped my worldview. It’s changed so much that I’m shocked at my own naivety just a year ago. “Formidable singularity?” Those words were written by a different person. What the hell was I talking about? Was I in a haze from swearing in just a few months prior? Was I subconsciously justifying my enlistment to flee the shame of having to do something that goes against what I know is right? Was I trying to impress the contest judges?

Whatever the answer, I’d like to point out this year that Independence Day is in fact simply a celebration of a gathering of a bunch of men who didn’t want to pay taxes and wanted to be allowed to keep fucking their slave girls. In their collusion, they would establish their own independent aristocracy in which the people would still founder. “We the People” were, more specifically, we the privileged white men with fancy hats and sprawling estates. Them we should lament.

The true heroes were the poor and the exploited who rose up in the face of the gentry. We should instead celebrate Tecumseh, who fought in the War of 1812 to stanch the inexorable white westward movement in the Ohio River Valley. We should pay tribute to the weary abolitionists President Lincoln was eventually forced to heed as the tide turned during the Civil War, more than 300 years after Spain outlawed slavery. We should remember Eugene Debs, who co-founded the Industrial Workers of the World.

Of course, as a nation, we’ll never do this, because as the fictional Jackie Cogan aptly noted, “America’s not a country; America’s a business.” It would make no sense for the institution to celebrate the very people’s movements aiming to destroy the structure. But as individuals, we still cling – some with more urgency than others – to that idea of a free, equal and decent society, and that we should celebrate.

The Information Age is Important. Let’s Use it Properly.

The Gold Room on the second floor of my schoolhouse is a dreaded place. It has two uses: 1) a meeting place for duty musters. If you’re in the military, you don’t like duty days as a matter of course because on duty days, which at the Dam Neck naval annex are set on a rotational schedule of 11 separate “duty sections” made up of students and staff from all parts of the command, you don’t get to drink and are required to stand watch for a few hours, among several other trivial intrusions on your freedom. And, 2) it houses a lectern, a projector for Powerpoint presentations and auditorium-style seating for Navy and other military training and for class graduations. Military training is intended to bring a number of issues – drunk driving, military history, racism, etc. – to the fore. But at least in some cases it is largely ineffectual, as we’ve seen in recent weeks with parts of the media focused on a drastic increase in the incidence of sexual assault. The military of the past decade has promised to crack down on sexual assault through awareness training and zero tolerance. Secretary of Defense Chuck Hagel has tried to buck proposals to take decision-making power from the hands of commanders, who have overturned convictions of sexual assault for members of their chain of command. Though the trend has been worst in branches other than the Navy, I see at least one cause for it here, too: Sailors just don’t seem to care. They hate the Gold Room; they hate training.

Two of the four laboratory groups in my class found themselves here, by grudging lottery, near the end of April for an ill-written and -compiled slideshow to honor Holocaust Awareness Month.

One of my primary gripes about awareness months and holidays is cliché: In general, they are contrived shows of grievance for history’s worst ongoing crimes – late and randomly assigned periods of the year in which we educate ourselves about a certain atrocity or appreciate a certain group of our society for their contributions. We come up with a bunch of facts, organize them into bullet points, reflect for a short amount of time and then return to whatever prejudices we otherwise harbor. Awareness months are masturbatory, hypocritical and unhelpful, designed to soothe majority guilt – not to right wrongs.

Look, we’re progressive and aware because we’ve acknowledged the existence of women and their contributions. We know that bad things happened to black people. We’ve dedicated an entire month of the year to spread the word that there is, in fact, an LGBT community, and they have a deep, broad, compelling history. This makes us interesting, sophisticated, good people.

So I don’t find it entirely offensive that most of my class expressed disdain for the prospect of attending Holocaust awareness training.

“We already learned about the Holocaust in high school,” one of them said.

The comment struck me as calloused at the time, but on reflection, it was an honest observation of the moment: If we go see it today, again, will it really make us care more? Though my classmate probably didn’t mean it this way, another way of saying it is, If we’re going to care, why can’t we care every day?

But that’s a philosophical concern. The point was, half the class was attending whether they liked it or not. I liked it, mostly because I knew it would give me writing fodder down the road, and here we are.

The Gold Room seats were filled with uniforms bearing most of the spectrum of rank insignia. The commissioned officers rubbed thumbs and forefingers thoughtfully on clean-shaven chins in the front row; the chief petty officers huddled, giggling about something on an iPhone; the first- and second-class petty officers checked watches; the low-ranking students made a full and typical spectacle of sexual innuendo. Finally, a first-class stepped up to the podium and stumbled through an introduction to the slideshow he had apparently assembled, which told us – in a text of schizophrenic colors, fonts and sizes superimposed over random photographs taken during World War II – much of the following, which can be confirmed here and in other timelines, about the Holocaust.

Adolf Hitler, appointed Chancellor of Germany on January 30, 1933, and his National Socialist German Workers’ Party – the Nazis – had a very busy first year at the helm.

On March 22, the party opened the Dachau Concentration Camp, a facility designed to hold 5,000 prisoners. Communists, Socialists, Catholics, eventually Jews and Poles and other people deemed threats to party platforms would file in to manufacture weaponry and to expand the camp and, in many cases, to die. The precise number of prisoners murdered at Dachau from 1933 to its liberation in 1945 was never pinned down, but conservative estimates put it between 28,000 and 32,000.

One day later, the legislature, called the Reichstag, dictated in a heated exchange with an embattled minority that Hitler had the authority to make unilateral legislative decisions. Under the promise that with the enabling act’s power Hitler would end unemployment in Germany, the Nazis called the legislation the Law for Removing the Distress of the People and the Reich.

On April 7, the Nazis excluded all Jews from all positions of public service.

In September 1935, the Nazis banned Jews from voting, from holding public office and from marrying gentiles with a set of new policies called the Nuremberg Laws.

On April 26, 1938, the Germans implemented rules requiring Jews to register their property.

November 9 and 10 of the same year marked Kristallnacht, a stunning nocturnal invasion by the German military into Jewish homes throughout Germany and Austria. The Germans destroyed Jewish valuables and rounded up approximately 30,000 Jews, bringing them to enslavement in concentration camps. Kristallnacht means “night of broken glass.”

From February to June 1939, foreign policy failures and shortcomings denied tens of thousands of Jews asylum in Western and Near East countries where they might otherwise have escaped the cruel hand of the Nazi Party. Among other breakdowns in proposed aid influenced by Western governments, the United States Congress had proposed a bill to let 20,000 Jewish refugee children flee to the states. It died in committee.

Carnage and, eventually, a world war continued for a very long time, until, in 1945, Allied Forces finally liberated the concentration camps and defeated the Nazis.

Here’s some context that wasn’t included in the Powerpoint. The name National Socialist German Workers’ Party is basically a liberal euphemism for the following platform:

  • Ardent nationalism, which, under the definition of the party, excluded Jews and acted as a prerequisite for German citizenship.
  • Annulment of the Peace Treaty of Versailles, which ended World War I and established very stringent gun control rules.
  • Government as a jobs program, excluding foreign nationals.
  • A requirement that all citizens be given equal rights and duties.
  • A ban on all income that does not produce something of value to society.

That’s not nearly all of it; you can read the rest here. There are 25 points, many of which could be – and are – interpreted by liberals as representing the mainstream right wing and by conservatives as representing the mainstream left. Conservatives swear up and down that all the mention of statism smacks of liberal sentiment, while liberals accuse neoconservatives of descending from the Third Reich.

I too often immerse myself in the world of Internet memes – those horrible pictures under, above or beside a text that purports to say something funny or profound about our world, but are rarely successful – and, viewing the slideshow, I started thinking about the ones you’ve probably seen comparing the Bush or the Obama administration to the Nazi Party and both leaders to Hitler. And it made me think, If I were Jewish or Russian or Polish or black, or if I belonged to any minority group that suffered the unimaginable atrocities of the Holocaust, I would be personally offended by such messages. Sure, both presidents are horrible. Bush used the murder of nearly 3,000 innocent Americans to advance a right-wing political agenda in the Middle East. Obama has given himself the authority to kill people if his advisers say they are terrorists. Bush justified a privatization of the New Orleans public school system on the economic shock of Hurricane Katrina. Obama seems to have been unaware that his agencies unfairly targeted non-profits with a specific political philosophy and secretly monitored the phone records of journalists working on stories that his administration didn’t like. Bush suspended writs of Habeas Corpus, which allow the justice system to bring prisoners to court to determine the legality of their detention. Obama campaigned on promises to reverse most of Bush’s regressive policies and has not scratched the surface after five years; in fact, he has accelerated most of them. Both presidents, beholden to the definition of corporations as people, have yielded unwavering support to the corporate structure in America to the detriment of actual people with actual lives.

But neither is Hitler. Neither led a party whose officially stated platform overtly discriminates against any race, religion or ideology. Any claims to the contrary are based in a deep misunderstanding of the key contextual elements of the narrative – the historical events described above, the philosophical ambiguities of the extreme ends of the political spectrum, the rightward swing of the political pendulum, and simple facts in general. More than that, to cast either in Hitler’s light is to trivialize one of the most horrific genocides in human history, and anyone who does so owes the survivors and their heirs a heartfelt apology.

I’d like to highlight a narrative that careens about the parts of Memeland adjacent to Jon Stewart’s “Bullshit Mountain” bumping uglies with the more depraved sectors of the American psyche I find particularly disingenuous: that Obama is Hitler’s political descendant because he is advocating more stringent gun control. The logic goes like this: Before he started killing state “enemies,” Hitler took all the guns away from them, rendering them defenseless. Therefore, Obama is like Hitler because he wants to “take our guns.” Therefore, if America enacts tighter gun restrictions, the government will soon show up at our doors and file us into communal transports to concentration camps or simply kill the more combative who refuse to surrender their weapons.

There are five huge problems with this argument. First, if you Google Holocaust timelines, you’ll notice a startling chronological contradiction: Hitler’s gun legislation wasn’t enacted until November 11, 1938, almost halfway through the Holocaust. By that time, five of the 27 concentration camps had already been established, the Nuremberg Laws had been in force for three years, Hitler had already formed his pact with Italian dictator Benito Mussolini, Germany had already annexed Austria, the Nazis had already required Jews to register all their property, Kristallnacht had happened the day prior and countless other violations of civil liberties that are far more important than owning firearms had already taken place. Had Hitler never imposed any gun legislation, it probably wouldn’t have made much of a difference. Second, none of the gun policy currently discussed among lawmakers even skirts the notion of confiscating all privately owned American guns – only the ones that have generated the biggest headlines (but that’s almost an entirely different kerfuffle). Third, Hitler’s gun legislation is hardly the only example of gun restrictions in history, and many others – though not all – have not preceded dictatorship or genocide. In fact, it’s a pretty difficult case to argue that all genocides and dictatorships are solely predicated on gun bans. Fourth, the Nazi gun ban did not tighten existing gun laws established under dictate of the Treaty of Versailles, which ended World War I and disarmed all of Germany; it loosened them. Article 169 of the Treaty states: “Within two months from the coming into force of the present Treaty German arms, munitions and war material, including anti-aircraft material, existing in Germany in excess of the quantities allowed, must be surrendered to the Governments of the Principal Allied and Associated Powers to be destroyed or rendered useless.
This will also apply to any special plant intended for the manufacture of military material, except such as may be recognised as necessary for equipping the authorised strength of the German army.” This rule was carried out by the Weimar Republic, the German power structure that directly preceded Hitler’s Nazi Party. Hitler actually gave guns back to people the Nazi Party didn’t see as a threat with the German Weapons Act, which undid much of the existing prohibition. Of course, it’s important to say that Jews were not allowed to own firearms because Hitler wanted to keep murdering them. But he never took guns away from the general population, as Wayne LaPierre wants you to believe; indeed, Hitler did away with far more important civil liberties than the right to bear arms – like the right to vote and own property – long before he denied gun ownership to Jews. Fifth, even if Hitler had tightened gun restrictions, it wouldn’t be an indictment of thoughtful, well-intentioned and -implemented gun control policy, and it certainly wouldn’t make Obama a Hitler disciple.

Conservatives use the argument that liberals are closet Nazis to malign an entire political philosophy that opposes their own, and vice versa. But conservatives forget that it was the socialists who stood against the enabling act, one of the first and perhaps the most important steps in Hitler’s march toward dictatorship. The embattled minority I mentioned above that challenged the legislation was represented by Otto Wels, the leader of the Social Democrats, who called out in the Reichstag debate before the vote: “We German Social Democrats pledge ourselves solemnly in this historic hour to the principles of humanity and justice, of freedom and socialism. No enabling act can give you power to destroy ideas which are eternal and indestructible.” Wels knew Hitler, for all his facade of the popular ideal, did not embody socialism. Conservatives forget the definition of socialism.

Liberals, on the other side of the coin, forget that many of the official platforms of the Nazi Party do align with leftist principles; instead, they accuse the right wing of upholding the Nazi platform. The liberals forget that Wels may as well have been a member of a party equivalent to today’s American Libertarian Party, which would have abhorred the Reichstag’s vote with equal verve. Nazism is a social experiment with political philosophy gone wrong, a Frankenstein’s monster that wreaked unimaginable horrors. Neither side’s portrayal of the other is fair or accurate.

But – and this is the clincher – do most Americans recognize the propaganda for what it is? I don’t think they do. Comparisons of the past two administrations to the Nazi Party went mainstream. This is because American politicians know the public is more easily duped by quip than convinced by fact and context. A showy advertisement, a clever turn of phrase – or, as in many cases, a not-so-clever one – is all we need to latch onto and enforce an ideology. This is because, in America, we value banality over nuance. We’re unwilling to invest time, research and hard work to form our worldview. We’re lazy.

WRITING THIS POST was tough. Late several nights ago, I kept scrolling through Internet repositories of the worst of human triteness – meme webpages and Facebook accounts – and found myself, swimming the emotional highs and lows of a pot of coffee and several bowls of Virginia Lighthouse, becoming incredibly depressed.

An organization called Know Your Liberal Memes superimposed the following text over a flattering portrait of Barack Obama:

3,000,000 new private sector jobs; Smaller government; $2,000,000,000,000 in deficit reduction; Health Care [sic] Reform; Wall Street Reform; Saved the US Auto Industry; All “bailout money” returned with interest

What did you do in the last three years at your job? 

This gem portrays the Obama administration as a net positive so far. So I’ll answer under the implied assumption that all the mentioned projects were beneficial or meaningful (which is another tough case): I was certainly not that productive. But I also didn’t orchestrate a drone campaign in the Middle East that has killed, by the reporting of The Bureau of Investigative Journalism, between 400 and 800 civilians, more than 150 of them children. The families of the innocent people killed probably see little good in the meme’s report card.

I was informed by a Facebook group that called itself “Guns ‘n’ Freedom” about a story I would never have heard in mainstream media on Mother’s Day: “Buena Vista, Michigan – –“ yes, complete with a newspaper dateline “A grandson uses his shotgun to protect his 98-year-old grandmother from a fugative [sic].” They told me this because Mr. LaPierre’s tribute, “The only way to stop a bad guy with a gun is a good guy with a gun” apparently rings truer with every contrived incidence of vigilantism, and the libruhls wanna take ‘r guns – never mind that no proposed gun legislation in our country would have taken away the kid’s weapon.

Another group that calls itself “Conservatives against Obama and his liberal adgenda. no longer bush’s fault”* had something many conservatives no doubt found quite poignant to say about the September 11, 2012, tragedy at an American diplomatic mission in Benghazi in which four American public servants lost their lives: Over a photograph of a certain former first daughter, “Draft Chelsea Clinton For [sic] Next U.S. Ambassador to Libya – What difference does it make?” The dig is clearly aimed at Hillary Clinton’s comment during her hearing on the tragedy, from which The New Yorker’s essential Amy Davidson drew the following excerpt. It’s an exchange between Clinton and Wisconsin Republican Senator Ron Johnson, who’d essentially accused Clinton of obfuscating, using a façade of bureaucracy to say she didn’t know what was happening in Benghazi before it happened:

Clinton: With all due respect, the fact is we had four dead Americans.

Johnson: I understand.

Clinton: Was it because of a protest or was it because of guys out for a walk one night who decided they’d go kill some Americans? What difference, at this point, does it make? It is our job to figure out what happened and do everything we can to prevent it from ever happening again, Senator. Now, honestly, I will do my best to answer your questions about this. But the fact is that people were trying in real time to get the best information. … But, you know, to be clear, it is, from my perspective, less important today looking backwards as to why these militants decided they did it than to find them and bring them to justice, and then maybe we’ll figure out what was going on in the meantime. [My emphasis.]

Since Mama Clinton said this during the hearing, which is part of a much larger investigation on which Republicans are pinning their hopes of winning the Oval Office in 2016 (they even doctored Executive emails to make it look like officials fixed false talking points about the tragedy), the GOP has dug in. They frequently take the quote in question out of context to make it look like Clinton was asking what difference the disaster makes in hindsight, when she was really saying investigators need to get to the bottom of what really happened.

These messages are typical. They each take an incredibly complex issue with many moving parts, smack it with a glitter stick and shove it, in the form of contrived and oversimplified propaganda, down the throats of unsuspecting social mediaites. We’ve lost facts, and we’ve lost sight of the big picture – that the real scandal in Benghazi is American nation building; should we have been there in the first place? What’s resulted is a philosophically divided electorate that can’t see past partisan politics to what could be smart, beneficial policy. And, though we disapprove of it, our legislative body is simply a reflection of that paradigm. Yes, it’s our fault.

But that’s not the worst of it. The worst of it is that it doesn’t have to be this way.

Memes did not originate on the Internet or in any advanced technical setting at all. The Random House Dictionary definition of the word meme is: “a cultural item that is transmitted by repetition in a manner analogous to the biological transmission of genes.” Or, as memeticist Susan Blackmore describes it, “that which replicates.” The Internet is nothing more than a special lubricant lining the human communicational mechanism that spreads the information.

Blackmore is a leader in the utterly fascinating scholarship of this concept, which originated with Richard Dawkins’ interpretation of Darwinism in the 1970s. It is how humans have evolved. Billions of years ago, single cells began to copy themselves and, through a massive orgy of replication, eventually spawned the biological diversity we, as humans, in an ironic twist of fate, are working so tirelessly to exterminate now.

Memes are the cultural equivalent of genes.

Blackmore says, in an excellent address to a TED audience: “Cultural evolution is a dangerous child for any species to let loose on its planet. By the time you realize what’s happening, the child is a toddler, up and causing havoc, and it’s too late to put it back.” We are our own Pandora’s Box. The memes – of language, religion, art, science – own us, she says, going on: “As the memes evolve, as they inevitably must, they drive a bigger brain that is better at copying the memes that are doing the driving. This is why we’ve ended up with such peculiar brains, that we like religion and music and art. Language is a parasite that we’ve adapted to, not something that was originally there for our genes.”

It’s not necessarily dangerous, she says – many parasites establish a symbiotic relationship with their hosts. But when parasites find a way to replicate autonomously, as memes would in some future dystopia ruled by machines, they would propagate information simply for the sake thereof, absent humans. We’d be relegated back to the primordial soup, partially due to our own destruction of the environment, partially because we’ll no longer be useful to memes.

Though it’s an interesting path to wander, I’m not convinced. Maybe I’m missing something, but the personification of information itself feels like too much of a stretch for me.

But Blackmore is right that memes are incredibly important and potentially dangerous. A bigger, more important and immediate effect of memes is that they’ve helped us evolve as a species, as well as in our social paradigms, creating meaningful progress. The concept of spreading ideas and information is a healthy thing for a society. In fact, it’s essential. That’s why the Fourth Estate, a concept that took hold in Western countries as the American Founding Fathers began to wage their revolution, is so important on a theoretical level. And on a practical one: American newspapers and other news organizations are responsible for a great amount of good in our country, as well as countless others. They are the quintessential bringers of meaningful change. Without them, any structure of government is bound to seize the authority to do things it shouldn’t.

And as the technical and philosophical wherewithal of journalists proliferates – like with the advent of instant and unlimited information sharing and the enabling tools to force transparency – so does the potential to grow that concept. The Fourth Estate has ever-expanding opportunities to influence.

But moments of triumph seem, increasingly, to stumble over the hurdle of power structures that suppress them. Perhaps the most excellent case study is Wikileaks, the network of information activists who publish on the Internet – the only forum in the history of the world that would have allowed such ambitious cataloguing – government secrets that would likely otherwise never see public scrutiny. Governments around the world want Wikileaks to stop what it does, and they are trying very hard to make that happen.

And if government isn’t a foe of sufficient formidability, we can increasingly count on public apathy to get in the way. Attempts to educate – true grassroots efforts to inform that make it through the gauntlet of policy – are often met by a public that is highly reluctant to learn.

Here’s one small, but palpable, example: My wife discovered a cell phone application called Buycott that allows consumers to scan barcodes at grocery stores and shopping outlets to find out where the money they would spend on specific items would end up – the coffers of Monsanto, Koch Industries, etc. It’s pretty brilliant. The incredible layers of corporate bureaucracy, the labyrinth of subsidiaries, that characterize the modern supply chain are one of the chief reasons corporations have such an untraceable stranglehold on economic and environmental policy in this country. The app cuts through much of that – it liberates informed consumption.

I brought up the subject of the app with a classmate of mine, and described what it does.

Here’s what she said: “That sounds like the most useless app I’ve ever heard of.”

Shocked, I stumbled through a shaky explanation of why such a tool is important: “It allows you to know whether food you’re buying is GMO, or if your dry goods are produced by a corporation that financed political campaigns you think are dangerous.” The benefits are vast and obvious.

She looked at me, and condescended: “I’m not like you; I just don’t have strong political views.”

There, the conversation ended. My classroom relationship with this person, though it exists only on a surface level, is fine. She’s a nice person, and we’ve even gone rollerblading together. But I was flabbergasted that a person who’d graduated from a good school with a bachelor’s in psychology could be so blasé about an issue that affects so many aspects of everyday American life, and has collateral implications for the international community, where the corporations identified by the app thrive.

In my classmate’s comments, the American mind comes into relief. I’d say it indicates a larger systemic problem – that there’s a secretly waged campaign at the highest echelons of all Western power structures to sucker the public into being so listless. There is, of course. But that’s not the issue anymore, and it can no longer be used as an excuse. When presented with an opportunity to know more, most prefer to know less and they ridicule those who seek knowledge. In the free marketplace of ideas, we’ve told the drivers of intellectual thought in America that we wish to remain ignorant. We’ve demanded shallow cliché and vitriol from our news organizations. So that, with the occasional exception of a dwindling guard of brave journalists who venture to dark corners of corporate board rooms and blight-stricken Third World communities, is what they give us.

But, though I’ve done so less and less lately, I like to end my diatribes on positive notes. As I wrote above, technology and the memes it can produce can be used for good. I’d like to highlight an effort to bring the idea of memes to an issue that has gotten far too little coverage of late, climate change. A while back on the Hedge, I detailed a media skirmish in which the conservative punditry effectively convinced a huge swath of the public that it should pay climate scientists – whose data activists were using to demand responsible environmental policy – no attention:

… when we’re forced into a corner with facts, we deny that they’re facts. We contrive backroom conspiracies – the most troubling case being that of global warming, in which right-wing media hackers dug and dug until, in 2009, they found a few negligible communication fumbles made by a group of climate scientists in groundbreaking research on global warming. Conservative pundits who called themselves “journalists” judged these faux pas, which didn’t challenge the validity of the scientists’ research findings, worthy of the moniker “Climategate” and proof that the biggest heretofore looming disaster in the history of the world is a hoax.

Of course, to ask the deniers is to learn that these climate scientists have a nefarious motive: They need to secure cash flow for future research, so they must convince the purveyors of these funds – in some cases the United States federal government – that more information must be gathered and analyzed, which requires more money. It’s like a job security Ponzi scheme in which they dupe the investors – taxpayers – into the belief that their capital will yield a sensible policy model to address what the scientists call the biggest geopolitical issue in the history of the world. Which of course will turn out to be the biggest scientific ruse of our time. Just wait and see. At the same time, the captains of the fossil fuels industry, who certainly only have the best interests of a hard-working proletariat at heart, have no such motive; the vastly increased profit margins of the past decade, some of which went to funding distorted and contrary research based on preconceived conclusions, don’t at all figure into their strategy.

It’s important to note that oil profits have slowed slightly in the last couple of years, but that doesn’t negate the fact that the oil industry spends massive sums lobbying Congress to favor its companies when it creates legislation.

And when the so-called scandal was shown for the blather it was by independent investigations, the news media dropped the ball. Climategate, and the supreme integrity of its preachers, inspired many Americans to abandon their knowledge of the science altogether. As Jon Stewart, a comedian who often covers the news better than most journalists, pointed out, only 59 percent of Americans believed in 2011 that global warming was a matter of concern, a 20-point drop from just five years prior. And journalists didn’t call it to attention when the debunking of the scandal was confirmed in The Wall Street Journal, of all places.

Of course they didn’t cover it; the McRib was back.

It’s not man, the “climate skeptics” say, or it’s not all man, and even if it were, it wouldn’t necessarily herald a catastrophe. So why not continue living as we live? The risks are slim, they say, as the ice caps melt away.

Of course, the Climategate brouhaha generated endless meme fodder for organizations opposed to implementing climate change policy. But memeticists Joe Brewer and Lazlo Karafiath have launched a project they hope will combat this trend. Essentially, they are fighting fire with fire by crowd-sourcing memes that convey facts about climate change to the blogosphere.

The point is, efforts are underway to educate. Against a vast chaos of information on the Internet that leads to misconceptions or, worse, apathy about important social paradigms, a select few people are working very hard to buck the trend. Lots of wonderful writers are blogging insightful commentary on important information. And they’re trying to make it better. Tim Berners-Lee, the guy who invented the World Wide Web, detailed to a TED audience a proposal to put more basic, meaningful information on the Internet in the form of raw data.

But in the mainstream, when we talk about something “going viral” or just being popularized in the average daily conversation, it’s usually a banal political meme or a video of some unimportant activity being performed, like a woman falling from a wine press or a drunken squirrel scrambling about a tree trunk or a montage of screaming goats. Those items are fine in and of themselves, but, unbroken by a deeper message, they bastardize the efforts of information pioneers. I can’t imagine how the deterioration of the conversation as a whole must make Berners-Lee feel. The biggest problem is not a lack of tools, though we do need more of them; it’s getting people to care about the information and then to act on it in a responsible way using those tools, like Brewer and Karafiath are trying.

I hope it works.

*I shit you not, the grammatical mistakes in this name were made in earnest. I asked.

2 Big Lessons We’re Missing from the Marathon Bombing

The question had been burning: Who did it? Who planted two bombs near the finish line of one of America’s most hallowed sporting events, the Boston Marathon, and killed or wounded 179 people?

The question had ruminated through my classroom at Dam Neck naval annex the day after it happened. Was it the Saudi national who was being questioned by law enforcement agencies? we asked ourselves. Was it a group or an individual? Was it a white man or a brown one? Was it a woman?

The question stewed in the roiling atmosphere of the 24-hour news cycle.

Some news organizations had dubbed the Saudi man a “suspect,” but the Powers That Be insisted he was being interviewed as a witness. Media found itself awash in speculation about a number of things. They wondered whether the attack correlated with Patriot Day, a Massachusetts holiday that happened to fall on April 15, the day of the Boston Marathon. Some said maybe it had something to do with it being Tax Day, the last day of the year to file income taxes. The implication was that some right-wing nutjob killed three innocent people and wounded 176 others to somehow protest the Socialist Agenda of The Obama Administration. By Wednesday, more media were swimming in reports that a suspect had been apprehended and was in the custody of law enforcement. Wolf Blitzer bragged at length on CNN about the incredible exclusivity of John King’s “scoop” on the “arrest,” and an hour later looked on as an FBI official, on the very same program, denied the report. There had been no arrest.

Thursday and Friday were no better. Thursday night, a rabid orgy of information gathering took hold as police gave chase to a pair of brothers who’d embarked on a crime spree across the city. Social media, then news blogs, reported, through a series of miscommunications, that the suspects being chased were Brown University student Sunil Tripathi and Mike Mulugeta, a name conjured from a tweeter’s mishearing of a police scanner. Neither character had anything to do with the bombing.

All week, the Rat Race was in full gear.

At several points, I even found myself guilty of speculating. A classmate asked me during our lunch break on Wednesday, Who do you think bombed the Boston Marathon?

I have no idea, I said.

I think it was a lone person, he said.

That makes sense, I said. Normally, when a large terrorist cell carries something like this out, they immediately claim responsibility.

I agree, he said.

We carried on for a while.

On Friday, after law enforcement had taken after Chechen brothers Tamerlan and Dzhokhar Tsarnaev, the same classmate said with some authority what he thought, based on his knowledge of the Chechen Situation, which is predicated largely on the region’s violent history: that the brothers, who by then had been identified, one killed and the other still on the loose, were Muslim. But nowhere had it yet been stated.

By the time I was out of school, at around 11, The New York Times and The Wall Street Journal both had extensive profiles pieced together from interviews with people who said they knew the brothers and from contacting the Chechen government. Later on, NPR reported information from a social media account that, they caveated, might not be connected to the younger brother. It could just be a guy with the same name, NPR said. I was driving with my wife to get some ice cream when I heard it, and threw one hand into the air from the steering wheel in frustration.

“I fuckin’ hate it when they do that!” I scolded the radio.

It’s how we get simple facts so very wrong. An out-of-context reading of Tucson, Arizona, shooter Jared Loughner’s reading list by right- and left-wing pundits led ideologues on both sides to claim Loughner subscribed to the extreme views of the opposite end of the political spectrum. Neither claim was accurate. After troubled college student James Holmes shot up a movie theater near the capital of my own state, the trigger finger of prejudice led ABC’s Brian Ross to publicly accuse a different man with the same name and a connection to the Tea Party of the crime. Jon Stewart eloquently castigated the veteran reporter, calling for Ross to issue the following apology: “And I am really sorry. Really fucking sorry. Deeply sorry. Deeply, irrevocably sorry to the innocent man that I casually, baselessly and publicly accused of – I don’t know – maybe being a mass murderer. ‘Cause when I was Googling his name I saw the phrase ‘the Tea Party,’ and I thought, ‘Oh! That’s a preexisting narrative! I should get that on the TV!’”

Simple factual error – a sin of which yours truly, it must be noted, was too often guilty when he reported for newspapers – is only a small piece of the bad coverage clusterfuck, and it only predicates a fraction of the troubling implication. John King – in fact, all of CNN – got something badly wrong in their fevered rush to Break the Story. King and his cohorts probably feel stupid. But what’s more fundamentally dangerous is our tendency to shape the national consensus in the void of stereotype. Blowhard David French of the National Review Online expressed contrition at the media’s misinformation campaign to preface the following gem: “Because he comes from Russia, a Chechen terrorist is more likely to easily assimilate within a Western country and is less likely to stand out – in dress, manner, or otherwise. In a country where the majority of citizens are white and westernized, it’s easier to blend in when you are white and westernized. That’s a simple fact.” To French, it was a simple fact. No need to cite evidence when we have wild speculation about two unfamiliar Chechen brothers as our foundation. On a statistical level, what French said may be true. But he painted two men about whom we have incredibly narrow information with the broad brush of category.

As egregious as it is to get them wrong so we can hurriedly promote a news organization’s profit imperative, facts are correctable. The increasingly impressionable American psyche, once started down an intellectual path whose roadbase is typecast, is far more difficult to right. We’ve manufactured our worldview, and that philosophy tells us everything we need to know about everything. We no longer need to consider facts, much less context. We need no deeper meaning.

Thank god not every reporter did a bad job. Thank god for NBC’s Pete Williams, who took it slowly, reported accurately. Thank god for The New York Times’ Katharine Q. Seelye, Scott Shane and Michael Cooper and others, who did the same. Thank god for Esquire’s Charles P. Pierce, with whom I often disagree because of his wavering support of the Obama administration, but who observed the Marathon Bombing situation with measured commentary from the perspective of a supporter of political commonwealth. Because we have them, we can have a foundation from which we can see past the Rat Race.

Which brings me to what’s important: how we first consider and then learn from this tragedy.

The question of who did it – an important one – has been resolved, as we all knew that it likely would. Of course, the answer has reproduced a host of new, more difficult queries. We may never learn the answers to all those. But perhaps they are not what matter, anymore.

By removing the introspection necessary in such situations, our focus on Jon Stewart’s “preexisting narratives” causes a paradigm that’s still worse than the bad turn we take with speculation. We miss essential existential questions about the nature of being Americans. I see two lessons here that, on the current trajectory, are being overshot.

The first is to consider what it means to be among the nations victimized by such violence as we saw on Monday. To flip through photographs of the marathon bombing contained in online newspaper galleries is to call up images of what happens in countless Middle Eastern and other Third-World countries on a daily basis. The blood, streaked across finish-line pavement, spilled from stumps of human flesh previously attached to limbs. The runners and onlookers weeping at the horror. America, it’s important to say, has not always been immune from similar violence. From 1978 to 1995, Ted Kaczynski, known widely as the Unabomber, murdered three people and injured 23 others by mailing a series of 16 bombs to university laboratories, computer stores and an airline. In 1993, the Bureau of Alcohol, Tobacco, Firearms and Explosives and the FBI raided and then laid siege to the Branch Davidian complex in Waco, Texas, and, after 51 days of standoff, 76 men, women and children were killed. In 1995, Timothy McVeigh and Terry Nichols killed 168 people and wounded more than 600, in reaction to the Waco Massacre, when they bombed the federal building in Oklahoma City. Nearly 3,000 people died on September 11, 2001, in a terrorist attack carried out by militant Muslims bent on killing Americans. There were mass murders prior and subsequent. But the point is that, as of the last decade or so, countries United States forces have invaded have been peppered with the emotional and physical pockmarks of explosions and attacks, sometimes daily. The people affected by these attacks did not ask for them, wherein lies a big distinction: many Americans have actively and enthusiastically supported politicians who carry out many of these same attacks, which have in turn bred much of the violence the American people have suffered.

There’s a photograph being spread by sad people around the interwebz of Martin Richard, 8, one of the three people who died in the marathon bombing. He’s sitting at a school desk showing off an arts and crafts project, a piece of blue posterboard with the following text written in multicolored crayon: “No more hurting people – Peace.” Martin Richard never asked for his death, or anyone else’s. He’d never even voted. But just maybe, had we, as voters, not supported at ballot boxes the initiatives implemented by the American federal government, which include the extrajudicial murder of people one man decides are terrorists and collateral killings of unquestionably innocent women and children, Martin Richard would still be alive. It’s an oversimplification to say the Tsarnaev brothers attacked the Boston Marathon as a direct retaliation for America’s geopolitical sins. To do so would make me no better than David French. But, as something of a believer in karma, I have to wonder: is the philosophy of American Exceptionalism so misplaced that we bring these tragedies on ourselves?

As I said to a friend recently on social media: My hope – though it’s not my prediction – is that we’ll not sit by and deny the injuries and deaths suffered in Boston on Monday their meaning. I hope we can learn from the attack, that we can elect a more honest government when the current one is done. But the only way that’s possible is if we are brutally honest with ourselves about our addictions to the distorted American MO of the past decade and a half in which we’ve allowed one man to eviscerate the most important parts of our founding legal structure.

The second lesson is learned from the way journalists covered this tragedy. Megan Garber of The Atlantic magazine makes a salient point: that spewing a garbled picture of what happened into the ether of the American psyche in the vacuum of deadline pressure absent meaningful context is not really journalism. Garber didn’t say this in so many words; I guess maybe I see the purpose of this writing as a supplement to her thoughts. Her central point – that “Tamerlan and Dzhokhar Tsarnaev are not simply ‘the Marathon bombers,’ or ‘murderers,’ or ‘Chechens,’ or ‘immigrants,’ or ‘Muslims.’ They might turn out to be all of those things. They might not. The one thing we know for sure is that they are not only those things” – necessitates the conclusion that the theretofore described comforts we as journalists and, by extension, a readership took in such stark terms do not reflect the full story. Therefore, they do not reflect the truth. Therefore, this is not journalism. It’s stenography, a robotic art maligned by professors in the J-halls of our institutions of higher education, but too often practiced by many of the same guard. That’s how we arrive at foregone conclusions like the one in my classroom that the Tsarnaev brothers were extremist Muslims. That’s how we waged our “War on Terror,” which has gone nowhere good.

We find it easy to blame reporters alone for these mistakes. It is easy, and that’s how we like it as Americans. But, again, that’s not where the truth lies. On a systemic level, these mistakes are our fault because we’ve voted – by attending to advertisements that squash any dedication to integrity in reporting – for this type of shabby coverage. Not all journalists buckle under the pressure, but the ones who don’t are becoming increasingly rare.

These two lessons are equally important. But they will, with all the distractions proffered by our media, especially the ilk of British conservative dingbat Mark Steyn who implicitly insist that to “co-exist” is a bad thing, be difficult to learn.

As the President said: “Americans refuse to be terrorized. Ultimately, that’s what we’ll remember from this week.” Yes. Hopefully we’ll do so – by refusing to further support his and his predecessors’ foreign policy and the corporate structure that has propagated that terrorization.

The Way We Destroy Things Now

Driving west on Highway 40, coming off the gradual downward slope from the majestic Continental Divide that eventually finds itself at the Green River, whose water used to find itself, after a series of grand confluences with other arresting streams, in a final smooth crescendo in the Gulf of California, you’d not know it, except to use your imagination. You look to your right and see the rocks rise like giant, grey, tree-covered almonds, jutting out to jagged peaks. Behind the rocks could be anything, you say to yourself. You wonder, but I can tell you.

One thing behind those rocks to the north, had you driven through a decade ago, was a younger me, running the hills, the thousands of acres of sage fields owned and managed – or neglected, however you chose to look at it – by the Bureau of Land Management, a government agency that is charged with the care of 8.4-some-odd million other acres in Colorado, part of the 256 million other acres in the 12 Western U.S. states it serves. I hiked and ran and frolicked, occasionally with a bunch of scrappy friends, but most times on my own, making my own roads that crossed the ones people drove in cars, over soft hills; arroyos; massive, sloping, white, bald rock formations that sweep and croon dramatically up and gut-wrenchingly down; into washes; canyons; cliffs; cacti fields; caves. Deer and elk did the same as me, rabbits fucked, lizards scurried and mountain lions lurked.

One day, I spent a long time stuck on a rock I had attempted to climb, free as I normally climbed, and came to a place from which I couldn’t negotiate myself. The white sandstone was smooth, but it crumbled. I had stopped on a ledge from which I could continue no higher. And I couldn’t make my way back down the 40 feet below me to the buck brush. After looking about for maybe an hour, not finding a way, I was about to send a friend back to my house to get help. But no, I told myself. I couldn’t do that. I took a deep breath and slid over the edge, facing the rock, clinging to a small knob I wasn’t sure would hold me, until I found a foothold below. As I went down, I scraped tiny granules off the rock, which prematurely fell to the creek bed. They’ll be moved, one by one, toward the Pacific, where eons after my life is over, they’ll be subdued by the Earth’s crust along with the rest of the frontal ranks of the continental plate of which they are a part. But that’ll never matter to me, that I slightly altered the future of those tiny grains. For my part, I just got over the edge; I trusted the rock. Of course, the flimsy knob could just as well have cracked from the face and sent me tumbling to serious injury. But it didn’t, and I was healthy to continue my adventures.

Another day, I heard gunshots on a nearby ridgeback and found my way to them. It was several kids from town, target practicing with a dad’s new hunting rifle, a mean-looking .30-06 with an evil black composite stock and a slick partial night vision scope. They’d placed some beer cans and bottles on another ridge, about 200 yards away, and were shooting over the hood of a pickup truck, using it as a benchrest. They did this often, and were excellent marksmen. The cans flew into the air, over the crest of the ridge on which they sat. For the final shot, one of them took the rifle and stood away from the truck, shooting off-hand, with nothing but his hands to support it, and nailed the final glass bottle, which shattered into countless pieces in the distance.

They got in the truck and drove away, and I continued my endless trek, my cadence of bare back and feet. The bottles and cans stayed there for I don’t know how long. Now, eight or nine years later, they might still be there, decomposing as quickly as those items decompose, which isn’t very fast at all. In fact, glass, which is made from molten sand, never decomposes, though it’ll break back down into the element it once was about as quickly as those grains of sand I knocked off the cliff side again become lava. If no one cleans it up, those glass shards will be there on the crest of that ridge for millions of years.

I used to leave things like that in places like that all the time.

My dad used to take me on fishing trips in high mountain Colorado creeks, where most people weren’t hardy enough to venture. He wanted more than anything to make me as good a fly fisherman as he, a lofty goal I’d never reach. Still, we’d go for brook and rainbow trout. Dad’d come back with 10 of them strung through the mouths and the gills on a chain or rope he’d fashioned to make sure the fish didn’t get away after he’d caught them. I’d have none, and never would any.

If we were camping, dad’d stick some slices of lemon and some butter in the chest cavities of the fish, where the guts used to be, wrap the fish in pieces of aluminum foil and toss them in a campfire for a while. If we brought them home, mom’d cook them in a frying pan, and we’d eat them. They were always absolutely delicious. They were the favorite food of a bunch of members of my family.

On one of the infamous Hedge family camping trips, we’d procured a campground near the lake, and there were a lot of rules in place, dictated by a bunch of faceless bureaucrats whose job it was to sit at a desk and make all those rules. Or so my family saw it. They spent the majority of the time drinking and complaining about the rules. I just drank. I drank so much that when it came time to leave, after a fitful night’s sleep, I wasn’t sure where I’d parked my car, and had to bumble about the campsite for an hour or so before I was in shape to leave. But during one of the three days we’d spent there, in one of the few sober moments of that trip, my dad and I drove his motorhome, which he brings on camping trips to make them more comfortable, to a stretch of the stream unoccupied by other fishermen. We parked on the side of a narrow part of the dirt road, and hiked down to the creek. After a few minutes, a park employee came by in a government vehicle and told us we couldn’t have our motorhome parked where it was, that we’d have to move it. My dad’s reaction was less than graceful. After what seemed like a very long, loud verbal dispute with the park employee, my dad threw his fishing pole in the motorhome and we drove back to the camping site.

He said something like, “The government’s just gotta be in everybody’s business all the time. You can’t do anything with the government giving you heck.”

My dad is very pessimistic about the government. He sees most things the government does as an unwelcome intrusion into what would otherwise be a harmonious clockwork of economic vitality, orchestrated by an unencumbered private sector. We’d be free to pursue our business ventures, our religious ambitions, our fishing and hunting trips. We’d be allowed to do whatever we wanted wherever we wanted. I don’t blame him for his reactions or the beliefs that spur them. I’m an idealist as well, though on the polar opposite side of the political spectrum. But in this case, I didn’t see what all the fuss was about. I tried to tell him there were lots of rules because there are lots of people who would ruin places like where we were camping if we didn’t have them. He wouldn’t agree.

As I got older, either my dad had started bringing wine coolers to drink on the way up the stream, or I started noticing that he brought them. We’d drink them as we dipped our fly line to the over, around and through of the brambles that swallowed the streams we fished. We’d chuck the empty bottles into the lush green underbrush, the small-scale, hyper-local version of the grander green around us. The White River National Forest, the Flat Tops Wilderness Area in the Routt National Forest, the Grizzly Creek Canyon that empties into Glenwood Canyon. We left glass and aluminum and plastic and lots of other things like that. We left a trace.

The last thing I want to say is that we were entirely disrespectful of nature. We were not. When my dad goes hunting and kills an animal, he thanks the Creator, whom he, incidentally, like many people, calls God, for the sustenance. We do our best to clean up our campsites. But, fact remains, we never made a big fuss about it.

But the point is, we got lazy. Lots of people still do. And, more important, when we are told by people like park rangers that we have to do things a different way, that maybe we’re not allowed to park in certain places, or when they write us tickets for fishing without a license, we don’t take it with poise. We see it as an overreach of government into parts of our lives in which it doesn’t belong. We subscribed to wild conspiracy theories every time there was a news story about popularly elected representatives setting aside a piece of land as sacred, not to be trampled on, or making a new rule intended to block corporations from dumping poison into public waterways. They were restricting our access to the public land because they were doing things on that land that they knew we’d be angry about if we knew, like building underground facilities from which they’d stage their evil plans to systemically overthrow the electorate and turn America Red. They were planting tracking devices in rocks and other things you’re not supposed to take from national monuments and forests to make sure that people were staying in line with the rules. They were training military police in urban warfare tactics with which they’d suppress any hints of popular uprising, the tools of which, incidentally, they are taking away by gutting the Second Amendment.

I never really knew where these ideas originated. I didn’t really care, because I knew we had all that BLM land, where we could do pretty much whatever we wanted. And usually, that wasn’t bad. Aside from the relatively minor and occasional sin of littering, we tried to respect our resources and surroundings. Still, when the government designated an endangered species or cordoned off a regrowth area, it was seen as part of that insidious government plot to Take It All Over.

And so, we just needed to be able to do pretty much whatever we wanted.

I’VE BEEN AWAY from the wilderness for a long time. The closest I’ve been to nature in the last year and a half is my weekly 3-mile run down a nondescript street on Dam Neck naval annex. The trees and undergrowth are incredibly green in Virginia, where it’s a lot more humid than in Colorado. The plants seem to encase the road, and on the rare occasions when no one else is running, it’s a lot easier to feel close to nature. There’s a beach nearby, where, as I’ve described previously on the Hedge, the shark eggs and horseshoe crabs wash up, the shells get collected before noon, and a hurricane snapped off at ground level the miles of flimsy fence that had separated the dunes from the flattened sands of the beach. I run there, too, when I think I can get away with it, which isn’t very often.

The base is gorgeous, if you stop and look at it, though most people around here don’t very often. There are also lots of big, blocky military buildings, reminders of the industrial revolution now embodied in America’s ailing military-industrial complex.

I’m a long way from the wilderness. So I started watching Ken Burns’s necessarily lengthy series of documentaries about America’s national parks, which he dubs in the documentary title “America’s best idea.” With every tale the series tells about the advent of the national parks and the assaults they endured to become establishments of American identity, I find myself more upset with the world. Every national park started with a threat: entrepreneurs looking to turn a quick profit thought the wonder of a particular American place, and the exploitation thereof, would easily put dollars in their pockets.

But a guard of men and women who should be seen as the truest Americans, dedicated to the truest American idea, acted. Iowa Republican Representative John F. Lacey drafted the 1906 Act for the Preservation of American Antiquities, which allowed Theodore Roosevelt to save the character of the Grand Canyon; gave FDR the authority to set aside the Grand Tetons; and let Jimmy Carter establish the Denali Wilderness and expand the Arctic National Wildlife Refuge in Alaska. John D. Rockefeller Jr., heir to one of the most prominent American fortunes, donated the equivalent of $45 million to the cause, helping to create Acadia, Grand Teton, Great Smoky Mountains and several others. John Muir, a Presbyterian Scot who memorized the entire Bible when he was a boy and, by ironic turn, became a father of the treehuggers, convinced Theodore Roosevelt to protect Yosemite on an impromptu camping trip. The examples seem endless, so I’ll not attempt to list more here; you really just have to watch. It’s available on Netflix.

But by way of synopsis: The overall point Burns makes is that the national parks redemocratized American land. Until then, the trend was a horrific spiral of private takeover of acreage and the resources – oil, coal, wildlife – thereof for corporate exploitation, whether to draw tourists for capital gain or to strengthen industry’s foothold in the American mind. The flow of land to private individuals and companies – the few with the money to purchase it – from a loosely defined public domain represented a growing disparity between the classes, a disturbing willingness of politicians to bend to the whims of people who hold more power than other people, a private coup of a huge part of the public sphere. It had gotten so bad that an ostensible return to the notion that public land was jointly owned by the public and administered and maintained by popularly elected officials was regarded in the mainstream psyche as a radical step. That notion is hardly radical to anyone with even a tenuous grasp of the naturally occurring paradigms of democratic society. Yet that was a nut of Burns’s project thesis – that the advent of the national parks was as radical a movement as the Declaration of Independence.

We see a similar trend of privatization today in the squelching of upward mobility for the lower classes – which are absorbing the middle class – and the attendant widening of the gap between the classes. We’ve become a nation with a kind of caste system, hailed by the loudmouthed corporate shills on talk radio as an organic economy, and those in power who claim to be fighting its progression are only facilitating it. This new battle to benefit the corporate structure at the expense of the proletariat is waged on a number of new grounds, the biggest being new methods of disseminating propaganda more efficiently, more totally and with more pizzazz than ever before. Which brings me to the point: those same advancements in the newest round of industrial revolution have also enabled those at The Top to wage a bigger war, that of the systemic destruction of our beautiful Earth.

Sure, people like Burns would argue that from the mid-19th century through the days of the New Deal, progress – the solidification of our national identity as an America that cared about our fellow citizens and the preservation of our country – was a goal toward which government worked. And what progress it was! But the counterrevolution was quick to the draw, arriving within mere decades.

The trend has a terrifying side effect. As the corporate cause progresses, implants itself ever more permanently and systemically in our lives, we don’t destroy our environment only in the more benign ways of our past. We destroy it in ways so complete and permanent that I have serious doubts about the health of the world I will leave my 4-month-old daughter, Harper.

I’m no anthropologist, but I’ll try to put this in perspective: Throughout history, humans have polluted. Like the animals with whom they cohabited the planet, our earliest ancestors roamed the Earth, defecating, urinating, leaving the carcasses of the animals they ate in places that would have gone on in health otherwise. But these humans represented no malignant cancer, no population crisis that spelled the impending death of Earth as their kind knew it. They passed the unchanged Earth along to succeeding millennia of descendants whose evolution spawned civilization. Some – the aboriginal tribes in Australia, North and South America, Africa and parts of Asia, for example – were not much more threatening to the health of their environments than their predecessors. They established equilibrium. But as European, Middle Eastern and Far Eastern men and women drafted their social constructs – villages, cultures, rudimentary hierarchies and, eventually, complex governments – the Earth showed the first true symptoms of her spreading disease. Communities sprang up, and humans needed to deal with large amounts of concentrated waste. With the advent of civic planning some 5,000 years ago, a human community in what is now Scotland funneled its shit into a nearby creek. Primitive plumbing systems emerged in the Middle East over the millennia; the bourgeoisie in Crete forced servants and slaves to hand-operate flushing systems for nascent toilets; the Romans took 225 years to design and build a sewage system whose model is still in use today. After the fall of the Roman Empire, attempts to channel waste away from civilization and into the wilderness stopped; people slung their shit from buckets, vases and pots every which way, including onto each other, which naturally birthed a flurry of what passed for small claims cases back then.
Environmental impact studies were an unimagined thing of the future; nature was held in ignorant disregard, as most people were simply concerned with the pains of living day to day through the Dark Ages. Subsequent generations of Londoners pumped feces into the Thames, and Venetians pushed gondolas through sewage. It was systemic and contagious; some large modern communities still do the same. Still, these localized dangers, coupled with a more-or-less stagnant world population of about a billion people, posed no real threat to the global environment.

It wasn’t until the Renaissance, the dawning recognition of the human ability to communicate, to better our own lives, to stop living in such squalor, to Go Forth and Multiply in concert with the Biblical dictate, that we counterintuitively began to destroy things on a terrifying systemic level. We created the printing press, the dry dock, the newspaper, the gun. We went further, embarking on several industrial revolutions. The first, from 1760 to the early-to-mid-1800s, brought us mass production, machines that could do vastly more work in a short period of time than a human could in weeks. We suddenly had the steam engine, which powered the textile industry and the mass production and distribution of steel.

But before we could make the whole system work, we had to generate the energy these machines needed to operate, and fossil fuels, mostly bituminous coal, were the perfect source, far more efficient than the biofuels we had been burning, like wood. So we started mining coal on an industrial scale, using the same machines it would power, and we built economies on fossil fuels and the gadgets they made go. The chicken laid an egg very similar to the one from which it came, which came from another chicken, and so on to infinity. Unfortunately, no environmental impact study was done, owing to deep contextual ignorance of atmospheric history, a lack of foresight about what it could mean for the future health of humans and other species, and an absence of tools to gather or methods to analyze any of this data.

A second Industrial Revolution, from the late 1800s to World War I, gave us railroads and electricity and chemicals and steel, all of which rely on the infrastructure predicated on fossil fuels, and we became further dependent on them. Succeeding developments gave us a drastically more subsidized (but not more efficient) mass food production mechanism. Eventually, we had nationwide facilities for travel, mass communications on what evolved into an up-to-the-second timeframe, a 24-hour news cycle and so many daily implements that depend on fossil fuels that I can’t remember going a day in my life when I didn’t at least indirectly contribute to the problem. See? I’m doing it right now.

So, with no thought toward the matter, we’d built entire economies, a global society, on this foundation, locking our collectively addictive personality inexorably in the vicious cycle of production, which spawned more human life, which spawned more production, and – to quote Kurt Vonnegut – so on. This excellent package by Mother Jones magazine, which I’ve linked before on the Hedge, shows us that the resulting population disaster isn’t represented in the political lexicon; when we discuss social planning, the idea of population control is relegated to the justly exiled realm of dictatorship. Instead of reckoning with that population, which is expected to plateau at more than 9 billion people in the middle of this century, we somehow think we must make it even bigger. The conservatives, under the guise of an assumed moral superiority of their loosely followed Christianity, insist that we continue to do things the way we have, to divest ourselves of science and rely, in a political, secular realm, on the fantasy that if people just didn’t have sex, society wouldn’t have unwanted babies. Sure, pure premarital and extramarital abstinence is the only way to be certain someone won’t get pregnant (unless we stuck to gay sex), but anyone who holds that premise as a valid foundation for a democratic secular society such as America’s profoundly misunderstands human nature.

Lots of things are wrong.

Gyres in the Atlantic and Pacific oceans suck in small bits of plastic trash and swirl them in an inescapable rotation that threatens the marine food chain. According to National Geographic, the gyre in the Pacific, known as the Great Pacific Garbage Patch, covers some 7 million square miles. It is so large that independent scientists cannot study it in its entirety, and no country is willing to put up the political capital to start the vast consortium necessary to clean it up.

People can light their water on fire and get all kinds of diseases from drinking it because we insist that, though oil is becoming a less viable foundation for our extreme consumption, we can continue the lifestyle with cleaner burning natural gas. We pull the gas from rocks deep underground in a frenzy for an alternative. The natural gas industry’s evils are captured here, by ProPublica’s excellent investigators.

According to 350.org, a nonprofit dedicated to influencing climate policy, we long ago passed what is considered among leading scientists to be the highest safe level of atmospheric carbon: 350 parts per million. Today, the atmosphere has about 392 parts per million, and we’re adding about two parts per million every year. The website notes the deep implication of this problem:

Accelerating arctic warming and other early climate impacts have led scientists to conclude that we are already above the safe zone at our current 390ppm, and that unless we are able to rapidly return to below 350 ppm this century, we risk reaching tipping points and irreversible impacts such as the melting of the Greenland ice sheet and major methane releases from increased permafrost melt.
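By way of perspective, the trajectory 350.org describes is simple arithmetic. Here is a minimal back-of-the-envelope sketch, assuming, purely for illustration, that the roughly 2 ppm per year growth rate cited above stays constant (real emissions trends don’t guarantee that):

```python
# Back-of-the-envelope CO2 projection, assuming the ~2 ppm/year
# growth rate cited above holds constant (an illustrative simplification).
SAFE_LEVEL_PPM = 350   # the threshold 350.org cites as the safe upper bound
CURRENT_PPM = 392      # approximate atmospheric level at the time of writing
RATE_PPM_PER_YEAR = 2  # approximate annual increase

for years_ahead in (10, 25, 50):
    projected = CURRENT_PPM + RATE_PPM_PER_YEAR * years_ahead
    excess = projected - SAFE_LEVEL_PPM
    print(f"{years_ahead} years out: ~{projected} ppm ({excess} ppm over the safe level)")
```

Even at that flat rate, by mid-century the atmosphere would sit roughly 140 ppm above the cited safe level, which is the scale of the tipping-point worry in the passage above.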

Things like this happen.

Because we’re so focused on our pettily held religious notions, which in the vastness of the distorted American Dream only justify our pettily held political conceptions, we are unable to admit we have a problem.

And when we’re forced into a corner with facts, we deny that they’re facts. We contrive backroom conspiracies – the most troubling case being that of global warming, in which right-wing media hackers dug and dug until, in 2009, they found a few negligible communication fumbles made by a group of climate scientists in groundbreaking research on global warming. Conservative pundits who called themselves “journalists” judged these faux pas, which didn’t challenge the validity of the scientists’ research findings, worthy of the moniker “Climategate” and proof that the biggest heretofore looming disaster in the history of the world is a hoax.

Of course, to ask the deniers is to learn that these climate scientists have a nefarious motive: They need to secure cash flow for future research, so they must convince the purveyors of those funds – in some cases the United States federal government – that more information must be gathered and analyzed, which requires more money. It’s like a job-security Ponzi scheme in which they dupe the investors – taxpayers – into the belief that their capital will yield a sensible policy model to address what the scientists call the biggest geopolitical issue in the history of the world. Which of course will turn out to be the biggest scientific ruse of our time. Just wait and see. At the same time, the captains of the fossil fuels industry, who certainly only have the best interests of a hard-working proletariat at heart, have no such motive; the vastly increased profit margins of the past decade, some of which went to funding distorted and contrary research based on preconceived conclusions, don’t at all figure into their strategy.

It’s important to note that oil profits have slowed slightly in the last couple of years, but that doesn’t negate the massive sums the oil industry spends lobbying Congress to favor its companies when it writes legislation.

And when the so-called scandal was shown for the blather it was by independent investigations, the news media dropped the ball. Climategate, and the supreme integrity of its preachers, inspired many Americans to abandon their knowledge of the science altogether. As Jon Stewart, a comedian who often covers the news better than most journalists, pointed out, only 59 percent of Americans believed in 2011 that global warming was a matter of concern, a 20-point drop from just five years prior. And journalists didn’t call attention to it when the debunking of the scandal was confirmed in The Wall Street Journal, of all places.

Of course they didn’t cover it; the McRib was back.

It’s not man, the “climate skeptics” say, or it’s not all man, and even if it were, it wouldn’t necessarily herald a catastrophe. So why not continue living as we live? The risks are slim, they say, as the ice caps melt away.

If I predicate an argument on an established fact and the person with whom I’m arguing says, “I don’t think that’s true,” I’m disabled from making my case. It’s a perfect strategy; it works every time. This way, by denying hard scientific fact on the basis of data that is either taken out of context or outright falsified, or on the basis of simple blind faith, we feel justified in supporting the fascist state that is perpetrating these crimes.

In Burns’s documentary series, there are many grandiose shots of cracks in mountains through which a much more fundamental life-support substance, fresh water, flows. Aside from the narratives describing how men who called themselves entrepreneurs exploited the natural resources we should embrace as more emblematic of our great land than the capitalistic aspirations of our increasingly right-wing political philosophy, it’s these beautiful shots that bother me the most. The landmarks they convey are breathtaking: the vast runoff of our most precious resource, the channels by which we navigated the West. But, with the aforementioned crises the world has endured, these waterways have become poisoned; industry in 2009 dumped 113,000 tons of toxic waste into American freshwater, according to this 2012 Environment America report. The great Colorado River, where I played and swam as a kid and again as a raft guide two summers ago, runs dry 60 miles before the point where it once flowed effortlessly into the Gulf of California, because it’s siphoned off, according to federal mandate, to cities and farms.

As I wrote above, we humans are a kind of malignant cancer.

It may seem counterproductive – downright madly depressive – and maybe a little cliché to look on human civilization as a disease. But it’s also a simple statement of fact. Counterproductive or not, I fancy myself a journalist in some capacity, and it’s a journalist’s responsibility to identify an existential crisis in its most basic essence. Indeed, a certain gloom is warranted for our failure to recognize the full meaning and context of our excessive way of life as a species – but even more so as Americans, who share a heavier burden of the guilt and more resources to reverse these troubling developments. Many people are working hard toward this very goal, including the scientists so maligned by the right-wing thinking circles that control the public conversation. Yet even though there’s broad agreement among scientists about the causes of global warming, we are losing our consensus on even the most basic facts: that the Earth is getting warmer, that climate change is caused by human activity and that it has catastrophic implications for a future that is very quickly becoming the present.

Now that I’ve done that, it’s also my responsibility to light a path toward potential relief. There’s no small degree of irony in our human existence: we are also capable of coming up with ways to counter the destruction of our own planet, and we would be profoundly behooved to do so. We can spread the good word of Ken Burns. We can read good books about the problem; here’s one, “Running Dry,” by Jonathan Waterman, the writer of the New York Times article linked above. There are lots more, available in a simple Google search. We can demand that our representatives acknowledge our addiction and take steps to wean us from it.

You know, all the bullshit platitudes, all the same things Al Gore told us at the end of “An Inconvenient Truth.”

Who am I kidding? This blog will probably never reach enough people to make a difference, and if it does, those people probably won’t do anything. I’d be preaching to a choir whose formidable opponents refuse to acknowledge science so they can justify their excess. Because the way we destroy things now is not the real problem. If the object were the only thing that posed a threat, we’d not have drug addicts and alcoholics. Not only have we locked ourselves in the activities that are ruining our home, we have locked ourselves in the thinking that either a) we are not destructive, or b) if we were, it wouldn’t be that big a deal. It all feels so hopeless sometimes.

Fortunately, this blog certainly isn’t the only source out there demanding solutions to this problem. The Mother Jones project linked above suggests the key is education. But a note from editors Monica Bauerlein and Clara Jeffery introducing the story expresses similar pessimism about whether we, as a species, are willing to take the measures necessary to preserve our home:

The truth, as both camps are coming to realize, is that you can’t protect the environment without advancing human rights—and vice versa. To wit: Every tiny improvement in the status of women, every bit of education for girls, translates into women having more control over their fertility, which translates into family sizes that match parents’ means and wishes, which in turn means more opportunity for the next generation—a virtuous cycle of enormous potential. The best 21st-century contraceptive, as Julia Whitty writes in our cover story, turns out to be a microloan. Improving the lot of humanity will also put us on track to limit our toll on earth’s resources—if (and it’s a massive if) we simultaneously manage to get our fossil fuel use in check.

So, what do I do, dear readers? I suppose there isn’t anything I can do except to keep writing, hoping to change a mind or two, and that those will go and change the minds of still others.

Several months ago, just after Barack Obama was reelected to the highest political office in America, I posted some optimism I was feeling for the political process. That maybe, next time around, we’ll find it within ourselves to elect a less fascistic government. But it was just a maybe, and a slim one at that. The optimism comes in cycles, and I’m in a rut.

When I go back to Colorado on leave, I plan to spend time with my dad on a hunting trip. We’ll go find some elk or some deer, or maybe we won’t succeed. Either way, we’ll talk about politics, to Mom’s and Hailey’s chagrin. I’ll try to convince my dad about some of the things I wrote about in this and other blog posts. He’ll try to convince me to the contrary. We’ll probably both come away feeling unproductive. It’ll be as OK as it can be. Which isn’t very OK, because people like my dad are the people we who see the facts for what they are have to convince. It’s on all of us to change things. It can happen, I suppose. But I don’t have a lot of faith.

Indeed, the way we destroy things now is not the problem. Ralph Waldo Emerson had it right:

“What lies behind us and what lies before us are tiny matters compared to what lies within us.”

Nationalize the Commons

Four thousand square feet of aluminum paneling make up the Arecibo radio satellite dish, the most sensitive receiver in the world.

El Pozo de Jacinto was calling to me, instead of the other way around, as legend has it. The massive hole, eaten through the modest oxidized cliffside and out the top over millennia by the vicious northeastern Caribbean waves, is a telling conduit of nature’s power. The saltwater, forced by an early spring weather front moving in, rushed through the formation and sprayed what seemed like thousands of gallons at stunning force into the waning gloomy light of the approaching mid-afternoon storm. The surf misted finely down with the breeze and onto my dirty blue jeans and blue Brewfest shirt in a calm quite unbecoming of the weather.

There was something down there, in that hole, maybe just the first call of the recognition that I had a drinking problem. But that wasn’t something I would square with myself here. Three coworkers who are also close friends and I had traveled to Isabella, Puerto Rico, after a few nights’ revelry in San Juan, to visit one of the friends’ grandparents and to party. It was spring break 2008, and we needed the disruption in our frenetic college newspapering and busy-if-we-chose-to-attend class schedule. Today, the friend’s grandfather was driving us around, showing us the sights.

We’d also visited Cornell University’s Arecibo Observatory, whose 1,000-foot-wide reflector dish – assembled in the early ‘60s from almost 4,000 aluminum panels – makes it the largest focusing antenna and most sensitive radio receiver in the world. It is where the death of the diabolical villain played by Sean Bean in the epic conclusion of the Bond flick “GoldenEye” was filmed. You may also know it from “Contact” and “The X-Files.” Also called the National Astronomy and Ionosphere Center, or NAIC, it more importantly is the site where about 300 scientists a year listen to the universe, and where about 30,000 grade school students annually learn about it. It is unique in too many respects to name here, but notably it can see things in space invisible to every other telescope on Earth. The ionospheric data it gathers informs the GPS systems that in turn inform the gadgets that help drive our economic paradigm. It is instrumental in the SETI@home project, a massive collaborative search for extraterrestrial intelligence whose supporters have lobbied Congress to maintain the financial health of Arecibo. It is the only facility in the world capable of tracking a possibly threatening asteroid in enough time for humans to do something about it. And it is essential in the quest to understand our place in the universe. But in 2007, its funding came under scrutiny in a report issued by the National Science Foundation to Cornell saying the organization was a few million dollars short of keeping NAIC open. Since the facility is technically owned by the United States government, it could take no private donations in the interest of keeping it open. In 2010, the NSF had to give up its responsibility for the observatory to SRI International (formerly Stanford Research Institute), a science research nonprofit, and two other administrative cohorts. Scientists at Arecibo are still innovating.

But, during the tumult, as I leaned over the rail on the viewing platform to ponder the expanse of the dish, I was ignorant. The dish and all its complex working parts were just that – still working. Arecibo was in the middle of a financial crisis, and all I knew was the sickening plunge over which I leaned into the dark grey of the aluminum basin, which is neatly tucked into the lush tropical foliage, was not helped by our trip’s persistent hangover.

We’d spent the first three days of our adventure in San Juan, where on our second day, one of my fellow travelers and I had decided we wanted to wake around 10 a.m. and go on a reporting adventure to an area of the city we’d been advised to avoid. We were going to uncover some sort of drug trafficking or violence and write about it. But first, we needed rum. On the balcony of our hotel, my friend and I quickly downed a large bottle of Bacardi 151 to get our levels correct for such an adventure. To do what we were about to do, we needed to be more dangerous than the degenerate, drug- and sex-peddling scalawags we were sure to encounter. I remember leaving on the miles-long walk toward the dubious borough; I don’t remember anything after. But from pictures and first-hand accounts of the incident, I was able to piece together the following: It happened on Condado Avenue, the main seaside thoroughfare through one of San Juan’s higher-end tourist flocking points. This was the road to where we would have our epic adventure, but it was cut quite short. Turns out, I was more dangerous to myself than any San Juan coke dealer. In a drunken daze, I fell and cracked my primary mandible incisor on the left side in half, slicing a nasty gash in my upper lip on the San Juan sidewalk. The embarrassing spectacle inspired a group of bewildered tourists to fetch my friends from a gift shop they were perusing.

“Is that your friend outside with the long hair?” the tourists called into the shop.

My friends rushed out to help me up from my stumble, and brought me back to the hotel. I remember the rest of it. I woke up at 3 the next morning in our hotel, alone, and ran to the bathroom mirror to survey the damage. My face and my tooth had been ripped apart by the concrete, and my brain felt in equal tatters from the booze. The gash in my face was at that stage where a wound starts to ooze clear liquid, and you can touch it lightly without too much of a sting.

A few moments later, my friends got back from their own adventure, which from the sound of things had taken a number of troubling turns. They were fighting about something that had happened. I didn’t pay any attention or try to figure out what the trouble was. I needed more sleep. We all tried to get to bed and sleep the rest of the morning away before we would head to quaint Isabella, where the friend’s grandparents would put us up and feed us some of the best Hispanic cuisine I’d ever experience. But the fight wasn’t ready for bed. I was falling asleep between the bickering parties, when one of the friends said something dickish, and the offended flew over me in a rage of fists and defensiveness. The other friend and I jumped from our sleep stations and labored to pull the second off the first, breaking up the fight. I accompanied the attacking party to the beach, where we met two young British lovelies also on vacation, and paired off. We spent the remaining hours before the twilight there; he smoked pot with his lady, and I swam in the crimson waves of the San Juan sunrise with mine. She invited me back to the hotel to join her for a drink at the 24-hour bar, and the other invited my friend to stay on the beach and smoke more pot with her. I told the swimming girl that I needed to stay with my friend, not catching the implication of eccentric sex in an exotic hotel room. No, we’d stay on the beach and watch the sunrise, I told her. The women reluctantly conceded, and we went our separate ways.

“Dude, you’re an idiot,” my friend informed me, as he watched the glad but unrealized specter of exotic tail in a strange city fade up the beach and disappear into the distant hotel entrance. It was only then that I realized my blunder.

But, for me, a man who drank too much, lapses in social observation were more the norm than the exception, and the faux pas didn’t apply solely to sexual suggestion. That semester, for example, I had forgotten to notify the editors and advisers of my student newspaper that we had won Colorado’s highest honor for student publications, the best-newspaper accolade. The editors had been invited to attend the awards ceremony to accept the prize, but the summons got lost somewhere after my “Rocky Mountain Collegian, this is Aaron” telephone greeting to the woman from the Denver Press Club who called to inform us of our win. I had simply forgotten about it. A couple of months later, after a last-minute call from the Press Club, the editor – who was now my companion on the San Juan beach – and his second-in-command had to rush to the Club, an hour away, to accept it.

I knew the derisive shit-talking that would spread from this latest incident would be abundant. But whatever. After some momentary ridicule, the friend forgave my thick-headedness, and we watched the rest of the sunrise over the distant, black horizon. We walked up and down the beach, and I told him I felt like I was squandering time – like I should have been doing something, because on vacation, not every minute of normal human waking time was eaten by an interview or an appointment to pick up some documents.

“Relax, man; we do a public service every day,” the friend told me.

SEVERAL DAYS LATER, in Isabela, was when the spray erupted from the hole in the rock at El Pozo de Jacinto, or Jacinto’s Pit Cave. According to a local folktale, a farmer named Jacinto had led a cow too close to the hole, and it dragged him in. When someone calls, “Jacinto, dame la vaca” (Jacinto, give me the cow), the water will spray through the hole as Jacinto’s answer. I’ve never been superstitious, and I certainly didn’t hear anyone call the phrase out when I visited the hole, so I naturally choose to remain dubious of the story.

But what strikes me as profound about the trip when I reminisce on it, as I often do with my thinking and now-sober brain, is that the hole in the rock is part of a community’s identity, and it is managed publicly – cleaned by the beachgoers themselves so the public can keep enjoying it.

The Commonwealth of Puerto Rico, an unincorporated territory of the United States, is a conservative place, or so I gathered from the scattered understanding of brief conversations with the friend’s grandfather, an Army veteran and a Puerto Rican to his core who schmoozed with local politicians at a bar on yet another beach, where he bought us a few drinks later that night. Some 200,000 Puerto Ricans have served in the American military in the conflicts since World War I, and 10,000 are currently serving. Puerto Rico’s economy was forged in the conservative fires of early 20th-century agriculture – hard work for hard money; sugarcane, pineapple and coffee for even payment. The United States ended Spain’s four-century rule of the archipelago – there are three islands other than the big one – in 1898, when it won the Spanish-American War. Though Puerto Ricans have elected their own government since 1948, when they put a governor in place, the United States Congress has had authority to drive much of the policy apparatus – including funding for Arecibo – on the islands. Its two national holidays are July 25, Puerto Rico Constitution Day – the day in 1952 when the polity’s governing document was ratified – and our Independence Day.

As anyone would in their position, Puerto Ricans are looking to obtain more control of their own government, though they are torn on how to accomplish that task. They voted equivocally last year, by referendum, for statehood and the ability to wield the voting power in presidential elections that other Americans do. But while statehood supporters called it a victory for the people, the decision of whether to begin the process of admitting Puerto Rico as a state still rests with the U.S. Congress. And analysts told reporters the election reflected a divided electorate that was voting in favor of something, anything, to change the structure in place. Thirty-nine percent of voters said they preferred an option other than statehood, either sovereign free association or outright independence. The discord over Puerto Rican ownership of American rights didn’t start recently; it goes back to 1917 and WWI, when the United States imposed U.S. citizenship on the people there. Every member of the Puerto Rican House of Delegates voted against the rule that made their constituents citizens because it would effectively allow the government to draft young Puerto Rican men into the military. But the U.S. government made it happen, anyway.

It was part of a clever expansion of the American Empire. Sure, it was also the catalyst for Puerto Rico to complete its bicameral legislature with the advent of a Senate and for the ratification of a Bill of Rights. But the bigger picture it helped achieve was that ever-American vision of Western superiority with the U.S., of course, in the lead. It helped mobilize the military for the most epic conflicts in history and gave us another base from which we would draw for subsequent battles – Korea, Vietnam, American operations in the Middle East – in which we’d lose the tattered remains of our moral dominance.

In many prominent circles and according to most official accounts of history, America has been seen, during WWI and in every generation since, as a beacon for all that is right with humanity, an actor on which other countries should model themselves. But to Puerto Rico, as well as to other Latin American and Third World polities on which the American government – acting mostly at the whim of corporate interests – has violently imposed itself, America has rightly been seen as something far different and far more insidious. In the circles to which it matters most, the spread of the American Empire is a metastasizing tapeworm, swallowing natural, industrial and intellectual resources with impunity.

Within the restrictive boundaries of America as Most of Us See It, the 50 states in all their self-importance, a parallel and equally troubling – and in many ways, conjoined – trend has been on the rise for all of the same decades. Recently, the structures in place to stop that movement have been systematically deconstructed. The death of public service for the sake thereof, and the correlating proliferation of capitalism on the same, has taken an insidious and quiet toehold in the American psyche, and we have allowed it to happen.

The assets that make up – or should make up – the public sphere, to which I’ll refer as “the commons,” involve a long list: the dissemination of natural resources, health care, the military, the diffusion of information in the public interest, education, etc. They are the things on which we depend for individual survival and for the health of the collective. It’s no secret that these things are under assault. Especially in the right wing of the American polity, any publicly administered service that necessitates the expense of tax money is to be deleted posthaste, and would be if it were not for the meddling of those pesky liberals in the public dialogue. To replace it, they would have – and in many instances have had – those services outsourced to private, for-profit functions, which, according to legal officials, are beholden not only to being profitable but to pursuing profit above every other motive in the interest of making money for their private shareholders. It doesn’t matter if the service they provide – food production, for example – is essential to the mental wellbeing, health or life of the public herd; corporate administrators are to do everything in their power to make sure their profit margin is as large as possible. This is not a goal that’s difficult to accomplish. There’s a built-in incentive beyond just the legal ramifications of failure in the profit quest – the lavish bonuses, social status and political influence that naturally come with meeting said objective are not unattractive to corporate fat cats.

America’s Anchorman, Rush Limbaugh, is fond of pronouncing his skepticism of the Affordable Care Act, along with other ostensibly public programs, with this concept as his founding philosophy. In his telling, if an entity is motivated not by profit but by a focus on the service it provides to its community, that goal is naturally coupled with an underhanded effort to turn the country Red, stifling competition and, by that turn, innovation. Therefore, implies Rush, it is necessary to incentivize large-scale industrial innovation by privatizing and deregulating all the working parts of a society that would otherwise be run by government. Rush, of all people, should be happy with the direction in which America’s health care system is headed; the biggest thing the Affordable Care Act achieves is a guaranteed clientele for the largest private industry in the country by mandating insurance coverage, and this was done at the expense of a public option.

The interests of corporations were very well-served, indeed.

But even when a company tries to get away from this sadly omnipresent metric of corporate health, it is chastened by America’s legal structures, which seem to exist now not for the public good as a check on government but solely toward the end of profitability. Take the case of Craigslist, Inc.: When eBay sued Craigslist for breach of fiduciary duty – legalese for the profit imperative embedded in state corporate law – the Delaware Court of Chancery judge said Craigslist cannot measure its success by how well it serves its community, because profit then becomes an ancillary goal.

What results is an operating paradigm in which corporations do everything within their power to turn a bigger profit – and they have done an excellent job – including actions that are detrimental to the health of the proletariat. One of the best examples is the food industry, which, according to this extraordinary report by New York Times Magazine reporter Michael Moss, intentionally alters the chemical properties of its products – food that is often very unhealthy – to enhance their addictive qualities. Food companies also administer persuasive advertising campaigns to convince the public to buy the products under the false message that they are not unhealthy. The story should be required reading for anyone thinking about eating these foods (I’m one of them), but I’ll leave you with an excerpt that I think best explains the trend:

The public and the food companies have known for decades now … that sugary, salty, fatty foods are not good for us in the quantities that we consume them. So why are the diabetes and obesity and hypertension numbers still spiraling out of control? It’s not just a matter of poor willpower on the part of the consumer and a give-the-people-what-they-want attitude on the part of the food manufacturers. What I found, over four years of research and reporting, was a conscious effort — taking place in labs and marketing meetings and grocery-store aisles — to get people hooked on foods that are convenient and inexpensive. I talked to more than 300 people in or formerly employed by the processed-food industry, from scientists to marketers to C.E.O.’s. Some were willing whistle-blowers, while others spoke reluctantly when presented with some of the thousands of pages of secret memos that I obtained from inside the food industry’s operations. What follows is a series of small case studies of a handful of characters whose work then, and perspective now, sheds light on how the foods are created and sold to people who, while not powerless, are extremely vulnerable to the intensity of these companies’ industrial formulations and selling campaigns.

For another example, I’ll look again to the health care industry. In this TIME report, another shining example of investigative journalism, veteran writer Steven Brill asks a question that is essential to the health care debate but had gone shockingly unaddressed until his report was published: Why does health care cost so much in America? As Brill notes, the health care debate focuses on who should be footing the bill for America’s obscene hospital charges. Nobody bothers to follow the money, to find out why Americans pay more for health care than the next 10 most expensive countries combined. He explores a landscape in which hospitals, many of them classified as nonprofits and associated with public institutions, pay unjustified, nauseating salaries to their administrators and board members while handing patients – people who didn’t choose to become part of that marketplace, as Brill notes – equally nauseating bills. Take one of the patients whose story Brill paints:

One night last summer at her home near Stamford, Conn., a 64-year-old former sales clerk whom I’ll call Janice S. felt chest pains. She was taken four miles by ambulance to the emergency room at Stamford Hospital, officially a nonprofit institution. After about three hours of tests and some brief encounters with a doctor, she was told she had indigestion and sent home. That was the good news. The bad news was the bill: $995 for the ambulance ride, $3,000 for the doctors and $17,000 for the hospital — in sum, $21,000 for a false alarm.

There are plenty of other enraging anecdotes – a nearly $8,000 privately purchased CT procedure that would have cost Medicare mere hundreds, $1.50 tablets of generic Tylenol sold by a hospital that would have cost about a hundredth as much on Amazon, etc. – but the overall picture is terrifyingly encapsulated in this passage:

When you crunch data compiled by McKinsey and other researchers, the big picture looks like this: We’re likely to spend $2.8 trillion this year on health care. That $2.8 trillion is likely to be $750 billion, or 27%, more than we would spend if we spent the same per capita as other developed countries, even after adjusting for the relatively high per capita income in the U.S. vs. those other countries. Of the total $2.8 trillion that will be spent on health care, about $800 billion will be paid by the federal government through the Medicare insurance program for the disabled and those 65 and older and the Medicaid program, which provides care for the poor. That $800 billion, which keeps rising far faster than inflation and the gross domestic product, is what’s driving the federal deficit. The other $2 trillion will be paid mostly by private health-insurance companies and individuals who have no insurance or who will pay some portion of the bills covered by their insurance. This is what’s increasingly burdening businesses that pay for their employees’ health insurance and forcing individuals to pay so much in out-of-pocket expenses.

When I discussed my apprehension about being unoccupied on that beach in Puerto Rico with my editor, I was focused on building a career in journalism. That goal has not yet been realized; I hope it will be, someday. For now, I’m forced by personal circumstances to watch the scene from the sidelines – unfortunately, a scene in which that public service squanders its identity to the profit imperative. It’s no secret that newspapers, and journalism in general, are in trouble, and that the paragon of the craft, investigative journalism, is under attack. But it’s not solely the fault of journalists (though that is, indeed, part of the problem). Publishers discovered decades ago that papers that carry messages about the government to the public are incredibly valuable – valuable to the advertisers whose products are displayed to readers each time they sit down for a cup of joe in the morning. That public service should be valued, but not in the way it is now. Money is naturally generated from that value, and money has become the endgame.

From here, we all know the story: Newspaper owners were very successful until the Internet, and the free advertising thereon, killed the bread-and-butter classifieds. But publishers have not been able to think outside the profit box. Before journalists realized the trend that was killing their industry, savvy business people started buying up struggling newspapers, gutting their staffs and funneling the profits to the resulting conglomerates. The starkest example is Gannett Company, the largest newspaper conglomerate in the world. Arthur Ochs Sulzberger, the recently late longtime publisher of The New York Times and chairman of The New York Times Company, was part of this guard; he was reported to have viewed an unprofitable publication with disdain because he saw it as unhealthy. Which, in my mind, is why The New York Times is in so much trouble. They keep trying to find new ways to stay profitable, when profit should not be the motivator. The paywall described in the above link has been successful and is helping the paper stay afloat, but that misses the point: We, as public servants, have lost our way. Of course, it’s tough to rival the nation’s newspaper of record when it comes to investigative public service journalism. But new models are doing just that. ProPublica, the 6-year-old online investigative newsroom – a 501(c)(3) that has won two Pulitzer Prizes and numerous other awards for its investigative reporting – has come as close as any through partnerships with larger news organizations (including the Times). And as the biggest newspapers around the country – with the notable exception of The Wall Street Journal, which remains healthy because of, among other things, a highly targeted marketing and branding strategy – hemorrhage advertising revenue, my sense is that publications like ProPublica will become the only sustainable model for the most essential of public services.

Food, health care and the Fourth Estate are only three brief examples; I could go on to energy utilities, water companies, education, transportation, the stewardship of national treasures – any service that has been widely privatized in the name of concentrating wealth in the hands of a select few. But after 3,800 words, you get the idea. The point is, something has to change; Americans must demand that services essential to individual and collective wellbeing be re-collectivized, run by grassroots, not-for-profit organizations like local businesses and co-ops whose goals are in concert with the communities they serve. The companies that accomplish larger, national goals should be democratized, run the way the federal government should be run – complete with the highest standards of transparency and accountability, and with shareholders empowered to oust leaders who refuse to lead properly.

Canadian journalist Naomi Klein has dedicated her career to documenting the battle between corporations and the public for the commons – a battle that is currently being won by the former. In an old interview with PBS’s Charlie Rose, she discusses her book “The Shock Doctrine” – another must-read – for which she reported from the front lines of that battle as it was waged from the Southern Cone of South America to Iraq to her home country to the United States. She summarizes the fight toward the end of the interview, in discussing what the Third World proletariat has organized to accomplish in the face of the privatization of their lives:

I think what is being rejected [in this battle] is this ideology of leaving everything to the hands of the market; they do want intervention … In Bolivia, it’s interesting: It really all began with this revolt against Bechtel, and it was over the issue of water. Because water came to symbolize – you know, and the slogan was “Water Is Life,” and it comes back to this issue of, it’s fine to have a market, capitalism is fine when it comes to buying shoes. But when it comes to surviving, whether it’s health care or disaster response … you can’t leave it to the free market. In Bolivia it became illegal … to collect rainwater because it was seen as unfair competition for the privatized water companies. This became a symbol for taking [disaster capitalism] too far.

In her book, Klein writes extensively on solutions, as she does in this column about climate change denial in The Nation, in case you don’t get the chance to check out “The Shock Doctrine.” Excerpt:

The way out is to embrace a managed transition to another economic paradigm, using all the tools of planning discussed above. Growth would be reserved for parts of the world still pulling themselves out of poverty. Meanwhile, in the industrialized world, those sectors that are not governed by the drive for increased yearly profit (the public sector, co-ops, local businesses, nonprofits) would expand their share of overall economic activity, as would those sectors with minimal ecological impacts (such as the caregiving professions). A great many jobs could be created this way. But the role of the corporate sector, with its structural demand for increased sales and profits, would have to contract. So when the Heartlanders react to evidence of human-induced climate change as if capitalism itself were coming under threat, it’s not because they are paranoid. It’s because they are paying attention.

But – and this may be the supremely ironic item in today’s America – the most fundamental thing that must happen first, as Klein writes, is the re-democratization of our elections:

Of course, none of this has a hope in hell of happening unless it is accompanied by a massive, broad-based effort to radically reduce the influence that corporations have over the political process. That means, at a minimum, publicly funded elections and stripping corporations of their status as “people” under the law.

The commons are not only Arecibo or El Pozo de Jacinto or the 300,000 acres of public land in 41 states the Bush administration tried to auction to private bidders or the Arctic National Wildlife Refuge. They are not limited to these physical spaces most often thought of as “commons,” in which we gather, communicate, are profound, are idiotic, are human; they are also the paradigms we’ve created by which we accomplish those human transactions. The commons are the arenas in which we as a public administer our way of living, where we tell our leaders where we want the country to go, where we inform one another when those leaders are not doing their jobs the way we want them done, where we decide what type of country our children will inherit. They are the marketplaces in which we consume products that are essential to our way of life. We must stop allowing private corporations to decide how the commons are managed, because private corporations have a vested interest – a binding financial obligation – in managing them to the detriment of society.

Don’t misunderstand: The call to nationalize the commons does not necessitate a government takeover of all private industry. I’m calling to take the oversight of the paradigms necessary for social function back from the Captains of Industry and hand it back to the proletariat. This means small, local businesses running small, local industry, instead of massive conglomerates moving in to destroy mom-and-pop operations and change the physical and social landscape of entire communities. It means the deconstruction of Monsanto and the restructuring of Walmart. It means taking the profit motive away from the craft of journalism so local, regional and national news organizations are free to invest in quality watchdog reporting without outside influence, especially in the face of economic trouble. It means the decentralization of Gannett Company. It means government reclamation of the military. And in some cases, yes, it means tighter government control – possibly even intervention or outright takeover – of the international supply chains that provide us with our way of life but also inflict so much suffering in Third World sweatshops.

The industries described above, and ones I’ve missed, should be left to the people to govern. We should heed the calls by Klein and others to hand these responsibilities back to the people, to participate, as Klein says, in some “disaster collectivism.”

A Dying Legacy of an Investigative Journo

The white anthology of essays by I.F. Stone, a father of modern investigative journalism, came to me in a conversation with a source at a bar – a bar in which another journalism and literary icon, Hunter S. Thompson, once snorted a line of cocaine off a strange woman’s breast, and at which he was recorded participating in many activities that might, in our cleaner, drier day, be considered quite strange. But back then, it wasn’t particularly out of the ordinary; when Thompson was around Aspen, that sort of behavior was just something one might expect.

Now that era – Thompson’s long-form debauch and the rich long-form literature that grew in tandem with it, the day of slow, thoughtful writing – was over, to the dismay of other journalistic geniuses. It had been the day of the not-quite-lucid 20,000- to 40,000-word magazine or newspaper narrative, or even book, on an aspect of American life that would have been ignored if not for their type – the horrific excesses of the sports bourgeoisie, the rabid indulgences of motorcycle gangs, the general inside-out nature of the American Dream. It had been the day.

Another era was long dead in Aspen. Though drugs still trickled through the Roaring Fork Valley, at times in heavier flows than others, it was no longer the palpable gush of the ’70s and ’80s. It felt cleaner. Now, though nostalgia was evident in the ruminations of local lawyers, cops and lawmakers, and in late-night drinking sessions, Aspen seemed calmer, more intellectual. Now was the day of the Ideas Festival, the Food and Wine Classic, the X-Games. It was the day of clean spectacle, of supremely careful city council deliberation over the light-pollution impact of an expansion of the local hospital and the naturally conjoined increase in taxes; over a relocation of the city’s internationally renowned art museum; over a hydropower project that could have dramatically reduced stream flows in Castle Creek and harmed an aquatic ecosystem.

At this small late-night drinking table at the Hotel Jerome, a couple of weeks after I had incited the worst anger of city attorneys, sat three local politics gadflies. One held the medium-sized white, yellow and blue I.F. Stone paperback that exemplified the death rattle of yet another truly American epoch. The man holding the book was a radical liberal who hailed from a town just a few miles west and was active in the local Democratic Party and in election activism. He was one of my sources on the story I wrote for my newspaper, The Aspen Times, that was the object of said city officials’ wrath, and he had brought the book – which I now assume was part of a larger collection of radical leftist literature – as a token of encouragement for me.

I had been quite down lately over a number of corrections that had to be printed for the story and a subsequent loss of support from my editors. I was planning to quit working for the paper, to apply at some rag crazy enough to hire me, maybe somewhere in the Midwest. It didn’t matter too much, as long as it was far from here. Of course, it would turn out that I wouldn’t work for another newspaper anytime soon. As Hedge readers know, I joined the Navy, and so another era, the journalism one particular to the life of Aaron Hedge, was dying in the Hotel Jerome, too.

Sitting in the dim, the source with the white, yellow and blue paperback seemed to know this. After a quick, semi-unprofessional – perhaps borderline unethical – conversation about the city’s aggressive lashing of my paper and me for printing the story and a not-so-quick drink, I said I needed to head to the Times’s newsroom to file some copy on the night’s city council meeting. Before I left, the source handed me the book – a gift, he said, to inspire the future watchdog in me.

“The Best of I.F. Stone,” reads the cover. “To Aaron – following in the footsteps,” reads a note quickly scrawled in thick ballpoint on the inside cover, followed by the source’s John Hancock.

I had never heard of I.F. Stone, nor had I ever expected to be this ambivalent toward learning something new about journalism. My interest had somehow waned to the point that I didn’t care about it at all anymore. The source and his friends at the table assured me the book was full of insight pertinent to the world in which I had previously so wanted to live. I thanked them warmly and headed out into a chilly wind that heralded the first snowstorm of the year. But despite my appreciation, I knew my career was over, at least for the foreseeable future.

I put the book on a shelf and drank for several months.

I DROPPED OUT of college in 2010 for different reasons than Isidor Feinstein Stone, a self-described “anachronism,” did more than 80 years earlier. I quit studying because all my college money had become drinking money, and I got a job at The Aspen Times. Stone left a philosophy major at the University of Pennsylvania from which he would have graduated in 1928 to pursue what would become a burgeoning career at newspapers and magazines. Subconsciously at least, I eschewed college so I could drink and write. He abjured it because he felt a piece of paper certifying a person has completed a specified amount of academic work is not what makes a journalist great. Journalism should be a meritocracy of on-the-ground shoe-leather scrappiness, of hard work – not one of academic ivory towers. He wanted to buck the status quo.

And he was able to – and was successful – simply because it used to be that all you needed to get a quality job in journalism was a willingness and a talent for getting information and disseminating it legibly. His experience in journalism was as intimate as they come; he notes in the book’s introductory essay, “I have done everything on a newspaper except run a linotype machine.”

He would go on to work for many papers and magazines – including then-flourishing but now-dead publications – like The Camden Courier-Post, The Philadelphia Record, The New York Post, The Nation and P.M. But – and I’m repeating a ubiquitous but essential note of all writing about Stone – the capstone of his career was the nearly two decades, from 1953 to 1971, during which he worked a cottage sub-industry within newspapering: the rich story of how a government that presumably operates under the idea of freedom of speech effectively controlled its press and, most notably, how: “But the news is ‘managed’ because the reporters and editors let themselves be managed,” he wrote in a 1955 entry in his venerable and iconoclastic one-man investigative newsletter, I.F. Stone’s Weekly, from which came many of the essays in the book the Hotel Jerome source gave me.

On an ironically different path in life – traversing the military, of which Stone was intensely skeptical – I have moved past the depression I felt leaving the Hotel Jerome with the book in my hand. Recently, I picked it off a different bookshelf – the third or fourth in the nearly two and a half years since I left Aspen – and began reading the essays, which are organized chronologically, one for every lunch break. Stone’s graceful dispatches, which he wrote from D.C. but never from inside the Press Corps, drew mostly from publicly verifiable documentation and rarely from anecdote or conjecture. Famously, he used facts from the Coast and Geodetic Survey to corner the Atomic Energy Commission into admitting that its nuclear testing in Nevada was felt 2,600 miles away in Fairbanks, Alaska – instead of detectable shocks being limited to a 200-mile radius, as the agency claimed. This story was important because the government had used the 200-mile figure to insist a nuclear test ban would allow Russia to conduct nuclear tests without the knowledge of other countries. He broke news of similar importance throughout his publication of the Weekly, using obscure records that most newsmen were unable, due to time constraints, or unwilling, due to complicity, to sort through.

“The fault I find with most American newspapers is not the absence of dissent,” he wrote. “It is the absence of news. With a dozen or so honorable exceptions, most American newspapers carry very little news. Their main concern is advertising. The main interest of our society is merchandising.”

Though the method and speed of reporting in a 24-hour news cycle are different, today’s journalistic world employs the same profit motive in its managing of the news. With the exception of the occasional mammoth scoop, even The New York Times, the beloved American newspaper of record, carries the same stories every other news organization does.

Very little news, indeed.

But more importantly, Stone defined what a good investigative journalist should mean to the society she covers. It means not being complicit – a sin of which too many of today’s reporters are guilty. Stone was loud about his former affiliation with the Communist Party – no timid act in the days of McCarthyism – and equally loud about leaving it and the philosophy behind it. He also defined the political dialogue, the cog that ties policy tangibly together and spits it, for better or for worse, into the sphere of the American polity. Flipping at random, I’ve found it difficult to land on a passage that doesn’t draw an eloquent insight into ongoing themes in American politics – a skill with no parallel in today’s Washington.

Here are some excerpts:

From “Only the Bums Can Save the Country Now,” a 1970 rebuke of the Ohio National Guard’s killing of four Vietnam War protesters at Kent State University:

In a dispatch from a landing zone in Cambodia, Jack Foisie of the Washington Post (May 8) described GIs jumping from helicopters under enemy fire with derisive denunciations of the war scrawled on their helmets. One of those he copied down sums up the situation of the whole country in this war. “We are the unwilling,” it said, “led by the unqualified, doing the unnecessary, for the ungrateful.” As usual the country is not being told the truth about why we went into Cambodia. In his war address of April 30 Nixon pictured the attack across the border as a preemptive exercise to hit an “enemy building up to launch massive attacks on our forces and those of South Vietnam.” It was described as swift preventive action from which we would soon withdraw and which was not part of any broader intervention in Cambodian affairs.

This should sound familiar to modern payers of attention; preemption under false threats was the cornerstone of W.’s campaign in Iraq. As Stone goes on to note, America had long been plotting to throw Cambodia out of its neutrality in the region’s bitter conflicts.

From “The End of the War,” a 1945 reflection on America’s popular apathy as the end of World War II approached:

Since 1931, when peace began to crumble in Manchuria, there has been war, civil war and world war, at an increasingly furious tempo: from those first shootings in Mukden and the first beatings in Dachau to the bombing of Shanghai and the civil war in Spain, from the first blitz on Poland to the use of an atomic bomb over Hiroshima and Nagasaki. What a vista of blood and cruelty! But not blood and cruelty alone. As in some gigantic symphony played on human hearts for the delectation of a mightier race, agony has blended with a beaten-down but irrepressible, mounting, and finally victorious heroism and aspiration. These are only gaudy words to us. We and our orators have rung the changes on democracy and freedom until the words have grown shabby and nauseating. But to certain men the war’s ending comes as the end of a struggle against fascism begun long before the war was declared, fought in the underground hideaways of Japan, Italy and Germany, in occupied China and Spain, in the Vienna working class suburbs and in the Warsaw ghetto, humbly and obscurely, but as bitterly as on the broader battlefields. Those who understand that this was in truth, for all its contradictions and compromises, a war against fascism, a successful war against fascism, a war that is slowly but surely letting loose the forces of freedom the world over, cannot take the end of the war casually.

This entry should be a reminder – especially for a generation like mine, obsessed with Snooki instead of thoughtful about foreign policy – to pay close attention and give deep contemplation as we exit wars that have exacted a different kind of toll: America’s moral superiority.

From “The First Welts on Joe McCarthy,” a celebration of the launch of the Senate’s investigation of the rabid senator who, until then, had conducted his furious assaults on the freedoms afforded by the Constitution against startlingly little dissent in the legislature. It is also a warning of duplicity among McCarthy’s attackers, whose very methods lent a level of legitimacy to McCarthyism:

Great issues are rarely resolved by frontal assault; for every abolitionist prepared to challenge slavery as a moral wrong, there were dozens of compromising politicians (including Lincoln) who talked as if the real issue were states rights or the criminal jurisdiction of the federal courts or the right of the people in a new territory to determine their own future. In the fight against the witch-mania in this country and in Europe, there were few enough to defend individual victims but fewer still who were willing to assert publicly that belief in witchcraft was groundless. So today in the fight against “McCarthyism.” It is sometimes hard to draw a line of principle between McCarthy and his critics. If there is indeed a monstrous and diabolic conspiracy against the world peace and stability, then isn’t McCarthy right? If “subversives” are at work like termites here and abroad, are they not likely to be found in the most unlikely places and under the most unlikely disguises? How talk of fair procedure if dealing with a protean and Satanic enemy? To doubt the power of the devil, to question the existence of witches, is again to read oneself out of respectable society, to brand oneself a heretic, to incur suspicion of being oneself in league with the powers of evil.

The thought control that proliferated in McCarthy’s Senate – the product of unyielding partisan rhetoric – is readily palpable in both chambers of today’s legislature.

From “The Mason-Dixon Line Moves to New York,” a 1968 essay reported from the trenches of an increasingly divided Brooklyn, where the national racial cacophony echoed in complex local chambers. Stone brings the highest level of nuance to the race debate by illustrating it through the lens of the fights between Jews and blacks in New York City:

I asked a Brooklyn school teacher just what was at issue in the strike. She replied with appalling simplicity, “Anti-Semitism.” How do you win a strike against anti-Semitism? By circumcising all Gentiles and Black Muslims into Black Jews? “What does Mr. Shanker want?” the Mayor asked in a similar vein in a radio interview the next day. “For the police vans to come into the [Ocean Hill-Brownsville] community, arrest them and send them to New Jersey?” Is the Exodus to be reenacted, this time with a black cast? The plain truth is that John V. Lindsay is in trouble because he suddenly finds himself the Mayor of a Southern town. The Mason-Dixon line has moved north, and the Old Confederacy has expanded to the outer reaches of the Bronx.

The insight is endless.

I.F. Stone’s Weekly was independent, running on free-market principles to promote what was considered a radical liberal agenda. Stone charged $5 per subscription, and his readership eventually grew to 70,000. Distributed through the government post, the Weekly remained profitable in each of its 18 years. But profit was not his motive. In an age when stuffy businessmen were recognizing the most fundamental cornerstone of a well-oiled democracy as a wellspring of profit, Stone embraced freedom from advertisers – in a way making him one of the most independent capitalists in American history – and from government. He saw the news not as a product but as the most essential public service. Using the microphone of the written word, he proudly shouted anecdotes about government agencies blacklisting him. He was the most eager teller of stories that would otherwise go untold, the proudest champion of the message that needs to be heard, the quintessential herald of information considered by too few among us to be quintessential.

Looking back to that night in the Hotel Jerome, when my career was falling apart, I hope the source’s optimistic proclamation – “following the footsteps” – rings true, and that there will be a re-revolution in the news. Journalism like Stone’s once had its day. I hope that day comes back, and that I’m part of it.