Capitalism in Singapore

Every day of the five I was there, the Singaporean heat would break in the early afternoon and heavy clouds would spend themselves in 30-minute torrents. The clouds, lifted of their burden, gave way to sun only after double-digit drops in temperature. I smelled the connection between the daily release and the greening undertaken by the successful Singaporean postcolonial project. It seemed as if the millions of leaves that line the city’s plant-draped buildings were engineered to breathe the heat away.

There was no apparent underbelly here, as one would expect in a metropolis of more than five million people. With two comrades on longboards in tow, I rollerbladed the length and breadth of the Republic of Singapore over several days, criss-crossing it in every direction from the expat districts. Land reclamation projects jutted from the clouds that shrouded the body of the island. Towers were prominent, but they didn’t loom the way a Western skyscraper does. The buildings seemed more subtly embedded in the largely artificial cityscape, draped in gowns of lush vertical green, a man-made rainforest, more felt than imposed. It is a gorgeous city, which, presiding over the world’s busiest economic thoroughfare, boasts a rich history.

The city-state succeeded a European occupation that began in the early 16th century with the Portuguese and ended in 1963 with the foundering British colony (interrupted by a brief Japanese occupation during World War II). The local successors did well, economically speaking. The Singaporean economy “depends heavily on exports, particularly of consumer electronics, information technology products, medical and optical devices, pharmaceuticals, and on its vibrant transportation, business, and financial services sectors,” the CIA’s World Factbook says. Economic metrics are stronger in Singapore than in most other developed nations, with the state propping up the market economy. It enjoys the seventh-highest GDP per capita in the world, the Factbook says.

My wheeled friends and I noticed some stratification, to be sure: the tonier areas populated by lighter-skinned holdovers from European rule, the seedier ones by brown-skinned Singaporeans (ethnic Chinese, Malay and Indian) who smoked cigarettes and were sometimes missing teeth. The city’s industrial aspects were not apparent as we rolled through the poorer areas. My colleagues and I ducked into shops that served cuttlefish porridge and Tiger Beer, the local version of Budweiser, or braised duck with pickled eggs and Hoegaarden, a 500-year-old Belgian wit popular the world over but cheap in every country except the United States. Singaporean culture was decidedly lower class in these areas, but here, where a modicum of trash was sometimes detectable on the otherwise immaculate streets, the locals we encountered always smiled, and if they had fewer teeth, they still always had money. No one begged for change.

The city-state is about as advanced as a society can become. This level of social evolution is particularly palpable in the city-state’s birth statistics. It has the world’s fourth-lowest infant mortality rate, at 2.4 deaths per 1,000 live births, compared with the United States, which comes in 56th at 5.8 deaths per 1,000 live births. The Factbook lists Singapore as the state with the world’s lowest fertility rate, at .82 children per woman. (This is up from .79 some months ago.) Given that each woman must have around 2.1 children to keep a population constant, there are grave social security implications in this number. Still, it is a sign of advancement. World demographics generally show that countries with higher levels of economic and social development have lower birthrates. The Factbook lists Africa’s Niger with the highest fertility rate, at more than six children per woman.

There’s nuance to this social evolution, of course. Singapore has some utterly draconian rules, some enforced by public caning and jail time. Singaporean byways are protected from errant loogies by the threat of imprisonment for any spitter. The same punishment is wielded for chewing gum. Visiting military commands drill this into service members during port-call briefs. Other edicts are more serious than the spitting and chewing dictates. Don’t get caught having gay sex, or go to jail. Drug and arms trafficking are punishable by death. If a sailor winds up in jail, she must stay there, as the United States lacks a status of forces agreement with Singapore. Calling it a fraught experience might be too much, but one should tiptoe.

There are what could be considered eyesores. The shores are lined with barges stacked with massive hills of dredged or imported sand. Incessant building projects are bordered by fences with signs threatening prosecution for trespassing. The vestiges of land reclamation projects from throughout the country’s relatively short history dot the island like pockmarks. The conspiracy theorist in me assumed the Singaporean government had the poor bussed out of the city proper to some obscure reclaimed wasteland. In a slick economy fueled almost solely by the trendiest technology and upscale market speculation, nearly a quarter of the land is artificial, and Singaporean engineers are busily studying ways to take more land back from the sea. A recent New York Times Magazine piece by journalist Samanth Subramanian noted: “Land is Singapore’s most cherished resource and its deepest ambition.”

A beacon of the “leveling power” of Western-style capitalism, Singapore is one of the four Asian Tigers, with consistent gross domestic product growth of 7 percent from the 1960s, after Britain relinquished control of the country, until 1990. (The other three Asian Tigers are Hong Kong, South Korea and Taiwan.) That growth sits around 2 percent today, a relative slump that would nonetheless spark envy in the hearts of Western capitalists.

Those who can afford to own a car, considering the 100 percent tax on vehicles, are few – only 15 percent of Singaporeans drive. But those who do drive shiny, expensive sedans. There’s a strong public sphere, an incredible public rail line and a sense of wellbeing. The markets and malls are always full of patrons. There’s a strict ban on any type of weapon. There’s a “sky park” atop the Marina Bay Sands Hotel, a massive edifice capping three downtown skyscrapers, with a pricey admission fee, $50 burgers and an Olympic-size swimming pool. This goliath was built, like many Singaporean landmarks, on reclaimed soil.

But something is amiss. Subramanian wrote, “The country is so devotedly pro-business that it can feel like a corporation; its constitution includes several pages on how the government’s investments should be managed.” After a few days there, this sterility becomes apparent in a visitor’s transactions in the swanky expat-district bars, which employ elevator-style bands covering American folk rock and serve cuisine designed to mask, for residents and tourists alike, the social and economic impediments of Singapore’s global region.

It’s a broad truth that lavish lifestyles are built on the backs of the global poor. Perched on the Strait of Malacca, the island embodies the intersection of commerce and environment, and there’s an economic and environmental cost to Singapore’s serenity. A great place to start looking for this cost is in the Strait and in neighboring Malaysia, of which Singapore used to be part. The countries parted ways in economic philosophy decades ago, and Singapore has gotten the better end of the deal. Malaysia struggles, while Singapore does not.

The Strait, which separates Singapore from Malaysia, is impossibly full of massive ships. It’s the busiest trade route in the world. Naturally, there’s more piracy – another crime of desperation that can yield the Singaporean death penalty – in the Strait than anywhere else in the world, including the infamously pirate-infested trade routes off Somalia. Closer to home, there’s a swelling human trafficking trade, often patronized by the American military. We’re told at length before every port call to eschew soliciting prostitutes, a practice that carries dire consequences if you’re caught. Still, one of my longboarding compatriots dipped into a massage parlor for an hour while the other rider and I drank beer and felt uncomfortable about the situation until our prodigal friend returned with a smile.

But the meat of Singapore’s opaque, illicit substratum is more fundamentally economic and environmental. This elaborate, shining city-state consumes at a breathtaking pace. The Factbook says Singapore is the world’s 15th-largest importer of crude oil, at more than 830,000 barrels every day (compared with more than eight million for the United States, the world’s largest importer). This is notable, considering Singapore has only the world’s 114th-largest population. Some materials that help put Singapore at the forefront of world exports come from toxic mining operations and manufacturing processes that harm global citizens in distant lands who haven’t the means to defend themselves against neoliberalism. Public officials there are obsessed with holding back an ocean that threatens to inundate the island’s low-lying terrain. Subramanian wrote that a third of the land is lower than 16 feet above sea level. The Singaporean government has mounted a concerted effort – which other low-lying cities are sure to emulate – to keep the city afloat as global warming raises the sea level. It is building creative locks and dams and reclaiming land from the sea, a nearly two-century project, initiated by the British and continued by the Republic of Singapore, that grows more urgent with each warming year. Subramanian, whose story focused on the island’s efforts to stay above the rising seas, noted that the biggest worry of Singaporean officials is maintaining the size of the island.

Singapore seemed, like the United States, quite intent on bettering the lives of its own. I think Singapore does a better job of it. But that’s not really the point. The point is locating the threshold at which bettering the living standard for people who already have an excellent living standard is no longer excusable. Where is the moral tipping point at which a country can no longer support the industries that create utopia at home and cause havoc abroad? Within the capitalist market economy on which the world turns, that point sits somewhere near the top of the infant survival tables. And it’s only a matter of physics and time before Singapore’s growth becomes untenable. Wrote Subramanian:

But the desire to reclaim never-ending shelves of land, farther and farther into the sea, will inevitably be outfoxed by physics. On a whiteboard, [Assistant Chief Executive of the Jurong Town Corporation David] Tan drew me a diagram of the process: first, building a wall in the water, reaching all the way down into the seabed; next, draining the water behind the wall and replacing it with infill. As the ocean grows less shallow, it becomes harder and harder to build the wall, to stabilize the infill, to protect it all from collapse. “We’re already reclaiming in water that is 20 meters deep,” Tan said. “Maybe it would be viable to reclaim in 30 meters, if land prices go up. But 40 and 50 meters would be very difficult. It’s physically difficult and economically unviable.”

The Gender-Specific Pronoun and Me

My first meaningful experience with the gender-specific pronoun was in my starter English comp class at Colorado Northwestern Community College. The teacher, a short, feisty, conservative woman with freckles and a penchant for embracing stereotypes, thought the “him or her/he or she/his or hers” imperative of political correctness in writing clunky, overwrought and unwarranted, and she was right about that part. Clunky, overwrought and unwarranted, yes.

She preferred, she said, to simply use masculine pronouns. It simplified sentences and drew no attention away from writing’s substance. But on top of her writerly conceits, the teacher also scoffed at the notion that women are somehow subjugated by the exclusive use of the word “he.” You can’t use “they” when talking about a lone person (even in the abstract) because it’s grammatically incorrect. And you shouldn’t default to she/her/hers because that’s what the pussy snowflakes do. The teacher scoffed at political correctness in general, seeing it as part of leftism’s insidious seepage into the American zeitgeist.

There’s some truth to this. I generally agree with Bill Maher that there’s something to the conservative complaint of liberal term-baiting and identity politics (though conservatives ironically lick their wounds in the comfort of their own safe spaces). Political correctness has been carried too far by the American left. But not in this case. Here, it has not been carried far enough.

There’s a fascinating movement in the trans community these days to contrive a singular gender-fluid pronoun, to accomplish political correctness and inclusivity in writing while achieving simplicity. This is an essential effort (which I’ll leave to the lexicographers), and there must come a time for it. But until a solution arrives on that front, I say it’s women’s turn.

Over the last couple of years, I’ve used the gender-specific pronouns she/her/hers nearly exclusively, the only exceptions being when I’m writing about a demographic that is entirely or almost entirely composed of men. I do this for a calculated reason: I want my readers to think about feminism, even if that’s not the subject of my writing. An imperfect analogy is killing our own meat, which we more often than not consume without thinking. If we kill something, we have to look it in the eye and be reminded that there’s a life affected by our diet. In the same way, the words we choose affect people, whether or not the effect is perceived. If we’re thoughtful about language as we consume or offer it, we have to admit its conventions – including that more writers seem to use masculine pronouns – are inherently racist and sexist, reflecting an inherently racist and sexist society. If we read “she” in place of “he” and it shocks us, we’re forced to reckon with our mode of word usage and consumption.

Let’s talk statistics. The “gender pronoun gap” fluctuates with the social status of women. When women have more purchase on the economy, make up a larger portion of the workforce and write more, the use of feminine pronouns increases. Writing in The Atlantic in 2012, Jen Doll described research that tracked the use of gender pronouns in literature over the previous century.

[San Diego State psychology professor Jean M.] Twenge explains in her paper that “the gender pronoun ratio was significantly correlated with indicators of U.S. women’s status such as educational attainment, labor force participation, and age at first marriage as well as women’s assertiveness, a personality trait linked to status. Books used relatively more female pronouns when women’s status was high and fewer when it was low. The results suggest that cultural products such as books mirror U.S. women’s status and changing trends in gender equality over the generations.” Or, more simply, the rise in all of those indicators – as women married later, had careers, and became more independent and assertive – correlated to a rise in the use of female pronouns in writing.

Logically, the study found a drastic uptick in the use of feminine pronouns since World War II, when women took over manufacturing jobs and established a foothold in the American workforce. The underlying theme is that when women do well, we acknowledge their existence.

I’m having some friends look over a long essay I’m planning to send out for publication soon in which I almost exclusively use feminine pronouns, including when I describe some negative trait. One friend, in her critique, noted that she found it jarring, that it removed her as a reader from the substance of my essay to ask why I wrote she/her. She asked me to think about this and determine why I’m doing it, and here’s what I have for an answer:

Because even when women don’t have the social purchase they deserve (I can’t think of a time when they have), they are still here, they are still driving the economy with immeasurable grace and subtlety and they are never going to go away. They are a force to be reckoned with and they are still not sufficiently recognized as such even in a liberal and thoughtful profession like writing.

My critic friend is probably right about the feminine pronoun jerking readers from the narrative. We are not used to it as readers. But sometimes the teeter-totter needs a violent tug to level out. In the first half of the 20th century, according to The Atlantic’s account of Twenge’s study, masculine pronouns outpaced feminine pronouns in American literature by about three and a half to one, before the upheavals of WWII, the Sexual Revolution and the establishment of feminism in the cultural mainstream. Now, that ratio of masculine pronouns to feminine is two to one, Doll writes, citing the Associated Press. Insufficient progress, if you ask me, and the women, who will save us from our collective dementia, deserve more.

Why Socialism?

Hunter S. Thompson got a lot right in his Atlantic obit of Tricky Dick, maybe more than I’ll ever get right in all my writing. Specifically, he said, reporters had traditionally gotten it wrong when it came to covering one of the Greatest American Crooks: “Some people will say that words like scum and rotten are wrong for Objective Journalism – which is true, but they miss the point. It was the built-in blind spots of the Objective rules and dogma that allowed Nixon to slither into the White House in the first place.”

Thompson got a lot right, but with a big proviso. Now, I wouldn’t presume to lecture the late Doctor of Gonzo on astute observation. While, as a friend of mine observed, Hunter S. was an immature degenerate who blew his brains out while talking on the phone with his wife, and may or may not have participated in gang rape during his reporting on the Hell’s Angels, he was a literary genius. But I have to say I think he missed the point himself. In the final paragraph of the obit, perhaps Nixon’s defining epitaph, Thompson disdains the sullying of the office of the American president – itself the ironic symbol, the ultimate power emblem, of the dark political arts that over the course of 227 years have sullied the very idea of America.

“Nixon,” Thompson wrote, “will be remembered as a classic case of a smart man shitting in his own nest. But he also shit in our nests, and that was the crime that history will burn on his memory like a brand. By disgracing and degrading the Presidency of the United States, by fleeing the White House like a diseased cur, Richard Nixon broke the heart of the American Dream.”

This is a profound reflection; Nixon weaseled his way into the position because of a press weakened by the confines of Traditional Objectivity as USA Today would define it. But so did many other presidents, and the ones who didn’t slunk into power through other proverbial loopholes, as Obama did with his complicit following, or as both Roosevelts and both Bushes did via Nepotism Boulevard.

The American Dream itself has blind spots written into it, burned like their own invisible sort of brand in our constitutional obsession and capitalistic addictions. The dirty work is done behind oaken doors hung with “EXECUTIVE SESSION” signs, but we all know about it. Still we act disgusted.

Richard Nixon was not some horrific experiment-gone-wrong in the laboratory of democracy, no bizarre anomaly in the annals of Western leadership. He was unique only in that he got caught doing something all other presidents have done. But that wasn’t where we focused, and it wasn’t where we litigated. The investigation was swept into the spotlight by a couple of scrappy beat reporters at The Washington Post who uncovered a vast underworld of corporate and political villainy at home and abroad by looking into a seemingly isolated, blue-collar incident. We were able to see the bigger picture of Nixon’s entire preceding lineup only through the lens of a break-in, and we quickly forgot that bigger picture.

Watergate is a household term; the Pentagon Papers, not so much.

We’re all complicit, me included, even if only by our own small ballot-box participations in the Great Machine.

Since I enlisted a year and a half ago in the World’s Greatest Fighting Force, a keystone of said apparatus, I’ve been trying to bridge this chasm. It’s been especially tangible of late, like the stinging metallic tingle you get in your sinuses when you hit your head. It’s driven me a little batty, to the point I’m not sure whether it shows in my social and professional interactions.

In a Navy C school lesson several weeks ago about a certain maintenance check on the weapons system I’m learning for what will become my job aboard the aircraft carrier USS John C. Stennis, an instructor was reading instructions for a procedure. He pointed out the blinking indications we should see on the equipment if everything works properly, and the conversation drifted to just how fucking formidable the technology is. Here’s how it works in action: On floating cities in the middle of the world’s deep blue, through dozens of tons of steel cabinetry containing millions of dollars’ worth of electronic, hydraulic and pneumatic circuitry, signals dart to provide target coordinates to one of eight nearly 600-pound missiles, which takes off at something around three times the speed of sound. A few seconds later the missile detonates, destroying a threat to the ship that is generally flying through the air at blistering velocity itself. It’s just one of the many fire control systems the Navy’s FC technicians maintain and operate. Ours is strictly defensive, unlike the Tomahawk missiles capable of targeting small items well past 1,000 miles away. But, NATO Sea Sparrow techs invariably agree, it’s as much a part of our show of force out there in the world.

Don’t fuck with us.

“Speak softly, and carry a big stick,” our instructor quoted a former resident of 1600 Pennsylvania Avenue, referring to how the system intimidates. “Teddy Roosevelt said that. He was a very wise man – and kind of a bad-ass,” the instructor told us.

His enthusiasm incorporates a level of institutional knowledge that has been difficult to find elsewhere in my Navy experience. He’s more assured in the historical lexicon than the average sailor, who’s generally more interested in beating the next level in the Halo video game series. But his fluency in the Official Record notwithstanding, I winced internally at the praise heaped on the former president.

I’d been reading a lot of literature advocating the abolition of the imperial structure lately, most notably the late historian and World War II veteran Howard Zinn’s incredible account of American imperialism, “A People’s History of the United States,” and had grown queasy at any fond rumination on American presidents in general.

I wondered if any of my six classmates in the laboratory could sense my apprehension, but they were probably focused on the lesson at hand. The test went well; we went about our day; we always did; we always do. We salute, request permission and go ashore.

The next day, my wife and infant daughter and I visited a naval medical center. Waiting for medical staff to call us for 7-month-old Harper’s checkup, Hailey and I noticed a man amble in, mid-70s I guessed, wearing a faded blue T-shirt, tattered jeans, a pair of tan moccasins and a “Navy Veteran” ball cap. He sat in the seats across from us. His hair was tousled salt-and-pepper, his chin a field of unwieldy stubble, his eyes knowing, gray, friendly. He started making wink-and-twinkle jokes: “That’s a real beautiful little boy you got there,” he said about Harper, who wore her sparse dishwater hair in a small ponytail that stuck straight into the air from atop her head. “Those are such beautiful brown eyes,” he added about Harper’s deep blue optics.

The old man wanted to impart his naval acumen to me, the young third-class petty officer across the way. It would turn out he was a retired command master chief, the third-highest enlisted rank, behind only fleet or force master chiefs and the Master Chief Petty Officer of the Navy.

His pious, caretaking, Betty Crocker wife filled his prescription at the desk behind him while he informed me about ship passageways. I initially had no way of telling how true it all was, but he told me he suffered from early-onset dementia from “hitting my head too many times in the doorways designed for 5-foot-8 guys.” He’d been my height, he told me, 6-foot-1, but had diminished several inches, curled over like a question mark, over the cruel Civilian-World years.

His stories started slowly and shakily.

He was in the Navy during its heyday, and was one of the first CMCs under a program instituted in the mid-‘90s to facilitate better communication between the enlisted ranks and commanders. He imparted knowledge of how the chain of command should work in all situations – the CMC always goes through the next person in the chain, the executive officer, never straight to the captain. It’s just as it works at any level of rank; always go one rung at a time.

He gave me a history lesson on his final ship, the USS San Jacinto, one of the Navy’s early Aegis cruisers. His first act as CMC was to commission the San Jacinto in Houston in 1988, with then-Vice President George H.W. Bush in attendance. Bush was there because 44 years earlier he’d flown from the deck of an aircraft carrier of the same name in support of air missions during WWII. The future president of the United States had been shot down near Chichi-jima in the Pacific, and had a harrowing tale of escape.

Anyone with even a tenuous grasp on American politics knows Bush went on to become a Republican director of Central Intelligence and a politician who strongly supported the military mission of the United States, for whatever that’s worth.

The master chief held a grin and a nostalgic bravado describing his guidance of the future first lady, Barb, about the San Jacinto as it became the Navy’s 56th cruiser off the coast of ‘Murca-town. It was a celebration. The residents of Houston “treated my soldiers to burgers, beer; they really treated my sailors well.”

One last piece of advice as his wife approached, medication in hand, and he stood to leave: If I ever hit my head on a ship, I should get it logged in my medical record – the benefits require rigorous documentation.

He showed me his tattered blue VA photo ID confirming his former position, “so you know I’m not bullshittin’ you.” He shook my hand, but didn’t like the grip he had and readjusted to give my palm the engineman’s grip he’d used in the Navy before he became a CMC.

“That’s better; I used to turn wrenches, you know,” he said. He winked, let go of my hand and walked out the door.

“Have a good evening, master chief,” I said.

The interaction lasted no more than 10 minutes, but I already liked the guy. If I were still a drinker, I could have sat down for a beer with him, played a game of pool, heard more sea stories. I generally feel the same way about the far younger C school instructor who’d quoted Roosevelt. Their Americana is the truest, the bluest out there.

But with their laudatory commentary on the men who helmed the warship contraption – our military-industrial complex, which you can’t criticize without being accused of losing your humanity – my distaste for Americana only intensified. The idea of accomplishment by most American politicians, who’ve always held the American ideal of capitalist imperialism above the health of the body politic, grew more caustic. Roosevelt, Bush, everyone before, between and after – they were just muscles tightening the bony clutches of American expansionism, turning our green world ugly.

Teddy’s Big Stick

ROOSEVELT, A REPUBLICAN, is held, more than most, among liberals and intellectuals as one of the greatest American presidents, the truest of progressives, though he’d later be overshadowed by his fifth cousin, FDR. Teddy was hailed as a progressive champion of the betterment of the working class. But he was, in essence, a man who climbed to the top echelons of power – assistant secretary of the Navy, then vice president, then Most Powerful Man in the World – another apparition of the imperial erectile hyper-function that has always defined American foreign policy.

I’ll quote heavily from Zinn, so here I go: “Theodore Roosevelt wrote to a friend in the year 1897: ‘In strict confidence … I should welcome any war, for I think this country needs one.’”

The letter, one of the most direct and honest mandates for the military-industrial complex, was written during the Roosevelt ascendancy, and at a time of particular gloom among Americans. The country had been in an economic depression as American corporate empire-building sputtered to a halt at the Pacific Ocean. The economy had relied disproportionately on railroad expansion, in which the tycoons, in bed with the political structure, exploited and stole from American Indians and poor farmers just trying to earn an honest day’s wage. At the shining sea, they could go no farther. They’d wasted away their most important resource: frontier. Thousands of businesses and hundreds of banks closed, and unemployment remained higher than 10 percent from 1893 to 1898.

Inheriting the mess in 1897, the newly elected Republican William McKinley was desperate for new boundaries to push. He found them on the lush islands of Cuba, Puerto Rico, Guam and the Philippines, where the United States had invested millions in railroads, tobacco and mining. It was widely reported that McKinley didn’t want war with Spain, which would be marketed as retribution for Spain’s treatment of an already mobilized local rebellion in Cuba. McKinley didn’t want war – he needed it. A war would mean two big things: first, strategic expansion of the American marketplace abroad, which industry needed because of an overabundance of American industrial and agricultural production that had no release valve; second, the remobilization of the war economy, which, with the twentieth-century wars to come, would establish itself as a vital and clever gimmick in keeping America’s imperialist id as secure and largely unconscious as it has been since.

So a mysterious explosion that destroyed the battleship USS Maine – which sat just off the coast of Havana “as a symbol of American interest in Cuban events,” according to Zinn – offered a convenient catalyst for public support for war with Spain, after establishment newspapers in New York City blamed the explosion on Spanish forces.

Needless to say, America won the conflict and, in a $20 million settlement with Spain, annexed Puerto Rico, Guam and the Philippines, the last of which gave the States a theater in the burgeoning marketplace of China. The Filipinos’ nearly immediate insurrection in 1899 did not last, as Americans brutally quelled it, allegedly massacring the dissenters. As for Cuba, America forced upon it a deal that effectively enslaved it economically to the United States and allowed the American military to set up permanent shop there. This foreshadowed the establishment of the naval base at Guantanamo Bay, where a large number of innocent dark men the U.S. has accused of terrorist affiliation are on hunger strike and are having hoses shoved through their nasal passages as conduits for food to be pumped into their stomachs.

In the swirling clamor, Teddy Roosevelt, an Army colonel whose Rough Riders would orchestrate much of the violence in the American expansion in Cuba, got busy.

He was named assistant secretary of the Navy in 1897 by McKinley, was elected governor of New York in 1898 and was elected McKinley’s vice president in the 1900 presidential contest. McKinley was assassinated a year later, and Roosevelt was sworn in. He would serve two terms.

Though Roosevelt’s administration waged no declared war, he was arguably one of the most prolific imperialists in the history of the United States. He continued and strengthened the policies of McKinley and McKinley’s predecessor, Grover Cleveland, both rabid imperialists, by creating what’s known as the “Roosevelt Corollary to the Monroe Doctrine.” The policy held that, because America commanded the strongest assets – read: power – in the hemisphere, it had a moral responsibility to intervene in the economic policy of nearby Latin America. This would spark a century of exponentially more frequent coups d’état in the southerly parts of our side of the world to strengthen American capitalism and quell peoples’ movements.

Roosevelt’s “big stick.”

The Roosevelt administration tried to get Colombia to lease a part of the Panamanian isthmus to the United States for $10 million plus an annual $250,000. The administration was planning to pay a contractor, the New Panama Canal Company, $40 million to build a massive canal through the isthmus for a new trade route. When the Colombian government refused, the administration financially and militarily augmented a Panamanian uprising that facilitated America’s business interests, and, over a decade, the Panama Canal became reality. It set the stage for Bush 41’s funtivities in Panama eight and a half decades later.

Privately hostile to workers’ movements, Roosevelt also personally sought to quell the socialist uprising at home during his administration. Zinn reports Roosevelt’s response to an article written in a radical newspaper by socialist leader Eugene Debs, in which Debs called for a general strike if the government cracked down on the socialist movement: “Theodore Roosevelt, after reading this, sent a copy to his Attorney General … with a note: ‘Is it possible to proceed against Debs and the proprietor of this paper criminally?’”

Roosevelt didn’t have time for a crowd of 3,000 that represented the hundreds of thousands of child laborers exploited by corporations. When they marched on Oyster Bay to seek his advice on how to abolish child labor, he wouldn’t see them. He opposed a 1910 Supreme Court opinion that a “workmen’s compensation rule was unconstitutional because it deprived corporations of property without due process of law,” Zinn writes, because Roosevelt believed it would be a lightning rod for the Socialist Party. The Supreme Court had decades earlier spelled out the concept of corporate personhood under the Constitution’s Fourteenth Amendment.

Publicly, he worked to throw progressive movements a bone here and there, requiring food companies to accurately label their products and calling on Congress to limit the powers of large corporations. But this had an ancillary effect, much like all seemingly progressive policy in the United States, which before and since has worked in the interest of capital growth. As Zinn notes, Roosevelt made these slight concessions in private meetings with corporate representatives who promised him the veneer of progressivism in exchange for assurance that Roosevelt would guarantee the health of industry.

Progressive is how Roosevelt is seen, same as it ever was.

“The Environmental President”

THOUGH NOT QUITE so fondly as either Roosevelt, H.W. is also seen in a warm light, but that’s no accomplishment when the historical lens is muddied by the leadership of his son, one of the worst presidents in history.

As president, 41 approved policy that, on the surface, looks progressive. He signed the Clean Air Act in 1990 and used it to coin himself “the environmental president.” He promised no new taxes, inviting us to read his lips. His military ventures, most notably the Gulf War, are seen as smashing successes, as liberations of countries in need. Accordingly, he enjoyed a record approval rating of between 89 and 91 percent in 1991.

But, as with Roosevelt, and certainly a little more easily, when you start picking apart the details, the picture is not so rosy. His biggest environmental initiative, the Clean Air Act, was a lukewarm solution to a stunningly difficult problem (especially in terms of politics), and in subsequent years Congress gutted it financially and the administration withdrew support.

Zinn writes:

… two years after it was passed, it was seriously weakened by a new rule of the Environmental Protection Agency that allowed manufacturers to increase by 245 tons a year hazardous pollutants in the atmosphere.

Furthermore, little money was allocated for enforcement. Contaminated drinking water had caused over 100,000 illnesses between 1971 and 1985, according to an EPA report. But in Bush’s first year in office, while the EPA received 80,000 complaints of contaminated drinking water, only one in a hundred was investigated. And in 1991 and 1992, according to a private environmental group, the Natural Resources Defense Council, there were some 250,000 violations of the Safe Drinking Water Act (which had been passed during the Nixon administration).

Shortly after Bush took office, a government scientist prepared testimony for a Congressional committee on the dangerous effect of industrial uses of coal and other fossil fuel in contributing to “global warming,” a depletion of the earth’s protective ozone layer. The White House changed the testimony over the scientist’s objections, to minimize the danger (Boston Globe, Oct. 29, 1990). Again, business worries about regulation seemed to override the safety of the public.

At international conferences to deal with the perils of global warming, the European Community and Japan proposed specific levels and timetables for carbon dioxide emissions, in which the United States was the leading culprit. But, as the New York Times reported in the summer of 1991, “the Bush administration fears that … it would hurt the nation’s economy in the short term for no demonstrable long-term climatic benefit.” Scientific opinion was quite clear on the long-term benefit, but this was not as important as “the economy” – that is, the needs of corporations.

Sound familiar? H.W.’s entire administration, despite the forgetfulness of hindsight – it’s not 20/20, it appears – smacks of neoconservatism.

Bush enshrined the main policy tenet of neoconservatism in his speech at the 1988 Republican National Convention when he made a statement that played a big role in his losing a second term to Slick Willy four years later: “Read my lips: no new taxes.” Taken by itself, this little gem promised one dangerous third of the neoconservative agenda. The other two-thirds – keeping existing effective tax rates static or shrinking them, and rescinding the government’s essential responsibility to regulate businesses – were promised in other parts of the same speech. The address must have done his boss, the Gipper, proud. But, much like his predecessor, who had claimed office twice by promising to deregulate and detax, Bush felt the embarrassment that such unyielding rhetoric always invites. Facing a Republican Congressional base recalcitrant over his tacit support of a mixed approach – cutting spending but raising taxes to pay down Reagan’s $220 billion deficit – the president was forced to sign a bill, crafted and passed by the Democratic majorities of both chambers, that leaned more heavily on tax increases.

The Bible is full of truisms, and sayeth, “Pride goeth before a fall” – Proverbs 16:18. Clinton, the Great Talker, beat Bush out of a second term, and everyone blamed 41’s RNC lips.

Bush signed a ceremonial version of the North American Free Trade Agreement, which facilitated what politicians, including his sweet-talking successor, hailed as a “liberalization” of markets. The law, which dissolved a number of tariffs and trade barriers between the three northernmost North American countries, has allowed large corporations to amass ever larger fortunes, a typical pitfall of regional trade agreements. That expansion was predictably attended by exploitation of the poverty endemic to a developing country like Mexico, contributing to worsening squalor.

But his administration’s biggest economic eulogy would write itself militarily. It was his martial guard, much of which he picked up from the Reagan administration (and in turn bequeathed to his son), that began to formulate and codify the notion of a perpetual war economy. In reality, Bush’s military ventures were either self-serving or foolish or both, and always catastrophic. In December 1989, the same year Bush took office, Bush’s Defense Secretary, Yertle the Turtle, noticed a game brewing down south in America’s imperial-corporate corridor, Panama. According to Cheney’s profile:

Panama, controlled by General Manuel Antonio Noriega, the head of the country’s military, against whom a U.S. grand jury had entered an indictment for drug trafficking in February 1988, held Cheney’s attention almost from the time he took office. Using economic sanctions and political pressure, the United States mounted a campaign to drive Noriega from power. In May 1989 after Guillermo Endara had been duly elected president of Panama, Noriega nullified the election outcome, incurring intensified U.S. pressure on him. In October Noriega succeeded in quelling a military coup, but in December, after his defense forces shot a U.S. serviceman, 24,000 U.S. troops invaded Panama. Within a few days they achieved control and Endara assumed the presidency. U.S. forces arrested Noriega and flew him to Miami where he was held until his trial, which led to his conviction and imprisonment on racketeering and drug trafficking charges in April 1992.

All true, but the Establishment writer who penned this article neglected to note some important context. According to this quick Guardian rundown, in the years leading to his ouster, Noriega had served the U.S. government well by facilitating its war against the socialist Sandinista government in Nicaragua.

Why, after all, wouldn’t we fight against socialism in our backyard? The Sandinistas established high literacy rates, universal health care and gender equality. So Reagan sold arms to Iran, against his own government’s policy, and used the revenue to fund the Contras’ kidnap, rape, mutilation and murder of Nicaraguan civilians to spread capitalism. Human Rights Watch reported: “This is an important change from a human rights perspective, because the contras were major and systematic violators of the most basic standards of the laws of armed conflict, including by launching indiscriminate attacks on civilians, selectively murdering non-combatants, and mistreating prisoners.” But, hey, Reagan was an affable guy, so what the hell? That, though, is a deep rabbit-hole.

The point is, we the taxpayers paid Noriega through the CIA because his military junta allowed the U.S. to gather intelligence on the Nicaraguan government from outposts on Panamanian soil. But when Noriega became too dangerous a political liability for even a United States Republican to support, the Bush administration came up with a number of reasons to suddenly halt its support and CIA payments: that Noriega’s military government threatened the lives of the 35,000 American citizens living in Panama; that Noriega somehow posed a threat to the Carter-Torrijos Treaty, which would give the Panama Canal back to Panama in 1999; and that a cessation of support would stanch the flow of narcotics through Panama. So Bush invaded Panama to extract the at-large Noriega and to install Guillermo Endara, who’d run unsuccessfully against Noriega in the previous election – a leader we considered friendlier to American imperialist interests. During the month-long occupation in December 1989, 24 U.S. soldiers died, about 200 Panamanian soldiers lost their lives and an intensely argued-over number of Panamanian civilians were killed. The United States told us everything was OK because only a few hundred Panamanians died in the name of American fascism; Noriega’s associates claimed the number was much higher. Noriega was imprisoned in the U.S. and did nearly two decades of hard time in Florida. He has since been extradited around the globe, to France for money laundering and then to Panama for murder. According to the State Department, “Panama remains a transshipment crossroads for illicit trafficking due to its geographic location and the presence of the canal.” The people who died in the conflict, which the Bush administration called “Operation Just Cause,” are still dead.

Operation Just Cause was a simple jaunt into the by-then familiar territory of the Browner parts of the Western Hemisphere, where America had become very comfortable installing its own dictators and regressive capitalistic policies. (I’ll not go into the region’s entire history here, but I’ll suggest Naomi Klein’s “The Shock Doctrine.” It is fantastic journalism.)

The American intervention in the Middle East was, effectively if not intentionally, more ambitious. The Gulf War, known in Bush White House parlance as “Operation Desert Storm,” is seen in the nation’s classrooms through the lens of Official Narrative as the epitome of just intervention.

The Bush administration started marketing intervention in Iraq mainly as a necessary exercise of America’s moral authority in mitigating the human rights abuses perpetrated on Kurds and Iranians by Saddam Hussein’s military. Under that banner, the same one the next Bush would fly for Operation Iraqi Freedom, Western forces flew their terrifying planes and drove their mighty tanks into the Iraqi desert in what became an iconic flexing of military muscle, fighting, firing and bombing their way to Baghdad’s doorstep, but no farther.

In the end, it was simply a show of power. Why? Zinn writes:

Although in the course of the war Saddam Hussein has been depicted by U.S. officials and the press as another Hitler, the war ended short of a march into Baghdad, leaving Hussein in power. It seemed that the United States had wanted to weaken him, but not to eliminate him, in order to keep him as a balance against Iran. In the years before the Gulf War, the United States had sold arms to both Iran and Iraq, at different times favoring one or the other as part of the traditional “balance of power” strategy.

Zinn and a number of other journalists and historians also write that U.S. involvement in the Gulf conflict was intended to help secure Bush a second term. But those two benefits were accompanied by a third, which represented what was truly at stake for American corporatism: Saudi oil. Tricky Dick Jr. met several times with King Fahd to assure him that, in exchange for permission to choreograph Operation Desert Storm from Saudi soil, Saddam would be no more. Of course, though Hussein would be weakened, power was not wrested from him until the younger Bush’s presidency. (Cheney, who for a time took the helm of one of the biggest oil companies in the world, Halliburton, became a frequent houseguest in the Saudi kingdom during his vice presidency, and has visited more lately to discuss items that are unclear – you know, probably just catching up with old friends.)

But the bigger meaning of the Gulf War for many Iraqis, who were hardly liberated, was terror. According to Bloomberg Businessweek:

Although Cheney said shortly after the 1991 Gulf War that “we have no way of knowing precisely how many casualties occurred” during the fighting “and may never know,” Daponte had estimated otherwise: 13,000 civilians were killed directly by American and allied forces, and about 70,000 civilians died subsequently from war-related damage to medical facilities and supplies, the electric power grid, and the water system, she calculated.

In all, 40,000 Iraqi soldiers were killed in the conflict, she concluded, putting total Iraqi losses from the war and its aftermath at 158,000, including 86,194 men, 39,612 women, and 32,195 children.

The carnage was especially palpable in Fallujah, where, nearly a decade and a half later, military personnel I know yelled racial slurs against Iraqis as they loaded bombs bound for the city in 2004. As Jeremy Scahill writes in “Blackwater: The Rise of the World’s Most Powerful Mercenary Army”:

During the 1991 Gulf War, Fallujah was the site of one of the single greatest massacres attributed to “errant” bombs during a war that was painted as the dawn of the age of “smart” weaponry. Shortly after 3:00 p.m. on the afternoon of February 13, 1991, allied war planes thundered over the city, launching missiles at the massive steel bridge crossing the Euphrates River and connecting Fallujah to the main road in Baghdad. Having failed to bring the bridge down, the planes returned to Fallujah an hour later. “I saw eight planes,” recalled an eyewitness. “Six of them were circling as if they were covering. The other two carried out the attack.” British Tornado warplanes fired off several of the much-vaunted laser-guided “precision” missiles at the bridge. But at least three missed their supposed target, and one landed in a residential area some eight hundred yards from the bridge, smashing into a crowded apartment complex and slicing through a packed marketplace. In the end, local hospital officials said more than 130 people were killed that day and some 80 others were wounded. Many of the victims were children. An allied commander, Capt. David Henderson, said the planes’ laser system had malfunctioned. “As far as we were concerned, the bridge was a legitimate military target,” Henderson told reporters. “Unfortunately, it looks as though, despite our best efforts, bombs did land in the town.” He and other officials accused the Iraqi government of publicizing the “errant” bomb as part of a propaganda war, saying, “We should also remember the atrocities committed by Iraq against Iran with chemical warfare and against [its] own countrymen, the Kurds.” As rescue workers and survivors dug through the rubble of the apartment complex and neighboring shops, one Fallujan shouted at reporters, “Look what Bush did! For him Kuwait starts here.”

Whether or not it was an “errant” bomb, for the decade that followed the attack, it was remembered in Iraq as a massacre and would shape the way Fallujans later viewed the invading U.S. forces under the command of yet another President Bush.

All that is, of course, not to mention the horrific Gulf War Syndrome, a nasty cocktail of symptoms like chronic fatigue, diarrhea and joint pain suffered by more than a third of the American veterans of that conflict. No one knows what caused it, but evidence points to the U.S. military’s use of chemicals in warfare and ancillary activity.

The Run of the Mill

I REALLY COULD go on about how horrible Bush senior was, but it’s depressing, so I’ll get to the point: Bush, like Nixon and Roosevelt, was typical. Before Bush, Dwight Eisenhower oversaw the respective 1953 and ‘54 CIA coups against Iranian Prime Minister Mohammad Mosaddegh, who’d pissed off the Anglo-Iranian Oil Company (later British Petroleum) when he tried to nationalize Iranian oil, and Guatemalan President Jacobo Arbenz Guzman, who’d made enemies of the United Fruit Company by challenging its agricultural monopoly in the country. These coups, far from the first American overthrows of foreign governments, were special because they helped establish the CIA as the go-to agency for installing Western democracies – or Western trade sanctuaries, if we’re being honest – in foreign countries. Every president since has had a lot of fun with it.

The United States and its corporate fiends have invaded, staged military coups, financed the restructuring of leftist political and economic infrastructure by influencing academia, secretly installed dictators, sent conservative economic advisory groups or otherwise intervened in all but three of the 20 Latin American countries not owned by France: Venezuela, Paraguay and Colombia. The last was the only Latin American country to support the second Bush administration’s war on terror.

Since the turn of the twentieth century, the United States has taken the Roosevelt Corollary and applied it worldwide. Just since 1945, the end of World War II – to say nothing of the other types of meddling we love – the United States has bombed 17 countries outside our hemisphere. It wouldn’t be difficult to imagine that America, in its 237-year existence, has tried in some way to implement its corporatist itinerary in all 194 countries the State Department recognizes, as well as the territories it claimed in its westward invasion of North America. We’re certainly not alone here; Great Britain is known to have invaded nine of every 10 countries in the world.

Don’t look to the liberal wing for progress. After H.W., Clinton helped implement a large number of oppressive policies that, intentionally or not, appeased his corporate underwriters like the Martin Marietta Corporation. NAFTA ushered in pain for poor people and gain for the rich, widening the wealth gap. It also made it cheap and easy for multinationals to buy cheaper labor in other countries and bring the goods home. His vast derivatives deregulation, the Commodity Futures Modernization Act, signed the year before he left office, helped Wall Street bankrupt America – a disaster for which liberals, sleight-of-tongue masters that they are, disingenuously blamed Bush. The legislation earned Clinton fancy post-presidential digs doing what he did best: talking to cooing crowds from behind a lectern. He has since advocated at elegant confabs for a lower corporate tax rate in America.

W. – well, we all know about W., but Jonathan Chait of New York Magazine has an apt digest of just how much he sucked. Also, Rex Nutting of The Wall Street Journal’s MarketWatch puts it this way:

Bush had all the luck of Jimmy Carter, the attention to detail of Ronald Reagan, the adaptability of Lyndon Johnson, the abiding respect for the Constitution of Richard Nixon, the humility of Teddy Roosevelt, the rhetorical skills of Calvin Coolidge, the fiscal restraint of Franklin Roosevelt, the cronyism of Warren Harding, and the overreaching idealism of Woodrow Wilson.

Of course, the newest denizen of the white monstrosity at the exchange of Money and Power is no better. The wealth gap is huge, bigger than at any time under Bush. Obama has, in his latest grand statement on the matter, sworn fealty to those looking to boost their corporate profits by exacerbating climate change – the proliferation of natural gas – and failed to mention the real clincher: that we must, must, make some concessions in our standard of living if we are to solve the climate crisis. Yes, it’s a signature issue, and yes, he’s botching it. His Justice Department is wasting vast amounts of public money going rabidly after drugs – a nice way of saying it’s going rabidly after young black men, one of whom Obama, as he notes, was 35 years ago – while ignoring the systemic causes of America’s drug problems. And of course, Obama has expanded nearly every Bush national security program that he promised to scale back during his campaigns. Obama’s military-industrial complex is doing just fine, thank you.

A short jaunt into the test-beds of economic reductionism would have you believe otherwise, but capitalism in America is not on the downswing. It’s being codified in our nation’s infrastructure by the very corporations it benefits, written parasitically into the American government until we can’t tell where one begins and the other ends. Republicans might sound a little crazier, but it’s happening on both ends of the political spectrum.

Capitalism: The Auction-Block of Government

The first Bush may have sat a little more to the right of most Democrats, but not by much and only in symbolic ways. Clinton expanded his policies. Bush Jr. combined those policies with Dick Cheney’s dream of privatizing services that are, by their very essence, meant to be administered publicly; they’re too important, as journalist Naomi Klein says, to leave to a marketplace whose loyalties are fickle except when it comes to commerce. When you take away the notion of such services as fundamentally public enterprises, their administration is no longer accountable to the public it affects. And corporations are just doing what corporations do, so they’re immune to criticism too.

Scahill writes of former spy Robert Richer, later CEO of Total Intelligence, one of the many private security contractors made so successful by Bush’s war on terror:

In 2007 Richer told [The Washington] Post that now that he is in the private sector, foreign military officials and others are more willing to give him information than they were when he was with the CIA. He recalled a conversation with a general from a foreign military during which Richer was surprised at the potentially “classified” information the general revealed. When Richer asked why the general was giving him the information, he said the general responded, “If I tell it to an embassy official I’ve created espionage. You’re a business partner.”

Privatization of government takes away the public element, and the public is no longer a shareholder. He’s simply an onlooker, blind until he’s been fleeced and the thief has made off through a labyrinth of corporate alleyways with his booty of public treasure. And maybe that’s the point. Maybe the Cheneys and Bushes and Rumsfelds and Bremers, the Obamas and Clintons and Pelosis and Reids are needy in their own way: they need the needy to disappear, sequestered to scrap squabbles among themselves, biting at each other’s throats, as the diminishing middle class does at the jugular of the burgeoning poor, while the bourgeoisie leans back in a lawn chair next to a temperature-regulated pool with an umbrella-capped cocktail, and watches.

But then again, we’re not just bystanders. Bystanders have a responsibility to intervene, which we’ve abdicated to many of the very structures – government-funded watchdog groups and corporate media – that dangle the scraps. We squander rare opportunities to right ourselves when independent groups of intellectuals warn that something’s really wrong. We disdain intellectualism, we bristle at facts, we kill the messenger. That revulsion informs our worldview. We trust gut feelings that the earth self-heals from the worst wounds, so we can leave the light on. We let our most venal, reflexive proclivities get the best of us, and always when it’s most important that we don’t.

Historically, Americans have supported efforts to advance Western imperialism. By propping up the power structures, we cling desperately, with brittle fingernails, to the idea that if we scrape by and take our lumps until such time as we can win the Powerball or patent one of those ideas rolling about our brains, we’ll not have to worry anymore. Don’t fret, Margaret – I’ll be coming into some money soon. It’s stimulated by America’s profound flaw, the alcoholic’s penchant to “never trust a man who doesn’t drink.” David Simon called it the “callow insecurity that accompanies any cry of ‘America, right or wrong’ or ‘America, love it or leave it.’”

In his sweeping novel “East of Eden,” John Steinbeck wrote of how the residents of his home region, the Salinas Valley, treated an innocent, sad old German immigrant with a thick accent – and others – during World War I:

One Saturday night, they collected in a bar and marched in a column of fours out Central Avenue, saying, “Hup! Hup!” in unison. They tore down Mr. Fenchel’s white picket fence and burned the front of his house. No Kaiser-loving son of a bitch was going to get away with it with us. And then Salinas could hold up its head with San Jose.

Of course that made Watsonville get busy. They tarred and feathered a Pole they thought was a German. He had an accent.

We of Salinas did all of the things that are inevitably done in a war, and we thought the inevitable thoughts. We screamed over good rumors and died of panic at bad news. Everybody had a secret he had to spread obliquely to keep its identity as a secret. Our pattern of life changed in the usual manner. Wages and prices went up. A whisper of shortage caused us to buy and store food. Nice quiet ladies clawed one another over a can of tomatoes.

This is just how we’ve collectively acted toward brown people since 9/11. It’s how we act toward everyone who looks similar to a person who’s done something wrong. It’s how we act toward people who look like those our betters have told us have done something wrong, even when they haven’t.

All the leaders named above, every president not named and, with a few notable exceptions, the vast majority of their cabinet members have always worked in the interest of parasitic corporations or an arcane bourgeoisie before working in the interest of the body politic that voted them into office – and they’re praised for it. It’s written dramatically, and with a certain degree of permanence, into revisionist grade- and high-school curricula.

Zinn writes:

The treatment of heroes (Columbus) and their victims (the Arawaks) – the quiet acceptance of conquest and murder in the name of progress – is only one aspect of a certain approach to history, in which the past is told from the point of view of governments, conquerors, diplomats, leaders. It is as if they, like Columbus, deserve universal acceptance, as if they – the Founding Fathers, Jackson, Lincoln, Wilson, Roosevelt, Kennedy, the leading members of Congress, the famous Justices of the Supreme Court – represent the nation as a whole. The pretense is that there really is such a thing as “the United States,” subject to occasional conflicts and quarrels, but fundamentally a community of people with common interests. It is as if there really is a “national interest” represented in the Constitution, in territorial expansion, in the laws passed by Congress, the decisions of the courts, the development of capitalism, the culture of education and the mass media.

You’d almost prefer apathy in public school education, if only so the lies would go in one ear and out the other, wasted on the winds, and death would come by pure complacency and not the worst thing we have.

All the problems engendered by this modus revolve around one concept: our addiction to capitalism. Every significant military- and foreign-policy action taken by the United States – with very few exceptions, like the American signing of the Geneva Conventions or our participation, hesitant and ceremonial though it always is, in intergovernmental firesides on how to deal with climate change – is in the interest of expanding American market share in the world economy. It’s always about money. We measure American vitality against the size of our marketplace with manic, nebulous metrics like gross domestic product, jobless claims and fluctuations in the economic confidence index. We measure these instead of the vibrancy of our culture or the conditions under which we are truly happy or the quality of our compassion toward one another. Money has come to embody our culture, our happiness, our compassion – there is nothing outside of it. Otherwise happy couples divorce over it, and the prescription always comes down to personal frugality: Do well by your money, and you’ll be happy; do poorly by it, and no matter how well you handle other aspects of life, you’ll be screwed. “Money makes the world go ‘round,” they tell us, and we smile and unquestioningly nod. The more fundamental question – whether we should have a social system with money as its foundation, especially one of which Americans have proven themselves incredibly poor, egocentric and greedy stewards – is never asked, never thought of.

This frenzied, rabid struggle for money above all things necessitates the mistreatment of others. Take climate change. The conservative establishment, in its never-ending lip service to deregulation, must deny the fact of human-caused temperature shifts. If it doesn’t, it will be forced to admit the problem requires regulation, which threatens not only the profits of its biggest investors, but its religious and social views. Naomi Klein addressed the recent drop in public belief that global warming exists – and that, if it does, it is caused by humans – in an interview with PBS’s Bill Moyers:

Climate change is, I would argue, the greatest single free-market failure. This is what happens when you don’t regulate corporations and you allow them to treat the atmosphere as an open sewer. So it isn’t just, OK the fossil fuel companies want to protect their profits; it’s that this science threatens a worldview. And when you dig deeper, when you drill down into those statistics about the drop in belief in climate change, what you see is that Democrats still believe in climate change in the 70th percentile. That whole drop-off in belief has happened on the right side of the political spectrum. … People who have very strong conservative political beliefs cannot deal with this science because it threatens everything else they believe.

To keep the status quo, we’ll drown nations. Climate is only the most important thing; we have backups. We’ll also go after national heroes, telling them they’ve aided and abetted the enemy, for whatever that’s worth. We wield the traitor brush as if it’s some original, profound and horrifying observation that, holy shit, terrorists are able to buy technology that connects them to the leaked information. So I’d love to know precisely why, if Bradley Manning and Edward Snowden and Julian Assange represent such a boon for terrorists, prosecutors have not cited any specific instances in which terrorists used the information the leakers disclosed to conduct violence. The truth is, establishment forces must do everything they can to degrade efforts at transparency, because those efforts portend public dissent against the war machine and its surveillance state.

Even so, we turn blind eyes, toe the party line and vote for whom the media expects us to. But the culprit of our social ills is quite clear when the backrooms in which our laws are written are monitored by the most courageous journalists.

Why Socialism?

The next leap is to identify a way to neutralize the problem. A number of philosophers and writers look to the historical precedent of nations that have been successful – at least up to the point at which America or another Western empire intervened – in their experiments with egalitarianism.

These people are quickly written off by conservative intellectuals in palpable ways.

Read Steven Plaut, a professor of economics at Israel’s University of Haifa. He wrote in 2011 in Frontpage Mag – whose online banner motto ominously warns, “Inside Every Liberal Is A Totalitarian Screaming to Get Out” – about the Scandinavian countries, which liberals uphold as examples of good socialism. He says Scandinavia has low rates of poverty not because it is relatively socialistic, but because Scandinavians, almost as a rule, are stalwart partisans of parsimony who put their fine backs and strong shoulders to good work, producing capital:

The interesting question is whether the low poverty rates there are thanks to the economic system or thanks to Scandinavians being hard-working thrifty disciplined people.  That Scandinavians are hard-working is evident from the fact that in spite of enormous benefits in Sweden for the unemployed and for those who do not work, creating incentives to avoid work, Sweden has a labor force participation rate that is one of the highest in Europe.

One way to test our question is to examine Scandinavians who do not live in Scandinavia.  There is a large Scandinavian population that lives in the bad-old-selfish-materialist-capitalist United States.  Well, it turns out that Scandinavians living under its selfish capitalism also have remarkably low poverty rates.  Economists Geranda Notten and Chris de Neubourg have studied Scandinavians living in the US and in Sweden and compared their poverty rates.  They estimate the poverty rate for Scandinavians living in the United States as 6.7%, half that of the general U.S population.  Using measures and definitions of poverty like those used in the US, the same analysts calculate the poverty rate in Sweden using the American poverty threshold as an identical 6.7% (although it was 10% using an alternative measure).   So low poverty among Scandinavians seems to be because Scandinavians work, whether or not Scandinavian “socialism” can be said to work.

But Plaut’s thesis rings hollow – as do those that logically flow from it – when you work past the conceit in his theorem.

To say that poverty is low in Scandinavia simply because Scandinavians have an excellent work ethic – which I’m sure they do – is to say that America has a higher poverty rate because the people who fall into that category are moochers. But more than 90 percent of entitlement benefits meant for the poor in America go to the elderly (most of whom worked until late in life), to the disabled or to working families. An infinitesimal portion of benefits in America goes to people who drift on the waves of the social welfare system. Perhaps the Swedes work hard because they have productive, socially administered means of procuring decent employment. This contrasts with the vast – sometimes unquantifiable, because many of these budgets are “dark” – amount of American taxpayer money that goes toward subsidies for, say, the war machine, the oil industry or the health care apparatus.

Plaut examines a number of statistics about poverty among migrants – Scandinavians who travel to America and don’t fall into the poverty trap, while “moochers” from Scandinavia’s south travel to Scandinavia and still live in squalor. But he fails to notice the dynamic of white gentry that feeds poverty in racially diverse countries like the U.S. or most European countries. Scandinavia has an incredibly homogeneous population. According to the CIA, the resident population of Finland is 93.4 percent Finnish and 5.6 percent Swedish; Norway is 94.4 percent Norwegian; and according to Eurostat, Sweden is 87.7 percent Swedish. In America, whites face far less hardship in finding jobs than their ethnic counterparts. Historically, American policy has also disproportionately favored whites in building a base for success – in acquiring a set of bootstraps by which to pull themselves up. For example, post-Civil War Scandinavian immigrants used the federal homestead acts, which gave portions of public land to private homesteaders but originally excluded blacks, to found their successful futures. It would be reductive to say Scandinavia is without race problems; indeed, the region needs profound self-reflection to right many ethnic wrongs. But America has a far more racially divided history than Scandinavia, and if you’ve been reading the front pages, racism in America is not over by a long shot. Read in this light, Plaut’s column becomes a disingenuous denial of the existence of the American poverty trap, especially in ethnic communities.

Plaut cites a digest of poverty statistics by country that focuses on the percentage of the population living below the poverty line, ranked from the highest poverty rates to the lowest. He notes that Switzerland, a capitalist paradise (which, incidentally, provides universal health care and requires many of its men to own a gun), comes in at 146 of 153, beating out all the Scandinavian countries. But he fails to note that Switzerland is beaten by three positions by notoriously socialist Ireland, whose government controls health care, education, banks and many businesses, and whose public spending policies are credited with a turnaround from its recent economic recession. The rest of Switzerland’s successors are an amalgam of different types of economies: France, Austria, Malaysia, Lithuania, China and Taiwan. It seems Switzerland is, in a way, its own Plaut anathema. Maybe he needs a different yardstick. Indeed, let’s talk about other metrics of success, like free access to good health care and education, excellent job benefits like mandatory maternity leave, and 100 percent literacy rates – all proven correlates of better economic vitality. Maybe those, when juxtaposed against the abysmal records of the United States, would be convincing.

Twice in the column, Plaut writes about real evidence of social dynamics in real places as if they existed in a hypothetical universe, a typical obfuscation tactic for those on the right. He notes the inconsistent protocol across different economies of measuring poverty, which could suggest poverty measured by different standards might mean Scandinavia has a higher poverty rate than it had seemed when viewed through a more Western lens. He writes:

The definition of ‘poverty’ and its measurement are both highly problematic, and both vary dramatically, making inter-country comparisons difficult. In all countries there are serious problems with the measures. Wealthy people are sometimes counted as part of the population below the poverty line, as long as their current income happens to be low.  Examples are retired people and students.  The poverty statistics are based on reported incomes, meaning that lots of people living high on the hog are counted as poor because they do not report their income at all to the tax authorities, earning income from the “shadow economy.”  Poverty is generally measured by income, not consumption.  It is often measured as a percent of median income, not by material hardship, or by the rather silly “Gini coefficient.”  If every single person discovered a petroleum well in his yard, poverty rates would not change much.

OK, Justice Scalia – poverty statistics are flawed, just like every other kind. That’s why studies based on data and statistics carry margins of error. But by Plaut’s logic, Scandinavia might also have a smaller poverty rate, relative to Western countries, than reported. If we’re calling into question the statistical accuracy of the evidence upon which we base our suppositions, why bother to draw a conclusion in the first place?

Here, too, we find the same problem: Scandinavian countries, while far more socially progressive than many other Western democracies, are, as Plaut points out, not entirely socialistic. He seems to imply, by citing Scandinavians critical of socialism, that if corporations in Scandinavia were on an even longer leash, poverty might be completely eliminated. Again, we must lean on the if-then-might argument to mollify our absence of science – an argument in which the opposite conclusion is equally valid.

All this leads to a magnificent crescendo in the interest of discrediting socialism, and it’s a common conservative talking point on Scandinavia: “The conclusion can only be one thing.  The low poverty rate among Scandinavians in Scandinavian countries is thanks to the fact that Scandinavians work.  It is NOT because socialism works!”

Plaut separates Scandinavia as an aberration of the philosophy: Scandinavians are nothing like those goddamn Browns who tried to run their own countries to America’s south and to Asia’s west. They don’t, somehow, look to mooch off the waning faction of their society that produces the wealth.

The conservatives are perplexed; how does this work? The conclusion can only be one thing …

And, boom and bust, on we go.

This hypothesis ignores that Venezuela’s Chávez, Chile’s Allende, Iran’s Mosaddegh, and so on brought their people up from poverty, establishing vibrant local economies at the expense of multinational corporations. They placed the ownership of their countries back in the hands of the people. America neutralized or ostracized them. Where we could, we showed up with squadrons of militants to hack the head off the socialist Hydra, and conservative philosophers cauterized the wound. We never let true socialism work, and where we couldn’t stop it, we isolated it like a leper. So, Dr. Plaut, we can’t say it doesn’t work. And by any remote indication, it would work far less destructively than the corporatism upon which we base our lives.

But Plaut’s only the mainstream; the tributaries of capitalism touch all the backwoods natives who like to think themselves populist while railing against Occupy Wall Street and the “orgies in the street” Glenn Beck’s henchmen swear they saw. The arguments against socialism from conservatives of every type, even the humanitarians, abound.

Men and women, blacks, Hispanics, Asians, whites, LGBT people, old and young alike should have the freedom to work hard in the craft of their choice while making meaningful contributions to society. That’s the libertarian argument for capitalism over socialism. We should rely solely on charity and church to mitigate poverty or soothe mental illness; they’ll get it done faster and at half the cost. Libertarians say the simple liberalization of markets – the abolition of government, if we’re blunt about it – would hand power back to the people and eliminate the problems our form of government poses. It’s true that pure liberalization of the marketplace would jettison the festering sinkhole of corporate welfare, but that’s as far as it would go. We’d hope some group of wealthy eccentrics would be crazy enough to put up the billions to maintain and revise the transportation system. We’d cross our fingers and wait for publicly funded projects like the Internet to be realized. We’d wonder: if only there were a mechanism for corporate oversight, companies might face consequences when they dump cyanide in water supplies, enslave children under the pretense of lifting them from poverty, or indiscriminately harvest whatever they feel necessary to bolster the bottom line. I’m not writing in hypotheticals or hyperbole. Multinationals do this as second nature in other countries, where every day is a tax holiday, every worker is low-wage and every river is a sewer.

Still others say socialism is the problem, that America’s pendulum is stuck left. This argument, often made in depressingly vapid earnest, is built on a fundamental misunderstanding of the basic definitions of economic philosophies. Consider this pearl of prose from American Thinker, in which Peter Ferrara founds his entire hypothesis, and frames all his dubious data, on the notion that Barry is a Marxist, Red as they come. Perhaps Ferrara should buy a dictionary and reference it before going to work.

I’ll use the same example to which most neoconservatives first turn when they call Obama a socialist: the Affordable Care Act or “Obamacare,” a law originally dreamed up by a consortium of conservative economists. Former health care executive J.D. Kleinke explains in The New York Times:

The president’s program extends the current health care system — mostly employer-based coverage, administered by commercial health insurers, with care delivered by fee-for-service doctors and hospitals – by removing the biggest obstacles to that system’s functioning like a competitive marketplace.

Chief among these obstacles are market limitations imposed by the problematic nature of health insurance, which requires that younger, healthier people subsidize older, sicker ones. Because such participation is often expensive and always voluntary, millions have simply opted out, a risky bet emboldened by the 24/7 presence of the heavily subsidized emergency room down the street. The health care law forcibly repatriates these gamblers, along with those who cannot afford to participate in a market that ultimately cross-subsidizes their medical misfortunes anyway, when they get sick and show up in that E.R. And it outlaws discrimination against those who want to participate but cannot because of their medical histories. Put aside the considerable legislative detritus of the act, and its aim is clear: to rationalize a dysfunctional health insurance marketplace.

This explains why the health insurance industry has been quietly supporting the plan all along. It levels the playing field, expanding the potential market by tens of millions of new customers. Hardly a government takeover of health care. Basically, the law simply guarantees the health insurance industry a clientele by requiring Americans to buy its product – with the compromise that insurers can’t refuse anyone – without addressing the real problem in American health care: rampant price gouging in health care administration. Corporate welfare, fascism Western-style, at its finest.

The conservatives use climate change as another non-starter for progress. Obama’s vaunted speech on the world’s biggest problem was predictably maligned by the crazies – not as too timid, but as too aggressive – even though the president, as he did with health care, basically promised the fossil fuel industry a windfall in moving forward.

Under guise of progress, such is Obama Policy.

This is the opposite of socialism, which is defined by Encyclopedia Britannica as “social and economic doctrine that calls for public rather than private ownership or control of property and natural resources.” Meaning not that everyone would own each other’s furniture, but that everyone would own a say in the way the most important resources – like water and food and energy, education and information and health care – are gathered and distributed.

Of course, like any ideology, the idea of socialism deserves criticism, and it can’t be realized without generous flexibility. But the argument isn’t to nitpick the particulars of history, in which, yes, many evil men have used the pretext of socialism to establish one-man rule. It’s to bring the definition of society back to its roots, to redistribute ownership and stewardship of public domain from the hands of a wealthy few and back to the public. This requires reflection and acknowledgement of socialism’s flaws.

Responsible socialism would not aim to completely eliminate poverty and all other social ills, as Plaut says proponents of socialism claim; the best socialists claim no such thing. Just as responsible gun control activists say their proposals would reduce gun violence – as similar measures have in many other countries – rather than eliminate it, socialists would simply reduce poverty and other dangerous social paradigms while implementing safety mechanisms for those who fall through the cracks.

No sane person argues for a rigid economy that is planned down to the cent in every respect. Naomi Klein notes that, though she indicts free-market philosophy with her journalism, she doesn’t think a fundamentalist socialist economy would work:

I think that mixed economies work better than a fundamentalist market system. And I’m not a utopian, and I don’t believe that it’s perfect, and there’s still gonna be violence, there’s still gonna be repression, there’s still gonna be poor people. But by acceptable U.N. measures of a standard of living, what we see is countries that have a mixed economy, i.e. have markets, people are able to go shopping … but also have social protections that identify areas that are too important to leave to the market – whether it’s education, health care – the minimal standard of life that everybody must have.

Under socialism, where we the people own the information, where we own the government, Nixon would have been rightly castigated and jailed for withholding the White House tapes from the public. No salute barked from steel and gunpowder would have been tolerated at his funeral. The backroom deal Obama struck with the health insurance industry’s top lobbyist – requiring Americans to buy insurance in exchange for the industry’s support of the Affordable Care Act – would have been subject to scrutiny and criticism beforehand, and the decision made among the masses. Ed Snowden and Bradley Manning, possibly our two most important public servants, would not be in their respective hiding place and prison cell for exposing the corruption they did.

The world is not perfect; it never will be. That’s why Zinn places an asterisk on straight utopianism. He makes perhaps the most eloquent argument for an egalitarian society, a redistribution of true wealth – not of those small encoded green sheets of paper without which most Americans are so enslaved they can’t imagine a United States, but of the power with which to oversee our own policymaking, absent the festering corporate influence that has diseased our polity:

With the Establishment’s inability either to solve severe economic problems at home or to manufacture abroad a safety valve for domestic discontent, Americans might be ready to demand not just more tinkering, more reform laws, another reshuffling of the same deck, another New Deal, but radical change. Let us be utopian for a moment so that when we get realistic again it is not that “realism” so useful to the Establishment in its discouragement of action, that “realism” anchored to a certain kind of history empty of surprise. Let us imagine what radical change would require of us all.

The society’s levers of power would have to be taken away from those whose drives have led to the present state – the giant corporations, the military, and their politician collaborators. We would need – by a coordinated effort of local groups all over the country – to reconstruct the economy for both efficiency and justice, producing in a cooperative way what people need most. We would start in our neighborhoods, our cities, our workplaces. Work of some kind would be needed by everyone, including people now kept out of the work force – children, old people, “handicapped” people. Society could use the enormous energy now idle, the skills and talents now unused. Everyone could share the routine but necessary jobs for a few hours a day, and leave most of the time free for enjoyment, creativity, labors of love, and yet produce enough for an equal and ample distribution of goods. Certain things would be abundant enough to be taken out of the money system and be available – free – to everyone: food, housing, health care, education, transportation.

Socialism pits compassion against the egoisms of capitalism that inform and perpetuate our status quo. If we could take a deep breath and think big toward our fellow man, then equality, literacy, free education, happiness, et al. could win out over hierarchy, corporate welfare, money and patriarchy.

Socialism, if we let it, would neutralize the mechanisms of manipulation and obfuscation by which we’ve accrued our distresses at the intersection of money and power, by entrusting the traffic direction of both vast avenues to the people.

It could provide an administrative infrastructure and an existential tolerance to relieve the problems of socioeconomic stratification our brothers and sisters face daily.

It would create an environment in which we could mobilize ourselves to solve the biggest challenges we face, most notably climate change.

Most importantly, it could pit our honesty against our addiction. Like the alcoholic who’s quit drinking, but is in a perpetual state of getting better, we could stop lying to ourselves.

But I’m getting ahead of myself. All this requires a structural overthrow. I hope I’m there to chronicle that battle.

Digest – Suicide Note, Washington Incest, Etc.

A look back at “Fatal Distraction,” a devastating, Pulitzer Prize-winning venture into the lives of men and women who accidentally leave infants and toddlers to die in hot cars, by Gene Weingarten, a humor writer for The Washington Post. It’s timely after a 3-year-old died this month in a hot car in Florida, and it should be required reading for parents – and for people thinking about chastising a parent who makes the “inexplicable, inexcusable mistake”:

Two decades ago, this was relatively rare. But in the early 1990s, car-safety experts declared that passenger-side front airbags could kill children, and they recommended that child seats be moved to the back of the car; then, for even more safety for the very young, that the baby seats be pivoted to face the rear. If few foresaw the tragic consequence of the lessened visibility of the child . . . well, who can blame them? What kind of person forgets a baby?

The wealthy do, it turns out. And the poor, and the middle class. Parents of all ages and ethnicities do it. Mothers are just as likely to do it as fathers. It happens to the chronically absent-minded and to the fanatically organized, to the college-educated and to the marginally literate. In the last 10 years, it has happened to a dentist. A postal clerk. A social worker. A police officer. An accountant. A soldier. A paralegal. An electrician. A Protestant clergyman. A rabbinical student. A nurse. A construction worker. An assistant principal. It happened to a mental health counselor, a college professor and a pizza chef. It happened to a pediatrician. It happened to a rocket scientist.

A suicide note, titled “’I Am Sorry That It Has Come to This’: A Soldier’s Last Words.” It might be one of the worst, but will certainly be one of the most important, things you ever read:

Daniel Somers was a veteran of Operation Iraqi Freedom. He was part of Task Force Lightning, an intelligence unit. In 2004-2005, he was mainly assigned to a Tactical Human-Intelligence Team (THT) in Baghdad, Iraq, where he ran more than 400 combat missions as a machine gunner in the turret of a Humvee, interviewed countless Iraqis ranging from concerned citizens to community leaders and government officials, and interrogated dozens of insurgents and terrorist suspects. In 2006-2007, Daniel worked with Joint Special Operations Command (JSOC) through his former unit in Mosul where he ran the Northern Iraq Intelligence Center. His official role was as a senior analyst for the Levant (Lebanon, Syria, Jordan, Israel, and part of Turkey). Daniel suffered greatly from PTSD and had been diagnosed with traumatic brain injury and several other war-related conditions. On June 10, 2013, Daniel wrote the following letter to his family before taking his life. Daniel was 30 years old. His wife and family have given permission to publish it.

Awesome, useful details from The New York Times’s Eric Lichtblau on America’s most secretive judicial entity, the FISA court, in “In Secret, Court Vastly Broadens Powers of N.S.A.”:

Unlike the Supreme Court, the FISA court hears from only one side in the case — the government — and its findings are almost never made public. A Court of Review is empaneled to hear appeals, but that is known to have happened only a handful of times in the court’s history, and no case has ever been taken to the Supreme Court. In fact, it is not clear in all circumstances whether Internet and phone companies that are turning over the reams of data even have the right to appear before the FISA court.

Created by Congress in 1978 as a check against wiretapping abuses by the government, the court meets in a secure, nondescript room in the federal courthouse in Washington. All of the current 11 judges, who serve seven-year terms, were appointed to the special court by Chief Justice John G. Roberts Jr., and 10 of them were nominated to the bench by Republican presidents. Most hail from districts outside the capital and come in rotating shifts to hear surveillance applications; a single judge signs most surveillance orders, which totaled nearly 1,800 last year. None of the requests from the intelligence agencies was denied, according to the court.

An old but essential essay from Seymour Hersh in The New Yorker on the horrors of Abu Ghraib, “Torture at Abu Ghraib”:

Taguba’s report listed some of the wrongdoing:

Breaking chemical lights and pouring the phosphoric liquid on detainees; pouring cold water on naked detainees; beating detainees with a broom handle and a chair; threatening male detainees with rape; allowing a military police guard to stitch the wound of a detainee who was injured after being slammed against the wall in his cell; sodomizing a detainee with a chemical light and perhaps a broom stick, and using military working dogs to frighten and intimidate detainees with threats of attack, and in one instance actually biting a detainee.

More than just a look into the insular yet publicly narcissistic world of Washington’s political operatives, Mark Leibovich’s profile of the rising and volatile star on California Republican Representative Darrell Issa’s team is a deep look into how relationships between reporters and staffers work. More than that, it details how much time Washington’s media and politicians spend serving selfish prurience instead of the public good. Lots of detail, lots of intrigue. It’s fascinating. Here’s “How to Win in Washington.”

Bardella evinced a desperation that made him more honest than people in Washington typically are. Or maybe “transparent” is a better word, because he did seem to lie sometimes (or “spin” sometimes), at least to me. Even as he stuck out among earnest Hill deputies, something about Bardella wonderfully embodied the place. It’s not that Washington hasn’t forever been populated by high-reaching fireballs. But an economic and information boom in recent years has transformed the city in ways that go well beyond the standard profile of dysfunction. To say that today’s Washington is too partisan and out of touch is to miss a much more important truth — that rather than being hopelessly divided, it is hopelessly interconnected. It misses the degree to which New Media has both democratized the political conversation and accentuated Washington’s myopic, self-loving tendencies. And it misses, most of all, how an operator like Kurt Bardella can land in a culture of beautifully busy people and, by trading on all the self-interest and egomania that knows no political affiliation, rewrite the story of his own life.

What We Should Instead Celebrate this Fourth

I submitted an essay last July 4 to a Navy organization called Morale, Welfare and Recreation, which organizes funtivities and other similar contests, under the bullshit prompt, “What does Independence Day mean to you?” My essay went like this:

Leftist sailors have much to be conflicted about. I’m one, and I’m here to explain:

Before I joined the Navy, I was conflicted about serving an organization that orchestrated campaigns with which I sometimes disagreed. Our war efforts seemed unwarranted and productive only for those who stood to enjoy financial gain from them. I was frustrated with how our leaders handled military business.

But, after I dithered, interjected and finally enlisted in the face of a fat, ugly personal finance disaster, I looked past the military’s surface-level iniquities and justified them as necessary. And I got over my hesitation.

Not just because they’re a small part of something that’s making my life stable again.

Here’s the bigger thing: even as I gather mental sticky notes adding to an already respectable collection of items I disagree with, I’m also inundated by a cascade of reasons America should appreciate its military.

There at the top of the list is that biggest idea, which says if America didn’t have its military, it wouldn’t have what makes it itself. Some people call it “freedom” or “independence,” but it’s headier than what a single word can describe.

Whatever it is, it ebbs and flows in our hearts and heads until, every July 4, we pour it all out and it culminates in a deafening crescendo of fireworks in night skies over separate American towns.

Separate, but united in the same black night.

We, as sailors, whether we’re liberal, conservative, Christian, or we worship the Flying Spaghetti Monster, take it more seriously than the average American.

We’re the ones who stood at parade rest in the sloppiest ranks of boot camp while officers invoked messages of patriotism to Lee Greenwood’s formerly cheesy-sounding “Proud To Be An American,” and we felt it for the first time. We felt that idea creeping from our stomachs up through our hearts and throats and out onto our skin in thick goose bumps.

Some people take it for granted. Just another day. They might even say: because Hedge is a liberal and he sometimes disagrees, he’s not allowed to be patriotic. But that’s the whole point. They’re free to say that because we’re here. We can agree to disagree, as the cliché goes.

Of course, it’s more profound than just a few instances when people disagreed with me and exercised their freedom of speech; that’s only a small example.

The idea that we’re free to view the world in the way that makes the most sense expands into a collective school of thought, until we realize we actually sync in a bigger way than the ways we fight. Thus, we become a fluid, formidable singularity. Discord, hesitation and insecurity cease to matter. We’re the same country again.

Before I joined, I was very conflicted. I still am.

But now, I know that’s what all those fireworks late on Independence Day represent to me: the American freedom to think, feel and speak out, which citizens of too many other countries can’t do.

Some of that’s true, I guess. It won me an iPad, from which I sometimes edit this blog. But something’s happened since. I’ve read quite a bit, and reshaped my worldview. It’s changed so much that I’m shocked at my own naivety just a year ago. “Formidable singularity?” Those words were written by a different person. What the hell was I talking about? Was I in a haze from swearing in just a few months prior? Was I subconsciously justifying my enlistment to flee the shame of having to do something that goes against what I know is right? Was I trying to impress the contest judges?

Whatever the answer, I’d like to point out this year that Independence Day is in fact simply a celebration of a gathering of a bunch of men who didn’t want to pay taxes and wanted to keep fucking their slave girls. In their collusion, they would establish their own independent aristocracy in which the people would still founder. “We the People” were, more specifically, we the privileged white men with fancy hats and sprawling estates. Them we should lament.

The true heroes were the poor and the exploited who rose up in the face of the gentry. We should instead celebrate Tecumseh, who fought in the War of 1812 to stanch the inexorable white westward movement in the Ohio River Valley. We should pay tribute to the weary slavery abolitionists President Lincoln was eventually forced to heed as the tide turned during the Civil War, more than 300 years after Spain outlawed slavery. We should remember Eugene Debs, who helped found the Industrial Workers of the World.

Of course, as a nation, we’ll never do this, because as the fictional Jackie Cogan aptly noted, “America’s not a country; America’s a business.” It would make no sense for the institution to celebrate the very people’s movements aiming to destroy the structure. But as individuals, we still cling – some with more urgency than others – to that idea of a free, equal and decent society, and that we should celebrate.

Weekly Digest – A Tribute to Hastings, A Look Back on Dollard, Etc.

In honor of the late Michael Hastings, the fearless BuzzFeed reporter and author of several influential books who ended the career of counterinsurgency apostle General Stanley McChrystal on the pages of Rolling Stone, here’s his story on McChrystal. Perhaps more important than his legacy as the man who brought down a four-star is his astute commentary on the larger effort in Afghanistan, the war McChrystal was leading when he fell. “The Runaway General”:

When it comes to Afghanistan, history is not on McChrystal’s side. The only foreign invader to have any success here was Genghis Khan – and he wasn’t hampered by things like human rights, economic development and press scrutiny. The COIN doctrine, bizarrely, draws inspiration from some of the biggest Western military embarrassments in recent memory: France’s nasty war in Algeria (lost in 1962) and the American misadventure in Vietnam (lost in 1975). McChrystal, like other advocates of COIN, readily acknowledges that counterinsurgency campaigns are inherently messy, expensive and easy to lose. “Even Afghans are confused by Afghanistan,” he says. But even if he somehow manages to succeed, after years of bloody fighting with Afghan kids who pose no threat to the U.S. homeland, the war will do little to shut down Al Qaeda, which has shifted its operations to Pakistan. Dispatching 150,000 troops to build new schools, roads, mosques and water-treatment facilities around Kandahar is like trying to stop the drug war in Mexico by occupying Arkansas and building Baptist churches in Little Rock. “It’s all very cynical, politically,” says Marc Sageman, a former CIA case officer who has extensive experience in the region. “Afghanistan is not in our vital interest – there’s nothing for us there.” 

R.I.P., Michael Hastings

P.S. Here’s his final post for BuzzFeed: a searing commentary on how fucking evil the government’s domestic surveillance program is and how fucking duplicitous the Democrats are. “Why Democrats Love To Spy On Americans”:

Vintage: War pits the classes against each other, allowing only those at the top the wherewithal to seek solace. Evan Wright, whom I met briefly at a Miami book signing for “Generation Kill,” profiles Pat Dollard, a neoconservative movie, entertainment and journalism executive, in Vanity Fair. The story, “Pat Dollard’s War on Hollywood,” is old, but reading it anew reveals a prescient indictment of all forms of American addiction. It’s well worth revisiting. On a war documentary Dollard is editing during Wright’s reporting, Wright writes:

There is evidence of a possible war crime in the trailer: a Marine clutches the head of a dead Iraqi and raises it in front of the camera like a jack-o’-lantern. (This footage was given to Dollard by troops, and he claims not to know the provenance of the decapitated man, or why a Marine was playing with his severed head.) In Dollard’s presentation, the act of desecration, accompanied by the faces of grinning Marines, is treated as a macabre joke. By intercutting this with actual Jackass footage, the trailer seems to suggest that, for the young, wild, and patriotic American, war in Iraq is sort of like the ultimate Jackass.

When I mention to Dollard that his severed-head scene might turn more Americans against the war, or even against the troops, he laughs. “The true savagery in this war is being committed by the American left on the minds of the young men and women serving over there by repeatedly telling them that their cause is lost.” He adds, “My goal is to de-sensitize young people to violence. I want kids to watch my film and understand that brutality is the fucking appropriate response to a brutal enemy.”

Dollard’s target audience is the same as any rock band’s: kids—the more disaffected the better. He aims to alter the course of pop culture. “What we’ve celebrated since at least the 1950s is the antihero,” Dollard says. “Today, even though our country has been attacked, nothing has changed. If you are a young man in America right now, the coolest fucking thing you can aspire to be is like a gangsta rapper, or a pseudo bad guy. The message of my movie is simple: If you’re a young person in America, the coolest, fucking most badass and most noble thing you can be today is a combat Marine. Period.”

Check out Dollard’s website, an amalgam of violence mongering. The tagline reads: “THE WAR STARTS HERE!”

Glenn Greenwald on Barrett Brown, who has been imprisoned, not for his hacking, but for his journalism. Simple reporting, it seems, has become a crime. “The persecution of Barrett Brown – and how to fight it”:

What the US government counts on above all else is that the person it targets is unable to defend themselves against the government’s unlimited resources. It takes an enormous amount of money to mount an effective defense. That’s what often drives even the most innocent people to plead guilty and agree to long prison terms: they simply have no choice, because their reliance on committed and able but time-strapped public defenders makes conviction at trial highly likely, which – under an outrageous system that punishes people for exercising their right to a fair trial – means a much harsher punishment than if they plead guilty.

If the US government is going to attempt to imprison activists and journalists like this, it should at the very least be a fair fight. That means that Brown, who is now represented by a public defender, should have a vigorous defense able to devote the time and energy to his case that it deserves. He told me in the telephone interview we had that he believes this is the key to enabling him to avoid pleading guilty and agreeing to a prison term: something he has thus far refused to do in part because he insists he did nothing criminal, and in part because he refuses to become a government informant.

Some fascinating history on the surveillance state from The New Yorker’s Jill Lepore, in “The Prism: Privacy in the age of publicity”:

The fetish for privacy attached, with special passion, to letters. In the spring of 1844, the year of the Mazzini affair, Poe sat down to write a story called “The Purloined Letter.” A few months later, a hardworking young man named James Holbrook was hired as a special agent by the United States Post-Office Department. He chronicled his experiences in a memoir called “Ten Years Among the Mail Bags; or, Notes from the Diary of a Special Agent of the Post-Office Department.” “A mail bag is an epitome of human life,” Holbrook explained. The point of this Post-Office Department was not to violate people’s privacy but to protect it. Holbrook’s job was to stop people from opening other people’s mail. He was a post-office detective. “Ten Years Among the Mail Bags,” like a great deal of nineteenth-century fiction, is full of purloined letters.

E-mail isn’t that different from mail. The real divide, historically, isn’t digital; it’s literary. The nineteenth century, in many parts of the West, including the United States, marked the beginning of near-universal literacy. All writing used to be, in a very real sense, secret, except to the few who knew how to read. What, though, if everyone could read? Then every mystery could be revealed. A letter is a proxy for your self. To write a letter is to reveal your character, to spill out your soul onto a piece of paper. Universal literacy meant universal decipherment, and universal exposure. If everyone could write, everyone could be read. It was terrifying.

David Carr of The New York Times, one of the best media commentators out there, writes about the shifting identity of the gatekeeper from news editor to news subject in “Big News Forges Its Own Path,” on his Media Equation blog:

… the historical strengths of big news organizations like the one I work for — objectivity, deep sources in the government and a history of careful reporting — were seen by Mr. Snowden as weaknesses. He went to Mr. Greenwald because they share values, because Mr. Greenwald is a loud and committed opponent of the national security apparatus and because he is not worried what the government thinks of his reporting. Of course, Mr. Greenwald had the international reach of The Guardian behind his story, and Mr. Snowden also shared information with The Washington Post, although it was clear that Mr. Greenwald’s past coverage on the issue was as important as where he worked.

The way to break a big story used to be simple. Get the biggest outlet you can to take an interest in what you have to say, deliver the goods and then cross your fingers in hopes that they play it large.

That’s now over. Whether it’s dodgy video that purports to show a public official smoking crack or a huge advance in the public understanding of how our government watches us, news no longer needs the permission of traditional gatekeepers to break through. Scoops can now come from all corners of the media map and find an audience just by virtue of what they reveal.