Tuesday, October 17, 2017

What Else Can Happen Here

As we watch with half-averted eyes ("Bah, the latest news, the latest news is not the last") the bitter slog of the Unnamable, a moment to note what we are not seeing, that we are no longer seeing...

The news was everywhere in the 1970s and 1980s and well into the 90s of the daily violence and intractable conflict between Protestants and Catholics in Northern Ireland.

It was rebellion, insurrection, guerrilla war, terrorism, ethnic violence, international intrigue and angry political controversy on several continents, all in one.

Known as "The Troubles," it resulted in bombings, shootings and other violence, especially after the UK deployed troops in the most violent areas.  Some 3,500 people were killed, according to official statistics.

The Catholic minority claimed economic and political oppression, and demanded that the North, which was majority Protestant and part of the United Kingdom, be joined to the Republic of Ireland to the south (majority Catholic) in a single independent state.

Those of us exposed to the news of those days saw these images on TV screens nearly every day.  The Troubles spawned countless movies and TV episodes that dramatized what was clearly intractable conflict on the order of the other great source of violence in those days, the deadly enmity between Arabs and Israelis.

 It was a war of two sides, each demanding absolute loyalty. For those involved it was us against them, with mutually exclusive demands: either/or.  Each took a position that ceded nothing to the other side; each side demanded complete victory.

There were many cease-fires that never seemed to hold.  It all seemed hopeless, especially as it reflected ethnic histories and loyalties.

And then, it vanished.

It vanished from the nightly news, the television and movie screens.  For after talks led to dramatic agreements (that few believed would hold), it entered a long period of undramatic negotiation. Today this once-intractable conflict is gone, due to a series of far-reaching agreements that involve a very much changed Republic of Ireland.
Belfast City Hall today
The first thing to notice about this: a viciously violent conflict between two enemies, with roots in history and economics, that seemed to have no peaceful solution has been peacefully resolved by the two parties themselves.  Neither side lost, and both sides won.

The second is the how, which involves creativity, complexity and diversity, to solve the underlying problems.  It even involved accepting paradox.

  Britain maintained Northern Ireland is part of the UK, but allowed it to go its own way.  The Republic of Ireland gave up its territorial claims on the North, and entered into agreements with the North that made the two Irelands functionally interdependent.  In a fascinating essay in the New York Review of Books, Fintan O'Toole writes:

"This reciprocal withdrawal of territorial claims has recreated Northern Ireland as a new kind of political space—one that is claimed by nobody. It is not, in effect, a territory at all. Its sovereignty is a matter not of the land but of the mind: it will be whatever its people can agree to make it. And within this space, national identity is to be understood in a radically new way."

"In its most startling paragraph the Belfast Agreement recognizes “the birthright of all the people of Northern Ireland to identify themselves and be accepted as Irish or British, or both, as they may so choose.” It accepts, in other words, that national identity (and the citizenship that flows from it) is a matter of choice. Even more profoundly, it accepts that this choice is not binary. If you’re born in Northern Ireland, you have an unqualified right to hold an Irish passport, a British passport, or each of the two. Those lovely little words “or both” stand as a rebuke to all absolutist ideas of nationalism. Identities are fluid, contingent, and multiple."

Ireland's new Prime Minister Leo Varadkar, son of an immigrant from India, and openly gay
These agreements were ratified by large majorities in both countries. It helps as well that both countries--particularly the Irish Republic--are much more ethnically diverse than they were.

Today Ireland is in the news for a very different reason (apart from being the absurd target of a hurricane).  The "Irish problem" in 2017 is prompted by Brexit.  The UK as a whole--though it was mostly Great Britain--voted to leave the European Union.  But Ireland, North and South, doesn't want to leave.  O'Toole:

"Ireland has evolved a complex and fluid sense of what it means to have a national identity while England has reverted to a simplistic and static one. This fault line opens a crack into which the whole Brexit project may stumble."

The nub of the Brexit problem is this:

"When these ideas were framed and overwhelmingly endorsed in referendums on both sides of the Irish border, there was an assumption that there would always be a third identity that was neither Irish nor British but that could be equally shared: membership of the European Union. In the preamble to the agreement, the British and Irish governments evoked “the close co-operation between their countries as friendly neighbours and as partners in the European Union.” The two countries joined the EU together in 1973 and their experience of working within it as equals was crucial in overcoming centuries of animosity."

I'm not going into the Brexit weeds on this--O'Toole is the better guide--but instead I point out a certain resonance, if not model.  I hear it said that the United States is still the most diverse country in the West, and demographically that diversity is growing. Yet we are in the political grip of the either/or, of two sides divided by ideology, world view, economics, education, geography and especially history and its relationship with ethnic and other identities.

 One flashpoint is immigration. There are lurches left and right all over the world prompted by immigration, as either causing real problems or as a hot button distracting from what's really causing the problems, or very likely both.

Sooner or later, a lot of "boths" ("Those lovely little words 'or both'") are going to define themselves beyond the either/ors.  The US may not get to that particular later, but it's possible. It might even come sooner.

We know it's possible because others have arrived there.  Which is why we're not seeing gunfire on the streets of Belfast anymore.

Wednesday, October 11, 2017

Library Days: The Boy of Summers

I recall the summers of my 11th and 12th years as both my actual initiation into the joys of reading books and, in some ways, the purest and most perfect experiences of them, due largely to the revelations of discovery, the awakening, shimmering clarity of the first time.

Between baseball, bike trips, paper routes and Saturday afternoons at the movies in those hot golden months in western Pennsylvania, my visits to the Greensburg Public Library on Main Street became more frequent and regular.  There I searched and settled on my three allotted volumes, which most times I would easily finish within my two-week limit.

I read outside, including with my back against a huge rock that I discovered in a nearby hillside vacant lot of scattered trees and bushes, my fortress of solitude.  I read sprawled on the sofa in the living room, the curtains drawn against the afternoon heat, Italian style.  I read in my bedroom, on my double bed or in it, within the umbra of my bedside lamp.

I read sipping Kool-Aid, munching an apple or pear, or with a stack of saltines nearby, with butter or peanut butter between two crackers.  I read restlessly and with total absorption, until I got restless again, forcing me reluctantly out of my reading dream.

My choice of reading wasn't at all precocious.  I've noted the Hardy Boys novels.  I also read sports books, both fiction and biographies.  I was beginning to notice the author's name on a book I liked, and to look for other books they wrote.  I read The Kid Comes Back, a baseball story by John R. Tunis, and I remembered his name and found other sports fictions he wrote, like Young Razzle and Go, Team, Go!

Sports books were a staple of fiction for boys--a cache of my uncle's books I discovered in my grandmother's attic had several from his childhood. John R. Tunis was perhaps a more serious writer than most.

The Kid Comes Back was a sequel to The Kid From Tomkinsville, about a rookie with the Brooklyn Dodgers.  Published in 1946, it begins with the hero, Roy Tucker, in France during World War II.  Like those of a number of Major League players, his career was interrupted by service overseas.  There are thoughtful reflections on the soldiers' regrets at not taking better advantage of school, for knowledge that might have helped them survive.

Tucker is injured, and once the novel returns to baseball, it is about his struggle to overcome his injury and war experiences, and find a role as an older player, which is to support the team first.  The baseball sections are detailed, with managerial strategy and game descriptions--just what we wanted at that age, as we were learning the game from radio broadcasts and--if we were lucky enough to run into knowledgeable adults--from coaches and baseball dads.

Tunis is considered the father of modern sports books, and Roy Tucker is an obvious forerunner for Roy Hobbs in Malamud's The Natural.

 His young football hero in All American battles anti-Semitism and racial discrimination. Tunis also wrote on other subjects in his books for young readers. His last book in the 1960s, His Enemy, His Friend, is a poignant and eloquent novel exposing the brutalization of war, centered on the retreat from Dunkirk. In the adult world Tunis was a sports commentator and writer who could be a harsh critic of, for example, the role of money in college football.

Joe Archibald is another author's name I remembered, though I'm not sure which of his scores of sports books I read--probably novels about football and baseball. Archibald wrote fiction and sports biographies, and I also read those. One of the bios I remember (not Archibald's work, I don't think) was about Jackie Robinson (the first African American player in the majors and one of baseball's great all-time stars), which opened with him at a young age trying to scratch the black skin off his arm, because of the prejudice he experienced. It was a powerful image and message.

I read adventure stories about sailing ships and pirates,  possibly under the influence of  TV's "The Buccaneers" (another syndicated series imported from England and largely written by blacklisted American writers) as well as the Disney movie of Treasure Island, at least as excerpted on the Disneyland TV show.  Though I don't remember any specifically, I may have actually read some Robert Louis Stevenson, but all I recall is that I came up against my limit in reading one of these books about seafaring--it was too long and too hard to follow.  (It might have been one I secreted from the adult stacks.)  But eventually I would return to such tales: a paperback of Conrad's Lord Jim was one of the books I took with me to college.

I was actually much more interested in modern stories about ships and the Navy.  I particularly liked Midshipman Lee of the Naval Academy by Annapolis grad Robb White.  I'm pretty sure I'd already read it when I first saw a TV series called "Men of Annapolis" (it began in 1957 and ran for only one season)-- Robb White was one of its writers.  There was also a West Point Story TV series at the time, which I watched, and I found novels about West Point, too.

I got so enthusiastic I decided that when I was college age I would try to get into the Naval Academy.  I would need to be appointed by our congressman, but my father used to see him at Democratic gatherings, and I'd already corresponded with him, so I thought I had a good chance.  I held onto that dream until one day, descending from my house to the road, I suddenly realized that being deaf in one ear would disqualify me.

But before that terminal thought, I actually did go to the Naval Academy.  During my 11th summer I spent several weeks at my cousins' on the Eastern Shore of Maryland, and when my parents drove down to pick me up, we detoured to Annapolis for a quick drive through the Academy grounds, pausing long enough for me to hop out and have my picture taken.

Something else happened on that trip.  As we were piling into the Ford to leave Federalsburg, my Aunt Toni produced a tin of her homemade cookies and a cardboard box of old science fiction magazines which I assume had belonged to my Uncle Bill.  These were the classic s/f pulps with boldly colored art on the covers, containing mostly short stories.  It was a long drive, and I spent it in the back seat, reading story after story while munching chocolate chip cookies, nut rolls and jumbalones in the speeding summer light.

American science fiction was largely that--short stories published in the pulps, and it had been that way since the 1920s.  The relatively few science fiction novels were usually cheap paperbacks.  This was still a pulp genre.  But in the 1950s, several publishers started a science fiction novel series for young readers--especially baby boomers like me, because there were a lot of us.

Alice Mary Norton (under the name "Andre Norton") wrote a series for Ace books--I don't remember seeing these.  But I remember the other two.

Beginning in 1947, Robert Heinlein wrote a series of a dozen "juveniles" for Scribners.  For some reason--possibly because he published short stories in Boys' Life--I recognized Heinlein's name early, probably the first science fiction writer's name I knew.  Or maybe I learned to recognize his name from this series. They were published in hardback, expressly for libraries, and their success helped jumpstart hardback science fiction novels for adults.

Among the differences between his adult and juvenile fiction, Heinlein said, was that "the books for boys are somewhat harder to read because younger readers relish tough ideas they have to chew and don't mind big words..."  In fact, these novels track well with the universe Heinlein created in his work for adults.

The most famous of his series was Space Cadet, a major influence on Gene Roddenberry and Star Trek, as well as inspiring the early TV series Tom Corbett, Space Cadet and several other such TV shows that featured variations on the Space Patrol.

The last in the Heinlein series in 1958 (Have Space Suit--Will Travel) has a lot of technical detail on space suits which didn't yet exist, partly because as a Navy engineer in World War II Heinlein had been tasked to design a space suit (or more specifically a high altitude pressure suit), a job he passed on to another officer who would also become a prominent science fiction writer.  The whole idea of the space suit began in science fiction, as did so much of early space technology.

The titles I was most likely to have seen in these years were Citizen of the Galaxy and Time for the Stars, though I have a feeling I read Starman Jones. Tunnel in the Sky (1955) depicted a group of young people marooned on a hostile wilderness planet who split into rival groups but ultimately realize they need to all cooperate to survive. Science fiction historian H. Bruce Franklin believes it was Heinlein's reply to William Golding's famous Lord of the Flies, published the year before.

 But I don't really remember which of this series I read back then, and probably never will, because about four years ago I tracked them all down (some I had, some were actually in the university library's children's room in their original editions, and some were available to buy in quality paperback collections) and read them in order of composition. They are all excellent.  I wrote about them individually here.

Apart from the quality of storytelling and sophisticated scientific detail, they engage moral and ethical questions, and expose various forms of racism and prejudice, as well as authoritarian (and big business) excesses.

The juveniles I remember most vividly and even reverently were in the Winston Science Fiction series.  These were written by different authors including Arthur C. Clarke, Ben Bova, Raymond F. Jones, and several (as many as 9 of the 35) by Lester Del Rey.  This series (and the other two) "seem to have started a whole generation toward becoming science fiction fans," Del Rey wrote in 1979.  "People still come up to me to declare that one of my juveniles was the first science fiction book they ever read."

Again, I don't remember which I read, though I'm fairly sure they included Trouble on Titan by Alan E. Nourse, Rocket to Luna by Richard Marsten (one of crime novelist Ed McBain's several pseudonyms) and Son of the Stars by Raymond F. Jones (who authored This Island Earth, which was freely adapted into the script for one of the better 50s sci-fi films.)

The protagonists were usually young, and so these novels were typically coming of age stories in which the heroes learned and overcame obstacles that included injustices.  Like Heinlein, these science fiction authors often had strong scientific or engineering backgrounds, and so the details of space travel and technology were usually pretty accurate. When the rockets really started going up in the years that immediately followed, there wasn't a lot that surprised readers of these books.

A number of these novels were reprinted in paperback editions by Thunderchild Publishing. These have the virtue of the original cover illustrations but lack the wraparound endpapers and the heft of the originals.  One (The Secret of the Ninth Planet by Donald Wollheim) includes a marginally informative but ultimately unsatisfying essay on the series.  Original editions with pristine dust jackets are collectors' items, but some old hardback library copies are around, and carry more resonance for me.

The Winston series was unified by a distinctive logo and especially the eerie endpapers which were the same for every book.  The cover art was distinctive, too, but I never saw it on my public library copies. But those endpaper illustrations inside the front and back were in every volume, and more than any single book, I remember poring over this art,  trying to imagine a tale that would unite them into a single story.

I wouldn't have seen or read these books but for the public library. (Some cheap editions of Robb White's work were available through schools, though I don't believe they were in mine. Also, I'm not aware that any of my schools had libraries until high school, and that was a small one I hardly ever used.)

None of these are classic authors or books, though I suspect many baby boomers remember them fondly. Their techniques and rhythms prepared me for other literature, I believe, but I still find value in them on their own.  I have copies of many of them today, and have read them in recent years.  That they reawaken and renew the boy that remains an integral part of me is only one of their functions.  For I read them again with calm pleasure and active delight, as well as the thrill of re-discovery.

This is one of a series of posts on childhood reading and the origins of my relationships with books, inspired by Larry McMurtry's reflections in his autobiographical Walter Benjamin at the Dairy Queen. Earlier posts are on the Book House books, Library Days and the Hardy Boys.

Tuesday, October 10, 2017

The Unnamable

"Bah, the latest news, the latest news is not the last."

Many in Puerto Rico and on nearby islands are still without sufficient supplies of water and food, let alone electricity.  A firestorm has obliterated parts of a major city down 101 here in northern California, resulting in deaths, injuries, homes destroyed, dislocations and evacuations, while fires affect everyday life in every part of the state, including here on the North Coast.  Out of the media glare, thousands in the Gulf Coast, Texas and Florida struggle in the distant wake of hurricanes.  Someone used unspeakable firepower to kill and maim hundreds in Las Vegas in ten minutes of gunfire, and they say no one knows why.

The White House is described as a pressure-cooker with a chief executive so dangerous that others must struggle to control him, though he continues to threaten nuclear Armageddon.  There's a feeling that it all could blow any time, any day. The world is locked into a US leader who many agree is deranged, and everyone seems powerless to do anything about it, not now and, absent an electoral alternative that has not yet materialized, not for years.  Meanwhile the administration relentlessly attempts the slow murder of the planet.

The Washington Post calculates that in 263 days, our dictator apprentice has made 1,318 false or misleading statements. A few days before, he reached a new low of 32% approval, with only a quarter of Americans believing the country is on the right track.  On the same day an analysis insists that he is on track to reelection.

Another poll finds that more than half of Americans say he is not fit to be President.  In one sense, extraordinary.  In another, the percentage should be higher.

We do what we can to bring attention to those in need, to help those we can.  We try to be prepared for catastrophe in what few ways we can.  But basically...we go on. We remember, we preserve, we fight on. It's not healthy to pay too much attention to much of this.  But we go on.

"...you must go on, I can't go on, I'll go on."

(Both quotes from Samuel Beckett: The Unnamable, a 1953 novel, 1958 in English.)

Saturday, October 07, 2017

Big Ideas

In one last late night look at a few news sites for a sense of what Friday morning's stories might be, I happened on a live TV feed from Oslo, announcing the Nobel Peace Prize.  It went to an organization, the International Campaign to Abolish Nuclear Weapons, known as ICAN.

By the time I checked my usual list of news sites on late Friday afternoon, there was no trace of the story on most of their front pages, including the NY Times and Washington Post, nor on the Google aggregator.  The Guardian, which had followed the announcement with a live blog for its UK and European readers, had its story buried deep in the front page weeds.  Even the BBC, which had room for butterflies and pandas, didn't front page it.

The only US site I saw that had a story was the New Yorker, with Robin Wright's report entitled The Nobel Peace Prize Goes to Anti-Bomb Idealists. Though the piece itself is informative, the tone of the headline and the opening sentence ("The dreamers won") seem dismissive.  It is after all an organization that works with nearly 500 other non-governmental groups in 101 countries, and scored a success with a UN treaty banning nuclear weapons, currently being ratified by the more than 120 nations that voted for it.  "Anti-Bomb Activists" might seem more appropriate and less condescending. But then again, ICAN's budget is a meager one million dollars a year.

If the sneer that seems implied is discounted, the active representation of this ideal is precisely why the group deserves the award.  They are advocating for an idea, a Big Idea, and it is with such big ideas that nearly every step forward in civilization has begun.

The ideas come first; then they are stated in some major way, and a society embraces them.  In American history, it was the bold idea that we are all created equal, appearing in the Declaration of Independence as the primary rationale for a new and self-governing nation.

Once these ideas are so embraced, they become accepted standards that seem to have always existed.  They also make hypocrisy possible when people and nations don't live up to them and businesses find creative ways of circumventing them, which leads to outrage, condescension and cynicism, as well as activism and change.

That we are all equal under the law was not a given, not even an idea or an ideal, for much of western history alone.  Now we have arguments about whether lots of money makes you more equal than others even in court, but we wouldn't have that or any other argument unless the principle existed, and was enshrined in our highest laws.

Slavery was a legal and accepted business practice for much of American history. At one time, the idea that it is wrong was a new idea that was met with fierce opposition as well as condescension.  It was outlawed in England, and in the midst of a war in which it was the primary issue, the Emancipation Proclamation began the process of ending it in America.

A great deal has followed from that idea, based on human equality, and it has echoed through our age, from civil rights to the end of legal discrimination based on sexual orientation.  The idea has not been fully applied, obvious retrenchment is being attempted today, just as new forms of slavery have arisen and spread all over the world.  But slavery and racism and other forms of discrimination are no longer accepted as legitimate.  They are societally shameful.

In the 20th century a new idea painfully emerged: the world.  International treaties began to codify rights and relationships among nations not based on empire or any coercion.  An international organization formed and for the first time, lasted and grew to a global organization: the United Nations.  It promoted some awfully big ideas.

One was universal human rights, beyond the borders of any one nation. Together with others in England,  H.G. Wells, who made the case for world government throughout his long career, devoted the last years of his life to the cause of codifying such rights--which had never been done before--and insisting that the international body that would be formed at the end of World War II formally adopt them.

Others (among them, Eleanor Roosevelt) negotiated such a document, and the Universal Declaration of Human Rights was considered by the UN in 1946, its first year of existence, and formally adopted in 1948.  It is the basis for further codifications and declarations, and of UN policies and actions, such as those that the UN and member states took to end apartheid in South Africa, and in so doing, end that idea as a normal right of the state.

But as Berit Reiss-Andersen, the chair of the Norwegian Nobel Committee, noted in awarding the Peace Prize to ICAN on Friday morning, one of the UN's very first resolutions in 1946 was to establish a commission to deal with the problems of atomic energy, including nuclear weapons.  The historic 1963 treaty banning atmospheric nuclear tests led to the end of such tests by its signatories.  And as Reiss-Andersen also noted, the non-proliferation treaties of the 1970s committed nuclear nations to eventual nuclear disarmament.

The 2017 UN treaty legally binds its parties to prohibit nuclear weapons completely; 122 nations voted to adopt it.  None of the nuclear nations or their close allies were among them, but the idea has been stated and supported by a majority of world governments.  It means something a bit more than a dream.  It sets a standard that every nation must deal with.  Nuclear weapons, like chemical weapons, are outside the norms of current civilization.

Such ideas, even when officially stated at the highest levels, and even when enshrined in law, can be violated, and often are.  But we don't call laws against murder no more than addled idealism by naive dreamers just because people continue to kill each other.  Not even when they get away with it.

Another big example of a big idea is war itself.  Louis Menand wrote a New Yorker piece called What Happens When War is Outlawed.  It begins with a little-known international treaty, eventually signed by every significant nation in the world, that outlawed war between nations as an instrument of policy.  It was the General Treaty for the Renunciation of War.

It has been completely lost to history, Menand notes, principally because the last nations signed it in 1934, just a few years before the largest war in human history began.  So once again, at best, an embarrassingly naive expression of misplaced idealism (like the celebrated 1939 New York World's Fair), if not cynical to begin with.  A hard-headed realist Cold War foreign policy expert called it "Childish, just childish." Certainly ineffective.

But was it really?  Menand points out:

"And yet since 1945 nations have gone to war against other nations very few times. When they have, most of the rest of the world has regarded the war as illegitimate and, frequently, has organized to sanction or otherwise punish the aggressor. In only a handful of mostly minor cases since 1945—the Russian seizure of Crimea in 2014 being a flagrant exception—has a nation been able to hold on to territory it acquired by conquest."

The rest of the piece goes into various contributing reasons why this has been so, the most recent of which is globalization, when international trade is too essential to disrupt with warfare.  But my conclusion would include the very fact of the spread of the big idea represented by that treaty outlawing war, which is that war is no longer considered to be a normal, legitimate instrument of policy.  Whereas for centuries before, it was.

The latest Big Idea to emerge in a big way was expressed in the Paris climate agreement.  For the first time in human history, official representatives of nearly every human on the planet agreed to take responsibility for fixing what humanity had done to destroy that planet.

 Like every other such statement, the actual accord may be flawed and not immediately effective, and backsliding will occur.  But the response to its rejection by the current White House incumbent by other nations as well as states, cities, businesses and citizens of the US shows it is still a powerful Big Idea, inspiring passion and determination.

The idea is now Big, and humanity has agreed on it.  It may well happen that the damage to the planet will doom civilization and the Earth as we know it before humanity can fully live up to this idea.  But even in the event of that ultimate tragedy, there is at least this expression of the best in humanity and human civilization.

We have enough evidence of human weakness and evil to tempt all of us to consider these as the essence of human nature.  We have to remind ourselves that, just as individual humans do exhibit such contrary traits as goodness, intelligence, courage, kindness, fairness and self-sacrifice, human civilization has had some awfully good Big Ideas and they have changed things for the better.  The Nobel Peace Prize shines a light on one of the most important of those Big Ideas today.

Wednesday, October 04, 2017

The Big Beep

"The Russians, Conquerors of Space. Oct.4, 1957. I have just heard some news which will affect my whole future. Russia has just successfully launched the first man-made satellite into space…How did the Russians do it? Out of their own ingenuity? Did they get information from a spy in America? A traitor? All the work our scientists and top brains did, what for? Will the Russians take advantage of this and use it to start a war?"

I wrote that, in my brown notebook, just minutes after I heard the news--after in fact I heard the actual beeps from Sputnik in orbit, as broadcast on the radio.  Sixty years ago tonight.

I was 11. So were Bill Clinton and George W. Bush. Steven Spielberg would turn 11 in a couple of months. George Lucas was 13. Sputnik changed our lives, in some ways encouraging our dreams of the future, but also introducing a new dose of fear.

I was in my room, ostensibly doing my homework.  At the end of a long, multi-jointed arm, a green-shaded lamp focused light on the surface of the heavy, dark-grained wood desk, a hand-me-down undoubtedly older than I was. The rest of my room was in shadow.

Maybe I actually was doing my homework.  But that brown notebook also contained the text and drawings of my latest science fiction effort, "The Desert Menace." Anyway, I was engrossed in my pool of light, so when my bedroom door flew open I was startled. My father leaned in, and asked me if I’d been listening to the radio. I said “no” defensively, but he wasn’t checking on my homework diligence. He said the Russians had launched a satellite into space. It was orbiting the earth right now.  My parents were in the living room, watching television.  It was such an important event that a news flash had interrupted network programming.

Alone again, I reached up and to the right to my bookshelf, and switched my radio on.  My father had put it together from a kit.  It had a slate gray face but its works behind were exposed. (That radio looked like this one.  It may even have been a "Space Scanner.")

 It was supposed to be a short wave set but despite its impressive dials, it seldom pulled in anything farther than the local AM station, WHJB.  "Nightwatch" was my favorite program, with a mysterious instrumental opening--a song I heard almost every night but have never heard since. Eventually that station played a recording of the beep.

A lot of people were shocked.  The very idea of orbiting a satellite in space was nothing but silly science fiction to them.  I was used to the idea from, well, the silly science fiction I watched and read.  But I was also a little better informed.

I knew about IGY--the International Geophysical Year.  I probably learned about it from a Young Catholic Messenger article (a periodical we got at school) that dramatized the upcoming first satellite launch during this year of globally shared geophysical research.  Only this was supposed to be a United States launch in 1958, climaxing the IGY.

I'd also seen a television documentary on the IGY.  A newsman was interviewing none other than Charles Van Doren about it.  Van Doren had the reputation of being the smartest man in the country because he was a quiz show champion.  When the newsman asked Van Doren if the Russians might orbit a satellite first, he just chuckled.  (Van Doren of course became the most famous quiz show contestant to be exposed as having been given the answers.)

But the Russians had done it first and the news and shock rippled around the country. The news of Sputnik had spread quickly first in government and scientific circles earlier that day. Around 6:30 PM on the East Coast, President Dwight Eisenhower was at Camp David when he was told. It was just a few minutes after 8 PM in New York when RCA technicians recorded the sound. Sputnik had already orbited over the U.S. three times by then.

NBC News broke into radio and TV programming coast to coast. “Listen now for the sound,” the radio announcer said, “which forevermore separates the old from the new.” (The announcement did not, as some stories say, break into broadcast of the World Series. October 4, 1957 was a travel day for the Yankees and Milwaukee Braves--no game was played. Besides, night games in the World Series didn't begin until 1971.)

It was rush hour on the West Coast. Commuters might have been listening to Jimmie Rodgers sing “Honeycomb,” the current number one hit, or the song it dethroned, “That’ll Be the Day” by Buddy Holly and the Crickets. When they first heard the Sound.

We're now told that people in the know in Washington were expecting it, and that Eisenhower’s military people welcomed it, specifically because they wanted to spy on the Russians from space, and now the Russians could hardly object when the U.S. sent a satellite over their country.  They sure didn't seem happy about it at the time.

Sputnik specifically changed the lives of my generation.  Apart from increasingly delusional 'duck and cover' drills, it started a national fervor over science education and education in general, parodied for our generation by the Firesign Theatre pitting Communist Martyrs High School against More Science High.

It also led to the National Defense Education Act, which provided the first federal loans for college, which is how many of us got there.  It was in fact one of the historical breaks that allowed a working class kid like me to go to college, or at least to have some choice of which I'd go to.

But the Cold War fear and alarm I'd expressed in my notebook was about more than a spy in the sky.  People immediately thought of being bombed from space. Though that wasn't literally possible, there was some truth to it.

This was 1957, little more than a decade after the first atomic bombs. Although the U.S. had exploded the first true hydrogen bomb in 1952, it was too large and fragile for a weapon. The bomb the Soviets designed and exploded the next year was not as powerful, but it was already a weapon. The U.S. soon had created usable hydrogen bombs, but the Soviets had a brief advantage which had shaken the military establishment.

Now it seemed the Soviets had leaped ahead and were a much greater threat. Until then, an attack on the U.S. or Russia could be conducted only by using bombers. Although the U.S. was rapidly developing guided missiles, Sputnik (and especially the bigger and heavier Sputnik II launched a month later) proved the Soviets had built missiles capable of reaching the U.S. and delivering atomic bombs.

Missiles were much more of a threat--they were faster than bombers and harder to detect. Airplanes could be shot down, but not guided missiles. They didn't have to be particularly accurate, because hydrogen bombs were so powerful they didn't have to be delivered to precise targets. To destroy New York or Moscow might require as many as 24 atomic bombs. The first hydrogen bombs were each as powerful as a thousand Hiroshima bombs. New York could be destroyed by one of them, which would also produce radiation lethal to the population of Washington, D.C., and would contaminate most of the Northeast, into Canada. After the Bravo test proved so powerful, the "lethal zone" measured in the Pacific H-bomb tests was equal to 20% of the continental United States.

That's why, by the way, a North Korean hydrogen bomb that can be delivered by a guided missile is such a big deal.  It's a geometric increase in destruction.

As the first satellite, Sputnik also introduced the globe-circling systems that have become central to global life in 2017 for communications of all kinds, GPS, power grids and more.  All that became practical just five years after Sputnik, when the first communications satellite, Telstar, was launched by the US, and we were all rocking out to its theme song.

Now the degree of our dependence on these satellites is so great that failures of these systems have apocalyptic possibilities to rival thermonuclear war.

But for all the foreboding, the reality of Sputnik overhead was eventually also exciting (though not entirely until the televised satellite launches by the US in 1958.)  Sputnik was the first real event that proved all those science fiction dreams of rockets into space weren't just fiction.  For kids my age who'd flown on the Galaxy, the Polaris, Terra 5, the Orbit Jet and the United Planets Cruiser C-57D as well as the first manned spaceship launched on Disneyland's "Man in Space" in 1955, it was an exciting glimpse of the future we hoped for and expected, a tantalizing Coming Attractions.

 And it all started sixty years ago tonight.


That I am way behind a lot of popular culture is not news.  Neither is the fact that I don't much care.  But sometimes, something interesting, or even halfway significant, catches up with me.  Like this new meaning of "snowflake."

I saw it used twice in contexts I didn't understand in one Internet sift, so I looked it up.  Merriam-Webster characterizes it as "a disparaging term for a person who is seen as overly sensitive and fragile."

Its origin (according to M-W) is another tipoff as to why I wasn't familiar with it: "In the lead-up to the 2016 U.S. elections it was lobbed especially fiercely by those on the right side of the political spectrum at those on the left."

These days it tends to be used (and not just by the far right) in the context of the easily offended on college campuses, and the culture of trigger words, micro-aggressions and speech that makes people "feel uncomfortable."  "Sensitive snowflake" is the usual combination.

Thus the Wiktionary definition (#4):"Someone who believes they are as unique and special as a snowflake; someone hypersensitive to insult or offense, especially a young person with politically correct sensibilities."

It is now used however against the right and its own sensitivity to criticism, as by Jonathan Chait, who referred to the current regime in Washington as the Snowflake Administration.  The Urban Dictionary definition turns it around completely and aims it at the alt-right, "whose immense white fragility causes a meltdown when confronted with the most minute deviation from orthodox White Supremacy." This connects it to a 19th century use, when "snowflake" meant whites who opposed the abolition of slavery.

And of course, somebody (at the Atlantic in this case) applied it to both sides and dubbed the entire country A Nation of Snowflakes.

Today's use is also age-based. M-W again: "There were glimmers of this use in the decade and a half that preceded that election, but the meaning at first was a bit softer, referring mostly to millennials who were allegedly too convinced of their own status as special and unique people to be able (or bothered) to handle the normal trials and travails of regular adult life."

M-W traces this use back to the cult movie (and book) Fight Club in the late 90s.  It also reminds me of the 2003 musical Avenue Q, which ran for six years on Broadway and several more off-Broadway, and has since been produced widely (in Eureka, for instance).  It was in large part a satire on Sesame Street, bemoaning the contrast between the happy positive messages Sesame Street gave Millennials as children, and the hard world they faced out of college.  It turns out, they sang, that they aren't unique and special as a snowflake.

Obviously a lot of people liked this hit show (though I pretty much hated it) and that sentiment has fed into the negative portrayal of the Millennial generation.  I don't want to get into a war of generations, or generational stereotypes.  My purpose in bringing the subject up is to take it back to individuals. Individuals who are unique snowflakes.

Now I recognize that there are ways in which all assholes are the same, or else we wouldn't recognize them as such.  And I recognize that culture, especially popular culture, tends to enforce sameness, very powerfully for the generations who've grown up with social media and smart phones.  (There's even the assertion that smartphones have already destroyed a generation.)

But I found in my years of interviewing people that there was something unique about all of them.  This especially became obvious when I talked to them about their work.  Some people, however, are obviously remarkable and unique, and they ought to be celebrated.

Some of them are young.  I know of two young women who grew up in this community who have done immense good for people in some of the harshest conditions on the planet.  (One of them was on the cover of Time as a Person of the Year.)

I know of others who approach their lives and careers with intelligence, very good questions and an open heart.  And there are those who come to my attention through the media who impress me.  One of these is Raianna Brown, who was briefly famous in that Internet evanescent sort of way when a photo of her in cheerleader uniform taking a knee in protest at a Georgia Tech football game went viral.  (Photo at the top.)

What's especially impressive is that while the reposted photo turned heads last week during the NFL protests and contortions, it was originally posted to no acclaim at all last year, which is when she felt moved to protest.  I also learned from this story how thoughtful and articulate she is, and how impressive.

Although many protests involve others--with some involving many others--it is an individual decision to participate.  Protests in America, particularly racial protests, are always met with rabid opposition and are rarely popular, as Ta-Nehisi Coates proves in a new essay.  This includes the original March on Washington in 1963, now universally celebrated and treated with reverence, though those of us who took part in it knew that 60% of the country viewed it unfavorably at the time, according to one poll, although at times it felt more like 75%.

In this case Raianna Brown protested completely alone, though because of who she is, she had explained what she was going to do and why, and had her coach's support.  She knew she would get criticism, but in addition to making a stand, she wanted to start a discussion.

Young people like Raianna Brown give me hope, not just for what she did but for the character and quality of her thought that comes through her words.  She is special.  She is unique, and should inspire others to explore and cultivate their unique opportunities for being in the world and contributing to a better future.

Only when young people believe in their integrity as individuals can they have the confidence to participate to the best of their unique abilities.  Sesame Street was not wrong, about a lot of things, and certainly about this.

Monday, October 02, 2017

Las Vegas (and Puerto Rico)

It may seem unfortunate that I chose last night to post the adage below under the title of Today's News, when today's major news story would be a mass shooting in Las Vegas, where guns and gun shows are big business.

 But without demeaning the pain and suffering involved in this horrific event, the adage suggests both the evanescence of the surrounding situation, and in a different way, the depressingly repetitive nature of the event taken as a whole, within the current context.

If the United States survives as a civilized nation for another century, it will have universal healthcare and effective gun control.  As both ideas or ideals and as ongoing policies with bureaucracies and rules to carry them out, these are features of civilized nations now.  They are so important to the needs of civilization going forward that the US cannot continue as an exception over the long run.

But the short term is something else, and a number of essays point out the numbing ritual of political response.  Several (for example, here) also point out that because the shooter is apparently a white guy (with a huge and sophisticated arsenal in a hotel room), it will become a story of a crazed individual, but if it had been someone of a different type, it would be a rallying cry for repression.

James Fallows' essay at the Atlantic--Two Dark American Truths From Las Vegas--covers both aspects: "The first is that America will not stop these shootings. They will go on. We all know that, which makes the immediate wave of grief even worse."

He notes examples of other countries that have enacted gun control after mass shootings, and have stopped them or at least made them less frequent and deadly. "No other society allows the massacres to keep happening. Everyone around the world knows this about the United States. It is the worst aspect of the American national identity."

He points out that the perpetrators of mass killings in America are almost always white men, but what if this one was not?  That's the other dark truth:

"If they had Arab-sounding names, this would be a new episode of jihad. How often has Donald Trump invoked “San Bernardino” in his speeches, as shorthand for the terrorist threat in our heartland?
If they were Mexican, they would demonstrate the perils of immigration, and that Mexico is “not sending its best people.”
If they had been illegal immigrants, they’d dramatize the need to crack down harder, right now.
And if they had been black, I shudder to imagine the consequences."

Further, I point out another set of victims: the people of Puerto Rico and other islands devastated by Hurricane Maria.  With this story diverting the attention needed to fuel a sense of urgency, their needs are likely to be ignored.  Thirst and hunger are just as deadly as guns.

Today's News

"And this, too, shall pass away."

A Persian adage, quoted by Abraham Lincoln, among others.  One tale says that it is an inscription on a ring that makes the happy person sad, and the sad person happy.

Art by Arp.

Sunday, October 01, 2017

Peter Hall

One of several essays that appeared after Peter Hall's recent death, written by younger theatre artists he mentored, ended with traditional words: We shall not see his like again.

Traditional, even cliched, and yet they are not only true in his case, it's hard to think of anyone in theatre since Laurence Olivier of whom these words so clearly apply. And equally hard to think of anyone now alive to whom they could apply, at least in the same way.

But his institutional and artistic achievements in theatre resonate far beyond the theatre world.  (And not just because he also was a creative force in film and opera.)

As a very young director who ran a small London theatre, he introduced Samuel Beckett to the English speaking world with his production of Waiting for Godot in 1955.  In the same period he was the first to stage plays by Harold Pinter, ultimately directing around 10 of them.

The influence of these two playwrights alone on theatre and film, all the arts and permeating into how we speak and think today is profound.  This alone (along with championing America's Tennessee Williams when his home country was done with him) would have made him a significant contributor to theatre and cultural history.

At the same time as he broke new ground with living playwrights, Peter Hall directed innovative productions of Shakespeare featuring some of England's great stage stars.  He then founded the Royal Shakespeare Company in 1960 before he was 30.

His achievements there were even greater, because he changed the way Shakespeare was performed, and contributed a great deal to expanding the audience and the understanding of Shakespeare.

  Without a lot of gimmicks, he made the plays more accessible, and the popularity as well as prestige of the RSC made Shakespeare a popular playwright again.

 Together with John Barton (whose "Playing Shakespeare" series lives on YouTube) he found in Shakespeare's verse the directions for speaking it, and for playing the part.  This had the effect, he insisted, of making the verse more intelligible, more natural, and so audiences understood and could enjoy more.

Today we are used to Shakespeare again being part of a common experience, and also of personal revelation.  For example, shortly before leaving office President Obama told New York Times book critic Michiko Kakutani that Shakespeare is “foundational for me in understanding how certain patterns repeat themselves and play themselves out between human beings.”

Many people contributed to a Shakespeare revival in the 1950s and onward, but Peter Hall and the RSC (which left a legacy of every play recorded on DVD etc.) were instrumental.

Peter Hall insisted that the director's job was to reveal the play on its own terms, not impose concepts or see the plays as opportunities for the director's self-expression. He insisted on specifics and favored collaboration with the cast, even to the extent of involving the cast in set and costume design as well as the blocking of the play.

But he moved on to an even greater institutional accomplishment, when he led the UK's fledgling National Theatre into prominence in the 70s and 80s,  taking it from a small company doing a half dozen plays a year at the Old Vic, to its huge new building with more than 100 actors and 500 staff producing 18 to 20 plays a year. In the process, the National overcame general opposition by becoming popular with the public.  Again, Shakespeare was integral to his achievement there.

Today, despite government cutbacks, the National is as thoroughly a British institution as, well, the Royal Shakespeare Company. They are such pillars of English life and what England means to the world that by now it seems that both must have always existed.

Peter Hall made his mark on American theatre as well, as evidenced by this essay and his New York Times obit.  May he rest in peace, for his legacy lives on.

Thursday, September 28, 2017

Guess Again--It Ain't Tribalism

The latest trope being used to explain political America is "tribalism."  Close but no cigar.  Keep looking for an appropriate hook because this one isn't it.  It is only partially accurate, even as a metaphor, and partially offensive.

"Tribalism" has been used before as shorthand for a closed society, intolerant of others and their views.  It's even been applied to corporations.  But only recently has it become a more elaborate idea applied to politics, even as it has seemingly become the fashionable replacement for the previous lazy buzzword, "polarization."

The brief evolution of this trope is interesting, for things mutate fast in cyberspace.  The latest round probably started with Lee Drutman at Vox about hyperpartisanship in today's politics, positing two mutually exclusive positions held by two mutually exclusive segments of the electorate, each informed by contrary and rigid ideologies, and by closed information loops to the point that anything said by the other side is automatically dismissed as, at best, untrue.

Fine, not a lot new there, until this sentence: "But for years now, we’ve been retreating into our separate tribal epistemologies, each with their own increasingly incompatible set of facts and first premises."  The term 'tribal epistemologies' was hot linked to a piece by David Roberts published in May called Donald Trump and the rise of tribal epistemology, in which he quotes Rush Limbaugh:

"We live in two universes. One universe is a lie. One universe is an entire lie. Everything run, dominated, and controlled by the left here and around the world is a lie. The other universe is where we are, and that’s where reality reigns supreme and we deal with it. And seldom do these two universes ever overlap."

Roberts goes on to say "It is well known that Americans have been sorting themselves into like-minded communities by race, class, and ideology, creating more in-group homogeneity and cultural 'bubbles.'" That it's well known doesn't stop him from adding some social science research showing that Americans have been "sorting themselves" by personality as well as the usual other categories.  All of this results in the US being split into "two increasingly polarized cultures" or even "two countries."

Drutman's piece adds more social science observations, such as "As political scientist Lilliana Mason convincingly argues, 'The more sorted we become, the more emotionally we react to normal political events.'”

The justifications for these generalities or categorizations aside, both pieces use the term "tribal epistemologies" without defining it or ever referring to it again.

However, the tribalism motif goes into high gear with Andrew Sullivan's long mid-September essay "America Wasn’t Built for Humans," with the explanatory subtitle: "Tribalism was an urge our Founding Fathers assumed we could overcome. And so it has become our greatest vulnerability."

This time Sullivan is explicit about what he means by tribalism, if mostly through examples:

"Tribal loyalties turned Beirut, Lebanon’s beautiful, cosmopolitan capital, into an urban wasteland in the 1970s; they caused close to a million deaths in a few months in Rwanda in the 1990s; they are turning Aung San Suu Kyi, winner of the Nobel Peace Prize, into an enabler of ethnic cleansing right now in Myanmar." 

Other examples are Scots and Catalans.

All these are ethnic/religious groups within artificially constructed countries. Sullivan mentions Sebastian Junger's book Tribe, which posits that humans most naturally bond into small groups, as they have since the beginning of human society. So Sullivan (following Junger, one assumes) includes Dead Heads, Packers fans, and members of Facebook groups within the definition of tribes.

 African Americans, brought forcibly to America as slaves, constitute a tribe, as do Mormons.  Today the political tribes are called Democrats and Republicans, with distinct geographical distribution, socioeconomic status, educational attainment and so on, in addition to race.

Despite their differences (some of which Sullivan acknowledges), he calls all of these groups tribes.  His contemporary analysis of America however does not include the one group that defines itself as living in tribes: Native Americans, American Indians (known by different names in Canada, Mexico and Central America.)

In my work with American Indian organizations I learned a number of things, the first of which is that non-Indians like me should never presume to speak for Indians, because they can speak for themselves.  But my recollection as well as my educated guess is that the appropriation of the term "tribes" is generally offensive to them, as it does not comport with either their contemporary realities or their history and traditions. Nor does it, in my view, comport with those of other surviving indigenous societies.

Sullivan's essay appeared in New York Magazine online, as did a subsequent video The Psychology of Tribalism which in images as well as words takes the next step of linking political polarization to the history of tribes.

While the voice of an academic psychologist talks about the "normative value" of "following the crowd" fostered by tribal societies, the apparently contemporary video shows a group of girls in what appears to be Northwest Coast Native regalia, following each other to presumably participate in a ceremony or an exhibition of a traditional--that is, religious--dance.  

This combination of words and images I am pretty certain is specifically offensive.  This is the point where the distortions of a sloppy metaphor become ignorant and dismissive.

It is more than ironic that the only American societies with the name of tribes, legally recognized as such, largely do not fit the stereotypes of the tribe in these analyses.  Moreover, the history and traditions of existing Indian tribes contravene many of these stereotypes.

Even historically, the basis of American democracy and of the union of the United States can both be found in prior American Indian concepts and practices.  The tribal council was the first Congress on this continent.  The Iroquois Confederacy was the first union.

Today's tribes are heterogeneous, and blood relationship is usually just one of the factors required for membership, and may not be considered at all.  This pertains to small tribes (though some of them don't use the term), and the few large ones.  But in other ways, size matters.

Anthropologists say that humans banded together in small groups--sometimes only a dozen or so--for hundreds of thousands of years before history began.  This is the basis for the idea of tribes.  Small groups were often isolated, and sometimes hostile to one another.  But this was not always the case.  In fact, mating or marriage within the group--even if it were possible, with a group so small--was often frowned upon, or forbidden.  Marriages among tribes were important even when not necessary, as extending relationships helped keep the peace.

Tribal beliefs--including an origin story, ceremonies, etc.--might be different, but often they were similar among neighboring tribes.  There was enmity and stereotyping and fear, but there was also exchange of knowledge and ideas as well as goods.  Often there had to be, for survival.  Kim Stanley Robinson's Ice Age novel Shaman captures this very well.  There are also documented traditions among tribal groups (in Africa for instance) that required one group to help another experiencing food shortages or drought, with the understanding that they would be aided in return when they were in similar straits.

How any of this applies to either Cheeseheads or 35% of the American electorate eludes me.

Moreover, these tribal traditions cannot be understood without the basis that none of these analyses consider.  They are all about the relationship of these human societies to the natural world.

When Andrew Sullivan mentions Native American tribes, he quotes Junger as noting that in colonial times many whites joined Indian tribes (including captives who refused to return to white society) while the reverse did not voluntarily occur.  Sullivan and Junger conclude that this was because of the strength of tribal bonds and relationships.

But that's a simplistic and deceptive explanation, according to the reading and research I did on the so-called "white Indian" phenomenon, specifically in colonial Pennsylvania.  It was the way of life derived from the relationship to the natural world that attracted many.  Indeed, they rebelled against the constraints and enforced conformity of white society (especially the women), which were more "tribal" in the sense these analysts seem to use the term.

That they were embedded in the natural world, physically and spiritually, informed all the social and religious traditions of indigenous peoples in America and elsewhere in the world.  That relationship is so essential to early humanity that no analysis of their social organization can be separated from it.

This is also the basis for the revival of traditions within contemporary tribes, such as ceremonies by Northwest and North Coast peoples.  These ceremonies as well as languages and other cultural elements were forbidden by white governments for generations and many were partially or wholly lost.  American Indians suffered their own physical and cultural holocaust over centuries, and these revivals are ways of reclaiming identity and self-respect as well as tribal wisdom that's increasingly useful to the world at large as it faces the consequences of destroying its natural world context.

These are some of the reasons why American Indians might be sensitive about the appropriation of some distorted notion of tribe, not to mention specific expressions of what in many cases has been a hard won and nearly miraculous cultural revival.

So tribes reflect our primordial roots in small face-to-face groups.  Then how can a tribe comprise millions of people who don't know each other?  And if this is a tribal divide, why just two sides?  If we were subdividing on this basis, wouldn't we have many tribes?   The word starts to lose its meaning fast.

The application of the tribal model to this debate is questionable on other grounds, often due to the distortions of human societies caused by capitalism, as well as other factors such as urbanization and the dominance of the desert religions.

Capitalism as a social organizer means that a person's behavior and expressed beliefs are often determined by how they make their money.  Entire societies depend on that context.  A lot of our current division may as well--as income inequality becomes extreme.

A lot of what the tribal epistemology is supposed to represent depends on how you make your money (which also means who is paying you for what, and how much.)  The enforced sameness within a group seems tribal only when it is smaller in scale.  Sometimes it's been pretty big. Does anyone remember the 1950s?

The 1950s, as analyzed in the early 1960s, were characterized by what at the time was called conformity, and it was a force across American society.  There was one way to think and behave, modified by other factors such as ethnic groups and class.  The operative metaphor was the assembly line.

Conformity was so strong that apart from rebellion by some crazy artists and intellectuals, the only opposition was generational.  With those crazy artists and intellectuals as touchstones, the younger generation--first the teenagers and juvenile delinquents of the 50s, and then the Vietnam/hippie generation of the late 60s--diverged.

It was a worldwide phenomenon, the generation gap, and though the hippies were drawn to Native traditions and emulated aspects of tribes, the dividing line was between a portion of the Baby Boom generation and the majority of their parents' generation.  The lockstep conformity was broken.  And now that's apparently considered a bad thing, because it led to "tribalism."

Certainly there are merits to the analysis that sees strong political divergence according to cultural differences rooted in geography and history (the South, as the most vivid example) and socioeconomics.  These differences are abetted by a breakdown in common sources and standards of information, fostered by a number of factors, most prominent among them the new information sources made possible and accessible by the Internet and related hardware, but also by partisan political opportunists and what must be a really shitty educational system.

So if you posit two sides, as all these folks basically do, then you have two sets of facts.  Unfortunately for the theory, actual tribal societies could not afford two sets of facts.

They might have different beliefs about why things happen or what to do about them. (And the reasons things happened did not, as Sullivan asserts, always depend on gods, especially not in our cut-off-from-nature sense of the term.)   But their constant dependence on the realities of the natural world meant that its facts could not be ignored.  Close observation and accurate memory over generations were crucial.  In this sense, what we need is precisely a lot more tribal thinking.

There have been other objections made to Sullivan's essay.  I've previously noted my agreement with Jonathan Chait's point disputing the equivalence given to the two sides' contentions in the Drutman piece.  People from the left are fully capable of asserting and believing stuff that isn't true, but by and large, it is the folks from the right who do not respect facts and the process of ascertaining facts.  Some people are capable of being persuaded they are wrong, based on the application of scientific, scholarly and journalistic standards as well as logic and debate, and some people are not.

You have to accept the absurd right wing notion of a complete conspiracy, not only of people but of process, to make this equivalence.  Similarly, you have to bend the concepts of tolerance and intolerance a long way to the right to make that equivalence.  A sense of security and belonging of course pertains to the tenacious hold on views by both sides, but probably a lot more strongly on the right, which may have to do with the common fear and sense of threat implied by exclusionary beliefs (racism, misogyny etc.) and the particulars of a common and absolutist set of religious tenets.

Everyone makes assumptions and can be blind to them, although some can learn from having them pointed out. The difference finally is between those who are dogmatic (or entirely opportunistic) and those who are not.  Dogmatism is not an intrinsic quality of tribes or tribal knowledge.  Nor is shameless opportunism.  The metaphor fails.