Saturday, October 07, 2017

Big Ideas

In one last late night look at a few news sites for a sense of what Friday morning's stories might be, I happened on a live TV feed from Oslo, announcing the Nobel Peace Prize.  It went to an organization, the International Campaign to Abolish Nuclear Weapons, known as ICAN.

By the time I checked my usual list of news sites late Friday afternoon, there was no trace of the story on most of their front pages, including those of the NY Times and Washington Post, nor on the Google aggregator.  The Guardian, which had followed the announcement with a live blog for its UK and European readers, had buried its story deep in the front page weeds.  Even the BBC, which had room for butterflies and pandas, didn't front page it.

The only US site I saw that had a story was the New Yorker, with Robin Wright's report entitled The Nobel Peace Prize Goes to Anti-Bomb Idealists. Though the piece itself is informative, the headline and the opening sentence ("The dreamers won") strike a dismissive tone.  It is, after all, an organization that works with nearly 500 other non-governmental groups in 101 countries, and it scored a success with a UN treaty banning nuclear weapons, currently being ratified by the more than 120 nations that voted for it.  "Anti-Bomb Activists" might seem more appropriate and less condescending. But then again, ICAN's budget is a meager one million dollars a year.

If the implied sneer is discounted, this active representation of an ideal is precisely why the group deserves the award.  They are advocating for an idea, a Big Idea, and it is with such big ideas that nearly every step forward in civilization has begun.

The ideas come first; then they are stated in some major way, and a society embraces them.  In American history, it was the bold idea that we are all created equal, appearing in the Declaration of Independence as the primary rationale for a new and self-governing nation.

Once these ideas are so embraced, they become accepted standards that seem to have always existed.  They also make hypocrisy possible when people and nations don't live up to them and businesses find creative ways of circumventing them, which leads to outrage, condescension and cynicism, as well as activism and change.

That we are all equal under the law was not a given, not even an idea or an ideal, for much of western history.  Now we have arguments about whether lots of money makes you more equal than others even in court, but we wouldn't have that or any other argument unless the principle existed and was enshrined in our highest laws.

Slavery was a legal and accepted business practice for much of American history. At one time, the idea that it is wrong was a new idea that was met with fierce opposition as well as condescension.  It was outlawed in England, and in the midst of a war in which it was the primary issue, the Emancipation Proclamation began the process of ending it in America.

A great deal has followed from that idea, based on human equality, and it has echoed through our age, from civil rights to the end of legal discrimination based on sexual orientation.  The idea has not been fully applied, and obvious retrenchment is being attempted today, just as new forms of slavery have arisen and spread all over the world.  But slavery and racism and other forms of discrimination are no longer accepted as legitimate.  They are societally shameful.

In the 20th century a new idea painfully emerged: the world.  International treaties began to codify rights and relationships among nations not based on empire or any coercion.  An international organization formed and, for the first time, lasted and grew into a global body: the United Nations.  It promoted some awfully big ideas.

One was universal human rights, beyond the borders of any one nation. Together with others in England, H.G. Wells, who made the case for world government throughout his long career, devoted the last years of his life to the cause of codifying such rights--which had never been done before--and to insisting that the international body to be formed at the end of World War II formally adopt them.

Others (among them, Eleanor Roosevelt) negotiated such a document, and the Universal Declaration of Human Rights was considered by the UN in 1946, its first year of existence, and formally adopted in 1948.  It is the basis for further codifications and declarations, and for UN policies and actions, such as those that the UN and member states took to end apartheid in South Africa, and in so doing, to end that idea as a normal right of the state.

But as Berit Reiss-Andersen, the chair of the Norwegian Nobel Committee, noted in awarding the Peace Prize to ICAN on Friday morning, one of the UN's very first resolutions in 1946 was to establish a commission to deal with the problems of atomic energy, including nuclear weapons.  The historic treaty in the 1960s banning atmospheric nuclear tests led to a complete cessation of such tests.  And as Reiss-Andersen also noted, the non-proliferation treaties of the 1970s committed nuclear nations to eventual nuclear disarmament.

The 2017 UN treaty legally binds the signatories to prohibit nuclear weapons completely; 122 nations voted in favor of it.  None of the nuclear nations or their close allies were among them, but the idea has been stated and supported by a majority of world governments.  It means something a bit more than a dream.  It sets a standard that every nation must deal with.  Nuclear weapons, like chemical weapons, are outside the norms of current civilization.

Such ideas, even when officially stated at the highest levels, and even when enshrined in law, can be violated, and often are.  But we don't call laws against murder no more than addled idealism by naive dreamers just because people continue to kill each other.  Not even when they get away with it.

Another big example of a big idea concerns war itself.  Louis Menand wrote a New Yorker piece called What Happens When War Is Outlawed.  It begins with a little-known international treaty, eventually signed by every significant nation in the world, that outlawed war between nations as an instrument of policy.  It was the General Treaty for the Renunciation of War.

It has been completely lost to history, Menand notes, principally because the last nations signed it in 1934, just a few years before the largest war in human history began.  So once again, at best, an embarrassingly naive expression of misplaced idealism (like the celebrated 1939 New York World's Fair), if not cynical to begin with.  A hard-headed realist Cold War foreign policy expert called it "Childish, just childish." Certainly ineffective.

But was it really?  Menand points out:

"And yet since 1945 nations have gone to war against other nations very few times. When they have, most of the rest of the world has regarded the war as illegitimate and, frequently, has organized to sanction or otherwise punish the aggressor. In only a handful of mostly minor cases since 1945—the Russian seizure of Crimea in 2014 being a flagrant exception—has a nation been able to hold on to territory it acquired by conquest."

The rest of the piece goes into various contributing reasons why this has been so, the most recent of which is globalization, in which international trade is too essential to disrupt with warfare.  But my conclusion would include the very fact of the spread of the big idea represented by that treaty outlawing war: that war is no longer considered a normal, legitimate instrument of policy.  Whereas for centuries before, it was.

The latest Big Idea to emerge in a big way was expressed in the Paris climate agreement.  For the first time in human history, official representatives of nearly every human on the planet agreed to take responsibility for fixing what humanity had done to destroy that planet.

Like every other such statement, the actual accord may be flawed and not immediately effective, and backsliding will occur.  But the response by other nations, as well as by states, cities, businesses and citizens of the US, to its rejection by the current White House incumbent shows it is still a powerful Big Idea, inspiring passion and determination.

The idea is now Big, and humanity has agreed on it.  It may well happen that the damage to the planet will doom civilization and the Earth as we know it before humanity can fully live up to this idea.  But even in the event of that ultimate tragedy, there is at least this expression of the best in humanity and human civilization.

We have enough evidence of human weakness and evil to tempt all of us to consider these as the essence of human nature.  We have to remind ourselves that, just as individual humans do exhibit such contrary traits as goodness, intelligence, courage, kindness, fairness and self-sacrifice, human civilization has had some awfully good Big Ideas and they have changed things for the better.  The Nobel Peace Prize shines a light on one of the most important of those Big Ideas today.

Wednesday, October 04, 2017

The Big Beep


"The Russians, Conquerors of Space. Oct.4, 1957. I have just heard some news which will affect my whole future. Russia has just successfully launched the first man-made satellite into space…How did the Russians do it? Out of their own ingenuity? Did they get information from a spy in America? A traitor? All the work our scientists and top brains did, what for? Will the Russians take advantage of this and use it to start a war?"

I wrote that, in my brown notebook, just minutes after I heard the news--after in fact I heard the actual beeps from Sputnik in orbit, as broadcast on the radio.  Sixty years ago tonight.



I was 11. So were Bill Clinton and George W. Bush. Steven Spielberg would turn 11 in a couple of months. George Lucas was 13. Sputnik changed our lives, in some ways encouraging our dreams of the future, but also introducing a new dose of fear.

I was in my room, ostensibly doing my homework.  At the end of a long, multi-jointed arm, a green-shaded lamp focused light on the surface of the heavy, dark-grained wood desk, a hand-me-down undoubtedly older than I was. The rest of my room was in shadow.

Maybe I actually was doing my homework.  But that brown notebook also contained the text and drawings of my latest science fiction effort, "The Desert Menace." Anyway, I was engrossed in my pool of light, so when my bedroom door flew open I was startled. My father leaned in, and asked me if I’d been listening to the radio. I said “no” defensively, but he wasn’t checking on my homework diligence. He said the Russians had launched a satellite into space. It was orbiting the earth right now.  My parents were in the living room, watching television.  It was such an important event that a news flash had interrupted network programming.

Alone again, I reached up and to the right to my bookshelf, and switched my radio on.  My father had put it together from a kit.  It had a slate gray face but its works behind were exposed. (That radio looked like this one.  It may even have been a "Space Scanner.")

It was supposed to be a shortwave set, but despite its impressive dials, it seldom pulled in anything farther than the local AM station, WHJB.  "Nightwatch" was my favorite program, with a mysterious instrumental opening--a song I heard almost every night but have never heard since. Eventually that station played a recording of the beep.

A lot of people were shocked.  The very idea of orbiting a satellite in space was nothing but silly science fiction to them.  I was used to the idea from, well, the silly science fiction I watched and read.  But I was also a little better informed.

I knew about IGY--the International Geophysical Year.  I probably learned about it from a Young Catholic Messenger article (a periodical we got at school) that dramatized the upcoming first satellite launch during this year of globally shared geophysical research.  Only this was supposed to be a United States launch in 1958, climaxing the IGY.

I'd also seen a television documentary on the IGY.  A newsman was interviewing none other than Charles Van Doren about it.  Van Doren had the reputation of being the smartest man in the country because he was a quiz show champion.  When the newsman asked Van Doren if the Russians might orbit a satellite first, he just chuckled.  (Van Doren of course became the most famous quiz show contestant to be exposed as having been given the answers.)

But the Russians had done it first, and the news and shock rippled around the country. News of Sputnik had spread first through government and scientific circles earlier that day. Around 6:30 PM on the East Coast, President Dwight Eisenhower was at Camp David when he was told. It was just a few minutes after 8 PM in New York when RCA technicians recorded the sound. Sputnik had already orbited over the U.S. three times by then.

NBC News broke into radio and TV programming coast to coast. “Listen now for the sound,” the radio announcer said, “which forevermore separates the old from the new.” (The announcement did not, as some stories say, break into broadcast of the World Series. October 4, 1957 was a travel day for the Yankees and Milwaukee Braves--no game was played. Besides, night games in the World Series didn't begin until 1971.)

It was rush hour on the West Coast. Commuters might have been listening to Jimmie Rodgers sing “Honeycomb,” the current number one hit, or the song it dethroned, “That’ll Be the Day” by Buddy Holly and the Crickets. When they first heard the Sound.

We're now told that people in the know in Washington were expecting it, and that Eisenhower’s military people welcomed it, specifically because they wanted to spy on the Russians from space, and now the Russians could hardly object when the U.S. sent a satellite over their country.  They sure didn't seem happy about it at the time.

Sputnik specifically changed the lives of my generation.  Apart from increasingly delusional 'duck and cover' drills, it started a national fervor over science education and education in general, parodied for our generation by the Firesign Theatre pitting Communist Martyrs High School against More Science High.

It also led to the National Defense Education Act, which provided the first federal loans for college, which is how many of us got there.  It was in fact one of the historical breaks that allowed a working class kid like me to go to college, or at least to have some choice of which one I'd go to.

But the Cold War fear and alarm I'd expressed in my notebook was about more than a spy in the sky.  People immediately thought of being bombed from space. Though that wasn't literally possible, there was some truth to it.

This was 1957, little more than a decade after the first atomic bombs. Although the U.S. had exploded the first true hydrogen bomb in 1952, it was too large and fragile for a weapon. The bomb the Soviets designed and exploded the next year was not as powerful, but it was already a weapon. The U.S. soon created usable hydrogen bombs, but the Soviets had a brief advantage, which had shaken the military establishment.

Now it seemed the Soviets had leaped ahead and were a much greater threat. Until then, an attack on the U.S. or Russia could be conducted only by using bombers. Although the U.S. was rapidly developing guided missiles, Sputnik (and especially the bigger and heavier Sputnik II launched a month later) proved the Soviets had built missiles capable of reaching the U.S. and delivering atomic bombs.

Missiles were much more of a threat--they were faster than bombers and harder to detect. Airplanes could be shot down, but not guided missiles. They didn't have to be particularly accurate, because hydrogen bombs were so powerful they didn't have to be delivered to precise targets. To destroy New York or Moscow might require as many as 24 atomic bombs. The first hydrogen bombs were each as powerful as a thousand Hiroshima bombs. New York could be destroyed by one of them, which would also produce radiation lethal to the population of Washington, D.C., and would contaminate most of the Northeast, into Canada. The "lethal zone" in the Pacific H-bomb tests, after the Bravo test proved so powerful, was equal to 20% of the continental United States.

That's why, by the way, a North Korean hydrogen bomb that can be delivered by a guided missile is such a big deal.  It's a geometric increase in destruction.

As the first satellite, Sputnik also introduced the globe-circling systems that have become central to global life in 2017 for communications of all kinds, GPS, power grids and more.  All that became practical just five years after Sputnik, when the first communications satellite, Telstar, was launched by the US, and we were all rocking out to its theme song.



Now the degree of our dependence on these satellites is so great that failures of these systems have apocalyptic possibilities to rival thermonuclear war.

But for all the foreboding, the reality of Sputnik overhead was eventually also exciting (though not entirely until the televised satellite launches by the US in 1958).  Sputnik was the first real event that proved all those science fiction dreams of rockets into space weren't just fiction.  For kids my age who'd flown on the Galaxy, the Polaris, Terra 5, the Orbit Jet and the United Planets Cruiser C-57D, as well as the first manned spaceship launched on Disneyland's "Man in Space" in 1955, it was an exciting glimpse of the future we hoped for and expected, a tantalizing Coming Attractions.

 And it all started sixty years ago tonight.

Snowflakes


That I am way behind a lot of popular culture is not news.  Neither is the fact that I don't much care.  But sometimes, something interesting, or even halfway significant, catches up with me.  Like this new meaning of "snowflake."

I saw it used twice in contexts I didn't understand in one Internet sift, so I looked it up.  Merriam-Webster characterizes it as "a disparaging term for a person who is seen as overly sensitive and fragile."

Its origin (according to M-W) is another tipoff as to why I wasn't familiar with it: "In the lead-up to the 2016 U.S. elections it was lobbed especially fiercely by those on the right side of the political spectrum at those on the left."

These days it tends to be used (and not just by the far right) in the context of the easily offended on college campuses, and the culture of trigger words, micro-aggressions and speech that makes people "feel uncomfortable."  Sensitive snowflake is the usual combination.

Thus the Wiktionary definition (#4): "Someone who believes they are as unique and special as a snowflake; someone hypersensitive to insult or offense, especially a young person with politically correct sensibilities."

It is now used, however, against the right and its own sensitivity to criticism, as by Jonathan Chait, who referred to the current regime in Washington as the Snowflake Administration.  The Urban Dictionary definition turns it around completely and aims it at the alt-right, "whose immense white fragility causes a meltdown when confronted with the most minute deviation from orthodox White Supremacy."  This connects it to a 19th century use, when "snowflake" meant whites who opposed the abolition of slavery.

And of course, somebody (at the Atlantic in this case) applied it to both sides and dubbed the entire country A Nation of Snowflakes.

Today's use is also age-based. M-W again: "There were glimmers of this use in the decade and a half that preceded that election, but the meaning at first was a bit softer, referring mostly to millennials who were allegedly too convinced of their own status as special and unique people to be able (or bothered) to handle the normal trials and travails of regular adult life."

M-W traces this use back to the cult movie (and book) Fight Club in the late 90s.  It also reminds me of the 2003 musical Avenue Q, which ran for six years on Broadway and several more elsewhere in New York, and has since been produced in other places (in Eureka, for instance).  It was in large part a satire on Sesame Street, bemoaning the contrast between the happy positive messages Sesame Street gave Millennials as children and the hard world they faced out of college.  It turns out, they sang, that they aren't as unique and special as snowflakes.

Obviously a lot of people liked this hit show (though I pretty much hated it), and that sentiment has fed into the negative portrayal of the Millennial generation.  I don't want to get into a war of generations, or generational stereotypes.  My purpose in bringing the subject up is to take it back to individuals. Individuals who are unique snowflakes.

Now I recognize that there are ways in which all assholes are the same, or else we wouldn't recognize them as such.  And I recognize that culture, especially popular culture, tends to enforce sameness, very powerfully for the generations who've grown up with social media and smart phones.  (There's even the assertion that smartphones have already destroyed a generation.)

But I found in my years of interviewing people that there was something unique about all of them.  This especially became obvious when I talked to them about their work.  Some people, however, are obviously remarkable and unique, and they ought to be celebrated.

Some of them are young.  I know of two young women who grew up in this community who have done immense good for people in some of the harshest conditions on the planet.  (One of them was on the cover of Time as a Person of the Year.)

I know of others who approach their lives and careers with intelligence, very good questions and an open heart.  And there are those who come to my attention through the media who impress me.  One of these is Raianna Brown, who was briefly famous in that Internet evanescent sort of way when a photo of her in cheerleader uniform taking a knee in protest at a Georgia Tech football game went viral.  (Photo at the top.)

What's especially impressive is that while the reposted photo turned heads last week during the NFL protests and contortions, it was originally posted to no acclaim at all last year, which is when she felt moved to protest.  I also learned from this story how thoughtful and articulate she is, and how impressive.

Although many protests involve others--with some involving many others--it is an individual decision to participate.  Protests in America, particularly racial protests, are always met with rabid opposition and are rarely popular, as Ta-Nehisi Coates proves in a new essay.  This includes the original March on Washington in 1963, now universally celebrated and treated with reverence, though those of us who took part in it knew that, according to one poll, 60% of the country viewed it unfavorably at the time--and at times it felt more like 75%.

In this case Raianna Brown protested completely alone, though because of who she is, she had explained what she was going to do and why, and had her coach's support.  She knew she would get criticism, but in addition to making a stand, she wanted to start a discussion.

Young people like Raianna Brown give me hope, not just for what she did but for the character and quality of her thought that comes through her words.  She is special.  She is unique, and should inspire others to explore and cultivate their unique opportunities for being in the world and contributing to a better future.

Only when young people believe in their integrity as individuals can they have the confidence to participate to the best of their unique abilities.  Sesame Street was not wrong about a lot of things, and certainly not about this.

Monday, October 02, 2017

Las Vegas (and Puerto Rico)

It may seem unfortunate that I chose last night to post the adage below under the title of Today's News, when today's major news story would be a mass shooting in Las Vegas, where guns and gun shows are big business.

 But without demeaning the pain and suffering involved in this horrific event, the adage suggests both the evanescence of the surrounding situation, and in a different way, the depressingly repetitive nature of the event taken as a whole, within the current context.

If the United States survives as a civilized nation for another century, it will have universal healthcare and effective gun control.  As both ideas or ideals and as ongoing policies with bureaucracies and rules to carry them out, these are features of civilized nations now.  They are so important to the needs of civilization going forward that the US cannot continue as an exception over the long run.

But the short term is something else, and a number of essays point out the numbing ritual of political response.  Several (for example, here) also point out that because the shooter is apparently a white guy (with a huge and sophisticated arsenal in a hotel room), it will become a story of a crazed individual, but if it had been someone of a different type, it would be a rallying cry for repression.

James Fallows' essay at the Atlantic--Two Dark American Truths From Las Vegas--covers both aspects: "The first is that America will not stop these shootings. They will go on. We all know that, which makes the immediate wave of grief even worse."

He notes examples of other countries that have enacted gun control after mass shootings, and have stopped them or at least made them less frequent and deadly.  "No other society allows the massacres to keep happening. Everyone around the world knows this about the United States. It is the worst aspect of the American national identity."

He points out that the perpetrators of mass killings in America are almost always white men, but what if this one was not?  That's the other dark truth:

"If they had Arab-sounding names, this would be a new episode of jihad. How often has Donald Trump invoked “San Bernardino” in his speeches, as shorthand for the terrorist threat in our heartland?
If they were Mexican, they would demonstrate the perils of immigration, and that Mexico is “not sending its best people.”
If they had been illegal immigrants, they’d dramatize the need to crack down harder, right now.
And if they had been black, I shudder to imagine the consequences."


Further, I point out another set of victims: the people of Puerto Rico and other islands devastated by Hurricane Maria.  With this story diverting the attention necessary to fuel a sense of urgency, their needs are likely to be ignored.  Thirst and hunger are just as deadly as guns.

Today's News

"And this, too, shall pass away."

A Persian adage, quoted by Abraham Lincoln, among others.  One tale says that it is an inscription on a ring that makes the happy person sad, and the sad person happy.

Art by Arp.

Sunday, October 01, 2017

Peter Hall

One of several essays that appeared after Peter Hall's recent death, written by younger theatre artists he mentored, ended with traditional words: We shall not see his like again.

Traditional, even cliched, and yet these words are not only true in his case; it's hard to think of anyone in theatre since Laurence Olivier to whom they so clearly apply. And equally hard to think of anyone now alive to whom they could apply, at least in the same way.

But his institutional and artistic achievements in theatre resonate far beyond the theatre world.  (And not just because he also was a creative force in film and opera.)

As a very young director who ran a small London theatre, he introduced Samuel Beckett to the English-speaking world with his production of Waiting for Godot in 1955.  In the same period he was the first to stage plays by Harold Pinter, ultimately directing around 10 of them.

The influence of these two playwrights on theatre and film, on all the arts, and even on how we speak and think today, is profound.  This alone (along with championing America's Tennessee Williams when his home country was done with him) would have made him a significant contributor to theatre and cultural history.

At the same time as he broke new ground with living playwrights, Peter Hall directed innovative productions of Shakespeare featuring some of England's great stage stars.  He then founded the Royal Shakespeare Company in 1960 before he was 30.

His achievements there were even greater, because he changed the way Shakespeare was performed, and contributed a great deal to expanding the audience and the understanding of Shakespeare.

  Without a lot of gimmicks, he made the plays more accessible, and the popularity as well as prestige of the RSC made Shakespeare a popular playwright again.

Together with John Barton (whose "Playing Shakespeare" series lives on YouTube) he found in Shakespeare's verse the directions for speaking it, and for playing the part.  This had the effect, he insisted, of making the verse more intelligible and more natural, so that audiences understood and could enjoy more.

Today we are used to Shakespeare again being part of a common experience, and also of personal revelation.  For example, shortly before leaving office President Obama told New York Times book critic Michiko Kakutani that Shakespeare is “foundational for me in understanding how certain patterns repeat themselves and play themselves out between human beings.”

Many people contributed to a Shakespeare revival in the 1950s and onward, but Peter Hall and the RSC (which left a legacy of every play recorded on DVD etc.) were instrumental.

Peter Hall insisted that the director's job was to reveal the play on its own terms, not impose concepts or see the plays as opportunities for the director's self-expression. He insisted on specifics and favored collaboration with the cast, even to the extent of involving the cast in set and costume design as well as the blocking of the play.

But he moved on to an even greater institutional accomplishment, when he led the UK's fledgling National Theatre into prominence in the 70s and 80s, taking it from a small company doing a half dozen plays a year at the Old Vic, to its huge new building with more than 100 actors and 500 staff producing 18 to 20 plays a year. In the process, the National overcame general opposition by becoming popular with the public.  Again, Shakespeare was integral to his achievement there.

Today, despite government cutbacks, the National is as thoroughly a British institution as, well, the Royal Shakespeare Company. They are such pillars of English life and what England means to the world that by now it seems that both must have always existed.

Peter Hall made his mark on American theatre as well, as evidenced by this essay and his New York Times obit.  May he rest in peace, for his legacy lives on.