Wednesday, December 08, 2021

Freedom 2021

In November 2014, the 85-year-old Ursula K. Le Guin used her brief speech accepting her National Book Foundation Medal to issue a pointed warning.

 “Hard times are coming,” she said, “when we will be wanting the voices of writers who can see alternatives to how we live now, can see through our fear-stricken society and its obsessive technologies to other ways of being, and even imagine real grounds for hope. We’ll need writers who can remember freedom...” 

 To further emphasize this point, when the address was published in Le Guin’s last collection of nonfiction, Words Are My Matter, it was—it is—titled “Freedom.”

 Le Guin’s speech concentrated on commerce: on the commoditization of literature and commercial censorship.  (Her epic words—“We live in capitalism, its power seems inescapable—but then, so did the divine right of kings”—transcend her subject.)

 Perhaps her most chilling phrase is “writers who can remember freedom...” The assault on this kind of freedom was and is ongoing, and in the seven years since, it has gotten worse.  Commercial censorship (actual and de facto) continues, and the banning of books has gone from an accelerating danger to a trend.  The most prominent and vicious cases are driven by racial fearmongering and cynical rabid right politics.  But elements of the left, or education bureaucracies trying to interpret sensitivities, are also guilty of going overboard with censorship.

 Censorship takes away access to creative contributions to our understanding, and often the textures of history.  It also discourages free expression in the first place, by means as complex as shaming and self-censorship, as well as economic intimidation.

  “...as the mind recaptures its capacity to create/and the soul may regain its freedom,” wrote the African American poet and painter Jesse Murry. That freedom includes the individual soul of the one who creates, but also the soul of the one who apprehends the creation, and ultimately the soul of the community, in small or at large.

 But freedom to create and freedom of access are also related to a freedom under even greater assault in 2021: freedom to seek, discover and speak the truth, or even all the uncertainties that sometimes are the best we know.

 Elements of the rabid right, led by or playing into the hands of white supremacists, are using the largely phantasmal specter of critical race theory to prevent a generation from learning about racism and racial history in America.  At their behest, teachers are fired, curricula changed and books banned.

 But extremists on the other side are playing right into their hands with their own broad brush of censorship and personal destruction known (and this time, I’m afraid rightly known) as cancel culture.  

Ursula K. Le Guin did not address this, and perhaps fortunately, she did not live long enough to watch the reputation of her father assaulted by a literal cancellation, when the University of California at Berkeley erased the name of her father Alfred Kroeber from Kroeber Hall, center of the anthropology department that he essentially created.

 In an article published in the LA Progressive entitled “Cancel Culture Hits Berkeley,” professor of medical anthropology Nancy Scheper-Hughes wrote: “To negate or ‘cancel’ Alfred L. Kroeber is to censor and defame one of the most distinguished anthropologists in America.”

 She notes that the decision was based on a report “prepared hastily and secretly.”  It was not shared with the anthropology faculty, and some of those who made the decision “clearly knew little or nothing about” Kroeber’s legacy or contributions, “and even less about Kroeber’s lifelong relations with Native Californians who worked closely with him to create one of the largest archives in America on the indigenous languages and cultures of California.”

 Kroeber’s cancellation came at the behest of some academics and activists (including some Native Americans) who made various accusations about his work with California Indians and specifically the man known as Ishi, believed to be the last member of the Yahi tribe in the early 20th century.

 Scheper-Hughes, who has solid credentials and expertise in these matters, goes into great detail about Kroeber, Ishi and the various issues involving Native Californians.  While she finds serious fault in some areas, and does not spare the anthropology department in general, especially for its treatment of Native remains, she finds that many of the accusations are overstated when they are not untrue. 

 She also provides instances that show that this damning critique of Kroeber is by no means shared by all California Indians—especially not by elders who knew him. She quotes an official statement from the California Council of Indians in 1950, shortly after Kroeber’s death, expressing gratitude for his work and friendship, “and in later life our greatest hope for long-delayed justice.”

 Her lengthy article, with links, footnotes and citations, may contain assertions open to debate. But at least she provides the basis for debate. The report against Kroeber, she writes, consisted of undocumented accusations. Nevertheless, Berkeley decided to mollify the accusers by erasing Kroeber’s name from their campus, which he helped put on the map.  Then again, unlike the namesakes of other buildings, he wasn’t a big donor.

 Cancellation in this case threatens to wipe out a legacy.  In others it ends careers and reputations. As justified as it might be—or at least feel at the time--it is very much akin to capital punishment without trial, rules of evidence or presumption of innocence.  There is no balance or proportion or context.

 And it sends out ripples, first of guilt by association.  Scheper-Hughes also asserts that to cancel Kroeber “also means negating and deriding Kroeber’s wife, Theodora Kroeber, the author of Ishi in Two Worlds, and Kroeber’s daughter, Ursula Kroeber Le Guin, who produced beloved books that were inspired by Kroeber’s ethnologies of Native Californians...”

 Which makes one wonder: if this had happened in Le Guin’s lifetime, would she have been free to imagine those worlds?

 Because it also sends out ripples of fear.  In his time, Alfred Kroeber did the unconventional.  He documented and helped to save Native languages that the rest of society was happy to see die. He was a friend and collaborator of individuals that many of his peers considered savages.  He took risks and made mistakes, but he was far ahead of his time in his beliefs and his search for justice and truth.

 Yet in 2021, from the right and the left there is this assault on the very idea of truth.  The far right has gone so far as to deny facts of which there is visible and audible evidence from a thousand cell phones in Washington on January 6. My generic Google news aggregator every day posts “fact checks” painfully debunking the most viral and preposterous assertions and conspiracy theories claiming to be facts.  In a throwaway comment in a book review, a writer this week asserted what is apparently a common observation: “Posts on social media aren’t expected to be strictly truthful by their billions of users.”  

 It’s bad enough that the rabid right, now running a major political party with apparently great success, values successful trolling over truth, to the extent that truth to them is a concept without content.  Facts are just agents of manipulation.  More basically, facts are judged according to how they comport with positions and prejudices. At bottom they are expressions of displaced emotions—usually from the dark side of the unconscious.  It doesn’t take long for them to be multiplied as expressions of group emotions: of a mob. 

 But now elements of the left are reportedly joining in this assault.  In support of “White Accountability,” some scholars have prepared lists of qualities and behaviors that are White versus those of non-Whites.  (Eric Levitz in New York Magazine reproduces one of these lists.)  As Levitz points out, they may have some validity as cultural abstractions or tendencies in specific contexts, but not as blatant racial characteristics.  As such, they are in themselves racist, tragically lending validity to a typical rightward charge of racism against whites.

 Perhaps worse is their misuse.  If calling for evidence and logic is White, and therefore doesn’t apply in, say, accusations of racial “micro-aggressions,” then we are once again left with no basis for ascertaining fact.  One can understand the force of grievance.  But there must be arbiters, and there must be standards of evidence, or there is no justice possible for anyone.

 There is truth in emotion, but it alone is not the test for the standards of fact that must be the common ground of a democratic society, or academic institutions.  No institution, no country, can remain free without some standard of fact.  Without it, imaginative literature as we know it isn’t even possible.

 In some of its manifestations, cancel culture expresses frustration with a justice system that bottom to top is rife with corruption, bigotry and the law of money.  But denying the very basis of justice invites a vicious circle of tyranny.

 A healthy society can’t just erase the history it doesn’t like, or believe it is repudiating it by selecting scapegoats. Taking down statues may be right and cathartic but hardly sufficient.  Evaluation and judgment using the conceptual tools we’ve paid for so dearly are needed. Historical context is a vital opportunity for learning.  Freedom to explore it is crucial.  For one thing, such inquiry is the basis for knowing what should be changed, and therefore of political and institutional action.

 With the Supreme Court set to end a woman’s right to choose and send this nation hurtling towards the previously fantastical vision of The Handmaid’s Tale, freedom of expression is hardly the only freedom under mortal assault.

 But when we are relegated to one of two increasingly extreme and rigid camps, with money as the only common executive in chief, freedom to search for, discover and express truths is being overwhelmed, and a new Dark Age approaches with the seeming inevitability of a Greek tragedy.  Freedom in this spiraling down no longer defines the society—the “free society”—but becomes just another word for nothing left to lose.

Monday, December 06, 2021

Politics 2021


 I don't write much about politics here anymore.  Why warn of what everyone knows is happening?  Just because the Rabid Right misuses Nazi analogies doesn't mean they don't apply.  We are Germany in the early 1930s.  Our march to 21st century virtual reality fascism seems inexorable, if not inevitable.  

How many times did political pundits declare the Republican party discredited, disreputable and dead as the party crossed one normative line after another?  But Republicans either quit or got with the fascist program, or else were punished and purged.  Now they are all but officially the American White Supremacist Fascist Party, even if many of their officeholders have zero integrity or commitment to democracy or even an ideology, and are only interested in retaining personal political power at any cost and the open taps of certain corporate supporters.

But the clincher is Covid.  Republican officeholders are creating conditions for more people to get sick and die so they can blame it on Biden.  They are sacrificing actual real lives (though mostly old people) for political gain.  Usually politicians don't do this so blatantly.  But is the American public alarmed and outraged?  Nearly 800,000 deaths officially--half a million people over 65--and certainly many more than are officially counted, apparently aren't enough to matter.  How many lives will it take till we know that too many people have died?  The answer, my friend, is blowing in the wind.

Whatever the historical analogies of our current rigid political and cultural divisions, the mutual disdain and distrust of any government (or science or anything else) by what seems like a substantial proportion of the population--this politicization of everything--offers gloomy prospects for effective response to future national challenges, including the foreseeable effects of climate distortion.  And that's regardless of any electoral outcomes.   

The electorate in 2021 seems composed of one-quarter Rabid Right fascists and one-quarter surly and impulsive voters, unable or unwilling to absorb or judge crucial information, whose voting patterns are little more than acting out.  Because of them (and Democrats who didn't vote), Republicans successfully market-tested their fascism in this year's elections.  Who would have believed that fulminating members of a party that says what everybody saw happen on January 6 didn't happen would actually win the next elections? (We will see if these elections were won on national issues or local issues plus the respective candidates.  Less publicized were some recent local elections that Democrats unexpectedly swept.)

The other half of the electorate broadly agree with one another, but obsess on what fractures them from the others.   They can be coalesced around a candidate like Obama, or to oppose a Trump.  But in 2021 no Obama is apparent. 

 President Joe Biden is fearless and a smart political operator--he knew enough to ask for more than he expected to get, and he still may wind up with several multi-trillion dollar changes for the better.  But the default position of the American media and public is to listen to very little of what a President says.  Even a politician with the skills of an FDR would find it difficult to get through.  Homegrown Hitler did, but being the Troll-in-Chief only gets you attention and a cult of personality--it can't bring a country together around a vision or a program.  Demagogues have the advantage of evoking the violent dark side; it's harder to guide the light.  

Making a speech while standing in front of a broken bridge or a sparkling solar panel doesn't get you more than a sound bite that comes and goes.  Not since JFK and LBJ has there been a Democrat who could command attention over the noise, at least enough.  Whatever it takes to "communicate" these days, Democrats haven't yet figured it out.  In 2021, it's hard to see where that skill or voice will come from.  It's also not clear who will replace Biden and Nancy Pelosi--the Last American Hero--in effective legislating.

President Biden is less than one-fourth into his term. Things can change (though I wouldn't count on the egregious abortion/choice issue to do it.)  Whether or not voting rights legislation is possible or can come soon enough to govern the rules for 2022 or 2024 is one of the big questions for the coming year.

But Biden will still be President for both those elections.  In the coming year--and certainly by '24--the White House should be seriously gaming out federal responses to various alarming possibilities, just as if they were the Pentagon preparing for various war scenarios.  For it seems that if the Republicans haven't gerrymandered themselves into power, they will try to negate elections and elect themselves at the state level.  And if that doesn't work, armed insurrection is next on the menu. 

 Before January 6, 2021, that might have seemed like paranoia.  Not now. So what will the federal response be to an insurrection taking over the government of Georgia?  Of Michigan?  Of reversing the outcome of federal elections?  Some folks need to be thinking about this now, and getting reliable resources ready. For the United States has enemies, foreign and domestic--and the most obvious right now are the domestic.

Monday, November 29, 2021

TV and Me: A Million Year Journey to Johnny Jupiter


Television and I grew up together.  This is our story. First of a series.

Once upon a time, television began.  It was before wall-sized flat screens, streaming, HD, 3D, 4K, 5D...

 Actually, the story starts earlier.  Maybe a million years earlier.  Or at least several hundred thousand...

 Once upon a time, people talked.  Occasionally one person talked to a group of others.  They gathered in front of the fire in winter to hear a storyteller tell them the history of the world and their people, the adventures and fatal mistakes of their hunters. Or children gathered around an elder, or heard bedtime stories from their mother. They heard tales of animals and humans, and how they changed into each other.  They heard how the leopard got its spots and how the coyote lost its tail.  They heard stories about children lost in the forest, and people who were way too greedy. 

Listening to stories—and watching the storyteller—around the fire must have been a part of life for a very long time.  When people were alone, they might stare into their own small fire, and think about the stories, and daydream their own...

 Eventually a number of people acted out such stories as moral tales and histories.  They might include music of the drums and flutes, and interludes of jugglers, tumblers and clowns, as well as painted masks and regalia.

 
Often these stories, told or acted out, were repeated.  Some were ceremonial, but others were teaching stories.  Some were scary, some made you laugh.  People might hear or see the same story told many times, and they might notice that one storyteller was better than another.  In this way, they distinguished between the teller and the tale.

 Other stories were told in pictures.  You might need special credentials to look at the ones on the walls deep in caves, but others were scratched into rocks for everyone to see.  Later other kinds of pictures told a story, in paint, in stone, and in stained glass.

 Eventually these elements combined in theatre.  They all had one thing in common: in order to experience the story, you had to be at the place and at the time they were told. 

There was one exception.  Somewhere in this history, years after writing developed and spread, years after books were copied out and stored in libraries, the invention of the printing press quickly made it possible for almost everyone who could read to acquire and read books.  Books told stories to you alone, anywhere you were, including your home. 

 Even before that, writing had changed things.  When the priest and the rabbi read from sacred texts, they clearly weren’t making up the stories.  Somebody else was the author, divine or otherwise.  There was another step between the story and you.  So in a way the story came from far away.

 Print led not only to books but to shorter packages of stories that were different every few months, or every month, week, or every day.  They were periodicals: chiefly magazines and newspapers.  Some were topical, some political. Some publications serialized the fictions known as novels. As more people became literate, especially from the early 19th century well into the 20th century, periodicals of various kinds gained almost universal readership, at least in cities and towns.   

 Daily newspapers carried stories (news, sports, weather) about a reader’s town or city, and stories that everybody in the country was following.  Immigrants learned English by following the stories told with crude drawings and brief words on “the funny pages.”  Some cartoon strips told a single story each day, usually humorous.  Others spread out a dramatic story over days or weeks or months: tales of adventure in exotic places, detective stories, science fiction, romance—every kind of story that books or periodicals told, and a few more.  Soon the comic strips spawned their own periodicals called comic books. 

People were learning all the ways stories could come to them.

 In the midst of this hubbub of storytelling, along came moving pictures.  Audiences by now knew about stories (maybe true, maybe not; not actually happening now, but on the other hand, actually happening now) and the conventions of theatre (with actors who weren’t really the characters but on the other hand, they were; and that the living room on the stage wasn’t really the living room of those people, but in a way, it was.) Now they had to learn how to see stories in motion: that the train coming at you is not actually coming at you, but then again, it is.  And that the music someone is playing on piano or organ over in the dark corner is related to what’s happening on the screen.

 Then moving pictures began to talk and sing, scream and explode.  The organ in the corner became a symphony you couldn’t see.  But what all these forms of storytelling had in common, together with other entertainment like the circus and the ballet, concerts and vaudeville, not to mention sporting events and public speeches, was one thing: you had to be there.  Even though, for the stories, you weren’t there: you didn’t have to be in Rome for a story about Rome, but you did have to be in the place the story is told, such as the theatre or the movie palace. 

No sooner had movies become the chief means of entertainment (with most Americans in 1930 going to the movies at least once a week) than there was something new, called radio.  Now voices told you stories from far away, when you were in your own living room.   They were like books that talked and sang and made funny noises to remind you of doors slamming, cars careening and guns firing.  They were the opposite of the old silent movies you could see but not hear; they were movies you could hear but not see: movies for the ear.

 At first it was called “the wireless” because it carried sound like a telephone but without telephone wires.  It had its technical problems, and remained a novelty through the 1920s. But the broadcasting of news, music, sporting events and a growing number of other kinds of stories began to fill the day in the 1930s.  Radio exploded.  Only two out of five American homes had a radio in 1931.  A year later, the proportion was two out of three.  By 1938 the conquest was nearly complete, as four out of five homes had a radio, usually in a prominent place in the living room.

 Radio presented actual stage plays or more often, plays meant to sound like stage plays, beginning with an announcer whispering that the curtain is going up, followed by applause. 

It told all the stories that movies told, and sometimes featured the same actors. You could go to the movies and see Basil Rathbone as Sherlock Holmes, and come home to listen to Basil Rathbone as Sherlock Holmes. Radio presented the entertainers of vaudeville and Broadway musicals, as the movies did.  Somehow radio even transcended its own limitations: for a while its biggest star was a ventriloquist and his dummies (Edgar Bergen and Charlie McCarthy.)  A man whose gimmick was that he could make dummies talk without moving his lips was a sensation with audiences who couldn’t see him.

 And radio soon learned to tailor its storytelling to its main advantage: it was in the home, every minute of the day.  It took the kind of stories that appeared in romance novels and especially romance comic strips (sometimes with the same characters), crossed them with the movie serial, and told them incrementally in fifteen-minute segments, sometimes at a pace that would make a snail impatient.  These were the daytime dramas, the daytime serials, otherwise known as soap operas.

 Pepper Young’s Family, Hilltop House, Stella Dallas, Mary Noble, Backstage Wife...Soap operas were designed for women who were at home during the day and could listen with less than full attention while doing their domestic chores.  They became very popular.  The ten of them on the air in 1934 more than tripled to 31 by 1936, and then nearly doubled again to 61 by 1939.  Radio had invented a storytelling form.

 The next step seemed inevitable: a storytelling device in the home like radio but that told stories with the moving pictures and sounds so far available only in movie theatres.  America was introduced to television with a broadcast and on-site demonstrations at the 1939 New York World’s Fair, called The World of Tomorrow.

 That broadcast was of President Franklin D. Roosevelt’s opening remarks, but almost no one had the equipment within range to see it.  For most people, television remained the world of tomorrow through World War II, though TV sets began appearing in bars and other public places, especially in New York, which was to be the early center of television broadcasting.

Some television stations started broadcasting in 1947, but really it was 1948 before there was anything like a menu of regularly scheduled programs for at least part of the day. 

 By 1949, television was becoming part of everyday life.  People had once again gathered around the hearth to listen to stories on the radio.  Now they were again looking into flickering light, but not into the hearth fire: they gathered in front of the flickering images telling them stories on the TV set. 

 The first programs included boxing and wrestling matches: motion that a single fixed camera could capture in a confined space with good lighting.  Religious services and other ceremonies were also broadcast, partly for the same reasons.

 But soon television was producing versions of every kind of story told in every previous form.  Tales of heroes and heroines from books and comic strips and movie serials were told in 15- and 30-minute chunks. Stage plays or stories pretending to be plays got the same sort of treatment as radio, with the whispering announcer and the shot of the audience applauding. (One of these reached back beyond radio to the beginning of storytelling with its title, Fireside Theatre.)  

Soon the full panoply was on view: musicals, vaudeville, song and dance, detectives, cowboys, space ships, cops and robbers, classic drama, melodrama and soap opera, movie cartoons, news broadcasts, interviews and documentaries.

The arrival of movies, radio and television as storytelling media came so close together in time that there were many performers who had begun on stage but then proceeded through all three.  Jimmy Durante went from vaudeville to Broadway musicals and New York nightclubs to movies and radio before becoming an early television star—and he was far from the exception. 

 Almost all the television program forms began elsewhere, often coming directly from radio or the movies, and this became part of the story of early television as I experienced it, beginning in the late 40s and mostly the early 1950s.   

 But before I begin at the beginning of that part of the story, I want to remember Johnny Jupiter.


I lived more than thirty miles from the nearest television stations, though the primary one at this time happened to be located in an unlikely pioneer city for both radio and television: Pittsburgh, Pennsylvania. Other stations were broadcasting from farther away.

 In those circumstances, television in the late 40s and early 50s was a sometime thing.  At first it wasn’t on at all most of the time, as new stations tested their equipment. I recall being restless while accompanying my mother on a visit to the home of a family friend, and persuaded to calm down because a program was scheduled to start on their television set.  Although we were “early adopters,” we didn’t have a set yet; almost no one did.  So I was excited.  I waited, with decreasing patience.  And finally the little television set came to life--with a program on how to freeze ice cubes in a new refrigerator.  It was likely a filmed segment of a late 1940s program, “In the Kelvinator Kitchen.”  Pre-Betty Furness. 

But even when real programs were on for some part of the day, reception was dicey at best.  I'm guessing we got our first television set by 1949 (maybe a year later or earlier.) We had one when we lived in the “foundation,” the basement of our house not yet built. By 1953, we had a television set in the living room of that house, hooked up to an antenna that lay across the top of the chimney.  On most channels what I saw was the sight and sound we knew as “snow,” a chaos of dots in shades of gray and black accompanied by a brash hissing noise.  Or if not snow, then a picture jumping up and down, appearing and disappearing among bouncing and waving lines (which the “horizontal” and “vertical” controls on the set were supposed to fix, but seldom did.)

 Eventually we got two or three channels clearly enough to watch, at least at the TV’s less temperamental moments.  One of them was more reliably clear: WDTV Channel 3 in Pittsburgh (channel 2 after 1952), the DuMont network channel.  The DuMont network, begun by an early television technology pioneer and maker of TV sets, was one of the major networks (though it didn't make it beyond the 1950s), along with NBC and CBS, both powerhouses in radio. ABC soon got in the game. 

In those earliest days, relationships of individual stations to networks were more fluid.  WDTV was Pittsburgh's only station for a long while, and it broadcast shows from basically everybody.  But it was primarily a DuMont station, so among my earliest TV memories are DuMont programs.   The most poignant for me now is Johnny Jupiter. 

The premise of Johnny Jupiter was that an earnest old janitor at a television station (played by Vaughn Taylor) turns the dials on a television looking for something to watch, and accidentally tunes into the planet Jupiter, and two of its inhabitants, Johnny Jupiter and the robot B-12.  But he doesn’t just see and hear them.  He talks to them, and they talk to him.

 Johnny Jupiter and B-12 were hand puppets with the voices of the program’s writers, Jerome Coopersmith and Carl Harms.  They were two figures with the proscenium of a television screen around them, with the janitor, Ernest P. Duckweather, standing beside them on the other side of the screen.  But they were all inside my television proscenium screen.

 As far as I recall, mostly what they did for a half hour was talk.  Duckweather described Earth customs and behavior, which sounded ridiculous even to Earthlings, and Johnny Jupiter described the alternatives on Jupiter.  This for me was a very early initiation into societal satire.  Most of it was beyond me at age 6 going on 7, but not all of it. It wasn’t long before I was engaging in it myself. 

But the reason I start this series with Johnny Jupiter, which is not the first television show I remember, is the strange magic of it, and its premise.  Here was this barely believable medium of stories come to life in my living room, though my access to them was always a bit uncertain, according to the moods of weather and whatever else affected television signals. When it worked, it was like magic.

 And here was Ernest P. Duckweather, like me hoping for something to watch he’s never seen before, tuning into not just another city or state, but another planet. Not only that, but he interacts with the beings there.  If television was possible, it seemed to me, something like that might also be possible. Who knows what this magic box could really do? After all, you could talk to people on some radios.  And wouldn’t it be wonderful if someday I turned the TV on and found Jupiter myself? 

Johnny Jupiter was broadcast weekly on the DuMont network for only four months in 1953.  Later that year it moved to ABC, with a different, younger Ernest P. Duckweather (played by Wright King), and the program was transformed into more of a situation comedy. (A few episodes of the ABC series, which ran from September 1953 to May 1954, have survived, mislabeled, on YouTube and elsewhere. As far as I know, nothing of the first series is available to see.)

 But WDTV in Pittsburgh didn’t broadcast it after it went to ABC.  It wasn’t on the only other station we got reliably in 1953 and 1954, Channel 6 in Johnstown.

 Still, clacking through the stations one day I discovered it was broadcast on one of the fainter and more distant stations, possibly channel 7 (Wheeling, West Virginia) or channel 9 (Steubenville, Ohio), both of which broadcast ABC shows, among others.  But try as I might, every week at that hour, I couldn’t bring it in for more than a few minutes at a time.  Occasionally the snow would slightly clear, the horizontal and vertical stop jumping long enough for me to see the outlines of the new Duckweather and Johnny Jupiter, and to hear some distorted lines of dialogue.  But soon it would fade away.

  And that became part of television, too: the elusive promise unkept, and the potential unfulfilled. But there were also more wonders along the way.    

Thursday, November 25, 2021

Happy Thanksgiving

 

Variations on a Theme

 Thank you my life long afternoon
 late in this spring that has no age
 my window above the river
 for the woman you led me to
 when it was time at last the words
 coming to me out of mid-air
 that carried me through the clear day
 and come even now to find me
 for old friends and echoes of them
 those mistakes only I could make
 homesickness that guides the plovers
 from somewhere they had loved before
 they knew they loved it to somewhere
 they had loved before they saw it
 thank you good body hand and eye
 and the places and moments known
 only to me revisiting
 once more complete just as they are
 and the morning stars I have seen
 and the dogs who are guiding me

--W.S. Merwin


 Last Thanksgiving I posted a Merwin poem that some readers found a little too, let’s say, ironic. So this year, even though the poem is one of spring, there is giving thanks in it. In that way it can be read as saying that gratitude can be felt any day, in this case in long old age, and perhaps Thanksgiving symbolizes that. For me this poem has the additional benefit of expressing several specific reasons for my own daily gratitude, including the (endangered) snowy plovers of our own North Coast beaches.

Sunday, November 21, 2021

Window on Beings and Doings


Smooth is the skin of the woman who irons.
Tall and bony, the man who repairs umbrellas.
Plucked, the woman who sells chickens.
In the inquisitor’s eyes shine demons.
Coins lie behind the usurer’s eyelids.
The watchmaker’s whiskers mark the hours.
The janitor has keys for fingers.
The prison guard looks like the prisoner and the psychiatrist looks crazed.
The hunter becomes the animal he pursues.
Time turns lovers into twins.
The dog walks the man who walks him.
The tortured tortures the dreams of the torturer.
The poet flees from the metaphor in the mirror.

 --Eduardo Galeano
 from his book, Walking Words

 I heard Eduardo Galeano read from this book before I’d read anything he’d written. That’s fitting enough for an introduction to a poet. But in this age, it happened through my earphones on a bright sunny day as I walked up Forbes Avenue in Squirrel Hill, Pittsburgh, listening to him being interviewed on All Things Considered, probably in 1993 when Walking Words was published.  I likely took a few more steps to try to find it at Squirrel Hill Books.

 Eduardo Galeano was a walking poet of the world. Born in Uruguay, his ancestors were Italian, Welsh, German and Spanish. He was an editor, journalist, novelist and poet. He wrote books like Walking Words, an amalgam of verse and tales from Latin American folk traditions, and uncompromising political books like Open Veins of Latin America. He was on death lists of several South American right wing governments. He also was considered the preeminent writer on international soccer, i.e. futbol. He died in 2015.

 As for the photo above, I’m the man this dog walks. His name is Howdy.

Update: RIP poet Robert Bly.  His death was announced Monday.

Tuesday, November 16, 2021

Busby Berkeley, Historian


 Busby Berkeley is both famous and notorious for his extravagant dance numbers in movies from the 1930s through the 1950s.  But at least in one instance, two of his musical numbers reflected the times and now attest to their history, though audiences today may not fully understand what they are about.

 “If you want to understand the key to Busby Berkeley’s choreography,” said Berkeley biographer Tony Thomas in a TCM documentary, “you have to consider the military.”

 In 1917, the 22-year-old Berkeley was drafted into the U.S. Army.  By the time he arrived in France as a field artillery lieutenant, the fighting was nearly over and American troops had little to do.  So he was tasked with marching and drilling them, and in the course of inventing new formations and maneuvers, he essentially invented the style that was to characterize the work of his career.  Except instead of just male soldiers, he had armies of beautiful young women, and a camera that found both large geometric patterns and the intimacies of faces and other elements of anatomy.

 After successfully bringing this style to Broadway (minus the camera), Berkeley was drafted by Hollywood.  He soon wound up at Warner Brothers for a series of movies that would revive both musical films and the studio’s fortunes.  He started with two of his most famous, back to back, both made and released in the depths of the Great Depression.

 The first was 42nd Street, released in 1933.  Warners organized a big publicity campaign involving a train loaded with studio stars touring cities, culminating in the film’s premiere in Washington, D.C.  Jack Warner, head of the studio, was a fervent FDR backer, and he timed the 42nd Street opening to Roosevelt’s Inauguration. 

42nd Street was a big hit and a critical success.  It set the template for the series of Warners musicals that showcased Berkeley’s choreography: a backstage story about mounting a show, then the featured numbers presented as the opening night.  The music by Harry Warren (with Al Dubin’s lyrics) was innovative in that it included a mix of styles, especially jazz influences.  But it wasn’t a complete break: there were plenty of soporific love melodies.

 Back in Hollywood immediately afterward, Berkeley worked on the musical numbers for the next Warners musical, Gold Diggers of 1933. The story had been the basis for a play and a couple of movies with “Gold Diggers” in the title (and there would be more.)  Directed by Mervyn LeRoy, it was released later that year, and again, it was a major hit.

 Apart from the insipid and cringe-worthy numbers were two that spoke directly to the audience about elements of the Great Depression they had just experienced, and that would continue to affect their lives.

 We’re In The Money

 The first was the now iconic presentation of the song, “We’re in the Money,” featuring a very young Ginger Rogers belting the tune (and singing part of it in Pig Latin crossed with jive) while seemingly wearing nothing but strategically placed coins (although some shots reveal she was wearing a body stocking.)

 The song is usually interpreted as wishful thinking about post-Depression prosperity, but it has a much more specific meaning that audiences in 1933 would understand.

 Upon taking office in March 1933, FDR first of all had to deal with the banking crisis, to keep money in circulation.  Deposits were guaranteed for the first time. Besides new jobs and relief programs, his administration bolstered the economy with price supports for farmers, a minimum wage for workers and uniform rules for businesses. 

But money was still too tight for economic growth, so FDR essentially took the country off dependence on gold as the backing for currency: the so-called Gold Standard.  Everyone knew about this because everyone in the country was required to turn in their gold coins (which came in various denominations) in exchange for paper money or silver, including silver dollars.  Going off the gold standard meant that the federal government could increase the money supply, and it did.  And that’s what “We’re In the Money” is about.

 It’s right there in the first verse:

Gone are my blues and gone are my tears
I've got good news to shout in your ears
The long lost Dollar has come back to the fold
With silver you can turn your dreams to gold, oh

We're in the money
We're in the money
We've got a lot of what it takes to get along...

It’s been suggested that in this black-and-white movie, the coins worn by Ginger Rogers and the chorus girls, and featured on the set itself, were meant to be gold.  But the point is that they aren’t: the coins feature a caricature of the face on the silver dollar.  Those are the silver coins that represent the new flow of money resulting from FDR’s policies (“with silver you can turn your dreams to gold.”)  All of this would have been instantly clear to the 1933 audience.



 Remember My Forgotten Man

 A more poignant and equally specific reference is made in the last big production number, “Remember My Forgotten Man.”  The Depression is not a subtext: it is the text, and is dramatized.  There’s the obvious reference to FDR’s  “the forgotten man at the bottom of the economic pyramid” from a 1932 radio address.  But as Joan Blondell continues to sing about her specific forgotten man, he turns out to be a veteran of World War I. 


  The number features a solo by Etta Moten (who also dubbed part of Blondell’s singing), a distinguished Black singer whose Gospel background emphasizes the bluesy spiritual feel of this elegy.  

Again, the feeling was even more specific to the times, for the plight of World War I vets had been dramatized for all the country to see just the previous spring and summer of 1932.

 In the spring, thousands of American World War I veterans gathered in Washington, D.C. to petition Congress for early payment of the war bonus they’d been promised for 1945, because they were in desperate straits.  They remained there in makeshift encampments, many with their families, through the summer, continuing to lobby Congress.  The press covered the story extensively, dubbing them the Bonus Army.

 The veterans organized themselves into units, led by officers.  In contrast to the armed forces in both world wars, their units and their camps were racially integrated.

 Things were at an impasse in late July, with Congress failing to provide the bonus and with President Hoover opposed to it. It was then that a police officer trying to clear away a crowd from the entrance to the Treasury Department panicked and shot a veteran dead.  Hoover called out the Army to settle things down.  Instead, General Douglas MacArthur decided to make war on the Bonus Army.

 MacArthur, with his officers including Major Dwight D. Eisenhower and Major George Patton, deployed tanks and tear gas, routing the veterans and burning down their camps with gasoline. (This was probably the first time tear gas was used by Americans on other Americans.)

  Patton led a cavalry charge with drawn sabers against unarmed men, women and children.  In a deadly irony pointed out by historian William Manchester, among those that Patton’s forces attacked was a World War I veteran decorated for saving the life of Patton himself. 

 Many of the veterans were literally run out of town, in trucks that took them west to Ohio and beyond.  Though newspaper stories of the day tended to support the government line that the Army had thwarted dangerous criminals and radicals, many Americans knew men who were there. Hoover never recovered his political reputation.  If he’d had any chance of winning the election that fall, it probably ended with his administration’s treatment of the Bonus Army. 

All of this could not have escaped the attention of that era’s veterans, like Busby Berkeley.  But the Bonus Army had also inspired a great deal of sympathy and support among the people who would be watching this movie. When they saw and heard “Remember My Forgotten Man,” they would likely remember the Bonus Army.  Even in this dubious context, this song was as close to a memorial as they would get.

Sunday, November 14, 2021

The Aegean


The Aegean 

 This music has lasted since the world began.
 A rock was born among the waters
 while tiny waves chatted in a soft universal tongue.
 The shell of a sea-turtle
 would not have foretold the guitar.
 Your music has always risen to the sky,
 green taproot, Mother Sea,
 first of all firsts. You enfold us,
 nurturing us with music - threat,
 fable, hypnosis, lullaby, roar, 
omen, myth,
 little agonies
 of grit, of wreckages, of joys

--Maria Luisa Spaziani
translated by Beverly Allen

 

Born in Turin in 1922, Maria Luisa Spaziani was a prominent Italian poet and editor. She was nominated for the Nobel Prize three separate times in the 1990s. She died in 2014 at the age of 91.

Monday, November 08, 2021

Choosing


Choosing

Was it home or foreign
 that city without color
 where I once lived for a time
 that often seemed long
 thinking there was no choice 
and all night I heard the captive
 lions roaring

 now I look back
 from when the rain is falling
 in the bright day

 a friend and I
 talked back then about a tree
 whose branches were the choice that we
 had not taken
 then she chose not to be

 never was there any such tree

 better
 the sound of the rain
 better the brightness falling
 better the day
 choosing to be morning

--W.S. Merwin

photo: Henri Cartier Bresson

Tuesday, November 02, 2021

Soul of the Future/ Evolution of Hope: Eight Figures for the Future

“We must take comfort from the fact that human nature gives rise to altruism as well as selfishness, to conscience as well as cruelty.  The hope of the race is that passions of generosity, restraint, and goodness may prove as strong as those of egoism, aggression, and cruelty.”

Eric Bentley

 “Better to make a good future than predict a bad one.”

Isaac Asimov

 “The future is not a gift: it is an achievement.  Every generation helps make its own future.  This is the essential challenge of the present.”

Robert F. Kennedy


 “I am curiously not interested in things,” wrote H.G. Wells in his 1906 book, The Future in America, “and curiously interested in the consequences of things.”

 This is the first lesson of evolution, which Wells applied to the future to shape what “the future” means to us.  The consequences of things (technologies, processes, events, inventions, decisions, etc.) in the past and present comprise the conditions of the future.

 Wells is the first of my “eight figures for the future” and sets the agenda for the rest: to fill in his outline of the future.  Most of these figures have appeared earlier in this series, so this is a kind of summation. Significant aspects of their work help form a framework for envisioning and enacting an evolution of hope.  

 From his first public writings to his last over a fifty-year career, Wells was fixed on the future. He learned as he went, and came to another seemingly simple conclusion that seems also to have escaped many others who command the mainstream of assumptions about the future: the causes of future conditions are complex. 

 It was fine for a science fiction story, or a thought experiment, to follow the effects of one change while keeping everything else familiar.  But that’s not how the future works: it’s the consequence of everything.

One technological change—or even technological changes in general—can and usually does have a mighty influence on the future.  But these effects interact with other factors, including other changes, and with responses that are often unanticipated.  Things happen on different scales, at different rates.  They interact unpredictably.  And the consequences always include the unintended. 

Future reality would be made by the interactions within the whole, and the whole acting on itself.  Analyzing one or two strands of change wouldn’t be enough. “The end of all intelligent analysis,” Wells wrote, “is to clear the way for synthesis.”

 This is a lesson that overwhelmed the sensible futurists of the 1970s, but other more egotistical predictors persist in making the fatal mistake of leaving too much out.  This is a particular habit of those predicting the dominance of technologies, and it still happens. 

 In an online Substack piece of August 2021, the author chides those who make extravagant claims for the rise of certain technologies in the near future, and then offers his more “conservative” or realistic predictions, based on current trends, for life in 2050.  He issues 18 detailed technological, social and political predictions, employing hundreds of words, and never even mentions climate--neither the causes nor the effects of the ongoing climate catastrophe, nor anything involving the context of the natural world.

 Moreover, this was published during the most violently and obviously consequential summer of the climate crisis so far, which included the hottest month ever recorded for the entire planet, and which climate scientists confirm would basically have been impossible without a seriously deformed climate.  

 These are not just omissions; they are serious distortions of the future we know is coming that make his predictions worthless.  His predictions presuppose everything else is stable, and at this point that’s fantasy.  The irony is that based on chemistry and atmospheric science alone, the acceleration of a distorted climate is about the most certain prediction about the future that can be made.

 The habit of holding on to a single through-line for the future, based on a narrow interpretation of the past that requires counter-evidence to be ignored, has itself distorted the prospects for the actual future.  Such interpretations tell slightly different but mutually congenial stories about the human historical past and the biological past predating but including humanity—in particular the principles that govern the outcomes of evolution.

 The common story driving western civilization was that man and nature were separate, that at best nature was a useful source (i.e. “natural resources”) but most of the time, an enemy, a barrier to human “progress.” The dominant interpretation of Darwinian evolution complicated this story, without really changing it. The human species was placed in competition with the rest of nature, and that competition was defined as violent struggle, with a few winners—maybe only one-- and lots of losers.

 The idea that the future belonged to the best predators predated and probably influenced Darwin.  But Darwin’s positing of natural selection as determining survival became immediately distorted and supercharged by the ideology now known as Social Darwinism.  That two of its prominent 19th century adherents were John D. Rockefeller and Andrew Carnegie pretty much explains its dominance.

 Though the 20th century strengthened this view through the reductive misinterpretation of genetics that produced the infamous “selfish gene” theory, it was during the 20th century that other voices countered this view by adding new information and context. 

Much of this information was hiding in plain sight. Another brilliant synthesist (and our second figure for the future) was Paul Shepard.  Credited as one of the pioneers of the science of ecology in the 1960s and 1970s, he braided observations of the natural world and human culture into a unique field, recognized when he became an endowed Professor of Human Ecology.

 Many other ecologists, scientists and thinkers contributed to Shepard’s syntheses, but many more have followed in his unacknowledged footsteps.  He wrote eloquently about the long human development in the Pleistocene, and the deeply human need for connection with the rest of nature, beginning with childhood.  This is “human nature”--physical and otherwise--that goes back hundreds of thousands of years, and basically it has not changed.  Shepard brought human culture back into the context that nourishes it, and showed how human destruction of the natural world is profoundly self-destructive.

 Most of what Shepard and other ecologists found did not require laboratory experiments or expensive technology.  Archeological discoveries and analysis, and a certain amount of quantitative data gathered in the field, were part of it, but the emerging new picture also required rediscovering the research and insights that had been ignored because they didn’t fit the dominant program.

 For after all, the behavior among primates that exhibited cooperation, altruism and empathy in a carefully nurtured social structure had been there for researchers to see, centuries before primatologists like Frans de Waal and others showed up. But those earlier researchers didn’t see it because they weren’t looking for it—and since they were not prepared to believe it if they saw it, they didn’t see it.  More broadly, the deep relationship of humans and nature was evident in the words and practices of Indigenous peoples all over the world, but was dismissed as sentimental and exotic, and profoundly threatening.

 However, the 20th century also saw new information unavailable before, because (for example) new technology allowed researchers like Lynn Margulis to study microorganisms that Darwin knew nothing about. There she found evidence for symbiosis and other behavior contrary to the selfish gene theory and other prevailing prejudices. 

 She then eloquently described the implications of her research, inspiring such thinkers as William Irwin Thompson.  Margulis also went from micro to macro by becoming the co-author of the Gaia Hypothesis, a planetary vision of a single self-regulating organism.  She is our third figure for the future.

 Thanks to her and many others, a new synthesis and a more complex view of evolution has begun to achieve acceptance. (In a series of books, British philosopher Mary Midgley is especially trenchant on the weaknesses of the old standard view of evolution.) Whether the species that invented bombing deserves to survive is still a question.  But that humanity is programmed by its genes to self-destruct is no longer a viable scientific conclusion.

 The reality is that both competition and cooperation, both individuals and various kinds of groups, both genetics and epigenetics (when genes turn on or off), drive evolution. This vision has profound implications for the future, and offers hope that the consequences of the ongoing destruction of the natural world as we know it can be recognized more broadly before those consequences become entirely overwhelming.

 Wells came to a second crucial realization when he turned to envisioning an attainable and desirable future.  Wells believed human civilization could not survive much longer if humanity did not unite.  National, racial and other enmities were leading to global catastrophe in a world in which weapons were inevitably going to be more destructive  (Wells foresaw tank warfare before there were tanks, saturation bombing of cities from the air, and the atomic bomb.) 

 He saw that humanity needed a new vision of itself, a new story of human progress. So he wrote The Outline of History to tell that story, and it became the best-selling book of his career.  Wells’s history is now outdated.  Progressives who worry about the future have for years called for “a new story.”  Elements of it are contributed by others among these eight figures for the future.

 At their best, utopian stories explore possibilities inherent in our past to create models of better futures. Today as in recent decades, dystopian stories remain plentiful, but there is still only one popular model for a better future: the Star Trek saga.  Over five decades and counting, Star Trek has evolved a capacious vision that has inspired generations spanning the globe.

  Its universe of easy travel to other planets populated by similar beings is very likely a fantasy, but beyond the visuals that delight many, Star Trek has always been blatantly allegorical.  It has championed a profound respect for life, whatever its form (“Infinite Diversity in Infinite Combinations” is the motto every fan knows), a spirit of adventure and wonder, and the wisdom of humility.

 In addition to the allegories of principles tested by the unknown, Star Trek: The Next Generation in particular models behavior: courage and civility, discipline and openness, technological expertise and explorations of art and culture, loyalty and compassion, humor, love and the ability to examine behavior and change it.  This is not a utopia without problems: it is a utopia because of how people define and attempt to solve problems together.

 This Star Trek also demonstrates the power of an institutional morality that aligns with individual commitments.  Institutions like the Federation and Starfleet have learned from history.  In their encounters with the alien, the Other, they anticipate the past.  “We are not invaders,” Captain Picard insists.  “We are explorers.”  This time, humans do not export their unconscious in attempting conquest by another name.  Though they do not always succeed, they have rules to help them (the Prime Directive), and (it’s worth repeating) the humility and the tools to reexamine themselves.

 Behind aspects of this vision were dozens of science fiction writers (Heinlein, Clarke, Bradbury, Asimov, Hamilton etc.), moviemakers and Saturday morning television shows, as well as those who contributed their talents to Star Trek itself.  But television writer and producer Gene Roddenberry began this saga, and he inspired others to add their creativity to it by enlisting their enthusiasm for a vision that functionally became theirs as well as his.  For this achievement he is our fourth figure for the future.

Other visionaries have explored enlightening and sophisticated visions of at least aspects of the future through the complexities of story, particularly science fiction.   These have not directly reached as many people as the Star Trek saga, but those who have read them have been profoundly influenced.  For her unique achievements that include The Dispossessed, The Left Hand of Darkness, The Word for the World Is Forest, The Lathe of Heaven and Always Coming Home, Ursula K. Le Guin is our fifth figure for the future, but she also represents other significant visionaries, from Olaf Stapledon, Karel Capek, Yevgeny Zamyatin, George Orwell and Aldous Huxley to Kim Stanley Robinson, Margaret Atwood and George Zebrowski.

 Particularly in the Next Generation series, Star Trek suggested that the journey outward through space was a journey inward, and that each was at least equally important.  Or as Star Trek writer David Gerrold wrote, “...space is not the final frontier. The final frontier is the human soul.”

 In the first half of the 20th century, C.G. Jung drew what he knew was only a rough map of our inner landscape—of the soul or psyche.  As crude as this map is, it suggests a crucial set of conceptual tools for understanding and directing our behaviors.  These tools—such as denial, projection, compensation, and the shadow—help us question and discern whether our thoughts and perceptions are actually products of our conscious mind, or are deceptions and misperceptions arising from our unfathomable unconscious.  For this alone, Jung is our sixth figure for the future.

 The future that evolves from the past is not an entirely rational process, Jung warned.  The mystery of the insistent unconscious—individual, group (or mob) and collective unconscious—persists in waves of human behavior, while the unconscious supplies deceptive reasons to keep the conscious feeling justified.  That’s why these tools are so crucial. 

(Using these tools to openly question our behavior was also modeled in Star Trek: The Next Generation.  The necessity of doing so makes the sometimes ridiculed addition of a ship’s counselor so important.) 

Jung also cautioned against one-sidedness, and called for the integration of rational thought, feeling, perception and intuition, or the head, heart and body that comprise soul.  The inner world is also complex, and the unconscious is a source of power and insight, and something that links all humanity, as well as the repository of raw fears and longings, especially those the society represses. 

 Thanks in part to the fervors surrounding two world wars, Jung saw that group delusion and a kind of mass psychosis fueled by unconscious compensation, self-righteous projection and denial are particular dangers in the modern world.  After World War II, he feared the power-mad Soviets and the suggestible Americans. The grip of this equivalent of shared and mutually reinforcing demonic possession can be broken, he felt, only one person to another.

  Towards the end of his life, Jung begged others to continue the exploration and mapping of the psyche.  “The world hangs on a thin thread,” he said in a video interview.  “That thread is the human psyche… We are the great danger.  The psyche is the great danger. But we know nothing about it.”  Despite the growing reductionism, arrogance and drug dependence of psychology since, others like James Hillman have continued the search—but not enough.

 Part of the necessary synthesis to secure the future is re-integrating the profound experience and insights of ancient traditions.  Particularly in the past quarter century, that work with Native American cultures has begun.

Leslie Marmon Silko (right) with Maxine Hong Kingston 
(left) and Toni Morrison (center.)
This synthesis is the ongoing work of many: Native leaders like Chief Oren Lyons, Native scholars such as Robin Wall Kimmerer and Gregory Cajete, non-Native scholars like Richard Nelson and Keith Basso, and non-Native synthesists like David Peat and Calvin Luther Martin, along with many Native poets, elders, activists, novelists and others. Leslie Marmon Silko, our seventh figure for the future, represents them all in the power and relevance of her writing, in Almanac of the Dead and the works that precede and follow this American classic. (See the earlier post on the Ghost Dance future, as well as later in this post.)

 The Dalai Lama is among the great synthesists for the future of our time.  Over decades he has brought together scientists and Buddhist practitioners to reconcile religion and science, but especially to advance western science into introspective areas of the mind that Buddhists have been exploring for many more centuries than western science has existed.  These discussions through the Mind & Life Institute,  which led to new scientific experiments partially designed by Buddhist practitioners, have been public, resulting in nearly a dozen books and many hours of video, inspiring even more writing and discussion.

 The Dalai Lama also has articulated principles and ethics that are widely shared and do not depend on doctrines of particular religions, or any religious doctrine at all. “... we are all members of one human race and have the same worries and needs”, he writes.  “This ethical principle is not bound to a specific religion.  Even an atheist can follow it.  It is therefore not at all important whether we believe in God or the idea of rebirth.  We can always do good, even today when we are afraid of the dangers that the future may bring.”

 But this is not a matter only of the lowest ethical common denominator.  It is a re-orientation: the direction of a new story. According to visionary William Irwin Thompson: “...the Dalai Lama becomes not a medieval theocrat, but a global teacher precisely because Buddhism captures some of the dynamics of a worldview based on relationship, dependent co-origination, and compassion.” 

British author John Gray has tried to co-opt Buddhism and Eastern religions in general to support acceptance of inevitable apocalypse inherent in human nature by rejecting active hope for the future.  While it is true that Buddhist meditation focuses on exploring the individual’s present moment without judgment, this is only part of Buddhist tradition, and certainly the Dalai Lama has been outspoken in favor of both the possibility and the urgent necessity of compassionate action to build a better future. 

 Moreover, a core tenet of Buddhism is the value of the non-human world: of beings, of all life. The first precept of Buddhism, as Gary Snyder describes it, is ahimsa: “Cause the least possible harm.”  It applies not just to humans but to everything in the natural world, and requires judgment and forethought as well as this radical empathy.  As many contemporary Buddhists would attest, it applies to the larger questions of what harm humanity is causing to the life of the planet.


 “In order to change the external world,” the Dalai Lama writes, “first we must change within.” That inner world includes imagination and a vision for the future. “If you want a beautiful garden, in the human mind you make some kind of a blueprint in the imagination, and then according to that idea, you implement, so the garden will materialize.”  

 That change may well be underway. The current dominance of rigid denial, Buddhist philosopher David Midgley maintains, “is typical of the terminal phase in the life-cycle of a paradigm, and might be compared to the chrysalis stage in the life-cycle of a moth or butterfly...While the outer shell of the organism seems rigid and immoveable, invisible changes are taking place within, which may erupt dramatically when they reach a critical stage of development.”

 In any case, the activities of hope and personal commitment are essential to create the desirable future by enacting it now. “Whether we achieve what we are hoping for or not, it is important for us to keep hope,” said the Dalai Lama. “Hope is the basis of our future.”

 The Dalai Lama is our eighth figure for the future.  But these eight are joined by newer voices, opening up new knowledge and possibilities. 

 The synthesis continues for example with books on parallels of Jung’s writing with Native American tradition and Buddhist practice (such as the connections between Buddhist mindfulness and Jungian consciousness), as well as Jung and nature, and Buddhism and ecology.  Others explore the implications of quantum physics for insights related to various mystical traditions, Indigenous practices and Jungian glimpses (such as synchronicity) into what Star Trek’s cosmic being Q called “the unknown possibilities of existence.” 

But the synthesis is also advanced with new analysis, information and insights. One of the most exciting is literally a new story, called The Dawn of Everything: A New History of Humanity by David Graeber and David Wengrow.  The authors demonstrate that the dominant received history of early human civilization is “entirely wrong,” according to a detailed article in the Atlantic by William Deresiewicz. “Drawing on a wealth of recent archaeological discoveries that span the globe, as well as deep reading in often neglected historical sources,” he writes, “the two [authors] dismantle not only every element of the received account but also the assumptions it rests on.”

 Deresiewicz suggests that this new story demolishes “the idea that human beings are passive objects of material forces.”  Societies at different times and in different places made deliberate choices: they saw the perils of their authoritarian government and changed it, they saw a neighbor’s stratified society based on wealth and avoided it, they refused to become trapped in a single mode of existence, and moved easily as appropriate among farming, gathering, herding and hunting in an “ecology of freedom.”

 “In a remarkable chapter, they describe the encounter between early French arrivals in North America, primarily Jesuit missionaries, and a series of Native intellectuals—individuals who had inherited a long tradition of political conflict and debate and who had thought deeply and spoke incisively on such matters as ‘generosity, sociability, material wealth, crime, punishment and liberty.’”  The authors call this the “Indigenous critique,” and maintain that it helped inspire the European Enlightenment.

Human societies did not march from hunter-gatherer nomadism to total agriculture and then cities.  They mixed and matched, and did them backward and forward.  They did not develop from tribal chieftains to kings and bureaucracies.  They governed themselves in a variety of ways, with and without “authorities.”  They did not always see civilization as acquiring wealth and power in a top-down structure. Civilization might mean “mutual aid, social co-operation, civic activism, hospitality...simply caring for others...”

 Moreover, these weren’t just doomed experiments.  Some of these societies lasted longer than today’s have so far. 

These were stories untold, and there are more, including the stories of women, other Indigenous peoples and subjugated cultures.  The stories include the occluded, ignored and dismissed.  They include overlooked, devalued or derided examples of cooperation, civic duty, kindness, empathy, love of nature, selflessness—the civic and human spirit of “You’d do the same for me.”

These behaviors and these stories emerge in the worst times.  Scholars Pearl Oliner and Samuel Oliner (a Holocaust survivor) studied Europeans who risked their lives sheltering Jews during the Holocaust, and expanded this research into broader inquiries on altruism and compassion.  In The Altruistic Personality and other books, they explored many more examples of altruism under pressure than are mentioned in conventional histories. 

In A Paradise Built in Hell (2009), Rebecca Solnit tells the stories of communities that responded to disaster with creativity and solidarity.  “The history of disaster demonstrates that most of us are social animals, hungry for connection, as well as purpose and meaning,” she concludes.  “Hierarchies and institutions are inadequate in these circumstances...Civil society is what succeeds, not only in an emotional demonstration of altruism and mutual aid but also in practical mustering of creativity and resources to meet the challenges.”

 This is one basis for hope, as we confront the seemingly overwhelming effects of the climate emergency.  But another basis, equally ancient yet reinterpreted for our time, is at least as vital, as we address the causes of what will otherwise be even worse fates. It can be expressed as the Buddhist principle of “bringing the future into the present path.”  It still cannot be said better than in the Great Law of the Haudenosaunee: “In every deliberation we must consider the impact on the seventh generation to come.”  

Remember the future.  Anticipate the past.

  This ends the Soul of the Future series.  I’ve decided to integrate its bibliography with a still ongoing series, History of My Reading.  But eventually it will also bear the Soul of the Future label, for direct access.