"One of the first things I noticed when I came back was that everybody talked a little too fast, and they took too much for granted."
Gary Snyder, recalling his return to the US in the 1960s from living in a Buddhist monastery in Japan.
"One of the first things I noticed when I came back was that everybody talked a little too fast, and they took too much for granted."
Gary Snyder, recalling his return to the US in the 1960s from living in a Buddhist monastery in Japan.
Among the concepts I regard as guides are three with non-English names, from three different cultures.
Ahimsa is a Sanskrit word which literally means non-harming, the most basic Buddhist precept. Poet Gary Snyder, who studied Buddhism for many years, defines the concept as “do the least possible harm.” So it is less restrictive than total non-violence, but it is broader in application. It means do the least possible harm to everything—not just humans but animals, plants and even rocks. It turns out to be an ecological as well as a moral principle. In life it means a change in assumptions—that is, think first before making a harmful change, not exactly second nature to western civilization. Personally it means being mindful, respecting the other, and sincerely evaluating whether causing harm is necessary. This is a key to the sacred attitude towards eating and other aspects of ordinary life that nevertheless have profound meaning.
Hozho is a concept from Navajo culture, part of the Beauty Way. A Navajo character in a Tony Hillerman novel explains it with an example: In times of drought the Hopi and some other cultures will pray for rain. “The Navajo has the proper ceremony done to restore himself to harmony with the drought…. The system is designed to recognize what’s beyond human power to change, then to change the human’s attitude to be content with the inevitable.” This does not eliminate trying to right wrongs, address problems, or change what needs to be changed. But where it applies, this concept is not only a profound act of humility, but a necessary human response of adjustment to reality. It’s something that other animals do instinctively—they find ways to cope with drought, for example. They adapt to the environment. Instead of being angry or frightened, or obsessing on things as unjust and taking them personally, people accept the conditions and work within them to make life better—by conserving water, for instance. Being in harmony is not easy. The Navajo ceremonies take days. But as a general principle, it only makes sense.
The third concept is Italian, and so part of my own Italian American culture. Sprezzatura can be defined in various ways, but it can come down to making a personal style part of life, an elegance that is natural and appears effortless. Tony Bennett applied it to music, but outwardly it can most often be seen in modes of dress. There are places where such expression is noticed and valued, but more places where it isn’t. But that doesn’t matter so much, because sprezzatura is an attitude, and expression (playfulness, authenticity) for its own sake. Or to quote another Italian saying: Niente senza gioia. Nothing without joy.
"Life as an art and art as a game—as action for its own sake, without thought of gain or loss, praise or blame—is a key, then, to the turning of living itself into a yoga, and art into a means to such a life."
Joseph Campbell, Myths to Live By
"You don't have to sell a hundred million records, you don't need thousands of platinum disks, you don't need to sing to millions and millions of people for music to nourish your soul. You can sing and play to the cat, it will still mend your life. My mantra is five simple words: Music is its own reward."
Sting
At least the Pulitzer Prize committee finally recognized Barbara Kingsolver. There's been no better US novelist, none more consistent and capacious in the breadth, depth and style of her work. But this year the Nobel committee has once again ignored my now-perennial favorite--and the world's--by yet again passing over Margaret Atwood.
Atwood was widely believed to be the favorite back in 2017--so much so that the actual winner, Kazuo Ishiguro, publicly apologized for winning it instead of her. But there was always next year. And next year. And next year... And now Margaret Atwood is 84.
I don't claim to have read all the fine writers of the world, and I must defer judgment on a lot of prizes, like the Booker. But the Nobel has a specific, specified mission. In the words of founder Alfred Nobel, it is for the writer "who, in the preceding year, shall have conferred the greatest benefit on mankind."
That person, year after year for the past decade (I've been touting her since at least 2011), has been Margaret Atwood. She is unique in the world for sustaining quality literary work over many years, while her work consistently engages the contemporary world. She is now a global presence, for the shared present and future dilemmas she writes about (and the increasing relevance of The Handmaid's Tale), but also as a literary figure and an active voice. Her contributions are immeasurable.
Of course the Nobel committee in Sweden has ignored their founder's goals for years. They tend to award for a body of work rather than something from the previous year, which is defensible. But they also tend to choose writers who are perhaps known mostly within a single nation (usually a small one), if, frankly, at all. Their influence on the world is minimal, at least until they win and their work gets new editions. At least, that's what they usually do, when they're not awarding it to a legendary popular music lyricist who is either a skillful selector in the folk music tradition, or a serial plagiarist, including portions of his Nobel Prize acceptance address.
If the body of work is a major consideration, consider this: Margaret Atwood has published 18 novels, nine story collections, 18 volumes of poetry, 11 books of nonfiction, eight children's books and two graphic novels--and these are only works brought out by major publishers and presumably translated widely. She writes and speaks on ecology and economics as well as literature. And the world reads and listens.
But there's no point in making the case. Everyone knows it. Everyone knows she is the perfect Nobel Laureate, and has been for years.
So why hasn't she been one? It can't be only because she sometimes writes speculative or science fiction--they gave the award to Doris Lessing many years ago. Atwood herself hasn't commented on it in interviews I've read or watched on YouTube, usually praising whoever just won. But I get the feeling that she doesn't expect to ever win it, because of some problem with the Nobel committee regarding her. Maybe there's animosity, personal or otherwise, from a member or members of the committee. Maybe they feel she's gotten too many other awards. Who knows? (Actually, I think she does.)
The Nobel committee in any case has a well known track record of not getting around to honoring major literary figures in their lifetimes, and therefore never honoring them at all. So why should I care? I probably shouldn't. But to me it calls into question not only the Nobel committee's judgment but their integrity. I know all these prizes are political to some extent, and this is not the worst injustice in the world. But seeing some justice done is a rare but good feeling. She deserves this.
So I don't care who wins the Nobel anymore. Not until the name that's announced is Margaret Atwood.
P.S. Margaret Atwood's latest fiction is a story collection, Old Babes in the Wood. Barbara Kingsolver's Pulitzer-winning novel is Demon Copperhead. Both are fine gifts for discerning and appreciative readers.
It's not hyperbole (if a bit exaggerated in detail): that's how it was. And it has never been that way again--which probably makes it inconceivable to the majority of Americans today, who were not old enough or even alive then, sixty years ago this month.
Among the contributing factors to the national response was the status of the presidency, and of the President as a kind of personification of the nation, a status that's largely gone now. There was the shock: it broke the continuity of time. Few if any then alive could remember McKinley's assassination in 1901. There was also the new intimacy which the Kennedys brought, largely through the relatively new national medium of television. People had never been as familiar with a President and his family.
John F. Kennedy became a national figure at the 1956 Democratic convention, one of the first covered by television, when the presidential nominee Adlai Stevenson left the choice of the vice-presidential candidate to the convention. So there were nominating speeches and a tense vote. Kennedy came up short by about 20 votes, and took the platform to move that the nomination of his rival Estes Kefauver be made unanimous.
Early in 1960, Kennedy was frequently seen on the evening news as he took the new route to the presidential nomination: competing in primary elections. The campaigns in Wisconsin and West Virginia were particularly dramatic and heavily covered. The Democratic National Convention was in media-saturated Los Angeles and dominated the networks all that July week, as they played up the drama, though eventually Kennedy secured the nomination on the first ballot.
The 1960 campaign featured the first televised presidential debates, four of them. The election was close and so demanded a lot of media attention. Once Kennedy became President, he and his administration and especially his family were everywhere in the broadcast and print media, which included not only newspapers but magazines from newsmagazines to glossy photo-rich weekly magazines like Life and Look, plus women's magazines like Good Housekeeping that featured Jacqueline Kennedy and daughter Caroline. Jacqueline Kennedy refurbished the White House public rooms to reflect their historic character, and the networks all carried her one-hour tour of the result.
In other words, the American public had the opportunity to become more intimate with JFK than any President before him.
In both senses of the word, Kennedy appealed to a wide variety of Americans. The industrial and working middle class was strong and unionized, and he was supported by the unions. His first big legislative goal was to increase the minimum wage (to $1.25 an hour; opponents warned it would tank the economy). The young flocked to join the Peace Corps that he began. He proposed and fought for "medical care for the aged," which became Medicare, finally passed in 1965. He spoke often about education, and about the arts. He brought artists and entertainers to the White House. He supported the sciences and committed the US to landing an American on the moon, which first produced what was probably the global event that came closest to presaging the attention paid to his funeral rites: the fully televised launch of John Glenn as the first American to orbit the Earth, along with his three orbits and splashdown.
Kennedy called for and negotiated a nuclear test ban treaty with the USSR, crucially important as the first reversal in Cold War hostilities and the arms race. He gave his signature speeches on the test ban and on civil rights on successive days, and it was his words--the eloquence and directness of his argument--even more than the acts themselves that set the stage for at least the near future, and that continue to echo through history. Most historians and political observers now agree that he was about to end direct American involvement in Vietnam. As 1963 was coming to a close, he told political advisors his 1964 campaign issues would be pursuing peace and addressing poverty in America.
He was up and down politically, and he was hated by some. There were cynics, as there are many more now. But Americans felt a connection, and he inspired widespread affection and admiration, confidence in his ability as President, and an optimism for the future. His assassination was a shock on so many levels--of vulnerability, of an unimagined violence, of reality cut loose, then of grief for a young family we'd come to know. Part of it for many was the feeling that it was a turning point in history--a turning back, a turning away. It turns out sadly that we were right.
It is true that most of us experienced those four days through television, although I don't recall watching all that much on Friday, the day of the assassination itself. I was in an afternoon class in high school when the principal, Father Sheridan, came on the p.a. (as he often did), but this time announced that TV was reporting that President Kennedy had been shot in a Dallas motorcade. My next class was gym, which we held outside, engaged in some sports activity that allowed me to almost forget what I'd heard, as if it hadn't happened. Then we showered and dressed, and as I walked up the narrow stairs to the gym itself, a boy coming down for his own gym class answered my inquiry with a fatal nod--President Kennedy was dead.
I remember wandering the dark silent halls, then walking home with my two best friends, including my debate partner, Mike. We were scheduled to work on our debate case at my house. Instead we spent the evening talking about what had happened and what might happen next.
But by the next day I was watching almost all the time. The entire broadcast day was devoted to this news coverage, and would be through Monday. There were no commercials whatsoever. Much of the time on Saturday especially, the TV coverage highlighted events in Kennedy's campaign and presidency--at least for another day, he was more alive than dead on the TV screen.
On Sunday I didn't go to Mass with my family so I could watch the procession to the Capitol, and thereby happened to witness Oswald being shot on live TV. A moment before I'd been startled by what I thought was a gun, but it turned out to be a microphone. Then the real shot, and the chaos. Like most people, I watched the funeral Mass (in the same church where, two days after his Inauguration, Kennedy reached back to shake my 14-year-old hand), then the procession to Arlington National Cemetery on TV (although Margaret, having grown up in Arlington, was there.) We saw the caisson drawn by six horses--one of many deliberate echoes of Lincoln's funeral--and the riderless black horse, rearing and bucking. We saw Kennedy's three-year-old son John and his salute, and all the closeups of Mrs. Kennedy, a contemporary portrait of Our Lady of Sorrows.
But though we were bound together through television, it wasn't the total measure of our participation or expression. I remember going with my father to Main Street in Greensburg, probably late Friday or early Saturday, to the Singer Sewing Machine store he managed, where we placed one of my Kennedy photos in the storefront window, surrounded by black bunting. Every store on the street had a similar display. None of them would be open for business until Tuesday. Main Street was in mourning--here and everywhere.
When Kennedy's coffin was displayed in the Capitol rotunda, nearly half a million ordinary Americans filed by to pay their respects, all day and night and into the next day. On Monday, the coffin entered St. Matthew's Cathedral at 12:14 pm. Historian William Manchester wrote in his book The Death of a President that "millions of individuals, reading the funeral timetable in the morning papers, had spontaneously chosen that moment to express their own bereavement."
"For the next five minutes, the continental United States was virtually isolated: telephone and cable communication with the outside world was suspended until 12:19." Traffic in cities stopped. The New Jersey Turnpike was deserted. Trains across America stopped. Subway trains under cities stopped in their stations. Buses pulled off the highways and stopped. Planes scheduled for takeoff remained on the runways. Even elevators stopped.
There were official memorials: sailors on US ships at sea cast wreaths into the waters. Thousands of artillery pieces at 7,000 US military posts fired salutes. But citizens invented their own tributes. Two Eagle Scouts played taps to a totally silent Times Square, where taxi drivers stood by their cabs with bowed heads. A railroad conductor got down from his stopped train in rural Pennsylvania and blew taps on his own trumpet. And it wasn't just Americans. It was rush hour in Athens but Greek police still stopped traffic for a period of silence. A tribal ceremony of mourning was held in Nairobi.
Manchester reports that television viewing dipped to its lowest during the funeral Mass, as people attended memorial services in their own churches, synagogues and mosques, and in San Francisco, at Buddhist temples. There were memorial programs at all fifty state capitals. These rituals kept the country from going crazy, but they did not heal the nation completely, nor the lives of many of us.
The mourning, and certainly the effects, did not stop at the end of those four days. But those days remain a moment in America without precedent and without repetition (though the assassinations of Martin Luther King and Robert Kennedy in 1968 prompted extensive response.) For most Americans now I'd guess, it is only the faded black and white television images sampled briefly now and then that prevent it from sliding completely into the historical obscurity of the response to FDR's death, or Lincoln's assassination. Those of us still around who lived through it consciously will remember a different America, and especially the different America that could have been.
Sixty years later, the assassination of President John F. Kennedy still reverberates through American culture and politics. For example, the matter of conspiracies.
Apparently the first "conspiracy theories" applied to a presidential assassination followed the 1881 shooting of President James Garfield and his subsequent death. What befuddles me is that hardly anyone mentions the documented conspiracy to commit the first presidential assassination, of Abraham Lincoln. John Wilkes Booth conspired with at least two others to also kill the vice-president and secretary of state on the same night. They were angry white supremacists and ardent supporters of the defeated Confederacy. But our histories tend to ignore this, and focus on Booth as a deranged lone gunman with mysterious motives.
Still, the contemporary pattern of conspiracy theories really began a few years after JFK's death in Dallas on November 22, 1963. The Warren Commission Report in 1964, published in a fat tome that a lot of people bought (including me) and almost no one read, repeated the conclusion that Lee Harvey Oswald was the lone gunman. But talk of conspiracy began almost immediately: it was the Mafia, it was the Pentagon, it was the CIA, it was Cuba, it was LBJ, it was none or all of the above, mix and match. There was so much of it that Barbara Garson wrote a Shakespearian parody called MacBird! that pinned it on LBJ, and it ran for several years in various theatres on the West and East coasts beginning in 1966.
At first the receptivity to these possible conspiracies was fed by the sense of loss--not only of JFK himself but of his promise--of the kind of presidency and country he embodied. LBJ was a lesser usurper in every way; JFK's differences with some military leaders during the Cuban Missile Crisis and perhaps the Nuclear Test Ban Treaty fed other (or contributing) scenarios. That LBJ and the military and "Intelligence" establishment swiftly escalated the war in Vietnam that JFK was winding down, only added credibility to these impressions and suspicions.
And then the evidence started coming in--not of actual conspiracies, but of the implausibility of the official single gunman/"magic bullet" explanation. The same debates are renewed today in essentially the form they took over the past half century. A conspiracy in which nobody talked remains as implausible as the magic bullet single assassin explanation. Now nearly everyone who might have been involved is dead, and pretty clearly, we will never know.
The willingness to entertain the idea of some dark plot among the powerful was further encouraged by the revelations of the conspiracies of lies that supported the Vietnam war. That systematic lies and secret wars and assassinations have been used by US government agents since at least the start of the Cold War, if not much earlier, is documented.
With those shocking revelations and speculations as background, all kinds of conspiracy theories are now available for those who need them. Perhaps only a small number of Americans believe the transparently absurd conspiracies pushed by QAnon and other extremists, but apparently a lot of Republicans believe federal and state governments conspired to steal the 2020 presidential election (though they seem less willing to concede that the Republican Supreme Court actually did steal the 2000 election.)
The link specifically to the JFK assassination is strong, as in the QAnon announcement two years ago that its anniversary in Dallas would be marked by the return of John F. Kennedy, Jr., to stand beside Donald Trump as they triumphantly expose the usual suspects, and return to the White House. For just as tabloids for years "revealed" that JFK was still alive and hidden on an island, in this telling his son's death in a plane crash also didn't happen. These wish fulfillments have apparently become part of today's extremist fever dreams, complete with a MAGA conversion. (See this Guardian piece by Steve Rose for more.)
This 60th anniversary also marks the first year in which a Kennedy not of JFK's generation is ostensibly running for President, though not nominated by either major party. Robert F. Kennedy, Jr.'s candidacy has been linked with a number of what are called conspiracy theories. I don't know all that much about him, but he does seem a puzzling and paradoxical character. He was a persistent and effective environmental advocate (and still supports efforts to address climate distortion), and his foreign policy speech earlier in his campaign put his finger on the change that JFK brought, from a policy of war to a policy of peace, and on the return since then to what RFK Jr. called "the forever war," and to the attitudes it reflects and engenders across every area of foreign policy. His distrust of Big Pharma is well-earned, since he represented its victims in many successful lawsuits. But even though some nuances in his positions do get ignored, he does go to extremes that we identify with the extreme right, along with playing fast and loose with facts.
A piece this summer in Slate reviews his positions and troubling tendencies, and relates them to the characteristics of conspiracy theories and those who adhere to them. But it never mentions what is very likely their origin in his life: the assassinations of his uncle and his father in the 1960s. Reputable sources have suggested that RFK senior did not much believe in the Warren Report explanation. Others suggest the killing of RFK was not necessarily as simple as reported. When I was the editor of an alternative weekly called Washington Newsworks in 1975 and much of 1976, we covered the forthcoming congressional hearings that seriously examined alternative explanations to the lone-lunatic orthodoxy about the killing of Martin Luther King, Jr. In fact, Coretta Scott King (we learned exclusively) was coming to Washington to be part of it.
It now seems likely that there will never be explanations of any of these monumental and world-changing events that convince everyone, or perhaps even a majority. But there are distinctions to be made between credible conspiracy "theories," and insane explanations promoted as fact.
I reiterate: I don't know Robert Kennedy, Jr. But it's not hard to imagine the residual suspicions and perhaps the psychological damage that might come with the burdens of uncertainty with such high stakes, for a nephew of JFK and the eldest son of RFK. A very long book about Robert Kennedy's 1968 presidential campaign by Lewis Chester, Godfrey Hodgson and Bruce Page was titled An American Melodrama. I think of RFK Jr.'s campaign as an American tragedy, a further ramification of that day in Dallas sixty years ago.
But there is another 60th anniversary this week that is of course related but too often gets overlooked as it is conflated with the assassination itself: the funeral of John F. Kennedy, and what that was like for America and the world. Which is my subject next time.
ad from the 1930s
“Mr. Lester Smiley, Vice-President of the American Chicle Company, will be on the campus Friday, February 23. He would like to hold a group meeting for those men interested in a job opportunity with their company. The interviews will be held on Friday, February 23 in the College Placement Office. Mr. Smiley is a Knox College graduate and, as you know, we have placed many Knox graduates with American Chicle.”
I can quote this notice so precisely because I experienced it as a bit of found poetry, and literally stapled it into the draft of the play I was writing, “What’s Happening, Baby Jesus?” When the play was performed that May, freshman Michael Shain came out in a business suit and recited it, with the cheerful addition: “So come on out and keep America chewing!” It got one of the bigger laughs of the show.
Rockford High yearbook
“Chewing gum” as we know it began in the USA, though humans everywhere have been chewing stuff without swallowing it for a very long time. Some Indigenous peoples in South America (for example) chewed particular plants for energy and stamina, and/or to get high. Chewing tobacco is another such instance.
People chewed various leaves, nuts, twigs and gummy substances for millennia, as breath sweeteners and digestive aids, to stave off hunger and thirst, and just for the fun of it. Denizens of the far north chewed whale blubber, and Europeans chewed animal fats, sometimes in social hours at the end of meals (hence, perhaps, the expression “chewing the fat” to mean convivial—and trivial—conversation, though the origins of this phrase are obscure, based on what seem to be barely educated guesses.) By the nineteenth century in America, chewing wax was the popular if not entirely satisfactory favorite.
But the substances we know as chewing gum had their origins in the 1850s. For some of us you could say the story starts with Davy Crockett.
Just about anyone who went to Knox College—or any college-- in the 1960s would have experienced the Davy Crockett craze of the 50s, centered on TV films starring Fess Parker, shown endlessly on the Disneyland anthology hour. The last of the three supposedly biographical tales was about Davy Crockett joining the heroic band defending the Alamo—150 or so men facing 1500 Mexican soldiers. After holding the Alamo for thirteen days, Davy Crockett and his compatriots were all killed in the battle or executed afterwards. The general of the Mexican forces, who was named but never seen in the Disney film, was Santa Anna. We knew that name.
Antonio Lopez de Santa Anna was not only the General of the army ruthlessly intent on putting down a rebellious attempt of Texas to secede from Mexico—he was also at the time the President of Mexico. In fact he was President of Mexico at least five times. He was also the general who, soon after the Alamo, was defeated by Sam Houston's forces, and thereby lost Texas. Later he was the general (and president) who provoked and then lost a war with the entire United States.
Santa Anna's career was marked by idealism, hypocrisy, vanity, charm, chicanery, avarice, incompetence and betrayal, and by a remarkable ability to survive. He was also in it for the money. His brief last term as president was a mockery of a monarchy. After he was deposed he went into exile, and ended up for a time in—of all places-- Staten Island in New York, where he cultivated a partner in a get-rich-quick scheme.
There are several versions of this story. In one, he brought with him about a ton of chicle, the sap of the Manilkara zapota or sapodilla tree, an evergreen found in jungles of southern Mexico and Central America. His American partner (or employee or go-between-- the exact relationship varies with the telling) was Thomas Adams, a photographer and inventor. Santa Anna liked to chew the chicle himself, but Adams was tasked with finding a market for it—as the basis for rubber tires attached to buggy wheels. Santa Anna convinced him it was a great idea.
No one was interested. Then Santa Anna decamped, leaving Adams with a ton of chicle. He recalled how one of his sons picked up the habit of chewing it from Santa Anna, and got another son, a traveling salesman, to try selling it as a chewing substitute for paraffin wax. Probably unbeknownst to him, the product had been test-marketed for ages by the Aztecs, who chewed this chictli. But after a little success, Adams gave it the American mass-production spin by inventing a machine in 1871 that divided the chicle into strips. He also invented the first gumballs.
An enterprising druggist in Kentucky began adding flavor to the gum, though the taste was medicinal. Adams then created a licorice-flavored gum he called Black Jack. It was the longest surviving brand of chewing gum, still sticking to the bottom of movie theatre seats almost a century later. Adams sold Black Jack and another venerable brand, Tutti Frutti, through vending machines in New York. Later he would market Chiclets. In the late 1880s, a candy store owner in Cleveland named William J. White, who supposedly invented chewing gum all over again when he mistakenly ordered a barrel of Yucatan chicle, added a peppermint taste. Then other brands familiar to my generation began arriving. Physician Edward Beeman began processing pepsin for its stomach-soothing properties, and heeded a suggestion to add it to chewing gum: Beeman's Pepsin Chewing Gum. It was still marketed as such in the 1950s and possibly longer.
In 1890, Adams brought together several existing manufacturers, including Beeman and White, to form the American Chicle Company. Eventually it would become an international giant. Together with production and product refinements (the first Adams chewing gum was the consistency of taffy), chewing gum became a lucrative product.
Helping that popularity along was a former soap salesman named William Wrigley, Jr., who introduced his Spearmint gum in 1892, followed by Juicy Fruit in 1893. Wrigley was also a pioneer in advertising and publicity—perhaps giving rise to the expression “selling it like soap.”
Wrigley headquartered his own company in Chicago, which became enormously prosperous, presenting the city with the Wrigley Building and the classic ballpark Wrigley Field for his Chicago Cubs baseball team.
By my childhood in the 1950s and adolescence in the early 1960s, chewing gum was a somewhat controversial but still ubiquitous part of everyday life. The brands we knew and chewed included some of the age-old: many early brands failed, but bright yellow Juicy Fruit packages were everywhere, and Black Jack and Cloves rattled down from the lobby vending machines at the movies.
There was chewing gum for everything: Aspergum contained aspirin; there were nicotine gums; my father regularly chewed tablets of an antacid gum called Chooz. By the 1970s there were sugarless gums, marketed as dieting aids.
My generation got the gum habit from childhood bubble gum. It had a separate and later history, because it took longer to create a gum that produced durable bubbles. But the techniques were finally perfected, and the Fleer Company began selling Dubble Bubble in the 1930s, though with sugar shortages it devoted its entire production to the armed forces in World War II and didn't resume domestic sales until 1951. Around then, Topps began selling Bazooka bubble gum.
In my childhood we got Dubble Bubble and Bazooka in small, fat squares, wrapped tightly and individually. Both brands were wrapped on the inside with paper containing a comic strip or panel. For Bazooka it was the adventures of Bazooka Joe. Dubble Bubble’s hero was called Pud. Both also included fortunes; Dubble added interesting “facts.”
In the 1930s, Fleer also started selling packages of bubble gum with cardboard photos of major league baseball players. By my 1950s childhood, Topps had joined and perhaps surpassed them. Those packages were single thin rectangles of gum slightly smaller than the cards that we all collected, treasured, traded and played with. It was also our early form of gambling, as you did not know what players you were getting in each package.
We also got football cards, with pictures of professional football players (much less popular than baseball players--college and even high school football teams were better known then.) Eventually there would be cards of many different kinds: from Davy Crockett to the Beatles, Star Trek and Star Wars, and yes (I reluctantly admit) the Brady Bunch.
Chewing gum in a bewildering number of new tastes continues to be sold around the world. Still, as the 20th century wound down, chewing gum began to lose its cultural flavor. It was less fashionable, a little déclassé. Beemans got a boost when the Tom Wolfe book and the 1983 film The Right Stuff revealed it as famed test pilot Chuck Yeager’s favorite ritual before a dangerous flight. But that didn’t slow the trend downward.
The fortunes of those great chewing gum companies have followed, along with the disappearance of many classic brands. American Chicle is gone, and Beech-Nut is back to making just baby food. Only Wrigley, now a subsidiary of a candy company, remains an international giant in chewing gum.
But such is the power of nostalgia for classic chewing gum brands that a wrinkled up package, a decades-dry stick or related item can fetch tens, hundreds, even thousands of dollars. And of course it’s become a cliché of my generation to bemoan the bubble gum baseball cards that were thrown away with the other detritus of childhood. A 1952 Mickey Mantle hauled in $12.6 million. Chew on that awhile.
Thomas Adams (and sons) put their name on several products, and the Adams name was used for others long afterwards. This is one of the brands that didn't last.
Technology that filmmaker Peter Jackson developed for the Get Back TV film made it possible recently to extract Lennon's voice from the piano and extraneous noise on the 70s cassette, which was one of the problems that led to the song being abandoned in 1995. But George had recorded a guitar track before they gave up on it, so that was available for this recording. With Paul playing bass and slide guitar, and Ringo on the drums, this song--now titled "Now and Then"--includes all four Beatles. Since there are no known recordings of new songs with both John and George (who died in 2001), this is in effect--and officially--the last Beatles song.
The song was first played on BBC radio, and became available on vinyl, together with the Beatles' first release, "Love Me Do." The official version appeared on YouTube with just the plain blue and gray box, with the song's title in white. Although John's demo has been unofficially available, I'd never heard this particular song before.
The first thing that got to me was the clarity and vitality of John Lennon's voice. At a time when death seems all around, this was less a revival than a resurrection. The melody is enchanting, and the production is seamless. I liked the quickly neglected 90s songs more than many others did, but this is a contemporary Beatles record--part now, and part then.
The song itself has been edited from the demo version (partly because John hadn't finished the lyric), given a slightly faster tempo, and generally gets the Beatles treatment, with guitar solo and a string orchestra section. Paul and Ringo do new vocals but high harmony backgrounds by George, Paul and John are taken from previous recordings, notably "Because" on Abbey Road. There is one entire section of the demo that was dropped (which furnished its bootleg title, "I Don't Wanna Lose You"), and that's now the center of online debates among Lennon enthusiasts. There were probably technical reasons for this as well as musical ones, but the song is given a classic Beatles shape, and I love it.
On November 3, the official music video by Peter Jackson was released. It contained a few images never seen before, but that isn't its main contribution. Jackson crafted these few minutes to celebrate the Beatles career but most tellingly, he does so in the context of this song. "Now and Then" can seem to be a wistful meditation on a delicate love relationship, perhaps a lost love (The interrupted relationship with Yoko is an obvious possibility.) But the video presents it as a commentary on the bond among the Beatles themselves, estranged as they were for a time in the 1970s. (Though throughout that decade, two or three or even all four of them continued to work together on solo projects, and the big break between John and Paul was largely healed by the late 70s.)
Instead of just intercutting clips from the past with some shots of the 2023 and 1995 recording sessions, Jackson created dazzling combinations in which Beatles from different eras interacted with each other, and even with versions of themselves from another time, just as memory is part of the present. The video suggests the power of playing together over time, with the visceral effect of presence even in absence. Now and then exist together. (Another layer is added with the brief shot of Giles Martin, co-producer of this record and son of George Martin, who produced all the Beatles records. In this shot he even looks like his father.)
The video starts with the reality: Paul and George working out guitar for the song in 1995. While John's voice is heard, the first images aren't of him but the gauge that measures the sound level, moving with his vocal. He's present through the machines. And yet...
To get inside the song, the first image of him is a kind of silhouette, in which a scene of sunset (or sunrise) on the sea is reflected in his glasses, and a mirage appears of the four very young Beatles clowning around. Then the dizzying combinations of all four of them in different times, mostly playing together (suggesting a higher octane version of the subtly surreal baggage car scene in A Hard Day's Night, and some scenes in Help!) The images of young Lennon are so bright and full, though Jackson presents only John the cut-up and clown, rather than his more intense or gloomy or distant moments. In fact, the others are also seen frequently being silly. (Even present-day Paul gets in the act, mocking his shot playing bass, with a mugging young McCartney behind him.)
The video ends with a series of images that go backward in time to the four as young boys, and then the famous early 60s black-and-white bow from the stage with the bright Beatles sign behind them. Then their images disappear and the light goes dark.
Inevitably, the song applies to us as listeners, for however long the Beatles have been part of our lives. For me there was a time I thought of them every day. I needed their rhythms to face everyday life. These days I think of them only now and then, and I miss them. But I know they will be there for me.
The idea and the basic structure were dreamed up and built many years before by his father, Charles Hinton, who was an inventor (he created the first baseball pitching machine. Unfortunately, it was powered by gunpowder.) Charles Hinton also wrote scientific romances in the era of H.G. Wells' classics, but chiefly he was a mathematician. And so the purpose of his jungle gym was to...teach his children math.
Charles Hinton came from a radical but highly educated family in the UK. His mathematical interest was what he called the fourth dimension, within which exist the three dimensions we know. Or something like that. (It wasn't Wells' version of the fourth dimension, which was time.) In the late 19th century, when he proposed his ideas (more influential now than then), he came to believe that people couldn't understand his fourth dimension because they really didn't know the mathematics of three dimensions.
So to teach his children how three-dimensional math works, he built a backyard structure to illustrate it, and encouraged his kids to identify the junctures of the x, y and z axes by climbing to each point and calling it out. They climbed all right, but they ignored the math lesson.
Charles Hinton built his structure out of bamboo, since he was in Japan at the time. Later he moved to the US, taught at Princeton (where he invented the pitching machine), and worked at the US Naval Observatory and the Patent Office, though he never bothered to patent his "climbing frame."
Years later his son Sebastian suddenly remembered it, and described it to an educator at the progressive school system in the Chicago suburb of Winnetka, who encouraged him to build a prototype. It was tweaked, and eventually kids in Winnetka were climbing on the first jungle gyms (one of which still exists, also made of wood) and Hinton filed his patent. He didn't personally profit by it or see its success, for this is also the centennial of his death.
The patent referred to the structure as a version of tree branches upon which "monkeys" climb. Experts say it's really ape species that do this kind of climbing, but kids are often called monkeys, and the name stuck for one part of the jungle gym: the monkey bars. The jungle gym has been varied over the years, getting more elaborate and more safety- (and lawsuit-) conscious. But something like the original still features in many if not most playgrounds and a lot of backyards.
There have been a few notices in the media of this centennial, notably the NPR All Things Considered segment by Matt Ozug. But no one answered the question that I had (nor did they ask it): The name "Jungle Gym" seems like an obvious pun on "Jungle Jim," of comic strip, film, radio and TV fame. But is it?
Nope. Sebastian Hinton patented what he called the "junglegym" in 1923. Jungle Jim didn't appear in the newspaper comics pages until 1934. Jungle Jim was created by comics artist Alex Raymond (with writer Don Moore) as a lead-in to Raymond's other famous hero, Flash Gordon--they both appeared for the first time on the same day. The other thing that seems obvious about Jungle Jim is true: he was created to compete with the wildly popular Tarzan, who started out in a series of novels by Edgar Rice Burroughs, then swung into the movies (the first Tarzan in silent pictures was Elmo Lincoln, of Knox College) before dominating the funny papers starting in 1931.
So is it the other way around? Does Jungle Jim come from the Jungle Gym? The official story is that Jungle Jim Bradley was named after Alex Raymond's brother Jim. But did the brothers ever play on a jungle gym as boys? I await the definitive Alex Raymond biography to answer that question.
There’s no certain or complete explanation for the current social and political dysfunction in the United States. A full analysis would need to include psychology and whatever human nature might be, as well as strands of history going as far back as history goes. Any such speculations might be dangerous, and certainly inexact.
But human institutions arose to cope with various manifestations of psychology and historical tendencies, to establish some sort of common order and public good. It occurs to me that people under the age of 50 or so wouldn’t remember that in the decades after World War II, the United States reached a sometimes uneasy but functioning consensus on organizing its institutions to foster a basically stable society and political order, while providing the tools for the society to improve.
In confronting the traumas of the Great Depression and World War II, that consensus was forged, and it created the stability, growing prosperity and progress of the 1950s through the 1970s, regardless of which party had the White House or the majority in the houses of Congress. (By consensus, I don’t mean the lack of spirited disagreement or even vicious conflict, or even consciously held convictions. I mean a functional consensus, an acceptance of core institutions and roles of government to foster the common good.)
Major political institutions operated by rules and tradition, with the respect and consent of the governed, even when some or many of the governed were adamantly opposed to certain policies and leaders. The sanctity of the vote was never questioned.
To this was added the new core of the postwar consensus, which held that the government, particularly the federal government, had the key responsibility to guard and improve those public elements that benefited everyone. Just as FDR realized that new power generation in the South would benefit the entire region, Eisenhower saw that among those driving on a federal highway system would be patricians and paupers; that it would convey freight for everyone.
In order to pay for public projects, as well as the common defense etc., income taxes were graduated based on ability to pay, with higher brackets paying a higher percentage. The top bracket during the Eisenhower years reached 90%.
The government was seen as the arbiter that leveled the playing field, so that through regulations the costs of ensuring the health and safety of Americans would fall equally on all competitors. Those institutions deemed vital to everyone in the society, like electric and gas companies, and some hospitals, were publicly or community owned, or highly regulated in exchange for virtual monopolies in a given place. Everyone knew them as “public utilities.”
It wasn't perfect. McCarthyism and rigid conformity poisoned the 1950s, and institutions could not respond fast enough to challenges of the 1960s and 70s. And though the era served a large proportion of white people, blacks and other races suffered institutional prejudice, and largely rural sections of the country didn't share in prosperity. But progress was being made, slowly and painfully, and largely within established political and social institutions (which included expressions of dissent.)
With the election of Ronald Reagan in 1980, all of this began to change. The Reagan administration began the series of deep cuts to the taxes paid by wealthy individuals and corporations, depleting government funds, impoverishing programs and adding to the federal deficit. Over time, this was a prime factor in an income transfer from the middle class to the very few most wealthy.
Other devastating Reagan era policies included de-regulation and privatization. De-regulation hampered or ended government’s ability to monitor and control vital aspects of health and safety. At the same time, government or public institutional functions were transferred to corporations, on the theory that services would somehow improve when hefty profits were added to claims on income.
Big changes take time to exhibit their full effects, but the world we’re living in is largely a result of these and similar changes, many of which have become the new orthodoxy, especially as fewer remember when they were not so.
Apart from the near collapse of institutions that used to be considered essential for the public good, resulting in a general cynicism and sense of helplessness, the most conspicuous change is the extent of the gap between the few obscenely wealthy and everyone else. It’s not just the proportion that’s the problem, but the lack of security of what must be close to a majority of Americans. The usual example is that they are only a few inflated medical bills away from financial disaster. This is a double failure—the income transfer, and the lack of public support for a general and essential good, medical care. Though it may seem that anger is the chief motivation of frenzied politics, beneath that it is fear.
Other factors play into this, such as the de-industrialization and resulting collapse of labor unions in the 1970s and 80s. But the social insecurity that is roiling politics, making them more extreme, can largely be traced to how the US responded: with Reaganomics.
At the same time, de-regulation began to extend to political contributions, until the Supreme Court made them nearly boundless. Money is a huge factor in making our politics crazy. Money plus television and Internet celebrity multiplies the insanity.
Just as both parties participated in the postwar consensus, the responsibility for our current condition resides also in both Republicans and Democrats. Republican policies laid the groundwork, and Republican politics are wounding and challenging institutions of our very republic. But Democrats have largely failed to confront the basic problems. Too many mollify wealthy interests for political gain. And to date no major Democratic leader has plainly explained these problems, and proposed direct solutions as the core of an agenda.
What we are in part witnessing is a series of challenges to core institutions, and the institutional responses. These are dramatic, and involve individuals, from federal judges to county election clerks. The outcomes of these battles themselves tell us whether our institutions will be able to respond to both the usual and the unique challenges the world—and the planet—present us. For the sense of selfishness and cynicism legitimized by the end of this consensus has given us so-called public servants who openly act with no responsibility to the public good, but only to their careers and the ideologies they believe they can ride to power and wealth.
The institutions of the world also face severe challenges, even apart from the greatest challenge of all, the climate distortion crisis. Right now warfare in the Middle East is the latest. I had the eerie experience of coincidentally reading an essay by Joseph Campbell just as the first Hamas attack became news, detailing the “war mythologies” of this region, copiously quoting both the Old Testament and the Koran on the divine instructions for waging war without limit, with no cruelty too great.
Campbell wrote this essay during the 1967 war, and commented: “These, then, are the two war mythologies that are even today confronting each other in the highly contentious Near East and may yet explode our planet.” But we are also witnessing the efforts of established and ad hoc international institutions to limit and end the violence, to maintain some sense of common humanity. International institutions also arose as part of a post-war consensus, though they were never very strong. They too are being challenged now.