Friday, February 23, 2018

Present: A Quotation

A Parkland student speaks: leading a nationwide revolt of high school students and others to demand gun control. Their scheduled march on Washington now aims for 500,000 participants.

"With all our focus on the mental illness of crazed killers like Nikolas Cruz and Stephen Paddock, we tend to lose sight that it’s another sign of mental illness that American political leaders and their apologists do nothing while the country literally destroys itself with gun violence."

Frank Rich

Wednesday, February 21, 2018

Past Future: Anticipations

“…Wells was a prophet, not merely in the popular sense of having predicted space travel, processed food and the Common Market, but in the wider sense of being able to think in a completely new way about the future---to be, as it were, at home in the future.”
Roslynn D. Haynes

“Something in us pursues information and data with some passion, but the soul is always eager to hear another story.” Thomas Moore

To remember the future, anticipate the past...

The idea of applying scientific principles to anticipate the future was not new when 1970s futurists claimed it as their defining and innovative procedure. Nor was it new when RAND and Herman Kahn started up their computers in the 1950s.

This approach to the future, re-discovered in the middle of the 20th century, had been proposed and applied as the century began. At that time it was the work of one man: H.G. Wells.

At the turn of the twentieth century, Wells was known for his novels and short stories, many of which he described as “scientific romances.” These include the classics for which he is most remembered in the 21st century, such as The Invisible Man, The Island of Doctor Moreau, The War of the Worlds and The Time Machine.

But he became famous in his own time as the result of a series of magazine articles on the future. They were published in England and the United States in 1901, and collected the following year in a book, titled Anticipations. It was a surprise success, outselling any of his novels.

Wells described a future dominated by big fast cars, long-distance buses and fat trucks traveling on wide asphalt highways, passing mega-cities and prefabricated suburban communities of air-conditioned homes filled with labor-saving appliances, along a solid sprawl spread unbounded from Boston to Washington, D.C.

Futurama 1939
This was still the future in 1939, when Futurama showed it at the New York World’s Fair. But Wells foresaw it before Henry Ford had sold a single Model T, or London had a single motorized bus.

Wells foresaw electric appliances and other labor-saving devices in homes, when English homes were lit by coal gas lamps, or even oil lamps and candles.  He also saw social ramifications: the end of house servants, the beginning of keeping up with the Joneses.

He saw darkness in the future as well: a series of major and highly mechanized wars. The next war would feature the “land ironclads” (or tanks, also not yet invented). As soon as mid-century, war would be dominated by aircraft, flying in formation in large numbers. He was writing two years or so before the Wright Brothers demonstrated a successful airplane.

Victory would depend on expertise and planning, but he also envisioned the ghastly consequences for soldiers, with a vocabulary that eerily anticipates words later written by poets and novelists who were soldiers in World War I: “Tramp, tramp, tramp, they go, boys who will never be men, rejoicing patriotically in the nation that has thus sent them forth, badly armed, badly clothed, badly led, to be killed in some avoidable quarrel by men unseen.”

But in the far future Wells saw nations coming together in common markets and then in a world state.

These were not wild prophecies, but were based on Wells’ wide reading. The book covers economics and class, forms of government, education, science, journalism and structures of social relationships and families, including improvement in the status of women. From the perspective of the actual future he got a lot wrong, and a few things wrong-headed (his advocacy of eugenics, for instance) that he needed to correct later. But in terms of the study of the future, Anticipations sets several standards.

“What made the book so successful was the willingness of the author to look at alternative futures,” writes Wells biographer David C. Smith, “to discuss his ideas with his readers, and to invite them to participate in his work of future-making.”

So Wells presaged alternative futures and participatory or democratic futures all at once. Even more basically, he made a case for the scientific study of the future, long before the 70s futurists.

He articulated this argument most elaborately in a speech, also in 1902, that was subsequently published as The Discovery of the Future.

He developed a daring thesis that the future can be foreseen, with at least the same accuracy as we know the past. “And the question arises how far this absolute ignorance of the future is a fixed and necessary condition of human life, and how far some application of intellectual methods may not attenuate even if it does not absolutely set aside the veil between ourselves and things to come. And I am venturing to suggest to you that along certain lines and with certain qualifications and limitations a working knowledge of things in the future is a possible and practicable thing.”

We can know as much about the future as about the past, he argued. Our knowledge of the past depends on partial information, inference and interpretation. As new information is uncovered, old information given new weight and relevance, and as interpretations change, so does our picture of the past.

This is especially true, he said, of the 19th century’s exploration of the remote past, such as the geological ages. Evidence is sparse, yet inferences can be made.
“...it may be possible to throw a searchlight of inference forward instead of backward,” to see the future as well as we see the deep past, using the same basic inductive methods. Such a vision of the future would be “infinitely more important to mankind” than even the view of the past that transformed 19th century thought.

It would be a scientific undertaking, he insisted. He reminded his audience that all science is ultimately about the future: a theory of gravity predicts what will happen when someone throws an object into the air, anytime in the future. The theories that result from scientific analysis of facts are tested by their ability to yield “confident forecasts.”

The key to forecasting the future is to combine scientific predictions of separate phenomena—including even “so unscientific a science as economics.” If “first-class minds” did something like that, couldn’t they create “an ordered picture of the future that will be just as certain, just as strictly science, and perhaps just as detailed as the picture that has been built up within the last hundred years of the geological past?”

Such forecasts would be useful in guiding present actions. There is “no reason why we should not aspire to, and discover and use, safe and serviceable generalizations upon countless important issues in the human destiny.”

“I must confess that I believe quite firmly that an inductive knowledge of a great number of things in the future is becoming a human possibility,” Wells asserted as his lecture reached its climax. “I believe that the time is drawing near when it will be possible to suggest a systematic exploration of the future.”

Wells’ speech was widely praised, and together with Anticipations, made him “a mind and a force to be reckoned with,” biographer Smith wrote. “He had become a great man.”

But no one seemed to take up the challenge to develop a scientific study of the future—not until the idea was rediscovered a half century later.

Wells himself wrote many more non-fiction books, all of them in one way or another expressing his vision of things to come. He had announced this commitment that guided the rest of his long career in The Discovery of the Future. He divided those who find their direction in the past from those who look to the future. The past-oriented look for causes, the future-oriented for possible effects. “…the main dispute even in most modern wars…will be found to be a reference, not to the future, but to the past…”

The difference defines divergent ideas of a person’s purpose in life. One sees it as “simply to reap the consequences of the past,” while the other believes “our life is to prepare the future.”

It was clear which Wells was going to be. “Yet though foresight creeps into our politics and a reference to consequence to our morality, it is still the past that dominates our lives. But why? Why are we so bound to it? It is into the future we go, to-morrow is the eventful thing for us.”

But in championing the future Wells did not restrict himself to articles, commentaries, non-fiction books and speeches. He returned to fiction, including many novels about the future. Some of his fictions were also eerily accurate about the future within his lifetime, but he did no more systematic forecasting of the kind he began in Anticipations.

One reason could be that in fairly short order, he realized there were limitations to a scientific study of the future.

Partly because of the stir that the Anticipations articles created in the United States, Wells was invited for a lecture tour. He went, and subsequently wrote a book, The Future in America, published in 1906. There he retreated from his claims for a “scientific sort of prophecy,” admitting that his Discovery of the Future lecture “went altogether too far in this direction.”

" Much may be foretold as certain, much more as possible, but the last decisions and the greatest decisions, lie in the hearts and will of unique incalculable men,” he wrote. “With them we have to deal as our ultimate reality in all these matters, and our methods have to be not 'scientific' at all for all the greater issues, the humanly-important issues, but critical, literary, even—if you will—artistic. Here insight is of more account than induction and the perception of fine tones than the counting of heads.”

The artistic method he often chose—the method he returned to—was storytelling. In addition to the “perception of fine tones” there are other decisive features of stories that make them our portals into alternative futures.


Story, writes Mark Turner, a cognitive scientist as well as an English professor, “is the fundamental instrument of thought.”

Story is also “our chief means of looking into the future, of predicting, of planning, of explaining.”

“We look for stories in everything. If stones are thrown on the ground, they make a story," observed novelist Brian Kiteley. "Consciousness is a story. It is a condensing of the world’s details into narrative.”

“Knowledge is stories,” writes Roger C. Schank, former director of Yale’s Artificial Intelligence Laboratory. Explanations in science itself tend to be narratives, even beyond the basic story of the if/then statement. “Science is the storytelling of our time,” asserts William Irwin Thompson. “By telling stories about our origins, from the big bang to the African savanna, science is really telling stories about what and where we are and where we want to go from here.”

As worthy and useful as futures studies might be, they are not what "the future" means to most people. Neither jargon-propelled generalities nor carefully couched projections and matrices quite get to the texture of life, to the quality of the future, the smell of it. Nor can they set up a realistic dynamic that makes room for behavior, emotion, creativity, spontaneity, instinct, dedication, insight, perception, perverseness, influence, aspiration, faith, courage, learning, intent, love, hope, imagination…in a word, soul.

Producing that kind of scenario is not really the futurist's role. That's the job of the storyteller.

In envisioning a future, stories can better provide the human element through characters and their emotions. This adds a necessary complexity to these futures.

Stories can provide motivation and points of view. They can expose various self-interests. By showing effects on characters, they add the crucial condition of emotion, of human reaction and response.

Stories can explore consequences for the individuals and groups they tell us about. Stories can dramatize challenge and costs to society and individuals, and suggest qualities of mind and spirit needed to confront those consequences, to meet those challenges.

Stories can place a future in several layered contexts—the personal, interpersonal, cultural, and societal. They can place the future in the context of time—of present and pasts. They can suggest the context of an entire world.

In his fictions, H.G. Wells exemplified all of this. In her study of the influence of science on Wells’ writing, Roslynn D. Haynes concluded, “Thus Wells tends not to examine scientific principles per se, but their effect on individual characters and their causal implications for society or mankind as a whole.”

Since stories appear to be how we think and how we learn, they also have the advantage of seizing our attention, and remaining in our memories. Reports clotted with numbers and levels of probabilities, snowstorms of data plowed with arcane instruments of analysis, plus charts and graphs of dubious usefulness, aren’t likely to give many people a picture of the future. Nor much hope for the future either, beyond anticipating getting to the end of them and going out for ice cream.

But something that reaches almost everyone is a story. Stories may provide a sense of one alternative future, but more than that, stories suggest meanings. When we experience a story, feel what the characters feel, we get some sense of what the complex essence, what the soul of such a future might be.

Stories can’t do it all, any more than a scientific report or a comprehensive description. Stories are limited by their form, and by the skills of the storyteller. They have beginnings and endings, and concentrate on significant events. They may use analogy, irony, paradox. Or they may be quite a bit cruder in their effects. But they have definite advantages in considering the future.

They link with other stories and archetypal themes that suggest humanity’s deepest desires, fears and aspirations. In bringing us into the story and helping us to identify with characters, they offer the possibility of empathy, not only with our potential future selves in a given alternative future, but also with others.

That is a crucial element in the kind of systems thinking that was a cornerstone of 1970s futurism. According to one author on the subject, “the systems approach begins when first you see the world through the eyes of another.”

Some stories hope to tell us what future to avoid. Others suggest how we need to change to get to a future we need. Futurist Willis Harman noted: “The most carefully designed social measures will not achieve their desired goals unless they involve not only rationally designed programs and structures, but also changes in deep-rooted beliefs, values, attitudes and behavior patterns.”

“There is only one crisis in the world,” wrote futurist John Platt. “It is the crisis of transformation.” Stories can help us get to where we need to be.

In the end, all we actually have of the past or future are the stories we tell ourselves. So stories also reflect back to us what we feel about the future in general, or one or another alternative tomorrows.

The rest of this series will concentrate on such stories, their contexts and ramifications, as reflections of how we think and feel about the future.

Most of these stories are widely known and shared, many have historical significance, and a few are lesser known but just as indicative. They tend to be foundation stories—the first or best known, in print or as movies or television—of a particular kind, representing a significant view of the future. In many cases, other stories like them followed, including those still being told.

But fair warning: even the well-known stories may not always be ones that usually get classified as foundation stories of the future, or of particular approaches to the future. Analysis of the literature, for example, doesn’t always include bug-eyed monster movies of the 1950s, Star Trek: The Next Generation or the American Indian Ghost Dance movement of the 1890s.

We begin with the foundation story of “the future” itself. For several years before he wrote the now forgotten Anticipations, H.G. Wells proposed and dramatized the basic idea of “the future” we have accepted ever since. Not in any essay or exegesis, but in a story that has been read and remembered for well over a century. Before Wells invented futurism, he invented the future.

This future begins with the personal. Its roots are found in a dingy house in an obscure village outside London, where Bertie Wells was born in the mid-19th century. So why did the son of two former servants start thinking about the future?

 Because his life depended on it.

to be continued...

Monday, February 19, 2018

President Day


Nation Cruelly Reminded That It Once Had a President was the headline for satirist Borowitz when President Obama's portrait was unveiled on Lincoln's Birthday.  The story continued:

In a televised event that many deemed unnecessarily cruel, millions of Americans were briefly reminded on Monday that they once had a President.
Unsuspecting Americans who turned on cable news Monday morning were suddenly assaulted with the memory of a time when the country’s domestic affairs, international diplomacy, and nuclear codes were entrusted to an adult.

Compounding the cruelty of the televised event, the networks lingered unnecessarily on a speech that only served to remind viewers that the nation once had a President who rigorously obeyed rules of grammar and diction.

In a mercifully brief discussion at a Christmas party of the antipresident's domination of the past year, a friend commented that he was especially alarmed by what he had to let go by.  We've all had to survive by, on some level, ignoring the latest from the fulminating con man who has destroyed the presidency, perhaps forever, and so degraded the public world that sanity as well as blood pressure are threatened by paying the normal attention of a citizen.

This is perhaps why I noticed that while I was hanging out at a bookstore in Menlo Park for an hour before Christmas Eve festivities, several people came in asking for "the Obama book."  It was the book of photographs by the official White House photographer called Obama: An Intimate Portrait.  I looked through it, and even though I'd seen most of the photos before, it was hard to hold back tears.

I didn't buy it then (I got the Ursula K. Le Guin book--easier to carry home) but I ordered it when I got back to Arcata.  However the book was so unexpectedly popular that the edition sold out, and it has been unavailable for more than a month.  The good news is that yesterday I got an email saying I could expect my copy in early March.  I decided to get it because I want it to leave to younger members of my family, so they will have some evidence of the existence of what we used to call a President.

Update: A survey of historians released Monday concludes that James Buchanan is no longer the worst president in American history.  You might be able to guess who has replaced him, after only a year in office.  President Obama on the other hand has moved up into the top ten of best presidents.

Saturday, February 17, 2018

You Are Responsible (Updated)

Update Sunday: What seems to be different this time, however, is the extent to which the young survivors of the massacre have quickly and publicly banded together to take political action, urging adults in power to act on gun control. Today, junior Cameron Kasky and his classmates announced that they are planning a nationwide march on March 24th, dubbed the March For Our Lives.

NBC: Students are taking hold of the gun debate in the wake of the school shooting in Parkland, Florida, by organizing a series of school walkouts across the country and a march in Washington to protest gun violence.

The events begin with the Women's March EMPOWER branch, which is dedicated to youth-led advocacy. It has called for "students, teachers, school administrators, parents and allies" to take part in a national school walkout on March 14.

The goal is for everyone to walk out of their classrooms for 17 minutes at 10 a.m. "to protest Congress' inaction to do more than tweet thoughts and prayers in response to the gun violence plaguing our schools and neighborhoods," the organization said in a statement.

Students from Parkland, meanwhile, are organizing their own event on March 24 in Washington called the "March for Our Lives." The protest will "demand that a comprehensive and effective bill be immediately brought before Congress to address these gun issues," according to its mission statement. "No special interest group, no political agenda is more critical than timely passage of legislation to effectively address the gun violence issues that are rampant in our country."

NY Daily News: "Children are dying, and their blood is on your hands because of that. Please take action. Stop going on vacation in Mar-a-Lago,” Marjory Stoneman Douglas High School student David Hogg told Trump on Sunday in an interview with Chuck Todd on “Meet the Press.” “Take action. Work with Congress. Your party controls both the House and Senate. Take action. Get some bills passed. And for God’s sake, let’s save some lives,” the student continued.

SATURDAY: Student survivors of the mass shooting at Marjory Stoneman Douglas High School in Florida rallied on Saturday, with chants of "No More Guns!" and "You Are Responsible!"

Senior Emma Gonzalez was equally blunt in her speech: "If the president wants to come up to me and tell me to my face that it was a terrible tragedy, and how it should never have happened, and maintain telling us how nothing is going to be done about it, I’m going to happily ask him how much money he received from the National Rifle Association. But hey, you want to know something? It doesn’t matter, because I already know: $30 million. … To every politician who is taking donations from the NRA, shame on you!... They say that tougher gun laws do not decrease gun violence: We call BS!"

As the New Yorker commented in introducing Emily Witt's report on students in Parkland: It was a bad week for a lot of reasons, but at least we had evidence of one incorruptible value: the American teen-ager’s disdain for hypocrisy.

Other Floridians were also mad as hell, and rallied to say they weren't going to take it any more.  The AP reported: Pressure is growing for tougher gun-control laws after a mass shooting at a Florida high school, with thousands of angry protesters at state rallies demanding immediate action from lawmakers, and more demonstrations planned across the country in the weeks ahead.

One prominent Republican donor in Florida announced he wasn't going to give it anymore: no more money for Republicans who don't support an assault weapons ban.

Thursday, February 15, 2018

There is Nothing More American Than Gun Control

One of many lies the greedy gun lobby and its deluded followers perpetuate is that restrictions on guns are un-American, and specifically an invention of rich liberals to take away their rights so they can be controlled, or just have less freedom and fun.

And unfortunately even those liberals and the establishment press have accepted the premise that gun control is contrary to American history.

But there is nothing more American than gun control, not even guns.  And that specifically applies to the notorious Old West.  If you grew up on TV westerns you would know this, but fortunately a recent article by Matt Jancer in Smithsonian Magazine and a book by Adam Winkler provide the actual historical record.  For example, in the West:

“Tombstone had much more restrictive laws on carrying guns in public in the 1880s than it has today,” says Adam Winkler, a professor and specialist in American constitutional law at UCLA School of Law. “Today, you’re allowed to carry a gun without a license or permit on Tombstone streets. Back in the 1880s, you weren’t.” Same goes for most of the New West, to varying degrees, in the once-rowdy frontier towns of Nevada, Kansas, Montana, and South Dakota.

And in the South:

The practice was started in Southern states, which were among the first to enact laws against concealed carry of guns and knives, in the early 1800s. While a few citizens challenged the bans in court, most lost. Winkler, in his book Gunfight: The Battle Over the Right to Bear Arms in America, points to an 1840 Alabama court that, in upholding its state ban, ruled it was a state’s right to regulate where and how a citizen could carry, and that the state constitution’s allowance of personal firearms “is not to bear arms upon all occasions and in all places.”

But today we've insanely regressed to the point that this hardly qualifies as a civilization.  Our moral morons who prevent gun control are now as responsible for gun violence and deaths in Florida and elsewhere as the actual shooter.  Perhaps more culpable, if the shooter is mentally ill.

They are indicted by the merest recitation of the facts.  From Think Progress on Wednesday:

A 19-year-old former student opened fire at Marjory Stoneman Douglas High School in Parkland, Florida on Wednesday afternoon, killing 17 people.

The suspect, identified by law enforcement officials as Nikolas Cruz, chose the same weapon used to carry out some of the deadliest mass shootings in U.S. history: an AR-15.

It’s the same assault rifle that was used at Sandy Hook Elementary School, where Adam Lanza killed 27 people, including 20 first graders. It was used in Aurora, Colorado, where James Holmes killed 12 people in a movie theater. It was used in San Bernardino, California, where Syed Farook killed 14 people. And it was used in Las Vegas, Nevada, where Stephen Paddock killed 58 people — to date, the deadliest mass shooting in United States history.

The AR-15 is the most popular rifle in the U.S. according to the National Rifle Association (NRA); there are an estimated 9 million in circulation as of 2014. Its popularity is likely due to the fact that it is extremely customizable and accommodates high capacity magazines that can fire off 100 rounds or more within minutes, features the NRA frequently touts on its blog.

The rifle’s popularity is also likely due to how frequently the NRA promotes it. The organization’s Twitter feed has regularly featured messages of admiration for the AR-15, even after the assault rifle was used to murder numerous people in several mass shootings.

From New York Daily Intelligencer on Thursday, noting that since the antipresident took office a year ago:

... the president has rolled back a rule that would have made it more difficult for the severely mentally ill to obtain firearms; made it easier for fugitives to purchase guns; and proposed $12 million in cuts to America’s background check system. He has also tried to slash $625 million from federal mental health programs, and $1 trillion from Medicaid, one of the top sources of health insurance for the mentally ill in the United States.

And, of course, Trump and his party have prevented any piece of legislation regulating the firearms market from making it into law.

By media accounts, the Parkland police did the best they could under current law in monitoring a troubled young man tempted by guns, as did his parents.  The school had practiced procedures in the case of a shooting and were as prepared as they could be.  Teachers acted heroically to protect their students.

But against an AR-15 in an open carry state, and the people running Florida and this country, they were helpless.

Update 2/16: According to an article in the Washington Post, the number of school shootings in the US in 2018 provided by a nonprofit group and widely reproduced (by ThinkProgress among many others) is inflated.  The Post's argument is persuasive, and I've removed those statistics from the ThinkProgress quotes.  Inaccuracy in such matters is damaging because it's distracting and, well, because it's inaccurate.

The figures that the Post accepts are bad enough: An ongoing Washington Post analysis has found that more than 150,000 students attending at least 170 primary or secondary schools have experienced a shooting on campus since the Columbine High School massacre in 1999...A recent study of World Health Organization data published in the American Journal of Medicine found that, among high-income nations, 91 percent of children younger than 15 who were killed by bullets lived in the United States...On average, two dozen children are shot every day in the United States, and in 2016 more youths were killed by gunfire — 1,637 — than during any previous year this millennium.

The Post piece is accompanied by the photo below, of a mother outside Parkland school after the shooting.  I noticed the smudge on her forehead: not only was the shooting on Valentine's Day but it was also Ash Wednesday, a holy day of repentance for many Christians.  The ashes are reminders of death.

Tuesday, February 13, 2018

Past Future: Day of the Futurists

“The challenge of the times can also be based on the future, which may challenge us to examine and prepare in advance to solve the problems it has in store for us....The future works upon the present only to the extent that the present can receive the challenging images it broadcasts.”
Fred Polak
Images of the Future

“We cannot humanize the future until we draw it into our consciousness and probe it with all the intelligence and imagination at our command. This is what we are now just beginning to do.”
Alvin Toffler
Introduction to The Futurists 1972

To remember the future, anticipate the past...

They weren’t sure what to call themselves. Europeans apparently preferred “futurologist.” Somebody came up with the nifty category of “futuristics.” Most North Americans seemed to settle for “futurist.”

Futurism had been an early 20th century term: part manifesto, part marketing, for a group of Italian artists. Now in the 1970s futurists were to be something different, and something new.

Alvin Toffler
They brought skills and knowledge from many existing areas. In his 1972 anthology called The Futurists, Alvin Toffler listed some: “anthropology, biology, engineering, philosophy, mathematics, physics, astronomy, sociology, art, economics, history, journalism and a dozen other disciplines.”

Toffler quoted futurist John Platt on what defined the futurist field: “a considerable number of organizations, institutionalized programs and individual workers...whose major, and often sole, activity is concerned with the study of the future.”

Studying the future was a seemingly new (and paradoxical) occupation. While institutions and individual professionals had been moving towards this concentration since at least the early 1960s, it was the popular success of Alvin Toffler’s 1970 book Future Shock that made futurists part of the public dialogue.


It was an immediate global best-seller, and its bold paperback cover, issued in several different fluorescent colors, was everywhere for several years. Its terminology became part of professional inquiry and popular conversation, but its title alone was enough to convince publishers that there was a keen public interest in the future.

 A flood of paperbacks ensued over the decade (Profiles of the Future, The Future of the Future, The Future As History, Creating Alternative Futures, Futures Conditional, The Futurists) as well as many other books about the future, including more best-sellers.

In Future Shock, Toffler studied how people could or should respond to change that was only going to happen faster and more thoroughly in the onrushing future. The subject was actually “change shock” as a new form of culture shock, but it was the future in the title that magnetized attention.

Others explored possible futures through art and performance, or envisioned and advocated for desirable futures in various media. The futurists with the most institutional support concentrated on ways to predict and rank future possibilities, to provide information that could be used to prepare for those possibilities, and to influence the direction of change.  Businesses were intrigued, and eventually so was the government.  After the upheavals of the 1960s, the seventies were seeing widespread changes, and both wanted to get ahead of them.

These futurists studied actions in time. They analyzed data to discern trends, measured resistance, and extrapolated where the trends led.  They looked at countervailing forces and possibilities, access to resources and other factors.  Beyond the "if present trends continue" futures, what might shape the future differently?

The future they examined could be quite narrow: what will be the market for paper towels in 1990?  Which suburbs will contract and which expand in the next 20 years?  What will be the economic and social result of increased automation? Their futures could also be comprehensive--what will America and the world look like in the year 2000?

Scientific prediction had its acknowledged roots in military operations research during World War II and the subsequent work of the RAND Corporation and other think tanks. While still part of the Douglas Aircraft Company, Project RAND employed the first electronic computers in 1946 for their report on “Preliminary Design of an Experimental World-Circling Spaceship.” Eventually RAND studied urban development, health, education and other topics, but its main work was centered on nuclear war.

Herman Kahn in the 1950s
Out of their work in tandem with the U.S. military and policymakers came the terminology that haunted the 1950s and early 1960s: throw-weight, megatons, megadeaths. Their analyses resulted in paradoxical doctrines like Mutual Assured Destruction (or MAD), the "rationality of irrationality" and "thinking the unthinkable," among the phrases coined by the best known nuclear warfare theorist and RAND’s star, Herman Kahn.

Kahn is credited with introducing a key concept that made futurists possible: alternative futures. Instead of settling on one prediction, Kahn at RAND assembled packages of assumptions, applied them to data and came up with multiple outcomes. Borrowing an Italian term for the plot outline of an opera, each alternative future was described in a “scenario.” The now-familiar word is still used in this sense.

Kahn’s alternative futures (especially in his infamous book, On Thermonuclear War) were all grisly variations of unprecedented catastrophe. Still, the idea of alternative outcomes freed futurists to parse their packages of data and extrapolations into a range of probable and possible futures, as well as to start with preferable futures and work backwards to identify factors that could be changed in the present to make the preferable probable.

RAND and other futurists also employed a number of analytical techniques including information theory, game theory, statistical and cross-impact analysis, modeling and simulations. But the most basic conceptual tool was systems analysis. It was an attempt to look beyond individual factors to how they interacted.

Predictions were still tricky, however, as was the behavior of systems themselves. The study of systems suggested some surprising and paradoxical characteristics that applied to most systems over time, regardless of their content.

The perils of prediction and the limits of current capabilities were soon demonstrated in a fairly big way. In addition to ongoing issues (nuclear war, Vietnam, environment, poverty and racial inequality, for example) the early 1970s saw three international issues force their way into public consciousness.

The first was the fast-growing global population, which started a very public debate with the 1968 best-seller The Population Bomb by Paul Ehrlich.

Overpopulation and the disasters it might lead to (food shortages, resource depletion, overcrowding, loss of liberties etc.) were talk show topics in the early 70s, as well as prompting new organizations and public policies.

Then, as if fulfilling predictions in The Population Bomb, there were global food shortages caused in part by two disastrous harvests in 1972 and 1974. Though not directly related to these failures, severe famine in Bangladesh captured public attention in 1974. The country's earlier crises had already inspired the 1971 Concert for Bangladesh organized by George Harrison and Ravi Shankar, the first of many subsequent multi-artist benefit concerts.

At about the same time, the OPEC cartel of Middle Eastern oil-producing states imposed an oil embargo on western nations that supported Israel, leading to shortages and long lines at American gas pumps as well as quadrupling the price of oil. Another oil crisis later in the decade added to the sense of vulnerability, as energy became a prominent and emotional issue.

Population, food and energy had been the major factors considered and evaluated in the 1972 report on the future produced by the Club of Rome. It was called The Limits to Growth, and issued in paperback, it became another very controversial best-seller.

The study tabulated and compared statistics on food, population and industrial growth to come up with a computer simulation that forecast worldwide collapse. It set off an immediate, extensive and intense debate.

Critics called the report oversimplified, both because it omitted important relevant factors and because its understanding of the dynamics involved in population growth and food production was crude.

The Club of Rome was sensitive enough to these critiques to produce a more sophisticated second report. Mankind at the Turning Point developed short-, medium- and long-range alternative scenarios, separated into regions and including other factors such as politics and values.

But the first report had already inspired a radical policy prescription that its critics warned could be dangerous and self-defeating as well as ethically repugnant.

But first we need to set the stage for this debate.

Another new area of public attention and concern in the 1960s was the natural environment. Rachel Carson’s bestseller Silent Spring, warning of the dangers of DDT, gave impetus to a movement to address pollution of water, earth and air by harmful chemicals and waste. By the late 1960s, environmental consciousness began to shift towards ecology, the long-term health of natural systems, which emphasized their future.  Ecology became a factor that at least some futurists included.

Meanwhile, the Apollo space program was working towards its goal of a lunar landing.  During Apollo 8, the first manned spacecraft to orbit the moon, astronaut Bill Anders took an unscheduled photo of the Earth from lunar orbit, subsequently known throughout the world as "Earthrise."

 This photo, as well as the subsequent "Blue Marble" photo of the Earth in space taken during the 1972 Apollo 17 mission, were widely credited with bringing home the message of ecology, and of the Earth as a single, vulnerable planet in the darkness of space.

Buckminster Fuller in the 70s
Enter Buckminster Fuller.

Considered the grandfather of futurism by some, Buckminster Fuller was at the height of his popularity in the early 1970s, especially on college campuses. An innovative thinker and designer since the 1930s, Fuller was known for his long talks and writings, which employed complicated sentence structure and a unique if highly precise vocabulary, advocating for “comprehensive anticipatory design science” and explaining that understanding is “symmetrically tetrahedral.”

But he also had a way with brilliantly catchy names and phrases. As an active sailor and former naval officer, Fuller maintained that this planet is an enclosed system spinning through deadly space, utterly and completely dependent on the resources it holds, like a ship on the high seas.  Those onboard systems include the properties of the atmosphere, photosynthesis and so on--all of which are indispensable to life.  If the resources run out and the systems don't work, there's no place to go for alternatives.

He called it Spaceship Earth, a metaphor that is also a deeply accurate description in many practical ways. It is also memorable, and along with the geodesic dome he designed, is still associated most prominently with his name.  It especially seemed to summarize the impression produced by the Earthrise and Blue Marble photos.

Fuller published a book called Operating Manual for Spaceship Earth, but his phrase was first popularized by Barbara Ward’s 1966 book titled Spaceship Earth that examined the impact of technology and other factors on the planet’s future. By the 1970s it was a well-known if not always well-understood term.

And then in 1974 an American ecologist named Garrett Hardin called Spaceship Earth a really bad idea.

Hardin wrote a series of articles about how to respond to the catastrophic future predicted by The Limits to Growth report. He asserted that because of the inexorable mathematics of population growth, nothing could be done to prevent massive starvation in underdeveloped parts of the world. He maintained as well that if the developed world even tried to help, both rich and poor nations would suffer. He wrote one of his articles to specifically oppose relief for the ongoing famine in Ethiopia, because it would encourage the overpopulation that he saw as the cause of the famine.

Together with Jay Forrester of MIT, who developed the computer models for the Club of Rome report, Hardin proposed that the U.S. save its resources, and engage in “triage” by parceling out surplus food for political benefit.

But Hardin also recommended that advanced nations like the United States adopt a long-term philosophy he called Lifeboat Ethics. He specifically named Spaceship Earth as the idea he was opposing.

He accepted the idea of a craft alone on the boundless ocean, with only the resources aboard to sustain life.  But his metaphor was the lifeboat: one per (rich) nation.

Hardin saw individual rich nations as lifeboats with limited capacity. If they tried to save victims in the water (symbolizing poor nations) by bringing them aboard, the lifeboat would capsize and they would all drown.

Hardin asserted that the metaphor of Spaceship Earth doesn’t work because a real ship would have one captain to keep the crew and passengers in line, and the Earth has no captain. Without one, resources to which everyone has access will be quickly looted, resulting in what he called “the tragedy of the commons,” a theory he had previously proposed that would itself become a center of debate in the 70s.

Hardin wrote that the Spaceship Earth idea, which implies the interdependence of all humanity, “can be dangerous when used by misguided idealists to justify suicidal policies for sharing our resources through uncontrolled immigration and foreign aid.”

Another reaction to the population debate was the 1973 film, Soylent Green.
Hardin’s assertions were quickly criticized on all fronts. Roger Revelle of the Harvard Center for Population Studies questioned his population projections and understanding of population growth.

Alan Berg of the World Bank questioned the accuracy of the food figures. Ecologist Barry Commoner and economist Robert Heilbroner both suggested that lifeboat ethics would lead to military conflicts, including nuclear terrorism.

Others said that he was identifying the wrong problems. Geoffrey Barraclough of Brandeis University was among those who maintained that prices and monetary policies rather than actual scarcities were the main culprits causing food shortages. Harvard’s Jean Mayer asserted that the problem was in the rich nations because they consume and pollute more. Nutritionist Frances Moore Lappé claimed that rich nations are wasting huge quantities of food, partly through overconsumption of beef: that lifeboat ethics is really hamburger ethics.

(Somebody may have also noted that immigration to America had in fact been controlled since the 1920s, and foreign aid was a very small item on the federal budget.)

Some countered Hardin and said that Spaceship Earth is the right metaphor. Limits to growth are real but they are global. Climatologist Stephen Schneider pointed out that long range climate change that affects agriculture could hit rich nations as well as poor. (Schneider would soon be one of the early voices warning of global warming.)

Hardin’s views of human nature and faulty history (particularly of cooperative use of common areas in England and elsewhere) were also attacked.  Though he may have set out to show that the needs of the many outweigh the freedom of the few, he wound up defending selfishness, cruelty and genocide.

Though even by the early 21st century a food crisis of such dire proportions had not materialized, and the economic interdependence of globalization has made the lifeboat metaphor problematic if not obsolete, clearly such fears persist, or at least recur.

In terms of 1970s futurism, the Club of Rome episode had at least one cautionary lesson. While some resistance to the report was noticeably due to a deep sense of denial that there are any limits to growth, some who accepted the report’s findings responded with panic—the kind of frenzy that might result in hasty adoption of extreme and ultimately self-destructive policies.

Still, by mid decade, futurism seemed to be exploding in size and influence. There were at least 20 independent futurist organizations in Washington alone, with futurists of one kind or another working for federal government agencies, Congress, foundations and corporations. The World Future Society claimed 1700 members just in the DC area. The Washington Chamber of Commerce itself employed 30 futurists.

Considering the future became a matter of law when the U.S. House of Representatives passed the Foresight Provision that required that a House committee study the future impact of bills sent to the floor. Congress also created the Office of Technology Assessment to keep Congress informed on possible effects of technological change.

Iowa Senator John Culver predicted that political decision-makers would turn to futurists for help on critical problems. The office of one representative from North Carolina ran a Congressional clearinghouse on the future, with seminars reputed to be among the most successful futurist activities in Washington.

In the Senate, a Democrat and a Republican cosponsored a bill to create a federal Office of Economic Planning. Russell Train, administrator of the Environmental Protection Agency under both Republican and Democratic presidents, called for “a continuing and comprehensive census of the future.”

Nationally, a consultant’s investigation found more than 35 major corporations employing forecasters, looking at least 20 years into the company’s future. Meanwhile at least a thousand college courses in future studies were being taught in North America, including at Harvard, UCLA and Simon Fraser University in Canada. They used textbooks and readers like Worlds in the Making: Probes for Students of the Future. (“Probes” was the word Marshall McLuhan used to describe his own investigations into contemporary phenomena. McLuhan also must be regarded as an inspiration and forerunner of formal futurism.)

MIT students were excited by prospects of Bucky Fuller’s World Game, a kind of Internet without service providers that attempted to match world resources with needs.

There were future studies courses in high schools, and at least one (on Long Island) had a curriculum entirely organized by studying the future. There were adult education courses at the New School in New York City, and at least two graduate programs in futurism. The University of Massachusetts offered a PhD in Futures Studies. According to an academic observer who kept track, courses at all levels were increasing by about 50% every two years.

There were parallel developments in western Europe and elsewhere in the world, including within the Soviet Union. As the head of one of those Washington organizations suggested, futurism was becoming “an intellectual industry.”

But there were already problems within this dynamic field, as became clear when the second general assembly of the World Future Society convened in Washington in 1975.

It is at this point that I enter the story. For several years in the 70s I’d been looking into the futures field as a freelance journalist. I read the literature, conducted interviews, and visited organizations and universities, mostly in Massachusetts and Washington. In that role I attended the aforementioned convention, and witnessed what went on.

Alvin Toffler and Barbara Marx Hubbard at the
1975 WFS Assembly
Organized in 1966, the World Future Society was the big tent of futurism. So gathering in the Washington Hilton in early June were an estimated 2800 participants: city planners in Hush Puppies, corporate forecasters in khaki suits and university students in cutoff jeans.

There were parapsychologists, science fiction buffs, and members of communes dedicated to ecological self-sufficiency. Participants included the director of Technological Forecasting of Tel-Aviv University, government planners from Japan and India, and the director of marketing for Hooker Chemicals and Plastics. There were also at least four members of the House of Representatives in attendance.

Formal sessions explored topics such as long-range economic and environmental planning, new technologies, systems analysis, and the future of religion. The exhibit hall overflowed with future-oriented products from flushless toilets to books on the anthropology of outer space.

Above all, there were the futurist stars. Young visionaries like Stewart Brand and Karl Hess mingled with Alvin Toffler and author Paul Goodman (Growing Up Absurd); both Toffler and Goodman roamed the halls between hotel meeting rooms wearing sunglasses. Economist Kenneth Boulding spoke at one session.  Roy Amara, head of the Institute for the Future in Menlo Park, California, delivered a paper. Senator Edward Kennedy (who’d pushed for the Office of Technology Assessment, and introduced three other future-oriented bills) was a featured speaker.  Senator Hubert Humphrey spoke on food security.

Herman Kahn in 1970s
The opening press conference on the first afternoon hosted the main eminences, including Harvard’s Daniel Bell and Herman Kahn, who’d left RAND and founded the Hudson Institute to research the future beyond nuclear war scenarios. He’d already issued Things to Come: Thinking About the Seventies and Eighties, and came armed with material from his forthcoming new book, The Next 200 Years: A Scenario for America and the World.

Kahn, who was once an inspiration for Stanley Kubrick’s Doctor Strangelove, was looking more like a white-bearded Santa Claus. He was speaking softly, gravely and authoritatively about the prosperous future ahead, when the stage was suddenly invaded.

The invader wasn’t a young protestor shouting slogans, but a neatly dressed middle-aged woman. She was Wilma Scott Heide, a past president of the National Organization for Women. Nevertheless, she was protesting. Standing uninvited on a stage occupied exclusively by white men, she embodied the point she had come to make: this vision of the future didn’t include any women. Specifically, the assembly had no women among its main speakers.

Wilma Scott Heide in the 1970s
As shocking as the moment was, the issue shouldn’t have been a surprise. In his introduction to The Futurists, Alvin Toffler had noted that women and people of color were underrepresented at the previous General Assembly of the WFS, and among futurists generally.

Nor should the protest itself have been surprising. The feminist movement had been making headlines (and disrupting meetings) since 1970. It was therefore also an ironic moment, since the largest organization dedicated to the future was having its feminist crisis several years after everybody else. Anticipating it wouldn’t have required systems analysis, just reading the newspaper.

The assembly leadership’s first response was to try to physically shove Wilma Scott Heide off the stage. The audience protested, and she had her say. She finished to applause, and some conciliatory mumbles from the leadership.

That night a women’s caucus was formed, and assembly leaders were persuaded to add a woman as a main speaker. She spoke the very next morning, and apart from being the hit of the convention, she changed its entire emphasis. She brought another kind of futurism to the forefront.

Hazel Henderson from around this time
She was Hazel Henderson, an economist with an interest in ecology, and an advisor to the Office of Technology Assessment. Born in England and retaining a British accent, she was tall, blond, and charismatic, and she galvanized her audience.

She told them that she wasn't interested in developing strategies and scenarios. "I have humbler goals. They are to open up processes and decision mechanisms, to expose underlying values and assumptions buried deeply in our so-called value-free methodologies.”

“Citizens now understand that professionals with narrow, specialist training cannot adequately define our problems,” she continued. “Not that professionals aren't essential to the debate, but they must now see where the limits of their technical competence end, and”—the key phrase—“where their values carry no more weight than those of any other citizen in a democracy."

Hazel Henderson was instantly the new star of the convention, and her message inspired a fierce energy and focus for the rest of the proceedings. No one from the World Future Society had predicted this.

She’d issued a clarion call for a kind of participatory or democratic futurism, to guide and critique the experts and their priorities. Once “alternative futures” were admitted, choosing desirable alternatives became possible. What kind of future do you want? Answers might depend in part on different experiences of the present. Differences in race, gender, occupation and interests, socioeconomic status, geographical region, urban or rural etc. were all likely factors in those experiences.

This is not to say that all futurists were the same, let alone confirmations of any stereotype. At an MIT futurist seminar I met an advertising copywriter who talked about vector analysis and Gurdjieff, a social scientist who was also a transcendental meditator, and a Harvard Business School student who had spent four years in Vietnam. There was an Israeli physicist, an economist and several biologists who talked about an operating principle common to biology and futurism: that survival during periods of change depends not so much on being well-adapted as on being collectively adaptable.

Nevertheless, finding ways for the broadest and yet most effective participation in envisioning desirable futures became the emphasis of mostly smaller and newer futurist organizations.

On left, Barbara Marx Hubbard, with Hazel Henderson
(sitting on the floor), probably in the 1970s
For example, the Washington-based Committee for the Future, founded by Barbara Marx Hubbard, used video technology to conduct symposia on the future to broaden participation. It convened community “Meetings to Design a Desirable Future” in eight U.S. cities, including Boston, where the three day Town Meeting of the Future was broadcast on public television.

Grassroots groups like Earthrise, loosely affiliated with the Rhode Island School of Design, engaged in what it called “community future-consciousness raising” as part of a state planning effort. Mass planning meetings were conducted in Iowa, Oregon, Hawaii, and Georgia, often with the involvement of the governors of those states. In Georgia, that meant Governor Jimmy Carter, who would become President of the United States a year and a half after this WFS assembly. (The sitting President, Gerald Ford, had already spoken to a WFS symposium on energy.)

Others emphasized the need for more diverse voices within the futurist community. One potential effect was suggested at the assembly by the faces of technocrats listening to Third World speakers who said they weren’t interested in American advice anymore.

Eventually an alternative session led to a list of recommendations that were shouted from the podium in a free-form manifesto. The assembly ended in a frenzy, with several people—including Alvin Toffler—trying to set up some sort of communications network.

I checked back a year later. There had been a flurry of activity at first—Toffler’s Anticipatory Democracy Network sent out a follow-up letter noting “clearly the Washington meeting was an intense experience for a lot of people.” A gathering of Washington futurists resulted in a planning committee for a model futurist organization, which produced a simulated year-end report for five years in the future that anticipated a more participatory and rigorous futurism.

But beyond that report, not much happened in a formal sense as a result of the assembly’s fervor. Toffler’s network quietly died, and other groups “just melted away” in the words of a young Washington futurist. “Futurists don’t like to organize,” he explained.

 An administrative official of the World Future Society was philosophical. “You know how it is at conventions,” she said. “People have an interesting time, but there’s not a lot of follow-through.”

Five years later, with Ronald Reagan in the White House, federal interest in futurists began to wane. By the mid-1990s, Speaker of the House Newt Gingrich, who as a young congressman had identified himself as a futurist, had dismantled the Office of Technology Assessment.

Many futurist assessments in the 1970s focused on the year 2000. But by the time 2000 came around, there weren’t many futurists left.

Some of what they had advocated became institutionalized. Lots of corporations employed forecasters or consultants on the future. Planning on local, state and regional levels routinely called upon citizen participation and input. But what concern for the future remained became more compartmentalized according to specific areas and disciplines.

Some futurist organizations persisted, as did some futurists—notably Hazel Henderson and Barbara Marx Hubbard. But the thrill was gone.  Futures Studies courses evaporated, and professors went back to their departments. One or two books a decade got any attention. John Naisbitt’s Megatrends in 1982 was the last best-seller of the futurist era. While good work was being done behind the scenes, the goal of envisioning comprehensive futures pretty much disappeared from public view.

So apparently did the future as focus or motivation. "We dwellers in the empire do not seem to want art with visionary power now—art that looks to the future," wrote Mark Edmundson in the late 1990s.

“Today we live entirely in the present," wrote British professor of Political Economy Robert Skidelsky at the end of 2000. "For most people the past has little meaning; there is no future on the horizon except more of the present, apart from the ambiguous promises offered by science.”

After interviewing a number of professional futurists, writer Jack Hitt concluded in early 2001, "I believe we the people have actually changed; we have turned the corner…Our cultural gaze turned away from the future…[O]ur once-ranging future has shrunk into a snug and warm place, as cozy and shag carpeted as a suburban den."

Then came the shock of September 11, 2001, when New York’s Twin Towers were toppled by terrorists piloting passenger airliners, leaving at least 3,000 dead. The future turned dark.

"Just yesterday, it seems, we were making a speed-freak, high-tech, fast-money hurdle into the 21st century," wrote San Francisco Chronicle columnist Joan Ryan that December. “Now the future is a fog, vague and ominous...”

Even after Americans recovered, observers tended to agree that they preferred to forget the future. “We are no longer the nation that used to amaze the world with its visionary projects,” wrote economist and New York Times columnist Paul Krugman in 2010. “We have become, instead, a nation whose politicians seem to compete over who can show the least vision, the least concern about the future and the greatest willingness to pander to short-term, narrow-minded selfishness.”

There were of course exceptions. During his eight years in office (2009-2017) President Barack Obama frequently linked the policies he advocated to their effect on the future. "Our future shouldn't be shaped by what's best for our politics,” he said. “Our politics should be shaped by what's best for our future."

This was the lasting legacy of the 70s futurists: the sense of responsibility for the future.

Some concerns for the future transferred into specific categories, like climate change. When virtually all nations on Earth acted as one planet for the first time to address the climate crisis in the Paris Accords, they had the future in mind.

But perhaps the future receded as the present became overwhelming.  Future shock had become future numb.  The unrelenting speed of life was transforming America into the land of the frazzled and the home of the frayed.

Hopes and fears still are fixed on the future. But as a guiding idea, the future faded. It no longer seems to be in the active vocabulary of the current zeitgeist. The future comes and goes.

Part of the reason for the latest lull may be that the future became tainted with promises seemingly made but not kept, as well as alarming consequences of breathless promises that did come true. In both instances, futurists were not blameless.

Kurt Vonnegut from a posthumous
collection, "Armageddon in Retrospect,"
2008
Yet even in the mid 70s, futurists knew that they hadn’t yet fulfilled their own promise. They knew what they wanted to do, but mostly (as several admitted to me) they didn’t know how to do it. The technology didn’t work well enough (including communications technology). There turned out to be too many factors, and inadequate means to organize them or to forecast their relative contributions to an overall effect.

The problem Hazel Henderson identified was also a conundrum. There were values hidden in the structure of all those numbers and categories fed into computers. A diversity of views could simply create chaos, and yet they were necessary to expose those values, as well as to propose alternatives and creative solutions missed by experts.

 Important changes with vast implications (like the women's movement itself) just hadn't been factored in.  On the other hand, advocates could tend to exaggerate the change, or at least speak of it in too general a way.

 They needed time to figure it all out.

But there was failure, or at least incompleteness, deep in the structure of what many futurists did, and therefore in the futures they proclaimed. A key weakness was quickly discovered by their usually unacknowledged forefather, in many ways the first futurist. He identified the idea of alternative futures a half century before Herman Kahn. He saw both the possibilities and the fallacies of scientific futurism in the first few years of the twentieth century. He realized there was a better way.

to be continued...

Monday, February 12, 2018

Present: A Quotation.2

“...men are right when they affirm the value of private enterprises and when they affirm the necessity of public enterprises: where they go wrong is in denying that both are necessary.”

Walter Lippmann

Friday, February 09, 2018

Present Future: Nothing to See Here


Does it bear repeating?  We're in bad shape.

And I'm not even talking about today's news.

Let's do the inventory anyway.  Here's what we need to face contingencies, disasters, major challenges of the near and farther future: capacity, redundancy, resilience.

As I'm thinking again about World War II, it becomes even clearer that the side that was going to win was the side that had the greatest capacity: the largest industrial base, the easiest access to the greatest amount and quality of resources, and the skills and organizational ability to use those assets.

Both sides had organizational ability and skills, close enough to cancel each other out, with a few crucial exceptions (radar, code-breaking, and the fact that Germany never developed an atomic bomb).  But though both sides had formidable industries and resources, the United States just had more.

All that was used in the most insane way imaginable, which was to build thousands of complex devices and send them out to be blown up, sunk or shot into the air or through the water.  All created to destroy people and things, including themselves. The side that could keep on building them would win.  It's why we won.

The next crisis, or the next war, won't be precisely like that.  But it will take all those elements and we are short in key areas, and losing capacity by the day. Our manufacturing base is a fraction of what it was, and skills in all manner of making things (with the possible exceptions of smartphones and cardboard boxes) are being lost.  We don't make things anymore.  China makes things.  They have capacity, and ours is dwindling.

Another crucial area of capacity is food production, and judging from where the food in the grocery store comes from, and how far it travels, we're getting into trouble there as well.

Without capacity, we are vulnerable to any disruption in transportation, for any reason: technical, political, environmental.  It's one thing to get goods across a contiguous expanse of land.  It's quite another to get stuff halfway around the world across oceans, in the quantities and frequencies our population requires.

Capacity also means the resources, including human resources and skills in identifying problems and delivering solutions.  These are resources that often only government can muster and deploy on a large enough scale.  Once again, capacity is thin and getting thinner--as tragically proven by what should be a national scandal regarding the federal response to hurricane devastation in Puerto Rico.  Or (as now documented) in Texas.

Puerto Ricans responded the way people all over the world respond to situations that make living in their homes and communities impossible: the ones who can leave, have left.  Puerto Rico is emptying out.  But sooner or later, we run out of places to abandon, and to go.

Capacity includes leadership, and in every crucial area in which we depend on federal leadership, we're obviously in trouble.  Congress can allocate money, but how it is spent is crucial.

Which suggests another area of capacity: information.  Apart from government's information gathering and dissemination, the capacity of independent media reporting is crucial.  That capacity is also dwindling.  Apparently with the resources the news media can now deploy, Puerto Rico may as well be the moon.

The internal infrastructure of this country--needed to transport resources and products, for example--is falling apart, and all Washington is doing is trying to figure out who can make money from it.  That's the infrastructure we need to respond to disasters and emergencies, to keep the country from splintering and falling apart.  Especially since resources for life--food and clothing--aren't close by anymore, almost everywhere in America.  Once the remaining stores empty out, of course.  Or you run out of gas.

Redundancy means more than one system to perform functions necessary to survive.  So in your home you don't want to totally depend on electricity, or totally on gas, or totally on batteries you can't replace anyway.  We're losing redundancy in almost everything now: communications, transportation, food, and increasingly, information.  That's a biggie.  Right now, if the satellites go out and the Internet goes down, it's just about game over, because the alternatives have been starved to death.  Government used to think about these things.  Nobody seems to care anymore.

Think also about how we're becoming more and more dependent on access to stuff from long distance: clothing, food and other things it would be hard to live without.  It all seems so easy: click click, and the truck shows up.  What could go wrong?  It would take very little going wrong in this system to keep it from working.  And with physical stores driven out of business because of this dependence, we're very quickly out of stuff and out of luck.  The redundancy in the consumer supply system--several kinds and sizes of physical stores plus mail order from afar--is much weaker and is quickly disappearing.  No Amazon, no Fedex for a few weeks, and we're screwed.

Resilience is a buzzword in certain quarters these days, but it basically means the ability to take hits and keep going.  In the big picture, resilience (beyond individual character) comes from nature, where it's being destroyed.  It comes from community, which is destroying itself, although at least some spirit of "you'd do the same for me" persists.  It comes from families and individuals, but they can't do it alone.

It comes from institutions that are themselves resilient.  It used to come from businesses, but there are few local businesses anymore, with local ownership and resources.  It even came from corporations, when they identified themselves with the community, or even with America.  They mostly don't anymore, except for PR purposes. Do they even exist anywhere, when their faceless ownership is in another state and the only people who will talk to you are in another country? Their only allegiance is to their own greed.  I'm talking about you, Suddenlink and AT&T.

 And it comes from government, from the public sector, which is currently being destroyed from within in Washington, as well as looted on a massive scale.  So when it comes time for the federal government to save us all, as it did in 2009, the cupboard is bare, and the morons in charge won't know what to do anyway.

So, with suicidal policies in Washington, homicidal attitudes in the country, and a population with their heads buried in their screen toys, we seemingly haven't noticed that we don't know how to make anything or do anything, and don't have the means to do it anyway.

So no, we're not in great shape.  But let's move along.  Nothing to see here.