Novel Starter

As I hem and haw my way toward my next writing project, I have taken several swipes at an outline. I’m an outliner, yes. Without an outline, I’m writing word salad into a desert. I don’t always follow my outline, but I need a rough map at least.

Over the years I’ve developed a generic outline template designed mainly to keep the pace. Certain key scenes are necessary in any novel I write. In principle, I could write those ten or twelve key scenes first then connect the dots, but I’ve never done that. Doesn’t feel right to me.

I’m especially concerned to avoid the mid-belly sag that often occurs in Act 2, so this outline is specific about how major scenes are distributed.  It tends to be front-loaded, because even though I personally prefer a long, subtle lead-in, most readers don’t. The times we live in demand immediate developments.

Also, the wrap-up in Act 3 seems truncated by stuffing it into the last 25% of the story, but that’s what readers like: a ka-bam! ending.

I think the outline will work for genre fiction as well as literary (character-driven). The main difference is whether the story drivers are “in the world” or “in the gut.”

I tend to write short novels, 72K to 75K words, because that’s about all the attention span I have. I try to, as Elmore Leonard advised, “leave out the parts that readers skip.”  But you could apply the pacing percentages to 100,000 words just as well.

Is it formulaic? Of course. But it’s just a framework, not the finished house. I’m already thinking of how the story I have in mind will not follow this outline. Still, I will use it to get me started and lead me forward and alert me to dreaded pace sag.

The structure this template shows could be mapped into a visual, graphic format, as long as the proportions for each “bubble” were conserved.

Abbreviations used: PP = Plot Peak. ANT = Antagonist. AntP = Ant-Point. MC = Main Character. RI = Romantic Interest. RC = Reaction Character. SQ = Status Quo. Hamartia = hidden weakness or secret shame; the MC’s Achilles’ heel. WOM = a main advisor (“Wise Old Man” or other). NLT = no later than. PN = Narrator. % = proportion of total projected page or word count to maintain a good pace. Italic highlight = a Landmark Scene that cannot be skipped.

0% Act 1. Setup, Intro MC with opening image. Shows MC’s world and character. MC reacts to a small, foreshadowing conflict which is also a hook. Not the trigger, but something is not right. Patch it over while revealing the theme and the character of MC.  Show MC’s dominant trait and hidden need (hamartia); the inner demons. Show but don’t explain.

10% NLT. Trigger event disrupts (upends) the SQ; rocks the world. Trigger is exogenous to MC but specific to MC, not generic. MC’s reaction sets the plot in motion. Define the story goal: what does MC WANT, specifically?

15%  MC’s Initial Reaction to trigger is a failure, makes things worse. Confusing. Introduce or hint at ANT by way of explanation of the failure.

20% Rubicon. PP1. Basic conflict is clear. MC is in over his or her head, acts irreversibly though it doesn’t seem so at the time (e.g., cheats, breaks the law, crosses the river, etc.). A point of no return. Story path is set for MC: get the MacGuffin, escape the threat, save the farm, do your duty, begin the journey.

25% NLT Act 2. Response phase. MC is victim of slings and arrows (and own hamartia). MC acts repeatedly to fix the SQ, often overconfident, and fails. Obstacles and complications are self-generated. Romantic story begins, often badly. RI is diffident or offended.

30%  MC is bewildered. Each response sets up the next obstacle. MC is digging his or her own hole and walking into it, but doesn’t realize it. AntP1. ANT is clearly revealed and personified (not abstract). Romantic story shows signs of hope.

40%  Complications. Stakes go up. Potential for loss is greater than formerly realized. It’s serious. MC suffers failure, pain and loss. Is afraid. MC can’t go back, can’t see forward. Romantic relationship is on the rocks. Escalation continues.  MC tries harder; fails again.

50% Midpoint. Turning point. Some hope is seen. The romance recovers tentatively. New info or resource comes to light and a plan becomes possible. MC goes on the offensive for the first time. Hopes are high. Despite best efforts, big plan fails. Hope is dashed. Precious resources are lost. No Plan B. No other options apparent. Disaster.

55% The Pit. Romantic and other relationships break down in anger, disappointment, misunderstanding, etc. MC is discouraged, doubts self, questions goals. RC advises courage and perseverance, but MC is wracked with fear and doubt.

60%  AntP2. A “we need a bigger boat” moment. ANT appears in full strength and wins a major skirmish, ups the ante. A seemingly insurmountable obstacle arises. The problem is much larger than previously thought; overwhelming.

65%  Rock Bottom. ANT prevails. Dark night of the soul. MC is ruined, alone, and near death. MC despairs, gives up, and wanders off. Quits the field. Leaves the project, abandons the story goal. Complete failure is realized; only ashes are left.

70% The turn: PP2, Critical Choice.  MC hears from RC or RI, or gets advice from WOM; or MC takes unexpected inspiration from an insignificant and odd thing, something that might have been planted earlier, unnoticed: a special device, a memory, a person, a risky passage or technique. MC decides to go with that, despite the inner demons. MC plans a desperate chance: a seemingly mad, irrational decision that goes against type. A “what the hell, do the right thing” moment. Possibly against the advice of RC or RI.

75%  NLT Act 3.  Confrontation. MC arranges a situation to confront ANT. Sets it up. RI may be at risk. Can withhold information from reader, showing only the setup.

85%  Climax. Big showdown. White hats vs. black hats as MC confronts ANT.  MC prevails (which was not guaranteed) or dies.  MC triumphs (or dies a noble death). Either way, the story goal is achieved. RI is won.

95% Character Reversal. MC is a new person (or a martyr). MC has overcome ANT, achieved the story goal, and has a possible epiphany. Understands the hamartia (need not be spelled out). Solid relationship with RI.

<100% Resolution. Mirrors the opening image but in a new SQ. Nothing will ever be the same again.

100% ###END###
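
Since the template is nothing but proportions, converting it to word-count targets for any projected length is simple arithmetic. Below is a minimal sketch in Python (not part of the template itself; the scene labels are just shorthand for the landmarks above) that prints the word count by which each landmark scene should arrive, for a 72K short novel or a 100K book.

# A sketch only: the percentages from the outline above, applied to a total word count.
landmarks = [
    (0,   "Act 1 setup, opening image"),
    (10,  "Trigger event (NLT)"),
    (15,  "Initial reaction fails"),
    (20,  "Rubicon / PP1"),
    (25,  "Act 2 response phase (NLT)"),
    (30,  "AntP1, ANT personified"),
    (40,  "Complications, stakes rise"),
    (50,  "Midpoint plan and disaster"),
    (55,  "The Pit"),
    (60,  "AntP2, bigger-boat moment"),
    (65,  "Rock Bottom"),
    (70,  "The turn / PP2, critical choice"),
    (75,  "Act 3 confrontation (NLT)"),
    (85,  "Climax"),
    (95,  "Character reversal"),
    (100, "Resolution and end"),
]

def pacing_targets(total_words=72_000):
    """Print the word count by which each landmark scene should arrive."""
    for pct, scene in landmarks:
        target = round(total_words * pct / 100)
        print(f"{pct:>3}%  ~{target:>7,} words  {scene}")

pacing_targets()           # the 72K short novel
pacing_targets(100_000)    # the same proportions applied to a 100K book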

Developed by Bill Adams from various sources. http://billadamsphd.net.

 

Literary Fiction: Not for the Timid

Looking through my NBT (Next Big Thing) list, I find dozens of attractive ideas for a new novel. I notice many of them would fall into the category of “speculative fiction,” which I believe is mostly realism, but with some fantastic “what-if” element that drives the story.

For example, what if a huge planet were on a collision course with Earth (as in von Trier’s excellent movie, “Melancholia”)? What if a man and a boy trudged across a post-apocalyptic landscape (McCarthy’s “The Road”)?

I have loads of speculative story ideas, involving invisible men, telekinetic wizards and ordinary people with access to future events. All fun.

I also have a long list of sci-fi ideas, but I decided to give the genre a rest, having just come through a series of four novels. It’s still an attractive genre, the way I do it as “Boring Science Fiction” (www.boringsciencefiction.com). But I don’t want to get into a rut, which can be indistinguishable from a groove.

I would like to try literary fiction, if only I knew what that was. My idea of it is the novel treated as an art form, in which the character arc dominates the external story elements so as to illuminate something interesting about the human condition (Abe’s “Woman in the Dunes” is an example).

If I had no fantasy element, no distracting MacGuffin, I’d have to write a pure character-driven story. Okay, fine. I settled on a historical period and a cast of characters I had interest in. But as I began sketching, I became bored.

What was wrong? Am I so shallow that if a space ship doesn’t land by page ten I’m outta here? Do I need a murder on page one? My instinct is to go immediately to the speculative MacGuffin to drive the story. Am I bored because my character is boring?

I have to say that most of the literary fiction I’ve read is uninteresting. I usually read at least five boring-as-dirt novels before I find one that grips me, then haunts me for weeks or months. Most lit-fic doesn’t do it, and I’m afraid I would end up writing something that reads like a saltine cracker tastes.

So I went through a list of about a hundred literary novels I’ve read recently and separated the good ones from the bad and asked, What’s the difference?

The bad ones, even when they’re very well-written, tend to veer to one of two extremes. One batch tries to elevate ordinary everydayness to existential proportions and ends up wallowing in sentimentality. Elena Ferrante’s “Days of Abandonment” was like that. Allende’s “Eva Luna.” Larry Brown’s “Joe.” McCarthy’s “The Road” (I know – sacred cow).  I have a long list of these, including a whole, leafy branch of them that turn on utterly uninteresting (to me) kinship relations (Who was the real father? Answer: Who cares?).

The other category of “failed” lit-fic is wrapped in an eye-catching wrapper that disguises the quotidian machinations of the characters. A lot of “ethnic” and “immigrant” fiction is like that. (I know – politically incorrect. Kill me now).

Readers are supposed to be so caught up in the main character’s fish-out-of-water struggles that we don’t notice that there’s nothing going on that transcends ordinary human experience. We can be safely charmed by an “alien” culture without feeling condescending (I’ll name one of those, at risk of receiving death threats: Achebe’s “Things Fall Apart.”)

I am NOT saying that all writing by authors from nonwhite, non-mainstream cultures or at the margins of the dominant culture is uninteresting – far from it. I found Llosa’s “The Storyteller” gripping, for example, and Alexie’s “Reservation Blues” haunting. I’m making a case that fiction by or about minority culture is not, by itself, a sufficient basis for an interesting lit-fic novel. The good ones have genuinely interesting characters that show us a humanity we may not have seen before.

My thesis is that reader empathy in any literary fiction depends on condescending sentimentalism. That’s the game. That’s how they’re written and that’s why they sell. A character must be seriously deficient, flawed, or conflicted to make it in a literary novel. That’s required in order to execute the art form. Plot points confront the character, revealing the hidden flaw (Aristotle’s “hamartia”), and the drama arises from how the character deals with that.

The problem with “bad” literary fiction, I am proposing, is that the main character’s conflict is often mundane. Childhood trauma, overbearing parents, secret shame, shot a man in Reno, coulda been a contender.

Sometimes excellent writing elevates an ordinary conflict to the sublime, as Ishiguro did with Stevens the Butler in “Remains of the Day,” or Woolf did with Mrs. Dalloway. But too often, the conflict really amounts to the fact that the character is not very self-aware, which leads to disastrous behavior, but it’s still not interesting. This is a flaw I find in much of Faulkner’s writing, but also in characters like Rabbit Angstrom (Updike’s “Rabbit, Run”). Watching stupid people behave badly is just not that interesting to me.

Speculative fiction and sci-fi get around that problem by focusing on the exogenous story. End-of-the-world asteroid? Nobody saw that coming! And nobody really knows how any character would or should react to it, because nothing like that has ever happened. Finesse!

So, back to my trepidation about attempting to write literary fiction. I’m afraid I would write one of those milquetoasty, boring characters whose precious little conflict revolves around sibling rivalry, frustrated ambition, mixed feelings towards parents, and so on. I’m an ordinary person who has rarely faced the abyss, so I’d need to stretch my imagination and dig deep. Am I good enough to do that?

 

Twenty Thousand Snapshots

I have a collection of family snapshots scanned into the computer, going back to 1945. It’s mostly pictures of and by me. The traditional photographic record that a family might have, going back across the generations, I don’t have, and maybe it never existed. Photography is a leisure activity not high on the priority list for immigrants and working-class people. My family didn’t take many pictures, and most of the ones we did have were lost in a flood during the 1960s. I have the salvage from that, plus my own snapshots since then.

For young people: photos are called “snapshots” because cameras once had a spring-loaded shutter that really did “snap” when the shutter was released. The disposable film cameras still available in some drugstores work that way even now. The photograph resulting from the snap of the shutter, and the subsequent exposure and development of the negative film and printing of the positive picture, was metonymically called a snapshot, mainly to distinguish it from a professionally made studio photograph. Professional cameras didn’t snap. Many of them went “ka-chunk” as the reflex mirror flipped up during the moment the shutter was open, then flapped back down – one of the more implausible media technologies ever invented. Today’s digital cameras are virtually silent, though many produce a gratuitous “snap” from a sound generator for auditory feedback.

I took a quick tour of my snapshot collection, sampling here and there, to gain a sense of what I had, thinking that such a large number of pictures could constitute a rich resource for writing something, I knew not what.

What I discovered did not point to any clear writing project. Autobiography would be the obvious thing, but my life story is about as boring as you could get and besides, I have no motivation to recount it. I have to wonder about the motivation of anyone who publishes an autobiography.

Convention dictates that an autobiography is justified in only two situations. One is if you have kids and grandkids who might someday be interested in some details of your life. I have none of those. The second is if you had some interesting effect on society, such as starting a famous company, serving in the Senate, being a movie star, pulling off a great train robbery or flying around the world in a dirigible – something that lifts your story above the humdrum of ordinary everydayness. I have done nothing but the ordinary.  I am, and have been, a plain-vanilla white, middle-class, suburban guy groping through confusion, scratching out a living, trying to find meaning in the chaos of experience. I have nothing special to report, and my pictures show that.

My next idea was to use the pictures to tell a fictional story. The people in the pictures would be story characters and I would arrange sequences of pictures into a sort of graphic novel. I could use image-processing software to convert the pictures from fuzzy old photos to line drawings, colored or not.

But looking through the pictures, I realized that they don’t lend themselves to that use. Most of the portrait shots are highly conventional (“Say cheese!”), uninteresting and inert. The characters don’t look remotely like storytellers, let alone actors in a drama. Each picture has its own context, usually conventional (birthday, graduation, vacation, etc.), and doesn’t look like it could be anything other than what it is.

Many pictures, maybe half, don’t even have people in them (cars, boats, buildings, cityscapes, flowers, sunsets, kittens).  Those objects might actually be the better choices to equip with speech balloons to tell a story, but that idea made me realize two shortcomings of this approach. One, the story would have to be told, not shown, and as pure narrative exposition, that would probably be boring. And two, what contribution would the pictures make?  Virtually none. So ditch the pictures and just tell the story.

After five days of study, fifteen pages of notes, and much convoluted thinking, I came to some conclusions. Here are a few.

1.  Nobody cares about anybody else’s snapshots, life, or autobiography, so the picture collection is a non-starter for use as a historical record.

2.  Vernacular snapshots like mine, as opposed to artistic photos by Annie Leibovitz, don’t contain much information. An anonymous person (anonymous to anyone but me) grinning stupidly into the lens is not informative. The meaning in the photos is almost entirely in my head, my imagination, my memory.

3.  When people share family snapshots with others (which they never should do), the impulse is to “explain” the meaning that doesn’t show in the image. “This was my first car.” “That was the time we went to New York.” The context of the picture is the meaning. The image can stimulate mnemonic meaning for the owner of the snapshots, but is wasted on anyone else.

4.  The impulse for taking a picture is to address future history. There are no photographs of the future. All pictures are of the past. That’s true even in the moment right after you take them. Subconsciously however, we expect there to be a future, one in which we will look back on the present moment and seek “proof” of how we were, against the vagaries of standalone recall. We don’t think about it in those terms but taking a picture is about the future, not the present.

5.  Taking a picture is also about stopping time. We conceptualize a moment out of the stream of ordinary experience and recognize it as somehow special. “Take a picture!” someone cries. The urge is to stop the clock, preserve the moment, record the experience, none of which a picture really can do, but that’s what we ask of it. And to the extent that the picture is successful in “capturing” the moment, it is a lie, because life is a process. A static moment from the flow of experience is as representative of it as a dot is of a line.

6.  Viewing snapshots is an entirely different process than taking them. Pictures viewed are mnemonics that blend personal and social history. What counts as “an event” worthy of a photograph is defined by social convention, which is why snapshots are unrelentingly conventional. An individual participates in the larger social meaning by enacting the conventional event, even if only subconsciously. That gives the sense, in retrospect, of continuity and connectedness with the social order of history. That’s why we have so many wedding and graduation pictures. Most snapshots, therefore, even the personal ones, are not even personal.

7.  What makes my pictures mine? What if I went to a flea market and bought a few dozen snapshots of people I didn’t know? They’re very cheap, and for the most part, they look not dissimilar from my snapshots. What is the difference? Only that I recognize and can usually name the people and places in my snapshots. Other than that, there’s little objective difference. What makes my pictures mine is my experience, not anything that’s in the photos themselves.

8.  Despite the ubiquity and banality of vernacular snapshots, they still contain many layers of magic and deep mystery that can barely be fathomed. For example, a picture proves that time passes. Of course we knew that, but you don’t feel it passing. It slithers silently by. The picture documents that reality and despite its obviousness, it is still damned mysterious. How does that happen? When did it happen? Ageing is the same. And implicit in the concept of ageing is the long, bumpy ride of psychological development with death looming at the end. Nothing is more mysterious than that. Except maybe the strange sense of self-alienation when looking at a picture of yourself. When I do that, I might be able to recover, a little, the feeling of what it was like at that moment, but for most pictures, that is entirely gone. I was a different person then. It wasn’t me. Except that it was. How can it be me and yet not me?

These are just a few of the puzzles and conundrums I uncovered in examination of my photo collection. In the end, I’m not sure if the collection is a valuable resource for writing or not. Certainly there are plenty of questions that could stimulate introspective stories, but that doesn’t light my fire.

I reread my notes on Swann’s Way, Volume one of Proust’s novel, In Search of Lost Time. I read reviews of several scholarly books on the topic of “memory studies,” a field that likes to analyze the social meaning of photographs.  And I wrote five thousand words of notes. And I still have nothing. So, much to my surprise, this may not be the basis for a project after all.

I’ll let it marinate for a few months and see if anything else emerges.

Graphics: Top – How I started out.  Bottom – How I turned out (so far).

Just What I Always Wanted!

Interesting writing kept the pages turning for me. Nguyen has a knack for unexpected description and creative simile. A random example: two men are talking, but notice the chairs:

“As usual, he reclined in an overstuffed leather club chair that enfolded him like the generous lap of a black mammy. I was equally enveloped in the chair’s twin, sucked backward by the slope and softness of the leather, my arms on the rests like Lincoln on his memorial throne.” (p. 63).

The protagonist is the son of a French father and a Vietnamese mother. He works for North Vietnam but lives in Saigon as a mole in the South’s secret police. He barely escapes the fall of Saigon, then, after some tribulation, thrives in the Vietnamese expat community in California, still tightly connected to the pro-American, pro-democracy military crowd there. While ostensibly helping “the general” and other officers plan a return and counterinsurgency in Vietnam, he sends secretly encoded messages back to his controller in Ho Chi Minh City.

The hero, known as “The Captain,” is a divided self. He has mixed heritage and is sensitive about it, feeling he is never quite accepted either by the Americans or by the Vietnamese. That ambivalence is paralleled by his secret spy status, acting as both communist and democrat, and again by his sensibilities: he appreciates the American way of life, American music and scotch, yet despises the American military for having betrayed his country, abandoning it at the end. He is a two-sided man.

Much of the dark, sarcastic humor of the book arises from his dissociated observations.

“As the Congressman rose, I calmed the tremor in my gut. I was in close quarters with some representative specimens of the most dangerous creature in the history of the world, the white man in a suit.” (p. 250).

“The General furrowed his brow just a bit to show his concern and understanding. As a nonwhite person, the General, like myself, knew he must be patient with white people, who were easily scared by the nonwhite. Even with liberal white people, one could only go so far, and with average white people one could barely go anywhere.” (p. 258).

While the writing is consistently engaging, the story is close to nonexistent, the characters not cleanly drawn, and the pace a terrible sag. That makes the book very slow going, indeed.

The humor doesn’t sustain or redeem it. A lot of pulled punches pass as humor but merely ridicule clichés and stereotypes. Every conceivable American stereotype about the Vietnamese people is trotted out and lambasted, including a tedious, extended parody of the movie, Apocalypse Now.

Was that really necessary? We get it. Stereotypes: bad. Unlike the self-effacing and creative ethnic humor in a book like The Sellout by Paul Beatty, Nguyen doesn’t seem to have much distance on ethnic prejudice and expects the reader to be shocked when he calls it out.

As a commentary on politics and history, the book is sometimes interesting but lacks deep insight. For all his supposed commitment to communist ideology, for example, “The Captain” doesn’t have much to say about the relative virtues of communism and capitalism, other than to note the obvious differences and to convey an obvious conclusion. Exploitation: bad. The tragic political history of Vietnam is treated in a similarly superficial manner. Colonialism: bad.

The story line does have a few plot points, although anyone wanting to tighten it (e.g., a screenwriter) would have to cut at least 150 pages of plodding quotidian detail. Endless drinking, smoking, and sex do not a compelling story make.

Towards the end the story picks up a bit when The Captain and a band of pro-American soldiers return to the country to try to stir up revolution, but The Captain seems strangely unaffected by his dual allegiance. His thoughts and feelings don’t ring true.

The Captain is a divided character, but I didn’t feel that tension, which was only stated, rarely shown, even when he had to kill an innocent man to “prove” his bona fides. Yes, the ghost of the dead man haunts him (tediously) for the rest of the novel, but that fact still doesn’t reveal the Captain’s inner conflict. If he was so conflicted, why did he do it? If he did it against his better judgment, why? Does it change him as a person? I wasn’t “feeling” the Captain.

The guts, gore, and torture in the last hundred pages seemed gratuitous, considering how the first part of the novel was set up to be a Lolita-like, self-reflective and self-exculpatory confession.

The ending is superficially dramatic but in fact hinges on an old Buddhist joke. A monk opens a birthday present and finds the box empty. What does he say?  “Nothing! Just what I’ve always wanted!”

The book is well-enough written that it can live up to its Pulitzer Prize, but I don’t think it will stand as a landmark in literature.

Nguyen, Viet Thanh (2015). The Sympathizer. New York: Grove Press (385 pp).

Amerexit

Since the Brexit vote, in which Britain decided to leave the EU, many hands have been wrung.  Nevertheless, it doesn’t seem like much communication has occurred. The losing “Remainers” genuinely don’t seem to get the message. They view the Brexit vote as an error.

It’s not an error. It’s what most of  “the people” want. Last night on a major news show, Howard Dean, former governor of Vermont and former chair of the Democratic National Committee, referred to the Brexit vote as “most unfortunate.”  Why? Because he also doesn’t get it. The vote was not “unfortunate.” It was what the people chose. How can you be a political leader in a democratic country and not understand that?

A letter to the editor in a recent issue of The Economist made the point succinctly. Stephen Hand, of “Chipping Sodbury” (he could hardly have made up a more fitting name for his town), wrote:

“… Had Remain won, no one would now be discussing the need to heal a divided nation. Instead it would be ‘common sense triumphing over isolationism’, ‘tolerance overcoming hate’, and so on. I voted Leave on the basis of Tony Benn’s inarguable case regarding democratic accountability and I am delighted with the outcome.”

This is precisely the message that the ‘elite cognoscenti’ of the governing class did not hear, and are still not hearing.  People want to be heard.

In America, our leaders are deaf in the same way. America is not about to exit the EU, but there are signs that some people would like it to exit the world – trade partnerships, military alliances, treaties, and general engagement, including immigration.  In the 2016 primary elections, the “leavers” number at least 16 million so far.

Who is listening?  Nobody, not even Trump. He doesn’t listen, he only barks. Not Clinton. She genuinely doesn’t get it. She is a member of the entitled class and she’ll tell you how things should be.  Sanders was listening, but he’s out. Obama can listen, but he won’t. Not only is he a short-timer but he’s buried in everyday detail and complexity and is committed to the status quo. He says “It’s not so bad,” and “We need to work harder.”  He doesn’t see that the country could be on the verge of a radical break-up, hopefully not violent, but who can say? The signs are there. Civil wars have erupted from less division.

Not that any one person, Sanders, Clinton, or Obama, could change course single-handedly. The problem is that the governing class, on both sides of the aisle, cannot or will not acknowledge the hidden root of the divisiveness, the fact that moneyed interests in America now have more influence on governance than the voice of ordinary people. The ordinary people don’t like that. Is anybody listening?

Graphic credit: www.teepublic.com

Unit, This is Like, So Dumb!

This sci-fi novel is addressed to “young adults,” defined as aged 14 and up. It was recommended to me for reasons I can no longer remember. I don’t read or write YA fiction, so I have to make allowances for a domain I am not very familiar with, but I have to say, this book seemed heavy-handed, simplistic, pandering, obvious, and downright insulting to any 14-year-old with the intelligence and motivation to read a sci-fi novel.  It’s been a long time since I was fourteen, but I can’t imagine I would ever have found a book like this enjoyable. That said, the book is a National Book Award Finalist, so I’m the odd one out.

The story is set in a dystopian technological future Earth where all the children have an electronic brain implant at birth so they can receive “The Feed” directly into their thoughts for the rest of their lives. The Feed comprises advertising and infotainment, much like today’s internet and television. The Feed also allows the kids to communicate “telepathically,” which is to say in chat networks, again, much like today’s social media. Considering the state of the media world before 2002, when the book was published, Anderson was prescient in his vision. (The iPhone was not released until 2007; Facebook did not go public until 2012.)

Also a plus is Anderson’s creation of a slangy teen dialect that, while contrived, feels mostly believable and is a source of much humor in the book. The kids call each other “Unit!” instead of “Dude!” but the syntax is the same. It’s difficult to create a good artificial dialect. Anthony Burgess did it memorably well in “A Clockwork Orange.” Anderson has produced an admirable and enjoyable language context here.

On the down side, the author’s criticism of consumerist society is overdone and heavy-handed. The author seems appalled at ubiquitous advertising in society, but my feeling is that most people, especially teenagers, do not accept it at face value as his characters do. Kids are not “slaves” to advertising. It is more of an annoyance and an amusement than an oppression, unless you dig deeper into the pernicious subterranean roots of manipulation as Don DeLillo did, for example, in “White Noise.”

Anderson’s teen characters are utterly vapid, however, without original thoughts, opinions, or ambition. They go to School™ but don’t seem to study anything there, read any books, have any ideas, or learn anything. Maybe that’s how it is these days, but I think that’s just another example of the author’s heavy-handed attempt at satire, which falls flat, as far as I’m concerned.

A more serious literary criticism is that with uninteresting and uninterested characters who are directionless and think only of drinking (and puking), having sex, wearing the latest fashions, and driving a fancy car, the story very quickly becomes boring because nothing happens. There is no plot and no character development, no tragedy, no striving, no quest, no challenge, no hero’s journey. What does that leave? Not much of anything.

There is a hint of a story in there. Violet, the main character’s female friend, received her brain implant late in life and was home-schooled, so she is somewhat eccentric and dares to protest the empty-headed lifestyle of her peer group.  But aside from the occasional rant, she does nothing about it and falls in with their ways, and nothing interesting comes of her special situation.  That was a huge lost opportunity. Hundreds and hundreds of boring, repetitive pages of nothingness could have been filled with dramatic tension and intellectual-social conflict.

I believe any thinking YA reader would be offended by the author’s unrelenting ridicule of teen vapidity in this novel. Sure, there’s a lot of that on the ground, but those people aren’t reading novels. Maybe that’s how it’s supposed to work: Those other kids are stupid, but not me.

But Unit! Like, what do I know? This book’s successful and I’m not.

Anderson, M.T. (2002). Feed. Somerville, MA: Candlewick Press (298 pp).

Death at the Beach

On a recent business trip I stole a few days of vacation at La Jolla, in southern California, with my wife. We walked on the cliffs and the beaches and talked to the barking seals and we watched a man die right in front of us.

He had been snorkeling in a nearby cove when kayakers started yelling and waving their arms over their heads. A lifeguard jet-ski was launched, picked him up, and loaded him onto a short nylon stretcher behind it. That’s how lifeguards operate in La Jolla. With full police siren, the jet-ski zoomed into the small cove where we were standing, and other lifeguards, presumably, appeared and began vigorous CPR on the guy. He seemed slightly responsive at first, but that could have been just head movements caused by the chest-pounding he was getting.

He was in a wetsuit so he bore no social signifiers of occupation or status. Just a man.  He was in his fifties, face tanned and creased, and he had a gray, short-cropped beard. He looked a little like “The Most Interesting Man in the World” in those awful Dos Equis beer commercials.  But soon his face was as gray as his beard, and even while somebody struggled with an IV bag, it was obvious he had died. They lifted his limp form onto a full-sized stretcher and carried him up to the road and into a waiting ambulance. He was whisked off under another cloud of sirens.

On one level it was a mundane occurrence. People die every day, some of them at the beach. According to the CIA (and they would know), over 150,000 people die every day. So it’s not a rare event. And at least this guy went out doing what made him happy, not wilting away in some yellow hospital room. Still, we cover up the fact of our mortality so well in this society, it’s still shocking when you see someone actually go over the edge into that abyss. Combat soldiers might have a different take on that.

The guy could have had a heart attack in the water before anybody even noticed. How do you distinguish a dead snorkeler from a live one?  The drama of the rescue system kicking into gear was testament to the community’s concern for everyone’s well-being, and not just for public safety at the beach. It’s a big deal when anyone in the human community dies. At least it’s a sobering moment. Reminds us of you-know-what.

What struck me about the incident were two things. First, how everyone there at the beach, and out on the water, was quiet, somber, and still while they hauled him in and tried to revive him. Nobody in the crowd seemed to know who this guy was, but everyone knew instinctively he was one of us, in some universal, non-tribal compassionate response. That was good to see.

The other thing was that as soon as the ambulance was gone and the siren had faded, everything went back to normal. The kayaks returned to their clusters and people started going back into the water to play. Maybe the mood wasn’t quite as exuberant as it had been before, but the incident was over and the status quo was re-asserted. My wife and I continued our walk along the beach.

Each person evaporates from the planet without much trace. His family will remember him for a while. I’m still thinking about him a week later. But soon enough, his entire existence will be forgotten. It will be as if he had never lived. Each of us is The Most Interesting Person in the World in our own eyes, but not in anyone else’s. The only person who really knew what his life meant was him. For the rest of us, he was an interloper. The sun still shines, the seals still bark, and the tide goes out as always.

How to Do Yourself In

Everyone dies. A significant number of people will die in automobile crashes, in gang wars, from drugs, or in military wars, but because of modern medicine and public health, if you live through your twenties, you’ll probably die of “natural causes,” which is to say, of the diseases that come with old age. And how does that go?  Usually not well.

Hospitals are motivated and incentivized to keep you alive at almost any cost (your cost, not theirs). I’ve seen relatives and friends linger on, year after year, hanging by a thread it seems, but not dying, until the money runs out and the medical care stops.

Besides the financial crunch of getting old, the human cost is incalculable, both to the dying person and those who care for him or her.  Quality of life declines rapidly. Freedom is lost. Choices are constricted. The body becomes frail and unreliable, and so does the mind. There is nothing quite like seeing the terror in someone who realizes they are losing their mind and knowing it will get worse, never better.  Medicine and medical care become ever-more extreme and expensive.  Pain, discomfort, and anguish increase exponentially.  Every day is a crisis. There’s nothing good about modern dying.

As for family caregivers – it’s a life-changer. Untold hours and dollars are expended. Personal plans and goals are suspended for years. Family life is severely disrupted.  Intra-family battles erupt. Relationships are strained. Choices narrow. Guilt and depression are common. High-stakes, high-anxiety, battles with hospitals, doctors, insurance companies, banks, lawyers, assisted living centers, rehab centers, hospices, pharmacies, the IRS, DMV, and even the post office, are unrelenting. The collateral damage is severe.

America has no rational infrastructure for dying other than capitalism.  As long as someone is still breathing, they are “alive” and that’s all that matters. Quality of life (physical, mental, social, and spiritual) does not figure into any assessment of dying, and consequently, and unconscionably, all of those aspects of living are depleted, whatever the cost in human suffering.

If you want to die on your own terms, instead of slowly slipping into uncomprehending pain and frailty, sucking your whole family into the vortex with you, then you need to plan ahead. You will die. There should be no surprise there. The question is whether you will do it your own way.

“Final Exit” is for people diagnosed with terminal illness who would rather die in a manner of their own choosing instead of going through the traditional meat-grinder of the health system in America.  It describes how to establish a “living will,” which instructs health care professionals and family members what your wishes are, and other useful steps, such as providing a trusted family member or other person with power of attorney to manage your affairs. All this can be set up in advance, when you are happy and healthy.

The book also offers advice about whether or not to end your own life, either with the help of a physician (“assisted suicide,”) or on your own. Assisted suicide is available in three states now, WA, OR, and MT, for residents only, and the regulations are stringent. You really do have to be within your last six months of life due to untreatable, terminal illness. It is not for “mere” old age.

Author Humphry (a journalist) is keen to state that the book is not meant to assist people to commit suicide, especially people who suffer from depression, other mental illness, or a severe reversal of fortune. It is advice for, as he calls it, “self-deliverance,” the act of an irreversibly ill person making a rational, voluntary decision to end life.

The book is somewhat controversial because some people believe there is no such thing as a rational, voluntary decision to end life. If you make such a decision, you are, by definition, suffering from mental illness.  If that’s what you think, then don’t read this book. It assumes that when and how to die can be a rational and moral choice.

Religious people may object that only god can determine when and how each person dies. Of course if that were true, they would eschew end-of-life care, but they don’t. If you believe the god-knows-best argument about death, this book is not for you.

Two philosophical problems remain, however, even for those who believe that when and how to die can be a personal choice.

One is that, when you’re depressed, you often don’t know you’re depressed. You might be better served by counseling than by suicide.  The only way to know is to talk it out with family members, counselors, and people you trust.  You cannot make a decision about suicide entirely on your own. If you really are completely on your own, you’re in a conundrum and could make a terrible mistake. The book does not address that possibility adequately.

The second problem is that the book insists you have a “right” to die the way you choose to, but that is incorrect. There is no such right. Rights are granted to individuals by the social matrix of a community, sometimes personified by an acknowledged authority, often encoded into law. Since society generally disapproves of suicide and murder, it is wrong to conclude that you have any right to take your own life or have anyone else help you do it. The book is not clear about the role of the value systems in the society that an individual lives in. It assumes you have some vague magical “right” of self-deliverance, but you do not, so it will be an anti-social act if you go through with it and you need to be aware of that from the start.

Putting that prologue aside, the book is valuable in describing how to “deliver” yourself and how not to. It advises, for example, to rule out any kind of plant or chemical poisons. You probably won’t die from them and probably will end up with brain damage. Same with sucking exhaust from your car. Not effective. Same with most drugs you can get, either illegally on the street or legally from your doctor. They’re not pure or they’re not strong enough, or they have anti-suicide technologies built into them. You can’t know what you’ve got or if it will be effective (lethal).  It’s not that easy. There is no suicide potion you can count on.

The most direct method of delivering yourself, and by far the most common, is gunshot. That’s extremely violent and messy however, not available to everyone, especially not for the frail and/or faint-hearted. It’s also not guaranteed to be done right. It’s just the most obvious. The book does not discuss self-inflicted gunshot as a method, an odd omission, considering its frequency. Is there a better way?

The book lists dozens of pharmaceuticals that would be effective, along with the dosages needed and the probability of lethality. However, as a practical matter, you probably can’t get those drugs in the purity and quantity needed.

The recommended choice is the old bag-over-the-head method. A plastic bag, taped at the neck, brings death by asphyxiation in thirty minutes to a few hours, provided you don’t tear it off in a panic when you realize you are actually dying. The recommended method is to take an overdose of sleeping tablets (which are not lethal in themselves  anymore), perhaps along with alcohol and an anti-emetic, so that you are fast asleep before you suffocate and therefore won’t panic (hopefully) before the process is complete.  No data are presented on success or failure rates of the plastic bag method, an odd omission, considering its recommendation.

A much simpler and more obvious method is to ask your doctor to prescribe something lethal for you.  Given your circumstances, you might be surprised at how accommodating the physician might be. Of course you have to use indirect language. No doctor is going to agree to murder or accessory to murder. But it is possible to make your meaning clear without being direct. I have seen that approach work with a relative who was in hospice care.

You will need more and better information than this book provides in order to make your own plan for “self-deliverance.” Judging from the outdated pharmacological information in the book, I’d say it is a good introduction to the topic, but “Final Exit” is not the final word.

Humphry, Derek (1991/2005). Final Exit: The Practicalities of Self-Deliverance and Assisted Suicide for the Dying. New York: Delta/Random House, 224 pp.

The Father of Cognitive Science

Jerome Bruner died recently at age 100. He was one of the first cognitive psychologists. His 1956 book, “A Study of Thinking” (co-authored with two other ground-breaking psychologists, Jacqueline Goodnow and George Austin), was the first shot fired in the cognitive revolution that finally overturned behaviorism. It demonstrated that minds could be studied scientifically. Others had done that before, people such as Fechner and Ebbinghaus in the 1800s, but for some reason it was the Bruner, et al. book that made the splash. Right place at the right time, no doubt. You can’t fight history.

The Bruner book was on “concept formation,” something that a mind does, not the behaviorists’ muscle-twitches. Volunteers viewed complex geometric displays on cards, for example a yellow circle containing a single digit, enclosed in a green triangle. They were told whether that item was a member of the target class or not.  After viewing a series of such samples, the volunteer had to describe the target class, what it included and excluded. It could be, for example, even numbers inside squares. In order to form the correct concept, the volunteer would have to infer what all the positive examples had in common and what the “wrong” samples lacked.
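
The logic of the task is easy to state in code. Here is a minimal sketch in Python of the inference the volunteer has to make: find the simplest conjunction of attribute values that fits every positive card and no negative one. The attributes and example cards below are invented for illustration; they are not Bruner’s actual stimulus materials.

from itertools import combinations

# Each invented "card" is a set of attribute-value pairs plus a label saying
# whether the experimenter called it a member of the target class.
cards = [
    ({"shape": "circle", "color": "yellow", "number": "even", "border": "green"}, True),
    ({"shape": "square", "color": "yellow", "number": "even", "border": "blue"},  True),
    ({"shape": "circle", "color": "red",    "number": "odd",  "border": "green"}, False),
    ({"shape": "square", "color": "yellow", "number": "odd",  "border": "blue"},  False),
]

def classifies_correctly(rule, examples):
    # A conjunctive rule is correct if every positive card satisfies it
    # and no negative card does.
    return all((set(rule.items()) <= set(card.items())) == label
               for card, label in examples)

# Enumerate conjunctions of attribute-value pairs seen in the data, smallest
# first, and report the simplest rule(s) consistent with all the labels.
pairs = sorted({(attr, val) for card, _ in cards for attr, val in card.items()})
for size in range(1, len(pairs) + 1):
    winners = [dict(combo) for combo in combinations(pairs, size)
               if len(dict(combo)) == size and classifies_correctly(dict(combo), cards)]
    if winners:
        print(winners)   # for this toy data: [{'number': 'even'}]
        break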

The brilliance of the demonstration was not to show that college students are good guessers when it comes to abstract materials. The innovation was that the entire demonstration was empirical. Concept formation, a purely mental task, was demonstrated by strictly controlled scientific methods. Ergo, the mind could be studied scientifically.

It seems silly now, but at the time, this was a revelation to me. As a young undergraduate in psychology I had been inculcated with behaviorism. Even in clinical observations, to report that a patient had been crying was cause for scorn. Instead, you had to say that the patient demonstrated “crying behavior” such as tears and sobs. What was behind those behavioral manifestations, if anything, could not be known or discussed scientifically, so don’t even mention them. Just report what you see. That was the rule. Talk about the tyranny of concepts!

Then one quarter I took an upper-level class at the University of Washington, where one of the assigned books was “A Study of Thinking,” and windows opened. Here was proof that you could study the mind scientifically. I was enthusiastic and read everything I could find by Bruner and his colleagues.

Just a few years later, I designed my Master’s thesis around Bruner’s methodology. I set up a concept formation task with similar materials and had volunteers learn to find the unifying rule from examples. I also gave volunteers a battery of personality tests, because my hypothesis was that there was some personality factor that made some people much better  problem-solvers when it came to inferential tasks like that.

In the end, I could not identify such a personality trait but my methodology and reasoning were flawless and I earned the Master of Science degree. That’s what that degree means: you have demonstrated mastery of the concepts and techniques of your chosen field.

My doctoral work was also in cognitive psychology, where I studied human memory and perception, but not using Bruner’s concept formation paradigm.  I had moved on and so had the field. (For the doctorate, you actually have to discover something — a much higher bar!)

In my post-doctoral research year, studying with psychologists James and Eleanor Gibson at Cornell, Bruner’s name came up often. They disliked him and everything he stood for. Professional rivalry is the norm, but both Gibsons often disparaged Bruner in front of students. Why would you do that? That’s beyond friendly competition.

J.J. Gibson (his middle name was Jerome, same as Bruner’s first name), my mentor,  claimed to be a behaviorist, by which he meant a functionalist, since a purely behaviorist psychology was a self-contradiction and always had been untenable, though most behaviorists would never agree. (Orthodoxy always trumps reason, even in science, which is founded on reason). Gibson often criticized Bruner and his work for being “mentalistic.” For Gibson, whatever the mind was, it was unknowable and unspeakable and Bruner was full of horsefeathers for talking about it.

How did Gibson accommodate the fact that Bruner’s studies of concept formation were strictly scientific, completely observable and verifiable, not mentalistic at all? He would never deign to discuss methodological details of Bruner’s work. Rather, he proposed his own approach: that animals were behaviorally attuned to their environments, intimately so. In fact, animals “resonated” with their environments, which allowed them to directly “pick up” facts about the world they could use for adaptive behavior. There was no need for mentalistic concepts like concepts.

What did “pick up” mean, exactly? He could never say, exactly. It was supposed to be some form of behavioral learning, pure muscle adaptation.  What’s the “concept” for catching a fly ball? There isn’t one. Your body knows how to do it. It has nothing to do with concepts.

What did Gibson mean that the animal “resonated” with its environment? Again, he never gave an exact definition but argued that the animal merely appreciates, in a behavioral way, the opportunities for performance that its environment affords. A banana simply “says” “eat me” and water “says” “drink me.” No cognitive mediation is necessary. A horse will not jump off a cliff, because it just “knows,” in a non-mental way, that thin air does not afford it locomotion. A pigeon sees the situation very differently.

The subjects of Gibson’s explanations were always “animals” never volunteers or people, as if to emphasize that while people (such as he!) might have great ideas, that was irrelevant to the behavioral explanations he put forth. When it came to perception, we were all merely animals.

J.J. Gibson affected my thinking for the next two decades, until I finally understood what he was saying and what he was not saying. He was right and he was wrong. He was wrong and mule-headed in his commitment to behavioral psychology, but he was a genius in showing that at least half of what we assumed required mental “cognitive processing” could be accounted for by pre-cognitive, sub-personal, non-linguistic, tacit accommodation to simple geometry and routine transactions with one’s environment.

I am who I am today because of J.J. Gibson. His insights (and sometimes my disagreement with them) shaped my thinking about creativity, the self, phenomenology, socialization, linguistics, modern art, and much else that he never even discussed. But Jerry Bruner and his study of the mind, that mysterious black box that Gibson and others struggled so hard to deny, was my first real teacher, and you never forget your first teacher.

Card – How to Write Science Fiction & Fantasy

The first two chapters of this slim volume were most helpful to me. “What is science fiction?” is not an easy question, especially in distinguishing it from fantasy and the broader category of speculative fiction. Basically, Card says sci-fi concerns experience that is not yet possible but is technologically or theoretically on the horizon, whereas fantasy is never going to happen (dragons, wizards, etc.). In speculative fiction, the author proposes an idea or MacGuffin, and the story is the unfolding of all its ramifications.

For example, in Card’s most famous novel, “Ender’s Game,” a military sci-fi story, Earth is about to be invaded by some aliens too tricky to beat so the brilliant strategy is to use a group of children who think they are playing a video game but are actually defending the planet.  That’s the “conceit” or key idea of the speculative fiction. The rest of the story is about how that idea plays out.

Unfortunately, that description of speculative fiction is more a generalization than a definition of a genre. It excludes too little. Any good story should have a main “concept” such as the one described above, the basic “what-if” proposition of the novel. That’s true of fantasy, mystery, even romance novels. The category is only useful in allowing a space between hard sci-fi and traditional fantasy. Still, even this unresolved discussion was useful to me in clarifying marketing categories. I asked the owner of my local bookstore if she had a shelf for “speculative fiction,” and she said no. It’s either sci-fi or fantasy, and it’s often difficult to tell the difference.

The second chapter on “world creation” was fun because of a list of nearly all possible ways to achieve faster-than-light travel, from “Warp speed, Mr. Sulu,” to wormholes, and the list of possible ways to accomplish time travel.  I was gratified to learn that my invention for FTL travel seems to be unique (as of 1990). I’ve never tried time travel.

The rest of the book uses sci-fi and fantasy examples but contains generic advice that any writer can benefit from, such as how to write  character arcs, how to handle backstory, narrative exposition, and levels of diction. His useful MICE mnemonic stands for milieu, idea, character, and event – important structural elements of any story.

Unlike Stephen King, who is a seat-of-the-pants writer, Card insists that a good SF story requires a preliminary outline or story sketch. That’s the best way to achieve coherence, which is paramount in this story-driven genre, even if you end up deviating from the plan. I have to agree.

The last chapter is on how to market and sell your manuscript and how to improve your writing and your confidence with conferences, classes, and workshops. Much of that advice is dated. Hard to believe, but 1990 was pre-internet, so some of his recommendations are no longer germane. The book is indexed.

For ten bucks or less (used), it’s a worthwhile read just for the first two chapters. The middle two chapters, with general advice on good writing, won’t hurt you. You can skip the last chapter.

Card, Orson Scott (1990). How to Write Science Fiction & Fantasy. Cincinnati, OH: Writer’s Digest Books (140 pp).