Words and Spear Points

The destiny of reading and writing is extinction, like the skills of operating a spinning wheel, cutting a quill pen, and knapping flint tools. Reading and writing will fall into increasing disuse until they are known and practiced only by specialists and hobbyists, the way some people still use Morse code or know how to read Old English.

Replacing reading and writing will be talking and listening mediated by video and music. Photographs too, but mostly video, movies, and then holograms or VR presentations, whatever the next thing is; it will be visual or musical or both. We’ll still talk to each other in face-to-face conversations, although those will become increasingly rare even inside the home.

It’s already happening, as anyone can see by examining the state of communications technology. I had thought until recently that the trends to music and video were merely preferences enabled by cheap, available technology, not that these trends would utterly displace reading and writing.

Geological Time

I was doing research for a book I’m writing, research that immersed me in evolutionary and geologic scales of time. I became aware again of how recent modern human culture is. Imagine a timeline of history on earth, stretching from the ground to the top of the Empire State Building, with homo sapiens represented by the thickness of a postage stamp at the top of the building, and modern culture since the last ice age as the thickness of the adhesive on that stamp. On that scale, the duration of literacy doesn’t even show up. It’s a blip.

I am an avid reader and writer, always have been, but it struck me that the practice of visually scanning and interpreting tens of thousands of words in one long serial order, and then making sense of the whole, is absurd on the face of it. That doesn’t even sound like a viable methodology. If somebody proposed it, you’d laugh.

Clovis point

Reading and writing have been highly valued skills in society during my whole life. They certainly have paid my rent. They are skills like being able to make good stone spear points, difficult to learn but very useful and highly valued by society – in their time. Yet I always assumed reading and writing were forever. They are the very definition of civilization. Fifty thousand printed books are published every year, with ten or twenty times as many new ebooks. The very foundation of society seems set in the written word – history, literature, law, science.

The culture of making stone spear points must have seemed just as important, vibrant, all-encompassing and never-ending in its time. What could ever replace a good spear point? People will always have to eat.

The records of civilization are increasingly kept in video format. Movies are the new literature. History is conveyed in documentaries and popular songs. Science reporting now emphasizes ‘data visualization.’ Students don’t read textbooks, they watch YouTube instructions. And fewer people are reading all those printed books that come out every year. Bookstores are disappearing, as are newspapers and professional journals. I doubt if anybody is reading a million new ebooks each year. Reading and writing are technologies that have peaked.

Why am I still committed to reading and writing? Looking with a cold eye, it’s only what I’m used to, what I’ve always done. Reading and writing have been successful for me, but they’re obsolete. I need to get with the future.

I estimate reading and writing will last longer than I will. They probably have another hundred years to run but extinction is their destiny. Hard-won skills though they are, they just won’t be needed in the future except in niche applications.

So I’ve decided. When I finish writing my current novel, it will be my last. Henceforth I will turn my attention to video communication. I don’t know how, but I’ll figure it out. Maybe there’s an instructional book I can read.

Salt and Fire

I don’t normally do movie reviews on my blog because there are so many movies and so many reviews already. But Salt and Fire, the latest Werner Herzog movie, is deserving of my attention because it is almost universally despised. Only twenty-five percent at Rotten Tomatoes, and only four stars at IMDB, and reviews like “Easily the worst movie Herzog has ever made.”

I disagree. It’s a terrific movie and I’ll try to explain why. I went to it because I love Herzog and I’m a fan of Michael Shannon (see him in “My Son, My Son…,” a Herzog movie, and in “Ice Man” and “99 Homes”). I didn’t know Veronica Ferres, a mega-star in Germany.

The main complaint against Salt and Fire is that it has essentially “no” story, poor script, stiff acting, and its images are unimpressive. But that’s all misleading, perhaps deliberately so. I think it’s a philosophical essay on the theme of “perceptual distortion.”

I can’t quite put it all together, but if I were going to, here are the elements I would focus on.

  1. The story line was so weak as to be almost non-existent. The dialog was stilted, the script chaotic and the acting alternately wooden and overdone. Action was ludicrous and plot development implausible. The slight ecology theme was perfunctory. Even the locations and sets were implausible (e.g., hasty new paint on a run-down hotel). Sounds great?  Well, none of that was accidental.

The message is: Don’t look here for traditional storytelling. Your idea of a movie as a drama played out on a stage does not apply. The whole methodology of storytelling is actually a cultural syntax and not a natural mode of communication. It is a distortion, and this movie will distort that traditional mode to illustrate the distortion.

  2. Visual distortion is the main theme. Movies especially are a learned syntax for perceiving. You normally accept a movie as a window on a reality rather than the artificial construction that it is. Twenty-four pictures per second become a moving image only because of a perceptual illusion, but you don’t even think about that. You accept the movie in the natural attitude.

Anamorphic compression

Movies as managed distortion can be seen in the anamorphic lens, an oval or cylindrical vertical element in the camera that compresses the image along the horizontal axis.


Uncompressed Pic

The projector then has a “reverse” lens that re-stretches the image back to normal size for viewing.

The process is widely used and was used in Salt and Fire.

Anamorphic photography is used to enhance image density and saturation, but especially to capture wide-screen shots with good detail. I believe this is why the movie was shot on the salt flats in the first place. It is about as wide-open a location as you could imagine, with hardly any detail. It is pure wide-openness. I’ll bet Herzog’s thinking went from the anamorphic lens to that particular location, not the reverse.
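For the curious, the squeeze-and-unsqueeze cycle can be sketched numerically. This is a toy illustration, not real optics: a grayscale frame is a list of pixel rows, the camera-side lens is modeled by averaging each pair of horizontally adjacent pixels (a 2x squeeze), and the projector-side lens is modeled by duplicating each column to re-stretch the image.

```python
# Toy model of 2x anamorphic squeeze/unsqueeze on a grayscale "frame"
# (rows of brightness values). Purely illustrative numbers.

def squeeze_2x(image):
    """Compress horizontally by 2: average adjacent pixel pairs."""
    return [[(row[i] + row[i + 1]) / 2 for i in range(0, len(row), 2)]
            for row in image]

def unsqueeze_2x(image):
    """Stretch horizontally by 2: duplicate each column."""
    return [[pixel for pixel in row for _ in (0, 1)] for row in image]

frame = [
    [10, 10, 200, 200, 10, 10],
    [10, 200, 200, 200, 200, 10],
]
squeezed = squeeze_2x(frame)       # each row is now 3 pixels wide
restored = unsqueeze_2x(squeezed)  # back to 6 pixels wide
```

In this toy version the averaging step is lossy; a real anamorphic lens squeezes optically before capture, which is how it packs a wide field of view onto an ordinary film frame.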

Other visual, photographic tricks were shown throughout the movie to clue the viewer into this theme – hey, it’s anamorphosis over here! Herzog practically hit us over the head with it.

Mirror anamorphosis

We also saw a big demo of mirror anamorphosis, described by Shannon’s character. The theme was repeated in Shannon’s description and flashback of the anamorphic painting in Rome, a tradition in art since the Renaissance.

So we get the message: pictures, especially in movies, are not “real.” We tend to view them as mere windows on reality, but what they show depends entirely on the point of view.

  3. The photography very much emphasized close-ups of the actors’ faces – extreme close-ups, often from forehead to chin – and those shots were held for up to fifteen seconds. That’s unusual, and it was very frequent in the film. Why did Herzog do it?

Traditionally, that’s how you show that a character is thinking, and often Herzog had a voiceover to give the thoughts during the closeup. But that doesn’t fully explain what was going on.

Anamorphic mumps

One additional explanation is that the closeups illustrate “anamorphic mumps,” a distortion inherent to the anamorphic lens which, in over-correction, broadens the face unnaturally and makes the actor look like they have mumps. This distortion can be mostly corrected in post-production, but it is especially prominent and hard to correct when the face is off-center on the screen. I’d have to see the movie again to decide whether most of the facial closeups were off-center; I can’t remember. In any case, even though anamorphic mumps were not obvious in the film, the facial closeups may have been another reminder about anamorphosis in visual perception.

Another aspect of the extreme closeups is the idea that faces themselves are distortions. You can’t tell by looking at somebody what they’re thinking. And in the movie, the voice-over thoughts delivered during the closeups were often extremely banal.  I don’t know exactly why. Something to do with the person within versus the public persona – especially emphasized in the highly distinctive faces of these particular actors, the leads, but also in the boys.

  4. There was a strong memento mori theme, the idea that people should be reminded that life is short and we’re all mortal. Several of the banal pseudo-philosophical quotations given by Shannon’s character were of this nature, and carefully repeated just to be sure we got them.

What was that about? It may be related to the supposed ecological disaster and the volcano threat – “Don’t forget, people, we’re all gonna die!”

Herzog may have been trying to tie the theme of distortion to our everyday perception of life. You think your everyday life is important, but don’t forget: memento mori! You have the rumbling eruption “heard” just underfoot as a powerful reminder.

The superficial story line also reinforces this view. You think you’re going on a routine scientific expedition and you’re suddenly in jail and then in a life-threatening situation. Memento mori!  Life itself is an illusory distortion.

I cannot right now add up all these elements into a coherent whole, but I do believe the film was a didactic presentation (as Herzog films tend to be), concerning the topic of perception as distortion.  It was not supposed to be, and it wasn’t, traditional storytelling.

The interesting thing is that he didn’t do any overt, obvious distortion in the photography, which would have been very easy.  Instead, his message was subtle but clear, as if to show both the distortion and how readily we overlook it.

I need to think further about the idea that storytelling, as we know it, is a fraud. But it was a great, thought-provoking, and satisfying movie.

And that’s my story.


Emma Bovary: Airhead?

Mauldon Translation

I read Madame Bovary in high school, in French, which is to say, I didn’t read it. What I did was spend many hours with a French-to-English dictionary. I was eager to read it as an adult, this time in English.

The classic novel is a slow story by modern standards. Country girl Emma, daughter of a pig farmer, is married off to a country doctor, Charles, with whom she achieves a comfortable life. But she dreams of luxury and fine clothing, food and furniture, dancing all night, and above all, she dreams of romance. Charles is boring as dirt.

Bored to death, she embarks upon a series of affairs, during which she spends her husband’s money extravagantly, leading inevitably to disaster for them both. The novel was scandalous in its time (1856),  of course.

It is difficult to understand the psychology of country folks in Normandy a hundred and fifty years ago.  One of the enjoyments of the novel is the insight Flaubert provides into that, although it is impossible for me to know, without considerable research that I am not willing to do, what parts are real and what parts are fantasy.

For example, Charles has to be the most dimwitted cuckold in all of history. Even a dog could see that Emma was fooling around. Could the taboos against adultery in country life have been so rigid as to make it unthinkable and invisible? I doubt it. Adultery has been going on for a very long time in all civilizations, in all classes, in all species. So Charles’s ignorance seems improbable.  Toward the end, when the evidence is overwhelming, he sort of chooses to not understand, which is slightly more believable.

And what of Emma? Could there really be such an empty-headed, unsocialized, untutored woman so possessed of childhood fantasies?  Maybe. Women were, as a cultural practice, untutored and badly socialized, and the only child of a pig farmer would have had limited opportunity for psychological development. Still, it is hard to believe that as an adult Emma could be so utterly bored and self-centered, especially since she is presented as articulate, literate, witty, and talented in music, sewing, and finance.  So she is not believable either.

Add those two ciphers together and you get… nothing believable.  The novel mechanically plods to its inexorable conclusion but I never was engaged with the characters.

The story is presented with mud-on-the-boots realism, so there is plenty of insight into everyday life, which I appreciated in the well-annotated Oxford edition that explained almost every reference to obscure practices, foods, religious ceremonies, medical procedures and news articles.  All that helped in getting a clear glimpse into lives in another century.

Mauldon’s translation, while perhaps not as lyrical as Lydia Davis’s more celebrated one, is good at rendering the sweat and roughness of everyday life in simple and coarse terms, also contributing to a compelling sense of seeing life as lived on the ground in that time and place. I enjoyed that quite a bit.

Finally, I admired Flaubert’s craftsmanship. It is often said that he invented the modern novel, and that could be right. He supposedly did invent the narrative technique of free indirect discourse (FID), wherein a third-person-close narrator temporarily dips into first-person voice to express the mind of a character.

I think the first official use of FID in the modern novel, in 1856, occurs on page 11, describing Charles’s boyhood. He’s walking along the banks of the river, experiencing the great outdoors:

“Opposite, above the rooftops, he could see the vast, pure sky, and the red sun setting. How wonderful to be in the country!”

The first sentence is from the third-person narrator, but who gives voice to the second sentence?  It’s a blend of narrator and Charles himself, and there it is, the historical moment of the invention of FID.

I also appreciated Flaubert’s detailed descriptions of the environment.

“The ballroom was stifling; the lamps were growing dim. People were moving out into the billiard room. A servant climbed onto a chair and broke a couple of panes; at the sound of the shattering glass, Madame Bovary looked round and saw, in the garden, pressed against the window panes, the faces of peasants, staring in.”  (p. 47)

I can experience that. It’s detailed, sensory writing, enjoyable and admirable.

Overall then, the book has many fine qualities that make it a deserved classic, even if strong character and plot are not among them.

Announcing Launch of Psi-fi.net


I have launched my new web site and blog www.psi-fi.net.  That’s where I promote my psi-fi books (should I ever have any), and meanwhile comment on their development.

Awkwardly, at this time, I have zero commercially-published books of psi-fi. For now the site is a platform for the idea of psi-fi.

Psi-fi is an offshoot of sci-fi (and pronounced the same), but the “psi” (Ψ) stands for psychology and the “fi” (Φ) stands for fiction.  Unlike sci-fi, where the emphasis is on pushing the boundaries of science and technology, psi-fi pushes the boundaries of human psychology.

The tradition of psi-fi goes back centuries, though obviously it wasn’t called that, since I just made up the term. One could argue that Homer’s Odyssey is an example of psi-fi. It used fantastical elements (the Cyclops, Circe’s island, Athena’s magic, and so on), not to speak of the whole Olympian pantheon, to highlight the human condition as experienced by Odysseus.

Psi-fi differs from traditional literary psychological fiction, such as “Crime and Punishment,” or “Madame Bovary” in its use of imagined technological or other counter-factual elements in the telling of the story.  Use of those “magical” elements allows a writer to throw light into some of the more inaccessible corners of psychology.

A more recognizably modern psi-fi tale is Cosmographia, published in 1544 by Sebastian Munster. It described imaginary travels to far-away lands where the inhabitants (“aliens”) were monsters, with the heads of dogs or eyeballs on their bellies. Clearly “they” are “other” and “we” are the good ones.

Jonathan Swift’s later Gulliver’s Travels could also be counted as psi-fi in a more subtle way.

A lot of modern sci-fi still follows that pattern and that message, but modern psi-fi should be explicit and realist about the psychological themes. It’s not enough to say merely, “we are the good guys.”

A more familiar early modern example of psi-fi is Herland, published in 1915 by Charlotte Perkins Gilman. In a remote part of the world, a society is populated and controlled entirely by women and birth is by parthenogenesis. Three male explorers land their balloon there. Clearly they will need to be re-educated. Drama ensues.

I have at least five psi-fi manuscripts drafted, two of them in “ready-to-go” condition, three in various stages of readiness. If, after trying, I decide none can be sold commercially, I’ll publish them myself, so eventually, they will all be available to my hungry, clamoring public.

Deep Structure of Capitalism

When I learned that world-famous linguist and political commentator Noam Chomsky would teach a class at the University of Arizona, I signed up.  A lot of people from the northeast come to Tucson for the winter, but most of them don’t come directly from MIT to teach a university course.  Whatever his reasons, I was glad to take advantage.

The course was a series of sixteen hour-long lectures making up a critique of capitalism. Chomsky, who is 88, read prepared lectures from a script. Though he was well-amplified and easy to hear in the large auditorium, his voice was frail and monotone. He never looked up from his notes. He used no visual aids. He just stood at the podium and recited.

The lectures were the epitome of the dry-as-dust stereotype of a droning professor. Older students like me hung on every word, knowing that this is The Man and he knows whereof he speaks. It’s Noam Frigging Chomsky!  But I imagine the 18 to 21-year-old matriculated crowd were thinking, “Oh, God. How many minutes left to go?”

The 150-level course was entitled “What is Politics?” though that question was hardly addressed. According to the syllabus, “… politics is about who gets what, when and how, [and] where.” That defines politics exclusively as economics. I think most political scientists would prefer a more comprehensive definition, one, for example, that also encompasses issues of group identity and values, pursuit of common goals, the structure of government, a forum for conversations, the exercise of power, and many other aspects. Okay, I’d go along with a narrow definition of politics just for the sake of the course. But then it turned out that the course wasn’t even about that.

Buried further in the text, the syllabus also said that “…the course will examine how industrial state capitalism has come to dominate our thinking as the only way to organize the political economy to satisfy human needs and wants.”  This was what the course was mainly about. It was a critique of late-stage capitalism and how we are “brainwashed” into accepting it without question.

On Tuesdays, the lecture was delivered by Marvin Waterstone, a U of A professor of Geography, and on Thursdays, lectures by Chomsky were followed by a question-and-answer session in which a selected U of A faculty member would toss fluffball questions to Chomsky so he could expound off script for a few minutes. Those were actually the most interesting part of the course.

Waterstone, whose qualifications for co-leading the course were never revealed, filled his hour by summarizing the assigned readings for the week, often by reciting long passages from them verbatim and reducing others to PowerPoint bullet lists. Apparently, students these days don’t or can’t read the assignments.  I cannot imagine what benefit accrues from having someone read to you papers that you already read for yourself. It wasn’t as if he added context, historical perspective, contrasting ideas, examples, or linkages. None of that. He simply synopsized the readings.

How such a travesty passes for higher education was a mystery to me and my heart went out to the young students. It was a perfect example of what I have long suspected, that the purpose of education is to pound the creativity out of you so you will never again have an original thought. (As an ex-college professor, I flatter myself in believing that I worked against type).

At least Chomsky had things to say. I would rather he’d talked about linguistics, the field in which he made his name in the 1950s by discovering (or inventing, depending on whom you talk to) generative grammar, the deep structure of language, the language acquisition device, and many other innovations. But he left all that behind long ago, and since the 1960s has been a tireless critic of government, politics, and capitalism in the U.S. He was a prominent voice in the anti-Vietnam-war movement and a scathing critic of the Bush wars on Iraq and Afghanistan.

So what is his grievance?  He believes, with good reason, that capitalism inevitably leads to exploitation of workers and ultimately to government plutocracy, rule by the rich, a situation we have arrived at in America. He has deep roots in Marxism, but he’s not “a Marxist,” if there are even any of those left. That set of ideas has its own internal contradictions, such as the labor theory of value, a foundational idea based entirely on a semantic ambiguity.  But we did read excerpts from Marx, Gramsci, and others. About half the assigned readings were quite valuable.

Chomsky’s preferred political alternative is “anarcho-syndicalism,” a mouthful, to be sure, which I had to look up. He mentioned the idea but did not press it in class. The goal of the course was to critique capitalist-based government in the U.S. without really articulating an alternative. He suggested that students should “take to the streets” and “resist the lies” and “reject the common-sense assumptions,” and in general return to the activist years of the 1960s and 1970s. Maybe this time wearing pussy hats? I don’t know; he wasn’t clear on what we should be protesting and didn’t provide any clear agenda.

In the 1970s we had the draft, and that was personal, and that was the basis for the street protests. We said, “You politicians can lie, cheat, and steal as long as you do it quietly, but when you require me to stand up and take a bullet for you, I draw the line.” We have not yet come to that breaking point again in today’s politics.

So despite Chomsky’s longstanding participation in government criticism, he did seem stuck in time.  He could, and did, talk in detail about a CIA-sponsored overthrow of a government in Guatemala a half-century ago, with names, dates, and incidents. But he said not one peep about Donald Trump and his administration, nor did he have anything to say about any American president or administration since Eisenhower.  My impression was that while Chomsky is extremely sincere, expert, and articulate about his displeasure with government power, he’s already a historical figure himself.

Another serious problem with the course was its tone, which was toxically cynical. For example, the so-called “War on Drugs” has been, contrary to popular opinion, completely successful. Did you know that?  Why? Because its purpose always was to sweep into prison non-economically productive members of society, get them off the streets and reduce the numbers of the poor that we have to care for with social programs. Under that goal, the “War” has succeeded.

Not only is that view unsupported by historical facts or population statistics, it is deeply cynical in attributing the darkest, vilest motives to the government. Other people’s motives cannot be ascertained, only inferred, so it is gratuitous at least, and mean-spirited at worst, to attribute such motives, especially in the absence of evidence. Not to mention that the cost of incarceration is far higher than any cost of social programs for the poor.  I agree that the “War on Drugs” was, and still is, a bad idea, but I don’t jump to the conclusion that it arose as a malevolent conspiracy.

This was a problem throughout the course. The darkest, most evil motives were asserted for anyone who disagreed with Chomsky’s agenda. In his lectures and throughout most of the readings, arguments were consistently one-sided, evidential quotations selective and secondarily sourced, propositions laden with innuendo and presumptive values, assertions with willfully conflated correlation and causation, and so on. All these transparently propagandistic rhetorical techniques were an affront to critical thinking and undermined the credibility of the course’s goals. After the first few lectures, I stopped taking any of Chomsky’s arguments seriously. At first I did some online research and discovered how extremely biased and one-sided his lectures were, then I lost interest. What a wasted opportunity.

But it was worse than merely a wasted opportunity. I was angry and disappointed that this sort of crass propaganda passes for higher education. I felt very sorry indeed for the youngsters in the class who, presumably, were not as able as I to see through the rhetorical fog the course was blowing.

My only consolation was that basically, I agree with the course’s premises: capitalism does lead to a deceptive and pernicious plutocracy. So maybe it isn’t so terrible if students come to believe that. But that point of view should come after information-gathering and critical thinking, not from having it forced down your throat by a couple of arrogant and disrespectful propagandists who hold academic power over you. These professors acted out the same abuses of power and truth that they accused the government of, but they were immune to the irony.

I’m glad I took the course. It sharpened my awareness of how greed trumps all, working against even the very survival of the planet. I guess it also made me feel more helpless than I did before. If I were much younger, maybe I’d do something about it. Take a law degree and go into politics, perhaps; I don’t know. In the present reality, I can only wallow in despair. I don’t have much hope for the youth, if this is the quality of education they’re getting.

Psi-fi: New Literary Genre

Psi-fi Oakland library

I have been wriggling against the sci-fi label since I accidentally wrote my first sci-fi novel a decade ago. I didn’t mean to write sci-fi, but the story had an AI android in it. I don’t even like sci-fi.

I’m a cognitive psychologist who left the academic life for the computer industry to find out if the mind is like a computer. I started writing fiction to dramatize what I discovered about human and AI mental capacities. To do that, in my stories I often use a robot or an alien as a contrast character, because “It takes an alien to understand humans.”

It’s extremely difficult to explain to people, and agents too, that my novels are not really sci-fi. They involve no alien invasions, space battles, plasma guns, warp drives, or rampaging robots. Instead, they are stories about consciousness and its vicissitudes. I stick pretty close to actual technology and AI concepts, with just a little exaggeration.

Where I stretch is in psychological descriptions and explanations, of perception, dreaming, memory, motivation, imagination, creativity, agency, socialization, empathy, and above all, the mind-body problem (how does the immaterial mind connect to the physical body?).

I use robots and aliens the way genetic scientists use “knockout mice,” with a few specific genes disabled or “knocked out” in order to see what those genes do. An android, for example, might be just like a human except lacking in intuition. How would that show up? It’s not about the robot. It’s about human psychology. Try explaining that to your dog.

I’ve strained to find comparable work in the literature. There are some classics, going back to Clarke, Heinlein, and Asimov, that put human psychology at the forefront, but not much in current work. Judging from what’s getting published (by surveying Publishers Marketplace), I would say that 95% of today’s sci-fi is actually in the category of fantasy, and indeed most bookstores shelve sci-fi and fantasy together.

The average reader doesn’t know much about science and cares less. Fantasy is what they want. I described one of my novels to an agent at a conference and his first question about my main character, a physician, was, “What’s his special power?”

There might be, should be, more interest in the mysteries of the mind than there is in science and engineering, because I know for sure that every reader has a mind and a body. All I have to do is make them realize that’s a highly problematic way to exist.

So I’m hereby declaring a new genre: “Psi-fi,” where “psi” stands for psychology (which is not a real science, regardless of what they tell you). The term is already in light use, but not in a literary sense, as far as I can tell.

A scientific interest group in Lahore is called psi-fi (www.facebook.com/LUMSPsiFi/) but they’re not involved in reading or writing fiction. There’s a “psychedelic music” group apparently obsessed with the I Ching (http://www.psy-fi.nl/) but as far as I can make out, they neither read nor write literature. A professor of philosophy in Texas apparently wrote a study of “the intersections of science fiction, superhero comics, and the paranormal” that incidentally uses the term, “psi-fi.” (http://boingboing.net/2011/01/26/psifi.html). But I don’t find anyone using the term in the way I intend, as a specific genre of psychological fiction against a technological background. So I’m taking it.

Psi-fi is hereby deemed a genre of contemporary literature. Now if only Barnes & Noble agreed with me.

Addendum: Psi2

I really should put forth a psi-fi “manifesto” at this point. Everybody has a manifesto. Alas, I don’t have one. I can however offer a list of just a few topics I consider appropriate to feature in a psi-fi novel:

Perception and reality: what’s the difference?

Chaos and pattern – in the eye of the beholder?

Intersubjectivity and its variants, its absence, its origins

Mortality – what is it?

Memory as fabrication

History as collective memory


Time and change (as experienced)

Self-awareness, metacognition and higher-order thought

Consciousness – kinds of, states of, absence of, conditions of…

Consciousness – natural vs artificial

Madness and the social construction of reality

Personality – what is it?

Individuality – myth or reality?

Emotions and feelings. What good are they?

Intuition and conceptualization

Creativity (and counterfactual imagination)

Free will vs randomness vs determinism vs self-delusion

Knowledge, certainty and doubt

Knowledge vs belief

Language: Social construction, language games, Deep structure

Community (family, tribalism, Gemeinschaftsgefühl)

Music (all the arts) – the proprius, Necker cube, Gestalt formation

The mind-body problem, intercorporeality, Merleau-Ponty

Spatiality and movement: alternatives to Kant? Einstein?

Entropy vs life, vs knowledge vs information

Rationality vs the Dionysian

Logic and reason – was Hume correct? Critical thinking.

Philosophy of science and constructed reality

The Dream that grips us all

The black hole and the folds of experience

Self-relating subjectivity per Hegel

Intentional inexistence per Brentano

Accommodation of the self to reality and vice-versa

Intrinsic motivation

The telos

Egocentricity vs self-transcendence

Radical subjectivity (Ramana Maharshi)

Gibson’s affordances

The ox-herding pictures

The delusion of self-efficacy

Greed – can it be stopped? Mitigated? Excised?

Love vs reason. Why don’t they mix?

Magic – how should it be defined?

The Turing test (and its successors)

Convergence of biology and technology vs theory of evolution

The construction and practice of gender

Lies vs truth. Why do we care? – in practice, per Wittgenstein

Science as a special kind of conversation

The homunculus

Could this be the genre slogan? “Use no oil!”  Has a nice ring to it, I think – almost as good as a manifesto.


Natural vs Artificial Intelligence

Nick Bostrom’s briefing on the dangers of artificial intelligence, Superintelligence, takes up a serious and legitimate question: Should we be more cautious as we go about trying to improve artificial intelligence? What if an AI became so smart it decided to take over the world? Silly? I think so. But it’s a question worth exploring, if only to dispel the long-standing fear of the mythical “Frankenstein Syndrome.”

Unfortunately, the book is written by a philosopher with an engineering bent, without, apparently, much understanding of human psychology (real intelligence).  Consequently, the book is mostly a sterile exercise and often unintentionally humorous.

Perhaps the most fundamental problem is the failure to define intelligence of the natural kind. Bostrom unthinkingly uses IQ as a measure of it, but anyone who has studied the matter will agree that IQ equals intelligence only as a matter of convenient social discourse. It is a scientific fact that IQ scores fall in a normal distribution, but there is no theory or explanation of why, or of whether answers to the questions on an IQ test have anything to do with intellectual competence or “smarts,” whatever those might be. Here’s an example of the kind of question you might find on an adult IQ test:

Rearrange the following letters to make a word and choose the category in which it fits.

RAPETEK

A. city
B. fruit
C. bird
D. vegetable

Correct answer: C. bird (parakeet)

If you can answer such a question, what does it mean? “Rapetek” is not even a word, so you can’t be expected to know it. Perhaps the correct answer shows you have experience with words, letters, and conventional hierarchical categories of common objects. Does that make you “smart”? Maybe. Another good answer is “rape,” and none of the categories presented is appropriate to it. Is that a less smart answer? (The question did not say I had to use all the letters presented.)

The bottom line is that there is no generally accepted explanation for what natural intelligence is. An IQ score is merely a convention for use by educational and legal systems but it explains nothing. If you’re going to write a book about “superintelligence,” I would say you have to do better.

In a related vein, Bostrom seems never to have given a second thought to the nature of intuition, creativity, agency, subjectivity, empathy, emotion, intrinsic motivation, or aesthetics, just to name a few faculties of the intelligent mind that seem important. You would think a philosopher would be at least minimally familiar with current concepts in consciousness studies, such as the debate over qualia. He assumes memory is about storage and retrieval of data, as many people believe, though that view is not supported by the scientific research (on humans).

The author proceeds blithely as if there were no question about what intelligence is, so what does he think “super” intelligence is?  It seems to mean symbolic problem solving at a rate much greater than humans can accomplish. Problem-solving slips in there as a new, undocumented re-definition of intelligence. Even if it were, why would “faster” = “smarter?”  What’s the hurry?

Such shortcomings, and many others, including rampant anthropomorphism, leave the discussion ungrounded, a mere exercise for its own sake, leading to nothing. As if that were not bad enough, the writing is execrable. Consider this description of how a “super” AI might solve a problem:

“…the programmer would simply specify a formal criterion of what counts as a success and leave it to the AI to find a solution. To guide its search, the AI would use a set of powerful heuristics and other methods to discover structure in the space of possible solutions. It would keep searching until it found a solution that satisfied the success criterion.” (p. 186)

In other words, the AI would use “methods” to search for a solution. I would do the same myself! Nothing is revealed by the author’s obfuscatory verbiage.

The book is “highly recommended” by Bill Gates, on the front cover. Maybe that should tell you something. Better choices might be “Artificial Intelligence: A Modern Approach” by Russell and Norvig, or “The Cambridge Handbook of Artificial Intelligence,” edited by Frankish and Ramsey.

Bostrom, Nick (2014). Superintelligence: Paths, Dangers, Strategies. New York: Oxford University Press. 415 pp.

Gratuitous Poetry

Margaret Atwood cut her writing teeth on poetry and it shows in her novel, The Blind Assassin, perhaps too much. Her phrases are carefully constructed, a virtue in any writer, but Atwood’s choices often stand out as slightly too clever while not particularly insightful. As the book opens, the narrator Iris’s sister, Laura, has died, and Iris writes about Laura’s novel:

“Hard to fathom, in my opinion: as carnality goes it’s old hat, the foul language nothing you can’t hear any day on the street corners, the sex as decorous as fan dancers – whimsical almost, like garter belts.” (p 39)

I enjoyed “whimsical like garter belts” as a phrase, but what does it mean? Are garter belts whimsical? And even if they are, Laura’s sex scenes were “almost” that whimsical, meaning what? This is one example of hundreds throughout this long novel of phrases that catch your eye but on closer inspection are close to nonsense.

“Such a thin book, so helpless. The uninvited guest at this odd feast, it fluttered at the edges of the stage like an ineffectual moth.” (p. 40)

Arresting image, until you ask yourself what an “ineffectual moth” is. What would an effectual one be? And if you’re wondering why the narrator is still waxing on about her sister’s novel, the answer is that Atwood waxes. That’s why the book is too long.

In long, excruciating backstory, we follow the lives of the two sisters from when they were wealthy teenagers in a small town in Eastern Canada to the death of Iris seven decades later. What happens in between are the two world wars and the Depression, with the appearance of soldiers, businessmen, love affairs, marriages, babies, households, and the stuff of life. As in much “literary” fiction, nothing really happens. It’s just ordinary everydayness piled high and deep. Getting through the novel is, as a colleague commented, an Iditarod of reading.

More than half of the eighty or ninety short chapters open with a weather report, followed by detailed description of the scenery, and continue into long, lush descriptions of walks in the town or country, food and drink, shopping, clothing, babies and children.  Such material may hold particular fascination for some readers, but it was suffocating for me.

Are there redeeming virtues? Yes. The novel did not win the Booker Prize for nothing. One interesting aspect is the narrative structure. The whole novel is presented as a diary, or letter, addressed to someone; we are not told to whom until the very end. The ending is contrived and clichéd, swooping into the final few pages like a deus ex machina.

Within this long diary, Iris, the ostensible writer and first-person narrator, tells the story of her difficult lifetime relationship with Laura, her wild sister who died in the first chapter in a car accident that always smelled of suicide. Interspersed throughout the diary is a novel, called The Blind Assassin, supposedly the novel that Laura wrote. It is a cheesy sci-fi adventure, written in third-person narration, and involves swordplay, monsters, space travel, and foreign worlds. It is stereotypical nonsense, badly written, clichéd, and pointless. However, the astute reader notes it would take considerable skill for someone like Atwood to write that badly on purpose, so it is interesting in that regard only.

Moreover, the embedded novel, The Blind Assassin, is not simply inserted into The Blind Assassin, but rather told as a story, or a series of stories, by an unnamed man who claims to be a writer, to an unnamed woman, who, we guess, is Laura. The point-of-view character for that part seems to be Laura, who asks the man repeatedly to continue with the story. But what is Atwood’s point of view on that point of view? That’s an interesting and tricky question that is not adequately dealt with. The structure is an interesting piece of experimental writing, but in the end it breaks the implicit contract with the reader that says third-person narrators are always reliable. Still, I give points for the effort.

Another virtue of the novel is Atwood’s skill at finely detailed description of fixed scenes and especially of photographs. Atwood seems drawn to ekphrasis, a poetic term for written description of a picture or a work of art. The book opens with a vivid description of a photograph and that is a recurring theme. Ekphrastic writing tries to “tell a story” about a photo, film, or painting, a type of writing that is well-suited to Atwood’s narrative voice.

Finally, as mentioned, much of Atwood’s descriptive writing involves highly poetic language. Take an arbitrarily selected 154-word paragraph:

“Today I had something different for breakfast. Some new kind of cereal flake, brought over by Myra to pep me up: she’s a sucker for the writing on the backs of packages. These flakes, it says in candid lettering the color of lollipops, of fleecy cotton jogging suits, are not made from corrupt, overly commercial corn and wheat, but from little-known grains with hard-to-pronounce names — archaic, mystical. The seeds of them have been rediscovered in pre-Columbian tombs and in Egyptian pyramids; an authenticating detail, though not, when you come to think of it, all that reassuring. Not only will these flakes whisk you out like a pot scrubber, they murmur of renewed vitality, of endless youth, of immortality. The back of the box is festooned with a limber pink intestine; on the front is an eyeless jade mosaic face, which those in charge of publicity have surely not realized is an Aztec burial mask.”

We notice it is pure description, apropos of nothing, as so much of the book is, and one is tempted to skim right past with annoyance. But if a reader were to take the time to notice, some lovely constructions are buried in that pile of verbiage:

“candid lettering the color of lollipops” is a visual, vivid, creative, and original phrase well-worth savoring.

“Candid lettering,” “color of lollipops”: 2 syllables followed by 3, twice over (trochee + dactyl), with alliteration! Not bad at all. The rhythmic nature of the phrasing is no accident, and such constructions do not grow on trees. The astute reader must whisper, “Bravo!”

Is such a construction necessary, or even desirable, in a book that’s supposed to be a novel, a “dramatic” tale (though it contains no drama) where story is supposed to be king? That is a separate question.

“fleecy cotton jogging suits” is another fine phrase — I can almost hear the band playing that tune. Say it out loud and you’ll enjoy it. And it invokes both tactile and kinesthetic senses to boot.

“corrupt, overly commercial corn and wheat” — less good, but still nice.

“little-known grains with hard-to-pronounce names”

On this one, she should have said “difficult” instead of “hard-to-pronounce.” Try it out loud both ways. Is my version too obvious?

Some of Atwood’s phrases are visually arresting, even when not especially rhythmic:

“festooned with a limber pink intestine” — “festooned” is a lovely word, but followed by that particular noun phrase, well, it grabs you in the eyeballs (if not the guts).

So overall, I say that this 154-word paragraph does pay its rent, but that does not mean it should have been included in this novel. Rather, it could be construed as self-indulgent wordplay that shows contempt for a reader vainly searching for a story.

Atwood, Margaret (2000). The Blind Assassin. New York: Doubleday/Anchor. 518 pp.

What is Consciousness?

If I had unlimited time and money, I would waste it on the University of Arizona’s annual conference on consciousness, called, optimistically, “The Science of Consciousness.” Of course there is no such science. One can (I can) argue that consciousness, being immaterial, is not even susceptible to scientific methods of inquiry.

Undaunted, the conference is held in Tucson in even-numbered years and abroad in odd-numbered years. I was an active participant from 1994 through the early oughts: reading, attending, showing posters, presenting at paper sessions, and contributing articles to the Journal of Consciousness Studies (www.imprint.co.uk/product/jcs/). I presented papers in Tucson, Sweden, and Scotland. This year (2017) the conference is in Shanghai.

I have a fondness in my heart for this enterprise.

Several factors nudged me out of the fold. One was the price. Back in the day I could attend a conference for $200. Now it’s $500 to get in the door, too rich for me.

Another factor was that I started to feel like it was Groundhog Day. The same old ideas and arguments were trotted out year after year. Nothing was ever resolved and nothing that looked like progress was ever evident and I lost confidence that it ever would be.

And then I took up writing fiction, which is perhaps just another way to study consciousness, and it is all-absorbing and very, very time-consuming.

Having visited Shanghai some zodiac Rooster cycles ago, I was curious about the latest conference, in June 2017. Here are some of the promised highlights and my random associations to them.

  • June 5-10, 2017
  • Shanghai New International Expo Centre
  • Shanghai CHINA


General Conference Registration: $500 (travel, food, and lodging not included).

Conference Blurb:
Consciousness defines our existence, but its scientific nature remains unknown. How does the brain produce consciousness, and how does consciousness causally affect brain processes? Is consciousness equivalent to computation? What are the best empirical theories, and do we have free will? How and when did consciousness evolve, or has it been present in the universe all along? What are the origins of moral and aesthetic values, and how can mental and cognitive function be optimized? Can consciousness persist after bodily death, e.g. through ‘uploading’ to machines, or via mental processes tied to the structure of reality? These and other relevant questions are approached through many disciplines including brain science, philosophy, physics, cosmology, the arts and contemplative practices.

[The bias of this conference and of most people working in the field, is that the brain does somehow “produce” consciousness. The only remaining question is how?  For a material brain to produce immaterial consciousness  would violate several laws of physics and is scientifically implausible. Never mind.  Refer to the title of the conference.

Notice that in the interests of “fair and balanced” propositions, the introduction also asks how consciousness affects brain processes, allowing only marginal “effects.”  It does not ask how consciousness might “produce” the brain, for that is an inconceivable idea. More inconceivable than the converse?

The prevailing conceptualization seems to be: 1. There are two conceptual entities: (a) the brain and (b) consciousness. 2. Those two entities are clearly correlated in observation. 3. We have no causal story to explain that correlation.

Despite the impasse, there is a strong and palpable bias at these conferences, without reason, evidence, or plausible theory, that the arrow of causality runs from brain to consciousness.

In my (not so) humble opinion, after 35 years of study, the consensus view is a dead end, even a non-starter. Once I saw that clearly, I tried for a while to turn the ship around, soon realized that one cannot swim against a zeitgeist, and dropped out.]

[John Searle spoke at the first conference in 1994. I don’t think he’s said anything new since then. His answer: “No.” ]

[I was surprised to see that Thomas Bever is now at U of A. It was his book, The Psychology of Language: An Introduction to Psycholinguistics and Generative Grammar, by J. A. Fodor, T. G. Bever, and M. F. Garrett (1974, New York: McGraw-Hill), that got me started in psycholinguistics and philosophy of mind. Too bad there’s no practical way for me to meet him and say “Thanks.” Oddly, he’s probably sitting not far from me twice a week in the class Chomsky is currently teaching at U of A.]

[Dave Chalmers has been a producer of the TSC from the beginning. I’ve had many interesting conversations with him, online and in person, though he would not know me, since he is many-to-one, while I am among the many.  His 1996 book, The Conscious Mind, was a very welcome antidote to rampant and unexamined physical reductionism in the study of  consciousness. However, I don’t think he has ever settled on a definition of his own. It used to be, I thought, that he believed consciousness was information, in some way that I could not understand – “information” being a classic weasel word with multiple definitions. I don’t know if he’s gone over to the side of the panpsychists.  In any case, I am grateful to him for having invented, or at least promulgated 1. The philosophical zombie and 2. The Zombie Blues.

Galen Strawson, with whom I have also conversed and emailed, is the archetypal panpsychist, although in my (not so) humble opinion he is a closet materialist.]

[It would be interesting to attend this session and hear what the “latest” is on the correlation between consciousness and neurology.  At best, it would be, “still don’t know.” At worst, it will be the same old “just around the corner” misplaced optimism.]

[Likewise, it would be interesting to hear what the “latest” theories of consciousness are, but I would expect same-old, same-old. I’ve talked with Stuart Hameroff several times at conventions and I once took an online class from him. He’s an anesthesiologist and, to his credit, admits that “nobody knows” how anesthesia works (although I am sure there are strongly held hypotheses).]

[The multiple layers of questions and assumptions embedded in this problematic are indeed plenary.  “Evolution” is a biological term, so posing the question about the “evolution of consciousness” already presupposes that consciousness is a biological phenomenon, something that has not been scientifically established. It’s one of those infuriating topics that skims over so many definitions and assumptions that I am usually left speechless.]

[Still, if I had the thousands of dollars to attend and the time to do it, I would.]

Jack Reacher Outsmarted

Andy Martin’s deconstruction of Lee Child’s twentieth, and hopefully last, Jack Reacher novel, Make Me, is at first glance an exercise in flamboyant grandstanding masquerading as hagiography. At least 80% of the book is filled with tangents not even remotely germane, peppered with mystifyingly irrelevant anecdotes. It is extremely annoying for that. But on closer reading, I think that’s all obfuscation, part of a near-fantastical feat of mental conjuring worthy of Jack Reacher calculating the trajectory of an incoming bullet.

My hypothesis is that the structure and content of Martin’s book are a response to a heavy hand of censorship from Child’s publisher (also Martin’s own, Random/Vintage). That pressure is alluded to in the text. The publisher apparently had complete editorial control over Martin as he wrote, and essentially enforced a content-free policy so that he would not say anything even slightly critical, protecting the Reacher brand from any exposé, or even the slightest shade. Elephant bucks are involved. Child’s Reacher series is one of the most successful in modern publishing history. Forbes called it “the strongest brand in publishing.” Reacher books have sold more than 70 million copies, making it a billion-dollar brand.

Despite that, Martin manages to present serious literary criticism of the Child novel, and to present meaningful biographical information about Child himself, all without invoking the Damoclean sword. Reading between the lines, here are some issues and questions about Child and Reacher that Martin sneakily brought forth right under the noses of his wary censors:

  1. Martin attempts to write in Child’s Jack Reacher style, especially in the first couple of chapters. It’s painfully bad. Martin seems to humiliate himself with some horrible writing and very lame imitation. Yet it does, deftly and indirectly, call attention to Child’s bizarre writing style. Martin provides a scathing criticism without stating a word of criticism. What is the Child style? In some ways it’s redolent of Cormac McCarthy’s minimalism (without the poetry): short, direct, declarative sentences and sentence fragments, the grunts and whistles of a taciturn caveman. And the Reacher books are characterized by the catchphrase “Reacher said nothing,” a phrase Martin glorifies in a brilliant feat of misdirection.

Martin displays these and other elements of the Child style critically, again without commenting directly on them. For example, he shows several instances of cringeworthy purple prose along with some extremely clunky sentence structures and almost uninterpretable quirks of narration, qualities Child’s writing shows in abundance.  It’s an  extremely subtle, even artistic form of criticism, showing, without saying.

Example: “… about the one thing he couldn’t do was write a novel about his own experience. Which was why Reacher still needed him. He’d written the first line on September 1, 2013. It had to be September 1. Every year. Without fail. Now it was over.” (p. 5)

I submit that is parody, even ridicule, of Child’s writing style, and Martin slipped it past the censors.  There are many other similar examples.

  2. On page 41, Martin writes: “‘I like the way you use which,’ I said. Which made sense anyway. Subordinate clause, but you give it a fresh start.”

It’s another beautifully disguised criticism on many levels, ridiculing Child’s excessive use of fragments and including the deliciously cryptic italicized phrase, which renders the passage nonsensical but is supposed to be a thought-balloon (I think). Child, oblivious to the irony, eats up the praise while Martin parodies him.

  3. On page 56, Martin inserts another dirk into Child’s cloak when he over-praises the title of the novel, Make Me. Masterful! It is, of course, an uninformative title, having nothing to do with the story. It evokes the mood of a schoolyard bully for no apparent purpose except, perhaps, to reveal something about Child’s own mentality, and that is reflected in the novel. Jack Reacher has the social development of a nine-year-old, and after reading Martin’s book, I began to believe that was true of Child as well. So the title is perhaps an inadvertently embarrassing self-disclosure by Child, highlighted and interpreted by Martin. Again Child unwittingly basks in Martin’s praise.
  4. Money, money, money! Child is all about money (pp. 65ff., 85, 89, and elsewhere), and he has done extremely well indeed with the Reacher series. Child portrays himself (per Martin) as being like Reacher – an unmotivated drifter with no agenda. Martin effectively exposes that self-description as either delusion or pure cynicism. Writing is all about the money for Child and that’s what drives him, not any artistic muse, as Child claims with abundant self-flattery. Martin skillfully demonstrates that contradiction without stating anything directly.

“So you’re a poet … and a ruthless bastard at the same time?”

“One does not impact on the other…” (p. 86).

  5. Martin reports Child’s appreciation of the “Flaubertian point of view” (more commonly called, in the U.S., “free indirect discourse,” or FID – a type of narration supposedly invented by Flaubert). Child enthusiastically agrees, for he is a fine literary artist after all. Child does make extensive use of FID in his narration, but so does everybody else these days; it is required in modern writing. However, Child corrupts the subtlety of the technique by inserting unbelievable, often incomprehensible, italicized thought-balloons into the text, essentially constituting a different narrator entirely – a first-person narrator that often competes with the close-third narrator exercising FID. Child, of course, is oblivious to this garbling of the technique. Martin is not. (See pp. 131ff., 133, and 138.)
  6. What is the plot of Make Me? You’d be hard-pressed to outline it. The story throughline is very nearly lost in the endless meandering that makes up most of the book. Reacher is unmotivated and wants nothing. The “MacGuffin,” his friend’s missing partner, is known by the reader to be dead on page 1. The so-called plot seems to be merely episodic, a long, saggy series of almost unconnected scenes leading nowhere in particular. I admit I couldn’t even keep track of why the main characters were furiously scooting off to Los Angeles or Oklahoma – I had completely lost the thread of what was going on because the story was directionless and nothing mattered. The “grand denouement” of the ending could have been written as Chapter Two, so unrelated was it to the rest of the story.

But Martin skillfully reveals Child’s self-serving “theory of plot” (see pp. 138-139): that it is the job of the author to “kill the plot.” What? On the other hand, maybe he did that, though I’m skeptical it was on purpose. More likely, Child can’t get a grip on a solid Reacher plot. The books are extremely episodic, not story-driven, and obviously written by the seat of the pants. Nor are they character-driven: Reacher is an unchanging rock. By the last page of the book he has barely mussed his hair. My conclusion, prompted by Martin, is that Reacher books are neither character-driven nor plot-driven. They are author-driven.

  7. Who is the audience for Reacher books? Martin probes that question ever so gently, aware that he simply cannot insult any of Child’s readers, not even one. The publisher/censors would be all over that with a flamethrower.

So instead, Martin presents a long anecdote about how Child routinely beats speeding tickets. While that conversation is presented in a humorous tone, it is the sneering humor of a bully.  I think the point Martin is making with this extended diversion is that Reacher Creatures (as avid readers call themselves) are thrilled by simplistic and brutal vigilantism because they have an extremely undeveloped sense of social justice and no clue about the principles behind the judicial process (like the Constitution, for example). The readers have the moral and social development of nine-year-olds. Martin skillfully makes his point about Child’s readers without insulting anyone. (See p. 175 and also p. 196).  It’s brilliant.

  8. Ever clever, Martin reports some juicy trash talk from Child in the final few pages, as Child expresses (often indirectly) disparaging attitudes toward James Patterson, John D. MacDonald, the James Bond series, John Grisham, Dorothy Sayers, Thomas Harris, and many others. Of Harris’s Hannibal Lecter character, Child says, “…It could be parody – either that or Harris just fell in love with his own creation.” This is exactly what I’d been thinking about Child and Jack Reacher, and maybe Child wanted to confess as much about himself, but even if he did, Martin would never get something like that past the censors, so he makes the thought a speculation about Harris, by Child, not about Child himself. Very sly.

Child muses, “Do you think it’s possible some smart cookie at Google is going to come along and read all this and turn it into a piece of software that can write virtual Lee Child novels from now till kingdom come?” (p. 313). Indeed.

Conclusion: Reacher Said Nothing is a difficult book because you have to sift through a lot of dross to find the jewels, but they’re in there. Once you understand that Martin had no choice but to write as a fanboy and leave not the slightest smudge on the Reacher franchise, you can see through the veneer to his subterranean agenda. Though the book is a brilliant artistic achievement, Martin’s frustration is palpable, summed up in a statement camouflaged by a seemingly irrelevant tangent on Wittgenstein: “There is a line right at the end of the Tractatus Logico-Philosophicus (Proposition 7) which anticipated ‘Reacher said nothing’: ‘Whereof one cannot speak, thereof one must be silent’” (p. 294).

Martin has managed to convey important literary criticism about Child/Reacher, cleverly disguised within an ostensibly brain-dead fluff piece. Martin has outsmarted Child, Random House, and even Jack Reacher.

Martin, Andy (2015). Reacher Said Nothing: Lee Child and The Making of Make Me. New York: Random/Vintage. 345 pp.