The Phrenological Slope of the Post

Do some blog-brains have a pronounced proclivity propelling profuse postings, and can the inclination be felt in the shape of their skulls? A blogger has fallen from grace with the blogging sea. I’ve been meaning to post on the phenom, and even though it’s old news in today’s Blogger Ocean, where tides rise and fall every few minutes instead of twice a day, here goes.

Phrenology was taken for a serious science in the late 1800s, and occupied thinkers from all occupations. I first learned about phrenology back at CSUDH, reading “Moby Dick” in an American Lit. class taught by Abe Ravitz. The idea behind phrenology was that a person’s predilections, proclivities, and personality could be read by feeling the shapes of the person’s head, its bumps and curves and slopes. The person doing the feeling was the phrenologist. There may be some basis for comparing the phrenology of the 19th Century to the neuroscience of the 21st.

Can a blogger post too much? Frequent blogging appears to be an acceptable practice as long as the blogger does not repeat posts, but there are rules within rules, so it is okay to repeat a post as long as the previous post is properly cited, even if the post is one’s own. A post of one’s own should not (recent criticism makes clear) with a change of venue be presented as a new post. So, what bump within the neuroscience journalist Jonah Lehrer’s head provoked the young but already venerable writer and speaker into doing just that? What pressures build in the brain from the habit of frequent posting?

I’ve been reading Jonah’s blog, The Frontal Cortex, for some time. The first time I mentioned it in the Toads, I hasten to cite, was on November 21, 2009, in a post titled “This Is Your Brain on Books.” Your brain on posts, apparently, looks and feels different. I still like Jonah’s work, and find much of the recent criticism that followed his reposting, after a change of venue to The New Yorker, of old posts and ideas previously sounded elsewhere somewhat opportunistic (taking advantage of the breaking news to call out Jonah on issues having nothing to do with the current topic), exaggerated (sounding like the Queen in “Alice”), and off point.

The electronic world never sleeps. Surely, the brain feels this, and posting can be addictive, and so can the attention a writer might crave. Over at Twitter, we find writers whose followers number in the thousands. One simply can’t “follow back” that many tweeters, certainly not at the frequency many tweeters are known for. This is seen in blogs also. Does the blogger really want to write the blog everyone follows? In blogs begin responsibilities (follow link, and see Delmore Schwartz).

The best critical review of the Jonah Lehrer self-plagiarism issue I found at Slate, in an article by Josh Levin titled “Why Did Jonah Lehrer Plagiarize Himself? Because he stopped being a writer and became an idea man.” Levin says, in the last paragraph of his article: “A blog is merciless, requiring constant bursts of insight.” This is true of the daily blog, the hourly post, or the every-minute-the-blogger-is-awake blog. But Levin is even more brutally honest: “Most of us journalists have one great idea every few months….” But there are so many different kinds of blogs, dedicated to so many pursuits. But maybe all blogs break down to two basic kinds, the serious (series, < Latin, promotions; ex ordine, no break) and the not so serious (enough posting for today; want to do some yard work). And where will the Toads go, adrift on the Blogger Ocean?

Follow-up, Jul 31, 2012: My sister Barb just sent me this from a Guardian blog: “Journalist resigns for fabricating Bob Dylan quotes.” The journalist? Jonah Lehrer.

Sea Monsters in A. C. Grayling’s Secular Bible; or, Humanity’s Greatest Endeavor

The receding shorelines of the Sea of Faith betrayed not a spiritual drought but a thirst for knowledge when Matthew Arnold stood on the cliffs of Dover and declared his desperate love for his girl amid humanity’s confusing mission, for the beautiful sea, the moon coming to pieces on its surface, the calm English evening wanting amour, was full of sea monsters. It’s an easy poem to parody, Arnold’s “Dover Beach.” Anthony Hecht certainly thought so, when, about a hundred years later, he refashioned it as “The Dover Bitch,” thinking of the lot of Arnold’s girl, who, lured by the promise of a weekend tryst at the beach, is forced to listen to Arnold’s God’s not in his heaven, all’s wrong with the world speech. Not much has changed since Arnold’s moonlit vision of sadness. The Sea, though not yet empty, is still losing water to the thirsty scientists, whose promises, in turn, of certitude, progress, or peace, seem as empty as Arnold’s unfurling religious girdle.

If there is no spirit, then nothing is spiritual. The brain is simply a piece of meat, as Jonah Lehrer keeps repeating, and the universe is merely a long fly ball of exploding rock off the bat of a big bang Louisville Slugger. But the nature of the slugger remains unknown, and there’s reason to view with skepticism Dawkins’s and his disciples’ descents. The latest to echo Arnold’s theme appears to be A. C. Grayling, who has written a secular bible, in which he creates a collage from the world canon. Here’s a sample, from Grayling’s “Genesis”: “Thus nature by unseen bodies and forces works; thus the elements and seeds of nature lie far beneath the ordinary gaze of eyes, Needing instead the mind’s gaze, to penetrate and understand” (p. 5). But doesn’t this carry a whiff of dualism, from which the spirit was born? And does he mean “the ordinary gaze of eyes,” or the gaze of ordinary eyes? For just as the Church argues that we need the clergy to explain what we in our ordinary (not to mention fallen) state can’t understand, Grayling posits the scientist as the new high priest who will explain what we in our ordinary intelligence have no way of seeing or understanding: “It is nothing less than science, mankind’s greatest endeavour, greatest achievements, and greatest promise” (p. 11). In any case, Grayling’s secular bible hardly seems an improvement over the sacred Bible. Grayling suggests that his purpose is to get us to think independently, but that’s not as clear as that he wants us to think like him. Anyway, it would seem that much of the writing of the world canon writers he references (Dryden and Milton, for example) would never have been written were it not for the Bible. There are other seeming contradictions in Grayling’s purported purpose.

Grayling comments, in an interview with Matthew Adams, in The New Humanist, “If the sum total of positivity, in some way, outweighed the negativity, in that little moment in one corner of the universe, which was otherwise just a bland, neutral state, then the whole history of the universe is made good by it. But if the negativity outweighed the positivity, then the whole history of the universe is tainted by it. And for that reason, we have a universal responsibility to promote the good.” This sounds strangely religious, and thus contradictory, for it’s religious sentiment Grayling wants to eradicate. It also sounds like some sort of cosmic baseball game. And what is the mind that he refers to? Would that be Lehrer’s piece of meat? Grayling seems to continue the mind-body split, which is what gives rise to ideas of the spirit to begin with. And what is the universe, and why should we feel responsible to its indifference? And does the universe have a history? These seem metaphors and anthropomorphisms, inaccurate and irrelevant. It’s simply not clear why our promoting the good would make any difference in or to the universe. To better understand the universe, we could read again Garrett Lisi’s “An Exceptionally Simple Theory of Everything,” except that the physics is surely beyond the ability of ordinary eyes. And we are again reminded of Robert B. Laughlin’s A Different Universe, which opens and ends on a theme suggested by Sir Arthur Eddington: “Not only is the universe stranger than we imagine, it is stranger than we can imagine.” Grayling, in his introduction, which he calls an Epistle, reaches back to the ancient Greeks when he says that “…every action and pursuit, aims at some good….” But it’s not so easy knowing what’s good. What we value is simply what we want, and what we want is not always what’s good for us. 
In the end, Grayling’s purpose seems naïve, and worse, for he seems to trap much of the independent thinking in the world canon in a cage with a single purpose, and that can’t be good.

Is the universe free? “They’ll never ever reach the moon,” Leonard Cohen sang, “at least not the one we’re after.” Just so, the physicists attempt to explain the universe in a language most of us will never understand. But then what language are we to use to understand the moon we are after, or the ocean in which we wish to live? The neuroscientists exploring the brain are like the physicists exploring the universe. As Vonnegut illustrated in his short novel Cat’s Cradle, no cat, lots of string. There’s nothing more difficult than creating something from nothing. Science is not, as Grayling would have us believe, “mankind’s greatest endeavour.” Humanity’s greatest endeavor, to return to Matthew Arnold, is love.

Plato was a Neuroscientist, too; or, Plato’s Purple Haze

A new Oliver Sacks book is out, The Mind’s Eye. We are kicked in the eye with metaphor, philosophy, and dichotomy, and we have not even opened the book yet: metaphor because Sacks is talking about the brain, for the mind, as Jonah Lehrer put it, “is really just a piece of meat” (Buckminster Fuller defined the mind as the capability to leverage ideas from single case experiences – see Operating Manual for Spaceship Earth); philosophy and dichotomy because to speak of the mind as an idea distinct from the brain is to cross into Plato territory (a very large country of philosopher states).

We came across Sacks’s new book in the April Harper’s Magazine, in a review by Israel Rosenfield, “Oliver Sacks and the plasticity of perception.” The brain is on the move again.

We ordered The Mind’s Eye; meantime, what does Rosenfield have to say about it? Rosenfield makes this claim: “There is a simple fact about evolution that, although rarely mentioned, is very revealing: plants don’t have brains.” (Of course, why mention the obvious, a claim about which there is no disagreement?) Yet here’s his explanation for why plants don’t have brains: “Plants don’t have brains because they don’t need them; they don’t move from place to place.” In grade school we called this moving from place to place “locomotion.” The classic example is the amoeba, but do amoebas have brains? We understand that plants don’t have the same locomotion that animals do, but we question Rosenfield’s claim that plants don’t move from place to place, because plants do move from place to place. They travel underground and in the wind, float down rivers and out to sea, appear in the most unlikely places, out of cracks in the hot summer asphalt. In fact, as Michael Pollan has suggested (The Botany of Desire, 2002), plants manipulate animals: we taxi them around, ferry them, fly them to the moon. Plants may not have brains, or locomotion, but they do get around.

Rosenfield says that brains “create something that is not there; and in doing so they help us to make sense of our environments.” To illustrate, he uses the phenomenon of color. According to Rosenfield, there is no color outside of the brain: “There are no colors in nature,” he says. (Tell that to Van Gogh, whose paintings reveal the brush of a butterfly and the heart of a hummingbird.) Nature, without a brain to perceive it, is like Jimi Hendrix’s “Purple Haze”: “If we were aware of our ‘real’ visual worlds,” Rosenfield says, “we would see constantly changing images of dirty gray, making it difficult for us to recognize forms.” But Zoe, our cat, has no problem recognizing forms. Then again, she does act like she’s on Purple Haze most of the time. In any case, the mention of separate realities brings to mind Plato as well as Purple Haze. Any mention of forms brings us back to Plato. We might also work in Carlos Castaneda’s A Separate Reality. Is there something outside the brain? Is there something inside the brain? What does it look like when the brain is asleep, or astroke?

But it’s a sunny morning in Portland, the first in some time, the sky a solid blue, fronting the promise of a solid gold weekend. Both our brain and mind seem to agree that we should get out and into this sun. Zoe’s already out there, chasing the forms around the Salsa Garden. Ah, Bartleby; Ah, locomotion!

Where Winston Churchill meets Roddy Doyle; or, the Library is not a Zoo

“Fancy living in one of these streets – never seeing anything beautiful – never eating anything savoury – never saying anything clever!” The quote could easily have come from any one of Roddy Doyle’s many crude characters, hewn from a pub-lyrical pint in a Barrytown road: “Wha’ part o’ Dublin? Barrytown. Wha’ class are yis? Workin’ class. Are yis proud of it? Yeah, yis are…Your music should be abou’ where you’re from an’ the sort o’ people yeh come from,” says Jimmy Rabbitte, who will manage The Commitments to his vision of a Dublin band playing soul music.

But it can’t be a Roddy Doyle character said it, the bit about “never saying anything clever!” For Roddy Doyle characters rarely say anything that’s not clever. Clever’s what comes from never seeing anything beautiful and never eating anything savoury. But what’s savoury? For that, we might go to Roddy Doyle’s The Van: “He got the scoop in under the chips and got a grand big load into the bag, filled it right up. Good, big chips they were, and a lovely colour, most of them; one or two of them were a bit white and shiny looking.” A bit of tea to go with the fish, perhaps, from Roddy Doyle’s The Snapper: “He tried the tea. It was brutal.” But if it was a Roddy Doyle character said it, he would have said it passin’ through a Churchill neighborhood.

Brutal too is the knowledge that our opening quote does not come from a Roddy Doyle character, but from Winston Churchill, quoted, according to Adam Gopnik, in “Finest Hours,” in the August 30 New Yorker, by Edward Marsh, Churchill’s secretary, as Churchill visits a working class neighborhood in Manchester. Gopnik pulls the quote out as evidence that those closest to the great, and in a position to assist them in their finest and not so finest hours, are also best placed to snap peculiarly revealing shots. Gopnik may have had other reasons for isolating the quote. For one thing, it’s possible Churchill’s enculturated attitude displayed in the quote explains his ability to use those living in those streets as cannon fodder in his war.

There certainly is something to be said for living in beautiful, savory, and clever conditions, as Jonah Lehrer explains in his Frontal Cortex blog: “In the late 1990s…the University of Illinois began interviewing female residents in…a massive housing project on the South Side of Chicago. Kuo and her colleagues compared women randomly assigned to various apartments. Some had a view of nothing but concrete sprawl, the blacktop of parking lots and basketball courts. Others looked out on grassy courtyards filled with trees and flowerbeds. Kuo then measured the two groups on a variety of tasks, from basic tests of attention to surveys that looked at how the women were handling major life challenges. She found that living in an apartment with a view of greenery led to significant improvements in every category.”

But literature does not come from the beautiful, the savory, and the clever, but from the brutal, the sardonic, and the cleft. Art does not come from patios filled with plants, but from greasy asphalt alleys glistening in flickering neon. This is why we must be careful of the library. The library is like a zoo, its books like wild animals, snakes, and deadly insects, but the library is not a zoo, for in a zoo there are cages that separate and protect us from the lions and tigers and bears. In a library, the stacks are open, and readers reach out and pet the books at their own risk.

And after leaving the library, assuming you make it out alive, and you’re hungry and there’s a food cart about, be sure they’re stuffing their wraps with real food and not nappies.

How to Live Happily to 106: Happy Bloomsday, Mr. Leopold Bloom

Articles celebrating victims of extreme old age usually ask about diet, so let’s get that out of the way first:

“Mr. Leopold Bloom ate with relish the inner organs of beasts and fowls. He liked thick giblet soup, nutty gizzards, a stuffed roast heart, liver slices fried with crustcrumbs, fried hencod’s roes. Most of all he liked grilled mutton kidneys which gave to his palate a fine tang of faintly scented urine.”

The time is morning, the scene the house, the organ the kidney, the art economics, the symbol the nymph, the title Calypso, the technique mature narrative (Gilbert, 1930). The day was June 16, the year 1904, the place Dublin, the book James Joyce’s Ulysses.

Speaking of mature narrative, Jonah Lehrer, over at the Frontal Cortex, has put up a post titled “Old Writers” in which he dispels the myth that writers do their best work when very young, that older writers can’t match the quality or creativity of their younger work, as if writer’s ink were a kind of dark blue testosterone that fades and weakens in potency with age. Lehrer concludes his post with “…different circumstances call for different kinds of creativity…The most successful artists aren’t slaves to their chronological age. Instead, they succeed by speaking to the age in which they live.”

Works want readers, listeners, viewers, and they always want new readers, new listeners, new viewers, and when they don’t get them, they feel old and weak, remaindered and marked down, bagged for the garage sale: Books Penyeach.

Pomes Penyeach was first published in 1927, when James Joyce was 45 years old. Joyce’s works are remarkable for their consistent creative originality that insists on new forms to communicate the events that parallel the writer’s age and the age of the writer. And they have not weakened over time, but have grown stronger with age. Perhaps it was those nutty gizzards. Almost certainly it must have been the burgundy, as Bloom suggests (although Joyce preferred white wines). In any case, the example of Joyce’s works expresses Lehrer’s definition of the successful artist, that the work has nothing to do with the age of the artist, but everything to do with the age in which the work is experienced.

Theo Jansen and Advanced “Avatar”

Caleb Crain, we learned yesterday, prefers movies that are true to nature, acoustic. He’s more interested in the Carny than the ride, while David Denby prefers the roller coaster, ignoring the Carny, and if he doesn’t have to leave the theatre for the ride, even better. Johnny Meah’s act wouldn’t make much of a movie for Denby. Yet it may not matter what the professional critics think because as their ranks dwindle thanks to the disappearance of newspapers we may find the neuroscientists filling the gap.

Jonah Lehrer, who writes from a neuroscience perspective and explains things like why we stop at red and go at green and why some of us slam the brakes at yellow while others hit the gas, suggests in his Avatar review that there might be something wrong with the prefrontal cortex that prefers the acoustic; for some reason, the brain responds negatively to the film drug. Not to worry, though, whatever your brain seems to prefer, for Jonah’s commenter number eleven, David Dobbs, also a scientist, rebuts Jonah’s scientific argument and calls Avatar “impoverished.” As it turns out, the neuroscientists, like the critics Crain and Denby, also find different values in the film and the brain.

I remember when the first Star Wars movie was released; I finally saw it a decade later. I’m sure there must be something wrong with my prefrontal cortex, judging from my taste in movies. In Bradbury’s Fahrenheit 451, television technology has evolved from the little toads sitting front and center of the mid-twentieth century living room to screens that fill entire walls, and the best TV for one’s home fills all four walls, and the viewer literally interacts with the TV characters, becomes part of the show. Avatar encourages viewers to imagine a time when the film technology of Avatar seems as dated as the first Star Wars movie, and to imagine that that time is now – the fix must be for increased immersion, guaranteeing a string of sequels.

In the 1960s, during the height of the psychedelic craze, someone asked Salvador Dali if he took drugs when he painted. No, he said. Why would I take the drug? I am the drug. And when the scare was that rockers were putting secret messages in their recordings, some of which could be understood by playing the record backwards, someone asked Alice Cooper if he spiked his records with secret messages. No, he said, I don’t know how to do that, but if I did, the message would be to buy more records.

If we are to be controlled by technology, what’s the point? We still have to contend with nature, our nature, the nature of others, and mother nature. Jonah, in his “review,” argues “why the Avatar plot is so effective: it’s really a metaphor for the act of movie-watching.” Exactly, it’s consumerism about consuming, about being eaten alive by technology, and it’s yummy.

And what of acoustic technology? Is there anyone out there creating creatures more fantastic than those virtually real ones we see via 3D in Avatar? There is. Check out this video. It’s Dutch artist Theo Jansen with his creatures, and they are more fascinating than anything you will experience in Avatar because while they are virtually non-tech, they are real; they have become part of nature, and you don’t need special glasses to view them.

Theodore Dreiser and Flannery O’Connor were Neuroscientists, too

Over at The Frontal Cortex, Jonah Lehrer has posted his Wall Street Journal article in which he takes the pow out of will power, arguing the busy brain is to blame for human frailties. It’s a classic defense of the human condition (Dreiser used it in An American Tragedy), and a blow to the motivational-speaker market.

The reduction of will power also suggests the neuroscientists may be close to removing the free from free will. No wonder a good man is hard to find. There might be some will left, but not enough for being saved to work as a one-shot deal. Flannery O’Connor explains in her short story “A Good Man is Hard to Find”: The Misfit, having provided the grandmother with her final jolt of grace, says, “She would have been a good woman, if it had been somebody there to shoot her every minute of her life.” This is Flannery’s depiction of the Catholic view of will and grace, and it explains the Catholic necessity of being saved every moment of one’s life, of the necessity of being reborn daily, not just once, for one could live, in the Catholic tradition, a good life for 80 years, but a single hanging curve ball that goes against the signs and you’re yanked and sent down the tunnel, for in Catholicism, as in baseball, it’s not about what you did for me yesterday; it’s what you can do for me today that counts.

Motivation depends on the quote, a bite of sugar; motivation is entertainment – motivetainment, ads directed at the brittle brain. Quotes are empty calories. If losing weight is a resolution for 2010, skip the motivation; instead, read Theodore Dreiser, go for long walks, and eat bananas. Bananas are funny and literary – you’ll need both after reading Dreiser.

This Is Your Brain On Books

Over at the Frontal Cortex, Jonah Lehrer has posted his review of a new book about the effects of the brain on reading: Stanislas Dehaene’s Reading in the Brain. Lehrer says that the “moral of Dehaene’s book is that our cultural forms reflect the biological form of the brain; the details of language are largely a biological accident.” We’ve not read Dehaene’s book yet, but Lehrer’s summary seems to suggest a symbiotic relationship between the brain and the brain’s environment.

To understand the effects of reading on the brain, one must go to non-literate cultures, and study, as Marshall McLuhan researched, the changes that occur in both the brain and the culture as reading is learned. “The most obvious character of print is repetition,” McLuhan said, “just as the obvious effect of repetition is hypnosis or obsession” (p. 47). It’s impossible to be illiterate in a non-literate culture, and non-literacy has its advantages.

When we read, we are hypnotized, the eye becomes master of the sensorium, the remaining four senses impressed into eye-service. The hypnosis blinks when the eye sees an unfamiliar word, and the tongue and ear have to help out: “we’re forced to decipher the sound of the word before we can make a guess about its definition, which requires a second or two of conscious effort” (Lehrer). This means that the new reader must mouth his words as he reads (since all the words are unfamiliar to the new reader); he must hear them first. This is why, according to McLuhan, “the medieval monks’ reading carrel was indeed a singing booth” (p. 115). They had not yet learned to read silently. They had to say the word and hear it; the words entered the brain through their ears, not through their eyes. (This supports using a phonics method to teach reading.)

Lehrer says that Dehaene “also speculates that, while ‘learning to read induces massive cognitive gains,’ it also comes with a hidden mental cost: because so much of our visual cortex is now devoted to literacy, we’re less able to ‘read’ the details of natural world.” Again, this ground was covered by McLuhan in The Gutenberg Galaxy: The Making of Typographic Man.

“Literacy,” McLuhan argued, “affects the physiology as well as the psychic life” (p. 45). McLuhan said that “every technology contrived and ‘outered’ by man has the power to numb human awareness during the period of its first interiorization” (p. 187). And this is the ground that Nicholas Carr has been sifting through with regard to the effects of the internet on reading and on the brain.

It’s curious to hear Lehrer, not quite a neuroscientist (which is one reason we like him; he’s a non-specialist), say that “the brain is much more than the seat of the soul…,” curious in that he resorts to both metaphor and the metaphysical in a single phrase. “The seat of the soul”: surely that’s your brain on books.

June 6, 2010 Update: Jonah Lehrer takes some of the wind out of Nicholas Carr’s neuro-sails in a Times review of Carr’s book The Shallows and in a follow-up post on his blog.