Throwback Thursday: The ER and the Stroke

When I quit the day job, nine years ago now, I inquired about volunteering at a local hospital, and was processed through a formal application and selection drill: an application with references and a background check; a tuberculosis test and booster shots; an orientation that included a grand tour of the byzantine hospital bowels; specialty departmental training; a branded work shirt and a photo badge, a job post, and a schedule. I was assigned to the Emergency Room.

There were two main entrances to the emergency room: out back, where the emergency vehicles (ambulances, fire engines, police cars) enjoyed red carpet treatment, and out front, which was open to the general public, patients arriving alone, with a partner, or in small groups, by car, cab, city bus, bicycle, or on foot, in all manner of conditions, inside and out, physical and mental and emotional, in every kind of weather. I worked the out front entrance.

At first, I was given three main duties: I maintained a fresh supply of wheelchairs at the entrance and greeted patients who needed one at the sidewalk or curb, wheeling them inside and up to the reception desk; I walked around the waiting room tidying up and cleaning as necessary; and I handed out bottled water, blankets and pillows, and ice packs. Occasionally, I ran errands out of the ER, collecting wheelchairs that had wandered off like shopping carts from a grocery, or picking up something special for someone from the gift shop or cafeteria, or showing some lost visitor the way to the elevator that would get them to the floor they wanted. On my way back to the ER, I might slip into the chapel to ensure there was at least one candle still burning.

Usually, I was the only volunteer on a given shift, or a couple of shifts might overlap, and another volunteer and I might bond and compare impressions. I had agreed to work any day or time, which meant I got some evening and weekend shifts, when the main hospital felt sort of like a ballpark when there isn’t a game going on, but the ER concession was busier than ever.

One of the shift supervisors was keen on keeping everyone looking busy. Usually, everyone was busy, but Big Nurse, I’ll call her, though she was a tiny live wire, frowned on any posture that might suggest idleness or shiftlessness. At orientation, the topic of blood was introduced. Not for the squeamish was the ER. If you had a problem with blood, they could find you a job in an office somewhere, safe from the oozes and crusts. Still, volunteers should all avoid blood contact, and in any case, would not be required to touch anything, well, bloody. One evening, I wheeled a patient holding a large rag under her arm into the waiting room. She was quickly admitted, but we had stopped for a few moments outside the ER. On my way back with the wheelchair, I noticed some blood pooled on the tile floor. I told Big Nurse about it. Well, she said, clean it up, and she handed me a pair of gloves, a spray bottle of cleaning solvent, and some towels. Throw the towels and gloves into the toxic bin when you’re through, she said.

I sprayed and scrubbed the floor clean of the blood and properly disposed of the tainted tools of the trade. Returning to my station, I looked back at my work, and there gleaming in the unfriendly light of the lobby was a spot in the tile floor that looked as clean as the holy grail. I had either washed clean through the floor finish, or the rest of the floor had not been cleaned in some time.

You might think a visit to the ER a rare occurrence for most individuals, but there were a few regulars everyone seemed to know by name, and one, in particular, was adept at making a scene. The night I met him, he was being treated like the boy who cried wolf one too many times, and he was not being admitted. I was asked to help him out of the lobby. I asked him if he’d like a wheelchair. No, he said, grabbing and wrapping my arm in his, spreading an amazing swath of sweat across my bare arm. Just hold me up and help me walk, he said, assuming now the most stoic of attitudes. After helping him out, I went to the restroom and scrubbed my arms as if I were heading into surgery, unsure if the ER was my calling.

The reception desk work station separated the waiting room from the general emergency room complex. After I’d worked her shift a few times, and possibly after seeing I could handle a bit of blood and sweat, Big Nurse began to give me jobs inside the ER. You entered the actual ER through a large set of automatic doors, the opening big enough to wheel a chair or bed through. Immediately inside the ER were smaller, private hospital-like rooms where nurses and doctors completed patient triage, fixed small complaints, prescribed, or ordered x-rays or other tests, admitted patients or sent them home. Down the wide hall and around the corner from the private rooms, a large, well-lit theatre-like area was rigged with gurneys, nurse stations, and assorted medical equipment. It might have been the backstage of a movie studio lot. This is where the patients who arrived in ambulances were admitted and worked on by shapes in green scrubs wearing masks and plastic gloves.

My new jobs inside the ER proper included remaking beds, tossing used pillowcases into the laundry and fitting on new ones, tidying the exam rooms, emptying trash cans, and replenishing supplies, including gloves and towels and tissues. I came and went purposefully, mostly ignored, everyone busy. Back in the waiting room, a couple of children were playing a game of running around in circles. Big Nurse told me to gather them up, take them to the back where there was a kids’ play area, and read them a story or something. She said this with such optimism and confidence you might have thought I had included a stint with Mr. Rogers in his wonderful neighborhood on my resume.

As a volunteer, I had a prepared, approved script should I get any questions from people in the waiting room. I am a volunteer, but I know you have not been forgotten. Sometimes, I would walk to the work station and look at the check-in chart, returning confidently to the waiting patient, assuring them they were still in line. Still, there would be questions I could not have answered even if I wanted to: why am I here; why is this happening to me; what’s taking so long; I was here before them; how much longer? I’m going outside for a cigarette – will you come and get me if they call my name? They don’t call it a waiting room for nothing. I started bringing some supplies from home: crayons and coloring books for the children; old New Yorker and Rolling Stone magazines; a couple of used Louis Untermeyer poetry anthologies; a couple of decks of playing cards. The supply had a way of getting used up; things disappeared.

Big Nurse handed me a clipboard and told me to call the patient’s name and escort them into an exam room in the ER. I walked out into the waiting room a few steps and called a name, ending it with a hopeful question mark: Joe?

Occasionally, I was invited by a patient to sit and talk, which meant to listen to what had happened. Sometimes, things slowed down, and I walked around the waiting room collecting magazines, picking up empty bottles, tossing used ice packs. The emergency room is open 24 hours a day, seven days a week, all year long. At home, I got busy doing some other things. I missed a week in the ER, then another. And at some point, I realized I probably wasn’t going back. I wasn’t going to be one of those volunteers who had logged a record number of hours for the year and had their name recognized on a display board in the lobby.

Instead, as these things happen, one Saturday afternoon, I found myself waiting in the waiting room of the ER. Susan had driven me down. Apparently, I was having a stroke. I don’t remember any volunteers. We waited a long time. I felt better. I told Susan I was going to tell them I was going home. They checked their log. Why are you here, again? I tried to explain again what had happened. But I seemed ok now. Well, you’re here. You should talk to someone. We’ll get you right in. And they did. And I was admitted and stayed a week. About a year later, I wrote a little piece about the experience, which the Oregonian published, and which you can read here.

For those who might think of the world as a waiting room, find someone who appears to be waiting, and offer them something to help with the wait.

Flashing Lights and Random Noise in the City of the Brain

The ophthalmologist asked if I was still seeing the flashing lights. Rarely, but hard to predict. So the brain has gotten used to them, and is ignoring them, she said, and I immediately wondered why that same brain couldn’t ignore the tinnitus sounds also.

Sophisticated sound systems increase chances of distraction from random noise. If you must cough, wait until the crescendo is about to peak. Clarity of sound is valued. Increase pixels, dots, from cartoon to photograph. Whatever might muddy the waters is considered distraction. Clarity is a value that dithers. We must learn to connect the dots.

But what is distraction? And when might distraction be desirable? Rocks viewed through water look different after the creek dries out. The transistor radio is the perfect transmitter of the three-minute basement tape composition recorded on a single-track, handheld device. Form may distort or obscure content so that we might hear, see, feel, smell, or taste what we might otherwise have missed, though the effort often fails.

Kindness, sense of humor, forgiving, joy of life.

Culture provides for cloistered clarity, photographs viewed through filters, the eye a sieve. The ear a strainer. We may not wince quite as much from scenes in a film when intoxicated from the smell of buttered popcorn.

“How do you know but every bird
that cuts the airy way
Is an immense world of delight,
closed by your senses five?”

Just so, but in any case,

“A fool sees not the same tree that a
wise man sees.”

(Two quotes above from William Blake’s The Marriage of Heaven and Hell).

I first heard of random noise while working with some actuaries on multivariate analyses. In that context, noise is unpredictable and therefore unreliable. Random noise does not appear to correlate, nor can its causes or effects be accurately tracked or explained. Probability becomes problematic. The treatment is the same as for tinnitus, where smoothing or dithering renders the unwanted noise imperceptible. Random noise is asymmetrical, or anti-symmetrical; expressed as numbers, sounds, colors, or any other output, a given sequence of random noise probably cannot be duplicated.
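Dithering, in the signal-processing sense the treatment borrows from, is easy to sketch. What follows is a minimal illustration of my own, not anything from the actuaries’ models: adding a little random noise before quantizing a signal decorrelates the error from the signal, so that structured artifacts (stair-steps, banding) dissolve into an even hiss the brain can learn to ignore.

import random

def quantize(x, step=0.25):
    # Round x to the nearest multiple of step (a coarse output scale).
    return step * round(x / step)

def quantize_dithered(x, step=0.25):
    # Add up to half a step of uniform noise before quantizing.
    # The error is then random rather than correlated with the
    # signal, which is what makes it easy to tune out.
    return quantize(x + random.uniform(-step / 2, step / 2), step)

ramp = [i / 100 for i in range(100)]
stairs = [quantize(x) for x in ramp]            # visible stair-steps
scatter = [quantize_dithered(x) for x in ramp]  # even, random scatter

Averaged over many runs, the dithered version tracks the ramp more faithfully than the stair-steps do; the technique trades pattern for noise, and the brain, as the ophthalmologist observed of the flashing lights, gets used to noise.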

Things are rarely right on, but approximate; why then the need for clarity, for perfection, for proper grammar, pronunciation, spelling, punctuation? Prescription is an attempt to clarify, but description is far more accurate.

“Bright lights, big city, gone to my baby’s head” (Jimmy Reed, 1961).

Here is a four-stanza composition, each stanza four lines long, expressing random noise in hexadecimal format. The piece can be played musically if each letter is expressed as a note and each number as a duration (note the absence of g, since the base is 16). There are no more instructions. The fewer instructions, the more random the results. Randomness may be the perfect solution for writers with copyright issues. Interested readers may reproduce the exercise below at the ANU Quantum Random Numbers Server, but no matter how many times you try, you will probably never come up with the same sequence shown below. Plus, each line below has been truncated from its original. Other arbitrary changes to the server’s original output include removing all zeros and adding spacing and line breaks. (A sketch of the recipe in code follows the stanzas.)

Dither # 1

1ee 9f 9f7454 cd2 a a77114 da a4
6f2 8eab 4fc1 bad b9a13 c8 d23
19f e3 5a 27bd 4c361 e8 dec c211
b3 6a f5 4645407 d85 9 fa 35 efcbbacb

86c c3 681 d5 f5 74bc c3a 8ee8 6 f2 92 c5
91 c5 1 b3 4 b7 f68793753 dd 38ba 34f1e2d
814 eff c6884 aa 30d4 e1 a8 dc5 6c 4b
182986 bfd 982 d5 805854 c7

fc6 6e2172 eab fb 2b5
74 4afef c8 40 c57 c9 4 bab 1
b86fa 8c 4 9a 39 ffba 99ac 89 bd5 be
97 b8 8c f79 477 a c5 7d 9d d 13b 2

53 79 53 d3 61d3b178c68882aa 6 cefbbf77
d8b449 efaf 73fa8917 bfb 473774ffc1 d7 d9dfe8
1d3c c8 99761685 c21cd9 2569935ca2de6b7 ebb
23513e76b828b a5 ac
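For the curious, here is a minimal sketch of that recipe, with loud caveats: the post’s bytes came from the ANU Quantum Random Numbers Server, while this stand-in draws on the operating system’s random source; and the chunk and line sizes are my own arbitrary choices, since arbitrariness is what the recipe calls for.

import os
import random

def dither_stanza(n_bytes=48, lines=4):
    # Random hex with the zeros removed, per the recipe above.
    run = os.urandom(n_bytes).hex().replace("0", "")
    # Arbitrary spacing: break the run into chunks of 1 to 8 characters.
    chunks, i = [], 0
    while i < len(run):
        step = random.randint(1, 8)
        chunks.append(run[i:i + step])
        i += step
    # Arbitrary line breaks: group the chunks into the stanza's lines.
    per_line = -(-len(chunks) // lines)  # ceiling division
    return "\n".join(" ".join(chunks[j:j + per_line])
                     for j in range(0, len(chunks), per_line))

print(dither_stanza())

Read the letters a through f as notes (no g) and the digits as durations, and no matter how many times you run it, you will probably never come up with the stanzas shown above – which is the point.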

And the lines below were copied from “The matrix” streamed live from the ANU lab and pasted without changes, except for color – the original comes in the matrix green flavor, but it wouldn’t copy, so I’ve approximated the color with a font change. As I read through the composition, I could find no distractions, but upon preview of the post, it appears that WordPress coding has been added, probably because of the change I made to the font color. I find the result distracting.

È[Õ‚‚ω9(C∫}¾„õ°v¸JËBؾ{ΨÉŠq®ψξ
.Ó½ïpÚ±/zò→‹πɲ˼ΦÏ‚óæ;MIÅ<9Ð9š֑±ªGæ↔Ñ
ïΧ8β)hDœa1tR˜Χε¯Š4∫ÈQ”R/Ì),¦êí‰f9õÚ¯Xîé:
#¿7{‘‡ÆÙ™κë‹g¡ÖÀoªyÖ‡ƒPiDë»öÝ√üÄΩ¹+∪ÊDγÌ
xKòÁNπ:∂t-Eh픉#.=φðρ—yœ9UR—λY¤ω→ÂZãé}:σ
öÝ=±¨Íθ*Ší%τ~νËÍ[&αζF7ÏòUœκ_λωW#θ‡ûqº•Ëã
ú‡LêμÞÑÖ2:λ∂F↓°ψÌ¡pΤ›e–à.N!ÎûƒûˆhqÔÏa?ƒ</span>
<span style="color:#00ff00;"> ↔ζÆ~8ΤºÇ!#∄8ÃLLØθ¡&amp;∪ß|cZk±xÁúο@∪ÏDÜþè'¡{</span>
<span style="color:#00ff00;"> ù’;θfÑÆÏ7ψμE„&lt;ºÙ∃’a√ÖΧŠΣσÕ¢¥æÆ5Æρ
ÄO“jJω

Readers often ask what a poem means. Usually, if nothing else, what poetry means, in spite of repetition in form and sound and sense, is that you can’t guess what comes next:

What is correct in quantum indeterminacy?

One year, we went to hear the jazz trumpeter Donald Byrd at the Ash Grove in Los Angeles. But he was too loud, and we had to leave. I’d already had some hearing damage, forgetting my earplugs at the Fort Bliss rifle range, working the jackhammer from the compressor truck without plugs, thinking, I guess, that I was born with immortal hearing. Should have known better; my father suffered from ear damage. The high school I attended, Saint Bernard, in Playa del Rey, sat next to the runways of LAX, the planes taking off often a welcome break to a hard-boiled lecture.

My father often heard what was said, but muffled, without clarity. He taught himself to read lips. He was a good listener, and an avid talker, in spite of a stutter. I suspect he stuttered because he was unsure of pronunciations, the result of his hearing difficulties. Or maybe because he could not hear himself talk. One year, hearing aid technology having improved, he had surgery on both ears, to clear out rotted bone and crud, and was fitted with new hearing aids. In no time, his stutter disappeared. My mother was sure this was a miracle.

We went back to the Ash Grove to hear the guitarist John Fahey. We were seated in front. John came out with his guitar and a giant Bubble Up bottle. He sat down and drank the entire bottle of Bubble Up in one long swig, its neck stuck deep under his duck-like protruding upper lip. He put the bottle down on the floor and began to play guitar. I thought maybe he might use the Bubble Up bottle for some bottleneck guitar, but he did not. He said not one word the entire evening, nor do I recall a single burp. In short, he was not too loud. I still have his “The Yellow Princess” in vinyl album format, the one where you can hear the door close and footsteps.

The ophthalmologist asked me if my eyes felt like sandpaper. She said one of my optic nerves was larger than the other. I told her I also had asymmetrical hearing, which she apparently considered a distraction. She suggested artificial tears for the dry grit in the eyes feeling.

The brain is a megacity of flashing lights and random noise, a conurbation of neighborhoods in various stages of going to seed.

The Phrenological Slope of the Post

Do some blog-brains have a pronounced proclivity propelling profuse postings, and can the inclination be felt in the shape of their skulls? A blogger has fallen from grace with the blogging sea. I’ve been meaning to post on the phenom, and even though it’s old news in today’s Blogger Ocean, where tides rise and fall every few minutes instead of twice a day, here goes.

Phrenology was taken for a serious science in the late 1800s, and occupied thinkers from all occupations. I first learned about phrenology back at CSUDH, reading “Moby Dick” in an American Lit. class taught by Abe Ravitz. The idea behind phrenology was that a person’s predilections, proclivities, and personality could be read by feeling the shapes of the person’s head, its bumps and curves and slopes. The person doing the feeling was the phrenologist. There may be some basis for comparing the phrenology of the 19th Century to the neuroscience of the 21st.

Can a blogger post too much? Frequent blogging appears to be an acceptable practice as long as the blogger does not repeat posts, but there are rules within rules, so it is okay to repeat a post as long as the previous post is properly cited, even if the post is one’s own. A post of one’s own, recent criticism makes clear, should not be presented as a new post with a change of venue. So, what bump within the neuroscience journalist Jonah Lehrer’s head provoked the young but already venerable writer and speaker into doing just that? What pressures build in the brain from the habit of frequent posting?

I’ve been reading Jonah’s blog, The Frontal Cortex, for some time. The first time I mentioned it in the Toads, I hasten to cite, was on November 21, 2009, in a post titled “This Is Your Brain on Books.” Your brain on posts, apparently, looks and feels different. I still like Jonah’s work, and find much of the recent criticism that followed his reposting, after his change of venue to The New Yorker, of old posts and ideas previously sounded elsewhere somewhat opportunistic (taking advantage of the breaking news to call out Jonah on issues having nothing to do with the current topic), exaggerated (sounding like the Queen in “Alice”), and off point.

The electronic world never sleeps. Surely, the brain feels this, and posting can be addictive, and so can the attention a writer might crave. Over at Twitter, we find writers whose followers number in the thousands. One simply can’t “follow back” that many tweeters, certainly not at the frequency many tweeters are known for. This is seen in blogs also. Does the blogger really want to write the blog everyone follows? In blogs begin responsibilities (follow link, and see Delmore Schwartz).

The best critical review of the Jonah Lehrer self-plagiarism issue I found was at Slate, in an article by Josh Levin titled “Why Did Jonah Lehrer Plagiarize Himself? Because he stopped being a writer and became an idea man.” Levin says, in the last paragraph of his article: “A blog is merciless, requiring constant bursts of insight.” This is true of the daily blog, the hourly post, or the every-minute-the-blogger-is-awake blog. But Levin is even more brutally honest: “Most of us journalists have one great idea every few months….” Still, there are so many different kinds of blogs, dedicated to so many pursuits. Maybe all blogs break down to two basic kinds, the serious (series, < Latin, promotions; ex ordine, no break) and the not so serious (enough posting for today; want to do some yard work). And where will the Toads go, adrift on the Blogger Ocean?

Follow-up, Jul 31, 2012: My sister Barb just sent me this from a Guardian blog: “Journalist resigns for fabricating Bob Dylan quotes.” The journalist? Jonah Lehrer.

David Brooks and The Plaque of Alienation; or, the Consciousness Bubble

Are we making progress? And is the progress good? Have humans improved over time? Are we better than our ancestors? What makes us human, and whatever that is, have we been improving upon it? The universe may be expanding; our consciousness is not. Something seems to be blocking our arteries: the plaque of alienation. Yet there are some who are apparently awakening to a new dawn, a new and improved consciousness, and there’s a consciousness revolution afoot, as David Brooks tells it in his January 17, 2011 New Yorker article, under the Annals of Psychology section: “Social Animal: How the new sciences of human nature can help make sense of a life.” Not since the 1960s have we seen such an upswell in the commercialization of consciousness.

“We are living in the middle of a revolution in consciousness,” Brooks tells us. The revolutionaries in this assault on our personal dark ages include “geneticists, neuroscientists, psychologists, sociologists, economists, and others” who “have made great strides in understanding the inner working of the human mind.” Such a list of armed trick-or-treaters makes us want to light out for the territory. But wait, for “far from being dryly materialistic, their work illuminates the rich underwater world where character is formed and wisdom grows.” But how are we suddenly under water? If we’re to have a revolution in consciousness, shouldn’t we be able to talk about it without using metaphors? But there’s more: “They [the revolutionaries] are giving us a better grasp of emotions, intuitions, biases, longings, predispositions, character traits, and social bonding, precisely those things about which our culture has the least to say.” Whose culture? Has Brooks never read Langston Hughes nor heard of the Harlem Renaissance? For Langston talked precisely about “those things.” Has Brooks never read Faulkner’s The Sound and the Fury, Fitzgerald’s The Great Gatsby, Thoreau’s Walden? But there’s even more: “Brain science helps fill the hole left by the atrophy of theology and philosophy.” Matthew Arnold’s “Sea of Faith,” in Brooks’s view, is now bone dry; and apparently Joseph Campbell’s The Power of Myth is suddenly irrelevant (in spite of our underwater status), as must be Jung’s Modern Man in Search of a Soul, not to mention the work of Mary Midgley. And Brooks must have missed the film Examined Life, with Cornel West, Slavoj Zizek, and Martha Nussbaum. Nor does Brooks seem ever to have visited the Stanford Encyclopedia of Philosophy. Theology and philosophy are not atrophying; that’s one of the few immutable laws the brain seems to labor under. It’s what makes consciousness worthwhile, for, as Dostoevsky’s underground man says, “Suffering is the sole origin of consciousness.”

And where Brooks’s tightly-written scenario takes us is to a happiness moral, much clichéd, but no doubt true: we’ve been looking in all the wrong places. “Joining a group that meets just once a month produces the same increase in happiness as doubling your income,” Brooks says, citing recent research. The problem, Brooks says, is that “Many Americans generally have a vague sense that their lives have been distorted by a giant cultural bias. They live in a society that prizes the development of career skills but is inarticulate when it comes to the things that matter most.” Agreed. But why must Brooks have the imprimatur of science to get to the moral? And is it really a revolution of consciousness that he’s describing, or a simple increase in awareness that comes with maturity and experience? Jung said, “…if we maintain that mental phenomena arise from the activity of glands, we are sure of the thanks and respect of our contemporaries, whereas if we explain the break-up of the atom in the sun as an emanation of the creative Weltgeist, we shall be looked down upon as intellectual freaks. And yet both views are equally logical, equally metaphysical, equally arbitrary and equally symbolic. From the standpoint of epistemology it is just as admissible to derive animals from the human species, as man from animal species.” Jung is explaining how the scientific method came to dominate explanations of life: “…everything that could not be seen with the eyes or touched with the hands was held in doubt; such things were even laughed at because of their supposed affinity with metaphysics.” The science Brooks has come to rely on is what Jung called “psychology without the soul,” for the soul is now inadmissible evidence in the court of science. Jung explained that “It is the popular way of thinking, and therefore it is decent, reasonable, scientific and normal. Mind must be thought to be an epiphenomenon of matter. The same conclusion is reached even if we say not ‘mind’ but ‘psyche’, and in place of matter speak of brain, hormones, instincts or drives. To grant the substantiality of the soul or psyche is repugnant to the spirit of the age, for to do so would be heresy.”

No doubt Brooks could have made his argument citing the poets instead of the scientists. And no doubt Arnold’s Sea of Faith is indeed today as dry as bone dust. Brooks cites the scientists because poetic currency has been devalued. What is easily missed is that the scientists also trade in a currency, as Jung explains: “We delude ourselves with the thought that we know much more about matter than about a ‘metaphysical’ mind, and so we overestimate physical causation and believe that it alone affords us a true explanation of life. But matter is just as inscrutable as mind…It is only our doubts as to the omnipotence of matter which could lead us to examine in a critical way this verdict of science upon the human psyche.” And it is this doubt which sticks to the arteries of our psyche and alienates us from the fun the scientists today seem to be having. We fear yet another bubble.

Theo Jansen and Advanced “Avatar”

Caleb Crain, we learned yesterday, prefers movies that are true to nature, acoustic. He’s more interested in the Carny than the ride, while David Denby prefers the roller coaster, ignoring the Carny, and if he doesn’t have to leave the theatre for the ride, even better. Johnny Meah’s act wouldn’t make much of a movie for Denby. Yet it may not matter what the professional critics think, because as their ranks dwindle with the disappearance of newspapers, we may find the neuroscientists filling the gap.

Jonah Lehrer, who writes from a neuroscience perspective and explains things like why we stop at red and go at green and why some of us slam the brakes at yellow while others hit the gas, suggests in his Avatar review that there might be something wrong with the prefrontal cortex that prefers the acoustic; for some reason, the brain responds negatively to the film drug. Not to worry, though, whatever your brain seems to prefer, for Jonah’s commenter number eleven, David Dobbs, also a scientist, rebuts Jonah’s scientific argument and calls Avatar “impoverished.” As it turns out, the neuroscientists, like the critics Crain and Denby, also find different values in the film and the brain.

I remember when the first Star Wars movie was released; I finally saw it a decade later. I’m sure there must be something wrong with my prefrontal cortex, judging from my taste in movies. In Bradbury’s Fahrenheit 451, television technology has evolved from the little toads sitting front and center of the mid-twentieth century living room to screens that fill entire walls; the best TV for one’s home fills all four walls, and the viewer literally interacts with the TV characters, becomes part of the show. Avatar encourages viewers to imagine a time when the film technology of Avatar seems as dated as the first Star Wars movie, and to imagine that that time is now – the fix must be for increased immersion, guaranteeing a string of sequels.

In the 1960s, during the height of the psychedelic craze, someone asked Salvador Dali if he took drugs when he painted. No, he said. Why would I take the drug? I am the drug. And when the scare was that rockers were putting secret messages in their recordings, some of which could be understood by playing the record backwards, someone asked Alice Cooper if he spiked his records with secret messages. No, he said, I don’t know how to do that, but if I did, the message would be to buy more records.

If we are to be controlled by technology, what’s the point? We still have to contend with nature, our nature, the nature of others, and mother nature. Jonah, in his “review,” argues “why the Avatar plot is so effective: it’s really a metaphor for the act of movie-watching.” Exactly, it’s consumerism about consuming, about being eaten alive by technology, and it’s yummy.

And what of acoustic technology? Is there anyone out there creating creatures more fantastic than those virtually real ones we see via 3D in Avatar? There is. Check out this video. It’s Dutch artist Theo Jansen with his creatures, and they are more fascinating than anything you will experience in Avatar because while they are virtually non-tech, they are real; they have become part of nature, and you don’t need special glasses to view them.

Theodore Dreiser and Flannery O’Connor were Neuroscientists, too

Over at The Frontal Cortex, Jonah Lehrer has posted his Wall Street Journal article in which he takes the pow out of will power, arguing the busy brain is to blame for human frailties. It’s a classic defense of the human condition (Dreiser used it in An American Tragedy), and a blow to the motivational-speaker market.

The reduction of will power also suggests the neuroscientists may be close to removing the free from free will. No wonder a good man is hard to find. There might be some will left, but not enough for being saved to work as a one-shot deal. Flannery O’Connor explains in her short story “A Good Man is Hard to Find”: The Misfit, having provided the grandmother with her final jolt of grace, says, “She would have been a good woman, if it had been somebody there to shoot her every minute of her life.” This is Flannery’s depiction of the Catholic view of will and grace, and it explains the Catholic necessity of being saved every moment of one’s life, of the necessity of being reborn daily, not just once, for one could live, in the Catholic tradition, a good life for 80 years, but a single hanging curve ball that goes against the signs and you’re yanked and sent down the tunnel, for in Catholicism, as in baseball, it’s not about what you did for me yesterday; it’s what you can do for me today that counts.

Motivation depends on the quote, a bite of sugar; motivation is entertainment – motivetainment, ads directed at the brittle brain. Quotes are empty calories. If losing weight is a resolution for 2010, skip the motivation; instead, read Theodore Dreiser, go for long walks, and eat bananas. Bananas are funny and literary – you’ll need both after reading Dreiser.

Where readers eSurface but authors lose control

One advantage of the eBook is lightness. And library books “just disappear” from the little light box on the due date – so no overdue notices, as an article in this week’s Christian Science Monitor (print edition) illustrates (we’ve noticed our print books disappearing occasionally, reminding us of bumbling Polonius’s advice, “Neither a borrower nor a lender be”).

We read a gloomy hope, for at least reading is in the headlines: gloomy in that “deep reading” is failing; hopeful in that readers appear to be surfacing. Some consider that a problem. The CSM article references Maryanne Wolf, whom we first glimpsed in Carr’s “Is Google Making Us Stupid?”, still concerned about the loss of “deep reading.” But “deep reading” may simply be floating, detachment: “The alphabet and print technology fostered and encouraged a fragmenting process, a process of specialism and of detachment,” McLuhan said.

Carr, Wolf, and others are concerned that electronic reading is changing brain circuitry. Of course it is: “All media are extensions of some human faculty – psychic or physical…Media, by altering the environment, evoke in us unique ratios of sense perceptions. The extension of any one sense alters the way we think and act – the way we perceive the world. When these ratios change, men change,” McLuhan argues: “Electronic circuitry is an extension of the central nervous system.” If that’s so, then what? The end of books?

The eBook returns us to the middle ages, before copyright, before individual authors, before fixed points of view. The problem for some is now authorship and ownership: “Medieval scholars were indifferent to the precise identity of the ‘books’ they studied. In turn, they rarely signed even what was clearly their own…Many small texts were transmitted into volumes of miscellaneous content, very much like ‘jottings’ in a scrapbook, and, in this transmission, authorship was often lost” (McLuhan). Sounds like blogging.

“We’re not going to change the code,” Reid Lyon says. No, we’re not, but perhaps readers will, or non-readers – perhaps the code is changing (under our very ears), for, as McLuhan argues, it’s impossible to be illiterate in a non-literate culture. We may be coming close to “the end of the line.”

McLuhan, M. (1967). The Medium is the Massage. Bantam Books.

In Twosome Twiminds: News from the Stroke Club – “Who Are We?”

During our stroke, we picked up the Takamine to test our left hand, self-diagnosing our condition. We noticed our left hand with interest; it formed the shape of the chord we had asked for, but not on the frets and strings we wanted. The result was discord, the guitar sounding badly out of tune. We moved to the Telecaster. The sound was distorted, the guitar either badly out of tune, the amplifier’s speaker blown, or our hand forming some new nonsense chord. Yet, “It sounds fine,” Susan said. “It sounds like it always does.”

In Finnegans Wake, Joyce recreates the experience of a stroke: “…and now, forsooth, you have become of twosome twiminds…” (188).

From 12-1-09 Open Culture: “Jill Bolte Taylor’s ‘Stroke of Insight’ talk reaches the top of many lists. What happens when a neuroanatomist experiences a massive stroke and feels all the brain functions she has studied (speech, movement, understanding, etc) suddenly start to slip away? And how do these losses fundamentally change who we are? You’ll find out in a crisp (and at times emotional) 18 minutes and 40 seconds. You can also read her book that elaborates on her life-altering experience. See My Stroke of Insight: A Brain Scientist’s Personal Journey.”

Join the Stroke Club by finding a quiet 20 minutes to watch Taylor’s talk on video.

Rap Phonics Rhapsody: Eating the Alphabet and Spitting it Out

If the vowels decide to strike, we can probably keep the machines running, but if we lose the consonants, we’ll have to shut down.

How should we learn to read? The beginning reader, trying to make soundsense from the smell of ink of the “…miseffectual whyacinthinous riot of blots and blurs and bars and balls and hoops and wriggles and juxtaposed jottings linked by spurts of speed” (Joyce, Finnegans Wake, 118) soon understands that “When a part so ptee does duty for the holos we soon grow to use of an allforabit” (19).

Today’s beginning reader (and teacher) sits at the bottom of a tower of babble constructed of politics (made necessary by how education is funded and by a grant industry), partisan learning theories (in which the neuroscientists are now investing a huge down payment), and good old-fashioned my-way-is-better-than-your-way faculty room argument.

“It is told in sounds in utter that, in signs so adds to, in universal, in polygluttural, in each auxiliary neutral idiom, sordomutics, florilingua, sheltafocal, flayflutter, a con’s cubane, a pro’s tutute, strassarab, ereperse and anythongue athall” (117).

Over at The Frontal Cortex, the reading discussion was lively but short, and our hungry mind wanted more. So once again we picked up Joyce and reread a few favorite passages (aloud, the better to taste and hear the words, to slurp and listen as the vowels (like Alice’s EAT ME cake) made us bigger and the consonants smaller), and then we perused a few articles.

Nicholas Lemann reported in a 1997 Atlantic article that “The dispute operates at three levels, which is one reason why it is so pervasive. It concerns how people learn, what schools should be for, and the essential nature of a good society.” This came three years after Art Levine reported in an Atlantic article that “In education no question has produced so much bitter debate for so long as this one: What is the best way to teach children to read?”

The debate continues worldwide, with no sign of abatement, and the political influences continue, as shown in a 2006 Guardian article featuring Oxford’s Kathy Sylva, in which she discusses legislative interests. Also in 2006, Sylva brought attention to the issue of learning reading in a teaching expertise interview; here we find her discussing neurons, signaling that as debate continues, it is now infused with new ethos borrowed from neuroscience.

Should the words go from the page directly into the brain through the eyes, or should the words be eaten first (eat your p’s), rolled around on the tongue, felt, then spat out into the ears to worm their way into the brain? 

We don’t value fast food reading; we want the old-fashioned, sit down meal. Words have substance: they are smooth or rough, loud or quiet, ticklish or jolting. Words leave bruises that other words salve.  Words rap and rip their way into our consciousness as we tear them apart with our teeth. Syllables slide like bumpy water. We want to eat the alphabet and spit out the seeds – now that’s reading.

A Different Brain: Reinventing Neuroscience from the Bottom Down

We saw Robert B. Laughlin lecture in Portland in 2005. It was Eric’s idea. He was taking a high school physics class, and there was a free ticket and extra credit in the wings, so we tagged along, always interested in what the physicists are up to.

The hall was packed. On the stage was a podium and an overhead projector. We had expected high-tech Excel files pasted into a slick PowerPoint. Instead, we got a speaker and cartoon drawings on the overhead. And it was brilliant (in the Roddy Doyle sense of the word). Laughlin was funny, accessible, engaging (a Q&A followed the lecture), humble, generous, challenging. Then the Nobel Prize-winning physicist sat in the lobby selling and signing his book: A Different Universe: Reinventing Physics from the Bottom Down.

Our brain, an old dog versed in verse, struggled a bit in parts of the lecture, wanted to chase a Frisbee in the park, get out and smell some dirt, so we looked forward to sharing the book with Eric and learning more about the universe. Eric had Laughlin sign the book; his signature looks like a nebula.

What can science tell us about life? In his preface to the book, Laughlin says, “Seeing our understanding of nature as a mathematical construction has fundamentally different implications from seeing it as an empirical synthesis. One view identifies us as masters of the universe; the other identifies the universe as the master of us…At its core the matter is not scientific at all but concerns one’s sense of self and place in the world.” One of these views he explains with a reference to John Horgan’s The End of Science, “in which he [Horgan] argues that all fundamental things are now known and there is nothing left for us to do but fill in details.”

That is the view of the brain taken by some of today’s neuroscientists, a view that has the seemingly infallible protection of the scientific method. Yet Laughlin moves on to describe a different view, “that all physical law we know about has collective origins, not just some of it. In other words, the distinction between fundamental laws and the laws descending from them is a myth, as is the idea of mastery of the universe through mathematics alone.” This is an untamed elephant in the science lab. And we’re only in the preface.

Emergence is Laughlin’s theme: “…human behavior resembles nature because it is part of nature and ruled by the same laws as everything else…we resemble primitive things because we are made of them – not because we have humanized them or controlled them with our minds. The parallels between organization of a life and organization of electrons are not an accident or a delusion, but physics” (201).

Laughlin likes quotes; they help him move his conversation forward. This one opens his book: “Not only is the universe stranger than we imagine, it is stranger than we can imagine” (Sir Arthur Eddington). This one opens the last chapter of the book: “A human being should be able to change a diaper, plan an invasion, butcher a hog, conn a ship, design a building, write a sonnet, balance accounts, build a wall, set a bone, comfort the dying, take orders, give orders, cooperate, act alone, solve equations, analyze a new problem, pitch manure, program a computer, cook a tasty meal, fight efficiently, die gallantly. Specialization is for insects” (R. A. Heinlein).

Juxtaposition to synthesize varying points of view casts things in new light. In the chapter “Picnic Table in the Sun,” Laughlin, describing some physicists’ conversation, says, “At any rate, by noon nobody’s brain would hold any more…,” and they move off to an outdoor lunch.

We find the physicists’ full brains hopeful; it suggests the need to digest, sleep, and let go – a need we all feel, regardless of the relative size of our brain. Here in this particular spot in the universe it’s morning, and we are thinking of some scrambled eggs, toast, and coffee. Then we’ll take a walk in the sun, and if we’re lucky, our brain will forget about itself, becoming just another part of us, no more, no less, another part of the universe.