Obviously the new owner of these little paintings is a musician. I’m honored to echo his array.
Celebrating the centennial of women gaining the right to vote in Oregon, Portland’s big public mansion on the hill will hold an exhibit—including three of my paintings—from July 14 to November 11. If you’ve never entered this unique house, learned the story of the pioneers who built it, or seen its serious vista of Portland and the surrounding mountains, do stop by. For the next few months we’ll get to see some compelling meditations on feminism on the walls, too.
Pittock Mansion, 3229 NW Pittock Drive, Portland
What comes along with declaring something “the oldest”? Looking at the photo published here, how does your reception of the image change when I say my nephew made these hand prints, that I made them with Photoshop, or they are the oldest paintings in Europe, perhaps created by a species not our own? The layers of description we attach to objects of visual craft can make them appear more or less beautiful, valuable and important—and those narratives tend to evolve.
A study published in the journal Science this past week has a lot of people talking about cave paintings. As reported by National Public Radio, The New York Times, The Christian Science Monitor and other news sources, a group of scientists has been reexamining cave paintings in Spain, and they’ve concluded that many of the paintings are older—tens of thousands of years older—than they thought previously. This has led to some big narrative revisions.
On Friday the Monitor published a lead typical of the reportage: “New tests show that crude Spanish cave paintings of a red sphere and handprints are the oldest in the world, so ancient they may not have been by modern man.” That is, new numbers from a new technique for dating these cave paintings have led some to believe it’s possible they were done by Neanderthals, not humans. And that means Neanderthals were more human than we thought, capable of symbolic, aesthetic expression, maybe able to use language like modern humans do. Or the discovery means none of these things, but it’s really fun to attach new stories to old objects.
The actual study, titled “U-Series Dating of Paleolithic Art in 11 Caves in Spain,” gives authorial credit to eleven researchers, led by Alistair Pike from the University of Bristol. In a phone interview on NPR’s Science Friday, he explained that carbon dating of organic materials used in paint becomes inaccurate when testing things more than 30,000 years old, and since some Paleolithic painters and engravers didn’t use ground-up plant or animal matter to make their pictures, often there is no carbon to date. So his team used a method based on the predictable rate at which uranium decays to thorium. Testing trace amounts of uranium in layers of calcite deposit that slowly grow over paintings in caves allows the team to come up with a minimum age for the paintings. If a layer over a painting is 40,000 years old, then the painting must have been put on the wall more than 40,000 years ago. The “oldest known” painting the team has found so far is a red splotch among some hand prints put on a cave wall in Spain at least 40,800 years ago.
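For the curious, the minimum-age logic can be sketched with a toy version of the uranium-thorium math. This is my own simplification, not the study’s actual procedure: it assumes the calcite formed with no thorium in it and that the uranium isotopes start in equilibrium, whereas the real work has to correct for contamination. The function name and sample ratio below are illustrative.

```python
import math

# Half-life of thorium-230, in years (a well-established physical constant).
TH230_HALF_LIFE = 75_400
LAMBDA_TH230 = math.log(2) / TH230_HALF_LIFE  # decay constant per year


def uth_age(th230_u234_activity_ratio: float) -> float:
    """Return a simplified uranium-thorium age, in years.

    Toy model: assumes the calcite started with zero thorium and
    that uranium-234 is in equilibrium with uranium-238. As the
    thorium-230 builds up, the activity ratio approaches 1, so
    age t satisfies: ratio = 1 - exp(-lambda * t).
    """
    r = th230_u234_activity_ratio
    if not 0 <= r < 1:
        raise ValueError("ratio must be in [0, 1) for this simple model")
    return -math.log(1 - r) / LAMBDA_TH230


# A calcite layer whose thorium/uranium activity ratio has crept up to
# about 0.31 works out to roughly 40,000 years old under this model,
# which is then the *minimum* age of any painting sealed beneath it.
age = uth_age(0.31)
```

The key point the sketch makes concrete is that the method dates the calcite crust, not the paint, which is why the result is a floor on the painting’s age rather than the age itself.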
Pike said he believes there’s a “strong probability” Neanderthals did that painting because other research has concluded that the earliest humans arrived in that area about 41,500 years ago. That date and the earliest cave painting date are precariously close together, and if the team can revise the dates on other paintings, finding even older examples of “enigmatic symbols” on walls, that would mean some nonhuman put them there. At that time Neanderthals had occupied the southern peninsulas of Europe for at least 200,000 years. This is really provocative stuff in a scholarly world that intensely debates whether Neanderthals could do much of anything creative, such as invent new tools, make music, speak or stitch complicated clothing. If Pike’s team is right, our ideas about Neanderthals will change forever. Human people wouldn’t be the only people who made art.
I find all of this fun to reconsider. Pike and his team are saying a lot with this research, but only the first part of their media message seems scientific to me. The rest is pure, beautiful storytelling. They’re trumpeting what they consider to be a superior method for measuring the age of ancient cave paintings. That’s exactly what good research should do. It uses specific, carefully collected evidence to draw modest conclusions. When those conclusions shift our ideas about what came first or from whom, then certain real-world consequences follow. In this case research takes away the label of “oldest known” from the highly detailed, figurative animal pictures in Chauvet cave, France, and plants it on a red splotch in Spain—for now. (Who knows where that label will land when they’ve completed their research on the thousands of paintings yet to be uranium dated.) Will this shift tourism a little? Maybe enhance Spain’s reputation as a place to invest in archeology? Will international attention mean money flows differently in other ways? Spain could use the help these days.
Along with that good work, though, Pike’s group of scientists is telling a speculative story about what it means based on a double standard: They say their work means it’s likely Neanderthals made these pictures. On one hand, Pike et al. are revising a number: The “oldest known” cave paintings are no longer 35,000 years old; they’re 40,800 years old. On the other hand they’re holding tight to a number that others see as suspect: The first time humans set foot in Spain was 41,500 years ago according to Pike’s group. That number comes from a questionable set of cross-referenced sources: some tools attributed to Paleolithic people and rare discoveries of bones. However, I recently read a study published last year in the journal Nature that estimates human arrival in Europe took place at least a couple thousand years earlier. Who is right? With better methods for dating the evidence or new discoveries, will Pike et al.’s date of arrival look way off?
Their work has already been directly questioned by colleagues on other grounds. In the NPR story linked above, archeologist Pat Shipman said she wonders why Neanderthals, who had been in Europe for hundreds of thousands of years, all of a sudden started making art at the same time as humans arrived. She finds it more likely humans brought artistic practice with them from Africa.
A red splotch in a cave tells us about ourselves mainly what we tell ourselves about it. Crouching underground, faced with the “oldest known,” it seems even professionals can’t help letting their imaginations chew on a red splotch and digest it into something earth-shattering. Does a red splotch in its essence shine through all those layers of story and calcite? What does this tell us about our relationship to the more quotidian pictures we see? What layers of description turn a good drawing bad? What caption makes a dull photo compelling—or a gripping painting into a lie?
This painting, which pictures a former companymate of mine, now hangs in a beautifully restored house in Louisville, Kentucky, home of the Kentucky Derby. The photographer shot this from a sitting room right next to an open kitchen, perfect for guests and host to enjoy conversation while dinner is stewing.
I don’t know how to make a masterpiece, but somehow a few of my paintings will show in Small Masterpieces, opening June 1. Mel’s Frame Shop in downtown Portland will hang dozens of little beauties, all measuring one foot by one foot, and all priced to sell. The show runs to the end of July, but get there early to get an excellent picture for a steal.
Join me and several other local painters for a reception Friday, June 15, 5-8pm, with all the usual goodness of drink and chat.
Mel’s Frame Shop: 1007 SW Morrison Street, Portland, Oregon
According to a book of interviews (Conversations with Ian McEwan, 2010, edited by Ryan Roberts), novelist Ian McEwan has been putting down English scholars for decades. This is funny because McEwan isn’t the kind of author who believes school is for fools. His characters are usually upper-middle-class intellectuals, composers, doctors, newspaper editors—well-schooled people. And he has big support in universities. Scholars research, rehash and teach his books in classrooms throughout the English-studying world. They’ve helped make him one of the most lauded living authors.
All the same, he thinks the humanities run on a ridiculous model. For example, in a 2005 interview with sculptor Antony Gormley, McEwan says, “One has to be wary of the delicious, seductive joys of pessimism. In art it can turn into an empty mannerism. And in the universities, in the humanities, all intellectuals are required to be card-carrying pessimists. You have to go to the sciences today to find any real sense of wonder, any real joy in the intellectual life” (141). That’s harsh—and maybe a bit true.
We hear a lot about the crisis in the humanities, that people aren’t paying attention to the good work going on in the departments of English or African American studies or history. Those scholars feel ignored while theoretical physicists get lots of press for discovering New Secrets of the Universe (cover of Newsweek, issue for May 28, which I just saw on a newsstand). Why aren’t the latest studies in contemporary Asian American literature splashed across mainstream magazine covers? The latest speculative theories in cosmology probably don’t have as big an impact on our civic and social lives as the ways immigrants and their children view themselves. Is it that the universe is more universal? Stars are prettier than the ins and outs of race? All this makes me want to unpack McEwan’s “pessimism” warning.
For McEwan, studying the arts in higher education has become more about fighting than it needs to be. That may be accurate, but we should also remember the humanities depend on productive clashes in ways hard sciences don’t. I defended my dissertation a year ago in a humanities department, and as often happens when I see academics gather to talk about art criticism, the event—a big conference table surrounded by professors and grad students—reminded me of sailing in rough seas. It wasn’t a hurricane, but it did turn into an exciting little storm. Claims and counterclaims swelled and collided. One senior professor argued against another on the fundamental value of Derridian deconstruction versus consilient approaches. Another accused me of reducing combat painting and choreography to text. These were big issues, as if they were asking, “Does this two-year project have any value at all anywhere?” I turned my rhetorical rudder into the swells, cited memorized notes to keep headway and reefed my sails tightly so my opponents wouldn’t get hold and tear them. Afterward they shook my hand, called me “doctor,” and met me at a bar to drink beer and shoot pool. The next day I began using the ocean of notes I had taken at the defense to fix the little holes my committee pointed out and publish a better final product, one that floated.
My dissertation had a multidisciplinary bent, art criticism borrowing insights from psychology, biology and medical science. As a result I sat in the audience for plenty of defenses on topics ranging from sodomy in Medieval French literature to the neuroscience of children who stutter. The mood among young scientists in their defenses shocked me at first because everyone was so calm. Nobody challenged any basic theoretical underpinnings. That had all been sorted out earlier. At this stage, professors seemed most interested in making sure the methods were sound, saying things like, “How did you control for the child’s blinking during this phase of the study? Did you consider another way to decide which p-waves were outliers?”
Humanities scholars are trained more in attacking and defending philosophical positions than they are in cooperating toward accumulating a body of knowledge. By contrast, most scientists in a given field agree with each other on the fundamentals. They want to do studies and repeat them. They believe accumulating data matters, and eventually looking at everyone’s data lets them come up with some narrative for how the body of evidence fits. There’s a sense that most everyone in the department is on the same team, trudging together carefully into the unknowns of speech pathology or cellular biology. (Of course in science we see plenty of outsize egos, mistakes and snubs—but the model seems to yield lower average blood pressures.)
McEwan seems annoyed at the philosophical disagreements so often voiced among humanities scholars. I think that’s where his word “pessimism” comes from. There’s little feeling that we’re building on each other’s work. We’re attacking, deconstructing, building on ashes instead of standing on the shoulders of giants. If we don’t watch out, that feeling of pessimism can saturate our reception of the books we read and the sculptures we see. In that environment, showing unabashed enthusiasm for an aesthetic experience puts our hearts in a vulnerable position, open to attack from those who disagree. Maintaining inquisitive distance from the novels we love lets our interlocutors trash our claims without trashing us. As McEwan says, pessimism can feel “seductive” and “delicious.” We can use a novel to point out injustices and declare “we’re doomed” without really feeling terrible about it (141). The next scholar can do the same in another direction.
The way McEwan sees it, pessimistic disinterest teaches people to close ourselves off from a novel’s most important quality, “its peculiar ability to get inside minds and show us the mechanics of misunderstanding, so you can be on both sides of the dispute.” He goes on: “we have evolved a literary form that I think is unequalled in its ability to get inside the nature of a misunderstanding” (85). Feeling close to the characters—feeling so close we’re in their heads—is key not only to finding a kind of joy in watching them flounder through their predicament, but also to learning something about people.
What I love about McEwan, besides his fantastic facility with a sentence, is his openness to lots of ways to understand people. As I’ve hinted, he samples from across the spectrum. Asking himself how a writer should represent time, he says things like, “Biological thought has made it possible to rub the emotional against the scientific in a small scene like that” (102). I find that idea beautiful, sampling across disciplines, rubbing one against the other, seeing what resonates.
Matthew Burriesci, Executive Director of the PEN/Faulkner foundation in Washington, D.C., uses my portrait of him to help emphasize the gravity of a situation during a meeting in April 2012. I’m proud to picture the leadership of nonprofit arts organizations wherever I can.
Giovanni Boccaccio wrote The Decameron after the bubonic plague hit Florence in 1348. It has a self-consciously bawdy feel and to a large extent seems to be a study in loose morals—how they develop and why they’re fun to talk about. Seven young ladies and three young gentlemen go off into the country, shirking their familial obligations in the face of the Black Death. They decide each will take a turn as king or queen for a day, ordering the others to hang out in the garden of an abandoned house drinking lots of wine, singing songs and, naturally, each telling one story every day. (“Decameron” comes from the Greek for “ten days,” though to me it rolls off the tongue more like the name of a villain from a gothic novel or a Star Wars monster, large and drooling.) Corrupt friars; salacious princes; murderers; drunks; star-crossed lovers who tend to die by each other’s side (especially in the fourth day stories); a knight who hunts down a woman and feeds her heart to his dogs (fifth day, eighth story); a gorgeous lady who gets shipwrecked, sleeps around foreign kingdoms for years and lies her way back into respectable society (second day, seventh story); and so on for a total of 100 little fictions.
The translators for the version I read, Mark Musa and Peter Bondanella, say they think Boccaccio finished writing this series no later than 1352, four years after a third of his city died. So the plague, an actual brush with the end of civilization, is fresh for the author and his target audience. In his introduction, Boccaccio keys on a thought-provoking conception of freedom that I suppose must have felt chillingly familiar to the “most gracious ladies” to whom he writes (5). “And in this great affliction and misery of our city the revered authority of the laws, both divine and human, had fallen and almost completely disappeared, for, like other men, the ministers and executors of the laws were either dead or sick or so short of help that it was impossible for them to fulfill their duties; as a result, everybody was free to do as he pleased” (8). This post-apocalyptic world at first glance seems kind of familiar. In our zombie movies and after-the-nukes novels, the rule of government breaks down and gets replaced by some other code, the love of family or another sense of duty. The main characters are profoundly free, and as a result they get in touch with a deep sense of human goodness. They risk their necks machine gunning through vampires or they sacrifice themselves to cannibals to save their brothers and sisters.
Freedom in Boccaccio’s telling affects people in a less noble way. In his recollection, the freedom that came with local government’s collapse meant “almost no one cared for his neighbor,” while “relatives rarely or hardly ever visited each other—they stayed far apart.” Not only that, there seemed to be “no shame whatsoever” while the disaster became “the cause of looser morals in the women who survived the plague” (9). This is background info, in the introduction, and we expect the arc through the book should lead to some self-realization or redemption for the ten main characters who take off to green pastures while their friends die. Yet besides some self-congratulations for not doing anything really terrible, we get no indication these young people develop at all or have any redeeming qualities. They go through no trials, no crucibles of deeper human connection. They just drink, joke and tell lurid stories, then pack up and head back.
It’s something to pause on: How beneficial was the rigid social structure Boccaccio gently pushed against? He’s no hedonist, after all. He’s satirizing the tight-collared, chastity-belted dictates of nobles and church officials as well as their transgressions behind closed doors. He writes at what we now see as a time between times, and the shift must have been disorienting. In the middle of the fourteenth century, the city of Florence was helping to kick off the Italian Renaissance, that big reaching back to the art of Ancient Greece and Rome for new ways to examine all kinds of things, including pleasure itself. A central theme of the book is that pleasure per se isn’t sinful. It should be ok for young people to go out into the country and tell stories about sex and murder, as long as they don’t act out those stories. But that’s where Boccaccio’s progressivism stops. Our ideas about basic gender equality, for example, have no place in his tales. At the start of the story, one lady says to the others, “Remember that we are all women, and any young girl can tell you that women do not know how to reason in a group when they are without the guidance of some man who knows how to control them. We are fickle, quarrelsome, suspicious, timid, and fearful” (15). Boccaccio has her say this as a quick device to get the women and the men to meet up and go on their way. As tone deaf as it is to us, to him it’s both efficient and believable.
I’m trying to imagine how disorienting Boccaccio would find our social normal, where national debates center not on whether women should be allowed to vote or own businesses or whether gay couples should be allowed to live together without harassment, but on whether we need specific laws to make sure women get equal pay or to guarantee marriage and adoption rights for other than heterosexual couples. Would Boccaccio, known for his celebration of freedom, see this as too much?
In his conclusion, Boccaccio stresses the idea that freedom is dangerous. Stories about freedom, which he has just told, should be viewed like weapons, or like wine or fire. Wicked people will use them wickedly. He can’t be blamed for making people wicked. Only people who are already bad might go off and do the things he writes about. (I don’t read this as a warning that wicked people with power might use these stories to keep the masses down. There’s no Orwell in Boccaccio.) He shouts at his reader, “A corrupt mind never understands a word in a healthy way!” (686). I wonder how Boccaccio would handle a Friday night out in Portland. Would he be more comfortable hanging out with the Taliban?
Randall Jarrell, the critic and poet best known for World War II poems such as “Death of the Ball Turret Gunner,” wrote an essay in the mid-1950s responding to the hot visual art craze of his day. Titled “Against Abstract Expressionism,” it makes the case that the action-painting club of Jackson Pollock and Willem de Kooning deserves a devil’s advocate. Such expressionist painting, he argues, isn’t the “purified essence” of modern Western art. It’s a “specialized, intensive exploitation of one part of such painting, and the rejection of other parts and of the whole” (340).
Critic Clement Greenberg’s famous dictum—“Paint is paint; surface is surface”—hovers backstage of this essay without Jarrell actually mentioning it. In 1957 he didn’t have to. Greenberg’s pure modernism was everywhere, with sculptors, choreographers, poets and other fine artists each pressing on a chosen medium to more directly reveal its supposed inherent properties and purposes. Photography, for Greenberg, was best suited to picture the real world. Prose was best for telling stories. Painting should try to do neither.
Jarrell can’t stand that idea, and God bless him for writing this: “When we are told (or, worse still, shown) that painting ‘really’ is ‘nothing but’ this, we are being given one of those definitions which explain out of existence what they appear to define, and put a simpler entity in its place” (341-42). By calling painting “nothing but” paint and surface, Greenberg and the Abstract Expressionists are not reducing painting to its essence, they’re replacing painting with something simpler. (Kind of like certain neuroscientists these days claiming they can explain love by watching which parts of your brain “light up” in an fMRI scan.)
A figurative painter has to agree with Jarrell’s fundamental gripe. If paint and surface are just that, then why is a Rembrandt self-portrait so much more fun to look at than the bare gallery wall? The gripping experience in viewing a painting comes in the way we negotiate two ways of seeing. We walk into the room and discern its material, flat presence at the same time as we see into the illusion it presents of a face or a barn or a twisted color field that seems beyond any words we might use to describe it. Jarrell, as a poet, puts it this way: “In the metaphors of painting, as in those of poetry, we are awed or dazed to find things superficially so unlike, fundamentally so like; superficially so like, fundamentally so unlike” (340).
However—and for me this is a hefty “however”—we shouldn’t be throwing Greenberg’s baby out with the bath water. After all he’s right that a painting isn’t a photo. And it isn’t a story or a rhyme or a person. It can refer to these things (and if a painter includes text in the picture, it can contain some of these things), but painting is open to rules, methods and textures that aren’t available through any other means of making a picture, and it’s really easy to forget to explore those. Greenberg was educated by people who saw photography as the best means for recording the visual world. Their response was to laud the work of Picasso, Cézanne, Monet and others who showed how painting could be freed from the burden of picturing some universal real. They used painting to picture dreams or the painter’s idiosyncratic impression of a sunset. Greenberg and his cronies argued from that point of view all the way to the stilted, narrow corner of painting called Abstract Expressionism—their mistake.
Yet the dangerous idea that photography is the world in a box hasn’t left us. With Facebook and Hipstamatic on our smartphones, it isn’t a night out or a birth unless we bludgeon the room with flashes and send the jpegs to our “friends.” What Greenberg et al. astutely, if snobbishly, bring to the fore is that painting can be viewed as a unique technology, a special mode of communication worth preserving and celebrating. Watching a digital film in high definition 3-D doesn’t take the place of attending a live production of Giselle, Don Giovanni or Death of a Salesman. Painting is irreplaceable like ballet, opera and theatre.
Jarrell’s essay shows up in the middle of a collection called Writers on Artists (1988), edited by Daniel Halpern. I bought it for cheap at Powell’s huge bookstore, and it’s full of diverse takes on painting by everyone from Susan Sontag to Ernest Hemingway to Italo Calvino.
The venerable Maude Kerns Art Center in Eugene will show two of my paintings, Natural Causes (Violin) and a violin and jeans picture, in a music-themed exhibit opening April 6. A reception happens the following Friday, Apr. 13, from 6 to 8 pm, with all the usual fixins. I’ll be there with a bunch of neuroscientists from the University of Oregon who love researching good pictures over sips of good wine.
Maude Kerns Art Center, 1910 E. 15th Ave, Eugene, Oregon.