Monday, December 20, 2010

Congress: The Musical


You must, of course, imagine the music. Perhaps ruffles and flourishes to precede the actors: the statesmen, or “the lovers, liars and clowns!” as the opening chorus from A Funny Thing Happened on the Way to the Forum put it.

Indeed, hand-wringing, wrist-to-forehead angst over gays in the military has been one of the longest-running shows in Washington—a tragicomedy of epic proportions—ever since Don’t Ask, Don’t Tell (DADT) was passed seventeen years ago. The stars of the most recent cast are the Republican senators who crossed party lines to do the right thing—for a change, literally. Senators Collins, Murkowski, Kirk, Brown, Snowe, Voinovich, and reluctantly Ensign and Burr are the glitterati of the moment.

Obama has proven to be a disappointment in the role of President. Unlike Truman, who integrated the military by Executive Order 9981 in 1948, thus summarily ending decades of racial segregation and second-class soldiering by African Americans, Obama chose to push for congressional repeal to end decades of anti-gay discrimination stretching back long before Congress enshrined it in law with DADT. But, then, Obama’s been playing Stepin Fetchit to the Republicans for two years, a bit part when he should have been playing the lead.

The real stars of DADT haven't seen much limelight. They are the men and women who “told” and so got quietly robbed of their service careers, and the nation has been the poorer for the all-too-easy dismissal of their talents and dedication for nearly two decades.

Finale: Ta-da! Like magicians collectively whipping open the cabinet (or closet) door, Congress—applause, cheers, whistles!—at last has decided to allow gay men and lesbians to serve their country—as they always have, often with honor and distinction—now openly, without congressionally mandated lying about who they are and who they might or might not love or at least have sex with.

You have to admit that this show has had something for everyone: something familiar, something peculiar; something appealing, something appalling! Certainly appalling.

Tuesday, December 14, 2010

The Art of Being: A Memory


In 2010, when the entire world seems to be represented virtually on the Internet, one would think that everything and everyone can be googled. Occasionally I google my name, partly as ego affirmation, partly to find out where my work is available or has been reviewed. Because I write, I turn up on several pages and literally hundreds of sites. Doesn’t everyone?

Apparently not.

On Sunday, December 12, my brother, Michael Allen Walling, succumbed to pancreatic cancer at the age of 56. He’d been a good kid brother; a good big brother to our sister, Carolyn; a good husband. He’d served in the Navy and gone on to a useful post-military career. (The photo inset shows him in his lab aboard the USS Enterprise aircraft carrier in 1976-77.) So out of curiosity I googled his name.

Nothing.

Not a single hit.

I suppose at some point there might be an obituary that Google could find. But that seems such paltry recognition for a life well lived.

Many people—famous, infamous, ordinary—can be googled. Not finding my brother on the Internet, however, reminded me that far more people practice the sometimes not-so-simple art of simply being. They live on, then, not virtually in the electronic ether, but in the memories of those they loved and by whom they were loved.

And that’s art enough.

Friday, October 29, 2010

Sitcom Politics


Back in the day when media guru Marshall McLuhan said, “The medium is the message,” he wasn’t kidding. The medium not only is the message, it also shapes the mind. Our expectations have been molded so thoroughly by the solve-it-in-thirty-minutes sitcom storyline (or an hour if you’re into TV dramas) that we’ve translated that mentality into real life.

Fact is, life isn’t a sitcom or an hour-long drama. Families with kids aren’t nearly as funny as those on “The Middle” or “Modern Family,” show choirs never look like the ones on “Glee,” and modern crime-fighters—no matter how high-tech—have nothing remotely as cool as the scientific and technological gadgetry on “Bones” and “NCIS.”

Yet here we are, coming up on midterm elections, with many voters expecting the multiple crises that took years to create during the two-term presidency of George W. Bush to have been solved within the first two years of the Obama presidency. The Bush presidency took the economic surplus that was the Clinton legacy and converted it into massive debt and an American-led global recession, helped along by two ill-conceived wars and a lack of proper oversight of the financial and housing markets—and a lot of other things that ought to have been seen to.

The fiction of wishful thinking so glibly purveyed by rightwing pundits and their puppets won’t change reality. But with sitcom mentality firmly in place, it doesn’t have to. Conservative politicians who have sat on their hands for two years, parroting Nancy Reagan’s “just say no” to any collaborative strategies to solve the nation’s problems, are now waving their hands in the air and, like Chicken Little, squawking that the sky is falling. Their solution: Elect politicians from the party that made the mess in the first place.

Mess? What mess? Oh, that’s right. The Bush presidency was over half an hour ago. All has been forgotten now—never mind the men and women still out of work and still dying in Iraq and Afghanistan.

In 1967 Marshall McLuhan published The Medium is the Massage, a word play on his famous saying. But it was a well-considered pun. McLuhan used the word massage to describe the effects media have on human senses. He wrote:

All media work us over completely. They are so pervasive in their personal, political, economic, aesthetic, psychological, moral, ethical, and social consequences that they leave no part of us untouched, unaffected, unaltered. (p. 26)

In some ways the media have positively reshaped human awareness, but in other ways we have been diminished. If we expect the impossible because we have seen the impossible realized so often in fiction that we now confuse it with real life, then we have arrived at a truly sorry state.

Thursday, September 16, 2010

Forgive Me, Hector Berlioz…


My mind was elsewhere at times during the Symphonie fantastique. You see, I have been giving thought to writing about teaching religious freedom in an era of extremism. So rather than being focused on the music with the intensity of the four harpists who sat rapt during the entire piece, even though they played only briefly, my thoughts wandered about like the free-range percussionists who roamed on and off stage, playing occasionally and at other times presumably pitching pennies against a backstage wall.

Recent controversy about the building of Cordoba House, a mosque and Islamic center, not far from the site of the former World Trade Center in New York City, set me thinking about religious extremism, which so often shapes a national conversation about religion that invariably is interwoven with politics. “Freedom” of religion is never a tenet of extremism; indeed, free exercise of religion, free thought of any kind, is anathema to religious extremists of every stripe. Our domestic Christian extremists—whom some have labeled “Christianists” to distinguish those who wield Christ like a club rather than actually believe in and attempt to live the teachings of Jesus—are every bit as rabid as the Muslim extremists who destroyed the Twin Towers.

As yet, thankfully, most of our homegrown religious extremists are verbal terrorists, and most haven’t resorted to bombs. But they are equally destructive in other ways, perhaps most notably in the classroom. Limiting learning and dumbing down curricula have become extremist tools of choice that will have a profound and lasting effect. Extremists know that the real way to change the world is to pervert the teaching of the young. Inculcate extreme beliefs early, and they can produce an army of children that will believe anything; such is the malleability of the unformed mind. It’s how our nation gets four-year-olds carrying anti-Islam signs they can’t yet read and how our world gets ten-year-old, gun-toting soldiers in places like Burma and Sierra Leone. Extremists don’t think of any of this as child abuse, but that’s what it is.

So Leonard Slatkin masterfully conducted the Indiana University Philharmonic, and the Symphonie fantastique soared beautifully last evening, bringing the audience to its feet for the final ovation. But I was mentally present only part of the time. I wonder what Berlioz would have thought. A French Catholic by birth in a more generally religious period than our own (he lived from 1803 to 1869), Berlioz was claimed by the faith in the Catholic Encyclopedia, though the reference admits that he did not remain faithful. The composer himself often wrote in letters that he was an atheist. He declared shortly before his death, “I believe nothing.”

Freedom from religion is implicit in our American democratic tenet of freedom of religion. Not that it applied to Berlioz, but we should also remember that atheism can be taken to extremes. But there I go again, wandering offstage away from the music. Forgive me, Hector Berlioz.

Sunday, August 29, 2010

Rediscovering Norman McLaren


Recently I ran across a copy of Canadian filmmaker Norman McLaren’s 1971 short film, Synchromy, which took me back to my early career when I was teaching art and English to middle school and high school students. Film study, in my mind, always exists in the dual worlds of images and words—and color, sound, movement, and all the rest. But McLaren’s film animations, most of them entirely wordless, are firmly anchored in the visual, kinesthetic, and musical arts.

We owe a debt of gratitude to the National Film Board of Canada, which supported McLaren’s work for many years and allowed him to give us a wealth of stunning films that are, by turns, lively, thought-provoking, celebratory, and meditative. McLaren’s animation techniques are perhaps as intriguing as his images. At various times he used animations that included a mixture of moving and still photographs, as in the Oscar-winning 1952 allegorical film, Neighbours. Other films, such as Synchromy, are Modern artworks in motion. The visuals of this particular film evoke the paintings of Piet Mondrian and Barnett Newman, both contemporaries of McLaren. And McLaren also drew directly on film (see photo).

As innovative as the visuals are McLaren’s soundtracks, many of which he composed. Particularly haunting is the music of his 1968 Pas de deux, which stands in contrast to the jazzy electronic beeps and blips of Synchromy.

Scottish-born McLaren (b. 1914) emigrated from Britain to the United States in 1939 and then moved to Canada in 1941. He was most active from the mid-1930s until his death in 1987 at age 72. In McLaren’s view, “Animation is not the art of drawings that move but the art of movements that are drawn.”

Introducing students to McLaren’s studies in movement and music in the 1970s and 1980s was like opening the curtains on a wonderful art world for the first time, a delight as much for me as a teacher as for my students. I suspect that for many students that would be just as true today, more than twenty years after McLaren’s death.

Friday, August 13, 2010

Personal Impressionism


After a restless, uncomfortable night of disturbed sleep, and after a breakfast of coffee and a bagel in a favorite eatery, I decided that a solitary stroll on the campus of Indiana University might bring a centering peace that would sustain me during the day ahead. It has been a long, hot summer, and early mornings are the only time of day that brings relief from the heat wave.

For some reason, I can’t say exactly why, I decided to walk around without my eyeglasses, perhaps simply because I needed to see the world in a different way, a way less troubled by the details of existence, of reality. What I discovered was a form of personal Impressionism.

Shapes, though recognizable, took on new dimensions. As positive space became less distinct, negative space asserted itself: fragments of blue sky among the leaves and branches of trees, shadows punctuated by sunlit stones or shrubs. I found myself reaching out to touch the rough bark of trees and to feel the texture of leaves. Instead of seeing individual flowers, I saw masses of color and visual texture. I stopped, literally, to smell the roses on a primrose bush.

Those of us who wear corrective lenses are fortunate to be able to tap this personal Impressionism at will. As visual acuity differs among individuals, I suppose we all create a singular view, as distinct from one another as the works of Renoir are from, say, those of Monet. And yet, there also are commonalities in that our less-than-perfect vision renders something less than a perfect reality: an impression. And because it is an impression, it enables us to see reality in a different—and in my case, refreshing—way.

It was with a sense of peace and renewal that I walked back to my car. There, I put my glasses back on. After all, safe driving requires attention to reality. But my remembrance of the images of my personal Impressionism lingered agreeably.

Wednesday, July 21, 2010

Harp Competition Really About People


For the past two weeks Bloomington, Indiana, specifically Indiana University’s Jacobs School of Music, was the center of the harp world as the eighth USA International Harp Competition unfolded. A triennial event, the USAIHC this year attracted 39 competitors from around the globe, all of them young women between the ages of sixteen and thirty-two.

My partner and I are not harp cognoscenti, but we have always enjoyed getting acquainted and interacting with international students. Thus we volunteered to host a harpist, which is how nineteen-year-old Agnė Keblytė (photo) from Lithuania ended up staying with us. What a delight she was! Indeed, the homestay experience was universally fun and rewarding for hosts and guests alike.

For us, the competition wasn’t the true centerpiece of the two weeks. Rather, it was the people themselves: the music school officials, from deans and faculty stars to student harp movers, who worked tirelessly to pull off this major international event; the harpists, performing through successive stages of elimination (Agnė got to Stage 2); the host families, making sure their harpists not only got to the competition venues relaxed and on time but also had a chance to see something of the real Bloomington; and the guest performers, both young and experienced, who entertained our community brilliantly.

Competition on this level draws only the best, so the notion of “winners” and “losers” does not apply. All of the harpists were accomplished and performed beautifully. I gained an appreciation of the harp as a solo instrument, whereas previously I had seen it only as an instrument somewhere back in the mass of the symphony orchestra. In particular, a guest recital by the 2004 gold medal laureate, Emmanuel Ceysson, brought this home. Ceysson, still under age thirty, took up his current position as principal harp of the Paris Opera Orchestra not long after winning the USAIHC prize. And what a marvel he was to watch—and hear—in recital!

Another highlight was a “stars of tomorrow” recital given by five harpists, all national junior champions under the age of sixteen. Later, seeing them frolicking with the harp competitors at a pool party reminded us that these incredibly talented young people are, underneath the formal wear and beyond those elegant instruments, just kids—kids who like splashing one another in the swimming pool, chowing down on hotdogs and chips, and laughing and joking with one another in half a dozen languages.

In the end it always comes down to people.

Saturday, April 24, 2010

June and Gypsy


When actress June Havoc died a few weeks ago at age 97, an era truly ended.

End-of-an-era labels are easily applied without justification, but this case is different. June’s long career began in vaudeville, and she was one of the few remaining performers who could make that claim. But there are other reasons that make her end-of-an-era label spot on.

Born Ellen Evangeline Hovick in Vancouver, British Columbia, she was the apple of her mother’s eye — and the root of her stage mom’s hopes for success in vaudeville. Ellen/June actually was American, her mother, Rose Thompson, having married a Norwegian American named John Olaf Hovick. When Rose divorced John, it was Ellen as a child vaudeville star who, along with a sister named Rose Louise, kept income flowing for the struggling family. Ellen’s stage name was “Baby June.” The “June” stuck even after she fled her stage mother’s overbearing presence in 1928, when she was about 15.

Rose Louise, called Louise in the family, took over the “baby” franchise, which she soon outgrew. Mama Rose’s hopes could have been dashed right then, as vaudeville was in its death throes. But vaudeville was being superseded by burlesque, and it was there that Rose Louise blossomed — as Gypsy Rose Lee, the iconic stripper.

June and Gypsy’s story was immortalized in the hit stage musical Gypsy, which premiered on Broadway in 1959. It featured a book by Arthur Laurents, music by Jule Styne, and lyrics by Stephen Sondheim, based on Gypsy’s own 1957 self-titled memoir. Gypsy’s was the flashier story — especially with the titillation of her having been a striptease artist — and she could write it. (She also wrote mystery novels, notably The G-String Murders; I have a copy on my bookshelf.) But June’s career post-vaudeville was not without note.

As June Havoc, Ellen Hovick gained a measure of fame on Broadway in shows such as Sigmund Romberg’s Forbidden Melody in 1936 and later Rodgers and Hart’s Pal Joey. In the Forties she moved to Hollywood, where she starred in Gentleman’s Agreement (1947, see photo) and other films, continuing to play various roles well into old age, and well after her more famous sister, Gypsy, had died in 1970 at age 59.

June Havoc’s long life and career amounted to a quintessential Hollywood tale — an overbearing mother, a more-famous sister, and a somewhat tawdry undercurrent of marriages and divorces, all overlaid by the sequin-studded success of show biz. June even wrote about it herself, a couple of times. Predictably, she used her first memoir, Early Havoc, as the basis for a stage play of her own, Marathon ’33, which played briefly on Broadway, starring Julie Harris.

Friday, March 5, 2010

Update or Adapt a Classic?


Recently I saw a university production of George Bernard Shaw’s 1905 play, Major Barbara. Shaw’s usual themes of capitalism versus socialism and class conflict are front and center in this comedy-drama about a Salvation Army major whose father is a successful arms manufacturer.

One might think that setting this play in the 1960s—the Vietnam War era and a time of major social upheaval—would work well. However, in this updating the play was still set in England, which experienced the Vietnam conflict only peripherally. And Britain’s social upheaval of this period was centered mainly in fashion and popular music.

Leaving aside the dreadful British accents and the apparent direction to the actors to speak their lines fast and loud, which rendered half the dialogue in this production incomprehensible, what remained was a staging that jarred against the play’s central themes. Sure, the Sixties fashions were fun for both actors and audience, but in other respects Shaw’s work was almost wholly obscured, at best reduced to flash and drivel.

Shaw’s play, in hindsight, is piquant not only because of the playwright’s sharp-tongued dialogue but also because the themes in 1905 seem prescient. World War I loomed less than a decade into the future when this script first hit the boards in London. American audiences likely found the play even more pertinent when it opened on Broadway in 1915.

If the play were to be set in the Sixties, a full-out adaptation might have made more sense, moving the action from England to the United States and tweaking the dialogue accordingly. America in that era was feeling the effects of an unpopular war and a cultural revolution, the latter driven not merely by music and fashion but by the civil rights and women’s rights movements—all of which resonate more strongly with Shaw’s themes than what was happening in Britain at the same time.

Whether to produce a classic such as this one as it was written, to update it mainly by moving the action to a more recent time, or to adapt it more deeply presents many challenges. Will the original seem outdated? Will an updating hold too many anachronisms to be effective? Will a comprehensive adaptation go too far and lose the thrust of the original altogether? These are difficult questions.

The British Sixties setting of Major Barbara in this instance was chosen without sufficient regard for these questions, and so the update came off as superficial and did little justice to Shaw’s ideas, quite apart from massacring his language.

Friday, February 12, 2010

Separation of Church and Stage


A touring production of Jesus Christ Superstar recently hit town. Despite its longevity, this early Seventies hit for the duo of Andrew Lloyd Webber and Tim Rice still stirs controversy. One inevitable criticism of this particular tour stemmed from Ted Neeley’s seemingly endless reprise of his role as Jesus. Neeley, who starred in the 1973 film of the rock opera, is now 67 and arguably far too old to play a 33-year-old Christ. But the main criticism is one of even longer standing, namely, that Jesus belongs in church, not on stage.

There are two problems with this argument. The first is that, as it usually is made by conservative Christians, it tends to be cover for their real criticism, which is that the Webber-Rice Jesus is “just a man,” something that Mary Magdalene emphasizes in the Act I song, “I Don’t Know How to Love Him.” Tim Rice has been quoted as saying, “It happens that we don’t see Christ as God but simply the right man at the right time at the right place.” The Christian Right condemns this view, which also is emphasized by the omission of the resurrection from the story on stage.

Expanding this argument to a more general one of “leave Jesus in church where he belongs” is simply spurious. Jesus has long been on stage in ways wholly approved by conservative Christians. Consider, for example, the mystery plays of medieval England, biblical cycles performed by roving bands of amateur actors during the Middle Ages and later, and passion plays. The most famous example of the last is the epic that has been performed in Oberammergau, Germany, once a decade since 1634. Add to these live events the myriad of films featuring Christ and the list of “approved” theatrical appearances by Jesus is endless. Perhaps the film most passionately embraced by the Religious Right in recent years is Mel Gibson’s 2004 sadomasochistic “The Passion of the Christ,” which is a veritable blood bath.

A second difficulty I have with the argument that Jesus doesn’t belong on stage stems from the theatrical element inherent in the religious practices of most churches. Looked at in stage terms, preachers perform much as actors do, robing (getting into costume), climbing into the pulpit (going on stage), and holding forth (performing). The most adept and elaborate of these church-theaters tend to be those occupied by the same conservative critics decrying stage Christs. Consider the megachurches, the traveling theater of tent revivals, and the elaborate staging that backs the most strident of the rightwing televangelists. Talk about theater!

Few topics — including the Bible and Jesus — are off limits as theatrical fodder, which is as it should be, in theater as in every other art form. The arts are the expressions of culture. All viewpoints deserve expression, and that inevitably leads to controversy. Controversy ought not lead to suppression, however.

(The photo shows Jesus and John from a 1900 production of the Oberammergau Passion Play.)

Monday, January 18, 2010

Flash Mob Performance Art

Mass performance art is a fascinating product of the Internet age. For the uninitiated a bit of background may be helpful. “Performance art” itself is defined as art created by the performance of one or more individuals. It began in the modern sense in the 1960s, Yoko Ono among the practitioners. For many it was a response to the question, What is art? Pop Art could make art using everyday objects and images, from Andy Warhol’s soup cans to Claes Oldenburg’s giant Typewriter Eraser, which stands in the National Gallery of Art sculpture garden in Washington, D.C.

Why, then, should not an artist mingle visual art and theater, often impromptu street theater, and produce performance art? A popular term from the time was a happening. Performance art often seems simply to happen spontaneously. At least that is the viewer’s perception. Often, however, the artist has given considerable thought and preparation to the creation of the performance moment, how it should look, and what it should mean. Performance art can incorporate all of the senses, using color, movement, music, and so on — or their opposites: stillness and silence. The “living statues” or silent robotic-type individuals one occasionally sees on the streets, usually with a basket for monetary contributions at their feet, are engaging in a form of performance art.

Flash mobs are a 21st century innovation on the performance art theme. They began about 2003, with Manhattan being credited as the first site. Indeed, flash mobs tend to be big-city phenomena. Individuals communicate with one another over the Internet, often anonymously, about what, where, and when a performance will take place. Some flash mobs engage in a short rehearsal before the actual event, but the unrehearsed form is as common if not more so. Also, as a performance unfolds, observers sometimes join in and become performers themselves.

A number of examples can be found on YouTube. In one, a flash mob of about two hundred danced to “Do Re Mi” from The Sound of Music in the central railway station in Antwerp, Belgium. London boasts what organizers hope will become an annual winter “No Pants Subway Ride,” coordinated by Improv Everywhere, whose tagline is “We cause scenes.”

Performance art comes in all sorts of forms, and flash mobs demonstrate the mass creative power that can be unlocked through electronic communication.

Wednesday, January 6, 2010

Nesser to Grieg to Rubinstein


The other day I was reading Swedish mystery writer Håkan Nesser’s book, Borkmann’s Point. His detective, Chief Inspector Van Veeteren, who drives a small, elderly Opel with an up-to-date, high-end stereo, comments on a northerly drive that he needs to hear “something Nordic”: “Cold, clear and serene. Sibelius and Grieg.”

The power of suggestion later that day led me to rummage in the CD cabinet for something to listen to as I fixed dinner. What came to hand was a Chicago Symphony recording of Edvard Grieg’s Piano Concerto in A Minor, conducted by Alfred Wallenstein. What I had forgotten about this CD was the pianist: Arthur Rubinstein.

The music was suitably stirring — cold, clear, serene, and Nordic — but listening to it also stirred memories of hearing Rubinstein in concert more than forty years ago.

Rubinstein (1887-1982) would have been seventy-nine in the spring of 1966, when I was a high school senior and somehow acquired tickets to a concert he gave in San Antonio, Texas. I recall going to the performance with my girlfriend (later wife) and another of our classmates. The program has long since escaped my memory, but I recall vividly the thrill of seeing this piano virtuoso at work.

Rubinstein, a Polish Jew, was born the eighth and youngest child of a businessman in Łódź. He evidenced interest in the piano during an elder sister’s lessons, and while still a very young boy he played for Hungarian violinist Joseph Joachim, who became his mentor. By age ten in 1897 he had moved to Berlin to continue his studies, and in 1900 he made his debut with the Berlin Philharmonic. In 1904 he launched his career in earnest in Paris. (The photograph shows him in 1906.)

Rubinstein led an amazing career that ended in 1976, when he was eighty-nine, and then only because his eyesight had begun to fail during the Seventies. I count myself fortunate to have seen and heard him a decade before his retirement. He died in Geneva, Switzerland, in December 1982, about a month shy of his 96th birthday.

January 28 will mark the 123rd anniversary of Rubinstein’s birth. I have several recordings of him that should be suitable to play for that occasion.

Friday, January 1, 2010

Blue Moon: Blue, Blue, Blue


When two full moons occur within the same month, the second is called a “blue moon.” The infrequency of the occurrence has made it a subject of folklore as well as scientific investigation. A common expression is “once in a blue moon,” that is, rarely.

In most years twelve full moons occur, approximately one each month. But the common calendar (called the Gregorian calendar) is not precisely aligned to the lunar cycle. Thus every two or three years there is an “extra” full moon.
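
The arithmetic is simple enough to sketch. Twelve lunar cycles of roughly 29.53 days each add up to about 354.4 days, some 11 days short of the calendar year, so an “extra” full moon accumulates about every 29.53 ÷ 11, or 2.7, years. For readers who like to tinker, here is a minimal Python sketch of that reckoning. It is a back-of-the-envelope estimate only: it assumes a constant mean cycle of 29.53 days (real lunations vary by several hours), and it anchors on the full moon of December 31, 2009, mentioned at the end of this post; the function name is mine.

```python
from datetime import datetime, timedelta

# Mean time between full moons, in days. An average only;
# actual lunations vary by several hours either way.
SYNODIC_MONTH = 29.530588

# Anchor on a known full moon: the blue moon of December 31, 2009.
KNOWN_FULL_MOON = datetime(2009, 12, 31)

def approximate_blue_moons(through_year):
    """Estimate months holding two full moons; the second full moon
    in each such month is the so-called blue moon."""
    blue, t = [], KNOWN_FULL_MOON
    last_month = (t.year, t.month)
    while t.year <= through_year:
        t += timedelta(days=SYNODIC_MONTH)
        month = (t.year, t.month)
        if month == last_month:  # second full moon in the same month
            blue.append(t.date())
        last_month = month
    return blue

# Expect dates near August 31, 2012, and July 31, 2015.
print(approximate_blue_moons(2015))
```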

Various explanations are given for this full moon’s designation as “blue.” One is that in Old English belewe can mean either “blue” or “betrayer.” Because early clergy were responsible for calculating the date of Easter based on the full moon, some years they may have needed to explain whether a particular full moon was actually the “Lent moon” or a false one, a “betrayer moon.”
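
Those clerical calculations survive today as the ecclesiastical computus, which places Easter on the first Sunday after the first ecclesiastical full moon falling on or after March 21. As an aside for the curious, below is a sketch of one compact modern form, the anonymous Gregorian algorithm (often attributed to Meeus, Jones, and Butcher); the function name is mine, but the arithmetic is the standard published version.

```python
def easter_date(year):
    """Gregorian Easter Sunday via the anonymous computus algorithm."""
    a = year % 19                       # position in the 19-year lunar cycle
    b, c = divmod(year, 100)
    d, e = divmod(b, 4)
    f = (b + 8) // 25
    g = (b - f + 1) // 3
    h = (19 * a + b - d - g + 15) % 30  # offset to the Paschal full moon
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7
    m = (a + 11 * h + 22 * l) // 451
    month, day = divmod(h + l - 7 * m + 114, 31)
    return year, month, day + 1

print(easter_date(2010))  # (2010, 4, 4): Easter fell on April 4 in 2010
```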

Are blue moons actually blue? Rarely. But certain moons, not necessarily full moons, in past ages did seem to be bluish in color. The perceived color change probably was the result of other substances in the atmosphere, such as dust or ash from volcanic eruptions. One story is that in December 1883 geologist W. Jerome Harrison reported viewing an “electric-blue” crescent moon against a copper-colored sky from his home in Birmingham, England. He attributed it to lingering atmospheric debris from the explosion of Krakatoa, a volcano in Indonesia. Krakatoa’s spectacular eruptions began in May 1883, culminating in the destruction of the volcano island in August that year. In addition to the volcanic explosions, subsequent tsunamis devastated the region.

Blue moons also have been the subject of popular songs, often with blue given the connotation of sadness, sometimes mixed with the sense of rarity. The most popular example is Rodgers and Hart’s “Blue Moon,” which begins: “Blue Moon, you saw me standing alone, / Without a dream in my heart, / Without a love of my own.” But then a new love comes along, and the song includes the line, “And when I looked the Moon had turned to gold.” When Richard Rodgers first composed the tune in 1933, it was given different lyrics and intended to be sung by Jean Harlow in MGM’s Hollywood Party (released in 1934), which featured a number of the movie stars of the era. The song went through other versions — for example, repurposed for the 1934 film Manhattan Melodrama — until Lorenz Hart finally gave it the familiar lyrics after that film was released. Since 1934 “Blue Moon” has become a standard ballad, recorded by popular singers including Ella Fitzgerald, Frank Sinatra, and Elvis Presley.

There also is the bluegrass song, “Blue Moon of Kentucky,” written by Bill Monroe in 1947, in which the blue moon is told to “Shine on the one that’s gone and left me blue.” Monroe, often called the “Father of Bluegrass,” is credited with popularizing the genre, which takes its name from his band, the Blue Grass Boys. Monroe’s home state of Kentucky has the nickname, the “Bluegrass State,” based on the prevalence of bluegrass, a smooth meadow grass of the Poa genus. Bluegrass seedpods turn from green to a purple-blue hue, which gives this grass its name.

A blue moon rising on New Year’s Eve welcomed in 2010. The next blue moon should not occur until August 2012 — plenty of time for any aspiring folklorists or songwriters to get a new composition ready.