Saturday, March 30, 2013

Questioning a Generation's Reading Patterns: Why the So-Called Politically Apathetic Millennial Generation Can't Get Enough of Its Politically Charged Young Adult Dystopian Fiction




I’ve recently finished writing an obscenely long article on the current dystopian trend in young adult literature. It will likely be impossible to sum up that 38-page article here, but I thought I’d share a little of what interested me in the topic.

In the past few years I’ve read a number of troubling (and problematic) studies discussing the millennial generation. Nicholas Carr, author of “Is Google Making Us Stupid?,” has discussed the ways in which our technology-dependent society is impacting our youngest generations, and Mark Bauerlein, author of The Dumbest Generation: How the Digital Age Stupefies Young Americans and Jeopardizes Our Future, discusses how the social networking era is creating a generation of narcissistic, politically apathetic, and civically illiterate youth. Bauerlein’s text houses some sobering statistics about youth’s lack of interest and knowledge concerning political affairs. Here are just a few examples: 1) 64 percent of young adults in one research study knew the name of the latest American Idol, but only one-third knew which party controlled their state legislature, and only 40 percent knew which party controlled Congress; 2) in a 2003 survey, only one in 50 college students could name the first right guaranteed in the First Amendment, and one out of four could not name any freedom protected by it; 3) a 2004 report conducted by the US Department of Education indicates that 47 percent of high school seniors believe it is “very important” to be an active and informed citizen, but only 26 percent of high school seniors scored “proficient” or “advanced” on the national civics exam.

As I read these studies I was also reading, for pleasure, a slew of dystopian young adult novels. Dystopia has always had a place in young adult literature, but it has become much more prevalent in the past decade, even more so since Suzanne Collins’s Hunger Games series became an overnight sensation. Anyone who is familiar with the dystopia genre knows that it is packed full of political content and social commentary. Therefore, reading these fictional tales intended for young adults against studies claiming that those same young adults had absolutely no interest in such topics created a sort of cognitive dissonance for me. It seemed to me that there was a major mismatch between the reading practices of the youth and the critical labels being attached to them (terms such as “apathetic” or “apolitical”). So I set out to discover why these texts are so popular right now with this age group and whether their popularity could disprove some of the criticisms launched at this generation.

In terms of popularity I came to a conclusion that won’t surprise many who follow this blog (or know my research interests): they are likely a product of the post-9/11 moment, joining the ranks of horror films and apocalyptic dramas that remediate the events of 9/11 and/or act as handy metaphors for the host of cultural concerns plaguing the country today. I suggest these texts could be a place where youth wrestle with their (likely unconscious) fears lingering after the national tragedy.

Of course, there could be other reasons why these texts exist in such mass quantity at the moment, reasons that stem not from the age group they are intended for but from a previous era that produced the authors who are crafting them. The majority of the authors writing young adult fiction today lived through the Reagan era and the 1980s Cold War scare, were terrified by the broadcast of The Day After (1983), and as high schoolers were taught a variety of texts concerned with misery and social control, such as 1984, Fahrenheit 451, Brave New World, Flowers for Algernon, and Lord of the Flies. The dystopian literature of the eighties was preoccupied with mass destruction, atomic bombs, and apocalyptic outcomes. But, as Laura Hall and Kara N. Slade note, this generation did not face such a future; they did not die, but instead “just ended up with mortgages, subscriptions to the New Yorker, and a grinding sense of regret.” And now, apparently, they are “writing, publishing, and promoting postapocalyptic and dystopian fiction for young people at an unprecedented rate.” Therefore, it may very well be that the current upswing in young adult dystopia should be credited to the authors themselves, for it is just as likely their political concerns that are projected upon the pages as those of their readers. In terms of crafting the market for these texts, it may be a mistake to assign agency to the young adults themselves instead of to the authors who are framing the texts within these themes or the publishers who are eager to push texts that capitalize on post-9/11 concerns. But regardless of who or what deserves credit for rejuvenating this profitable subset of young adult literature, the fact remains that teenagers are reading these texts and are, therefore, sustaining the market.
I would then argue that their enthusiastic engagement with these novels calls for reevaluating the claims that this generation is apathetic when it comes to national and global issues.

In the longer version of this article I analyzed some of the social critiques found in a set of dystopian YA novels. I’ll just bullet point a few here so that anyone unfamiliar with these texts can have a basic understanding of this recent publication trend:

· Matched (Series): Ally Condie’s series debuted in 2010. It depicts a society that relies on calculated planning, where each of life’s stages – from marriage at age 21 to death on or before age 80 – is decided in advance for optimal results. Part of the novel’s cultural critique comes into play as Condie describes the ways in which the society has decided to preserve cultural and historical artifacts: in the fashion of an exaggerated “throw away” culture, record keeping and personal archives are virtually eliminated, and only things of value (value being decided by a governmental committee) are saved as parts of collections, or “Lists of 100” to be exact (the 100 best poems, the 100 best paintings, and so forth).


· The Hunger Games (Series): The first book in Suzanne Collins’s series was published in 2008. The novels present a totalitarian government that punishes its citizens through an annual televised competition in which selected children must fight to the death. The series can be read as critiquing many different aspects of contemporary society, most obviously our entertainment choices (specifically reality television, or violent media more generally). However, it also offers up social commentary on poverty, capitalism, and more.

· Little Brother (Novel): Cory Doctorow’s novel was released in 2008. As its title suggests, this text is an Orwellian tale of a government that abuses its power and constantly monitors its citizens. The narrative follows a teenager who is wrongfully detained by the Department of Homeland Security after a terrorist attack. It is a clear critique of the Patriot Act and other governmental abuses (in the name of national security) post-9/11.

· Uglies (Series): Scott Westerfeld’s series was launched in 2005. It focuses on a post-apocalyptic society where cosmetic surgery is mandated at age sixteen in order to rid citizens of the physical differences – such as those of race and beauty – that in the past resulted in inequality and even war. Accompanying this physical transformation, unknown to those who experience it, is a mental one wherein the participants’ brains are altered so that they remain in a permanent state of “bubbly” (ignorant) bliss. The central commentary of these books focuses on the current surge in cosmetic surgery, but other important critiques concern our culture’s obsession with celebrities and material goods, and the damage we are doing to the environment.

· Feed (Novel): M.T. Anderson’s novel was published in 2002. It utilizes the backdrop of a post-apocalyptic world to critique the effects of advanced technology on humankind and the environment. Feed presents a vision of an earth all but destroyed by humans. In this technology-saturated society, the majority of the population has had computerized information feeds implanted in their brains. These feeds constantly deliver endless streams of information, usually in the form of advertising for consumer products. Besides serving as a cautionary tale about the misuse of technology, the novel was intended to scrutinize the current culture of instant gratification, aspects of herd psychology, and individuals’ refusal to tackle serious societal problems.

Quite obviously, this type of reading does not shy away from social critique. But does the fact that teens are reading these political texts prove that they *are* political? Not necessarily. However, some recent research does indicate that things are not as bleak as some would have us believe.

Robert Putnam and Thomas Sander completed a study that determined that 9/11 seemed to have a major impact on the civic engagement of young adults. They report that young collegians’ interest in politics rapidly increased in the years following 9/11 after three decades of steady decline. In the period from 1967 to 2000, the share of college freshmen who said that they had “discussed politics” in the previous twelve months dropped from 27 to 16 percent; since 2001, it has more than doubled and is now at an all-time high of 36 percent. Sander and Putnam provide a host of other statistics that further buttress their claims: 1) first-year college students also evince a long-term decline and then post-2001 rise in interest in keeping up to date with political affairs; 2) surveys of high-school seniors show a similar and simultaneous decline and then rise in civic engagement; 3) moreover, as most of us know, between 2000 and 2008, voting rates rose more than three times faster for Americans under age 29 than they did for Americans over 30.
Because their study was published in 2010, it appeared too early to account for other recent evidence that young adults are becoming more politically engaged. A key example would be the Occupy Wall Street protest movement. This movement, which began on September 17th, 2011, promoted its protests through the social-networking sites most frequented by American youth. As a result, young people made up the majority of its participants.

These conclusions about young adults suggest that this new generation – inspired by 9/11 and further invigorated by new technological platforms – is experiencing a rejuvenation in terms of civic involvement. This could account for their sudden interest in young adult literature that caters to such mindsets.
Although it seems logical to assume that the popularity of these novels implies that their teenage readers are interested in the social commentary they build upon, this is a hard assumption to prove. It could be that teens are just as drawn to the fantastical settings, the romantic storylines, the coming-of-age themes, and the action-filled plots. However, if this were the case, one would expect to see other YA novels (which include all of these aspects) topping them in sales and garnering the Hollywood movie deals. And this is not occurring as regularly. Clearly, there is something within these novels that speaks to this newest generation of readers. The most likely answer is that teenage readers are drawn to the way these texts repackage societal concerns from reality, displacing them into the safe comforts of fiction where they are addressed time and time again with more favorable results.


It is not surprising that the decade following the September 11th attacks contained as many dystopian narratives (young adult and otherwise) as it did. I would argue that these narratives are important sites where the realities of a post-9/11 world are being worked out. Therefore, in consuming these texts, teenagers are likely working through fears remaining years after the national trauma. Reading these books may not completely rewire the Millennial Generation, but it may find them contemplating some important societal problems – including those that led up to the 9/11 terrorist attacks, as well as other less tragic concerns such as the effects of reality television, superficial beauty standards, and over-reliance on social technology.

Of course, the popularity of these young adult novels does not eradicate the data suggesting that teenagers today are suffering from civic illiteracy and are not participating in the formal political process at desired rates. (Although, to be fair, the criticisms leveled at young adults are often exaggerated when studied in greater cultural context. For example, data for other age groups is often not relayed, which might indicate that civic illiteracy, or disinterest in current political affairs, is more of a cultural phenomenon than an age-specific one. Also, these studies often fail to address the larger question of why this generation, or Americans more generally, are shying away from politics.) Nonetheless, taking these somewhat disturbing studies at face value, it does appear that young adults are, at least outwardly, politically disengaged. Despite their content, the mass consumption of these political narratives does not alter this fact. However, the popularity of these dystopian texts does suggest that the descriptor cast upon the age group is misleading. The very fact that this population is enthralled with these tales suggests that the classification of “apolitical,” or at the very least “apathetic,” is inaccurate. This literary trend indicates that while these young readers may be disheartened by contemporary politics and under-informed in current events, they are not uninterested in the social problems that underlie both. The success of these novels implies that teenagers are willing to entertain societal critiques – even ones that implicate themselves. Rather than being a problematic discrepancy, the “mismatch” between the reading interests of young adults and their direct political action suggests that young adults could very easily be molded into more politically engaged citizens. And perhaps this “mismatch” is not as great as it first seems.
The fact that this post-9/11 reading trend aligns with recent increases in community service and voting among youth populations may indicate that a change is already afoot. Perhaps if their interests and concerns – as evidenced by their reading material – are addressed more regularly in the public sphere, and if they feel their voices will count in the democratic process, future research and poll data will reflect a very different political reality. Will reading young adult dystopias alone cause this political transformation? Not likely. Does the consumption of print texts often spark revolutionary change? No, but perhaps it is a start. Rather than predicting a bleak future for the young persons of today, this dystopian trend may very well be pointing toward a more positive future… at least in terms of political engagement.
  

Sunday, March 24, 2013

Some Thoughts on Children's Programming, or, How Mickey Mouse Clubhouse Can be Both Toddler Crack AND a Parent's Best Friend



I am a television scholar who boo-hoos most of the criticisms thrown at the medium. I advocate for active viewing practices and buy completely into the argument that complex television programs actually play a role in the smartening (rather than the “dumbing down”) of culture. (For more on this latter argument see Steven Johnson’s Everything Bad is Good For You.) But despite my championing of the medium, my dual role as a scholar-mother has made me a bit wary when it comes to the debates concerning the detrimental effects of screen time on infants and toddlers.

For the first year of my oldest daughter’s life I was pretty good about limiting her exposure to television.   I played a few Einstein learning videos but rarely turned on any of the kids channels.  Although I wasn’t quite convinced by any of the studies, I fought my partner (who enjoys television as a constant background noise companion) on keeping the television off during the day when at all possible.  When I had my second child this past winter (with only a 16 month gap between the two), my strong stance on television viewing diminished out of necessity.  Having a temperamental toddler and a needy infant to deal with, I decided I’d accept any tool into my parenting toolbox.  Enter Mickey Mouse Clubhouse. 

I actually like Disney Junior’s Mickey Mouse Clubhouse for the most part. Like most children’s programming today, it has educational value; it stresses problem solving, taps into children’s musical intelligence, and helps with word/concept acquisition through repetitive songs and narrative patterns. But that’s not really why I liked it at first. In truth, what I liked most (during our early viewing days) was the way my daughter’s face lit up and she clapped when she saw the opening credits come on, and the cute way she said “O’Toodle” when prompted by the screen to call on one of the main characters. A few months later, I found myself equally delighted when she started singing and dancing along with the closing credits (as we parents find most of our children’s performances to be cute). However, lately she has started demanding the show at all hours of the day, to which I won’t comply. I have struggled to curb her appetite for the show and have had to fight toddler tantrums when I limit viewing to a half hour in the morning at breakfast and a half hour at night before bedtime. I have justified that formula on the grounds that it keeps her overall screen time to less than an hour a day.

Of course, that one hour a day is also one hour a day that my three-month-old infant is now exposed to the television set, and it has found me wondering about the American Academy of Pediatrics’ (AAP) suggestion that children under the age of two watch no television at all. I found myself reading parenting articles about the pros and cons of exposing young children to technology gadgets and screen time. I found myself watching the infamous YouTube video of the child who could not flip through a magazine, seeing it instead as a broken iPad which would not interact with her when she “clicked” on images or “scrolled” across the pages. I read the studies that found that while young children today are more proficient at technology when they enter grade school, they are lagging behind in other areas linked to fine and large motor skills (e.g. tying shoes; riding bikes). (Of course, as one article pointed out, these lags are temporary: to date no one has graduated high school unable to tie his shoes due to early exposure to computer screens.)

While I stressed about all of these things and the decisions I have yet to face (we have yet to expose our children to computers, iPods, or iPads, although many friends have praised this app or that in terms of its usability for children), I began to remember an article I teach to my freshman composition students. Ariel Gore’s essay, “TV Can Be a Good Parent,” discusses the AAP’s recommendations concerning television exposure and its outrageous suggestion that pediatricians ask parents about their children’s media consumption patterns so that such data could be monitored. Gore acknowledges the AAP’s valid arguments for recommending that television be limited but also counters many of its claims by discussing various educational programs and noting that parents can interact with children while watching television. She also points out that while going without a television set is an option for some – she gives the example of her own artist mother, who participated in communal daycare – it is not an option for her. Ultimately, Gore argues that many of the AAP’s policies seem to rest on assumptions that all parents are middle class or higher, that all mothers have the option to stay at home, and that they all have partners or supportive people who can nurture their kids when they cannot. It is the closing of her essay that inspired this post:

We need more – and better – educational programming on TV.  We need to end the culture of war and the media’s glorification of violence.  We need living-wage jobs.  We need government salaries for stay-at-home moms so that all women have a real career choice.  We do not need “media files” in our pediatricians’ offices or more guilt about being bad parents.  Give me… a commune of artists to share parenting responsibilities, and I’ll kill my TV without any provocation from the AAP at all.  Until then, long live Big Bird, “The Brady Bunch” and all their very special friends!

I always found this article incredibly convincing as a pre-parent and knew that I, too, would someday find television to be an ally in the battle of parenting. So it’s not surprising that there are days when Mickey Mouse Clubhouse is my best friend, as it distracts my oldest child so I can feed the youngest. Or that it helped me fill the hours during my oldest child’s nine-day hospital stay while she recovered from pneumonia. But more than that, it made me realize that Gore is right. Television is part of the virtual community that we have at our disposal to help with childrearing, while an actual physical community is often lacking.

The old adage is that it takes a village to raise a child, and I believe it. I am lucky enough to have friends with whom I participate in babysitting trades so we can attend work functions or have a rare date night out with our partners. We have play dates that allow us to have adult interaction while our kids romp around at our feet. I have borrowed so many baby clothes that I have had to spend very few dollars of my own on attire for my children, and I’ve paid it forward by loaning clothes out in return, as well as larger baby items (e.g. swings and jumpers), so that other friends wouldn’t have to foot the expense for such over-priced items. But despite all of that help there is still plenty that my community cannot do. While my kids attend the same daycare as one of my closest friends’ kids (and we live only blocks apart from one another), we cannot take turns carpooling to daycare because of pesky laws about car seats (and the limited seating space in my non-SUV family vehicle doesn’t help either). And while my friends and I help each other out for special occasions, for most of the regular week we are all on our parenting islands alone – cooking, cleaning, childrearing, etc. My village is made up of dual-income families with mothers who work outside of the home just as the fathers do. That doesn’t leave much room for the utopian vision of women tending to the children of other women alongside their own. But at least I have my borrowed clothes and scheduled play dates.

There are more services that I wish were in place in our country to help mothers and families. I wish the U.S. were not ranked among the lowest of first-world countries in terms of its family leave and care policies. I wish we had legislation in place that paid for early preschool or helped to fund daycare for parents who need it. I wish that our country rewarded (rather than inadvertently penalized) mothers for working outside of the home, as other countries do. I wish our Family and Medical Leave Act (FMLA) allowed parents to spend more time at home with their young children without consequence. (The meager 12 weeks of unpaid leave we are granted pales in comparison to countries like Italy, which provides citizens with 22 weeks of fully paid leave, or Sweden, which grants its citizens one year of paid leave; Sweden is especially noteworthy as its egalitarian practices are evident in its leave policies. Upon the birth of a child a woman can take up to a year off of work at 90% pay, but the country tries to encourage men to do this as well by offering them a year off at 100%.) If we had some of these things in place, like most of the developed world, perhaps I wouldn’t need to give my daughter her daily dose of televised toddler crack… but since we don’t, Mickey Mouse Clubhouse will be on in my house at least two times a day so that I can parent in the best way that I know how.


 

Thursday, March 21, 2013

The Darkest Year of Television Debuts: Cult, The Following, Hannibal, Red Widow, and More




When The Sopranos debuted on HBO in 1999, it changed television forever in terms of the acceptability of violent programming. Other cable programs made similar impacts (e.g. Sex and the City pushed the limits in terms of sexual situations on the small screen), and slowly over the past two decades this type of programming that delves into the taboo has had a ripple effect now being felt on network television.

Violence is not necessarily a stranger to network TV – the massive popularity of the endless versions of CSI attests to that. However, never before has there been such an array of explicitly violent programming offered up for the primetime viewer. In January, as noted in a previous post, Fox launched its new drama, The Following, starring Kevin Bacon. A month later, the CW debuted its new program, Cult. As the titles indicate, both programs track the happenings of cult participants following a charismatic leader (of sorts). While this leader in The Following is an actual serial killer who has recruited individuals from all walks of life to kill in his name, Cult’s storyline (and “leader”) is a bit more complicated. The show tracks the amateur detective work of two main characters, Jeff Sefton (Matthew Davis) and Skye Yarrow (Jessica Lucas), who are searching for answers concerning a cult connected to a television program produced by a man named Steven Ray. (Skye believes her journalist father’s death is related to his investigation of this man a decade prior, and Jeff believes his brother’s recent disappearance is related to his joining this cult of television fans.) Cult depends on the postmodern text-within-a-text conceit: it is a television show about a television show (of the same name). When real audience members watch the onscreen audience viewing scenes from the fictional show Cult, they too are drawn to the charismatic Billy Grimm (Robert Knepper), the star of the show-within-the-show. In these moments, viewers at home are watching this fictional program along with the characters on the show. It’s a narrative time-out where one program stops and another begins, only they are dependent on one another. Real viewers are hence forced to watch the television program that is leading to mental breakdowns and mass murder and are, therefore, implicated by this way of watching alongside the crazies. (Are you still following me?)
Besides these jarring moments, the meta-commentary about the state of television today – with scenes that analyze the online forums where fans connect globally – is noteworthy too.

As I’ve watched these shows this winter I’ve often thought their release is a bit tardy, though still relevant. As cult involvement tends to heighten during anticipated “end of days” moments, these programs would have been all the more disturbing to watch in the lead-up to the anticipated end-of-the-world date – December 21st, 2012. (Although these programs were, of course, pitched and produced during this time period.) And while the cultural moment may explain the narrative content of these shows, viewers’ seeming desire for violent content (or their tolerance for consuming what is being offered up to them) is not as easily explained.
Along with these shows, NBC is premiering its new show Hannibal next month (a television series based on the novels of Thomas Harris). I anticipate that this new program will, like The Following and Cult, show graphic murders in almost every episode. (Both The Following and Cult have featured a murder in each episode, and various torture scenes as well.)

While not as gruesome, another wave of violent programming is also in progress. Producers are taking advantage of America’s long obsession with gangster narratives with a new set of mob-focused programs. ABC’s Red Widow aired for the first time this month and follows the happenings of the Russian mafia and their coastal drug imports. DirecTV is preparing to debut its new series, Rogue, which similarly follows mob activities on the docks of California (Rogue is set in Oakland while Red Widow is set in San Francisco).
So as I watch these new programs, and the trailers for the ones yet to come, I’m struck by how violent the televisual landscape is at present. And it’s not just that these programs simply exist – they’re pulling in a ton of viewers. (Case in point: AMC’s zombie show, The Walking Dead, discussed in a previous post, has secured itself a spot among the top ten most watched television programs for the past few weeks. Its midseason premiere drew in over 15 million viewers, making it the first cable series ever to beat every show of the fall broadcast season in the coveted 18-49 demographic.)

Is this surge of violence on the small screen a product of television as a medium becoming more like film (as it continues to nudge its older sibling out of the way in terms of popularity and cultural impact)?  Is it a product of the times – the consumption of violence as a coping mechanism for dealing with the post-9/11 climate where media pundits feed us endless scenarios of doom and gloom while real military and terrorist violence fills our television screens in the form of nonfiction (or quasi-nonfiction)? 

Although I rarely tend to jump on the alarmist bandwagon about the effects of media violence on viewers, siding instead with the research showing that violent programming and video games have few predictable and/or measurable effects on consumers, I do wonder what the long-term effects of this trend could be if this is just a preview of coming attractions. In much of my scholarly work I’ve argued that television consumption works to manage negative affect (to use Silvan Tomkins’s terminology): it works to decrease negative affect amongst viewers. What this has meant in my reading of news coverage is that the constant array of visual imagery we see of car crashes and school shootings and bombings abroad eventually desensitizes us to these incidents. I would argue that we no longer really feel “sadness” and “horror” when driving by a horrible wreck on the side of the freeway because we’ve been trained to voyeuristically study such imagery as it flits across our flat-screen televisions at home (hence the “gawker delays” we face when traffic slows down in front of these scenes). So, if this is the case, could the same be true of fictional television?

While we likely won’t encounter opportunities to be desensitized to real zombie attacks or find ourselves in close proximity to cult leaders, serial killers, or mob enforcers, could this array of violent imagery desensitize us to violence more generally? And, perhaps more importantly, what does it say about us as a society that right now this type of violence is so appealing to us? What does it say about me that my DVR is full of these shows and that I don’t find myself bothered by the fact that I’m watching all of this, while also catching up on seasons of Dexter on DVD? I don’t have an answer, but as I tune into the next installment of cults and killers and mobsters (oh my!), I’ll be sure to continue contemplating the consequences these programs may have on the public (and on myself).

Saturday, March 9, 2013

General Hospital Gets its Second Life at Age 50, or, How One Show Helped Make a Television Scholar


Part I: How I Became a Television Scholar (and Why Soaps are to Blame)

Like most of my peers, I grew up in a time when we as a society worried less about childrearing.  I could go on all-day bike rides with no way to be reached, and the old adage “be home before dark” was a comfortable send-off for summer play.  Once indoors, it was a time when the hype about media’s bad influence was not quite what it is today.   As a product of the 80s I grew up on the expected fare of Sesame Street, Mr. Rogers, and The Polka Dot Door, but beyond that the choices weren’t what they are for parents today, nor was there the incessant fear that exposure to our parents’ programming choices would forever (or temporarily) damage us.  And thank goodness for that or I wouldn’t have the career I do today.

During the first part of my childhood my mother stayed at home and although I never once saw her planted firmly on the couch in the caricature of a soap-obsessed housewife, she often had the ABC soaps playing in the background throughout the day (and she admits to having followed certain shows at certain times, such as the infamous escapades of Luke & Laura on General Hospital).   I’m not sure at what age I actually started attending to what was on the screen and “watching” in the sense that I was following a narrative, but I have some early recollections of being seven when a character my age appeared on GH, Robin Scorpio (Kimberly McCullough).  It wasn’t until middle school that I can recall actually following storylines.  My timing could not have been better.  I started watching soaps seriously at a time featuring some very important social-issue storylines, and perhaps it was that first exposure to the important work that soap operas do that shaped my later academic studies. 

Some of the storylines I remember appeared on One Life to Live.  One focused on the coming-out storyline of network TV's first gay high school character, Billy Douglas (played by Ryan Phillippe in his breakout role).  The second involved the trial surrounding a gang rape that occurred in a college fraternity house.  A few years later I remember vividly General Hospital's HIV/AIDS storyline featuring the above-mentioned Robin Scorpio and Stone Cates.  (Stone died of the disease and the program then realistically portrayed Robin living with her HIV diagnosis for almost two decades thereafter.)

By the time I was in high school I considered myself a soap fan. My grandmother got me a subscription to a soap magazine and VHS (yes, VHS!) tapes of famous soap weddings as a gift one Christmas.  I sent off fan letters and received "autographed" pictures of some of my favorite actors (including one from McCullough).

As I got older and busier I started being a bit more selective about which soaps I followed.  Though the summer would often find me picking up all the ABC soaps again (including Port Charles and All My Children), during the school year, due to limited recording space in the VCR age, I usually followed only General Hospital.  (And I still follow it today.)  But even my loyal watching of just this one soap opera changed the trajectory of my career.  

Here’s that story:

“One day a soap opera fan went off to graduate school, discovered feminist theory and media scholarship, and decided that finally she could find a way to justify her not quite secret (but not quite advertised) love of daytime drama.  Long before she really understood what completing a doctoral degree entailed or how one went about writing a dissertation, she began fantasizing about how she would be the first to uncover the feminist value in this so-called lowbrow genre; she began coming up with unique and complex readings of her favored programs that would dazzle academia; and she began rationalizing how her project would finally give her the opportunity to attend ABC’s Super Soap Weekend, a fan conference she could never quite find the justification to spend her hard-earned money or precious time on.  As the years stretched on that girl grew up and, alas, she realized that she was almost three decades late in being the first to argue that soap operas are feminist allies.  Likewise, she realized that many (although not all) of her earth-shattering analyses of the soap had been done by academics who came before her.  But, that latter pipe dream – that a college degree would finally land her a plane ticket to Florida, entrance into MGM Studios, and a weekend with fellow soap fans – well, that did happen.  And she was able to end her dissertation on the cultural artifact that she loved with a narrative account of her ‘research’ conducted at that fan conference.”

So that was how I got my start as a television scholar.  Although I rarely write on soaps today, I know I owe my love of serial television to them.  And because I link my career specifically to General Hospital it is no surprise that I have found myself pleasantly surprised by its recent increase in ratings.

Part II:  How to Save a Soap:  Bring on the Nostalgia (and Bring Back the Supernatural?)

General Hospital first aired on April 1st, 1963, so next month it will officially turn “fifty.”  (I remember the program celebrating its 30th birthday when I was in high school, back when I thought thirty years seemed like a long time for anything.  Now I can hardly believe there have been twenty years sandwiched between these two big birthdays and that I have witnessed both of them as a viewer.)  If asked a year ago whether I thought the show would make it to this big anniversary, I would have been skeptical, since times have not been kind to the genre as of late.

Through most of their decades on television, the number of American soap operas remained in the double digits.  By the mid-2000s this began to change as ratings continued to drop.  One of the first to go was ABC’s Port Charles (in 2003), followed by Passions (in 2007), and then Guiding Light (in 2009).  It was the end of Guiding Light that really broadcast the message loud and clear that soaps were in danger of extinction.  (Guiding Light had been the only soap to successfully make the transition from radio to television and had remained on the air for 72 years, making it the second longest-running program to date.)   Next to fall was As the World Turns (in 2010), and then there were just six soaps left.

At that time I felt pretty good about the potential longevity of General Hospital because it was the only show that was part of a big “soap block,” and I thought it was sitting pretty in a three-program line-up along with All My Children and One Life to Live.   That confidence was shattered when ABC announced that it was cancelling GH’s bedfellows; All My Children and One Life to Live drew to a close in 2011 and 2012 respectively. 

With only four soaps lingering in the televisual landscape I was depressed.  I predicted that GH or Days of Our Lives would be the next to fall and it was hard for me to muster up the energy to follow the program I had watched so faithfully for years as I was convinced I was viewing its final days.  (Of course my reluctance to watch was also fueled by the fact that I had fallen over 100 episodes behind and that my GH collection was consistently filling up my DVR queue.)  After having watched for over twenty years, I almost walked away from the show.  And then a storyline stopped me.

I learned that General Hospital had “killed off” Robin Scorpio, the character I had grown up with.  I felt I had to catch up and watch her final scenes.  And so I did.  And then when, in typical soap style, her death included a body “burnt beyond recognition,” I needed to watch to see if she was really dead.  And, of course, she wasn’t.  By then I had caught up to the present (with some necessary skimming), ABC had renewed the soap, and I had been sucked into a variety of storylines that highlight the strategic ways in which the show is trying to increase its ratings.  (Strategies which have been successful, I should note, as the soap rose to the number two slot in the coveted ratings category of female viewers aged 18-49.)  Here are some of the ways in which GH is reviving itself at age fifty:

Strategy One:  A Soap Opera Merger (Port Charles Welcomes Llanview Residents)

General Hospital started integrating characters from One Life to Live into its storyline, including Todd Manning, Starr Manning, Detective John McBain, and cameos from other former OLTL favorites like Blair Cramer-Manning and Téa Delgado. 

Strategy Two:  Allusions, Illusions, Delusions, and More – A Turn Back Toward the Supernatural

Soaps have a long history of integrating supernatural aspects into their storylines.  Port Charles, a spinoff of General Hospital, spent the final years of its life focusing on gothic storylines, the largest of which centered on a love story involving vampire Caleb Morley (Michael Easton) and Livvie Locke (Kelly Monaco).  When GH brought Easton on to continue his OLTL role as John McBain, it created an interesting narrative conundrum since he had formerly played a different role in this same fictional town.  This was further complicated by the fact that Monaco, who had played his then-love-interest Livvie, was now playing a different role as well, the character Sam Morgan.  In a truly postmodern move, the show alluded to this fact by creating a storyline that found John and Sam drawn to one another and often remarking that it was as if they had known each other in “a different life.”  I smiled and thought it would end there… it didn’t.

As the show decided to bring back more and more old characters (see strategy three below), the writers decided to embrace the mystical elements of its fallen sister soap head-on.  Although I initially rolled my eyes (“Surely, it isn’t going THERE again!” I thought), now that Easton has reprised his role as Caleb Morley, and there are plots about vampire hunters and angels mixed into this medical drama, I find myself strangely intrigued.  (It has been the decade of the vampire with Twilight, True Blood, and such… so I guess why not go there again?)

Strategy Three:  Blasts from the Past – The Return of Old Characters

GH is quite obviously attempting to raise its ratings by tempting old viewers to come back to the show. It has recently brought back a plethora of former characters (so many that one might think she is watching an episode from the 80s).  Recent returns include:  the trio of Luke & Laura Spencer and Scotty Baldwin; another former love triangle consisting of Mac Scorpio and Felicia & Frisco Jones; the complicated quartet of Anna Devane, Duke Lavery, Cesar Faison, and Robert Scorpio; the comical Lucy Coe and hubby Kevin Collins; and the resurrected A.J. Quartermaine.  Is this strategy working?  Well, it is for me.  With each reappearance I smile the smile of a soap fan who knows the writers are writing these characters back in for her.  The soap is even returning to previous famous storylines (such as the ridiculous plot pertaining to The Ice Princess) and annual events (like the Nurse’s Ball, a fictional event planned to raise money for HIV/AIDS research; the show would then sell paraphernalia related to this event in reality to raise real money for this medical research.  I’m dorky enough to have bought two “Nurse’s Ball” T-shirts in the past… and worn them).

Conclusion

Despite the recent success GH has had climbing back up in the ratings, I am afraid the writing may still be on the wall in terms of the future of soaps.  (My money is on The Young and the Restless and The Bold and the Beautiful lasting the longest.)  But I hope that I’m wrong.

As someone who studied this genre formally and informally for so many years, I continue to have a great appreciation for what it is and what it does.  Soaps have often been the first to tackle important social issues (many of which pertain specifically to women, e.g. abortion, domestic abuse, etc.).  They are the only type of programming that allows viewers to age indefinitely alongside characters – to watch them experience similar milestone moments at the same pace (high school graduations, weddings, the births of children, etc.).  The longevity of the soap provides viewers with a chance to have an intergenerational dialogue about the narratives on the show (with grandmothers, mothers, and daughters often watching the same programs). 

I don’t often spend a lot of time wishing on the televisual stars or worrying about televisual trends.  Usually if a genre seems to be on a decline I rest assured that genres have cycles and they’ll come around again.  This will not be the case with the soaps.  Once they exit the daytime airwaves they will never return. Although their influence will continue on in primetime (for more on this see my previous post on recent network melodramas), viewers will never have that experience of watching a narrative stretch on 260 days out of the year.

It would be great if the next decade found me watching GH in the background while my daughters grew up.  It would be interesting to see if they would be attracted (at some point) to the narrative happenings on the screen.  It would be wonderful if one day, as adults, we spent time chatting about the show.  But, for now, I’ll celebrate the fact that I was able to watch as long as I have, and that the show reached this important marker in its history.  For now I’ll simply say with a smile:  “Happy 50th Birthday, General Hospital; And Thanks for Helping Shape Me into the Television Scholar I am Today.”