
~ Hey Motherfuckers – Read my damn post about shitty language!

Sorry to those I offended with the title of my post.

Actually, I’m not sorry. Because if, with all the horrific things that take place in our world – wars, genocide, political corruption, child abuse, corporate greed, human slavery, environmental disasters, etc. – this is what upsets you, then you and I probably wouldn’t get along anyway.

And yet some people choose to focus their energy and time on this very issue.   Bill Cosby, for example, has often expressed dismay over the use of curse words in stand-up and today’s culture in general. He is not alone in this view. Will Smith has shared similar sentiments about the prevalence of profanity in rap music. More recently, if you watched the Golden Globes or Oscars, you probably noticed an abundance of bleeps during acceptance speeches.

And my response to this condemnation and censorship of profanity is to say, fuck that.

In all seriousness, though, I understand where they are coming from and recognize their good intentions. Yet it seems silly to censor swear words or fret over the use of profanity in song lyrics when there are much more important issues to focus on or draw attention to. Rather than worry about the use of profanity by rappers and “thugs” who let their pants sag low and speak “improper” English, Smith and Cosby would be better off focusing on the poor educational opportunities and lack of positive role models that often make the “thug” life appealing. And rather than having television censors bleep out “bad words” to protect viewers, the F.C.C. should consider why it is still okay for immoral behavior to be glamorized and regularly depicted as consequence-free (not that I’m advocating censorship).

The profanity police should think more carefully about where to focus their outrage or even their discomfort. After all, how can one be an arbiter of language and not recognize the mutability of words, the fact that language is not a static thing? Words and their meanings are in constant flux, changing according to the context in which they are spoken. Yet, censors and conservatives with an aversion to curse words ignore this fact.

Take the word considered most foul: fuck. “The first known occurrence of the word (at least the most accepted) is … in a poem in a French/Latin mix which satirizes the Carmelite monks of Cambridge from around 1500. The line reads, ‘[The clergy] are not in heaven because they fuck wives of Ely.’”2 Used to insult the clergy in this poem, the very people who transcribed most of the written word at that time, fuck was deemed a “bad word.” Thus, it was the context in which fuck was used and not the word itself that originally made it so offensive. And yet, even though most of us aren’t Carmelite monks, we still consider ‘fuck’ profane regardless of the context in which it was or is used.

We also arbitrarily label certain words as “bad” or “off limits”. I say arbitrarily because it is often our social conditioning and not the actual meaning of the word or the intentions of the speaker that affects our interpretation. For example, the science fiction show Battlestar Galactica uses the word ‘frack’ as the universal curse word, a clever way for the writers to get around the censors and still have their heroes express anger, frustration, and fear in the same way many people often do today, which is by cursing. The fact that no viewers or censors objected to the use of this word, even though the meaning and intention of its use are exactly the same as our more recognizable equivalent, fuck, demonstrates how hypocritical people’s objections to profanity can be. Do you not like fuck because it is those four letters put together in that order OR because it is often used in a derogatory or hurtful manner? If it is the latter, then you should be just as offended by ‘frack’.

And yet we are not. Because our culture does not socially condition people to view “frack” and the people who use it as vulgar reprobates.

Not only is this hypocritical, it also ignores the delight a well-placed swear word can provide. It denies the deftness and nuance of curse words in favor of a one-dimensional (and thus incorrect) view of them as unequivocally “bad”. Yet, according to Wikipedia, “the more vulgar a word is, the greater its linguistic flexibility.” For example, the worst of all swears, fuck, has been shown to have the greatest flexibility.   “Linguist Geoffrey Hughes found eight distinct usages for English curse words, and fuck can apply to each” (Wikipedia), which underscores the importance of context in determining whether its use is offensive.

And I think Shakespeare would agree with me. Shakespeare – the man considered one of the greatest artists of all time, sculpting the English language into shapes that have lasted for centuries – loved the vulgar and profane. His plays were full of swears and sexual innuendo, the master wordsmith delighting in the “linguistic flexibility” such language provided. Now does this diminish the meaningful insights he delivered in beautiful, poetic language? Of course not. In fact, his ability to weave together the profane and the profound is what allows his works to represent humankind as it really is, showing us that we are and always will be a clash of animal instincts and lofty ideals. And that is what art and entertainment are supposed to do, as Shakespeare explains in Hamlet: “to hold as ’twere the / mirror up to nature: to show virtue her feature, scorn her own / image, and the very age and body of the time his form and / pressure.”

And when you consider the fact “that roughly 0.5% to 0.7% of all spoken language is swear words, with usage varying from 0% to 3.4% [and that] first-person plural pronouns (we, us, our) make up [only] 1% of spoken words”1, it’s hard not to see that profanity is very much a part of human nature and our everyday life.

This is why it is important not to be offended by words unless they are used with certain intentions or in certain contexts. If I yell “Fuck!” after hitting my thumb with a hammer and you look at me with disdain, I would ask you to lighten up a bit and to recognize that it’s just a word that symbolizes frustration and pain and nothing more at that moment. On the other hand, if I walk up to you and say, “Go fuck yourself!” Well, then I understand why you would never want to talk to me again.

This is why censorship is silly. It assumes profanity is wrong regardless of context. It refuses to recognize the nuances in language and in doing so, inhibits human expression. After all, there’s something emotionally satisfying about letting out a string of profanities when you are pissed off or something completely visceral about yelling “Holy Shit!” when someone jumps out at you from behind a door for a practical joke. (Trust me, I’ve tried “fudge” as a replacement and it did not do the trick.) In fact, Keele University researchers Stephens, Atkins, and Kingston found that “swearing relieves the effects of physical pain,” with Stephens going so far as to say, “I would advise people, if they hurt themselves, to swear.” And yet even in their article, “Why the #S%! Do We Swear? For Pain Relief,” published in Scientific American in 2012, they demonstrate the reluctance of the “refined” to lower themselves to the base level of those who swear by “censoring” the very subject of their research! As if readers don’t know exactly what word those random symbols replace. As if, in our heads, we aren’t reading it as “Why the Fuck Do We Swear? For Pain Relief”. Still, at least their article argues that swearing is a widespread but perhaps underappreciated anger management technique rather than issuing a sweeping condemnation of profanity as many do in our culture.

Now some of you might still have objections. You might ask me if I want my nieces or nephews to go around cursing out other kids on the playground? And my response is, of course not. But I say that not because I think it is so awful for a child to utter a profanity but because children do not possess the wherewithal to understand the nuances of context. They do not know when it is offensive and when it is not offensive to swear. Moreover, I’d be much more concerned about children being exposed to words like “retard, faggot, or nigger” and the hateful or ignorant views that usually go hand-in-hand with their use than with a child yelling out “shit” or “damn” on the playground.

Hey, what about racial slurs? Why are they so bad when swears are not? Didn’t you just say it’s all about context, which would mean that even words like “nigger” or “faggot” are “okay” to use as long as we consider the context in which we are using them?

To some degree, I would argue yes. Context should always be the utmost consideration. However, I will also tell you that my general rule is to avoid saying words that have the potential to hurt others. Not hurtful to their sense of appropriateness but to their sense of self. For example, when I taught Adventures of Huckleberry Finn, even after having a multi-day discussion about why the abundant use of the word nigger was so vital to the message of Twain’s book, I could not bring myself to utter it aloud. That word is so charged, has so much history, and the thought that I might hurt even just one student by saying it was enough to make me refrain.

But that’s not to say that someone else using that same word is wrong. This is why the old argument that “it’s not fair that they can say it but I can’t” holds no merit. Again, it’s all about context and situation. I am a white person and an educator, which is why it’s hard to think of a time or place in which I would ever feel comfortable saying the word nigger. Because I cannot control how my students will interpret my use of the word and it has the potential to be very harmful. However, if you are with your friends who know exactly what you mean and who you know will not be offended, who am I to say it is improper to utter?

The same thing goes with profanity. As a teacher, I know it would be viewed as unprofessional if I told the kids to “get their fucking homework out.” I might lose their respect or receive a few dozen parent phone calls. I also wouldn’t say similar things in front of a grandparent or perhaps even you, if I knew you would be truly bothered by it.

And that’s what Bill Cosby, Will Smith, and other language purists need to realize. Getting hung up on profanity distracts us from real offenses or turns us into self-righteous hypocrites. Take the French film Fuck Me, which was changed to Rape Me for its American release (Wikipedia). How is this new title any less offensive than the original, if not more so? And yet, the more violent and charged word choice, rape, was deemed acceptable by censors while the one with the “bad word”, fuck, was not.

Similarly, more attention was paid to Vice President Dick Cheney’s saying “Go fuck yourself” to Senator Patrick Leahy in 2004 than to the fact that Leahy was calling out Halliburton’s sole-source contracts in the reconstruction of Iraq (Wikipedia). Sure, it was unbecoming and unprofessional for the man only a tragedy away from the presidency to use profanity, but how is that more outrageous than the fact that he exploited an ill-conceived and horribly destructive war for his company’s financial gain?

And this is at the heart of my defense of profanity. We need to care more about issues that actually matter rather than being indignant about someone dropping an f-bomb on live television. So often we humans get hung up on style over substance, focusing on the superficial rather than the material. And in doing so, we spend our energies on easy targets like profanity instead of the problems that are truly serious and that actually affect us.

So the next time you hear someone swear and feel offended, ask yourself if there isn’t a better outlet for your outrage.


Works Cited

1 From “The Utility and Ubiquity of Taboo Words,” printed in Perspectives on Psychological Science in 2009.

2 From the Ranker.com article entitled “Origins of the 7 Dirty Words.”


~ If You Like It, Then You Shouldn’t Put a Ring On It ~

I love me some Beyoncé and have happily danced along to “Single Ladies” while out with friends.  But like some other songs that cause toes to start tapping, its catchy hook obscures its problematic lyrics.  Essentially, “Single Ladies” expresses a common sentiment concerning the engagement ring: that if you “like it” then you better “put a ring on it.” And why is that, you may ask?  Well, so that everyone, including the woman herself, knows that it is yours.  The problem is that the “it” in question is your future wife or life partner and NOT some possession or thing.

So why, in 2014, must a man still “mark his territory” or “stake his claim” by affixing an engagement ring to his fiancée?

Now you might argue that I am overthinking the issue or that I am missing the important symbolism of the engagement ring tradition, particularly the meaning it provides for the couple and the message it sends to the outside world.  But let me assure you, this is not the case.   Instead I am merely trying to shed light on a social tradition that many of us feel compelled to adhere to despite internal misgivings or without really knowing why.  To do this, we must look at the origins of this custom and ask some questions.  For example, why is it that men don’t wear engagement rings?  If it truly is a symbol of a future lifelong commitment, why does only the woman wear one prior to the official nuptials?

It is in answering this question that the overtly sexist origins of our modern tradition become apparent.  In fact, “today’s symbol of love was once something more like virginity insurance”, a replacement for the “Breach of Promise to Marry” law that “allowed jilted fiancées to sue their former lovers, particularly when the pair had premarital sex and thus the woman’s value was damaged due to her lack of virginity” (O’Brien).   Farther back in history the sexism inherent in this tradition was even more evident as rings were used by sultans and sheiks to “tag” each of their wives (Bare).

Thankfully, times have changed since then, and in most civilized places a woman’s worth is no longer determined by how chaste she is or to whom she is married.  This progress might cause some to argue that even though the engagement ring tradition may have had sexist roots, we have transcended these dark origins and made the act into something more meaningful and egalitarian.  Yet, even if you say the ring is merely the embodiment of a promise to marry and not some politically-charged object, the question remains: why is it that only the woman wears this ring?  Both the husband and wife wear the wedding ring, a symbol of their lifelong commitment to each other, so why isn’t that the case with the engagement ring?

There are two potential answers to this question and both cast women in a negative light.   The first is that women cannot be trusted to stay faithful during this interim before the wedding unless they are wearing an object that wards off potential suitors.  Otherwise, why must women wear a symbol of this commitment in public while men need not?  The second reason one might give for the persistence of this tradition is that women want the ring.   And while in many cases, this may actually be true, this reasoning implies that women are materialistic and superficial, an idea that has been reinforced by the competitive nature of this tradition whereby one’s love is seen as directly proportional to the size of the diamond.  In fact, just recently in US Weekly, there was a full 2-page spread of engagement rings, a display of wealth that invites the reader to see this supposedly deep, symbolic and intimate act as a competition.  This is why I feel a bit sad every time someone gets engaged and the first thing women ask to see is the ring.  It’s as though the fact that this couple has decided to bind their lives together is overshadowed by a piece of rock in some metal.

If you voiced my concerns to jewelers like those at DeBeers, they would likely dismiss any overtones of misogyny or materialism in this social tradition.  Perhaps they would argue, as DeBeers’ website does, that “today, perhaps more than ever, the diamond engagement ring remains the most powerful universal expression of true and everlasting love and an essential part of the marriage ritual across the globe.”  And many people would agree with this sentiment without being “wrong.”  After all, a symbol is an object to which we ascribe value.  And if people want to see engagement rings as symbols of love and enduring fidelity, then who am I to stop them?

The only thing I ask is that they consider why they feel this way.  Is it because they truly believe this or because marketing and social pressure have told them that they do?

Because that’s exactly what DeBeers wished to happen when it started marketing diamond rings to the masses in the 1930s, even going so far as to suggest in the 1980s that this purchase should be the equivalent of 2-3 months of salary (Bernard).  Copywriter Frances Gerety and publicist Dorothy Dignam for the N.W. Ayer & Son ad agency even explicitly stated that their goal was “to create a situation where almost every person pledging marriage feels compelled to acquire a diamond engagement ring” (Sullivan).  To do this, they convinced Americans, particularly the women, to ignore their more practical natures and indulge their superficial and materialistic sides.  Prior to this campaign, an extensive survey conducted by N.W. Ayer found that most Americans thought diamonds a luxury for the very wealthy.  Moreover, Frances Gerety herself stated that women wanted men to spend their money on “a washing machine, or a new car, anything but an engagement ring… [as] it was considered just absolutely money down the drain” (Sullivan).

But with their campaign, Gerety and Dignam changed that.  After just two years, sales of diamonds in the U.S. increased by 55% (Sullivan).  And since then, the tradition has only grown stronger, so much so that jewelers now say that “a girl is not engaged unless she has a diamond engagement ring” (Sullivan).  Not everyone feels this way, as 25% of brides don’t wear one for whatever reason (Sullivan).  Still, it is clear that the majority of women do feel that an engagement ring is important or necessary.

And despite my personal contempt for the custom, this article is not about my sitting in judgment of that vast majority.  Again, I am only asking that they consider whether they ascribe this value or meaning to an expensive piece of jewelry because they truly believe it to possess such worth or because society and marketing campaigns have assigned it that significance.  Because much of what we want in life is really what others tell us we should desire rather than what we ourselves actually want.  And the result of such pursuits is usually unhappiness or dissatisfaction with one’s life.

Perhaps after reflecting, you might decide that, like me, you would prefer to spend your “engagement ring money” on a wonderful trip with your fiancée.  Personally, I’d rather be showing my friends the amazing pictures from our time together in Spain than a piece of jewelry.  And instead of having a physical object representing that commitment for the rest of my life, I would have those memories of that first trip together as a soon-to-be-married couple.

Then again, I’ve never been a jewelry person, so perhaps that is why it is easier for me to dismiss this tradition than it is for others.  Still, after learning about the engagement ring’s origins, it is hard not to realize that, just like with Valentine’s Day and Christmas, the public has been “sold” a story in order to increase company profits.  Whether you truly believe that story is up to you.  Because in the end, I’m not trying to make people feel bad about wearing or wanting a ring.  I’m only asking you to really think about why you want it, to make sure it is not because of the value others ascribe to it but because of the value you do.

As for myself, if you like me, then you better not put a ring on “it.”  Instead let’s have a mature discussion about our desire to commit to one another for the rest of our lives, and then we can plan a trip to celebrate!

Hmm, I wonder why Beyoncé didn’t make that the chorus to “Single Ladies”?

Works Cited

Bare, Kelly. “The History of Engagement Rings.” Reader’s Digest. 2014. Web. 2 March 2014.

Bernard, Tara Siegel. “With Engagement Rings, Love Meets Budget.” New York Times. 31 January 2014. Web. March 2014.

O’Brien, Matthew. “The Strange (and Formerly Sexist) Economics of Engagement Rings.” Reader’s Digest. 2014. Web. 2 March 2014.

Sullivan, Courtney. “How Diamonds Became Forever.” New York Times. 3 May 2013. Web. 2 March 2014.

~ Does Classifying Addiction as a Disease Help or Hurt Addicts?

(Let me start off by saying that I realize addiction is classified as a disease by medical professionals. So this post is not about the legitimacy of labeling it as such but whether it is helpful or hurtful to addicts to classify their problem this way.)

If someone stopped you on the street and asked you to name 5 diseases, well, first you’d probably take a few steps back and ask why the hell they want to know. Your next move might then be to name illnesses like cancer, cystic fibrosis, Parkinson’s, ALS, or MS. Most of us don’t think “addiction” when we hear the word disease.

Yet recently, this term has been used often when discussing the great actor Philip Seymour Hoffman’s overdose on heroin. People have shown great compassion for Hoffman’s struggles with addiction, expressing sadness over his passing and shaking their heads at yet another soul lost to a compulsion they could not control. And while I am glad this tragedy has at least got people talking about this all too common but overlooked problem in our society, I still feel that twinge of resentment, that tiny flicker of anger when I hear that word, disease, used in reference to addiction. Perhaps this is because when I think of disease, I picture an illness that a person has acquired through no fault of his/her own, a condition that has no relationship to the behavior of the individual or the choices he/she has made.

But this isn’t entirely true for addiction. Because even with addiction, there are choices. A choice to do drugs (and this includes alcohol). A choice to escalate that use. A choice to not stop using in spite of the negative effects it has on yourself and others.

And yet, I recognize that there are many other “diseases” in which choice plays a significant role. Take Type II diabetes or lung cancer for example. Not everyone who has these illnesses “brought it upon themselves”, but more often than not, these conditions are the result of years of bad decisions. Now that doesn’t mean that I would say to someone with lung cancer, “good, you got what you deserved.” Or that I wouldn’t have compassion for those who are suffering from Type II diabetes, even if it is the result of their own choices. It just means I find the term disease troubling when used to describe these conditions because its connotations somehow imply that the person suffering shares no responsibility for their plight.

Don’t get me wrong. I realize that once someone is addicted it is not a simple matter of will power to overcome the temptation to use. However, I resent the idea that an addict is “powerless” over their addiction. I resent it because it does those battling their addiction a disservice while also providing them with a convenient excuse when they sometimes lose a battle.

Think about it. If there was no element of volition in addiction, then why is it that some addicts are able to stay sober while others are not? If it is truly a “disease” in which one is powerless, then how does one get or stay sober in the first place? Again, I’m not saying that there isn’t an element of compulsion or that the brains of addicts have not been altered due to their addiction. Just that in the struggle to stay sober, choice plays a role and that using the word disease to describe addiction sometimes undermines this fact. Furthermore, it discredits the efforts of the “recovering” addict as well. Staying sober is a hard, hard task. One that requires commitment and strength and honesty and a network of support. So when we say that addiction is a disease in which the addict is powerless, it feels like we are also saying that their sobriety is not their own, not something that they clawed and climbed their way out of the darkness to achieve.

Furthermore, referring to addiction as a disease provides the addict with a convenient scapegoat when they do relapse. They often use this “diagnosis” as a way of avoiding blame or explaining away their mistakes. “It’s not my fault. It’s a disease” is something I’m sure many a loved one of an addict has heard before.

And so it is for these two reasons that I resent referring to addiction as a disease. It doesn’t help the addict or those affected by addiction.

What might actually help addicts is to rethink how we discuss addiction. Not just whether or not it’s a disease but how we can prevent people from becoming addicted in the first place. After all, no one smokes their first joint, takes their first shot, or snorts their first line thinking they will be an addict. We all think we will be the exception, even when we have had front row tickets to the main event our whole lives and know the predisposition for addiction lies deeply embedded in our DNA. Even then many decide to play Russian roulette, never knowing whether they’ve lost or not until it is too late.

So this is what we should be focusing on when we talk about addiction: how to stop people from becoming addicts in the first place. To do this, we need to understand why some people become addicts and others do not. We need to know if there is an identifiable “tipping point” at which recreational behavior turns into something more sinister. Just when does a person’s brain chemistry or wiring change so that he or she is now an “addict”? Are there genetic tests that can determine the degree of one’s predisposition for addiction? Perhaps if we could tell people from a young age just how great their odds are, then maybe we could convince them to never pick up that beer or that joint or that pipe. Yes, most children of addicts already realize that they are at a higher risk for addiction themselves, yet there is no hard and fast rule for just how much so. Nor is there an explanation for why one sibling becomes an addict while the other does not.

The one thing we do know is that the earlier one starts using, the more likely he/she is to develop an addiction. Perhaps this should be emphasized in drug awareness programs rather than focusing on the immediate physical damage that can be done. Because the message we send to children is hypocritical, telling them not to drink as we lift our glass of wine. Maybe if, instead of trying to scare them away from experimentation, we explain that waiting until their minds have fully developed will allow them to better enjoy the pleasure that mind-altering substances can provide, they might actually listen.

In addition to preventing addiction before it starts, we also need to know if there are different types of addicts (not just people addicted to different substances) so that we can better help people who become addicted. Currently, we rely on AA as the panacea for all addictions even though its long term success rate is abysmally low and it fails to recognize that addicts are a very diverse group of individuals. Because of this diversity, what helps one person stay sober may not be as effective for another. Sure, there may be some universal elements among all treatments, such as insisting on accountability for one’s actions, but there should also be some flexibility and individualization. Treatment should be tailored to the individual, perhaps combining pharmaceutical aids with behavioral modification therapy for one person while teaching another to place their faith in a higher power and to attend daily meetings.

These are the types of discussions we should be having in light of yet another life lost to addiction. Because even if Philip Seymour Hoffman’s death was the result of his own poor choices, this does not mean we do not still mourn his passing.

~ The fundamental difference between liberals and conservatives*

A Conservative’s Perspective on Income Inequality

I grew up in a Republican family, but I didn’t really know it at the time.  Politics weren’t discussed at the dinner table.  Still, Catholic school and my middle class upbringing were supposed to be enough to encourage me to follow in my father’s conservative footsteps.

And for a while, it worked.  I remember finding anti-abortion pamphlets sitting among many other brochures in the vestibule of my church.  I was horrified by the pictures of aborted fetuses and, for a brief period of my youth, vocalized these opinions to any who would listen.

And then, slowly, over the course of my twenties, I grew up. I finally experienced the “real world,” an event my dad prophesied would lead to the end of any liberal leanings.  According to him, once it was my money being thrown away on undeserving government programs, I would finally understand why Republicans had the right idea.

However, when I left my suburban bubble to learn about the “real world”, instead of my experience confirming the biases about the poor or illegal immigrants or certain ethnicities that had been part of my upbringing, I had an entirely different reaction.  In stepping outside of my own little world, I realized how fortunate I was to have been born within it.

And that confirmed my liberal leanings.  I recognized that my upbringing, despite being far from idyllic in its own ways, was much better off than 98% of other people’s childhoods.  And this did not make me superior. It just made me very, very lucky.  In my opinion, this perspective is the fundamental difference between conservatives and liberals.  Whereas I am extremely grateful to circumstance for the person I have become, most Republicans ascribe their good fortune to their own devices.  They are successful not because they always had food and shelter or because education was valued in their community or because they had a mother and father who loved and supported them, both emotionally and financially.  No, they are successful because they are hard-working individuals with strong morals and a determined spirit.  Sure, they might thank their parents and teachers, but in their hearts, they attribute their success mostly to themselves.  (Hence, the “I Built That” mantra spouted by Romney supporters during the 2012 election in response to Obama giving government infrastructure and policies the credit for many a business owner’s success.)

Conservatives also believe in the rags-to-riches stories.  That success is achievable by all, if they really want to work for it.  And to some small extent, they are right. There are individuals who have come from nothing and been successful.   However, these stories are always the exception and not the rule.  For every Ted Cruz, there are millions who were not able to overcome the unfortunate circumstances of their birth, who could not find a way to reshuffle the deck clearly stacked against them.  Are there also people who abuse the system?  Who make no attempt to overcome their unfortunate circumstances?  Sure there are.  But just like with the rags-to-riches stories, these are the exception rather than the rule.    And yet Republicans still try to justify policy based on these exceptions rather than acknowledging they are anomalies.

On the other hand, because I am a liberal, I thank fortune, just as much as my parents and the community I was raised in, for providing me with such a strong foundation.  I might not have what I currently have, or be the person I am today, without many of the fortunate circumstances of my birth, factors in which I had no hand.  Would I have gone to college if it hadn’t been a foregone conclusion that I was going to attend?  Would I have cared so much about my grades if I hadn’t been surrounded by peers who had similar attitudes and upbringings?  Would I be healthy and fit if regular activity wasn’t a ‘normal’ part of life in my household?

Because I can never definitively answer these questions, I can’t say I entirely deserve what I have.  At least not when others have so little.  Sure, I worked for my accomplishments and am proud of them.  However, I also realize that someone currently living in poverty might have achieved similar feats if they’d been given the advantages that I was given.  Or that I might be like some of the impoverished, living paycheck to paycheck, a medical event away from disaster, if I hadn’t been born where I was or hadn’t had the opportunities I’ve had.

This all comes back to the “nature vs. nurture” dilemma, which is at the heart of why I am liberal.   Even if “nurture” is only responsible for 50% of who we are, that’s still a significant factor in determining our fates.  And it’s usually a factor over which we have no control.  For example, my name is Kelly, a very common, acceptable name.   I didn’t choose it, and I’m sure it hasn’t played a huge role in my success.  But in white middle class culture, it is considered “normal.”    In other cultures in my country, however, it’s common to give children unique names.  Unfortunately, this seemingly innocuous choice made by parents can have a huge impact on a child’s future: numerous studies have shown a link between one’s name and success.   And yet, children have no control over what they are named.   Unlike a young boy named Godzilla Gorilla Pimp Hunter (not made up), a “Kelly” is already at a great advantage from birth.   And again, this is not because I am superior but because I was lucky enough to be given a name society deems acceptable. (See the Key and Peele skit below for a humorous take on this idea.)

Some might argue that learning to overcome adversity and failure is just as important to success as one’s name, that we all face challenges in life and that rising above them is what makes us who we are.  And again, to some extent, this is also true.  However, in his recent book David and Goliath: Underdogs, Misfits and the Art of Battling Giants, Malcolm Gladwell argues that there is such a thing as too much adversity, a point at which hardship is detrimental rather than redemptive.  For example, maybe if one’s mother is an alcoholic, it makes that child stronger and more mature.  But, if a child’s mother is an alcoholic and the father is in prison and that child attends crappy schools with bad teachers and is surrounded by gang violence, then perhaps he/she is facing too much adversity to benefit from these challenges.  And consequently, it should be no surprise when that child repeats the mistakes of his parents and continues the cycle of poverty.

Regardless of your politics, you must admit that a child doesn’t choose the family or circumstances into which he/she is born, for better or for worse.  In fact, both parties would agree that a child is blameless and should not suffer or be disadvantaged because of who his/her parents are or where he/she was raised.   The strict anti-abortion stance most Republicans hold would support this viewpoint.   Yet, as soon as a child is born, these same people don’t want to enact policies that would allow these babies to have the same opportunities as their own children.  They begrudge funding for early education in poor, urban areas as they read to their own children before bedtime.  They blame hard-working single parents for not being in their children’s lives while denying attempts to raise the minimum wage.  They bemoan government support for easy access to cheap birth control while chastising women for having more kids than they can afford.   They blame schools and unions for failing our children while arguing against free school lunches and competitive teacher salaries.

Liberals, on the other hand, support programs that try to eradicate the disparities created by the circumstances of birth and upbringing.   Rather than insisting we are all born equal, we acknowledge that some people are disadvantaged from the start.   Looking at the world as a whole, this is obvious.   And even in the land of opportunity, this is clearly true.     Sure, we are still responsible for our actions, regardless of our upbringing.  After all, life is about choices, and we must live with the ones we make.  However, no one can deny that where we are born makes certain decisions much more likely than others.  And liberals believe this fact should dictate our country’s policies, so that, at the very least, we all start out on a level playing field.  So that the race is not rigged at birth.  So that one’s future is not determined by mere luck.

This does not make me a communist or a socialist.  Just a realist and a liberal.

**DISCLAIMER:  I am sure some people will object to my broad use of these terms, so please understand that I am using them in the sense that they were used in my formative years.  For example, ‘liberal’ always had a slightly negative connotation growing up.   This is why I hate to even use political labels like Democrat or Republican, liberal or conservative.  Inevitably, when we label or generalize, our statements become false to some degree.  So, although I do use these terms in my post, I don’t assume that one word encapsulates all the views and values an individual holds.   In fact, I wish we could eliminate the two party system as it encourages animosity and antagonism rather than cooperation.   What both sides need to focus on is helping the American people.  Not special interests, not corporations, and not small, radicalized pockets of the population.

~ On the Greatest Art Form

“Poetry is superior to painting in the presentation of words, and painting is superior to poetry in the presentation of facts. For this reason I judge painting to be superior to poetry.” ~ Leonardo da Vinci

Artistic endeavor is the hallmark of humanity.  From early cave paintings to jewelry wrought from stones, humans have sought to adorn their world, to find some form of external expression of who we are and how we feel.   This expression is intimate and personal, an extension of its creator.  Yet, at the same time, it is also disconnected from the artist.  Art is something greater than the individual; it’s communal, requiring interpretation by an audience to have meaning.   For this reason, art is both subjectively created and received.

This is why determining the greatest art form is a bit like choosing a favorite child; each has its own strengths and weaknesses and under normal circumstances you would never choose one over the other.  To lose one, regardless of your preference, would be devastating.  So, for the sake of this post, I’m going to make it a Sophie’s Choice* situation.

Before I start to make people angry by dismissing their preferred form of human expression in favor of another, let’s narrow the playing field.  While art can be an extremely broad term, encompassing everything from landscaping to architecture to graphic design, in order to keep this essay reasonably short, I am going to limit this theoretical competition of the arts to a few categories.

Let’s start with the culinary arts and fashion design.  Both try to elevate the mundane and the everyday.  They take staples of human culture – food and clothing – and attempt to infuse them with greater meaning and purpose.  After all, I’m pretty sure the first humans could not have cared less about the pairing of wild boar with river water or about the fit of their animal skin coverings.  Fortunately, as people evolved, so did these everyday art forms.  Now one can sit down at a restaurant and hear a detailed description of a Cabernet or walk down the street and see a myriad of cuts and colors in the clothing of those who pass by.  Yet, while I am grateful that these flourishes can make the ordinary extraordinary, these art forms are limited by the body.  A delicious meal may make my taste buds grateful to be alive, but it will not cause me to ponder the nature of living.

Similar limits impede other forms from reaching the level of transcendence that the greatest artistic mediums do.  For example, both athletics in general and dance in particular display wondrous feats of the human form.  They allow the body to be both the artist and the art, the paint brush and the canvas.  One cannot help but marvel at the remarkable control and grace of a ballerina or the explosive speed and vigor of the athlete.   Their mastery over the human body inspires the audience, makes them feel awe for the all too fragile corporeal form.   Yet, these mediums lack an intellectual element.   Yes, a ballet may tell a story and yes, a championship game may be suffused with as much drama as a novel, but they lack language.  And language, our ability to express what our synapses – firing a million times a minute – are saying, is essential to the greatest art.

And song** has that.  Like dance and athletics, it is an art form that relies on an agile body to succeed, whether it be through the strumming of a guitar or the vibrations of the vocal cords.  Yet, it also has words.  Words, words, words.  For this reason, song has been described as modern poetry, an evolutionary offshoot of The Iliad or The Love Song of J. Alfred Prufrock.   Like poetry, it is a condensed form of thought, providing small slivers of insight that can embed themselves in our skin.  And when story and sound collide in a masterpiece, these slivers stick with us.  With only a few notes, song can transport us to a time and place in our past.  Song can connect us with others as we sing together.  It can be our sole comfort and companion when we are alone.  And when the music is great, it is as though an invisible hand has struck a chord inside our soul.  Yet, ultimately, music is a visceral art form, which is why we find ourselves tapping our feet or whistling a tune without even realizing we are doing it.  And thus, we love songs more for their rhythm, their tempo and timbre than for the story they convey and the words they use to do so.  In fact, sometimes we love a song in spite of its lyrics, the melody triumphing over the message.   Because of this failing, song falls short of other artistic mediums.

This includes Leonardo da Vinci’s beloved form of expression, painting.   While there are subtleties of hue and dimension that can be a language of their own, while a great painting can tell a story both with what it depicts and what it does not, to say we don’t have better means of doing both these things via film and television would be like insisting on riding a horse rather than driving a car to work.  Still, great painters have a way of reflecting back the world to us in a new light; using form and color to strike the eye in such a way that a fully embodied idea is placed in the mind without words.  And much of what is great about cinema and its more commonplace cousin, television, is derived from the same principles as painting.

Perhaps ten years ago, I would have argued that film is the greatest art form with drama (aka plays) a close second.   While film lacks the electricity, the immediate give and take between audience and artists of the stage, it has greater freedom and flexibility of expression.  Film uses our innate desire to put together a narrative, showing us pictures that we piece together in our minds and weave into a story.  It also unites many of the other great art forms under one banner, to achieve the same mission.  Music, language, costume, setting, movement – all interwoven to create the most powerful story.   And that is why, when done well, a great film can be life-changing; it is why, even though we may say we prefer the novel to the movie, we still long to see how cinema has brought our favorite stories to life on the big screen.

Until recently, television was the inferior sibling to cinema.  It was common in the most derogatory sense of the word, serving as the stepping stone for artists aspiring to work in film.  But that has all changed.   With higher budgets and an increasingly talented field of actors, writers, and directors, television can now do exactly what films do, without film’s limitations.   Unlike a movie, a television series has the length of literature, taking its time to develop the characters and the story in a way that a two-hour film cannot.  Moreover, with the abundance of cable channels, television has been allowed to take risks.  And all great art requires risk, the willingness to innovate and experiment, to fail.  And like a great novel, where you can forgive the errant paragraph or a rare, poorly worded phrase, television has the luxury of length, where mistakes are diluted rather than concentrated.

One of the dangers of television is its passivity.   Unlike literature, which forces engagement and thought in the reader due to the very form it takes, television has the ability to lull the audience into the complacent state of the observer.   And so, despite the heights it has recently achieved, television will always have to work to overcome its negative connotations, its perception as the “idiot box” or the “boob tube.”

Even more importantly, television, along with ALL art forms, must prove its worth in the face of the practicalities of life that art often distracts us from.  After all, perhaps if art didn’t exist we might focus more on our real world problems.  We wouldn’t be able to, by simply turning on the telly, block out the inane antics of Congressmen abusing their power or the idiotic laws that allow governments to become more and more corrupt.   Moreover, with all the money spent on the creation and the consumption of art in our world, we could easily feed the hungry and increase the standard of living for all human beings on this planet.  In fact, it can seem absurd that the film industry spends billions of dollars each year, often to make very terrible films, or that someone would pay millions of dollars for a Picasso whilst children are dying all around the world.   Yet, what sort of world would those children live in without art?

In the end, human beings are not practical.  We are emotional creatures.  We need to be inspired.  We need something to give our lives meaning and value.  To connect us with each other.  To motivate us to live and work for something greater than ourselves.  To find beauty and hope in an often brutal and indifferent world.   And that is what art does. That is why art, no matter what the form, will always be inextricably human.

*Sophie’s Choice references a story in which a mother is forced to choose which of her children will be sent to a concentration camp and which will be saved.

**I realize that song falls into the broader art category of music and similar comments could be made about other art forms mentioned.  Again, for the sake of brevity and due to the interconnected nature of many artistic mediums, I have arbitrarily chosen to discuss some of them as separate entities while including others within various other forms of art.  I have also omitted several great art forms, so sorry in advance if I neglected your favorite. :)


~ We Are What We Watch

Ask a man if he watches All My Children or General Hospital and he will most likely laugh dismissively.  As if he would waste his time on such meaningless drivel.

Ask a man if he watches The Real Housewives of New Jersey or Atlanta or Miami… and he will probably give you the same reply, perhaps admitting he has seen one of these programs but only at the behest of his significant other.   As if he would watch that mindless garbage if it were up to him.

But if you were to ask a man about whether he spends a large portion of his weekend bingeing on football game after football game after football game, you would very likely get a proud admission of his excessive consumption.

But why?  What is it about watching grown men wrestle each other to the ground in hopes of getting a ball to a desired end of the field that is something to be vaunted rather than whispered?   What does it say about our culture that men who spend countless hours paying homage to the football gods are seen as “normal” and masculine while women who watch Keeping Up with the Kardashians or Teen Mom are viewed as shallow and celebrity-obsessed?

Yes, feats of strength and skill and determination are inspiring and exciting.  Yes, these programs can unite us in our love of our favorite team or through our hatred of a certain housewife.  However, any woman who has endured hours of the same stories being dissected on ESPN or any man who has witnessed repeated cat fights and betrayals on BRAVO has to admit that, ultimately, there are many worthier programs to watch, television shows that might teach us something new about ourselves or our world, programs that might cause us to question our beliefs or motivate us to do something important with our time here on earth.

Before you get defensive, know that this is not a lecture.  I am no saint.   I have gorged on horrible television that glamorizes bad behavior and talentless so-called celebrities.   And I also enjoy the adrenaline-fueled world of sports, having developed a love for athletic competition as a child.

Yet, when I consider that toddlers view about 24 hours of television each week and that this number tends to increase with age1, it becomes apparent that we ought to take more care in both our own and our children’s viewing choices.   We need to become more mindful of what we consume; we can no longer pretend that there isn’t a correlation between the television we watch and the people we are.   After all, why would a man be embarrassed to admit that he enjoys The Bachelorette if he did not believe that it, in some way, said something about who he is?

Again, I am not trying to criticize fans of reality television or admonish aficionados of football.  I am not claiming that if you never miss an episode of America’s Got Talent, you must be uninformed about what’s going on in the world.  And I am not arguing that all television turns our minds to mush.

What I am saying is that, if you watch certain types of shows exclusively or excessively, then perhaps you share the same qualities as that program.   And so if you don’t want to turn into a superficial gossip or a barbaric meathead, you may want to diversify your viewing menu.   Because, in the end, we all have a limited time on this planet, and if we are not careful, we’ll become more like what we watch rather than who we wish to be.

1 Hinckley, John.  “Americans Spend 34 hours a week watching TV,   according to Nielsen numbers”. NYDailyNews.com. 19 Sept. 2012.

~ On Perspective And Why It Should Be Actively Cultivated & Taught

Everyone wants to be wise, yet that term is usually reserved for the elderly, people whose years of experience are said to have given them the insight and understanding that cannot be learned from a book or in school.   But is this thinking correct?  Can’t we “wise up” before we hit our sixtieth birthdays?  Must we wait so long to reap the benefits of wisdom?

I believe the answer is yes, we can, and the solution is to actively encourage people to broaden their perspectives.  That is essentially the basis of wisdom, after all.  Perspective shows us a vision of the world that opens our minds; it gives us something to compare and evaluate our experiences against.  Without it, we are limited to our self-centered views of existence.  We become mere animals, seeing only what affects us, our happiness and our survival.   Language, art, technology – those are all great achievements that separate humans from other animals, but I would argue that the ability to see the world from multiple perspectives is equal to or greater than those other accomplishments.   And just as we must cultivate an appreciation for art or master a new technology, we must be willing to work and sacrifice to see beyond our own myopic perspectives.

Let me give you an example of what I mean about the wisdom that perspective provides.

When I was in seventh grade, I forced my friends to listen to Boyz II Men’s “On Bended Knee” on repeat for an embarrassingly long time, completely convinced its lyrics captured the pain I was feeling after being rejected by my childhood sweetheart in favor of an older woman, one of the popular 8th graders at my school.  The heartbreak I felt back then was very real, yet now that I am in my 30s, I can’t help but laugh at my 12-year-old self.

This is the power of perspective.   The older me has had many more heartbreaks since 7th grade – some caused by cheating boyfriends, others by personal failures, and the majority stemming from family turmoil – and as a result, my teenage misery is now merely a funny story.   This isn’t surprising to most of us; after all, how many times has someone told you, in an attempt at consolation after an embarrassing incident, that you’ll “look back on this and laugh years from now”? And while you may have wanted to punch them in the face at that moment, much of the time, those seemingly trite words were proven true.

Still, viewing the world from our own limited experience of it is as instinctive as breathing; even if my older self could travel back in time and explain to my younger self that this boy would mean nothing to me in another year, that I would have many other crushes, and would even be the crushee on several occasions, I don’t know if I would have been any less devastated.   After all, your first heartbreak is exactly that, your first.  You have nothing to compare it to.

This would seem to suggest that you cannot teach perspective to people, and while that may be partially true, it is not a good enough reason to forgo any attempts to do so.  In spite of the inherent self-centeredness of the young, we do not have to be that way.  We all learn through repetition, and so while an attempt to teach perspective may fail at first, well, you know what they say: if at first you don’t succeed, try, try again.

Think of those people who seem to have been born with the capability of seeing life from the mountain top.  We often call them “old souls”, those individuals whose few years on this earth don’t correspond with their insight and understanding of the world and its inhabitants.   Yet, usually these individuals share some things in common that can account for their seemingly premature insight.   More often than not, they are acute observers, or survivors of a traumatic experience, or voracious readers.  The existence of these “old souls” demonstrates that perspective, and hence wisdom, is not something reserved for the elderly.   We do not have to have endured and suffered many years on this planet to earn the understanding the older generations have acquired over their decades on earth.   Yet, it is something we must actively cultivate for ourselves and teach to our youth if we wish the young to possess the perspective of the more mature.  Fortunately, this effort is worthwhile.   Having a broad perspective is the basis of wisdom, the vital component to contentment, the essential element in compassion.  So how do we go about gaining this perspective?

While perspective can arise after enduring traumatic experiences, I am not suggesting we go around traumatizing the youth.  (It is, however, a way to put a positive spin on the less savory events a friend or loved one may experience.)   Fortunately, there is an easy way we can teach perspective in schools: through reading.   Think of the multitude of experiences, feelings, biases, and conversations we are exposed to while reading a book.  This is one of the great benefits of literature: it forces us to look beyond our own worldview.  In fact, studies have shown that those who read tend to be more empathetic humans, and I have no doubt that is due to the broadening of perspective that comes with reading others’ stories.  Reading is a cheap, easy way to expose ourselves to the myriad of choices human beings can face.  Students who analyze Night can learn how inhumane humans can be, racists who study Uncle Tom’s Cabin or To Kill a Mockingbird may perhaps relent in their prejudiced worldview, and teenagers who read Oliver Twist may even find gratitude for their parents.  I am not saying that this is a certainty, but it is indisputable that the more we “walk in each other’s footsteps,” the more paths we have to compare to our own, thus ensuring a less narcissistic perspective.

Both schools and parents can encourage reading as a means of gaining this less selfish worldview; however, there are other means to this end, as well as pitfalls that endanger it.  Parents, for example, can prevent their children (and themselves) from indulging in the “reality” TV shows and the idolization of celebrity lifestyles promoted by tabloid magazines and television.  These sources of entertainment give us, and more importantly the impressionable youth, a distorted perspective.   They skew our view of the world, making us think these individuals live lives far removed from the trivial and mundane tasks that make up most of our days.  Instead of generating appreciation for what one has, they create ingratitude for what one has not.

And lastly, in addition to avoiding the distorted lifestyles depicted in the media and vicariously experiencing through literature what we may otherwise never actually experience, we can gain wisdom by seeking out new, sometimes difficult, life experiences.   This is where I expect the most resistance, yet it is also the most surefire and the swiftest means of broadening our perspective.   Let me give you another example to illustrate what I mean.

A few years ago I went on a “volunteer vacation” to the Dominican Republic.  During this experience, I endured stifling hot and humid days filled with manual labor, giant centipedes and equally enormous and disgustingly hairy tarantulas, and an absence of television, internet, air-conditioning or even privacy.   And I paid handsomely for this experience, as much or more than it would have cost me to take an extravagant trip abroad.   My friends and family, while outwardly expressing admiration for my sacrifices, were clearly confused by my choice.

Yet, I would not give up my experience in the Dominican Republic for the most luxurious vacation elsewhere.  I learned more about myself, my good fortune, my fortitude, and my limitations on that trip than I have in any other experience.  Yes, there were moments of longing for a private toilet or shower, hot water, or a bed that was not covered in my sweat and mosquito netting, but without those deprivations, I would not have grown as much as I did in those few weeks.  I would not have gained the perspective that now makes me ashamed of the abundance most of us take for granted.

Don’t worry.  I am not advocating that parents forgo all trips to Disneyland or that schools stop field trips to museums; just that every now and then, you choose the more formidable, less comfortable path.   Instead of going to see Cars 3 with your children, go to a homeless shelter to pass out blankets and food. Yes, enjoy a trip to the Bahamas this year, but next year, spend the same amount of time and money in Appalachia, helping those who don’t have the option of a vacation.   You may not come back with a gorgeous tan or pictures with Mickey and Minnie, but you will come back with all the wisdom that a broadened perspective provides.