Who doesn’t love to travel? The experience of a new place and new people is exhilarating. Yet, while most of us enjoy traveling, we are much more uncomfortable with the identity or role of a traveler: the tourist. For locals, tourists can be nuisances. They are predictable, travel in herds and get in the way. As travelers, we attempt to avoid the label through sophistication, respect, knowledge or simply an ability to blend in. Is the tourist-local standoff inevitable? Can we ever really transcend the role of tourist while traveling?
These were the questions that caught my attention in my first real venture onto sociological ground: Jim Dowd’s Sociology of Travel course at the University of Georgia. Returning to UGA having completed a semester abroad and an extra six weeks of solo backpacking through Europe, I entered the class feeling accomplished, worldly and smug. Insert sociology. We placed travel and its motivations under a microscope, we critiqued the tourist economy and we deconstructed the neat dichotomy of tourist/traveler in which I had clearly classified myself as the latter. I was both humbled and intrigued.
Dean MacCannell—something of a pioneer in the sociology of travel—claimed that our travel is motivated by a quest for authenticity. I was fascinated by his work. He suggested that, in increasingly commercialized societies, the authentic becomes a highly valued commodity. The authentic is sacred, making tourism a modern form of the religious pilgrimage. MacCannell observed the use of the “tourist” label to denote those who disengaged from this quest; the “tourists” were those described as being “content with…obviously inauthentic experiences.” Yet, he was quick to point out that, for the most part, travelers settle for (and actually pay for) “staged authenticity.”
I was reminded of MacCannell’s work while on vacation in Puerto Vallarta a few weeks ago. It was my first trip outside the U.S. in over three years, and it was a different form of travel than I had experienced in the past. My boyfriend and I stayed in a resort hotel. It was right on the beach and filled with all sorts of conveniences, yet neither of us was completely comfortable in the experience. Perhaps its “obvious inauthenticity” triggered our anti-tourist reflex. As a result, we hopped on a water taxi to Yelapa, a more remote fishing village near Puerto Vallarta accessible only by boat.
This is where I find the quest for authenticity to be paradoxical. In the search for authenticity, travelers disturb less touched or untouched areas, thus extracting that which they seek.
Or do they? It depends on how you define the authentic. If it is characterized as being untouched or primitive, then yes, entrance into that space destroys its authenticity. Yet, if visitors seek life “as it really is,” then perhaps not. The entrance of the tourist influences a place. Their entrance changes its landscape. The interplay between tourist and host becomes a part of everyday life. Take New York City, for example. The continual onslaught of tourists has become a part of the city. The many visitors to Times Square are a part of its fabric. I’m not suggesting that a “back stage” of New York no longer exists—of course there continue to be many local hideaways—but I am suggesting that perhaps authenticity exists in the tourist areas as well.
I noticed this evolution of authenticity in Yelapa. While it receives far fewer visitors than Puerto Vallarta, Yelapa has evolved to incorporate tourism into daily life within the village. This left me feeling ambivalent about our journey there. In our search to transcend the tourist identity through experiences of authenticity, we were changing the very experience we were seeking. Perhaps more disturbing, we were directly contributing to a shift in Yelapa’s daily life, culture and economy.
In some ways, the shift could be described as positive—increased interactions of different ideas, people and cultures. On the other hand, exploitation of Yelapa’s residents and culture is a real risk. Such changes may be inevitable, but does our contribution to that shift come with certain responsibilities? For example, is there a responsibility to invest in the local economy while visiting a place? Is there a responsibility to be knowledgeable of a local culture and language prior to stepping foot in it? Given increased movement across the globe, are such requests reasonable or even possible? Given the expansion of global media and business, does the impact of the traveler even register? Overall, despite great effort, does anyone ever truly avoid the tourist trap?
MacCannell, Dean. 1973. “Staged Authenticity: Arrangements of Social Space in Tourist Settings.” American Journal of Sociology 79(3):589-603.
After I declined a friend’s invitation recently, she replied, “I’m happy to hear that you are so busy, but I’m curious—what is it that is keeping you so tied up?” Her question caught me by surprise. For several years, “I’d love to, but I’m just too busy” has been my standard response to invitations to spend time with friends, join teams, plan trips or even attend weddings. To me, this response was both valid and justified; I was sacrificing fun for hard work. I was being responsible. My friend’s question, however, led me to further interrogate this auto-reply. What else might my proclamation of busyness communicate? Have I also inadvertently been proclaiming my own self-importance? Have I not—albeit unintentionally—been implying that others are not busy or that my time is clearly more valuable?
Apparently, I’m not alone in my frenzy or in my ambivalence about it. In his popular June 30th New York Times piece, Tim Kreider critiqued what he called the “busy trap,” arguing that, as a society, we drive each other toward busyness in an effort to “hedge against emptiness.” Kreider calls into question the very nature of busyness; is it truly that idleness is self-indulgent and busyness is productive, or is it perhaps the reverse? It’s not an entirely new question. Thoreau was a strong advocate of idleness, and many creative and intellectual industries boast the value of providing “employee down time.” Yet, skepticism of Thoreau’s actualization of an idle life, as well as of true “down time” within a workplace, abounds. And while this skepticism is valid, I would also posit that it is a symptom of the fact that, as a culture, we remain unconvinced of both the limitations of busyness and the benefits of idleness.
In this way, Kreider’s commentary is reminiscent of Nietzsche’s Beyond Good and Evil. In the philosopher’s 1886 deconstruction of the essentialism of truth, he provocatively challenged the concepts of good and evil. Claiming that Christian “goodness” was motivated by revenge and resentment and, in its emphasis on the afterlife, resulted in a devaluation of life, Nietzsche not only turned the meanings of good and evil on their heads but also forced readers to think more deeply about the contexts in which meanings are defined. There is no universal good and evil; rather, the concepts are subject to the societies within which they are situated. Therefore, perhaps the next step in an examination of our obsessive busyness should be to explore the contexts within which it has developed.
While a true investigation into the contextual influences of the “busy trap” would require more time and space than I have here, I would point to three factors that I believe are relevant and worth further analysis:
First, the world is smaller. We are now aware of the existence of more information, places and ideas than ever before, and more importantly—through technological advances—we also have access to these things. While wonderfully enriching, this can also be overwhelming. I know I’ve felt it. On my first day of grad school, a professor began our class by stating, “You are all already way behind.” It was an uncomfortable moment, but he was right. While scholars have always built on the work before them, they didn’t always have access to or knowledge of the entirety of literature and ideas that preceded them. Now, in large part, we do. With that access also comes a responsibility to master that work and, further, a pressure to continually surpass it. Such feelings are not limited to academia; the same is true when it comes to current events, foods, activities and travel. As opportunity multiplies, so does the weight of that opportunity.
Second, there is an increasing belief that we are each extraordinary. Contemporary parenting styles—characterized by positive reinforcement, customized attention and an appreciation and cultivation of children’s individuality—feed such beliefs. So do (now rather infamous) rituals of giving every child a trophy. New media programming and channels provide a reason to buy in to these beliefs; anyone, it seems, can become a reality TV star, a social media guru or a YouTube sensation. The only issue is, as David McCullough so succinctly put it during his Wellesley High commencement address, “if everyone is special, no one is.” High expectations of greatness can lead to an amplification of frantic activity undertaken in an attempt to grasp some level of exceptionality.
Yet, and this is my last point, measures of greatness, of success, are becoming more abstract, leaving many individuals seeking prestige, status or even accolades that either do not exist or are bound by time, space and knowledge. The workforce has shifted toward intellectual and creative labor characterized by its intangible value more than its quantifiable measures of success. How do you measure a good idea? Noteworthy art? The value of a business relationship? Further, with increasing specialization, it is difficult to translate the basics of one’s job, never mind its value. Even job titles have lost any sense of constant value—what exactly is a consultant, an account executive, or an analyst? With fields more specialized, organizations less hierarchical and careers more fluid, one’s title is no longer a concrete indicator of achievement. As a result of unattainable or short-lived recognition, there is a never-ending parade of effort. Efforts to demonstrate greatness without clear identifiers of that greatness have led to a plethora of work; we are now bombarded with advertising campaigns, patents, publications, films, jam-packed resumes, etc. that must be sifted through. Thus, both producers and consumers are constantly occupied.
Thus, with extraordinary success within reach—and expected—and with that success difficult to calculate, it is no wonder we are all deliriously buzzing about.
Yesterday, I had the pleasure of attending a sociology colloquium talk at UC Berkeley. In addition to the thought-provoking discussion on minority rights and international/national citizenship, I was also giddy (read: I am a nerd) to be in the home of one of my favorite sociologists—Arlie Hochschild. Although Hochschild was not in attendance (unsurprising given her emeritus status), the visit did bring her work to the forefront of my mind.
As a graduate student and even now, I am fascinated by the intersections of work and family, both how culture and structure shape the interface, as well as how different groups experience it. Much of the literature in this field begins with the assumption that work and family are based on two competing models: an efficient and profit-driven organization vs. a time-intensive and care-based kinship group. This is where Hochschild makes her contribution. In her book, The Time Bind, she argues—based on ethnographic findings—that within contemporary American society, the workplace has replaced the home as a hub of personal relationships and fulfillment. The family, she argues, has also evolved. No longer is the home a respite from a frenetic capitalist market; rather, it has become a place where every moment is scheduled and tensions peak. The meanings of work and home are no longer aligned with lived experiences; as her subtitle states, work has become home and home has become work.
A potentially powerful insight. One that, despite the book’s numerous citations within the work-family literature, has not yet been seriously pursued by work-family scholars.
There may be several reasons why Hochschild’s insight has been overlooked. Perhaps her observations are viewed as simply a reflection of a postmodern society in which universal binaries such as work/family can no longer be held constant across spatial, temporal and social contexts. Perhaps they are viewed as a misunderstanding of the more intimate interconnection of the two institutions produced by a technology-induced blurring of boundaries. Perhaps they are dismissed as a reflection of one small sample of the American population and do not hold true among other groups.
Regardless, it seems a worthwhile venture to explore the potential implications of her argument. The placement of care within the market, in and of itself, is alarming. As Hochschild has noted in her other work, treating care as a commodity can lead to potential alienation from emotion, as well as stratified experiences of and access to care. However, these are the consequences of a care market—what about the displacement of care within the home due to a prioritization of work?
The extent to which jobs consume not only our professional lives but also our personal lives is alarming given the recent recession. If Hochschild is correct, if work consumes not only employees’ time, energy and thought but also their emotional resources and social networks, what happens, as is the case now, when so many Americans lose their jobs? How can one define his or her identity when its core resides in a workplace with which he or she is no longer connected? How can families cope when they’ve been drained of the ability to do so? How can we, as a community, care about one another when those lucky enough to remain employed must devote the totality of their human, social and financial capital to maintaining their place within the workforce? Perhaps this is a nihilistic post, but the discussion may open the doorway for new opportunity. As we rebuild the American workforce, it is important to also rebuild our networks of care outside of it, both in our homes and in our communities. We are now faced with the chance (and the incentive) to do so.
Listening to Rush Limbaugh’s discussion of Paul Ryan’s Medicare plan a few weeks ago, I was amazed by the extent to which he lauded the plan’s provision of greater choice for Americans. Intrigued, I decided to take a look at the transcript online. I found seven references to “choice,” four references each to the words “choose” and “option” and two references to the word “chosen;” that’s a lot of choice. Limbaugh’s emphasis is not surprising; I’ve found myself in quite a few political discussions in which I’ve been confronted by the “choice is better for everyone” logic. While I’ve always found this argument unnerving, I never truly confronted it, always assuming its irony was clear and that those who adopted the logic did so simply to make an exaggerated point. However, in recent years, I’ve watched the words “choice,” “freedom” and the “American dream” provide momentum to a political machine that has turned support away from government regulation and social safety nets toward market-based policies that are not in the best interest of many of those who support them. So, I decided to explore this language a little more seriously…
First, I should point out that Limbaugh is not alone. Turning to last week’s Republican National Convention, both Republican candidates evoked themes of choice and freedom in their speeches. Paul Ryan, arguing that government programs hinder Americans’ freedom to choose their own life path, stated:
Now when I was waiting tables, washing dishes, or mowing lawns for money, I never thought of myself as stuck in some station in life. I was on my own path, my own journey, an American journey, where I could think for myself, decide for myself, define happiness for myself. That is what we do in this country. That is the American dream.
Mitt Romney, in his speech, focused on choice within one particular area of policy reform, proclaiming, “when it comes to the school your child will attend, every parent should have a choice, and every child should have a chance.” A small yet, I would argue, representative sample of the choice rhetoric employed by many Republicans.
From a strategic standpoint, this approach is effective. America was founded on the pursuits of liberty and of independence. The American dream—a belief that we can create our own destiny—is a cultural ideal that we, as Americans, hold closely to our hearts. The American dream gives hope to those who have not yet actualized the life that they aspire towards, and it relieves the wealthiest of responsibility in its suggestion that success is simply the result of working hard and making the right choices. Thus, it has wide appeal. Americans at all levels of the class ladder have reason to believe in the American dream, and when politicians advocate for more choices, they hope to build upon this cultural ideal, implicitly suggesting that more choices provide greater opportunities for us all.
Am I against choice? Absolutely not. As an American, I am grateful for the many freedoms of choice I am granted, including the freedom to choose government representatives, to choose what religious beliefs, if any, to practice and to choose how to respond to an unwanted pregnancy. In fact, it is my respect for these freedoms that leads to my current concern with the hijacking of these American ideals within political rhetoric. I contend that politicians are intentionally confounding greater choice with a greater equality of opportunity. In reality, choice is structured by race, gender and class. When Americans make school and healthcare choices, those choices are mediated, among other things, by access to information, ease of transportation and experience in negotiating with social institutions. We may be offered the same choices, but the ability to choose specific options is not the same. Therefore, it is important to disentangle greater choice from greater opportunity for all; access to choices is simply not equal.
My issue with the word choice being used by politicians to support neoliberal policies is that it hides privilege. It makes it easy to ignore the financial, social and cultural factors that enhance opportunity for some and hinder it for others. And hidden privilege perpetuates the status quo by blaming inequality not on an economic and political system that disproportionately supports the wealthy, but on the disadvantaged themselves. It vilifies those in need of social support and celebrates those at the top, and in this way it separates our nation. I don’t pretend to offer a solution here, but I do believe that, in beginning from the premise that choices are stratified, we can start to propose policies that are more relevant and beneficial to all Americans. By making the American dream a conceivable reality, such policies could strengthen the ideals upon which our nation was founded.
In a recent public apology, Missouri Representative Todd Akin stated, “I used the wrong words in the wrong way,” referring to comments he made about fertility rates among victims of “legitimate” rape. The GOP Senate Candidate’s apology highlights the power embedded within policymakers’ language—an important insight given the prominence of rhetoric and sound bites within our current political and media systems—and the consequences of those words for the nation’s female citizens.
First, Akin’s comment demonstrates the wide abuse of scientific knowledge within political debates. Politicians continually bombard the American public with data and expert opinions in an effort to bolster the credibility of their statements. In theory, this could be viewed as a responsible approach to public communication. In practice, subject experts and studies have become a dime a dozen, resulting in a cherry picking of facts and an overall atmosphere of confusion regarding quality sources of information.
It is not surprising, then, that when researchers Andrea Press and Elizabeth Cole interviewed women about their views on abortion, they found that both pro-life and pro-choice advocates referred to the authority of science to back their beliefs.[i] Akin’s vague reference to “doctors” to support his August 19th statement that those suffering from “legitimate” rape are less likely to become pregnant provides yet another example of the inappropriate use of scientific authority. As New York Times blogger Robert Mackey has pointed out, a natural defense against impregnation by rape is a longstanding myth with origins in the 13th century. In contrast to the representative’s comments, a 1996 study cited by the CDC provides empirical evidence that the fertility rate among female rape victims is 5%, a rate comparable to that of women who have had consensual sex.
More disturbing, as seen by the immediate and widespread reaction of politicians, media professionals, and public citizens, was Akin’s reference to “legitimate” rape. Since the interview, the representative has withdrawn that statement, explaining that he meant to say “forcible” not “legitimate” rape. While forcible rape is a concept backed by some House Republicans, evidenced by its original inclusion within the “No Taxpayer Funding for Abortion” legislation co-sponsored by vice presidential hopeful Paul Ryan, any classification system for rape can have alarming implications.
So, why should politicians take care when describing and defining rape? As the federal government acknowledged by expanding the definition of rape for statistical reporting, how we understand and talk about rape as a nation reflects how seriously we take violent sexual assaults within our culture, our policies, and our justice system. When some rapes are defined as “legitimate,” it suggests that—under some circumstances—access to and control over women’s (and men’s) bodies is acceptable. This raises the question: what are those circumstances? Is it not rape when it is not vaginal sex? Is it not rape if it is a male victim, when emotional or mental coercion is used rather than physical force, if the victim has a known sexual history, or if the victim’s clothing suggested interest?
When we begin to ask questions such as these, the victim becomes the target of interrogation and blame, as was common in U.S. courts prior to the introduction of the Rape Shield Laws. Even after the enactment of these laws, public scrutiny of victims has remained at the forefront in many rape cases. As a result, attention is diverted away from the rapists themselves. Measures are taken to increase victims’ and potential victims’ diligence (e.g., safety tips directing women to stay with groups, drink responsibly and avoid dark areas), ignoring the more pressing need to identify and prosecute rapists.
Regarding abortion, the issue at hand, the implications of classifying rapes as legitimate are ironic. In those cases not classified as legitimate rape, in which women’s control over sexual access to their bodies is not protected, women are also forced to forfeit their reproductive rights. Thus, the label of legitimate rape robs women of their rights not once, but twice. In a society founded upon ideals of equality and human rights, this is simply unacceptable.
[i] Press, Andrea L., and Elizabeth R. Cole. 1999. Speaking of Abortion: Television and Authority in the Lives of Women. Chicago: University of Chicago Press.