The New You

You’ve probably never heard of TestingMom.com. It’s part of a new generation of test-prep companies in the mold of Kaplan and Princeton Review –– except this one is for toddlers. Competition for slots in kindergarten has gotten so intense that some parents are shelling out thousands of dollars to get their four-year-olds ready for entrance tests or interviews. It’s just one more example of the pressure that got celebrity parents arrested for falsifying college applications a few years ago. In this case the battle is over getting into elite elementary schools or gifted programs. While such admissions pressure is widely known, what’s new is how early it’s occurring. Equity issues aside, the demand to improve performance is being drilled into youngsters before they can spell their names. All of this bespeaks the competition for grades, school placement, and eventual careers that has transformed the normal impulse to do better into an obsession for students and their families. Much like the drive for perfection, an insatiable hunger to be quicker, smarter, and more acceptable to admissions officers is taking its toll in many ways.

What explains this obsessive behavior? Brain science has been proving what advertising long has known –– that wanting something is far more powerful than getting it. School admissions and other markers of success are part of an overarching mental wanting mechanism. That new iPhone might bring a thrill. But soon comes the yearning for an update, a newer model, another purchase. Neuroimaging shows that processes of “wanting” and “liking” occur in different parts of the brain, with the former operating more broadly and powerfully than the latter. This reverses the common wisdom that primal hungers and “drives” underlie human motivation. Unlike other animals, human beings are driven by imagination –– with the anticipation of something mattering more than the experience itself. This partly explains why merchandising deals more with feelings than facts. Slogans like “Just Do It” and “Think Different” bear no direct relationship to shoes or computers, but instead stir feelings of desire. In the fuzzy realm of emotion, pleasure is a fungible currency.


Find Your Superpower

“How to Find Your Superpower” is among thousands of recent articles, books, and improvement programs about the age-old dream of an updated self. Like others in its genre, the piece offers guidance for achieving “peak performance” through a blend of passion, mastery, and hard work. “The #1 thing you can do is determine your strengths, determine your superpowers,” the author states in coaching readers to sharpen “a dominant gift, an attribute, skill or ability that makes you stronger than the rest: a difference between you and your coworker.”[i] Find that elusive something, and you are sure to succeed. Pitches like this appear everywhere these days. Witness the massive market for fitness, beauty, self-esteem, and cognitive improvement products. These range from dietary supplements and workout regimens to books, videos, and apps. Amazon is loaded with titles like Your Hidden Superpower, Finding Your Superpowers, and the kids’ book What’s My Superpower? [ii]

Juvenile appeals notwithstanding, a consistent theme runs through all these books – that it is up to you alone to find, develop, or somehow acquire missing capacities. Rarely is there a mention of structural advantages or disadvantages in the superpower quest. The impulse to exceed one’s limits has a long history in Western thought, with roots in religious doctrine and philosophy. Some even link enhancement to hard-wired survival instincts. Simply put, people have been augmenting themselves for thousands of years, first by using tools, then by working in groups, and later with machines and technology. From the Enlightenment Era onward, this was seen as humanity’s “natural” impulse for continual improvement and progress. Ongoing developments in science and medicine have intensified this drive, along with the heightened sense of crisis in the 21st century. The result has been a growing mania to become stronger, smarter, and better looking than anyone else.

Then add popular culture. Everyone knows the basic superhero plot: stories begin with ordinary characters (often underdogs), who transform via accident, discovery, or gift. With new powers, the superhero battles evil and invariably prevails. Such stories now comprise the most cherished works of mainstream media, generating fortunes for movie franchises: Marvel ($18.2 billion), Harry Potter ($9.1 billion), X-Men ($5.8 billion), DC Universe ($4.9 billion), Spider-Man ($4.8 billion).[iii] It’s easy to see the appeal of these films. In an essay titled “Why Everyone Has Seen a Superhero Movie,” critic Gwyneth Torrecampo explains that “The unique challenges we face in our everyday lives can be daunting and stressful, leading us to feel powerless and dejected.”[iv] Viewers thus identify with the hero as a form of wish fulfillment, she argues. “Superheroes often start out like you and me, and then go on to overcome obstacles, protect the vulnerable, or save the world. It’s a potent fantasy that inspires imitation among viewers.”

The superhero syndrome is the fantasy version of “human enhancement,” defined as “the natural, artificial, or technological alteration of the human body to enhance physical or mental abilities.”[v] On one hand, there is nothing terribly new or unusual about this. Running shoes and vitamins are enhancements that people take for granted. And indeed, much of modern medicine devotes itself to such helpful interventions, especially when they address genuine needs or difficulties. An appropriately measured restoration of health and functionality has always defined the healing professions, as discussed in Chapter 5. But in recent years, the marriage of science and business has gone well beyond “getting back to normal,” offering ever-more-sophisticated forms of enhancement to meet the public’s insatiable appetite for “more.” This has not come without controversy. The troubled histories of cosmetic surgery, fad diets, and steroid abuse are but a few notable examples. Certainly science-fiction superpower stories play a big part in the phenomenon. But on another level, the hunger for such products feeds on the gnawing anxiety now epidemic in the U.S. In addition to economic disparities and their socio-cultural underpinnings, new levels of perfectionism percolate in mainstream culture as well. Advertising only reinforces these impulses by linking them to products.

Human enhancement fascinated German philosopher Friedrich Nietzsche, known for his social criticism and advocacy of science. In 1883, Nietzsche introduced the aristocratic figure of the “Übermensch” (Superman) as an aspirational ideal for the human race. He argued that this “perfect human” could be achieved through secular means on earth (rather than in heaven) by improvements in health, creativity, and willpower. In making this claim, Nietzsche wasn’t simply promoting sci-fi fantasy. Putting the Übermensch in a broader context, he explained that every society generates a set of body ideals, and that those ideals inform what societies value and how they behave. The Übermensch complemented then-popular beliefs about human evolution, especially the strain of thinking known as “eugenics.” Also introduced in 1883, eugenics applied Charles Darwin’s theories of natural selection to social policy by advocating the selective reproduction of certain classes of citizens over others (something Darwin himself never advocated). National leaders like Winston Churchill and Theodore Roosevelt supported the concept, along with many others around the world.[vi] Nazi eugenicists later would cite Nietzsche’s Superman concept in their genocidal program to perfect the Aryan race during World War II.

The excesses of early eugenics movements have tempered contemporary thinking about human enhancement but have done little to dampen yearnings for superhuman updates to the body and mind. Unforeseen consequences often flow from what initially seem like good ideas, and enthusiasm has little patience for downsides. Add commerce, culture, and health benefits to the mix, and it’s no mystery why the update impulse is stronger than ever. This chapter examines the resulting contradictions in today’s improvement culture, as they play out in beauty, fitness, wellness, intelligence, and ability. Key to this discussion is the assertion that enhancements in themselves are neither good nor bad. Like many things, what matters is the degree to which they are pursued, as well as what happens when external values or pressures are placed upon them.

Many use the term “transhumanism” to describe the contemporary update impulse in everything from robotic cyborgs to artificial organs. As the name implies, transhumanism wants humanity to transcend its limitations, with a strong emphasis on subjective autonomy and the specialness of the human species. Philosophically speaking, the movement sees humanity in a contest with nature and the natural world. It partakes of the belief that humans should use nature for their own ends and master its processes with science and technology. This takes form in enhancements to augment the “natural” body or, ultimately, to forestall or eliminate the natural occurrence of death. Because of this, some critics equate transhumanism with anthropocentrism, as well as with historic tendencies to denigrate groups seen as uncivilized, savage, or otherwise less-than-human owing to their proximity to nature.

Transhumanism differs from the similar term “posthumanism,” which looks at the way the human self is technologically mediated, and at how humans coexist with other organisms. Writing in The Posthuman Glossary, Francesca Ferrando explains the distinction: “Transhumanism and posthumanism both emerged in the 1980s and 1990s, but the drives motivating them are rooted in different traditions of thought. Transhumanism traces its roots within the enlightenment and does not reject the human tradition; on the contrary transhumanism focuses specifically on human enhancement.” In contrast, posthumanism focuses on “the postmodern deconstruction of the human started in the 1960s and 1970s underlining the fact that, historically, not every human being has been recognized as such.”[vii]

British futurist Max More often gets credit for mapping out the first full-fledged philosophy of contemporary transhumanism in his 1990 “Principles of Extropy.”[viii] More used the term “extropy” (the opposite of entropy) to assert the continual evolution of “intelligent life beyond its current human form and limitations.”[ix] This can include anything from prosthetic limbs, brain implants, and gene splicing to futuristic plans for extending the human life span or uploading consciousness to the cloud. Most of what is seen in contemporary science fiction falls within the transhumanist realm, for better or worse. While transhumanism focuses on the laudable goal of improving people’s health, ability, and wellbeing, it often glamorizes technology as an end in itself. Transhumanists imagine a future in which people become liberated from the constraints of infirmity and death but remain essentially human.

Transhumanism often finds itself tangled in ethical debates about technology’s appropriate role in life. Some critics worry that too many changes might alter what it means to be human in the first place. Others point out that technologies often get misused or run out of control. And still others express concern about the high costs of enhancements. Underscoring this last point, some of transhumanism’s most well-known boosters are tech billionaires like Elon Musk and Peter Thiel. New technologies often spring from genuine needs and good intentions. Yet they inevitably become contingent on cultural attitudes, market forces, and the institutions that enable them.

[i] Gwen Moran, “How to Find Your Superpower,” Fast Company (Jun. 8, 2018) https://www.fastcompany.com/40578240/how-to-find-your-superpower (accessed Apr. 22, 2022).

[ii] Becca North, Your Hidden Superpower (Independently published, 2018); Carter Hughes, Finding Your Superpowers: Keys to Cementing Your Identity and Reaching Your Goals (Independently published, 2020); Aviaq Johnson and Tim Mack, What’s My Superpower? (New York: Inhabit Media, 2017).

[iii] Jennifer M. Wood, “10 Highest Grossing Movie Franchises of All Time,” Mental Floss (Mar. 18, 2019) https://www.mentalfloss.com/article/70920/10-highest-grossing-movie-franchises-all-time (accessed Apr. 19, 2022).

[iv] Gwyneth Torrecampo, “10 Reasons Why Everyone Has Seen a Superhero Movie,” Medium (Aug. 16, 2018) https://medium.com/framerated/10-reasons-why-superhero-films-are-so-popular-2ce69d2d93ea (accessed Apr. 19, 2022).

[v] “Human Enhancement,” Stanford Encyclopedia of Philosophy (Apr. 7, 2015) https://plato.stanford.edu/entries/enhancement/ (accessed May 20, 2022).

[vi] Victoria Brignell, “When America Believed in Eugenics,” New Statesman (Dec. 10, 2010) https://www.newstatesman.com/society/2010/12/disabled-america-immigration (accessed Apr. 24, 2022).

[vii] Francesca Ferrando, in Rosi Braidotti and Maria Hlavajova, eds., The Posthuman Glossary (London: Bloomsbury Academic, 2018) p. 439.

[viii] See Max More, “The Philosophy of Transhumanism,” in Max More and Natasha Vita-More, eds., The Transhumanist Reader: Classical and Contemporary Essays on the Science, Technology, and Philosophy of the Human Future (Malden, MA: Wiley-Blackwell, 2013) p. 10.

[ix] More, p. 3.

Empowerment for Sale

“Yes You Can” (Sprint), “Be All That You Can Be” (U.S. Army), “Because You’re Worth It” (L’Oréal), and “Your World, Delivered” (AT&T). You’ve seen these ads: pitches for products or services to let you “be yourself” or “take control” of some aspect of your life. It’s a new strategy called “empowerment marketing,” based on the premise that in a media-savvy age people are smarter about advertising and need to be approached in a way that flatters their evolved sensibilities. As a recent feature in Your Business put it, “Traditional marketing depends on creating anxiety in the customer in convincing her that she has a need that only the product or service sold can help her fill.” In contrast, “Empowerment marketing subverts traditional marketing techniques by recasting the consumer as the hero who has the power to effect change and use the product or service being sold to achieve success.”[i]

Nice as this sounds, it is really a case of putting old wine in new bottles. The example Your Business uses is the familiar Nike “Just Do It” campaign, which doesn’t so much promote a certain shoe as “the message that anyone can be an athlete if they’re willing to work hard.”[ii] And indeed, this is exactly the message that appears on the first page of Nike’s current website: “Your daily motivation with the latest gear, most effective workouts and the inspiration you need to test your limits––and unleash your potential,” with a fashion item lower on the page captioned “Dress like a champion.”[iii] In other words, the new empowerment advertising doesn’t really forgo conventional appeals to consumer anxiety. It simply personalizes the pitch with the lure of enhanced autonomy. The Nike ad itself sums up this contradiction perfectly in stating: “Life isn’t about finding your limits. It’s about realizing you have none.”[iv]

The Learning Society

David Trend

As consumer prices continue to rise, experts now warn of a looming recession brought about by pandemic manufacturing slowdowns and supply-chain shortages. Economists explain it as a classic case of demand outpacing availability –– with scarcity making things more costly. Unfortunately, the painful solution now being launched will raise borrowing rates so that people spend less. While these measures may or may not improve the overall economy, the combined effects of inflation and rising interest rates will deal a double blow to people struggling to make ends meet. In such an atmosphere it becomes critical to help people manage their own finances and to prevent the broader economy from overheating. This is where consumer education and financial literacy can help as part of a larger move toward a “learning society.”

For some time now, economists have been promoting financial education in public schools and urging people to become more resourceful. Time Magazine reported polls showing “99 percent of adults in agreement that personal finance should be taught in high school.”[i] The Federal Reserve argued that “financial literacy and consumer education, coupled with strong consumer protections, make the financial marketplace ‘effective and efficient’ and assists consumers in making better choices.”[ii] Many colleges and universities have started making financial literacy courses graduation requirements. And for some it has worked, as many Americans “put their own budgets under the microscope –– akin to what financial analysts routinely do when they scrutinize companies.”[iii]


The Creative Inner Child?

David Trend

Pablo Picasso once quipped that “Every child is an artist; the problem is how to remain an artist once they grow up.”[i] In this often-quoted slogan, Picasso neatly summarized idealized views of the universally creative child and the uncreative adult. In a similar fashion he would later write that “It takes a long time to become young.” What is one to make of such laments? Nostalgia over a lost youth? A yearning to escape a pressurized grown-up life? Regardless of origins, it’s impossible to deny America’s ongoing infatuation with childhood creativity.

This fascination with childhood artistry dates to the 1700s, corresponding to evolving views of children as “blank slates” (tabula rasa) better served by nurturance and education than by discipline alone. At the same time, Enlightenment debates over individualism and personal autonomy were bringing considerable anxiety to the era, evidenced in worries that self-interest would overwhelm moral sentiments. This set the stage for the naturalism espoused by Jean-Jacques Rousseau in his book Emile: Or, On Education, which saw an inherent “goodness” in children that becomes corrupted by adult desire and material want.[ii] With the 1800s, views of “human nature” gave way to theories of evolution and behavioral adaptation –– owing in large part to the influence of Charles Darwin and Herbert Spencer. While the resulting rationalism eventually would make education more formulaic, an artsy transcendentalism would counterbalance American culture with an advocacy for an “educated imagination.”[iii] The Romantic Era writings of Ralph Waldo Emerson, Margaret Fuller, Henry Wadsworth Longfellow, and Walt Whitman advanced themes of emotion over reason and imagination over reality –– setting in place a progressive tradition of push-back against the instrumentalist ethos of science and industry.

In the 1920s, Swiss psychologist Jean Piaget began charting children’s “stages” of maturity, hence launching the modern field of child development.[iv] Piaget saw “realistic” rendering as a learned ability rather than a natural inclination. In one famous study, Piaget asked a group of four-year-olds to draw familiar people or objects. He found that the images invariably had the same characteristics: drawn from memory rather than observation, exaggeration of certain salient features (faces, for example), and a disregard of perspective or scale. In other words, the images derived more from mental symbolism than from conventional schemata of visual representation. Piaget would note that at later ages children acquire the ability to “correct” their images to conform to normative depictions of reality. Later observations of so-called “feral” children (raised in the wild without human contact) found that such children often didn’t speak or make pictures of any kind, further reinforcing the premise that language and “artistic” rendering are largely determined by culture.[v]

Natural Born Killers?

David Trend

“Confessions of a Drone Warrior” is one of hundreds of articles on the military’s use of Unmanned Aerial Vehicles (UAVs), which began in the early 2000s. In many ways this new form of combat embodies the psychological distancing that typifies killing in the twenty-first century. The story about Airman First Class Brandon Bryant recounts his first day in a Nevada bunker, when the 22-year-old fired on two presumed Afghan insurgents on the other side of the world. An early recruit in this new kind of warfare, Bryant “hunted top terrorists, but always from afar” –– killing enemies in countless numbers, but not always sure what he was hitting. “Meet the 21st-century American killing machine,” the story concluded.[i]

Of course, notions of aversion to fighting don’t sit well with either military doctrine or public belief. Behind America’s infatuation with high-tech weapons lie long-cultivated attitudes toward violence itself. In a class I teach on this, students often will express common-sense views that fighting is “natural,” deriving from humanity’s animalistic origins, and often the only way of resolving conflicts. One sees this kind of thinking in permissive attitudes toward everything from boyish rough-housing to violent sports. The gendered aspects of violence receive less attention than they should, and will be addressed at length in Chapter 9. Suffice it to say that aggression often is expected of men and boys, an expectation reflected in popular culture. Along with political partisanship, these attitudes help explain the deep divisions within the U.S. electorate over gun control and so-called “stand your ground” laws. Since even scholars often disagree over the issue of human violence, it helps to break the question into subcategories –– and also to point out how knowledge has changed over time in the fields of biology, psychology, and cultural analyses of violent behavior.


Stigma and Mental Illness

By David Trend

“The more I became immersed in the study of stigmatized mental illness, the more astonishing it was to me that any such phenomenon should exist at all,” writes Robert Lundin, a member of the Chicago Consortium for Stigma Research. “I believe that serious and persistent mental illnesses, like the one I live with, are clearly and inexorably no-fault phenomena that fully warrant being treated with the same gentleness and respect as multiple sclerosis, testicular cancer or sickle-cell anemia.”[i] Here Lundin names a central problem in the social construction of mental illness: the misunderstanding of conditions affecting the mind as somehow different from other biological illnesses. This misrecognition renders mental illness prone to the judgmental attributions discussed by Susan Sontag in her 1978 book Illness as Metaphor. To Sontag, contemporary society reverses ancient views of sickness as a reflection of the inner self. In this new view, the inner self is seen as actively causing sickness––through smoking, overeating, addictive behavior, and bad habits: “The romantic idea that disease expresses the character is invariably extended to assert that the character causes the disease–because it has not expressed itself. Passion moves inward, striking within the deepest cellular recesses.”[ii] But as before, the sick person is to blame for the illness.

Such sentiments are especially vindictive when a mentally ill person commits a crime. Understandably perhaps, clinical terms like “mental illness” quickly acquire malevolent meanings in the public mind––even though the mentally ill statistically are no more prone to criminality than anyone else. Sometimes this semiotic slippage causes public panic over commonplace disorders. Consider the case of Adam Lanza, the young man who in 2012 shot 26 children and adults at the Sandy Hook Elementary School in Newtown, Connecticut. While mental health analysts speculate that an acute psychotic episode prompted his violence, Lanza never had been diagnosed with a serious mental illness. As reporters scrambled for a story, much was made of Lanza’s childhood symptoms of Asperger’s syndrome, a form of high-functioning autism. The repeated mention of this disorder in news coverage triggered wrong-headed fears nationally about the murderous potential of other autistic kids. According to the Centers for Disease Control (CDC), approximately 1 in 50 people (1.5 million) fall somewhere on the autistic spectrum, 80 percent of whom are boys.[iii] This has prompted improved diagnostic measures, which in turn have resulted in an apparent rise in autism cases in recent years––up 78 percent from a decade ago––and made autism a source of acute anxiety for many new parents.

Big Data vs. Artists and Everyone Else

By David Trend

Heard about Generation Z? The demographic growing up in the 2000s? It’s a bigger group than Boomers or Millennials –– and it has one further distinction. “Members of Generation Z are ‘digital natives’ who cannot remember what it was like not to have access to the Internet –– no matter when, no matter what, no matter where,” according to Forbes Magazine. This is a group raised on networked “connecting” with others, sharing, and buying things. It’s second nature to Gen-Zers to upload their favorite music to YouTube, post images on Facebook, and sell things on Etsy or eBay. Much is being made in creative-economy talk of how networks now blur traditional producer/consumer roles, manifest in the new figure of the “prosumer.” In Wikinomics: How Mass Collaboration Changes Everything, authors Don Tapscott and Anthony D. Williams effused over the democratization inherent in the new “Openness, Peering, Sharing and Acting Globally.” Of course, there is nothing really new about home-made items, crafts, and people’s willingness to share. What’s different today is the ability to copy digitized materials and circulate them via electronic networks. Digitization also has made Generation Z the first demographic to be completely tracked by “big data” analytics.

Some creative-industry experts argue that this is nothing short of a revolution, driven by ongoing change more than any clear future. Evolutionary economist Jason Potts and collaborators have proposed what they term “social network markets,” unlike the top-down models of industrial capitalism. Characterized by fluidity and exchange through complex fields of actors, the new social network markets are less governed by competition and profit than by communication and preference. Participants are “Not ‘buying’ the property, but buying into the social space.” Moreover, the dynamics of these new markets are highly interactive. As the Potts group put it, “a social network is defined as a connected group of individual agents who make production and consumption decisions based on the actions (signals) of other agents on the social network: a definition that gives primacy to communicative actions rather than connectivity alone.” Almost by definition, this process rules out conventional manufacturing or professional services. Instead, the networks generate value through the production and consumption of network-valorized choices.
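The Potts definition lends itself to a toy simulation (a purely illustrative sketch, not drawn from the Potts paper: the ring network, weights, and update rule below are invented assumptions). Each agent chooses between two goods, weighting neighbors’ observed choices more heavily than private taste, which is what “decisions based on the actions (signals) of other agents” amounts to in miniature:

```python
import random

# Toy "social network market": agents choose between two goods
# based mostly on the observed choices (signals) of their neighbors.
random.seed(42)

N = 20
# Ring network: each agent sees its two adjacent neighbors.
neighbors = {i: [(i - 1) % N, (i + 1) % N] for i in range(N)}
choice = {i: random.choice([0, 1]) for i in range(N)}  # initial choices

SOCIAL_WEIGHT = 0.8  # how strongly peer signals dominate private taste

for step in range(500):
    i = random.randrange(N)
    # Fraction of i's neighbors currently choosing good 1.
    peer_signal = sum(choice[j] for j in neighbors[i]) / len(neighbors[i])
    private = random.random()  # idiosyncratic preference noise
    p_good_1 = SOCIAL_WEIGHT * peer_signal + (1 - SOCIAL_WEIGHT) * private
    choice[i] = 1 if random.random() < p_good_1 else 0

print(f"Share choosing good 1: {sum(choice.values()) / N:.2f}")
```

Because each choice feeds back into the signals other agents see, preferences cluster through imitation rather than settling at the price-driven equilibrium a conventional market model would predict.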

The beauty is that much of what is online now is free––seeming to arrive just in time in a tight economy. While a lot of the “free” stuff available online is user-generated (selfies, birthday announcements, anecdotal postings, etc.), a huge volume of material comes from other sources (news outlets, filmmakers, commercial music producers, artists). On the surface it looks like old Marxist doctrines are being reversed as items seem to be “decommodified” in the sharing economy. This idea has become an anthem of resistance in some circles. The Burning Man Festival, to take one example, has stated: “When we commodify we seek to make others, and ourselves, more like things, and less like human beings. ‘Decommodification,’ then, is to reverse this process. To make the world and the people in it more unique, more priceless, more human.” This may be all well and good in the real-life sharing of food and weed at Burning Man. But when things get virtual, it’s usually a large corporation that owns the websites, servers, and networks that make sharing possible.

Infidelity

Which type of cheating is worse, sexual or emotional? It depends who you’re asking — more specifically, what gender you’re asking.

A new study published in the journal Evolutionary Psychology set out to determine how people feel about the two types of infidelity.

Researchers from Kansas State University recruited 477 adults — 238 men and 239 women — and asked them to fill out several questionnaires on a variety of topics, including relationships and cheating. One such question was, “Which would distress you more: Imagining your partner enjoying passionate sexual intercourse with another person or imagining your partner forming a deep emotional attachment with another person?”

After analyzing the results, researchers came to a very clear conclusion: “Males reported that sexual infidelity scenarios were relatively more distressing than emotional infidelity scenarios, and the opposite was true of females,” they wrote in the study.

Interestingly, the purpose of the study was to determine which factors — be it attachment style, feelings of trust, relationship habits, etc. — would lead someone to feel one way or the other about cheating. But at the end of the study, researchers discovered that the only factor that played a role was gender. Men were most upset by physical cheating and women were more upset by emotional cheating — end of story.
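For readers wondering what that analysis boils down to, a gender difference on a forced-choice question like this is commonly checked with a chi-square test of independence on the 2×2 table of gender versus answer. A minimal sketch (the counts below are invented for illustration; the excerpt doesn’t report the study’s actual cell counts or statistical test):

```python
from scipy.stats import chi2_contingency

# Hypothetical counts, NOT from the study: rows are men (238 total)
# and women (239 total); columns are "sexual infidelity worse" and
# "emotional infidelity worse".
table = [
    [150, 88],   # men
    [75, 164],   # women
]

chi2, p, dof, expected = chi2_contingency(table)
# A tiny p-value means gender and response are associated.
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2g}")
```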

What do you think: Can it really be so black and white?


The persistence of gender pay inequity

For all the progress made on women’s rights, one measure of inequality still stands out: Females earn less than males, even in the same occupations. Closing this gender gap will require changing the way employers think about work.

It’s hard to overstate how far women have come in the last century. They are now almost as active in the labor market as men, and equally or even better educated. They account for about half of all law and medical school enrollments, and lead men in fields such as biological sciences, pharmacy and optometry.

Still, women have yet to reach the same level of pay. As of 2010, the annual earnings of the median full-time, full-year female worker stood at 77 percent of the median male’s — up from 56 percent in 1980 but still far from parity. For college graduates, the number was an even lower 72 percent.

Why the persistent difference? U.S. data provide two clues. First, the gap increases with age: Women start their careers close to earnings parity with men, then fall behind over the next several decades. Second, wage differences are concentrated within occupations, meaning that women earn less not because they choose lower-paid professions, but because they are paid less within the same ones.

The earnings gap is most pronounced in occupations such as law that place a premium on the willingness and ability to work long hours, be in the office at specific times and build face-to-face relationships with co-workers and clients. In these professions, the penalty for working part time or taking time off — to give birth or care for a child, for example — is particularly large. Small differences in time away or in hours translate into large differences in pay.

Consider the case of women with master’s degrees in business administration. At 10 to 16 years into their careers, they are typically earning only 55 percent of what men do. Childbearing is a primary reason for the divergence. A year after giving birth, women’s workforce participation rate declines 13 percentage points. Three to four years later, the decline reaches 18 percentage points. In other words, many MBA moms try to stay in the fast lane but ultimately find it unworkable.
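To see what these ratios mean in dollars, here is a quick back-of-the-envelope calculation (the $50,000 male median is a made-up round figure for illustration; the ratios are the ones quoted above):

```python
# Illustrative pay-gap arithmetic using the ratios quoted above.
median_male = 50_000  # hypothetical median male full-time earnings

ratios = {
    "all workers (2010)": 0.77,
    "college graduates": 0.72,
    "MBAs, 10-16 years in": 0.55,
}

for group, ratio in ratios.items():
    female = median_male * ratio
    print(f"{group}: ${female:,.0f} vs ${median_male:,.0f} "
          f"(${median_male - female:,.0f} less per year)")
```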

The huge value that so many employers place on a standard work schedule affects more than the careers of women. Anyone who, for whatever reason, needs to take time off or work flexible hours gets penalized. The broader economy suffers when businesses are unable to make full use of highly educated and productive people.

To be sure, some professions may never be able to offer much flexibility. Merger-and-acquisition bankers, trial lawyers and the U.S. secretary of state have 24/7, on-call-all-the-time jobs. That said, the universe of such jobs is probably smaller than it appears.

Many professions that once tied people to specific hours are finding ways to reduce the cost of flexibility by making employees more substitutable. Veterinarians, optometrists, pharmacists, pediatricians, anesthesiologists and primary-care providers are shifting from self-employment to group practices and corporate ownership structures that allow them to cover for one another. Smaller veterinary practices that once required staff to have weekend, night and emergency hours are giving way to larger regional hospitals. Such changes often occur because of increased economies of scale, or in response to pressure from employees.


More at: http://www.bloomberg.com/news/2014-01-21/close-the-gender-pay-gap-change-the-way-we-work.html

Arts job report

What are the latest employment figures for working artists—both full-time and their moonlighting counterparts? Keeping My Day Job: Identifying U.S. Workers Who Have Dual Careers As Artists is the third installment in the National Endowment for the Arts’ Arts Data Profiles, an online resource offering facts and figures from large, national datasets about the arts, along with instructions for their use. Arts Data Profile #3 reports on employment statistics for U.S. workers who name “artist” as their primary or secondary job.

According to the NEA, “The analysis springs from the Current Population Survey (CPS), a nationwide, monthly survey of 60,000 American households, conducted by the U.S. Census Bureau and the Bureau of Labor Statistics. The CPS is the primary source of U.S. labor statistics, as well as other data on volunteering, poverty, computer and Internet use, arts participation, and more.

“The big picture – In 2013, 2.1 million workers held primary positions as artists. A primary job is defined as one at which the greatest number of hours were worked. In that same year, an estimated 271,000 workers also held second jobs as artists. Twelve percent of all artist jobs in 2013 were secondary employment.

“Unemployment trends – For primary artists, the unemployment rate was 7.1 percent in 2013, compared to 6.6 percent for all U.S. civilian workers, but higher than the 3.6 percent rate for all professionals (artists are grouped in the professional category). This is an improvement over the 9 percent jobless rates of 2009 and 2010, but well above the pre-recession unemployment rate of 3.6 percent in 2006. Architects and designers were among the hardest-hit occupations. While both have halved the 10-11 percent unemployment rates they faced in 2009, neither is back to pre-recession unemployment rates of 1-3 percent. By contrast, musicians have faced a steady unemployment rate of 8-9 percent since 2009, much higher than the 4.8 percent jobless rate in 2006.
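As a quick sanity check on the NEA’s “big picture” numbers, the secondary-employment share follows directly from the two counts quoted above (both counts are themselves rounded, which is why the result lands near rather than exactly on twelve percent):

```python
# Secondary-employment share from the NEA counts quoted above.
primary = 2_100_000   # workers whose primary job was "artist" in 2013
secondary = 271_000   # workers holding artist jobs as second jobs

share = secondary / (primary + secondary)
print(f"Secondary share of all artist jobs: {share:.1%}")  # about 11-12%
```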

White history month

Invariably, around February of each year, coinciding with Black History Month, you’ll hear people asking, “Why isn’t there a White history month?”

Do these people mean we should condense all the American history centering around White people to just one month and devote the other 11 to people of color?

Of course not. It’s readily accepted that White history is taught, year-round, to the exclusion of minority histories. But the literal history of Whiteness — how and when and why what it means to be White was formulated — is always neglected. The construction of the White identity is a brilliant piece of social engineering. Its origins and heritage should be examined in order to add a critical layer of complexity to a national conversation sorely lacking in nuance. I’m guessing that’s not what they mean, either. In conversations about race, I’ve frequently tried and failed to express the idea that Whiteness is a social construct. So, here, in plain fact, is what I mean:

The very notion of Whiteness is relatively recent in our human history, linked to the rise of European colonialism and the Atlantic slave trade in the 17th century as a way to distinguish the master from the slave. From its inception, “White” was not simply a separate race, but the superior race. “White people,” in opposition to non-whites or “colored” people, have constituted a meaningful social category for only a few hundred years, and the conception of who is included in that category has changed repeatedly.

The social commentary of Yoshua Okón’s “Salo Island”

In OC Weekly, Dave Barton writes of the new exhibition at UC Irvine of a work by Yoshua Okón, entitled “Salo Island.”

“The Marquis de Sade was rotting away in the Bastille of pre-revolutionary France when he wrote one of his first pornographic novels, 120 Days of Sodom.

“A mind-boggling litany of sexual perversion, the plot is about a foursome of wealthy French elite—a Judge, a Bishop, a Banker and a Cardinal—who kidnap a group of boys and girls, take them to an isolated castle, and then humiliate, rape and murder them. Heinous masturbatory material that it is, it’s also a grimly funny social commentary, with the degenerate Marquis pointing fingers at fellow travelers in his own social class, people who were doing things he only fantasized about.

“In 1975, Marxist Italian filmmaker Pier Paolo Pasolini used the infamous book as source material for his film Salo, or the 120 Days of Sodom, considered by many critics the most controversial movie of all time. Changing the setting from France to the last Fascist holdout of Mussolini’s Italy, Pasolini’s film doesn’t have the Marquis’ mordant sense of humor; playing things deadly serious, the bold visualization of the novel’s atrocities turns the political tract into cinema’s first torture porn.

“Shortly before the film’s release, Pasolini was brutally murdered, supposedly by a teenage male prostitute who ran over him with his own car on a desolate beach. Believed at the time to be a sex deal gone bad, the murderer (who had right-wing ties) has since recanted his confession, claiming Pasolini was assassinated for his politics, as well as his open homosexuality. Fascists apparently don’t take kindly to portrayals of themselves as ass-licking, shit-eating, child murderers.

Why women live longer

There are many causes of women’s longevity, some apparently biological (such as their more resilient immune systems) and some more man-made (such as lower rates of accidental, homicidal, or suicidal death). But the overall survival advantage is an outcome of social dynamics. The Atlantic discusses the factors:

“In the United States, women’s advantage in life expectancy at birth is just less than five years, but it was almost eight years in the 1970s. Demographers have determined that the major driver of the 20th century trend was smoking (there is a similar pattern in much of Europe).


“Smoking is a big issue. More than 80 percent of American men born in 1901 smoked by the time they were in their thirties, which accounts for the early deaths of millions of men into the 1970s (in the 1950s Americans consumed about 12 pounds of tobacco per person annually, three times current levels). In contrast, young women’s smoking rates never passed 55 percent, and their peak was later, in the 1970s. Since 1965 smoking rates have fallen by more than half, and the gender gap has dropped by more than two-thirds, so women’s survival advantage may narrow further.

“Smoking is a major factor globally, and many countries could be going through what the U.S. did in the last century. The World Health Organization reports that smoking is more common for men than for women in every country except Austria, and in many countries the difference is huge.

Arts feminism considered

Sometimes people claim that we don’t need feminism any more. Women have rights, they argue, so what more could they possibly want or need?

A recent post from the UK edition of the Huffington Post carries an essay saying: “One only needs to look around the world at the terrible situation for many girls and women to realise that feminism is still necessary and vital. But even once females have better living conditions and more rights, feminism still has a role to play as women try to shape careers.

“Several recent news stories have made it clear that women are way behind when it comes to careers in the arts.

“VIDA’s overview of who got published in literary magazines in 2012 suggests that it is still – no surprise – overwhelmingly men. Not only is it men who more often get their literary work published, but it is also primarily men who get their work reviewed and who are the reviewers, too.

Let’s actually talk about student loans

There is more student loan debt outstanding — $1 trillion — than credit card debt! And the government is making a huge profit on it — an estimated 36 percent profit margin, reports the Huffington Post.

“Here’s the real shame: The government gets to borrow for 10 years paying less than 2 percent interest on U.S. Treasury notes, while students must pay 6.8 percent interest on the loans they get from the government!

“The government is ripping off college students, leaving them with a burden of debt that averages $27,000, and for many exceeds $100,000, while they are forced to pay above-market interest rates.

“Students will spend so much time and pay so much interest getting out of student loan debt that most will never be able to afford to buy a home. Today’s homebuyers can get a 3.5 percent, 30-year fixed-rate mortgage. But today’s students may never get to take advantage of today’s low mortgage rates, because the government demands twice that rate to pay off their student loan debt.
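To put those rates side by side, the standard amortization formula M = P·r / (1 − (1 + r)^−n) gives the monthly payment on principal P at monthly rate r over n payments. A rough sketch using the $27,000 average debt quoted above (the 10-year term is an assumption, since the excerpt doesn’t specify one):

```python
# Monthly payment from the standard amortization formula:
#   M = P * r / (1 - (1 + r) ** -n)
def monthly_payment(principal, annual_rate, years):
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # number of monthly payments
    return principal * r / (1 - (1 + r) ** -n)

principal = 27_000  # average student debt quoted above
years = 10          # assumed standard repayment term

for rate in (0.068, 0.035):     # student loan rate vs mortgage rate
    m = monthly_payment(principal, rate, years)
    interest = m * years * 12 - principal
    print(f"{rate:.1%}: ${m:,.2f}/month, ${interest:,.0f} total interest")
```

At 6.8 percent the borrower pays roughly twice the total interest paid at 3.5 percent, which is the comparison the excerpt is driving at.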

Signorile on the Scouts

“The latest decision by the Boy Scouts of America, proposing to end its ban on gay scouts but not its ban on gay and lesbian scoutmasters and den mothers, is at once ridiculous and blatantly anti-gay,” writes Michelangelo Signorile in today’s Huffington Post, continuing as excerpted below.

“Sorry, but there’s just no middle ground on bigotry. The idea that you can end discrimination against some — and actually admit that it is discrimination — but not against others is truly breathless in its illogic. The BSA actually says in its new proposal that “no youth may be denied membership in the Boy Scouts of America on the basis of sexual orientation or preference alone,” but that the organization “will maintain the current membership policy for all adult leaders.”

“So a boy can come out as gay, be a great scout and be accepted by the organization but not even think about being a scoutmaster as an adult? And how can a boy who comes out as gay, or is simply known to be gay because of his other associations and friendships, feel that he is not stigmatized by the BSA when the organization is still discriminating against gay adults?

Europe sees research gender gap

Addressing whether research has a “gender dimension” is to become a greater priority under new plans for European funding, reports the Times of London.

“The term refers to the fact that research does not always account for differences between men and women, and this needs to be woven into the fabric of research projects.

“Katrien Maes, chief policy officer at the League of European Research Universities, said a failure to consider gender in research has led to medicines being less evidence-based for women and has also resulted in products and services being ill-designed for, or untested on, women.

“The issue was discussed at the Leru round-table event on “Women, research and universities: excellence without gender bias” on 22 March in Brussels, and may gain greater prominence under the next research funding framework, Horizon 2020. Dr Maes said the European Commission was considering whether to strengthen its requirements for applicants to take into account the gender dimension of research in funding applications from 2014 to 2020.

“If somebody puts in a proposal for a research project, they could ask, have you taken into account whether there is a need to have a gender dimension? Are there any gender or sex analyses that are necessary?” said Dr Maes. The Commission may also introduce specific funding for gender-related research in areas such as the environment, transport and nutrition.

Mormons, race, and history

A newly released digital edition of the four books of LDS or Mormon scripture—the Holy Bible, the Book of Mormon, the Doctrine and Covenants, and the Pearl of Great Price—includes editorial changes that reflect a shifting official view on issues like polygamy, the Church’s history of racism, and the historicity of LDS scripture, reports Salon.com.

“Perhaps the most significant is the inclusion of a new heading to precede the now-canonized 1978 announcement of the end of the LDS Church’s ban on black priesthood ordination:

“The Book of Mormon teaches that “all are alike unto God,” including “black and white, bond and free, male and female” (2 Nephi 26:33). Throughout the history of the Church, people of every race and ethnicity in many countries have been baptized and have lived as faithful members of the Church. During Joseph Smith’s lifetime, a few black male members of the Church were ordained to the priesthood. Early in its history, Church leaders stopped conferring the priesthood on black males of African descent. Church records offer no clear insights into the origins of this practice. Church leaders believed that a revelation from God was needed to alter this practice and prayerfully sought guidance. The revelation came to Church President Spencer W. Kimball and was affirmed to other Church leaders in the Salt Lake Temple on June 1, 1978. The revelation removed all restrictions with regard to race that once applied to the priesthood.


“Church leaders have long maintained public ambiguity about the history of the ban and its end; they have rarely acknowledged the ordination of early African-American Mormons nor have they cited anti-racist teaching in the Book of Mormon in connection with the Church’s own troubled history on race. The new heading historicizes the ban (suggesting the influence of a robust Church History department) and depicts it as a contradiction to the original impulses of the faith, not corrected until 1978. The heading does, some commentators have noted, offer continuing cover to Brigham Young, whose on-the-record racist statements to the Utah legislature suggest his influence in the evolution of a non-ordination policy. Commentators also note the absence of reference to the fact that black women were not historically admitted to LDS temple worship until the 1978 announcement.”


Full story at:  http://www.salon.com/2013/03/09/changes_in_mormon_scriptures_to_reflect_shifting_views_partner/

Facebook hacked again

Facebook Inc has said that it had been the target of a series of attacks by an unidentified hacker group, but that it had found no evidence user data was compromised, reports today’s Al Jazeera.

“‘Last month, Facebook security discovered that our systems had been targeted in a sophisticated attack,’ the company said in a blog post published on Friday afternoon, just before the three-day Presidents Day weekend. ‘The attack occurred when a handful of employees visited a mobile developer website that was compromised.’

“The social network, which says it has more than one billion active users worldwide, also said: ‘Facebook was not alone in this attack. It is clear that others were attacked and infiltrated recently as well.’