Stop Blaming Students: Toward a Post-Pandemic Pedagogy

David Trend

There’s trouble in the college classroom these days. But you can’t blame students. The pandemic and other disruptions of the past two years have shaken higher education to the core, casting doubt on how universities deliver instruction, pay their bills, and justify their existence. Enrollments are dropping across the nation, as students and their families increasingly see college as  overpriced, inequitable, and non-essential. More disturbing still are shifts taking place within institutions themselves, as dispirited students are losing motivation and enthusiasm for learning.  Clearly something has to change, with many pointing to the classroom as a key place to start.  But will it be enough?

“A Stunning Level of Disconnection” is the way one recent article described the situation. “Fewer students show up to class. Those who do avoid speaking when possible. Many skip the readings or the homework. They have trouble remembering what they learned and struggle on tests,” one professor reported.[1] Instructors are trying to reach and teach students, to figure out the problem, and do anything they can to fix things, with many now concluding in frustration that “It may be necessary to change the structure of college itself.” Call it a stress test for higher education – the seismic disruption of the college classroom during the COVID-19 years, and its ongoing aftershocks. At all levels of instruction, educators continue to voice alarm over the persistent malaise and underperformance of college students.

The Problem with Rigor

David Trend

“It’s Time to Cancel the Word Rigor,” read a recent headline in the respected Chronicle of Higher Education.[1]  The article detailed growing concerns about hidden bias within what many see as conventional teaching practices. Here, “rigor” was taken to task for throwing roadblocks up for some students more than others, even as its exact meaning remains vague. Webster’s Dictionary defines rigor as “severity, strictness or austerity,” which educators often translate into difficult courses and large amounts of work, rationalized in the interest of excellence and high standards.[2]

While there is nothing wrong with challenging coursework, per se, this interpretation of rigor often becomes a recipe for failure for otherwise intelligent and hardworking students. Such failures can result when rigor is used to incentivize or stratify students, as in gateway or “weed out” courses with prescribed grading targets, or situations where faculty overuse tests as motivation. Rigor discussions I have witnessed rarely consider instructional quality, teaching effectiveness, or principles of learning. Instead, faculty complain about poor student attention, comprehension, or commitment. As the Chronicle explains, “all credit or blame falls on individual students, when often it is the academic system that creates the constructs, and it’s the system we should be questioning when it erects barriers for students to surmount or make them feel that they don’t belong.”[3]

The Algorithm Rejected Me

David Trend

School is where most kids first become aware of what I call the “update imperative.” After all, education is a process of continual improvement, a step-by-step acquisition of knowledge and socialization. In this sense schooling represents much more than the beginning of education. For many kids it’s a time of moving from the familiarity of home into the larger world of other people, comparative judgment, and a system of tasks and rewards. Along the way, a package of attitudes and beliefs is silently conditioned: conformity to norms, obedience to authority, and the cost of failure. All of this is presented with a gradually intensifying pressure to succeed, rationalized as a rehearsal for adult life. Rarely are the ideological parameters of this “hidden curriculum” ever challenged, or even recognized. Much like work, American K-12 schools are driven largely by mandates of individual achievement and material accumulation.

By the time college applications are due, levels of anxiety can run out of control, given the role of degrees in long-term earnings. Many students start the admissions Hunger Games as early as middle school, plotting their chances, polishing their transcripts, and doing anything they can to get good grades. Everyone knows how admissions data now flows in an age in which students apply to an average of 10 schools each. Unsurprisingly perhaps, overall applications have increased by 22% in the past year alone.[i] And while the applicant side of this equation has been much publicized, what happens in the admissions office remains shrouded in mystery. Largely unknown are secret criteria driven by algorithms to determine things like likelihood to enroll or willingness to pay. Even less known are the kinds of AI analytics used to monitor and grade students, sometimes making prejudicial judgments along the way.

Why Professors Ignore the Science of Teaching

David Trend

A recent article in the Chronicle of Higher Education explored the apparent reluctance of college and university professors to embrace the growing body of research about how students learn and what teaching methods work best. While many faculty simply cling to what has worked for them in the past, others feel overworked and unable to consider changing. In the meantime, an increasingly diverse student population experiences growing inequity as a result.

Beth McMurtrie’s “Why the Science of Teaching is Often Ignored” opens with a discussion of a novel study by five Harvard University researchers. The group was trying to figure out why active learning, a form of teaching that has had measurable success, often dies a slow death in the classroom. They compared the effects of a traditional lecture with those of active learning, where students solve problems in small groups.

The results were not surprising: students taught with the active method performed better on standardized tests. The academic press praised the study for its clever design and its resonance with professors who had trouble with active learning. Yet despite being praised in some quarters, the study was criticized in others.

This mixed reaction reveals a central paradox of higher education, according to McMurtrie. Teaching and learning research has grown dramatically over the decades, encompassing thousands of experiments, journals, books, and programs to bring learning science into classrooms. But a lot of faculty members haven’t read it, aren’t sure what to do with it, or are skeptical.

Inclusive Pedagogy

David Trend

The pandemic years have been rough on college students everywhere, with record levels of academic stress and losses in student learning. While occurring throughout higher education, these problems haven’t affected all groups the same way. Students from privileged backgrounds have fared better than the under-resourced, with disparities in network access, income, and external responsibilities exacerbating inequities. As I saw these dynamics play out in the large undergraduate general education courses I teach, I began wondering if instructional methods might be partly to blame and if changes might improve matters going forward. Working with UC Irvine’s Division of Teaching Excellence and Innovation (DTEI) helped me to rethink my own teaching by identifying ways I had unconsciously been putting up roadblocks.

Usually when educators speak of “inclusion” they are thinking of course content and ways to incorporate diverse perspectives or voices previously excluded. While this approach remains a central tenet of inclusive teaching, a deeper look at the issue can reveal biases or barriers built into the teaching of even the most progressive educators. Practices of exclusion can be the result of habits or structures that have become so routinized in instruction that they seem natural or neutral approaches. Costly books, rigid deadlines, and high-stakes exams, for example, are among practices that privilege students with money, time flexibility, and testing skills.

Faculty attitudes also can get in the way of inclusion. This often is manifest in principles of “rigor” intended to elevate worthy over unworthy students. Such attitudes create a scarcity mentality toward success rather than one that makes high achievement possible for all students. Decades of educational research have shown the deleterious effects of such practices in conflating grades with knowledge acquisition. The grade pressure that frequently drives “rigor” has been shown to affect some students more than others, while creating an atmosphere of anxiety and an emphasis on types of learning that can be easily tested. Not only does this create learning inequities, but it also tends to discourage collaboration, questioning, and diverse opinion.

Loneliness of the Long Distance Learner

David Trend

No one could have predicted the radical changes in education of the early 2020s. Besides making the once-obscure Zoom into a household name, the pandemic accelerated an already fast-moving takeover of everyday life by the internet. The economic consequences were profound, with revenues exploding for companies like Netflix and Amazon while brick-and-mortar retail outlets and restaurants disappeared by the thousands. Of course nothing about the upheaval was especially surprising in historical terms. Cataclysmic events like disasters and wars often leave places quite different than they were before, as systemic restraints give way to radical reorganization. Emergency measures accepted in the moment have a habit of leaving remnants in place, much as occurred with online learning. Not that this is always a bad thing. Urgent situations can trigger remarkable innovation and creativity, seen in the hundreds of ways educators found to keep instruction going. But just as often people get hurt in the rush, as short-term solutions make for long-term problems.

Seen in retrospect, the rapid transition to online learning certainly falls into this latter category, evidenced in the huge numbers of students who failed or dropped out of classes, with those affected overwhelmingly the historically underserved. Changes occurred and learning was disrupted. But the convenience and efficiencies of the virtual classroom were too good to let go. “Online Learning is Here to Stay” read a feature in the New York Times, citing a study from the Rand Corporation saying that 20 percent of schools were choosing to continue portions of their online offerings. “Families have come to prefer stand-alone virtual schools and districts are rushing to accommodate, but questions still linger.”[i] Questions indeed. Before the pandemic less than one percent of K-12 schooling took place online. Educational reasons notwithstanding, this also had to do with the function of school as childcare for working families. The idea of a twenty-fold increase in home learning raises the question of what parent demographics are driving this shift. Or more to the point, who has gained from the online shift and who lost out?

Turn-U-In: Treating Students as Suspects

David Trend

It’s no secret that online learning has its problems, witnessed in the historic failure and drop-out rates resulting from thrown-together course overhauls in the early COVID months. Less widely reported has been another kind of failure owing to a loss of faith in educational institutions and a widening trust gap between teachers and students.

Inherent school power inequities have aggravated antagonisms – now made even worse by a range of surveillance and security technologies. The distance in “distance learning” can create an atmosphere of alienation and distrust. When the in-person classroom is reduced to a screen image, teachers and students can seem more like abstractions than actual people.

This opens the door for all sorts of communication failures and misunderstandings, not to mention stereotyping and harm. The objectifying tendencies of media representations long have been associated with distortions in the way individuals and groups view each other, whether in the marketing of products, sensationalizing news items, or spreading ideologies on social networks. When “Zoom school” does this, underlying beliefs and assumptions can overtake the reality of encounters, generating attitudes that destabilize the learning environment.

These problems have become especially evident in the panic about student dishonesty in online learning, as the absence of classroom proximity quickly escalated into assumptions of cheating. Early in the 2020s a torrent of news reports warned of an “epidemic” of dishonesty in online learning, with some surveys showing over 90 percent of educators believing cheating occurred more in distance education than in in-person instruction.[i] New technologies often have stoked such fears, in this instance building on the distrust many faculty hold toward students, some of it racially inflected.[ii] Closer examination of the issue has revealed that much of the worry came from faculty with little direct knowledge of the digital classroom, online student behavior, and preventative techniques now commonly used. Indeed, more recent research has shown no significant differences between in-person and online academic integrity.[iii]

When School is a Factory

David Trend

For 20 years, I have been teaching large arts and humanities general education courses at the University of California, Irvine. These 400-student classes are part of the undergraduate “breadth requirements” common in most colleges and universities, and hence draw enrollments from across the academic disciplines. At UC Irvine, this means that most of the class comprises science, technology, engineering, and math (STEM) majors. Aside from an orientation to more practical fields, I’ve noticed a clear shift in student attitudes in recent years –– a heightened preoccupation with grades and rankings, combined with growing anxieties about future earnings. Many of my colleagues see this as well, often disparaging students more concerned with GPA metrics than learning itself, while increasingly behaving more like consumers of educational commodities. I take a more sanguine view.

Bear in mind that many of today’s college students grew up during the Great Recession, when families of all incomes had money worries. With scant knowledge of a world before 9/11, it’s little wonder that polls show millennials expecting lower earnings than their parents, seeing the United States on a downward spiral, and believing the two-party system to be fatally flawed.[i] Rising income inequality doesn’t help matters, especially at UC Irvine where 6 in 10 students get financial aid and half are the first in their families to earn a college degree.[ii] Because of this, Irvine has been cited by the New York Times as the country’s leading “upward mobility engine” –– making the campus a national model of what public higher education can do.[iii] But it’s still not a cakewalk for degree seekers. As at most public universities in America, the majority of Irvine’s full-time students also work at jobs to make ends meet.[iv]

Anxious Creativity: When Imagination Fails

Just released: Creativity is getting new attention in today’s America –– along the way revealing fault lines in U.S. culture. Surveys show people overwhelmingly see creativity as both a desirable trait and an enhancement to work, yet most say they just aren’t creative.

Like beauty and wealth, creativity seems universally desired but insufficiently possessed. Businesses likewise see innovation as essential to productivity and growth, but can’t bring themselves to risk new ideas. Even as one’s “inner artist” is hyped by a booming self-help industry, creative education dwindles in U.S. schools.

Anxious Creativity: When Imagination Fails examines this conceptual mess, while focusing on how America’s current edginess dampens creativity in everyone. Written in an engaging and accessible style, Anxious Creativity draws on current ideas in the social sciences, economics, and the arts. Discussion centers on the knotty problem of reconciling the expressive potential in all people with the nation’s tendency to reward only a few. Fortunately, there is some good news, as scientists, economists, and creative professionals have begun advocating new ways of sharing and collaboration. Building on these prospects, the book argues that America’s innovation crisis demands a rethinking of individualism, competition, and the ways creativity is rewarded.

Available from all major booksellers. More info at: https://www.routledge.com/Anxious-Creativity-When-Imagination-Fails-1st-Edition/Trend/p/book/9780367275068

Natural Born Killers?

David Trend

“Confessions of a Drone Warrior” is one of hundreds of articles on the military’s use of unmanned aerial vehicles (UAVs), which began in the early 2000s. In many ways this new form of combat embodies the psychological distancing that typifies killing in the twenty-first century. The story about Airman First Class Brandon Bryant recounts his first day in a Nevada bunker, when the 22-year-old fired on two presumed Afghan insurgents on the other side of the world. An early recruit in this new kind of warfare, Bryant “hunted top terrorists, but always from afar” –– killing enemies in countless numbers, but not always sure what he was hitting. “Meet the 21st-century American killing machine,” the story concluded.[i]

Of course, notions of aversion to fighting don’t sit well with either military doctrine or public belief. Behind America’s infatuation with high-tech weapons lie long-cultivated attitudes toward violence itself. In a class I teach on this, students often will express common-sense views that fighting is “natural,” deriving from humanity’s animalistic origins, and often the only way of resolving conflicts. One sees this kind of thinking in permissive attitudes toward everything from boyish rough-housing to violent sports. The gendered aspects of violence receive less attention than they should, and will be addressed at length in Chapter 9. Suffice it to say that aggression often is expected of men and boys, while also reflected in popular culture. Along with political partisanship, these attitudes help explain the deep divisions within the U.S. electorate over gun control and so-called “stand your ground” laws. Since even scholars often disagree over the issue of human violence, it helps to break the question into subcategories –– and to also point out how knowledge has changed over time in the fields of biology, psychology, and cultural analyses of violent behavior.


Teaching Robots to Imagine

David Trend

Can robots be taught to imagine? Google’s DeepMind artificial intelligence group is doing just that –– developing computer versions of what many consider humanity’s quintessential trait. The software world long has pursued sentient consciousness as its holy grail. But until now, it’s only been found in science fiction movies like A.I., Ex Machina, and Transcendence. DeepMind engineers say they have cracked the code by combining two kinds of machine learning. The first is linear, which is nothing new, with the computer applying a predefined algorithm over and over until it finds answers and then remembering them. In the second more radical approach, the computer tries many algorithms to find which work best, and then changes the very way it approaches problems. Combining the purely linear with a more systemic approach, DeepMind’s “Imagination-Augmented Agent” mimics intuitive learning in a way prior software hasn’t. It’s not exactly the same as human imagination, but it comes closer than ever before to what neuroscientists say the brain does.

While robotic imagination may be improving, human thought isn’t faring as well. Most people feel uncreative and without inspiration, as discussed in earlier chapters. Corporations say innovation is withering as well. Novelist Ursula Le Guin recently observed that, “In America today imagination is generally looked on as something that might be useful when the TV is out of order. Poetry and plays have no relation to practical politics. Novels are for students, housewives, and other people who don’t work.”[i] Beyond the abandonment of a creative genre or two, American society also is undergoing a wholesale commodification of imagination itself. Disney is most famous for this, its “Imagineering” (imagination + engineering) brand one of the most viciously protected anywhere. But hundreds of companies evoke imagination to conjure an aura of specialness –– seen in promotions like Bombay Sapphire’s “Infused with Imagination,” GE’s “Imagination at Work,” Electrolux’s “Power to Capture Imagination,” Lego’s “Imagine,” Microsoft’s “Imagine Academy,” Nestle’s “Feed your Imagination,” Samsung’s “Imagine,” and Sony’s “Made of Imagination.”

The connection of imagination to commercial products reflects the powerful linkage of purchasing to consumer self-image. Expressing oneself through buying brings a passing feeling of agency, maybe even of accomplishment. Some critics say that shopping is more meaningful than voting for many Americans. Henry A. Giroux speaks of “disimagination” in describing how public consciousness is overwritten in this process, as people lose abilities to imagine on their own. To Giroux “The power to reimagine, doubt, and think critically no longer seems possible in a society in which self-interest has become the ‘only mode of force in human life and competition’ and ‘the most efficient and socially beneficial way for that force to express itself.’” Going even further, Giroux links disimagination to a rising collective amnesia, stating “What I have called the violence of organized forgetting signals how contemporary politics are those in which emotion triumphs over reason, and spectacle over truth, thereby erasing history by producing an endless flow of fragmented and disingenuous knowledge.”

Imagination can be seen positively, of course. With this in mind, much of this chapter explores ways people can envision a better and more just world. Obviously this might take a little encouragement in an age of disimagination. But it’s far from impossible. Most definitions describe imagination as the mental process behind creativity, as seen in the Oxford Dictionary: “Imagination: The faculty or action of forming new ideas, or images or concepts of external objects not present to the senses. The ability of the mind to be creative or resourceful.” Put another way, creativity is imagination actualized for a purpose –– generally assumed to be a positive one. As stated by a leading expert in the field, “Creativity is putting your imagination to work. It’s applied imagination.” Dig a little deeper into this lexicon, and one finds the very problem that worries Le Guin and Giroux. A quick look at Roget’s Thesaurus turns up such synonyms for “imaginative” as “dreamy,” “fanciful,” “fantastic,” “quixotic,” “romantic,” and “whimsical.” Nice as these sound, such vaporous associations equate imagination with the same romantic idealism and inconsequentiality dogging creativity. This explains why advertisers seem so keen on imagination. As one marketing firm put it, “We don’t see imagining as a real task. It’s an enjoyable game. By asking a prospect to imagine something, you bypass that critical part that throws up objections, and sneak into their mind through the back door of the imagination.”

How about seeing imagination differently? Maybe as a roadmap for one’s life or future?  Or a way to imagine important people in one’s life? Perhaps even a vision for community, country, and the larger world? After all, isn’t society itself an imaginary construct? Doesn’t everyone want to make it better? To Le Guin, “To train the mind to take off from immediate reality and return to it with new understanding and new strength, nothing quite equals poem and story.” She concludes that “Human beings have always joined in groups to imagine how best to live and help one another carry out the plan. The essential function of human community is to arrive at some agreement on what we need, what life ought to be, what we want our children to learn, and then to collaborate in learning and teaching so that we and they can go on the way we think is the right way.”

Big Data vs Artists and Everyone Else

By David Trend:

Heard about Generation Z? The demographic growing up in the 2000s? It’s a bigger group than Boomers or Millennials –– and it has one further distinction. “Members of Generation Z are ‘digital natives’ who cannot remember what it was like not to have access to the Internet –– no matter when, no matter what, no matter where,” according to Forbes Magazine. This is a group raised on networked “connecting” with others, sharing, and buying things. It’s second nature to Gen-Zers to upload their favorite music on YouTube, post images on Facebook, and sell things on Etsy or eBay. Much is being made in creative economy talk of how networks now blur traditional producer/consumer roles, manifest in the new figure of the “prosumer.” In Wikinomics: How Mass Collaboration Changes Everything authors Don Tapscott and Anthony D. Williams effused over the democratization inherent in the new “Openness, Peering, Sharing and Acting Globally.” Of course, there is nothing really new about home-made items, crafts, and people’s willingness to share. What’s different today is the ability to copy digitized materials and circulate them via electronic networks. Digitization also has made Generation Z the first demographic to be completely tracked by “big data” analytics.

Some creativity industry experts argue that this is nothing short of a revolution, driven by ongoing change more than any clear future. Evolutionary economist Jason Potts and collaborators have proposed what they term “Social Network Markets,” unlike the top-down models of industrial capitalism. Characterized by fluidity and exchange through complex fields of actors, the new social network markets are less governed by competition and profit than by communication and preference. Participants are “Not ‘buying’ the property, but buying into the social space.” Moreover, the dynamics of these new markets are highly interactive. As the Potts group put it, “a social network is defined as a connected group of individual agents who make production and consumption decisions based on the actions (signals) of other agents on the social network: a definition that gives primacy to communicative actions rather than connectivity alone.” Almost by definition, this process rules out conventional manufacturing or professional services. Instead, the networks generate value through the production and consumption of network-valorized choices.

The beauty is that much of what is online now is free––seeming to arrive just in time in a tight economy. While a lot of the “free” stuff available online is user-generated (selfies, birthday announcements, anecdotal postings, etc.), a huge volume of material comes from other sources (news outlets, filmmakers, commercial music producers, artists). On the surface it looks like old Marxist doctrines are being reversed as items seem to be “decommodified” in the sharing economy. This idea has become an anthem of resistance in some circles. The Burning Man Festival, to take one example, has stated: “When we commodify we seek to make others, and ourselves, more like things, and less like human beings. ‘Decommodification,’ then, is to reverse this process. To make the world and the people in it more unique, more priceless, more human.” This may be all well and good in the real-life sharing of food and weed at Burning Man. But when things get virtual, it’s usually a large corporation that owns the websites, servers, and networks that make sharing possible.

Belonging Where?

By David Trend:

Throughout its existence the United States has shown a strange tendency to turn against itself, dividing citizens against each other with a vehemence rivaling the most brutal regimes on earth. Some have rationalized the resulting crisis of “belonging” in America as an understandable consequence of cultural diversity, economic stress, and global threat. After all, haven’t there always been “insiders” and “outsiders” in every culture? Aren’t competition and aggression wired into human nature? Or is there something peculiar about the personality of the U.S.? Could it be that prejudice is the real legacy of “American Exceptionalism,” in traditions dating to the genocide of indigenous populations, the subjugation of women, the rise of slavery, the scapegoating of immigrants, and more recent assaults on the poor or anyone falling outside the realm of normalcy?

I discussed selected aspects of America’s divisive pathology in my book A Culture Divided: America’s Struggle for Unity, which was written in the closing years of the George W. Bush presidency. Like many at the time, I had completely given up on the idea of “common ground” amid the residue of post-9/11 reactionary fervor and emerging economic recession. Media commentators were buzzing constantly about red/blue state polarization. Opinions varied about the cause of the divide, attributing it to factors including regionalism, media sensationalism, partisan antipathy, or all of these combined. Also joining the fray were those asserting the divide was fabricated, with evenly divided elections showing most people in the middle of the curve on most issues. My somewhat contrarian view was that the “problem” shouldn’t be regarded as a problem at all. After all, America always had been divided––through war and peace, boom and bust. Division was the country’s national brand. But as a book about politics, A Culture Divided didn’t get to the roots or the lived experience of America’s compulsive divisiveness.

Speaking at the 50th anniversary of the Selma to Montgomery marches, President Barack Obama described America as an incomplete project––a nation caught between ideals of a perfect union and the lingering realities of their failure. While citing advances in civil liberties since the bloody apex of the Voting Rights Movement, Obama also spoke of a federal report issued just days earlier documenting structural racism and misbehavior toward African Americans by police in Ferguson, MO, where months before law enforcement officers had killed an unarmed black teenager. “We know the march is not yet over. We know the race is not yet won,” the President stated, adding, “We know that reaching that blessed destination requires admitting as much, facing up to the truth.”

Creative Magic

By David Trend

“The central question upon which all creative living hinges is: Do you have the courage to bring forth the treasures hidden within you?” With this entreaty, author Elizabeth Gilbert introduced her recent bestseller Big Magic: Creative Living Beyond Fear, which offered an artistic cure for an anxious American culture.[i] Speaking directly to widespread feelings of disaffection and powerlessness, Big Magic romanticized artistry in Gilbert’s signature blend of sentiment and cliché––packaging familiar views (human creativity, divine creativity, etc.) with a self-help twist about creating one’s “self” in new and better ways.  While one easily can write off Big Magic as yet another feel-good advice book (which it surely is), I think it’s time to take Gilbert’s approach to creativity seriously and ponder why such ideas now get so much traction.

Publicity doesn’t hurt. Reviewers effused over Big Magic as a “book-length meditation on inspiration” (Newsday) to “unlock your inner artist” (Woman’s Day) and “dream a life without limits” (Publishers Weekly).[ii] This message resonated well with the rising chorus promoting creativity as an innovation engine and economic tonic.  While no one would dispute the positive benefits of a little artistic dabbling, at what point does such wishful thinking begin to border on delusion? Or put another way, when does fantasy paper over reality? Might it be that America’s fondness for make-believe is partly behind the nation’s political confusion and disaffection? Do fairy-tale versions of life infantilize a citizenry that should know that answers don’t always come easily?  Certainly the fantasy-version of reality offered by certain politicians would fail any thoughtful analysis. But instead, many leaders continue treating their constituents like children, with entire governments encouraging populations to set worries aside and simply “Be Creative.”

In Magical Thinking and the Decline of America, historian Richard L. Rapson took a long look at the nation’s romantic idealism. “Probably in no other society of the world can one write the script for one’s life as completely as in the United States. This fact has made the nation the ‘promised land’ for much of the world over the past two centuries,” Rapson wrote. “The flight into endless self-improvement and innocent optimism has a long lineage in our past.”[iii] Perhaps anticipating Donald Trump’s “Make America Great Again” sloganeering, Rapson pointed to the disconnection between America’s self-image as an “exceptional” driver of human history and the growing evidence of the nation’s falling fortunes. This has led to what Rapson described as a growing “flight from knowledge and reality into faith and fantasy,” resulting in large part from “an American public increasingly in thrall to the fairytales told by the mass media.”[iv]  It also promotes a “cultural fixation on the individual, the personal, the biographical, the confessional, and, all too often, the narcissistic,” and hence the rise of new “magic words” like “self-awareness,” “personal growth” and other aphorisms prompting everyone to “be all that you can be.”[v]

Individualism lies at the heart of American idealism, dating to the country’s Enlightenment Era origins, when the autonomous subject was invented as a counterpoint to deific and royal authority. Necessary as individualism was (and remains), no one could have predicted how its value could be magnified and distorted in neoliberal times.  The initial affirmation of personal identity, which encouraged people to vote and participate in society, soon morphed into “striving to get ahead” and “winning at any cost.” Eventually the “self” would become an American obsession of theological proportions. “The purpose of nearly all the current gospels is to put believers ‘in touch’ with themselves,” Rapson further explained.[vi] This new brand of secular “faith” also comports well with the religiosity many Americans still profess, especially evangelical strains that promise economic gain to dutiful worshippers.

Stigma and Mental Illness

By David Trend

“The more I became immersed in the study of stigmatized mental illness, the more astonishing it was to me that any such phenomenon should exist at all,” writes Robert Lundin, a member of the Chicago Consortium for Stigma Research. “I believe that serious and persistent mental illnesses, like the one I live with, are clearly and inexorably no-fault phenomena that fully warrant being treated with the same gentleness and respect as multiple sclerosis, testicular cancer or sickle-cell anemia.”[i] Here Lundin names a central problem in the social construction of mental illness: the misunderstanding of conditions affecting the mind as somehow different from other biological illnesses. This misrecognition renders mental illness prone to the judgmental attributions discussed by Susan Sontag in her 1978 book Illness as Metaphor.  To Sontag, contemporary society reverses ancient views of sickness as a reflection of the inner self.  In this new view, the inner self is seen as actively causing sickness––through smoking, overeating, addictive behavior, and bad habits: “The romantic idea that disease expresses the character is invariably extended to assert that the character causes the disease––because it has not expressed itself. Passion moves inward, striking within the deepest cellular recesses.”[ii] But as before, the sick person is to blame for the illness.

Such sentiments are especially vindictive when a mentally ill person commits a crime. Understandably perhaps, clinical terms like “mental illness” quickly acquire malevolent meanings in the public mind––even though the mentally ill statistically are no more prone to criminality than anyone else. Sometimes this semiotic slippage causes public panic over commonplace disorders. Consider the case of Adam Lanza, the young man who in 2012 shot 26 children and adults at Sandy Hook Elementary School in Newtown, Connecticut. While mental health analysts speculated that an acute psychotic episode prompted his violence, Lanza never had been diagnosed with a serious mental illness. As reporters scrambled for a story, much was made of Lanza’s childhood symptoms of Asperger’s syndrome, a form of high-functioning autism. The repeated mention of this disorder in news coverage triggered wrong-headed fears nationwide about the murderous potential of other autistic kids. According to the Centers for Disease Control (CDC), approximately 1 in 50 people (1.5 million) fall somewhere on the autistic spectrum, 80 percent of whom are boys.[iii] This has prompted improved diagnostic measures, which in turn have resulted in an apparent rise in autism cases in recent years––up 78 percent from a decade ago––and made autism a source of acute anxiety for many new parents.

The Big Data vs Artists and Everyone Else

By David Trend

Heard about Generation Z?  The demographic growing up in the 2000s? It’s a bigger group than Boomers or Millennials––and it has one further distinction. “Members of Generation Z are ‘digital natives’ who cannot remember what it was like not to have access to the Internet––no matter when, no matter what, no matter where,” according to Forbes Magazine. This is a group raised on networked “connecting” with others, sharing, and buying things. It’s second nature to Gen-Zers to upload their favorite music to YouTube, post images on Facebook, and sell things on Etsy or eBay. Much is being made in creative economy talk of how networks now blur traditional producer/consumer roles, manifest in the new figure of the “prosumer.” In Wikinomics: How Mass Collaboration Changes Everything, authors Don Tapscott and Anthony D. Williams effused over the democratization inherent in the new “Openness, Peering, Sharing and Acting Globally.”  Of course, there is nothing really new about home-made items, crafts, and people’s willingness to share. What’s different today is the ability to copy digitized materials and circulate them via electronic networks. Digitization also has made Generation Z the first demographic to be completely tracked by “big data” analytics.

Some creativity industry experts argue that this is nothing short of a revolution, driven by ongoing change more than any clear future. Evolutionary economist Jason Potts and collaborators have proposed what they term “social network markets,” unlike the top-down models of industrial capitalism.  Characterized by fluidity and exchange through complex fields of actors, the new social network markets are less governed by competition and profit than by communication and preference. Participants are “not ‘buying’ the property, but buying into the social space.”  Moreover, the dynamics of these new markets are highly interactive. As the Potts group put it, “a social network is defined as a connected group of individual agents who make production and consumption decisions based on the actions (signals) of other agents on the social network: a definition that gives primacy to communicative actions rather than connectivity alone.”  Almost by definition, this process rules out conventional manufacturing or professional services. Instead, the networks generate value through the production and consumption of network-valorized choices.

The beauty is that much of what is online now is free––seeming to arrive just in time in a tight economy. While a lot of the “free” stuff available online is user-generated (selfies, birthday announcements, anecdotal postings, etc.), a huge volume of material comes from other sources (news outlets, filmmakers, commercial music producers, artists). On the surface it looks like old Marxist doctrines are being reversed, as items seem to be “decommodified” in the sharing economy. This idea has become an anthem of resistance in some circles. The Burning Man Festival, to take one example, has stated: “When we commodify we seek to make others, and ourselves, more like things, and less like human beings.  ‘Decommodification,’ then, is to reverse this process.  To make the world and the people in it more unique, more priceless, more human.”  This may be all well and good in the real-life sharing of food and weed at Burning Man. But when things get virtual, it’s usually a large corporation that owns the websites, servers, and networks that make sharing possible.

The Performance Art of the Deal

By David Trend

As I write these words, many Americans remain up in arms about President Donald Trump’s peculiar relationship with the truth.  On a seemingly daily basis, the nation is greeted with a new round of accusations or indignant retorts from the President––most of which bear little resemblance to objective reality. Let’s just say the Commander-in-Chief has a very “creative” approach to factuality––about everything from crime and immigration to science and the judiciary. Perhaps he’s joking or trying to shock people. Or maybe he’s a pathological liar. Time Magazine devoted a cover to the President’s “Truth and Falsehoods”; the Los Angeles Times ran multiple “Why Trump Lies” editorials; and The New Yorker is now 14 installments into its ongoing “Trump and the Truth” series. Unsurprisingly, the President has doubled down on his claims, and––in keeping with his fondness for conspiracy theories––has labelled the entire field of journalism “the enemy of the American people.” Endless pundits and commentators have tried to discern a logic in the President’s bizarre behavior––in which mischief and chaos seem the only constants.

Say what you will about Trump, his ability to get public attention is astonishing. And while some critics question the President’s grasp of “reality,” others see a calculated shrewdness in his behavior––an underlying strategy not unlike what Naomi Klein discussed in The Shock Doctrine.  “We already know the Trump administration plans to deregulate markets, wage all-out war on ‘radical Islamic terrorism,’ trash climate science and unleash a fossil-fuel frenzy,” Klein recently stated, adding, “It’s a vision that can be counted on to generate a tsunami of crises and shocks.” She predicted economic shocks (as market bubbles burst), security shocks (as blowback from foreign belligerence comes home), weather shocks (as the climate is further destabilized), and industrial shocks (as oil pipelines spill and rigs collapse, especially when enjoying light-touch regulation).

“All this is dangerous enough,” Klein added. “What’s even worse is the way the Trump administration can be counted on to exploit these shocks politically and economically.” Trump himself often forecast as much in promising a “radical break” from the past––described by Fox News as a “shock and awe campaign against the Washington establishment.” This new agenda bears little resemblance to earlier “culture wars” between conventional liberal and conservative camps. Moral idealism has no place in Trump’s program of disruption and dishonesty. But his ability to confuse and deceive is not to be taken lightly. The Trump phenomenon raises important concerns about the role of knowledge in contemporary society––and the ways different worldviews are conceived, put into circulation, and frequently politicized.

Elsewhere in America

Elsewhere in America: The Crisis of Belonging in Contemporary Culture by David Trend (Routledge, 2016)

The book uses the term “elsewhere” in describing conditions that exile so many citizens to “some other place” through prejudice, competition, or discordant belief. Even as “diversity” has become the official norm in American society, the country continues to fragment along new lines that pit citizens against their government, each other, and even themselves.  Yet in another way, “elsewhere” evokes an undefined “not yet” ripe with potential. 

The book argues that even in the face of daunting challenges, elsewhere can point to optimism, hope, and common purpose. Through 12 detailed chapters, Elsewhere in America applies critical theory in the humanities and social sciences in examining recurring crises of social inclusion (“belonging”) in the U.S.  After two centuries of struggle and incremental “progress” in securing human dignity, today the U.S. finds itself riven apart by new conflicts over reproductive rights, immigration, health care, religious extremism, sexual orientation, mental illness, and fears of terrorism. Why are U.S. ideals of civility and unity so easily hijacked and confused? Is there a way of explaining this recurring tendency of Americans to turn against each other? Elsewhere in America engages these questions in charting the ever-changing faces of difference (from the contested landscapes of sex and race to such areas as disability and mental health), their spectral and intersectional character (as seen in the new discourses on performativity, normativity, and queer theory), and the grounds on which categories are manifest in ideation and movement politics (seen in theories of metapolitics, cosmopolitanism, and dismodernism).

For more information: https://www.routledge.com/Elsewhere-in-America-The-Crisis-of-Belonging-in-Contemporary-Culture/Trend/p/book/9781138654440

Striking adjuncts

David Trend

If adjuncts want more workplace rights, they have to take them. As Inside Higher Ed reports, “That message was echoed throughout a discussion on non-tenure-track faculty rights here Monday at the Coalition of Contingent Academic Labor, or COCAL, conference. It’s being held this week at John Jay College of Criminal Justice of the City University of New York.

“The biennial gathering draws participants from the U.S., Mexico and Canada, and adjunct activist panelists from all three countries advocated striking as a real and valid means of achieving short- and long-term goals.

“Unless and until faculty, including part-time faculty, hit the streets and occupy the classrooms,” said Stanley Aronowitz, a tenured professor of sociology and urban education at the CUNY Graduate Center, “there won’t be any change of substance.”  Aronowitz, who has worked as an adjunct professor several times throughout his career, said this idea applied even in those states where collective bargaining or strikes among public employees are prohibited by law. Faculty members at Nassau Community College who went on strike last year over protracted contract negotiations paid hefty fines for violating New York State’s Taylor Law, for example. (Under the law, the union was permitted to engage in collective bargaining, but not to strike.) But Aronowitz and other activists said that striking is a fundamental right that should be ensured by the First Amendment; without the right to strike, he said, collective bargaining too often becomes “collective begging.” Participants here responded to Aronowitz’s remarks on strikes with strong applause.

“Maria Teresa Lechuga, a Ph.D. candidate in pedagogy at the National Autonomous University of Mexico, added: “We need to stop asking for permission to organize ourselves.” Panelists said that striking is always a “last resort,” to be exercised only when adjunct faculty members and administrators can’t otherwise reach common ground. But in order to ensure public support when and if the time to strike comes, advocates said, adjuncts need to nurture relationships with other kinds of workers, along with parents and students. Maria Maisto, president of the New Faculty Majority, a national adjunct advocacy organization, said adjuncts shouldn’t be afraid to bring up their working conditions with their students. She said such conversations are part of students’ “civic education” — an essential part of their studies.