Legacies of Western Exclusion

Education in the U.S. has a complex history, marked by intellectual progress and systematic exclusion. For over three centuries, its institutions have often prioritized certain forms of knowledge and ways of thinking, inadvertently or intentionally reinforcing intellectual hierarchies. Western philosophical traditions played a significant role in this by emphasizing reason and science while promoting a Eurocentric worldview. The influence of colonialism further complicated matters, as it led to the suppression and erasure of Indigenous knowledge systems around the world and in the U.S. This historical context left a lasting impact on the structure and focus of American higher education, influencing who has access and what is considered valuable knowledge. 

Much of this can be traced to the Age of Reason of the 17th and 18th centuries, which profoundly shaped philosophical and educational frameworks in Europe and internationally. Prominent thinkers like John Locke and Immanuel Kant advanced the authority of rationalism and empiricism, influencing philosophical discourse and promoting certain disciplines over others.[i] This focus led to the development of university curricula that, while comprehensive, often functioned selectively.

The Age of Reason reinforced a Eurocentric perspective that marginalized non-Western forms of knowledge and understanding. Visions of world history that placed Europe at the pinnacle of civilization, as set forth by Georg Wilhelm Friedrich Hegel, rendered other cultures as less developed or worthy.[ii] This prejudice led academic institutions to criticize, misrepresent, or entirely disregard non-Western philosophies, sciences, and cultural practices. Edward Said’s concept of “Orientalism” explained how Western academia constructed and perpetuated distorted views of non-Western societies, often rendering them as exotic, backward, or irrational in contrast to the supposedly rational and progressive West.[iii] This intellectual bias not only shaped academic disciplines like anthropology and geography but also influenced broader educational curricula, diplomatic relations, and colonial policies. Consequently, the university emerging from this intellectual milieu often failed to recognize or value Indigenous knowledge systems, oral traditions, and alternative epistemologies, further entrenching the dominance of Western thought in global academic discourse.

Continue reading “Legacies of Western Exclusion”

The Changing Face of College

As the new academic year begins, the shifting demographics of undergraduates bear acknowledgment. Today’s students are navigating a profoundly altered landscape when it comes to higher education. Coming of age amidst shifting sands, they no longer view college as a mere rite of passage into adulthood, a perception held by many in previous generations. Instead, higher education has emerged as a perceived bulwark against an unstable future, a necessary tool to secure a foothold in an increasingly competitive market. Armed with a critical eye and a deep-seated desire for value in their educational investment, these students are willing to devote the time and effort necessary to achieve grades that promise to pave a pathway into the workforce or further studies, viewing each step as a vital cog in the machinery of their future success.

The metamorphosis in the racial and ethnic composition of American higher education institutions is indeed noteworthy. According to data from the National Center for Education Statistics, there has been a discernible increase in the enrollment rates of several minority groups. In the fall of 2019, it was noted that the proportion of white students enrolled in colleges was around 55.9%, while Hispanic and Asian/Pacific Islander students represented 20.1% and 7.4% of enrollments, respectively.[i] Furthermore, the number of African American students enrolling has also seen an incremental rise, accounting for 13.2% in the same year. These developments illustrate a promising trajectory towards fostering a more inclusive and diverse educational environment. The progressive shift not only indicates a break from a predominantly white majority but also hints at an enriching academic milieu where perspectives from various backgrounds can converge. This diversification is a cornerstone in preparing students to navigate a globally interconnected world, where understanding and appreciation for diverse cultures and narratives is a critical asset.

Continue reading “The Changing Face of College”

The Instruction Myth Revisited

In the vast landscape of academia, one constant lingers. The venerated lecture is a historical artifact that traces its origins to the very inception of higher learning. Such a tradition, efficient as it might be for transmitting facts, often falls short in sparking genuine engagement. This passive style stands in stark contrast to true education, especially in our digitally charged era, where learning has undergone a dramatic metamorphosis.

Our digital age hasn’t just redefined how we retrieve information, but reshaped our very expectations of learning. The omnipresence of online tools and multifaceted communication avenues heralds a marked shift in pedagogy. Brick-and-mortar classrooms, once the sole sanctums of knowledge, are being complemented by, if not at times replaced by, vibrant alternative modalities.

As John Tagg insightfully noted in his now-classic The Instruction Myth: Why Higher Education is Hard to Change, And How to Change It (Rutgers, 2019), established education structures can unwittingly ensnare themselves in a misguided “universal solution” mindset. They risk glossing over the rich potential of diverse learners, their individualized backgrounds, and inclinations. In this milieu, learning that foregrounds students’ individual aptitudes emerges as a promising way forward. Such adaptive approaches beckon a richer, more encompassing educational horizon.

However, the journey to innovation is fraught with institutional roadblocks. The gravitational pull of longstanding norms, coupled with an almost reverential deference to the established order, can thwart progress. For Tagg, these institutional barriers are compounded by faculty hesitancy, often stemming from tech apprehension or the perceived threat of new methodologies –– all of which amplifies the challenge. Alleviating these concerns demands a renewed commitment to professional growth and the fostering of a collaborative ethos among educators. Moreover, it is ironic that external accreditation entities, designed to enshrine the zenith of academic excellence, might inadvertently ossify outdated methods. A recalibration toward genuine learning outcomes, rather than the means of instruction, seems imperative.

Continue reading “The Instruction Myth Revisited”

The Fitness Paradox

In the American panorama, fitness culture has taken a front-row seat. Sculpted physiques have become the driving force in our self-image-fueled society. An aesthetic representation of health has hijacked the popular consciousness, becoming not only coveted but also expected. It’s a celebration of the human body, but with its glorification, the “healthy” standard morphs into an unreachable Everest for many.

Couched in the language of possibilities without borders, fitness campaigns shine a spotlight on personal responsibility, with Nike’s “Just Do It” mantra being the poster child for such efforts. It’s not about selling sneakers; it’s about selling the dream that we can all ascend to athletic greatness. The company’s website continues this narrative, stating, “Your daily motivation with the latest gear, most effective workouts and the inspiration you need to test your limits –– and unleash your potential.” The push is persuasive, especially for young customers grappling with identity, schooling, or job hunting.

Similar slogans resound from the likes of Equinox, LA Fitness, and Shadow Fitness, all tapping into the ethos of self-determination, willpower, and personal growth. As Forbes reports, the multi-billion dollar fitness industry, which has grown steadily over the last decade, is fueled in part by gym-rat adults. The motivation? Lower health insurance costs and the powerful self-affirmation that accompanies taking the reins of one’s health.

Contrast this landscape with the stark reality: many Americans remain outside this idealized circle of health and fitness, intensifying the quest for better bodies. The message to our aging, overweight, and unwell population is unequivocal: “get in shape or get left behind.” And this pressure isn’t limited to one demographic; it’s an equal opportunity oppressor, driving men, women, and the non-binary to chase this epitome of health. Fitness obsession seeps into every corner of our lives, from diet plans to gym memberships, from yoga studios to the booming wellness industry. Even giants like Amazon have recognized this lucrative market, snapping up Whole Foods.

The New You

You’ve probably never heard of TestingMom.com. It’s part of a new generation of test-prep companies like Kaplan and Princeton Review –– except this one is for toddlers. Competition for slots in kindergarten has gotten so intense that some parents are shelling out thousands to get their four-year-olds ready for entrance tests or interviews. It’s just one more example of the pressure that got celebrity parents arrested for falsifying college applications a few years ago. In this case the battle is over getting into elite elementary schools or gifted programs. While such admissions pressure is widely known, what’s new is how early it’s occurring. Equity issues aside, the demand to improve performance is being drilled into youngsters before they can spell their names. All of this bespeaks the competition for grades, school placement, and eventual careers that has transformed the normal impulse to do better into an obsession for students and their families. Much like the drive for perfection, an insatiable hunger to be quicker, smarter, and more acceptable to admissions officers is taking its toll in many ways.

What explains this obsessive behavior? Brain science has been proving what advertising long has known –– that wanting something is far more powerful than getting it. School admissions and other markers of success are part of an overarching mental wanting mechanism. That new iPhone might bring a thrill. But soon comes the yearning for an update, a newer model, another purchase. Neuroimaging shows that processes of “wanting” and “liking” occur in different parts of the brain, with the former operating more broadly and powerfully than the latter. This reverses the common wisdom that primal hungers and “drives” underlie human motivation. Unlike animals, human beings are driven by imagination –– with anticipation often mattering more than the experience itself. This partly explains why merchandising deals more with feeling than facts. Slogans like “Just Do It” and “Think Different” bear no direct relationship to shoes or computers, but instead tingle feelings of desire. In the fuzzy realm of emotion, pleasure is a fungible currency.

Continue reading “The New You”

Update Available: The Algorithmic Self

Bing, Bard, and other bots. The world is rushing headlong into a ChatGPT future. Yet amid the giddy optimism over boundless new capabilities lie deeper questions about how artificial intelligence is reshaping human consciousness in unnoticed ways. Update Available: The Algorithmic Self (2023) takes a critical look at this emerging phenomenon.

Update Available is available as a free download from Amazon, Apple, Barnes & Noble and other major retailers, published as an Open Access Creative Commons book.

Other books by David Trend include Welcome to Cyberschool: Education at the Crossroads in the Information Age; Worlding: Media, Identity, and Imagination; and The End of Reading: From Gutenberg to Grand Theft Auto.

Trend’s popular “Changing Creativity” course is taken each year by over 1000 students throughout the University of California system.

Find Your Superpower

“How to Find Your Superpower” is among thousands of recent articles, books, and improvement programs about the age-old dream of an updated self. Like others in its genre, the piece offers guidance for achieving “peak performance” through a blend of passion, mastery, and hard work. “The #1 thing you can do is determine your strengths, determine your superpowers,” the authors state in coaching readers to sharpen “a dominant gift, an attribute, skill or ability that makes you stronger than the rest: a difference between you and your coworker.”[i] Find that elusive something, and you are sure to succeed. Pitches like this appear everywhere these days. Witness the massive market for fitness, beauty, self-esteem, and cognitive improvement products. These range from dietary supplements and workout regimes to books, videos, and apps. Amazon is loaded with titles like Your Hidden Superpower, Finding Your Superpower, and the kids’ book What’s My Superpower?[ii]

Juvenile appeals notwithstanding, a consistent theme runs through all these books – that it is up to you alone to find, develop, or somehow acquire missing capacities. Rarely is there a mention of structural advantages or disadvantages in the superpower quest. The impulse to exceed one’s limits has a long history in Western thought, with roots in religious doctrine and philosophy. Some even link enhancement to hard-wired survival instincts. Simply put, people have been augmenting themselves for thousands of years, first by using tools, then by working in groups, and later with machines and technology. From the Enlightenment Era onward, this was seen as humanity’s “natural” impulse for continual improvement and progress. Ongoing developments in science and medicine have intensified this drive, along with the heightened sense of crisis in the 21st century. The result has been a growing mania to become stronger, smarter, and better looking than anyone else.

Continue reading “Find Your Superpower”

Welcome to Cyberschool

David Trend

While technology always has played a big part in education, it went into hyperdrive in the pandemic-driven move to online learning. Up to this point, economic pressures and growing student numbers already were causing a panic in education. Schools were struggling to trim budgets as “accountability” scrutinized everyone. These extant conditions presented an upside to some of the changes that would occur. Most dramatically, the shift to doing schoolwork at home eliminated shortfalls in classroom space and, at least temporarily, student housing as well. As the pandemic continued, the share of higher education offered online jumped from 10 percent in 2019 to 33 percent a few years later.[i] But as everyone now knows, so-called “distance learning” isn’t for everyone and doesn’t work for all kinds of material. Research shows that the one-size-fits-all character of mechanical course delivery disadvantages students of many kinds.

Online schooling isn’t as new as you might think. The idea of distance learning dates to vocational and self-improvement correspondence courses of the eighteenth century, which arose with improvements in mail delivery systems. Often cited as an early example was a shorthand course offered by Caleb Phillips, advertised in a 1721 edition of the Boston Gazette with claims that “students may by having several lessons sent weekly to them, be as perfectly instructed as those that live in Boston.”[ii] By the 1800s all manner of vocational skills were being taught by mail, as well as hobbies like drawing and painting. The University of London became the first college to offer distance learning degrees in 1858. By the end of the century, learning by mail had become big business for institutions like the Pennsylvania-based International Correspondence Schools (ICS). In the decade between 1895 and 1905, ICS grew from 72,000 to 900,000 students signing up to learn technical and management skills.[iii] Much of this growth was due to the innovation of sending entire textbooks rather than single lessons, along with promotion by a large in-person sales team.

The Learning Society

David Trend

As consumer prices continue to rise, experts now warn of a looming recession brought about by pandemic manufacturing slowdowns and supply-chain shortages. Economists explain it as a classic case of demand outpacing availability –– with scarcity making things more costly. Unfortunately, the painful solution now being launched will raise borrowing rates so that people spend less. While these measures may or may not improve the overall economy, the combined effects of inflation and rising interest rates will exact a double blow to people struggling to make ends meet. In such an atmosphere it becomes critical to help people manage their own finances and to prevent the broader economy from overheating. This is where consumer education and financial literacy can help as part of a larger move toward a “learning society.”

For some time now, economists have been promoting financial education in public schools and urging people to become more resourceful. Time Magazine reported polls showing “99 percent of adults in agreement that personal finance should be taught in high school.”[i] The Federal Reserve argued that “financial literacy and consumer education, coupled with strong consumer protections, make the financial marketplace ‘effective and efficient’ and assists consumers in making better choices.”[ii] Many colleges and universities have started making financial literacy courses graduation requirements. And for some it has worked, as many Americans “put their own budgets under the microscope –– akin to what financial analysts routinely do when they scrutinize companies.”[iii]

Continue reading “The Learning Society”

The Creative Inner Child?

David Trend

Pablo Picasso once quipped that “Every child is an artist; the problem is how to remain an artist once they grow up.”[i]  In this often-quoted slogan, Picasso neatly summarized idealized views of the universally creative child and the uncreative adult. In a similar fashion he would later write that, “It takes a long time to become young.” What is one to make of such laments? Nostalgia over a lost youth? A yearning to escape a pressurized grown-up life?  Regardless of origins, it’s impossible to deny America’s ongoing infatuation with childhood creativity.

This fascination with childhood artistry dates to the 1700s, corresponding to evolving views of children as “blank slates” (tabula rasa) better served by nurturance and education than by discipline alone. At the same time, Enlightenment debates over individualism and personal autonomy were bringing considerable anxiety to the era, evidenced in worries that self-interest would overwhelm moral sentiments. This set the stage for the naturalism espoused by Jean-Jacques Rousseau in his book Emile: Or, On Education, which saw an inherent “goodness” in children that becomes corrupted by adult desire and material want.[ii] With the 1800s, views of “human nature” gave way to theories of evolution and behavioral adaptation –– owing in large part to the influence of Charles Darwin and Herbert Spencer. While the resulting rationalism eventually would make education more formulaic, an artsy transcendentalism would counterbalance American culture with an advocacy for an “educated imagination.”[iii] The Romantic Era writings of Ralph Waldo Emerson, Margaret Fuller, Henry Wadsworth Longfellow, and Walt Whitman advanced themes of emotion over reason and imagination over reality –– setting in place a progressive tradition of push-back against the instrumentalist ethos of science and industry.

In the 1920s, Swiss psychologist Jean Piaget began charting children’s “stages” of maturity, hence launching the modern field of child development.[iv] Piaget saw “realistic” rendering as a learned ability rather than a natural inclination. In one famous study, Piaget asked a group of four-year-olds to draw familiar people or objects. He found that the images invariably had the same characteristics: drawn from memory rather than observation, exaggeration of certain salient features (faces, for example), and a disregard of perspective or scale. In other words, the images derived more from mental symbolism than from conventional schema of visual representation. Piaget would note that at later ages children acquire the ability to “correct” their images to conform to normative depictions of reality. Later observations of so-called “feral” children (raised in the wild without human contact) found that such children often didn’t speak or make pictures of any kind, further reinforcing the premise that language and “artistic” rendering were largely determined by culture.[v]

The Problem with Rigor

David Trend

“It’s Time to Cancel the Word Rigor,” read a recent headline in the respected Chronicle of Higher Education.[1]  The article detailed growing concerns about hidden bias within what many see as conventional teaching practices. Here, “rigor” was taken to task for throwing roadblocks up for some students more than others, even as its exact meaning remains vague. Webster’s Dictionary defines rigor as “severity, strictness or austerity,” which educators often translate into difficult courses and large amounts of work, rationalized in the interest of excellence and high standards.[2]

While there is nothing wrong with challenging coursework, per se, this interpretation of rigor often becomes a recipe for failure for otherwise intelligent and hardworking students. Such failures can result when rigor is used to incentivize or stratify students, as in gateway or “weed out” courses with prescribed grading targets, or situations where faculty overuse tests as motivation. Rigor discussions I have witnessed rarely consider instructional quality, teaching effectiveness, or principles of learning. Instead, faculty complain about poor student attention, comprehension, or commitment. As the Chronicle explains, “all credit or blame falls on individual students, when often it is the academic system that creates the constructs, and it’s the system we should be questioning when it erects barriers for students to surmount or make them feel that they don’t belong.”[3] Continue reading “The Problem with Rigor”

The Algorithm Rejected Me

David Trend

School is where most kids first become aware of what I call the “update imperative.” After all, education is a process of continual improvement, a step-by-step course of knowledge acquisition and socialization. In this sense schooling represents much more than the beginning of education. For many kids it’s a time of moving from the familiarity of home into the larger world of other people, comparative judgment, and a system of tasks and rewards. Along the way, a package of attitudes and beliefs is silently conditioned: conformity to norms, obedience to authority, and the cost of failure. All of this is presented with a gradually intensifying pressure to succeed, rationalized as a rehearsal for adult life. Rarely are the ideological parameters of this “hidden curriculum” ever challenged, or even recognized. Much like work, American K-12 schools are driven largely by mandates of individual achievement and material accumulation.

By the time college applications are due, levels of anxiety can run out of control, given the role of degrees in long-term earnings. Many students start the admissions Hunger Games as early as middle school, plotting their chances, polishing their transcripts, and doing anything they can to get good grades. Admissions data now flows freely in an age in which students apply to an average of 10 schools each. Unsurprisingly perhaps, overall applications have increased by 22% in the past year alone.[i] And while the applicant side of this equation has been much publicized, what happens in the admissions office remains shrouded in mystery. Largely unknown are secret criteria driven by algorithms to determine things like likelihood to enroll or willingness to pay. Even less known are the kinds of AI analytics used to monitor and grade students, sometimes making prejudicial judgments along the way. Continue reading “The Algorithm Rejected Me”

Why Professors Ignore the Science of Teaching

David Trend

A recent article appearing in the Chronicle of Higher Education explored the apparent reluctance of college and university professors to embrace the growing body of research about how students learn and what teaching methods work best. While many faculty simply cling to what has worked for them in the past, others feel overworked and unable to consider changing. In the meantime, an increasingly diverse student population experiences growing inequity as a result.

Beth McMurtrie’s “Why the Science of Teaching is Often Ignored” opens with a discussion of a recent study by five Harvard University researchers who published some novel research. The group was trying to figure out why active learning, a form of teaching that has had measurable success, often dies a slow death in the classroom. They compared the effects of a traditional lecture with active learning, where students solve problems in small groups.

The results were not surprising; students who were taught in an active method performed better on standardized tests. The academic press praised the study for its clever design and its resonance with professors who had trouble with active learning. Yet despite being praised in some quarters, the study was criticized in others.

This mixed reaction reveals a central paradox of higher education, according to McMurtrie. Teaching and learning research has grown dramatically over the decades, encompassing thousands of experiments, journals, books, and programs to bring learning science into classrooms. But a lot of faculty members haven’t read it, aren’t sure what to do with it, or are skeptical. Continue reading “Why Professors Ignore the Science of Teaching”

Inclusive Pedagogy

David Trend

The pandemic years have been rough on college students everywhere, with record levels of academic stress and losses in student learning. While occurring throughout higher education, these problems haven’t affected all groups the same way. Students from privileged backgrounds have fared better than the under-resourced, with disparities in network access, income, and external responsibilities exacerbating inequities. As I saw these dynamics play out in the large undergraduate general education courses I teach, I began wondering if instructional methods might be partly to blame and if changes might improve matters going forward. Working with UC Irvine’s Division of Teaching Excellence and Innovation (DTEI) helped me to rethink my own teaching by searching out ways that I unconsciously had been putting up roadblocks.

Usually when educators speak of “inclusion” they are thinking of course content and ways to incorporate diverse perspectives or voices previously excluded. While this approach remains a central tenet of inclusive teaching, a deeper look at the issue can reveal biases or barriers built into the teaching of even the most progressive educators. Practices of exclusion can be the result of habits or structures that have become so routinized in instruction that they seem natural or neutral approaches. Costly books, rigid deadlines, and high-stakes exams are among practices that privilege students with money, time flexibility, and testing skills, for example.

Faculty attitudes also can get in the way of inclusion. This often is manifest in principles of “rigor” intended to elevate worthy over unworthy students. Such attitudes create a scarcity mentality toward success rather than one that makes high achievement possible for all students. Decades of educational research have shown the deleterious effects of such practices in conflating grades with knowledge acquisition. The grade pressure that frequently drives “rigor” has been shown to affect some students more than others, while creating an atmosphere of anxiety and an emphasis on the types of learning that can be easily tested. Not only does this create learning inequities, but it also tends to discourage collaboration, questioning, and diverse opinion. Continue reading “Inclusive Pedagogy”

When School is a Factory

David Trend

For 20 years, I have been teaching large arts and humanities general education courses at the University of California, Irvine. These 400-student classes are part of the undergraduate “breadth requirements” common in most colleges and universities, and hence draw enrollments from across the academic disciplines. At UC Irvine, this means that most of the class comprises science, technology, engineering, and math (STEM) majors. Aside from an orientation to more practical fields, I’ve noticed a clear shift in student attitudes in recent years –– a heightened preoccupation with grades and rankings, combined with growing anxieties about future earnings. Many of my colleagues see this as well, often disparaging students more concerned with GPA metrics than learning itself, while increasingly behaving more like consumers of educational commodities. I take a more sanguine view.

Bear in mind that many of today’s college students grew up during the Great Recession, when families of all incomes had money worries. With scant knowledge of a world before 9/11, it’s little wonder that polls show millennials expecting lower earnings than their parents, seeing the United States on a downward spiral, and regarding the two-party system as fatally flawed.[i] Rising income inequality doesn’t help matters, especially at UC Irvine where 6 in 10 students get financial aid and half are the first in their families earning a college degree.[ii] Because of this, Irvine has been cited by the New York Times as the country’s leading “upward mobility engine” –– making the campus a national model of what public higher education can do.[iii] But it’s still not a cakewalk for degree seekers. As at most public universities in America, the majority of Irvine’s full-time students also work at jobs to make ends meet.[iv] Continue reading “When School is a Factory”

Big Data vs Artists and Everyone Else

David Trend

Heard about Generation Z? The demographic growing up in the 2000s? It’s a bigger group than Boomers or Millennials –– and it has one further distinction. “Members of Generation Z are ‘digital natives’ who cannot remember what it was like not to have access to the Internet –– no matter when, no matter what, no matter where,” according to Forbes Magazine. This is a group raised on networked “connecting” with others, sharing, and buying things. It’s second nature to Gen-Zers to upload their favorite music on YouTube, post images on Facebook, and sell things on Etsy or eBay. Much is being made in creative economy talk of how networks now blur traditional producer/consumer roles, manifest in the new figure of the “prosumer.” In Wikinomics: How Mass Collaboration Changes Everything authors Don Tapscott and Anthony D. Williams effused over the democratization inherent in the new “Openness, Peering, Sharing and Acting Globally.” Of course, there is nothing really new about home-made items, crafts, and people’s willingness to share. What’s different today is the ability to copy digitized materials and circulate them via electronic networks. Digitization also has made Generation Z the first demographic to be completely tracked by “big data” analytics.

Some creativity industry experts argue that this is nothing short of a revolution, driven by ongoing change more than any clear future. Evolutionary economist Jason Potts and collaborators have proposed what they term “Social Network Markets,” unlike the top-down models of industrial capitalism. Characterized by fluidity and exchange through complex fields of actors, the new social network markets are less governed by competition and profit than by communication and preference. Participants are “Not ‘buying’ the property, but buying into the social space.” Moreover, the dynamics of these new markets are highly interactive. As the Potts group put it, “a social network is defined as a connected group of individual agents who make production and consumption decisions based on the actions (signals) of other agents on the social network: a definition that gives primacy to communicative actions rather than connectivity alone.” Almost by definition, this process rules out conventional manufacturing or professional services. Instead, the networks generate value through production and consumption of network-valorized choices.

The beauty is that much of what is online now is free––seeming to arrive just in time in a tight economy. While a lot of the “free” stuff available online is user-generated (selfies, birthday announcements, anecdotal postings, etc.), a huge volume of material comes from other sources (news outlets, filmmakers, commercial music producers, artists). On the surface it looks like old Marxist doctrines are being reversed, as items seem to be “decommodified” in the sharing economy. This idea has become an anthem of resistance in some circles. The Burning Man Festival, to take one example, has stated: “When we commodify we seek to make others, and ourselves, more like things, and less like human beings. ‘Decommodification,’ then, is to reverse this process. To make the world and the people in it more unique, more priceless, more human.” This may be all well and good in the real-life sharing of food and weed at Burning Man. But when things get virtual, it’s usually a large corporation that owns the websites, servers, and networks that make sharing possible.

Belonging Where?

By David Trend:

Throughout its existence the United States has shown a strange tendency to turn against itself, dividing citizens against each other with a vehemence rivaling the most brutal regimes on earth. Some have rationalized the resulting crisis of “belonging” in America as an understandable consequence of cultural diversity, economic stress, and global threat. After all, haven’t there always been “insiders” and “outsiders” in every culture? Aren’t competition and aggression wired into human nature? Or is there something peculiar about the personality of the U.S.? Could it be that prejudice is the real legacy of “American Exceptionalism,” with traditions dating to the genocide of indigenous populations, the subjugation of women, the rise of slavery, the scapegoating of immigrants, and more recent assaults on the poor or anyone falling outside the realm of normalcy?

I discussed selected aspects of America’s divisive pathology in my book A Culture Divided: America’s Struggle for Unity, which was written in the closing years of the George W. Bush presidency. Like many at the time, I had completely given up on the idea of “common ground” amid the residue of post-9/11 reactionary fervor and emerging economic recession. Media commentators were buzzing constantly about red/blue state polarization. Opinions varied about the cause of the divide, attributing it to factors including regionalism, media sensationalism, partisan antipathy, or all of these combined. Also joining the fray were those asserting the divide was fabricated, with evenly divided elections showing most people in the middle of the curve on most issues. My somewhat contrarian view was that the “problem” shouldn’t be regarded as a problem at all. After all, America always had been divided––through war and peace, boom and bust. Division was the country’s national brand. But as a book about politics, A Culture Divided didn’t get to the roots or the lived experience of America’s compulsive divisiveness.

Speaking at the 50th anniversary of the Selma to Montgomery marches, President Barack Obama described America as an incomplete project––a nation caught between ideals of a perfect union and the lingering realities of their failure. While citing advances in civil liberties since the bloody apex of the Voting Rights Movement, Obama also spoke of a federal report issued just days earlier documenting structural racism and misbehavior toward African Americans by police in Ferguson, MO, where months before law enforcement officers had killed an unarmed black teenager. “We know the march is not yet over. We know the race is not yet won,” the President stated, adding, “We know that reaching that blessed destination requires admitting as much, facing up to the truth.”

Creative Magic

By David Trend

“The central question upon which all creative living hinges is: Do you have the courage to bring forth the treasures hidden within you?” With this entreaty, author Elizabeth Gilbert introduced her recent bestseller Big Magic: Creative Living Beyond Fear, which offered an artistic cure for an anxious American culture.[i] Speaking directly to widespread feelings of disaffection and powerlessness, Big Magic romanticized artistry in Gilbert’s signature blend of sentiment and cliché––packaging familiar views (human creativity, divine creativity, etc.) with a self-help twist about creating one’s “self” in new and better ways.  While one easily can write off Big Magic as yet another feel-good advice book (which it surely is), I think it’s time to take Gilbert’s approach to creativity seriously and ponder why such ideas now get so much traction.

Publicity doesn’t hurt. Reviewers effused over Big Magic as a “book-length meditation on inspiration” (Newsday) to “unlock your inner artist” (Woman’s Day) and “dream a life without limits” (Publishers Weekly).[ii] This message resonated with the rising chorus promoting creativity as an innovation engine and economic tonic. While no one would dispute the positive benefits of a little artistic dabbling, at what point does such wishful thinking begin to border on delusion? Or put another way, when does fantasy paper over reality? Might it be that America’s fondness for make-believe is partly behind the nation’s political confusion and disaffection? Do fairy-tale versions of life infantilize a citizenry that should know that answers don’t always come easily? Certainly the fantasy-version of reality offered by certain politicians would fail any thoughtful analysis. But instead, many leaders continue treating their constituents like children, with entire governments encouraging populations to set worries aside and simply “Be Creative.”

In Magical Thinking and the Decline of America, historian Richard L. Rapson took a long look at the nation’s romantic idealism. “Probably in no other society of the world can one write the script for one’s life as completely as in the United States. This fact has made the nation the ‘promised land’ for much of the world over the past two centuries,” Rapson wrote. “The flight into endless self-improvement and innocent optimism has a long lineage in our past.”[iii] Perhaps anticipating Donald Trump’s “Make America Great Again” sloganeering, Rapson pointed to the disconnection between America’s self-image as an “exceptional” driver of human history and the growing evidence of the nation’s falling fortunes. This has led to what Rapson described as a growing “flight from knowledge and reality into faith and fantasy,” resulting in large part from “an American public increasingly in thrall to the fairytales told by the mass media.”[iv] It also promotes a “cultural fixation on the individual, the personal, the biographical, the confessional, and, all too often, the narcissistic,” and hence the rise of new “magic words” like “self-awareness” and “personal growth,” and other aphorisms urging everyone to “be all that you can be.”[v]

Individualism lies at the heart of American idealism, dating to the country’s Enlightenment Era origins, when the autonomous subject was invented as a counterpoint to deific and royal authority. Necessary as individualism was (and remains), no one could have predicted how its value would be magnified and distorted in neoliberal times. The initial affirmation of personal identity, which encouraged people to vote and participate in society, soon morphed into “striving to get ahead” and “winning at any cost.” Eventually the “self” would become an American obsession of theological proportions. “The purpose of nearly all the current gospels is to put believers ‘in touch’ with themselves,” Rapson further explained.[vi] This new brand of secular “faith” also comports well with the religiosity many Americans still profess, especially evangelical strains that promise economic gain to dutiful worshippers.

The Performance Art of the Deal

By David Trend:

As I write these words, many Americans remain up in arms about President Donald Trump’s peculiar relationship with the truth. On a seemingly daily basis, the nation is greeted with a new round of accusations or indignant retorts from the President––most of which bear little resemblance to objective reality. Let’s just say the Commander-in-Chief has a very “creative” approach to factuality––about everything from crime and immigration to science and the judiciary. Perhaps he’s joking or trying to shock people. Or maybe he’s a pathological liar. Time Magazine devoted a cover to the President’s “Truth and Falsehoods”; the Los Angeles Times ran multiple “Why Trump Lies” editorials; and The New Yorker is now 14 installments into its ongoing “Trump and the Truth” series. Unsurprisingly, the President has doubled down on his claims and––in keeping with his fondness for conspiracy theories––labeled the entire field of journalism “the enemy of the American people.” Endless pundits and commentators have tried to discern a logic in the President’s bizarre behavior––in which mischief and chaos seem the only constants.

Say what you will about Trump, his ability to get public attention is astonishing. And while some critics question the President’s grasp of “reality,” others see a calculated shrewdness in his behavior––an underlying strategy not unlike what Naomi Klein discussed in The Shock Doctrine.  “We already know the Trump administration plans to deregulate markets, wage all-out war on ‘radical Islamic terrorism,’ trash climate science and unleash a fossil-fuel frenzy,” Klein recently stated, adding, “It’s a vision that can be counted on to generate a tsunami of crises and shocks.” She predicted economic shocks (as market bubbles burst), security shocks (as blowback from foreign belligerence comes home), weather shocks (as the climate is further destabilized), and industrial shocks (as oil pipelines spill and rigs collapse, especially when enjoying light-touch regulation).

“All this is dangerous enough,” Klein added. “What’s even worse is the way the Trump administration can be counted on to exploit these shocks politically and economically.” Trump himself forecast as much, often promising a “radical break” from the past––described by Fox News as a “shock and awe campaign against the Washington establishment.” This new agenda bears little resemblance to earlier “culture wars” between conventional liberal and conservative camps. Moral idealism has no place in Trump’s program of disruption and dishonesty. But his ability to confuse and deceive is not to be taken lightly. The Trump phenomenon raises important concerns about the role of knowledge in contemporary society––and the ways different worldviews are conceived, put into circulation, and frequently politicized.

On teaching style

Professors who want to establish classroom connections with their students receive lots of advice. Over the years some experts have advised the use of “self-disclosure”: telling students stories about themselves and using self-deprecating humor to make students feel comfortable and view the instructor as an ally. Inside Higher Ed discusses the finer points of this:

“Ignore that advice. That’s the recommendation of a study being published today in Communication Education, a journal of the National Communication Association. The study was based on surveys of 438 undergraduates at a Southeastern university. The students — from across disciplines — were asked about the class they had attended just before taking the survey. And for that class, they were asked both about their instructors and about whether they engaged in certain “uncivil” behaviors, such as packing up books before class was over or texting during lectures. The researchers then compared the attitudes students had about their professors with the students’ behaviors.

“The study notes that professors’ styles only go so far in predicting whether students will be posting status updates on Facebook or actually paying attention, but they do matter.

“Although it is clear that a range of factors outside of instructors’ control contribute to uncivil behavior in the classroom — such as societal shifts toward student entitlement and students’ being raised in homes where manners are not adequately taught — results of this study indicate that there are at least some things instructors can do to minimize uncivil behavior,” the study says. “This model, taking into account only instructor-related factors, explained 20 percent of the variance in self-reported uncivil behaviors among our participants — not a huge proportion, but enough to make a noticeable difference to a frustrated teacher.”

“Based on the surveys, the paper argues that students are least likely to engage in uncivil behavior when they view the instructor as having high levels of “credibility,” meaning that through actions and nonverbal cues, the instructor conveys command of the material and the class, a sense of knowing what should be going on in class, and so forth. When students have that confidence level, they are more likely to pay attention.
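For readers unfamiliar with the statistic quoted above, “explained 20 percent of the variance” refers to R-squared: the share of variation in an outcome that a model accounts for. A quick illustration with made-up numbers (not the study’s data):

```python
def r_squared(y, y_hat):
    """R^2 = 1 - (unexplained variation / total variation)."""
    mean_y = sum(y) / len(y)
    ss_tot = sum((yi - mean_y) ** 2 for yi in y)               # total variation
    ss_res = sum((yi - fi) ** 2 for yi, fi in zip(y, y_hat))   # left unexplained
    return 1 - ss_res / ss_tot

# Toy data: predictions capture most, but not all, of the variation.
print(r_squared([1, 2, 3, 4], [1.5, 1.5, 3.5, 3.5]))  # → 0.8
```

An R-squared of 0.20, as in the study, means instructor-related factors account for one-fifth of the differences in students’ self-reported incivility––modest, but as the authors note, enough to matter.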