The Backlash Against Inclusive Teaching

Yet another backlash against student diversity was discussed this past week in the Chronicle of Higher Education. This time the target was pandemic-era inclusive teaching: measures designed to mitigate the risk of student disconnection and failure, such as group work, deadline flexibility, enhanced faculty interaction, and Universal Design for Learning. Critics argue that these measures have created a lax academic environment and sapped student motivation. What is needed, they assert, are stricter and more difficult courses to force students back in line.

In the article “Why Calls for a Return to Rigor Are Wrong,” Chronicle columnist Kevin Gannon counters this perspective, contending that increased workloads, tougher grading, and heightened standards do not equate to academic rigor. He argues that these conventional methods often serve as a veneer for practices that raise barriers to student success rather than tearing them down.

Critics of the pandemic-era teaching efforts often focus on metrics such as the volume of reading per week, the number of writing assignments, or the time it takes to complete an academic program. According to them, these have fallen far too low. In essence, they equate “rigor” with the logistical demands of course delivery. However, Gannon emphasizes that higher education needn’t be prohibitive, and that introducing practices which stifle student motivation and engagement is counterproductive.

In the midst of this debate, the University of California, Irvine (UCI), has taken a progressive step toward educational inclusivity with the launch of the Inclusive Course Design Institute (ICDI). This post explores this transformative initiative, which serves as a beacon of inclusivity and equity in the shifting landscape of higher education.

Continue reading “The Backlash Against Inclusive Teaching”

The Good Life

How do you live a “good life”? It’s a question philosophers have pondered and pollsters still pose. Answers vary a lot, given differences in opinion and the breadth of the issue. What often comes to mind is a definition of happiness or what makes a life satisfying. For most people, the question entails both “self-directed” aspects of personal experience and “other-directed” elements of one’s place among others.[i] Definitions of the good life can refer to abundance (“luxury, pleasure, or comfort”) or insight (“simplicity, health, and morality”).[ii] Other qualities include freedom or the idea of life as a journey. This chapter explores how people view and pursue the good life, and what obstacles may stand in their way.

Discussions of the good life date to the ancient Greek concept of eudaimonia, a word commonly translated as “happiness,” “flourishing,” or “well-being.”[iii] Aristotle cast eudaimonia as an aspirational state that individuals could achieve by demonstrating authenticity and virtue in the eyes of the divine. This differed somewhat from the more immediate state of pleasure and enjoyment known as hedonia. As later philosophers gave people more credit for self-determination, Enlightenment-era figures like René Descartes and Baruch Spinoza linked the good life to a reasoned control of human passions.[iv] Christian interpretations of the good life sometimes gave it a moral character in beliefs that humans were created in God’s image, which is “good” by definition. In this line of thinking, virtue and success in life go hand-in-hand.

Historical figures sometimes made lists to define the good life. Socrates said such a life should follow five principles: temperance, courage, piety, justice, and wisdom.[v] Gautama Buddha spoke of an eightfold path of understanding, thought, speech, action, livelihood, effort, mindfulness, and concentration.[vi] Almost all traditional good-life lists had people conforming to widely held doctrines or belief systems, with the “self” cast as an element in a larger plan. In today’s more secular times, most people see the good life as a matter of perspective. Unfortunately, this relativization has brought with it a certain emptiness. A simple online search for the good life mostly turns up “bucket lists” of activities such as traveling or skydiving.

Continue reading “The Good Life”


Update Available: The Algorithmic Self

Bing, Bard, and other bots. The world is rushing headlong into a ChatGPT future. Yet amid the giddy optimism over boundless new capabilities lie deeper questions about how artificial intelligence is reshaping human consciousness in unnoticed ways. Update Available: The Algorithmic Self (2023) takes a critical look at this emerging phenomenon.

Update Available can be downloaded at no cost from Amazon, Apple, Barnes & Noble, and other major retailers, published as an Open Access Creative Commons book.

Other books by David Trend include Welcome to Cyberschool: Education at the Crossroads in the Information Age, Worlding: Media, Identity, and Imagination, and The End of Reading: From Gutenberg to Grand Theft Auto.

Trend’s popular “Changing Creativity” course is taken each year by over 1,000 students throughout the University of California system.

Find Your Superpower

“How to Find Your Superpower” is among thousands of recent articles, books, and improvement programs about the age-old dream of an updated self. Like others in its genre, the piece offers guidance for achieving “peak performance” through a blend of passion, mastery, and hard work. “The #1 thing you can do is determine your strengths, determine your superpowers,” the authors state, coaching readers to sharpen “a dominant gift, an attribute, skill or ability that makes you stronger than the rest: a difference between you and your coworker.”[i] Find that elusive something, and you are sure to succeed. Pitches like this appear everywhere these days. Witness the massive market for fitness, beauty, self-esteem, and cognitive improvement products. These range from dietary supplements and workout regimens to books, videos, and apps. Amazon is loaded with titles like Your Hidden Superpower, Finding Your Superpowers, and the kids’ book What’s My Superpower?[ii]

Juvenile appeals notwithstanding, a consistent theme runs through all these books – that it is up to you alone to find, develop, or somehow acquire missing capacities. Rarely is there a mention of structural advantages or disadvantages in the superpower quest. The impulse to exceed one’s limits has a long history in Western thought, with roots in religious doctrine and philosophy. Some even link enhancement to hard-wired survival instincts. Simply put, people have been augmenting themselves for thousands of years, first by using tools, then by working in groups, and later with machines and technology. From the Enlightenment Era onward, this was seen as humanity’s “natural” impulse for continual improvement and progress. Ongoing developments in science and medicine have intensified this drive, along with the heightened sense of crisis in the 21st century. The result has been a growing mania to become stronger, smarter, and better looking than anyone else.

Then add popular culture. Everyone knows the basic superhero plot: stories begin with ordinary characters (often underdogs), who transform via accident, discovery, or gift. With new powers, the superhero battles evil and invariably prevails. Such stories now comprise some of the most cherished works of mainstream media, generating fortunes for movie franchises: Marvel ($18.2 billion), Harry Potter ($9.1 billion), X-Men ($5.8 billion), DC Universe ($4.9 billion), Spider-Man ($4.8 billion).[iii] It’s easy to see the appeal of these films. In an essay titled “Why Everyone Has Seen a Superhero Movie,” critic Gwyneth Torrecampo explains that “The unique challenges we face in our everyday lives can be daunting and stressful, leading us to feel powerless and dejected.”[iv] Viewers thus identify with the hero as a form of wish fulfillment, she explains. “Superheroes often start out like you and me, and then go on to overcome obstacles, protect the vulnerable, or save the world. It’s a potent fantasy that inspires imitation among viewers.”

The superhero syndrome is the fantasy version of “human enhancement,” defined as “the natural, artificial, or technological alteration of the human body to enhance physical or mental abilities.”[v] On one hand, there is nothing terribly new or unusual about this. Running shoes and vitamins are enhancements that people take for granted. And indeed, much of modern medicine devotes itself to such helpful interventions, especially when they address genuine needs or difficulties. An appropriately determined restoration of health and functionality always has defined the healing professions, as discussed in Chapter 5. But in recent years, the marriage of science and business has gone well beyond “getting back to normal,” offering ever-more-sophisticated forms of enhancement to meet the public’s insatiable appetite for “more.” Not without controversy, however. The troubled histories of cosmetic surgery, fad diets, and steroid abuse are but a few notable examples. Certainly science-fiction superpower stories play a big part in the phenomenon. But on another level, the hunger for such products feeds on the gnawing anxiety now epidemic in the U.S. In addition to economic disparities and their socio-cultural underpinnings, new levels of perfectionism percolate in mainstream culture as well. Advertising only reinforces these impulses by linking them to products.

Human enhancement fascinated German philosopher Friedrich Nietzsche, known for his social criticism and advocacy of science. In 1883, Nietzsche introduced the aristocratic figure of the “Übermensch” (Superman) as an aspirational ideal for the human race. He argued that this “perfect human” could be achieved through secular means on earth (rather than in heaven) by improvements in health, creativity, and willpower. In making this claim, Nietzsche wasn’t simply promoting sci-fi fantasy. Putting the Übermensch in a broader context, Nietzsche explained that every society generates a set of body ideals, and that those ideals inform what societies value and how they behave. The Übermensch complemented then-popular beliefs about human evolution, especially the strain of thinking known as “eugenics.” Also introduced in 1883, eugenics applied Charles Darwin’s theories of natural selection to social policy by advocating the selective reproduction of certain classes of citizens over others (something Darwin himself never advocated). National leaders like Winston Churchill and Theodore Roosevelt supported the concept, along with many others around the world.[vi] Nazi eugenicists later would cite Nietzsche’s Superman concept in their genocidal program to perfect the Aryan race during World War II.

The excesses of early eugenics movements have tempered contemporary thinking about human enhancement but have done little to dampen yearnings for superhuman updates to the body and mind. Unforeseen consequences often come from what initially seem like good ideas. And enthusiasm has little patience for downsides. Adding commerce, culture, and health benefits to the mix, it’s no mystery why the update impulse is stronger than ever. This chapter examines the resulting contradictions in today’s improvement culture, as they play out in beauty, fitness, wellness, intelligence, and ability. Key in this discussion is the assertion that enhancements in themselves are neither good nor bad. Like many things, what matters is the degree to which they are pursued, as well as what happens when external values or pressures are placed upon them.

Many use the term “transhumanism” to describe the contemporary update impulse in everything from robotic cyborgs to artificial organs. As the name implies, transhumanism wants humanity to transcend its limitations, with a strong emphasis on subjective autonomy and the specialness of the human species. Philosophically speaking, the movement sees humanity in a contest with nature and the natural world. It partakes in the belief that humans should use nature for their own ends and master its processes with science and technology. This takes form in enhancements to augment the “natural” body or, ultimately, to forestall or eliminate the natural occurrence of death. Because of this, some critics equate transhumanism with anthropocentrism, as well as historic tendencies to denigrate groups seen as uncivilized, savage, or otherwise less-than-human owing to their proximity to nature.

Transhumanism differs from the similar term “posthumanism,” which looks at the way the human self is technologically mediated, and how humans coexist with other organisms. Writing in The Posthuman Glossary, Francesca Ferrando explained the distinction: “Transhumanism and posthumanism both emerged in the 1980s and 1990s, but the drives motivating them are rooted in different traditions of thought. Transhumanism traces its roots within the enlightenment and does not reject the human tradition; on the contrary transhumanism focuses specifically on human enhancement.” In contrast, posthumanism focuses on “the postmodern deconstruction of the human started in the 1960s and 1970s underlining the fact that, historically, not every human being has been recognized as such.”[vii]

British futurist Max More often gets credit for mapping out the first full-fledged philosophy of contemporary transhumanism in his 1990 “Principles of Extropy.”[viii] More used the term “extropy” (the opposite of entropy) to assert the continual evolution of “intelligent life beyond its current human form and limitations.”[ix] This can include anything from prosthetic limbs, brain implants, and gene splicing to futuristic plans for extending the human life span or uploading consciousness to the cloud. Most of what is seen in contemporary science fiction falls within the transhumanist realm, for better or worse. While transhumanism focuses on the laudable goal of improving people’s health, ability, and wellbeing, it often glamorizes technology as an end in itself. Transhumanists imagine a future in which people become liberated from the constraints of infirmity and death but remain essentially human.

Transhumanism often finds itself tangled in ethical debates about technology’s appropriate role in life. Some critics worry that too many changes might alter what it means to be human in the first place. Others point out that technologies often get misused or run out of control. And still others express concern about the high costs of enhancements. Underscoring this last point, some of transhumanism’s most well-known boosters are tech billionaires like Elon Musk and Peter Thiel. New technologies often spring from genuine needs and good intentions. Yet they inevitably become contingent on cultural attitudes, market forces, and the institutions that enable them.

[i] Gwen Moran, “How to Find Your Superpower,” Fast Company (Jun. 8, 2018) https://www.fastcompany.com/40578240/how-to-find-your-superpower (accessed Apr. 22, 2022).

[ii] Becca North, Your Hidden Superpower (Independently published, 2018); Carter Hughes, Finding Your Superpowers: Keys to Cementing Your Identity and Reaching Your Goals (Independently published, 2020); Aviaq Johnson and Tim Mack, What’s My Superpower? (New York: Inhabit Media, 2017).

[iii] Jennifer M. Wood, “10 Highest Grossing Movie Franchises of All Time,” Mental Floss (Mar. 18, 2019) https://www.mentalfloss.com/article/70920/10-highest-grossing-movie-franchises-all-time (accessed Apr. 19, 2022).

[iv] Gwyneth Torrecampo, “10 Reasons Why Everyone Has Seen a Superhero Movie,” Medium (Aug. 16, 2018) https://medium.com/framerated/10-reasons-why-superhero-films-are-so-popular-2ce69d2d93ea (accessed Apr. 19, 2022).

[v] “Human Enhancement,” Stanford Encyclopedia of Philosophy (Apr. 7, 2015) https://plato.stanford.edu/entries/enhancement/ (accessed May 20, 2022).

[vi] Victoria Brignell, “When America Believed in Eugenics,” New Statesman (Dec. 10, 2010) https://www.newstatesman.com/society/2010/12/disabled-america-immigration (accessed Apr. 24, 2022).

[vii] Francesca Ferrando, in Rosi Braidotti and Maria Hlavajova, eds., The Posthuman Glossary (London: Bloomsbury Academic, 2018) p. 439.

[viii] See Max More, “The Philosophy of Transhumanism,” in Max More and Natasha Vita-More, The Transhumanist Reader: Classical and Contemporary Essays on Science, Technology, Philosophy of the Human Future (Malden, MA: Wiley-Blackwell, 2013) p. 10.

[ix] More, p. 3.

Anxious Creativity for Free

Anxious Creativity: When Imagination Fails (Routledge) now is available without cost as an Open Access ebook thanks to funding from UC Irvine. You can get it as a Kindle ebook from Amazon or in PDF format from Routledge using this link.

Creativity is getting new attention in today’s America –– along the way revealing fault lines in U.S. culture. Surveys show people overwhelmingly seeing creativity as both a desirable trait and a work enhancement, yet most say they just aren’t creative. Like beauty and wealth, creativity seems universally desired but insufficiently possessed. Businesses likewise see innovation as essential to productivity and growth, but can’t bring themselves to risk new ideas. Even as one’s “inner artist” is hyped by a booming self-help industry, creative education dwindles in U.S. schools.

Anxious Creativity: When Imagination Fails examines this conceptual mess, while focusing on how America’s current edginess dampens creativity in everyone. Written in an engaging and accessible style, Anxious Creativity draws on current ideas in the social sciences, economics, and the arts. Discussion centers on the knotty problem of reconciling the expressive potential in all people with the nation’s tendency to reward only a few. Fortunately, there is some good news, as scientists, economists, and creative professionals have begun advocating new ways of sharing and collaboration. Building on these prospects, the book argues that America’s innovation crisis demands a rethinking of individualism, competition, and the ways creativity is rewarded.

Empowerment for Sale

“Yes You Can” (Sprint), “Be All That You Can Be” (U.S. Army), “Because You’re Worth It” (L’Oréal), and “Your World, Delivered” (AT&T). You’ve seen these new ads: pitches for products or services to let you “be yourself” or “take control” of some aspect of your life. It’s a new strategy called “empowerment marketing,” based on the premise that in a media-savvy age people are smarter about advertising and need to be approached in a way that flatters their evolved sensibilities. As a recent feature in Your Business put it, “Traditional marketing depends on creating anxiety in the customer, convincing her that she has a need that only the product or service sold can help her fill.” In contrast, “Empowerment marketing subverts traditional marketing techniques by recasting the consumer as the hero who has the power to effect change and use the product or service being sold to achieve success.”[i]

Nice as this sounds, it is really a case of putting old wine in new bottles. The example Your Business uses is the familiar Nike “Just Do It” campaign, which doesn’t so much promote a certain shoe as “the message that anyone can be an athlete if they’re willing to work hard.”[ii] And indeed, this is exactly the message that appears on the first page of Nike’s current website: “Your daily motivation with the latest gear, most effective workouts and the inspiration you need to test your limits––and unleash your potential,” with a fashion item lower on the page captioned “Dress like a champion.”[iii] In other words, the new empowerment advertising doesn’t really forgo conventional appeals to consumer anxiety. It simply personalizes the pitch with the lure of enhanced autonomy. The Nike ad itself sums up this contradiction perfectly in stating: “Life isn’t about finding your limits. It’s about realizing you have none.”[iv]

The New Case Against College

David Trend

It’s called the “paper ceiling” –– the barriers for skilled job seekers who lack a bachelor’s degree. Amid the brouhaha in recent years over admissions scams and student debt, a new line of attack is emerging against higher education. This one is being described as an ontological threat in that it questions the existence and value of college itself, while accusing the system of perpetuating multiple forms of inequity. Of course, higher education often has found itself a political football in the past. What makes this time different is its critique of qualities universities typically have seen as their strength. 

Everyone knows it’s been a tough few years for higher education. Even before the pandemic, colleges and universities were seeing public opinion souring over rising costs, political correctness, and faculty misbehavior –– causing more than a few students and their families to start doubting the value of a degree. With enrollments dropping during the “great disruption” at a pace not seen for half a century, concurrent changes in the American workplace have rendered college degrees unnecessary for a growing number of high-wage jobs. Yet many employers require four-year credentials anyway, in what some observers see as an antiquated habit and a cover for discrimination.

The numbers are deceptively simple: 75% of new jobs insist on a bachelor’s degree, while only 40% of potential applicants have one.[1] According to the advocacy group Opportunity@Work, employers mistakenly equate college completion with work aptitude, while disregarding self-acquired knowledge or non-academic experience. The group asserts that the nation’s undervalued workforce “has developed valuable skills through community college, certificate programs, military service, or on-the-job learning, rather than through a bachelor’s degree. Workers with experience, skills, and diverse perspectives are held back by a silent barrier.” As a consequence, over 50% of the American skilled workforce has been underemployed and underpaid.[2] More concerning still is that such discrimination is unevenly distributed. The 70-million-worker cohort of what are termed STARs (Skilled Through Alternative Routes) includes 61% of Black workers, 55% of Hispanic/Latino workers, and 61% of veterans.[3]

You 2.0 – The Will to Improve

David Trend

You’ve probably never heard of TestingMom.com. It’s part of a new generation of test-prep companies like Kaplan and Princeton Review –– except this one is for toddlers. Competition for slots in kindergarten has gotten so intense that some parents are shelling out thousands to get their four-year-olds ready for entrance tests or interviews. It’s just one more example of the pressure that got celebrity parents arrested for falsifying college applications a few years ago. In this case the battle is over getting into elite elementary schools or gifted programs. While such admissions pressure is widely known, what’s new is how early it’s occurring. Equity issues aside, the demand to improve performance is being drilled into youngsters before they can spell their names. All of this bespeaks the competition for grades, school placement, and eventual careers that has transformed the normal impulse to do better into an obsession for students and their families. Much like the drive for perfection, an insatiable hunger to be quicker, smarter, and more acceptable to admissions officers is taking its toll in many ways.

What explains this obsessive behavior? Brain science has been proving what advertising long has known –– that wanting something is far more powerful than getting it. School admissions and other markers of success are part of an overarching mental wanting mechanism. That new iPhone might bring a thrill. But soon comes the yearning for an update, a newer model, another purchase. Neuroimaging shows that processes of “wanting” and “liking” occur in different parts of the brain, with the former operating more broadly and powerfully than the latter. This reverses the common wisdom that primal hungers and “drives” underlie human motivation. Unlike animals, human beings are driven by imagination –– with the anticipation of something mattering more than the experience itself. This partly explains why merchandising deals more with feelings than facts. Slogans like “Just Do It” and “Think Different” bear no direct relationship to shoes or computers, but instead tingle feelings of desire. In the fuzzy realm of emotion, pleasure is a fungible currency.

Especially in the contemporary world, anticipation is a bigger animating force than what follows. Researchers believe the dominance of wanting affects all manner of everyday behaviors, from reaching for a candy bar or playing a game to calling up a friend or striving for success.[i]  So powerful is this expectation mechanism that it gets people wanting things that give no benefit. As it turns out, brain mechanisms for “wanting” are bigger and more complex than the ones for “liking,” and they carry more unconscious baggage. This helps explain the addictive consumerism throughout American culture, as well as why money and achievement often bring little lasting meaning. It’s also one reason why people eat to the point of obesity or habitually do things they don’t really enjoy. Put another way, it’s a key to understanding the update impulse explored throughout this book.

In a broader sense this brain function can shed light on how major life decisions get affected by emotional desire. Economists generally assume that people work hard at their jobs so they can buy things.  Neuroscience increasingly shows how chasing money and even work itself can be their own rewards. In a now famous experiment, researchers watched a certain region of the brain –– the nucleus accumbens –– as study participants reacted to the prospect of receiving money. As reported in Harvard Business Review, the higher the potential monetary reward, the more active the accumbens became. “But activity ceased at the time the subjects actually received the money—suggesting that it was the anticipation, and not the reward itself, that aroused them.”[ii] So just think about this. If people can be so misguided about something as fundamental as why they work, what other things might they be getting wrong?

“The brain seems stingier with mechanisms for pleasure than for desire,” stated Kent Berridge, the scientist primarily known for these findings.[iii] Berridge’s main discovery was that dopamine, the so-called “feel-good neurotransmitter,” had little to do with the pleasure of eating sweets or winning a game. Instead, dopamine’s real power lay in the expectation of enjoyment experienced in desires, unconscious thoughts, and even the memories of pleasure. Building on this, Berridge concluded that the brain’s pleasure system also drove motivations for pursuing success, the good life, and well-being. Yet often these motivations rested on a fundamental misunderstanding of genuine pleasure –– a failure to see that true joy was mainly a mental construct disconnected from actual experience. In this sense, achievement is more a matter of attitude than an objective reality. Just as importantly, the same process of wanting is the engine of anxiety –– when people expect the worst or worry about bad outcomes.

These questions about wanting and liking have a lot to do with the will to improve –– and why we invest so much of ourselves in school, work, relationships, and society. So often in life people simply assume they are on the right path, their goals rational and self-evident. Caught up in climbing to the next rung of the ladder, few ever take time to ask just why they are climbing. But philosophers and psychologists have spent a lot of time on this issue, and some of what they say might surprise you. The topic of “motivation” has a long history and has gone by many names: the will to live, the survival instinct, the competitive impulse, the drive for self-preservation, following God’s plan, or striving, struggling, seeking pleasure or comfort. In what follows, I’ll review this history and then bring the topic up to date, ultimately discussing methods everyone can use to critically evaluate how to self-improve.


[i] Kent Berridge and John P. O’Doherty, “From Experience Utility to Decision Utility,” Neuroeconomics (2014) p. 337.

[ii] Gardiner Morse, “Decisions and Desire,” Harvard Business Review (Jan. 2006) https://hbr.org/2006/01/decisions-and-desire (accessed Feb. 2, 2021).

[iii] “Why ‘Wanting’ and ‘Liking’ Something Simultaneously is Overwhelming,” University of Michigan (Mar. 3, 2007) https://news.umich.edu/why-wanting-and-liking-something-simultaneously-is-overwhelming/ (accessed Feb. 2, 2021).

College Art in Crisis

David Trend

It might surprise many to know that no systematic studies exist of college and university-level arts programs. This is partly due to the way art in higher education fragments into academic disciplines and professional training programs, as well as the complex array of public and private schools, community colleges and research universities, and the ever-expanding variety of for-profit entities and online learn-at-home opportunities. The National Center for Education Statistics (NCES) does, however, provide rough disciplinary percentages of bachelor’s degrees earned by America’s estimated 18.7 million college students. Of these, 5.1 percent graduated in the “Visual and Performing Arts” category, and another 4.6 percent in “Communications and Journalism.” Larger breakdowns included “Business” at 19.4 percent, “Health Sciences” at 10.7 percent, and “Social Science” at 9.2 percent.[i] Beyond this, anecdotal evidence abounds of a decade-long decline in arts and humanities programs, described by many as a continuing crisis. The recession is partly to blame, with many students and their families simply opting for more surefire career paths, especially as college tuitions have risen.

On the other hand, college art has found new friends among creative economy advocates, with educators jumping on claims from people like Richard Florida that 30 percent of today’s jobs require creative skills.[ii] Making the most of this, the National Endowment for the Arts (NEA) recently released a report entitled “The Arts and Economic Growth,” compiled in partnership with the U.S. Bureau of Economic Analysis.[iii] The document claimed that “arts and culture” contributed $704-billion to the U.S. economy (4.2 percent of GDP) and a whopping 32.5 percent of GDP growth in the past 15 years. This is more than sectors like construction ($619-billion) and utilities ($270-billion), perhaps because the study defined art so broadly –– encompassing advertising, broadcasting, motion pictures, publishing, and arts-related merchandising, as well as the performing and visual arts themselves. This prompted a piece entitled “Who Knew? Arts Education Fuels the Economy” in the respected Chronicle of Higher Education, which noted similar findings from business groups. Among these was the Partnership for 21st-Century Learning, a coalition of corporate and educational leaders and policy makers, which said that “Education in dance, theater, music, and the visual arts helps instill the curiosity, creativity, imagination, and capacity for evaluation that are perceived as vital to a productive U.S. work force.”[iv] The Conference Board, an international business-research organization, polled employers and school superintendents, finding “that creative problem-solving and communications are deemed important by both groups for an innovative work force.”[v] And IBM, in a report based on face-to-face interviews with more than 1,500 CEOs worldwide, concluded that “creativity trumps other leadership characteristics” in an era of rising complexity and continual change.[vi]

Welcome to Cyberschool

David Trend

While technology always has played a big part in education, it went into hyperdrive in the pandemic-driven move to online learning. Up to this point, economic pressures and growing student numbers already were causing a panic in education. Schools were struggling to trim budgets as “accountability” scrutinized everyone. Given these pressures, some of the coming changes even carried an upside. Most dramatically, the shift to doing schoolwork at home eliminated shortfalls in classroom space and, at least temporarily, student housing as well. As the pandemic continued, the share of higher education offered online jumped from 10 percent in 2019 to 33 percent a few years later.[i] But as everyone now knows, so-called “distance learning” isn’t for everyone and doesn’t work for all kinds of material. Research shows that the one-size-fits-all character of mechanical course delivery disadvantages students of many kinds.

Online schooling isn’t as new as you might think. The idea of distance learning dates to the vocational and self-improvement correspondence courses of the eighteenth century, which arose with improvements in mail delivery systems. Often cited as an early example was a shorthand course offered by Caleb Phillips, advertised in a 1721 edition of the Boston Gazette with claims that “students may by having several lessons sent weekly to them, be as perfectly instructed as those that live in Boston.”[ii] By the 1800s all manner of vocational skills were being taught by mail, as well as hobbies like drawing and painting. The University of London became the first college to offer distance learning degrees in 1858. By the end of the century, learning by mail had become big business for institutions like the Pennsylvania-based International Correspondence Schools (ICS). In the decade between 1895 and 1905, ICS grew from 72,000 to 900,000 students signing up to learn technical and management skills.[iii] Much of this growth was due to the innovation of sending entire textbooks rather than single lessons, along with promotion by a large in-person sales team.

The Learning Society

David Trend

As consumer prices continue to rise, experts now warn of a looming recession brought about by pandemic manufacturing slowdowns and supply-chain shortages. Economists explain it as a classic case of demand outpacing availability –– with scarcity making things more costly. Unfortunately, the painful solution now being launched will raise borrowing costs so that people spend less. While these measures may or may not improve the overall economy, the combined effects of inflation and rising interest rates will exact a double blow on people struggling to make ends meet. In such an atmosphere it becomes critical to help people manage their own finances and to prevent the broader economy from overheating. This is where consumer education and financial literacy can help as part of a larger move toward a “learning society.”

For some time now, economists have been promoting financial education in public schools and urging people to become more resourceful. Time Magazine reported polls showing “99 percent of adults in agreement that personal finance should be taught in high school.”[i] The Federal Reserve argued that “financial literacy and consumer education, coupled with strong consumer protections, make the financial marketplace ‘effective and efficient’ and assists consumers in making better choices.”[ii] Many colleges and universities have started making financial literacy courses graduation requirements. And for some it has worked, as many Americans “put their own budgets under the microscope –– akin to what financial analysts routinely do when they scrutinize companies.”[iii]

Continue reading “The Learning Society”

The Creative Inner Child?

David Trend

Pablo Picasso once quipped that “Every child is an artist; the problem is how to remain an artist once they grow up.”[i] In this often-quoted slogan, Picasso neatly summarized idealized views of the universally creative child and the uncreative adult. In a similar fashion he would later write that “It takes a long time to become young.” What is one to make of such laments? Nostalgia over a lost youth? A yearning to escape a pressurized grown-up life? Regardless of origins, it’s impossible to deny America’s ongoing infatuation with childhood creativity.

This fascination with childhood artistry dates to the 1700s, corresponding to evolving views of children as “blank slates” (tabula rasa) better served by nurturance and education than by discipline alone. At the same time, Enlightenment debates over individualism and personal autonomy were bringing considerable anxiety to the era, evidenced in worries that self-interest would overwhelm moral sentiments. This set the stage for the naturalism espoused by Jean-Jacques Rousseau in his book Emile: Or, On Education, which saw an inherent “goodness” in children that becomes corrupted by adult desire and material want.[ii] With the 1800s, views of “human nature” gave way to theories of evolution and behavioral adaptation –– owing in large part to the influence of Charles Darwin and Herbert Spencer. While the resulting rationalism eventually would make education more formulaic, an artsy transcendentalism would counterbalance American culture with an advocacy for an “educated imagination.”[iii] The Romantic Era writings of Ralph Waldo Emerson, Margaret Fuller, Henry Wadsworth Longfellow, and Walt Whitman advanced themes of emotion over reason and imagination over reality –– setting in place a progressive tradition of push-back against the instrumentalist ethos of science and industry.

In the 1920s, Swiss psychologist Jean Piaget began charting children’s “stages” of maturity, launching the modern field of child development.[iv] Piaget saw “realistic” rendering as a learned ability rather than a natural inclination. In one famous study, Piaget asked a group of four-year-olds to draw familiar people or objects. He found that the images invariably had the same characteristics: drawn from memory rather than observation, exaggerating certain salient features (faces, for example), and disregarding perspective or scale. In other words, the images derived more from mental symbolism than from conventional schemas of visual representation. Piaget would note that at later ages children acquire the ability to “correct” their images to conform to normative depictions of reality. Later observations of so-called “feral” children (raised in the wild without human contact) found that such children often didn’t speak or make pictures of any kind, further reinforcing the premise that language and “artistic” rendering are largely determined by culture.[v]

Stop Blaming Students: Toward a Post-Pandemic Pedagogy

David Trend

There’s trouble in the college classroom these days. But you can’t blame students. The pandemic and other disruptions of the past two years have shaken higher education to the core, casting doubt on how universities deliver instruction, pay their bills, and justify their existence. Enrollments are dropping across the nation, as students and their families increasingly see college as overpriced, inequitable, and non-essential. More disturbing still are shifts taking place within institutions themselves, as dispirited students are losing motivation and enthusiasm for learning. Clearly something has to change, with many pointing to the classroom as a key place to start. But will it be enough?

“A Stunning Level of Disconnection” is the way one recent article described the situation. “Fewer students show up to class. Those who do avoid speaking when possible. Many skip the readings or the homework. They have trouble remembering what they learned and struggle on tests,” one professor reported.[1] Instructors are trying to reach and teach students, to figure out the problem, and do anything they can to fix things, with many now concluding in frustration that “It may be necessary to change the structure of college itself.” Call it a stress test for higher education – the seismic disruption of the college classroom during the COVID-19 years, and its ongoing aftershocks. At all levels of instruction, educators continue to voice alarm over the persistent malaise and underperformance of college students.

The Problem with Rigor

David Trend

“It’s Time to Cancel the Word Rigor,” read a recent headline in the respected Chronicle of Higher Education.[1] The article detailed growing concerns about hidden bias within what many see as conventional teaching practices. Here, “rigor” was taken to task for throwing up roadblocks for some students more than others, even as its exact meaning remains vague. Webster’s Dictionary defines rigor as “severity, strictness or austerity,” which educators often translate into difficult courses and large amounts of work, rationalized in the interest of excellence and high standards.[2]

While there is nothing wrong with challenging coursework per se, this interpretation of rigor often becomes a recipe for failure for otherwise intelligent and hardworking students. Such failures can result when rigor is used to incentivize or stratify students, as in gateway or “weed out” courses with prescribed grading targets, or situations where faculty overuse tests as motivation. Rigor discussions I have witnessed rarely consider instructional quality, teaching effectiveness, or principles of learning. Instead, faculty complain about poor student attention, comprehension, or commitment. As the Chronicle explains, “all credit or blame falls on individual students, when often it is the academic system that creates the constructs, and it’s the system we should be questioning when it erects barriers for students to surmount or make them feel that they don’t belong.”[3] Continue reading “The Problem with Rigor”

The Algorithm Rejected Me

David Trend

School is where most kids first become aware of what I call the “update imperative.” After all, education is a process of continual improvement, a step-by-step regimen of knowledge acquisition and socialization. In this sense schooling represents much more than the beginning of education. For many kids it’s a time of moving from the familiarity of home into the larger world of other people, comparative judgment, and a system of tasks and rewards. Along the way, a package of attitudes and beliefs is silently conditioned: conformity to norms, obedience to authority, and the cost of failure. All of this is presented with a gradually intensifying pressure to succeed, rationalized as a rehearsal for adult life. Rarely are the ideological parameters of this “hidden curriculum” ever challenged, or even recognized. Much like work, American K-12 schools are driven largely by mandates of individual achievement and material accumulation.

By the time college applications are due, levels of anxiety can run out of control, given the role of degrees in long-term earnings. Many students start the admissions Hunger Games as early as middle school, plotting their chances, polishing their transcripts, and doing anything they can to get good grades. Everyone knows how admissions data now flows in an age in which students apply to an average of 10 schools each. Unsurprisingly perhaps, overall applications have increased by 22% in the past year alone.[i] And while the applicant side of this equation has been much publicized, what happens in the admissions office remains shrouded in mystery. Largely unknown are the secret criteria driven by algorithms to determine things like likelihood to enroll or willingness to pay. Even less known are the kinds of AI analytics used to monitor and grade students, sometimes making prejudicial judgments along the way. Continue reading “The Algorithm Rejected Me”

Why Professors Ignore the Science of Teaching

David Trend

A recent article appearing in the Chronicle of Higher Education explored the apparent reluctance of college and university professors to embrace the growing body of research about how students learn and what teaching methods work best. While many faculty simply cling to what has worked for them in the past, others feel overworked and unable to consider changing. In the meantime, an increasingly diverse student population experiences growing inequity as a result.

Beth McMurtrie’s “Why the Science of Teaching is Often Ignored” opens with a discussion of a novel study published by five Harvard University researchers. The group was trying to figure out why active learning, a form of teaching that has had measurable success, often dies a slow death in the classroom. They compared the effects of a traditional lecture with active learning, where students solve problems in small groups.

The results were not surprising: students taught with active-learning methods performed better on standardized tests. The academic press praised the study for its clever design and its resonance with professors who had trouble making active learning work. Yet despite the praise in some quarters, the study drew criticism in others.

This mixed reaction reveals a central paradox of higher education, according to McMurtrie. Teaching and learning research has grown dramatically over the decades, encompassing thousands of experiments, journals, books, and programs to bring learning science into classrooms. But a lot of faculty members haven’t read it, aren’t sure what to do with it, or are skeptical. Continue reading “Why Professors Ignore the Science of Teaching”

Inclusive Pedagogy

David Trend

The pandemic years have been rough on college students everywhere, with record levels of academic stress and losses in student learning. While occurring throughout higher education, these problems haven’t affected all groups the same way. Students from privileged backgrounds have fared better than the under-resourced, with disparities in network access, income, and external responsibilities exacerbating inequities. As I saw these dynamics play out in the large undergraduate general education courses I teach, I began wondering if instructional methods might be partly to blame and if changes might improve matters going forward. Working with UC Irvine’s Division of Teaching Excellence and Innovation (DTEI) helped me rethink my own teaching by searching out the ways I unconsciously had been putting up roadblocks.

Usually when educators speak of “inclusion” they are thinking of course content and ways to incorporate diverse perspectives or voices previously excluded. While this approach remains a central tenet of inclusive teaching, a deeper look at the issue can reveal biases or barriers built into the teaching of even the most progressive educators. Practices of exclusion can result from habits or structures that have become so routinized in instruction that they seem natural or neutral approaches. Costly books, rigid deadlines, and high-stakes exams are among practices that privilege students with money, time flexibility, and testing skills, for example.

Faculty attitudes also can get in the way of inclusion. This often is manifest in principles of “rigor” intended to elevate worthy over unworthy students. Such attitudes create a scarcity mentality toward success rather than one that makes high achievement possible for all students. Decades of educational research have shown the deleterious effects of such practices in conflating grades with knowledge acquisition. The grade pressure that frequently drives “rigor” has been shown to affect some students more than others, while creating an atmosphere of anxiety and an emphasis on the types of learning that can be easily tested. Not only does this create learning inequities, but it also tends to discourage collaboration, questioning, and diverse opinion. Continue reading “Inclusive Pedagogy”

Loneliness of the Long Distance Learner

David Trend

No one could have predicted the radical changes in education of the early 2020s. Besides making the once-obscure Zoom into a household name, the pandemic accelerated an already fast-moving takeover of everyday life by the internet. The economic consequences were profound, with revenues exploding for companies like Netflix and Amazon while brick-and-mortar retail outlets and restaurants disappeared by the thousands. Of course nothing about the upheaval was especially surprising in historical terms. Cataclysmic events like disasters and wars often leave places quite different than they were before, as systemic restraints give way to radical reorganization. Emergency measures accepted in the moment have a habit of leaving remnants in place, much as occurred with online learning. Not that this is always a bad thing. Urgent situations can trigger remarkable innovation and creativity, seen in the hundreds of ways educators found to keep instruction going. But just as often people get hurt in the rush, as short-term solutions make for long-term problems.

Seen in retrospect, the rapid transition to online learning certainly falls into this latter category, evidenced in the huge numbers of students who failed or dropped out of classes, with those affected overwhelmingly the historically underserved. Changes occurred and learning was disrupted. But the convenience and efficiencies of the virtual classroom were too good to let go. “Online Learning is Here to Stay” read a feature in the New York Times, citing a study from the Rand Corporation saying that 20 percent of schools were choosing to continue portions of their online offerings. “Families have come to prefer stand-alone virtual schools and districts are rushing to accommodate, but questions still linger.”[i] Questions indeed. Before the pandemic less than one percent of K-12 schooling took place online. Educational reasons notwithstanding, this also had to do with the function of school as childcare for working families. The idea of a twenty-fold increase in home learning raises the question of which parent demographics are driving the shift. Or more to the point, who has gained from the online shift and who lost out? Continue reading “Loneliness of the Long Distance Learner”

Turn-U-In: Treating Students as Suspects

David Trend

It’s no secret that online learning has its problems, witnessed in the historic failure and drop-out rates resulting from thrown-together course overhauls in the early COVID months. Less widely reported has been another kind of failure owing to a loss of faith in educational institutions and a widening trust gap between teachers and students.

Inherent school power inequities have aggravated antagonisms – now made even worse by a range of surveillance and security technologies. The distance in “distance learning” can create an atmosphere of alienation and distrust. When the in-person classroom is reduced to a screen image, teachers and students can seem more like abstractions than actual people.

This opens the door for all sorts of communication failures and misunderstandings, not to mention stereotyping and harm. The objectifying tendencies of media representations long have been associated with distortions in the way individuals and groups view each other, whether in the marketing of products, sensationalizing news items, or spreading ideologies on social networks. When “Zoom school” does this, underlying beliefs and assumptions can overtake the reality of encounters, generating attitudes that destabilize the learning environment.

These problems have become especially evident in the panic about student dishonesty in online learning, as the absence of classroom proximity quickly escalated into assumptions of cheating. Early in the 2020s a torrent of news reports warned of an “epidemic” of dishonesty in online learning, with some surveys showing over 90 percent of educators believing cheating occurred more in distance education than in-person instruction.[i] New technologies often have stoked such fears, in this instance building on the distrust many faculty hold toward students, some of it racially inflected.[ii] Closer examination of the issue has revealed that much of the worry came from faculty with little direct knowledge of the digital classroom, online student behavior, and preventative techniques now commonly used. Indeed, more recent research has shown no significant differences between in-person and online academic integrity.[iii] Continue reading “Turn-U-In: Treating Students as Suspects”