The Problem with Rigor

David Trend

“It’s Time to Cancel the Word Rigor,” read a recent headline in the respected Chronicle of Higher Education.[1] The article detailed growing concerns about hidden bias within what many see as conventional teaching practices. Here, “rigor” was taken to task for throwing up roadblocks for some students more than others, even as its exact meaning remains vague. Webster’s Dictionary defines rigor as “severity, strictness or austerity,” which educators often translate into difficult courses and large amounts of work, rationalized in the interest of excellence and high standards.[2]

While there is nothing wrong with challenging coursework per se, this interpretation of rigor often becomes a recipe for failure for otherwise intelligent and hardworking students. Such failures can result when rigor is used to incentivize or stratify students, as in gateway or “weed out” courses with prescribed grading targets, or situations where faculty overuse tests as motivation. Rigor discussions I have witnessed rarely consider instructional quality, teaching effectiveness, or principles of learning. Instead, faculty complain about poor student attention, comprehension, or commitment. As the Chronicle explains, “all credit or blame falls on individual students, when often it is the academic system that creates the constructs, and it’s the system we should be questioning when it erects barriers for students to surmount or make them feel that they don’t belong.”[3]

The Algorithm Rejected Me

David Trend

School is where most kids first become aware of what I call the “update imperative.” After all, education is a process of continual improvement, a step-by-step progression of knowledge acquisition and socialization. In this sense schooling represents much more than the beginning of education. For many kids it’s a time of moving from the familiarity of home into the larger world of other people, comparative judgment, and a system of tasks and rewards. Along the way, a package of attitudes and beliefs is silently conditioned: conformity to norms, obedience to authority, and the cost of failure. All of this is presented with a gradually intensifying pressure to succeed, rationalized as a rehearsal for adult life. Rarely are the ideological parameters of this “hidden curriculum” ever challenged, or even recognized. Much like work, American K-12 schools are driven largely by mandates of individual achievement and material accumulation.

By the time college applications are due, levels of anxiety can run out of control, given the role of degrees in long-term earnings. Many students start the admissions Hunger Games as early as middle school, plotting their chances, polishing their transcripts, and doing anything they can to get good grades. Admissions data now flows in unprecedented volume, with students applying to an average of 10 schools each. Unsurprisingly perhaps, overall applications have increased by 22 percent in the past year alone.[i] And while the applicant side of this equation has been much publicized, what happens in the admissions office remains shrouded in mystery. Largely unknown are secret algorithm-driven criteria used to determine things like likelihood to enroll or willingness to pay. Even less known are the kinds of AI analytics used to monitor and grade students, sometimes making prejudicial judgments along the way.

Why Professors Ignore the Science of Teaching

David Trend

A recent article in the Chronicle of Higher Education explored the apparent reluctance of college and university professors to embrace the growing body of research about how students learn and what teaching methods work best. While many faculty simply cling to what has worked for them in the past, others feel overworked and unable to consider changing. In the meantime, an increasingly diverse student population experiences growing inequity as a result.

Beth McMurtrie’s “Why the Science of Teaching Is Often Ignored” opens with a discussion of a recent study by five Harvard University researchers. The group was trying to figure out why active learning, a form of teaching that has had measurable success, often dies a slow death in the classroom. They compared the effects of a traditional lecture with active learning, where students solve problems in small groups.

The results were not surprising; students taught with active-learning methods performed better on standardized tests. The academic press praised the study for its clever design and its resonance with professors who had struggled with active learning. Yet despite being praised in some quarters, the study was criticized in others.

This mixed reaction reveals a central paradox of higher education, according to McMurtrie. Teaching and learning research has grown dramatically over the decades, encompassing thousands of experiments, journals, books, and programs to bring learning science into classrooms. But a lot of faculty members haven’t read it, aren’t sure what to do with it, or are skeptical.

Inclusive Pedagogy

David Trend

The pandemic years have been rough on college students everywhere, with record levels of academic stress and losses in student learning. While occurring throughout higher education, these problems haven’t affected all groups the same way. Students from privileged backgrounds have fared better than the under-resourced, with disparities in network access, income, and external responsibilities exacerbating inequities. As I saw these dynamics play out in the large undergraduate general education courses I teach, I began wondering if instructional methods might be partly to blame and if changes might improve matters going forward. Working with UC Irvine’s Division of Teaching Excellence and Innovation (DTEI) helped me to rethink my own teaching by searching out ways that I unconsciously had been putting up roadblocks.

Usually when educators speak of “inclusion” they are thinking of course content and ways to incorporate diverse perspectives or voices previously excluded. While this approach remains a central tenet of inclusive teaching, a deeper look at the issue can reveal biases or barriers built into the teaching of even the most progressive educators. Practices of exclusion can result from habits or structures that have become so routinized in instruction that they seem natural or neutral approaches. Costly books, rigid deadlines, and high-stakes exams are among practices that privilege students with money, time flexibility, and testing skills, for example.

Faculty attitudes also can get in the way of inclusion. This often is manifest in principles of “rigor” intended to elevate worthy over unworthy students. Such attitudes create a scarcity mentality toward success rather than one that makes high achievement possible for all students. Decades of educational research have shown the deleterious effects of such practices in conflating grades with knowledge acquisition. The grade pressure that frequently drives “rigor” has been shown to affect some students more than others, while creating an atmosphere of anxiety and an emphasis on types of learning that can be easily tested. Not only does this create learning inequities, but it also tends to discourage collaboration, questioning, and diverse opinion.

Loneliness of the Long Distance Learner

David Trend

No one could have predicted the radical changes in education of the early 2020s. Besides making the once-obscure Zoom into a household name, the pandemic accelerated an already fast-moving takeover of everyday life by the internet. The economic consequences were profound, with revenues exploding for companies like Netflix and Amazon while brick-and-mortar retail outlets and restaurants disappeared by the thousands. Of course nothing about the upheaval was especially surprising in historical terms. Cataclysmic events like disasters and wars often leave places quite different than they were before, as systemic restraints give way to radical reorganization. Emergency measures accepted in the moment have a habit of leaving remnants in place, much as occurred with online learning. Not that this is always a bad thing. Urgent situations can trigger remarkable innovation and creativity, seen in the hundreds of ways that educators kept instruction going. But just as often people get hurt in the rush, as short-term solutions make for long-term problems.

Seen in retrospect, the rapid transition to online learning certainly falls into this latter category, evidenced in the huge numbers of students who failed or dropped out of classes, with those affected overwhelmingly the historically underserved. Learning was disrupted. But the convenience and efficiencies of the virtual classroom were too good to let go. “Online Learning Is Here to Stay,” read a feature in the New York Times, citing a study from the Rand Corporation saying that 20 percent of schools were choosing to continue portions of their online offerings. “Families have come to prefer stand-alone virtual schools and districts are rushing to accommodate, but questions still linger.”[i] Questions indeed. Before the pandemic, less than one percent of K-12 schooling took place online. Educational reasons notwithstanding, this also had to do with the function of school as childcare for working families. The idea of a twenty-fold increase in home learning raises the question of what parent demographics are driving this shift. Or more to the point, who has gained from the online shift and who lost out?

Turn-U-In: Treating Students as Suspects

David Trend

It’s no secret that online learning has its problems, witnessed in the historic failure and drop-out rates resulting from thrown-together course overhauls in the early COVID months. Less widely reported has been another kind of failure owing to a loss of faith in educational institutions and a widening trust gap between teachers and students.

Inherent school power inequities have aggravated antagonisms – now made even worse by a range of surveillance and security technologies. The distance in “distance learning” can create an atmosphere of alienation and distrust. When the in-person classroom is reduced to a screen image, teachers and students can seem more like abstractions than actual people.

This opens the door for all sorts of communication failures and misunderstandings, not to mention stereotyping and harm. The objectifying tendencies of media representations long have been associated with distortions in the way individuals and groups view each other, whether in the marketing of products, sensationalized news items, or ideologies spread on social networks. When “Zoom school” does this, underlying beliefs and assumptions can overtake the reality of encounters, generating attitudes that destabilize the learning environment.

These problems have become especially evident in the panic about student dishonesty in online learning, as the absence of classroom proximity quickly escalated into assumptions of cheating. Early in the 2020s a torrent of news reports warned of an “epidemic” of dishonesty in online learning, with some surveys showing over 90 percent of educators believing cheating occurred more in distance education than in in-person instruction.[i] New technologies often have stoked such fears, in this instance building on the distrust many faculty hold toward students, some of it racially inflected.[ii] Closer examination of the issue has revealed that much of the worry came from faculty with little direct knowledge of the digital classroom, online student behavior, and preventative techniques now commonly used. Indeed, more recent research has shown no significant differences between in-person and online academic integrity.[iii]

When School is a Factory

David Trend

For 20 years, I have been teaching large arts and humanities general education courses at the University of California, Irvine. These 400-student classes are part of the undergraduate “breadth requirements” common in most colleges and universities, and hence draw enrollments from across the academic disciplines. At UC Irvine, this means that most of the class comprises science, technology, engineering, and math (STEM) majors. Aside from an orientation to more practical fields, I’ve noticed a clear shift in student attitudes in recent years –– a heightened preoccupation with grades and rankings, combined with growing anxieties about future earnings. Many of my colleagues see this as well, often disparaging students more concerned with GPA metrics than learning itself, while increasingly behaving more like consumers of educational commodities. I take a more sanguine view.

Bear in mind that many of today’s college students grew up during the Great Recession, when families of all incomes had money worries. With scant knowledge of a world before 9/11, it’s little wonder that polls show millennials expecting lower earnings than their parents, seeing the United States on a downward spiral, and believing the two-party system to be fatally flawed.[i] Rising income inequality doesn’t help matters, especially at UC Irvine, where 6 in 10 students get financial aid and half are the first in their families to earn a college degree.[ii] Because of this, Irvine has been cited by the New York Times as the country’s leading “upward mobility engine” –– making the campus a national model of what public higher education can do.[iii] But it’s still not a cakewalk for degree seekers. As at most public universities in America, the majority of Irvine’s full-time students also work at jobs to make ends meet.[iv]

Teaching Robots to Imagine

David Trend

Can robots be taught to imagine? Google’s DeepMind artificial intelligence group is doing just that –– developing computer versions of what many consider humanity’s quintessential trait. The software world long has pursued sentient consciousness as its holy grail. But until now, it’s only been found in science fiction movies like A.I., Ex Machina, and Transcendence. DeepMind engineers say they have cracked the code by combining two kinds of machine learning. The first is linear, which is nothing new, with the computer applying a predefined algorithm over and over till it finds answers and then remembering them. In the second, more radical approach, the computer tries many algorithms to find which work best, and then changes the very way it approaches problems. Combining the purely linear with a more systemic approach, DeepMind’s “Imagination-Augmented Agent” mimics intuitive learning in a way prior software hasn’t. It’s not exactly the same as human imagination, but it comes closer than ever before to what neuroscientists say the brain does.
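The contrast between the two approaches can be sketched in a few lines of code. This is a toy illustration only, not DeepMind’s actual Imagination-Augmented Agent: the reward function, the fixed rule, and the candidate rules are all invented for the example. It simply contrasts a learner that applies one predefined rule repeatedly (remembering its answers) with a learner that tries several rules and keeps whichever scores best.

```python
# Toy sketch (not DeepMind's system): a "linear" learner that reuses one
# predefined rule versus an adaptive learner that searches over rules.

def reward(x, guess):
    # Invented task: the closer the guess is to 3 * x, the higher the reward.
    return -abs(guess - 3 * x)

FIXED_RULE = lambda x: x + 1        # the single predefined algorithm
CANDIDATE_RULES = [                 # rules the adaptive learner may try
    lambda x: x + 1,
    lambda x: 2 * x,
    lambda x: 3 * x,
]

def linear_learner(inputs):
    """Apply the same rule over and over, memoizing past answers."""
    memo = {}
    total = 0
    for x in inputs:
        if x not in memo:
            memo[x] = FIXED_RULE(x)
        total += reward(x, memo[x])
    return total

def adaptive_learner(inputs):
    """Score every candidate rule, then adopt the best-performing one."""
    def score(rule):
        return sum(reward(x, rule(x)) for x in inputs)
    return max(score(rule) for rule in CANDIDATE_RULES)

data = list(range(10))
print(linear_learner(data))    # the fixed rule accumulates penalties
print(adaptive_learner(data))  # searching over rules finds the right one
```

Run as written, the adaptive learner scores at least as well as the fixed one. The point is only the structural difference the passage describes: one strategy repeated versus a search over strategies.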

While robotic imagination may be improving, human thought isn’t faring as well. Most people feel uncreative and without inspiration, as discussed in earlier chapters. Corporations say innovation is withering as well. Novelist Ursula Le Guin recently observed that, “In America today imagination is generally looked on as something that might be useful when the TV is out of order. Poetry and plays have no relation to practical politics. Novels are for students, housewives, and other people who don’t work.”[i] Beyond the abandonment of a creative genre or two, American society also is undergoing a wholesale commodification of imagination itself. Disney is most famous for this, its “Imagineering” (imagination + engineering) brand one of the most viciously protected anywhere. But hundreds of companies evoke imagination to conjure an aura of specialness –– seen in promotions like Bombay Sapphire’s “Infused with Imagination,” GE’s “Imagination at Work,” Electrolux’s “Power to Capture Imagination,” Lego’s “Imagine,” Microsoft’s “Imagine Academy,” Nestle’s “Feed Your Imagination,” Samsung’s “Imagine,” and Sony’s “Made of Imagination.”

The connection of imagination to commercial products reflects the powerful linkage of purchasing to consumer self-image. Expressing oneself through buying brings a passing feeling of agency, maybe even of accomplishment. Some critics say that shopping is more meaningful than voting for many Americans. Henry A. Giroux speaks of “disimagination” in describing how public consciousness is overwritten in this process, as people lose abilities to imagine on their own. To Giroux “The power to reimagine, doubt, and think critically no longer seems possible in a society in which self-interest has become the ‘only mode of force in human life and competition’ and ‘the most efficient and socially beneficial way for that force to express itself.’” Going even further, Giroux links disimagination to a rising collective amnesia, stating “What I have called the violence of organized forgetting signals how contemporary politics are those in which emotion triumphs over reason, and spectacle over truth, thereby erasing history by producing an endless flow of fragmented and disingenuous knowledge.”

Imagination can be seen positively, of course. With this in mind, much of this chapter explores ways people can envision a better and more just world. Obviously this might take a little encouragement in an age of disimagination. But it’s far from impossible. Most definitions describe imagination as the mental process behind creativity, as seen in the Oxford Dictionary: “Imagination: The faculty or action of forming new ideas, or images or concepts of external objects not present to the senses. The ability of the mind to be creative or resourceful.” Put another way, creativity is imagination actualized for a purpose –– generally assumed a positive one. As stated by a leading expert in the field, “Creativity is putting your imagination to work. It’s applied imagination.” Dig a little deeper into this lexicon, and one finds the very problem that worries Le Guin and Giroux. A quick look at Roget’s Thesaurus turns up such synonyms for “imaginative” as “dreamy,” “fanciful,” “fantastic,” “quixotic,” “romantic,” and “whimsical.” Nice as these sound, such vaporous associations equate imagination with the same romantic idealism and inconsequentiality dogging creativity. This explains why advertisers seem so keen on imagination. As one marketing firm put it, “We don’t see imagining as a real task. It’s an enjoyable game. By asking a prospect to imagine something, you bypass that critical part that throws up objections, and sneak into their mind through the back door of the imagination.”

How about seeing imagination differently? Maybe as a roadmap for one’s life or future?  Or a way to imagine important people in one’s life? Perhaps even a vision for community, country, and the larger world? After all, isn’t society itself an imaginary construct? Doesn’t everyone want to make it better? To Le Guin, “To train the mind to take off from immediate reality and return to it with new understanding and new strength, nothing quite equals poem and story.” She concludes that “Human beings have always joined in groups to imagine how best to live and help one another carry out the plan. The essential function of human community is to arrive at some agreement on what we need, what life ought to be, what we want our children to learn, and then to collaborate in learning and teaching so that we and they can go on the way we think is the right way.”

Big Data vs Artists and Everyone Else

David Trend

Heard about Generation Z? The demographic growing up in the 2000s? It’s a bigger group than Boomers or Millennials –– and it has one further distinction. “Members of Generation Z are ‘digital natives’ who cannot remember what it was like not to have access to the Internet –– no matter when, no matter what, no matter where,” according to Forbes Magazine. This is a group raised on networked “connecting” with others, sharing, and buying things. It’s second nature to Gen-Zers to upload their favorite music to YouTube, post images on Facebook, and sell things on Etsy or eBay. Much is being made in creative economy talk of how networks now blur traditional producer/consumer roles, manifest in the new figure of the “prosumer.” In Wikinomics: How Mass Collaboration Changes Everything, authors Don Tapscott and Anthony D. Williams effused over the democratization inherent in the new “Openness, Peering, Sharing and Acting Globally.” Of course, there is nothing really new about home-made items, crafts, and people’s willingness to share. What’s different today is the ability to copy digitized materials and circulate them via electronic networks. Digitization also has made Generation Z the first demographic to be completely tracked by “big data” analytics.

Some creativity industry experts argue that this is nothing short of a revolution, driven by ongoing change more than any clear future. Evolutionary economist Jason Potts and collaborators have proposed what they term “social network markets,” unlike the top-down models of industrial capitalism. Characterized by fluidity and exchange through complex fields of actors, the new social network markets are less governed by competition and profit than by communication and preference. Participants are “not ‘buying’ the property, but buying into the social space.” Moreover, the dynamics of these new markets are highly interactive. As the Potts group put it, “a social network is defined as a connected group of individual agents who make production and consumptions decisions based on the actions (signals) of other agents on the social network: a definition that gives primacy to communicative actions rather than connectivity alone.” Almost by definition, this process rules out conventional manufacturing or professional services. Instead, the networks “generate value through production and consumption of network-valorized choices.”

The beauty is that much of what is online now is free––seeming to arrive just in time in a tight economy. While a lot of the “free” stuff available online is user-generated (selfies, birthday announcements, anecdotal postings, etc.), a huge volume of material comes from other sources (news outlets, filmmakers, commercial music producers, artists). On the surface it looks like old Marxist doctrines are being reversed as items seem to be “decommodified” in the sharing economy. This idea has become an anthem of resistance in some circles. The Burning Man Festival, to take one example, has stated: “When we commodify we seek to make others, and ourselves, more like things, and less like human beings. ‘Decommodification,’ then, is to reverse this process. To make the world and the people in it more unique, more priceless, more human.” This may be all well and good in the real-life sharing of food and weed at Burning Man. But when things get virtual, it’s usually a large corporation that owns the websites, servers, and networks that make sharing possible.

Belonging Where?

David Trend

Throughout its existence the United States has shown a strange tendency to turn against itself, dividing citizens against each other with a vehemence rivaling the most brutal regimes on earth. Some have rationalized the resulting crisis of “belonging” in America as an understandable consequence of cultural diversity, economic stress, and global threat. After all, haven’t there always been “insiders” and “outsiders” in every culture? Aren’t competition and aggression wired into human nature? Or is there something peculiar about the personality of the U.S.? Could it be that prejudice is the real legacy of “American Exceptionalism,” in traditions dating to the genocide of indigenous populations, the subjugation of women, the rise of slavery, the scapegoating of immigrants, and more recent assaults on the poor or anyone falling outside the realm of normalcy?

I discussed selected aspects of America’s divisive pathology in my book A Culture Divided: America’s Struggle for Unity, which was written in the closing years of the George W. Bush presidency. Like many at the time, I had completely given up on the idea of “common ground” amid the residue of post-9/11 reactionary fervor and emerging economic recession. Media commentators were buzzing constantly about red/blue state polarization. Opinions varied about the cause of the divide, attributing it to factors including regionalism, media sensationalism, partisan antipathy, or all of these combined. Also joining the fray were those asserting the divide was fabricated, with evenly divided elections showing most people in the middle of the curve on most issues. My somewhat contrarian view was that the “problem” shouldn’t be regarded as a problem at all. After all, America always had been divided––through war and peace, boom and bust. Division was the country’s national brand. But as a book about politics, A Culture Divided didn’t get to the roots or the lived experience of America’s compulsive divisiveness.

Speaking at the 50th anniversary of the Selma to Montgomery marches, President Barack Obama described America as an incomplete project––a nation caught between ideals of a perfect union and the lingering realities of their failure. While citing advances in civil liberties since the bloody apex of the Voting Rights Movement, Obama also spoke of a federal report issued just days earlier documenting structural racism and misbehavior toward African Americans by police in Ferguson, MO, where months before law enforcement officers had killed an unarmed black teenager. “We know the march is not yet over. We know the race is not yet won,” the President stated, adding, “We know that reaching that blessed destination requires admitting as much, facing up to the truth.”

Creative Magic

By David Trend

“The central question upon which all creative living hinges is: Do you have the courage to bring forth the treasures hidden within you?” With this entreaty, author Elizabeth Gilbert introduced her recent bestseller Big Magic: Creative Living Beyond Fear, which offered an artistic cure for an anxious American culture.[i] Speaking directly to widespread feelings of disaffection and powerlessness, Big Magic romanticized artistry in Gilbert’s signature blend of sentiment and cliché––packaging familiar views (human creativity, divine creativity, etc.) with a self-help twist about creating one’s “self” in new and better ways.  While one easily can write off Big Magic as yet another feel-good advice book (which it surely is), I think it’s time to take Gilbert’s approach to creativity seriously and ponder why such ideas now get so much traction.

Publicity doesn’t hurt. Reviewers effused over Big Magic as a “book-length meditation on inspiration” (Newsday) to “unlock your inner artist” (Woman’s Day) and “dream a life without limits” (Publishers Weekly).[ii] This message resonated well with the rising chorus promoting creativity as an innovation engine and economic tonic. While no one would dispute the positive benefits of a little artistic dabbling, at what point does such wishful thinking begin to border on delusion? Or put another way, when does fantasy paper over reality? Might it be that America’s fondness for make-believe is partly behind the nation’s political confusion and disaffection? Do fairy-tale versions of life infantilize a citizenry that should know that answers don’t always come easily? Certainly the fantasy-version of reality offered by certain politicians would fail any thoughtful analysis. But instead, many leaders continue treating their constituents like children, with entire governments encouraging populations to set worries aside and simply “Be Creative.”

In Magical Thinking and the Decline of America, historian Richard L. Rapson took a long look at the nation’s romantic idealism. “Probably in no other society of the world can one write the script for one’s life as completely as in the United States. This fact has made the nation the ‘promised land’ for much of the world over the past two centuries,” Rapson wrote. “The flight into endless self-improvement and innocent optimism has a long lineage in our past.”[iii] Perhaps anticipating Donald Trump’s “Make America Great Again” sloganeering, Rapson pointed to the disconnection between America’s self-image as an “exceptional” driver of human history and the growing evidence of the nation’s falling fortunes. This has led to what Rapson described as a growing “flight from knowledge and reality into faith and fantasy,” resulting in large part from “an American public increasingly in thrall to the fairytales told by the mass media.”[iv] It also promotes a “cultural fixation on the individual, the personal, the biographical, the confessional, and, all too often, the narcissistic,” and hence the rise of new “magic words” like “self-awareness” and “personal growth,” and other aphorisms urging everyone to “be all that you can be.”[v]

Individualism lies at the heart of American idealism, dating to the country’s Enlightenment Era origins, when the autonomous subject was invented as a counterpoint to deific and royal authority. Necessary as individualism was (and remains), no one could have predicted how its value could be magnified and distorted in neoliberal times. The initial affirmation of personal identity, which encouraged people to vote and participate in society, soon morphed into “striving to get ahead” and “winning at any cost.” Eventually the “self” would become an American obsession of theological proportions. “The purpose of nearly all the current gospels is to put believers ‘in touch’ with themselves,” Rapson further explained.[vi] This new brand of secular “faith” also comports well with the religiosity many Americans still profess, especially evangelical strains that promise economic gain to dutiful worshippers.