You 2.0 – The Will to Improve

David Trend

You’ve probably never heard of TestingMom.com. It’s part of a new generation of test-prep companies like Kaplan and Princeton Review –– except this one is for toddlers. Competition for slots in kindergarten has gotten so intense that some parents are shelling out thousands to get their four-year-olds ready for entrance tests or interviews. It’s just one more example of the pressure that got celebrity parents arrested for falsifying college applications a few years ago. In this case the battle is over getting into elite elementary schools or gifted programs. While such admissions pressure is widely known, what’s new is how early it’s occurring. Equity issues aside, the demand to improve performance is being drilled into youngsters before they can spell their names. All of this bespeaks the competition for grades, school placement, and eventual careers that has transformed the normal impulse to do better into an obsession for students and their families. Much like the drive for perfection, an insatiable hunger to be quicker, smarter, and more acceptable to admissions officers is taking its toll in many ways.

What explains this obsessive behavior? Brain science has been proving what advertising long has known –– that wanting something is far more powerful than getting it. School admissions and other markers of success are part of an overarching mental wanting mechanism. That new iPhone might bring a thrill. But soon comes the yearning for an update, a newer model, another purchase. Neuroimaging shows that processes of “wanting” and “liking” occur in different parts of the brain, with the former operating more broadly and powerfully than the latter. This reverses the common wisdom that primal hungers and “drives” underlie human motivation. Unlike animals, human beings are driven by imagination –– with the anticipation of something often more important than the experience itself. This partly explains why merchandising deals more in feeling than facts. Slogans like “Just Do It” and “Think Different” bear no direct relationship to shoes or computers, but instead tingle feelings of desire. In the fuzzy realm of emotion, pleasure is a fungible currency.


College Art in Crisis

David Trend

It might surprise many to know that no systematic studies exist of college and university-level arts programs. This is partly due to the way art in higher education fragments into academic disciplines and professional training programs, as well as the complex array of public and private schools, community colleges and research universities, and the ever-expanding variety of for-profit entities and online learn-at-home opportunities. The National Center for Education Statistics (NCES) does, however, provide rough disciplinary percentages of bachelor’s degrees earned by America’s estimated 18.7 million college students. Of these, 5.1 percent graduated in the “Visual and Performing Arts” category, and another 4.6 percent in “Communications and Journalism.” Larger breakdowns included “Business” at 19.4 percent, “Health Sciences” at 10.7 percent, and “Social Science” at 9.2 percent.[i] Beyond this, anecdotal evidence abounds of a decade-long decline in arts and humanities programs, described by many as a continuing crisis. The recession is partly to blame, with many students and their families simply opting for more surefire career paths, especially as college tuitions have risen.

On the other hand, college art has found new friends among creative economy advocates, with educators jumping on claims from people like Richard Florida that 30 percent of today’s jobs require creative skills.[ii] Making the most of this, the National Endowment for the Arts (NEA) recently released a report entitled “The Arts and Economic Growth,” compiled in partnership with the U.S. Bureau of Economic Analysis.[iii] The document claimed that “arts and culture” contributed $704 billion to the U.S. economy (4.2 percent of GDP) and a whopping 32.5 percent of GDP growth in the past 15 years. This is more than sectors like construction ($619 billion) and utilities ($270 billion), perhaps because the study defined art so broadly –– encompassing advertising, broadcasting, motion pictures, publishing, and arts-related merchandising, as well as the performing and visual arts themselves. This prompted a piece entitled, “Who Knew? Arts Education Fuels the Economy” in the respected Chronicle of Higher Education, which noted similar findings from business groups. Among these was the Partnership for 21st-Century Learning, a coalition of corporate and educational leaders and policy makers, which said that, “Education in dance, theater, music, and the visual arts helps instill the curiosity, creativity, imagination, and capacity for evaluation that are perceived as vital to a productive U.S. work force.”[iv] The Conference Board, an international business-research organization, polled employers and school superintendents, finding “that creative problem-solving and communications are deemed important by both groups for an innovative work force.”[v] And IBM, in a report based on face-to-face interviews with more than 1,500 CEOs worldwide, concluded that “creativity trumps other leadership characteristics” in an era of rising complexity and continual change.[vi]

Welcome to Cyberschool

David Trend

While technology always has played a big part in education, it went into hyperdrive with the pandemic-driven move to online learning. Up to this point, economic pressures and growing student numbers already were causing a panic in education. Schools were struggling to trim budgets as “accountability” scrutinized everyone. These existing conditions lent an upside to some of the changes that would occur. Most dramatically, the shift to doing schoolwork at home eliminated shortfalls in classroom space and, at least temporarily, student housing as well. As the pandemic continued, the share of higher education offered online jumped from 10 percent in 2019 to 33 percent a few years later.[i] But as everyone now knows, so-called “distance learning” isn’t for everyone and doesn’t work for all kinds of material. Research shows that the one-size-fits-all character of mechanical course delivery disadvantages students of many kinds.

Online schooling isn’t as new as you might think. The idea of distance learning dates to vocational and self-improvement correspondence courses of the eighteenth century, which arose with improvements in mail delivery systems. Often cited as an early example is a shorthand course offered by Caleb Phillips, advertised in a 1721 edition of the Boston Gazette with claims that “students may by having several lessons sent weekly to them, be as perfectly instructed as those that live in Boston.”[ii] By the 1800s all manner of vocational skills were being taught by mail, as well as hobbies like drawing and painting. The University of London became the first college to offer distance learning degrees in 1858. By the end of the century, learning by mail had become big business for institutions like the Pennsylvania-based International Correspondence Schools (ICS). In the decade between 1895 and 1905, ICS grew from 72,000 to 900,000 students signing up to learn technical and management skills.[iii] Much of this growth was due to the innovation of sending entire textbooks rather than single lessons, along with promotion by a large in-person sales team.

The Learning Society

David Trend

As consumer prices continue to rise, experts now warn of a looming recession brought about by pandemic manufacturing slowdowns and supply-chain shortages. Economists explain it as a classic case of demand outpacing availability –– with scarcity making things more costly. Unfortunately, the painful solution now being launched will raise borrowing costs so that people spend less. While these measures may or may not improve the overall economy, the combined effects of inflation and rising interest rates will exact a double blow on people struggling to make ends meet. In such an atmosphere it becomes critical to help people manage their own finances and to prevent the broader economy from overheating. This is where consumer education and financial literacy can help as part of a larger move toward a “learning society.”

For some time now, economists have been promoting financial education in public schools and urging people to become more resourceful. Time Magazine reported polls showing “99 percent of adults in agreement that personal finance should be taught in high school.”[i] The Federal Reserve argued that “financial literacy and consumer education, coupled with strong consumer protections, make the financial marketplace ‘effective and efficient’ and assists consumers in making better choices.”[ii] Many colleges and universities have started making financial literacy courses graduation requirements. And for some it has worked, as many Americans “put their own budgets under the microscope –– akin to what financial analysts routinely do when they scrutinize companies.”[iii]


The Creative Inner Child?

David Trend

Pablo Picasso once quipped that “Every child is an artist; the problem is how to remain an artist once they grow up.”[i]  In this often-quoted slogan, Picasso neatly summarized idealized views of the universally creative child and the uncreative adult. In a similar fashion he would later write that, “It takes a long time to become young.” What is one to make of such laments? Nostalgia over a lost youth? A yearning to escape a pressurized grown-up life?  Regardless of origins, it’s impossible to deny America’s ongoing infatuation with childhood creativity.

This fascination with childhood artistry dates to the 1700s, corresponding to evolving views of children as “blank slates” (tabula rasa) better served by nurturance and education than by discipline alone. At the same time, Enlightenment debates over individualism and personal autonomy were bringing considerable anxiety to the era, evidenced in worries that self-interest would overwhelm moral sentiments. This set the stage for the naturalism espoused by Jean-Jacques Rousseau in his book Emile: Or, On Education, which saw an inherent “goodness” in children that becomes corrupted by adult desire and material want.[ii] With the 1800s, views of “human nature” gave way to theories of evolution and behavioral adaptation –– owing in large part to the influence of Charles Darwin and Herbert Spencer. While the resulting rationalism eventually would make education more formulaic, an artsy transcendentalism would counterbalance American culture with an advocacy for an “educated imagination.”[iii] The Romantic Era writings of Ralph Waldo Emerson, Margaret Fuller, Henry Wadsworth Longfellow, and Walt Whitman advanced themes of emotion over reason and imagination over reality –– setting in place a progressive tradition of push-back against the instrumentalist ethos of science and industry.

In the 1920s, Swiss psychologist Jean Piaget began charting children’s “stages” of maturity, hence launching the modern field of child development.[iv] Piaget saw “realistic” rendering as a learned ability rather than a natural inclination. In one famous study, Piaget asked a group of four-year-olds to draw familiar people or objects. He found that the images invariably had the same characteristics: drawn from memory rather than observation, exaggeration of certain salient features (faces, for example), and a disregard of perspective or scale. In other words, the images derived more from mental symbolism than from conventional schemata of visual representation. Piaget would note that at later ages children acquire the ability to “correct” their images to conform to normative depictions of reality. Later observations of so-called “feral” children (raised in the wild without human contact) found that such children often didn’t speak or make pictures of any kind, further reinforcing the premise that language and “artistic” rendering are largely determined by culture.[v]

Stop Blaming Students: Toward a Post-Pandemic Pedagogy

David Trend

There’s trouble in the college classroom these days. But you can’t blame students. The pandemic and other disruptions of the past two years have shaken higher education to the core, casting doubt on how universities deliver instruction, pay their bills, and justify their existence. Enrollments are dropping across the nation, as students and their families increasingly see college as overpriced, inequitable, and non-essential. More disturbing still are shifts taking place within institutions themselves, as dispirited students are losing motivation and enthusiasm for learning. Clearly something has to change, with many pointing to the classroom as a key place to start. But will it be enough?

“A Stunning Level of Disconnection” is the way one recent article described the situation. “Fewer students show up to class. Those who do avoid speaking when possible. Many skip the readings or the homework. They have trouble remembering what they learned and struggle on tests,” one professor reported.[1] Instructors are trying to reach and teach students, to figure out the problem, and to do anything they can to fix things, with many now concluding in frustration that “It may be necessary to change the structure of college itself.” Call it a stress test for higher education – the seismic disruption of the college classroom during the COVID-19 years, and its ongoing aftershocks. At all levels of instruction, educators continue to voice alarm over the persistent malaise and underperformance of college students.

The Problem with Rigor

David Trend

“It’s Time to Cancel the Word Rigor,” read a recent headline in the respected Chronicle of Higher Education.[1]  The article detailed growing concerns about hidden bias within what many see as conventional teaching practices. Here, “rigor” was taken to task for throwing roadblocks up for some students more than others, even as its exact meaning remains vague. Webster’s Dictionary defines rigor as “severity, strictness or austerity,” which educators often translate into difficult courses and large amounts of work, rationalized in the interest of excellence and high standards.[2]

While there is nothing wrong with challenging coursework, per se, this interpretation of rigor often becomes a recipe for failure for otherwise intelligent and hardworking students. Such failures can result when rigor is used to incentivize or stratify students, as in gateway or “weed out” courses with prescribed grading targets, or situations where faculty overuse tests as motivation. Rigor discussions I have witnessed rarely consider instructional quality, teaching effectiveness, or principles of learning. Instead, faculty complain about poor student attention, comprehension, or commitment. As the Chronicle explains, “all credit or blame falls on individual students, when often it is the academic system that creates the constructs, and it’s the system we should be questioning when it erects barriers for students to surmount or make them feel that they don’t belong.”[3]

The Algorithm Rejected Me

David Trend

School is where most kids first become aware of what I call the “update imperative.” After all, education is a process of continual improvement, a step-by-step acquisition of knowledge and socialization. In this sense schooling represents much more than the beginning of education. For many kids it’s a time of moving from the familiarity of home into the larger world of other people, comparative judgement, and a system of tasks and rewards. Along the way, a package of attitudes and beliefs is silently conditioned: conformity to norms, obedience to authority, and the cost of failure. All of this is presented with a gradually intensifying pressure to succeed, rationalized as a rehearsal for adult life. Rarely are the ideological parameters of this “hidden curriculum” ever challenged, or even recognized. Much like work, American K-12 schools are driven largely by mandates of individual achievement and material accumulation.

By the time college applications are due, levels of anxiety can run out of control, given the role of degrees in long-term earnings. Many students start the admissions Hunger Games as early as middle school, plotting their chances, polishing their transcripts, and doing anything they can to get good grades. Admissions data now flows in an age in which students apply to an average of 10 schools each. Unsurprisingly perhaps, overall applications have increased by 22 percent in the past year alone.[i] And while the applicant side of this equation has been much publicized, what happens in the admissions office remains shrouded in mystery. Largely unknown are secret criteria driven by algorithms to determine things like likelihood to enroll or willingness to pay. Even less known are the kinds of AI analytics used to monitor and grade students, sometimes making prejudicial judgements along the way.

Why Professors Ignore the Science of Teaching

David Trend

A recent article appearing in the Chronicle of Higher Education explored the apparent reluctance of college and university professors to embrace the growing body of research about how students learn and what teaching methods work best. While many faculty simply cling to what has worked for them in the past, others feel overworked and unable to consider changing. In the meantime, an increasingly diverse student population experiences growing inequity as a result.

Beth McMurtrie’s “Why the Science of Teaching is Often Ignored” opens with a discussion of a novel study published by five Harvard University researchers. The group was trying to figure out why active learning, a form of teaching that has had measurable success, often dies a slow death in the classroom. They compared the effects of a traditional lecture with active learning, where students solve problems in small groups.

The results were not surprising: students taught with active methods performed better on standardized tests. The academic press praised the study for its clever design and its resonance with professors who had trouble with active learning. Yet despite being praised in some quarters, the study was criticized in others.

This mixed reaction reveals a central paradox of higher education, according to McMurtrie. Teaching and learning research has grown dramatically over the decades, encompassing thousands of experiments, journals, books, and programs to bring learning science into classrooms. But a lot of faculty members haven’t read it, aren’t sure what to do with it, or are skeptical.

Inclusive Pedagogy

David Trend

The pandemic years have been rough on college students everywhere, with record levels of academic stress and losses in student learning. While occurring throughout higher education, these problems haven’t affected all groups the same way. Students from privileged backgrounds have fared better than the under-resourced, with disparities in network access, income, and external responsibilities exacerbating inequities. As I saw these dynamics play out in the large undergraduate general education courses I teach, I began wondering if instructional methods might be partly to blame and if changes might improve matters going forward. Working with UC Irvine’s Division of Teaching Excellence and Innovation (DTEI) helped me to rethink my own teaching by searching out ways that I unconsciously had been putting up roadblocks.

Usually when educators speak of “inclusion” they are thinking of course content and ways to incorporate diverse perspectives or voices previously excluded. While this approach remains a central tenet of inclusive teaching, a deeper look at the issue can reveal biases or barriers built into the teaching of even the most progressive educators. Practices of exclusion can result from habits or structures that have become so routinized in instruction that they seem natural or neutral approaches. Costly books, rigid deadlines, and high-stakes exams are among practices that privilege students with money, time flexibility, and testing skills, for example.

Faculty attitudes also can get in the way of inclusion. This often is manifest in principles of “rigor” intended to elevate worthy over unworthy students. Such attitudes create a scarcity mentality toward success rather than one that makes high achievement possible for all students. Decades of educational research have shown the deleterious effects of such practices in conflating grades with knowledge acquisition. The grade pressure that frequently drives “rigor” has been shown to affect some students more than others, while creating an atmosphere of anxiety and an emphasis on types of learning that can be easily tested. Not only does this create learning inequities, but it also tends to discourage collaboration, questioning, and diverse opinion.

Loneliness of the Long Distance Learner

David Trend

No one could have predicted the radical changes in education of the early 2020s. Besides making the once-obscure Zoom into a household name, the pandemic accelerated an already fast-moving takeover of everyday life by the internet. The economic consequences were profound, with revenues exploding for companies like Netflix and Amazon while brick-and-mortar retail outlets and restaurants disappeared by the thousands. Of course nothing about the upheaval was especially surprising in historical terms. Cataclysmic events like disasters and wars often leave places quite different than they were before, as systemic restraints give way to radical reorganization. Emergency measures accepted in the moment have a habit of leaving remnants in place, much as occurred with online learning. Not that this is always a bad thing. Urgent situations can trigger remarkable innovation and creativity, seen in the hundreds of ways that educators kept instruction going. But just as often people get hurt in the rush, as short-term solutions make for long-term problems.

Seen in retrospect, the rapid transition to online learning certainly falls into this latter category, evidenced in the huge numbers of students who failed or dropped out of classes, with those affected overwhelmingly the historically underserved. Changes occurred and learning was disrupted. But the convenience and efficiencies of the virtual classroom were too good to let go. “Online Learning is Here to Stay” read a feature in the New York Times, citing a study from the Rand Corporation saying that 20 percent of schools were choosing to continue portions of their online offerings. “Families have come to prefer stand-alone virtual schools and districts are rushing to accommodate, but questions still linger.”[i] Questions indeed. Before the pandemic less than one percent of K-12 schooling took place online. Educational reasons notwithstanding, this also had to do with the function of school as childcare for working families. The idea of a twenty-fold increase in home learning raises the question of what parent demographics are driving this shift. Or more to the point, who has gained from the online shift and who lost out?

Treating Students as Suspects

David Trend

It’s no secret that online learning has its problems, witnessed in the historic failure and drop-out rates resulting from thrown-together course overhauls in the early COVID months. Less widely reported has been another kind of failure owing to a loss of faith in educational institutions and a widening trust gap between teachers and students.

Inherent school power inequities have aggravated antagonisms – now made even worse by a range of surveillance and security technologies. The distance in “distance learning” can create an atmosphere of alienation and distrust. When the in-person classroom is reduced to a screen image, teachers and students can seem more like abstractions than actual people.

This opens the door for all sorts of communication failures and misunderstandings, not to mention stereotyping and harm. The objectifying tendencies of media representations long have been associated with distortions in the way individuals and groups view each other, whether in marketing products, sensationalizing news items, or spreading ideologies on social networks. When “Zoom school” does this, underlying beliefs and assumptions can overtake the reality of encounters, generating attitudes that destabilize the learning environment.

These problems have become especially evident in the panic about student dishonesty in online learning, as the absence of classroom proximity quickly escalated into assumptions of cheating. Early in the 2020s a torrent of news reports warned of an “epidemic” of dishonesty in online learning, with some surveys showing over 90 percent of educators believing cheating occurred more in distance education than in-person instruction.[i] New technologies often have stoked such fears, in this instance building on the distrust many faculty hold toward students, some of it racially inflected.[ii] Closer examination of the issue has revealed that much of the worry came from faculty with little direct knowledge of the digital classroom, online student behavior, and preventative techniques now commonly used. Indeed, more recent research has shown no significant differences between in-person and online academic integrity.[iii]

When School is a Factory

David Trend

For 20 years, I have been teaching large arts and humanities general education courses at the University of California, Irvine. These 400-student classes are part of the undergraduate “breadth requirements” common in most colleges and universities, and hence draw enrollments from across the academic disciplines. At UC Irvine, this means that most of the class comprises science, technology, engineering, and math (STEM) majors. Aside from an orientation to more practical fields, I’ve noticed a clear shift in student attitudes in recent years –– a heightened preoccupation with grades and rankings, combined with growing anxieties about future earnings. Many of my colleagues see this as well, often disparaging students more concerned with GPA metrics than learning itself, while increasingly behaving more like consumers of educational commodities. I take a more sanguine view.

Bear in mind that many of today’s college students grew up during the Great Recession, when families of all incomes had money worries. With scant knowledge of a world before 9/11, it’s little wonder that polls show millennials expecting lower earnings than their parents, seeing the United States on a downward spiral, and regarding the two-party system as fatally flawed.[i] Rising income inequality doesn’t help matters, especially at UC Irvine where 6 in 10 students get financial aid and half are the first in their families to earn a college degree.[ii] Because of this, Irvine has been cited by the New York Times as the country’s leading “upward mobility engine” –– making the campus a national model of what public higher education can do.[iii] But it’s still not a cakewalk for degree seekers. As at most public universities in America, the majority of Irvine’s full-time students also work at jobs to make ends meet.[iv]

Anxious Creativity: When Imagination Fails

Just released: Creativity is getting new attention in today’s America –– along the way revealing fault lines in U.S. culture. Surveys show people overwhelmingly seeing creativity as both a desirable trait and a work enhancement, yet most say they just aren’t creative.

Like beauty and wealth, creativity seems universally desired but insufficiently possessed. Businesses likewise see innovation as essential to productivity and growth, but can’t bring themselves to risk new ideas. Even as one’s “inner artist” is hyped by a booming self-help industry, creative education dwindles in U.S. schools.

Anxious Creativity: When Imagination Fails examines this conceptual mess, while focusing on how America’s current edginess dampens creativity in everyone. Written in an engaging and accessible style, Anxious Creativity draws on current ideas in the social sciences, economics, and the arts. Discussion centers on the knotty problem of reconciling the expressive potential in all people with the nation’s tendency to reward only a few. Fortunately, there is some good news, as scientists, economists, and creative professionals have begun advocating new ways of sharing and collaboration. Building on these prospects, the book argues that America’s innovation crisis demands a rethinking of individualism, competition, and the ways creativity is rewarded.

Available from all major booksellers. More info at: https://www.routledge.com/Anxious-Creativity-When-Imagination-Fails-1st-Edition/Trend/p/book/9780367275068

Natural Born Killers?

David Trend

“Confessions of a Drone Warrior” is one of hundreds of articles on the military’s use of Unmanned Aerial Vehicles (UAVs), which began in the early 2000s. In many ways this new form of combat embodies the psychological distancing that typifies killing in the twenty-first century. The story about Airman First Class Brandon Bryant recounts his first day in a Nevada bunker, when the 22-year-old fired on two presumed Afghan insurgents on the other side of the world. An early recruit in this new kind of warfare, Bryant “hunted top terrorists, but always from afar” –– killing enemies in countless numbers, but not always sure what he was hitting. “Meet the 21st-century American killing machine,” the story concluded.[i]

Of course, notions of aversion to fighting don’t sit well with either military doctrine or public belief. Behind America’s infatuation with high-tech weapons lie long-cultivated attitudes toward violence itself. In a class I teach on this subject, students often will express common-sense views that fighting is “natural,” deriving from humanity’s animalistic origins, and often the only way of resolving conflicts. One sees this kind of thinking in permissive attitudes toward everything from boyish rough-housing to violent sports. The gendered aspects of violence receive less attention than they should, and will be addressed at length in Chapter 9. Suffice it to say that aggression often is expected of men and boys, while also reflected in popular culture. Along with political partisanship, these attitudes help explain the deep divisions within the U.S. electorate over gun control and so-called “stand your ground” laws. Since even scholars often disagree over the issue of human violence, it helps to break the question into subcategories –– and to also point out how knowledge has changed over time in the fields of biology, psychology, and cultural analyses of violent behavior.


Teaching Robots to Imagine

David Trend

Can robots be taught to imagine? Google’s DeepMind artificial intelligence group is doing just that –– developing computer versions of what many consider humanity’s quintessential trait. The software world long has pursued sentient consciousness as its holy grail. But until now, it’s only been found in science fiction movies like A.I., Ex Machina, and Transcendence. DeepMind engineers say they have cracked the code by combining two kinds of machine learning. The first is linear, which is nothing new, with the computer applying a predefined algorithm over and over until it finds answers, then remembering them. In the second, more radical approach, the computer tries many algorithms to find which work best, and then changes the very way it approaches problems. Combining the purely linear with a more systemic approach, DeepMind’s “Imagination-Augmented Agent” mimics intuitive learning in a way prior software hasn’t. It’s not exactly the same as human imagination, but it comes closer than ever before to what neuroscientists say the brain does.

While robotic imagination may be improving, human thought isn’t faring as well. Most people feel uncreative and without inspiration, as discussed in earlier chapters. Corporations say innovation is withering as well. Novelist Ursula Le Guin recently observed that, “In America today imagination is generally looked on as something that might be useful when the TV is out of order. Poetry and plays have no relation to practical politics. Novels are for students, housewives, and other people who don’t work.”[i] Beyond the abandonment of a creative genre or two, American society also is undergoing a wholesale commodification of imagination itself. Disney is most famous for this, its “Imagineering” (imagination + engineering) brand one of the most viciously protected anywhere. But hundreds of companies evoke imagination to conjure an aura of specialness –– seen in promotions like Bombay Sapphire’s “Infused with Imagination,” GE’s “Imagination at Work,” Electrolux’s “Power to Capture Imagination,” Lego’s “Imagine,” Microsoft’s “Imagine Academy,” Nestle’s “Feed your Imagination,” Samsung’s “Imagine,” and Sony’s “Made of Imagination.”

The connection of imagination to commercial products reflects the powerful linkage of purchasing to consumer self-image. Expressing oneself through buying brings a passing feeling of agency, maybe even of accomplishment. Some critics say that shopping is more meaningful than voting for many Americans. Henry A. Giroux speaks of “disimagination” in describing how public consciousness is overwritten in this process, as people lose the ability to imagine on their own. To Giroux, “The power to reimagine, doubt, and think critically no longer seems possible in a society in which self-interest has become the ‘only mode of force in human life and competition’ and ‘the most efficient and socially beneficial way for that force to express itself.’” Going even further, Giroux links disimagination to a rising collective amnesia, stating “What I have called the violence of organized forgetting signals how contemporary politics are those in which emotion triumphs over reason, and spectacle over truth, thereby erasing history by producing an endless flow of fragmented and disingenuous knowledge.”

Imagination can be seen positively, of course. With this in mind, much of this chapter explores ways people can envision a better and more just world. Obviously this might take a little encouragement in an age of disimagination. But it’s far from impossible. Most definitions describe imagination as the mental process behind creativity, as seen in the Oxford Dictionary: “Imagination: The faculty or action of forming new ideas, or images or concepts of external objects not present to the senses. The ability of the mind to be creative or resourceful.” Put another way, creativity is imagination actualized for a purpose –– generally assumed to be a positive one. As stated by a leading expert in the field, “Creativity is putting your imagination to work. It’s applied imagination.” Dig a little deeper into this lexicon, and one finds the very problem that worries Le Guin and Giroux. A quick look at Roget’s Thesaurus lists such synonyms for “imaginative” as “dreamy,” “fanciful,” “fantastic,” “quixotic,” “romantic,” and “whimsical.” Nice as these sound, such vaporous associations equate imagination with the same romantic idealism and inconsequentiality dogging creativity. This explains why advertisers seem so keen on imagination. As one marketing firm put it, “We don’t see imagining as a real task. It’s an enjoyable game. By asking a prospect to imagine something, you bypass that critical part that throws up objections, and sneak into their mind through the back door of the imagination.”

How about seeing imagination differently? Maybe as a roadmap for one’s life or future?  Or a way to imagine important people in one’s life? Perhaps even a vision for community, country, and the larger world? After all, isn’t society itself an imaginary construct? Doesn’t everyone want to make it better? To Le Guin, “To train the mind to take off from immediate reality and return to it with new understanding and new strength, nothing quite equals poem and story.” She concludes that “Human beings have always joined in groups to imagine how best to live and help one another carry out the plan. The essential function of human community is to arrive at some agreement on what we need, what life ought to be, what we want our children to learn, and then to collaborate in learning and teaching so that we and they can go on the way we think is the right way.”

Big Data vs Artists and Everyone Else

By David Trend

Heard about Generation Z? The demographic growing up in the 2000s? It’s a bigger group than Boomers or Millennials –– and it has one further distinction. “Members of Generation Z are ‘digital natives’ who cannot remember what it was like not to have access to the Internet –– no matter when, no matter what, no matter where,” according to Forbes Magazine. This is a group raised on networked “connecting” with others, sharing, and buying things. It’s second nature to Gen-Zers to upload their favorite music on YouTube, post images on Facebook, and sell things on Etsy or eBay. Much is being made in creative economy talk of how networks now blur traditional producer/consumer roles, manifest in the new figure of the “prosumer.” In Wikinomics: How Mass Collaboration Changes Everything authors Don Tapscott and Anthony D. Williams effused over the democratization inherent in the new “Openness, Peering, Sharing and Acting Globally.” Of course, there is nothing really new about home-made items, crafts, and people’s willingness to share. What’s different today is the ability to copy digitized materials and circulate them via electronic networks. Digitization also has made Generation Z the first demographic to be completely tracked by “big data” analytics.

Some creativity industry experts argue that this is nothing short of a revolution, driven by ongoing change more than any clear future. Evolutionary economist Jason Potts and collaborators have proposed what they term “Social Network Markets,” unlike the top-down models of industrial capitalism. Characterized by fluidity and exchange through complex fields of actors, the new social network markets are less governed by competition and profit than by communication and preference. Participants are “Not ‘buying’ the property, but buying into the social space.” Moreover, the dynamics of these new markets are highly interactive. As the Potts group put it, “a social network is defined as a connected group of individual agents who make production and consumption decisions based on the actions (signals) of other agents on the social network: a definition that gives primacy to communicative actions rather than connectivity alone.” Almost by definition, this process rules out conventional manufacturing or professional services. Instead, the networks generate value through the production and consumption of network-valorized choices.

The beauty is that much of what is online now is free––seeming to arrive just in time in a tight economy. While a lot of the “free” stuff available online is user-generated (selfies, birthday announcements, anecdotal postings, etc.), a huge volume of material comes from other sources (news outlets, filmmakers, commercial music producers, artists). On the surface it looks like old Marxist doctrines are being reversed as items seem to be “decommodified” in the sharing economy. This idea has become an anthem of resistance in some circles. The Burning Man Festival, to take one example, has stated: “When we commodify we seek to make others, and ourselves, more like things, and less like human beings. ‘Decommodification,’ then, is to reverse this process. To make the world and the people in it more unique, more priceless, more human.” This may be all well and good in the real-life sharing of food and weed at Burning Man. But when things get virtual, it’s usually a large corporation that owns the websites, servers, and networks that make sharing possible.

Belonging Where?

By David Trend

Throughout its existence the United States has shown a strange tendency to turn against itself, dividing citizens against each other with a vehemence rivaling the most brutal regimes on earth. Some have rationalized the resulting crisis of “belonging” in America as an understandable consequence of cultural diversity, economic stress, and global threat. After all, haven’t there always been “insiders” and “outsiders” in every culture? Aren’t competition and aggression wired into human nature? Or is there something peculiar about the personality of the U.S.? Could it be that prejudice is the real legacy of “American Exceptionalism,” with traditions dating to the genocide of indigenous populations, the subjugation of women, the rise of slavery, the scapegoating of immigrants, and more recent assaults on the poor or anyone falling outside the realm of normalcy?

I discussed selected aspects of America’s divisive pathology in my book A Culture Divided: America’s Struggle for Unity, which was written in the closing years of the George W. Bush presidency. Like many at the time, I had completely given up on the idea of “common ground” amid the residue of post-9/11 reactionary fervor and emerging economic recession. Media commentators were buzzing constantly about red/blue state polarization. Opinions varied about the cause of the divide, attributing it to factors including regionalism, media sensationalism, partisan antipathy, or all of these combined. Also joining the fray were those asserting the divide was fabricated, with evenly divided elections showing most people in the middle of the curve on most issues. My somewhat contrarian view was that the “problem” shouldn’t be regarded as a problem at all. After all, America always had been divided––through war and peace, boom and bust. Division was the country’s national brand. But as a book about politics, A Culture Divided didn’t get to the roots or the lived experience of America’s compulsive divisiveness.

Speaking at the 50th anniversary of the Selma to Montgomery marches, President Barack Obama described America as an incomplete project––a nation caught between ideals of a perfect union and the lingering realities of their failure. While citing advances in civil liberties since the bloody apex of the Voting Rights Movement, Obama also spoke of a federal report issued just days earlier documenting structural racism and misbehavior toward African Americans by police in Ferguson, MO, where months before law enforcement officers had killed an unarmed black teenager. “We know the march is not yet over. We know the race is not yet won,” the President stated, adding, “We know that reaching that blessed destination requires admitting as much, facing up to the truth.”

Creative Magic

By David Trend

“The central question upon which all creative living hinges is: Do you have the courage to bring forth the treasures hidden within you?” With this entreaty, author Elizabeth Gilbert introduced her recent bestseller Big Magic: Creative Living Beyond Fear, which offered an artistic cure for an anxious American culture.[i] Speaking directly to widespread feelings of disaffection and powerlessness, Big Magic romanticized artistry in Gilbert’s signature blend of sentiment and cliché––packaging familiar views (human creativity, divine creativity, etc.) with a self-help twist about creating one’s “self” in new and better ways.  While one easily can write off Big Magic as yet another feel-good advice book (which it surely is), I think it’s time to take Gilbert’s approach to creativity seriously and ponder why such ideas now get so much traction.

Publicity doesn’t hurt. Reviewers effused over Big Magic as a “book-length meditation on inspiration” (Newsday) to “unlock your inner artist” (Woman’s Day) and “dream a life without limits” (Publishers Weekly).[ii] This message resonated well with the rising chorus promoting creativity as an innovation engine and economic tonic. While no one would dispute the positive benefits of a little artistic dabbling, at what point does such wishful thinking begin to border on delusion? Or put another way, when does fantasy paper over reality? Might it be that America’s fondness for make-believe is partly behind the nation’s political confusion and disaffection? Do fairy-tale versions of life infantilize a citizenry that should know that answers don’t always come easily? Certainly the fantasy-version of reality offered by certain politicians would fail any thoughtful analysis. But instead, many leaders continue treating their constituents like children, with entire governments encouraging populations to set worries aside and simply “Be Creative.”

In Magical Thinking and the Decline of America, historian Richard L. Rapson took a long look at the nation’s romantic idealism. “Probably in no other society of the world can one write the script for one’s life as completely as in the United States. This fact has made the nation the ‘promised land’ for much of the world over the past two centuries,” Rapson wrote. “The flight into endless self-improvement and innocent optimism has a long lineage in our past.”[iii] Perhaps anticipating Donald Trump’s “Make America Great Again” sloganeering, Rapson pointed to the disconnection between America’s self-image as an “exceptional” driver of human history, and the growing evidence of the nation’s falling fortunes. This has led to what Rapson described as a growing “flight from knowledge and reality into faith and fantasy,” resulting in large part from “an American public increasingly in thrall to the fairytales told by the mass media.”[iv] It also promotes a “cultural fixation on the individual, the personal, the biographical, the confessional, and, all too often, the narcissistic,” and hence the rise of new “magic words” like “self-awareness,” “personal growth,” and other aphorisms prompting everyone to “be all that you can be.”[v]

Individualism lies at the heart of American idealism, dating to the country’s Enlightenment Era origins, when the autonomous subject was invented as a counterpoint to deific and royal authority. Necessary as individualism was (and remains), no one could have predicted how its value could be magnified and distorted in neoliberal times. The initial affirmation of personal identity, which encouraged people to vote and participate in society, soon morphed into “striving to get ahead” and “winning at any cost.” Eventually the “self” would become an American obsession of theological proportions. “The purpose of nearly all the current gospels is to put believers ‘in touch’ with themselves,” Rapson further explained.[vi] This new brand of secular “faith” also comports well with the religiosity many Americans still profess, especially evangelical strains that promise economic gain to dutiful worshippers.