Update Available: The Algorithmic Self

Bing, Bard, and other bots. The world is rushing headlong into a ChatGPT future. Yet amid the giddy optimism over boundless new capabilities lie deeper questions about how artificial intelligence is reshaping human consciousness in unnoticed ways. Update Available: The Algorithmic Self (2023) takes a critical look at this emerging phenomenon.

Update Available can be downloaded free of charge from Amazon, Apple, Barnes & Noble, and other major retailers, published as an Open Access Creative Commons book.

Other books by David Trend include Welcome to Cyberschool: Education at the Crossroads in the Information Age; Worlding: Media, Identity, and Imagination; and The End of Reading: From Gutenberg to Grand Theft Auto.

Trend’s popular “Changing Creativity” course is taken each year by over 1,000 students throughout the University of California system.

Find Your Superpower

“How to Find Your Superpower” is among thousands of recent articles, books, and improvement programs about the age-old dream of an updated self. Like others in its genre, the piece offers guidance for achieving “peak performance” through a blend of passion, mastery, and hard work. “The #1 thing you can do is determine your strengths, determine your superpowers,” the author states in coaching readers to sharpen “a dominant gift, an attribute, skill or ability that makes you stronger than the rest: a difference between you and your coworker.”[i] Find that elusive something, and you are sure to succeed. Pitches like this appear everywhere these days. Witness the massive market for fitness, beauty, self-esteem, and cognitive improvement products. These range from dietary supplements and workout regimes to books, videos, and apps. Amazon is loaded with titles like Your Hidden Superpower, Finding Your Superpowers, and the kids’ book What’s My Superpower? [ii]

Juvenile appeals notwithstanding, a consistent theme runs through all these books – that it is up to you alone to find, develop, or somehow acquire missing capacities. Rarely is there a mention of structural advantages or disadvantages in the superpower quest. The impulse to exceed one’s limits has a long history in Western thought, with roots in religious doctrine and philosophy. Some even link enhancement to hard-wired survival instincts. Simply put, people have been augmenting themselves for thousands of years, first by using tools, then by working in groups, and later with machines and technology. From the Enlightenment Era onward, this was seen as humanity’s “natural” impulse for continual improvement and progress. Ongoing developments in science and medicine have intensified this drive, along with the heightened sense of crisis in the 21st century. The result has been a growing mania to become stronger, smarter, and better looking than anyone else.

Then add popular culture. Everyone knows the basic superhero plot: stories begin with ordinary characters (often underdogs), who transform via accident, discovery, or gift. With new powers, the superhero battles evil and invariably prevails. Such stories now comprise the most cherished works of mainstream media, generating fortunes for movie franchises: Marvel ($18.2 billion), Harry Potter ($9.1 billion), X-Men ($5.8 billion), DC Universe ($4.9 billion), Spider-Man ($4.8 billion).[iii] It’s easy to see the appeal of these films. In an essay titled “Why Everyone Has Seen a Superhero Movie,” critic Gwyneth Torrecampo explains that “The unique challenges we face in our everyday lives can be daunting and stressful, leading us to feel powerless and dejected.”[iv] Viewers thus identify with the hero as a form of wish fulfillment, she argues. “Superheroes often start out like you and me, and then go on to overcome obstacles, protect the vulnerable, or save the world. It’s a potent fantasy that inspires imitation among viewers.”

The superhero syndrome is the fantasy version of “human enhancement,” defined as “the natural, artificial, or technological alteration of the human body to enhance physical or mental abilities.”[v] On one hand, there is nothing terribly new or unusual about this. Running shoes and vitamins are enhancements that people take for granted. And indeed, much of modern medicine devotes itself to such helpful interventions, especially when they address genuine needs or difficulties. An appropriately determined restoration of health and functionality has always defined the practice of the healing professions, as discussed in Chapter 5. But in recent years, the marriage of science and business has gone well beyond “getting back to normal,” offering ever-more-sophisticated forms of enhancement to meet the public’s insatiable appetite for “more.” This has not come without controversy. The troubled histories of cosmetic surgery, fad diets, and steroid abuse are but a few notable examples. Certainly science-fiction superpower stories play a big part in the phenomenon. But on another level, the hunger for such products feeds on the gnawing anxiety now epidemic in the U.S. In addition to economic disparities and their socio-cultural underpinnings, new levels of perfectionism percolate in mainstream culture as well. Advertising only reinforces these impulses by linking them to products.

Human enhancement fascinated German philosopher Friedrich Nietzsche, known for his social criticism and advocacy of science. In 1883, Nietzsche introduced the aristocratic figure of the “Übermensch” (Superman) as an aspirational ideal for the human race. He argued that this “perfect human” could be achieved through secular means on earth (rather than in heaven) by improvements in health, creativity, and willpower. In making this claim, Nietzsche wasn’t simply promoting sci-fi fantasy. Putting the Übermensch in a broader context, Nietzsche explained that every society generates a set of body ideals, and that those ideals inform what societies value and how they behave. The Übermensch complemented then-popular beliefs about human evolution, especially the strain of thinking known as “eugenics.” Also introduced in 1883, eugenics applied Charles Darwin’s theories of natural selection to social policy by advocating the selective reproduction of certain classes of citizens over others (something Darwin himself never advocated). National leaders like Winston Churchill and Theodore Roosevelt supported the concept, along with many others around the world.[vi] Nazi eugenicists would later cite Nietzsche’s Superman concept in their genocidal campaign to perfect the “Aryan race” during World War II.

The excesses of early eugenics movements have tempered contemporary thinking about human enhancement but have done little to dampen yearnings for superhuman updates to the body and mind. Unforeseen consequences often come from what initially seem to be good ideas, and enthusiasm has little patience for downsides. Adding commerce, culture, and health benefits to the mix, it’s no mystery why the update impulse is stronger than ever. This chapter examines the resulting contradictions in today’s improvement culture, as they play out in beauty, fitness, wellness, intelligence, and ability. Key to this discussion is the assertion that enhancements in themselves are neither good nor bad. Like many things, what matters is the degree to which they are pursued, as well as what happens when external values or pressures are placed upon them.

Many use the term “transhumanism” to describe the contemporary update impulse in everything from robotic cyborgs to artificial organs. As the name implies, transhumanism wants humanity to transcend its limitations, with a strong emphasis on subjective autonomy and the specialness of the human species. Philosophically speaking, the movement sees humanity in a contest with nature and the natural world. It embraces the belief that humans should use nature for their own ends and master its processes with science and technology. This takes the form of enhancements to augment the “natural” body or, ultimately, to forestall or eliminate the natural occurrence of death. Because of this, some critics equate transhumanism with anthropocentrism, as well as historic tendencies to denigrate groups seen as uncivilized, savage, or otherwise less-than-human owing to their proximity to nature.

Transhumanism differs from the similar term “posthumanism,” which looks at the way the human self is technologically mediated, and how humans coexist with other organisms. Writing in The Posthuman Glossary, Francesca Ferrando explained the distinction: “Transhumanism and posthumanism both emerged in the 1980s and 1990s, but the drives motivating them are rooted in different traditions of thought. Transhumanism traces its roots within the enlightenment and does not reject the human tradition; on the contrary, transhumanism focuses specifically on human enhancement.” In contrast, posthumanism focuses on “the postmodern deconstruction of the human started in the 1960s and 1970s underlining the fact that, historically, not every human being has been recognized as such.”[vii]

British futurist Max More often gets credit for mapping out the first full-fledged philosophy of contemporary transhumanism in his 1990 “Principles of Extropy.”[viii] More used the term “extropy” (the opposite of entropy) to assert the continual evolution of “intelligent life beyond its current human form and limitations.”[ix] This can include anything from prosthetic limbs, brain implants, and gene splicing to futuristic plans for extending the human life span or uploading consciousness to the cloud. Most of what is seen in contemporary science fiction falls within the transhumanist realm, for better or worse. While transhumanism focuses on the laudable goal of improving people’s health, ability, and wellbeing, it often glamorizes technology as an end in itself. Transhumanists imagine a future in which people become liberated from the constraints of infirmity and death but remain essentially human.

Transhumanism often finds itself tangled in ethical debates about technology’s appropriate role in life. Some critics worry that too many changes might alter what it means to be human in the first place. Others point out that technologies often get misused or run out of control. And still others express concern about the high costs of enhancements. Underscoring this last point, some of transhumanism’s most well-known boosters are tech billionaires like Elon Musk and Peter Thiel. New technologies often spring from genuine needs and good intentions. Yet they inevitably become contingent on cultural attitudes, market forces, and the institutions that enable them.

[i] Gwen Moran, “How to Find Your Superpower,” Fast Company (Jun. 8, 2018) https://www.fastcompany.com/40578240/how-to-find-your-superpower (accessed Apr. 22, 2022).

[ii] Becca North, Your Hidden Superpower (Independently published, 2018); Carter Hughes, Finding Your Superpowers: Keys to Cementing Your Identity and Reaching Your Goals (Independently published, 2020); Aviaq Johnson and Tim Mack, What’s My Superpower? (New York: Inhabit Media, 2017).

[iii] Jennifer M. Wood, “10 Highest Grossing Movie Franchises of All Time,” Mental Floss (Mar. 18, 2019) https://www.mentalfloss.com/article/70920/10-highest-grossing-movie-franchises-all-time (accessed Apr. 19, 2022).

[iv] Gwyneth Torrecampo, “10 Reasons Why Everyone Has Seen a Superhero Movie,” Medium (Aug. 16, 2018) https://medium.com/framerated/10-reasons-why-superhero-films-are-so-popular-2ce69d2d93ea (accessed Apr. 19, 2022).

[v] “Human Enhancement,” Stanford Encyclopedia of Philosophy (Apr. 7, 2015) https://plato.stanford.edu/entries/enhancement/ (accessed May 20, 2022).

[vi] Victoria Brignell, “When America Believed in Eugenics,” New Statesman (Dec. 10, 2010) https://www.newstatesman.com/society/2010/12/disabled-america-immigration (accessed Apr. 24, 2022).

[vii] Francesca Ferrando, in Rosi Braidotti and Maria Hlavajova, eds., The Posthuman Glossary (London: Bloomsbury Academic, 2018) p. 439.

[viii] See Max More, “The Philosophy of Transhumanism,” in Max More and Natasha Vita-More, eds., The Transhumanist Reader: Classical and Contemporary Essays on the Science, Technology, and Philosophy of the Human Future (Malden, MA: Wiley-Blackwell, 2013) p. 10.

[ix] More, p. 3.

Welcome to Cyberschool

David Trend

While technology has always played a big part in education, it went into hyperdrive in the pandemic-driven move to online learning. Even before the pandemic, economic pressures and growing student numbers were causing a panic in education. Schools were struggling to trim budgets as “accountability” regimes scrutinized everyone. These conditions presented an upside to some of the changes that would occur. Most dramatically, the shift to doing schoolwork at home eliminated shortfalls in classroom space and, at least temporarily, student housing as well. As the pandemic continued, the share of higher education offered online jumped from 10 percent in 2019 to 33 percent a few years later.[i] But as everyone now knows, so-called “distance learning” isn’t for everyone and doesn’t work for all kinds of material. Research shows that the one-size-fits-all character of mechanical course delivery disadvantages students of many kinds.

Online schooling isn’t as new as you might think. The idea of distance learning dates to the vocational and self-improvement correspondence courses of the eighteenth century, which arose with improvements in mail delivery systems. Often cited as an early example is a shorthand course offered by Caleb Phillips, advertised in a 1721 edition of the Boston Gazette with claims that “students may by having several lessons sent weekly to them, be as perfectly instructed as those that live in Boston.”[ii] By the 1800s all manner of vocational skills were being taught by mail, as well as hobbies like drawing and painting. The University of London became the first college to offer distance learning degrees in 1858. By the end of the century, learning by mail had become big business for institutions like the Pennsylvania-based International Correspondence Schools (ICS). In the decade between 1895 and 1905, ICS grew from 72,000 to 900,000 students signing up to learn technical and management skills.[iii] Much of this growth was due to the innovation of sending entire textbooks rather than single lessons, along with promotion by a large in-person sales team.

The Learning Society

David Trend

As consumer prices continue to rise, experts now warn of a looming recession brought about by pandemic manufacturing slowdowns and supply-chain shortages. Economists explain it as a classic case of demand outpacing availability –– with scarcity making things more costly. Unfortunately, the painful solution now being launched will raise borrowing costs so that people spend less. While these measures may or may not improve the overall economy, the combined effects of inflation and rising interest rates will deal a double blow to people struggling to make ends meet. In such an atmosphere it becomes critical to help people manage their own finances and to prevent the broader economy from overheating. This is where consumer education and financial literacy can help as part of a larger move toward a “learning society.”

For some time now, economists have been promoting financial education in public schools and urging people to become more resourceful. Time Magazine reported polls showing “99 percent of adults in agreement that personal finance should be taught in high school.”[i] The Federal Reserve argued that “financial literacy and consumer education, coupled with strong consumer protections, make the financial marketplace ‘effective and efficient’ and assists consumers in making better choices.”[ii] Many colleges and universities have started making financial literacy courses graduation requirements. And for some it has worked, as many Americans “put their own budgets under the microscope –– akin to what financial analysts routinely do when they scrutinize companies.”[iii]

Continue reading “The Learning Society”

The Creative Inner Child?

David Trend

Pablo Picasso once quipped that “Every child is an artist; the problem is how to remain an artist once they grow up.”[i]  In this often-quoted slogan, Picasso neatly summarized idealized views of the universally creative child and the uncreative adult. In a similar fashion he would later write that, “It takes a long time to become young.” What is one to make of such laments? Nostalgia over a lost youth? A yearning to escape a pressurized grown-up life?  Regardless of origins, it’s impossible to deny America’s ongoing infatuation with childhood creativity.

This fascination with childhood artistry dates to the 1700s, corresponding to evolving views of children as “blank slates” (tabula rasa) better served by nurturance and education than by discipline alone. At the same time, Enlightenment debates over individualism and personal autonomy were bringing considerable anxiety to the era, evidenced in worries that self-interest would overwhelm moral sentiments. This set the stage for the naturalism espoused by Jean-Jacques Rousseau in his book Emile: Or, On Education, which saw an inherent “goodness” in children that becomes corrupted by adult desire and material want.[ii] With the 1800s, views of “human nature” gave way to theories of evolution and behavioral adaptation –– owing in large part to the influence of Charles Darwin and Herbert Spencer. While the resulting rationalism eventually would make education more formulaic, an artsy transcendentalism would counterbalance American culture with an advocacy for an “educated imagination.”[iii] The Romantic Era writings of Ralph Waldo Emerson, Margaret Fuller, Henry Wadsworth Longfellow, and Walt Whitman advanced themes of emotion over reason and imagination over reality –– setting in place a progressive tradition of push-back against the instrumentalist ethos of science and industry.

In the 1920s, Swiss psychologist Jean Piaget began charting children’s “stages” of maturity, hence launching the modern field of child development.[iv] Piaget saw “realistic” rendering as a learned ability rather than a natural inclination. In one famous study, Piaget asked a group of four-year-olds to draw familiar people or objects. He found that the images invariably had the same characteristics: drawn from memory rather than observation, exaggeration of certain salient features (faces, for example), and a disregard of perspective or scale. In other words, the images derived more from mental symbolism than from conventional schema of visual representation. Piaget would note that at later ages children acquire the ability to “correct” their images to conform to normative depictions of reality. Later observations of so-called “feral” children (raised in the wild without human contact) found that such children often didn’t speak or make pictures of any kind, further reinforcing the premise that language and “artistic” rendering are largely determined by culture.[v]

The Problem with Rigor

David Trend

“It’s Time to Cancel the Word Rigor,” read a recent headline in the respected Chronicle of Higher Education.[1] The article detailed growing concerns about hidden bias within what many see as conventional teaching practices. Here, “rigor” was taken to task for throwing up roadblocks for some students more than others, even as its exact meaning remains vague. Webster’s Dictionary defines rigor as “severity, strictness or austerity,” which educators often translate into difficult courses and large amounts of work, rationalized in the interest of excellence and high standards.[2]

While there is nothing wrong with challenging coursework per se, this interpretation of rigor often becomes a recipe for failure for otherwise intelligent and hardworking students. Such failures can result when rigor is used to incentivize or stratify students, as in gateway or “weed out” courses with prescribed grading targets, or situations where faculty overuse tests as motivation. Rigor discussions I have witnessed rarely consider instructional quality, teaching effectiveness, or principles of learning. Instead, faculty complain about poor student attention, comprehension, or commitment. As the Chronicle explains, “all credit or blame falls on individual students, when often it is the academic system that creates the constructs, and it’s the system we should be questioning when it erects barriers for students to surmount or make them feel that they don’t belong.”[3] Continue reading “The Problem with Rigor”

The Algorithm Rejected Me

David Trend

School is where most kids first become aware of what I call the “update imperative.” After all, education is a process of continual improvement, a step-by-step progression of knowledge acquisition and socialization. In this sense schooling represents much more than the beginning of education. For many kids it’s a time of moving from the familiarity of home into the larger world of other people, comparative judgement, and a system of tasks and rewards. Along the way, a package of attitudes and beliefs is silently conditioned: conformity to norms, obedience to authority, and the cost of failure. All of this is presented with a gradually intensifying pressure to succeed, rationalized as a rehearsal for adult life. Rarely are the ideological parameters of this “hidden curriculum” ever challenged, or even recognized. Much like work, American K-12 schools are driven largely by mandates of individual achievement and material accumulation.

By the time college applications are due, levels of anxiety can run out of control, given the role of degrees in long-term earnings. Many students start the admissions Hunger Games as early as middle school, plotting their chances, polishing their transcripts, and doing anything they can to get good grades. Everyone knows how application numbers have ballooned in an age in which students apply to an average of 10 schools each. Unsurprisingly perhaps, overall applications have increased by 22% in the past year alone.[i] And while the applicant side of this equation has been much publicized, what happens in the admissions office remains shrouded in mystery. Largely unknown are secret criteria driven by algorithms to determine things like likelihood to enroll or willingness to pay. Even less known are the kinds of AI analytics used to monitor and grade students, sometimes making prejudicial judgements along the way. Continue reading “The Algorithm Rejected Me”

Why Professors Ignore the Science of Teaching

David Trend

A recent article in the Chronicle of Higher Education explored the apparent reluctance of college and university professors to embrace the growing body of research about how students learn and what teaching methods work best. While many faculty simply cling to what has worked for them in the past, others feel overworked and unable to consider changing. In the meantime, an increasingly diverse student population experiences growing inequity as a result.

Beth McMurtrie’s “Why the Science of Teaching Is Often Ignored” opens with a discussion of a novel study by five Harvard University researchers. The group was trying to figure out why active learning, a form of teaching that has had measurable success, often dies a slow death in the classroom. They compared the effects of a traditional lecture with those of active learning, where students solve problems in small groups.

The results were not surprising; students taught by the active method performed better on standardized tests. The academic press praised the study for its clever design and its resonance with professors who had struggled with active learning. Yet the praise was not universal, and the study drew criticism in other quarters.

This mixed reaction reveals a central paradox of higher education, according to McMurtrie. Teaching and learning research has grown dramatically over the decades, encompassing thousands of experiments, journals, books, and programs to bring learning science into classrooms. But a lot of faculty members haven’t read it, aren’t sure what to do with it, or are skeptical. Continue reading “Why Professors Ignore the Science of Teaching”

Inclusive Pedagogy

David Trend

The pandemic years have been rough on college students everywhere, with record levels of academic stress and losses in student learning. While occurring throughout higher education, these problems haven’t affected all groups the same way. Students from privileged backgrounds have fared better than the under-resourced, with disparities in network access, income, and external responsibilities exacerbating inequities. As I saw these dynamics play out in the large undergraduate general education courses I teach, I began wondering if instructional methods might be partly to blame and if changes might improve matters going forward. Working with UC Irvine’s Division of Teaching Excellence and Innovation (DTEI) helped me to rethink my own teaching by searching out ways that I had unconsciously been putting up roadblocks.

Usually when educators speak of “inclusion” they are thinking of course content and ways to incorporate diverse perspectives or voices previously excluded. While this approach remains a central tenet of inclusive teaching, a deeper look at the issue can reveal biases or barriers built into the teaching of even the most progressive educators. Practices of exclusion can result from habits or structures that have become so routinized in instruction that they seem natural or neutral approaches. Costly books, rigid deadlines, and high-stakes exams are among practices that privilege students with money, time flexibility, and testing skills, for example.

Faculty attitudes also can get in the way of inclusion. This often is manifest in principles of “rigor” intended to elevate worthy over unworthy students. Such attitudes create a scarcity mentality toward success rather than one that makes high achievement possible for all students. Decades of educational research have shown the deleterious effects of such practices in conflating grades with knowledge acquisition. The grade pressure that frequently drives “rigor” has been shown to affect some students more than others, while creating an atmosphere of anxiety and an emphasis on types of learning that can be easily tested. Not only does this create learning inequities, but it also tends to discourage collaboration, questioning, and diverse opinion. Continue reading “Inclusive Pedagogy”

When School is a Factory

David Trend

For 20 years, I have been teaching large arts and humanities general education courses at the University of California, Irvine. These 400-student classes are part of the undergraduate “breadth requirements” common in most colleges and universities, and hence draw enrollments from across the academic disciplines. At UC Irvine, this means that most of the class comprises science, technology, engineering, and math (STEM) majors. Aside from an orientation to more practical fields, I’ve noticed a clear shift in student attitudes in recent years –– a heightened preoccupation with grades and rankings, combined with growing anxieties about future earnings. Many of my colleagues see this as well, often disparaging students more concerned with GPA metrics than learning itself, while increasingly behaving more like consumers of educational commodities. I take a more sanguine view.

Bear in mind that many of today’s college students grew up during the Great Recession, when families of all incomes had money worries. With scant knowledge of a world before 9/11, it’s little wonder that polls show millennials expecting lower earnings than their parents, seeing the United States on a downward spiral, and viewing the two-party system as fatally flawed.[i] Rising income inequality doesn’t help matters, especially at UC Irvine, where 6 in 10 students get financial aid and half are the first in their families to earn a college degree.[ii] Because of this, Irvine has been cited by the New York Times as the country’s leading “upward mobility engine” –– making the campus a national model of what public higher education can do.[iii] But it’s still not a cakewalk for degree seekers. As at most public universities in America, the majority of Irvine’s full-time students also work at jobs to make ends meet.[iv] Continue reading “When School is a Factory”

Big Data vs Artists and Everyone Else

By David Trend:

Heard about Generation Z? The demographic growing up in the 2000s? It’s a bigger group than Boomers or Millennials –– and it has one further distinction. “Members of Generation Z are ‘digital natives’ who cannot remember what it was like not to have access to the Internet –– no matter when, no matter what, no matter where,” according to Forbes Magazine. This is a group raised on networked “connecting” with others, sharing, and buying things. It’s second nature to Gen-Zers to upload their favorite music to YouTube, post images on Facebook, and sell things on Etsy or eBay. Much is being made in creative economy talk of how networks now blur traditional producer/consumer roles, manifest in the new figure of the “prosumer.” In Wikinomics: How Mass Collaboration Changes Everything, authors Don Tapscott and Anthony D. Williams effused over the democratization inherent in the new “Openness, Peering, Sharing and Acting Globally.” Of course, there is nothing really new about home-made items, crafts, and people’s willingness to share. What’s different today is the ability to copy digitized materials and circulate them via electronic networks. Digitization also has made Generation Z the first demographic to be completely tracked by “big data” analytics.

Some creativity industry experts argue that this is nothing short of a revolution, driven by ongoing change more than any clear future. Evolutionary economist Jason Potts and collaborators have proposed what they term “Social Network Markets,” unlike the top-down models of industrial capitalism. Characterized by fluidity and exchange through complex fields of actors, the new social network markets are less governed by competition and profit than by communication and preference. Participants are “Not ‘buying’ the property, but buying into the social space.” Moreover, the dynamics of these new markets are highly interactive. As the Potts group put it, “a social network is defined as a connected group of individual agents who make production and consumption decisions based on the actions (signals) of other agents on the social network: a definition that gives primacy to communicative actions rather than connectivity alone.” Almost by definition, this process rules out conventional manufacturing or professional services. Instead, the networks generate value through the production and consumption of “network-valorized” choices.

The beauty is that much of what is online now is free –– seeming to arrive just in time in a tight economy. While a lot of the “free” stuff available online is user-generated (selfies, birthday announcements, anecdotal postings, etc.), a huge volume of material comes from other sources (news outlets, filmmakers, commercial music producers, artists). On the surface it looks like old Marxist doctrines are being reversed as items seem to be “decommodified” in the sharing economy. This idea has become an anthem of resistance in some circles. The Burning Man Festival, to take one example, has stated: “When we commodify we seek to make others, and ourselves, more like things, and less like human beings. ‘Decommodification,’ then, is to reverse this process. To make the world and the people in it more unique, more priceless, more human.” This may be all well and good in the real-life sharing of food and weed at Burning Man. But when things get virtual, it’s usually a large corporation that owns the websites, servers, and networks that make sharing possible. Continue reading “Big Data vs Artists and Everyone Else”

Belonging Where?

By David Trend:

Throughout its existence the United States has shown a strange tendency to turn against itself, dividing citizens against each other with a vehemence rivaling the most brutal regimes on earth. Some have rationalized the resulting crisis of “belonging” in America as an understandable consequence of cultural diversity, economic stress, and global threat. After all, haven’t there always been “insiders” and “outsiders” in every culture? Aren’t competition and aggression wired into human nature? Or is there something peculiar about the personality of the U.S.? Could it be that prejudice is the real legacy of “American Exceptionalism,” in traditions dating to the genocide of indigenous populations, the subjugation of women, the rise of slavery, the scapegoating of immigrants, and more recent assaults on the poor or anyone falling outside the realm of normalcy?

I discussed selected aspects of America’s divisive pathology in my book A Culture Divided: America’s Struggle for Unity, which was written in the closing years of the George W. Bush presidency. Like many at the time, I had completely given up on the idea of “common ground” amid the residue of post-9/11 reactionary fervor and emerging economic recession. Media commentators were buzzing constantly about red/blue state polarization. Opinions varied about the cause of the divide, attributing it to factors including regionalism, media sensationalism, partisan antipathy, or all of these combined. Also joining the fray were those asserting the divide was fabricated, with evenly divided elections showing most people in the middle of the curve on most issues. My somewhat contrarian view was that the “problem” shouldn’t be regarded as a problem at all. After all, America always had been divided –– through war and peace, boom and bust. Division was the country’s national brand. But as a book about politics, A Culture Divided didn’t get to the roots or the lived experience of America’s compulsive divisiveness.

Speaking at the 50th anniversary of the Selma to Montgomery marches, President Barack Obama described America as an incomplete project––a nation caught between ideals of a perfect union and the lingering realities of their failure. While citing advances in civil liberties since the bloody apex of the Voting Rights Movement, Obama also spoke of a federal report issued just days earlier documenting structural racism and misbehavior toward African Americans by police in Ferguson, MO, where months before law enforcement officers had killed an unarmed black teenager. “We know the march is not yet over.  We know the race is not yet won,” the President stated, adding, “We know that reaching that blessed destination requires admitting as much, facing up to the truth.” Continue reading “Belonging Where?”

Creative Magic

By David Trend

“The central question upon which all creative living hinges is: Do you have the courage to bring forth the treasures hidden within you?” With this entreaty, author Elizabeth Gilbert introduced her recent bestseller Big Magic: Creative Living Beyond Fear, which offered an artistic cure for an anxious American culture.[i] Speaking directly to widespread feelings of disaffection and powerlessness, Big Magic romanticized artistry in Gilbert’s signature blend of sentiment and cliché––packaging familiar views (human creativity, divine creativity, etc.) with a self-help twist about creating one’s “self” in new and better ways.  While one easily can write off Big Magic as yet another feel-good advice book (which it surely is), I think it’s time to take Gilbert’s approach to creativity seriously and ponder why such ideas now get so much traction.

Publicity doesn’t hurt. Reviewers effused over Big Magic as a “book-length meditation on inspiration” (Newsday) to “unlock your inner artist” (Woman’s Day) and “dream a life without limits” (Publishers’ Weekly).[ii] This message resonated well with the rising chorus promoting creativity as an innovation engine and economic tonic. While no one would dispute the positive benefits of a little artistic dabbling, at what point does such wishful thinking begin to border on delusion? Or put another way, when does fantasy paper over reality? Might it be that America’s fondness for make-believe is partly behind the nation’s political confusion and disaffection? Do fairy-tale versions of life infantilize a citizenry that should know that answers don’t always come easily? Certainly the fantasy-version of reality offered by certain politicians would fail any thoughtful analysis. But instead, many leaders continue treating their constituents like children, with entire governments encouraging populations to set worries aside and simply “Be Creative.”

In Magical Thinking and the Decline of America, historian Richard L. Rapson took a long look at the nation’s romantic idealism. “Probably in no other society of the world can one write the script for one’s life as completely as in the United States. This fact has made the nation the ‘promised land’ for much of the world over the past two centuries,” Rapson wrote. “The flight into endless self-improvement and innocent optimism has a long lineage in our past.”[iii] Perhaps anticipating Donald Trump’s “Make America Great Again” sloganeering, Rapson pointed to the disconnection between America’s self-image as an “exceptional” driver of human history and the growing evidence of the nation’s falling fortunes. This has led to what Rapson described as a growing “flight from knowledge and reality into faith and fantasy,” resulting in large part from “an American public increasingly in thrall to the fairytales told by the mass media.”[iv] It also promotes a “cultural fixation on the individual, the personal, the biographical, the confessional, and, all too often, the narcissistic,” and hence the rise of new “magic words” like “self-awareness” and “personal growth,” along with other aphorisms urging everyone to “be all that you can be.”[v]

Individualism lies at the heart of American idealism, dating to the country’s Enlightenment Era origins, when the autonomous subject was invented as a counterpoint to deific and royal authority. Necessary as individualism was (and remains), no one could have predicted how its value could be magnified and distorted in neoliberal times.  The initial affirmation of personal identity, which encouraged people to vote and participate in society, soon morphed into “striving to get ahead” and “winning at any cost.” Eventually the “self” would become an American obsession of theological proportions. “The purpose of nearly all the current gospels is to put believers ‘in touch’ with themselves,” Rapson further explained.[vi] This new brand of secular “faith” also comports well with the religiosity many Americans still profess, especially evangelical strains that promise economic gain to dutiful worshippers. Continue reading “Creative Magic”

The Performance Art of the Deal

By David Trend:

As I write these words, many Americans remain up in arms about President Donald Trump’s peculiar relationship with the truth. On a seemingly daily basis, the nation is greeted with a new round of accusations or indignant retorts from the President –– most of which bear little resemblance to objective reality. Let’s just say the Commander-in-Chief has a very “creative” approach to factuality –– about everything from crime and immigration to science and the judiciary. Perhaps he’s joking or trying to shock people. Or maybe he’s a pathological liar. Time Magazine devoted a cover to the President’s “Truth and Falsehoods”; the Los Angeles Times ran multiple “Why Trump Lies” editorials; and The New Yorker is now 14 installments into its ongoing “Trump and the Truth” series. Unsurprisingly, the President has doubled down on his claims, and –– in keeping with his fondness for conspiracy theories –– has labeled the entire field of journalism “the enemy of the American people.” Endless pundits and commentators have tried to discern a logic in the President’s bizarre behavior, in which mischief and chaos seem the only constants.

Say what you will about Trump, his ability to get public attention is astonishing. And while some critics question the President’s grasp of “reality,” others see a calculated shrewdness in his behavior––an underlying strategy not unlike what Naomi Klein discussed in The Shock Doctrine.  “We already know the Trump administration plans to deregulate markets, wage all-out war on ‘radical Islamic terrorism,’ trash climate science and unleash a fossil-fuel frenzy,” Klein recently stated, adding, “It’s a vision that can be counted on to generate a tsunami of crises and shocks.” She predicted economic shocks (as market bubbles burst), security shocks (as blowback from foreign belligerence comes home), weather shocks (as the climate is further destabilized), and industrial shocks (as oil pipelines spill and rigs collapse, especially when enjoying light-touch regulation).

“All this is dangerous enough,” Klein added. “What’s even worse is the way the Trump administration can be counted on to exploit these shocks politically and economically.” Trump himself often forecast as much in promising a “radical break” from the past –– described by Fox News as a “shock and awe campaign against the Washington establishment.” This new agenda bears little resemblance to earlier “culture wars” between conventional liberal and conservative camps. Moral idealism has no place in Trump’s program of disruption and dishonesty. But his ability to confuse and deceive is not to be taken lightly. The Trump phenomenon raises important concerns about the role of knowledge in contemporary society –– and the ways different worldviews are conceived, put into circulation, and frequently politicized. Continue reading “The Performance Art of the Deal”

On teaching style

Professors who want to establish classroom connections with their students receive lots of advice. Over the years, some experts have advised the use of “self-disclosure”: telling students stories about themselves and using self-deprecating humor as a way to make students feel comfortable and to view the instructor as an ally. Inside Higher Ed discusses the finer points of this:

“Ignore that advice. That’s the recommendation of a study being published today in Communication Education, a journal of the National Communication Association. The study was based on surveys of 438 undergraduates at a Southeastern university. The students — from across disciplines — were asked about the class they had attended just before taking the survey. And for that class, they were asked both about their instructors and about whether they engaged in certain “uncivil” behaviors, such as packing up books before class was over or texting during lectures. The researchers then compared the attitudes the students had about professors and the students’ behaviors.

“The study notes that professors’ styles only go so far in predicting whether students will be posting status updates on Facebook or actually paying attention, but they do matter.

“Although it is clear that a range of factors outside of instructors’ control contribute to uncivil behavior in the classroom — such as societal shifts toward student entitlement and students’ being raised in homes where manners are not adequately taught — results of this study indicate that there are at least some things instructors can do to minimize uncivil behavior,” the study says. “This model, taking into account only instructor-related factors, explained 20 percent of the variance in self-reported uncivil behaviors among our participants — not a huge proportion, but enough to make a noticeable difference to a frustrated teacher.”

“Based on the surveys, the paper argues that students are least likely to engage in uncivil behavior when they view the instructor as having high levels of “credibility,” meaning that through actions and nonverbal cues, the instructor conveys command of the material and the class, a sense of knowing what should be going on in class, and so forth. When students have that confidence level, they are more likely to pay attention. Continue reading “On teaching style”

On suspending professors

It would be hard to find a faculty advocate opposed to the suspension last week of a University of Florida professor of veterinary science who was secretly taking videos of students’ body parts with a device hidden in his pen. Administrative — and police — action came swiftly, without any public objection from fellow instructors.

But as Inside Higher Ed reports today: “Beyond such a clear violation of professional conduct, and, in this case, the law, faculty advocates often are quick to criticize institutions for jumping the gun with punishments. A spate of forced leaves for professors in recent memory raises the question of what exactly constitutes suspension-worthy speech and action — particularly a suspension made unilaterally by administrators.

“In other words, does a line exist and, if so, where?

“The answer, some experts said, is another question: Does the faculty member’s exercise of his or her rights violate anyone else’s? And some fear that institutions may be becoming too quick to suspend in cases in which faculty conduct may have resulted in hurt feelings but not actual harm.

“The proper line to draw is where a professor’s actions interfere with the legitimate rights of others,” said John K. Wilson, co-editor of the American Association of University Professors’ “Academe” blog, editor of AAUP’s Illinois Conference Academe journal, and author of the book Patriotic Correctness: Academic Freedom and Its Enemies.

“If, for example, a professor commits a crime against students (such as video voyeurism), it’s punishable, Wilson said in an e-mail. So, too, is unfairly grading or meeting the “high bar” of discriminating against some group of students; making verbal threats in violation of the law; or engaging in academic fraud. And professors can be suspended for failing to do their jobs, such as refusing to teach.

“Still, that’s all with due process – and the professor should keep teaching as his or her case is being adjudicated, outside of being an immediate threat to students or others on campus. (Even the Florida professor deserves the right to defend himself before fellow professors at some point, faculty advocates said.)

“But a professor can’t be punished if he or she “merely says something that offends someone,” Wilson said. “When a professor is suspended for expressing controversial ideas, it violates the rights of students to hear those ideas, and the rights of everyone on campus who must live in a climate of fear about freedom of expression.”

Read more: http://www.insidehighered.com/news/2013/09/25/are-colleges-being-too-quick-suspend-professors#ixzz2fxWZpAgb
Inside Higher Ed

Selling out the university

This essay starts with utopia—the utopia known as the American university, writes Thomas Frank in The Baffler:

“It is the finest educational institution in the world, everyone tells us. Indeed, to judge by the praise that is heaped upon it, the American university may be our best institution, period. With its peaceful quadrangles and prosperity-bringing innovation, the university is more spiritually satisfying than the church, more nurturing than the family, more productive than any industry.


“The university deals in dreams. Like other utopias—like Walt Disney World, like the ambrosial lands shown in perfume advertisements, like the competitive Valhalla of the Olympics—the university is a place of wish fulfillment and infinite possibility. It is the four-year luxury cruise that will transport us gently across the gulf of class. It is the wrought-iron gateway to the land of lifelong affluence.

“It is not the university itself that tells us these things; everyone does. It is the president of the United States. It is our most respected political commentators and economists. It is our business heroes and our sports heroes. It is our favorite teacher and our guidance counselor and maybe even our own Tiger Mom. They’ve been to the university, after all. They know.

“When we reach the end of high school, we approach the next life, the university life, in the manner of children writing letters to Santa. Oh, we promise to be so very good. We open our hearts to the beloved institution. We get good grades. We do our best on standardized tests. We earnestly list our first, second, third choices. We tell them what we want to be when we grow up. We confide our wishes. We stare at the stock photos of smiling students, we visit the campus, and we find, always, that it is so very beautiful.

“And when that fat acceptance letter comes—oh, it is the greatest moment of personal vindication most of us have experienced. Our hard work has paid off. We have been chosen.

“Then several years pass, and one day we wake up to discover there is no Santa Claus. Somehow, we have been had. We are a hundred thousand dollars in debt, and there is no clear way to escape it. We have no prospects to speak of. And if those damned dreams of ours happened to have taken a particularly fantastic turn and urged us to get a PhD, then the learning really begins.”


More at: http://thebaffler.com/past/academy_fight_song

Those boring lenses and frames

Certainly, the lens and the frame are useful as metaphors, but as used, they are also quite limited. As an experiment, the next time you see one used, replace “frame” or “lens” with “context,” adjust the necessary conjunctions, and see if any meaning is lost. If in a given piece of writing, “seen through a queer lens” could just as easily be “seen in a queer context,” then the optical device isn’t living up to its potential as metaphor.

The chief ways in which optical metaphors can be improved in our writing are through diversity and specificity. These go hand-in-hand: the more diverse our optical metaphors become, the more specific they are able to be. Lenses, for example, can be convex-convex (the usual “lenticular” shape, which incidentally I suspect of being where lentils got their name, though I’ve done no research on this), but they can also be flat or concave on one or both sides. So, some lenses are plano-convex, others are convex-concave. These lenses behave differently and have different applications, and so could be employed in a diverse range of metaphorical applications.

“Lens” and “frame” get used a lot in theory writing. A recent post on Bad at Sports is getting cranky about this:

“The difference between a lens of any type and a frame is that we are directly aware of the ways in which lenses alter the image we are seeing. A biconvex lens held at the right distance from the eye will magnify the image. (At this distance, the image is not inverted; held out further, the image inverts, but the reason why is beyond my ability to explain from memory, so go Google a diagram.) This is the classic magnifying glass. Other types of lenses, such as eyeglasses, subtly alter the focal distance of our eyes (or rather, adjust the image to account for a flawed focal distance). Multiple-lens apparatuses like binoculars and microscopes magnify and can be focused. The point is that we are immediately aware of this alteration of the image we are seeing, because it is inherent to the function of the lens-based device. Continue reading “Those boring lenses and frames”
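For readers curious about the optics behind the quoted examples, the standard thin-lens relation from introductory physics accounts for both the magnifying-glass behavior and the inversion the writer mentions. This worked formula is an addition, not part of the original post, and is offered only as a minimal sketch:

$$\frac{1}{d_o} + \frac{1}{d_i} = \frac{1}{f}, \qquad m = -\frac{d_i}{d_o}$$

Here $f$ is the focal length, $d_o$ the object distance, $d_i$ the image distance, and $m$ the magnification. With the object inside the focal length ($d_o < f$), $d_i$ comes out negative: the image is virtual, upright, and enlarged, which is the classic magnifying glass. With $d_o > f$, $d_i$ turns positive and $m$ negative: the image is real and inverted, which is why the view flips when the lens is held farther out.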

Rediscovering the library

When I started teaching, books were easier to find than articles, whose references were buried deep in voluminous, thin-paged indexes. Students took different paths in their research and came up with wildly different sets of texts, states a piece in this week’s Chronicle of Higher Education:

“Some checked out the better books early, leaving the others to scrounge for what was left. Sure, there was overlap, but students often ended up with individualized research materials, exercising their critical abilities to integrate what they found into a coherent, cohesive discussion. As periodical-search engines blossomed, students, ever adaptable, started using more articles. While the electronic card catalog remained more or less static, the search engines became increasingly user-friendly. It became so difficult to get students to use books in their research that I started stipulating that they use a minimum number in my assignments.

“Then the development of Google and of electronic journals essentially converged. Why bother with books and the stacks when you can search full-text articles online? The process has become even more alluring with database products like Discover (which our libraries enthusiastically characterize as “the scholarly version of Google!”). It searches millions of entries, including all of the library catalog, the most-used journal databases, and local historical collections. Like Google, Discover ranks findings according to relevance. With the aid of our reference librarians, students easily set up their searches to obtain exactly what they think they’ll need, usually in the form of full-text articles.

“Consequently, my students hardly ever consult books. Circulation statistics support this impression. In 2005 our libraries checked out or renewed 86,807 books or other media. That number has been steadily declining. By 2012, the number had dropped to 45,394, down 48 percent in seven years.

“Why am I bothered by these developments? Well, partly because modern library design mirrors student preferences. Increasingly, libraries are social spaces—with Wi-Fi, study nooks, coffee shops, chat areas, and movable furniture—and not homes for books, which are relegated to off-site repositories, save for a few recent acquisitions. If a student wants a book, she can requisition it. I cannot imagine students already deterred by the stacks having much patience for the repository.”


Full article at: http://chronicle.com/article/Unintentional-Knowledge/139891/

On asexuality

In 10 years, activist David Jay hopes your kids will be learning about asexuality when they’re getting “the talk.” This week an essay on Huffington Post explores this topic:

“What is ace culture going to look like in a decade? I don’t know,” he said. “Will it look like gay culture? That might happen, but I’m not invested in that. What I am invested in is that as more aces come out, a much larger percentage of the population will have access to the term ‘asexual’ than there is right now. I hope asexuality will be far more visible, with more out aces and asexual characters on TV shows and movies. I hope it becomes a part of the bigger world of sexuality.”

“Mark Carrigan, 27, a PhD student at the University of Warwick who has been studying asexuality for half a decade, concurred. He’s eager to see an increase in asexuality awareness as he believes it will not just benefit the ace community but the world at large.

“More visibility for the asexual community will be very important,” he said. “And that’s not just because it’ll make their lives easier as a stigmatized group, but because there are cultural implications beyond those who are asexual themselves.” Carrigan, who is not himself ace, says he sees many similarities between the lesbian, gay, bisexual and transgender rights movement and the asexual struggle for broader acceptance, writes today’s Huffington Post.

“I’d argue that gay pride and the LGBT rights movement was a very civilizing movement,” he said. “It had broader ramifications for the culture we live in, inculcating a greater degree of tolerance and more awareness of sexual difference. Similarly, more awareness for asexuality will likely lead to awareness of a different sort of sexual difference.” Continue reading “On asexuality”