From Degrees of Difficulty: The Challenge of Equity in College Teaching by David Trend, now free of charge from Worlding Books
Beneath current opposition to diversity programs lies the pervasive belief that inequity and bias barely exist in a “post-civil rights, post-feminist” era, and that efforts to redress them have gone too far. This mindset helps explain why, as American universities now face a federal ban on diversity, equity, and inclusion (DEI) initiatives, a plurality of the general public supports eliminating the programs –– a recent Economist/YouGov poll found 45% in favor of ending DEI in education versus 40% opposed.[1] Already intense in state legislatures and conservative media, this resistance reflects deeply rooted American ideologies about meritocracy and individualism that clash with efforts to address systemic inequalities in higher education. The resulting political struggle has transformed campus diversity initiatives from administrative policies into flashpoints in America’s culture wars.
The controversies over this are no secret. Recent measures to ban or restrict DEI and the teaching of critical race theory (CRT) in educational institutions reflect a longstanding political backlash. Leading up to the November election, 85 anti-DEI bills had been introduced in 28 state legislatures, according to the Chronicle of Higher Education’s “DEI Legislation Tracker.”[2] These often broadly worded laws created confusion and fear among educators, while chilling discussions of race, gender, sexual orientation, and disability on campuses.
From Degrees of Difficulty: The Challenge of Equity in College Teaching by David Trend, forthcoming from Worlding Books
In recent years, the practice of “evidence-based teaching” has emerged as a key strategy for addressing outcome disparities in higher education. Much like evidence-based practices in medicine and social science, this approach draws on empirical research to inform teaching methods, moving beyond practices based on personal experience or intuition. This shift represents a major change in how educators view the art of teaching itself, acknowledging that while intuition has value, it must be balanced with systematic investigation of what actually works in the classroom. The development of evidence-based teaching can be traced to the late 20th century and to parallel advances in cognitive science and educational psychology. As researchers gained new insights into adult learning and intellectual development, their findings found their way into the university classroom.
The earliest educational research relied on simple comparative methods. Researchers would typically divide classes into control and experimental groups, with one cohort receiving standard instruction and the other a modified version. These “split-class” experiments provided the first rigorous evidence that teaching methods could significantly affect learning outcomes. While rudimentary, these early studies established the crucial principle that teaching effectiveness could be measured and improved through systematic study rather than attributed to innate talent alone. Educators also relied heavily on pre- and post-testing, administering assessments before and after an intervention to measure knowledge gain. Though simple, this approach proved particularly useful for distinguishing teaching strategies that produced lasting comprehension from those yielding short-term memorization. Some faculty also kept teaching journals documenting their methods and student responses, which they later shared with colleagues. While lacking the sophistication of contemporary educational studies, these varied methods laid the groundwork for an evidence-based teaching movement asserting that teaching effectiveness could be studied and improved.
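To make the pre-/post-testing idea concrete, here is a minimal illustrative sketch, not drawn from any study described above. It assumes percentage scores and uses the commonly reported “normalized gain,” the share of available improvement a student actually realizes between the two assessments; the cohort data are hypothetical.

```python
# Illustrative sketch only: hypothetical scores on a 0-100 scale.
# The "normalized gain" expresses improvement as a fraction of the room
# a student had left to improve on the pre-test.

def normalized_gain(pre: float, post: float, max_score: float = 100.0) -> float:
    """Return (post - pre) / (max_score - pre); 1.0 means all possible gain."""
    if max_score == pre:                 # already at ceiling, no room to gain
        return 0.0
    return (post - pre) / (max_score - pre)

def average_gain(pairs):
    """Average normalized gain over (pre, post) score pairs for one cohort."""
    return sum(normalized_gain(pre, post) for pre, post in pairs) / len(pairs)

# Hypothetical split-class cohorts: standard vs. modified instruction
control      = [(55, 65), (60, 72), (48, 58)]
experimental = [(54, 78), (61, 85), (50, 74)]

print(f"control gain:      {average_gain(control):.2f}")       # ~0.24
print(f"experimental gain: {average_gain(experimental):.2f}")  # ~0.54
```

A comparison of the two averages is the kind of evidence the early “split-class” studies sought: a larger gain in the experimental cohort suggests the modified instruction produced more durable learning than the standard version.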
From Degrees of Difficulty: The Challenge of Equity in College Teaching by David Trend, forthcoming from Worlding Books
As awareness grows about the role of structural inequities and systemic biases in student success or failure, many schools are exploring how instructional methods and course design can bring equity to the educational environment. In doing so, institutions are finding that emerging teaching practices guided by evidence-based research can broaden learner success. Key to this movement is the practice of inclusive teaching, a pedagogical approach that recognizes the inherent diversity of learners and seeks to accommodate their varying needs. This philosophy is predicated on the understanding that students come from various backgrounds, possess different learning styles, and often face individual challenges in their educational pursuits. By recognizing these forms of diversity, educators can develop strategies that reach the greatest number of learners, ensuring no one is left behind. This approach also treats classroom diversity as an asset, enriching the learning experience for all students by introducing multiple perspectives and fostering cross-cultural understanding.
For many faculty like me, the COVID-19 pandemic catalyzed heightened attention to inclusive principles. The sudden transition to remote learning destabilized my ongoing practices in two significant ways: first, by forcing the adoption of new instructional methods, and second, by making visible latent inequities I hadn’t previously recognized. As mentioned above, this situation led many colleges and universities to scrutinize their teaching approaches and adopt new tools and strategies to enhance fairness, flexibility, and accessibility. The pandemic also highlighted the importance of social-emotional learning and mental health support in education, prompting institutions to integrate these elements more fully into their teaching strategies.
College students are a lot more worried about grades these days. This is something I myself have witnessed in the large general education courses I teach at UCI. My offerings are part of the breadth requirements common at most universities. These attract learners from a wide array of academic disciplines –– which at UCI translates into large numbers of science, technology, engineering, and math (STEM) majors. The changes I’m seeing manifest in a growing preoccupation with grades and rankings, as well as increasing concerns about future earnings potential. This shift has not gone unnoticed by my colleagues, many of whom express disdain for students more invested in grade point averages than the intrinsic value of learning. Some view this as a troubling trend towards a consumer mentality in education. But I take a more sanguine view.
While grade pressure has always been present to some extent, its recent intensification goes beyond individual classrooms. Almost every university uses grades as the primary measure of learning. This makes assessments and scores central to most university teaching for a variety of reasons: measuring comprehension, motivating student effort, providing feedback, generating student rankings, and so on. But grade-centric approaches can also fail to account for learners’ diverse challenges, and may undermine equity as a result. Moreover, too much attention to grades can compromise the critical thinking and intellectual curiosity crucial not only for academic success but also for life after college.
From Degrees of Difficulty: The Challenge of Equity in College Teaching by David Trend, forthcoming from Worlding Books
The university classroom has long been dominated by teacher-centered instruction, which has shown some adaptability while retaining its fundamental characteristics. It wasn’t until the late 20th century that this approach faced significant challenges, as evidence-based practices and the learning sciences began to inform educational methods. Understanding this transition requires examining the extensive history of teacher-centered education, including the influence of global pedagogical traditions and the effects of industrialization and technological advances.
Throughout educational history, our understanding of how children and young adults learn has continuously evolved. For centuries, this understanding remained notably one-dimensional, failing to account for the complexity of human learning. Prior to the 20th century, in most parts of the world, children were seen as either blank slates or miniature adults, requiring little more than information and discipline as they matured. Philosophers in the 1700s described children as possessing a natural goodness or as needing stern training. But it wasn’t until the early 1900s that Swiss psychologist Jean Piaget began charting children’s “stages” of maturity.[i] From this work would emerge understandings of how youngsters transition from self-centeredness into social beings, eventually acquiring capacities to actively “construct” knowledge rather than passively take it in. These insights about cognition and learning would eventually underlie the fields of child development and “child-centered” education.
Yet even as these progressive educational theories were evolving, families and educators continued to prefer traditional instruction, owing to “common sense” resemblances of teaching to parenting in activities like establishing routines, setting expectations, imparting knowledge, and providing feedback. These resemblances make teacher-centered instruction feel familiar and natural, especially in K-12 years. This broad-based appeal speaks to a deep-seated human desire for guidance and a reverence for wisdom embodied in the teacher figure. But as any teenager will tell you, such methods can become counterproductive over time as learners develop levels of independence and autonomy, requiring less and less supervision, direction, and monitoring. Failure to recalibrate instruction to the maturing learner can lead to stress, resentment, and underperformance in what educators term “resistance.”[ii]
Education in the U.S. has a complex history, marked by intellectual progress and systematic exclusion. For over three centuries, its institutions have often prioritized certain forms of knowledge and ways of thinking, inadvertently or intentionally reinforcing intellectual hierarchies. Western philosophical traditions played a significant role in this by emphasizing reason and science while promoting a Eurocentric worldview. The influence of colonialism further complicated matters, as it led to the suppression and erasure of Indigenous knowledge systems around the world and in the U.S. This historical context left a lasting impact on the structure and focus of American higher education, influencing who has access and what is considered valuable knowledge.
Much of this can be traced to the Age of Reason of the 17th and 18th centuries, which profoundly shaped philosophical and educational frameworks in Europe and internationally. Prominent thinkers like John Locke and Immanuel Kant advanced the authority of rationalism and empiricism, influencing philosophical discourse and promoting certain disciplines over others.[i] This focus led to the development of university curricula that, while comprehensive, often functioned selectively.
The Age of Reason reinforced a Eurocentric perspective that marginalized non-Western forms of knowledge and understanding. Visions of world history that placed Europe at the pinnacle of civilization, as set forth by Georg Wilhelm Friedrich Hegel, rendered other cultures as less developed or less worthy.[ii] This prejudice led academic institutions to criticize, misrepresent, or entirely disregard non-Western philosophies, sciences, and cultural practices. Edward Said’s concept of “Orientalism” explained how Western academia constructed and perpetuated distorted views of non-Western societies, often rendering them as exotic, backward, or irrational in contrast to the supposedly rational and progressive West.[iii] This intellectual bias not only shaped academic disciplines like anthropology and geography but also influenced broader educational curricula, diplomatic relations, and colonial policies. Consequently, the university emerging from this intellectual milieu often failed to recognize or value Indigenous knowledge systems, oral traditions, and alternative epistemologies, further entrenching the dominance of Western thought in global academic discourse.
The “college premium” is the shorthand term for the income differential accruing to those who complete four-year degrees. Often attributed to research begun in 2011 by Georgetown University’s Center on Education and the Workforce (CEW), the college premium concept came from estimates comparing the average lifetime earnings of college graduates ($2.3 million) to those of high school diploma holders ($1.3 million).[i] In the subsequent decade, the CEW estimate swelled from its initial $1 million to $1.2 million as the premium made college seem like a mandatory life choice.
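As a back-of-the-envelope illustration of where the headline figure comes from (a simplification; CEW’s actual models account for field of study, occupation, and years worked), the premium is simply the gap between the two lifetime-earnings estimates cited above:

```python
# Rough illustration only: the cited averages, not CEW's full methodology.
college_lifetime = 2_300_000   # average lifetime earnings, bachelor's degree
high_school_only = 1_300_000   # average lifetime earnings, high school diploma

premium = college_lifetime - high_school_only
print(f"estimated college premium: ${premium:,}")   # -> $1,000,000
```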
But families often pay heavily for this benefit, as top-tier universities edge ever closer to tuition costs of $100,000. This year, Vanderbilt University came nearest to this much-watched threshold, projecting tuition of $98,426, though it also emphasized that most students receive financial aid. This trend is evident in other prestigious institutions like Brown, NYU, Tufts, and Yale, whose costs are similarly approaching six figures. While these universities cater to a specific segment, it’s noteworthy that the national average tuition is $56,000 for private colleges and $26,000 for public universities. The rising costs across the industry continue to be a significant concern.[ii]
Choosing among America’s 5,775 public and private colleges and universities can be one of the biggest decisions a young adult makes. With 25 million applicants making these choices, a large industry exists to help with the process, encompassing high-school guidance counselors, college admissions offices, professional advisors, industry organizations, books and guides, and ranking publications –– all devoted to helping applicants find the “best” school for them.[i] From elite private universities to regional state colleges, for-profit institutions, and community colleges, the hierarchy of institutions is well-recognized and often shapes public opinion. This stratification raises crucial questions about access, equity, and whether the status of an institution significantly determines a graduate’s long-term success.
This “brand hierarchy” is a reality of the U.S. higher education system. The public often assigns greater value to highly selective, well-resourced institutions with name recognition. Rankings and media portrayals fuel this perception, creating an implicit understanding that some colleges are simply “better” than others. In fact, studies from the U.S. Department of Education show 74% of prospective students rating “reputation/academic quality” as the most important factor in choosing a school –– more important than tuition cost (67%), proximity to home (26%), or personal recommendations (24%).[ii]
A central question for the public is whether the name of the institution on a diploma translates to tangible differences in earnings potential and life satisfaction. There’s a prevailing assumption that graduates of elite universities have a clear advantage, but the reality is more complex. Partly this has to do with the structural benefits that higher education institutions provide as a transitional ground between high school and adulthood. For many young adults, elite colleges are seen as sources of social connections, professional networks, access to organizations, recommendations, and mentoring, much of it linked to a particular college or university brand identity.
Admissions processes, particularly at elite schools, contribute to the perception of stratification. The intense competition reinforces the notion of scarcity, and the “brand” of the university becomes a factor for ambitious students. Furthermore, legacy admissions systems, where preference is given to children of alumni, perpetuate the image of higher education as tied to existing social class structures –– a perception not easily dispelled. And obviously, acceptance rates vary widely according to the status and type of institution involved. The most exclusive schools like Caltech, Harvard, MIT, Princeton, and Stanford take about 4% of applicants.[iii] Most public community colleges accept all applicants, although demand for certain classes can limit individual enrollments. In all of this, it is important to keep one’s individual needs in mind, since some highly competitive and prestigious institutions do not have the strongest programs in certain fields, while less well-known schools may offer the best program in the nation in a given area.
The structural inequities and systemic biases present in higher education profoundly affect learners’ sense of belonging, which in turn influences their academic and social experiences. Research consistently shows that students from historically minoritized backgrounds, including students of color, low-income students, and first-generation college students, often feel less connected to their institutions. This lack of belonging can have far-reaching consequences, impacting learners’ engagement with courses and materials, their sense of connection with peers and community, and their overall well-being and acceptance within the campus culture. As institutions strive to create more inclusive environments, it is essential to understand the multifaceted ways in which belonging influences student experiences and outcomes.
When learners perceive themselves as outsiders, their motivation and participation in academic activities suffer. A recent study found that students who do not feel a sense of belonging are less likely to engage in classroom discussions or participate in group projects, leading to a diminished learning experience.[1] This disengagement is particularly pronounced among learners from underrepresented groups, who may already feel alienated due to cultural and institutional biases. Such environments fail to support these learners, exacerbating feelings of isolation and disengagement. Consequently, these students are often left to navigate academic challenges without the support structures necessary for success, further entrenching existing inequities.
Everyone wishes for higher intelligence. Like beauty and fitness, it’s another quality everybody seems to want. But at some point in life, most people accept what they have and just plow ahead. This sense of defined limits comes from grades, standardized tests, performance evaluations, and chosen pathways reinforced throughout life in competitive comparison. Because of this, attitudes toward intelligence become a perfect set-up for transhumanist enhancement. Rarely is the definition of intelligence questioned, even though the concept is extremely murky. Instead, what gets advanced is the hope of salvation, supplement, addition, or replacement of native functioning, these days offered in a dizzying array of methods, tricks, and technologies.
Memory-boosting supplements like Brainmentin and Optimind flood the consumer market, often pitched to aging baby-boomers. Students drink Red Bull or acquire ADD drugs to study for tests. Exercise and nutritional products promise sharper thinking through purportedly “natural” means. Dig a little further, and one finds unexamined values in intelligence discourse, which privilege reasoning and memory over just about anything else. Important as such traits may be, alone they can’t account for the many and diverse ways people navigate their lives, adapt to changing circumstances, or act in creative ways.
So, what is intelligence? The Cambridge Dictionary says it’s the “ability to understand and learn well, and to form judgments and opinions based on reason.”[i] Most other sources say roughly the same thing. Yet people who study intelligence argue that single definitions just won’t do. There simply are too many variables that go into “intelligent” thinking and behavior –– among them cognition, capacity, context, experience, emotion, orientation, language, memory, motivation, and overall physical health. Definitions of intelligence have changed throughout history and vary from culture to culture. Western societies in particular tend to value analytical skill over other traits. Critiques of such narrow thinking have a long history in philosophy, with Socrates, Plato, and Aristotle each advancing different views. Much in these early debates focused on the question of knowledge itself and how people express their thoughts. But as societies became more bureaucratic and mechanized, increasing value was placed on spreadsheets, metrics, and algorithms.
Neuroscientists call the brain an “anticipation machine” because it spends so much time predicting the future.[i] It does this by piecing together past experiences to build scenarios of expected outcomes, in a process that reinforces itself as predictions come true. But of course things don’t always come true, creating uncertainty and wreaking havoc on the anticipation machine. In mild cases this expresses itself in a sense of worry that things might go wrong. But pile up a lot of bad experiences and you end up expecting the worst, in what psychologists call “anticipatory dread.”[ii] While this can be a healthy process in buffering the shock of negative events, it also can spiral into a harmful sensation of crisis.
Recent research has a lot to say about the anticipation machine’s relationship to the update impulse. Visions of the future don’t spring from a vacuum, but link to objects, expected outcomes, or something we think we want. This desiring process applies to just about everything, whether it’s a slice of pizza or the admiration of others. But here’s the fascinating part: getting things is less powerful than wanting them. That new pair of jeans might bring a thrill. But soon comes the yearning for another purchase. Neuroimaging reveals that “wanting” and “liking” occur in different parts of the brain, with the former more strongly activated than the latter. Contrary to common wisdom, motivation isn’t driven by animalistic hungers and drives alone. What gets people going is the imagination, which is why advertising favors feelings over facts.
The past year has witnessed unprecedented assaults on Diversity, Equity, and Inclusion (DEI) initiatives in universities. Often disguised as support for “traditional” values or academic freedom, these criticisms mask a deeper debate about the role and direction of higher education in a diverse society. To navigate this turbulent discussion, it’s important to move beyond slogans and delve into the evidence-based benefits of DEI, not just for educational institutions, but for the very fabric of a democratic society.
Historically, American academia has been marked by exclusion. Access to knowledge, the cornerstone of a thriving democracy, was largely reserved for privileged white students. This reality underscores the dynamic nature of tradition in higher education. True progress lies not in clinging to past practices, but in expanding access to reflect the rich tapestry of American life.
DEI serves as a crucial tool in this expansion. Far from a political ploy or mere slogan, it represents a data-driven approach to dismantling barriers that impede access and success for historically marginalized communities. Research paints a clear picture:
Improved Student Outcomes: Studies by the National Bureau of Economic Research show that diverse learning environments significantly enhance academic performance and critical thinking skills.
Higher Graduation Rates: The American Association of Colleges and Universities reports that campuses with robust DEI programs boast higher graduation rates, particularly for socially marginalized students.
Stronger Civic Engagement: Research by the National Center for Education Statistics reveals that universities with strong inclusivity practices foster greater student satisfaction and civic engagement.
“Confessions of a Drone Warrior” is one of hundreds of articles on the military’s use of unmanned aerial vehicles (UAVs), which began in the early 2000s. In many ways this new form of combat embodies the psychological distancing that typifies killing in the twenty-first century. The story about Airman First Class Brandon Bryant recounts his first day in a Nevada bunker, when the 22-year-old fired on two presumed Afghan insurgents on the other side of the world. An early recruit in this new kind of warfare, Bryant “hunted top terrorists, but always from afar” –– killing enemies in countless numbers, but not always sure what he was hitting. “Meet the 21st-century American killing machine,” the story concluded.[i]
Of course, notions of aversion to fighting don’t sit well with either military doctrine or public belief. Behind America’s infatuation with high-tech weapons lie long-cultivated attitudes toward violence itself. In a class I teach on this, students often express common-sense views that fighting is “natural,” deriving from humanity’s animalistic origins, and often the only way of resolving conflicts. One sees this kind of thinking in permissive attitudes toward everything from boyish rough-housing to violent sports. The gendered aspects of violence receive less attention than they should, and will be addressed at length in Chapter 9. Suffice it to say that aggression often is expected of men and boys, while also being reflected in popular culture. Along with political partisanship, these attitudes help explain the deep divisions within the U.S. electorate over gun control and so-called “stand your ground” laws. Since even scholars often disagree over the issue of human violence, it helps to break the question into subcategories –– and to point out how knowledge has changed over time in biology, psychology, and cultural analyses of violent behavior.
“The more I became immersed in the study of stigmatized mental illness, the more astonishing it was to me that any such phenomenon should exist at all,” writes Robert Lundin, a member of the Chicago Consortium for Stigma Research. “I believe that serious and persistent mental illnesses, like the one I live with, are clearly and inexorably no-fault phenomena that fully warrant being treated with the same gentleness and respect as multiple sclerosis, testicular cancer or sickle-cell anemia.”[i] Here Lundin names a central problem in the social construction of mental illness: the misunderstanding of conditions affecting the mind as somehow different from other biological illnesses. This misrecognition renders mental illness prone to the judgmental attributions discussed by Susan Sontag in her 1978 book Illness as Metaphor. To Sontag, contemporary society reverses ancient views of sickness as a reflection of the inner self. In this new view, the inner self is seen as actively causing sickness––through smoking, overeating, addictive behavior, and bad habits: “The romantic idea that disease expresses the character is invariably extended to assert that the character causes the disease––because it has not expressed itself. Passion moves inward, striking within the deepest cellular recesses.”[ii] But as before, the sick person is to blame for the illness.
Such sentiments are especially vindictive when a mentally ill person commits a crime. Understandably perhaps, clinical terms like “mental illness” quickly acquire malevolent meanings in the public mind––even though the mentally ill statistically are no more prone to criminality than anyone else. Sometimes this semiotic slippage causes public panic over commonplace disorders. Consider the case of Adam Lanza, the young man who in 2012 shot 26 children and adults at Sandy Hook Elementary School in Newtown, Connecticut. While mental health analysts speculate that an acute psychotic episode prompted his violence, Lanza had never been diagnosed with a serious mental illness. As reporters scrambled for a story, much was made of Lanza’s childhood symptoms of Asperger’s syndrome, a form of high-functioning autism. The repeated mention of this disorder in news coverage triggered wrong-headed fears nationally about the murderous potential of other autistic kids. According to the Centers for Disease Control and Prevention (CDC), approximately 1 in 50 people (1.5 million) fall somewhere on the autism spectrum, 80 percent of whom are boys.[iii] This has prompted improved diagnostic measures, which in turn have resulted in an apparent rise in autism cases in recent years––up 78 percent from a decade ago––and made autism a source of acute anxiety for many new parents.
Heard about Generation Z? The demographic growing up in the 2000s? It’s a bigger group than Boomers or Millennials––and it has one further distinction. “Members of Generation Z are ‘digital natives’ who cannot remember what it was like not to have access to the Internet –– no matter when, no matter what, no matter where,” according to Forbes Magazine. This is a group raised on networked “connecting” with others, sharing, and buying things. It’s second nature to Gen-Zers to upload their favorite music to YouTube, post images on Facebook, and sell things on Etsy or eBay. Much is being made in creative-economy talk of how networks now blur traditional producer/consumer roles, manifest in the new figure of the “prosumer.” In Wikinomics: How Mass Collaboration Changes Everything, authors Don Tapscott and Anthony D. Williams effused over the democratization inherent in the new “Openness, Peering, Sharing and Acting Globally.” Of course, there is nothing really new about home-made items, crafts, and people’s willingness to share. What’s different today is the ability to copy digitized materials and circulate them via electronic networks. Digitization also has made Generation Z the first demographic to be completely tracked by “big data” analytics.
Some creativity-industry experts argue that this is nothing short of a revolution, driven by ongoing change more than any clear vision of the future. Evolutionary economist Jason Potts and collaborators have proposed what they term “social network markets,” unlike the top-down models of industrial capitalism. Characterized by fluidity and exchange through complex fields of actors, the new social network markets are less governed by competition and profit than by communication and preference. Participants are “not ‘buying’ the property, but buying into the social space.” Moreover, the dynamics of these new markets are highly interactive. As the Potts group put it, “a social network is defined as a connected group of individual agents who make production and consumption decisions based on the actions (signals) of other agents on the social network: a definition that gives primacy to communicative actions rather than connectivity alone.” Almost by definition, this process rules out conventional manufacturing or professional services. Instead, the networks generate value through the production and consumption of network-valorized choices.
The beauty is that much of what is online now is free––seeming to arrive just in time in a tight economy. While a lot of the “free” stuff available online is user-generated (selfies, birthday announcements, anecdotal postings, etc.), a huge volume of material comes from other sources (news outlets, filmmakers, commercial music producers, artists). On the surface it looks like old Marxist doctrines are being reversed, as items seem to be “decommodified” in the sharing economy. This idea has become an anthem of resistance in some circles. The Burning Man Festival, to take one example, has stated: “When we commodify we seek to make others, and ourselves, more like things, and less like human beings. ‘Decommodification,’ then, is to reverse this process. To make the world and the people in it more unique, more priceless, more human.” This may be all well and good in the real-life sharing of food and weed at Burning Man. But when things get virtual, it’s usually a large corporation that owns the websites, servers, and networks that make sharing possible.
As I write these words, many Americans remain up in arms about President Donald Trump’s peculiar relationship with the truth. On a seemingly daily basis, the nation is greeted with a new round of accusations or indignant retorts from the President––most of which bear little resemblance to objective reality. Let’s just say the Commander-in-Chief has a very “creative” approach to factuality––about everything from crime and immigration to science and the judiciary. Perhaps he’s joking or trying to shock people. Or maybe he’s a pathological liar. Time Magazine devoted a cover to the President’s “Truth and Falsehoods”; the Los Angeles Times ran multiple “Why Trump Lies” editorials; and The New Yorker is now 14 installments into its ongoing “Trump and the Truth” series. Unsurprisingly, the President has doubled down on his claims, and––in keeping with his fondness for conspiracy theories––has labeled the entire field of journalism “the enemy of the American people.” Endless pundits and commentators have tried to discern a logic in the President’s bizarre behavior––in which mischief and chaos seem the only constants.
Say what you will about Trump, his ability to get public attention is astonishing. And while some critics question the President’s grasp of “reality,” others see a calculated shrewdness in his behavior––an underlying strategy not unlike what Naomi Klein discussed in The Shock Doctrine. “We already know the Trump administration plans to deregulate markets, wage all-out war on ‘radical Islamic terrorism,’ trash climate science and unleash a fossil-fuel frenzy,” Klein recently stated, adding, “It’s a vision that can be counted on to generate a tsunami of crises and shocks.” She predicted economic shocks (as market bubbles burst), security shocks (as blowback from foreign belligerence comes home), weather shocks (as the climate is further destabilized), and industrial shocks (as oil pipelines spill and rigs collapse, especially when enjoying light-touch regulation).
“All this is dangerous enough,” Klein added. “What’s even worse is the way the Trump administration can be counted on to exploit these shocks politically and economically.” Trump himself often forecast as much, promising a “radical break” from the past––described by Fox News as a “shock and awe campaign against the Washington establishment.” This new agenda bears little resemblance to earlier “culture wars” between conventional liberal and conservative camps. Moral idealism has no place in Trump’s program of disruption and dishonesty. But his ability to confuse and deceive is not to be taken lightly. The Trump phenomenon raises important concerns about the role of knowledge in contemporary society––and the ways different worldviews are conceived, put into circulation, and frequently politicized.
If adjuncts want more workplace rights, they have to take them. As Inside Higher Ed reports, “That message was echoed throughout a discussion on non-tenure-track faculty rights here Monday at the Coalition of Contingent Academic Labor, or COCAL, conference. It’s being held this week at John Jay College of Criminal Justice of the City University of New York.
“The biennial gathering draws participants from the U.S., Mexico and Canada, and adjunct activist panelists from all three countries advocated striking as a real and valid means of achieving short- and long-term goals.
“Unless and until faculty, including part-time faculty, hit the streets and occupy the classrooms,” said Stanley Aronowitz, a tenured professor of sociology and urban education at the CUNY Graduate Center, “there won’t be any change of substance.” Aronowitz, who has worked as an adjunct professor several times throughout his career, said this idea applied even in those states where collective bargaining or strikes among public employees are prohibited by law. Faculty members at Nassau Community College who went on strike last year over protracted contract negotiations paid hefty fines for violating New York State’s Taylor Law, for example. (Under the law, the union was permitted to engage in collective bargaining, but not to strike.) But Aronowitz and other activists said that striking is a fundamental right that should be ensured by the First Amendment; without the right to strike, he said, collective bargaining too often becomes “collective begging.” Participants here responded to Aronowitz’s remarks on strikes with strong applause.
“Maria Teresa Lechuga, a Ph.D. candidate in pedagogy at the National Autonomous University of Mexico, added: “We need to stop asking for permission to organize ourselves.” Panelists said that striking is always a “last resort,” to be exercised only when adjunct faculty members and administrators can’t otherwise reach common ground. But in order to ensure public support when and if the time to strike comes, advocates said, adjuncts need to nurture relationships with other kinds of workers, along with parents and students. Maria Maisto, president of the New Faculty Majority, a national adjunct advocacy organization, said adjuncts shouldn’t be afraid to bring up their working conditions with their students. She said such conversations are part of students’ “civic education” — an essential part of their studies.
“A few years ago I was desperately seeking a book contract,” writes Rachel Toor in the Chronicle of Higher Education. “Things weren’t going well on the project I’d spent years working on, and I wanted a quick fix. In a frenzy I put together a crappy proposal for an advice book for graduate students and professors on writing and publishing and sent it to an editor I didn’t know at Harvard University Press.
“Five days later, Elizabeth Knoll responded by telling me she was already publishing a how-to-write-better book for academics, Stylish Academic Writing by Helen Sword (it’s excellent). Then she conveyed in the kindest way something I already knew: What I had proposed wasn’t a book. I had merely submitted a bunch of prose framing a table of contents for a collection of my Chronicle columns. She suggested we brainstorm an idea for a real book.
“We had a warm and frequently funny correspondence about scholarly publishing, academic writing, issues and problems in higher education, growing up as children of academics, college admissions, mutual friends, and many other things. I went back to my original book project but still hoped that someday I would be able to publish a book with Elizabeth. Recently I found I had lost my chance. She’d left the press to become assistant provost for faculty appointments at Harvard. So I jumped on the opportunity to ask Elizabeth to reflect about her time in publishing, and to offer some advice on book publishing to Chronicle readers.
“Elizabeth went into the family business. Her father was a professor of English at the University of Nebraska; her mother had been one of her father’s most talented students. “I got my Ph.D. in the history of science,” she said. “Basically I was—and am—always curious about what counts as knowledge in different times and places.” After working at the Journal of the American Medical Association, Elizabeth got a job as an editor at the University of California Press in 1988, then at W.H. Freeman in 1994. She moved to Harvard Press in 1997.
Brooklyn fashion blogger Rachel Tutera knows that you might not see her the way she sees herself. As discussed on PBS.com,
“There’s a weird tendency in people to panic when they can’t tell if you’re a man or a woman, or how you may identify,” Tutera, 29, said. “There are people who find me provocative in a way that I don’t exactly understand.”
“As a gender non-conforming person, someone who behaves and appears in ways that are considered atypical for one’s sex assigned at birth, Tutera said she feels constant stress and anxiety from the outside world.
“Whether I’m read as what I am, which is a masculine-presenting woman, or if I’m read as a feminine-presenting man, there’s a lot of danger there — physical danger,” Tutera said. “I’ve gotten shoved by guys, certain slurs.” Tutera has been the victim of gender policing, the act of imposing or enforcing gender roles based on an individual’s perceived sex. This type of behavior can range from banal actions, like a confused look on the subway, to more insidious behavior like getting thrown out of a gendered public restroom or fitting room, she said.
“Gender non-conforming people get harassed on the basis of not being the right kind of woman, a failed woman, or not being the right kind of man, a failed man,” said Professor Anne Pellegrini, the director of New York University’s Gender and Sexuality Center. Pellegrini said gender policing amounts to a form of cultural oppression.
“According to Pellegrini, in most states, transgender and gender non-conforming people are not protected from workplace or housing discrimination. Just a few decades ago, state laws allowed police to arrest individuals for impersonating another sex if the police deemed they weren’t wearing gender-appropriate clothing.
From the New York Times: “It is the height of summer, and millions of visitors are flocking to the Louvre — the busiest art museum in the world, with 9.3 million visitors last year — and to other great museums across Europe. Every year the numbers grow as new middle classes emerge, especially in Asia and Eastern Europe. Last summer the British Museum had record attendance, and for 2013 as a whole it had 6.7 million visitors, making it the world’s second-most-visited art museum, according to The Art Newspaper. Attendance at the Uffizi in Florence for the first half of the year is up almost 5 percent over last year.
“Seeing masterpieces may be a soul-nourishing cultural rite of passage, but soaring attendance has turned many museums into crowded, sauna-like spaces, forcing institutions to debate how to balance accessibility with art preservation.
“In recent years, museums have started doing more to manage the crowds. Most offer timed tickets. Others are extending their hours. To protect the art, some are putting in new air-conditioning systems. Still, some critics say that they’re not doing enough.
“Last year, the Vatican Museums had a record 5.5 million visitors. This year, thanks to the popularity of Pope Francis, officials expect that to rise to 6 million. The Vatican is installing a new climate-control system in the Sistine Chapel to help spare Michelangelo’s frescoes the humidity generated by the 2,000 people who fill the space at any given time, recently as many as 22,000 a day. The Vatican hopes to have it finished by October.
“In a telephone interview, Antonio Paolucci, the director of the Vatican Museums, said his institution was in a bind: To safeguard the frescoes, attendance should not be allowed to increase, he said, but “the Sistine Chapel has a symbolic, religious value for Catholics and we can’t set a cap.”
“Museums generally don’t like keeping a lid on attendance. At the Hermitage, which had 3.1 million visitors last year, the only cap on the number of visitors is “the physical limitations of the space itself, or the number of hangers in the coat room during the winter,” said Nina V. Silanteva, the head of the museum’s visitor services department.
“Ms. Silanteva said the goal was to make the museum accessible to as many people as possible, but she conceded that the crowds pose problems. “Such a colossal number of simultaneous viewers isn’t good for the art, and it can be uncomfortable and overwhelming for those who come to see the art,” she said. “Thankfully nothing bad has happened, and God has saved us from any mishaps.”