There’s nothing like a bunch of unemployed recent college graduates to bring out the central planner in parent-aged pundits, as the Washington Post reports.
“In a recent column for Real Clear Markets, Bill Frezza of the Competitive Enterprise Institute lauded the Chinese government’s policy of cutting financing for any educational program for which 60 percent of graduates can’t find work within two years. His assumption is that, because of government education subsidies, the United States is full of liberal-arts programs that couldn’t meet that test.
“Too many aspiring young museum curators can’t find jobs?” he writes. “The pragmatic Chinese solution is to cut public subsidies used to train museum curators. The free market solution is that only the rich would be indulgent enough to buy their kids an education that left them economically dependent on Mommy and Daddy after graduation.” But, alas, the United States has no such correction mechanism, so “unemployable college graduates pile up as fast as unsold electric cars.”

“Bill Gross, the founder of the world’s largest bond fund, Pacific Investment Management Co., has put forth a less free-market (and less coherently argued) version of the same viewpoint. “Philosophy, sociology and liberal arts agendas will no longer suffice,” he declared. “Skill-based education is a must, as is science and math.”

“There are many problems with this simplistic prescription, but the most basic is that it ignores what American college students actually study. Take Frezza’s punching bag, the effete would-be museum curator. It would be only a slight exaggeration to say that no such student exists.
“According to the National Center for Education Statistics, humanities majors account for about 12 percent of recent graduates, and art history majors are so rare they’re lost in the noise. They account for less than 0.2 percent of working adults with college degrees, a number that is probably about right for recent graduates, too. Yet somehow art history has become the go-to example for people bemoaning the state of higher education.

“A longtime acquaintance perfectly captured the dominant Internet memes in an e-mail he sent me after my last column, which was on rising tuitions. “Many people that go to college lack the smarts and/or the tenacity to benefit in any real sense,” he wrote. “Many of these people would be much better off becoming plumbers — including financially. (No shame in that, who’re you gonna call when your pipes freeze in the middle of the night? An M.A. in Italian art?)”

“While government subsidies may indeed distort the choice to go to college in the first place, it’s simply not the case that students are blissfully ignoring the job market in choosing majors. Contrary to what critics imagine, most Americans in fact go to college for what they believe to be “skill-based education.” Continue reading “Art history at the crossroads”
Invariably, around February of each year, coinciding with Black History Month, you’ll hear people asking, “Why isn’t there a White history month?”
Do these people mean we should condense all the American history centering around White people to just one month and devote the other 11 to people of color?
Of course not. It’s readily accepted that White history is taught, year-round, to the exclusion of minority histories. Do they mean, then, the literal history of Whiteness — how and when and why what it means to be White was formulated? I’m guessing that’s not what they mean, either. Yet that history is always neglected. The construction of the White identity is a brilliant piece of social engineering, and its origins and heritage should be examined in order to add a critical layer of complexity to a national conversation sorely lacking in nuance. In conversations about race, I’ve frequently tried and failed to express the idea that Whiteness is a social construct. So, here, in plain fact, is what I mean:
The very notion of Whiteness is relatively recent in our human history, linked to the rise of European colonialism and the Atlantic slave trade in the 17th century as a way to distinguish the master from the slave. From its inception, “White” was not simply a separate race, but the superior race. “White people,” in opposition to non-whites or “colored” people, have constituted a meaningful social category for only a few hundred years, and the conception of who is included in that category has changed repeatedly. Continue reading “White history month”
Since their birth as a science-fair curiosity at Brookhaven National Laboratory in the late 1950s, video games have moved inexorably towards higher and more central cultural ground, much like film did in the first half of the 20th century.
Games were confined at first to the lowbrow carnival of the arcade, but they soon spread to the middlebrow sphere of the living room, overran this private space, and burst out and upwards into the public spheres of art and academia. With prestigious universities like NYU and USC now offering graduate-level programs in game design, and major museums like MoMA, MAD, and SF MoMA beginning to acquire games and curate game exhibitions, preserving the early history of the medium appears more important than ever. But what exactly does it mean to preserve a digital game?
The answer is surprisingly simple: It means, first and foremost, preserving a record of how it was played and what it meant to its player community. Ensuring continued access to a playable version of the game through maintenance of the original hardware or emulation is less important—if it matters at all.
That, at least, was the provocative argument Henry Lowood made at Pressing Restart, which recently brought preservationists, teachers, academics, and curators together at the NYU Poly MAGNET center for a day of “community discussions on video game preservation.” Lowood is no contrarian whippersnapper; as a curator at the Stanford Libraries, he has been professionally involved in game preservation efforts for well over a decade. Continue reading “History and video games”
Even some of our most storied and longest-lasting profanities have proven susceptible to a gradual weakening in the face of changing social norms and technology-aided, taboo-sapping overuse.
Today’s Slate.com explains, in the excerpt below: “Damn, hell, shit, and fuck are not what an anthropologist observing us would classify as ‘taboo,’ ” says linguist John McWhorter, author of What Language Is: And What It Isn’t and What It Could Be, among other books. “We all say them all the time. Those words are not profane in what our modern culture is—they are, rather, salty. That’s all. Anyone who objects would be surprised to go back 50 years and try to use those words as casually as we do now and ever be asked again to parties.”
“As McWhorter notes, even fuck—the super-badass, cannot-be-effed-with, undisputed heavyweight champion of all curse words—has not escaped the passage of time with the full force of its offensiveness intact. Jesse Sheidlower, editor of The F-Word—a comprehensive volume that delineates the impressive history of the word fuck, as well as its many uses and variations that have cropped up throughout the English-speaking world—is perhaps the world’s foremost expert on this topic. He has studied the progression of the word with precision and scholarly zeal. There are, he says, “a number of things going on with fuck.” Continue reading “Bad words are changing”
Much has changed in the past 50 years, since the height of the Civil Rights movement. But how do you teach the Civil Rights movement to kids who never experienced it? In Jackson, Miss., the Fannie Lou Hamer Institute’s Summer Youth Workshop tackles that question, reports NPR today.
“Take 13-year-old Jermany Gray, for instance. Gray and his fellow students are all African-American, and many of them are from Jackson. They’re familiar with the struggle for civil rights — they read about it in textbooks and saw it in museum exhibits. But for most, it’s a story that ended long before they were even born. Gray has no problem talking about what the Civil Rights movement was back in the ’60s, but when asked what it means to him these days, the answer doesn’t come as easily.
“What does it mean? I’ll have to think about that question,” he said. “Maybe I can answer that at the end of the week.”

That’s the typical challenge, according to Michelle Deardorff, who is the chair of political science at Jackson State and who also helped found the Hamer Institute. “The image I give when I talk about this is a tree, and the tree is democracy. And a chain link fence was around it,” said Deardorff, who used the idea of the fence to represent racism and slavery. “And as the tree grew, it grew around the fence. We’ve now pulled the fence out… but the tree is shaped by it forever.” Continue reading “Teaching the civil rights movement”
America’s concerns about government intrusion are older than the country itself, says Neil Richards, a law professor at Washington University in St. Louis.
“If you want to talk about privacy, what would be less private than having a platoon of Redcoats living in your house, eating your food, listening to your conversations?” Richards asks. “… In the Constitution itself — the quartering of soldiers, the execution of general warrants — all have to do with the privacy of the home, the privacy of papers.” NPR says:
“And though the Constitution doesn’t use the word ‘privacy,’ the separation of individuals and their information and their homes and their persons from the state is a theme that runs throughout the Bill of Rights.”
“Concerns about privacy ballooned again in the camera age. “Privacy as a theme in American law, and really in American public discussion, arose in 1890,” Richards says. Supreme Court Justice Louis Brandeis — just a young lawyer at the time — wrote an article for The Harvard Law Review about the personal intrusions of the new “snap cameras.”
“The history of privacy in the U.S. is closely tied with the history of the press, and by the 1960s, that had become an embattled relationship. The ’60s, Richards says, were a major moment for American privacy, in part because of the growth of “pre-modern computers.” Back then, databases were called “data banks,” and they made people nervous. Continue reading “The History of privacy”
A newly released digital edition of the four books of LDS or Mormon scripture—the Holy Bible, the Book of Mormon, the Doctrine and Covenants, and the Pearl of Great Price—includes editorial changes that reflect a shifting official view on issues like polygamy, the Church’s history of racism, and the historicity of LDS scripture, reports Salon.com.
“Perhaps the most significant is the inclusion of a new heading to precede the now-canonized 1978 announcement of the end of the LDS Church’s ban on black priesthood ordination:
“The Book of Mormon teaches that “all are alike unto God,” including “black and white, bond and free, male and female” (2 Nephi 26:33). Throughout the history of the Church, people of every race and ethnicity in many countries have been baptized and have lived as faithful members of the Church. During Joseph Smith’s lifetime, a few black male members of the Church were ordained to the priesthood. Early in its history, Church leaders stopped conferring the priesthood on black males of African descent. Church records offer no clear insights into the origins of this practice. Church leaders believed that a revelation from God was needed to alter this practice and prayerfully sought guidance. The revelation came to Church President Spencer W. Kimball and was affirmed to other Church leaders in the Salt Lake Temple on June 1, 1978. The revelation removed all restrictions with regard to race that once applied to the priesthood.
“Church leaders have long maintained public ambiguity about the history of the ban and its end; they have rarely acknowledged the ordination of early African-American Mormons or cited anti-racist teaching in the Book of Mormon in connection with the Church’s own troubled history on race. The new heading historicizes the ban (suggesting the influence of a robust Church History department) and depicts it as a contradiction to the original impulses of the faith, not corrected until 1978. The heading does, some commentators have noted, offer continuing cover to Brigham Young, whose on-the-record racist statements to the Utah legislature suggest his influence in the evolution of a non-ordination policy. Commentators also note the absence of any reference to the fact that black women were not admitted to LDS temple worship until the 1978 announcement.”
“Keeping up with the pace of change in the digital world is challenging, and harnessing its potential can be frustrating,” says the Getty Trust’s James Cuno. In an essay entitled “Art History is Failing at the Internet,” published by the Daily Dot, Cuno writes that “the biggest mistake many of us in the arts and humanities academy can make is thinking of that potential only in terms of how we can use the new technology to more quickly and broadly disseminate information. The promise of the digital age is far greater than that. It offers an opportunity to rethink the way we do, as well as deliver, new research in the arts.
“The history of art as practiced in museums and the academy is sluggish in its embrace of the new technology. Of course we have technology in our galleries and classrooms and information on the Web; of course we are exploiting social media to reach and grow our audiences, by tweeting about our books and articles, including links to our career accomplishments on Facebook, and chatting with our students online.