October 3, 2007
Following up on some of my previous affirmative-action posts, I found this op-ed in the Boston Globe particularly interesting. The op-ed centers on some new research on the most highly selective Universities in the US. What the researchers find is that roughly 15% of white students at these Universities fall below the institution’s minimum admissions standards. Contrary to the story propagated most recently by the Supreme Court, white students who fall below the minimum standards are twice as likely to be admitted to these Universities as their minority counterparts.
This evidence clearly discredits the myth of the over-qualified white student who is denied acceptance to the most selective Universities because of racial quotas. What it demonstrates is that the much older system of affirmative action, namely, the good ol’ boys network, is still the most powerful system of disenfranchisement at elite colleges.
Of course, this kind of empirically driven argument seems incapable of convincing staunch conservatives, who find Justice Roberts’s pithy logic–“the best way to end discrimination based on race is to stop discriminating on the basis of race”–more compelling.
April 22, 2007
This is a nice commentary on some of the reactions to the recent tragedy at VT by Ira Socol, who is better known for his work on higher education for people with disabilities. His commentary points out some of the problems already beleaguering University mental health care systems and suggests that “cracking down” on students exhibiting mental health problems will exacerbate them.
November 14, 2006
Keeping you up to date on this issue: an Asian student, Jian Li, is filing suit against Princeton University for discriminatory policies against Asian students. This Chinese student, in the top 1% of his high school class, with perfect SAT scores, was denied admission to Princeton after being wait-listed (he is attending Yale this year). His claim rests on recent research performed by Princeton faculty, who found that at America’s elite Universities, Asian-Americans would be the group that would benefit most from eliminating affirmative action, predicting a three-fold increase in their enrollment, from 12.2% to 33.7%. Conversely, enrollment of blacks and Hispanics, the article claims, would fall from 9% to 3.3% and from 7.9% to 3.8%, respectively.
The suit may not succeed, but it puts an interesting spin on affirmative action policies. Is there a perception that grades + SAT scores = admission to elite Universities? Having just returned from France, where I stayed at the Ecole Normale Superieure, a true meritocracy, I can say that that system too has its disadvantages: students spend two or three years after high school preparing for the entrance examination; they are often overwhelmed, anxious, depressed, etc. (manifesting in nearly annual suicide attempts–this past year, resulting in the death of one student); the system is under-funded and lacks vitality; and there is almost no racial diversity apparent at the institution.
November 3, 2006
One of the interesting trends to notice in this year’s political season (apart from the fact that this is the first time that parties have spent more money in a midterm election than in the preceding presidential race) is the current trend toward ballot initiatives: proposing amendments to state constitutions on controversial “wedge” issues. The broad issue is certainly a failure on the part of legislators across the country to do their job. Constitutional amendments are an ineffective and illegitimate way to push through controversial legislation. Unfortunately, because of the political polarization of these issues, legislatures themselves lack the political wherewithal to create compromise, write appropriate legislation and open that legislation for judicial review. Instead, judges and the people at large are forced to make decisions on issues that are beyond their capacity or authority.
One particular issue that should have all educators’ ears perked is the MCRI. The amendment essentially states that no publicly funded institution of higher education can base the decision for admission, employment or contracting of work on race, sex, gender, ethnicity, color or national origin. The University of Michigan provides all the relevant information about the amendment and its impact on current school policies.
This issue goes well beyond the typical scenario where a well-qualified white candidate is denied admission to the college of his or her choice in favor of less well-qualified candidates of color. First, it would directly affect the hiring policies of college employees and faculty, as well as potentially affecting other government programs, like health services, targeted toward specific genders and races. Second, the fact that it is addressed through a ballot initiative raises an additional concern from the point of view of democracy and an informed public, as a recent Inside Higher Ed article demonstrates. Are voters at large the right ones to be making these kinds of choices? And are they receiving adequate substantive information about the topic through the news media?
In general, this is an issue I have rarely understood. Ordinarily, affirmative action debates are framed in terms of the fairness and equality of preferential treatment: either you think it is just and right to treat certain people preferentially for whatever reason, or you don’t. From this perspective, the debate hardly has any hope of reaching a reasonable compromise. Add to that the usual scary scenario of losing a job or a place at the University of your choice to a less well-qualified candidate based on race, and there is almost no hope for even reasonable dialogue. However, in higher education it seems clear to me that the whole purpose of the institution is to design an environment (albeit at times artificial) that will contribute to the kind of education one wants students to receive. Universities achieve this by hiring certain types of employees and by admitting certain kinds of students. Now the formula for success here is difficult to gauge, but there is good evidence that a diverse student body produces a better prepared workforce for the global economy; one more responsive to issues of racial integration and immigration of foreign nationals; one prepared to integrate into a diverse work environment and make political decisions that benefit all Americans, not just a particular ethnic or social group; in short, it produces good citizens.
Justice O’Connor, writing the majority opinion in Grutter v. Bollinger, the case that upheld the University of Michigan Law School’s affirmative action criteria, cites:
numerous expert studies and reports showing that such diversity promotes learning outcomes and better prepares students for an increasingly diverse workforce, for society, and for the legal profession. Major American businesses have made clear that the skills needed in today’s increasingly global marketplace can only be developed through exposure to widely diverse people, cultures, ideas, and viewpoints. High-ranking retired officers and civilian military leaders assert that a highly qualified, racially diverse officer corps is essential to national security. Moreover, because universities, and in particular, law schools, represent the training ground for a large number of the Nation’s leaders, Sweatt v. Painter, 339 U.S. 629, 634, the path to leadership must be visibly open to talented and qualified individuals of every race and ethnicity. Thus, the Law School has a compelling interest in attaining a diverse student body.
The same conclusion was echoed by Justices Scalia and Thomas in their opinion (concurring in part and dissenting in part):
The “educational benefit” that the University of Michigan seeks to achieve by racial discrimination consists, according to the Court, of “ ‘cross-racial understanding,’ ” ante, at 18, and “ ‘better prepar[ation of] students for an increasingly diverse workforce and society,’ ” ibid., all of which is necessary not only for work, but also for good “citizenship,” ante, at 19.
This is exactly the point. Higher education is meant not only to provide an avenue for success and to prepare graduates for the workforce, but also to produce “good citizens” in the broadest possible sense of that term. As long as Universities can demonstrate that affirmative action programs are directed toward this end, they should be encouraged to continue that project.
October 30, 2006
A recent editorial in Inside Higher Ed points to the value of the American liberal arts college education from the perspective of Middle East educators. The article, titled “Lessons from Middle East ‘de Tocquevilles,'” concludes that the core of American higher education that makes it the envy of the world is its focus on student-centered education, its goal of preparing the student for life, not just for a post-graduate job, and its focus on the continuing education and development of its faculty. As the title suggests, the author believes that these Middle East educators may have a keener insight into the essence of American higher education than we do.
For many reasons, American Universities have been very quick to catch on to what we might call the democratization of education: one that is “student centered” rather than based on the “content-expert model” of disseminating information. I would even suggest that one need not go as far as the Middle East to see the differences mentioned in this article. In many ways, American Universities are leagues ahead of their European counterparts (with the possible exception of the top British Universities) in exactly the areas mentioned. In the end, the article concludes that the maligned and lamented aspects of American higher education, such as the pursuit of prestige through a veritable arms race of state-of-the-art facilities and banner athletic programs, are ultimately distractions from the core of American higher education; they are signs that we are losing our focus.
However, it might be interesting to entertain the possibility that the democratization of the classroom is in some sense tied to the adverse effects of the publicity and marketing sought through prominent athletics programs and higher rankings. In other words, the wishes of the consumer (i.e., the student and his/her parents) drive the policies of the University. In many ways, this creates a much more responsive and valuable culture for higher education, but it clearly has its drawbacks.
Educators in American liberal arts colleges have a distinct window into the state of American culture. While I often complain about the privileging of expensive technological and recreational facilities over meaningful interaction between educators and students, I have to admit that these trends are not unique to the American University. Moreover, they are not necessarily signs of an unhealthy atmosphere in higher education. University Administrators are constantly under pressure to provide tangible signs of the competitiveness of their University. Until we, as a culture, can devise ways to put a concrete value on higher education as a part of personal and moral development, we will not be able to change University culture. Until we see higher education as a key component in the development of a populace for the purpose of good moral and political action, the University’s deficiencies in providing that kind of education are not symptoms of bad University policy; rather, they are symptoms of the state of our culture at large.
October 3, 2006
Inside Higher Ed is reporting today on a new survey released by an independent business group, The Conference Board, which finds that 431 HR representatives of companies hiring college graduates rate all college graduates–i.e., four-year as well as community college–poorly in the areas of Written Communication, Writing in English and Leadership. In fact, the only area in which all college graduates were rated as “excellent” by a majority of HR representatives was the application of information technology.
This is a slap on the wrist of colleges, especially four-year liberal arts colleges. But it is an opportunity for those of us teaching in the humanities to demonstrate that our place in higher education is integral and ought to be reinforced. There is no better place to learn good writing skills than in History, Philosophy, Literature, Theology or Classics courses. These courses are often begrudgingly included in the “core” curriculum for liberal arts colleges, but their faculty, financial support and status within the University have consistently been undermined: they are often seen as refuges for neo-Marxist, feminist, queer-theoretical or postmodern nonsense and are consistently contrasted with those disciplines that fit more easily into the framework of modern technical sciences. This new report suggests that our emphasis on learning science and technology has done its job; now we need to return to the classical skills of reading and writing that can be taught so well in the humanities.
June 6, 2006
In a recent post on High-Tech “Cheaters”, I argued that the essential problem with cheating, high-tech or otherwise, is a problem that faces all educators, namely, how to meet students’ needs effectively and then evaluate their response to our efforts. In the May 2006 issue of the Proceedings and Addresses of the APA, I ran across John M. Dolan’s “Statement for the Academy of Distinguished Teachers,” appended to his obituary. John Dolan was a professor of philosophy at the University of Minnesota and passed away on 15 September 2005. He says:
"My philosophy of education can be expressed in a single sentence: 'You are teaching a fish how to swim.' I am suspicious of instructors who, thinking they understand what is going on when real intellectual growth takes place, wax eloquent about their pedagogical 'methodologies.' Examined closely, episodes described as 'inspired teaching' are occasions in which abilities and powers already present in the 'student' are somehow stimulated and stirred into more vivid realization and growth. Neither the 'teacher' nor the 'student' is wholly in charge of the direction or character of that new life and growth."
He goes on to compare a student to a plant, which develops out of its own inner force, and he warns teachers to take heed of the old doctor's credo: "Primum non nocere" (first of all, do no harm). How much does a teacher's effort to control the dynamic of the classroom, to resist the intrusion of new technologies, to submit students to artificial and narrow methods of evaluation constrict the growth of students?
In a real sense, what we do as teachers is not much: we simply try to let our students be what they already are. Yet there is also no doubt that letting things be what they are requires a determined will and a sharp mind.
May 27, 2006
With the recent publication, in May 2006, of the findings of the University of Colorado-Boulder’s committee on academic misconduct regarding the Ward Churchill controversy, issues of academic freedom are once again brought to the forefront. Recently, the online magazine Inside Higher Ed published a series of perspectives on Churchill, centering largely around the ACTA report “How Many Ward Churchills?” The principal author of that report, Anne D. Neal, defends her position, explaining herself against recent attacks, alongside a counterpoint by Dennis Baron. I leafed through the report, which is billed as a “survey” of course catalogues at American Universities, but found no indication of the parameters or procedures employed for the survey, no statistical analysis of language used in college catalogues, only a recurring claim that the anecdotes offered therein are shockingly “ordinary.”
Some claim that the ACTA is simply conducting another right-wing witch hunt, but then again others claim that Churchill and the majority of University professors are in fact advancing a left-wing conspiracy to undermine democratic education. I’m afraid that debate won’t get us very far. (In an interesting aside, if you Google ‘Ward Churchill academic freedom,’ you will find 375,000 entries, of which the top results are largely right-leaning, while if you Google ‘Ward Churchill academic responsibility’ you find 100,000 fewer entries, of which the top results are left-leaning.)
I did find a real debate between Churchill and David Horowitz. Given what appears to be the rather one-sided nature of the forum in which the debate occurs, the discussion is actually fairly enlightening. The basic lines of the debate run something like this: Horowitz and his supporters think that politics should be removed from the classroom in favor of “professional,” scientific objectivity. Classrooms, according to Horowitz, are not the appropriate forum for educators to voice their political opinions. He includes in his list of deplorable trends on college campuses the usual women’s studies, ethnic studies, queer theory and peace studies, and, somewhat surprisingly, now “social work,” based on a particular course at Kansas State University that really got under his skin. The former are a new crop of concentrations at major universities; they are rarely given the status of a university degree offering and even more rarely support tenured chairs. The latter, however, is a very interesting case, since Horowitz suggests that this is part of a broad trend in the Marxist-driven politics of social work. However, the above course is not available to social work degree seekers, but is offered as a part of the Women’s Center at KSU. Horowitz’s main point seems to be that professors are not politicians; they are not elected to their position and do not serve at the mercy of their constituents, and so they should not speak politically.
Churchill, for his part, argues, through many detailed historical accounts of academic oppression and historical prejudices being passed off as doctrine, that there is no such thing–especially in a discipline like Native American history–as abstracting political opinion from the classroom; the subject-matter itself demands a discussion of political opinions. His argument is that a “professor’s” implicit job is to “profess” his or her opinions (pay no attention to the dubious etymology behind the curtain). In the end, he claims that only those who challenge the “status quo” are submitted to the tribunal of scientific legitimacy, whereas all interpretations of history are simply the advancement of a certain perspective.
We could enter a long debate on those juicy morsels. But what is the role of the University in educating the public? The German academic paradigm, upon which the American University system is largely based, held that by training and disciplining the imagination of the citizenry, the nation would develop into a robust and moral nation-state. The process was called Bildung, a word that closely resembles the term the German idealists used for imagination, Einbildungskraft, which is in turn closely related to the German word for picture, Bild. Thus, the idea was to develop a publicly shared imagination that would inform the moral actions of the people. (The idea had some rather unfortunate historical consequences.) In order to do this, professors could not be beholden to a small subset of the population (a board of directors, for instance) but were conceived as stewards of the state itself, so that they could freely serve the common good.
This, of course, is the conceptual grandfather of “tenure” and of the much-maligned encadrement of the “ivory tower.” As something of an aside, it should not be overlooked that this debate about “academic freedom” is going to have important consequences for the related trend in Universities toward eliminating tenured positions in favor of short-term contracts and adjunct positions. But in a strictly pedagogical context, University professors have the duty to connect with the opinions of their students, to stimulate those opinions and broaden them to as general a domain as possible without losing their particular foundations. Indeed, the particular foundation of those opinions is often emotional, that “gut reaction” that accompanies conviction. Professors are not politicians and should not try to be. But perhaps it would not be entirely inappropriate for professors to elicit the opinions of their students by articulating their own sometimes controversial opinions or those of others.
I recommend that everyone read some of Ward Churchill’s responses to the recent criticism (a task from which he has not backed down), because he is an articulate person, whatever the status of the charges now confirmed against him. We should also reconsider the essence of what it was that got Churchill into so much trouble. He claimed that 9-11 was a case of “chickens coming home to roost.” He did this in a rather insensitive way, referring as he did to the victims of 9-11 as “little Eichmanns,” but it is grosso modo in line with what other intellectuals and journalists had also said in the aftermath of 9-11: for example, Jean Baudrillard, Noam Chomsky, and Robert Fisk.
For some, this may not be the highest of compliments: to be associated with these clearly neo-Marxist liberal populists. But at least neither the academic credentials of Baudrillard and Chomsky, nor the journalistic credentials of Robert Fisk, can be questioned. In fact, what makes all of these critiques similar is that they each ask us to question why something like 9-11 happened to America, why the World Trade Center? Even if it is commonplace to say that the “world changed” after 9-11, wouldn’t we like to know the conditions that brought about such a change? Would it be historically and academically irresponsible to probe into that question? Perhaps you wish that Churchill–or Baudrillard for that matter–had done it with a bit more tact. But that’s really a semantic issue, isn’t it?
I will leave you with what I think is a wonderful historical document in its own right, one that highlights the legitimacy of that question. It’s a video of Robert Fisk giving a talk at MIT in February of 2003 called “Ask All You Like About 9-11, But Just Don’t Ask Why.” I was at this talk, watching on a screen in one of several overflow rooms at MIT. Fisk had just come to Boston via New York City, where he attended (what is now commonly agreed to be) the shameful presentation of then Secretary of State Colin Powell to the UN Security Council making the case for war in Iraq. Fisk opens with his account of this presentation and then proceeds, in his own way, to render the events that led up to 9-11, describe the current status of Iraq and demonstrate the utter implausibility of any connection between Saddam Hussein and Al Qaeda. In a particularly chilling segment, he shows a series of pictures of a children’s hospital where hundreds of Iraqi children were dying (very quickly) of strange cancers. He suggests that the use of small-scale nuclear warheads (the first Gulf War’s version of the “bunker buster”) may have something to do with it. Who knows?
But maybe asking some tough questions would not have been such a bad idea after all.
Update (02-22-2007): For historical purposes, I recently discovered this article written by someone I respect who was the chair of the philosophy department at CU during the Churchill ordeal. His position is instructive, clear and free of politicking. And his ultimate conclusion about Churchill is not very positive.
May 25, 2006
An article by Ira Socol, “Stop Chasing High-Tech Cheaters,” itself a response to a recent NYT article, “Colleges Chase as Cheats Shift to Higher Tech,” presents a very interesting issue concerning evaluation procedures in institutions of higher ed. Perhaps these different perspectives represent a changing of the guard in terms of teaching methods: the younger generation of grad students like Socol embraces methods of “research”/”cheating” enabled by newer technology, while the older generation cringes at the thought that students might use “spell check” on in-class exams (prompting one journalism professor to have his students write their computer-aided exams with screens facing him). In any case, I am sure that my colleagues and friends are forced to ponder, as grading season is in full force, how we should evaluate our students.
Socol’s argument isn’t perfect, but I do like the perspective it offers. I had a wonderful high school biology teacher, Dr. King, who used to say that you shouldn’t have to learn anything you could find out by looking it up. In our Google-ized world, it may seem that there is very little left to know that cannot be “looked up” instantaneously. I sympathize with the remark, cited by Socol, “Why aren’t colleges teaching students how to research, organize and evaluate the information that is out there?” instead of continuing to demand from them rote memorization of facts that could so easily be found on-line.
Well, I agree that what we need to teach is effective research methods. But what are those methods? And could they include perhaps some of the mechanical practices that students have been tested on for ages? For instance, I have my students write in-class exams because I feel that one important skill in conducting research is the capacity to organize thoughts quickly and spontaneously, and then write them out coherently. I also think that handwriting is important, even in our world of the emerging hand-held PC, because hand-written notes cannot be eliminated when organizing research. I also personally find that there is something added by physically altering what one is working on (but that may itself be passé). However, the proposition I submit to my students is this: if your thoughts are not legible, they are not intelligible, and so they are not useful for research. In short, there certainly are some “old fashioned” practices, such as memorization, good spelling, etc. that remain indispensable, even in a high-tech world.
There is, of course, the issue of citation, which is an important research skill that I find very few college students have mastered. But I don’t see why this couldn’t be integrated with the use of new technologies. Websites ought to be cited just as books or articles ought to be cited. And perhaps technologically related techniques (such as word-searching an electronic database) ought to be acknowledged in certain cases.
The thing that always struck me about the outrage against cheaters is that it seems to be fueled by a kind of moral taboo: that there are certain practices which ought to be absolutely forbidden. For my part, I prefer a more aesthetic judgment: poor work is poor work and students who cheat generally do poor work. Apart from the wholesale purchasing of pre-written papers, something that I have never personally encountered, any case of cheating ought to be fairly recognizable: there are invariably sudden shifts in subject-matter and diction, students use obscure material not covered in the course, or introduce an example or fact without the requisite development and understanding of it. But these stylistic indicators of a “cheater” are also simply indications of poorly written papers. In my view they should be graded accordingly.
What is more, if paper topics and exam questions are geared toward evaluating what students have learned from the course, then the issue of cheating should effectively be moot. That is to say, Googled research or purchased external papers are obviously not going to pertain directly to what particular instructors have taught in their particular courses. So these students will fail to demonstrate that they have actually internalized the teaching from the course. But herein lies the rub: how do we really evaluate the learning process? How do we judge whether students have internalized the information and methods that we are trying to convey to them?
Maybe the moral outrage against “cheating” funnels a deeper frustration about how to approach the very difficult task that is education.
May 24, 2006
This year’s graduation speakers provide a very interesting commentary on the paradoxical status of moral teaching at our religious institutions of higher learning. I am an academic and hope to continue to be one for as long as possible. And I am a religious person. I have also had the somewhat coincidental experience of attending Catholic institutions of higher education for both my undergraduate and graduate degrees.
This year, one of those institutions, Boston College, gave Condoleezza Rice the honor of being the commencement speaker. Students protested. Faculty protested. There was a very intelligent student-generated petition against the invitation. Adjunct professor Steve Almond resigned. Some of the arguments guiding these actions focused on Boston College’s status as a Jesuit institution, stating that Rice’s actions as National Security Advisor directly conflict with “Catholic values.” Almond went so far as to state quite clearly that “Rice is a liar.” But in every case, at least some effort was made to suggest that Condoleezza Rice’s decisions in the past undermined her moral authority to provide the kind of wisdom that is supposed to be imparted from the podium during graduations.
In a wonderfully ironic twist, Rice’s speech focused on “responsibility.”
Meanwhile, at a small Catholic school in St. Paul, Minnesota, the University of St. Thomas, a senior chosen to address the student body declared that faculty members who wanted to share a room on school-sponsored trips, as well as–get this–women who use “the pill,” are being “selfish” and ignoring the greater good for short-term pleasure. While many students and parents walked out in disgust, and still others booed and shouted insults, the dean who followed the student’s speech called it “courageous.”
Religious institutions of higher education, it would seem, are some of the last vestiges of the kind of education that trains not only the mind but the soul. It used to be the case that all forms of education were envisaged in these terms, but now there are only a few institutions that are openly willing to acknowledge that they stand by a certain set of moral values and that these inform their educational practices.
But how far have we come, if what we take to be the best kind of moral education is to tell a handful of people how they should conduct themselves in the bedroom, while at the same time rewarding and encouraging figureheads of the state who–at the very least–are not above reproach for having directed our country’s citizens, resources, and attention to what is quite clearly an unnecessary and extraordinarily costly war?