Monday, December 14, 2009

Achievement

Malcolm Gladwell’s study of overachievers, Outliers, has been on the bestseller list for some time now, and this meditation on it comes rather late. But I just read it, and it is rare to come across a book so intriguing that I literally lose sleep over it. Gladwell begins by asking what it is that makes exceptional people exceptional, and in a series of case studies powerfully repudiates conservative (and self-help book) dogma which holds that the world is a functioning meritocracy where you, and you alone, determine your fate. Talent and hard work are factors in success, to be sure, but, as Gladwell demonstrates, they are very far from the only ones. It avails you nothing to be Mozart if your parents are drug addicts who beat you every time you make some noise.

In fact, innate intelligence, as measured (ostensibly) by IQ, doesn’t seem to matter – one needs only sufficient intelligence. Where you go to school doesn’t seem to matter much, either – using a list of Nobel Prize winners, for example, Gladwell demonstrates that graduates of Ivy League and other elite institutions are no better represented than those of second- and third-tier institutions. Any decent school will do. Even racial prejudice doesn’t always work against an individual. What does matter, then? Well, for starters, how early in the year you were born and, indeed, the year itself; the extent to which your parents supported you; the number of hours — 10,000 seems to be key — you put into developing your skills at a young age...the list goes on. What matters is an accumulation of advantages, many of them objective factors outside the control of any individual. In other words, our talents and our efforts must be met with a great heaping dose of dumb luck if we want to excel.

I recently raised this point with a group of graduate students when I was invited to participate in a panel discussion on the topic of getting hired in the academic job market. You could be the Mozart of your subject area and never get hired, I told them. Now, I’m no Mozart (not even Salieri), but I asked them to consider my own path to a rare and coveted tenure-stream position. At least four things had to occur: there had to be a position to fill (somebody had to retire); my department had to seek to replace the retiree with a new hire in my field (by no means a given); my institution had to agree, in the midst of an economic downturn, to the department’s request for a replacement hire (probably the biggest hurdle); finally, they had to hire me out of all the dozens and dozens of qualified people who applied for the job. Of those four factors, I could affect the outcome of only one, and even then only up to a point. In the end, I was hired over many people, including some friends, who would have done the job excellently had they been chosen. My personal merit — which certainly did not exceed that of any of a number of other candidates — was one factor among many.
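To put the same point more starkly, the conditions are conjunctive: every single one has to come up in your favour, so the long odds multiply. A toy sketch, with every number invented purely for illustration:

```python
# Invented numbers, for illustration only: if each condition were an
# independent event, the chance of a hire is the product of all four.
p_position_opens = 0.5    # somebody retires and a line opens up
p_field_match = 0.5       # the department replaces in your field
p_funding_approved = 0.3  # the institution approves the hire
p_you_are_chosen = 0.2    # you beat the other qualified applicants

p_hired = (p_position_opens * p_field_match
           * p_funding_approved * p_you_are_chosen)
print(f"Chance of the stars aligning: {p_hired:.1%}")  # 1.5%
```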

My own view is that my former students, now entering grad school hoping to become professors, are starting out at the worst possible time, when the advantages are least likely to accumulate in their favour. They are facing the worst academic job market since, well, ever. They will graduate into a market positively saturated with immensely qualified PhDs swarming for a tiny handful of jobs. The cruel mathematical reality is that there are simply far fewer positions than there are good people with PhDs, and how accomplished they are in terms of teaching and publication is only one factor among many in the hiring process. Life isn’t fair, and the world is not a straightforward meritocracy. If I had to guess, I’d say that fewer than one in five will get a full-time, tenure-stream position. I know they don’t want to hear this, but it does no good to hide from reality, either.

I’ve talked about this before, and less pessimistically. But that roundtable I participated in got me thinking. The assembled graduate students were told everything except the two things that they needed to hear the most. The first is that the objective circumstances are stacked against them, and so they will need to work very, very hard to accumulate as many advantages as they can in those fairly narrow areas where their efforts actually make a difference.

The second, and most important, thing that I should have said but didn’t is that the quality of your CV is not the same as the quality of your person, that your academic successes and failures are not moral successes and failures. Alas, truly excellent job candidates can sometimes have their accomplishments held against them, or minor faults blown out of all proportion to their actual significance. Academics trained to hone their critical faculties to a razor's edge too easily turn them on one another. It happens in class, it happens over beer, it happens in meetings, in the pages of journals, at conferences, and on hiring committees. Never underestimate the capacity of people with tenure and six-figure salaries to be threatened by the least little thing, and to find not just nit-picky but positively pathological grounds for passing over qualified candidates for jobs. At that recent panel, a colleague remarked that the order in which a candidate describes her teaching and research in her cover letter could make the difference between being interviewed and not, and that, moreover, people who make the wrong choice might actually offend some hiring committees. I have no doubt that this is true.

Anyway, that’s my advice for graduate students. For any professors reading this, I have different advice. If that sort of thing would offend you, well, get a therapist.

Sunday, November 29, 2009

Christmas

Last Christmas, I wrote a long column in which I objected most strenuously to the whole season. I consider it my best and most important blog post to date, and I made several reasoned arguments against the Christmas season, the foremost of which is that it sucks and I hate it. At the mall this week, however, I was dismayed to discover that my objections have gone entirely unheeded. Unaccountably, Christmas has returned. Come on, people! Is no one reading this thing?

If this blog has railed against one thing from the outset it has been against the hypocrisy of compulsory sentiment, and nothing so exemplifies this condition as a visit to the shopping mall or supermarket this time of year. The same dozen songs, endlessly repeated (however often they are re-recorded - I see that Bob Dylan has an album of Christmas standards out); the same message (be merry, or else); the same visage of the Merry Leader (Santa, not Jesus) and the incessant reminder that he is watching you and knows when you are sleeping and when you are awake. There's an Orwellian thought for you. I say again that in the eight weeks separating Halloween from Christmas we get a small taste of what it’s like to live in North Korea. Merry! Merry! Merry! Happy! Happy! Happy! Joy! Joy! Joy! Merry Leader is Watching You And Expects You to Conform.

And, please, don't get me started again about the clichéd holiday specials ("Next week, on a very special episode of Battlestar Galactica, the Cylons learn the true meaning of Christmas" etc.) and the vapid holiday films with trailer tag lines like, "This Christmas, the only thing some families can stand more than being together is being apart."

Come on now, Broad, you middle-aged grump. It's not all bad, is it? Well, I admit that I don't mind some Christmas music. Emilie-Claire Barlow does a thumpingly great version of "Little Jack Frost" (if you have iTunes, download it now – you won't regret the 99 cents) and, of course, there's Dean Martin's effortless take on "Baby, It's Cold Outside." But these are songs about winter rather than Christmas per se. In my view, there's only one authentically good modern Christmas song: "Fairytale of New York", by the seminal Irish band the Pogues and the late Kirsty MacColl. It’s about a drunk and a druggie and the sentiments of genuine affection (and contempt - she calls him a "scumbag" and a "maggot", he calls her an "old slut on junk") that they share at Christmastime. Brings a tear to these jaded eyes of mine.

I find myself in complete accord with my Christian friends who regard the season as too commercial. There's nothing new about this complaint – C.S. Lewis made it half a century ago and he wasn't the first. Indeed there is something cold and crass about the idea that we will express our affection for family and friends once per year through the mandatory purchase of commodities that are in most cases both unwanted and unneeded. In many families it's reached the point where people simply tell each other what to buy for them, which raises an obvious objection about cutting out the middle-man. I know, of course, that for many parents Christmas is a time of genuine joy — many children love it — but let us not forget, too, that for many parents of modest or little economic wherewithal Christmas is a time of genuine anxiety. Young children are consumer-aware but not, generally, aware of their parents' economic circumstances. And let's not forget that some parents, too, are positively insane this time of year. Remember the Cabbage Patch Kids riots?

Judging by the volume of e-mails I received (about a dozen), my blog last year was my most widely read ever. After it was published I was approached by a couple of activist-minded students who had seen it and who were preparing to petition my workplace over its overt displays of Christian Christmas symbols. This, they felt, created a "hostile" atmosphere for non-Christian students, faculty, and staff. Would I join them? My reply was "certainly not." Apart from the obvious objection that it's rather silly of anyone to voluntarily work at or to attend a Catholic institution and then act surprised to discover Christian symbols there, I explained to them that my affinity for the Grinch goes only so far. Like him, I find the season loud and crass. But we part ways emphatically over his belief that he has the right to stop other people from celebrating it.

The fact that the students — and they are by no means alone in this — could not differentiate between these two worlds-apart positions is indicative of how badly our educational system often handles such things. Out of mistaken notions of "respect" for differing worldviews, many schools have decided that it's best if people don't express their differing worldviews at all. But respect, of all things, is a sentiment that cannot be made mandatory. It emerges, if it emerges at all, through a process of engagement — which must necessarily include argument and disputation among people who do not always agree. The efforts at this time of year to ban carols and lighted trees and harmless expressions such as "Merry Christmas" are not merely silly but insidious. They undermine rather than promote discussion between faiths and between people of faith and nonbelievers.

Sunday, November 15, 2009

Memorization

Last month, my wife and I moved. Moving is best done regularly or not at all, and after the events of the past four weeks, the idea of spending about fifty years in one house, dying there, and then forcing the inheritors of my estate to go through my decades of accumulated crap has a certain appeal. Take that, you vultures.


At any rate, I consider myself a modest adherent of Robertson Davies's admonishment to "keep everything." Amongst the discoveries in our attic — my prized comic books, located at last! — was a pile of undergraduate papers and tests. I re-read a few of them, and was struck by the fact that Graham, aged twenty, was in some ways a better writer than he is now. He was highly imitative of whatever he was reading at the time, but a careful and economical writer and one with a certain spring in his sentences. He might have made a good novelist had he worked at it. He didn't. In fact, re-reading the essays now, I see quite clearly that he was also a remarkably lazy researcher, and much of what he wrote comes across as likeable but insubstantial, like fast food or a hollow Easter egg. One senses from the remarks of various professors that they were more amused than impressed, and occasionally one saw through the whole stylistic juggling act and called it for what it was. In 1991, Graham received a mark of 40% for a very clever but paper-thin review of Carlo Ginzburg's The Cheese and the Worms. He was very angry at the time, especially since another professor had just given him a 90% on a similarly conceived book review. And here's an important point, students: re-reading the essays now, and reflecting upon the professors' comments, I realize that it was the first professor, and not the second, who was not only right, but who really cared about twenty-year-old Graham's academic progress. The second exemplified what the recently departed Theodore Sizer called the "disengagement compact" – the all-too-common understanding between teacher and student that they won't demand much of each other.


Another thing: among my discoveries was a mid-term test I wrote in 4th year. (Some years had passed: young Graham went through the academic grist mill and emerged as the person I will now refer to as "I".) The course was "The Intellectual History of Modern Japan", taught by a great professor, Barry Steben, who was very probably the person most responsible for my decision to pursue history professionally. His was a straightforward pedagogy. He arrived with a sheaf of notes and an idea and started talking (with you, not at you); you finished each class feeling winded but with a sense of real accomplishment – the kind of intellectually demanding class that more professors would offer if teaching evaluations weren't forever being dangled over their heads.


His tests were hard. Here's a typical essay question: "Describe the structure of loyalty under the Tokugawa order, giving the name in Romanized Japanese of each major node in the authority hierarchy. Explain some of the principles by which this order functioned at its different levels, and make some mention of potential contradictions within the society (or contradictions between the system and the actual realities of Japanese society) that contributed to its collapse at the end of the Tokugawa period. In outlining the structure of the system, you should explain why the concept of direct, unmediated loyalty to Heaven was considered subversive."


A curious thing: I scored 11/12 on the multiple choice and 23/24 on the essay question, for a cumulative mark of 94%. The professor felt that my answer to and explanation for one question was good enough for a 2% bonus, and thus my final mark came to 96%. Well done indeed. That was the highest test mark I ever received in university.


Now, the point, and for anyone who teaches history, it's a sobering though perhaps unsurprising one. Were I to write that test tomorrow, I would unquestionably fail it – with a mark of (probably) zero on the essay and, presumably, results according to chance expectations on the multiple choice, for an average of 6%. So, here I am, a professional historian, and I would fail a history test that I aced fifteen years ago. So much for the standard claim that we study history in order to accumulate facts that will aid us in the present, or the hope of those dour and fusty antediluvians at the Dominion Institute that if a kid can pass a history quiz he can be deemed well educated. As I've argued elsewhere, several conditions would have to be met for this to be true, not the least of which is that we'd actually have to remember what we're taught for more than a few months. Few of us can.
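For the numerically inclined, the arithmetic behind that 6% looks something like this. The essay weighting comes from the test itself; the five options per multiple-choice question is an assumption on my part:

```python
# Back-of-the-envelope check of the "6%" claim, assuming (my guess)
# twelve one-point multiple-choice questions with five options each,
# plus the 24-point essay.
mc_questions = 12
options = 5
essay_points = 24

expected_mc = mc_questions / options          # random guessing: 2.4 points
expected_essay = 0                            # a blank essay scores zero
mark = (expected_mc + expected_essay) / (mc_questions + essay_points)
print(f"Expected mark by chance alone: {mark:.1%}")  # about 6.7%
```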


To illustrate: a few weeks back I was mildly irritated but also unsurprised when not one of twenty students in my 4th year history seminar could correctly explain Confederation to me, and last week I found that none could tell me anything worthwhile about the French Canadian nationalist Henri Bourassa, and this after they all had completed a Canadian history survey. (Shame, senior students, for not looking it up – something you can do much more easily than Graham could at your age.) But the fault isn't really theirs: the overwhelming majority of us simply don't recall facts that we don't regularly require. Now, I happen to believe that a degree of cultural literacy is important. As one friend and colleague of mine has often observed, if you're studying modern European history and don't know what the French Revolution is, you're in trouble. The problem is this: the methods of education that the Dominion Institute types want — methods that center on rote memorization — are the ones least likely to produce cultural literacy in the long term. And they remain at the core of our educational system. Oh, I tell my students that I want them to think about what I'm saying in lecture, but when mid-term and exam time rolls around what I'm looking for is accurate recall of raw information.


What Steben understood is that the curriculum was taught not just for its own sake but also and perhaps predominantly to cultivate a love of learning and scholarly habits of mind. I was fortunate to have half a dozen or so professors who saw it that way, and who made an authentic effort to do more than just pay lip service to this ideal. So, yes, I'd fail that test if I took it tomorrow. But give me two weeks to prepare for it, and I'll beat the pants off Graham, aged 25, without blinking. I have something that he was just beginning to cultivate: an understanding of disciplinary methodology. This is the second most important thing we can teach our students. (The first is ethics.) Knowing the name of the first Prime Minister and the date of Confederation is, well, trivial, by comparison.

Saturday, October 24, 2009

Pedagogy

When people meet me for the first time, it is usually not the Aristotelian sophistication of my intellect, but, rather, the nearly Herculean perfection of my physique that positively arrests their attention. "How can I be more like you?" they ask, before adding, inevitably in a tone of remorse, something along the lines of, "but I dream…"


I kid, I kid. I am, in fact, balding, gap-toothed, approximately porpoise-shaped, and with each passing year new hairs begin to sprout from alarming places where hairs have no business being. This does not, unfortunately, include the top of my head. I am the apotheosis of every fitness magazine's "before" picture.


Having said all this, you are reading, comrades, not just the musings of a PhD, but those of a scholar whose arsenal of accreditations includes the necessary classroom work to be a certified personal fitness trainer. It's true. It is something I did back when I was young and fit, before the crushing gravitational pull of academe shortened and broadened my physique. And, as The Simpsons once said of becoming a police officer, you don't get to be a personal trainer overnight: it takes a solid weekend of training.


At any rate, my interest in matters concerning personal fitness remains, tucked away like a half-finished novel, awaiting better and fitter days. (On the issue of half-finished novels, incidentally: old friends can confirm that, at age 17, I actually wrote a novel about a high school girl who falls in love with a classmate who turns out to be a vampire. I kid you not. But I thought the idea was stupid and clichéd and never pursued it after the 11th grade. Well, it was stupid and clichéd. But it turns out that stupid and clichéd can make you a billionaire.)


Okay: the point. Personal trainers will argue interminably about optimal exercise protocols for their clients: should they do cardio first, and then weights? Or weights first, and then cardio? Or is cardio even necessary, if weight training elevates the heart rate sufficiently and for long enough? And when they do weights, should it be with free weights or machines? What is the correct number of sets and reps, and at what speed should they be performed? The journals of exercise physiology and fitness magazines are full of articles on these issues. But these discussions often ignore the fact that for most people, the real problem isn't deciding on an optimal exercise program, it's that they aren't exercising at all, and that almost any safe exercise program would do them a world of good. Shiny new programs that purport to make exercise fun can attract people for a certain amount of time, but gyms make a killing off of members who pay their monthly dues and never go. What sedentary people need is to be persuaded that being physically active is a vital part of living well.


The relationship to pedagogical debates over optimal teaching methods couldn't be clearer. Open any teaching journal and you'll find articles contrasting this method of teaching to that, and in the past ten or fifteen years most of the discussion has been about how technology should be deployed in the classrooms. But these debates fast reach a point of diminishing or inconsequential returns, when the real issue is that a significant percentage of students are the equivalent of sedentary North Americans who have gym memberships but don't really use them. What they need is any good method of education to inspire them to get learning.


I don't deny that the question of how one teaches happens to be important as far as any given class goes, but the student engagement one achieves through technological wizardry and over-the-top pedagogical theatrics probably lasts no longer than the class itself, and may actually discourage learning in the absence of such wizardry and theatrics. (This is of special significance at a time when the evidence is conclusive that students simply aren't reading as many books as they used to.) The goal of any professor worthy of the name is to produce students who can go on learning after their formal education has ended. For those purposes the question of how to teach is far less important than the question of why we teach and why students should want to learn. The answer, of course, is because a good education, which is one that leaves us with a love for learning and a method for doing it, can be a vital part of what it means to live flourishingly. But when our own pedagogical discussions center on such matters as how Twitter can make learning fun, or what the correct number of PowerPoint slides should be, it brings to mind the parable of the Zen master, pointing towards the moon, who looks down to discover that his students are staring at his finger.


Tuesday, October 13, 2009

Blogging

I like to claim that I live without bigotry, but in fact I harbour a secret prejudice. I loathe opinion columnists, those hacks who get paid to spew at the mouth two or three times a month on issues about which they have no expertise. As a professor of history, I consider it my very great responsibility to get my facts straight before lecturing to my thirty or so students every week, but there are opinion columnists who make a living by pontificating to tens of thousands without, as near as I can tell, giving a moment’s consideration to what they’re saying. Indeed, there are certain well-known columnists who I read with great devotion - not because I like them, but because it has the same appeal as a horrific and bloody roadside car wreck: can’t look at it, can’t look away. They are everything good scholars should not be: certain, smug, self-righteous and, worst of all, consciously contrarian — making arguments that they know to be false because it amuses them to do so.

Looking back over this blog’s fifty columns, I have begun to see that coming off as certain, smug, and self-righteous probably is an inevitable consequence of regularly writing about one’s views. I admit that, in person, I can exhibit these traits, too, but I’ve been making a concerted effort to do better (it would help if others would at least try to act smarter than they are). Here, on this blog, however, matters are different: when sharing one’s opinions without rebuttal it’s hard to avoid coming off as very sure of oneself. And the very curious thing is this: I am not - hence the title of the blog itself. I can, however, claim with good conscience that I have never, at least not on Measure of Doubt, argued a position that I do not believe for the sake of doing so.

At any rate, the whole thing has been immensely therapeutic, and it has lasted much longer than I had anticipated. Much to my own amazement, I have fifty posts — some 41,000 words — under my belt. As I said in one of the earliest posts, I’m doing this for my own sake, not in anticipation of anyone reading it. However, while I don’t keep tracking statistics, gradually I have discovered that people actually are reading this thing: friends, colleagues, enemies, students, and even random passers-by.

So, up for another fifty? I am if you are.

PS

I would like to announce the creation of a second blog: Suspended Judgment, which will be devoted solely to the discussion of teaching-related issues. Already reading Measure of Doubt? Never fear - for now, at least, nothing will appear there that won’t also be here. The point is to cleave off a small part of cyberspace strictly for my professional work.

Monday, September 28, 2009

Dinner

For this, my fiftieth post, I promised to turn to the eternal question, the one that has bedeviled human beings since the first of our protohuman ancestors vocalized a thought, namely: “What’s for dinner?” I wonder how you would feel if I told you that Dawkins, our cat, was on the menu.

Did that thought fill you with revulsion? It did me, because I rather like the little girl, even though she bites our feet and tracks slightly moist kitty litter onto the bed in the morning. But, really. Why not eat her? Why not brain the little sucker, bleed her, skin her, cut her into parts, hang her up to let her age (the meat we eat is decomposing, you know - after all, rigor mortis makes for tough chewing), then joint her, and cook her up in some olive oil, sprinkled with rosemary, sea salt, and freshly ground pepper? Yum yum. Why make pets of some animals, but imprison, fatten up, slaughter, and then chomp down others? Intelligence cannot be the dividing line — she’s not very bright, trust me — so why should cuteness be?

I have been troubled by this question for some years now. On what warrant do we claim the moral right to select certain animals for food and lethal medical experimentation, but not others? I have no satisfactory answer to the question. Clearly the mere fact that something benefits us does not make it moral. I long ago concluded that higher-order primates, our evolutionary cousins, must absolutely be left alone, regardless of the impediments this places on scientific research. We share something on the order of 98 percent of our DNA with chimpanzees, for example, and I can locate no rational defense for performing experiments on them, or for making them perform circus tricks for us, that, were we to be consistent, wouldn’t also apply to certain human beings with cognitive impairments.

Anyway, dinner. About a year ago, my wife and I made a quite conscious decision to become flexitarians - consumers of a mostly vegetarian diet who aren’t dogmatic about occasionally eating meat. We reduced our consumption of meat, poultry, pork, and fish from about six dinners per week to one, and did the same with lunches. (Breakfast was mostly vegetarian anyway.) Some flexitarians would say that we need to go further still, which is why I prefer Mark Bittman’s term “lessmeatarianism” to describe our diet. We did this for a variety of reasons.

First, we are utterly convinced that industrial meat production is cruel to the animals and environmentally damaging. These grounds alone would be sufficient cause to stop or hugely reduce meat eating. Add to that the following fact: the average steak or piece of pork or poultry from the supermarket, shrink-wrapped onto styrofoam, doesn’t taste like anything - it's basically a dead delivery vehicle for spices and sauces. Might as well save the money.

When we do eat meat now, we try to select it carefully from those rare vendors whose practices, we believe, are more ethical and ecologically sustainable, and which result in a better-tasting critter. (We have abandoned our former favourite fast-food, sushi, altogether: either it’s fake, the seafood equivalent of McDonald’s — that bright red tuna is dyed, people — or real but involving endangered fish flown in from the Pacific, in which case it’s environmentally catastrophic.) I realize that some vegans in particular would counter that we are therefore simply reducing the amount of murder that we’re complicit in, but, as I’m constantly reminding my moral relativist students, the number of corpses one generates does matter.

Our second reason for reducing meat consumption concerns matters of health. While we’re convinced that there’s no particular evidence that eating meat generally is bad for your health, the enormous quantity of meat that most Westerners eat almost certainly is, if only because it comes at the expense of other things that are good for us, which is to say, plants. The dismissal of vegetables as "food's food" used to be a joke around our house, but no more, and people who don't eat them would be amazed at how good they taste if you prepare them properly.

And what have been the consequences of all this? Well, for one, we’re better and more imaginative cooks. I’ve lost 14 pounds by this expedient alone. My resting heart rate is down. My blood pressure is down. My cholesterol is down. Our grocery bill is down, too - by about one-third per month.

But, for me, at least, there’s something lacking. An important point of any personal ethics is that you should never ask someone to do something that you wouldn’t be willing, in theory, at least, to do yourself. (Educators take note.) Therefore, I feel that it’s rapidly coming to the point where I’m going to have to get my hands dirty or else give up meat altogether. That means that I either have to try hunting or at the very least witness the slaughter of a cow, pig, or chicken first hand. I made this point earlier when I discussed the death penalty - that people who support the death penalty, it seems to me, have an obligation to support public executions or at the very least must witness an execution sometime. The people calling for blood, I said, don’t get to shield their eyes from it when it’s spilled. And the people eating the flesh of animals shouldn’t get to pretend that it’s something other than what it actually is.

Saturday, September 12, 2009

Psychics

Here’s a headline that you don’t see very often: “Psychic Wins Lottery.” And why not? If their powers of divination are real, then that sort of headline should be as trivial and commonplace as ones about Senators having sex with staffers. Moreover, clairvoyants should be cleaning up in casinos, racetracks, stock exchanges, and on their SATs. But they never seem to. Well, they say, we only use our gift for good, not evil. The powers of prognostication come screeching to a halt when personal gain is involved - and the Psychic Friends Network, I suppose, is a nonprofit organization. But let us take the point as granted. That being the case, why not win the lottery and donate the proceeds to charity?

A couple of years before she died, my mother, on a lark, went to a psychic for a reading, and returned slightly surprised by the accuracy with which the alleged medium could divine the details of her life. Reviewing a tape of the proceedings later – this was provided for an additional fee, of course – she was rather less impressed. Upon a second glance, it was clear that the alleged psychic was doing nothing more than the crudest kind of cold reading and was not even very good at it. Her supposed “hits” were actually generalities or elaborations upon information that my mother had herself volunteered. And would it be grotesque of me to mention that the alleged psychic failed to note a rather big event on my mother’s horizon - the imminent discovery of a nearly 100% lethal form of cancer?

Some psychics, quite clearly, are entirely conscious charlatans. Others, I think, really believe they have some sort of gift. By way of comparison, a British psychologist named Christopher French did an interesting study of dowsers, and demonstrated quite clearly that none of them could locate water at a level above what we’d expect by random chance. A curious thing, though: the dowsers themselves concluded that it was the test that was faulty, not their alleged powers, even though these had conclusively failed them. In any case, no psychic or alleged mindreader has yet managed to demonstrate their abilities under reasonable scientific controls. Nor have they been anecdotally impressive, in my opinion. Not one among America’s psychics gave us a clear warning of the events of September 11th, 2001? None among the mystics in that most superstitious of cities, New Orleans, saw Katrina coming? Is it too much to ask for just one accurate, specific prediction of a forthcoming global event? No, the powers don’t work that way, they say. The spirits of the departed are with us and sending us messages, but, for some reason, the messages arrive in the form of generalities and banalities or in messages left in tea leaves and Tarot cards. They never arrive as clear as day: “Your grandmother is saying, ‘I left the meatball recipe tucked into page 580 of the Joy of Cooking. Also, go with the 5-year, 6% GIC instead of the Mutual Fund. Trust me on this. Weather is terrific - wish you were here. Love, Grandma. P.S., my new e-mail is grandma@afterlife.net.’ ”
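Back to those dowsers for a moment: the statistics behind a study like French’s are simple enough to sketch, if only to show how modest the bar is that they failed to clear. The numbers below are invented for illustration, not taken from his study:

```python
# A chance-expectation test with invented numbers: each dowser picks one
# of five containers (only one holds water) over twenty trials. Guessing
# alone predicts about four hits; the test asks whether six is any better.
from scipy.stats import binomtest

trials = 20
p_chance = 1 / 5          # five containers, one with water
hits = 6                  # a hypothetical dowser's score

result = binomtest(hits, trials, p_chance, alternative="greater")
print(f"p-value: {result.pvalue:.2f}")  # about 0.20: consistent with guessing
```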

Polls show that about half of people believe in psychic phenomena, past lives, reincarnation and the like, but, then, half of people also believe that the sun goes around the Earth, and a Harris poll from 2003 found that more than a third of people believe in astrology. In other words, a lot of people will believe in anything. Allow me to observe that since about 80 percent of people in the United States and Canada are Christians, the simultaneous belief by about half of them in such things as horoscopes and reincarnation and spirit photography means, as I have said before, that many among the allegedly religious haven’t got a clue what their own churches teach.

Belief in paranormal phenomena tends to decline as education rises. People with graduate degrees are much less likely to believe in psychics and astrology and whatnot than, say, your average high-school dropout. I point this out because it reinforces my belief that education tends to cultivate the rational mind. Admittedly, I have met some smart people who have told me some spooky things about psychics that I can’t explain. But I do know that elaborate deception, trickery, or the failure of one’s own comprehension of an event are vastly more probable than the idea that a weirdo with a deck of cards or a crystal ball can violate the physical laws of the universe - yet somehow never wins the lottery.

Nonetheless, some people will say that they do believe in this sort of thing, and that in times of trouble it gives them great comfort to drop some money on a reading by Madame Mysteriouso and her crystal ball. Who am I to rain on their paranormal parade? Fair enough - whatever gets you through the night. But who are they to rain on my rationalist parade, to make me smile and nod while they profess their belief systems without giving me a moment to express mine? The possession of any belief carries with it a vital corollary: you can believe whatever you want, provided you leave other people alone. And if you can’t keep it to yourself, if you absolutely must tell it on the mountain, then you have to be willing to listen to others in return, and sometimes you aren’t going to like what they have to say.

I want more than anecdote. I want real proof - the kind of proof that would pass muster in a peer-reviewed journal. If you tell me that there’s a ghost in your house, I want cameras from multiple angles to capture the moment that the candlestick moves on its own and Newton’s Second Law falls. If you tell me that there are spirits all around us, I want scientific instruments to measure their presence, not some crank with a crystal ball telling me that somebody whose name starts with “M”, possibly Mary or Margaret or Melissa, and who might have had some sort of illness related possibly somehow to the chest area, and who possibly passed in the last few years, is here with us now, and wants to send me some messages that could have come from any greeting card. Please.

Want to really impress me? I’ll pick a word at random from the Oxford English Dictionary, write it down, and seal it in an envelope. Get your psychic to tell me what the word is.
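Incidentally, for any psychic who prefers to work remotely, the sealed envelope has a tidy digital equivalent: I publish a cryptographic hash of my word in advance, and reveal the word itself only after the prediction is in. A sketch, with a word I've invented for the example:

```python
# A digital "sealed envelope" (a hash commitment): the published digest
# reveals nothing about the word, but proves later that I didn't switch it.
import hashlib
import secrets

word = "sesquipedalian"        # my randomly chosen word (invented for this example)
salt = secrets.token_hex(16)   # random salt, so short words can't be brute-forced

commitment = hashlib.sha256(f"{salt}:{word}".encode()).hexdigest()
print("Published before the reading:", commitment)

# Afterwards, I reveal the salt and the word; anyone can verify the match.
assert hashlib.sha256(f"{salt}:{word}".encode()).hexdigest() == commitment
```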

Saturday, August 29, 2009

Addresses

What follows is the preliminary text of an address I will deliver to incoming undergraduates during their orientation the week after next. I believe that the most important quality a teacher can have is empathy. As time passes, however, I wonder more and more if I’m able to empathize with teenagers. How would I, at age 18, have received this talk? I’m honestly not sure. Let me know what you think. I borrowed the bit about students today and when they’re going to be retiring from Ken Robinson’s talk at TED, and I’ll say so in the speech itself when I deliver it. GB.



My Address to the Undergraduates
September 2009

Probably you’ve heard people say that a BA means absolutely nothing - that everyone has one. It’s not true. Only about one in five Canadians has a BA. In your age group only one person in three is in university. Those numbers are going up but it will be a long time before they reach one in two. It’s also not true that a BA doesn’t count for anything in the job market. The job market is tough for everybody. You wouldn’t believe what it’s like for PhDs. A friend of mine with a PhD just spent the summer working in a bookstore. I kid you not. It’s rather frightening, isn’t it? But the statistical fact - and we have study after study to prove this - is that, on average, the higher your education the higher your lifetime earnings. It isn’t always true, but it’s true on average. A person with a BA will, on average, make more money than a person who has only a college diploma, and that person will tend to make more money than someone with only a high school diploma.

But what I want to suggest today is that there’s a lot more to it than that. It has to do with the real value of an education in the liberal arts and social sciences, and it’s a value that is constantly under attack and that we have to do more to defend. And I want you to think of it this way. The purpose of an education isn’t just to help you get a job in three or four years. It’s to help you lead a good life - which means that your education has to serve you over the course of your life, not just in the years immediately after graduation. But the problem is this. Most of you are going to be retiring sometime around 2060. I’ll say that again. Most of you will be retiring sometime around 2060. No one in 1910 could have predicted what the world would be like in 1960, and no one in 2010 can make that prediction about 2060. One thing I can promise you, though: most of the information that you accumulate over the next four years will be forgotten by then. If you take my Canadian history class next year you won’t remember much of it by 2060, and the same goes for most of your other classes. I took a class in medieval literature. It was wonderful. I’d take it again in a second if I could. And I’d need to, because I’ve forgotten every word of it.

So you might ask, then, what’s the point? Some people would say - and you’re going to hear a lot of this sort of thing - that all you learn in university is a lot of useless nonsense that gets you a useless degree. But they’re wrong. We need to stop thinking of an education as merely the accumulation of more facts that will help you get a job. We teach history and philosophy and literature and all of the other subjects not just because they’re rewarding in their own right, but because studying them teaches us how to think, and learning to think well is one of the keys to the good life. This isn’t a new idea: it’s one of the oldest ideas in our tradition. It goes back to one of the very first institutions of learning in the Western world, the Academy in ancient Athens, and to the ancient Greek injunction to “know thyself.”

But it seems like an odd idea, doesn’t it? Teaching someone how to think. Because you’re sitting there saying, “Well, I know how to do that already.” But consider it like this: nearly everyone can move their arms and legs. But that doesn’t mean that they can play professional sports. Playing professional sports takes long and arduous training. It’s the same with thinking. Everyone can do it, but not everyone does it well, and you can learn to be better at it.

If we do our job right - and if you do yours - over the next four years, through the study of the arts and social sciences, you’ll learn to think better and more creatively, to reflect, to ask questions, and to find answers on your own. Because, whatever else happens, those are skills that are needed in the job market, those are qualities that will serve you over the course of your life, and they are the one thing we can say for certain that the world will need in 2060, and that the world needs more of today.

Well, you may be asking, how do I do that? How do I become a better thinker? First, going to a good school helps - and you’ve done that. There are professors here who can stand alongside any teachers and scholars anywhere in the world. Second, you have to take advantage of what the school has to offer. The main difference between high school and university is this: here, you are joining a community of scholarship. Your professors aren’t just teachers - they’re scholars who are actively engaged in research and publication in their field. You are being invited to join that community, and that means your education is for the most part self-directed. We try to point our students in the right direction - whether or not they go there is up to them. And that means that you’re going to have to work hard, and that means putting in a lot of time.

But, fortunately, because you’re young people, time is something you have a lot of. Time is the greatest asset you possess; it is also the one asset you have less of with every passing minute. And so let me leave you today by encouraging you to use your time here as well as you possibly can.

Thank You.

Friday, August 14, 2009

Signs

Last summer, an atheist organization put some rather silly signs on buses, and there was much wailing and gnashing of teeth, as if riders were going to get off at the next stop and torch the nearest cathedral. As I’ll argue in the near future, it isn’t nonbelievers that the faithful need fret about, but the damage being done to religion by some among the faithful themselves. On my way to work — depending on which way I go — I pass probably a dozen or more church signs day after day, and it strikes me that, in many cases, nobody is working harder to keep people from churches than the churches themselves. Does the local corner parish really think that “Exercise your heart: walk with God!” emblazoned on an ugly roadside rental sign is going to get me through the door this Sunday?


Over the course of the summer, I’ve made note of a few such signs. These range from the inane and the unfunny (“Hot outside? We’re prayer-conditioned!”) to the depressingly asinine (“If you’re going in the wrong direction, God allows U-turns”) and, perhaps most commonly, the straightforwardly menacing (“Pray now or pay later”). My favourite in the latter category is this one: “Afraid of burning? Ask Jesus for Son block.” Nothing like the threat of torture to make people see things your way.


From time to time, I admit, I’ve noticed church signs that struck me as vaguely clever. Some years back, a local adult video store put up a sign that read, “membership has its privileges.” The adjacent church countered with, “membership here has its privileges, too.” Well done. Then again, this was the Church of England, which holds that pretty much everybody is saved without effort, so it’s not clear what those privileges are. (The comedian Eddie Izzard has observed that a Church of England inquisition would give heretics a choice between “cake or death”, and then be surprised that there was “such a run on cake.”)


In a summer of looking, I found signs that were coy, some that were smug, some that were straightforwardly hateful, but never once did I see one that was profound. And why not? With one of the great works of English literature, the King James Bible, and millennia of theological thought before them, surely they can do better than something that sounds like it was written by the runner-up for a job at a greeting card company. The threatening ones, at least, had the virtue of sincerity, and I’ll take “Pray now or pay later” over “Rainbows are God painting” any day.


Taken together, the manner in which so many churches sell themselves these days suggests something slightly pathetic and out-of-touch, like those television and movie-trailer ads that use the latest slang to get teens to stop using drugs. (In my day, they told us that staying clean was “rad” and “totally awesome”, and while I never did drugs I was tempted to start, just to hit back at whatever boneheads thought it was a good idea to condescend to me and my friends.)


One needn’t accept the metaphysical assumptions upon which churches are based to recognize their importance as social institutions, making contributions to the conversation about how we ought to lead our lives. It is therefore painful to see so many of them reduced to hawking their wares like the most undignified used car salesman. There is the rule about books and covers, of course, but a lack of imagination and whiff of desperation in exteriors seldom bodes well for the interior contents.


I could go on and on. Just last week I saw, “Christians never meet for the last time” - in my books, at least, that's not a selling point if it includes people who think up slogans like that. Same for, “prayer is the key to Heaven’s door”, since keys can lock doors, too. And just this morning I found a blog — defunct now, sadly — that catalogued crummy church signs. My favourite is: “Heaven is not Burger King. You can’t have it your way.”


Damn. And here I was hoping for extra pickles upon arrival.


Wednesday, July 29, 2009

Orientation

A mentor of mine said recently that I was wrong when I told my students that the purpose of education was to get smarter. No, he said, the purpose of education is to get wiser. The difference, you ask? A smart person knows when he’s right; a wise one knows when to say it.

I raise the point because last week I received an intriguing invitation to address the incoming class of undergraduates during their September orientation. I'm going to do it, but I'm not sure what to say. Certainly not that I’ve always been mildly irritated by "O-Week" – the initiation of first year students to university life. Twenty years ago, I attended the first few events of my own and then headed for the hills. How I hated it. Everything about it – the binge drinking; the compulsory “fun” (two words that should never go together); the uniformity of, well, uniforms; the inane group cheers; the shabby slogans; the unimaginative activities; the inculcation of "school spirit" amongst students who have not yet had time to decide for themselves if their school is any good – struck me then and strikes me now as antithetical to one of the larger purposes of university education: to produce independent thinkers.

On my very first day, twenty years ago, a trio of third-year students, complete with painted faces and enormous excesses of personality, made me sing the national anthem – readers of this blog know my feelings about that song – before relieving me of $20 for my "mandatory frosh kit", which turned out to be a bag of flyers, pamphlets, and junk being freely distributed elsewhere. Well, hell. I left high school hoping to escape precisely that sort of nonsense and precisely those sorts of people, and there I was in the thick of it two months later.

Over the years, I’ve had a great many students, and even organizers of these events, tell me that they found none of it "fun" in the least. But try being the one who says, "Actually, I don’t want to paint my face and wear this t-shirt another day and chant this, well, rather insipid and offensive cheer. And my roommates are binge drinkers who won’t do dishes and think it’s funny not to flush the toilet. This isn’t quite what I was promised at the University Fair, where all the talk was about cultivating the mind and the human spirit."

A few years back, during "O-Week", members of my former faculty's "student fun team" scrawled "Social Science: the Biggest and Best Faculty!" in chalk on the sidewalk outside of our oppressively ugly faculty building. (Some wag responded by writing: "Would you like fries with that?" underneath.) Later, I saw students practicing a cheer on the same theme. But a good education in the social sciences should actually call such conclusions into question. A proper slogan might read: "Objective, long-term consideration of the available evidence leads to the highly tentative conclusion that for a significant portion of motivated undergraduates in the social sciences, their undergraduate experience is, on the whole, intellectually fulfilling. More longitudinal studies are required to determine whether or not social science degrees are of actual utility in the rapidly reorienting job market in terms of both starting salary and lifetime earnings." But try making that into a cheer.

Anyway, there I will be, during O-Week, and when the time comes I hope I'll have the wisdom to not say what I'd really like to say, which is that if you’ve come to university to learn to think for yourself, now is the time to start. In fact, consider O-Week your first test.

Twenty years. Did you catch that part, students? It was twenty years ago this month that I hopped on my bike, rode up to the university, and chose my classes. English, History, Political Science, Philosophy, and Psychology (stupidly, I did not take French). I remember all the profs. They seemed unfathomably old and learned to me, but I know now that two of them were ABDs, and a good deal younger than I am now. Twenty years. I can scarcely believe those words as I type them. And so I've decided that there is one thing I'm going to say for certain five weeks from today, and it's this:

"You’re seventeen or eighteen. I don’t mean to condescend, but it’s hard to appreciate at that age the rapidity of the passage of time. Twenty years ago this week I started university. The intervening years have passed so quickly I can hardly describe it. I still have projects, left over from high school, that I’ve been meaning to work on. For me, there have been good things and bad things in the past twenty years. I wouldn’t go back, even if I could, but I can tell you that I would like an extra twenty years before me. So my essential message to all of you is this: the greatest asset you possess is time. It is also the one asset that you have less of with every passing second. It is therefore urgent that you use your time well. If there's one thing that you derive from your education, I hope it's a better understanding of how to do that."

Time. Hear that ticking sound, students? It gets louder with every passing second.

Tuesday, July 14, 2009

Evaluations

"This course should have had a textbook, with weekly assigned readings." - Comment from teaching evaluation, 2005.

It did. It did have a textbook, and it did have weekly assigned readings. Evaluations often tell us more about our students than about our teaching.


I'll get this out of the way so that nobody thinks it's sour grapes. Despite the occasional barb hurled my way, I get very good teaching evaluations, and you can check if you don't believe me. I've even won a couple of teaching awards, and you can check up on that, too. But I also believe that we could improve the quality of education overnight by abolishing student teaching evaluations – or at least by abolishing the kind that we have now. I also know, as I write these words, that I feel quite vulnerable. As a relative academic newcomer, still in a probationary period, I pay my rent by the good graces of administrators for whom teaching evaluations are a sacred cow. But therein lies the rub: I haven't met a sessional or newly minted full-time professor yet who wouldn't, behind closed doors, admit to lowering standards in exchange for better evaluations. Not to mix metaphors, but teaching evaluations hang like the threat of the slave driver's whip over the indentured servants who carry, if not the bulk, then certainly the dead weight of the university's teaching burden. Every deserved "F", every blunt assessment of a lazy student's performance, every admonishment to stop surfing the wireless web and pay attention, is tempered by that threat. Reading loads get reduced, content gets thinned out, expectations get lowered, and lo, yea and verily, the light of the highest grades is shed upon work of the shadiest character. Don't kid yourselves – evaluations make our teaching worse.


They also rest upon the assumption that the great majority of undergraduates are qualified to say what is and what is not good teaching. On what grounds do we assume this? Have undergraduates lectured? Marked? Led discussions? Studied pedagogy? Any parent of any teenager will tell you that young people seldom are objective adjudicators of adult authority, yet an evaluation from a sincere and diligent student gets no more consideration than one from a full-time party-animal who slept, skipped, or surfed his way through my class, and who departs thinking that Rosa Parks "invented the national parks system." (Yes, it really happened). Well, hurrah for the new academic democracy, down with the hidebound old guard that just doesn't get it. Times have changed: they're customers now, not students, and they have every right to demand customer satisfaction – even the ones who have come to shoplift. Thus do good teaching evaluations become an end in themselves, when the "end", if there is one, is not so much better teachers as it is better students, students who no longer need teachers in order to learn. If my institution's administrators really want to assess my teaching, they're welcome to drop by my class any day. They can show up unannounced, if they like. Or I can show them my class websites and the contents of my teaching portfolio and the articles I've published on andragogy. We could discuss it over lunch. I welcome their insight and their expertise. All this could be done. As it stands, we let students define what constitutes good teaching.


In fairness, this is not about administration. In my working career I've stepped into managerial roles often enough to know that the grass is not greener on that side of the desk, and I certainly wouldn't want to endure the automatic accusations of bad faith that come packaged with the job of Dean, Principal, or President. These colleagues are not my target. My target is the assumptions that lead them to take the present system of teaching evaluations seriously. The point of having better teachers, after all, is to produce better students, and for those purposes student teaching evaluations are not so much a sacred cow as they are a golden calf.