Out of the crooked timber of humanity, no straight thing was ever made
The great Mandela is dead. A political prisoner for 27 years, a courageous fighter against racism and injustice, and finally a great statesman. There is much to remember there and much to mourn. Those who suffered under apartheid, the exiles, those who were active in solidarity overseas: all will have their memories of the struggle. Some of their voices will be heard. But sadly, they have to share a stage with the official voices of commemoration: politicians and others who cared little for the ANC or who actively opposed it. In the UK it is sickening to hear eulogies from the braying Tories, the Bullingdon-club types and ex-members of the Federation of Conservative Students who sang “hang Nelson Mandela” in the 1980s. No doubt, in the US, there will be some prominent Reaganites who utter similar words of appreciation. There’s an implicit narrative emerging that everyone recognized his greatness after 1990. But this isn’t so. The warbloggers and Tea Partiers (and their followers in the UK) were vilifying him when he criticized US policy under George W. Bush or said something on Palestine that deviated from the standard US-media line. Just as with Martin Luther King, we are witnessing the invention of a sanitized version of the man, focused on reconciliation with those who hated him – and who still hate those like him – and suppressing his wider commitment to the fight against social and global injustice.
You know what’s a great album? Radiohead, In Rainbows, that’s what. I was just relistening, and then relistening again, and then again. What a great rhythm section, perfectly setting off the ethereal-cerebral Yorke vocals!
Take just the first track. The 5/4 “15 Step”. Such a funky, danceable track. It’s like, I dunno, the Purdie tsakonikos shuffle or something. Played by a robot. It would be fun to make a music video for it using a loop of the 5/4 Fred Astaire dance from this video. Just speed up the video a little bit to match the beat.
So that’s someone’s homework assignment. Kindly upload the results to YouTube when you’ve got it properly synced. And give us the link.
Rick Perlstein has a great piece on how faculty respond to grad student unions.
He quotes at length from a letter that a professor of political science at the University of Chicago sent to graduate students in his department who are trying to organize a union there.
What always amuses me about these sorts of statements from faculty is how carefully crafted and personal they are—you can tell a lot of time and thought went into this one—and yet somehow they still manage to attain all the individuality of a Walmart circular. No union contract was ever as standardized or as cookie-cutter as one of these missives. The very homogenization and uniformity that faculty fear a union will foist upon their campus is already present in their own aversion.
Anyway, here’s what the good professor has to say (I have no idea which member of the poli sci faculty at Chicago actually wrote this):
First off, let me preface these remarks by saying that when I was in graduate school at Berkeley in the 1990s, I was very active in the graduate student unionization movement. I was shop steward for the political science department for several years and was very active in a three week campus wide teaching strike we held in the fall of 1992. It may also be worth mentioning that I come from a working class family (I was the first and only person in my family to go to college) and I grew up around a lot of issues of collective bargaining. So I’m highly sympathetic to issues of collective action.
The I-come-from-a-working-class-background-my-dad-was-in-a-union-my-aunt-fucked-Walter-Reuther-I-organized-the-workers-at-Flint-this-may-come-as-a-surprise-but-I-actually-am-Cesar-Chavez opening. Check.
That said, I found your co-signed letter to be naive, unconvincing, and, quite frankly, kind of offensive. It is naive in that you seem to really think a union would not change relationships between graduate students and the faculty. I don’t know if either of you have ever been members of a union or worked in a unionized environment, but unions inevitably alter the relationships between union members and the people they interact with, be they management, clients, customers, or what not. The formalization of such relationships is, in fact, the central goal of a union. Your letter says “Our goal is simply to gain a voice in the decisions that affect our working conditions.” Well, these decisions are largely made by the faculty. Thus, if you want a collectivized voice in these decisions, you will be unavoidably shaping your relationships to faculty members.
We make all the decisions around here. Check. (Ask that professor if he even knows how much you make as a TA; they almost never do, though this one seems to. One point for research.)
The union will screw up your very close and personal relationship with your adviser. Check.
Oddly, when you point out that relationships between students and professors at Berkeley, Michigan, and Wisconsin, all of which have unions, are not that different from relationships between students and professors at Chicago, Yale, and Harvard, the former – peer institutions where these professors would be thrilled to place their students – suddenly become tarred with that dreaded word “public,” or, even worse, “state school.”
And when you ask these professors to explain, concretely, why it makes a difference that Berkeley is public and Chicago is private, a thoughtful look will inevitably descend upon them, as they slowly emit the following carefully chosen words: “Well, it’s different at Berkeley. They’re a public university.”
What’s more egregious is the fact that most of the faculty I know do not think of interns [the University of Chicago’s term for teaching assistants] as employees but think of the internship as another educational experience.
You’re students, not workers. Check.
Though this one has a novel twist: we, the faculty, think of you as students, not workers.
And just like that, our hard-bitten empiricist turns into the most starry-eyed constructivist.
And now comes the climax.
Every year there are hundreds of applicants for a very small number of slots to study here. You are very lucky to be here, just as I am very lucky to teach here. When you were admitted to the university, you were not hired. You were offered a spot as a student. The university owes you nothing beyond what it initially proposed and what you accepted. To call yourself an employee and complain about an absence of cost-of-living adjustments, health insurance, or the burdens of being a graduate student…sounds both presumptuous and petulant.
You’re privileged, presumptuous, and petulant. Check.
I, on the other hand, am…just another tenured professor at a fancy school. Saying what every other tenured professor at a fancy school has said to any one of his students who managed to tell him that she wanted to form a union too.
Check check check.
Academia: the herd of independent minds.
Folks are linking to it. The Farhad Manjoo profile of Neetzan Zimmerman, the Gawker writer who picks the linkbait stories like no one else, apparently. I do like the idea that after AIs are better than us at everything else, it might still take a human to figure out whether sloths are in this month.
Donald Barthelme wrote a story about this back in … – turns out it was 1980! “Pepperoni”!
Basically, he envisions a kind of Gawkerization of media. (But without the social media aspect, admittedly.)
A newspaper has found financial success by diversifying its operations. It owns timberlands, mines, pulp and paper operations, and a number of different media, and over-all return on invested capital increases at about 9% a year. But top management is saddened and discouraged, and middle management is drinking too much. Automation has lowered morale in the newsroom. Recently the paper ran the same stock tables every day for a week. No one noticed, no one complained. Some elements of the staff are not depressed. The real estate, food, clothing, and games columns of the paper are thriving. Nevertheless, the Editors’ Caucus has applied to middle management which has implored top management to alter its course. The paper’s editorials have been subcontracted to Texas Instruments and the obituaries to Nabisco. There was an especially lively front page on Tuesday. The No. 1 story was pepperoni – a useful and exhaustive guide. Top management has vowed to stop what it is doing – not now, but soon, soon. A chamber orchestra has been formed among the people in the newsroom, and we play Haydn until the sun comes up.
You can get it in Forty Stories [amazon]. Funny stuff! But the funny thing about the New Yorker summary is that you probably think you are getting a teaser. The first paragraph or something. But it’s actually a condensed version of the whole story. Only, of course, nothing really happens in a Donald Barthelme story. Executive summaries of postmodern literature are weird. I never really noticed that until just now.
UPDATE: Oooh, oooh. Now I’m rereading Forty Stories. From “Conversations With Goethe”:
Critics, Goethe said, are the cracked mirror in the grand ballroom of the creative spirit. No, I said, they were, rather, the extra baggage on the great cabriolet of conceptual progress. “Eckermann,” said Goethe, “shut up.”
I forgot how funny this stuff is.
On the topic of Philosophy’s uneven sex ratios: Gina Schouten has a really interesting paper about stereotype threat as a possible explanation of those ratios (PDF). Her paper is, as she says, an armchair reflection on the hypothesis, but I think it would be useful to anyone wanting to study the causes of the sex ratios empirically.
The reason she has to do an armchair reflection is that Philosophy is a small discipline, and one whose composition does not have huge social consequences, so empirical researchers have little incentive to give it the kind of attention they give the STEM subjects: the subjects for which the stereotype threat hypothesis was formed and tested, and for which treatments have been devised. (Kieran seems to do it as a strange sort of hobby, but I don’t think his discipline promises great rewards for this part of his work.)
Her reflections, though, are interesting and useful. She points out that the main leak in the pipeline is between the first philosophy course and the major. It would be really handy if it turned out that stereotype threat explained the exit of students at this point in the pipeline, because psychologists have devised interventions to counter stereotype threat that are extremely cheap and easy to implement, and seem to be highly effective (see footnote ). We could adapt some of those interventions relatively easily to Philosophy courses. (Then we could continue to be completely insensitive and rude in the way we teach, without suffering the consequence of depriving our discipline of talent!)
Problem is that we don’t have a lot of evidence, and some of the features of stereotype threat seem to be absent. For example, the fact that girls get lower average grades in any given STEM course is prima facie evidence that they are underperforming (one indicator of stereotype threat). I don’t have data on how well girls and boys do in intro level courses, but anecdote suggests that girls do not get worse grades than boys (Ok, ok, I’m writing this, and realize I should just get someone to check for my dept, and I’ll report back if it’s legal to). Of course, “underperformance” means something like “lower performance than the student should perform given his or her prior achievement”, and given that girls at most institutions have significantly higher prior achievement on most measures, they could be getting higher grades than boys and still be underperforming.
Another problem with the idea that stereotype threat explains why girls leave after the first one or two courses is that they just lack the stereotype. After all, philosophy is a found major, and because they have no experience of it, our students lack the relevant stereotypes: girls don’t think that philosophy is the kind of thing that girls do badly, or that others think that, because they don’t know what it is. In so far as they do have beliefs about what philosophy is, those beliefs are usually quite wrong, and we disabuse them pretty quickly.
However, as she points out, their first encounter with the subject might easily introduce a stereotype to them:
The first bit of data we provide is the gender composition of the disciplinary experts in the classroom: the instructors and teaching assistants. Many undergraduates will be exposed to only male experts during their first experience with philosophy. Especially in large classes with multiple teaching assistants, gender homogeneity may constitute a striking bit of data indeed. Students might explicitly infer from the prevalence of male experts that men tend to excel in philosophy, whereas women do not. Or they might reach this conclusion indirectly, noticing that the gender makeup of philosophy resembles that of math and science, and then applying gendered stereotypes from other disciplines to philosophy.
The second bit of data from which students may infer the existence of gendered stereotypes about philosophy is the course syllabus. Even the most well-meaning among us are likely to distribute syllabi on which men’s contributions far exceed those of women. Even if students do not notice this immediately, they are likely during the first few weeks of the course to notice that all the articles they are reading were written by men, if only because we refer to the authors using male pronouns.
The course content during the first few weeks of class often constitutes a third data point for students. Many of us begin our courses with a brief introduction to logic. From this, students may quickly come to believe that philosophy generally is similar to math, or that an important foundational component of philosophy is similar to math. They may infer that being good (bad) at math bodes well (ill) for one’s prospects in philosophy. Students may then apply gendered math stereotypes to philosophy, or simply experience anxiety at the prospect of confirming math stereotypes in the philosophy classroom, much as they have been shown to experience anxiety at the prospect of confirming math stereotypes in physics classrooms.
There’s a lot else in the paper, but I want to focus on this part of it. Among the interventions Schouten suggests are altering syllabuses to include more women authors, and doing our best to ensure that students are exposed to live female philosophers, both by assigning women teachers to classes where they are likely to teach a good number of female students and by using guest lecturers. 
A discussion with a (feminist) colleague who teaches ethics at another institution indicated how difficult the first suggestion – changing the sex ratios of the assigned authors – is on some approaches to philosophy. Said colleague teaches her introductory ethics course using the historical approach (the same approach I used the first time I taught it): but on the historical approach, you are going to teach the historical figures: probably a couple out of Plato, Hobbes, Rousseau, Nietzsche, Marx and Rawls and definitely all of Aristotle, Hume, Kant and Mill. No women. You can force women in – both times I taught it that way I used Mary Wollstonecraft, and my colleague ended her course with Philippa Foot. But neither of these figures has the stature of any of the men I’ve mentioned, and my guess is that the students can pretty much see through the token gesture (and if you teach it chronologically, the women come along pretty late anyway). It is, obviously, much easier to change the ratios if you take a problem-based approach – teach about realism and relativism, the significance of consequences, integrity, virtue ethics, partiality, sexual morality, forgiveness, the significance of death, etc. Keep it mainly focused on contemporary authors and you could easily design a sensible syllabus, using only very high quality philosophical literature, with just one man on it (Williams is who I would find indispensable – I’m not suggesting that it would be a good idea to have only one man on the syllabus, just pointing out how easy it would be to alter the sex ratios). The same is probably true for an intro to philosophy course (I say probably because I’m not sure I could do it, but I’m not sure I could design a good syllabus for an Intro to Philosophy course, period.)
Couldn’t a department experiment with this? My guess is that the historical approach is dominant in many departments as things stand, and that it should be possible to persuade a few colleagues to adopt a problem-based approach with significant numbers of female authors on the syllabus, and see whether the rate at which girls take a second Philosophy course changes (using colleagues who have not changed their approach as a control). Of course, an experiment like this would require finding out what people are actually teaching, which goes against the grain, but might be valuable for other reasons.
 Sorcery maybe?
 I’ll disclose that Schouten herself has been used (by me) in exactly the latter way – for about 3 years she was my go-to substitute for numerous reasons, including that my large classes are largely female and I want those students to see a female philosopher in action (esp. one who is evidently extremely talented both as a teacher and as a philosopher). Also, that both she and I see the limitations of and problems with this suggestion.
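The departmental experiment proposed above is, statistically, a simple two-proportion comparison: the rate at which women in the problem-based sections go on to a second course, versus the rate in the historical-approach control sections. Here is a minimal sketch in Python (the function and every number in it are invented for illustration; a real study would also have to worry about self-selection into sections, section sizes, and multiple years of data):

```python
import math

def two_proportion_ztest(hits_a, n_a, hits_b, n_b):
    """Two-sided two-proportion z-test with a pooled standard error."""
    p_a, p_b = hits_a / n_a, hits_b / n_b
    p_pool = (hits_a + hits_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided tail probability
    return z, p_value

# Invented numbers: 30 of 120 women in problem-based sections take a
# second course, versus 18 of 120 in the historical-approach sections.
z, p = two_proportion_ztest(30, 120, 18, 120)
```

With those invented numbers the difference (25% vs. 15%) hovers right around conventional significance, which is itself a useful warning: detecting a plausible effect would probably take more than one semester of one department’s data.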
So, you’re a controller for a municipal trolley system with a perfect safety record. You’ve just been alerted that one of your tracks, serving a community of 5000 people, has suffered unexpected damage which could cause a trolley car accident involving fatalities among philosophers. You have no budget allowance for this, so the only way of fixing the line is to abandon planned maintenance of another line, serving 1000 people, which would then have to be closed until more funds become available. Presumably, in these circumstances, most people will decide to fix the more important line.
Now, we change the situation. You no longer control the funds for the other line, which are within the jurisdiction of your colleague the Fat Controller, so named for obvious reasons. If you draw management attention to his obesity problem, HR will force him to take leave until he can get his weight within acceptable limits. You will then be given temporary control over his line and the associated budget, which you can divert to fixing the more important line.
What should you do?
More relevantly, has the absence of fatalities, and (IMO) the more plausible situation (you can take out the Fat Controller joke, and substitute some more routine bureaucratic manoeuvre if you want), weakened or improved the usefulness of the thought experiment? Over to you.
I think it’s possible—nay, probable—naw, it is a nigh-certainty that you have not seen one of the best music videos ever made, quite randomly for French electronica duo Justice (they aren’t even electronica, really; they’re sort of a rock band. But not.) It stars a young Snake Plissken (presumably before he is inserted into, and subsequently [SPOILER ALERT] escapes from, New York, in the movie “Escape From New York”). I strongly encourage everyone to go on and click full screen and listen to the song and everything. Dudes this is so fucking awesome. C’mon. Did they actually program a computer from the 1980s to make some of the “high-definition” graphics?
My best friend from middle school and I once wrote a program like that which, by displaying a series of screens on which we had drawn the lines point to point, created the image of a rotating green wire cube on a black screen on her Apple IIc. It took us like four hours or something. More? Her family’s cook made killer shrimp tempura, though, so that was sustaining. And then coffee milkshakes and chocolate cookies for afters. Actually she would ask you egg preferences the night before and bring us breakfast in bed every morning that I ever slept over, which was a billion. With fresh-squeezed OJ. With sugar in the coffee already how she knew you liked it. Mrs. Hong was the shit, but she was prone to get angry and would not let anyone go in the kitchen and make a peanut butter sandwich or anything. Or even a bowl of cereal. Eventually Sacha’s mom had to fire her when Mrs. Hong threw a huge-ass knife at her during an argument over menu planning and it stuck, quivering, embedded a good two inches in the plaster of the sloping ceiling of the back stairs. Even then it was a struggle (internally, for her mom). Mrs. Hong claimed it was a “warning shot” and hadn’t gone that close to Sacha’s mom’s head, which was kind of true but kind of not super-relevant. Anyway, A ROTATING CUBE YOU GUYS RLY! We were siced. Just like how siced I am about this video right now.
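For anyone who wants to recreate the cube without an Apple IIc (or four hours), the underlying math is small: rotate the eight corner points, flatten them onto the screen plane, and draw the twelve edges, once per frame. A minimal Python sketch of the geometry (the names and the orthographic projection are my choices, not whatever we actually wrote in BASIC):

```python
import math

# The eight corners of a cube centred on the origin.
VERTICES = [(x, y, z) for x in (-1, 1) for y in (-1, 1) for z in (-1, 1)]

# The twelve edges: pairs of corners that differ in exactly one coordinate.
EDGES = [(i, j) for i in range(8) for j in range(i + 1, 8)
         if sum(a != b for a, b in zip(VERTICES[i], VERTICES[j])) == 1]

def rotate_y(point, theta):
    """Rotate a 3D point about the vertical axis by theta radians."""
    x, y, z = point
    c, s = math.cos(theta), math.sin(theta)
    return (c * x + s * z, y, -s * x + c * z)

def frame(theta, scale=40):
    """One animation frame: the 2D endpoints of each edge to draw."""
    rotated = [rotate_y(v, theta) for v in VERTICES]
    flat = [(scale * x, scale * y) for x, y, _ in rotated]  # drop depth
    return [(flat[i], flat[j]) for i, j in EDGES]

# Draw frame(0.0), frame(0.1), frame(0.2), ... in sequence and the cube spins.
```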
Another follow-up on the philosophy styles and aggression issue, raised initially by Chris. I meant my first post to be a response, narrowly, not to Chris’ post but to the suggestion that sort of ate the comment thread: trolley problems are symptomatic of philosophers’ taste for intellectual bloodsport. (Not that tying people to tracks and running them over is sporting, mind you.) I didn’t mean to offer up the whimsical innocence of trolley tragedy as proof that philosophers don’t, otherwise, suffer from the sorts of problems that Jonathan Wolff alleges. But I actually do disagree, substantially, with the Wolff piece. Let me try to say how.
But first this needs to be said: the gender gap in philosophy departments is very bad and that can only be chalked up as a major failing of the discipline. I think some of this failure is due to the broader culture. A mode of personal presentation that is valued in philosophy – i.e. a kind of impersonal, argumentative self-confidence and brashness – codes as ‘bitchy’ when exhibited by women. So women are likely to have to strive harder, inwardly, to overcome cultural conditioning. And, outwardly, they will be rewarded by some – but not by philosophers! – regarding them as bitchy. But ‘it’s not our fault our discipline is failing, it’s the culture’ is not a defense I would care to make. That’s more an argument for shuttering the department until such a time as cultural conditions are more favorable for its operation. If we’re so smart, we ought to be able to make a little progress on this problem, in-house.
So far, so not so far from what Chris said, really. In part I’m just rehashing diagnostic points my lovely wife made in this thread. Also, see this comment on stereotype threat. If the culture stayed the same, but the mix got closer to 50/50 overnight, the bad effect of the cultural mismatch would be reduced. A more healthy gender balance might just stabilize, once achieved – so I would hope.
Affirmative action, then? I would say so, yes. Ideally.
An alternative route would be, as Wolff suggests, for philosophers “to act, well, if not in more ‘ladylike’ fashion, then at least with greater decorum?” I don’t think that makes sense. A symptom of things going wrong is this: Wolff, in effect, writes as if he is trying to come up with a better set of norms for teaching philosophy at its worst.
At its worst, philosophy is something you do against an opponent. Your job is to take the most mean-minded interpretation you can of the other person’s view and show its absurdity. And repeat until submission. Certainly the method has the merits of encouraging precision, but at the same time it is highly off-putting for those who do not overflow with self-confidence.
But obviously we don’t want norms for how to do/teach philosophy like jerks. The relevant norm is: don’t.
But let’s think this through by pushing through: what is the best way to teach philosophy, on the assumption that it is to be taught more or less at its worst? That is, what is the least toxic way for people to philosophize, given that they are basically going to be using philosophy as an outlet for the irrepressible urge to be an asshole? I actually think that speech-and-debate is a not-half-bad holding option, if people have to be assholes. (It’s sort of like saying: if you are going to get in fights, you might as well put on a pair of gloves and study ‘the sweet science’.) There is something ‘decorous’ about speech-and-debate, after all. The note cards, the suits and ties. (No, wait, that’s a high school thing.) Academic philosophy is highly ‘decorous’, as fighting styles go. So ‘let’s fight about impersonal arguments’ is maybe the best way to teach philosophy, at its worst? What then is the worst of this worst, by contrast? In my experience, that happens when philosophy is treated more as an intimate problem of the self. In effect, you are saying that the problem with your opponent is not that her arguments are flawed but that she’s somehow inauthentic, as a person. When the impulse to be an asshole, through philosophy, gets loose during one of these agons of authenticity – that is philosophy at its worst, as philosophy of the worst sort goes.
That was confusing. Let me summarize: if you know someone is going to be an asshole about it, and it happens to be philosophy, then the least bad option may be to channel them into a life of finicky arguments about Russell’s theory of descriptions, rather than, say, getting them to read Being and Time closely. But this is a very strange and backwards way to recommend the study of Russell over Heidegger.
Let’s just start over and try to think more optimistically about it, but while acknowledging sad facts of life!
What is the best way to teach philosophy, assuming most people are not just going to be total jerks about it? But keeping in mind that some people are jerks, i.e. they are going to find a way to maximize their opportunities for delivering beatdowns to anyone they can, to feed their amour propre (as Chris B. says) – even if they nominally conform to intellectual norms that are, other things equal, fairly sound?
This is relevant to the gender issue in the following way: what makes life hard for women in philosophy is a few assholes, plus the fact that a few assholes is all it takes. (You may disagree, but this is how it looks to me.) I tend to think that the only really viable strategy is simple affirmative action. If half the population were women, a few male assholes would no longer be all it took.
Because I don’t really think good philosophical norms can be jerk-proofed, in effect.
What is the best way to teach philosophy – and to practice it? I guess I think that there is no alternative to the ‘combat’ mode, insofar as philosophy consists of problems more than solutions. You have to stage ‘fights’, i.e. display disagreements about the basics. If you don’t do that, you aren’t providing a realistic picture.
I think most philosophers don’t stage fights, in this way, without maintaining a degree of intellectual distance from the fight, which is healthy. Philosophers aren’t like debaters who have been given one of those “resolved: abortion is murder” cards, and now they are arbitrarily determined to defend that, to win. But, all the same, once the thing is structured as ‘fights’, there is no way to prevent people from being aggressive in a bad way. That’s just how it goes.
I like Jacques Derrida, I think he’s funny. I like my philosophy with a few jokes and puns. I know that that offends other philosophers; they think he’s not taking things seriously, but he comes up with some marvellous puns. Why shouldn’t you have a bit of fun while dealing with the deepest issues of the mind?
As an accomplished Derrida-disliker, I am obliged to set Moore straight. It isn’t that he told jokes but how that bothered analytic critics. Searle said Derrida didn’t get Austin’s arguments, which was true. But the thing that bothered him – but he couldn’t just say this is what bothered him – was that, as a result, Derrida couldn’t ‘tell it right’. (I said all this somewhere else, long ago. Well, I’ll just say it again.) Reading Austin for the Nietzschean spark is like reading Wodehouse for its Kafkaesque quality.
In general, Derrida is obviously extremely concerned to collect applause for his punchline – coup de don, etc. Which often comes right at the start. And it doesn’t work as a ‘snapper’, not just because he tells it at the start, but also because ‘I’m telling a joke and it’s going to be very funny!’ is painted all over his face.
That sort of obviousness about the fact that you are joking limits the styles of humor you can pull off. Analytic philosophy consists of jokes that can only be told in a more understated style.
The analytic-continental split, in philosophy, is a side-effect of different styles of joke-telling. Continental means not telling jokes: Heidegger. Or: telling Heidegger’s jokes in a French style. Analytic means not telling jokes: logic. Or: telling logic jokes.
UPDATE: The deepest issues of the mind arise equally in both traditions, but that tail can’t really wag both shaggy dogs, as it were.
I really said it all (and more!) in this old post about Occam’s Phaser. Do not multiply zap-guns beyond necessity!
Philosophers aren’t bloodthirsty autists, you silly people. They are mildly whimsical. But that’s important. The genre of the analytic philosophy (Anglo-American, call it what you like) thought-experiment is a mildly humoristic one, in that it tends to Rube Goldbergism. Of course the point is always to solve for variables! You never tie another victim to the tracks, or fatten one up, for any other reason than that he/she is strictly needed in that place or shape. Nevertheless, the more outlandish the set-up gets, the funnier it gets. And I think it’s fair to say that philosophers quietly award themselves style points for (plausibly deniable!) whimsy, above and beyond conceptual substance.
The problem with that, I should think, is that mirth is an emotion that may affect our moral thinking. Specifically, it makes us more utilitarian. See this more recent article as well [sorry, Elsevier paywall]. The trolley scenarios are, or may be, used as intuition pumps for utilitarian purposes. (They may be used for other things, of course.) But it is an underdiscussed fact that they may inherently do so, in part, because trolley tragedies can’t help being a bit funny.
UPDATE: for those who can’t read the experiments, basically watching comedy clips makes you more utilitarian. But the experimenters don’t seem to have considered that the trolley cases themselves are short comedy clips, of a mild sort. I should publish this important finding of mine. Seriously. It’s actually important to think about.
You wouldn’t normally see those three adjectives in a line like that. And the noun they modify is an unusual one as well.
I suppose it’s fair to object that ‘exalted’ isn’t functioning as an adjective in this context. Fair enough.
The BBC has a short article on the background to Judith Kerr’s The Tiger Who Came to Tea. It quotes another children’s author, who suggests that some of the imagery stems from Kerr’s experience as a little girl whose family fled from the Nazis (tigers, like Nazis, are dangerous). This seems to me improbable – the tiger is hungry, but genial, and little Sophie embraces him. But what I’ve always liked about the book when reading it to my children is the ordinary world into which the tiger irrupts. You can tell a lot about the political economy of 1950s or 1960s middle class life in a London flat from reading it. It’s a world where the milkman still comes around every day, and the grocer has a delivery boy. But it’s also a world where a moderately hungry tiger can quickly consume all the food in the flat (the pictures suggest that the cupboard shelves are rather bare) – the grocery’s delivery boy can carry everything that he needs to in the basket mounted on the front of his bicycle, because there isn’t much to carry. Perhaps most strange from the perspective of a modern American child, there’s a limited supply of water – the tiger has drunk so much from the tap that Sophie cannot have a bath.
This isn’t nearly as strange to me, or to Irish people of my generation, as I suspect it is to most middle class Americans. I grew up in a professional family, but many of the things that Americans take for granted (and, as best I can tell from TV, novels, etc., took for granted back then too) would have seemed like the most sybaritic of luxuries. Britain was somewhat better off, obviously, but not by much. David Lodge’s comic novel Changing Places plays up some of the differences between material standards of living in the US and Britain for comic effect, but he really doesn’t have to exaggerate much (when I was a child, we lived for a year in a flat in Darlington, and much of what he describes is familiar). The life I have today would have been unimaginable to me as a child, or even a teenager. Which is all a roundabout way of saying that ordinary life in the US today, for people who are middle class or higher, is a life of extraordinary material abundance, even from the perspective of other Western nations in recent memory. If you’re one of the people enjoying this life, you likely have a great deal to be grateful for. So happy Thanksgiving.
Yesterday, Jo Wolff tackled the question of women in philosophy in his column at the Guardian, writing:
At its worst, philosophy is something you do against an opponent. Your job is to take the most mean-minded interpretation you can of the other person’s view and show its absurdity. And repeat until submission. Certainly the method has the merits of encouraging precision, but at the same time it is highly off-putting for those who do not overflow with self-confidence.
Brian Leiter thinks Jo Wolff is making a mistake:
At the end of the column, he runs together two issues that should be kept separate: the combative nature of philosophy and how one should treat students. Professor Ishiguro’s approach [see the Wolff column] on the latter seems the right one, but that is independent of whether philosophy as practiced among peers should, or should not be, combative. Insofar as truth is at stake, combat seems the right posture!
I disagree, unless there’s some good reason to believe that combat leads to truth more reliably than some alternative, more co-operative approach. (Does the adversarial system of the US and English courts lead to the truth more reliably than the inquisitorial system?) Sometimes combat might be the right stance, but treating it as the default mode for philosophical discussion leads far too often to Q&A sessions that aim at destroying the opponent and bolstering the amour propre of the aggressor. Where the aim is victory, all kinds of rhetorical moves can prove effective: there’s no reason to think that truth will emerge as a by-product. I think a relatively common occurrence is that people on the receiving end of an aggressive battering lose confidence (in themselves, or in a good idea). Sometimes people should defer to criticism, of course, and sometimes people should make criticism in forthright terms, and Brian is right to value that. But frankly, a lot of what goes on in philosophy seminars is just damaging.
What I’ve said so far is independent of the gender issue. I realize that some women in philosophy are uncomfortable with the link between gender and philosophical style, and there’s certainly no reason to think that merely being robust and forthright in argument is specially male. But a lot of conduct in philosophy goes well beyond the robust and forthright and tips into the straightforwardly arseholish, and there may be a selection effect in favour of women in the profession who are able (if not willing) to endure that. A lot of people in the academy – both men and women – suffer from “imposter syndrome”. But it turns out that women are more likely than men to suffer from this, and there is no correlation with actual ability. An atmosphere of systematic reinforcement of such a widespread anxiety is not a good one, and it might be, because of its uneven distribution by gender, just one of the several mechanisms that exclude women.
Pope Francis’s new Apostolic Exhortation, Evangelii Gaudium, has been getting some attention today, mostly thanks to its reiteration of some long-standing Catholic doctrine on social justice and the market. So, here is a quiz to see whether you can distinguish statements by Pope Francis from statements by Karl Marx. I figured someone was likely to do this anyway, so why not be first to the market? It’s fair to say that the Pope and Karl Marx differ significantly on numerous points of theory as well as on what people asking questions at job talks refer to as the policy implications of their views. So I don’t think this quiz is very hard. At the same time, I sort of hope it will be picked up, stripped of this introductory paragraph, and circulated as evidence that the Pope and Marx agree on pretty much everything.

Questions!
1. In a similar way, by raising dreams of an inexhaustible market and by fostering false speculations, the present treaty may prepare a new crisis at the very moment when the market of the world is but slowly recovering from the recent universal shock.
2. … society needs to be cured of a sickness which is weakening and frustrating it, and which can only lead to new crises.
3. In this play of forces, poverty senses a beneficent power more humane than human power. The arbitrary action of privileged individuals is replaced … Just as it is not fitting for the rich to lay claim to alms distributed in the street, so it is also in regard to these alms of nature.
4. Yet we desire even more than this; our dream soars higher. We are not simply talking about ensuring nourishment or a “dignified sustenance” for all people … for it is through free, creative, participatory and mutually supportive labour that human beings express and enhance the dignity of their lives.
5. … the limitless possibilities for consumption and distraction offered by contemporary society. This leads to a kind of alienation at every level, for a society becomes alienated when its forms of social organization, production and consumption make it more difficult … to establish solidarity between people.
6. Today everything comes under the laws of competition and the survival of the fittest, where the powerful feed upon the powerless. As a consequence, masses of people find themselves excluded and marginalized: without work, without possibilities, without any means of escape.
7. In this system, which tends to devour everything which stands in the way of increased profits, whatever is fragile … is defenseless before the interests of a deified market, which becomes the only rule.
8. Inequality eventually engenders a violence which recourse to arms cannot and never will be able to resolve. … Some simply content themselves with blaming the poor and the poorer countries themselves for their troubles; indulging in unwarranted generalizations, they claim that the solution is an “education” that would tranquilize them, making them tame and harmless.
9. The worldwide crisis affecting finance and the economy lays bare their imbalances and, above all, their lack of real concern for human beings; man is reduced to one of his needs alone: consumption.
10. Solidarity is a spontaneous reaction by those who recognize that the social function of property and the universal destination of goods are realities which come before private property.
(Scroll down for answers.)
I think I’ve written before about creative procrastination, but I can’t immediately find it, so I’ll restate my idea here. Whenever you have an urgent deadline, the desire to procrastinate becomes irresistible. Rather than trying to resist it, the optimal response is to succumb, but to have a list of necessary but non-urgent tasks at hand (as I’ve argued before, there’s no need to prioritise non-urgent tasks: just divide them into those you are going to do and those you aren’t, then do them in whatever order suits you). Now, the guilt induced by the deadline should stop you goofing off on FB, killing boars or whatever, so the desire to procrastinate will force you to tackle the jobs on your list. Then, as the deadline approaches, you will finish the job. This works even better if (as is usually the case) an extension of the deadline is possible, but you can conceal this knowledge from yourself until the last possible moment. That way, you get a second round of creative procrastination, plus enough time to do the main job properly.
That’s all revision. My new idea for today links this to my long-standing advocacy of word targets. I try to write 500 to 750 words of new material every day. 500 words a day might not sound much, but if you can manage it 5 days a week for 40 weeks a year, you’ve got 100 000 words, which is enough for half a dozen journal articles and a small book. So, that’s my target. If I haven’t written enough one day, I try to catch it up the next day and so on.
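For anyone who wants to see the word-target arithmetic spelled out, here is a toy sketch in Python of both the yearly total and the catch-up rule (any shortfall one day is carried forward to the next). The daily figures in the example are illustrative, not anyone’s real writing log:

```python
DAILY_TARGET = 500  # words per day (lower bound of the 500-750 range)

def yearly_output(words_per_day=500, days_per_week=5, weeks_per_year=40):
    """Total words written in a year at a steady daily rate."""
    return words_per_day * days_per_week * weeks_per_year

def running_deficit(daily_counts, target=DAILY_TARGET):
    """Carry any shortfall forward: each day's effective target is the
    base target plus whatever was missed on previous days."""
    deficit = 0
    for written in daily_counts:
        deficit = max(0, deficit + target - written)
    return deficit

print(yearly_output())                   # 100000 -- the "small book plus articles" figure
print(running_deficit([500, 300, 900]))  # 0 -- day three's 900 clears day two's 200-word shortfall
```

The carry-forward rule is the whole trick: the target never resets, so a bad Tuesday simply raises Wednesday’s bar.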
And here’s the link. If you’re involved in a big project like a book, or a PhD, there aren’t really any deadlines. But, if you make a rule of being caught up on your word target at the end of the week, you create an automatic deadline for yourself. While doing your best to avoid dealing with this deadline, you create an automatic opportunity for creative procrastination, during which you can deal with admin tasks, write blog posts, sort out your reference system and so on.
Obviously, everyone is different. But this has certainly worked for me and, as a by-product, for CT readers (at least, those of you who don’t just skip over my posts to get to the good stuff). The marvels of creative procrastination have produced hundreds of blog posts, some of which have even turned into books.
I knew folks on the right were going to be upset about the Iran deal, but isn’t this a bit much? The Corner has gone Everyday-is-like-Munich full neocon.
OK, maybe there’s no point in even bothering, but just look at this post, “Munich II”, by James Jay Carafano (vice president of foreign- and defense-policy studies at the Heritage Foundation.) He is banging on about how ‘realism’, presumably in the I-R sense, opposes this deal. But, even as he’s trying to make the case, he can’t help inadvertently making the case that the other side has got the better realist case.
What does Carafano think we should hold out for? “Any diplomatic deal that is not grounded in shared interests or a common sense of justice will surely fail.”
That just means there’s no possibility of any diplomatic deal. Ever. If there’s any truth to realism. States are self-interested. Iran wants what’s good for Iran. The US wants what’s good for the US. There isn’t any operative sense of justice that overrides all that. So we’re done. This is Realism 101, right?
Of course, the realist strategy for dealing with this familiar dynamic is … well, you might try to arrange things so that, even though everyone is self-interested, it’s a positive-sum game and everyone can walk away from the table a little better than they were, ex-ante. That is, you manufacture shared interests.
Carafano singles out this realist strategy for especial mockery as the ‘magic button’ approach. It’s absurd to suppose there could be any button so magic that pushing it would cause Iran to “realize that the benefits of collaboration and transparency outweighed the burdens of isolation and confrontation.” Because it’s absurd to suppose that Iran can be modeled as a minimally rational, self-interested actor on the international stage, concerned primarily with its own survival – and above and beyond that, with security and power.
What kind of pie-in-the-sky, idealistic, let’s-just-be-friends theory of international relations would model states as self-interested, rational actors?
Maybe he would say he didn’t mean ‘realism’ in the I-R sense. But it seems more likely that this is neocon thinking that doesn’t know how perfectly anti-realist it is.
Four days ago, Zbigniew Brzezinski tweeted this:
Seemed like a crazy read of what Brzezinski said, but it’s the sort of thing I’ve come to expect from Goldberg. I didn’t give it a second thought.
But Logan Bayroff at J Street did. J Street, in case you don’t know, is a liberal-ish Jewish group in the US that’s pushing for a peace settlement with the Palestinians. They call themselves “the political home for pro-Israel, pro-peace Americans.” Not my people, but what are you going to do?
Anyway, Bayroff went after Goldberg:
— Logan Bayroff (@Bayroff) November 24, 2013
Goldberg got furious and this morning tweeted this:
And when he was asked what this rude J Street employee had done to piss him off, Goldberg explained:
— Jeffrey Goldberg (@JeffreyGoldberg) November 24, 2013
Okay, now that we’ve introduced our cast of characters let me say this:
By what authority does Jeffrey Goldberg arrogate to himself the right to defend (with the implicit threat that he might not in the future) someone or some group’s “place in the Jewish tent”? Who elected him Pope to excommunicate or not some heretical Jew? Who made him Defender of the Faith?
(Let’s set aside that the reason he gives for wavering in his commitment to keep J Street in the fold is that one of its impertinent employees had the audacity to criticize him. For reasons that seem perfectly legitimate.)
Isn’t that sort of talk—I’ll defend (or won’t defend) your staying within the tent, but only if you’re well behaved—sort of, well, un-Jewish? I know there are more theologically minded Jews than myself who think they can dictate how one ought to be Jewish, who is or isn’t a Jew, but that’s the point: they ain’t Jeffrey Goldberg. He’s a self-described liberal Zionist, not an Orthodox Jew. But in his cosmos, Zionism is the religion, he’s the rebbe, and the rebbe gets to decide if you’re in or you’re out, if you’re kosher or treif.
Zionists like Goldberg like to style themselves as open, hip, and pluralist. They think what distinguishes them from the Black Hats is their embrace of secular modernity. But as you can see from this incident and the one I discuss below, Zionism has not only made these types intolerant and anti-pluralist; it has turned them into Popes and Inquisitors, enthralled with their imagined power to exile and excommunicate.
Under their watch, one of the most important questions that lies at the heart of the Jewish tradition—What does it mean to be a Jew?—gets taken off the table. Because we already know the answer: support for the State of Israel. If you do, you’re a Jew in good standing; if you don’t, you’re not.
That’s what nationalism—especially nationalism hitched to a state—does to people. It makes the Goldbergs of this world think they can give Jews a passport or take it away. Well, guess what, Rabbi Goldberg: you can’t. I don’t need you defending my right to be in the Jewish tent because that’s not within your, or any other Jew’s, power to decide.
I wish Ben-Ami had jumped in in response to Goldberg and said, “Fuck you, you don’t get to decide who’s a Jew or not.” Instead, he tweeted (never has that word seemed more appropriate) this:
Like I said, not my people. What can you really expect from a group that needs to surround their support for peace in the Middle East with bumper stickers affirming they’re pro-Israel and pro-America?
Anyway, as I said above, this isn’t the first time Goldberg has said something un-Jewish in the name of the Jews. About two years ago he made a similar move, and I wound up writing a lengthy blog about it. Here it is…
• • • • •
As someone who identifies as Jewish—who periodically goes to shul, celebrates some if not all of the holidays, and tries at least some (ahem) of the time to get off the internets for shabbos—yet opposes Zionism, I thought I’d heard all the charges that have been and could be made against me and my tribe. But yesterday, Jeffrey Goldberg, the Atlantic writer and one of the leading voices of liberal Zionism in this country, threw a new one into the mix.
In my experience, those Jews who consciously set themselves apart from the Jewish majority in the disgust they display for Israel, or for the principles of their faith, are often narcissists, and therefore seem to suffer from an excess of self-regard, rather than self-loathing.
What caught my eye (really, my ear) was not the evident wrongness of the claim, starting with the lazy assumption that those who oppose the State of Israel are somehow setting “themselves apart from the Jewish majority.” It was that “excess of self-regard.” Whether Goldberg knows it or not, or was conscious of it when he used it, that charge has a pedigree in Jewish—or rather anti-Jewish—history.
To be sure, there is within Judaism an injunction, and more generally an ethos, not to separate oneself from the Jewish people. The Wicked Son at the Passover Seder asks, “What does this service [or ritual or story] mean to you?” His wickedness lies in that final hissing “to you”: he refuses to acknowledge that in addition to being an “I” he is also a “We.” Verses in the Pirkei Avot enjoin us not to hold ourselves apart from the community. There’s also a Halachic stipulation that for the sake of practicality and communal living, Jews must abide by legal rulings regarding everyday ritual and civil law. Despite the many differences and disagreements it generates, Judaism is not really a religion of individuals or individualism; it is the religion of a people. Am Yisrael: the people of Israel.
But, as far as I can see, there is little in the tradition that views the dissenter as somehow haughty or superior, narcissistic or self-regarding. And while friends more knowledgeable than I joke that one can always find evidentiary support in the Talmud for some claim or other, this particular one would probably require some digging. If it exists, it’s a subterranean position. And how could it not be? For every two Jews, goes the old saw, there are three opinions. If every unorthodox statement were treated as a symptom of overweening arrogance or pride, well, there’s not enough room in the universe—let alone the Talmud—to contain such a lexicon of self-regard.
In fact, the only document I can think of that even approximates such an accusation is Annie Hall. Think of those scenes where a young Alvie Singer presses his existential concerns (“The universe is expanding”) upon his parents only to be told by his mother, “What is that your business?” and, later, “You never could get along with anyone at school. You were always outta step with the world.” Or perhaps that scene in Hannah and her Sisters where Mickey (the Woody Allen character) tells his parents he’s thinking of converting to Catholicism because he’s afraid there’s no God or life after death, and his father replies, “How do you know?” and his mother, less indulgently, “Of course there’s a God, you idiot! You don’t believe in God?” Aside from these hints that the questioner of—or deserter from—the faith is somehow punching above his weight (and, of course, the characters here are speaking the language of parents rather than Judaism), it’s hard to find this specific rhetoric of accusation that I’m talking about, in which the dissenter is impeached as a presumptuous snob, in the Jewish tradition.
But if you’re not in the mood for digging deep, if you want quick and easy access to that rhetoric, simply put your hand into the garbage can of anti-Semitism. For it is there, in the rubbish of ancient and modern history, that you’ll find the accusation that the Jew who refuses to conform to the ways of the dominant culture—with the culture now understood, of course, to be non-Jewish—is smug and superior, that he assumes he knows better and believes he is better than the majority. Because how else are we to understand a minority insisting upon its own ways over and against the majority?
Robert Wistrich’s A Lethal Obsession: Anti-Semitism from Antiquity to the Global Jihad is a veritable compendium of such accusations, from ancient pagans to Vichy officials to Brezhnev’s Soviet Union to the modern Arab world (making full allowances, as Wistrich does not, for the distinction between anti-Semitism and anti-Zionism). Over and over, one hears the complaint from the anti-Semite that the Jew has set himself up not only in opposition to, but in judgment upon, the dominant culture. And that in doing so he has presumed himself to be better than that culture.
Of course, that accusation often preys upon the complicated—and by no means uncontroversial—notion of chosenness within the Jewish tradition. Bernard Lazare, the Jewish radical who wrote the first genuine history of anti-Semitism just before the Dreyfus Affair (and whose work had a tremendous influence upon Hannah Arendt), offered a version of this claim. In Wistrich’s lucid paraphrase:
Bernard Lazare was convinced that the “revolutionary spirit of Judaism” had been a major factor in anti-Semitism through the ages. Abraham, Moses, Jesus, and Karl Marx were prime examples of Jewish iconoclasts of their time. The Jews, by creating an intensely demanding God of morality and justice whose stern monotheism brooked no toleration of alien deities, threatened the natural order. The prophetic vision of an abstract transcendent Godhead above nature, a deity without form or shape, who had nonetheless created the universe and would in the fullness of time redeem all mankind, was disconcerting, powerful, and mysterious to the pagan world. It was rendered especially irritating by the Jewish claim to be a “chosen people,” a “kingdom of priests,” and a ferment among the Gentiles. Anti-Semitism could best be seen as an instinctive response by the nations of the world to this provocation—to the uncanny challenge of an eternal people, whose refusal to assimilate defied all established historical patterns. Hatred of the Jews was often combined with fear, envy….
Though it seems quite wrong to me to locate the sources of anti-Semitism in anything Jews do or say—and that’s not really Lazare’s point, I don’t think—there can be no doubt, as Wistrich shows, that anti-Semites have consistently chosen to interpret the Jewish insistence on separateness and difference (leave aside the more difficult notion of chosenness) as a bid for superiority.
Conversely, and ironically, for writers like Tom Paine, it is precisely this insistence upon setting themselves apart that has been not only the glory of the Jewish people but the guarantor of whatever is democratic and egalitarian in their culture. In Common Sense, Paine takes up a lengthy disquisition on the question “of monarchy and hereditary succession.” There he makes a special point of noting that the Jews were originally without a king and were governed instead by “a kind of republic administered by a judge and the elders of the tribes.”
But the temptation to monarchy dies hard, Paine observes, even among the Jews. And the reason it dies hard is that the desire to conform, to abandon one’s ways in the face of outside pressure, dies harder. So frequently does Paine recur to the lures and dangers of imitation and conformity—”Government by kings was first introduced into the world by the Heathens, from whom the children of Israel copied the custom”; “We cannot but observe that their motives were bad, viz. that they might be like unto other nations, i.e. the Heathens, whereas their true glory laid in being as much unlike them as possible”—that we might say for Paine (at least in Common Sense; Age of Reason sounds a different note) it is the Jew’s refusal to conform that most guarantees his democratic and egalitarian credentials.
For Jeffrey Goldberg, it’s the reverse. It’s the Jew who sets himself apart from the dominant culture—Goldberg’s referring to mainstream Judaism, of course, rather than the culture as a whole, but the structure of the argument is the same—who is making a bid for superiority. And in this respect, Goldberg is aligning himself with neither Judaism nor democracy but their antitheses.
It’s ironic that what started this whole discussion, for Goldberg and excellent journalists like Spencer Ackerman, was the use of the controversial term “Israel-Firster” by critics of Israel and the ensuing debate over whether or not it’s anti-Semitic. I don’t have much of a dog in that fight: I’ve never used and would never use the term, not because it questions the patriotism of American Jews but because it partakes of the vocabulary of patriotism in the first place, a vocabulary I find suspect and noxious from beginning to end. Even so, I’m amazed that someone who is so quick to find anti-Semitism in the words of others is so careless about its presence in his own.
I’m curious – for teaching purposes! What are the Plato bits that you especially like, that aren’t any of those usual bits that always get taught in Intro Philosophy? If you could include one unconventional Plato selection – whole dialogue, or chunk of one – in an intro philo course, what would it be, and why? (In short, this thread is your opportunity to get all indie about Plato. “I only read dialogues that don’t exist.” Please, let your hipster flag fly. You’ll probably sound like a Straussian.)
Bonus exercise: write a commentary on a Plato dialogue in the style of a Pitchfork music review.
I wrote this in late September 2011, to explain to my circle of friends why I thought we were in the state we were in. It’s by way of background to my latest post on secular stagnation, so I’ve disabled comments on this one.
“When you write down all the good things you should have done, and leave out all the bad things that you did do, that’s memoirs” – Will Rogers
“Secular stagnation” is doing the rounds as a theory of why we’re in the mess we’re in, after this Larry Summers talk, which Paul Krugman is claiming basically summarises ideas that he’d also been talking about for the last few years. I am not sure about the extent to which anyone can claim priority on this though – as Krugman says, Summers is basically giving a clear expression of a set of ideas which have been ubiquitous for a long time, to the extent that I was making jokes along that line, ten years ago. I will follow Krugman in saying that I also had been thinking about a similar explanation of things since 2009, set out in cursory form here and in greater detail here.
Basically, the thesis is that since about the mid-1990s, it has been the case that it has only been possible to achieve anything like full employment in America during periods when the private sector has been chronically over-consuming and increasing its debt levels. The “natural rate of interest” consistent with full employment has been consistently negative all that time, and since there are good theoretical reasons to presume that the natural rate of interest has some relationship to the natural rate of economic growth, this might be saying something rather depressing about the underlying growth potential of the developed world’s economy. And so on, and so forth.
Now it’s an interesting question, although not one on which I find myself with anything to say, as to whether we are stagnating secularly. But the thing I do want to address is that, in the way in which the issue is being discussed historically, there is a lot of rewriting of the recent past.
Right from the start, you can see that there has been a lot of semantic drift in the word “bubble”. It once referred to a specific model of how prices could depart from fundamentals in a rational expectations model; it then came to refer to any general inflation of securities valuations; and now Summers and Krugman appear to be using “a succession of bubbles” to mean “any period during which personal gross debt increased based on rising asset values”. As an opponent of linguistic inflation, I’m already prejudiced against this way of thinking about the economic history of the last two decades. But in describing the growth in debt as if it were a purely exogenous phenomenon, due to nothing other than animal spirits and irrationality, there’s a really dangerous kind of mistake being made.
So here’s my version of the economic history of the pre-crisis years, and I suggest it’s at least as consistent with the facts as the “secstag” hypothesis and has the advantage of having significantly fewer unexplained factors in it:
Policy is now in a box. The Federal Reserve has got a basically deflationary economy to deal with, and a fiscal policy constraint that is non-negotiable. It has to set policy to deal with this demand shortfall. This gives you all of the observables of the “secular stagnation” hypothesis without having to take any particular position on the question (unrealistic on the face of it; remember the Internet?) of whether the 1997-2005 period was one of technological stagnation. Back to the narrative.
Oh yes, and to put the final cap on it, Alex Harrowell reminded me the other night that, as Doug Henwood demonstrated, in the USA the majority of that consumption growth was actually healthcare, so the consumption was basically non-discretionary in nature. Nice touch.
My point here is that none of this was unknown at the time. The US economic policy establishment was aware that it was accommodating China and NAFTA, and aware that the tool of demand management was consumer spending. They might or might not have been aware that the consumer spending was financed by borrowing against housing wealth, but if they weren’t, they thundering well should have been. They got a structural increase in personal sector debt because they wanted one and set policy in order to create one. It’s no good calling it a “bubble” or a “puzzle” now that the shit’s hit the fan.
And so, welcome to the world you made, guys. These are the consequences of globalization, entirely predictable and in fact predicted (by Dean Baker, among others). The final conclusion is probably the same as if it were a mysterious secular stagnation: fiscal policy. But the need for fiscal policy is so obviously correct a conclusion that more or less any economic argument is going to end up there unless it has major logical or accounting errors. But really – there is no need to tell ourselves ghost stories about animal spirits. There’s no puzzle here. We got this outcome because we wanted it.
 As you can see, that link goes to the blog which I decided to make non-public, because people were being nasty to me. I’ll put up the text of the post in question on CT in a short while.
 Of course, conversely, someone who was pernickety about their Keynes might note that there are very good theoretical reasons to believe that there is no reason at all to suppose that the equilibrium rate of interest in the market is going to be the same number as the full employment rate of interest. The case in which the two coincide is a very “special” case; it’s precisely the fact that there’s no reason to suppose they will which caused Keynes to write his “General Theory” and indeed to give it that title.
 Doesn’t look like a proper word but I think it is.
 Oh dear.