It’s probably past time I made such an announcement. Not that there’s been anything wrong with what Bill B. has been up to on his own lonesome for some time hereabouts. But the Valve was never intended to be Bill Benzon’s personal blog. Best he relocate to his own digs if it’s to be a solo operation.
I want to keep the site up. I would be sad if the archives disappeared. Lots of good stuff. But keeping the place up? ... well, we’ll see. Maybe I’ll get around to organizing some of those good old book events again soon. Best of luck to all our past authors, wherever they have wandered to by this point. And thanks to all the readers and commenters who made it such fun while it lasted. But nothing lasts forever.
I first heard of The Valve back in late Spring or early Summer of 2005 when I caught wind of an upcoming discussion of an anthology entitled Theory’s Empire. The idea, it seems, was to look at what had by now become capital “T” Theory and, perhaps, hope! hope! lay it to rest.
So I showed up in early July and joined the commentariat. I soon found myself playing an unaccustomed role, that of the old-timer who says, “Now, back in my day . . .” Now, I’m not that old, and it wasn’t that long ago in calendar years, but it WAS before the personal computer and internet. I was an undergraduate at Johns Hopkins when the French landed in 1966 and I’d lived the ferment in literary studies that had been occasioned by those ideas. I also encountered developmental psychology, through Mary Ainsworth, and psycholinguistics through James Deese. And so I added Jean Piaget and Noam Chomsky to my repertoire while studying semiotics and comp lit in English translation under the tutelage of Dick Macksey, one of the organizers of the (in)famous Structuralism conference.
The upshot of all that is that I shuffled off to Buffalo and joined the cognitive revolution under the tutelage of David Hays in the Linguistics Department. But my degree was in English and my dissertation was on cognitive science and literary theory. I’d decided that structuralism led, not to post-structuralism, deconstruction, or postmodernism, but to cognitive science. The profession did not agree with my assessment of the situation; we parted ways several years after I got my degree.
But I kept up my intellectual program anyhow, publishing this and that, here and there. I showed up at the Theory’s Empire event to see how things were going in the literary academy and to engage that part of it that might, I thought, just might, be looking for something new. Later in that year I was asked to try out for the masthead and was accepted early in 2006. Since then The Valve has been the closest thing I’ve had for an institutional home base.
Up into the second half of 2010 or so The Valve functioned as a vigorous group blog, more vigorous at some times than others, but strong and interesting. In the Spring of 2010 I set up my own blog, New Savanna, mostly so I could post on a wider range of topics than I felt appropriate to The Valve. By the end of 2010 it was clear, however, that, as a group effort, The Valve was dying. I continued to post here, as well as at New Savanna, because it was easy enough to do and because there seemed to be constant traffic from somewhere out there in the ether.
All things change, however. John Holbo, The Valve’s progenitor, informs me that it’s time for The Valve to go the way of the Phoenix and be reborn. To do that, however, it must first die. Really and truly. Dead.
And so I will cease posting at The Valve in order that this plot of cyberspace may lie fallow for a while.
It was a good run.
I’d like to thank John and the other Valvsters for the good intellectual company and you, our readers, for your kind and generous attention.
I remember browsing through the encyclopedia when I was young. We had an Americana rather than the Britannica (which just announced that it will cease print publication), and I would spend hours reading from one entry to another. The volumes were heavy and substantial and the set of them gave a visible and tactile sense of complete knowledge. That sense, of course, was an illusion, but it was there.
The Wikipedia affords a different experience. Of course, I come to the Wikipedia as a mature adult with a great deal of intellectual sophistication; how it would appear to a bright 11-year-old, I don’t know. But there’s no way to get a sense of all-knowledge-complete from the Wikipedia; you can’t see it on the shelf, you can’t handle it volume by volume. It just trails off into the ether, in many, many different directions.
There is, of course, the question of accuracy and authority. I know that comparisons have been done between Wikipedia entries and, I believe, Britannica entries. And Wikipedia has come out well in these comparisons. But that’s not all there is to IT.
By IT I mean both authority and, well, accuracy, I guess. They’re closely related, but not quite the same. In the case of conventional encyclopedias, such as the Britannica, the authority resides in the institution itself. Where the entries themselves came from, who wrote them and what sources they consulted, that’s pretty much a mystery.
From today’s New York Times:
In an acknowledgment of the realities of the digital age — and of competition from the Web site Wikipedia — Encyclopaedia Britannica will focus primarily on its online encyclopedias and educational curriculum for schools. The last print version is the 32-volume 2010 edition, which weighs 129 pounds and includes new entries on global warming and the Human Genome Project.
“It’s a rite of passage in this new era,” Jorge Cauz, the president of Encyclopaedia Britannica Inc., a company based in Chicago, said in an interview. “Some people will feel sad about it and nostalgic about it. But we have a better tool now. The Web site is continuously updated, it’s much more expansive and it has multimedia.” ...
Sales of the Britannica peaked in 1990, when 120,000 sets were sold in the United States. But now print encyclopedias account for less than 1 percent of the Britannica’s revenue. About 85 percent of revenue comes from selling curriculum products in subjects like math, science and the English language; 15 percent comes from subscriptions to the Web site, the company said.
About half a million households pay a $70 annual fee for the online subscription, which includes access to the full database of articles, videos, original documents and to the company’s mobile applications.
It is a truth universally acknowledged that What’s Opera, Doc? is one of the finest cartoons ever made. It satirizes opera, Wagner in particular; it parodies Disney’s Fantasia, and, for that matter, it parodies the routines of its stars, Bugs Bunny and Elmer Fudd. The production was, by Warner Brothers’ standards, lavish, and the layouts, by Maurice Noble, are inspired.
All of that’s obvious. What’s not so obvious is that the film plays on the nature of reality in a way that’s reminiscent of Dance of the Hours from Disney’s Fantasia. As I’ve argued in Animal Passion? Hyacinth Hippo and Ben Ali Gator, that episode depicts the inability of animal dancers to stay in role, with the result that, when Ben Ali Gator courts Hyacinth Hippo, we don’t know whether they’re acting roles or whether their passion is, well, real. Something like that is going on in What’s Opera, Doc? Elmer Fudd is in role as, well, Siegfried I guess, from beginning to end, but Bugs is not.
Note: I’m not going to comment on the design. But you should pay attention to it. Note the colors, the camera angles, and the use of lines. It’s really exquisite.
Kill the Wabbit
Let’s start at the beginning. As the title card and credits roll we hear an orchestra warming up. We thus know that, yep, as the title says, this is going to be opera. The opening music is wild and stormy and we see a stormy sky, and then a large hulking shadow appears projected against a cliff. More sky and lightning, and then we see that the large shadow is projected by a rather small fellow:
At this point a simple, and rather old, point has been made: things aren’t always what they seem to be. The camera zooms in and it’s Elmer Fudd, in heroic costume as a Nordic warrior, informing us that he’s “hunting wabbits.”
A new journal, Singularum, has an interview with philosopher Alphonse Lingis, who translates Merleau-Ponty, writes, travels, and takes photos.
I had long resisted buying a camera, thinking that there was something false about collecting images of things seen and people encountered and who have passed on, trying to retain the past. I thought that what was real was what from a trip left one changed. I started taking pictures when a friend who was taking me to the airport gave me a camera on the way.
I soon realized that the camera had changed my perception. The light: it was no longer just cleared space in which things took form; it had direction, it led the gaze, its shafts excavated situations isolated in the dark, sometimes it spread in a scintillating, dazzling, blazing medium without boundaries. Shadows took on substance; they stretched, flowed, condensed things in themselves. It occurred to me that I saw them that way when I was a child. Things looked different: the contours of shadows and of things that overlapped other things pushed out the contours that contained things in themselves. Flat surfaces showed corrugations, grain, stubble and texture, and sheets of gleam. And the continuity of the landscape drifting by would be abruptly broken by momentary events—the spiraling neck of a heron probing the space, the poised pause of an antelope, the legs of a child in an arabesque she will never be able to do once grown up, the grin of a passerby at something inward. The landscape is abruptly splintered, a segment isolates, magnetizes and pulls the glance into it.
Since George Dyson’s recent history of modern computing, Turing’s Cathedral: The Origins of the Digital Universe, was written in part to restore John von Neumann to prominence, I thought I’d republish, lightly edited, a double review I wrote some years ago: “A Tale of Two Geniuses,” Journal of Social and Evolutionary Systems, 17(2): 227-230, 1994. Richard Feynman was one of the geniuses and John von Neumann was the other.
John von Neumann, by Norman Macrae, New York, Pantheon Books, 1992, 405 pp.
Students of cognitive evolution and of twentieth century thought are fortunate in the simultaneous appearance of these two biographies. No doubt the simultaneity is mostly coincidence. The physicist Richard Feynman is most widely known, alas, for two autobiographical collections of anecdotes which reveal him to be a waggish and riggish anti-establishment sort; he is most deeply known for his contributions to quantum electrodynamics. John von Neumann was a thoroughly establishment sort - soldiers guarded his hospital room as he lay dying of brain cancer just in case he let out defense secrets in his sleep - and is most widely known as the name which appears in phrases like “computers using the von Neumann architecture.” The two men crossed paths in Los Alamos, where they worked on the atomic bomb. That crossing is a reasonable place to begin our review.
Feynman was recruited to Los Alamos while still a graduate student. He was in charge of group T-4, Diffusion Problems. The problem was to figure out how neutrons, which drive the fission reaction, diffuse through the explosive core. Knowing the rate and pattern of diffusion was essential to determining the mass and configuration of fissile material. Since the late 30s von Neumann had been working on similar problems in connection with shock waves and explosions in general and so was able to help the Los Alamos effort between 1943 and 1945.
The difficulty was that the relevant equations could not be solved analytically. Rather, it was necessary to simulate neutron diffusion numerically by calculating the step-by-step motion of individual neutrons. That requires lots of calculations, which were performed by a group of people operating calculators. The problems would be broken into components; each person would be responsible for one component, with each problem being passed from person to person as individual components were calculated.
Computing and von Neumann
That, of course, is the general way computers solve problems, with the computational plan being an algorithm. But they did not have computers at Los Alamos. Computers came after the war, and von Neumann was central to the effort. He understood that the computer is essentially a logical device and clarified that logic with the concepts of the stored program (Macrae, pp. 282-284), the fetch-execute cycle (p. 287), and conditional transfer (see Bernstein 1963, 1964, pp. 60 ff.). That is to say, von Neumann clearly distinguished the physical structures and connections of the devices from which the computer is constructed from the logical requirements which those devices have to fulfill. For that he is the progenitor of the computer.
Several years ago I spent a delightful evening at New York City’s Museum of Modern Art viewing a retrospective of Michael Sporn’s films. Every day I check his blog, which is a treasure trove for those interested in animation. Now I’m asking you to support his Kickstarter project, which involves a biography of Edgar Allan Poe that he’s been working on for several years.
Here’s how Sporn describes the film:
The Animatic, above, is a rough representation of animation in progress. It helps us tell the story. We hope to turn the many segments started into completed animation to be able to thrust the feature film, POE, into complete production. The Kickstarter money will do that for us and help satisfy the needs of the possible distributors and financiers who are already interested.
What’s the story?
Edgar Allan Poe was a brilliant writer who lived a very short and eccentric life. He died at the age of 40 and in that time created literary genres including the detective mystery, the sci fi epic, the horror story, and many of the most beautiful love poems imaginable. Within this life there is a very dynamic story to be told.
The film opens with baby Edgar dragged from theater to theater by his ever-squabbling actor parents. They travel to cities up and down the East Coast performing, as their marriage falls apart. Edgar’s father disappears, and his mother dies of consumption. The three-year-old watches the last theater his mother performs in burn to the ground. He’s left an orphan, and the film begins.
Poe’s life was destroyed not by drugs or alcohol, as is often stated, but by absolute poverty, and this is the crux of our film. Many of the women in his life died of consumption and illness as he was too poor to be able to care for them properly. He himself died in a poorhouse hospital.
Our film will show various biographical key points in his life and will use selections from his great fiction to depict this dramatic story.
The film is now completely scripted and storyboarded, and 20 minutes of an animatic have been completed. Four Poe stories will be set in counterpoint to the biography: The Premature Burial, The Murders in the Rue Morgue, The Black Cat, and MS. Found in a Bottle.
Now’s your chance to step into film history by supporting this project.
Writing in, of all places, The New York Times, Colin McGinn, a distinguished philosopher—for only distinguished philosophers get to appear in “the paper of record”—has called for a rebranding of the discipline of philosophy. No, “rebranding” isn’t his word, though it was astutely used by one of the commenters. McGinn just called for a name change. “Ontics” is his suggested alternative.
McGinn notes that the name is misleading to non-philosophers, who “immediately assume you are in the business of offering sage advice, usually in the form of unargued aphorisms and proverbs.” And when you try to explain, well, they just don’t get it. Whatever this discipline is, “lover of wisdom”—the etymological meaning of the name—is too generic.
Well, sure, yeah, it is. But then, is what McGinn does, or what most academic researchers do, wisdom in any meaningful sense? Thomas Kuhn famously argued that what most scientists do is rather like puzzle-solving, and he did not mean the term at all pejoratively. The point of the term was to suggest that most scientists—and McGinn thinks of philosophy as science, in a broad sense of the term—work within fairly well-specified conceptual boundaries.
Which they do. And so it is with most academics. That’s just the nature of the enterprise.
There is tremendous respect for the mythology of “going boldly where no man has gone before,” but little on-the-ground tolerance for that activity in the flesh. I rather suspect that McGinn wouldn’t recognize one of the bold ones if she bit him in the ass. Whatever it is that McGinn does, is there any reason whatever to suspect that he gives a fig about wisdom?
J.J. Gould has a short piece in The Atlantic that lists Nazi regulations for dance orchestras in Czechoslovakia:
H/t Graham Harman.