“I really do believe that the world can be saved through design, and everything needs to actually be ‘architected,’ ” Kanye West recently told Harvard students in a widely repeated quote. Architects especially loved it, but Lucas Verweij, a Berlin-based writer, argues in Dezeen that claims such as West’s are excessive. Verweij writes that the expectations placed on design—“design can solve the smog problem in Beijing, the landmine problems in Afghanistan and huge social problems in poor parts of Western cities”—are overblown and cannot be met. “We are in a design bubble,” he writes, “it’s a matter of time before it will burst.” I’m not sure about the bubble—it seems more like a passing fashion to me—but the current idea that every problem is fodder for the design profession is certainly misguided. Can a designer really be master of all trades? The proposition that design can be effective at all scales dates back to Walter Gropius, who claimed that the designer could assume broad responsibilities; “from a teacup to a city” was how he put it. (Gropius’s teacups are OK, his urbanism, not so much.) But design is primarily about the how; the what is determined by a host of circumstantial conditions—social, economic, and cultural—over which the designer exercises no authority. More than 40 years ago, Victor Papanek, a Viennese-born industrial designer, wrote Design for the Real World, in which he made the case for design-as-problem-solving as opposed to design-as-styling. It was compelling stuff—I remember a Third World transistor radio housed in a can, run by candle-power. But were such radios ever produced? The forces of globalism ensured that it was the cell phone, not the tin-can radio, that revolutionized the world, including the Third World. And the revolutionary aspects of the cell phone are the work of engineers, not industrial designers. The much-vaunted “design” of Apple products, for example, is chiefly (obsessively) minimalist packaging. Pace Papanek.
Just returned from a brief visit to the UK. When you arrive in London, if you have £20 you can take the Heathrow Express (travel time 15 minutes) to the city; if you have £28 you can go first class. The spiffy train interior makes Acela look frumpy. When did the British get so good at design? The original London black cab was the Austin FX3, introduced in 1948. It had plenty of room for luggage, flip-down jump seats, and rear-hinged doors for the benefit of the passengers. The latest model of black cab, TX4, still has those useful features (except the rear-hinged doors), as well as a diesel engine, air-conditioning, ABS braking, a wheelchair ramp, and MP3 compatibility. It carries five passengers and is 2 feet shorter than a Ford Crown Vic, the New York cabbie’s favorite. And it still looks like a black cab.
I despair when I return home. The train from Philadelphia’s airport to downtown is cheaper ($8) but it takes longer, makes local stops, has all the charm of a 1950s subway car, and leaves people struggling to find a place for their luggage. It’s still better than the taxis, though, old sedans that are uncomfortable, beat-up, and driven with reckless abandon by drivers whose newly acquired knowledge of the city is minimal.
The British have developed an enviable ability to innovate without throwing out the baby with the bathwater. In 1971, they decimalized their money, retiring the halfpenny, threepence, sixpence, shilling, florin and half-crown–not to mention the guinea. The smallest paper money now is a five-pound note, and there are sensible one-pound and two-pound coins. The coins still carry the monarch’s image on one side. We can’t even get rid of the penny, let alone introduce a dollar coin. The US Army has adopted metric measure for distances, but the nation seems unable to follow suit; after a half-hearted try in the 1970s we remain one of only three countries in the world to resist metrication (together with Burma and Liberia). The UK completed metrication more than 40 years ago–but in a very British way. Food is sold in grams and kilos, but people still weigh themselves using that mysterious British measure, the stone. The London Underground counts distances in metric but speeds in imperial. And while gas stations use liters, pubs still serve beer in pint glasses. Cheers.
The first international style in architecture was not the white-box style of Le Corbusier and Walter Gropius but Art Nouveau, modernism’s predecessor and in many ways its aesthetic and philosophical opposite. Art Nouveau flourished from 1890 to 1910, and along the way it produced a surprisingly large number of masters: Gaudí, Hoffmann, Horta, Mackintosh, Plečnik, Sullivan, Van de Velde, and Wagner. And that’s just the leading architects; there were also painters, designers, and craftsmen: Beardsley, Klimt, Lalique, Moser, Tiffany. Twenty years is a good long run as architectural fashions go; indeed, the International Style lasted barely that long. Nevertheless, modernist apologists have always pooh-poohed Art Nouveau, promulgating the view that “the demise of Art Nouveau was attributable to some fundamental internal flaw,” as Peter Kellow writes in a recent issue of American Arts Quarterly. The modernist apologists were understandably defensive; nobody would ever put a Gropius architectural fragment in a museum, as they would the work of Sullivan and Horta. Moreover, the anti-rationalism of Art Nouveau flew in the face of “scientific” modernism. Yet a quick glance at subsequent history reveals that Art Nouveau was the harbinger of a significant strain of modern architecture, visible in the work of Scharoun, Mendelsohn, Poelzig, the late Wright, and surviving today, though without the exquisite details, in the work of Gehry and Hadid. Though there was a brief revival of Art Nouveau, at least in graphic design, during the psychedelic Sixties, an architectural revival seems unlikely. But you never know. As Kellow writes: “Art Nouveau buildings are surely some of the most beautiful ever designed. Not necessarily the best, but the most beautiful.”
We remember the past in different ways. World War II produced memoirs (Frank, Wiesel, Tregaskis), histories (Churchill, Shirer, Keegan), novels (Mailer, Jones, Heller), and innumerable films and television documentaries. So did the Vietnam War (Dispatches, A Rumor of War, A Bright Shining Lie, The Best and the Brightest, as well as The Deer Hunter, Apocalypse Now, and Platoon). Of course, there are also built memorials, which form the focus of wreath-layings and commemorative ceremonies, but our memory resides in many places. So I find the present custom of making so-called visitor centers an integral part of memorials odd, not to say redundant. A museum was to be a part of the World War II Memorial in Washington, D.C., but was wisely removed. Less wisely, Congress has approved an underground “education center” next to the Vietnam Veterans Memorial, although that project happily appears stalled. The other night, “60 Minutes” showed the 9/11 museum under construction at the World Trade Center site. You would have thought that the vast outdoor memorial would have been sufficient commemoration, but we are also to have a vast underground space devoted to the events of that unhappy day. I once visited the Resistance museum in Oslo, the subject of which was the Norwegian resistance to the Nazi occupation during World War II. It is estimated that 40,000 men and women took an active part in the resistance movement. The museum displays included Sten guns, short-wave radios, and miniature tableaux of wartime scenes, with tiny armored cars being sabotaged, and agents parachuting among cotton-batting clouds. It was informative, charming, and low-key in a laid-back Scandinavian sort of way. If we must have a 9/11 museum, I wish it were that modest.
An article in today’s New York Times on classroom chairs reminded me of my schooldays. As far as I remember, we had wooden desks with built-in bench seats, attached to the floor. The desktop, usually carved with a chronicle of interesting graffiti, was sometimes hinged with a storage space beneath that we never used. We didn’t use the hole in the top, which was made to hold an inkwell, either. The desks were sturdy and not particularly comfortable—they weren’t intended to be. The Times piece is full of fluff about how different classroom chairs might improve learning, although the author allows that New York City’s Model 114 stacking chair has its defenders. “But in some quarters, the chair and others like it are seen as stubborn holdovers from before the age of ergonomics, when American schools’ main job was to turn out upright citizens, and rote learning was the student’s lot.” Since most people agree that American education has declined precipitously since the Age of Rote Learning, I wonder if a “stubborn holdover” is really so bad.