Month: February 2005


  • Back From the Front, With Honor, a Warrior’s Truth


    By HELEN GERHARDT





    JOHN was a love child, conceived on the driver’s seat of my rusty old Volvo as I sped toward my National Guard headquarters in March of 2002. I was aglow with unexpected discovery, my body flush from hours of cuddling skin to skin with my newly beloved.


    But I had lingered too long and now was pushing the speedometer toward 90 in a 70 mile-an-hour zone, trying desperately to get to the armory in time for morning formation. I was speeding toward what I knew would be a very slow day: probably an apathetic rehash of gas mask maintenance, maybe a quick check under the hoods of our seldom-used trucks and most certainly a cud-chewing on the latest rumors of our possible deployment to the War on Terror.


    When I rushed into formation, a minute late and shining with happiness, Sergeant Bryan twisted around, looked me up and down and winked. “Guy wouldn’t let you go, huh?”


    I smiled back. Why disabuse him of that notion?


    But as soon as formation broke, a small clump of my fellow soldiers gathered to tease me and pursue details of the scoop. I hadn’t dated anyone since I joined the company two years before, soon after the end of my 12-year marriage to a man I still cared for, if I couldn’t live with him. A couple of guys in the unit had manfully tried to rectify my lack of companionship, flirting with me, asking me out, and my ham-handed response to their attempts had raised a few eyebrows and fueled speculation about my sexual orientation that was not spoken to me but that I could read clearly on several faces. Of course they were wrong, I’d thought. They just weren’t taking into account the complexities of a bruised heart.


    Sergeant Durk grinned up at me from his pink pug face. “Is it lust, girl, or is it love?”


    “Come on,” Sergeant Bryan demanded. “Details, details – who is he?”


    I looked back at Sergeant Bryan with affection. He had so innocently asked, but I could not tell. And it was killing me, the fact that I had to keep all those glorious, life-changing details of my new love hemmed in and humming between my ears. I wanted to tell the world about it. I certainly didn’t see how it was possible to spend eight long hours pretending it was just another drill day.


    “I see,” Sergeant Bryan said. “Gonna hold out on your redneck Army buddies?”


    The alias I’d thought of on the way to the armory trembled at the edge of my lips. And then it tumbled out. “Well…” I began.


    And so John was born, breaking records for growth as he sprang from my head with all the reflective glow of a newly polished shield and armed with all the sharp edges of my fresh memory. “He’s a little tubby, just turned 40, salt-and-pepper hair cut mighty short. He’s finishing his doctorate. He’s going to be applying for teaching jobs next year. Yes, he’s in English too. We met in class.”


    The story poured out with such confidence – it was the truth, after all, except for one small inconvenient fact. But how great it felt to besottedly report the relevant details of my unexpected love, to regale my fellow soldiers with tales of my man, who was only a slight twist of the helix and the tongue away from the whole truth.


    John at first seemed to be a low-maintenance guy. Put away with my uniform, and quickly brushed to a shine along with my black boots once a month, he always sprang to duty with military efficiency.


    But as the year wore on, my act began to take its toll. Early on I had excused John’s absences at Guard socials by proudly declaring how devoted he was to his studies. But that line wore thin as my fellow soldiers’ spouses, boyfriends and girlfriends continued to show up by their sides while I, yet again, attended solo.


    “Yeah, well, John’s definitely a workaholic,” I complained with an anger that felt disturbingly authentic. When my real love and I began to plot a discreet cohabitation, I tried to allay any suspicion that my lover’s voice might raise if Guard members were to call my home by explaining that John wanted to save money by moving in with my “roommate” and me. But finally, when we got the news of our deployment to Iraq, I felt as if my wall of deception was about to collapse.


    My true love began to campaign for John’s elimination with editorial ruthlessness: “Kill him off now! This is the perfect time.”


    “But I’ve been talking about our engagement. We were looking at rings last month.”


    “Everybody knows deployments change things. With such a long separation coming up, maybe he thought it would be wiser to wait rather than act hastily.”


    As usual, my lover’s practicality was inarguable, and over the next few months I laid down a back story for the coming virtual breakup. As other guardsmen swore the undying love of their own lovers and spouses, I indicated that no, John certainly wasn’t thrilled by the prospect of my yearlong deployment in Iraq. “And he’s been offered a job in Texas,” I added. “We don’t see how he can turn it down with the job market like this.”


    I had friends and family who wondered very loudly why I didn’t just come out and tell the truth rather than so carefully script John’s exit. After all, why not take the opportunity to let the truth get me off what could be a truly deadly hook?


    But I couldn’t do that. When I joined the Army in 2000 I had never anticipated any future need to censor my life, had never imagined the flesh and blood form in which my true love would one day appear. I had raised my hand and sworn the military oath to redeem a decade of debt, to escape the years of assembly lines, waitress aprons and janitor buckets that had kept me afloat. Thanks to the Army, I had just received a degree in English, and for this I was grateful.


    Like my country, teetering on the edge of a war with unknowable costs, I had decided to borrow now and pay later. As I saw it, I owed for what I had received, and it would be a sniveling, wimpy misuse of my love to back out just when the bill was due to my country and the men and women I served with. I did not really buy the bill of goods they’d sold everyone to star-and-spangle our reasons for pre-emptive invasion, but I had sworn to obey my commander in chief.


    So while I told my worried friends and family that I would not bear the fictional burden of John to Iraq for my fellow soldiers to innocently pry at, I would have to remain silent about who I was for the duration of my overseas service.


    In February 2003, as my unit gathered in the freezing gray dawn to get on the bus to the predeployment processing camp, I broke my carefully planned news to the sergeant I respected the most.


    “So, will John be there to see you off next month?” Sergeant Collum asked.


    “No,” I said. “He went ahead and took the job in Texas. We decided we had to see what happens when all this is over.”


    He did not look surprised. Unlike others in my unit, he’d never asked why John didn’t show up for our group celebrations. I knew he was a savvy man, but I could not know what he guessed of my real situation – if his restraint had served a willed ignorance or a respectful tact.


    SUDDENLY it occurred to me that my deception had worked two ways. The falsehoods I’d spread to keep my fellow soldiers from knowing the real me were at the same time preventing me from knowing the real them. I could now see that during the time I’d been covering for myself, I’d stood increasingly apart from my unit and my superior officers – friendly, but not a friend.


    And in a few months we all would be in a war zone together. Over the coming year I would convoy thousands of miles with Sergeant Collum and the other members of our transportation unit, past deteriorating mosques, begging Iraqi children and roadside explosives. I could never have guessed that the loneliness of maintaining my silence with him and others I cared about would be harder to bear than being shot at or bombed.


    Now Sergeant Collum looked at me. “It won’t work out for a lot of these guys,” he said quietly. “They think it will, but it won’t.”


    A month later, as new recruits marched by our predeployment barracks singing songs of home and lost loves, I sat down and wrote to my man for the first time. We’d been told that for security reasons, all of our correspondence would be subject to inspection, and I sort of hoped that would be the case here, that my letter could serve as a final flourish to end the illusion.


    “Dear John,” I wrote, “I’m afraid we can’t go on like this.” And just like that it was over. Or so I thought.


    Now it is 2005, and I have done my time in the wilderness. For 12 months I hauled your ammunition and guns, your concrete barriers and your charred Humvees that no thickness of back story could armor for the flesh within. I served honorably, remained faithful to my true love and to my country, and I came back in one piece, with even my silence intact.


    But in the wake of all these deceptions, small and large, innocent and deadly, my ongoing silence eventually became its own burdensome lie – one that I simply could no longer bear.


    So that is my truth, or at least the best I could do under the circumstances. I know you didn’t ask. I had to tell anyway. The fact is, I would very much like to continue to serve as my true self. I hope you’ll understand.



    Sgt. Helen Gerhardt returned from Iraq in July and now serves in a transportation unit of the Missouri Army National Guard.




    Copyright 2005 The New York Times Company | Home | Privacy Policy | Search | Corrections | RSS | Help | Back to Top


  • Colin M. Angle of iRobot steers a PackBot up a set of stairs by remote control at company headquarters in Burlington, Mass.


    A New Model Army Soldier Rolls Closer to Battle


    By TIM WEINER





    The American military is working on a new generation of soldiers, far different from the army it has.


    “They don’t get hungry,” said Gordon Johnson of the Joint Forces Command at the Pentagon. “They’re not afraid. They don’t forget their orders. They don’t care if the guy next to them has just been shot. Will they do a better job than humans? Yes.”


    The robot soldier is coming.


    The Pentagon predicts that robots will be a major fighting force in the American military in less than a decade, hunting and killing enemies in combat. Robots are a crucial part of the Army’s effort to rebuild itself as a 21st-century fighting force, and a $127 billion project called Future Combat Systems is the biggest military contract in American history.


    The military plans to invest tens of billions of dollars in automated armed forces. The costs of that transformation will help drive the Defense Department’s budget up almost 20 percent, from a requested $419.3 billion for next year to $502.3 billion in 2010, excluding the costs of war. The annual cost of buying new weapons is scheduled to rise 52 percent, from $78 billion to $118.6 billion.


    Military planners say robot soldiers will think, see and react increasingly like humans. In the beginning, they will be remote-controlled, looking and acting like lethal toy trucks. As the technology develops, they may take many shapes. And as their intelligence grows, so will their autonomy.


    The robot soldier has been a dream at the Pentagon for 30 years. And some involved in the work say it may take at least 30 more years to realize in full. Well before then, they say, the military will have to answer tough questions if it intends to trust robots with the responsibility of distinguishing friend from foe, combatant from bystander.


    Even the strongest advocates of automatons say war will always be a human endeavor, with death and disaster. And supporters like Robert Finkelstein, president of Robotic Technology in Potomac, Md., are telling the Pentagon it could take until 2035 to develop a robot that looks, thinks and fights like a soldier. The Pentagon’s “goal is there,” he said, “but the path is not totally clear.”


    Robots in battle, as envisioned by their builders, may look and move like humans or hummingbirds, tractors or tanks, cockroaches or crickets. With the development of nanotechnology – the science of very small structures – they may become swarms of “smart dust.” The Pentagon intends for robots to haul munitions, gather intelligence, search buildings or blow them up.


    All these are in the works, but not yet in battle. Already, however, several hundred robots are digging up roadside bombs in Iraq, scouring caves in Afghanistan and serving as armed sentries at weapons depots.


    By April, an armed version of the bomb-disposal robot will be in Baghdad, capable of firing 1,000 rounds a minute. Though controlled by a soldier with a laptop, the robot will be the first thinking machine of its kind to take up a front-line infantry position, ready to kill enemies.


    “The real world is not Hollywood,” said Rodney A. Brooks, director of the Computer Science and Artificial Intelligence Laboratory at M.I.T. and a co-founder of the iRobot Corporation. “Right now we have the first few robots that are actually useful to the military.”


    Despite the obstacles, Congress ordered in 2000 that a third of the ground vehicles and a third of deep-strike aircraft in the military must become robotic within a decade. If that mandate is to be met, the United States will spend many billions of dollars on military robots by 2010.


    As the first lethal robots head for Iraq, the role of the robot soldier as a killing machine has barely been debated. The history of warfare suggests that every new technological leap – the longbow, the tank, the atomic bomb – outraces the strategy and doctrine to control it.


    “The lawyers tell me there are no prohibitions against robots making life-or-death decisions,” said Mr. Johnson, who leads robotics efforts at the Joint Forces Command research center in Suffolk, Va. “I have been asked what happens if the robot destroys a school bus rather than a tank parked nearby. We will not entrust a robot with that decision until we are confident they can make it.”


    Trusting robots with potentially lethal decision-making may require a leap of faith in technology not everyone is ready to make. Bill Joy, a co-founder of Sun Microsystems, has worried aloud that 21st-century robotics and nanotechnology may become “so powerful that they can spawn whole new classes of accidents and abuses.”


    “As machines become more intelligent, people will let machines make more of their decisions for them,” Mr. Joy wrote recently in Wired magazine. “Eventually a stage may be reached at which the decisions necessary to keep the system running will be so complex that human beings will be incapable of making them intelligently. At that stage, the machines will be in effective control.”


    Pentagon officials and military contractors say the ultimate ideal of unmanned warfare is combat without casualties. Failing that, their goal is to give as many difficult, dull or dangerous missions as possible to the robots, conserving American minds and protecting American bodies in battle.


    “Anyone who’s a decision maker doesn’t want American lives at risk,” Mr. Brooks said. “It’s the same question as, Should soldiers be given body armor? It’s a moral issue. And cost comes in.”


    Money, in fact, may matter more than morals. The Pentagon today owes its soldiers $653 billion in future retirement benefits that it cannot presently pay. Robots, unlike old soldiers, do not fade away. The median lifetime cost of a soldier is about $4 million today and growing, according to a Pentagon study. Robot soldiers could cost a tenth of that or less.


    “It’s more than just a dream now,” Mr. Johnson said. “Today we have an infantry soldier” as the prototype of a military robot, he added. “We give him a set of instructions: if you find the enemy, this is what you do. We give the infantry soldier enough information to recognize the enemy when he’s fired upon. He is autonomous, but he has to operate under certain controls. It’s supervised autonomy. By 2015, we think we can do many infantry missions.


    “The American military will have these kinds of robots. It’s not a question of if, it’s a question of when.”


    Meanwhile, the demand for armed bomb-disposal robots is growing daily among soldiers in Iraq. “This is the first time they’ve said, ‘I want a robot,’ because they’re going to get killed without it,” said Bart Everett, technical director for robotics at the Space and Naval Warfare Systems Center in San Diego.


    Mr. Everett and his colleagues are inventing military robots for future battles. The hardest thing of all, robot designers say, is to build a soldier that looks and acts human, like the “I, Robot” model imagined by Isaac Asimov and featured in the recent movie of the same name. Still, Mr. Everett’s personal goal is to create “an android-like robot that can go out with a solider to do a lot of human-like tasks that soldiers are doing now.”


    A prototype, about four feet high, with a Cyclops eye and a gun for a right arm, stood in a workshop at the center recently. It readied, aimed and fired at a Pepsi can, performing the basic tasks of hunting and killing. “It’s the first robot that I know of that can find targets and shoot them,” Mr. Everett said.


    His colleague, Jeff Grossman, spoke of the evolving intelligence of robot soldiers. “Now, maybe, we’re a mammal,” he said. “We’re trying to get to the level of a primate, where we are making sensible decisions.”


    The hunter-killer at the Space and Naval Warfare Systems Center is one of five broad categories of military robots under development. Another scouts buildings, tunnels and caves. A third hauls tons of weapons and gear and performs searches and reconnaissance. A fourth is a drone in flight; last April, an unmanned aircraft made military history by hitting a ground target with a small smart bomb in a test from 35,000 feet. A fifth, originally designed as a security guard, will soon be able to launch drones to conduct surveillance, psychological warfare and other missions.


    For all five, the ability to perceive is paramount. “We’ve seen pretty dramatic progress in the area of robot perception,” said Charles M. Shoemaker, chief of the Army Research Laboratory’s robotics program office at Aberdeen Proving Grounds in Maryland. That progress may soon allow the Army to eliminate the driver of many military vehicles in favor of a robot.


    “There’s been almost a universal clamor for the automation of the driving task,” he said. “We have developed the ability for the robot to see the world, to see a road map of the surrounding environment,” and to drive from point to point without human intervention. Within 10 years, he said, convoys of robots should be able to wend their way through deep woods or dense cities.


    But the results of a road test for robot vehicles last March were vexing: 15 prototypes took off across the Mojave Desert in a 142-mile race, competing for a $1 million prize in a Pentagon-sponsored contest to see if they could navigate the rough terrain. Four hours later, every vehicle had crashed or had failed.


    All this raises questions about how realistic the Army’s timetable is for the Future Combat Systems, currently in the first stages of development. These elaborate networks of weapons, robots, drone aircraft and computers are still evolving in fits and starts; a typical unit is intended to include, say, 2,245 soldiers and 151 military robots.


    The technology still runs ahead of robot rules of engagement. “There is a lag between technology and doctrine,” said Mr. Finkelstein of Robotic Technology, who has been in the military robotics field for 28 years. “If you could invade other countries bloodlessly, would this lead to a greater temptation to invade?”


    Colin M. Angle, 37, is the chief executive and another co-founder of iRobot, a private company he helped start in his living room 14 years ago. Last year, it had sales of more than $70 million, with Roomba, a robot vacuum cleaner, one of its leading products. He says the calculus of money, morals and military logic will result in battalions of robots in combat. “The cost of the soldier in the field is so high, both in cash and in a political sense,” Mr. Angle said, that “robots will be doing wildly dangerous tasks” in battle in the very near future.


    Decades ago, Isaac Asimov posited three rules for robots: Do not hurt humans; obey humans unless that violates Rule 1; defend yourself unless that violates Rules 1 and 2.


    Mr. Angle was asked whether the Asimov rules still apply in the dawning age of robot soldiers. “We are a long ways,” he said, “from creating a robot that knows what that means.”






  • The Night of the Pod People


    By CATHY HORYN





    TO some it was a flagrant case of the emperor’s new clothes, a collection that just confirmed what a baroque madhouse the fashion world is. Even Bridget Foley, the executive editor of Women’s Wear Daily, in her report on the controversy set off last week by Marc Jacobs could not avoid using terms like “Addams Family” and “clown smocks,” though, as she also wrote, many people thought Mr. Jacobs’s show on Monday night was brilliant and as exciting as anything on a European runway.


    What Mr. Jacobs did was to create dresses and coats of such exaggerated proportions that they defied practical application, something that generally irritates retailers and excites editors. Some skirts had a podlike shape and were shown with boxy jackets or tops that looked almost woebegone. “I thought Marc did a wonderful show,” said Robert Burke, the fashion director of Bergdorf Goodman. “You can say the color palette was strong and that he played up the volume.”


    Other retailers, as if anticipating the reaction of even their most tolerant customers to the prospect of looking like garden gnomes, were not so amused. “What bothered me was that he was trying too hard,” said Joan Kaner, the fashion director of Neiman Marcus. “It could have been charming, but it just missed.”


    She added, referring to Neiman Marcus’s buyers: “We will try to reserve judgment until we go back to the showroom and see it. Maybe those full skirts would work with a tight top, but a full skirt and a full top doesn’t work. Unless you’re 17 feet tall, it could be tough to sell.”


    Mr. Jacobs, who by Thursday was at work in Paris on his fall Louis Vuitton show, seemed pleased that his clothes had touched a nerve, though many editors did not feel a sense of charity toward him for having been made to wait 95 minutes for the show to start, and at least one store chief, Michael Gould of Bloomingdale’s, left early.


    “I love that reaction of love or hate,” Mr. Jacobs said. “It’s indifference that bores me to death.”


    There seemed to be no danger of that. By the end of Fashion Week on Friday the debate had not cooled, partly because the rest of the shows by American designers looked comparatively timid and restrained and did not present any headaches to retailers. Indeed within some stores the Jacobs show produced a sharp divide between the interests of fashion directors, whose role is to identify new trends, and buyers, who have a fiscal responsibility to sell them.


    The last time an American show polarized the fashion establishment to this degree was Mr. Jacobs’s grunge collection, for spring 1993, designed for Perry Ellis, a collection that more or less held up a mirror to Kurt Cobain and the Seattle music scene, but which on Seventh Avenue seemed an outrage. Who was going to buy drab flannel shirts at designer prices? Almost nobody did, and the line was shut down by Mr. Jacobs’s bosses, but the collection had a profound influence on fashion.


    On Monday night, when the second most controversial show of Mr. Jacobs’s career finally started, at 10:35 p.m. – a couple of dresses were still being finished and the public relations staff was on wireless headsets monitoring their progress – the music the audience heard was Smashing Pumpkins. One way or another grunge has been his lodestar.


    “In my 20 years in fashion there have been only two times when I haven’t woken up depressed after a show,” Mr. Jacobs said from Paris. The first time was after the grunge collection and the second time was last Tuesday morning.


    And in a sense, Mr. Jacobs said, all of his collections are a reflection of his life and the way he first felt about fashion as a young man, when designers like Rei Kawakubo and Yohji Yamamoto were at the height of their powers and women like Carla Sozzani, who helped Romeo Gigli lead a fashion revolt against minimalism in the 1980’s, had an original look. These were some of the influences on his mind, he said, when he began discussions with his design staff and Venetia Scott, his stylist, for this fall’s collection, giving them books and other reference materials as a starting point for padded full skirts, ample coats and poetic-looking velvet dresses swathed in point d’esprit, a Gigli idea. Mr. Jacobs makes no apologies for references to the work of other designers, though journalists often take him to task for it, and he argues that musicians and artists, even writers, do the same thing.


    Another influence on Mr. Jacobs was his friend the director Sofia Coppola. He said they were out one night recently in Paris and Ms. Coppola wore a Lanvin dress with flat shoes. “I loved the way she looked,” he said. At the same time, he recognized that it was time to move away from the coy girlish style of the last few seasons. And that inevitably meant something darker, more somber and poetic, with that do-it-yourself quality he has attempted to give his clothes since the grunge collection.


    If Mr. Jacobs had not changed direction, editors would have been disappointed. “He left the girlie look that had pleased everyone and got to something stronger, that took a risk,” said Carine Roitfeld, the editor of Paris Vogue. “There were great pieces that I’ve never seen before in any of his collections.”


    That is precisely what concerns some retailers, that customers may avoid buying outfits in favor of one terrific piece like a coat or a pair of cuffed trousers. “For a retailer you generate business on the sale of multiple pieces to a customer,” said Ronald Frasch, the chief merchant at Saks Fifth Avenue. If customers are intimidated by the silhouette, they may forgo buying head-to-toe outfits, which have been the commercial strength of companies like Chanel and Dolce & Gabbana.


    Still, as Mr. Frasch points out, Mr. Jacobs is considered a highly commercial designer, in part because of his fashion influence and the success of his handbags and shoes.


    “I think what people have to remember is that if we ask Marc to do the same thing season after season, year after year, then we’re not allowing him to change,” said Sue Patneaude, the executive vice president of designer apparel at Nordstrom. To her the proportions, the shapes, showed a more mature and sophisticated eye.


    “I know it’s controversial, but I loved the collection, and I loved the full skirts,” Ms. Patneaude said. “It was actually serene to me.”



    Eric Wilson contributed reporting for this article.






  • The clothes of 12 designers are being judged on “Project Runway.”


    TV WATCH | ‘PROJECT RUNWAY’


    Hemlines on the Stand: A Design-Off for Fashion Glory


    By ALESSANDRA STANLEY





    Beauty is undefinable, but fashion is not. It’s the dress that makes every other outfit in the room look fussy or staid.


    And aptly enough, Bravo’s “Project Runway” is the Prada of reality shows – the fashion contest is yet another “Apprentice”-like competition, except that this one makes even Donald Trump’s New York seem frumpy.


    The couture smackdown pits 12 aspiring designers against one another in an atelier at the Parsons School of Design in Manhattan, and has collected a loyal, intense following among the hip, young and gay – viewers who do not necessarily identify with the suburban pastry chefs and wreath makers on “Wickedly Perfect,” CBS’s quest for the next Martha Stewart. But “Project Runway” is not as narrow as its niche: all reality-show competitions are the same, but they are not all alike. Amazingly enough, “Project Runway” has a decent heart beating under its frisky silk chiffons, frivolous chatter and pesky product placement. (Unlike Waldo, it doesn’t take long to discover where L’Oréal products are hidden.) It has suspense and unexpected turns, but only the good kind.


    Even at their most artificially flavored, reality shows always have an element of surprise. Sometimes, the producers are the ones caught off guard. Najai Turpin, 23, a middleweight boxer from Philadelphia who was a contestant on NBC’s forthcoming boxing show, “The Contender,” committed suicide on Monday, after the show had been taped, for reasons that NBC said were personal and not related to the show.


    In a fictional drama like “Eight Simple Rules,” the plot can be altered to deal with the death of one of its stars, as ABC did when John Ritter died suddenly early in the 2003 fall season. Reality shows cannot be rewritten once they are taped: in a made-for-television boxing championship that magnifies the Rockyesque personal struggles of the fighters – lots of scenes with children, spouses and girlfriends – the real tragedy took place off camera, and will likely be unnoticeable on the show.


    Mercifully, personal vignettes are not as important to “Runway.” For one thing, artistic talent is more tangible on this show than on most; contestants are not judged by how long they can lie in a tub of worms or how fast they can sell lemonade on the street. Perhaps because Miramax Television, which produces the series “Project Greenlight,” is a partner with Bravo, the show itself has a cleaner, more streamlined, documentary-style look – fewer hammy reaction shots and no scary “Jaws” music.


    There is plenty of the cattiness and clawing that add a kick to “America’s Next Top Model,” on UPN, and Bravo’s “Manhunt: The Search for America’s Most Gorgeous Male Model.” But behind all the personality clashes and waspish repartee (“Daniel’s approach is, like, so Bob Barker,” Jay, 29, says about a rival designer’s first creation), there is also a real test of individual ability and imagination.


    Sometimes with as little as 24 hours, each designer has to come up with a sketch, buy supplies, sew, cut and adapt an outfit to a runway model who parades the creation before a panel of judges, including the designer Michael Kors and Nina Garcia, the hard-to-please fashion director of Elle magazine. The host is the supermodel Heidi Klum, and her angelic features and German inflection (“You’re out. Auf Wiedersehen.”) add a dominatrix touch to the dressmaking.


    It’s not a completely fair fight, of course. A message tacked onto the final credits explains that the criteria for elimination are the judges’ scores, and also the producers’ preferences. But however molded by show business concerns, their choices make fashion sense, too.


    The tasks are gimmicky: the first assignment was to design a sexy evening dress entirely from materials bought in a Gristede’s grocery store (Austin, 23, won with a cocktail dress made entirely from corn husks). But there are also more classic challenges, including wedding dresses. When the model first tried on Kara’s soft, sleek satin design, her eyes welled up with tears like a real bride.


    Of course, everybody cries on “Project Runway,” a minefield of histrionics, hugs and tantrums. And just as every reality show has its Omarosa, on “Runway” everybody loves to hate Wendy, 39, a mother and dress designer from Middleburg, Va., who conceals a killer’s cunning behind glasses and a homebody hairstyle. In next week’s finale the three finalists, Jay, Kara and Wendy, show their collections during New York Fashion Week (the eventual winner will get, among other things, $100,000 in seed money). Tonight, as a warm-up to the finale, eliminated designers are brought back to reminisce and confront. Wendy, suddenly glamorous with newly dyed hair and no glasses, bears the brunt.


    “We were all really nice to you because we felt sorry for you because you’re such a terrible designer and like, a mother of however many children and you live in the middle of wherever,” Vanessa, 34, an Englishwoman with a loose upper lip, wails. “And you just stepped on every one of us.”


    Jay and Kara are more talented and far more pleasant, and they deserved to be finalists. “Project Runway” is what fashion should be and so often is not: naughty, but also sometimes nice.




    Copyright 2005 The New York Times Company


  • Catalina Sandino Moreno got her role in “Maria Full of Grace” through the good offices of a stranger who had seen her in a play and called her mother. “Maybe he was an angel,” she says.


    From Drug Mule to Miss Colombia


    By MARGY ROCHLIN





    AS she stretches out on a striped lounge chair by the Four Seasons swimming pool in Beverly Hills, 23-year-old Catalina Sandino Moreno observes the oil-slicked sunbathers taking in the warm afternoon sun. It’s not an unusual Los Angeles backdrop, but something about the way emotions ripple across Ms. Sandino’s oval face makes it seem as if she is quietly absorbing every detail. It was this same expressive yet thoughtful quality – a kind of transfixing curiosity – that the writer-director Joshua Marston instantly noticed about Ms. Sandino, a fledgling Colombian actress, while searching for the lead in his debut film, “Maria Full of Grace,” about a poor, rebellious, pregnant teenager who leaves her job on a flower plantation to become a drug mule.


    “I needed someone who could be loose, natural, play a kid – but she also had to have a weight and mystery in her eyes,” Mr. Marston said. “Catalina was all of those things.” Recently, Ms. Sandino became this year’s least known Academy Award nominee in the best-actress category. Speaking here with Margy Rochlin, she explains her own method of navigating United States customs, why it’s important to eat airplane food and what Colombians really think of the Oscars.


    MARGY ROCHLIN Where were you when you heard your best-actress nomination announced along with film vets like Annette Bening and Kate Winslet?


    CATALINA SANDINO MORENO I was in New York. I couldn’t sleep. I turned on the TV and called my mother in Colombia. I’m like, “Mom, 8:30 is the nominations. I don’t want you to know from another person what happened.” Then they said my name and it was the both of us screaming and screaming. My mother was crying. Then the phone broke. I thought: “Oh, my God. I hope she is fine.” Four hours later, I got through again and said, “What happened?” She said, “So many reporters from Colombia called the house that they broke the phone.”


    Q. Here you are, nominated for a top acting award, yet in Colombia, you couldn’t even get soap opera work. Why was that?


    A. In Colombia, in all of the soap operas, there’s a beauty queen, a model and the guy lead is from Ecuador or Peru or Venezuela, but not from Colombia. People like to see on the screen tall, skinny, beautiful models with blond or black hair who dress sexy. The way I dress and the way maybe I am, they didn’t like me.


    Q. Part of your growing legend is that you heard about the “Maria” open-call auditions from an anonymous stranger who saw you in a play and called your mother to say you’d be perfect in the lead. True?


    A. Yeah. I don’t know who he was.


    Q. He never came forward?


    A. I’ve never heard from anybody who said, “I was the one.” I think he had something to do with TV or something. He called my mother and told her where I should go, everything. It’s odd that he’s never introduced himself to me. Maybe he was an angel. Just someone who appeared in my life and changed it totally.


    Q. You were born and raised in middle-class Bogotá. Did playing Maria transform how you view the world?


    A. In Bogotá, I lived in this little bubble. I had everything I needed. Apartment. Doorman. I went to a British school. My friends and family were fine. Nothing weird happened to me or my brother. Everything was cool. So it’s scary when you are realizing that in your country, it can happen.


    Q. “It” meaning that you would need money so badly you’d turn to drug smuggling?


    A. Yes. Before, I was like: “Oh, drug mules are bad people. It’s so good they’re in jail.” Then I realized that they’re risking their lives, that they do it because they need to, not because they’re greedy. The last time I went to Colombia, I heard this story about a woman who was raising five or six kids. She didn’t have money, so she made a surgery on her dogs and stuffed drugs in them. If you hear that, you’d think, “That woman is awful and should be in jail to do that to a little animal.” But that woman was just trying to survive, to make money for her family without risking her life or the life of her babies. It’s really sad.


    Q. Because of this role, are you recognized by United States customs officers?


    A. I was coming from Spain and I was going through customs and gave them my passport to stamp. These two guys were like: “You’re the girl from ‘Maria’! Can I take my picture with you?” I was like: “Yeah! Of course! But can I take a picture with you?” It was awesome. A year before, the customs were the ones stopping me and causing me a headache. So I took a picture of him with me because I know that Josh wouldn’t believe me.


    Q. In one memorable scene, your character chokes down latex pellets filled with 10 grams of heroin. How did that work for the cameras?


    A. They were digestible, thank God. My mother is a pathologist, so when I told her that Josh was making me swallow the pellets, she was like, [sternly] “I need a word with him.” We had a meeting. She said, “What are you going to make my daughter swallow?” Josh calmed her down, saying, “There’s going to be no latex, no drugs.” I think there was, like, a sugar powder inside. I think I ended up swallowing eight pellets. I didn’t see them later, but I didn’t search for them, right? No, thank you.


    Q. Did the role teach you anything about being a young female who is traveling internationally by herself?


    A. Drug mules can’t eat. It’s a big clue. Lots of times, the flight attendants tell customs, “These are the girls who didn’t eat.” So I eat on the plane.


    Q. That’s horrible. Plane food is so bad.


    A. I know! It’s awful. But every time I fly from Colombia, I eat half the chicken or half the meat. The other thing, of course, is that you should try not to look nervous.


    Q. Aren’t the eyes of designers on you at this time, as well? Someone’s profile will be raised just by getting you to wear his or her dress to the Oscars.


    A. Really? I didn’t know this. Everything is so new for me. I don’t look for names of designers. If I like the dress and it fits me and I feel pretty, then I’ll put it on.


    Q. What about your red carpet walk? Have you been practicing?


    A. People tell me: “You’re going to the Oscars. You have to learn how to pose.” But I’m just going to have to be more conscious about cameras, about people taking pictures. The other day, I was watching Cate Blanchett. She’s so beautiful, so stylish. She knows how to do it. I’m just like – [gives a perkily cheerful smile].


    Q. Did you grow up watching the Academy Awards?


    A. No, never. In Colombia, the big thing to watch is the beauty pageants. It’s an event. You gather with your friends. Eat popcorn. Watch Miss Universe.


    Q. Why Miss Universe?


    A. Because we have somebody there – a Colombian representative. So, yes, this year everyone is watching the Academy Awards. This year, they have me.






  • Sidney Lumet. His 1957 film “12 Angry Men,” starring Henry Fonda, was about a bitterly divided jury. “Dog Day Afternoon,” in 1975, was one of Al Pacino’s great performances


    White Socks, Cheap Suits and a Belief in the System


    By MANOHLA DARGIS





    THE filmmaker Sidney Lumet is one of the last of the great movie moralists. It’s no wonder he never won an Academy Award. Numerous directors have been passed over for an Oscar, but it was still something of a surprise that one of our most reliable journeymen, a director whose films have earned more than 40 Academy Award nominations, had never pocketed one of those gold-plated statuettes. On Feb. 27, the Academy of Motion Picture Arts and Sciences will present Mr. Lumet with an honorary Academy Award, the organization’s consolation prize for a lifetime of neglect.


    A leading purveyor of the social-issue movie (police corruption, the injustice of the justice system), the director is, above all, a committed moralist. More than anything, the image that defines a Sidney Lumet film is that of a man – and, almost inevitably, it is a man – struggling with his conscience and against the world. The Lumet world is populated by low-down and high-flying thieves, crooked but essentially decent cops, whistle-blowers and ambulance chasers, average, everyday, hurting men. Spiritually, at times literally, these are men who are just a generation away from Ellis Island and its promise, men who, like Timothy Hutton’s callow prosecutor in the 1990 police drama “Q & A,” wear white socks with their off-the-rack suits and still believe in the System.


    But the Lumet man is also Nick Nolte’s feral detective in that same movie, a grotesque whose consuming racism also symbolizes all that has gone wrong with the System. This is no small thing, particularly when it comes to American movies, where for the last three decades technical virtuosity has become more important and certainly more bankable than moral reason, and gangsters and pimps from the boardroom to the gutter are lionized, even heroized.


    Unlike many survivors of the film-school generation of the 1960's and 1970's, those easy riders and raging bulls who became mired in solipsism and increasingly empty exercises in style – and unlike, too, many of the show-boating prodigies who followed in their often ahistorical, apolitical wake – Mr. Lumet has remained stubbornly engaged with the world.


    Like the filmmaker Frederick Wiseman, our greatest documentarian of the social institution, Mr. Lumet is fascinated by how individuals are molded by the institutions in which they find themselves. Again and again, from his 1957 feature debut, “12 Angry Men,” about a bitterly divided jury, to the director’s 1981 film about a bad cop struggling to do right, “Prince of the City,” his films express both the dramatist’s concern with human struggles and a social scientist’s interest in society and social change. Few American filmmakers capture the dirty gleam of enamel-paint walls, the chipped wood-veneer and naked light-bulb ugliness of institutions as persuasively as Mr. Lumet. Fewer still understand what happens when a man beats his head bloody against those same walls, as does the title character in “Serpico,” a depressingly evergreen drama about police corruption.


    Released in 1973, “Serpico” was the first film Mr. Lumet made with Al Pacino. Their second collaboration, “Dog Day Afternoon,” released two years later, remains the director’s sublime achievement and, with the first two “Godfather” films, features the actor’s greatest screen performance. The film is based on the true story of John Wojtowicz, who, after clumsily commandeering a Brooklyn bank and taking its employees hostage, achieves a perverse pop celebrity. As he struts in front of the bank, waving a white handkerchief and chanting “Attica” to the cheering crowd, Mr. Pacino simultaneously comes across like Mick Jagger and a kid itching for a schoolyard fight. What makes this ridiculous man unbearably touching is the tender regard that the film holds for the character, who needs money to pay for his male lover’s sex-change operation.


    As in Mr. Lumet’s best work, the pleasures of “Dog Day Afternoon” emerge principally through the performances – John Cazale, who played Fredo in “The Godfather,” is equally good here as Mr. Pacino’s befuddled bank-robbing partner – and from a specific sense of place. Repeatedly, New York becomes a kind of character in Mr. Lumet’s films, alternately a silently condemning witness, a galvanizing historical force and an emblem of the collective conscience. That’s true whether the action is at the corner of 116th and Park in “The Pawnbroker,” his 1964 drama about a Holocaust survivor cut off from the life around him, or the glimpses of St. Mark’s-in-the-Bowery Church in Mr. Lumet’s 1966 adaptation of the Mary McCarthy novel “The Group,” gulps of fresh air in an otherwise dead offering.


    As “The Group” confirms, not all of Mr. Lumet’s adventures in the screen-directing trade have been happy. There have been the usual disappointments, films that were taken away from him and those that, despite his efforts, simply got away from him. There is no need to revisit “The Wiz” (1978), his disco-inflamed riff on the classic tale with Diana Ross and Michael Jackson. “The idea of doing any musical was thrilling,” Mr. Lumet later explained; too bad he wasn’t tapped to direct “Chicago.” And while it’s endearing that the director can rise to a gentlemanly defense of Melanie Griffith, who played a New York cop who goes undercover in a Hasidic community in his 1992 film “A Stranger Among Us,” the less said, the better about this young lady who fell very far.


    It is understandable why that particular script tugged at Mr. Lumet, the son of a Yiddish theater actor. Mr. Lumet was born in Philadelphia in 1924, but grew up in New York, the backdrop for most of his best films. Mr. Lumet initially followed his father into acting, both in radio and Yiddish theater, making his Broadway debut as one of the original Dead End Kids. The son also appeared in a 1935 Yiddish short film called “Cigarettes,” as a shoeless cigarette vendor, and “One Third of a Nation,” a 1939 agitprop feature produced by the Works Progress Administration. Mr. Lumet’s acting career was interrupted by World War II, during which he served in the Signal Corps; afterward, he studied acting with Sanford Meisner, helped found a theater troupe, directed plays and, in 1950, ventured into television.


    It was in television that he honed his directing chops, working for “Playhouse 90” and “Kraft Television Theater,” and becoming part of a new generation of television-nurtured talent that eventually included John Frankenheimer and Arthur Penn. Both the theater and live television proved valuable, surprisingly complementary training grounds. Mr. Lumet became an adherent of preproduction rehearsals, a “minimum of two weeks,” and learned how to preplan his movies, shooting only what he needed and pre-editing as much as possible in camera. “You are forced by their nature,” he said of the two mediums, “to make the dramatic selection in advance.” His early dual tenure in theater and television may also help account for his deeply sympathetic direction of actors and longtime affinity for human-scaled drama.


    Like Martin Scorsese, Mr. Lumet never became a true movie industry insider. “That place has no reason for being,” he once said of Hollywood. “All the great centers of art have been centers of other things … London, Paris, New York, Berlin – they’ve had other functions; the life of the place has been connected to the mainstream of life of that nation, of those people, and art came as a flower of that.” In New York, Mr. Lumet had a canvas on which to develop a recognizable visual style, low to the ground and stripped of gloss, as well as the inspiration for a wealth of stories and real lives from which his art could flower. No wonder Hollywood gave him the brush-off, even while he was doing his part to sustain its great humanist tradition.


    Film critics like to champion the humanism of artists like Jean Renoir, as if their genius – and historical distance – gave us license to applaud their decency. It is no small irony that many of those same critics, many of whom lean to the left, are often tougher on filmmakers like Mr. Lumet, whose artistry doesn’t always match their good intentions. We mock the do-gooders even as we refuse to take issue with the dehumanizing violence and various “isms” that pass for entertainment on our screens. Politics may be cool, at least in documentaries, but forget about social engagement, fighting the good fight, moral outrage.


    For Mr. Lumet, who has coaxed numerous great performances from his actors and plenty of solid scenes from his many fine films, doing the right thing has also always been the human thing, the only thing.






  • February 13, 2005

    Helpmates and Heroes: Ordinary Women and Bold Men


    By A. O. SCOTT





    WHO do we think we are?


    This may seem like an arbitrary question to ask of the 10 performances nominated in the Academy of Motion Picture Arts and Sciences’ lead acting categories – it might make more sense to ask who the 1,300 or so actors who made the nominations think they are – but indulge the conceit for a moment. Consciously or not, even those of us old enough to know better still regard the movies as a mirror. It’s not exactly that we go to them to learn how to behave, or that we emulate what we see, but rather that we seek out connections to, and reflections of, our ideal and actual selves. And we look for these shadows and illuminations, more than anywhere else, in the faces and postures of the actors, and in the carefully framed and foreshortened lives of the characters they inhabit.


    The five men and five women singled out for special attention by their peers at Oscar time are never the whole picture, but they nonetheless offer a suggestive snapshot, a possible response to questions that perennially preoccupy the culture, and that drive us to seek out stories of boxers, flyboys, drug mules and musicians in the first place. More than that, the tradition of sorting lead and supporting performances according to sex is an invitation to wonder about what these incarnations say about what men and women are, are not and should be.


    One thing they are not is lovers. Most of the protagonists who dominate the Oscar field this year do have husbands, wives, girlfriends and boyfriends, but for the most part their romantic lives took second place to more practical concerns. The big subjects were work, art, fame and survival; and relationships with friends, colleagues, rivals and children (biological and surrogate) sustained more drama than courtship and marriage.


    Given this antiromantic bias, it is perhaps not surprising that, once again, men had a better year than women, a discrepancy that has more to do with the conventions and blind spots of American filmmaking than with actual social arrangements. If you look at the list of men nominated for best actor – Don Cheadle, Johnny Depp, Leonardo DiCaprio, Clint Eastwood, Jamie Foxx – it is not hard to keep going, until you have come up with an equally varied and impressive second list of the excluded. Paul Giamatti from “Sideways” is the first name that comes to mind, of course, and his absence seems like the most obvious injustice, but Liam Neeson in “Kinsey,” Jeff Bridges in “Door in the Floor,” Tom Cruise in “Collateral” and Jim Carrey in “Eternal Sunshine of the Spotless Mind” are also easily imaginable might- (or should)-have-beens.


    On the women’s side, though, no such alternative list presents itself. Not only do most of the nominees come from small movies, but it is hard to come up with the names of other actresses who might have qualified. There was the hard-working, perennially nomination-worthy Nicole Kidman in “Birth” and “Dogville,” but neither a high-toned ghost story nor a three-hour Brechtian allegory was likely to catch the academy’s eye, especially since neither one did especially well with the mainstream audiences whose tastes the academy exists to ratify. After Ms. Kidman, in any case, the field grows sparse indeed – assuming we limit it, as the nominating voters almost always do, to American or English-language movies. In Asia, in Europe and in Latin America the lives and fates of women seem to supply endless grist for the cinematic imagination, as indeed they used to in this country. But as the traditional vehicles of female stardom – the melodrama and the romantic comedy – have fallen from favor in Hollywood, the range of parts available to women has shrunk, leaving several generations of gifted actresses shuttling among the durable supporting roles of wife, girlfriend, mother, stripper and prostitute. Every year, it seems, someone makes this complaint, and every year the film industry responds by justifying it anew.


    But at least the five best actress nominees provide a reminder that there is a great deal more that women can do, on screen and, by implication, elsewhere. Indeed, it is striking that, with the exception of Kate Winslet in “Eternal Sunshine,” none of the nominees plays a traditional romantic lead. While Vera Drake, Imelda Staunton’s character in Mike Leigh’s movie of that name, is a devoted wife and mother, and while her family life figures prominently in the film, its dramatic crux involves her workaday activities as a clandestine abortionist, “helping girls out” in post-World War II London. And even though the story of Annette Bening’s Julia Lambert in “Being Julia” is organized around her unhappy love affair with a young American adventurer, it is really the glamour and frenzy of her working life as a star of the London stage that give this rather flat backstage costume drama what liveliness it has. The vigorous joy of Ms. Bening’s performance comes from her conviction that an actress is, above all, a woman in action, who embodies the possibilities of freedom, self-assertion and decisiveness in the face of adversity or indifference.


    This is a trait Julia shares with Maria Alvarez, the young Colombian woman played by Catalina Sandino Moreno in “Maria Full of Grace,” and also with Maggie Fitzgerald, Hilary Swank’s boxer in “Million Dollar Baby.” Both of these young women start out facing appalling obstacles, which arise not only from poverty but also from a soul-killing social obscurity. Working in a flower warehouse and waiting tables at a run-down diner do not just oppress Maria and Maggie with long hours and low pay; these jobs also seem to insult their very sense of being, to prescribe a life of invisibility and passivity they are determined to defy. Julia Lambert, blessed by fame and privilege, assumes such defiance as an entitlement, which is why she triumphs so magnificently. For her part, Vera Drake, a humble cog in the English class system, takes diffidence and unobtrusiveness to be her rightful share in life, which is why she puts up so little fight when the power of the state comes down upon her.


    In different ways, “Maria Full of Grace” and “Million Dollar Baby” both revisit the theme of upward striving that has been a staple of American movie realism at least since the Depression. The paths their heroines take – drug smuggling and prize-fighting – are different of course, as are the fates that await them at the end of their struggles, but the defining contours of their personalities are remarkably similar. They are stubborn, quick on their feet and sometimes reckless, meeting even the most gruesome and terrifying prospects with winning, heartbreaking decisiveness. Though both Ms. Sandino and Ms. Swank are lovely to look at, they function, in these movies, much more as objects for identification than objects of desire. Their ordinariness, the feeling that the world they occupy is one we might inhabit, however distant it may be from our actual lives, is part of what makes the audience care about them. And even though she dwells in the allegorical, quasi-science fiction landscape of Charlie Kaufman and Michel Gondry’s wild imaginations, Ms. Winslet’s Clementine may be the most familiar character of them all – a flaky, passionate, changeable creature as maddening as she is irresistible. Clementine could be someone you know, or someone you’ve dated, or – most likely, perhaps – someone you’ve dated’s idea of you.


    All of which may doom Ms. Winslet’s chances, since familiarity, in the eyes of Oscar voters, often breeds neglect. How else to explain their indifference to Mr. Giamatti’s performance? He played a weak-willed loser in a comedy, and he did so with impeccable understatement – three strikes against him in a contest that tends to reward the traditional movie-star virtues of stoicism, heroism, high seriousness and scenery-chewing. One thing movie stars love to do is play other famous people, and the most notable contrast between the men and the women this year is that, while the women are all playing fictional, everyday folk, 80 percent of the men are playing real people with at least some claim on public renown. Mr. Eastwood is the exception, but he is such a familiar screen presence – the whisper, the squint, the craggy, inscrutable visage – that he seems at times to be playing himself. Frankie Dunn, the grizzled trainer who takes a chance on Maggie Fitzgerald and becomes her mentor and surrogate father, is an especially subtle and lovely variation of the persona Mr. Eastwood has honed over the years: the weary, aging man of action who rouses himself to one last fight. The pleasure in watching him comes from the ease with which the actor slips into this persona and the unobtrusive way he discovers its hidden emotional dimensions, allowing us to be surprised by someone we thought we had known all along.


    Mr. Foxx’s galvanizing performance in “Ray” brings its own, distinctive kind of surprise, the delightful shock of seeing a famous figure so uncannily impersonated. This is the kind of virtual reincarnation that almost justifies the pedestrian conventions of the biopic, and is surely what attracts gifted actors to the genre. Of all last year’s real-life subjects, Ray Charles was both the most charismatic and the most recently departed, which made “Ray” both accessible and risky. It was impossible not to measure Mr. Foxx against the real Ray Charles, a comparison that Mr. DiCaprio, as Howard Hughes in “The Aviator,” and Mr. Depp, as J. M. Barrie in “Finding Neverland,” were less likely to face. But like “Ray,” “The Aviator” and “Finding Neverland” capitalized on the curiosity that public figures continue to inspire even after their initial luster has faded, and on our hunger for information about their private lives. This is not only prurience, but also a kind of biographical superstition. If we glimpse these people behind the scenes and out of the spotlight – if we see them as vulnerable children, straying husbands, doting fathers, drug addicts or madmen – we might find ourselves in possession of their secrets, and understand the tics of personality and the accidents of fate that link us to them even as they explain their distinction from the rest of us.


    Of course, such understanding is illusory, which is why screen biographies, even when well executed, tend to be dramatically unsatisfying. What we get from them is not really understanding, but proximity. We spend a few hours in the company of Howard Hughes or Ray Charles or J. M. Barrie so that we can be glamorized by the strangeness of their lives and the ultimate enigma of their characters. We may admire, pity or disapprove, but we rarely identify in the way that we do with ordinary, unreal characters, and we experience a similar distance from the actors portraying them, which is why it is so gratifying to watch them take the stage on Oscar night.


    In this context, Mr. Cheadle is an intriguing hybrid: his character, the Rwandan hotel manager Paul Rusesabagina, is both real and ordinary, a person with whom we can identify even as his heroism makes him seem larger than life, a private citizen pulled by circumstances into actions of enormous public visibility and consequence. All of this may make him an ideal Oscar candidate, not only because he acts with such grace and conviction, but because he answers the question we ask of screen actors – who do we think we are? – by showing us what we would most hope to be.






  • From left, Billy Idol, George Michael, Duran Duran, Deborah Harry and Robert Smith of the Cure.


    February 13, 2005

    We Hate the 80's


    By JEFF LEEDS





    NIKKI SIXX, the bassist for the famously fast-living glam-rockers Motley Crue, believes that even 24 years after their debut, his band still has a certain timeless aspect. “If you want to drop the tailgate, get some beer and go to a strip club, that’s the Crue,” he said the other day before a rehearsal for the band’s new tour. Yet Mr. Sixx’s band, which just released a two-disc career anthology including 1987's “Girls, Girls, Girls” and 1989's “Dr. Feelgood,” is returning at a particularly apt moment.


    The music of the 1980's has re-entered the zeitgeist in a gigantic way. What began more than a decade ago with 80's nightclubs spread soon after through “flashback” lunch hours across the radio dial. After that came the retro-tinged success of blockbuster films like “The Wedding Singer” and pastel-saturated video games like “Grand Theft Auto: Vice City.” On television, the hits of that decade now fill the soundtracks of countless popular series, like “The O.C.,” which chose a cover of the OMD hit “If You Leave” for a decisive scene last season. And VH1, of course, has built a franchise on 80's exhumations, with “Big 80's” and the wildly popular “I Love the 80's.” Altogether, it’s proof that the synthesizer-powered pop songs and hair-sprayed headbangers of that era still have a strange hold on the 30-something demographic so desirable to advertisers.


    The recording industry was slow to act, but over the last year and a half it has belatedly started trying to cash in on it all. Performers lost in the pop wilderness for a generation suddenly decided to get in touch with their old, often estranged mates, and get the band back together in the name of art, commerce or both. A raft of once-popular acts, from the danceable R&B group New Edition to the pop idols Duran Duran and George Michael to the more self-serious Tears for Fears to the standard-bearers of teenage angst, the Cure, all shook off the dust and signed new recording contracts in the past 18 months or so, releasing CD’s of new music, in some cases for the first time in 15 years. In the footsteps of Motley Crue’s double album, the stylishly snarling Billy Idol, the dark darlings New Order and the famously burly rapper Heavy D will be releasing new albums as well.


    All have returned with attendant fanfare, sweeping across red carpets and past screaming fans at radio station visits and showcase concerts.


    Yet despite the grass-roots enthusiasm and VH1 dogma – not to mention millions of dollars in marketing – the 80′s are not selling. People may be donning the once-again fashionable styles of the era (even leg warmers and Flashdance tops) and dancing to the bands of their youth, but they are not going to the store to buy the albums. For the industry that bet on the revival, it’s mourning in America.


    Take Tears for Fears. After scoring huge hits with the melancholy anthems “Everybody Wants to Rule the World” and “Shout,” they broke up. Their subsequent decade of silence was broken about five years ago when one member, Curt Smith, sent a fax to his former partner, Roland Orzabal, and the two started writing songs together. They eventually received a six-figure advance from Universal’s New Door label, and found themselves playing radio station-sponsored concerts and meeting fans at in-store appearances at Tower Records. According to Nielsen SoundScan data, through Jan. 30 their new album, “Everybody Loves a Happy Ending,” had sold just 80,000 copies – a far cry from their last album, “The Seeds of Love,” which sold about one million copies.


    Duran Duran, who drew a fanatical following in the early 80′s with their roguish good looks and new-wave hits “Rio” and “Hungry Like the Wolf,” made a huge splash last year when all five members reunited for a series of concerts (including five sellouts at Wembley Arena in London) and signed a deal with Epic Records for an estimated $500,000. Since then they have been back to woo their rabid female fans, making multiple appearances on “Regis & Kelly” and at fashion-industry events. But in contrast to their hits, which routinely sold one million copies or more, the band’s reunion album, “Astronaut,” has sold only about 199,000 copies in the United States. Motley Crue is expected to enjoy a strong debut, but after that, the precedent is poor.


    “The 80′s nostalgia boom is real, but it’s not broad,” said Michael Hirschorn, executive vice president of programming for VH1. “It doesn’t apply to everything and not in all ways. It applies to a specific kind of Gen X, self-mocking, slightly ironic thing. For this group of people, you can’t give them straight nostalgia of the sort of baby-boomer, ‘everything was wonderful and great when we were kids’ feel. People Gen X and younger know that things weren’t that great. We never thought that Motley Crue was saving the world. We identify with them passionately, but with a certain wink.”


    Reviving the careers of artists who have retreated from the pop music scene is never a simple affair, but it has been done – usually by updating their image and appealing to new fans at least as directly as old ones. The bluesy rockers Aerosmith nearly disbanded in the 70′s, but they re-emerged in the mid-80′s via the new medium of music videos, appearing with Run D.M.C. in the rappers’ cover of “Walk This Way.” They brought in outside songwriters to develop their next album, the hit “Permanent Vacation,” and later produced a series of popular high-concept videos. More recently, the guitar virtuoso Carlos Santana, who had struggled to sell records since the late 70′s, stormed back in 1999. Arista Records recast the rock legend as pop radio star, pairing him with a string of popular artists en route to releasing the blockbuster “Supernatural” album.


    To a degree, the executives orchestrating the returns of the 80′s artists are following those formulas, aiming to update their image and package them for new fans. Duran Duran brought in some very au courant producers – Don Gilmore, who had previously produced the band Linkin Park, and Dallas Austin, who has worked with Pink and TLC – for their album. Tears for Fears cast the it-girl Brittany Murphy in its new music video. Billy Idol, wary of simply performing a hits revue for older fans, has booked a date to play South by Southwest, an annual buzz-band conclave in Austin, Tex.


    And on some level, these strategies have been successful: consumer research by the New Door label showed that about half of the people who bought Tears for Fears’ new album were new to the band’s material, said Bob Mercer, a senior vice president for Universal. Promotion of the new albums has also helped these bands’ sales overseas, or on older recordings. But when it comes to the new material, the 30-something American fans who should logically form the artists’ core audiences just aren’t turning up in droves.


    According to Ann Fishman, president of Generational Targeted Marketing, the problem’s not with the music, it’s with the memories. The fans from Generation X, she says, “are not particularly grounded in their youth.”


    “Would you be grounded in something where you had divorced parents, poor schooling?” she asks. “We presume nostalgia is a great selling tool. It is to the baby boomers. It’s not to Gen X. The history of their youth has forced them to grow up more quickly. Nostalgia is not necessarily something that’s going to move them ahead. They enjoy the music of their youth, but it’s not a need.”


    The theory might help to explain why Madonna and Prince had a very good year. They both made it big in the 80′s, but (with the exception of a brief hiatus on Prince’s part) they kept performing, kept evolving, kept updating their images. Their recent albums weren’t hastily convened revivals; they were simply the latest chapters in long and varied careers.


    Making the odds that much longer, the long-lost stars of the 80′s are returning to a music establishment they might barely recognize. The machinery that transformed them into mass phenomena two decades ago – mainly Top 40 radio and MTV – has long since been dismantled or redesigned. The radio dial has splintered into tightly managed formats aimed at specified niches, which may not be receptive to revivals. “There’s resistance from radio to play some of these artists,” said Jon Zellner, who oversees programming on so-called hot adult-contemporary stations for Infinity Broadcasting. He said he decided against playing Tears for Fears, among others. “I think programmers are potentially afraid of their radio stations sounding dated.” As for MTV, and the music video itself, they aren’t so new anymore, and the cable giant now devotes far more airtime to reality programming and lifestyle shows.


    New bands now establish themselves through outlets that didn’t exist five years ago, let alone 20, like AOL’s “Sessions,” a live performance for online viewers, or MySpace, an online community popular with music fans. And those formats don’t favor bands in their 40′s and 50′s.


    As a result, some label executives said they had turned away former stars who came shopping for new record contracts. “I just wasn’t convinced that the songs were compelling enough to compete in today’s marketplace,” said Andrew Slater, president of Capitol Records, who says he passed on both Duran Duran and Billy Idol. “On the television side, you might have someone perform on a late-night show, but ultimately I don’t think it’s enough to drive a passive audience to all drop what they’re doing in their lives and find that connection to the artist that they loved in the 80′s.”


    But these bands are expected to do extremely well in their North American concert tours. Motley Crue, for one, will be paid minimum fees of up to $250,000 per night on their tour. Duran Duran, in addition to big appearance fees, is cashing in on the trend toward V.I.P. tickets, offering their most devoted fans the chance to buy travel packages, including a two-night hotel stay and signed memorabilia, for $2,590 per person.


    But those lucrative concerts play to fans eager for one (or two) glorious nights of nostalgia, not those interested in watching the band try to grow.


    “It’s hard enough now doing any of the old material because obviously we just want to do the new material,” said Mr. Smith of Tears for Fears. “It’d be horrible to be playing onstage and have all these people in the front saying, ‘Play “Shout.” ’ The emotion in a lot of the songs we wrote back then really doesn’t mean anything to us now. There are certain emotions you have in your late teens and 20′s that really don’t exist when you turn 40. There’s a certain angst we had then that doesn’t exist now. Now we have middle-aged angst.”


    The stars of the 80′s also now have middle-aged bodies, and hauling them around the country on long tours isn’t as easy as it once might have been. Mick Mars, the guitarist for Motley Crue, has undergone hip replacement surgery. Mr. Smith has two young children.


    Still, you won’t hear any of them complaining too loudly. Pop music has always been a young person’s game, and for those who get a rare second turn in the spotlight, even tepid album sales and a backward-looking concert tour are a rush. But for many fans watching the marketing machinery creak into gear, the industry’s attempt to catch up to what was once just a kitschy, spontaneous goof feels all too familiar.


    The age bracket that grew up with MTV, after all, has already seen more than its fair share of commercially driven revivals, from the re-release of “Star Wars” to the creation of Nick at Nite to a new music festival at Woodstock (twice). Perhaps after all those retreads, a backlash is inevitable.


    In Baltimore, for example, Benn Ray, the co-owner of an independent bookstore, Atomic Books, has started up a regular “I Hate the 80′s” party to mock the trend.


    “The 80′s nostalgia was starting to roll in,” he said, “and I was like, ‘Wait a minute! Did you people actually listen to the same decade I did? You had eight years of Reagan. There was cocaine everywhere. There were yuppies. We were oppressed by this whole notion of baby boomers trying to cash out.’ ” At past parties, attended by people wearing parachute pants and Members Only jackets, local bands performed their most hated 80′s memories on Casio keyboards, which they promptly demolished at the end of their set. “One year,” he recalled, “a performer called Evil Pappy Twin played Van Halen covers on a classical Renaissance lute.”


    In any case, the clock is running out. Whether you love it or hate it, the second coming of the 80′s has already lasted almost as long as the original decade – unheard of in the ever-quickening cycles of cultural nostalgia. Mr. Hirschorn of VH1 admits that “the early 80′s are sort of getting long in the tooth.”


    Besides which, the 90′s – remember them? – are ready for their retouched close-up. “With the Lights Out,” a box set by the decade’s greatest heroes, Nirvana, ranked as one of the holiday season’s best sellers. Trendy bars have started 90′s nights, or even adopted entire 90′s-revival décor themes.


    And VH1, of course, has already brought out a new series called “I Love the 90′s.”




    Copyright 2005 The New York Times Company | Home | Privacy Policy | Search | Corrections | RSS | Help | Back to Top


  • Frank Rich


    How Dirty Harry Turned Commie







    THE day the left died in Hollywood, surely, was the day that a few too many Queer Eyes had their way with Michael Moore as he set off on his Oscar campaign. The baseball cap and 1970′s leisure ensemble gave way to quasi-Libeskind eyeglasses and spiky hair that screamed “I am worthy of a cameo on ‘Entourage.’ ” But not worthy of an Oscar. “Fahrenheit 9/11” got zero nominations, leaving the Best Picture race to five apolitical movies. Since none of those five has yet sold $100 million worth of tickets, let alone the $350-million-plus of a “Lord of the Rings”-level megahit, the only real drama accruing to this year’s Oscar telecast was whether its ratings would plunge as low as the Golden Globes.


    But two weeks out from the big night, the prospects for a little conflict are looking up. Just when it seemed that Hollywood had turned a post-election page in the culture wars, the commissars of the right cooked up a new, if highly unlikely, grievance against “Holly-weird,” as they so wittily call it. This was no easy task. They couldn’t credibly complain that “The Passion of the Christ” was snubbed by the movie industry’s “elite” (translation: Jews), since it nailed three nominations, including one for makeup (translation: really big noses). That showing bested not only “Fahrenheit 9/11” but “Shrek 2,” the year’s top moneymaker. Nor could they resume hostilities against their perennial bogeymen Ben Affleck, Susan Sarandon, Sean Penn, Barbra Streisand and Whoopi Goldberg. All are nonplayers in this year’s awards.


    So what do you do? Imagine SpongeBob tendencies in the carefully sanitized J. M. Barrie of “Finding Neverland”? Attack a recently deceased American legend, Ray Charles, for demanding that his mistress get an abortion in “Ray”? No, only a counterintuitive route could work. Hence, the campaign against Clint Eastwood, a former Republican officeholder (mayor of Carmel, Calif., in the late 1980′s), Nixon appointee to the National Council on the Arts and action hero whose breakthrough role in the Vietnam era was as a vigilante cop, Dirty Harry, whom Pauline Kael famously called “fascist.” There hasn’t been a Hollywood subversive this preposterous since the then 10-year-old Shirley Temple’s name surfaced at a House Un-American Activities Committee hearing in 1938.


    No matter. Rush Limbaugh used his radio megaphone to inveigh against the “liberal propaganda” of “Million Dollar Baby,” in which Mr. Eastwood plays a crusty old fight trainer who takes on a fledgling “girl” boxer (Hilary Swank) desperate to be a champ. Mr. Limbaugh charged that the film was a subversively encoded endorsement of euthanasia, and the usual gang of ayatollahs chimed in. Michael Medved, the conservative radio host, has said that “hate is not too strong a word” to characterize his opinion of “Million Dollar Baby.” Rabbi Daniel Lapin, a longtime ally of the Christian right, went on MSNBC to accuse Mr. Eastwood of a cultural crime comparable to Bill Clinton having “brought the term ‘oral sex’ to America’s dinner tables.”


    “What do you have to give these people to make them happy?” Mr. Eastwood asked when I phoned to get his reaction to his new status as a radical leftist. He is baffled that those “who expound from the right on American values” could reject a movie about a heroine who is “willing to pull herself up by the bootstraps, to work hard and persevere no matter what” to realize her dream. “That all sounds like Americana to me, like something out of Wendell Willkie,” he says. “And the villains in the movie include people who are participating in welfare fraud.”


    What galls the film’s adversaries – or so they say – is a turn in the plot that they started giving away on the radio and elsewhere in December, long before it started being mentioned in articles like the one you’re reading now. They hoped to “spoil” the movie and punish it at the box office, though there’s no evidence that they have succeeded. As Mr. Eastwood has pointed out, advance knowledge of the story’s ending did nothing to deter the audience for “The Passion of the Christ.” My own experience is that knowing the ultimate direction of “Million Dollar Baby” – an organic development that in no way resembles a plot trick like that in “The Sixth Sense” – only deepened my second viewing of it.


    Here is what so scandalously intrudes in the final third of Mr. Eastwood’s movie: real life. A character we love – and we love all three principals, including the narrator, an old boxing hand played by Morgan Freeman – ends up in the hospital with a spinal-cord injury and wants to die. Whether that wish will be granted, and if so, how, is the question that confronts not just the leading characters but also a young and orthodox Roman Catholic priest (Brian F. O’Byrne). The script, adapted by Paul Haggis from stories by F. X. Toole, has a resolution, as it must. But the movie has a powerful afterlife precisely because it is not an endorsement of any position on assisted suicide – or, for that matter, of any position on the disabled, as some disability-rights advocates have charged in a separate protest. The characters of “Million Dollar Baby” are complex and fictional, not monochromatic position papers outfitted in costumes, and the film no more endorses their fallible behavior and attitudes than “Ray” approves of its similarly sympathetic real-life hero’s heroin addiction and compulsive womanizing.


    “I never thought about the political side of this when making the film,” Mr. Eastwood says. He is both bemused and concerned that a movie with no political agenda should be construed by some as a polemic and arouse such partisan rage. “Maybe I’m getting to the age when I’m starting to be senile or nostalgic or both, but people are so angry now,” he adds. “You used to be able to disagree with people and still be friends. Now you hear these talk shows, and everyone who believes differently from you is a moron and an idiot – both on the right and the left.” His own politics defy neat categorization. He’s supported Democrats (including Gray Davis in the pre-Schwarzenegger era) as well as Republicans, professes the libertarian creed of “less government” and “was never a big enthusiast for going to Iraq but never spoke against it once the troops were there.” In other words, he’s in the same middle as most Americans. “I vote for what I like,” he says. “I’m not a loyalist to any party. I’m only a loyalist to the country.” That’s no longer good enough, apparently, for those who feel an election victory has empowered them to enforce a strict doctrine of political and spiritual correctness.


    It’s a standard tactic for these holier-than-thou bullies to cite movies they don’t like as proof that, in Mr. Medved’s formulation, “the entertainment industry” is “not in touch with the general public.” The industry’s profits prove exactly the reverse, but never mind. Even in this case, were Mr. Eastwood’s film actually an endorsement of assisted suicide, the public would still be on his side, not his critics’. The latest Gallup poll on the subject, taken last year, shows that 53 percent of Americans find assisted suicide “morally acceptable” as opposed to the 41 percent who find it “morally wrong.” (The figures for Catholics are identical).


    But the most unintentionally revealing attacks on “Million Dollar Baby” have less to do with the “right to die” anyway than with the film’s advertising campaign. It’s “the ‘million-dollar’ lie,” wrote one conservative commentator, Debbie Schlussel, saying that the film’s promotion promises “ ‘Rocky’ in a sports bra” while delivering a “left-wing diatribe” indistinguishable from the message sent by the Nazis when they “murdered the handicapped and infirm.” Mr. Medved concurs. “They can’t sell this thing honestly,” he has said, so “it’s being marketed as a movie all about the triumph of a plucky female boxer.” The only problem with this charge is that it, too, is false. As Mr. Eastwood notes, the film’s dark, even grim poster is “somewhat noiresque” and there’s “nobody laughing and smiling and being real plucky” in a trailer that shows “triumph and struggles” alike.


    What really makes these critics hate “Million Dollar Baby” is not its supposedly radical politics – which are nonexistent – but its lack of sentimentality. It is, indeed, no “Rocky,” and in our America that departure from the norm is itself a form of cultural radicalism. Always a sentimental country, we’re now living full time in the bathosphere. Our 24/7 news culture sees even a human disaster like the tsunami in Asia as a chance for inspirational uplift, for “incredible stories of lives saved in near-miraculous fashion,” to quote NBC’s Brian Williams. (The nonmiraculous stories are already forgotten, now that the media carnival has moved on.) Our political culture offers such phony tableaus as a bipartisan kiss between the president and Joe Lieberman at the State of the Union, not to mention the promise that a long-term war can be fought without having to endure any shared sacrifice or even too many graphic reminders of its human cost.


    Last Sunday’s was the first Super Bowl in 19 years that didn’t feature the “I’m Going to Disneyland” spot for the victor, but maybe that’s because it’s superfluous. Whether in reaction to the trauma of 9/11 or for reasons that are as yet unknowable, we seem determined to will ourselves into Fantasyland at all times. This cultural syndrome is perfectly encapsulated by Jacques Steinberg’s report in The New York Times last week of a new ABC “reality” program with the working title of “Miracle Workers.” In this show, in which DreamWorks is also a participant, a “dream team” of physicians will miraculously run to the rescue of critically ill Americans, the perfect imaginary balm for what ails a country spiraling into a health-care catastrophe.


    There’s no dream team, either in the boxing arena or in the emergency room, in “Million Dollar Baby.” While there is much to admire in the year’s other Oscar-nominated movies – the full-bodied writing in “Sideways,” the cinematic bravura of “The Aviator,” the awesome Jamie Foxx in “Ray” – Mr. Eastwood’s film, while also boasting great acting, is the only one that challenges America’s current triumphalist daydream. It does so not because it has any politics or takes a stand on assisted suicide but because it has the temerity to suggest that fights can have consequences, that some crises do not have black-and-white solutions and that even the pure of heart are not guaranteed a Hollywood ending. What makes some feel betrayed and angry after seeing “Million Dollar Baby” is exactly what makes many more stop and think: one of Hollywood’s most durable cowboys is saying that it’s not always morning in America, and that it may take more than faith to get us through the night.






  • February 13, 2005

    Clang!


    By MICHAEL SOKOLOVE





    Behold the slam dunk, the pulse-quickening, throw-it-down, in-your-face signature move of the National Basketball Association. The dunk is a declaration of power and dominance, of machismo. In a team game, an ensemble of five players a side, it is an expression of self. In a sport devoted to selling sneakers, the dunk is a marketing tour de force, the money shot at the end of every worthy basketball sequence. (When you see the shoes in the 30-second spot, what is the wearer of those shoes always doing?) Next weekend in Denver, the cultural moment that is the N.B.A. All-Star Game will take place, an event set annually amid a weekend of concerts, lavish parties and showy displays of fashion. On such a big stage (and with defensive standards momentarily relaxed), the game itself is sure to be a veritable dunkathon, a string of self-satisfied throw-downs by the league’s biggest stars. If I had my way, at the conclusion of the game the dunk would be taken out of commission. Banned as a first step toward rescuing a game that has strayed far from its roots, fundamentals and essential appeal.


    The addiction to the dunk is emblematic of the direction in which basketball — like all major pro sports, really — has been heading: less nuance, more explosive force. Greater emphasis on individual heroics and personal acclaim, less on such quaint values as teamwork and sacrifice. Basketball’s muscled-up, minimally skilled dunker is the equivalent of baseball’s steroid-fueled home-run slugger or the guided-missile N.F.L. linebacker, his helmet aimed at anything that moves. It is all part of a video-game aesthetic being transplanted into our real games: the athlete as action hero, an essentially antisocial lone wolf set apart from teammates, dedicated to his own personal glory and not bound by much of anything, even the laws of gravity. (Last month the sports media giant ESPN entered into an $850 million partnership with Electronic Arts, the video-game company that turns real-life athletes into digitized figures, further blurring the distinction between flesh-and-blood athletes and the superhumans we have come to expect in the sports arena.)


    In November, an ugly incident, a brawl between N.B.A. players and fans in Detroit, led some commentators to conclude that pro basketball is populated by thugs. (My online search of the keywords “N.B.A.” and “thug” a month later produced more than 400 hits.) But the fight was an aberration; N.B.A. players are, in my experience, as gentlemanly as (or more so than) athletes in other pro sports. The N.B.A. doesn’t have a thug problem; it has a basketball problem. Its players are the best athletes in all of pro sports — oversize, swift and agile — but weirdly they are also the first to have devolved to a point where they can no longer play their own game.


    Unbelievable as it may seem, you can make millions in today’s N.B.A. without having even one semireliable way to put the ball in the basket — no jump shot, no hook shot, no little 12-foot bank shot. In fact, the entire area between dunking range and the three-point line, what used to be prime real estate for scoring, is now a virtual dead zone. (The three-point shot is the other of the N.B.A.’s twin addictions, but more on that later.) Richard Hamilton of the Detroit Pistons, last year’s N.B.A. champion, has been just about knighted for his ability to consistently sink the “midrange” jumper, which used to be an entry-level requirement for the N.B.A. — if you couldn’t do that, you had to find another line of work. But not anymore. This generation of players is so young, so green, so unschooled (four years of college is now exceedingly rare), so raised on a diet of ESPN highlights that many have nothing but so-called N.B.A. bodies.


    Last year, the New Jersey Nets scored 56 points in a playoff game. Fifty-six! “We just missed shots,” said a Nets player. No kidding. Wilt Chamberlain once averaged more than 50 points a game, all by himself. Two decades ago, teams averaged about 110 points a game; this year, the figure is about 96 points per game (which is actually 3 points better than last season). Presented with players bent on executing highlight-reel dunks — but who otherwise do not pass well, shoot well or move effectively to open spots on the floor — many N.B.A. coaches have slowed the pace to a plodding, unwatchable crawl. And the more important the game, the more slowly it is played. “It’s an incongruity,” Rod Thorn, the president of the Nets, told me. “We have better athletes than ever, but they play at a slower pace. The reason is they’re not as sound fundamentally, so the coaches feel that the faster they play, the more mistakes they’ll make.”


    The dunk, by the way, has been banned once before, for reasons other than the one I am proposing. In 1965, a 7-foot-1 basketball player of uncommon grace and coordination graduated from Power Memorial Academy in New York City and enrolled at U.C.L.A., then the dominant force in college basketball. In his first season, Kareem Abdul-Jabbar (then known as Lew Alcindor) led U.C.L.A. to a national championship. Faced with the probability that no other team would have any chance at a title for the duration of Abdul-Jabbar’s stay, the N.C.A.A. outlawed “basket stuffing,” aka the dunk. No one said straight out that the new rule was meant to handicap the young giant, but it immediately became known as the Alcindor rule. U.C.L.A. still thrived, winning national championships in both of Abdul-Jabbar’s remaining seasons. “After the so-called Alcindor rule was passed . . . some skeptics said he wouldn’t be as great,” John Wooden, the legendary U.C.L.A. coach, observed years later. “They ignored his tremendous desire and determination. He worked twice as hard on banking shots off the glass, his little hook across the lane and his turnaround jumper.”


    In other words, Abdul-Jabbar, already skilled, became even more so. His “sky hook” — released 5 to 10 feet from the basket, with his right arm fully extended and the ball cradled in one hand — remains the most devastatingly effective, and most beautiful, shot in the history of the game. A close second, in terms of grace, might be the “finger roll” of Julius Erving, in which high-flying Dr. J glided above defenders and let the ball roll toward the hoop with his palm facing up, as if he were a waiter extending a serving tray. It is no coincidence that Erving played his college basketball within the years (1967-76) that the Alcindor rule was in effect: the finger roll is the kind of move you invent when the option of just powering it to the basket and stuffing it is not available.


    Earl Monroe, a stylish guard who played for the New York Knicks in the 1970′s, employed “tempo changes only Thelonious Monk would understand,” the music and social critic Nelson George has written. Many others over the years have seen basketball as jazz, an apt comparison when the game is played well — as an amalgam of creativity, individuality, collaboration, improvisation and structure. Much of what makes basketball interesting is the give and take, the constant tension, between individual expression and team concepts. On the best teams, players take their turns as soloists, but not at the expense of others in the quintet.


    The most obvious aspect of basketball, especially at the N.B.A. level, is the extraordinary athleticism of the players. What is less apparent is that the outcome of games, more so than in any other major sport, is determined by a series of social interactions. Basketball coaches have long taught that the ball must be “shared” — passed from player to player until it ends up in the hands of the one with the best possible shot. Players are urged constantly to “talk” on defense — communicate about the alignment and movements of offensive players — and to “give help,” meaning that a defender is not just responsible for the man he is guarding but also for sliding over to help a teammate who has been beaten by his own man. With just 5 players on the court at a time and rosters that consist of just 12 men, N.B.A. teams are intimate groups, extended families almost, and the ones that succeed cover for individual weaknesses and stress their strengths. They play as if they are aware of, and care for, one another.


    One reason that fans of a certain age remember and still cherish the great Knicks teams of the early 70′s is that they seemed to be such a functional, appealing social unit. The guards Walt (Clyde) Frazier, Dick Barnett and Earl Monroe were sort of urban hipsters. Bill Bradley, the dead-eye shooter and future United States senator, was an Ivy League wonk nicknamed Dollar Bill by his teammates for the presumed cost of the bargain-basement suits he wore. Willis Reed and Dave DeBusschere did the dirty work under the basket and were so blue-collar in their approach to the game that it wasn’t hard to imagine them carrying lunch buckets to some M.T.A. railyard. They meshed seamlessly on the court, elevating the concept of sharing the ball (Coach Red Holzman’s mantra was “hit the open man”) to something like an art form. The same could be said of the Los Angeles Lakers of Magic Johnson and Abdul-Jabbar in the 80′s, the so-called Showtime teams. The multitalented Johnson, in particular, was understood to have sacrificed his own scoring in order to involve teammates in a free-flowing, high-scoring offense.


    Few teams play like that anymore because basketball culture in America is broken in ways that go beyond the addiction to dunking or the decline in fundamentals like shooting. It has always been possible to identify extraordinary basketball talent at very young ages. The game’s phenoms present early, like female gymnasts or violin prodigies (and unlike athletes in, say, football or baseball, where seemingly talented 12-year-olds often just fizzle out). What has changed in basketball is that a whole constellation has been created for the phenoms; they are separated out and sent off to dwell in a world of their own. An industry of tout sheets and recruiting services identifies them as early as fifth or sixth grade, and they begin traveling a nationwide circuit of tournaments with their high-powered youth teams. In the summer, the best high-school players attend showcases sponsored by the big sneaker companies. (The latest of the prodigies earned cover notice on Sports Illustrated in January. “Meet Demetrius Walker,” the headline said. “He’s 14 Years Old. You’re Going to Hear From Him.”)


    Quite understandably, these young stars, rather than being prone to sharing the ball, are apt to believe they own it. ”I’m amazed when guys make it out of that system with any sense of perspective at all,” said Jeff Van Gundy, the former Knicks coach now coaching the Houston Rockets. ”It’s not natural to be that catered to at such a young age. We’ve got kids being named the ‘best 11-year-old basketball player in America.’ How the hell do you recover from that?”


    As Van Gundy knows too well, many do not recover. The N.B.A.’s upper tier, its elite performers (the American ones, as opposed to the increasing number of foreign-born players), now typically come out of a system in which they have been pointed toward the ”next level” since grammar school. They have never played in the present tense. Their high-school coach and teammates may well have been secondary to their peer group of nationally recognized megastars. If they stopped off in college before turning pro, it was probably for just a year or two. It is not often easy to coach such a player because he is likely to see himself as a finished product, in no need of instruction, polishing or discipline. (My favorite college coach, John Chaney of Temple University, recently benched a couple of players because they showed up for the team bus without the winter hats he requires in cold weather. Unsurprisingly, Chaney rarely lands any of the nation’s most coveted recruits.)


    Stephon Marbury, the 27-year-old, $14-million-a-year point guard of the New York Knicks and one of the most celebrated schoolboy players ever, is in many ways the embodiment of modern basketball culture. Even among other very good players in his Coney Island neighborhood (including his three older brothers, who all went on to play college ball), he stood out as gifted. As a ninth grader, he was an instant starter at Abraham Lincoln High, the perennial New York City powerhouse. A basketball luminary since grammar school, he had been so eagerly awaited that after just one high-school game, Newsday proclaimed that the ”era of Stephon Marbury” had begun.


    One night earlier this season at the press table at Madison Square Garden, I was seated next to Jeff Lenchiner, the editor of InsideHoops.com, an online magazine for basketball aficionados. During a lull in the game, he turned his laptop computer toward me and directed me to watch an electronic file of Stephon Marbury highlights, an array of breathtaking moves: crossover dribbles that left defenders looking as if they were stuck in cement; spinning, twisting drives to the basket; soaring dunks. The last clip showed the 6-foot-2 Marbury rising up for a jump shot over a taller defender. At his peak, just as the ball left his hand, his sneakers looked to be about three feet above the floor. ”Look at him!” Lenchiner shouted. ”It’s like he’s in a video game. He’s got thrusters!”


    Marbury played one year of college basketball at Georgia Tech before jumping to the N.B.A. A dazzling ball handler, utterly fearless about driving to the hoop against bigger defenders, he has compiled high scoring averages and high assist totals (an assist is a pass that leads directly to a basket) in the pros while at the same time often leaving the strong impression that he does not play well with others. But then again, the concept of being part of a team is one that seems to elude a great many N.B.A. players. Prodigies as kids, they see themselves as virtuosos, leading men with ”supporting casts” (a favorite phrase of Michael Jordan’s) rather than players with teammates.


    On his first pro team, the Minnesota Timberwolves, Marbury chafed at sharing the spotlight with another young talent, Kevin Garnett, and forced a trade. Marbury has yet to play on a team that advanced past one round of the playoffs, even as, in the last three of his four N.B.A. stops (in nine seasons), he has been his team’s unquestioned marquee performer. Marbury this year publicly proclaimed himself the best point guard in the N.B.A. The Knicks promptly lost 14 of their next 16 games, and the coach, Lenny Wilkens, resigned along the way.


    Few of the N.B.A.’s younger stars want to share top billing. Tracy McGrady left the Toronto Raptors rather than stay with another superstar, Vince Carter (who also happened to be his cousin). Allen Iverson of the Philadelphia 76ers is much admired for his grit and competitive spirit, but it is not unusual for him to fire up 30 or more shots in a game in which no teammate takes as many as 15. Kobe Bryant and Shaquille O’Neal famously could not get along in Los Angeles, and with Shaq’s trade, Kobe has been left on a vastly inferior Lakers team, a trade-off he seemed willing to make.


    Marbury was among a dozen N.B.A. players who went to Athens last August to represent the United States at the Summer Olympics. Since N.B.A. players began competing at the Olympics in 1992, the Americans had never lost a game, let alone failed to win the gold medal. But in Athens, the U.S. truly dominated only one game — against the scrappy but overmatched Angolans. The N.B.A. players, who collectively earn more than $100 million a year, suffered relatively close losses to Lithuania and Argentina. They squeaked by Greece, which did have the home-court advantage. Stunningly, the U.S. Olympians were blown off the court by the commonwealth of Puerto Rico.


    In the midst of this tournament, as it was going downhill, Isiah Thomas, the president of the Knicks, called Marbury from New York. Marbury’s game had been muted; he wasn’t taking many shots or being very aggressive. His defense was so lax that the Puerto Rican point guard, Carlos Arroyo, scored 24 points on him (as compared with Marbury’s 2). ”I was just honest with him,” Thomas said, recalling the conversation. ”I told him he was playing like” something that can’t be reprinted here. Thomas advised Marbury: ”Remember who you are.” In other words, be the man, be the wizard of Coney Island. But only one person at a time, of course, can be that kind of player.


    After his talk with Thomas, Marbury responded with a record-breaking barrage of three-pointers in a close victory over Spain, nearly single-handedly preserving the U.S. medal hopes. But the next night he failed to make any three-pointers, and the Americans lost the game to Argentina, and with it any hope for a gold medal. (They settled for a bronze.)


    The Olympic basketball tournament amounted to an indictment of U.S. basketball. If you had just watched the games in Athens and knew nothing of basketball history, it would have been reasonable to conclude that the sport had been invented and popularized in, say, Argentina or Italy — and was just starting to catch on in the United States. Other teams passed better, shot more accurately, played better defense. (Foul shooting is generally regarded as a matter of discipline and repetition. With enough practice, most players can become proficient. It’s worth noting that in Athens, the gold-medal-winning U.S. women’s team made 76 percent of its foul shots while the men connected on a woeful 67 percent.)


    The American men, in defeat, chose to focus on how much better the rest of the world’s players have become and how unfamiliar the U.S. players were with one another and the somewhat different style and rules of international basketball. The larger point, they would not face: after a month together and with the noted basketball teacher Larry Brown of the Detroit Pistons as their coach, they still played as strangers. Seasoned jazz musicians can pick up together in a lounge and play the standards and sound pretty damn good — they would know all the changes in ”Stompin’ at the Savoy” — but the American basketballers had no common basketball language. Five old heads on lunch hour at a gym in North Philly or Harlem could have meshed better.


    This season, some good things are starting to happen in the N.B.A., possibly because the Olympic debacle was such an eye-opener. Scoring has started to edge up for the first time in years, and some coaches have begun to trust their teams to play a fast-breaking style. After years of exporting the game, the N.B.A. is importing not just players but also a style of play from abroad. The high-scoring Phoenix Suns have been the surprise team of the N.B.A. season so far. Their coach, Mike D’Antoni, holds dual Italian and U.S. citizenship and has spent most of his career playing and coaching in Europe. The Suns’ point guard, the master orchestrator of their run-and-gun offense, is Steve Nash, a Canadian. (The Suns signed him as a free agent to replace their point guard of last season, Stephon Marbury.)


    The San Antonio Spurs do not play at the frenzied pace of the Suns, but they are one of the N.B.A.’s best teams and, within the coaching fraternity, probably the most admired. On offense, they are a five-man whirl of movement. A player who passes the ball cuts to the basket. The player receiving a pass either shoots, makes a move toward the hoop or quickly passes to someone else. They execute the old-school ”give and go” play — a player passes to a teammate, cuts, then gets it right back. ”The Spurs are the gold standard,” Van Gundy said.


    As the Spurs took the floor for a November game in San Antonio against the Knicks, I looked in my program and noted the backgrounds of the players in their starting lineup. Rasho Nesterovic is from Slovenia; Tony Parker, from France; Manu Ginobili, star of the gold-medal-winning Olympic team, from Argentina; and Tim Duncan, the Spurs power forward and best player, from St. Croix in the U.S. Virgin Islands. Among the Spurs starters, only Bruce Bowen was born on United States soil, and he spent four years after college toiling for minor-league teams in the U.S. and on the European pro circuit. A key reserve, Beno Udrih, is another Slovenian.


    For Marbury, playing the Spurs must have felt like being back in Athens. Their style is sometimes called Euro-ball, but it is really nothing new: constant motion on offense, hit the open man. It’s the game that used to be played in the U.S. but was forsaken for a more static style.


    The Knicks got off to an early lead, spurred by one of Marbury’s highlight-reel moments: he stole a pass, raced the ball toward his offensive end and shoveled a no-look, behind-the-back, left-handed pass to Nazr Mohammed, who finished the sequence with a dunk. Eventually, though, the Spurs’ teamwork and Duncan’s strong inside presence took over. What kept me fascinated, even after the game was no longer competitive, was that the two teams played according to entirely different geometries. The Spurs made a series of angled passes that usually culminated with the final one advancing the ball closer to the basket. The Knicks’ offense consisted of Marbury using his speed off the dribble to dart inside the lane, and then, when the Spurs defense collapsed on him, he passed the ball back out, farther from the basket — often to beyond the three-point line where teammates were standing still, awaiting a pass.


    ”They do that even on a fast break, not just the Knicks but most of the rest of the teams,” Walt Frazier explained to me. A Knicks broadcaster now, Frazier diagrammed this on a tablecloth as he spoke. He was quite agitated. ”One guy’s got the ball in the middle, and these two guys on the wing here, they should be cutting to the basket, right? But, no, here they go way out here, to three-point land, and they get the ball and shoot it. You’re 6 feet from the hoop; why pass it back out 25 feet? And then people wonder why teams can’t score 80 points.”


    I am guessing that the league’s commissioner, David Stern, the best and the brightest of all sports executives, will not take my suggestion and decommission the dunk shot. It’s too much of a crowd-pleaser — just two points, but so much money in the bank. But I do hope that college and high-school basketball will again ban dunking, so that players on the way up have some chance of acquiring something other than a repertory of slam dunks.


    The three-point shot is another matter altogether. No reason it should not just disappear. ”The dagger!” announcers sometimes call it, as if it were the shock-and-awe of the hardwood, a weapon that brings opposing players to their knees. The three-pointer is a corruption of the sport, a perversion of a century of basketball wisdom that held that the whole point of the game was to advance the ball closer to the basket. If its intent was to increase scoring, the three-point shot definitely has not done that, and if it was to make the game more wide open and exciting, it hasn’t accomplished that either. The unintended consequence of the three-pointer has been to make the game more static as players ”spot up” outside the arc, waiting for the pass that will lead to the dagger.


    ”Michael Jeffrey Jordan is almost certainly more popular than Jesus,” Playboy declared in 1992. ”What’s more, he has better endorsement deals.”


    Money, of course, is at the root of many, probably most, of the N.B.A.’s ills. Because Jordan established that one man can become a brand unto himself, that he can personally elevate a company — no one was more responsible for making Nike into a worldwide cultural force — the N.B.A. is now the only pro league in which a player can become an endorsement king without playing for a winning team. If he’s a spectacular enough dunker, it can happen, even if he plays in some N.B.A. outpost.


    Jordan created this world, but it’s important to remember that he did not grow up in it. Until he was deep into high school, few outside of his hometown had heard of him. When he needed coaching, he was still listening — which is part of what made him worth watching. The same cannot be said of many of his heirs in the sneaker-shilling game.


    The power of the shoe deal (and the hoped-for shoe deal) in basketball cannot be overstated. It induces kids to skip college and go right to the N.B.A. because endorsement money from Nike and other companies can dwarf the salaries they make from playing ball. The shoe deal is specifically what is making the N.B.A. younger — which, in turn, is what is degrading the quality of play.


    Sebastian Telfair of Coney Island was a phenom from an early age, pointed toward bigger things and therefore on the radar of the sneaker companies — just like his cousin, Stephon Marbury. He went to an Adidas-sponsored camp. The teams he played for as a kid, right up through high school, were outfitted in Adidas. Last spring, he took the shoe money, a reported $15 million — from Adidas, of course — and skipped right from Lincoln High to the N.B.A. ”I’ve been Adidas all my life,” he said at the press conference to announce his N.B.A. ascension. I saw him play the other night. He looked small and lost.


    A snapshot from today’s N.B.A.: the locker room of the New York Knicks, where in each dressing cubicle a necktie hangs on a hook, pre-knotted. Isiah Thomas, the team president, has ordered players to wear suits and ties to the arenas, a grown-up enough thing. But during games, a team functionary goes around knotting the ties so that when a player gets dressed afterward, all he has to do is slip the tie over his head and tighten it rather than actually having to make the knot himself.


    One other snapshot: the Knicks bench, with 12 players, 1 head coach and 6 assistant coaches. The Dallas Mavericks have employed as many as 10 assistants, nearly 1 per player. I checked into how many assistants Red Holzman had with the old Knicks. The answer: none. He coached by himself. It was explained to me by people around the league that in the modern N.B.A., a half-dozen or more assistant coaches are needed to help fill in the gaps for young players. In essence, they teach remedial basketball for millionaires.


    What the N.B.A. needs, most of all, is to get older. Last summer, eight first-round draft choices were high-school kids; four were college seniors. There are some true prodigies out there, young men ready to go straight from seventh-period English to the N.B.A. But not that many. The most notable recent one is LeBron James of the Cleveland Cavaliers, who somehow survived intense high-school fame to emerge as a mature, team-oriented professional basketball player.


    For most, though, the N.B.A. is a bad place to learn, no matter how many coaches are available as tutors. The league is increasingly stocked with athletes who might have ripened in college — if they had not been picked so young. They end up stunted. The players are paid, but the fans, and the game, are being cheated.




    Michael Sokolove, author of ”The Ticket Out: Darryl Strawberry and the Boys of Crenshaw,” is a contributing writer for the magazine.





    Copyright 2005 The New York Times Company