February 1, 2007

  • Posing as a Family, Sex Offenders Stun a Town

    Neil H. Rodreick II (Photo: Yavapai County Sheriff)

    Lori Morgan, a teacher in Chino Valley, Ariz., had Neil H. Rodreick II in her class for one day this month. (Photo: Laura Segall for The New York Times)

    EL MIRAGE, Ariz., Jan. 31 — To neighbors, Casey Price was a seventh grader with acne and a baseball cap who lived an unremarkable life among a bevy of male relatives.

    He built the occasional skateboard ramp and did wheelies on his bicycle down the streets of this subdivision of stucco homes north of Phoenix.

    In nearby Surprise, where Casey was enrolled as a 12-year-old in a public school for four months, he was regarded as a shy, average student with chronic attendance problems. A man identified as his uncle had registered him, attended curriculum night and e-mailed his teachers about homework assignments.

    Now Casey is in jail, and his former neighbors and classmates have learned the unthinkable: Not only is Casey not Casey — his real name is Neil H. Rodreick II — but he is also a 29-year-old convicted sex offender who kept a youthful appearance with the aid of razors and makeup.

    And the men known as his uncle, grandfather and cousin, who until recently shared a three-bedroom house with him here, were not family at all, but a web of convicted sex offenders and predators, law enforcement officials say, preying in part on one another.

    A retracing of Mr. Rodreick’s tracks over the past several years shows that he is under investigation in three states. The authorities in four jurisdictions say he repeatedly failed to register as a sex offender, kept a large cache of child pornography on his computer and, based on videos found by the police, had sex with at least one boy.

    “Obviously there are a lot of emotions to work through,” said Mindy Newlin, the mother of a kindergartener at Imagine Charter School, the school in Surprise where Mr. Rodreick posed as Casey. “We are just shocked.”

    Robin Kaiser’s daughter Kaitlin shared a class with “Casey,” but he failed to make an impression, Ms. Kaiser said. “She remembers him, that he was quiet and sat in the back of the classroom,” she said. “She said he looked like he had been held back.”

    Janet R. Lincoln, the public defender for Yavapai County, who represents Mr. Rodreick and the other three men, did not return multiple phone calls. A receptionist in her office said Ms. Lincoln would have no comment. The men have been indicted on numerous counts and are scheduled to appear in court in late February; they have already pleaded not guilty to charges of fraud and failing to register as sex offenders.

    Mr. Rodreick spent seven years in prison in Oklahoma for making lewd and indecent proposals to two 6-year-old boys. After being released in 2002, law enforcement officials said, he was able to convince Lonnie Stiffler, 61, and Robert J. Snow, 43, who had been trolling the Internet for boys, that he was a minor.

    In 2005, he talked the two men into taking him from Oklahoma to live with them in Arizona, where Mr. Stiffler posed as Mr. Rodreick’s grandfather and Mr. Snow as his uncle, and both men regularly had sex with him, the authorities said. Another man living in the house, Brian Nellis, 34, a sex offender Mr. Rodreick had met in prison, is believed to have aided Mr. Rodreick in the ruse, the authorities said.

    Mr. Rodreick continued the charade as a minor for nearly two years, the authorities said, registering at four charter schools in Arizona, until this month, when school administrators in Chino Valley called the sheriff.

    The police and school officials in each location where “Casey” enrolled said they knew of no children harmed, although the indictment against Mr. Rodreick includes an assault count. The authorities are trying to determine, with the help of videos confiscated from the men, if there were victims in the schools.

    “With boys it is a really tough deal,” said Lt. Van Gillock of the Police Department in El Reno, Okla., where Mr. Rodreick is believed to have posed as a 12-year-old to ingratiate himself with boys at church. “If they did it voluntarily, they have the stigma of homosexuality, and if it is forced, well, boys are supposed to be tough and the things the boys have on them gives them an embarrassment factor.”

    Though many parents have publicly praised the Surprise school’s handling of the deception, Mr. Rodreick’s enrollment has raised questions about admissions procedures, which officials at Imagine, one of the state’s largest charter schools, said they were reviewing. Arizona, the nation’s fastest-growing state, is a leader in charter school enrollment, with more than 450 schools that account for 8 percent of the state’s total student body.

    “He probably thought that a charter school was easier,” said Candace Foth, another parent in Surprise. “It is not really difficult to enroll.”

    Mr. Rodreick’s time in Arizona was the latest episode in a life speckled with disappointment, crimes and estrangement, according to relatives and law enforcement officials.

    When he was growing up in Oklahoma, he was sexually abused by neighbors, said his mother’s sister, Jan Bautista, with whom he lived briefly after his release from prison in 2002.

    He was 18 in 1996 when the authorities in Chickasha, Okla., charged him with making lewd and indecent proposals to two 6-year-old boys. He was sentenced to 10 years in prison and released after serving 7.

    From there, Ms. Bautista said, he made his way to San Bernardino County in California, where he stayed with her for about two months. Ms. Bautista said she asked Mr. Rodreick to leave after she found child pornography on her computer and suspected him of lewd acts toward a child. She said she turned her computer over to the sheriff.

    “I really hate this man,” she said. “I really do. I hope they keep him in prison the rest of his life, because I know he is never going to get well.”

    Mr. Rodreick next made his way to Kingfisher County in Oklahoma, where, according to the sheriff, Dennis Banther, he registered as a sex offender. Soon, he was joined in his mobile home in a secluded area by Mr. Nellis, who had served three years in prison after a conviction for lewd molestation, Sheriff Banther said. The two drifted among fast-food restaurant jobs, the sheriff said, and were seen at a school playground, a library and parks.

    The two left Kingfisher County in 2003 for El Reno, Okla., and trouble followed. In 2005, Lieutenant Gillock of the El Reno Police Department got a call from a computer rental store that had repossessed a computer Mr. Nellis had rented and found child pornography — more than 1,000 images and 150 videos.

    While looking for Mr. Rodreick, Lieutenant Gillock stumbled upon his new life. He learned he had been posing as a 12-year-old named Casey and befriending families at a local church. He had spent the night with at least one boy, the lieutenant said, and traveled to the Grand Canyon, with Mr. Nellis in tow as his uncle, with another boy. And since at least late 2004, he said, Mr. Rodreick had been receiving Western Union money transfers from Mr. Stiffler.

    Mr. Stiffler is the only one of the four who has not been convicted of a sexual offense, according to officials in Yavapai County. He spent most of his life in New Jersey, some of it married to a woman named Jill, who died in 1984, and with whom he had a daughter, according to Karen Svecz, his wife’s sister.

    Mr. Snow, who law enforcement officials say has been convicted of a sexual offense in California, lived mostly there until about 1985, when he moved to Arizona.

    When Mr. Rodreick arrived in Arizona, he is believed to have first enrolled at the Shelby School in Payson, where administrators say he attended under the name of Casy Rodreick for 21 days in 2005.

    The next stop was Surprise, where the same “uncle” played the role of enroller again, presenting Mr. Rodreick as a 12-year-old. His concocted name, Casey Price, was that of a child in Oklahoma, the authorities there said.

    “He absolutely looked age-appropriate,” said Rhonda Cagle, a spokeswoman for Imagine Charter School, of Mr. Rodreick, who is listed on the Oklahoma Department of Corrections Web site as 5 feet 8 inches tall and 120 pounds. “We have several seventh-grade students who are taller and of a larger build than this individual.”

    Ms. Cagle said he was quiet and participated in no after-school activities; he was eventually expelled by school officials for poor attendance.

    “He took all of the subjects our students take — math, social studies,” she said. “By all accounts from the teachers, he was fairly quiet and withdrawn. He turned in homework, certainly didn’t come off as brilliant or as someone needing extra help.”

    After another effort to enroll in a school in Prescott Valley, the police say, Mr. Rodreick headed a bit north, to the Mingus Springs Charter School in Chino Valley, and this time, his “grandfather,” Mr. Stiffler, took him to enroll on Jan. 16 toward the end of the day.

    But administrators and staff members quickly grew suspicious, said Dawn Gonzales, the school director.

    “The person posing as the child obviously looked older than 12,” Ms. Gonzales said, although he was allowed to start class while administrators looked over his paperwork. Things were not right there, either. Some records had Casey, others Casy. Different birth dates emerged.

    The next day came, and so did Casey. “He did have the demeanor of a kid,” Ms. Gonzales said. “He played that part very well. He appeared to be very shy. He kept his head down and spoke softly.”

    It wasn’t working. “Every adult that encountered him said something here is not right,” she said. “He just looked older. They kept saying, ‘Are you sure he is 12?’ ”

    When information on his enrollment forms turned out to be fiction, school officials, believing they had an abducted older child on their hands, called the Yavapai sheriff’s office.

    “In my wildest imagination I could not have dreamt up” what was discovered, Ms. Gonzales said.

    The authorities said Mr. Stiffler and Mr. Snow were shocked, too, and angry about being duped by an adult posing as a minor.

    Ms. Cagle said her school in Surprise learned about their “Casey” on the evening news. “Needless to say, our staff is devastated,” she said. “This individual violated a sense of community that we all share. This is something that is bigger than our school. It affects the way we live and the way we look at each other.”

    Cheryl Camp contributed reporting from Kingfisher, Okla., and Alain Delaquérière from New York.


     

    Better Shoeboxes for Digital Photos

    (Illustration by Frank Frisari)

    PHOTO management once meant finding room to stash yet another box filled with snapshots. While digital photography has freed up that closet space, sorting and retrieving pictures in the era of the 250-gigabyte hard drive has created a set of challenges of its own.

    Recently, the declining cost of high-capacity camera memory cards has accelerated the pace at which many people accumulate photos. At the same time, the growing popularity of sophisticated digital single-lens-reflex cameras among amateur photographers is leading to larger file sizes and more interest in fine-tuning images.

    Two major software companies offered their latest answers to these problems this week, adding to the range of programs available for browsing and managing photos.

    The Microsoft Windows operating system has lacked anything approaching the easy-to-use iPhoto program supplied with Apple Macintosh computers. But Windows Vista, which went on sale to consumers this week, includes an advanced photo management system that Microsoft calls Windows Photo Gallery.

    On Monday, Adobe Systems, the maker of Photoshop, released a final version of Photoshop Lightroom, an organizing program that has been floating around in trial form for more than a year.

    Photo management programs are not complete substitutes for full photo-editing software like Photoshop. That being said, they do offer the editing tools that photographers use most frequently to change the overall look of photos, like adjustments for exposure, brightness, contrast and color.

    “It’s not about pixel manipulation,” said Rob Schoeben, vice president for applications product marketing at Apple. “It’s about pulling the beauty out of the image.”

    Most of these programs assume that users want to fix and sort a large number of photos at the same time, for example after downloading them from a camera. Editing software like Photoshop offers batch processing options, but the working premise of those programs is that users will generally be intensively fiddling with one picture at a time.

    One trick offered by many types of photo management software is nondestructive editing. Through various means, the programs make sure that the original image is always left intact during editing. In effect, the original image files play the role given to negatives in the film world. Among other things, that allows users to change their minds about edits. Unpopular relatives or political despots can be cropped out of photos one day and then restored when they return to favor.
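
    The bookkeeping behind nondestructive editing is simple enough to sketch. What follows is a minimal toy illustration in Python of the general idea, not any vendor’s actual implementation; the names (Photo, brighten, crop) are hypothetical. The original pixels are never modified, each edit is stored as an instruction, rendering applies the instructions to a fresh copy, and undoing an edit simply discards its instruction.

        # Toy sketch of nondestructive editing: originals stay intact,
        # edits live as a list of instructions applied only at render time.
        from dataclasses import dataclass, field
        from typing import Callable, List

        Image = List[List[int]]  # toy grayscale image: rows of pixel values

        @dataclass
        class Photo:
            original: Image  # the "digital negative": never modified
            edits: List[Callable[[Image], Image]] = field(default_factory=list)

            def add_edit(self, op: Callable[[Image], Image]) -> None:
                self.edits.append(op)  # record an instruction, not new pixels

            def undo(self) -> None:
                if self.edits:
                    self.edits.pop()  # dropping an instruction undoes the edit

            def render(self) -> Image:
                img = [row[:] for row in self.original]  # work on a copy
                for op in self.edits:
                    img = op(img)
                return img

        def brighten(amount: int) -> Callable[[Image], Image]:
            return lambda img: [[min(255, p + amount) for p in row] for row in img]

        def crop(rows: slice, cols: slice) -> Callable[[Image], Image]:
            return lambda img: [row[cols] for row in img[rows]]

        photo = Photo(original=[[10, 20], [30, 40]])
        photo.add_edit(brighten(50))
        photo.add_edit(crop(slice(0, 1), slice(0, 2)))
        print(photo.render())  # [[60, 70]] -- the cropped, brightened view
        photo.undo()           # the cropped-out relative returns to favor
        print(photo.render())  # [[60, 70], [80, 90]] -- nothing was lost

    Because only instructions are stored, this scheme can also be thrifty with disk space, a point that comes up again with Aperture below.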

    Most makers of photo management software follow Apple’s lead and offer two flavors of products. IPhoto, for example, costs nothing when you buy a new Mac, while its more advanced sibling, Aperture 1.5, sells for $300.

    People who rarely make adjustments to their photos and think of them as snapshots rather than personal expression will probably be more than satisfied with the basic programs. Owners of digital S.L.R.’s who frequently adjust photos or who often take photos using their camera’s RAW setting, which saves all the color and exposure data gathered by the camera’s sensor in a large file, may find working with the more costly, more capable programs easier.
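
    A rough back-of-the-envelope calculation shows why RAW files run so large. The numbers in the sketch below are illustrative assumptions rather than any particular camera’s specifications: a 10-megapixel sensor recording 12 bits per photosite, compared with an 8-bit JPEG compressed at roughly 10 to 1.

        # Rough sketch of RAW vs. JPEG sizes; every figure here is an
        # illustrative assumption, not a specific camera's specs.
        megapixels = 10
        bits_per_photosite = 12  # assumed RAW bit depth
        raw_mb = megapixels * 1_000_000 * bits_per_photosite / 8 / 1_000_000
        print(f"uncompressed RAW: ~{raw_mb:.0f} MB")  # ~15 MB

        compression_ratio = 10   # assumed typical JPEG lossy compression
        jpeg_mb = megapixels * 1_000_000 * 3 / compression_ratio / 1_000_000
        print(f"typical JPEG: ~{jpeg_mb:.0f} MB")     # ~3 MB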

    Most high-end photo management programs are available as fully featured trial downloads that expire after a certain period. Because the programs approach some basic tasks in ways that may not suit every user’s tastes, the no-cost trials have much to recommend them. They also allow owners of older computers to see if their machines can form a happy partnership with these demanding programs.

    An Elegant Solution

    The features and reliability of Photoshop long ago made it the editor of choice for serious photographers. But its interface, even for knowledgeable users, can be as intimidating as the instrument panel of a jumbo jet.

    The new Photoshop Lightroom is a study in simplicity and elegance. One of its setup options enables photos to float on a black background, with the editing and navigation tools appearing only when the cursor is dragged near the monitor’s edge.

    While Lightroom, which will cost $200 for the next few months when purchased directly from Adobe (www.adobe.com), allows easy navigation through large numbers of photos, some of the other features need refinement. Lightroom cannot, for example, directly attach photos to e-mail messages.

    Adobe Photoshop Elements is a variation of Photoshop CS2, Adobe’s $650 flagship program, that offers its most important features for the bargain price of $100 in the Windows version or $80 for Mac users.

    Perhaps Mac users are given a discount because iPhoto, which can easily be integrated with any version of Photoshop for high-level editing, is a much better way to manage their photos. For Windows users, Photoshop Elements is a relatively inexpensive way to get the leading editing program and a competent photo manager in the same box.

    Microsoft’s interest in photo management software is not confined to Vista. Last year, it bought iView Multimedia (www.iview-multimedia.com), the maker of MediaPro, a $200 program with a reputation for working quickly when searching through large numbers of photos. The program can also store other types of data, including video and music. This spring, MediaPro will become a new program, Microsoft Expression Media. It will include additional features and cost $100 more. Despite the new ownership, it will be sold in Mac and Windows versions.

    Macintosh Software

    IPhoto, which has been around since 2002, clearly inspired several photo management programs from other companies. For most Apple users, nothing else is as easy to use. Aperture, however, offers several features that may benefit people who frequently tweak their photos and who have large photo collections.

    In a sense, the designers of iPhoto stuck to the shoebox school of organizing. It is designed with the idea that all images will be stored in a single library file. Aperture, by contrast, can track photos stored anywhere and in multiple locations, including external hard drives and those archived on CDs and DVDs.

    Aperture’s nondestructive editing system also consumes far less hard-drive space over time. IPhoto preserves its originals by duplicating the full image when it is edited. Aperture merely stores a compact set of instructions indicating how to alter the master image to recreate the edits.

    If Lightroom excels in navigation, Aperture leads the way in easy-to-use editing tools. Its only drawback is that the screen display can seem a bit crowded on a laptop. As with iPhoto, however, users can easily toggle to a full-screen display that hides the editing and navigation accessories.

    Windows Software

    Even Microsoft acknowledges that the photo features supplied with earlier Windows versions did little more than allow users to get pictures out of their cameras and into their computers. The Windows Photo Gallery in Vista promises to improve that situation. As a bonus, like Apple’s Aperture program, it can also keep tabs on pictures that have been moved to CDs, DVDs or external hard drives.

    For Windows users without Vista, one of the best options costs nothing to download: Google’s Picasa (picasa.google.com). Unsurprisingly, it integrates well with other Google services, and it offers efficient editing tools. While it can manage images on external hard drives, Picasa cannot deal with pictures on CDs or DVDs.

    Twelve years ago, ACDSee from ACD Systems (www.acdsee.com) was a pioneer of photo management. Today, the company offers a basic version of its latest software, ACDSee 9, for $40. For an extra $90, ACDSee Pro handles RAW file conversion more quickly and allows greater customization.

    Another software company, Corel (www.corel.com), bought Jasc Software, an early photo software developer, about two years ago. One result of the deal is Corel Snapfire, a free photo manager, although users must put up with a small, ever-changing ad for Corel products in one corner.

    It is much like iPhoto in its basic concept and includes some relatively advanced editing functions, like the ability to straighten off-kilter snapshots. For the benefit of complete novices, the software automatically analyzes images and suggests which ones might benefit from basic editing.

    The free version has a major shortcoming: it does not offer any direct way to back up photos. Doing that requires buying Snapfire Plus for $40, which adds a few editing features and allows users to switch off the ads. Neither version offers nondestructive editing because Corel decided that the concept was too confusing for novices.


     

    Celebrity Architects Reveal a Daring Cultural Xanadu for the Arab World

    Zaha Hadid’s design for a performing arts center for an island in Abu Dhabi.

    ABU DHABI, United Arab Emirates, Jan. 31 — In this land of big ambition and deep pockets, planners on Wednesday unveiled designs for an audacious multibillion-dollar cultural district whose like has never been seen in the Arab world.

    The designs presented here in Abu Dhabi, the capital of the United Arab Emirates and one of the world’s top oil producers, are to be built on an island just off the coast and include three museums designed by the celebrity architects Frank Gehry, Jean Nouvel and Tadao Ando, as well as a sprawling, spaceshiplike performing arts center designed by Zaha Hadid.

    Mr. Gehry’s building is intended for an Abu Dhabi branch of the Guggenheim Museum featuring contemporary art and Mr. Nouvel’s for a classical museum, possibly an outpost of the Louvre Museum in Paris. Mr. Ando’s is to house a maritime museum reflecting the history of the Arabian Gulf.

    The project also calls for a national museum and a biennial exhibition space composed of 19 pavilions designed by smaller names and snaking along a canal that cuts through the island. Art schools and an art college are also planned.

    In all, the project, known as the Cultural District of Saadiyat Island, would create an exhibition space intended to turn this once-sleepy desert city along the Persian Gulf into an international arts capital and tourist destination. If completed according to plan sometime in the next decade, consultants predict, it could be the world’s largest single arts-and-culture development project in recent memory.

    At times astonishing, at times controversial, the district is part of a far broader $27 billion development project on the island that includes hotels, resorts, golf courses and housing that could accommodate 125,000 residents or more.

    The museum designs, displayed at an exhibition attended by dignitaries and the United Arab Emirates leadership, are a striking departure from Abu Dhabi’s crumbling 1970s-style concrete buildings and more modern glass-and-steel high-rises. Still, because Saadiyat Island is undeveloped, architects faced the unusual challenge of an aesthetic and contextual tabula rasa.

    The daring designs, some teeming with life and color, others more starkly formal, have one aspect in common: it probably would be hard to build them all in one district anywhere else.

    “It’s like a clean slate in a country full of resources,” said Mr. Gehry, who appeared at the exhibition to show off his model for the Guggenheim Abu Dhabi. “It’s an opportunity for the world of art and culture that is not available anywhere else because you’re building a desert enclave without the contextual constraints of a city.”

    No cost estimates were given for the buildings unveiled on Wednesday, but each is certain to run into the hundreds of millions of dollars.

    For the Guggenheim Abu Dhabi, Mr. Gehry envisions a 320,000-square-foot structure with 130,000 square feet of exhibition space built around a cluster of galleries, a space far larger than his Guggenheim Bilbao in Spain, which cost about $100 million. A jumble of blocks, glass awnings and open spaces, the Abu Dhabi Guggenheim would be centered on a core of galleries of varying height atop one another and forming a courtyard. A second ring of larger galleries is followed by a third ring of galleries housing raw industrial-looking spaces with exposed lighting and mechanical systems.

    The design for the classical museum enters into a dialogue with its surroundings, suggesting a submerged archaeological field with a cluster of one-room buildings placed along a promenade. The complex is covered by a massive translucent dome etched in patterns that allow diffused light into the spaces below.

    Mr. Ando’s maritime museum design borrows from the maritime history of the emirates, with a reflective surface merging sea and land and a shiplike interior with floating decks.

    Ms. Hadid’s performing arts center concept, which seems part spaceship, part organism, is to house a music hall, concert hall, opera house and two theaters, one seating up to 6,300. Transparent and airy, the center hovers over the azure waters of the Persian Gulf.

    “It’s an inspiration from nature and an organic design, with a fluid design, as well as a space with good sound,” Ms. Hadid said.

    Abu Dhabi’s sheiks dreamed up this sweeping cultural project in late 2004, after brainstorming ways to attract more tourism to the emirate, which is the richest of the seven in the United Arab Emirates confederation, but has largely missed out on the flood of visitors attracted by its neighbor Dubai.

    Flush with cash from the oil boom, the emirate has embarked on a development spree intended to update its infrastructure after years of limited development. Abu Dhabi’s tourist board insists it is not trying to one-up Dubai, but instead wants to complement Dubai’s emphasis on other forms of entertainment.

    “The real strategic decision here is that Dubai has established itself as a tourist destination, and Abu Dhabi is complementing what Dubai is doing,” said Barry Lord, president of Lord Cultural Resources, which has helped manage the development of the cultural project. “Cultural tourists are wealthier, older, more educated, and they spend more. From an economic view, this makes sense.”

    Abu Dhabi’s Tourism Development and Investment Company announced a deal to build the Guggenheim Abu Dhabi last year. Recently it reached a $1 billion accord to rent the name, art and expertise of the Louvre for a museum to be built on the island. Protests quickly arose in France that the country was selling its patrimony to the highest bidder. The emirate’s tourism officials played down the Louvre plan on Wednesday, insisting the deal was not final.

    Mr. Lord noted that the arts project was taking shape against the backdrop of continued turbulence in the Middle East.

    “They are very conscious here that this can change the cultural climate in the region,” Mr. Lord said. “To be able to add high culture at the high end of international culture, this is a tremendous change.”

    After oil booms in the 1970s and ’80s whose proceeds were not always used wisely, Persian Gulf governments are now focusing on spending their surpluses on infrastructure projects and real-estate development. A new generation of leaders in the gulf, especially in the emirates, where a new ruler was installed only in late 2004 and where several ministers are still in their 30s, has looked beyond traditional real-estate projects to efforts that would help their cities stand out on the world stage.

    Other Persian Gulf countries have turned to the arts too. In Qatar the final touches are being added to I. M. Pei’s latest structure, the Qatar Museum, built just off the coast of the capital, Doha, to house a new Islamic arts collection. In Sharjah, another emirate, which has fashioned itself as the cultural capital of the Persian Gulf, the Sharjah Art Museum continues to expand its collection and is planning its eighth biennial. And even Dubai is building a Culture Village, centered on an opera house also designed by Ms. Hadid and other arts and culture institutions.

    “This is not just about tourism; it also has global cultural dimensions,” Mubarak Muhairi, the director general of the Abu Dhabi tourism authority, said. “We believe the best vehicle for crossing borders is art. And this region is in need of such artistic initiatives.”


     


    Survival of the Yummiest
    Should we buy Michael Pollan’s nutritional Darwinism?
    By Daniel Engber
    Posted Wednesday, Jan. 31, 2007, at 6:24 PM E.T.

    Adam and Eve must have been a healthy pair. They got some exercise, ate lots of locally grown fruits and vegetables, and while they may not have been thin by today’s fashion standards, they certainly weren’t ashamed of their bodies. Now look what’s happened: In just 6,000 years, we’ve abandoned their sensible eating habits for a high-fat, sugar-loaded diet, and turned ourselves into a nation of lard-asses. Goodbye Garden of Eden; hello Olive Garden.

    Whence our fall from grace? According to Michael Pollan’s essay in last Sunday’s New York Times Magazine, the serpent wears a lab coat. For decades scientists have been analyzing the food we eat, breaking it down into component parts, and studying how each nutrient affects our health in controlled conditions. More often than not, the “expert advice” that emerged from this work did more harm than good, it seems. When the government told us to eat more low-fat foods, we ended up binging on carbs. We bought margarine when the gurus told us to avoid saturated fats; now city governments are telling us that margarine is against the law. Well-intentioned blunders like these have crowded out the ancient wisdom that once guided our culinary habits, Pollan argues.

    Blame the scientists. They “need individual variables they can isolate,” Pollan explains. “Yet even the simplest food is a hopelessly complex thing to study, a virtual wilderness of chemical compounds, many of which exist in complex and dynamic relation to one another, and all of which together are in the process of changing from one state to another.” We’ll never understand the biology of eating because it’s just too hard to study in the lab. Large-scale clinical investigations won’t be much help, either: There’s no good way to observe or control how people eat; when doctors ask us about our diets we either misremember or make up stuff.

    That much may be true, but it doesn’t mean there’s an inherent flaw in the scientific method. An optimist would say the worst years are behind us. Sure, we’ve made a few mistakes, but the science is getting stronger every day. Just as the discovery of vitamins made it easier to treat beriberi and scurvy, so will the latest research eventually help us to vanquish coronary heart disease and diabetes. That’s how science works: You keep plugging away in the lab until you finally get somewhere.

    It would help me to accept Pollan’s claim to the contrary if I could think of any other topic in the universe so complicated that it defies scientific investigation. Yes, there’s a lot to consider when you’re looking at nutrition. But is climatology any easier? Should we throw up our hands at the idea of studying global warming, simply because it reflects a wilderness of variables in complex and dynamic relation to one another? Once we might have charged psychology with the same crimes here ascribed to nutrition: The mind is too complex, and individuals too unreliable, for us ever to understand what goes on inside our heads. But surely we’ve now seen the benefits of opening the black box—and tinkering around with the 100 billion neurons of the human brain.

    Pollan presents the food scientist as a reductionist bogeyman, trampling willy-nilly over the delicate complexities of the natural world. (The illustrations assigned to his article convey dread at the notion that a fruit might be reduced—gasp—to its constituent parts.) It’s a dangerous path, he argues, since those complexities have kept us alive over the course of human history. We don’t have to identify which of the three dozen antioxidants in a sprig of thyme, for example, will protect us from cancer; if we’ve always been eating fruits and vegetables, then they must be good for us. It’s natural selection of the human diet: Thousands of years of trial and error must have pushed us toward increasingly wholesome foods. Any unhealthy eating habits would have gone extinct along the way. Why toss out these extraordinary evolutionary data in favor of a few decades’ worth of lab experiments?

    But Pollan’s nutritional Darwinism only makes sense if the selection pressures of the distant past were in perfect alignment with the health concerns of today. In other words, our food culture would have evolved to protect us from cancer, heart disease, and obesity only if those maladies had been a primary threat to reproduction in the ancient world. It’s hard to imagine that the risks posed by these so-called “diseases of affluence”—which often strike late in life, after we’ve had babies—would have been as significant to our fast-living, sickly forebears as the dangers of, say, bacterial infections or the occasional drought. Indeed, for much of human history, natural selection might well have traded off the dangers of morbid obesity to mitigate the risk of starvation. There’s just no way to know how the ancient culinary traditions will fare in the modern world until we try them.

    Modern nutrition may be more of an ideology than a science, but so is Pollan’s nutritional Darwinism. The two ideologies stand in direct opposition to one another, with the science-minded progressives on one side and the culinary conservatives on the other. The Darwinists reject the idea that lab science can be used to engineer public health on a massive scale. They rely instead on the time-tested mores that have always been our guides. Pollan’s reflections on the diet revolution could be an homage to Edmund Burke: Our radical eating habits have produced a swinish multitude.

    A conservative approach to eating seems very straightforward, which gives it an enormous appeal. We’d be healthier, Pollan argues, if we just stopped thinking and worrying so much about food and let nature take its course. (He takes several opportunities to congratulate the svelte, chain-smoking French for their pleasure-based cuisine.) But there’s no reason to believe that nutritional Darwinism will give us any more clarity on its own terms.

    Health gurus routinely use the same language of ancient culinary traditions to sell fad diets that would make Pollan cringe. Barry Sears, author of the low-carb Zone diet, suggests a return to the traditional food culture of the “Neo-Paleolithic” period, when caveman “decathletes” consumed large amounts of meat and very little grain. In his version, we bungled the natural selection of foodstuffs when we invented agriculture. Pollan says that happened during the Industrial Revolution. Two evolutionary stories offer very different nutritional advice. How can we know who’s right?

    If we had only the rhetoric of natural selection to go by, we’d never know for sure. Lucky for us, humans have gradually developed the means—over centuries of cultural evolution, no less—to evaluate one claim against another on the basis of objective facts. For all its foibles, food science has given us a reliable set of data on what works and what doesn’t. As Ben Goldacre points out in the New Statesman, solid epidemiological work has validated the standard advice we get from our doctors: Exercise more and eat your fruits and vegetables.

    Pollan cites the same scientific research to support what he describes as his “flagrantly unscientific” diet plan: “Eat food. Not too much. Mostly plants.” I’m happy to follow those dicta if they’ll help me to live a longer, happier life. But that doesn’t mean I have to buy into the misleading, great-great-grandma-knew-best philosophy that spawned them. I’d rather stick to the science, warts and all.

    Daniel Engber is an associate editor at Slate. He can be reached at danengber@yahoo.com.

    Unhappy Meals

    Eat food. Not too much. Mostly plants.

    That, more or less, is the short answer to the supposedly incredibly complicated and confusing question of what we humans should eat in order to be maximally healthy. I hate to give away the game right here at the beginning of a long essay, and I confess that I’m tempted to complicate matters in the interest of keeping things going for a few thousand more words. I’ll try to resist but will go ahead and add a couple more details to flesh out the advice. Like: A little meat won’t kill you, though it’s better approached as a side dish than as a main. And you’re much better off eating whole fresh foods than processed food products. That’s what I mean by the recommendation to eat “food.” Once, food was all you could eat, but today there are lots of other edible foodlike substances in the supermarket. These novel products of food science often come in packages festooned with health claims, which brings me to a related rule of thumb: if you’re concerned about your health, you should probably avoid food products that make health claims. Why? Because a health claim on a food product is a good indication that it’s not really food, and food is what you want to eat.

    Uh-oh. Things are suddenly sounding a little more complicated, aren’t they? Sorry. But that’s how it goes as soon as you try to get to the bottom of the whole vexing question of food and health. Before long, a dense cloud bank of confusion moves in. Sooner or later, everything solid you thought you knew about the links between diet and health gets blown away in the gust of the latest study.

    Last winter came the news that a low-fat diet, long believed to protect against breast cancer, may do no such thing — this from the monumental, federally financed Women’s Health Initiative, which has also found no link between a low-fat diet and rates of coronary disease. The year before we learned that dietary fiber might not, as we had been confidently told, help prevent colon cancer. Just last fall two prestigious studies on omega-3 fats published at the same time presented us with strikingly different conclusions. While the Institute of Medicine stated that “it is uncertain how much these omega-3s contribute to improving health” (and they might do the opposite if you get them from mercury-contaminated fish), a Harvard study declared that simply by eating a couple of servings of fish each week (or by downing enough fish oil), you could cut your risk of dying from a heart attack by more than a third — a stunningly hopeful piece of news. It’s no wonder that omega-3 fatty acids are poised to become the oat bran of 2007, as food scientists micro-encapsulate fish oil and algae oil and blast them into such formerly all-terrestrial foods as bread and tortillas, milk and yogurt and cheese, all of which will soon, you can be sure, sprout fishy new health claims. (Remember the rule?)

    By now you’re probably registering the cognitive dissonance of the supermarket shopper or science-section reader, as well as some nostalgia for the simplicity and solidity of the first few sentences of this essay. Which I’m still prepared to defend against the shifting winds of nutritional science and food-industry marketing. But before I do that, it might be useful to figure out how we arrived at our present state of nutritional confusion and anxiety.

    The story of how the most basic questions about what to eat ever got so complicated reveals a great deal about the institutional imperatives of the food industry, nutritional science and — ahem — journalism, three parties that stand to gain much from widespread confusion surrounding what is, after all, the most elemental question an omnivore confronts. Humans deciding what to eat without expert help — something they have been doing with notable success since coming down out of the trees — is seriously unprofitable if you’re a food company, distinctly risky if you’re a nutritionist and just plain boring if you’re a newspaper editor or journalist. (Or, for that matter, an eater. Who wants to hear, yet again, “Eat more fruits and vegetables”?) And so, like a large gray fog, a great Conspiracy of Confusion has gathered around the simplest questions of nutrition — much to the advantage of everybody involved. Except perhaps the ostensible beneficiary of all this nutritional expertise and advice: us, and our health and happiness as eaters.

    FROM FOODS TO NUTRIENTS

    It was in the 1980s that food began disappearing from the American supermarket, gradually to be replaced by “nutrients,” which are not the same thing. Where once the familiar names of recognizable comestibles — things like eggs or breakfast cereal or cookies — claimed pride of place on the brightly colored packages crowding the aisles, now new terms like “fiber” and “cholesterol” and “saturated fat” rose to large-type prominence. More important than mere foods, the presence or absence of these invisible substances was now generally believed to confer health benefits on their eaters. Foods by comparison were coarse, old-fashioned and decidedly unscientific things — who could say what was in them, really? But nutrients — those chemical compounds and minerals in foods that nutritionists have deemed important to health — gleamed with the promise of scientific certainty; eat more of the right ones, fewer of the wrong, and you would live longer and avoid chronic diseases.

    Nutrients themselves had been around, as a concept, since the early 19th century, when the English doctor and chemist William Prout identified what came to be called the “macronutrients”: protein, fat and carbohydrates. It was thought that that was pretty much all there was going on in food, until doctors noticed that an adequate supply of the big three did not necessarily keep people nourished. At the end of the 19th century, British doctors were puzzled by the fact that Chinese laborers in the Malay states were dying of a disease called beriberi, which didn’t seem to afflict Tamils or native Malays. The mystery was solved when someone pointed out that the Chinese ate “polished,” or white, rice, while the others ate rice that hadn’t been mechanically milled. A few years later, Casimir Funk, a Polish chemist, discovered the “essential nutrient” in rice husks that protected against beriberi and called it a “vitamine,” the first micronutrient. Vitamins brought a kind of glamour to the science of nutrition, and though certain sectors of the population began to eat by its expert lights, it really wasn’t until late in the 20th century that nutrients managed to push food aside in the popular imagination of what it means to eat.

    No single event marked the shift from eating food to eating nutrients, though in retrospect a little-noticed political dust-up in Washington in 1977 seems to have helped propel American food culture down this dimly lighted path. Responding to an alarming increase in chronic diseases linked to diet — including heart disease, cancer and diabetes — a Senate Select Committee on Nutrition, headed by George McGovern, held hearings on the problem and prepared what by all rights should have been an uncontroversial document called “Dietary Goals for the United States.” The committee learned that while rates of coronary heart disease had soared in America since World War II, other cultures that consumed traditional diets based largely on plants had strikingly low rates of chronic disease. Epidemiologists also had observed that in America during the war years, when meat and dairy products were strictly rationed, the rate of heart disease temporarily plummeted.

    Naïvely putting two and two together, the committee drafted a straightforward set of dietary guidelines calling on Americans to cut down on red meat and dairy products. Within weeks a firestorm, emanating from the red-meat and dairy industries, engulfed the committee, and Senator McGovern (who had a great many cattle ranchers among his South Dakota constituents) was forced to beat a retreat. The committee’s recommendations were hastily rewritten. Plain talk about food — the committee had advised Americans to actually “reduce consumption of meat” — was replaced by artful compromise: “Choose meats, poultry and fish that will reduce saturated-fat intake.”

    A subtle change in emphasis, you might say, but a world of difference just the same. First, the stark message to “eat less” of a particular food has been deep-sixed; don’t look for it ever again in any official U.S. dietary pronouncement. Second, notice how distinctions between entities as different as fish and beef and chicken have collapsed; those three venerable foods, each representing an entirely different taxonomic class, are now lumped together as delivery systems for a single nutrient. Notice too how the new language exonerates the foods themselves; now the culprit is an obscure, invisible, tasteless — and politically unconnected — substance that may or may not lurk in them called “saturated fat.”

    The linguistic capitulation did nothing to rescue McGovern from his blunder; the very next election, in 1980, the beef lobby helped rusticate the three-term senator, sending an unmistakable warning to anyone who would challenge the American diet, and in particular the big chunk of animal protein sitting in the middle of its plate. Henceforth, government dietary guidelines would shun plain talk about whole foods, each of which has its trade association on Capitol Hill, and would instead arrive clothed in scientific euphemism and speaking of nutrients, entities that few Americans really understood but that lack powerful lobbies in Washington. This was precisely the tack taken by the National Academy of Sciences when it issued its landmark report on diet and cancer in 1982. Organized nutrient by nutrient in a way guaranteed to offend no food group, it codified the official new dietary language. Industry and media followed suit, and terms like polyunsaturated, cholesterol, monounsaturated, carbohydrate, fiber, polyphenols, amino acids and carotenes soon colonized much of the cultural space previously occupied by the tangible substance formerly known as food. The Age of Nutritionism had arrived.

    THE RISE OF NUTRITIONISM

    The first thing to understand about nutritionism — I first encountered the term in the work of an Australian sociologist of science named Gyorgy Scrinis — is that it is not quite the same as nutrition. As the “ism” suggests, it is not a scientific subject but an ideology. Ideologies are ways of organizing large swaths of life and experience under a set of shared but unexamined assumptions. This quality makes an ideology particularly hard to see, at least while it’s exerting its hold on your culture. A reigning ideology is a little like the weather, all pervasive and virtually inescapable. Still, we can try.

    In the case of nutritionism, the widely shared but unexamined assumption is that the key to understanding food is indeed the nutrient. From this basic premise flow several others. Since nutrients, as compared with foods, are invisible and therefore slightly mysterious, it falls to the scientists (and to the journalists through whom the scientists speak) to explain the hidden reality of foods to us. To enter a world in which you dine on unseen nutrients, you need lots of expert help.

    But expert help to do what, exactly? This brings us to another unexamined assumption: that the whole point of eating is to maintain and promote bodily health. Hippocrates’s famous injunction to “let food be thy medicine” is ritually invoked to support this notion. I’ll leave the premise alone for now, except to point out that it is not shared by all cultures and that the experience of these other cultures suggests that, paradoxically, viewing food as being about things other than bodily health — like pleasure, say, or socializing — makes people no less healthy; indeed, there’s some reason to believe that it may make them more healthy. This is what we usually have in mind when we speak of the “French paradox” — the fact that a population that eats all sorts of unhealthful nutrients is in many ways healthier than we Americans are. So there is at least a question as to whether nutritionism is actually any good for you.

    Another potentially serious weakness of nutritionist ideology is that it has trouble discerning qualitative distinctions between foods. So fish, beef and chicken through the nutritionists’ lens become mere delivery systems for varying quantities of fats and proteins and whatever other nutrients are on their scope. Similarly, any qualitative distinctions between processed foods and whole foods disappear when your focus is on quantifying the nutrients they contain (or, more precisely, the known nutrients).

    This is a great boon for manufacturers of processed food, and it helps explain why they have been so happy to get with the nutritionism program. In the years following McGovern’s capitulation and the 1982 National Academy report, the food industry set about re-engineering thousands of popular food products to contain more of the nutrients that science and government had deemed the good ones and less of the bad, and by the late ’80s a golden era of food science was upon us. The Year of Eating Oat Bran — also known as 1988 — served as a kind of coming-out party for the food scientists, who succeeded in getting the material into nearly every processed food sold in America. Oat bran’s moment on the dietary stage didn’t last long, but the pattern had been established, and every few years since then a new oat bran has taken its turn under the marketing lights. (Here comes omega-3!)

    By comparison, the typical real food has more trouble competing under the rules of nutritionism, if only because something like a banana or an avocado can’t easily change its nutritional stripes (though rest assured the genetic engineers are hard at work on the problem). So far, at least, you can’t put oat bran in a banana. So depending on the reigning nutritional orthodoxy, the avocado might be either a high-fat food to be avoided (Old Think) or a food high in monounsaturated fat to be embraced (New Think). The fate of each whole food rises and falls with every change in the nutritional weather, while the processed foods are simply reformulated. That’s why when the Atkins mania hit the food industry, bread and pasta were given a quick redesign (dialing back the carbs; boosting the protein), while the poor unreconstructed potatoes and carrots were left out in the cold.

    Of course it’s also a lot easier to slap a health claim on a box of sugary cereal than on a potato or carrot, with the perverse result that the most healthful foods in the supermarket sit there quietly in the produce section, silent as stroke victims, while a few aisles over, the Cocoa Puffs and Lucky Charms are screaming about their newfound whole-grain goodness.

    EAT RIGHT, GET FATTER

    So nutritionism is good for business. But is it good for us? You might think that a national fixation on nutrients would lead to measurable improvements in the public health. But for that to happen, the underlying nutritional science, as well as the policy recommendations (and the journalism) based on that science, would have to be sound. This has seldom been the case.

    Consider what happened immediately after the 1977 “Dietary Goals” — McGovern’s masterpiece of politico-nutritionist compromise. In the wake of the panel’s recommendation that we cut down on saturated fat, a recommendation seconded by the 1982 National Academy report on cancer, Americans did indeed change their diets, endeavoring for a quarter-century to do what they had been told. Well, kind of. The industrial food supply was promptly reformulated to reflect the official advice, giving us low-fat pork, low-fat Snackwell’s and all the low-fat pasta and high-fructose (yet low-fat!) corn syrup we could consume. Which turned out to be quite a lot. Oddly, America got really fat on its new low-fat diet — indeed, many date the current obesity and diabetes epidemic to the late 1970s, when Americans began binging on carbohydrates, ostensibly as a way to avoid the evils of fat.

    This story has been told before, notably in these pages (“What if It’s All Been a Big Fat Lie?” by Gary Taubes, July 7, 2002), but it’s a little more complicated than the official version suggests. In that version, which inspired the most recent Atkins craze, we were told that America got fat when, responding to bad scientific advice, it shifted its diet from fats to carbs, suggesting that a re-evaluation of the two nutrients is in order: fat doesn’t make you fat; carbs do. (Why this should have come as news is a mystery: as long as people have been raising animals for food, they have fattened them on carbs.)

    But there are a couple of problems with this revisionist picture. First, while it is true that Americans post-1977 did begin binging on carbs, and that fat as a percentage of total calories in the American diet declined, we never did in fact cut down on our consumption of fat. Meat consumption actually climbed. We just heaped a bunch more carbs onto our plates, obscuring perhaps, but not replacing, the expanding chunk of animal protein squatting in the center.

    How did that happen? I would submit that the ideology of nutritionism deserves as much of the blame as the carbohydrates themselves do — that and human nature. By framing dietary advice in terms of good and bad nutrients, and by burying the recommendation that we should eat less of any particular food, it was easy for the take-home message of the 1977 and 1982 dietary guidelines to be simplified as follows: Eat more low-fat foods. And that is what we did. We’re always happy to receive a dispensation to eat more of something (with the possible exception of oat bran), and one of the things nutritionism reliably gives us is some such dispensation: low-fat cookies then, low-carb beer now. It’s hard to imagine the low-fat craze taking off as it did if McGovern’s original food-based recommendations had stood: eat less meat and fewer dairy products. For how do you get from that stark counsel to the idea that another case of Snackwell’s is just what the doctor ordered?

    BAD SCIENCE

    But if nutritionism leads to a kind of false consciousness in the mind of the eater, the ideology can just as easily mislead the scientist. Most nutritional science involves studying one nutrient at a time, an approach that even nutritionists who do it will tell you is deeply flawed. “The problem with nutrient-by-nutrient nutrition science,” points out Marion Nestle, the New York University nutritionist, “is that it takes the nutrient out of the context of food, the food out of the context of diet and the diet out of the context of lifestyle.”

    If nutritional scientists know this, why do they do it anyway? Because a nutrient bias is built into the way science is done: scientists need individual variables they can isolate. Yet even the simplest food is a hopelessly complex thing to study, a virtual wilderness of chemical compounds, many of which exist in complex and dynamic relation to one another, and all of which together are in the process of changing from one state to another. So if you’re a nutritional scientist, you do the only thing you can do, given the tools at your disposal: break the thing down into its component parts and study those one by one, even if that means ignoring complex interactions and contexts, as well as the fact that the whole may be more than, or just different from, the sum of its parts. This is what we mean by reductionist science.

    Scientific reductionism is an undeniably powerful tool, but it can mislead us too, especially when applied to something as complex as, on the one side, a food, and on the other, a human eater. It encourages us to take a mechanistic view of that transaction: put in this nutrient; get out that physiological result. Yet people differ in important ways. Some populations can metabolize sugars better than others; depending on your evolutionary heritage, you may or may not be able to digest the lactose in milk. The specific ecology of your intestines helps determine how efficiently you digest what you eat, so that the same input of 100 calories may yield more or less energy depending on the proportion of Firmicutes and Bacteroidetes living in your gut. There is nothing very machinelike about the human eater, and so to think of food as simply fuel is wrong.

    Also, people don’t eat nutrients, they eat foods, and foods can behave very differently than the nutrients they contain. Researchers have long believed, based on epidemiological comparisons of different populations, that a diet high in fruits and vegetables confers some protection against cancer. So naturally they ask, What nutrients in those plant foods are responsible for that effect? One hypothesis is that the antioxidants in fresh produce — compounds like beta carotene, lycopene, vitamin E, etc. — are the X factor. It makes good sense: these molecules (which plants produce to protect themselves from the highly reactive oxygen atoms produced in photosynthesis) vanquish the free radicals in our bodies, which can damage DNA and initiate cancers. At least that’s how it seems to work in the test tube. Yet as soon as you remove these useful molecules from the context of the whole foods they’re found in, as we’ve done in creating antioxidant supplements, they don’t work at all. Indeed, in the case of beta carotene ingested as a supplement, scientists have discovered that it actually increases the risk of certain cancers. Big oops.

    What’s going on here? We don’t know. It could be the vagaries of human digestion. Maybe the fiber (or some other component) in a carrot protects the antioxidant molecules from destruction by stomach acids early in the digestive process. Or it could be that we isolated the wrong antioxidant. Beta is just one of a whole slew of carotenes found in common vegetables; maybe we focused on the wrong one. Or maybe beta carotene works as an antioxidant only in concert with some other plant chemical or process; under other circumstances, it may behave as a pro-oxidant.

    Indeed, to look at the chemical composition of any common food plant is to realize just how much complexity lurks within it. Here’s a list of just the antioxidants that have been identified in garden-variety thyme:

    4-Terpineol, alanine, anethole, apigenin, ascorbic acid, beta carotene, caffeic acid, camphene, carvacrol, chlorogenic acid, chrysoeriol, eriodictyol, eugenol, ferulic acid, gallic acid, gamma-terpinene, isochlorogenic acid, isoeugenol, isothymonin, kaempferol, labiatic acid, lauric acid, linalyl acetate, luteolin, methionine, myrcene, myristic acid, naringenin, oleanolic acid, p-coumaric acid, p-hydroxy-benzoic acid, palmitic acid, rosmarinic acid, selenium, tannin, thymol, tryptophan, ursolic acid, vanillic acid.

    This is what you’re ingesting when you eat food flavored with thyme. Some of these chemicals are broken down by your digestion, but others are going on to do undetermined things to your body: turning some gene’s expression on or off, perhaps, or heading off a free radical before it disturbs a strand of DNA deep in some cell. It would be great to know how this all works, but in the meantime we can enjoy thyme in the knowledge that it probably doesn’t do any harm (since people have been eating it forever) and that it may actually do some good (since people have been eating it forever) and that even if it does nothing, we like the way it tastes.

    It’s also important to remind ourselves that what reductive science can manage to perceive well enough to isolate and study is subject to change, and that we have a tendency to assume that what we can see is all there is to see. When William Prout isolated the big three macronutrients, scientists figured they now understood food and what the body needs from it; when the vitamins were isolated a few decades later, scientists thought, O.K., now we really understand food and what the body needs to be healthy; today it’s the polyphenols and carotenoids that seem all-important. But who knows what the hell else is going on deep in the soul of a carrot?

    The good news is that, to the carrot eater, it doesn’t matter. That’s the great thing about eating food as compared with nutrients: you don’t need to fathom a carrot’s complexity to reap its benefits.

    The case of the antioxidants points up the dangers in taking a nutrient out of the context of food; as Nestle suggests, scientists make a second, related error when they study the food out of the context of the diet. We don’t eat just one thing, and when we are eating any one thing, we’re not eating another. We also eat foods in combinations and in orders that can affect how they’re absorbed. Drink coffee with your steak, and your body won’t be able to fully absorb the iron in the meat. The trace of limestone in the corn tortilla unlocks essential amino acids in the corn that would otherwise remain unavailable. Some of those compounds in that sprig of thyme may well affect my digestion of the dish I add it to, helping to break down one compound or possibly stimulate production of an enzyme to detoxify another. We have barely begun to understand the relationships among foods in a cuisine.

    But we do understand some of the simplest relationships, like the zero-sum relationship: that if you eat a lot of meat you’re probably not eating a lot of vegetables. This simple fact may explain why populations that eat diets high in meat have higher rates of coronary heart disease and cancer than those that don’t. Yet nutritionism encourages us to look elsewhere for the explanation: deep within the meat itself, to the culpable nutrient, which scientists have long assumed to be the saturated fat. So they are baffled when large-population studies, like the Women’s Health Initiative, fail to find that reducing fat intake significantly reduces the incidence of heart disease or cancer.

    Of course thanks to the low-fat fad (inspired by the very same reductionist fat hypothesis), it is entirely possible to reduce your intake of saturated fat without significantly reducing your consumption of animal protein: just drink the low-fat milk and order the skinless chicken breast or the turkey bacon. So maybe the culprit nutrient in meat and dairy is the animal protein itself, as some researchers now hypothesize. (The Cornell nutritionist T. Colin Campbell argues as much in his recent book, “The China Study.”) Or, as the Harvard epidemiologist Walter C. Willett suggests, it could be the steroid hormones typically present in the milk and meat; these hormones (which occur naturally in meat and milk but are often augmented in industrial production) are known to promote certain cancers.

    But people worried about their health needn’t wait for scientists to settle this question before deciding that it might be wise to eat more plants and less meat. This is of course precisely what the McGovern committee was trying to tell us.

    Nestle also cautions against taking the diet out of the context of the lifestyle. The Mediterranean diet is widely believed to be one of the most healthful ways to eat, yet much of what we know about it is based on studies of people living on the island of Crete in the 1950s, who in many respects lived lives very different from our own. Yes, they ate lots of olive oil and little meat. But they also did more physical labor. They fasted regularly. They ate a lot of wild greens — weeds. And, perhaps most important, they consumed far fewer total calories than we do. Similarly, much of what we know about the health benefits of a vegetarian diet is based on studies of Seventh-day Adventists, who muddy the nutritional picture by drinking absolutely no alcohol and never smoking. These extraneous but unavoidable factors are called, aptly, “confounders.” One last example: People who take supplements are healthier than the population at large, but their health probably has nothing whatsoever to do with the supplements they take — which recent studies have suggested are worthless. Supplement-takers are better-educated, more-affluent people who, almost by definition, take a greater-than-normal interest in personal health — confounding factors that probably account for their superior health.

    But if confounding factors of lifestyle bedevil comparative studies of different populations, the supposedly more rigorous “prospective” studies of large American populations suffer from their own arguably even more disabling flaws. In these studies — of which the Women’s Health Initiative is the best known — a large population is divided into two groups. The intervention group changes its diet in some prescribed manner, while the control group does not. The two groups are then tracked over many years to learn whether the intervention affects relative rates of chronic disease.

    When it comes to studying nutrition, this sort of extensive, long-term clinical trial is supposed to be the gold standard. It certainly sounds sound. In the case of the Women’s Health Initiative, sponsored by the National Institutes of Health, the eating habits and health outcomes of nearly 49,000 women (ages 50 to 79 at the beginning of the study) were tracked for eight years. One group of the women was told to reduce their consumption of fat to 20 percent of total calories. The results were announced early last year, producing front-page headlines of which the one in this newspaper was typical: “Low-Fat Diet Does Not Cut Health Risks, Study Finds.” And the cloud of nutritional confusion over the country darkened.

    But even a cursory analysis of the study’s methods makes you wonder why anyone would take such a finding seriously, let alone order a Quarter Pounder With Cheese to celebrate it, as many newspaper readers no doubt promptly went out and did. Even the beginner student of nutritionism will immediately spot several flaws: the focus was on “fat,” rather than on any particular food, like meat or dairy. So women could comply simply by switching to lower-fat animal products. Also, no distinctions were made between types of fat: women getting their allowable portion of fat from olive oil or fish were lumped together with women getting their fat from low-fat cheese or chicken breasts or margarine. Why? Because when the study was designed 16 years ago, the whole notion of “good fats” was not yet on the scientific scope. Scientists study what scientists can see.

    But perhaps the biggest flaw in this study, and other studies like it, is that we have no idea what these women were really eating because, like most people when asked about their diet, they lied about it. How do we know this? Deduction. Consider: When the study began, the average participant weighed in at 170 pounds and claimed to be eating 1,800 calories a day. It would take an unusual metabolism to maintain that weight on so little food. And it would take an even freakier metabolism to drop only one or two pounds after getting down to a diet of 1,400 to 1,500 calories a day — as the women on the “low-fat” regimen claimed to have done. Sorry, ladies, but I just don’t buy it.

    In fact, nobody buys it. Even the scientists who conduct this sort of research conduct it in the knowledge that people lie about their food intake all the time. They even have scientific figures for the magnitude of the lie. Dietary trials like the Women’s Health Initiative rely on “food-frequency questionnaires,” and studies suggest that people on average eat between a fifth and a third more than they claim to on the questionnaires. How do the researchers know that? By comparing what people report on questionnaires with interviews about their dietary intake over the previous 24 hours, thought to be somewhat more reliable. In fact, the magnitude of the lie could be much greater, judging by the huge disparity between the total number of food calories produced every day for each American (3,900 calories) and the average number of those calories Americans own up to chomping: 2,000. (Waste accounts for some of the disparity, but nowhere near all of it.) All we really know about how much people actually eat is that the real number lies somewhere between those two figures.

    To try to fill out the food-frequency questionnaire used by the Women’s Health Initiative, as I recently did, is to realize just how shaky the data on which such trials rely really are. The survey, which took about 45 minutes to complete, started off with some relatively easy questions: “Did you eat chicken or turkey during the last three months?” Having answered yes, I was then asked, “When you ate chicken or turkey, how often did you eat the skin?” But the survey soon became harder, as when it asked me to think back over the past three months to recall whether when I ate okra, squash or yams, they were fried, and if so, were they fried in stick margarine, tub margarine, butter, “shortening” (in which category they inexplicably lump together hydrogenated vegetable oil and lard), olive or canola oil or nonstick spray? I honestly didn’t remember, and in the case of any okra eaten in a restaurant, even a hypnotist could not get out of me what sort of fat it was fried in. In the meat section, the portion sizes specified haven’t been seen in America since the Hoover administration. If a four-ounce portion of steak is considered “medium,” was I really going to admit that the steak I enjoyed on an unrecallable number of occasions during the past three months was probably the equivalent of two or three (or, in the case of a steakhouse steak, no less than four) of these portions? I think not. In fact, most of the “medium serving sizes” to which I was asked to compare my own consumption made me feel piggish enough to want to shave a few ounces here, a few there. (I mean, I wasn’t under oath or anything, was I?)

    This is the sort of data on which the largest questions of diet and health are being decided in America today.

    THE ELEPHANT IN THE ROOM

    In the end, the biggest, most ambitious and widely reported studies of diet and health leave more or less undisturbed the main features of the Western diet: lots of meat and processed foods, lots of added fat and sugar, lots of everything — except fruits, vegetables and whole grains. In keeping with the nutritionism paradigm and the limits of reductionist science, the researchers fiddle with single nutrients as best they can, but the populations they recruit and study are typical American eaters doing what typical American eaters do: trying to eat a little less of this nutrient, a little more of that, depending on the latest thinking. (One problem with the control groups in these studies is that they too are exposed to nutritional fads in the culture, so over time their eating habits come to more closely resemble the habits of the intervention group.) It should not surprise us that the findings of such research would be so equivocal and confusing.

    But what about the elephant in the room — the Western diet? It might be useful, in the midst of our deepening confusion about nutrition, to review what we do know about diet and health. What we know is that people who eat the way we do in America today suffer much higher rates of cancer, heart disease, diabetes and obesity than people eating more traditional diets. (Four of the 10 leading killers in America are linked to diet.) Further, we know that simply by moving to America, people from nations with low rates of these “diseases of affluence” will quickly acquire them. Nutritionism by and large takes the Western diet as a given, seeking to moderate its most deleterious effects by isolating the bad nutrients in it — things like fat, sugar, salt — and encouraging the public and the food industry to limit them. But after several decades of nutrient-based health advice, rates of cancer and heart disease in the U.S. have declined only slightly (mortality from heart disease is down since the ’50s, but this is mainly because of improved treatment), and rates of obesity and diabetes have soared.

    No one likes to admit that his or her best efforts at understanding and solving a problem have actually made the problem worse, but that’s exactly what has happened in the case of nutritionism. Scientists operating with the best of intentions, using the best tools at their disposal, have taught us to look at food in a way that has diminished our pleasure in eating it while doing little or nothing to improve our health. Perhaps what we need now is a broader, less reductive view of what food is, one that is at once more ecological and cultural. What would happen, for example, if we were to start thinking about food as less of a thing and more of a relationship?

    In nature, that is of course precisely what eating has always been: relationships among species in what we call food chains, or webs, that reach all the way down to the soil. Species co-evolve with the other species they eat, and very often a relationship of interdependence develops: I’ll feed you if you spread around my genes. A gradual process of mutual adaptation transforms something like an apple or a squash into a nutritious and tasty food for a hungry animal. Over time and through trial and error, the plant becomes tastier (and often more conspicuous) in order to gratify the animal’s needs and desires, while the animal gradually acquires whatever digestive tools (enzymes, etc.) are needed to make optimal use of the plant. Similarly, cow’s milk did not start out as a nutritious food for humans; in fact, it made them sick until humans who lived around cows evolved the ability to digest lactose as adults. This development proved much to the advantage of both the milk drinkers and the cows.

    “Health” is, among other things, the byproduct of being involved in these sorts of relationships in a food chain — involved in a great many of them, in the case of an omnivorous creature like us. Further, when the health of one link of the food chain is disturbed, it can affect all the creatures in it. When the soil is sick or in some way deficient, so will be the grasses that grow in that soil and the cattle that eat the grasses and the people who drink the milk. Or, as the English agronomist Sir Albert Howard put it in 1945 in “The Soil and Health” (a founding text of organic agriculture), we would do well to regard “the whole problem of health in soil, plant, animal and man as one great subject.” Our personal health is inextricably bound up with the health of the entire food web.

    In many cases, long familiarity between foods and their eaters leads to elaborate systems of communications up and down the food chain, so that a creature’s senses come to recognize foods as suitable by taste and smell and color, and our bodies learn what to do with these foods after they pass the test of the senses, producing in anticipation the chemicals necessary to break them down. Health depends on knowing how to read these biological signals: this smells spoiled; this looks ripe; that’s one good-looking cow. This is easier to do when a creature has long experience of a food, and much harder when a food has been designed expressly to deceive its senses — with artificial flavors, say, or synthetic sweeteners.

    Note that these ecological relationships are between eaters and whole foods, not nutrients. Even though the foods in question eventually get broken down in our bodies into simple nutrients, as corn is reduced to simple sugars, the qualities of the whole food are not unimportant — they govern such things as the speed at which the sugars will be released and absorbed, which we’re coming to see as critical to insulin metabolism. Put another way, our bodies have a longstanding and sustainable relationship to corn that we do not have to high-fructose corn syrup. Such a relationship with corn syrup might develop someday (as people evolve superhuman insulin systems to cope with regular floods of fructose and glucose), but for now the relationship leads to ill health because our bodies don’t know how to handle these biological novelties. In much the same way, human bodies that can cope with chewing coca leaves — a longstanding relationship between native people and the coca plant in South America — cannot cope with cocaine or crack, even though the same “active ingredients” are present in all three. Reductionism as a way of understanding food or drugs may be harmless, even necessary, but reductionism in practice can lead to problems.

    Looking at eating through this ecological lens opens a whole new perspective on exactly what the Western diet is: a radical and rapid change not just in our foodstuffs over the course of the 20th century but also in our food relationships, all the way from the soil to the meal. The ideology of nutritionism is itself part of that change. To get a firmer grip on the nature of those changes is to begin to know how we might make our relationships to food healthier. These changes have been numerous and far-reaching, but consider as a start these four large-scale ones:

    From Whole Foods to Refined. The case of corn points up one of the key features of the modern diet: a shift toward increasingly refined foods, especially carbohydrates. Call it applied reductionism. Humans have been refining grains since at least the Industrial Revolution, favoring white flour (and white rice) even at the price of lost nutrients. Refining grains extends their shelf life (precisely because it renders them less nutritious to pests) and makes them easier to digest, by removing the fiber that ordinarily slows the release of their sugars. Much industrial food production involves an extension and intensification of this practice, as food processors find ways to deliver glucose — the brain’s preferred fuel — ever more swiftly and efficiently. Sometimes this is precisely the point, as when corn is refined into corn syrup; other times it is an unfortunate byproduct of food processing, as when freezing food destroys the fiber that would slow sugar absorption.

    So fast food is fast in this other sense too: it is to a considerable extent predigested, in effect, and therefore more readily absorbed by the body. But while the widespread acceleration of the Western diet offers us the instant gratification of sugar, in many people (and especially those newly exposed to it) the “speediness” of this food overwhelms the insulin response and leads to Type 2 diabetes. As one nutrition expert put it to me, we’re in the middle of “a national experiment in mainlining glucose.” To encounter such a diet for the first time, as when people accustomed to a more traditional diet come to America, or when fast food comes to their countries, delivers a shock to the system. Public-health experts call it “the nutrition transition,” and it can be deadly.

    From Complexity to Simplicity. If there is one word that covers nearly all the changes industrialization has made to the food chain, it would be simplification. Chemical fertilizers simplify the chemistry of the soil, which in turn appears to simplify the chemistry of the food grown in that soil. Since the widespread adoption of synthetic nitrogen fertilizers in the 1950s, the nutritional quality of produce in America has, according to U.S.D.A. figures, declined significantly. Some researchers blame the quality of the soil for the decline; others cite the tendency of modern plant breeding to select for industrial qualities like yield rather than nutritional quality. Whichever it is, the trend toward simplification of our food continues on up the chain. Processing foods depletes them of many nutrients, a few of which are then added back in through “fortification”: folic acid in refined flour, vitamins and minerals in breakfast cereal. But food scientists can add back only the nutrients food scientists recognize as important. What are they overlooking?

    Simplification has occurred at the level of species diversity, too. The astounding variety of foods on offer in the modern supermarket obscures the fact that the actual number of species in the modern diet is shrinking. For reasons of economics, the food industry prefers to tease its myriad processed offerings from a tiny group of plant species, corn and soybeans chief among them. Today, a mere four crops account for two-thirds of the calories humans eat. When you consider that humankind has historically consumed some 80,000 edible species, and that 3,000 of these have been in widespread use, this represents a radical simplification of the food web. Why should this matter? Because humans are omnivores, requiring somewhere between 50 and 100 different chemical compounds and elements to be healthy. It’s hard to believe that we can get everything we need from a diet consisting largely of processed corn, soybeans, wheat and rice.

    From Leaves to Seeds. It’s no coincidence that most of the plants we have come to rely on are grains; these crops are exceptionally efficient at transforming sunlight into macronutrients — carbs, fats and proteins. These macronutrients in turn can be profitably transformed into animal protein (by feeding them to animals) and processed foods of every description. Also, the fact that grains are durable seeds that can be stored for long periods means they can function as commodities as well as food, making these plants particularly well suited to the needs of industrial capitalism.

    The needs of the human eater are another matter. An oversupply of macronutrients, as we now have, itself represents a serious threat to our health, as evidenced by soaring rates of obesity and diabetes. But the undersupply of micronutrients may constitute a threat just as serious. Put in the simplest terms, we’re eating a lot more seeds and a lot fewer leaves, a tectonic dietary shift the full implications of which we are just beginning to glimpse. If I may borrow the nutritionist’s reductionist vocabulary for a moment, there are a host of critical micronutrients that are harder to get from a diet of refined seeds than from a diet of leaves. There are the antioxidants and all the other newly discovered phytochemicals (remember that sprig of thyme?); there is the fiber, and then there are the healthy omega-3 fats found in leafy green plants, which may turn out to be the most important benefit of all.

    Most people associate omega-3 fatty acids with fish, but fish get them from green plants (specifically algae), which is where they all originate. Plant leaves produce these essential fatty acids (“essential” because our bodies can’t produce them on their own) as part of photosynthesis. Seeds contain more of another essential fatty acid: omega-6. Without delving too deeply into the biochemistry, the two fats perform very different functions, in the plant as well as the plant eater. Omega-3s appear to play an important role in neurological development and processing, the permeability of cell walls, the metabolism of glucose and the calming of inflammation. Omega-6s are involved in fat storage (which is what they do for the plant), the rigidity of cell walls, clotting and the inflammation response. (Think of omega-3s as fleet and flexible, omega-6s as sturdy and slow.) Since the two lipids compete with each other for the attention of important enzymes, the ratio between omega-3s and omega-6s may matter more than the absolute quantity of either fat. Thus too much omega-6 may be just as much a problem as too little omega-3.

    And that might well be a problem for people eating a Western diet. As we’ve shifted from leaves to seeds, the ratio of omega-6s to omega-3s in our bodies has shifted, too. At the same time, modern food-production practices have further diminished the omega-3s in our diet. Omega-3s, being less stable than omega-6s, spoil more readily, so we have selected for plants that produce fewer of them; further, when we partly hydrogenate oils to render them more stable, omega-3s are eliminated. Industrial meat, raised on seeds rather than leaves, has fewer omega-3s and more omega-6s than preindustrial meat used to have. And official dietary advice since the 1970s has promoted the consumption of polyunsaturated vegetable oils, most of which are high in omega-6s (corn and soy, especially). Thus, without realizing what we were doing, we significantly altered the ratio of these two essential fats in our diets and bodies, with the result that the ratio of omega-6 to omega-3 in the typical American today stands at more than 10 to 1; before the widespread introduction of seed oils at the turn of the last century, it was closer to 1 to 1.

    The role of these lipids is not completely understood, but many researchers say that these historically low levels of omega-3 (or, conversely, high levels of omega-6) bear responsibility for many of the chronic diseases associated with the Western diet, especially heart disease and diabetes. (Some researchers implicate omega-3 deficiency in rising rates of depression and learning disabilities as well.) To remedy this deficiency, nutritionism classically argues for taking omega-3 supplements or fortifying food products, but because of the complex, competitive relationship between omega-3 and omega-6, adding more omega-3s to the diet may not do much good unless you also reduce your intake of omega-6.

    From Food Culture to Food Science. The last important change wrought by the Western diet is not, strictly speaking, ecological. But the industrialization of our food that we call the Western diet is systematically destroying traditional food cultures. Before the modern food era — and before nutritionism — people relied for guidance about what to eat on their national or ethnic or regional cultures. We think of culture as a set of beliefs and practices to help mediate our relationship to other people, but of course culture (at least before the rise of science) has also played a critical role in helping mediate people’s relationship to nature. Eating being a big part of that relationship, cultures have had a great deal to say about what and how and why and when and how much we should eat. Of course when it comes to food, culture is really just a fancy word for Mom, the figure who typically passes on the food ways of the group — food ways that, although they were never “designed” to optimize health (we have many reasons to eat the way we do), would not have endured if they did not keep eaters alive and well.

    The sheer novelty and glamour of the Western diet, with its 17,000 new food products introduced every year, and the marketing muscle used to sell these products, has overwhelmed the force of tradition and left us where we now find ourselves: relying on science and journalism and marketing to help us decide questions about what to eat. Nutritionism, which arose to help us better deal with the problems of the Western diet, has largely been co-opted by it, used by the industry to sell more food and to undermine the authority of traditional ways of eating. You would not have read this far into this article if your food culture were intact and healthy; you would simply eat the way your parents and grandparents and great-grandparents taught you to eat. The question is, Are we better off with these new authorities than we were with the traditional authorities they supplanted? The answer by now should be clear.

    It might be argued that, at this point in history, we should simply accept that fast food is our food culture. Over time, people will get used to eating this way and our health will improve. But for natural selection to help populations adapt to the Western diet, we’d have to be prepared to let those whom it sickens die. That’s not what we’re doing. Rather, we’re turning to the health-care industry to help us “adapt.” Medicine is learning how to keep alive the people whom the Western diet is making sick. It’s gotten good at extending the lives of people with heart disease, and now it’s working on obesity and diabetes. Capitalism is itself marvelously adaptive, able to turn the problems it creates into lucrative business opportunities: diet pills, heart-bypass operations, insulin pumps, bariatric surgery. But while fast food may be good business for the health-care industry, surely the cost to society — estimated at more than $200 billion a year in diet-related health-care costs — is unsustainable.

    BEYOND NUTRITIONISM

    To medicalize the diet problem is of course perfectly consistent with nutritionism. So what might a more ecological or cultural approach to the problem recommend? How might we plot our escape from nutritionism and, in turn, from the deleterious effects of the modern diet? In theory nothing could be simpler — stop thinking and eating that way — but this is somewhat harder to do in practice, given the food environment we now inhabit and the loss of sharp cultural tools to guide us through it. Still, I do think escape is possible, to which end I can now revisit — and elaborate on, but just a little — the simple principles of healthy eating I proposed at the beginning of this essay, several thousand words ago. So try these few (flagrantly unscientific) rules of thumb, collected in the course of my nutritional odyssey, and see if they don’t at least point us in the right direction.

    1. Eat food. Though in our current state of confusion, this is much easier said than done. So try this: Don’t eat anything your great-great-grandmother wouldn’t recognize as food. (Sorry, but at this point Moms are as confused as the rest of us, which is why we have to go back a couple of generations, to a time before the advent of modern food products.) There are a great many foodlike items in the supermarket your ancestors wouldn’t recognize as food (Go-Gurt? Breakfast-cereal bars? Nondairy creamer?); stay away from these.

    2. Avoid even those food products that come bearing health claims. They’re apt to be heavily processed, and the claims are often dubious at best. Don’t forget that margarine, one of the first industrial foods to claim that it was more healthful than the traditional food it replaced, turned out to give people heart attacks. When Kellogg’s can boast about its Healthy Heart Strawberry Vanilla cereal bars, health claims have become hopelessly compromised. (The American Heart Association charges food makers for their endorsement.) Don’t take the silence of the yams as a sign that they have nothing valuable to say about health.

    3. Especially avoid food products containing ingredients that are a) unfamiliar, b) unpronounceable, c) more than five in number — or that contain high-fructose corn syrup. None of these characteristics are necessarily harmful in and of themselves, but all of them are reliable markers for foods that have been highly processed.

    4. Get out of the supermarket whenever possible. You won’t find any high-fructose corn syrup at the farmer’s market; you also won’t find food harvested long ago and far away. What you will find are fresh whole foods picked at the peak of nutritional quality. Precisely the kind of food your great-great-grandmother would have recognized as food.

    5. Pay more, eat less. The American food system has for a century devoted its energies and policies to increasing quantity and reducing price, not to improving quality. There’s no escaping the fact that better food — measured by taste or nutritional quality (which often correspond) — costs more, because it has been grown or raised less intensively and with more care. Not everyone can afford to eat well in America, which is shameful, but most of us can: Americans spend, on average, less than 10 percent of their income on food, down from 24 percent in 1947, and less than the citizens of any other nation. And those of us who can afford to eat well should. Paying more for food well grown in good soils — whether certified organic or not — will contribute not only to your health (by reducing exposure to pesticides) but also to the health of others who might not themselves be able to afford that sort of food: the people who grow it and the people who live downstream, and downwind, of the farms where it is grown.

    “Eat less” is the most unwelcome advice of all, but in fact the scientific case for eating a lot less than we currently do is compelling. “Calorie restriction” has repeatedly been shown to slow aging in animals, and many researchers (including Walter Willett, the Harvard epidemiologist) believe it offers the single strongest link between diet and cancer prevention. Food abundance is a problem, but culture has helped here, too, by promoting the idea of moderation. Once one of the longest-lived people on earth, the Okinawans practiced a principle they called “Hara Hachi Bu”: eat until you are 80 percent full. To make the “eat less” message a bit more palatable, consider that quality may have a bearing on quantity: I don’t know about you, but the better the quality of the food I eat, the less of it I need to feel satisfied. All tomatoes are not created equal.

    6. Eat mostly plants, especially leaves. Scientists may disagree on what’s so good about plants — the antioxidants? Fiber? Omega-3s? — but they do agree that they’re probably really good for you and certainly can’t hurt. Also, by eating a plant-based diet, you’ll be consuming far fewer calories, since plant foods (except seeds) are typically less “energy dense” than the other things you might eat. Vegetarians are healthier than carnivores, but near vegetarians (“flexitarians”) are as healthy as vegetarians. Thomas Jefferson was on to something when he advised treating meat more as a flavoring than a food.

    7. Eat more like the French. Or the Japanese. Or the Italians. Or the Greeks. Confounding factors aside, people who eat according to the rules of a traditional food culture are generally healthier than we are. Any traditional diet will do: if it weren’t a healthy diet, the people who follow it wouldn’t still be around. True, food cultures are embedded in societies and economies and ecologies, and some of them travel better than others: Inuit not so well as Italian. In borrowing from a food culture, pay attention to how a culture eats, as well as to what it eats. In the case of the French paradox, it may not be the dietary nutrients that keep the French healthy (lots of saturated fat and alcohol?!) so much as the dietary habits: small portions, no seconds or snacking, communal meals — and the serious pleasure taken in eating. (Worrying about diet can’t possibly be good for you.) Let culture be your guide, not science.

    8. Cook. And if you can, plant a garden. To take part in the intricate and endlessly interesting processes of providing for our sustenance is the surest way to escape the culture of fast food and the values implicit in it: that food should be cheap and easy; that food is fuel and not communion. The culture of the kitchen, as embodied in those enduring traditions we call cuisines, contains more wisdom about diet and health than you are apt to find in any nutrition journal or journalism. Plus, the food you grow yourself contributes to your health long before you sit down to eat it. So you might want to think about putting down this article now and picking up a spatula or hoe.

    9. Eat like an omnivore. Try to add new species, not just new foods, to your diet. The greater the diversity of species you eat, the more likely you are to cover all your nutritional bases. That of course is an argument from nutritionism, but there is a better one, one that takes a broader view of “health.” Biodiversity in the diet means less monoculture in the fields. What does that have to do with your health? Everything. The vast monocultures that now feed us require tremendous amounts of chemical fertilizers and pesticides to keep from collapsing. Diversifying those fields will mean fewer chemicals, healthier soils, healthier plants and animals and, in turn, healthier people. It’s all connected, which is another way of saying that your health isn’t bordered by your body and that what’s good for the soil is probably good for you, too.

    Michael Pollan, a contributing writer, is the Knight professor of journalism at the University of California, Berkeley. His most recent book, “The Omnivore’s Dilemma,” was chosen by the editors of The New York Times Book Review as one of the 10 best books of 2006.


    Seventh and Final Potter Book Out July 21

    By Debbi Wilgoren
    Washington Post Staff Writer
    Thursday, February 1, 2007; 3:46 PM

    Mega-author J.K. Rowling today announced that the seventh — and apparently final — book in the blockbuster Harry Potter series would be released July 21, as soon as the clock ticks past midnight.

    “Harry Potter and the Deathly Hallows” will be available in bookstores at 12:01 a.m. July 21 in the United States, and at 12:01 a.m. British Summer Time (BST) in the United Kingdom and other English-speaking countries, according to Rowling’s Web site.

    The middle-of-the-night publication of previous installments of the wildly popular series, about a tousled boy wizard and his evil-fighting mates, has sparked long lines, elaborate book parties and frenzied expectations, all of which are expected only to grow in magnitude this time around.

    Bookseller and fan Web sites reacted immediately to the news of the publication date. Amazon.com and the site for Barnes & Noble posted banner headlines and urged readers to pre-order the book for at least 40 percent off the $34.99 list price. Mugglenet.com set up a countdown clock, which as of 11 a.m. today showed 161 days, 11 hours and 59 minutes remaining until the magic moment.

    “OMG YAY!!!! SO EXCITING!!!!!!!” read one excited message from a fan on the site [for those uninitiated in digital shorthand, OMG stands for 'Oh my God.'].

    Another fan, posting under the name rachel9isfine, wrote: “This rocks!! I’m so excited! This made my month!”

    Rowling announced the title of the book on her Web site Dec. 21, prompting huge speculation as to what it might mean (Hallows refers to a holy person or saint, according to the officials at Scholastic, the book’s U.S. publisher; it does not necessarily have anything to do with Godric’s Hollow, the name of the place where, in the fictional series, Harry’s parents are killed when he is a baby).

    The use of the adjective ‘deathly’ fueled the speculation about who would die in the book — Rowling has said in the past that two characters will meet their demise, but she did not (of course) let slip who they would be.

    Book Six ended with Harry vowing not to return for his final year at Hogwarts School of Witchcraft and Wizardry, determined instead to hunt down his mortal enemy Voldemort and make a final, life-or-death attempt to vanquish him forever.

    When the title was released six weeks ago, Rowling was still working on the book, according to an entry she posted on her Web site. “I’m now writing scenes that have been planned, in some cases, for a dozen years or even more . . . I am alternately elated and overwrought. I both want, and don’t want, to finish this book (don’t worry, I will).”

    Scholastic said the interior and cover art for “Deathly Hallows” will be illustrated by Mary GrandPré, who has illustrated the previous six books.

    “Harry Potter and the Half-Blood Prince,” the sixth installment in the series, was released in July of 2005, and was the fastest-selling book in history, with 6.9 million copies snapped up in the first 24 hours, Scholastic said.

    There are more than 120 million copies of the Harry Potter books in print in the United States alone. Each book published so far has topped the bestseller list in the United States, the United Kingdom and around the world.


    Today’s Papers

    Compromise Accomplished
    By Daniel Politi
    Posted Thursday, Feb. 1, 2007, at 5:53 AM E.T.

    The Washington Post leads with Democratic and Republican senators who oppose President Bush’s plan to send more troops to Iraq announcing last night they have reached a compromise and will support a resolution put forward by Republican Sen. John Warner of Virginia. The nonbinding resolution isn’t as strongly worded as the one Democrats preferred, but after Warner made some changes they decided it was their best chance to get Republican support. The Los Angeles Times leads with a look at how Iraq’s northern city of Kirkuk “could develop into a third front in the country’s civil war” as different groups vie to control the oil-rich region. The Wall Street Journal tops its world-wide newsbox with the latest quarterly report on Iraq reconstruction that found tens of millions of dollars were wasted. The paper also notes Iraq has stopped all flights to and from Syria and closed one border crossing with Iran as the government prepares for a security crackdown.

    The New York Times leads news that a German court issued arrest warrants for 13 people who were part of a CIA “abduction team” that detained a German citizen and held him for five months in Afghanistan. German authorities did not name the suspects, and said they were still trying to determine their true identities. The NYT notes the LAT was the first to report the story. USA Today leads with an early look at a study that says oil from the 1989 Exxon Valdez spill continues to cause problems for the ecosystem and wildlife. It’s going to take longer for the oil to disappear than many predicted and it “will be readily detectable for decades,” a scientist tells the paper.

    The basic gist of the resolution is the same: The Senate is opposed to the troop increase. Warner agreed to drop his initial wording that supported more troops for Anbar province. The resolution also won’t state that Bush’s plan goes against the national interest and includes language saying the Senate vows not to decrease funding for troops in the field. Democratic leaders in the House said they will write a resolution using the Senate’s as a blueprint.

    Kirkuk residents say they don’t want war, but everyone “appears to be preparing for it,” says the LAT. Kurds are dominant in the area, but Sunnis and Shiites also want a piece of the action and insist they will fight if necessary. To make matters even more complicated, this goes beyond Iraq, as Turkey and Iran are worried that if the Kurds do take control of Kirkuk, it could lead to an independent Kurdish region, which might “embolden Kurdish militants.” In advance of a constitutionally mandated referendum, some Kurdish officials are trying to remove voting rights of thousands of Arabs in the area, while pressuring them to leave. Officials in Turkey have vowed to intervene if necessary to maintain the population balance.

    Just in case you thought the details of the militia group that fought with Iraqi and American troops on Sunday couldn’t get more confusing, the NYT goes inside with the latest. Some Iraqi officials are now saying the leader of the group was a Sunni who was pretending to be a Shiite. Meanwhile, the U.S. military announced the death of four more American servicemembers. The LAT reports figures from the ministries of defense and health that reveal at least 2,067 Iraqis were killed in January.

    All the papers note that after much wrangling back and forth, the Justice Department has agreed to turn over files about its eavesdropping program to a select group of lawmakers but not to the public. The documents should, at the very least, reveal whether the court overseeing the program approves requests individually or has issued a blanket warrant allowing eavesdropping on a group of people.

    The NYT fronts, and everyone mentions, Sen. Joseph Biden of Delaware announcing he will run for president. Everyone focuses on how Biden was forced to spend the day trying to explain an interview in which he said Sen. Barack Obama is “the first mainstream African-American who is articulate and bright and clean and a nice-looking guy.”

    The NYT fronts a look at the fascinating story of a 29-year-old sex offender who pretended to be a 12-year-old for almost two years and lately had enrolled in a public school in Arizona for four months. He lived with three other men who passed themselves off as family members but were really a group of sex offenders. And, to make things even stranger, it seems two of the men he lived with and had sex with actually thought he was a minor.

    Everyone goes inside with the latest from the Libby trial, where, contrary to what most of the papers predicted yesterday, defense attorneys were able to question Judith Miller on her other sources. Turns out, she couldn’t remember other officials with whom she had talked about Valerie Plame.

    Then it was Time‘s Matt Cooper’s turn, and he testified that Karl Rove was the first to tell him about Joseph Wilson’s wife. He then got confirmation from Libby. The defense homed in on what it characterized as Cooper’s sloppy notes to attack his credibility. There was also discussion of how Cooper talked to Libby on a Saturday while “sprawled” on his bed after he had spent the morning swimming at a country club.

    In a separate Page One analysis, the LAT says that in “many respects it is the ugly mutual exploitation that goes on every day in Washington between powerful government officials and influential members of the media that is on trial.”

    Everyone notes the death of Molly Ivins, the liberal syndicated columnist who dedicated much of her work to making fun of powerful politicians. President Bush, a fellow Texan, was one of her most frequent targets. The WP says more than 400 newspapers subscribed to her column. Ivins was 62.

    Daniel Politi writes “Today’s Papers” for Slate. He can be reached at todayspapers@slate.com.


    Renewal, in Real Estate and in Culture, for Ancient People

    Ashley Gilbertson for The New York Times

    A vast flock of sparrows is sprinkled above the ghetto of Rome, a Jewish community for 450 years, as a man examines the menu of a kosher falafel restaurant. Soaring real estate values are enticing many of the few remaining Jews to leave, but the Jewish identity of the district is being revived with new shops and restaurants.

    Renewal, in Real Estate and in Culture, for Ancient People

    ROME, Jan. 25 — As a boy, in October 1943, Pacifico Disegni watched from his window as two German trucks hauled people from the ghetto in Rome, a city where Jews have lived for 2,000 years.

    Last year, in blessedly more peaceful times, a rich visitor from Boston took in the view from that same window. A magnificent front-row view of the Theater of Marcellus, first planned by Julius Caesar, somehow salves the sting of history.

    Mr. Disegni, now 78, said the man produced a blank check and offered to buy the apartment on the spot.

    “He said, ‘You write how many millions you want,’ ” Mr. Disegni said.

    Mr. Disegni, who is Jewish, refused. But these bookend events at his window cast light on a paradox in the city with the oldest Jewish population in Europe. High real estate prices, not violence or bias, are driving the last Jews from their homes in the old ghetto, which is slowly transforming itself into a trendy enclave for the rich and famous.

    Experts say only 200 or 300 Jews remain, in a neighborhood that numbered perhaps 9,000 after the deportation of 2,000 or more during World War II.

    But there is a second paradox. Even as the number of Jews living in the ghetto drops to near nothing, Jewish life is thriving.

    Rome’s Jewish school recently moved to the ghetto from a neighboring area. Jewish shops, including the first kosher fast-food restaurants, are popular. Visits to the museum at the grand synagogue have doubled in two years.

    “Even if Jews no longer live in the area, they come to open their shops,” said Daniela Di Castro, director of the Jewish Museum of Rome. “So there is always Jewish life around, to work, to go to the synagogue, to buy from the kosher market, bring their children to school.

    “You always have a reason to come here if you are a Jew.”

    It is a dynamic of complex layers, defying media alarmism about the loss of Jewish character in central Rome, but not quite assuring that character’s ultimate survival.

    On the other hand, this is Italy, where history moves at its own unpredictable pace. For now, few locals can imagine the ghetto as having a soul that is anything other than Jewish.

    “It would be impossible to erase it,” said Luciano Calò, 45, a Jew who owns Bartaruga, a bar next to one of Rome’s most sublime fountains, featuring four boys playing with turtles, a whimsy Bernini added 83 years after the fountain was finished.

    “History was born here,” Mr. Calò added. “And the tourists come here because of that history in the walls of these buildings. You feel the desperation of the people who lived here.”

    Jews are documented in Rome as early as the second century B.C., first as respected guests from the empire’s far reaches, later as slaves who helped build the Colosseum, finished in A.D. 80.

    In 1516 the Jews of Venice were the first in Europe to be segregated — and there the word “ghetto” was born, from the local dialect for the slag heaps in the area where Jews were forced to live.

    In 1555 a papal bull established a ghetto in Rome, laid out near the Tiber, amid the nubby, desiccated ruins, and locked at night. That entity was not abolished until Italy’s unification in 1870, but Jews continued to live there, often in deep poverty, in buildings with inadequate heat and plumbing.

    Those conditions drove many Jews to leave the ghetto after World War II, settling in more modern apartments in Monteverde or near Viale Marconi, to the south. Many moved to Israel.

    As the years went on, the rich began buying up homes all around central Rome, including in the ghetto. Prices and quality went up — and then up much more when Italy converted to the euro in 2002.

    A real estate operator, Daniela Di Maulo, said apartments in the ghetto now cost as much as $1,000 a square foot.

    “It’s only for tourists, for people on the magazine covers,” she said.

    Speculation exploded, and the choicest properties were often those of the district’s remaining Jews, many of them elderly. One is Roberto Calò, 75, who said he had fended off at least 10 offers for millions of euros.

    “I would never accept,” said Mr. Calò, the uncle of the bar owner. “It’s because I have old memories,” among them of his father and brother, who were taken away by the Nazis in October 1943 (and who were among the few who returned). But many did sell. With prices so high, few are casting blame.

    “It could be that there is an offer you can’t say no to,” said Angelo Sermoneta, 58, who was born in the ghetto. “There is an even greater god, and that is the god of money.”

    But money is not the only force in the ghetto. Mr. Calò and Mr. Sermoneta were sitting in a cozy social club in the heart of the ghetto, along with 11 older friends who were all born there. Only four of them still lived there.

    But they all go to the club regularly — to chat (about sports, and about politics in Italy and Israel) and to drink — basically to keep the ghetto alive and Jewish. So, too, Jews who live around Rome worship in the ghetto, socialize there, work there, because commercial space is not as pricey as apartments.

    “For us Jews to come here every night — it’s something that’s in our DNA,” Mr. Sermoneta said. “It’s where we were born, where we lived, where our friends are.”

    Street life has become even more Jewish, with shops and restaurants offering Jewish products and food that attract tourists, many of them American Jews, and that keep Rome’s Jews anchored there. Seven years ago, Rafael Fadlon, now 36, opened the ghetto’s first modern kosher restaurant, La Taverna del Ghetto, and last year, he started its first kosher fast-food restaurant — both of which, he reports, are doing well.

    “If you want to keep kosher, it’s simpler now than 10 years ago,” he said.

    “If you are talking about Jews living here, it’s not so much,” he added. “But it’s much better equipped than other ghettos around the world.”

    With prices as they are, he cannot see Jews coming back to live in the ghetto.

    “Jews are not stupid,” he said. “They would like to move here, but they can’t.”

    Peter Kiefer contributed reporting.



    Berlusconi Flirts. His Wife’s Fed Up. Read All About It.

    Ettore Ferrari/European Pressphoto Agency

    Silvio Berlusconi shown with Mara Carfagna, a lawmaker, in May. His wife wants a public apology for his behavior with her and other women.

    Berlusconi Flirts. His Wife’s Fed Up. Read All About It.

    ROME, Jan. 31 — “Dear Editor,” began a letter published Wednesday on the front page of La Repubblica, the newspaper that Silvio Berlusconi hates most. The scalding letter demanded an apology from Mr. Berlusconi for flirting publicly — and it was signed by his wife.

    And so, a nation bored and a little down at its return to semi-normal politics woke to a juicy news cycle with an inescapable conclusion: in or out of power, Mr. Berlusconi may behave reprehensibly, but Italy cannot keep its eyes off him.

    “We have had for eight months a notably boring government,” said Giuliano Ferrara, an editor and informal aide to Mr. Berlusconi, referring to the stewardship of Prime Minister Romano Prodi, who beat Mr. Berlusconi in elections last spring.

    “And right now there is an explosion of strange and weird vitality, the heart that keeps on pumping,” he said. “People miss very much that style. It’s not healthy, but it’s Italian.”

    It turns out that the 70-year-old former prime minister, whose own heart now beats with a pacemaker, attended an awards ceremony last week and was overly friendly with two young and beautiful guests.

    “If I weren’t already married, I would marry you right now,” he told one, according to Italian news media accounts. To another, he said, “With you I would go anywhere.”

    “These are statements I consider damaging to my dignity,” wrote Veronica Lario, 50, who has been with Mr. Berlusconi for 27 years. His remarks could not be “reduced to jokes,” she said.

    “To my husband and to the public man, I therefore ask for a public apology, not having received one privately.”

    In divining what this could mean, Italians barely knew where to start.

    Feminists called it an overdue rallying cry for Italian women like Ms. Lario, who has endured years of supposed infidelity (and no end of sexual remarks, as when Mr. Berlusconi opened a political conference by praising the legs of the women in the front row). Political analysts said Mr. Berlusconi, who wants a third turn as prime minister, could never again win the votes of women — and so was finished.

    Then, in early evening, Mr. Berlusconi, who can never be counted out, wrote his own open letter, released by Forza Italia, his political party.

    “Your dignity should not be an issue: I will guard it like a precious material in my heart even when thoughtless jokes come out of my mouth,” he wrote. “But marriage proposals, no, believe me, I have never made one to anyone.

    “Forgive me, however, I beg of you, and take this public testimony of private pride that submits to your anger as an act of love. One among many. A huge kiss. Silvio.”

    In the end, it seemed an especially spicy episode in the long and complicated relationship not only between Silvio and Veronica, but also between Silvio and Italy. The private drama of Italy’s richest man, the nation’s shrewd, shady and irrepressible personification, became something public, possibly even relevant politically and psychically.

    There seemed little question here that Ms. Lario’s letter deserved its spot on the front page. “It would be like Hillary Clinton asks for the public apology from Bill Clinton,” said Ezio Mauro, La Repubblica’s top editor.

    Indeed, Italy’s top three evening talk shows devoted all their time to the unusual exchange of letters. Beppe Severgnini, one of the most discerning commentators on Italian mores, quickly churned out a column for Corriere della Sera summing up its import.

    “The man is a walking oxymoron, but it has not stopped him from working his way up,” he wrote. “Why? Simple: because he embodies the Italian dream of being everything, of pleasing everyone (and indulging himself in everything), without giving up anything.”

    Perhaps all marriages are mysteries on some level, but the drama also shed light on one of Italy’s most visible but ambiguous couples. They met in 1980, when he was a budding, and married, builder and she was a beautiful B-movie actress appearing in a play in Milan. He saw her on stage, the story goes, and fell deeply in love.

    He left his first wife, they married and had three children (he already had two). He grew richer, entered politics in the mid-1990s, and the two seemed somehow together yet increasingly apart. No small amount of his public persona was linked to his constant, earthy joking about women and his mastery of them, amid rumors that monogamy was not among his virtues.

    “I lost my hair because I had too many girlfriends,” he once said (he has since had implants). In 2003, he gave a reason foreigners should invest in Italy: “Aside from the good weather, we have beautiful businesswomen and also beautiful secretaries.”

    Through it all, Ms. Lario remained largely silent — a fact she noted acidly in the letter, which she pointedly signed “Mrs. Berlusconi” though she routinely uses her own name. “I chose not to leave space for marital conflicts, even when his behavior created reasons to do so,” she wrote.

    But not entirely: she made no secret over the years that her personal political views were more to the left than her husband’s. Maria Latella, an Italian journalist who wrote a biography of Ms. Lario, recalled that during Mr. Berlusconi’s first term as prime minister, in 1994, a newspaper article appeared saying that every day he sent flowers to someone.

    He contended they were to his wife. But Ms. Latella noted that Ms. Lario had sent the newspaper a brief letter saying that, in fact, she never received flowers from Palazzo Chigi, the prime minister’s official residence.

    “She considered it humiliating that flowers sent to another person were attributed to her,” Ms. Latella said. “It shows the character of the person.”

    Ms. Lario also spoke candidly in the biography, saying that she rarely saw Mr. Berlusconi but that she considered their marriage stable and herself “the perfect kind of wife for the kind of man Silvio is.”

    “He can concentrate on himself and his work knowing his wife won’t create a fuss if he’s away from his family,” she said in the biography.

    As fate would have it, on the very same day that Ms. Lario fired off her letter, Mr. Berlusconi echoed his wife’s comments, now possibly void, in an interview he gave, also to Ms. Latella, for her magazine, A.

    “Veronica has always been a total passion,” he said. “When we met I lost my head for her. And she has been a marvelous mother.

    “She has never made me look bad, never — while the wives of certain other politicians…,” he said, trailing off his thought. “And then she is so indulgent. What more could I want?”

    Ms. Latella said, “I think he was wrong this time.”

    Peter Kiefer contributed reporting.



    Peyton Manning Huckster

    Peyton Manning Overjoyed His Commercials Will Finally Appear In Super Bowl

    February 1, 2007 | Issue 43•05

    INDIANAPOLIS—Colts quarterback Peyton Manning took a moment during Super Bowl Media Day Tuesday to acknowledge his “deep, abiding joy and pride” that, after many years of attempting to make his presence felt on advertising’s biggest stage, his commercials would finally be coming to the Super Bowl. “There’s no greater honor for a major player in the endorsement game than to get to the Super Bowl,” said Manning, a three-time AdWeek MVP who is attempting to prove once and for all that he can land the big campaign. “My dad was a great pitchman, but he never got here. People said I would never get here. But on Super Bowl Sunday, Sprint, DirecTV, MasterCard, Sony and I plan to prove them all wrong. I guarantee it.” Manning will also be playing quarterback for the Colts during the game, although he is not expected to be televised nearly as much in that capacity.


    Wednesday, January 31, 2007

    Schoolboy Friedkin, Boxer, Dies at 89

    Bernie Friedkin in 1935.

    Schoolboy Friedkin, Boxer, Dies at 89

    Bernie Friedkin, a native of Brownsville, Brooklyn, who was known as Schoolboy and who as a professional boxer in the late 1930s and early ’40s battled many of his opponents to a draw — including three former lightweight champions — died Jan. 18 in Brooklyn. He was 89.

    He died of natural causes at a hospice, his granddaughter Sabrina Saltz said.

    Given the nickname Schoolboy because of his baby face and 5-foot-3 height, and because he used his older brother’s birth certificate to be admitted to local gyms when he was 14, Friedkin developed into a skilled tactical fighter, rather than a hard puncher, in a six-year professional career that began in 1935. He started as a featherweight, at less than 126 pounds, but bulked up to 135 as a lightweight.

    According to records at the International Boxing Hall of Fame in Canastota, N.Y., Friedkin won 48 fights, 9 by knockout, with 11 losses and 16 draws.

    “He has more draws than losses,” the boxing historian Bert Sugar said. “Sixteen draws is almost an unheard-of number.”

    In March 1937, Friedkin fought the former lightweight champion Eligio Sardiñas, known as Kid Chocolate, to a draw. In January 1940, he faced Mike Belloise, another former lightweight champion, twice, with both bouts ending in draws. Five months later, he stepped into the ring with a third former lightweight champion, Petey Scalzo; another draw.

    “That tells me he might have won some of these fights,” Sugar said, “but they were protecting the bigger names.”

    In November 1940, five months after their draw, Friedkin and Scalzo faced each other again. This time, Scalzo won an eight-round decision.

    Bernard Friedkin was born on July 10, 1917, one of seven children of Morris and Bessie Friedkin. Besides his granddaughter Sabrina Saltz of Brooklyn, he is survived by his wife of 60 years, the former Lenore Bennett; two daughters, Donna Saltz and Marilyn Saewitz, both of Staten Island; and two other grandchildren.

    On July 21, 1938, a bitter rivalry between Friedkin and Al Davis, known as Bummy, brought 6,000 fans to Madison Square Garden. Friedkin was knocked out in the fourth round.

    “This was a turf war,” Sugar said, “two Jewish boxers from Brownsville.”

