Elementary Science Teaching Forum

The Primary-Science Gourmand Says: Cooking → Humans [Reading Notes]

1#




1. Introduction
Original English title: Catching Fire: How Cooking Made Us Human
Chinese title: 着火 烹饪如何让我们成为人类 [no Chinese translation has been published]
Core thesis: Cooking drove the transition from our ape-like ancestors to Homo erectus.
                Cooking and cooked food shrank the digestive tract while the brain grew, helped build human society, and created the division of labor between men and women.

Contents [I strongly recommend this book; read whichever floors interest you]:

[Floor 2] CHAPTER 1 - Quest for Raw-Foodists

[Floor 3] CHAPTER 2 - The Cook’s Body

[Floor 4] CHAPTER 3 - The Energy Theory of Cooking

[Floor 5] CHAPTER 4 - When Cooking Began

[Floor 6] CHAPTER 5 - Brain Foods

[Floor 8] CHAPTER 6 - How Cooking Frees Men

[Floor 9] CHAPTER 7 - The Married Cook

[Floor 10] CHAPTER 8 - The Cook’s Journey


2. Excerpts
My personal selection of what I consider the key passages, for reference only. [The full text is attached in the floors below.]

Evolutionary benefits of adapting to cooked food are evident from comparing human digestive systems with those of chimpanzees and other apes. The main differences all involve humans having relatively small features. We have small mouths, weak jaws, small teeth, small stomachs, small colons, and small guts overall. In the past, the unusual size of these body parts has mostly been attributed to the evolutionary effects of our eating meat, but the design of the human digestive system is better explained as an adaptation to eating cooked food than it is to eating raw meat.


Continuing farther into the body, our stomachs again are comparatively small. In humans the surface area of the stomach is less than one-third the size expected for a typical mammal of our body weight, and smaller than in 97 percent of other primates. The high caloric density of cooked food suggests that our stomachs can afford to be small. Great apes eat perhaps twice as much by weight per day as we do because their foods are packed with indigestible fiber (around 30 percent by weight, compared to 5 percent to 10 percent or less in human diets). Thanks to the high caloric density of cooked food, we have modest needs that are adequately served by our small stomachs.

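A quick sketch of the arithmetic behind "twice as much by weight": the fiber fractions are the ones quoted in the excerpt above, but the per-gram energy densities are round numbers assumed purely for illustration, not figures from the book.

```python
# Why a high-fiber raw diet forces a much larger daily food mass.
# Fiber fractions come from the excerpt; kcal-per-gram values are assumed.
def daily_food_mass_g(kcal_needed, fiber_fraction, kcal_per_digestible_gram):
    """Grams of food needed per day to hit a calorie target."""
    digestible = 1.0 - fiber_fraction
    return kcal_needed / (digestible * kcal_per_digestible_gram)

# Same 2,000 kcal target; ape food is ~30% fiber and, raw, less energy-dense.
ape_g = daily_food_mass_g(2000, fiber_fraction=0.30, kcal_per_digestible_gram=1.5)
human_g = daily_food_mass_g(2000, fiber_fraction=0.075, kcal_per_digestible_gram=2.5)
print(round(ape_g / human_g, 1))  # ratio of daily food mass, roughly 2x
```

With these illustrative densities the ape must eat about twice the mass of food each day, matching the excerpt's order of magnitude.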

Below the stomach, the human small intestine is only a little smaller than expected from the size of our bodies, reflecting that this organ is the main site of digestion and absorption, and humans have the same basal metabolic rate as other primates in relation to body weight. But the large intestine, or colon, is less than 60 percent of the mass that would be expected for a primate of our body weight. The colon is where our intestinal flora ferment plant fiber, producing fatty acids that are absorbed into the body and used for energy. That the colon is relatively small in humans means we cannot retain as much fiber as the great apes can and therefore cannot utilize plant fiber as effectively for food. But that matters little. The high caloric density of cooked food means that normally we do not need the large fermenting potential that apes rely on.


Finally, the volume of the entire human gut, comprising stomach, small intestine, and large intestine, is also relatively small, less than in any other primate measured so far. The weight of our guts is estimated at about 60 percent of what is expected for a primate of our size: the human digestive system as a whole is much smaller than would be predicted on the basis of size relations in primates.

Our small mouths, teeth, and guts fit well with the softness, high caloric density, low fiber content, and high digestibility of cooked food. The reduction in size increases efficiency, sparing us the unnecessary metabolic costs of features whose only purpose would be digesting large volumes of high-fiber food. Mouths and teeth need not be large to chew soft, energy-dense food, and the shrinking of the jaw muscles may suit the low chewing forces that cooked food requires; smaller size also means less tooth damage and subsequent disease. As for the gut, physical anthropologists Leslie Aiello and Peter Wheeler report that the reduction in human gut size saves humans at least 10 percent of daily energy expenditure compared with the great apes: the more gut tissue in a body, the more energy must be spent on its metabolism. Thanks to cooking, the kind of high-fiber food that great apes eat is no longer a useful part of our diet. The suite of changes in the human digestive system makes sense.

Cooked food is better than raw food because life is mostly concerned with energy. So from an evolutionary perspective, if cooking causes a loss of vitamins or creates a few long-term toxic compounds, the effect is relatively unimportant compared to the impact of more calories. A female chimpanzee with a better diet gives birth more often and her offspring have better survival rates. In subsistence cultures, better-fed mothers have more and healthier children. In addition to more offspring, they have greater competitive ability, better survival, and longer lives. When our ancestors first obtained extra calories by cooking their food, they and their descendants passed on more genes than others of their species who ate raw. The result was a new evolutionary opportunity.


The introduction of cooking may well have been the decisive factor in leading man from a primarily animal existence into one that was more fully human.

—CARLETON S. COON, The History of Man


The inability of the archaeological evidence to tell when humans first controlled fire directs us to biology, where we find two vital clues. First, the fossil record presents a reasonably clear picture of the changes in human anatomy over the past two million years. It tells us what were the major changes in our ancestors’ anatomy, and when they happened. Second, in response to a major change in diet, species tend to exhibit rapid and obvious changes in their anatomy. Animals are superbly adapted to their diets, and over evolutionary time the tight fit between food and anatomy is driven by food rather than by the animal’s characteristics. Fleas do not suck blood because they happen to have a proboscis well designed for piercing mammalian skin; they have the proboscis because they are adapted to sucking blood. Horses do not eat grass because they happen to have the right kind of teeth and guts for doing so; they have tall teeth and long guts because they are adapted to eating grass. Humans do not eat cooked food because we have the right kind of teeth and guts; rather, we have small teeth and short guts as a result of adapting to a cooked diet.


Two kinds of evidence thus point independently to the origin of Homo erectus as the time when cooking began. First, anatomical changes related to diet, including the reduction in tooth size and in the flaring of the rib cage, were larger than at any other time in human evolution, and they fit the theory that the nutritional quality of the diet improved and the food consumed was softer. Second, the loss of traits allowing efficient climbing marked a commitment to sleeping on the ground that is hard to explain without the control of fire.


For more than 2.5 million years our ancestors have been cutting meat off animal bones, and the impact was huge. A diet that included raw meat as well as plant foods pushed our forebears out of the australopithecine rut, initiated the evolution of their larger brains, and probably inspired a series of food-processing innovations. But according to the evidence carried in our bodies, it would take the invention of cooking to convert habilines into Homo erectus, and launch the journey that has led without any major changes to the anatomy of modern humans.


The discovery that gut size varies substantially gave Aiello and Wheeler the opening they were looking for. Relative to their body weight, primates with smaller guts proved to have larger brains—just the kind of trade-off that had been expected. Aiello and Wheeler estimated the number of calories a species is able to save by having a small gut, and showed that the number nicely matched the extra cost of the species’ larger brains. The anthropologists concluded that primates that spend less energy fueling their intestines can afford to power more brain tissue. Big brains are made possible by a reduction in expensive tissue. The idea became known as the expensive tissue hypothesis.


The expensive tissue hypothesis predicted that major rises in human brain size would be associated with increases in diet quality. Aiello and Wheeler identified two such rises. The first brain-size expansion was around two million years ago from australopithecines to Homo erectus. In line with the Man-the-Hunter scenario, the scientists credited this rise in brain size to the increased eating of meat. Second was a little more than half a million years ago, when Homo erectus became Homo heidelbergensis. They attributed this rise to the only other obvious candidate for an improvement in dietary quality: cooking.


Such improvements in cooking efficiency could explain why there was a steady upward trend in brain size during the lifetimes of the early human species. Brains were notably bigger in late Homo erectus than in early Homo erectus, and in late Homo heidelbergensis than in early Homo heidelbergensis. Major dietary breakthroughs such as meat eating and the invention of cooking cannot account for these smaller changes. The steady rise in brain size between the major jumps is most easily explained by a series of improvements in cooking techniques. Perhaps some particularly important advances enabled the prominent rise in brain size with Homo heidelbergensis.


Although the breakthrough of using fire at all would have been the biggest culinary leap, the subsequent discovery of better ways to prepare the food would have led to continual increases in digestive efficiency, leaving more energy for brain growth. The improvements would have been especially important for brain growth after birth, since easily digested weaning foods would have been critical contributors to a child’s energy supply. Advances in food preparation may thus have contributed to the extraordinary continuing rise in brain size through two million years of human evolution—a trajectory of increasing brain size that has been faster and longer-lasting than known for any other species. When Charles Darwin called cooking “probably the greatest [discovery], excepting language, ever made by man,” he was thinking merely of our improved food supply. But the idea that brain enlargement was made possible by improvements in diet suggests a wider significance. Cooking was a great discovery not merely because it gave us better food, or even because it made us physically human. It did something even more important: it helped make our brains uniquely large, providing a dull human body with a brilliant human mind.



3. Key Takeaways

Cooking → Human Evolution


  • Anatomical changes

    • A streamlined digestive system: humans evolved a relatively small mouth, weaker jaws and chewing muscles, small teeth, a compact stomach, a short colon, and an overall reduced gut. These adaptations closely match the properties of cooked food: soft, calorie-dense meals need neither powerful chewing nor bulky digestive organs, which raises digestive efficiency and saves metabolic cost.
    • An expanded brain: under the expensive tissue hypothesis, the energy saved by a smaller gut supported brain development. From the australopithecines through Homo habilis and Homo erectus to Homo sapiens, brain volume grew markedly and continuously, from roughly 450 cc to about 1,400 cc, a trajectory closely tied to the energy surplus that cooking provided.
  • Changes in how time is used

    • Less time spent eating: because cooked food is soft, human chewing time fell to about 1 hour per day, versus the 4-6 hours that chimpanzees and other primates spend chewing raw food. The freed time let men concentrate on hunting while women took up gathering and food processing, deepening the specialization of labor.
    • More opportunities for socializing and learning: the whole sequence of finding ingredients, preparing them, and sharing the cooked meal is rich in social interaction, promoting communication and cooperation, knowledge transmission, and cultural evolution, and laying the groundwork for human social relationships.
  • The shaping of social behavior

    • A stable family structure: cooking produced a stable pattern of food sharing between the sexes, with women gathering and cooking while men hunted and provisioned, a web of mutual dependence that strengthened the family bond and raised both offspring survival and group viability.
    • Stronger group cooperation: the group activities organized around cooking and eating promoted rules for food distribution, division of labor, and social norms, and gave rise to cooperative mechanisms such as common defense, resource management, and collective decision-making, laying foundations for complex societies and civilization.
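The headline figures in the list above can be lined up in a short sketch; all values are the rough numbers cited in these notes, not precise measurements.

```python
# Time freed by cooked food, and overall brain expansion, using the
# approximate figures quoted in the notes above.
chimp_chewing_h = 5.0   # midpoint of the 4-6 h/day cited for raw-food chewing
human_chewing_h = 1.0   # ~1 h/day of chewing on a cooked diet

hours_freed = chimp_chewing_h - human_chewing_h
print(hours_freed)  # hours per day freed for hunting, gathering, socializing

brain_early_cc = 450    # ~450 cc, early hominin (australopithecine) brain
brain_modern_cc = 1400  # ~1,400 cc, modern human brain
print(round(brain_modern_cc / brain_early_cc, 1))  # roughly threefold growth
```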

The Expensive Tissue Hypothesis

    The brain is an energy-hungry organ, and its evolutionary expansion required ample energy that had to come from economizing elsewhere in the body; the gut became the key site of this reallocation. Early in human evolution the diet was dominated by high-fiber, low-energy raw food. Digesting it required a large, costly gut system, including a capacious stomach, a long intestinal tract, and strong digestive musculature, to cope with the coarse texture and complex nutrient structure of raw food. This consumed a large share of the body's energy and limited the resources available for brain development.

    The invention of cooking was the turning point. Cooked food is soft and its nutrients are easily absorbed, which sharply lowered the difficulty and energy cost of digestion. As cooking spread, the gut's workload shrank along with its size and weight, and the saved energy was redirected to the brain. Gut mass fell to about 60 percent of the value expected for a primate of our size, and the released energy supplied the critical fuel for brain evolution: more neurons, a denser network of neural connections, and more complex cognitive functions, driving a sustained rise in brain volume (Homo erectus, for example, had a markedly larger brain than Homo habilis).

    [Eating cooked food greatly lowers the cost of extracting calories; food intake and digestion time both fall sharply, and the body gains spare energy. The result was a steadily enlarging brain able to process ever more complex information, master new weapons and hunting methods, and adapt to new social relationships, an evolutionary advantage that eventually produced Homo sapiens, modern humans.]
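A back-of-envelope version of the energy reallocation described above. The ~10% gut savings figure is Aiello and Wheeler's, quoted earlier in these notes; the basal metabolic rate and the ~20% brain share are commonly cited round numbers assumed here purely for illustration.

```python
# Expensive tissue trade-off, back of the envelope.
basal_kcal = 1400           # assumed adult basal metabolic rate, kcal/day
gut_saving_fraction = 0.10  # ~10% of daily energy saved by the smaller gut (notes)
brain_share = 0.20          # ~20% of basal metabolism spent on the brain (assumed)

kcal_freed = basal_kcal * gut_saving_fraction  # energy released by the reduced gut
brain_cost = basal_kcal * brain_share          # rough daily running cost of the brain
print(kcal_freed, brain_cost)  # the gut savings cover a large slice of the brain's bill
```

Under these assumptions the smaller gut frees on the order of 140 kcal/day, roughly half the brain's assumed running cost, which is the shape of the trade-off the hypothesis describes.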


4. Summary

Cooking → Human Evolution

① A smaller gut: cooking makes food easier to digest, allowing the digestive tract to shrink and freeing resources for brain growth.

② A bigger brain: cooking raises the nutritional value and usability of food, so early hominins could obtain more and better energy to support brain development.

③ A changed social structure: cooking changed how humans eat, which in turn reshaped social structure (e.g., shared meals promote communication and cooperation).

④ A division of labor by sex: cooking led to a division of labor between men and women, with women doing more of the cooking and men handling hunting, gathering, and dealings with the outside.


Chen's note: in a word, food safety: eat more cooked food and less raw food. This is science, and it is also the two-way choice of human evolution.


5. Series
Last edited by 稻草堆里的鱼 on 2024-12-05 13:25:43
2#

CHAPTER 1



Quest for Raw-Foodists


“My definition of Man is, a ‘Cooking Animal’. The beasts have memory, judgement, and all the faculties and passions of our mind, in a certain degree; but no beast is a cook. . . . Man alone can dress a good dish; and every man whatever is more or less a cook, in seasoning what he himself eats.”

—JAMES BOSWELL, Journal of a Tour to the Hebrides with Samuel Johnson





Animals thrive on raw diets. Can humans do the same? Conventional wisdom has always assumed so, and the logic seems obvious. Animals live off raw food, and humans are animals, so humans should fare well on raw food. Many foods are perfectly edible raw, from apples, tomatoes, and oysters to steak tartare and various kinds of fish. Tales of raw diets are numerous. According to Marco Polo, Mongol warriors of the thirteenth century supposedly rode for ten days at a time without lighting a fire. The riders’ food was the raw blood of their horses, obtained by piercing a vein. The cavalry saved time by riding without cooking, and they avoided producing the smoke that might reveal their position to hostile forces. The men did not like the liquid diet and looked forward to a cooked meal when speed was not essential, but there is no suggestion that they suffered from it. Such stories make cooking seem like a luxury, unimportant to our biological needs. But consider the Evo Diet experiment.

In 2006 nine volunteers with dangerously high blood pressure spent twelve days eating like apes in an experiment filmed by the British Broadcasting Corporation. They lived in a tented enclosure in England’s Paignton Zoo and ate almost everything raw. Their diet included peppers, melons, cucumbers, tomatoes, carrots, broccoli, grapes, dates, walnuts, bananas, peaches, and so on—more than fifty kinds of fruits, vegetables, and nuts. In the second week they ate some cooked oily fish, and one man sneaked some chocolate. The regime was called the Evo Diet because it was supposed to represent the types of foods our bodies have evolved to eat. Chimpanzees or gorillas would have loved it and would have grown fat on a menu that was certainly of higher quality than they could find in the wild. The participants ate until they were full, taking in up to 5 kilograms (11 pounds) by weight per day. The daily intake was calculated by the experiment’s nutritionist to include an adequate 2,000 calories for women, and 2,300 calories for men.

The aim of the volunteers was to improve their health, and they succeeded. By the end of the experiment their cholesterol levels had fallen by almost a quarter and average blood pressure was down to normal. But while medical hopes were met, an extra result had not been anticipated. The volunteers lost a lot of weight—an average of 4.4 kg (9.7 pounds) each, or 0.37 kg (0.8 pounds) per day.
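The Evo Diet figures above check out as simple division:

```python
# Verifying the weight-loss rate quoted for the Evo Diet experiment.
total_loss_kg = 4.4   # average loss per volunteer over the experiment
days = 12             # length of the experiment
print(round(total_loss_kg / days, 2))  # 0.37 kg/day, as stated in the text
```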

The question of what kind of diet we need is critical for understanding human adaptation. Are we just an ordinary animal that happens to enjoy the tastes and securities of cooked food without in any way depending on them? Or are we a new kind of species tied to the use of fire by our biological needs, relying on cooked food to supply enough energy to our bodies? No serious scientific tests have been designed to resolve this problem. But whereas the Evo Diet investigation was short-term and informal, a few studies of long-term raw-foodists give us systematic data with a similar result.





Raw-foodists are dedicated to eating 100 percent of their diets raw, or as close to 100 percent as they can manage. There are only three studies of their body weight, and all find that people who eat raw tend to be thin. The most extensive is the Giessen Raw Food study, conducted by nutritionist Corinna Koebnick and her colleagues in Germany, which used questionnaires to study 513 raw-foodists who ate from 70 percent to 100 percent of their diet raw. They chose to eat raw to be healthy, to prevent illness, to have a long life, or to live naturally. Raw food included not only uncooked vegetables and occasional meat, but also cold-pressed oil and honey, and some items that were lightly heated such as dried fruits, dried meat, and dried fish. Body mass index (BMI), which measures weight in relation to the square of the height, was used as a measure of fatness. As the proportion of food eaten raw rose, BMI fell. The average weight loss when shifting from a cooked to a raw diet was 26.5 pounds (12 kilograms) for women and 21.8 pounds (9.9 kilograms) for men. Among those eating a purely raw diet (31 percent), the body weights of almost a third indicated chronic energy deficiency. The scientists’ conclusion was unambiguous: “a strict raw food diet cannot guarantee an adequate energy supply.”
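BMI, as used in the Giessen study, is simply weight over height squared. A hypothetical worked example: the 70 kg starting weight and 1.70 m height below are assumed for illustration; only the 12 kg average loss comes from the study.

```python
# BMI = weight (kg) / height (m)^2, as defined in the passage above.
def bmi(weight_kg, height_m):
    return weight_kg / height_m ** 2

# Hypothetical woman: assumed 70 kg at 1.70 m, losing the study's average 12 kg.
before = bmi(70.0, 1.70)
after = bmi(70.0 - 12.0, 1.70)
print(round(before, 1), round(after, 1))  # BMI before and after the average loss
```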

The amount of meat in the Giessen Raw Food diets was not recorded but many raw-foodists eat rather little meat. Could a low meat intake have contributed to their poor energy supply? It is possible. However, among people who eat cooked diets, there is no difference in body weight between vegetarians and meat eaters: when our food is cooked we get as many calories from a vegetarian diet as from a typical American meat-rich diet. It is only when eating raw that we suffer poor weight gain.

The energy consequences of forgoing cooked food lead to a consistent reaction, illustrated by journalist Jodi Mardesich when she became a raw-foodist. “I’m hungry. These days, I’m almost always hungry,” she wrote. A typical day began at 7 A.M. when she cut and juiced two ounces of wheat grass. At 8:30 A.M. she had a bowl of “energy soup,” which she describes as a “room-temperature concoction made of sunflower greens, which are the tiny first shoots of a sunflower plant, and rejuvelac, a fermented wheat drink that tastes a lot like bad lemonade.” She added a couple of spoonfuls of blended papaya for interest. Lunch was a salad of sunflower greens, sprouted fenugreek seeds, sprouted broccoli seeds, fermented cabbage, and a loaf made of sprouted sunflower seeds, dehydrated seaweed, and some vegetables. Dinner was more sprouts, avocado chunks, pineapple, red onion, olive oil, raw vinegar, and sea salt. An hour later she was hungry again. In photographs she looks distinctly thin, but she was happy. She described herself as feeling energized, mentally sharper, and more serene. Nevertheless, after six months, during which she lost 18 pounds (8.2 kilograms), she could not resist slipping out for a pizza. Mardesich was not alone in finding a wholly raw diet a challenge. The Giessen Raw Food study found that 82 percent of long-term raw-foodists included some cooked food in their diets.

To judge whether the energy shortage experienced by raw-foodists is biologically significant, we need to know whether raw-induced weight loss interferes with critical functions—ideally, for a population living under conditions similar to those in our evolutionary past. In the Giessen study, the more raw food that women ate, the lower their BMI and the more likely they were to have partial or total amenorrhea. Among women eating totally raw diets, about 50 percent entirely ceased to menstruate. A further proportion, about 10 percent, suffered irregular menstrual cycles that left them unlikely to conceive. These figures are far higher than for women eating cooked food. Healthy women on cooked diets rarely fail to menstruate, whether or not they are vegetarian. But ovarian function predictably declines in women suffering from extreme energy depletion, such as marathoners and anorexics.

Raw-foodist men sometimes also report an impact on their sexual functions. In How to Do the Raw Food Diet with Joy for Awesome Health and Success, the author, Christopher Westra, wrote: “In my own experience, starting on living foods brought about a change in sexuality that was dramatic and completely unexpected. In just a few weeks, the number of times per day I thought about sex decreased tremendously.” Westra believed that seminal emissions are designed to remove toxins from the body. After a few weeks of a raw diet, he said, the intake of toxins had fallen to the point where ejaculation was no longer necessary. In a similar way some raw-foodists regard menstruation as a mechanism for removing toxins and therefore regard its cessation as a sign of the health of their diets. Perhaps it is unnecessary to note that medical science finds no support for the idea that toxins are removed by seminal emissions or menstruation.

Reduced reproductive function means that in our evolutionary past, raw-foodism would have been much less successful than the habit of eating cooked food. A rate of infertility greater than 50 percent, such as was found in the Giessen Raw Food study, would be devastating in a natural population of foragers. And since the Giessen study was of urban people enjoying a life of middle-class ease, such dramatic effects on reproduction are mild compared to what would have happened if these German raw-foodists had been searching for food in the wild.

Most raw-foodists prepare their food elaborately in ways that increase their energy value. Techniques include mild heating, blending, grinding, and sprouting. Any system of reducing the size of food particles, such as grinding and crushing, leads to predictable increases in energy gain. The German raw-foodists also had the advantage of eating oils produced commercially by industrial processing. Koebnick’s team found that about 30 percent of the subjects’ calories came from these lipids, a valuable energy source that would not have been available to hunter-gatherers. Yet even with all these helpful conditions, at least half the German women eating raw foods obtained so little energy from their diet, they were physiologically unable to have babies.

The Giessen subjects had further advantages. There is no indication that they engaged in much exercise, unlike women in foraging populations. Anthropologist Elizabeth Marshall Thomas describes bushman women in Africa’s Kalahari Desert returning to camp at the end of their ordinary long day thoroughly exhausted, because for much of the day they have been squatting and digging and walking, and hefting large loads of food, wood, and children. Even in populations that cook, these natural activity levels are high enough to interfere with reproductive function. If we imagine the lives of our German raw-foodists made more difficult by a daily regime of foraging for food in the wild, their rate of energy expenditure would surely be substantially increased. As a result, many more than 50 percent of the women would be incapable of pregnancy.

Then add that the subjects of the Giessen Raw Food study obtained their diets from supermarkets. Their foods were the typical products of modern farming—fruits, seeds, and vegetables all selected to be as delicious as possible. “Delicious” means high energy, because what people like are foods with low levels of indigestible fiber and high levels of soluble carbohydrates, such as sugars. Agricultural improvements have rendered fruits in a supermarket, such as apples, bananas, and strawberries, far higher in quality than their wild ancestors. In our laboratory at Harvard, nutritional biochemist NancyLou Conklin-Brittain finds that carrots contain as much sugar as the average wild fruit eaten by a chimpanzee in Kibale National Park in Uganda. But even carrots are better quality than a typical wild tropical fruit, because they have less fiber and fewer toxic compounds. If the German raw-foodists had been eating wild foods, their energy balance and reproductive performance would have been much lower than found by Koebnick’s team.

Supermarkets offer a year-round supply of the choicest foods, so the German raw-foodists had no seasonal shortages. Foragers, by contrast, cannot escape the tough times when sweet fruits, honey, or game meat become no more than occasional luxuries rather than daily pleasures. Even subsistence foods can then be hard to find. Anthropologist George Silberbauer reported that among the G/wi bushmen of the Central Kalahari, early summer was a time when all lost weight and everyone complained of hunger and thirst. In deserts like the Kalahari the result can be difficult indeed, but periodic shortages of energy like this are routine in all living hunter-gatherers, just as they are in rain-forest chimpanzees. Judging from studies of bones and teeth, which show in their fine structure the marks of nutritional stress, energy shortages were also universal in archaeological populations. Until the development of agriculture, it was the human fate to suffer regular periods of hunger—typically, it seems, for several weeks a year—even though they ate their food cooked.





Raw-foodism seems to be an increasingly popular habit, but if raw diets are so challenging, why do people like them? Raw-foodists are very enthusiastic about the health benefits, as described in books with such titles as Self Healing Power! How to Tap Into the Great Power Within You. They report a sense of well-being, better physical functioning, less bodily pain, more vitality, and improved emotional and social performance. There are claims of reductions in rheumatoid arthritis and fibromyalgia symptoms, less dental erosion, and improved antioxidant intake. Mostly such assertions have not been scientifically tested, but researchers have found improved serum cholesterol and triglyceride values.

Raw-foodists offer philosophical reasons too. “Natural nutrition is raw,” asserted Stephen Arlin, Fouad Dini, and David Wolfe in Nature’s First Law, a popular guide to raw-foodism. “It always has been. It always will be. . . . Cooked food is poison.” Many follow the pseudoscientific ideas of vegetarian Edward Howell, who theorized in a 1946 book that plants contain “living” or “active” enzymes, which, if eaten raw, operate for our benefit inside our bodies. His followers therefore prepare their foods below a certain temperature, normally about 45-48°C (113-118°F), above which the “life force” of the enzymes is supposedly destroyed. To scientists the idea that food enzymes contribute to digestion or cellular function in our bodies is nonsense because these molecules are themselves digested in our stomachs and small intestines. The “living enzyme” idea also ignores that even if food enzymes survived our digestive systems, their own specific metabolic functions are too specialized to allow them to do anything useful in our bodies. But while the idea of a “life force” in “living foods” is not accepted by physiologists, it persuades many raw-foodists to persist in their diet. By permitting some use of low heat, Howell’s philosophy also enables the “raw” food to be somewhat more palatable, easier to prepare, and more digestible than a truly unheated food would be.

Other raw-foodists are guided by moral principles. In 1813 the poet Percy Bysshe Shelley argued that meat eating was an appalling habit responsible for many of society’s ills and was obviously unnatural, given that humans lack claws, have blunt teeth, and dislike raw meat. Since he concluded that the invention of cooking was responsible for meat eating, and hence for such problems as “tyranny, superstition, commerce, and inequality,” he decided that humans were better off without cooking.

Instinctotherapists, a minority group among raw-foodists, believe that because we are closely related to apes we should model our eating behavior on theirs. In 2003 I had lunch with Roman Devivo and Antje Spors, whose book Genefit Nutrition argues that cooked food provides an unhealthy diet to which we are not adapted. They were lean and healthy. They were clear about their preference, which was to eat all their food not merely raw but without any preparation at all. They politely declined a salad because its ingredients had been chopped and mixed. The natural way, they explained, is to do what chimpanzees do. Just as those apes find only one kind of fruit when eating in a given tree, so we should eat only one kind of food in any meal.

To illustrate their habit, Devivo, Spors, and a friend had brought a basket containing a selection of organic foods. They sniffed at several fruits, one at a time, to allow their bodies to decide what would suit them best (“by instinct,” they said). One chose apples; another chose a pineapple. Each ate only his or her first choice. The third decided on a protein-rich food. He had brought frozen buffalo steaks and pieces of buffalo femur. Today was a marrow day. The femur chunks were the size of golf balls. Inside each was a cold pink mush that looked like strawberry ice cream. He cleaned out several pieces of bone with a teaspoon.

However strange it may be to think that we should eat to conserve living enzymes, or to reduce violence, or in the manner of apes, such concepts are helpful to raw-foodists because they bolster a strong commitment to principle. Eating raw intrudes into social life, demands a lot of time in the kitchen, and requires a strong will to resist the thought of cooked food. It can create personal problems, such as annoyingly frequent urination, and for meat eaters it increases the risk of eating toxins or pathogens that would be destroyed by cooking. There are other health risks too. Recent studies indicate that low bone mass in the backs and hips of raw-foodists was caused by their raw diet. Raw diets are also associated with low levels of vitamin B12, low levels of HDL cholesterol (the “good” cholesterol), and elevated levels of homocysteine (a suspected risk factor for cardiovascular disease).

In theory the precarious energy budgets experienced by the Giessen study subjects could be misleading. Maybe modern raw-foodists are so far removed from nutritional wisdom that they are just not choosing the right combination of foods. What about reliance on raw food in nonindustrialized cultures? This has often been reported. At the end of the nineteenth century, anthropologist William McGee, president of the National Geographical Society and cofounder of the American Anthropological Association, claimed that the Seri hunter-gatherers of northwestern Mexico ate meat and carrion largely raw. Four thousand years ago Sumerians in the Third Dynasty of Ur said that the bedouin of the western desert ate their food raw. As late as 2007, pygmies in Uganda’s Ruwenzori Mountains were reported in a national Ugandan newspaper to be living off raw food. Writers from Plutarch to colonial sailors of the nineteenth century made similar claims, but all have proved illusory, often colored by a racist tinge. “Only savages can be satisfied with the pure products of nature, eaten without seasoning and as nature provides them,” sniffed the entry in an eighteenth-century encyclopedia. In 1870 anthropologist Edward Tylor examined all such accounts and found no evidence of any being real. He concluded that cooking was practiced by every known human society. Similarly, all around the world are societies that tell of their ancestors having lived without fire. When anthropologist James Frazer examined reports of prehistoric firelessness, he found them equally full of fantasy, such as fire being brought by a cockatoo or being tamed after it was discovered in a woman’s genitals. The control of fire and the practice of cooking are human universals.





Still, in theory, societies could exist where cooked food is only a small part of the diet. The quirky nutritionist Howell thought so. In the 1940s he stated as part of his theory of the benefits of raw foods that the traditional Inuit (or Eskimo) diet was dominated by raw foods. His claim about the Inuit eating most of their food raw has been an important mainstay of the raw-foodist movement ever since.

But again it has proved exaggerated. The most detailed studies of un-Westernized Inuit diets were by Vilhjalmur Stefansson during a series of expeditions to the Copper Inuit beginning in 1906. Their diet was virtually plant-free, dominated by seal and caribou meat, supplemented by large salmonlike fish and occasional whale meat. Stefansson found that cooking was the nightly norm.

Every wife was expected to have a substantial meal ready for her husband when he got back from the hunt. In winter a husband came home at a predictably early time and would find the smell of boiling seal meat and steaming broth as soon as he entered the igloo. The long days of summer made the time of a husband’s return home less predictable, so wives often went to bed before he came back. Anthropologist Diamond Jenness accompanied Stefansson, and described what happened if a wife failed to leave cooked meat for her husband: “Woe betide the wife who keeps him waiting after a day spent in fishing or hunting! . . . Her husband will probably beat her, or stamp her in the snow, and may even end by throwing her household goods after her and bidding her begone forever from his house.”

Arctic cooking was difficult because of the shortage of fuel. In summer women made small twig fires, whereas in winter they cooked over burning seal oil or blubber in stone pots. After the snow had melted to water, the process of boiling meat took a further hour. Despite the difficulties, the meat was well cooked. “I have never seen Eskimo eat partly cooked meat so bloody as many steaks I have seen devoured in cities—when they cook, they usually cook well,” Stefansson wrote in 1910.

The slow cooking and shortage of fuel meant it was hard for men to cook when they were out on the hunt, so during the day they would sometimes eat fresh fish raw, either the flesh or in the case of large fish, just the intestines. Hunters also made caches of excess fish, which they could recover later for a cold meal. However, even though these foods were uncooked they were affected by being stored: fish from the cache became “high”—in other words, smelly because they were partially rotten. Most people liked the strong taste. Jenness saw “a man take a bone from rotten caribou-meat cached more than a year before, crack it open and eat the marrow with evident relish although it swarmed with maggots.”

Though many raw foods were eaten for convenience, some were taken by choice. Blubber was often preferred raw. It was soft and could be spread easily over meat like butter. Other meats eaten raw were also soft, such as seal livers and kidneys and caribou livers. Occasionally there was evidence of more exotic tastes. Stefansson’s hosts were horrified to hear of a distant group, the Puiplirmiut, who supposedly collected frozen deer droppings off the snow and ate them like berries. They said that was a truly repulsive habit, and anyway it was a waste of a good dropping. Those pellets were a fine food, they said, when boiled and used to thicken blood soup. The only vegetable food that was regularly eaten raw was the lichen eaten by caribou, which the Copper Inuit ate when the lichen was partially digested. In summer they would take it directly from the rumen and eat it while cutting up the carcass. As the cold closed in during the fall, they were more likely to allow the full stomach to freeze intact with the lichens inside. They would then cut it into slices for a frozen treat.

The Inuit probably ate more raw animal products than other societies, but, as in every culture, the main meal of the day was taken in the evening, and it was cooked. In a scene captured by anthropologist Jiro Tanaka, the !Kung of the Kalahari illustrate the typical pattern for hunter-gatherers of a light breakfast and snacks during the day, followed by an evening meal. “Finally, as the sun begins to set, each woman builds a large cooking fire near her hut and commences cooking. . . . The hunters return to camp in the semidarkness, and each family eats supper around the fire after darkness has fallen. . . . Only in the evening does the whole family gather to eat a solid meal, and indeed people consume the greater part of their daily food then. The only exception is after a big kill, when a large quantity of meat has been brought back to camp: then people eat any number of times during the day, keeping their stomachs full to bursting, until all the meat is gone.”

The Inuit consumed raw food mostly as a snack out of camp, as is typical of human foragers. In 1987, anthropologist Jennifer Isaacs described which foods Australian aborigines ate raw or cooked. Although foragers sometimes lit fires in the bush to cook quick meals such as mud crabs (a particular favorite), the majority of animal items were brought back to camp to be cooked. A few items, such as a species of mangrove worm, were always eaten raw, and these were not brought back to camp. Isaacs reported three types of food that were eaten sometimes raw and sometimes cooked—turtle eggs, oysters, and witchetty grubs—and in each case they were eaten raw by people foraging far from camp but were cooked if eaten in camp. Most fruits are preferred raw and are eaten in the bush, whereas roots, seeds, and nuts are brought back to camp to be cooked. Everywhere we look, home cooking is the norm. For most foods, eating raw appears to be a poor alternative demanded by circumstance.





What happens to people who are forced to eat raw diets in wild habitats, such as lost explorers, castaways, or isolated adventurers simply trying to survive despite losing their ability to cook? This category of people offers a third test of how well humans can utilize raw food. You might think that when humans are forced to eat raw, they would grumble at the loss of flavor but nevertheless be fine. However, I have not been able to find any reports of people living long term on raw wild food.

The longest case that I found of survival on raw animal foods lasted only a few weeks. In 1972 a British sailor, Dougal Robertson, and his family lost their boat to killer whales in the Pacific and were confined to a dinghy for thirty-eight days. They began with a few cookies, oranges, and glucose candies. By the seventh day they were forced to eat what they could catch on a line. They spent their last thirty-one days at sea mostly eating raw turtle meat, turtle eggs, and fish. There were occasional treats, such as chewing the liver and heart of a shark, but their staple was a “soup” of dried turtle in a mix of rainwater, meat juice, and eggs.

They caught more food than they could eat, and they survived in good cheer. Indeed, their diet suited them so well that by the end of their ordeal, Robertson reported that their physical condition was actually better than when they had begun their journey. Sores that had been present when their boat was sunk had healed, and their bodies were functioning effectively. The only problem was that nine-year-old Neil, despite being given extra portions of bone marrow, was disturbingly thin.

And all were hungry. They “enjoyed the flavour of the raw food as only starving people can.” Their fantasies focused on cooked food. By the twenty-fourth day, Robertson recorded, “our daydreams had switched from ice cream and fruit to hot stews, porridge, steak and kidney puddings, hotpots and casseroles. The dishes steamed fragrantly in our imaginations and as we described their smallest details to each other we almost tasted the succulent gravies as we chewed our meager rations.” The Robertsons’ raw diet supported survival but it also brought a sense of starvation.

Their resourcefulness enabled them to emerge from a terrifying situation in fine condition. They may have been hungry and thinner, but they were apparently not starving to the point of danger. Their experience shows that with abundant food, people can survive well on a raw animal-based diet for at least a month. But people sometimes survive with no food at all for a month, provided they have water. The lack of any evidence for longer-term survival on raw wild food suggests that even in extremis, people need their food cooked.

The case that comes closest to long-term survival on raw wild food is that of Helena Valero. This exceptional woman was a Brazilian of European descent who reportedly survived in a remote forest for some seven months in the 1930s. She knew the jungle well because at about age twelve she had been kidnapped by Yanomamö Indians. She became a member of their tribe but her experience was very hard. One day, after her life was threatened, she escaped her captors. She took a firebrand wrapped in leaves so she could cook, but after a few days a heavy rain drenched it. Unwilling to return to Yanomamö life, she wandered alone, fireless and increasingly hungry, until she found an abandoned banana plantation. Valero was lucky because villagers had planted the trees in a dense grove. There, she said, she survived by eating raw bananas. She counted the seven months by the passage of the moon. Valero did not record her condition at the end of her exile, but she was eventually found by Yanomamö. She returned to the comforts of village life, married twice, had four children, and eventually feared for her children’s lives and escaped again at about age thirty-five. She never found happiness in Brazilian society.

Valero’s tale could not be verified, but if anyone were to survive on raw food in the wild, it makes sense that they would have the fortune to have an abundant supply of a high-calorie domesticated fruit. Bananas are often touted as nature’s most perfect food.

In more ordinary circumstances starvation is a rapid threat when eating raw in the wild. Anthropologist Allan Holmberg was at a remote mission station in Bolivia in the 1940s when a group of seven Siriono hunter-gatherers arrived from the forest. They were so hungry and emaciated that, as one of them told Holmberg, if they had not arrived when they did they might have died. This group had been part of a band that had thrived in the rain forest until they were taken to a government school. They had been so resentful of their forced removal that they had escaped with the aim of returning to their ancestral homeland. To avoid capture they had moved fast, walking even in heavy rain. Without proper cover the smoldering logs they were carrying were extinguished. After that the little group was reduced to a raw diet of wild plants until they were rescued after three weeks. They walked less than five miles per day and even though they knew the forest intimately and found raw plants to eat, they still could not obtain sufficient energy from their diets. Two of the men had bows and there was lots of game, so they might have done better but for a taboo on raw meat, which they claimed not to eat under any conditions. But even hunter-gatherers often live well with little meat for weeks on end, as long as they cook. The Siriono experience suggests that raw diets are dangerous because they do not provide enough energy.

In 1860 Robert Burke and William Wills led an ill-fated expedition from southern to northern Australia. When they ran out of food they asked the local Yandruwandha aborigines for help. The Yandruwandha were living on the abundant nardoo plant. They pounded nardoo seeds into a bitter flour, washed it, and then cooked it. The explorers liked the flour but apparently omitted the washing and cooking. The result was disaster. “I am weaker than ever,” wrote Wills, “although I have a good appetite, and relish the nardoo much, but it seems to give us no nutriment.” Burke and Wills died from poisoning, starvation, or both. However, they had a companion who survived and joined the Yandruwandha, ate lots of cooked nardoo flour, and was in excellent condition when he was rescued ten weeks later.

The cases I have listed are exceptional because it is rare for people to even attempt to survive on raw food in the wild. When Thor Heyerdahl took a primitive raft across the Pacific to test his theories about prehistoric migrations, he had a primus stove with him and one of his crew was a cook. When an airplane crashed in the Chilean Andes in 1972 and stranded twenty-seven people for seventy-one days, the survivors resorted to cannibalism and cooked the meat. When the whale ship Essex went down in the Pacific and its sailors cannibalized one another in small lifeboats, they cooked on stone-bottomed fires. Several Japanese soldiers lived alone in the jungle after World War II. One of them, Shoichi Yokoi, stayed in Guam until 1972, surviving on fruits, snails, eels, and rats. But he did not eat them raw. Life in his underground cave depended on his smoke-blackened pots, just as it did for all such holdouts.

Perhaps the most famous real-life castaway was Alexander Selkirk, the model for Robinson Crusoe. In 1704, after quarrelling with the captain of his ship and rashly demanding to be put ashore, Selkirk began more than four years alone on the island of Más a Tierra, 670 kilometers (416 miles) west of Chile in the Pacific Ocean. He had his Bible, a musket with a pound of powder, some mathematical instruments, a hatchet, a knife, and a few carpenter’s tools. He ended up very wild, dancing with his tamed goats and cats and barely recognizable as human. But when his gunpowder was nearly spent, “he got fire by rubbing two sticks of Piemento Wood together upon his knee.” He was able to cook throughout his time in isolation.





Raw-foodists, it is clear, do not fare well. They thrive only in rich modern environments where they depend on eating exceptionally high-quality foods. Animals do not have the same constraints: they flourish on wild raw foods. The suspicion prompted by the shortcomings of the Evo Diet is correct, and the implication is clear: there is something odd about us. We are not like other animals. In most circumstances, we need cooked food.
3#

CHAPTER 2



The Cook’s Body


“Domestication of fire probably reacted on man’s physical development as well as on his culture, for it would have reduced some selective pressures and increased others. As cooked food replaced a diet consisting entirely of raw meat and fresh vegetable matter, the whole pattern of mastication, digestion, and nutrition was altered.”

—KENNETH OAKLEY, Social Life of Early Man





Although humans fare poorly on raw diets nowadays, at some time our ancestors must have utilized bush fruits, fresh greens, raw meat, and other natural products as efficiently as apes do. What can account for the change? Why, given all the obvious advantages of being able to extract large amounts of energy from raw food, have humans lost this ancient ability?

In theory an evolutionary mishap might be responsible for this failure of our biology: the genetic coding for a well-adapted digestive system could have been lost by chance. But a failure of evolutionary adaptation is an unlikely explanation for something as widespread and labor-intensive as cooking. Natural selection mostly generates exquisitely successful designs, particularly for features that are as important and in such regular use as our intestinal systems. We can expect to find a compensatory benefit that has been made possible by our inability to utilize raw food effectively.

Evolutionary trade-offs are common. Compared to chimpanzees, we climb badly but we walk well. Our awkwardness in trees is due partly to our having long legs and flat feet, but those same legs and feet enable us to walk more efficiently than other apes. In a similar way, our limited effectiveness at digesting raw food is due to our having relatively small digestive systems compared to those of our cousin apes. But the reduced size of our digestive systems, it seems, enables us to process cooked food with exceptional proficiency.

We can think of cooked food offering two kinds of advantage, depending on whether species have adapted to a cooked diet. Spontaneous benefits are experienced by almost any species, regardless of its evolutionary history, because cooked food is easier to digest than raw food. Domestic animals such as calves, lambs, and piglets grow faster when their food is cooked, and cows produce more fat in their milk and more milk per day when eating cooked rather than raw seeds. A similar effect appears in fish farms. Salmon grow better on a diet of cooked rather than raw fishmeal. No wonder farmers like to give cooked mash or swill to their livestock. Cooked food promotes efficient growth.

The spontaneous benefits of cooked food explain why domesticated pets easily become fat: their food is cooked, such as the commercially produced kibbles, pellets, and nuggets given to dogs and cats. Owners of obese pets who recognize this connection and see cooked food as a health threat sometimes choose to feed raw food to their beloved ones to help them lose weight. Biologically Appropriate Raw Food, or BARF, is a special diet advertised as being beneficial for dogs for the same reason that raw-foodists advocate raw diets for humans: it is natural. “Every living animal on earth requires a biologically appropriate diet. And if you think about it, not one animal on earth is adapted by evolution to eat a cooked food diet. This means the BARF diet is exactly what we should be feeding our pets.” The effect of this diet is reminiscent of raw-foodists’ experience: “You can always tell a raw-food dog; they look better, have more energy, are thin and vibrant,” says an owner of a golden retriever whose coat started glowing within a week of eating raw food exclusively.

Even insects appear to get the spontaneous benefits of cooked food. Researchers rearing agricultural pests in large numbers to find out how to control them give each insect species its own particular recipe of cooked food. Larvae of the diamondback moth thrive on a toasted mix of wheat germ, casein, bean meal, and cabbage flour. Black vine weevils do best on thoroughly boiled and blended lima beans. Whether domestic or wild, mammal or insect, useful or pest, animals adapted to raw diets tend to fare better on cooked food.





In humans, because we have adapted to cooked food, its spontaneous advantages are complemented by evolutionary benefits. The evolutionary benefits stem from the fact that digestion is a costly process that can account for a high proportion of an individual’s energy budget—often as much as locomotion does. After our ancestors started eating cooked food every day, natural selection favored those with small guts, because they were able to digest their food well, but at a lower cost than before. The result was increased energetic efficiency.

Evolutionary benefits of adapting to cooked food are evident from comparing human digestive systems with those of chimpanzees and other apes. The main differences all involve humans having relatively small features. We have small mouths, weak jaws, small teeth, small stomachs, small colons, and small guts overall. In the past, the unusual size of these body parts has mostly been attributed to the evolutionary effects of our eating meat, but the design of the human digestive system is better explained as an adaptation to eating cooked food than it is to eating raw meat.

Mick Jagger’s biggest yawn is nothing compared to a chimpanzee’s. Given that the mouth is the entry to the gut, humans have an astonishingly tiny opening for such a large species. All great apes have a prominent snout and a wide grin: chimpanzees can open their mouths twice as far as humans, as they regularly do when eating. If a playful chimpanzee ever kisses you, you will never forget this point. To find a primate with as relatively small an aperture as that of humans, you have to go to a diminutive species, such as a squirrel monkey, weighing less than 1.4 kilograms (3 pounds). In addition to having a small gape, our mouths have a relatively small volume—about the same size as chimpanzee mouths, even though we weigh some 50 percent more than they do. Zoologists often try to capture the essence of our species with such phrases as the naked, bipedal, or big-brained ape. They could equally well call us the small-mouthed ape.

The difference in mouth size is even more obvious when we take the lips into account. The amount of food a chimpanzee can hold in its mouth far exceeds what humans can do because, in addition to their wide gape and big mouths, chimpanzees have enormous and very muscular lips. When eating juicy foods like fruits or meat, chimpanzees use their lips to hold a large wad of food in the outer part of their mouths and squeeze it hard against their teeth, which they may do repeatedly for many minutes before swallowing. The strong lips are probably an adaptation for eating fruits, because fruit bats have similarly large and muscular lips that they use in the same way to squeeze fruit wads against their teeth. Humans have relatively tiny lips, appropriate for a small amount of food in the mouth at one time.

Our second digestive specialization is having weaker jaws. You can feel for yourself that our chewing muscles, the temporalis and masseter, are small. In nonhuman apes these muscles often reach all the way from the jaw to the top of the skull, where they sometimes attach to a ridge of bone called the sagittal crest, whose only function is to accommodate the jaw muscles. In humans, by contrast, our jaw muscles normally reach barely halfway up the side of our heads. If you clench and unclench your teeth and feel the side of your head, you have a good chance of being able to prove to yourself that you are not a gorilla: your temporalis muscle likely stops near the top of your ear. We also have diminutive muscle fibers in our jaws, one-eighth the size of those in macaques. The cause of our weak jaws is a human-specific mutation in a gene responsible for producing the muscle protein myosin. Sometime around two and a half million years ago this gene, called MYH16, is thought to have spread throughout our ancestors and left our lineage with muscles that have subsequently been uniquely weak. Our small, weak jaw muscles are not adapted for chewing tough raw food, but they work well for soft, cooked food.

Human chewing teeth, or molars, also are small—the smallest of any primate species in relation to body size. Again, the predictable physical changes in food that are associated with cooking account readily for our weak chewing and small teeth. Even without genetic evolution, animals reared experimentally on soft diets develop smaller jaws and teeth. The reduction in tooth size produces a well-adapted system: physical anthropologist Peter Lucas has calculated that the size of a tooth needed to make a crack in a cooked potato is 56 percent to 82 percent smaller than needed for a raw potato.

Continuing farther into the body, our stomachs again are comparatively small. In humans the surface area of the stomach is less than one-third the size expected for a typical mammal of our body weight, and smaller than in 97 percent of other primates. The high caloric density of cooked food suggests that our stomachs can afford to be small. Great apes eat perhaps twice as much by weight per day as we do because their foods are packed with indigestible fiber (around 30 percent by weight, compared to 5 percent to 10 percent or less in human diets). Thanks to the high caloric density of cooked food, we have modest needs that are adequately served by our small stomachs.

Below the stomach, the human small intestine is only a little smaller than expected from the size of our bodies, reflecting that this organ is the main site of digestion and absorption, and humans have the same basal metabolic rate as other primates in relation to body weight. But the large intestine, or colon, is less than 60 percent of the mass that would be expected for a primate of our body weight. The colon is where our intestinal flora ferment plant fiber, producing fatty acids that are absorbed into the body and used for energy. That the colon is relatively small in humans means we cannot retain as much fiber as the great apes can and therefore cannot utilize plant fiber as effectively for food. But that matters little. The high caloric density of cooked food means that normally we do not need the large fermenting potential that apes rely on.

Finally, the volume of the entire human gut, comprising stomach, small intestine, and large intestine, is also relatively small, less than in any other primate measured so far. The weight of our guts is estimated at about 60 percent of what is expected for a primate of our size: the human digestive system as a whole is much smaller than would be predicted on the basis of size relations in primates.

Our small mouths, teeth, and guts fit well with the softness, high caloric density, low fiber content, and high digestibility of cooked food. The reduction increases efficiency and saves us from wasting unnecessary metabolic costs on features whose only purpose would be to allow us to digest large amounts of high-fiber food. Mouths and teeth do not need to be large to chew soft, high-density food, and a reduction in the size of jaw muscles may help us produce the low forces appropriate to eating a cooked diet. The smaller scale may reduce tooth damage and subsequent disease. In the case of intestines, physical anthropologists Leslie Aiello and Peter Wheeler reported that compared to that of great apes, the reduction in human gut size saves humans at least 10 percent of daily energy expenditure: the more gut tissue in the body, the more energy must be spent on its metabolism. Thanks to cooking, very high-fiber food of a type eaten by great apes is no longer a useful part of our diet. The suite of changes in the human digestive system makes sense.

Could the tight fit between the design of our digestive systems and the nature of cooked food be deceptive? The character Pangloss in Voltaire’s Candide claimed that our noses were designed to carry spectacles, based on the fact that our noses support spectacles efficiently. But actually spectacles have been designed to fit on noses, rather than the other way around. Following Pangloss’s reasoning, in theory cooked food might similarly be well suited for a human gut that had been adapted for another kind of diet.

Meat is the obvious possibility. The “Man-the-Hunter” hypothesis assumes our ancestors were originally plant eaters, with the last species to eat relatively little meat being the australopithecine that gave rise to habilines more than two million years ago. Much of the australopithecines’ plant food would have had the low caloric density and high fiber concentration seen in great-ape diets. We should therefore expect those ancient apes to have had large digestive systems, as chimpanzees and gorillas do today. In support of this idea, fossils show that australopithecines had broad hips and a rib cage that was flared outward toward the waist. Both features indicate the presence of capacious guts, held by the rib cage and supported by the pelvis. According to the meat-eating scenario, as increased amounts of meat were eaten by habilines and their descendants, modifications must have evolved in the mouth and digestive system.

Physical anthropologist Peter Ungar reported in 2004 that the molars (chewing teeth) of very early humans were somewhat sharper than those of their australopithecine ancestors. They might therefore have been adapted to eating tough foods, including raw meat. Carnivores such as dogs, and probably wolves and hyenas, also tend to have small guts compared to those of great apes, including small colons that are efficient for the high caloric density and low fiber density of a meat diet. But despite these hints of humans being designed for meat eating, our mouths, teeth, and jaws are clearly not well adapted to eating meat unless it has been cooked. Raw wild meat from game animals is tough, which is partly why cooking is so important. Advocates of the meat-eating hypothesis have themselves noted that humans differ from carnivores by our having small mouths, weak jaws, and small teeth that cannot easily shear flesh.

The way food moves through our bodies compounds the problem. In carnivores, meat spends a long time in the stomach, allowing intense muscular contractions of the stomach walls to reduce raw meat to small particles that can be digested rapidly. Dogs tend to keep food in the stomach for two to four hours, and cats for five to six hours, before passing the food quickly through the small intestine. By contrast, humans resemble other primates in keeping food in our stomachs for a short time, generally one to two hours, and then passing it slowly through the small intestine. Lacking the carnivore system of retaining food for many hours in our stomachs, we humans are inefficient at processing chunks of raw meat.

If our mouths, teeth, jaws, and stomachs all indicate that humans are not adapted to eating lumps of raw meat, they might in theory be designed to digest meat that has been processed without being cooked. Raw meat might have been usefully pounded to make it easily chewed. It might have been allowed to rot, in parts of the world that were sufficiently cold for bacterial infection not to be a major threat. Or it might have been dried. But these ideas cannot solve the problem of how plant foods were eaten.

The problem is that tropical hunter-gatherers have to eat at least half of their diet in the form of plants, and the kinds of plant foods our hunter-gatherer ancestors would have relied on are not easily digested raw. So if the meat-eating hypothesis is advanced to explain why Homo erectus had small teeth and guts, it faces a difficulty with the plant component of the diet. It cannot explain how a human with a diminished capacity for digestion could have digested plant foods efficiently.

Plants are a vital food because humans need large amounts of either carbohydrates (from plant foods) or fat (found in a few animal foods). Without carbohydrates or fat, people depend on protein for their energy, and excessive protein induces a form of poisoning. Symptoms of protein poisoning include toxic levels of ammonia in the blood, damage to the liver and kidneys, dehydration, loss of appetite, and ultimately death. The grim result was described by Vilhjalmur Stefansson based on his experience in the Arctic in a lean season when fat was so scarce (and plant foods were absent, as usual) that protein became the predominant macronutrient in the diet. “If you are transferred suddenly from a diet normal in fat to one consisting wholly of . . . [lean meat] you eat bigger and bigger meals for the first few days until at the end of about a week you are eating in pounds three or four times as much as you were at the beginning of the week. By that time you are showing both signs of starvation and protein poisoning. You eat numerous meals; you feel hungry at the end of each; you are in discomfort through distension of the stomach with much food and you begin to feel a vague restlessness. Diarrhoea will start in from a week to 10 days and will not be relieved unless you secure fat. Death will result after several weeks.”

Because the maximum safe level of protein intake for humans is around 50 percent of total calories, the rest must come from fat, such as blubber, or carbohydrates, such as in fruits and roots. Fat is an excellent source of calories in high-latitude sites like the Arctic or Tierra del Fuego, where sea mammals have evolved thick layers of blubber to protect themselves from the cold. However, fat levels are much lower in the meat of tropical mammals, averaging around 4 percent, and high-fat tissues like marrow and brain are always in limited supply. The critical extra calories for our equatorial ancestors therefore must have come from plants, which are vital for all tropical hunter-gatherers. During periods of food shortage, such as the annual dry seasons, fat levels in meat would have been particularly low, down to 1 percent to 2 percent. A carbohydrate supply from plant foods would then have been especially vital.
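The arithmetic behind the ~50 percent protein ceiling can be made concrete. The following sketch is illustrative only (the gram figures for lean game meat are assumptions, not from the book); it uses the standard Atwater factors of roughly 4 kcal/g for protein and carbohydrate and 9 kcal/g for fat to show why dry-season meat at 1-2 percent fat pushes protein far past the safe share of calories, and why a carbohydrate staple such as cooked roots pulls it back down:

```python
# Atwater energy factors (approximate): protein and carbohydrate
# yield ~4 kcal per gram, fat ~9 kcal per gram.
KCAL_PER_G = {"protein": 4.0, "fat": 9.0, "carb": 4.0}

def protein_calorie_fraction(protein_g, fat_g, carb_g=0.0):
    """Fraction of total calories supplied by protein."""
    kcal_protein = protein_g * KCAL_PER_G["protein"]
    kcal_fat = fat_g * KCAL_PER_G["fat"]
    kcal_carb = carb_g * KCAL_PER_G["carb"]
    return kcal_protein / (kcal_protein + kcal_fat + kcal_carb)

# Hypothetical 100 g of lean dry-season game meat: ~21 g protein,
# ~2 g fat (the 1-2 percent range mentioned above), no carbohydrate.
lean = protein_calorie_fraction(protein_g=21, fat_g=2)
print(f"protein share, lean meat alone: {lean:.0%}")      # ~82%, far above the ~50% ceiling

# Adding an assumed ~30 g of starch from cooked roots to the same meat
# brings the protein share safely under the ceiling.
mixed = protein_calorie_fraction(protein_g=21, fat_g=2, carb_g=30)
print(f"protein share, meat plus roots: {mixed:.0%}")     # ~38%
```

On these assumed figures, meat alone delivers about four-fifths of its calories as protein, which is exactly the situation Stefansson describes; only fat or plant carbohydrate can dilute it below the danger level.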

But if early humans had the same small guts as we do, they could not have obtained their plant carbohydrates without cooking. Recall the poor metabolic performance of the urban raw-foodists in the Giessen study. Those people ate very high-quality cultivated food processed with the aid of sprouting, blending, and even low-temperature ovens, yet still obtained so little energy that reproductive function was seriously impaired. If our early human ancestors indeed ate their plant food raw, they would have needed to find ways of processing it that were superior to our modern technology. But it is not credible that Stone Age people developed non-thermal methods of food preparation more effective than using an electric blender.

Hunter-gatherers living on raw food might sometimes have found plant foods of an exceptionally high caloric density, such as avocados, olives, or walnuts. But no modern habitats produce such foods in abundance all year. Perhaps a few lost places would have had highly productive natural orchards until they were replaced by agriculture, such as the fertile valleys of the Middle East. But occasional productive areas would not explain the wide geographical range of human ancestors across Africa, Europe, and Asia by 1.8 million years ago. Furthermore, seasonal scarcities occur in every habitat and would have forced people to use foods of lower caloric density, such as roots. The notion of a permanently superproductive habitat is unrealistic. People with an anatomy like ours today could not have flourished on raw food in the Pleistocene epoch.





Beyond reducing the size of teeth and guts, the adoption of cooking must have had numerous effects on our digestive system because it changed the chemistry of our food. Cooking would have created some toxins, reduced others, and probably favored adjustments to our digestive enzymes. Very little is known about how our detoxification system and enzyme chemistry differ from those of great apes, but studies should eventually provide further tests of the hypothesis that human bodies are adapted to eating cooked foods.

Take, for example, Maillard compounds, such as heterocyclic amines and acrylamide. These complex molecules are formed from a process that begins with the union of sugars and amino acids, particularly lysine. Maillard compounds occur naturally in our bodies and increase in frequency with age. They occur at low concentration in natural foods but under the influence of heat their concentration becomes much higher than what is found in nature, whether in smoke (from fires or cigarettes) or cooked items. Their presence is easily recognized in the brown colors found in pork crackling or bread crust. Maillard compounds cause mutations in bacteria and are suspected of leading to some human cancers. They can also induce a chronic state of inflammation, a process that raw-foodists invoke to explain why they feel better on raw diets. The cooking hypothesis suggests that our long evolutionary history of exposure to Maillard compounds has led humans to be more resistant to their damaging effects than other mammals are. It is an important question because many processed foods contain Maillard compounds that are known to cause cancer in other animals. Acrylamide is an example. In 2002 acrylamide was discovered to occur widely in commercially produced potato products, such as potato chips. If it is as carcinogenic to humans as it is to other animals, it is dangerous. If not, it may provide evidence of human adaptation to Maillard compounds, and hence of a long exposure to heated foods.

Evolutionary adaptation to cooking might likewise explain why humans seem less prepared to tolerate toxins than do other apes. In my experience of sampling many wild foods eaten by primates, items eaten by chimpanzees in the wild taste better than foods eaten by monkeys. Even so, some of the fruits, seeds, and leaves that chimpanzees select taste so foul that I can barely swallow them. The tastes are strong and rich, excellent indicators of the presence of non-nutritional compounds, many of which are likely to be toxic to humans—but presumably much less so to chimpanzees.

Consider the plum-size fruit of Warburgia ugandensis, a tree famous for its medicinal bark. Warburgia fruits contain a spicy compound reminiscent of a mustard oil. The hot taste renders even a single fruit impossibly unpleasant for humans to ingest. But chimpanzees can eat a pile of these fruits and then look eagerly for more.

Many other fruits in the chimpanzee diet are almost equally unpleasant to the human palate. Astringency, the drying sensation produced by tannins and a few other compounds, is common in fruits eaten by chimpanzees. Tannins bind to proteins and cause them to precipitate. Our mouths are normally lubricated by mucoproteins in our saliva, but because a high density of tannins precipitates those proteins, it leaves our tongues and mouths dry: hence the “furry” sensation in our mouths after eating an unripe apple or drinking a tannin-rich wine. One has the same experience when tasting chimpanzee fruits such as Mimusops bagshawei or the widespread Pseudospondias microcarpa. Though chimpanzees can eat more than 1 kilogram (2.2 pounds) of such fruits during an hour or more of continuous chewing, we cannot. Some other chimpanzee foods taste bitter to us, such as certain figs. Still other fruits elicit in us unusual sensations, such as the fruits of Monodora myristica, whose sharp and lemony taste is followed by a numbing sensation at the tip of the tongue like that caused by novocaine. Of the scores of chimpanzee foods I have tasted, I could imagine filling my belly with only a very few species, such as a wild raspberry—but alas, one rarely finds more than a handful of these delicious fruits at a time. The shifts in food preference between chimpanzees and humans suggest that our species has a reduced physiological tolerance for foods high in toxins or tannins. Since cooking predictably destroys many toxins, we may have evolved a relatively sensitive palate.

By contrast, if we were adapted to a raw-meat diet we would expect to see evidence of resistance to the toxins produced by bacteria that live on meat. No such evidence is known. Even when we cook our meat we are vulnerable to bacterial infections. The U.S. Centers for Disease Control and Prevention state that at least forty thousand cases of food poisoning by Salmonella alone are reported annually in the United States, and as many as one million cases may go unreported. The estimated total number of cases due to the top twenty harmful bacteria, including Staphylococcus, Clostridium, Campylobacter, Listeria, Vibrio, Bacillus, and Escherichia coli (E. coli), is in the tens of millions per year. The best prevention is to cook meat, fish, and eggs beyond 140°F (60°C), and not to eat foods containing unpasteurized milk or eggs. The cooking hypothesis suggests that because our ancestors have typically been able to cook their meat, humans have remained vulnerable to bacteria that live on raw meat.

Anthropology has traditionally adopted the Man-the-Hunter scenario, proposing our species as a creature that was modified from australopithecines principally by our tendency to eat more meat. Certainly meat eating has been an important factor in human evolution and nutrition, but it has had less impact on our bodies than cooked food. We fare poorly on raw diets, no cultures rely on them, and adaptations in our bodies explain why we cannot easily utilize raw foods. Even vegetarians thrive on cooked diets. We are cooks more than carnivores. No wonder raw-foodism is a good way to lose weight.
4#

CHAPTER 3



The Energy Theory of Cooking


“A man does not live on what he eats, an old proverb says, but on what he digests.”

—JEAN ANTHELME BRILLAT-SAVARIN,


The Physiology of Taste: Or Meditations on Transcendental Gastronomy





An obvious implication of animals and humans gaining more weight and reproducing better on cooked than raw diets is that when a food is heated, it must yield more energy. Yet authoritative science flatly challenges this idea. The U.S. Department of Agriculture’s National Nutrient Database for Standard Reference and Robert McCance and Elsie Widdowson’s The Composition of Foods are the principal sources for public understanding of the nutrient data for thousands of foods in the United States and the United Kingdom, respectively. They provide the data for our food labels. These references report that the effect of cooking on energy content is the same for beef, pork, chicken, duck, beetroot, potatoes, rice, oats, pastries, and dozens of other foods—on average, zero. According to these and similar compilations, cooking has important effects in changing water content and reducing the concentration of vitamins, but the density of calories supposedly remains unchanged whether food is eaten raw or is roasted, grilled, or boiled.

This conclusion is very puzzling. Obviously it conflicts with the abundant evidence that humans and animals get more energy from cooked foods. It also conflicts with various contrary conclusions from nutritional science. On the one hand, a widespread idea states that cooking is “a technological way of externalizing part of the digestive process,” a claim that seems to imply some kind of benefit such as accelerated digestion. On the other hand, cooking is sometimes claimed to have a negative effect on energy value. I recently spotted some small “fresh premium breakfast sausages” in my local supermarket. The food label gave their energy content in calories. With a curious nod to those who might want to eat raw sausages, it included values for both the raw and the cooked product. “Serving size 2 links. Raw 130 cals (60 from fat). Cooked 120 cals (60 from fat).” The claim might seem surprising, but cooking can reduce calories in various ways. Cooking can lead to the loss of nutrient-filled juices. It can generate indigestible molecules such as Maillard compounds, reducing the amount of sugar or amino acids available for digestion. It can burn carbohydrates. It can lead to changes in texture that reduce a food’s digestibility. Leading nutritionist David Jenkins judged such effects significant: “The predominant effect (of cooking) is . . . to reduce the digestibility of the proteins.”

Although different nutritionists say that cooking has no effect on the caloric content of food, or increases it, or decreases it, we can clear up this confusion. As indicated by the evidence from raw-foodists and the immediate benefits experienced by many animals eating cooked food, I believe the effects of cooking on energy gain are consistently positive. The mechanisms increasing energy gain in cooked food compared to raw food are reasonably well understood. Most important, cooking gelatinizes starch, denatures protein, and softens everything. As a result of these and other processes, cooking substantially increases the amount of energy we obtain from our food.

Starchy foods are the key ingredient of many familiar items such as breads, cakes, and pasta. They constitute almost all the world’s major plant staples. In 1988-1990, cereals such as rice and wheat made up 44 percent of the world’s food production, and together with just a few other starchy foods (roots, tubers, plantains, and dry pulses) accounted for 63 percent of the average diet. Starchy foods make up more than half of the diets of tropical hunter-gatherers today and may well have been eaten in similar quantity by our human and pre-human ancestors in the African savannas.

The most direct studies of the impact of cooking measure digestibility, meaning the proportion of a food our bodies digest and absorb. If the digestibility of a particular kind of starch is 100 percent, the starch is a perfect food: every part of it is converted into useful food molecules. If it is zero percent, the starch is completely resistant to digestion and provides no food value at all. The question is, how much does cooking affect the digestibility of starchy foods?





Our digestive system consists of two distinct processes. The first is digestion by our own bodies, which starts in the mouth, continues in the stomach, and is mostly carried out in the small intestine. The second is digestion, or strictly fermentation, by four hundred or more species of bacteria and protozoa in our large intestine, also known as the colon or large bowel. Foods that are digested by our bodies (from the mouth to the small intestine) produce calories that are wholly useful to us. But those that are digested by our intestinal flora yield only a fraction of their available energy to us—about half in the case of carbohydrates such as starch, and none at all in the case of protein.

This two-part structure means that the only way to assess how much energy a food provides is to calculate ileal digestibility, which samples the intestinal contents at the end of the small intestine, or ileum. The procedure requires scientists to conduct research on ileostomy patients, who have had their large intestine surgically removed and whose ileum now ends at a surgical opening, or stoma, fitted with an external bag. Researchers collect the ileal effluent from this bag.

Studies of ileal digestibility show that we use cooked starch very efficiently. The percentage of cooked starch that has been digested by the time it reaches the end of the ileum is at least 95 percent in oats, wheat, potatoes, plantains, bananas, cornflakes, white bread, and the typical European or American diet (a mixture of starchy foods, dairy products, and meat). A few foods have lower digestibility: starch in home-cooked kidney beans and flaked barley has an ileal digestibility of only around 84 percent.

Comparable measurements of the ileal digestibility of raw starch are much lower. Ileal digestibility is 71 percent for wheat starch, 51 percent for potatoes, and a measly 48 percent for raw starch in plantains and cooking bananas. The differences conform to test-tube studies of a wide range of items showing that raw starch is poorly digested, often only half as well as cooked starch. Starch granules eaten raw frequently pass through the ileum whole and enter the colon unchanged from when they were eaten. This “resistant starch” is vivid testimony to the deficits of a raw starch diet, explaining why we like our starch cooked and contributing to the weight loss that raw-foodists experience.
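The gap between cooked and raw starch can be put in ratio form. A minimal sketch using only the ileal digestibility figures quoted above (95 percent for cooked starch versus 71, 51, and 48 percent raw):

```python
# Ratio of cooked to raw starch digestibility, using the ileal
# digestibility figures quoted in the text.
cooked = 0.95
raw = {"wheat": 0.71, "potato": 0.51, "plantain": 0.48}

for food, r in raw.items():
    # For potato and plantain the ratio approaches two.
    print(f"{food}: cooked starch is digested {cooked / r:.1f}x as completely as raw")
```

For potatoes and plantains the ratio approaches two, matching the statement that raw starch is often digested only half as well as cooked starch.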

The principal way cooking achieves its increased digestibility is by gelatinization. Starch inside plant cells comes as dense little packages of stored glucose called granules. The granules are less than a tenth of a millimeter (four-thousandths of an inch) long, too small to be seen with the naked eye or to be damaged by the milling of flour, and they are so stable that in a dry environment they can persist for tens of thousands of years. However, as starch granules are warmed up in the presence of water they start to swell—at around 58°C (136°F) in the case of wheat starch, a well-studied and representative example. The granules swell because hydrogen bonds in the glucose polymers weaken when they are exposed to heat, and this causes the tight crystalline structure to loosen. By 90°C (194°F), still below boiling, the granules are disrupted into fragments. At this point the glucose chains are unprotected and gelatinize. Starch does not necessarily stay gelatinized after being cooked. In day-old bread the starch reverts and becomes resistant. This might help explain why we like to toast bread after it has lost its initial freshness.

Gelatinization happens whenever starch is cooked, whether in the baking of bread, the gelling of pie fillings, the production of pasta, the fabrication of starch-based snack foods, the thickening of sauces, or, we can surmise, the tossing of a wild root onto a fire. As long as water is present, even from the dampness of a fresh plant, the more that starch is cooked, the more it is gelatinized. The more starch is gelatinized, the more easily enzymes can reach it, and therefore the more completely it is digested. Thus cooked starch yields more energy than raw.

This effect is detected easily in blood measurements. Within thirty minutes of a person eating a test meal of pure glucose, the concentration of glucose in his or her blood rises dramatically, before returning to base levels in just over an hour. The effect of eating cornstarch is almost identical as long as it is cooked. But following a meal of raw cornstarch, the value of blood glucose remains persistently low, peaking at less than a third of the value for cooked cornstarch.

The effects of cooking are captured by comparing the glycemic index of cooked and raw foods. Glycemic index (GI) is a widely used nutritional measure of a food’s effect on blood sugar levels. High-GI foods, such as pure sugar, white bread, and potatoes, are good sources of energy after exercise, but for most people they are poor foods because they easily lead to excessive weight gain. In addition, the calories they offer tend to be “empty,” being low in protein, essential fatty acids, vitamins, and minerals. Low-GI foods, such as whole-grain bread, high-fiber cereals, and vegetables, reduce weight gain, improve diabetes control, and lower cholesterol. Cooking consistently increases the glycemic index of starchy foods.





Animal protein has been almost as important as starch in diets throughout our evolution, and it remains a strongly preferred food today. Yet the effects of cooking on the energy derived from eating meat have never been formally investigated, particularly the effects due to meat’s complex structure. Even the effects on proteins are a matter of debate. Until recently some scientists, such as David Jenkins, saw cooking as reducing protein digestibility. Others claim cooking protein is beneficial or has no effect. Recent studies of the digestion of eggs are starting to resolve the argument, showing for the first time that cooked protein is digested much more completely than raw protein.

In contrast to the new finding, in the past raw eggs have often been claimed to be an ideal source of calories, for reasons that sound logical. “An egg should never be cooked,” wrote raw-foodists Molly and Eugene Christian in 1904. “In its natural state it is easily dissolved and readily taken up by all the organs of digestion, but the cooked egg must be brought back to liquid form before it can be digested, which puts extra and unnecessary labor upon those over-worked organs.” This kind of argument persuaded generations of bodybuilders. The first muscleman with wide popular appeal was Steve Reeves, Hollywood’s movie Hercules of the 1950s, who famously ate raw eggs every day for breakfast. Celebrated strongmen like Charles Atlas and Arnold Schwarzenegger touted their merits too—as Mr. Universe, Schwarzenegger swallowed his eggs mixed with thick cream. Raw egg-eating by muscular athletes has even entered popular culture. In 1976 Sylvester Stallone’s boxing hero Rocky Balboa swallowed them as part of his training regimen in the movie Rocky. Thirty years later, in Rocky Balboa, he was still downing raw eggs. The quantity eaten by these legendary figures could be daunting: “Iron Guru” Vince Gironda, a popular teacher of bodybuilders, recommended up to thirty-six raw eggs a day.

Raw eggs would seem to provide an excellent food supply not only because their protein needs no chewing but also because their chemical composition is ideal. The amino acids of chicken eggs come in about forty proteins in almost exactly the proportions humans require. The match gives eggs a higher biological value—a measure of the rate at which the protein in food supports growth—than the protein of any other known food, even milk, meat, or soybeans. Raw eggs have other natural advantages. Their shells make them safer from bacterial contamination than cuts of meat. When aborigines on the beaches of Australia’s tropical north coast are thirsty, they look for turtle nests and readily drink raw egg whites. Eggs are the only unprocessed animal food that can safely be stored at room temperature for several weeks.

But even though eggs appear to be both high-quality and relatively safe when eaten raw, hunter-gatherers prefer to cook them. Unlike Australians, the Yahgan hunter-gatherers of Tierra del Fuego “would never eat half-cooked, much less raw eggs.” The Yahgan bored holes in eggshells to prevent them from bursting, buried the eggs on the edge of the fire, and turned them until they were quite hard inside. When not drinking eggs to slake their thirst, Australian aborigines would take similar pains, throwing emu eggs in the air to scramble them while still intact. They would then put them into hot sand or ashes and turn them regularly to cook them evenly, taking about twenty minutes. Such care suggests that the hunter-gatherers knew better than the musclemen.





In the late 1990s a Belgian team of gastroenterologists tested the effects of cooking for the first time, using a new research tool that allowed the investigators to follow the fate of egg proteins after they had been swallowed. The researchers fed hens a diet rich in stable isotopes of carbon, nitrogen, and hydrogen. The labeled atoms found their way into the eggs, allowing the experimenters to monitor the fate of protein molecules when the eggs were eaten. To determine how much of an egg meal was digested and absorbed in the body, they adopted the same method that had been used for studies of starch digestibility: they collected the food remains from the end of people’s small intestine, the ileum. Any protein that was undigested by the time it reached the ileum was metabolically useless to the person who ate it, because in the large intestine bacteria and protozoa digest the food proteins entirely for their own benefit.

At first the experimenters worked only with ileostomy patients, but later they were able to check their results with healthy subjects as well. The ileostomy patients and healthy volunteers each ate about four raw or cooked eggs, containing a total of 25 grams (0.9 ounces) of protein. Results were similar for the two groups. When the eggs were cooked, the proportion of protein digested averaged 91 percent to 94 percent. This high figure was much as expected given that egg protein is known to be an excellent food. However, in the ileostomy patients, digestibility of raw eggs was measured at a meager 51 percent. It was a little higher, 65 percent, in the healthy volunteers whose protein digestion was estimated by the appearance of stable isotopes in the breath. The results showed that 35 percent to 49 percent of the ingested protein was leaving the small intestine undigested. Cooking increased the protein value of eggs by around 40 percent.
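The “around 40 percent” figure follows directly from the digestibility values just given. A quick check, using the lower cooked value (91 percent) and the healthy-volunteer raw value (65 percent):

```python
# Sanity check of the "around 40 percent" increase in protein value,
# using the percentages from the Belgian egg study quoted above.
cooked = 0.91        # lower bound of cooked-egg protein digestibility
raw = 0.65           # raw-egg digestibility in healthy volunteers

increase = (cooked / raw - 1) * 100
print(f"Cooking raises usable egg protein by about {increase:.0f}%")
```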

The Belgian scientists considered the reason for this dramatic effect on nutritional value and concluded that the major factor was denaturation of the food proteins, induced by heat. Denaturation occurs when the internal bonds of a protein weaken, causing the molecule to open up. As a result, the protein molecule loses its original three-dimensional structure and therefore its natural biological function. The gastroenterologists noted that heat predictably denatures proteins, and that denatured proteins are more digestible because their open structure exposes them to the action of digestive enzymes.

Even before the Belgian egg study, there were indications that cooking can be responsible for enough denaturation to strongly influence digestibility. In 1987 researchers studied a beef protein, bovine serum albumin (BSA), selected because it is a typical food protein. In cooked samples, digestion by the enzyme trypsin was four times greater than in uncooked samples. The researchers concluded that the simple process of denaturation by heat (causing the protein molecule to unfold and lose its solubility in water) explained its greatly increased susceptibility to digestion.

Heat is only one of several factors that promote denaturation. Three others are acidity, sodium chloride, and drying, all of which humans use in different ways.

Acid is vital in the ordinary process of digestion. Our empty stomachs are highly acidic thanks to the secretions of a billion acid-producing cells that line the stomach wall and secrete one to two liters of hydrochloric acid a day. Food entering the stomach buffers the acidity and causes a more neutral pH, but the stomach cells respond rapidly and secrete enough acid to return the stomach to its original low pH of less than 2. This intense acidity has at least three functions: it kills bacteria that enter with the food, activates the digestive enzyme pepsin, and denatures proteins. Denaturation looks particularly important.

Marinades, pickles, and lemon juice are acidic, so if applied for sufficient time they can contribute to the denaturing of proteins in meat, poultry, and fish. It is no surprise that we like seviche, raw fish marinated in a citrus juice mixture, traditionally for a few hours. Hunter-gatherers have likewise been reported mixing acidic fruits with stored meats. The Tlingit of Alaska stuffed goat meat with blueberries and stored salmon spawn mashed with cooked huckleberries. Many other North American groups made pemmican by mixing dried and pounded meat with various kinds of berries, and Australian aborigines mixed wild plums with the pounded bones and meat of kangaroo. While pleasing flavors and improved storage might be enough to account for such mixtures, increased digestibility could also contribute to explaining the broad use of these acidic preparations. Animal protein that has been salted and dried, such as fish, is likewise denatured and thereby made more digestible. Increased digestibility from denaturation also helps account for our enjoyment of dried meats such as jerky or salted fish.

Although gelatinization and denaturation are largely chemical effects, cooking also has physical effects on the energy food provides. Research on the topic began with a misfortune almost two hundred years ago. On June 6, 1822, twenty-eight-year-old Alexis St. Martin was accidentally shot from a distance of about a meter (three feet) inside a store of the American Fur Company at Fort Mackinac, Michigan. William Beaumont, a young, war-hardened surgeon, was nearby and arrived within twenty-five minutes to find a bloody scene that he described eleven years later: “A large portion of the side was blown off, the ribs fractured, and openings made into the cavities of the chest and abdomen, through which protruded portions of the lung and stomach, much lacerated and burnt, exhibiting altogether an appalling and hopeless case. The diaphragm was lacerated and perforation made directly into the cavity of the stomach, through which food was escaping at the time your memorialist was called to his relief.”

Beaumont took St. Martin to his own home. To everyone’s surprise, St. Martin survived, and Beaumont continued to house and care for him after he stabilized. In a few months the patient resumed a vigorous life, and he became so strong that he eventually even paddled his family in an open canoe from Mississippi to Montreal. Although the fist-sized wound mostly filled in, it never completely closed. For the rest of St. Martin’s life, the inner workings of his stomach were visible from the outside.

The ambitious Beaumont realized that he had an extraordinary study opportunity. He began on August 1, 1825. “At 12 o’clock, M., I introduced through the perforation, into the stomach, the following articles of diet, suspended by a silk string, and fastened at proper distances, so as to pass in without pain—viz.:—a piece of highly seasoned a la mode beef; a piece of raw, salted, fat pork; a piece of raw, salted, lean beef; a piece of boiled, salted beef; a piece of stale bread; and a bunch of raw, sliced cabbage; each piece weighing about two drachms; the lad [St. Martin] continuing his usual employment around the house.”

Beaumont observed the stomach closely. He noted how quiet it was when it had no food, the rugae (muscle folds) nestled upon each other. When soup was swallowed, the stomach was at first slow to respond. “The rugae gently close upon it, and gradually diffuse it through the gastric cavity.” When Beaumont placed food directly on the stomach wall, the stomach became excited and its color brightened. There was a “gradual appearance of innumerable, very fine, lucid specks, rising through the transparent mucous coat, and seeming to burst, and discharge themselves upon the very points of the papillae, diffusing a limpid, thin fluid over the whole interior gastric surface.” For the first time, it was possible to watch digestion in action.

Beaumont continued his experiments intermittently for eight years. He recorded in detail how long it took foods to be digested by the stomach and emptied into the duodenum. From those observations he drew two conclusions relevant to the effects of cooking.

The more tender the food, the more rapidly and completely it was digested. He noted the same effect for food that was finely divided. “Vegetable, like animal substances, are more capable of digestion in proportion to the minuteness of their division . . . provided they are of a soft solid.” Potatoes boiled to reduce them to a dry powder tasted poor, but they were more easily digested. If not powdered, entire pieces remained long undissolved in the stomach and yielded slowly to the action of the gastric juice. “The difference is quite obvious on submitting parcels of this vegetation, in different states of preparation, to the operation of the gastric juice, either in the stomach or out of it.”

The same principles held, said Beaumont, with respect to meat. “Fibrine and gelatine [muscle fibers and collagen in meat] are affected in the same way. If tender and finely divided, they are disposed of readily; if in large and solid masses, digestion is proportionally retarded. . . . Minuteness of division and tenderness of fibre are the two grand essentials for speedy and easy digestion.”

In addition to “minuteness of division and tenderness,” cooking helped. He was explicit in the case of potatoes. “Pieces of raw potato, when submitted to the operation of this fluid, in the same manner, almost entirely resist its action. Many hours elapse before the slightest appearance of digestion is observable, and this only upon the surface, where the external laminae become a little softened, mucilaginous, and slightly farinaceous. Every physician who has had much practice in the diseases of children knows that partially boiled potatoes, when not sufficiently masticated (which is always the case with children), are frequently a source of colics and bowel complaints, and that large pieces of this vegetable pass the bowels untouched by digestion.” It was the same with meat. When Beaumont introduced boiled beef and raw beef at noon, the boiled beef was gone by 2 P.M. But the piece of raw, salted, lean beef of the same size was only slightly macerated on the surface, while its general texture remained firm and intact.

Sadly, St. Martin came to resent being a focus of scientific interest. By the time of his death in 1880 at the ripe old age of eighty-five, he felt thoroughly mistreated. He had long refused to have anything to do with Beaumont, and his family shared his sense of abuse. Dr. William Osler, often described as the father of modern medicine, hoped to study St. Martin’s body and even buy his stomach, but the family refused. They kept his body privately for four days to ensure that it rotted, then they buried him in an unusually deep grave, eight feet down, to thwart any medical interest in his organs.





Beaumont’s discovery that soft and finely divided foods are more easily digested conforms to our preference for such items. In 2006 the London department store Selfridges received five advance orders for a new product: the world’s most expensive sandwich. For £85 ($148) people had the chance to eat a 595-gram (21-ounce) mixture of fermented sourdough bread, Wagyu beef, fresh lobe foie gras, black truffle mayonnaise, brie de Meaux, English plum tomatoes, and confit. The beef explains the high price. Wagyu cattle are one of the most expensive breeds in the world because their meat is exceptionally tender, and no effort is spared to make it so. The animals are raised on a diet that includes beer and grain, and their muscles are regularly massaged with sake, the Japanese rice wine. The fat in the meat is claimed to melt at room temperature. The exceptional value of Wagyu beef illustrates a notable human pattern: people like their meat tender. “Of all the attributes of eating quality,” wrote meat scientist R. A. Lawrie, “texture and tenderness are presently rated most important by the average consumer, and appear to be sought at the expense of flavour and colour.” A key aim of meat science is to discover how to produce the most tender meat. Rearing, slaughtering, preservation, and preparation methods all play their part.

So does cooking. According to cooking historian Michael Symons, the cook’s main goal has always been to soften food. “The central theme is that cooks assist the bodily machine,” he wrote. He cited Mrs. Beeton’s Book of Household Management, which in 1861 sought to advise naive housewives about the fundamentals of the kitchen. The first of six reasons for cooking was “to render mastication easy.” “Hurrying over our meals, as we do, we should fare badly if all the grinding and subdividing of human food had to be accomplished by human teeth.” A second reason for cooking stressed the point Beaumont had discovered: “to facilitate and hasten digestion.”

The way Kalahari San hunter-gatherers prepare their food suggests a similar concern for making their meals as soft as possible. They cook their meat until “it is so tender that the sinews will fall apart.” Then “it is usually crushed in a mortar.” It is the same with plant foods. After melons or seeds have been cooked by burying them in hot embers or ashes, their contents are “ground in a mortar and eaten as a gruel.”

Tropical and subtropical hunter-gatherers, such as Andaman Islanders, Siriono, Mbuti, and Kalahari San, eat all their meat cooked. It is in cooler climates that people sometimes eat animal protein raw. If they are eaten uncooked, the raw items tend to be soft, like the mammal livers and rotten fish the Inuit eat. The island-living Yahgan in the south of Tierra del Fuego have three such foods, according to Martin Gusinde, who lived with them for twenty years. There is “the soft meat” of mollusks such as winkles, “squeezed out of the calcareous shell with a slight pressure of the fingers and eaten without any preparation, except that occasionally the little morsel of fish is dipped into seal blubber.” There are also the ovaries of sea urchins and the milky liquid in the shell, a delicacy shared by the Tlingit and eaten by Japanese and Europeans today in fine restaurants. According to Gusinde, a few individuals found the raw fat of a young whale tasty. Other than these cases, all animal protein was cooked.

Game animals have a few soft parts. The Utes of Colorado were said to roast all their meat but they ate the kidneys and livers raw. Australian aborigines supposedly eat mammal intestines raw on occasion, as Inuit do with fish and birds. Raw intestines may seem a startling preference in view of the potential for parasites to be present. They are likewise almost always the first part of a prey animal eaten by chimpanzees, chewed and swallowed much faster than muscle meat.

Raw-blood meals are well known among pastoralists such as Maasai, and as we saw in chapter 1, reported by Marco Polo in thirteenth-century Mongol nomad warriors. Elsewhere raw-fat meals are provided by fat-tailed sheep. Asian nomads value these sheep so highly and have bred them to such an extreme that they sometimes provide their animals with little carts to support the massive tail. On trek the nomads remove some of the fat for a raw meal, and the sheep travels a little lighter the next day.

While some foods are naturally tender, meat is variable. Meat with smaller muscle fibers is more tender, so chicken is more tender than beef. An animal slaughtered without being stressed retains more glycogen in its muscles. After death the glycogen converts to lactic acid, which promotes denaturation and therefore yields more tender meat. Carcasses that are left to hang for several days are more tender, because their proteins are partly broken down by enzymes.

But nothing changes meat tenderness as much as cooking because heat has a tremendous effect on the material in meat most responsible for its toughness: connective tissue. Composed of a fibrous protein called collagen and a stretchy one called elastin, connective tissue wraps the meat in three pervasive layers. The innermost layer is a sleeve called endomysium, which surrounds each individual muscle fiber like the skin of a sausage. Bundles of endomysium-enclosed muscle fibers lie alongside one another jointly sheathed in a larger skin, the perimysium. Finally, those bundles, or fascicles, are held together by the outer wrapping, or epimysium, which encloses the entire muscle. At the end of the muscle, the epimysium turns into the tendon. Connective tissue is slippery, elastic, and strong: the tensile strength of tendons can be half that of aluminum. So connective tissue not only does a wonderful job of keeping our muscles in place but it also makes meat very difficult to eat, particularly for an animal like humans or chimpanzees whose teeth are notably blunt.

The main protein in connective tissue, collagen, owes its toughness to an elegant repeating structure. Three left-handed helices of protein twirl around one another to form a right-handed superhelix. The superhelixes join into fibrils, and the fibrils form fibers that assemble into a crisscross pattern. The effect is a marvel of microengineering. The extraordinary mechanical strength of collagen explains why sinews, or tendons, make excellent bowstrings and why it is the most abundant protein in vertebrates: it is the main component of skin.

But collagen has an Achilles’ heel: heat turns it to jelly. Collagen shrinks when it reaches its denaturation temperature of 60-70°C (140-158°F), and then, as the helices start to unwind, it starts melting away. Whether heated to about 100°C (212°F) for a short time or at lower temperatures for a longer time, the fibrils of collagen fall apart until they convert into the very antithesis of toughness: gelatin, a protein with commercial uses from Jell-O to jellied eels. The amount of force required to cut through a standard piece of meat tends to reach a minimum between 60°C and 70°C (140°F and 158°F). Above those temperatures, slow cooking in water can sometimes continue to increase the tenderness.
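The Celsius-to-Fahrenheit pairs quoted for collagen's denaturation range can be verified with the standard conversion formula; a quick sketch:

```python
def c_to_f(celsius):
    """Convert degrees Celsius to degrees Fahrenheit."""
    return celsius * 9 / 5 + 32

# Collagen denaturation range and the boiling point cited in the text
for c in (60, 70, 100):
    print(f"{c} °C = {c_to_f(c):.0f} °F")
# prints: 60 °C = 140 °F, 70 °C = 158 °F, 100 °C = 212 °F
```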

Unfortunately for the amateur cooks among us, a second effect of heating meat is contrary to the first. Unlike connective tissue, heated muscle fibers tend to get tougher and drier. The cumulative effects of cooking meat are therefore complex. Bad cooking can render meat hard to chew, but good cooking tenderizes every kind of meat, from shrimp and octopus to rabbit, goat, and beef. Tenderness is even important for cooks preparing raw meat. Steak tartare requires a particularly high grade of meat (low in connective tissue) and the addition of raw eggs, onions, and sauces. The Joy of Cooking recommends grinding top sirloin, or scraping it with the back of a knife, until only the fibers of connective tissue remain.

Steak tartare supposedly gets its name from the Tartars, or Mongols, who rode in Genghis Khan’s army. When soldiers were moving too fast to cook, they sometimes drank horse blood but they were also reported to put slabs of meat under the saddles, riding on them all day until they were tender. Brillat-Savarin recorded an enthusiastic testimony of the practice: “Dining with a captain of Croats in 1815, ‘Gads,’ said he, ‘there’s no need of so much fuss in order to have a good dinner! When we are on scout duty and feel hungry, we shoot down the first beast that comes in our way, and cutting out a good thick slice, we sprinkle some salt over it, place it between the saddle and the horse’s back, set off at a gallop for a sufficient time, and’ (working his jaws like a man eating large mouthfuls) ‘gniaw, gniaw, gniaw, we have a dinner fit for a prince.’”





Why does tenderness matter? Beaumont observed that softer food was digested faster, and since faster or easier digestion demands less metabolic effort, softer food might lead to energy saved during digestion. The idea should make sense when you consider the greater liveliness you feel after eating a light meal compared to a heavy one: the light meal demands less work from your intestines and therefore makes other kinds of physical activity easy. This energy-saving principle has been beautifully shown in rats given soft food.

A team of Japanese scientists led by Kyoko Oka reared twenty rats on two different food regimes. Ten rats ate ordinary laboratory pellets, which were hard enough to require substantial chewing. The other ten ate a version of the standard food that was modified in a single way: the pellets were made softer by increasing their air content. The soft pellets were puffed up like a breakfast cereal and required only half the force of the hard pellets to crush them. In every other way the rats’ conditions were identical. The calorie intake, and calorie expenditure on locomotion, were found to be the same for the two groups. The ordinary and soft pellets did not differ in how much they had been cooked, their nutrient composition, or water content. Conventional theory based on the calculation of calorie intake would predict that the two groups of rats should have grown at the same rates and to the same size. They should have had the same body weight and the same levels of fat.

But they did not. The rats began eating their different pellet diets at four weeks old. By fifteen weeks the growth curves of the two groups had visibly separated, and by twenty-two weeks the group curves were significantly different. The rats eating soft food slowly became heavier than those eating hard food: on average, 37 grams heavier, or about 6 percent; and they had more abdominal fat: on average, 30 percent more, enough to be classified as obese. Soft, well-processed foods made the rats fat. The difference was in the cost of digestion. At every meal the rats experienced a rise in body temperature, but the rise was lower in the soft-pellet group than in the hard-pellet group. The difference was particularly strong in the first hour after eating, when the stomach was actively churning and secreting. The researchers concluded that the reason the softer diet led to obesity was simply that it was a little less costly to digest.
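The two figures given for the weight gap (37 grams, about 6 percent) let us back-calculate the approximate average weight of the hard-pellet control group. This baseline is an inference from the stated percentages, not a number reported from the study:

```python
weight_gap_g = 37     # soft-pellet rats were 37 g heavier on average
gap_fraction = 0.06   # stated as about 6 percent of body weight

# Implied average weight of the hard-pellet (control) rats:
# 37 g is 6% of the control weight, so divide to recover it
control_weight_g = weight_gap_g / gap_fraction
print(f"implied control weight ≈ {control_weight_g:.0f} g")  # ≈ 617 g
```

A control weight of roughly 600 grams is plausible for mature laboratory rats at twenty-two weeks, which is consistent with the percentages as quoted.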

The implications of Oka’s experiment are clear. If cooking softens food and softer food leads to greater energy gain, then humans should get more energy from cooked food than raw food not only because of processes such as gelatinization and denaturation, but also because it reduces the costs of digestion. This prediction has been studied in the Burmese python. Physiological ecologist Stephen Secor finds pythons to be superb experimental subjects because after swallowing a meal, the snakes lie in a cage doing little but digesting and breathing. By measuring how much oxygen the pythons consume before and after a meal, Secor measures precisely how much energy the snakes use, and can attribute it to the cost of digestion. He typically monitors the snakes for at least two weeks at a time.

Secor and his team have shown repeatedly that the physical structure of a python’s diet influences its cost of digestion. If the snake eats an intact rat, its metabolic rate increases more than if a similar rat is ground up before the snake eats it. Amphibians yield the same results. Toads given hard-bodied worms have higher costs of digestion than those eating soft-bodied worms. Just as Oka’s team found with rats eating softer pellets, Secor’s studies show that softer meat is also digested with less energy expenditure.

A particular advantage of the Burmese pythons is that experimenters can insert food directly into their esophagus. The snakes show no signs of objecting. No matter whether the pythons find a food appealing and regardless of how easy the food is to swallow, the pythons just digest what they are given. They are an ideal species in which to test the effects of cooking on the cost of digestion. I approached Secor in 2005 to ask if he would be interested in the following study. Secor assigned eight snakes to the research, and his team prepared five kinds of experimental diet. Lean beef steak (eye of round, with less than 5 percent fat) was the basic food and was given to the snakes in each of four preparations: raw and intact; raw and ground; cooked and intact; and cooked and ground. The snakes were also given whole intact rats.

The experiment took several months. As expected from earlier results, the snakes’ cost of digestion when they ate the raw, intact meat was the same as for the whole rats. But grinding and cooking changed the costs of digestion. Grinding breaks up both muscle fibers and connective tissue, so it increases the surface area of the digestible parts of the meat. Ground meat is exposed more rapidly to acid, causing denaturation, as well as to proteolytic enzymes, causing degradation of the muscle proteins. Grinding reduced the snakes’ cost of digestion by 12.3 percent. Cooking produced almost identical results. Compared to the raw diet, cooked meat led to a reduction in the cost of digestion by 12.7 percent. The effects of the two experimental treatments, grinding and cooking, were almost entirely independent. Alone, each reduced the cost of digestion by just over 12 percent. Together, they reduced it by 23.4 percent.
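The claim that grinding and cooking were "almost entirely independent" can be checked arithmetically: if each treatment multiplies the cost of digestion by its own factor, the combined saving predicted from the two separate savings should match the measured 23.4 percent. A quick sketch:

```python
grind_saving = 0.123  # grinding alone cut the cost of digestion by 12.3%
cook_saving = 0.127   # cooking alone cut it by 12.7%

# Under independence, the remaining cost is the product of the two
# remaining fractions, so the combined saving is:
combined = 1 - (1 - grind_saving) * (1 - cook_saving)
print(f"predicted combined saving: {combined:.1%}")  # ≈ 23.4%
```

The predicted combined saving of 23.4 percent matches the measured value, which is exactly what independence of the two effects implies.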

Mrs. Beeton was right to cherish softness as an aid to digestion. It makes sense that we like foods that have been softened by cooking, just as we like them chopped up in a blender, ground in a mill, or pounded in a mortar. The unnaturally, atypically soft foods that compose the human diet have given our species an energetic edge, sparing us much of the hard work of digestion. Fire does a job our bodies would otherwise have to do. Eat a properly cooked steak, and your stomach will more quickly return to quiescence. From starch gelatinization to protein denaturation and the costs of digesting, absorbing, and assimilating meat, the same lesson emerges. Cooking gives calories.





When we consider the difficulties humans experience on raw diets, the evidence that all animals thrive on cooked food, and the nutritional evidence concerning gelatinization, denaturation, and tenderness, what is extraordinary about this simple claim is that it is new. Admittedly, cooking can have some negative effects. It leads to energy losses through dripping during the cooking process and by producing indigestible protein compounds, and it often leads to a reduction of vitamins. But compared to the energetic gains, those processes do not matter. Overall it appears that cooking consistently provides more energy, whether from plant or animal food.

Why then do we like cooked food today? The energy it provides is more than many of us need, but it was a critical contribution for our remote ancestors just as it is vital for many people living nowadays in poverty. Tens of thousands of generations of eating cooked food have strengthened our love for it. Consider foie gras, the liver of French geese that have been cruelly force-fed to make them especially fat. The fresh liver is soaked in milk, water, or port, marinated in Armagnac, port, or Madeira, seasoned, and finally baked. The result is so meltingly soft and tender that a single bite has been said to make a grown man cry. Our raw-food-eating ancestors never knew such joy.

Cooked food is better than raw food because life is mostly concerned with energy. So from an evolutionary perspective, if cooking causes a loss of vitamins or creates a few long-term toxic compounds, the effect is relatively unimportant compared to the impact of more calories. A female chimpanzee with a better diet gives birth more often and her offspring have better survival rates. In subsistence cultures, better-fed mothers have more and healthier children. In addition to more offspring, they have greater competitive ability, better survival, and longer lives. When our ancestors first obtained extra calories by cooking their food, they and their descendants passed on more genes than others of their species who ate raw. The result was a new evolutionary opportunity.
5#

CHAPTER 4



When Cooking Began


“The introduction of cooking may well have been the decisive factor in leading man from a primarily animal existence into one that was more fully human.”

—CARLETON S. COON, The History of Man





Archaeologists are divided about the origins of cooking. Some suggest that fire was not regularly used for cooking until the Upper Paleolithic, about forty thousand years ago, a time when people were so modern that they were creating cave art. Others favor much earlier times, half a million years ago or before. A common proposal lies between those extremes, advocated especially by physical anthropologist Loring Brace, who has long noted that people definitely controlled fire by two hundred thousand years ago and argues that cooking started around the same time. As the wide range of views shows, the archaeological evidence is not definitive. Archaeology offers only one safe conclusion: it does not tell us what we want to know. But though we cannot solve the problem of when cooking began by relying on the faint traces of ancient fires, we can use biology instead. In the teeth and bones of our ancestors we find indirect evidence of changes in diet and the way it was processed.

Yet the archaeological data leave no doubt that controlling fire is an ancient tradition. In the most recent quarter of a million years, there is sparkling evidence of fire control, and even occasionally of cooking, by both our ancestors and our close relatives the Neanderthals. The most informative sites tend to be airy caves or rock shelters, many of them in Europe. In Abri Pataud in France’s Dordogne region, heat-cracked river cobblestones from the late Aurignacian period, around forty thousand years ago, show that people boiled water by dropping hot rocks in it. At Abri Romani near Barcelona, a series of occupations dating back seventy-six thousand years includes more than sixty hearths together with abundant charcoal, burnt bones, and casts of wooden objects possibly used for cooking. More than ninety-three thousand years ago in Vanguard Cave, Gibraltar, three separate episodes of burning can be distinguished in a single hearth. Neanderthals heated pinecones on these fires and broke them open with stones, much as contemporary hunter-gatherers have been recorded doing, to eat the seeds.

Our ancestors were using fire in the Middle East and Africa as well. In a cave at Klasies River Mouth, a coastal site in South Africa from sixty thousand to ninety thousand years ago, burnt shells and fish bones lie near family-size hearths that appear to have been used for weeks or months at a time. Between 109,000 and 127,000 years ago in the Sodmein Cave of Egypt’s Red Sea Mountains, people appear responsible for huge fires with three distinct superimposed ash layers and the burnt bones of an elephant. Charred logs, together with charcoal, reddened areas, and carbonized grass stems and plants, date to 180,000 years ago at Kalambo Falls in Zambia. Back to 250,000 years ago in Israel’s Hayonim Cave, there are abundant hearths with ash deposits up to 4 centimeters (1.6 inches) thick. Such sites show that people have been controlling fire throughout the evolutionary life span of our species, Homo sapiens, which is considered to have originated about two hundred thousand years ago.





Because evidence about controlling fire is inconsistent before the last quarter of a million years, it is often argued that the control of fire was unimportant or absent until that time. But that idea is now particularly shaky because the older part of the record, going back in time from a quarter of a million years ago, has been improving in quality. Two sites in particular give tantalizing hints of what earlier people were doing with fire.

An ancient fireplace at the Beeches Pit archaeological site in England, securely dated to four hundred thousand years ago, lies on the gently sloping bank of an ancient pond. Eight hand axes attest to the presence of humans. Dark patches about one meter (three feet) in diameter with reddened sediments at the margins show where burning occurred. Tails of ashlike material lead down from the fires toward the pond, while the upper side contains numerous pieces of flint. The flints have been knapped, or broken by a sharp blow, and many are burnt. A team led by archaeologist John Gowlett fitted the flint pieces together, and one of the various refits showed that someone had been knapping a heavy core (1.3 kilograms, or 2.9 pounds) until a flaw became obvious. The knapper abandoned it, and two flakes from the series fell forward and were burnt, indicating that the toolmaker apparently had been squatting next to a warming blaze.

Another four-hundred-thousand-year-old site, at Schöningen in Germany, has yielded more than a half dozen superb throwing spears carved from spruce and pine, together with the remains of at least twenty-two horses that appear to have died at the same time as one another, apparently killed by humans. Cut marks show that people removed meat from the horses. At the same site were numerous pieces of burnt flint, four large reddened patches about one meter in diameter that appear to have been fireplaces, and some pieces of burnt wood including a shaped stick, also made from spruce, that had been charred at one end as if it had been used as a poker, or perhaps held over coals to cook strips of meat. This exceptional lakeshore find by archaeologist Hartmut Thieme represents the earliest evidence of group hunting. Thieme suggests that after people killed the horse group, they found themselves with far more food than they could consume at the time. They settled for several days and built the fires along the lakeshore to dry as much meat as possible.

Prior to half a million years ago, there is no evidence for the control of fire in Europe, but ice covered Britain for much of the time between five hundred thousand and four hundred thousand years ago, and glaciers would have swept away most evidence of any earlier occupations. Farther south, however, fire-using is strongly attested at 790,000 years ago. In a well-dated site called Gesher Benot Ya’aqov, next to Israel’s Jordan River, hand axes and bones were first discovered in the 1930s, and in the 1990s, Naama Goren-Inbar found burnt seeds, wood, and flint. Olives, barley, and grapes were among the species of seeds found burned. The flint fragments were grouped in clusters, suggesting they had fallen into campfires. Nira Alperson-Afil analyzed these dense concentrations. She concluded that the early humans who made these fires “had a profound knowledge of fire-making, enabling them to make fire at will.”

Gesher Benot Ya’aqov is the oldest site offering confident evidence of fire control. Before then we find only provocative hints. Archaeological sites between a million and a million and a half years old include burnt bones (at Swartkrans in South Africa), lumps of clay heated to the high temperatures associated with campfires (Chesowanja, near Lake Baringo in Kenya), heated rocks in a hearthlike pattern (Gadeb in Ethiopia), or colored patches with appropriate plant phytoliths inside (Koobi Fora, Kenya). But the meaning of such evidence as indicating human control of fire is disputed. Some archaeologists find it totally unconvincing, regarding natural processes such as lightning strikes as likely explanations for the apparent use of fire. Others accept the idea that humans controlled fire in the early days of Homo erectus as well established. Overall, these hints from the Lower Paleolithic tell us only that in each case the control of fire was a possibility, not a certainty.

Evidence of humans controlling fire is hard to recover from early times. Meat can be cooked easily without burning bones. Fires might have been small, temporary affairs, leaving no trace within a few days of exposure to wind and rain. Even now hunter-gatherers such as the Hadza, who live near the Serengeti National Park in northern Tanzania, may use a fire only once, and they often leave no bones or tools at the fire site, so archaeologists would not be able to infer human activity even if they could detect where burning had occurred. The caves and shelters that preserve relatively recent evidence of fire use tend to be made of soft rock, such as limestone, which erodes quickly, so the half-lives of caves average about a quarter of a million years, leaving increasingly few opportunities to find traces of fire use from earlier periods. From the past quarter of a million years there are sites of human occupation where people must have used fire, yet there is no sign of it. There are also mysterious reductions in the frequency of finding evidence of fire, such as one that followed an interglacial period in Europe from 427,000 years ago to 364,000 years ago, when fire evidence was relatively abundant. In short, while humans have certainly been using fire for hundreds of thousands of years, archaeology does not tell us exactly when our ancestors began to do so.

The inability of the archaeological evidence to tell when humans first controlled fire directs us to biology, where we find two vital clues. First, the fossil record presents a reasonably clear picture of the changes in human anatomy over the past two million years. It tells us what were the major changes in our ancestors’ anatomy, and when they happened. Second, in response to a major change in diet, species tend to exhibit rapid and obvious changes in their anatomy. Animals are superbly adapted to their diets, and over evolutionary time the tight fit between food and anatomy is driven by food rather than by the animal’s characteristics. Fleas do not suck blood because they happen to have a proboscis well designed for piercing mammalian skin; they have the proboscis because they are adapted to sucking blood. Horses do not eat grass because they happen to have the right kind of teeth and guts for doing so; they have tall teeth and long guts because they are adapted to eating grass. Humans do not eat cooked food because we have the right kind of teeth and guts; rather, we have small teeth and short guts as a result of adapting to a cooked diet.

Therefore, we can identify when cooking began by searching the fossil record. At some time our ancestors’ anatomy changed to accommodate a cooked diet. The change must mark when cooking became not merely an occasional activity but a predictable daily occurrence, because until then our ancestors would have sometimes had to resort to eating their food raw—and therefore could not adapt to cooking. The time when our ancestors became adapted to cooked food also marks the time when fire was controlled so effectively that it was never lost again.

Anthropologists have sometimes suggested that humans could have controlled fire for reasons such as warmth and light for many millennia before starting to use it for cooking. However, many animals show a spontaneous preference for cooked food over raw. Would prehuman ancestors have preferred cooked food also? Evolutionary anthropologists Victoria Wobber and Brian Hare tested chimpanzees and other apes in the United States, Germany, and Tchimpounga, a Congolese sanctuary. Across the different locations, despite different diets and living conditions, the apes responded similarly. No apes preferred any food raw. They ate sweet potatoes and apples with equal enthusiasm whether raw or cooked, but they preferred their carrots, potatoes, and meat to be cooked. The Tchimpounga chimpanzees were particularly informative because there was no record of them having eaten meat previously, yet they showed a strong preference for cooked meat over raw meat. The first of our ancestors to control fire would likely have reacted the same way. Cooked food would have suited their palate the first time they tried it, just as a taste for cooked food, with its immediate benefits, is shared by a wide range of wild and domestic species. Chimpanzees in Senegal do not eat the raw beans of Afzelia trees, but after a forest fire has passed through the savanna, they search under Afzelia trees and eat the cooked seeds.

Why are wild animals pre-adapted in this way to appreciate the smells, tastes, and textures of cooked food? The spontaneous preference for cooked food implies an innate mechanism for recognizing high-energy foods. Many foods change their taste when cooked, becoming sweeter, less bitter, or less astringent, so taste could play a role in this preference, as some evidence suggests. Koko is a gorilla who learned to communicate with humans, and she prefers her food cooked. Cognitive psychologist Penny Patterson asked her why: “I asked Koko while the video was rolling if she liked her vegetables better cooked (specifying my left hand) or raw/fresh (indicating my right hand). She touched my left hand (cooked) in reply. Then I asked why she liked vegetables better cooked, one hand standing for ‘tastes better,’ the other ‘easier to eat.’ Koko indicated the ‘tastes better’ option.”

When primates eat, sensory nerves in the tongue perceive not only taste but also particle size and texture. Some of the brain cells (neurons) responsive to texture converge with taste neurons in the amygdala and orbito-frontal cortex of the brain, allowing a summed assessment of food properties. This sensory-neural system enables primates to respond instinctively to a wide range of food properties other than merely taste, including such factors as grittiness, viscosity, oiliness, and temperature.

In 2004 such abilities in the human brain were reported for the first time. A team led by psychologist Edmund Rolls found that when people had foods of a particular viscosity in their mouths, specific brain regions were activated. Those regions partly overlapped with regions of taste cortex that register sweetness. The picture emerging from such studies is that hard-wired responses to properties such as taste, texture, and temperature are integrated in the brain with learned responses to the sight and smell of food. So the mechanisms that allow animals to assess the quality of raw foods directly apply to cooked foods and allow them to choose foods of a good texture for easy digestion.

Rolls’s studies suggest that the proximate reasons chimpanzees and many other species like their meat and potatoes cooked may be the same as in humans. We identify foods that have high caloric value not just by their being sweet, but also by their being soft and tender. Our ancestors were surely prepared by their preexisting sensory and brain mechanisms to like cooked foods in the same way. A long delay between the first control of fire and the first eating of cooked food is therefore deeply improbable.





A long delay between the adoption of a major new diet and resulting changes in anatomy is also unlikely. Studies of Galapagos finches by Peter and Rosemary Grant showed that during a year when finches experienced an intense food shortage caused by an extended drought, the birds that were best able to eat large and hard seeds—those birds with the largest beaks—survived best. The selection pressure against small-beaked birds was so intense that only 15 percent of birds survived and the species as a whole developed measurably larger beaks within a year. Correlations in beak size between parents and offspring showed that the changes were inherited. Beak size fell again after the food supply returned to normal, but it took about fifteen years for the genetic changes the drought had imposed to reverse.

The Grants’ finches show that anatomy can evolve very quickly in response to dietary changes. In the case of the drought year in the Galapagos, the change in diet was temporary and therefore so was the change in anatomy. Other data show that if an ecological change is permanent, the species also changes permanently, and again the transition is fast. Some of the clearest examples come from animals confined on islands that have been newly created by a rise in sea level. In fewer than eight thousand years, mainland boa constrictors that occupied new islands off Belize shifted their diets away from mammals and toward birds, spent more time in trees, became more slender, lost a previous size difference between females and males, and were reduced to a fifth of their original body weight. According to evolutionary biologist Stephen Jay Gould, this rate of change is not unusual. Drawing from the fossil record, he suggested that fifteen thousand to twenty thousand years may be about the average time one species takes to make a complete evolutionary transition to another. While a species that takes many years to mature, such as our ancestors, would take longer to evolve than a rapidly growing species, such rapid rates of evolution are sharply inconsistent with some previous interpretations of the effects of cooking. Loring Brace suggested that the use of fire for softening meat began around 250,000 to 300,000 years ago, followed by a supposed drop in tooth size that began about 100,000 years ago. This would mean that for at least the first 150,000 years after cooking was adopted, human teeth showed no response. Because such a long delay before adapting to a major new influence does not fit the animal pattern, we can conclude that Brace’s idea is wrong. The adaptive changes brought on by the adoption of cooking would surely have been rapid.

In addition to following quickly, the changes would have been substantial. We can infer this from pairs of species in which lesser differences in diet have large effects. Take chimpanzees and gorillas, two ape species that often share the same forest habitat. In many ways their diets are very similar. Both choose ripe fruits when they are available. Both also supplement their diets with fibrous foods, such as piths and leaves. There is only one important difference in their food choice. When fruits are scarce, gorillas rely on foliage alone, whereas chimpanzees continue to search for fruit every day. Unlike gorillas, chimpanzees never survive only on piths and leaves—presumably because they are physiologically unable to do so.

The relative ability of these two apes to rely on foliage might at first glance appear to be a trivial matter—especially compared to the introduction of cooking. But many consequences follow from it. To find their vital fruits, chimpanzees must travel farther than gorillas, so they are more agile and smaller. There are differences in distributional range. Unlike chimpanzees, gorillas successfully occupy high-altitude forests without fruits, such as the Virunga Volcanoes of Rwanda, Uganda, and the Democratic Republic of Congo. Chimpanzees are limited to lower altitudes. Like other primates that are able to rely on a leaf diet, gorillas mature earlier, start having babies at a younger age, and reproduce faster.

Grouping patterns of these species also differ strikingly as a result of the difference in diet. The terrestrial foliage gorillas rely on is easily found and occurs in big patches, allowing their groups to be stable all year. But during food-poor seasons, chimpanzees are driven to travel alone or in small groups as they search for rare fruits. The difference in grouping patterns has further consequences. Gorillas form long-lasting bonds between females and males, whereas chimpanzees do not.

More than the relatively slight dietary difference that distinguishes gorillas from chimpanzees, cooked food had multiple differences from raw food. Effects of cooking include extra energy, softer food, fireside meals, a safer and more diverse set of food species, and a more predictable food supply during periods of scarcity. Cooking would therefore be expected to increase survival, especially of the vulnerable young. It should also have increased the range of edible foods, allowing extension into new biogeographical zones. The anatomical differences between a cooking and a precooking ancestor should be at least as big as those between a chimpanzee and a gorilla. So whenever cooking was adopted, its effects should be easy to find. We can expect the origin of cooking to be signaled by large, rapid changes in human anatomy appropriate to a softer and more energy-rich diet.





The search for such changes proves to be rather simple. Before two million years ago, there is no suggestion that fire was controlled. Since then there have been only three periods when our ancestors' evolution was fast and strong enough to justify changes in the species names. They are the times that produced Homo erectus (1.8 million years ago), Homo heidelbergensis (800,000 years ago), and Homo sapiens (200,000 years ago). These are therefore the only times when it is reasonable to infer that cooking could have been adopted.

Most recent was the evolution of Homo sapiens from an ancestor that is now usually called Homo heidelbergensis. It was a gentle process that began in Africa as early as three hundred thousand years ago and was largely complete by around two hundred thousand years ago. The transition was too recent to correspond to the origin of cooking, however, because Homo heidelbergensis was already using fire at Beeches Pit, Schöningen, and elsewhere four hundred thousand years ago. Nor does the transition to Homo sapiens show the kinds of change we are looking for. Homo heidelbergensis was merely a more robust form of human than Homo sapiens, with a large face, less rounded head, and slightly smaller brain. Most of the differences between these two species are too small and not obviously related to diet. We can be confident that cooking began more than three hundred thousand years ago, before Homo sapiens emerged.

Homo heidelbergensis evolved from Homo erectus in Africa between eight hundred thousand and six hundred thousand years ago. The timing of the erectus-heidelbergensis transition fits reasonably comfortably with the archaeological record, since evidence for the control of fire becomes particularly scarce before that period. The main changes in anatomy from Homo erectus to Homo heidelbergensis were an increase in cranial capacity (brain volume) of around 30 percent, a higher forehead, and a flatter face. These are smaller modifications than the differences between a chimpanzee and a gorilla, and they show little correspondence to changes in diet. So this transition does not look favorable: it is a possibility for when cooking began, but not a promising one.

The only other option is the original change, from habilines to Homo erectus. This shift happened between 1.9 million and 1.8 million years ago and involved much larger changes in anatomy than any subsequent transitions. Recall that in many ways habilines were apelike. Like the australopithecines, they appear to have had two effective styles of locomotion. They walked upright and can be reconstructed as having had sufficiently strong and mobile arms to be good climbers. Their small size must have helped them in trees. They are estimated to have stood about 1 to 1.3 meters tall (3 feet 3 inches to 4 feet 3 inches) and appear to have weighed about the same as a chimpanzee, around thirty-two kilograms (seventy pounds) for a female and thirty-seven kilograms (eighty-one pounds) for a male. Despite their small bodies, they had much bigger chewing teeth than in any subsequent species of the genus Homo: the surface areas of three representative chewing teeth decreased by 21 percent from habilines to early Homo erectus. Habilines’ larger teeth imply a bulky diet that required a lot of chewing.

Homo erectus did not exhibit the apelike features of the habilines. In the evolution of Homo erectus from habilines, we find the largest reduction in tooth size in the last six million years of human evolution, the largest increase in body size, and a disappearance of the shoulder, arm, and trunk adaptations that apparently enabled habilines to climb well. Additionally, Homo erectus had a less flared rib cage and a narrower pelvis than the australopithecines, both features indicating that they had a smaller gut. There was a 42 percent increase in cranial capacity. Homo erectus was also the first species in our lineage to extend its range beyond Africa: it was recorded in western Asia by 1.7 million years ago, Indonesia in Southeast Asia by 1.6 million years ago, and Spain by 1.4 million years ago. The reduction in tooth size, the signs of increased energy availability in larger brains and bodies, the indication of smaller guts, and the ability to exploit new kinds of habitat all support the idea that cooking was responsible for the evolution of Homo erectus.

Even the reduction in climbing ability fits the hypothesis that Homo erectus cooked. Homo erectus presumably climbed no better than modern humans do, unlike the agile habilines. This shift suggests that Homo erectus slept on the ground, a novel behavior that would have depended on their controlling fire to provide light to see predators and scare them away. Primates hardly ever sleep on the ground. Smaller species sleep in tree holes, in hidden nests, on branches hanging over water, on cliff ledges, or in trees so tall that no ground predator is likely to reach them. Great apes mostly build sleeping platforms or nests. The only nonhuman primate that regularly sleeps on the ground is the largest species of great ape, gorillas. Gorillas are safer on the ground than Homo erectus would have been because gorillas live in forests with few predators and they are relatively enormous. The most frequent ground sleepers are adult males, weighing around 127 kilograms (286 pounds). Smaller gorillas often sleep in trees.

The late Pliocene and early Pleistocene periods in Africa were rich in predators. In wooded areas from 4 million to 1.5 million years ago, our ancestors would have found saber-toothed cats. There was Megantereon, the size of a leopard, and Dinofelis, as big as a lion. In more open habitats there was the scimitar cat Homotherium, equally large. An extinct kind of lion and spotted hyena lived alongside our early ancestors, while modern lions and leopards have been present since at least 1.8 million years ago. There were also many large animals such as elephants, rhinoceroses, and buffalo-like ungulates that could stumble unawares onto an unconscious biped. The African woodlands would have been a very dangerous place to sleep on the ground.

Extrapolating from the behavior of living primates in predator-rich environments, the australopithecines and habilines surely slept in trees. Their habitats were well wooded and their upper-body anatomy suggests they climbed well. But what did Homo erectus do? The famous “Turkana boy,” a beautifully preserved specimen of Homo erectus dated between 1.51 and 1.56 million years ago provides excellent evidence that they climbed relatively poorly. Physical anthropologists Alan Walker and Pat Shipman have described the Turkana boy as committed to locomotion on the ground. His finger bones had lost the curved, robust shape of australopithecine fingers. His shoulder blade had the modern form, giving no indication of being adapted to the stresses of climbing with the arm above the shoulder. The Turkana boy is so well preserved that Walker was able to study the vestibular system of the inner ear, responsible for balance. Species that climb regularly have a large and characteristically shaped vestibular system. The Turkana boy’s is different from that of species that climb, but closely resembles the modern human system.

So the Turkana boy, like other Homo erectus, could not have climbed well and he therefore would have found it difficult to make the type of nest great apes sleep in. Chimpanzees take about five minutes to build their nests by standing on all fours where the nest is taking shape, bending branches toward themselves. They break some of the bigger ones and weave the branches together to form a platform that they finish off with a few leafy twigs that serve as cushions or pillows to make it comfortable. Making a nest depends on being able to move around easily on the end of a swaying branch. The long legs and flat feet of humans such as Homo erectus and modern people do not allow such agility. For a mother with a small infant, the gymnastic challenges of making a nest would have been particularly difficult given her need to cradle while she swayed in the tree.

Homo erectus therefore must have slept on the ground. But to do so in the dark of a moonless night seems impossibly dangerous. Homo erectus was as poorly defended a creature as we are, unable to sprint fast and dependent on weapons for any success in fighting. Surprised by a Dinofelis or a pack of hyenas at midnight, they would have been vulnerable.

If Homo erectus used fire, however, they could sleep in the same way as people do nowadays in the savanna. In the bush, people lie close to the fire and for most or all of the night someone is awake. When a sleeper awakens, he or she might poke at the fire and chat a while, allowing another to fall asleep. In a twelve-hour night with no light other than what the fire provides, there is no need to have a continuous eight-hour sleep. An informal system of guarding easily emerges that allows enough hours of sleep for all while ensuring the presence of an alert sentinel. To judge from records of attacks by jaguars, modern hunter-gatherers are safer in camp at night than they are on the hunt by day.

The control of fire could explain why Homo erectus lost their climbing ability. The normal assumption is that when long legs were favored, perhaps as a result of the increasing importance of long-distance travel as humans searched for meat, it was harder for humans to climb efficiently, and Homo erectus therefore abandoned the trees. But since that argument does not explain how Homo erectus could sleep safely, I prefer an alternative hypothesis: having controlled fire, a group of habilines learned that they could sleep safely on the ground. Their new practice of cooking roots and meat meant that food obtained from trees was less important than it had been when raw food was the only option. When they no longer needed to climb trees to find food or sleep safely, natural selection rapidly favored the anatomical changes that facilitated long-distance locomotion and led to living completely on the ground.





Two kinds of evidence thus point independently to the origin of Homo erectus as the time when cooking began. First, anatomical changes related to diet, including the reduction in tooth size and in the flaring of the rib cage, were larger than at any other time in human evolution, and they fit the theory that the nutritional quality of the diet improved and the food consumed was softer. Second, the loss of traits allowing efficient climbing marked a commitment to sleeping on the ground that is hard to explain without the control of fire.

The only alternative is the traditional theory that cooking was first practiced by beings that already looked like us—physically human members of the genus Homo. If this were true, then by the time our ancestors adopted cooking, Homo erectus would long since have adapted to a soft, easily chewed diet of high caloric density. But as we have seen, cold-processing techniques such as grinding and blending provide relatively poor energy even when carried out by raw-foodists with modern equipment.

For more than 2.5 million years our ancestors have been cutting meat off animal bones, and the impact was huge. A diet that included raw meat as well as plant foods pushed our forebears out of the australopithecine rut, initiated the evolution of their larger brains, and probably inspired a series of food-processing innovations. But according to the evidence carried in our bodies, it took the invention of cooking to convert habilines into Homo erectus and launch the journey that has continued, without any further major change, to the anatomy of modern humans.
6#

CHAPTER 5



Brain Foods


“Tell me what you eat, and I shall tell you what you are.”

—JEAN ANTHELME BRILLAT-SAVARIN, The Physiology of Taste: Or Meditations on Transcendental Gastronomy





“Man is but a reed, the weakest in nature, but he is a thinking reed,” wrote philosopher Blaise Pascal in 1670. Exceptional intelligence is the defining feature of our species, yet its origins have long been a puzzle. Darwin concluded that intellect would have given advantages in social competition and the struggle to survive, but why humans should be brainier than other species was unclear. Only recently has an explanation emerged. In the view of many evolutionary anthropologists, the pressure for intelligence indeed comes primarily from the advantages of outwitting social competitors, whereas a major reason for species differences is how much brainpower the body can afford. For this reason the quality of the diet has been identified as a key driver of the growth of primate brains. For humans, cooking must have played a major role.

Attempts to explain the evolution of intelligence have sometimes appealed to rather specific advantages. Evolutionary biologist Richard Alexander argues that because humans practice warfare, and brainpower is critical for planning raids and winning battles, higher intellect could have been favored by a long evolutionary history of intense intergroup violence. But this hypothesis is undermined by chimpanzees, which behave in ways similar to warfare in small-scale human societies, but without humans’ braininess. Violence between groups of chimpanzees is like a “shoot-on-sight” policy. Parties of males attack vulnerable rivals from adjacent groups whenever they encounter them, sometimes during incursions deep into the other group’s territory in search of victims. Death rates from these interactions among chimpanzees are similar to those in small-scale societies of humans, yet chimpanzees are much less brainy than humans, and only about as clever as their more peaceable relatives, bonobos, gorillas, and orangutans.

Another suggested explanation for the evolution of intelligence is more ecological than social. This line of thinking proposes that intellect would be favored in species that occupy large home ranges, on the theory that wide-roaming creatures would need exceptional brainpower to mentally map their territories. And indeed, human hunter-gatherers cover huge areas compared to the ranges of apes and monkeys. But the correlation between range size and brain size does not generalize. Species of primates with larger brains are more intelligent, but they show no overall tendency to have larger ranges. The association of intellect and range size in humans looks accidental; that is, there is no evidence for a causal effect of brain size on range size, or vice versa, across primate species.

A more promising approach assumes that numerous kinds of benefits come from being intelligent. Clever species can forage in a variety of creative ways, such as using grasses and twigs to extract insects from holes, or lifting stones as hammers to smash nuts. Big-brained species can also manage complex social relationships. Evolutionary psychologist Robin Dunbar found that primates with bigger brains or more neocortex live in larger groups, have a greater number of close social relationships, and use coalitions more effectively than those with smaller brains.

Brains pay off socially when they beat brawn. Relationships can change daily in primates that live in large groups, such as chimpanzees or baboons. Flexible coalitions in which two or more group members gang up on another group member allow small or individually low-status animals to compete successfully for access to resources and mates. Coalitions are difficult to manage because individuals compete for the best allies, and an ally today may be a rival tomorrow. Individuals must constantly reassess one another’s moods and strategies, and alter their own behavior accordingly. Clever animals can be deceitful too, deliberately hiding their feelings by masking facial expressions, or screaming to pretend they have been attacked when their real motive is to rally supporters to chase a dominant individual away from food. The result is a soap opera of changing affections, alliances, and hostilities, and a constant pressure to outsmart others.

Most animals are not up to the cognitive challenges of juggling social alliances. They compete one-on-one, like chickens, or following simple rules such as supporting members of their own group against outsiders. The exceptions are telling. Birds in the crow family have many of the social abilities of primates and are distinctly large-brained compared to other birds. Bottlenose dolphins form particularly complex and changeable alliances, and have the largest brains relative to body size of any nonhuman. Spotted hyenas live in large groups and use flexible coalitions to compete for power, and consistent with the primate evidence, they have bigger brains than their less social relatives. A similar link of sociality to mental power is found in social insects, whose neural tissue is concentrated not in brains but in ganglia. Darwin noted that colony-living ants and wasps have “cerebral ganglia of extraordinary dimensions,” many times larger than other insects.

These kinds of correlation have supported the social brain hypothesis, which says that large brains have evolved because intelligence is a vital component of social life. The hypothesis nicely explains how animals that live in groups can benefit from being clever by outwitting their rivals in competition over mates, food, allies, and status. It also explains why species with bigger brains tend to have more complex societies, and the hypothesis suggests that if a species has limited brainpower, its social options may be constrained as well: small-brained monkeys may be too dim to handle many social relationships.





The social brain hypothesis is very important in explaining a major benefit of being intelligent. Indeed, the advantages are so strong that we might expect all social primates to have developed big brains and high intellect. Yet there is wide variation. Lemurs are as small-brained as typical mammals. Apes have bigger brains than monkeys, and humans have the biggest brains of all. The social brain hypothesis does not explain these variations. It sets up this problem: if social intelligence is so important, why do some group-living species have smaller brains than others?

Diet provides a major part of the answer. In 1995 Leslie Aiello and Peter Wheeler proposed that the reason some animals have evolved big brains is that they have small guts, and small guts are made possible by a high-quality diet. Aiello and Wheeler’s head-spinning idea came from the realization that brains are exceptionally greedy for glucose—in other words, for energy. For an inactive person, every fifth meal is eaten solely to power the brain. Literally, our brains use around 20 percent of our basal metabolic rate—our energy budget when we are resting—even though they make up only about 2.5 percent of our body weight. Because human brains are so large, this proportion of energy expenditure is higher than it is in other animals: primates on average use about 13 percent of their basal metabolic rate on their brains, and most other mammals use less again, around 8 percent to 10 percent. As expected from the importance of maintaining energy flow to our many brain cells (neurons), genes that are responsible for energy metabolism show increased expression in the brains of humans compared to the brains of nonhuman primates. The high rate of energy flow is vital because our neurons need to keep firing whether we are awake or asleep. Even a brief interruption in the flow of oxygen or glucose causes neuron activity to stop, leading rapidly to death. The constant energy demand of brain cells continues even when times are tough, such as when food is scarce or an infection is raging. The first requirement for evolving a big brain is the ability to fuel it, and to do so reliably.
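The disproportion described above can be made concrete with a little back-of-the-envelope arithmetic (my illustration, not the book's): a tissue that makes up 2.5 percent of body weight but consumes 20 percent of the basal energy budget must be burning energy at roughly eight times the body's average per-gram rate.

```python
# Back-of-the-envelope illustration (not from the book): how much more
# energy-hungry brain tissue is, per gram, than the body-wide average.
def brain_energy_multiplier(share_of_bmr, share_of_weight):
    """Ratio of the brain's per-gram energy use to the whole-body average."""
    return share_of_bmr / share_of_weight

# Figures quoted in the text: human brain = ~20% of basal metabolic rate,
# ~2.5% of body weight.
human = brain_energy_multiplier(0.20, 0.025)
print(f"Human brain tissue: ~{human:.0f}x the body's average per-gram rate")
```

Running the same calculation with the other figures quoted in the text (13 percent of basal metabolic rate for an average primate, 8 to 10 percent for most other mammals) shows how steeply the human brain's energy claim stands out, and why a reliable fuel supply is the first requirement for evolving one.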

Given that large brains need large amounts of energy, Aiello and Wheeler asked themselves what special features of our species enable us to apportion more glucose to our brains than other animals do. One possibility is that humans might have a uniquely high rate of energy use. After all, human food is exceptionally calorie-dense and we routinely take in more energy per day than a typical primate of our body weight, so maybe extra energy running through our bodies gives us the calories we need to feed our hungry brains. But basal metabolic rates are well known in primates and other animals, and they are unremarkable in humans. A resting person supplies energy to their body at almost exactly the rate predicted for any primate of our body weight. Since nothing about basal metabolic rates is special to humans, Aiello and Wheeler were able to rule out the idea that our big brains are powered by inordinate amounts of energy passing through the body.

The elimination of the overall high-energy-use theory was a critical breakthrough because it left only one solution. Among species that have the same relative basal metabolic rate, such as humans and other primates, extra energy going to the brain must be offset by a reduced amount of energy going elsewhere. The question is what part of the body is shortchanged. Among primates, the size of most organs is closely predicted by body weight because of inescapable physiological rules. A species whose body weighs twice that of another needs a heart that weighs almost exactly twice as much. Hearts have to be a certain size to pump enough blood around a body of a certain size. No trade-off is possible there. Similar principles apply to kidneys, adrenals, and most other organs. But Aiello and Wheeler found a provocative exception to this tendency. They discovered that across the primates there is substantial variation in the relative weight of the intestinal system. Some species have big guts and some have small. The variation in gut size is linked to the quality of the diet.





Anyone who has handled tripe or cleaned a deer knows that mammals have a lot of gut tissue. Mammalian intestines have a high metabolic rate, and in large, mostly vegetarian species like great apes, intestines tend to be busy all day, starting with the postdawn meal and continuing ceaselessly until hours after the animal goes to sleep. All this time the guts are engaged in several energy-intensive functions, such as churning, making stomach acid, synthesizing digestive enzymes, or actively transporting digested molecules across the gut wall and into the blood. Active guts consume calories at a consistently high rate, so their total energy expenditure depends on their weight and on how much work they are doing. Carnivores, such as dogs and wolves, have smaller intestines than plant eaters, such as horses, cows, or antelope. In species that are adapted to eating more easily digested foods, such as sugar-rich fruits compared to fibrous leaves, guts are also relatively small: fruit-eating chimpanzees or spider monkeys have smaller guts than the leaf-eating gorillas or howler monkeys. Those reduced guts use less total energy than larger guts and therefore give a species with a high-quality diet some spare calories to allocate elsewhere in the body.

The discovery that gut size varies substantially gave Aiello and Wheeler the opening they were looking for. Relative to their body weight, primates with smaller guts proved to have larger brains—just the kind of trade-off that had been expected. Aiello and Wheeler estimated the number of calories a species is able to save by having a small gut, and showed that the number nicely matched the extra cost of the species’ larger brains. The anthropologists concluded that primates that spend less energy fueling their intestines can afford to power more brain tissue. Big brains are made possible by a reduction in expensive tissue. The idea became known as the expensive tissue hypothesis.

Some species other than primates show a similar pattern, capitalizing on small guts to evolve particularly large brains. An elephant-nosed mormyrid fish from Africa has a relatively tiny gut and is able to use an astonishing 60 percent of its energy budget to power its exceptionally large brain. Other animals follow the principle of an energy trade-off but gain muscle instead of brains. Birds that have small amounts of intestinal tissue tend to use their spare energy to grow bigger wing muscles, presumably because for a bird, better flight can be even more important than a bigger brain. Different kinds of trade-offs have also been proposed. Species with relatively low muscle mass have been found to have relatively large brains. The general lesson is that bigger brains must be paid for somehow. How animals with small guts make use of their energy savings depends on what matters to them. In primates the tendency to use energy saved by smaller guts for added brain tissue is particularly strong, presumably because most primates live in groups, where extra social intelligence has big payoffs.

The expensive tissue hypothesis predicted that major rises in human brain size would be associated with increases in diet quality. Aiello and Wheeler identified two such rises. The first brain-size expansion was around two million years ago from australopithecines to Homo erectus. In line with the Man-the-Hunter scenario, the scientists credited this rise in brain size to the increased eating of meat. Second was a little more than half a million years ago, when Homo erectus became Homo heidelbergensis. They attributed this rise to the only other obvious candidate for an improvement in dietary quality: cooking.





I believe that Aiello and Wheeler were right in their principles. But they were wrong in their specifics because they assumed there was only a single increase in brain size from australopithecines to Homo erectus. In actuality, that phase of our evolution occurred in two steps: first, the appearance of the habilines, and second, the appearance of Homo erectus. Meat eating and cooking account respectively for these two transitions, and therefore for their accompanying increases in brain size.

The expensive tissue hypothesis provides an explanation not only for the substantial increases in brain size that occurred around the time of human origins, but also for the many other rises in brain size before and after two million years ago. Consider first our last common ancestor with chimpanzees, which lived around five million to seven million years ago. We can reconstruct this pre-australopithecine ape as living in rain forest and resembling a chimpanzee. Closely related to gorillas as well as chimpanzees, these ancestors likely had brains comparable in volume to those found in great apes living today, and therefore had larger brains than are found in living monkeys. The apes’ big brains compared to those of monkeys are nicely explained by the expensive tissue hypothesis, because great apes have high-quality diets for their body weights. They eat relatively less fiber and fewer toxins than monkeys.

Chimpanzees have a cranial capacity of around 350 to 400 cubic centimeters (21.6 to 24.4 cubic inches). Australopithecines, with the same body weight as chimpanzees or even slightly less, had substantially larger cranial capacities, about 450 cubic centimeters (27.5 cubic inches). Following Aiello and Wheeler’s hypothesis, australopithecine diets should therefore have been higher in quality than the diets of living chimpanzees. This seems likely. During seasons of plenty, australopithecines would have eaten much the same diet as chimpanzees or baboons do when living in the kinds of woodland that australopithecines occupied—fruits, occasional honey, soft seeds, and other choice plant items. It was when fruits were scarce that australopithecines must have eaten better than their chimpanzee-like ancestors. Present-day chimpanzees that are short of fruit turn to items specific to their rain-forest homes, eating foliage such as the stems of giant herbs and the soft young leaves of forest trees. In their drier woodlands australopithecines would have found few such items. The most likely alternatives were starch-filled roots and other underground or underwater storage tissues of herbaceous plants. These would have been ideal.

Carbohydrates are stored abundantly in corms, rhizomes, or tubers of many savanna plants and are highly concentrated sources of energy-rich starch in the dry season. These food reserves are so well hidden that few animals can find them, but chimpanzees do dig for tubers occasionally, sometimes with sticks, and australopithecines would have been at least as skillful and well-adapted: their chewing teeth are famously massive and somewhat piglike, suited to crushing roots and corms. An important location for australopithecine food sources likely would have been the edges of rivers and lakes, where sedges, water lilies, and cattails grow well and provide a natural supermarket of starchy foods for hunter-gatherers today.

The underground energy-storage organs of plants have a quality anticipated by the expensive tissue hypothesis: they have less indigestible fiber from plant cell walls than foliage, making them easier to digest and therefore a food of higher value. A dietary change from foliage to higher quality roots is thus a plausible explanation for the first increase in brain size, from forest apes to australopithecines five million to seven million years ago.

During the second sharp increase, brain volume rose by about one-third, from the roughly 450 cubic centimeters (27 cubic inches) of australopithecines to 612 cubic centimeters (37 cubic inches) in habilines (based on measurements of five skulls). The body weights of australopithecines and habilines were about the same, so this was a substantial gain in relative brain size. Given the archaeological evidence, the big dietary change at this time was more meat eating, so meat should have made this brain growth possible. To account for such a large increase in brain size, it seems likely that habilines processed their meat. Apes and humans are disadvantaged: their teeth cannot cut meat easily, their mouths are relatively small, and as William Beaumont noticed in the case of Alexis St. Martin, their stomachs do not process hunks of raw meat efficiently.

Chimpanzees also show that eating unprocessed meat is difficult with ape jaws. They chew their animal prey intensely, but small bits of undigested meat sometimes appear in their feces. Perhaps because of this hard work and inefficiency, chimpanzees sometimes decline the opportunity to eat meat despite their usual enormous enthusiasm for it. After chewing meat for an hour or two, a chimpanzee can abandon a carcass and relax or eat fruit instead. Chimpanzees of the Kanyawara community in Kibale National Park, Uganda, occasionally forgo meat-eating opportunities without chewing muscle at all. I once saw Johnny, an avid chimpanzee hunter of red colobus monkeys, do this even though he appeared hungry for animal protein. He first killed an infant red colobus monkey, brought it to the ground, ate its intestines, then left the carcass lying unseen by other chimpanzees. He immediately returned to the trees, rapidly killed another infant, and repeated his prior actions: he again brought his prey to the ground, ate the intestines, and left the rest to rot. His preference for the softer parts was typical. When chimpanzees kill a prey animal, they normally eat such parts as the guts, liver, or brain first. They can swallow those quickly. But when eating muscle, chimpanzees are forced to chew it slowly, taking as much as an hour to chew one-third of a kilogram (three-quarters of a pound). They can get as many calories per hour by chewing fruits as they can by chewing meat. The habilines would have faced the same challenge. If they had relied on unprocessed meat for as much as half their calories and had eaten their meat as slowly as chimpanzees, with certain cuts of meat they would have had to spend several hours a day chewing it. The digestive costs likewise would have been high, since the gut would have been busy digesting for many hours.

A system for hastening chewing and digestion by processing the meat would have greatly reduced the problem. Chimpanzees have a primitive form of processing meat. By adding tree leaves to their meat meals, they make chewing easier. The chosen leaves have no special nutritional properties, judging from the fact that the meat eaters pick leaves from whatever species of tree is nearest when they settle down to eat their prey. The only obvious rule governing their choice is that the leaf must be tough: they take only mature tree leaves, not young tree leaves or the soft leaves of an herb. Sometimes they even use long-dead leaves from the forest floor, mere brown skeletons devoid of nutrients. An informal experiment in which friends and I chewed raw goat meat suggested that the added leaves give traction. When we chewed thigh muscle together with a mature avocado leaf, the bolus of chewed meat was reduced faster than when we chewed with no added leaf. Australopithecines probably used similar practices when they caught gazelle fawns or other small mammals.

Habilines had access to more advanced techniques. Their bones are found close to stone hammers, fist-size spheres whose shapes provide vivid testimony of their repeated use. Habilines probably used the hammers partly to smash prey bones to extract the marrow. They also doubtless used the hammers to crack open nuts, as West African chimpanzees do, as well as to make other tools. In addition to these practices, stone hammers or wooden clubs could equally have been used for tenderizing meat. After habilines cut hunks of meat off the carcasses of game animals, they may have sliced them into steaks, laid them on flat stones, and pounded them with logs or rocks. Even relatively crude hammering would have reduced the costs of digestion by tenderizing the meat and breaking connective tissue. Because raw unprocessed meat is difficult to chew and digest, I suspect this was one of the most important cultural innovations in human origins, enabling habilines to increase the nutritional benefit of meat and the speed with which they could eat and digest it. Tenderizing meat would have reduced the costs of digestion by cutting the time that meat was in the stomach, and thus allowed habilines to divert energy toward their brains.

Dietary shifts toward roots, meat eating, and meat processing thus can explain the growth in brains from a chimpanzee-like ancestor at six million years to the habilines around two million years ago. From then on, the increases in brain size were more continuous. The habiline cranial capacity of 612 cubic centimeters (37 cubic inches) rose by over 40 percent to reach an average of 870 cubic centimeters (53 cubic inches) in the earliest measured Homo erectus. The significance of this rise is complicated by a parallel growth in body weight, from the lowly 32 to 37 kilograms (70 to 81 pounds) of habilines to a substantial 56 to 66 kilograms (123 to 145 pounds) in Homo erectus. Unfortunately, body weights are hard to estimate accurately from bones, and the number of specimens is small, so how much larger relative to body weight the brains of the first Homo erectus were than those of habilines, or whether they were relatively larger at all, is uncertain. However, Homo erectus brains continued to increase in size after 1.8 million years ago, averaging almost 950 cubic centimeters (58 cubic inches) by 1 million years ago. Given the evidence and arguments I have offered that Homo erectus originated as cooks, the expensive tissue hypothesis suggests their eating cooked food caused their brains to grow. Once cooking began, gut size could fall and the gut would be less active, both trends reducing the cost of the digestive system.
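The cranial-capacity steps quoted above are simple ratios and can be tallied in a short sketch. This is my own illustration, not from the book; the volumes are the averages cited in the text, with 375 cc taken as a midpoint of the chimpanzee range. It reproduces the stated increases of roughly one-third (australopithecines to habilines) and over 40 percent (habilines to early Homo erectus):

```python
# Cranial capacities (cubic centimeters) as quoted in the text.
# 375 cc is an assumed midpoint of the 350-400 cc chimpanzee range.
stages = {
    "chimpanzee": 375,
    "australopithecine": 450,
    "habiline": 612,
    "early Homo erectus": 870,
    "late Homo erectus": 950,
    "Homo heidelbergensis": 1200,
    "Homo sapiens": 1400,
}

names = list(stages)
for earlier, later in zip(names, names[1:]):
    pct = 100 * (stages[later] / stages[earlier] - 1)
    print(f"{earlier} -> {later}: +{pct:.0f}%")
```

Running this shows each jump as a percentage gain, matching the book's figures step by step.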





The fourth notable increase in cranial capacity occurred with the emergence of Homo heidelbergensis after eight hundred thousand years ago. The increase was again substantial, leading to a brain occupying around 1,200 cubic centimeters (73 cubic inches). This was the impressive rise that Aiello and Wheeler attributed to the invention of cooking—mistakenly, I believe. It remains a mystery, inviting speculation.

More efficient hunting is a possibility. Hartmut Thieme’s evidence of group hunting four hundred thousand years ago in Schöningen suggests a marked improvement in hunting skills over earlier eras. This raises the possibility that meat intake, and perhaps therefore the use of animal fat, rose significantly before this time and played a role in the evolution of Homo erectus into Homo heidelbergensis.

Alternatively, cooking surely continued to affect brain evolution long after it was invented, because cooking methods improved. Laying a food item on the fire presumably was the main early method. Such techniques have been used by generations of campers and have been recorded by hunter-gatherers in recent times for foods that are easy to cook. The Aranda foragers of central Australia gather pea-size corms of sedges by digging them from flat ground near rivers. One method of cooking consists merely of laying them on hot ashes for a short time, then rubbing them between the hands to remove the light shell before eating them. !Kung San hunter-gatherers of Africa’s Kalahari Desert cook tsin beans, one of their more important foods, by simply burying them in hot ashes. Putting an animal on a fire to roast can work fairly well, especially if the hairs have been singed off first. Marrow can be cooked with similar efficiency by roasting a complete bone in fire, then using stones to crack it. The marrow flows out like warm butter.

More complex ways to roast presumably would have accumulated slowly, often specific to particular foods. Take mongongo nuts eaten by !Kung hunter-gatherers. Mongongo nuts are a highly nutritious staple, often providing the !Kung with their major source of calories for weeks on end. To cook them, a woman mixes the coals from a dying fire with hot, dry sand. She then buries scores of nuts in the hot pile without allowing the nuts to touch any of the live coals. After a few minutes she kneads the pile to ensure that the nuts are evenly heated, adding more coals as needed. When the nuts are done, she hammers each one to split it, then eats the seeds inside or keeps them for further cooking. We do not know when such a sophisticated method appeared, but it seems likely to have contributed to raising the energetic quality of food, reducing the time the digestive system was active, and so lowering the total costs of digestion and allowing more energy for the brain.

Such improvements in cooking efficiency could explain why there was a steady upward trend in brain size during the lifetimes of the early human species. Brains were notably bigger in late Homo erectus than in early Homo erectus, and in late Homo heidelbergensis than in early Homo heidelbergensis. Major dietary breakthroughs such as meat eating and the invention of cooking cannot account for these smaller changes. The steady rise in brain size between the major jumps is most easily explained by a series of improvements in cooking techniques. Perhaps some particularly important advances enabled the prominent rise in brain size with Homo heidelbergensis.





The same possibility applies to the evolution of our own species, Homo sapiens, around two hundred thousand years ago. The gain in brain size was relatively minor, from 1,200 cubic centimeters (73 cubic inches) in Homo heidelbergensis to around 1,400 cubic centimeters (85 cubic inches) in Homo sapiens. Various modern behaviors are seen for the first time around this transition, such as the use of red ocher (presumably as a form of personal decoration), making tools out of bone, and long-distance trade. Increasing behavioral sophistication could also have happened in cooking techniques.

An early form of earth oven is the kind of innovation that could have been influential because it would have marked an important advance in cooking efficiency. Hunter-gatherers worldwide used earth ovens that employed hot rocks. The ovens do not appear to have been used by the people who expanded out of Africa more than sixty thousand years ago and colonized the rest of the world, since they are not recorded in Australia until thirty thousand years ago. However, it is possible that a simpler design, now vanished and forgotten, may have been used in earlier times.

In recent earth ovens the hot rocks provide an even, long-lasting heat. A typical procedure recorded in 1927 among the Aranda of central Australia involved digging a hole, filling it with a pile of dry wood, and topping that with large stones that did not crack when heated—often river cobblestones that had to be carried from a distance. When the stones were red-hot and fell through the fire, they were pulled out with sticks and the ashes were removed. The hot stones were then returned and covered with a layer of green leaves. Cooks liked to wrap meat in leaves to retain its juices before placing it on this layer, sometimes on top of a plant food such as roots. More green leaves and perhaps a basket mat would be laid on top, water was poured on, and some people added herbs for taste. Finally, the hole was filled with a layer of soil to retain the steam. After an hour or more—sometimes it was left overnight—the meat and vegetables would be ready and superb. The meat was laid on leafy branches, carved with a stone knife, and served. The even heat and moist environment made earth ovens efficient for gelatinizing starch and other carbohydrates, and they offered effective control over the tenderness of meat. This sophisticated cooking technique doubtless increased the digestibility of the meat and plant foods.

Likewise, the use of containers must have made cooking more efficient and might have contributed to reducing digestive costs and thus allowing increases in brain size. Pottery is a very recent invention, around ten thousand years ago, but natural objects could have been used as cooking containers long before that. Certain animals come with their own dishes. Shellfish, such as mussels, have been cooked whole in many parts of the world by being thrown into a fire until the valves open. The Yahgan of Tierra del Fuego used mussel shells to catch the drips from a roasting seal or to hold whale oil, which they ate by dipping pieces of edible fungus into it.

It is a small step from such techniques to cooking in a container. Heating in natural containers by early Homo sapiens is indicated around 120,000 years ago by evidence that people made a glue from ancient birch tar, which they used to haft stone points on to spears. The glue had to be heated to achieve the desired stickiness, so people must have been cooking with containers by then. Some containers would have needed little imagination. Turtles are a natural convenience food because they can be easily kept alive for days, and whether alive or cooked they are easily carried. If they are turned upside down they even provide their own cooking pot. After their flesh has been eaten, their bodies remain useful. Andaman Islanders from the Bay of Bengal cooked turtle blood in an upside-down shell until it was thick, then ate it at once. Like many Asian peoples they also used bamboo as a container, sometimes for cooking. The Andaman Islanders would clean a length of bamboo and heat it over a fire so all its juices were absorbed. They then packed it with half-cooked pieces of wild pork or other meat and heated it so slowly that the meat swelled without cracking the bamboo. When the bamboo stopped steaming, they removed it from the fire and stuffed the opening with leaves to seal it. The cooked meat could be left for several days. Sadly, many ingenious cooking techniques practiced by early people with plant materials are forever lost to us because they leave no traces.

Development of other methods would have improved the efficiency of cooking and the quality of food. Various special ways of roasting have unknown antiquity. In their cold climate near the Antarctic, the Yahgan developed a two-stone griddle by heating two flat stones in a fire. The stones were then withdrawn, and the larger stone served as a griddle for a steak or layer of blubber, while the smaller was laid on top. This worked so well that the fat was browned and shriveled in a few minutes, a favorite for the hunters. The Yahgan were also fond of sausages. To make a sea-lion blood sausage, they kept the blood that collects in the abdominal cavity of a freshly killed sea lion. They took a soft, still moist piece of gut, turned it inside out, cleaned it, tied it shut at one end with sinews, filled it with air by blowing, tied the other end shut, and left it to dry. When the empty sausage was sufficiently firm, they used a large shell to fill it with blood, tied it shut again, and for safety’s sake jabbed a short, thin stick into each end to prevent the ties from unraveling. They then put the sausage into hot ashes, poking it occasionally to keep it from bursting. The same idea worked equally well with other parts of the gut. They sometimes filled stomachs with blubber or chopped tissues like heart, lungs, or liver. These haggises of the past left no traces, but they remind us that even in the bush, long before such recent inventions as grinding and stone boiling (which started within the past twenty-five thousand to forty thousand years), cooking can involve much more than simple heating.





Although the breakthrough of using fire at all would have been the biggest culinary leap, the subsequent discovery of better ways to prepare the food would have led to continual increases in digestive efficiency, leaving more energy for brain growth. The improvements would have been especially important for brain growth after birth, since easily digested weaning foods would have been critical contributors to a child’s energy supply. Advances in food preparation may thus have contributed to the extraordinary continuing rise in brain size through two million years of human evolution—a trajectory of increasing brain size that has been faster and longer-lasting than known for any other species. When Charles Darwin called cooking “probably the greatest [discovery], excepting language, ever made by man,” he was thinking merely of our improved food supply. But the idea that brain enlargement was made possible by improvements in diet suggests a wider significance. Cooking was a great discovery not merely because it gave us better food, or even because it made us physically human. It did something even more important: it helped make our brains uniquely large, providing a dull human body with a brilliant human mind.
7#

Could this also be understood as our ancestors having mastered the use of fire?
8#

CHAPTER 6



How Cooking Frees Men


“Voracious animals . . . both feed continually and as incessantly eliminate, leading a life truly inimical to philosophy and music, as Plato has said, whereas nobler and more perfect animals neither eat nor eliminate continually.”

—GALEN, Galen on the Usefulness of the Parts of the Body





Diet has long been considered a key to understanding social behavior across species. The food quest is fundamental to evolutionary success, and social strategies affect how well individuals eat. Group size in chimpanzees rapidly adjusts to monthly changes in the density and distribution of fruiting trees. Chimpanzee society differs markedly from gorilla society, thanks to the gorillas’ reliance on herbs. Humans are no exception to such relationships. The Man-the-Hunter hypothesis has inspired such potent explanations of bonding between males and females that it has seemed to some researchers that no other explanation is necessary. In 1968 physical anthropologists Sherwood Washburn and Chet Lancaster wrote, “Our intellect, interests, emotions and basic social life, all are evolutionary products of the hunting adaptation.” Such ideas have been highly influential, but they have rarely looked beyond meat. The adoption of cooking must have radically changed the way our ancestors ate, in ways that would have changed our social behavior too.

Take softness. Foods soften when they are cooked, and as a result, cooked food can be eaten more quickly than raw food. Reliance on cooked food has therefore allowed our species to thoroughly restructure the working day. Instead of chewing for half of their time, as great apes tend to do, women in subsistence societies tend to spend the active part of their days collecting and preparing food. Men, liberated from the simple biological demands of a long day’s commitment to chewing raw food, engage in productive or unproductive labor as they wish. In fact, I believe that cooking has made possible one of the most distinctive features of human society: the modern form of the sexual division of labor.





The sexual division of labor refers to women and men making different and complementary contributions to the household economy. Though the specific activities of each sex vary by culture, the gendered division of labor is a human universal. It is therefore assumed to have appeared well before modern humans started spreading across the globe sixty thousand to seventy thousand years ago. So discussion of the evolution of the sexual division of labor centers on hunter-gatherers. The 750-strong Hadza tribe are one such group. They live in northern Tanzania, scattered among a series of small camps in dry bush country around a shallow lake.

The Hadza are modern-day people. Neighboring farmers and pastoralists trade with them and marry some of their daughters. Government officials, tourists, and researchers visit them. The Hadza use metal knives and money, wear cotton clothes, hunt with dogs, and occasionally trade for agricultural foods. Much has changed since the time, perhaps two thousand years ago, when they last lived in an exclusive world of hunter-gatherers. Nevertheless, they are one of the few remaining peoples who obtain the majority of their food by foraging in an African woodland of a type that was once occupied by ancient humans.

Dawn sees people emerging from their sleeping huts to eat scraps of food from the previous night’s meal. As consensus quietly develops about the day’s activity, most of the women in camp—six or more, perhaps—take up their digging sticks and go toward a familiar ekwa patch a couple of kilometers (more than a mile) away. Some take their babies in slings, and one or more carries a smoldering log with which to start a fire if needed. Older children walk alongside. Meanwhile, in ones and twos, various men and their dogs also walk off with bows and arrows in hand. Some men are going hunting, others to visit neighbors. A scattering of people remain in camp—a couple of old women, perhaps, looking after toddlers whose mothers have gone for food, and a young man resting after a long hunt the previous day.

The women walk slowly, in pace with the younger children. They stop occasionally to pick small fruits that they eat on the spot. After less than an hour they break into smaller parties as each forager finds her own choice site in calling distance of her companions. The digging is hard and uncomfortable but it does not take long. A couple of hours later the women’s karosses—cloaks made of animal skins—are covered in piles of thick, brown, foot-long roots. These ekwa tubers are a year-round staple for the Hadza, always easily found. As the karosses fill, someone starts a fire, and shortly afterward the foragers gather for a well-deserved snack. They bake their ekwa by leaning the tubers against the coals. In barely twenty minutes, the smaller ones are ready. After the simple meal, some women chat while others dig up a few more ekwa to make sure they have enough for the rest of the day. Most have found other foods as well—a few bulbs, perhaps. They tie up their karosses and start homeward. Each woman totes at least 15 kilograms (33 pounds). They are back in camp by early afternoon, tired from the hard work.

Anthropologists sometimes debate whether hunting and gathering is a relaxed way of life. Lorna Marshall worked alongside Nyae Nyae !Kung women gathering in the Kalahari in the 1950s. “They did not have pleasurable satisfaction,” she said, “in remembering their hot, monotonous, arduous days of digging and picking and trudging home with their heavy loads.” But times and cultures vary. Anthropologist Phyllis Kaberry, who worked with aborigines in the Kimberley region of northwestern Australia, said the women enjoyed one another’s company and their foraging routine.

Back in the Hadza camp, each woman empties her kaross in her own hut. By early evening she has a fire, and a pile of ekwa lies baked and ready. She hopes the men will bring some meat to complete the meal. During the evening hours several men return. Some have honey, a few have nothing, and one arrives with the carcass of a warthog. After he singes the animal’s hair off in a fire, men and women gather to divide it. Following the typical practice of hunter-gatherers, many men in the camp get a share, but the successful hunter makes sure his friends, family, and relatives get the most. Soon each household fire is cooking meat. The delicious smells enrich the night air. The meat and the roasted ekwa are quickly consumed. As the camp settles into sleep, enough ekwa remains for breakfast the following day.

The Hadza illustrate two major features of the sexual division of labor among hunter-gatherers that differentiate humans sharply from nonhuman primates. Women and men spend their days seeking different kinds of foods, and the foods they obtain are eaten by both sexes. Why our species forages in such an unusual way (compared to primates and all other animals, whose adults do not share food with one another) has never been fully resolved. There are many variations in the particular foods obtained. Tierra del Fuego’s bitter climate provided few plant foods, so while men hunted sea mammals, women would dive for shellfish in the frigid shallows. In the tropical islands of northern Australia, there was so much plant food that women brought enough to feed all the family and still found time to hunt occasional small animals. Men there did little hunting, mostly playing politics instead.

Although the specific food types varied from place to place, women always tended to provide the staples, whether roots, seeds, or shellfish. These foods normally needed processing, which could involve a lot of time and laborious work. Many Australian tribes prepared a kind of bread called damper from small seeds, such as from grasses. Women gathered the plants and heaped them so their seeds would drop and collect in a pile. They threshed the seeds by trampling, pounding, or rubbing them in their hands, winnowed them in long bark dishes, and ground them into a paste. The result was occasionally eaten raw but was more often cooked on hot ashes. The whole process could take more than a day. Women worked hard at such tasks because their children and husbands relied on the staples women prepared.

Men, by contrast, tended to search for foods that were especially appreciated but could not be found easily or predictably. They hoped for such prizes as meat and honey, which tended to come in large amounts and tasted delicious. Their arrival in camp made the difference between happiness and sadness. Phyllis Kaberry’s description of an aborigine camp in western Australia is typical: “The Aborigines continually craved for meat, and any man was apt to declare, ‘me hungry alonga bingy,’ though he had had a good meal of yams and damper a few minutes before. The camp on such occasions became glum, lethargic, and unenthusiastic about dancing.” Hunting large game was a predominantly masculine activity in 99.3 percent of recent societies.

Hints of comparable sex differences in food procurement have been detected in primates. Female lemurs tend to eat more of the preferred foods than males. In various monkeys such as macaques, guenons, and mangabeys, females eat more insects and males eat more fruit. Among chimpanzees, females eat more termites and ants, and males eat more meat. But such differences are minor because in every nonhuman primate the overwhelming majority of the foods collected and eaten by females and males are the same types.

Even more distinctive of humans is that each sex eats not only from the food items they have collected themselves, but also from their partner’s finds. Not even a hint of this complementarity is found among nonhuman primates. Plenty of primates, such as gibbons and gorillas, have family groups. Females and males in those species spend all day together, are nice to each other, and bring up their offspring together, but, unlike people, the adults never give each other food. Human couples, by contrast, are expected to do so.

In foraging societies a woman always shares her food with her husband and children, and she gives little to anyone other than close kin. Men likewise share with their wives, whether they have received meat from other men or have brought it to camp themselves and shared part of it with other men. The exchanges between wife and husband permeate families in every society. The contributions might involve women digging roots and men hunting meat in one culture, or women shopping and men earning a salary in another. No matter the specific items each partner contributes, human families are unique compared to the social arrangements of other species because each household is a little economy.





Attempts to understand how the sexual division of labor arose in our evolutionary history have been strongly affected by whether women or men are thought to have provided more of the food. It used to be thought that women typically produced most of the calories, as occurs among the Hadza. Worldwide across foraging groups, however, men probably supplied the bulk of the food calories more often than women did. This is particularly true in the high, colder latitudes where there are few edible plants, and hunting is the main way to get food. In an analysis of nine well-studied groups, the proportion of calories that came from foods collected by women ranged from a maximum of 57 percent, in the desert-living G/wi Bushmen of Namibia, down to a low of 16 percent in the Aché Indians of Paraguay. Women provided one-third of the calories in these societies, and men two-thirds. But such averages do not give an accurate sense of the value of items each sex contributes. At different times of year, the relative importance of foods obtained by women and men can change, and overall each sex’s foods can be just as critical as the other’s in maintaining health and survival. Furthermore, each sex makes vital contributions to the overall household economy regardless of any difference in the proportion of food calories contributed.

The division of labor by sex affects both household subsistence and society as a whole. Sociologist Emile Durkheim thought that its most important result was to promote moral standards, by creating a bond within the family. Specialization of labor also increases productivity by allowing women and men to become more skilled at their particular tasks, which promotes efficient use of time and resources. It is even thought to be associated with the evolution of some emotional and intellectual skills, because our reliance on sharing requires a cooperative temperament and exceptional intelligence. For such reasons anthropologists Jane and Chet Lancaster described the sexual division of labor as the “fundamental platform of behavior for the genus Homo,” and the “true watershed for differentiating ape from human lifeways.” Whether they were right in thinking the division began with the genus Homo is debated. Though I agree with the Lancasters, many think the division of labor by sex started much later. But there is no doubt of its importance in making us who we are.

The classic explanation in physical anthropology for this social structure is essentially what Jean Anthelme Brillat-Savarin proposed: when meat became an important part of the human diet, it was harder for females than males to obtain. Males with a surplus would have offered some to females, who would have appreciated the gift and returned the favor by gathering plant foods to share with males. The result was an incipient household. Physical anthropologist Sherwood Washburn put it this way:

When males hunt and females gather, the results are shared and given to the young, and the habitual sharing between a male, a female, and their offspring becomes the basis for the human family. According to this view, the human family is the result of the reciprocity of hunting, the addition of a male to the mother-plus-young social group of the monkeys and apes.

Washburn’s statement captures a core feature of conventional wisdom, which is that the way to explain the evolution of the sexual division of labor is to imagine that, together, meat eating and plant eating allowed a household. An unstated assumption was that the food was raw. But if food was raw, the sexual division of labor is unworkable. Nowadays a man who has spent most of the day hunting can satisfy his hunger easily when he returns to camp, because his evening meal is cooked. But if the food waiting for him in camp had all been raw, he would have had a major problem.

The difficulty lies in the large amount of time it takes to eat raw food. Great apes allow us to estimate it. Simply because they are big—30 kilograms (66 pounds) and more—they need a lot of food and a lot of time to chew. Chimpanzees in Gombe National Park, Tanzania, spend more than six hours a day chewing. Six hours may seem high considering that most of their food is ripe fruit. Bananas or grapefruit would slip down their throats easily, and for this reason, chimpanzees readily raid the plantations of people living near their territories. But wild fruits are not nearly as rewarding as those domesticated fruits. The edible pulp of a forest fruit is often physically hard, and it may be protected by a skin, coat, or hairs that have to be removed. Most fruits have to be chewed for a long time before the pulp can be fully detached from the pieces of skin or seeds, and before the solid pieces are mashed enough to give up their valuable nutrients. Leaves, the next most important food for chimpanzees, are also tough and likewise take a long time to chew into pieces small enough for efficient digestion. The other great apes (bonobos, gorillas, and orangutans) commit similarly long hours to chewing their food. Because the amount of time spent chewing is related to body size among primates, we can estimate how long humans would be obliged to spend chewing if we lived on the same kind of raw food that great apes do. Conservatively, it would be 42 percent of the day, or just over five hours of chewing in a twelve-hour day.
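The chewing-time extrapolation above reduces to simple arithmetic once the 42 percent figure is accepted. A minimal sketch, taking the 42 percent share and the twelve-hour day directly from the text (the percentage itself comes from the body-size regression, which is not reproduced here):

```python
# Back-of-envelope check of the chewing-time figures quoted above.
# The 42% share is taken from the text; the rest is arithmetic.
day_hours = 12                  # waking day assumed in the text
chimp_chewing_hours = 6         # Gombe chimpanzees, per the text
human_raw_fraction = 0.42       # predicted share of the day chewing raw food

predicted_hours = human_raw_fraction * day_hours
print(f"Predicted raw-food chewing time: {predicted_hours:.2f} hours")  # 5.04
```

The result, just over five hours, matches the "conservative" estimate in the passage.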

People spend much less than five hours per day chewing their foods. Brillat-Savarin claimed to have seen the vicar of Bregnier eat the following within forty-five minutes: a bowl of soup, two dishes of boiled beef, a leg of mutton, a handsome capon, a generous salad, a ninety-degree wedge from a good-sized white cheese, a bottle of wine, and a carafe of water. If Brillat-Savarin was not exaggerating, the amount of food eaten by the vicar in less than an hour would have provided enough calories for a day or more. It is hard to imagine a wild chimpanzee achieving such a feat.

A few careful studies using direct observation confirm how relatively quickly humans eat their food. In the United States, children from nine to twelve years of age spend a mere 10 percent of their time eating, or just over an hour per twelve-hour day. This is close to the daily chewing time for children recorded by anthropologists in twelve subsistence societies around the world, from the Ye’kwana of Venezuela to the Kipsigi of Kenya and the Samoans of the South Pacific. Girls ages six to fifteen chewed for an average of 8 percent of the day, with a range of 4 percent to 13 percent. Results for boys were almost identical: they chewed for an average of 7 percent of the day, again ranging from 4 percent to 13 percent.

The children’s data show little difference between the industrialized United States and subsistence societies. In the twelve measured cultures, adults chewed for even less time than the children. Women and men each spent an average of 5 percent of their time chewing. One might object that the people in the subsistence societies were observed only from dawn to dusk. Since people often have a big meal after dark, the total time eating per day might be more than indicated by the 5 percent figure, which translates to only thirty-six minutes in a twelve-hour day. But even if people chewed their evening meals for an hour after dark, which is an improbably long time, the total time spent eating would still be less than 12 percent of a fourteen-hour day, allowing two hours for the evening meal. However we look at the data, humans devote between a fifth and a tenth as much time to chewing as do the great apes.

This reduction in chewing time clearly results from cooked food being softer. Processed plant foods experience similar physical changes to those of meat. As the food canning industry knows all too well, it is hard to retain a crisp, fresh texture in heated vegetables or fruits. Plant cells are normally glued together by pectic polysaccharides. These chemicals degrade when heated, causing the cells to separate and permitting teeth to divide the tissue more easily. Hot cells also lose rigidity, a result of both their walls swelling and their membranes being disrupted by denaturation of proteins. The consequences are predictable. By measuring the amount of force needed to initiate a crack in food, researchers have shown that softness (or hardness) closely predicts the number of times someone chews before swallowing. The effect works for animals too. Wild monkeys spend almost twice as long chewing per day if their food is low-quality. Observers have recorded the amount of time spent chewing by wild primates that obtain human foods (such as garbage stolen from hotels). As the proportion of human foods rises in the diet, the primates spend less time chewing, down to less than 10 percent when all of the food comes from humans.

Six hours of chewing per day for a chimpanzee mother who consumes 1,800 calories per day means that she ingests food at a rate of around 300 calories per hour of chewing. Humans comparatively bolt their food. If adults eat 2,000 to 2,500 calories a day, as many people do, the fact that they chew for only about one hour per day means that their intake rate averages 2,000 to 2,500 calories an hour or higher, or more than six times the rate for a chimpanzee. The rate is doubtless much more when people eat high-calorie foods, such as hamburgers, candy bars, and holiday feasts. Humans have clearly had a long history of much more intense calorie consumption than primates are used to. Thanks to cooking, we save ourselves around four hours of chewing time per day.
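The intake-rate comparison is pure arithmetic on the figures quoted above, and is easy to verify:

```python
# Calorie-intake rates per hour of chewing, using the numbers in the text.
chimp_rate = 1800 / 6               # chimpanzee mother: 300 cal/hour
human_low = 2000 / 1                # human adult, ~1 hour of chewing per day
human_high = 2500 / 1

print(chimp_rate)                                   # 300.0
print(human_low / chimp_rate, human_high / chimp_rate)  # ratio: ~6.7x to ~8.3x
```

Even the low end of the human range is more than six times the chimpanzee rate, as the passage states.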





Before our ancestors cooked, then, they had much less free time. Their options for subsistence activities would therefore have been severely constrained. Males could not afford to spend all day hunting, because if they failed to get any prey, they would have had to fill their bellies on plant foods instead, which would take a long time just to chew. Consider chimpanzees, who hunt little and whose raw-food diet can be safely assumed to be similar to the diet of australopithecines. At Ngogo, Uganda, chimpanzees hunt intensely compared to other chimpanzee populations, yet males still average less than three minutes per day hunting. Human hunters have lots of time and walk for hours in the search for prey. A recent review of eight hunter-gatherer societies found that men hunted for between 1.8 and 8.2 hours daily. Hadza men were close to the average, spending more than 4 hours a day hunting—about eighty times as long as an Ngogo chimpanzee.

Almost all hunts by chimpanzees follow a chance encounter during such routine activities as patrolling their territorial boundaries, suggesting that chimpanzees are unwilling to risk spending time on a hopeful search. When chimpanzees hunt their favorite prey—red colobus monkeys—the colobus rarely move out of the tree where they are attacked. The monkeys appear to feel safer staying in one place, rather than jumping to adjacent trees where chimpanzees might ambush them. The monkeys’ immobility allows chimpanzees to alternate between sitting under the prey and making repeated rushes at them. In theory, the chimpanzees could spend hours pursuing this prey. But at Ngogo the longest hunt observed was just over one hour, and the average length of hunts is only eighteen minutes. At Gombe I found that the average interval between plant-feeding bouts was twenty minutes, almost the same as the length of a hunt. The similarity between the average hunt duration and the average interval between plant-feeding bouts suggests that chimpanzees can afford a break of twenty minutes from eating fruits or leaves to hunt, but if they take much longer they risk losing valuable plant-feeding time.

The time budget for an ape eating raw food is also constrained by the rhythm of digestion, because apes have to pause between meals. Judging from data on humans, the bigger the meal, the longer it takes for the stomach to empty. It probably takes one to two hours for a chimpanzee’s full stomach to empty enough to warrant feeding again. Therefore, a five-hour chewing requirement becomes an eight- or nine-hour commitment to feeding. Eat, rest, eat, rest, eat. An ancestor species that did not cook would presumably have experienced a similar rhythm.

These time constraints are inescapable for a large ape or habiline eating raw unprocessed food. Males who did not cook would not have been able to rely on hunting to feed themselves. Like chimpanzees, they could hunt in opportunistic spurts. But if they devoted many hours to hunting, the risk of failure to obtain prey could not be compensated rapidly enough. Eating their daily required calories in the form of their staple plant foods would have taken too long.





Washburn and other anthropologists have proposed that the human division of labor by sex was based on hunting. They suggest that on days when a male failed to find meat, honey, or other prizes, a female could provide food to him. As we now see, this would not have been sufficient, because a returning male who had not eaten during the day would not have had enough time left in the evening to chew his plant-food calories. The same time constraints apply whether our precooking ancestor obtained his staple plant diet by his own labor or received it from a female. A division of labor into hunting and gathering would not have afforded consumption of sufficient calories, as long as the food was consumed raw.

Suppose that a hunter living on raw food has a mate who is willing to feed him, that his mate could collect enough raw foods for him (while satisfying her own needs) and would bring them back to a central place, to be met by her grateful mate. Then suppose the male has had an unsuccessful day of hunting. Even modern hunter-gatherers armed with efficient weapons often fail. Among the Hadza there are stretches of a week or more several times per year when hunters bring no big-game meat to camp. The hungry hunter needs to consume, say, two thousand calories, but he cannot eat after dark. To do so would be too dangerous, scrabbling in the predator-filled night to feel for the nuts, leaves, or roots his gatherer friend brought him. If the hunter slept on the ground, he would be exposed to predators and large ungulates as he fumbled for his food. If he were in a tree, he would find it hard to have his raw foods with him because they do not come in tidy packages.

So to eat his fill he would have to do most of his eating before dusk, which falls between about 6 and 7 P.M. in equatorial regions. If he had eaten nothing while on the hunt, he would need to be back in camp before midday, and there he would find his mate’s gathered foods (assuming she had been able to complete her food gathering so early in the day). He would then have to spend the rest of the day eating, resting, eating, resting, and eating. In short, the long hours of chewing necessitated by a raw diet would have sharply reduced hunting time. It is questionable whether the sexual division of labor would have been possible at all.

The use of fire solved the problem. It freed hunters from previous time constraints by reducing the time spent chewing. It also allowed eating after dark. The first of our ancestral line to cook their food would have gained several hours of daytime. Instead of being an opportunistic activity, hunting could have become a more dedicated pursuit with a higher potential for success. Nowadays men can hunt until nightfall and still eat a large meal in camp. After cooking began, therefore, hunting could contribute to the full development of the family household, reliant as it is on a predictable economic exchange between women and men.
9#

CHAPTER 7



The Married Cook


“The labor of women in the house, certainly, enables men to produce more wealth than they otherwise could; and in this way women are economic factors in society. But so are horses . . . the horse is not economically independent, nor is the woman.”

—CHARLOTTE PERKINS GILMAN,

Women and Economics: A Study of the

Economic Relation Between Men and Women

as a Factor in Social Evolution





An evening meal cooked by a woman serves her and her children’s needs. It also helps her husband by giving him a predictable source of food, allowing him to spend his day doing whatever activity he chooses. But while the arrangement is comfortable for both sexes, it is particularly convenient for the male. Why should a female cook for him? A focus on the peculiar properties of cooked food provokes a new understanding of the nature of married life and the human community. It suggests that the reasons why the sexes pair off go beyond the traditional ideas of mating competition, or the interests that women and men have in the product of each other’s labor. It leads to the uncomfortable idea that as a cultural norm, women cook for men because of patriarchy. Men use their communal power to consign women to domestic roles, even when women would prefer otherwise.

That women tend to cook for their husbands is clear. In 1973 anthropologists George Murdock and Caterina Provost compiled the pattern of sex differences in fifty productive activities in 185 cultures. Although men often like to cook meat, overall cooking was the most female-biased activity of any, a little more so than preparing plant food and fetching water. Women were predominantly or almost exclusively responsible for cooking in 97.8 percent of societies. There were four societies in which cooking was reportedly performed about equally by both sexes or predominantly by males. One of them, the Todas of South India, was an error: a 1906 report had been misleading. Murdock and Provost failed to catch a correction showing that Toda women did most of the cooking.

Even the apparent exceptions conformed to the general rule. The three standouts reveal an important distinction between two types of cooking: cooking for the family, done by women, and cooking for the community, done by men. The three were the Samoans, Marquesans, and Trukese, all in the South Pacific. Their cultural backgrounds are different and they are located hundreds of miles apart from one another, but they have one thing in common: their staple food is breadfruit. Breadfruit trees produce fruits the size of basketballs, yield large volumes of high-quality starch, and demand cooperative processing.

The procedure for cooking the fruit pulp is physically arduous, takes many hours, and is performed in a communal house by a group of men on days of their choice. The men build a large fire, peel the fruits, cut them into chunks, and steam them. On the Truk island group of Micronesia (now often called Chuuk), the ringing sound of sweat-soaked men pounding the fruit meat with coral pestles could be heard a hundred yards away. It was late in the day before men were done wrapping the cooled mash into leaf packages. They distributed the surplus to men who had not been cooking. At the end of the day, all the men had food packages, and sometimes they ate together in the men’s house, where women were not allowed.

Men did not need women to feed them. They could spend weeks at a time in the men’s houses with men of their lineage, receiving no assistance from women. But when men ate at home they gave the breadfruit mash to their wives, and their wives used it as the basis for the evening meal. Women rounded it out with pork or fish sauces and vegetables they cooked themselves. If there was no breadfruit, the women cooked other starchy foods such as taro roots. Men cooked the main staple when they chose to do so, but women were responsible for cooking everything else and for producing household meals.

Might there be a few societies, not sampled by Murdock and Provost, in which women are so liberated that the gendered pattern of cooking is reversed? Cultural anthropologist Maria Lepowsky studied the people of Vanatinai in the South Pacific expressly because, from the outside, this society seemed like a woman’s dream community. In many ways, life was indeed very good for women. There was no ideology of male superiority. Both sexes could host feasts, lead canoeing expeditions, raise pigs, hunt, fish, participate in warfare, own and inherit land, decide about clearing land, make shell necklaces, and trade in such valued items as greenstone ax blades. Women and men were equally capable of attaining the prestige of being “big” (important) people. Domestic violence was rare and strongly censured. There was “tremendous overlap in the roles of men and women” and a great deal of personal control over how they chose to spend their time. Women had “the same kinds of personal autonomy and control of the means of production as men.”

Yet despite the apparent escape from patriarchy, women on Vanatinai did all the domestic cooking. Cooking was regarded as a low-prestige activity. Other chores for which women were responsible included washing dishes, fetching water and firewood, sweeping, and cleaning up pig droppings. All were again regarded as low-status duties—in other words, the kind of work men did not want to do. One day as a group of women returned after walking three miles with heavy baskets of yams on their heads, they complained to Lepowsky, “We come home after working in the garden all day, and we still have to fetch water, look for firewood, do the cooking and cleaning up and look after the children while all men do is sit on the verandah and chew betel nut!” But when they asked for help with these tasks, “The men,” wrote Lepowsky, “usually retort that these are the work of women.” Why should the men help, if they can get away with not helping?

The worldwide pattern is reflected in the English language. The word lady is derived from the Old English hlaefdige, meaning “bread kneader,” whereas lord comes from hlaefweard, or “bread keeper.” Of course, men are entirely capable of cooking. In industrial societies men can be professional cooks. Spouses in urban marriages often share the cooking, or husbands can do most of it. In hunter-gatherer societies men cook for themselves on long hunting expeditions or in bachelor groups. Men cook on feast days and ritual occasions, cooperating in public somewhat like the breadfruit cooks. But even the men who cook when no women are present or on ceremonial occasions still have their home foods prepared by women. The rule that domestic cooking is women’s work is astonishingly consistent.

The classic reason suggested for this pattern is mutual convenience. Each sex gains from sharing their efforts, as many happily married couples can attest. But the explanation is superficial because it does not address the more fundamental problem of why our species has households at all, or the darker dynamic that sometimes has husbands exploiting their wives’ labor. The men on Vanatinai could have shared the cooking easily, as the women would sometimes have liked them to do, but they chose not to. Charlotte Perkins Gilman noted that humans are the only species in which “the sex-relation is also an economic relation” and compared women’s role to that of horses. Molly and Eugene Christian complained that cooking “has made of woman a slave.” In theory, among hunter-gatherers both males and females could forage for themselves, like every other animal, and then cook his or her own meal at the end of the day. So what led to a sexual division of labor in which men routinely insist that it is women’s lot to do the household cooking?

Nonhuman primates mostly pick and eat their food at once. But hunter-gatherers bring food to a camp for processing and cooking, and in the camp, labor can be offered and exchanged. This suggests that cooking might be responsible for converting individual foraging into a social economy. Archaeologist Catherine Perlès thinks so: “The culinary act is from the start a project. Cooking ends individual self-sufficiency.” Relying on cooking creates foods that can be owned, given, or stolen. Before cooking, we ate more like chimpanzees, everyone for themselves. After the advent of cooking, we assembled around the fire and shared the labor.

Perlès’s notion that, by necessity, cooking was a social activity is supported by Dutch sociologist and fire expert Joop Goudsblom, who suggests that cooking required social coordination, “if only to ensure that there would always be someone to look after the fire.” Food historian Felipe Fernandez-Armesto proposed that cooking created mealtimes and thereby organized people into a community. For culinary historian Michael Symons, cooking promoted cooperation through sharing, because the cook always distributes food. Cooking, he wrote, is “the starting-place of trades.”

These ideas fit nicely with the ubiquitous social importance of cooked food. The contrast between communal and solitary eating is particularly pronounced among hunter-gatherers, for whom cooking is a highly social act, unlike eating raw food. When people are out of camp, their snacks tend to be raw foods such as ripe fruits or grubs, and these are normally collected individually and eaten without sharing. But when people cook food, they do so mostly in camp, and they share it within the family or, when feasting, with other families. Furthermore, much of the labor in preparing the meal is complementary. In a common pattern, a woman brings firewood and vegetables, prepares the vegetables, and does the cooking. A man brings meat, which either he or a woman might cook. Family members also tend to eat at roughly the same time (though the man may eat first) and often sit face to face around a fire.

But the suggestion that tending a fire, eating a meal, and sharing food require cooperation is obviously wrong. Alexander Selkirk, the real-life Robinson Crusoe, was very fit when he was rescued in 1709 after more than four years of cooking for himself in the Juan Fernandez Islands in the middle of the Pacific. Numerous solitary war survivors also have lived off the land and cooked for themselves, as Shoichi Yokoi did in Guam for almost thirty years before he was found in 1972. Hunter-gatherer women sometimes collect food and fuel, tend a fire, and do the cooking without any support from their husbands, such as Tiwi women in northern Australia. Men in societies ranging from hunter-gatherers to the United States can go on hunting expeditions for days at a time and cook for themselves. Examples of individual self-sufficiency clearly undermine the idea that the sheer mechanics of cooking require that it be practiced cooperatively.

Why, then, is the “culinary project” so often social, if it does not need to be? Relying on cooked food creates opportunities for cooperation, but just as important, it exposes cooks to being exploited. Cooking takes time, so lone cooks cannot easily guard their wares from determined thieves such as hungry males without their own food. Pair-bonds solve the problem. Having a husband ensures that a woman’s gathered foods will not be taken by others; having a wife ensures the man will have an evening meal. According to this idea, cooking created a simple marriage system; or perhaps it solidified a preexisting version of married life that could have been prompted by hunting or sexual competition. Either way, the result was a primitive protection racket in which husbands used their bonds with other men in the community to protect their wives from being robbed, and women returned the favor by preparing their husbands’ meals. The many beneficial aspects of the household, such as provisioning by males, increases in labor efficiency, and creation of a social network for child-rearing, were additions consequent to solving the more basic problem: females needed male protection, specifically because of cooking. A male used his social power both to ensure that a female did not lose her food, and to guarantee his own meal by assigning the work of cooking to the female.





The logic for this theory begins with the banal observation that cooking is necessarily a conspicuous and lengthy process. In the bush, the sight or smell of smoke reveals a cook’s location at a long distance, allowing hungry individuals who have no food to easily locate cooks in action. The effect among Homo erectus is easily imagined. Because females were smaller and physically weaker, they were vulnerable to bullying by domineering males who wanted food. Each female therefore obtained protection from other males’ wheedling, scrounging, or bullying by forming a special friendship with her own particular male. Her bond with him protected her food from other males, and he also gave her meat. These bonds were so critical for the successful feeding of both sexes that they generated a particular kind of evolutionary psychology in our ancestors that shaped female-male relationships and continues to affect us today.

The idea that cooking has influenced social relationships in this way is supported by the intense aversion to competition shown by hunter-gatherers eating their meals. Lorna Marshall’s description of the delicacy with which Nyae Nyae !Kung treat one another at mealtimes is typical of hunter-gatherers: “When a visitor comes to the fire of a family which is preparing food or eating, he should sit at a little distance, not to seem importunate, and wait to be asked to share. . . . We observed no unmannerly behavior and no cheating and no encroachment about food. . . . The polite way to receive food, or any gift, is to hold out both hands and have the food or other gift placed in them. To reach out with one hand suggests grabbing to the !Kung. I found it moving to see so much restraint about taking food among people who are all thin and often hungry, for whom food is a constant source of anxiety.”

Such spontaneous etiquette is universal within functioning hunter-gatherer societies. Nothing like it is found in any other social species. Among nonhuman animals, valuable items that cannot be eaten at once predictably induce fights. Most of the fruits eaten by chimpanzees are the size of plums or smaller, too small to be worth fighting over, but a single ripe breadfruit weighs up to eight kilograms (eighteen pounds) and can take a group two hours to eat. An individual does not have time to swallow it before others see the prize and come to compete for it. Offspring take advantage of the situation by begging from their mothers, and adults fight to possess whole fruits or large pieces. Among chimpanzees, males win. Among bonobos, females win. In each case, the winners are members of the dominant sex. Among various species of spiders, a male that cohabits on a female’s web likewise takes her food, and as a result she weighs less than if no male is there. Among savannah lions, females lose much of their prey to males.

Restraint is rare indeed in animal competition over food. Chimpanzees fight over any food that can be monopolized, but the contests are fiercest over meat, producing a fracas that can often be heard more than a kilometer (half a mile) away. Within seconds of a successful predation by a low-ranking chimpanzee, a dominant male is liable to snatch the entire carcass from the killer. In a large group, the carcass will be torn apart by screaming males desperate for a share. Meat-eating can continue for hours. Those without meat, or with only a small piece, beg hard with upturned hands and reaching mouths. The harder they beg, the more meat they get, often by simply tearing it or pulling it away. Possessors try to escape the pressure by turning their backs or climbing to an inaccessible branch. They occasionally charge at their tormentors or flail the carcass at them. Such tactics buy time but are rarely effective. Persistent begging is normally such a nuisance to the possessor that it reduces the rate at which he can eat, and for this reason he sometimes allows others to take a piece of meat. He occasionally even makes an outright donation to a pushy beggar, who immediately leaves with it. Possessing meat can thus be less rewarding than expected from its food value. Meat brings trouble because it takes time to eat.

The most subordinate individuals get little. In the mayhem of carcass division, females rarely end up with a large piece. Overall, females eat much less meat than males, and their low success rate is clearly due to their poor fighting ability. Females with close social relationships to male possessors may get some meat, but in general, meat has less nutritional impact on the lives of female and young chimpanzees than it does on males. Even sexually attractive females cannot expect meat.

If the first cooks were temperamentally like chimpanzees, life would have been absurdly difficult for females or low-status males trying to cook a meal. Cooked food would have been intensely valuable. Even the act of gathering creates value merely by assembling raw foods into a pile. Cooking only increases its attraction. Subordinate individuals cooking their own meals would have been vulnerable to petty theft or worse. If several hungry dominants were present, the weak or unprotected would have lost much or all of the food. Females would have been the losers, just as they are among chimpanzees. There are no indications that human females or their ancestors have ever been prone to forming the kinds of physical fighting alliances with one another that protect bonobo females from being bullied by males.

Consider the possibility that small groups of tough males could search for signs of a campfire as a way to feed themselves. They would be able to descend on an undefended cook and take his or her food at will—after waiting, perhaps, for the cooking to be done. If this ploy were regularly successful, the males could become professional food pirates, which in turn would mean they would not bother to feed themselves or prepare their own food, adding to their desperation to steal it. Male lions come close to doing this, regularly taking whatever meat they want from kills made by females. This scenario suggests that unless cooks somehow established a peaceful environment in which to work, cooking might not have been a viable method of preparing food at all.

Even humans steal readily in various circumstances, so our species is not inherently uncompetitive. The nervous child with a lunch box in the schoolyard knows the problem as well as the anxious late-night stroller with cash in his pocket. People who have the chance to take from members of a different social network have few qualms about doing so. Farmers living next to hunter-gatherers routinely complain of being robbed. Stealing, cheating, and bullying were prevalent among the troubled Ik in the uplands of northern Uganda observed by cultural anthropologist Colin Turnbull, whose book about them, The Mountain People, was said by writer Robert Ardrey to record a society without morality. The Ik were a hunting people who had been kept from their traditional hunting grounds. The result was starvation, disease, and mutual exploitation. Turnbull described an almost complete evaporation of their community spirit: “They place the individual good above all else and almost demand that each get away with as much as he can without his fellows knowing.” Turnbull’s description shows just how savage people can become when social networks break down and life is tough.

Ethnographers sometimes report cases of theft within stable hunter-gatherer communities. Turnbull described how Pepei, an Mbuti Pygmy, had to cook for himself because he was a bachelor with no female kin. As a result, he was often hungry. Several times he was caught stealing small quantities of food from another cooking pot or someone else’s hut, mostly from an old woman who had no husband to protect her. His punishment was public ridicule, receiving food fit only for animals, or a thrashing with a thorny branch. Pepei was forgiven after he ended up in tears.

Since hunter-gatherers are often hungry, one might imagine that food theft would be a daily problem. Like other people living in small-scale egalitarian societies, they have no police or any other kind of authority. A hunter-gatherer woman returns to camp in the middle of the day carrying the raw foods she has obtained. She then prepares and cooks them for the evening meal at her own individual fire. Men might return to camp at any time, alone or in a small group. Many of the foods a woman cooks are edible raw, so they could be eaten before, during, or after the cooking process. If a man returns from the bush feeling hungry and has no one to cook for him, he might be tempted to ask a woman for some food—or even simply take it—rather than doing his own cooking. He can also sneak about the camp at any other time, including night.

Yet such tactics are rare. The relaxed atmosphere Lorna Marshall described for the !Kung is due to a system that keeps the peace at mealtimes among hunter-gatherers and other small-scale societies. The system consists of strong cultural norms. Married women must provide food to their husbands, and they must cook it themselves, though other family members may help. Social anthropologists Jane Collier and Michelle Rosaldo surveyed small-scale societies worldwide. “In all cases,” they found, “a woman is obliged to provide daily food for her family.” That is why married men can count on an evening meal. As a result, they have little reason to take food from women who are not their wives.

The obligation of wives to cook for their husbands occurs regardless of how much other work each of them does, or how much food they give each other. Sometimes men produce much more than women, as among traditional Inuit of the high Arctic, where the almost wholly animal diet of sea mammals, caribou, and fish was produced entirely by men. A man would hunt all day and would come home to a dinner his wife cooked. Cooking was slow over a seal-oil lamp, and women often had to spend much of the afternoon on the task. Sometimes the whole family went hunting together, but the wife had to return early to have everything ready when her husband and others returned to camp. Even when the time of her husband’s return was uncertain, she risked punishment if there was no food available for him. But at least a wife’s obligation to cook for a husband was matched by his providing all the food.

On the other hand, in some societies women brought home almost all the food. This happened among the Tiwi hunter-gatherers of northern Australia, a polygynous people who lived in households of up to twenty wives and one man. Women foraged for long hours and still returned in the evening to cook the one meal of the day. There were few animals to hunt. Men mostly contributed occasional small animals, such as goanna lizards, and brought in such little food that they needed women’s food production for their own welfare. As one Tiwi husband said, “If I had only one or two wives, I would starve.” Men relied on their wives not only for their own food but also to feed others. The possession of surplus food was the most concrete symbol of a Tiwi man’s success, allowing him to host feasts and promote his political agenda. Women’s high food contribution did not sway the balance of power in their marriages. Despite their economic independence and key role in their husbands’ status, they were “as frequently and as brutally beaten by their husbands as wives in any other savage society.”

Among the Inuit, Tiwi, and all other small-scale societies on record, fairness in distributing labor among women and men was not the issue. Whether or not wives wanted to do so, they cooked for their husbands. As a result, married men were guaranteed adequate food whether they returned late, tired, and hungry from a day’s hunting or came home relaxed and early from discussing politics with a neighbor. The man might have eaten in a courteous manner and have had a friendly or even loving interaction with his wife, but the formal structure of their eating relationship was that he could count on her labor and take a large portion of her food—typically, it seems, the best part.

Peace in the camp was further cemented by the principle that unless a husband gave his blessing, a wife could feed no other man except her close kin. This rule applied to cooked food around the campfire, as well as to the raw food she gathered. Other than her kin and husband, no one else had any right to ask for a share, so she could trudge back to camp secure in the knowledge that she would be able to cook all the food she had obtained. In Western society, we take the principle of ownership for granted. But among hunter-gatherers, this manifestation of private ownership is noteworthy because it lies in remarkable contrast to the obligatory sharing of men’s foods in particular, and more generally to a strong ethos of communitywide cooperation.





So however hard a man labors to produce food, in hunter-gatherer societies his rights to the food are a matter of communal decision. A man follows the rules, even if that means he gets nothing from his labor. Sometimes he must allow others to distribute his meat. A common requirement among Native American hunters was for boys making their first kill to carry their prize back to camp and stand by while others cooked and ate it. The practice symbolized the subordination of men to the demands of the group. More often, he divided his food himself. The community might allow him to make personal choices about who to give meat to, but not necessarily. In the western desert of Australia, every large hunted animal had to be prepared in a rigidly defined fashion when it was brought to camp. The hunter’s own share of a kangaroo was the neck, head, and backbone, while his parents-in-law received a hind leg, and old men ate the tail and innards. The contrast with women’s ownership of their foods is striking. Although women forage in small groups and might help one another find good trees or digging areas, their foods belong to them. The sex difference suggests that the cultural rules that specify how women’s and men’s foods are to be shared are adapted to the society’s need to regulate competition specifically over food. The rules were not merely the result of a general moral attitude.

A woman’s right to ownership protects her from supplicants of both sexes. In Australia’s western desert, a hungry aborigine woman can sit amicably by a cook’s fire, but she will not receive any food unless she can justify it by invoking a specific kinship role. It is even more difficult for a man. A bachelor or married man who approaches someone else’s wife in search of food would be in flagrant breach of convention and an immediate cause of gossip, just as a woman would be if she gave him any food. The norm is so strong that a wife’s presence at a meal can protect even a husband from being approached. Among Mbuti Pygmies, if a family is eating together by their hearth, they will be undisturbed. But when a man is eating alone, he is likely to attract his friends, who will expect to share his food.

Under this system, an unmarried woman who offers food to a man is effectively flirting, if not offering betrothal. Male anthropologists have to be aware of this to avoid embarrassment in such societies. Cofeeding is often the only marriage ceremony, such that if an unmarried pair are seen eating together, they are henceforward regarded as married. In New Guinea, Bonerif hunter-gatherers rely on the sago palm tree for their staple food year-round. If a woman prepares her own sago meal and gives it to a man, she is considered wed to him. The interaction is public, so others take the opportunity to tease the new couple with jokes equating food and sex, such as, “If you get a lot of sago you are going to be a happy man.” The association is so ingrained that a man’s penis is symbolized by the sago fork with which he eats his meal. If a man takes his sago fork out of his hair and shows it to a woman, they both know he is inviting her for sex. In that society, for a woman to even look at a man’s feeding implement is to break the rule against her constrained food-sharing.

Because interactions occur in public, a husband’s presence is not necessary to maintain customary principles. The husband’s role is important not so much for his physical presence, but because he represents a reliable conduit to the support of the community. If a wife reported to her husband that another man had inappropriately asked her for food, the accused would be obliged to defend himself to both the husband and the community at large.

This may be one reason why marriage is important to a woman in these societies. Among the Bonerif, as among many hunter-gatherers, sexual intercourse is not tightly restricted to marriage. Wives are free to have sexual relations with several men at the same time, and may do so even when their husbands protest. Furthermore, they get little food from their husbands. But marriage means that her children will be accepted, according to anthropologist Gottfried Oosterwal. In addition, marriage gives a woman access to the only ultimate authority, which is the set of communal decisions reached by men in the men’s house. These decisions represent the “crystallized view of everyone about everything” and are accepted as the right view by the whole community. Having a husband means that when social conflict arises, a good wife has an advocate who is a member of the ultimate source of social control.

A link to the communal authority is critical, because the ability of victims to deter a bully or a persistent pest depends on their being a legitimate member of the community. Hunter-gatherers deal with braggarts, thieves, and violators of other social norms in a consistent way, according to anthropologist Christopher Boehm. They use communal sanctions. Whispers, rumors, and gossip evolve into public criticism or ridicule directed at the accused. If the offender continues to incur public anger, he or she will be severely punished or even killed. The killing is done by one or a few men but will be approved by all the elders. Capital punishment provides the sanction that most completely enforces hunter-gatherer adherence to social norms, and it is in men’s hands. Thus by virtue of being married (or, if unmarried, by virtue of being a daughter), a woman is socially protected from losing any of her food. Having a husband or father who is a legitimate member of the group, she is effectively protected by him.

In theory, cultural norms that oblige a woman to feed her husband but no other men could have arisen from a societal goal other than to protect women’s foods. Such norms might have arisen from a desire to avoid conflicts in general, or from a concern for reducing adultery in particular. But these alternative explanations are unconvincing because men needed their wives specifically to cook for them, rather than merely to behave in a way that promoted communal civility in general. Cross-cultural evidence described above shows that women’s cooking for the family is a universal pattern. From ethnographic reports it seems that this domestic service is often the most important contribution a wife makes to their partnership.





We have already seen that among the Tiwi a man depended on being fed by his wives, and it turns out that the Tiwi case is typical. Hunter-gatherer men suffered if they had no wives or female relatives to provide cooked meals. “An aborigine of this Colony without a female partner is a poor dejected being,” wrote G. Robinson about the Tasmanians in 1846. When an Australian aboriginal wife deserts her husband, wrote Phyllis Kaberry, he can easily replace her role as a sexual partner but he suffers because he has lost someone attending to his hearth. The loss is important because a bachelor is a sorry creature in subsistence societies, particularly if he has no close kin. As Thomas Gregor explained for the Mehinaku hunter-gardeners of Brazil, an unmarried man “cannot provide the bread and porridge that is the spirit’s food and a chief’s hospitality. . . . To his friends, he is an object of pity.” Colin Turnbull explained precisely why bachelors among Mbuti Pygmies were unhappy: “A woman is more than a mere producer of wealth; she is an essential partner in the economy. Without a wife a man cannot hunt; he has no hearth; he has nobody to build his house, gather fruits and vegetables and cook for him.” Examples like these are so widespread that according to Jane Collier and Michelle Rosaldo, in small-scale societies all men have a “strictly economic need for a wife and hearth.” Men need their personal cooks because the guarantee of an evening meal frees them to spend the day doing what they want, and allows them to entertain other men. They can find opportunities for sexual interactions more easily than they can find a food provider.

In societies with no restaurants or supermarkets, the need for a wife can lead a man to desperate measures. Among the Inuit, where a woman contributed no food calories, her cooking and production of warm, dry hunting clothes were vital: a man cannot both hunt and cook. The pressure could drive widowers or bachelors to neighboring territories in an attempt to steal a woman, even if it meant killing her husband. The problem was so pervasive that the threat of stealing women dominated relationships among Inuit strangers: unfamiliar men would normally be killed even before questions were asked. Lust was not the motivation for stealing wives. “The vital importance of a wife to perform domestic services provided the most usual motive for abduction,” according to ethnographer David Riches. Oosterwal recorded a comparable reason for wife stealing in New Guinea, where a woman’s domestic contribution was critical because of the sago meal she prepared. Men wanted to give feasts as large as possible, so they needed women to organize the food. This led them to conduct raids on neighboring groups to kidnap wives for sago production. Captured women were put to work at once. Their sexual services were an added bonus.

Another version of the same formula applied to many Tiwi marriages. In this highly polygynous culture, old men took most of the young wives, so more than 90 percent of men’s first marriages were to widows much older than themselves, sometimes as old as sixty. The old wives might have been past child-bearing age and physically unattractive, but young men delighted in the marriages because they were then fed. Among one nearby group, the Groote Eylandt Aborigines, adult bachelors were given a teenage boy to do the domestic chores. The teenager was called a boy slave, suggesting that wives may have been similarly perceived as fulfilling a slavelike role.

Although the Inuit and Tiwi offer extreme examples of how hunter-gatherer men acquired wives, the importance of marriage for a man in small-scale societies was universal. Collier and Rosaldo explained that a married man has status because once he has a wife, he need never ask for cooked food and he can invite others to his hearth. He is also likely to eat well because men typically eat before their wives and have the choice of the best food. In Michael Symons’s words, men “demand selfless generosity from women.” To favor the married man even further, small-scale societies have food taboos such that married men are allowed to eat more of the choice kinds of food than are bachelors or women. Women in these societies often dislike marriage specifically because as wives they are obliged to produce food for men, and they have to work harder than they would as unmarried women.





Inequitable as marriage is in certain respects for hunter-gatherer women, that women have to cook for men empowers them. “Her economic skill is not only a weapon for subsistence, but also a means of enforcing good treatment and justice,” wrote Phyllis Kaberry of Australian aboriginal women. A wife who cooks badly might be beaten, shouted at, chased, or have her possessions broken, but she can respond to abuse by refusing to cook or threatening to leave. Such disputes seem to be characteristic mostly of new marriages. Most couples easily develop a comfortable predictability, with wives doing their best to provide husbands their cooked meals and husbands appreciating the effort. Hunter-gatherer women are therefore not normally treated badly, and many ethnographers have concluded that, in comparison to most societies, married women lead lives of high status and considerable autonomy.

Catherine Perlès was right in saying that cooking ends individual self-sufficiency. Cooking need not be a social activity, but a woman needs a man to guard her food, and she needs the community to back him up. A man relies on a woman to feed him, and on other men to respect his relationship with her. Without a social network defining, supporting, and enforcing social norms, cooking would lead to chaos.

It is impossible to know how rapidly cooking would have ended individual self-sufficiency after it was first practiced, but in theory the protective pair-bond system could have evolved quickly. Admittedly, the first cooks were not modern hunter-gatherers, and we know too little about their way of life to confidently judge the effects of cooking on social organization. We do not know how linguistically skilled our ancestors were when cooking was adopted. Language is needed nowadays to enforce culturally understood rules, because a woman’s food is made secure by her being able to report on a thief’s activity. But at least we can say that three of the key behavioral elements found in the hunter-gatherer system—male food guards, female food suppliers, and respect for others’ possessions—are found in other animals, suggesting that a primitive version of the modern food-protection regime could have evolved rapidly among early cooks.

Gibbons illustrate the role of males as food guards. Pairs of these small tree-dwelling apes defend territories against their neighbors. When pairs meet at a tree in the border zone, males fight hard with each other, and the female of the winning male tends to eat better. While food guards are relatively common in animals, there is only one species in which females have been seen provisioning males: a tiny Australian insect called the Zeus bug. Male Zeus bugs are smaller than females and ride on the backs of their mates like jockeys. Females secrete a waxlike material on their backs that is eaten by the male and has no known purpose except to feed him. Males that have been prevented from eating the female’s secretions turn competitive: they steal the female’s fresh prey. The researchers who discovered this strange relationship hypothesize that females do better by feeding their riding males than by losing prey to them, perhaps because the waxy stuff contains nutrients that the females do not need. This system has apparently evolved to stop males from interfering with the female’s feeding. In other words, females feed males to reward them for behaving well. That is close to the system found in humans.

Male “respect for possession” is found more widely than female provisioning. Competition for mates among desert-living hamadryas baboons from around the Red Sea provides a striking example. Male hamadryas who do not know each other fight intensely over females, but among familiars a male is completely inhibited from interfering with an existing bond. Zoologist Hans Kummer demonstrated this with experiments in which he captured two wild males who came from the same group. He found out which of the males was dominant by putting food between them. He then kept the males in separate cages. While the dominant male was allowed to watch, Kummer introduced an unfamiliar female into the cage of the subordinate. The dominant saw everything, but being in a different cage, he could do nothing to stop the subordinate from interacting with the new female. Inside the pairing cage, the subordinate male approached the female and quickly mated with her. A few minutes later she showed him her approval by grooming him, and by that time a bond was formed.

At this point Kummer introduced the dominant male into the cage where the subordinate male and his new female were enjoying their honeymoon. An hour earlier the dominant had been so superior that he had taken food from his subordinate at will, but now the dominant lost all interest in competing for the female. The dominant showed complete respect for the subordinate’s possession of the female. Films of these experiments show the dominant looking anywhere but at the subordinate. The dominant develops an intense fascination with a pebble at his feet, which he rolls and twiddles with a pointed finger. He stares at the clouds as if entranced by the weather. The one direction he does not look is toward the most obvious thing in the cage: the two so recently paired baboons. When paired in equivalent circumstances with an unfamiliar male, by contrast, the dominant baboon shows no such respect. Kummer’s experiment identified male bonding as the source of respect between males.

The food guarding, provisioning by females, and respect for possession found in animals are associated with males competing over sexual access to females, but only in humans have they led to households. Something about humans is different from other species. A woman’s need to have her food supply protected is unique among primates and provides a sensible explanation for the sexual division of labor.

The proposal that the human household originated in competition over food presents a challenge to conventional thinking because it holds economics as primary and sexual relations as secondary. Anthropologists often see marriage as an exchange in which women get resources and men get a guarantee of paternity. In that view, sex is the basis of our mating system; economic considerations are an add-on. But in support of the primary importance of food in determining mating arrangements, in animal species the mating system is adapted to the feeding system, rather than the other way around. A female chimpanzee needs the support of all the males in her community to aid her in defending a large feeding territory, so she does not bond with any particular male. A female gorilla, however, has no need for a defended food territory, so she is free to become a mate for a specific male. Many such examples suggest that the mating system is constrained by the way species are socially adapted to their food supply. The feeding system is not adapted to the mating arrangement. The consequences of a man’s economic dependence take different forms in different societies, but recall that according to Jane Collier and Michelle Rosaldo, his needing a wife to provide food is universal among hunter-gatherers. Food, it seems, routinely drives a man’s marriage decision more than the need for a sexual partner.

Furthermore, food relationships appear to be more tightly regulated than sexual relationships. Among the Bonerif, husbands disapproved of their wives having sex with bachelors, but the bachelors did it anyway. Husbands were relatively tolerant of their wives having sex with other husbands, perhaps because promiscuous sex involved less threat of losing her economic services than did promiscuous feeding. As in many other hunter-gatherer communities, Bonerif attitudes toward premarital sex are particularly open-minded. One girl had sex with every unmarried male in the community except her brother. But when a woman feeds a man, she is immediately recognized as being married to him. Western society is not alone in thinking that the way to a man’s heart is through his stomach.





Marriage in the United States affects women and men in different ways. Women tend to work longer hours after marriage, thanks to putting in extra time on household tasks, but men do no more household work than before they marry. The pattern is much the same as Jane Collier and Michelle Rosaldo found in small-scale societies, where marriage “binds specific people together in a particular, hierarchical system of obligations, requiring that women provide services for husbands.”

In Victorian England, the aesthetic writer John Ruskin argued that household labor was divided harmoniously and that women were superior to men. He credited women with greater organizational skills than men and explained that women were therefore better at managing households. But to philosopher John Stuart Mill, it was obvious that women were treated unfairly. Ruskin’s gallantry, he said, was “an empty compliment . . . since there is no other situation in life in which it is the established order, and considered quite natural and suitable, that the better should obey the worse. If this piece of talk is good for anything, it is only as an admission by men, of the corrupting influence of power.”

Mill’s accusation that Victorian British men used power to their own advantage might be applied equally well to all nonindustrial societies. The women living on Vanatinai had as much control over their lives as women in any society. They were not regarded as inferior to men, and in the public realm they were not subject to male authority. But even when they were tired and men were relaxing, they still had to cook. Maria Lepowsky does not report what would have happened if a woman had refused to cook, but among hunter-gatherers who are similarly egalitarian, husbands are liable to beat wives if the evening meal is merely late or poorly cooked. When there is a conflict, most women have no choice: they have to cook, because cultural rules, ultimately enforced by men for their own benefit, demand it.

The idea that cooking led to our pair-bonds suggests a worldwide irony. Cooking brought huge nutritional benefits. But for women, the adoption of cooking has also led to a major increase in their vulnerability to male authority. Men were the greater beneficiaries. Cooking freed women’s time and fed their children, but it also trapped women into a newly subservient role enforced by male-dominated culture. Cooking created and perpetuated a novel system of male cultural superiority. It is not a pretty picture.
10#

CHAPTER 8



The Cook’s Journey


“A great flame follows a little spark.”

—DANTE, The Divine Comedy





When Jean Anthelme Brillat-Savarin wrote, “Tell me what you eat and I shall tell you what you are,” he could not have known how right he was. Even nowadays no one knows how deeply the effects of cooking and the control of fire have been burned into our DNA.

Take the pace of our lives. Compared with great apes, we live a few decades longer and reach sexual maturity more slowly. Our long life spans suggest that our ancestors were good at escaping predators. Across species, those who can escape predators more easily tend to live longer. Tortoises, safe in their shells, have lives measured in decades, far longer than most animals their size. Flying species, such as birds or bats, live longer than those confined to the ground, such as mice or shrews. Even in captivity, terrestrial rodents rarely live more than two years, whereas bats of the same size can live for twenty years. Likewise, gliding animals live longer than their nongliding relatives. Bowhead whales stay so far north that killer whales cannot reach them, and they live more than a hundred years. The longevity of early humans is unknown, but their relative safety during evolution must have owed much to the use of fire to deter predators.

Or consider weaning. Cooked food, being soft, enables mothers to wean their young early. During human evolution, early weaning would have allowed a mother to recover her body condition rapidly after birth, promoting a short interval between births. In addition, the higher energy value of cooked food should have promoted a faster rate of growth for the young. The expected early weaning made possible by a human mother’s giving cooked food to her infant would have affected social behavior too. Mothers who weaned their babies early would have larger families than before, an infant and a toddler side by side. The advantages of help given by grandmothers and other kin would have increased. Chimpanzee grandmothers occasionally express interest in their daughters’ offspring through carrying or grooming, but they are normally preoccupied with their own infants. By generating easily donated gifts of cooked food that are useful for the young, the new system of processing food would have favored cooperative tendencies in rearing families.

Cooking also should have reduced the difficulties of finding enough to eat during the poorest seasons, when even now hunter-gatherers routinely find conditions hard. The notion of cooked food making life easier challenges the thrifty-gene hypothesis, which claims that because the environments of our hunter-gatherer ancestors were highly seasonal, we are physiologically adapted to periods of feast and famine. Accordingly, ancestral humans supposedly digested and stored energy in their bodies with exceptional efficiency. The thrifty-gene hypothesis suggests this efficiency was a useful adaptation when starvation was a consistent threat but is responsible for obesity and diabetes in many modern environments. The cooking hypothesis suggests a different idea: during our evolution, our use of cooked food would have left us better protected from food shortages than the great apes are, or than our noncooking ancestors were. It implies that humans easily become obese as a result of eating exceptionally high-energy, calorie-dense food, rather than from ancient adaptation to seasonality. Great apes become obese in captivity on a rich diet of cooked food.

Cooking and the control of fire must have had substantial influences on our ancestors’ digestive physiology. Compared with our close ape relatives, humans regularly experience a higher caloric intake in a short time (e.g., a rapidly ingested evening meal), a more easily digested protein intake, and a higher concentration of the dangerous Maillard compounds that are produced by the combination of sugars and amino acids during cooking. We can therefore expect to find changes in our insulin system compared with those of apes, in the nature of our proteolytic enzymes, and in our systems of defense against a range of carcinogens and inflammatory agents. We might find that we are better protected against Maillard molecules than other primates are, given our uniquely long exposure to ingesting them in high concentrations.





Anthropologists often propose that when fire was first controlled, one of its major contributions was to keep people warm, but that idea wrongly implies that our precooking ancestors would have had difficulty staying warm without fire. Chimpanzees survive nights exposed to long, cold rainstorms. Gorillas sleep uncovered in high, cool mountains. Every species other than humans can maintain adequate body heat without fire. When our ancestors first controlled fire, they would not have needed it for warmth, though fire would have saved them some energy in maintaining body temperature.

But the opportunity to be warmed by fire created new options. Humans are exceptional runners, far better than any other primate at running long distances, and arguably better even than wolves and horses. The problem for most mammals is that they easily become overheated when they run. After a chimpanzee has performed a five-minute charging display, he sits exhausted, panting and visibly hot, beads of sweat glistening among his erect hairs as he uses increased air circulation and sweat production to dissipate his excessive heat. Most mammals cannot evolve a solution to this problem, because they need to retain an insulation system, such as a thick coat of hair. The insulation is needed to maintain body heat during rest or sleep, and of course it cannot be removed after exercise. At best it can be modified, such as by hair being erected to promote air flow.

The best adaptation to losing heat is not to have such an effective insulation system in the first place. As physiologist Peter Wheeler has long argued, this may be why humans are “naked apes”: a reduction in hair would have allowed Homo erectus to avoid becoming overheated on the hot savanna. But Homo erectus could have lost their hair only if they had an alternative system for maintaining body heat at night. Fire offers that system. Once our ancestors controlled fire, they could keep warm even when they were inactive. The benefit would have been high: by losing their hair, humans would have been better able to travel long distances during hot periods, when most animals are inactive. They could then run for long distances in pursuit of prey or to reach carcasses quickly. By allowing body hair to be lost, the control of fire allowed extended periods of running to evolve, and made humans better able to hunt or steal meat from other predators.

The hair loss that benefited adults would have been a problem for babies because babies spend a lot of time inactive and are therefore at risk of becoming cold unless cuddled or nestled in warm surroundings. Perhaps at first babies retained their body hair even when their older siblings lost theirs. But an infant lying next to a fire would have risked burning his or her body hair. Nowadays, human babies are unique among primate infants in having an especially thick layer of fat close to the skin. Baby fat could well be partly a thermal adaptation to the loss of chimpanzee-like hair.

Even our ancestors’ emotions are likely to have been influenced by a cooked diet. Clustering around a fire to eat and sleep would have required our ancestors to stay close to one another. To avoid lost tempers flaring into disruptive fights, the proximity would have demanded considerable tolerance. The first dogs provide a provocative model for how tolerance might have evolved. According to biologists Raymond and Lorna Coppinger, wolves began their evolution into dogs when they were drawn to human villages in search of food refuse about fifteen thousand years ago. The Coppingers suggest that when wolves were attracted to these potent new food resources, there was intense natural selection in favor of the calmer individuals, because the calmer wolves were able to get closer to the settlements and more easily find the precious new foods. In effect, dogs experienced a form of self-domestication.

The first cooks probably experienced a similar process. Among the eaters of cooked food who were attracted to a fireside meal, the calmer individuals would have more comfortably accepted others’ presence and would have been less likely to irritate their companions. They would have been chased away less often, would have had more access to cooked food, and would have passed on more genes to succeeding generations than the wild-eyed and intemperate bullies who disturbed the peace to the point that they were ostracized by a coalition of the calm. A version of this system had probably already started before cooking, when groups of habilines clustered about a meat carcass.

A process similar to domestication could then have led to an evolutionary advance in ancestral humans’ social skills. In animals, more tolerant individuals cooperate and communicate better. Among chimpanzees, individuals that are more tolerant of each other cooperate better. Again, bonobos are more tolerant than chimpanzees, and they collaborate more readily to obtain food. Experimentally domesticated foxes are likewise more tolerant than their wild ancestors and are better at reading human signals. If the intense attractions of a cooking fire selected for individuals who were more tolerant of one another, an accompanying result should have been a rise in their ability to stay calm as they looked at one another, so they could better assess, understand, and trust one another. Thus the temperamental journey toward relaxed face-to-face communication should have taken an important step forward with Homo erectus. As tolerance and communication ability increased, individuals would have become better at reaching a mutual understanding, forming alliances, and excluding the intolerant. Such changes in social temperament would have contributed to a growing ability to communicate, including the evolution of language.

The changes wrought by cooked food would have included family dynamics and their supporting psychological mechanisms. The development of pair-bonds in early humans (or their elaboration, if habilines had already evolved a pair-bonding system) contributed to the importance of romantic attachments. On the other hand, domestic violence would have been promoted by the way in which, thanks to cooking, labor is sexually divided and exchanged. Hunter-gatherers are not the only cultures in which wife-beating can be stimulated by disappointments over cooking. Sociologist Marjorie DeVault studied American households and found that “expectations of men’s entitlement to service from women are powerful in most families, [and] that these expectations often thwart attempts to construct truly equitable relationships and sometimes lead to violence.” Sigmund Freud thought the control of fire led to self-control. Around a hearth, he said, we have to suppress a primal urge to quench the flames with a stream of urine. Freud’s notion is far-fetched, but he was right about one thing: our species must have changed radically when we learned to live with flames.





The changes all depend on the mysterious initial moment. We may never know for sure how cooking started, because the breakthrough happened so long ago and probably rather quickly in a small geographical area. But we can use our growing knowledge of great ape behavior, nutrition, and archaeology to speculate. Consider first the woodland apes, or australopithecines. By the period between three million and two million years ago, several genera and many species of australopithecines had already occupied the African woodlands for perhaps three million years. At that time, the only known species of australopithecines were Australopithecus afarensis, A. garhi, and A. africanus, and then even they disappeared.

Climate change appears responsible for the extinction of australopithecine species. Africa began getting drier about three million years ago, making the woodlands a harsher and less productive place to live. Desertification would have reduced the wetlands where australopithecines would have found underwater roots, such as cattails and water lilies, and they would have found fewer fruits and seeds. The species of Australopithecus had to adapt their diet or go extinct. Two lines survived.

One adapted by intensifying its reliance on the underground foods that had provided the backup diet of less preferred foods for australopithecines in times of food scarcity. Their descendants rapidly developed enormous jaws and chewing teeth, and are recognized in the naming of a new genus, Paranthropus, or the “robust” australopithecines. Paranthropus emerged around three million years ago, possibly descendants of Australopithecus afarensis or A. africanus. They flourished in some of the same dry woodlands as our human ancestors until a million years ago and still looked like upright-walking chimpanzees. But even more than their Australopithecus ancestors, Paranthropus relied mainly on a diet of roots and other plant storage organs.

The other line of descendants led to humans, and it began with meat eating. Australopithecines must always have been interested in eating meat when they found fresh kills, just as chimpanzees and almost every other primate are today. They would therefore have readily pirated carcasses from any predator they were willing to confront, such as cheetahs or jackals, both of which had close relatives present in Africa by 2.5 million years ago. Chimpanzees today steal carcasses of young antelope or pigs from baboons. But stealing meat from lions and saber-tooths must normally have been too dangerous for australopithecines. Even lions and hyenas kill each other in competition over food, and australopithecines would have been feeble and slow compared to any of the big carnivores.

Given these challenges, it is unclear how australopithecines obtained access to the meat of antelope and other game animals. Maybe they found new ways to kill, which would have given them a few minutes or more to cut meat off their prey before they were chased away by the arrival of big carnivores. Or perhaps they discovered how to stand up to the dangerous predators without serious risk of being wounded or killed. A bold group of australopithecines might have confronted the predators with simple spears modified from digging sticks that they had used to obtain roots. That technology would not have been a huge advance from the short sticks chimpanzees use to jab at bush babies hidden in tree holes, as happens in Senegal. Or maybe they threw rocks at their opponents, much as chimpanzees now sometimes scare pigs or humans with well-aimed missiles in Gombe, Tanzania. If they threw rocks, they might have noticed that sometimes the rocks smashed on landing and produced flakes that could be used for cutting.

Whatever the technique, by at least 2.6 million years ago, some groups were definitely getting meat from carcasses that previously only big carnivores would have eaten. Over the next few hundred thousand years, impact notches and cut marks on animal bones caused by stone tools attest to habilines spending long enough in the danger zones to be able to slice the meat off dead animals, from turtles to elephants. The result was a new and immensely beneficial food source. Knowing that habilines were able to cut steaks and that chimpanzees often pound nuts with hammerstones, we can be sure that habilines would have had the cognitive ability to batter their meat before they ate it, and they surely would have preferred their meat pounded.

Habilines must have also eaten substantial amounts of plant food. During periods of food shortage, such as the annual dry seasons, meat would have been particularly low in fat, down to 1 to 2 percent. Plant foods would then have become critical. Habilines’ chewing teeth were similar in size and shape to those of australopithecines, showing a continuing commitment to the same plant foods, including raw roots and corms during the most difficult seasons, and such items as soft seeds and fruits when they could find them. Probably habilines prepared nuts by smashing them to expose the edible seeds, as chimpanzees do. It is doubtful that habilines could process plant foods by any techniques that were much more elaborate than pounding. Almost all the methods hunter-gatherers use to improve the nutritional value of plant foods involve fire, because heat is needed to gelatinize starch. Until fire was controlled, habilines would have been stuck with eating raw plant foods whose caloric value could not be much improved by cold processing.

The breakthrough could have been simple, because it did not require that fire be made from scratch. If fire could be captured, the tending would have been relatively easy. Among hunter-gatherers, children as young as two years old make their own fires by taking sticks from their mothers’ fires. Even chimpanzees and bonobos can tend fires well. The bonobo Kanzi is famous for his ability to communicate with psychologist Sue Savage-Rumbaugh using symbols. During an outing in the woods, Kanzi once touched the symbols for “marshmallow” and “fire.” He was given matches and marshmallows, and he proceeded to snap twigs for a fire, light them with matches, and toast the marshmallows. By the time of habilines, brain size had roughly doubled compared with the relative brain size of great apes. It is very likely that habilines were mentally capable of keeping a fire alive.

The big question for the habilines that became Homo erectus is not how they tended fire, but how they would regularly have obtained it. In his Descent of Man, Charles Darwin mentioned an idea suggested by his archaeologist friend John Lubbock: sparks produced by accident from pounded rocks could have launched the control of fire. Anthropologist James Frazer liked the idea of human fire coming accidentally from hitting rocks together, and so did the Yakuts of Siberia, whose campfire tales recounted how hammering led to controlled fire. Certainly habilines would have seen sparks when they hit stones together to make tools. If they softened their meat by pounding it not only with logs but also with hammerstones, they would have had a second source of sparks. There often would have been dry tinder close by, such as grass or the tinder fungus that many people use today to catch a fire.

Anthropologists caution that the sparks produced by many kinds of rock are too cool or short-lived to catch fire. But when pyrites, a common ore containing iron and sulfur, are hit against flint, the result is a set of such excellent sparks that pyrites and flint are standard components of fire-making kits used by hunter-gatherers from the Arctic to Tierra del Fuego. If a particular group of habilines lived in an area exceptionally rich in pyrites, they could have found themselves inadvertently making fire rather often.





The steps to managing fire need not have involved the difficult process of deliberately making it. Here is an alternative scenario: during the tens of thousands of generations between the origin of habilines (at least 2.3 million years ago) and Homo erectus (at least 1.8 million years ago), from time to time the sparks resulting from habilines’ pounding rocks could have accidentally produced small fires in adjacent brush. Perhaps cocky juvenile habilines dared to grab the cool end of a branch and tease one another with the smoldering twigs or blazing leaves, much as young chimpanzees playfully bully one another with sticks they use as clubs. Adults learned the effect on one another of waving a burning log. The practice of scaring others with fire was then transferred to the serious job of frightening lions, saber-tooths and hyenas, similar to how chimpanzees use clubs against leopards. At first the fires went out. But over time, when sparks happened to start a fire, habilines learned that it was worth their while to keep it going. They cultivated fire as a way to help them defend against dangerous animals.

There are other possibilities. The climate was becoming increasingly dry. Natural fires could have become more frequent. Perhaps people walked behind brush fires looking for cooked seeds. Maybe they obtained fire from trees that burned slowly after being struck by lightning; a eucalyptus tree can smolder for eight months. Perhaps there was a permanent natural source somewhere in Africa, like the gas-fired strip of flame that has been burning nonstop near Antalya in southwestern Turkey ever since Homer recorded it in the Iliad almost three thousand years ago.

Repeated experience with natural fire would have been necessary to give individuals the confidence to use it, which would not have happened easily—otherwise, fire would have been controlled by every group of habilines. But if there were a natural source of fire, such as sparks, there would have been no need to learn to make fire, because it could be taken from nature again and again, and eventually from other groups: the chance of a rainstorm extinguishing every fire in a neighborhood would soon have become vanishingly small. Among Australian aborigines, groups that lost their fire from a drenching rain or flood would refresh their supply from neighbors, who would expect something in return, such as quartz flakes or red ocher. Sometimes the trade occurred across a territorial boundary, which made it dangerous, but risk did not prevent the vital recovery of fire.

Keeping a fire lit would have been a big achievement, but logs are easy to keep aflame when people are moving. Hunter-gatherers regularly carry fire in the form of a burning log. As long as the carrier is walking, the fire is well oxygenated and the log continues to smolder. When people stop, they start a small fire within a few minutes by adding a few sticks to the smoldering log and blowing.

An important step in fire’s becoming a central part of human lives was to maintain it at night. Suppose some habilines carried a smoldering log by day to protect against predators, then left it at the base of a sleeping tree when they climbed to make a nest for the night. It would not have been such a big step to give it extra fuel so the log would still be burning the next day—perhaps after seeing this happen first by accident. From there it would have been a smaller step to sitting near the fire to keep it burning, and thereby take advantage of its protection, light, and warmth.

Once they kept fire alive at night, a group of habilines in a particular place occasionally dropped food morsels by accident, ate them after they had been heated, and learned that they tasted better. Repeating their habit, this group would have swiftly evolved into the first Homo erectus. The newly delicious cooked diet led to their evolving smaller guts, bigger brains, bigger bodies, and reduced body hair; more running; more hunting; longer lives; calmer temperaments; and a new emphasis on bonding between females and males. The softness of their cooked plant foods selected for smaller teeth, the protection fire provided at night enabled them to sleep on the ground and lose their climbing ability, and females likely began cooking for males, whose time was increasingly free to search for more meat and honey. While other habilines elsewhere in Africa continued for several hundred thousand years to eat their food raw, one lucky group became Homo erectus—and humanity began.
11#

Can this also be understood as our ancestors having mastered the use of fire?
方芳 posted on 2024/12/5 10:07:06
Not quite: fire plus cooked food together only roughly add up to cooking.
12#

A wall of English; I can't understand it.
13#

A wall of English; I can't understand it.
金亚军 posted on 2024/12/5 10:22:40

Teacher 金, you're too modest. And yes, the full English excerpts really do look like a wall (please ignore them); fortunately the main post is presentable.
14#

Impressive, impressive! 小丽, you're a legend!
15#

Impressive, impressive! 小丽, you're a legend!
湖州熊二 posted于 2024/12/5 10:54:11
陈老师, stop teasing 陈老师! Working hard to become a minor deity of foodies. Fighting!
17#

Incredible: solid in classical poetry, solid in English too. Unstoppable across the elementary science world!
18#

Incredible: solid in classical poetry, solid in English too. Unstoppable across the elementary science world!
屠琴 posted on 2024/12/5 13:35:26
Shh~~ there are many, many low-key masters on this forum. With praise like that, I'll be the one "swept flat by the wind."
19#

Incredible: solid in classical poetry, solid in English too. Unstoppable across the elementary science world!
屠琴 posted on 2024/12/5 13:35:26

Haha, I dug up your old posts. So you're that low-key talent who loves learning (ranging across literature, history, philosophy, and science) and writes great reading notes. I should learn from you!
20#

How is everyone so good? This humble reader can't follow it and has to click "Translate."
21#

Eating cooked food sped up human evolution.
22#

Wow... that vocabulary is unbeatable.
23#

So many English excerpts; I almost thought I'd entered the wrong forum!
One person can walk fast; a group of people can walk farther!
24#

Hahaha, what an exaggeration!
25#

What a profound forum.
26#

How is everyone so good? This humble reader can't follow it and has to click "Translate."
皮卡裘二 posted on 2024/12/5 20:51:19
裘, I never said I wouldn't translate! Skim the translation for the gist, read the original closely for depth; neither Chinese nor English gets neglected.
27#

So many English excerpts; I almost thought I'd entered the wrong forum!
海风 posted on 2024/12/6 9:41:16
Sorry, sorry, my mistake; it won't happen again!
28#

I need to go and seriously brush up on my English.
29#

Using fire to make cooked food.
Tossing out a brick, hoping for a bit of jade in return!
30#

Thanks for sharing. Without clicking Translate it was honestly a slog (after two paragraphs, I decisively switched to the translation).