Do humans have any biological adaptations to eating cooked food?

Humans have been cooking food for at least tens of thousands of years. The presumed reason why cooking took root in nearly all human cultures is that cooked food is easier to digest. However, cooking food can also generate toxic compounds such as polycyclic aromatic hydrocarbons, which would not be found in the uncooked version of the food. Considering that humans have been eating cooked food for such a long time, I am wondering whether humans evolved any adaptations to eating cooked food, e.g., that certain types of foods, when cooked, are more toxic to our closest relatives (great apes) than they are to humans, because we have more ability to metabolize the toxic compounds.

Humans are remarkably good at processing Maillard compounds, the beneficial and mildly toxic byproducts of cooking. Humans break them down more efficiently than other animals do, which is presumed to be an adaptation to eating cooked food. Maillard reactions (browning) are also a good indicator of when most plant and animal products have been cooked to the point of being safest to eat, which may explain why humans on average show a preference for them, or even possess adaptations to detect them. We are also learning that this is one reason animal testing may be problematic in dietary studies.

Extra reading on hominid adaptations to diet.

There is evidence in mice that cooked food intake regulates different genes than raw food does. These genes also tend to be preferentially expressed in humans; more precisely, their overlap with human-specific genes is greater than expected by chance [1]. While this is not a dramatic finding, it does show an association between cooked food and the kind of genomic change that evolutionary adaptation implies.

As for the toxicity of certain edible foods to non-human primates, I do not know the answer, but I would not expect toxicity. Rather, the high metabolic demand of humans might have required a more protein-based diet, which is safer when cooked.


Do humans have any biological adaptations to eating cooked food? - Biology

Humans have biological plasticity, an ability to adapt biologically to our environment. An adaptation is any variation that can increase one's biological fitness in a specific environment; more simply, it is the successful interaction of a population with its environment. Adaptations may be biological or cultural in nature. Biological adaptations vary in their duration, from a few seconds for a reflex to a lifetime for developmental acclimatization or genetic change. The biological changes that occur within an individual's lifetime are also referred to as functional adaptations. Which type of adaptation is activated often depends on the severity and duration of stressors in the environment. A stressor is anything that disrupts homeostasis, a "condition of balance, or stability, within a biological system…" (Jurmain et al. 2013: 322). Stressors can be abiotic (e.g., climate or high altitude), biotic (e.g., disease), or social (e.g., war and psychological stress). Cultural adaptations can occur at any time and may be as simple as putting on a coat when it is cold or as complicated as engineering, building, and installing a heating system in a building.

Types of Biological Adaptation


Acclimatization

This form of adaptation can take moments to weeks to occur and is reversible within an individual's lifetime, whether it occurs in childhood or adulthood.

Short-term acclimatization can occur within seconds of exposure to a stressor, and the response quickly reverses when the stressor is no longer present. Imagine stepping out of an air-conditioned building or car into a 90-degree day. Your body will quickly begin to perspire in an attempt to lower your body temperature and return to homeostasis; when the temperature declines, so will your perspiration. Tanning is another short-term response, in this case to increased UV-radiation exposure, especially during summer months, and can occur within hours. Tans are generally lost during the winter as UV radiation decreases.

Developmental Acclimatization

Developmental acclimatization occurs during an individual's growth and development. It is also called ontogenetic acclimatization or developmental adjustment. Note that it cannot take place once the individual is fully grown; there is usually a "magic time window" during which the acclimatization can occur. This adaptation can take months to years to acquire.

A famous example is the difference between people who grew up at high altitude and those who moved to high altitude as adults. Those born at high altitude tend to develop larger lung capacities than those who moved there later in life. However, developmental adjustment occurs in response to cultural stressors as well: intentional body deformation has been documented throughout human history. The ancient Maya elite used cradle boards to reshape the skull, and foot binding in China, now an illegal practice, was considered a mark of beauty and enabled girls to find a wealthy spouse.


Genetic adaptations can occur when a stressor is constant and lasts for many generations (O’Neil 1998-2013). The presence of the sickle cell allele in some human populations is one example. Keep in mind that genetic adaptations are environmentally specific. In other words, while a particular gene may be advantageous to have in one environment (AKA a genetic adaptation), it may be detrimental to have in another environment.

Did Cooking Give Humans An Evolutionary Edge?

In Catching Fire: How Cooking Made Us Human, primatologist Richard Wrangham argues that cooking gave early humans an advantage over other primates, leading to larger brains and more free time. Wrangham discusses his theory, and why Homo sapiens can't live on raw food alone.

From NPR News, this is SCIENCE FRIDAY. I'm Paul Raeburn.

This summer, while you're helping yourself to burgers and hot dogs and corn off the barbecue, stop for a second to think about this: Have you ever seen a chimp or a gorilla flipping burgers at the grill in all those hours of animal films on television ever? Probably not.

That's because other primates don't barbecue their food or boil it or broil it or sauté it or any of those things. Cooking, it turns out, is a uniquely human thing, but my next guest says it's not just unique to humans, it's essential. It's what made us human, and he argues that this custom of cooking our food has not only changed our bodies over the years, giving us smaller mouths and smaller guts; he says it's given us an evolutionary advantage: bigger brains, more time to use those brains and less time wasted foraging and chewing all day long.

For the rest of the hour, we'll be talking about how cooking made us human, and give us a call. We'd like to hear from you and hear your questions. The number is 800-989-8255. That's 1-800-989-TALK.

Now I'd like to introduce my guest, Richard Wrangham, and we'll have some - we'll be talking about cooking with my guest Richard Wrangham, the author of "Catching Fire: How Cooking Made Us Human," who will be with us shortly. He is the director of the Kibale Chimpanzee Project in Uganda and Ruth Moore Professor of Biological Anthropology and Curator of Primate Behavioral Biology at Harvard University.

While we're waiting for him, his new book about cooking and how it made us human is just out. So please check that out if you're interested in what he has to say.

I think we have him. Dr. Wrangham, are you with us?

Dr. RICHARD WRANGHAM (Director, Kibale Chimpanzee Project; Ruth Moore Professor of Biological Anthropology, Harvard University): I am, Paul, thanks a lot.

RAEBURN: Thanks for joining us.

So now, I've given a bit of an introduction, which you might not have heard, about how cooking is not only unique to humans but made us human, and before we get to that argument and this very interesting new hypothesis of yours, do we have any evidence on how humans first began to cook?

Dr. WRANGHAM: Good heavens, no.

Dr. WRANGHAM: At all. It's completely lost in the mists of time, but what we do have evidence for is that our ancestors would have enjoyed cooked food as soon as they experienced it. That is to say, if they had the opportunity to sit next to a fire or to see the products of a fire and take food after it had been heated, there is very little doubt that they would have immediately appreciated it. And the reason for saying that is we have done tests on the great apes, and the great apes uniformly show a preference for cooked food over raw - or sometimes have no preference for cooked over raw in the case of one or two things - but they never prefer raw to cooked. And it seems likely that our ancestors would have been the same.

Put them next to a fire, drop something in by accident, and bingo, cooking gets going.

RAEBURN: Now, do we think that's - I mean, do we guess that somebody dropped meat in the fire or that there was a forest fire, and animals were burned? What do we think might have happened?

Dr. WRANGHAM: Well, I mean, that's all very speculative, of course. We can look at chimpanzees, and we do see a little bit of food processing by them, but not with fire - they will mash their foods. But what they will do in one place in West Africa is, after a fire has gone through and has baked some seeds of a species of tree that these chimpanzees do not normally eat when they're raw - do not eat at all, in fact, when they're raw - they will eat them after they're cooked.

So they understand that it's worth going to a place after a fire has swept through, and it's possible that that's one of the kinds of ways in which our ancestors might have learned.

RAEBURN: Now, somewhere along the line, there's this great difference between these very close relatives of ours, the chimps and apes, and humans - somewhere, these lineages split in evolutionary time, and one lineage started to cook, and the other didn't.

Now, you talk in the book about two particularly significant periods of development of human ancestors. Can you tell us a bit about that?

Dr. WRANGHAM: Well, yes. I mean, until we have the period of around two million years ago, our ancestors are pretty well known as australopithecines, which were chimpanzee-sized creatures that were not very different from a chimpanzee standing upright.

They walked bipedally. They had brains a little bit bigger than chimpanzees, but they were basically, undoubtedly, eating the same sorts of foods as chimps or gorillas: raw foods, a mixture of fruits and vegetables and maybe occasional bits of meat.

And then around 2.5 million years ago, we have the first of the great transitions, which is that the australopithecines gave way to a species that is variably called Homo habilis or Australopithecus habilis, the uncertainty reflecting the fact that people don't know whether to call it more ape-like, Australopithecine, or human-like, Homo.

RAEBURN: We're getting close to the dividing line here at this point.

Dr. WRANGHAM: Exactly. This was kind of the missing link. This was the species that was still pretty small but was getting a bigger brain, and it was associated in time with tools that could have been used for cutting meat off bones, and so they were almost certainly meat-eaters.

And then another half million years and more pass, and we get to around 1.9 million years ago, and that's when you have the first species that everybody is happy to put into our genus, Homo. Homo erectus, a species about our size, although variable, and the first one, one might say, that could walk down a street in a modern city and go into a store and get some clothes off the peg.

RAEBURN: They didn't prefer designer clothes, as far as we know.

Dr. WRANGHAM: Of course, you know, they might have been hairy, but I don't think they were. At any rate, Homo erectus - I mean, some people regard it as so similar to ourselves that they call it Homo sapiens, some professional anthropologists.

Most people recognize it as a different species on the basis of its smaller brain and somewhat thicker bones and so on, but it was pretty much like us. And so the big question about where do we come from and why did these changes happen from essentially a chimpanzee-like creature standing upright all the way to an early, primitive version of humans, concerns those two changes. So one into the habilis type, and then from habilis to Homo erectus, and that's where all the action lies.

RAEBURN: And that's where cooking came into play.

Dr. WRANGHAM: Well, for my money, it is. Undoubtedly, meat-eating was a hugely important part of all this. And the traditional view is that was all there is to say, that our ancestors became meat-eaters, and the rest followed from there.

But here's the thing. Nowadays, we look at people, and we find that if people go onto a diet of raw food, then something peculiar happens - which is that unlike every other animal, they do not thrive in terms of getting really adequate energy. And there is a pretty clear reason for this, which is that our species has a very odd type of digestive system.

It's less than two-thirds of the size of the digestive system we would have if we were a great ape - like a chimpanzee or a gorilla - in relation to our body size. And so we have somehow, and for some reason, adapted to having a small gut - and we also have small teeth and small mouths - all of which indicates that we, as a species, have adapted to a diet which is very high quality, and we don't have to put large amounts through our gut and retain them and ferment them for many, many hours.

Well, what kind of diet is that? It seems very clear that cooking is responsible for increasing the quality of our diet in this way. So then one can say, well, okay, when did we get these adaptations, the small gut, the small teeth, the small mouth? And the answer is 1.8, 1.9 million years ago with the evolution of Homo erectus.

So if cooking is what restricts us as a result of our small diet - our small guts, excuse me - to a high-quality diet, then surely that's when cooking must have begun.

So I now like to think that the way to conceive of the pattern of human evolution is in these two important jumps. One is the acquisition of meat-eating, around 2.5 million years ago, where a serious increase in the amount of meat eaten led to important biological changes, including the initiation of a larger brain, and then the full hominization at 1.9 million years ago. That's when, surely, cooking must have begun.

RAEBURN: So others have said, if I understand correctly, that it was meat-eating or meat-eating and hunting that led to this change or it encouraged or accelerated this change. You're saying it's cooking per se, not just raw meat wouldn't have done it.

Dr. WRANGHAM: Yes. I mean, one of the problems with the meat-eating hypothesis is the one that I just mentioned, that you're asking it to do two things, to explain two kinds of transition. But another is that people have not given consideration to the difficulty of eating raw meat.

You know, I spent a lot of time watching chimpanzees, and even chimpanzees find it difficult to eat raw meat, even though they've got, you know, much bigger jaws than ourselves and big teeth. They eat raw meat pretty slowly - so slowly that as they chew and chew and chew, the calculated rate of calories taken in per minute or hour is not very different from eating their not-very-high-quality fruit.

The difficulty with meat is that it's tough when it's raw. And I think that even when our early ancestors were taking this very important step of adding a significant amount of this high-quality food to their diet, they must have been processing it. And I think a very reasonable idea here, which should be archeologically testable, is that what the habilines were doing before the evolution of Homo erectus, when they were cutting meat off bones, undoubtedly, I bet that they were pounding it with rocks.

And we have lots of fist-sized hammer stones that they were clearly using for something at that time, and that would seem very reasonable. Because if you pound meat, then just like making steak tartare, just like making ground beef, it makes it much easier to chew and - actually, as we have seen with tests with pythons - it reduces the metabolic costs that the eater must pay for digesting the food. It actually gives you a relatively greater amount of calories if you can process your food in that way.

RAEBURN: So, you - in other words, the raw food. Is it the chewing, or is it something that has to happen in the stomach? Where are those extra calories going?

Dr. WRANGHAM: Well, there are two big reasons why it pays to cook your food. One is that it increases the proportion of the nutrients that you actually digest, because for raw food, there is a significant probability that a particular nutrient will pass through your gut undigested.

The amazing thing about this is that, until recently, this had not been widely appreciated, because most people had looked at the difference between the amount of nutrient that you eat in a mouthful and the amount that exits in your feces to try and work out how much of it you had digested. But this is not quite the right way to do it. Call that measure fecal digestibility. You might find that all of the starch, say, that you eat has disappeared by the time it reaches the feces, which makes it look as though it's 100 percent digestible.

But the reason it might not be is that in our large intestines, in our colons, we have some 400 or 500 species of bacteria and protozoa that are themselves hungry, as it were. They are going to metabolize any food that comes through into that area. And depending on what the nutrient is, our bodies may not be able to use that at all. I mean, in the case of protein, for example, any protein that goes from the end of the small intestine into the large intestine is completely useless to us metabolically. It is digested by the bacteria and transformed into chemicals that we cannot use.

So, the only way to assess the impact of cooking on digestibility is to look at what happens to the food by the time it reaches the end of the small intestine, before it goes into the large intestine. That is, of course, a great, great difficulty because it hurts if people dive into your guts and extract your food.

Dr. WRANGHAM: But there is a way to do it, and that is to take advantage of people who've had the misfortune of losing their large intestine, or a lot of it. They end up with an ileostomy, a bag at the end of the small intestine that lies on the surface of the abdomen, and this is the way that they pass their food. The researcher can then get permission to extract this effluent every 15 minutes, or whatever it is, and see how much of the food has been digested by the time it gets to the end of the small intestine.

The results have been fascinating. There is a study by some Belgian gastrointestinal physiologists on eggs. And what they discovered was that when you cook your eggs, then almost all of the protein is digested. So it's digested to the point of about 94 percent, whereas if it is eaten raw, then only 55 to 64 percent of it is digested and the rest is lost.
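The digestibility figures just described lend themselves to a quick back-of-the-envelope calculation. Below is a minimal sketch: the 94 percent (cooked) and 55 to 64 percent (raw) ileal digestibility values are the ones from the egg study above, while the 6 grams of protein per egg is an assumed typical value, not a figure from the study.

```python
# Back-of-the-envelope sketch of usable protein from one egg.
# Digestibility fractions (0.94 cooked, 0.55-0.64 raw) are from the egg
# study described above; 6 g protein per egg is an assumed typical value.

GRAMS_PROTEIN_PER_EGG = 6.0  # assumption: average medium egg

def protein_absorbed(grams_protein: float, ileal_digestibility: float) -> float:
    """Grams of protein absorbed by the end of the small intestine."""
    return grams_protein * ileal_digestibility

cooked = protein_absorbed(GRAMS_PROTEIN_PER_EGG, 0.94)
raw_low = protein_absorbed(GRAMS_PROTEIN_PER_EGG, 0.55)
raw_high = protein_absorbed(GRAMS_PROTEIN_PER_EGG, 0.64)

print(f"cooked: {cooked:.2f} g; raw: {raw_low:.2f}-{raw_high:.2f} g")
```

On these assumptions, cooking the egg delivers roughly half again as much usable protein as eating it raw, which is why the difference matters even though the gross calorie count is identical.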

RAEBURN: Well, that's a big number.

Dr. WRANGHAM: That is not a huge surprise.

Let me stop for just a minute to remind people I'm Paul Raeburn. This is SCIENCE FRIDAY from NPR News.

And I think there're so many questions here. We've now - now, we've gone from anthropology into potential diet advice here, I think. Let me…

Dr. WRANGHAM: Well, yes. I mean…

Dr. WRANGHAM: …it's a great way to lose weight for people…

Dr. WRANGHAM: …to eat raw food.

RAEBURN: If you're not trying to evolve into a human. Let me take a call. We have Dan, from Boston. Go ahead, Dan.

DAN (Caller): Hi, gentlemen. I have a question about - well, I guess it's already been addressed to some degree, but exactly how the food is changed when it's cooked, from a molecular standpoint. I was just curious if you could - you can address that a little bit.

Dr. WRANGHAM: Well, yes. No, because I was only really getting into that. So what I was talking about was when you cook the food, then you increase the digestibility of many foods, many nutrients. But exactly how does that happen?

Well, we were talking about protein just then. And the conclusion of the people who investigated this question is that the significant consequence is that when you heat protein at all, then it tends to lose its structure. It's called denaturation, the sort of - it's like a ball of wool that is tightly wound. And as you heat the protein, it opens up. And the significance of being opened up is that it is then much easier for the digestive enzymes to come in and snip off the peptide bonds, to break off the amino acids.

RAEBURN: It's easier to digest, in other words. It's just easier to digest. Yeah.

Dr. WRANGHAM: Exactly. It becomes digestible in a way that it previously was not. And so this is a completely predictable consequence of heating. I mean, denaturation, of course, is hugely important. One of the ways in which we denature protein is by putting it in acid. And guess what our stomach is full of? It's full of, you know, hydrochloric acid, pH 1 or 2 - very, very acidic. So that starts the process. But cooking really accelerates the amount of denaturation that happens and, therefore, exposes these molecules to the easy action of the digestive enzymes.

And another example is starch. Starch, when eaten raw, is a sort of semi-crystalline granule. And when it is cooked - properly, at least - then it opens up, and the amylose and amylopectin - these chains of sugars - are opened up and, again, can be snipped apart. So heat exposes the molecules to action by digestive enzymes.

RAEBURN: So this is - this, to me, is from a - again, pardon me for shifting the focus. But from a dietary direction, aside from the anthropological implications, we've - I mean, I have argued many times with many people about the fact that food has so many calories, that's it. And now you're saying that it's not a question of, you know - a carrot has some calories when it's cooked and some when it's raw.

Before you answer, however, we have to take a short break. We have lots more with Richard Wrangham. Please stay with us.


RAEBURN: From NPR News, this is SCIENCE FRIDAY. I'm Paul Raeburn.

We're talking this hour about how cooking made us human. My guest is Richard Wrangham, the author of "Catching Fire: How Cooking Made Us Human," oddly enough. He's also director of the Kibale Chimpanzee Project in Uganda and professor at Harvard University in Cambridge.

Now, before the break, Professor Wrangham, I was asking you about how many calories in a cooked carrot or a raw carrot. You mean to tell me that all those books I have that tell me how many calories are in this or that, they don't say anything about whether the food is cooked or raw. So what's going on? Do I have to throw those all away?

Dr. WRANGHAM: Well, I mean, we do have to modify it, and it's true for the food labeling system. Because if you look up on the USDA Web site and see how much - how many calories there are in a piece of raw meat or a piece of cooked meat or a raw potato or a cooked potato, you will find that old wisdom there, that it's the same. But look, when you cook, when you process the food, it really does affect the number of calories you get into your body. And there was a wonderful little experiment on rats that just shows this so clearly, and it's a very, very simple form of processing. It's a little analogous to cooking.

This is an experiment in which rats were given their regular chow pellets in two different forms. One was the ordinary pellet, and the other had air added - the pellets were puffed up. It's as if you took a grain of wheat and puffed it up into puffed wheat.

RAEBURN: No nutrients added or subtracted, just air.

Dr. WRANGHAM: That's the only thing, air. And the experimenters were very careful. They gave exactly the same number of calories as measured - you know, the same weight of food to two groups of rats. And they measured how much locomotion they expended, and it was the same. So, same number of calories, same locomotor expenditure - you'd think that they would grow at the same rates. But the ones that ate the softer food grew faster, ended up heavier and had 30 percent more body fat.

RAEBURN: So they grew obese.

Dr. WRANGHAM: Well, you see, this is where the costs of digestion come in. It's so important, because they could actually show where the difference was. And the difference is this: that after a meal, the rats that ate the softer food had a lower rise in body temperature than those that ate the harder food. Their metabolic rate was lower because their bodies were working less hard, because there was less to do. They didn't have to soften their food.

And this is a wonderful little model, I think, for all sorts of examples in the human case. When we turn our beef into ground beef - just like hunters and gatherers who cook their meat and then pound it, what we're doing is making it easier for our bodies to digest the food and therefore sparing our bodies the need to waste energy, calories, on digesting the food. And the result is that the net caloric gain is greater when we eat food that has been more highly processed.
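The "costs of digestion" idea can be sketched as a toy net-energy model. Everything numerical here is an illustrative assumption - the 25 percent and 10 percent digestion-cost fractions are made-up values chosen only to show the direction of the effect, not measurements from the rat experiment described above.

```python
# Toy model of net energy after paying the metabolic cost of digestion
# (diet-induced thermogenesis). Cost fractions are illustrative
# assumptions, not measurements from the rat study described above.

def net_calories(gross_kcal: float, digestion_cost_fraction: float) -> float:
    """Calories retained after subtracting the cost of digesting the meal."""
    return gross_kcal * (1.0 - digestion_cost_fraction)

# Same gross intake, different degrees of processing:
hard_food = net_calories(100.0, 0.25)  # assumed: tougher food, higher digestion cost
soft_food = net_calories(100.0, 0.10)  # assumed: softer/processed food, lower cost

print(f"hard: {hard_food:.0f} kcal net; soft: {soft_food:.0f} kcal net")
```

The point of the sketch is that two meals with identical label calories can yield different net energy once the body's own work of digestion is subtracted, which is the mechanism the rat experiment isolates.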

Dr. WRANGHAM: And, of course, this is fantastically significant in terms of the fact that we've got an obesity crisis and very highly processed food in the middle(ph).

RAEBURN: It's amazing. Let me - I know there are some listeners who want to get a crack at this. Let me try Katie in Cape Cod.

KATIE (Caller): Hi. How are you?

RAEBURN: Good. Go ahead and ask your question.

KATIE: I've recently been introduced to the raw food diet, and I was wondering what you think about that and if you think it's all a hoax.

RAEBURN: Go ahead, doctor(ph).

Dr. WRANGHAM: Well, thanks, Katie.

Dr. WRANGHAM: I mean, that's a great question. And the funny thing about the raw food diet is that many of the proponents argue that it is the natural thing to do. And I'm quite sure that it's not the natural thing to do, in the sense that we're not biologically adapted for it. If you look at raw foodists nowadays, they lose weight on a raw food diet - even to the point where, in the only large survey that has been done of this, half of the women on a 100 percent raw food diet turned out to stop menstruating, an indication of how little energy they have. The scientists concluded that raw food diets lead to chronic energy shortage.

So if you want to gain energy, if you're living in the Third World, like a third of the people in the world, very hungry, then you - the last thing you want is a raw diet. But in our society, a raw diet can have all sorts of advantages. It can help you control your weight, and it has other advantages, too, for some people. I mean, there's lots of benefits that people report.

Some people find that they get reductions in rheumatoid arthritis, for instance, some very specific things like that. But many people feel a greater sense of well being, more vitality - quite often, less pain. And I think partly, this is going to be due to just eating less, and partly it probably is due to the fact that some people may be allergic or have some kind of response to the chemicals that are produced in cooked food. So, it's a very personal thing. You know, for some people, raw diets can be terrific. It's just that, you know, don't think they're natural, they're not.

RAEBURN: Are you on a raw diet, Katie?

KATIE: I am not on a raw diet, but I've started to eat a lot more raw foods, and I found that I did lose a good amount of weight by changing my diet.

RAEBURN: And now you know why. Thanks for calling.

RAEBURN: We have another call from - let me see if I can get the right button here - from Mark in Saint Paul, Minnesota.

MARK (Caller): (unintelligible) and I think - thanks for taking my call.

RAEBURN: Where are you from?

RAEBURN: South Saint Paul. Okay I…

MARK: It's one of the misnomers. We're sort of south, but we have west, that's actually just west of South St. Paul. North is actually east of St. Paul, and (unintelligible) just like that.

RAEBURN: Wow. I'm with you. Go ahead with your question.

MARK: Well, I have actually two. I developed the second one while listening to you. The first one is about - well, not to be overbroad about it, but the aspect of race. Let me explain real quick. Scientific American did a piece on some Pima Indians in the Southwestern United States that ate the typical American diet: lots of meat, starch and so forth. They had overweight, diabetes, hypertension - the works.

But they had some (unintelligible) cousins down in Southern Mexico who were eating the traditional diet, and they did not have any of the medical troubles. So is there something in where we, sort of, racially evolved, you might say - a measure of whether we can eat certain foods better than others?

RAEBURN: Okay, let's give that question to Richard Wrangham. And then, if we have time, we'll come back for your next one.

Dr. WRANGHAM: Yeah. Well, thanks, Mark. And these are important questions, but I'm working at the level of the species differences, of what makes us different from chimpanzees. And I have not seen any evidence of - specifically, cooking - having different impacts or leading to different kinds of digestibilities among different peoples.

So, it certainly is the case that different peoples have some different enzymes. The famous one is the lactose digestion enzyme which is persistent in people who have got an evolutionary history of eating milk. And whether or not this applies to some of the products of cooked food, I can't tell you.

RAEBURN: Okay, Mark, give us the second question very quickly if you could, please.

MARK: Okay. It's about caramelization. Some researchers say that caramelization can actually induce aging in certain measures. The question, basically, is: How does overcooking or undercooking foods actually affect our aging? And in regard to that, does aging affect our ability to eat certain foods? Thank you.

Dr. WRANGHAM: Well, let me answer that question by saying that cooking does very greatly increase the concentration of a series of compounds sometimes called Maillard compounds, after the French biochemist who discovered them, which are combinations of amino acids and sugars. And these are very much implicated in certain aspects of aging. We produce them more as we age, but the dietary dose from cooking is a very high concentration compared to our normal production. And they're significant because they do produce tumors in rats. So it is feared that they may be important in humans.

Now, recently, it was discovered that one of these, acrylamide, occurs at unexpectedly high concentrations in cooked potato products, and no one's known quite what to do with that. Because as I understand it, the expectation was that this should've led to lots of cancers among people eating cooked potatoes, but actually it doesn't.

So maybe we have, as a species, adapted to the negative effects of these Maillard compounds. It's one of many, many areas where there's not enough information for us to be certain. But you're absolutely right that cooking produces these Maillard compounds, and many of them are known to be toxic in other animals, so there are cases where we want to know more about that.

RAEBURN: I'm sorry to say, we're out of time. I just have one final question for you. Have you thought of leaving Harvard and opening a restaurant?

Dr. WRANGHAM: Well, you know, there's a lot to be said for an evolutionary approach to eating, undoubtedly. And I'll take that suggestion on hold.

RAEBURN: Yeah, think it over. Our guest has been Richard Wrangham, the author of "Catching Fire: How Cooking Made Us Human." He's also a director of the Kibale Chimpanzee Project in Uganda, and Ruth Moore professor of biological anthropology and curator of primate behavioral biology at Harvard University in Cambridge, Massachusetts. Whether you open a restaurant or not, thanks for being with us.

Copyright © 2009 NPR. All rights reserved. Visit our website terms of use and permissions pages at for further information.

NPR transcripts are created on a rush deadline by Verb8tm, Inc., an NPR contractor, and produced using a proprietary transcription process developed with NPR. This text may not be in its final form and may be updated or revised in the future. Accuracy and availability may vary. The authoritative record of NPR's programming is the audio record.

Biology's response to dieting: the impetus for weight regain

Dieting is the most common approach to losing weight for the majority of obese and overweight individuals. Restricting intake leads to weight loss in the short term, but, by itself, dieting has a relatively poor success rate for long-term weight reduction. Most obese people eventually regain the weight they have worked so hard to lose. Weight regain has emerged as one of the most significant obstacles for obesity therapeutics, undoubtedly perpetuating the epidemic of excess weight that now affects more than 60% of U.S. adults. In this review, we summarize the evidence of biology's role in the problem of weight regain. Biology's impact is first placed in context with other pressures known to affect body weight. Then, the biological adaptations to an energy-restricted, low-fat diet that are known to occur in the overweight and obese are reviewed, and an integrative picture of energy homeostasis after long-term weight reduction and during weight regain is presented. Finally, a novel model is proposed to explain the persistence of the "energy depletion" signal during the dynamic metabolic state of weight regain, when traditional adiposity signals no longer reflect stored energy in the periphery. The preponderance of evidence would suggest that the biological response to weight loss involves comprehensive, persistent, and redundant adaptations in energy homeostasis and that these adaptations underlie the high recidivism rate in obesity therapeutics. To be successful in the long term, our strategies for preventing weight regain may need to be just as comprehensive, persistent, and redundant as the biological adaptations they are attempting to counter.


Figure legends (truncated in the source):

  • Pressures affecting the steady-state weight. The three pressures, all influenced by the underlying…
  • Biology's influence during obesity development, treatment, and relapse. Homeostatic systems adapt to prevent…
  • Homeostatic adaptations to an energy-restricted, low-fat diet. A: adaptations in the homeostatic…
  • Effect of dietary fat on the resolution of insulin and leptin and the…
  • Impact of weight loss on the components of total energy expenditure (TEE). Energy-restricted…
  • The change in adipocyte cell size frequency distribution with weight loss and weight…
  • Model linking adipocyte cellularity and peripheral nutrient clearance to the energy gap during…


This neutrophil-based study identified several lines of converging evidence showing the importance of circadian oscillations of cytosine modification in humans. The discovery that cytosine modification is a part of the cellular circadian machinery is at odds with the traditional perception of static cytosine modifications in somatic differentiated cells, albeit with some gradual and unpredictable life-long “epigenetic drift” [40]. Our findings indicate that unexplained inter- and intra-individual variations of cytosine modification are not as random as once thought. Differential distribution of osc-modCs across genomic elements can be one of the reasons why regions outside of CpG islands exhibit higher variance in cytosine modification [41, 42]. Since epigenetic elements of higher variation are involved in tissue differentiation and malignant transformation (ibid.), osc-modCs may play a role in both processes. The observation of epigenetic “apogee” and “perigee” provides new mechanistic insights into carcinogenesis: if circadian epigenomic convergence is not fully compensated by divergence, the cytosine modification profile could acquire cancer-like features over a number of cycles, resulting in an extreme case of epigenetic “perigee.”

An overlap between osc-modCs from a single individual and positions of variable cytosine modification in the general population suggests that inter-individual epigenetic variability may be influenced, at least to some extent, by the circadian rhythm. Potential sites of population epigenetic variance may result from circadian differences among individuals, differences in the circadian time of sample collection, and (or) biological variation associated with osc-modCs (e.g., epigenetic aging). Our findings imply that differential epigenetic variation identified in several disease studies [29, 43,44,45] may also be associated with circadian epigenomic oscillations. If proven true, a direct link between circadian epigenomes and inter-individual epigenomic variation would provide a mechanistic basis for parts of the 80% of population variation that is assumed to be of unexplained environmental origin [33].

While we purified neutrophils to eliminate between-cell-type heterogeneity of blood, the study can still be confounded by circadian replenishing of the neutrophil subtypes [46]. Although neutrophils exhibit variation in density of surface antigens during maturation (e.g., CD62L) [46], current evidence suggests that cytosine modification profiles across different stages of development show no discernible differences [47]. In addition, the neutrophil study used for our intra- vs. inter-individual comparison [18] tested for the expression of several surface antigens, a proxy marker for cell type heterogeneity, and excluded neutrophil subpopulations as the main determinant of inter-individual cytosine modification variability. Nevertheless, neutrophil subtypes have not been clearly characterized, and interpretation of our findings cannot be completely unambiguous. The fact that WBC-differentiating cytosines were significantly enriched for osc-modCs may imply hidden cellular heterogeneity. On the other hand, this may also indicate that osc-modCs are linked to cell differentiation and development. We believe that even if some cytosines were to be involved in daily dynamics of neutrophil subtypes, and therefore simulate oscillations, it is unlikely that they can fully account for the findings described in this study. Given the large number of EWAS that showed an association with our osc-modCs, where all studies were performed independently from each other, all datasets would have to be consistently confounded by inefficiently corrected heterogeneity. However, we acknowledge that our findings may be confounded by hidden heterogeneities and that our biological interpretation may change as new WBC subtypes are discovered. Moreover, even if our findings turn out to result from some yet unknown hidden neutrophil subtype heterogeneity, our biological interpretation may be incorrect, but that would not diminish the value of osc-modCs as epigenetic markers of disease.
Relatedly, since our subjects were exposed to normal lighting and eating habits during the experiments (i.e., external entrainment cues), it is also difficult to parse out the relationship between osc-modCs and intrinsic circadian rhythm. In short, there are many hurdles that impede our ability to fully interpret our findings, but future environment-controlled experiments involving large-scale molecular characterizations of neutrophils at a single-cell resolution may help resolve these questions.

Independent of interpretation uncertainties, a shift from “static and stochastic” cross-sectional studies to “cyclic and deterministic” circadian strategies can change our understanding of the molecular and cellular basis of common disease. Circadian strategies are based on multiple samples (WBC, adipocytes, fibroblasts, cultivated cells) collected over a 24-h period (or longer) to identify individual-specific profiles. Although the cause-and-effect relationship between disturbed circadian cycles and complex disease still needs to be established, the circadian interpretation of disease origin is simple and intuitive: daily circadian reprogramming is likely to be prone to errors, and imperfectly maintained circadian aberrations in epigenomes (and transcriptomes, metabolomes, or cell subtypes) gradually convert into disease risk factors.

Circadian molecular and cellular studies may identify individual-specific disease features that could open new opportunities for precision medicine, and offer a customized approach for predicting disease risks and prognosis to facilitate early and efficient interventions [48]. Such approaches would integrate circadian biomarkers with clinical data to develop a more accurate molecular disease taxonomy to improve diagnostic specificity and treatment efficacy [49]. Furthermore, since circadian parameters can be modified by diet, lifestyle, and medications [50], we predict that preventative interventions aimed at rectifying circadian aberrations may be a viable approach to reduce the risk of a disease or delay its age of onset.

How Cooking Creates the Toxins in Food that Cause Disease!

Eighty million species on earth — about 700,000 of which are animals — thrive almost exclusively on raw, non-GMO or organic food. Only humans intentionally cook our food or design its DNA in a laboratory. So it isn’t very surprising that we also typically die at just half of our potential life span from chronic illnesses that are almost all diet and lifestyle related. Farm animals and pets are also fed cooked, processed, packaged and genetically modified foods — and also suffer the same kind of chronic health problems that we do — cancer, arthritis and other degenerative diseases.

In their natural environment, the typical species on our planet lives seven times its age of maturity. Humans generally mature in their late teens to early twenties, making our potential life span 120-140 years. This is never actually realized, largely due to the health effects of cooking our food, GMOs and the toxins created by heat, food processing or cooking! Before discovering fire 10,000 to 20,000 years ago, we thrived for millions of years on fresh, raw, live foods furnished by nature in their whole unadulterated state. In some ways cooking allowed humans to expand all over the world, from Africa to Antarctica. However, we paid dearly for that with shorter lifespans and many diseases.

These days humans cook or process almost all of our food, which then breaks down under heat into toxins we can’t properly assimilate. The GMOs we eat are often by-products of eating farm animals fed GMO feed. Slowly the decades pass, and the harmful effects of these toxins become more and more deadly. Like most animal species, humans are actually adapted over eons to eating live, raw food. Raw food — not cooked or processed — is what we are actually designed to eat.

Raw Food Contributes to the Prevention of Cancer & Disease

Scientific evidence shows that a raw food, vegan diet decreases toxic by-products of cooking or heat in the colon. — Ling WH and Hanninen O, Department of Physiology, University of Kuopio, Finland. J Nutr 1992 Apr;122(4):924-30

Shifting from processed, GMO and cooked foods to a healthy raw food diet dramatically alters fecal hydrolytic activities in humans, suggesting that live, raw food causes a decrease in bacterial enzymes and other toxins associated with colon cancer. Researchers also found that a diet rich in raw vegetables lowers the risk of breast cancer. Finally, including fresh fruit and veggies in your diet is associated with fewer heart attacks and related problems. — British Medical Journal, 1996

Heating or cooking food over 118 °F denatures its nutrients. You wouldn’t stick your hand on a hot stove — so why do that to your food? You see, when you burn your finger some of your healthy, live skin actually dies — and you feel that immediately. However, when you cook food it also “kills” most of the nutrients and enzymes but you just don’t feel it for many years.
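All temperatures in this piece are quoted in Fahrenheit. For readers who think in Celsius, the standard conversion C = (F − 32) × 5/9 puts the article's claimed 118 °F threshold at about 48 °C; a quick sketch (the threshold itself is the article's claim, not an established cutoff):

```python
def fahrenheit_to_celsius(temp_f: float) -> float:
    """Convert a Fahrenheit temperature to Celsius."""
    return (temp_f - 32.0) * 5.0 / 9.0

# Thresholds quoted in the text:
for f in (118.0, 190.0, 249.8):
    print(f"{f} °F = {fahrenheit_to_celsius(f):.1f} °C")
# 118.0 °F = 47.8 °C, 190.0 °F = 87.8 °C, 249.8 °F = 121.0 °C
```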

Food fresh from the field or orchard provides the healthy, living raw materials that you need to replenish your cells and tissues. Cooking or processing, on the other hand, destroys live plant and animal tissues so their nutrients no longer bear any relationship to a living body.

Heating, processing or cooking food breaks down vitamins and amino acids, producing undesirable cross-linkages in proteins, especially meat. When food is cooked above 118 °F for over three minutes, the following negative changes cause nutritional damage that gets worse as temperatures increase:

  • proteins coagulate
  • high temperatures denature protein molecular structure leading to deficiency of some essential amino acids
  • carbohydrates caramelize
  • overly heated fats generate numerous carcinogens including acrolein, nitrosamines, hydrocarbons, and benzopyrene (one of the most potent cancer-causing agents known)
  • natural fibers break down
  • cellulose is completely changed from its natural condition: it loses its ability to sweep the alimentary canal clean
  • 30% to 50% of vitamins and minerals are destroyed
  • 100% of enzymes are damaged, the body’s enzyme potential is depleted which drains energy needed to maintain and repair tissue and organ systems, thereby shortening our life span
  • pesticides are restructured into even more toxic compounds
  • valuable oxygen is lost
  • free radicals are produced
  • cooked food pathogens enervate the immune system
  • heat degenerates nucleic acids and chlorophyll
  • cooking causes inorganic mineral elements to enter the blood and circulate through the system, which settle in the arteries and veins, causing arteries to lose their pliability
  • the body prematurely ages as this inorganic matter is deposited in various joints or accumulates within internal organs, including the heart valves.

As temperatures rise, each of these damaging events reduce the availability of individual nutrients. Modern food processing not only strips away natural anti-cancer agents, but high heat actually creates new, more potent cancer-causing chemicals in the process. This gets even worse with many GMOs. “Alien” toxins are thus created that the body cannot properly metabolize.

Chemicals from Cooking can Cause Tumors Throughout the Body

  • Mucoid plaque, a thick tar-like substance, builds up in the intestines
  • Mucoid plaque is caused by uneliminated, partially digested, putrefying cooked fatty and starchy foods eaten in association with protein flesh foods
  • Lipofuscin is another toxin: an accumulation of waste materials throughout the body and within cells of the skin (manifesting as liver spots), in the liver, and in the nervous system including the brain, possibly contributing to ossification of gray matter and senility.

— From an article by Dr. Bruce Ames, professor of Biochemistry and Molecular Biology at University of California, Berkeley.

Additional carcinogens in cooked or processed foods include:

  • Hydroperoxide, alkoxy, endoperoxides and epoxides from heated meat, eggs, fish and pasteurized milk
  • Allyl aldehyde (acrolein), butyric acid, nitropyrene, nitrobenzene and nitrosamines from heated fats and oils
  • Methylglyoxal and chlorogenic atractylosides in coffee
  • Indole, skatole, nitropyrene, ptomatropine, ptomaines, leukomaines, ammonia, hydrogen sulfide, cadaverine, muscarine, putrescine, neurine, and mercaptans in cheese.

— From the book Diet, Nutrition and Cancer, Nutritional Research Council of the American Academy of Sciences (1982), and the FDA (Food and Drug Administration) Office of Toxicological Sciences.

Since 1950, as processed food became more common in the U.S., cancer rates rose as well! We are now at the highest cancer rates in history. The result of consuming cooked food is both (1) minimal nutrition and (2) maximum toxic overload. After a typical cooked meal on the SAD diet (Standard American Diet), your body is forced to raid its dwindling supply of nutritional reserves, leading to more hunger even though your stomach is full. This results in chronic overeating and obesity, now also at epidemic levels nationwide.

You see, cooking denatures proteins – period. This creates an unhealthy or toxic modification to the protein molecules. Heat, acid, alkali and ultraviolet radiation can also denature a protein’s properties making it unusable or less usable by your body. Unlike more basic organic molecules, the properties of proteins are dangerously changed when they are cooked/heated. This also denatures the secondary bonds of the molecular chains. Once these weak bonds are broken the molecule falls into a disorganized tangle — and loses the protein’s original biological function.

For example, enzymes lose their catalytic powers. Hemoglobin in your blood can lose its capacity to carry oxygen. Denaturation can also destroy the highly specific patterns that amino acid chains use when folded into the native protein. In the case of egg whites, a gel or coagulum is formed when heat/cooking is applied forming enzyme resistant linkages that inhibit the separation of constituent amino acids.

This is why I use the phrase “dead food” when referring to cooked food.

How Cooked Proteins Coagulate Becoming Indigestible

You can actually see the “coagulation” of protein molecules when you fry an egg. The clear protein surrounding the yolk turns white, thickens and then coagulates. Your digestive enzymes (peptones and proteases) cannot readily break down coagulated protein molecules once fused together. Now these heat-damaged proteins are no longer available to your body — but there’s more! The indigestible proteins will also putrefy as bacteria in your digestive tract feed upon the dead organic matter. Bacterial enzymatic by-products are highly carcinogenic. This coagulation effect happens microscopically in ALL cooked or heated protein molecules whether you can actually see it (like the egg) or not.

So when you eat raw fruits, vegetables, nuts and seeds as a source of protein (amino acids) you are getting the maximum biological value of all the nutrients in that raw food. As you consume more fresh, raw produce, your body progressively requires less and less food as you absorb more and more nutrients. Since the food is raw and uncooked, you also require little or no energy for the elimination of toxins.

The result is that as you eat more nutrient-rich fresh, raw food over time your body adapts and becomes healthier. Your metabolic efficiency increases as does your ability to assimilate the higher nutrition found naturally in raw food. Eventually you will only need about half the protein you needed when consuming cooked, dead food filled with mostly denatured proteins.

However, some physiologists claim that cooking and digestion are chemically identical or that cooking is a form of pre-digestion where heat hydrolyzes nutrients that otherwise are hydrolyzed by digestion. This is a huge oversimplification. There are two ways to denature proteins: chemically with enzymes or with heat (cooking). However, when you cook/heat your food, your body cannot utilize the heat-damaged, denatured amino acids. They’ll never be viable, healthy protein molecules again.

Cooking denatures food molecules far past the point of any bio-activity. Your body heat is way too low to permanently damage protein molecules like that. It digests food chemically with enzymes — not heat! So these chemically digested proteins can easily be reused. Protein molecules denatured by heat/cooking cannot be used by your body at all.

Raw, Live Plant Protein Is Best. A fresh raw food diet includes lots of raw proteins. Fruits, vegetables, nuts, seeds and sprouts do not require cooking at all to improve their taste or digestibility. When proteins are heated during cooking, however, enzyme resistant links are formed between the amino acids that the body cannot separate — and what the body cannot use it must eliminate or store in your fat tissues. So now these cooked and damaged proteins can become a source of toxicity for the whole body.

Some high-protein plant foods like soybeans and Lima beans contain naturally occurring toxins that some people say are neutralized by cooking. However, cooking does not totally remove all the toxic effects of such foods — and has little effect on the toxins found in GMOs. On a raw food diet you’ll get everything you actually need without having to consume the few things that may be enhanced by cooking. Just don’t eat anything that has to be cooked at all.

Detrimental Effects of Heat on Nutrients. According to the textbook Nutritional Value of Food Processing, written for chemists in the processed food industry, changes that occur during food processing result in either nutrient loss or destruction. — Karmas & Harris, Nutritional Value of Food Processing, 3rd Edition, Van Nostrand Reinhold

Effect of Temperature on Rate of Destruction of Food Components (destruction time in minutes at 250 °F)

  • Vitamins: 100 to 1000 min; 20-30
  • Quality factors (texture, color, flavor): 5 to 500 min; 10-30
  • Enzyme inactivation: 1 to 10 min; 10-100
  • Vegetative cell inactivation: 0.001 to 0.01 min; 80-200
  • Spore inactivation: 0.1 to 5 min; 50-200
  • At 121 °C (249.8 °F) the nutritional components decreased by 90%
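Food-science texts model this kind of thermal destruction with first-order (decimal reduction) kinetics: a D-value is the heating time that destroys 90% of a component at a given temperature, and a z-value is the temperature rise that cuts D tenfold. A minimal sketch of that model; the reference D of 200 min and z of 50 °F below are illustrative assumptions, not values taken from the table above:

```python
def d_value_at(temp_f: float, d_ref_min: float,
               ref_temp_f: float = 250.0, z_f: float = 50.0) -> float:
    """Decimal reduction time (min) at temp_f, given D at a reference
    temperature and a z-value in °F (both assumed, for illustration)."""
    return d_ref_min * 10.0 ** ((ref_temp_f - temp_f) / z_f)

def surviving_fraction(minutes: float, d_min: float) -> float:
    """Fraction of a component remaining after heating: each D minutes
    of heating destroys 90% of what is left."""
    return 10.0 ** (-minutes / d_min)

# Heating for exactly one D-value leaves 10% of the component:
print(surviving_fraction(100.0, 100.0))  # 0.1
# Raising the temperature by one z-value cuts D tenfold:
print(d_value_at(300.0, d_ref_min=200.0))  # 20.0
```

This is why the table pairs each component with a range of times rather than a single number: the effective D-value shifts by an order of magnitude for every z-degrees of temperature change.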

At relatively low temperatures, the destruction of enzymes is higher than that of microorganisms. The temperature range where the destruction of enzymes equals that of microorganisms is 270-290 °F. The fact that heating food reduces its nutrition cannot be argued.

During baking, for example, the effect on nutrition is most intense in the crust or outer portions of the baked goods. While the heat of baking does denature proteins, the quality of the protein is also affected by “browning,” called the Maillard reaction. Maillard reactions are responsible for the odors and flavors of freshly baked products. The dark crust many people like on bread is, for example, the result of a Maillard reaction. However, Maillard reactions produce nothing of nutritional value at all; they actually reduce the nutritional value of bread and raise cholesterol.

Most amino acids are destroyed during the Maillard reaction. The damage to lysine after toasting bread was studied in rats using bread toasted to varying degrees of brownness. Weight gain in rats was especially low with diets of dark-toasted bread.

Cooking Can Affect Vitamins, which are heat sensitive. Thiamine and vitamin C were the most susceptible to baking losses. When the pH of the baked product rises above 6, nearly all of the thiamine is destroyed. Similar conditions exist in a variety of chemically leavened baked goods including cookies and crackers. In high-protein cookies, calculations revealed thiamine losses from baking exceeding 90%.

In addition to baking, vitamin B6 and pantothenic acid losses can be as high as 91% in canned food. The recommended daily allowance (RDA) for these two nutrients probably cannot be obtained from eating refined, processed and canned foods at all! When foods are treated with heat they can lose up to 97% of the water-soluble vitamins (vitamins B and C) and up to 40% of the lipid-soluble vitamins (vitamins A, D, E and K).

Cooking Can Affect Minerals. Heat affects the absorption of some minerals by making them less absorbable. Phytate, fiber, proteins, and certain minerals are particularly suspect as components of these complexes. Vitamins and minerals must be consumed in an organic colloidal, naturally chelated form to be absorbed, assimilated and utilized by your body! Heat damages the molecular structure of vitamins and minerals, returning carbon atoms to an inorganic, ash-like form that is like a toxin to your body.

Cooking Affects Fats & Carbohydrates by damaging the carbohydrate and fatty acid content of baked goods via the Maillard reaction. In a hot oven, linoleic acid and even some other fatty acids can be converted to unstable hydroperoxides, which affect the nutritional quality of the baked goods. Heat actually changes the lipids or fats, interfering with cellular respiration and causing an increase in cancer and heart disease. Carcinogenic, cancer-causing compounds created by heating fats include acrolein, nitrosamines, hydrocarbons and benzopyrene.

Deep-fried foods like fried chicken, french fries, onion rings, potato chips, corn chips, cooked beef, chicken and most cooked meats are even worse due to their high fat content. This is perhaps the main reason why cancer is the number one killer of children in the U.S. Oils and fats can turn rancid and toxic when broken down during cooking. High heat applied to oils turns them into hydrocarbons that can cause cancer. Frying temperatures can reach up to 600-700 °F, causing fatty acids to become trans fatty acids. The unsaturated fats then act like saturated fats, raising cholesterol and LDL cholesterol levels almost as much as saturated fats do. This is another reason why fried foods contribute so much to hardening of the arteries.

When oil is reheated for deep fryers, the fat is more likely to develop cancer-causing agents like acrolein and benzopyrene. Very hot temperatures also destroy vitamins and damage major proteins. Temperatures up to 1,000 °F (found in many fast-food restaurants) form free radicals as the polyunsaturated fats break down. These fragments can then combine with oxygen to create toxic peroxides, with a strong oxidizing (rusting) capacity to damage cells.

Cooking Caramelizes Carbohydrates. If you ever baked yams or sweet potatoes you probably noticed a sweet sticky goo oozing from the skin. When you see that, you’re actually witnessing sugar molecules caramelizing — or fusing together. Though you can actually see this in a sweet potato, it also occurs on a microscopic level when any foods are sufficiently heated. When complex carbohydrates like sugar molecules are caramelized, your digestive enzymes cannot break them down into simple sugars for use as an energy source. So now they are not just unavailable; the heat has turned them into an ash-like toxin in your body.

Proteins under ideal conditions are broken down into amino acids by gastric enzymes. Every protein molecule in your body is synthesized from these amino acids. Protein that you eat is actually NOT used by your body as protein! First, the protein must be recycled or broken down into amino acids which only then can be used to build the protein molecules your body actually needs.

There are 23 different amino acids that link together in various combinations. Very long chains of amino acids create protein molecules. Excessive heat, however, can “decapitate” the amino group, rendering it not only useless but toxic!

Heating Food Past 118 °F Deranges Enzymes

When food is heated past 118 °F, enzymes are destroyed. Enzymes are specialized protein molecules that are needed for numerous catalytic functions in your body, including breaking food down during digestion. When exposed to heat/cooking, food enzymes are virtually all inactivated. Your body then must use more energy to generate more digestive enzymes, and so on. When you heat food under 118 °F it is still considered a raw food, because that temperature does not denature the enzymes. That’s why food dehydrators that work at low, safe temperatures are great for making many raw food recipes or dehydrating live crackers.

Some nutritionists and biochemists, however, try to claim that raw food isn’t always the most nutritious or that cooking releases more nutrients into the food. One example they use, out of context, is that the body more easily absorbs iron from cooked food. Iron from broccoli or cabbage seems to jump from about 6-7% absorption to almost 30% with cooking. But what these so-called scientists didn’t consider is that this is all inorganic iron — which actually damages your body more than helps it.

There are numerous times when cooking something releases a nutrient here or there – like lycopene in tomatoes, or when cooking breaks down toxins in something you couldn’t eat raw. However, they fail to mention that if you eat raw foods you won’t have a prostate problem so won’t be needing any extra lycopene! Or that with all the live raw nutrients you get on raw food you won’t need to eat things that might otherwise be toxic anyway.

The Dangers of Inorganic Iron. Iron absorbed from cooked food is unhealthy compared to iron from raw food, which we need. You see, cooking radically changes iron into its inorganic form, which can overload your body since it’s difficult to eliminate. Excess inorganic iron is associated with increased infection, heart disease, predisposition to free radicals, the promotion of atherosclerosis, premature aging and cancer.

Remember, nature has provided the perfect balance of nutrients in fresh, live raw foods which is what we are all designed to eat. Applying heat to our food actually upsets that perfect balance.

Why Cooking Can Never Improve the Nutrition in Food

The more that a food actually needs to be cooked to be edible, the more it compromises your health. When a food needs to be cooked before you eat it, that is the number one sign that it is not something you need to eat — so don’t eat it! Our culture, however, is centered around a cooked food lifestyle. So rather than force people to be totally raw, I like to recommend that most people eat 75% raw food. If you count all snacks as a meal, that leaves 25% for cooked foods. Your body is easily able to eliminate the toxins you get from just 25% of your diet. That means you can enjoy some cooked foods and still be pretty healthy. It’s important to enjoy your food and not get overly stressed about all this!

Dr. Francis M. Pottenger Jr. MD wrote about his experiments with 900 cats over a period of ten years. Pottenger fed raw meat to a portion of his test cats, and fed cooked meat to the other test cats. Pottenger wrote, “Cooked meat fed cats were irritable. The females were dangerous to handle, occasionally biting the keeper.”

Cooked meat and pasteurized milk led to progressive degeneration of the animals. He compared healthy cats on raw foods with those on heated diets, and noted similar findings among humans reported by Dr. Weston A. Price. Behavioral characteristics, arthritis, sterility, skeletal deformities and allergies are some of the problems associated with all-cooked foods.

The cats fed cooked meat suffered with pneumonia, empyema (an accumulation of pus), diarrhea, osteomyelitis, cardiac lesions, hyperopia and myopia (eye diseases), thyroid diseases, nephritis, orchitis, oophoritis (ovarian inflammation) and many other degenerative diseases. No cooked food is benign. Cooked foods act malignantly by exhausting energy, inhibiting healing and decreasing alertness, efficiency and productivity.

The Pathogenic Nature of Cooked Food

An increase in white corpuscles in the blood indicates a pathology. White corpuscles are your body’s first line of defense against toxic substances, whether from cooking, chemicals or GMOs. The typical white corpuscle count is about 6,000 per cubic millimeter. When this count doubles, triples or increases four or five times it is evidence of a disease even if you have no symptoms.

Dr. Kouchakoff of Switzerland conducted over 300 experiments pinpointing the pathogenic nature of cooked and processed foods. Food heated to just 120 to 190 °F (49 to 88 °C), a range usually associated with warming rather than cooking (though it still destroys all enzymes), causes leukocytosis — an abnormally high white corpuscle count.

Leukocytosis Doesn’t Happen with Raw Food! In Dr. Kouchakoff’s experiments, white corpuscle counts tripled even when subjects ate food heated to relatively low temperatures (though still above 118 °F). When raw foods were eaten alongside foods cooked at low temperatures, no leukocytosis occurred. At temperatures above 190 °F, however, no amount of raw food diminished the pathological effects, and leukocytosis always occurred.

So there is no increase in white blood cells when you eat raw, fresh, live food. But when you eat cooked, processed and GMO foods, you are fighting a daily battle against the toxins in your food instead of benefiting from the nutrients in healthy food. This will eventually exhaust your immune system, causing age-related diseases and even premature death.

This research was done in the 1930s and has never been adequately confirmed. But it also seems that nobody has done anything to disprove it either – in almost 80 years. Hmmm, I wonder why? Meanwhile, cooked food proponents will always remind you that this research hasn’t been verified and was poorly designed. Nonetheless, it is indicative of something — and medical schools still teach doctors that so-called “digestive leukocytosis” is a perfectly normal reaction to eating — cooked foods, that is. People who eat raw foods, however, DO NOT get digestive leukocytosis after eating… not even a little bit!

White Blood Cells & the Immune System

Your white blood cells are the sanitation engineers of your body, keeping your tissues, lymph and bodily fluids pure. If poisons, bacteria, fungi (yeast), metabolic wastes, cooked food toxins or other foreign substances get in your blood, they are immediately attacked by your white blood cells. But too many toxins at one time can overwhelm your immune system.

White blood cell counts rise right after major infections or poisonings, since these cells are the disease-fighting mechanisms in your blood. Your body contains many of these defensive cell types, including leukocytes, lymphocytes, plasma cells, monocytes, basophils, neutrophils, eosinophils and granulocytes, all of which clear your blood of toxins.

Whenever you swallow any drugs (recreational or medical), many herbs, even vitamins and minerals – or just eat cooked foods — you’ll experience leukocytosis. This is when an abnormal amount of leukocytes are released into your blood to stop harmful alien substances before they damage any cells or tissues.

Most Americans have a “normal” white blood cell count of around 4,300-7,000 per cubic millimeter of blood. After eating cooked food or taking drugs your white cell count can jump to as much as 20,000 per cubic millimeter!

This “average” white cell count of 4,300-7,000 is actually very high! You see, it’s based on what the average American usually eats — which generally includes junk foods, cooked foods, sugary soft drinks, coffee, tobacco and other items that cause an “abnormal” white cell count which only looks normal in this context. After decades of eating a “normal” diet like this, your resistance to disease and your immune response are badly damaged.

But people who consume a raw food diet of mostly fresh, organic fruits, vegetables, nuts and seeds actually have white blood cell counts well below 4,300! In fact, doctors may mistakenly think such people have an immune deficiency like AIDS! The real truth is that people who have a real immune deficiency like AIDS have destroyed their capacity to create white blood cells by constantly taking drugs or getting transfusions.

Raw foods are easily digested, requiring only 24-36 hours for transit time through the digestive tract, compared to 40-100 hours for cooked foods. Eating overly heated nutrients increases the threat of putrefaction and disease. As you consume cooked carbohydrates, proteins and fats, you are eating numerous carcinogens created by heat.

Such overly cooked foods actually cause a lot of damage in your body. Nutrients are coagulated, deaminized, caramelized, rendered inorganic and become pathogenic or disease causing. Cooked food spoils rapidly, both inside and outside your body, while live, raw foods are quite slow to lose their nutritional value.

Pasteurization Processing Plant

After you eat cooked or processed food, bacteria grow and multiply exponentially. This is what happens to milk when making yogurt. If you try to make yogurt with raw milk by adding a bacterial culture, the bacteria actually just die. No yogurt! But if you pasteurize the milk, it is now “dead” and the bacteria can then “spoil” or ferment the milk, producing yogurt in a few hours.

Cooked food is nothing but dead material that is a feast for many bacteria. The cooked, damaged proteins, starches and sugars quickly ferment and putrefy, creating a soup of vinegars, alcohols, indole, skatole, nitropyrene, ptomatropine, ptomaines, leukomaines, hydrogen sulfide, cadaverine, muscarine, putrescine, nervine, mercaptans and ammonias.

GMOs Are Not Designed for Better Nutrition – Ever!

Agribusiness corporations design their seeds for better yields, higher growth rates, more pest resistance and other attributes intended to increase their profits. Nutrient content is the last thing they think about, if they give it any thought at all! In fact, agricultural researchers generally accept that selection for one trait may take resources away from another. If they ever selected for nutrients they would eventually get them — but that is the last thing they’re looking for.

There are many reasons for changes in nutrient content over time. Farming practices like row spacing, seeding, type of fertilizer and irrigation all play a significant role. However, the single biggest factor today may be that GMO crops are never designed to improve the nutritional profile of a crop: the priority is almost always resistance to chemicals or pests and other factors affecting profitability.

Geneticists actually don’t think much about nutrient content, if they bother at all. Instead, they breed GMO pigs that glow in the dark by inserting a jellyfish gene into their DNA — now THAT’S interesting! So are tomatoes given fish antifreeze genes to resist freezing. Or the Arctic Apple, now approved, that doesn’t turn brown. Or potatoes that don’t bruise. Many genetically engineered foods have foreign genes spliced right into their genetic code without much thought for any side effects.

At least 90 percent of the soy, cotton, canola, corn and sugar beets sold in the U.S.A. today are genetically engineered. Herbicide-resistant corn now covers about 90% of U.S. corn acreage! And the gene for toxins in Bacillus thuringiensis (Bt), provides increased resistance to insects reducing the need for some pesticides.

But many people, including scientists, think that these “Frankenfoods” cause serious environmental damage and health issues for consumers. The Center for Food Safety calls genetic engineering, “one of the greatest and most intractable environmental challenges of the 21st century.”

“Genetically modified foods have been linked to toxic and allergic reactions, sick, sterile and dead livestock and damage to virtually every organ studied in lab animals,” according to the Institute for Responsible Technology. And, according to the Non-GMO Project, “Most developed nations do not consider GMOs to be safe.”

My advice is that the risk isn’t worth the potential damage — so stick to non-GMO and organic foods!

Healthy Adaptations You’ll Enjoy with Raw Food

Three important changes immediately occur when you choose a mostly raw food diet. The first is the higher-quality nutrients you put into your body. Fresh raw food is nutrient dense, mostly pre-digested and easily absorbed. Cooked and processed nutrients are mostly “dead,” denatured and low quality. That’s one reason people overeat — their stomach may feel full but their body still craves real nutrients.

The second big change with raw food is what you stop eating — devitalized, heat-damaged toxic food residues. Now you won’t be wasting energy flushing toxic byproducts from your last meal away from your cells or storing them in fats, cysts, warts or tumors! Instead, this valuable energy is now used to maximize your wellness!

The third big shift on a raw food diet is the end of overeating. When you overeat you saturate your body with things that clog your body’s nutrient delivery and cleansing systems. The blood delivers nutrients and oxygen to living cells and carries away toxic metabolites created during cellular metabolism. When you eat too much, your body is overtaxed with nutrients, toxins and calories it cannot use or excrete, so they stay stored away in your tissues, eventually making you sick years down the road.

Athletic performance can also be dramatically improved on a raw food diet. Dr. Douglas Graham, a 100% raw foodist for over 20 years, has coached pro athletes all over the world from many fields, including tennis legend Martina Navratilova and NBA pro basketball player Ronnie Grandison. He has advised Olympic athletes from four continents in a variety of sports.

Dr. Karl Elmer also worked with top athletes in Germany, producing improved performance with a raw food diet. Raw food gives athletes more strength, energy and stamina. The mind, memory and concentration of a raw foodist can be more focused and effective. Raw food helps you feel energized after eating, rather than the usual tiredness you get as your body tries to cope with the toxins in dead food. Raw foodists have more energy, need less sleep and often sleep more restfully.

The healthy raw materials your body needs for wellness are not found in cooked, processed or GMO foods. Also, the nutrient values in our Recommended Dietary Allowance (RDA) charts are highest for raw foods. NOTE: Potato, rice, wheat, pasta and bread aren’t even found on this chart.

So don’t get stuck trying to figure out the best diet or the right way to cook your fruits and vegetables. The most important thing you can do is simply eat more fruits and vegetables. The U.S. Dept. of Agriculture recommends up to 10 servings of fruits and vegetables — and if you do that with raw food, you are already a raw foodist! The average American, by comparison, eats only 3.6 servings of fruits and vegetables combined, which includes garbage like french fries, potato chips, iceberg lettuce and other cooked, toxic, empty calories.

On a raw food diet your energy will increase, you’ll spend less money on groceries (processed foods are actually more expensive), your sugar and fat cravings will eventually disappear and your weight will normalize. It’s actually nearly impossible to overeat fresh raw fruits and veggies.

Raw food is also good for all other living beings on our planet, from plants to animals. Wild animals eat raw food almost exclusively. Your beloved pets do much better on raw foods. Zoo animals do better on raw foods. Fish are healthier and more lively on raw foods. Even plants grown with raw fertilizers do better than those grown with processed fertilizers.

Raw Food & the World’s Great Scriptures

In the Garden of Eden, Adam and Eve did not eat cooked food until after the snake gave them the apple of knowledge — a baked apple, of course! Chinese, Egyptian, Indian and Hebrew stories all say that humans were kicked out of paradise for cooking their food. Methuselah, however, ate only raw food and lived to a ripe old age.

The Bhagavad Gita says that pious people should eat what nature leaves them after their offering. But ungodly people that cook food sin as they eat. They’re referring to the diseases caused by cooking food.

Finally, in the Essene Gospel of Peace, a 3rd century Aramaic document, Jesus says, “Cook not your food with the fire of death, which is the fire that blazed outside you, that is hotter than your blood. Cook only with the fire of life, that is the natural heat of the day.”

Originally inspired by Arthur M. Baker MA, NHE, ca. 2000.
Revised & updated, 2011-2017 by Robert Ross

Is there any evidence that microwaving food alters its composition or has any detrimental effects on humans or animals?

There is no evidence that eating microwaved foods is detrimental to humans or animals. Microwaves are low-energy waves that, like visible light, fall within the electromagnetic spectrum. Like all electromagnetic waves, they are composed of photons, but the photons in microwaves have so little energy that they are unable to cause chemical changes in the molecules they encounter, including those in food. They are non-ionizing waves and do not leave a residue.
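The claim that microwave photons are too weak to alter molecules can be sanity-checked with Planck's relation E = hf. The 2.45 GHz oven frequency and the ~3 eV figure for a typical covalent bond are standard textbook values, used here purely as a back-of-the-envelope illustration:

```python
# Back-of-the-envelope check: energy of one microwave photon vs. a chemical bond.
h = 6.626e-34    # Planck constant, J*s
f = 2.45e9       # typical microwave-oven frequency, Hz
eV = 1.602e-19   # joules per electronvolt

photon_energy_eV = h * f / eV
bond_energy_eV = 3.0  # rough energy of a typical covalent bond, eV

print(f"one microwave photon: {photon_energy_eV:.1e} eV")
print(f"fraction of a ~3 eV bond: {photon_energy_eV / bond_energy_eV:.1e}")
```

A single microwave photon carries only about a hundred-thousandth of an electronvolt, several orders of magnitude short of what is needed to break a bond, which is why microwaves can only heat food rather than chemically alter it directly.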

When food absorbs the energy in microwaves, ions in the food polarize and polar food molecules rotate, causing collisions. These collisions generate friction with the surrounding matrix, and the friction quickly produces a lot of heat.

As far as we know, microwaves have no nonthermal effect on food. The only chemical and physical changes result from the heat generated. For instance, if the food is heated to very high temperatures, proteins and carbohydrates may be hydrolysed, certain vitamins will be destroyed, and sugars and proteins may interact; the same reactions occur in foods heated in regular ovens at high temperatures. In fact, some of these changes are less pronounced in microwaved foods because the temperature rise is much faster than in regular ovens, so the food spends less time at damaging temperatures. Longer heating periods lead to greater physical and chemical changes. For this reason, the nutritional quality of microwaved foods is often superior. For example, when vegetables are cooked in water on a stove, the water used to cover them can leach away some of their valuable vitamins; in a microwave, additional water is not needed.

Peter McIntyre, physics professor at Texas A&M University, adds the following information:

When food is placed in a microwave oven, the electromagnetic fields from the oven induce electric currents within the water in the food. Because all of our food (like ourselves) is mostly water, this tactic is a pretty good way to generate heat uniformly throughout a serving of food. That is also why it is possible to heat food more quickly in a microwave oven than in a conventional oven, where the food must be heated from outside in. Microwaves do nothing more to food than heat it. There is no evidence that microwaves alter the composition of food or have any other detrimental effects.

Sorry Vegans: Here's How Meat-Eating Made Us Human

Science doesn’t give a hoot about your politics. Think global warming is a hoax or that vaccines are dangerous? Doesn’t matter, you’re wrong.

Something similar is true of veganism. Vegans are absolutely right when they say that a plant-based diet can be healthy, varied and exceedingly satisfying, and that, not for nothing, it spares animals from the serial torments of being part of the human food chain. All good so far.

But there’s veganism and then there’s Veganism: the upper-case, ideological veganism, the kind that goes beyond diet and lifestyle wisdom to a sort of counterfactual crusade. For this crowd, it has become an article of faith that not only is meat-eating bad for humans, but that it’s always been bad for humans: that we were never meant to eat animal products at all, and that our teeth, facial structure and digestive systems are proof of that.

But sorry, it just ain’t so. As a new study in Nature makes clear, not only did processing and eating meat come naturally to humans, it’s entirely possible that without an early diet that included generous amounts of animal protein, we wouldn’t even have become human, at least not the modern, verbal, intelligent humans we are.

It was about 2.6 million years ago that meat first became a significant part of the pre-human diet, and if Australopithecus had had a forehead to slap it would surely have done so. Being an herbivore was easy: fruits and vegetables don’t run away, after all. But they’re also not terribly calorie-dense. A better alternative was so-called underground storage organs (USOs): root foods like beets and yams and potatoes. They pack a bigger nutritional wallop, but they’re not terribly tasty, at least not raw, and they’re very hard to chew. According to Harvard University evolutionary biologists Katherine Zink and Daniel Lieberman, the authors of the Nature paper, proto-humans eating enough root food to stay alive would have had to go through up to 15 million “chewing cycles” a year.

This is where meat stepped (and ran and scurried) in to save the day. Prey that has been killed and then prepared either by slicing, pounding or flaking provides a much more calorie-rich meal with much less chewing than root foods do, boosting nutrient levels overall. (Cooking, which would have made things easier still, did not come into vogue until 500,000 years ago.)

In order to determine how much effort primitive humans saved by eating a diet that included processed animal protein, Zink and Lieberman recruited 24 decidedly modern humans and fed them samples of three kinds of USOs (jewel yams, carrots and beets) and one kind of meat (goat: raw, but screened to ensure the absence of any pathogens). Using electromyography sensors, they then measured how much energy the muscles of the head and jaw had to exert to chew and swallow the samples either whole or prepared in one of the three ancient ways.

On average, they found that it required 39% to 46% less force to chew and swallow processed meat than processed root foods. Slicing worked best for meat, not only making it especially easy to chew but also reducing the size of the individual particles in any swallow, making them more digestible. For USOs, pounding was best, a delightful fact that one day would lead to the mashed potato. Overall, Zink and Lieberman concluded, a diet that was one-third animal protein and two-thirds USOs would have saved early humans about two million chews per year (a 13% reduction), meaning a commensurate savings in time and calorie-burning effort just to get dinner down.
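The 13% figure follows directly from the numbers reported above (up to 15 million chewing cycles a year on an all-root-food diet, about two million chews saved), as a quick check confirms:

```python
# Figures as reported from the Zink & Lieberman study.
baseline_chews_per_year = 15_000_000  # upper estimate for an all-USO diet
chews_saved_per_year = 2_000_000      # savings on a 1/3-meat, 2/3-processed-USO diet

reduction = chews_saved_per_year / baseline_chews_per_year
print(f"annual chewing reduction: {reduction:.0%}")  # ~13%
```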

That mattered for reasons that went beyond just giving our ancient ancestors a few extra free hours in their days. A brain is a very nutritionally demanding organ, and if you want to grow a big one, eating at least some meat will provide you far more calories with far less effort than a meatless menu will. What’s more, while animal muscle eaten straight from the carcass requires a lot of ripping and tearing (which demands big, sharp teeth and a powerful bite), once we learned to process our meat we could do away with some of that, developing smaller teeth and a less pronounced and muscular jaw. This, in turn, may have led to other changes in the skull and neck, favoring a larger brain, better thermoregulation and more advanced speech organs.

“Whatever selection pressures favored these shifts,” the researchers wrote, “they would not have been possible without increased meat consumption combined with food processing technology.”

None of that, of course, means that increased meat consumption (or any meat consumption at all) is necessary for the proto-humans’ 21st-century descendants. The modern pleasures of a grilled steak or a BLT may well be trumped by the health and environmental benefits of going vegan, and if the animals got a vote, they’d surely agree. But saying no to meat today does not mean that your genes and your history don’t continue to give it a loud and rousing yes.

Chins are a bit useless so why do we have them?

There are plenty of theories to explain why we have chins, but none of them stands up to scrutiny. Will we ever solve the mystery?

Chins: we all have them, sitting a bit uselessly at the bottom of our faces. Some people have strong chins, others are said to have weaker chins. But if you were pushed to explain what chins are actually for, would you have a good answer? Nobody seems to use their chin for anything useful.

Nobody had put forward a good idea about why humans would be the only animals with chins

It becomes even stranger when you consider that among all primates, including our extinct relatives, only we have chins. Nobody seems to know why, although over the last century several theories as to its purpose have been offered.

A review of all the previous literature now seeks to set some of these assertions straight. "They [chins] are really strange, and that kind of drew my attention," says James Pampush of Duke University in Durham, North Carolina, who has been studying our humble chin for several years. "Nobody had put forward a good idea about why humans would be the only animals with chins," so he set out to untangle the enduring puzzle of the human chin in a recent review.

We all have a pretty good idea what a chin is, but it's useful to define it nonetheless. Put simply, our chin is the protrusion of the bone that appears below the front wall of the human mandible (lower jaw). No other animals have chins; chimpanzee and ape jaws slant inwards, for instance. Even our closest extinct relatives, such as Neanderthals, did not have them.

Nobody can quite agree why the chin exists

In fact, one of the ways that scientists differentiate between an anatomically modern human and a Neanderthal skull is by looking to see if it has a chin. "That is what makes the appearance of chins in anatomically modern humans so interesting. It implies that there was some sort of behavioural or dietary shift between Neanderthals and anatomically modern humans that caused the chin to form," says Zaneta Thayer of the University of Colorado, Denver, another researcher who has studied the human chin.

Although nobody can quite agree why the chin exists, there are three prominent theories that have been around for decades.

To start with it has long been proposed that our chin may help us chew food. The theory goes that we need the extra bone to deal with the stresses involved with chewing. However, this idea falls flat when you compare us to other great apes with similar-shaped jaws.

When we chew, our jaw gets pulled apart a bit like a wishbone, and the further apart the two sides are, the weaker the bone is. To protect ourselves from the stresses of chewing, we would need more bone on the inner wall of the jaw near the tongue, not at the front where the chin sits.

We don't have a very tough time chewing

That's exactly what you see in chimpanzees and macaques. They have extra bone on the tongue-ward side of their lower jaw, called a "simian shelf", which we do not have. The added bone that forms our chin is not very useful for additional chewing strength.

Another point Pampush is keen to make is that we don't have a very tough time chewing in the first place. Much of the food we eat is soft, especially cooked food. "That's why the chin is not an adaptation for chewing," he says.

Flora Groening at the University of Aberdeen in the UK agrees. Five years ago she used a computer model to look at the mechanical load on the mouth with and without a chin. "There wasn't clear evidence to support the claim that the human chin is a result of a mechanical adaptation," she says.

Others have argued that our chin helps us to speak, that our tongue needs reinforcements from extra bone below our jaw. We are the primates with the most extensive speech repertoire after all.

The issue here is that we don't need much force to speak, so it's not at all obvious why we would need extra bone to help with the process. And if we did need any extra bone, just as with chewing, it would be far more useful to add it to the inside of our jaw, closer to our tongue, rather than tagging it onto the bottom of our jaw.

If it's an adaptation for sexual selection then we are the only mammal that has the same in both sexes

The third idea is that the chin doesn't have an immediate function, but that it has been chosen by sexual selection. It is our equivalent of large-flanged orangutan faces or a male elk's large antlers. These are traits that have both been selected for when the opposite sex is considering a mate. This ensures they live on in future generations even if they have no direct benefit or use.

Again there is a problem here, Pampush says. In all other mammals, only one sex will have a sexually selected trait. Chins, on the other hand, are found on men and women. "If it's an adaptation for sexual selection then we are the only mammal that has the same in both sexes," he says.

The three hypotheses mentioned all therefore fall flat, says Pampush. In fact, he argues that nobody can know why we truly have a chin at all. "Anyone who tells you that they know [why] is lying." Many of the ideas proposed so far have not stood up to scrutiny, he says, while others are untestable.

Unfortunately, then, we are no closer to explaining why we have a chin. But if we look at it another way it might become more apparent how it came to sit on our faces so prominently, despite having no functional use.

Spandrels are a by-product of a change happening elsewhere

It could simply be what's called a "non-adaptive trait" that arises as a by-product of something else. This is an idea that was suggested in 1979 by the biologists Stephen J. Gould and Richard Lewontin. The chin, they said, is a "spandrel". This is the name given to an architectural feature below some church domes that is often so ornate it looks as if it was the starting point for the building's design. In reality, spandrels only exist because they help support the dome above them. In other words, spandrels, both biological and architectural, are a by-product of a change happening elsewhere.

Our faces getting smaller may be what caused this particular spandrel to show, according to Nathan Holton of the University of Iowa. He says the chin may simply be a by-product of the reduction of the human skull. Our mandibles, for instance, are less robust than those of our extinct hominin relatives. As our ancestors developed and used fire to cook their food, they no longer needed such strong jaws to chew. This means the overall strength of the jaw in turn became reduced.

The appearance of a chin could have helped to maintain some of the strength our lower jaws once had

Other features changed too. We lack a prominent brow bridge and we have a hollow point below our cheek bones (technically called the "canine fossa"). These have also been linked to our smaller faces, Holton says. "The presence of a chin is probably part of this trend as well. In this sense, understanding why we have chins is really about explaining why human faces became smaller."

Groening also favours this idea, and says that the appearance of a chin could have helped to maintain some of the strength our lower jaws once had. "Neanderthals and Homo erectus had such robust mandibles, they didn't need an extra thickening of the bone in the chin region; they already had strong jaws and robust bone," she says. Modern humans in contrast have very gracile bones. "A chin might help to provide a bit of extra resistance to maintain a certain mechanical strength, but doesn't really increase the [overall] strength."

On the other hand, a spandrel could also have been caused by a random event or accident, rather than as a by-product of useful adaptations elsewhere in our faces.

"I am doubtful that it's an adaptation," says Pampush, but the problem is that for now nobody can prove it is an accident either. "We don&rsquot have the tools to do so right now."

The chin literally sticks out

So if none of the proposed theories fit the bill, and we cannot prove the spandrel hypothesis, you might wonder why Pampush has spent so long researching the human chin.

It makes more sense when you consider that, although chins are pretty weird, studying them helps pinpoint the evolutionary processes that make us who we are today. It also exposes that evolution works in many ways.

Perhaps surprisingly, it's also rare to find a trait that is uniquely human. Many traits that humans have, other animals do too. The chin on the other hand, literally sticks out, and looking at how it did so may help us understand another step in the process that led to us.

Were Carbs A Brain Food For Our Ancient Ancestors?

A group of British researchers has a hunch that once ancient humans learned to cook, starchy foods like root vegetables or grasses could have given them a calorie bump that fueled the evolution of the human brain. (Image credit: Scott Sherrill-Mix/Flickr)

Carbohydrates are a rich source of energy. That's exactly why some of us may feel a bit conflicted about them, since several recent studies and diets have suggested we should cut them to lose weight. (The latest study concluded that total calories matter most if you want to shed pounds.)

Our ancient ancestors, however, gathered carbohydrates greedily for their energy. And by looking at past work on human evolution, a group of British researchers has a hunch that once ancient humans learned to cook, starchy foods like root vegetables or grasses could have given them a calorie bump that fueled the evolution of the human brain.

Researchers studying Paleolithic diets have previously suggested the early human brain began getting big at least 2.5 million years ago after early humans learned to butcher and process meat with stone tools. "But I don't think that's the whole story," says Mark Thomas, an evolutionary geneticist at University College London. "Around 800,000 years ago, the brain truly accelerated in increasing its size."

In a new paper in The Quarterly Review of Biology, Thomas and colleagues argue that the brain suddenly began evolving faster because our ancestors discovered an even better brain food. "At that point, we develop the use of fire and start consuming more carbohydrate-rich foods," Thomas says. "[That] was critical to the expansion of the brain."

Carbs, particularly long chains of the simple sugar glucose or starches, are an ideal food for fueling the brain, says Thomas. "The brain has an absolute requirement for glucose," he says. And with carb-rich food, the body doesn't need to spend extra energy converting other nutrients, like those found in meat, into glucose to feed the brain.

Once humans began cooking vegetables, starch-digesting enzymes in their guts called amylases could work much more efficiently than they could on raw vegetables. "In potatoes, for example, you could digest the starches about 20 times faster if [they're] cooked rather than uncooked," Thomas says. "And the earliest evidence for fire are also around the 500 to 800,000-year period." Cooking turned carbs into a major source of energy at the same time our brains began expanding in earnest, the team suggests in the paper.
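To see what a 20-fold difference in digestion rate means in practice, here is a minimal sketch that treats starch digestion as a simple first-order process. The rate constants are hypothetical placeholders chosen for illustration; only the 20x ratio comes from Thomas's figure:

```python
import math

def fraction_digested(minutes, rate_per_min):
    """First-order kinetics: fraction of starch hydrolysed after a given time."""
    return 1.0 - math.exp(-rate_per_min * minutes)

k_raw = 0.005            # hypothetical digestion rate for raw potato starch, per minute
k_cooked = 20 * k_raw    # cooked starch digests ~20x faster (per Thomas)

for t in (30, 60, 120):
    raw = fraction_digested(t, k_raw)
    cooked = fraction_digested(t, k_cooked)
    print(f"{t:>3} min: raw {raw:.0%}, cooked {cooked:.0%}")
```

Under these assumptions the cooked starch is almost fully hydrolysed within an hour while most of the raw starch is still intact, which is the kind of caloric windfall the hypothesis depends on.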

To figure out if humans actually were eating more starches at this time, the team looked at the genes that make amylase. Thomas says genetic analyses suggest these genes started multiplying around the same time people began cooking, meaning our saliva was evolving to carry higher concentrations of the enzyme. "There's more starch to digest and therefore it's an advantage to increase the amount of salivary amylase genes," he says. That would mean early humans could get more glucose out of a mouthful of potato, providing more energy to grow a hungry brain.

But other researchers aren't swayed by the hypothesis. "It makes sense that if you eat more starch you'll have more amylase, and that's a beautiful suggestion. But the fact is the data don't support it yet," says Richard Wrangham, a biological anthropologist at Harvard University.

To support their hypothesis, Thomas and his colleagues cited work from anthropologists at Arizona State University that suggested certain groups of humans with high-starch diets had more amylase in their saliva, while humans with low-starch diets had less.

Wrangham says that study analyzed each population's diets incorrectly. "Based on the data I found, it seemed as though they got it exactly opposite. Their high-starch populations all ate less starch than any of the low-starch populations." According to Wrangham, that means there's no evidence that eating more carbohydrates will make saliva evolve to digest starches better. (Thomas also acknowledges the study is flawed.)

Other critics say that the timeline in Thomas' version of the brain's evolution doesn't fit neatly together, something Thomas admits is true, too. When amylase genes began multiplying, for one, is uncertain. "It's basically sometime in the last million years, but we don't know when," he says. "We want to have more, much more evidence for when [it] started to increase."

And when exactly humans began cooking is also up for debate. "The best evidence we have for fire is from 700,000 to 800,000 years ago," says Shara Bailey, a biological anthropologist at New York University. "If they're arguing starch cooking was needed for large brains, well, that's fine," she says. "But large brains were already established by the time we have any good evidence for cooking."

What specific food led to the rise of the modern human brain may be a moot point. Cooking makes everything easier to eat, and being able to cook at all is probably more important than what early humans cooked, says Suzana Herculano-Houzel, a neuroanatomist at the Federal University of Rio de Janeiro, Brazil.

"I like to think our ancestors started roasting everything they can think to put in their mouth – and then you'd get a lot of protein and carbohydrates and fat too because man, roasted fat is delicious," she says. After all, we need a lot of different nutrients to run our body and our brains, and cooking many different things is the only way to easily get all of them.

