
Cuisine, Evolution of


CUISINE, EVOLUTION OF. Throughout evolutionary history humans have prepared or transformed foods to make them edible. The preparation of food before consumption, which is the foundation of cuisine, has always been a part of the human behavioral repertoire and helps define the species. Unlike most related mammals and primates, which begin their digestion in the process of chewing their food, humans often begin digestive processes outside the body, using tools for the purpose. In other words, what humans do to food before eating it often transforms the food in ways that make it more digestible.

Abundant archeological evidence shows all kinds of tools used for food preparation throughout human evolutionary history. For example, ancestors from the genus Homo perfected tools that could cut a piece of meat more effectively than their canine and incisor teeth could, and found they could crush a nut or other hard seed pod more efficiently with a stone pestle than with their molar teeth. Human ancestors added the controlled use of fire several hundred thousand years ago, so the potential for predigesting food outside the body was apparently well developed by the time Homo sapiens emerged. From a biological evolutionary perspective, the continued use of tools and fire and the broad effects of the domestication of plants and animals have altered important aspects of the human food chain and have significantly affected the evolutionary dynamics that underlie the species.

The effect of these developments in the processing of foods is most evident in the human digestive tract and in some of the metabolic pathways associated with the foods humans eat. Evolutionary biologists attribute the changes in the human digestive tract to a relaxation of evolutionary selection. This relaxation is evident in the variability of structures that no longer have importance for survival, such as the reduced size and variable formation of human teeth or the function of the appendix.

Much of this biological evolution occurred prior to the origin of agriculture, which was marked by the domestication of plants and animals. The enormous success of agriculture and horticulture (beginning approximately ten thousand and five thousand years ago, respectively) gave the practicing societies the ability to feed members beyond those producing the food, and thus served as the basic economic subsistence engine for the broad emergence of human civilizations and the overall growth of humanity to its present megapopulation size.

Influences of Agriculture

Since the Neolithic era, agricultural practices have continuously improved the productivity of certain plants over others. This intensification has led to an increased dependence on fewer and fewer plants to provide the bulk of most human diets. However, no single plant, nor any small group of plants consumed as raw products from the field, can satisfy all of the nutrient needs of the species. Hence, dependence on fewer plants could have produced nutritional problems (and to some extent this did happen) if humans had continued to eat them more or less raw, as more ancient ancestors did over the thousands of years before the Neolithic era. The Neolithic agricultural diet, characterized by a narrow range of cereal grains and legumes, thus represented a substantial change from the Paleolithic diet, characterized by a great diversity of hunted and gathered foods. This substantial change raises important questions about whether and how the species continues to evolve biologically in response to the decrease in the diversity and contents of diets brought about by agriculture.

The relatively rapid shift to an agricultural diet represented a significant nutritional challenge because the diet depended largely on relatively few cereal grains and legumes with serious nutritional limitations. These limitations included specific nutrient deficiencies, antinutritional properties (such as antitrypsin factors, high levels of phytates, and lectins), and various toxic constituents (such as cyanates and tannins). The shift to agriculture could and did produce strong new sources of natural selection and the rapid evolution of biological traits that tended to compensate for these limitations. However, the vast majority of the adaptations to this new agriculturally based diet came from the increased use of cuisine-based technologies that went far beyond the use of tools and fire, already well established in Paleolithic times. In essence, the emergence of a wide variety of cuisine technologies counterbalanced the more limited, though important, potential of genetic change to keep pace with these new dietary constituents.

It is clear that significant biological adaptations underlie the success of some agricultural practices. Many experts accept the evidence that the continued secretion of the lactase enzyme is what makes milk sugars digestible by most northern European adults, in contrast to most other adults of the world, who stop secreting the enzyme at the time of weaning. This evolved trait of adult lactase sufficiency underlies the high and continued dependence of these populations upon dairy foods following the domestication of cattle over eight thousand years ago. Although the cultures of northern Europe have undergone many cultural and historical changes in diet over that long period, all of them continue to consume dairy foods in unbroken traditions, such as making yogurts and cheeses that partly lower the milk sugar content. Likewise, good evidence indicates a genetic cline (a gradual geographical change in gene frequency) in adaptation to the gluten protein in wheat (to which some people have serious intolerance), tracking from the Levant, where wheat was first domesticated, across Europe, where it was introduced at later times.

Other genetic adaptations involving nutrition and food also work pharmacologically to influence disease. For example, favism, a profound, life-threatening hemolytic anemia triggered by eating fava beans, is due to a deficiency of the enzyme glucose-6-phosphate dehydrogenase (G6PD), an X-linked trait that chiefly affects males and makes them particularly sensitive to the hemolytic effects of the oxidant compounds in the beans. Although the G6PD deficiency gene is widespread in the regions where fava beans are consumed and causes many deaths every year among sensitive individuals, it also helps protect the affected populations from malarial infection, and the pharmacological effects of the beans themselves help prevent malaria. Not surprisingly, more myths and stories promote and prohibit the consumption of these beans than of any other food in Indo-European history. Thus foods may have pharmacological as well as nutritional properties, which complicates the interaction between their consumption and the continued evolution of the populations that eat them.

While genetic adaptations to diet did evolve over the last ten thousand years, most adaptations to agricultural diets evolved in the cultural realm in the form of cuisine technologies. In becoming more dependent on fewer plants in the diet, human forebears produced a classic evolutionary bottleneck: increased dependence on fewer plant crops magnified the nutritional liabilities each plant retained. Consequently, a continuous complementary evolutionary process linked increased agricultural productivity with the evolution of new cuisine technologies that enhanced nutrient composition and often simultaneously rid the plants of their toxic and antinutritional effects.


The term "nutriculture" refers to the reciprocity of these preparatory technologies with the overall advantages that agricultural practices have provided for the enormous success in increasing the productivity of plants. In other words, every advance in the agricultural productivity of plants was accompanied by the evolution of cultural strategies to offset the nutritional disadvantages of depending on so few nutritionally unbalanced and potentially toxic foods. Hence "nutriculture" represents the evolved cultural strategies that turn these disadvantages into advantages and the complementarity of these preparative technologies with agriculture. Treating foods before they are consumed can and often does have nutritional and pharmacological consequences for the finished consumable. For example, many different physical, chemical, biochemical, and microbiotic steps "prepare" the plant-based staples in the human diet. These transformations from the "raw to the cooked" become the culturally recognized foods humans eat, and with which they celebrate, remain nutritionally healthy, and ultimately survive and prosper around the world. Thus the evolution of food nutriculture has been just as important as the success of agriculture in producing enough food to continue to feed the world.

In fact many of these technologies become parts of long-standing recipes that fill this encyclopedia and the cookbooks and cooking traditions of the world. These technologies are so important in defining the foods consumed that they become part of the cultural worldview of every society that has ever lived. Every society celebrates with food and incorporates foods as symbols, and many of these traditions provide the cultural memory for how foods should be prepared for healthful consumption.

Although evolutionary anthropologists and biologists have not treated what humans do to food as an evolutionary process equal in importance to agricultural practices, the evidence for such a process underlying basic cuisine practice is strong. Of course this does not mean that every aspect of cuisine practice, involving innovations, presentation, and the like, has some kind of evolutionary basis. It does suggest, however, that many of the fundamental transformations of raw materials into foods have a long and highly evolved natural history that is not always readily recognized as optimizing their nutrient and other qualities. Hence transforming raw foods for consumption can and does make a difference in health and survival. In some respects this knowledge about the preparation and consumption of foods is so much a part of the existence and identity of a society that its members often are more conscious of making food "palatable" to culturally conditioned tastes and expectations than of the nutritional and pharmacological significance of the steps taken.

Importance of Preparation

Other major sources of nutrition follow the nutricultural principle. A classic example of the evolution of cuisine practices involves maize, or "corn." While maize is the most productive crop in the world, and virtually all of the great Mesoamerican civilizations depended upon it as a staple, it is not the best nutritionally balanced of staples. Maize is low in lysine and tryptophan, and its niacin, when the grain is stored as a staple, is nutritionally unavailable. Specifically, the B vitamin niacin becomes bound in a complex called niacytin, and this bound form resists stomach acid and gastrointestinal enzymes. However, the chemical bond that makes niacytin resistant to digestion is broken in the presence of an alkali, which frees the bound niacin. Although humans can make a small amount of niacin from the essential amino acid tryptophan, maize is deficient in tryptophan as well. Fortunately beans have relatively high levels of tryptophan, and as long as beans are consumed with maize, the diet is balanced. However, if beans and other regular sources of tryptophan or niacin are not available in the diet, the deficiency disease pellagra develops, marked by diarrhea, dermatitis, and dementia, and ultimately death.

While alkali treatment also enhances the solubility of lysine, it was not universally used, even in the Americas where the crop evolved. However, the Native American societies that were high consumers and growers of maize always used alkali in their cuisine technology: there was a one-to-one relationship between high consumption and cultivation of maize and the use of this critical step in the preparation of the food staple. In terms of their recipes, the alkali was prepared from several different sources, including crushed limestone, roasted mollusk shells, and wood ashes. The net effect of this step was always the same: the food was heated and "cooked" in the lime, and then most of the alkali was removed prior to consumption. Even though the recipes varied among cultures and traditions, these basic cooking steps did not vary.

In this regard it is interesting to note that Christopher Columbus, who first introduced maize to the Old World, introduced only the food, not the critically important recipe. Pellagra became widespread, resulting in a gradual decrease in the use of maize as a human food. Not until the discovery of vitamins beginning in the 1920s, over four hundred years later, was pellagra identified as a nutritional deficiency associated with the consumption of maize.

However, considering the history of every major civilization, it becomes clear that all depended upon solutions to similar problems to survive and prosper. While it is possible to innovate new food technologies that have few or no negative consequences in times of nutritional abundance, the same practices may produce serious deficiencies during times of nutritional stress. Food preparation thus has substantial survival advantages, and undoubtedly significant wisdom resides in the food traditions that maintain these preparation practices.

The use of fermentation to enhance the nutrients of wheat and barley in the production of beer and bread is a classic example of how foods become staples of the diet. Fermentation of wheat and barley with yeast not only produces the alcohol in beer and, to a lesser extent, in bread; it also synthesizes nutritionally essential amino acids from nonessential ones, reduces the toxicity of the tannins in the wheat, and lowers the phytate levels that interfere with calcium absorption. Squeezing, crushing, and heating the manioc (a good source of nutrition known throughout the world for yielding the tapioca starch of dessert puddings) reduces the plant's cyanide content, which can be so high that even breathing the cooking fumes can be deadly. With the notable and important exception of fruits, which evolved to attract mammals to eat the seeds and thus to disperse them, the raw produce is not a viable source of nutrients without the culturally evolved capacity for transforming it into an appropriately edible food.

Biology and Culture

Over time, a trial-and-error process results in the nutritive success or failure of new cuisine strategies. Those strategies that satisfy basic nutritional needs become incorporated into food traditions and provide subtle and not-so-subtle advantages to the people who practice them. When the cause-and-effect relationship between a cuisine practice and its outcome is readily evident, as in changing the appearance, taste, or aroma of a food and then noting a benefit, it is relatively simple to understand the functional significance of the practice. However, when a cause-and-effect relationship is subtle, not readily evident, and expressed long after the prepared food is consumed, it is difficult to detect the relationship and to behave consciously in the appropriate way. For example, the time it takes to develop a deficiency of a vitamin like niacin is so long that the appropriate cuisine practice may never evolve, as was the case for freeing the niacin from maize in Europe. Epidemiological studies of long-term disease outcomes that may extend over a substantial portion of a lifetime, such as cardiovascular disease and some forms of cancer, demonstrate how subtle some of these effects are.

On the face of it, the degree to which a culturally based diet satisfies basic nutritional needs is a matter of the biology of humans as omnivores. Humans uniquely depend on cultural adaptations concerning diet to solve nutrient problems that biology cannot solve on its own: humans have discovered and encoded in cultural traditions a wisdom about diet that provides a culinary prescription for survival and good health. What people eat is largely dictated by cultural traditions, but the degree to which a diet satisfies basic nutritional needs largely depends on human biology. This interface between biology and culture has encouraged the development of a new approach, or paradigm, that analyzes and interprets biological and cultural adaptability as continuously interacting phenomena throughout human evolution.

No doubt the evolution of agriculture would not have occurred without these counterbalancing nutricultural evolutionary steps. In fact this basic theme of nutriculture is repeated with other aspects of cuisine and thus forms the basis of a broad trend throughout history in the consumption of every major plant food.

The remarkable growth of knowledge about what people eat arises from an understanding of both the prehistory of diets and the recorded history of foods. A substantial and growing ethnographic and cross-cultural literature on folk cooking practices also allows tests of specific hypotheses about food processing. The available data in food science and technology, the nutritional sciences, biochemistry, ethnobotany, pharmacology, and the neurosciences are extensive. Using this knowledge to extend the understanding of biological and biocultural evolutionary processes can provide important insights into the nascent study of nutriculture. The varied contents of this Encyclopedia of Food and Culture suggest avenues and examples of nutriculture for exploration.

See also Agriculture, Origins of; Anthropology; Eating: Anatomy and Physiology of Eating; Evolution; Food Archaeology; Maize; Nutrition Transition: Worldwide Diet Change; Paleonutrition, Methods of; Prehistoric Societies; Preparation of Food; Vitamins.


BIBLIOGRAPHY

Cavalli-Sforza, L. Luca, Paolo Menozzi, and Alberto Piazza. The History and Geography of Human Genes. Princeton, N.J.: Princeton University Press, 1994.

Katz, Solomon H. "The Biocultural Evolution of Cuisine." In Handbook of the Psychophysiology of Human Eating, edited by R. Shepard, pp. 115–140. Wiley Psychophysiology Handbooks. New York: John Wiley, 1989.

Katz, Solomon H. "An Evolutionary Theory of Cuisine." Human Nature 1 (1990): 233–259.

Katz, Solomon H. "Food and Biocultural Evolution: A Model for the Investigation of Modern Nutritional Problems." In Nutritional Anthropology, edited by Francis E. Johnston, pp. 41–66. New York: Alan R. Liss, 1987.

Katz, Solomon H., M. Hediger, and L. Valleroy. "Traditional Maize Processing Techniques in the New World: Anthropological and Nutritional Significance." Science 184 (1974): 765–773.

Katz, Solomon H., and M. Voigt. "Bread and Beer: The Early Use of Cereals in the Human Diet." Expedition 28, no. 2 (1987): 23–34. Also published in various forms in a number of textbooks, trade magazines, and the popular press.

Katz, Solomon H., and Fritz Maytag. "Brewing an Ancient Beer." Archaeology 44, no. 4 (1991): 24–27.

Katz, Solomon H., and Fritz Maytag. "Secrets of the Stanzas." Archaeology 44, no. 4 (1991): 28–31.

Katz, Solomon H., and Fritz Maytag. "A Thrilling Link with the Past." Archaeology 44, no. 4 (1991): 32–33.

Simoons, F. J. "The Determinants of Dairying and Milk Use in the Old World: Ecological, Physiological, and Cultural." In Food, Ecology, and Culture: Readings in the Anthropology of Dietary Practices, edited by J. R. K. Robson, pp. 83–91. New York: Gordon and Breach, 1980.

Solomon H. Katz
