40th Anniversary Blog Series

Why should we make staple foods more nutritious?

July 21, 2015
by Howarth Bouis

The following post by HarvestPlus director Howarth Bouis is part of an ongoing series of blog stories celebrating IFPRI’s 40th anniversary. Each story authored by current and former IFPRI research staff highlights a key research topic through the years from the personal perspective of the researcher.

When I first joined IFPRI in 1982 as a post-doctoral fellow in the Food Consumption and Nutrition Division, energy (calorie) intakes were regarded as the key indicator for improvements in human nutrition.  Back then, the conventional wisdom was that energy intakes improved rapidly with relatively small increases in income.  The poor were hungry and, as their incomes increased, they bought greater amounts of food staples to satiate this hunger. I totally bought into this popular line of thinking at the time.

In 1984-85, Lawrence Haddad and I undertook surveys in Bukidnon in the southern Philippines, the results of which convinced us that such thinking was misguided. For the same 450 households, we collected food consumption data in two different ways: by asking about food expenditures, the methodology typically used by economists; and by 24-hour recall of food intakes, the methodology used by nutritionists. Much to our surprise and initial dismay, the two methodologies presented completely different dietary patterns as household incomes increased. The food expenditure data mirrored the conventional wisdom, while the food intake data (collected originally to look at intra-household food distribution) showed constant food staple intake, as shown in the table below (Bouis 1994, Table 6):

 

Total Calories Per Capita, By Expenditure (Income) Quartile (1 = lowest; 4 = highest)

Data Collection     Food Group          Quartile 1   Quartile 2   Quartile 3   Quartile 4
Methodology
Food Expenditure    Staple Foods             1,121        1,313        1,482        1,634
                    Non-Staple Foods           229          326          437          678
                    Total                    1,350        1,639        1,919        2,312
24-Hour Recall      Staple Foods             1,361        1,431        1,454        1,381
                    Non-Staple Foods           264          331          422          602
                    Total                    1,625        1,762        1,876        1,983

 

Without going into the technical reasons here (you can read about them in detail in Bouis and Haddad 1992; Bouis, Haddad, and Kennedy 1992; and Bouis 1994), we eventually concluded that the 24-hour recall food intake data gave the more accurate picture of consumption as incomes increased. I learned that challenging the conventional wisdom is met with strong resistance; we were soon branded as “extreme revisionists” (Subramanian and Deaton 1996).

Our findings clearly showed that food staple consumption remained roughly constant with income; if anything, it declined marginally as incomes increased. Consumption of non-staple foods (rich in minerals and vitamins) did increase strongly with rising incomes (as the table shows), but incomes of the poor would have to increase ten-fold or more before adequate mineral and vitamin intakes could be achieved. Such large increases in income obviously would be a long time in coming.
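The contrast between the two methodologies is easy to see by computing the change in staple-food calories from the poorest to the richest quartile, using the figures from the table above (a quick illustrative calculation, not part of the original analysis):

```python
# Per-capita staple-food calories by expenditure quartile (Bouis 1994, Table 6)
expenditure = [1121, 1313, 1482, 1634]   # food-expenditure method
recall      = [1361, 1431, 1454, 1381]   # 24-hour-recall method

def pct_change(series):
    """Percent change from the poorest (quartile 1) to the richest (quartile 4)."""
    return 100 * (series[-1] - series[0]) / series[0]

print(f"Expenditure method: {pct_change(expenditure):+.0f}%")  # staples appear to rise ~46%
print(f"24-hour recall:     {pct_change(recall):+.0f}%")       # staples stay roughly flat, ~+1%
```

The expenditure data imply that staple calories rise strongly with income, while the recall data show essentially no change — the divergence that prompted us to question the conventional wisdom.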

My own thinking was strongly influenced by the following table (you could call it a “clue”) from the same Bukidnon surveys (Bouis and Haddad 1990, Table 9.1):

Average Height-For-Age Z-Score of Infants 0-11 Months, By Expenditure (Income) Quintile (1 = lowest; 5 = highest)

Quintile:    1        2        3        4        5
Z-score:   -2.08    -1.24    -1.20    -0.91    -0.82

 

Why did average height-for-age (a measure of stunting) improve so markedly with rising incomes? I thought it must be related to dietary quality, specifically that of mothers before and during pregnancy. This clue led to further investigations with other data sets. The long and the short of it was that after my first ten years of research at IFPRI, I had convinced myself that research on nutrition in developing countries should focus on dietary quality, not energy.

And this conviction was supported by recent discoveries in the nutrition community about the widespread prevalence of “hidden hunger” in these countries. Mineral and vitamin deficiencies were coming to be viewed as a significant public health problem. To address the problem, supplementation and fortification efforts were being ramped up. Appropriate food vehicles were sought for food fortification: what foods did the poor eat regularly, day in and day out?

It occurred to me that CGIAR research outputs, new crop varieties, were already reaching the poor at mealtimes, two to three times a day, day in and day out, especially in South Asia, where the numbers suffering from mineral and vitamin deficiencies were highest. By and large, Asians were consuming the modern varieties of rice and wheat every day, in large quantities. In hindsight, what a missed opportunity! What if the modern varieties had also been bred to have high mineral and vitamin content? That would have been a very efficient, cost-effective way to add more minerals and vitamins to the diets of the poor. It is analogous to having one central processing facility produce a fortified grain or flour, rather than tens of thousands of mills scattered throughout several countries.

Even better, staple plants could be “trained” to do the work of fortification: to draw trace minerals from the soil and deposit them in the seeds, and to synthesize vitamins in the seeds. Seeds multiply themselves; supplements do not. Seeds can empower women farmers; fortified foods do not. If yields of the more nutrient-dense varieties were as high as those of the regular varieties, the price to consumers would be the same. Mothers could then choose varieties with more minerals and vitamins for their families at the same price as the older, less nutritious ones.

A missed opportunity indeed, but all was not lost: modern varieties are replaced over time by even better modern varieties. Biofortification was a simple, powerful idea, but was it too good to be true? As with most things, the devil is in the details, and conventional wisdom in other disciplines needed to be addressed first. For example:

  • Would there be tradeoffs in plant breeding outputs – e.g., better nutrition, lower yields?
  • Would biofortified crops mine the soil of valuable trace minerals, leaving them depleted for future use?
  • Would the bioavailability of trace minerals be too low, due to the high phytate content of food staples?
  • Would consumers balk at purchasing and consuming yellow and orange colored food staples (high in provitamin A) when they had grown accustomed to eating their white food staples their whole lives?
  • Would donors invest in a strategy that took at least 10 years to produce on-the-ground impacts?
  • Would the agricultural and nutrition sectors communicate and coordinate their efforts effectively?

Credible research evidence was available to address all these questions in favor of biofortification. However, it took ten years to assemble (and in some cases to generate) the evidence needed to convince donors to provide the funds to start HarvestPlus in 2003 (see, for example, Graham et al. 2007). It then took another ten years to do the plant breeding and nutrition efficacy trials, and to secure biofortified crop releases. This effectively summarizes my first three decades at IFPRI.

The last three years have been the most satisfying for me personally: watching the growing dissemination of biofortified crops and determining how best to secure rapid adoption by farmers and consumers. Biofortified crops have been publicly released in almost 30 countries and are in field testing in more than 40 countries. More than two million farm households in Africa, Asia, and Latin America have been reached with biofortified seeds directly through HarvestPlus-funded activities. Other groups, notably the International Potato Center (CIP) and its collaborators in disseminating orange sweetpotato, also have been successful in promoting biofortified staple food crops.

Biofortification is but one of many agricultural innovations that can and should be implemented if agriculture is to fulfill one of its primary objectives: to provide the minerals, vitamins, and other compounds essential for healthy and productive lives. The ultimate goal is for agriculture to provide adequate dietary quality for all. In the meantime, supplementation, food fortification, biofortification, nutrition education, and other cost-effective nutrition interventions will all have an important role to play in alleviating the unnecessary suffering caused by undernutrition.

References

Bouis, H. and L. Haddad.  1990.  Agricultural commercialization, nutrition, and the rural poor: A study of Philippine farm households. Boulder, Colorado: Lynne Rienner Publishers.

Bouis, H., and L. Haddad.  1992.   Are estimates of calorie-income elasticities too high?: A recalibration of the plausible range. Journal of Development Economics 39 (2): 333-364.

Bouis, H., L. Haddad, and E. Kennedy.  1992.  Does it matter how we survey demand for food?: Evidence from Kenya and the Philippines. Food Policy 17 (6): 349-360.

Bouis, H.  1994. The effect of income on demand for food in poor countries: Are our food consumption databases giving us reliable estimates? Journal of Development Economics 44 (1): 199-226.

Subramanian, S., and A. Deaton.  1996.  The demand for food and calories. Journal of Political Economy 104 (1): 133-162.

Graham, R., R. Welch, et al.  2007.  Nutritious subsistence food systems. Advances in Agronomy 92: 1-74.