V. CATEGORIZATION OF RISKS AND BENEFITS
A. Categories of Risk
What are the risks associated with our food supply? This can be a very confusing question because of our inability to adequately measure the risks associated with different components. Most foodborne risks are not associated with food additives. Roberts (1981) and Wodicka (1977) have categorized the major hazards associated with foods, including additives, into five groups, ranked in order of importance: (1) foodborne hazards of microbial origin, (2) nutritional hazards, (3) environmental contaminant hazards, (4) foodborne hazards of natural origin, and (5) food and color additive hazards.
The public perception of the risks associated with foods is often in the reverse order of the list above (Oser, 1978). Therefore, it is important to examine each of these areas to gain a better understanding of the total food safety problem. While the major focus of this book is on food additives, an examination of all of the risks associated with the food supply is necessary to provide a perspective on the comparative risks associated with food additives.
The most prevalent hazard associated with food is foodborne disease of microbial origin. Microbial contamination can result from poor sanitary control during preparation.
Foodborne diseases of microbial origin are important in food safety because of their wide diversity. These microbial illnesses can range in severity from the very severe, like botulism, to milder illnesses such as staphylococcal food poisoning. Foodborne hazards of microbial origin pose the greatest risk to infants, to the elderly, and to debilitated persons. Listeriosis, associated with Listeria monocytogenes infection, is an excellent example of
a foodborne disease of opportunistic origin that primarily affects compromised individuals. The foodborne diseases of microbial origin are readily recognizable and are often easily diagnosed provided that a sample of food remains to confirm its contamination with the suspect organism. The establishment of clear cause-and-effect relationships for this category of foodborne disease has been an important factor in its assumption of the top ranking in our classification.
The second major risk associated with food is nutritional hazards. Nutritional hazards have earned this high ranking because their adverse effects can come from either deficiencies or excesses in nutrient intake (Stults, 1981). The majority of nutritional hazards come from an improper balancing of the food intake in the diet. Diseases caused by nutritional deficiencies such as scurvy (vitamin C), pellagra (niacin), rickets (vitamin D), beriberi (thiamin), and goiter (iodine) are probably the most widely known hazards associated with insufficient nutrient intakes. These diseases were prevalent in the United States in the early twentieth century, but with nutrient fortification of certain foods such as milk and table salt, improved dietary intake, and improved distribution and storage of perishable foods, they have been virtually eliminated.
At the other end of the spectrum are the hazards associated with the consumption of excessive amounts of the fat-soluble vitamins and some of the trace elements. It is important when discussing the toxicity of vitamins to differentiate between the fat-soluble vitamins (A, D, E, K) and the water-soluble vitamins (C and the B vitamins). Since the fat-soluble vitamins are stored in body fat, excessive intake of these vitamins, especially vitamins A and D, might result in accumulation with toxic side effects. On the other hand, excess amounts of the water-soluble vitamins are usually excreted in urine and sweat (Stults, 1981), although mild cases of toxicity are occasionally reported. The degree of toxicity of the trace elements is greatly affected by their interactions with one another. For instance, it is known that toxic amounts of iron can interfere with the absorption and utilization of copper, zinc, and manganese, and that excessive amounts of manganese can interfere with the absorption of vitamin B-12 (Davies, 1978).
For most healthy individuals, consumption of an adequate and varied diet presents no significant risk for these nutritional excesses and deficiencies. Consequently, one might argue that nutritional hazards should not be ranked second on the list of concerns for food safety. However, because of the common occurrence of heart disease, cancer, and stroke and the possible involvement of dietary factors in these diseases, an argument could be made for elevating nutritional hazards to the top ranking and demoting the microbial hazards to the second ranking. Furthermore, in many parts of the world, poor food distribution systems, inefficient manufacturing, and inadequate storage facilities prevent many people from receiving an adequate and varied diet.
Dietary intakes of cholesterol and saturated fats may contribute to the development of heart disease.
In third place on the list of hazards associated with foods are environmental contaminants. Environmental contaminants can find their way into the food supply by the release of industrial chemicals or from natural sources. Although this category contains chemical substances of a quite diverse nature, there are some common characteristics. For example, these contaminants often persist in the environment and resist degradation. These chemicals tend to have a slow rate of metabolism and elimination, which could result in their accumulation in certain body tissues. Also, certain environmental contaminants can accumulate in the food supply, such as mercury in swordfish and shark or neurotoxins in shellfish.
Some of the environmental contaminants that pose a hazard to the food supply are polychlorinated biphenyls (PCBs), dioxins, mercury, and lead. PCBs and mercury have been associated with disease in humans due to the consumption of contaminated fish (Munro and Charbonneau, 1981). Contaminants from natural sources usually come from the erosion of rock formations or from soils with naturally high levels of certain substances. The major contaminants of natural origin are mercury, arsenic, selenium, cadmium, and tin. Pesticides and drug residues in food-producing animals are also included in this category.
When considering guidelines for the control of environmental contaminants, one must remember that toxicity is a function of dose. Therefore, one must know the level of the contaminant in the food and the amount of that food that is normally eaten. This can become quite complicated, but regulatory action levels for some chemical residues in foods have been established. These permitted levels appear to help minimize human exposure to particular contaminants that might be in foods. However, with certain environmental contaminants, such as mercury, the action levels have been set at the level of natural occurrence, minimizing exposure and allowing no room for industrially derived contamination or impact of dose. In other cases (PCBs), action levels have been set on the basis of knowledge of toxicity and calculation of risk, which permits more flexibility in determining acceptable levels.
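The exposure arithmetic described above (level of the contaminant in the food times the amount of that food normally eaten) can be sketched in a few lines. All numbers here are hypothetical, chosen only to illustrate the calculation; they are not actual action levels or consumption figures.

```python
# Illustrative sketch of a dietary exposure estimate.
# All values are hypothetical and for illustration only.

def daily_exposure(residue_mg_per_kg_food: float, intake_kg_per_day: float) -> float:
    """Estimated daily intake of a contaminant (mg/day):
    concentration in the food times the amount of that food eaten."""
    return residue_mg_per_kg_food * intake_kg_per_day

# Hypothetical example: fish containing 0.5 mg/kg of a contaminant,
# eaten at an average of 0.1 kg/day.
exposure = daily_exposure(0.5, 0.1)   # mg/day
tolerable_daily_intake = 0.2          # hypothetical limit, mg/day

print(f"Estimated exposure: {exposure:.3f} mg/day")
print("Within tolerable intake" if exposure <= tolerable_daily_intake
      else "Exceeds tolerable intake")
```

The same two-factor structure is why an action level alone is not informative: a high residue in a rarely eaten food can yield less exposure than a low residue in a dietary staple.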
Naturally occurring toxicants rank fourth on the list of major foodborne hazards (Rodricks and Pohland, 1981). Since these contaminants seem to cause problems only under certain extreme conditions, they are ranked relatively low. They also rank low because public opinion seems to view ‘‘natural’’ risks with much less alarm than manmade contaminants (Rogers, 1983). Some of the more common naturally occurring toxicants found in foods are oxalates in spinach, glycoalkaloids in potatoes, mercury in swordfish, mushroom toxins, mycotoxins, and marine toxins. Certain other compounds like biologically active amines and nitrosamines that can be produced during food storage, processing, or preparation can also pose a food hazard.
Whether a chemical is synthetic or ‘‘natural’’ has no bearing on its toxicity. Therefore, this area of food safety concerning naturally occurring contaminants needs to be explored further to determine if the risk from natural contaminants is really low. Naturally occurring seafood toxins and mushroom toxins are relatively common causes of acute foodborne disease (Hughes et al., 1977). The effects of human exposure to natural toxins are difficult to study because consumption of naturally occurring toxins is variable and often cannot be determined. Also, excessive natural toxin consumption generally results in long-term or chronic illness whose source can be difficult to trace.
Ranked below the other four food hazards is the risk obtained from food additives. The GRAS ingredients would be included in this classification. Although GRAS ingredients are not legally food additives, the public perceives no distinction. This class includes thousands of substances. Any potential hazard to humans from a certain food additive depends on the toxicity of the food additive and the level at which the additive is ingested. The four most widely used direct food additives, which account for 93% by weight of all the direct food additives, are sucrose, salt, corn syrup, and dextrose (Clydesdale, 1982; Roberts, 1981). Human exposure to indirect additives is difficult to measure, but this exposure is minimal.
The majority of direct food ingredients are used on the basis of a determination that they are GRAS or prior sanctioned. Review of some items on the GRAS list has indicated that the majority present no significant hazard with normal use, although only a small percentage have been thoroughly evaluated (Roberts, 1981). The other direct food additives used in foods have been approved, and their uses are regulated by the FDA.
Why, then, does the public view food additives and certain GRAS ingredients with such concern? Both Roberts (1981) and Oser (1978) speculated that the problem with the public perception of food ingredients is that these foodborne substances must be proven ‘‘safe.’’ It is impossible to ensure the complete safety of any substance for all human beings under all conditions of use. Therefore, any uncertainty about the safety of a food additive can result in the public suspicion of a much greater risk.
The recently approved ingredient olestra (brand name Olean), a noncaloric fat replacer, is just one example where debate continues about the safety of its use in foods (ACSH, 1998). Olestra is derived from sugar and vegetable oil, but its molecules are too large to be digested or absorbed in the body. Like fats and oils, it can add taste and texture to savory snack foods, but no calories are provided. Many consumers have reported gastrointestinal distress from eating products with olestra, and others are concerned that this product will prevent the absorption of fat-soluble vitamins and carotenoids into the body. In one study, participants who believed that they were eating olestra snack chips reported gastrointestinal symptoms approximately 50% more often than participants who believed that they were eating regular chips.
Equally as important, the public views food additives and certain GRAS ingredients as unnecessary, involuntary sources of risk. The benefits of these food ingredients are not widely appreciated. Also, these food ingredients are not viewed as natural or normal food components, which heightens suspicions in some consumers. Since additives can increase both the quantity and the quality of foods, they will always be used. Therefore, the FDA must continue to review the use of food additives in order to assure the public of the safety of food additives.
B. Categories of Benefits
The benefits derived from the food supply generally fall into four categories: (1) health benefits that reduce some health risk or provide some health benefits such as improved nutrition, (2) supply benefits relating to abundance, diversity, and economic availability, (3) hedonic benefits that provide sensory satisfaction, and (4) benefits that lead to increased convenience (Darby, 1980; Food Safety Council, 1980). Food additives can play an important role in each of these categories of benefits by improving health, increasing supply, enhancing appeal, or improving convenience. Of these benefits, health benefits should be given the greatest consideration, while supply benefits are second in importance, and increased convenience and improved appeal are the least important.
Health benefits of two types may be provided by food additives and other food components: those that prevent or reduce the incidence of specific diseases and those that provide enhanced nutrition. Nitrites have antibotulinal effects and may thus reduce the risk of botulism in cured meats. Nutritional benefits accrue primarily from the presence or addition of nutrients. Nutritional wholesomeness is increased by the enrichment and fortification of certain staple foods, such as bread, milk, and salt, with vitamins and minerals. Fortification with vitamins and minerals could be viewed as preventing deficiency diseases such as scurvy, beriberi, or goiter. Of course, excessive fortification of foods with certain nutrients can increase risks as noted earlier. Supply benefits are also enhanced by the use of food ingredients that prevent the spoilage of foods, increase the yield of processing techniques, or provide new sources for desired functions. Preservatives prevent food spoilage and thus increase supply and lower costs. Preservatives also have indirect health benefits by protecting nutrients, preventing the growth of hazardous microbes, and helping to ensure the availability of an abundant and nutritious food supply. In 1999 the U.S. Food and Drug Administration announced that it would provide an expedited review of a new food additive petition if the additive is intended to significantly decrease foodborne human pathogenic organisms or their toxins.
Hedonic benefits include improved color, flavor, and texture to enhance consumer appeal. Convenience benefits accrue from those components of foods that result in time savings during preparation. These benefits usually assume greater importance in affluent societies.
C. Striking a Balance
As noted above, the quantitation of the degree of risk or benefit associated with a food additive is hardly an exact science. With our current regulatory statutes, the entire emphasis is placed on risk assessment with no consideration given to benefits. While risk–benefit approaches have never been used, sometimes it may be possible to simply evaluate the net risk by comparing the risk of using an additive with the risk of not using it. This risk–risk approach is more acceptable because risk is much more amenable to quantitation than benefit; it is much easier to balance one risk against another than to balance risks against benefits, and risk assessment is a well-accepted regulatory concept. Furthermore, the risks of additive use are generally compared by their detrimental effects to human health. While a benefit of some food additives is to enhance health status or prevent disease, most benefits reflect economic considerations for food processors and sensory attributes and convenience for consumers. Thus, an adequate comparison of risk and benefit for every food additive can be difficult to perform and difficult to quantitate (IFT, 1988).
An important concept in either the risk–benefit or risk–risk approach to decisionmaking is the concept of a defined, socially acceptable risk level (Food Safety Council, 1980). Some level of risk is inherent with any chemical. But often there is disagreement about the degree of concern about health risks due to chemical consumption. Which risks are less tolerable: those that could cause acute illness or those that cause chronic illness? Those that may lead to many cases of temporary illness or those that lead to a few cases of mortality? Those that affect children more than adults? The definition of a socially acceptable risk level is particularly needed when reviewing carcinogenicity data from animal experiments and the extrapolation of these data to human experiences. The Delaney clause of the FD&C Act defines the acceptable risk level as zero, but historical experience has shown that consumers will accept higher levels of risk if they perceive an important benefit. In 1996 the Food Quality Protection Act repealed the Delaney clause with respect to pesticide residues in foods. This legislation instituted a ‘‘reasonable certainty of no harm’’ standard that considers risks from different exposures, risks to different population subgroups, and multiple toxicological effects of pesticides on human health (EPA, 1998; Winter and Francis, 1997).
The FDA has operationally defined acceptable risk from chemical consumption as up to one additional case of cancer per million cases, or 10⁻⁶, when that chemical is consumed at typical levels during a lifetime. However, the Food Safety Council (1980) took a more detailed look at the acceptability of risk on a theoretical basis. The Food Safety Council defined four situations that could arise in risk–benefit considerations: (1) where the chemical has no identifiable risk, (2) where the substance has a clearly unacceptable level of risk, (3) where the chemical has a measured risk level that is less than the threshold for acceptability, and (4) where the substance has a measured risk level that is greater than the threshold for acceptability. The benefits that might accrue from the use of an additive are considered differently in each of these situations.
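What a 10⁻⁶ lifetime risk implies at the population scale is simple multiplication. The sketch below uses a hypothetical exposed population purely to show the arithmetic; it is not an FDA calculation.

```python
# What a 10^-6 acceptable-risk benchmark means in population terms.
# The population figure is hypothetical, for illustration only.

acceptable_lifetime_risk = 1e-6   # one additional cancer case per million people
population = 280_000_000          # hypothetical exposed population

expected_additional_cases = acceptable_lifetime_risk * population
print(f"Expected additional lifetime cases: {expected_additional_cases:.0f}")
```

With these assumed numbers, a 10⁻⁶ lifetime risk corresponds to roughly 280 additional cases spread over the lifetimes of 280 million people, which is why the benchmark is treated as effectively negligible at the individual level.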
In the situation where the substance has no identifiable risk, any measurable benefit should allow use of the substance in the food supply. The lack of identifiable risk should not be overly comforting, however. All chemicals have some inherent toxicity. The failure to demonstrate any toxicity may simply mean that appropriate tests have not been conducted. Additional tests, tests conducted at higher dose levels, or tests in different species might still reveal toxic effects.
Substances having clearly unacceptable risks should not be allowed in foods under any circumstances. In this situation, the risks far outweigh the benefits. The decision tree approach to toxicological assessment makes ample use of this concept. A substance demonstrating lethal effects in an acute toxicity test at relatively low dose levels is dropped from consideration without further testing. Usually substances in this category would be identified and rejected before any food additive petition is formulated. For most types of toxic effects, no clearly defined level of acceptability has been established, but clearly unacceptable risks should be evident to any toxicologist.
The situation could theoretically arise where a substance will have some well-documented risk but the degree of risk is below the socially acceptable risk level. In such cases, use of the substance would be allowed according to the logic of the Food Safety Council only if a substantial net benefit was evident (Food Safety Council, 1980). This situation assumes that a socially acceptable risk level has been defined for this particular type of toxic response. Such definitions have not been established for most toxic responses. The Food Safety Council (1980) suggested that substances in this category must be carefully evaluated to ensure that benefits outweighed risks. Health and nutritional benefits would have the greatest impact on such a theoretical comparison. If the benefits were largely those of supply, appeal, or convenience, then much larger benefits would be needed to offset the known risks.
The final situation involves substances where the level of measured risk exceeds the socially acceptable level. Again, the problem arises that the level of acceptability has not been established for any toxicological responses except cancer. However, assuming that such definitions existed, a substance in this category would have to possess a sufficient level of benefits to cause reconsideration of the socially acceptable risk level for that particular case. Presumably, very large health benefits would be one possible offsetting factor. Special regulatory restrictions might be necessary to inform consumers of the possible risks. Saccharin may be a good example. The carcinogenicity of saccharin exceeds the acceptable risk level. The perceived benefits of saccharin accrue to those consumers wishing to control their weight or diabetic conditions. Warning labels have been imposed to alert consumers to the possible carcinogenic hazard. However, further safety evaluation of saccharin led to the repeal of the label requirement in 2000.
Risk–benefit decisions require careful attention to the type and degree of both the risks and the benefits. Such decisions are complicated by the lack of knowledge about how to properly extrapolate risks from animal experiments to human situations, the lack of suitable methods for quantifying benefits, and the problems inherent in comparing health risks with non-health risks.
The use of the risk–risk approach to these decisions can alleviate two of the three complications detailed above. In risk–risk approaches, health risks are compared to health benefits, and other types of benefits are not given much consideration. In other words, the risk of using an additive is compared to the risk of not using the additive. If an additive has a net positive effect on health, then it would be allowed for use in foods. While this approach is commendable, it cannot be applied easily to most types of food additives since some additives have no health benefits or there are limited risks associated with not using them. However, it can be used in some cases and could provide an interesting perspective. Again, it must be emphasized that such an approach is not mandated under current regulatory statutes.
In some cases, the comparative risks are obvious. For example, the risk of using nitrites and acquiring cancer from exposure to nitrosamines must be balanced against the risk of not using nitrites and acquiring Clostridium botulinum toxin from cured meat (IFT, 1988). The risk of acquiring botulism is very small, but the illness is often fatal. In other cases, the comparative risks are more obscure or difficult to quantitate. For example, the small risk of using saccharin and acquiring bladder cancer must be weighed against the alternative risks. Theoretically, the alternative risks would be increased consumption of sucrose and a higher risk of all the diseases associated with obesity. However, other nonnutritive sweeteners exist, and the alternative to these sweeteners is not necessarily equivalent sweetness intake with highly caloric sweeteners. The use of saccharin does not prevent obesity.
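The structure of a risk–risk comparison like the nitrite example can be sketched as a net-case calculation. Every number below is hypothetical and for illustration only; a real comparison would require measured incidence and potency data, and would also have to weigh severity (a fatal botulism case is not equivalent to one of a different outcome).

```python
# Minimal risk-risk comparison sketch: expected cases with the additive
# versus expected cases without it. All values are hypothetical.

def expected_cases(risk_per_person: float, exposed_population: int) -> float:
    """Expected number of cases: per-person risk times people exposed."""
    return risk_per_person * exposed_population

population = 1_000_000

# Risk attributed to USING the additive (e.g., a small carcinogenic risk) ...
cases_with_additive = expected_cases(2e-6, population)

# ... versus the risk of NOT using it (e.g., microbial illness
# in the unpreserved product).
cases_without_additive = expected_cases(5e-5, population)

net = cases_without_additive - cases_with_additive
print(f"Net cases avoided by using the additive: {net:.0f}")
```

Under these assumed numbers the additive shows a net health benefit; with different inputs the sign of `net` could reverse, which is the entire content of the risk–risk decision.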
In some cases, the risk–risk approach is even more difficult to apply. How could it be applied to tartrazine or some other food colorant, for example? Some evidence of risk might be available for use of the food colorant. But it would be very difficult to identify any risks attributed to the lack of availability of the substance. Perhaps consumers would switch to less safe food choices, or perhaps the food industry would select a more hazardous food colorant as an alternative. However, these alternative risks would be difficult to foresee or quantitate. Consequently, the risk–risk approach is likely to be useful only with certain food ingredients.