Flying under the radar: How toxic compounds find their way into everyday products.
This article is part of a six-day series titled "Toxicants in Consumer Products". The authors, who have backgrounds in environmental toxicology, will delve into some of the most pressing ecotoxicological issues of today.
I would like to invite some familiar (yet infamous) faces to listen in on this conversation. Some acquaintances on the guest list include DDT and the neonicotinoids (widely used insecticides), bisphenol A (found in drinking bottles), and flame retardants such as PBDEs (added to children’s pajamas and other potentially flammable materials around the home).
This, of course, is just a short list of synthetic (manmade) compounds that were, or still are, manufactured and widely distributed. They are often introduced as solutions to particular safety or efficiency problems: flame retardants to slow the speed at which a fire can spread within the home, insecticides to reduce the impact of pests on agricultural fields.
Ironically, these same products are now known to contribute to another, often more intractable problem: widespread ecosystem and organism toxicity.
How have so many seemingly safe consumer products turned out to be sources of toxic exposure? And, most importantly, why didn’t we catch these problems before the compounds reached the market and exposed so many people?
These compounds often share one or more of the following characteristics: persistence in the environment, a tendency to bioaccumulate, and inherent toxicity (together known as ‘PBT’ in the world of ecotoxicology). Such qualities can generate adverse health and environmental effects. To make matters worse, these toxicological consequences are typically not recognized until far too late in the game, after widespread distribution, and therefore exposure, has already occurred. At that point, names like ‘DDT’ are rarely spoken of as brilliant solutions devised by our very own species, but rather as societal dilemmas requiring serious mitigation.
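To make the PBT idea concrete, here is a minimal screening sketch. The threshold values are loosely modeled on the REACH Annex XIII cut-offs, but both the thresholds and the candidate substance's data below are illustrative assumptions, not measured values or regulatory advice.

```python
# Toy PBT screen: flags a substance when it exceeds illustrative
# persistence (P), bioaccumulation (B), and toxicity (T) thresholds.
# Thresholds loosely echo REACH Annex XIII cut-offs; the numbers and
# the example data are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class Substance:
    name: str
    water_half_life_days: float   # persistence proxy
    bcf: float                    # bioconcentration factor
    chronic_noec_mg_l: float      # aquatic toxicity proxy

def pbt_flags(s: Substance) -> dict:
    """Return which of the three PBT criteria the substance trips."""
    return {
        "P": s.water_half_life_days > 60,
        "B": s.bcf > 2000,
        "T": s.chronic_noec_mg_l < 0.01,
    }

def is_pbt(s: Substance) -> bool:
    """A substance is flagged as PBT only if it trips all three criteria."""
    return all(pbt_flags(s).values())

# Hypothetical screening values -- not measured data for any real compound.
candidate = Substance("example-compound", water_half_life_days=120,
                      bcf=5000, chronic_noec_mg_l=0.004)
print(pbt_flags(candidate), is_pbt(candidate))
```

Real PBT assessment is far messier than three comparisons, of course; the point is only that each letter of the acronym corresponds to a separate, measurable property.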
Adverse effects arising from the complex interactions between a compound and an organism can be difficult, if not impossible, to predict or test in the relatively short window before a new substance enters the market. Some relationships only reveal themselves once a phenomenon appears in an individual or population. For instance, studies have identified large differences in clearance rates for PFOA (perfluorooctanoic acid, a widely used surfactant) between male and female rats. In such cases, proactively avoiding adverse effects is difficult. Without any knowledge of possible negative consequences, we are inclined to decide in favour of the known, proven benefits (established through chemical characterization and experimentation), which are expected to outweigh potential costs that are unknown and not certain to occur.
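The sex difference in PFOA clearance mentioned above is, in pharmacokinetic terms, a difference in elimination rate. A minimal first-order elimination sketch shows why that matters: the half-life values used here are made-up illustrative numbers, not the actual measured rat data (though in reality the male and female rat half-lives do differ dramatically).

```python
import math

def remaining_fraction(t_days: float, half_life_days: float) -> float:
    """Fraction of an initial body burden left after t_days,
    assuming simple first-order (exponential) elimination."""
    k = math.log(2) / half_life_days  # elimination rate constant
    return math.exp(-k * t_days)

# Illustrative half-lives only -- assumptions, not measured values:
slow, fast = 15.0, 0.2   # days
print(remaining_fraction(30, slow))   # slow clearance: a large fraction remains
print(remaining_fraction(30, fast))   # fast clearance: essentially all eliminated
```

Under this simple model, a compound with a 15-day half-life still has a quarter of the dose on board after a month, while one cleared with a half-life of hours is effectively gone, so the same exposure can mean very different body burdens in two otherwise similar animals.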
Perhaps we first need to discuss strategies for taking proper precautions to prevent, or at least minimize, the adverse impacts of newly synthesized compounds on our safety and our environment. On one hand, a valuable, convenient product such as perfluorooctane sulfonic acid (PFOS), a surfactant capable of repelling both oil and water, is exactly the kind of commodity a company and its consumers would want in products where repellent properties are desired (such as firefighting foam and metal plating). However, PFOS is now known to bioaccumulate in organisms to such a degree that toxic effects are observed in aquatic and terrestrial species. As a result, its manufacture and distribution have largely ceased in many developed countries since the early 2000s.
Even when we become aware of the toxic effects and detrimental impacts various products can have on people, wildlife, and the environment, we sometimes choose to maintain their use despite access to this information. The most classic example is the introduction of DDT as an insecticide in the late 1930s. A plethora of educational (read: persuasive) videos were released, many in the 1940s, in an attempt to convince the world that DDT was not only effective but also completely safe. (This video here is a must-watch; the narrator describes how African residents suspected DDT might be poisonous, followed by a man literally eating DDT by the spoonful to prove otherwise. I immediately wondered about the state of his health in the years that followed.) Other videos show people, again literally, being showered with DDT, along with other extreme exposure scenarios meant to demonstrate that toxic effects were of no concern and that widespread use would be completely safe.
Of course, hindsight is 20/20, and it is easy for you and me to criticize this approach (or lack thereof) to environmental and human risk assessment, thinking it obvious that a chemical’s long-term effects must be studied before it can be deemed safe for use. We now know that DDT is a highly bioaccumulative compound with a multitude of toxic effects, including endocrine disruption, and a suspected carcinogen, with impacts concentrated in higher trophic level organisms. But when a society or culture is so focused on finding a chemical solution to an overwhelming, widespread problem, that kind of logic can be displaced by excitement over a new chemical discovery. The development of synthetic compounds not found in nature, brought about entirely through innovative research and scientific analysis, is often considered a grand accomplishment, and I presume there is a strong pull to produce and promote such products as quickly as possible.
Consider also the importance of cost-benefit analysis in situations where we are already aware of a substance’s toxic effects. DDT, for instance, is still used for its insecticidal properties in select locations around the world in an effort to combat malaria. Yes, DDT is a highly bioaccumulative, toxic compound, but at the same time, malaria remains a serious public health issue in some regions, producing symptoms such as fever, vomiting, seizures, and coma, and continuing to cause almost 900,000 deaths annually. The challenging ethical question is: do the benefits of using some DDT, a substance that causes long-term damage, outweigh the costs of severe illness or death from malaria? We have, of course, learned from the past; DDT is no longer sprayed over family picnics or mixed into shampoo to show ‘just how safe’ it is, and application protocols have been heavily modified in response to acknowledged environmental and health impacts. This is a timeless catch-22: any resolution of the DDT vs. malaria debate can be argued as either suitable or unacceptable, depending on the viewpoint.
So, who should be taking responsibility for this toxicological mess? Furthermore, is anyone really at fault? I would have to say the answer is, ‘Nobody, but everybody’: manufacturers, distributors, and regulators likely have the greatest capacity to prevent harmful exposure, but how can they act while harmful impacts remain unknown? And how have the companies that once manufactured these compounds responded to the toxic effects they caused? If you’re interested in a case study, I’ll link you to this story here, involving PCB contamination in the Hudson River as a result of General Electric (GE) activity. GE may have taken responsibility for the contamination, but the remediation will be less than complete in the end, with only 65% removed; GE plans to leave about 80,000 pounds of PCBs in the Hudson. It seems that taking responsibility for toxic legacies such as these is, for some companies, comparable to the stages of grief: denial that the contamination ‘belongs’ to them; anger over being assigned clean-up duty; bargaining, perhaps agreeing only to partial remediation; (okay, perhaps depression isn’t the most relevant stage); and finally acceptance, though not everyone reaches this stage, as some take the effort more seriously than others.
Occasionally (and recently, more often than not), when I’m applying Chap Stick, packing my lunch in plastic containers, or even brushing my teeth, I (perhaps self-destructively) consider whether these products are truly safe. What petroleum-based products am I allowing my skin to absorb? Will phthalates leach into my food and affect my body? What is the complete list of ingredients in this toothpaste, and do any of them cause adverse health effects when used frequently over time? Do suitable alternatives even exist, or is it all just one big marketing ploy? I’m not trying to suggest that everything will poison us and cause our eventual collective demise, but ultimately we humans do put toxic compounds into our consumer products, intentionally or not, and understanding how and why this occurs helps us make decisions about the health and safety of life around us.
As you can see, I can’t actually offer a way to detect toxicants in the products we use before undue exposure has a chance to occur. (Sorry, guys.) The answer, however, may lie on the regulatory side of things: instead of assuming a substance is safe until proven otherwise, perhaps we should reverse the burden of proof, keeping substances off the shelves until they are proven safe. This would be significantly more time- and energy-intensive, but it might save us from saying things we later regret about the list of friends I introduced at the beginning of this conversation. Case in point: “DDT is good for me!”