Five Signs Those Anti-Paper Studies May Be Bogus … And How to Spot Them
Submitted: Kathi Rowzie May 6, 2021
You’ve seen them in popular periodicals, industry newsletters and in your email: some self-interested group announces the completion of a “scientific comparison” that “proves” the superiority of an alternative material or “environmentally friendly” substitute for paper or paper-based packaging. The study appears to have all the trappings and buzz words of legitimate research, but is it?
To give you an idea of how Two Sides approaches this challenge, what follows are five signs that make us suspicious of half-baked or bogus comparisons to paper. You too can look for these signs whenever these studies cross your desk.
To begin with, anyone setting out to prove there’s a better alternative to paper products has a very high hurdle to clear: proving that theirs is more sustainable than paper or paper-based packaging. It can take a lot of data stretching and twisting to yield a conclusion often at odds with the facts, and that kind of manipulation leaves telltale signs.
Who commissioned the study? The first question to ask is best summed up with the Latin phrase, cui bono, who benefits? Was the research conducted in such a way that its results were preordained to support its sponsors? There’s nothing necessarily wrong with a self-interested group or competitor commissioning a study comparing its alternative with paper products, as long as it is honest, scientific, and the researchers are allowed to let the chips fall where they may.
Is it based on a real life cycle assessment (LCA) or is it a marketing piece in LCA disguise? Next, we look to see if the study is one of a growing cottage industry of marketing pieces wrapped in a veneer of life cycle terminology. A good first step is to determine whether the study complies with the LCA principles and procedures developed by the International Organization for Standardization (ISO), in this case ISO 14040 and 14044. ISO defines LCA as a compilation and evaluation of the inputs, outputs and the potential environmental impacts of a product throughout its life cycle. A study that conforms with ISO standards carefully defines the products that are being compared and what they are designed to do (what ISO calls their “functional unit”), sets specific study boundaries around the products, and meets other requirements, including how flows into or out of the production process should be allocated. Adherence to ISO standards doesn’t guarantee the scientific fairness or integrity of a study that makes environmental comparisons, but it makes it more difficult for the sponsors to bias their conclusions and easier to spot when they do.
What’s under the hood? No matter the supposed pedigree of a study, we need to know what’s in it. Do the parts support the whole? For that very reason, two of the most critical principles of ISO 14040 for LCA studies are transparency and critical review, especially when two or more alternatives are being compared for public consumption. An LCA is transparent when its goals, methodology, data sources and assumptions are visible for all to see. A comparative LCA can only be trusted when we can be sure that it doesn’t set different goalposts for different products, a practice we often see in studies that purport to show the superiority of plastics or alternative-fiber paper to wood-fiber paper. We also check to see if there is an independent critical review by a third-party panel of three experts (a requirement for ISO conformance) and who is on that panel. ISO standards require that the LCA sponsors appoint panel members whose job is to examine and comment on the integrity of the study at various stages in the process.
Are there footnotes to nowhere? When we’re confronted with conclusions that defy common sense, our instinct is to trace the data behind those conclusions to the specific, relevant research cited to support them, whether those findings were original to the study or whether they come from another credible source. Here’s where many of these claims break down. We often find that the trail of citations goes in circles, or nowhere at all. Some advocacy groups, in particular, have a habit of citing another advocacy group’s study, and that second group may not have conducted any original research either. The last reference in the chain may just be dangling in space, without any supporting data other than opinion or conjecture.
Does one size fit all? Another common practice is to use generic online environmental calculators. A surprising number of businesses, advocacy groups and even large corporations, which should know better, use these tools to generate data that will serve as the foundation for their conclusions. The lure: they’re typically free, easily accessible and deliver immediate results. However, unlike LCAs, which are product- and process-specific, online calculators are blunt instruments that are, of necessity, based on national industry averages – and sometimes on assumptions that don’t hold up in the real world. At best, they serve as a starting point to suggest further study. At worst, they are about as relevant to an individual product as a daily horoscope. Change a parameter here or there, and the result could be the opposite of what the calculator suggests.
In a similar way, companies or groups trying to avoid the time and expense of properly conducted LCAs often give in to the temptation to claim that someone else’s study validates their own product comparisons, suggesting, for example, that the results of an LCA on a corrugated box produced in Indonesia would apply to corrugated products produced in North America. Valid LCAs are as accurate a reflection of the processes used to manufacture an individual product as their practitioners can make them. For example, among other things, the LCA for a paper product would evaluate data from the specific mill that manufactured it – raw materials, chemicals, water and energy consumption, type of energy used, greenhouse gas emissions released and so on. With this level of specificity, what’s true for one product is highly unlikely to be true for another.
In the real world, comprehensive, ISO-conformant life cycle studies with external third-party critical review can be expensive and time- and labor-intensive, requiring careful assumptions, mountains of data and sound methodology. For those who market or advocate substitutes for paper products, LCAs often lead to conclusions they hadn’t anticipated and don’t like. Consequently, some of them opt for half-baked environmental comparisons they believe will throw the worst light on paper products.