While skimming through a well-meaning paper exploring the issue of "Environmental Policy and International Competition in a Globalizing World" this afternoon, my attention was caught by an intriguing proposition:
More stringent environmental regulations can also stimulate R&D and innovation processes, which lead to the development of clean technologies that are less costly than traditional end-of-pipe solutions and have additional economic benefits because of material and energy cost savings and increased productivity.
Or, more simply: There's a payoff to being environmentally correct. This is known in the economic literature as "the Porter hypothesis," which dates back to a landmark 1995 paper co-authored by Harvard economist Michael Porter, "Toward a New Conception of the Environment-Competitiveness Relationship." Porter and co-author Claas van der Linde argued that properly crafted environmental regulations can confer a competitive advantage on the firms that are forced to comply with them.
How the World Works desperately wants such an assertion to be true, so I did some digging around to find out how the "hypothesis" had held up in subsequent years. As I suspected, it has come under quite a bit of attack. The general thrust: If the possibility of cutting costs or increasing profits exists, companies operating in a free market will find it, whether or not there are stringent rules in place requiring certain behavior.
This is the kind of question that is difficult to settle empirically, which means, for now, I just can't say for certain whether the following declaration by Porter and van der Linde is unimpeachably true:
Competitive advantage, then, rests not on static efficiency nor on optimizing within fixed constraints, but on the capacity for innovation and improvement that shift the constraints.
(The idea being that strict environmental regulations are one way to provide an incentive for innovation and constraint-shifting improvement.)
But who needs it to be true? As an aspirational goal, it's good enough.