Federal agencies may have been too quick to judge the Gulf of Mexico safe for fishing, according to a retest of water, sediment and seafood samples taken before and after the 2010 oil spill. Paul W. Sammarco of the Louisiana Universities Marine Consortium told the New York Times that earlier contamination studies may have been affected by oil dispersants. The instances in which he found greater contamination, he said, “called into question the timing of decisions by the National Oceanic and Atmospheric Administration to reopen gulf fisheries after the spill.”
Commercial and recreational fishing was banned in more than 88,000 square miles of Gulf waters immediately after the spill. All fisheries had been reopened within a year; some much sooner. The study, published in the journal Marine Pollution Bulletin, found higher levels of many oil-related compounds than earlier studies by NOAA scientists and others, particularly in seawater and sediment. The compounds studied included polycyclic aromatic hydrocarbons, some of which are classified as probably carcinogenic, and volatile organic compounds, which can affect the immune and nervous systems.
“When the numbers first started coming in, I thought these looked awfully high,” Dr. Sammarco said, referring to the data he analyzed, which came from samples that he and other researchers had collected. Then he looked at the NOAA data. “Their numbers were very low,” he said. “I thought, what is going on here? It didn’t make sense.”
Dr. Sammarco said that a sampling method used in some earlier studies might have led to the lower readings. That method uses a device called a Niskin bottle, which collects water from a single point in the water column. Because of the widespread use of dispersants during the spill — a practice that raised separate concerns about toxicity — the oil, broken into droplets, may have lingered in patches rather than dispersing uniformly, so a point sample could easily miss it.