Beneath the Surface

By Holly Capelo

Powerful computer simulations may be the best method available to quantify the amount of oil leaking from the Deepwater Horizon—and to predict where it will go.

Page 1 of 2

Credit: NASA/GSFC, MODIS Rapid Response

To apprehend the ultimate outcome of the Deepwater Horizon tragedy in the Gulf of Mexico, we take in hand the technological tools at our disposal to make decisions about the future. So far, these tools have offered little certainty. Because British Petroleum (BP) and government regulators failed to prevent the oil-rig explosion of April 20th, and subsequently failed to contain the spill, there lingers a hazy distinction between corporate dishonesty and the error margins germane to unprecedented emergency circumstances. After an accident of this magnitude, who can say which fate is most regrettable: irreparable wildlife devastation in Louisiana marshes, or in underwater canyons? Desecrated beaches in Florida or, if the Loop Current sweeps the oil around the Panhandle, in Georgia? Lost livelihoods in tourism, or in fishing? Still, the groups and individuals doing damage control need to know where the oil is likely to go.

We seek predictions from reliable measurements because even minute amounts of oil in the water can kill fish, plankton, and larvae—which is highly unfortunate for all of us higher up the food chain. Spill responders and government agencies need firm figures, rather than wildly fluctuating oil-flow estimates, to price cleanup efforts and penalty fines. Fishermen must know when, if ever, they can safely return to work, while the threat of more lost jobs plagues the ongoing economic recovery. Set against the long-term dream of energy independence, the disquiet over these more immediate concerns is that much louder.
Beneath the Gulf of Mexico, there remains a complex underwater system whose fate is described by well-developed computer models. Reciprocally, by providing fresh verification data, measurements of the subsurface plumes are refining the codes' predictive abilities. But can they divine the future of the oil, and of the myriad troubles that upwell from this disaster? Simulations may be the best way of grasping oil behavior, and therefore of quelling fears of the unknown, yet exact figures elude us.

Collective anxieties are mounting over imprecise reports of just how much oil has leaked and how much continues to flow. Besides impressing upon the national viewership how interminable the oil flow is, video footage of the jet exit site provides information for initial conditions, the starting inputs to oil-forecast simulations. Other important initial conditions are local currents and density profiles, as well as oil and gas concentrations—all values that were known only to low precision even after the initial chaos of the rig explosion subsided. The presence of slushy natural gas compounds, called "hydrates," near the jet outlet squelched early Deepwater Horizon containment attempts. The hydrate concentration within the plume remains the most telling determinant of how much of the pollutants will reach the top of the Gulf waters. This unprecedented deepwater blowout, a scenario in which a mixture of oil, gas, and ultra-dense water rises from the ocean floor as a jet or plume, has mandated the use of blowout-specific codes that have never been applied to such extreme circumstances.
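The sensitivity to initial conditions can be seen in even a toy simulation. The sketch below is purely illustrative—it is not NOAA's or ASA's code, and all numbers are invented—but it shows the basic idea behind trajectory forecasting: oil "parcels" are carried by a mean current plus random turbulent motion, so an error in the assumed current translates directly into a misplaced forecast.

```python
import random

def advect_parcels(n_parcels, u_current, v_current, diffusivity, hours, seed=0):
    """Advect oil parcels with a mean current plus a turbulent random walk.

    u_current, v_current: assumed mean current (km/h).
    diffusivity: turbulent spread per hour (km).
    Returns final (x, y) positions in km from the leak site.
    """
    rng = random.Random(seed)
    positions = []
    for _ in range(n_parcels):
        x = y = 0.0
        for _ in range(hours):
            x += u_current + rng.gauss(0.0, diffusivity)
            y += v_current + rng.gauss(0.0, diffusivity)
        positions.append((x, y))
    return positions

def mean_x(pts):
    """Downstream centroid of the simulated slick."""
    return sum(p[0] for p in pts) / len(pts)

# Two runs differing only in the assumed current: after three days the
# forecast slick centroids sit roughly 72 vs. 144 km downstream.
slow = advect_parcels(500, u_current=1.0, v_current=0.2, diffusivity=0.5, hours=72)
fast = advect_parcels(500, u_current=2.0, v_current=0.2, diffusivity=0.5, hours=72)
print(mean_x(slow), mean_x(fast))
```

A factor-of-two uncertainty in the current—comparable to the uncertainties responders faced—doubles the predicted travel distance, which is why refining those starting inputs matters so much.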

In salt water, crude oil takes a few days to emulsify into a creamy mousse from surface waves or subsurface turbulence, a few weeks to oxidize from sun exposure, and many years to biodegrade, sink, and form sediment. Despite calls for an expedient end to the situation, oil spills involve diverse physical processes and response phases, each requiring its own software: "We produce a spill impact-assessment model, a blowout model, and a search-and-rescue model used by the Coast Guard," explained Dr. Malcolm Spaulding, founding principal of Applied Science Associates (ASA), a private company currently supporting the National Oceanic and Atmospheric Administration's (NOAA) spill tracking and response efforts in the Gulf of Mexico. ASA provides software called OILMAPDEEP, the predecessor of which the company developed in 1989 to support the response to the Exxon Valdez supertanker disaster.
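Those widely separated timescales—days, weeks, years—are one reason a single model cannot cover every phase. A common simplification, sketched below with purely illustrative rate constants (not measured values), is to treat each weathering pathway as first-order exponential decay with its own characteristic time:

```python
import math

# Illustrative characteristic timescales, loosely matching the text:
# days to emulsify, weeks to oxidize, years to biodegrade.
TIMESCALES_DAYS = {
    "emulsification": 3.0,
    "oxidation": 21.0,
    "biodegradation": 5 * 365.0,
}

def fraction_unweathered(pathway, days):
    """Fraction of oil not yet transformed by a pathway after `days`,
    assuming first-order decay: f(t) = exp(-t / tau)."""
    tau = TIMESCALES_DAYS[pathway]
    return math.exp(-days / tau)

# After one month: emulsification is essentially complete, oxidation is
# well underway, and biodegradation has barely begun.
for pathway in TIMESCALES_DAYS:
    print(pathway, round(fraction_unweathered(pathway, 30.0), 3))
```

The point of the sketch is the spread of outcomes at a single moment: a month in, one process is finished, another is mid-course, and a third will dominate the picture for years—hence the separate models for each response phase.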

ASA has assisted in many oil spills, but this one is unique. On the surface, weathering processes can evaporate or dissolve an oil slick, and chemical dispersants or controlled burns can predictably alter its behavior. By contrast, a mixed volume of oil, gas, and water jetting from abyssal depths has a longer journey to the surface, over which an assortment of competing effects can alter it in novel ways, sapping its upward momentum or sweeping it out in expansive subsurface oil plumes. Government and media reports aside, such oily underwater tendrils are integral to comprehensive blowout models. Plumes associated with the Deepwater Horizon leak should come as no surprise.

While the Exxon Valdez spill is a proxy for US coastal environmental damage, in terms of environmental modeling, "[the Valdez and the Deepwater Horizon] are entirely different animals," says Spaulding, because the former was a finite surface spill in a glacier-filled sound, where nearby mountains magnified the wind patterns enormously. A closer analog might be the 1979 Ixtoc subsurface oil spill, but Ixtoc occurred in comparatively shallow water and therefore provides little relevant data. "We would not expect any separation of the oil and natural gas below the surface. No gas hydrates would form—in a sense, the Ixtoc plume is quite easy to model," said Dr. Scott Socolofsky, Associate Professor of Civil Engineering in the Division of Coastal and Ocean Engineering at Texas A&M University. Absent a historical record for deepwater plume behavior, NOAA uses a blowout model that was tested against a controlled crude oil and methane release off the coast of Norway in 2000, an experiment in which Socolofsky took part.

Since their early applications in the 1960s, oil-spill trajectory codes have become more sophisticated in their ability to combine multiple effects into comprehensive models. They now account, as needed, for such aspects as surface slick drift due to wind in the presence of a shore, how oil reacts with ice, or how breaking waves disperse it. Regarding natural, chemical, and surface-burn dispersal, Spaulding said, "All the best models now have a dispersant treatment algorithm; you can say, 'I lost this much [oil] and this is how it changed the concentrations in the water column.'"
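The mass-balance bookkeeping Spaulding describes can be sketched in a few lines. This is a hypothetical illustration of the idea behind a dispersant treatment step—not ASA's algorithm, and the figures are invented: treated oil leaves the surface slick and reappears as a concentration in the mixing volume below.

```python
def apply_dispersant(slick_tonnes, treated_tonnes, mixing_volume_m3):
    """Move `treated_tonnes` of oil from the surface slick into the
    water column, returning (remaining slick in tonnes, added
    concentration in mg/L)."""
    treated = min(treated_tonnes, slick_tonnes)  # cannot treat more than exists
    remaining = slick_tonnes - treated
    # Unit conversions: 1 tonne = 1e9 mg; 1 m^3 = 1000 L.
    added_mg_per_l = treated * 1e9 / (mixing_volume_m3 * 1000.0)
    return remaining, added_mg_per_l

# Invented example: dispersing 200 of 1,000 tonnes into a 2 km^3
# mixing volume leaves 800 tonnes on the surface and raises the
# water-column concentration by 0.1 mg/L.
slick, conc = apply_dispersant(slick_tonnes=1000.0, treated_tonnes=200.0,
                               mixing_volume_m3=2e9)
print(slick, conc)
```

Even this toy version shows why the technique matters to responders: it turns "how much oil did we treat?" directly into "how much did subsurface concentrations rise?"—the quantity that determines toxicity to fish, plankton, and larvae.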
