Speeding up fisheries models 50-50,000 times

Complex fisheries models are like weather forecasts for fish populations: they gather all the available data on trends in fish numbers over time, numbers at each age, and other information, and then predict the level of catch that can sustainably be taken from the population. As computing power has grown, these models have become ever more complex, so their run times have remained stubbornly high.

Fitting all of the data in such models can be done in two ways: maximum likelihood methods, which approximate the uncertainty around model estimates, and Bayesian methods, which measure that uncertainty more directly. Although Bayesian methods perform better, the time needed to run them to completion has been prohibitive, and most fisheries are managed with models in the maximum likelihood framework. Now a suite of new advances has been combined: newer and more efficient Bayesian algorithms, parallel processing, and restructuring of models to make them more amenable to Bayesian methods. Together these offer a 50 to 50,000-fold improvement in run time over existing methods, in one case allowing a model that would have taken more than 15 years to complete to run in just over 12 hours.

The new work was led by Cole Monnahan as part of his PhD dissertation and completed while he was a research scientist at SAFS. Coauthors include SAFS professor Trevor Branch, NOAA scientist Jim Thorson, IPHC scientist Ian Stewart, and Cody Szuwalski. The paper is published in the ICES Journal of Marine Science.
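
For readers curious about the parallel-processing piece of the speed-up, here is a minimal sketch (ours, not the authors' code) of running independent Bayesian sampling chains on separate processor cores. The toy posterior, the simple random-walk Metropolis sampler, and the chain count are all illustrative assumptions; the paper's fisheries models and algorithms are far more sophisticated.

```python
# Illustrative sketch only: running several MCMC chains in parallel, one of the
# speed-ups described in the paper. The posterior and sampler here are
# hypothetical stand-ins, not the fisheries models or algorithms from the study.
import numpy as np
from multiprocessing import Pool

def log_posterior(theta):
    """Toy log-posterior (a standard normal). A real fisheries model would
    evaluate the likelihood of catch and age-composition data here."""
    return -0.5 * np.sum(theta ** 2)

def run_chain(seed, n_iter=5000, dim=2, step=0.5):
    """Random-walk Metropolis chain; each chain gets its own RNG seed."""
    rng = np.random.default_rng(seed)
    theta = rng.normal(size=dim)
    lp = log_posterior(theta)
    samples = np.empty((n_iter, dim))
    for i in range(n_iter):
        proposal = theta + step * rng.normal(size=dim)
        lp_prop = log_posterior(proposal)
        if np.log(rng.uniform()) < lp_prop - lp:  # Metropolis accept step
            theta, lp = proposal, lp_prop
        samples[i] = theta
    return samples

if __name__ == "__main__":
    # Four independent chains on four cores: wall-clock time falls roughly in
    # proportion to the number of cores, on top of any algorithmic gains.
    with Pool(4) as pool:
        chains = pool.map(run_chain, [1, 2, 3, 4])
    print("posterior mean:", np.mean(np.vstack(chains), axis=0))
```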
