Hana Ra’s (BS Biology, 2020) interest in citizen science began when SAFS Professor Julia Parrish gave a presentation on the Coastal Observation and Seabird Survey Team (COASST) program in Ra’s marine biology course. She was amazed that Parrish had created a program that engaged both researchers and the general public in collecting data, which subsequently informed a multitude of management and conservation projects. Ra’s interest would grow through internships with COASST and the NOAA Pacific Islands Fisheries Science Center (PIFSC), where she helped create a citizen science project in her hometown of Honolulu, Hawaii.
During the summer of her junior year, Ra began to help develop the citizen science project “OceanEYEs” as a NOAA/JIMAR PIFSC Young Scientist Opportunity (PYSO) Intern. For this internship, she collaborated with PIFSC researchers and used research data collected from the annual bottomfish surveys to create informative training and education materials. The OceanEYEs project, a partnership between scientists and Zooniverse.org, employs a user-friendly web page where citizen scientists can help review images from the annual bottomfish surveys, tagging and identifying all the fish they see. Scientists can then use those data in stock assessments and to “train” advanced artificial intelligence (AI) tools for the future.
The images are collected each year during the Bottomfish Fishery-Independent Survey in Hawaii (#BFISH) using state-of-the-art stereo-camera systems. The survey provides an estimate of the number of Deep 7 bottomfish—seven species of fish that have both economic and cultural value to the islands. The data from this survey are used in the Deep 7 stock assessment to provide managers with the best information to make management decisions, including annual commercial fishery catch limits.
The camera systems, which rest on the seafloor for 15 minutes at a time, record hundreds of thousands of images over the course of the survey. These images are currently analyzed by NOAA scientists, but the sheer number of images collected during survey operations can be overwhelming.
NOAA has been investing heavily in the development of AI solutions, allowing scientists to use machine learning and computer vision to analyze images. However, for a model to learn, it needs large numbers of training images: images of fish that a human has already tagged and identified.
The OceanEYEs web page gives users a tutorial on how to recognize each fish species and how to properly mark them in the image. It also has a field guide and text to help users identify and annotate fish. Users can also learn about the science behind OceanEYEs.
Fifteen different people view and annotate each image, and the results are compiled to give a “consensus” annotation. Initial results suggest that consensus annotations can match the accuracy of professional analysts, greatly enhancing NOAA’s capabilities to process image data from the Pacific Islands region.
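The article does not describe how OceanEYEs actually combines the fifteen annotations, but the general idea of a consensus annotation can be illustrated with a simple majority vote. The sketch below is hypothetical: the function name, the 50% agreement threshold, and the example species labels are all assumptions for illustration, not details of the NOAA/Zooniverse pipeline.

```python
from collections import Counter

def consensus_label(labels, min_agreement=0.5):
    """Return the majority species label for one detected fish,
    or None if no label reaches the agreement threshold."""
    if not labels:
        return None
    species, count = Counter(labels).most_common(1)[0]
    return species if count / len(labels) >= min_agreement else None

# Fifteen hypothetical volunteer labels for a single fish in one image
votes = ["opakapaka"] * 11 + ["onaga"] * 3 + ["ehu"]
print(consensus_label(votes))  # -> opakapaka
```

In a real pipeline, annotations would first have to be matched across users by image position before any vote over species labels; this sketch only shows the voting step.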
Want to get involved? Just log on, dive in, and start exploring the underwater world while helping assess the bottomfish populations of Hawaii!
This story was adapted from a NOAA press release published on September 15, 2020.