Citizen science, in which the nonexpert public freely joins in producing useful science, has grown to more than 2,100 projects on the SciStarter website alone. These projects range from online identification of astronomical objects, to game-like protein-folding prediction (Foldit), to seasonal bird counts. One long-running project is COASST (Coastal Observation and Seabird Survey Team), in which members of the public conduct monthly surveys of beaches from California to Alaska looking for bird carcasses. Any carcasses found are photographed, tagged, and identified using a key, and the resulting data have been hugely influential in identifying mass bird die-offs on the west coast of the US, among other findings.

A new study has now examined which factors lead to high effort, accuracy, and social connectedness in the COASST project. Unlike most online citizen science projects, where more than 90% of participants drop out after a single event, more than half of COASST participants were still conducting beach surveys more than a year after joining. Bird identification accuracy was 88% overall and increased rapidly as participants encountered more carcasses. Interestingly, many long-term participants (34%) never found a single carcass but continued to report their search effort, providing valuable data on bird occurrence and absence. Participants also varied widely in their social behavior, from loners, to those who walked mostly in pairs, to highly connected members who recruited many others into the project. The results show that extensive citizen science projects can collect high-quality data, greatly expanding the reach of science.

The new work was led by SAFS professor Julia Parrish together with Timothy Jones, Hillary Burgess, and Yurong He at SAFS, Lucy Fortson at the University of Minnesota, and Darlene Cavalier at Arizona State University, and appears in the journal Proceedings of the National Academy of Sciences, USA.