About This Project
This project, partially funded by Sea Grant California and based at Cal Poly Humboldt, uses Baited Remote Underwater Video (BRUV) to study fish communities in Northern California's sandy beach surf zone within Marine Protected Areas. This non-invasive method gathers data on species diversity, including commercially, recreationally, and culturally important species. Funding for the MaxN AI plugin for EventMeasure will automate video analysis, increasing efficiency and consistency while reducing future monitoring costs.
Ask the Scientists
What is the context of this research?
Baited Remote Underwater Videos (BRUVs) consist of cameras mounted on an anchor, with a bait bag suspended at a distance to attract fish and other marine species for video capture. This non-invasive, non-destructive method is ideal for monitoring fish species richness, diversity, and abundance, especially in Marine Protected Areas (MPAs), which are critical habitats for many species. While BRUV footage provides valuable insights, it can be time-consuming to analyze, as hours of video may yield only a few fish observations. To improve time efficiency, SeaGIS, the developer of EventMeasure, is working on an AI plugin called MaxN. MaxN automatically detects fish presence and timestamps each observation, allowing researchers to focus on key frames. This project aims to assess whether integrating MaxN into BRUV analysis will enhance time efficiency, increase consistency, and improve the capacity for cross-project comparison in MPA monitoring.
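For readers unfamiliar with the metric the plugin is named after: MaxN is the standard BRUV abundance measure, defined as the maximum number of individuals of a species visible in any single frame (this avoids double-counting fish that swim in and out of view). A minimal sketch of how MaxN can be tallied from per-frame detections follows; the function and example data are illustrative only and do not represent the EventMeasure or MaxN plugin API.

```python
from collections import defaultdict

def max_n(detections):
    """Per-species MaxN: the highest count of individuals of that
    species observed together in any single video frame."""
    # detections: list of (frame_number, species) tuples
    per_frame = defaultdict(lambda: defaultdict(int))
    for frame, species in detections:
        per_frame[frame][species] += 1
    result = defaultdict(int)
    for counts in per_frame.values():
        for species, n in counts.items():
            result[species] = max(result[species], n)
    return dict(result)

# Hypothetical detections from two frames of one deployment
detections = [
    (10, "barred surfperch"), (10, "barred surfperch"),
    (10, "redtail surfperch"),
    (42, "barred surfperch"), (42, "barred surfperch"),
    (42, "barred surfperch"),
]
print(max_n(detections))
# {'barred surfperch': 3, 'redtail surfperch': 1}
```

Three barred surfperch appear together in frame 42, so MaxN is 3 even though five barred surfperch sightings were logged in total.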
What is the significance of this project?
The ability to efficiently and accurately assess fish species richness, diversity, and abundance in Marine Protected Areas (MPAs) is critical for informed conservation and management. Traditional Baited Remote Underwater Video (BRUV) sampling, while non-invasive and highly effective, is hindered by the tedious and time-intensive process of analyzing extensive video footage. This project aims to revolutionize BRUV data analysis by integrating the AI program MaxN, which automates fish detection and timestamping. This serves to enhance data processing efficiency, improve analytical consistency, and expand the capacity for large-scale, cross-project ecological assessments. If the goals of this project are met, this protocol could be implemented widely in marine research as a non-invasive and time-efficient method to monitor MPA success.
What are the goals of the project?
Our first goal is to explore the AI plugin MaxN as a viable alternative for manual BRUV video analysis. We will train the MaxN plugin with raw BRUV footage collected from an ongoing MPA monitoring effort. We will then evaluate MaxN by comparing the results of the manually processed BRUV footage with the results produced by AI, given the same footage. This comparison will assess whether AI can improve speed, accuracy, and scalability, ultimately reducing the time researchers spend analyzing footage.
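The manual-versus-AI comparison described above reduces, at its simplest, to comparing per-species counts from the two analysis pipelines on the same footage. The sketch below shows one plausible way to summarize such a comparison (exact-agreement rate and mean absolute difference across species); the function name, metrics, and data are assumptions for illustration, not the project's actual evaluation protocol.

```python
def compare_counts(manual, ai):
    """Compare per-species counts (e.g., MaxN) from manual and AI
    analysis of the same footage. Returns the fraction of species
    with identical counts and the mean absolute difference."""
    species = set(manual) | set(ai)
    # A species missing from one result is treated as a count of 0
    diffs = [abs(manual.get(s, 0) - ai.get(s, 0)) for s in species]
    agreement = sum(d == 0 for d in diffs) / len(species)
    mean_abs_diff = sum(diffs) / len(species)
    return agreement, mean_abs_diff

# Hypothetical counts from one BRUV deployment
manual = {"barred surfperch": 3, "redtail surfperch": 1, "leopard shark": 1}
ai = {"barred surfperch": 3, "redtail surfperch": 2}
agreement, mad = compare_counts(manual, ai)
print(round(agreement, 2), round(mad, 2))  # 0.33 0.67
```

In practice a validation like this would run across many deployments, and time-to-analyze would be recorded alongside count agreement to test the efficiency claim.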
Another key objective is to develop a protocol that enhances time efficiency, accuracy, and standardization. This protocol will establish a standardized method for integrating AI into BRUV analysis, improving accuracy, and ensuring consistency across MPA research. The goal is to create a replicable process that minimizes human error while maximizing time efficiency in data analysis.
Budget
This $10,000 funding request is part of a larger Sea Grant California-funded project aimed at monitoring the fish community in Marine Protected Areas (MPAs) to assess their success. BRUVs are one of the methods used in this monitoring effort. The funding will support the purchase and validation of the MaxN AI software for BRUV video analysis, helping determine whether AI can improve time efficiency, standardization, and accuracy in marine biodiversity monitoring.
- AI Software - $3275: Covers MaxN plugin for EventMeasure to automate fish species detection and evaluate AI’s efficiency vs. manual review, helping standardize analysis
- Researcher/Video Analyzer (200 hrs) - $5240: Researcher will validate MaxN’s performance by comparing AI results with manual analysis, assessing its ability to process large footage volumes more efficiently
- Fringe - $555: Payroll taxes (worker’s comp, unemployment, Social Security, Medicare)
- Miscellaneous - $930: Unexpected costs (hardware, software upgrades, storage)
Project Timeline
The AI program MaxN will be purchased immediately. One month will be budgeted for the primary researcher to become familiar and comfortable with the program, ensuring it is used to its fullest potential. We plan on analyzing about 100 hours of raw BRUV footage both manually and through MaxN, which will take place between May and September. Once this analysis is completed, we expect to spend the following month writing a clear and concise report highlighting our results.
Apr 08, 2025
Receive grant funding and acquire MaxN AI software; train the primary researcher on software operation.
May 08, 2025
Begin analysis of BRUV footage using MaxN AI software, while simultaneously conducting manual analysis for comparison.
Sep 15, 2025
Complete both manual and MaxN AI analysis of BRUV footage.
Sep 22, 2025
Start writing the research report based on our findings.
Nov 22, 2025
Finalize the research report, preparing it for submission.
Meet the Team
Affiliates
Katie Terhaar
Katie Terhaar is a marine fisheries ecologist with a strong focus on coastal ecosystems and marine conservation. She holds a Master of Science in Natural Resources from Humboldt State University (now Cal Poly Humboldt), where her thesis explored the ecology of sandy beach surf zones and evaluated the effectiveness of MPAs in Northern California. Her academic background has equipped her with a comprehensive understanding of marine biology, conservation strategies, and field research techniques.
With five years of experience in BRUV footage analysis and deployment, Katie has contributed to the monitoring and assessment of fish populations in coastal environments. She is skilled in scientific diving, underwater sampling, and species identification, and is focused on improving marine biodiversity monitoring methods.
Katie's work continues to emphasize the integration of field-based data with conservation strategies, aiming to improve the understanding of marine ecosystems and inform more effective and sustainable management practices.
Noah Gabay
MS in progress
Natural Resources - Fisheries, Humboldt State University
Noah Gabay is a marine biologist entering the field of fisheries biology. He holds two Bachelor of Science degrees from Cal Poly Humboldt, in biology with a marine concentration and in zoology. Upon completing his undergraduate education, Noah transitioned immediately into a master's program in Natural Resources at Cal Poly Humboldt. His thesis research continues a long-term MPA monitoring effort in Northern California using BRUVs as the main non-invasive sampling technique.
Additional Information
Marine Protected Areas (MPAs) are established in ecologically significant areas that support critical stages of the life histories of various organisms. Because of this, non-invasive sampling techniques are preferred by researchers studying marine species associated with MPAs. Baited Remote Underwater Videos (BRUVs) represent a low-impact method for gathering data on species diversity and abundance in marine and aquatic environments. This technique is particularly conducive to sampling areas where delicate or endangered species may be present, minimizing disruption to the ecosystem. However, BRUV sampling produces numerous hours of video footage which requires rigorous attention from the observer to extract meaningful data. Researchers working with BRUV footage often spend many hours analyzing the footage, sometimes without recording any species.
This project, funded by Sea Grant California and conducted by staff in the Department of Fisheries at Cal Poly Humboldt, focuses on studying the fish community in the sandy beach surf zone along the Northern California coast. The project has been ongoing for five years and is set to continue for at least two more, allowing for the collection of long-term data on MPAs. The study employs BRUVs to document and monitor the fish species in this dynamic environment. By using BRUVs, researchers are able to capture video footage of fish interactions with baited camera traps, providing insights into species composition, behavior, and abundance. This non-invasive method allows for detailed observations without disrupting the delicate coastal ecosystem, offering valuable data for the management and conservation of MPAs in the region.
This research is not only important for commercial and recreational fisheries in the area, but also valuable for the underserved communities in the region. Many of the fish species in the sandy beach surf zone are essential to the cultural and subsistence practices of local Indigenous tribes. By using non-invasive BRUV sampling, this project optimizes data collection while minimizing disruption to the marine ecosystem. The continuation of this work will provide valuable insights to guide the management of MPAs, supporting the protection of both marine biodiversity and the commercial, recreational, and cultural resources central to local communities.
To enhance the efficiency of data analysis, we are seeking funding for the MaxN AI plugin for EventMeasure software, developed by SeaGIS. The MaxN plugin will allow us to rapidly process large volumes of BRUV video data, detecting fish presence with high consistency. Through EventMeasure, the AI program timestamps fish and other species in the footage, enabling researchers to focus only on frames where organisms are identified. By adopting this approach to video analysis, we expect to significantly reduce the time spent reviewing footage, improve consistency, and lower the costs of future monitoring efforts in MPAs and other sensitive aquatic habitats where BRUVs are utilized.