Exploring Electronic Monitoring

Feb 26, 2020 | Aids to Navigation

More and more captains are using cameras to track catch. (Image: footage from electronic monitoring.)

By Melissa Sanderson

How do you measure “soft costs”? Wait. What ARE “soft costs”?

Soft costs represent a fisherman’s time and effort. They have real monetary value even though no invoice ever comes due.

The soft and hard costs of electronic monitoring (EM) programs were at the forefront of conversations for two days in February during the National Electronic Monitoring Workshop in Seattle, Washington, hosted by NOAA Fisheries and the Pacific States Marine Fisheries Commission.

Electronic monitoring should be cheaper than human observers. Industries automate because technology keeps getting cheaper while human labor keeps getting more expensive. But in designing electronic monitoring programs for fishing vessels, it is all too easy to add bells and whistles that drive up costs. During the workshop, these bells and whistles were often likened to building a Cadillac when all you need is a Chevy — or a bicycle.

At its core, using cameras to document what is happening on a fishing vessel is a tool to get specific types of data. To paraphrase Half Moon Bay fisherman and Pacific Council member Bob Dooley, EM data provides certainty, a clearer understanding of what comes out of the water. That certainty is key to accurate fishery science. That accuracy is what allows for full quota allocations, abundant stocks, and little to no need for uncertainty buffers.

All of us at the workshop agreed that the holy grail of accurate data is at the heart of why we invest time, energy, and a lot of money into designing EM solutions.  But there is often disagreement about how to get that accurate data.

How much precision is necessary? Does it matter if we’re estimating weights from half-full totes, or do we need to measure every single fish? What exact data do we need to collect? Is it enough just to have cameras running to change fishermen’s behavior and prevent illegal, unreported discarding? Do we also have to watch all that video? What about all the extra data we could be collecting for science?

EM projects have been testing what EM can do, which encourages innovation but also creates “mission creep.”

Your cell phone is a perfect example of mission creep. The primary objective of having a cell phone is to communicate with others. But one day you realized you could also get email and keep your calendar on your phone, and you were willing to pay a little more for a smarter phone. Then came all these applications that made your life a little easier or more convenient, and you paid more to increase your monthly data allowance. Suddenly you were using your phone to deposit checks, read the news, video-chat with friends, and stream your favorite shows. Now you needed unlimited data.

But wait, isn’t a phone for making calls?  Not anymore. Mission creep has expanded what your phone is used for, and dramatically increased your phone bill.

The same thing can happen to EM, as managers, scientists, and even fishermen go beyond what is required and start adding “nice-to-have” bells and whistles.

This is why it is so important to define the primary objectives of the monitoring program before designing the solution. EM meant to ensure that illegal discarding is not occurring looks very different from EM meant to quantify the weight of discarded fish.

But some of that mission creep is tantalizing. How much extra do we spend now in order to secure future savings?

Let’s say you want a computer to identify and measure fish coming onboard. Depending on whether the fish is in a “photo booth” or dangling off a hook, a machine-learning program will need between 15,000 and one million different images of a single species of fish before it can reliably replace a human reviewer. Eric Pennaz, from Google, emphasized the importance of storing all the video collected by EM to develop the image library necessary to train artificial intelligence. That storage is expensive, but the resulting advances should significantly decrease review costs in the future. Yet it runs counter to demands that we delete video as quickly as possible to reduce storage costs.
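
Why so many images? Systems like this typically fine-tune a network already trained on generic photos, with the EM image library supplying the fish-specific examples. Here is a minimal sketch of that transfer-learning approach in Python with PyTorch; the folder layout, species classes, and training settings are assumptions for illustration, not any workshop participant’s actual pipeline.

```python
# A minimal transfer-learning sketch for species identification from
# EM video frames. Everything here is illustrative, not a real EM system.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

# Frames pulled from EM video, sorted one folder per species,
# e.g. frames/haddock/, frames/cod/ (hypothetical layout).
transform = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])
dataset = datasets.ImageFolder("frames", transform=transform)
loader = DataLoader(dataset, batch_size=32, shuffle=True)

# Start from a network pre-trained on generic images and replace only
# the final layer. The shared image library is what teaches the model
# the fish-specific differences.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, len(dataset.classes))

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
```

Controlled “photo booth” images need far fewer examples than fish dangling off a hook because the pose, lighting, and background vary much less, which is exactly the 15,000-versus-one-million gap described above.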

How valuable is a tool like EcoCast (https://coastwatch.pfeg.noaa.gov/ecocast/), which aims to reduce interactions with protected species while maximizing target catch? This dynamic tool combines ocean temperature, wave height, chlorophyll, and wind data with historic fishing and observer data to give fishermen a real-time forecast of where species are likely to be. EM data could be a valuable addition in expanding EcoCast around the country, but how much does it cost to extract the extra data?
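
As a rough illustration of the idea behind such a tool, the toy sketch below blends a few gridded layers into a single per-cell score. It is emphatically not EcoCast’s actual model; the layers, weights, and grid are invented for illustration.

```python
# Toy example: combine environmental layers and historic catch data
# into one score per ocean grid cell. All inputs and weights are made up.
import numpy as np

rng = np.random.default_rng(0)
shape = (50, 50)  # a coarse grid of ocean cells

# Stand-ins for real gridded inputs (satellite and observer data).
sea_surface_temp = rng.uniform(10, 20, shape)  # degrees C
chlorophyll = rng.uniform(0, 5, shape)         # mg per cubic meter
historic_target = rng.uniform(0, 1, shape)     # target-catch likelihood
historic_bycatch = rng.uniform(0, 1, shape)    # protected-species risk

def norm(layer):
    """Scale a layer to 0..1 so the weights below are comparable."""
    return (layer - layer.min()) / (layer.max() - layer.min())

# Weight target catch up and protected-species risk down.
score = (0.4 * historic_target
         + 0.2 * norm(sea_surface_temp)
         + 0.1 * norm(chlorophyll)
         - 0.3 * historic_bycatch)

row, col = np.unravel_index(score.argmax(), shape)
print(f"Most promising cell to fish: row {row}, col {col}")
```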

What about a fisherman in New Zealand who sells his fish for eight times more than the boat next to him because he has embraced full transparency and live-streams video to the Internet, including views of the fishing method and deck operations? Check it out here: https://betterfish.co/.

Maybe NOAA or the research community needs to supplement monitoring budgets to accommodate mission creep without placing the financial burden on fishermen.

A repeated theme throughout the workshop was “trust and collaboration.” Successful EM programs require fishermen, fishery managers, EM service providers, and scientists to collaborate. Their decisions hinge on evaluating trade-offs: how much change in operations fishermen are willing to make (soft costs) versus what they are willing to pay someone else to do (hard costs). For example, when it comes to measuring discards:

  • Are fishermen willing to take the time to record more high-quality information, which allows a portion of logbooks to be audited? If not, an audit model might not be possible, and review costs rise significantly, since every trip gets reviewed instead of 10 or 20 percent (see the rough arithmetic after this list).
  • Are fishermen willing to bring their discards, often unsellable, home to be measured dockside? That keeps changes to fish handling on deck minimal and video review cheap, but adds the cost of a dockside monitor.
  • Are fishermen determined to return discards to the sea, hopefully alive? Then they need to be willing to handle those fish deliberately, in view of the cameras. That might mean lining up each fish on a measuring board, putting all the juvenile haddock together in a bucket that allows a volume estimate, or installing flow scales and weighing discards before they go overboard.
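
To see why the audit rate matters so much, here is a back-of-the-envelope comparison. Every number in it is hypothetical, chosen only to show the arithmetic, not drawn from any actual program.

```python
# Hypothetical review-cost comparison: auditing a sample of trips
# versus reviewing every trip. All figures are illustrative.
TRIPS_PER_YEAR = 200         # assumed trips for one vessel
REVIEW_COST_PER_TRIP = 150   # assumed dollars to review one trip's video
AUDIT_RATE = 0.15            # review 15 percent of trips under an audit model

full_review = TRIPS_PER_YEAR * REVIEW_COST_PER_TRIP
audit_review = round(TRIPS_PER_YEAR * AUDIT_RATE) * REVIEW_COST_PER_TRIP

print(f"Review every trip:  ${full_review:,}")    # $30,000
print(f"Audit 15% of trips: ${audit_review:,}")   # $4,500
```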

Another part of program design is considering who has access to and control of the video.

The workshop shared insights from lawyers, NOAA General Counsel, and the Office of Law Enforcement (OLE). It was reassuring to hear that OLE will only access video if they have a tip or probable cause to compile evidence; they will not be watching video looking for potential violations. OLE does not provide confidential data to the Coast Guard. The Magnuson-Stevens Act prohibits Coast Guard access for safety violations; the Coast Guard can only access video for fisheries violations and Homeland Security purposes. But unlike OLE, the Coast Guard could pursue non-fisheries violations if it stumbles across them while accessing video related to a fisheries violation. Confidentiality protections depend on whether NOAA or a third-party EM service provider is reviewing and storing the video. As a program is designed, lawyers should develop very thoughtful and deliberate contracts for how video and summary data are shared, accessed, and controlled.

At the end of the day, NOAA Fisheries was clear that it is looking to the EM community and fishermen to help define what EM looks like for each region and fishery. EM cannot be a one-size-fits-all solution. It is heartening that NOAA recognizes the need for unique solutions that are safe, achievable, cost-effective, and able to evolve.

Speaking of innovations, it was apparent that how regulations are written can either allow or restrict the evolution of EM. It is fantastic if machine learning can cut costs by 50 percent and speed review, but adopting those innovations becomes incredibly difficult if the New England Fishery Management Council has to go through another two-year process to amend regulations to allow the use of machine learning. Councils don’t dictate the details of human observer programs, and they shouldn’t dictate the details of EM programs. Councils and regulators should instead focus on setting clear monitoring objectives and performance standards, leaving the nuts and bolts of how EM collects and provides data to the service providers.

Melissa Sanderson, Chief Operating Officer at the Fishermen’s Alliance, manages the New England Electronic Monitoring Audit Model program, assisting vessels from Rhode Island to Maine in piloting EM solutions. This article shares information and insights she gleaned at the recent National Electronic Monitoring Workshop.
