# Southern resident growth rate ambiguity

About a month ago Orca Relief issued press releases about the population dynamics of southern resident killer whales. At the time I spent a while marveling at how their new leader, Bruce Stedman, seemed to have cherry-picked data to support a claim that the population was in precipitous decline. This morning I took some time to revisit those plots in light of the equally disturbing statistics and graphs presented during the orca-salmon workshops conducted by Will Stelle over the last couple of years (live blogs with audio and/or video here — workshop 2; workshop 3).

Basically, somebody who is statistically talented and politically independent needs to clarify for the conservation and stewardship community the state of the Southern Resident killer whale (SRKW) population. The same person or entity should re-interpret the data whenever we get an updated census from the Center for Whale Research (CWR), which (as an aside) recently put its NOAA-funded photos and census data behind a membership paywall.

The main misleading aspects of the Orca Relief plots (see below) are that they only go back to 1997 (though we have data back to 1971), and that their trends in juvenile and reproductive females were drawn from maxima in the time series rather than fit to the full range of data. This is an incomplete statistical presentation, and a potentially misleading one, just as stacked bar graphs can cause confusion in any science presentation. Below are two examples.

Female SRKWs of (undefined) reproductive age by pod.  Stacked bar graphs make it difficult to see and compare trends in the K and L populations; only the J and total population trends can be easily discerned.

Juvenile female SRKW population time series as presented by Orca Relief. The rate of decline measured from 2001 to present overestimates the decline relative to one measured from 1997 to present, or to a best linear fit of all available data. More importantly, why were the previous 22 years of available data excluded?
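A quick sketch of why the choice of start year matters so much. The counts below are hypothetical (NOT the actual CWR census); they just illustrate the statistical point that a slope anchored at a local maximum like 2001 can come out negative even when a least-squares fit to the whole series is positive.

```python
import numpy as np

# Hypothetical counts (not CWR data): a noisy series with a mild upward
# trend and a local maximum at "2001".
years = np.arange(1997, 2014)
counts = np.array([10, 11, 12, 14, 15, 13, 12, 12, 11,
                   12, 13, 12, 13, 14, 13, 14, 14])

# Slope implied by picking the 2001 maximum as the start point
i2001 = years.tolist().index(2001)
endpoint_slope = (counts[-1] - counts[i2001]) / (2013 - 2001)

# Least-squares slope over the full series
fit_slope, _ = np.polyfit(years, counts, 1)

print(f"endpoint slope (2001-2013):      {endpoint_slope:+.2f} whales/yr")
print(f"least-squares slope (1997-2013): {fit_slope:+.2f} whales/yr")
```

With these made-up numbers the endpoint method reports a decline while the fit to all the data reports growth, which is exactly the ambiguity at issue.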

For comparison, here is a plot of all CWR census data (1971-2012) by pod as presented in the 2012 Puget Sound Partnership’s orca vital sign:

CWR SRKW population data by pod presented by the PSP.

This shows how potentially misleading it is for an organization like Orca Relief to showcase trends using only data since 1997! If we use 1971 as the baseline, J and K pods appear to have grown slowly, while L pod’s growth is positive but lower. It is also apparent that the population dynamics of L pod, more than those of J or K, drive the sudden changes in the total population, which is growing on average over the whole time series. As the 2012 PSP orca vital sign reports:

“…as of July 1st 2013, the size of the population was 82 individuals, down by four whales relative to the 2010 baseline reference of 86 whales… Although there has been no progress made since 2010, the population has been growing, albeit slowly at about 1% per year, over the longer term (1979 to 2010).”

The stories run by Q13FOX, the San Juan Islander and Island Guardian and picked up by KOMO news didn’t concern themselves with such statistical details.  However, a later piece in the San Juan Islander about reproductive (mature) male SRKWs quoted NOAA’s Eric Ward who argued that the recent decline in mature males is part of the population’s historic fluctuations, not a sign of imminent collapse as suggested by Orca Relief in the Island Guardian.  Both pointed out the good news: that juvenile male populations have been rising recently.

Mature male time series portrayed by Orca Relief (1997-2013)

Juvenile male increase graphed by Orca Relief

Eric Ward’s plot of mature male SRKW population (1971-2013)

As an aside, I was impressed that the San Juan Islander provided references to some of Eric Ward’s papers. What wasn’t referenced, though, were the many discussions of SRKW population trends in the 3 NOAA-sponsored workshops on “The Effects of Salmon Fisheries on Southern Resident Killer Whales.” In the third workshop, it became clear that Ward’s estimates of the SRKW population growth rate (basically +1% per year) were at odds with those made by his Canadian counterpart, modeler Antonio Velez-Espino (basically -1% per year). (Velez-Espino chose to exclude pre-1987 data, stating that his choice — a compromise between the extent and quality of the data — was most representative of the current population, ~25 years or one generation back.)

So, is the SRKW population in slow decline or slow recovery?!  How concerned should we be with the latest downturn in the overall population?  What does it mean for the recovery process if the population continues to fail to meet the recovery plan goal of maintaining 2.3% annual growth over a period of 14 years?
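To put those competing rates in perspective, here is a back-of-the-envelope compounding exercise. It starts from the 2013 census count of 82 whales quoted above and projects over roughly one generation (~25 years); the rates are just the round figures mentioned above, not outputs of anyone’s model.

```python
# Compound the competing annual growth-rate estimates over ~one
# generation (25 years), starting from the 2013 census count of 82.
start = 82
for label, rate in [("Velez-Espino-like", -0.010),
                    ("Ward-like",         +0.010),
                    ("recovery-plan goal", +0.023)]:
    projected = start * (1 + rate) ** 25
    print(f"{label:>18}: {rate:+.1%}/yr -> ~{projected:.0f} whales in 25 years")
```

The spread between a -1% and a +2.3% trajectory after one generation is dramatic, which is why resolving the ambiguity matters.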

The final report from the orca-salmon workshops (Hilborn, Cox, Gulland, Hankin, Hobbs, Schindler, & Trites, 2012) has some exemplary figures that shed a little more light on the situation, particularly the trends in the population of juvenile females (J pod increasing, L pod decreasing). Note that the break points between Orca Relief’s and the Science Panel’s juvenile/young and breeding/reproductive age classes are different. Nevertheless, it’s worth focusing on the juvenile and reproductive female age groups because they have a strong influence on population models like Velez-Espino’s. (He stated in the final workshop that the greatest increase to the modeled population growth rate comes from increasing the survival and fecundity of young reproductive females.) Below are figures for “young” (<21 year old) and “young and reproductive” (<43 year old) populations (not directly comparable to the age groups chosen by Orca Relief, but similar).

Young (<21 year old) SRKW population trends and sex ratios.

Young and reproductive SRKW population trends

So, out of time again, I’m left wondering: should we be panicking as the strange year of 2013 draws to a close? What do you think?

References:

Hilborn, R., Cox, S. P., Gulland, F. M. D., Hankin, D. G., Hobbs, N. T., Schindler, D. E., & Trites, A. W. (2012). The Effects of Salmon Fisheries on Southern Resident Killer Whales: Final Report of the Independent Science Panel (p. 87). NMFS & DFO. Retrieved from http://www.nwr.noaa.gov/publications/protected_species/marine_mammals/cetaceans/killer_whales/esa_status/kw-chnk-final-rpt.pdf

# ResearchGate converging with Libre peer review?

Before I Mendeleted, one thing that always pissed me off about Mendeley was that they provided no tools for being notified when a private group member added a reference to a shared library.  Getting an email or text when a valued colleague or collaborator publishes a paper is the main way I stay on top of the literature these days.  Sometimes the notification comes directly from the author (often via a topical email listserv like MarMam).  Other times it is in a personal email from a co-author, or in a Tweet from a colleague (Rob Williams and Erin Ashe of Oceans Initiative are good at this).

Given this aspect of my scientific workflow, I’ve been very pleased that ResearchGate has been reliably sending me email announcements when the members I follow add publications to their accounts. While some of these notifications aren’t so useful because the publications are dated (new members uploading their old work), many have led me to recent publications that I hadn’t yet read. Here’s an example from this morning’s email:

Thunderbird view of ResearchGate email announcing new pubs from colleagues I follow

I clicked on view, cut and pasted the title into Google, and happily found an open-access version of the article published by Hindawi.com. I read it, found some errors, and wished I could ask the lead author a couple of questions. Hindawi offered no way to comment (or even to contact the publisher), so I went back to ResearchGate and noticed the discuss button:

Note the discuss button in the lower left corner…

I clicked on that and was able to ask a question (not always the best way to start a discussion, but ok).  It took me a while to figure out how to get the title worded right (and under the character limit), how to pose multiple questions, and what tags to add to the discussion topic, but this is how it came through:

I’ll report back if I get any response, but I’m pleased that ResearchGate made posing some questions possible, at least.  This is far from what I’d prefer to be able to do: mark-up the paper directly (or at least my version of it) as I read it in a way that the author could easily understand and respond to (in the form of corrections or improvements to the paper).

Of course, this is what projects like Libre aspire to do…  Which leads me to my next task this morning: exploring what appears to be the launch of Libre!  At last their home page has changed; in fact, it’s re-branded as Liberating Research.  Now that’s a movement I can join!

Unfortunately, it looks like the new site is just a framework for what’s to come.  Although the navigational header looks promising —

— those links to (tools for?) Authors, Articles, Bibliographies, and Disciplines all lead to the same old “Sign up” page for those wanting to hear when the site actually launches.  Here’s looking forward to the wedding of scientific social networks like ResearchGate with improvements in peer review like those offered (very soon!?) by Libre.

# Hydra: a step towards marine telemetry data sharing

Today I finally was able to log on to Hydra, a website that facilitates data sharing among researchers who track the movements of aquatic animals in the Pacific Northwest, U.S.

The whole proprietary tag/receiver technology maintained by Vemco leaves a *lot* to be desired from an open science perspective.  By all rights they should be the ones making it easy for their users to share data.  But Vemco has dominated the market and played defensively for so long that the community here in the progressive Northwest was forced to create Hydra to enable some degree of organized collaboration and data sharing.

As we finish up the final field work for our study of how adult (blackmouth) salmon move in the San Juan Islands, today I’m cleaning up receivers, offloading data, uploading detection files to Hydra, and preparing the receivers to be returned to their owners. I also organized and uploaded to Flickr a bunch of photos from the project.

Data management progress:

• Reset Panasonic toughbook clock to NIST
• Verified data offloaded from Iceberg and Patos receivers (101004, 110848) and prepared for storage
• Offloaded data from previous batch of recoveries (9 receivers) and backed up all data to cloud
• Still need to offload data from latest batch of recoveries (5 receivers)
• When interpreting results, be sure which time zone Vue exports: local or GMT! Checking the Vue software options, I see time is set to UTC.

More progress on 11/14/13:

• Uploaded yesterday’s csv files to Hydra after verifying that drift was no more than 5 minutes for any deployment during our study.
• Updated metadata, including: made new sheet to match Receiver Deployment upload format, copied metadata to it, then converted recover dates to End_Date column; then converted all lon/lat pairs to decimal degree format (and checked/corrected all for accuracy via Google map of deployments)
• At my first attempt at a deployment metadata upload, I got a “Manufacturer” error. I tried re-formatting the Google spreadsheet date as YYYY-MM-DD, removed a () from a site name, removed all blank lines before downloading the .csv file, and then simplified the file name (no spaces) before re-uploading. That got me this error:
Unable to parse start date: Date string was 2011-11-15, and did not match YYYY-MM-DD HH:MM:SS or MM/DD/YYYY (time data '2011-11-15' does not match format '%Y-%m-%d %H:%M:%S')
• Then I tried switching back to the Google date format of MM/DD/YYYY, even though the output CSV file does not display double digits for the first 9 months or days (e.g. 3/4/2013, not 03/04/2013). Hydra didn’t seem to mind that, though:
• receivers-recovered-mmddyyyy.csv processed successfully, Deployment File, 32 rows, 0 rejected.
• Finished offloading data from last batch of 5 receivers (but did not open cases as they still need final cleaning): 101589, 100914, 101594, 101621, 108402.
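The “Unable to parse start date” error above is easy to reproduce with Python’s `strptime`. A bare `2011-11-15` (no time component) matches neither of the two formats named in the error message, while `%m/%d/%Y` happily accepts single-digit months and days, which would explain why Hydra didn’t mind `3/4/2013`. (The fallback logic here is my guess at the uploader’s behavior, based only on the error text.)

```python
from datetime import datetime

def parse_deployment_date(s):
    """Try, in turn, the two formats the Hydra error message says it accepts."""
    for fmt in ("%Y-%m-%d %H:%M:%S", "%m/%d/%Y"):
        try:
            return datetime.strptime(s, fmt)
        except ValueError:
            continue
    raise ValueError(f"Date string was {s!r}, and did not match "
                     "YYYY-MM-DD HH:MM:SS or MM/DD/YYYY")

print(parse_deployment_date("3/4/2013"))            # %m/%d/%Y tolerates single digits
print(parse_deployment_date("2011-11-15 00:00:00")) # date-only "2011-11-15" would raise
```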

More progress on 11/15/13:

• Backup and upload 5 most recent data files.
• Experiment with Hydra mapping and sharing functionality (seems to leave a lot to be desired wrt figuring out anything about the actual fish whose tags we’ve detected), finish feedback email to Hydra administrators, and send it.
• Emailed Anna+ with updates, links, and .zip of latest data files
• Upload to Google workbook and compare 2012-2013 tag list with rough notes on detections (column P).  I find zero overlap.  Bummer!

# Pondering noise levels from cavitation

Our spreading-experiment results and ship receive levels suggest that different ships within a single ship class put out different amounts of high-frequency noise (at the same range from our hydrophone and the same speed over ground). Could this be a manifestation of cavitation within many or all classes of ships?

A clue could come from experimental (e.g. laboratory) measurement of cavitation noise.  How loud is cavitation from propellers at the standard range of 1 meter?

Thus my literature search began. Blake, Wolpert, and Geib (1977) show some interesting figures of source spectra for cavitation with minima near 6 kHz and increasing levels not only at higher frequencies (up to 80 kHz) but also at lower frequencies (down to 1 kHz, where levels are actually above those at 80 kHz). The figure notations in Kipple (2002) suggest that cavitation noise primarily occurs at frequencies above 10-20 kHz.

Blake et al. also show a plot of the spectral density of cavitation noise at 31.5 kHz that could be compared to the single-ship spectrum presented by Hildebrand (2006), as well as ours… Hildebrand computed the 31.5 kHz source spectrum level of the Hanjin to be ~110 dB. The corresponding Blake levels range from 75-100 dB, suggesting that the Hanjin is *really* cavitating!

# bp;dr

Just stumbled upon a new “word” in the Urban Dictionary that I plan to use often in this notebook and beyond. It’s a daily experience for me to get news of a new science article — via email lists like MarMam or in a Tweet from a colleague — only to click on the link and run into a $20-40 paywall. I usually scan the abstract, add the document to Zotero, and tag it with “ref2get” so that when I’m next on the University of Washington campus I can download the PDF.

Since I work in my home office most days, it typically takes 2-4 weeks before I manage to access the PDF. Now, in that interim, I can sum up any response I might have made, had the article been open access, by simply typing —

bp;dr

It stands for “behind paywall; didn’t read.” I’m particularly looking forward to using it in face-to-face conversations with colleagues who still publish in closed-access journals, or in audience discussions at the next ASA conference. I predict it will become the open scientist’s version of my kids’ favorite finger symbol: W(hatever)!

Cynthiakane.com Whatever

# Ship source level work with Val in Seattle

Working together in Seattle, we are still struggling to understand and compute how loud commercial ships are at high frequencies (1-100 kHz). In pursuit of the optimal way to convert our receive levels into source levels, we are weighing spherical spreading against a frequency-dependent model informed in part by our tonal transmission loss experiments at Lime Kiln. We also MUST tackle how to apply the physics of sound absorption in sea water, at least for those ships for which we have measured a signal above noise in the 10-100 kHz frequency range.
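For what it’s worth, here is a minimal sketch of the back-propagation under debate, assuming spherical spreading plus absorption. Absorption is approximated with Thorp’s empirical formula as a stand-in; it’s a coarse temperate-water approximation, and our Lime Kiln frequency-dependent model would replace both terms in any real analysis. The input numbers are hypothetical, not one of our measured ships.

```python
import math

def thorp_absorption_db_per_km(f_khz):
    """Thorp's (1967) empirical seawater attenuation, dB/km (f in kHz).
    A coarse approximation; a site-specific model (e.g. Francois-Garrison)
    would be more appropriate for real source-level work."""
    f2 = f_khz ** 2
    return (0.11 * f2 / (1 + f2)
            + 44 * f2 / (4100 + f2)
            + 2.75e-4 * f2
            + 0.003)

def back_propagated_source_level(rl_db, range_m, f_khz):
    """Receive level -> source level at 1 m, assuming spherical
    spreading (20 log10 r) plus frequency-dependent absorption."""
    tl = (20 * math.log10(range_m)
          + thorp_absorption_db_per_km(f_khz) * range_m / 1000.0)
    return rl_db + tl

# Hypothetical example: a 50 kHz band level of 100 dB received at 500 m
print(f"SL ~ {back_propagated_source_level(100.0, 500.0, 50.0):.1f} dB re 1 uPa @ 1 m")
```

Even at 500 m, the absorption term contributes on the order of 9 dB at 50 kHz under this formula, which is exactly why it can’t be ignored above 10 kHz.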

Today Val is working on comparing 1/3 octave band levels for a few key ships with measurements made by JASCO.  He is also comparing our transmission loss experimental results with theirs, ideally seeing whether we have similar frequency-dependent spreading models and assessing how different they are from a simple assumption of spherical spreading with or without absorption.
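As a reference for the band-level comparison, here is a simplified sketch of summing a power spectral density into 1/3-octave bands, using base-2 band edges at fc·2^(±1/6). This is just a sketch under those assumptions; a proper comparison with JASCO’s results would presumably use standardized (e.g. ANSI S1.11) band definitions and filters.

```python
import numpy as np

def third_octave_levels(freqs_hz, psd_db, centers_hz):
    """Sum a power spectral density (dB re 1 uPa^2/Hz on a uniform
    frequency grid) into 1/3-octave band levels, using base-2 band
    edges at fc * 2**(+/-1/6). A sketch, not an ANSI-filter analysis."""
    df = freqs_hz[1] - freqs_hz[0]      # width of each FFT bin
    linear = 10.0 ** (psd_db / 10.0)    # dB -> linear power density
    levels = []
    for fc in centers_hz:
        lo, hi = fc * 2 ** (-1 / 6), fc * 2 ** (1 / 6)
        in_band = (freqs_hz >= lo) & (freqs_hz < hi)
        levels.append(10 * np.log10(linear[in_band].sum() * df))
    return np.array(levels)

# Demo on white noise (flat 0 dB/Hz PSD): each band level should equal
# ~10*log10(bandwidth), growing ~1 dB per band as the bands widen.
freqs = np.arange(10.0, 120000.0, 10.0)
flat = np.zeros_like(freqs)
centers = 1000 * 2 ** (np.arange(10, 21) / 3)   # ~10 kHz to ~100 kHz
print(np.round(third_octave_levels(freqs, flat, centers), 1))
```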

I continue scouring the literature (some of it rooted in WWII Naval measurements) for comparable measurements of surface ship noise. I’m also trying to understand cavitation better, including searching for any measurements of noise from cavitating propellers and/or ships. Thus far the best recent examples are Kipple’s and Hildebrand’s single-ship measurements (among many measurements of noise below ~1 kHz), but I’ve just found some declassified measurements of surface ships and submarines that appear to extend up into the 10-100 kHz range.

On a procedural note, I need to add Zotpress to this blog so I can insert citations properly…