SEEM 2015 Conference
Statistics in Ecology and Environmental Monitoring
22 – 26 June 2015         Queenstown, New Zealand

Plenary abstracts

Ken Pollock

North Carolina State University

The robust design: historical comments, goodness-of-fit tests, and future directions

The traditional robust design combines open and closed models into one overall capture-recapture model that has many advantages for long-term studies. I will begin with a historical account of how the modelling developed from its original focus on unequal catchability to temporary emigration, unobservable states, and multi-state versions. I will also discuss some new research on goodness-of-fit testing based on sufficient statistics and exploiting a partition of the data into between- and within-primary-period components. I will conclude with a discussion of current and future research directions.

Andy Royle

US Geological Survey, Patuxent Wildlife Research Center

Spatial capture-recapture with partial information on individual identity

Considerable recent attention has been focused on using partial or incomplete information about individual identity in capture-recapture models. Most of this work has involved non-spatial capture-recapture models which, in practice, discard information about the spatial location of individual captures in order to formulate a model in terms of a latent non-spatial encounter history. However, there is information about individual identity in the spatial pattern of recaptures. For example, in trying to reconcile single-flank camera trap photos of individuals, a left-flank and a right-flank photo are more likely to be the same individual if they are captured in the same trap or in close proximity, and the likelihood of a shared identity should decrease with the distance between capture locations. A related problem concerns combining data on unidentifiable detections (scat or hair presence) with encounter histories of known individuals. As with the single-flank camera trapping problem, the probability that an observed detection of unknown ID belongs to some individual should depend on the distance between the detections. Here we present a model for such problems which regards the true encounter history of an individual as a latent or partially latent construct, and the ID of unknown or partially known samples as a latent variable which potentially matches a known individual. The latent ID variables are dealt with using a Markov chain Monte Carlo scheme in which they are simulated from the posterior distribution conditional on the data. We illustrate the method with a spatial capture-recapture data set and discuss several useful extensions of the methodology.
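As a toy illustration of the distance argument (not the authors' actual model), suppose each capture location is the animal's shared activity centre plus independent bivariate normal noise with scale sigma. The difference of two same-individual locations is then bivariate normal with variance 2·sigma², so the relative likelihood of a match decays as exp(-d²/(4·sigma²)) in the separation distance d. A minimal sketch, with hypothetical function names:

```python
import math

def same_individual_likelihood(d, sigma):
    """Relative likelihood that two capture locations separated by
    distance d belong to the same individual, assuming each location is
    the shared activity centre plus independent bivariate normal noise
    with scale sigma (so the difference is N(0, 2*sigma^2 * I))."""
    return math.exp(-d**2 / (4 * sigma**2))

def match_probabilities(distances, sigma):
    """Posterior probability that an unidentified sample matches each of
    a set of candidate known individuals, given the distance to each
    candidate's capture location and a uniform prior over candidates."""
    w = [same_individual_likelihood(d, sigma) for d in distances]
    total = sum(w)
    return [x / total for x in w]
```

For example, with sigma = 25 m, a candidate captured in the same trap (d = 0) receives a much higher match probability than one captured 100 m away, which is the intuition the abstract describes.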

Kerrie Mengersen

Queensland University of Technology

Immersive virtual environments: extracting expert information to support Bayesian spatial modelling and analysis

How to get experts to reliably and completely encode what they know has for decades proven to be an elusive goal in knowledge management. An appealing approach for spatial problems is to immerse the expert in a realistic virtual environment and elicit information based on what they 'see'. This information can then be translated to probabilistic statements and distributions that can be used as priors in Bayesian models. We describe a pilot study in which the aim was to predict the presence of a threatened Australian animal, the rock wallaby, by augmenting the sparse observational data with spatial information obtained from expert ecologists. The results suggest that the immersive approach provides a rich source of reliable prior information that can enhance statistical modelling and prediction.
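One generic way to translate an elicited statement such as "presence probability is about 0.3, give or take 0.1" into a usable prior is method-of-moments fitting of a Beta distribution. This sketch is a standard elicitation device, not the procedure used in the pilot study, and the function name is hypothetical:

```python
def beta_from_mean_sd(m, s):
    """Fit a Beta(alpha, beta) prior by method of moments from an
    elicited mean m and standard deviation s for a probability.
    Requires s^2 < m*(1 - m), else no Beta distribution matches."""
    nu = m * (1 - m) / s**2 - 1  # "effective prior sample size" minus the two pseudo-counts
    if nu <= 0:
        raise ValueError("elicited sd too large for a Beta prior with this mean")
    return m * nu, (1 - m) * nu
```

An expert statement of mean 0.3 and sd 0.1 yields roughly Beta(6, 14), which can then serve directly as the prior on occurrence probability in a Bayesian spatial model.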

David Warton

University of New South Wales

The case of the missing model: the modernisation of multivariate analysis in ecology

For the best part of four decades, multivariate analysis in ecology has diverged substantially from mainstream statistics, perhaps because the state of the art in 1980s statistics could not handle the complexity frequently seen in multivariate abundance data collected simultaneously across many species. But the methods developed in the ecological literature, still widely used today, have some serious shortcomings which suggest they are fast approaching their use-by date.

The statistical literature appears to be "catching up" with ecology, in part through technologies to fit quite flexible hierarchical models capable of accommodating key data structure. There is a significant movement now to reunify multivariate analysis in ecology with modern statistical practices. Some key developments on this front will be reviewed, and immediate challenges identified.

Shirley Pledger

Victoria University of Wellington, New Zealand

Visualising ecological community data: a unified statistical method

The detection of patterns of occurrence or abundance of p species (taxa) over n samples (sites, times) is difficult if n and p are large. Ordination and clustering techniques were developed to try to detect overall patterns, such as which sites have similar species composition, which groups of species tend to occur together, or whether there is a succession pattern over time.

Traditional methods of ordination are usually based on mathematical techniques such as eigenvalue analysis (e.g. correspondence analysis, principal component analysis) or distance metrics (e.g. multidimensional scaling). Similarly, clustering is often based on a distance measure.

Our methods are based on statistical distributions, finite mixtures and maximum likelihood. This yields a unified methodology for model comparisons, ordination, clustering, dimension reduction and pattern detection.

A brief outline of our methodology will be followed by several real examples demonstrating the use of biclustering for pattern detection and the construction of 2-D plots showing the main features of the data. The emphasis will be on applications and visualisation rather than formulae. An update will be given indicating the progress of current work to expand the scope of this ongoing research project.
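The row-clustering half of a likelihood-based approach of this kind can be sketched with a toy EM algorithm for a binary site-by-species occurrence matrix. This is a simplified illustration under an assumed Bernoulli finite mixture over sites only, not the authors' implementation (the biclustering described above additionally mixes over species):

```python
import math
import random

def em_row_cluster(Y, R, iters=100, seed=1):
    """Toy EM for clustering the n rows (sites) of a binary n x p
    occurrence matrix Y into R groups, each group having its own
    Bernoulli occurrence probability per species."""
    rng = random.Random(seed)
    n, p = len(Y), len(Y[0])
    pi = [1.0 / R] * R                                    # mixing proportions
    theta = [[rng.uniform(0.25, 0.75) for _ in range(p)]  # occurrence probs
             for _ in range(R)]
    for _ in range(iters):
        # E-step: posterior group membership for each site (log scale)
        z = []
        for i in range(n):
            logw = []
            for r in range(R):
                ll = math.log(pi[r])
                for j in range(p):
                    t = min(max(theta[r][j], 1e-9), 1 - 1e-9)
                    ll += Y[i][j] * math.log(t) + (1 - Y[i][j]) * math.log(1 - t)
                logw.append(ll)
            m = max(logw)
            w = [math.exp(l - m) for l in logw]
            s = sum(w)
            z.append([x / s for x in w])
        # M-step: update proportions and per-group occurrence probabilities
        for r in range(R):
            nr = sum(z[i][r] for i in range(n))
            pi[r] = nr / n
            for j in range(p):
                theta[r][j] = sum(z[i][r] * Y[i][j] for i in range(n)) / nr
    return pi, theta, z
```

On a matrix where the first three sites share one species composition and the last three another, the fitted posterior memberships separate the two site groups, which is the kind of pattern the 2-D plots in the talk are designed to display.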

Murray Efford

University of Otago

Density-dependent home range and the informative parameterization of spatially explicit capture–recapture models

Home-range size varies inversely with density across populations of a species. Whatever the causal explanation, this generalisation has interesting implications for population sampling, and spatially explicit capture–recapture (SECR) in particular. The core SECR model has parameters for both density and scale of detection (often a surrogate for home-range size). A simple re-parameterization incorporates the inverse relationship in fitted SECR models. This has multiple potential benefits: greater parsimony, a more general null model, and explicit modelling of between-population variation in the effect itself. I develop these ideas with data from field studies of populations with both discretely varying and continuously varying density. Other re-parameterizations of the SECR detection model help us make sense of compensatory heterogeneity and the curious robustness of density estimates to misspecification of the detection model. These will be the focus of the remainder of the talk.
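The inverse relationship between home-range size and density suggests a re-parameterization in which the detection scale is tied to density: if home-range area scales roughly as 1/D, then the scale parameter sigma (a length) scales as D^(-1/2). A hedged sketch of this idea, using a half-normal detection function; the parameter names are illustrative, not the notation of the talk:

```python
import math

def sigma_from_density(D, k):
    """Detection scale under the inverse home-range/density relation:
    sigma = k / sqrt(D), where k is a fitted constant (hypothetical
    name) so that a single parameter replaces separate sigmas across
    populations of differing density."""
    return k / math.sqrt(D)

def halfnormal_detection(d, g0, sigma):
    """Half-normal SECR detection function: probability of detection
    at distance d from an activity centre, with intercept g0."""
    return g0 * math.exp(-d**2 / (2 * sigma**2))
```

Under this parameterization a fourfold increase in density halves the detection scale, so one constant k describes populations of different density, which is where the gains in parsimony come from.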

Bryan Manly

Manly-Biostatistics Limited, Dunedin, New Zealand

Statistical consulting experiences in the USA: marine mammals, birds and fish in and around water

After 26 years teaching at the University of Otago in New Zealand I moved to the United States in 2000, and then retired back to New Zealand at the end of 2014. In the United States I worked as a statistical consultant for the company Western EcoSystems Technology Inc., which has a combination of biologists and statisticians working together on a wide variety of projects dealing with ecological and environmental problems. In this talk I will discuss the many projects that I worked on from 2000 to 2014, particularly those involving marine mammals, birds and fish around rivers and seas. These projects include sampling designs and analyses for the Alaska Marine Mammal Observer program; the development of models to predict the number of endangered delta smelt caught and killed in facilities built to pump water from the Sacramento-San Joaquin Delta in California for drinking water and agriculture; and advice to the U.S. Army Corps of Engineers on the design and analysis of studies to estimate the survival of fish passing through dams on the Columbia River in Washington, and on experiments on dam modifications to improve that survival.