Why least appealing? It’s the one I’m looking at.
Posts by Matthijs Hollanders
Cool. I only just wrapped my head around pkgdown, so if the tinyverse has a way of creating a website on GitHub Pages I'd be keen!
Would this work nicely with pkgdown or is the whole point to step away from that?
Single-season models are currently supported for both single-species and multispecies analyses. Dynamic occupancy models, where sites are surveyed over several disjunct periods, are currently under development. To read more about the model, see the model vignette. 5/5
mhollanders.github.io/occARU/artic...
Figure produced by occARU::plot_partitions(fit), showing the variance partitions (component-level scales) obtained by simplex-decomposing the variance of the detection (and optionally occupancy) linear predictor.
occARU accommodates site-level predictors for occupancy and detection, as well as site-by-survey predictors for detection rates. Continuous, categorical, and ordinal predictors are currently supported. Model complexity is handled via global-local shrinkage priors. 4/n
Figure produced by occARU::plot_surveys(fit), showing temporal detection rates for each species.
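For flavour, the global-local shrinkage idea in generic Stan: a standard horseshoe-style construction for illustration, not occARU's exact prior, with K (the number of predictors) assumed declared in the data block.

parameters {
  vector[K] beta_raw;
  vector<lower=0>[K] lambda_local;  // local scales: let big effects escape shrinkage
  real<lower=0> tau;                // global scale: pulls everything towards zero
}
transformed parameters {
  vector[K] beta = tau * (lambda_local .* beta_raw);
}
model {
  beta_raw ~ std_normal();
  lambda_local ~ student_t(3, 0, 1);  // half-t via the lower=0 constraint
  tau ~ student_t(3, 0, 1);
}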
occARU takes in standardised CamtrapDB data by default, optionally thins detections with a user-specified value, and aggregates observations into surveys of arbitrary length. This balances temporal resolution with computational demands. 3/n
Figure produced by occARU::plot_sites(fit), showing a map of site locations, shaped by occupancy status and sized by detection rates for each species.
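The thinning step, sketched generically in base R (my own illustration of waiting-time thinning, not occARU internals):

# keep a detection only if at least `wait` seconds have passed since the
# last kept detection; apply per site (and species) before aggregating
thin_by_waiting_time <- function(times, wait = 30 * 60) {
  keep <- logical(length(times))
  last <- -Inf
  for (i in order(times)) {
    if (as.numeric(times[i]) - last >= wait) {
      keep[i] <- TRUE
      last <- as.numeric(times[i])
    }
  }
  keep
}

e.g. detections[thin_by_waiting_time(detections$timestamp), ] to drop rapid-fire triggers of the same individual.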
occARU uses hierarchical multispecies Gaussian processes to infer spatial and temporal variation in detection rates, and estimates interspecific correlations in these processes. These effects are orthogonally projected to retain inference on fixed effects. 2/n
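The projection trick in generic form (how one residualises a random effect against the fixed-effect design matrix; a sketch, not occARU's implementation):

# make a random-effect draw u orthogonal to the column space of the
# fixed-effect design matrix X, so u can't absorb the fixed effects
Q <- qr.Q(qr(X))                  # orthonormal basis for col(X)
u_perp <- u - Q %*% (t(Q) %*% u)  # (I - QQ')u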
occARU: Bayesian (multispecies) occupancy models for ARU data using @mc-stan.org. Autonomous recording units (ARUs) like camera traps produce rich time series which warrant going beyond occupancy and focusing on detection rates. 1/n
mhollanders.github.io/occARU/
New pre-print! I cover a range of open capture-recapture models (single survey/robust design, multistate/multievent, and (Cormack-)Jolly-Seber variants) in Stan, and provide efficient log likelihood functions. I also introduce a method to account for unequal survey intervals in the entry process.
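For anyone new to these models, a textbook marginalised (Cormack-)Jolly-Seber log likelihood in Stan looks something like this. A generic sketch, not the pre-print's code: phi are survival probabilities between surveys, p are detection probabilities, and each history is assumed to contain at least one detection.

functions {
  // marginal CJS log likelihood for one capture history y (0/1 per survey)
  real cjs_lpmf(array[] int y, vector phi, vector p) {
    int T = size(y);
    int f = 1;
    int l = T;
    while (y[f] == 0) f += 1;  // first detection
    while (y[l] == 0) l -= 1;  // last detection
    vector[T] chi;  // chi[t]: Pr(never detected after t | alive at t)
    chi[T] = 1;
    for (i in 1:(T - 1)) {
      int t = T - i;
      chi[t] = (1 - phi[t]) + phi[t] * (1 - p[t + 1]) * chi[t + 1];
    }
    real lp = 0;
    for (t in (f + 1):l)  // condition on first detection
      lp += log(phi[t - 1]) + bernoulli_lpmf(y[t] | p[t]);
    return lp + log(chi[l]);
  }
}

Each individual then contributes target += cjs_lpmf(y[i] | phi, p), marginalising the latent alive/dead states in O(T).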
df_add_utms
df_arrange_by_site_and_survey
df_thinned_by_waiting_time
they look so bad, but I feel compelled to use base pipe
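i.e. chains like this (assuming each helper takes and returns a data frame):

detections |>
  df_add_utms() |>
  df_arrange_by_site_and_survey() |>
  df_thinned_by_waiting_time()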
are the bangers in the room with us now?
Lmao
How much carbon did this thread use?
Unironically no. Reflect the heat back in.
Shiny side in is for cooking to reflect the heat back to the food. For storage it’s irrelevant.
Same
what's the model? why not just random school effects with pooled scales?
Congrats!! Can't wait to read it.
it feels natural, but log(gamma(1, 1)) puts more prior mass on negative values, which seems counterintuitive, but perhaps it isn't as we're dealing with rates.
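A quick R check of that: the geometric mean of gamma(1, 1) sits below 1.

digamma(1)                    # E[log X] for X ~ gamma(1, 1): -0.577
mean(log(rgamma(1e6, 1, 1))) # Monte Carlo check, also ~ -0.577
exp(digamma(1))               # geometric mean ~ 0.56, not 1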
Yeah sorry I should've clarified, the idea would be to put a log Gamma prior on the coefficients, i.e.:
intercept ~ gamma(1, 1)
beta ~ gamma(1, 1)
lambda = exp(log(intercept) + log(beta) * x)
y ~ poisson(lambda)
Ecologists will literally use normal(0, 1000) priors for intercepts and slopes in logistic and Poisson models.
Haven't been here in ages so just gonna take the liberty of tagging @rmcelreath.bsky.social because I think he'd have some good thoughts on this.
In (Gaussian) linear models, we usually put normal priors on regression coefficients. How do we feel about putting gamma(a, a) priors on coefficients in something like Poisson regression? Using rates for rates seems nice; the prior means are still 1, meaning the priors are centered on no multiplicative effect.
logistic(0, 1) on the logit scale is literally uniform on [0, 1] on the probability scale.
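Quick R check via the probability integral transform:

# plogis() is the logistic CDF, so plogis(rlogis(n)) is exactly Uniform(0, 1)
hist(plogis(rlogis(1e5)), breaks = 50)  # flat histogram on [0, 1]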
Just a curious thing to do a week before FOMC, especially when cuts were predicted.
What was your reasoning?
IG