The first comment is that this is actually a non-trivial theoretical question: there is a rather long thread on r-sig-mixed-models that goes into some of the technical details; you should definitely have a look, even though it gets a bit scary. The basic issue is that the estimated coefficient value for each group is the sum of the fixed-effect parameter and the BLUP/conditional mode for that group; these are different classes of objects (one is a parameter, the other is the conditional mean of a random variable), which creates some technical difficulties.
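To see concretely what that sum looks like, here is a quick check with the sleepstudy example used below (nothing beyond lme4's standard accessors): the per-group coefficients returned by coef() are just the fixed effects plus the conditional modes from ranef().

library(lme4)
fm1 <- lmer(Reaction ~ Days + (Days | Subject), sleepstudy)
## per-group coefficients = fixed effects + conditional modes (BLUPs);
## this should return TRUE (up to numerical tolerance)
all.equal(as.matrix(coef(fm1)$Subject),
          sweep(as.matrix(ranef(fm1)$Subject), 2, fixef(fm1), "+"))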
The second point is that (unfortunately) I don't know of an easy way to do this in lme, so my answer uses lmer (from the lme4 package).
If you are comfortable doing the easiest thing and ignoring the (possibly ill-defined) covariance between the fixed-effect parameters and the BLUPs, you can use the code below.
Two alternatives would be (1) to fit your model with a Bayesian hierarchical approach (e.g. the MCMCglmm package) and compute the standard deviations of the posterior predictions for each level, or (2) to use parametric bootstrapping to recompute the BLUPs/conditional modes and take the standard deviations of the bootstrap distributions (a sketch of the bootstrap approach is given after the main example below).
Please remember that as usual this advice comes with no warranty.
library(lme4)
fm1 <- lmer(Reaction ~ Days + (Days | Subject), sleepstudy)
cc <- coef(fm1)$Subject
## variances of fixed effects
fixed.vars <- diag(vcov(fm1))
## extract variances of conditional modes
## (in more recent lme4 versions this attribute may be named "condVar" instead)
r1 <- ranef(fm1, condVar = TRUE)
cmode.vars <- t(apply(cv <- attr(r1[[1]], "postVar"), 3, diag))
seVals <- sqrt(sweep(cmode.vars,2,fixed.vars,"+"))
## combine the per-group estimates with their standard errors
res <- cbind(cc, seVals)
res2 <- setNames(res[, c(1, 3, 2, 4)],
                 c("int", "int_se", "slope", "slope_se"))
head(res2)
## int int_se slope slope_se
## 308 253.6637 13.86649 19.666258 2.7752
## 309 211.0065 13.86649 1.847583 2.7752
## 310 212.4449 13.86649 5.018406 2.7752
## 330 275.0956 13.86649 5.652955 2.7752
## 331 273.6653 13.86649 7.397391 2.7752
## 332 260.4446 13.86649 10.195115 2.7752
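For alternative (2) above, here is a minimal parametric-bootstrap sketch using bootMer, continuing with fm1 from above; the choice of nsim = 200 and the seed are arbitrary, coefFun is just a helper defined here to pull out the per-subject intercepts and slopes as a named vector, and I bootstrap the combined coefficients from coef() rather than the conditional modes alone.

## parametric bootstrap of the per-subject coefficients (illustrative settings)
coefFun <- function(fit) {
    cc <- coef(fit)$Subject
    setNames(c(cc[, "(Intercept)"], cc[, "Days"]),
             c(paste0("int_", rownames(cc)), paste0("slope_", rownames(cc))))
}
bb <- bootMer(fm1, coefFun, nsim = 200, seed = 101)
## bootstrap standard errors, one per subject-level intercept and slope
## (na.rm guards against the occasional refit failure)
boot.se <- setNames(apply(bb$t, 2, sd, na.rm = TRUE), names(bb$t0))
head(boot.se)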