gms | German Medical Science

MAINZ//2011: 56. GMDS-Jahrestagung und 6. DGEpi-Jahrestagung

Deutsche Gesellschaft für Medizinische Informatik, Biometrie und Epidemiologie e. V.
Deutsche Gesellschaft für Epidemiologie e. V.

26. - 29.09.2011 in Mainz

Log-normal frailty models fitted as Poisson generalized linear mixed models: New evidence from simulations

Meeting Abstract

  • Katharina Hirsch - Institut für Medizinische Epidemiologie, Biometrie und Informatik, Halle
  • Andreas Wienke - Institut für Medizinische Epidemiologie, Biometrie und Informatik, Halle
  • Oliver Kuss - Institut für Medizinische Epidemiologie, Biometrie und Informatik, Halle

Mainz//2011. 56. Jahrestagung der Deutschen Gesellschaft für Medizinische Informatik, Biometrie und Epidemiologie (gmds), 6. Jahrestagung der Deutschen Gesellschaft für Epidemiologie (DGEpi). Mainz, 26.-29.09.2011. Düsseldorf: German Medical Science GMS Publishing House; 2011. Doc11gmds105

doi: 10.3205/11gmds105, urn:nbn:de:0183-11gmds1054

Published: 20 September 2011

© 2011 Hirsch et al.
This article is an Open Access article and is subject to the terms of the Creative Commons license (http://creativecommons.org/licenses/by-nc-nd/3.0/deed.de). It may be reproduced, distributed, and made publicly available, provided that the author and source are cited.


Text

The equivalence of a survival model with a piecewise constant baseline hazard function and a Poisson regression model has been known at least since the 1970s. As shown recently by Feng et al. [1], [2], this equivalence carries over to the case of correlated survival data: a frailty model with a log-normal frailty term can be interpreted and estimated as a generalized linear mixed model (GLMM) with the binary event indicator as response, a Poisson likelihood, and a specific offset. Proceeding this way, statistical theory and software for GLMMs are readily available for fitting frailty models. Using, for example, SAS PROC GLIMMIX, we can estimate frailty models with several hierarchical levels and a wealth of possible covariance structures, and we can apply numerical integration algorithms for exact maximum likelihood estimation. This flexibility comes at the small price of (1) having to fix the number of pieces for the baseline hazard in advance and (2) having to “explode” the data set by the number of pieces, resulting in potentially very large data sets and thus longer computing times.
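The following SAS sketch illustrates this construction (it is not the %PCFrailty macro itself; the data set name surv, the cut points, and the variable names time, event, x1, x2, and cluster are assumptions made up for this example). Each survival record is first expanded into one row per piece in which the subject is at risk, with the per-piece event indicator as Poisson response and the log of the time at risk in the piece as offset; PROC GLIMMIX then fits the Poisson GLMM with a normal random intercept per cluster, i.e. a log-normal frailty.

    /* Explode each survival record into one row per piece at risk.        */
    /* Cut points are illustrative; in practice they are chosen from the   */
    /* observed event times (e.g. equal numbers of events per piece).      */
    %let ncut = 5;

    data exploded;
       set surv;                                  /* time, event, x1, x2, cluster */
       array cut{6} _temporary_ (0 1 2 3 4 1e8);  /* piece boundaries             */
       do piece = 1 to &ncut;
          if time > cut{piece} then do;           /* still at risk in this piece  */
             exposure = min(time, cut{piece+1}) - cut{piece};  /* time at risk    */
             d        = event * (time <= cut{piece+1});        /* event in piece? */
             logexp   = log(exposure);                         /* Poisson offset  */
             output;
          end;
       end;
    run;

    /* Poisson GLMM: piece-specific intercepts give the piecewise constant   */
    /* baseline hazard; the normal random intercept per cluster corresponds  */
    /* to the log-normal frailty. method=quad requests adaptive quadrature.  */
    proc glimmix data=exploded method=quad;
       class cluster piece;
       model d = piece x1 x2 / dist=poisson link=log offset=logexp solution;
       random intercept / subject=cluster;
    run;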

To assess the equivalence between the two models, we performed a simulation study. We generated data sets of correlated Gompertz-distributed event times with varying degrees of censoring and two fixed effects. Poisson GLMMs were fitted with different numbers of events per piece using our %PCFrailty macro and compared to (1) a frailty model with the true Gompertz hazard function, (2) a Cox model with random effects, and (3) a standard Cox model assuming uncorrelated event times. We report bias, mean squared error, and coverage probabilities and give recommendations on how to balance the number of pieces needed for a good fit against the size of the data set.
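For completeness, a minimal sketch of how correlated Gompertz event times with a shared log-normal frailty can be generated in SAS by inverting the cumulative hazard is given below; the parameter values, cluster sizes, effect sizes, and censoring distribution are arbitrary illustrative choices, not the settings of our simulation study.

    %let lambda = 0.1;     /* Gompertz scale                               */
    %let gamma  = 0.05;    /* Gompertz shape                               */
    %let sigma  = 0.5;     /* SD of the normal random effect (log-frailty) */

    data surv;
       call streaminit(20110926);                   /* arbitrary seed                */
       do cluster = 1 to 200;
          b = rand('normal', 0, &sigma);            /* cluster-specific log-frailty  */
          do subj = 1 to 4;
             x1  = rand('bernoulli', 0.5);
             x2  = rand('normal');
             eta = 0.5*x1 - 0.3*x2 + b;             /* linear predictor with frailty */
             u   = rand('uniform');
             /* invert H(t) = lambda*exp(eta)/gamma * (exp(gamma*t) - 1) = -log(u)   */
             t   = log(1 - &gamma*log(u)/(&lambda*exp(eta))) / &gamma;
             c   = rand('uniform') * 20;            /* uniform censoring time        */
             time  = min(t, c);
             event = (t <= c);
             output;
          end;
       end;
       keep cluster subj x1 x2 time event;
    run;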


References

1. Feng S, Wolfe RA, Port FK. Frailty survival model analysis of the national deceased donor kidney transplant dataset using Poisson variance structures. Journal of the American Statistical Association. 2005;100:728-735.
2. Feng S, Nie L, Wolfe R. Laplace's approximation for relative risk frailty models. Lifetime Data Analysis. 2009;15:343-356.
3. Vaida F, Xu R. Proportional hazards model with random effects. Statistics in Medicine. 2000;19(24):3309-3324.