
## How to implement the nonnegative garrote in R using glmnet


Published: September 6, 2019

In his seminal paper from 1996, Tibshirani introduced the *lasso*, the “least absolute shrinkage and selection operator” [1]. Since then, a tremendous amount of research has been conducted, and as of April 2019, about 200 lasso-related R-packages are available on CRAN.

When devising the lasso, Tibshirani was inspired by Leo Breiman’s *nonnegative garrote* [2]: Just like the lasso, the nonnegative garrote is a procedure that shrinks coefficients and selects covariates in multivariable regression models. In contrast to the lasso, however, the nonnegative garrote is a two-step procedure: In the first step, the ordinary least squares estimate is computed. In the second step, nonnegative shrinkage factors are computed by solving a constrained optimization problem. The nonnegative garrote estimate is then defined as the elementwise product of the least squares estimate and the nonnegative shrinkage factors. Although the nonnegative garrote has been shown to possess several desirable theoretical properties, such as variable selection consistency [3], [4], not a single R-package on CRAN could be identified that allows for direct use of the nonnegative garrote in practice (as of April 2019).
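The two steps above can be sketched with glmnet's box constraints: rescaling each column of the design matrix by its OLS coefficient turns the garrote's constrained problem into a lasso-type fit whose coefficients, forced to be nonnegative via `lower.limits = 0`, are the shrinkage factors. The simulated data, variable names, and the fixed `lambda` value are illustrative assumptions, not part of the original poster:

```r
# Sketch: nonnegative garrote via glmnet (assumed variable names, simulated data)
library(glmnet)

set.seed(1)
n <- 100; p <- 5
X <- matrix(rnorm(n * p), n, p)
y <- X %*% c(2, -1.5, 0, 0, 1) + rnorm(n)

# Step 1: ordinary least squares estimate (intercept dropped)
beta_ols <- coef(lm(y ~ X))[-1]

# Step 2: nonnegative shrinkage factors.
# Multiply each column of X by its OLS coefficient; a lasso fit on Z with
# nonnegativity constraints (lower.limits = 0) then estimates the factors c.
Z <- sweep(X, 2, beta_ols, `*`)
fit <- glmnet(Z, y, lower.limits = 0, lambda = 0.05, standardize = FALSE)
c_hat <- as.vector(coef(fit))[-1]

# Nonnegative garrote estimate: elementwise product of OLS estimate and factors
beta_garrote <- c_hat * beta_ols
```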

In this poster, we present an implementation of the nonnegative garrote based on the R-package *glmnet* [5]. We present the corresponding R code and discuss different approaches to determining an optimal value for the hyperparameter that controls the amount of shrinkage. By means of simulations, we illustrate that the nonnegative garrote can compete with the lasso in terms of prediction accuracy and can even outperform the lasso with respect to the identification of relevant effects. Finally, we discuss how the presented strategy can be adapted to implement the nonnegative garrote for generalized linear models or the Cox proportional hazards model.
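One standard way to choose the shrinkage hyperparameter is k-fold cross-validation over the lambda path, which `cv.glmnet` supports directly while respecting the nonnegativity constraint. The sketch below is self-contained; the simulated data and variable names are assumptions for illustration:

```r
# Sketch: selecting the shrinkage hyperparameter lambda by cross-validation
library(glmnet)

set.seed(1)
n <- 100; p <- 5
X <- matrix(rnorm(n * p), n, p)
y <- X %*% c(2, -1.5, 0, 0, 1) + rnorm(n)

beta_ols <- coef(lm(y ~ X))[-1]     # step 1: OLS estimate
Z <- sweep(X, 2, beta_ols, `*`)     # design matrix rescaled by OLS coefficients

# cv.glmnet runs k-fold cross-validation over a lambda sequence while
# lower.limits = 0 keeps the shrinkage factors nonnegative
cvfit <- cv.glmnet(Z, y, lower.limits = 0, standardize = FALSE)
c_hat <- as.vector(coef(cvfit, s = "lambda.min"))[-1]
beta_garrote <- c_hat * beta_ols
```

A plausible route to the extensions mentioned above, under the same assumptions, is to replace `lm()` in step 1 with `glm()` or `survival::coxph()` and to pass the matching `family` argument (e.g. `family = "binomial"` or `family = "cox"`) to `cv.glmnet`.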

The authors declare that they have no competing interests.

The authors declare that an ethics committee vote is not required.

### References

1. Tibshirani R. Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society: Series B (Methodological). 1996;58(1):267-288.
2. Breiman L. Better subset regression using the nonnegative garrote. Technometrics. 1995;37(4):373-384.
3. Zou H. The adaptive lasso and its oracle properties. Journal of the American Statistical Association. 2006;101(476):1418-1429.
4. Yuan M, Lin Y. On the non-negative garrotte estimator. Journal of the Royal Statistical Society: Series B (Statistical Methodology). 2007;69(2):143-161.
5. Friedman J, Hastie T, Tibshirani R. Regularization paths for generalized linear models via coordinate descent. Journal of Statistical Software. 2010;33(1):1.