Knockoff boosted tree for model-free variable selection
Author(s) -
Tao Jiang,
Yuanyuan Li,
Alison A. Motsinger-Reif
Publication year - 2020
Publication title -
Bioinformatics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 3.599
H-Index - 390
eISSN - 1367-4811
pISSN - 1367-4803
DOI - 10.1093/bioinformatics/btaa770
Subject(s) - tree (set theory) , selection (genetic algorithm) , variable (mathematics) , computer science , feature selection , computational biology , artificial intelligence , biology , mathematics , combinatorics , mathematical analysis
The recently proposed knockoff filter is a general framework for controlling the false discovery rate (FDR) when performing variable selection. This powerful approach generates a 'knockoff' of each tested variable: an imitation variable that mimics the correlation structure of the original variables and serves as a negative control for statistical inference, enabling exact FDR control. Current applications of knockoff methods use linear regression models and conduct variable selection only for variables that appear in the model function. Here, we extend the use of knockoffs to machine learning with boosted trees, which are successful and widely used in problems where no prior knowledge of the model function is required. However, the importance scores currently available in tree models are insufficient for variable selection with FDR control.
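The knockoff idea the abstract describes can be illustrated with a minimal sketch. The code below is not the authors' method; it is a simplified model-X knockoff procedure under the assumption of independent standard Gaussian features (where a fresh draw from the same distribution is a valid knockoff), using a scikit-learn gradient boosted tree's built-in impurity importances as the per-variable scores and the knockoff+ threshold for FDR control. All names and parameter choices here are illustrative assumptions.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Simulated data: 200 samples, 20 independent Gaussian features,
# only the first 5 features influence the response.
n, p = 200, 20
X = rng.standard_normal((n, p))
y = X[:, :5] @ np.array([3.0, -2.5, 2.0, -1.5, 1.0]) + rng.standard_normal(n)

# Model-X knockoffs: with independent Gaussian features, a valid
# knockoff copy is simply a fresh draw from the same distribution.
# (Correlated features would require e.g. second-order Gaussian knockoffs.)
X_knock = rng.standard_normal((n, p))

# Fit a boosted tree on [X, X_knock] and read off impurity importances.
model = GradientBoostingRegressor(n_estimators=200, random_state=0)
model.fit(np.hstack([X, X_knock]), y)
imp = model.feature_importances_

# Knockoff statistic: original importance minus knockoff importance.
# Large positive W_j is evidence that variable j is a true signal.
W = imp[:p] - imp[p:]

# Knockoff+ threshold for a target FDR level q.
q = 0.2
T = np.inf
for t in np.sort(np.abs(W[W != 0])):
    fdp_estimate = (1 + np.sum(W <= -t)) / max(1, np.sum(W >= t))
    if fdp_estimate <= q:
        T = t
        break

selected = np.where(W >= T)[0]
print("selected variables:", selected)
```

The key point, as the abstract notes, is the statistic `W`: raw tree importances alone do not control FDR, but contrasting each variable's importance against its knockoff's yields a statistic whose sign is symmetric for null variables, which is what the knockoff threshold exploits.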