3 papers across 2 sessions
We provide confidence intervals, prediction intervals, and hypothesis tests for variable importance in boosting with dropout and parallel training.
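As context for the dropout mechanism this paper builds on, here is a minimal sketch of DART-style dropout in gradient boosting: each round, a random subset of previously fitted trees is dropped when computing residuals, and the new and dropped trees are rescaled so the ensemble's expected output is preserved. This is a generic illustration only, not the paper's inference procedure; the function names and hyperparameters (`dart_boost`, `drop_rate`, stump depth) are illustrative assumptions.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def dart_boost(X, y, n_rounds=50, drop_rate=0.2, seed=0):
    """Sketch of DART-style boosting with dropout (illustrative, not the paper's method)."""
    rng = np.random.default_rng(seed)
    trees, weights = [], []
    for _ in range(n_rounds):
        # Randomly drop some previously fitted trees this round.
        keep = np.array([rng.random() >= drop_rate for _ in trees], dtype=bool)
        pred = np.zeros(len(y))
        for tree, w, k in zip(trees, weights, keep):
            if k:
                pred += w * tree.predict(X)
        # Fit a shallow tree to the residual of the non-dropped ensemble.
        residual = y - pred
        tree = DecisionTreeRegressor(max_depth=2).fit(X, residual)
        n_drop = int((~keep).sum())
        # DART normalization: scale the new tree by 1/(k+1) and each dropped
        # tree by k/(k+1), keeping the expected ensemble output unchanged.
        for i in np.where(~keep)[0]:
            weights[i] *= n_drop / (n_drop + 1)
        trees.append(tree)
        weights.append(1.0 / (n_drop + 1))
    return trees, weights

def dart_predict(trees, weights, X):
    # The final predictor uses the full (rescaled) ensemble.
    return sum(w * t.predict(X) for t, w in zip(trees, weights))
```

The dropout step decorrelates trees much like dropout in neural networks, which is one reason inference (intervals, tests) for such ensembles needs dedicated theory.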
We establish the sample complexity of agnostic boosting up to logarithmic factors by providing novel upper and lower bounds.
This paper presents robust minimax boosting (RMBoost), a method that minimizes worst-case error probabilities, is robust to general types of label noise, and provides finite-sample performance guarantees under label noise.