Provable guarantees for decision tree induction: the agnostic setting
We give strengthened provable guarantees on the performance of widely employed and empirically successful top-down decision tree learning heuristics. While prior works have focused on the realizable setting, we consider the more realistic and challenging agnostic setting. We show that for all monotone target functions f and size parameters s∈ℕ, these heuristics construct a decision tree of size s^Õ((log s)/ε²) that achieves error ≤ 𝗈𝗉𝗍_s + ε, where 𝗈𝗉𝗍_s denotes the error of the optimal size-s decision tree for f. Previously, such a guarantee was not known to be achievable by any algorithm, even one not based on top-down heuristics. We complement our algorithmic guarantee with a near-matching s^Ω̃(log s) lower bound.
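The heuristics in question grow a tree greedily from the root, repeatedly splitting on the feature that most reduces an impurity measure. As a minimal illustrative sketch only (Gini impurity over Boolean features, with a depth cap standing in for the size budget; the helper names `gini`, `best_split`, `grow`, and `predict` are hypothetical, and this is not the paper's exact algorithm or impurity criterion):

```python
import itertools

def gini(labels):
    # Gini impurity of a list of 0/1 labels; 0.0 for a pure (or empty) leaf.
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2 * p * (1 - p)

def best_split(X, y, used):
    # Pick the unused feature whose split most reduces weighted impurity.
    n, base = len(y), gini(y)
    best, best_gain = None, 0.0
    for i in range(len(X[0])):
        if i in used:
            continue
        left = [y[j] for j in range(n) if X[j][i] == 0]
        right = [y[j] for j in range(n) if X[j][i] == 1]
        gain = base - (len(left) / n) * gini(left) - (len(right) / n) * gini(right)
        if gain > best_gain:
            best, best_gain = i, gain
    return best

def grow(X, y, depth=3, used=frozenset()):
    # Greedy top-down construction: stop at the depth cap or a pure node,
    # otherwise recurse on the two halves of the best split.
    maj = int(2 * sum(y) >= len(y))  # majority label at this node
    if depth == 0 or gini(y) == 0.0:
        return maj
    i = best_split(X, y, used)
    if i is None:
        return maj
    L = [(x, lab) for x, lab in zip(X, y) if x[i] == 0]
    R = [(x, lab) for x, lab in zip(X, y) if x[i] == 1]
    lo = grow([x for x, _ in L], [lab for _, lab in L], depth - 1, used | {i})
    hi = grow([x for x, _ in R], [lab for _, lab in R], depth - 1, used | {i})
    return (i, lo, hi)  # internal node: (feature index, left subtree, right subtree)

def predict(tree, x):
    # Walk the tree: tuples are internal nodes, ints are leaf labels.
    while isinstance(tree, tuple):
        i, lo, hi = tree
        tree = hi if x[i] else lo
    return tree
```

For instance, on the (monotone) conjunction f(x) = x₀ ∧ x₁ over three Boolean inputs, the greedy builder recovers an exact tree within depth 3.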