Active Covering
We analyze the problem of active covering, where the learner is given an unlabeled dataset and can sequentially query examples for labels. The objective is to label all of the positive examples using the fewest total label queries. We show, under standard non-parametric assumptions, that a classical support estimator can be repurposed as an offline algorithm attaining an excess query cost of Θ(n^(D/(D+1))) compared to the optimal learner, where n is the number of datapoints and D is the dimension. We then provide a simple active learning method that attains an improved excess query cost of O(n^((D-1)/D)). Furthermore, the proposed algorithms only require access to the positively labeled examples, which in certain settings provides additional computational and privacy benefits. Finally, we show that the active learning method consistently outperforms offline methods, as well as a variety of baselines, on a wide range of image-based benchmark datasets.
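The abstract does not spell out the querying procedure, so the following is only a minimal sketch of the problem setting, not the paper's algorithm: a greedy loop that expands label queries around already-discovered positives (motivated by the smoothness assumptions under which positives cluster spatially). The oracle(i) label interface and the known positive count n_pos used as a stopping rule are illustrative assumptions introduced here.

```python
import numpy as np

def active_covering_sketch(X, oracle, n_pos, seed=0):
    """Illustrative greedy loop for the active covering setting:
    sequentially query labels until all positives are found.

    X      : (n, D) array of unlabeled points.
    oracle : oracle(i) -> bool, the (costly) label query for X[i].
    n_pos  : total number of positives; assumed known here only to
             give the loop a stopping rule (a simplification, not
             part of the paper's setting).
    Returns the list of queried indices, in query order.
    """
    rng = np.random.default_rng(seed)
    unqueried = list(range(len(X)))
    positives, history = [], []

    while len(positives) < n_pos:
        if positives:
            # Expand around known positives: query the unqueried
            # point closest to any labeled positive. Note this step
            # only uses the positive labels, matching the abstract's
            # remark that only positive examples need be accessed.
            dists = np.min(
                np.linalg.norm(
                    X[unqueried][:, None, :] - X[positives][None, :, :],
                    axis=-1,
                ),
                axis=1,
            )
            i = unqueried.pop(int(np.argmin(dists)))
        else:
            # No positive discovered yet: query uniformly at random.
            i = unqueried.pop(int(rng.integers(len(unqueried))))
        history.append(i)
        if oracle(i):
            positives.append(i)
    return history
```

The nearest-neighbor expansion rule here is one simple instantiation of "query near known positives"; the paper's offline and active methods, and their Θ(n^(D/(D+1))) and O(n^((D-1)/D)) excess-query-cost guarantees, are developed in the full text.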