Teacher Improves Learning by Selecting a Training Subset

Yuzhe Ma, Robert Nowak, Philippe Rigollet, Xuezhou Zhang, Xiaojin Zhu
Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, PMLR 84:1366-1375, 2018.

Abstract

We call a learner super-teachable if a teacher can trim down an iid training set while making the learner learn even better. We provide sharp super-teaching guarantees on two learners: the maximum likelihood estimator for the mean of a Gaussian, and the large margin classifier in 1D. For general learners, we provide a mixed-integer nonlinear programming-based algorithm to find a super-teaching set. Empirical experiments show that our algorithm is able to find good super-teaching sets for both regression and classification problems.
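
To make the super-teaching idea concrete, the sketch below is an illustrative Python snippet (not the paper's algorithm; the sample size, subset size, and greedy selection rule are arbitrary choices for illustration). It shows how a teacher who knows the true Gaussian mean can trim an iid sample so that the learner's MLE, i.e., the sample mean of whatever points it is given, typically lands closer to the truth than the MLE computed on the full data set.

import numpy as np

# Illustrative sketch of super-teaching for the Gaussian-mean MLE.
# Not the paper's MINLP algorithm; a simple greedy heuristic for intuition.
rng = np.random.default_rng(0)
true_mean, sigma, n = 0.0, 1.0, 20
data = rng.normal(true_mean, sigma, size=n)   # iid training set

# Learner: MLE of the mean = sample mean of the data it receives.
mle_full = data.mean()

# Teacher: knows true_mean and keeps the k points closest to it,
# one simple way to "trim down" the training set.
k = 10
order = np.argsort(np.abs(data - true_mean))
teaching_set = data[order[:k]]
mle_taught = teaching_set.mean()

print(f"|MLE(full)   - true mean| = {abs(mle_full - true_mean):.4f}")
print(f"|MLE(taught) - true mean| = {abs(mle_taught - true_mean):.4f}")  # typically smaller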

Cite this Paper


BibTeX
@InProceedings{pmlr-v84-ma18a,
  title     = {Teacher Improves Learning by Selecting a Training Subset},
  author    = {Ma, Yuzhe and Nowak, Robert and Rigollet, Philippe and Zhang, Xuezhou and Zhu, Xiaojin},
  booktitle = {Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics},
  pages     = {1366--1375},
  year      = {2018},
  editor    = {Storkey, Amos and Perez-Cruz, Fernando},
  volume    = {84},
  series    = {Proceedings of Machine Learning Research},
  month     = {09--11 Apr},
  publisher = {PMLR},
  pdf       = {http://proceedings.mlr.press/v84/ma18a/ma18a.pdf},
  url       = {https://proceedings.mlr.press/v84/ma18a.html},
  abstract  = {We call a learner super-teachable if a teacher can trim down an iid training set while making the learner learn even better. We provide sharp super-teaching guarantees on two learners: the maximum likelihood estimator for the mean of a Gaussian, and the large margin classifier in 1D. For general learners, we provide a mixed-integer nonlinear programming-based algorithm to find a super teaching set. Empirical experiments show that our algorithm is able to find good super-teaching sets for both regression and classification problems.}
}
EndNote
%0 Conference Paper
%T Teacher Improves Learning by Selecting a Training Subset
%A Yuzhe Ma
%A Robert Nowak
%A Philippe Rigollet
%A Xuezhou Zhang
%A Xiaojin Zhu
%B Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2018
%E Amos Storkey
%E Fernando Perez-Cruz
%F pmlr-v84-ma18a
%I PMLR
%P 1366--1375
%U https://proceedings.mlr.press/v84/ma18a.html
%V 84
%X We call a learner super-teachable if a teacher can trim down an iid training set while making the learner learn even better. We provide sharp super-teaching guarantees on two learners: the maximum likelihood estimator for the mean of a Gaussian, and the large margin classifier in 1D. For general learners, we provide a mixed-integer nonlinear programming-based algorithm to find a super teaching set. Empirical experiments show that our algorithm is able to find good super-teaching sets for both regression and classification problems.
APA
Ma, Y., Nowak, R., Rigollet, P., Zhang, X., & Zhu, X. (2018). Teacher Improves Learning by Selecting a Training Subset. Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 84:1366-1375. Available from https://proceedings.mlr.press/v84/ma18a.html.
