ABC: Efficient Selection of Machine Learning Configuration on Large Dataset

A machine learning configuration refers to a combination of preprocessor, learner, and hyperparameters. Given a set of configurations and a large dataset randomly split into training and testing sets, we study how to efficiently select the best configuration, i.e., the one with approximately the highest testing accuracy when trained on the training set. To guarantee small accuracy loss, we develop a solution that uses a confidence interval (CI)-based progressive sampling and pruning strategy. Compared to using the full data to find the exact best configuration, our solution achieves more than two orders of magnitude speedup, while the returned top configuration has identical or nearly identical test accuracy.
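To illustrate the general idea of CI-based progressive sampling and pruning, the following is a minimal sketch, not the paper's actual algorithm: the sample-size schedule, the Hoeffding-style confidence radius, and the helper names (`select_best`, `train_and_score`, `hoeffding_radius`) are all illustrative assumptions. Each surviving configuration is evaluated on progressively larger samples, and a configuration is pruned once its upper confidence bound falls below the current best configuration's lower confidence bound.

```python
# Sketch of CI-based progressive sampling and pruning (illustrative only;
# the paper's exact pruning rule and sampling schedule may differ).
import math
import random


def hoeffding_radius(n, delta=0.05):
    """Half-width of a Hoeffding confidence interval for the mean of
    n accuracy observations bounded in [0, 1]."""
    return math.sqrt(math.log(2.0 / delta) / (2.0 * n))


def select_best(configs, train_and_score, sample_sizes):
    """Train each surviving configuration on growing samples and prune
    those that are confidently worse than the current leader."""
    survivors = list(configs)
    scores = {}
    for n in sample_sizes:
        for cfg in survivors:
            # Accuracy estimate for cfg trained/evaluated on a sample of size n.
            scores[cfg] = train_and_score(cfg, n)
        radius = hoeffding_radius(n)
        best_lower = max(scores[c] for c in survivors) - radius
        # Keep only configurations whose upper bound still reaches the leader's lower bound.
        survivors = [c for c in survivors if scores[c] + radius >= best_lower]
        if len(survivors) == 1:
            break
    return max(survivors, key=lambda c: scores[c])


# Toy usage with synthetic accuracies: noise shrinks as the sample grows.
if __name__ == "__main__":
    true_acc = {"cfg_a": 0.70, "cfg_b": 0.80, "cfg_c": 0.82}

    def train_and_score(cfg, n):
        return true_acc[cfg] + random.gauss(0, 1.0 / math.sqrt(n))

    print(select_best(true_acc.keys(), train_and_score, [100, 400, 1600, 6400]))
```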
