News & Events

CVPR2021 NAS Competition

There is ample evidence that Neural Architecture Search (NAS) can produce excellent models for ML tasks on well-known datasets such as CIFAR-10 and ImageNet, where years of research have established a set of best practices for achieving good results. Far less attention, however, has been devoted to the "real-world" use case of NAS: searching for a state-of-the-art architecture on an entirely novel task or dataset. In that setting there is no existing set of best practices to build from, nor extensive research into optimal architectural patterns, augmentation policies, or hyperparameter selection. In essence, we are asking how well NAS algorithms work "out-of-the-box", with little to no time for tuning. To explore this question, we have designed this competition to evaluate NAS algorithms on unseen, novel tasks and datasets, while specifically eliminating outside influences such as custom pre-training schedules, hyperparameter optimization, or data augmentation policies.

In this competition, we ask participants to produce a NAS algorithm that, when given an unseen task and dataset, outputs a well-performing, robust PyTorch architecture. Evaluation results will be returned to participants for inclusion in their papers.
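As a rough illustration of what such an entry amounts to, a submission can be thought of as a routine that consumes an unseen dataset and returns a PyTorch nn.Module. The sketch below is purely illustrative: the function name, arguments, and the fixed baseline model are assumptions for exposition, not the competition's actual interface or a real search algorithm.

    # Hypothetical sketch only: names, signature, and the baseline model are
    # illustrative assumptions, not the official competition API.
    import torch
    import torch.nn as nn
    from torch.utils.data import DataLoader


    def search_architecture(train_loader: DataLoader,
                            num_classes: int,
                            time_budget_s: float) -> nn.Module:
        """Given an unseen dataset and a search budget, return a PyTorch model.

        A real NAS entry would explore candidate architectures within the
        budget; this placeholder ignores the budget and returns a small
        baseline CNN so the expected input/output shape of an entry is clear.
        """
        # Assumes the loader yields (images, labels) batches, as in a
        # standard image-classification setup.
        sample_images, _ = next(iter(train_loader))
        in_channels = sample_images.shape[1]

        # Placeholder "search result": a fixed baseline instead of an actual search.
        model = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(32, num_classes),
        )
        return model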
