Local Search is a Remarkably Strong Baseline for Neural Architecture Search
Authors
Abstract
Neural Architecture Search (NAS), i.e., the automation of neural network design, has gained much popularity in recent years, with increasingly complex search algorithms being proposed. Yet, solid comparisons with simple baselines are often missing. At the same time, recent retrospective studies have found many new algorithms to be no better than random search (RS). In this work we consider the use of a simple Local Search (LS) algorithm for NAS. We particularly consider a multi-objective NAS formulation, with network accuracy and network complexity as two objectives, as understanding the trade-off between these objectives is arguably among the most interesting aspects of NAS. The proposed LS algorithm is compared with RS and with evolutionary algorithms (EAs), as the latter are often heralded as ideal for multi-objective optimization. To promote reproducibility, we create and release two benchmark datasets, named MacroNAS-C10 and MacroNAS-C100, containing 200K saved network evaluations for two established image classification tasks, CIFAR-10 and CIFAR-100. Our benchmarks are designed to be complementary to existing benchmarks, especially in that they are better suited for multi-objective search. We additionally consider a version of the problem with a larger architecture space. While we find and show that the considered algorithms explore the search space in fundamentally different ways, we also find that LS substantially outperforms RS and even performs nearly as well as state-of-the-art EAs. We believe this provides strong evidence that LS is a truly competitive baseline for NAS, against which new NAS algorithms should be benchmarked.
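To make the multi-objective local-search idea concrete, below is a minimal sketch in Python. It assumes a hypothetical fixed-length categorical encoding of architectures and a hypothetical `evaluate(arch)` function (in the released benchmarks this would be a lookup into the saved MacroNAS-C10/-C100 evaluation tables), and it uses plain Pareto dominance as the acceptance rule; the paper's exact LS procedure and encoding may differ.

```python
import random

# Hypothetical encoding: 14 categorical variables with 3 options each,
# loosely mirroring a macro-level search space; exact sizes are assumptions.
NUM_VARS = 14
NUM_OPTIONS = 3

def evaluate(arch):
    """Stand-in for a benchmark-table lookup (hypothetical).

    Returns (accuracy, complexity); accuracy is maximized, complexity
    minimized. Replace this with the real saved-evaluation query.
    """
    rng = random.Random(hash(tuple(arch)))  # deterministic per architecture
    accuracy = rng.random()
    complexity = sum(arch) / (NUM_VARS * (NUM_OPTIONS - 1))
    return accuracy, complexity

def dominates(a, b):
    """Pareto dominance: a is no worse than b in both objectives and
    strictly better in at least one."""
    return a != b and a[0] >= b[0] and a[1] <= b[1]

def local_search(seed):
    """First-improvement hill climbing over single-variable neighbours,
    rescanning all variables until no dominating neighbour remains."""
    current = list(seed)
    current_obj = evaluate(current)
    improved = True
    while improved:
        improved = False
        for i in random.sample(range(NUM_VARS), NUM_VARS):  # random scan order
            for option in range(NUM_OPTIONS):
                if option == current[i]:
                    continue
                neighbour = current[:i] + [option] + current[i + 1:]
                obj = evaluate(neighbour)
                if dominates(obj, current_obj):
                    current, current_obj = neighbour, obj
                    improved = True
    return current, current_obj

if __name__ == "__main__":
    random.seed(0)
    start = [random.randrange(NUM_OPTIONS) for _ in range(NUM_VARS)]
    arch, (acc, comp) = local_search(start)
    print(f"Pareto-local optimum: {arch}  acc={acc:.3f}  complexity={comp:.3f}")
```

In practice one would run such a climb from many random starting points and keep an archive of all non-dominated architectures found, so that the accuracy/complexity trade-off front is approximated rather than a single solution returned.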
Similar resources
A novel local search method for microaggregation
In this paper, we propose an effective microaggregation algorithm to produce more useful protected data for publishing. Microaggregation is mapped to a clustering problem with known minimum and maximum group size constraints. In this scheme, the goal is to cluster n records into groups of at least k and at most 2k−1 records, such that the sum of the within-group squ...
A Differential Evolution and Spatial Distribution based Local Search for Training Fuzzy Wavelet Neural Network
Many parameter-tuning algorithms have been proposed for training Fuzzy Wavelet Neural Networks (FWNNs). Absence of an appropriate structure, convergence to local optima, and low learning speed are deficiencies of FWNNs in previous studies. In this paper, a Memetic Algorithm (MA) is introduced to train FWNNs and address the aforementioned learning deficiencies. Differential Evolution...
Progressive Neural Architecture Search
We propose a method for learning CNN structures that is more efficient than previous approaches: instead of using reinforcement learning (RL) or genetic algorithms (GA), we use a sequential model-based optimization (SMBO) strategy, in which we search for architectures in order of increasing complexity, while simultaneously learning a surrogate function to guide the search, similar to A* search....
Differentiable Neural Network Architecture Search
The successes of deep learning in recent years have been fueled by the development of innovative new neural network architectures. However, the design of a neural network architecture remains a difficult problem, requiring significant human expertise as well as computational resources. In this paper, we propose a method for transforming a discrete neural network architecture space into a continu...
Exploring Neural Architecture Search for Language Tasks
Neural architecture search (NAS), the task of finding neural architectures automatically, has recently emerged as a promising approach for discovering better models than ones designed by humans alone. However, most success stories are for vision tasks, and success has been quite limited for text, except for small language modeling datasets. In this paper, we explore NAS for text sequences at scale, b...
Journal
Journal title: Lecture Notes in Computer Science
Year: 2021
ISSN: 1611-3349, 0302-9743
DOI: https://doi.org/10.1007/978-3-030-72062-9_37