NASGEM: Neural Architecture Search via Graph Embedding Method

Authors

Abstract

Neural Architecture Search (NAS) automates and prospers the design of neural networks. Estimator-based NAS has been proposed recently to model the relationship between architectures and their performance, enabling scalable and flexible search. However, existing estimator-based methods encode the architecture into a latent space without considering graph similarity. Ignoring similarity in a node-based search space may induce a large inconsistency between similar graphs and their distance in the continuous encoding space, leading to inaccurate representation and/or reduced representation capacity that can yield sub-optimal search results. To preserve the correlation information in the encoding, we propose NASGEM, which stands for Neural Architecture Search via Graph Embedding Method. NASGEM is driven by a novel graph embedding method equipped with similarity measures to capture graph topology information. By precisely estimating graph distance using an auxiliary Weisfeiler-Lehman kernel to guide the encoding, NASGEM can utilize additional structural information to obtain a more accurate graph representation and improve search efficiency. GEMNet, a set of networks discovered by NASGEM, consistently outperforms networks crafted by existing methods in classification tasks, i.e., 0.4%-3.6% higher accuracy while having 11%-21% fewer Multiply-Accumulates. We further transfer GEMNet to COCO object detection. In both one-stage and two-stage detectors, our GEMNet surpasses its manually-crafted and automatically-searched counterparts.
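The abstract leans on the Weisfeiler-Lehman (WL) kernel as a measure of graph similarity. As a hedged illustration only (this is not the paper's implementation; the adjacency-list input format and the function names `wl_relabel`/`wl_kernel` are assumptions), a minimal WL subtree kernel can be sketched as:

```python
# Minimal sketch of a Weisfeiler-Lehman subtree kernel (illustrative;
# NASGEM uses a WL kernel to guide its graph encoder, but not this code).
from collections import Counter

def wl_relabel(adj, labels, iterations=2):
    """Refine node labels by hashing each node's label together with the
    sorted multiset of its neighbors' labels; return the label histogram
    at every iteration (including the initial one)."""
    labels = list(labels)
    histograms = [Counter(labels)]
    for _ in range(iterations):
        new_labels = []
        for v, neighbors in enumerate(adj):
            signature = (labels[v], tuple(sorted(labels[u] for u in neighbors)))
            new_labels.append(hash(signature))
        labels = new_labels
        histograms.append(Counter(labels))
    return histograms

def wl_kernel(adj1, labels1, adj2, labels2, iterations=2):
    """WL subtree kernel: sum over iterations of the dot product between
    the two graphs' label histograms."""
    h1 = wl_relabel(adj1, labels1, iterations)
    h2 = wl_relabel(adj2, labels2, iterations)
    return sum(sum(c1[k] * c2[k] for k in c1) for c1, c2 in zip(h1, h2))
```

Because the relabeling hashes local neighborhoods, two graphs with similar topology share many refined labels and thus get a larger kernel value, which is the structural signal an encoder can be trained against.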


Similar articles

Explicit Semantic Ranking for Academic Search via Knowledge Graph Embedding

This paper introduces Explicit Semantic Ranking (ESR), a new ranking technique that leverages knowledge graph embedding. Analysis of the query log from our academic search engine, SemanticScholar.org, reveals that a major error source is its inability to understand the meaning of research concepts in queries. To address this challenge, ESR represents queries and documents in the entity space ...


Efficient Neural Architecture Search via Parameter Sharing

We propose Efficient Neural Architecture Search (ENAS), a fast and inexpensive approach for automatic model design. In ENAS, a controller discovers neural network architectures by searching for an optimal subgraph within a large computational graph. The controller is trained with policy gradient to select a subgraph that maximizes the expected reward on a validation set. Meanwhile the model cor...


Graph Augmentation via Metric Embedding

Kleinberg [17] proposed in 2000 the first random graph model to reproduce small-world navigability, i.e. the ability to greedily discover polylogarithmic routes between any pair of nodes in a graph with only partial knowledge of distances. Following this seminal work, a major challenge was to extend this model to larger classes of graphs than regular meshes, introducing the concept...


Hybed: Hyperbolic Neural Graph Embedding

Neural embeddings have been used with great success in Natural Language Processing (NLP). They provide compact representations that encapsulate word similarity and attain state-of-the-art performance in a range of linguistic tasks. The success of neural embeddings has prompted significant amounts of research into applications in domains other than language. One such domain is graph-structured d...


Graph Kernels via Functional Embedding

We propose a representation of a graph as a functional object derived from the power iteration of the underlying adjacency matrix. The proposed functional representation is a graph invariant, i.e., the functional remains unchanged under any reordering of the vertices. This property eliminates the difficulty of handling exponentially many isomorphic forms. A Bhattacharyya kernel constructed between ...
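The invariance property described above can be seen in a toy form: applying powers of the adjacency matrix to the all-ones vector counts walks per node, and sorting the result removes any dependence on vertex ordering. This is only a hedged sketch under those assumptions, not the paper's actual construction (the function name `functional_representation` and the `steps` parameter are hypothetical):

```python
# Illustrative power-iteration feature for a graph (NOT the cited paper's
# exact method). A^k @ 1 counts length-k walks starting at each vertex;
# sorting each vector makes the feature invariant to vertex relabeling,
# since a permutation P gives (P A P^T)^k P 1 = P (A^k 1).
import numpy as np

def functional_representation(adj, steps=3):
    """Return sorted entries of A^k @ 1 for k = 1..steps."""
    A = np.asarray(adj, dtype=float)
    x = np.ones(A.shape[0])
    features = []
    for _ in range(steps):
        x = A @ x                 # one power-iteration step
        features.append(np.sort(x))  # sort => permutation invariant
    return features
```

Isomorphic graphs therefore map to identical feature lists, which is exactly the property a kernel between such functional objects needs.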



Journal

Journal title: Proceedings of the ... AAAI Conference on Artificial Intelligence

Year: 2021

ISSN: 2159-5399, 2374-3468

DOI: https://doi.org/10.1609/aaai.v35i8.16872