Neural Architecture Search Task
A Neural Architecture Search Task is a neural network model search task (that automates the design of an artificial neural network's architecture, optimizing the network structure for a given task).
- Context:
- It can (typically) involve defining a search space that represents the candidate architectures for a neural network (see the sketch after this list).
- It can (typically) use various Search Strategies to explore the search space efficiently, including but not limited to evolutionary algorithms, reinforcement learning, and gradient-based methods.
- It can employ Performance Estimation Strategies to evaluate the effectiveness of different neural network architectures without fully training them.
- It can aim to optimize multiple objectives, such as maximizing accuracy while minimizing computational cost and memory usage.
- It can be an essential component of Automated Machine Learning (AutoML) by providing a means to automatically select the best model architecture in addition to tuning hyperparameters.
- ...
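The sketch below is a minimal, hypothetical illustration of how these components fit together: a search space of discrete architectural choices, a simple random-search strategy, and a placeholder performance estimation step. All names here (SEARCH_SPACE, estimate_performance, the budget of 20 trials) are illustrative assumptions rather than the API of any particular NAS library.

```python
import random

# 1. Search space: each architecture is one combination of these choices.
SEARCH_SPACE = {
    "num_layers":   [2, 4, 8],
    "hidden_units": [64, 128, 256],
    "activation":   ["relu", "tanh", "gelu"],
    "dropout":      [0.0, 0.2, 0.5],
}

def sample_architecture(space):
    """Search strategy (random search): draw one choice per dimension."""
    return {name: random.choice(options) for name, options in space.items()}

def estimate_performance(arch):
    """Performance estimation strategy (placeholder).

    In a real NAS system this would train `arch` briefly (low-fidelity
    evaluation) or reuse shared weights, then return validation accuracy.
    Here a random score is returned so the loop runs end to end.
    """
    return random.random()

def random_search(space, budget=20):
    """Evaluate `budget` sampled architectures and keep the best one."""
    best_arch, best_score = None, float("-inf")
    for _ in range(budget):
        arch = sample_architecture(space)
        score = estimate_performance(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

if __name__ == "__main__":
    arch, score = random_search(SEARCH_SPACE)
    print("best architecture:", arch, "estimated score:", round(score, 3))
```

In practice the placeholder scorer would be replaced by a short training run, a shared-weight (one-shot) evaluation, or a learned performance predictor, and the random-search strategy could be swapped for an evolutionary, reinforcement learning, or gradient-based one.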
- Example(s):
- ENAS (Efficient Neural Architecture Search), which uses a reinforcement learning controller together with parameter sharing among candidate child models to reduce the cost of the search.
- NASNet, a family of architectures discovered with a reinforcement learning-based search over convolutional cells, optimized for both accuracy and computational efficiency.
- DARTS (Differentiable Architecture Search), which introduces a gradient-based approach to neural architecture search, making the search process differentiable and thus more efficient (see the sketch after this list).
- ...
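To make the DARTS idea concrete, the following is a simplified PyTorch sketch (not the paper's actual code) of a continuously relaxed edge: the edge computes a softmax-weighted mixture of candidate operations, and the mixture weights (architecture parameters) are trained by gradient descent together with the network weights. The candidate operation list and the class name are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """A DARTS-style continuously relaxed edge.

    Instead of choosing a single operation per edge, the edge outputs a
    softmax-weighted sum of all candidate operations; the weights
    (architecture parameters alpha) are learned by gradient descent
    alongside the ordinary network weights.
    """
    def __init__(self, channels):
        super().__init__()
        # Candidate operations: a small, illustrative subset of the
        # operation set used in the DARTS paper.
        self.ops = nn.ModuleList([
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.Conv2d(channels, channels, kernel_size=5, padding=2),
            nn.MaxPool2d(kernel_size=3, stride=1, padding=1),
            nn.Identity(),  # skip connection
        ])
        # One architecture parameter per candidate operation.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

# Example usage: a (1, 16, 32, 32) input produces an output of the same shape.
# x = torch.randn(1, 16, 32, 32)
# y = MixedOp(16)(x)
# After search, the edge is discretized by keeping the operation with the
# largest alpha: best_op = mixed_op.ops[int(mixed_op.alpha.argmax())]
```

In the full DARTS algorithm the architecture parameters are updated on validation data in a bilevel optimization, and the final discrete architecture is obtained by keeping the highest-weighted operation on each edge.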
- Counter-Example(s):
- A Hyperparameter Optimization Task, which focuses solely on tuning the parameters of a given model architecture rather than searching for new architectures.
- A Model Selection Task, which involves choosing the best model from a predefined set of models rather than designing new architectures.
- See: Automated Machine Learning, Artificial Neural Network, Machine Learning, Hyperparameter Optimization.
References
2024
- (Wikipedia, 2024) ⇒ https://en.wikipedia.org/wiki/Neural_architecture_search Retrieved:2024-3-4.
- Neural architecture search (NAS) is a technique for automating the design of artificial neural networks (ANN), a widely used model in the field of machine learning. NAS has been used to design networks that are on par or outperform hand-designed architectures.[1] Methods for NAS can be categorized according to the search space, search strategy and performance estimation strategy used:[2]
- The search space defines the type(s) of ANN that can be designed and optimized.
- The search strategy defines the approach used to explore the search space.
- The performance estimation strategy evaluates the performance of a possible ANN from its design (without constructing and training it).
- NAS is closely related to hyperparameter optimization [3] and meta-learning and is a subfield of automated machine learning (AutoML).
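As a hedged illustration of the performance estimation strategy component described above, the sketch below scores candidate architectures with a low-fidelity evaluation: each candidate is trained for only a few iterations on a data subset, and its validation score serves as a cheap proxy for fully trained accuracy. The dataset, candidate list, and training budget are arbitrary assumptions chosen only for illustration.

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Low-fidelity performance estimation: rather than training each candidate
# to convergence, train briefly on a subset and use the validation score
# as a cheap proxy for final quality.
X, y = load_digits(return_X_y=True)
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

# Two candidate architectures expressed as hidden-layer configurations.
candidates = [(64,), (128, 64)]

for hidden in candidates:
    model = MLPClassifier(hidden_layer_sizes=hidden, max_iter=10, random_state=0)
    model.fit(X_train[:500], y_train[:500])   # small subset, few iterations
    proxy_score = model.score(X_val, y_val)   # cheap estimate, not final accuracy
    print(hidden, round(proxy_score, 3))
```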
2020
- (Wikipedia, 2020) ⇒ https://en.wikipedia.org/wiki/Neural_architecture_search Retrieved:2020-6-17.
- Neural architecture search (NAS) is a technique for automating the design of artificial neural networks (ANN), a widely used model in the field of machine learning. ...
- ↑ Zoph, Barret, and Quoc V. Le (2016). "Neural Architecture Search with Reinforcement Learning." arXiv:1611.01578.
- ↑ Elsken, Thomas, Jan Hendrik Metzen, and Frank Hutter (2019). "Neural Architecture Search: A Survey." Journal of Machine Learning Research, 20(55): 1–21.
- ↑ Feurer, Matthias, and Frank Hutter (2019). "Hyperparameter Optimization." In: AutoML: Methods, Systems, Challenges, pages 3–38.