Neural Architecture Search Task

From GM-RKB

Latest revision as of 02:38, 4 November 2024

A Neural Architecture Search Task is a model search task for neural network models (that automates the design of an artificial neural network's architecture for a given task).



References

2024

  • (Wikipedia, 2024) ⇒ https://en.wikipedia.org/wiki/Neural_architecture_search Retrieved:2024-3-4.
    • Neural architecture search (NAS) is a technique for automating the design of artificial neural networks (ANN), a widely used model in the field of machine learning. NAS has been used to design networks that are on par with or outperform hand-designed architectures.[1] Methods for NAS can be categorized according to the search space, search strategy, and performance estimation strategy used:[2]
      • The search space defines the type(s) of ANN that can be designed and optimized.
      • The search strategy defines the approach used to explore the search space.
      • The performance estimation strategy evaluates the performance of a possible ANN from its design (without constructing and training it).
    • NAS is closely related to hyperparameter optimization [3] and meta-learning and is a subfield of automated machine learning (AutoML).
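The three components listed above can be sketched as a minimal random-search NAS loop. This is an illustrative assumption, not a method from the source: the toy search space, the `estimate_performance` proxy score, and all function names are hypothetical stand-ins (a real NAS system would estimate performance with a learned predictor or partial training).

```python
import random

# Hypothetical toy search space (assumption for illustration):
# the types of ANN that can be designed and optimized.
SEARCH_SPACE = {
    "num_layers": [2, 4, 6],
    "units": [32, 64, 128],
    "activation": ["relu", "tanh"],
}

def sample_architecture(rng):
    """Search strategy: explore the space by uniform random sampling."""
    return {key: rng.choice(values) for key, values in SEARCH_SPACE.items()}

def estimate_performance(arch):
    """Performance estimation strategy (stub): score a candidate from its
    design alone, without constructing or training it. The proxy below is
    made up for the sketch; it just favors a moderate parameter budget."""
    return -abs(arch["num_layers"] * arch["units"] - 256)

def random_search(trials=20, seed=0):
    """Run the NAS loop: sample candidates, estimate each, keep the best."""
    rng = random.Random(seed)
    best_arch, best_score = None, float("-inf")
    for _ in range(trials):
        arch = sample_architecture(rng)
        score = estimate_performance(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score
```

Swapping `sample_architecture` for an evolutionary or reinforcement-learning controller, or `estimate_performance` for a trained surrogate, changes the search strategy or estimation strategy without touching the rest of the loop.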

2020

  1. Zoph, Barret; Le, Quoc V. (2016). "Neural Architecture Search with Reinforcement Learning". arXiv:1611.01578.
  2. Elsken, Thomas; Metzen, Jan Hendrik; Hutter, Frank (2019). "Neural Architecture Search: A Survey". Journal of Machine Learning Research. 20 (55): 1–21.
  3. Feurer, Matthias; Hutter, Frank. "Hyperparameter Optimization". In: AutoML: Methods, Systems, Challenges, pages 3–38.