Sun Dec 8th through Sat the 14th, 2019 at Vancouver Convention Center
Disclaimer: I am not in the CV and NLP field and am not sure the proposed language can cover everything we want to search over in applications.

This work proposes a unified language for search space definition. To the best of my knowledge, this is original work, and in my opinion it is also important, as search spaces in the NAS field are much more complicated than in typical HPO/BO settings. The overall quality is quite good. However, I have some minor concerns about the search algorithm experiments: the "optimize" step in Algorithm 4 currently picks only one of 256 randomly generated architectures. I agree this is a valid comparison, but how many architectures did the original search algorithm papers use? How easy is it to implement a new search algorithm in this framework?

The paper is written clearly, but I think more comments in Algorithms 1 and 2 would help. It would also be nice to see a comparison between a search space defined in the proposed language and its definition in the original paper.

I do think this is a significant work on both the methodological and the empirical side.

After reading the authors' response
========================
I changed my score to 7.
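[For context on the baseline this review questions: picking the best of 256 randomly generated architectures amounts to plain random search. A minimal sketch of that procedure, where `sample_architecture` and `evaluate` are hypothetical stand-ins rather than the paper's actual code, might look like:]

```python
import random

# Toy random-search baseline: sample 256 candidate "architectures"
# and keep the one with the best score. Both sample_architecture()
# and evaluate() below are illustrative placeholders.

def sample_architecture(rng):
    # Stand-in: an "architecture" is just a tuple of three layer widths.
    return tuple(rng.choice([32, 64, 128]) for _ in range(3))

def evaluate(arch):
    # Stand-in objective: prefer wider networks.
    return sum(arch)

def random_search(num_candidates=256, seed=0):
    rng = random.Random(seed)
    # Generate num_candidates architectures and return the best-scoring one.
    return max((sample_architecture(rng) for _ in range(num_candidates)),
               key=evaluate)
```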
This paper proposes a formal language to describe the search space of architecture search problems. The language is a domain-specific language embedded in Python. Users can write modular, composable, and reusable search spaces with it.

Originality: The contribution is new. This is the first work that tries to provide a formal language for search space definition.

Quality: The semantics of the language are thoroughly described. However, this is a language embedded in Python: it does not have its own textual format, so the syntax of the language is unclear. Because it has to follow the restrictions of the host language (Python), its grammar is also not very concise. The language combines ideas from modern deep learning frameworks and hyperparameter search frameworks. The idea of hierarchical 'substitution modules' has already appeared in some deep learning frameworks (e.g., the 'Block' structure in the Gluon API of MXNet).

Clarity: The paper is well written with adequate background information.

Significance: This is a good tool for formulating search spaces. I expect many people will be willing to use it. It would be better if it supported multiple backends (e.g., TensorFlow, PyTorch, ...).

Questions:
1. Besides modularity, how does this language compare to existing ways of specifying search spaces? Can it also reduce the number of lines of code?
2. I don't like the design of `A['out'].connect(B['in'])`. Deep learning frameworks do not need to explicitly assign these edges in the computational graph; they build the graph by using inputs as arguments and outputs as return values.
3. How does it handle network-transformation-based architecture search (e.g., Efficient Architecture Search by Network Transformation, Path-Level Network Transformation for Efficient Architecture Search)? Their search space is basically defined by a set of network transformation operations.

Minors:
1. Add an explanation for Figure 4. Fig. 4 is not cited in the paper.
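[To illustrate the contrast raised in Question 2: in the reviewed language, edges are wired explicitly via `A['out'].connect(B['in'])`, whereas most deep learning frameworks build the graph implicitly through function composition. A toy sketch of the implicit style, with purely illustrative layer names that are not the paper's API:]

```python
# Implicit graph construction: the edge from conv to relu exists
# simply because relu consumes conv's return value. No explicit
# connect() call is needed. conv/relu are toy stand-ins for layers.

def conv(x):
    return ('conv', x)

def relu(x):
    return ('relu', x)

def network(x):
    # The dataflow graph is implied by nesting calls:
    # x -> conv -> relu
    return relu(conv(x))

print(network('input'))  # ('relu', ('conv', 'input'))
```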
Clarity: very clear overall.

Originality: original framework to the best of my knowledge.

Significance: seems an important contribution to the field; this language should facilitate the development of Neural Architecture Search algorithms.

Quality: high quality.

Minor: "While in different settings these input distributions take different forms, our formulation can work with any of them. Next, we summarize some of these alternatives." => Unclear sentence. What is "these"?

Section 5.1: I don't know how helpful the introduction of all the formal notations is for the reader, and it might not be the most pedagogical way to explain things. Could you maybe move this section to the supplement and replace it with more extended code examples like those in Figs. 1-3, perhaps connecting to known frameworks such as TensorFlow or PyTorch?