Code-to-Vector (code2vec) Neural Network
A Code-to-Vector (code2vec) Neural Network is a Path-Attention Neural Network that represents an arbitrary-sized code snippet as a collection of syntactic (AST) paths and learns to aggregate them into a single fixed-size code vector (a minimal illustrative sketch of this attention-based aggregation is given after the list below).
- AKA: Code2Vec.
- Context:
- Source code available at: https://github.com/tech-srl/code2vec
- It can be trained by a Code-to-Vector Neural Network Training System and evaluated by a Code-to-Vector (code2vec) Benchmarking Task.
- Example(s):
- Counter-Example(s):
- See: Attention Mechanism, Code Summarization Task, Bimodal Modelling of Code and Natural Language.
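The following is a minimal NumPy sketch of the path-attention aggregation idea, not the official TensorFlow implementation from the repository linked above; the embedding size, number of path-contexts, and the random stand-in embeddings and weights are assumptions made purely for illustration.

```python
import numpy as np

# Minimal sketch of code2vec-style path-attention aggregation.
# In the real model, token/path embeddings, W, and the attention
# vector `a` are learned; here random values stand in for them.

rng = np.random.default_rng(0)
d = 128           # embedding size (assumed)
n_contexts = 200  # path-contexts extracted from one snippet (assumed)

# Each path-context is (source token, AST path, target token),
# each part represented by an embedding vector.
source_tok = rng.normal(size=(n_contexts, d))
ast_path   = rng.normal(size=(n_contexts, d))
target_tok = rng.normal(size=(n_contexts, d))

# Combine each context: c_i = tanh(W · [source; path; target])
W = rng.normal(size=(3 * d, d)) * 0.01
context = np.concatenate([source_tok, ast_path, target_tok], axis=1)
combined = np.tanh(context @ W)          # shape (n_contexts, d)

# Attention weights: alpha_i = softmax(combined_i · a),
# with a single global learned attention vector a.
a = rng.normal(size=(d,))
scores = combined @ a
alpha = np.exp(scores - scores.max())
alpha /= alpha.sum()

# Code vector: attention-weighted sum of the combined context vectors,
# yielding one fixed-size vector for the whole snippet.
code_vector = alpha @ combined           # shape (d,)
print(code_vector.shape)                 # (128,)
```

The key design point illustrated here is that attention lets the network weight the syntactic paths that matter most for a snippet, so snippets with different numbers of paths still map to vectors of the same size.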
References
2019
- (Alon et al., 2019) ⇒ Uri Alon, Meital Zilberstein, Omer Levy, and Eran Yahav. (2019). “code2vec: Learning Distributed Representations of Code.” In: Proceedings of the ACM on Programming Languages (POPL), Volume 3.
- QUOTE: The goal of this paper is to learn code embeddings, continuous vectors for representing snippets of code. By learning code embeddings, our long-term goal is to enable the application of neural techniques to a wide range of programming-language tasks. In this paper, we use the motivating task of semantic labeling of code snippets.
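For illustration, a toy continuation of the sketch above shows how a code vector could be used for semantic labeling of a snippet, by scoring it against learned label (tag) embeddings and predicting the highest-scoring label (e.g., a method name); the label vocabulary and shapes below are purely hypothetical.

```python
import numpy as np

# Toy semantic-labeling head: compare a code vector with learned
# tag embeddings via softmax and pick the most likely label.
# Labels, sizes, and random values are illustrative assumptions.

rng = np.random.default_rng(1)
d = 128
labels = ["contains", "indexOf", "sort", "reverse"]  # toy label vocabulary
tag_embeddings = rng.normal(size=(len(labels), d))   # learned in the model

code_vector = rng.normal(size=(d,))  # would come from the aggregation step

logits = tag_embeddings @ code_vector
probs = np.exp(logits - logits.max())
probs /= probs.sum()

print(labels[int(np.argmax(probs))])  # most likely semantic label
```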