2016 Summarizing Source Code Using a Neural Attention Model


Subject Headings: CODE-NN; Automatic Code Summarization Task.

Notes

Cited By

Quotes

Abstract

High quality source code is often paired with high level summaries of the computation it performs, for example in code documentation or in descriptions posted in online forums. Such summaries are extremely useful for applications such as code search but are expensive to manually author, hence only done for a small fraction of all code that is produced. In this paper, we present the first completely data-driven approach for generating high level summaries of source code. Our model, CODE-NN, uses Long Short Term Memory (LSTM) networks with attention to produce sentences that describe C# code snippets and SQL queries. CODE-NN is trained on a new corpus that is automatically collected from StackOverflow, which we release. Experiments demonstrate strong performance on two tasks: (1) code summarization, where we establish the first end-to-end learning results and outperform strong baselines, and (2) code retrieval, where our learned model improves the state of the art on a recently introduced C# benchmark by a large margin.
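
The core of CODE-NN, as the abstract describes, is an LSTM decoder that generates a summary one word at a time while attending over embeddings of the source-code tokens. The following is a minimal PyTorch sketch of that idea; the class name, dimensions, and the simple dot-product attention and context/state combination are illustrative assumptions, not the paper's exact configuration.

```python
# A minimal sketch (assumed PyTorch; shapes and hyperparameters are
# illustrative, not the paper's reported setup) of the CODE-NN idea:
# an LSTM decoder that, at each step, attends over embeddings of the
# code tokens and combines the attention context with its hidden state
# to predict the next summary word.
import torch
import torch.nn as nn

class CodeNNSketch(nn.Module):  # hypothetical name for illustration
    def __init__(self, code_vocab, text_vocab, dim=256):
        super().__init__()
        self.code_emb = nn.Embedding(code_vocab, dim)  # code token embeddings (attended over)
        self.text_emb = nn.Embedding(text_vocab, dim)  # summary word embeddings (LSTM input)
        self.lstm = nn.LSTMCell(dim, dim)
        self.out = nn.Linear(dim, text_vocab)

    def forward(self, code_ids, summary_ids):
        # code_ids: (batch, code_len); summary_ids: (batch, sum_len)
        code = self.code_emb(code_ids)                 # (batch, code_len, dim)
        h = code.new_zeros(code.size(0), code.size(2))
        c = torch.zeros_like(h)
        logits = []
        for t in range(summary_ids.size(1)):
            h, c = self.lstm(self.text_emb(summary_ids[:, t]), (h, c))
            # dot-product attention of the hidden state over code token
            # embeddings (a simplification of the paper's attention)
            scores = torch.bmm(code, h.unsqueeze(2)).squeeze(2)   # (batch, code_len)
            alpha = torch.softmax(scores, dim=1)
            ctx = torch.bmm(alpha.unsqueeze(1), code).squeeze(1)  # (batch, dim)
            logits.append(self.out(torch.tanh(ctx + h)))          # combine context and state
        return torch.stack(logits, dim=1)              # (batch, sum_len, text_vocab)
```

For the retrieval task mentioned in the abstract, a trained model of this kind can be used directly: candidate code snippets are ranked by the likelihood the model assigns to the natural-language query as a summary of each snippet.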

References

BibTeX

@inproceedings{2016_SummarizingSourceCodeUsingaNeur,
  author    = {Srinivasan Iyer and
               Ioannis Konstas and
               Alvin Cheung and
               Luke Zettlemoyer},
  title     = {Summarizing Source Code using a Neural Attention Model},
  booktitle = {Proceedings of the 54th Annual Meeting of the Association for Computational
               Linguistics (ACL 2016) Volume 1: Long Papers},
  publisher = {Association for Computational Linguistics},
  year      = {2016},
  url       = {https://doi.org/10.18653/v1/p16-1195},
  doi       = {10.18653/v1/p16-1195},
}


Author: Srinivasan Iyer, Ioannis Konstas, Alvin Cheung, Luke Zettlemoyer
Title: Summarizing Source Code Using a Neural Attention Model
Year: 2016