2013 GeneratingSequencesWithRecurren
- (Graves, 2013) ⇒ Alex Graves. (2013). “Generating Sequences With Recurrent Neural Networks.” In: CoRR, abs/1308.0850.
Subject Headings: Handwriting Generation, LSTM Cell.
Notes
* http://cs.toronto.edu/~graves/gen_seq_rnn.pdf
* http://cs.toronto.edu/~graves/handwriting.html (demo)
Cited By
Quotes
Abstract
This paper shows how Long Short-term Memory recurrent neural networks can be used to generate complex sequences with long-range structure, simply by predicting one data point at a time. The approach is demonstrated for text (where the data are discrete) and online handwriting (where the data are real-valued). It is then extended to handwriting synthesis by allowing the network to condition its predictions on a text sequence. The resulting system is able to generate highly realistic cursive handwriting in a wide variety of styles.
…
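The abstract's core idea, predicting one data point at a time and feeding each sample back in as the next input, can be illustrated with a short sketch. The following is a minimal, hypothetical character-level example in PyTorch, not the paper's code: the vocabulary size, hidden size, and temperature are illustrative assumptions, and the paper's handwriting model additionally uses a mixture-density output layer for the real-valued pen data rather than the plain softmax shown here.

```python
import torch
import torch.nn as nn

class CharLSTM(nn.Module):
    """Predicts a distribution over the next symbol given the prefix so far."""
    def __init__(self, vocab_size=64, hidden_size=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.lstm = nn.LSTM(hidden_size, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, vocab_size)

    def forward(self, x, state=None):
        h, state = self.lstm(self.embed(x), state)
        return self.out(h), state

def sample(model, start_token, length=100, temperature=1.0):
    """Generate a sequence one symbol at a time, feeding each sample back in."""
    model.eval()
    tokens = [start_token]
    state = None
    x = torch.tensor([[start_token]])  # shape (batch=1, seq=1)
    with torch.no_grad():
        for _ in range(length):
            logits, state = model(x, state)  # carry the LSTM state forward
            probs = torch.softmax(logits[0, -1] / temperature, dim=-1)
            next_token = torch.multinomial(probs, 1).item()
            tokens.append(next_token)
            x = torch.tensor([[next_token]])  # the sample becomes the next input
    return tokens

if __name__ == "__main__":
    model = CharLSTM()  # untrained here, so output is random; a trained model produces structured sequences
    print(sample(model, start_token=0, length=20))
```

The same one-step-at-a-time loop applies regardless of sequence length; the temperature knob is a common sampling convenience assumed for this sketch, not a detail taken from the paper.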
References
Author | title | journal | volume | year
---|---|---|---|---
Alex Graves | Generating Sequences With Recurrent Neural Networks | CoRR | abs/1308.0850 | 2013