Wide Residual Network (WRN)
A Wide Residual Network (WRN) is a Deep Residual Neural Network built from widened residual blocks, i.e. residual blocks with an increased number of feature maps per convolutional layer, optionally regularized with dropout (the WRN-n-k naming convention is illustrated in the sketch after the list below).
- Context:
- It was initially developed by Zagoruyko & Komodakis (2016).
- Source code available at: https://github.com/szagoruyko/wide-residual-networks
- Example(s):
- WRN-28-10, a 28-layer network with widening factor 10.
- Counter-Example(s):
- a thin, very deep Residual Neural Network, such as ResNet-1001.
- See: Residual Neural Network, Convolutional Neural Network, Machine Learning, Deep Learning, Machine Vision.
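The paper denotes architectures as WRN-n-k, where n is the total number of convolutional layers and k is the widening factor. As a minimal Python sketch of this convention for the CIFAR variant (the helper name `wrn_group_widths` is hypothetical, not from the authors' code):

```python
def wrn_group_widths(depth: int, k: int):
    """Return (blocks_per_group, group_widths) for a CIFAR WRN-depth-k.

    The CIFAR variant stacks 3 groups of two-conv residual blocks on top
    of an initial 16-channel conv, so depth = 6 * blocks_per_group + 4.
    Group widths follow the paper: 16*k, 32*k, 64*k.
    """
    assert (depth - 4) % 6 == 0, "depth must be of the form 6n + 4"
    blocks_per_group = (depth - 4) // 6
    widths = [16 * k, 32 * k, 64 * k]
    return blocks_per_group, widths

# Example: WRN-28-10 has 4 blocks per group with widths 160, 320, 640.
print(wrn_group_widths(28, 10))  # (4, [160, 320, 640])
```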
References
2016
- (Zagoruyko & Komodakis, 2016) ⇒ Sergey Zagoruyko, and Nikos Komodakis. (2016). “Wide Residual Networks.” In: Proceedings of the British Machine Vision Conference 2016 (BMVC 2016).
- QUOTE: Residual block with identity mapping can be represented by the following formula:
[math]\displaystyle{ \mathbf{x}_{l+1}=\mathbf{x}_{l}+\mathcal{F}\left(\mathbf{x}_{l}, \mathcal{W}_{l}\right) }[/math] (1)
- where $\mathbf{x}_{l+1}$ and $\mathbf{x}_{l}$ are input and output of the $l$-th unit in the network, $\mathcal{F}$ is a residual function and $\mathcal{W}_{l}$ are parameters of the block. Residual network consists of sequentially stacked residual blocks.
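As a minimal sketch of Eq. (1) with a "basic-wide" two-conv residual function $\mathcal{F}$, using the BN-ReLU-conv ordering and the optional dropout between the two convolutions described in the paper (assuming PyTorch; the class name `WideResidualBlock` and its arguments are illustrative, not the authors' code):

```python
import torch
import torch.nn as nn

class WideResidualBlock(nn.Module):
    """Identity-mapping residual block: x_{l+1} = x_l + F(x_l, W_l)."""

    def __init__(self, channels: int, dropout_rate: float = 0.0):
        super().__init__()
        # F(x, W_l): BN -> ReLU -> conv3x3 -> (dropout) -> BN -> ReLU -> conv3x3
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv1 = nn.Conv2d(channels, channels, kernel_size=3,
                               padding=1, bias=False)
        self.dropout = nn.Dropout(dropout_rate)
        self.bn2 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, kernel_size=3,
                               padding=1, bias=False)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = self.conv1(self.relu(self.bn1(x)))
        out = self.dropout(out)
        out = self.conv2(self.relu(self.bn2(out)))
        # Identity shortcut: add the block input to the residual.
        return x + out
```

This sketch keeps the input and output widths equal so the identity shortcut applies directly; blocks that change width or stride would additionally need a projection shortcut, which is omitted here.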