Shannon Information Theory
See: Shannon's Information Measures, Shannon Entropy, Shannon's Equivocation, Shannon's Capacity, Shannon's Redundancy, Shannon's Source Coding Theorem, Shannon-Hartley Theorem.
References
2017
- (Sammut & Webb, 2017) ⇒ Claude Sammut, and Geoffrey I. Webb. (2017). "Shannon's Information". In: Encyclopedia of Machine Learning and Data Mining. DOI: 10.1007/978-1-4899-7687-1_968
- QUOTE: If a message announces an event [math]\displaystyle{ E_1 }[/math] of probability [math]\displaystyle{ P(E_1) }[/math], its information content is [math]\displaystyle{ -\log_2 P(E_1) }[/math]. This is also its length in bits.
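The quoted definition can be illustrated numerically. The sketch below (a minimal example, not part of the cited entry; the function name `self_information` is chosen here for illustration) computes the information content of an event in bits:

```python
import math

def self_information(p: float) -> float:
    """Shannon self-information -log2(p) of an event with probability p, in bits."""
    if not 0 < p <= 1:
        raise ValueError("probability must be in (0, 1]")
    return -math.log2(p)

# An event with probability 1/8 carries 3 bits of information.
print(self_information(1 / 8))   # → 3.0
# A certain event (p = 1) carries no information.
print(self_information(1.0))     # → 0.0
```

Note that rarer events carry more information: halving the probability adds exactly one bit to the message length.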
2016
- (Lombardi et al., 2016) ⇒ Olimpia Lombardi, Federico Holik, and Leonardo Vanni. (2016). "What Is Shannon Information?". In: Synthese, Volume 193, Issue 7. DOI: 10.1007/s11229-015-0824-z
1948
- (Shannon, 1948) ⇒ Claude E. Shannon. (1948). "A Mathematical Theory of Communication". In: The Bell System Technical Journal, Volume 27, Issue 3. DOI: 10.1002/j.1538-7305.1948.tb01338.x