Information Loss
See: Information, Loss, Entropy, Noise.
References
2009
- (Hu et al., 2009) ⇒ Xiaohua Hu, Xiaodan Zhang, Caimei Lu, E. K. Park, and Xiaohua Zhou. (2009). “Exploiting Wikipedia as External Knowledge for Document Clustering.” In: Proceedings of the ACM SIGKDD Conference (KDD-2009). doi:10.1145/1557019.1557066
- … There are two major issues for this approach: (1) the coverage of the ontology is limited, even for WordNet or MeSH, and (2) using ontology terms as replacement or additional features may cause information loss, or introduce noise. …
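The Hu et al. (2009) excerpt above contrasts two ways of injecting ontology concepts into a document's feature set. The sketch below illustrates why each can be problematic; the `ONTOLOGY` dictionary and concept IDs are purely hypothetical stand-ins for a resource such as WordNet or MeSH, and this is not the paper's actual method.

```python
from collections import Counter

# Hypothetical word-to-concept mapping standing in for an ontology
# such as WordNet or MeSH; IDs and coverage are illustrative only.
ONTOLOGY = {
    "tumor": "C:neoplasm",
    "cancer": "C:neoplasm",
    "heart": "C:cardiac",
}

def term_features(tokens, mode="additional"):
    """Build a bag-of-features for one document.

    mode="replacement": ontology concepts replace the surface terms, so
    distinct words mapping to the same concept ("tumor", "cancer")
    collapse into one feature and their distinction is lost.
    mode="additional": concepts are appended alongside the original
    terms, which preserves surface forms but can add noisy features
    when the ontology's mapping is ambiguous or incomplete.
    """
    features = []
    for tok in tokens:
        concept = ONTOLOGY.get(tok)
        if mode == "replacement" and concept is not None:
            features.append(concept)
        else:
            features.append(tok)
            if mode == "additional" and concept is not None:
                features.append(concept)
    return Counter(features)

doc = ["tumor", "cancer", "heart", "study"]
print(term_features(doc, mode="replacement"))  # tumor/cancer distinction disappears
print(term_features(doc, mode="additional"))   # surface terms kept, concepts added
```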
2006
- (Agirre et al., 2006) ⇒ Eneko Agirre, David Martínez, Oier Lopez de Lacalle, and Aitor Soroa. (2006). “Two Graph-based Algorithms for State-of-the-Art WSD.” In: Proceedings of the Conference on Empirical Methods in Natural Language Processing (EMNLP 2006).
2002
- (Mitra et al., 2002) ⇒ Pabitra Mitra, C. A. Murthy, and Sankar K. Pal. (2002). “Unsupervised Feature Selection Using Feature Similarity.” In: IEEE Transactions on Pattern Analysis and Machine Intelligence, 24(3). doi:10.1109/34.990133
- … both the proposed clustering algorithm and the newly introduced feature similarity measure are geared toward two goals: minimizing the information loss (in terms of second order statistics) incurred in the process of feature reduction and minimizing the redundancy present in the reduced feature subset.
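The Mitra et al. (2002) excerpt describes pruning redundant features so that little second-order (correlation) structure is lost. The sketch below conveys that idea with a simplified greedy procedure; it uses absolute Pearson correlation as the feature-similarity measure rather than the paper's maximal information compression index, and it is not a re-implementation of their k-nearest-neighbor-based algorithm.

```python
import numpy as np

def select_features(X, k):
    """Greedy unsupervised feature-selection sketch.

    X: (n_samples, n_features) data matrix; k: number of features to keep.
    Features highly similar to an already-kept feature are treated as
    redundant and dropped, so the retained subset preserves as much of
    the second-order (correlation) structure as possible.
    """
    corr = np.abs(np.corrcoef(X, rowvar=False))  # pairwise |correlation|
    np.fill_diagonal(corr, 0.0)
    keep = list(range(X.shape[1]))
    while len(keep) > k:
        sub = corr[np.ix_(keep, keep)]
        # Locate the most similar pair among the still-kept features.
        i, j = np.unravel_index(np.argmax(sub), sub.shape)
        # Drop whichever member of the pair is, on average, more
        # correlated with the rest (i.e., more redundant overall).
        drop = keep[i] if sub[i].mean() >= sub[j].mean() else keep[j]
        keep.remove(drop)
    return keep

rng = np.random.default_rng(0)
base = rng.normal(size=(100, 3))
# Append a fourth column that is a near-copy of the first.
X = np.hstack([base, base[:, :1] + 0.01 * rng.normal(size=(100, 1))])
print(select_features(X, k=3))  # one of the two near-duplicate columns is dropped
```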