Convolutional Kernel Function
A Convolutional Kernel Function is a relational data kernel function (a convolutional function) that measures the similarity between two graph-like structures by summing the similarities of their substructures.
- Context:
- input: two Data Structures x and y, such as Strings or Graphs.
- range: for each substructure referenced by an index [math]\displaystyle{ i }[/math], a count of:
- 1) the number of times that substructure is matched into x, and
- 2) how many times it is matched into any of x's substructures.
- Example(s):
- a Tree Kernel Function (Collins & Duffy, 2001).
- a String Kernel Function (Haussler, 1999).
- Counter-Example(s):
- a Linear Kernel Function on fixed-length feature vectors.
- See: String Kernel Function, Structured SVM Algorithm, Convolutional Neural Network.
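Because the definition above reduces structure similarity to counting shared substructures, it can be sketched for the simplest case: two strings whose substructures are their contiguous substrings. This is a minimal illustration of the general idea, not any specific published kernel; the function names are invented for this sketch.

```python
from collections import Counter

def all_substructures(s):
    """Enumerate every substructure of s, here taken to be
    every contiguous substring (the simplest choice)."""
    return (s[i:j] for i in range(len(s)) for j in range(i + 1, len(s) + 1))

def convolution_kernel(x, y):
    """Convolution-kernel similarity k(x, y): sum, over every
    substructure shared by x and y, the product of its
    occurrence counts in each structure."""
    cx = Counter(all_substructures(x))
    cy = Counter(all_substructures(y))
    return sum(cx[s] * cy[s] for s in cx.keys() & cy.keys())
```

For example, `convolution_kernel("ab", "ab")` returns 3, since the strings share the substructures "a", "b", and "ab" once each, while `convolution_kernel("ab", "cd")` returns 0 because no substructure is shared.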
References
2004
- (Moschitti, 2004) ⇒ Alessandro Moschitti. (2004). “A Study on Convolution Kernels for Shallow Semantic Parsing.” In: Proceedings of the 42nd Annual Meeting of the Association for Computational Linguistics (ACL 2004).
2003
- (Menchetti et al., 2003) ⇒ Sauro Menchetti, Fabrizio Costa, Paolo Frasconi, and Massimiliano Pontil. (2003). “Comparing Convolution Kernels and Recursive Neural Networks for Learning Preferences on Structured Data.” In: Proceedings of IAPR - TC3 International Workshop on Artificial Neural Networks in Pattern Recognition (ANNPR 2003).
- QUOTE: Convolution kernels and recursive neural networks (RNN) are both suitable approaches for supervised learning when the input portion of an instance is a discrete structure like a tree or a graph. … Convolution kernels measure the similarity between two structured instances by summing the similarity of their substructures. Thus, given all possible substructures in instances [math]\displaystyle{ x }[/math] and [math]\displaystyle{ y }[/math], [math]\displaystyle{ t_i(x) }[/math] counts not only the number of times the substructure referenced by [math]\displaystyle{ i }[/math] is matched into [math]\displaystyle{ x }[/math], but also how many times it is matched into any of its substructures.
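One possible reading of the count described in the quote, again taking a string's substructures to be its contiguous substrings, is sketched below. The names `count_matches` and `t` are hypothetical, chosen only to mirror the [math]\displaystyle{ t_i(x) }[/math] notation; the paper itself does not fix this particular construction.

```python
def count_matches(pattern, s):
    """Occurrences of pattern as a contiguous substring of s."""
    if not pattern:
        return 0
    return sum(1 for k in range(len(s) - len(pattern) + 1)
               if s[k:k + len(pattern)] == pattern)

def t(pattern, x):
    """One reading of t_i(x): matches of the substructure (pattern)
    into x itself, plus matches into every distinct proper
    substructure (substring) of x."""
    subs = {x[i:j] for i in range(len(x)) for j in range(i + 1, len(x) + 1)}
    subs.discard(x)  # count x itself only once, via the direct term
    return count_matches(pattern, x) + sum(count_matches(pattern, s) for s in subs)
```

Under this reading, `t("a", "ab")` is 2: one match into "ab" itself plus one match into its substructure "a".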
2001
- (Collins and Duffy, 2001) ⇒ Michael Collins and N. Duffy. (2001). “Convolution Kernels for Natural Language.” In: Proceedings of NIPS-2001.
1999
- (Haussler, 1999) ⇒ D. Haussler. (1999). “Convolution Kernels on Discrete Structures.” Technical Report UCSC-CRL-99-10, University of California at Santa Cruz.
- QUOTE: We introduce a new method of constructing kernels on sets whose elements are discrete structures like strings, trees and graphs. The method can be applied iteratively to build a kernel on an infinite set from kernels involving generators of the set. The family of kernels generated generalizes the family of radial basis kernels. It can also be used to define kernels in the form of joint Gibbs probability distributions. Kernels can be built from hidden Markov random fields, generalized regular expressions, pair-HMMs, or ANOVA decompositions. Uses of the method lead to open problems involving the theory of infinitely divisible positive definite functions. Fundamentals of this theory and the theory of reproducing kernel Hilbert spaces are reviewed and applied in establishing the validity of the method. … Here we present some methods for defining series representations for discrete structures using a general type of kernel function we call a convolution kernel.