Association Rule Learning Task
An Association Rule Learning Task is a frequent-pattern mining task that is restricted to the discovery of association rules.
- AKA: Association Rule Mining.
- Context:
- Input: a transaction dataset, a minimum support threshold (minsup), and a minimum confidence threshold (minconf).
- Output: a set of association rules whose support and confidence meet those thresholds.
- It can be solved by an Association Rule Learning System that applies an Association Rule Learning Algorithm.
- …
- Counter-Example(s):
- See: Lift, Frequent Pattern, Frequent Itemset, Association Task, Minimum Confidence, Minimum Support.
References
2012
- (Wikipedia, 2012) ⇒ http://en.wikipedia.org/wiki/Association_rule_learning
- In data mining, association rule learning is a popular and well researched method for discovering interesting relations between variables in large databases. It is intended to identify strong rules discovered in databases using different measures of interestingness[1]. Based on the concept of strong rules, Rakesh Agrawal et al.[2] introduced association rules for discovering regularities between products in large-scale transaction data recorded by point-of-sale (POS) systems in supermarkets. For example, the rule [math]\displaystyle{ \{\mathrm{onions, potatoes}\} \Rightarrow \{\mathrm{burger}\} }[/math] found in the sales data of a supermarket would indicate that if a customer buys onions and potatoes together, he or she is likely to also buy hamburger meat. Such information can be used as the basis for decisions about marketing activities such as promotional pricing or product placements. In addition to the above example from market basket analysis, association rules are employed today in many application areas including Web usage mining, intrusion detection and bioinformatics. As opposed to sequence mining, association rule learning typically does not consider the order of items either within a transaction or across transactions.
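The support and confidence measures behind such a rule can be illustrated with a short Python sketch; the transactions below are hypothetical toy data chosen to match the supermarket example above:

```python
# Toy market-basket transactions (hypothetical data for illustration).
transactions = [
    {"onions", "potatoes", "burger"},
    {"onions", "potatoes", "burger", "beer"},
    {"onions", "potatoes"},
    {"milk", "bread"},
    {"burger", "beer"},
]

def support(itemset, transactions):
    """Fraction of transactions containing every item in `itemset`."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent, transactions):
    """Of the transactions containing the antecedent, the fraction
    that also contain the consequent."""
    return support(antecedent | consequent, transactions) / support(antecedent, transactions)

print(support({"onions", "potatoes", "burger"}, transactions))   # 0.4
print(confidence({"onions", "potatoes"}, {"burger"}, transactions))  # 0.666...
```

Here the rule {onions, potatoes} ⇒ {burger} has support 2/5 (both sides appear together in 2 of 5 transactions) and confidence 2/3 (of the 3 transactions containing onions and potatoes, 2 also contain burger).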
- ↑ Piatetsky-Shapiro, Gregory (1991), Discovery, analysis, and presentation of strong rules, in Piatetsky-Shapiro, Gregory; and Frawley, William J.; eds., Knowledge Discovery in Databases, AAAI/MIT Press, Cambridge, MA.
- ↑ Agrawal, Rakesh; Imieliński, Tomasz; and Swami, Arun (1993). “Mining Association Rules Between Sets of Items in Large Databases.” In: Proceedings of the ACM SIGMOD Conference (SIGMOD 1993). doi:10.1145/170035.170072.
2008
- (Parthasarathy, 2008) ⇒ Srinivasan Parthasarathy. (2008). “Association Rule Mining - Part 1.” Lecture Notes, CIS 674: Introduction to Datamining; The Ohio State University.
2000
- (Witten & Frank, 2000) ⇒ Ian H. Witten, and Eibe Frank. (2000). “Data Mining: Practical Machine Learning Tools and Techniques with Java implementations.” Morgan Kaufmann.
- QUOTE: … there are far more association rules than classification rules, and the challenge is to avoid being swamped with them. ...
- (Liu et al., 2000) ⇒ Bing Liu, Yiming Ma, and Ching Kian Wong. (2000). “Improving an Association Rule Based Classifier.” In: Proceedings of the 4th European Conference on Principles of Data Mining and Knowledge Discovery (PKDD 2000). doi:10.1007/3-540-45372-5
- Given a set of transactions [math]\displaystyle{ D }[/math] (the dataset), the problem of mining association rules is to discover all rules that have support and confidence greater than the user-specified minimum support (called minsup) and minimum confidence (called minconf). An efficient algorithm for mining association rules is the Apriori algorithm.
1998
- (Kohavi & Provost, 1998) ⇒ Ron Kohavi, and Foster Provost. (1998). “Glossary of Terms.” In: Machine Learning 30(2-3).
- Association learning: Techniques that find conjunctive implication rules of the form "X and Y implies A and B" (associations) that satisfy given criteria. The conventional association algorithms are sound and complete methods for finding all associations that satisfy criteria for minimum support (at least a specified fraction of the instances must satisfy both sides of the rule) and minimum confidence (at least a specified fraction of instances satisfying the left hand side, or antecedent, must satisfy the right hand side, or consequent).
1993
- (Agrawal et al., 1993) ⇒ Rakesh Agrawal, Tomasz Imieliński, and Arun Swami. (1993). “Mining Association Rules Between Sets of Items in Large Databases.” In: Proceedings of ACM SIGMOD Conference (SIGMOD 1993). doi:10.1145/170035.170072.