Formal Grammar Formalism
A Formal Grammar Formalism is a Formal Specification of a set of Formal Grammars that share a similar compositional structure and expressive power.
- AKA: Grammar Theory, Metagrammar, Grammar Formalism.
- Context:
- It can be a Regular Grammar.
- It can be a Context-Free Grammar.
- It can be a Mildly Context-Sensitive Grammar.
- It can be a Context-Sensitive Grammar.
- See: Formal Language.
- (Wikipedia, 2009) ⇒ http://en.wikipedia.org/wiki/Chomsky_hierarchy
- Within the field of computer science, specifically in the area of formal languages, the Chomsky hierarchy (occasionally referred to as Chomsky–Schützenberger hierarchy) is a containment hierarchy of classes of formal grammars.
- This hierarchy of grammars was described by Noam Chomsky in 1956 (see [1]). It is also named after Marcel-Paul Schützenberger, who played a crucial role in the development of the theory of formal languages.
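- As a compact summary (an editorial sketch in standard textbook notation, not part of the quoted article), the four grammar types in the hierarchy can be characterized by the shape of their production rules, with each class strictly containing the next; below, A and B denote nonterminals, a a terminal, and α, β, γ strings over terminals and nonterminals.

```latex
% Rule schemas of the Chomsky hierarchy (standard textbook formulation).
% A, B: nonterminals; a: a terminal; \alpha, \beta, \gamma: strings of terminals and nonterminals.
\begin{align*}
\text{Type 0 (unrestricted):}      &\quad \gamma \to \alpha \quad (\gamma \text{ contains at least one nonterminal})\\
\text{Type 1 (context-sensitive):} &\quad \alpha A \beta \to \alpha \gamma \beta \quad (\gamma \neq \varepsilon)\\
\text{Type 2 (context-free):}      &\quad A \to \gamma\\
\text{Type 3 (regular):}           &\quad A \to a \;\;\text{or}\;\; A \to aB
\end{align*}
\[
\textsf{Regular} \subset \textsf{Context-Free} \subset \textsf{Context-Sensitive} \subset \textsf{Recursively Enumerable}
\]
```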
- A formal grammar of this type consists of:
- a finite set of terminal symbols
- a finite set of nonterminal symbols
- a finite set of production rules, each with a left-hand side and a right-hand side consisting of a sequence of these symbols
- a start symbol
- A formal grammar defines (or generates) a formal language, which is a (usually infinite) set of finite-length sequences of symbols (i.e. strings) that may be constructed by applying production rules to another sequence of symbols which initially contains just the start symbol. A rule may be applied to a sequence of symbols by replacing an occurrence of the symbols on the left-hand side of the rule with those that appear on the right-hand side. A sequence of rule applications is called a derivation. Such a grammar defines the formal language: all words consisting solely of terminal symbols which can be reached by a derivation from the start symbol.
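- The four components and the notion of a derivation can be made concrete with a small editorial sketch in Python (the data representation and the `derive` helper are illustrative choices, not a standard API), using the classic context-free grammar S → aSb | ε for the language {aⁿbⁿ : n ≥ 0}:

```python
# Editorial sketch: the four grammar components and a breadth-first enumeration
# of derivations for the grammar S -> aSb | epsilon, whose language is {a^n b^n : n >= 0}.

from collections import deque

terminals    = {"a", "b"}        # finite set of terminal symbols
nonterminals = {"S"}             # finite set of nonterminal symbols (shown for completeness)
start_symbol = "S"               # start symbol
# production rules: left-hand side (a sequence of symbols) -> alternative right-hand sides
productions  = {("S",): [("a", "S", "b"), ()]}

def derive(max_length):
    """Enumerate all terminal strings of length <= max_length reachable
    from the start symbol by applying production rules (derivations)."""
    seen, words = set(), set()
    queue = deque([(start_symbol,)])
    while queue:
        form = queue.popleft()                     # current sentential form
        if all(sym in terminals for sym in form):  # only terminals left: a word of the language
            words.add("".join(form))
            continue
        for i in range(len(form)):                 # try every rule at every position
            for lhs, alternatives in productions.items():
                if form[i:i + len(lhs)] == lhs:
                    for rhs in alternatives:
                        new_form = form[:i] + rhs + form[i + len(lhs):]
                        # simple pruning bound, adequate for this particular grammar
                        if len(new_form) <= max_length + 1 and new_form not in seen:
                            seen.add(new_form)
                            queue.append(new_form)
    return sorted(words, key=len)

print(derive(6))   # ['', 'ab', 'aabb', 'aaabbb']
```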
- (Wikipedia, 2009) ⇒ http://en.wikipedia.org/wiki/Formal_grammar
- In formal semantics, computer science and linguistics, a formal grammar (also called formation rules) is a precise description of a formal language – that is, of a set of strings over some alphabet. In other words, a grammar describes which of the possible sequences of symbols (strings) in a language constitute valid words or statements in that language, but it does not describe their semantics (i.e. what they mean). The branch of mathematics that is concerned with the properties of formal grammars and languages is called formal language theory.
- A grammar is usually regarded as a means to generate all the valid strings of a language; it can also be used as the basis for a recognizer that determines for any given string whether it is grammatical (i.e. belongs to the language). To describe such recognizers, formal language theory uses separate formalisms, known as automata.
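- For example (an editorial sketch, not from the quoted text), the regular language (ab)*, generated by the right-linear grammar S → aB | ε, B → bS, can be recognized by a deterministic finite automaton; the state names and transition table below are hand-built for this one language, not produced by a general grammar-to-automaton construction.

```python
# Illustrative sketch: a deterministic finite automaton (a recognizer)
# for the regular language (ab)*, i.e. strings such as "", "ab", "abab", ...

TRANSITIONS = {
    ("start", "a"): "saw_a",
    ("saw_a", "b"): "start",
}
ACCEPTING = {"start"}

def recognizes(string: str) -> bool:
    """Return True iff the automaton accepts the string (i.e. it belongs to (ab)*)."""
    state = "start"
    for symbol in string:
        state = TRANSITIONS.get((state, symbol))
        if state is None:          # no transition defined: reject immediately
            return False
    return state in ACCEPTING

for s in ["", "ab", "abab", "aab", "ba"]:
    print(repr(s), recognizes(s))  # True, True, True, False, False
```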
- A grammar can also be used to analyze the strings of a language – i.e. to describe their internal structure. In computer science, this process is known as parsing. Most languages have very compositional semantics, i.e. the meaning of their utterances is structured according to their syntax; therefore, the first step to describing the meaning of an utterance in language is to analyze it and look at its analyzed form (known as its parse tree in computer science, and as its deep structure in generative grammar).
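- Continuing the earlier example grammar S → aSb | ε, a minimal recursive-descent parser (again an editorial sketch, not a method prescribed by the quoted text) recovers the internal structure of a string as a parse tree of nested tuples:

```python
# Illustrative recursive-descent parser for the grammar S -> aSb | epsilon.
# It returns the parse tree of the input as nested tuples, or raises on invalid input.

def parse(string: str):
    tree, rest = parse_S(string)
    if rest:                         # input not fully consumed
        raise SyntaxError(f"unexpected suffix: {rest!r}")
    return tree

def parse_S(s: str):
    """Parse one S. Returns (parse_tree, remaining_input)."""
    if s.startswith("a"):            # try the rule S -> a S b
        subtree, rest = parse_S(s[1:])
        if not rest.startswith("b"):
            raise SyntaxError("expected 'b'")
        return ("S", "a", subtree, "b"), rest[1:]
    return ("S",), s                 # otherwise use S -> epsilon

print(parse("aabb"))
# ('S', 'a', ('S', 'a', ('S',), 'b'), 'b')
```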