A '''filtered-popping recursive transition network''' ('''FPRTN'''),<ref name="sastre09jb">Javier M. Sastre, [http://dx.doi.org/10.1007/978-3-642-02979-0_28 "Efficient parsing using filtered-popping recursive transition networks"], ''Lecture Notes in Artificial Intelligence'', '''5642''':241-244, 2009</ref> or simply '''filtered-popping network''' ('''FPN'''), is a [[recursive transition network]] ([[Recursive transition network|RTN]])<ref name="woods70jb">William A. Woods, [http://doi.acm.org/10.1145/355598.362773 "Transition network grammars for natural language analysis"], ''Communications of the ACM'', ACM Press, '''13''':10:591-606, 1970</ref> extended with a map of states to keys, where returning from a [[subroutine]] jump requires the acceptor and return states to be mapped to the same key. [[Recursive transition network|RTNs]] can be seen as [[finite-state automata]] extended with a [[stack (data structure)|stack]] of return states: as well as consuming transitions and <math>\varepsilon</math>-transitions, RTNs may define call transitions. These transitions perform a [[subroutine]] jump by pushing the transition's target state onto the stack and bringing the machine to the called state. Each time an acceptor state is reached, the return state at the top of the stack is popped, provided that the stack is not empty, and the machine is brought to this state.
 
Throughout this article we refer to filtered-popping recursive transition networks as ''FPNs'', though this acronym is ambiguous (e.g., [[fuzzy Petri nets]]). ''Filtered-popping networks'' and ''FPRTNs'' are unambiguous alternatives.
 
==Formal definition==
An FPN is a structure <math>(Q, K, \Sigma, \delta, \kappa, Q_I, F)</math> (a data-type sketch follows the list) where
 
*<math>Q</math> is a finite set of states,
*<math>K</math> is a finite set of keys,
*<math>\Sigma</math> is a finite input alphabet,
*<math>\delta: Q \times (\Sigma \cup \{\varepsilon\} \cup Q) \to Q</math> is a partial transition function, <math>\varepsilon</math> being the empty symbol,
*<math>\kappa: Q \to K</math> is a map of states to keys,
*<math>Q_I \subseteq Q</math> is the set of initial states, and
*<math>F \subseteq Q</math> is the set of acceptance states.
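
The following is a minimal sketch of this tuple as a Python data type. The class and field names are illustrative choices, not notation from the cited papers, and the <code>check</code> method merely asserts the side conditions listed above.

<syntaxhighlight lang="python">
from dataclasses import dataclass

EPSILON = ""  # the empty symbol epsilon


@dataclass
class FPN:
    states: set     # Q, a finite set of states
    keys: set       # K, a finite set of keys
    alphabet: set   # Sigma, a finite input alphabet
    delta: dict     # partial transition function: (state, label) -> state
    kappa: dict     # map of states to keys: state -> key
    initial: set    # Q_I, the set of initial states
    final: set      # F, the set of acceptance states

    def check(self) -> None:
        """Assert that the components respect the formal definition."""
        assert self.initial <= self.states and self.final <= self.states
        assert set(self.kappa) == self.states          # kappa is total on Q
        assert set(self.kappa.values()) <= self.keys
        for (q_s, label), q_t in self.delta.items():
            assert q_s in self.states and q_t in self.states
            # each label lies in Sigma, is the empty symbol, or is a called state in Q
            assert label == EPSILON or label in self.alphabet or label in self.states
</syntaxhighlight>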
 
==Transitions==
Transitions represent the possibility of bringing the FPN from a source state <math>q_s</math> to a target state <math>q_t</math>, possibly performing an additional action. Depending on this action, we distinguish the following types of ''explicitly'' defined transitions:
 
*'''<math>\varepsilon</math>-transitions''' are transitions of the form <math>\delta(q_s,\varepsilon) \to q_t</math> and perform no additional action,
*'''consuming transitions''' are transitions of the form <math>\delta(q_s, \sigma) \to q_t</math> and consume an input symbol <math>\sigma</math>, and
*'''call transitions''' are transitions of the form <math>\delta(q_s, q_c) \to q_t</math> and perform a [[subroutine]] jump to the called state <math>q_c</math> before reaching <math>q_t</math>.
 
The behaviour of call transitions is governed by two kinds of ''implicitly'' defined transitions:
 
*for each call transition <math>\delta(q_s, q_c) \to q_t</math> the FPN implicitly defines a '''push transition''' that brings the machine from <math>q_s</math> to <math>q_c</math> by pushing <math>q_t</math> onto the [[stack (data structure)|stack]], and
*for each pair of states <math>(q_f, q_r) \in F \times Q</math> the FPN implicitly defines a '''pop transition''' that brings the machine from <math>q_f</math> to <math>q_r</math> by popping <math>q_r</math> from the stack [[iff]] <math>q_r</math> is the state at the top of the stack and <math>\kappa(q_f) = \kappa(q_r)</math>.
 
Push transitions initiate [[subroutine]] jumps, and pop transitions are equivalent to [[return statement]]s.
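
These semantics can be made concrete with a sketch of a single-step successor function on configurations, assuming the hypothetical <code>FPN</code> class above; a configuration pairs the current state with a stack of return states, and the filtered pop is the final clause.

<syntaxhighlight lang="python">
def successors(fpn, state, stack, symbol):
    """Yield one-step successor configurations (state, stack, consumed).

    `stack` is a tuple of return states; `consumed` is True only for
    consuming transitions on the next input symbol `symbol`.
    """
    # epsilon-transition: change state, no additional action
    if (state, EPSILON) in fpn.delta:
        yield fpn.delta[(state, EPSILON)], stack, False
    # consuming transition: consume the next input symbol
    if symbol is not None and (state, symbol) in fpn.delta:
        yield fpn.delta[(state, symbol)], stack, True
    # push transitions: for each call transition delta(q_s, q_c) -> q_t
    # defined on this state, move to the called state q_c and push q_t
    for (q_s, label), q_t in fpn.delta.items():
        if q_s == state and label in fpn.states:
            yield label, stack + (q_t,), False
    # filtered pop: from an acceptance state, pop the return state at the
    # top of the stack only if both states are mapped to the same key
    if state in fpn.final and stack:
        q_r = stack[-1]
        if fpn.kappa[state] == fpn.kappa[q_r]:
            yield q_r, stack[:-1], False
</syntaxhighlight>

A pop whose keys differ is simply unavailable rather than delayed; this filter is the only point where an FPN's behaviour departs from that of a plain RTN.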
 
==Purpose==
A ([[natural language]]) text can be enriched with meta-information by applying an [[RTN with output]]; for instance, an RTN inserting [[XML]] tags can be used to transform [[plain text]] into a structured XML document. An RTN with output representing a [[natural language]] [[grammar]] would delimit each sentence of the text and add its syntactic structure (see [[parsing]]). Other RTNs with output could simply mark text segments containing relevant information (see [[information extraction]]). Applying an RTN with output that represents an [[ambiguous grammar]] yields a set of possible translations or interpretations of the input. Computing this set has an exponential [[worst-case cost]], even for an [[Earley parser]] for RTNs with output,<ref name="sastre09ja">Javier M. Sastre & Mikel L. Forcada, [http://dx.doi.org/10.1007/978-3-642-04235-5_17 "Efficient parsing using recursive transition networks with output"], ''Lecture Notes in Computer Science'', '''5603''':192-204, 2009</ref> due to cases in which the number of translations increases exponentially [[wikt:WRT|w.r.t.]] the input length; for instance, the number of interpretations of a [[natural language]] sentence increases exponentially w.r.t. the number of unresolved [[prepositional phrase]] attachments (a short counting sketch follows the examples below):<ref name="ratnaparkhi98ip">Adwait Ratnaparkhi, [http://www.aclweb.org/anthology/P/P98/P98-2177.pdf "Statistical models for unsupervised prepositional phrase attachment"], ACL-36: Proceedings of the 36th Annual Meeting of the Association for Computational Linguistics and 17th International Conference on Computational Linguistics, pp. 1079-1085, 1998</ref><ref>Miriam Butt, [http://ling.uni-konstanz.de/pages/home/butt/main/material/chunk.pdf "Chunk/Shallow parsing"], lecture notes, 2002</ref>
* in the sentence ''the girl saw the monkey with the telescope'', it is unknown whether the girl used the telescope or the monkey was holding it (2<sup>1</sup> interpretations),
* in the sentence ''the girl saw the monkey with the telescope in the garden'', it is additionally unknown whether the monkey was in the garden or the action took place in the garden (2<sup>2</sup> interpretations),
* in the sentence ''the girl saw the monkey with the telescope in the garden under the tree'', it is further unknown whether the monkey was under the tree or the action took place under the tree (2<sup>3</sup> interpretations),
* etc.
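
The doubling can be checked mechanically: each unresolved prepositional phrase attaches independently to one of two sites, so the readings form a Cartesian product of binary choices. The following sketch is purely illustrative; the attachment-site labels ''NP'' and ''VP'' are simplifying assumptions, not the output of any parser.

<syntaxhighlight lang="python">
from itertools import product

phrases = ["with the telescope", "in the garden", "under the tree"]
# each prepositional phrase may modify either the noun phrase or the verb phrase
readings = list(product(*[[(pp, "NP"), (pp, "VP")] for pp in phrases]))
assert len(readings) == 2 ** len(phrases)  # 2^3 = 8 interpretations
</syntaxhighlight>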
 
FPNs serve as a compact representation of this set of translations, allowing it to be computed in cubic time by means of an Earley-like parser.<ref name="sastre09jb"/> FPN states correspond to execution states (see [[instruction steps]]) of an Earley parser for [[RTNs]] ''without'' output, and FPN transitions correspond to possible translations of input symbols. The <math>\kappa</math> map of the resulting FPN gives the correspondence between the represented output segments and the recognized input segments: given a recognized input sequence <math>\sigma_1\ldots\sigma_l</math> and an FPN path <math>p</math> starting at a state <math>q</math> and ending at a state <math>q^\prime</math>, <math>p</math> represents a possible translation of the input segment <math>\sigma_{\kappa(q)+1}\ldots\sigma_{\kappa(q^\prime)}</math>. The filtered-popping feature is required to prevent FPN paths from representing translations of ''disconnected'' or ''overlapping'' input segments: an FPN call may contain several translation paths from the called state to an acceptor state, and the input segments these paths correspond to share the same start point but do not necessarily have the same length. Only return states corresponding to the same input point as the acceptor state finishing the call are ''valid'' return states.
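
Under this interpretation of keys as input positions, and reusing the hypothetical <code>FPN</code> class sketched earlier, the segment correspondence reduces to a slice of the input sequence. The helper below is an illustrative assumption, not part of the cited formalism.

<syntaxhighlight lang="python">
def covered_segment(fpn, tokens, q, q_prime):
    """Input segment translated by any FPN path from q to q_prime.

    With keys read as input positions, a path from q to q_prime represents
    a translation of sigma_{kappa(q)+1} ... sigma_{kappa(q_prime)} (1-based),
    i.e. a contiguous slice of the input in 0-based Python indexing.
    """
    return tokens[fpn.kappa[q]:fpn.kappa[q_prime]]
</syntaxhighlight>

Because a pop is only permitted when the acceptor and return states carry the same position, every complete path covers one contiguous slice of the input.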
 
==References==
{{reflist}}
 
[[Category:Natural language processing]]
[[Category:Computational linguistics]]
