Accurate Computation of the Relative Entropy Between Stochastic Regular Grammars

Author

  • Rafael C. Carrasco
Abstract

Works dealing with grammatical inference of stochastic grammars often evaluate the relative entropy between the model and the true grammar by means of large test sets generated with the true distribution. In this paper, an iterative procedure to compute the relative entropy between two stochastic deterministic regular grammars is proposed.

Résumé (translated from French): Works on the inference of stochastic grammars evaluate the relative entropy between the model and the true grammar using large test sets generated with the correct distribution. In this article, an iterative procedure is proposed to compute the relative entropy between two grammars.
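The kind of iterative computation the abstract describes can be illustrated as follows. This is a minimal sketch, not the paper's algorithm: it assumes both grammars are given as deterministic probabilistic finite automata (the `trans`/`stop` dictionary encoding is a hypothetical choice of this sketch) and accumulates expected visit counts on the product automaton by fixed-point iteration; the relative entropy is then a weighted sum of per-state divergences.

```python
import math

# Hedged sketch: exact relative entropy D(p || q) between two
# deterministic probabilistic finite automata (DPFAs), computed
# iteratively rather than by sampling a large test set.
#
# Encoding (an assumption of this sketch):
#   trans[s] = {symbol: (probability, next_state)}
#   stop[s]  = halting probability at state s
# At each state, the symbol probabilities plus stop sum to 1.

def relative_entropy(trans_p, stop_p, trans_q, stop_q, iters=500):
    """KL divergence D(p || q) between two DPFAs with start state 0.

    Assumes q assigns positive probability wherever p does, and that
    the expected string length under p is finite (so the iteration
    converges).
    """
    # Fixed-point iteration for the expected visit counts of product
    # states: c = start + step(c), where step pushes probability mass
    # along the transitions of p through the product automaton.
    c = {(0, 0): 1.0}
    for _ in range(iters):
        new = {(0, 0): 1.0}
        for (i, j), w in c.items():
            for sym, (p, ni) in trans_p[i].items():
                _, nj = trans_q[j][sym]
                new[(ni, nj)] = new.get((ni, nj), 0.0) + w * p
        c = new

    # Each visited product state contributes its local divergence,
    # weighted by how often p is expected to visit it.
    kl = 0.0
    for (i, j), w in c.items():
        local = stop_p[i] * math.log(stop_p[i] / stop_q[j])
        for sym, (p, _) in trans_p[i].items():
            q, _ = trans_q[j][sym]
            local += p * math.log(p / q)
        kl += w * local
    return kl
```

For two one-state automata over {a} with p emitting `a` with probability 0.5 (stop 0.5) and q emitting `a` with probability 0.3 (stop 0.7), the exact value is ln(25/21) ≈ 0.1744 nats, which the iteration reproduces.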


Similar articles


Relative Entropy Rate between a Markov Chain and Its Corresponding Hidden Markov Chain

In this paper we study the relative entropy rate between a homogeneous Markov chain and a hidden Markov chain defined by observing the output of a discrete stochastic channel whose input is the finite-state homogeneous stationary Markov chain. For this purpose, we obtain the relative entropy between two finite subsequences of the above-mentioned chains with the help of the definition of...
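For the simpler case where both processes are plain Markov chains (no hidden layer), the relative entropy rate has a classical closed form: the stationary distribution of the first chain weights the per-row KL divergences of the two transition matrices. A small sketch under that assumption, not the Markov-vs-hidden-Markov setting of the paper above:

```python
import math

# Hedged sketch: relative entropy rate between two homogeneous Markov
# chains P and Q on the same finite state space, using the classical
# closed form
#   rate = sum_i pi_i * sum_j P_ij * log(P_ij / Q_ij),
# where pi is the stationary distribution of P.

def stationary(P, iters=500):
    """Stationary distribution of a row-stochastic matrix P (power iteration)."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def kl_rate(P, Q):
    """Relative entropy rate D(P || Q) in nats.

    Assumes P is ergodic and Q_ij > 0 wherever P_ij > 0.
    """
    pi = stationary(P)
    n = len(P)
    return sum(
        pi[i] * P[i][j] * math.log(P[i][j] / Q[i][j])
        for i in range(n)
        for j in range(n)
        if P[i][j] > 0
    )
```

For P = [[0.9, 0.1], [0.2, 0.8]] against the uniform chain Q = [[0.5, 0.5], [0.5, 0.5]], the stationary distribution of P is (2/3, 1/3) and the rate comes out to about 0.3096 nats per step.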


Inferring stochastic regular grammars with recurrent neural networks

Recent work has shown that the extraction of symbolic rules improves the generalization performance of recurrent neural networks trained with complete (positive and negative) samples of regular languages. This paper explores the possibility of inferring the rules of the language when the network is trained instead with stochastic, positive-only data. For this purpose, a recurrent network with t...


Learning stochastic regular grammars with recurrent neural networks

Recent work has shown that the extraction of symbolic rules improves the generalization power of recurrent neural networks trained with complete samples of regular languages. This paper explores the possibility of learning rules when the network is trained with stochastic data. For this purpose, a network with two layers is used. If an automaton is extracted from the network after training and ...


Probabilistic regular graphs

Deterministic graph grammars generate regular graphs, that form a structural extension of configuration graphs of pushdown systems. In this paper, we study a probabilistic extension of regular graphs obtained by labelling the terminal arcs of the graph grammars by probabilities. Stochastic properties of these graphs are expressed using PCTL, a probabilistic extension of computation tree logic. ...



Journal:
  • ITA

Volume 31, Issue 

Pages -

Publication date 1997