Information-theoretic characterizations of conditional mutual independence and Markov random fields
Authors
Abstract
We take the point of view that a Markov random field is a collection of so-called full conditional mutual independencies. Using the theory of the I-Measure, we obtain a number of fundamental characterizations related to conditional mutual independence and Markov random fields. We show that many aspects of conditional mutual independence and Markov random fields admit very simple set-theoretic descriptions, yielding new insights into their structure. Our results have immediate applications to the implication problem of probabilistic conditional independency and to relational databases. Toward the end of the paper, we obtain a hypergraph characterization of a Markov random field, which makes it legitimate to view a Markov random field as a hypergraph. Based on this result, we naturally employ Graham Reduction, a tool from relational database theory, to recognize a Markov forest. This connection between Markov random fields and hypergraphs sheds some light on the possible role of hypergraph theory in the study of Markov random fields.
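Graham Reduction, mentioned above, is a simple fixed-point procedure on hypergraphs: repeatedly delete any vertex that occurs in exactly one hyperedge, and delete any hyperedge contained in another; the hypergraph is acyclic exactly when the procedure terminates with nothing left. A minimal sketch, assuming hyperedges are given as sets of vertices (the function name and the example hypergraphs are illustrative, not from the paper):

```python
from collections import Counter

def graham_reduce(hyperedges):
    """Apply Graham Reduction until no rule fires.

    Rule 1: delete any vertex occurring in exactly one hyperedge.
    Rule 2: delete any hyperedge contained in another hyperedge.
    Returns the residual edges; an empty list means the hypergraph is acyclic.
    """
    edges = [set(e) for e in hyperedges]
    changed = True
    while changed:
        changed = False
        # Rule 1: strip vertices that appear in exactly one hyperedge.
        counts = Counter(v for e in edges for v in e)
        for e in edges:
            lone = {v for v in e if counts[v] == 1}
            if lone:
                e -= lone
                changed = True
        # Rule 2: drop hyperedges contained in another, and drop empties.
        kept = []
        for i, e in enumerate(edges):
            if not e or any(i != j and e <= f for j, f in enumerate(edges)):
                changed = True
            else:
                kept.append(e)
        edges = kept
    return edges

# An acyclic hypergraph reduces to nothing:
print(graham_reduce([{1, 2}, {2, 3}, {3, 4}]))  # []
# A cyclic one leaves a residue:
print(graham_reduce([{1, 2}, {2, 3}, {1, 3}]))  # three edges survive
```

Under the paper's hypergraph characterization, running this test on the hypergraph associated with a Markov random field is what recognizes a Markov forest.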
Similar articles
Probabilistic Sufficiency and Algorithmic Sufficiency from the point of view of Information Theory
Given the importance of Markov chains in information theory, conditional probability for these random processes can also be characterized in terms of mutual information. In this paper, the relationship between the concept of sufficiency and Markov chains from the perspective of information theory, and the relationship between probabilistic sufficiency and algorithmic sufficien...
Information-Theoretic Inference of Common Ancestors
A directed acyclic graph (DAG) partially represents the conditional independence structure among observations of a system if the local Markov condition holds, that is if every variable is independent of its non-descendants given its parents. In general, there is a whole class of DAGs that represents a given set of conditional independence relations. We are interested in properties of this class...
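The local Markov condition described here can be made concrete: given a DAG, each variable is independent of its non-descendants (other than its parents) given its parents. A small sketch that enumerates these implied independence statements, assuming the DAG is a dict mapping each node to its children (the function names and example DAG are illustrative, not from the paper):

```python
def descendants(dag, v):
    """All vertices reachable from v along directed edges (excluding v)."""
    seen, stack = set(), [v]
    while stack:
        for w in dag.get(stack.pop(), []):
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return seen

def local_markov_statements(dag):
    """For each X, return (non-descendants of X minus parents, parents of X),
    i.e. the statement X _||_ ND(X) \\ Pa(X) | Pa(X)."""
    nodes = set(dag) | {w for ws in dag.values() for w in ws}
    parents = {n: {u for u in nodes if n in dag.get(u, [])} for n in nodes}
    return {
        x: (nodes - descendants(dag, x) - {x} - parents[x], parents[x])
        for x in nodes
    }

# Chain A -> B -> C: C is independent of A given its parent B.
dag = {"A": ["B"], "B": ["C"]}
stmts = local_markov_statements(dag)
print(stmts["C"])  # ({'A'}, {'B'})
```

Any two DAGs producing the same set of such statements (up to the usual semi-graphoid closure) represent the same independence model, which is why a whole class of DAGs can encode one set of relations.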
Application of Pretrained Deep Neural Networks to Large Vocabulary Speech Recognition
The use of Deep Belief Networks (DBN) to pretrain Neural Networks has recently led to a resurgence in the use of Artificial Neural Network Hidden Markov Model (ANN/HMM) hybrid systems for Automatic Speech Recognition (ASR). In this paper we report results of a DBN-pretrained context-dependent ANN/HMM system trained on two datasets that are much larger than any reported previously with DBN-pretr...
Learning non-parametric Markov networks with mutual information
We propose a method for learning Markov network structures for continuous data without invoking any assumptions about the distribution of the variables. The method makes use of previous work on a non-parametric estimator for mutual information which is used to create a non-parametric test for multivariate conditional independence. This independence test is then combined with an efficient constr...
Conditional Random Fields: Probabilistic Models for Segmenting and Labeling Sequence Data
We present conditional random fields, a framework for building probabilistic models to segment and label sequence data. Conditional random fields offer several advantages over hidden Markov models and stochastic grammars for such tasks, including the ability to relax strong independence assumptions made in those models. Conditional random fields also avoid a fundamental limitation of maximum e...
Journal: IEEE Trans. Information Theory
Volume: 48, Issue: -
Pages: -
Publication year: 2002