Fast Inference in Infinite Hidden Relational Models
Authors
Abstract
Relational learning (Dzeroski & Lavrac, 2001; Friedman et al., 1999; Raedt & Kersting, 2003) is an area of growing interest in machine learning. Xu et al. (2006) introduced the infinite hidden relational model (IHRM), which views relational learning in the context of the entity-relationship database model with entities, attributes, and relations (compare also Kemp et al. (2006)). In the IHRM, an auxiliary latent variable is introduced for each entity. The latent variable is the only parent of the attributes of the entity and is a parent of the attributes of the relations in which the entity participates. The number of hidden states is entity-class specific. It is therefore sensible to work with Dirichlet process (DP) mixture models, in which each entity class can optimize its own representational complexity in a self-organized way. For our discussion it is sufficient to say that we integrate a DP mixture model into the IHRM by simply letting the number of hidden states for each entity class approach infinity. A natural outcome of the IHRM is thus a clustering effect that provides interesting insight into the structure of the domain.
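To make the generative structure described above concrete, the following is a minimal sketch of an IHRM-style generative process, not the authors' implementation. It assumes (beyond what the abstract states) two entity classes, "user" and "item", a single binary relation between them, Chinese-restaurant-process draws in place of the per-class DP mixtures, and Beta-Bernoulli relation attributes; helper names such as crp_assign and sample_ihrm_relation are hypothetical and introduced purely for illustration.

```python
# Illustrative sketch only: CRP approximations of the per-entity-class DP
# mixtures, and a binary relation whose attribute depends solely on the pair
# of latent cluster assignments, as in the IHRM's graphical structure.
import numpy as np

rng = np.random.default_rng(0)


def crp_assign(n_entities, alpha):
    """Assign entities of one class to clusters via a CRP(alpha): each entity
    joins an existing cluster with probability proportional to its size, or
    opens a new cluster with probability proportional to alpha."""
    assignments = []
    counts = []  # counts[k] = number of entities already in cluster k
    for _ in range(n_entities):
        probs = np.array(counts + [alpha], dtype=float)
        probs /= probs.sum()
        k = rng.choice(len(probs), p=probs)
        if k == len(counts):      # a new cluster is opened
            counts.append(1)
        else:
            counts[k] += 1
        assignments.append(k)
    return np.array(assignments)


def sample_ihrm_relation(n_users, n_items, alpha_u=1.0, alpha_i=1.0,
                         beta_a=1.0, beta_b=1.0):
    """Generate one binary relation matrix R (users x items). The latent
    cluster is the only parent of the relation attribute, so R[u, i]
    depends only on (z_u[u], z_i[i])."""
    z_u = crp_assign(n_users, alpha_u)    # latent states, user class
    z_i = crp_assign(n_items, alpha_i)    # latent states, item class
    K_u, K_i = z_u.max() + 1, z_i.max() + 1
    # One Bernoulli parameter per pair of clusters, drawn from a Beta prior.
    theta = rng.beta(beta_a, beta_b, size=(K_u, K_i))
    R = rng.binomial(1, theta[z_u[:, None], z_i[None, :]])
    return R, z_u, z_i


R, z_u, z_i = sample_ihrm_relation(n_users=20, n_items=30)
print(R.shape, z_u.max() + 1, z_i.max() + 1)
```

Because each entity class runs its own CRP, the number of clusters per class is unbounded a priori and adapts to the data, which is the self-organized choice of representational complexity referred to in the abstract.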
Similar Papers
Learning Infinite Hidden Relational Models
Relational learning analyzes the probabilistic constraints between the attributes of entities and relationships. We extend the expressiveness of relational models by introducing for each entity (or object) an infinite-state latent variable as part of a Dirichlet process (DP) mixture model. It can be viewed as a relational generalization of a hidden Markov random field. The information propagates ...
Distributed Relational State Representations for Complex Stochastic Processes
Several promising variants of hidden Markov models (HMMs) have recently been developed to efficiently deal with large state and observation spaces and relational structure. Many application domains, however, have an a priori componential structure, such as parts in musical scores. In this case, exact inference within relational HMMs still grows exponentially in the number of components. In this p...
Distributed Relational State Representations for Complex Stochastic Processes (Extended Abstract)
Several promising variants of hidden Markov models (HMMs) have recently been developed to efficiently deal with large state and observation spaces and relational structure. Many application domains, however, have an a priori componential structure, such as parts in musical scores. In this case, exact inference within relational HMMs still grows exponentially in the number of components. In this p...
Collapsed Variational Bayes Inference of Infinite Relational Model
The Infinite Relational Model (IRM) is a probabilistic model for relational data clustering that partitions objects into clusters based on observed relationships. This paper presents Averaged CVB (ACVB) solutions for the IRM: convergence-guaranteed and practically useful fast Collapsed Variational Bayes (CVB) inferences. We first derive ordinary CVB and CVB0 for the IRM based on the lower bound maximiz...
Nonparametric Relational Learning for Social Network Analysis
Social networks usually involve rich collections of objects that are jointly linked into complex relational networks. Social network analysis has gained in importance due to the growing availability of data on novel social networks, e.g., citation networks, Web 2.0 social networks such as Facebook, and the hyperlinked Internet. Recently, the infinite hidden relational model (IHRM) has been develo...