Distinct word length frequencies: distributions and symbol entropies
Abstract
The distribution of frequency counts of distinct words by length in a language's vocabulary is analyzed using two methods. The first examines the empirical distributions of several languages and derives a distribution that reasonably explains the number of distinct words as a function of length; from the marginal probabilities of letters and spaces, the frequency count, mean word length, and variance of word length can be derived. The second, based on information theory, demonstrates that conditional entropies can likewise be used to estimate the frequency of distinct words of a given length in a language. In addition, it is shown how these techniques can be applied to estimate higher-order entropies from vocabulary word lengths.
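The empirical quantities the abstract works with (the frequency count of distinct words at each length, plus the mean and variance of word length over the vocabulary) can be sketched as follows; this is a minimal illustration with a toy vocabulary, not the paper's own fitting procedure:

```python
from collections import Counter

def word_length_stats(vocabulary):
    """Frequency count of distinct words by length, with mean and variance
    of word length taken over the set of distinct words."""
    lengths = [len(w) for w in set(vocabulary)]  # distinct words only
    counts = Counter(lengths)                    # length -> number of distinct words
    n = len(lengths)
    mean = sum(lengths) / n
    var = sum((l - mean) ** 2 for l in lengths) / n
    return counts, mean, var

# Toy example (hypothetical data): repeated tokens collapse to distinct words.
vocab = ["the", "cat", "sat", "on", "a", "mat", "the", "catalog"]
counts, mean, var = word_length_stats(vocab)
print(dict(sorted(counts.items())))  # {1: 1, 2: 1, 3: 4, 7: 1}
```

Fitting a parametric model to `counts` as a function of length, as the first method of the paper does, would then reduce to estimating the marginal probabilities of letters and the space symbol from such empirical counts.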
Similar articles
Word-length entropies and correlations of natural language written texts
We study the frequency distributions and correlations of the word lengths of ten European languages. Our findings indicate that a) the word-length distribution of short words quantified by the mean value and the entropy distinguishes the Uralic (Finnish) corpus from the others, b) the tails at long words, manifested in the high-order moments of the distributions, differentiate the Germanic lang...
Generalized Entropies and the Similarity of Texts
We show how generalized Gibbs-Shannon entropies can provide new insights on the statistical properties of texts. The universal distribution of word frequencies (Zipf’s law) implies that the generalized entropies, computed at the word level, are dominated by words in a specific range of frequencies. Here we show that this is the case not only for the generalized entropies but also for the genera...
Can Zipf Analyses and Entropy Distinguish between Artificial and Natural Language Texts?
We study statistical properties of natural texts written in English and of two types of artificial texts. As statistical tools we use the conventional and the inverse Zipf analyses, the Shannon entropy and a quantity which is a nonlinear function of the word frequencies, the frequency relative "entropy". Our results obtained by investigating eight complete books and sixteen related artificial te...
Similarity of symbol frequency distributions with heavy tails
Quantifying the similarity between symbolic sequences is a traditional problem in Information Theory which requires comparing the frequencies of symbols in different sequences. In numerous modern applications, ranging from DNA over music to texts, the distribution of symbol frequencies is characterized by heavy-tailed distributions (e.g., Zipf’s law). The large number of low-frequency symbols i...
On the similarity of symbol frequency distributions with heavy tails
Quantifying the similarity between symbolic sequences is a traditional problem in information theory which requires comparing the frequencies of symbols in different sequences. In numerous modern applications, ranging from DNA over music to texts, the distribution of symbol frequencies is characterized by heavy-tailed distributions (e.g., Zipf’s law). The large number of low-frequency symbols i...
Journal: Glottometrics
Volume: 23
Pages: -
Published: 2012