Jointly distributed random variables

Abstract

Similarly, we can obtain the distribution function of Y directly from the joint distribution function of X and Y: F_Y(y) = lim_{x→∞} F(x, y) = F(∞, y). The distribution functions F_X and F_Y are sometimes called the marginal distribution functions of X and Y, respectively. The joint distribution function F of X and Y contains all the statistical information about X and Y. In particular, given F, we can calculate the probability of any event defined in terms of X and Y. For instance, for any real numbers a ≤ b and c ≤ d, we have P(a < X ≤ b, c < Y ≤ d) = F(b, d) − F(a, d) − F(b, c) + F(a, c).
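The rectangle-probability identity above can be checked numerically. A minimal sketch, assuming (as a hypothetical example, not taken from the text) that X and Y are independent Exponential(1) variables so the joint CDF factorizes; note the inclusion-exclusion identity itself does not require independence:

```python
import math

def F(x, y):
    """Joint CDF of two independent Exponential(1) variables (hypothetical example)."""
    Fx = 1.0 - math.exp(-x) if x > 0 else 0.0
    Fy = 1.0 - math.exp(-y) if y > 0 else 0.0
    return Fx * Fy

def rect_prob(F, a, b, c, d):
    """P(a < X <= b, c < Y <= d) computed from the joint CDF by inclusion-exclusion."""
    return F(b, d) - F(a, d) - F(b, c) + F(a, c)

p = rect_prob(F, 0.5, 1.5, 0.2, 2.0)

# Cross-check: for independent coordinates the rectangle probability
# also equals (F_X(b) - F_X(a)) * (F_Y(d) - F_Y(c)).
px = math.exp(-0.5) - math.exp(-1.5)
py = math.exp(-0.2) - math.exp(-2.0)
assert abs(p - px * py) < 1e-12
```

The inclusion-exclusion pattern (add the two "outer" corners, subtract the two "mixed" ones) is exactly what makes the four-term formula yield a non-negative probability for any valid joint CDF.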


Related articles

The Joint Distribution Criterion and the Distance Tests for Selective Probabilistic Causality

A general definition and a criterion (a necessary and sufficient condition) are formulated for an arbitrary set of external factors to selectively influence a corresponding set of random entities (generalized random variables, with values in arbitrary observation spaces), jointly distributed at every treatment (a set of factor values containing precisely one value of each factor). The random en...


Order-distance and other metric-like functions on jointly distributed random variables

We construct a class of real-valued nonnegative binary functions on a set of jointly distributed random variables, which satisfy the triangle inequality and vanish at identical arguments (pseudo-quasi-metrics). We apply these functions to the problem of selective probabilistic causality encountered in behavioral sciences and in quantum physics. The problem reduces to that of ascertaining the ex...


Random Utility Representations of Finite m-ary Relations

Block and Marschak (1960, in Olkin et al. (Eds.), Contributions to probability and statistics (pp. 97-132). Stanford, CA: Stanford Univ. Press) discussed the relationship between a probability distribution over the strict linear rankings on a finite set C and a family of jointly distributed random variables indexed by C. The present paper generalizes the concept of random variable (random utili...


On the bounds in Poisson approximation for independent geometric distributed random variables

The main purpose of this note is to establish some bounds in Poisson approximation for row-wise arrays of independent geometrically distributed random variables using the operator method. Some results related to random sums of independent geometrically distributed random variables are also investigated.


Circularly-Symmetric Gaussian random vectors

A number of basic properties about circularly-symmetric Gaussian random vectors are stated and proved here. These properties are each probably well known to most researchers who work with Gaussian noise, but I have not found them stated together with simple proofs in the literature. They are usually viewed as too advanced or too detailed for elementary texts but are used (correctly or incorrect...


On the Entropy Region of Gaussian Random Variables

Given n (discrete or continuous) random variables X_i, the (2^n − 1)-dimensional vector obtained by evaluating the joint entropy of all non-empty subsets of {X_1, ..., X_n} is called an entropic vector. Determining the region of entropic vectors is an important open problem with many applications in information theory. Recently, it has been shown that the entropy regions for discrete and continuo...
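The construction of an entropic vector can be illustrated for a small discrete case. A minimal sketch, assuming a hypothetical joint distribution given as a list of equally likely outcome tuples (this example is not from the cited paper):

```python
from itertools import combinations
from collections import Counter
import math

def entropy(samples):
    """Shannon entropy (in bits) of the empirical distribution of the given tuples."""
    counts = Counter(samples)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def entropic_vector(joint):
    """joint: list of equally likely outcome tuples (x_1, ..., x_n).
    Returns a dict mapping each non-empty index subset S to H(X_S);
    there are 2^n - 1 such subsets, matching the dimension quoted above."""
    n = len(joint[0])
    vec = {}
    for k in range(1, n + 1):
        for S in combinations(range(n), k):
            vec[S] = entropy([tuple(row[i] for i in S) for row in joint])
    return vec

# Example: X1 a fair bit, X2 = X1 (fully dependent), X3 an independent fair bit.
joint = [(0, 0, 0), (0, 0, 1), (1, 1, 0), (1, 1, 1)]
v = entropic_vector(joint)
# H(X1) = H(X2) = H(X3) = 1, H(X1,X2) = 1, H(X1,X3) = 2, H(X1,X2,X3) = 2
```

Here the dependence X2 = X1 shows up as H(X1, X2) = H(X1), while the independent X3 adds a full bit to every subset it joins.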



Publication date: 2010