# 多变量分布

## 联合分布 Joint Distributions

### 联合离散分布 Joint Discrete Distribution

Definition Joint Distribution Function/c.d.f.: The joint c.d.f. of $n$ random variables $X_1,\dots ,X_n$ is the function $F$ whose value at every point $(x_1,\dots ,x_n)$ in $n$-dimensional space $\mathbb{R}^n$ is specified by the relation
$$F(x_1,\dots , x_n)=Pr(X_1\leq x_1,X_2\leq x_2,\dots ,X_n\leq x_n)$$

For example, the following function is a joint c.d.f. on $\mathbb{R}^3$:
$$F(x_1,x_2,x_3)=\begin{cases}(1-e^{-x_1})(1-e^{-2x_2})(1-e^{-3x_3})&\text{ for }x_1,x_2,x_3\geq 0\\ 0&\text{otherwise} \end{cases}$$
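A joint c.d.f. can be evaluated numerically as well. The sketch below (the function name `joint_cdf` is our own) implements the example above and checks two defining properties: $F$ vanishes when any argument is negative, and $F\to 1$ as all arguments grow.

```python
import math

def joint_cdf(x1, x2, x3):
    """Joint c.d.f. of the example: a product of three exponential c.d.f.s."""
    if x1 >= 0 and x2 >= 0 and x3 >= 0:
        return (1 - math.exp(-x1)) * (1 - math.exp(-2 * x2)) * (1 - math.exp(-3 * x3))
    return 0.0

print(joint_cdf(-1.0, 1.0, 1.0))   # 0.0: outside the support
print(joint_cdf(50.0, 50.0, 50.0)) # ~1.0: all mass captured
```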

Definition Joint Discrete Distribution/p.f.: It is said that $n$ random variables $X_1,\dots ,X_n$ have a discrete joint distribution if the random vector $(X_1,\dots ,X_n)$ can have only a finite number or an infinite sequence of different possible values $(x_1,\dots,x_n)$ in $\mathbb{R}^n$. The joint p.f. of $X_1,\dots,X_n$ is then defined as the function $f$ such that for every point $(x_1,\dots,x_n)\in \mathbb{R}^n$,
$$f(x_1,\dots,x_n)=Pr(X_1=x_1,\dots,X_n=x_n)$$

$$f(\vec{x})=Pr(\vec{X}=\vec{x})$$

Theorem If $\vec{X}$ has a joint discrete distribution with joint p.f. $f$, then for every subset $C\subset \mathbb{R}^n$,
$$Pr(\vec{X}\in C)=\sum_{\vec{x}\in C}f(\vec{x})$$
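This theorem is just a finite (or countable) sum in practice. As a minimal sketch, assuming a toy joint p.f. of our own choosing (two fair coin flips coded as 0/1), the probability of an event $C$ is the sum of $f$ over the points in $C$:

```python
# Toy joint p.f. for (X1, X2): each of the four points gets probability 1/4.
pf = {(x1, x2): 0.25 for x1 in (0, 1) for x2 in (0, 1)}

def prob(event):
    """Pr(vec(X) in C) computed by summing f over the points satisfying the event."""
    return sum(p for point, p in pf.items() if event(point))

# Event C = {at least one coordinate equals 1}.
print(prob(lambda x: 1 in x))  # 0.75
```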

### 联合连续分布 Joint Continuous Distribution

Definition Continuous Distribution/p.d.f.: It is said that $n$ random variables $X_1,\dots,X_n$ have a continuous joint distribution if there is a nonnegative function $f$ defined on $\mathbb{R}^n$ such that for every subset $C\subset \mathbb{R}^n$,
$$Pr[(X_1,\dots,X_n)\in C ]=\underbrace{\int\dots\int}_{C}f(x_1,\dots,x_n)dx_1\dots dx_n$$

$$Pr[\vec{X}\in C ]=\underbrace{\int\dots\int}_{C}f(\vec{x})d\vec{x}$$

Theorem If the joint distribution of $X_1,\dots,X_n$ is continuous, then the joint p.d.f. $f$ can be derived from the joint c.d.f. $F$ by using the relation
$$f(x_1,\dots,x_n)=\frac{\partial^nF(x_1,\dots,x_n)}{\partial x_1\dots \partial x_n}$$
at all points $(x_1,\dots,x_n)$ at which the derivative in this relation exists.

$$F(x_1,x_2,x_3)=\begin{cases}(1-e^{-x_1})(1-e^{-2x_2})(1-e^{-3x_3})&\text{ for }x_1,x_2,x_3\geq 0\\ 0&\text{otherwise} \end{cases}$$

$$f(x_1,x_2,x_3)=\begin{cases}6e^{-x_1-2x_2-3x_3}&\text{ for }x_1,x_2,x_3> 0\\ 0&\text{otherwise} \end{cases}$$

## 混合分布 Mixed Distributions

Definition Joint p.f./p.d.f.: Let $X_1,\dots ,X_n$ be random variables, some of which have a continuous joint distribution and some of which have discrete distributions. Their joint distribution is then represented by a function $f$ that we call the joint p.f./p.d.f. This function has the property that the probability that $\vec{X}$ lies in a subset $C \subset \mathbb{R}^n$ is calculated by summing $f(\vec{x})$ over the coordinates of $\vec{x}$ that correspond to the discrete random variables and integrating over those coordinates that correspond to the continuous random variables, for all points $\vec{x}\in C$.

## 边缘分布 Marginal Distributions

### 计算边缘概率密度函数 Deriving a Marginal p.d.f.

If $X_1,\dots,X_n$ have a continuous joint distribution with joint p.d.f. $f$, the marginal p.d.f. $f_1$ of $X_1$ is obtained by integrating out the other $n-1$ variables:
$$f_1(x_1)=\underbrace{\int^\infty_{-\infty}\dots \int^\infty_{-\infty}}_{n-1}f(x_1,\dots,x_n)dx_2\dots dx_n$$
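For the running exponential example, integrating out $x_2$ and $x_3$ should give the marginal $f_1(x_1)=e^{-x_1}$. A rough numerical sketch (midpoint rule, truncating the infinite range at an upper limit of our own choosing):

```python
import math

def pdf(x1, x2, x3):
    return 6 * math.exp(-x1 - 2 * x2 - 3 * x3) if min(x1, x2, x3) > 0 else 0.0

def marginal_x1(x1, upper=20.0, n=400):
    """Integrate out x2 and x3 over [0, upper]^2; the tail beyond is negligible."""
    h = upper / n
    total = 0.0
    for j in range(n):
        for k in range(n):
            total += pdf(x1, (j + 0.5) * h, (k + 0.5) * h)
    return total * h * h

# For this example the marginal of X1 is the standard exponential density.
print(marginal_x1(0.5), math.exp(-0.5))  # approximately equal
```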

### 计算边缘累积分布函数 Deriving a Marginal c.d.f.

$$F_1(x_1)=Pr(X_1\leq x_1,X_2<\infty,\dots ,X_n<\infty)=\lim_{x_2,\dots,x_n \to \infty}F(x_1,x_2,\dots,x_n)$$

## 独立随机变量 Independent Random Variable

Definition Independent Random Variables: It is said that $n$ random variables $X_1,\dots,X_n$ are independent if, for every $n$ sets $A_1,A_2,\dots ,A_n$ of real numbers,
$$Pr(X_1\in A_1,X_2\in A_2,\dots,X_n\in A_n)=Pr(X_1\in A_1)Pr(X_2 \in A_2)\dots Pr(X_n\in A_n)$$

Theorem Let $F$ denote the joint c.d.f. of $X_1,\dots ,X_n$ and let $F_i$ denote the marginal univariate c.d.f. of $X_i$ for $i=1,\dots,n$. The variables $X_1,\dots,X_n$ are independent if and only if, for all points $(x_1,x_2,\dots,x_n)\in \mathbb{R}^n$,
$$F(x_1,x_2,\dots,x_n)=F_1(x_1)F_2(x_2)\dots F_n(x_n)$$
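The running exponential example satisfies this factorization, since its joint c.d.f. is already a product of three univariate exponential c.d.f.s (with rates 1, 2, 3). A quick sketch of the check:

```python
import math

# Marginal c.d.f.s of the example; rates 1, 2, 3 respectively.
def F1(x): return 1 - math.exp(-x) if x >= 0 else 0.0
def F2(x): return 1 - math.exp(-2 * x) if x >= 0 else 0.0
def F3(x): return 1 - math.exp(-3 * x) if x >= 0 else 0.0

def F(x1, x2, x3):
    """Joint c.d.f. of the example."""
    if min(x1, x2, x3) < 0:
        return 0.0
    return (1 - math.exp(-x1)) * (1 - math.exp(-2 * x2)) * (1 - math.exp(-3 * x3))

# F factors into F1 * F2 * F3 at every point, so X1, X2, X3 are independent.
for pt in [(0.5, 1.0, 2.0), (1.0, 0.1, 3.0), (-1.0, 1.0, 1.0)]:
    print(abs(F(*pt) - F1(pt[0]) * F2(pt[1]) * F3(pt[2])))  # all ~0
```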

Definition Random Samples/i.i.d./Sample Size: Consider a given probability distribution on the real line that can be represented by either a p.f. or a p.d.f. $f$. It is said that $n$ random variables $X_1,\dots,X_n$ form a random sample from this distribution if these random variables are independent and the marginal p.f. or p.d.f. of each of them is $f$. Such random variables are also said to be independent and identically distributed, abbreviated i.i.d. We refer to the number $n$ of random variables as the sample size.

The joint p.f. or p.d.f. $g$ of a random sample is then the product of the marginals:
$$g(x_1,\dots,x_n)=f(x_1)f(x_2)\dots f(x_n)$$
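The product form is straightforward to compute. A minimal sketch, assuming a standard exponential marginal $f(x)=e^{-x}$ of our own choosing (for which the product collapses to $e^{-\sum x_i}$, giving an easy cross-check):

```python
import math

def f(x):
    """Marginal p.d.f. of each observation; here a standard exponential."""
    return math.exp(-x) if x > 0 else 0.0

def joint_pdf(xs):
    """Joint p.d.f. of an i.i.d. sample: the product of the marginal densities."""
    g = 1.0
    for x in xs:
        g *= f(x)
    return g

sample = [0.5, 1.2, 0.3]
print(joint_pdf(sample), math.exp(-sum(sample)))  # equal for the exponential
```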
