# Properties of Expectation

## Basic Theorems

Theorem (Linear Function). If $Y=aX+b$, where $a$ and $b$ are finite constants, then
$$E(Y)=aE(X)+b$$

$$E(Y)=E(aX+b)=\int^{\infty}_{-\infty}(ax+b)f(x)dx\\ =a\int^{\infty}_{-\infty}xf(x)dx+b\int^{\infty}_{-\infty}f(x)dx\\ =aE(X)+b$$
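The identity $E(aX+b)=aE(X)+b$ is easy to check numerically. A minimal sketch, assuming $X$ follows an exponential distribution with mean 1 (the distribution is an arbitrary choice for illustration):

```python
import random

random.seed(0)

# Sample X from an exponential distribution with E(X) = 1.
n = 100_000
xs = [random.expovariate(1.0) for _ in range(n)]

a, b = 3.0, -2.0
ys = [a * x + b for x in xs]        # Y = aX + b

mean_x = sum(xs) / n
mean_y = sum(ys) / n

# The sample mean of Y matches a*mean(X) + b exactly,
# and both approximate E(Y) = a*E(X) + b = 1 up to Monte Carlo error.
print(mean_y, a * mean_x + b)
```

The agreement between `mean_y` and `a * mean_x + b` is exact because the sample mean is itself a linear operation.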

Corollary If $X=c$ with probability 1, then $E(X)=c$.

$$E(X)=\int^{\infty}_{-\infty}cf(x)dx\\ =c\int^{\infty}_{-\infty}f(x)dx=c$$
Q.E.D

Theorem If there exists a constant $a$ such that $Pr(X\geq a)=1$, then $E(X)\geq a$. If there exists a constant $b$ such that $Pr(X\leq b)=1$, then $E(X)\leq b$.

$$E(X)=\int^{\infty}_{-\infty}xf(x)dx=\int^{\infty}_{a}xf(x)dx\\ \geq \int^{\infty}_{a}af(x)dx=aPr(X\geq a)=a$$
Q.E.D
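The bound can be illustrated by simulation. A sketch, assuming $X = a + Z$ with $Z$ exponential with mean 1 (so $Pr(X\geq a)=1$ and $E(X)=a+1$); the distribution is an illustrative assumption, not part of the theorem:

```python
import random

random.seed(1)

# X = 2 + Z with Z ~ Exp(1), so Pr(X >= 2) = 1 and E(X) = 3.
a = 2.0
n = 100_000
xs = [a + random.expovariate(1.0) for _ in range(n)]

mean_x = sum(xs) / n
# The sample mean is close to 3, and in particular is at least a = 2.
print(mean_x)
```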

Theorem Suppose that $E(X)=a$ and that either $Pr(X\geq a)=1$ or $Pr(X\leq a)=1$. Then $Pr(X=a)=1$.

Proof (discrete case, assuming $Pr(X\geq a)=1$). Let $p_0=Pr(X=a)$, and let $x_1,x_2,\dots$ denote the other possible values of $X$, so that $x_j>a$ for every $j$. Then

$$E(X)=p_0a+\sum^{\infty}_{j=1}x_jPr(X=x_j)$$

If $Pr(X=x_j)>0$ for some $j$, then since $x_j>a$ for every $j$,

$$E(X)> p_0a + \sum^{\infty}_{j=1}aPr(X=x_j)=a$$

which contradicts $E(X)=a$. Hence $Pr(X=x_j)=0$ for all $j$, so $Pr(X=a)=1$.
Q.E.D

Theorem If $X_1,\dots,X_n$ are $n$ random variables such that each expectation $E(X_i)$ is finite $(i=1,\dots,n)$, then
$$E(X_1+\dots+X_n)=E(X_1)+\dots+E(X_n)$$

$$E(X_1+X_2)=\int^{\infty}_{-\infty}\int^{\infty}_{-\infty}(x_1+x_2)f(x_1,x_2)dx_1dx_2\\ =\int^{\infty}_{-\infty}\int^{\infty}_{-\infty}x_1f(x_1,x_2)dx_1dx_2+\int^{\infty}_{-\infty}\int^{\infty}_{-\infty}x_2f(x_1,x_2)dx_1dx_2\\ =\int^{\infty}_{-\infty}x_1f_1(x_1)dx_1+\int^{\infty}_{-\infty}x_2f_2(x_2)dx_2\\ =E(X_1)+E(X_2)$$

This proves the case $n=2$; the general case follows by induction on $n$. Note that no independence is assumed.
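Additivity of expectation requires no independence, which a simulation makes vivid. A sketch, assuming $X_1 \sim U(0,1)$ and $X_2 = X_1^2$ (a deliberately dependent pair, chosen only for illustration):

```python
import random

random.seed(2)

n = 100_000
x1 = [random.random() for _ in range(n)]   # X1 ~ Uniform(0, 1), E(X1) = 1/2
x2 = [x * x for x in x1]                   # X2 = X1^2 is strongly dependent on X1; E(X2) = 1/3

mean_sum = sum(a + b for a, b in zip(x1, x2)) / n
mean1 = sum(x1) / n
mean2 = sum(x2) / n

# E(X1 + X2) = E(X1) + E(X2) = 5/6 despite the dependence.
print(mean_sum, mean1 + mean2)
```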

Corollary Assume that $E(X_i)$ is finite for $i=1,\dots,n$. For all constants $a_1,\dots,a_n$ and $b$,
$$E(a_1X_1+\dots + a_nX_n+b)=a_1E(X_1)+\dots + a_nE(X_n)+b$$

Definition (Convex Function) A function $g$ of a vector argument is convex if, for every $\alpha\in (0,1)$ and every $x$ and $y$,
$$g[\alpha x+(1-\alpha)y] \leq \alpha g(x)+(1-\alpha)g(y)$$

Theorem (Jensen's Inequality) Let $g$ be a convex function, and let $X$ be a random vector with finite mean. Then $E[g(X)]\geq g(E[X])$.
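Jensen's inequality can be checked numerically for a concrete convex $g$. A sketch, assuming $g(x)=x^2$ and $X \sim U(0,1)$ (both choices are illustrative), where $E[X^2]=1/3 \geq (E[X])^2=1/4$:

```python
import random

random.seed(3)

n = 100_000
xs = [random.random() for _ in range(n)]   # X ~ Uniform(0, 1)


def g(x):
    return x * x                           # g(x) = x^2 is convex


e_g = sum(g(x) for x in xs) / n            # approximates E[g(X)] = 1/3
g_e = g(sum(xs) / n)                       # approximates g(E[X]) = 1/4

# Jensen: E[g(X)] >= g(E[X]).
print(e_g, g_e)
```

The gap $E[X^2]-(E[X])^2$ is exactly $Var(X)$, which is why it is strictly positive here.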

## Expectation of a Product of Independent Random Variables

Theorem If $X_1,\dots,X_n$ are $n$ independent random variables such that each expectation $E(X_i)$ is finite $(i=1,\dots,n)$, then
$$E\left(\prod^{n}_{i=1}X_i\right)=\prod^{n}_{i=1}E(X_i)$$

By independence, the joint p.d.f. factors as
$$f(x_1,\dots,x_n)=\prod^{n}_{i=1}f_i(x_i)$$

$$E\left(\prod^{n}_{i=1}X_i\right)\\ =\int^{\infty}_{-\infty}\dots \int^{\infty}_{-\infty}\left(\prod^{n}_{i=1}x_i\right)f(x_1,\dots,x_n)dx_1\cdots dx_n\\ =\int^{\infty}_{-\infty}\dots \int^{\infty}_{-\infty}\prod^{n}_{i=1}[x_if_i(x_i)]dx_1\cdots dx_n\\ =\prod^{n}_{i=1} \int^{\infty}_{-\infty}x_if_i(x_i)dx_i=\prod^{n}_{i=1}E(X_i)$$
Q.E.D
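The product rule for independent variables can also be checked by simulation. A sketch, assuming three independent $U(0,1)$ draws (an illustrative choice), so $E(X_1X_2X_3)=(1/2)^3=1/8$:

```python
import random

random.seed(4)

n = 200_000
prod_mean = 0.0
means = [0.0, 0.0, 0.0]
for _ in range(n):
    xs = [random.random() for _ in range(3)]   # three independent Uniform(0, 1) draws
    prod_mean += xs[0] * xs[1] * xs[2]
    for i in range(3):
        means[i] += xs[i]

prod_mean /= n
means = [m / n for m in means]

# E(X1 * X2 * X3) = E(X1) * E(X2) * E(X3) = 1/8 by independence.
print(prod_mean, means[0] * means[1] * means[2])
```

If the draws were dependent (e.g. reusing the same draw three times), the two quantities would differ, which is why independence is essential in this theorem but not in the additivity theorem.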
