# The Gamma Distribution

## The Gamma Function

Consider a random variable $X$ with the p.d.f.

$$f(x)= \begin{cases} e^{-x}&\text{for }x>0\\ 0&\text{otherwise.} \end{cases}$$

Its moments involve integrals of the form

$$E(X)=\int^{\infty}_{0}xe^{-x}dx,\qquad E(X^2)=\int^{\infty}_{0}x^2e^{-x}dx,$$

so that $Var(X)=E(X^2)-[E(X)]^2$. Integrals of this kind motivate the gamma function.

Definition 5.7.1 The Gamma Function. For each positive number $\alpha$, let the value $\Gamma(\alpha)$ be defined by the following integral:
$$\Gamma(\alpha)=\int^{\infty}_{0}x^{\alpha-1}e^{-x}dx \tag{5.7.1}$$
The function $\Gamma$ defined by Eq. (5.7.1) for $\alpha>0$ is called the gamma function.

$$\Gamma(1)=\int^{\infty}_{0}x^{1-1}e^{-x}dx=1$$

Theorem 5.7.1 If $\alpha>1$, then
$$\Gamma(\alpha)=(\alpha-1)\Gamma(\alpha-1)$$

Proof. Apply integration by parts with $u=x^{\alpha-1}$ and $dv=e^{-x}dx$:
$$\begin{aligned} \Gamma(\alpha)&=\int^{\infty}_{0}u\,dv=[uv]^{\infty}_{0}-\int^{\infty}_{0}v\,du\\ &=[-x^{\alpha-1}e^{-x}]^{\infty}_{x=0}+(\alpha-1)\int^{\infty}_{0}x^{\alpha-2}e^{-x}dx\\ &=0+(\alpha-1)\Gamma(\alpha-1) \end{aligned}$$
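As a quick numerical sanity check (not part of the proof), the recursion can be verified with Python's standard-library `math.gamma`:

```python
import math

# Verify Gamma(alpha) = (alpha - 1) * Gamma(alpha - 1) for several alpha > 1.
for alpha in (1.5, 2.0, 3.7, 10.0):
    lhs = math.gamma(alpha)
    rhs = (alpha - 1) * math.gamma(alpha - 1)
    assert math.isclose(lhs, rhs, rel_tol=1e-12)
```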

Theorem 5.7.2 For every positive integer $n$ ,
$$\Gamma(n)=(n-1)!$$

$$\begin{aligned} \Gamma(n)&=(n-1)\Gamma(n-1)=(n-1)(n-2)\Gamma(n-2)\\ &=(n-1)(n-2)\cdots 1\cdot \Gamma(1)\\ &=(n-1)! \end{aligned}$$
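The identity $\Gamma(n)=(n-1)!$ is easy to confirm numerically for small integers:

```python
import math

# Check Gamma(n) = (n - 1)! for small positive integers n.
for n in range(1, 11):
    assert math.isclose(math.gamma(n), math.factorial(n - 1), rel_tol=1e-12)
```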

Substituting $x=\frac{1}{2}y^2$ in Eq. (5.7.1) gives
$$\Gamma(\tfrac{1}{2})=2^{\frac{1}{2}}\int^{\infty}_{0}e^{-\frac{1}{2}y^2}dy$$
Since $\int^{\infty}_{0}e^{-\frac{1}{2}y^2}dy=\left(\frac{\pi}{2}\right)^{1/2}$ (half of the standard normal integral $(2\pi)^{1/2}$), it follows that
$$\Gamma(\tfrac{1}{2})=\pi^{\frac{1}{2}}$$
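This special value can likewise be checked with the standard library:

```python
import math

# Gamma(1/2) should equal sqrt(pi).
assert math.isclose(math.gamma(0.5), math.sqrt(math.pi), rel_tol=1e-12)
```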

Theorem 5.7.3 For each $\alpha>0$ and each $\beta>0$ ,
$$\int^{\infty}_{0}x^{\alpha-1}e^{-\beta x}dx=\frac{\Gamma(\alpha)}{\beta^{\alpha}}$$

Proof. Substitute $y=\beta x$ (so $dx=dy/\beta$):
$$\begin{aligned} \int^{\infty}_{0}x^{\alpha-1}e^{-\beta x}dx &=\frac{1}{\beta}\int^{\infty}_{0}\left(\frac{y}{\beta}\right)^{\alpha-1}e^{-y}dy\\ &=\frac{1}{\beta^\alpha}\int^{\infty}_{0}y^{\alpha-1}e^{-y}dy\\ &=\frac{\Gamma(\alpha)}{\beta^{\alpha}} \end{aligned}$$

Theorem 5.7.4 Stirling’s formula:
$$\lim_{x\to \infty}\frac{(2\pi)^{1/2}x^{x-1/2}e^{-x}}{\Gamma(x)}=1$$
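The convergence in Stirling's formula can be observed numerically; working in logs with `math.lgamma` avoids overflow at large $x$:

```python
import math

# Stirling's ratio (2*pi)^(1/2) * x^(x-1/2) * e^(-x) / Gamma(x) should tend to 1.
def stirling_ratio(x):
    log_num = 0.5 * math.log(2 * math.pi) + (x - 0.5) * math.log(x) - x
    return math.exp(log_num - math.lgamma(x))

ratios = [stirling_ratio(x) for x in (10.0, 100.0, 1000.0)]
assert abs(ratios[-1] - 1.0) < abs(ratios[0] - 1.0)  # the ratio approaches 1
assert abs(ratios[-1] - 1.0) < 1e-3
```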

Example. Suppose that, given $Z=z$, the observations $X_1,\dots,X_n$ are i.i.d. with the exponential p.d.f. $g_1(x|z)=ze^{-zx}$ for $x>0$, and that $Z$ has the prior p.d.f. $f_2(z)=2e^{-2z}$ for $z>0$. The joint p.d.f. is then
$$\begin{aligned} f(x_1,\dots,x_n,z)&=\left[\prod^{n}_{i=1}g_1(x_i|z)\right]f_2(z)\\ &=2z^ne^{-z[2+x_1+\dots+x_n]} \end{aligned}$$

Integrating out $z$ gives the marginal p.d.f. of the data:
$$\begin{aligned} f_n(x_1,\dots,x_n)&=\int^{\infty}_{0}f(x_1,\dots,x_n,z)dz\\ &=\frac{2(n!)}{(2+\sum^{n}_{i=1}x_i)^{n+1}} \end{aligned}$$

## The Gamma Distributions

Dividing the joint p.d.f. by the marginal p.d.f. of the data shows that the posterior p.d.f. of $Z$ is, with $y=2+\sum^{n}_{i=1}x_i$,
$$g_2(z|x_1,\dots,x_n)= \begin{cases} \frac{y^{n+1}}{n!}z^ne^{-yz}&\text{for }z>0\\ 0&\text{otherwise} \end{cases}$$
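This posterior is a gamma density with parameters $n+1$ and $y=2+\sum x_i$. A small sketch with hypothetical observations (the data values are illustrative, not from the source) checks that it integrates to 1:

```python
import math

xs = [0.8, 1.1, 0.4, 2.0]          # hypothetical data
n = len(xs)
y = 2 + sum(xs)

def posterior(z):
    """Posterior density of Z: gamma with parameters n+1 and rate y."""
    return y ** (n + 1) / math.factorial(n) * z ** n * math.exp(-y * z) if z > 0 else 0.0

# Midpoint rule on (0, 40]; the tail beyond 40 is negligible here.
m = 400_000
h = 40.0 / m
total = h * sum(posterior((i + 0.5) * h) for i in range(m))
assert abs(total - 1.0) < 1e-6
```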

Definition 5.7.2 Gamma Distributions. Let $\alpha$ and $\beta$ be positive numbers. A random variable $X$ has the gamma distribution with parameters $\alpha$ and $\beta$ if $X$ has a continuous distribution for which the p.d.f. is
$$f(x|\alpha,\beta)= \begin{cases} \frac{\beta^\alpha}{\Gamma(\alpha)}x^{\alpha-1}e^{-\beta x}&\text{ for } x>0\\ 0&\text{otherwise} \end{cases}$$

The gamma p.d.f. looks like this (figure omitted).
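The p.d.f. of Definition 5.7.2 is straightforward to implement; note that $\beta$ here is the *rate* parameter. A sanity check that the density integrates to 1:

```python
import math

def gamma_pdf(x, alpha, beta):
    """The gamma p.d.f. of Definition 5.7.2; beta is the rate parameter."""
    if x <= 0:
        return 0.0
    return beta ** alpha / math.gamma(alpha) * x ** (alpha - 1) * math.exp(-beta * x)

# Midpoint rule on a truncated range; the tail beyond 60 is negligible here.
alpha, beta = 2.5, 1.5
m = 300_000
h = 60.0 / m
total = h * sum(gamma_pdf((i + 0.5) * h, alpha, beta) for i in range(m))
assert abs(total - 1.0) < 1e-6
```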

Theorem 5.7.5 Moments. Let $X$ have the gamma distribution with parameters $\alpha$ and $\beta$. For $k=1,2,\dots$,
$$E(X^k)=\frac{\Gamma(\alpha+k)}{\beta^k\Gamma(\alpha)}=\frac{\alpha(\alpha+1)\cdots(\alpha+k-1)}{\beta^k}$$
In particular, $E(X)=\frac{\alpha}{\beta}$ and $Var(X)=\frac{\alpha}{\beta^2}$.

Proof. For $k=1,2,\dots$ we have

$$\begin{aligned} E(X^k)&=\int^{\infty}_{0}x^kf(x|\alpha,\beta)dx\\ &=\frac{\beta^\alpha}{\Gamma(\alpha)}\int^{\infty}_{0}x^{\alpha+k-1}e^{-\beta x}dx\\ &=\frac{\beta^\alpha}{\Gamma(\alpha)}\cdot \frac{\Gamma(\alpha+k)}{\beta^{\alpha+k}}\\ &=\frac{\Gamma(\alpha+k)}{\beta^k\Gamma(\alpha)} \end{aligned}$$
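The mean and variance formulas can be checked by simulation with the standard library. Caution: `random.gammavariate`'s second argument is the *scale*, i.e. $1/\beta$ in the rate parameterization used here.

```python
import random

# Simulation check of E(X) = alpha/beta and Var(X) = alpha/beta**2.
random.seed(0)
alpha, beta = 3.0, 2.0
xs = [random.gammavariate(alpha, 1 / beta) for _ in range(200_000)]
mean = sum(xs) / len(xs)
var = sum((x - mean) ** 2 for x in xs) / len(xs)
assert abs(mean - alpha / beta) < 0.02       # E(X) = 1.5
assert abs(var - alpha / beta ** 2) < 0.05   # Var(X) = 0.75
```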

Theorem 5.7.6 Moment Generating Function. Let $X$ have the gamma distribution with parameters $\alpha$ and $\beta$. The m.g.f. of $X$ is
$$\psi(t)=\left(\frac{\beta}{\beta-t}\right)^\alpha \text{ for }t<\beta$$

Proof. For $t<\beta$,
$$\psi(t)=\int^{\infty}_{0}e^{tx}f(x|\alpha,\beta)dx=\frac{\beta^\alpha}{\Gamma(\alpha)}\int^{\infty}_{0}x^{\alpha-1}e^{-(\beta-t)x}dx$$

By Theorem 5.7.3 with $\beta-t$ in place of $\beta$,
$$\psi(t)=\frac{\beta^\alpha}{\Gamma(\alpha)}\cdot\frac{\Gamma(\alpha)}{(\beta-t)^{\alpha}}=\left(\frac{\beta}{\beta-t}\right)^\alpha$$
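A Monte Carlo estimate of $E(e^{tX})$ can be compared with the closed form $(\beta/(\beta-t))^\alpha$ (again, `random.gammavariate` takes the scale $1/\beta$):

```python
import math
import random

# Estimate the m.g.f. at one point t < beta and compare with the closed form.
random.seed(1)
alpha, beta, t = 2.0, 3.0, 1.0
xs = [random.gammavariate(alpha, 1 / beta) for _ in range(500_000)]
mgf_est = sum(math.exp(t * x) for x in xs) / len(xs)
mgf_exact = (beta / (beta - t)) ** alpha     # (3/2)^2 = 2.25
assert abs(mgf_est - mgf_exact) < 0.05
```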

Theorem 5.7.7 If the random variables $X_1,\dots,X_k$ are independent, and if $X_i$ has the gamma distribution with parameters $\alpha_i$ and $\beta$ $(i=1,\dots,k)$, then the sum $X_1+\dots+X_k$ has the gamma distribution with parameters $\alpha_1+\dots+\alpha_k$ and $\beta$.

Proof. The m.g.f. of $X_i$ is
$$\psi_i(t)=\left(\frac{\beta}{\beta-t}\right)^{\alpha_i} \text{ for }t<\beta$$
so the m.g.f. of the sum is
$$\psi(t)=\prod^{k}_{i=1}\psi_i(t)=\left(\frac{\beta}{\beta-t}\right)^{\alpha_1+\dots+\alpha_k} \text{ for }t<\beta$$
which is the m.g.f. of the gamma distribution with parameters $\alpha_1+\dots+\alpha_k$ and $\beta$.
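The additivity of the $\alpha_i$ under a common rate $\beta$ can also be seen in simulation: sums of independent gamma draws should match the mean and variance of a single gamma with parameter $\sum\alpha_i$.

```python
import random

# Simulated sums of independent gammas with common rate beta.
random.seed(2)
alphas, beta = [1.0, 2.0, 0.5], 2.0
n = 100_000
sums = [sum(random.gammavariate(a, 1 / beta) for a in alphas) for _ in range(n)]
a_tot = sum(alphas)                          # 3.5
mean = sum(sums) / n
var = sum((s - mean) ** 2 for s in sums) / n
assert abs(mean - a_tot / beta) < 0.03       # E = 3.5/2 = 1.75
assert abs(var - a_tot / beta ** 2) < 0.05   # Var = 3.5/4 = 0.875
```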
