Bayesian inference exercises

cosmos 4th November 2016 at 2:43pm

See Bayesian inference

1. Dodgy coins

1.1 $P(C|H) \propto P(H|C)$

The likelihoods sum to $0.75 + 0.5 + 0.25 = 1.5 \neq 1$, so the likelihood is not a probability distribution over $C$.

1.2 The maximum likelihood estimate is $C=1$, since coin 1 has the largest likelihood $P(H|C)$.

1.3 $P(C=c|X=H) = \frac{P(X=H \cap C=c)}{P(X=H)} \propto P(X=H \cap C=c) = P(X=H | C=c) \times P(C=c)$

1.4 & 1.5 Using Bayes' theorem

| C | likelihood | prior | likelihood x prior | posterior |
|---|------------|-------|--------------------|-----------|
| 1 | 3/4 | 1/3 | 3/4 x 1/3 = 1/4 | $\frac{1/4}{1/2}=1/2$ |
| 2 | 1/2 | 1/3 | 1/2 x 1/3 = 1/6 | $\frac{1/6}{1/2}=1/3$ |
| 3 | 1/4 | 1/3 | 1/4 x 1/3 = 1/12 | $\frac{1/12}{1/2}=1/6$ |

$P(X=H) = 3/12 + 2/12 + 1/12 = 6/12 = 1/2$

Check that the posteriors sum to one: $1/2 + 1/3 + 1/6 = 3/6 + 2/6 + 1/6 = 1$
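The single-head posterior calculation in 1.4 can be checked with a few lines of Python (a sketch; the variable names are mine):

```python
# Posterior over the three coins after observing one head.
likelihood = [3/4, 1/2, 1/4]   # P(X=H | C=c) for c = 1, 2, 3
prior = [1/3, 1/3, 1/3]        # uniform prior over coins

joint = [l * p for l, p in zip(likelihood, prior)]  # likelihood x prior
evidence = sum(joint)                               # P(X=H) = 1/2
posterior = [j / evidence for j in joint]           # 1/2, 1/3, 1/6

print(evidence)
print(posterior)
```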

1.6

| C | likelihood | prior | likelihood x prior | posterior |
|---|------------|-------|--------------------|-----------|
| 1 | $(3/4)^2 = 9/16$ | 1/3 | 9/16 x 1/3 = 3/16 | $\frac{3/16}{7/24}=9/14$ |
| 2 | $(1/2)^2 = 1/4$ | 1/3 | 1/4 x 1/3 = 1/12 | $\frac{1/12}{7/24}=4/14$ |
| 3 | $(1/4)^2 = 1/16$ | 1/3 | 1/16 x 1/3 = 1/48 | $\frac{1/48}{7/24}=1/14$ |

$P(X=HH) = 9/48 + 4/48 + 1/48 = 14/48 = 7/24$
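The two-head posterior can also be checked numerically, and the check doubles as a demonstration that updating on one head at a time (sequential updating) gives the same answer as updating on both at once. A sketch using exact fractions; function and variable names are mine:

```python
from fractions import Fraction as F

likelihood = [F(3, 4), F(1, 2), F(1, 4)]   # P(X=H | C=c) for c = 1, 2, 3
prior = [F(1, 3)] * 3                      # uniform prior over coins

def update(prior, lik):
    # One Bayes update: posterior proportional to likelihood x prior.
    joint = [l * p for l, p in zip(lik, prior)]
    evidence = sum(joint)
    return [j / evidence for j in joint], evidence

# Batch update: the likelihood of two heads is P(H|C)^2.
post_batch, ev = update(prior, [l**2 for l in likelihood])
print(post_batch)   # 9/14, 4/14, 1/14
print(ev)           # 7/24

# Sequential update: one head at a time, reusing the posterior as the prior.
post1, _ = update(prior, likelihood)
post2, _ = update(post1, likelihood)
print(post2 == post_batch)   # True
```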

1.7

P(C=1|H) = 0.111111... = 1/9

P(C=2|H) = 0.370370... = 10/27

P(C=3|H) = 0.518518... = 14/27, the maximum a posteriori (MAP) estimate

$P(\tilde{X}=H|X=H) = 0.75 \times 1/2 + 0.5 \times 1/3 + 0.25 \times 1/6 = 3/8 + 1/6 + 1/24 = 9/24 + 4/24 + 1/24 = 14/24 = 7/12$

$P(\tilde{X}=T|X=H) = 5/12$
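The posterior predictive is just the posterior-weighted average of the per-coin head probabilities; a quick exact check (variable names are mine):

```python
from fractions import Fraction as F

p_head = [F(3, 4), F(1, 2), F(1, 4)]     # P(X~=H | C=c) for c = 1, 2, 3
posterior = [F(1, 2), F(1, 3), F(1, 6)]  # P(C=c | X=H) from 1.4

# P(X~=H | X=H) = sum_c P(X~=H | C=c) * P(C=c | X=H)
pred_head = sum(p * q for p, q in zip(p_head, posterior))
print(pred_head)       # 7/12
print(1 - pred_head)   # P(X~=T | X=H) = 5/12
```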

1.10 $P(\tilde{X}=H|X=H) = \sum\limits_{C=1}^r P(\tilde{X}=H \cap C | X=H) = \sum\limits_{C=1}^r P(\tilde{X}=H | C \cap X=H) \times P(C | X=H) = \sum\limits_{C=1}^r P(\tilde{X}=H | C) \times P(C | X=H)$, where the last equality uses the fact that $\tilde{X}$ is independent of $X$ given $C$.

2. The Epidemiology of Lyme disease

2.1 A binomial likelihood for the number of ticks carrying Borrelia bacteria is reasonable: it is the distribution of the count when each tick carries the bacteria with the same probability, independently of the others.
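Under those two assumptions, the binomial pmf should match a simulation of independent per-tick infections. A sketch, using the $n = 10$ and $\theta \approx 0.1$ values from later in the section:

```python
import math
import random

random.seed(0)
n, theta, trials = 10, 0.1, 200_000

# Simulate: count infected ticks among n, each independently infected
# with probability theta.
counts = [sum(random.random() < theta for _ in range(n)) for _ in range(trials)]
sim_p1 = sum(c == 1 for c in counts) / trials

# Binomial pmf at X_i = 1: C(10, 1) * 0.1 * 0.9^9
pmf_1 = math.comb(n, 1) * theta * (1 - theta) ** (n - 1)
print(sim_p1, pmf_1)   # both close to 0.387
```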

2.2

$L(X_i | \theta) = \binom{n}{X_i} \theta^{X_i} (1-\theta)^{n-X_i}$

n=10

2.3

$10 \theta (1-\theta)^{9}$

https://www.wolframalpha.com/input/?i=10+theta+(1-%5Ctheta)%5E%7B9%7D

$\theta \approx 0.1$ is the MLE
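The maximiser read off the plot can be confirmed with a crude grid search over the likelihood $10\theta(1-\theta)^9$ (a sketch; the grid resolution is arbitrary):

```python
import math

# Likelihood of one infected tick out of n = 10, as a function of theta.
def lik(theta, n=10, x=1):
    return math.comb(n, x) * theta**x * (1 - theta) ** (n - x)

# Grid search for the maximiser; should agree with the analytic MLE x/n = 0.1.
grid = [i / 1000 for i in range(1, 1000)]
theta_hat = max(grid, key=lik)
print(theta_hat)   # 0.1
```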

2.4

https://www.wolframalpha.com/input/?i=integrate+10+q(1-q)%5E%7B9%7D+from+q%3D+0+to+1

2.5

https://www.wolframalpha.com/input/?i=binomial+distribution+with+p%3D0.1,+n%3D10

2.6

$\frac{d}{d\theta} \log{L(X|\theta)} = 0$

$\frac{d}{d\theta} \log{\left[ \binom{n}{X_i} \theta^{X_i} (1-\theta)^{n-X_i} \right]} = \frac{d}{d\theta} \left[ \log{ \binom{n}{X_i} } + X_i \log{ \theta} + (n-X_i) \log{ (1-\theta)} \right] = 0 + \frac{X_i}{\theta} - \frac{n-X_i}{1-\theta} = 0$

$X_i - X_i \theta = \theta n - \theta X_i$

$X_i = \theta n \Rightarrow \theta = \frac{X_i}{n} = \frac{X_i}{10}$
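A quick numerical sanity check that $\theta = X_i/n$ does zero the score $\frac{X_i}{\theta} - \frac{n-X_i}{1-\theta}$, with $n = 10$ as in the sheet:

```python
# Check that theta = X_i / n zeroes the score for every interior X_i.
n = 10
for x in range(1, n):   # x = 0 and x = n put theta on the boundary
    theta = x / n
    score = x / theta - (n - x) / (1 - theta)
    assert abs(score) < 1e-9
print("score vanishes at theta = X_i / n")
```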

2.7

Beta distribution as Prior distribution

Plot this in RStudio!

2.8

The expected value (mean) $\mu$ of a Beta random variable $X$ with parameters $\alpha$ and $\beta$ is $\frac{\alpha}{\alpha+\beta} = \frac{1}{1+\beta/\alpha}$, which depends on the two parameters only through the ratio $\beta/\alpha$.
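A minimal check that the Beta mean is invariant to scaling both parameters by the same factor (the function name is mine):

```python
# Mean of Beta(a, b) is a / (a + b) = 1 / (1 + b/a):
# scaling a and b by the same factor leaves the mean unchanged.
def beta_mean(a, b):
    return a / (a + b)

print(beta_mean(2, 6))    # 0.25
print(beta_mean(20, 60))  # 0.25, same ratio b/a = 3
```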

2.9

$Beta(a, b) = \frac{\Gamma(a+b)}{\Gamma(a)\Gamma(b)}\, \theta^{a-1}(1-\theta)^{b-1}$

$\binom{n}{X_i} \theta^{X_i} (1-\theta)^{n-X_i} \times \frac{\Gamma(a+b)}{\Gamma(a)\Gamma(b)}\, \theta^{a-1}(1-\theta)^{b-1} = \frac{\Gamma(n+1)}{\Gamma(n+1-X_i)\Gamma(X_i+1)} \frac{\Gamma(a+b)}{\Gamma(a)\Gamma(b)}\, \theta^{X_i +a - 1} (1-\theta)^{n-X_i +b -1} \propto \theta^{X_i +a - 1} (1-\theta)^{n-X_i +b -1}$, which is proportional to $Beta(X_i+a, n-X_i+b)$
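The conjugacy result can be verified numerically: normalising likelihood x prior on a grid should reproduce the $Beta(X_i+a, n-X_i+b)$ density. A sketch with arbitrarily chosen $a = b = 2$ and $X_i = 1$, $n = 10$:

```python
import math

def beta_pdf(theta, a, b):
    # Beta(a, b) density via the Gamma-function normalising constant.
    return (math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
            * theta**(a - 1) * (1 - theta)**(b - 1))

def binom_pmf(x, n, theta):
    return math.comb(n, x) * theta**x * (1 - theta)**(n - x)

n, x, a, b = 10, 1, 2, 2

# Posterior by brute force: likelihood x prior, normalised by a
# midpoint-rule estimate of the evidence.
grid = [(i + 0.5) / 10000 for i in range(10000)]
unnorm = [binom_pmf(x, n, t) * beta_pdf(t, a, b) for t in grid]
evidence = sum(unnorm) / 10000
posterior = [u / evidence for u in unnorm]

# Conjugacy says this equals Beta(x + a, n - x + b).
conj = [beta_pdf(t, x + a, n - x + b) for t in grid]
err = max(abs(p - c) for p, c in zip(posterior, conj))
print(err)   # tiny
```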

See wiki