Bayesian approach

  • Suppose z=[x,y] is a random vector with joint density p(z)

  • Suppose we observe a sample realization of x. How does that change our knowledge of y?

  • before observing the sample x, our knowledge of y is given by

p(y) = \int p(z)\, dx
  • after observing x, our knowledge of y is given by

p(y|x) = \frac{p(z)}{p(x)} = \frac{p(x|y)\, p(y)}{p(x)}
  • for example, if z is Gaussian:

p(y|x) = N\left( \mu_y + \Sigma_{yx}\Sigma_x^{-1}(x - \mu_x),\; \Sigma_y - \Sigma_{yx}\Sigma_x^{-1}\Sigma_{xy} \right)
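The Gaussian conditioning formula can be checked numerically. A minimal sketch with NumPy, where all means and covariance blocks are made-up illustrative values:

```python
import numpy as np

# Joint Gaussian over z = [x, y]; all numbers are illustrative assumptions
mu_x = np.array([0.0, 0.0])
mu_y = np.array([1.0, -1.0])
Sigma_x = np.array([[2.0, 0.5], [0.5, 1.0]])   # Var(x)
Sigma_y = np.array([[1.5, 0.3], [0.3, 1.2]])   # Var(y)
Sigma_yx = np.array([[0.4, 0.1], [0.2, 0.3]])  # Cov(y, x)

x_obs = np.array([1.0, 0.5])  # observed realization of x

# Conditional (posterior) mean and covariance of y given x
Sx_inv = np.linalg.inv(Sigma_x)
mu_cond = mu_y + Sigma_yx @ Sx_inv @ (x_obs - mu_x)
Sigma_cond = Sigma_y - Sigma_yx @ Sx_inv @ Sigma_yx.T
```

Observing x shifts the mean of y by an amount proportional to the surprise x − μ_x, and shrinks its covariance; note that Σ_cond does not depend on the observed value of x.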
  • in Bayesian parlance

    • p(y) is the prior distribution

    • p(y|x) is the posterior distribution

    • p(x|y) is the likelihood

  • typically, we think of y as parameters, and write θ instead

  • also, the prior and the likelihood are specified separately

  • as a result, the posterior is not available in closed form

  • simulation methods are used instead, to sample from it

since p(x) does not depend on θ,

p(\theta|x) = \frac{p(x|\theta)\, p(\theta)}{p(x)} \propto p(x|\theta)\, p(\theta)
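Because p(x) drops out, the posterior can be evaluated up to a constant using only the likelihood and the prior. A minimal sketch in Python, using a hypothetical coin-flip model (Bernoulli likelihood, uniform prior on θ, made-up data) and recovering the normalized posterior on a grid:

```python
import numpy as np

# Hypothetical data: 8 Bernoulli(θ) draws, 6 successes
x = np.array([1, 0, 1, 1, 0, 1, 1, 1])

def log_likelihood(theta):
    return np.sum(x * np.log(theta) + (1 - x) * np.log(1 - theta))

def log_prior(theta):
    return 0.0  # uniform prior on (0, 1): constant log-density

# Unnormalized log-posterior: log p(x|θ) + log p(θ); p(x) is never needed
def log_post(theta):
    return log_likelihood(theta) + log_prior(theta)

# For a scalar θ, normalizing numerically on a grid recovers the posterior density
grid = np.linspace(0.001, 0.999, 999)
dx = grid[1] - grid[0]
unnorm = np.exp([log_post(t) for t in grid])
post = unnorm / (unnorm.sum() * dx)       # integrates to 1 by construction
post_mean = (grid * post).sum() * dx       # posterior mean of θ
```

With a uniform prior this posterior is Beta(7, 3), so the grid-based mean should be close to 0.7; grid normalization only works in very low dimension, which is why sampling methods are needed in general.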

Goal of Bayesian inference: characterize the distribution of θ, given

  • the data x

  • the prior knowledge about θ

  • the model, embodied by the likelihood function

"characterize" here means: estimate the moments of the posterior, based on sample draws from it
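When the posterior is known only up to the constant p(x), simulation methods such as random-walk Metropolis can still produce draws from it, and posterior moments are then estimated from those draws. A minimal sketch in Python; the Bernoulli data, uniform prior, proposal scale, and chain length are all illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 8 Bernoulli(θ) draws, 6 successes; uniform prior on (0, 1)
x = np.array([1, 0, 1, 1, 0, 1, 1, 1])

def log_post(theta):
    if not 0.0 < theta < 1.0:
        return -np.inf  # zero prior density outside (0, 1)
    return np.sum(x * np.log(theta) + (1 - x) * np.log(1 - theta))

# Random-walk Metropolis: only the *ratio* of posteriors is used, so the
# unknown normalizing constant p(x) cancels
theta, draws = 0.5, []
for _ in range(20000):
    prop = theta + 0.2 * rng.standard_normal()   # symmetric proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop                             # accept the proposal
    draws.append(theta)                          # else keep the current value

draws = np.array(draws[5000:])                   # discard burn-in
post_mean, post_var = draws.mean(), draws.var()  # moment estimates from draws
```

Here the exact posterior is Beta(7, 3), with mean 0.7 and variance 21/1100 ≈ 0.019, so the sampled moments can be checked against closed-form values; in realistic models no such closed form exists and the draws are all one has.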