Bayesian approach
Suppose \(\mathbf{z} = [\mathbf{x}^{\prime}, \; \mathbf{y}^{\prime}]^{\prime}\) is a random vector with joint density \(p(\mathbf{z})\).
Suppose we observe a sample realization of \(\mathbf{x}\). How does that change our knowledge of \(\mathbf{y}\)?
before observing the sample \(\mathbf{x}\), our knowledge of \(\mathbf{y}\) is given by the marginal density \(p(\mathbf{y}) = \int p(\mathbf{x}, \mathbf{y}) \, d\mathbf{x}\)
after observing \(\mathbf{x}\), our knowledge of \(\mathbf{y}\) is given by the conditional density \(p(\mathbf{y} | \mathbf{x}) = \frac{p(\mathbf{x}, \mathbf{y})}{p(\mathbf{x})}\)
for example, if \(\mathbf{z}\) is Gaussian with mean \([\boldsymbol{\mu}_x^{\prime}, \; \boldsymbol{\mu}_y^{\prime}]^{\prime}\) and covariance partitioned into blocks \(\boldsymbol{\Sigma}_{xx}, \boldsymbol{\Sigma}_{xy} = \boldsymbol{\Sigma}_{yx}^{\prime}, \boldsymbol{\Sigma}_{yy}\), then \(\mathbf{y} | \mathbf{x}\) is Gaussian with mean \(\boldsymbol{\mu}_y + \boldsymbol{\Sigma}_{yx} \boldsymbol{\Sigma}_{xx}^{-1} (\mathbf{x} - \boldsymbol{\mu}_x)\) and covariance \(\boldsymbol{\Sigma}_{yy} - \boldsymbol{\Sigma}_{yx} \boldsymbol{\Sigma}_{xx}^{-1} \boldsymbol{\Sigma}_{xy}\)
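As a concrete illustration, here is a minimal NumPy sketch of these conditional-Gaussian formulas; the partition sizes and all numbers below are made up:

```python
import numpy as np

# Made-up partitioned moments for z = [x', y']' (x is 2-d, y is 1-d)
mu_x = np.array([0.0, 1.0])
mu_y = np.array([2.0])
S_xx = np.array([[2.0, 0.3],
                 [0.3, 1.0]])
S_yx = np.array([[0.5, 0.2]])      # Cov(y, x)
S_yy = np.array([[1.5]])

x_obs = np.array([0.4, 1.3])       # observed realization of x

# Conditional moments of y | x from the partitioned-Gaussian formulas
A = S_yx @ np.linalg.inv(S_xx)     # "regression" coefficients
cond_mean = mu_y + A @ (x_obs - mu_x)
cond_cov = S_yy - A @ S_yx.T

print("E[y | x] =", cond_mean)
print("V[y | x] =", cond_cov)
```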
in Bayesian parlance:
\(p(\mathbf{y})\) is the prior distribution
\(p(\mathbf{y} | \mathbf{x})\) is the posterior distribution
\(p(\mathbf{x} | \mathbf{y})\) is the likelihood
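To make the vocabulary concrete, here is a small sketch with a conjugate Beta-Binomial model, one of the few cases where prior, likelihood, and posterior are all available in closed form (the hyperparameters and data are arbitrary choices):

```python
from scipy import stats

# Conjugate Beta-Binomial toy model (all numbers are illustrative):
#   prior:       theta ~ Beta(a, b)
#   likelihood:  k | theta ~ Binomial(n, theta)
#   posterior:   theta | k ~ Beta(a + k, b + n - k)
a, b = 2.0, 2.0        # prior hyperparameters
n, k = 10, 7           # data: 7 successes in 10 trials

prior = stats.beta(a, b)
posterior = stats.beta(a + k, b + n - k)

print("prior mean:    ", prior.mean())       # 0.5
print("posterior mean:", posterior.mean())   # 9/14, about 0.643
```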
typically, we would think of \(\mathbf{y}\) as parameters, and write \(\boldsymbol{\theta}\) instead
also, the prior and the likelihood are specified separately
as a result, the posterior is typically not available in closed form
simulation methods are used instead to sample from it
since \(p(\mathbf{x})\) does not depend on \(\theta\), it is enough to know the posterior up to scale: \(p(\theta | \mathbf{x}) \propto p(\mathbf{x} | \theta) \, p(\theta)\)
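As a minimal sketch of such a simulation method, here is a random-walk Metropolis sampler for the Beta-Binomial toy model above; it only ever evaluates \(p(\mathbf{x} | \theta) \, p(\theta)\), so the unknown constant \(p(\mathbf{x})\) cancels in the acceptance ratio (step size, chain length, and burn-in are arbitrary choices):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
a, b = 2.0, 2.0                    # same prior hyperparameters as above
n, k = 10, 7                       # same toy data as above

def log_unnorm_post(theta):
    """log p(x | theta) + log p(theta), up to the constant log p(x)."""
    if not 0.0 < theta < 1.0:
        return -np.inf
    return stats.binom.logpmf(k, n, theta) + stats.beta.logpdf(theta, a, b)

theta, draws = 0.5, []
for _ in range(20_000):
    prop = theta + 0.1 * rng.standard_normal()   # symmetric random-walk proposal
    # accept with probability min(1, posterior ratio); p(x) cancels here
    if np.log(rng.uniform()) < log_unnorm_post(prop) - log_unnorm_post(theta):
        theta = prop
    draws.append(theta)

print("MCMC posterior mean: ", np.mean(draws[2_000:]))   # discard burn-in
print("exact posterior mean:", (a + k) / (a + b + n))    # 9/14, about 0.643
```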
Goal of Bayesian inference: characterize the distribution of \(\theta\), given
the data \(\mathbf{x}\)
the prior knowledge about \(\theta\)
the model, embodied by the likelihood function
"characterize" here means estimating the moments of the posterior, based on sample draws from it
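Once draws are available, any such summary is a sample average. A minimal sketch, using direct draws from the Beta(9, 5) posterior above as a stand-in for MCMC output:

```python
import numpy as np

rng = np.random.default_rng(1)
# Stand-in for posterior draws: sample the Beta(9, 5) posterior directly
draws = rng.beta(9.0, 5.0, size=50_000)

# Moment estimates are sample averages over the draws
print("posterior mean:       ", draws.mean())
print("posterior variance:   ", draws.var())
print("95% credible interval:", np.percentile(draws, [2.5, 97.5]))
```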