The Jacobian transformation is an algebraic method for determining the probability distribution of a variable y when we know the probability distribution of x and some transformation which relates x and y.
Let x be a variable with probability density function f(x) and cumulative distribution function
F(x) = \int_{-\infty}^{x} f(x')\,dx',
and let y have density f(y) with cumulative distribution function
F(y) = \int_{-\infty}^{y} f(y')\,dy'.
We can determine f(y) as
f(y) = J(x, y)\,f(x),
where J(x, y) is the Jacobian. For a one-dimensional transformation, the Jacobian is given by
J(x, y) = \left|\frac{\partial x}{\partial y}\right|.
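As a quick sanity check of this rule, here is a minimal Monte Carlo sketch (my own illustration, not part of the derivation): it draws x uniformly on (0, 1), applies the monotonic map y = e^x, and compares a histogram of the transformed samples with the density 1/y predicted by the Jacobian formula. The choice of map, the use of numpy, the sample size, and the seed are all assumptions.

```python
import numpy as np

# Sketch: Monte Carlo check of f(y) = |dx/dy| f(x) for the monotonic map
# y = exp(x) with x ~ Uniform(0, 1). Inverting gives x = ln(y), so
# |dx/dy| = 1/y and therefore f(y) = 1/y on (1, e).
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, size=1_000_000)
y = np.exp(x)

# Histogram the transformed samples and compare with the predicted density.
hist, edges = np.histogram(y, bins=np.linspace(1.0, np.e, 40), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
predicted = 1.0 / centers
print(np.max(np.abs(hist / predicted - 1.0)))  # small (a few per cent)
```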
Now assume you perform a likelihood analysis and place a uniform prior on a parameter \alpha. A uniform prior on \alpha between \alpha_{\rm low} and \alpha_{\rm high} means that (up to normalization)
f(\alpha) = \begin{cases} 1 & \text{ for } \alpha_{\rm low} < \alpha < \alpha_{\rm high},\\ 0 & \text{ otherwise }. \end{cases}
The result of the analysis will be some posterior distribution for \alpha. Now assume you want to use that likelihood to infer something about another parameter \beta = \frac{1}{\alpha}. The uniform prior which was applied to \alpha does not stay uniform in \beta; we can calculate the induced prior on \beta using the Jacobian transformation above.
First, we calculate the Jacobian. Since \alpha = \frac{1}{\beta},
J(\alpha, \beta) = \left|\frac{\partial \alpha}{\partial \beta}\right| = \frac{1}{\beta^2},
which together with f(\alpha) results in
f(\beta) = \frac{1}{\beta^2} \quad \text{ for } \frac{1}{\alpha_{\rm high}} < \beta < \frac{1}{\alpha_{\rm low}},
and zero otherwise.
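The same numerical check can be applied to this example. The sketch below (again my own illustration, not from the post) draws \alpha uniformly between 1.5 and 10, transforms to \beta = 1/\alpha, and compares the histogram with the induced prior; note it includes the normalization constant 1/(\alpha_{\rm high} - \alpha_{\rm low}) that the formulas above drop, so that the comparison with a normalized histogram works.

```python
import numpy as np

# Sketch: alpha ~ Uniform(1.5, 10) and beta = 1/alpha, so the induced prior
# is f(beta) = 1/(beta**2 * (a_high - a_low)) on (1/a_high, 1/a_low).
a_low, a_high = 1.5, 10.0
rng = np.random.default_rng(0)
alpha = rng.uniform(a_low, a_high, size=1_000_000)
beta = 1.0 / alpha

# Histogram the transformed samples over the full support of beta.
bins = np.linspace(1.0 / a_high, 1.0 / a_low, 40)
hist, edges = np.histogram(beta, bins=bins, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
predicted = 1.0 / (centers**2 * (a_high - a_low))
print(np.max(np.abs(hist / predicted - 1.0)))  # small (a few per cent)
```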
Figure 1: A flat prior on \alpha transfers into a strongly scale-dependent prior on \beta = 1/\alpha. While the prior on \alpha gives the same weight to all scales between 1.5 and 10, in \beta much more weight is given to large values of \beta.
Figure 1 shows the priors on \alpha and \beta using 1.5 < \alpha < 10. The prior that is uniform in \alpha is nowhere near uniform in \beta. Given that Bayesian inference always assumes some prior, one has to take into account how priors change under parameter transformations.
Please leave comments/questions below.
best
Florian