Let's assume that $g(z)$ is a Sérsic model, i.e. $z = \{n, r_\text{hlr}, F, e_1, e_2, s_x, s_y\}$, and
$$g(z) = F \times I_0 \exp \left( -b_n \left[\left( \frac{r}{r_\text{hlr}}\right)^{\frac{1}{n}} -1\right] \right)$$
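As a concrete reference, here is a minimal NumPy sketch of such a parametric light profile (illustrative only: the function name and grid parameters are hypothetical, $b_n$ uses the common approximation $b_n \approx 2n - 1/3$, and the ellipticity is applied through a simple linear coordinate transform):

```python
import numpy as np

def sersic_image(n, r_hlr, flux, e1, e2, sx, sy, npix=64, scale=0.05):
    """Render a toy Sersic profile g(z) on an npix x npix pixel grid.

    Hypothetical helper: b_n uses the approximation b_n ~ 2n - 1/3,
    and the ellipticity (e1, e2) is applied as a linear shear of the
    coordinates before evaluating the radial profile.
    """
    b_n = 2.0 * n - 1.0 / 3.0                       # approximate Sersic b_n
    x = (np.arange(npix) - npix / 2) * scale
    xx, yy = np.meshgrid(x, x)
    xx, yy = xx - sx, yy - sy                       # centroid shift (s_x, s_y)
    u = (1.0 - e1) * xx - e2 * yy                   # sheared coordinates
    v = -e2 * xx + (1.0 + e1) * yy
    r = np.hypot(u, v)
    profile = np.exp(-b_n * ((r / r_hlr) ** (1.0 / n) - 1.0))
    return flux * profile / profile.sum()           # normalise to total flux F
```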
The joint inference of $p(z, \gamma | \mathcal{D})$ leads to a biased posterior...
[Figure: marginal shear posterior $p(\gamma|\mathcal{D})$; maximum a posteriori fit and residuals.]
...due to model misspecification $\longrightarrow$ Let's learn a more flexible $g_\theta$
Learning from corrupted data
Lanusse et al. 2020
[Diagram: auto-encoding architecture. The inference network $q_\phi(z|x)$ maps an observed stamp $x$ to latent samples $z \sim q_\phi(z|x)$; the generator network produces the light profile $g_\theta(z)$, which is convolved with the PSF $\Pi$ and, together with the noise covariance $\Sigma$, defines the likelihood $p_\theta(x|z, \Pi, \Sigma)$ with mean $g_\theta(z) \ast \Pi$, from which the reconstruction $x' \sim p_\theta(x|z, \Pi, \Sigma)$ is drawn.]
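The generator side of this forward model can be sketched as follows (hypothetical helper names; the learned decoder `g_theta` is assumed to return a pixel grid, and the PSF convolution is carried out in Fourier space):

```python
import numpy as np

def render_observation(g_theta, z, psf, noise_sigma, rng=np.random.default_rng()):
    """Simulate x' ~ p_theta(x | z, Pi, Sigma) for one postage stamp.

    g_theta     : callable, learned generator z -> model image g_theta(z)
    psf         : 2D array Pi, same shape as the model image
    noise_sigma : per-pixel noise standard deviation (diagonal Sigma)
    """
    model = g_theta(z)                                   # g_theta(z)
    # PSF convolution g_theta(z) * Pi via FFT
    mean = np.fft.irfft2(np.fft.rfft2(model) * np.fft.rfft2(np.fft.ifftshift(psf)),
                         s=model.shape)
    # add pixel noise with (diagonal) covariance Sigma
    return mean + noise_sigma * rng.standard_normal(model.shape)
```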
Optimized by maximizing the ELBO:
$$\log p(x) \geq \mathbb{E}_{z\sim q_\phi(z|x)} \left[ \log p_\theta(x|z, \Pi, \Sigma) \right] - \mathbb{D}_\text{KL}\left(q_\phi(z|x) \,\|\, p(z)\right)$$
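A sketch of the corresponding training loss for a Gaussian encoder $q_\phi(z|x) = \mathcal{N}(\mu_\phi(x), \operatorname{diag}(\sigma_\phi^2(x)))$, a standard normal prior $p(z)$, and a Gaussian pixel likelihood with diagonal $\Sigma$ (names hypothetical; `mean_x` stands for the map $z \mapsto g_\theta(z) \ast \Pi$):

```python
import numpy as np

def negative_elbo(x, mu, log_var, mean_x, noise_sigma, rng=np.random.default_rng()):
    """Single-sample Monte-Carlo estimate of -ELBO for one stamp x.

    mu, log_var : outputs of the inference network q_phi(z|x)
    mean_x      : callable z -> g_theta(z) * Pi, mean of p_theta(x|z, Pi, Sigma)
    noise_sigma : per-pixel noise standard deviation (diagonal Sigma)
    """
    # reparameterised sample z ~ q_phi(z|x)
    z = mu + np.exp(0.5 * log_var) * rng.standard_normal(mu.shape)
    # Gaussian log-likelihood log p_theta(x | z, Pi, Sigma), up to a constant
    resid = x - mean_x(z)
    log_like = -0.5 * np.sum(resid**2 / noise_sigma**2)
    # closed-form KL(q_phi(z|x) || N(0, I))
    kl = 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var)
    return -(log_like - kl)
```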
A generative model for galaxy morphologies
The Bayesian view of the problem (sketched in code below): $$ p(z | x ) \propto p_\theta(x | z, \Sigma, \mathbf{\Pi}) \, p(z)$$ where:
- $p( z | x )$ is the posterior
- $p_\theta( x | z, \Sigma, \mathbf{\Pi} )$ is the data likelihood, which contains the physics
- $p( z )$ is the prior
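A minimal sketch of this posterior's unnormalised log-density, assuming a standard normal prior over the latent code and the same Gaussian pixel likelihood as above (helper names hypothetical):

```python
import numpy as np

def log_posterior(z, x, mean_x, noise_sigma):
    """Unnormalised log p(z | x) = log p_theta(x | z, Sigma, Pi) + log p(z) + const.

    mean_x : callable z -> g_theta(z) * Pi, mean of the data likelihood
    """
    resid = x - mean_x(z)
    log_like = -0.5 * np.sum(resid**2 / noise_sigma**2)   # data likelihood (physics)
    log_prior = -0.5 * np.sum(z**2)                        # standard normal prior p(z)
    return log_like + log_prior
```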
[Figure: posterior samples of $g_\theta(z)$ and of the reconvolved, pixelated model $\mathbf{P} (\Pi \ast g_\theta(z))$; data residuals $x_n - \mathbf{P} (\Pi \ast g_\theta(z))$; pixel-wise posterior median and standard deviation.]
$\Longrightarrow$ Uncertainties are fully captured by the posterior.
Joint inference using a generative model for the morphology
Remy, Lanusse, Starck (2022)
Let's use the learned $g_\theta(z)$
The joint inference of $p(z, \gamma | \mathcal{D})$ leads to an unbiased posterior!
[Figure: marginal shear posterior $p(\gamma|\mathcal{D})$; maximum a posteriori fit and residuals.]
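A minimal sketch of the joint forward model behind this inference, assuming the shear $\gamma = (\gamma_1, \gamma_2)$ is applied to the generated profile by resampling its pixel grid before PSF convolution, Gaussian pixel noise, and a wide Gaussian prior on $\gamma$ (all helper names, the shear convention, and the prior width are illustrative assumptions; in practice a gradient-based sampler such as HMC would target this density):

```python
import numpy as np
from scipy.ndimage import map_coordinates

def apply_shear(image, g1, g2):
    """Resample `image` by pulling pixel coordinates back through
    the shear matrix A = [[1 - g1, -g2], [-g2, 1 + g1]]."""
    npix = image.shape[0]
    c = (npix - 1) / 2.0
    yy, xx = np.meshgrid(np.arange(npix) - c, np.arange(npix) - c, indexing="ij")
    u = (1.0 - g1) * xx - g2 * yy        # pull-back coordinates A @ (x, y)
    v = -g2 * xx + (1.0 + g1) * yy
    return map_coordinates(image, [v + c, u + c], order=1, mode="nearest")

def joint_log_posterior(z, gamma, x, g_theta, psf, noise_sigma):
    """Unnormalised log p(z, gamma | D) for one stamp."""
    sheared = apply_shear(g_theta(z), gamma[0], gamma[1])
    mean = np.fft.irfft2(np.fft.rfft2(sheared) * np.fft.rfft2(np.fft.ifftshift(psf)),
                         s=sheared.shape)
    log_like = -0.5 * np.sum((x - mean) ** 2 / noise_sigma**2)
    # standard normal prior on z, wide Gaussian prior on the shear
    log_prior = -0.5 * np.sum(z**2) - 0.5 * np.sum(np.asarray(gamma)**2) / 0.05**2
    return log_like + log_prior
```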
Takeaway message
- Ellipticity is not a well-defined quantity for arbitrary galaxies $\rightarrow$ bias in shear estimation
- Forward modeling allows us to decouple morphology from observing conditions
- Deep generative models can be used to provide a flexible light profile model
- Explicit likelihood: uses all of our physical knowledge
$ + $ Our method can be applied to varying PSF, noise, or even different instruments!
$\Longrightarrow$ Joint inference of morphology and shear leads to an unbiased marginal shear posterior