
test

Hello world

This is a test post for markdown testing.

This document is used for Markdown testing.

Definition of Loss

Image recovery is a class of problems of the following form: given a degraded image $B$, assumed to be produced from a clear latent image $I$ by the process

$$B = I \otimes k + N,$$

where $k$ denotes the degradation kernel, $\otimes$ denotes the convolution operation, and $N$ denotes additive noise.
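
For concreteness, here is a minimal NumPy/SciPy sketch of this degradation process. The latent image, kernel, and noise level are placeholder values chosen for illustration, not taken from the text.

```python
import numpy as np
from scipy.signal import convolve2d

rng = np.random.default_rng(0)

# Placeholder data: a random "latent image", a 5x5 box blur as the kernel,
# and low-amplitude Gaussian noise.
I = rng.random((64, 64))                    # clear latent image I
k = np.full((5, 5), 1.0 / 25.0)             # degradation kernel k
N = 0.01 * rng.standard_normal(I.shape)     # additive noise N

# Degradation process B = I (convolved with) k + N; 'same' keeps B the size of I.
B = convolve2d(I, k, mode="same", boundary="symm") + N
```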

We define the following loss function for the image recovery task:

$$L(I, k) = \lVert I \otimes k - B \rVert_2^2 + \mu\,\Psi(I),$$

where $\lVert\cdot\rVert_2$ denotes the L2 norm over all elements, $\Psi(\cdot)$ denotes the regularization term, and $\mu$ is its coefficient. Recovering $k$ and $I$ blindly is ill-posed, so priors on a clear image, e.g. sparsity of gradients, are needed as regularization terms in the loss function.
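
Continuing the sketch above, the loss can be written directly; `psi` stands for whatever regularization term $\Psi$ is chosen and `mu` for its coefficient (both names are mine, not the text's).

```python
import numpy as np
from scipy.signal import convolve2d

def recovery_loss(I, k, B, mu, psi):
    """L(I, k) = ||I (x) k - B||_2^2 + mu * Psi(I), for a user-supplied Psi."""
    residual = convolve2d(I, k, mode="same", boundary="symm") - B
    return np.sum(residual ** 2) + mu * psi(I)
```

With $\Psi \equiv 0$ this reduces to the data term alone; the prior enters only through `psi`.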

For example, if we use the gradient sparsity prior for image recovery, the regularization term is

$$\Psi(I) = \lVert \nabla I \rVert_0,$$

where $\nabla$ denotes the gradient operator and $\lVert\cdot\rVert_0$ denotes the L0 norm, i.e. the number of non-zero elements of a vector/matrix. Image recovery then amounts to solving the following problem:

$$\min_{I,\,k}\ \lVert I \otimes k - B \rVert_2^2 + \mu \lVert \nabla I \rVert_0.$$
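
A sketch of this particular regularizer, counting non-zero forward differences; the small threshold `eps` is an implementation detail I am assuming, to tolerate floating-point noise.

```python
import numpy as np

def grad_l0(I, eps=1e-8):
    """||grad I||_0: number of non-zero horizontal/vertical forward differences."""
    dx = np.diff(I, axis=1)
    dy = np.diff(I, axis=0)
    return np.count_nonzero(np.abs(dx) > eps) + np.count_nonzero(np.abs(dy) > eps)
```

Combined with the earlier sketch, the full objective for this prior would be `recovery_loss(I, k, B, mu, grad_l0)`.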

Variable splitting solver

Since optimizing the L0 norm is NP-hard, we need a method that yields an approximate solution. One such method is the half-quadratic penalty method, which introduces an auxiliary variable $A$:

$$\min_{I,\,k,\,A}\ \lVert I \otimes k - B \rVert_2^2 + \mu \lVert A \rVert_0 + \gamma \lVert \nabla I - A \rVert_2^2,$$

where $\gamma$ is a large penalty parameter that enforces $\lVert \nabla I - A \rVert_2 \approx 0$ at the solution, so that $A \approx \nabla I$ holds.
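
With the L0 prior, the $A$-subproblem of this penalty decouples per element and has a well-known closed-form solution by hard thresholding. Below is a minimal sketch of just that step; the $I$- and $k$-subproblems are quadratic and are typically solved in the Fourier domain, which is omitted here.

```python
import numpy as np

def update_A(grad_I, mu, gamma):
    """Solve min_A  mu*||A||_0 + gamma*||grad_I - A||_2^2  element-wise.

    Keeping an entry costs mu, zeroing it costs gamma * grad_I**2, so keep it
    only when gamma * grad_I**2 > mu, i.e. grad_I**2 > mu / gamma.
    """
    return np.where(grad_I ** 2 > mu / gamma, grad_I, 0.0)
```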

Update scheme of half-quadratic variable splitting

See this reference [1].
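
For orientation, here is a hedged sketch of the standard alternation used with this penalty; consult [1] for the exact updates it derives.

```latex
% Sketch of one iteration (not necessarily the exact scheme in [1]):
% fix I to update A, then fix A to update I.
\begin{aligned}
A &\leftarrow \arg\min_{A}\; \mu\lVert A\rVert_0 + \gamma\lVert \nabla I - A\rVert_2^2
   && \text{(element-wise hard thresholding, as sketched above)},\\
I &\leftarrow \arg\min_{I}\; \lVert I\otimes k - B\rVert_2^2 + \gamma\lVert \nabla I - A\rVert_2^2
   && \text{(quadratic; closed form via FFTs)}.
\end{aligned}
```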

1. https://jiaya.me/papers/deconvmodelsolver_2014.pdf