Description
The solution of the linear system of equations $Ax\approx b$ arising from the discretization of an ill-posed integral equation with a square integrable kernel is considered. Tikhonov regularization, in which $x$ is found as the minimizer of $J(x)=\|Ax-b\|_2^2+\lambda^2\|Lx\|_2^2$, introduces the unknown regularization parameter $\lambda$, which trades off the fidelity of the data fit against the smoothing norm of the solution, the latter determined by the choice of $L$. The Generalized Discrepancy Principle (GDP) and the Unbiased Predictive Risk Estimator (UPRE) are methods for finding $\lambda$ given prior conditions on the noise in the measurements $b$. Here we consider the case $L=I$ and use the relationship between the singular value expansion and the singular value decomposition for square integrable kernels to prove that the GDP and UPRE estimates yield a convergent sequence for $\lambda$ with increasing problem size. Hence the estimate of $\lambda$ for a large problem may be found by down-sampling to a smaller problem, or to a set of smaller problems, and applying these estimators more efficiently on the smaller problems. Consequently, the large-scale problem can be solved in a single step with the parameter found from the down-sampled problem(s).
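A minimal sketch of the workflow the abstract describes, assuming a hypothetical smooth test kernel, a simple row/column down-sampling scheme, and the standard UPRE functional for $L=I$; the function names, kernel, and down-sampling factor are illustrative assumptions, not the thesis's actual construction:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def tikhonov_svd(U, s, Vt, b, lam):
    """Tikhonov solution for L = I using the SVD A = U diag(s) Vt."""
    beta = U.T @ b
    phi = s**2 / (s**2 + lam**2)          # Tikhonov filter factors
    return Vt.T @ (phi * beta / s)

def upre(lam, U, s, b, sigma2):
    """Unbiased Predictive Risk Estimator evaluated at lambda = lam."""
    m = b.size
    beta = U.T @ b
    phi = s**2 / (s**2 + lam**2)
    resid2 = np.sum(((1 - phi) * beta)**2) + (b @ b - beta @ beta)  # ||b - A x_lam||^2
    return resid2 + 2 * sigma2 * np.sum(phi) - m * sigma2

# --- hypothetical test problem: estimate lambda on a down-sampled problem,
# --- then solve the full problem once with that parameter
rng = np.random.default_rng(0)
n = 512
t = np.linspace(0, 1, n)
A = np.exp(-np.abs(t[:, None] - t[None, :]) / 0.05) / n   # smooth, ill-conditioned kernel (assumed)
x_true = np.sin(2 * np.pi * t)
sigma = 1e-3
b = A @ x_true + sigma * rng.standard_normal(n)

# down-sample by a factor of 4; rescaling mimics the coarser quadrature weight (assumption)
idx = np.arange(0, n, 4)
A_small, b_small = A[np.ix_(idx, idx)] * 4, b[idx]
Us, ss, _ = np.linalg.svd(A_small, full_matrices=False)
res = minimize_scalar(upre, bounds=(1e-8, 1.0), method='bounded',
                      args=(Us, ss, b_small, sigma**2))
lam_hat = res.x

# apply the down-sampled estimate directly to the full-scale problem
Uf, sf, Vtf = np.linalg.svd(A, full_matrices=False)
x_lam = tikhonov_svd(Uf, sf, Vtf, b, lam_hat)
print(f"estimated lambda = {lam_hat:.3e}, relative error = "
      f"{np.linalg.norm(x_lam - x_true) / np.linalg.norm(x_true):.3e}")
```

The GDP estimate could be obtained analogously by replacing the UPRE objective with a root-finding step on the weighted discrepancy; only the parameter-selection functional changes, not the down-sampling idea.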
Details
Title
- Validity of down-sampling data for regularization parameter estimation when solving large-scale ill-posed inverse problems
Contributors
- Horst, Michael Jacob (Author)
- Renaut, Rosemary (Thesis director)
- Cochran, Douglas (Committee member)
- Wang, Yang (Committee member)
- Barrett, The Honors College (Contributor)
- School of Music (Contributor)
- School of Mathematical and Statistical Sciences (Contributor)
Date Created
2014-05