Bayesian inversion for electromyography using low-rank tensor formats

Authors

Abstract

The reconstruction of the structure of biological tissue using electromyographic (EMG) data is a non-invasive imaging method with diverse medical applications. Mathematically, this reconstruction is an inverse problem. Furthermore, EMG data are highly sensitive to changes in the electrical conductivity that describes the tissue. Modeling the inevitable measurement error as a stochastic quantity leads to a Bayesian approach. Solving the discretized Bayesian inverse problem means drawing samples from the posterior distribution of the parameters, e.g., the conductivity, given measurement data. Using a Metropolis–Hastings algorithm for this purpose involves solving the forward problem for many different parameter combinations, which requires high computational effort. Low-rank tensor formats can reduce this effort by providing a data-sparse representation of all occurring linear systems of equations simultaneously and allowing their efficient solution. The application of Bayes' theorem proves the well-posedness of the Bayesian inverse problem. The derivation and proof of a low-rank representation of the forward problem allow the precomputation of all its solutions under certain assumptions, resulting in a theory-based sampling algorithm. Numerical experiments support the theoretical results, but also indicate the large number of samples needed to obtain reliable estimates of the parameters. The resulting algorithm, which uses the precomputed forward solutions in a low-rank tensor format, draws this number of samples and therefore enables solving problems that are infeasible with classical methods.
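As a rough illustration of the sampling strategy sketched in the abstract, the following Python snippet runs a random-walk Metropolis–Hastings sampler in which every likelihood evaluation reads a precomputed forward solution from a low-rank factorization instead of solving a PDE. It is a minimal sketch under strong simplifications: the parameter is a single discretized index, a rank-r matrix factorization stands in for a tensor format, and all names and sizes (forward_lowrank, sigma_noise, n_params, ...) are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch: Metropolis-Hastings with a precomputed low-rank forward map.
# The factorization U @ V is a stand-in for the low-rank tensor format of the
# paper; all quantities below are synthetic and purely illustrative.
import numpy as np

rng = np.random.default_rng(0)

n_params, n_sensors, rank = 200, 16, 5   # parameter grid size, sensors, storage rank

# Pretend low-rank storage of the forward map: row i holds the simulated EMG
# measurements for parameter value i, kept as factors U and V instead of a full table.
U = rng.standard_normal((n_params, rank))
V = rng.standard_normal((rank, n_sensors))

def forward_lowrank(i):
    """Read the precomputed forward solution for parameter index i from the factors."""
    return U[i] @ V                      # O(rank * n_sensors), no PDE solve per sample

# Synthetic data generated at a "true" parameter index plus Gaussian noise.
sigma_noise = 0.1
i_true = 120
data = forward_lowrank(i_true) + sigma_noise * rng.standard_normal(n_sensors)

def log_posterior(i):
    """Gaussian likelihood with a flat prior over the admissible parameter indices."""
    if i < 0 or i >= n_params:
        return -np.inf
    r = data - forward_lowrank(i)
    return -0.5 * (r @ r) / sigma_noise**2

# Random-walk Metropolis-Hastings on the parameter index.
n_steps, step = 20000, 5
samples = np.empty(n_steps, dtype=int)
i_cur = n_params // 2
lp_cur = log_posterior(i_cur)
for k in range(n_steps):
    i_prop = i_cur + int(rng.integers(-step, step + 1))
    lp_prop = log_posterior(i_prop)
    if np.log(rng.random()) < lp_prop - lp_cur:   # accept/reject step
        i_cur, lp_cur = i_prop, lp_prop
    samples[k] = i_cur

print("posterior mean parameter index:", samples[n_steps // 2:].mean())
```

Because each likelihood evaluation is only a small matrix-vector product with the stored factors, drawing the very large number of samples mentioned in the abstract stays cheap even though the chain mixes slowly.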


Related articles

Low-rank Tensor Approximation

Approximating a tensor by another of lower rank is in general an ill-posed problem. Yet, this kind of approximation is mandatory in the presence of measurement errors or noise. We show how tools recently developed in compressed sensing can be used to solve this problem. More precisely, a minimal angle between the columns of the loading matrices makes it possible to restore both existence and uniqueness of the...
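To make the notion of a low-rank tensor approximation with loading vectors concrete, here is a minimal sketch of a rank-1 approximation of a 3-way tensor computed by a higher-order power iteration (alternating updates of the loading vectors). It only illustrates the generic concept; it is not the compressed-sensing-based method of the cited paper, and the tensor and sizes are arbitrary assumptions.

```python
# Minimal sketch: rank-1 approximation of a 3-way tensor by alternating updates
# of the loading vectors a, b, c (higher-order power iteration).
import numpy as np

rng = np.random.default_rng(1)
T = rng.standard_normal((6, 7, 8))       # generic 3-way tensor, purely illustrative

a = rng.standard_normal(6)
b = rng.standard_normal(7)
c = rng.standard_normal(8)

for _ in range(50):
    # Update each loading vector while holding the other two fixed, then normalize.
    a = np.einsum('ijk,j,k->i', T, b, c)
    a /= np.linalg.norm(a)
    b = np.einsum('ijk,i,k->j', T, a, c)
    b /= np.linalg.norm(b)
    c = np.einsum('ijk,i,j->k', T, a, b)
    lam = np.linalg.norm(c)              # magnitude of the rank-1 term
    c /= lam

approx = lam * np.einsum('i,j,k->ijk', a, b, c)
print("relative error of rank-1 approximation:",
      np.linalg.norm(T - approx) / np.linalg.norm(T))
```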


Efficient tensor completion: Low-rank tensor train

This paper proposes a novel formulation of the tensor completion problem to impute missing entries of data represented by tensors. The formulation is introduced in terms of the tensor train (TT) rank, which can effectively capture global information of tensors thanks to its construction by a well-balanced matricization scheme. Two algorithms are proposed to solve the corresponding tensor completion p...
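For readers unfamiliar with the tensor-train format referenced above, the sketch below shows the basic TT-SVD idea: sequential truncated SVDs of matricizations turn a full tensor into a chain of small cores. It is a generic illustration under the assumption of a small dense input tensor; the completion algorithms of the cited paper are not reproduced.

```python
# Minimal sketch: TT-SVD decomposition of a full tensor into tensor-train cores,
# followed by a reconstruction check. Sizes and the test tensor are illustrative.
import numpy as np

def tt_svd(T, eps=1e-10):
    """Decompose a full tensor into TT cores by sequential truncated SVDs."""
    dims = T.shape
    cores, r_prev = [], 1
    C = np.asarray(T)
    for k in range(len(dims) - 1):
        C = C.reshape(r_prev * dims[k], -1)
        Uk, s, Vt = np.linalg.svd(C, full_matrices=False)
        r = max(1, int(np.sum(s > eps * s[0])))          # truncate tiny singular values
        cores.append(Uk[:, :r].reshape(r_prev, dims[k], r))
        C = s[:r, None] * Vt[:r]                          # carry the remainder forward
        r_prev = r
    cores.append(C.reshape(r_prev, dims[-1], 1))
    return cores

rng = np.random.default_rng(2)
# Test tensor with exact low TT rank: a sum of two separable (rank-1) terms.
T = (np.einsum('i,j,k->ijk', rng.standard_normal(4), rng.standard_normal(5), rng.standard_normal(6))
     + np.einsum('i,j,k->ijk', rng.standard_normal(4), rng.standard_normal(5), rng.standard_normal(6)))

cores = tt_svd(T)
# Contract the cores back into a full tensor to verify the decomposition.
full = cores[0]
for G in cores[1:]:
    full = np.tensordot(full, G, axes=([full.ndim - 1], [0]))
full = full.reshape(T.shape)
print("TT ranks:", [G.shape[2] for G in cores[:-1]],
      "reconstruction error:", np.linalg.norm(full - T))
```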


Efficient low-rank approximation of the stochastic Galerkin matrix in tensor formats

In this article we describe an efficient approximation of the stochastic Galerkin matrix which stems from a stationary diffusion equation. The uncertain permeability coefficient is assumed to be a log-normal random field with given covariance and mean functions. The approximation is done in the canonical tensor format and then compared numerically with the tensor train and hierarchical tensor f...
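The canonical tensor format mentioned above can be illustrated on the operator level: stochastic Galerkin matrices typically arise as a short sum of Kronecker products, which can be applied to a vector without ever assembling the full matrix. The sketch below shows only this generic structure with random factor blocks; the log-normal coefficient, covariance, and mean setup of the cited paper are not modeled, and all sizes are assumptions.

```python
# Minimal sketch: applying an operator given in canonical (CP) format,
# A = sum_k  G_k (x) K_k, to a vector using only the small factors.
import numpy as np

rng = np.random.default_rng(3)
n_x, n_xi, n_terms = 50, 20, 4           # spatial dofs, stochastic dofs, CP terms

# Illustrative random factor blocks (identity plus a small perturbation).
K = [np.eye(n_x) + 0.1 * rng.standard_normal((n_x, n_x)) for _ in range(n_terms)]
G = [np.eye(n_xi) + 0.1 * rng.standard_normal((n_xi, n_xi)) for _ in range(n_terms)]

def apply_cp_operator(u):
    """Compute (sum_k kron(G_k, K_k)) @ u without forming the Kronecker products."""
    U = u.reshape(n_xi, n_x)             # matricize: rows ~ stochastic, cols ~ spatial
    return sum(Gk @ U @ Kk.T for Gk, Kk in zip(G, K)).reshape(-1)

# Consistency check against the explicitly assembled matrix (small sizes only).
u = rng.standard_normal(n_x * n_xi)
A = sum(np.kron(Gk, Kk) for Gk, Kk in zip(G, K))
print("matches assembled matrix:", np.allclose(apply_cp_operator(u), A @ u))
```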


Provable Low-Rank Tensor Recovery

In this paper, we rigorously study tractable models for provably recovering low-rank tensors. Unlike their matrix-based predecessors, current convex approaches for recovering low-rank tensors based on incomplete (tensor completion) and/or grossly corrupted (tensor robust principal component analysis) observations still suffer from a lack of theoretical guarantees, although they have been used in variou...


Tensor completion using total variation and low-rank matrix factorization

In this paper, we study the problem of recovering a tensor with missing data. We propose a new model combining total variation regularization and low-rank matrix factorization. A block coordinate descent (BCD) algorithm is developed to efficiently solve the proposed optimization model. We theoretically show that under some mild conditions, the algorithm converges to the coordinatewise minimi... See the sketch below for the factorization part of such a BCD scheme.
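The following sketch illustrates block coordinate descent for the low-rank matrix-factorization part of such a completion model; the total-variation term is omitted for brevity, and the data, mask, and sizes are illustrative assumptions rather than the cited paper's setup. Each block update is a small least-squares solve on the observed entries.

```python
# Minimal sketch: BCD / alternating least squares for matrix-factorization
# completion, minimizing the squared error on observed entries of M over U and V.
import numpy as np

rng = np.random.default_rng(4)
m, n, r = 40, 30, 3

# Ground-truth low-rank matrix and a random observation mask.
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
mask = rng.random((m, n)) < 0.5          # True where an entry is observed

U = rng.standard_normal((m, r))
V = rng.standard_normal((r, n))

for _ in range(30):                      # outer BCD sweeps
    # Block 1: update each row of U from the observed entries in that row.
    for i in range(m):
        cols = np.flatnonzero(mask[i])
        if cols.size:
            U[i] = np.linalg.lstsq(V[:, cols].T, M[i, cols], rcond=None)[0]
    # Block 2: update each column of V from the observed entries in that column.
    for j in range(n):
        rows = np.flatnonzero(mask[:, j])
        if rows.size:
            V[:, j] = np.linalg.lstsq(U[rows], M[rows, j], rcond=None)[0]

err = np.linalg.norm(U @ V - M) / np.linalg.norm(M)
print("relative recovery error on the full matrix:", err)
```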



Journal

Journal title: Inverse Problems

Year: 2021

ISSN: 0266-5611, 1361-6420

DOI: https://doi.org/10.1088/1361-6420/abd85a