Compressive sensing provides a new approach to data acquisition and storage. Compressed sensing (CS) deals with the reconstruction of sparse signals from a small number of linear measurements: the goal is to recover a structured signal x from a limited number of noisy linear measurements y. An information theoretic analysis can reveal where a gap currently exists between the performance of computationally tractable methods and the fundamental limits of the problem. Within only six years of its introduction, an abundance of theoretical aspects of compressed sensing had been explored across a large number of articles.
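As a concrete illustration of this measurement setup, here is a minimal Python sketch; the dimensions, sparsity level, and noise level are arbitrary illustrative choices rather than values taken from any of the works discussed here. It simply generates a k-sparse signal and its noisy Gaussian random projections y = Ax + w.

```python
import numpy as np

rng = np.random.default_rng(0)

n, m, k = 256, 64, 8      # ambient dimension, number of measurements, sparsity (illustrative)
sigma = 0.05              # noise standard deviation (illustrative)

# k-sparse signal: k nonzero Gaussian entries at random locations
x = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x[support] = rng.standard_normal(k)

# Gaussian measurement matrix, scaled so columns have roughly unit norm
A = rng.standard_normal((m, n)) / np.sqrt(m)

# noisy linear measurements: far fewer observations (m) than unknowns (n)
y = A @ x + sigma * rng.standard_normal(m)
print(A.shape, y.shape)   # (64, 256) (64,)
```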
In particular, reference 4 studied the minimum number of noisy measurements required to recover a sparse signal using Shannon information theoretic bounds. In this paper, we derive information theoretic bounds on the performance of noisy compressive sensing. Existing results broadly fall into one of two categories: on the one hand are rigorous bounds based on information theoretic arguments or on the analysis of specific recovery algorithms; on the other hand are exact but heuristic predictions made using the replica method from statistical physics. Throughout, I_m denotes the identity matrix of size m. A related resource, the course note Introduction to Compressed Sensing with Coding Theoretic Perspective, was developed for a graduate level course in spring 2011 at GIST, Korea.
The smaller matrix SA ∈ R^{m×d} is a compressed version of the original data A ∈ R^{n×d}; we start with an overview of different constructions of sketching matrices. Compressed sensing (CS) is a new framework for sampling and reconstructing sparse signals, and one of its main challenges is to find the support of a sparse signal from a set of noisy observations. The fundamental revelation is that, if an n-sample signal x is sparse and has a good k-term approximation in some basis, then it can be reconstructed using m = O(k log(n/k)) ≪ n linear projections of x onto another basis.
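To make the m = O(k log(n/k)) claim tangible, the sketch below recovers the toy signal above from such measurements with a plain iterative soft-thresholding (ISTA) solver for the l1-regularized least-squares problem. This is a generic textbook solver written for illustration, not the reconstruction method of any particular paper cited here, and the regularization weight is hand-picked for the toy problem.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k, sigma = 256, 64, 8, 0.05                 # illustrative sizes, as before
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true + sigma * rng.standard_normal(m)

# ISTA for: minimize 0.5 * ||y - A x||_2^2 + lam * ||x||_1
lam = 0.05                                        # hand-picked regularization weight
L = np.linalg.norm(A, 2) ** 2                     # Lipschitz constant of the smooth part
x_hat = np.zeros(n)
for _ in range(500):
    z = x_hat - (A.T @ (A @ x_hat - y)) / L       # gradient step on the quadratic term
    x_hat = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-thresholding step

print("relative reconstruction error:",
      np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```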
Sparsity pattern recovery in compressed sensing was the subject of Galen Reeves's doctoral dissertation in electrical engineering and computer sciences at the University of California, Berkeley. Spectral compressed sensing via structured matrix completion includes 1-D line spectral estimation as a special case and indicates how to address multidimensional models. Related work establishes a strong converse bound for multiple hypothesis testing, with applications to high-dimensional estimation; moreover, this methodology is to date extensively utilized in applied work. Furthermore, we show an information theoretic lower bound for tomography of rank-r states using adaptive sequences of single-copy Pauli measurements. For comparison, we will use the results by Hegde and others [2] in a linear regression setup; these problems concern continuous natural phenomena.
Information-Theoretic Methods in Data Science is an edited volume covering these topics. Information theoretic measures have also been applied in multimedia forensics, for example to quantify the distinguishability between multimedia operator chains (IEEE Transactions on Information Forensics and Security 11(4), 2016, 774-788) and to derive bounds for resampling forensics; related lines of work include sparse signal recovery with multiple prior information. One buzzword you can look up and read more about is the "single-pixel camera." Bounds for Optimal Compressed Sensing Matrices and Practical Reconstruction Schemes, by Shriram Sarvotham, opens with the premise that compressed sensing (CS) is an emerging field.
Written by leading experts in a clear, tutorial style, and using consistent notation and definitions throughout, Information-Theoretic Methods in Data Science shows how information theoretic methods are being used in data acquisition, data analysis, and related tasks. Compressed sensing, also known as compressive sensing, compressive sampling, or sparse sampling, is a signal processing technique for efficiently acquiring and reconstructing a signal by finding solutions to underdetermined linear systems. In this paper, we derive information theoretic performance bounds for the sensing and reconstruction of sparse phenomena from noisy projections. In the CS literature, several information theoretic bounds on the scaling law of the required number of measurements for exact support recovery have been derived.
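For concreteness, the underdetermined recovery problem just described is commonly posed in one of the following standard convex forms (the notation is ours; epsilon and lambda are tuning parameters tied to the noise level):

\[
\hat{x} \in \arg\min_{x \in \mathbb{R}^n} \|x\|_1 \ \ \text{subject to}\ \ \|y - Ax\|_2 \le \epsilon,
\qquad \text{or} \qquad
\hat{x} \in \arg\min_{x \in \mathbb{R}^n} \tfrac{1}{2}\|y - Ax\|_2^2 + \lambda \|x\|_1,
\]

where \(y = Ax^\star + w\), \(A \in \mathbb{R}^{m \times n}\) with \(m \ll n\), and \(x^\star\) is (approximately) k-sparse; the first program is the constrained form usually called basis pursuit denoising, and the second is its Lagrangian form, the Lasso.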
Information theoretic arguments such as Fano's method characterize an optimal scaling of the number of observations required for sparse recovery by relaxation methods and other procedures in high-dimensional statistical inference. In the remaining part of this chapter we derive a few information theoretic bounds pertaining to the problem at hand. We also propose and prove several interesting statistical properties of the square root of the Jensen-Shannon divergence, a well-known information theoretic metric, and exploit other known ones. Information theoretic limits have likewise been studied for linear prediction.
Learn about the state of the art at the interface between information theory and data science with this first unified treatment of the subject. On the other hand, fundamental information theoretic bounds that are algorithm independent have been presented in [2, 1]. It has recently been shown that, for compressive sensing, significantly fewer measurements may be required if the sparsity assumption is replaced by the assumption that the unknown vector lies near the range of a suitably chosen generative model.
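In generic form (our notation, not a verbatim statement from the cited works, and with k now denoting the latent dimension), the generative-model variant replaces the sparsity prior by a latent-variable one: the unknown signal is assumed to lie near the range of a generator G, and recovery searches over the low-dimensional latent code:

\[
y = A\,G(z^\star) + w, \qquad G : \mathbb{R}^k \to \mathbb{R}^n,\ \ k \ll n,
\qquad
\hat{z} \in \arg\min_{z \in \mathbb{R}^k} \|y - A\,G(z)\|_2, \qquad \hat{x} = G(\hat{z}),
\]

and the information theoretic question is how small the number of measurements \(m\) can be while still guaranteeing that \(\hat{x}\) is close to \(x^\star = G(z^\star)\).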
Here the authors propose a quantity, named sensing capacity, to incorporate the effects of distortion (a generic sketch of this notion is given below), and consider two types of distortion for reconstruction. The standard approach to taking pictures is to first take a high-resolution picture in the "standard basis" (e.g., the pixel basis) and only then compress it. The principal observation here is that most natural phenomena of interest are compressible, i.e., well approximated by sparse representations; sparse vectors are a widely used tool in many fields. Compressed sensing is an emerging field based on the revelation that a small group of linear projections of a sparse signal contains enough information for reconstruction. In work on universal measurement bounds for structured sparse signal recovery, the group-structured sparse recovery problem is analyzed using a random Gaussian measurement model. We develop information theoretic performance bounds on target recognition based on statistical models for sensors and data, and examine conditions under which these bounds are tight; the improved performance of these methods over their standard counterparts is demonstrated using simulations.
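As a rough sketch of the sensing capacity idea (the precise definition, asymptotic regime, and regularity conditions are those of the referenced work and are not reproduced here), it can be thought of as the largest ratio of signal dimensions to projections at which reconstruction within a target distortion remains possible:

\[
C(D) \;=\; \sup\Big\{ \tfrac{n}{m} \;:\; \text{sensing and reconstruction schemes exist with } \mathbb{E}\,d\big(x,\hat{x}\big) \le D \Big\},
\]

so that reconstruction within distortion \(D\) requires, roughly, \(m \gtrsim n / C(D)\) noisy projections.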
Another goal of this paper is to develop information theoretic bounds for this emerging framework. Detection and recognition problems are modeled as composite hypothesis testing problems involving nuisance parameters. Saligrama presented information theoretic bounds on the sensing capacity of sensor networks under fixed SNR at the Information Theory Workshop.
Reference 5 investigated the information contained in noisy measurements by viewing the measurement system as an information theoretic channel (see also "Information theoretic bounds for compressed sensing," IEEE Transactions on Information Theory 56(10)).
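A minimal way to make this channel viewpoint concrete (a generic capacity argument in our notation, not the specific derivation of reference 5): with independent Gaussian noise \(w_i \sim \mathcal{N}(0,\sigma^2)\) and each noiseless projection \(a_i^{\top}x\) having power at most \(P\),

\[
I(x;y) \;\le\; \sum_{i=1}^{m} \big( h(y_i) - h(w_i) \big) \;\le\; \frac{m}{2}\,\log\!\Big(1 + \frac{P}{\sigma^2}\Big),
\]

so each noisy projection conveys at most the capacity of a single use of a Gaussian channel, and the total information gathered about \(x\) grows at most linearly in the number of measurements \(m\).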
CS is considered a new signal acquisition paradigm with which sample taking can be faster than with conventional sampling; in this view, compressed sensing is a framework for integrated sensing and compression, and the sparse signal x can furthermore be reconstructed using linear programming, which has polynomial complexity. The theory of compressed sensing, where one is interested in recovering a high-dimensional signal from a small number of measurements, has grown into a rich field of investigation and found many applications [2-4]. In this paper we introduce a new theory for distributed compressed sensing (DCS) that enables new distributed coding algorithms for multi-signal ensembles. Finally, we characterize the privacy properties of the compression procedure in information theoretic terms, establishing upper bounds on the rate of information communicated between the compressed and the original data. Similarly, in [10], the authors consider the matrix completion problem and again use information theoretic techniques to obtain bounds. The obtained bounds establish the relation between the complexity of the autoregressive process and the attainable estimation accuracy through the use of a novel measure of complexity. The focus of our technique is the replacement of the generalized Kullback-Leibler divergence with an information theoretic metric, namely the square root of the Jensen-Shannon divergence, which is related to an approximate, symmetrized version of the Poisson log likelihood function; numerical experiments are performed showing the practical use of the technique in signal and image reconstruction from compressed measurements under Poisson noise.
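The square root of the Jensen-Shannon divergence referred to above can be written down in a few lines; the following self-contained sketch (variable names and the toy distributions are ours, and it operates on discrete probability vectors in natural-log units rather than on the Poisson likelihood model of the text) also spot-checks the triangle inequality that makes it a metric.

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) for discrete distributions (natural log)."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def sqrt_js(p, q):
    """Square root of the Jensen-Shannon divergence between two probability vectors."""
    p = np.asarray(p, dtype=float); p = p / p.sum()
    q = np.asarray(q, dtype=float); q = q / q.sum()
    m = 0.5 * (p + q)                     # the mixture midpoint
    return float(np.sqrt(0.5 * kl(p, m) + 0.5 * kl(q, m)))

# spot-check the triangle inequality on three arbitrary toy distributions
a, b, c = [0.7, 0.2, 0.1], [0.2, 0.5, 0.3], [0.1, 0.1, 0.8]
print(sqrt_js(a, c) <= sqrt_js(a, b) + sqrt_js(b, c))   # expected: True
```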
In the setting of information theoretic lower bounds for compressive sensing with generative models, the goal of standard compressive sensing is to estimate an unknown vector from a limited number of noisy linear measurements. In this section, we put our work in the context of existing work on Poisson compressed sensing with theoretical performance bounds. Since arguments for establishing information theoretic lower bounds are not algorithm specific, the resulting limits apply to any recovery procedure. Algorithms and bounds for sensing capacity and compressed sensing, with applications to learning graphical models, are given by S. Aeron, M. Zhao, and V. Saligrama (2008 Information Theory and Applications Workshop, pp. 303-309, 2008), and a novel technique using polar codes was presented at the 2010 IEEE 18th Signal Processing and Communications Applications Conference (SIU). The problem has received significant interest in the compressed sensing and sensor network (SNET) literature. In this paper we derive information theoretic performance bounds for the sensing and reconstruction of sparse phenomena from noisy random projections of data.
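In generic form, the Fano-style argument behind such algorithm-independent lower bounds runs as follows; this is only a sketch in our notation, with constants and exact conditions differing across the works cited here, and it is not the specific derivation referred to in the next paragraph. Let the support S of the k-sparse signal be drawn uniformly from the \(\binom{n}{k}\) possible patterns. Fano's inequality gives, for any estimator \(\hat{S}(y)\),

\[
\Pr\big[\hat{S} \neq S\big] \;\ge\; 1 - \frac{I(S;y) + \log 2}{\log \binom{n}{k}},
\]

and combining this with a per-measurement bound of the form \(I(S;y) \le m\,C\) (each noisy projection reveals at most \(C\) nats about the support, e.g. via the Gaussian-channel bound sketched earlier) shows that reliable support recovery requires

\[
m \;\gtrsim\; \frac{\log\binom{n}{k}}{C} \;=\; \Omega\big(k \log(n/k)\big) \quad \text{for fixed } C.
\]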
The course notes mentioned earlier aimed at introducing the topic of compressed sensing (CS). The problem of sparse estimation via linear measurements, commonly referred to as compressive sensing, is particularly well understood, with theoretical developments including sharp performance bounds both for practical algorithms [4, 7, 8, 6] and for potentially intractable, information theoretically optimal algorithms [9, 10, 11, 12]. Using an Information Theoretic Metric for Compressive Recovery under Poisson Noise is by Sukanya Patil, Karthik S. Gurumoorthy, and Ajit Rajwade (IIT Bombay and ICTS-TIFR, Bangalore). We emphasize that although our derivation assumes the measurement matrix to be Gaussian, it can be extended to any sub-Gaussian case at the cost of a small constant factor.
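For completeness, here is the standard definition invoked in the Gaussian-to-sub-Gaussian extension above (the size of the constant factor is not specified in the text and is not claimed here): a zero-mean random variable X is sub-Gaussian with parameter \(\sigma\) if

\[
\mathbb{E}\big[e^{\lambda X}\big] \;\le\; e^{\lambda^2 \sigma^2 / 2} \qquad \text{for all } \lambda \in \mathbb{R},
\]

a class that contains Gaussian, Rademacher (\(\pm 1\)), and all bounded zero-mean entries; measurement matrices with i.i.d. sub-Gaussian entries then behave, for these arguments, like Gaussian ones up to constant factors.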