CoSaMP

CoSaMP: Iterative signal recovery from incomplete and inaccurate samples. Compressive sampling offers a new paradigm for acquiring signals that are compressible with respect to an orthonormal basis. The major algorithmic challenge in compressive sampling is to approximate a compressible signal from noisy samples. This paper describes a new iterative recovery algorithm called CoSaMP that delivers the same guarantees as the best optimization-based approaches. Moreover, this algorithm offers rigorous bounds on computational cost and storage. It is likely to be extremely efficient for practical problems because it requires only matrix-vector multiplies with the sampling matrix. For compressible signals, the running time is just $O(N \log^2 N)$, where $N$ is the length of the signal.
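The iteration summarized in the abstract (form a signal proxy with the adjoint of the sampling matrix, merge the largest proxy entries with the current support, solve a least-squares problem on the merged support, prune to the sparsity level, and update the sample residual) can be sketched in a few lines of NumPy. This is a minimal illustrative sketch, not the authors' reference implementation: the function name `cosamp`, the dense `lstsq` solve, and the stopping rule (`max_iter`, `tol`) are assumptions, and the paper's $O(N \log^2 N)$ bound relies on fast matrix-vector multiplies and an iterative least-squares solver rather than the dense solve used here. A real-valued sampling matrix is assumed.

```python
import numpy as np


def cosamp(Phi, u, s, max_iter=50, tol=1e-6):
    """Sketch of CoSaMP: recover an s-sparse approximation from u = Phi @ x + noise."""
    m, N = Phi.shape
    a = np.zeros(N)                  # current signal approximation
    v = u.astype(float).copy()       # residual of the samples
    for _ in range(max_iter):
        y = Phi.T @ v                                    # signal proxy (adjoint applied to residual)
        omega = np.argsort(np.abs(y))[-2 * s:]           # indices of the 2s largest proxy entries
        T = np.union1d(omega, np.flatnonzero(a))         # merge with the current support
        b = np.zeros(N)
        b[T] = np.linalg.lstsq(Phi[:, T], u, rcond=None)[0]  # least squares on the merged support
        a = np.zeros(N)
        keep = np.argsort(np.abs(b))[-s:]                # prune to the s largest entries
        a[keep] = b[keep]
        v = u - Phi @ a                                  # update the sample residual
        if np.linalg.norm(v) <= tol * np.linalg.norm(u):
            break
    return a


# Toy usage (assumed setup): a 10-sparse signal of length 512 from 128 Gaussian samples.
rng = np.random.default_rng(0)
N, m, s = 512, 128, 10
Phi = rng.standard_normal((m, N)) / np.sqrt(m)
x = np.zeros(N)
x[rng.choice(N, s, replace=False)] = rng.standard_normal(s)
x_hat = cosamp(Phi, Phi @ x, s)
```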


References in zbMATH (referenced in 157 articles)

Showing results 1 to 20 of 157.
Sorted by year (citations)

  1. Barbara, Abdessamad; Jourani, Abderrahim; Vaiter, Samuel: Maximal solutions of sparse analysis regularization (2019)
  2. Kreuzer, Wolfgang: Using B-spline frames to represent solutions of acoustics scattering problems (2019)
  3. Manohar, Krithika; Kaiser, Eurika; Brunton, Steven L.; Kutz, J. Nathan: Optimized sampling for multiscale dynamics (2019)
  4. Xu, Fengmin; Dai, Yuhong; Zhao, Zhihu; Xu, Zongben: Efficient projected gradient methods for cardinality constrained optimization (2019)
  5. Aceska, Roza; Bouchot, Jean-Luc; Li, Shidong: Fusion frames and distributed sparsity (2018)
  6. Brugiapaglia, Simone; Nobile, Fabio; Micheletti, Stefano; Perotto, Simona: A theoretical study of compressed solving for advection-diffusion-reaction problems (2018)
  7. Cai, Jian-Feng; Wang, Tianming; Wei, Ke: Spectral compressed sensing via projected gradient descent (2018)
  8. Elenberg, Ethan R.; Khanna, Rajiv; Dimakis, Alexandros G.; Negahban, Sahand: Restricted strong convexity implies weak submodularity (2018)
  9. Flinth, Axel; Kutyniok, Gitta: PROMP: a sparse recovery approach to lattice-valued signals (2018)
  10. He, Yong; Zhang, Xinsheng; Zhang, Liwen: Variable selection for high dimensional Gaussian copula regression model: an adaptive hypothesis testing procedure (2018)
  11. Lai, Chun-Kit; Li, Shidong; Mondo, Daniel: Spark-level sparsity and the $\ell_1$ tail minimization (2018)
  12. Li, Qia; Zhang, Na: Capped $\ell_p$ approximations for the composite $\ell_0$ regularization problem (2018)
  13. Mathelin, Lionel; Kasper, Kévin; Abou-Kandil, Hisham: Observable dictionary learning for high-dimensional statistical inference (2018)
  14. Saab, Rayan; Wang, Rongrong; Yılmaz, Özgür: Quantization of compressive samples with stable and robust recovery (2018)
  15. Shen, Jie; Li, Ping: A tight bound of hard thresholding (2018)
  16. Tang, Sunli; Fernandez-Granda, Carlos; Lannuzel, Sylvain; Bernstein, Brett; Lattanzi, Riccardo; Cloos, Martijn; Knoll, Florian; Assländer, Jakob: Multicompartment magnetic resonance fingerprinting (2018)
  17. Wang, Gang; Niu, Min-Yao; Fu, Fang-Wei: Deterministic construction of compressed sensing matrices with characters over finite fields (2018)
  18. Yuan, Xiao-Tong; Li, Ping; Zhang, Tong: Gradient hard thresholding pursuit (2018)
  19. Zhu, Zhihui; Li, Gang; Ding, Jiajun; Li, Qiuwei; He, Xiongxiong: On collaborative compressive sensing systems: the framework, design, and algorithm (2018)
  20. Adamo, Alessandro; Grossi, Giuliano; Lanzarotti, Raffaella; Lin, Jianyi: Sparse decomposition by iterating Lipschitzian-type mappings (2017)
