====== Lecture: A Mathematical Introduction to Compressed Sensing ======
  
====== Exam ======

The **exam** is scheduled for **19.02** in room 4/209. Please find your assigned time slot below.
  * **11:00 - 11:35** Steffen Bauer
  * **11:45 - 12:20** Ulrich Kunz
  * **12:30 - 13:05** Carlos Albertos Rios
  * small break
  * **13:30 - 14:05** Jonathan Schwarz
  * **14:15 - 14:50** André Schulze
  * **15:00 - 15:35** Michael Tabachnik
  * **15:45 - 16:20** Aliya Amirzhanova

==== Content ====

This lecture covers the basic mathematical concepts of Compressed Sensing (CS). CS is a sampling theory that emerged in 2006 with the work of Candès, Tao and Donoho [[http://people.ee.duke.edu/~lcarin/01580791.pdf|CRT06]], [[http://ieeexplore.ieee.org/document/1614066/|Don06]] and quickly attracted the attention of mathematicians from several areas due to its solid mathematical foundation. The key idea of CS for addressing the big data problem is to avoid sampling data that can be recovered afterwards. The CS theory is built on three pillars: sparsity, uniform random subsampling and concentration of measure. This lecture provides an introduction to these basic concepts and gives an overview of the established compressed sensing theory. In addition, we will discuss sparse optimization algorithms and several applications. The main reference is [[http://www.springer.com/de/book/9780817649470|FR13]].
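As a small illustration of these ideas (not part of the official course material), the following Python sketch recovers a sparse vector from a few random Gaussian measurements by solving the basis pursuit problem min ||x||_1 subject to Ax = b, recast as a linear program. All dimensions and parameters are chosen only for demonstration.

<code python>
# Hedged sketch: sparse recovery by l1-minimization (basis pursuit) via a linear program.
# The split x = x_pos - x_neg with x_pos, x_neg >= 0 turns min ||x||_1 s.t. Ax = b into an LP.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m, s = 200, 60, 5  # ambient dimension, number of measurements, sparsity (illustrative values)

# s-sparse ground truth and a Gaussian measurement matrix
# (concentration of measure is what makes such random matrices work well for CS)
x_true = np.zeros(n)
support = rng.choice(n, size=s, replace=False)
x_true[support] = rng.standard_normal(s)
A = rng.standard_normal((m, n)) / np.sqrt(m)
b = A @ x_true

# LP: minimize sum(x_pos) + sum(x_neg)  s.t.  [A, -A] [x_pos; x_neg] = b,  variables >= 0
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=(0, None), method="highs")
x_hat = res.x[:n] - res.x[n:]

print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
</code>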
  
  
The content of the lecture is targeted at students of mathematics and scientific computing with a long-term interest in mathematical imaging, to prepare them for more advanced topics closer to research.
To help students connect the theoretical concepts with practical applications, the course is accompanied by an optional programming project.
  
  
==== Language ====

English or German, as the audience requests.
  
==== Prerequisites ====

All proofs are elementary and only require knowledge from the mandatory undergraduate courses on analysis, linear algebra and probability theory. Basic tools from convex optimization will be provided.
 + 
==== Literature ====

  * S. Foucart, H. Rauhut, A Mathematical Introduction to Compressive Sensing, Birkhäuser, 2013
  * S. Boyd, L. Vandenberghe, Convex Optimization, Cambridge University Press, 2004
  * M. Ledoux, The Concentration of Measure Phenomenon, American Mathematical Society, 2005
  * R. Schneider, W. Weil, Stochastic and Integral Geometry, Springer, 2008
  * J.-L. Starck, F. Murtagh, J.-M. Fadili, Sparse Image and Signal Processing, Cambridge University Press, 2010
 + 
==== Registration ====

If you wish to attend the lecture and the exercises, please sign up using [[https://muesli.mathi.uni-heidelberg.de/|MÜSLI]].

<color #ed1c24>You have to be logged in to access the files listed below.</color>
  
==== Lecture Notes ====
  * Week 1 {{ :teaching:ft1819:compressedsensing:introcs_wt2018_19.pdf |Introduction slides}}
  * Week 2 {{ :teaching:ft1819:compressedsensing:lecture1.pdf |Vector spaces and (quasi-)norms; Sparsity and compressibility}}
  * Week 3 {{ :teaching:ft1819:compressedsensing:lecture3.pdf |Recovery of individual sparse vectors; NP-hardness of l0-minimization}} {{ :teaching:ft1819:compressedsensing:l0-np-hardness.pdf |Slides}}
  * Week 4 {{ :teaching:ft1819:compressedsensing:lecture4.pdf |l-p minimization for l-0 minimization; Null space property}}
  * Week 11 {{ :teaching:ft1819:compressedsensing:rip.pdf |Restricted Isometry Property}}
  * Week 12 {{ :teaching:ft1819:compressedsensing:coherence.pdf |Coherence}} {{ :teaching:ft1819:compressedsensing:greedymethods.pdf |Greedy Methods}}