Modern Signal Processing for Communications is the first course of the module of the same name. The second course, Mathematical Introduction to Machine Learning, is offered every winter semester. You can start the module in either the summer or the winter semester.

The course will take place entirely online via BigBlueButton every Tuesday from 14:15 to 15:45. Electronic lecture notes will be provided after each lecture. Lectures will not be recorded. The lectures will be held by Dr. Renato L. G. Cavalcante.


Learning Outcomes

The main objective of this lecture series is to give students a working knowledge of a broad class of Mann-type iterative algorithms in Hilbert spaces, with a focus on projection-based methods. These algorithms have been used to solve diverse problems in science and engineering, such as interference reduction in MIMO systems, adaptive beamforming, peak-to-average power ratio (PAPR) reduction in OFDM systems, acoustic source localization, environmental modeling in wireless multi-agent systems, radio map reconstruction, and machine learning, to name a few. The initial lectures give students the necessary background on real Hilbert spaces and their connection to more general spaces. Students are then exposed to convex feasibility problems and vector space projection methods, and the mathematical concepts are illustrated as often as possible with concrete, real-world applications in wireless communications and signal processing. The final lectures introduce students to fixed point algorithms based on Mann-type iterations.

In the second track of lectures, students will develop a solid understanding of the theoretical foundations of machine learning and will be able to develop, apply, and analyse the complexity of the resulting learning algorithms. Special emphasis will be put on methods for constructing efficient learning algorithms, for example empirical risk minimisation via linear programming, ideas based on perceptron algorithms, stochastic subgradient methods, and support vector machines.
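To give a flavour of the projection methods covered in the course, the following is a minimal sketch of alternating projections (POCS) for a convex feasibility problem. The two sets (a halfspace and a closed ball) and all numerical data are illustrative assumptions, not taken from the course material:

```python
import numpy as np

# Illustrative convex sets (assumptions for this sketch, not course data):
#   C1 = halfspace {x : a^T x <= b},  C2 = closed ball of radius r at the origin.
# POCS projects onto each set in turn until the iterate settles in C1 ∩ C2.

def project_halfspace(x, a, b):
    """Metric projection onto the halfspace {x : a^T x <= b}."""
    excess = a @ x - b
    if excess <= 0:
        return x            # already feasible
    return x - (excess / (a @ a)) * a

def project_ball(x, r):
    """Metric projection onto the closed ball of radius r centred at 0."""
    norm = np.linalg.norm(x)
    return x if norm <= r else (r / norm) * x

def pocs(x0, a, b, r, iters=200):
    """Alternating (cyclic) projections onto C1 and C2."""
    x = x0
    for _ in range(iters):
        x = project_ball(project_halfspace(x, a, b), r)
    return x

a = np.array([1.0, 1.0])
x = pocs(np.array([3.0, 4.0]), a, b=1.0, r=2.0)
# x now lies (numerically) in both sets.
```

The same pattern extends to parallel projection methods by averaging the individual projections instead of composing them.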

Content

The learning content includes:
  1. Introduction and outline of the course
  2. Metric spaces; Vector spaces; Normed vector spaces and Banach spaces; Inner products and real Hilbert spaces
  3. Basics in convex analysis: convex sets, projections and relaxed projections, the fundamental theorem of POCS, parallel projection methods, applications (interference reduction in communication systems, acoustic source localization with wireless sensor networks, estimation tasks in massive MIMO systems, kernel machines in sensor networks)
  4. Selected topics in quasi-nonexpansive operator theory. Mann-type iterative algorithms.
  5. Splitting methods for convex optimization (forward-backward splitting methods, proximal/projected gradient methods, etc.)
  6. Models of learning, loss functions, and risks.
  7. Stochastic inequalities and concentration of measure
  8. Uniform laws of large numbers, Rademacher complexity, and learning via uniform convergence.
  9. Vapnik-Chervonenkis dimension and bounds on sample complexity
  10. Support vector machines and kernel methods.
  11. Stochastic subgradient algorithms.
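As a hedged illustration of the splitting methods in item 5, here is a minimal projected-gradient sketch (a special case of forward-backward splitting). The objective, constraint set, and data below are illustrative assumptions, not course material:

```python
import numpy as np

# Projected gradient method for:  minimise 0.5*||Ax - b||^2  subject to  x >= 0.
# Forward step: gradient descent on the smooth term.
# Backward step: projection onto the nonnegative orthant (componentwise clipping).

def projected_gradient(A, b, iters=500):
    # Step size 1/L, where L = ||A||_2^2 is the Lipschitz constant of the gradient.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)            # forward (gradient) step
        x = np.maximum(x - grad / L, 0.0)   # backward step: projection onto x >= 0
    return x

# Illustrative data: with A = I the solution is simply b clipped at zero.
A = np.eye(2)
b = np.array([2.0, -1.0])
x = projected_gradient(A, b)
```

Replacing the projection with a more general proximal operator yields the proximal gradient methods listed in the same item.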

Recommended literature
Censor, Yair, Wei Chen, Patrick L. Combettes, Ran Davidi, and Gabor T. Herman. "On the effectiveness of projection methods for convex feasibility problems with linear inequality constraints." Computational Optimization and Applications 51, no. 3 (2012): 1065-1088.
Combettes, Patrick L. "The foundations of set theoretic estimation." Proceedings of the IEEE 81, no. 2 (1993): 182-208.
I. Yamada, M. Yukawa, and M. Yamagishi, "Minimizing the Moreau envelope of nonsmooth convex functions over the fixed point set of certain quasi-nonexpansive mappings," in: Fixed-Point Algorithms for Inverse Problems in Science and Engineering, H. Bauschke, R. Burachik, P. L. Combettes, V. Elser, D. R. Luke, and H. Wolkowicz, Eds. Springer-Verlag, 2011. (In the future, we will also be using a book currently being written by the same authors.)
Luenberger, David G. Optimization by vector space methods. John Wiley & Sons, 1968.
Rudin, Walter. Principles of Mathematical Analysis (International Series in Pure & Applied Mathematics). 3rd ed. McGraw-Hill, 1976.
Stark, Henry, and Yongyi Yang. Vector space projections: a numerical approach to signal and image processing, neural nets, and optics. John Wiley & Sons, Inc., 1998.
Theodoridis, Sergios, Konstantinos Slavakis, and Isao Yamada. "Adaptive learning in a world of projections." IEEE Signal Processing Magazine 28, no. 1 (2011): 97-123.
M. Mohri, A. Rostamizadeh, and A. Talwalkar, "Foundations of Machine Learning", MIT Press, 2018.
R. Vershynin, “High-Dimensional Probability: An Introduction with Applications in Data Science”, Cambridge University Press, 2018
M. Wainwright, “High-Dimensional Statistics: A Non-Asymptotic Viewpoint”, Cambridge University Press, 2019
S. Shalev-Shwartz and S. Ben-David, "Understanding Machine Learning: From Theory to Algorithms", Cambridge University Press, 2014.


Assigned degree programs

  • Computer Engineering (Master of Science)
  • Elektrotechnik (Master of Science)
  • Wirtschaftsingenieurwesen (Master of Science)