The lab assignments in this course are implemented in the Python programming language. This sentence contains two words that may cause trouble for you: “Programming” and “Python.” If you have never written code before, some of the labs will be a slog. You can do them and you will learn interesting things. But you may feel that the effort is not worth the return. On the other hand, this is as good a chance as any to learn a little bit of programming. Being a good programmer is difficult. But being able to write a few hundred lines of code is something that anyone can learn in a couple of weeks.
If you have programmed before but have never used Python, no big deal. You can learn the peculiarities of Python’s syntax as you go along. Just be cognizant that there may be functions you don’t know about. Make sure you ask your teaching assistants for help instead of spending countless hours searching the Internet.
If you have not used Python before, you need to install it. If you use Mac OS, you already have Python installed. If you use Linux, chances are you already have Python installed as well. If you use Windows, chances are you do not have a Python installation. In that case:
Several of you are excellent programmers. This course is popular with computer science majors, for instance. If you are in this category, please remember that this is not a programming course. Yes, we are writing code to solve problems. But the goal is not to write code. The goal is to use numerical computations to learn. The code we will write is always very easy. Never more than a few dozen lines of code. The goal of writing the code is to learn from the results we obtain. Keep that in mind.
Lab 1: Discrete Complex Exponentials
Where shall we begin? From the very beginning. And in the beginning there was nothing. But then there was time. Our exploration of information processing starts from an exploration of the notion of time. More precisely, from the notion of rate of change. This is because different rates of change are often associated with different types of information. In a picture of a face, no change means you are looking at hair, foreheads, or cheeks. Rapid change means you are looking at a hairline, a pair of eyes, or a pair of lips. Different vowels are associated with different frequencies. So are musical tones. The lull of the ocean spreads across all frequencies. Weather changes occur over days. Seasons change over months. Climate changes over years.
The Fourier transform is the mathematical tool that uncovers the relationship between different rates of change and different types of information. In order to define Fourier transforms, which we will do in Lab 2, we first have to study discrete complex exponentials and some of their properties. This is the objective of this Lab.
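As a preview of what we will code in this lab, a discrete complex exponential can be generated and tested in a few lines of numpy. This is a minimal sketch; the function name and the 1/√N normalization below are our own choices, not a required interface:

```python
import numpy as np

def cexp(k, N):
    """Discrete complex exponential e_k(n) = exp(j*2*pi*k*n/N) / sqrt(N)."""
    n = np.arange(N)
    return np.exp(2j * np.pi * k * n / N) / np.sqrt(N)

# Orthonormality, a property we will prove in this lab:
# each exponential has unit energy, and distinct frequencies are orthogonal.
e1, e2 = cexp(1, 8), cexp(2, 8)
print(np.vdot(e1, e1))  # close to 1
print(np.vdot(e1, e2))  # close to 0
```

Note that `np.vdot` conjugates its first argument, which is exactly the inner product we use for complex signals.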
Lab 2: Discrete Fourier transform (DFT)
We have now seen the DFT expressed in abstract mathematical notation, but how can we implement it in the real world? In this lab, we will code a DFT function in Python, and then use it to explore the DFTs of some signals we have seen in class (the square pulse) and some that we have not (the triangular pulse, Parzen window, raised cosine, Gaussian, and Hamming window). We will then prove some of the important properties of the DFT: conjugate symmetry, linearity, energy conservation (Parseval’s theorem), and conservation of inner products (Plancherel’s theorem).
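A DFT function of the kind we will code can be sketched as a matrix-vector product. This is an illustrative sketch, not the lab's required implementation; the 1/√N normalization is one common convention:

```python
import numpy as np

def dft(x):
    """DFT: X[k] = (1/sqrt(N)) * sum_n x[n] * exp(-j*2*pi*k*n/N)."""
    N = len(x)
    n = np.arange(N)
    W = np.exp(-2j * np.pi * np.outer(n, n) / N) / np.sqrt(N)  # DFT matrix
    return W @ x

# Square pulse of width 4 in a window of length 16.
x = np.zeros(16)
x[:4] = 1
X = dft(x)
print(np.abs(X))  # the magnitude spectrum of the square pulse
```

The same result can be obtained from `np.fft.fft` after rescaling, which is a convenient way to check your own implementation.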
We will then see some more applications of the DFT to real-world signals that we are all familiar with – musical tones. Using the DFT, we will investigate the spectra of an A note and of the “Happy Birthday” song that we coded in the last lab. Finally, we will tie all this together to investigate the energy composition of “pure” notes, as well as of notes that are characterized by harmonics – that is, the “real” notes of various instruments.
Lab 3: Inverse Discrete Fourier transform (iDFT)
We have successfully implemented the DFT, transforming signals from the time domain to the frequency domain. However, can we transform these signals back to the time domain without losing any information? And why is the DFT important for signal and information processing? In this lab, we will learn the inverse Discrete Fourier Transform (iDFT), which recovers the original signal from its counterpart in the frequency domain. We will first prove a theorem that tells us a signal can be recovered from its DFT by taking the iDFT, and then code an iDFT class in Python to implement this process.
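The recovery the theorem promises can be verified numerically. The sketch below is our own illustration, again using the 1/√N normalization, with `np.fft.fft` standing in for the DFT class we will write:

```python
import numpy as np

def idft(X):
    """iDFT: x[n] = (1/sqrt(N)) * sum_k X[k] * exp(+j*2*pi*k*n/N)."""
    N = len(X)
    n = np.arange(N)
    W = np.exp(2j * np.pi * np.outer(n, n) / N) / np.sqrt(N)
    return W @ X

x = np.random.randn(32)
X = np.fft.fft(x) / np.sqrt(32)   # DFT with the 1/sqrt(N) convention
x_rec = idft(X)
print(np.allclose(x_rec, x))      # True: the signal is recovered exactly
```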
We will then introduce an important application of the DFT and iDFT: signal reconstruction and compression. We will use our DFT and iDFT Python classes to approximate some signals we have seen in previous labs, such as the square pulse and the triangular pulse, and study how well these approximations compare with the original signals. We will then work with a real-world signal: our own voice. We will code a Python class that can record and play back our voice, and build on it to implement DFT and iDFT based voice compression and masking. We will end with an interesting problem that allows you to uncover secret messages hidden in a signal that you may consider normal.
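The compression idea can be previewed in a few lines: keep only the largest DFT coefficients, discard the rest, and reconstruct with the iDFT. This is a hedged sketch with names of our own choosing, using numpy's FFT in place of the classes we will write:

```python
import numpy as np

def compress(x, K):
    """Keep only the K largest-magnitude DFT coefficients and reconstruct."""
    X = np.fft.fft(x)
    keep = np.argsort(np.abs(X))[-K:]   # indices of the K largest coefficients
    X_c = np.zeros_like(X)
    X_c[keep] = X[keep]
    return np.real(np.fft.ifft(X_c))

# Triangular pulse: most of its energy sits in a few low frequencies.
x = np.concatenate([np.arange(16), np.arange(16)[::-1]]).astype(float)
x8 = compress(x, 8)
err = np.linalg.norm(x - x8) / np.linalg.norm(x)
print(err)   # small relative error with only 8 of 32 coefficients kept
```

Smooth signals compress well because their spectra decay quickly; the square pulse, with its sharp edges, will fare noticeably worse.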
Lab 4: Fourier transform
In the last assignment, we implemented the iDFT to recover discrete signals in the frequency domain back to the time domain. However, time in the physical world is neither discrete nor finite. In this lab, we will consider the Fourier transform of continuous-time signals and its connection to sampling theory. In the first part, we will show, both theoretically and numerically, that the DFT of a sampled signal approximates the Fourier transform of the continuous-time signal.
Next, we will look into the modulation and demodulation of signals. We will first prove that shifting a signal in the time domain is equivalent to multiplying its Fourier transform by a complex exponential in the frequency domain. To get a better feel for this property, we will again carry out experiments with voice signals. We will modulate your voice by band-limiting and frequency shifting it. We will also consider another interesting problem: recovering individual voice signals from a mixed signal.
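The shift property is easy to check numerically before proving it. For a cyclic shift by n₀ samples, the DFT of the shifted signal should equal the original DFT multiplied by exp(−j2πkn₀/N). A minimal check, with our own variable names:

```python
import numpy as np

N, n0 = 64, 5
x = np.random.randn(N)
x_shift = np.roll(x, n0)                       # cyclic time shift by n0 samples

X = np.fft.fft(x)
k = np.arange(N)
X_pred = X * np.exp(-2j * np.pi * k * n0 / N)  # predicted DFT of shifted signal

print(np.allclose(np.fft.fft(x_shift), X_pred))  # True
```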
Lab 5: Sampling
Signals exist in continuous time but it is not unusual for us to process them in discrete time. When we work in discrete time we say that we are doing discrete signal processing, something that is convenient due to the relative ease and lower cost of using computers to manipulate signals. When we use discrete time representations of continuous time signals we need to implement processes to move back and forth between continuous and discrete time. The process of obtaining a discrete time signal from a continuous time signal is called sampling. The process of recovering a continuous time signal from its discrete time samples is called signal reconstruction or interpolation.
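Interpolation can be previewed with the classical sinc reconstruction formula, which recovers a band-limited signal exactly from its samples. The sketch below is our own illustration (in practice the infinite sum must be truncated, so the result is only approximate away from the sample points):

```python
import numpy as np

fs, f0 = 20.0, 3.0                      # sampling rate and tone frequency (f0 < fs/2)
Ts = 1 / fs
n = np.arange(-50, 51)                  # finite window of sample indices
x_s = np.cos(2 * np.pi * f0 * n * Ts)   # samples of x(t) = cos(2*pi*f0*t)

def reconstruct(t, x_s, n, Ts):
    """Sinc interpolation: x(t) ~ sum_n x[n] * sinc((t - n*Ts)/Ts)."""
    return np.sum(x_s * np.sinc((t - n * Ts) / Ts))

t = 0.123                               # a time strictly between samples
print(reconstruct(t, x_s, n, Ts), np.cos(2 * np.pi * f0 * t))  # nearly equal
```

At the sample instants themselves the formula is exact, since every sinc term but one vanishes.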
Lab 6: Voice Recognition
The goal of this lab is to use your accumulated knowledge of signal and information processing to design a system for the recognition of a spoken digit.
Lab 7: Voice recognition with a linear time invariant system
The goal of this lab is to create a linear time invariant system to implement online recognition of a spoken digit.
Lab 8: Two-Dimensional Signal Processing and Image De-noising
The goal of this lab is to implement two-dimensional signal processing, and leverage that for image filtering and denoising.
Lab 9: The Discrete Cosine Transform and JPEG
The goal of this lab is to use the two-dimensional Discrete Cosine Transform (2D DCT) to carry out signal compression and reconstruction tasks for image processing applications.
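The 2D DCT of an image block is just the 1D DCT applied along rows and then along columns. A minimal sketch built from first principles, with the orthonormal DCT-II matrix; the function names are our own:

```python
import numpy as np

def dct_matrix(N):
    """Orthonormal DCT-II matrix: C[k, n] = c_k * cos(pi*(2n+1)*k / (2N))."""
    n = np.arange(N)
    C = np.cos(np.pi * (2 * n[None, :] + 1) * n[:, None] / (2 * N))
    C[0, :] /= np.sqrt(2)               # DC row scaled to make C orthonormal
    return C * np.sqrt(2 / N)

def dct2(block):
    """2D DCT of a square block: 1D DCT on columns, then on rows."""
    C = dct_matrix(block.shape[0])
    return C @ block @ C.T

# JPEG operates on 8x8 blocks; the DCT concentrates a smooth block's
# energy into a handful of low-frequency coefficients.
block = np.outer(np.arange(8), np.ones(8))   # a smooth vertical gradient
B = dct2(block)
print(np.round(np.abs(B), 2))
```

Because the DCT matrix is orthonormal, the transform conserves energy and is inverted simply by `C.T @ B @ C`.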
Lab 10: Principal Component Analysis
This lab will introduce the technique of principal component analysis (PCA). We will see how essential information about a data set is contained in its covariance matrix, and we will use the eigenvectors of this matrix to “transform” our signal to the PCA domain. This is analogous to the frequency domain for a time signal!
We will reconstruct a given face using a varying number of principal components, and we will examine the reconstruction error to quantitatively assess our accuracy. How should this accuracy compare with that of the DFT?
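The whole PCA pipeline – covariance, eigenvectors, transform, reconstruction – fits in a few lines. A sketch on synthetic data rather than faces, with our own variable names:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic data: 200 samples in R^5 that mostly vary along 2 directions,
# plus a small amount of noise.
X = rng.normal(size=(200, 2)) @ rng.normal(size=(2, 5)) \
    + 0.01 * rng.normal(size=(200, 5))

mu = X.mean(axis=0)
Sigma = np.cov(X - mu, rowvar=False)       # 5x5 sample covariance matrix
eigval, eigvec = np.linalg.eigh(Sigma)     # eigh returns ascending eigenvalues
P = eigvec[:, ::-1]                        # principal components, largest first

K = 2                                      # keep the top K components
X_pca = (X - mu) @ P[:, :K]                # transform to the PCA domain
X_rec = X_pca @ P[:, :K].T + mu            # reconstruct from K components
err = np.linalg.norm(X - X_rec) / np.linalg.norm(X)
print(err)   # small: two components capture almost all the variance
```

For face images the vectors are much longer, but the procedure is identical, and the reconstruction error decreases as K grows.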
Lab 11: Face Recognition
The goal of this lab is to implement face recognition using Principal Component Analysis (PCA). One of the most important applications of PCA is mapping a dataset into a new space of smaller dimension in a way that minimizes the reconstruction error. Dimensionality reduction is especially useful for datasets in which the dimension of each sample point is large, as in the case of images.
Lab 12: Signal Processing on Graphs
The goal of this lab is to introduce Graph Signal Processing. In previous weeks, we focused our attention on discrete time signal processing, image processing, and principal component analysis (PCA). These three seemingly unrelated areas can be thought of as the study of signals on particular graphs: a directed cycle, a lattice, and a covariance graph. Thus, the theory of graph signal processing can be conceived as a unifying theory which develops tools for more general graph domains and, when particularized to the graphs just mentioned, recovers existing results.
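The Graph Fourier Transform is simply a projection of a graph signal onto the eigenvectors of a graph shift operator. A minimal sketch on an undirected cycle, using the graph Laplacian as the shift operator (one common choice; names are our own):

```python
import numpy as np

N = 8
# Adjacency matrix of an undirected cycle: node i connects to i-1 and i+1.
A = np.zeros((N, N))
for i in range(N):
    A[i, (i + 1) % N] = A[i, (i - 1) % N] = 1
L = np.diag(A.sum(axis=1)) - A          # graph Laplacian as shift operator

eigval, V = np.linalg.eigh(L)           # eigenvectors = graph Fourier basis

def gft(x):
    """Graph Fourier transform: project the signal onto the eigenbasis."""
    return V.T @ x

x = np.ones(N)                          # a constant graph signal
print(np.round(np.abs(gft(x)), 6))      # energy only at the zero eigenvalue
```

A constant signal is "smooth" on any connected graph, so its GFT concentrates at the zero eigenvalue, just as a constant time signal concentrates at frequency zero in the DFT.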
Lab 13: Graph Signal Processing – Classification of Cancer Types
The goal of this lab is to apply graph signal processing to improve the classification of cancer types through the use of genetic networks. In the previous lab, we studied graph signals, graph shift operators, the Graph Fourier Transform (GFT), and total variation. In this lab, we will work with genetic networks and use graph signal processing tools to improve cancer type classification.