Can anyone recommend some good reading on Fourier analysis (and the Fourier transform) over finite abelian groups? I've found brief descriptions of it in books on both representation theory and classical Fourier analysis (the best so far being the chapter in Tao and Vu's Additive Combinatorics), but I can't find a dedicated, detailed exposition of the area proper.

3 Answers

There's Audrey Terras' "Fourier Analysis on Finite Groups and Applications", which covers both the commutative and non-commutative settings (about half the book focuses on finite abelian groups). It's a fairly good introductory text, although it doesn't run very deep.

Although this question has been answered, I'd like to chime in that if you have any interest in probability it might be worth checking out Harmonic Analysis on Finite Groups by Ceccherini-Silberstein, Scarabotti, and Tolli. It covers the basic techniques of Fourier analysis on finite abelian groups and delves into some representation theory in the nonabelian case. The goal is slightly different from that of the other books: it develops these tools for the study of random walks on finite groups and other finite Markov chains.

Here's a simple example problem that I find both interesting and compelling. Imagine you have a deck of cards in some deterministic initial configuration. Define an elementary shuffle to be the result of independently selecting two cards from the deck (with the uniform probability measure on the cards) and swapping them. How many elementary shuffles does it take to make the deck "sufficiently random"? It's clear that after a single elementary shuffle the distribution on the set of possible decks (all 52! of them) is far from uniform, since you have changed at most 2 cards in your initial configuration. But it seems intuitively true that if you perform enough elementary shuffles the distribution on the set of possible decks should approach uniformity (i.e., in the total variation norm). So how long does it take? This problem was solved by Persi Diaconis (and someone else whose name escapes me) using tools of representation theory. It's much simpler in the case where your group is abelian, in which case Fourier analysis is good enough.
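As a rough illustration of the problem above (a brute-force sketch, not the representation-theoretic analysis), here's a computation for a toy "deck" of 5 cards: we track the exact distribution over all 5! = 120 orderings under repeated elementary shuffles and watch the total variation distance to uniform shrink. The deck size and step counts are just illustrative choices.

```python
from itertools import permutations

n = 5  # toy deck size (52 is far too many states to enumerate)
perms = list(permutations(range(n)))
index = {p: k for k, p in enumerate(perms)}
N = len(perms)  # 120 possible orderings

def swap(p, i, j):
    """Return the ordering p with positions i and j exchanged."""
    q = list(p)
    q[i], q[j] = q[j], q[i]
    return tuple(q)

def step(dist):
    """One elementary shuffle: two positions are chosen independently
    and uniformly (possibly equal), then swapped."""
    new = [0.0] * N
    for k, mass in enumerate(dist):
        if mass == 0.0:
            continue
        p = perms[k]
        for i in range(n):
            for j in range(n):
                new[index[swap(p, i, j)]] += mass / (n * n)
    return new

def tv(dist):
    """Total variation distance to the uniform distribution on N decks."""
    return 0.5 * sum(abs(x - 1.0 / N) for x in dist)

# Start from a deterministic configuration and shuffle repeatedly.
dist = [0.0] * N
dist[index[tuple(range(n))]] = 1.0
tvs = []
for t in range(1, 31):
    dist = step(dist)
    tvs.append(tv(dist))
```

After 2 shuffles the distance is still large (4-cycles and 5-cycles are unreachable, so a big chunk of the 120 decks has probability zero), while after 30 shuffles it is essentially zero.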

Btw, the most interesting thing to me (as someone with a curiosity for probability) is not just the quantitative estimates that can be achieved using these tools, but the fact that there is a so-called "cutoff" phenomenon. That is, not all elementary shuffles are created equal: there is a threshold number of elementary shuffles at which the next few shuffles reduce the variation distance from the uniform distribution far more than the preceding shuffles did. In essence, the first few shuffles randomize somewhat, but there is a point where a few more do nearly all the work. Weird! Search Google for "cutoff phenomenon" and the first link gives some discussion of this (sadly, I don't have enough points to include two hyperlinks in a post).
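To give a concrete taste of why Fourier analysis is good enough in the abelian case, here is a sketch (the group Z/12 and the lazy ±1 step law are just illustrative choices, and I'm using numpy's FFT): on a finite abelian group, convolution of distributions becomes pointwise multiplication of Fourier coefficients, so the distribution after t steps of a random walk has transform μ̂^t, and the variation distance to uniform can be read off directly.

```python
import numpy as np

# Random walk on the cyclic group Z/12: each step adds -1, 0, or +1,
# each with probability 1/3.  (Group and step law are illustrative.)
n = 12
mu = np.zeros(n)
mu[[0, 1, n - 1]] = 1.0 / 3.0

# Convolution diagonalizes under the discrete Fourier transform,
# so the t-fold convolution mu^{*t} has transform mu_hat ** t.
mu_hat = np.fft.fft(mu)

def tv_to_uniform(t):
    """Exact total variation distance to uniform after t steps."""
    p_t = np.fft.ifft(mu_hat ** t).real  # distribution after t steps
    return 0.5 * np.abs(p_t - 1.0 / n).sum()

print([round(tv_to_uniform(t), 4) for t in (1, 5, 20, 50)])
```

The decay rate is governed by the largest nontrivial Fourier coefficient, here |μ̂(1)| = (1 + 2cos(π/6))/3 ≈ 0.911; this spectral-gap observation is exactly the sort of estimate that the representation-theoretic analysis generalizes to nonabelian groups.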