Digital Tuner for iOS in Swift

I wrote this series after attending a course on Digital Signal Processing at
my university.
I will try not to go into too much detail, which means there are many essential
aspects that I deliberately don't cover (such as complex numbers). You should
be able to follow along even if you don't have a technical background.
If you find the subject interesting afterwards, I recommend attending a DSP
course at your university.

Fig. 1: Screenshot of the Guitar Tuner app.

In this last part of a series of articles on Fourier transform, we will bring
theory into practice by building our own digital guitar tuner for iOS in Swift
(fig. 1).

The code listings in this article are limited to code that is specific to this
application. A download link to the entire project, including everything that
is necessary to run the application, is provided as a zip archive at the end of
this article.

1. Getting Started

We will use AudioKit[1] to do the heavy lifting of capturing
output from the built-in microphone and analyzing the data. I can't stress this
enough: always use frameworks and libraries for the computational heavy lifting.
Never implement those algorithms on your own, except for educational purposes
of course.

To add AudioKit to your iOS app, you will want to use
CocoaPods[2]. You can install it from the command line by
running sudo gem install cocoapods.

After you have created a new Xcode iOS project, run pod init in the directory
of your project. Then, in your Podfile, uncomment use_frameworks! and add
pod 'AudioKit', '~> 2.2' between the first target '...' do and end.
Finally, run pod install. See fig. 2 for a recording.

Fig. 2: Recording of the terminal when setting up CocoaPods and installing
AudioKit.
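Put together, the resulting Podfile would look roughly like this (the target
name and the platform line are placeholders for whatever Xcode generated for
your project):

```ruby
platform :ios, '8.0'

target 'GuitarTuner' do
  # Required so that AudioKit is built as a dynamic framework.
  use_frameworks!

  pod 'AudioKit', '~> 2.2'
end
```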

2. Frequency Tracker

We start by capturing input from the microphone. Fortunately, AudioKit makes it
really easy to implement this part. Add import AudioKit to the top of your
Swift file.

// Start application-wide microphone recording.
AKManager.sharedManager().enableAudioInput()

// Add the built-in microphone.
let microphone = AKMicrophone()
AKOrchestra.addInstrument(microphone)

// Add an analyzer and store it in an instance variable.
self.analyzer = AKAudioAnalyzer(input: microphone.output)
AKOrchestra.addInstrument(self.analyzer!)

// Start the microphone and analyzer.
self.analyzer!.play()
microphone.play()

Fig. 3: Listing of code that demonstrates how to set up AudioKit in
Tuner.swift.

Note that AKAudioAnalyzer implements the algorithm that we examined last
week. Thanks to AudioKit, fig. 3 is all the code we need for this.

3. Polling for Updates

The AKAudioAnalyzer has a property that contains the measured frequency. We
will poll this property and update the UI accordingly. The number of updates
per second is a matter of choice; I chose to update every 100 ms, i.e. 10
times per second. We initialize and schedule an NSTimer in a single call in
fig. 4.

Fig. 4: Listing of code that demonstrates how to initialize and schedule a
new run loop timer in Tuner.swift.
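A sketch of what that scheduling looks like, using the API names of the time
(NSTimer with a string selector); Poller and tick are stand-in names of my
own, not the ones used in Tuner.swift:

```swift
import Foundation

class Poller: NSObject {
    var timer: NSTimer?

    func start() {
        // A 0.1 s interval means 10 updates per second; `repeats: true`
        // keeps the timer firing until it is invalidated.
        timer = NSTimer.scheduledTimerWithTimeInterval(0.1,
            target: self, selector: "tick", userInfo: nil, repeats: true)
    }

    func tick() {
        // Here the tuner would read the analyzer's frequency
        // property and update the UI.
    }
}
```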

What we want to do now is match the frequency we track using the microphone
with the nearest tone and octave.

4. Tones and Octaves

In music, there are 12 notes in a chromatic scale. Each note starts with a
letter (A–G), optionally followed by an accidental (sharp or flat). Although
there are multiple naming conventions, this is the (most commonly used) English
and Dutch convention. Each octave consists of those 12 notes. In fig. 5 I
provide a reference table for the frequencies of each note in each octave.

Number   Name      Octave 2    Octave 3    Octave 4    Octave 5    Octave 6
------   -------   ---------   ---------   ---------   ---------   ---------
1        C         65.406 Hz   130.81 Hz   261.63 Hz   523.25 Hz   1046.5 Hz
2        C♯ / D♭   69.296 Hz   138.59 Hz   277.18 Hz   554.37 Hz   1108.7 Hz
3        D         73.416 Hz   146.83 Hz   293.66 Hz   587.33 Hz   1174.7 Hz
4        E♭ / D♯   77.782 Hz   155.56 Hz   311.13 Hz   622.25 Hz   1244.5 Hz
5        E         82.407 Hz   164.81 Hz   329.63 Hz   659.26 Hz   1318.5 Hz
6        F         87.307 Hz   174.61 Hz   349.23 Hz   698.46 Hz   1396.9 Hz
7        F♯ / G♭   92.499 Hz   185.00 Hz   369.99 Hz   739.99 Hz   1480.0 Hz
8        G         97.999 Hz   196.00 Hz   392.00 Hz   783.99 Hz   1568.0 Hz
9        A♭ / G♯   103.83 Hz   207.65 Hz   415.30 Hz   830.16 Hz   1661.2 Hz
10       A         110.00 Hz   220.00 Hz   440.00 Hz   880.00 Hz   1760.0 Hz
11       B♭ / A♯   116.54 Hz   233.08 Hz   466.16 Hz   932.33 Hz   1864.7 Hz
12       B         123.47 Hz   246.94 Hz   493.88 Hz   987.77 Hz   1975.5 Hz

Fig. 5: Table with note names, octaves and their frequencies.

Obviously, you don't want to hardcode these frequencies into your application.
For one, you may later want to adapt your app to support custom standard
pitches (e.g. A = 435 Hz). Instead, we need a formula that, given a note (C to
B) and an octave (2 to 6 seems reasonable for guitars), can compute the
corresponding frequency. Fortunately, such a formula exists.

Let's look at octaves first. A2, A3, A4, etc. are the same note,
but have a different frequency. Precisely, each octave starts at twice the
frequency of the previous one. Does that make sense? Absolutely, let's look at
an example. In fig. 6 we can see that each higher tone fits exactly twice in
the lower tone.

Fig. 6: Visualization of the same tone (A) in three octaves.

Fig. 7: Visualization of the distribution of notes within one octave.

So if we can get a different octave by multiplying with 2^n, where n
is the number of octaves, we can reason about what we need to do to get a
different note in the same octave. Remember that there are 12 notes, so moving
one octave up is a multiplication with 2^(12/12) = 2. The notes
within one octave are evenly distributed (as shown in fig. 7). So
if we want to get from A to A♯, all we have to do is multiply with 2^(1/12)
(or approximately 1.0595).
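As a quick sanity check of that ratio (a sketch, not code from the project):
multiplying A4 by 2^(1/12) should land on A♯4, and applying the ratio twelve
times should double the frequency, giving A5.

```swift
import Foundation

let a4 = 440.0                       // standard pitch A4
let semitone = pow(2.0, 1.0 / 12.0)  // ratio between adjacent notes, ≈ 1.0595

let aSharp4 = a4 * semitone          // one note up: A♯4, ≈ 466.16 Hz (fig. 5)
let a5 = a4 * pow(semitone, 12.0)    // twelve notes up: one octave, 880 Hz
```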

In Swift, I implemented the notes using enumerations (fig. 8).
The pitch is implemented using a class that has instance variables for the
note, the octave and the computed frequency based on the standard pitch (A440),
multiplied with 2^(o + n/12), where n and o are the relative note and octave
respectively (so the exponent may be a negative number). In fig. 9 I show some
code for implementing the Pitch class.

Fig. 9: Listing of code that I use to implement pitches of all notes in the
2nd to 6th octave.
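As a rough sketch of the idea (type and case names are mine, not necessarily
those used in fig. 8 and 9), the frequency can be computed directly from the
note and octave relative to A4:

```swift
import Foundation

// All 12 notes; raw values are semitone offsets relative to A.
enum Note: Int {
    case C = -9, CSharp, D, DSharp, E, F, FSharp, G, GSharp, A, ASharp, B
}

struct Pitch {
    let note: Note
    let octave: Int

    // Frequency relative to the standard pitch A4 = 440 Hz,
    // computed as 440 · 2^(o + n/12).
    var frequency: Double {
        let octaveOffset = Double(octave - 4)
        let noteOffset = Double(note.rawValue) / 12.0
        return 440.0 * pow(2.0, octaveOffset + noteOffset)
    }
}
```

For example, Pitch(note: .C, octave: 4) yields ≈ 261.63 Hz, matching fig. 5.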

Finally, to find the nearest pitch, I simply map each pitch to a tuple that
contains the difference between the pitch frequency and the queried frequency.
Then all I have to do is sort by that difference and return the first item in
the resulting array. I show how to do this in fig. 10.

Fig. 10: Listing of code that I use to find the nearest pitch to a given
frequency.
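A minimal sketch of that lookup, using plain frequencies instead of Pitch
objects (the candidate list here is just octave 4 from fig. 5):

```swift
import Foundation

// Octave 4 frequencies from fig. 5 as candidates.
let octave4 = [261.63, 277.18, 293.66, 311.13, 329.63, 349.23,
               369.99, 392.00, 415.30, 440.00, 466.16, 493.88]

// Compare candidates by their absolute distance to the queried
// frequency and pick the one with the smallest distance.
func nearest(to frequency: Double, in candidates: [Double]) -> Double {
    return candidates.min { abs($0 - frequency) < abs($1 - frequency) }!
}
```

For instance, a tracked frequency of 442 Hz maps to the 440 Hz candidate (A4).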

5. Rotating Knob View

Although the user interface of the guitar tuner app is really minimalistic, some
thought went into creating the rotating knob view. You will notice that it
consists of several separate layers, some of which rotate.

Fig. 11: Individual layers of the knob view from KnobView.swift. From left to
right: the thin dash layer, the thick dash layer, the arrow layer, the stable
layer (always facing north) and the text layers for each pitch.

Both dash layers, the arrow layer and the text layers are grouped in another
layer that is rotated; I call it the turn layer. Only the rectangular
"stable" layer is fixed and does not rotate, so it is not placed in the turn
layer.

5.1. Dash Layers

The first two layers, the thin and thick dash layers, can be created with a
simple CAShapeLayer.

Fig. 12: Listing of code adapted from KnobView.swift that shows how to set up
the two dash layers.

In iOS, for some reason the total circumference is expressed in 720 units. For
the thin dash layer we want 120 dashes and each one should be 0.5 units wide.
Therefore, the dash pattern is [0.5, 5.5]: a 0.5-unit painted segment followed
by a 5.5-unit unpainted segment (120 × 6 = 720). The dash phase
is half the painted segment so that each dash is centered.

layer.lineDashPattern = [0.5, 5.5]
layer.lineDashPhase = 0.25

Fig. 13: Listing of code adapted from KnobView.swift that shows how to set up
the dash pattern and dash phase of the thin dash layer.

layer.lineDashPattern = [1.5, 58.5]
layer.lineDashPhase = 0.75

Fig. 14: Listing of code adapted from KnobView.swift that shows how to set up
the dash pattern and dash phase of the thick dash layer.

5.2. Arrow Layer

The arrow layer is also a simple CAShapeLayer with its path set to a triangle.
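A sketch of how such a triangle might be constructed from three points (the
exact geometry in KnobView.swift may differ; the function name is mine):

```swift
import Foundation

// Corners of an upward-pointing arrow triangle, centred at `center`,
// with the given width and height.
func arrowVertices(center: CGPoint, width: CGFloat, height: CGFloat) -> [CGPoint] {
    return [
        CGPoint(x: center.x, y: center.y - height / 2),             // tip
        CGPoint(x: center.x - width / 2, y: center.y + height / 2), // bottom left
        CGPoint(x: center.x + width / 2, y: center.y + height / 2)  // bottom right
    ]
}
// On iOS these points would be added to a UIBezierPath (or CGPath)
// and assigned to the shape layer's `path` property.
```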

Fig. 17: Listing of code adapted from KnobView.swift that shows how to set up
all 5 text layers.

Then, every time the tracked frequency changes, we update the pitch labels to
present the nearest pitches. In Pitch.swift I overloaded the + and - operators,
so if we want to get the next or previous pitch we can simply write pitch + 1
(see fig. 18). Note that the offset ranges from -2 to 2
and that 0 is the big label at the top.

Fig. 18: Listing of code adapted from KnobView.swift that shows how to
update the pitch labels.
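The operator overloading can be sketched as follows; SimplePitch is a stand-in
of my own that counts semitones, not the real Pitch class from Pitch.swift:

```swift
// A simplified pitch storing its offset in semitones from A4; the
// real Pitch works on note/octave pairs instead.
struct SimplePitch {
    var semitonesFromA4: Int
}

// Overloading + and - lets callers step to neighbouring pitches
// with expressions like `pitch + 1` or `pitch - 2`.
func + (pitch: SimplePitch, steps: Int) -> SimplePitch {
    return SimplePitch(semitonesFromA4: pitch.semitonesFromA4 + steps)
}

func - (pitch: SimplePitch, steps: Int) -> SimplePitch {
    return pitch + (-steps)
}
```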

5.5. Rotating the Knob

Every time the tracked pitch updates, we compute the distance between the
nearest pitch and the tracked frequency in Tuner.swift. We then divide that
distance by the total difference between the nearest pitch and the second
nearest pitch in order to express it as a fraction. Finally, we multiply
the result by two to get a knob angle between -0.5 and 0.5. In
fig. 19 I show how to do these computations.
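As a sketch (function and variable names are mine, not those of Tuner.swift),
the computation amounts to:

```swift
// Distance from the nearest pitch, expressed as a fraction of the
// gap to the second nearest pitch, then doubled to get the knob angle.
func knobAngle(frequency: Double, nearest: Double, secondNearest: Double) -> Double {
    let distance = frequency - nearest
    let total = secondNearest - nearest
    return distance / total * 2
}
```

With this, a frequency exactly on the nearest pitch yields an angle of 0, and
one a quarter of the way towards the second nearest pitch yields 0.5.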

Fig. 23: Listing of code adapted from PlotView.swift that shows how to set up
a float buffer.

Then we fill it with num floats using a for-loop, sampling
a sine at each iteration (fig. 24).

// These are arbitrary float parameters to get a nice effect.
let phase = (self.amplifier + 0.8) / 1.8
let amplitude = self.amplifier * amplitude

var t = (time + Double(i) / Double(num) * self.frequency + phase)
floats[i] = Float(sin(t * 2 * 3.14))

// Again the 2 is a bit arbitrary but it determines the speed of the sine.
time += self.frequency / 44100 / 2

Fig. 24: Listing of code adapted from PlotView.swift that shows how to
compute the result of a sine at each iteration.

Now we multiply the results with a sine of 0.5 Hz to fade out the horizontal
ends, and we multiply them with a power to make the plot look more dramatic at
the right side and less symmetric (fig. 25).
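That shaping step could look roughly like this; the half-sine fade follows the
description, while the exponent 1.5 is my own guess at the power used:

```swift
import Foundation

// Fade out both horizontal ends with half a sine period over x in
// [0, 1] (zero at both ends, peaking in the middle) and skew the
// plot with a power so the right side looks more dramatic.
func shape(sample: Double, x: Double) -> Double {
    let fade = sin(x * Double.pi)
    return sample * fade * pow(x, 1.5)
}
```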

Fig. 27: Listing of code adapted from DisplayView.swift that shows how to
use a gradient mask to fade out both horizontal ends of a
UIView.

7. Downloads & Links

This is it! I have uploaded the entire repository to GitHub. You can git clone
the repository or download the latest release as a zip archive or tarball. If
you stumble upon a bug, or if you can improve the code (e.g. make it more
Swift-y), please open a pull request (on GitHub) or send me an email; I really
appreciate that!