Introduction

A few years ago I was looking for an Objective-C framework that would let our project speak text on iOS devices. At the time I did not find any, only three plain speech synthesis libraries written in C: eSpeak, Flite and Festival.

After a couple of days of research and attempts to integrate those libraries with the iOS SDK, I chose eSpeak and Flite as candidates (I was able to successfully customize only eSpeak and Flite in a reasonable time, they support more languages, and Google uses eSpeak for its translation service…).

The next few sections describe the first speech synthesizer wrapper: ESpeakEngine.

Background

The ESpeakEngine is an Objective-C static library project containing a very light wrapper for the eSpeak open source speech synthesizer. It does not add any new features to eSpeak; it only exposes its functionality as Objective-C class methods and combines this functionality with the iOS AVFoundation framework (to see all available properties of the eSpeak synthesizer, please read the documentation on its homepage). It also uses the standard delegate pattern by defining ESpeakEngineDelegate.
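As a rough sketch of how the delegate pattern fits in (the actual callback names are hypothetical here; verify them against ESpeakEngineDelegate in the library headers), a class adopting the delegate might look like this:

```objc
#import "ESpeakEngine.h"

// Hypothetical sketch: the real ESpeakEngineDelegate method names may
// differ; check the protocol declaration in the library headers.
@interface SpeechController : NSObject <ESpeakEngineDelegate>
@end

@implementation SpeechController

// A typical delegate callback, invoked when the engine has finished
// speaking the current text (name is an assumption).
- (void)espeakEngineDidFinishSpeaking:(ESpeakEngine *)engine
{
    NSLog(@"Speech finished");
}

@end
```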

The static library project also contains a test target with a simple iPhone app. This sample app has a single screen with a UITextView for text input and a UIButton to start speech synthesis of the entered text.

Using the code

Using the ESpeakEngine is very easy. You only have to add a standard dependency on the ESpeakEngine static library project to your project (simply drag and drop the library project file from Finder into the Project Navigator):

and also link the ESpeakEngine data folder espeak-data: simply drag and drop this folder from the referenced eSpeak.xcodeproj project into the parent project (into any of its groups; in the example project I dropped it into the ESpeakTest/Supporting Files group):

Then import the ESpeakEngine header in the class that holds the engine instance:

#import "ESpeakEngine.h"

In the init or the viewDidLoad method, create a new instance of the ESpeakEngine and set any parameters you want (language, volume, gender, etc.):
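A minimal sketch of that setup, assuming setter and speak method names that may not match the library exactly (check ESpeakEngine.h for the real property names):

```objc
#import "ESpeakEngine.h"

// Minimal sketch inside a view controller. setLanguage:, setDelegate:
// and speak: are assumed method names; verify them against ESpeakEngine.h.
- (void)viewDidLoad
{
    [super viewDidLoad];

    // Create the engine once and keep a strong reference to it.
    self.engine = [[ESpeakEngine alloc] init];

    // Configure the synthesizer before speaking.
    [self.engine setLanguage:@"en"]; // hypothetical setter
    [self.engine setDelegate:self];
}

- (IBAction)speakButtonTapped:(id)sender
{
    // Synthesize the text currently entered in the text view.
    [self.engine speak:self.textView.text]; // hypothetical method
}
```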

Points of Interest

No documentation is included in this up-to-date version. However, the source code is self-explanatory and only a few hundred lines in total, and the test application is a good starting point for exploring further properties.

Any questions will be answered; feel free to contact me.

History

2010 Initial version

License

This article, along with any associated source code and files, is licensed under The BSD License

About the Author

Jozef Božek is currently a software engineer at bring-it-together s.r.o. in the area of large-scale information systems and mobile application development.
He has been developing in C++ nearly full time since 2000, in Java since 2004 and in Objective-C since 2009. He programs using the Java EE SDK, iOS SDK, COM/DCOM, MFC, ATL, STL and so on.

I'm sorry Jozef, I didn't mean that personally. I take back "nice try". I realised that it came out of my frustration looking for a text-to-speech that sounds natural. I downloaded twitran and cannot find where the button is to put the text-to-speech to the test. However, I did implement ESpeakEngine following your guide, with some other minor changes because of the errors you get on Xcode 4.5.2 with the current guide. The result was a computerised voice, though. This might be okay in a context like twitran but totally inadequate if the purpose is to read instructions to a native English speaker in English. If you think that I'm wrong (I really want to be wrong!), tell me where the text-to-speech is on twitran. I'll buy it if necessary and give you a five-star review if the voice isn't computerised, and if so, hopefully you can tell me how to make it sound natural in my implementation.

Hi, it's OK, I did not take it personally. I took the eSpeak library as is (as I am not an expert in TTS) because for our needs a computerized voice was quite suitable (in twitran, there is a speech button in the tweet detail view which is shown only if you select the translation button). Anyway, as far as I know, Google uses the eSpeak library as the TTS engine in its translate project and the resulting voice is quite good. So I think there is an option to set up the eSpeak properties to get better voice quality, but I did not examine it more deeply.

I am using eSpeak text-to-speech in one of my applications and I want to submit that application to the App Store. Does eSpeak allow using its libraries in paid applications? Also, how can I change the voice from male to female, and how can I add more languages? Any suggestions?
Regards, Ammad.Emy

Hi, it was a problem with the armv7 architecture. I just updated the project settings (I successfully built and tested it on an iPhone 5 with iOS 5.1). To get the current version of the ESpeakEngine library, please visit my Google Code repository: http://code.google.com/p/espeak-engine/

I too am having an issue with the armv7 version of the library.
I am using the latest version, downloaded today, 10 Jan 2013.
I am getting this linker report:

ld: warning: ignoring file /Users/linasses/Library/Developer/Xcode/DerivedData/ESpeakTest-gwthgoszenaqdzazfrpfugqfjhbh/Build/Products/Debug-iphoneos/libeSpeak.a, file was built for archive which is not the architecture being linked (armv7): /Users/linasses/Library/Developer/Xcode/DerivedData/ESpeakTest-gwthgoszenaqdzazfrpfugqfjhbh/Build/Products/Debug-iphoneos/libeSpeak.a
Undefined symbols for architecture armv7:
"_OBJC_CLASS_$_ESpeakEngine", referenced from:
objc-class-ref in FliteEngineTestViewController.o
ld: symbol(s) not found for architecture armv7
clang: error: linker command failed with exit code 1 (use -v to see invocation)

and:

ld: warning: ignoring file /Users/linasses/Library/Developer/Xcode/DerivedData/ESpeakTest-gwthgoszenaqdzazfrpfugqfjhbh/Build/Products/Debug-iphoneos/libeSpeak.a, file was built for archive which is not the architecture being linked (armv7): /Users/linasses/Library/Developer/Xcode/DerivedData/ESpeakTest-gwthgoszenaqdzazfrpfugqfjhbh/Build/Products/Debug-iphoneos/libeSpeak.a

Any ideas about how to fix it?

Also, you say that you built it for the iPhone 5 on iOS 5.1; I thought the iPhone 5 was iOS 6.0 and greater.
Also, which version of Xcode did you use? (I am trying 4.4.0.)

Also, ensure that you are linking the ESpeakEngine static library libeSpeak.a for the same build destination as the main application, e.g. iPhone 6.0 Simulator. Each time you change the build destination, you have to link the ESpeakEngine static library libeSpeak.a built for that same destination. You can use the lipo tool, which can combine multiple static library files into a single file.
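A sketch of that lipo step, assuming the two per-destination builds sit in the usual Xcode DerivedData output folders (the exact paths will differ on your machine):

```shell
# Combine the device (armv7) and simulator (i386) builds of libeSpeak.a
# into one universal static library, so the same file links for either
# build destination. Paths below are illustrative, not exact.
lipo -create \
    Build/Products/Debug-iphoneos/libeSpeak.a \
    Build/Products/Debug-iphonesimulator/libeSpeak.a \
    -output libeSpeak-universal.a

# Verify which architectures ended up in the combined library.
lipo -info libeSpeak-universal.a
```

Link the resulting libeSpeak-universal.a instead of the per-destination copies and the "file was built for archive which is not the architecture being linked" warning should go away.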

Hi, as you can see in the third screenshot, select your target, then "Build Phases", and there select the "Link Binary With Libraries" section. Click the "+" button in the "Link Binary With Libraries" section and then "Add Other..." (at the bottom left); this shows a file dialog where you can select the libeSpeak.a binary.

I'm having a problem getting this to compile when building for a device. I've followed everything outlined in the tutorial, but for some reason I get the following when it builds:

arm-apple-darwin10-llvm-gcc-4.2: /Users/user/Library/Developer/Xcode/DerivedData/TestApp-efxvwtzxfgzskwgjjmffnudkaovh/Build/Products/Debug-iphoneos/libeSpeak.a: No such file or directory

I have a feeling this may be related to linking the binary to the target. When I click the "+" next to "Link Binary With Libraries" and select libeSpeak.a from the workspace, it appears but is marked in red, which I believe indicates it can't be found or there's a problem.

It works great on the simulator, but when I try running it on an iPhone I get:
Apple Mach-O Linker Error
no such file or directory: '/Users/Jason/Library/Developer/Xcode/DerivedData/TestAppeSpeak-heggxnbxqfhnnrggxnypgjttdrcp/Build/Products/Debug-iphoneos/libeSpeak.a'