Tag Archives: PDAs

I keep getting ideas for apps I’d like to see on Apple’s App Store for iPod touch and iPhone. This one may sound a bit weird but I think it could be fun. An app where you can record your mood and optionally broadcast it to friends. It could become rather sophisticated, actually. And I think it can have interesting consequences.

The idea mostly comes from Philippe Lemay, a psychologist friend of mine and fellow PDA fan. Haven’t talked to him in a while but I was just thinking about something he did, a number of years ago (in the mid-1990s). As part of an academic project, Philippe helped develop a PDA-based research program whereby subjects would record different things about their state of mind at intervals during the day. Apart from the neatness of the data gathering technique, this whole concept stayed with me. As a non-psychologist, I personally get the strong impression that recording your moods frequently during the day can actually be a very useful thing to do in terms of mental health.

And I really like the PDA angle. Since I think of the App Store as transforming Apple’s touch devices into full-fledged PDAs, the connection is rather strong between Philippe’s work at that time and the current state of App Store development.

Since that project of Philippe’s, a number of things have been going on which might help refine the “happy meter” concept.

One is that “lifecasting” became rather big, especially among certain groups of Netizens (typically younger people, but also many members of geek culture). Though the lifecasting concept applies mostly to video streams, there are connections with many other trends in online culture. The connection with vidcasting specifically (and podcasting generally) is rather obvious. But there are other connections. For instance, with mo-, photo-, or microblogging. Or even with all the “mood” apps on Facebook.

Yet I think the “happy meter” could be useful on its own, as a way to track your own mood. “Turns out, my mood was improving pretty quickly on that day.” “Sounds like I didn’t let things affect me too much despite all sorts of things I was going through.”

As a mood-tracker, the “happy meter” should be extremely efficient. For ease of use, I’m thinking of sliders: one main slider for general mood and different sliders for specific moods and emotions. It would also be possible to extend the “entry form” on occasion, when the user wants to record more data about their mental state.

Of course, everything would be saved automatically and “sent to the cloud” on occasion. There could be a way to selectively broadcast some slider values. The app could conceivably send reminders to the user to update their mood at regular intervals. It could even double as a “break reminder.” Though there are limitations on OSX iPhone in terms of interapplication communication, it’d be even neater if the app were able to record other things happening on the touch device at the same time, such as the music playing or the apps recently used.
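To make the idea a bit more concrete, here’s a rough sketch of what a single “happy meter” entry might look like, with the opt-in broadcasting described above. All names and fields here are hypothetical, invented for illustration; this is not based on any real app.

```python
# Rough sketch of the "happy meter" data model; all names and fields
# are hypothetical, not taken from any real application.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class MoodEntry:
    sliders: dict            # e.g. {"general": 0.7, "energy": 0.4}
    notes: str = ""          # optional extended "entry form" text
    shared: set = field(default_factory=set)  # sliders the user chose to broadcast
    when: datetime = field(default_factory=datetime.now)

    def broadcast_payload(self):
        """Opt-in only: nothing is sent unless the user picked sliders to share."""
        return {k: v for k, v in self.sliders.items() if k in self.shared}

entry = MoodEntry({"general": 0.7, "energy": 0.4}, shared={"general"})
print(entry.broadcast_payload())  # → {'general': 0.7}
```

The key design point is that `shared` starts empty, so the default behavior is to broadcast nothing, matching the “no opt-out” principle discussed below.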

Now, very obviously, there are lots of privacy issues involved. But what social networking services have taught us is that users can have pretty sophisticated notions of privacy management, if they’re given the chance. For instance, adept Facebook users may seem to indiscriminately post just about everything about themselves but are often very clear about what they want to “let out,” in context. So, clearly, every type of broadcasting should be controlled by the user. No opt-out here.

I know this all sounds crazy. And it all might be a very bad idea. But the thing about letting my mind wander is that it helps me remain happy.

Though it hasn’t been announced on its website, Apple’s App Store for OSX iPhone applications is now online. In fact, enterprising iPhone users can allegedly upgrade their phone’s firmware to 2.0 in order to take advantage of this online software shop. As an iPod touch user, I have no such luxury. As of this moment, the firmware upgrade for iPod touch hasn’t been released. Since that upgrade will be free for iPhone and paid for iPod touch, the discrepancy isn’t surprising.

With those third-party applications, my ‘touch will become more of a PDA and the iPhone will become more of a smartphone.

Still, I was able to access the App Store using iTunes 7.7 (which I downloaded directly from Apple’s iTunes website since it wasn’t showing up in Apple Software Update). Adding the “Applications” item in the left-hand sidebar (available through the “General” tab in iTunes Preferences), I can see a list of applications already downloaded in iTunes (i.e., nothing at first launch). At the bottom of that page, there’s a link to get more applications which leads to the App Store. There, I can browse applications, get free apps, or buy some of the paid ones (using the payment information stored in my iTunes account). Prices are the same in USD and CAD (since they are pretty much on par, it all makes sense). Searching and browsing for apps follows all the same conventions as with music, movies, podcasts, or iPod games. Application pages appear in searches for application names (say, “Trism”).

I went through a number of apps and eventually downloaded 28 free ones. I also noted a number of apps I would like to try, including Trism, Units, Things, Outliner, OmniFocus, Steps (one of the rare apps available in French), iCalorie, and one of the multiple Sudoku apps. However, I can’t put apps on a wishlist and demos aren’t available directly through iTunes (I’m assuming they’re available from the iPhone or iPod touch).

I’m actually looking forward to trying out all of these apps. I don’t tend to be an early adopter but this is one case for quick adoption, especially with free apps. I guess a small part of this is that, since Apple has sorted through these apps, I’m assuming none of them contains any malware. Not that I ever fully trust an organization or individual, but my level of trust is higher with the App Store than with, say, the usual software download site (VersionTracker.com, Tucows.com, Download.com, Handango.com). And I trust these download sites much more than the developer sites I find through Web searches.

One thing I noticed very quickly is how small many of those apps are. After downloading 28 apps, my “Mobile Applications” folder is 31MB. Of course, many PDA apps were typically under 100KB, but given the storage capacity of OSX iPhone devices, I’m glad to notice that I can probably fit a lot of applications in less than 1GB, leaving more room for podcasts, music, and pictures. On the other hand, filesizes are apparently not listed in the “Applications” section in iTunes (they’re specified on the individual apps’ pages).

Overall, there are a number of obvious apps, many of which are PDA classics: to-do lists, phrasebooks, clocks and timers, calculators (including tip calculators), converters, trackers, weather forecasts, and solitaire or other casual games (like sudoku). No surprise with any of these and I’ll probably use many of them. These can be difficult to select because developers have had very similar ideas but the apps have slightly different features. With those apps, free wins over extensive feature lists, even against very cheap software.

Speaking of price, I notice that AppEngines is selling 43 different Public Domain books as separate apps for 1$ each. Now, there’s nothing wrong with making money off Public Domain material (after all, there wouldn’t be a Disney Company without Public Domain works). But it seems strange to me that someone would nickel-and-dime readers by charging for access to individual Public Domain titles. Sure, a standalone app is convenient. But a good electronic book reader should probably be more general than book-specific. Not really because of size constraints and such. But because books are easily conceived as part of a library (or bookshelf), instead of being scattered on a handheld device. The BookZ Text Reader seems more relevant, in this case, and it’s compatible with the Project Gutenberg files. Charging 2$ for that text reader seems perfectly legitimate. For its part, Fictionwise has released a free eReader app for use with its proprietary format. Though it won’t transform OSX iPhone devices into a Kindle killer, this eReader app does seem to at least transfer books through WiFi. Since these books are copyrighted ones, the app can be a nice example of a convenient content marketplace.

I’m a bit surprised that the educational software section of the App Store isn’t more elaborate. It does contain 45 separate apps, but several of those are language-specific versions of language tools or apps listed in other categories which happen to have some connection to learning. If it were me, I’d classify language tools in a separate category or subcategory and I’d make it more obvious how different educational apps are classified. On the other hand, I’m quite surprised that Molecules isn’t listed in this educational section.

The reason I care so much is that I see touch devices generally as an important part of the future of education. With iPod touches being bundled with Mac sales in the current “Back-to-School” special and with the obvious interest of different people in putting touch devices in the hands of learners and teachers, I would have expected a slew of educational apps. Not to mention that educational apps have long populated lists of software offerings, from the fondly remembered HyperCard days to PDAs and smartphones.

Among the interesting educational apps is Faber Acoustical‘s SignalScope. At 25$, it’s somewhat expensive for an OSX iPhone app, but it’s much less expensive than some equivalent apps on other platforms used to be. It’s also one of the more creative apps I’ve seen in the Store. Unfortunately, for apparently obvious reasons (the iPod touch has no built-in audio input), it’s only available on the iPhone.

Speaking of iPhone-only software… There’s already a way to get audio input on the iPod touch using a third-party adapter. I understand that Apple isn’t supporting it officially, but I wonder if the iPhone-only tag will prevent people from using such apps with it. Small point for most people, I guess. But it’d be really nice if I could use my ‘touch as a voice recorder. It would make for a great fieldwork tool.

One thing I wish were available on the App Store is an alternative mode for text entry. Though I’m already getting decent performance from the default virtual keyboard on my ‘touch, I still wish I had Dasher, MessagEase or even Graffiti.

Among the apps I’ve browsed, I see a number of things which could be described as “standalone versions of Web apps.” There’s already a good number of Web apps compatible or even customized for OSX iPhone devices. The standalone versions can be useful, in part because they can be used offline (great for WiFi-less situations, on the iPod touch). But these “standaloned Web apps” also don’t seem to really take full advantage of Apple’s Cocoa Touch. In the perception of value, I’d say that “standaloned Web apps” rate fairly low, especially since most Web apps are free to use (unless tied with an account on a Web service).

I was also surprised to see that a number of apps which are basically simple jokes are put up for sale on the App Store. I was amused to see an OSX iPhone version of Freeverse’s classic “Jared, The Butcher of Songs.” But I’m puzzled by the fact that Hottrix is selling its iBeer app for 3$. Sure, it’s just 3$. But I don’t see the app providing as much pleasure as a single taster of a craft beer. Not to mention that the beer itself looks (by colour, foam, and carbonation) like a bland pilsner and not like a flavourful beer.

Overall, I’d say the Store is well-made. Again, the same principles are used as for the iTunes Store generally. All application pages have screenshots, and some of these screenshots give an excellent idea of what the application does, while others are surprisingly difficult to understand. Browsing the Store, I noticed how important icons seemed to be in terms of catching my attention. Some application developers did a great job at the textual description of their applications, also catching my attention. But others use “marketingspeak” to brag about their product, which has the effect of making the app more difficult to grasp. Given the number of apps already listed and the simplicity of the classification, such details become quite important. Almost as important as price, though not nearly as much, in terms of making an app appealing.

It seems pretty clear to me (and to others, including some free market advocates), that price is an important issue. This was obvious to many of us for a while. But the opening of the App Store makes this issue very obvious.

For instance, regardless of his previous work, CNET journalist Don Reisinger is probably on to something when he argues, in essence, that the free apps may outweigh the benefits of the paid apps, on Apple’s App Store. Even though Apple allegedly coaxed developers into charging for their apps, the fact of the matter is that the App Store clearly shows that no-cost software can be a competitive advantage in the marketplace. The same advantage is obvious in many contexts, including in music. But, as a closed environment, the App Store could serve as an efficient case study in “competing with free.” One thing to keep in mind, as I keep saying, is that there are multiple types of no-cost offerings. In the software world (including on the App Store), there’s a large number of examples of successful applications which incurred no purchase on the users’ part. Yes, sometimes you need a bit of imagination to build a business model on top of no-cost software. But I think the commercial ventures enabled by these “alternative” business models are more diverse than people seem to assume.

One thing I noticed in terms of application pricing on the App Store is that either there are a number of sweet spots or pricing schemes come from force of habit. Sure, Apple only has a finite list of “tiers” for amounts which can be charged for a given app (with preset currency conversions). But I think that some tiers have been used more than others. For instance, 10$ seems fairly common as a threshold between truly inexpensive apps and a category similar to “shareware.” Some apps are actually as expensive as their desktop versions, though it seems that the most expensive app so far is under 100$.

One thing to note is that several developers of those early App Store products have been involved in Mac development for a while (the Omni Group being an obvious example), but there are also several organizations which seem to be entering Cocoa development for the first time. This could create a bigger halo effect in terms of Mac sales than the original iPod or the iPhone did. Profit made through OSX iPhone apps (whether through software cost, through services, or even through other monetization schemes) could lead these newcomers to develop software for OSX Leopard. At least, they have already made an investment in the development platform.

It’ll be interesting to observe what happens with software pricing in relation to the “apparent hand” of a constrained market.

But I’m less interested in this market than in the actual apps. When can I install the “iPhone 2.0” firmware on my iPod touch? Is it now?

Despite all these obstacles, I have been thinking about selling my services online.

One reason is that I really do enjoy teaching. As I keep saying, teaching is my hobby (when I get paid, it’s to learn how to interact with other learners and to set up learning contexts).

In fact, I enjoy almost everything in teaching (the major exception being grading/evaluating): from holding office hours and lecturing to facilitating discussions and answering questions through email. Teaching, for me, is deeply satisfying and I think that learning situations which involve the role of a teacher still make a lot of sense. I also like more informal learning situations and I even try to make my courses more similar to informal teaching. But I still find specific value in a “teaching and learning” system.

Some people seem to assume that teaching a course is the same thing as “selling expertise.” My perspective on learning revolves to a large extent around the difference between teaching and “selling expertise.” One part is that I find a difference between selling a product or process and getting paid in a broader transaction which does involve exchange about knowledge but which isn’t restricted to that exchange. Another part is that I don’t see teachers as specialists imparting their wisdom to eager masses. I see knowledge as being constructed in diverse situations, including formal and informal learning. Expertise is often an obstacle in the kind of teaching I’m interested in!

Funnily enough, I don’t tend to think of expertise as something that is easily measurable or transmissible. Those who study expertise have ways to assess something which is related to “being an expert,” especially in the case of observable skills (many of those are about “playing,” actually: chess, baseball, piano…). My personal perspective on expertise tends to be broader, more fluid. Similar to experience, but with more of a conscious approach to learning.

There also seems to be a major difference between “breadth of expertise” and “topics you can teach.” You don’t necessarily need to be very efficient at some task to help someone learn to do it. In fact, in some cases, being proficient in a domain is an obstacle to teaching in that domain, since expertise is so ingrained as to be very difficult to retrieve consciously.

This is close to “do what I say, not what I do.” I even think that it can be quite effective to instruct people in things you haven’t directly experienced yourself. Similar to consulting, actually. Some people readily disagree with this point and some people tease teachers about “doing vs. teaching.” But we teachers do have a number of ways to respond, some of them snarkier than others. And though I disagree with several parts of his attitude, I quite like this short monologue by Taylor Mali about What Teachers Make.

Another reason I might “sell my expertise” is that I genuinely enjoy sharing my expertise. I usually provide it for free, but I can possibly relate to the value argument. I don’t feel so tied to social systems based on market economy (socialist, capitalist, communist…) but I have to make do.

Another link to “selling expertise” is more disciplinary. As an ethnographer, I enjoy being a “cultural translator” of sorts. And, in some cases, my expertise in some domains is more of a translation from specialized speech into laypeople’s terms. I’m actually not very efficient at translating utterances from one language to another. But my habit of navigating between different “worlds” makes it possible for me to bridge gaps, cross boundaries, serve as mediator, explain something fairly “esoteric” to an outsider. Close to popularization.

So, I’ve been thinking about what can be paid in such contexts which give prominence to expertise. Tutoring, homework help, consulting, coaching, advice, recommendation, writing, communicating, producing content…

And, finally, I’ve been thinking about my domains of expertise. As a “Jack of All Trades,” I can list a lot of those. My level of expertise varies greatly between them and I’m clearly a “Master of None.” In fact, some of them are merely from personal experience or even anecdotal evidence. Some are skills I’ve been told I have. But I’d still feel comfortable helping others with all of them.

Attended Dan Dennett’s “From Animal to Person : How Culture Makes Up our Minds” talk, yesterday. An event hosted by UQAM’s Cognitive Science Institute. Should blog about this pretty soon. It was entertaining and some parts were fairly stimulating. But what surprised me the most had nothing to do with the talk: I was able to take notes efficiently using the onscreen keyboard on my iPod touch (my ‘touch).

As I blogged yesterday, in French, it took me a while to realize that switching keyboard language on the ‘touch also changed the dictionary used for text prediction. Very sensible, but I hadn’t realized it. Writing in English with French dictionary predictions was rather painful. I basically had to bypass the dictionary predictions on most words. Even “to” was transformed into “go” by the predictive keyboard, and I didn’t necessarily notice all the substitutions. Really, it was a frustrating experience.

It may seem weird that it would take me a while to realize that I could get an English predictive dictionary in a French interface. One reason for the delay is that I expect some degree of awkwardness in some software features, even with some Apple products. Another reason is that I wasn’t using my ‘touch for much text entry, as I’m pretty much waiting for OSX iPhone 2.0 which should bring me alternative text entry methods such as Graffiti, MessagEase and, one can dream, Dasher. If these sound like excuses for my inattention and absent-mindedness, so be it. 😀

At any rate, I did eventually find out that I could switch back and forth between French and English dictionaries for predictive text entry on my ‘touch’s onscreen keyboard. And I’ve been entering a bit of text through this method, especially answers to a few emails.

But, last night, I thought I’d give my ‘touch a try as a note-taking device. I’ve been using PDAs for a number of years and note-taking has been a major component of my PDA usage pattern. In fact, my taking notes on a PDA has been so conspicuous that some people seem to associate me quite directly with this. It may even have helped garner a gadget-freak reputation, even though my attitude toward gadgets tends to be quite distinct from the gadget-freak pattern.

For perhaps obvious reasons, I’ve typically been able to train myself to efficiently use handheld text entry methods. On my NewtonOS MessagePad 130, I initially “got pretty good” at using the default handwriting recognition. This surprised a lot of people because human beings usually have a very hard time deciphering my handwriting. Still on the Newton, switching to Graffiti, I became rather proficient at entering text using this shorthand method. On PalmOS devices (a Handspring Visor and a series of Sony Clié devices), I usually doubled up on Graffiti and MessagEase. In all of these cases, I was typically able to take rather extensive notes during different types of oral presentations or simply when I thought about something. Though I mostly used paper to take notes during classes I attended through most of my academic coursework, PDA text entry was usually efficient enough that I could write down some key things in realtime. In fact, I’ve used PDAs rather extensively to take notes during ethnographic field research.

So, note taking was one of the intended uses for my iPod touch. But, again, I thought I would have to wait for text entry alternatives to the default keyboard before I could do it efficiently. So that’s why I was so surprised, yesterday, when I found out that I was able to efficiently take notes during Dennett’s talk using only the default OSX iPhone onscreen keyboard.

The key, here, is pretty much what someone at Apple was describing during some keynote session (might have been the “iPhone Roadmap” event): you need to trust the predictions. Yes, it sounds pretty “touchy-feely” (we’re talking about “touch devices,” after all 😉 ). But, well, it does work better than you would expect.

The difference is even more striking for me because I really was “fighting” the predictions. I couldn’t trust them because most of them were in the wrong language. But, last night, I noticed how surprisingly accurate the predictions could be, even with a large number of characters being mistyped. Part of it has to do with the proximity component of the algorithm. If I type “xartion,” the algorithm guesses that I’m trying to type “cartoon” because ‘x’ is close to ‘c’ and ‘i’ is close to ‘o’ (not an example from last night but one I just tried). The more confident you are that the onscreen keyboard will accurately predict what you’re trying to type, the more comfortably you can enter text. The more comfortable you are at entering text, the more efficient you become at typing, which begins a feedback loop.
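The proximity idea can be sketched in a few lines of code. This is only an illustration of the general principle, not Apple’s actual algorithm: keys adjacent on a QWERTY layout count as near-matches, and the candidate word with the best overall score wins. The adjacency map and word list below are invented for the example.

```python
# Hypothetical sketch of proximity-based word prediction (not Apple's
# actual algorithm). Keys adjacent on a QWERTY layout count as near-matches.
ADJACENT = {
    'c': 'xvdf', 'x': 'zcsd', 'o': 'ipkl', 'i': 'uojk',
    'a': 'qwsz', 'r': 'etdf', 't': 'ryfg', 'n': 'bmhj',
    # ...remaining keys omitted for brevity
}

def closeness(typed, word):
    """Score positions where the typed character matches the target
    exactly (2 points) or sits on an adjacent key (1 point)."""
    if len(typed) != len(word):
        return -1  # for simplicity, only compare same-length strings
    score = 0
    for t, w in zip(typed, word):
        if t == w:
            score += 2                      # exact hit
        elif w in ADJACENT.get(t, ''):
            score += 1                      # near miss on a neighbouring key
    return score

def predict(typed, dictionary):
    """Return the dictionary word closest to the typed string."""
    return max(dictionary, key=lambda w: closeness(typed, w))

words = ['cartoon', 'caption', 'clarion', 'cartons']
print(predict('xartion', words))  # → cartoon
```

Even with two mistyped characters (‘x’ for ‘c’, ‘i’ for ‘o’), “cartoon” scores higher than the other candidates, which is roughly the behavior described above.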

Because I didn’t care that much about the specific content of Dennett’s talk, it was an excellent occasion to practise entering text on my ‘touch. The stakes of “capturing” text were fairly low. It almost became a game. When you add extra characters to a string which is bringing up the appropriate suggestion and then delete those extra characters, the suggestion is lost. In other words, using the example above, if I type “xartion,” I get “cartoon” as a suggestion and simply need to type a space or any non-alphabetic character to accept it. But if I go on typing “xartionu” and go back to delete the ‘u,’ the “cartoon” suggestion disappears. So I was playing a kind of game with the ‘touch, typing relatively long strings while trying to avoid extra characters. I lost a few accurate suggestions and had to retype them, but the more I trusted the predictive algorithm, the less frequently I had to retype.

During a 90-minute talk, I entered about 500 words. While it may not sound like much, I would say that it captured the gist of what I was trying to write down. I don’t think I would have written down much more if I had been writing on paper. Some of these words were the same as the ones Dennett uttered, but the bulk of those notes were my own thoughts on what Dennett was saying. So there were different cognitive processes going on at the same time, which greatly slows down each specific process. I would still say that I was able to follow the talk rather closely and that my notes are pretty much appropriate for the task.

Now, I still have some issues with entering text using the ‘touch’s onscreen keyboard.

While it makes sense to have all suggestions accepted by default, there could be an easier way to refuse a suggestion than tapping the box where it appears.

It might also be quite neat (though probably inefficient) if the original characters typed by the user were somehow kept in memory. That way, one could correct inaccurate predictions using the original string.

The keyboard is both very small for fingers and quite big for the screen.

Switching between alphabetic characters and numbers is somewhat inefficient.

While predictions have some of the same effect, the lack of a “spell as you type” feature makes it harder to be sure I’m avoiding typos.

Dictionary-based predictions are still inefficient in bilingual writing.

The lack of copy-paste changes a lot of things about text entry.

There’s basically no “command” or “macro” available during text entry.

As a fan of outliners, I’m missing the possibility to structure my notes directly as I enter them.

My own thoughts on the whole thing.
I appreciate the fact that Phil Schiller began the “enterprise” section of the event with comments about a university. Though universities need not be run like profit-hungry corporations, linking Apple’s long-standing educational focus with its newly invigorated enterprise focus makes sense. And I had a brief drift-off moment as I was thinking about Touch products in educational contexts.

I’m surprised at how enthusiastic I get about the enterprise features. Suddenly, I can see Microsoft’s Exchange make sense.

I get the clear impression that even more things will come into place at the end of June than has been said by Apple. Possibly new Touch models or lines. Probably the famous 3G iPhone. Apple-released apps. Renewed emphasis on server technology (XServe, Mac OS X Server, XSan…). New home WiFi products (AirPort, Time Capsule, Apple TV…). New partnerships. Cool VC-funded startups. New features on the less aptly named “iTunes” store.

Though it was obvious already, the accelerometer is an important feature. It seems especially well-adapted to games, and casual gamers like myself are likely to enjoy the games this feature makes possible. It can also lead to very interesting applications. In fact, the “Etch A Sketch” demo was rather convincing as a display of some core Touch features. These are exactly the features which help sell products.
Actually, I enjoyed the “wow factor” of the event’s demos. I’m convinced that it will energize developers and administrators, whether or not they plan on using Touch products. Some components of Apple’s Touch strategy are exciting enough that the more problematic aspects of this strategy may matter a bit less. Those of us dreaming about Android, OpenMoko, or even a revived NewtonOS can still find things to get inspired by in Apple’s roadmap.

What’s to come, apart from what was announced? No idea. But I do daydream about all of this.
I’m especially interested in the idea of Apple Touch as “mainstream, WiFi, mobile platform.” There’s a lot of potential for Apple-designed, WiFi-enabled handhelds. Whether or not they include a cellphone.
At this point, Apple only makes five models of Touch products: three iPod touches and two iPhones. Flash memory is the main differentiating factor within a line. It makes it relatively easy to decide which device to get but some product diversity could be interesting. While some people expect/hope that Apple will release radically new form factors for Touch devices (e.g., a tablet subnotebook), it’s quite likely that other features will help distinguish Apple’s Touch hardware.
Among features I’d like to get through software, add-ons, or included in a Touch product? Number of things, some alluded to in the “categories” for this post. Some of these I had already posted.

I keep dreaming of different devices which would enhance my personal and professional experience. Not that I’m really a gadget geek. But technology has, to a large extent, been part of improvements in my life.

Though I would hesitate to call my relation to computer technology “addictive,” I certainly tend to depend on it quite a bit.

Some context.

Ok, ok! A lot of context.

Let’s go back. Waaaaay back. To the summer of 1993. I was 21, then, and had already been a Mac-head for more than six years. Without being a complete fanboy of Apple Computer, I guess I was easily impressed by many of its products. During a trip to Cape Cod that summer, I got to read an issue of USA Today. In that issue, I read a review of a new class of computers: the Personal Digital Assistant (PDA). I still remember how I felt. It might not have been my first “tech-induced epiphany” but it was one of the most intense. I not only started drifting off (which was easy enough to do, as I was in the back seat of my mother’s car), I actually started perceiving what my life could be with one of those devices.

Of course, I couldn’t afford any of them. Even when it became possible for me to purchase such a device, it remained financially irrational for me to spend that money on a single device, no matter how life-changing it might have been.

Shortly after discovering the existence of PDAs, and still during the summer of 1993, I discovered the existence of the Internet. Actually, it’s all a bit blurry at this point and it’s possible that I may have heard of the Internet before reading that epiphany-inducing USA Today article. Point is, though, that the Internet, not the PDA, changed my life at that point.

Whatever my computing experience had been until that point is hard to remember because the ‘Net changed everything. I know about specific computers I had been using until then (from a VIC-20 to an SE/30). I do remember long evenings spent typing from my handwritten notes taken during lectures. I still get a weird feeling thinking about a few sleepless nights spent playing simple strategy and card games on my father’s old Mac Plus. But I just can’t remember how I could live without the ‘Net. I wasn’t thinking the same way.

Not too long after getting my first email account (on Université de Montréal’s Mistral server, running IRIX), the ‘Net helped me land my first real job: research assistant at a speech synthesis lab in Lausanne, Switzerland.

In late 1993 or early 1994, I had sent an email to a prominent ethnomusicologist about applying to the graduate program where she taught and mentioned something about computer-based acoustic analysis, having taken a few courses in acoustics. She told me about Signalyze, a demo version of which was available through a Gopher server for that Swiss lab. While looking at that Gopher server, I became interested in the lab’s research projects and contacted Eric Keller, head of that lab and the main developer of Signalyze. I was already planning on spending that summer in Switzerland, working at my father’s cousin’s crêperie, so I thought spending some time in Lausanne interacting with members of Keller’s lab was a good idea. I was just finishing my bachelor’s degree in anthropology at Université de Montréal (with a focus on linguistic anthropology and ethnomusicology). So I was interested in doing something related to sound analysis in musical or speech contexts. Keller asked for my résumé and offered me paid work at his lab for the summer. I ended up spending both that summer and the whole 1994-1995 academic year working at this lab, being paid more than some of my mentors in Montreal.

Technologically speaking, my life in Switzerland was rather intense. I was spending 15 hours a day in front of a computer, doing acoustic analysis of speech sounds. This computer was a Mac IIvx which had once belonged to UQÀM. A very funny coincidence is that the Mac IIvx I was using had become the source of part of the funding for a fellowship at UQÀM. After I met the incredible woman who became my wife, she received that fellowship.

As this computer had a fast connection to the Internet, I became used to constantly having online access. I was mostly using it to send and receive emails, including messages to and from mailing-lists, but I also got to dabble in HTML a bit and did spend some time on the still burgeoning World Wide Web. I also used a few instant messaging systems but I was still focused on email. In fact, I started using email messages to schedule quick coffee breaks with a friend of mine who was working one floor below me.

This 15-month stay in Switzerland is also when I first got a chance to use a laptop. A friend of my father had lent me his laptop so I could work on a translation contract during weekends. Though this laptop (a PowerBook 170, IIRC) wasn’t very powerful, it did give me a vague idea of what mobile computing might be like.

Coming back to Quebec after my Swiss experience, I began my master’s degree in linguistic anthropology. After looking at different options, I bought a PowerMac 7200 through a friend of mine. That 7200 (and the PowerMac 7300 which followed it) greatly enhanced my stationary computing experience. I probably wasn’t thinking about mobile and handheld devices that much, at that time, but I was still interested in mobile computing.

Things started to change in 1997. At that time, I received a Newton MessagePad 130 through the AECP (Apple Educational Consultant Program). This was a great device. Too big for most pockets. But very nice in almost every other respect. While my handwriting is hard to read by most humans, the Newton’s handwriting recognition did quite a decent job at deciphering it. I also became quite adept in Graffiti, Palm Inc.’s handwriting recognition software based on a constructed script derived from the uppercase Latin alphabet. I was able to take notes during lectures and conferences. For a while, I carried my Newton everywhere. But it was so bulky that I eventually gave up. I just stopped carrying my Newton around. At one point, I even lent it to a friend who tried it out for a while. But I wasn’t a PDA user anymore. I still needed the perfect PDA. But the Newton wasn’t it.

In early 1998, I went to Mali for the first time. Before I went, I bought a portable cassette recorder to record interviews and some musical performances.

When I moved to Bloomington, IN in September 1998 to do my Ph.D. coursework, I literally had no computer at home. As I had done for a long time during my bachelor’s degree, I spent long hours in computer labs on campus. The computers themselves were quite good (and updated fairly regularly) and IU had one of the best Internet connections available.

In mid-to-late 2001, when rumours of an Apple-branded portable device started surfacing, I was getting ready for my main ethnographic and ethnomusicological fieldwork trip to Mali.

I kept thinking about different tools to use in the field. For some reason, portable equipment for computing and recording was strangely important for me. I still had my Newton MP130. And I was planning on using it in the field, unless something radically better came along. So I was hoping for the mysterious handheld device Apple was launching to be something of a Newton replacement. Sure, I knew that Steve Jobs had always hated the Newton, apparently for personal reasons. But I secretly hoped that he would come to his senses and allow Apple to revolutionise the handheld market it had spearheaded back in 1993. When I learnt that the device might be related to audio, I thought that it might be both a PDA and an audio device. More importantly for me, I thought that it would have some recording capabilities, making it the ideal field research tool for ethnographers and ethnomusicologists. I was waiting impatiently for the announcement and, like some others, was disappointed by the initial release, especially when I learnt that the iPod didn’t have any recording capabilities. Soon after this, I bought the main devices which would accompany me in my main field trip to Mali: an Apple iBook (Dual USB) laptop with Combo Drive, a HandSpring Visor Deluxe PDA, a Sony MZ-R37 MiniDisc recorder, and a Sony ECM-MS907 microphone. I used all of these extensively throughout my field trip and, though Internet access was spotty, being able to regularly send and receive messages from my iBook was very beneficial for my research practises. I left the MiniDisc recorder and microphone with Yoro Sidibe, the main person with whom I was working in the field, and had to buy other equipment on my way back.

In mid-2004, I bought a used iPod through eBay. I was still living in Montreal but was moving to South Bend, IN, where I was going to spend a year on a teaching fellowship. To make things easier and cheaper, I had the eBay seller send the iPod to my future office in South Bend. When I arrived in South Bend a month or so later, I finally took possession of my first ever iPod. It was an iPod 2G 20GB with FireWire. It came in a rather big box which also included: the original AC adapter, two extra adapters (including a car one), two pouches, the original headphones, and the original remote control.

My iBook (Dual USB) only had a 10GB hard drive so most of my MP3s were on CD-Rs that I had burnt for use with a CD-MP3 player (at the time, a Rio Volt that I had received as a gift a few years prior). I had also brought along my CD collection, in CD Projects (and similar) carrying cases. Hundreds of CDs, a rather heavy and voluminous burden.

I eventually got a good part of my CD collection on the iPod. And I rediscovered music.

Funny to say, for an ethnomusicologist. But pretty realistic. I had lost touch with this type of private music listening. As convenient as it was to use, my Rio Volt didn’t really enable me to connect with music. It merely allowed me to carry some music with me.

Fairly early on, during my first iPod’s career as my main music device, the remote control started acting funny. Sometimes, it would reboot the iPod for no reason. Using the headphones directly (without the remote control), I didn’t have that problem. Though I know very little about electronics, it seemed to me that something was wrong in the connection between the remote control and the jack. I asked the prior owner, who said he had never had a problem with the remote control. I resorted to not using the remote control and went on my merry way to iPod happiness for almost two years. Apple was releasing new iPod models and I would have liked to own them, but my finances wouldn’t allow me to purchase one and my iPod 2G was still giving me a lot of pleasure.

When Apple introduced podcast support in mid-2005, I became something of a podcast addict. I subscribed to tons of podcasts and was enjoying the iTunes/iPod integration to its fullest potential. A portion of my music MP3 collection was still taking the largest amount of disk space on my iPod but I was spending more time listening to podcasts than listening to MP3s from my personal collection.

In early 2006, I finally converted my whole CD collection to MP3s thanks to the large hard drive (160GB) in the refurbished emachines H3070 that I had to buy to replace my then-defunct iBook. The complete collection took over 90GB and it took me quite a while to sort it all out. In fact, large chunks of this MP3 collection remain unexplored to this day. My main “active” collection represents about 15GB, which did fit on my iPod’s 20GB hard drive with enough room for lots of podcasts. So, despite being quite outdated by that time, my iPod 2G was still giving me a lot of pleasure.

Then, in mid-2006, I started having problems with the headphone jack on this iPod. Come to think of it, I probably had problems with that headphone jack before that time, but it was never enough of a problem to keep me from enjoying my iPod. By mid-2006, however, I was frequently losing sound in one headphone because the jack was moving around. My music- and podcast-listening life wasn’t as happy as it had been. And I started looking elsewhere for audio devices.