After spending the last year game programming on the iPhone, I’ve finally returned to Android. My six prior tutorials were outdated (Android v1.0) so I took the time to update them to v1.5 (Cupcake). The most popular of those tutorials was the Streaming MediaPlayer tutorial so that’s the primary focus of this post.

At the time I initially wrote the streaming tutorial, Android’s media streaming didn’t work well so I wrote my own. As of v1.5, however, Android’s MediaPlayer streams very well. That said, it’s still useful to know how to retrieve a media file from a server and store it on the device. This is useful for replaying the file at a later date or for caching files for playback ‘off the grid’.

You may want to look at my old Android v1.0 streaming tutorial for additional details. I don’t want to rewrite that post completely so I’ll mostly focus here on the changes required for Android v1.5. That said, there is very little difference between my old tutorial and this new one.

The most important change is that MediaPlayer.setDataSource() now takes a FileDescriptor instead of a String path to the media File. The reason for using FileDescriptors appears to be security/permissions. In any case, passing a String path to our media File resulted in errors such as “PVMFErrNotSupported” and “Prepare failed.: status=0x1”. So until I learn otherwise, I recommend using a FileDescriptor with the MediaPlayer.
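For reference, here is a minimal sketch of the FileDescriptor approach (the class and variable names are illustrative, not the tutorial’s actual code):

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;

import android.media.MediaPlayer;

public class PlayerHelper {
    // Sketch: pass a FileDescriptor to MediaPlayer instead of a String path.
    // Assumes 'downloadingMediaFile' is the File the stream is being written to.
    public static MediaPlayer createFromFile(File downloadingMediaFile) throws IOException {
        FileInputStream fis = new FileInputStream(downloadingMediaFile);
        try {
            MediaPlayer mediaPlayer = new MediaPlayer();
            // setDataSource(FileDescriptor) avoids the "PVMFErrNotSupported" /
            // "Prepare failed.: status=0x1" errors seen with String paths.
            mediaPlayer.setDataSource(fis.getFD());
            mediaPlayer.prepare();
            return mediaPlayer;
        } finally {
            fis.close();
        }
    }
}
```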

The only other change is that I can no longer find any File move functionality in Android, so I wrote my own. It does exactly what it says: it moves the data in one File to a new File location. This is used while streaming the media file so we can double buffer it. The double buffering allows us to simultaneously download to one File while playing from another File. We sync the downloaded data between the Files as more data is downloaded:
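The move itself is nothing exotic; a minimal sketch of such a helper (my own illustrative version, not the tutorial’s exact code; the 8192-byte buffer is just a reasonable copy-chunk size):

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;

public class FileUtils {
    // Moves the contents of 'from' into 'to', overwriting 'to',
    // then deletes 'from' to complete the "move".
    public static void moveFile(File from, File to) throws IOException {
        FileInputStream in = new FileInputStream(from);
        FileOutputStream out = new FileOutputStream(to);
        try {
            byte[] buf = new byte[8192];
            int read;
            while ((read = in.read(buf)) > 0) {
                out.write(buf, 0, read);
            }
        } finally {
            in.close();
            out.close();
        }
        from.delete();
    }
}
```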

I implemented your new revision and it works fine for the first file I play albeit with a stutter at data swaps. I can’t seem to get it to load any subsequent files though. I’ve gone so far as trying to load a new instance of it, but it merely crashes the program. Any ideas or hints you can give me?
Thanks

The tutorial app was only designed to load & play a single file. You would need to add MediaPlayer shutdown and cleanup at the end of the file playing for it to download and play another file (exercise for the reader).

With a restart though, I was able to load and reload files without problems. What error message are you getting?

I’m trying to use this to play a Shoutcast stream, but there are always gaps between the files.
Any ideas how to improve this?
I tried different buffer sizes, but there is no solution.
I’m using the code from the 1.5 tutorial.

I’m not getting any errors… none that I can detect anyway. I tried resetting the MediaPlayer before calling startstreaming(…) again, but that didn’t work either; it didn’t do anything. If I had an idea as to where it was locking up, it would be much easier to fix.

At this point with the new sdks, is this still a beneficial route to go for streaming? Doesn’t the standard mediaplayer buffer the data? The only real reason I’m going through all this is because I want the entire mp3 file loaded on to the device so if someone rewinds/fastforwards the file doesn’t reload from the server and I’m not sure if mediaplayer does this automatically.

I’m unsure how Android handles the buffering upon fast forward/rewind. My tutorial was written before Android’s streaming was working well. I’m fairly happy with the built-in streaming of v1.5 so I don’t use my own code anymore.

I am trying to do some live video delivery to the android platform. I have done this for the iPhone and given Apple’s HTTP adaptive streaming, this is made very easy. But not so on Android. What path would you recommend?

I am writing along the lines of what you have done (thanks for the examples), but I don’t want glitches when switching between files. This would be simpler if there were a way for Android to implement something like M3U8 or another video playlist playback mechanism.

I didn’t include this in the tutorial as I wasn’t sure it would work at the time. However, as the MediaPlayer is currently written, you don’t need to double buffer the streamed media as in the current tutorial. You can actually just write directly to the file as it’s playing.

You would need to add logic to pause playback if the playback exceeds the amount of downloaded file though.
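That pause check could be as simple as comparing positions. A sketch of my own (the names and the safety margin are illustrative, assuming you can estimate how many milliseconds of audio have been downloaded so far):

```java
public class PlaybackGate {
    // Pause when playback gets within 'safetyMarginMs' of the end
    // of the audio downloaded so far.
    public static boolean shouldPause(int playbackPositionMs,
                                      int downloadedDurationMs,
                                      int safetyMarginMs) {
        return playbackPositionMs >= downloadedDurationMs - safetyMarginMs;
    }
}
```

In the playback loop you would feed it MediaPlayer.getCurrentPosition(), call pause() when it returns true, and resume once more data has arrived.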

Could anybody please explain how to pre-allocate space on the SD card so that no other application can access it? If we are maintaining a queue for downloading, how do we pre-allocate (or lock) a particular amount of space in external storage, so that whatever is in the queue can be downloaded safely? Is there any other workaround than filling the space with dummy data?

That list doesn’t say anything about support for streaming formats. File-based media typically have a header with information about the file. Streams provide that information in other ways. That would seem to preclude saving the stream to a file for replay as a file-based format.

In short unless Android supports streamed audio, I doubt this tutorial would be much help.

hhmmm ok, sounds plausible 😀
Not very happy to hear that even with a new and modern platform we still aren’t able to actually stream AAC(+) content… Seems like they looked a lot at J2ME and decided that a six-year-old platform is still a good way to go.
I did try your project, and it works, but I did notice that on my device (Android Dev Phone 2, which is an HTC Magic running 1.6) the “holes” in between the played pieces have a tendency to grow over time… So while the first couple of switches between chunks sound decent, after 6-7 switches it starts to have holes of several seconds… Is this normal?

[quote]Are you saying that once you’ve downloaded the audio files, the final file residing on the device has gaps in the audio? [/quote]
Nope, the files themselves were OK, but there is a gap, which keeps getting bigger, between the stopping of one file and the starting of the next one… So in between “chunks” (files in this case)…
I heard though that Ogg would be faster, thus completely removing those gaps, but my source is MP3. So is there some way, to your knowledge, to encode MP3 to Ogg on Android itself?

Hey.. thanks for the excellent tutorials and the time you have put in to help everyone developing android apps.

I am working on something which is close to the streaming media player but a little different. I want to be listening on a port on my phone for some raw audio data, which will be sent from a location on my network using netcat. I pretty much want to perform a netcat listen function on my Android and pipe the data out to the audio hardware. You must have worked with or heard about the AudioRecord and AudioTrack classes, which give access to raw media.

I was wondering if you could refine the following algorithm for me and point me to the implementation, as I know a bit of java but not too confident with using Sockets, Pipes and Android stuff.

The procedure i am planning to use is:

1) Use a new ServerSocket(port) to listen on particular port.
2) The other side connects to this port using netcat :
3) Make a new Pipe which takes the outputstream from the socket, and feeds it into the input of the audio hardware.
4) Use methods from AudioTrack to play the raw audio in real time
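The steps above might be sketched roughly like this (untested; it assumes 8 kHz mono 16-bit PCM pushed with something like `nc <phone-ip> 5000 < audio.raw`, and the class name and constants are illustrative):

```java
import java.io.InputStream;
import java.net.ServerSocket;
import java.net.Socket;

import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

public class RawAudioServer {
    public static void serve(int port) throws Exception {
        int sampleRate = 8000;
        int minBuf = AudioTrack.getMinBufferSize(sampleRate,
                AudioFormat.CHANNEL_CONFIGURATION_MONO,
                AudioFormat.ENCODING_PCM_16BIT);
        AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                AudioFormat.CHANNEL_CONFIGURATION_MONO,
                AudioFormat.ENCODING_PCM_16BIT, minBuf, AudioTrack.MODE_STREAM);
        track.play();

        ServerSocket server = new ServerSocket(port); // step 1: listen on the port
        Socket client = server.accept();              // step 2: netcat connects
        InputStream in = client.getInputStream();     // step 3: the "pipe"
        byte[] buf = new byte[minBuf];
        int read;
        while ((read = in.read(buf)) > 0) {
            track.write(buf, 0, read);                // step 4: play raw audio
        }
        track.stop();
        client.close();
        server.close();
    }
}
```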

Any help in this area would be very much appreciated, as I have been struggling to find anything in this area.

That’s quite the challenge you’ve set for yourself. Seems like mostly a networking challenge at this point to get the audio onto the phone. (1) should be easy as I’m sure there’s documentation on the web about it.

For (2), I don’t know anything about netcat. Why are you using it? I assume there’s a reason you’re connecting from the server to the phone rather than vice versa? I would guess you want to push the data to the phone but don’t want the phone wasting battery power by constantly polling the server for data.
How is your server going to discover the IP address of your phone though? My worry is that the IP address changes as the phone roams.

Hey, Thanks for the prompt reply.
Yup, it’s a challenging one, but quite fun to do! I’ll elaborate on my project a bit more to give you a picture of what I am doing.
My first step, which is complete now, was to look at MightyOhm’s WiFi Radio Project and transform it into a full-duplex audio comms system between two routers (with USB sound cards) by installing OpenWRT on them. It’s just a Linux flavour that unleashes your router. I set up one of the routers as an access point and the other as a client, and by using netcat as I said above and a recorder/player program on the routers I was able to achieve this setup.
So all that is good. Now, I was looking at replacing the client-side router with another WiFi client, i.e. the Android phone.
The comms have to be real time; that’s why I am not going for polling. The IP address changing on roaming is not an issue: since it’s my own LAN, I can use static IP addressing.
So yeah, this will help me establish a 2 way audio path.

For step (4), I already looked at the code from http://hashspeaks.wordpress.com/2009/06/18/audiorecord-part-4/ and made it into an Android project. However, when I heard what I recorded, it seemed like something was overdriving the microphone, and I seem to have no control over that via the API. Whatever I record closest to the phone turns into clipped garbage, but sounds from far away are clear.

It would be a huge help if you could test that code from hashir’s blog on an Android G1 phone running 1.6 (which I tested it on), so that I can confirm it’s not something I am doing wrong or something to do with my phone.

Hey… just found the solution to my problem!
It was just endianness! I should have been careful while playing the file on an Intel PC and specified big-endian format for the audio. My problem was that it was recording correctly, but listening in the wrong format introduced noise into the recording (it still records, which is why I never thought it could be the parameters)! I heard the right one in Audacity, as it let me choose the endianness.
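For anyone hitting the same issue: 16-bit PCM captured on a little-endian device has to be byte-swapped before a tool that assumes big-endian samples will interpret it correctly. A minimal sketch of the swap (my own, not from the project above):

```java
public class PcmEndian {
    // Swap the byte order of 16-bit PCM samples in place.
    // Each adjacent pair of bytes forms one sample.
    public static void swap16(byte[] buf) {
        for (int i = 0; i + 1 < buf.length; i += 2) {
            byte tmp = buf[i];
            buf[i] = buf[i + 1];
            buf[i + 1] = tmp;
        }
    }
}
```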

hi Biosopher, I tried it with the MP3 format and found it is a supported media format. In fact I tried to run it with your codebase URL without changing it and got the previous errors that I posted to you. Now I tried a WMA file uploaded to my own site and got the following error in LogCat:

hello Biosopher,
is there any need to change your code for a proper media file, or do I need to change the URL of the media file? I noticed that you are changing the extension to *.dat for the media file, if I’m correct about that.

you were right. This “PVMFErrNotSupported” error is due to the FileDescriptor class. I used a file descriptor instead of an absolute file path for the MediaPlayer setDataSource() method, and it plays for a couple of seconds and then shows these errors. Here it says it doesn’t get the desired file.

This tutorial only supports single-file downloads, so you must have updated it to support multiple files. Look at how you’re handling the handoff between files. It seems like you are still writing to a file even though no data is being received.

Hi Biosopher,
We tried playing another audio file as well as a video file with the code you have provided. It played for 5 seconds and then stopped playing, but continued to buffer and download the complete file. We guess the problem would be with the number of bytes you assign for the two buffers –
1) the buffer into which the data is copied from the URL – it is 16384
2) the buffer used to move the file – it is 8192
Do these numbers have any significance for the file size or the data rate at which the file is downloaded?

Do you have an idea whether the Android 1.5/2.0 emulator supports real-time video streaming?

Thanks for the reply.
We are doing a project where we need to develop the RTP/RTSP stack on Android for real time streaming.

We are basically receiving the rtp packets for audio/video files over the socket. But we are not able to play them back real-time.

You mentioned Android’s MediaPlayer supports streaming of video/audio, does this mean real-time support?

We are referring to the code you have provided for (Cupcake(v1.5)).

To understand the code, we tried playing other audio files, changing the parameters 16384 and 8192 according to the file size. We couldn’t play the entire file.
So it would be great if you could just explain the significance of those byte array lengths.

hello Biosopher
I’m also using the same code for 2.1 or later, but I’m required to play MP4 instead of only MP3… so I have created a SurfaceView and set the display holder for it… Also, I have to stream the content from the SD card rather than from any streaming server…

I haven’t had the best of luck displaying mp4 files on Android. 3GP files tend to work best. If you’re playing off the SDCard, then you’re not streaming the video. You’re simply reading a video file off the SDCard. In that case, I recommend 3GP.

In my setup I have a live stream source connected to an Axis encoder (250S), which receives it and encodes it to MPEG-2. I then receive that into VLC (which re-encodes it as H.264/MPEG-4) and re-stream it out as an RTSP stream, using an .sdp definition.

I’m attempting to display this LIVE video source using the Media Video demo that comes with the Android SDK; I’ve also attempted to use an HTTP link in the Android emulator browser.

But to no avail; I never see a video playing.

Can anyone confirm whether the Android emulator is able to play this type of video source, and whether a real Android device, say an HTC Hero, could play this using RTSP?

Unfortunately I’m not experienced enough with live streaming to conclude yea/nay on whether Android can stream live. Given the numerous questions on this tutorial though, it seems Android doesn’t readily support live streaming out of the box.

CatDaaaady posted this comment on another one of my postings. It’s useful to you and everyone else:

“With the android app I have been running, MyNPR, I used Biosopher’s code to get me going playing a live stream of local npr stations.
My code is here (http://code.google.com/p/mynpr/source/browse/#svn/trunk/mynpr/src/com/webeclubbin/mynpr)
Yesterday, NPR released the code for the official NPR app for Android.
Why is this so great? Because they have real live streaming for stations. No saving files locally and playing them in a queue!
But there is a catch… it is currently not working in the app. BUT looking at their code we can see the direction they are going.

Basically they are doing what I thought I would have to do to make it all work without saving files: take the live stream, repackage the packets into something the MediaPlayer can use, then send it to the MediaPlayer. I was thinking of converting the live stream to some sort of RTSP stream. But NPR is doing something different. I am not sure what though. Check the links below for the code.

Funny side note: I swear some of their code is “very” close to Biosopher’s code. Even down to the variable names.”

Hi Biosopher,
Thank you for your quick reply.
But what you suggested is to fill the space with dummy bits (as I mentioned in the question). Is there any other way to do this?
Because I’m maintaining a download queue, the number of files downloading can be more than one (it can’t be pre-calculated). If I create a file filled with dummy bits, then after downloading the files in the queue to that single file, I have to partition it (it can be done). What I need is to just lock the SD card blocks (say I’m reserving 100MB for downloading). I hope you understood what I meant.
Thank you very much.

I don’t recall if the duration is available from the .mp3 header. I think it has to do with the file size and bit rate. I don’t have the code in front of me at the moment. In either case, you can see where the duration is being calculated by looking through the code.
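For a constant-bitrate MP3, that estimate really is just file size over bit rate. A sketch of my own (not the tutorial’s code; VBR files would need their Xing/VBRI header parsed instead):

```java
public class Mp3Duration {
    // Rough duration of a CBR MP3: bits in the file divided by bits per second.
    // E.g. a 16000-byte file at 128 kbps is 16000 * 8 / 128000 = 1 second.
    public static long estimateDurationMs(long fileSizeBytes, long bitrateBitsPerSec) {
        return fileSizeBytes * 8L * 1000L / bitrateBitsPerSec;
    }
}
```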

I’m trying to stream a raw AAC stream from a site. I used the buffering technique provided in the tutorial, which works great in all of my emulators, however it’s not working on a single real device. It just hangs forever.

I haven’t checked any logcats yet, but does anyone have an idea what could be going wrong here?

Anyone find a solution to the gap problem yet? I’ve tried the StreamingMediaPlayer and the myNPR versions, but both still have gaps. The reason I need this working is because I need to access a stream that requires authentication and I have not been able to find a way to do this with the MediaPlayer.

It’s unclear to me why anyone would be having issues with gaps. As the tutorial code is written, it simply incrementally downloads a file from the server. I.e., the code is written to read a single ‘finite’ audio file from the server, NOT to download some type of continuous audio stream. The only data written to disk should be data sent by the server, which should only be the contents of the audio file as it exists on the server’s disk… thus no gaps.

Have you altered the code in some way or are you downloading an actual audio stream?
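For reference, the heart of that incremental download is just a copy loop like this (a simplified sketch of my own, not the exact tutorial code; 16384 is the download-buffer size discussed in the comments above):

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;

public class IncrementalDownload {
    // Copies everything the server sends into 'out', chunk by chunk, and
    // returns the total byte count. For a finite file the loop ends at EOF;
    // for a continuous Shoutcast-style stream it would never terminate.
    public static long copyStream(InputStream in, OutputStream out) throws IOException {
        byte[] buf = new byte[16384];
        long total = 0;
        int read;
        while ((read = in.read(buf)) > 0) {
            out.write(buf, 0, read);
            total += read;
        }
        return total;
    }
}
```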

Yes, I am trying to download an actual audio stream. And yes, I know that your code was designed originally for single, finite audio files, but I have not found any other way of playing an audio stream that requires authentication. I merely used this code as a starting point to see if I could tweak it to work with audio streams instead of files.

At this point I am able to authenticate with the server and get the audio stream to play in the MediaPlayer, but I have not been able to remove the skips that occur when sending the buffered audio stream to the MediaPlayer. That is, the code currently stores a few seconds of the stream in the cache and then sends that to the MediaPlayer and then repeats this process.

I’ve attempted the multiple-MediaPlayer-type solution in the myNPR code, but that still causes skips.

There are two streams that I wish to play. Both are setup at the server the exact same way (shoutcast/ICY/MP3/http). However, one requires a username/password and the other does not.

The stream that does not require a username/password streams just fine using MediaPlayer.

The stream that does require a username/password will not work with MediaPlayer. I’ve tried using an Authenticator and also putting the username/password into the URL (see previous post).

I am not convinced that the problem is how the streams are configured. I am able to play the stream that requires a username/password in my iPhone version of the app. But what happens on the iPhone is: 1) attempt to connect to URL 2) receive request to authenticate 3) send credentials 4) receive stream.

There doesn’t appear to be a way using MediaPlayer to authenticate.

I am able to authenticate using a URLConnection with an Authenticator (I added the Authenticator to your code to accomplish this). So obviously it is possible on the Android to authenticate. However the MediaPlayer just does not use the Authenticator.
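For anyone else trying this, installing a default Authenticator is standard java.net usage (a sketch; as noted above, MediaPlayer’s own internal connection does not consult it):

```java
import java.net.Authenticator;
import java.net.PasswordAuthentication;

public class StreamAuth {
    // Installs a process-wide authenticator that URLConnection consults
    // when the server issues an HTTP Basic auth challenge.
    public static void install(final String user, final String password) {
        Authenticator.setDefault(new Authenticator() {
            @Override
            protected PasswordAuthentication getPasswordAuthentication() {
                return new PasswordAuthentication(user, password.toCharArray());
            }
        });
    }
}
```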

I have tried to play both video and audio, but it starts playing and after 2 or 3 seconds it gets stuck and stops playing, while the progress shows streaming complete. The video won’t play past 2 or 3 seconds.
error log

hi all,
First of all, I’m not a programmer, but I have mobile sites made for me. I have a question about sound files: I’d like to post 5 sound files of about 1 min each on a mobile website. Visitors to my site should be able to listen to these sound files, including on ‘older’ smartphones (e.g. Android 1.x).

My programmers in India told me that this is only possible with Flash; however, I always understood that Flash is a no-go for mobile.

You’re right. Flash doesn’t work on the iPhone and runs slowly on Android. For the iPhone, simply add a link to the .mp3 file on your webpage. When the user clicks the .mp3 link, the iPhone will launch the phone’s media player.

I believe on Android, the user must press and hold the link until a menu pops up and then select “Save Link”. This should download the .mp3 file, and then you will be able to play it.

thanks a ton for writing this tutorial, this is exactly what I’m looking for.
I’ve checked out the tutorial for Android v1.0 and installed the provided .apk to test the app without modifying anything. It starts streaming and playing just fine. But after a few seconds, playback stops (though the download continues, shown by the progress bar and “Audio full loaded: 1717 Kb read”). I still have to figure out where exactly the file is being saved, just to verify it fully downloads.

My guess is that it fails at this part: “This is where the magic happens as we download media content from the URL stream until we have enough content buffered to start the MediaPlayer. We then let the MediaPlayer play in the background while we download the remaining audio. If the MediaPlayer reaches the end of the buffered audio, then we transfer any newly downloaded audio to the MediaPlayer and let it start playing again.”

In this article you write that “MediaPlayer.setDataSource() now takes a FileDescriptor instead of a String path to the media File.” So I thought the problem I’m having would be fixed by updating to the version from this article.

I downloaded your source files, installed the .apk that has been updated for Android v1.5 but it didn’t work at all. When I hit the “Start Streaming” button, it remains in a pushed down and greyed out state. It never pops back up, no progress bars show up and no sound is being played. I didn’t modify anything and I’m running this app on my Samsung Galaxy Note N7000 with Android 4.1. Do you know what’s wrong?

I have another question:
You write in a reply, “I’m fairly happy with the built-in streaming of v1.5 so I don’t use my own code anymore.” I’m glad you are happy. For me, the main reason I’m interested in your solution is this aspect: “We store the streamed audio locally so you could cache it on device for later use”, as written in your previous article. I’d like to stream an mp3 file initially and simultaneously store it on the SD card. The next time I want to play the file, it will be loaded from the SD card. The mp3 will be downloaded just once. If you are not using your own code any more, how do you handle this particular scenario?

Okay, I was on the train the other day and couldn’t really test beyond running the downloaded .apk. There was nothing wrong with the code; it’s just that the hosted mp3 file is down. I tested with one of my own and it works excellently.

Just back from an Internet-free vacation. Did you solve your problem with caching the file? Sadly I don’t think the MediaPlayer built into Android supports caching… though perhaps it could be extended to do so.