I have an application that plays MP3 files available at a public URL. Unfortunately the server does not support streaming, but Android still makes the user experience quite acceptable.
It works fine on all platforms except Jelly Bean. When requesting the MP3, JB asks with a Range header 10 times, and only after the 10th attempt does it seem to revert to the old behavior. It looks like this already reported issue.
I found another SO thread where the recommended solution is to use a Transfer-Encoding: chunked header, but just below it there is a comment saying that this doesn't work.
For the moment I have no control whatsoever over the response headers, but until I am able to change them I thought I would look for an alternative on the client side. (Even then, I can only return a Content-Range that covers indexes 0 to Content-Length - 1, e.g. Content-Range: bytes 0-3123456/3123457.)
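For reference, this is the kind of partial-content response I understand JB is after. It is only a hypothetical sketch (a plain servlet with made-up names), since I don't control the server, and it still only covers the full 0..Content-Length - 1 range:

// Hypothetical server-side sketch (I don't control the server): answering a
// "Range: bytes=0-" request with 206 Partial Content and a full-length Content-Range.
import java.io.IOException;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;

public class Mp3Servlet extends HttpServlet {
    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws IOException {
        long fileLength = 3123457L; // example length from above
        response.setContentType("audio/mpeg");
        response.setStatus(HttpServletResponse.SC_PARTIAL_CONTENT); // 206
        response.setHeader("Accept-Ranges", "bytes");
        response.setHeader("Content-Range",
                "bytes 0-" + (fileLength - 1) + "/" + fileLength);
        response.setContentLength((int) fileLength);
        // ... copy the MP3 bytes to response.getOutputStream() ...
    }
}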
What I tried to do is implement pseudo-streaming on the client side by:
- Opening an input stream to the MP3.
- Decoding the incoming bytes using JLayer. I found the decoding code at this link.
- Sending the decoded bytes to an AudioTrack created in MODE_STREAM that is already playing (see the sketch just below).
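This is roughly how I set up that AudioTrack (the buffer size multiplier is arbitrary):

import android.media.AudioFormat;
import android.media.AudioManager;
import android.media.AudioTrack;

// 44.1 kHz, stereo, 16-bit PCM -- matching what the decoder below produces.
int sampleRate = 44100;
int channelConfig = AudioFormat.CHANNEL_OUT_STEREO;
int audioFormat = AudioFormat.ENCODING_PCM_16BIT;
int minBufferSize = AudioTrack.getMinBufferSize(sampleRate, channelConfig, audioFormat);

AudioTrack audioTrack = new AudioTrack(
        AudioManager.STREAM_MUSIC,
        sampleRate,
        channelConfig,
        audioFormat,
        minBufferSize * 4,       // some headroom over the minimum
        AudioTrack.MODE_STREAM);
audioTrack.play();               // starts consuming as soon as data is written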
The piece of code that does the decoding can be found there; I have only modified it so that it receives an InputStream:
// Imports needed for the decode method:
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;

import javazoom.jl.decoder.Bitstream;
import javazoom.jl.decoder.BitstreamException;
import javazoom.jl.decoder.Decoder;
import javazoom.jl.decoder.DecoderException;
import javazoom.jl.decoder.Header;
import javazoom.jl.decoder.SampleBuffer;

public byte[] decode(InputStream inputStream, int startMs, int maxMs) throws IOException {
    ByteArrayOutputStream outStream = new ByteArrayOutputStream(1024);
    float totalMs = 0;
    boolean seeking = true;
    try {
        Bitstream bitstream = new Bitstream(inputStream);
        Decoder decoder = new Decoder();
        boolean done = false;
        while (!done) {
            Header frameHeader = bitstream.readFrame();
            if (frameHeader == null) {
                done = true;
            } else {
                totalMs += frameHeader.ms_per_frame();
                if (totalMs >= startMs) {
                    seeking = false;
                }
                if (!seeking) {
                    // logger.debug("Handling header: " + frameHeader.layer_string());
                    SampleBuffer output = (SampleBuffer) decoder.decodeFrame(frameHeader, bitstream);
                    if (output.getSampleFrequency() != 44100 || output.getChannelCount() != 2) {
                        throw new IllegalArgumentException("mono or non-44100 MP3 not supported");
                    }
                    // Write the decoded samples as 16-bit little-endian PCM.
                    short[] pcm = output.getBuffer();
                    for (short s : pcm) {
                        outStream.write(s & 0xff);
                        outStream.write((s >> 8) & 0xff);
                    }
                }
                if (totalMs >= (startMs + maxMs)) {
                    done = true;
                }
            }
            bitstream.closeFrame();
        }
        return outStream.toByteArray();
    } catch (BitstreamException e) {
        throw new IOException("Bitstream error: " + e);
    } catch (DecoderException e) {
        throw new IOException("Decoder error: " + e);
    }
}
I am requesting the decoded bytes in time chunks: starting with (0, 5000) so that I have a bigger array to play at first, then requesting the subsequent byte arrays that each span one second: (5000, 1000), (6000, 1000), (7000, 1000), and so on.
The decoding is fast enough and runs on a separate thread; once a decoded byte array is available, I push it through a blocking queue to the AudioTrack, which plays on yet another thread.
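Stripped down, the threading setup looks roughly like this (the real code has more error handling; the method name and the end-of-stream check are simplified for this post; decode() and audioTrack are the ones shown above):

import java.io.InputStream;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

import android.media.AudioTrack;

void startPseudoStreaming(final InputStream inputStream, final AudioTrack audioTrack) {
    final BlockingQueue<byte[]> pcmQueue = new LinkedBlockingQueue<byte[]>();

    // Producer: decode 5 seconds up front, then one-second chunks,
    // following the (0, 5000), (5000, 1000), (6000, 1000), ... pattern described above.
    new Thread(new Runnable() {
        @Override
        public void run() {
            try {
                int startMs = 0;
                int chunkMs = 5000;                 // bigger first chunk
                while (true) {
                    byte[] pcm = decode(inputStream, startMs, chunkMs);
                    if (pcm.length == 0) {
                        break;                      // nothing more to decode
                    }
                    pcmQueue.put(pcm);
                    startMs += chunkMs;
                    chunkMs = 1000;                 // subsequent chunks span one second
                }
            } catch (Exception e) {
                // log and stop producing
            }
        }
    }).start();

    // Consumer: drain the queue into the already-playing AudioTrack.
    new Thread(new Runnable() {
        @Override
        public void run() {
            try {
                while (true) {
                    byte[] pcm = pcmQueue.take();   // blocks until a chunk is available
                    audioTrack.write(pcm, 0, pcm.length);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        }
    }).start();
}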
The problem is that playback is not smooth: the chunks are not continuous within the track (each chunk is continuous on its own, but writing them one after another to the AudioTrack results in choppy playback).
To wrap up:
- If you have bumped into this Jelly Bean issue, how did you solve it?
- If any of you have tried my approach, what am I doing wrong in the code above? If this is the solution you used, I can publish the rest of the code.
Thanks!