You're not giving a lot of specifics about what exactly you've tried and what the problematic areas are, so I just made a small test to see if I could reproduce any of what you're describing.
I do not have any conclusive findings, but I can at least confirm that my Galaxy Nexus (Android 4.0.2) is able to play three videos simultaneously without any problems. On the other hand, an old Samsung Galaxy Spica (Android 2.1-update1) I had lying around only plays a single file at a time - it appears to always be the first SurfaceView.
I further investigated different API levels by setting up emulators for Android 3.0, 2.3.3, and 2.2. All of these platforms appear to handle playback of multiple video files on different SurfaceViews just fine. I did one final test with an emulator running 2.1-update1 too, which, interestingly, also played the test case without problems, unlike the actual phone. I did notice some slight differences in how the layout was rendered, though.
This behaviour leads me to suspect that there isn't really a software limitation to what you're after, but rather that whether simultaneous playback of multiple video files is supported depends on the hardware. Hence support for this scenario will differ from device to device. From an empirical point of view, I definitely think it would be worth testing this hypothesis on some more physical devices.
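Since support appears to be device-dependent, it may also be worth hooking up an OnErrorListener, so that a device which can't cope logs the failure rather than leaving you guessing. Below is a minimal sketch that could be slotted in next to the other listener registrations in the Activity-based implementation further down - note that it's my own assumption that unsupported devices actually report through this callback, not something I verified:

// Hedged addition (not part of the original test): surface per-device playback
// failures through MediaPlayer's error callback instead of failing silently.
mMediaPlayers[index].setOnErrorListener(new MediaPlayer.OnErrorListener() {
    @Override public boolean onError(MediaPlayer player, int what, int extra) {
        // 'what' is a MediaPlayer.MEDIA_ERROR_* constant; 'extra' is implementation-specific
        Log.e(TAG, "MediaPlayer(" + indexOf(player) + "): onError what=" + what + ", extra=" + extra);
        return true; // true = handled; also suppresses the onCompletion callback
    }
});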
Just for reference, some details regarding the implementation:
- I set up two slightly different implementations: one based on three MediaPlayer instances in a single Activity, and one in which these were factored out into three separate fragments, each with its own MediaPlayer object. (I did not find any playback differences between these two implementations, by the way.)
- A single 3gp file (thanks for that, Apple), located in the assets folder, was used for playback with all players.
- The code for both implementations is attached below and is largely based on Google's MediaPlayerDemo_Video sample implementation - I did strip away some code not required for the actual testing. The result is by no means complete or suitable for use in live apps.
Activity-based implementation:
public class MultipleVideoPlayActivity extends Activity implements
        OnBufferingUpdateListener, OnCompletionListener, OnPreparedListener, OnVideoSizeChangedListener, SurfaceHolder.Callback {

    private static final String TAG = "MediaPlayer";
    private static final int[] SURFACE_RES_IDS = { R.id.video_1_surfaceview, R.id.video_2_surfaceview, R.id.video_3_surfaceview };

    private MediaPlayer[] mMediaPlayers = new MediaPlayer[SURFACE_RES_IDS.length];
    private SurfaceView[] mSurfaceViews = new SurfaceView[SURFACE_RES_IDS.length];
    private SurfaceHolder[] mSurfaceHolders = new SurfaceHolder[SURFACE_RES_IDS.length];
    private boolean[] mSizeKnown = new boolean[SURFACE_RES_IDS.length];
    private boolean[] mVideoReady = new boolean[SURFACE_RES_IDS.length];

    @Override public void onCreate(Bundle icicle) {
        super.onCreate(icicle);
        setContentView(R.layout.multi_videos_layout);

        // create surface holders
        for (int i=0; i<mSurfaceViews.length; i++) {
            mSurfaceViews[i] = (SurfaceView) findViewById(SURFACE_RES_IDS[i]);
            mSurfaceHolders[i] = mSurfaceViews[i].getHolder();
            mSurfaceHolders[i].addCallback(this);
            mSurfaceHolders[i].setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
        }
    }

    public void onBufferingUpdate(MediaPlayer player, int percent) {
        Log.d(TAG, "MediaPlayer(" + indexOf(player) + "): onBufferingUpdate percent: " + percent);
    }

    public void onCompletion(MediaPlayer player) {
        Log.d(TAG, "MediaPlayer(" + indexOf(player) + "): onCompletion called");
    }

    public void onVideoSizeChanged(MediaPlayer player, int width, int height) {
        Log.v(TAG, "MediaPlayer(" + indexOf(player) + "): onVideoSizeChanged called");
        if (width == 0 || height == 0) {
            Log.e(TAG, "invalid video width(" + width + ") or height(" + height + ")");
            return;
        }

        int index = indexOf(player);
        if (index == -1) return; // sanity check; should never happen

        mSizeKnown[index] = true;
        if (mVideoReady[index] && mSizeKnown[index]) {
            startVideoPlayback(player);
        }
    }

    public void onPrepared(MediaPlayer player) {
        Log.d(TAG, "MediaPlayer(" + indexOf(player) + "): onPrepared called");

        int index = indexOf(player);
        if (index == -1) return; // sanity check; should never happen

        mVideoReady[index] = true;
        if (mVideoReady[index] && mSizeKnown[index]) {
            startVideoPlayback(player);
        }
    }

    public void surfaceChanged(SurfaceHolder holder, int i, int j, int k) {
        Log.d(TAG, "SurfaceHolder(" + indexOf(holder) + "): surfaceChanged called");
    }

    public void surfaceDestroyed(SurfaceHolder holder) {
        Log.d(TAG, "SurfaceHolder(" + indexOf(holder) + "): surfaceDestroyed called");
    }

    public void surfaceCreated(SurfaceHolder holder) {
        Log.d(TAG, "SurfaceHolder(" + indexOf(holder) + "): surfaceCreated called");

        int index = indexOf(holder);
        if (index == -1) return; // sanity check; should never happen

        try {
            mMediaPlayers[index] = new MediaPlayer();
            AssetFileDescriptor afd = getAssets().openFd("sample.3gp");
            mMediaPlayers[index].setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
            mMediaPlayers[index].setDisplay(mSurfaceHolders[index]);
            mMediaPlayers[index].prepare();
            mMediaPlayers[index].setOnBufferingUpdateListener(this);
            mMediaPlayers[index].setOnCompletionListener(this);
            mMediaPlayers[index].setOnPreparedListener(this);
            mMediaPlayers[index].setOnVideoSizeChangedListener(this);
            mMediaPlayers[index].setAudioStreamType(AudioManager.STREAM_MUSIC);
        }
        catch (Exception e) { e.printStackTrace(); }
    }

    @Override protected void onPause() {
        super.onPause();
        releaseMediaPlayers();
    }

    @Override protected void onDestroy() {
        super.onDestroy();
        releaseMediaPlayers();
    }

    private void releaseMediaPlayers() {
        for (int i=0; i<mMediaPlayers.length; i++) {
            if (mMediaPlayers[i] != null) {
                mMediaPlayers[i].release();
                mMediaPlayers[i] = null;
            }
        }
    }

    private void startVideoPlayback(MediaPlayer player) {
        Log.v(TAG, "MediaPlayer(" + indexOf(player) + "): startVideoPlayback");
        player.start();
    }

    private int indexOf(MediaPlayer player) {
        for (int i=0; i<mMediaPlayers.length; i++) if (mMediaPlayers[i] == player) return i;
        return -1;
    }

    private int indexOf(SurfaceHolder holder) {
        for (int i=0; i<mSurfaceHolders.length; i++) if (mSurfaceHolders[i] == holder) return i;
        return -1;
    }
}
R.layout.multi_videos_layout:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent" android:layout_height="match_parent"
    android:orientation="vertical">

    <SurfaceView android:id="@+id/video_1_surfaceview"
        android:layout_width="fill_parent" android:layout_height="0dp"
        android:layout_weight="1" />

    <SurfaceView android:id="@+id/video_2_surfaceview"
        android:layout_width="fill_parent" android:layout_height="0dp"
        android:layout_weight="1" />

    <SurfaceView android:id="@+id/video_3_surfaceview"
        android:layout_width="fill_parent" android:layout_height="0dp"
        android:layout_weight="1" />

</LinearLayout>
Fragment-based implementation:
public class MultipleVideoPlayFragmentActivity extends FragmentActivity {

    private static final String TAG = "MediaPlayer";

    @Override public void onCreate(Bundle icicle) {
        super.onCreate(icicle);
        setContentView(R.layout.multi_videos_activity_layout);
    }

    public static class VideoFragment extends Fragment implements
            OnBufferingUpdateListener, OnCompletionListener, OnPreparedListener, OnVideoSizeChangedListener, SurfaceHolder.Callback {

        private MediaPlayer mMediaPlayer;
        private SurfaceView mSurfaceView;
        private SurfaceHolder mSurfaceHolder;
        private boolean mSizeKnown;
        private boolean mVideoReady;

        @Override public View onCreateView(LayoutInflater inflater, ViewGroup container, Bundle savedInstanceState) {
            return inflater.inflate(R.layout.multi_videos_fragment_layout, container, false);
        }

        @Override public void onActivityCreated(Bundle savedInstanceState) {
            super.onActivityCreated(savedInstanceState);
            mSurfaceView = (SurfaceView) getView().findViewById(R.id.video_surfaceview);
            mSurfaceHolder = mSurfaceView.getHolder();
            mSurfaceHolder.addCallback(this);
            mSurfaceHolder.setType(SurfaceHolder.SURFACE_TYPE_PUSH_BUFFERS);
        }

        public void onBufferingUpdate(MediaPlayer player, int percent) {
            Log.d(TAG, "onBufferingUpdate percent: " + percent);
        }

        public void onCompletion(MediaPlayer player) {
            Log.d(TAG, "onCompletion called");
        }

        public void onVideoSizeChanged(MediaPlayer player, int width, int height) {
            Log.v(TAG, "onVideoSizeChanged called");
            if (width == 0 || height == 0) {
                Log.e(TAG, "invalid video width(" + width + ") or height(" + height + ")");
                return;
            }

            mSizeKnown = true;
            if (mVideoReady && mSizeKnown) {
                startVideoPlayback();
            }
        }

        public void onPrepared(MediaPlayer player) {
            Log.d(TAG, "onPrepared called");
            mVideoReady = true;
            if (mVideoReady && mSizeKnown) {
                startVideoPlayback();
            }
        }

        public void surfaceChanged(SurfaceHolder holder, int i, int j, int k) {
            Log.d(TAG, "surfaceChanged called");
        }

        public void surfaceDestroyed(SurfaceHolder holder) {
            Log.d(TAG, "surfaceDestroyed called");
        }

        public void surfaceCreated(SurfaceHolder holder) {
            Log.d(TAG, "surfaceCreated called");

            try {
                mMediaPlayer = new MediaPlayer();
                AssetFileDescriptor afd = getActivity().getAssets().openFd("sample.3gp");
                mMediaPlayer.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
                mMediaPlayer.setDisplay(mSurfaceHolder);
                mMediaPlayer.prepare();
                mMediaPlayer.setOnBufferingUpdateListener(this);
                mMediaPlayer.setOnCompletionListener(this);
                mMediaPlayer.setOnPreparedListener(this);
                mMediaPlayer.setOnVideoSizeChangedListener(this);
                mMediaPlayer.setAudioStreamType(AudioManager.STREAM_MUSIC);
            }
            catch (Exception e) { e.printStackTrace(); }
        }

        @Override public void onPause() {
            super.onPause();
            releaseMediaPlayer();
        }

        @Override public void onDestroy() {
            super.onDestroy();
            releaseMediaPlayer();
        }

        private void releaseMediaPlayer() {
            if (mMediaPlayer != null) {
                mMediaPlayer.release();
                mMediaPlayer = null;
            }
        }

        private void startVideoPlayback() {
            Log.v(TAG, "startVideoPlayback");
            mMediaPlayer.start();
        }
    }
}
R.layout.multi_videos_activity_layout:
<?xml version="1.0" encoding="utf-8"?>
<LinearLayout xmlns:android="http://schemas.android.com/apk/res/android"
    android:layout_width="match_parent" android:layout_height="match_parent"
    android:orientation="vertical">

    <fragment class="mh.so.video.MultipleVideoPlayFragmentActivity$VideoFragment"
        android:id="@+id/video_1_fragment" android:layout_width="fill_parent"
        android:layout_height="0dp" android:layout_weight="1" />

    <fragment class="mh.so.video.MultipleVideoPlayFragmentActivity$VideoFragment"
        android:id="@+id/video_2_fragment" android:layout_width="fill_parent"
        android:layout_height="0dp" android:layout_weight="1" />

    <fragment class="mh.so.video.MultipleVideoPlayFragmentActivity$VideoFragment"
        android:id="@+id/video_3_fragment" android:layout_width="fill_parent"
        android:layout_height="0dp" android:layout_weight="1" />

</LinearLayout>
R.layout.multi_videos_fragment_layout:
<?xml version="1.0" encoding="utf-8"?>
<SurfaceView xmlns:android="http://schemas.android.com/apk/res/android"
    android:id="@+id/video_surfaceview" android:layout_width="fill_parent"
    android:layout_height="fill_parent" />
Update: Although it's been around for a while now, I just thought it'd be worth pointing out that Google's Grafika project showcases a 'double decode' feature, which "decodes two video streams simultaneously to two TextureViews". I'm not sure how well it scales to more than two video files, but it is nevertheless relevant to the original question.
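For anyone who'd rather experiment with TextureView-based playback without diving into Grafika's MediaCodec machinery, here's a minimal sketch of pointing a MediaPlayer at a TextureView instead of a SurfaceView (API 14+). To be clear, this is my own illustration and not code from Grafika; R.id.video_textureview and sample.3gp are placeholder names:

// Hypothetical sketch: render a MediaPlayer onto a TextureView's SurfaceTexture.
TextureView textureView = (TextureView) findViewById(R.id.video_textureview);
textureView.setSurfaceTextureListener(new TextureView.SurfaceTextureListener() {
    @Override public void onSurfaceTextureAvailable(SurfaceTexture surfaceTexture, int width, int height) {
        try {
            MediaPlayer player = new MediaPlayer();
            AssetFileDescriptor afd = getAssets().openFd("sample.3gp");
            player.setDataSource(afd.getFileDescriptor(), afd.getStartOffset(), afd.getLength());
            player.setSurface(new Surface(surfaceTexture)); // wrap the SurfaceTexture for output
            player.prepare();
            player.start();
        }
        catch (Exception e) { e.printStackTrace(); }
    }

    @Override public void onSurfaceTextureSizeChanged(SurfaceTexture surface, int width, int height) { }
    @Override public boolean onSurfaceTextureDestroyed(SurfaceTexture surface) { return true; }
    @Override public void onSurfaceTextureUpdated(SurfaceTexture surface) { }
});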