I am developing an application that merges mp4 clips using the mp4parser library (isoparser-1.0-RC-27.jar and aspectjrt-1.8.0.jar). Merging two clips into a single clip works, but as more clips are appended, the output mp4's audio falls further behind the video.
Here is the code:
Movie[] clips = new Movie[2];

//location of the movie clip storage
File mediaStorageDir = new File(Environment.getExternalStoragePublicDirectory(
        Environment.DIRECTORY_PICTURES), "TestMerge");

//Build the two clips into movies
Movie firstClip = MovieCreator.build(first);
Movie secondClip = MovieCreator.build(second);

//Add both movie clips
clips[0] = firstClip;
clips[1] = secondClip;

//Lists for the audio and video tracks
List<Track> videoTracks = new LinkedList<Track>();
List<Track> audioTracks = new LinkedList<Track>();

//Iterate over the movie clips and collect the audio and video tracks
for (Movie movie : clips) {
    for (Track track : movie.getTracks()) {
        if (track.getHandler().equals("soun")) {
            audioTracks.add(track);
        }
        if (track.getHandler().equals("vide")) {
            videoTracks.add(track);
        }
    }
}

//Resulting movie built from the appended audio and video tracks of the two clips
Movie result = new Movie();

//Append all audio and video
if (videoTracks.size() > 0)
    result.addTrack(new AppendTrack(videoTracks.toArray(new Track[videoTracks.size()])));
if (audioTracks.size() > 0)
    result.addTrack(new AppendTrack(audioTracks.toArray(new Track[audioTracks.size()])));

//Output the resulting movie to a new mp4 file
String timeStamp = new SimpleDateFormat("yyyyMMdd_HHmmss").format(new Date());
String outputLocation = mediaStorageDir.getPath() + File.separator + timeStamp + ".mp4";
Container out = new DefaultMp4Builder().build(result);
FileChannel fc = new RandomAccessFile(outputLocation, "rw").getChannel();
out.writeContainer(fc);
fc.close();

//Now set the active URL to play as the combined video!
setURL(outputLocation);
My guess is that as more clips are added, the audio/video synchronization drifts, since merging just two longer clips gives fine sync. Is there any way to prevent this poor sync of video and audio when merging many smaller clips, or has anyone found a solution to this using mp4parser? FFmpeg is another option I am considering, but I haven't found anyone else using it for this.
EDIT: I have discovered that the audio track of each clip is typically slightly longer than its video track, which is what causes the final result to drift further out of sync as more clips are appended into one. I am going to solve this by chopping off the excess audio samples.
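One way to implement that is to crop each clip's audio track to the duration of its matching video track before appending, so every segment starts back in sync. Below is a rough sketch of that idea using mp4parser's CroppedTrack; the helper name cropAudioToVideo is my own, and getSampleDurations() is the per-sample duration accessor in newer isoparser builds (the 1.0-RC-27 Track interface exposes the same timing information through getDecodingTimeEntries()), so treat this as an illustration rather than drop-in code:

import com.googlecode.mp4parser.authoring.Track;
import com.googlecode.mp4parser.authoring.tracks.CroppedTrack;

//Sketch: trim an audio track so it is no longer than the video track it is paired with.
//Assumes getSampleDurations(); older isoparser 1.0 builds expose getDecodingTimeEntries() instead.
private static Track cropAudioToVideo(Track audio, Track video) {
    //Track duration in seconds = duration in timescale units / timescale
    double videoSeconds = (double) video.getDuration() / video.getTrackMetaData().getTimescale();
    long audioTimescale = audio.getTrackMetaData().getTimescale();

    long[] sampleDurations = audio.getSampleDurations();
    double elapsed = 0;
    int lastSample = 0;
    //Keep audio samples only while they still fit inside the video duration
    for (int i = 0; i < sampleDurations.length; i++) {
        double next = elapsed + (double) sampleDurations[i] / audioTimescale;
        if (next > videoSeconds) {
            break;
        }
        elapsed = next;
        lastSample = i + 1;
    }
    //CroppedTrack keeps the sample range [fromSample, toSample)
    return new CroppedTrack(audio, 0, lastSample);
}

Each clip's audio would then be cropped against its own video track before being added to audioTracks, so the small per-clip offset can no longer accumulate across appends.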