My project needs an mp3 player that plays multiple mp3 files in perfect sync. I tried MediaPlayer, but the problem is that when I start two mp3 files looping, they end up slightly out of sync. They are, of course, created and prepared before Play() is called. And these are not just short sound effects, but 3–4 minute music files that need to loop seamlessly.
Currently I am struggling with AudioTrack, because the files are Android Assets, and when I create a stream from one of them the ByteReader gives an out-of-memory error…
So, is there a better way to play mp3 music in sync?
Thanks, Greg
Posted on 2015-02-26 21:36:47
I have been struggling with this problem for days. My approach is to use MediaExtractor/MediaCodec/AudioTrack, with one thread per track streaming, decoding, and playing its mp3 file. I got it working in C#, but observed a lot of GC activity during playback. Below you can find my single-track code. You can read more about the problems I ran into, and how I intend to solve them, here: How to stream data from MediaCodec to AudioTrack with Xamarin for Android.
I also found that on low-end devices the tracks do not play in sync (latencies of tens to hundreds of milliseconds). I believe the problem is that when I call audioTrack.Play(), the AudioTrack's buffer does not yet hold enough data to start playback immediately, and depending on the format of the input file it takes a different number of mp3 frames to fill it, so the tracks start with different delays. The solution I am experimenting with is to defer audioTrack.Play() until I know the buffer holds enough bytes (AudioTrack.GetMinBufferSize(...)) to start playback immediately, and only then call audioTrack.Play().
// Open the mp3 from raw resources and set up an extractor for it.
var fd = this.Resources.OpenRawResourceFd(Resource.Raw.PianoInsideMics);
var extractor = new MediaExtractor();
extractor.SetDataSource(fd.FileDescriptor, fd.StartOffset, fd.Length);
extractor.SelectTrack(0);

// Create a decoder matching the track's mime type (audio/mpeg for mp3).
var trackFormat = extractor.GetTrackFormat(0);
var decoder = MediaCodec.CreateDecoderByType(trackFormat.GetString(MediaFormat.KeyMime));
decoder.Configure(trackFormat, null, null, MediaCodecConfigFlags.None);

var thread = new Thread(() =>
{
    decoder.Start();
    var decoderInputBuffers = decoder.GetInputBuffers();
    var decoderOutputBuffers = decoder.GetOutputBuffers();

    var inputIndex = decoder.DequeueInputBuffer(-1);
    var inputBuffer = decoderInputBuffers[inputIndex];
    var bufferInfo = new MediaCodec.BufferInfo();
    byte[] audioBuffer = null;
    AudioTrack audioTrack = null;

    var read = extractor.ReadSampleData(inputBuffer, 0);
    while (read > 0)
    {
        // Feed one compressed mp3 sample to the decoder.
        decoder.QueueInputBuffer(inputIndex, 0, read, extractor.SampleTime,
            extractor.SampleFlags == MediaExtractorSampleFlags.Sync
                ? MediaCodecBufferFlags.SyncFrame
                : MediaCodecBufferFlags.None);
        extractor.Advance();

        var outputIndex = decoder.DequeueOutputBuffer(bufferInfo, -1);
        if (outputIndex == (int) MediaCodecInfoState.OutputFormatChanged)
        {
            // The decoder reports the actual PCM output format here.
            trackFormat = decoder.OutputFormat;
        }
        else if (outputIndex >= 0)
        {
            if (bufferInfo.Size > 0)
            {
                var outputBuffer = decoderOutputBuffers[outputIndex];

                // Reuse the managed buffer when possible to reduce GC pressure.
                if (audioBuffer == null || audioBuffer.Length < bufferInfo.Size)
                {
                    audioBuffer = new byte[bufferInfo.Size];
                    Debug.WriteLine("Allocated new audioBuffer: {0}", audioBuffer.Length);
                }

                // Copy the decoded PCM data out and release the codec buffer.
                outputBuffer.Rewind();
                outputBuffer.Get(audioBuffer, 0, bufferInfo.Size);
                decoder.ReleaseOutputBuffer(outputIndex, false);

                // Lazily create the AudioTrack once the output format is known.
                if (audioTrack == null)
                {
                    var sampleRateInHz = trackFormat.GetInteger(MediaFormat.KeySampleRate);
                    var channelCount = trackFormat.GetInteger(MediaFormat.KeyChannelCount);
                    var channelConfig = channelCount == 1 ? ChannelOut.Mono : ChannelOut.Stereo;
                    audioTrack = new AudioTrack(
                        Stream.Music,
                        sampleRateInHz,
                        channelConfig,
                        Encoding.Pcm16bit,
                        AudioTrack.GetMinBufferSize(sampleRateInHz, channelConfig, Encoding.Pcm16bit) * 2,
                        AudioTrackMode.Stream);
                    audioTrack.Play();
                }

                // Write blocks when the AudioTrack's buffer is full, which paces decoding.
                audioTrack.Write(audioBuffer, 0, bufferInfo.Size);
            }
        }

        inputIndex = decoder.DequeueInputBuffer(-1);
        inputBuffer = decoderInputBuffers[inputIndex];
        read = extractor.ReadSampleData(inputBuffer, 0);
    }
});
thread.Start();
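To make the deferred-Play() idea concrete: the byte count returned by AudioTrack.GetMinBufferSize corresponds to a playback duration you can compute from the PCM format, so each thread can keep writing decoded frames until that threshold is reached and only then call Play(). A minimal sketch of the arithmetic (plain Java, assuming 16-bit PCM; the minBufferSize value below is a hypothetical example of what a device might report, not a real measurement):

```java
// Sketch: decide how many PCM bytes to queue before calling play(),
// so that every track starts with a full buffer and the same latency.
public class PrebufferCalc {
    // Bytes of PCM per second for the given format (16-bit = 2 bytes per sample).
    static int bytesPerSecond(int sampleRateHz, int channelCount) {
        return sampleRateHz * channelCount * 2;
    }

    // How long (in ms) a given number of buffered bytes will play for.
    static long bufferedMillis(int bytes, int sampleRateHz, int channelCount) {
        return bytes * 1000L / bytesPerSecond(sampleRateHz, channelCount);
    }

    // Keep writing decoded frames until at least minBufferSize bytes are queued;
    // only then is it safe to call play() on all tracks at the same moment.
    static boolean readyToPlay(int bytesWritten, int minBufferSize) {
        return bytesWritten >= minBufferSize;
    }

    public static void main(String[] args) {
        int sampleRate = 44100, channels = 2;
        int minBufferSize = 14176; // hypothetical value from GetMinBufferSize(...)
        System.out.println(bytesPerSecond(sampleRate, channels));                 // 176400
        System.out.println(bufferedMillis(minBufferSize, sampleRate, channels));  // 80
        System.out.println(readyToPlay(8000, minBufferSize));                     // false
        System.out.println(readyToPlay(16384, minBufferSize));                    // true
    }
}
```

Since the number of mp3 frames needed to reach the threshold varies with the input format, gating on bytes written rather than frames decoded is what keeps the start delay identical across tracks.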
https://stackoverflow.com/questions/24618714