I've implemented an audio player using AVAudioPlayer (not AVPlayer). I'm able to handle the remote control events with the following method, and it works quite alright so far. However, I see two more subtypes for these events: UIEventSubtypeRemoteControlEndSeekingForward and UIEventSubtypeRemoteControlEndSeekingBackward.
    - (void)remoteControlReceivedWithEvent:(UIEvent *)event
    {
        // If it is a remote control event, handle it accordingly.
        if (event.type == UIEventTypeRemoteControl)
        {
            if (event.subtype == UIEventSubtypeRemoteControlPlay)
            {
                [self playAudio];
            }
            else if (event.subtype == UIEventSubtypeRemoteControlPause)
            {
                [self pauseAudio];
            }
            else if (event.subtype == UIEventSubtypeRemoteControlTogglePlayPause)
            {
                [self togglePlayPause];
            }
            else if (event.subtype == UIEventSubtypeRemoteControlBeginSeekingBackward)
            {
                [self rewindTheAudio]; // This method rewinds the audio by 15 seconds.
            }
            else if (event.subtype == UIEventSubtypeRemoteControlBeginSeekingForward)
            {
                [self fastForwardTheAudio]; // This method fast-forwards the audio by 15 seconds.
            }
        }
    }
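In case it matters, my guess at what handling those two End subtypes would look like is something like this (untested; seekTimer would be a new NSTimer property I'd have to add, with the Begin events starting it instead of doing a one-shot skip):

    // Sketch only — assumes a hypothetical `seekTimer` property that the
    // Begin events start and the End events invalidate.
    else if (event.subtype == UIEventSubtypeRemoteControlEndSeekingBackward ||
             event.subtype == UIEventSubtypeRemoteControlEndSeekingForward)
    {
        [self.seekTimer invalidate];
        self.seekTimer = nil;
    }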
So the questions:

1. In order to have things work right, am I supposed to implement those two subtypes, too?
2. This method only enables the rewind, play/pause, and fast-forward buttons on the lock screen, but it doesn't display the file title, artwork, and duration. How can I display that info using AVAudioPlayer or AVAudioSession (I don't really want one more library/API to implement this)?

2-a. I discovered MPNowPlayingInfoCenter while searching and I don't know much about it. Do I have to use it to implement that stuff above? :-[
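From what I've gathered from the docs so far, using it would look roughly like this (untested; self.audioPlayer is my AVAudioPlayer, and artworkImage is a UIImage I'd have to load myself):

    #import <MediaPlayer/MediaPlayer.h>

    // Rough, untested sketch of setting the lock-screen metadata via
    // MPNowPlayingInfoCenter. `artworkImage` is a UIImage I'd supply.
    - (void)updateNowPlayingInfo
    {
        MPMediaItemArtwork *artwork =
            [[MPMediaItemArtwork alloc] initWithImage:artworkImage];

        [MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo = @{
            MPMediaItemPropertyTitle: @"Track Title",
            MPMediaItemPropertyArtist: @"Artist Name",
            MPMediaItemPropertyArtwork: artwork,
            MPMediaItemPropertyPlaybackDuration: @(self.audioPlayer.duration),
            MPNowPlayingInfoPropertyElapsedPlaybackTime: @(self.audioPlayer.currentTime),
        };
    }

Is that the right direction, or is there a way to get this from AVAudioPlayer/AVAudioSession alone?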