I was having the same problem and figured it out after digging through the documentation for a while. As far as I can tell, the warning appears because the play method synchronously fetches any metadata it needs to play the asset that you haven't already loaded. In my case, the asset was a video.
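For context, a setup along these lines (a hypothetical sketch, not necessarily what your code looks like) is the kind of thing that produces the warning, because nothing loads the asset's properties before play() runs:

import SwiftUI
import AVKit

// Hypothetical "before" sketch: nothing has loaded the asset's tracks or
// duration yet, so play() fetches that metadata synchronously.
struct NaiveVideoPlayerView: View {
    let url: URL
    @State private var player = AVPlayer()

    var body: some View {
        VideoPlayer(player: player)
            .onAppear {
                player.replaceCurrentItem(with: AVPlayerItem(url: url))
                player.play()
            }
    }
}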
Here is the relevant piece of Apple documentation: https://developer.apple.com/documentation/avfoundation/media_assets/retrieving_media_metadata
And this is the actual function I used to load the metadata:
https://developer.apple.com/documentation/avfoundation/avasynchronouskeyvalueloading/3747326-load
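In isolation, the loading call looks something like this (a minimal sketch; the preloadAsset helper is just a name I've added for illustration):

import AVFoundation

// Minimal sketch of the async load API linked above.
func preloadAsset(from url: URL) async throws -> AVAsset {
    let asset = AVAsset(url: url)
    // Asking for these properties up front caches them on the asset,
    // so a later play() doesn't have to fetch them on demand.
    let (tracks, duration, transform) = try await asset.load(.tracks, .duration, .preferredTransform)
    print("Loaded \(tracks.count) track(s), duration \(duration.seconds)s, transform \(transform)")
    return asset
}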
Here is my video player view, which loads the metadata asynchronously before attempting to play the asset. There can be a small delay while the video loads, during which the player shows a black screen, but it doesn't block any UI, and the warning went away.
import SwiftUI
import AVKit

struct VideoPlayerView: View {
    private var videoURL: URL
    @State private var player: AVPlayer?
    @State private var isMuted: Bool = true
    var showMuteButton: Bool

    init(url: URL, showMuteButton: Bool = true) {
        self.videoURL = url
        self.showMuteButton = showMuteButton
    }

    var body: some View {
        VideoPlayer(player: player)
            .onAppear {
                Task {
                    player = AVPlayer()
                    await loadPlayerItem(self.videoURL)
                    player?.isMuted = true
                    self.isMuted = player?.isMuted ?? false
                    player?.play()
                    // Loop the video by seeking back to the start when it finishes.
                    NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime, object: player?.currentItem, queue: .main) { _ in
                        self.player?.seek(to: .zero)
                        self.player?.play()
                    }
                } //: Task
            } //: onAppear
            .onDisappear {
                // Stop the player when the view disappears
                player?.pause()
            }
            .onChange(of: videoURL) { oldValue, newValue in
                Task {
                    await loadPlayerItem(newValue)
                }
            }
            .overlay(alignment: .topTrailing) {
                if showMuteButton {
                    Image(systemName: isMuted ? "speaker.slash.fill" : "speaker.wave.2.fill")
                        .font(.footnote)
                        .foregroundColor(.white)
                        .padding(8)
                        .background(Color.gray.opacity(0.3))
                        .clipShape(Circle())
                        .padding()
                        .onTapGesture {
                            player?.isMuted.toggle()
                            self.isMuted = player?.isMuted ?? false
                        }
                }
            }
    }

    func loadPlayerItem(_ videoURL: URL) async {
        let asset = AVAsset(url: videoURL)
        do {
            // Loading these properties up front is what keeps play() from
            // having to fetch them synchronously later.
            _ = try await asset.load(.tracks, .duration, .preferredTransform)
        } catch {
            print(error.localizedDescription)
        }
        let item = AVPlayerItem(asset: asset)
        player?.replaceCurrentItem(with: item)
    }
}
The key function is loadPlayerItem. You don't actually need the return values from load, but loading those properties caches them on the asset, so play already has everything it needs to play the asset without fetching anything synchronously.
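For reference, a parent view can use it something like this (the URL here is just a placeholder):

import SwiftUI

struct ContentView: View {
    // Placeholder URL; substitute your own video URL.
    private let videoURL = URL(string: "https://example.com/sample.mp4")!

    var body: some View {
        VideoPlayerView(url: videoURL, showMuteButton: true)
            .aspectRatio(16 / 9, contentMode: .fit)
    }
}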
I hope this helps.