How to get a specific video area in Swift AVPlayer?

I am working on a Swift-based macOS application. The user chooses a video file and the view controller plays it in an AVPlayerView. The user can draw a rectangle over the AVPlayerView while the video plays; I have that part working. How can I show the video inside the selected rectangle in a small camera preview layer at the bottom right?

Here is the output: Output Pic

Here is the Code:

var videoURL = NSURL()    // the video file URL from the file chooser
var videoSize = NSSize()  // the video resolution, e.g. 720x480
var player: AVPlayerView! // the AVPlayerView that plays the video

var cameraPreviewLayer: AVPlayerLayer! // the camera preview layer at the bottom right
var rectangleLayer: CALayer!           // the rectangle drawn over the player while the mouse is dragged

var isClicked = false
var startPoint = NSPoint() // the mouse location when dragging starts

var rect: NSRect!

override func viewDidLoad() {
    super.viewDidLoad()

    self.player = AVPlayerView()

    self.player.frame.origin = CGPoint(x: 0, y: 0)
    self.player.setFrameSize(self.videoSize)

    // make sure the view is layer-backed so sublayers can be added later
    self.player.wantsLayer = true

    self.player.player = AVPlayer(URL: videoURL)
    self.view.addSubview(self.player)

    setupRectangle()
    setupCameraPreviewLayer()
}

// initially set up the white rectangle that will be dragged over the player
func setupRectangle() {
    self.rectangleLayer = CALayer()
    self.rectangleLayer.backgroundColor = NSColor.clearColor().CGColor
    self.rectangleLayer.borderColor = NSColor.whiteColor().CGColor
    self.rectangleLayer.borderWidth = 1.5
    self.rectangleLayer.frame = self.player.bounds
}
// initially set up the camera preview layer shown at the bottom right of the player
func setupCameraPreviewLayer() {
    cameraPreviewLayer = AVPlayerLayer(player: self.player.player)

    // place the preview layer at the bottom right, in the player layer's coordinate space
    cameraPreviewLayer.frame = CGRect(x: self.player.bounds.size.width - 100,
                                      y: 0,
                                      width: 100,
                                      height: 100)
}

override func mouseDown(theEvent: NSEvent) {
    // convert from window coordinates to the player view's coordinate space
    startPoint = self.player.convertPoint(theEvent.locationInWindow, fromView: nil)
    isClicked = true
    removeCameraPreviewLayer()
}

override func mouseDragged(theEvent: NSEvent) {
    let endPoint = self.player.convertPoint(theEvent.locationInWindow, fromView: nil)

    if isClicked {
        // normalize the rect so it stays valid whichever direction the mouse is dragged
        rect = NSRect(x: min(startPoint.x, endPoint.x),
                      y: min(startPoint.y, endPoint.y),
                      width: abs(endPoint.x - startPoint.x),
                      height: abs(endPoint.y - startPoint.y))
        drawCustomRect(rect)
    }
}

override func mouseUp(theEvent: NSEvent) {
    if isClicked {
        // I think we have to do some magic code here to crop to `rect`
        addCameraPreviewLayer()
    }
    isClicked = false
}

// redraw the white rectangle over the player
func drawCustomRect(rect: NSRect) {
    self.rectangleLayer.frame = rect
    self.player.layer?.addSublayer(self.rectangleLayer)
}

// add the camera preview layer to the AVPlayer
func addCameraPreviewLayer() {
    self.player.layer?.addSublayer(self.cameraPreviewLayer)
}

// remove Camera PreviewLayer from AVPlayer
func removeCameraPreviewLayer() {
    self.cameraPreviewLayer.removeFromSuperlayer()
}

Here is the desired output picture: desired output

Suppose the video size is 720x480 and the user draws a rectangle whose corners are (x1, y1), (x2, y1), (x2, y2), (x1, y2). How can I crop the video in the camera preview layer (at the bottom right) so that it shows only the area inside the rectangle the user selected?

Can anyone help me achieve this functionality? I have spent many days on this and am exhausted.

Note: I can do this with OpenCV by processing the video with an ROI, but the requirement is to do it in native Swift.
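
For what it's worth, here is a minimal, untested sketch of one native approach that seems plausible: add a second AVPlayerLayer, driven by the same AVPlayer, inside a small clipping container layer, then scale and offset it so that only the selected rect is visible in the 100x100 preview. It assumes `selectionRect` is the rect produced in `drawCustomRect`, expressed in the player view's coordinate space, and that the player's frame matches the video size (as set in `viewDidLoad`), so stretching the video to the layer bounds keeps the mapping exact.

// A rough, untested sketch: clip a second AVPlayerLayer (same AVPlayer) inside
// a small container layer, scaled and offset so that only `selectionRect`
// shows in the 100x100 preview at the bottom right.
// Assumptions: `selectionRect` is in the player view's coordinate space, and
// the player's frame matches the video size (see viewDidLoad).
func showCroppedPreview(selectionRect: NSRect) {
    let previewSize: CGFloat = 100.0

    // container at the bottom right that clips everything outside its bounds
    let container = CALayer()
    container.frame = CGRect(x: self.player.bounds.size.width - previewSize,
                             y: 0,
                             width: previewSize,
                             height: previewSize)
    container.masksToBounds = true

    // scale factors that make the selected rect fill the container
    let scaleX = previewSize / selectionRect.size.width
    let scaleY = previewSize / selectionRect.size.height

    // second player layer, enlarged and shifted so that selectionRect's
    // origin lands at the container's origin
    let croppedLayer = AVPlayerLayer(player: self.player.player)
    croppedLayer.videoGravity = AVLayerVideoGravityResize // stretch video to the layer bounds
    croppedLayer.frame = CGRect(x: -selectionRect.origin.x * scaleX,
                                y: -selectionRect.origin.y * scaleY,
                                width: self.player.bounds.size.width * scaleX,
                                height: self.player.bounds.size.height * scaleY)
    container.addSublayer(croppedLayer)

    self.player.layer?.addSublayer(container)
}

Something like this could be called from mouseUp with the dragged rect in place of addCameraPreviewLayer(); a non-square selection will of course appear stretched in the square preview.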

Cutlery answered 4/6, 2015 at 13:53 Comment(4)
Hey, did you find a solution for this? I have been stuck in the same situation for a few days. Moony
Unfortunately, I did not find a solution in native Swift. I solved the problem by integrating OpenCV into my Swift project. The solution, roughly: 1) open the video; 2) add the functionality for the user to select an area; 3) split the video into frames with OpenCV (a video is a sequence of frames played back at a given FPS); 4) take the user-selected ROI from each frame and play the processed segment back in the player. (A rough native sketch of the same per-frame ROI idea follows after these comments.) Cutlery
@Cutlery Which OpenCV library did you use for this? Can you provide a GitHub link? Retrospect
@FarazAhmedKhan I used OpenCV (github.com/opencv/opencv) at that time. You can find compiled libraries of it for your Xcode project. Cutlery
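
For reference, the per-frame ROI idea described in the comments above can also be sketched in native Swift with AVPlayerItemVideoOutput and Core Image instead of OpenCV. This is a rough, untested illustration, not the commenter's actual code: the class name, `previewLayer`, and `roiInVideoPixels` are made up for the example, `roiInVideoPixels` is assumed to be the user's selection already mapped into the video's pixel coordinates, and a CVDisplayLink would be a better frame driver than an NSTimer.

import AVFoundation
import QuartzCore

// Rough, untested sketch: pull decoded frames from the player, crop each one
// to the user's region of interest with Core Image, and push the result into
// a small preview layer.
class ROIFramePreview: NSObject {
    let videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes:
        [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)])
    let ciContext = CIContext(options: nil)
    let previewLayer = CALayer()       // the 100x100 layer at the bottom right
    var roiInVideoPixels = CGRectZero  // the user's selection, in video pixels (assumed)

    func attachTo(playerItem: AVPlayerItem) {
        playerItem.addOutput(videoOutput)
        // poll roughly at the frame rate; a CVDisplayLink would be more accurate
        NSTimer.scheduledTimerWithTimeInterval(1.0 / 30.0, target: self,
            selector: "renderFrame", userInfo: nil, repeats: true)
    }

    @objc func renderFrame() {
        let time = videoOutput.itemTimeForHostTime(CACurrentMediaTime())
        if !videoOutput.hasNewPixelBufferForItemTime(time) {
            return
        }
        if let buffer = videoOutput.copyPixelBufferForItemTime(time, itemTimeForDisplay: nil) {
            // crop the decoded frame to the ROI and display it
            let frameImage = CIImage(CVImageBuffer: buffer)
            let cropped = frameImage.imageByCroppingToRect(roiInVideoPixels)
            let cgImage = ciContext.createCGImage(cropped, fromRect: cropped.extent)
            previewLayer.contents = cgImage
        }
    }
}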
