Crop area different from selected area in iOS?

Here is a link to the project on GitHub so you can try it yourself and see what I am talking about: https://github.com/spennyf/cropVid/tree/master. It only takes a minute to test. Thanks!

I am recording a video with a square overlay that shows which part of the video will be cropped, like this:

[screenshot: camera preview with the square overlay]

Right now I am filming a piece of paper positioned so that exactly 4 lines fit inside the square, with half a line of space at the top and bottom. I then crop the video using the code posted below, but when I display the result I see this (ignore the background and the green circle):

[screenshot: the cropped video, showing more than four lines]

As you can see, there are more than four lines. I set it to crop a specific area, but it includes extra content, even though the rectangle displayed over the camera and the rectangle used for the crop are one and the same.

So my question is: why is the cropped area not the same size as the selected area?

Here is how I do the crop and the display:

//this is the square shown on the camera overlay
UIView *view = [[UIView alloc] initWithFrame:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height - 80)];
UIImageView *image = [[UIImageView alloc] init];
image.layer.borderColor = [[UIColor whiteColor] CGColor];
image.frame = CGRectMake(self.view.frame.size.width / 2 - 58, 100, 116, 116);
CALayer *imageLayer = image.layer;
[imageLayer setBorderWidth:1];
[view addSubview:image];
[picker setCameraOverlayView:view];

//this is the crop rect
CGRect rect = CGRectMake(self.view.frame.size.width / 2 - 58, 100, 116, 116);
[self applyCropToVideoWithAsset:assest AtRect:rect OnTimeRange:CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(assest.duration.value, 1))
                    ExportToUrl:exportUrl ExistingExportSession:exporter WithCompletion:^(BOOL success, NSError *error, NSURL *videoUrl) {
    //here is the player
    AVPlayer *player = [AVPlayer playerWithURL:videoUrl];
    AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:player];
    layer.frame = CGRectMake(self.view.frame.size.width / 2 - 58, 100, 116, 116);
}];
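
In case it matters, here is how I understand the mapping from the overlay rect (which is in view points) to the video's pixel space ought to work, since the camera preview is aspect-filled and the video's naturalSize is much bigger than the view. This is only a sketch with made-up names (previewSize, videoSize, and the helper itself are not in the project):

// Sketch only: map a rect given in preview points into video pixels,
// assuming the preview aspect-fills previewSize and that videoSize is
// the track's naturalSize with its orientation already applied.
- (CGRect)videoCropRectForOverlayRect:(CGRect)overlayRect
                          previewSize:(CGSize)previewSize
                            videoSize:(CGSize)videoSize
{
    // Points-to-pixels scale for an aspect-filled preview.
    CGFloat scale = MIN(videoSize.width / previewSize.width,
                        videoSize.height / previewSize.height);

    // The part of the frame that is actually visible in the preview (in pixels);
    // aspect fill centers the frame, so the rest is cut off symmetrically.
    CGFloat visibleWidth = previewSize.width * scale;
    CGFloat visibleHeight = previewSize.height * scale;
    CGFloat offsetX = (videoSize.width - visibleWidth) / 2.0;
    CGFloat offsetY = (videoSize.height - visibleHeight) / 2.0;

    return CGRectMake(overlayRect.origin.x * scale + offsetX,
                      overlayRect.origin.y * scale + offsetY,
                      overlayRect.size.width * scale,
                      overlayRect.size.height * scale);
}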

And here is the code that does the crop, starting with the helper that determines the video orientation:

- (UIImageOrientation)getVideoOrientationFromAsset:(AVAsset *)asset
{
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    CGSize size = [videoTrack naturalSize];
    CGAffineTransform txf = [videoTrack preferredTransform];

    if (size.width == txf.tx && size.height == txf.ty)
        return UIImageOrientationLeft;  //UIInterfaceOrientationLandscapeLeft
    else if (txf.tx == 0 && txf.ty == 0)
        return UIImageOrientationRight; //UIInterfaceOrientationLandscapeRight
    else if (txf.tx == 0 && txf.ty == size.width)
        return UIImageOrientationDown;  //UIInterfaceOrientationPortraitUpsideDown
    else
        return UIImageOrientationUp;    //UIInterfaceOrientationPortrait
}
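
Just for reference, here is the kind of quick check I can run to see what the helper above detects for a given recording (purely a debugging sketch; someVideoUrl is a placeholder, not a variable from the project):

// Debugging sketch: print the raw track geometry so the detected
// orientation can be sanity-checked against the actual recording.
AVAsset *debugAsset = [AVAsset assetWithURL:someVideoUrl];
AVAssetTrack *track = [[debugAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];
CGAffineTransform t = [track preferredTransform];
NSLog(@"naturalSize: %@", NSStringFromCGSize([track naturalSize]));
NSLog(@"preferredTransform: a=%f b=%f c=%f d=%f tx=%f ty=%f", t.a, t.b, t.c, t.d, t.tx, t.ty);
NSLog(@"detected orientation: %ld", (long)[self getVideoOrientationFromAsset:debugAsset]);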

And here is the rest of the cropping code:

- (AVAssetExportSession*)applyCropToVideoWithAsset:(AVAsset*)asset AtRect:(CGRect)cropRect OnTimeRange:(CMTimeRange)cropTimeRange ExportToUrl:(NSURL*)outputUrl ExistingExportSession:(AVAssetExportSession*)exporter WithCompletion:(void(^)(BOOL success, NSError* error, NSURL* videoUrl))completion
{

//    NSLog(@"CALLED");
//create an avassetrack with our asset
AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

//create a video composition and preset some settings
AVMutableVideoComposition* videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.frameDuration = CMTimeMake(1, 30);

CGFloat cropOffX = cropRect.origin.x;
CGFloat cropOffY = cropRect.origin.y;
CGFloat cropWidth = cropRect.size.width;
CGFloat cropHeight = cropRect.size.height;
//    NSLog(@"width: %f - height: %f - x: %f - y: %f", cropWidth, cropHeight, cropOffX, cropOffY);

videoComposition.renderSize = CGSizeMake(cropWidth, cropHeight);

//create a video instruction
AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
instruction.timeRange = cropTimeRange;

AVMutableVideoCompositionLayerInstruction* transformer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack];

UIImageOrientation videoOrientation = [self getVideoOrientationFromAsset:asset];

CGAffineTransform t1 = CGAffineTransformIdentity;
CGAffineTransform t2 = CGAffineTransformIdentity;

switch (videoOrientation) {
    case UIImageOrientationUp:
        t1 = CGAffineTransformMakeTranslation(clipVideoTrack.naturalSize.height - cropOffX, 0 - cropOffY );
        t2 = CGAffineTransformRotate(t1, M_PI_2 );
        break;
    case UIImageOrientationDown:
        t1 = CGAffineTransformMakeTranslation(0 - cropOffX, clipVideoTrack.naturalSize.width - cropOffY ); // note: naturalSize.width is the real height when the video is upside down
        t2 = CGAffineTransformRotate(t1, - M_PI_2 );
        break;
    case UIImageOrientationRight:
        t1 = CGAffineTransformMakeTranslation(0 - cropOffX, 0 - cropOffY );
        t2 = CGAffineTransformRotate(t1, 0 );
        break;
    case UIImageOrientationLeft:
        t1 = CGAffineTransformMakeTranslation(clipVideoTrack.naturalSize.width - cropOffX, clipVideoTrack.naturalSize.height - cropOffY );
        t2 = CGAffineTransformRotate(t1, M_PI  );
        break;
    default:
        NSLog(@"no supported orientation has been found in this video");
        break;
}

CGAffineTransform finalTransform = t2;
[transformer setTransform:finalTransform atTime:kCMTimeZero];

//add the transformer layer instructions, then add to video composition
instruction.layerInstructions = [NSArray arrayWithObject:transformer];
videoComposition.instructions = [NSArray arrayWithObject: instruction];

//Remove any previous video at that path
[[NSFileManager defaultManager]  removeItemAtURL:outputUrl error:nil];

if (!exporter){
    exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality] ;
}
// assign all instructions for the video processing (in this case the transformation for cropping the video)
exporter.videoComposition = videoComposition;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
if (outputUrl){
    exporter.outputURL = outputUrl;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        switch ([exporter status]) {
            case AVAssetExportSessionStatusFailed:
                NSLog(@"crop Export failed: %@", [[exporter error] localizedDescription]);
                if (completion){
                    dispatch_async(dispatch_get_main_queue(), ^{
                        completion(NO,[exporter error],nil);
                    });
                    return;
                }
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"crop Export canceled");
                if (completion){
                    dispatch_async(dispatch_get_main_queue(), ^{
                        completion(NO,nil,nil);
                    });
                    return;
                }
                break;
            default:
                break;
        }
        if (completion){
            dispatch_async(dispatch_get_main_queue(), ^{
                completion(YES,nil,outputUrl);
            });
        }

    }];
}

return exporter;
}
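
To double-check what the exporter actually produces, I can read back the exported movie's track size in the completion block (again only a sketch, using the videoUrl that comes back):

// Sketch: confirm that the rendered size of the exported movie matches
// the 116x116 crop rect that was requested.
AVAsset *exported = [AVAsset assetWithURL:videoUrl];
AVAssetTrack *exportedTrack = [[exported tracksWithMediaType:AVMediaTypeVideo] firstObject];
NSLog(@"exported naturalSize: %@", NSStringFromCGSize([exportedTrack naturalSize]));
NSLog(@"exported duration: %f seconds", CMTimeGetSeconds([exported duration]));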

So my question is: why is the video area not the same as the crop/camera area, when I have used exactly the same coordinates and the same square size?

Elyseelysee answered 4/4, 2015 at 19:3 Comment(5)
Just to be sure: once the cropped video has been produced (in the completion block) it should be saved to the iPhone's disk. Please check that file directly, i.e. access the file (connect the iPhone to the Mac and use a tool like iExplorer or iFunBox), then copy it to the Mac and open it with the default Mac QuickTime Player. That way you'll know whether the resulting cropped video is exactly what you see in that square. Also, be sure that the crop area uses the proper coordinates relative to the referenced view, for both the x and y axes. (Ockeghem)
@LucaIaco Okay, I am using iExplorer; I put the video on my Mac and played it with QuickTime, and the cropped area is still not correct. I have looked at the coordinates again and again and I am sure they are right. I am going to make a GitHub project and post the link, so you can download it, run it, and see for yourself, if you don't mind. Right now I take a video of a green square, with just the square inside the cropped part, but then I see white once it's cropped. I would really appreciate it if you looked at the project. (Elyseelysee)
Here is the correct link: github.com/spennyf/cropVid (Elyseelysee)
@LucaIaco Were you able to try it for yourself? (Elyseelysee)
I'll try it as soon as possible ;) (Ockeghem)

Maybe check this previous question.

It looks like it might be similar to what you are experiencing. A user on that question suggested cropping this way:

CGImageRef imageRef = CGImageCreateWithImageInRect([originalImage CGImage], cropRect);
UIImage *croppedImage = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
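
One caveat I would add (my assumption, not something stated in the question): CGImageCreateWithImageInRect works in the CGImage's pixel space, so on a Retina image the rect coming from the UI usually needs to be scaled by the image's scale first, roughly like this:

// Scale a rect given in points into the CGImage's pixel coordinates
// before cropping, and preserve the original scale/orientation.
CGFloat scale = originalImage.scale;
CGRect pixelCropRect = CGRectMake(cropRect.origin.x * scale,
                                  cropRect.origin.y * scale,
                                  cropRect.size.width * scale,
                                  cropRect.size.height * scale);
CGImageRef imageRef = CGImageCreateWithImageInRect([originalImage CGImage], pixelCropRect);
UIImage *croppedImage = [UIImage imageWithCGImage:imageRef
                                            scale:originalImage.scale
                                      orientation:originalImage.imageOrientation];
CGImageRelease(imageRef);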

I hope that this helps or at least gives you a start in the right direction.

Dinse answered 15/4, 2015 at 14:0 Comment(3)
This answer is completely irrelevant. The question asks about cropping a video, not an image. (Bight)
Sorry about that, I was up pretty late and definitely misread this one! Thanks for the heads up. (Dinse)
Haha, np, happens to all of us. (Bight)
