[Algorithm] Adding a Watermark to a Video on iOS: Two Composition Approaches (Whole-Video Rendering and Segmented Rendering)


Suppose there is a video on the phone and an app needs to add a watermark to it. iOS provides an API for overlaying a CALayer on a video, so adding the watermark itself is quite convenient. After the watermark is added, however, the video has to be re-rendered, and on a phone this is fairly slow: rendering a one-minute HD video (960x540) takes around 20 seconds. How to speed up rendering on top of the existing API, and thereby improve the user experience, became the problem. The author found two rendering approaches.

First look at the figure: here is a 6-second video from which I grabbed four keyframes. A caption is added only on keyframes 2 and 3 (each keyframe represents 1.5 seconds, so two keyframes represent 3 seconds of playback).

[Figure: four keyframes grabbed from the 6-second video, with captions only on keyframes 2 and 3]

Approach 1: split the video + render each segment + merge
    Split the video into three segments: keyframe 1, keyframes 2-3, and keyframe 4. Segments with a watermark form one group, segments without form another. Render only the segment covering keyframes 2 and 3, then merge the three segments back together (a sketch of the split step follows below).
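The split code is not shown in the article. Here is a minimal sketch, assuming a pass-through export is acceptable (no re-encoding, which is why the split step is nearly free); the helper name is hypothetical:

- (void)exportSegmentOfAsset:(AVAsset *)asset
                       start:(CMTime)start
                    duration:(CMTime)duration
                       toURL:(NSURL *)outputURL
                  completion:(void (^)(AVAssetExportSession *session))completion {
    // AVAssetExportPresetPassthrough copies samples without re-encoding them.
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:asset
                                                                      presetName:AVAssetExportPresetPassthrough];
    exporter.outputURL = outputURL;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.timeRange = CMTimeRangeMake(start, duration); // export only this sub-range
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        if (completion) completion(exporter);
    }];
}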
Approach 2: whole-video rendering
    The watermark does not appear from the very beginning, but we can attach an animation to the layer so it fades in and out at the right times (see the Objective-C code below).
For example, in the few-watermarks case, say only the first keyframe gets a watermark, segmented rendering takes 3.0 seconds and whole-video rendering takes 3.5 seconds. In the many-watermarks case, with watermarks on keyframes 1, 2 and 3, segmented rendering takes 4.1 seconds while whole-video rendering still takes 3.5 seconds. One conclusion falls out of these numbers: whole-video rendering costs the same no matter how many watermarks there are. So why does segmented rendering vary?
Segmented rendering involves three steps: 1. splitting, 2. rendering the watermarked segments, 3. merging the segments. Each step takes time. In my tests the split step takes so little time it can be ignored, which leaves rendering and merging. Take the 6-second video above with one watermark versus three: the merge time is identical, since in both cases there are two segments to join (one of 1.5 seconds and one of 4.5 seconds), but the more watermarked footage there is, the longer the render step takes.
The final conclusion: with few watermarks, use segmented rendering; with many watermarks, use whole-video rendering. (A sketch of the merge step follows.)
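The merge step is not shown in the article either. A minimal sketch, assuming the rendered segment files are passed in playback order; the helper name is hypothetical, and audio and error handling are omitted for brevity:

- (AVMutableComposition *)compositionByConcatenatingSegments:(NSArray<AVAsset *> *)segments {
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                     preferredTrackID:kCMPersistentTrackID_Invalid];
    CMTime cursor = kCMTimeZero;
    for (AVAsset *segment in segments) {
        AVAssetTrack *sourceTrack = [segment tracksWithMediaType:AVMediaTypeVideo].firstObject;
        if (!sourceTrack) continue;
        // Append each segment's video track at the current end of the composition.
        [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, segment.duration)
                            ofTrack:sourceTrack
                             atTime:cursor
                              error:nil];
        cursor = CMTimeAdd(cursor, segment.duration);
    }
    return composition;
}

The resulting composition can then be exported with an AVAssetExportSession, just as in the whole-video code below.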

Whole-video rendering code:

#define kEffectVideoFileName_Animation @"tmpMov-effect.mov"

- (void)renderWholeVideo:(AVAsset *)asset {
    // 1 - Early exit if there's no video file selected
    if (!asset) {
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error" message:@"Please Load a Video Asset First"
                                                       delegate:nil cancelButtonTitle:@"OK" otherButtonTitles:nil];
        [alert show];
        return;
    }
    
    // 2 - Create AVMutableComposition object. This object will hold your AVMutableCompositionTrack instances.
    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
    
    // 3 - Video track
    AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                        ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                         atTime:kCMTimeZero error:nil];
    
    // Guard against assets without an audio track to avoid an out-of-bounds crash.
    NSArray *audioTracks = [asset tracksWithMediaType:AVMediaTypeAudio];
    if (audioTracks.count > 0) {
        AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:audioTracks[0] atTime:kCMTimeZero error:nil];
    }
    
    // 3.1 - Create AVMutableVideoCompositionInstruction
    AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);
    
    // 3.2 - Create an AVMutableVideoCompositionLayerInstruction for the video track and fix the orientation.
    __block CGSize naturalSize;
    AVMutableVideoCompositionLayerInstruction *videolayerInstruction = [self transformVideo:asset track:videoTrack isVideoAssetPortrait:^(CGSize finalSize) {
        naturalSize = finalSize;
    }];
    [videolayerInstruction setOpacity:0.0 atTime:asset.duration];
    
    // 3.3 - Add instructions
    mainInstruction.layerInstructions = [NSArray arrayWithObjects:videolayerInstruction,nil];
    
    AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];

    float renderWidth, renderHeight;
    renderWidth = naturalSize.width;
    renderHeight = naturalSize.height;
    mainCompositionInst.renderSize = CGSizeMake(renderWidth, renderHeight);
    mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
    mainCompositionInst.frameDuration = CMTimeMake(1, 30);

    [self applyVideoEffectsWithAnimation:mainCompositionInst size:naturalSize];

    // 4 - Get the output path
    NSString *myPathDocs = [NSTemporaryDirectory() stringByAppendingPathComponent:kEffectVideoFileName_Animation];
    NSURL *url = [NSURL fileURLWithPath:myPathDocs];
    /* Remove any old file first */
    [PublicUIMethod removeFile:url];
    
    // 5 - Create exporter
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                      presetName:AVAssetExportPresetHighestQuality];
    [self.exportSessions addObject:exporter];
    exporter.outputURL=url;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;
    exporter.videoComposition = mainCompositionInst;
    weakifyself; // project macro, paired with strongifyself below: the usual weak/strong self dance
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        strongifyself;
        dispatch_async(dispatch_get_main_queue(), ^{
            switch ([exporter status]) {
                case AVAssetExportSessionStatusFailed:
                    DDLogWarn(@"render Export failed: %@ and order : %d", [exporter error], 0);
                    break;
                case AVAssetExportSessionStatusCancelled:
                    NSLog(@"render Export canceled order : %d", 0);
                    break;
                default:
                {
                    NSLog(@"'%@' render finish",[myPathDocs lastPathComponent]);
                    [self pushToPreviePage:myPathDocs];
                }
                    break;
            }
        });
    }];
    [self monitorSingleExporter:exporter];
}

- (void)applyVideoEffectsWithAnimation:(AVMutableVideoComposition *)composition size:(CGSize)size
{
    // Set up layer
    CALayer *parentLayer = [CALayer layer];
    CALayer *videoLayer = [CALayer layer];
    parentLayer.frame = CGRectMake(0, 0, size.width, size.height);
    videoLayer.frame = CGRectMake(0, 0, size.width, size.height);
    [parentLayer addSublayer:videoLayer];
    
    
    CMTime timeFrame = [self frameTime];
    CGFloat granularity = CMTimeGetSeconds(timeFrame); /* seconds per keyframe (1.5s in this article) */
    /* Caption layers */
    for (int j=0; j<self.effectsArray.count; j++) {
        NSArray* effectSeries = (NSArray *)self.effectsArray[j];
        FSVideoCaptionDescriptionModel *description = [[effectSeries firstObject] as:FSVideoCaptionDescriptionModel.class];
        NSArray *captions = [description reOrder];
        if (!captions || captions.count == 0) {
            // No captions in this group, nothing to render.
            continue;
        }
        
        FSCaptionModel *captionModel = captions.firstObject;
        UIImage *image = captionModel.image; /* the caption is rasterized to an image and used as the layer contents */
        CGFloat scaleY = captionModel.scaleY;
        CGFloat scaleHeight = captionModel.scaleHeight;
        CALayer *layer = [CALayer layer];
        layer.frame = CGRectMake(0, size.height * scaleY, size.width, size.height * scaleHeight);
        layer.contents = (__bridge id)image.CGImage;
        
        /*
         The caption animation has two parts:
         1. Show the caption <held as the initial state before the animation begins; the animation duration is 0 and it is not removed on completion>.
         2. Hide the caption again at the end.
         */
        CGFloat showStartTime = description.startIndex * granularity;
        CGFloat hiddenAgainStartTime = showStartTime + effectSeries.count*granularity;
        
        CABasicAnimation *animation = nil;
        if (showStartTime > 0) { /* a beginTime of 0 is replaced by CACurrentMediaTime; use AVCoreAnimationBeginTimeAtZero if an animation must start at time 0 */
            animation = [CABasicAnimation animationWithKeyPath:@"opacity"];
            animation.timingFunction = [CAMediaTimingFunction functionWithName:kCAMediaTimingFunctionDefault];
            [animation setFromValue:[NSNumber numberWithFloat:0.0]];
            [animation setToValue:[NSNumber numberWithFloat:1.0]];
            [animation setBeginTime:showStartTime];
            [animation setFillMode:kCAFillModeBackwards]; /* must be backwards so the layer holds the fromValue (hidden) before the animation begins */
            [animation setRemovedOnCompletion:NO]; /* must be NO so the final state persists in the rendered video */
            [layer addAnimation:animation forKey:@"animateOpacityShow"];
        }
        /* The last caption segment may not be a full 1.5s (or 5s), so skip the hide animation for it */
        if (j != self.effectsArray.count-1) {
            animation = [CABasicAnimation animationWithKeyPath:@"opacity"];
            animation.timingFunction = [CAMediaTimingFunction functionWithName:kCAMediaTimingFunctionDefault];
            [animation setFromValue:[NSNumber numberWithFloat:1.0]];
            [animation setToValue:[NSNumber numberWithFloat:0.0]];
            [animation setBeginTime:hiddenAgainStartTime];
            [animation setRemovedOnCompletion:NO]; /* must be NO */
            [animation setFillMode:kCAFillModeForwards];
            [layer addAnimation:animation forKey:@"animateOpacityHiddenAgain"];
        }
        
        [parentLayer addSublayer:layer];
    }
    
    parentLayer.geometryFlipped = YES; /* flip the layer tree to match the video's coordinate system */
    composition.animationTool = [AVVideoCompositionCoreAnimationTool
                                 videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
}

- (void)monitorSingleExporter:(AVAssetExportSession *)exporter{
    double delay = 1.0;
    int64_t delta = (int64_t)(delay * NSEC_PER_SEC); // cast after multiplying so fractional delays are not truncated
    dispatch_time_t poptime = dispatch_time(DISPATCH_TIME_NOW, delta);
    dispatch_after(poptime, dispatch_get_main_queue(), ^{
        if (exporter.status == AVAssetExportSessionStatusExporting) {
            NSLog(@"whole progress is %f",  exporter.progress);
            [self monitorSingleExporter:exporter];
        }
    });
}
- (AVMutableVideoCompositionLayerInstruction *)transformVideo:(AVAsset *)asset track:(AVMutableCompositionTrack *)firstTrack isVideoAssetPortrait:(void (^)(CGSize size))block {
    AVMutableVideoCompositionLayerInstruction *videolayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:firstTrack];
    
    
    AVAssetTrack *videoAssetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    UIImageOrientation videoAssetOrientation_  = UIImageOrientationUp;
    BOOL isVideoAssetPortrait_  = NO;
    CGAffineTransform videoTransform = videoAssetTrack.preferredTransform;
    if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
        videoAssetOrientation_ = UIImageOrientationRight;
        videoTransform = CGAffineTransformMakeRotation(M_PI_2);

        videoTransform = CGAffineTransformTranslate(videoTransform, 0, -videoAssetTrack.naturalSize.height);
        isVideoAssetPortrait_ = YES;
    }
    if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
        videoAssetOrientation_ =  UIImageOrientationLeft;
        // This part is nasty: it involves a coordinate system that tools like Reveal cannot show.
        videoTransform = CGAffineTransformMakeRotation(-M_PI_2);
        videoTransform = CGAffineTransformTranslate(videoTransform, - videoAssetTrack.naturalSize.width, 0);
        isVideoAssetPortrait_ = YES;
    }
    if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
        videoAssetOrientation_ =  UIImageOrientationUp;
    }
    if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
        videoTransform = CGAffineTransformMakeRotation(-M_PI);
        videoTransform = CGAffineTransformTranslate(videoTransform, -videoAssetTrack.naturalSize.width, -videoAssetTrack.naturalSize.height);
        videoAssetOrientation_ = UIImageOrientationDown;
    }
    [videolayerInstruction setTransform:videoTransform atTime:kCMTimeZero];
    
    CGSize naturalSize;
    if(isVideoAssetPortrait_){
        naturalSize = CGSizeMake(videoAssetTrack.naturalSize.height, videoAssetTrack.naturalSize.width);
    } else {
        naturalSize = videoAssetTrack.naturalSize;
    }
    
    
    if(block){
        block(naturalSize);
    }
    return videolayerInstruction;
}
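
For completeness, a hypothetical call site for the whole-video path; the sample file name here is an assumption:

NSString *path = [[NSBundle mainBundle] pathForResource:@"sample" ofType:@"mov"]; // hypothetical bundled video
AVAsset *asset = [AVAsset assetWithURL:[NSURL fileURLWithPath:path]];
[self renderWholeVideo:asset];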


Author: hherima, published 2017/5/25 14:49:40
