iOS Audio and Video
-- An overview of iOS multimedia

With the growth of the mobile internet, phones have long since moved beyond calls and text messages: playing music and video, recording audio, and taking photos are everyday features. iOS has very strong multimedia support, providing multiple sets of APIs for audio and video playback and recording as well as for the microphone and camera. This article walks through each of these topics.

Audio

In iOS, audio playback can be divided by form into sound-effect playback and music playback. The former covers short clips, usually played as accents, which need no progress or loop control. The latter covers longer audio, usually the main track, whose playback typically needs precise control. iOS uses AudioToolbox.framework to play sound effects and AVFoundation.framework to play music.

Sound effects

AudioToolbox.framework is a C-based framework. Playing a sound effect with it essentially means registering a short audio clip with the System Sound Service. The System Sound Service is a simple, low-level playback service, but it comes with some limitations:

- sounds must be no longer than about 30 seconds;
- the audio must be linear PCM or IMA4 data in a CAF, AIF, or WAV file;
- playback starts immediately and cannot be paused, looped, or repositioned;
- volume follows the system sound volume and cannot be controlled from the API.
Using the System Sound Service to play a sound effect takes the following steps:

1. Call AudioServicesCreateSystemSoundID() with the audio file's URL to register the sound and obtain a SystemSoundID.
2. Optionally call AudioServicesAddSystemSoundCompletion() to register a playback-completion callback.
3. Call AudioServicesPlaySystemSound() to play the sound (or AudioServicesPlayAlertSound() to play it with vibration).
Here is a simple example program:

//
//  KCMainViewController.m
//  Audio
//
//  Created by Kenshin Cui on 14/03/30.
//  Copyright (c) 2014 cmjstudio. All rights reserved.
//  Sound-effect playback

#import "KCMainViewController.h"
#import <AudioToolbox/AudioToolbox.h>

@interface KCMainViewController ()
@end

@implementation KCMainViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    [self playSoundEffect:@"videoRing.caf"];
}

/**
 * Playback-completion callback
 *
 * @param soundID    system sound ID
 * @param clientData data passed to the callback
 */
void soundCompleteCallback(SystemSoundID soundID, void *clientData) {
    NSLog(@"Playback finished...");
}

/**
 * Play a sound-effect file
 *
 * @param name audio file name
 */
- (void)playSoundEffect:(NSString *)name {
    NSString *audioFile = [[NSBundle mainBundle] pathForResource:name ofType:nil];
    NSURL *fileUrl = [NSURL fileURLWithPath:audioFile];
    // 1. Obtain a system sound ID
    SystemSoundID soundID = 0;
    /**
     * inFileUrl:        audio file url
     * outSystemSoundID: sound ID (this call registers the file with the
     *                   System Sound Service and returns an integer ID)
     */
    AudioServicesCreateSystemSoundID((__bridge CFURLRef)(fileUrl), &soundID);
    // To run code when playback finishes, register a completion callback:
    AudioServicesAddSystemSoundCompletion(soundID, NULL, NULL, soundCompleteCallback, NULL);
    // 2. Play the sound
    AudioServicesPlaySystemSound(soundID);    // play the effect
//    AudioServicesPlayAlertSound(soundID);   // play the effect and vibrate
}

@end

Music

If you need to play longer audio, or to control playback precisely, the System Sound Service can hardly meet the need; in that case the usual choice is AVAudioPlayer from AVFoundation.framework. AVAudioPlayer can be thought of as a player: it supports many audio formats and offers control over progress, volume, playback rate, and more. First, a quick look at AVAudioPlayer's common properties and methods:
Using AVAudioPlayer is fairly simple:

1. Initialize the player with a local file URL (AVAudioPlayer does not support HTTP URLs).
2. Set player properties such as numberOfLoops, volume, and delegate as needed.
3. Call prepareToPlay to buffer the audio, then use play, pause, and stop to control playback.
Below, AVAudioPlayer is used to build a simple player with play, pause, and progress display. Other features, such as volume control, loop modes, or even a waveform view (by analyzing decibel levels), could be implemented as well but are not demonstrated here. The interface looks like this:

Since AVAudioPlayer can only play one audio file at a time, previous/next-track support could be added by creating multiple player objects; that is left out here. The playback progress is driven by a timer that periodically computes the ratio of the current time to the total duration. To demonstrate delegation, the code also implements the playback-finished delegate method; with a next-track feature, this is where the next song would normally be started. The main code:

//
//  ViewController.m
//  KCAVAudioPlayer
//
//  Created by Kenshin Cui on 14/03/30.
//  Copyright (c) 2014 cmjstudio. All rights reserved.
//

#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>

#define kMusicFile @"刘若英 - 原来你也在这里.mp3"
#define kMusicSinger @"刘若英"
#define kMusicTitle @"原来你也在这里"

@interface ViewController ()<AVAudioPlayerDelegate>

@property (nonatomic,strong) AVAudioPlayer *audioPlayer;           // player
@property (weak, nonatomic) IBOutlet UILabel *controlPanel;        // control panel
@property (weak, nonatomic) IBOutlet UIProgressView *playProgress; // playback progress
@property (weak, nonatomic) IBOutlet UILabel *musicSinger;         // artist
@property (weak, nonatomic) IBOutlet UIButton *playOrPause;        // play/pause button (tag 0 = paused, 1 = playing)
@property (weak, nonatomic) NSTimer *timer;                        // progress-update timer

@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    [self setupUI];
}

/**
 * Initialize the UI
 */
- (void)setupUI {
    self.title = kMusicTitle;
    self.musicSinger.text = kMusicSinger;
}

- (NSTimer *)timer {
    if (!_timer) {
        _timer = [NSTimer scheduledTimerWithTimeInterval:0.5 target:self selector:@selector(updateProgress) userInfo:nil repeats:true];
    }
    return _timer;
}

/**
 * Create the player
 *
 * @return audio player
 */
- (AVAudioPlayer *)audioPlayer {
    if (!_audioPlayer) {
        NSString *urlStr = [[NSBundle mainBundle] pathForResource:kMusicFile ofType:nil];
        NSURL *url = [NSURL fileURLWithPath:urlStr];
        NSError *error = nil;
        // Initialize the player; note that the URL must be a file URL, HTTP URLs are not supported
        _audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
        if (error) {
            NSLog(@"Error initializing the player: %@", error.localizedDescription);
            return nil;
        }
        // Configure the player
        _audioPlayer.numberOfLoops = 0; // 0 means no looping
        _audioPlayer.delegate = self;
        [_audioPlayer prepareToPlay];   // load the audio file into the buffer
    }
    return _audioPlayer;
}

/**
 * Start playback
 */
- (void)play {
    if (![self.audioPlayer isPlaying]) {
        [self.audioPlayer play];
        self.timer.fireDate = [NSDate distantPast]; // resume the timer
    }
}

/**
 * Pause playback
 */
- (void)pause {
    if ([self.audioPlayer isPlaying]) {
        [self.audioPlayer pause];
        // Pause the timer; do not call invalidate, which cancels it with no way to resume
        self.timer.fireDate = [NSDate distantFuture];
    }
}

/**
 * Play/pause button tapped
 *
 * @param sender play/pause button
 */
- (IBAction)playClick:(UIButton *)sender {
    if (sender.tag) {
        sender.tag = 0;
        [sender setImage:[UIImage imageNamed:@"playing_btn_play_n"] forState:UIControlStateNormal];
        [sender setImage:[UIImage imageNamed:@"playing_btn_play_h"] forState:UIControlStateHighlighted];
        [self pause];
    } else {
        sender.tag = 1;
        [sender setImage:[UIImage imageNamed:@"playing_btn_pause_n"] forState:UIControlStateNormal];
        [sender setImage:[UIImage imageNamed:@"playing_btn_pause_h"] forState:UIControlStateHighlighted];
        [self play];
    }
}

/**
 * Update the playback progress
 */
- (void)updateProgress {
    float progress = self.audioPlayer.currentTime / self.audioPlayer.duration;
    [self.playProgress setProgress:progress animated:true];
}

#pragma mark - Player delegate methods
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag {
    NSLog(@"Music finished playing...");
}

@end

Running effect:

Audio sessions

The player above actually still has a problem: the players we normally use keep playing after the app moves to the background, while this one pauses automatically. Supporting background playback takes the following:

1. Enable the background mode: in the plist add "Required background modes" and set item 0 to "App plays audio or streams audio/video using AirPlay" (this can also be set directly in Xcode under Project Targets - Capabilities - Background Modes).

2. Set the AVAudioSession category to AVAudioSessionCategoryPlayback and call setActive:error: to activate the session:

AVAudioSession *audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryPlayback error:nil];
[audioSession setActive:YES error:nil];

3. To let headphone controls keep working after the app goes to the background, it is advisable to register for remote-control events (this step is not strictly required for background playback).

The first two steps are required for background playback; the third is about receiving remote events, which a previous article covered in detail. Without it the app can still play in the background, but it cannot take over audio control (if another player was used before the current app, the headphone play button or the Control Center play button plays the previous app's audio), and headphones cannot control playback. The first step should be easy to understand: an app that wants to keep running in the background must declare it, since a backgrounded app is normally suspended, and this setting lets it keep running. The AVAudioSession used in the second step, however, deserves a detailed explanation.
Every iOS app has an audio session, represented by AVAudioSession. AVAudioSession also lives in the AVFoundation framework; it is designed as a singleton and accessed through sharedInstance. When using Apple devices you may have noticed that some apps stop as soon as another app starts playing audio, while others can play alongside other apps; how playback behaves when multiple audio sources compete is controlled through the audio session. These are the session categories:

- AVAudioSessionCategoryAmbient: obeys the mute switch, mixes with other audio, output only.
- AVAudioSessionCategorySoloAmbient (the default): obeys the mute switch, does not mix with other audio, output only.
- AVAudioSessionCategoryPlayback: ignores the mute switch, does not mix by default, output only; can continue in the background.
- AVAudioSessionCategoryRecord: ignores the mute switch, input only.
- AVAudioSessionCategoryPlayAndRecord: ignores the mute switch, input and output.
- AVAudioSessionCategoryAudioProcessing: offline audio processing, no playback or recording.
- AVAudioSessionCategoryMultiRoute: ignores the mute switch, input and output on multiple routes.
Note: "obeys the mute switch" means that if the user flips the hardware switch to silent during playback, the sound is muted.

Given the discussion of audio sessions above, building a player that works in the background should not be difficult, but note that, as the earlier code mentioned, the session only takes effect after being activated with setActive:error:. Similarly, if another app is already playing audio when ours starts and sets a background-playback category, the other audio stops and ours plays; if we want other apps' audio to resume once ours finishes (or the app closes or moves to the background), we can call setActive:error: to deactivate the session. The code:

//
//  ViewController.m
//  KCAVAudioPlayer
//
//  Created by Kenshin Cui on 14/03/30.
//  Copyright (c) 2014 cmjstudio. All rights reserved.
//  AVAudioSession audio sessions

#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>

#define kMusicFile @"刘若英 - 原来你也在这里.mp3"
#define kMusicSinger @"刘若英"
#define kMusicTitle @"原来你也在这里"

@interface ViewController ()<AVAudioPlayerDelegate>

@property (nonatomic,strong) AVAudioPlayer *audioPlayer;           // player
@property (weak, nonatomic) IBOutlet UILabel *controlPanel;        // control panel
@property (weak, nonatomic) IBOutlet UIProgressView *playProgress; // playback progress
@property (weak, nonatomic) IBOutlet UILabel *musicSinger;         // artist
@property (weak, nonatomic) IBOutlet UIButton *playOrPause;        // play/pause button (tag 0 = paused, 1 = playing)
@property (weak, nonatomic) NSTimer *timer;                        // progress-update timer

@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    [self setupUI];
}

/**
 * Register for remote-control events when the view appears
 *
 * @param animated whether the appearance is animated
 */
- (void)viewWillAppear:(BOOL)animated {
    [super viewWillAppear:animated];
    // Begin receiving remote-control events
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    // Become first responder
    //[self becomeFirstResponder];
}

/**
 * Stop receiving remote-control events when the view disappears
 *
 * @param animated whether the disappearance is animated
 */
- (void)viewWillDisappear:(BOOL)animated {
    [super viewWillDisappear:animated];
    [[UIApplication sharedApplication] endReceivingRemoteControlEvents];
    //[self resignFirstResponder];
}

/**
 * Initialize the UI
 */
- (void)setupUI {
    self.title = kMusicTitle;
    self.musicSinger.text = kMusicSinger;
}

- (NSTimer *)timer {
    if (!_timer) {
        _timer = [NSTimer scheduledTimerWithTimeInterval:0.5 target:self selector:@selector(updateProgress) userInfo:nil repeats:true];
    }
    return _timer;
}

/**
 * Create the player
 *
 * @return audio player
 */
- (AVAudioPlayer *)audioPlayer {
    if (!_audioPlayer) {
        NSString *urlStr = [[NSBundle mainBundle] pathForResource:kMusicFile ofType:nil];
        NSURL *url = [NSURL fileURLWithPath:urlStr];
        NSError *error = nil;
        // Initialize the player; note that the URL must be a file URL, HTTP URLs are not supported
        _audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
        if (error) {
            NSLog(@"Error initializing the player: %@", error.localizedDescription);
            return nil;
        }
        // Configure the player
        _audioPlayer.numberOfLoops = 0; // 0 means no looping
        _audioPlayer.delegate = self;
        [_audioPlayer prepareToPlay];   // load the audio file into the buffer

        // Configure the session for background playback
        AVAudioSession *audioSession = [AVAudioSession sharedInstance];
        [audioSession setCategory:AVAudioSessionCategoryPlayback error:nil];
//        [audioSession setCategory:AVAudioSessionCategoryPlayback withOptions:AVAudioSessionCategoryOptionAllowBluetooth error:nil];
        [audioSession setActive:YES error:nil];

        // Observe route changes so playback pauses when headphones are unplugged
        [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(routeChange:) name:AVAudioSessionRouteChangeNotification object:nil];
    }
    return _audioPlayer;
}

/**
 * Start playback
 */
- (void)play {
    if (![self.audioPlayer isPlaying]) {
        [self.audioPlayer play];
        self.timer.fireDate = [NSDate distantPast]; // resume the timer
    }
}

/**
 * Pause playback
 */
- (void)pause {
    if ([self.audioPlayer isPlaying]) {
        [self.audioPlayer pause];
        // Pause the timer; do not call invalidate, which cancels it with no way to resume
        self.timer.fireDate = [NSDate distantFuture];
    }
}

/**
 * Play/pause button tapped
 *
 * @param sender play/pause button
 */
- (IBAction)playClick:(UIButton *)sender {
    if (sender.tag) {
        sender.tag = 0;
        [sender setImage:[UIImage imageNamed:@"playing_btn_play_n"] forState:UIControlStateNormal];
        [sender setImage:[UIImage imageNamed:@"playing_btn_play_h"] forState:UIControlStateHighlighted];
        [self pause];
    } else {
        sender.tag = 1;
        [sender setImage:[UIImage imageNamed:@"playing_btn_pause_n"] forState:UIControlStateNormal];
        [sender setImage:[UIImage imageNamed:@"playing_btn_pause_h"] forState:UIControlStateHighlighted];
        [self play];
    }
}

/**
 * Update the playback progress
 */
- (void)updateProgress {
    float progress = self.audioPlayer.currentTime / self.audioPlayer.duration;
    [self.playProgress setProgress:progress animated:true];
}

/**
 * Called whenever the audio output route changes
 *
 * @param notification route-change notification object
 */
- (void)routeChange:(NSNotification *)notification {
    NSDictionary *dic = notification.userInfo;
    int changeReason = [dic[AVAudioSessionRouteChangeReasonKey] intValue];
    // AVAudioSessionRouteChangeReasonOldDeviceUnavailable means the old output became unavailable
    if (changeReason == AVAudioSessionRouteChangeReasonOldDeviceUnavailable) {
        AVAudioSessionRouteDescription *routeDescription = dic[AVAudioSessionRouteChangePreviousRouteKey];
        AVAudioSessionPortDescription *portDescription = [routeDescription.outputs firstObject];
        // Pause if the previous output was headphones
        if ([portDescription.portType isEqualToString:AVAudioSessionPortHeadphones]) {
            [self pause];
        }
    }
//    [dic enumerateKeysAndObjectsUsingBlock:^(id key, id obj, BOOL *stop) {
//        NSLog(@"%@:%@", key, obj);
//    }];
}

- (void)dealloc {
    [[NSNotificationCenter defaultCenter] removeObserver:self name:AVAudioSessionRouteChangeNotification object:nil];
}

#pragma mark - Player delegate methods
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag {
    NSLog(@"Music finished playing...");
    // Depending on your needs, deactivate the session when playback finishes
    // so that other audio apps can resume playing
    [[AVAudioSession sharedInstance] setActive:NO error:nil];
}

@end

The code above also pauses the music when headphones are unplugged, another common feature. From iOS 7 on, a notification reports output-route changes; its userInfo reveals the kind of change, and the music can be paused accordingly.

Extension: playing music from the music library

As everyone knows, music is a core part of iOS: whether on an iPod, iPod touch, iPhone, or iPad, you can buy music from iTunes or sync local music into the device's library. MediaPlayer.framework contains MPMusicPlayerController for playing music from the library.

First, a look at MPMusicPlayerController's common properties and methods:
The next question is how to obtain an MPMediaQuery or MPMediaItemCollection. MPMediaQuery has a series of class methods for getting media queries, for example:

+ (MPMediaQuery *)albumsQuery;
+ (MPMediaQuery *)artistsQuery;
+ (MPMediaQuery *)songsQuery;
+ (MPMediaQuery *)playlistsQuery;
+ (MPMediaQuery *)podcastsQuery;
+ (MPMediaQuery *)audiobooksQuery;
+ (MPMediaQuery *)compilationsQuery;
+ (MPMediaQuery *)composersQuery;
+ (MPMediaQuery *)genresQuery;

With these methods it is easy to obtain a media query for songs, playlists, albums, and so on, and the music source can then be set with - (void)setQueueWithQuery:(MPMediaQuery *)query. Alternatively, build an MPMediaItemCollection from the query and set the source with - (void)setQueueWithItemCollection:(MPMediaItemCollection *)itemCollection.

Sometimes you may want the user to choose the music to play; MPMediaPickerController serves this purpose. It is a view controller, similar to UIImagePickerController, and once the user has finished choosing, its delegate method hands back an MPMediaItemCollection.

However the media source for MPMusicPlayerController is obtained, you will probably want to display each item's information, which can be read from an MPMediaItem. One MPMediaItem represents one media file, giving access to the title, album name, artwork, duration, and so on. Both MPMediaQuery and MPMediaItemCollection have an items property, an array of MPMediaItem objects.

Below is a simple look at MPMusicPlayerController, demonstrating music selection, play, pause, stop, notifications, next track, and previous track. With the concepts above the code is not hard to read (the example picks music directly with MPMediaPickerController, but two methods, getLocalMediaQuery and getLocalMediaItemCollection, show how to get a media query or media collection directly from MPMediaQuery):

//
//  ViewController.m
//  MPMusicPlayerController
//
//  Created by Kenshin Cui on 14/03/30.
//  Copyright (c) 2014 cmjstudio. All rights reserved.
//

#import "ViewController.h"
#import <MediaPlayer/MediaPlayer.h>

@interface ViewController ()<MPMediaPickerControllerDelegate>

@property (nonatomic,strong) MPMediaPickerController *mediaPicker; // media picker controller
@property (nonatomic,strong) MPMusicPlayerController *musicPlayer; // music player

@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
}

- (void)dealloc {
    [self.musicPlayer endGeneratingPlaybackNotifications];
}

/**
 * Obtain the music player
 *
 * @return music player
 */
- (MPMusicPlayerController *)musicPlayer {
    if (!_musicPlayer) {
        _musicPlayer = [MPMusicPlayerController systemMusicPlayer];
        [_musicPlayer beginGeneratingPlaybackNotifications]; // enable notifications, otherwise MPMusicPlayerController notifications cannot be observed
        [self addNotification]; // register for notifications
        // Without MPMediaPickerController, the library's media queue can be set like this:
        //[_musicPlayer setQueueWithItemCollection:[self getLocalMediaItemCollection]];
    }
    return _musicPlayer;
}

/**
 * Create the media picker
 *
 * @return media picker
 */
- (MPMediaPickerController *)mediaPicker {
    if (!_mediaPicker) {
        // Initialize the picker; the media type here is any, but music, video, podcasts, etc. are also possible
//        _mediaPicker = [[MPMediaPickerController alloc] initWithMediaTypes:MPMediaTypeMusic];
        _mediaPicker = [[MPMediaPickerController alloc] initWithMediaTypes:MPMediaTypeAny];
        _mediaPicker.allowsPickingMultipleItems = YES; // allow multiple selection
//        _mediaPicker.showsCloudItems = YES; // show iCloud items
        _mediaPicker.prompt = @"Please choose the music to play";
        _mediaPicker.delegate = self; // set the picker delegate
    }
    return _mediaPicker;
}

/**
 * Obtain a media query
 *
 * @return media query
 */
- (MPMediaQuery *)getLocalMediaQuery {
    MPMediaQuery *mediaQueue = [MPMediaQuery songsQuery];
    for (MPMediaItem *item in mediaQueue.items) {
        NSLog(@"Title: %@, %@", item.title, item.albumTitle);
    }
    return mediaQueue;
}

/**
 * Obtain a media collection
 *
 * @return media collection
 */
- (MPMediaItemCollection *)getLocalMediaItemCollection {
    MPMediaQuery *mediaQueue = [MPMediaQuery songsQuery];
    NSMutableArray *array = [NSMutableArray array];
    for (MPMediaItem *item in mediaQueue.items) {
        [array addObject:item];
        NSLog(@"Title: %@, %@", item.title, item.albumTitle);
    }
    MPMediaItemCollection *mediaItemCollection = [[MPMediaItemCollection alloc] initWithItems:[array copy]];
    return mediaItemCollection;
}

#pragma mark - MPMediaPickerController delegate methods
// Selection finished
- (void)mediaPicker:(MPMediaPickerController *)mediaPicker didPickMediaItems:(MPMediaItemCollection *)mediaItemCollection {
    MPMediaItem *mediaItem = [mediaItemCollection.items firstObject]; // the first item to play
    // Note that media information such as title, album, artist, artwork, and duration can be read
    // with MPMediaItem's valueForKey: method, but from iOS 7 on there are properties for direct access
//    NSString *title = [mediaItem valueForKey:MPMediaItemPropertyAlbumTitle];
//    NSString *artist = [mediaItem valueForKey:MPMediaItemPropertyAlbumArtist];
//    MPMediaItemArtwork *artwork = [mediaItem valueForKey:MPMediaItemPropertyArtwork];
    //UIImage *image = [artwork imageWithSize:CGSizeMake(100, 100)]; // album artwork
    NSLog(@"Title: %@, artist: %@, album: %@", mediaItem.title, mediaItem.artist, mediaItem.albumTitle);
    [self.musicPlayer setQueueWithItemCollection:mediaItemCollection];
    [self dismissViewControllerAnimated:YES completion:nil];
}

// Selection cancelled
- (void)mediaPickerDidCancel:(MPMediaPickerController *)mediaPicker {
    [self dismissViewControllerAnimated:YES completion:nil];
}

#pragma mark - Notifications
/**
 * Register for notifications
 */
- (void)addNotification {
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    [notificationCenter addObserver:self selector:@selector(playbackStateChange:) name:MPMusicPlayerControllerPlaybackStateDidChangeNotification object:self.musicPlayer];
}

/**
 * Playback-state-change notification
 *
 * @param notification notification object
 */
- (void)playbackStateChange:(NSNotification *)notification {
    switch (self.musicPlayer.playbackState) {
        case MPMusicPlaybackStatePlaying:
            NSLog(@"Playing...");
            break;
        case MPMusicPlaybackStatePaused:
            NSLog(@"Playback paused.");
            break;
        case MPMusicPlaybackStateStopped:
            NSLog(@"Playback stopped.");
            break;
        default:
            break;
    }
}

#pragma mark - UI events
- (IBAction)selectClick:(UIButton *)sender {
    [self presentViewController:self.mediaPicker animated:YES completion:nil];
}
- (IBAction)playClick:(UIButton *)sender {
    [self.musicPlayer play];
}
- (IBAction)pauseClick:(UIButton *)sender {
    [self.musicPlayer pause];
}
- (IBAction)stopClick:(UIButton *)sender {
    [self.musicPlayer stop];
}
- (IBAction)nextClick:(UIButton *)sender {
    [self.musicPlayer skipToNextItem];
}
- (IBAction)prevClick:(UIButton *)sender {
    [self.musicPlayer skipToPreviousItem];
}

@end

Recording

Besides everything above, the AVFoundation framework also has an AVAudioRecorder class dedicated to recording, and it likewise supports many audio formats. Much like AVAudioPlayer, it can be viewed as a recorder control class; common properties and methods:
Many of AVAudioRecorder's properties and methods are similar to AVAudioPlayer's, but its creation differs: besides the save path you must specify recording settings, because the recorder has to know the recording file's format, sample rate, channel count, bits per sample, and so on. Not all of this information must be provided, though; usually a few common settings suffice. For details on recording settings see "AV Foundation Audio Settings Constants" in the documentation.

Below, AVAudioRecorder is used to build a recorder implementing record, pause, stop, and play, with roughly the following effect:

This example implements complete recording control: record, pause, resume, and stop, while displaying the user's input level in real time; when the user taps the stop button the recording plays back automatically. The program is built in the following steps:

1. Set the audio session to AVAudioSessionCategoryPlayAndRecord so the recording can be played back after it is made.
2. Create the recorder with a save URL and a settings dictionary, enabling metering.
3. Control recording with record, pause, and stop, using a timer that reads the average power to drive the level display.
4. In the recorder's delegate method, play the finished recording.
The main code:

//
//  ViewController.m
//  AVAudioRecorder
//
//  Created by Kenshin Cui on 14/03/30.
//  Copyright (c) 2014 cmjstudio. All rights reserved.
//

#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>

#define kRecordAudioFile @"myRecord.caf"

@interface ViewController ()<AVAudioRecorderDelegate>

@property (nonatomic,strong) AVAudioRecorder *audioRecorder;     // audio recorder
@property (nonatomic,strong) AVAudioPlayer *audioPlayer;         // audio player, used to play the recording
@property (nonatomic,strong) NSTimer *timer;                     // level-metering timer (playback is not metered here)
@property (weak, nonatomic) IBOutlet UIButton *record;           // start recording
@property (weak, nonatomic) IBOutlet UIButton *pause;            // pause recording
@property (weak, nonatomic) IBOutlet UIButton *resume;           // resume recording
@property (weak, nonatomic) IBOutlet UIButton *stop;             // stop recording
@property (weak, nonatomic) IBOutlet UIProgressView *audioPower; // audio level

@end

@implementation ViewController

#pragma mark - View controller methods
- (void)viewDidLoad {
    [super viewDidLoad];
    [self setAudioSession];
}

#pragma mark - Private methods
/**
 * Configure the audio session
 */
- (void)setAudioSession {
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    // Use the play-and-record category so the recording can be played back afterwards
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:nil];
    [audioSession setActive:YES error:nil];
}

/**
 * Path where the recording file is saved
 *
 * @return recording file URL
 */
- (NSURL *)getSavePath {
    NSString *urlStr = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject];
    urlStr = [urlStr stringByAppendingPathComponent:kRecordAudioFile];
    NSLog(@"file path: %@", urlStr);
    NSURL *url = [NSURL fileURLWithPath:urlStr];
    return url;
}

/**
 * Recording settings
 *
 * @return settings dictionary
 */
- (NSDictionary *)getAudioSetting {
    NSMutableDictionary *dicM = [NSMutableDictionary dictionary];
    // Recording format
    [dicM setObject:@(kAudioFormatLinearPCM) forKey:AVFormatIDKey];
    // Sample rate: 8000 Hz is telephone quality, good enough for ordinary recording
    [dicM setObject:@(8000) forKey:AVSampleRateKey];
    // Channels: mono here
    [dicM setObject:@(1) forKey:AVNumberOfChannelsKey];
    // Bits per sample: 8, 16, 24, or 32
    [dicM setObject:@(8) forKey:AVLinearPCMBitDepthKey];
    // Whether to use floating-point samples
    [dicM setObject:@(YES) forKey:AVLinearPCMIsFloatKey];
    // ... other settings
    return dicM;
}

/**
 * Obtain the recorder object
 *
 * @return recorder object
 */
- (AVAudioRecorder *)audioRecorder {
    if (!_audioRecorder) {
        // Save path
        NSURL *url = [self getSavePath];
        // Recording settings
        NSDictionary *setting = [self getAudioSetting];
        // Create the recorder
        NSError *error = nil;
        _audioRecorder = [[AVAudioRecorder alloc] initWithURL:url settings:setting error:&error];
        if (error) {
            NSLog(@"Error creating the recorder: %@", error.localizedDescription);
            return nil;
        }
        _audioRecorder.delegate = self;
        _audioRecorder.meteringEnabled = YES; // must be YES to meter the input level
    }
    return _audioRecorder;
}

/**
 * Create the player
 *
 * @return player
 */
- (AVAudioPlayer *)audioPlayer {
    if (!_audioPlayer) {
        NSURL *url = [self getSavePath];
        NSError *error = nil;
        _audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
        if (error) {
            NSLog(@"Error creating the player: %@", error.localizedDescription);
            return nil;
        }
        _audioPlayer.numberOfLoops = 0;
        [_audioPlayer prepareToPlay];
    }
    return _audioPlayer;
}

/**
 * Level-metering timer
 *
 * @return timer
 */
- (NSTimer *)timer {
    if (!_timer) {
        _timer = [NSTimer scheduledTimerWithTimeInterval:0.1f target:self selector:@selector(audioPowerChange) userInfo:nil repeats:YES];
    }
    return _timer;
}

/**
 * Update the recording level display
 */
- (void)audioPowerChange {
    [self.audioRecorder updateMeters]; // refresh the meter values
    float power = [self.audioRecorder averagePowerForChannel:0]; // level of the first channel; the range is -160 to 0 dB
    CGFloat progress = (1.0 / 160.0) * (power + 160.0);
    [self.audioPower setProgress:progress];
}

#pragma mark - UI events
/**
 * Record button tapped
 *
 * @param sender record button
 */
- (IBAction)recordClick:(UIButton *)sender {
    if (![self.audioRecorder isRecording]) {
        [self.audioRecorder record]; // on first use, calling record asks the user for microphone permission
        self.timer.fireDate = [NSDate distantPast];
    }
}

/**
 * Pause button tapped
 *
 * @param sender pause button
 */
- (IBAction)pauseClick:(UIButton *)sender {
    if ([self.audioRecorder isRecording]) {
        [self.audioRecorder pause];
        self.timer.fireDate = [NSDate distantFuture];
    }
}

/**
 * Resume button tapped
 * Resuming just calls record again; AVAudioRecorder remembers where the last
 * recording stopped and appends to it
 *
 * @param sender resume button
 */
- (IBAction)resumeClick:(UIButton *)sender {
    [self recordClick:sender];
}

/**
 * Stop button tapped
 *
 * @param sender stop button
 */
- (IBAction)stopClick:(UIButton *)sender {
    [self.audioRecorder stop];
    self.timer.fireDate = [NSDate distantFuture];
    self.audioPower.progress = 0.0;
}

#pragma mark - Recorder delegate methods
/**
 * Recording finished; play the recording back
 *
 * @param recorder recorder object
 * @param flag     whether recording succeeded
 */
- (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)recorder successfully:(BOOL)flag {
    if (![self.audioPlayer isPlaying]) {
        [self.audioPlayer play];
    }
    NSLog(@"Recording finished!");
}

@end

Running effect:

Audio Queue Services

As you may have noticed, neither the recording nor the playback above supports network streaming. For recording that demand may be small, but for playback it is often essential. AVAudioPlayer can only play local files, loading all of the audio data at once, and the URL supplied at initialization must be a file URL, not an HTTP URL. Downloading an audio file and then playing it with AVAudioPlayer is one way to play network audio, but its biggest drawback is that playback cannot start until the whole file has been downloaded; there is no streaming, which is usually unacceptable in real projects. So how does iOS play streamed media? With Audio Queue Services in the AudioToolbox framework.

Audio Queue Services can handle both audio playback and recording. Consider the recording queue first. An audio queue consists of three parts:

- Three buffers: each buffer is a temporary store for audio data.
- A buffer queue: an ordered queue containing the audio buffers.
- A callback: a custom queue callback function.

Sound enters the buffer queue from the input device, filling the first buffer; once the first buffer is full, the queue automatically fills the next one, and at the same time the callback is invoked. In the callback, the buffer's audio data must be written to disk and the buffer returned to the queue for reuse. Below is Apple's official diagram of the flow:

The playback queue has the same components, but the callback fires at a different time and the flow is reversed. Audio is read into the buffers, and each buffer is enqueued as soon as it is full while the rest keep filling; when playback starts, audio is read and played from the first buffer; once it finishes, the callback fires, the audio in the next buffer starts playing, and the drained buffer is refilled and put back into the queue. The detailed flow:

Understanding the principle of Audio Queue Services is not hard; the difficulty is implementing the custom callback, which involves a lot of work: managing playback state, handling interruptions, audio decoding, and so on. That would take us too far from the goal of this article (time permitting, a later article will focus on it). There are excellent third-party frameworks that can be used directly, such as AudioStreamer and FreeStreamer. Since the former currently exists only in a non-ARC version, FreeStreamer is used below to briefly demonstrate online audio playback. Before using it, some preparation is needed:

1. Copy Reachability.h, Reachability.m and the contents of the Common and astreamer folders from FreeStreamer into the project.
2. Add the libraries FreeStreamer uses: CFNetwork.framework, AudioToolbox.framework, AVFoundation.framework.
3. If referencing libxml2.dylib fails to compile, add $(SDKROOT)/usr/include/libxml2 to Header Search Paths under Targets - Build Settings in Xcode.
4. Add the FreeStreamerMobile-Prefix.pch file from FreeStreamer to the project, set Targets - Build Settings - Precompile Prefix Header to YES, and set Targets - Build Settings - Prefix Header to $(SRCROOT)/<project name>/FreeStreamerMobile-Prefix.pch (Xcode 6 creates no pch file by default).

Then network audio can be played with the following code:

//
//  ViewController.m
//  AudioQueueServices
//
//  Created by Kenshin Cui on 14/03/30.
//  Copyright (c) 2014 cmjstudio. All rights reserved.
//  Network audio playback with FreeStreamer

#import "ViewController.h"
#import "FSAudioStream.h"

@interface ViewController ()

@property (nonatomic,strong) FSAudioStream *audioStream;

@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    [self.audioStream play];
}

/**
 * Local file URL
 *
 * @return file URL
 */
- (NSURL *)getFileUrl {
    NSString *urlStr = [[NSBundle mainBundle] pathForResource:@"刘若英 - 原来你也在这里.mp3" ofType:nil];
    NSURL *url = [NSURL fileURLWithPath:urlStr];
    return url;
}

- (NSURL *)getNetworkUrl {
    NSString *urlStr = @"http://192.168.1.102/liu.mp3";
    NSURL *url = [NSURL URLWithString:urlStr];
    return url;
}

/**
 * Create the FSAudioStream object
 *
 * @return FSAudioStream object
 */
- (FSAudioStream *)audioStream {
    if (!_audioStream) {
        NSURL *url = [self getNetworkUrl];
        // Create the FSAudioStream object
        _audioStream = [[FSAudioStream alloc] initWithUrl:url];
        _audioStream.onFailure = ^(FSAudioStreamError error, NSString *description) {
            NSLog(@"Error during playback: %@", description);
        };
        _audioStream.onCompletion = ^() {
            NSLog(@"Playback finished!");
        };
        [_audioStream setVolume:0.5]; // set the volume
    }
    return _audioStream;
}

@end

FreeStreamer is actually quite powerful: beyond playing local and network audio it supports playlists, content inspection, RSS feeds, playback interruptions, and much more, and even includes an audio analyzer; interested readers can find detailed usage on the official site.

Video

MPMoviePlayerController

In iOS, video can be played with the MPMoviePlayerController class in MediaPlayer.framework, which supports both local and network video. The class implements the MPMediaPlayback protocol and therefore offers the usual player controls such as play, pause, and stop. MPMoviePlayerController itself is not a complete view controller, though; to show video in the UI, its view property must be added to the interface. Common properties and methods of MPMoviePlayerController:
Note that MPMoviePlayerController reports its state to the outside world not through a delegate but through the notification center, which is why the list above includes several common notifications. Because MPMoviePlayerController wraps media playback so deeply, it is quite simple to use: create the MPMoviePlayerController object, set its frame, and add its view to the controller's view. The example below creates a player and subscribes to the playback-state-change and playback-finished notifications:

//
//  ViewController.m
//  MPMoviePlayerController
//
//  Created by Kenshin Cui on 14/03/30.
//  Copyright (c) 2014 cmjstudio. All rights reserved.
//

#import "ViewController.h"
#import <MediaPlayer/MediaPlayer.h>

@interface ViewController ()

@property (nonatomic,strong) MPMoviePlayerController *moviePlayer; // video player controller

@end

@implementation ViewController

#pragma mark - View controller methods
- (void)viewDidLoad {
    [super viewDidLoad];
    // Play
    [self.moviePlayer play];
    // Register for notifications
    [self addNotification];
}

- (void)dealloc {
    // Remove all notification observers
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}

#pragma mark - Private methods
/**
 * Local file URL
 *
 * @return file URL
 */
- (NSURL *)getFileUrl {
    NSString *urlStr = [[NSBundle mainBundle] pathForResource:@"The New Look of OS X Yosemite.mp4" ofType:nil];
    NSURL *url = [NSURL fileURLWithPath:urlStr];
    return url;
}

/**
 * Network file URL
 *
 * @return file URL
 */
- (NSURL *)getNetworkUrl {
    NSString *urlStr = @"http://192.168.1.161/The New Look of OS X Yosemite.mp4";
    urlStr = [urlStr stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
    NSURL *url = [NSURL URLWithString:urlStr];
    return url;
}

/**
 * Create the media player controller
 *
 * @return media player controller
 */
- (MPMoviePlayerController *)moviePlayer {
    if (!_moviePlayer) {
        NSURL *url = [self getNetworkUrl];
        _moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:url];
        _moviePlayer.view.frame = self.view.bounds;
        _moviePlayer.view.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;
        [self.view addSubview:_moviePlayer.view];
    }
    return _moviePlayer;
}

/**
 * Register for notifications about the player's state
 */
- (void)addNotification {
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    [notificationCenter addObserver:self selector:@selector(mediaPlayerPlaybackStateChange:) name:MPMoviePlayerPlaybackStateDidChangeNotification object:self.moviePlayer];
    [notificationCenter addObserver:self selector:@selector(mediaPlayerPlaybackFinished:) name:MPMoviePlayerPlaybackDidFinishNotification object:self.moviePlayer];
}

/**
 * Playback state changed; note that the state after playback finishes is "paused"
 *
 * @param notification notification object
 */
- (void)mediaPlayerPlaybackStateChange:(NSNotification *)notification {
    switch (self.moviePlayer.playbackState) {
        case MPMoviePlaybackStatePlaying:
            NSLog(@"Playing...");
            break;
        case MPMoviePlaybackStatePaused:
            NSLog(@"Playback paused.");
            break;
        case MPMoviePlaybackStateStopped:
            NSLog(@"Playback stopped.");
            break;
        default:
            NSLog(@"Playback state: %li", self.moviePlayer.playbackState);
            break;
    }
}

/**
 * Playback finished
 *
 * @param notification notification object
 */
- (void)mediaPlayerPlaybackFinished:(NSNotification *)notification {
    NSLog(@"Playback finished. %li", self.moviePlayer.playbackState);
}

@end

Running effect:
The next example extends the player above to capture video thumbnails: MPMoviePlayerController can request thumbnails at given times, and a notification reports each captured image:

//
//  ViewController.m
//  MPMoviePlayerController
//
//  Created by Kenshin Cui on 14/03/30.
//  Copyright (c) 2014 cmjstudio. All rights reserved.
//  Video thumbnails

#import "ViewController.h"
#import <MediaPlayer/MediaPlayer.h>

@interface ViewController ()

@property (nonatomic,strong) MPMoviePlayerController *moviePlayer; // video player controller

@end

@implementation ViewController

#pragma mark - View controller methods
- (void)viewDidLoad {
    [super viewDidLoad];
    // Play
    [self.moviePlayer play];
    // Register for notifications
    [self addNotification];
    // Request thumbnails
    [self thumbnailImageRequest];
}

- (void)dealloc {
    // Remove all notification observers
    [[NSNotificationCenter defaultCenter] removeObserver:self];
}

#pragma mark - Private methods
/**
 * Local file URL
 *
 * @return file URL
 */
- (NSURL *)getFileUrl {
    NSString *urlStr = [[NSBundle mainBundle] pathForResource:@"The New Look of OS X Yosemite.mp4" ofType:nil];
    NSURL *url = [NSURL fileURLWithPath:urlStr];
    return url;
}

/**
 * Network file URL
 *
 * @return file URL
 */
- (NSURL *)getNetworkUrl {
    NSString *urlStr = @"http://192.168.1.161/The New Look of OS X Yosemite.mp4";
    urlStr = [urlStr stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
    NSURL *url = [NSURL URLWithString:urlStr];
    return url;
}

/**
 * Create the media player controller
 *
 * @return media player controller
 */
- (MPMoviePlayerController *)moviePlayer {
    if (!_moviePlayer) {
        NSURL *url = [self getNetworkUrl];
        _moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:url];
        _moviePlayer.view.frame = self.view.bounds;
        _moviePlayer.view.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;
        [self.view addSubview:_moviePlayer.view];
    }
    return _moviePlayer;
}

/**
 * Request video thumbnails
 */
- (void)thumbnailImageRequest {
    // Request thumbnails at 13.0 s and 21.5 s
    [self.moviePlayer requestThumbnailImagesAtTimes:@[@13.0, @21.5] timeOption:MPMovieTimeOptionNearestKeyFrame];
}

#pragma mark - Notifications
/**
 * Register for notifications about the player's state
 */
- (void)addNotification {
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    [notificationCenter addObserver:self selector:@selector(mediaPlayerPlaybackStateChange:) name:MPMoviePlayerPlaybackStateDidChangeNotification object:self.moviePlayer];
    [notificationCenter addObserver:self selector:@selector(mediaPlayerPlaybackFinished:) name:MPMoviePlayerPlaybackDidFinishNotification object:self.moviePlayer];
    [notificationCenter addObserver:self selector:@selector(mediaPlayerThumbnailRequestFinished:) name:MPMoviePlayerThumbnailImageRequestDidFinishNotification object:self.moviePlayer];
}

/**
 * Playback state changed; note that the state after playback finishes is "paused"
 *
 * @param notification notification object
 */
- (void)mediaPlayerPlaybackStateChange:(NSNotification *)notification {
    switch (self.moviePlayer.playbackState) {
        case MPMoviePlaybackStatePlaying:
            NSLog(@"Playing...");
            break;
        case MPMoviePlaybackStatePaused:
            NSLog(@"Playback paused.");
            break;
        case MPMoviePlaybackStateStopped:
            NSLog(@"Playback stopped.");
            break;
        default:
            NSLog(@"Playback state: %li", self.moviePlayer.playbackState);
            break;
    }
}

/**
 * Playback finished
 *
 * @param notification notification object
 */
- (void)mediaPlayerPlaybackFinished:(NSNotification *)notification {
    NSLog(@"Playback finished. %li", self.moviePlayer.playbackState);
}

/**
 * Thumbnail request finished; called once for each successful capture
 *
 * @param notification notification object
 */
- (void)mediaPlayerThumbnailRequestFinished:(NSNotification *)notification {
    NSLog(@"Video thumbnail captured.");
    UIImage *image = notification.userInfo[MPMoviePlayerThumbnailImageKey];
    // Save the image to the photo album (the first call asks the user for album access)
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
}

@end

Screenshot effect:

Extension: generating thumbnails with AVFoundation

As the example above shows, generating thumbnails with MPMoviePlayerController is simple enough, but if all you need is a thumbnail and no video will be played, MPMoviePlayerController is overkill. AVAssetImageGenerator in the AVFoundation framework can also extract video thumbnails. Using AVAssetImageGenerator takes roughly three steps:

1. Create an AVURLAsset from the video URL.
2. Create an AVAssetImageGenerator from the asset.
3. Call copyCGImageAtTime:actualTime:error: to capture an image at the given time.
//
//  ViewController.m
//  AVAssetImageGenerator
//
//  Created by Kenshin Cui on 14/03/30.
//  Copyright (c) 2014 cmjstudio. All rights reserved.
//

#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>

@interface ViewController ()
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    // Capture the thumbnail at 13.0 s
    [self thumbnailImageRequest:13.0];
}

#pragma mark - Private methods
/**
 * Local file URL
 *
 * @return file URL
 */
- (NSURL *)getFileUrl {
    NSString *urlStr = [[NSBundle mainBundle] pathForResource:@"The New Look of OS X Yosemite.mp4" ofType:nil];
    NSURL *url = [NSURL fileURLWithPath:urlStr];
    return url;
}

/**
 * Network file URL
 *
 * @return file URL
 */
- (NSURL *)getNetworkUrl {
    NSString *urlStr = @"http://192.168.1.161/The New Look of OS X Yosemite.mp4";
    urlStr = [urlStr stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
    NSURL *url = [NSURL URLWithString:urlStr];
    return url;
}

/**
 * Capture a video thumbnail at the given time
 *
 * @param timeBySecond time point in seconds
 */
- (void)thumbnailImageRequest:(CGFloat)timeBySecond {
    // Create the URL
    NSURL *url = [self getNetworkUrl];
    // Create an AVURLAsset from the URL
    AVURLAsset *urlAsset = [AVURLAsset assetWithURL:url];
    // Create an AVAssetImageGenerator from the asset
    AVAssetImageGenerator *imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:urlAsset];
    /* Capture the image
     * requestTime: the time the thumbnail is requested for
     * actualTime:  the time the thumbnail was actually generated at
     */
    NSError *error = nil;
    // CMTime is the struct that represents movie time; the first argument is the
    // time in seconds, the second the number of timescale units per second
    // (to address a particular frame within a second, use CMTimeMake instead)
    CMTime time = CMTimeMakeWithSeconds(timeBySecond, 10);
    CMTime actualTime;
    CGImageRef cgImage = [imageGenerator copyCGImageAtTime:time actualTime:&actualTime error:&error];
    if (error) {
        NSLog(@"Error capturing the video thumbnail: %@", error.localizedDescription);
        return;
    }
    CMTimeShow(actualTime);
    UIImage *image = [UIImage imageWithCGImage:cgImage]; // convert to UIImage
    // Save to the photo album
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
    CGImageRelease(cgImage);
}

@end

The generated thumbnail:
MPMoviePlayerViewController其实MPMoviePlayerController如果不作为嵌入视频来播放(例如在新闻中嵌入一个视频),通常在播放时都是占满一个屏幕的,特别是在iPhone、iTouch上。因此从iOS3.2以后苹果也在思考既然MPMoviePlayerController在使用时通常都是将其视图view添加到另外一个视图控制器中作为子视图,那么何不直接创建一个控制器视图内部创建一个MPMoviePlayerController属性并且默认全屏播放,开发者在开发的时候直接使用这个视图控制器。这个内部有一个MPMoviePlayerController的视图控制器就是MPMoviePlayerViewController,它继承于UIViewController。MPMoviePlayerViewController内部多了一个moviePlayer属性和一个带有url的初始化方法,同时它内部实现了一些作为模态视图展示所特有的功能,例如默认是全屏模式展示、弹出后自动播放、作为模态窗口展示时如果点击“Done”按钮会自动退出模态窗口等。在下面的示例中就不直接将播放器放到主视图控制器,而是放到一个模态视图控制器中,简单演示MPMoviePlayerViewController的使用。 //// ViewController.m// MPMoviePlayerViewController//// Created by Kenshin Cui on 14/03/30.// Copyright (c) 2014年 cmjstudio. All rights reserved.// MPMoviePlayerViewController使用#import 'ViewController.h'#import <MediaPlayer/MediaPlayer.h>@interface ViewController ()//播放器视图控制器@property (nonatomic,strong) MPMoviePlayerViewController *moviePlayerViewController;@end@implementation ViewController#pragma mark - 控制器视图方法- (void)viewDidLoad { [super viewDidLoad];}-(void)dealloc{ //移除所有通知监控 [[NSNotificationCenter defaultCenter] removeObserver:self];}#pragma mark - 私有方法/** * 取得本地文件路径 * * @return 文件路径 */-(NSURL *)getFileUrl{ NSString *urlStr=[[NSBundle mainBundle] pathForResource:@'The New Look of OS X Yosemite.mp4' ofType:nil]; NSURL *url=[NSURL fileURLWithPath:urlStr]; return url;}/** * 取得网络文件路径 * * @return 文件路径 */-(NSURL *)getNetworkUrl{ NSString *urlStr=@'http://192.168.1.161/The New Look of OS X Yosemite.mp4'; urlStr=[urlStr stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding]; NSURL *url=[NSURL URLWithString:urlStr]; return url;}-(MPMoviePlayerViewController *)moviePlayerViewController{ if (!_moviePlayerViewController) { NSURL *url=[self getNetworkUrl]; _moviePlayerViewController=[[MPMoviePlayerViewController alloc]initWithContentURL:url]; [self addNotification]; } return _moviePlayerViewController;}#pragma mark - UI事件- (IBAction)playClick:(UIButton *)sender { 
self.moviePlayerViewController=nil;//保证每次点击都重新创建视频播放控制器视图,避免再次点击时不播放的问题// [self presentViewController:self.moviePlayerViewController animated:YES completion:nil]; //注意,在MPMoviePlayerViewController.h中对UIViewController扩展了两个用于模态展示和关闭MPMoviePlayerViewController的方法,增加了一种下拉展示动画效果 [self presentMoviePlayerViewControllerAnimated:self.moviePlayerViewController];}#pragma mark - 控制器通知/** * 添加通知监控媒体播放控制器状态 */-(void)addNotification{ NSNotificationCenter *notificationCenter=[NSNotificationCenter defaultCenter]; [notificationCenter addObserver:self selector:@selector(mediaPlayerPlaybackStateChange:) name:MPMoviePlayerPlaybackStateDidChangeNotification object:self.moviePlayerViewController.moviePlayer]; [notificationCenter addObserver:self selector:@selector(mediaPlayerPlaybackFinished:) name:MPMoviePlayerPlaybackDidFinishNotification object:self.moviePlayerViewController.moviePlayer]; }/** * 播放状态改变,注意播放完成时的状态是暂停 * * @param notification 通知对象 */-(void)mediaPlayerPlaybackStateChange:(NSNotification *)notification{ switch (self.moviePlayerViewController.moviePlayer.playbackState) { case MPMoviePlaybackStatePlaying: NSLog(@'正在播放...'); break; case MPMoviePlaybackStatePaused: NSLog(@'暂停播放.'); break; case MPMoviePlaybackStateStopped: NSLog(@'停止播放.'); break; default: NSLog(@'播放状态:%li',self.moviePlayerViewController.moviePlayer.playbackState); break; }}/** * 播放完成 * * @param notification 通知对象 */-(void)mediaPlayerPlaybackFinished:(NSNotification *)notification{ NSLog(@'播放完成.%li',self.moviePlayerViewController.moviePlayer.playbackState);}@end 运行效果: 这里需要强调一下,由于MPMoviePlayerViewController的初始化方法做了大量工作(例如设置URL、自动播放、添加点击Done完成的监控等),所以当再次点击播放弹出新的模态窗口的时候如果不销毁之前的MPMoviePlayerViewController,那么新的对象就无法完成初始化,这样也就不能再次进行播放。 AVPlayer MPMoviePlayerController足够强大,几乎不用写几行代码就能完成一个播放器,但是正是由于它的高度封装使得要自定义这个播放器变得很复杂,甚至是不可能完成。例如有些时候需要自定义播放器的样式,那么如果要使用MPMoviePlayerController就不合适了,如果要对视频有自由的控制则可以使用AVPlayer。AVPlayer存在于AVFoundation中,它更加接近于底层,所以灵活性也更强: 
AVPlayer本身并不能显示视频,而且它也不像MPMoviePlayerController有一个view属性。如果AVPlayer要显示必须创建一个播放器层AVPlayerLayer用于展示,播放器层继承于CALayer,有了AVPlayerLayer之后将其添加到控制器视图的layer中即可。要使用AVPlayer首先需要了解几个常用的类: AVAsset:主要用于获取多媒体信息,是一个抽象类,不能直接使用。 AVURLAsset:AVAsset的子类,可以根据一个URL路径创建一个包含媒体信息的AVURLAsset对象。 AVPlayerItem:一个媒体资源管理对象,管理着视频的一些基本信息和状态,一个AVPlayerItem对应着一个视频资源。 下面简单通过一个播放器来演示AVPlayer的使用,播放器的效果如下: 在这个自定义的播放器中实现了视频播放、暂停、进度展示和视频列表功能,下面将对这些功能一一介绍。 首先说一下视频的播放、暂停功能,这也是最基本的功能,AVPlayer提供了play、pause两个方法来实现。但是关键问题是如何判断当前视频是否在播放,在前面的内容中无论是音频播放器还是视频播放器都有对应的状态来判断,但是AVPlayer却没有这样的状态属性,通常情况下可以通过判断播放器的播放速度rate来获得播放状态:如果rate为0说明是停止状态,为1则是正常播放状态。 其次要展示播放进度就没有其他播放器那么简单了。在前面的播放器中通常是使用通知来获得播放器的状态、媒体加载状态等,但是无论是AVPlayer还是AVPlayerItem(AVPlayer有一个属性currentItem是AVPlayerItem类型,表示当前播放的视频对象)都无法通过通知获得这些信息。当然AVPlayerItem是有通知的,但是对于获得播放状态和加载状态有用的通知只有一个:播放完成通知AVPlayerItemDidPlayToEndTimeNotification。在播放视频时,特别是播放网络视频,往往需要知道视频加载情况、缓冲情况、播放情况,这些信息可以通过KVO监控AVPlayerItem的status、loadedTimeRanges属性来获得。当AVPlayerItem的status属性为AVPlayerStatusReadyToPlay时说明视频已经准备就绪、可以播放,只有处于这个状态时才能获得视频时长等信息;当loadedTimeRanges改变时(每缓冲一部分数据就会更新此属性)可以获得本次缓冲加载的视频范围(包含起始时间、本次加载时长),这样一来就可以实时获得缓冲情况。然后就是依靠AVPlayer的- (id)addPeriodicTimeObserverForInterval:(CMTime)interval queue:(dispatch_queue_t)queue usingBlock:(void (^)(CMTime time))block方法获得播放进度,这个方法会按设定的时间间隔定时更新播放进度,通过time参数通知客户端。相信有了这些视频信息,播放进度就不成问题了,事实上通过这些信息就算是平时看到的其他播放器的缓冲进度显示以及拖动播放的功能也可以顺利地实现。 最后就是视频切换的功能,在前面介绍的所有播放器中每个播放器对象一次只能播放一个视频,如果要切换视频只能重新创建一个对象,但是AVPlayer却提供了- (void)replaceCurrentItemWithPlayerItem:(AVPlayerItem *)item方法用于在不同的视频之间切换(事实上在AVFoundation内部还有一个AVQueuePlayer专门处理播放列表切换,有兴趣的朋友可以自行研究,这里不再赘述)。 下面附上代码: //// ViewController.m// AVPlayer//// Created by Kenshin Cui on 14/03/30.// Copyright (c) 2014年 cmjstudio. 
All rights reserved.//#import 'ViewController.h'#import <AVFoundation/AVFoundation.h>@interface ViewController ()@property (nonatomic,strong) AVPlayer *player;//播放器对象@property (weak, nonatomic) IBOutlet UIView *container; //播放器容器@property (weak, nonatomic) IBOutlet UIButton *playOrPause; //播放/暂停按钮@property (weak, nonatomic) IBOutlet UIProgressView *progress;//播放进度@end@implementation ViewController#pragma mark - 控制器视图方法- (void)viewDidLoad { [super viewDidLoad]; [self setupUI]; [self.player play];}-(void)dealloc{ [self removeObserverFromPlayerItem:self.player.currentItem]; [self removeNotification];}#pragma mark - 私有方法-(void)setupUI{ //创建播放器层 AVPlayerLayer *playerLayer=[AVPlayerLayer playerLayerWithPlayer:self.player]; playerLayer.frame=self.container.frame; //playerLayer.videoGravity=AVLayerVideoGravityResizeAspect;//视频填充模式 [self.container.layer addSublayer:playerLayer];}/** * 截取指定时间的视频缩略图 * * @param timeBySecond 时间点 *//** * 初始化播放器 * * @return 播放器对象 */-(AVPlayer *)player{ if (!_player) { AVPlayerItem *playerItem=[self getPlayItem:0]; _player=[AVPlayer playerWithPlayerItem:playerItem]; [self addProgressObserver]; [self addObserverToPlayerItem:playerItem]; } return _player;}/** * 根据视频索引取得AVPlayerItem对象 * * @param videoIndex 视频顺序索引 * * @return AVPlayerItem对象 */-(AVPlayerItem *)getPlayItem:(int)videoIndex{ NSString *urlStr=[NSString stringWithFormat:@'http://192.168.1.161/%i.mp4',videoIndex]; urlStr =[urlStr stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding]; NSURL *url=[NSURL URLWithString:urlStr]; AVPlayerItem *playerItem=[AVPlayerItem playerItemWithURL:url]; return playerItem;}#pragma mark - 通知/** * 添加播放器通知 */-(void)addNotification{ //给AVPlayerItem添加播放完成通知 [[NSNotificationCenter defaultCenter] addObserver:self selector:@selector(playbackFinished:) name:AVPlayerItemDidPlayToEndTimeNotification object:self.player.currentItem];}-(void)removeNotification{ [[NSNotificationCenter defaultCenter] removeObserver:self];}/** * 播放完成通知 * * @param notification 通知对象 
*/-(void)playbackFinished:(NSNotification *)notification{ NSLog(@'视频播放完成.');}#pragma mark - 监控/** * 给播放器添加进度更新 */-(void)addProgressObserver{ AVPlayerItem *playerItem=self.player.currentItem; UIProgressView *progress=self.progress; //这里设置每秒执行一次 [self.player addPeriodicTimeObserverForInterval:CMTimeMake(1.0, 1.0) queue:dispatch_get_main_queue() usingBlock:^(CMTime time) { float current=CMTimeGetSeconds(time); float total=CMTimeGetSeconds([playerItem duration]); NSLog(@'当前已经播放%.2fs.',current); if (current) { [progress setProgress:(current/total) animated:YES]; } }];}/** * 给AVPlayerItem添加监控 * * @param playerItem AVPlayerItem对象 */-(void)addObserverToPlayerItem:(AVPlayerItem *)playerItem{ //监控状态属性,注意AVPlayer也有一个status属性,通过监控它的status也可以获得播放状态 [playerItem addObserver:self forKeyPath:@'status' options:NSKeyValueObservingOptionNew context:nil]; //监控网络加载情况属性 [playerItem addObserver:self forKeyPath:@'loadedTimeRanges' options:NSKeyValueObservingOptionNew context:nil];}-(void)removeObserverFromPlayerItem:(AVPlayerItem *)playerItem{ [playerItem removeObserver:self forKeyPath:@'status']; [playerItem removeObserver:self forKeyPath:@'loadedTimeRanges'];}/** * 通过KVO监控播放器状态 * * @param keyPath 监控属性 * @param object 监视器 * @param change 状态改变 * @param context 上下文 */-(void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context{ AVPlayerItem *playerItem=object; if ([keyPath isEqualToString:@'status']) { AVPlayerStatus status= [[change objectForKey:@'new'] intValue]; if(status==AVPlayerStatusReadyToPlay){ NSLog(@'正在播放...,视频总长度:%.2f',CMTimeGetSeconds(playerItem.duration)); } }else if([keyPath isEqualToString:@'loadedTimeRanges']){ NSArray *array=playerItem.loadedTimeRanges; CMTimeRange timeRange = [array.firstObject CMTimeRangeValue];//本次缓冲时间范围 float startSeconds = CMTimeGetSeconds(timeRange.start); float durationSeconds = CMTimeGetSeconds(timeRange.duration); NSTimeInterval totalBuffer = startSeconds + durationSeconds;//缓冲总长度 
NSLog(@'共缓冲:%.2f',totalBuffer); }}#pragma mark - UI事件/** * 点击播放/暂停按钮 * * @param sender 播放/暂停按钮 */- (IBAction)playClick:(UIButton *)sender { if(self.player.rate==0){ //说明是暂停状态 [sender setImage:[UIImage imageNamed:@'player_pause'] forState:UIControlStateNormal]; [self.player play]; }else if(self.player.rate==1){//正在播放 [self.player pause]; [sender setImage:[UIImage imageNamed:@'player_play'] forState:UIControlStateNormal]; }}/** * 切换选集,这里使用按钮的tag代表视频名称 * * @param sender 点击按钮对象 */- (IBAction)navigationButtonClick:(UIButton *)sender { [self removeNotification]; [self removeObserverFromPlayerItem:self.player.currentItem]; AVPlayerItem *playerItem=[self getPlayItem:sender.tag]; [self addObserverToPlayerItem:playerItem]; //切换视频 [self.player replaceCurrentItemWithPlayerItem:playerItem]; [self addNotification];}@end 运行效果: 到目前为止,无论是使用MPMoviePlayerController还是AVPlayer来播放视频功能都相当强大,但是它们也存在着一些不可回避的问题,那就是支持的视频编码格式很有限:H.264、MPEG-4,扩展名(封装格式):.mp4、.mov、.m4v、.m2v、.3gp、.3g2等。但是无论是MPMoviePlayerController还是AVPlayer都支持绝大多数音频编码,所以大家如果纯粹是为了播放音乐的话也可以考虑使用这两个播放器。那么如何支持更多视频编码格式呢?目前来说主要还是依靠第三方框架,在iOS上常用的视频编码、解码框架有:VLC、FFmpeg,具体使用方式今天就不再做详细介绍。 摄像头 UIImagePickerController拍照和视频录制 下面看一下在iOS中如何拍照和录制视频。在iOS中要拍照和录制视频最简单的方法就是使用UIImagePickerController。UIImagePickerController继承于UINavigationController,前面的文章中主要使用它来选取照片,其实UIImagePickerController的功能不仅如此,它还可以用来拍照和录制视频。首先看一下这个类常用的属性和方法:
要用UIImagePickerController来拍照或者录制视频通常可以分为如下步骤:1.创建UIImagePickerController对象;2.指定来源sourceType为UIImagePickerControllerSourceTypeCamera;3.根据需要设置mediaTypes(拍照或录像)、cameraDevice、cameraCaptureMode等属性;4.设置代理delegate用于获得拍摄结果;5.以模态窗口的形式展示UIImagePickerController;6.在代理方法imagePickerController:didFinishPickingMediaWithInfo:中取得照片或视频并保存。
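另外需要注意,模拟器上没有摄像头,在呈现相机界面之前最好先检查相机是否可用,下面是一个简单的检查示意:

```objc
-(BOOL)isCameraAvailable{
    //检查当前设备是否支持相机来源
    if (![UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
        return NO;
    }
    //检查后置摄像头是否可用
    return [UIImagePickerController isCameraDeviceAvailable:UIImagePickerControllerCameraDeviceRear];
}
```

如果检查结果为NO,可以隐藏拍照按钮或者提示用户当前设备不支持拍摄。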
当然这个过程中有很多细节可以设置,例如是否显示拍照控制面板,拍照后是否允许编辑等等,通过上面的属性/方法列表相信并不难理解。下面就以一个示例展示如何使用UIImagePickerController来拍照和录制视频,下面的程序中只要将_isVideo设置为YES就是视频录制模式,录制完后在主视图控制器中自动播放;如果将_isVideo设置为NO则为拍照模式,拍照完成之后在主视图控制器中显示拍摄的照片: //// ViewController.m// UIImagePickerController//// Created by Kenshin Cui on 14/04/05.// Copyright (c) 2014年 cmjstudio. All rights reserved.//#import 'ViewController.h'#import <MobileCoreServices/MobileCoreServices.h>#import <AVFoundation/AVFoundation.h>@interface ViewController ()<UIImagePickerControllerDelegate,UINavigationControllerDelegate>@property (assign,nonatomic) int isVideo;//是否录制视频,如果为1表示录制视频,0代表拍照@property (strong,nonatomic) UIImagePickerController *imagePicker;@property (weak, nonatomic) IBOutlet UIImageView *photo;//照片展示视图@property (strong ,nonatomic) AVPlayer *player;//播放器,用于录制完视频后播放视频@end@implementation ViewController#pragma mark - 控制器视图事件- (void)viewDidLoad { [super viewDidLoad]; //通过这里设置当前程序是拍照还是录制视频 _isVideo=YES;}#pragma mark - UI事件//点击拍照按钮- (IBAction)takeClick:(UIButton *)sender { [self presentViewController:self.imagePicker animated:YES completion:nil];}#pragma mark - UIImagePickerController代理方法//完成-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info{ NSString *mediaType=[info objectForKey:UIImagePickerControllerMediaType]; if ([mediaType isEqualToString:(NSString *)kUTTypeImage]) {//如果是拍照 UIImage *image; //如果允许编辑则获得编辑后的照片,否则获取原始照片 if (self.imagePicker.allowsEditing) { image=[info objectForKey:UIImagePickerControllerEditedImage];//获取编辑后的照片 }else{ image=[info objectForKey:UIImagePickerControllerOriginalImage];//获取原始照片 } [self.photo setImage:image];//显示照片 UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);//保存到相簿 }else if([mediaType isEqualToString:(NSString *)kUTTypeMovie]){//如果是录制视频 NSLog(@'video...'); NSURL *url=[info objectForKey:UIImagePickerControllerMediaURL];//视频路径 NSString *urlStr=[url path]; if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(urlStr)) { 
//保存视频到相簿,注意也可以使用ALAssetsLibrary来保存 UISaveVideoAtPathToSavedPhotosAlbum(urlStr, self, @selector(video:didFinishSavingWithError:contextInfo:), nil);//保存视频到相簿 } } [self dismissViewControllerAnimated:YES completion:nil];}-(void)imagePickerControllerDidCancel:(UIImagePickerController *)picker{ NSLog(@'取消');}#pragma mark - 私有方法-(UIImagePickerController *)imagePicker{ if (!_imagePicker) { _imagePicker=[[UIImagePickerController alloc]init]; _imagePicker.sourceType=UIImagePickerControllerSourceTypeCamera;//设置image picker的来源,这里设置为摄像头 _imagePicker.cameraDevice=UIImagePickerControllerCameraDeviceRear;//设置使用哪个摄像头,这里设置为后置摄像头 if (self.isVideo) { _imagePicker.mediaTypes=@[(NSString *)kUTTypeMovie]; _imagePicker.videoQuality=UIImagePickerControllerQualityTypeIFrame1280x720; _imagePicker.cameraCaptureMode=UIImagePickerControllerCameraCaptureModeVideo;//设置摄像头模式(拍照,录制视频) }else{ _imagePicker.cameraCaptureMode=UIImagePickerControllerCameraCaptureModePhoto; } _imagePicker.allowsEditing=YES;//允许编辑 _imagePicker.delegate=self;//设置代理,检测操作 } return _imagePicker;}//视频保存后的回调- (void)video:(NSString *)videoPath didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo{ if (error) { NSLog(@'保存视频过程中发生错误,错误信息:%@',error.localizedDescription); }else{ NSLog(@'视频保存成功.'); //录制完之后自动播放 NSURL *url=[NSURL fileURLWithPath:videoPath]; _player=[AVPlayer playerWithURL:url]; AVPlayerLayer *playerLayer=[AVPlayerLayer playerLayerWithPlayer:_player]; playerLayer.frame=self.photo.frame; [self.photo.layer addSublayer:playerLayer]; [_player play]; }}@end 运行效果(视频录制): AVFoundation拍照和录制视频 不得不说UIImagePickerController确实强大,但是与MPMoviePlayerController类似,由于它的高度封装性,要进行某些自定义工作就比较复杂了。例如要做出一款类似于美颜相机的拍照界面就比较难以实现了,此时就可以考虑使用AVFoundation来实现。AVFoundation中提供了很多现成的播放器和录音机,但是事实上它还有更加底层的内容可以供开发者使用。因为AVFoundation中抽象出了很多和底层输入、输出设备打交道的类,依靠这些类开发人员面对的不再是封装好的音频播放器AVAudioPlayer、录音机(AVAudioRecorder)、视频(包括音频)播放器AVPlayer,而是输入设备(例如麦克风、摄像头)、输出设备(图片、视频)等。首先了解一下使用AVFoundation做拍照和视频录制开发用到的相关类: 
AVCaptureSession:媒体(音、视频)捕获会话,负责把捕获的音视频数据输出到输出设备中。一个AVCaptureSession可以有多个输入输出: AVCaptureDevice:输入设备,包括麦克风、摄像头,通过该对象可以设置物理设备的一些属性(例如相机聚焦、白平衡等)。 AVCaptureDeviceInput:设备输入数据管理对象,可以根据AVCaptureDevice创建对应的AVCaptureDeviceInput对象,该对象将会被添加到AVCaptureSession中管理。 AVCaptureOutput:输出数据管理对象,用于接收各类输出数据,通常使用对应的子类AVCaptureAudioDataOutput、AVCaptureStillImageOutput、AVCaptureVideoDataOutput、AVCaptureFileOutput,该对象将会被添加到AVCaptureSession中管理。注意:前面几个对象的输出数据都是NSData类型,而AVCaptureFileOutput代表数据以文件形式输出,类似地,AVCaptureFileOutput也不会直接创建使用,通常会使用其子类:AVCaptureAudioFileOutput、AVCaptureMovieFileOutput。当把一个输入或者输出添加到AVCaptureSession之后AVCaptureSession就会在所有相符的输入、输出设备之间建立连接(AVCaptureConnection): AVCaptureVideoPreviewLayer:相机拍摄预览图层,是CALayer的子类,使用该对象可以实时查看拍照或视频录制效果,创建该对象需要指定对应的AVCaptureSession对象。 使用AVFoundation拍照和录制视频的一般步骤如下:1.创建AVCaptureSession对象;2.使用AVCaptureDevice取得输入设备(摄像头、麦克风),并据此初始化AVCaptureDeviceInput;3.初始化输出数据管理对象(拍照使用AVCaptureStillImageOutput,录像使用AVCaptureMovieFileOutput);4.将输入、输出对象添加到AVCaptureSession中;5.创建AVCaptureVideoPreviewLayer并添加到界面中用于预览;6.调用AVCaptureSession的startRunning方法开始捕获。
拍照下面看一下如何使用AVFoundation实现一个拍照程序,在这个程序中将实现摄像头预览、切换前后摄像头、闪光灯设置、对焦、拍照保存等功能。应用大致效果如下: 在程序中定义会话、输入、输出等相关对象。 @interface ViewController ()@property (strong,nonatomic) AVCaptureSession *captureSession;//负责输入和输出设备之间的数据传递@property (strong,nonatomic) AVCaptureDeviceInput *captureDeviceInput;//负责从AVCaptureDevice获得输入数据@property (strong,nonatomic) AVCaptureStillImageOutput *captureStillImageOutput;//照片输出流@property (strong,nonatomic) AVCaptureVideoPreviewLayer *captureVideoPreviewLayer;//相机拍摄预览图层@property (weak, nonatomic) IBOutlet UIView *viewContainer;@property (weak, nonatomic) IBOutlet UIButton *takeButton;//拍照按钮@property (weak, nonatomic) IBOutlet UIButton *flashAutoButton;//自动闪光灯按钮@property (weak, nonatomic) IBOutlet UIButton *flashOnButton;//打开闪光灯按钮@property (weak, nonatomic) IBOutlet UIButton *flashOffButton;//关闭闪光灯按钮@property (weak, nonatomic) IBOutlet UIImageView *focusCursor; //聚焦光标@end 在控制器视图将要展示时创建并初始化会话、摄像头设备、输入、输出、预览图层,并且添加预览图层到视图中,除此之外还做了一些初始化工作,例如添加手势(点击屏幕进行聚焦)、初始化界面等。 -(void)viewWillAppear:(BOOL)animated{ [super viewWillAppear:animated]; //初始化会话 _captureSession=[[AVCaptureSession alloc]init]; if ([_captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) {//设置分辨率 _captureSession.sessionPreset=AVCaptureSessionPreset1280x720; } //获得输入设备 AVCaptureDevice *captureDevice=[self getCameraDeviceWithPosition:AVCaptureDevicePositionBack];//取得后置摄像头 if (!captureDevice) { NSLog(@'取得后置摄像头时出现问题.'); return; } NSError *error=nil; //根据输入设备初始化设备输入对象,用于获得输入数据 _captureDeviceInput=[[AVCaptureDeviceInput alloc]initWithDevice:captureDevice error:&error]; if (error) { NSLog(@'取得设备输入对象时出错,错误原因:%@',error.localizedDescription); return; } //初始化设备输出对象,用于获得输出数据 _captureStillImageOutput=[[AVCaptureStillImageOutput alloc]init]; NSDictionary *outputSettings = @{AVVideoCodecKey:AVVideoCodecJPEG}; [_captureStillImageOutput setOutputSettings:outputSettings];//输出设置 //将设备输入添加到会话中 if ([_captureSession canAddInput:_captureDeviceInput]) { [_captureSession addInput:_captureDeviceInput]; } 
//将设备输出添加到会话中 if ([_captureSession canAddOutput:_captureStillImageOutput]) { [_captureSession addOutput:_captureStillImageOutput]; } //创建视频预览层,用于实时展示摄像头状态 _captureVideoPreviewLayer=[[AVCaptureVideoPreviewLayer alloc]initWithSession:self.captureSession]; CALayer *layer=self.viewContainer.layer; layer.masksToBounds=YES; _captureVideoPreviewLayer.frame=layer.bounds; _captureVideoPreviewLayer.videoGravity=AVLayerVideoGravityResizeAspectFill;//填充模式 //将视频预览层添加到界面中 //[layer addSublayer:_captureVideoPreviewLayer]; [layer insertSublayer:_captureVideoPreviewLayer below:self.focusCursor.layer]; [self addNotificationToCaptureDevice:captureDevice]; [self addGenstureRecognizer]; [self setFlashModeButtonStatus];} 在控制器视图展示和视图离开界面时启动、停止会话。 -(void)viewDidAppear:(BOOL)animated{ [super viewDidAppear:animated]; [self.captureSession startRunning];}-(void)viewDidDisappear:(BOOL)animated{ [super viewDidDisappear:animated]; [self.captureSession stopRunning];} 定义闪光灯开闭及自动模式功能,注意无论是设置闪光灯、白平衡还是其他输入设备属性,在设置之前必须先锁定配置,修改完后解锁。 /** * 改变设备属性的统一操作方法 * * @param propertyChange 属性改变操作 */-(void)changeDeviceProperty:(PropertyChangeBlock)propertyChange{ AVCaptureDevice *captureDevice= [self.captureDeviceInput device]; NSError *error; //注意改变设备属性前一定要首先调用lockForConfiguration:调用完之后使用unlockForConfiguration方法解锁 if ([captureDevice lockForConfiguration:&error]) { propertyChange(captureDevice); [captureDevice unlockForConfiguration]; }else{ NSLog(@'设置设备属性过程发生错误,错误信息:%@',error.localizedDescription); }}/** * 设置闪光灯模式 * * @param flashMode 闪光灯模式 */-(void)setFlashMode:(AVCaptureFlashMode )flashMode{ [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) { if ([captureDevice isFlashModeSupported:flashMode]) { [captureDevice setFlashMode:flashMode]; } }];} 定义切换摄像头功能,切换摄像头的过程就是将原有输入移除,在会话中添加新的输入,但是注意动态修改会话需要首先开启配置,配置成功后提交配置。 #pragma mark 切换前后摄像头- (IBAction)toggleButtonClick:(UIButton *)sender { AVCaptureDevice *currentDevice=[self.captureDeviceInput device]; AVCaptureDevicePosition currentPosition=[currentDevice 
position]; [self removeNotificationFromCaptureDevice:currentDevice]; AVCaptureDevice *toChangeDevice; AVCaptureDevicePosition toChangePosition=AVCaptureDevicePositionFront; if (currentPosition==AVCaptureDevicePositionUnspecified||currentPosition==AVCaptureDevicePositionFront) { toChangePosition=AVCaptureDevicePositionBack; } toChangeDevice=[self getCameraDeviceWithPosition:toChangePosition]; [self addNotificationToCaptureDevice:toChangeDevice]; //获得要调整的设备输入对象 AVCaptureDeviceInput *toChangeDeviceInput=[[AVCaptureDeviceInput alloc]initWithDevice:toChangeDevice error:nil]; //改变会话的配置前一定要先开启配置,配置完成后提交配置改变 [self.captureSession beginConfiguration]; //移除原有输入对象 [self.captureSession removeInput:self.captureDeviceInput]; //添加新的输入对象 if ([self.captureSession canAddInput:toChangeDeviceInput]) { [self.captureSession addInput:toChangeDeviceInput]; self.captureDeviceInput=toChangeDeviceInput; } //提交会话配置 [self.captureSession commitConfiguration]; [self setFlashModeButtonStatus];} 添加点击手势操作,点按预览视图时进行聚焦、曝光设置。 /** * 设置聚焦点 * * @param point 聚焦点 */-(void)focusWithMode:(AVCaptureFocusMode)focusMode exposureMode:(AVCaptureExposureMode)exposureMode atPoint:(CGPoint)point{ [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) { if ([captureDevice isFocusModeSupported:focusMode]) { [captureDevice setFocusMode:focusMode]; } if ([captureDevice isFocusPointOfInterestSupported]) { [captureDevice setFocusPointOfInterest:point]; } if ([captureDevice isExposureModeSupported:exposureMode]) { [captureDevice setExposureMode:exposureMode]; } if ([captureDevice isExposurePointOfInterestSupported]) { [captureDevice setExposurePointOfInterest:point]; } }];}/** * 添加点按手势,点按时聚焦 */-(void)addGenstureRecognizer{ UITapGestureRecognizer *tapGesture=[[UITapGestureRecognizer alloc]initWithTarget:self action:@selector(tapScreen:)]; [self.viewContainer addGestureRecognizer:tapGesture];}-(void)tapScreen:(UITapGestureRecognizer *)tapGesture{ CGPoint point= [tapGesture 
locationInView:self.viewContainer]; //将UI坐标转化为摄像头坐标 CGPoint cameraPoint= [self.captureVideoPreviewLayer captureDevicePointOfInterestForPoint:point]; [self setFocusCursorWithPoint:point]; [self focusWithMode:AVCaptureFocusModeAutoFocus exposureMode:AVCaptureExposureModeAutoExpose atPoint:cameraPoint];} 定义拍照功能,拍照的过程就是获取连接,从连接中获得捕获的输出数据并做保存操作。 #pragma mark 拍照- (IBAction)takeButtonClick:(UIButton *)sender { //根据设备输出获得连接 AVCaptureConnection *captureConnection=[self.captureStillImageOutput connectionWithMediaType:AVMediaTypeVideo]; //根据连接取得设备输出的数据 [self.captureStillImageOutput captureStillImageAsynchronouslyFromConnection:captureConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) { if (imageDataSampleBuffer) { NSData *imageData=[AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer]; UIImage *image=[UIImage imageWithData:imageData]; UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);// ALAssetsLibrary *assetsLibrary=[[ALAssetsLibrary alloc]init];// [assetsLibrary writeImageToSavedPhotosAlbum:[image CGImage] orientation:(ALAssetOrientation)[image imageOrientation] completionBlock:nil]; } }];} 最后附上完整代码: //// ViewController.m// AVFoundationCamera//// Created by Kenshin Cui on 14/04/05.// Copyright (c) 2014年 cmjstudio. 
All rights reserved.//#import 'ViewController.h'#import <AVFoundation/AVFoundation.h>#import <AssetsLibrary/AssetsLibrary.h>typedef void(^PropertyChangeBlock)(AVCaptureDevice *captureDevice);@interface ViewController ()@property (strong,nonatomic) AVCaptureSession *captureSession;//负责输入和输出设备之间的数据传递@property (strong,nonatomic) AVCaptureDeviceInput *captureDeviceInput;//负责从AVCaptureDevice获得输入数据@property (strong,nonatomic) AVCaptureStillImageOutput *captureStillImageOutput;//照片输出流@property (strong,nonatomic) AVCaptureVideoPreviewLayer *captureVideoPreviewLayer;//相机拍摄预览图层@property (weak, nonatomic) IBOutlet UIView *viewContainer;@property (weak, nonatomic) IBOutlet UIButton *takeButton;//拍照按钮@property (weak, nonatomic) IBOutlet UIButton *flashAutoButton;//自动闪光灯按钮@property (weak, nonatomic) IBOutlet UIButton *flashOnButton;//打开闪光灯按钮@property (weak, nonatomic) IBOutlet UIButton *flashOffButton;//关闭闪光灯按钮@property (weak, nonatomic) IBOutlet UIImageView *focusCursor; //聚焦光标@end@implementation ViewController#pragma mark - 控制器视图方法- (void)viewDidLoad { [super viewDidLoad]; }-(void)viewWillAppear:(BOOL)animated{ [super viewWillAppear:animated]; //初始化会话 _captureSession=[[AVCaptureSession alloc]init]; if ([_captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) {//设置分辨率 _captureSession.sessionPreset=AVCaptureSessionPreset1280x720; } //获得输入设备 AVCaptureDevice *captureDevice=[self getCameraDeviceWithPosition:AVCaptureDevicePositionBack];//取得后置摄像头 if (!captureDevice) { NSLog(@'取得后置摄像头时出现问题.'); return; } NSError *error=nil; //根据输入设备初始化设备输入对象,用于获得输入数据 _captureDeviceInput=[[AVCaptureDeviceInput alloc]initWithDevice:captureDevice error:&error]; if (error) { NSLog(@'取得设备输入对象时出错,错误原因:%@',error.localizedDescription); return; } //初始化设备输出对象,用于获得输出数据 _captureStillImageOutput=[[AVCaptureStillImageOutput alloc]init]; NSDictionary *outputSettings = @{AVVideoCodecKey:AVVideoCodecJPEG}; [_captureStillImageOutput setOutputSettings:outputSettings];//输出设置 //将设备输入添加到会话中 if 
([_captureSession canAddInput:_captureDeviceInput]) { [_captureSession addInput:_captureDeviceInput]; } //将设备输出添加到会话中 if ([_captureSession canAddOutput:_captureStillImageOutput]) { [_captureSession addOutput:_captureStillImageOutput]; } //创建视频预览层,用于实时展示摄像头状态 _captureVideoPreviewLayer=[[AVCaptureVideoPreviewLayer alloc]initWithSession:self.captureSession]; CALayer *layer=self.viewContainer.layer; layer.masksToBounds=YES; _captureVideoPreviewLayer.frame=layer.bounds; _captureVideoPreviewLayer.videoGravity=AVLayerVideoGravityResizeAspectFill;//填充模式 //将视频预览层添加到界面中 //[layer addSublayer:_captureVideoPreviewLayer]; [layer insertSublayer:_captureVideoPreviewLayer below:self.focusCursor.layer]; [self addNotificationToCaptureDevice:captureDevice]; [self addGenstureRecognizer]; [self setFlashModeButtonStatus];}-(void)viewDidAppear:(BOOL)animated{ [super viewDidAppear:animated]; [self.captureSession startRunning];}-(void)viewDidDisappear:(BOOL)animated{ [super viewDidDisappear:animated]; [self.captureSession stopRunning];}-(void)dealloc{ [self removeNotification];}#pragma mark - UI方法#pragma mark 拍照- (IBAction)takeButtonClick:(UIButton *)sender { //根据设备输出获得连接 AVCaptureConnection *captureConnection=[self.captureStillImageOutput connectionWithMediaType:AVMediaTypeVideo]; //根据连接取得设备输出的数据 [self.captureStillImageOutput captureStillImageAsynchronouslyFromConnection:captureConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) { if (imageDataSampleBuffer) { NSData *imageData=[AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer]; UIImage *image=[UIImage imageWithData:imageData]; UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);// ALAssetsLibrary *assetsLibrary=[[ALAssetsLibrary alloc]init];// [assetsLibrary writeImageToSavedPhotosAlbum:[image CGImage] orientation:(ALAssetOrientation)[image imageOrientation] completionBlock:nil]; } }];}#pragma mark 切换前后摄像头- (IBAction)toggleButtonClick:(UIButton *)sender { 
AVCaptureDevice *currentDevice=[self.captureDeviceInput device]; AVCaptureDevicePosition currentPosition=[currentDevice position]; [self removeNotificationFromCaptureDevice:currentDevice]; AVCaptureDevice *toChangeDevice; AVCaptureDevicePosition toChangePosition=AVCaptureDevicePositionFront; if (currentPosition==AVCaptureDevicePositionUnspecified||currentPosition==AVCaptureDevicePositionFront) { toChangePosition=AVCaptureDevicePositionBack; } toChangeDevice=[self getCameraDeviceWithPosition:toChangePosition]; [self addNotificationToCaptureDevice:toChangeDevice]; //获得要调整的设备输入对象 AVCaptureDeviceInput *toChangeDeviceInput=[[AVCaptureDeviceInput alloc]initWithDevice:toChangeDevice error:nil]; //改变会话的配置前一定要先开启配置,配置完成后提交配置改变 [self.captureSession beginConfiguration]; //移除原有输入对象 [self.captureSession removeInput:self.captureDeviceInput]; //添加新的输入对象 if ([self.captureSession canAddInput:toChangeDeviceInput]) { [self.captureSession addInput:toChangeDeviceInput]; self.captureDeviceInput=toChangeDeviceInput; } //提交会话配置 [self.captureSession commitConfiguration]; [self setFlashModeButtonStatus];}#pragma mark 自动闪光灯开启- (IBAction)flashAutoClick:(UIButton *)sender { [self setFlashMode:AVCaptureFlashModeAuto]; [self setFlashModeButtonStatus];}#pragma mark 打开闪光灯- (IBAction)flashOnClick:(UIButton *)sender { [self setFlashMode:AVCaptureFlashModeOn]; [self setFlashModeButtonStatus];}#pragma mark 关闭闪光灯- (IBAction)flashOffClick:(UIButton *)sender { [self setFlashMode:AVCaptureFlashModeOff]; [self setFlashModeButtonStatus];}#pragma mark - 通知/** * 给输入设备添加通知 */-(void)addNotificationToCaptureDevice:(AVCaptureDevice *)captureDevice{ //注意添加区域改变捕获通知必须首先设置设备允许捕获 [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) { captureDevice.subjectAreaChangeMonitoringEnabled=YES; }]; NSNotificationCenter *notificationCenter= [NSNotificationCenter defaultCenter]; //捕获区域发生改变 [notificationCenter addObserver:self selector:@selector(areaChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification 
object:captureDevice];}-(void)removeNotificationFromCaptureDevice:(AVCaptureDevice *)captureDevice{ NSNotificationCenter *notificationCenter= [NSNotificationCenter defaultCenter]; [notificationCenter removeObserver:self name:AVCaptureDeviceSubjectAreaDidChangeNotification object:captureDevice];}/** * 移除所有通知 */-(void)removeNotification{ NSNotificationCenter *notificationCenter= [NSNotificationCenter defaultCenter]; [notificationCenter removeObserver:self];}-(void)addNotificationToCaptureSession:(AVCaptureSession *)captureSession{ NSNotificationCenter *notificationCenter= [NSNotificationCenter defaultCenter]; //会话出错 [notificationCenter addObserver:self selector:@selector(sessionRuntimeError:) name:AVCaptureSessionRuntimeErrorNotification object:captureSession];}/** * 设备连接成功 * * @param notification 通知对象 */-(void)deviceConnected:(NSNotification *)notification{ NSLog(@'设备已连接...');}/** * 设备连接断开 * * @param notification 通知对象 */-(void)deviceDisconnected:(NSNotification *)notification{ NSLog(@'设备已断开.');}/** * 捕获区域改变 * * @param notification 通知对象 */-(void)areaChange:(NSNotification *)notification{ NSLog(@'捕获区域改变...');}/** * 会话出错 * * @param notification 通知对象 */-(void)sessionRuntimeError:(NSNotification *)notification{ NSLog(@'会话发生错误.');}#pragma mark - 私有方法/** * 取得指定位置的摄像头 * * @param position 摄像头位置 * * @return 摄像头设备 */-(AVCaptureDevice *)getCameraDeviceWithPosition:(AVCaptureDevicePosition )position{ NSArray *cameras= [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]; for (AVCaptureDevice *camera in cameras) { if ([camera position]==position) { return camera; } } return nil;}/** * 改变设备属性的统一操作方法 * * @param propertyChange 属性改变操作 */-(void)changeDeviceProperty:(PropertyChangeBlock)propertyChange{ AVCaptureDevice *captureDevice= [self.captureDeviceInput device]; NSError *error; //注意改变设备属性前一定要首先调用lockForConfiguration:调用完之后使用unlockForConfiguration方法解锁 if ([captureDevice lockForConfiguration:&error]) { propertyChange(captureDevice); [captureDevice unlockForConfiguration]; }else{ 
NSLog(@'设置设备属性过程发生错误,错误信息:%@',error.localizedDescription); }}/** * 设置闪光灯模式 * * @param flashMode 闪光灯模式 */-(void)setFlashMode:(AVCaptureFlashMode )flashMode{ [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) { if ([captureDevice isFlashModeSupported:flashMode]) { [captureDevice setFlashMode:flashMode]; } }];}/** * 设置聚焦模式 * * @param focusMode 聚焦模式 */-(void)setFocusMode:(AVCaptureFocusMode )focusMode{ [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) { if ([captureDevice isFocusModeSupported:focusMode]) { [captureDevice setFocusMode:focusMode]; } }];}/** * 设置曝光模式 * * @param exposureMode 曝光模式 */-(void)setExposureMode:(AVCaptureExposureMode)exposureMode{ [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) { if ([captureDevice isExposureModeSupported:exposureMode]) { [captureDevice setExposureMode:exposureMode]; } }];}/** * 设置聚焦点 * * @param point 聚焦点 */-(void)focusWithMode:(AVCaptureFocusMode)focusMode exposureMode:(AVCaptureExposureMode)exposureMode atPoint:(CGPoint)point{ [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) { if ([captureDevice isFocusModeSupported:focusMode]) { [captureDevice setFocusMode:focusMode]; } if ([captureDevice isFocusPointOfInterestSupported]) { [captureDevice setFocusPointOfInterest:point]; } if ([captureDevice isExposureModeSupported:exposureMode]) { [captureDevice setExposureMode:exposureMode]; } if ([captureDevice isExposurePointOfInterestSupported]) { [captureDevice setExposurePointOfInterest:point]; } }];}/** * 添加点按手势,点按时聚焦 */-(void)addGenstureRecognizer{ UITapGestureRecognizer *tapGesture=[[UITapGestureRecognizer alloc]initWithTarget:self action:@selector(tapScreen:)]; [self.viewContainer addGestureRecognizer:tapGesture];}-(void)tapScreen:(UITapGestureRecognizer *)tapGesture{ CGPoint point= [tapGesture locationInView:self.viewContainer]; //将UI坐标转化为摄像头坐标 CGPoint cameraPoint= [self.captureVideoPreviewLayer captureDevicePointOfInterestForPoint:point]; [self 
setFocusCursorWithPoint:point];
    [self focusWithMode:AVCaptureFocusModeAutoFocus exposureMode:AVCaptureExposureModeAutoExpose atPoint:cameraPoint];
}
/**
 *  Update the flash buttons' state
 */
-(void)setFlashModeButtonStatus{
    AVCaptureDevice *captureDevice = [self.captureDeviceInput device];
    AVCaptureFlashMode flashMode = captureDevice.flashMode;
    if ([captureDevice isFlashAvailable]) {
        self.flashAutoButton.hidden = NO;
        self.flashOnButton.hidden = NO;
        self.flashOffButton.hidden = NO;
        self.flashAutoButton.enabled = YES;
        self.flashOnButton.enabled = YES;
        self.flashOffButton.enabled = YES;
        switch (flashMode) {
            case AVCaptureFlashModeAuto:
                self.flashAutoButton.enabled = NO;
                break;
            case AVCaptureFlashModeOn:
                self.flashOnButton.enabled = NO;
                break;
            case AVCaptureFlashModeOff:
                self.flashOffButton.enabled = NO;
                break;
            default:
                break;
        }
    } else {
        self.flashAutoButton.hidden = YES;
        self.flashOnButton.hidden = YES;
        self.flashOffButton.hidden = YES;
    }
}
/**
 *  Position and animate the focus cursor
 *
 *  @param point cursor position
 */
-(void)setFocusCursorWithPoint:(CGPoint)point{
    self.focusCursor.center = point;
    self.focusCursor.transform = CGAffineTransformMakeScale(1.5, 1.5);
    self.focusCursor.alpha = 1.0;
    [UIView animateWithDuration:1.0 animations:^{
        self.focusCursor.transform = CGAffineTransformIdentity;
    } completion:^(BOOL finished) {
        self.focusCursor.alpha = 0;
    }];
}
@end

Run result:
Video recording

With the photo-capture app above in place, building video recording on top of it is not complicated. The program only needs the following modifications:

- Add an audio device input (the microphone) to the session so that sound is captured along with the video.
- Replace the still-image output with an AVCaptureMovieFileOutput, which writes the captured movie to a file.
- Implement the AVCaptureFileOutputRecordingDelegate protocol to be notified when recording starts and finishes.
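The three changes can be sketched as a small delta against the photo app's session setup. This is a minimal sketch only: it assumes a `session` (AVCaptureSession) and the camera input from the earlier photo example already exist, and that `self` adopts AVCaptureFileOutputRecordingDelegate.

```objc
// Sketch: assumes `session` and the camera input from the photo example
// are already configured, and that `self` is the recording delegate.

// 1. Add a microphone input so the movie gets an audio track
AVCaptureDevice *mic = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];
AVCaptureDeviceInput *micInput = [[AVCaptureDeviceInput alloc] initWithDevice:mic error:nil];
if ([session canAddInput:micInput]) {
    [session addInput:micInput];
}

// 2. Use a movie file output instead of a still-image output
AVCaptureMovieFileOutput *movieOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([session canAddOutput:movieOutput]) {
    [session addOutput:movieOutput];
}

// 3. Start recording to a file; the delegate's
//    captureOutput:didFinishRecordingToOutputFileAtURL:... fires on stop
NSURL *fileURL = [NSURL fileURLWithPath:
    [NSTemporaryDirectory() stringByAppendingPathComponent:@"myMovie.mov"]];
[movieOutput startRecordingToOutputFileURL:fileURL recordingDelegate:self];
```

The full program below fills in the remaining details (orientation handling, background saving, and so on).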
Compared with the photo program, the changes boil down to the three points above. To make the program more complete, the video-recording code below also handles details such as rotating the video with the screen, auto layout, and a background task for saving the file. Here is the modified program:

//
//  ViewController.m
//  AVFoundationCamera
//
//  Created by Kenshin Cui on 14/04/05.
//  Copyright (c) 2014 cmjstudio. All rights reserved.
//  Video recording

#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>
#import <AssetsLibrary/AssetsLibrary.h>

typedef void(^PropertyChangeBlock)(AVCaptureDevice *captureDevice);

@interface ViewController ()<AVCaptureFileOutputRecordingDelegate> // movie file output delegate

@property (strong,nonatomic) AVCaptureSession *captureSession; // passes data between the input and output devices
@property (strong,nonatomic) AVCaptureDeviceInput *captureDeviceInput; // obtains input data from an AVCaptureDevice
@property (strong,nonatomic) AVCaptureMovieFileOutput *captureMovieFileOutput; // movie file output
@property (strong,nonatomic) AVCaptureVideoPreviewLayer *captureVideoPreviewLayer; // camera preview layer
@property (assign,nonatomic) BOOL enableRotation; // whether rotation is allowed (rotation is disabled while recording)
@property (assign,nonatomic) CGRect lastBounds; // bounds before rotation (CGRect is a struct, not an object pointer)
@property (assign,nonatomic) UIBackgroundTaskIdentifier backgroundTaskIdentifier; // background task identifier
@property (weak, nonatomic) IBOutlet UIView *viewContainer;
@property (weak, nonatomic) IBOutlet UIButton *takeButton; // record button
@property (weak, nonatomic) IBOutlet UIImageView *focusCursor; // focus cursor

@end

@implementation ViewController

#pragma mark - View controller methods
- (void)viewDidLoad {
    [super viewDidLoad];
}

-(void)viewWillAppear:(BOOL)animated{
    [super viewWillAppear:animated];
    // Initialize the session
    _captureSession = [[AVCaptureSession alloc] init];
    if ([_captureSession canSetSessionPreset:AVCaptureSessionPreset1280x720]) { // set the resolution
        _captureSession.sessionPreset = AVCaptureSessionPreset1280x720;
    }
    // Get the input device (rear camera)
    AVCaptureDevice *captureDevice = [self getCameraDeviceWithPosition:AVCaptureDevicePositionBack];
    if (!captureDevice) {
        NSLog(@"Failed to get the rear camera.");
        return;
    }
    // Also grab an audio input device
    AVCaptureDevice *audioCaptureDevice = [[AVCaptureDevice devicesWithMediaType:AVMediaTypeAudio] firstObject];
    NSError *error = nil;
    // Create the device input object used to obtain input data
    _captureDeviceInput = [[AVCaptureDeviceInput
alloc] initWithDevice:captureDevice error:&error];
    if (error) {
        NSLog(@"Error creating device input: %@", error.localizedDescription);
        return;
    }
    AVCaptureDeviceInput *audioCaptureDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:audioCaptureDevice error:&error];
    if (error) {
        NSLog(@"Error creating audio device input: %@", error.localizedDescription);
        return;
    }
    // Create the output object used to obtain output data
    _captureMovieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
    // Add the device inputs to the session
    if ([_captureSession canAddInput:_captureDeviceInput]) {
        [_captureSession addInput:_captureDeviceInput];
        [_captureSession addInput:audioCaptureDeviceInput];
    }
    // Add the device output to the session
    if ([_captureSession canAddOutput:_captureMovieFileOutput]) {
        [_captureSession addOutput:_captureMovieFileOutput];
        // Note: the output's connections are only formed once the output has been added
        // to the session, so configure video stabilization here, after addOutput:
        AVCaptureConnection *captureConnection = [_captureMovieFileOutput connectionWithMediaType:AVMediaTypeVideo];
        if ([captureConnection isVideoStabilizationSupported]) {
            captureConnection.preferredVideoStabilizationMode = AVCaptureVideoStabilizationModeAuto;
        }
    }
    // Create the preview layer to show the camera's live feed
    _captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession];
    CALayer *layer = self.viewContainer.layer;
    layer.masksToBounds = YES;
    _captureVideoPreviewLayer.frame = layer.bounds;
    _captureVideoPreviewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; // fill mode
    // Insert the preview layer into the view, below the focus cursor
    //[layer addSublayer:_captureVideoPreviewLayer];
    [layer insertSublayer:_captureVideoPreviewLayer below:self.focusCursor.layer];

    _enableRotation = YES;
    [self addNotificationToCaptureDevice:captureDevice];
    [self addGenstureRecognizer];
}

-(void)viewDidAppear:(BOOL)animated{
    [super viewDidAppear:animated];
    [self.captureSession startRunning];
}

-(void)viewDidDisappear:(BOOL)animated{
    [super viewDidDisappear:animated];
    [self.captureSession stopRunning];
}

-(BOOL)shouldAutorotate{
    return self.enableRotation;
}

//// Alternative: adjust the preview layer's orientation on rotation via trait collections
//-(void)willTransitionToTraitCollection:(UITraitCollection *)newCollection withTransitionCoordinator:(id<UIViewControllerTransitionCoordinator>)coordinator{
//    [super willTransitionToTraitCollection:newCollection withTransitionCoordinator:coordinator];
//    UIInterfaceOrientation orientation = [[UIApplication sharedApplication] statusBarOrientation];
//    AVCaptureConnection *captureConnection = [self.captureVideoPreviewLayer connection];
//    captureConnection.videoOrientation = orientation;
//}

// Adjust the preview layer's orientation when the screen rotates
-(void)willRotateToInterfaceOrientation:(UIInterfaceOrientation)toInterfaceOrientation duration:(NSTimeInterval)duration{
    AVCaptureConnection *captureConnection = [self.captureVideoPreviewLayer connection];
    captureConnection.videoOrientation = (AVCaptureVideoOrientation)toInterfaceOrientation;
}

// Resize the preview layer after rotation
-(void)didRotateFromInterfaceOrientation:(UIInterfaceOrientation)fromInterfaceOrientation{
    _captureVideoPreviewLayer.frame = self.viewContainer.bounds;
}

-(void)dealloc{
    [self removeNotification];
}

#pragma mark - UI methods
#pragma mark Video recording
- (IBAction)takeButtonClick:(UIButton *)sender {
    // Get the connection from the movie file output
    AVCaptureConnection *captureConnection = [self.captureMovieFileOutput connectionWithMediaType:AVMediaTypeVideo];
    if (![self.captureMovieFileOutput isRecording]) {
        self.enableRotation = NO;
        // If multitasking is supported, begin a background task so saving can finish in the background
        if ([[UIDevice currentDevice] isMultitaskingSupported]) {
            self.backgroundTaskIdentifier = [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:nil];
        }
        // Keep the recorded video's orientation in sync with the preview layer
        captureConnection.videoOrientation = [self.captureVideoPreviewLayer connection].videoOrientation;
        NSString *outputFilePath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"myMovie.mov"];
        NSLog(@"save path is: %@", outputFilePath);
        NSURL *fileUrl = [NSURL fileURLWithPath:outputFilePath];
        [self.captureMovieFileOutput startRecordingToOutputFileURL:fileUrl recordingDelegate:self];
    } else {
        [self.captureMovieFileOutput stopRecording]; // stop recording
    }
}

#pragma mark Toggle front/back camera
- (IBAction)toggleButtonClick:(UIButton *)sender {
    AVCaptureDevice
*currentDevice = [self.captureDeviceInput device];
    AVCaptureDevicePosition currentPosition = [currentDevice position];
    [self removeNotificationFromCaptureDevice:currentDevice];
    AVCaptureDevice *toChangeDevice;
    AVCaptureDevicePosition toChangePosition = AVCaptureDevicePositionFront;
    if (currentPosition == AVCaptureDevicePositionUnspecified || currentPosition == AVCaptureDevicePositionFront) {
        toChangePosition = AVCaptureDevicePositionBack;
    }
    toChangeDevice = [self getCameraDeviceWithPosition:toChangePosition];
    [self addNotificationToCaptureDevice:toChangeDevice];
    // Create the new device input object
    AVCaptureDeviceInput *toChangeDeviceInput = [[AVCaptureDeviceInput alloc] initWithDevice:toChangeDevice error:nil];
    // Always call beginConfiguration before changing the session's configuration, and commitConfiguration when done
    [self.captureSession beginConfiguration];
    // Remove the old input
    [self.captureSession removeInput:self.captureDeviceInput];
    // Add the new input
    if ([self.captureSession canAddInput:toChangeDeviceInput]) {
        [self.captureSession addInput:toChangeDeviceInput];
        self.captureDeviceInput = toChangeDeviceInput;
    }
    // Commit the session configuration
    [self.captureSession commitConfiguration];
}

#pragma mark - Movie file output delegate
-(void)captureOutput:(AVCaptureFileOutput *)captureOutput didStartRecordingToOutputFileAtURL:(NSURL *)fileURL fromConnections:(NSArray *)connections{
    NSLog(@"Recording started...");
}

-(void)captureOutput:(AVCaptureFileOutput *)captureOutput didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL fromConnections:(NSArray *)connections error:(NSError *)error{
    NSLog(@"Recording finished.");
    // After recording completes, save the video to the photo album in the background
    self.enableRotation = YES;
    UIBackgroundTaskIdentifier lastBackgroundTaskIdentifier = self.backgroundTaskIdentifier;
    self.backgroundTaskIdentifier = UIBackgroundTaskInvalid;
    ALAssetsLibrary *assetsLibrary = [[ALAssetsLibrary alloc] init];
    [assetsLibrary writeVideoAtPathToSavedPhotosAlbum:outputFileURL completionBlock:^(NSURL *assetURL, NSError *error) {
        if (error) {
            NSLog(@"Error saving video to album: %@", error.localizedDescription);
        }
        if (lastBackgroundTaskIdentifier != UIBackgroundTaskInvalid) {
            [[UIApplication sharedApplication]
endBackgroundTask:lastBackgroundTaskIdentifier];
        }
        NSLog(@"Video saved to album.");
    }];
}

#pragma mark - Notifications
/**
 *  Add notifications for the input device
 */
-(void)addNotificationToCaptureDevice:(AVCaptureDevice *)captureDevice{
    // Note: subject-area-change notifications are only delivered if monitoring is enabled on the device first
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        captureDevice.subjectAreaChangeMonitoringEnabled = YES;
    }];
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    // Subject area changed
    [notificationCenter addObserver:self selector:@selector(areaChange:) name:AVCaptureDeviceSubjectAreaDidChangeNotification object:captureDevice];
}

-(void)removeNotificationFromCaptureDevice:(AVCaptureDevice *)captureDevice{
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    [notificationCenter removeObserver:self name:AVCaptureDeviceSubjectAreaDidChangeNotification object:captureDevice];
}
/**
 *  Remove all notifications
 */
-(void)removeNotification{
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    [notificationCenter removeObserver:self];
}

-(void)addNotificationToCaptureSession:(AVCaptureSession *)captureSession{
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    // Session runtime error
    [notificationCenter addObserver:self selector:@selector(sessionRuntimeError:) name:AVCaptureSessionRuntimeErrorNotification object:captureSession];
}
/**
 *  Device connected
 *
 *  @param notification notification object
 */
-(void)deviceConnected:(NSNotification *)notification{
    NSLog(@"Device connected...");
}
/**
 *  Device disconnected
 *
 *  @param notification notification object
 */
-(void)deviceDisconnected:(NSNotification *)notification{
    NSLog(@"Device disconnected.");
}
/**
 *  Subject area changed
 *
 *  @param notification notification object
 */
-(void)areaChange:(NSNotification *)notification{
    NSLog(@"Capture area changed...");
}
/**
 *  Session runtime error
 *
 *  @param notification notification object
 */
-(void)sessionRuntimeError:(NSNotification *)notification{
    NSLog(@"The session encountered a runtime error.");
}

#pragma mark - Private methods
/**
 *  Get the camera at the specified position
 *
 *  @param position camera position
 *
 *  @return camera device
 */
-(AVCaptureDevice *)getCameraDeviceWithPosition:(AVCaptureDevicePosition)position{
    NSArray *cameras = [AVCaptureDevice
devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *camera in cameras) {
        if ([camera position] == position) {
            return camera;
        }
    }
    return nil;
}
/**
 *  Unified helper for changing device properties
 *
 *  @param propertyChange block that performs the property change
 */
-(void)changeDeviceProperty:(PropertyChangeBlock)propertyChange{
    AVCaptureDevice *captureDevice = [self.captureDeviceInput device];
    NSError *error;
    // Note: always call lockForConfiguration: before changing device properties, and unlockForConfiguration afterwards
    if ([captureDevice lockForConfiguration:&error]) {
        propertyChange(captureDevice);
        [captureDevice unlockForConfiguration];
    } else {
        NSLog(@"Error while setting device properties: %@", error.localizedDescription);
    }
}
/**
 *  Set the flash mode
 *
 *  @param flashMode flash mode
 */
-(void)setFlashMode:(AVCaptureFlashMode)flashMode{
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isFlashModeSupported:flashMode]) {
            [captureDevice setFlashMode:flashMode];
        }
    }];
}
/**
 *  Set the focus mode
 *
 *  @param focusMode focus mode
 */
-(void)setFocusMode:(AVCaptureFocusMode)focusMode{
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isFocusModeSupported:focusMode]) {
            [captureDevice setFocusMode:focusMode];
        }
    }];
}
/**
 *  Set the exposure mode
 *
 *  @param exposureMode exposure mode
 */
-(void)setExposureMode:(AVCaptureExposureMode)exposureMode{
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isExposureModeSupported:exposureMode]) {
            [captureDevice setExposureMode:exposureMode];
        }
    }];
}
/**
 *  Focus and expose at the given point
 *  (note: apply the passed-in modes rather than hard-coded constants)
 *
 *  @param point focus point
 */
-(void)focusWithMode:(AVCaptureFocusMode)focusMode exposureMode:(AVCaptureExposureMode)exposureMode atPoint:(CGPoint)point{
    [self changeDeviceProperty:^(AVCaptureDevice *captureDevice) {
        if ([captureDevice isFocusModeSupported:focusMode]) {
            [captureDevice setFocusMode:focusMode];
        }
        if ([captureDevice isFocusPointOfInterestSupported]) {
            [captureDevice setFocusPointOfInterest:point];
        }
        if ([captureDevice isExposureModeSupported:exposureMode]) {
            [captureDevice setExposureMode:exposureMode];
        }
        if ([captureDevice
isExposurePointOfInterestSupported]) {
            [captureDevice setExposurePointOfInterest:point];
        }
    }];
}
/**
 *  Add a tap gesture; tapping focuses at the tapped point
 */
-(void)addGenstureRecognizer{
    UITapGestureRecognizer *tapGesture = [[UITapGestureRecognizer alloc] initWithTarget:self action:@selector(tapScreen:)];
    [self.viewContainer addGestureRecognizer:tapGesture];
}

-(void)tapScreen:(UITapGestureRecognizer *)tapGesture{
    CGPoint point = [tapGesture locationInView:self.viewContainer];
    // Convert UI coordinates to camera coordinates
    CGPoint cameraPoint = [self.captureVideoPreviewLayer captureDevicePointOfInterestForPoint:point];
    [self setFocusCursorWithPoint:point];
    [self focusWithMode:AVCaptureFocusModeAutoFocus exposureMode:AVCaptureExposureModeAutoExpose atPoint:cameraPoint];
}
/**
 *  Position and animate the focus cursor
 *
 *  @param point cursor position
 */
-(void)setFocusCursorWithPoint:(CGPoint)point{
    self.focusCursor.center = point;
    self.focusCursor.transform = CGAffineTransformMakeScale(1.5, 1.5);
    self.focusCursor.alpha = 1.0;
    [UIView animateWithDuration:1.0 animations:^{
        self.focusCursor.transform = CGAffineTransformIdentity;
    } completion:^(BOOL finished) {
        self.focusCursor.alpha = 0;
    }];
}
@end

Run result:
Summary

The sections above covered audio and video playback and recording in iOS at some length. In some places we used ready-made components (a player, a recorder) directly; in others we called the system services ourselves and built our own wrappers. As noted at the start of this article, iOS multimedia support is remarkably flexible and complete, so how should you choose among these technologies during development? The table below briefly compares the strengths and weaknesses of each.
|