
iOS: capturing images from the camera in real time

2022-05-24 11:22


This article shares working code for capturing images from the camera in real time on iOS; the details are below for your reference.

The requirement: keep the camera running for a set period of time and continuously receive the current frame data through a callback.

The implementation combines AVCaptureDevice, AVCaptureSession, and AVCaptureVideoPreviewLayer with UIView, UIImageView, and UIImage.
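Before any of this can work, the app needs camera permission: an NSCameraUsageDescription entry in Info.plist, plus an authorization check before starting the session. A minimal sketch (not part of the original code):

```objc
#import <AVFoundation/AVFoundation.h>

// Requires an NSCameraUsageDescription entry in Info.plist.
switch ([AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo]) {
    case AVAuthorizationStatusAuthorized:
        // Safe to create the camera view / start the capture session.
        break;
    case AVAuthorizationStatusNotDetermined:
        [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo
                                 completionHandler:^(BOOL granted) {
            if (granted) {
                // Start the capture session (dispatch back to the main queue for UI work).
            }
        }];
        break;
    default:
        // Denied or restricted: direct the user to Settings.
        break;
}
```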


The implementation code is as follows:


#import <UIKit/UIKit.h>
#import <CoreVideo/CoreVideo.h>
#import <CoreMedia/CoreMedia.h>
#import <AVFoundation/AVFoundation.h>
 
NS_ASSUME_NONNULL_BEGIN
 
@interface YHCameraView : UIView <AVCaptureVideoDataOutputSampleBufferDelegate>
 
@property (nonatomic, weak) UIImageView *cameraImageView; // assigned by the caller; the latest frame is pushed to it when set
@property (strong, nonatomic) AVCaptureDevice* device;
@property (strong, nonatomic) AVCaptureSession* captureSession;
@property (strong, nonatomic) AVCaptureVideoPreviewLayer* previewLayer;
@property (strong, nonatomic) UIImage* cameraImage;
 
@end
 
NS_ASSUME_NONNULL_END

#import "YHCameraView.h"
 
@implementation YHCameraView
 
- (instancetype)initWithFrame:(CGRect)frame {
    if (self = [super initWithFrame:frame]) {
        self.backgroundColor = [UIColor lightGrayColor];
        [self createUI];
    }
    return self;
}
 

 
- (void)createUI {
    // NOTE: devicesWithMediaType: has been deprecated since iOS 10;
    // AVCaptureDeviceDiscoverySession is the modern replacement.
    NSArray* devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if ([device position] == AVCaptureDevicePositionFront) { // front-facing camera
            self.device = device;
        }
    }
    
    AVCaptureDeviceInput* input = [AVCaptureDeviceInput deviceInputWithDevice:self.device error:nil];
    AVCaptureVideoDataOutput* output = [[AVCaptureVideoDataOutput alloc] init];
    output.alwaysDiscardsLateVideoFrames = YES;
 
    dispatch_queue_t queue;
    queue = dispatch_queue_create("cameraQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
 
    NSString* key = (NSString *) kCVPixelBufferPixelFormatTypeKey;
    NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [output setVideoSettings:videoSettings];
 
    self.captureSession = [[AVCaptureSession alloc] init];
    [self.captureSession addInput:input];
    [self.captureSession addOutput:output];
    [self.captureSession setSessionPreset:AVCaptureSessionPresetPhoto];
 
    self.previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
 
    // CHECK FOR YOUR APP
    self.previewLayer.frame = self.bounds;
    // The layer's own orientation property is deprecated; set it on the connection instead.
    self.previewLayer.connection.videoOrientation = AVCaptureVideoOrientationPortrait;
    // CHECK FOR YOUR APP
 
//    [self.layer insertSublayer:self.previewLayer atIndex:0];   // Comment-out to hide preview layer
 
    [self.captureSession startRunning];
}
 
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);
 
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef newImage = CGBitmapContextCreateImage(newContext);
 
    CGContextRelease(newContext);
    CGColorSpaceRelease(colorSpace);
 
    UIImage *image = [UIImage imageWithCGImage:newImage scale:1.0f orientation:UIImageOrientationLeftMirrored]; // UIImageOrientationDownMirrored for the back camera
    self.cameraImage = image;
    // This delegate callback runs on cameraQueue; UIKit may only be touched on the main thread.
    dispatch_async(dispatch_get_main_queue(), ^{
        self.cameraImageView.image = image;
    });
 
    CGImageRelease(newImage);
 
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
}
 
@end

After instantiating the view, simply read its cameraImage property whenever the current frame is needed:


#pragma mark - Snapshot capture
/// Snapshot capture
- (YHCameraView *)cameraView {
    if (!_cameraView) {
        YHCameraView *view = [[YHCameraView alloc] init];
        view.frame = CGRectMake(1, 1, 1, 1);
        view.cameraImageView.image = view.cameraImage;
        _cameraView = view;
    }
    return _cameraView;
}
 
NSString *strImg = [YHCameraManager imageBase64EncodedWithImage:self.cameraView.cameraImage
                                                   AndImageType:@"JPEG"]; // get the current frame as a base64 string
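The "capture for a set period" part of the requirement is not shown in the original snippets; one possible sketch, building on the lazy getter above (the 10-second duration and the stopCapture name are assumptions, not part of the original code):

```objc
// Assumption: run the camera for 10 seconds, then release it.
- (void)startTimedCapture {
    // startRunning is already called in createUI; calling it again is harmless.
    [self.cameraView.captureSession startRunning];
    [NSTimer scheduledTimerWithTimeInterval:10.0
                                     target:self
                                   selector:@selector(stopCapture)
                                   userInfo:nil
                                    repeats:NO];
}

- (void)stopCapture {
    [self.cameraView.captureSession stopRunning];
    // cameraImage still holds the last frame delivered before the session stopped.
    UIImage *lastFrame = self.cameraView.cameraImage;
    // use lastFrame ...
}
```

Note that Apple recommends calling startRunning/stopRunning off the main thread, since both can block while the session reconfigures.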


+ (NSString *)imageBase64EncodedWithImage:(UIImage *)img AndImageType:(NSString *)type {
    if (![img isKindOfClass:[UIImage class]]) {
        return nil;
    }
    NSData *data;
    if ([type isEqualToString:@"PNG"]) {
        data = UIImagePNGRepresentation(img);
    } else {
        data = UIImageJPEGRepresentation(img, 1.0f);
    }
    return [data base64EncodedStringWithOptions:NSDataBase64Encoding64CharacterLineLength];
}
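For completeness, the base64 string can be turned back into a UIImage. Since the encoder above inserts a line break every 64 characters, the decoder must be told to ignore non-alphabet characters (this round-trip snippet is an addition, not part of the original):

```objc
NSData *decoded = [[NSData alloc] initWithBase64EncodedString:strImg
                                                      options:NSDataBase64DecodingIgnoreUnknownCharacters];
UIImage *restored = decoded ? [UIImage imageWithData:decoded] : nil;
```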

That's all for this article; hopefully it's a helpful reference for your own work.
