Building Native Camera Access - Part III
The iOS port is a steep climb, unlike the Android version, which mapped almost directly to the native Camera Kit code.

Still, one of the advantages of iOS programming is the cleaner underlying API that often simplifies common use cases. Because of this I chose to skip 3rd party libraries and implement the functionality of Camera Kit directly on top of the native iOS APIs.
#import <Foundation/Foundation.h>
@interface com_codename1_camerakit_impl_CameraNativeAccessImpl :
NSObject {
}
-(void)start;
-(void)stop;
-(void)setVideoBitRate:(int)param;
-(int)getPreviewWidth;
-(BOOL)isStarted;
-(void)setMethod:(int)param;
-(void*)getView;
-(void)setPermissions:(int)param;
-(int)getFacing;
-(void)setZoom:(float)param;
-(int)toggleFacing;
-(int)getCaptureWidth;
-(float)getHorizontalViewingAngle;
-(void)setJpegQuality:(int)param;
-(void)stopVideo;
-(BOOL)isFacingBack;
-(int)getFlash;
-(void)captureImage;
-(int)getPreviewHeight;
-(void)captureVideoFile:(NSString*)param;
-(void)setLockVideoAspectRatio:(BOOL)param;
-(void)setFocus:(int)param;
com_codename1_camerakit_impl_CameraNativeAccessImpl.h
When we use Generate Native Stubs, the iOS stubs include two files: an .h file and an .m file. Let's review the .h file first; I'll go over what was generated in both before we begin.

This is a standard Objective-C import statement that pulls in the basic Apple Foundation API. Imports in Objective-C behave more like C includes than Java imports.
This is the class definition for the native interface. Notice that NSObject is the common base class here.
The method signatures should be pretty readable as they map directly to the Java equivalents
getView expects a peer component on the Java side; from this side we'll return a UIView instance.
#import "com_codename1_camerakit_impl_CameraNativeAccessImpl.h"
@implementation com_codename1_camerakit_impl_CameraNativeAccessImpl
-(void)start{}
-(void)stop{}
-(void)setVideoBitRate:(int)param{}
-(int)getPreviewWidth{
return 0;
}
-(BOOL)isStarted{
return NO;
}
-(void)setMethod:(int)param{}
-(void*)getView{
return nil;
}
-(void)setPermissions:(int)param{}
-(int)getFacing{
return 0;
}
-(void)setZoom:(float)param{}
-(int)toggleFacing{
return 0;
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
Once we have the header, the implementation code is more of the same. We import the header file we just reviewed, just like an include in C.
#import "com_codename1_camerakit_impl_CameraNativeAccessImpl.h"
@implementation com_codename1_camerakit_impl_CameraNativeAccessImpl
-(void)start{}
-(void)stop{}
-(void)setVideoBitRate:(int)param{}
-(int)getPreviewWidth{
return 0;
}
-(BOOL)isStarted{
return NO;
}
-(void)setMethod:(int)param{}
-(void*)getView{
return nil;
}
-(void)setPermissions:(int)param{}
-(int)getFacing{
return 0;
}
-(void)setZoom:(float)param{}
-(int)toggleFacing{
return 0;
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
Method implementations match their declarations, with curly braces wrapping the body.
#import "com_codename1_camerakit_impl_CameraNativeAccessImpl.h"
@implementation com_codename1_camerakit_impl_CameraNativeAccessImpl
-(void)start{}
-(void)stop{}
-(void)setVideoBitRate:(int)param{}
-(int)getPreviewWidth{
return 0;
}
-(BOOL)isStarted{
return NO;
}
-(void)setMethod:(int)param{}
-(void*)getView{
return nil;
}
-(void)setPermissions:(int)param{}
-(int)getFacing{
return 0;
}
-(void)setZoom:(float)param{}
-(int)toggleFacing{
return 0;
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
Parameter names are important in iOS. The label before each colon is part of the method signature (the selector), so changing it will break compilation.
#import "com_codename1_camerakit_impl_CameraNativeAccessImpl.h"
@implementation com_codename1_camerakit_impl_CameraNativeAccessImpl
-(void)start{}
-(void)stop{}
-(void)setVideoBitRate:(int)param{}
-(int)getPreviewWidth{
return 0;
}
-(BOOL)isStarted{
return NO;
}
-(void)setMethod:(int)param{}
-(void*)getView{
return nil;
}
-(void)setPermissions:(int)param{}
-(int)getFacing{
return 0;
}
-(void)setZoom:(float)param{}
-(int)toggleFacing{
return 0;
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
Most types are familiar, but some are slightly different; for example, boolean is BOOL and uses YES & NO instead of true & false.
#import "com_codename1_camerakit_impl_CameraNativeAccessImpl.h"
@implementation com_codename1_camerakit_impl_CameraNativeAccessImpl
-(void)start{}
-(void)stop{}
-(void)setVideoBitRate:(int)param{}
-(int)getPreviewWidth{
return 0;
}
-(BOOL)isStarted{
return NO;
}
-(void)setMethod:(int)param{}
-(void*)getView{
return nil;
}
-(void)setPermissions:(int)param{}
-(int)getFacing{
return 0;
}
-(void)setZoom:(float)param{}
-(int)toggleFacing{
return 0;
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
The same is true for nil instead of null
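To make that mapping concrete, here is a tiny illustrative snippet of my own (it isn't part of the generated stub):

#import <Foundation/Foundation.h>

// Hypothetical helper, purely to illustrate BOOL/YES/NO and nil
static BOOL isSessionReady(NSObject* session) {
    if(session == nil) {   // nil instead of null
        return NO;         // NO instead of false
    }
    return YES;            // YES instead of true
}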
#import <AVFoundation/AVFoundation.h>
const int FACING_BACK = 0;
const int FACING_FRONT = 1;
const int FLASH_OFF = 0;
const int FLASH_ON = 1;
const int FLASH_AUTO = 2;
const int FLASH_TORCH = 3;
const int FOCUS_OFF = 0;
const int FOCUS_CONTINUOUS = 1;
const int FOCUS_TAP = 2;
const int FOCUS_TAP_WITH_MARKER = 3;
const int METHOD_STANDARD = 0;
const int METHOD_STILL = 1;
const int VIDEO_QUALITY_480P = 0;
const int VIDEO_QUALITY_720P = 1;
const int VIDEO_QUALITY_1080P = 2;
const int VIDEO_QUALITY_2160P = 3;
const int VIDEO_QUALITY_HIGHEST = 4;
const int VIDEO_QUALITY_LOWEST = 5;
const int VIDEO_QUALITY_QVGA = 6;
com_codename1_camerakit_impl_CameraNativeAccessImpl.h
We'll start by bringing in the constants from the Java Constants interface. In the Android implementation we could ignore their values on the native side because the underlying library already used the exact same values; we don't have that privilege here. I'll also add an import for the native AVFoundation (Audio Video Foundation) framework, which is the iOS API for media.

These are copied directly from the Java code with the public static final portion replaced by const. This will make coding the rest easier.
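For example, the FACING_BACK constant translates roughly like this (the Java line illustrates the pattern rather than quoting the exact source):

// Java: public static final int FACING_BACK = 0;
// Objective-C:
const int FACING_BACK = 0;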
BOOL firstTimeCameraKitLaunch = YES;

@implementation com_codename1_camerakit_impl_CameraNativeAccessImpl

-(void)start {
    if(firstTimeCameraKitLaunch) {
        direction = FACING_BACK;
        flash = FLASH_OFF;
        focus = FOCUS_CONTINUOUS;
        method = METHOD_STANDARD;
        videoQuality = VIDEO_QUALITY_480P;
        previewLayer = nil;
        device = nil;
        photoOutput = nil;
        captureSession = nil;
        stillImageOutput = nil;
        firstTimeCameraKitLaunch = NO;
        zoom = 1;
        [self lazyInit];
    } else {
        dispatch_sync(dispatch_get_main_queue(), ^{
            [captureSession startRunning];
        });
    }
}

com_codename1_camerakit_impl_CameraNativeAccessImpl.m
So now that we have a sense of the scope, let's start implementing the important methods one by one. The natural place to start is the start method.

For this to work we first need to define the firstTimeCameraKitLaunch variable.
The first time start is invoked we initialize the various state variables to the same default values we use in the Android version.
This method initializes the camera view the first time around. Notice that self is the Objective-C equivalent of this.
dispatch_sync is the iOS equivalent of callSeriallyAndWait: we want the block to execute on the native iOS main thread, and we want to wait until it's finished.
The capture session is stopped in the stop call, so if this isn't the first time around we just need to restart the capture session.
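The stop method itself isn't shown on this slide; a minimal sketch of what it could look like, assuming it only needs to halt the session on the native thread, would be:

-(void)stop {
    // Halt the running capture session; start will call startRunning again later
    dispatch_sync(dispatch_get_main_queue(), ^{
        if(captureSession != nil) {
            [captureSession stopRunning];
        }
    });
}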
#import "CameraKitView.h"
@interface com_codename1_camerakit_impl_CameraNativeAccessImpl :
NSObject {
int direction;
int flash;
int focus;
int method;
int videoQuality;
float zoom;
BOOL authorized;
BOOL capturingVideo;
CameraKitView* container;
AVCaptureDevice* device;
AVCaptureSession* captureSession;
AVCaptureVideoPreviewLayer* previewLayer;
AVCaptureMovieFileOutput* movieOutput;
AVCapturePhotoOutput* photoOutput;
AVCaptureStillImageOutput* stillImageOutput;
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.h
Before we proceed to the lazyInit method, let's look at the variables we added to the header file.

The direction the camera is facing: front or back.
#import "CameraKitView.h"
@interface com_codename1_camerakit_impl_CameraNativeAccessImpl :
NSObject {
int direction;
int flash;
int focus;
int method;
int videoQuality;
float zoom;
BOOL authorized;
BOOL capturingVideo;
CameraKitView* container;
AVCaptureDevice* device;
AVCaptureSession* captureSession;
AVCaptureVideoPreviewLayer* previewLayer;
AVCaptureMovieFileOutput* movieOutput;
AVCapturePhotoOutput* photoOutput;
AVCaptureStillImageOutput* stillImageOutput;
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.h
Whether flash is on/off or auto-flash mode
#import "CameraKitView.h"
@interface com_codename1_camerakit_impl_CameraNativeAccessImpl :
NSObject {
int direction;
int flash;
int focus;
int method;
int videoQuality;
float zoom;
BOOL authorized;
BOOL capturingVideo;
CameraKitView* container;
AVCaptureDevice* device;
AVCaptureSession* captureSession;
AVCaptureVideoPreviewLayer* previewLayer;
AVCaptureMovieFileOutput* movieOutput;
AVCapturePhotoOutput* photoOutput;
AVCaptureStillImageOutput* stillImageOutput;
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.h
Focus can be based on point or automatic
#import "CameraKitView.h"
@interface com_codename1_camerakit_impl_CameraNativeAccessImpl :
NSObject {
int direction;
int flash;
int focus;
int method;
int videoQuality;
float zoom;
BOOL authorized;
BOOL capturingVideo;
CameraKitView* container;
AVCaptureDevice* device;
AVCaptureSession* captureSession;
AVCaptureVideoPreviewLayer* previewLayer;
AVCaptureMovieFileOutput* movieOutput;
AVCapturePhotoOutput* photoOutput;
AVCaptureStillImageOutput* stillImageOutput;
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.h
Allows for several modes of capture; I didn't implement this for now.
#import "CameraKitView.h"
@interface com_codename1_camerakit_impl_CameraNativeAccessImpl :
NSObject {
int direction;
int flash;
int focus;
int method;
int videoQuality;
float zoom;
BOOL authorized;
BOOL capturingVideo;
CameraKitView* container;
AVCaptureDevice* device;
AVCaptureSession* captureSession;
AVCaptureVideoPreviewLayer* previewLayer;
AVCaptureMovieFileOutput* movieOutput;
AVCapturePhotoOutput* photoOutput;
AVCaptureStillImageOutput* stillImageOutput;
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.h
A set of constants indicating the resolution for recorded video
#import "CameraKitView.h"
@interface com_codename1_camerakit_impl_CameraNativeAccessImpl :
NSObject {
int direction;
int flash;
int focus;
int method;
int videoQuality;
float zoom;
BOOL authorized;
BOOL capturingVideo;
CameraKitView* container;
AVCaptureDevice* device;
AVCaptureSession* captureSession;
AVCaptureVideoPreviewLayer* previewLayer;
AVCaptureMovieFileOutput* movieOutput;
AVCapturePhotoOutput* photoOutput;
AVCaptureStillImageOutput* stillImageOutput;
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.h
Current camera zoom value
#import "CameraKitView.h"
@interface com_codename1_camerakit_impl_CameraNativeAccessImpl :
NSObject {
int direction;
int flash;
int focus;
int method;
int videoQuality;
float zoom;
BOOL authorized;
BOOL capturingVideo;
CameraKitView* container;
AVCaptureDevice* device;
AVCaptureSession* captureSession;
AVCaptureVideoPreviewLayer* previewLayer;
AVCaptureMovieFileOutput* movieOutput;
AVCapturePhotoOutput* photoOutput;
AVCaptureStillImageOutput* stillImageOutput;
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.h
I store YES here if the app was given permission to access the camera
#import "CameraKitView.h"
@interface com_codename1_camerakit_impl_CameraNativeAccessImpl :
NSObject {
int direction;
int flash;
int focus;
int method;
int videoQuality;
float zoom;
BOOL authorized;
BOOL capturingVideo;
CameraKitView* container;
AVCaptureDevice* device;
AVCaptureSession* captureSession;
AVCaptureVideoPreviewLayer* previewLayer;
AVCaptureMovieFileOutput* movieOutput;
AVCapturePhotoOutput* photoOutput;
AVCaptureStillImageOutput* stillImageOutput;
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.h
Since some of the callbacks for video and photo might be similar, I set this flag to indicate what I'm currently capturing.
#import "CameraKitView.h"
@interface com_codename1_camerakit_impl_CameraNativeAccessImpl :
NSObject {
int direction;
int flash;
int focus;
int method;
int videoQuality;
float zoom;
BOOL authorized;
BOOL capturingVideo;
CameraKitView* container;
AVCaptureDevice* device;
AVCaptureSession* captureSession;
AVCaptureVideoPreviewLayer* previewLayer;
AVCaptureMovieFileOutput* movieOutput;
AVCapturePhotoOutput* photoOutput;
AVCaptureStillImageOutput* stillImageOutput;
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.h
This is the actual UI element we will see on the screen. A UIView is the iOS parallel of a Component; I'll discuss this soon.
#import "CameraKitView.h"
@interface com_codename1_camerakit_impl_CameraNativeAccessImpl :
NSObject {
int direction;
int flash;
int focus;
int method;
int videoQuality;
float zoom;
BOOL authorized;
BOOL capturingVideo;
CameraKitView* container;
AVCaptureDevice* device;
AVCaptureSession* captureSession;
AVCaptureVideoPreviewLayer* previewLayer;
AVCaptureMovieFileOutput* movieOutput;
AVCapturePhotoOutput* photoOutput;
AVCaptureStillImageOutput* stillImageOutput;
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.h
Notice that I imported the CameraKitView class here. It’s a class I added and I’ll cover it soon…
#import "CameraKitView.h"
@interface com_codename1_camerakit_impl_CameraNativeAccessImpl :
NSObject {
int direction;
int flash;
int focus;
int method;
int videoQuality;
float zoom;
BOOL authorized;
BOOL capturingVideo;
CameraKitView* container;
AVCaptureDevice* device;
AVCaptureSession* captureSession;
AVCaptureVideoPreviewLayer* previewLayer;
AVCaptureMovieFileOutput* movieOutput;
AVCapturePhotoOutput* photoOutput;
AVCaptureStillImageOutput* stillImageOutput;
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.h
This is the native capture device representing the camera. A different device instance is used when we flip between the back & front cameras
#import "CameraKitView.h"
@interface com_codename1_camerakit_impl_CameraNativeAccessImpl :
NSObject {
int direction;
int flash;
int focus;
int method;
int videoQuality;
float zoom;
BOOL authorized;
BOOL capturingVideo;
CameraKitView* container;
AVCaptureDevice* device;
AVCaptureSession* captureSession;
AVCaptureVideoPreviewLayer* previewLayer;
AVCaptureMovieFileOutput* movieOutput;
AVCapturePhotoOutput* photoOutput;
AVCaptureStillImageOutput* stillImageOutput;
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.h
A session encapsulates the capture process; we acquire access to the camera with a session and relinquish it in stop.
#import "CameraKitView.h"
@interface com_codename1_camerakit_impl_CameraNativeAccessImpl :
NSObject {
int direction;
int flash;
int focus;
int method;
int videoQuality;
float zoom;
BOOL authorized;
BOOL capturingVideo;
CameraKitView* container;
AVCaptureDevice* device;
AVCaptureSession* captureSession;
AVCaptureVideoPreviewLayer* previewLayer;
AVCaptureMovieFileOutput* movieOutput;
AVCapturePhotoOutput* photoOutput;
AVCaptureStillImageOutput* stillImageOutput;
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.h
The preview layer is where the video from the camera is drawn. It's a CALayer, a graphics surface that we can assign to a UIView; I'll discuss this when covering CameraKitView.
#import "CameraKitView.h"
@interface com_codename1_camerakit_impl_CameraNativeAccessImpl :
NSObject {
int direction;
int flash;
int focus;
int method;
int videoQuality;
float zoom;
BOOL authorized;
BOOL capturingVideo;
CameraKitView* container;
AVCaptureDevice* device;
AVCaptureSession* captureSession;
AVCaptureVideoPreviewLayer* previewLayer;
AVCaptureMovieFileOutput* movieOutput;
AVCapturePhotoOutput* photoOutput;
AVCaptureStillImageOutput* stillImageOutput;
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.h
This class is responsible for capturing a movie and saving it to a file
#import "CameraKitView.h"
@interface com_codename1_camerakit_impl_CameraNativeAccessImpl :
NSObject {
int direction;
int flash;
int focus;
int method;
int videoQuality;
float zoom;
BOOL authorized;
BOOL capturingVideo;
CameraKitView* container;
AVCaptureDevice* device;
AVCaptureSession* captureSession;
AVCaptureVideoPreviewLayer* previewLayer;
AVCaptureMovieFileOutput* movieOutput;
AVCapturePhotoOutput* photoOutput;
AVCaptureStillImageOutput* stillImageOutput;
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.h
The last two entries handle photos: the former (AVCapturePhotoOutput) works on iOS 10 and newer devices, while the latter (AVCaptureStillImageOutput) works on older devices/OS versions.

That's a lot to digest, but we are just getting started…
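We'll see the same "does this class exist" trick used below for device discovery; a hedged sketch of how the photo output side could be picked (my assumption, not code from the slides) looks like this:

// Sketch: choose the photo API based on what the OS provides
if([AVCapturePhotoOutput class]) {
    photoOutput = [[AVCapturePhotoOutput alloc] init];           // iOS 10 and newer
} else {
    stillImageOutput = [[AVCaptureStillImageOutput alloc] init]; // older devices/OS
}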
-(void)lazyInit {
    dispatch_sync(dispatch_get_main_queue(), ^{
        container = [[CameraKitView alloc] init];
        switch ([AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo]) {
            case AVAuthorizationStatusNotDetermined:
                [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo
                    completionHandler:^(BOOL granted) {
                        if(!granted) {
                            authorized = NO;
                            return;
                        }
                        authorized = YES;
                        [self lazyInitPostAuthorization];
                    }];
                break;
            case AVAuthorizationStatusDenied:
            case AVAuthorizationStatusRestricted:
                authorized = NO;
                break;
            case AVAuthorizationStatusAuthorized:
                authorized = YES;
                [self lazyInitPostAuthorization];
                break;
        }
    });
}

com_codename1_camerakit_impl_CameraNativeAccessImpl.m
Let's move right into the lazyInit method. When I started with the code I thought the camera would initialize lazily, but that didn't fit the rest of the API, so I abandoned that approach and initialize on start. I didn't bother changing the name since it isn't user visible anyway.

The content of the dispatch_sync block runs on the native iOS thread synchronously; the method won't return until that code has finished.
This is the Objective-C equivalent of new in Java: we allocate the object and invoke its init method, which acts as a sort of constructor.
We ask AVCaptureDevice whether we have permission to use the camera; this can result in one of four outcomes.
Not determined means we need to ask for permission
So we ask, which should prompt the user with a permission dialog they can accept or reject.
If the user accepted, we move on to the second phase in lazyInitPostAuthorization.
If permission was denied in some way there isn't much we can do... The user will see a blank view
If authorization was already granted previously, we move on. Notice that for this to work we need the ios.NSCameraUsageDescription build hint I discussed before; without that build hint, permission is denied automatically.
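As a reminder, the build hint is just a key/value pair in the project's build hints; the message text below is only an example:

ios.NSCameraUsageDescription=This app uses the camera to show a live preview and capture photos and video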
-(void)lazyInitPostAuthorization {
    if([AVCaptureDeviceDiscoverySession class]) {
        if(direction == FACING_FRONT) {
            device = [AVCaptureDevice
                defaultDeviceWithDeviceType:AVCaptureDeviceTypeBuiltInWideAngleCamera
                mediaType:AVMediaTypeVideo
                position:AVCaptureDevicePositionFront];
        } else {
            device = [AVCaptureDevice
                defaultDeviceWithDeviceType:AVCaptureDeviceTypeBuiltInWideAngleCamera
                mediaType:AVMediaTypeVideo
                position:AVCaptureDevicePositionBack];
        }
    } else {
        if(direction == FACING_FRONT) {
            for(AVCaptureDevice* d in [AVCaptureDevice devices]) {
                if(d.position == AVCaptureDevicePositionFront) {
                    device = d;
                    break;
                }

com_codename1_camerakit_impl_CameraNativeAccessImpl.m
Let's move on to the lazyInitPostAuthorization method. This is a complex method, so I'll divide it into two parts for simplicity. The first part deals with detecting the "device", meaning picking the right camera.

This checks whether a specific class exists: iOS 10 deprecated an API and introduced a new one, so if the new API doesn't exist we fall back to the old one.
If we reached this block we are running on iOS 10 or newer.
You will notice that getting a device in iOS 10 is a single method call, with the only difference being the position argument value. Notice that Objective-C method invocations use the argument names as part of the invocation.

Also notice that I refer to Objective-C messages as methods. There is a difference between the two, but it's not something you need to understand as a casual Objective-C user.
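To make that concrete, the full name (selector) of the call above is defaultDeviceWithDeviceType:mediaType:position:; a Java-style rendering, which is purely hypothetical and shown only for comparison, would look something like:

// Hypothetical Java-style call, for comparison only:
// device = AVCaptureDevice.defaultDevice(WIDE_ANGLE_CAMERA, VIDEO, FRONT);
// Objective-C: the argument labels are part of the method name
device = [AVCaptureDevice defaultDeviceWithDeviceType:AVCaptureDeviceTypeBuiltInWideAngleCamera
                                            mediaType:AVMediaTypeVideo
                                             position:AVCaptureDevicePositionFront];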
                mediaType:AVMediaTypeVideo
                position:AVCaptureDevicePositionBack];
        }
    } else {
        if(direction == FACING_FRONT) {
            for(AVCaptureDevice* d in [AVCaptureDevice devices]) {
                if(d.position == AVCaptureDevicePositionFront) {
                    device = d;
                    break;
                }
            }
        } else {
            for(AVCaptureDevice* d in [AVCaptureDevice devices]) {
                if(d.position == AVCaptureDevicePositionBack) {
                    device = d;
                    break;
                }
            }
        }
    }
    // ... common device code ...
}

com_codename1_camerakit_impl_CameraNativeAccessImpl.m
This code runs on a device with an OS prior to iOS 10; here we loop over all the devices exposed by AVCaptureDevice.
If a device is in the right position we update the device value and exit the loop
NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
captureSession = [[AVCaptureSession alloc] init];
[captureSession addInput:input];
previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
[container setLayer:previewLayer];
[container.layer addSublayer:previewLayer];
[self updateFlash];
[self updateZoom];
[self updateFocus];
[self updateVideoQuality];
[captureSession startRunning];

com_codename1_camerakit_impl_CameraNativeAccessImpl.m
The bottom portion of the method is common to both the iOS 10+ path and the older one.

Objective-C APIs often accept a pointer to an error variable which they assign in case of an error. I didn't check for the error here, which I should have.
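A minimal sketch of what that missing check could look like (my addition, not code from the slides):

NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if(input == nil) {
    // Something went wrong (e.g. the camera is unavailable); log and bail out
    NSLog(@"Failed to create camera input: %@", error);
    return;
}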
The input object can be created from the device; we need it to set up the session and don't need it after that at this time.
We allocate a new capture session and add the input to it.
The preview layer shows the session's video in our view. It's a CALayer, which we can't add directly to the screen on its own.
setLayer is a method I added to CameraKitView; I'll discuss it when covering CameraKitView.
This is how you show a CALayer within a UIView
These methods allow us to keep common code between the initialization path and the setter methods. That means a call to setFlash will trigger updateFlash internally; I'll cover all four methods soon.
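For example, setZoom would presumably follow this pattern (a sketch of the idea; the real updateZoom body is covered later):

-(void)setZoom:(float)param {
    zoom = param;       // remember the requested zoom level
    [self updateZoom];  // apply it to the current device
}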
The final line starts the capture session. This seems like a lot, and it is a lot. We went through the "heavy lifting" portion of the code and, as you can see, it might not be trivial but it isn't hard either. I didn't know half of these methods when I started out, but that's the great thing about being a programmer in this day and age: we can google it.

20240605 QFM017 Machine Intelligence Reading List May 2024
 
How to Get CNIC Information System with Paksim Ga.pptx
How to Get CNIC Information System with Paksim Ga.pptxHow to Get CNIC Information System with Paksim Ga.pptx
How to Get CNIC Information System with Paksim Ga.pptx
 
A tale of scale & speed: How the US Navy is enabling software delivery from l...
A tale of scale & speed: How the US Navy is enabling software delivery from l...A tale of scale & speed: How the US Navy is enabling software delivery from l...
A tale of scale & speed: How the US Navy is enabling software delivery from l...
 
GraphSummit Singapore | The Future of Agility: Supercharging Digital Transfor...
GraphSummit Singapore | The Future of Agility: Supercharging Digital Transfor...GraphSummit Singapore | The Future of Agility: Supercharging Digital Transfor...
GraphSummit Singapore | The Future of Agility: Supercharging Digital Transfor...
 
UiPath Test Automation using UiPath Test Suite series, part 6
UiPath Test Automation using UiPath Test Suite series, part 6UiPath Test Automation using UiPath Test Suite series, part 6
UiPath Test Automation using UiPath Test Suite series, part 6
 
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...
 
UiPath Test Automation using UiPath Test Suite series, part 5
UiPath Test Automation using UiPath Test Suite series, part 5UiPath Test Automation using UiPath Test Suite series, part 5
UiPath Test Automation using UiPath Test Suite series, part 5
 
Enchancing adoption of Open Source Libraries. A case study on Albumentations.AI
Enchancing adoption of Open Source Libraries. A case study on Albumentations.AIEnchancing adoption of Open Source Libraries. A case study on Albumentations.AI
Enchancing adoption of Open Source Libraries. A case study on Albumentations.AI
 
20240607 QFM018 Elixir Reading List May 2024
20240607 QFM018 Elixir Reading List May 202420240607 QFM018 Elixir Reading List May 2024
20240607 QFM018 Elixir Reading List May 2024
 
Uni Systems Copilot event_05062024_C.Vlachos.pdf
Uni Systems Copilot event_05062024_C.Vlachos.pdfUni Systems Copilot event_05062024_C.Vlachos.pdf
Uni Systems Copilot event_05062024_C.Vlachos.pdf
 
Securing your Kubernetes cluster_ a step-by-step guide to success !
Securing your Kubernetes cluster_ a step-by-step guide to success !Securing your Kubernetes cluster_ a step-by-step guide to success !
Securing your Kubernetes cluster_ a step-by-step guide to success !
 
PCI PIN Basics Webinar from the Controlcase Team
PCI PIN Basics Webinar from the Controlcase TeamPCI PIN Basics Webinar from the Controlcase Team
PCI PIN Basics Webinar from the Controlcase Team
 

Building a Native Camera Access Library - Part III - Transcript.pdf

  • 1. Building Native Camera Access - Part III The iOS port is a steep climb. Unlike the Android version which maps almost directly to the native code. Still, one of the advantages in iOS programming is the cleaner underlying API that often simplifies common use cases. Due to this I chose to skip 3rd party libraries and try to implement the functionality of Camera Kit directly on the native iOS API's.
  • 2. #import <Foundation/Foundation.h> @interface com_codename1_camerakit_impl_CameraNativeAccessImpl : NSObject { } -(void)start; -(void)stop; -(void)setVideoBitRate:(int)param; -(int)getPreviewWidth; -(BOOL)isStarted; -(void)setMethod:(int)param; -(void*)getView; -(void)setPermissions:(int)param; -(int)getFacing; -(void)setZoom:(float)param; -(int)toggleFacing; -(int)getCaptureWidth; -(float)getHorizontalViewingAngle; -(void)setJpegQuality:(int)param; -(void)stopVideo; -(BOOL)isFacingBack; -(int)getFlash; -(void)captureImage; -(int)getPreviewHeight; -(void)captureVideoFile:(NSString*)param; -(void)setLockVideoAspectRatio:(BOOL)param; -(void)setFocus:(int)param; com_codename1_camerakit_impl_CameraNativeAccessImpl.h When we use Generate Native Stubs the iOS stubs include two files an h and an m file. Lets review the h file first, I'll look at what was generated in both before we begin. This is a standard objective-c import statement that adds the basic Apple iOS API, imports in Objective-C are more like C includes than Java imports.
  • 3. #import <Foundation/Foundation.h> @interface com_codename1_camerakit_impl_CameraNativeAccessImpl : NSObject { } -(void)start; -(void)stop; -(void)setVideoBitRate:(int)param; -(int)getPreviewWidth; -(BOOL)isStarted; -(void)setMethod:(int)param; -(void*)getView; -(void)setPermissions:(int)param; -(int)getFacing; -(void)setZoom:(float)param; -(int)toggleFacing; -(int)getCaptureWidth; -(float)getHorizontalViewingAngle; -(void)setJpegQuality:(int)param; -(void)stopVideo; -(BOOL)isFacingBack; -(int)getFlash; -(void)captureImage; -(int)getPreviewHeight; -(void)captureVideoFile:(NSString*)param; -(void)setLockVideoAspectRatio:(BOOL)param; -(void)setFocus:(int)param; com_codename1_camerakit_impl_CameraNativeAccessImpl.h This is the class definition for the native interface notice that NSObject is the common base class here
  • 4. #import <Foundation/Foundation.h> @interface com_codename1_camerakit_impl_CameraNativeAccessImpl : NSObject { } -(void)start; -(void)stop; -(void)setVideoBitRate:(int)param; -(int)getPreviewWidth; -(BOOL)isStarted; -(void)setMethod:(int)param; -(void*)getView; -(void)setPermissions:(int)param; -(int)getFacing; -(void)setZoom:(float)param; -(int)toggleFacing; -(int)getCaptureWidth; -(float)getHorizontalViewingAngle; -(void)setJpegQuality:(int)param; -(void)stopVideo; -(BOOL)isFacingBack; -(int)getFlash; -(void)captureImage; -(int)getPreviewHeight; -(void)captureVideoFile:(NSString*)param; -(void)setLockVideoAspectRatio:(BOOL)param; -(void)setFocus:(int)param; com_codename1_camerakit_impl_CameraNativeAccessImpl.h The method signatures should be pretty readable as they map directly to the Java equivalents
  • 5. #import <Foundation/Foundation.h> @interface com_codename1_camerakit_impl_CameraNativeAccessImpl : NSObject { } -(void)start; -(void)stop; -(void)setVideoBitRate:(int)param; -(int)getPreviewWidth; -(BOOL)isStarted; -(void)setMethod:(int)param; -(void*)getView; -(void)setPermissions:(int)param; -(int)getFacing; -(void)setZoom:(float)param; -(int)toggleFacing; -(int)getCaptureWidth; -(float)getHorizontalViewingAngle; -(void)setJpegQuality:(int)param; -(void)stopVideo; -(BOOL)isFacingBack; -(int)getFlash; -(void)captureImage; -(int)getPreviewHeight; -(void)captureVideoFile:(NSString*)param; -(void)setLockVideoAspectRatio:(BOOL)param; -(void)setFocus:(int)param; com_codename1_camerakit_impl_CameraNativeAccessImpl.h getView expects a peer component on the Java side, from this side we'll return a UIView instance
  • 6. #import "com_codename1_camerakit_impl_CameraNativeAccessImpl.h" @implementation com_codename1_camerakit_impl_CameraNativeAccessImpl -(void)start{} -(void)stop{} -(void)setVideoBitRate:(int)param{} -(int)getPreviewWidth{ return 0; } -(BOOL)isStarted{ return NO; } -(void)setMethod:(int)param{} -(void*)getView{ return nil; } -(void)setPermissions:(int)param{} -(int)getFacing{ return 0; } -(void)setZoom:(float)param{} -(int)toggleFacing{ return 0; } com_codename1_camerakit_impl_CameraNativeAccessImpl.m Once we get this the implementation code is more of the same. We import the header file we just reviewed just like includes in C.
  • 7. #import "com_codename1_camerakit_impl_CameraNativeAccessImpl.h" @implementation com_codename1_camerakit_impl_CameraNativeAccessImpl -(void)start{} -(void)stop{} -(void)setVideoBitRate:(int)param{} -(int)getPreviewWidth{ return 0; } -(BOOL)isStarted{ return NO; } -(void)setMethod:(int)param{} -(void*)getView{ return nil; } -(void)setPermissions:(int)param{} -(int)getFacing{ return 0; } -(void)setZoom:(float)param{} -(int)toggleFacing{ return 0; } com_codename1_camerakit_impl_CameraNativeAccessImpl.m Method implementations match their declaration with curly braces for the implementation
  • 8. #import "com_codename1_camerakit_impl_CameraNativeAccessImpl.h" @implementation com_codename1_camerakit_impl_CameraNativeAccessImpl -(void)start{} -(void)stop{} -(void)setVideoBitRate:(int)param{} -(int)getPreviewWidth{ return 0; } -(BOOL)isStarted{ return NO; } -(void)setMethod:(int)param{} -(void*)getView{ return nil; } -(void)setPermissions:(int)param{} -(int)getFacing{ return 0; } -(void)setZoom:(float)param{} -(int)toggleFacing{ return 0; } com_codename1_camerakit_impl_CameraNativeAccessImpl.m Parameter names are important in iOS. If you change the name of the parameter you will break compilation as the name is a part of the method signature
  • 9. #import "com_codename1_camerakit_impl_CameraNativeAccessImpl.h" @implementation com_codename1_camerakit_impl_CameraNativeAccessImpl -(void)start{} -(void)stop{} -(void)setVideoBitRate:(int)param{} -(int)getPreviewWidth{ return 0; } -(BOOL)isStarted{ return NO; } -(void)setMethod:(int)param{} -(void*)getView{ return nil; } -(void)setPermissions:(int)param{} -(int)getFacing{ return 0; } -(void)setZoom:(float)param{} -(int)toggleFacing{ return 0; } com_codename1_camerakit_impl_CameraNativeAccessImpl.m Most types are familiar but some like boolean use a slightly different syntax with YES & NO instead of true & false
  • 10. #import "com_codename1_camerakit_impl_CameraNativeAccessImpl.h" @implementation com_codename1_camerakit_impl_CameraNativeAccessImpl -(void)start{} -(void)stop{} -(void)setVideoBitRate:(int)param{} -(int)getPreviewWidth{ return 0; } -(BOOL)isStarted{ return NO; } -(void)setMethod:(int)param{} -(void*)getView{ return nil; } -(void)setPermissions:(int)param{} -(int)getFacing{ return 0; } -(void)setZoom:(float)param{} -(int)toggleFacing{ return 0; } com_codename1_camerakit_impl_CameraNativeAccessImpl.m The same is true for nil instead of null
  • 11. #import <AVFoundation/AVFoundation.h> const int FACING_BACK = 0; const int FACING_FRONT = 1; const int FLASH_OFF = 0; const int FLASH_ON = 1; const int FLASH_AUTO = 2; const int FLASH_TORCH = 3; const int FOCUS_OFF = 0; const int FOCUS_CONTINUOUS = 1; const int FOCUS_TAP = 2; const int FOCUS_TAP_WITH_MARKER = 3; const int METHOD_STANDARD = 0; const int METHOD_STILL = 1; const int VIDEO_QUALITY_480P = 0; const int VIDEO_QUALITY_720P = 1; const int VIDEO_QUALITY_1080P = 2; const int VIDEO_QUALITY_2160P = 3; const int VIDEO_QUALITY_HIGHEST = 4; const int VIDEO_QUALITY_LOWEST = 5; const int VIDEO_QUALITY_QVGA = 6; com_codename1_camerakit_impl_CameraNativeAccessImpl.h We'll start by bringing in the constants from the Java Constants interface. In the Android port we could ignore their values on the native side because the native implementation already used the exact same values; we don't have that luxury here. I'll also add an import for the native AVFoundation (Audio Video Foundation) framework, which is the iOS API for media. These are copied directly from the Java code with the public static final portion replaced by const. This will make coding the rest easier.
  • 12. BOOL firstTimeCameraKitLaunch = YES; @implementation com_codename1_camerakit_impl_CameraNativeAccessImpl -(void)start{ if(firstTimeCameraKitLaunch) { direction = FACING_BACK; flash = FLASH_OFF; focus = FOCUS_CONTINUOUS; method = METHOD_STANDARD; videoQuality = VIDEO_QUALITY_480P; previewLayer = nil; device = nil; photoOutput = nil; captureSession = nil; stillImageOutput = nil; firstTimeCameraKitLaunch = NO; zoom = 1; [self lazyInit]; } else { dispatch_sync(dispatch_get_main_queue(), ^{ [captureSession startRunning]; }); } } com_codename1_camerakit_impl_CameraNativeAccessImpl.m So now that we have a sense of the scope, let's start implementing the important methods one by one. The natural place to start is the start method. For this to work we first need to define the firstTimeCameraKitLaunch variable.
  • 13. BOOL firstTimeCameraKitLaunch = YES; @implementation com_codename1_camerakit_impl_CameraNativeAccessImpl -(void)start{ if(firstTimeCameraKitLaunch) { direction = FACING_BACK; flash = FLASH_OFF; focus = FOCUS_CONTINUOUS; method = METHOD_STANDARD; videoQuality = VIDEO_QUALITY_480P; previewLayer = nil; device = nil; photoOutput = nil; captureSession = nil; stillImageOutput = nil; firstTimeCameraKitLaunch = NO; zoom = 1; [self lazyInit]; } else { dispatch_sync(dispatch_get_main_queue(), ^{ [captureSession startRunning]; }); } } com_codename1_camerakit_impl_CameraNativeAccessImpl.m The first time start is invoked we initialize the default values of the various constants to identical values we have in the Android version
  • 14. BOOL firstTimeCameraKitLaunch = YES; @implementation com_codename1_camerakit_impl_CameraNativeAccessImpl -(void)start{ if(firstTimeCameraKitLaunch) { direction = FACING_BACK; flash = FLASH_OFF; focus = FOCUS_CONTINUOUS; method = METHOD_STANDARD; videoQuality = VIDEO_QUALITY_480P; previewLayer = nil; device = nil; photoOutput = nil; captureSession = nil; stillImageOutput = nil; firstTimeCameraKitLaunch = NO; zoom = 1; [self lazyInit]; } else { dispatch_sync(dispatch_get_main_queue(), ^{ [captureSession startRunning]; }); } } com_codename1_camerakit_impl_CameraNativeAccessImpl.m This method initializes the camera view the first time around. Notice that self is the equivalent of this.
  • 15. BOOL firstTimeCameraKitLaunch = YES; @implementation com_codename1_camerakit_impl_CameraNativeAccessImpl -(void)start{ if(firstTimeCameraKitLaunch) { direction = FACING_BACK; flash = FLASH_OFF; focus = FOCUS_CONTINUOUS; method = METHOD_STANDARD; videoQuality = VIDEO_QUALITY_480P; previewLayer = nil; device = nil; photoOutput = nil; captureSession = nil; stillImageOutput = nil; firstTimeCameraKitLaunch = NO; zoom = 1; [self lazyInit]; } else { dispatch_sync(dispatch_get_main_queue(), ^{ [captureSession startRunning]; }); } } com_codename1_camerakit_impl_CameraNativeAccessImpl.m dispatch_sync is the iOS equivalent of callSeriallyAndWait: we want the block below to execute on the native iOS thread and we want to wait until it's finished.
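As a side note, here is a minimal sketch (my addition, not part of the Camera Kit code) of the rough mapping between the Codename One threading calls and Grand Central Dispatch. One caveat: calling dispatch_sync while already on the main queue would deadlock.

// Roughly callSerially(): queue the block on the main (native UI) thread and return immediately
dispatch_async(dispatch_get_main_queue(), ^{
    // UI work here
});

// Roughly callSeriallyAndWait(): block the calling thread until the block finishes
dispatch_sync(dispatch_get_main_queue(), ^{
    // UI work here
});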
  • 16. BOOL firstTimeCameraKitLaunch = YES; @implementation com_codename1_camerakit_impl_CameraNativeAccessImpl -(void)start{ if(firstTimeCameraKitLaunch) { direction = FACING_BACK; flash = FLASH_OFF; focus = FOCUS_CONTINUOUS; method = METHOD_STANDARD; videoQuality = VIDEO_QUALITY_480P; previewLayer = nil; device = nil; photoOutput = nil; captureSession = nil; stillImageOutput = nil; firstTimeCameraKitLaunch = NO; zoom = 1; [self lazyInit]; } else { dispatch_sync(dispatch_get_main_queue(), ^{ [captureSession startRunning]; }); } } com_codename1_camerakit_impl_CameraNativeAccessImpl.m The capture session is stopped on the stop call so if this isn't the first time around we need to restart the capture session
  • 17. #import "CameraKitView.h" @interface com_codename1_camerakit_impl_CameraNativeAccessImpl : NSObject { int direction; int flash; int focus; int method; int videoQuality; float zoom; BOOL authorized; BOOL capturingVideo; CameraKitView* container; AVCaptureDevice* device; AVCaptureSession* captureSession; AVCaptureVideoPreviewLayer* previewLayer; AVCaptureMovieFileOutput* movieOutput; AVCapturePhotoOutput* photoOutput; AVCaptureStillImageOutput* stillImageOutput; } com_codename1_camerakit_impl_CameraNativeAccessImpl.h Before we proceed to the lazy init method lets look at the variables we added into the header file. The direction the camera is facing... Front or Back
  • 18. #import "CameraKitView.h" @interface com_codename1_camerakit_impl_CameraNativeAccessImpl : NSObject { int direction; int flash; int focus; int method; int videoQuality; float zoom; BOOL authorized; BOOL capturingVideo; CameraKitView* container; AVCaptureDevice* device; AVCaptureSession* captureSession; AVCaptureVideoPreviewLayer* previewLayer; AVCaptureMovieFileOutput* movieOutput; AVCapturePhotoOutput* photoOutput; AVCaptureStillImageOutput* stillImageOutput; } com_codename1_camerakit_impl_CameraNativeAccessImpl.h Whether flash is on/off or auto-flash mode
  • 19. #import "CameraKitView.h" @interface com_codename1_camerakit_impl_CameraNativeAccessImpl : NSObject { int direction; int flash; int focus; int method; int videoQuality; float zoom; BOOL authorized; BOOL capturingVideo; CameraKitView* container; AVCaptureDevice* device; AVCaptureSession* captureSession; AVCaptureVideoPreviewLayer* previewLayer; AVCaptureMovieFileOutput* movieOutput; AVCapturePhotoOutput* photoOutput; AVCaptureStillImageOutput* stillImageOutput; } com_codename1_camerakit_impl_CameraNativeAccessImpl.h Focus can be based on point or automatic
  • 20. #import "CameraKitView.h" @interface com_codename1_camerakit_impl_CameraNativeAccessImpl : NSObject { int direction; int flash; int focus; int method; int videoQuality; float zoom; BOOL authorized; BOOL capturingVideo; CameraKitView* container; AVCaptureDevice* device; AVCaptureSession* captureSession; AVCaptureVideoPreviewLayer* previewLayer; AVCaptureMovieFileOutput* movieOutput; AVCapturePhotoOutput* photoOutput; AVCaptureStillImageOutput* stillImageOutput; } com_codename1_camerakit_impl_CameraNativeAccessImpl.h Allows for several modes of capture, I didn't implement this for now
  • 21. #import "CameraKitView.h" @interface com_codename1_camerakit_impl_CameraNativeAccessImpl : NSObject { int direction; int flash; int focus; int method; int videoQuality; float zoom; BOOL authorized; BOOL capturingVideo; CameraKitView* container; AVCaptureDevice* device; AVCaptureSession* captureSession; AVCaptureVideoPreviewLayer* previewLayer; AVCaptureMovieFileOutput* movieOutput; AVCapturePhotoOutput* photoOutput; AVCaptureStillImageOutput* stillImageOutput; } com_codename1_camerakit_impl_CameraNativeAccessImpl.h A set of constants indicating the resolution for recorded video
  • 22. #import "CameraKitView.h" @interface com_codename1_camerakit_impl_CameraNativeAccessImpl : NSObject { int direction; int flash; int focus; int method; int videoQuality; float zoom; BOOL authorized; BOOL capturingVideo; CameraKitView* container; AVCaptureDevice* device; AVCaptureSession* captureSession; AVCaptureVideoPreviewLayer* previewLayer; AVCaptureMovieFileOutput* movieOutput; AVCapturePhotoOutput* photoOutput; AVCaptureStillImageOutput* stillImageOutput; } com_codename1_camerakit_impl_CameraNativeAccessImpl.h Current camera zoom value
  • 23. #import "CameraKitView.h" @interface com_codename1_camerakit_impl_CameraNativeAccessImpl : NSObject { int direction; int flash; int focus; int method; int videoQuality; float zoom; BOOL authorized; BOOL capturingVideo; CameraKitView* container; AVCaptureDevice* device; AVCaptureSession* captureSession; AVCaptureVideoPreviewLayer* previewLayer; AVCaptureMovieFileOutput* movieOutput; AVCapturePhotoOutput* photoOutput; AVCaptureStillImageOutput* stillImageOutput; } com_codename1_camerakit_impl_CameraNativeAccessImpl.h I store YES here if the app was given permission to access the camera
  • 24. #import "CameraKitView.h" @interface com_codename1_camerakit_impl_CameraNativeAccessImpl : NSObject { int direction; int flash; int focus; int method; int videoQuality; float zoom; BOOL authorized; BOOL capturingVideo; CameraKitView* container; AVCaptureDevice* device; AVCaptureSession* captureSession; AVCaptureVideoPreviewLayer* previewLayer; AVCaptureMovieFileOutput* movieOutput; AVCapturePhotoOutput* photoOutput; AVCaptureStillImageOutput* stillImageOutput; } com_codename1_camerakit_impl_CameraNativeAccessImpl.h Since some callbacks for video and photo might be similar I set this to to indicate what I'm capturing
  • 25. #import "CameraKitView.h" @interface com_codename1_camerakit_impl_CameraNativeAccessImpl : NSObject { int direction; int flash; int focus; int method; int videoQuality; float zoom; BOOL authorized; BOOL capturingVideo; CameraKitView* container; AVCaptureDevice* device; AVCaptureSession* captureSession; AVCaptureVideoPreviewLayer* previewLayer; AVCaptureMovieFileOutput* movieOutput; AVCapturePhotoOutput* photoOutput; AVCaptureStillImageOutput* stillImageOutput; } com_codename1_camerakit_impl_CameraNativeAccessImpl.h This is the actual UI element we will see on the screen a UIView is the iOS parallel to Component, I'll discuss this soon
  • 26. #import "CameraKitView.h" @interface com_codename1_camerakit_impl_CameraNativeAccessImpl : NSObject { int direction; int flash; int focus; int method; int videoQuality; float zoom; BOOL authorized; BOOL capturingVideo; CameraKitView* container; AVCaptureDevice* device; AVCaptureSession* captureSession; AVCaptureVideoPreviewLayer* previewLayer; AVCaptureMovieFileOutput* movieOutput; AVCapturePhotoOutput* photoOutput; AVCaptureStillImageOutput* stillImageOutput; } com_codename1_camerakit_impl_CameraNativeAccessImpl.h Notice that I imported the CameraKitView class here. It’s a class I added and I’ll cover it soon…
  • 27. #import "CameraKitView.h" @interface com_codename1_camerakit_impl_CameraNativeAccessImpl : NSObject { int direction; int flash; int focus; int method; int videoQuality; float zoom; BOOL authorized; BOOL capturingVideo; CameraKitView* container; AVCaptureDevice* device; AVCaptureSession* captureSession; AVCaptureVideoPreviewLayer* previewLayer; AVCaptureMovieFileOutput* movieOutput; AVCapturePhotoOutput* photoOutput; AVCaptureStillImageOutput* stillImageOutput; } com_codename1_camerakit_impl_CameraNativeAccessImpl.h This is the native capture device representing the camera. A different device instance is used when we flip between the back & front cameras
  • 28. #import "CameraKitView.h" @interface com_codename1_camerakit_impl_CameraNativeAccessImpl : NSObject { int direction; int flash; int focus; int method; int videoQuality; float zoom; BOOL authorized; BOOL capturingVideo; CameraKitView* container; AVCaptureDevice* device; AVCaptureSession* captureSession; AVCaptureVideoPreviewLayer* previewLayer; AVCaptureMovieFileOutput* movieOutput; AVCapturePhotoOutput* photoOutput; AVCaptureStillImageOutput* stillImageOutput; } com_codename1_camerakit_impl_CameraNativeAccessImpl.h A session encapsulates the capture process, we need to acquire access to the camera with a session and relinquish it in stop
  • 29. #import "CameraKitView.h" @interface com_codename1_camerakit_impl_CameraNativeAccessImpl : NSObject { int direction; int flash; int focus; int method; int videoQuality; float zoom; BOOL authorized; BOOL capturingVideo; CameraKitView* container; AVCaptureDevice* device; AVCaptureSession* captureSession; AVCaptureVideoPreviewLayer* previewLayer; AVCaptureMovieFileOutput* movieOutput; AVCapturePhotoOutput* photoOutput; AVCaptureStillImageOutput* stillImageOutput; } com_codename1_camerakit_impl_CameraNativeAccessImpl.h The preview layer is where the video from the camera is drawn, this is a CALayer which is a graphics surface that we can assign to a UIView. I'll discuss this when covering CameraKitView
  • 30. #import "CameraKitView.h" @interface com_codename1_camerakit_impl_CameraNativeAccessImpl : NSObject { int direction; int flash; int focus; int method; int videoQuality; float zoom; BOOL authorized; BOOL capturingVideo; CameraKitView* container; AVCaptureDevice* device; AVCaptureSession* captureSession; AVCaptureVideoPreviewLayer* previewLayer; AVCaptureMovieFileOutput* movieOutput; AVCapturePhotoOutput* photoOutput; AVCaptureStillImageOutput* stillImageOutput; } com_codename1_camerakit_impl_CameraNativeAccessImpl.h This class is responsible for capturing a movie and saving it to a file
  • 31. #import "CameraKitView.h" @interface com_codename1_camerakit_impl_CameraNativeAccessImpl : NSObject { int direction; int flash; int focus; int method; int videoQuality; float zoom; BOOL authorized; BOOL capturingVideo; CameraKitView* container; AVCaptureDevice* device; AVCaptureSession* captureSession; AVCaptureVideoPreviewLayer* previewLayer; AVCaptureMovieFileOutput* movieOutput; AVCapturePhotoOutput* photoOutput; AVCaptureStillImageOutput* stillImageOutput; } com_codename1_camerakit_impl_CameraNativeAccessImpl.h The last two entries handle photos, the former works on iOS 10 and newer devices and the latter works on older devices/OS. That's a lot to digest but we are just getting started…
  • 32. -(void)lazyInit { dispatch_sync(dispatch_get_main_queue(), ^{ container = [[CameraKitView alloc] init]; switch ([AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo]) { case AVAuthorizationStatusNotDetermined: [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^( BOOL granted ) { if ( ! granted ) { authorized = NO; return; } authorized = YES; [self lazyInitPostAuthorization]; }]; break; case AVAuthorizationStatusDenied: case AVAuthorizationStatusRestricted: authorized = NO; break; case AVAuthorizationStatusAuthorized: authorized = YES; [self lazyInitPostAuthorization]; break; } }); } com_codename1_camerakit_impl_CameraNativeAccessImpl.m Let's move right into the lazyInit method. When I started with the code I thought the camera would initialize lazily, but that didn't fit the rest of the API, so I abandoned that approach and initialized on start. I didn't bother changing the name since it isn't user visible anyway. The content of the following block runs on the native iOS thread synchronously. The method won't return until the code is finished
  • 33. -(void)lazyInit { dispatch_sync(dispatch_get_main_queue(), ^{ container = [[CameraKitView alloc] init]; switch ([AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo]) { case AVAuthorizationStatusNotDetermined: [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^( BOOL granted ) { if ( ! granted ) { authorized = NO; return; } authorized = YES; [self lazyInitPostAuthorization]; }]; break; case AVAuthorizationStatusDenied: case AVAuthorizationStatusRestricted: authorized = NO; break; case AVAuthorizationStatusAuthorized: authorized = YES; [self lazyInitPostAuthorization]; break; } }); } com_codename1_camerakit_impl_CameraNativeAccessImpl.m This is the equivalent of new Object() in Objective-C we allocate the object and invoke its init method which is sort of a constructor
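For readers coming from Java, the alloc/init pair is the general construction idiom in Objective-C. A tiny unrelated example (my own, just to show the shape):

// [[Class alloc] init] is roughly "new Class()" in Java
NSMutableArray* list = [[NSMutableArray alloc] init];
[list addObject:@"hello"];   // a message send, similar to list.add("hello")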
  • 34. -(void)lazyInit { dispatch_sync(dispatch_get_main_queue(), ^{ container = [[CameraKitView alloc] init]; switch ([AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo]) { case AVAuthorizationStatusNotDetermined: [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^( BOOL granted ) { if ( ! granted ) { authorized = NO; return; } authorized = YES; [self lazyInitPostAuthorization]; }]; break; case AVAuthorizationStatusDenied: case AVAuthorizationStatusRestricted: authorized = NO; break; case AVAuthorizationStatusAuthorized: authorized = YES; [self lazyInitPostAuthorization]; break; } }); } com_codename1_camerakit_impl_CameraNativeAccessImpl.m We're asking the AVCaptureDevice whether we have permission to use the media device, this can result in one of 4 outcomes
  • 35. -(void)lazyInit { dispatch_sync(dispatch_get_main_queue(), ^{ container = [[CameraKitView alloc] init]; switch ([AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo]) { case AVAuthorizationStatusNotDetermined: [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^( BOOL granted ) { if ( ! granted ) { authorized = NO; return; } authorized = YES; [self lazyInitPostAuthorization]; }]; break; case AVAuthorizationStatusDenied: case AVAuthorizationStatusRestricted: authorized = NO; break; case AVAuthorizationStatusAuthorized: authorized = YES; [self lazyInitPostAuthorization]; break; } }); } com_codename1_camerakit_impl_CameraNativeAccessImpl.m Not determined means we need to ask for permission
  • 36. -(void)lazyInit { dispatch_sync(dispatch_get_main_queue(), ^{ container = [[CameraKitView alloc] init]; switch ([AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo]) { case AVAuthorizationStatusNotDetermined: [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^( BOOL granted ) { if ( ! granted ) { authorized = NO; return; } authorized = YES; [self lazyInitPostAuthorization]; }]; break; case AVAuthorizationStatusDenied: case AVAuthorizationStatusRestricted: authorized = NO; break; case AVAuthorizationStatusAuthorized: authorized = YES; [self lazyInitPostAuthorization]; break; } }); } com_codename1_camerakit_impl_CameraNativeAccessImpl.m So we ask, which should prompt the user with a permission dialog they can accept or reject
  • 37. -(void)lazyInit { dispatch_sync(dispatch_get_main_queue(), ^{ container = [[CameraKitView alloc] init]; switch ([AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo]) { case AVAuthorizationStatusNotDetermined: [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^( BOOL granted ) { if ( ! granted ) { authorized = NO; return; } authorized = YES; [self lazyInitPostAuthorization]; }]; break; case AVAuthorizationStatusDenied: case AVAuthorizationStatusRestricted: authorized = NO; break; case AVAuthorizationStatusAuthorized: authorized = YES; [self lazyInitPostAuthorization]; break; } }); } com_codename1_camerakit_impl_CameraNativeAccessImpl.m If they accept we move to the second phase of authorization in lazyInitPostAuthorization
  • 38. -(void)lazyInit { dispatch_sync(dispatch_get_main_queue(), ^{ container = [[CameraKitView alloc] init]; switch ([AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo]) { case AVAuthorizationStatusNotDetermined: [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^( BOOL granted ) { if ( ! granted ) { authorized = NO; return; } authorized = YES; [self lazyInitPostAuthorization]; }]; break; case AVAuthorizationStatusDenied: case AVAuthorizationStatusRestricted: authorized = NO; break; case AVAuthorizationStatusAuthorized: authorized = YES; [self lazyInitPostAuthorization]; break; } }); } com_codename1_camerakit_impl_CameraNativeAccessImpl.m If permission was denied in some way there isn't much we can do... The user will see a blank view
  • 39. -(void)lazyInit { dispatch_sync(dispatch_get_main_queue(), ^{ container = [[CameraKitView alloc] init]; switch ([AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo]) { case AVAuthorizationStatusNotDetermined: [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^( BOOL granted ) { if ( ! granted ) { authorized = NO; return; } authorized = YES; [self lazyInitPostAuthorization]; }]; break; case AVAuthorizationStatusDenied: case AVAuthorizationStatusRestricted: authorized = NO; break; case AVAuthorizationStatusAuthorized: authorized = YES; [self lazyInitPostAuthorization]; break; } }); } com_codename1_camerakit_impl_CameraNativeAccessImpl.m If authorization was already granted previously we move on. Notice that for this to work we need the ios.NSCameraUsageDescription constant I discussed before. Without that build hint permission is denied automatically
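As a reminder, that build hint maps to the NSCameraUsageDescription entry in the generated Info.plist. Assuming it is set through the project's codenameone_settings.properties file, it would look roughly like this (the description text is just a placeholder):

codename1.arg.ios.NSCameraUsageDescription=We need camera access to show the live preview and capture photos and video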
  • 40. -(void)lazyInitPostAuthorization { if ([AVCaptureDeviceDiscoverySession class]) { if(direction == FACING_FRONT) { device = [AVCaptureDevice defaultDeviceWithDeviceType: AVCaptureDeviceTypeBuiltInWideAngleCamera mediaType:AVMediaTypeVideo position:AVCaptureDevicePositionFront]; } else { device = [AVCaptureDevice defaultDeviceWithDeviceType: AVCaptureDeviceTypeBuiltInWideAngleCamera mediaType:AVMediaTypeVideo position:AVCaptureDevicePositionBack]; } } else { if(direction == FACING_FRONT) { for(AVCaptureDevice* d in [AVCaptureDevice devices]) { if(d.position == AVCaptureDevicePositionFront) { device = d; break; } com_codename1_camerakit_impl_CameraNativeAccessImpl.m Let's move on to the lazyInitPostAuthorization method. This is a complex method so I'll divide it into 2 parts for simplicity. The first part of the method deals with detection of the "device", meaning picking the right camera. This checks if a specific class exists; iOS 10 deprecated an API and introduced a new one. If the new API doesn't exist we'll fall back to the old API
  • 41. -(void)lazyInitPostAuthorization { if ([AVCaptureDeviceDiscoverySession class]) { if(direction == FACING_FRONT) { device = [AVCaptureDevice defaultDeviceWithDeviceType: AVCaptureDeviceTypeBuiltInWideAngleCamera mediaType:AVMediaTypeVideo position:AVCaptureDevicePositionFront]; } else { device = [AVCaptureDevice defaultDeviceWithDeviceType: AVCaptureDeviceTypeBuiltInWideAngleCamera mediaType:AVMediaTypeVideo position:AVCaptureDevicePositionBack]; } } else { if(direction == FACING_FRONT) { for(AVCaptureDevice* d in [AVCaptureDevice devices]) { if(d.position == AVCaptureDevicePositionFront) { device = d; break; } com_codename1_camerakit_impl_CameraNativeAccessImpl.m If we reached this block we are in iOS 10 or newer
  • 42. -(void)lazyInitPostAuthorization { if ([AVCaptureDeviceDiscoverySession class]) { if(direction == FACING_FRONT) { device = [AVCaptureDevice defaultDeviceWithDeviceType: AVCaptureDeviceTypeBuiltInWideAngleCamera mediaType:AVMediaTypeVideo position:AVCaptureDevicePositionFront]; } else { device = [AVCaptureDevice defaultDeviceWithDeviceType: AVCaptureDeviceTypeBuiltInWideAngleCamera mediaType:AVMediaTypeVideo position:AVCaptureDevicePositionBack]; } } else { if(direction == FACING_FRONT) { for(AVCaptureDevice* d in [AVCaptureDevice devices]) { if(d.position == AVCaptureDevicePositionFront) { device = d; break; } com_codename1_camerakit_impl_CameraNativeAccessImpl.m You will notice that getting a device in iOS 10 is one method with the only difference being the position argument value. Notice Objective-C method invocations use argument names as part of the invocation. Notice I referred to Objective-C messages as methods. There is a difference between the two but it's not something you need to understand as a casual Objective-C user.
  • 43. mediaType:AVMediaTypeVideo position:AVCaptureDevicePositionBack]; } } else { if(direction == FACING_FRONT) { for(AVCaptureDevice* d in [AVCaptureDevice devices]) { if(d.position == AVCaptureDevicePositionFront) { device = d; break; } } } else { for(AVCaptureDevice* d in [AVCaptureDevice devices]) { if(d.position == AVCaptureDevicePositionBack) { device = d; break; } } } } // ... common device code ... } com_codename1_camerakit_impl_CameraNativeAccessImpl.m This code is running on a device with an OS prior to iOS 10, here we loop over all the devices within AVCaptureDevice
  • 44. mediaType:AVMediaTypeVideo position:AVCaptureDevicePositionBack]; } } else { if(direction == FACING_FRONT) { for(AVCaptureDevice* d in [AVCaptureDevice devices]) { if(d.position == AVCaptureDevicePositionFront) { device = d; break; } } } else { for(AVCaptureDevice* d in [AVCaptureDevice devices]) { if(d.position == AVCaptureDevicePositionBack) { device = d; break; } } } } // ... common device code ... } com_codename1_camerakit_impl_CameraNativeAccessImpl.m If a device is in the right position we update the device value and exit the loop
  • 45. NSError *error = nil; AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error]; captureSession = [[AVCaptureSession alloc] init]; [captureSession addInput:input]; previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession]; previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; [container setLayer:previewLayer]; [container.layer addSublayer:previewLayer]; [self updateFlash]; [self updateZoom]; [self updateFocus]; [self updateVideoQuality]; [captureSession startRunning]; com_codename1_camerakit_impl_CameraNativeAccessImpl.m The bottom portion of the method is common to both iOS 10+ and earlier versions. Objective-C APIs often accept pointers to error variables which they assign in case of an error. I didn't check for an error here, which I should have.
  • 46. NSError *error = nil; AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error]; captureSession = [[AVCaptureSession alloc] init]; [captureSession addInput:input]; previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession]; previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; [container setLayer:previewLayer]; [container.layer addSublayer:previewLayer]; [self updateFlash]; [self updateZoom]; [self updateFocus]; [self updateVideoQuality]; [captureSession startRunning]; com_codename1_camerakit_impl_CameraNativeAccessImpl.m The input object can be created from the device. We need it to start the session and, at least for now, don't need it after that
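Since the narration admits the missing error check, here is a minimal sketch of what it might look like (my addition, not the original code). deviceInputWithDevice:error: returns nil on failure and fills in the NSError.

NSError *error = nil;
AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (input == nil) {
    // surface the failure instead of silently continuing with a broken session
    NSLog(@"Failed to create camera input: %@", [error localizedDescription]);
    return;
}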
  • 47. NSError *error = nil; AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error]; captureSession = [[AVCaptureSession alloc] init]; [captureSession addInput:input]; previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession]; previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; [container setLayer:previewLayer]; [container.layer addSublayer:previewLayer]; [self updateFlash]; [self updateZoom]; [self updateFocus]; [self updateVideoQuality]; [captureSession startRunning]; com_codename1_camerakit_impl_CameraNativeAccessImpl.m We allocate a new capture session and set the input value
  • 48. NSError *error = nil; AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error]; captureSession = [[AVCaptureSession alloc] init]; [captureSession addInput:input]; previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession]; previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; [container setLayer:previewLayer]; [container.layer addSublayer:previewLayer]; [self updateFlash]; [self updateZoom]; [self updateFocus]; [self updateVideoQuality]; [captureSession startRunning]; com_codename1_camerakit_impl_CameraNativeAccessImpl.m Preview shows the video of the session in our view. It's a CALayer which we can't add directly to the screen
  • 49. NSError *error = nil; AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error]; captureSession = [[AVCaptureSession alloc] init]; [captureSession addInput:input]; previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession]; previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; [container setLayer:previewLayer]; [container.layer addSublayer:previewLayer]; [self updateFlash]; [self updateZoom]; [self updateFocus]; [self updateVideoQuality]; [captureSession startRunning]; com_codename1_camerakit_impl_CameraNativeAccessImpl.m setLayer is a method I added to CameraKitView, I'll discuss that when covering CameraKitView
  • 50. NSError *error = nil; AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error]; captureSession = [[AVCaptureSession alloc] init]; [captureSession addInput:input]; previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession]; previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; [container setLayer:previewLayer]; [container.layer addSublayer:previewLayer]; [self updateFlash]; [self updateZoom]; [self updateFocus]; [self updateVideoQuality]; [captureSession startRunning]; com_codename1_camerakit_impl_CameraNativeAccessImpl.m This is how you show a CALayer within a UIView
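CameraKitView itself is covered later. Purely as a guess at the shape of a UIView that hosts a preview layer (the name and details here are my assumptions, not the actual class), it tends to look something like this:

#import <UIKit/UIKit.h>
#import <QuartzCore/QuartzCore.h>

@interface PreviewHostView : UIView {
    CALayer* hostedLayer;
}
-(void)setLayer:(CALayer*)l;
@end

@implementation PreviewHostView
-(void)setLayer:(CALayer*)l {
    hostedLayer = l;
}
-(void)layoutSubviews {
    [super layoutSubviews];
    // keep the hosted layer sized to the view when it's laid out or rotated
    hostedLayer.frame = self.bounds;
}
@end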
  • 51. NSError *error = nil; AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error]; captureSession = [[AVCaptureSession alloc] init]; [captureSession addInput:input]; previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession]; previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; [container setLayer:previewLayer]; [container.layer addSublayer:previewLayer]; [self updateFlash]; [self updateZoom]; [self updateFocus]; [self updateVideoQuality]; [captureSession startRunning]; com_codename1_camerakit_impl_CameraNativeAccessImpl.m These methods allow us to keep common code between the initialization code and the setter methods. That means a call to setFlash will trigger updateFlash internally. I'll cover all 4 methods soon.
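A minimal sketch of how one of those setters might delegate to the shared update method (assumed shape only; the real implementation may also need to hop to the native thread with dispatch_sync, as start does):

-(void)setFlash:(int)param {
    flash = param;
    // reuse the exact same code path that ran during lazyInitPostAuthorization
    [self updateFlash];
}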
  • 52. NSError *error = nil; AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error]; captureSession = [[AVCaptureSession alloc] init]; [captureSession addInput:input]; previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:captureSession]; previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill; [container setLayer:previewLayer]; [container.layer addSublayer:previewLayer]; [self updateFlash]; [self updateZoom]; [self updateFocus]; [self updateVideoQuality]; [captureSession startRunning]; com_codename1_camerakit_impl_CameraNativeAccessImpl.m The final line starts the capture session. This seems like a lot and it is a lot. We went through the "heavy lifting" portion of the code and as you can see it might not be trivial but it isn't hard. I didn't know half of these methods when I started out but that's the great thing about being a programmer in this day and age: we can google it.