Building Native Camera Access - Part V
In this final part we'll finish the iOS portion of the implementation
-(void*)getView{
return container;
}
-(int)getFlash{
return flash;
}
-(int)getFacing{
return direction;
}
-(BOOL)isFacingFront{
return direction == FACING_FRONT;
}
-(BOOL)isFacingBack{
return direction == FACING_BACK;
}
-(int)getPreviewWidth{
return (int) previewLayer.frame.size.width;
}
-(int)getPreviewHeight{
return (int) previewLayer.frame.size.height;
}
-(BOOL)isSupported{
return YES;
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
I've skipped some methods entirely so I won't discuss them here, but as in the Android port there are many simple methods that just delegate onwards. The getters are the simplest, as they don't even need to enter the native UI thread.

This returns the camera view, which should be initialized once start() is invoked
-(void*)getView{
return container;
}
-(int)getFlash{
return flash;
}
-(int)getFacing{
return direction;
}
-(BOOL)isFacingFront{
return direction == FACING_FRONT;
}
-(BOOL)isFacingBack{
return direction == FACING_BACK;
}
-(int)getPreviewWidth{
return (int) previewLayer.frame.size.width;
}
-(int)getPreviewHeight{
return (int) previewLayer.frame.size.height;
}
-(BOOL)isSupported{
return YES;
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
This is a standard method in every native interface. Everything in between is trivial.
-(void)setZoom:(float)param{
if(zoom != param) {
zoom = param;
dispatch_async(dispatch_get_main_queue(), ^{
[self updateZoom];
});
}
}
-(void)setFocus:(int)param{
if(focus != param) {
focus = param;
dispatch_async(dispatch_get_main_queue(), ^{
[self updateFocus];
});
}
}
-(void)setFlash:(int)param{
// same code...
}
-(void)setFacing:(int)param{
// same code...
}
-(void)setVideoQuality:(int)param{
// same code...
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
The setters are a bit more verbose, but not by much. They are all practically identical, so I'll focus only on the first.

I don't want to call update if the value didn't actually change
-(void)setZoom:(float)param{
if(zoom != param) {
zoom = param;
dispatch_async(dispatch_get_main_queue(), ^{
[self updateZoom];
});
}
}
-(void)setFocus:(int)param{
if(focus != param) {
focus = param;
dispatch_async(dispatch_get_main_queue(), ^{
[self updateFocus];
});
}
}
-(void)setFlash:(int)param{
// same code...
}
-(void)setFacing:(int)param{
// same code...
}
-(void)setVideoQuality:(int)param{
// same code...
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
We update on the main thread by invoking the update methods from before. Notice that I use the async call. As a rule of thumb, always use the async call unless you MUST use the sync call: it's faster and has a lower chance of deadlock. In this case I don't need the action to happen within this millisecond, so async works just fine.
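The same rule of thumb can be illustrated with a Java analogy (all names here are made up for illustration): posting work asynchronously to a single-threaded "main queue" executor lets the caller return immediately, just like dispatch_async:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class AsyncDispatchSketch {
    // Stand-in for the iOS main queue: one thread that owns all UI state
    static final ExecutorService MAIN_QUEUE = Executors.newSingleThreadExecutor();
    static volatile float zoom;

    // Mirrors setZoom: record the new value, then apply it asynchronously
    // on the "main" thread. execute() never blocks the caller.
    static void setZoom(float param) {
        if (zoom != param) {
            zoom = param;
            MAIN_QUEUE.execute(AsyncDispatchSketch::updateZoom);
        }
    }

    static void updateZoom() {
        // the native-UI work would happen here, on the main-queue thread
    }

    public static void main(String[] args) {
        setZoom(2.0f);
        System.out.println(zoom); // prints 2.0
        MAIN_QUEUE.shutdown();
    }
}
```

The deadlock risk the text alludes to is the synchronous variant: if code already running on that single thread hands itself more work and blocks waiting for it, it waits forever.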
-(float)getVerticalViewingAngle{
__block float fov = 0;
dispatch_sync(dispatch_get_main_queue(), ^{
fov = device.activeFormat.videoFieldOfView / 16.0 * 9;
});
return fov;
}
-(float)getHorizontalViewingAngle{
__block float fov = 0;
dispatch_sync(dispatch_get_main_queue(), ^{
fov = device.activeFormat.videoFieldOfView;
});
return fov;
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
I also have two special getters that need to run on the native UI thread. Normally this wouldn't be a problem, but here we need to return a value, which makes it a bit more challenging. Notice that the vertical angle is derived from the horizontal videoFieldOfView by assuming a 16:9 aspect ratio.

The __block keyword lets us mark the variable as one we can modify from within the block (Objective-C's version of a lambda expression).
-(float)getVerticalViewingAngle{
__block float fov = 0;
dispatch_sync(dispatch_get_main_queue(), ^{
fov = device.activeFormat.videoFieldOfView / 16.0 * 9;
});
return fov;
}
-(float)getHorizontalViewingAngle{
__block float fov = 0;
dispatch_sync(dispatch_get_main_queue(), ^{
fov = device.activeFormat.videoFieldOfView;
});
return fov;
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
Here is a case where we MUST use dispatch_sync: if we used async, the return statement would execute before the block assigns the value.
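The __block/dispatch_sync pattern has a direct Java analogue (illustrative only, with made-up names): a lambda can't reassign a captured local, so you route the result through a holder object and block until the worker thread has produced it.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.atomic.AtomicReference;

public class SyncDispatchSketch {
    static final ExecutorService MAIN_QUEUE = Executors.newSingleThreadExecutor();

    // Mirrors getHorizontalViewingAngle: block until the "main" thread fills
    // in the value. The AtomicReference plays the role of the __block variable:
    // a slot the lambda may write even though captured locals are final.
    static float getFieldOfView() {
        AtomicReference<Float> fov = new AtomicReference<>();
        try {
            MAIN_QUEUE.submit(() -> fov.set(62.7f)).get(); // get() waits, like dispatch_sync
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
        return fov.get(); // safe: the write happened before get() returned
    }

    public static void main(String[] args) {
        System.out.println(getFieldOfView()); // prints 62.7
        MAIN_QUEUE.shutdown();
    }
}
```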

Now all that is left is the actual capture methods!
-(void)captureImage{
dispatch_async(dispatch_get_main_queue(), ^{
capturingVideo = NO;
if([AVCapturePhotoOutput class]) {
if(photoOutput == nil) {
photoOutput = [[AVCapturePhotoOutput alloc] init];
}
AVCapturePhotoSettings* settings =
[AVCapturePhotoSettings photoSettings];
[photoOutput capturePhotoWithSettings:settings
delegate:self];
} else {
// ... Code for iOS 9 compatibility
}
});
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
The capture methods are a bit more complicated. Surprisingly, capturing a still image is a bit harder than capturing video. Since the captureImage method is a bit long I've split it into two slides like before. This is another case where iOS 10 introduced a new API.

All capture methods must run on the native UI thread, as they deal with the native UI. Failing to do this leads to weird crashes.
-(void)captureImage{
dispatch_async(dispatch_get_main_queue(), ^{
capturingVideo = NO;
if([AVCapturePhotoOutput class]) {
if(photoOutput == nil) {
photoOutput = [[AVCapturePhotoOutput alloc] init];
}
AVCapturePhotoSettings* settings =
[AVCapturePhotoSettings photoSettings];
[photoOutput capturePhotoWithSettings:settings
delegate:self];
} else {
// ... Code for iOS 9 compatibility
}
});
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
This is the new iOS 10+ API. If it isn't available we fall back to the iOS 9 compatible code
-(void)captureImage{
dispatch_async(dispatch_get_main_queue(), ^{
capturingVideo = NO;
if([AVCapturePhotoOutput class]) {
if(photoOutput == nil) {
photoOutput = [[AVCapturePhotoOutput alloc] init];
}
AVCapturePhotoSettings* settings =
[AVCapturePhotoSettings photoSettings];
[photoOutput capturePhotoWithSettings:settings
delegate:self];
} else {
// ... Code for iOS 9 compatibility
}
});
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
Up to this point everything is pretty trivial. A delegate is a special concept in iOS, similar to Java's interfaces; here the current class implements the delegate
@interface com_codename1_camerakit_impl_CameraNativeAccessImpl :
NSObject<
AVCapturePhotoCaptureDelegate,
AVCaptureFileOutputRecordingDelegate> {
// everything here is unchanged
}
// everything here is unchanged
@end
com_codename1_camerakit_impl_CameraNativeAccessImpl.h
Before I go into the iOS 9 code within this method, let's look at the delegate... To implement the delegate we need to make a small change to the header file. The delegates are declared in a syntax reminiscent of Java's generics syntax. While I'm here I also added the delegate needed for video recording, so that's two delegates.
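The Java equivalent of adopting those two delegate protocols would look roughly like this (the interface and class names here are made up for illustration; they are not the real CameraKit types):

```java
// A delegate protocol in iOS is close to a listener interface in Java
interface PhotoCaptureListener {
    void onPhotoCaptured(byte[] jpeg);
}

interface VideoRecordingListener {
    void onRecordingFinished(String fileUrl);
}

// One class adopts both, just like the header lists both protocols
// inside the angle brackets
class CameraNativeAccess implements PhotoCaptureListener, VideoRecordingListener {
    String lastEvent;

    public void onPhotoCaptured(byte[] jpeg) {
        lastEvent = "photo:" + jpeg.length;
    }

    public void onRecordingFinished(String fileUrl) {
        lastEvent = "video:" + fileUrl;
    }
}
```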
-(void)captureOutput:(AVCapturePhotoOutput *)captureOutput
didFinishProcessingPhotoSampleBuffer:(CMSampleBufferRef)
photoSampleBuffer
previewPhotoSampleBuffer:(CMSampleBufferRef)previewPhotoSampleBuffer
resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings
bracketSettings:(AVCaptureBracketedStillImageSettings *)bracketSettings
error:(NSError *)error {
if(error == nil && !capturingVideo) {
NSData *d = [AVCapturePhotoOutput
JPEGPhotoDataRepresentationForJPEGSampleBuffer:photoSampleBuffer
previewPhotoSampleBuffer:previewPhotoSampleBuffer];
JAVA_OBJECT byteArray = nsDataToByteArr(d);
com_codename1_camerakit_impl_CameraCallbacks_onImage___byte_1ARRAY(
getThreadLocalData(), byteArray);
return;
}
if(error) {
struct ThreadLocalData* d = getThreadLocalData();
com_codename1_camerakit_impl_CameraCallbacks_onError___java_lang_String_java_lang_String_java_lang_String(
d, nil, nil, nil);
return;
}
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
Now that these are in place, let's look at the delegate code.

Delegate method declarations are huge! This one runs all the way to the opening curly bracket, which makes Java's verbosity seem quaint. This is all just one delegate callback method declaration.
-(void)captureOutput:(AVCapturePhotoOutput *)captureOutput
didFinishProcessingPhotoSampleBuffer:(CMSampleBufferRef)
photoSampleBuffer
previewPhotoSampleBuffer:(CMSampleBufferRef)previewPhotoSampleBuffer
resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings
bracketSettings:(AVCaptureBracketedStillImageSettings *)bracketSettings
error:(NSError *)error {
if(error == nil && !capturingVideo) {
NSData *d = [AVCapturePhotoOutput
JPEGPhotoDataRepresentationForJPEGSampleBuffer:photoSampleBuffer
previewPhotoSampleBuffer:previewPhotoSampleBuffer];
JAVA_OBJECT byteArray = nsDataToByteArr(d);
com_codename1_camerakit_impl_CameraCallbacks_onImage___byte_1ARRAY(
getThreadLocalData(), byteArray);
return;
}
if(error) {
struct ThreadLocalData* d = getThreadLocalData();
com_codename1_camerakit_impl_CameraCallbacks_onError___java_lang_String_java_lang_String_java_lang_String(
d, nil, nil, nil);
return;
}
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
If there is an error, or the delegate was invoked because of a video event, we don't want to step into the image processing code.
-(void)captureOutput:(AVCapturePhotoOutput *)captureOutput
didFinishProcessingPhotoSampleBuffer:(CMSampleBufferRef)
photoSampleBuffer
previewPhotoSampleBuffer:(CMSampleBufferRef)previewPhotoSampleBuffer
resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings
bracketSettings:(AVCaptureBracketedStillImageSettings *)bracketSettings
error:(NSError *)error {
if(error == nil && !capturingVideo) {
NSData *d = [AVCapturePhotoOutput
JPEGPhotoDataRepresentationForJPEGSampleBuffer:photoSampleBuffer
previewPhotoSampleBuffer:previewPhotoSampleBuffer];
JAVA_OBJECT byteArray = nsDataToByteArr(d);
com_codename1_camerakit_impl_CameraCallbacks_onImage___byte_1ARRAY(
getThreadLocalData(), byteArray);
return;
}
if(error) {
struct ThreadLocalData* d = getThreadLocalData();
com_codename1_camerakit_impl_CameraCallbacks_onError___java_lang_String_java_lang_String_java_lang_String(
d, nil, nil, nil);
return;
}
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
NSData handles almost all I/O in iOS. It's similar to ByteBuffer in Java SE and lets us map files from/into storage.
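To make the ByteBuffer comparison concrete, here is the analogous Java SE idiom (illustrative only, not part of the port):

```java
import java.nio.ByteBuffer;
import java.nio.charset.StandardCharsets;

public class ByteBufferSketch {
    public static void main(String[] args) {
        // NSData wraps a raw byte region much like ByteBuffer wraps a byte[]
        byte[] raw = "JPEG bytes would go here".getBytes(StandardCharsets.UTF_8);
        ByteBuffer buffer = ByteBuffer.wrap(raw);

        // Both expose a length and positioned reads without copying the data
        System.out.println(buffer.remaining()); // prints 24
        byte first = buffer.get();              // reads one byte, advances position
        System.out.println((char) first);       // prints J
    }
}
```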
-(void)captureOutput:(AVCapturePhotoOutput *)captureOutput
didFinishProcessingPhotoSampleBuffer:(CMSampleBufferRef)
photoSampleBuffer
previewPhotoSampleBuffer:(CMSampleBufferRef)previewPhotoSampleBuffer
resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings
bracketSettings:(AVCaptureBracketedStillImageSettings *)bracketSettings
error:(NSError *)error {
if(error == nil && !capturingVideo) {
NSData *d = [AVCapturePhotoOutput
JPEGPhotoDataRepresentationForJPEGSampleBuffer:photoSampleBuffer
previewPhotoSampleBuffer:previewPhotoSampleBuffer];
JAVA_OBJECT byteArray = nsDataToByteArr(d);
com_codename1_camerakit_impl_CameraCallbacks_onImage___byte_1ARRAY(
getThreadLocalData(), byteArray);
return;
}
if(error) {
struct ThreadLocalData* d = getThreadLocalData();
com_codename1_camerakit_impl_CameraCallbacks_onError___java_lang_String_java_lang_String_java_lang_String(
d, nil, nil, nil);
return;
}
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
I use nsDataToByteArr to convert the NSData object to a Java byte array. Notice that in the native code all Java objects are effectively JAVA_OBJECT.
-(void)captureOutput:(AVCapturePhotoOutput *)captureOutput
didFinishProcessingPhotoSampleBuffer:(CMSampleBufferRef)
photoSampleBuffer
previewPhotoSampleBuffer:(CMSampleBufferRef)previewPhotoSampleBuffer
resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings
bracketSettings:(AVCaptureBracketedStillImageSettings *)bracketSettings
error:(NSError *)error {
if(error == nil && !capturingVideo) {
NSData *d = [AVCapturePhotoOutput
JPEGPhotoDataRepresentationForJPEGSampleBuffer:photoSampleBuffer
previewPhotoSampleBuffer:previewPhotoSampleBuffer];
JAVA_OBJECT byteArray = nsDataToByteArr(d);
com_codename1_camerakit_impl_CameraCallbacks_onImage___byte_1ARRAY(
getThreadLocalData(), byteArray);
return;
}
if(error) {
struct ThreadLocalData* d = getThreadLocalData();
com_codename1_camerakit_impl_CameraCallbacks_onError___java_lang_String_java_lang_String_java_lang_String(
d, nil, nil, nil);
return;
}
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
I can now invoke the callback method with the given byte array. Notice that the generated callback function name includes the fully qualified class name and the argument types.
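The mangled C name encodes the Java side of the callback: com_codename1_camerakit_impl_CameraCallbacks_onImage___byte_1ARRAY maps to com.codename1.camerakit.impl.CameraCallbacks.onImage(byte[]). A rough sketch of the Java class implied by the mangled names (the real class lives in the CameraKit library; the bodies here are placeholders):

```java
// Hypothetical reconstruction of the callbacks class from the mangled names
class CameraCallbacks {
    static String lastCallback;

    // ..._onImage___byte_1ARRAY -> onImage(byte[])
    public static void onImage(byte[] jpeg) {
        lastCallback = "onImage(" + jpeg.length + " bytes)";
    }

    // ..._onError___java_lang_String_java_lang_String_java_lang_String
    // -> onError(String, String, String)
    public static void onError(String code, String message, String detail) {
        lastCallback = "onError";
    }
}
```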
-(void)captureOutput:(AVCapturePhotoOutput *)captureOutput
didFinishProcessingPhotoSampleBuffer:(CMSampleBufferRef)
photoSampleBuffer
previewPhotoSampleBuffer:(CMSampleBufferRef)previewPhotoSampleBuffer
resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings
bracketSettings:(AVCaptureBracketedStillImageSettings *)bracketSettings
error:(NSError *)error {
if(error == nil && !capturingVideo) {
NSData *d = [AVCapturePhotoOutput
JPEGPhotoDataRepresentationForJPEGSampleBuffer:photoSampleBuffer
previewPhotoSampleBuffer:previewPhotoSampleBuffer];
JAVA_OBJECT byteArray = nsDataToByteArr(d);
com_codename1_camerakit_impl_CameraCallbacks_onImage___byte_1ARRAY(
getThreadLocalData(), byteArray);
return;
}
if(error) {
struct ThreadLocalData* d = getThreadLocalData();
com_codename1_camerakit_impl_CameraCallbacks_onError___java_lang_String_java_lang_String_java_lang_String(
d, nil, nil, nil);
return;
}
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
The VM passes a thread-local context on the stack to every method; this makes things like stack traces work.
#include "com_codename1_camerakit_impl_CameraCallbacks.h"
extern JAVA_OBJECT nsDataToByteArr(NSData *data);
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
For the code to compile we need to add these declarations below the #import statements. The VM generates C code, so we pull it in with #include.

The nsDataToByteArr function comes from the iOS port. I could have included its whole header, but there is no need in this case. It makes converting an NSData to a byte array much simpler.
if(stillImageOutput == nil) {
stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = [[NSDictionary alloc]
initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
[stillImageOutput setOutputSettings:outputSettings];
}
[captureSession addOutput:stillImageOutput];
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in stillImageOutput.connections) {
for (AVCaptureInputPort *port in [connection inputPorts]) {
if ([[port mediaType] isEqual:AVMediaTypeVideo] ) {
videoConnection = connection;
break;
}
}
if (videoConnection){
break;
}
}
[stillImageOutput
captureStillImageAsynchronouslyFromConnection:videoConnection
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
Now that all this is out of the way, let's go back to the captureImage method and the iOS 9 compatibility block. This is effectively code I got from Stack Overflow.

I needed it because most samples now target iOS 10+ and ignore the legacy path, but I still have quite a few iOS 9 devices that can't upgrade to 10, so I'm assuming there is still some market for compatibility.

Most of this code is boilerplate for detecting the camera; no wonder the API was reworked.
for (AVCaptureInputPort *port in [connection inputPorts]) {
if ([[port mediaType] isEqual:AVMediaTypeVideo] ) {
videoConnection = connection;
break;
}
}
if (videoConnection){
break;
}
}
[stillImageOutput
captureStillImageAsynchronouslyFromConnection:videoConnection
completionHandler:^(CMSampleBufferRef imageSampleBuffer,
NSError *error) {
NSData *d = [AVCaptureStillImageOutput
jpegStillImageNSDataRepresentation:imageSampleBuffer];
JAVA_OBJECT byteArray = nsDataToByteArr(d);
com_codename1_camerakit_impl_CameraCallbacks_onImage___byte_1ARRAY(
getThreadLocalData(), byteArray);
[captureSession removeOutput:stillImageOutput];
}];
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
The capture process is asynchronous and invokes the block below to process the resulting image
for (AVCaptureInputPort *port in [connection inputPorts]) {
if ([[port mediaType] isEqual:AVMediaTypeVideo] ) {
videoConnection = connection;
break;
}
}
if (videoConnection){
break;
}
}
[stillImageOutput
captureStillImageAsynchronouslyFromConnection:videoConnection
completionHandler:^(CMSampleBufferRef imageSampleBuffer,
NSError *error) {
NSData *d = [AVCaptureStillImageOutput
jpegStillImageNSDataRepresentation:imageSampleBuffer];
JAVA_OBJECT byteArray = nsDataToByteArr(d);
com_codename1_camerakit_impl_CameraCallbacks_onImage___byte_1ARRAY(
getThreadLocalData(), byteArray);
[captureSession removeOutput:stillImageOutput];
}];
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
The capture process is asynchronous and invokes the block below to process the resulting image. The rest of the code should be very familiar, as it's almost identical to the iOS 10 version: we just call back into Java.

With this, image capture should now work for both old and new devices. All the basics work, with the exception of video!
-(void)captureVideo{
NSURL *furl = [NSURL fileURLWithPath:[NSTemporaryDirectory()
stringByAppendingPathComponent:@"temp.mov"]];
[self captureVideoFile:[furl absoluteString]];
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
Capturing video is surprisingly easier than capturing still images. The simplest method is captureVideo: the no-arguments version saves to a temporary file name. We invoke the version that accepts a file path, passing a "temp.mov" path in the temporary directory.
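The Java analogue of that NSTemporaryDirectory() dance would look roughly like this (illustrative only; the class and method names are made up):

```java
import java.io.File;

public class TempMovieSketch {
    // Mirrors captureVideo: build a path in the platform temp directory
    // and hand it to the file-based capture method as a file:// URL,
    // like [furl absoluteString] does on iOS
    static String tempMoviePath() {
        File temp = new File(System.getProperty("java.io.tmpdir"), "temp.mov");
        return temp.toURI().toString();
    }

    public static void main(String[] args) {
        System.out.println(tempMoviePath());
    }
}
```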
-(void)captureVideoFile:(NSString*)param{
dispatch_async(dispatch_get_main_queue(), ^{
if(movieOutput != nil) {
[captureSession removeOutput:movieOutput];
[movieOutput stopRecording];
[movieOutput release];
}
capturingVideo = YES;
movieOutput = [[AVCaptureMovieFileOutput alloc] init];
[captureSession addOutput:movieOutput];
[movieOutput startRecordingToOutputFileURL:
[NSURL URLWithString:param] recordingDelegate:self];
});
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
The version that accepts a path isn't much harder. 

The AVCaptureMovieFileOutput class represents the recording process; we initialize it lazily.
-(void)captureVideoFile:(NSString*)param{
dispatch_async(dispatch_get_main_queue(), ^{
if(movieOutput != nil) {
[captureSession removeOutput:movieOutput];
[movieOutput stopRecording];
[movieOutput release];
}
capturingVideo = YES;
movieOutput = [[AVCaptureMovieFileOutput alloc] init];
[captureSession addOutput:movieOutput];
[movieOutput startRecordingToOutputFileURL:
[NSURL URLWithString:param] recordingDelegate:self];
});
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
We bind the output value to the capture session
-(void)captureVideoFile:(NSString*)param{
dispatch_async(dispatch_get_main_queue(), ^{
if(movieOutput != nil) {
[captureSession removeOutput:movieOutput];
[movieOutput stopRecording];
[movieOutput release];
}
capturingVideo = YES;
movieOutput = [[AVCaptureMovieFileOutput alloc] init];
[captureSession addOutput:movieOutput];
[movieOutput startRecordingToOutputFileURL:
[NSURL URLWithString:param] recordingDelegate:self];
});
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
Now we can build a URL from the file argument and start recording; again we use `self` as the delegate
- (void)captureOutput:(AVCaptureFileOutput *)output
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
fromConnections:(NSArray<AVCaptureConnection *> *)connections
error:(NSError *)error {
if(capturingVideo && outputFileURL != nil) {
NSString* url = [outputFileURL absoluteString];
struct ThreadLocalData* d = getThreadLocalData();
com_codename1_camerakit_impl_CameraCallbacks_onVideo___java_lang_String
(d, fromNSString(d, url));
return;
}
if(error != nil) {
struct ThreadLocalData* d = getThreadLocalData();
com_codename1_camerakit_impl_CameraCallbacks_onError___java_lang_String_java_lang_String_java_lang_String
(d, nil, nil, nil);
return;
}
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
Which obviously leads us to the delegate method. Most of this should be very familiar now that we went through the image capture code.
- (void)captureOutput:(AVCaptureFileOutput *)output
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
fromConnections:(NSArray<AVCaptureConnection *> *)connections
error:(NSError *)error {
if(capturingVideo && outputFileURL != nil) {
NSString* url = [outputFileURL absoluteString];
struct ThreadLocalData* d = getThreadLocalData();
com_codename1_camerakit_impl_CameraCallbacks_onVideo___java_lang_String
(d, fromNSString(d, url));
return;
}
if(error != nil) {
struct ThreadLocalData* d = getThreadLocalData();
com_codename1_camerakit_impl_CameraCallbacks_onError___java_lang_String_java_lang_String_java_lang_String
(d, nil, nil, nil);
return;
}
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
The Java callback accepts a path String, which we produce from the NSString using the fromNSString API call
-(void)stopVideo{
dispatch_async(dispatch_get_main_queue(), ^{
[movieOutput stopRecording];
[captureSession removeOutput:movieOutput];
[movieOutput release];
movieOutput = nil;
});
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
The last remaining piece is the stopVideo method. This is trivial if you've been keeping up: we just stop the recording and clean up. And with that last bit of code we are DONE!

I hope you found this useful. There is a lot of code to go through, but most of it is damn trivial once you look it over. You can create native interfaces that do just about anything if you have the patience to debug and google the native APIs.

 
How to Check CNIC Information Online with Pakdata cf
How to Check CNIC Information Online with Pakdata cfHow to Check CNIC Information Online with Pakdata cf
How to Check CNIC Information Online with Pakdata cf
 
Microsoft BitLocker Bypass Attack Method.pdf
Microsoft BitLocker Bypass Attack Method.pdfMicrosoft BitLocker Bypass Attack Method.pdf
Microsoft BitLocker Bypass Attack Method.pdf
 
Tales from a Passkey Provider Progress from Awareness to Implementation.pptx
Tales from a Passkey Provider  Progress from Awareness to Implementation.pptxTales from a Passkey Provider  Progress from Awareness to Implementation.pptx
Tales from a Passkey Provider Progress from Awareness to Implementation.pptx
 
Design Guidelines for Passkeys 2024.pptx
Design Guidelines for Passkeys 2024.pptxDesign Guidelines for Passkeys 2024.pptx
Design Guidelines for Passkeys 2024.pptx
 
Design and Development of a Provenance Capture Platform for Data Science
Design and Development of a Provenance Capture Platform for Data ScienceDesign and Development of a Provenance Capture Platform for Data Science
Design and Development of a Provenance Capture Platform for Data Science
 
JavaScript Usage Statistics 2024 - The Ultimate Guide
JavaScript Usage Statistics 2024 - The Ultimate GuideJavaScript Usage Statistics 2024 - The Ultimate Guide
JavaScript Usage Statistics 2024 - The Ultimate Guide
 
Working together SRE & Platform Engineering
Working together SRE & Platform EngineeringWorking together SRE & Platform Engineering
Working together SRE & Platform Engineering
 

Building a Native Camera Access Library - Part V - Transcript.pdf

  • 1. Building Native Camera Access - Part V. In this final part we'll finish the iOS portion of the implementation.
  • 2.
-(void*)getView{
    return container;
}

-(int)getFlash{
    return flash;
}

-(int)getFacing{
    return direction;
}

-(BOOL)isFacingFront{
    return direction == FACING_FRONT;
}

-(BOOL)isFacingBack{
    return direction == FACING_BACK;
}

-(int)getPreviewWidth{
    return (int) previewLayer.frame.size.width;
}

-(int)getPreviewHeight{
    return (int) previewLayer.frame.size.height;
}

-(BOOL)isSupported{
    return YES;
}

com_codename1_camerakit_impl_CameraNativeAccessImpl.m
There are some methods that I've skipped entirely so I won't discuss them here, but there are a lot of simple methods like we had in the Android port which just delegate onwards. The getters are the simplest as they don't even need to enter the iOS main thread. getView returns the camera view, which should be initialized once start() is invoked.
  • 3. (Same code as the previous slide.) isSupported is a standard method in every native interface. Everything in between is trivial.
  • 4.
-(void)setZoom:(float)param{
    if(zoom != param) {
        zoom = param;
        dispatch_async(dispatch_get_main_queue(), ^{
            [self updateZoom];
        });
    }
}

-(void)setFocus:(int)param{
    if(focus != param) {
        focus = param;
        dispatch_async(dispatch_get_main_queue(), ^{
            [self updateFocus];
        });
    }
}

-(void)setFlash:(int)param{
    // same code...
}

-(void)setFacing:(int)param{
    // same code...
}

-(void)setVideoQuality:(int)param{
    // same code...
}

com_codename1_camerakit_impl_CameraNativeAccessImpl.m
The setters are a bit more verbose, but not by much. They are all practically identical so I'll only focus on the first. I don't want to call update if the value didn't actually change.
  • 5. (Same code as the previous slide.) We update on the main thread by invoking the update methods from before. Notice I use the async call. As a rule of thumb, always use the async call unless you MUST use the sync call: it's faster and has a lower chance of a deadlock. In this case I don't need the action to happen within this millisecond, so async will work just fine.
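The async-versus-sync rule of thumb translates directly to Java. A rough sketch of the idea (purely an analogy, not part of this port): a single-threaded executor stands in for the iOS main queue, `execute` behaves like dispatch_async, and blocking on a `Future` behaves like dispatch_sync.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class DispatchAnalogy {
    public static void main(String[] args) throws Exception {
        // a single-threaded executor plays the role of the iOS main queue
        ExecutorService mainQueue = Executors.newSingleThreadExecutor();

        // "dispatch_async": fire and forget, the caller never blocks
        mainQueue.execute(() -> System.out.println("updateZoom ran on the queue"));

        // "dispatch_sync" equivalent: block until the task completes --
        // only worth the deadlock risk when we must read a result back
        Future<Integer> result = mainQueue.submit(() -> 42);
        System.out.println("sync result: " + result.get());

        mainQueue.shutdown();
    }
}
```

The setter only needs the side effect, so the fire-and-forget form is the right choice there.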
  • 6.
-(float)getVerticalViewingAngle{
    __block float fov = 0;
    dispatch_sync(dispatch_get_main_queue(), ^{
        fov = device.activeFormat.videoFieldOfView / 16.0 * 9;
    });
    return fov;
}

-(float)getHorizontalViewingAngle{
    __block float fov = 0;
    dispatch_sync(dispatch_get_main_queue(), ^{
        fov = device.activeFormat.videoFieldOfView;
    });
    return fov;
}

com_codename1_camerakit_impl_CameraNativeAccessImpl.m
I also have two special getters that need to run on the main thread. Normally this wouldn't be a problem, but here we need to return a value, which is a bit challenging. The __block keyword marks a local variable as one we're allowed to modify from within the block (the ^{ ... } lambda expression).
  • 7. (Same code as the previous slide.) Here is a case where we MUST use dispatch_sync. If we used async, the return statement would execute before the block assigns the value. Now all that is left is the actual capture methods!
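Java developers hit the same restriction: locals captured by a lambda are effectively final, so they can't be assigned from inside it. A rough analogy (hypothetical names, not part of this port) uses an AtomicLong the way the Objective-C code uses __block, and a blocking get() the way it uses dispatch_sync:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.atomic.AtomicLong;

public class BlockAnalogy {
    public static void main(String[] args) throws Exception {
        // single-threaded executor standing in for the iOS main queue
        ExecutorService mainQueue = Executors.newSingleThreadExecutor();

        // long fov = 0; mainQueue.submit(() -> fov = 60) would not compile:
        // captured locals are effectively final, just like Objective-C
        // locals without __block
        AtomicLong fov = new AtomicLong();

        // blocking on get() mirrors dispatch_sync: we must have the value
        // before we can return it to the caller
        mainQueue.submit(() -> fov.set(60)).get();
        System.out.println("fov: " + fov.get());
        mainQueue.shutdown();
    }
}
```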
  • 8.
-(void)captureImage{
    dispatch_async(dispatch_get_main_queue(), ^{
        capturingVideo = NO;
        if([AVCapturePhotoOutput class]) {
            if(photoOutput == nil) {
                photoOutput = [[AVCapturePhotoOutput alloc] init];
            }
            AVCapturePhotoSettings* settings = [AVCapturePhotoSettings photoSettings];
            [photoOutput capturePhotoWithSettings:settings delegate:self];
        } else {
            // ... Code for iOS 9 compatibility
        }
    });
}

com_codename1_camerakit_impl_CameraNativeAccessImpl.m
The capture methods are a bit more complicated. Surprisingly, capturing a still image is a bit harder than capturing a video. Since the captureImage method is a bit long I've split it into two like before. This is a similar case of iOS 10 requiring a new API. All capture methods must run on the native UI thread as they deal with the native UI; failing to do this leads to weird crashes.
  • 9. (Same code as the previous slide.) The [AVCapturePhotoOutput class] check detects the new iOS 10+ API; if it isn't available we'll execute the iOS 9 compatible code instead.
  • 10. (Same code as the previous slide.) Up to this point everything is pretty trivial. A delegate is a special concept in iOS, similar to Java's interfaces; passing self here means the current class implements the delegate.
  • 11.
@interface com_codename1_camerakit_impl_CameraNativeAccessImpl : NSObject<
    AVCapturePhotoCaptureDelegate,
    AVCaptureFileOutputRecordingDelegate> {
    // everything here is unchanged
}

// everything here is unchanged

@end

com_codename1_camerakit_impl_CameraNativeAccessImpl.h
Before I go to the iOS 9 code within this method, let's look at the delegate... In order to implement the delegate we need to make a small change to the header file. The delegates are declared in a syntax that's reminiscent of Java's generics syntax. I also added the delegate needed for video recording while I'm here already, so that's two delegates.
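Since the slide leans on the Java comparison, here is what the same pattern looks like on the Java side (all names below are hypothetical, invented for illustration — this is not the real CameraKit API): conforming to a delegate protocol is just implementing a callback interface and handing yourself to the object that will call you back.

```java
// Rough Java analogy of an iOS delegate (hypothetical names).
interface PhotoCaptureDelegate {
    void onPhotoCaptured(byte[] jpeg);
}

class Camera {
    void capture(PhotoCaptureDelegate delegate) {
        // a real camera captures asynchronously; we fake 3 bytes of JPEG
        delegate.onPhotoCaptured(new byte[] { 1, 2, 3 });
    }
}

public class DelegateAnalogy implements PhotoCaptureDelegate {
    int received;

    public void onPhotoCaptured(byte[] jpeg) {
        received = jpeg.length;
        System.out.println("got " + jpeg.length + " bytes");
    }

    public static void main(String[] args) {
        DelegateAnalogy self = new DelegateAnalogy();
        // the equivalent of Objective-C's `delegate:self`
        new Camera().capture(self);
    }
}
```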
  • 12.
-(void)captureOutput:(AVCapturePhotoOutput *)captureOutput
didFinishProcessingPhotoSampleBuffer:(CMSampleBufferRef)photoSampleBuffer
previewPhotoSampleBuffer:(CMSampleBufferRef)previewPhotoSampleBuffer
    resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings
    bracketSettings:(AVCaptureBracketedStillImageSettings *)bracketSettings
    error:(NSError *)error {
    if(error == nil && !capturingVideo) {
        NSData *d = [AVCapturePhotoOutput
            JPEGPhotoDataRepresentationForJPEGSampleBuffer:photoSampleBuffer
            previewPhotoSampleBuffer:previewPhotoSampleBuffer];
        JAVA_OBJECT byteArray = nsDataToByteArr(d);
        com_codename1_camerakit_impl_CameraCallbacks_onImage___byte_1ARRAY(
            getThreadLocalData(), byteArray);
        return;
    }
    if(error) {
        struct ThreadLocalData* d = getThreadLocalData();
        com_codename1_camerakit_impl_CameraCallbacks_onError___java_lang_String_java_lang_String_java_lang_String(
            d, nil, nil, nil);
        return;
    }
}

com_codename1_camerakit_impl_CameraNativeAccessImpl.m
Now that these are in place, let's look at the delegate code. Delegate method declarations are huge! The declaration runs all the way to the opening curly bracket, which makes Java's verbosity seem quaint. This is all just one delegate callback method declaration.
  • 13. (Same code as the previous slide.) If this is an error, or the delegate was invoked because of a video event, we don't want to step into the image processing code.
  • 14. (Same code as the previous slide.) NSData handles almost all IO in iOS. It's similar to ByteBuffer in Java SE and allows us to map files from/into storage.
  • 15. (Same code as the previous slide.) I use nsDataToByteArr to convert the NSData object to a Java byte array; notice that in the native code all Java objects are effectively JAVA_OBJECT.
  • 16. (Same code as the previous slide.) I can now invoke the callback method with the given byte array; notice the callback function name includes the fully qualified class name and argument types.
  • 17. (Same code as the previous slide.) The VM passes thread local context on the stack for every method; this makes things like stack traces work.
  • 18.
#include "com_codename1_camerakit_impl_CameraCallbacks.h"

extern JAVA_OBJECT nsDataToByteArr(NSData *data);

com_codename1_camerakit_impl_CameraNativeAccessImpl.m
For the code to compile we need to add these declarations below the #import statements. The VM generates C code, so we pull it in with #include. The nsDataToByteArr function comes from the iOS port; I could have included its whole header, but there is no need in this case. It makes the process of converting an NSData much simpler.
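The mangled C names also tell you how to read the callback signatures: package and class name, then the method name, then the argument types after the triple underscore. From them you can infer the rough shape of the Java side. The sketch below is a hypothetical reconstruction for illustration only (including the parameter names and the lastEvent field, which I added so the sketch is observable) — it is not the actual CameraKit source.

```java
// Hypothetical reconstruction of the Java callback class, inferred from the
// mangled C names; the real CameraKit class may differ.
public class CameraCallbacks {
    // illustration-only field so the sketch has observable behavior
    static String lastEvent;

    // com_codename1_camerakit_impl_CameraCallbacks_onImage___byte_1ARRAY
    public static void onImage(byte[] jpegData) {
        lastEvent = "image"; // would forward the JPEG bytes to the listener
    }

    // com_codename1_camerakit_impl_CameraCallbacks_onVideo___java_lang_String
    public static void onVideo(String path) {
        lastEvent = "video"; // would forward the recorded movie's path
    }

    // ..._onError___java_lang_String_java_lang_String_java_lang_String
    public static void onError(String a, String b, String c) {
        lastEvent = "error"; // would surface the native failure to Java
    }
}
```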
  • 19.
if(stillImageOutput == nil) {
    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:
        AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
}
[captureSession addOutput:stillImageOutput];
AVCaptureConnection *videoConnection = nil;
for (AVCaptureConnection *connection in stillImageOutput.connections) {
    for (AVCaptureInputPort *port in [connection inputPorts]) {
        if ([[port mediaType] isEqual:AVMediaTypeVideo] ) {
            videoConnection = connection;
            break;
        }
    }
    if (videoConnection){
        break;
    }
}
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection

com_codename1_camerakit_impl_CameraNativeAccessImpl.m
Now that all this is out of the way, let's go back to the captureImage method and the iOS 9 compatibility block. This is effectively code I got from Stack Overflow. I needed it since most samples are for iOS 10+ now and ignore the legacy, but I still have quite a few iOS 9 devices that can't upgrade to 10, so I'm assuming there is still some market for compatibility. Most of this code is boilerplate for detecting the camera connection; no wonder the API was reworked.
  • 20. (Continuing the code from the previous slide.)
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection
    completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        NSData *d = [AVCaptureStillImageOutput
            jpegStillImageNSDataRepresentation:imageSampleBuffer];
        JAVA_OBJECT byteArray = nsDataToByteArr(d);
        com_codename1_camerakit_impl_CameraCallbacks_onImage___byte_1ARRAY(
            getThreadLocalData(), byteArray);
        [captureSession removeOutput:stillImageOutput];
    }];

com_codename1_camerakit_impl_CameraNativeAccessImpl.m
The capture image process is asynchronous and invokes the completion handler block to process the resulting image.
  • 21. (Same code as the previous slide.) The rest of the code should be very familiar, as it's almost identical to the one we used in the iOS 10 version; we just call back into Java. With this, image capture should now work for both old and new devices. All the basics should work, with the exception of video!
  • 22.
-(void)captureVideo{
    NSURL *furl = [NSURL fileURLWithPath:[NSTemporaryDirectory()
        stringByAppendingPathComponent:@"temp.mov"]];
    [self captureVideoFile:[furl absoluteString]];
}

com_codename1_camerakit_impl_CameraNativeAccessImpl.m
Capturing video is surprisingly easier than capturing still images. First, the simplest method is captureVideo. The no-arguments version saves to a temporary file: we invoke the version that accepts a file path, passing "temp.mov" inside the temporary directory.
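The same convenience-overload pattern is easy to mirror in Java (an analogy only, not this port's code): build a path in the platform's temp directory and hand its URL string to the path-based variant.

```java
import java.io.File;

public class TempMovieFile {
    public static void main(String[] args) {
        // analogous to NSTemporaryDirectory() + "temp.mov" -> file URL
        File temp = new File(System.getProperty("java.io.tmpdir"), "temp.mov");
        // like [furl absoluteString]: an absolute file:// URL string
        String url = temp.toURI().toString();
        System.out.println(url);
    }
}
```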
  • 23.
-(void)captureVideoFile:(NSString*)param{
    dispatch_async(dispatch_get_main_queue(), ^{
        if(movieOutput != nil) {
            [captureSession removeOutput:movieOutput];
            [movieOutput stopRecording];
            [movieOutput release];
        }
        capturingVideo = YES;
        movieOutput = [[AVCaptureMovieFileOutput alloc] init];
        [captureSession addOutput:movieOutput];
        [movieOutput startRecordingToOutputFileURL:
            [NSURL URLWithString:param] recordingDelegate:self];
    });
}

com_codename1_camerakit_impl_CameraNativeAccessImpl.m
The version that accepts a path isn't much harder. The AVCaptureMovieFileOutput class represents the recording process; we lazily initialize it, releasing any previous instance first.
  • 24. (Same code as the previous slide.) We bind the output to the capture session.
  • 25. (Same code as the previous slide.) Now we can build a URL from the file argument and start recording; again we use self as the delegate.
  • 26.
- (void)captureOutput:(AVCaptureFileOutput *)output
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
    fromConnections:(NSArray<AVCaptureConnection *> *)connections
    error:(NSError *)error {
    if(capturingVideo && outputFileURL != nil) {
        NSString* url = [outputFileURL absoluteString];
        struct ThreadLocalData* d = getThreadLocalData();
        com_codename1_camerakit_impl_CameraCallbacks_onVideo___java_lang_String(
            d, fromNSString(d, url));
        return;
    }
    if(error != nil) {
        struct ThreadLocalData* d = getThreadLocalData();
        com_codename1_camerakit_impl_CameraCallbacks_onError___java_lang_String_java_lang_String_java_lang_String(
            d, nil, nil, nil);
        return;
    }
}

com_codename1_camerakit_impl_CameraNativeAccessImpl.m
Which obviously leads us to the recording delegate method. Most of this should be very familiar now that we've gone through the image capture code.
  • 27. (Same code as the previous slide.) This callback accepts a path String, which we translate from an NSString using the fromNSString API call.
  • 28.
-(void)stopVideo{
    dispatch_async(dispatch_get_main_queue(), ^{
        [movieOutput stopRecording];
        [captureSession removeOutput:movieOutput];
        [movieOutput release];
        movieOutput = nil;
    });
}

com_codename1_camerakit_impl_CameraNativeAccessImpl.m
The last remaining piece is the stopVideo method. This is a trivial method if you've been keeping up: we just stop the recording and clean up. And with that last bit of code we are DONE! I hope you found this useful. There is a lot of code to go through, but most of it is damn trivial once you look it over. You can create native interfaces that do just about anything if you have the patience to debug and google the native APIs.