Building Native Camera Access - Part IV
In this part we continue the iOS port work, starting with CameraKitView
#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
@interface CameraKitView : UIView {
    AVCaptureVideoPreviewLayer *innerLayer;
}
-(void)layoutSubviews;
-(void)setLayer:(AVCaptureVideoPreviewLayer*)l;
@end
CameraKitView.h
Before I go into the 4 new update methods I'd like to detour through the CameraKitView class first. 

This is pretty standard. We just derive from UIView, which is the base component type for iOS UIs.
#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
@interface CameraKitView : UIView {
    AVCaptureVideoPreviewLayer *innerLayer;
}
-(void)layoutSubviews;
-(void)setLayer:(AVCaptureVideoPreviewLayer*)l;
@end
CameraKitView.h
This is the layer we set; keeping it around as a member variable is convenient.
#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
@interface CameraKitView : UIView {
    AVCaptureVideoPreviewLayer *innerLayer;
}
-(void)layoutSubviews;
-(void)setLayer:(AVCaptureVideoPreviewLayer*)l;
@end
CameraKitView.h
This is a method we're overriding from UIView. It's invoked when the UIView is arranged by iOS, which has something similar to a layout manager.
#import <Foundation/Foundation.h>
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
@interface CameraKitView : UIView {
    AVCaptureVideoPreviewLayer *innerLayer;
}
-(void)layoutSubviews;
-(void)setLayer:(AVCaptureVideoPreviewLayer*)l;
@end
CameraKitView.h
This is the setLayer method we discussed before
#import <Foundation/Foundation.h>
#import "CameraKitView.h"
@implementation CameraKitView
-(void)setLayer:(AVCaptureVideoPreviewLayer*)l {
    innerLayer = l;
}

-(void)layoutSubviews {
    innerLayer.frame = self.bounds;
}
@end
CameraKitView.m
The implementation of the class is even simpler. We store the layer into the member field in setLayer.
#import <Foundation/Foundation.h>
#import "CameraKitView.h"
@implementation CameraKitView
-(void)setLayer:(AVCaptureVideoPreviewLayer*)l {
    innerLayer = l;
}

-(void)layoutSubviews {
    innerLayer.frame = self.bounds;
}
@end
CameraKitView.m
When laying out the view I update the frame based on the bounds. Both represent the physical location of the view on the screen.

Codename One positions the UIView automatically, but the CALayer within is positioned by this class. So when Codename One places the UIView based on the layout manager, the bounds of the UIView are copied into the layer so it shows in the same place.
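To make the wiring concrete, here is a minimal sketch of how a CameraKitView and a preview layer are typically connected using standard AVFoundation calls. This is not the actual lazyInitPostAuthorization code and the variable names are made up:

CameraKitView *view = [[CameraKitView alloc] initWithFrame:CGRectZero];
AVCaptureSession *session = [[AVCaptureSession alloc] init];
AVCaptureVideoPreviewLayer *layer =
    [AVCaptureVideoPreviewLayer layerWithSession:session];
layer.videoGravity = AVLayerVideoGravityResizeAspectFill;
// the preview layer lives inside the UIView's own layer tree...
[view.layer addSublayer:layer];
// ...and setLayer: lets layoutSubviews keep its frame in sync with the view bounds
[view setLayer:layer];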
-(void)updateDirection {
    [previewLayer removeFromSuperlayer];
    [container setLayer:nil];
    [captureSession stopRunning];
    [previewLayer release];
    [captureSession release];
    [self lazyInitPostAuthorization];
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
Now that this is out of the way, let's go into the update methods. Before I get to the ones we already saw, there is a hidden one that's invoked when a user calls setFacing to change the camera direction. It wasn't in the code from before as the functionality was embedded into that code.

When we change the camera direction we need to pick a new device and effectively start over again, which is what lazyInitPostAuthorization does. Here we discard the preview layer & stop the capture session.
-(void)updateDirection {
    [previewLayer removeFromSuperlayer];
    [container setLayer:nil];
    [captureSession stopRunning];
    [previewLayer release];
    [captureSession release];
    [self lazyInitPostAuthorization];
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
Objective-C uses reference counting, so we need to release objects we allocated. Normally this is handled automatically by ARC, but ARC collides with our GC.
-(void)updateDirection {
    [previewLayer removeFromSuperlayer];
    [container setLayer:nil];
    [captureSession stopRunning];
    [previewLayer release];
    [captureSession release];
    [self lazyInitPostAuthorization];
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
Because this essentially recreates the UI, we go through the second part of the initialization over again.
Reference Counting & ARC
© Codename One 2017 all rights reserved
✦ Objective-C doesn't have garbage collection
✦ They use reference counting
✦ ARC (Automatic Reference Counting) hides manual reference counting
✓ Reference counters are problematic with cyclic references
✓ Reference counting provides more deterministic behavior
✓ Garbage collectors are faster but sometimes stall
✦ We tried a hybrid approach
Objective-C and Swift don't have a garbage collector like Java does. 

Instead they use reference counting, which means every object has a counter representing the number of places in the code that still need it. When I don't want an object deleted I invoke [myObject retain]; and when I no longer care about it I invoke [myObject release];. A retain operation increments the counter and a release call decrements it. When the counter reaches 0 the object is deallocated. There is more to it but that's the basic gist of it.
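As a toy illustration of the counting, assuming manual reference counting (non-ARC, which is how this code base is compiled):

NSObject *o = [[NSObject alloc] init]; // alloc gives us ownership, the count is 1
[o retain];                            // another part of the code needs it, the count is 2
[o release];                           // that part is done, the count drops back to 1
[o release];                           // the count reaches 0 and the object is deallocated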

A few years back Apple introduced a compiler enhancement that automatically injects the retain/release calls into the code so developers don't "see" this and it feels more like working with a GC. This is called ARC, which stands for "Automatic Reference Counting". Unfortunately we can't get ARC to play nicely with our garbage collector.

I won't go too deep into the subject of GC vs. reference counting as it's a contentious subject, but here's the gist of it:

Reference counting can fail with cyclic references (Object-A needs Object-B and vice versa). GCs are immune to such cases.

Reference counting provides more deterministic behavior. Objects are released at predictable points and the cost is spread evenly, so we can rely on its behavior.

Garbage collectors are faster but sometimes stall. For UIs this can be a problem, as a GC will behave one way in one execution and differently in another execution.

We looked at using a hybrid reference counter/GC solution when developing our VM and eventually scrapped it as there were no benefits to that approach. Both approaches are workable and you need to be ready to debug their pitfalls.
-(void)updateVideoQuality {
    if(device == nil) {
        return;
    }
    [device lockForConfiguration:nil];
    switch(videoQuality) {
        case VIDEO_QUALITY_480P:
            captureSession.sessionPreset = AVCaptureSessionPreset640x480;
            break;
        case VIDEO_QUALITY_720P:
            captureSession.sessionPreset = AVCaptureSessionPreset1280x720;
            break;
        case VIDEO_QUALITY_LOWEST:
        case VIDEO_QUALITY_QVGA:
            captureSession.sessionPreset = AVCaptureSessionPresetLow;
            break;
        case VIDEO_QUALITY_1080P:
            captureSession.sessionPreset = AVCaptureSessionPreset1920x1080;
            break;
        case VIDEO_QUALITY_HIGHEST:
        case VIDEO_QUALITY_2160P:
            captureSession.sessionPreset = AVCaptureSessionPreset3840x2160;
            break;
    }
    [device unlockForConfiguration];
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
Moving on, let's look at the updateVideoQuality method. While it's a bit bigger, the core concepts are relatively simple.

If this is invoked before start() it's totally fine; the method will be invoked again when start() is called.
-(void)updateVideoQuality {
    if(device == nil) {
        return;
    }
    [device lockForConfiguration:nil];
    switch(videoQuality) {
        case VIDEO_QUALITY_480P:
            captureSession.sessionPreset = AVCaptureSessionPreset640x480;
            break;
        case VIDEO_QUALITY_720P:
            captureSession.sessionPreset = AVCaptureSessionPreset1280x720;
            break;
        case VIDEO_QUALITY_LOWEST:
        case VIDEO_QUALITY_QVGA:
            captureSession.sessionPreset = AVCaptureSessionPresetLow;
            break;
        case VIDEO_QUALITY_1080P:
            captureSession.sessionPreset = AVCaptureSessionPreset1920x1080;
            break;
        case VIDEO_QUALITY_HIGHEST:
        case VIDEO_QUALITY_2160P:
            captureSession.sessionPreset = AVCaptureSessionPreset3840x2160;
            break;
    }
    [device unlockForConfiguration];
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
When we manipulate some configurations we need to acquire a device lock to prevent concurrent modifications.
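The code above passes nil for the error argument. A slightly more defensive variant of the same call, the pattern the tapToFocus method uses later, checks the result explicitly:

NSError *error = nil;
if([device lockForConfiguration:&error] && error == nil) {
    captureSession.sessionPreset = AVCaptureSessionPreset1280x720; // or any other preset
    [device unlockForConfiguration];
}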
-(void)updateVideoQuality {
    if(device == nil) {
        return;
    }
    [device lockForConfiguration:nil];
    switch(videoQuality) {
        case VIDEO_QUALITY_480P:
            captureSession.sessionPreset = AVCaptureSessionPreset640x480;
            break;
        case VIDEO_QUALITY_720P:
            captureSession.sessionPreset = AVCaptureSessionPreset1280x720;
            break;
        case VIDEO_QUALITY_LOWEST:
        case VIDEO_QUALITY_QVGA:
            captureSession.sessionPreset = AVCaptureSessionPresetLow;
            break;
        case VIDEO_QUALITY_1080P:
            captureSession.sessionPreset = AVCaptureSessionPreset1920x1080;
            break;
        case VIDEO_QUALITY_HIGHEST:
        case VIDEO_QUALITY_2160P:
            captureSession.sessionPreset = AVCaptureSessionPreset3840x2160;
            break;
    }
    [device unlockForConfiguration];
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
The rest is a standard switch statement mapping our constants to the iOS constants.
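A possible refinement that isn't in the original code: AVCaptureSession exposes canSetSessionPreset:, so the mapping could verify the chosen preset is actually available on the current device and fall back gracefully. The fallback preset here is just an assumption:

NSString *preset = AVCaptureSessionPreset1920x1080;
if([captureSession canSetSessionPreset:preset]) {
    captureSession.sessionPreset = preset;
} else {
    // fall back to the best the device offers rather than failing silently
    captureSession.sessionPreset = AVCaptureSessionPresetHigh;
}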
-(void)updateFlash {
    if(device == nil) {
        return;
    }
    [device lockForConfiguration:nil];
    switch(flash) {
        case FLASH_ON:
            if([device isFlashModeSupported:AVCaptureFlashModeOn]) {
                [device setFlashMode:AVCaptureFlashModeOn];
            }
            break;
        case FLASH_OFF:
            if([device isFlashModeSupported:AVCaptureFlashModeOff]) {
                [device setFlashMode:AVCaptureFlashModeOff];
            }
            break;
        case FLASH_AUTO:
            if([device isFlashModeSupported:AVCaptureFlashModeAuto]) {
                [device setFlashMode:AVCaptureFlashModeAuto];
            }
            break;
    }
    [device unlockForConfiguration];
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
This was pretty simple. Next in line is updateFlash, which is just as simple. I could go over the method, but there is really nothing here that we didn't discuss for the previous method. We return for a nil device, we lock for configuration & we convert the constant type.
-(void)updateFocus {
    if(device == nil) {
        return;
    }
    [device lockForConfiguration:nil];
    switch(focus) {
        case FOCUS_OFF:
            [device setFocusMode:AVCaptureFocusModeLocked];
            break;
        case FOCUS_CONTINUOUS:
            [device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
            break;
        case FOCUS_TAP_WITH_MARKER:
        case FOCUS_TAP: {
            [device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
            UITapGestureRecognizer *tapGR = [[UITapGestureRecognizer alloc]
                initWithTarget:self
                action:@selector(tapToFocus:)];
            [tapGR setNumberOfTapsRequired:1];
            [tapGR setNumberOfTouchesRequired:1];
            [container addGestureRecognizer:tapGR];
            break;
        }
    }
    [device unlockForConfiguration];
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
Surprisingly updateFocus does have something new to say despite being nearly identical to the first two. I'll skip the identical part and discuss the final section.

There is no builtin tap-to-focus in iOS so we need some code of our own to do this.
-(void)updateFocus {
    if(device == nil) {
        return;
    }
    [device lockForConfiguration:nil];
    switch(focus) {
        case FOCUS_OFF:
            [device setFocusMode:AVCaptureFocusModeLocked];
            break;
        case FOCUS_CONTINUOUS:
            [device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
            break;
        case FOCUS_TAP_WITH_MARKER:
        case FOCUS_TAP: {
            [device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
            UITapGestureRecognizer *tapGR = [[UITapGestureRecognizer alloc]
                initWithTarget:self
                action:@selector(tapToFocus:)];
            [tapGR setNumberOfTapsRequired:1];
            [tapGR setNumberOfTouchesRequired:1];
            [container addGestureRecognizer:tapGR];
            break;
        }
    }
    [device unlockForConfiguration];
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
This invokes the tapToFocus method on self (the equivalent of this in Objective-C) when a user taps the container. The @selector(tapToFocus:) argument names the method to invoke, similar to a method reference in Java.
// converted from this answer: https://stackoverflow.com/a/17025083/756809
-(void)tapToFocus:(UITapGestureRecognizer *)singleTap {
    CGPoint touchPoint = [singleTap locationInView:container];
    CGPoint convertedPoint = [previewLayer
        captureDevicePointOfInterestForPoint:touchPoint];
    if([device isFocusPointOfInterestSupported] &&
       [device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
        NSError *error = nil;
        [device lockForConfiguration:&error];
        if(!error) {
            [device setFocusPointOfInterest:convertedPoint];
            [device setFocusMode:AVCaptureFocusModeAutoFocus];
            [device unlockForConfiguration];
        }
    }
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
So let's look at the tapToFocus method. Notice that this logic isn't something I came up with; I took it from a Stack Overflow answer, which is very convenient for implementing this sort of thing.

A tap gives us a point in the view's coordinate space, which captureDevicePointOfInterestForPoint: converts to the capture device's normalized point-of-interest coordinates (both axes range from 0 to 1).
// converted from this answer: https://stackoverflow.com/a/17025083/756809
-(void)tapToFocus:(UITapGestureRecognizer *)singleTap {
    CGPoint touchPoint = [singleTap locationInView:container];
    CGPoint convertedPoint = [previewLayer
        captureDevicePointOfInterestForPoint:touchPoint];
    if([device isFocusPointOfInterestSupported] &&
       [device isFocusModeSupported:AVCaptureFocusModeAutoFocus]) {
        NSError *error = nil;
        [device lockForConfiguration:&error];
        if(!error) {
            [device setFocusPointOfInterest:convertedPoint];
            [device setFocusMode:AVCaptureFocusModeAutoFocus];
            [device unlockForConfiguration];
        }
    }
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
If the device supports a focus point of interest we can just set the focus to that point. It's not necessarily trivial but it's mostly boilerplate.
-(void)updateZoom {
    if(device == nil) {
        return;
    }
    [device lockForConfiguration:nil];
    device.videoZoomFactor = MAX(1.0,
        MIN(zoom, device.activeFormat.videoMaxZoomFactor));
    [device unlockForConfiguration];
}
com_codename1_camerakit_impl_CameraNativeAccessImpl.m
This brings us to the last and simplest of these methods. It's mostly a rehash of the others so I won't discuss it in depth. Notice that it clamps the zoom factor so we can't zoom too much or too little.
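As a quick worked example of that clamp (the numbers are made up):

CGFloat requested = 12.0;
CGFloat maxZoom = 8.0; // hypothetical device.activeFormat.videoMaxZoomFactor
CGFloat clamped = MAX(1.0, MIN(requested, maxZoom)); // yields 8.0, capped at the device maximum
// with requested = 0.5 the same expression yields 1.0, so we never zoom out below 1x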

And that's it: with these changes the camera will basically work! We just need to fill in a few more "details", which are mostly boilerplate.
