Core Image: The Most Fun API You're Not Using, CocoaConf Atlanta, December 2014 - Chris Adamson
Graphics on iOS and OS X isn't just about stroking shapes and paths in Core Graphics and trying to figure out OpenGL. The Core Image framework gives you access to about 100 built-in filters, providing everything from photographic effects and color manipulation to face-finding and QR Code generation. It can leverage the power of the GPU to provide performance fast enough to perform complex effects work on real-time video capture. But even if you're not writing the next Final Cut Pro or Photoshop, it's easy to call in Core Image for simple tasks, like putting a blur in part of your UI for transitions or privacy reasons. In this session, we'll explore the many ways Core Image can make your app sizzle.
What's a Core Image? An Image-Processing Framework on iOS and OS X - Flatiron School
Flatiron students Steven Zhou and Heidi Hansen explain how Core Image works on iOS and OS X to help developers process images efficiently without dealing with low-level interactions with the GPU or CPU.

Stupid Video Tricks, CocoaConf Seattle 2014 - Chris Adamson
AV Foundation makes it reasonably straightforward to capture video from the camera and edit together a nice family video. This session is not about that stuff. This session is about the nooks and crannies where AV Foundation exposes what's behind the curtain. Instead of letting AVPlayer read our video files, we can grab the samples ourselves and mess with them. AVCaptureVideoPreviewLayer, meet the CGAffineTransform. And instead of dutifully passing our captured video frames to the preview layer and an output file, how about if we instead run them through a series of Core Image filters? Record your own screen? Oh yeah, we can AVAssetWriter that. With a few pointers, a little experimentation, and a healthy disregard for safe coding practices, Core Media and Core Video let you get away with some neat stuff.
Image and Video Processing Using Adobe Image Foundation's Toolkit For Flash -... - Kevin Goldsmith
In 2007, Adobe launched Pixel Bender for the Flash Runtime, giving Flash developers access to parallel processing for the first time. This presentation was the first introduction to the new capabilities in the Flash Runtime.
Stupid Video Tricks, CocoaConf Las Vegas - Chris Adamson
AV Foundation makes it reasonably straightforward to capture video from the camera and edit together a nice family video. This session is not about that stuff. This session is about the nooks and crannies where AV Foundation exposes what's behind the curtain. Instead of letting AVPlayer read our video files, we can grab the samples ourselves and mess with them. AVCaptureVideoPreviewLayer, meet the CGAffineTransform. And instead of dutifully passing our captured video frames to the preview layer and an output file, how about if we instead run them through a series of Core Image filters? Record your own screen? Oh yeah, we can AVAssetWriter that. With a few pointers, a little experimentation, and a healthy disregard for safe coding practices, Core Media and Core Video let you get away with some neat stuff.
Stupid Video Tricks (CocoaConf DC, March 2014) - Chris Adamson
AV Foundation makes it reasonably straightforward to capture video from the camera and edit together a nice family video. This session is not about that stuff. This session is about the nooks and crannies where AV Foundation exposes what's behind the curtain. Instead of letting AVPlayer read our video files, we can grab the samples ourselves and mess with them. AVCaptureVideoPreviewLayer, meet the CGAffineTransform. And instead of dutifully passing our captured video frames to the preview layer and an output file, how about if we instead run them through a series of Core Image filters? Record your own screen? Oh yeah, we can AVAssetWriter that. With a few pointers, a little experimentation, and a healthy disregard for safe coding practices, Core Media and Core Video let you get away with some neat stuff.
Want to know the untold secrets of imaging on iOS? This talk goes through performance considerations for a number of imaging APIs on iOS, including some examples of how we integrated them in our own apps. Image loading, processing, and display will be analysed and discussed to find the best APIs for particular use cases.
Get On The Audiobus (CocoaConf Atlanta, November 2013) - Chris Adamson
Audiobus is an iOS app that allows other apps to work together as an audio-processing toolchain: play your MIDI keyboard into one app, run it through filters in other apps, and mix it in a third. All in real-time, foreground or background. That such a thing is possible on the locked down iOS platform is remarkable enough, but what's even more remarkable is that hundreds of audio apps have added Audiobus support in the few months since its debut, including Apple's own GarageBand. In this session, we'll take a look at the Audiobus SDK and see how to create inputs, outputs, and filters that can be managed by the Audiobus app to process audio in collaboration with other apps on the device.
Building A Streaming Apple TV App (CocoaConf San Jose, Nov 2016) - Chris Adamson
Apple TV offers a friendly SDK, full of familiar view controllers and Foundation classes, with everything an iOS developer needs to develop their own streaming channel. Except for… you know… the streaming part. In this session, we'll look at how Apple's HTTP Live Streaming video works -- from flat files or live sources -- and how to get it from your computer to a streaming server and then to an Apple TV. We'll also look at common challenges for building streaming channel apps, like serving metadata, protecting content, and supporting single sign-on.
Building A Streaming Apple TV App (CocoaConf DC, Sept 2016) - Chris Adamson
Apple TV offers a friendly SDK, full of familiar view controllers and Foundation classes, with everything an iOS developer needs to develop their own streaming channel. Except for… you know… the streaming part. In this session, we'll look at how Apple's HTTP Live Streaming video works -- from flat files or live sources -- and how to get it from your computer to a streaming server and then to an Apple TV. We'll also look at common challenges for building streaming channel apps, like serving metadata, protecting content, and supporting single sign-on.
Video Killed the Rolex Star (CocoaConf Columbus, July 2015) - Chris Adamson
watchOS 2.0 brings media functionality to Apple Watch, offering audio and video playback and audio capture. But lest you plan on writing Logic or Final Cut for the watch: what's available on the wrist has its limits, and you hit them quickly. In this session, we'll see what the WKInterfaceController offers us for miniature mobile media, and how we can get the benefits of AV Foundation and Core Audio by moving our movies, songs, and podcasts back and forth between the watch and the iPhone.
Firebase: Totally Not Parse All Over Again (Unless It Is) (CocoaConf San Jose... - Chris Adamson
With Facebook shutting down Parse, everybody knows to never again depend on a third party for their backend solution, right? Sure, and after you spend six months trying to write your own syncing service, how's that working? In 2016, Google has added a ton of features to Firebase, their popular backend-as-a-service solution. Firebase's primary offering is a realtime database in the cloud that syncs changes to and from multiple concurrent users, and their Swift-friendly iOS SDK makes it ideal for mobile use. In this session, you'll learn how to set up a Firebase backend and build an iOS app around it.
Get On The Audiobus (CocoaConf Boston, October 2013) - Chris Adamson
Audiobus is an iOS app that allows other apps to work together as an audio-processing toolchain: play your MIDI keyboard into one app, run it through filters in other apps, and mix it in a third. All in real-time, foreground or background. That such a thing is possible on the locked down iOS platform is remarkable enough, but what's even more remarkable is that hundreds of audio apps have added Audiobus support in the few months since its debut, including Apple's own GarageBand. In this session, we'll take a look at the Audiobus SDK and see how to create inputs, outputs, and filters that can be managed by the Audiobus app to process audio in collaboration with other apps on the device.
Video Killed the Rolex Star (CocoaConf San Jose, November 2015) - Chris Adamson
[updated from previous version to include Watch Connectivity, screenshots of WKInterfaceMovie]
watchOS 2.0 brings media functionality to Apple Watch, offering audio and video playback and audio capture. But lest you plan on writing Logic or Final Cut for the watch: what's available on the wrist has its limits, and you hit them quickly. In this session, we'll see what the WKInterfaceController offers us for miniature mobile media, and how we can get the benefits of AV Foundation and Core Audio by moving our movies, songs, and podcasts back and forth between the watch and the iPhone.
Firebase: Totally Not Parse All Over Again (Unless It Is) - Chris Adamson
With Facebook shutting down Parse, everybody knows to never again depend on a third party for their backend solution, right? Sure, and after you spend six months trying to write your own syncing service, how's that working? In 2016, Google has added a ton of features to Firebase, their popular backend-as-a-service solution. Firebase's primary offering is a realtime database in the cloud that syncs changes to and from multiple concurrent users, and their Swift-friendly iOS SDK makes it ideal for mobile use. In this session, you'll learn how to set up a Firebase backend and build an iOS app around it.
Revenge of the 80s: Cut/Copy/Paste, Undo/Redo, and More Big Hits (CocoaConf C... - Chris Adamson
When the first 128K Macs landed in 1984, it was the first time many of us could undo a mistake with just a keystroke, or exchange data between documents or applications with cut/copy/paste and the system clipboard. Fast forward 30 years and we all use this stuff… but do you know how to actually implement it? Especially on iOS, these everyday features are surprisingly absent from many developers' toolchests. In this session, we'll flash back to the era of Reagan, Rubik's Cubes, and Return of the Jedi, to see how these hot hits of the early '80s are represented in modern-day Cocoa.
Forward Swift 2017: Media Frameworks and Swift: This Is Fine - Chris Adamson
Swift is great for writing iOS and Mac apps, and its creators also mean for it to be used as a systems programming language. However, certain traits about Swift make it officially off-limits for use in some audio/video-processing scenarios. What's the deal, is it not fast enough or what? We'll look at what media apps can and can't do in Swift, and what you're supposed to do instead. We'll also look at strategies for knowing what responsibilities to dole out to Swift and to C, and how to make those parts of your code play nicely with each other.
Introduction to digital image processing: analog vs. digital images, the formation of a digital image, levels of digital image processing, components of a digital image processing system, advantages and limitations of digital image processing, and its fields of application (ultrasound, X-ray, SEM, PET, and TEM imaging).
This presentation reviews the various approaches you can take to creating animations in your iOS apps.
It discusses UIKit animations, including curling pages up (as in the Maps app), flipping back and forth (as in the Stocks app) and image view animations.
It also describes Core Animation, including its implicit animations that occur when you change the value of certain animatable properties, as well as how to create explicit animations.
COMP 4026 Advanced HCI lecture 6 on OpenFrameworks and Google's Project Soli. Taught by Mark Billinghurst at the University of South Australia on August 25th 2016.
Dino2 - the Amazing Evolution of the VA Smalltalk Virtual Machine - ESUG
First Name: John
Last Name: O'Keefe
Type: Talk
Video1: https://www.youtube.com/watch?v=Ii8Dwq1b6YI
Video2: https://www.youtube.com/watch?v=30L7fWvtddU
Over the last 18 months we have evolved the VA Smalltalk VM from a Smalltalk model-based 32-bit VM to a C-based 32/64-bit VM. During this talk I will tell the story of our journey along this evolutionary path, describe some of the innovative techniques and approaches we took to reach our goal, and demonstrate the running 64-bit VM.
Bio:
I have been developing software for over 45 years. I joined the original IBM Smalltalk prototype team in 1990 and was a founding member of the IBM VisualAge Smalltalk development team. I was Team Lead and Chief Architect of IBM VisualAge Smalltalk from 1997 to 2007. In February 2007, I joined Instantiations to lead the VA Smalltalk development team. I am currently the CTO and Principal Smalltalk Architect focusing on future product architecture and development. I live in Durham, NC and work in Raleigh, NC.
HTML5 contains many new and interesting features that make it a capable development platform. Sockets, SVG, geolocation, local storage, and many more are included in the platform. In this one-hour session, we will look at cool implementations of 10 features of HTML5.
Similar to Core Image: The Most Fun API You're Not Using (CocoaConf Columbus 2014)
Whatever Happened to Visual Novel Anime? (AWA/Youmacon 2018) - Chris Adamson
Ten years ago, visual novels were one of the most prominent sources for anime adaptations. Today, the flood has slowed to a trickle, and Steins;Gate 0 seems like the last gasp of a dying breed. This panel looks at what went right and wrong with VN anime, and whether they might ever make a comeback.
Whatever Happened to Visual Novel Anime? (JAFAX 2018) - Chris Adamson
Not long ago, adaptations of visual novels were a major inspiration for anime, with a dozen or more shows a year based on VNs. These include classics like Clannad, Fate/Stay Night, and Higurashi. Today, the flood has slowed to a trickle, with only six VN anime in 2017, all of them commercial flops. This panel will track the rise and fall of VN adaptations, the traits that make them good and bad for anime, and whether this year's Steins;Gate 0 represents a new hope for the genre or a last gasp.
Media Frameworks Versus Swift (Swift by Northwest, October 2017) - Chris Adamson
As much as we love Swift for developing our apps, playgrounds, and even on the server, there are some things for which Swift is not a good match. The media frameworks on iOS are a good example of this. Dropping into Core Audio can twist your Swift code so badly it’s hardly readable anymore. And there are parts of AV Foundation where using Swift is literally not allowed.
In this talk, we’ll show off some of these tricky scenarios, see what we can do to make things better, and think about what this means for the Swift language and its future prospects.
Fall Premieres: Media Frameworks in iOS 11, macOS 10.13, and tvOS 11 (CocoaCo... - Chris Adamson
What’s Apple planning for its media frameworks in the next 12 months? What’s it doing with Apple TV, or the HTTP Live Streaming standard? We won’t know until the curtain drops on WWDC! In this talk, we’ll amass everything audio- and video-related that gets announced throughout the week, combine it with the solid base of frameworks already present in the Apple platforms, and figure out from there what we’re going to be playing with in 2018.
CocoaConf Chicago 2017: Media Frameworks and Swift: This Is Fine - Chris Adamson
Swift is great for writing iOS and Mac apps, and its creators also mean for it to be used as a systems programming language. However, certain traits about Swift make it officially off-limits for use in some audio/video-processing scenarios. What's the deal, is it not fast enough or what? We'll look at what media apps can and can't do in Swift, and what you're supposed to do instead. We'll also look at strategies for knowing what responsibilities to dole out to Swift and to C, and how to make those parts of your code play nicely with each other.
(This is a longer version of a talk previously presented at Forward Swift 2017)
Glitch-Free A/V Encoding (CocoaConf Boston, October 2013) - Chris Adamson
The iPhone is the best iPod Apple's ever made, and the iPad has replaced the TV for many users. And while developers can use documentation and books to master the media frameworks (AV Foundation, Core Audio, and the rest), there's nothing in Xcode that will keep your audio from dropping out, fix artifacting on video with a lot of motion, or properly balance performance on the most-capable new Retina devices with backwards-compatibility with older ones. This session offers a ground-level intro to what's actually in your iTunes songs and streaming videos, and how to best encode them for the realities of iOS devices, their storage capacities and the networks they live on. We'll shoot, compress, and stream, all from a MacBook Air, and take a close look and listen to the results.
Mobile Movies with HTTP Live Streaming (CocoaConf DC, March 2013) - Chris Adamson
If your iOS app streams video, then you're going to be using HTTP Live Streaming. Between the serious support for it in iOS, and App Store rules mandating its use in some cases, there realistically is no other choice. But where do you get started and what do you have to do? In this session, we'll take a holistic look at how to use HLS. We'll cover how to encode media for HLS and how to get the best results for all the clients and bitrates you might need to support, how to serve that media (and whether it makes sense to let someone else do it for you), and how to integrate the HLS stream into your app.
Core Audio in iOS 6 (CocoaConf Chicago, March 2013) - Chris Adamson
Core Audio gets a bunch of neat new tricks in iOS 6, particularly for developers working with Audio Units. New effect units include an improved ability to vary pitch and playback speed, a digital delay unit, and OS X's powerful matrix mixer. There's now a new place to use units too, as the Audio Queue now offers developers a way to "tap" into the data being queued up for playback. To top it all off, a new "multi-route" system allows us to play out of multiple, multi-channel output devices at the same time.
Want to see, and hear, how all this stuff works? This session is the place to find out.
Look past the square braces and the damned header files, and Objective-C -- the essential language of iOS development -- really isn't that different from other object-oriented languages. Classes, single-inheritance, polymorphism, implementation hiding... check, check, check, and check. So it's really not that difficult for old Java / Python / Ruby / C++ dogs to learn new tricks once they install Xcode, right?
To be a competent Obj-C programmer, not that hard.
To be a great Obj-C programmer... now that's another story.
In this session, we will look at traits that are unique to Objective-C, the tricks that bring out the expressiveness and power of the language. We'll also look at how to write idiomatic code that will be easily understood and maintained by other Objective-C developers. We'll look at how Automatic Reference Counting resembles but is really nothing like Garbage Collection, how properties put plain old instance variables to shame, how we loosely couple classes with delegates and notification, how blocks help us un-block our code by simplifying asynchronicity, and more.
Accelerate your Kubernetes clusters with Varnish Caching - Thijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
Epistemic Interaction - tuning interfaces to provide information for AI support - Alan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
Enhancing Performance with Globus and the Science DMZ - Globus
ESnet has led the way in helping national facilities—and many other institutions in the research community—configure Science DMZs and troubleshoot network issues to maximize data transfer performance. In this talk we will present a summary of approaches and tips for getting the most out of your network infrastructure using Globus Connect Server.
Pushing the limits of ePRTC: 100ns holdover for 100 days - Adtran
At WSTS 2024, Alon Stern explored the topic of parametric holdover and explained how recent research findings can be implemented in real-world PNT networks to achieve 100 nanoseconds of accuracy for up to 100 days.
Climate Impact of Software Testing at Nordic Testing Days - Kari Kakkonen
My slides at Nordic Testing Days 6.6.2024
The climate impact and sustainability of software testing are discussed in the talk. ICT and testing must carry their part of the global responsibility to help with climate warming. We can minimize the carbon footprint, but we can also have a carbon handprint, a positive impact on the climate. Quality characteristics can be extended with sustainability, and then measured continuously. Test environments can be used less, at smaller scale, and on demand. Test techniques can be used to optimize or minimize the number of tests. Test automation can be used to speed up testing.
SAP Sapphire 2024 - ASUG301 building better apps with SAP Fiori.pdf - Peter Spielvogel
Building better applications for business users with SAP Fiori.
• What is SAP Fiori and why it matters to you
• How a better user experience drives measurable business benefits
• How to get started with SAP Fiori today
• How SAP Fiori elements accelerates application development
• How SAP Build Code includes SAP Fiori tools and other generative artificial intelligence capabilities
• How SAP Fiori paves the way for using AI in SAP apps
The New Frontiers of AI in RPA with UiPath Autopilot™ - UiPathCommunity
In this free online event, organized by the Italian UiPath Community, you can explore the new features of Autopilot, the tool that integrates artificial intelligence into the development and use of automations.
📕 Together we will look at examples of using Autopilot in several tools of the UiPath suite:
Autopilot for Studio Web
Autopilot for Studio
Autopilot for Apps
Clipboard AI
GenAI applied to Document Understanding
👨🏫👨💻 Speakers:
Stefano Negro, UiPath MVPx3, RPA Tech Lead @ BSP Consultant
Flavio Martinelli, UiPath MVP 2023, Technical Account Manager @UiPath
Andrei Tasca, RPA Solutions Team Lead @NTT Data
Removing Uninteresting Bytes in Software Fuzzing - Aftab Hussain
Imagine a world where software fuzzing, the process of mutating bytes in test seeds to uncover hidden and erroneous program behaviors, becomes faster and more effective. A lot depends on the initial seeds, which can significantly dictate the trajectory of a fuzzing campaign, particularly in terms of how long it takes to uncover interesting behaviour in your code. We introduce DIAR, a technique designed to speedup fuzzing campaigns by pinpointing and eliminating those uninteresting bytes in the seeds. Picture this: instead of wasting valuable resources on meaningless mutations in large, bloated seeds, DIAR removes the unnecessary bytes, streamlining the entire process.
In this work, we equipped AFL, a popular fuzzer, with DIAR and examined two critical Linux libraries -- Libxml's xmllint, a tool for parsing XML documents, and Binutils' readelf, an essential debugging and security analysis command-line tool used to display detailed information about ELF (Executable and Linkable Format) files. Our preliminary results show that AFL+DIAR not only discovers new paths more quickly but also achieves higher coverage overall. This work thus showcases how starting with lean and optimized seeds can lead to faster, more comprehensive fuzzing campaigns -- and DIAR helps you find such seeds.
- These are slides of the talk given at IEEE International Conference on Software Testing Verification and Validation Workshop, ICSTW 2022.
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor... - SOFTTECHHUB
The choice of an operating system plays a pivotal role in shaping our computing experience. For decades, Microsoft's Windows has dominated the market, offering a familiar and widely adopted platform for personal and professional use. However, as technological advancements continue to push the boundaries of innovation, alternative operating systems have emerged, challenging the status quo and offering users a fresh perspective on computing.
One such alternative that has garnered significant attention and acclaim is Nitrux Linux 3.5.0, a sleek, powerful, and user-friendly Linux distribution that promises to redefine the way we interact with our devices. With its focus on performance, security, and customization, Nitrux Linux presents a compelling case for those seeking to break free from the constraints of proprietary software and embrace the freedom and flexibility of open-source computing.
PHP Frameworks: I want to break free (IPC Berlin 2024) - Ralf Eggert
In this presentation, we examine the challenges and limitations of relying too heavily on PHP frameworks in web development. We discuss the history of PHP and its frameworks to understand how this dependence has evolved. The focus will be on providing concrete tips and strategies to reduce reliance on these frameworks, based on real-world examples and practical considerations. The goal is to equip developers with the skills and knowledge to create more flexible and future-proof web applications. We'll explore the importance of maintaining autonomy in a rapidly changing tech landscape and how to make informed decisions in PHP development.
This talk is aimed at encouraging a more independent approach to using PHP frameworks, moving towards a more flexible and future-proof approach to PHP development.
A tale of scale & speed: How the US Navy is enabling software delivery from l... - sonjaschweigert1
Rapid and secure feature delivery is a goal across every application team and every branch of the DoD. The Navy’s DevSecOps platform, Party Barge, has achieved:
- Reduction in onboarding time from 5 weeks to 1 day
- Improved developer experience and productivity through actionable findings and reduction of false positives
- Maintenance of superior security standards and inherent policy enforcement with Authorization to Operate (ATO)
Development teams can ship efficiently and ensure applications are cyber ready for Navy Authorizing Officials (AOs). In this webinar, Sigma Defense and Anchore will give attendees a look behind the scenes and demo secure pipeline automation and security artifacts that speed up application ATO and time to production.
We will cover:
- How to remove silos in DevSecOps
- How to build efficient development pipeline roles and component templates
- How to deliver security artifacts that matter for ATO’s (SBOMs, vulnerability reports, and policy evidence)
- How to streamline operations with automated policy checks on container images
The Art of the Pitch: WordPress Relationships and Sales - Laura Byrne
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if something changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers without pulling teeth or pulling your hair out. Practical tips, and strategies for successful relationship building that leads to closing the deal.
Core Image: The Most Fun API You're Not Using (CocoaConf Columbus 2014)
1. Core Image: The Most Fun API You’re Not Using
Chris Adamson • @invalidname
CocoaConf Columbus, August 2014
4. “Core Image is an image processing and analysis technology designed to provide near real-time processing for still and video images.”
5. Agenda
• Images, Filters, and Contexts
• The Core Image Filter Gallery
• Neat Tricks with Built-In Filters
• Core Image on OS X
6. Core Image, Core Concepts
• Core Image is lazy most of the time: building a filter chain does no pixel work
• A chain of filters describes a “recipe” of processing steps to be applied to one or more images
• “Stringly typed”
• You only get pixels when you render
7. Typical Workflow
• Start with a source CIImage
• Apply one or more filters
• Render resulting CIImage to a CIContext, or convert CIImage out to another type
• A few filters take or produce types other than CIImage (CIQRCodeGenerator)
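A minimal sketch of that workflow in Objective-C; the resource name, filter choice, and parameter values are illustrative, and error handling is omitted:

@import UIKit;
@import CoreImage;

// Hypothetical end-to-end example: load, filter, then render
static UIImage *SepiaVersionOfBundledImage(void) {
    NSURL *url = [[NSBundle mainBundle] URLForResource:@"sample" withExtension:@"jpg"];
    CIImage *source = [CIImage imageWithContentsOfURL:url];

    CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"];
    [sepia setValue:source forKey:kCIInputImageKey];
    [sepia setValue:@0.8 forKey:kCIInputIntensityKey];
    CIImage *result = [sepia valueForKey:kCIOutputImageKey];

    // No pixels exist until a CIContext renders the recipe
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cgImage = [context createCGImage:result fromRect:[result extent]];
    UIImage *rendered = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);
    return rendered;
}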
8. CIImage
• An image provided to or produced by Core Image
• But no bitmap of pixel data!
• Immutable
• -imageByCroppingToRect, -imageByApplyingTransform
• -extent — a CGRect of the image’s size
9. CIImage sources
• NSURL
• CGImageRef
• Bitmap or JPEG/PNG/TIFF in NSData
• OpenGL texture
• Core Video image/pixel buffer
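A short sketch of those creation paths; the parameter names are placeholders, and on OS X the Core Video variant is +imageWithCVImageBuffer: instead:

@import CoreImage;
@import CoreVideo;

// Illustrative: the same CIImage type can be created from several sources
static void MakeImagesFromSources(NSURL *fileURL, CGImageRef cgImage,
                                  NSData *jpegOrPNGData, CVPixelBufferRef pixelBuffer) {
    CIImage *fromURL    = [CIImage imageWithContentsOfURL:fileURL];
    CIImage *fromCG     = [CIImage imageWithCGImage:cgImage];
    CIImage *fromData   = [CIImage imageWithData:jpegOrPNGData];
    CIImage *fromBuffer = [CIImage imageWithCVPixelBuffer:pixelBuffer]; // iOS
    NSLog(@"%@, %@, %@, %@", fromURL, fromCG, fromData, fromBuffer);
}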
10. CIContext
• Rendering destination for a CIImage (-[drawImage:inRect:fromRect:])
• This is where you get pixels (also, this is the processor-intensive part)
• On iOS, must be created from an EAGLContext. On Mac, can be created with CGContextRef
• Can also produce output as a CGImageRef, bitmap data, or a CVPixelBuffer (iOS only)
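A sketch of creating a GL-backed context on iOS; the color-space option is a common optimization, not something the slide requires:

@import CoreImage;
@import OpenGLES;

// iOS: build the CIContext on top of an EAGLContext so rendering stays on the GPU
static CIContext *MakeGLBackedContext(void) {
    EAGLContext *eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
    // Passing NSNull for the working color space skips color matching, for speed
    return [CIContext contextWithEAGLContext:eaglContext
                                     options:@{ kCIContextWorkingColorSpace : [NSNull null] }];
}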
12. CIFilter
• Performs an image processing operation
• Typically takes and produces a CIImage
• All parameters are provided via -[setValue:forKey:]
• Stringly-typed!
• Output is retrieved with -[valueForKey:]
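Because the parameters are stringly typed, the filter itself is the documentation at runtime; a small sketch of interrogating one (the filter name passed in is arbitrary):

@import CoreImage;

// Ask a filter what keys it takes, and what types and ranges they expect
static void InspectFilter(NSString *filterName) {
    CIFilter *filter = [CIFilter filterWithName:filterName];
    NSLog(@"input keys: %@", [filter inputKeys]);   // e.g., inputImage, inputIntensity
    NSLog(@"attributes: %@", [filter attributes]);  // per-key class, min/max, default
    [filter setDefaults];                           // reset every parameter to its default
}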
16. Core Image Filter Reference
• Filter Name
• Parameters: note the type & number to provide
• Categories: watch for CICategoryBuiltIn and CICategoryVideo
• Example Figure
• Availability: watch for versioning and OS X-only filters
17. Filter Categories
• Group filters by functionality: CICategoryBlur, CICategoryGenerator, CICategoryCompositeOperation, etc.
• Also group filters by availability and appropriateness: CICategoryBuiltIn, CICategoryVideo, CICategoryNonSquarePixels
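A sketch of listing filters by category, which is also a practical way to check what is available on the current OS:

@import CoreImage;

// Enumerate built-in filters, then just the ones flagged as video-safe
static void LogAvailableFilters(void) {
    NSArray *builtIn = [CIFilter filterNamesInCategory:kCICategoryBuiltIn];
    NSLog(@"%lu built-in filters", (unsigned long)[builtIn count]);
    for (NSString *name in [CIFilter filterNamesInCategory:kCICategoryVideo]) {
        NSLog(@"video-capable: %@", name);
    }
}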
18. CICategoryGenerator
• No input image, just produces an output
• CICategoryGradient is also output-only
• Example: CICheckerboardGenerator
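A sketch of the checkerboard generator; a generator's output has infinite extent, so it must be cropped before rendering (the sizes and colors here are arbitrary):

@import CoreImage;

// Generators take no input image; crop the infinite output to a usable rect
static CIImage *MakeCheckerboard(void) {
    CIFilter *checkerboard = [CIFilter filterWithName:@"CICheckerboardGenerator"];
    [checkerboard setValue:[CIColor colorWithRed:1.0 green:1.0 blue:1.0] forKey:@"inputColor0"];
    [checkerboard setValue:[CIColor colorWithRed:0.0 green:0.0 blue:0.0] forKey:@"inputColor1"];
    [checkerboard setValue:@40.0 forKey:@"inputWidth"];
    CIImage *output = [checkerboard valueForKey:kCIOutputImageKey];
    return [output imageByCroppingToRect:CGRectMake(0, 0, 320, 320)];
}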
32. Other output options
• Use a CIContext
• -[drawImage:inRect:fromRect:] draws pixels to the EAGLContext (iOS) or CGContextRef (OS X) that the CIContext was created from.
• CIContext can also render to a void* bitmap
• On iOS, can create a CVPixelBufferRef, typically used for writing to a file with AVAssetWriter
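A sketch of the two non-drawing outputs; the pixel buffer would typically come from an AVAssetWriterInputPixelBufferAdaptor pool, and the names here are illustrative:

@import CoreImage;
@import CoreVideo;

// Render to a CGImage for UIKit/Quartz, or into a CVPixelBuffer for AVAssetWriter
static void RenderElsewhere(CIContext *context, CIImage *image,
                            CVPixelBufferRef pixelBuffer) {
    CGImageRef cgImage = [context createCGImage:image fromRect:[image extent]];
    // ...wrap cgImage in a UIImage/NSImage here, then release it...
    CGImageRelease(cgImage);

    [context render:image toCVPixelBuffer:pixelBuffer]; // iOS only
}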
33. Chaining filters
• Use the output of one filter as the input to the next
• This doesn’t cost anything, because the CIImages just hold state, not pixels
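A sketch of a two-filter chain (the filters and values are arbitrary); note that building the chain does no pixel work:

@import CoreImage;

// Wire one filter's output image to the next filter's input image
static CIImage *SepiaThenBlur(CIImage *source) {
    CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"];
    [sepia setValue:source forKey:kCIInputImageKey];

    CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
    [blur setValue:[sepia valueForKey:kCIOutputImageKey] forKey:kCIInputImageKey];
    [blur setValue:@4.0 forKey:kCIInputRadiusKey];

    // Still just a recipe; rendering happens at the CIContext
    return [blur valueForKey:kCIOutputImageKey];
}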
39. Apply filters
// Get CIImage from source image
// (code reconstructed from garbled slide text; the sourceImage and
// sepiaFilter property names are assumptions)
CGImageRef sourceCGImage = self.sourceImage.CGImage;
CIImage *loupeImage = [CIImage imageWithCGImage:sourceCGImage];

// Apply sepia filter
[self.sepiaFilter setValue:loupeImage forKey:kCIInputImageKey];
loupeImage = [self.sepiaFilter valueForKey:kCIOutputImageKey];

// Set sepia-filtered image as input to blend-with-mask
[_blendWithMaskFilter setValue:loupeImage forKey:kCIInputImageKey];
loupeImage = [_blendWithMaskFilter valueForKey:kCIOutputImageKey];
40. Render in CIContext
// (code reconstructed from garbled slide text; property names are assumptions)
if ([EAGLContext currentContext] != self.eaglContext) {
    [EAGLContext setCurrentContext:self.eaglContext];
}

[self.glkView bindDrawable];

// GL-on-Retina fix: CIContext draws in pixels, so scale the view's bounds up
CGRect drawBoundsInPoints = self.glkView.bounds;
drawBoundsInPoints.size.width *= self.glkView.contentScaleFactor;
drawBoundsInPoints.size.height *= self.glkView.contentScaleFactor;

// drawing to CIContext draws to the EAGLContext it's based on
[self.ciContext drawImage:loupeImage
                   inRect:drawBoundsInPoints
                 fromRect:[loupeImage extent]];

// Refresh GLKView contents immediately
[self.glkView display];
41. Working with Video
• AVFoundation AVCaptureVideoDataOutput and AVAssetReader deliver CMSampleBuffers
• CMSampleBuffers have timing information and CVImageBuffers/CVPixelBuffers
• +[CIImage imageWithCVPixelBuffer:]
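A sketch of the capture side: the standard AVCaptureVideoDataOutput delegate callback, turning each frame into a CIImage (filtering and rendering would follow):

@import AVFoundation;
@import CoreImage;

// In your AVCaptureVideoDataOutput sample buffer delegate:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *frame = [CIImage imageWithCVPixelBuffer:pixelBuffer];
    // ...run frame through a filter chain, then draw via a GL-backed CIContext...
}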
43. Chroma Key (“green screen”) recipe
• Use a CIColorCube to map green-ish colors to transparent
• Use CISourceOverCompositing to draw this alpha’ed image over another image
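A sketch of the color-cube half of that recipe. Apple's published version converts each lattice color to HSV and keys on a hue range; this cruder green-dominance test keeps the sketch short, and the threshold values are guesses:

@import CoreImage;

// Build a 64x64x64 RGBA lookup cube that zeroes out green-dominant colors
static CIFilter *MakeChromaKeyFilter(void) {
    const unsigned int size = 64;
    NSMutableData *cubeData =
        [NSMutableData dataWithLength:size * size * size * 4 * sizeof(float)];
    float *cube = (float *)[cubeData mutableBytes];
    size_t i = 0;
    for (unsigned int b = 0; b < size; b++) {         // blue varies slowest
        for (unsigned int g = 0; g < size; g++) {
            for (unsigned int r = 0; r < size; r++) { // red varies fastest
                float rf = (float)r / (size - 1);
                float gf = (float)g / (size - 1);
                float bf = (float)b / (size - 1);
                float alpha = (gf > 0.3f && gf > 1.5f * rf && gf > 1.5f * bf) ? 0.0f : 1.0f;
                cube[i++] = rf * alpha;  // premultiplied RGBA
                cube[i++] = gf * alpha;
                cube[i++] = bf * alpha;
                cube[i++] = alpha;
            }
        }
    }
    CIFilter *colorCube = [CIFilter filterWithName:@"CIColorCube"];
    [colorCube setValue:@(size) forKey:@"inputCubeDimension"];
    [colorCube setValue:cubeData forKey:@"inputCubeData"];
    return colorCube; // set inputImage, then composite its output with CISourceOverCompositing
}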
51. Other Points of Interest
• CIQRCodeGenerator filter — converts data (e.g., a string) to a QR Code
• CILenticularHaloGenerator filter — aka, lens flare
• CIDetector — class (not a filter) to find features in images. Currently only supports face finding (returned as an array of CIFeatures). Optionally detects smiles and eye blinks within faces.
• CIImage has a red-eye enhancement that takes the array of face CIFeatures to tell it where to apply the effect
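A sketch of face finding with CIDetector; the smile and eye-blink options are iOS 7+, and NSStringFromCGRect is the UIKit spelling:

@import CoreImage;
@import UIKit;

// Find faces, asking the detector to also check for smiles and closed eyes
static void LogFaces(CIImage *image) {
    CIDetector *detector =
        [CIDetector detectorOfType:CIDetectorTypeFace
                           context:nil
                           options:@{ CIDetectorAccuracy : CIDetectorAccuracyHigh }];
    NSArray *faces = [detector featuresInImage:image
                                       options:@{ CIDetectorSmile : @YES,
                                                  CIDetectorEyeBlink : @YES }];
    for (CIFaceFeature *face in faces) {
        NSLog(@"face at %@, smiling: %d, both eyes closed: %d",
              NSStringFromCGRect(face.bounds), face.hasSmile,
              (face.leftEyeClosed && face.rightEyeClosed));
    }
}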
52. Core Image on OS X
• Core Image is part of QuartzCore (or Image Kit), so you don’t @import CoreImage
• Many more filters are available
• Can create your own filter with OpenGL Shading Language (plus some CI extensions). See CIKernel.
• Also available in iOS 8
• Filters can be set on CALayers
53. CALayer Filters on OS X
• Views must be layer-backed (obviously)
• Must also call -[NSView setLayerUsesCoreImageFilters:] on 10.9+
• CALayer has properties: filters, compositingFilter, backgroundFilters, minificationFilter, magnificationFilter
• These exist on iOS, but do nothing
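A sketch of attaching a filter to a layer-backed view on OS X; the view setup is shown inline for brevity, and the blur radius is arbitrary:

@import AppKit;
@import QuartzCore;

// OS X: blur everything that shows through behind this view's layer
static void BlurBehindView(NSView *view) {
    [view setWantsLayer:YES];
    [view setLayerUsesCoreImageFilters:YES]; // required on 10.9+

    CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
    [blur setDefaults];
    [blur setValue:@8.0 forKey:kCIInputRadiusKey];
    blur.name = @"blur"; // enables later updates via layer keypaths

    view.layer.backgroundFilters = @[ blur ];
}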
57. Wrap Up: Stuff to Remember
• Get psyched about filters, but remember to check that they’re on your targeted platform/version.
• Drawing to a CIContext on iOS must be GL-backed (e.g., with a GLKView)
58. Q&A
Slides and code will be posted to:
http://www.slideshare.net/invalidname/
@invalidname
http://subfurther.com/blog