Touch screen technology is widely used in PDAs, smartphones, PMPs, ATMs, information kiosks, and many other types of equipment in industrial, medical, and commercial environments. The technology enabling these devices is not new: it was invented by Dr. Samuel C. Hurst in 1971. But it became far more prominent after the release of the popular iPhone and iPod touch. With new patents filed for touch screen technology, Apple has brought a new wave to this mature segment, and more companies have joined the revolution with improved interactive UIs, ICs, assembly modules, and software components.
P-ISM (“Pen-style Personal Networking Gadget Package”) is a concept under development by NEC Corporation. P-ISM is a gadget package comprising five functions: a pen-style cellular phone with handwriting data input, a virtual keyboard, a very small projector, a camera scanner, and a personal ID key with a cashless pass function. P-ISMs are connected to one another through short-range wireless technology, and the whole set connects to the Internet through the cellular phone function. This personal gadget in a minimalist pen style enables truly ubiquitous computing.
This presentation provides an overview of multi-touch hardware, products, applications, and market examples, as well as samples of TNO projects. More information at http://www.tno.nl/nui
Gesture Gaming on the World Wide Web Using an Ordinary Web Camera — IJERD Editor
Gesture gaming is a method by which users with a laptop, PC, or Xbox play games using natural or bodily gestures. This paper presents a way of playing free Flash games on the Internet using an ordinary webcam and open-source technologies. Emphasis in human activity recognition is placed on pose estimation and on the consistency of the player's pose, estimated with an ordinary web camera at resolutions from VGA up to 20 MP. Our work involved showing the user a 10-second tutorial on how to play a particular game using gestures and on the various kinds of gestures that can be performed in front of the system. The initial RGB values for the gesture component are obtained by instructing the user to place the component in a red box for about 10 seconds after the short tutorial. The system then opens the chosen game on popular Flash game sites such as Miniclip, Games Arcade, or GameStop, loads it by clicking at various places, and brings it to the state where the user only needs to perform gestures to start playing. At any point the user can quit by hitting the Esc key, and the program releases all controls and returns to the desktop. The results obtained using an ordinary webcam matched those of the Kinect, and users could relive the gaming experience of free Flash games on the net. Effective in-game advertising could therefore also be achieved, offering disruptive growth to advertising firms.
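The calibration step described above can be sketched in a few lines: sample the mean colour inside the fixed "red box" region, then locate the marker in later frames by nearest-colour matching. This is an illustrative, library-free sketch (nested lists of RGB tuples), not the paper's actual code; all names and tolerances are assumptions.

```python
# Illustrative sketch (not the paper's code): calibrate a gesture
# marker's colour from a fixed "red box" region, then locate it in
# later frames by nearest-colour matching.

def mean_rgb(frame, box):
    """Average colour inside box = (x0, y0, x1, y1), exclusive ends."""
    x0, y0, x1, y1 = box
    pixels = [frame[y][x] for y in range(y0, y1) for x in range(x0, x1)]
    n = len(pixels)
    return tuple(sum(p[c] for p in pixels) / n for c in range(3))

def find_marker(frame, target, tol=60):
    """Centroid of pixels within `tol` (summed per-channel) of target."""
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, (r, g, b) in enumerate(row):
            if abs(r - target[0]) + abs(g - target[1]) + abs(b - target[2]) <= tol:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# Calibration frame: a red patch fills the top-left 2x2 box.
red, black = (250, 10, 10), (0, 0, 0)
calib = [[red if x < 2 and y < 2 else black for x in range(6)] for y in range(6)]
target = mean_rgb(calib, (0, 0, 2, 2))

# Later frame: the marker has moved to (4, 4).
frame = [[red if x == 4 and y == 4 else black for x in range(6)] for y in range(6)]
print(find_marker(frame, target))   # -> (4.0, 4.0)
```

A real implementation would work in HSV space and smooth over several frames, but the calibrate-then-match loop is the same idea.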
Each of us is constantly surrounded by multi-touch technologies in everyday life. We keep our smartphones with us at all times, and we work with tablet computers and touch screens. In stores, in museums and exhibitions, and at trade fairs, the intuitive touch gesture on a surface has become second nature to us.
But how exactly does the underlying technology work, and how can businesses make optimal use of it, e.g. at their point of sale (POS)?
The multi-touch experts of Garamantis Interactive Technologies have gathered all this information on the ubiquitous technology and “forged” it into one large infographic.
This graphic is addressed to anyone who wants to become an instant expert on multi-touch technology within a few minutes, but particularly to businesses and agencies looking for a way to apply this technology optimally in their work.
Real-time hand gesture recognition system for dynamic applications — ijujournal
Virtual environments have long been considered a means for more visceral and efficient human-computer interaction across a diverse range of applications. The spectrum of applications includes the analysis of complex scientific data, medical training, military simulation, phobia therapy, and virtual prototyping. With the evolution of ubiquitous computing, current user interaction approaches based on the keyboard, mouse, and pen are not sufficient for the still-widening spectrum of human-computer interaction. Gloves and sensor-based trackers are unwieldy, constraining, and uncomfortable to use, and these limitations also restrict the usable command set. Direct use of the hands as an input device is an innovative method for providing natural human-computer interaction, whose lineage runs from text-based interfaces through 2D graphical interfaces and multimedia-supported interfaces to full-fledged multi-participant Virtual Environment (VE) systems. We envision a future era of human-computer interaction with 3D applications where the user can move and rotate objects simply by moving and rotating a hand, all without the help of any input device. The research effort centres on implementing an application that employs computer vision algorithms and gesture recognition techniques, resulting in a low-cost interface device for interacting with objects in a virtual environment using hand gestures. The prototype architecture comprises a central computational module that applies the CamShift technique for tracking the hands and their gestures. A Haar-like classifier is responsible for locating the hand position and classifying the gesture. Gestures are recognized by mapping the number of convexity defects formed by the hand to the assigned gestures. The virtual objects are rendered using the OpenGL library.
This hand gesture recognition technique aims to replace the mouse for interaction with virtual objects. It will be useful for controlling applications such as virtual games, image browsing, etc., in a virtual environment using hand gestures.
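At the heart of the CamShift tracking mentioned above is a mean-shift step: repeatedly recentre a search window on the centroid of the "skin probability" mass inside it. Below is a deliberately simplified, library-free sketch of that window-recentering loop (real CamShift, e.g. OpenCV's cv2.CamShift, also adapts the window's size and orientation); the grid, window, and blob values are made up for illustration.

```python
# Simplified mean-shift window tracking on a binary "skin probability"
# image (nested lists of 0/1). Not the paper's implementation.

def mean_shift(prob, window, iters=10):
    """window = (x, y, w, h); recentre it on the mass it covers."""
    x, y, w, h = window
    H, W = len(prob), len(prob[0])
    for _ in range(iters):
        xs = ys = total = 0.0
        for yy in range(max(0, y), min(H, y + h)):
            for xx in range(max(0, x), min(W, x + w)):
                p = prob[yy][xx]
                xs += p * xx
                ys += p * yy
                total += p
        if total == 0:
            break                      # no mass under the window
        nx = int(xs / total - w / 2)   # window top-left that centres
        ny = int(ys / total - h / 2)   # the window on the centroid
        if (nx, ny) == (x, y):
            break                      # converged
        x, y = nx, ny
    return (x, y, w, h)

# A 10x10 frame with a 2x2 "hand" blob at rows/cols 6-7.
prob = [[0] * 10 for _ in range(10)]
for yy in (6, 7):
    for xx in (6, 7):
        prob[yy][xx] = 1

# A window that partially overlaps the blob slides onto it.
print(mean_shift(prob, (3, 3, 4, 4)))   # -> (4, 4, 4, 4)
```

In the full algorithm this loop runs once per video frame, with the converged window of one frame seeding the next.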
SMARCOS Abstract Paper submitted to ICCHP 2012 — Smarcos Eu
This study is part of the European project "Smarcos" (http://www.smarcos-project.eu/), whose goals include the development of services specifically designed to be accessible for blind users.
In this paper we present a prototype application designed to make the main phone features available in a way that is accessible to a blind user. The prototype was developed primarily to evaluate interaction modalities based on gestures, audio, and vibro-tactile feedback.
Learn Entity Framework in a day with Code First, Model First and Database First — Jibran Rasheed Khan
Learn Entity Framework in a day with Code First, Model First and Database First
•Introduction to Entity Framework (EF)
•Architecture
•What’s new!
•Different approaches to work with (Code First, Database First and Model First)
•Choosing the right work model
•Pictorial tour of each model
•Features & Advantages
•Question & Answer
For any help or questions, feel free to contact me.
Thank you
Deep Dive: building solutions on the SharePoint Framework - SPS Brussels 2016 — Waldek Mastykarz
The must-see session for every SharePoint developer. Learn how to get the most out of the SharePoint Framework and build powerful solutions for SharePoint and Office 365 using the latest developer opportunities.
Some 10 years ago, the phenomenon of open source software was received with some scepticism and characterized as unreliable and insecure. Meanwhile, open source software is no longer the exclusive domain of IT experts; more and more executives and managers are coming into contact with it. This presentation combines an in-depth look at the subject with the importance of this software for the public sector. It offers several constructs for viewing the phenomenon and shows how you can concretely embed the concept in your policy. The presentation concludes by naming a number of trends and developments and their relevance to the public sector.
Google Glass and the wearable revolution - NYCCamp 2013 — Frank Carey
A brief history of wearables from the first iPhone to Google Glass, giving context to some of the engineering decisions and what's possible in the current API. Video for the slides is currently at http://www.ustream.tv/recorded/35842151
The PPTs from one of the events of iWillStudy.com - a leading start-up in the education space in India. This PPT was used at an event where they taught iPhone programming and application development.
On January 11, 2018, Sony Corporation released aibo (https://aibo.sony.jp/). aibo, back on the market after 12 years, is built on the robotics framework ROS. In this presentation, we introduce examples of development in aibo from the point of view of ROS, starting with an introduction to aibo, followed by its architecture, embedded technology, real-time optimization, the robot development environment, simulation, etc.
In the late 1990s, building an electronic product company was a herculean feat. Components had to be designed from scratch. Firmware engineers spent months writing low-level code to control silicon. Communication stacks were written from the ground up. Assembly lines were built by buying capital equipment and hiring manufacturing workers. Distribution was dependent on retailers buying your first 5K units without any prior sales traction.
After 30 years, the story is different. No longer do startups spend months writing communication stacks for radios; they drop in a $3 pre-certified radio module with WiFi and Bluetooth, with most of the functionality already in place, and hire a contract manufacturer willing to build 10K units each month. As in software, many of the hardest technical problems have been abstracted away. As starting a hardware startup has gotten easier, more founders have done so, and competitive advantage and barriers to entry are today's hard problems.
Fitbit and GoPro went public while significantly profitable, having raised only $77M and $90M respectively from VCs, excluding capital from CMs. This is far fewer VC dollars than the average SaaS IPO raises. Fitbit scaled from $0 to $1.9B in revenue in 7 years, while very few SaaS startups scale that quickly (average SaaS public revenue at 7 years is ~$90M). As hardware startups increasingly adopt software-like business models, it's more common to see 45%+ gross margin businesses such as GoPro, Fitbit, and Dropcam, with much higher margins on subscription or data storage products.
The “hard” part of any sector changes as old problems are solved and new ones appear. The fundamental difficulties of building any disruptive company remain the same: team, market, distribution, and marketing. These are the problems that unite all companies trying to build value in the world. Our investor panel will share their insights on evaluating, assessing, and managing high-risk ventures.
Empowered Entrepreneurs and Hyper Growth in Mobile Era — Bess Ho
Investment panel titled "Empowered Entrepreneurs and Hyper Growth in Mobile Era" at the Silicon Valley China Wireless (SVCW) Conference in 2013.
Snapchat, valued at $800M, is gaining 200 million images daily, growing faster than Instagram (130M users), Facebook (1B users), and Twitter (550M users). Instagram hit 5 million video uploads within 24 hours. In China, Alibaba's gross merchandise volume surpassed Amazon's and eBay's in Q4 2012. Sina's Weibo, China's version of Twitter, is growing at 2x year over year, reaching 530 million users and going from zero to $100 million in revenue in a year. Our investor panel will discuss upcoming startups to watch and past successful investments.
Guest lecture for The Art Institutes on mobile design, specifically targeting what design students should learn to become proficient mobile designers and to excel in designing for mobile.
JumpyBirds iTunes for Toddlers & Amazon for Moms — Bess Ho
JumpyBirds is an entertainment TV app that is going to change the way toddlers get their favorite songs at home and the way moms shop for and buy their children's digital toys. It focuses on solving the traditional problem of how difficult it is for busy working moms to find and buy quality educational DVDs for children, when the children's DVD selection in retail stores is limited by shelf space and inventory.
Our concept design demonstrates the convenience of the smartphone for unlocking fresh and additional digital content without exiting the app or installing another app. We also significantly reduce the number of clicks on the TV remote control, from roughly 50 clicks to a single click to request content, and offer alternative micro-payments on mobile. Our concept design would offer the convenience of purchasing toys from brick-and-mortar stores along with digital content. A mom will be able to buy the teddy bear that appears in the "Teddy Bear" song and surprise her children with the toy as a gift on special occasions or as encouragement for learning.
JumpyBirds's TV app idea is to deliver "happiness" to both mom and toddler, at their convenience and in the comfort of their home.
About:
JumpyBirds is a Silicon Valley-based startup focused on creative design and innovative device technology.
DevOps and Testing slides at DASA Connect — Kari Kakkonen
Slides by Rik Marselis and me from the DASA Connect conference on 30.5.2024. We discuss what testing is, then what agile testing is, and finally testing in DevOps. We closed with a lovely workshop in which participants tried to find different ways to think about quality and testing in the different parts of the DevOps infinity loop.
Key Trends Shaping the Future of Infrastructure.pdf — Cheryl Hung
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
The key trends across hardware, cloud, and open source: exploring how these areas are likely to mature and develop over the short and long term, and considering how organisations can position themselves to adapt and thrive.
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdf — 91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, the aspects they look for in a new TV, and their TV buying preferences.
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered Quality — Inflectra
In this insightful webinar, Inflectra explores how artificial intelligence (AI) is transforming software development and testing. Discover how AI-powered tools are revolutionizing every stage of the software development lifecycle (SDLC), from design and prototyping to testing, deployment, and monitoring.
Learn about:
• The Future of Testing: How AI is shifting testing towards verification, analysis, and higher-level skills, while reducing repetitive tasks.
• Test Automation: How AI-powered test case generation, optimization, and self-healing tests are making testing more efficient and effective.
• Visual Testing: Explore the emerging capabilities of AI in visual testing and how it's set to revolutionize UI verification.
• Inflectra's AI Solutions: See demonstrations of Inflectra's cutting-edge AI tools like the ChatGPT plugin and Azure OpenAI platform, designed to streamline your testing process.
Whether you're a developer, tester, or QA professional, this webinar will give you valuable insights into how AI is shaping the future of software delivery.
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do... — UiPathCommunity
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
See how to accelerate model training and optimize model performance with active learning
Learn about the latest enhancements to out-of-the-box document processing – with little to no training required
Get an exclusive demo of the new family of UiPath LLMs – GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
Generating a custom Ruby SDK for your web service or Rails API using Smithy — g2nightmarescribd
Have you ever wanted a Ruby client API to communicate with your web service? Smithy is a protocol-agnostic language for defining services and SDKs. Smithy Ruby is an implementation of Smithy that generates a Ruby SDK using a Smithy model. In this talk, we will explore Smithy and Smithy Ruby to learn how to generate custom feature-rich SDKs that can communicate with any web service, such as a Rails JSON API.
Accelerate your Kubernetes clusters with Varnish Caching — Thijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
Transcript: Selling digital books in 2024: Insights from industry leaders - T... — BookNet Canada
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
UiPath Test Automation using UiPath Test Suite series, part 3 — DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 3. In this session, we will cover desktop automation along with UI automation.
Topics covered:
UI automation Introduction,
UI automation Sample
Desktop automation flow
Pradeep Chinnala, Senior Consultant Automation Developer @WonderBotz and UiPath MVP
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
42. Touch
Audio
Visual
Sensor
Geo
Data
iPhone’s Anatomy
43. Laboratory robot testing has confirmed, together with a human test: Apple's iPhone touchscreen is by far the most accurate touch panel on the market.
Devices tested: iPhone, Google Nexus One, Motorola Droid, HTC Droid Eris, Palm Pre, BlackBerry Storm 2
Reference: http://www.appleinsider.com/articles/10/03/24/robotic_test_reconfirms_apples_iphone_touchscreen_superiority.html
iPhone’s Anatomy
45. MOTO Lab Experiment
7mm robotic “finger” for a “medium touch”
4mm robotic “finger” for a “very light touch”
iPhone’s Anatomy
47. “All touchscreens are not created equal.”
Screen sensitivity is a combination of:
1) hardware component quality
2) design
3) software integration - the operating system must ensure responsiveness for the user
iPhone’s Anatomy
48. Resistive VS Capacitive
Touchscreen Technology
49. Resistive Touchscreens
A screen where two thin metallic layers are separated by a narrow gap. A finger pushing down on the top layer makes contact with the bottom surface, and the point of contact is computed by the accompanying electronics.
iPhone’s Anatomy
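The "accompanying electronics" of a resistive panel typically drive a voltage gradient across one layer and sense it with the other, so the sensed voltage is proportional to the touch position; an ADC reading is then mapped linearly onto screen pixels. The sketch below illustrates that mapping with made-up calibration numbers (a 10-bit ADC and a hypothetical 320x480 panel), not real hardware data.

```python
# Illustrative sketch: mapping 4-wire resistive touch ADC readings
# to screen coordinates. All numbers below are assumed for the example.

ADC_MAX = 1023   # 10-bit ADC full scale

def adc_to_coord(raw, adc_min, adc_max, screen_px):
    """Map a raw ADC reading onto [0, screen_px) after calibration."""
    raw = min(max(raw, adc_min), adc_max)   # clamp to calibrated range
    return round((raw - adc_min) / (adc_max - adc_min) * (screen_px - 1))

# Hypothetical calibration for a 320x480 panel: the touchable area
# reads 80..940 on X and 100..920 on Y rather than the full 0..1023.
x = adc_to_coord(510, 80, 940, 320)    # mid-panel touch on X
y = adc_to_coord(920, 100, 920, 480)   # bottom edge on Y
print(x, y)   # -> 160 479
```

The per-device calibration constants (adc_min, adc_max) are why resistive screens usually ship with a "tap the crosshairs" calibration step.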
53. Capacitive Touchscreens
This capacitive technology responds to the electrical properties of your skin, not the pressure of your finger, to figure out where you are touching the screen.
iPhone’s Anatomy
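Concretely, a projected-capacitive controller scans a grid of electrodes, measures the capacitance change your skin induces at each intersection, and interpolates a sub-electrode touch position from the strongest responses. The sketch below shows that centroid interpolation on a tiny grid of made-up delta counts; it is an illustration of the principle, not any vendor's firmware.

```python
# Illustrative sketch: locating a touch on a capacitive sensor grid
# by taking the weighted centroid of above-threshold capacitance
# deltas. The grid values are invented for the example.

def touch_position(deltas, threshold=10):
    """Weighted centroid of grid cells whose delta exceeds threshold."""
    sx = sy = total = 0.0
    for row_idx, row in enumerate(deltas):
        for col_idx, d in enumerate(row):
            if d > threshold:
                sx += d * col_idx
                sy += d * row_idx
                total += d
    return None if total == 0 else (sx / total, sy / total)

# A finger centred between columns 1 and 2 on row 1 of a 3x4 grid.
deltas = [
    [0,  2,  3, 0],
    [1, 40, 40, 2],
    [0,  3,  2, 0],
]
print(touch_position(deltas))   # -> (1.5, 1.0)
```

Because each contact produces its own cluster of deltas, the same scan naturally supports multi-touch: cluster the above-threshold cells first, then take one centroid per cluster.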
58. Web OS vs. Native
HTML5 / JS / CSS | Java | Objective-C
Touchscreen Technology
65. Activity Monitor
CPU Sampler
Leaks
Object Allocations
Core Data
File Activity
UI Recorder
Core Animation
OpenGL ES
System Usage
Available Instruments
66. Activity Monitor - monitors overall CPU, memory, disk & network activity
CPU Sampler - precise time-based sampling of CPU usage
Leaks - detects memory leaks
Object Allocations - measures memory usage by class
Core Data - monitors Core Data activity & performance
File Activity - monitors the application’s interaction with the file system
Available Instruments
67. UI Recorder - captures & plays back UI events to run the exact same sequence of user interactions
Core Animation - measures Core Animation graphics performance and resulting CPU load
OpenGL ES - measures OpenGL ES graphics performance and resulting CPU load
System Usage - monitors file, network & memory I/O use and duration for each method
Available Instruments
76. iPhone Icon: 57 x 57 pixel
iPad Icon: 72 x 72 pixel
Promotion: 512 x 512 pixel
iPhone App Store Submission
77. 162 ppi (iPhone) vs 132 ppi (iPad)
Core Text Framework: text-rendering & layout features; animated text with special effects
Display Density (pixels per inch)
78. Hardware: Accelerometer, Compass, Core Location, Wi-Fi, Bluetooth, Microphone, headphone socket
Software: prefer saving and loading data invisibly, without an explicit "save" or "load" option; supports OpenGL ES 2.0 with legacy support for OpenGL ES 1.0
iPad
81. Signal Strength
Carrier
Current Network Connection (e.g. 3G)
Time
Battery Charge
Status Bar
82. Should be present on all screens
Switch between modes & views
Badges are superimposed on the tab bar to inform the user of new items
Tab Bar & Badges
84. Activity indicator for the nav bar, on a grey background
Indicates network activity
Display it if the task takes more than a couple of seconds to perform
Activity Indicator
85. Displays the title of the current view
Displays buttons that trigger actions on the view or navigate (e.g. Back, Cancel, Save, enabled or disabled)
Navigation Bar
89. Address field with a Cancel button, plus a search field (example: MockApp, http://mockapp.com/, with Google search)
Browser Bar
90. Placeholder text (e.g. "Type a company name or stock ID.")
Bookmarks button & Clear button
Prompt with a descriptive title above the search bar
Search Bar
91. Selection or confirmation menu with a Cancel button
Important or common actions should appear at the top
Destructive actions use a red button (e.g. Delete)
Action Sheets
94. Main message, an optional explanation of what the user needs to do, and a primary button
Confirmation message: an optional explanation of what the system is asking, with secondary and primary buttons
Alerts can also collect input (e.g. "Please enter your password" with a password field, Cancel and OK buttons)
Use them wisely: alerts require immediate user attention
Alerts
95. Supports multiple lines
Supports scrolling
Text Views
96. Display rich HTML content (e.g. an email with From/To headers and Hide / Mark Unread controls)
Web / Email Views
97. [Screenshots: the QWERTY on-screen keyboard and the numeric phone keypad]
Keyboards
98. This is a regular table view
Divided into sections (the letters are the section headers)
Each row is an item of the list and can contain several data elements (image, text, etc.)
Displays lists of items
The list is divided into sections separated by grey headers
Table Views
99. Item to delete or move: not pressed yet it shows the current status; when pressed it highlights and is ready to be deleted (Delete)
This is a regular table view with icons, like in the "More" tab of the iPod app
Table Views
100. This group has 3 items; this item has been selected
Items highlight briefly when hit
This group has only 1 item
You can insert headers too
This one lets you drill down; this one shows the current status
Segmented controls (Tab One / Tab Two / Tab Three)
Grouped Table Views
101. Contact card example: home, mobile, work and other numbers, with Text Message and Share Contact buttons
You can even insert instructions like these as well if they're helpful in this context
Switches: Silent (this item is turned OFF), Ring (this item is turned ON)
Grouped Table Views
102. [Screenshots: spinning-wheel pickers for month/day/year, hour/minute with AM/PM, and date with hour/minute durations]
Date & Time Pickers
103. First & default value
Second value
Third value
Value Picker
104. Tap ~ Single Mouse Click
Swipe: reveal the delete button in a table-view row
Drag: scroll or pan
Gestures
109. Double Tap: zoom in / zoom out
Touch & Hold: display a magnified view in editable text
Gestures
112. Pinch Open: zoom in
Pinch Close: zoom out
Gestures
121. Audio Toolbox
Framework
AV Foundation
Framework
OpenAL Framework
Audio Unit Framework
Audio Queue Services
Remote IO Unit
Media Player Framework
iPhone Audio Frameworks
131. Kodak Pearl Module
Dental System
Practice Management Systems
3D & Extraoral Imaging
Intraoral Radiography
Intraoral Digital Image
Intraoral X-ray Image
132. Video Conferencing
Eye Tracking
Iris Scanning
Front-Facing Camera (Future)
140. Accuracy Constants
CLLocation Class
locationManager.desiredAccuracy is the most important property of the Location Manager: it determines the amount of power consumed.
Constant values specify the accuracy of a location:
kCLLocationAccuracyBest: best available
kCLLocationAccuracyNearestTenMeters: 10 meters
kCLLocationAccuracyHundredMeters: 100 meters
kCLLocationAccuracyKilometer: 1000 meters
kCLLocationAccuracyThreeKilometers: 3000 meters
Core Services Layer: Core Location
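The accuracy/power trade-off can be sketched conceptually. The sketch below uses Python stand-ins for the Core Location constant names, with the radii from the table above; picking the coarsest acceptable accuracy to save power is an app-design assumption, not Core Location API behavior:

```python
# Conceptual stand-ins for the Core Location accuracy constants.
# Coarser radii generally let the device use cheaper positioning
# (assumption: power cost grows with requested precision).
ACCURACY_RADIUS_METERS = {
    "kCLLocationAccuracyBest": 0,  # best available fix
    "kCLLocationAccuracyNearestTenMeters": 10,
    "kCLLocationAccuracyHundredMeters": 100,
    "kCLLocationAccuracyKilometer": 1000,
    "kCLLocationAccuracyThreeKilometers": 3000,
}

def coarsest_accuracy_for(required_radius_m):
    """Pick the least precise (cheapest) constant that still
    satisfies the app's required radius."""
    for name, radius in sorted(ACCURACY_RADIUS_METERS.items(),
                               key=lambda kv: kv[1], reverse=True):
        if radius <= required_radius_m:
            return name
    return "kCLLocationAccuracyBest"

print(coarsest_accuracy_for(500))  # → kCLLocationAccuracyHundredMeters
```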
141. CLLocation Class
Constants
CLLocationDegrees
Delivers a latitude or longitude value specified in degrees. Data type is double.
CLLocationSpeed
Delivers the speed at which the device is moving, in meters per second. Data type is double.
Core Services Layer: Core Location
142. CLLocation Class
Constants
CLLocationDirection
Delivers a direction measured in degrees relative to true north. Data type is double.
North is 0 degrees
East is 90 degrees
South is 180 degrees
Any negative value indicates an invalid direction
Core Services Layer: Core Location
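The heading convention above (0 = north, 90 = east, negative = invalid) can be illustrated with a small Python sketch; `describe_heading` is a hypothetical helper, not part of Core Location:

```python
def describe_heading(degrees):
    """Interpret a CLLocationDirection-style value: degrees are
    measured clockwise from true north, and any negative value
    indicates an invalid direction."""
    if degrees < 0:
        return None  # invalid direction, per the convention above
    # Map the angle to one of eight compass points, 45 degrees each.
    names = ["N", "NE", "E", "SE", "S", "SW", "W", "NW"]
    return names[int((degrees % 360) + 22.5) // 45 % 8]

print(describe_heading(90))  # → E
```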
146. CLLocationManager
Core Location
Create a CLLocationManager object and get heading updates by invoking [locationManager startUpdatingHeading].
The iPhone 3GS contains a magnetometer, a magnetic field detector. It reports the raw x, y, and z magnetometer values, from which the magnitude (strength) of the magnetic field is computed.
Core Services Layer: Core Location
147. CLLocationManager
Core Location
if (locationManager.headingAvailable == NO) {
    self.locationManager = nil; // No compass is available
} else {
    // Heading service configuration
    locationManager.headingFilter = kCLHeadingFilterNone;
    // Set up delegate callbacks
    locationManager.delegate = self;
    // Start the compass
    [locationManager startUpdatingHeading];
}
Core Services Layer: Core Location
149. CLHeading
Core Location
- (void)locationManager:(CLLocationManager *)manager
       didUpdateHeading:(CLHeading *)heading {
    // Update the labels with the raw x, y, and z values.
    [xLabel setText:[NSString stringWithFormat:@"%.1f", heading.x]];
    [yLabel setText:[NSString stringWithFormat:@"%.1f", heading.y]];
    [zLabel setText:[NSString stringWithFormat:@"%.1f", heading.z]];
}
MapKit Framework: Class
151. MKAnnotationView
MKMapView
MKPinAnnotationView
MKPlacemark
MKReverseGeocoder
MKUserLocation
MapKit Framework: Class
152. MKReverseGeocoder
MKReverseGeocoder offers services to convert a map coordinate (latitude & longitude) into information such as country, city, or street. It works with a network-based map service to look up placemark information for a specified coordinate value.
Cocoa Touch Layer: MapKit Framework
153. MKReverseGeocoder
Each app is limited in the amount of reverse geocoding it can perform
Send only one reverse-geocoding request for any one user action
Reuse the results from the initial request
Do not send more than one reverse-geocoding request per minute
Cocoa Touch Layer: MapKit Framework
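The guidance above (reuse earlier results, at most one request per minute) could be enforced with a client-side throttle-and-cache guard. A Python sketch under those assumptions; `ReverseGeocodeThrottle` and the `fetch` callback are hypothetical illustrations, not MapKit API:

```python
import time

class ReverseGeocodeThrottle:
    """Conceptual client-side guard: reuse cached placemark results
    and allow at most one network request per minute."""

    def __init__(self, min_interval_s=60.0, clock=time.monotonic):
        self.min_interval_s = min_interval_s
        self.clock = clock          # injectable for testing
        self.cache = {}             # coordinate bucket -> placemark
        self.last_request = None    # time of the last network request

    def lookup(self, lat, lon, fetch):
        key = (round(lat, 4), round(lon, 4))  # ~11 m buckets
        if key in self.cache:
            return self.cache[key]  # reuse the earlier result
        now = self.clock()
        if (self.last_request is not None
                and now - self.last_request < self.min_interval_s):
            return None             # rate-limited: skip the request
        self.last_request = now
        self.cache[key] = fetch(lat, lon)  # one network round trip
        return self.cache[key]
```

Injecting the clock makes the one-per-minute rule easy to verify without real waiting.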
156. MKMapView Class
MKMapType
It delivers the type of map to display.
MKMapTypeStandard
MKMapTypeSatellite
MKMapTypeHybrid
Cocoa Touch Layer: MapKit Framework
167. Google Maps External Library
Location Services
Android SDK
168. Google Maps
External Library
Use Google APIs add-on
Download Maps external library
Must register with Google Maps
service
Obtain a Maps API Key
Android SDK
169. AndroidManifest.xml
Declare the Maps library
Request internet permission
Hide the title bar

<uses-library android:name="com.google.android.maps" />
<uses-permission android:name="android.permission.INTERNET" />
<activity android:name=".HelloMaps"
    android:label="@string/app_name"
    android:theme="@android:style/Theme.NoTitleBar">
Android SDK
iPhone Tech Day, iPhone Dev Camp, Android Lab, Nokia Workshop, BlackBerry Conference, Palm Developer Day
The iPhone was found to have straight and accurate lines, with some weaknesses at the edge of the panel under a light touch. The Nexus One showed what MOTO called "solid performance," much like the Droid Eris. Both the Palm Pre and BlackBerry Storm 2 performed well in the medium-touch test, but produced significant signal loss when a very light touch was employed. The poorest performer of the bunch was the Motorola Droid, which featured "significant waviness and stair-stepping" even in the medium-touch test; under a light touch, signal drops were extremely common.
"On inferior touchscreens, it's basically impossible to draw straight lines," MOTO reports. "Instead, the lines look jagged or zig-zag, no matter how slowly you go, because the sensor size is too big, the touch-sampling rate is too low, and/or the algorithms that convert gestures into images are too non-linear to faithfully represent user inputs."
Many layers account for the performance of a touchscreen, but it all comes down to how well the electronics and the mechanical hardware are integrated.
A projected capacitive touchscreen, the kind usually used in phones, has a glass insulator coated with a transparent conductive layer. The layer is etched into a grid-like pattern. When a finger touches the surface of the screen, it distorts the electrostatic field, which can be measured as a change in capacitance. The location of the touch is computed and passed on to a software application that translates the touch into actions for the device.
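A toy model of that computation: treat each grid node's capacitance change as a weight and report the weighted centroid of the disturbed nodes. This is an illustrative Python sketch, not any vendor's algorithm; the grid pitch and values are assumptions:

```python
def touch_location(delta_c, pitch_mm=5.0):
    """Estimate a touch point on a projected-capacitive grid from
    per-node capacitance changes: the reported point is the
    capacitance-weighted centroid of the disturbed nodes."""
    total = x_sum = y_sum = 0.0
    for row, cells in enumerate(delta_c):
        for col, dc in enumerate(cells):
            total += dc
            x_sum += dc * col * pitch_mm
            y_sum += dc * row * pitch_mm
    if total == 0:
        return None  # no node disturbed: no touch detected
    return x_sum / total, y_sum / total

# A finger centred between four nodes disturbs them equally,
# so the centroid lands midway between them.
grid = [[0, 0, 0],
        [0, 1, 1],
        [0, 1, 1]]
print(touch_location(grid))  # → (7.5, 7.5)
```

The centroid trick is also why a finer grid pitch and higher sampling rates yield smoother drawn lines, as the MOTO test above observes.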
Smartphone users have no way to measure exactly how well the capacitive sensor system on their phone is actually working. Their perception is based on the feedback they see on the screen, says Hsu. That means a touchscreen could be quite fast and accurate, but if the visual display doesn't keep up, it won't feel smooth or responsive.
That's where an ASIC, or application-specific integrated circuit, is needed to measure and amplify the signals. Apple reportedly designed its own ASIC for the iPhone's touchscreen, while most other companies buy an ASIC from one of the touchscreen chipmakers.
Palm tries to tweak the touchscreen through firmware updates.
One reason Apple's touch sensor is so sensitive to light touch is that the company uses a 12-volt power source for the sensing lines in the touchscreen sensor, versus the 3- to 5-volt power source that most other component manufacturers use. That higher drive voltage takes a toll on battery life because it consumes more power, but it also translates into more accurate sensing, which means a better touch experience, say researchers at MOTO.
JavaScript and HTML5 don't support multi-touch events. The hardware can sense up to 5 touches, but neither JS nor HTML5 is designed to handle 5 simultaneous OnClick events. WebOS also has a tough time playing multiple audio tags simultaneously. And JS can't manually release memory; it relies on a window onload page refresh to release memory.
Android - not open
Palm - not open
BlackBerry - not open
Nokia - not open
NASA's Ames Research Center, NASA scientist
Silicon sensing chip on a micro-board with 64 nanosensors
Detects trace amounts of ammonia, methane, and chlorine gas
This zoom lens by Conice attaches to your iPhone via a protective plastic case that snaps onto the phone and can easily be removed when not in use. The lens and case weigh in at 4.69 ounces, which nearly doubles the 4.8-ounce weight of your iPhone, so you will need a steady hand while taking your photos.
Open up External Accessories. Simplify hardware certification. Accept medical applications.