Exposing Security Risks
For Commercial Mobile
Devices (CMDs)
Jeffrey Voas, PhD, FIEEE
Jeff.voas@nist.gov
NIST
Maturity of Technologies (source Gartner)
CIO Business Priorities
Mobile Device Manufacturer Sales
Mobile OS trends
Risks in Mobile Security Supply Chain
[Diagram] Elements shown: Devices; Secure, Verify, Test, Deploy stages; Enterprise Security (at each stage); Device Provisioning; MDM/Middleware Providers.
Hardware Components of Interest
• CPU
• Display
• Graphics
• Audio
• Microphone
• Wi-fi
• GPS
• Touchscreen
• Accelerometer
• Compass
• And more…
Application Vetting: Big Picture
Progression of Testing
Results
• An examiner reviews the responses from the developer.
• A human evaluation of trust in the responses is made.
• If questionable, the developer may be asked for clarifications, or the app may be recommended for re-work.
• If the responses are believed, the app is ready for app-store inclusion or for the next two assurance approaches.
Questions
1. Does the app that you offer contain any security issues that need to be mitigated or revealed? For example: does the app access phone, network (Wi-Fi and/or Bluetooth), and/or GPS resources? Does the app access the camera or other recording hardware or software? Does the app access operating system resources or libraries? What information, if any, is transferred from the phone to a remote host (including during PC synchronization) by the app? If so, please explain. If you believe that your app does not contain any security issues, explain why and ignore the remaining questionnaire. If you answer “no” here, please provide detail and sound reasoning. If you answer “yes,” please address as many of the following questions as are reasonable for your circumstances. Missing information may trigger additional review of your application for app approval and thus delay adoption into the app store.
2. How long has the software source been available? Is there an active user community providing peer review and
actively evolving the software?
3. Does the license/contract restrict the licensee from discovering flaws or disclosing details about software
defects or weaknesses with others (e.g., is there a “gag rule” or limits on sharing information about discovered
flaws)?
4. Does software have a positive reputation? Does software have a positive reputation relative to security? Are
there reviews that recommend it?
5. What are the processes (e.g., ISO 9000, CMMI, etc.), methods, tools (e.g., IDEs, compilers), techniques, etc. used
to produce and transform the software (brief summary response)?
6. What security measurement practices and data does the company use to assist product planning?
7. Describe the training the company offers related to defining security requirements, secure architecture and
design, secure coding practices, and security testing. Explain.
8. Are there some requirements for security that are “structured” as part of general release-ability of a product
and others that are “as needed” or “custom” for a particular release?
9. What process is utilized by the company to prioritize security-related enhancement requests?
10. What review processes are implemented to ensure that nonfunctional security requirements are unambiguous,
traceable and testable throughout the entire Software Development Life Cycle (SDLC)?
11. Are security requirements developed independently of the rest of the requirements engineering activities, or are
they integrated into the mainstream requirements activities?
12. Are misuse/abuse cases derived from the application requirements? Are relevant attack patterns used to
identify and document potential threats?
13. What threat assumptions were made, if any, when designing protections for the software and information
assets processed?
14. What security design and security architecture documents are prepared as part of the SDLC process?
15. What threat modeling process, if any, is used when designing the software protections?
16. How are confidentiality, availability, and integrity addressed in the software design?
17. What are/were the languages and non-developmental components used to produce the software (brief summary
response)?
18. What secure development standards and/or guidelines are provided to developers?
Questions (cont.)
19. Are tools provided to help developers verify that the software they have produced has a minimal number of
weaknesses that could lead to exploitable vulnerabilities? What are the tools, and how have they been qualified?
What is the breadth of common software weaknesses covered (e.g., specific CWEs)?
20. Does the company have formal coding standards for each language in use? If yes, how are they enforced?
21. Does the software development plan include security peer reviews?
22. Does the organization incorporate security risk management activities as part of the software development
methodology? If yes, will a copy of the documentation of this methodology be available or information on how to
obtain it from a publicly accessible source?
23. Does the software’s exception-handling mechanism prevent all faults from leaving the software, its resources,
and its data (in memory and on disk) in a vulnerable state? Does the exception-handling mechanism provide
more than one option for responding to a fault? If so, can the exception-handling options be configured by the
administrator or overridden?
24. Does the software validate (e.g., filter with whitelisting) inputs from potentially untrusted sources before they are used? Is a validation test suite or diagnostic available to verify that the application software is operating correctly and in a secure configuration following installation?
25. Has the software been designed to execute within a constrained execution environment (e.g., virtual machine,
sandbox, chroot jail, single-purpose pseudo-user, etc.) and is it designed to isolate and minimize the extent of
damage possible by a successful attack?
26. Does the documentation explain how to install, configure, and/or use the software securely? Does it identify
options that should not normally be used because they create security weaknesses?
27. Where applicable, does the program use run-time infrastructure defenses (such as address space randomization,
stack overflow protection, preventing execution from data memory, and taint checking)?
28. How does the company minimize the risk of reverse engineering of binaries? Are source code obfuscation
techniques used? Are legal agreements in place to protect against potential liabilities of non-secure software?
29. Does the software default to requiring the administrator (or user of a single-user software package) to expressly
approve the automatic installation of patches/upgrades, downloading of files, execution of plug-ins or other
“helper” applications, and downloading and execution of mobile code?
30. Does the software have any security critical dependencies or need additional controls from other software (e.g.,
operating system, directory service, applications), firmware, or hardware? If yes, please describe.
31. Does the software include content produced by suppliers other than the primary developer? If so, who?
32. What are the policies and procedures for verifying the quality and security of non-developmental components
used?
33. What types of functional tests are/were performed on the software during its development (e.g., spot checking,
component-level testing, integrated testing)?
34. Who performs security tests on the product, and when? Are tests performed by an internal test team, by an independent third party, or by both?
35. What degree of code coverage does testing provide?
36. Are misuse test cases included to exercise potential abuse scenarios of the software?
Questions (cont.)
37. Are security-specific regression tests performed during the development process? If yes, how frequently are the
tests performed? Are regression test scripts available?
38. Does the company’s defect classification scheme include security categories? During testing, what proportion of identified defects relate to security?
39. How has the software been measured/assessed for its resistance to identified, relevant attack patterns? Are
Common Vulnerabilities & Exposures (CVEs) or Common Weakness Enumerations (CWEs) used? How have
exploitable flaws been tracked and mitigated?
40. Has the software been evaluated against the Common Criteria, FIPS 140-2, or other formal evaluation process?
What evaluation assurance was achieved? If the product claims conformance to a protection profile, which
one(s)? Are the security target and evaluation report available?
41. Are static or dynamic software security analysis tools used to identify weaknesses in the software that can lead
to exploitable vulnerabilities? If yes, which tools are used? What classes of weaknesses are covered? When in
the SDLC (e.g., unit level, subsystem, system, certification and accreditation) are these scans performed? Are
SwA experts involved in the analysis of the scan results?
42. Are there current publicly-known vulnerabilities in the software (e.g., an unrepaired CWE entry)?
43. Has the software been certified and accredited? What release/version/configuration? When? By whom? What
criteria or scheme was used to evaluate and accredit the software?
44. Is there a Support Lifecycle Policy within the organization for the software in question? Does it outline and
establish a consistent and predictable support timeline?
45. How will patches and/or Service Packs be distributed to the purchasing/using organization?
46. How extensively are patches and Service Packs tested before they are released?
47. How are reports of defects, vulnerabilities, and security incidents involving the software collected, tracked, and
prioritized?
48. What policies and processes does the company use to verify that software components do not contain
unintended, “dead”, or malicious code? What tools are used?
49. How frequently are major versions of the software released?
50. Are configuration/change controls in place to prevent unauthorized modifications or additions to source code
and related documentation? Do these controls detect and report unexpected modifications/additions to source
code? Do they aid in rolling back an affected artifact to a pre-modified version?
51. Does the company perform background checks on members of the software development team? If so, are there
any additional “vetting” checks done on people who work on critical application components, such as security?
Explain.
52. Please provide the company names of all third-party entities (foreign and domestic) with whom you, the supplier, contract software development for this app.
Results
• Dynamic reliability and performance
measurement of the product in the lab under
assumed operational profiles.
• Static analysis of the source code using COTS
and open source tools that search for
programming errors such as buffer overflows.
– Top 25 Common Programming Weaknesses (CWEs)
[http://cwe.mitre.org/top25/#ProfileAutomatedManual]
– Capability to identify Application Bugs and Unwanted
Functionality
Common Weakness
Enumerations (CWEs)
• Detects hundreds of CWEs
• Detects 23 of the top 25 CWEs (an illustrative code example of one entry appears after this list):
– 1. CWE-79 Improper Neutralization of Input During Web Page Generation ('Cross-site Scripting')
– 2. CWE-89 Improper Neutralization of Special Elements used in an SQL Command ('SQL Injection')
– 3. CWE-120 Buffer Copy without Checking Size of Input ('Classic Buffer Overflow')
– 4. CWE-352 Cross-Site Request Forgery (CSRF)
– 5. CWE-285 Improper Access Control (Authorization)
– 6. CWE-807 Reliance on Untrusted Inputs in a Security Decision
– 7. CWE-22 Improper Limitation of a Pathname to a Restricted Directory ('Path Traversal')
– 8. CWE-434 Unrestricted Upload of File with Dangerous Type
– 9. CWE-78 Improper Neutralization of Special Elements used in an OS Command ('OS Command Injection')
– 10. CWE-311 Missing Encryption of Sensitive Data
– 11. CWE-798 Use of Hard-coded Credentials
– 12. CWE-805 Buffer Access with Incorrect Length Value
– 13. CWE-98 Improper Control of Filename for Include/Require Statement in PHP Program ('PHP File Inclusion')
– 14. CWE-129 Improper Validation of Array Index
– 15. CWE-754 Improper Check for Unusual or Exceptional Conditions
– 16. CWE-209 Information Exposure Through an Error Message
– 17. CWE-190 Integer Overflow or Wraparound
– 18. CWE-131 Incorrect Calculation of Buffer Size
– 19. CWE-306 Missing Authentication for Critical Function (not supported)
– 20. CWE-494 Download of Code Without Integrity Check (not supported)
– 21. CWE-732 Incorrect Permission Assignment for Critical Resource
– 22. CWE-770 Allocation of Resources Without Limits or Throttling
– 23. CWE-601 URL Redirection to Untrusted Site ('Open Redirect')
– 24. CWE-327 Use of a Broken or Risky Cryptographic Algorithm
– 25. CWE-362 Race Condition
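For context, here is a minimal Java illustration (not taken from the slides) of what one Top-25 entry looks like in practice. CWE-89 arises when untrusted input is concatenated into an SQL command; the parameterized form is the remediation a static analyzer would typically expect. The class and method names are hypothetical.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;

    public class Cwe89Example {
        // CWE-89: untrusted input concatenated directly into an SQL command.
        static ResultSet vulnerableLookup(Connection db, String userName) throws SQLException {
            Statement stmt = db.createStatement();
            return stmt.executeQuery("SELECT * FROM accounts WHERE name = '" + userName + "'");
        }

        // Remediation a static analyzer expects: a parameterized query.
        static ResultSet saferLookup(Connection db, String userName) throws SQLException {
            PreparedStatement stmt = db.prepareStatement("SELECT * FROM accounts WHERE name = ?");
            stmt.setString(1, userName);
            return stmt.executeQuery();
        }
    }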
NIST’s National Vulnerability Database
(NVD)
• Free to Public
• Contains 176 Android Vulnerabilities
Results
• Data collected may include:
– Amount of time an app is executed
– Type and amount of data transmitted
– Feature usage within an app
– Number of exception calls
• Benefits include:
– Usage data that can be used for billing
– Reducing the number of apps/functionality
– Additional app testing
Note: instrumentation can be turned on and off easily, and applied selectively; it does, however, incur performance and footprint costs. (A minimal instrumentation sketch follows.)
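A minimal sketch of what such in-field instrumentation hooks could look like, assuming a simple on/off switch and in-memory counters; the class and method names are illustrative and are not part of the ATP.

    import java.util.concurrent.ConcurrentHashMap;
    import java.util.concurrent.atomic.AtomicLong;

    // Hypothetical in-field instrumentation hooks (names are illustrative).
    public class AppInstrumentation {
        private static final ConcurrentHashMap<String, AtomicLong> featureUse = new ConcurrentHashMap<>();
        private static final AtomicLong exceptionCount = new AtomicLong();
        private static volatile boolean enabled = true;   // instrumentation can be switched off selectively

        public static void recordFeature(String feature) {
            if (!enabled) return;
            featureUse.computeIfAbsent(feature, k -> new AtomicLong()).incrementAndGet();
        }

        public static void recordException(Throwable t) {
            if (enabled) exceptionCount.incrementAndGet();
        }

        public static long exceptions() { return exceptionCount.get(); }
    }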
Application Static Analysis Does Not Cover Program Functionality
Fortify, Coverity, and other application testing tools cover general, non-Android-specific bugs:
• No security analysis of the code functionality
• No power analysis of the application components and code
• No profiling of the resource consumption of individual applications
• Cannot regulate/deny the access and use of phone subsystems (camera, microphone, GPS, etc.)
Application Testing Framework
Android-specific analysis includes analysis of the application security manifest (not supported by third-party vendors):
• Tailored to the Android permission model
• Verifies whether the requested permissions are warranted by the submitted code (see the permission-audit sketch after this list)
• Curtails excessive permissions and enforces a tighter security model
Modifications to the Android engine enable dynamic policies:
• Control the underlying Dalvik engine to report absence/depletion of resources instead of lack of permissions
• Regulate access to critical/restricted resources
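A rough sketch of the permission-audit idea, assuming a decoded, plain-text AndroidManifest.xml (the manifest inside an APK is binary XML) and a separate code scan that yields the set of permissions the code actually needs. Names are hypothetical; this is not the ATP's implementation.

    import java.io.File;
    import java.util.HashSet;
    import java.util.Set;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.NodeList;

    // Illustrative only: compares permissions declared in AndroidManifest.xml against
    // permissions a (hypothetical) code scan says the app actually needs.
    public class PermissionAudit {
        public static Set<String> declaredPermissions(File manifest) throws Exception {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().parse(manifest);
            NodeList nodes = doc.getElementsByTagName("uses-permission");
            Set<String> declared = new HashSet<>();
            for (int i = 0; i < nodes.getLength(); i++) {
                Element e = (Element) nodes.item(i);
                declared.add(e.getAttribute("android:name"));
            }
            return declared;
        }

        // Flags permissions requested in the manifest but never justified by the code scan.
        public static Set<String> excessivePermissions(Set<String> declared, Set<String> usedByCode) {
            Set<String> excess = new HashSet<>(declared);
            excess.removeAll(usedByCode);
            return excess;
        }
    }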
ATP Architecture
ATP analyzes Android code bundles and returns messages, analysis reports, and signed APKs.
[Architecture diagram] Elements shown: Developer, Security Assessor, Application Store, ATP Repository; an Android code bundle flows in, and analysis reports & signed APKs flow out. Application Testing Portal components: App Manager, Analyses Engine, Request Handler, Registration Handler, Submission Validator, UI Handler, API Handler, Pre-Processor, Tool Invoker, Post-Processor, APK Compiler/Signer, Result Handler.
The security assessor examines submissions that do not pass ATP analysis.
Mobilize-ATP Workflow (PASS Use-Case)
ATP applies testing to analyze Android code bundles. AVs and testing tools are invoked in parallel on received submissions; APKs are generated and signed only if all security analyses pass. Actors in the diagram: the NIST Testing Portal (ATP) and the App Store.
1. Submit Android code bundle
2. Register submission
3. Tool 1 analysis
4. Tool 1 status message & analysis report
5. Tool 2 analysis
6. Tool 2 status message & analysis report
7. Tool n analysis
8. Tool n status message & analysis report
9. Assess results (PASS?)
10. Sign APK
11. PASS message & APK
(A sketch of the PASS gate logic follows this list.)
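A minimal sketch of the PASS gate described above: invoke all analysis tools in parallel and sign the APK only if every tool passes. The types and names are hypothetical stand-ins, not the ATP API.

    import java.util.List;
    import java.util.concurrent.Callable;
    import java.util.concurrent.ExecutorService;
    import java.util.concurrent.Executors;
    import java.util.concurrent.Future;

    // Hypothetical sketch of the PASS gate.
    public class PassGate {
        public static boolean runAnalyses(List<Callable<Boolean>> tools) throws Exception {
            ExecutorService pool = Executors.newFixedThreadPool(Math.max(1, tools.size()));
            try {
                boolean allPassed = true;
                for (Future<Boolean> result : pool.invokeAll(tools)) {
                    allPassed &= result.get();   // collect each tool's status
                }
                return allPassed;
            } finally {
                pool.shutdown();
            }
        }

        public static void vet(List<Callable<Boolean>> tools, Runnable signApk) throws Exception {
            if (runAnalyses(tools)) {
                signApk.run();                   // step 10: sign the APK only on PASS
            }                                    // otherwise a security assessor reviews the submission
        }
    }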
Application Analysis Report
ATP Monitor
Application Analysis Results
Analyzed ~267,000 applications from the Google Android Market.
• Many applications with:
– Access to camera/GPS/microphone
– Access to SD card data/contacts
– Network/reach-back functionality (Updates/???)
– Persistent presence (service)
– Deviation in functionality from what was included in the application description
– Some with remote access capabilities
Defense in Depth: Multiple Levels of Security
• Application vetting & testing
• Device lock-down and encryption of ALL data and communications
• Enforcement of security policies in the Android framework
• Second-level defenses placed in the Android Linux kernel
– Prevent attacks that bypass the Android security framework
– Android has inherited some (if not all) of the Linux vulnerabilities
– The Java Native Interface to Linux libraries is a potential avenue for exploitation
Conclusions
Assuring the secure operation of smart devices has a wide range of requirements:
• Application testing
– Static & dynamic
– In-field instrumentation
– Power behavior metering & policing
• Physical device security
– Lock-down of the device I/O (USB, Wi-Fi, etc.)
– Encryption of data both on the phone & network
• Securing the provisioning process
Power Metering Framework
• Design & implement an accurate model for accounting and policing energy consumption
• Two-pronged approach:
– Meter the per-process CPU & device utilization over time
– Identify the relative impact of each device component on energy consumption
• Design an Android kernel subsystem to estimate energy:
– Meter energy consumption for each app/process
– Use it to characterize application behavior; this behavior is application dependent, and sometimes also user dependent
(A userspace sketch of per-process CPU metering follows.)
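A userspace approximation of the per-process CPU metering input, assuming a Linux/Android /proc layout; the slides describe a kernel subsystem, so this is illustrative only.

    import java.nio.file.Files;
    import java.nio.file.Paths;

    // Illustrative only: read per-process CPU time from /proc/<pid>/stat
    // as one input to an energy estimate.
    public class CpuMeter {
        // Returns utime + stime in clock ticks for the given pid.
        public static long cpuTicks(int pid) throws Exception {
            String stat = new String(Files.readAllBytes(Paths.get("/proc/" + pid + "/stat")));
            String[] fields = stat.substring(stat.lastIndexOf(')') + 2).split(" ");
            long utime = Long.parseLong(fields[11]);   // overall field 14 (utime)
            long stime = Long.parseLong(fields[12]);   // overall field 15 (stime)
            return utime + stime;
        }
    }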
Power Analysis & Modeling Goals
• Understand Power Consumption on Mobile Devices
• How do we accurately model power consumption on a smartphone?
– Some tasks are delegated (e.g., Wi-Fi)
– There are many hardware subsystems (e.g., input sensors)
• Accounting for where and for what power is
consumed in real-time can help the user make
informed decisions
• Use the Power Profiling to Design Defenses
Why is power profiling important?
Market study: up to 75% of total power consumption is spent by applications powering third-party advertisements [1].
There is an incentive for developers and users to use a proper energy accounting infrastructure to make more informed decisions about where to spend remaining device power.
[1] http://www.bbc.co.uk/news/technology-17431109
Contributions
• Measurements of each subsystem were performed by leveraging system hooks in the Android kernel wakelocks driver.
• Our power model is time-interval independent; measurements are cumulative and real-time.
• Our model has no pre-defined set of states with fixed power ratings for each state.
• In-depth understanding of how power is consumed in the subsystems: CPU, display, Wi-Fi, audio.
PIXEL POWER ANALYSIS
Display Subsystem (1): Model
• Display uptime: the ‘main’ wakelock measures the amount of time that the display was alive.
• Each pixel consumes a different amount of energy depending on its color.
• Display subsystem usage: Udisp = pixel strength × display uptime
• We calculate pixel strength from a snapshot of the framebuffer at a fixed time interval. (A sketch of this computation follows.)
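A sketch of the display-usage computation under the stated model, assuming an ARGB snapshot of the framebuffer is available as an int array and using the per-channel weights reported on the later regression slide.

    // Illustrative sketch of Udisp = pixel strength x display uptime.
    public class DisplayUsage {
        // Relative per-channel impact on discharge, from the regression on the later slides.
        static final double RW = 0.4216, GW = 0.7469, BW = 1.0;

        // Mean per-pixel "strength" of one ARGB framebuffer snapshot.
        public static double pixelStrength(int[] argbPixels) {
            double sum = 0;
            for (int argb : argbPixels) {
                int r = (argb >> 16) & 0xFF, g = (argb >> 8) & 0xFF, b = argb & 0xFF;
                sum += RW * r + GW * g + BW * b;
            }
            return sum / argbPixels.length;
        }

        // Accumulate Udisp one sampling interval at a time while the display is alive.
        public static double accumulateUdisp(double uDisp, int[] snapshot, double intervalSeconds) {
            return uDisp + pixelStrength(snapshot) * intervalSeconds;
        }
    }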
Display Subsystem (2): Pixel Color
[Chart] Display Brightness vs. Rate of Current Discharge. X-axis: display brightness (0–100%); Y-axis: rate of current discharge (roughly 1–10 mAh/m). Curves were measured for solid colors: black (rgb 0,0,0), white (rgb 255,255,255), red (rgb 255,0,0), green (rgb 0,255,0), blue (rgb 0,0,255), magenta (rgb 255,0,255), cyan (rgb 0,255,255), orange (rgb 255,255,0).
Display Subsystem (3): Pixel Consumption
• Red, green, and blue have different impacts on current discharge.
• By regression, we found the relative impact to be (0.4216 : 0.7469 : 1) for R : G : B, respectively.
• The remaining colors can be calculated from their RGB values.
• Let’s calculate orange vs. blue at 100% brightness:
– Orange: 0.4216*255 + 0.7469*255 + 1*0 = 297.968 (pixel power)
– Blue: 0.4216*0 + 0.7469*0 + 1*255 = 255.00 (pixel power)
• Verification: comparing these values to the areas under the curves from the graph, we find the error = 0.87%, i.e., (255.00 : 297.968) :: (165.316 : 191.499). (A short calculation sketch follows.)
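The worked comparison as a small program, using the regression weights above; the printed values should match the slide's 297.968 and 255.00.

    public class PixelPower {
        // R : G : B relative impact from the regression.
        static final double RW = 0.4216, GW = 0.7469, BW = 1.0;

        static double pixelPower(int r, int g, int b) {
            return RW * r + GW * g + BW * b;
        }

        public static void main(String[] args) {
            double orange = pixelPower(255, 255, 0);   // slide value: 297.968
            double blue   = pixelPower(0, 0, 255);     // slide value: 255.00
            System.out.printf("orange=%.3f blue=%.3f ratio=%.4f%n", orange, blue, orange / blue);
            // The slide checks this ratio against the areas under the measured curves
            // (165.316 : 191.499) and reports an error of 0.87%.
        }
    }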
Conclusions
• Our measuring and power-consumption analysis approach is accurate and has a small power footprint.
• The results from our regression equations are obtained instantaneously and do not require much CPU time. Power usage is attributed to individual applications with a low error of under 1%.
• Individual subsystems (CPU, display, Wi-Fi, audio) were analyzed for their power utilization.
• A paper in IEEE SERE’12 is available on request.
High-Level Project Overview
[Diagram] Elements shown: outpost, App Developers, App Store, Banks.
• Vetted apps ultimately go into an app store.
• Backflows of user feedback and in-field test data.
• If feedback is good, an app becomes app-store accepted and money is deposited; otherwise, a new version from the developers is needed.