2. CLOUD?
Cloud computing is a model for enabling convenient,
on-demand network access to a shared pool of
configurable computing resources that can be rapidly
provisioned and released with minimal management
effort.
6. BUILDING CLOUD ENVIRONMENT
Heterogeneous system support
Service management
Dynamic workload and resource management
7. Reliability, Availability and Security
Integrations with existing data center management
tools
Visibility and reporting
Cloud must be a converged infrastructure – it supports
DR and elasticity and avoids single points of failure.
There has to be fully automated orchestration of
service management and software distribution
across the converged infrastructure
10. CLOUD SECURITY
Data breaches.
Multi-factor authentication and encryption of data.
Insufficient identity, credential and access
management
Weak passwords
Identity solution between customers
Cryptographic keys
Any centralized storage mechanism containing data
secrets (e.g. passwords, private keys, confidential
customer contact database) is an extremely high-value
target for attackers
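Multi-factor authentication is listed above as a mitigation for data breaches; the second factor is commonly a time-based one-time password. A minimal sketch of HOTP/TOTP generation per RFC 4226/6238, using only the standard library (the shared secret and 30-second step are illustrative):

```python
import hashlib
import hmac
import struct
import time

def hotp(secret, counter, digits=6):
    """HMAC-based one-time password (RFC 4226): HMAC-SHA1 over the counter,
    then dynamic truncation to a short decimal code."""
    digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret, step=30, now=None):
    """Time-based variant (RFC 6238): the counter is the current 30 s interval."""
    t = int((time.time() if now is None else now) // step)
    return hotp(secret, t)
```

The server and the user's device share `secret` and both compute the same code, so a stolen password alone is not enough to log in.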
11. Insecure Interfaces and APIs
System Vulnerabilities
The kernel, system libraries and application tools can
put the security of all services and data at significant
risk.
Bugs are everywhere
Solution: vulnerability scanning, security patches or
upgrades. Secure design and architecture can lessen
the chances of an attacker taking full control of every
part of an information system.
Heartbleed, Shellshock
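The vulnerability-scanning step above can be sketched as comparing installed package versions against minimum patched versions. The thresholds below are illustrative placeholders, not real advisory data:

```python
# Illustrative minimum patched versions (placeholders, not actual advisories):
# Heartbleed was fixed in a later OpenSSL release, Shellshock in a later bash.
MIN_PATCHED = {
    "openssl": (1, 0, 1, 7),
    "bash": (4, 3, 30),
}

def vulnerable_packages(installed):
    """Return the names of installed packages older than their patched version."""
    return sorted(
        name for name, version in installed.items()
        if name in MIN_PATCHED and version < MIN_PATCHED[name]
    )
```

A real scanner would pull thresholds from CVE feeds rather than a hard-coded table.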
12. Account Hijacking
Phishing, fraud, reuse of passwords.
Organizations should look to prohibit the sharing of
account credentials between users and services.
Amazon systems were used to run Zeus botnet nodes
Malicious Insiders
Advanced Persistent Threats
Spear phishing, direct hacking of systems, delivering
attack code through USB devices, penetration through
partner networks and use of unsecured or third-party
networks are common points of entry for APTs.
13. Data Loss
Insufficient Due Diligence
A good roadmap and checklist for due diligence when evaluating
technologies
An organization that rushes to adopt cloud technologies and
choose CSPs without performing due diligence exposes itself to a
myriad of commercial, financial, technical, legal and compliance
risks that jeopardize its success. Amazon AWS experienced an
outage due to accidental deletion of the information that controls
load balancing.
Nirvanix, a cloud storage specialist that hosted data for IBM and
Dell, went bankrupt for the above reasons.
Facebook faced issues after M&A acquisitions.
Denial of Service
Shared Technology Vulnerabilities
14. PHYSICAL SECURITY
• Physical security is also a key element in ensuring that data
center operations and delivery teams can provide continuous,
authenticated uptime of greater than 99.9999%
• Physical access control and monitoring, including 24/7/365
onsite security, biometric hand geometry readers inside “man
traps,” bullet-resistant walls, concrete bollards, closed-circuit
TV (CCTV) integrated video, and silent alarms.
• Environmental controls and backup power
• Policies, processes, and procedures
15. NETWORK SECURITY
• Denial of Service:
DNS hacking, routing table “poisoning”, XDoS attacks
o SYN cookies
o Connection limiting
o Internal bandwidth maintained
• Port Scanning
o Port scans are a violation of the Acceptable Use Policy (AUP)
• Man-in-the-Middle Attack: to mitigate it, always use SSL/TLS
• IP Spoofing: Spoofing is the creation of TCP/IP packets using
somebody else's IP address.
o Host based firewall infrastructure
o Infrastructure will not permit an instance to send traffic with a source IP
or MAC address other than its own.
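The "connection limiting" mitigation listed under Denial of Service can be sketched as a per-source-IP sliding window. This is an illustration only; production systems rely on SYN cookies and firewall-level rate limits:

```python
import time
from collections import defaultdict, deque

class ConnectionLimiter:
    """Allow at most `limit` new connections per source IP within `window` seconds."""

    def __init__(self, limit=20, window=1.0):
        self.limit = limit
        self.window = window
        self._recent = defaultdict(deque)  # ip -> timestamps of recent connections

    def allow(self, ip, now=None):
        now = time.monotonic() if now is None else now
        q = self._recent[ip]
        while q and now - q[0] > self.window:  # forget connections outside the window
            q.popleft()
        if len(q) >= self.limit:
            return False  # over the per-IP budget: drop to protect bandwidth
        q.append(now)
        return True
```

Each source IP gets its own budget, so one flooding host cannot exhaust the budget of legitimate clients.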
16. SECURITY IN THE MIDDLEWARE
The middleware supports security groups, where we can
define our own security groups and assign ACLs.
The firewall can be configured per group, permitting
different classes of instances to have different rules,
e.g. a web server:
HTTP – port 80
HTTPS – port 443
SSH – port 22
IAM and certificate-based communication between
cloud components.
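The per-group firewall rules above (default-deny, with the web server opening only 80, 443 and 22) can be modeled as a small lookup table. The group names and the "database" ports are assumptions for illustration:

```python
# Each security group is a set of permitted (protocol, port) ingress rules.
# Group names and the "database" entry are hypothetical examples.
SECURITY_GROUPS = {
    "webserver": {("tcp", 80), ("tcp", 443), ("tcp", 22)},  # HTTP, HTTPS, SSH
    "database": {("tcp", 3306), ("tcp", 22)},               # e.g. MySQL + SSH
}

def is_allowed(group, protocol, port):
    """Default-deny firewall: traffic passes only if the group lists the rule."""
    return (protocol, port) in SECURITY_GROUPS.get(group, set())
```

Anything not explicitly listed, including traffic to an unknown group, is denied.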
17. CREDENTIAL MANAGEMENT
• Access Credentials
o Access Keys
o X.509 certificates
o Key pairs
• Sign-In Credentials
o Email Address (User Name) and Password
o Account Identifiers
• Account Identifiers
o Account ID
o Canonical ID
18. EC2 SECURITY
• Host OS
o Built on bastion host
o Cryptographically strong SSH keys to access bastion host
o Access is logged and routinely audited
• Guest OS
o Virtual instances are controlled by customer
o Customers have full root access and administrative
controls
o Customers use token or key based authentication
19. EC2 SECURITY
Firewall:
Set to default-deny mode
Requires the customer’s X.509 certificate and keys to
authorize changes
API
Calls to launch and terminate instances are signed
by X.509 certificate/secret Access keys
API calls are encrypted in transit with SSL
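Signed API calls of the kind described above can be sketched as an HMAC over the sorted request parameters. This is modeled only loosely on AWS-style query signing, not the exact wire format:

```python
import hashlib
import hmac

def sign_request(secret_key, params):
    """Canonicalize the parameters (sorted key=value pairs) and sign with
    HMAC-SHA256 under the caller's secret access key."""
    canonical = "&".join(f"{k}={params[k]}" for k in sorted(params))
    return hmac.new(secret_key, canonical.encode(), hashlib.sha256).hexdigest()

def verify_request(secret_key, params, signature):
    """The service recomputes the signature over the received parameters;
    a constant-time comparison resists timing attacks."""
    return hmac.compare_digest(sign_request(secret_key, params), signature)
```

Because the signature covers every parameter, tampering with any field in transit invalidates the request.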
21. DATA ISOLATION ( VM ISOLATION)
All the VMs under the hypervisor communicate via
event channels and shared memory within the host.
By creating policies at the hypervisor level we can
allow/deny inter-domain communication.
Implemented in the XSM framework, similar to SELinux
Security Label
Object : Role : Type
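A sketch of how such labels could drive an inter-domain allow/deny decision (illustrative Python, not actual Xen/XSM policy syntax; the type names are made up):

```python
def parse_label(label):
    """Split an 'Object : Role : Type' security label into its three fields."""
    obj, role, typ = (part.strip() for part in label.split(":"))
    return obj, role, typ

# Hypothetical policy: event channels / shared memory are only permitted
# between these (source type, destination type) pairs.
ALLOWED_TYPE_PAIRS = {
    ("web_t", "db_t"),
    ("db_t", "web_t"),
}

def may_communicate(src_label, dst_label):
    """Allow or deny inter-domain communication based on the Type fields."""
    src_type = parse_label(src_label)[2]
    dst_type = parse_label(dst_label)[2]
    return (src_type, dst_type) in ALLOWED_TYPE_PAIRS
```

Domains whose type pair is absent from the policy are isolated from each other by default.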
22. DIGITAL CERTIFICATE LOGIN
It helps prevent account hijacking.
Every user is issued a digital certificate approved by a
CA.
A digital certificate carries the public key, the holder’s
name, a unique serial number, etc., with the
corresponding private key held by the user.
User certificates are verified against LDAP to allow or
deny the user.
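The allow/deny decision can be sketched as matching the presented certificate's fields against the enrolled record from the directory. The dataclass and directory entries below are hypothetical stand-ins for a real X.509 parse and LDAP lookup:

```python
from dataclasses import dataclass

@dataclass
class Certificate:
    """Minimal stand-in for the relevant fields of a CA-issued user certificate."""
    subject: str
    serial: int
    issuer: str

# Hypothetical records an LDAP lookup might return, keyed by subject.
DIRECTORY = {
    "alice": {"serial": 1001, "issuer": "Example-CA", "active": True},
    "bob": {"serial": 1002, "issuer": "Example-CA", "active": False},  # revoked
}

def allow_login(cert):
    entry = DIRECTORY.get(cert.subject)
    if entry is None or not entry["active"]:
        return False
    # Serial number and issuer must match the enrolled certificate exactly.
    return entry["serial"] == cert.serial and entry["issuer"] == cert.issuer
```

A production system would additionally verify the CA signature and check revocation lists before consulting the directory.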
23. RBAC (ROLE BASED ACCESS CONTROL)
• Individual roles will be assigned to users
• Policies are written based on the roles
• We can also create groups
Example: normal users are not allowed to
create VMs, only to make a request.
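The role-to-policy mapping above can be sketched as two lookup tables (role, user and permission names are illustrative); per the example, a normal user may only request a VM, not create one:

```python
# Roles map to permissions; users (or groups of users) map to roles.
ROLE_PERMISSIONS = {
    "admin": {"create_vm", "delete_vm", "request_vm"},
    "user": {"request_vm"},  # normal users may only request a VM
}
USER_ROLES = {
    "carol": {"admin"},
    "dave": {"user"},
}

def has_permission(user, permission):
    """A user holds a permission if any of their roles grants it."""
    return any(
        permission in ROLE_PERMISSIONS.get(role, set())
        for role in USER_ROLES.get(user, set())
    )
```

Permissions are attached to roles rather than to individual users, so changing a policy means editing one role instead of every account.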
24. LOG MANAGEMENT ENGINE
Real-time log correlation engine
Able to find an error within seconds
Achieved using Logstash + Elasticsearch + Kibana 3
Web applications are also available
We can easily search the logs by time and text
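Searching logs by time and text, as described above, boils down to an Elasticsearch bool query combining a full-text match with a timestamp range filter. Field names such as `message` and `@timestamp` follow common Logstash defaults, but your index mapping may differ:

```python
def build_log_query(text, start, end):
    """Build an Elasticsearch query body: match `text` in the message field,
    filtered to documents whose @timestamp falls in [start, end]."""
    return {
        "query": {
            "bool": {
                "must": [{"match": {"message": text}}],
                "filter": [{"range": {"@timestamp": {"gte": start, "lte": end}}}],
            }
        }
    }
```

Kibana issues queries of this shape under the hood; the same body can be POSTed to the index's `_search` endpoint directly.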
25. PRIVACY
It is less a technical issue and more a policy and
legal one. Policies have to empower people to
control the collection, use and distribution of their
personal information.
26. THINGS TO CONSIDER:
Notice
Choice
Onward Transfer
Security
Data integrity
Access
Enforcement
27. PRIVACY BY DESIGN
Data minimization
Controllability
Transparency
User-friendly systems
Data confidentiality
Data quality
Use limitation
1. Supports the data center’s existing infrastructure.
2. Service offering should include resource guarantees, metering rules, resource management and billing cycles.
3. Must be consumer-workload and resource aware. Cloud computing virtualizes all the components of the data center, not just compute and memory. The environment should deliver maximum performance, and SLAs also have to be met.
24/7 workload.
Resources are shared, so internal and external security and multitenancy must be considered and integrated.
Service need to be able to provide access to only authorized users and in the shared pool model the users need to be able to trust that their data and application are secure.
99.999% availability – at most 5.26 minutes of downtime per year
By the use of weak passwords.
The CSP should understand the security around the cloud identity solution, such as processes, infrastructure and segmentation between customers.
Cryptographic keys, including TLS certificates, keys used to protect access to data and keys used to encrypt data at rest must be rotated periodically.
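Periodic rotation can be enforced with a simple age check over the keys' issue dates; the 90-day period below is an assumed policy, not a mandated value:

```python
from datetime import date, timedelta

ROTATION_PERIOD = timedelta(days=90)  # assumed policy; set per your compliance needs

def keys_due_for_rotation(issued, today):
    """Return the IDs of keys whose age exceeds the rotation period.

    `issued` maps key ID -> date the key (or TLS certificate) was issued.
    """
    return sorted(key for key, day in issued.items() if today - day > ROTATION_PERIOD)
```

Run as a scheduled job, such a check turns the rotation requirement into an actionable alert rather than a policy document.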
The security and availability of general cloud services is dependent on the security of these basic APIs.
Organizations and third parties may build on these interfaces to offer value-added services (VAS) to their customers. This introduces the complexity of a new layered API, and it also increases risk.
APIs and UIs are exposed to the outside world and face heavy attacks.
Data stored in the cloud can be lost for reasons other than malicious attacks. An accidental deletion by the cloud service provider, or worse, a physical catastrophe such as a fire or earthquake, can lead to the permanent loss of customer data unless the provider or cloud consumer takes adequate measures to back up data, following best practices in business continuity and disaster recovery
Solution: geographic redundancy, data backup within the cloud, and premise-to-cloud backups.
Amazon EC2 suffered data loss and loss of customers; Sony was hijacked.
Denial-of-service (DoS) attacks are attacks meant to prevent users of a service from being able to access their data or their applications. The attacker, or attackers in the case of a distributed denial-of-service (DDoS) attack, forces the targeted cloud service to consume inordinate amounts of finite system resources such as processor power, memory, disk space or network bandwidth, causing an intolerable slowdown for legitimate users.
Cloud service providers deliver their services scalably by sharing infrastructure, platforms or applications. Underlying components (e.g., CPU caches, GPUs, etc.) that comprise the infrastructure supporting cloud services deployment may not have been designed to offer strong isolation properties for a multitenant architecture (IaaS), re-deployable platforms (PaaS) or multicustomer applications (SaaS).
Side channel attacks. (Inter-Vm communication)
Vulnerability – “The unchecked buffer vulnerability (CVE-2015-3456) occurs in the code for QEMU’s virtual floppy disk controller. A successful buffer overflow attack exploiting this vulnerability can enable an attacker to execute his or her code in the hypervisor’s security context and escape from the guest operating system to gain control over the entire host.”
Notice: users have to be informed that their data is being collected and how it will be used.
Choice: the end user can allow or disallow the collection of data or its transfer to third parties.
Onward Transfer: transfer of data to third parties may only occur to other organizations that follow adequate data protection principles.
Security: reasonable efforts must be made to prevent loss of collected information.
Data integrity: data must be relevant and reliable for the purpose for which it was collected.
Access: individuals must be able to access information held about them and correct or delete it if it is inaccurate.
Enforcement: There must be effective means of enforcing these rules.
Data minimization: data processing systems are to be designed and selected in accordance with the aim of collecting, processing or using no personal data at all, or as little personal data as possible.
Controllability: an IT system should provide the data subjects with effective means of control concerning their personal data. The possibilities regarding consent and objection should be supported by technological means.
Transparency: both developers and operators of IT systems have to ensure that the data subjects are sufficiently informed about the means of operation of the systems. Electronic access / information should be enabled.
User-friendly systems: privacy-related functions and facilities should be user friendly, i.e. they should provide sufficient help and simple interfaces to be used also by less experienced users.
Data confidentiality: it is necessary to design and secure IT systems in a way that only authorized entities have access to personal data.
Data quality: data controllers have to support data quality by technical means. Relevant data should be accessible if needed for lawful purposes.
Use limitation: IT systems which can be used for different purposes or are run in a multi-user environment (i.e. virtually connected systems, such as data warehouses, cloud computing, digital identifiers) have to guarantee that data and processes serving different tasks or purposes can be segregated from each other in a secure way.