System Design Specifications
There are various methods of protecting data-in-transit, also referred to as data-in-motion. However, the most significant vulnerability with cloud storage is not securing the data in transit; it is the security of the data at rest. Therefore, before transmitting data, it is essential to ensure the data is encrypted with a tool such as VeraCrypt, which uses encrypted containers to protect data at rest.
Secure encryption must be used in order to maintain
confidentiality and integrity when transmitting data between the
cloud server and the client. Encryption will ensure that only
users with the key that was used to encrypt the data will be able
to decrypt the data and view the contents (Alsulami, Alharbi, &
Monowar, 2015). One such method is the hybrid cryptographic scheme shown in Figure 1.
Figure 1. Hybrid Cryptographic Scheme.
As we see in Figure 1, Alice is sending an encrypted message to
Bob using the Hybrid Cryptographic Scheme, which utilizes a
combination of Public Key Crypto, Secret Key Crypto, and a
Hash Function. Alice’s Private Key and the Hash Function are used to create a digital signature, and Bob’s public key is
combined with a random session key and public key crypto to
create the encrypted session key. Alice’s message and the
random session key are used in conjunction with the hash
function and secret key crypto to formulate the encrypted
message.
The combination of the encrypted message and the encrypted
session key becomes what is known as the digital envelope. The hash function is a one-way algorithm that uses no key; instead, it computes a fixed-length hash value from the plaintext in such a way that neither the contents nor the length of the plaintext can be recovered, thus providing a digital fingerprint that ensures the integrity of the file.
Bob recovers the hash value by decrypting the digital signature
with Alice's public key. Then Bob recovers the secret session
key using his private key and decrypts the encrypted message. If
the resultant hash value is different from the value supplied by
Alice, then Bob knows that the message has been altered; if the
hash values are the same, Bob can have confidence that the
message he received is identical to the one that Alice sent
(Kessler, 2019).
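To make the flow in Figure 1 concrete, the following is a minimal Python sketch of the digital-envelope pattern, assuming the third-party cryptography package is available; the key pairs, message text, and variable names are illustrative and are not part of the actual design. AES-GCM stands in for the secret-key step and RSA for the public-key and signature steps.

# Hypothetical sketch of the digital-envelope flow from Figure 1.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Key pairs for Alice (signer) and Bob (recipient); generated here only for
# illustration, real keys would come from a PKI.
alice_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
bob_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

message = b"Payroll data for Q3"  # placeholder message

# 1. Alice signs a hash of the message with her private key (digital signature).
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()), salt_length=padding.PSS.MAX_LENGTH)
signature = alice_key.sign(message, pss, hashes.SHA256())

# 2. A random session key encrypts the message (secret-key crypto).
session_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
encrypted_message = AESGCM(session_key).encrypt(nonce, message, None)

# 3. Bob's public key wraps the session key (public-key crypto).
oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()), algorithm=hashes.SHA256(), label=None)
encrypted_session_key = bob_key.public_key().encrypt(session_key, oaep)

# The digital envelope is the encrypted message plus the encrypted session key.
envelope = (nonce, encrypted_message, encrypted_session_key, signature)

# 4. Bob unwraps the session key with his private key, decrypts the message,
#    and verifies Alice's signature with her public key (raises if altered).
recovered_key = bob_key.decrypt(encrypted_session_key, oaep)
recovered = AESGCM(recovered_key).decrypt(nonce, encrypted_message, None)
alice_key.public_key().verify(signature, recovered, pss, hashes.SHA256())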
Now that the messages are encrypted, we will need to use a
secure means of transmitting the messages from point A to point
B. Various protocols can provide security, such as Hypertext
Transfer Protocol Secure (HTTPS), which is a variant of HTTP
that adds a layer of security through an SSL or TLS protocol
connection (“What is HTTPS,” n.d.). SSL ensures that before
communication is established between a client browser and a
cloud server, an encrypted link is created between the two
(“What is Secure Sockets Layer,” n.d.). TLS is more efficient
and secure than SSL as it has stronger message authentication,
key material generation, and other encryption algorithms. For
example, TLS supports pre-shared keys, secure remote
passwords, elliptic-curve keys, and Kerberos, whereas SSL
does not (Kerravala, 2018).
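As a small illustration of preferring TLS over legacy SSL, the following Python sketch opens an HTTPS connection that refuses anything older than TLS 1.2, using the standard ssl and socket modules; the host example.com and port 443 are placeholders, not part of the actual deployment.

import socket
import ssl

# Minimal sketch: reject SSL and early TLS, require TLS 1.2 or newer.
context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2

with socket.create_connection(("example.com", 443)) as sock:
    with context.wrap_socket(sock, server_hostname="example.com") as tls:
        # Prints the negotiated protocol version (e.g. 'TLSv1.3') and cipher suite.
        print(tls.version(), tls.cipher())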
However, for a more secure method of transmitting data,
Internet Protocol Security (IPSec), as in Figure 2, would
provide the most secure means for data-in-transit. IPSec has two
modes, tunnel mode and transport mode, which utilize two
security protocols, encapsulating security payload (ESP) with a
protocol ID of 50, and authentication header (AH) with a
protocol ID of 51 (Administrator, n.d.). In order to allow IPSec
traffic to pass through the firewall, typically UDP port 500
should be open and permit IP protocol IDs 50 and 51 on both
inbound and outbound firewall filters. Although the ports that
are utilized will vary depending upon the VPN server and
provider, this is the most common port utilized for IPSec.
Figure 2. IPSec Tunnel Mode.
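The firewall requirement described above can also be expressed as data and checked programmatically. The following Python sketch is only illustrative and assumes a deliberately simplified rule model (real firewalls evaluate many more packet attributes); the protocol and port numbers are taken from the discussion above.

from typing import Optional

# Illustrative model of the inbound/outbound filters needed for IPSec traffic:
# IP protocol 50 (ESP), IP protocol 51 (AH), and UDP port 500 for IKE.
ALLOWED_IP_PROTOCOLS = {50, 51}   # ESP and AH
ALLOWED_UDP_PORTS = {500}         # IKE key exchange
UDP = 17                          # IP protocol number for UDP

def ipsec_traffic_permitted(ip_protocol: int, udp_port: Optional[int] = None) -> bool:
    """Return True if a packet with this protocol/port should pass the filter."""
    if ip_protocol in ALLOWED_IP_PROTOCOLS:
        return True
    if ip_protocol == UDP and udp_port in ALLOWED_UDP_PORTS:
        return True
    return False

# Example checks: ESP payload, IKE negotiation, and ordinary TCP traffic.
assert ipsec_traffic_permitted(50)
assert ipsec_traffic_permitted(UDP, udp_port=500)
assert not ipsec_traffic_permitted(6)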
Below in Figure 3 is an example of what the ideal cloud system
design specifications would look like using IPSec tunnel mode.
The communications all occur securely over an HTTPS
connection to the internet-facing web service of vCloud Air.
The Hybrid Cloud Manager also communicates with the
Network Services Provisioning API in vCloud Air as needed to
support the on-premises operations (Francis, 2016).
Figure 3. VMware vSphere Cloud Architecture.
The SDLC framework defines the steps to be taken in the software
development process and includes Planning, Development,
Documentation, Testing, Deployment and Maintenance.
However, the traditional approach of SDLC must be amended
and made suitable for the cloud computing environment. There
are numerous practices for the development of software, which include waterfall, rapid application development, joint application development, and extreme programming, with many life cycle models being built upon the traditional waterfall model framework (Software Development). The waterfall model requires that one phase be completed at a time before moving on to the next until the project is done. However, this may not be feasible in practice.
Waterfall Method
In contrast, Agile software development performs tests
regularly with prototypes during each phase to demonstrate the
application’s functionality and is more focused on the current
customer demands rather than a stable, long-term project. With
this approach, the cost of change decreases when software
applications are put in the cloud. Cloud computing represents a paradigm shift in the computing world, providing more
scalable, flexible and robust web applications, with almost
instant access to software and developmental environments.
Agile Method
The flexibility and robustness of the cloud, aligned with the maturity of SDLC frameworks, provides for today’s rapidly evolving technological environment, which requires faster development
processes and higher productivity. Agile thinking should be
adopted as this software development approach focuses on
product delivery, quality assurance (QA), feature development,
and maintenance releases.
Developing applications for public or private cloud platforms
entails many of the same characteristics as applications
designed for any other platform. There are five characteristics
of cloud computing, namely: on-demand self-service, broad network access, resource pooling, rapid elasticity, and measured service. There are four cloud deployment models in use: private cloud, community cloud, public cloud, and hybrid cloud. In our particular circumstance, SaaS will be employed to meet the HR department’s needs, as it offers many benefits.
Traditional vs Cloud Priorities
SaaS is a complete operating environment with applications,
management and a user interface. There are no installation or
hardware concerns, backups occur automatically, employees in different locations can access the same data, large data sets are
easier to work with, there are no compatibility issues and
upgrades are handled in the cloud (Miyachi, 2013).
Software Development Plan
This is the phase during which design specifications are refined
and coded with supporting documentation. A complete information system is produced in the development phase.
This will include acquiring and installing systems environment,
creating and testing databases, preparing test case procedures
and test files, coding, compiling, refining and performing test
readiness (Kommalapati & Zack, 2011). There are numerous
outlined steps:
· Set up development environment
· Develop service through iterative process
· Deploy and test continuously throughout the iteration (see the sketch after this list)
· Enlighten application with instrumentation
· Integrate application security [e.g. single sign-on,
authorization policies]
· Integrate with cloud and on-premise systems, if necessary
· Streamline data synchronization, extraction and uploading
· Integrate support and helpdesk processes
· Test service as well as integrated business processes
· Test support and helpdesk processes
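As an illustration of the “deploy and test continuously throughout the iteration” step above, the following is a minimal Python sketch of a post-deployment smoke test; the /health endpoint and the base URL are hypothetical and would depend on the actual service.

import urllib.request

def smoke_test(base_url: str) -> bool:
    """Return True if the freshly deployed service answers its health check."""
    try:
        with urllib.request.urlopen(f"{base_url}/health", timeout=5) as resp:
            return resp.status == 200
    except OSError:
        return False

# Example: run after each deployment in the iteration (hypothetical URL).
# print(smoke_test("https://hr-service.example.internal"))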
IT implementation, in the case of a new system, involves a preliminary assessment of both the external and internal factors that contribute to the cost of implementation, the usability of the system, and the business needs of the organization. The need for new system implementation usually derives from the need to replace or upgrade a process that is no longer effective, or to support a new business process that requires technical support. Externally, the company should consider the following aspects.
First, the company should seek vendors that have already been through the implementation process, or best practices from successful implementations in other companies that have likely faced similar issues and found ways to deal with them. Second, the company should consider the hardware and software requirements that fit the capacity of the new system. The implementation of a new system will require more effort due to the need to upgrade some existing infrastructure in order to synchronize data exchange and processing functions.
Internally, Tycoon Solutions must conduct an in-depth analysis of the users’ needs and preferences regarding the tasks that the system will have to
replace or support. There are numerous examples in the IT industry of implemented systems that were never used because they could not resolve a user’s problem completely, offering only a partial solution. These needs can be identified by using questionnaires, user stories, interviews, or job analysis, in which the critical elements are identified before the system implementation. Moreover, communication with users is a continuous process; thus, even after the system implementation, communication must continue in order to receive feedback on the system’s functionality. This implementation might require turning to external resources again, seeking methods to improve the functionality of the system by installing additional modules or patching the system.
The transit of data to and from a cloud environment can be protected through several technology considerations. First, it is worth mentioning that the data is hard for intruders to intercept if the protocols the data transfer process relies on are protected by the firewall. Second, the data should be encrypted using recent encryption algorithms that prevent the data from being decoded during transit. Secure encoding and decoding algorithms also ensure that the data is encoded and decoded using a primary and a secondary key, which prevents an intruder from obtaining a key that was kept protected and was not shared before the encoding process was initiated (Washer, 2014). This means that the use of modern hashing algorithms will also contribute to better data encapsulation, hence leading to higher data integrity.
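As a small illustration of how a hash can serve as that integrity check, here is a Python sketch using the standard hashlib module; the file name is a placeholder and the choice of SHA-256 is an assumption for the example, not a statement of the system’s actual algorithm.

import hashlib

def sha256_of_file(path: str) -> str:
    """Compute a SHA-256 digest of a file in chunks (works for large files)."""
    digest = hashlib.sha256()
    with open(path, "rb") as handle:
        for chunk in iter(lambda: handle.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

# The sender records the digest before upload; the receiver recomputes it after
# download. A mismatch indicates the file was altered or corrupted in transit.
# before = sha256_of_file("payroll_export.csv")   # hypothetical file name
# after  = sha256_of_file("payroll_export.csv")
# assert before == after, "integrity check failed"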
Meanwhile, it is worth noting that the data could still be compromised during transit if it is intercepted because of inconsistent server settings. For instance, if the cloud server is not configured appropriately, there is a risk of slow data exchange, which consequently creates a window of opportunity for intruders to interfere and decode the data. Moreover, it is essential to use modern equipment and perform ping tests that demonstrate whether the speed of data exchange is secure and satisfies the requirements of users and administrators.
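The “ping test” mentioned above can be approximated at the application level. The Python sketch below measures the average TCP connection time to a cloud endpoint; the host name is a placeholder, and the approach is only a rough proxy for a true ICMP ping.

import socket
import time

def tcp_round_trip_ms(host: str, port: int = 443, attempts: int = 5) -> float:
    """Rough application-level 'ping': average TCP connect time in milliseconds."""
    samples = []
    for _ in range(attempts):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=5):
            samples.append((time.perf_counter() - start) * 1000)
    return sum(samples) / len(samples)

# Example (placeholder host): print(tcp_round_trip_ms("cloud.example.com"))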
References
Administrator. (n.d.). Understanding VPN IPSec Tunnel Mode
and IPSec Transport Mode - What's the Difference? Retrieved
from http://www.firewall.cx/networking-topics/protocols/870-
ipsec-modes.html
Alsulami, N., Alharbi, E., & Monowar, M. (2015). A Survey on Approaches of Data Confidentiality in the Cloud Computing Systems. Journal of Emerging Trends in Computing and Information Sciences, 6(3), 188-197. Retrieved from http://www.cisjournal.org/journalofcomputing/archive/vol6no3/vol6no3_9.pdf
Francis, M. (2016, December 21). Hybrid Cloud Manager
Deployment Considerations. Retrieved from
https://blogs.vmware.com/consulting/2016/12/hybrid-cloud-
manager-deployment.html
Kerravala, Z. (2018, November 9). What is Transport Layer
Security (TLS)? Retrieved from
https://www.networkworld.com/article/2303073/lan-wan-what-
is-transport-layer-security-protocol.html
Kessler, G. C. (2019, August 14). An Overview of
Cryptography. Retrieved from
https://www.garykessler.net/library/crypto.html#why3
Kommalapati, H. & Zack, W. H. (2011, October 3). The SaaS
development lifecycle. InfoQ. Retrieved from
https://www.infoq.com/articles/SaaS-Lifecycle
Miyachi, C. (2013). Software systems architecture in a world of
cloud computing. Retrieved from
https://sdm.mit.edu/news/news_articles/webinar_052013/miyach
i_052013.pdf
Washer, E. (2014). Leveraging Cloud-Based Business Networks
For Collaboration. Retrieved from
http://blogs.sap.com/innovation/cloud-computing/leveraging-
cloud-based-business-networks-collaboration-01251933
What is HTTPS? - Definition from Techopedia. (n.d.). Retrieved
from https://www.techopedia.com/definition/5361/hypertext-
transport-protocol-secure-https
What is Secure Sockets Layer (SSL)? (n.d.). Retrieved from
https://www.digicert.com/ssl/
Organization
Workday Human Capital is a Human Resources (HR) company
that is involved with housing large amounts of employee data
throughout the United States. The hardware associated with
storing all of the collected data and the accompanying
maintenance is proving to be costly year after year. Workday
Human Capital intends to drastically reduce the incurred costs
from hosting data on physical hardware, increase the
availability of data and reduce downtime associated with
patching or upgrades. In order to reach the intended goals,
Workday HR plans to move all of its HR data into the cloud.
Amazon Web Services (AWS) is one of the most prominent cloud providers and offers several services, including Software as a Service (SaaS), also referred to as Web-based software, on-demand software, and hosted software. With SaaS, the cloud provider is responsible for maintaining several components, including data servers, storage, redundancy, and availability.
AWS will provide Workday HR with reliable data integrity
and confidentiality protections for data in transit that will be
using applications in the cloud. The security team at AWS
follows the SDLC methodology, also known as the systems
development life cycle. This methodology is used to develop
and implement the appropriate cloud architecture that will meet
the company’s specific needs. The SDLC report will enumerate
the different ways and methods that data is secured while hosted
in the cloud and provide more information on the development
plan as well as the cloud computing model.
Scope
The system development life cycle cannot be complete without incorporating security as an essential tenet. By implementing security controls and identifying and mitigating threats, disruptions can be minimized, thereby reducing the costs associated with the unavailability of data to the company and its clients. With better security controls, the threat surface is reduced and attackers have fewer opportunities to exploit certain
vulnerabilities. The SDLC is also a framework of structuring,
planning and controlling the creation of an Information
Technology system. Several frameworks exist in the SDLC
realm, to include Waterfall, Rapid application development and
Agile. The engineers at AWS will be following the XXXXX
method to support the Workday Cloud migration.
The processes of security and software development life cycle
are important for developing a cloud-based system. These
processes will follow an incremental approach, which is
executed by joining two or more logically related modules and
testing their functionality. With the XXXX cycle approach,
requirements are split into subgroups. The engineers at AWS
recommend migrating information systems to a cloud computing
environment for several reasons. First, availability allows the
organization’s data to be available anywhere with internet
connectivity. Second, scalability allows the organization’s
information systems resources to be scaled up or down
depending on the concurrent needs at the time. Lastly, moving
to a cloud-based service can allow Workday HR to focus on its business goals rather than on-premises equipment that requires maintenance, time, and extensive resources.
When dealing with sensitive information, it is best to choose the
private cloud model. It is also important to ensure that the cloud
computing environment uses encryption for both data that is in
transit and data at rest. With strong encryption protocols,
Workday HR’s data can be protected against the many different
cyber attacks aimed at compromising sensitive employee and
client information. The AWS security engineers will also
assure interoperability, which allows unrestricted sharing of
resources between dissimilar systems, both software and
application, or between networks.
In addition to strong encryption protocols, a robust risk
management strategy needs to be implemented to protect the
data stored in the cloud environment. Performing periodic
penetration testing can be a great element of the risk
management framework, because it allows for an attacker’s
perspective. The different types of pen testing can uncover zero-day vulnerabilities, alongside determining where those vulnerabilities and threats reside in the infrastructure. It is also
important to develop a threat model for the company in order to
help determine risk. Threat modeling is helpful because it
identifies, analyzes and mitigates security risks to systems and
applications. The security team will then classify all threats
based on their severity and devise an action plan and best practices to counter and mitigate the attacks.
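As a simplified illustration of classifying threats by severity, the following Python sketch scores each threat as likelihood times impact and sorts the action plan accordingly; the threat names and the 1-5 scales are invented for the example and are not drawn from an actual Workday HR threat model.

# Each threat gets a likelihood and an impact on a 1-5 scale (illustrative values).
threats = [
    {"name": "Credential phishing",        "likelihood": 4, "impact": 4},
    {"name": "Misconfigured storage ACLs", "likelihood": 3, "impact": 5},
    {"name": "Insider data exfiltration",  "likelihood": 2, "impact": 5},
]

def severity(threat: dict) -> int:
    """Simple risk score: likelihood x impact."""
    return threat["likelihood"] * threat["impact"]

# Highest-severity threats are mitigated first in the action plan.
for threat in sorted(threats, key=severity, reverse=True):
    print(f"{severity(threat):>2}  {threat['name']}")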
Testing and Integration (Part 7)
The testing and integration phase ensures that all the different
components of the software are performing as intended. Testing
enables service providers to make objective assessments with regard to the degree of conformance required for the company, based on its specifications and requirements. The testing
phase also ensures that all systems meet the required
functionality to perform reliably, in a secure manner and be
available when needed without interruption. Without the testing phase, users’ needs might not be met according to the implemented policy. Testing is a crucial part of the SDLC because it
helps expose possible defects and faults that may be hiding
within the software. Any undetected faults or defects are
potential vulnerabilities that could affect the confidentiality,
integrity, and availability of the organization’s network.
Safeguarding organizational data is vital to its survival and
ensures its continued success as a whole. Data that is within a
network can be in several states. First, data can be at rest where
it is stored on a device or backup medium such as hard drives,
backup tapes and possibly mobile devices. The data at rest is
quite literally inactive or not currently used or transmitted
across the network. Second, data in use is data that is not only
stored on a hard drive or external media storage, but is also
actively being processed by at least one application. Finally,
data in motion or transit is data that is currently traveling across
a network or is sitting in a computer’s RAM ready to be read,
updated or processed. Data in transit must be protected against
eavesdropping where an attacker or hacker can place himself in
the middle between two legitimate parties trying to establish a
communication.
It is also extremely important to maintain the CIA triad
(confidentiality, integrity and availability) during all
implementation phases so as to keep data secure. Each data state
presents a unique set of challenges that require different methodologies to ensure it is protected. For example, below are encryption types used for data at rest (a minimal sketch of file encryption follows the list):
· Full Disk Encryption (FDE)
· Hardware Security Module (HSM)
· Encrypting File System (EFS)
· Database Encryption
· Virtual Encryption
· File and Folder Encryption (Green, Grahn, Hair, O'Leary, &
Nelson, 2018).
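As a minimal sketch of the file and folder encryption option listed above, the example below uses the Fernet recipe from the third-party cryptography package, assuming it is installed; the file names are placeholders, and this is not a substitute for full-disk or database encryption.

from cryptography.fernet import Fernet

# Generate (and, in practice, securely store) a symmetric key.
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt a file before it is written to cloud storage (hypothetical file names).
with open("employee_records.csv", "rb") as plaintext_file:
    ciphertext = fernet.encrypt(plaintext_file.read())
with open("employee_records.csv.enc", "wb") as encrypted_file:
    encrypted_file.write(ciphertext)

# Later, decrypt only when the data is actually needed.
with open("employee_records.csv.enc", "rb") as encrypted_file:
    recovered = fernet.decrypt(encrypted_file.read())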
For data in motion, Virtual Private Network (VPN), Wi-Fi Protected Access (WPA/WPA2), Secure Sockets Layer (SSL), and Secure Shell (SSH) are some methods that are used to ensure secure end-to-end movement. However, the most common protection method used for data in motion is SSL VPN. These methods help protect against and mitigate attacks such as man-in-the-middle and packet-sniffing attacks. Furthermore, unlike data at rest and data in motion, data in use is much harder to protect. The reason is that, because the data is in use, it must usually be decrypted, which leaves it exposed. This leaves servers open to attack while the data is handled in an unencrypted state.
Test Phases
According to LaTonya Pearson of Segue Technologies (2017),
four levels of software testing exist. The levels include: unit testing, integration testing, system testing, and acceptance testing. In addition to these four levels, regression testing can be accomplished at any time throughout the testing process. Unit
Testing is typically focused on the specific units and
components of the software to ensure that each aspect is
operating smoothly and is fully functional. The main priority of
this level of testing is to determine if the software is running
and functioning as originally designed. An example of this type
of testing would be white-box testing.
Integration Testing gives the opportunity to combine multiple
units and test them as a group. This level of testing is primarily
done to help isolate defects and faults in the program’s functions. This step is vital because no matter
how well a program is running on its own, if it is not properly
integrated, the overall functionality of the program may not
properly operate.
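To make the distinction between unit and integration testing concrete, the sketch below uses pytest-style tests for a hypothetical payroll module; the functions and expected values are invented for illustration and do not describe the real Workday HR code.

# test_payroll.py -- run with `pytest` (hypothetical module and values).

def gross_pay(hours: float, rate: float) -> float:
    """Unit under test: straight-time pay calculation."""
    return round(hours * rate, 2)

def payroll_report(entries: list) -> float:
    """Second unit: total pay across several employees."""
    return round(sum(gross_pay(hours, rate) for hours, rate in entries), 2)

def test_gross_pay_unit():
    # Unit test: one component exercised in isolation.
    assert gross_pay(40, 25.0) == 1000.0

def test_payroll_report_integration():
    # Integration test: the report and the pay calculation working together.
    assert payroll_report([(40, 25.0), (10, 30.0)]) == 1300.0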
System Testing is technically the first level of testing that
occurs with the application completely as a whole. This step is
used to determine if all requirements and standards have been
met; third party testers also do this testing level for an unbiased
result. System testing is important because it verifies that the
software meets all of the technical and functional requirements
set by the customer.
Lastly, the acceptance testing level is completed to determine
whether the software is ready for release. This level of testing is
where a final examination is completed to ensure that the overall functionality and operability meet the business needs of the customer. Given the complexity and extent of these tests, it is important that testers are involved at an early stage and throughout the entire process to guarantee an on-time release with minimal issues.
Functional Design
To provide the best product, one of the first steps is gathering, reviewing, and then defining requirements. Hans Jonasson (2007), presenting at the PMI Global Congress, is quoted as stating, “defining scope [requirements] is a critical part of succeeding on a project”. Without proper requirements, projects tend to end up over budget on both time and money. In the realm of cybersecurity, improper requirements often mean that security requirements were subject to the same lack of definition, which, in today’s cyber-climate, can be a devastating mistake. Because of this, many organizations look toward defined processes to ensure that they do not end up with ill-defined requirements of any variety.
One such process is the Security Quality Requirements Engineering (SQUARE) process, which was developed at Carnegie Mellon University. According to the United States Computer Emergency Readiness Team (US-CERT) website, “[SQUARE] provides a means for eliciting, categorizing, and prioritizing security requirements for information technology systems and applications” (2013). SQUARE defines nine steps that help an organization better define its security requirements; in order, they can be seen in Table 1.
Table 1. AWS Security Quality Requirements Engineering Process (SQUARE).
Step 1: Agree on definitions.
  Input: Candidate definitions from IEEE and other standards.
  Techniques: Structured interviews, focus group.
  Participants: Stakeholders, requirements engineer.
  Output: Agreed-to definitions.

Step 2: Identify assets and security goals.
  Input: Definitions, candidate goals, business drivers, policies and procedures, examples.
  Techniques: Facilitated work session, surveys, interviews.
  Participants: Stakeholders, requirements engineer.
  Output: Assets and goals.

Step 3: Develop artifacts to support security requirements definition.
  Input: Potential artifacts (e.g., scenarios, misuse cases, templates, forms).
  Techniques: Work session.
  Participants: Requirements engineer.
  Output: Needed artifacts: scenarios, misuse cases, models, templates, forms.

Step 4: Perform risk assessment.
  Input: Misuse cases, scenarios, security goals.
  Techniques: Risk assessment method, analysis of anticipated risk against organizational risk tolerance, including threat analysis.
  Participants: Requirements engineer, risk expert, stakeholders.
  Output: Risk assessment results.

Step 5: Select elicitation techniques.
  Input: Goals, definitions, candidate techniques, expertise of stakeholders, organizational style, culture, level of security needed, cost/benefit analysis, etc.
  Techniques: Work session.
  Participants: Requirements engineer.
  Output: Selected elicitation techniques.

Step 6: Elicit security requirements.
  Input: Artifacts, risk assessment results, selected techniques.
  Techniques: Joint Application Development (JAD), interviews, surveys, model-based analysis, checklists, lists of reusable requirements types, document reviews.
  Participants: Stakeholders facilitated by requirements engineer.
  Output: Initial cut at security requirements.

Step 7: Categorize requirements as to level (system, software, etc.) and whether they are requirements or other kinds of constraints.
  Input: Initial requirements, architecture.
  Techniques: Work session using a standard set of categories.
  Participants: Requirements engineer, other specialists as needed.
  Output: Categorized requirements.

Step 8: Prioritize requirements.
  Input: Categorized requirements and risk assessment results.
  Techniques: Prioritization methods such as Analytical Hierarchy Process (AHP), Triage, Win-Win.
  Participants: Stakeholders facilitated by requirements engineer.
  Output: Prioritized requirements.

Step 9: Inspect requirements.
  Input: Prioritized requirements, candidate formal inspection technique.
  Techniques: Inspection method such as Fagan, peer reviews.
  Participants: Inspection team.
  Output: Initial selected requirements, documentation of decision-making process and rationale.
Note. Reprinted from SQUARE process, by United States
Computer Emergency Readiness Team, retrieved from
https://www.us-cert.gov/bsi/articles/best-
practices/requirements-engineering/square-process.
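Step 8 of SQUARE calls for prioritizing the categorized requirements. The following Python sketch shows one simple weighted-scoring approach (not the full AHP method named in the table), with requirement names, risk, and cost values invented purely for illustration.

# Illustrative weighted scoring of security requirements (1-5 scales).
requirements = [
    {"id": "SR-1", "text": "Encrypt HR data in transit",  "risk": 5, "cost": 2},
    {"id": "SR-2", "text": "Role-based access control",   "risk": 4, "cost": 3},
    {"id": "SR-3", "text": "Quarterly penetration tests", "risk": 3, "cost": 4},
]

def priority(req: dict, risk_weight: float = 0.7, cost_weight: float = 0.3) -> float:
    """Higher risk raises priority; higher implementation cost lowers it slightly."""
    return risk_weight * req["risk"] - cost_weight * req["cost"]

# Requirements are addressed in descending priority order.
for req in sorted(requirements, key=priority, reverse=True):
    print(f"{priority(req):.1f}  {req['id']}  {req['text']}")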
Different Ways to Secure Cloud Data
Many in the IT industry tout the cloud as the future of data
storage. CIO’s online magazine indicated this is because of the cloud’s ability to “reduce costs and increase business
efficiency” and because the cloud provides increased security
(2018). Part of this is that the cloud providers are often held to
a very exacting standard to get their accreditations. As their
business relies on providing this trust, it is often a much higher
standard than the average medium-to-small businesses’ IT staff
can maintain. According to its documentation, Amazon Web Services (2019) complies with a variety of IT security standards, including FISMA, DIACAP, and ISO
9001. This allows the security architects to utilize and leverage
the cloud’s performance and scalability to provide security
services that often would far exceed the normal budget of many
businesses.
An important distinction does exist in cloud security, however, with Amazon Web Services (2019) pointing out, “AWS manages
the security of the cloud, you are responsible for security in the
cloud.” This distinction is very important to cyber architects,
as what and how the architecture is defined in the cloud
perimeter is vitally important to the delivery of a secure
product. In other words, while cloud providers like AWS can
provide a safe and secure house, they can’t help if all the doors
and windows are left open and unlocked. While sounding like
common sense, it is often why many security breaches in the
cloud occur; the cloud wasn’t hacked, the company’s instance in
that cloud was. To prevent this, cyber architects should follow the same methods as they always have, just with the added
benefit that they can now leverage much more powerful and
distributed resources.
Define and enforce good role-based access control (RBAC), and either use Platform as a Service (PaaS) to create duplicate domain controllers or leverage AWS Managed Microsoft Active Directory, which is an AWS Software as a Service (SaaS) offering. If using a hybrid solution, set up a VPN and implement IPSec to ensure that data in transit between the cloud and the premises is protected. AWS offers Virtual Private Cloud (VPC) for organizations that want their own private chunk of the cloud. These VPCs are akin to leasing private lines from an ISP to set up a private network, but cost a fraction of the price.
No matter what the security architect can dream up, the cloud
enables it to be bigger, faster, and cheaper than attempting to do
the same in-house.
References
Amazon Web Services. (2019). Security and Compliance.
Retrieved from
docs.aws.amazon.com: https://docs.aws.amazon.com/aws-
technical-content/latest/aws-overview/security-and-
compliance.html
CIO. (2018). The Future of Cloud Services. Retrieved from
cio.com:
https://www.cio.com/article/3328547/the-future-of-cloud-
services.html
Jonasson, H. (2007). Determining project requirements-best
practices and tips. PMI Global Congress 2007. Newtown Square, PA: Project Management
Institute.
United States Computer Emergency Readiness Team. (2013,
July 05). SQUARE Process. Retrieved from us-cert.gov:
https://www.us-cert.gov/bsi/articles/best-
practices/requirements-engineering/square-process