Table of Contents
Introduction
Need for technology-based solutions
Infrastructure Automation Tools
Implementation
The Central Theory: Organizational Management and Memory
Organizational Management
Organizational Memory
Need for Data Archival and Storage
Data Storage
Types of Storage
Data Archival
Data Archival Process
Archiving principles
Data Management Systems
Enterprise Resource Planning Systems (ERP systems) for data integration
Microservices
Properties of Monolithic
Conclusion
References
Introduction
Technology is considered vital in today's globalized world. In business especially, information technology offers both quantifiable and unquantifiable benefits. It is essential for communicating with customers and stakeholders regularly, quickly, and clearly, and it helps implement business operations efficiently and effectively. A business with robust technological capacity creates new opportunities for a company to stay ahead of the competition and grow (Rangus & Slavec, 2017). It also enables dynamic teams that can interact from anywhere in the world. Furthermore, technology aids in understanding business needs and in managing and securing confidential and critical data.
Need for technology-based solutions
Organizations need data recovery and active, continuous data processing throughout the data life cycle, given its significance and utility for research, scientific, and educational purposes (Bukari Zakaria & Mamman, 2014). The recent acknowledgment that information is an organization's key asset, decisively affecting its profitability, has contributed to several comprehensive corporate memory approaches. Corporate memory and organizational learning ability are key sources of competitive advantage (C. Priya, 2011). Hence, the main obstacle is managing information effectively while ensuring the consistency of training facilities.
Organizations need robust technology-based solutions. Thus, software developers have developed and deployed various architectures over time that make software products resource-efficient and usable. Some architectures implement their frameworks in a single layer, while others use multiple layers or tiers (Suresh, 2012). The efficiency of ERP implementations is influenced by reaching or exceeding a certain degree of capability in the volume of data to be processed (Johansson, 2012). In the last couple of decades, new architectures have been created that offer optimal solutions. Thus, the microservices architecture is gaining ground and becoming part of technological, financial, and advertising decision-making. Microservices replace monolithic, tightly coupled, system-focused applications with independently operating services (Vrîncianu, Anica-Popa, & Anica-Popa, 2009).
Infrastructure Automation Tools
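The independence that distinguishes microservices from a monolith can be sketched with a small health-polling routine of the kind an infrastructure automation tool might run: each service is probed separately, so a failure in one is contained rather than bringing the whole system down. The service names and simulated probes below are purely illustrative assumptions, not part of any cited system.

```python
def aggregate_health(checks):
    """Run each service's health check in isolation and collect results.

    `checks` maps a service name to a zero-argument callable that returns
    True (healthy) or False, or raises on failure. In practice each
    callable would issue an HTTP probe against that service's endpoint.
    """
    report = {}
    for name, check in checks.items():
        try:
            report[name] = "up" if check() else "down"
        except Exception:
            # A crash or timeout in one service's probe is contained;
            # the remaining services are still checked.
            report[name] = "down"
    return report


def billing_check():
    # Simulates a probe that times out.
    raise TimeoutError("billing probe timed out")


# Simulated checks standing in for real HTTP probes (hypothetical names).
checks = {
    "orders": lambda: True,
    "billing": billing_check,
    "catalog": lambda: False,
}

print(aggregate_health(checks))
# → {'orders': 'up', 'billing': 'down', 'catalog': 'down'}
```

In a monolith, by contrast, the equivalent of the `billing` failure would typically be a fault inside the single deployed process, affecting every feature at once.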
One issue that arises as microservices are applied is that any s ...
EXPLORING THE LINK BETWEEN LEADERSHIP AND DEVOPS PRACTICE AND PRINCIPLE ADOPTIONacijjournal
The document discusses exploring the link between leadership and DevOps practice and principle adoption. It analyzes data from interviews with 30 practitioners working in software-intensive organizations across different industries and countries. The interviews identified a set of agile, lean, and DevOps practices and principles commonly adopted as part of transitioning to DevOps. It was found that DevOps-oriented organizations can benefit from certain existing ITIL service management practices like change management and release management. Additionally, the research uncovered that DevOps adoption requires leadership, initially in the form of an individual role to guide the team through the transition process.
574An Integrated Framework for IT Infrastructure Management by Work Flow Mana...idescitation
Information Technology (IT) is one of the most emerging
fields in today’s Internet world. IT can be defined in various ways,
but is broadly considered to encompass the use of computers and
telecommunications equipment to store, retrieve, transmit and
manipulate data. Infrastructure is the base for everything. IT also
has an infrastructure, which can be managed and maintained
properly. For an organization’s Information Technology,
Infrastructure Management (IM) is the management of essential
operation components, such as policies, processes, equipment,
data, human resources and external contacts, for overall
effectiveness.
In this paper, we propose a methodology to manage the IT
Infrastructure in a better way. Our methodology uses the tree-
structure based architecture to manage the infrastructure with less
manual power. The process of how to manage the infrastructure is
discussed with efficient methodology and necessary steps with
algorithm, in this paper. Also, in this paper, the process of workflow
management on IT infrastructure management has been provided.
At present, the state-of-the-art supplies for conducting a face-to-face design thinking workshop typically consists of self-stick notes and stickers, markers, and whiteboards. However, this analog way of working is incongruent with the realities of global software companies, where most products and services are developed by distributed teams. This paper explores the process of facilitating remote design thinking workshops, using information technology and communication tools. The paper is based on a participatory action research undertaken by the author as a part of the doctoral thesis - ‘a study on an approach to prepare the organization mindset to build design-led innovation culture to become a customer-centric and future driven software company’ in the Indian IT sector. The participating company realized the innovation breakthroughs using design thinking can happen only when their organization can collaborate across disciplines, silos, time zones; and were looking for a solution to scale design thinking in their organization. KEYWORDS: Collaboration, Digital Design Thinking, Distributed Teams, Innovation, Remote Design Thinking, Scale Design Thinking
Published in International Research Journal of Marketing and Economics ISSN: (2349-0314) Impact Factor- 5.779, Volume 5, Issue 7, July 2018
The document proposes a domain-driven data mining methodology to efficiently process tickets in an IT organization. The methodology involves classifying tickets by category, identifying tickets with high issue rates, applying root cause analysis (RCA) to determine the root cause of issues, and applying continuous improvement (CI) to identify and implement solutions. An experiment applying the methodology to a banking sector showed it improved processing rates and reduced tickets with issues compared to processing tickets independently without categorization or RCA/CI. The methodology aims to efficiently solve ticket issues, increase customer satisfaction and requests, and improve processing without waiting for service level agreements.
AN OVERVIEW OF EXISTING FRAMEWORKS FOR INTEGRATING FRAGMENTED INFORMATION SYS...ijistjournal
Literatures show that there are several structured integration frameworks which emerged with the aim of facilitating application integration. But weakness and strength of these frameworks are not known. This paper aimed at reviewing these frameworks with the focus on identifying their weakness and strength. To accomplish this, recommended comparison factors were identified and used to compare these frameworks. Findings shows that most of these structure frameworks are custom based on their motives. They focus on integrating applications from different sectors within an organization for the purpose of eliminating communication inefficiencies. There is no framework which guides application’s integrators on goals of integrations, outcomes of integration, outputs of integration and skills which will be required for types of applications expected to be integrated. The study recommended further study on integration framework especial on designing unstructured framework which will support and guide application’s integrators with consideration on consumer’s surrounding environment.
AN OVERVIEW OF EXISTING FRAMEWORKS FOR INTEGRATING FRAGMENTED INFORMATION SYS...ijistjournal
Literatures show that there are several structured integration frameworks which emerged with the aim of facilitating pplication integration. But weakness and strength of these frameworks are not known. This
paper aimed at reviewing these frameworks with the focus on identifying their weakness and strength. Toaccomplish this, recommended comparison factors were identified and used to compare these frameworks.Findings shows that most of these structure frameworks are custom based on their motives. They focus onintegrating applications from different sectors within an organization for the purpose of eliminating communication inefficiencies. There is no framework which guides pplication’s integrators on goals of integrations, outcomes of integration, outputs of integration and skills which will be required for
types of applications expected to be integrated. The study recommended further study on integration
framework especial on designing unstructured framework which will support and guide application’s
integrators with consideration on consumer’s surrounding environment.
AN OVERVIEW OF EXISTING FRAMEWORKS FOR INTEGRATING FRAGMENTED INFORMATION SYS...ijistjournal
Literatures show that there are several structured integration frameworks which emerged with the aim of facilitating application integration. But weakness and strength of these frameworks are not known. This paper aimed at reviewing these frameworks with the focus on identifying their weakness and strength. To accomplish this, recommended comparison factors were identified and used to compare these frameworks. Findings shows that most of these structure frameworks are custom based on their motives. They focus on integrating applications from different sectors within an organization for the purpose of eliminating communication inefficiencies. There is no framework which guides application’s integrators on goals of integrations, outcomes of integration, outputs of integration and skills which will be required for types of applications expected to be integrated. The study recommended further study on integration framework especial on designing unstructured framework which will support and guide application’s integrators with consideration on consumer’s surrounding environment.
AN OVERVIEW OF EXISTING FRAMEWORKS FOR INTEGRATING FRAGMENTED INFORMATION SYS...ijistjournal
Literatures show that there are several structured integration frameworks which emerged with the aim of facilitating pplication integration. But weakness and strength of these frameworks are not known. This
paper aimed at reviewing these frameworks with the focus on identifying their weakness and strength. Toaccomplish this, recommended comparison factors were identified and used to compare these frameworks.Findings shows that most of these structure frameworks are custom based on their motives. They focus onintegrating applications from different sectors within an organization for the purpose of eliminating communication inefficiencies. There is no framework which guides pplication’s integrators on goals of integrations, outcomes of integration, outputs of integration and skills which will be required for
types of applications expected to be integrated. The study recommended further study on integration
framework especial on designing unstructured framework which will support and guide application’s
integrators with consideration on consumer’s surrounding environment.
In this White Paper we provide some insights into the differences between Live-Wireframe applications authoring and programming using traditional tools.
Digital transformation requires organizations to be agile and responsive to changing business needs. Large organizations can adopt agile practices like Microsoft has done by implementing frequent feedback loops and updates. Adopting a hybrid multi-cloud strategy allows organizations to have flexibility, choice, and consistency across environments which provides agility and responsiveness needed for digital transformation. Agile is a journey that all organizations are on to continuously innovate, adapt processes and culture, and deliver value to customers.
Using Machine Learning embedded in Organizational Responsibility Model, added to the ten characteristics of the CIO Master and the twelve competencies of the workforce can help lead the Digital Transformation of the traditional public organizations to the Exponential.
An Overview of Workflow Management on Mobile Agent TechnologyIJERA Editor
This document discusses mobile agent technology for workflow management. It provides an overview of current research on using mobile agents to automate business processes across distributed systems. The document summarizes several related works on topics like inter-organizational workflows, mobile agent communication, coordination techniques, and workflow partitioning and scheduling algorithms. It aims to improve methods for designing and implementing prototype models for mobile agent-based workflow management systems.
DevOps shifting software engineering strategy Value based perspectiveiosrjce
IOSR Journal of Computer Engineering (IOSR-JCE) is a double blind peer reviewed International Journal that provides rapid publication (within a month) of articles in all areas of computer engineering and its applications. The journal welcomes publications of high quality papers on theoretical developments and practical applications in computer technology. Original research papers, state-of-the-art reviews, and high quality technical notes are invited for publications.
This document discusses how DevOps is emerging as a new methodology to address challenges in distributed software engineering and global software engineering (GSE). It introduces a new 5-level DevOps maturity model based on CMMI, assessing communication, automation, governance, and quality. The document also proposes a transformation framework to help organizations assess their current state and implement DevOps practices to bridge gaps between development and operations teams.
Architecture Framework for Resolution of System Complexity in an EnterpriseIOSR Journals
This document presents an architecture framework for resolving system complexity in an enterprise. It discusses how enterprise architecture can be used to address issues like requirement complexity, organizational complexity, process complexity, and design complexity. The framework breaks down the enterprise information system into subsystems like back-end systems, front-end systems, management tools, and communication systems. It also separates concerns into different architecture layers - an external enterprise model, conceptual enterprise model, front-end systems, back-end systems, and management tools. The framework is intended to provide a structured approach to managing complexity by organizing enterprise data and functions across the different systems and models.
A framework for ERP systems in sme based On cloud computing technologyijccsa
This document proposes a framework for implementing ERP systems for SMEs using cloud computing technology. It begins with an introduction discussing issues with current ERP systems and how cloud computing could address them. It then reviews background literature on ERP systems and cloud computing. The objectives of the research are outlined as comparing ERP before and after moving to cloud, proposing a generic cloud-based ERP framework for SMEs, and testing the framework. A case study of a company called Awal is discussed for evaluating the proposed framework.
The document discusses enterprise resource planning (ERP) systems and proposes a research agenda on the topic. It provides background on ERP systems and their growth. A taxonomy of ERP research is presented that identifies major streams of ERP research including ERP adoption, technical aspects, and inclusion in information systems curricula. Key factors related to successful ERP adoption are discussed such as balancing standardization and flexibility, organizational preparedness, and management support and change management.
Model-Driven Context-Aware Approach to Software Configuration Management: A F...theijes
The International Journal of Engineering & Science is aimed at providing a platform for researchers, engineers, scientists, or educators to publish their original research results, to exchange new ideas, to disseminate information in innovative designs, engineering experiences and technological skills. It is also the Journal's objective to promote engineering and technology education. All papers submitted to the Journal will be blind peer-reviewed. Only original articles will be published.
IMPROVING PRIVACY AND SECURITY IN MULTITENANT CLOUD ERP SYSTEMSacijjournal
This paper discusses cloud ERP security challenges and their existing solutions. Initially, a set of definitions associated with ERP systems, cloud computing, and multi-tenancy, along with their respective challenges and issues regarding security and privacy, are provided. Next, a set of security challenges is listed, discussed, and mapped to the existing solutions to solve these problems. This thesis aims to build an effective approach to the cloud ERP security management model in terms of data storage, data virtualization, data isolation, and access security in cloud ERP. The following proposed techniques are used to improve the security for multi-tenant SaaS: database virtualization, implementation of data encryption and search functionality on databases and developed systems, distribution of data between tenant and ERP providers, secure application deployment in multi-tenant environments, implementation of the authentication and developed systems together as a two-factor authentication, and improved user access control for multi-tenant ERP clouds.
DESIGN AND DEVELOPMENT OF BUSINESS RULES MANAGEMENT SYSTEM (BRMS) USING ATLAN...ijcsit
The document describes the design and development of a Business Rules Management System (BRMS) using the ATL and Eclipse Sirius frameworks. It proposes a new "Target Ecore meta model" to improve the structure and management of business rules. The system allows business rules to be modeled and transformed from their current format into an object-oriented format using ATL model transformations. This provides improved modularity, scalability and extensibility of the rules compared to the original structure. A case study demonstrates transforming an example business rule from a software package based on the proposed approach.
CRESUS: A TOOL TO SUPPORT COLLABORATIVE REQUIREMENTS ELICITATION THROUGH ENHA...cscpconf
Communicating an organisation's requirements in a semantically consistent and understandable manner and then reflecting the potential impact of those requirements on the IT infrastructure presents a major challenge among stakeholders. Initial research findings indicate a desire among business executives for a tool that allows them to communicate organisational changes using natural language and a simulation of the IT infrastructure that supports those changes. Building on a detailed analysis and evaluation of these findings, the innovative CRESUS tool was designed and implemented. The purpose of this research was to investigate to what extent CRESUS both aids communication in the development of a shared understanding and supports collaborative requirements elicitation to bring about organisational, and associated IT infrastructural, change. This paper presents promising results that show how such a tool can facilitate collaborative requirements elicitation through increased communication around organisational change and the IT infrastructure.
Cloud Computing Applications and Benefits for Small Businesses .docxclarebernice
Cloud Computing: Applications and Benefits for Small Businesses
Abstract
Cloud computing is one of the most talked about topics in the world of technology and entrepreneurship. Until now it has never been so easy for people, especially small business owner’s, to have the tools and resources readily available just one click away and at the fraction of the cost of the typical investment a few years back. Cloud computing offers cost-effective solutions at various levels that can be customize to meet the needs of anyone. Cloud computing can be thought of as a new found technology and this paper defines the concept of the cloud and provides a brief background of where most business are in regards to the use of this technology. This is then continued by describing the types of cloud currently available and potential use. The paper then presents a short but important section of cloud security issues and challenges. Finally, the paper discusses the benefits each of the different levels of cloud computing can provide small business.
Introduction
The use of cloud computing has grown exponentially in the last decade, according to Weins (2015) eight-four percent of enterprises that make use of such services in one way or another. Could computing by definition is internet-based computing, where by shared resources, software and information are provided to the end user as metered services much like a utility does(Bradley, 2014). For businesses in many cases could computing is use for IT solution purposes as it can provide IT-related capabilities as a service using internet technologies.
With the fast pace of today’s market businesses need to provide fast and reliable services to their customers in order to remain competitive. The concept of could computing is not something new as it uses existing technology and processes; however it can be consider new in sense that using these technologies has revolutionized the manner in which we host and cater services to customers. Startup companies and small businesses can take advantage of could computing to reduce spending on IT, be more adept to changes in the market, change scale and lower risk and cost.
Given the structural complexity of larger organization, Alijani (2014) states that it is essential for cloud computing to deliver rear value rather than serve as a platform for simple task. The need to deliver rear value is just as important for small businesses. For small businesses value is important but it’s their customer relationship and public image, flexibility and continuity. As such small business owners need to consider the benefits, drawback s and the effect of cloud computing on their organization before taking the decision to implement.
Types of cloud computing
There are three categories, or levels, of cloud computing: Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS).
Infrastructure as a Service (IaaS) ...
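A common way to distinguish the three service levels is by which layers of the computing stack the provider manages and which remain the customer's responsibility. The sketch below uses a simplified seven-layer stack and a conventional split; the layer names and cut-off points are an illustrative convention, not any vendor's official model:

```python
# Simplified stack layers, ordered from infrastructure up to application data.
STACK = ["networking", "servers", "virtualization",
         "operating_system", "runtime", "application", "data"]

# Highest layer the PROVIDER manages in each service model
# (simplified convention: everything above it is the customer's job).
PROVIDER_MANAGES_UP_TO = {
    "IaaS": "virtualization",  # customer installs the OS and everything above
    "PaaS": "runtime",         # customer deploys code and owns the data
    "SaaS": "application",     # customer typically controls only their data
}

def split(model: str):
    """Return (provider_layers, customer_layers) for a service model."""
    cut = STACK.index(PROVIDER_MANAGES_UP_TO[model]) + 1
    return STACK[:cut], STACK[cut:]

for model in ("IaaS", "PaaS", "SaaS"):
    provider, customer = split(model)
    print(f"{model}: provider -> {provider}, customer -> {customer}")
```

Read from IaaS down to SaaS, the provider absorbs progressively more of the stack, which is why SaaS demands the least in-house IT expertise of a small business while IaaS offers the most control.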
Note
: Utilize at least three scholarly or professional sources (beyond your textbook) in your paper. Your paper should be written in a clear, concise, and organized manner; demonstrate ethical scholarship in accurate representation and attribution of sources (i.e., in APA format); and display accurate spelling, grammar, and punctuation.
Submission Details:
By
Wednesday, February 8, 2017
, save your paper as M1_A3_Lastname_Firstname.doc and submit it to the
M1 Assignment 3 Dropbox
.
Assignment 3 Grading Criteria
Maximum Points
Identified and described three uses for a wireless network chosen by you, out of the mentioned wireless networks (VoIP and wireless NICs). Utilized scholarly or professional resources in support.
16
Compared and contrasted the identified uses of the wireless network chosen by you with the other network. Utilized scholarly or professional resources in support.
24
Explained how RFID tags might be used in conjunction with product identification or inventory systems. Included many meaningful details; utilized scholarly or professional resources in support.
16
Compared and contrasted RFID with any another technology that is similar in nature. Included many relevant details; utilized scholarly or professional resources in support.
24
Wrote in a clear, concise, and organized manner; demonstrated ethical scholarship in accurate representation and attribution of sources; displayed accurate spelling, grammar, and punctuation.
20
Total:
100
.
Assignment 3 Web Design Usability Guide PresentationBefore you .docxMatthewTennant613
Assignment 3: Web Design Usability Guide Presentation
Before you learn how to use web-authoring software to design, edit, and update web-based content, you need to understand basic concepts regarding user interface design and usability. For this assignment, you will create a Web Design Usability Guide Presentation of approximately 3–5 slides that identifies the main interface design criteria for the website of an organization with which you are familiar (i.e., current or past employer) following the directions below.
Directions:
After you have identified an organization, analyze the website and in 3–5 slides (including detailed speaker’s notes):
Describe the interface and UX criteria (include a diagram).
Explain the page navigation preferences, such as:
Features
Location
Look and Feel
Naming Conventions
Other
Identify mobile website considerations (include a diagram), such as:
Available features
Content and design
Responsive design
Supported browsers
Other
Identify the preferred programming language(s):
ASP
HTML
Javascript
PHP
Other
Identify the supported browsers, such as:
Chrome
Firefox
Internet Explorer
Opera
Safari
Outline the testing protocol.
Define specific steps and systems one should take to review a website and test its features.
Include steps to resolve any potential problems.
Your completed assignment should consist of a 3- to 5-slide PowerPoint presentation (including detailed speaker’s notes). Use at least two scholarly articles to complete your research, referencing them in text as you use them and at the end in a reference list. Your writing should be clear, concise, and organized; demonstrate ethical scholarship in accurate representation and attribution of resources; and display accurate spelling, grammar, and punctuation.
Submission Details:
By
Wednesday, August 2, 2017
, save the document as M1_A3_Lastname_Firstname.doc and submit it to the
M1 Assignment 3 Dropbox
.
Assignment 3 Grading Criteria
Maximum Points
Create a Web Design Usability Guide for an organization that describes the interface and UX criteria. Include a diagram.
16
Create a Web Design Usability Guide for an organization that explains the page navigation components.
20
Create a Web Design Usability Guide for an organization that identifies the mobile website considerations.
8
Create a Web Design Usability Guide for an organization that identifies the programming language.
8
Create a Web Design Usability Guide for an organization that identifies supported browsers.
8
Create a Web Design Usability Guide for an organization that outlines the testing protocol.
20
Write in a clear, concise, and organized manner; demonstrate ethical scholarship in accurate representation and attribution of sources (i.e., APA); and display accurate spelling, grammar, and punctuation.
20
Total:
100
.
Assignment 3 Understanding the Prevalence of Community PolicingAs.docxMatthewTennant613
Assignment 3: Understanding the Prevalence of Community Policing
As a backlash, the professional model, which reflects a "we are the experts and you are not" attitude, alienated the police from the public. Problems and crime kept growing, and people wanted to be more involved in their communities. Therefore, community members started to work closely with the police. The police saw their resources diminish and decided it was critical to engage the communities to more effectively combat rising crime.
Today, the vast majority of law enforcement agencies state that they subscribe to the community policing philosophy. The implementation of the philosophy is varied, but most agencies acknowledge the value of having a positive working relationship within the community.
Thus, it is important to understand the history of modern policing to comprehend some possible conclusions as to why agencies began adopting the community policing philosophy.
Tasks:
Prepare a three to four page report answering the following questions.
What are the main reasons for the majority of US law enforcement agencies to adopt the community policing philosophy?
What is the most important aspect of community policing that is attractive to the community?
What is the most important aspect of community policing that is attractive to the police?
What aspects of prior policing models are not acceptable in today's communities?
Note
: Use at least three scholarly sources, with at least one source that is not part of the assigned readings. Include a separate page at the end of the report, in APA format, that links back to your in-text citations and supports your recommendations.
Submission Details:
Save the final report as M1_A3_Lastname_Firstname.doc.
By
Week 1, Day 7
, submit your final report to the
M1: Assignment 3 Dropbox
.
Assignment 3 Grading Criteria
Maximum Points
Analyzed the main reasons that led the majority of US law enforcement agencies to adopt the community policing philosophy.
28
Evaluated the most important aspect of community policing that is attractive to the community and the police.
28
Evaluated various aspects of prior policing models that are not acceptable in today's communities.
24
Wrote in a clear, concise, and organized manner; demonstrated ethical scholarship in the accurate representation and attribution of sources; and used accurate spelling, grammar, and punctuation.
20
Total:
100
.
Chapter wise All Notes of First year Basic Civil Engineering.pptxDenish Jangid
Chapter wise All Notes of First year Basic Civil Engineering
Syllabus
Chapter-1
Introduction to objective, scope and outcome the subject
Chapter 2
Introduction: Scope and Specialization of Civil Engineering, Role of civil Engineer in Society, Impact of infrastructural development on economy of country.
Chapter 3
Surveying: Object Principles & Types of Surveying; Site Plans, Plans & Maps; Scales & Unit of different Measurements.
Linear Measurements: Instruments used. Linear Measurement by Tape, Ranging out Survey Lines and overcoming Obstructions; Measurements on sloping ground; Tape corrections, conventional symbols. Angular Measurements: Instruments used; Introduction to Compass Surveying, Bearings and Longitude & Latitude of a Line, Introduction to total station.
Levelling: Instrument used Object of levelling, Methods of levelling in brief, and Contour maps.
Chapter 4
Buildings: Selection of site for Buildings, Layout of Building Plan, Types of buildings, Plinth area, carpet area, floor space index, Introduction to building byelaws, concept of sun light & ventilation. Components of Buildings & their functions, Basic concept of R.C.C., Introduction to types of foundation
Chapter 5
Transportation: Introduction to Transportation Engineering; Traffic and Road Safety: Types and Characteristics of Various Modes of Transportation; Various Road Traffic Signs, Causes of Accidents and Road Safety Measures.
Chapter 6
Environmental Engineering: Environmental Pollution, Environmental Acts and Regulations, Functional Concepts of Ecology, Basics of Species, Biodiversity, Ecosystem, Hydrological Cycle; Chemical Cycles: Carbon, Nitrogen & Phosphorus; Energy Flow in Ecosystems.
Water Pollution: Water Quality standards, Introduction to Treatment & Disposal of Waste Water. Reuse and Saving of Water, Rain Water Harvesting. Solid Waste Management: Classification of Solid Waste, Collection, Transportation and Disposal of Solid. Recycling of Solid Waste: Energy Recovery, Sanitary Landfill, On-Site Sanitation. Air & Noise Pollution: Primary and Secondary air pollutants, Harmful effects of Air Pollution, Control of Air Pollution. . Noise Pollution Harmful Effects of noise pollution, control of noise pollution, Global warming & Climate Change, Ozone depletion, Greenhouse effect
Text Books:
1. Palancharmy, Basic Civil Engineering, McGraw Hill publishers.
2. Satheesh Gopi, Basic Civil Engineering, Pearson Publishers.
3. Ketki Rangwala Dalal, Essentials of Civil Engineering, Charotar Publishing House.
4. BCP, Surveying volume 1
Walmart Business+ and Spark Good for Nonprofits.pdfTechSoup
"Learn about all the ways Walmart supports nonprofit organizations.
You will hear from Liz Willett, the Head of Nonprofits, and hear about what Walmart is doing to help nonprofits, including Walmart Business and Spark Good. Walmart Business+ is a new offer for nonprofits that offers discounts and also streamlines nonprofits order and expense tracking, saving time and money.
The webinar may also give some examples on how nonprofits can best leverage Walmart Business+.
The event will cover the following::
Walmart Business + (https://business.walmart.com/plus) is a new shopping experience for nonprofits, schools, and local business customers that connects an exclusive online shopping experience to stores. Benefits include free delivery and shipping, a 'Spend Analytics” feature, special discounts, deals and tax-exempt shopping.
Special TechSoup offer for a free 180 days membership, and up to $150 in discounts on eligible orders.
Spark Good (walmart.com/sparkgood) is a charitable platform that enables nonprofits to receive donations directly from customers and associates.
Answers about how you can do more with Walmart!"
ISO/IEC 27001, ISO/IEC 42001, and GDPR: Best Practices for Implementation and...PECB
Denis is a dynamic and results-driven Chief Information Officer (CIO) with a distinguished career spanning information systems analysis and technical project management. With a proven track record of spearheading the design and delivery of cutting-edge Information Management solutions, he has consistently elevated business operations, streamlined reporting functions, and maximized process efficiency.
Certified as an ISO/IEC 27001: Information Security Management Systems (ISMS) Lead Implementer, Data Protection Officer, and Cyber Risks Analyst, Denis brings a heightened focus on data security, privacy, and cyber resilience to every endeavor.
His expertise extends across a diverse spectrum of reporting, database, and web development applications, underpinned by an exceptional grasp of data storage and virtualization technologies. His proficiency in application testing, database administration, and data cleansing ensures seamless execution of complex projects.
What sets Denis apart is his comprehensive understanding of Business and Systems Analysis technologies, honed through involvement in all phases of the Software Development Lifecycle (SDLC). From meticulous requirements gathering to precise analysis, innovative design, rigorous development, thorough testing, and successful implementation, he has consistently delivered exceptional results.
Throughout his career, he has taken on multifaceted roles, from leading technical project management teams to owning solutions that drive operational excellence. His conscientious and proactive approach is unwavering, whether he is working independently or collaboratively within a team. His ability to connect with colleagues on a personal level underscores his commitment to fostering a harmonious and productive workplace environment.
Date: May 29, 2024
Tags: Information Security, ISO/IEC 27001, ISO/IEC 42001, Artificial Intelligence, GDPR
-------------------------------------------------------------------------------
Find out more about ISO training and certification services
Training: ISO/IEC 27001 Information Security Management System - EN | PECB
ISO/IEC 42001 Artificial Intelligence Management System - EN | PECB
General Data Protection Regulation (GDPR) - Training Courses - EN | PECB
Webinars: https://pecb.com/webinars
Article: https://pecb.com/article
-------------------------------------------------------------------------------
For more information about PECB:
Website: https://pecb.com/
LinkedIn: https://www.linkedin.com/company/pecb/
Facebook: https://www.facebook.com/PECBInternational/
Slideshare: http://www.slideshare.net/PECBCERTIFICATION
How to Make a Field Mandatory in Odoo 17Celine George
In Odoo, making a field required can be done through both Python code and XML views. When you set the required attribute to True in Python code, it makes the field required across all views where it's used. Conversely, when you set the required attribute in XML views, it makes the field required only in the context of that particular view.
Leveraging Generative AI to Drive Nonprofit InnovationTechSoup
In this webinar, participants learned how to utilize Generative AI to streamline operations and elevate member engagement. Amazon Web Service experts provided a customer specific use cases and dived into low/no-code tools that are quick and easy to deploy through Amazon Web Service (AWS.)
Beyond Degrees - Empowering the Workforce in the Context of Skills-First.pptxEduSkills OECD
Iván Bornacelly, Policy Analyst at the OECD Centre for Skills, OECD, presents at the webinar 'Tackling job market gaps with a skills-first approach' on 12 June 2024
spot a liar (Haiqa 146).pptx Technical writhing and presentation skills
One issue that arises as microservices are adopted is that every service must be deployed and monitored in the cloud. Companies deploying microservices can draw on a range of automation tools and practices, such as DevOps pipelines, Docker, Chef, Puppet, and automated scaling. Adopting these tools saves time and money (Balalaie et al., 2018).
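To make the idea of infrastructure automation concrete, the following is a minimal, illustrative sketch of one small automation step: rendering Docker deployment commands from a declarative service description, the way a provisioning tool or CI pipeline might. The service spec fields (name, image, port, replicas) are hypothetical examples, not drawn from any tool named above.

```python
# Illustrative sketch only: a toy "infrastructure automation" step that
# renders docker CLI commands from a declarative service spec.
# The spec fields below are hypothetical examples for this sketch.

def render_deploy_commands(spec):
    """Turn a declarative service spec into the docker commands that a
    provisioning tool (Chef, Puppet, a CI pipeline) might then execute."""
    commands = []
    for i in range(spec["replicas"]):
        commands.append(
            f"docker run -d --name {spec['name']}-{i} "
            f"-p {spec['port'] + i}:{spec['port']} {spec['image']}"
        )
    return commands

spec = {"name": "orders", "image": "orders-svc:1.2", "port": 8080, "replicas": 2}
for cmd in render_deploy_commands(spec):
    print(cmd)
```

The point of the sketch is the separation it illustrates: the desired state is declared once, and the repetitive deployment work is generated mechanically, which is where the time and cost savings come from.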
Further growth, migration, and integration work is nonetheless required, so infrastructure cost is a key concern for companies adopting these trends in pursuit of agility, autonomous development, and scalability. Another challenge is orchestrating the combined output of many microservices: even where the architecture solves an apparent technological problem, configuration and capabilities must remain consistent across services. Although various solutions exist, there is still no precise assessment of what transitioning from ERP architectures to microservices entails.
The current research is best described as empirical investigation. New data processing services promote distributed, modular data analysis components built on microservices. These modules improve data availability for intelligent services by providing accessible, stable, and consistent functionality enriched with additional context (K s&t, 2019).
One study by Stubbs et al. discusses container technologies in microservices design and the difficulty of service discovery. The authors propose a decentralized open-source approach based on the Serf project and describe building a solution that synchronizes data files between repositories using Docker and Git. From the report's findings emerged Serfnode, which joins Docker containers to an existing cluster without compromising the integrity of the original container (Stubbs et al., 2015). The approach also provided monitoring and control frameworks that complement the container model, since they keep the applications running in each shared space isolated and independent. Yet while containers simplify application packaging and delivery, they do nothing by themselves to solve service connectivity across a complex network. Finally, this research examines alternatives that allow microservices and containers to be used to the greatest possible extent.
Implementation
In terms of implementation, Sandoe and Olfman argue that corporate memory keeps pace with IT advances and can counter much unnecessary organizational forgetting. Their paper shows how structuration theory can be used to bridge seemingly irreconcilable views (Ehrhart et al., 2015). The paradigm it presents holds that collective memory comprises the rules and tools that mediate interaction and organizational structure, and the model is well suited to categorizing current and future IT-based organizational memory structures. Overall, the paper forecasts a mnemonic shift in culture toward discursive organizational models that depend primarily on IT-based collective memory (Sandoe & Olfman, 1992).
The contrast between microservices implementations and ERP architecture has been clarified by Singh and Peddoju. The authors deployed their proposed Docker-based microservices and tested them in a case study using a social networking application. For the efficiency comparison, JMeter was used to apply a constant request load to both designs. In the ERP design, requests were forwarded through a web-based API; in the microservices architecture, HAProxy routed queries to the intended service. The findings showed that the application designed and implemented with the microservices approach reduces the time and effort required to deploy and continuously integrate the application. The findings also established that microservices outperform the ERP paradigm, with lower response times and better overall performance, and that containers compare favorably with virtual machines (VMs) for launching microservices applications.
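A comparison in the spirit of the experiment just described can be sketched in a few lines: apply an identical request load to two designs and compare mean response times. This is an illustrative sketch only; the two "services" are local stand-in functions with made-up delays, not real ERP or microservice endpoints, and the numbers they produce are not the study's results.

```python
# Illustrative sketch only: a tiny latency comparison in the spirit of the
# JMeter experiment described above. The two handlers are hypothetical
# stand-ins, not real ERP or microservice endpoints.
import time
import statistics

def measure_mean_latency(handler, requests=50):
    """Send a fixed number of requests to a handler; return mean latency (s)."""
    samples = []
    for _ in range(requests):
        start = time.perf_counter()
        handler()
        samples.append(time.perf_counter() - start)
    return statistics.mean(samples)

def monolith_handler():
    time.sleep(0.002)   # simulated slower, layered request path

def microservice_handler():
    time.sleep(0.001)   # simulated leaner, direct service path

mono = measure_mean_latency(monolith_handler)
micro = measure_mean_latency(microservice_handler)
print(f"monolith mean latency:     {mono * 1000:.2f} ms")
print(f"microservice mean latency: {micro * 1000:.2f} ms")
```

Real load tools such as JMeter do essentially this at scale, adding concurrency, ramp-up, and percentile reporting on top of the same measure-and-average core.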
Several experiments have examined the benefits and drawbacks of moving from an ERP architecture to microservices (Singh & K Peddoju, 2017).
The Central Theory: Organizational Management and Memory
Organizational Management
Sandoe and Olfman (1992) and Morrison (1997) describe two forms of organizational management that serve two functions: representation and interpretation. Representation presents the circumstances of a given situation or position. Interpretation promotes adaptation and learning by offering frames of reference, methods, regulations, or a means of synthesizing past information for application to new situations (organizational memory). This theory is especially applicable to the use of information systems: organizational and cultural factors play a major role in their optimal functioning (Booth & Rowlinson, 2006). In particular, implementing robust services requires well-defined contracts agreed with all teams involved, rather than catering to each team's individual needs, and organizational dynamics determine how such contracts are negotiated, designed, and implemented.
Organizational dynamics are rooted in an organizational culture, defined as the patterns of shared values, beliefs, and assumptions underlying behavioral norms among organizational members (Schein, 1992). This definition implies that culture is persistent, rooted in shared history and experience developed over a long time. Such cultural persistence has become important in understanding resistance to new IT implementations and their subsequent adoption. In global organizations, national sentiments further expand the scope of organizational culture.
Organizational Memory
Empirical knowledge is a key to competitiveness; therefore, preserving organizational memory is becoming progressively more essential to organizations. With the availability of innovative information technologies, information systems have become a crucial part of this memory (Perez & Ramos, 2013).
Organizational Memory Information Systems (OMIS) bring together culture, history, business processes, human memory, and current reality into an integrated knowledge-based business system. OMISs help businesses integrate disparate databases, capture the expertise of retiring staff, enhance organizational know-how, and provide decision-making support to employees facing new and complex issues, while integrating disparate and uneven types of knowledge (Roth & Kleiner, 1998).
Organizational memory, shaped by culture, is continuously exposed to restructuring and change; it is recreated, reconfigured, and enriched with new knowledge through organizational learning procedures, which shape organizational performance by capitalizing on and evaluating the enterprise's cognitive assets (Linger et al., 1999). Organizational memory is most frequently described as an "elaborate, immaterial and permanent representation of knowledge and facts." It maps the cognitive infrastructure that enables an organization to recognize, compile, convert, capitalize on, and value awareness, facts, rules, and community values.
Some analysts estimated that in 2005 almost 40% of Fortune 500 firms used some form of knowledge management system as part of their corporate learning (Siong Choy & Yong Suk, 2005). The same study exposed critical aspects of organizational culture that reduce the efficiency of such systems.
Need of Data Archival And Storage
Too frequently, when digital workplace programs are planned, digital archiving projects are pushed down the priority list. Businesses incorrectly assume that low storage costs and a powerful search engine are all their records require. Yet archiving is crucial to knowledge processing and gives a company greater oversight of its data operations. As an organization expands, more data is generated, and that data needs to be closely handled and controlled to be used correctly. Keeping tabs on these records can be difficult for firms that never implement an archiving scheme (Borgerud & Borglund, 2020). Records that are not archived become harder to find, protect, and distribute when housed in a local environment, such as a desktop, and are thus useless to other user groups. This can adversely affect organizational operations and worker morale.
Data Storage.
The main purpose of data storage is to digitally archive files and records and preserve them for potential future use. Storage systems may rely on electromagnetic, mechanical, or other devices to retain and restore data. Data storage makes it possible to recover files after an accidental computer crash or data breach, and to archive data for safekeeping and fast recovery (Spoorthy et al., 2014).
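One simple safeguard behind "safekeeping and fast recovery" is verifying that an archived copy is byte-for-byte identical to the original. The following is a minimal sketch of that idea using a SHA-256 checksum; the file and directory names are hypothetical, and real archival systems add much more (versioning, retention rules, media checks).

```python
# Illustrative sketch only: verifying that an archived copy of a file can
# be restored intact, using a SHA-256 checksum. Names are hypothetical.
import hashlib
import shutil
from pathlib import Path

def checksum(path):
    """Return the SHA-256 digest of a file's contents."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def archive_file(source, archive_dir):
    """Copy a file into the archive; return the copy's path and the
    original's digest, recorded for later integrity checks."""
    archive_dir = Path(archive_dir)
    archive_dir.mkdir(parents=True, exist_ok=True)
    dest = archive_dir / Path(source).name
    shutil.copy2(source, dest)
    return dest, checksum(source)

def verify_restore(archived_copy, original_digest):
    """A restore is only trustworthy if the digests still match."""
    return checksum(archived_copy) == original_digest
```

Recording the digest at archive time, and re-checking it at restore time, is what turns a plain copy into something one can rely on after a crash or breach.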
Although not all data must be preserved, whatever does need preserving must be kept safe and accessible. Data storage refers to the variety of ways in which physical media hold information until users require it (XIE & CHEN, 2013). Storage equipment has evolved greatly over the history of computing, from the electromagnetic devices of room-sized computers to state-of-the-art solid-state drives (SSDs), and, like many products in the technical field, these approaches keep evolving as the demand for data and storage grows.
Data can be stored on physical hard disks, optical discs, USB drives, or in the cloud. What matters is that if a machine ever crashes beyond recovery, the files are backed up and readily accessible. Reliability, the strength of security capabilities, and the cost of implementing and maintaining the infrastructure are among the most critical considerations in data storage (Esposito, 2018). By surveying the various data storage systems and products, one can select the most suitable option for the enterprise.
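Weighing the criteria just listed can be made explicit with a simple decision matrix. This is an illustrative sketch only: the option names, their 1-to-5 scores, and the criterion weights are all hypothetical choices for the example, not recommendations from the text.

```python
# Illustrative sketch only: a toy decision matrix for comparing storage
# options on the criteria named above (reliability, security, cost).
# All names, scores, and weights are hypothetical examples.

WEIGHTS = {"reliability": 0.4, "security": 0.35, "cost": 0.25}

OPTIONS = {
    # scores on a 1-5 scale; for "cost", higher means cheaper
    "on-prem RAID":  {"reliability": 4, "security": 5, "cost": 2},
    "public cloud":  {"reliability": 5, "security": 3, "cost": 4},
    "optical media": {"reliability": 3, "security": 4, "cost": 5},
}

def score(option):
    """Weighted sum of an option's criterion scores."""
    return sum(WEIGHTS[c] * v for c, v in OPTIONS[option].items())

for name in OPTIONS:
    print(f"{name:13s} {score(name):.2f}")
print("best:", max(OPTIONS, key=score))
```

The value of writing the comparison down this way is that the weights make the organization's priorities explicit, so the trade-off between, say, security and cost can be debated on the record rather than decided implicitly.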
The storage approach a corporation chooses plays a major role in how accessible its records are, how much archiving costs, and how safe the data remains once archived. An archive is only valuable if its data can be accessed when needed, so the organization should regularly check that its chosen storage medium still works.
Types of Storage.
Offline storage.
Data volumes undoubtedly grow, but this traditional storage type still has a role in modern industry. Offline backup has existed for years and involves archiving vital files on optical discs such as CDs and Blu-rays. Although the data is not immediately accessible the way it is with newer storage options, offline storage is extremely well protected while remaining available in the event of a network outage.
Offline storage is also ideal if the company has regulatory obligations or must supply information for legal purposes, since records can be maintained on write-once media to keep them legally admissible, something RAID arrays and cloud storage cannot guarantee (Chan Jianli et al., 2020).
Online storage.
Although it may seem intuitive to group all online storage into a single classification, two distinct offerings currently exist. Online storage lets customers and companies keep data in the cloud; this is what is meant here by cloud storage. Cloud storage works very well, since it safeguards data progressively and requires no upfront investment (Rausher et al., 2010). The drawback, however, is that it can fall short when a complete data retrieval is required.
Some businesses that want the cost and reliability advantages of the cloud are still not comfortable putting their information in the hands of third-party cloud infrastructure suppliers. While this was once beyond small enterprises' reach, advances now enable small enterprises to run private cloud storage of their own.
Cold Storage.
Cold data is accessed less frequently and therefore does not require fast retrieval. This includes information that is no longer in active use and may not be needed for months, years, or ever: old projects, records kept to support other company records, or data that is worthwhile but not needed in the near term (Zhao et al., 2020). Data recovery and response times on cold cloud storage services are usually much longer than on services designed for active data. Offerings such as Amazon Glacier and Google Coldline are practical examples of cold cloud storage (Zhao et al., 2020).
Cloud Storage
Cloud storage keeps an organization's data somewhere that anyone with the required rights can reach over the Internet. Users do not have to be wired to a corporate network or tied to particular devices to access their information. Microsoft, Google, and IBM are common cloud storage providers (Yuhuan, 2017). Cloud storage is supported by cloud-based IT ecosystems that allow cloud computing to run cloud-based tasks. It requires no internal network access or dedicated data storage connectivity.
Data services are decoupled from hardware devices as the basis of a cloud storage volume. Virtualization is one approach: it takes a dozen separate servers (either public or private) and abstracts their storage capacity. This entire virtual storage area can be grouped into a unified repository, sometimes termed a data lake, accessible to consumers (Langos & Giancaspro, 2015). Cloud storage is created when such data lakes are linked to the web.
Block storage.
Block storage separates data from the user's environment and spreads it across blocks that can live in different contexts. When the information is requested, the storage software reassembles the blocks from those contexts and returns the data. Block storage is normally used in SAN settings and must be attached to a running server over a network (Kumari et al., 2019).
Retrieval does not rely on a single physical path, unlike file storage. The blocks are independent and can be subdivided so that they are accessible from different systems, letting users partition their data as they see fit. It is a secure and efficient way of storing data (Fujita & Ogawara, 2005). It suits companies that carry out large transactions and run massive databases: the more information they have to store, the better a fit block storage becomes.
However, there are a few downsides. Block storage can be costly, and it has no metadata handling, meaning metadata must be managed by the application or database, which is one more thing for a developer or server operator to think about.
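As a rough illustration of the idea above, the following sketch (hypothetical names, a toy 4-byte block size rather than the kilobyte-scale blocks a real SAN would use) shows how a block store splits data into independently addressed blocks and reassembles them on read:

```python
# Hypothetical sketch of block-level storage: data is split into
# fixed-size blocks, each addressed by an opaque block ID, and the
# storage layer reassembles the blocks on read.
BLOCK_SIZE = 4  # tiny block size for illustration only

class BlockStore:
    def __init__(self):
        self.blocks = {}   # block_id -> bytes
        self.next_id = 0

    def write(self, data: bytes) -> list:
        """Split data into blocks; return the IDs needed to read it back."""
        ids = []
        for i in range(0, len(data), BLOCK_SIZE):
            self.blocks[self.next_id] = data[i:i + BLOCK_SIZE]
            ids.append(self.next_id)
            self.next_id += 1
        return ids

    def read(self, ids: list) -> bytes:
        """Reassemble the original data from its block IDs."""
        return b"".join(self.blocks[i] for i in ids)

store = BlockStore()
ids = store.write(b"transactional payload")
assert store.read(ids) == b"transactional payload"
```

Because each block is addressed only by its ID, the blocks can be placed on different devices or contexts, which is exactly why the storage layer, not the application, must track where everything lives.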
Data Archival
Data archiving is a practice in which data that is no longer operational is identified and moved from production systems to long-term storage. Archival files are preserved so that they can be returned to service at any point. Archived records are kept on a lower-cost tier to reduce primary disc consumption and its associated costs. A significant part of a company's data archiving policy is classifying data as an archiving nominee and then archiving it (McDaniel, 2014).
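The classification step above can be sketched in a few lines. This is a hypothetical policy, not one from the source: it nominates any record untouched for longer than an assumed one-year retention window.

```python
# Hypothetical sketch of nominating records for archival: anything not
# touched within the retention window moves from primary to archive storage.
from datetime import date, timedelta

RETENTION = timedelta(days=365)  # assumed policy threshold

def split_for_archival(records, today):
    """Return (keep_in_primary, archive_nominees) based on last access."""
    keep, archive = [], []
    for rec in records:
        (archive if today - rec["last_access"] > RETENTION else keep).append(rec)
    return keep, archive

records = [
    {"id": 1, "last_access": date(2020, 1, 1)},
    {"id": 2, "last_access": date(2021, 5, 1)},
]
keep, archive = split_for_archival(records, today=date(2021, 6, 1))
assert [r["id"] for r in archive] == [1]
assert [r["id"] for r in keep] == [2]
```

In practice the threshold would come from the contractual and regulatory retention periods discussed later in this section.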
Data Archival Process
Purpose.
Businesses store data for business objects through a data archiving mechanism. The process is driven by an archiving object tied to a business process; this object specifies the structure and arrangement of the data to be archived. When the data is archived, the system copies the information to archive files, scans the archived data through multiple checks, and, if it is accurate, removes it from the operating system. Alongside the main method, subfunctions for viewing and reloading archived data and for device profiles still exist (Hujda et al., 2016).
Preparing the data.
As a source of information, the company has all components of its software project (files, resources, source files, test reports, etc.) (SVS). The setup is therefore checked to ensure that nothing is missing. Only once all available elements have been checked may an archive be created. The archive must rest on a robust database, and companies must set the archiving period, which is determined contractually, contextually, and on a risk basis. The archiving media and procedure must be chosen to match that period. For archiving on an external hard drive, a validation procedure is essential, and discs must be changed regularly (Kornei, 2019).
Process Flow.
The fundamental archiving process is formed by major subprocesses, including analysis, writing, and deletion. These may be combined if the appropriate customization settings are made. Analysis and writing can run as parallel systems when the parallel analysis method is used. To accomplish this, appropriate data packages are created and processed in parallel by separate jobs. The subprocess first analyses the archiving object's data set and then creates the appropriate package templates for parallel processing, bounded by the configured program cap (Bruno, 2014). If configured for the archival object in the global personalization settings, the profiles are persistently saved in the archive and subsequently used by the analysis and write subprocesses.
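The packaging step described above can be sketched as follows. All names here are hypothetical; the point is only that the record set is partitioned into bounded packages that independent jobs then process in parallel.

```python
# Hypothetical sketch of the pre-step that splits an archiving object's
# data set into equal-size packages for parallel write jobs.
from concurrent.futures import ThreadPoolExecutor

def make_packages(record_ids, package_size):
    """Partition record IDs into packages of at most package_size."""
    return [record_ids[i:i + package_size]
            for i in range(0, len(record_ids), package_size)]

def write_package(package):
    # stand-in for the write subprocess handling one package
    return len(package)

packages = make_packages(list(range(10)), package_size=4)
assert [len(p) for p in packages] == [4, 4, 2]

# each package is handled by a separate parallel job
with ThreadPoolExecutor(max_workers=3) as pool:
    written = list(pool.map(write_package, packages))
assert sum(written) == 10
```

As the text notes, this partitioning must be re-run periodically, because the data distribution (and hence the right package boundaries) drifts over time.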
To minimize the overall runtime of the archive project, profile creation aims to let simultaneous package jobs serve the analysis and write subprocesses. The data set must, as far as practicable, be split into packages of the same size (Ribeiro, 2001). As the data distribution can change over time, this pre-step needs to be repeated regularly to ensure that appropriate profiles are provided.
Simulation.
The simulation feature follows all the phases of the archive procedure except the deletion or labeling of business objects in the operating database. It is just a test run: instead of a real archive it generates figures that can be used to check the predicted performance (Onggo & Hill, 2014). Although most of the same checks could be run against a test database rather than the true operating base, using the actual system makes it a better test, because it guarantees that the evaluation conditions are operationally consistent with the database specification.
Write.
The write subprocess begins immediately after the normal setup. It clones the information specified by the analysis subprocess from the operating database into archive files; the information is thereby archived during this step.
As in the analysis phase, the data is written by parallel, discrete jobs, each processing its sub-packages from the collective data set. Each parallel processing task forms exactly one archive, which can comprise one or more documents.
Delete.
The delete subprocess removes data from the operating storage once the data has been copied into the archive files. To do so, stored records are accessed and removed only if they can be read back from the archive successfully. This protocol ensures that data is deleted from the database only when the system has archived it according to the configured guidelines. In a regular setup, deletion begins automatically when the write operation finishes a single archive file successfully. The number of deletion procedures generated is usually the same as the number of archive files generated by the write program.
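The write-then-verify-then-delete discipline described above can be shown in miniature. This is a sketch under assumed, simplified conditions (two in-memory dictionaries standing in for the operating database and the archive):

```python
# Hypothetical sketch of the delete subprocess: a record is removed from
# the operating database only after it has been read back successfully
# from the archive it was written to.
def archive_and_delete(operational: dict, archive: dict):
    deleted = []
    for key, value in list(operational.items()):
        archive[key] = value              # write subprocess
        if archive.get(key) == value:     # read back and verify
            del operational[key]          # delete only on verified success
            deleted.append(key)
    return deleted

operational = {"order1": "...", "order2": "..."}
archive = {}
deleted = archive_and_delete(operational, archive)
assert deleted == ["order1", "order2"]
assert operational == {} and len(archive) == 2
```

The ordering is the whole point: if the read-back fails, the record stays in the operating database and is picked up again on the next archiving run, exactly as the following paragraph describes.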
If an archive file turns out to be inaccessible, its data is left in the operating system and is picked up again by the write process during the next archiving run. You can then either selectively remove the already generated archive files or keep them in the archive. The latter choice is harmless, because data is not deleted from the operating system: archive files become part of the archive information system only once the corresponding delete process has completed successfully.
Data integrity issues.
When data is archived, it is usually removed from the database from which it was archived. If replicas of this data exist in other databases, data combinations across systems may no longer be consistent, and reports drawn from different systems can vary. This can lead to the alarming condition in which figures seen by consumers of one system differ from those seen by users of another (Khidzir & Ahmed, 2018). If the databases are to stay consistent, it may even be necessary to provide an extended archiving method that deletes the copies from the other databases at the same time.
Accessing the data.
The data is stored in a separate archive pool until the project is completed, distinct from the file stream generated for the replacement program. Access to the pool of retired applications should be independent of access to the archive's file stream, since the two may not share the same access logic; this goes beyond the logic of splitting metadata. The developed archive channel should verify the 100% rule for all archive access criteria (Senko, 1977). Once the archive has been completed, the source system is retired, which saves considerable money, but there is then no way to re-examine data in the source networks. This means users must perform rigorous access tests before anyone can claim success.
Archiving principles
Data archiving is the method of preserving activities for later scanning and review within the framework. When information passes into the repository from the processing device, an archive file collects and saves the data in an indexed fashion for later recovery. Data is normally saved for alarm/access regulation, device status adjustment, streaming, and audio, and it is typically housed on individual disc or library volumes (Vans et al., 2018).
In the management of their archives, archivists apply the two concepts of provenance and original order. These principles should form the basis for all archival practice (Kilchenmann et al., 2019). Before taking any action to enhance their maintenance and care, custodians of archives ought to consider how the records were made and how they are organized.
Original Order.
Archives are stored in the sequence in which they were first produced or used, and this must be understood when dealing with collections so that the original order is maintained. Original order enables custodians to safeguard the validity of documents and carries important knowledge about the type, maintenance, and usage of records. Sometimes this initial order has been lost through mishandling or "re-sorting" (Stokes, 2012).
Original order essentially ensures that objects remain in the order in which the individual or organisation whose archives they are initially held them. This matters because the documents may have been kept in that arrangement for a purpose, even if the purpose is not readily evident.
A basic concept of archive management is respect for the original order of registration. Digital archive arrangement is far less about preserving the physical structure of the storage media and more about retaining the logical links between electronic records, since the external order of digital records frequently has to be changed for storage and maintenance purposes (Niu, 2014).
Provenance.
The provenance principle ensures that the documents that a person or organization creates are accumulated and maintained together, separate from any other creator's documents. Because its development addressed challenges at the heart of archival science, the concept of provenance is regarded as a landmark in archival practice and philosophy (Milosch, 2014).
Provenance signifies the history of possession of a set of documents or an object. It covers the creators and later owners of the documents and their relationship to the files. It is important to preserve knowledge about those relationships because they show how, and by whom, the documents were produced and used before becoming part of the archive. Provenance offers important historical material for appreciating the contents and heritage of a series of archives (Hunter & Cheung, 2007).
When the notion of provenance originated in an archival sense in the 19th century, it had a practical objective: to arrange collections of documents that had lost their organic association with their authors as a result of thematic grouping. This led archivists to apply the theory as a concrete organizational principle that consolidates archives of the same provenance. Provenance is also one reason documents cannot be lent out. The possession and custody (physical presence, not content) of an archive after it was created should preferably be traceable (Tognoli & Guimarães, 2018). Knowledge of the successive holders lets one assess whether anyone has modified an item, making it easier to say whether it is genuine.
Archival Locations.
When we intend to archive records, we must consider disaster recovery and business continuity plans, which can become very difficult because the archiving process itself introduces risks. It is normally a terrible idea to store archival information in the same room or building as the facility used for day-to-day data retention (Leonhardt et al., 2016). Archived data should instead be maintained in a safe facility that is physically separate from the system site, so that natural and human-made accidents never put both copies at risk. We need two copies: one held centrally and one elsewhere, in case we need it fast.
Compliance.
Due to legal enforcement, certain organizations are required to retain data for a specified amount of time. Remaining within the regulatory criteria set by industrial laws or governmental policies is a prominent market issue; the consequences of a compliance breach can include costs, fines, and canceled contracts (Giacalone et al., 2018).
Data archiving allows companies to achieve compliance through long-term data storage and through consolidation for audit. The rules governing how long data must be stored, and who may access it, vary with the sector and the form of data companies in that industry generate. The following are among the reasons why organizations focus on data archiving methods.
Preventing data loss.
Archiving is also relevant for legal purposes. Many corporations unintentionally destroy records that regulation requires them to keep, and staff should remember that breaching such rules can lead to heavy sanctions or, in certain contexts, even imprisonment. Data movement has been one of the most serious challenges to ongoing implementations and data access; statistics suggest that 75% of organizations will suffer an unplanned outage this year (Killalea, 2016). When an organization executes a data migration process, the risk of failure is much greater. Archiving preserves business processes and the company's data by transferring data from costly main storage facilities to a significantly lower-cost archive storage device.
Legal requirements.
A successful archiving scheme guarantees consistency with company-specific retention schemes, independent of individual workers' expertise. Staff should be made aware that violating these rules can contribute to substantial fines or, in some situations, prison terms (Gerber & von Solms, 2008), and data protection regulators are imposing ever stricter penalties on the industry.
When an entity is involved in a court action, archiving provides protection and assistance. The collection and transmission of data on request is generally termed discovery in lawsuits (S & Venkateshkumar, 2018). Without archives, the expense of gathering evidence for a complaint can approach the cost of the case itself.
Data Backup Optimization.
Data backups can be slow and tedious, but they do not have to be once corporate data is archived. Indeed, businesses that archive files see significant improvements in data retention times, and for this reason some switch to file archiving altogether. Some file archiving firms go one step further by supplying archive duplication, removing the necessity for separate data backups, which is far more cost-effective and productive (Ghantasala et al., 2018).
Data Storage Costs.
Perhaps this one is the most evident: data, period, costs the business money. Regardless of the industry or data form, it costs a fortune to retain data on disc or in the cloud, whichever one it is (Sergeant & Sergeant, 2010). The financial aspect is the greatest advantage of archiving. Depending on the data volume, the cost of storing the business's data can be reduced by up to 50 percent, contributing considerable long-term savings for products and other sectors.
Data security and compliance.
The purpose of data protection conformance regulations is to ensure that businesses keep data structures and sensitive data integral, secure, and available. They comprise a series of protocols and regulations that safeguard companies from security vulnerabilities by protecting networks and records (Bindley, 2019).
Regulated companies are accountable for maintaining records correctly and consistently (Sholler et al., 2019), both to follow regulations and to uphold the principles of conformity. Regardless of a company's scale or sector, compliance touches virtually every record made. Under regulations such as GDPR, archiving is a prerequisite if the organization is to comply with data management legislation.
Data Storage Management.
Maybe the most prominent purpose of archiving data is efficiency and sizing: eliminating redundant transaction data from the production record (Sangat et al., 2017). The data provider should also point to the data kept in its primary storage.
While it seems a minor act, the organization saves both main and backup storage IT expenses and improves the speed of software such as IFS Applications; in exchange, a quicker system can boost efficiency. Finally, by correctly archiving obsolete data, the company frees up space on its main datastore that it no longer has to manage.
Data Visualization.
Visualization helps the user digest the data and see new directions. It allows consumers to recognize dynamics and phenomena that may not be visible in tabular data, and it enables managers to monitor results in graphical displays, including diagrams, plots, and heat maps (LaPolla & Rubin, 2018).
As big data emerges, data visualization is increasingly essential for interpreting the data gathered daily by the data user.
Data visualization enables companies to recognize, analyze, and communicate data in modern, more immersive formats. This willingness to be data-oriented drives them to learn data visualization applications and their related formats. The best data archiving strategies enable companies to visualize their data and create better strategic plans for their records (S & Sathayanarayana, 2018). Visualization makes clear how old the database is, what data the firm has, how long the data has been held, and other important information that allows the company to build an effective data archiving policy.
Increased Security.
Maintaining outdated or inactive records on high-traffic databases raises the prospect of a possible intrusion on the enterprise even with correct access controls. By protecting unused data, for instance by separating it from public access on a remote backup tier or system, businesses reduce the risk and possible impact of data being lost or stolen while ensuring the confidentiality of this data for as long as it is required (Schafer, 2004).
For security purposes, archiving is critical, particularly as cyber-attacks and data violations become more prevalent. By safely archiving records, companies can detect and defend themselves against unwanted third parties.
Data Consolidation.
The organization wants to optimize the information flowing through its computers, which expands rapidly every day. De-duplication and file stubbing are just the start: a file archiving program can further compact certain files, lessening the digital footprint (Narayanan, 2020). If that is not a reason for the organization to archive information, what is?
Through data consolidation, companies can obtain major information efficiently and conveniently. Businesses may improve their production and competitiveness when valuable knowledge is saved in a single location. Data restructuring also decreases maintenance costs. From the perspective of data intake, sustaining the data illustration becomes more complicated as a larger number of references are incorporated into the key framework (Bergquist, 2001).
Data Management Systems
Data is a distinct piece of information, usually formatted in a certain manner based on user requirements. Data should be stored and archived in structured and encrypted form for traceable and secure access. Data storage, archival, and retention are critical to the organization for both business and legal reasons (Bose, 2006). A lack of good practices can expose an organization and its employees to several risks, which could damage the organization's reputation and business. For example, in the health industry, data safety and patient confidentiality are paramount (Ferrari, 2010).
Archiving data ensures robust backup and faster recovery, and it guarantees easier, less time-consuming backup processes. It also helps in maintaining and protecting an organization's policies and objectives. An efficient data storage pipeline and cost-effective archival solutions enhance productivity and lead to organizational growth.
Enterprise Resource Planning Systems (ERP systems) for data
integration.
Information systems in a business can be composed of custom applications (written internally) or commercially purchased generic systems. Custom applications require extensive resources and long, expensive development cycles. Moreover, they need to be continually updated and maintained against the evolving landscape of new information systems (Herrmann, 2016). Off-the-shelf commercial systems remove this problem by taking the responsibility off the user. However, the one-size-fits-all approach of generic commercial systems cannot be tailored specifically to each business requirement (which may have thousands of parameters), imposing the need to obtain IT solutions from several different vendors (Wickramasinghe & Gunawardena, 2010). Separate modules are needed to link different functional areas; for example, the human resource area will require a different module to satisfy its business needs than the financial area.
Data generation and handling.
These modules should be linked to make better business
decisions by using the data generated from each module across
each other. Enterprise resource planning (ERP) systems were
developed with this vision. ERP applications are implemented
to provide an integrated solution to all areas involved in the
business operations (for example, Human resources, sales, etc.).
ERP applications are solely developed for data handling and are
thus well suited for modeling various transactional processes
(Pylypenko & Redko, 2019). These systems consist of
applications focused on the integration of data from various
sources. Common data structures are shared across many
applications and thus eliminate the need to pass data step-by-step among other applications. In ERP systems, data manipulation is easy since data is maintained in interoperable databases that can store data in a structured format used by the ERP applications. This, in turn, is based on the assumption that data infrastructures are homogeneous across the organization (rarely the case), which means that in some cases the databases are from the same vendor.
Moreover, some ERP systems may only support databases from
a specific vendor, forcing them to adopt standardized data
management solutions according to the ERP system. This also
means that the adoption of specific ERP systems requires that
legacy databases be replaced with ERP–compatible databases,
which creates the need for data conversions and the creation of
defined architecture for data storage. Therefore, the conversion
from legacy databases to ERP-compatible versions needs
standardizing, transferring, and cleaning existing data elements
(Lee & Chang, 2020).
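The standardizing-transferring-cleaning step just described can be sketched in miniature. All field names here (CUST_NO, NAME, customer_id) are hypothetical, chosen only to illustrate mapping a legacy schema onto an assumed ERP-compatible one while cleaning values and dropping duplicates:

```python
# Hypothetical sketch of converting legacy records into an ERP-compatible
# form: standardizing field names, cleaning values, and dropping duplicates.
def convert_legacy(rows):
    seen, converted = set(), []
    for row in rows:
        cust_id = str(row.get("CUST_NO") or row.get("customer_id")).strip()
        if cust_id in seen:
            continue                       # cleaning: drop duplicate keys
        seen.add(cust_id)
        converted.append({
            "customer_id": cust_id,        # standardized field name
            "name": row["NAME"].strip().title(),
        })
    return converted

legacy = [
    {"CUST_NO": " 7 ", "NAME": "  acme corp "},
    {"CUST_NO": "7", "NAME": "ACME CORP"},   # duplicate in the legacy system
]
out = convert_legacy(legacy)
assert out == [{"customer_id": "7", "name": "Acme Corp"}]
```

Real conversions involve thousands of fields and vendor-specific target schemas; the sketch only shows why the conversion is a standardization and cleaning problem rather than a straight copy.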
The construction of the ERP mechanism plays an important part in ensuring its effectiveness and stability. Three essential architectures for ERP systems are currently established (figure below). Each architecture utilizes a specific implementation technology, which restricts the choice of an appropriate method for any role the system has to perform, and the threats of short- or long-term issues are not well understood.
Among the most important advantages are stable structures: complete management stacks developed by leading organizations, including IBM, Sun Microsystems, and BMC (Khazaei et al., 2016). These technology providers offer a high degree of product expertise. There are also drawbacks, such as rigid hierarchical structures and difficulty accommodating new requirements.
The improvement in computing power means the existing server is simply transformed into a bigger server. Its equipment is proprietary, making the customer reliant on the vendor. Any computing platform must adapt and track trends, just as punched cards were overcome by solid-state drives (Baškarada et al., 2018).
The sophistication of digital applications demands both software production and efficiency upgrades. This implies, however, that unavoidable faults have been discovered in the ERP architecture, which over time would lead to new architectures, such as microservices, in its place. Many applications use this kind of construction more efficiently.
Legacy ERP systems have been widely used since their development in the early '90s. Initially developed to handle hard data, i.e., data stored on hard drives or memory storage devices, ERP systems have since evolved (disparately) to address data generated on Web and IoT devices (Boniecki & Rawłuszko, 2018). One issue is that the fundamental technology that drives legacy systems is old-fashioned, unable to leverage open-source software and APIs to empower interconnection. They are also not suited to an organization expanding through mergers and acquisitions, and these systems cannot handle the innumerable global regulatory requirements (Brogi et al., 2018).
Consequently, legacy systems are unable to connect easily and converse with other systems. This leads to the creation of multiple bolt-on solutions, costly in both time and money (Cho & Kim, 2014). In turn, a monster legacy ERP system depends on staff with specific legacy programming and system knowledge, and these resources can likewise be costly in both time and money.
Microservices.
A microservice is a distinct, autonomous element contributing
to a specific service. In a medium to large enterprise, many such
services may combine to achieve an end goal, for example, data
storage and archival. Furthermore, though robust
implementation of different microservices components may
improve overall efficiency, this implementation will differ
based on organizational management and past knowledge
(Yousif, 2016). These factors' role is not well known and needs
to be studied to optimize distributed services' efficiency, i.e.,
microservices.
Microservices fulfill typical data storage characteristics by
providing independent, expandable, and upgradeable factors fit
for the evolutionary design approach. For enterprises that have
traditionally used legacy ERP systems, migration to
microservices will require a change in organizational thinking
(Oberle & Dreiss, 2018). The distributed nature of
microservices means that data structure handling and archival
will be different for each service. This will require modified
requirements and collaboration between teams handling each
service.
Microservices are a subset of distributed computing services
that offer more secure, efficient, and cost-effective alternatives
to monolithic ERP systems for data archival and storage
solutions in medium and large enterprises (Maas et al., 2014).
As an organization scales, it will generate more data that needs
to be methodically managed and supervised to be applied
properly. In the medium to large enterprises, data flows in many
forms and shapes.
The architectural style of microservices has received considerable attention in recent years. Demand for microservices began in 2014 and has continuously increased since then.
The microservices architectural approach is to create a single program as a series of small services that communicate through lightweight mechanisms, often an HTTP resource API. They are built around business capabilities, are completely autonomous, and can be deployed individually (Олещенко & Глінський, 2017). This modern architecture allows massive, complex, and scalable systems to be created out of tiny, autonomous, highly decoupled processes that communicate with each other using APIs.
Properties of Microservices.
Microservice architecture.
The aim of microservices is to use autonomous units which, through decentralized container technology such as Docker, are isolated and coordinated into a decentralized network. In practice, implementing this architectural paradigm often means adopting agile methodologies, such as DevOps, which decreases the time needed to modify the structure and extends this to the development environment.
Services are the key building blocks and means of modularizing microservice structures, as the expression 'microservices' implies. Services may be deployed independently, replaced, and removed in different process circumstances (Celozzi, 2020). Each microservice focuses on a single business purpose, following the single-responsibility principle (SRP).
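A minimal sketch of such a single-purpose service, using only the Python standard library, might look like the following. The service name, route, and data are all hypothetical; the point is one business capability (price lookup) behind a lightweight HTTP resource API:

```python
# Hypothetical sketch of a microservice with a single business purpose
# (price lookup), exposed over a lightweight HTTP resource API.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

PRICES = {"sku-1": 9.99}   # this service owns its own data

class PriceService(BaseHTTPRequestHandler):
    def do_GET(self):
        sku = self.path.rsplit("/", 1)[-1]   # e.g. GET /prices/sku-1
        body = json.dumps({"sku": sku, "price": PRICES.get(sku)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):   # keep the example quiet
        pass

server = HTTPServer(("127.0.0.1", 0), PriceService)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_port}/prices/sku-1"
with urllib.request.urlopen(url) as resp:
    data = json.loads(resp.read())
server.shutdown()
assert data == {"sku": "sku-1", "price": 9.99}
```

A production service would of course run in its own container and process, but even this sketch shows the shape: one capability, one API, one privately owned data store.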
Centralized administration is discarded as far as possible and practicable, as is centralized data storage. This lets each team choose the best resources, such as suitable programming languages or repositories, for its mission (Molchanov & Zhmaiev, 2018). Decisions may also be reversed or overturned without impacting other teams.
Through a high degree of implementation and infrastructure optimization, teams can roll out new versions of their service on their own. Implementations of microservices are stateless, except for short-term caches, which boosts efficiency and durability; databases, in particular, are typically run separately per service (Venugopal, 2017).
Microservices' functionality is supplemented with API calls that offer data in a format easily usable by data visualization clients. This approach lessens the client-side code difficulties of aggregating and transforming data for visualization (Neubert et al., 2019). Microservices can therefore be implemented easily in small to medium organizations. Implementing microservices in large companies, however, requires re-strategizing application deployment. It can have self-service delivery implications, since microservices need administrators who understand the relationships between these services' demands.
Conversely, this creates the need for additional training and understanding of evolving technologies to promote adoption and create a seamless transition. There is still limited knowledge of the objective and subjective factors required for adopting microservices as an alternative to legacy ERP systems.
Decentralized data management.
There is a ubiquitous need in the IT sector to create
high-quality applications quickly. Companies of every size, from
large-scale cloud providers to fledgling start-ups, are thinking
about microservices (Sultan, 2020). Many want to move gradually
away from conventional ERP software toward loosely coupled
services. Microservice architectures have quickly become a must
for business data management. By splitting a large collection of
functions into distinct services, developers can build loosely
coupled, autonomous components that are not attached to a
specific server or circumstance (Plutora, 2019).
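The loose coupling just described can be sketched with two toy services that interact only through a message queue (all names are hypothetical); neither calls the other directly, so each could run on any server or environment.

```python
import queue

# Two hypothetical services communicate only through a message queue,
# so neither is attached to a specific server or runtime of the other.
orders_to_billing = queue.Queue()

def order_service(item, price):
    """Accepts an order and publishes an event; knows nothing about billing."""
    orders_to_billing.put({"item": item, "price": price})

def billing_service():
    """Consumes pending order events and produces invoices independently."""
    invoices = []
    while not orders_to_billing.empty():
        event = orders_to_billing.get()
        invoices.append(f"Invoice: {event['item']} @ {event['price']}")
    return invoices

order_service("book", 12)
order_service("pen", 2)
print(billing_service())  # invoices produced without any direct call
```

In production the in-process queue would be replaced by a network message broker, but the decoupling principle is the same.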
Microservices decentralize decisions about data storage as well
as decisions about logical data models. Although ERP
applications use a single logical database for persistent data,
companies often share a single database across various
applications because of vendors' licensing models.
Drivers for Microservices adoption.
Microservices were invented to overcome the challenges of
monolithic application development. Given their importance, when
assessing the case for microservices adoption, participants are
asked to evaluate the attributes commonly associated with
microservices. Among the key characteristics of microservices
are strong scalability and stability (Laigner et al., 2021). The
structure of the organization plays a very significant role in
realizing the value of the most requested features of a
microservices architecture. Since its inception, the
microservices architecture has reshaped the computing industry
by providing workable solutions to many unexpected complexities.
A single service can be scaled on its own rather than
redeploying the whole system; capacity can likewise be reduced
as demand falls. This prevents excessive operating costs and
server failure under high demand. The rise and decrease in
operating instances may even be automated, allowing a largely
hands-off servicing strategy to be adopted. Furthermore, a
microservices architecture can help deliver fast, immediate
solutions for existing applications effectively. Microservices
are loosely coupled, making each of them independent.
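The automated rise and decrease of service instances mentioned above can be sketched as a simple scaling rule (thresholds and names are illustrative, not tuned values from the source):

```python
import math

def desired_instances(request_rate, per_instance_capacity, minimum=1, maximum=10):
    """Return how many instances of one service to run for the current load.

    Scaling a single microservice up as demand rises (and back down as it
    falls) avoids both server failure under high demand and the cost of
    idle capacity; the rest of the system is left untouched.
    """
    needed = math.ceil(request_rate / per_instance_capacity)
    # Clamp to the allowed range so the autoscaler never over- or under-shoots.
    return max(minimum, min(maximum, needed))

print(desired_instances(request_rate=950, per_instance_capacity=100))  # 10
```

An automated loop would call such a rule periodically and adjust the instance count for just the service under load.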
Adopting microservices will certainly change an organization's
technological and organizational culture. No modifications to
the whole codebase are required to implement or alter a single
function (Yi et al., 2019). Each service is a distinct entity
that can be scaled individually, without scaling the whole
application.
Barriers in Microservices adoption.
There is no doubt that companies running microservices have
derived many benefits from them. On the other side of the coin,
however, not every business is positioned to reap the rewards of
a microservices design. A company should be sure it is ready to
handle the change before switching to microservices. Resistance
from both operations staff and developers can be a serious
obstacle to acceptance, as microservices differ greatly from
what developers and operators are used to (Mateus-Coelho et al.,
2021). In practice, developers and operators alike may resist
the transition to microservices.
Microservices demand a high degree of team autonomy and
self-direction. Teams must now handle certain cross-cutting
concerns traditionally managed by specialist teams, which is
arguably as it should be. Microservices are also not easy to
test: each service depends directly or indirectly on others, and
dependencies grow as new functions are added.
Continuous deployment and incremental growth models enable teams
to deliver support quickly with microservices; delivery can even
be instant where utility services are concerned.
Properties of Monolithic.
Monolithic Architecture.
Monolithic applications are designed to handle a range of
related activities. These applications are usually complex and
have many strictly interconnected features. Monolithic
architecture is the classic method of designing applications: a
monolithic program is a unified, indivisible entity. Typically,
a customized user interface, a server-side program, and a
database are part of this approach. It is centralized, operating
and serving all roles at a single location.
Monolithic programs normally have a massive codebase and lack
modularity. To upgrade or modify anything, developers access the
same codebase and therefore adjust the whole stack at once.
Monolithic implementations have strong interdependence between
closely interconnected modules (Villamizar et al., 2015).
Because the various modules share features, a fault in even a
single module topples the rest like dominoes, producing a
multiplicative effect.
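The domino effect described above can be made concrete with a toy monolith (hypothetical module names): every request runs through all tightly coupled modules in one process, so a fault in any one of them aborts the whole request.

```python
# A toy monolith: every request passes through all in-process modules,
# so one faulty module takes down the entire request pipeline.

def inventory(order):
    return {**order, "reserved": True}

def payments(order):
    raise RuntimeError("payments module is down")  # the single fault

def shipping(order):
    return {**order, "shipped": True}

def handle_request(order):
    for module in (inventory, payments, shipping):
        order = module(order)   # no isolation between modules
    return order

try:
    handle_request({"item": "book"})
except RuntimeError as exc:
    print("entire request failed:", exc)  # the domino effect
```

Had `payments` been a separate service, `inventory` and `shipping` could have continued to answer requests on their own.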
ERP systems have streamlined enterprise operations, and
monolithic platforms share a common data method and model
covering all operations. Business requirements concentrate
mainly on four fields: IT cost reduction, enterprise application
productivity, business procedures, and business productivity
(Mosleh et al., 2018). The choice of ERP is exclusive to each
business. We built a cloud platform that can help determine
which solution is right for a business enterprise, based on
comprehensive expertise in the active deployment and support of
ERP applications.
Drivers for Monolithic adoption.
Cross-cutting concerns are issues affecting the whole program,
such as logging, error handling, caching, and performance
tracking. In a monolithic framework this category of complexity
lives in one place and is thus easier to manage. Unlike a
microservices design, monolithic systems are much simpler to
configure and validate. Because a monolithic application is a
single unit, end-to-end testing can be performed much more
quickly. The simplicity of monolithic applications also makes
them easy to deploy: rather than juggling multiple deployments,
there is only one file or directory. And as the most common way
to design software, a monolithic solution lets a team of
engineers with the right experience and skills build the
framework.
Barriers for Monolithic adoption.
A monolithic program becomes too difficult to grasp as it grows.
It is also hard to manage a dynamic code structure within one
framework. Changes are harder to execute in a wide and
complicated program with very tight coupling: every code
modification impacts the whole system and must therefore be
coordinated extensively (Marcinauskas, 2021). This lengthens the
whole build process. Adopting modern technology in a monolithic
application requires rewriting the whole application.
Monolithic Vs. Microservices Systems.
Microservice architecture versus monolithic architecture.
The term 'monolithic software' describes a software
implementation whose modules cannot be deployed independently,
as seen in Figure 2, an instance of the monolithic architecture.
For a long time, the monolithic architecture style was the norm
in software systems. Even so, some general problems with the
monolithic architecture lead organizations to convert to
microservices (Dragoni et al., 2017). The issues are listed
below:
1. Monolithic applications tend to expand continuously. This
increases ambiguity, making monolithic applications
progressively harder to maintain; identifying mistakes and
building new functionality takes a long time (Dragoni et al.,
2017).
2. If any portion of a monolithic application is modified, the
entire application must be redeployed. This is acceptable for
smaller applications but can mean substantial downtime for
larger ones (Dragoni et al., 2017).
3. Another flaw of monolithic implementations is the management
of scalability. Typically, the approach is to create more
instances of the whole application to handle elevated load when
the application is hit with a surge of inbound requests.
Monolithic applications function as a unified structure and
cannot be divided, so the parts that do not require the added
capacity still receive it and drain money.
4. Monolithic implementations are harder to provision because
different application areas can have different requirements:
some parts are CPU-intensive, others memory-intensive.
Developers must select a one-size-fits-all environment to
satisfy both, which is costly and suboptimal (Al-Debagy &
Martinek, 2019).
Microservices are small, autonomous processes that, as
described, communicate with each other to form a distributed
application (see the example of a microservices architecture).
Each is a tiny, autonomous unit with its own operating
environment, database, and other supporting applications.
Essentially, all components of an MSA program are microservices.
For example, a webshop may have a microservice to handle
customer data: the service only adds, removes, updates, and
lists customer details for the online store. The microservice
has no further roles and knows little else (Laigner et al.,
2021).
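The webshop customer microservice described above can be sketched as follows (a minimal in-memory sketch with hypothetical names; a real service would sit behind an API and its own database):

```python
class CustomerService:
    """Sketch of the webshop customer microservice: it only adds,
    removes, updates, and lists customer details, and knows nothing
    about orders, payments, or the rest of the shop."""

    def __init__(self):
        self._customers = {}   # service-private data store
        self._next_id = 1

    def add(self, name, email):
        cid = self._next_id
        self._next_id += 1
        self._customers[cid] = {"name": name, "email": email}
        return cid

    def update(self, cid, **fields):
        self._customers[cid].update(fields)

    def remove(self, cid):
        self._customers.pop(cid, None)

    def list(self):
        return dict(self._customers)
```

The narrow interface is the point: other services can only ask this one about customers, never reach into its data store directly.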
It focuses solely on the small role of managing customer
information. Together, the combination of many such
microservices forms the shared framework. Microservices also
interact with each other by exchanging messages, which means
individual microservices can be built with different programming
languages and environments according to their requirements.
Migration from an ERP System to Microservices.
When a corporation is set up, its implementations usually start
out monolithic, depending on context. This is fair, since such
systems initially perform better and need less equipment under
light load. Nevertheless, as businesses develop and change, they
outgrow their technology infrastructure (Slamaa et al., 2021).
When systems become large and dynamic, microservices become the
long-term technology option for businesses.
In this case, it is necessary to evaluate both architectures'
performance to justify such a migration. Memory utilization is
the amount of memory consumed in the operation of a process;
network throughput measures data transfer, both when
transmitting and when receiving. Wix.com embraced microservices
partly to address major technological difficulties that had
caused uncertainty (Jiang et al., 2014); in 2010, the
corporation began to split components into smaller services to
help handle scalability. Likewise, Best Buy's architecture had
become a deployment constraint: the time needed to bring the
company online was simply too long. A few decades back,
companies needed to run all the server-side infrastructure
themselves, such as database server administration, customized
applications, network switches, and data center racks. With the
launch of cloud computing, however, things got easier.
Team experience of Monolithic Systems to Microservices.
Established organizations adjust team responsibilities to new
software development practices, including ownership of various
aspects of the development cycle (Marquez et al., 2021). The
Agile Manifesto's well-known phrase "self-organizing teams"
describes how many software cultures adapt to them: organically
and easily, with limited confusion. For other organizations,
however, the necessary modifications may require some
encouragement from leadership; it all depends on the culture of
the company experiencing the change. Viewing system layout as a
direct result of team structure is a good way to understand the
creation of microservices (Bucchiarone et al., 2018). The aim is
to refine team structures so that they produce focused products,
in this case microservices, and just about every company would
take this path to maximize the value of its software.
Comparing core parameters of Microservice and Monolithic
Systems.
Microservices differ in architecture from monolithic systems.
This means that microservices have a different methodology for
implementation, deployment, and maintenance than ERP systems;
consequently, their functional and technical performance will
also differ. We discuss below some core parameters to consider
when comparing microservices with ERP systems.
Independent Components.
Above all, each service can be deployed and modified
individually, which provides more stability. Secondly, a
malfunction in a single microservice affects only that service,
not the whole framework. Adding functionality to a microservices
platform is often much faster than to an ERP (Gao et al., 2020).
Agility.
The microservices architecture offers greater mobility and
facilitates pivoting between domain areas. DevOps can
concentrate on upgrading only the appropriate parts of an
application by breaking functionality down to the lowest level
and then redeploying the associated services. The frustrating
integration mechanism usually linked to ERP applications is
removed. Microservice development is faster and can be completed
in weeks rather than months. Systems are typically configured to
run on multiple servers (Kazanavičius & Mažeika, 2019).
Microservices bring agility to all features and functions,
meaning the whole system never goes down at once in companies
building information infrastructure. Agility is built into
microservices. ERP schemes, by contrast, have inconsistent
effects on agility, and minimal effects after deployment (Tapia
et al., 2020). In the past, enterprise resource planning
programs helped to simplify, standardize, integrate, and
automate operations, and thus had an unclear impact on a
company's capacity for agility.
Implementation.
The ERP architecture is the simplest to execute; if no
architecture is deliberately chosen, the outcome is probably a
monolith. An ERP architecture will take an application very far,
as it is simple to create and helps teams bring their products
to clients very easily (Montesi et al., 2021). Keeping the
entire codebase in one location and launching a single program
has many benefits: you maintain only one repository and can
browse and find all features in one folder quickly.
Deployment.
The ERP design lets you deploy the application once, based on
the existing modifications; however, if anything goes wrong, the
entire project melts down.
In a microservices architecture, deployment is a dynamic
process. Each microservice must be deployed independently, which
lengthens the deployment process (Mazlami et al., 2017). But if
anything goes wrong, only one microservice is affected, and it
is easier to repair.
Maintenance.
Maintaining an ERP architecture requires an IT team versed in
multiple platforms such as Pascal, .NET, Java, or DB2. In the
monolith, finding bugs and making adjustments takes much time.
Testing itself, though, is straightforward and can take place
all at once.
Microservices are simpler to maintain than monoliths. Smaller
services save programmers time and are easier to test; over
time, productivity rises and money is saved.
Reliability.
Where reliability is concerned, the monolith stands little
chance against microservices. If anything unexpected happens in
an ERP architecture, it interrupts the whole structure. In the
microservice design, meanwhile, the failure of one service does
not create major issues for the rest of the application system
(AL-Mandi & AL-Sharjabi, 2020).
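This failure isolation can be sketched with a simple fallback wrapper (a toy sketch with hypothetical service names): a broken, non-critical service degrades gracefully instead of taking the whole application down.

```python
def call_with_fallback(service, request, fallback):
    """Invoke one microservice; on failure, return a fallback value
    instead of letting the fault interrupt the whole application."""
    try:
        return service(request)
    except Exception:
        return fallback

def recommendations(user):          # a hypothetical, currently broken service
    raise TimeoutError("service unavailable")

def catalog(user):                  # a healthy service
    return ["book", "pen"]

# The page is still served; only the broken part is degraded.
page = {
    "products": call_with_fallback(catalog, "alice", []),
    "suggestions": call_with_fallback(recommendations, "alice", []),
}
```

Production systems refine this idea into patterns such as circuit breakers, but the principle is the same: one failing service affects only its own feature.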
Microservice networks, moreover, are largely stable and secure.
Breaking one part affects only that part, while the others stay
unchanged. This versatility allows a high growth rate and
improvements to one feature without competing with others.
Scalability.
Due to structural complexity and size, scalability is
challenging to accomplish in an ERP architecture, and difficult
to change later. Scalability is much simpler for microservices,
since we can scale only the parts that need more capacity. The
microservices solution also benefits from the fact that each
component can be sized individually. Because the whole
monolithic application must be scaled even when only part of it
is under load, the microservices solution is more economical and
time-efficient than a monolith. Furthermore, each monolith has
scalability constraints, such that the greater the number of
users acquired, the more issues the monolith has (Di Francesco
et al., 2019).
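The cost difference between scaling a whole monolith and scaling a single hot service can be illustrated with some back-of-the-envelope arithmetic (the numbers are invented for illustration, not taken from the source):

```python
import math

# Illustrative load: one "hot" service needs 5x capacity; nine other
# services in the same system carry no extra load.
HOT_COPIES = math.ceil(500 / 100)   # instances needed for the hot service
COLD_SERVICES = 9                   # services with no extra load

# Monolith: every clone carries all ten components, needed or not.
monolith_instances = HOT_COPIES * (1 + COLD_SERVICES)
# Microservices: replicate only the hot service; one instance elsewhere.
microservice_instances = HOT_COPIES + COLD_SERVICES

print(monolith_instances, microservice_instances)  # 50 vs 14
```

The gap widens as the system grows, which is why the text calls whole-system cloning the more expensive path.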
Many firms, however, eventually rework their ERP architectures.
Contrast the simple scalability of microservices with ERPs,
where scaling is not trivial: a module with sluggish internal
code cannot be made to work faster by replication. To scale an
ERP system, a clone of the whole system must be executed on a
separate computer, which does not remove the bottleneck of a
sluggish inner stage within the monolith.
Development.
An ERP architecture takes somewhat longer to evolve than
microservices, because the departments have to operate in tandem
on the same code. Microservices provide quick implementation
(Escobar et al., 2016): unlike in an ERP architecture, teams do
not have to operate in parallel, as each service can be supplied
separately.
Releases.
A monolith is a single-piece arrangement that cannot be divided
into smaller pieces, which is why everything must be ready
before a release; in teamwork, potential issues anywhere can
hamper the whole project. Thanks to the microservices structure,
new capabilities can be released far more rapidly (Baresi &
Garriga, 2019).
Cost.
Microservices excel at simplifying the complicated problem of
modifying massive, unmanageable ERP/IT structures built from a
vast variety of parts, technologies, and applications. Monolith
architecture is cheaper and quicker to build, but each
particular case has to be assessed. Monoliths are a major
investment for companies and become a greater challenge and a
larger budget burden (Villamizar et al., 2016). Microservices
are often more costly, and the full implementation takes longer
than for monolithic applications. Yet in the long term they can
cost less, considering that developers' working time is lower
than with a monolithic architecture.
Conclusion
In today's business environment, it is equally necessary to
handle knowledge effectively so that it remains productive for
the business. Data is regarded as a highly useful competitive
advantage that provides the enterprise with economic benefits, a
view further reinforced by the progress of software development
in organizations. Our research focuses on understanding the
factors that promote the growth of a creative company
microservice ecosystem and their contribution to organizational
competitiveness.
This research helps clarify the design strategy for improving
assertiveness by considering the impact of the organizational
memory components. There is a fundamental shift in how knowledge
is created, used, and handled in organizations today. It is
probably obvious at this stage that our data-powered universe
will not shrink; in reality, information and data storage
capacity will probably continue to increase. Beyond the
processing and storing of information lies a dynamic phase: the
cognitive mechanisms of organizational memory that accumulate,
perceive, and preserve information. To quickly view and
summarize results as usable knowledge at the moment of decision,
organizations will have to use sophisticated storage and
retrieval procedures (Bhandary & Maslach, 2018).
Organizational memory is the knowledge acquired from prior
experience that may be used for decision-making. This essay
discusses some of the subtleties of institutional memory and its
impact on organizations. One key problem relevant to this
thesis, which examines the facets of organizational storage, is
the domain of data storage. This is currently considered an
essential factor for enhancing business productivity through
knowledge and memory management.
Organizational memory concerns the organization's ability to
take advantage of its previous experiences to function
successfully in the present. The OM philosophy thus focuses on
storage and recovery processes, so that organizational and human
understanding can be reused. This expertise can be stored in
different repositories and is essential for enhancing the
efficiency of the organization. Organizational memory enables
the organization to improve its productivity (Kaufmann et al.,
2018). Its key principles are based on features to save,
restore, and use past business interactions. In other terms, OM
learns from the past and informs new experiences.
References
Baresi, L., & Garriga, M. (2019). Microservices: The Evolution
and Extinction of Web Services? Microservices, 3–28.
https://doi.org/10.1007/978-3-030-31646-4_1
Baškarada, S., Nguyen, V., & Koronios, A. (2018). Architecting
Microservices: Practical Opportunities and Challenges. Journal
of Computer Information Systems, 1–9.
https://doi.org/10.1080/08874417.2018.1520056
Berman, E. (2017). An Exploratory Sequential Mixed Methods
Approach to Understanding Researchers' Data Management
Practices at UVM: Findings from the Quantitative Phase.
Journal of EScience Librarianship, 6(1), e1098.
https://doi.org/10.7191/jeslib.2017.1098
Brogi, A., Neri, D., & Soldani, J. (2018). A microservice-based
architecture for (customizable) analyses of Docker images.
Software: Practice and Experience, 48(8), 1461–1474.
https://doi.org/10.1002/spe.2583
Celozzi, C. (2020, December 2). How DoorDash transitioned from a
code monolith to microservices. DoorDash Engineering Blog.
https://doordash.engineering/2020/12/02/how-doordash-
transitioned-from-a-monolith-to-microservices/
Di Francesco, P., Lago, P., & Malavolta, I. (2019). Architecting
with microservices: A systematic mapping study. Journal of
Systems and Software, 150, 77–97.
https://doi.org/10.1016/j.jss.2019.01.001
Habadi, A., Samih, Y., Almehdar, K., & Aljedani, E. (2017). An
Introduction to ERP Systems: Architecture, Implementation, and
Impacts. International Journal of Computer Applications,
167(9), 1–4. https://doi.org/10.5120/ijca2017914322
Kazanavičius, J., & Mažeika, D. (2019, April 1). Migrating
Legacy Software to Microservices Architecture. IEEE Xplore.
https://doi.org/10.1109/eStream.2019.8732170
Khazaei, H., Barna, C., Beigi-Mohammadi, N., & Litoiu, M.
(2016). Efficiency Analysis of Provisioning Microservices.
2016 IEEE International Conference on Cloud Computing
Technology and Science (CloudCom).
https://doi.org/10.1109/cloudcom.2016.0051
Laigner, R., Zhou, Y., Salles, M. A. V., Liu, Y., & Kalinowski,
M. (2021). Data Management in Microservices: State of the
Practice, Challenges, and Research Directions.
ArXiv:2103.00170 [Cs]. https://arxiv.org/abs/2103.00170
Nawaz, N., & Channakeshavalu. (2013). The Impact of
Enterprise Resource Planning (ERP) Systems Implementation on
Business Performance. SSRN Electronic Journal.
https://doi.org/10.2139/ssrn.3525298
Plutora. (2019, June 28). Understanding Microservices and
Their Impact on Companies. Plutora.
https://www.plutora.com/blog/understanding-microservices
Sampaio, A. R., Rubin, J., Beschastnikh, I., & Rosa, N. S.
(2019). Improving microservice-based applications with runtime
placement adaptation. Journal of Internet Services and
Applications, 10(1). https://doi.org/10.1186/s13174-019-0104-0
Sandoe, K., & Olfman, L. (1992). Anticipating the mnemonic
shift: Organizational remembering and forgetting in 2001.
International Conference on Information Systems (ICIS), 1–12.
https://core.ac.uk/download/pdf/301364184.pdf
Singh, V., & Peddoju, S. K. (2017). Container-based microservice
architecture for cloud applications. International Conference on
Computing, Communication, and Automation (ICCCA), 847–852.
https://doi.org/10.1109/CCAA.2017.8229914
Siong Choy, C., & Yong Suk, C. (2005). Critical Factors In The
Successful Implementation Of Knowledge Management. Journal
of Knowledge Management Practice, 6(1), 234–258.
http://www.tlainc.com/articl90.htm
Stubbs, J., Moreira, W., & Dooley, R. (2015, June 1).
Distributed Systems of Microservices Using Docker and
Serfnode. IEEE Xplore; 7th International Workshop on Science
Gateways, Budapest, Hungary.
https://doi.org/10.1109/IWSG.2015.16
Swoyer, S. (2020, July 15). Microservices Adoption in 2020.
O'Reilly Media.
https://www.oreilly.com/radar/microservices-adoption-in-2020/
Tapia, F., Mora, M. Á., Fuertes, W., Aules, H., Flores, E., &
Toulkeridis, T. (2020). From Monolithic Systems to
Microservices: A Comparative Study of Performance. Applied
Sciences, 10(17), 5797. https://doi.org/10.3390/app10175797
Villamizar, M., Garces, O., Ochoa, L., Castro, H., Salamanca,
L., Verano, M., Casallas, R., Gil, S., Valencia, C., Zambrano,
A., & Lang, M. (2016). Infrastructure Cost Comparison of
Running Web Applications in the Cloud Using AWS Lambda
and Monolithic and Microservice Architectures. 2016 16th
IEEE/ACM International Symposium on Cluster, Cloud and
Grid Computing (CCGrid).
https://doi.org/10.1109/ccgrid.2016.37
Vrîncianu, M., Anica-Popa, L., & Anica-Popa, I. (2009).
Organizational Memory: an Approach from Knowledge
Management and Quality Management of Organizational
Learning Perspectives. The AMFITEATRU ECONOMIC
Journal, 11(26), 473–481.
38. https://ideas.repec.org/a/aes/amfeco/v11y2009i26p473-482.html
Baboi, M., Iftene, A., & Gîfu, D. (2019). Dynamic
Microservices to Create Scalable and Fault Tolerance
Architecture. Procedia Computer Science, 159, 1035–1044.
https://doi.org/10.1016/j.procs.2019.09.271
Chan Jianli, D., Al-Rashdan, M., & Al-Maatouk, Q. (2020). Secure
data storage system. Journal of Critical Reviews, 7(3).
https://doi.org/10.31838/jcr.07.03.18
Al-Debagy, O., & Martinek, P. (2019). A Comparative Review
of Microservices and Monolithic Architectures.
ArXiv:1905.07997 [Cs]. http://arxiv.org/abs/1905.07997
AL-Mandi, M. A., & AL-Sharjabi, A. (2020, December 1).
Level of Effectiveness for ERP System in Improving the
Educational Process in Higher Education Institutions in Yemen:
A Case Study of the University of Science and Technology.
Arab Journal for Quality Assurance in Higher Education.
https://doaj.org/article/e2f955aaa2d34ae9af4ec375d9db8cb7
Balalaie, A., Heydarnoori, A., Jamshidi, P., Tamburri, D. A., &
Lynn, T. (2018). Microservices migration patterns. Software:
Practice and Experience. https://doi.org/10.1002/spe.2608
Bergquist, N. R. (2001). A concept for the collection,
consolidation and presentation of epidemiological data. Acta
Tropica, 79(1), 3–5. https://doi.org/10.1016/s0001-
706x(01)00132-2
Bhandary, A., & Maslach, D. (2018). Organizational Memory.
The Palgrave Encyclopedia of Strategic Management, 1219–
1223. https://doi.org/10.1057/978-1-137-00772-8_210
Bindley, P. (2019). Joining the dots: how to approach
compliance and data governance. Network Security, 2019(2),
14–16. https://doi.org/10.1016/s1353-4858(19)30023-6
Boniecki, R., & Rawłuszko, J. (2018). ON THE
DEVELOPMENT OF THE ERP SYSTEM IN THE
PROCESSING-TRANSPORTING ENTERPRISES. Ekonomiczne
Problemy Usług, 131, 49–56.
https://doi.org/10.18276/epu.2018.131/1-05
Booth, C., & Rowlinson, M. (2006). Management and
organizational history: Prospects. Management & Organizational
History, 1(1), 5–30.
https://doi.org/10.1177/1744935906060627
Borgerud, C., & Borglund, E. (2020). Correction to: Open
research data, an archival challenge? Archival Science.
https://doi.org/10.1007/s10502-020-09335-y
Bose, R. (2006). Understanding management data systems for
enterprise performance management. Industrial Management &
Data Systems, 106(1), 43–59.
https://doi.org/10.1108/02635570610640988
Bruno, G. (2014). A Data-flow Language for Business Process
Models. Procedia Technology, 16, 128–137.
https://doi.org/10.1016/j.protcy.2014.10.076
Bucchiarone, A., Dragoni, N., Dustdar, S., Larsen, S. T., &
Mazzara, M. (2018). From Monolithic to Microservices: An
Experience Report from the Banking Domain. IEEE Software,
35(3), 50–55. https://doi.org/10.1109/ms.2018.2141026
Bukari Zakaria, H., & Mamman, A. (2014). Where is the
Organisational Memory? A Tale of Local Government
Employees in Ghana. Public Organization Review, 15(2), 267–
279. https://doi.org/10.1007/s11115-014-0271-1
Priya, C. (2011). Need Based Technology for Innovation. Indian
Journal of Applied Research, 4(4), 19–20.
https://doi.org/10.15373/2249555x/apr2014/251
Cho, Y.-T., & Kim, I. (2014). The Difference Analyses between
Users’ Actual Usage and Perceived Preference: The Case of
ERP Functions on Legacy Systems. The Journal of Information
Systems, 23(1), 185–202.
https://doi.org/10.5859/kais.2014.23.1.185
Dragoni, N., Giallorenzo, S., Lafuente, A. L., Mazzara, M.,
Montesi, F., Mustafin, R., & Safina, L. (2017). Microservices:
Yesterday, Today, and Tomorrow. Present and Ulterior Software
Engineering, 195–216. https://doi.org/10.1007/978-3-319-
67425-4_12
Ehrhart, M. G., Aarons, G. A., & Farahnak, L. R. (2015). Going
above and beyond for implementation: the development and
validity testing of the Implementation Citizenship Behavior
Scale (ICBS). Implementation Science, 10(1).
https://doi.org/10.1186/s13012-015-0255-8
Escobar, D., Cardenas, D., Amarillo, R., Castro, E., Garces, K.,
Parra, C., & Casallas, R. (2016). Towards the understanding and
evolution of monolithic applications as microservices. 2016
XLII Latin American Computing Conference (CLEI).
https://doi.org/10.1109/clei.2016.7833410
Esposito, C. (2018). Interoperable, dynamic and privacy-
preserving access control for cloud data storage when
integrating heterogeneous organizations. Journal of Network
and Computer Applications, 108, 124–136.
https://doi.org/10.1016/j.jnca.2018.01.017
Ferrari, E. (2010). Access Control in Data Management
Systems. Synthesis Lectures on Data Management, 2(1), 1–117.
https://doi.org/10.2200/s00281ed1v01y201005dtm004
Fujita, T., & Ogawara, M. (2005). Arbre: A File System for
Untrusted Remote Block-level Storage. IPSJ Digital Courier, 1,
381–393. https://doi.org/10.2197/ipsjdc.1.381
Gao, M., Chen, M., Liu, A., Ip, W. H., & Yung, K. L. (2020).
Optimization of Microservice Composition Based on Artificial
Immune Algorithm Considering Fuzziness and User Preference.
IEEE Access, 8, 26385–26404.
https://doi.org/10.1109/access.2020.2971379
Gerber, M., & von Solms, R. (2008). Information security
requirements – Interpreting the legal aspects. Computers &
Security, 27(5-6), 124–135.
https://doi.org/10.1016/j.cose.2008.07.009
Giacalone, M., Cusatelli, C., & Santarcangelo, V. (2018). Big
Data Compliance for Innovative Clinical Models. Big Data
Research, 12, 35–40. https://doi.org/10.1016/j.bdr.2018.02.001
Herrmann, F. (2016). Using Optimization Models for
Scheduling in Enterprise Resource Planning Systems. Systems,
4(1), 15. https://doi.org/10.3390/systems4010015
Hujda, K., Marineau, C., & Wick, A. (2016). Maximum Product,
Even Less Process: Increasing Efficiencies in Archival
Processing Using ArchivesSpace. Journal of Archival
Organization, 13(3-4), 100–113.
https://doi.org/10.1080/15332748.2018.1443549
Hunter, J., & Cheung, K. (2007). Provenance Explorer-a
graphical interface for constructing scientific publication
packages from provenance trails. International Journal on
Digital Libraries, 7(1-2), 99–107.
https://doi.org/10.1007/s00799-007-0018-5
Jiang, L., Xu, L. D., Cai, H., Jiang, Z., Bu, F., & Xu, B. (2014).
An IoT-Oriented Data Storage Framework in Cloud Computing
Platform. IEEE Transactions on Industrial Informatics, 10(2),
1443–1451. https://doi.org/10.1109/tii.2014.2306384
Johansson, B. (2012). Exploring how open source ERP systems
development impact ERP systems diffusion. International
Journal of Business and Systems Research, 6(4), 361.
https://doi.org/10.1504/ijbsr.2012.049468
K S, G., & T, Prof. P. (2019). A Better Solution Towards
Microservices Communication In Web Application: A Survey.
International Journal of Innovative Research in Computer Science
& Technology, 7(3), 71–74.
https://doi.org/10.21276/ijircst.2019.7.3.7
Kaufmann, E., Favretto, J., Filippim, E. S., & Cohen, E. D.
(2018). Relationship Between The Organizational Memory and
Innovativity: The Case of Software Development Companies in
The Southern Region of Brazil. Journal of Information Systems
and Technology Management, 16.
https://doi.org/10.4301/S1807-1775201916004
Khidzir, N. Z., & Ahmed, S. A.-A.-M. (2018). Big Data Digital
Evidences Integrity: Issues, Challenges and Opportunities.
SSRN Electronic Journal. https://doi.org/10.2139/ssrn.3227714
Kilchenmann, A., Laurens, F., & Rosenthaler, L. (2019).
Digitizing, archiving... and then? Ideas about the usability of a
digital archive. Archiving Conference, 2019(1), 146–150.
https://doi.org/10.2352/issn.2168-3204.2019.1.0.34
Killalea, T. (2016). The hidden dividends of microservices.
Communications of the ACM, 59(8), 42–45.
https://doi.org/10.1145/2948985
Kornei, K. (2019). More Than a Million New Earthquakes
Spotted in Archival Data. Eos, 100.
https://doi.org/10.1029/2019eo121757
Kumari, S., Archana, A., Shree, K., Ashwini, A., & M, C.
(2019). Efficient Block-wise Image Comparison and Storage
Reduction Using DICE Protocol.
International Journal of Current Engineering and Scientific
Research, 6(6), 175–181.
https://doi.org/10.21276/ijcesr.2019.6.6.30
Laigner, R., Zhou, Y., Salles, M. A. V., Liu, Y., & Kalinowski,
M. (2021). Data Management in Microservices: State of the
Practice, Challenges, and Research Directions.
ArXiv:2103.00170 [Cs]. http://arxiv.org/abs/2103.00170
Langos, C., & Giancaspro, M. (2015). Does Cloud Storage Lend
Itself to Cyberbullying? IEEE Cloud Computing, 2(5), 70–74.
https://doi.org/10.1109/mcc.2015.102
LaPolla, F. W. Z., & Rubin, D. (2018). The “Data Visualization
Clinic”: a library-led critique workshop for data visualization.
Journal of the Medical Library Association, 106(4).
https://doi.org/10.5195/jmla.2018.333
Lee, N. C.-A., & Chang, J. Y. T. (2020). Adapting ERP Systems
in the Post-implementation Stage: Dynamic IT Capabilities for
ERP. Pacific Asia Journal of the Association for Information
Systems, 28–59. https://doi.org/10.17705/1pais.12102
Leonhardt, J. M., Trafimow, D., & Niculescu, M. (2016).
Selecting Field Experiment Locations with Archival Data.
Journal of Consumer Affairs, 51(2), 448–462.
https://doi.org/10.1111/joca.12117
Linger, H., Burstein, F., Zaslavsky, A., & Crofts, N. (1999). A
Framework for a Dynamic Organizational Memory Information
System. Journal of Organizational Computing and Electronic
Commerce, 9(2), 189–203.
https://doi.org/10.1207/s15327744joce0902&3_6
Maas, J.-B., van Fenema, P. C., & Soeters, J. (2014). ERP
system usage: the role of control and empowerment. New
Technology, Work and Employment, 29(1), 88–103.
https://doi.org/10.1111/ntwe.12021
Marcinauskas, E. (2021, March 1). Research of ERP System
integration into Lean Manufacturing. Mokslas: Lietuvos Ateitis.
https://doaj.org/article/a6fb6fe1b19d488eb599c8a7b3fd47f1
Marquez, G., Taramasco, C., Astudillo, H., Zalc, V., & Istrate,
D. (2021). Involving Stakeholders in the Implementation of
Microservice-Based Systems: A Case Study in an Ambient-
Assisted Living System. IEEE Access, 9, 9411–9428.
https://doi.org/10.1109/access.2021.3049444
Mateus-Coelho, N., Cruz-Cunha, M., & Ferreira, L. G. (2021).
Security in Microservices Architectures. Procedia Computer
Science, 181, 1225–1236.
https://doi.org/10.1016/j.procs.2021.01.320
Mazlami, G., Cito, J., & Leitner, P. (2017). Extraction of
Microservices from Monolithic Software Architectures. 2017
IEEE International Conference on Web Services (ICWS).
https://doi.org/10.1109/icws.2017.61
Milosch, J. C. (2014). Provenance: Not the Problem (The