This document discusses enterprise content management (ECM) solutions, including their typical architecture and key challenges in implementation. It describes the four main components of an ECM architecture: (1) the user interface, (2) information governance, (3) attributes like data archiving and workflow, and (4) the repository for secure storage. The document also outlines stages in an ECM implementation roadmap strategy, highlighting the need to specify information governance over the lifecycle and establish interoperability between systems.
DESIGN, DEVELOPMENT & IMPLEMENTATION OF ONTOLOGICAL KNOWLEDGE BASED SYSTEM FO... (IJDKP)
Dynamism and uncertainty are genuine threats to today's high-technology organisations, and the capability to change is the crux of sustainability for large organisations. Modern manufacturing philosophies, including agile and lean, are not by themselves enough to remain competitive in the global market; a new paradigm, reconfigurable manufacturing systems, is therefore fast emerging to complement the application of lean and agile manufacturing systems. Product, Process and Resource (PPR) are the core areas in the engineering domain of a manufacturing enterprise, and they are tightly coupled with each other: a change in one (usually the product) affects the others, so engineering change management must tackle PPR change effects. Current software applications do not provide an unequivocal infrastructure in which PPR can be explicitly related. It follows that reconfigurable techniques can be further complemented by knowledge-based systems to design, engineer, manufacture, commission and change existing processes and resources against changed products.
Companies’ perception toward manufacturing execution systems (IJECEIAES)
The use of information systems in the manufacturing sector is crucial for reaching a high level of operational excellence and improving companies’ competitiveness, and, given current digitalization strategies, their use will only increase in the coming years. Manufacturing execution systems have gained a lot of attention in recent years due to demonstrated benefits in production management operations: companies that adopted such systems witnessed an increase in process efficiency and improvements in cost savings and product quality. This paper seeks to analyze what makes the usage of manufacturing execution systems successful among manufacturing companies. We analyzed how the integration capabilities of such systems with other business applications, and the company profile, impact their usage and consequently the perceived benefits. A case study was conducted with 51 manufacturing companies, and the data were analyzed using the partial least squares structural equation modeling technique. The results confirmed the positive and significant impact of company profile and solution integration capabilities on system usage. In addition, a ranking of the importance of solution modules for companies is provided.
Designing a Framework to Standardize Data Warehouse Development Process for E... (ijdms)
Data warehousing solutions serve as an information base for large organizations, supporting their decision-making tasks. Given the proven need for such solutions today, it is crucial to design, implement and utilize them effectively. Data warehouse (DW) implementation has been a challenge for organizations, and its success rate has been very low. To address these problems, we have proposed a framework for developing effective data warehousing solutions. The framework is primarily based on the procedural aspects of data warehouse development and aims to standardize that process. We first identified its components and then worked on them in depth to arrive at a framework for the effective implementation of data warehousing projects. To verify the effectiveness of the designed framework, we applied it to the National Rural Health Mission (NRHM) project of the Indian government and designed a data warehousing solution using the proposed framework.
KEYWORDS
Data warehousing, Framework Design, Dimensional
Role of Operational System Design in Data Warehouse Implementation: Identifyi... (iosrjce)
The data warehouse design process takes input from the operational system of the organization, and the quality of a data warehousing solution depends on the design of that operational system. Often, organizations' operational system implementations have limitations, so we cannot proceed directly to data warehouse design. In this paper, we investigate the operational system of an organization to identify such limitations and to determine the role of operational system design in the process of data warehouse design and implementation. We identify possible methods of handling these limitations and propose techniques for obtaining a quality data warehousing solution despite them. To ground the work in a live example, the National Rural Health Mission (NRHM) project has been taken up. It is a national health-sector project managed by the Indian government across the country. Its complex structure and high volume of data make it an ideal case for data warehouse implementation.
Framework for Developing Simple Architecture Enterprise - FDSAE (csandit)
This article presents a framework for developing enterprise architecture based on the articulation of emerging paradigms for enterprise information architecture development [1]. The first paradigm comes from agile methods and is inspired by the Scrum model, which aims to simplify the complex task of developing quality software; the second comes from process models oriented toward the development of enterprise architectures, such as Zachman and TOGAF, within a model-driven paradigm, together with software architecture reference principles that form the Model-Driven Generation (MDG) paradigm. These approaches are integrated, eventually leading to the formulation and presentation of a Framework for Developing Simple Architecture Enterprise (FDSAE). The goal is to present a simple, portable and understandable framework that enables the modeling and design of business information architecture in any organizational environment. In addition, important aspects of the Unified Modeling Language (UML 2.5) and Business Process Modeling Notation (BPMN) become tools for obtaining the products of the FDSAE framework. This framework is an improved version of the MADAIKE framework [2], developed by the same authors.
A data warehouse generally contains both historical and current data drawn from various data sources. In computing, a data warehouse can be defined as a system created for the analysis and reporting of both types of data. The resulting analysis reports are then used by an organization to make decisions that support its growth. Constructing a data warehouse appears simple: collect data from the data sources into one place (after extraction, transformation and loading). In practice, however, construction involves several issues, such as inconsistent data, logic conflicts, user acceptance, cost, quality, security, stakeholder contradictions, REST alignment and so on. These issues must be overcome, or they will lead to unfortunate consequences affecting the organization's growth. The proposed model tries to resolve issues such as REST alignment and stakeholder contradictions by involving experts from various domains (technical, analytical, decision makers, management representatives) during the initialization phase to better understand the requirements, and by mapping these requirements to data sources during the design phase of the data warehouse.
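The extract-transform-load step mentioned above can be sketched in a few lines. The source systems, field names and cleaning rules below are hypothetical illustrations, not part of any proposed model.

```python
# Minimal ETL sketch: pull rows from two hypothetical source systems,
# normalize them, and load them into a single warehouse "fact table" (a dict).

def extract(sources):
    # Gather raw rows from every source system into one list.
    return [row for source in sources for row in source]

def transform(rows):
    # Resolve inconsistencies: normalize region names, drop incomplete rows.
    cleaned = []
    for row in rows:
        if row.get("amount") is None:
            continue  # incomplete record, excluded from the warehouse
        cleaned.append({"region": row["region"].strip().title(),
                        "amount": float(row["amount"])})
    return cleaned

def load(rows):
    # Aggregate into the warehouse fact table, keyed by region.
    facts = {}
    for row in rows:
        facts[row["region"]] = facts.get(row["region"], 0.0) + row["amount"]
    return facts

crm = [{"region": "north ", "amount": "100"}, {"region": "south", "amount": None}]
erp = [{"region": "North", "amount": "50.5"}]
warehouse = load(transform(extract([crm, erp])))
print(warehouse)  # {'North': 150.5}
```

Note how the transform step is where the "inconsistent data" issues from the paragraph above surface: the two sources spell the same region differently and one row is incomplete.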
Applying systemic methodologies to bridge the gap between a process-oriented ... (Panagiotis Papaioannou)
This work is an application of the Soft Systems Methodology (SSM) to improve an information system to fully support the related process-based management system and help its internal improvement. Design and Control Systemic Methodology (DCSYM) is used as a modelling tool to facilitate conceptual models comparison within the SSM context.
Data Warehouse Development Standardization Framework (DWDSF): A Way to Handle... (IOSRjournaljce)
Why do a large number of data warehousing projects fail? How can such failures be avoided? How can users' expectations be met, and the data analysis needs of business managers fulfilled, by data warehousing solutions? How can data warehousing projects be made successful? These are some of the key questions before the data warehouse research community at present. The literature shows that a large number of data warehousing projects eventually end in failure. In this paper, we design a framework named the Data Warehouse Development Standardization Framework (DWDSF) to help the data warehouse developer community implement effective data warehousing solutions. We critically analysed the literature to find possible reasons for data warehouse project failure, and the framework has been designed to overcome these issues and enable the implementation of successful data warehousing solutions. To verify its usefulness, we applied the guidelines of the DWDSF framework to design and implement a data warehousing solution for the National Rural Health Mission (NRHM) project, which offers various health services throughout the country. The developed solution returns results for all the types of queries business managers want to run; we show the results of some sample queries executed over the implemented data warehouse repository, all of which meet the business managers' query expectations.
Simplifying Model-Based Systems Engineering - an Implementation Journey White... (Alex Rétif)
Model-Based Systems Engineering (MBSE) is perhaps one of the most misunderstood and often abused acronyms in the engineering vernacular. Many companies struggle to understand how it will improve their entire product life-cycle and address the ever-increasing complexity of products. In many companies, executives and middle management experience a lack of understanding regarding the rapid pace of today’s technology and its impact on organizations and processes. Technical practitioners may gain additional insight as they focus their energies on establishing strong MBSE practices. The successful implementation of MBSE includes transformations and enhancements in three key areas: organization, process and technology. This white paper shares proper planning and implementation considerations in adopting an MBSE practice. It provides a high-level view, defines critical components to help success and identifies many problematic areas to avoid in an implementation journey.
FROM PLM TO ERP: A SOFTWARE SYSTEMS ENGINEERING INTEGRATION (ijseajournal)
The present paper focuses on three related issues and their integration: product lifecycle management (PLM), enterprise resource planning (ERP) and manufacturing execution systems (MES). Our work addresses how to integrate all of these in a unified systems engineering framework. Although most companies (about two thirds) claim to have integrated ERP with PLM, we still observe related problems, as also noted by the Aberdeen Group. For actual global data sharing, there are options for also integrating systems best practices toward this objective. This critical study arrives at a solution through reverse engineering, revisiting the requirements engineering steps, and proposes validation and verification for the success factors of such an integration.
Model-Driven Context-Aware Approach to Software Configuration Management: A F... (theijes)
The use of an architecture-centered development process for delivering information technology began with the introduction of client/server based systems. Early client/server and legacy mainframe applications did not provide the architectural flexibility needed to meet the changing business requirements of the modern publishing organization. With the introduction of object-oriented systems, the need for an architecture-centered process became a critical success factor. Object reuse, layered system components, data abstraction, web-based user interfaces, CORBA, and rapid development and deployment processes all provide economic incentives for object technologies. However, adopting the latest object-oriented technology without an adequate understanding of how this technology fits a specific architecture risks the creation of an instant legacy system.
Publishing software systems must be architected to deal with the current and future needs of the business organization. Managing software projects using architecture-centered methodologies must be an intentional step in the process of deploying information systems, not an accidental by-product of the software acquisition and integration process.
Success Factors for Enterprise Systems in the Higher Education Sector: A Case... (inventionjournals)
Many large organisations have moved to Enterprise System solutions in recent years, including the higher education sector (HES). Whilst the benefits of Enterprise systems are well known, the sector has a social mission and characteristics that do not necessarily map to a commercially-focused corporate conceptualization, and assessing the suitability of any particular enterprise solution requires a qualified set of criteria to be applied. This paper looks at an “essential set” of critical success factors (CSFs) relevant to enterprise systems in the HES and applies them in a case study of a large Australian University. The CSFs found to be most relevant to successful ES deployment show differences from CSFs reported in other studies, mainly those in commercial sectors, suggesting a sector based approach be taken to evaluating ES success. We generalise our practical findings to theory, and propose further theory development and validation through confirmatory case studies and specific hypothesis testing.
An Integrated Framework for IT Infrastructure Management by Work Flow Mana... (idescitation)
Information Technology (IT) is one of the fastest-emerging fields in today's Internet-driven world. IT can be defined in various ways, but is broadly considered to encompass the use of computers and telecommunications equipment to store, retrieve, transmit and manipulate data. Infrastructure is the base on which everything else runs, and IT likewise has an infrastructure, which must be managed and maintained properly. For an organization's information technology, infrastructure management (IM) is the management of essential operational components, such as policies, processes, equipment, data, human resources and external contacts, for overall effectiveness.
In this paper, we propose a methodology to manage IT infrastructure more effectively. Our methodology uses a tree-structured architecture to manage the infrastructure with less manual effort. The process of managing the infrastructure is discussed, together with the necessary steps and an algorithm. The paper also describes the role of workflow management in IT infrastructure management.
ML operations comprise a set of practices and methods specifically crafted for streamlined management of the complete lifecycle of machine learning models in production environments. It encompasses the iterative process of model development, deployment, monitoring, maintenance and integrating the model into operational systems, ensuring reliability, scalability, and performance. In certain cases, ML operations are solely employed for deploying machine learning models.
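The iterative lifecycle described above (develop, deploy, monitor, retrain) can be sketched in a few lines. The accuracy simulation, threshold, and function names are simplified assumptions for illustration only.

```python
def train_model(data):
    # Stand-in for real training; "accuracy" is simulated from data volume.
    return {"version": len(data), "accuracy": 0.80 + 0.05 * len(data)}

def run_lifecycle(batches, threshold=0.90):
    """Develop and deploy a model, monitor it against each incoming data
    batch, and retrain (a new version) whenever quality degrades."""
    deployed, seen, versions = None, [], []
    for batch in batches:
        seen.append(batch)
        if deployed is None or deployed["accuracy"] < threshold:
            deployed = train_model(seen)  # develop + deploy a new version
            versions.append(deployed["version"])
    return versions

print(run_lifecycle(["day1", "day2", "day3"]))  # [1, 2]
```

Real ML-operations platforms automate exactly this loop, adding model registries, deployment targets, and monitoring dashboards around it.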
Section 1: PROJECT INTRODUCTION
Project Deliverable 1: Project Plan Inception
CIS 499 – Information Systems Capstone
Background
In the last two years, the ACME Company has experienced continued growth, and this growth is expected to continue in the very near future. Specifically, the company is expected to experience 60% growth in the next eighteen months. This rate of growth has presented new challenges: the company now has to redesign its information systems for the larger office space it occupies, and the continued growth has highlighted the need to equip the company to deal with more data and to ensure safety and security for its clients. The ACME Company is currently valued at $25 million but is expected to experience significant growth in the future.
Type of Business
The ACME Company collects data using Web analytics and combines it with operational systems data. Increasingly, businesses have appreciated the competitive edge gained by analyzing market data. However, the successful use of data in decision making is a long process that has greatly influenced the growth of information systems. Major steps in this process include collecting information and interpreting its significance. This is intended to compare the external and internal environments of a business and propose better practices that would benefit the business as a whole.
ACME based its information technology on a hybrid model in which some systems are hosted and others run in-house. This approach was initially adopted to minimize costs. However, much has changed in the business, and major changes are now necessary.
Skilled Information Systems Personnel
At the moment, there are only four employees in the company dedicated to the Information Technology department. ACME has adopted a hybrid solution to information technology in which much of the systems used by the company are hosted by other entities. This was believed to help cut costs. As the business has continued to grow, its information technology needs have expanded, and the systems must be redesigned to meet its current obligations.
The personnel at the company will need to be trained to use any other systems introduced at the workplace. Although all the workers are trained information technology experts, it will be important to involve them in the development of the new design to facilitate its effectiveness. This is primarily intended to ensure that all the qualified personnel at the organization are well-informed about the information technology changes occurring at the workplace.
Types of Data
ACME collects web analytics and combines it with operational systems data. Web analytics includes all the data tha.
5 Key Data Management Trends of 2022 as observed by a data practitioner. Covers trends on data architecture, data storage, data platforms, and data operations.
Software plays a critical role in businesses, governments, and societies. Improving the performance and quality of software is an important goal of software engineering. Mining data has recently emerged as a promising means to meet this goal, due to two main trends: the increasing abundance of such data and its demonstrated helpfulness in solving numerous real-world problems. Poor performance costs the software industry millions of dollars annually in the form of lost revenue, hardware costs, damaged customer relations, and decreased productivity. Performance analysis and evaluation through data mining techniques will yield performance-improvement suggestions for software developers.
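The idea above can be sketched minimally: mine execution-timing data to suggest where developers should look first. The log format and latency threshold are illustrative assumptions, not a method from the abstract.

```python
from collections import defaultdict

def slow_functions(timing_log, threshold_ms=100.0):
    """Return functions whose mean recorded latency exceeds the threshold,
    worst first: a simple, data-driven performance-improvement suggestion."""
    totals = defaultdict(lambda: [0.0, 0])   # func -> [sum_ms, count]
    for func, ms in timing_log:
        totals[func][0] += ms
        totals[func][1] += 1
    means = {f: s / n for f, (s, n) in totals.items()}
    return sorted((f for f, m in means.items() if m > threshold_ms),
                  key=lambda f: -means[f])

log = [("parse", 40.0), ("render", 180.0), ("render", 220.0), ("save", 120.0)]
print(slow_functions(log))  # ['render', 'save']
```

Real mining approaches add statistical tests and trend detection on top of this aggregation, but the core pattern of aggregating raw execution data into ranked suggestions is the same.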
Table of Contents
Introduction
Need for technology-based solutions
Infrastructure Automation Tools
Implementation
The Central Theory: Organizational Management and Memory
Organizational Management
Organizational Memory
Need of Data Archival and Storage
Data Storage
Types of Storage
Data Archival
Data Archival Process
Archiving principles
Data Management Systems
Enterprise Resource Planning Systems (ERP systems) for data integration
Microservices
Properties of Monolithic
Conclusion
References
Introduction
Technology is considered vital in today's globalized world. In business especially, information technology has both quantifiable and unquantifiable benefits. It is essential for communicating with customers and stakeholders regularly, quickly, and clearly. It also helps in implementing business operations efficiently and effectively. A business with robust technological capacity creates new opportunities to stay ahead of the competition and grow (Rangus & Slavec, 2017). Consequently, it also enables dynamic teams that can interact from anywhere in the world. Furthermore, technology aids in understanding business needs and in managing and securing confidential and critical data.

Need for technology-based solutions
Organizations need data recovery and active, continuous data processing throughout the data life cycle, given its significance and utility for research, scientific, and educational purposes (Bukari Zakaria & Mamman, 2014). The recent acknowledgment that information is an organization's key asset, decisively affecting its profitability, has contributed to several comprehensive corporate-memory approaches. Corporate memory and organizational learning ability are key sources of competitive advantage (C. Priya, 2011). Hence the main obstacle is the effectiveness of information management while ensuring the consistency of training facilities.
Organizations need robust technology-based solutions. Software developers have therefore developed and deployed various architectures over time that make software products resource-effective and usable. Some architectures implement their frameworks in a single layer, others in multiple layers or tiers (Suresh, 2012). It is understood that the efficiency of ERP implementations is influenced by reaching or exceeding a certain degree of capability in the volume of data to process (Johansson, 2012). In the last couple of decades, new architectures have been created that offer optimal solutions. Thus, the microservices architecture is gaining ground and becoming part of technological, financial, and advertising decision-making. Microservices replace monolithic, tightly coupled, system-focused applications with independently operating services (Vrîncianu, Anica-Popa, & Anica-Popa, 2009).

Infrastructure Automation Tools
One issue as microservices are applied is that any s ...
The International Journal of Engineering & Science is aimed at providing a platform for researchers, engineers, scientists, or educators to publish their original research results, to exchange new ideas, to disseminate information in innovative designs, engineering experiences and technological skills. It is also the Journal's objective to promote engineering and technology education. All papers submitted to the Journal will be blind peer-reviewed. Only original articles will be published.
The papers for publication in The International Journal of Engineering & Science are selected through rigorous peer reviews to ensure originality, timeliness, relevance, and readability.
Leveraging shared IT and Business resources to maintain PCI compliance - Shiva Hullavarad
Given the serious security risks to information technology (IT) assets, managing those risks effectively is an essential task for the University and its departments. The process will benefit both the individual departments and the University as a whole. It is important that management understand what risks exist in their IT environment and how those risks can be reduced or eliminated. In an increasingly competitive business environment, organizations must develop capabilities that will provide them with a sustainable competitive advantage. Universities and colleges, big and small, face a continued threat of theft of data ranging from financial, health, and intellectual-property records to other sensitive information.
In such a high-risk environment, it is imperative for universities and colleges to share and collaborate on ideas, methods, and technologies to learn how these risks can be addressed. This talk will provide insights on how to identify areas for cross-collaboration to stay compliant and reduce risk. The talk also outlines the synergistic efforts of the University of Alaska and Texas A&M.
A vulnerability is a weakness in an application, or a design flaw, that an attacker can exploit for harm or financial benefit. Though it is practically impossible to have a vulnerability-free system, one can implement tools to identify the nature of vulnerabilities and mitigate the potential risk they pose. As an institution, it is very important for business managers, administrators, and IT security personnel to pay attention to security warnings. The talk will identify types, sources, and mitigation of external and internal threats, and will review Vulnerability Assessment and Penetration Testing (VAPT) tools available in the market and their benefits. Presenters will engage the audience in an interactive discussion on the available tools to detect vulnerabilities and threats and the steps needed to mitigate them.
Enterprise Content Management (ECM) solutions provide robust functionality to control and analyze information. ECM solutions help reduce search times, manage data, and enable institutions to meet regulatory compliance. The correlation between the impact on a business process and the ECM implementation stage is demonstrated and shown to follow the hypothesis reported by Reimer (2002). The objectives of this article are to (1) present a typical ECM architecture, (2) identify key challenges in implementation, and (3) outline an implementation roadmap strategy.
Please cite this article in press as: Hullavarad, S., et al. Enterprise Content Management solutions—Roadmap strategy and implementation challenges. International Journal of Information Management (2015), http://dx.doi.org/10.1016/j.ijinfomgt.2014.12.008
Table 1
ECM development and pioneering research.

Author(s) | Focus/main theme
Reimer (2002) | ECM basic structure and fundamentals
Rockley, Kastur, and Manning (2003) (Meicher, 2013) | Development of unified content methodology
Smith and McKeen (2003) (Meicher, 2013) | ECM information governance, benefits, content stewardship
Nordheim and Paivarinta (2006) (Meicher, 2013) | Strategic development and ECM implementation method
Nordheim and Paivarinta (2004) (Meicher, 2013) | ECM customization
O’Callaghan and Smits (2005) (Meicher, 2013) | ECM development
Munkvold (2006) (Meicher, 2013) | Improvement opportunities for ECM
Paivarinta and Munkvold (2005) | ECM impact, objectives, content management, enterprise information architecture
process; and covering legal liability. Such quantitative estimations tend to be complicated due to various factors agreed to by industry and academic peers (Irani, 2002). To our knowledge, after reviewing the existing literature, this is the first empirical evidence that reinforces Reimer's hypothesis (Reimer, 2002). Research in ECM concepts, including knowledge, data, and information resource management, and compliance, is still a nascent field (Brocke, Simons, & Cleven, 2011). Brocke (2007) provides a good timeline of ECM research and development in the field. Table 1 (adapted from Brocke, 2007) shows selected contributions in ECM development.
2. ECM architecture
An ECM solution typically consists of four essential components (Fig. 1): (1) User interface – the process through which information (digital or non-digital) is brought into ECM. This is accomplished either by converting hard-copy documents through image-capture scanning or by uploading an electronic version of the information into ECM. The information consists of documents in hard-copy or digital format (generated by Microsoft/Mac applications or by Google Documents).
(2) Information governance – This is a key ECM functionality that separates ECM from other digital archival systems. The incoming information is designated at this stage as an official record. ECM solutions offer the capability to assign a record functional-area-specific records and retention rules. ECM automatically deletes such records after the records-retention duration, which thus provides regulatory compliance.
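The retention behaviour described above can be sketched minimally: each record carries a functional-area retention rule, and records past their retention window are purged automatically. The rules, periods, and field names below are invented examples, not any vendor's actual schema.

```python
from datetime import date, timedelta

# Hypothetical functional-area retention rules, expressed in days.
RETENTION_RULES = {"admissions": 365 * 5, "finance": 365 * 7}

def purge_expired(records, today):
    """Keep only records still inside their retention window."""
    kept = []
    for rec in records:
        limit = timedelta(days=RETENTION_RULES[rec["area"]])
        if today - rec["declared"] <= limit:
            kept.append(rec)
    return kept

records = [
    {"id": 1, "area": "admissions", "declared": date(2008, 1, 15)},
    {"id": 2, "area": "finance", "declared": date(2013, 6, 1)},
]
survivors = purge_expired(records, today=date(2015, 1, 1))
print([r["id"] for r in survivors])  # [2]
```

In a production ECM the purge would also write an audit-trail entry per deletion, since demonstrating the deletion is itself part of regulatory compliance.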
(3) Attributes – ECM is equipped with features meant to achieve specific business purposes. Data archive provides a systematic approach to archiving and retrieving information using select keywords; Intelligent data capture converts image-based information to a computer-readable format by optical character recognition; Workflow is an automated process, based on preconfigured logic, through which information flows in stages; Integration/data processing is a built-in information-management capability to connect different data streams; and Information disposal affixes a deletion time to certain information, applied automatically so that documents are deleted in compliance.
(4) Repository – ECM systems provide a secure approach to storing information for on-demand access. There is a variety of information-storage protocols that allow information to be stored on arrayed disks for enhanced data security. The repositories can be on-site or in the cloud (cloud storage is discussed in a following section).

Fig. 1. Typical ECM architecture in institutions of higher education.

Fig. 2. ECM road map strategy.
3. ECM implementation stages
The primary goal of an ECM implementation roadmap strategy (Fig. 2) is to specify the information governance for the life cycle of the information, based on establishing an amalgamated and interoperable space and reducing the content-classification burden for the end user. A well-developed ECM implementation strategy covers the following:
• Encompasses the majority of records, both paper and electronic, unstructured and structured.
• Meets the needs of a wide variety of stakeholders throughout the organization.
• Enables the organization to respond to legal discovery.
• Automates business processes, removing the inconsistency of manual processes.
• Stays up to date with respect to technology.

AIIM recommends the following steps for success in implementing an ECM system: (1) Concept of Operations, (2) Information Governance Framework, (3) Business & System Requirements, (4) Classification Scheme, (5) User Interface & Environment, (6) IT Infrastructure, (7) Roll-out, and (8) Post-implementation. These can be broadly grouped under three categories: (1) Organizational Requirements, (2) Access and Collaboration Requirements, and (3) Functional Requirements (Fig. 3).
3.1. Roadmap strategy
Long before considering an ECM solution, organizations need to assess their business needs, which are inherent and specific to the nature of the business and the culture of the organization. The business needs should sufficiently cover the following aspects:
• Assessment of existing technology infrastructure/environment and readiness.
• Change management.
• Immediate and long-term training considerations.
• Information security and alignment with regulatory compliance.
• Taxonomy and metadata requirements for data classification and retrieval.
• Records management and information governance.
• Storage capacity needs – on premise and cloud.
• Disaster recovery strategy.

These needs include identification of tactical benefits, including improving internal and external collaboration, enhancing content quality and maintaining consistency, standardizing workflows, producing organizational metadata attached to content objects, and provisioning for regulatory requirements.
An information governance team comprising representatives from all stakeholder groups should act as a catalyst to enforce consistent governing policies, such as the adoption of an organization file plan or classification scheme; use of taxonomy; and application of retention, disposal, and archival rules. A minimal gap between perceived benefits and user adoption is a clear indication of a well-planned roadmap.
3.2. ECM design and development
ECM solutions are not plug-and-play and can be customized based on application data and content. Every business process is different, with varying inflows of information originating internally and externally. Although ECM solutions for the broader categories of business types (viz., healthcare, finance, education, insurance, research & development) provide some basic functionality specific to an industry, a certain degree of product customization is necessary.

Fig. 3. Organization requirements and access controls. (The figure groups the requirements as: Organizational Requirements – knowledge of industry best practices; organization records, information governance, and management; regulatory compliance. Access & Collaboration Requirements – easy access and retrieval of information; defined access rights and privileges based on roles; information sharing, automation, and workflow. Functional Requirements – easy access to records and referenced information/documents; reduction in information overload to end users; document version control; training on existing services.)

The likelihood of ECM success depends heavily on the outcome of connected workflow execution order and process schedules. The following features are worthy of consideration in designing the ECM:
• Documents are routed in a standard, controlled, and prompt manner.
• Accommodate exceptions by assigning specific users rights to add or exempt stages on an ad hoc basis.
• Forward documents without delay to each successive phase.
• Allow documents to be prioritized in each queue. If no priority is assigned, the documents are sorted by the date and time they enter the lifecycle.
• Monitor and measure the time to complete a process.
• Audit queues periodically for quality assurance.
• Processes can be easily added or adjusted at the document, process, group, or enterprise level by specified users or administrators.
• Customization of both the routing and the user interface without programming, via point-and-click configuration.
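The queue behaviour listed above can be sketched as a small priority queue: documents with an assigned priority come first, and unprioritized documents fall back to the date and time they entered the lifecycle. Class and field names are illustrative assumptions.

```python
import heapq
from itertools import count

class DocumentQueue:
    """Priority first (lower number = more urgent); ties and unprioritized
    documents are served in arrival order."""
    def __init__(self):
        self._heap, self._arrival = [], count()

    def route(self, doc_id, priority=None):
        # Unprioritized documents sort after any explicit priority.
        key = priority if priority is not None else float("inf")
        heapq.heappush(self._heap, (key, next(self._arrival), doc_id))

    def next_document(self):
        return heapq.heappop(self._heap)[2]

q = DocumentQueue()
q.route("transcript-1")
q.route("appeal-7", priority=1)
q.route("transcript-2")
print([q.next_document() for _ in range(3)])
# ['appeal-7', 'transcript-1', 'transcript-2']
```

The arrival counter is the design point: it encodes "date and time of entry" as a stable tiebreaker, so routing stays deterministic even when many documents share a priority.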
3.3. ECM deployment and training
The detailed deployment and validation plan is critical to achieving timely implementation of ECM. The deployment should be piloted in a test environment to surface any process-related bottlenecks before ECM is migrated to the production environment. There are industry standards on stress testing key functionalities, such as large-data handling or varied types of data inflow; see Table 2 for aid in identifying possible missing features during the course of deployment.

Table 2
Stress test features for validation.

Stress test type | Process
Volume handling | Assume 50% of employees log in as users to ECM, handling at least 60% of total data/information volume. Repeat for 2 different scenarios.
Processing power | Based on information load, write at least 250 scanned pages per minute. Response to user input – retrieval rate. Recovery from server failure within 10 min.
Test configuration | Multiple physical servers, hosted on VMWare and running VMWare for failover. Dynamic reallocation of computing resources across cluster.
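Two of the Table 2 thresholds (at least 250 scanned pages per minute; recovery from server failure within 10 minutes) lend themselves to an automated check of measured stress-test results. The measurement dictionary and metric names below are invented examples.

```python
# Thresholds taken from Table 2; each is a (kind, limit) pair, where
# "min" means the measured value must be at least the limit and
# "max" means it must not exceed the limit.
THRESHOLDS = {
    "scan_pages_per_min": ("min", 250),  # write >= 250 scanned pages/minute
    "recovery_minutes":   ("max", 10),   # recover from server failure <= 10 min
}

def validate(measured):
    """Return the list of stress-test checks that failed."""
    failures = []
    for metric, (kind, limit) in THRESHOLDS.items():
        value = measured[metric]
        ok = value >= limit if kind == "min" else value <= limit
        if not ok:
            failures.append(metric)
    return failures

print(validate({"scan_pages_per_min": 310, "recovery_minutes": 14}))
# ['recovery_minutes']
```

Running such a check after each pilot scenario turns the table's criteria into a pass/fail gate before migration to production.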
Training of personnel, and keeping pace with upgrades on the deployed ECM solution, should be part of product support. Some ECM providers require that power users, who serve in an ECM administrative capacity, receive standardized testing certification. It takes extra effort by an ECM implementation team to develop training methods carefully customized to user preference (video vs. PowerPoint presentation) to engage personnel through stages of testing, upgrades, and, later, extending other ECM functionalities.
Implementation is often hindered by incompatibility between the ECM platform and the existing technology environment. The ECM solution should be compatible with the existing software applications in use for routine job functions, such as editing documents, storing data files, searching, electronic record creation, and preservation tools. In many cases, projects fail dismally because of a poor (or no) initial needs assessment that does not include a broad stakeholder pool, a weak deployment plan, or lack of executive support.
The success of an ECM implementation lies in articulating the perceived benefits and potential system efficiency. To achieve these benefits, organizations must involve all stakeholders in selecting the ECM product and vendor, developing the implementation plan, deploying the plan, and training users. Mapping the organization's information-management needs, culture, and business processes, and then threading those into an ECM implementation roadmap strategy, will enable organizations to weather the storm of big-data transformation and remain in compliance.
3.4. ECM solution implementation: lessons learned
• Identify areas and realistic needs, and then configure ECM solutions to meet the functional-area needs, not the needs of the ECM solution – let the functional need drive the technology, not the other way around.
• Include all stakeholders early in the process and get buy-in.
• Analyze the content first to arrive at the decision to choose ECM technology.
• Be prepared to accept that launching an ECM is just the beginning of a long process that involves training, routine upgrades, and certifications.
• Define the quantitative expected outcomes for the organization.
• Delineate security access controls, roles, and responsibilities.
• Document access and approval procedures.
Fig. 4. Evolution of business efficiency through ECM deployment stages.
4. Business impact through ECM implementation stages
Fig. 4 shows the ECM implementation stages and their impact on improving business efficiency. Reimer (2002) suggested a 3-stage process to monitor and fine-tune the ECM implementation process, involving a transition, as time progresses, from (1) physical handling, through (2) partial (physical and electronic) handling, to (3) complete electronic processing. In this study, we compared the typical impact on the student-enrollment automation process with the theoretical correlation provided by Reimer between business impact and the ECM implementation process (Meicher, 2013). The data was collected from published business case studies in higher education, where institutions implemented an automated ECM process at the various steps of student enrollment. The business-impact curve for a typical process shows a slower adoption rate initially, noted as a lag from the theoretical pattern predicted by Reimer. However, by the end of the second phase of the implementation, the automation accelerates and overtakes the theoretical curve, as noted by a crossover in Fig. 4. The amount of lag depends on the business process and can be minimized by analyzing the type of information, implementing a suitable ECM platform, identifying the challenges, and training personnel.
5. ECM in the cloud: security and compliance
The idea of the cloud as an ever-elastic and infinitely available storage facility for all content is gaining momentum, especially in the collaborative academic world. There are numerous advantages to cloud ECM (notably for mobile access, as compared to on-premises platforms). The important criteria when deciding on cloud ECM include data security, round-the-clock access to data, and a detailed audit trail, while at the same time providing sufficient functionality to support and optimize business processes as well as an on-premises ECM does.

Cloud ECM can be installed in a very short period of time compared to on-premises ECM, at a fraction of the cost, with no hardware or software to install (approximately 2–3 months to implement, with no capital or annual maintenance costs on the physical infrastructure). Cloud solutions are highly scalable, efficient, and cost-effective, and hence the cost of ownership is lower. It is no surprise that many organizations have now shifted to cloud-based solutions for managing email communications, documents, and scheduling; these make use of the virtual storage and access capabilities of cloud storage by decoupling the content from physical infrastructure and reducing the burden on IT departments. Microsoft and Google Apps provide versatile suites of applications for online sharing of documents and data, through SharePoint/Lync and Google Docs, respectively. However, the major risks and challenges associated with cloud ECM are security and compliance. The transition from on-premises to cloud might require additional user training, password identification and logins, group filing systems, and integration with existing enterprise, facility-management, and maintenance systems.
An important aspect to consider when deploying cloud ECM is ensuring the safety of the cloud storage's physical location, the content management, and the credibility of the personnel managing the servers. Cloud ECM information governance should cover cloud, network, security, data-center architecture, in-transit connections, built-in redundancy, and data-replication aspects. The cloud ECM vendor should be able to demonstrate a proven record of security and compliance. The real challenge is to have a clear idea of the physical location and contents of the cloud storage: a cloud's routers, servers, and technical data-storage devices are typically located across multiple systems across the globe.
6. Conclusion
ECM solutions offer robust functionality in handling information regardless of its origin, minimize operating costs, improve customer service, and minimize risk. Understanding the true nature of data and information streams is critical to automating the process. The ECM implementation stages play a key role in business impact and can be accelerated by scoping the type of enterprise content, the type of architecture, and user training. The lag in implementation can be minimized by analyzing the type of information, implementing a suitable ECM platform, identifying the challenges, and training personnel.
Disclaimer
The views and opinions expressed in this article are solely those
of the authors. No reliance should be placed upon this article for
making legal, business, or other important decisions. The University
of Alaska and the authors do not endorse any commercial entity; the
study reported in this paper is for education and knowledge
dissemination purposes only.
References
Bentley, K., & Young, P. (2000). Knowledge work and telework: An exploratory study.
Internet Research, 10(4), 346–356.
Brocke, J. (2007). Design principles for reference modeling: reusing information
models by means of aggregation, specialization, instantiation and analogy. In
P. Fettke, & P. Loos (Eds.), Modeling for business systems analysis. (pp. 47–75).
London: IGI Publishing.
Brocke, J., Simons, A., & Cleven, A. (2011). Towards a business process-oriented
approach to enterprise content management: The ECM-blueprinting framework.
Information Systems and e-Business Management, 9(4), 475–496.
Engel, E., Hayes, R. M., & Wang, X. (2007). The Sarbanes–Oxley Act and
firms’ going-private decisions. Journal of Accounting and Economics, 44(1–2),
116–145.
Grahlmann, K. R. (2010). Impacts of implementing enterprise content management
systems, ECIS2010-0288.R1. In 18th European Conference on Information Systems.
Irani, Z. (2002). Information systems evaluation: Navigating through the problem
domain. Information & Management, 40(1), 11–24.
Meicher, L. (2013). Madison Area Technical College, https://www.onbase.
com/community/iug/higher education/m/2013mwuserforumforhe/13599.aspx
Paivarinta, T., & Munkvold, B. E. (2005). Enterprise content management: An inte-
grated perspective on information management. In Proceedings of the 38th
Hawaii international conference on system sciences Waikoloa, HI, USA, January
3–6.
Please cite this article in press as: Hullavarad, S., et al. Enterprise Content Management solutions—Roadmap strategy and implementation
challenges. International Journal of Information Management (2015), http://dx.doi.org/10.1016/j.ijinfomgt.2014.12.008
Reimer, J. A. (2002). Enterprise content management. Datenbank-Spektrum, 2(4),
17–35.
Tyrvainen, P., Paivarinta, T., Salminen, A., & Iivari, J. (2006). Characterizing the evolv-
ing research on enterprise content management. European Journal of Information
Systems, 15(6), 627–634.
Shiva Hullavarad is Enterprise Content and Electronics Records Administrator for
the University of Alaska System. He holds four university degrees and is an
ECM/ERM practitioner. He is responsible for maintaining the ECM platform and
provides oversight for the implementation and management of the system. Shiva reviews and
approves all ECM/ERM process changes and implementation requests submitted
by campus administrators. He has authored 81 technical papers and presented at
national conferences.
Russell O’Hare is the Chief Records Officer for the University of Alaska System. He
holds four university degrees and is a certified records manager. He is responsible
for the university records information compliance program, approves university
retention and disposition schedules, and is the university Red Flag program admin-
istrator. He oversees the statewide records center, micrographic, and enterprise
content management offices.
Ashok K. Roy is Vice President for Finance & Administration/CFO of the University
of Alaska System & Associate Professor of Business Administration at University of
Alaska Fairbanks. He holds six university degrees and five professional certifica-
tions. Dr. Roy has also authored over 83 publications in trade and academic journals
including chapters in two encyclopedias.