This document provides an introduction to a study on the development of the Z39.50 standard for information retrieval. The study had two main goals: 1) to document the development of Z39.50 over 16 years, and 2) to develop a holistic understanding of the standards development process through analyzing Z39.50. The study used a case study methodology with qualitative data collection including documents, interviews, and observations. A preliminary conceptual model guided the initial research. The study was limited to investigating Z39.50 and did not attempt to generalize to all standards development processes. The report is structured to provide background, methodology, a historical reconstruction of Z39.50, findings from analyzing it, and implications.
Chapter 1
Introduction to the Study
1.1. Introduction
The convergence of computer and communications technologies of the late 20th century has
profoundly affected information creation, distribution, access, and use. Vast networks of fiber
optic, copper, wireless, and other transmission media interconnect millions of computers running
a wide range of applications. The resulting internetworked information infrastructure provides a
“framework in which communications networks support higher–level services for human
communication and access to information” (National Research Council, 1994b, p. 22).
Achieving even primitive interconnection of disparate computer systems requires agreements on
communications protocols and many other technical specifications. Often these protocols and
technical specifications are the result of processes and procedures that bestow upon the technical
specifications the privileged status of a standard. Robust interconnection and information
exchange between systems require a broad range of standards: standards that specify data
formats, network interfaces, operating systems, storage devices, computer–to–computer
communications protocols, and other aspects of this complex information handling and processing
environment. The first challenge is to articulate the problems and requirements that systems must
meet to exchange information; the second is to specify standards that support such information
exchange.
National and international initiatives are underway to deploy the National Information
Infrastructure (NII) and its international counterpart, the Global Information Infrastructure (GII).
Visions of these massive infrastructures call for interoperable and open systems that support a
variety of applications and in which digital data passes easily and seamlessly across networks and
technologies (Information Infrastructure Task Force, 1993). Technical standards that address
interfaces and applications are critical to building the NII and GII (see for example, Computer
Systems Policy Project, 1994; Cross Industry Working Team, 1994; Information Infrastructure
Task Force, 1993; National Institute of Standards and Technology, 1994; National Research
Council, 1996).
Technical standards are necessary for meaningful exchange of information between computer
systems. The availability of standards depends on workable systems and processes for developing
those standards. Currently, organizations and individuals use various mechanisms to develop
information technology standards. Among these mechanisms are the system of voluntary
consensus standards production governed by the American National Standards Institute (ANSI),
various consortia (e.g., World Wide Web Consortium), and groups such as the Internet
Engineering Task Force.
Evidence in the literature, however, points to major difficulties and problems in producing
information technology standards. As one example, a report from the Organisation for Economic
Co–operation and Development (Committee for Information, Computers and Communications
Policy, 1991, p. 7) stated:
the acceleration of technological advances in the IT [information technology] area has
put the standardization system under pressures which may often be contradictory: the
production of standards become more urgent, but consensus between the interested
groups may be more difficult to achieve because of uncertainties and the magnitude of
vested interests. Furthermore, caution is required to formulate standards which will
evolve in line with future developments.
Although there is a critical need for technical standards, understanding the processes for creating
these standards appears deficient.
Little in the way of systematic, empirical research on the development of information technology
standards exists. A number of writers have pointed to reasons why these essential standards are
difficult to create. Beyond the descriptions of the official standards procedures and anecdotal
evidence about some of the problems, however, we have little understanding of the entire process of
“standards development.” A number of researchers have conducted empirical research, but their
efforts are often limited to particular aspects of standards development; even so, they have
contributed insights into economic factors, formal rules and procedures, and leadership in
standards committees (e.g.,
Farrell & Saloner, 1985; Lehr, 1992; Spring, et al., 1995; Weiss, 1993). None of the research has
attempted an empirically– or theoretically–based, holistic understanding of an entire standards
development process.
1.2. Study Goals, Objectives, and Research Questions
This study addresses the lack of a holistic understanding of the complex of activities, entities,
processes, and forces that comprise standards development. The basic question motivating the
researcher was: “Why is the development of voluntary consensus standards for information
technology problematic?” Addressing this question implied a need to develop an understanding of
the processes by which such standards are created. There are calls for improving information
technology standards development. Without an empirical and conceptual understanding of the
complex and multifaceted standards development process, however, changes and reforms to
existing processes may turn out to be “solutions” to the “wrong problems.”
To develop an understanding of standards development, the study investigated the creation of one
specific standard currently used in the distributed, networked information environment.
ANSI/NISO Z39.50: Information Retrieval (Z39.50) Application Service Definition and Protocol
Specification (National Information Standards Organization, 1995) is an American National
Standard that defines a protocol by which two computers communicate for the purpose of
information retrieval. Specifically, Z39.50 supports information retrieval in a distributed client and
server environment where a computer operating as a client conducts information retrieval transactions
on a remote computer acting as an information server. It provides a consistent view of information
from a wide variety of sources and offers client implementors the capability to integrate information
from a range of databases and servers (for background information on Z39.50, see Moen, 1995).
A case study of Z39.50 provided an opportunity to investigate a long–term standards development
activity. Formal Z39.50 development began in 1979, and this study follows the standard’s evolution
through 1995. The focus of this study was the complex phenomenon of standards development
that includes activities, procedures, processes, and various entities, constraints, and other forces
operating at macro– and micro–levels.
Standards development does not occur in a vacuum. At the macro–level of standards
development are policy, organizational, political, economic, and other concerns that may influence
the development of a particular standard (for an overview of these concerns, see Committee for
Information, Computers and Communications Policy, 1991). A holistic understanding requires
that a macro–level understanding be complemented by an examination of the activities and
processes involved in writing and producing standards (i.e., the micro–level).
Two complementary goals, a set of objectives, and three specific research questions provided
overall guidance and direction for this study. Figure 1–1 lists the study’s goals, objectives, and
research questions.
___________________________________________________________
Figure 1–1
Study Goals, Objectives, and Research Questions
Study Goals
• G1: Document the development of Z39.50
• G2: Develop a holistic understanding of Z39.50 development
Study Objectives
• O1: Identify and describe the context within which Z39.50 development
occurred and discover the important factors that enabled or constrained its
development
• O2: Revise and refine the preliminary conceptual model of information
technology standards development to reflect Z39.50 development
• O3: Develop working hypotheses from Z39.50 development to test and explore
in other information technology standards development efforts
Research Questions
• RQ1: What are the activities, entities, processes, and forces, and the contexts
that influenced, enabled, or constrained Z39.50 development?
• RQ2: What are the components of a systems–theoretic conceptual model that
reflects Z39.50 development?
• RQ3: What working hypotheses are warranted based on Z39.50 development to
guide future research?
___________________________________________________________
The first Study Goal (G1) was to document the development of Z39.50. Chapter 4 presents a
descriptive narrative detailing the various events, activities, and people involved with Z39.50
development over a 16–year period beginning in 1979 with the establishment of the formal
standards committee and ending with the approval in 1995 of revised Version 3 of the standard.
The second Study Goal (G2) was to develop a holistic understanding of Z39.50 development.
Chapter 5 analyzes the reconstruction of Z39.50’s historical development presented in Chapter 4.
The researcher used a systems–theoretic conceptual model as an analytical tool to identify
essential features and factors that shaped Z39.50 evolution. He also revised the model, based on
the data, to reflect Z39.50 evolution.
1.3. Justification for the Research and Study Benefits
Technical standards are needed for achieving the visions of national and global information
infrastructures and to realize effective and efficient information exchange within these
infrastructures. While many commentators and participants criticize the formal voluntary
consensus approach to standards development, there have been few models or methodologies
proposed for conducting research on standards development. This study offers a methodological
approach that allows the exploration, description, and analysis of the complex social and technical
interaction that comprises standards development. The study demonstrates the utility of a
qualitative/naturalistic research approach and case study method to gain a holistic understanding.
The study provides important insights into the difficulty and complexity of developing a voluntary
consensus standard for information technology. The analysis of Z39.50 development presented in
this study demonstrates the utility of a systems model as an analytical tool. The researcher
concluded, however, that the systems–based preliminary conceptual model (see Chapter 2) was
inadequate in representing the dynamic and evolutionary character of Z39.50. The study resulted
in a revised conceptual model that accounts for Z39.50 evolution and proposes two conceptual
extensions to the systems approach that may have utility in subsequent research on standards
development.
Z39.50 was an appropriate standard for this study. Its development was a long–term effort. The
researcher investigated Z39.50 development activities occurring first in the formal setting of the
National Information Standards Organization (NISO) and since 1990, occurring within an emergent,
informal, and non–sanctioned structure (i.e., the Z39.50 Implementors Group, ZIG). The standard’s
development since 1990 through the ZIG and the Z39.50 Maintenance Agency offers an example of a
major change in the standards development process to deal pragmatically with the problems and
requirements of implementors. The researcher also views Z39.50 as a revelatory case to the extent that
it provided an opportunity to discover and describe problems that may be common to other
information technology standards development efforts.
This study contributes to an understanding of the formal voluntary consensus standards
development process, since Z39.50 development occurred under the auspices of NISO, an ANSI–
accredited standards organization. Such understanding is necessary if standards organizations are
to make systematic and informed changes to their processes and procedures rather than changing
them in haphazard and reactive ways. Without understanding the complex phenomenon of standards
development, reforms may be introduced (e.g., using electronic mail with the hope that standards
development can proceed more quickly) when in fact the actual problems may be of an entirely
different order (e.g., not fully acknowledging that goals of standards development efforts may
change over time). The study provides an empirical basis for organizations and individuals
interested in identifying factors that may affect standards development. The study’s findings
highlight the complex interaction of forces and entities that may be involved in standards efforts,
and they contribute to ongoing discussions concerning appropriate processes to produce timely
and successful information infrastructure standards.
1.4. Study Methodology
Given the lack of empirical research on standards development, the researcher intended this study
as exploratory and descriptive research. Chapter 3 details the study’s research method and
activities. The study focused on discovery, description, and a holistic understanding of Z39.50
during its 16–year evolution. The study employed a case study methodology that provided a
methodological structure for a detailed examination and documentation of the activities, entities,
processes, and forces, and their interactions during the development of the standard.
The study used a naturalistic research approach (Guba, 1990; Guba & Lincoln, 1994; Lincoln &
Guba, 1985). Naturalistic research is a qualitative research approach that “requires not the
isolation of factual fragments from society, perhaps in order to analyze them quantitatively, but a
participative immersion in the context under study and a healthy respect for the richness, density,
and ambiguity of social life” (Sutton, 1993, p. 416). The focus of naturalistic inquiry is on
understanding and not generalization, a focus on the idiographic (i.e., the particulars of the case)
rather than the nomothetic (i.e., lawlike generalizations) (Lincoln & Guba, 1985, p. 42).
The study used primarily qualitative data collection strategies, although when appropriate,
quantitative data were collected. The study’s sources of data included:
• Documentary evidence from the sixteen years of Z39.50 activities
• Interviews with participants and stakeholders in Z39.50 development
• Observation of Z39.50 standards development activities.
At the outset of the study, the researcher proposed a preliminary conceptual model based upon a
systems–theoretic perspective of standards development. Chapter 2 introduces the model and
provides evidence for it from the literature on standards and standards development. The model
guided the initial stages of the research and served as an analytical tool to examine Z39.50
development. The systems–theoretic approach, which acknowledges the interrelatedness among
components of a system, supported the study’s orientation toward a holistic understanding. Such
an approach illustrates that “the complex world of human beings cannot be fully captured and
understood by simply adding up carefully measured and fully analyzed parts” (Patton, 1990, p.
81).
The research design was evolutionary in that the entire research effort could not be detailed in
advance. The proposal for this study, however, provided a general map of the research activities.
For this exploratory and descriptive study of Z39.50 development, a qualitative, naturalistic, case
study approach was appropriate. The choice of this approach, however, has implications in terms
of the limits of the study.
1.5. Scope of Study
The study was limited to an investigation of the development of Z39.50. The researcher did not
systematically compare and contrast this case with other standards development efforts. This study
explored Z39.50 as one instance of information technology standards development. Single–
case studies provide a richness of data and can lead to a deep understanding of a phenomenon in a
single context. Single–case studies such as this do not provide the basis for generalizability over
all instances of the phenomenon. The working hypotheses derived through this study, however, can
be tested in subsequent research. In addition, the study did not attempt to test the preliminary
conceptual model. Instead, the study’s goal was to develop a model that could represent the evolution
of Z39.50.
Z39.50 development is an ongoing activity. The scope of this study, however, was temporally
bound by the initiation of the standards effort in the late 1970s and the completion and approval
of Version 3 of the standard in 1995. This 16–year period allowed the researcher to examine how
the standard’s evolution moved through several phases.
Z39.50 development encompassed a range of activities, entities, processes, forces, and contexts,
including:
• Initiation of the standards development effort
• Components and stages of the development of this standard
• Forces and pressures that affected the development of the standard (e.g., political,
economic, technological, etc.)
• Stakeholders and other interested parties and their interaction.
This was the terrain of standards development investigated by this study.
1.6. Structure of the Report
This dissertation comprises six chapters. This first chapter provides a basic introduction to the
study and briefly presents the motivation for the study, its focus, goals and objectives, research
questions, and research approach.
Chapter 2 provides a context for the research by examining the relevant literature on standards
and standards development. The chapter offers background on standards and standards
development processes to situate Z39.50 development in a broader context. In Chapter 2, the
researcher introduces the preliminary conceptual model and discusses the theoretical framework
upon which the model is based.
Chapter 3 addresses the research method, data collection, and analysis techniques used in this
study. It describes the archival resources examined in developing the historical reconstruction of
Z39.50 development and identifies the interviews and observations conducted by the researcher to
gather data about historical and current activities related to Z39.50.
Chapters 4 and 5 present the results of the research. Chapter 4 provides a historical
reconstruction of the evolution of Z39.50. Primary source materials and documentary evidence
such as minutes of meetings, reports, and correspondence formed the basis for the reconstruction.
Chapter 5 contains a set of key findings from the research. Chapter 4 presents the story of
Z39.50, and Chapter 5 reflects the next step in creating an understanding of Z39.50 development.
The systems–based conceptual model served as an analytical tool to identify and categorize
factors and forces that affected the development of the standard. The key findings provide a
foundation for understanding why and how Z39.50 evolved. The chapter revisits the preliminary
conceptual model, evaluates its utility in this study, and proposes significant extensions to the
model to represent the case of Z39.50 evolution.
Chapter 6 concludes this dissertation. It summarizes and further elaborates the implications of the
findings from Chapter 5. In addition, the chapter identifies possible next steps in research on
standards development by specifying a set of working hypotheses.
1.7. Summary
This study demonstrates the challenges as well as opportunities for conducting research on
complex and “messy” social and technological phenomena such as standards and their
development. The researcher had assumed that standards development was a complex,
multifaceted social process, yet he believed that it was a process that could be successfully
studied and analyzed.
The researcher demonstrated that qualitative/naturalistic research applied in a rigorous and
systematic manner offers an approach to overcome what Schmidt and Werle (1992) view as
difficulties in conducting research on standards development. They suggest that the “empirical
reconstruction of standardization processes is confronted with difficult problems of gathering and
interpreting ‘data’.... Thus, in the final analysis not so much the problems involved in theorizing about
standardization but rather the empirical difficulties of grasping the ‘real world’ of standard–setting
processes may turn out to be the main impasses to this on–going research process” (p. 326).
In studying Z39.50, the researcher explored an important standard for information retrieval in the
networked environment. Z39.50 is a complex computer communications protocol that supports
an open systems environment and has a long and intense development history. Further, as
Michael (Michael & Hinnebusch, 1995, p. 15) states, “Z39.50 is the single most important networking
standard available today.” Through this case study, the researcher achieved an understanding of
one instance of the phenomenon of information technology standards development and laid a
foundation for future research.