
Best Practices in Continuous Data Synthesis

Les Pang, Ph.D., MBA
Information Resources Management College, National Defense University
300 5th Avenue, Marshall Hall (Building 62), Room 198A1, IRMC-IT, Fort Lesley J. McNair, DC 20319
(202) 685-2060, Fax (202) 685-3974

Abstract

This paper identifies successful strategies, methods and techniques used by Federal agencies and military institutions in implementing the various aspects of continuous data synthesis (CDS). CDS approaches include the use of data warehouses, operational data stores, online analytical processing, data mining, and web database accessibility. The challenges, approaches, impact, metrics, and lessons learned from 24 selected projects were reviewed. The paper includes summary lists of best practices derived from these project experiences.

Key words - continuous data synthesis, databases, data warehousing, operational data stores, online analytical processing, data mining, web databases, Federal best practices

1. Purpose and Scope

Continuous data synthesis (CDS) involves technology-based approaches aimed at improving the level of business intelligence throughout an organization. These approaches support the concept of knowledge management, which is key to mission success within all organizations. CDS includes the following approaches:

♦ Data Warehousing
♦ Operational Data Stores
♦ Data Mining
♦ Online Analytical Processing (OLAP)
♦ Web Data Accessibility

The purpose of this paper is to (1) identify successful strategies, methods and techniques used by Federal government and military agencies in the implementation of CDS projects; (2) describe challenges, approaches, impact, metrics, and lessons learned from selected CDS projects; and (3) present summary lists of best practices derived from these project experiences. The scope was limited to CDS projects implemented by Federal agencies, military institutions and organizations directly supporting them.
The premise behind this sample is that public sector projects differ from those in the private sector. Projects reviewed include the following:

♦ Agriculture's Rural Development Data Warehouse
♦ Army and Air Force Exchange Service (AAFES)
♦ Census 2000 Cost and Progress (C&P) System
♦ Defense Dental Standard System
♦ Defense Medical Logistics Support System Data Warehouse Program
♦ Department of Defense (DOD) Army War Reserve Deployment System
♦ DOD Defense Finance & Accounting Service "Operation Mongoose"
♦ DOD Medical Logistics Support System Data Warehouse
♦ DOD's Computerized Executive Information System (CEIS)
♦ Department of the Interior (DOI) United States Geological Survey (USGS) Hydrologic Operational Data Store (ODS)
♦ DOT's Executive Reporting Framework (ERF) System
♦ Environmental Protection Agency (EPA) Envirofacts
♦ Federal Aviation Administration (FAA) Aircraft Accident Data Mining Project
♦ Federal Credit Union
♦ Housing and Urban Development (HUD) Enterprise Data Warehouse
♦ Internal Revenue Service (IRS) Compliance Data Warehouse
♦ Marine Corps Personnel Data Warehouse / Total Force Manpower Process Modernization Program
♦ U.S. Army Center for Healthcare Education and Studies
♦ U.S. Army's Operational Testing and Evaluation Command
♦ U.S. Coast Guard Executive Information System (EIS)
♦ U.S. Dept of Energy's LANMAS
♦ U.S. Navy's Type Commander's Readiness Management System
♦ U.S. Department of Agriculture (USDA) Rural Development
♦ Veteran's Administration (VA) Demographics

Best practices for each CDS approach are presented below. Each practice is underlined and is followed by a description of the CDS project that espouses it.

2. Data Warehouse

2.1 Ensure the accuracy of the source data to maintain public trust in the information

2.1.1 EPA Envirofacts - The Envirofacts data warehouse comprises information from 12 different environmental databases covering facility information, including toxic chemical releases, water discharge permit compliance, hazardous waste handling processes, Superfund status, and air emission estimates. Each program office provides its own data and is responsible for maintaining it.
Initially, Envirofacts data warehouse architects noted some data integrity problems, namely issues with the accuracy, understandability, comparability, proper linkage and standardization of the data. They have worked hard to address these problems so that the public has access to quality data. [1]

2.1.2 HUD Enterprise Data Warehouse (EDW) - Gloria Parker, HUD's current CIO, spearheaded data warehousing projects at the Department of Education and at HUD. The EDW effort was used to profile performance, detect fraud, profile customers, and do "what if" analysis. Business areas served included FHA loans, subsidized properties, and grants. She emphasizes that public trust in the information is critical. Government agencies do not want to jeopardize public trust by putting out bad data, which will bring major ramifications not only from citizens but also from the GAO and Congress. [2]

2.1.3 U.S. Navy Type Commander Readiness Management System - The U.S. Navy utilizes a data warehouse to support the decisions of its commanding officers. Data at the lower unit levels is aggregated to the higher levels, such as that of a Type Commander, and then interfaced with other military systems for a joint military assessment of readiness as required by the Joint Chiefs of Staff. The Navy found that it was spending an excessive amount of time determining its readiness, and some of its reports contained incorrect data. The Navy developed a user-friendly, web-based system that enables rapid and accurate
assessment of readiness data at all levels within the Navy. "The system collects, stores, reports and analyzes mission readiness data from air, sub and surface forces" for the Atlantic and Pacific Fleets. Although this effort was successful, the Navy learned that data originating at the lower levels still needs to be accurate, because a number of legacy systems lacked validation functions. [3]

2.2 Standardize your organization's data definitions

2.2.1 DOD Computerized Executive Information System (CEIS) - The CEIS is a 4-terabyte data warehouse that holds the medical records of the 8.5 million active members of the U.S. military health care system, who are treated at 115 hospitals and 461 clinics around the world. The Defense Department wanted to convert its fixed-cost health care system to a managed-care model to lower costs and increase patient care for the active military, retirees and their dependents. Over 12,000 doctors, nurses and administrators will eventually use it. Frank Gillett, an analyst at Forrester Research, Inc., states that "What kills these huge data warehouse projects is that the human beings don't agree on the definition of data. Without that . . . all that $450 million could be thrown out the window." [4]

2.3 Be selective about what data to include in the warehouse

2.3.1 Federal Credit Union - Users are unsure of what data they want, so they often specify an excess number of data elements for the warehouse. The credit union's data warehouse architects suggest that users know which data they use most, although they will not always admit to what they use least. [5]

2.4 Select the Extraction-Transformation-Loading (ETL) strategy carefully

2.4.1 IRS Compliance Warehouse - This massive warehouse allows the IRS to analyze, develop, and implement business strategies for increasing voluntary compliance, increasing productivity and managing the business.
It is used to provide projections, forecasts, quantitative analysis, and modeling. It currently holds 3 terabytes of storage with about 1.2 terabytes of raw data. Users are able to query this data for decision support. Currently, there are about 120 users -- economists, research analysts, statisticians -- who are searching for ways to improve customer service, increase compliance with federal tax laws and increase productivity. Its expected return is 200:1.

Among its functionalities, the warehouse can (1) detect trends and unknown patterns of compliance/noncompliance, (2) apply quantitative methods to resource allocation modeling, (3) drill down into data to view detailed information, (4) evaluate the impact of proposed tax-law changes, (5) detect tax fraud and (6) increase productivity by producing results more rapidly and at a lower cost.

A major hurdle faced by the warehouse designers was transforming the large and diverse legacy online transactional data sets for effective use in an analytical architecture. This is an ETL issue many warehouse designers must face. They needed a way to process custom hierarchical data files, convert them to ASCII for local processing, and map them to relational databases. They ended up developing a script program that does all of this. ETL is a major challenge and may be a "showstopper"; designers need to focus and come up with creative solutions. One of the most important decisions in building a data warehouse is selecting an appropriate ETL strategy. [6]

2.5 Provide auditing capability

2.5.1 DOD Army Operational Testing and Evaluation Command (OTEC) - OTEC is responsible for developing test criteria and evaluating the performance of extremely complex weapons equipment in every conceivable environment and condition. As national defense policy evolves, so do the weapon systems, and thus the testing requirements. The objective was to consolidate the massive and diverse test
data to provide analysts and auditors with access to the specific information needed to make proper decisions. OTEC personnel had "fits" when auditing agencies, such as the General Accounting Office (GAO), would show up to investigate a weapon system. For instance, if problems with a weapon show up five years after it is introduced into the field, people are going to want to know what tests were performed. A warehouse with its metadata features made data retrieval much more efficient. [7]

3. Operational Data Store (ODS)

3.1 Consider the less expensive batch mode updates instead of direct data connections

3.1.1 DOD Army War Reserve Deployment System (AWRDS) - This operational data store was designed to enable the Army War Reserve to rapidly deploy forces around the globe for combat and humanitarian missions. The system is a distributed database application that allows military personnel in the United States, Europe and Asia (both on land and at sea) to maintain and access current equipment availability and readiness information. Military personnel from around 20 pre-positioned sites update the information in their local database, and every 6 hours the updated information is replicated from their local machines to a centralized server. This ensures that all Army War Reserve sites have access to the most up-to-date information. Currently the central server contains about 2 gigabytes of data.

There are four classes of operational data stores:

♦ Class I - online synchronous
♦ Class II - hourly data flows
♦ Class III - daily basis
♦ Class IV - unscheduled

Data store designers need to make a tradeoff between cost and data speed in choosing which class of ODS to implement.

3.2 Use the ODS approach when forced to reduce the need for a DBA at every site

3.2.1 DOD Army War Reserve Deployment System.
According to Jim Brabston, a database contractor, the ODS's "low administrative overhead is another key factor for the system's success since we can't place a DBA at every site." [8]

3.3 Take into account the provisional nature of information when used for making mission-critical decisions

3.3.1 DOI USGS Hydrologic ODS - Streamflow, water level, water quality, and water use data are collected throughout a state for use in research and hydrologic studies. Data from real-time streamflow gages are relayed to the District office through the Geostationary Operational Environmental Satellite data collection system. Operational data are transmitted from each station at intervals of 3 or 4 hours and are loaded onto the District Computer System. Data are reviewed periodically to ensure accuracy. Each district record is considered provisional until the data is reviewed, analyzed and published. This is because reactions to the data may concern personnel or public safety, or the outcome of such actions may involve substantial monetary or operational consequences. [9]
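The cost-versus-freshness tradeoff among the four ODS classes can be sketched as a small selection routine. The class names follow the list above; the staleness bounds and cost ranks are illustrative assumptions, not figures from the paper:

```python
from dataclasses import dataclass

# Hypothetical characterization of the four ODS classes described above;
# the names follow the paper, the numbers are illustrative only.
@dataclass(frozen=True)
class OdsClass:
    name: str
    max_staleness_hours: float  # worst-case age of data at the central store
    relative_cost: int          # rough cost rank (1 = cheapest)

ODS_CLASSES = [
    OdsClass("Class I (online synchronous)", 0.0, 4),
    OdsClass("Class II (hourly data flows)", 1.0, 3),
    OdsClass("Class III (daily basis)", 24.0, 2),
    OdsClass("Class IV (unscheduled)", float("inf"), 1),
]

def cheapest_class(required_staleness_hours: float) -> OdsClass:
    """Return the least expensive ODS class whose staleness bound
    still satisfies the mission's freshness requirement."""
    candidates = [c for c in ODS_CLASSES
                  if c.max_staleness_hours <= required_staleness_hours]
    return min(candidates, key=lambda c: c.relative_cost)

# AWRDS replicates every 6 hours, so hourly data flows already suffice:
print(cheapest_class(6.0).name)  # Class II (hourly data flows)
```

The point of the sketch is the direction of the tradeoff: tighter freshness requirements force the more expensive synchronous classes, while batch replication (as AWRDS chose) is adequate for many missions at far lower cost.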
4. Data Mining

4.1 Avoid the privacy trap

4.1.1 DOD Computerized EIS (CEIS) - Patients and CEIS's creators said they are concerned about the privacy of individual medical records in the system, which may include confidential data such as HIV test results. "Any kind of large database like that where you talk about personal info raises red flags," said Alex Fowler, a spokesman for the Electronic Frontier Foundation in San Francisco. "There are all kinds of questions raised about who accesses that info or protects it and how somebody fixes mistakes." [4]

4.2 Use results to identify potential areas to investigate

4.2.1 DOD Defense Finance & Accounting Service's (DFAS) Operation Mongoose - This effort involved DFAS identifying billing errors and fraud through data mining. About 2.5 million financial transactions were searched to locate inaccurate charges. Criteria for the search ranged from what was bought and at what price to how a purchase compares with previous purchases. So far several hundred bills have been found that may warrant further investigation.

This approach was also used to detect patterns in the use of purchase cards. As the computer searches the transactions, data patterns that might indicate improper use emerge, such as purchases made on weekends and holidays, entertainment expenses, unusually frequent purchases, multiple purchases from a single vendor and other transactions that do not correspond to the agency's past purchasing patterns. It turned up a cluster of 345 cardholders (out of 400,000) who had made suspicious purchases, some of whom are still under investigation. However, the process needs some fine-tuning. Purchases of golf equipment appeared suspicious until investigators discovered that a military recreation manager had authority to buy the equipment. And expenses the computer said were charged to a "casino" turned out to be an ordinary hotel bill.
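The purchase-card screening described above is essentially rule-based pattern matching. A minimal sketch follows; the field names, thresholds and sample transactions are invented for illustration and are not DFAS's actual criteria:

```python
from datetime import date
from collections import Counter

def flag_suspicious(transactions, max_per_vendor=10):
    """Return (index, reasons) pairs for transactions matching simple
    misuse patterns: weekend purchases and unusually frequent use of
    a single vendor. Thresholds are illustrative assumptions."""
    vendor_counts = Counter(t["vendor"] for t in transactions)
    flags = []
    for i, t in enumerate(transactions):
        reasons = []
        if t["date"].weekday() >= 5:  # 5 = Saturday, 6 = Sunday
            reasons.append("weekend purchase")
        if vendor_counts[t["vendor"]] > max_per_vendor:
            reasons.append("frequent single-vendor purchases")
        if reasons:
            flags.append((i, reasons))
    return flags

txns = [
    # 2000-02-05 was a Saturday, so this one gets flagged:
    {"date": date(2000, 2, 5), "vendor": "Pro Golf Shop", "amount": 310.0},
    {"date": date(2000, 2, 7), "vendor": "Office Supply Co", "amount": 42.5},
]
print(flag_suspicious(txns))  # [(0, ['weekend purchase'])]
```

As the golf-equipment and "casino" examples show, flags produced by such rules are leads for human investigators, not verdicts.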
Nevertheless, the data mining results have been promising enough that predictions are that data mining will become a regular part of DFAS efforts to stop fraud. [10]

4.3 Create an economic model based on case histories to justify costs

4.3.1 FAA Aircraft Accident Data Mining Project - The FAA hired Mitre Corp. to find ways it can mine aircraft accident data for clues about accident causes and how those clues could help prevent future crashes. Mitre found that planes equipped with instrument displays that can be read without requiring a pilot to look away from the windshield were damaged less in runway accidents than planes without them. The government is cautious about committing much money to data mining. "One of the problems is how do you prove that you kept the plane from falling out of the sky," said Trish Carbone, a technology manager at Mitre. It is often difficult to justify data mining costs and point to specific benefits. [10]

4.4 Use data mining to support budgetary requests

4.4.1 Veteran's Administration (VA) Demographics - The VA predicts demographic changes among its 3.6 million patients and projects collections from insurance companies. The technology enables the VA to send Congress much more precise budget requests. Agencies such as the VA, which spends about $19 billion annually to provide medical care to veterans, are under increasing pressure to show that they are operating efficiently. For many, data mining is becoming the tool of choice to highlight good performance or identify waste and fraud. [10]

4.4.2 United States Coast Guard (USCG) Executive Information System (EIS) - The USCG developed an EIS designed for managers to see what resources are available to them and better understand the organization's needs. It is also used to identify relationships between Coast Guard initiatives and
seizures of contraband cocaine and to establish tradeoffs between the costs of alternative strategies. As with many other organizations, the Coast Guard has numerous databases whose content overlaps, and often only one employee understands each database well enough to extract information. In addition, field offices are organized geographically but budgets are drawn up by programs that operate nationwide, so there is a disconnect between the organizational structure and the appropriations structure. The Coast Guard successfully used its EIS to overcome these obstacles, including using it to develop the Fiscal Year 2000 performance plan. [11]

4.4.3 DOT Executive Reporting Framework (ERF) - The ERF data warehouse provides complete, reliable and timely information in an environment that allows for the cross-cutting identification, analysis, discussion and resolution of issues. The ERF also manages grants and operations data. Grants data reveals the taxes and fees that the DOT distributes to the states for highway and bridge construction, airport development and transit systems. Operations data covers payroll, administrative expenses, travel, training and other operations costs. The ERF system accesses data from various financial and programmatic systems in use by Operating Administrators.

Before 1993 there was no financial analysis system to compare the department's budget with congressional appropriations, nor any system to track performance against the budget. The ERF system changed this and has increased financial accountability. It allows the budget to be tracked so that any areas that have been overplanned can be corrected. Using ERF, adjustments were made within the quarter so the agency did not go over budget. ERF is being extended to manage budget projections, development and formulation.
It can be used as a proactive tool which allows the agency to project ahead more dynamically. [12]

4.5 Give users continuous computer-based training on mining tools

4.5.1 DOD Medical Logistics Support System (DMLSS) Data Warehouse - DOD's Medical Logistics office built a warehouse with front-end decision support tools to help manage the accelerating costs of health care, enhance health care delivery in peacetime and promote wartime readiness and sustainability. The office is responsible for the supply of medical equipment and medicine worldwide for DOD medical care facilities. Previously, users manually gathered supplier information from various sources, re-entered the data, consolidated it and often converted it. This led to inaccurate data, time-consuming processes, and information lacking the necessary detail.

The warehouse provides online access to detailed, descriptive data and pricing information on pharmaceutical and medical/surgical items previously obtained manually. Presently, 4,000 users worldwide can access the information, and eventually 10,000 users will. Users can access online pricing, compare prices from multiple sources, and make cost-effective purchases that meet inventory needs, providing supplies when and where needed. The warehouse is also used to track drug expiration, which formerly constituted a major expense. Notably, non-technical users can access this key information.

The estimated ROI is 10:1. In one hospital, about $500,000 in costs were avoided; overall cost avoidance is estimated at $300 million over 10 years. Inventories were reduced from 60-150 days to less than 30 days. This approach reduced shelf inventory, relaxed the need for warehouse space and workers, improved end users' decision-making cycle, and reduced dependency on the IS department.

One challenge faced by the agency was the difficulty of keeping user training current because of the constant turnover of military personnel.
It was determined that there is a need to provide quality computer-based training. [13]
5. Online Analytical Processing (OLAP)

5.1 Leverage the web to reach dispersed users

5.1.1 DOD Army Operational Testing and Evaluation Command (OTEC) - OTEC developed a web-based front end to its OLAP tool so that information can be entered and accessed regardless of the hardware available to users. It supports the geographically dispersed nature of OTEC's mission. Users performing tests in the field can be anywhere from Albany, N.Y. to Fort Hood, Texas, which is why the browser client the Army developed is so important. [7]

5.1.2 DOD Computerized EIS (CEIS) - 12,000 doctors, nurses and administrators throughout the country utilize this key information resource. Its nationwide impact is tremendous. [4]

5.1.3 DOD Defense Dental Standard System - This system supports more than 10,000 users at 600 military installations worldwide. The system consists of three main modules: Dental Charting, Dental Laboratory Management, and Workload and Dental Readiness Reporting. The charting module, a customized commercial product, helps dentists graphically record patient information. The lab module automates the workflow between dentists and lab technicians. The reporting module allows users to see key information through online reports. [14]

5.2 Make data available to all knowledge workers (not only to managers)

5.2.1 IRS Compliance Data Warehouse - This massive data warehouse supports a diverse range of users, including economists, research analysts and statisticians, all of whom are searching for ways to improve customer service, increase compliance with federal tax laws and increase productivity. [6]

5.3 Supply data in a format readable by spreadsheets

5.3.1 DOD Army OTEC - The Army wanted users to transfer data and then work with the information in applications with which they are familiar. In OTEC, data is transferred into a format readable by spreadsheets so that analysts can really crunch the data.
Specifically, pivot tables found in spreadsheets allow the analysts to manipulate the information to put meaning behind the data. [7]

5.4 Restrict or encrypt classified/sensitive data

5.4.1 DOD Computerized EIS - CEIS uses an online analytical processing tool that can restrict access to certain data, such as HIV test results, so that confidential data is not disclosed. [4]

5.5 Perform analysis during the data collection process

5.5.1 Census 2000 Cost and Progress (C&P) System - To integrate all the necessary database operations into one database management system, the Census Bureau developed the Census 2000 Cost and Progress System using data warehousing software that consolidates information from several computer systems. The data warehouse software has allowed users to perform analyses during the data collection process, which was previously not possible, and allows executives to take a more proactive management role. With this system, Census directors, regional offices, managers, and congressional oversight committees have the ability to track the 2000 census during the collection process, which had never been done before. [15]
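The export-then-pivot workflow behind practice 5.3 can be sketched in a few lines: the warehouse emits a flat, spreadsheet-readable CSV, and a pivot cross-tabulates it the way an analyst would with a spreadsheet pivot table. The test records and field names below are invented for illustration:

```python
import csv
import io
from collections import defaultdict

# Invented test-event records standing in for warehouse output.
records = [
    {"site": "Fort Hood", "system": "radar",  "result": "pass"},
    {"site": "Fort Hood", "system": "radar",  "result": "fail"},
    {"site": "Albany",    "system": "optics", "result": "pass"},
]

# 1. Export in a format any spreadsheet can open (CSV).
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["site", "system", "result"])
writer.writeheader()
writer.writerows(records)

# 2. Pivot: count test results per site, the kind of cross-tab an
#    analyst would build with a spreadsheet pivot table.
pivot = defaultdict(lambda: defaultdict(int))
for r in records:
    pivot[r["site"]][r["result"]] += 1

print(dict(pivot["Fort Hood"]))  # {'pass': 1, 'fail': 1}
```

The design choice the OTEC practice highlights is that the warehouse does the heavy consolidation while the final manipulation happens in a tool the analysts already know.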
6. Web Database Accessibility

6.1 Consider portal technology to access diverse data sources

6.1.1 IRS Compliance Warehouse - This warehouse includes a web-based query and reporting solution that provides high-value, easy-to-use data access and analysis capabilities, can be quickly and easily installed and managed, and scales to support hundreds or thousands of users. With this information portal, the IRS found its ideal solution. Portals provide a way to access diverse data sources via a single screen. [6]

6.2 Leverage user familiarity with browsers to reduce training requirements

6.2.1 USDA Rural Development - Rural Development's Office of Community Development administers funding programs for the Rural Empowerment Zone Initiative. There was a need to tap into legacy databases to provide accurate and timely rural funding information to top policy makers. Challenges included extensive legacy data sources and a lending volume that rivals that of the fourth largest bank in the nation. Rural Development has a staggering number of investments to track and manage, and funding programs to analyze for effectiveness. The system serves a staff of 5,000 in 900 USDA offices nationwide.

Through web accessibility using an intranet system, there were dramatic improvements in financial reporting accuracy and timely access to data. Prior to the intranet, questions such as "What were the Rural Development investments in 1997 for the Mississippi Delta region?" required weeks of laborious data gathering and analysis, yet yielded obsolete answers with only an 80 percent accuracy factor. Now, similar analysis takes only a few minutes to perform, and the accuracy of the data is as high as 98 percent. More sophisticated analysis, such as comparing investments to census population characteristics for a particular region, which was previously impossible to perform, is now easy to mine from the database.
More than 7,000 Rural Development employees nationwide can retrieve the information at their desktops using a standard web browser. Because employees are familiar with the browser, they did not need training to use the new Data Mining System, which is important for an agency with 900 offices. They can look at standard reports, drill down into the data for more detail, and create custom reports as needed. Little more than a year after it was established on the department's intranet, the system's web site gets 75,000 hits a month. [16]

6.3 Use information visualization techniques such as Geographic Information Systems (GIS)

6.3.1 EPA Envirofacts - To display the output of the Envirofacts warehouse, the EPA uses the EnviroMapper GIS system. It maps several types of environmental information, including drinking water, toxic and air releases, hazardous waste, water discharge permits, and Superfund sites at the national, state, and county levels. It is possible to drill down to the street level to identify environmental sites. [1]

6.4 Be sensitive to privacy intrusions

6.4.1 Social Security Administration - Three years ago, the Social Security Administration (SSA) learned this lesson the hard way when it released an Internet version of the Personal Earnings and Benefit Estimate Statement (PEBES). The project had good intentions: give citizens online access to information about their Social Security contributions and future benefits. However, public perception turned on SSA when the press reported that the privacy of the system may not have been assured. All that was needed to gain entry was a social security number, the mother's maiden name, and other readily available information. SSA learned its lesson: it had not adequately consulted privacy advocates and had not built privacy protections into the system.
6.5 Protect yourself from hackers

6.5.1 Justice Department, EPA, DOD - These government web sites were recently hacked or found vulnerable to hacking. Because of the tremendous damage hackers pose, it is imperative that managers ensure that proper protective measures are undertaken to guard the agency's web resources and databases.

7. Overarching Best Practices

To summarize, the following are best practices gleaned from the range of projects studied:

♦ Clean up the source data.
♦ Establish corporate data standards.
♦ Engage managers and users throughout the process.
♦ Be conscious of privacy and security concerns.
♦ Maintain public trust in the data.
♦ Avoid being bogged down by management by committee.
♦ Start small; it will grow by itself.
♦ Maintain complete and timely content.
♦ Provide adequate training.
♦ Provide strong, consistent leadership and direction.

8. Future of CDS

According to PricewaterhouseCoopers [18] and other sources, the future of CDS is summarized below:

♦ Data warehouses and marts will grow in size and complexity.
♦ Web-based access to data will become even more dominant.
♦ OLAP and data mining will be integrated.
♦ Tools will be simpler to use and less expensive.
♦ XML will be the next step.
♦ Major security issues will be associated with the wide availability of data.

9. Recommendations

As a result of this research, a number of recommendations were identified.

Improve performance metrics. In conducting this research, it was found that many organizations did not document their ROI or provide any post-implementation performance indicators. These metrics are needed to measure the level of success, and the figures could be used to justify other similar projects.

Consider outsourcing. There are many issues associated with outsourcing, but it is sometimes a very attractive alternative when the agency does not have the available personnel or expertise to do a CDS project adequately.

Account for the Total Cost of Ownership (TCO).
TCO is an accounting method used by businesses seeking to understand and control their systems costs. By accounting for all costs associated with a system across its entire life span, TCO analysis seeks to make procurement, management, and use more efficient. Total cost of ownership is computed by combining direct costs (hardware purchase or lease, and software licensing) with indirect costs (administration, support, and other unbudgeted costs). The direct costs are easily determined by looking at an IT department's capital budget. The indirect costs, however, are harder to quantify. It is important to keep in mind that CDS projects are generally continuous efforts and not a one-time event.
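The TCO arithmetic described above is simple to express: direct plus indirect costs, summed over the system's life span. The cost figures below are invented purely for illustration:

```python
def total_cost_of_ownership(direct_per_year, indirect_per_year, years):
    """TCO = sum over the life span of direct costs (hardware, software
    licensing) plus indirect costs (administration, support, and other
    unbudgeted costs)."""
    return sum(direct_per_year[y] + indirect_per_year[y]
               for y in range(years))

# Illustrative 3-year profile: a large year-0 purchase, then license
# renewals; indirect costs are steadier but harder to estimate.
direct = [500_000, 50_000, 50_000]
indirect = [100_000, 120_000, 130_000]
print(total_cost_of_ownership(direct, indirect, 3))  # 950000
```

Note how the indirect costs, the ones the paper warns are hardest to quantify, make up a substantial share of the total even in this toy profile, which is precisely why omitting them understates the cost of a continuous CDS effort.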
Architecture is a key to success. CDS should be part of an enterprise architecture, a framework for visualizing the IT assets of an enterprise and how these assets interrelate. It should reflect the vision and business processes of an organization, and should also include standards for the assets and interoperability requirements among them.

Technology is just part of the solution. To ensure success, an agency needs to examine and modify its business processes as well as understand how it can leverage data to improve its level of service and products. The agency also needs to recognize its most valuable resource, its people, who need to be both well trained and motivated to do the job well.

10. Conclusion

The application of CDS in the government and military offers much potential for improving the level of knowledge and business intelligence of an organization. However, the implementation and maintenance of a CDS project can be very challenging, not only because of technical obstacles but also because of the unique environment in which the government and military operate. Based on a comprehensive review of Federal and military CDS projects, this paper aimed to identify key best practices that can help overcome these tough challenges. By doing so, there is much value to be gained by the warfighters, their support organizations, and the public.

References

[1] Garvey, Patrick, "Envirofacts Warehouse Public Access to Environmental Data Over the Web," Presentation, January 5, 1999.
[2] Parker, Gloria, "Data Warehousing at the Federal Government: A CIO Perspective," Technical Presentation, Data Warehouse Conference '99, October 1, 1999.
[3] "US Navy Ensures Readiness Using SQL Server."
[4] Hamblen, Matt, "Pentagon to deploy huge medical data warehouse."
[5] Deitch, Jonathan, "Technicians are from Mars, Users are from Venus: Myths and Facts about Data Warehouse Administration," Technical Presentation, undated.
[6] Kmonk, Jeff, "Viador Information Portal Provides Web Data Access and Reporting for the IRS."
[7] "OPTEC Adopts Data Warehousing Strategy to Test Critical Weapons Systems."
[8] "Success Stories."
[9] Water Resources Division, South Dakota District, United States Geological Survey.
[10] "Digging Digital Gold," Federal Computer Week, Feb. 7, 2000.
[11] Ferris, Nancy, "Information is Power."
[12] Gerber, Cheryl, "Feds turn to OLAP as reporting tool."
[13] "Military Marches Toward Next-Generation Health Care Service: The Defense Dental Standard System."
[14]
[15] "The U.S. Bureau of the Census Counts on a Better System."
[16] Ferris, Nancy, "9 Hot Trends for '99."
[17] Schwartz, Ari, "Making the Web Safe," 03/13/2000.
[18] Technology Forecast: 1999. PricewaterhouseCoopers, October 1998, pp. 461-462.

Author Biography

Dr. Les Pang is a professor of systems management at the Information Resources Management College, part of the National Defense University. He currently teaches courses to military and civilian leaders on data management technologies, modeling and simulation, the Internet, software technologies, and the Year 2000 problem. He received a Ph.D. in engineering from the University of Utah in 1983 and a Master of Business Administration (Management Information Systems concentration) from the University of Maryland, College Park in 1988.

Disclaimer

"The views expressed in this presentation are those of the author and do not reflect the official policy or position of the National Defense University, the Department of Defense or the U.S. Government."