The document describes a web application developed for NASA's Marshall Space Flight Center to integrate their facilities management data sources. The application allows users to view maps, architectural floor plans, facility information, equipment data, work orders, personnel locations, and generate reports through a single interface. It provides improved access and analysis of facilities data compared to previous separate systems that required specialized software and training.
This document discusses integrating independent data sets at NASA Marshall Space Flight Center. Previously, data was siloed across different departments and systems with no linkages. The new application aims to rapidly gather information from various sources through a visual portal, while improving access, integration, decision making and efficiencies. It utilizes existing databases, business processes and software without migrating all data into a central system. An integration hub correlates non-homogeneous databases and regroups data sets into reports. An access control module allows restricted access to specific users.
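The hub-style integration described above, which correlates records across non-homogeneous databases on a shared key rather than migrating them into one system, can be sketched roughly as follows (all table and field names here are hypothetical illustrations, not taken from the MSFC system):

```python
import sqlite3

# Two independent data sources, stood up here as in-memory SQLite
# databases purely for illustration; each remains authoritative.
facilities_db = sqlite3.connect(":memory:")
facilities_db.execute("CREATE TABLE facility (bldg_id TEXT, name TEXT)")
facilities_db.execute("INSERT INTO facility VALUES ('4200', 'HQ Building')")

workorders_db = sqlite3.connect(":memory:")
workorders_db.execute("CREATE TABLE work_order (wo_id INTEGER, bldg_id TEXT, status TEXT)")
workorders_db.executemany("INSERT INTO work_order VALUES (?, ?, ?)",
                          [(1, '4200', 'open'), (2, '4200', 'closed'), (3, '9999', 'open')])

def facility_report(bldg_id):
    """Correlate the two sources on the shared building ID and regroup
    the result into a single report row; neither source is migrated."""
    fac = facilities_db.execute(
        "SELECT name FROM facility WHERE bldg_id = ?", (bldg_id,)).fetchone()
    orders = workorders_db.execute(
        "SELECT wo_id, status FROM work_order WHERE bldg_id = ?", (bldg_id,)).fetchall()
    return {"bldg_id": bldg_id, "name": fac[0] if fac else None,
            "open_orders": sum(1 for _, s in orders if s == "open")}

report = facility_report("4200")
```

The key design point is that the hub holds only the correlation logic and the shared identifier; the underlying databases keep their own schemas and business processes.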
Db Graph: a tool for development of database systems based... (Ambar Abdul)
This document proposes an extension to the Entity-Relationship (E-R) modeling technique to support conceptual database design for geographic information systems. The extension handles spatial objects, relationships, and attributes commonly found in GIS. It represents spatial relationships as relationships in E-R diagrams and maps them to topological or coordinate-based implementations in GIS. The extended E-R modeling approach provides a conceptual modeling tool to improve GIS database design processes.
The document summarizes a presentation about NASA's Johnson Space Center's (JSC) approach to integrating facilities planning information across the center. It describes how JSC developed a Facilities Review Database (JFReD) to capture and assess facility usage and cost data over multiple years. JFReD integrates with JSC's Geographic Information System to allow visualization of facility locations and utilization profiles. It also incorporates capabilities assessment information to support strategic planning for facilities and technical capabilities.
This document discusses the design of a geographic information system (GIS) software platform integrated with a decision support system (DSS) for use in e-government applications in China. It proposes a new approach that tightly integrates DSS techniques with GIS techniques to provide comprehensive information and decision-making services to governments. The platform uses a uniform database design and data management approach. It is developed using a component-based approach to achieve close integration of GIS and DSS functions. The platform adopts a client-server architecture for applications and a client-server structure for system maintenance.
Establishing A Robust Data Migration Methodology - White Paper (James Chi)
This document outlines a data readiness methodology for migrating data from legacy systems to SAP. The methodology includes extracting data from source systems or collecting manual data, transforming the data in a staging area, and loading it into SAP. It describes components like extract, transform, and load. The methodology is intended to identify data quality issues early and deliver consistent, predictable results for data migration.
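The extract, transform, load flow described above can be sketched as three small stages with a staging area that surfaces quality issues before anything reaches the target system (the field names and validation rules below are illustrative assumptions, not SAP's):

```python
def extract(source_rows):
    """Extract: pull raw records from a legacy source (here, a list)."""
    return list(source_rows)

def transform(staging):
    """Transform: validate and normalize in the staging area, collecting
    data-quality issues early instead of failing during the load."""
    clean, issues = [], []
    for row in staging:
        if not row.get("material_id"):
            issues.append(("missing material_id", row))
            continue
        clean.append({"material_id": row["material_id"].strip().upper(),
                      "qty": int(row.get("qty", 0))})
    return clean, issues

def load(target, rows):
    """Load: write only the cleaned rows into the target (a dict here)."""
    for row in rows:
        target[row["material_id"]] = row["qty"]

legacy = [{"material_id": " mat-1 ", "qty": "5"}, {"material_id": "", "qty": "2"}]
target = {}
clean, issues = transform(extract(legacy))
load(target, clean)
```

Collecting `issues` as data rather than raising immediately is what makes the methodology "identify data quality issues early": the full defect list is available before the load is committed.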
Metric is a SaaS platform that provides integrated network operations optimization, enhancing performance and efficiency. It allows for collaborative workflow management, task tracking, file sharing and indexing of network data. Key features include visualization of drive test and performance data on interactive maps, automated analysis of issues like coverage, interference and configuration errors, and customizable reporting of network KPIs. The platform aims to simplify processes, improve productivity and help telecom operators make faster, data-driven decisions.
Roy Interrante is a Project Manager and GIS consultant with over 20 years of experience managing projects involving infrastructure, GIS, and enterprise data systems. He has expertise in project management, GIS implementation, web development, CAD, system architecture, and data modeling. Notable projects include developing a web portal integrating NASA databases and applications that reduced data collection time from 1 week to less than 2 hours, saving $4 million per year, and a GIS system for the Navy integrating data from 26 bases.
The document provides details of the individual's professional experience including roles as Chief of GIS/Spatial Data Systems and various project management roles at the District Department of Transportation from 2007-2010. Some key responsibilities and projects mentioned include establishing GIS committees and programs, managing the development of an Enterprise Asset Management System, upgrading the Street Inventory System, and developing asset inventories.
This document provides an overview of a project to assess mapping technologies for connected vehicle applications. The project aims to determine the best technologies to support intelligent transportation systems and will analyze technologies like aerial imagery and vehicle-mounted sensors. A field test of mapping technologies was conducted and involved collecting road data using sensors on a test vehicle. The goal is to develop maps to enable safety and mobility applications by knowing vehicles' locations relative to the roadway and other vehicles.
The Harvard University Facilities and Operations department sought to centralize and consolidate facility management data by creating a geographic information system (GIS). Microdesk was tasked with developing an integrated GIS to inventory, document, and manage Harvard's chilled water, steam, and condensate distribution system assets. The GIS provides the ability to assemble, manage, and present key utility information through a web browser. It links common attributes, properties, diagrams, schematics, and photos to utility assets. The system allows Harvard University staff to actively manage utility assets and always have an up-to-date view of campus infrastructure.
TELUS Case Study: iVAULT implementation improved corporate intelligence (eventspat)
This document summarizes a webinar about TELUS's implementation of the iVAULT content management system. Some key points:
- iVAULT was implemented to improve TELUS's corporate intelligence by creating a centralized spatial data store and a new FieldView application. This consolidated data from disparate legacy systems and mapping platforms such as ArcGIS, MapGuide, and Google Maps.
- The new architecture included an Oracle spatial data store replicating TELUS infrastructure data from their Intergraph Framme system. This cleaned up issues and standardized the data.
- A new FieldView application was developed to provide customized analysis tools for various departments through a web interface on both desktop and mobile.
TELUS Case Study: GIS for Telecommunications (eventspat)
This document describes how TELUS implemented an iVAULT system to improve access to and use of their spatial data. Key points:
1. TELUS integrated their disparate GIS systems and data into a single iVAULT system with a spatial data store, allowing unified access for field users and departments.
2. The iVAULT system included a new FieldView application for viewing, searching, analyzing and editing spatial and attribute data via web and mobile.
3. The spatial data store cleaned up TELUS' IMAGE database and consolidated over 1,000 design files, improving data quality and access.
4. The unified system allows TELUS to better analyze customer and network data
The Navy’s GeoReadiness Repository builds on Web services, using ArcObjects 9 and open standards, and provides the Navy with the following:
• Authoritative source of geospatial data for Navy Real Property in support of Critical Infrastructure and Force Protection, Shore Installation Management, and Environmental Protection
• Baseline architecture for a network of Regional Repositories
• Portal that integrates functional applications and databases
• Security controls limiting access to specific data layers
• Quality control by automating the SDSFIE Standards and IVT Quality Assurance Plan compliance check function
• Access between authoritative geospatial databases
William McGuyer GIS Analyst Resume 2015 newest (Bill McGuyer)
William McGuyer has over 26 years of experience as a GIS Analyst, Project Manager, and database administrator with expertise in Esri ArcGIS, AutoCAD, Oracle, and SQL. He currently works as a GIS Analyst for Colorado State University assisting the Air Force with standardizing their environmental geospatial datasets. Prior to his current role, he held GIS and database roles for various government contractors providing geospatial support and analysis to the Department of Defense.
FieldTRAKS is a web-based geospatial data management platform that allows users to customize solutions for collecting, managing, and reporting data related to spatial or non-spatial objects. It offers applications such as ARKS for agriculture record keeping and pipeline management. FieldTRAKS applications can be configured through database profiles to meet various data collection and reporting needs across different industry sectors.
GEMS implemented a GIS portal called Pangaea using ArcGIS technology to improve access to and management of its massive project data stores. The portal allows intuitive spatial and attribute searching of GEMS' over 50 terabytes of documents, images, and data. It reduces data search time and duplication while improving access to up-to-date project information for staff and customers.
Kafka Migration for Satellite Event Streaming Data | Eric Velte, ASRC Federal (HostedbyConfluent)
ASRC Federal created the Mission Operator Assist (MOA) tool to extend human capabilities through AI/ML for NOAA. MOA ingests system log data from on-orbit satellite constellations and applies machine learning to greatly improve real-time situational awareness. MOA uses a collection of tools, including Kafka for multi-subscriber communications, all hosted through AWS Cloud Services and Kubernetes Containers for microservices. Like many traditional on-premises systems, satellite ground station operations are undergoing a renaissance as they increasingly become enabled by cloud.
During this session, the audience will learn about the satellite communications chain, along with best practices and lessons learned in building a Kafka data pipeline for high throughput and scalability while presenting high-quality situational awareness to mission operators. We will discuss our goals around establishing event-driven streaming for satellite logs, so that our machine learning becomes real-time, and around supporting a multi-subscriber approach across various Kafka topics. Listeners will also learn how the multi-subscriber approach helped us auto-scale Logstash and other microservices based on how many messages are in the queue.
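The auto-scaling behavior mentioned above, sizing Logstash workers by how many messages are queued, comes down to a lag-to-replicas decision. A minimal sketch of that decision logic (the thresholds, names, and clamping policy are illustrative assumptions, not ASRC Federal's actual configuration):

```python
import math

def desired_replicas(consumer_lag, msgs_per_replica=10_000,
                     min_replicas=1, max_replicas=12):
    """Map total Kafka consumer-group lag (messages produced but not yet
    consumed) to a worker count, clamped to a sane operating range."""
    wanted = math.ceil(consumer_lag / msgs_per_replica)
    return max(min_replicas, min(max_replicas, wanted))
```

In practice a controller would poll consumer-group lag per topic and feed it through a function like this, keeping at least one replica alive for low traffic and capping growth during bursts.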
An Energy Efficient Data Transmission and Aggregation of WSN using Data Proce... (IRJET Journal)
The document proposes a system for efficient data transmission and aggregation in wireless sensor networks (WSNs) using MapReduce processing. Sensors are grouped into three clusters, with a cluster head elected in each based on distance, memory, and battery to reduce energy consumption. Sensor data is encrypted and sent to cluster heads, which aggregate the data and append a signature before sending to the base station. The signature is verified and data is stored in Hadoop and processed using MapReduce. The system aims to provide data integrity and privacy during concealed data aggregation to reduce overhead in heterogeneous WSNs.
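The cluster-head election described above, choosing the node in each cluster with the best combination of distance to the base station, available memory, and remaining battery, can be sketched as a simple weighted score (the weights and field names are illustrative assumptions, not the paper's exact formula):

```python
def elect_cluster_head(nodes, w_dist=0.4, w_mem=0.3, w_batt=0.3):
    """Pick the node with the highest fitness; a shorter distance to the
    base station is better, so distance enters the score inverted."""
    def score(n):
        return (w_dist / (1.0 + n["distance"])
                + w_mem * n["memory"]
                + w_batt * n["battery"])
    return max(nodes, key=score)

cluster = [
    {"id": "s1", "distance": 10.0, "memory": 0.5, "battery": 0.9},
    {"id": "s2", "distance": 2.0,  "memory": 0.8, "battery": 0.7},
    {"id": "s3", "distance": 5.0,  "memory": 0.9, "battery": 0.2},
]
head = elect_cluster_head(cluster)
```

Here s2 wins: it is closest to the base station and has no critically weak resource, which matches the stated goal of spending the least energy on the aggregation-and-forwarding role.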
IRJET - Big Data Processes and Analysis using Hadoop Framework (IRJET Journal)
This document discusses issues with analyzing sub-datasets in a distributed manner using Hadoop, such as imbalanced computational loads and inefficient data scanning. It proposes a new approach called Data-Net that uses metadata about sub-dataset distributions stored in an Elastic-Map structure to optimize storage placement and queries. Experimental results on a 128-node cluster show that Data-Net provides better load balancing and performance for various sub-dataset analysis applications compared to the default Hadoop implementation.
The document is a resume for Dejan Neskovic, who has over 16 years of experience as a senior data scientist and system engineer working on projects for the FAA and DHS. He has led teams developing predictive models, geospatial analysis tools, and other innovative solutions to support air transportation and border security. Current areas of focus include predictive threat modeling, radar coverage assessment, and statistical analysis to improve departure time predictions.
A Survey of Agent Based Pre-Processing and Knowledge Retrieval (IOSR Journals)
Abstract: Information retrieval has become a major task as the quantity of data grows at tremendous speed. Managing and mining knowledge for different users according to their interests is a goal of every organization, whether it works in grid computing, business intelligence, distributed databases, or any other domain. Software agents have proved to be a strong pillar in achieving this goal of extracting quality information from large databases. Over the decades, researchers have applied multi-agent concepts to the data mining process by focusing on its various steps. Among these, data pre-processing is the most sensitive and crucial step, because the quality of the retrieved knowledge depends entirely on the quality of the raw data. Many methods and tools can pre-process data automatically using intelligent (self-learning) mobile agents in both distributed and centralized databases, but several quality factors still need attention to improve the quality of the retrieved knowledge. This article reviews the integration of the two emerging fields of software agents and knowledge retrieval, with a focus on the data pre-processing step.
Keywords: Data Mining, Multi Agents, Mobile Agents, Preprocessing, Software Agents
World Pipelines - Better Together - SCADA and GIS (smrobb)
This document discusses how geographic information systems (GIS) and supervisory control and data acquisition (SCADA) systems can work together to improve pipeline operations. Traditionally, pipeline operators have relied on SCADA alone, but integrating SCADA data with GIS capabilities offers significant benefits. The combination allows operators to view pipeline assets and real-time operating conditions within an accurate geospatial context. Linking GIS and SCADA without data duplication also reduces long-term costs while providing operators a comprehensive picture to more effectively troubleshoot problems and dispatch field crews. Pipeline companies are now able to realize improved logistics, decision-making, and overall operational efficiency by integrating their GIS and SCADA systems.
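Linking SCADA and GIS without data duplication, as described above, typically means keeping each system authoritative for its own records and joining them at read time on a shared asset ID. A minimal sketch (the asset IDs and fields are hypothetical):

```python
# GIS remains authoritative for asset geometry and static attributes ...
gis_assets = {
    "VLV-101": {"type": "valve", "milepost": 12.4, "lat": 35.02, "lon": -90.11},
}
# ... while SCADA remains authoritative for real-time readings.
scada_readings = {
    "VLV-101": {"pressure_psi": 640, "status": "OPEN"},
}

def asset_view(asset_id):
    """Read-time join: one comprehensive picture of an asset, with no
    records copied from one system into the other."""
    view = dict(gis_assets.get(asset_id, {}))
    view.update(scada_readings.get(asset_id, {}))
    return view

v = asset_view("VLV-101")
```

Because neither store duplicates the other's data, an operator dispatching a field crew always sees current readings against current geometry, and there is no synchronization job to maintain.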
This memo provides details on the South East Natural Resource Information Portal (SENRIP) project. There are two parts to the project - establishing the SENRIP portal itself using technologies like SharePoint, and developing an integrated online weeds database. The weeds database project involves reviewing existing databases, analyzing user requirements, and developing a database that allows remote data entry and integration with GIS mapping. Technologies to be used include Windows Server, IIS, Access, SQL Server, and SharePoint.
Advertisement jakarta walk_in_interview_published_nov_20_2011 (Ade Herdiansah)
Huawei Technologies is a leading global provider of information and communications technology solutions. It has established advantages in telecom networks, devices, and cloud computing through customer-centric innovation and partnerships. Huawei is committed to creating maximum value for its customers by providing competitive solutions and services to telecom operators, enterprises, and consumers. The company's products and solutions have been deployed in over 140 countries serving more than one third of the world's population. Due to significant growth, Huawei is expanding its business in the region and is seeking highly motivated individuals for various positions.
The document discusses how geographic information systems (GIS) can be used in various aspects of civil engineering. It provides definitions of GIS and describes how GIS allows storage, analysis, and visualization of spatial data. It then discusses specific applications of GIS in infrastructure management over the project lifecycle, including planning, design, construction, and operations/maintenance. Additional applications discussed include transportation, landfill site selection, watershed management, town planning, and critical infrastructure protection.
William McGuyer has over 29 years of experience in geospatial analysis, software engineering, project management, and Oracle database administration. He currently provides GIS support to the Air Force at Wright-Patterson Air Force Base, standardizing environmental geospatial data. Previously he has held roles as a GIS analyst, project manager, and Oracle database manager for various contractors supporting the Department of Defense and Air Force. He has extensive experience with GIS and CAD software, relational databases, and technical project work.
This document summarizes the strategic plan and activities of the Honolulu Land Information System (Holis) department. Holis provides critical geospatial data and services to support business processes and decision making across the city. Key accomplishments in recent years include updating parcel and infrastructure data, producing maps, and developing web-based GIS interfaces. Holis aims to continue enhancing productivity through improved work processes, data management, and new technologies.
A Survey on Data Mapping Strategy for data stored in the storage cloud 111 (NavNeet KuMar)
This document describes a method for processing large amounts of data stored in cloud storage using Hadoop clusters. Data is uploaded to cloud storage by users and then processed using MapReduce on Hadoop clusters. The method involves storing data in the cloud for processing and then running MapReduce algorithms on Hadoop clusters to analyze the data in parallel. The results are then stored back in the cloud for users to download. An architecture is proposed involving a controller that directs requests to Hadoop masters which coordinate nodes to perform mapping and reducing of data according to the algorithm implemented.
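The map-and-reduce step at the heart of the pipeline described above can be illustrated in miniature: map each record to key/value pairs, group by key, then reduce each group. This is a toy stand-in for a Hadoop job (word counting, not the paper's actual algorithm) that runs without a cluster:

```python
from collections import defaultdict

def map_phase(records):
    """Map: emit (key, value) pairs; here, (word, 1) for each word."""
    for record in records:
        for word in record.split():
            yield word, 1

def reduce_phase(pairs):
    """Shuffle by key, then reduce each group (sum the counts)."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return {key: sum(values) for key, values in groups.items()}

counts = reduce_phase(map_phase(["big data", "big cluster"]))
```

On a real cluster the map calls run in parallel across nodes holding different data blocks, and the shuffle moves each key's values to the node running its reducer, which is where the parallel speedup comes from.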
Advertisement jakarta walk_in_interview_published_nov_20_2011Ade Herdiansah
Huawei Technologies is a leading global provider of information and communications technology solutions. It has established advantages in telecom networks, devices, and cloud computing through customer-centric innovation and partnerships. Huawei is committed to creating maximum value for its customers by providing competitive solutions and services to telecom operators, enterprises, and consumers. The company's products and solutions have been deployed in over 140 countries serving more than one third of the world's population. Due to significant growth, Huawei is expanding its business in the region and is seeking highly motivated individuals for various positions.
The document discusses how geographic information systems (GIS) can be used in various aspects of civil engineering. It provides definitions of GIS and describes how GIS allows storage, analysis, and visualization of spatial data. It then discusses specific applications of GIS in infrastructure management over the project lifecycle, including planning, design, construction, and operations/maintenance. Additional applications discussed include transportation, landfill site selection, watershed management, town planning, and critical infrastructure protection.
William McGuyer has over 29 years of experience in geospatial analysis, software engineering, project management, and Oracle database administration. He currently provides GIS support to the Air Force at Wright-Patterson Air Force Base, standardizing environmental geospatial data. Previously he has held roles as a GIS analyst, project manager, and Oracle database manager for various contractors supporting the Department of Defense and Air Force. He has extensive experience with GIS and CAD software, relational databases, and technical project work.
This document summarizes the strategic plan and activities of the Honolulu Land Information System (Holis) department. Holis provides critical geospatial data and services to support business processes and decision making across the city. Key accomplishments in recent years include updating parcel and infrastructure data, producing maps, and developing web-based GIS interfaces. Holis aims to continue enhancing productivity through improved work processes, data management, and new technologies.
A Survey on Data Mapping Strategy for data stored in the storage cloud 111NavNeet KuMar
This document describes a method for processing large amounts of data stored in cloud storage using Hadoop clusters. Data is uploaded to cloud storage by users and then processed using MapReduce on Hadoop clusters. The method involves storing data in the cloud for processing and then running MapReduce algorithms on Hadoop clusters to analyze the data in parallel. The results are then stored back in the cloud for users to download. An architecture is proposed involving a controller that directs requests to Hadoop masters which coordinate nodes to perform mapping and reducing of data according to the algorithm implemented.
A Survey on Data Mapping Strategy for data stored in the storage cloud 111
NASA MSFC Facilities GIS Brochure
National Aeronautics and Space Administration
Marshall Space Flight Center

Facilities GIS & Data Integration

Contacts

NASA MSFC Contacts

Tim Corn
Facilities Management Office Manager
256–544–9451

Charlotte Schrimsher
Project Manager
256–503–1610

Gary Rogers
Technical Manager
256–544–7955

Intergraph Corporation Contacts

Ron Harlow
Executive Manager
256–730–1521

Bill Mommsen
Program Manager
256–730–8179

Roy Interrante
Technical Manager
757–515–5883

Summary

The Facilities Management Office at NASA Marshall Space Flight Center maintains many maps, databases, and CAD drawings for its daily operations. However, these data sets are independent of each other and require specially trained users with complex software to query and report. A web application was developed using GIS and SVG technologies to bring pertinent live data to the user's desktop, without exhaustive training or the installation of numerous applications. Now decision makers at NASA can view maps and architectural floor plans, obtain floor and room information, find equipment, review maintenance work orders, locate personnel, create color-coded floor plans, and perform many other functions from one easy-to-use application.

National Aeronautics and Space Administration
George C. Marshall Space Flight Center
Huntsville, AL 35812
www.nasa.gov

NP-2007-05-70-MSFC
5-44319
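The summary above mentions color-coded floor plans delivered through GIS and SVG. A minimal sketch of how such a plan could be rendered is shown below; the room geometry, usage categories, and color palette are hypothetical stand-ins for the real CAD and Space Management data, not the brochure's actual implementation.

```python
# Sketch: render a color-coded floor plan as SVG.
# Room outlines and usage categories are hypothetical sample data.

# room id -> (x, y, width, height) in drawing units
ROOMS = {
    "101": (0, 0, 40, 30),
    "102": (40, 0, 40, 30),
    "103": (0, 30, 80, 20),
}

# room id -> usage category (would come from a space-management database)
USAGE = {"101": "office", "102": "lab", "103": "storage"}

COLORS = {"office": "#9ecae1", "lab": "#fdae6b", "storage": "#a1d99b"}

def floor_plan_svg(rooms, usage, colors):
    """Render each room as a colored rectangle in one SVG document."""
    parts = ['<svg xmlns="http://www.w3.org/2000/svg" width="200" height="120">']
    for room_id, (x, y, w, h) in rooms.items():
        fill = colors.get(usage.get(room_id), "#cccccc")  # gray if uncategorized
        parts.append(
            f'<rect x="{x}" y="{y}" width="{w}" height="{h}" '
            f'fill="{fill}" stroke="black"><title>Room {room_id}</title></rect>'
        )
    parts.append("</svg>")
    return "\n".join(parts)

print(floor_plan_svg(ROOMS, USAGE, COLORS))
```

Because SVG is plain text, a server can generate a themed plan like this on each request and send it straight to the browser, which is one reason the format suits live, query-driven floor plans.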
Objectives

The objectives of this application are to provide a web interface to Facilities Data, rapid data access, and a visual portal to multiple data sets, integrated in a way that facilitates information gathering and analysis. The application also makes business sense, in that it improves the efficiency and decision-making capabilities of the Facilities Management Office. This integration application uses existing databases and their associated business processes. The existing applications have developed and evolved through the years, and a significant amount of resources has been invested in them. Instead of migrating these databases into one central system, it was determined to leverage the current systems by developing methods to integrate their data sets into one application. Current databases and business applications remain in place, and users can now retrieve about 80% of the common reports generated by the current applications from this web portal.

Approach

The core approach to the development of this web-based Mapping, Interactive Floor Plan, and Data Portal was to augment the decision-making capability of the Facilities Management Office's existing data sets and their associated processes and applications. Each subsystem was reviewed with the data maintenance staff and other data users to determine the best approaches to access, report, and integrate data. The architecture of the application is scalable, so that new data access portals, components, functions, and applications can be added in the future.

Application

All databases, maps, CAD files, applications, system processes, etc. were reviewed, and a data flow model was developed. This model illustrates all linkages and data flows of the existing information systems that the Facilities Management Office accesses.

[Data flow diagram: business applications and their users (Map Layers, Environmental, Maximo Database, Space Management, Floor Plans) feed an Integration Hub, which serves the Web Application (GIS/Maps, Data Portal, Queries, Searches, Reports, Analysis) through a Security Module.]

An Integration Hub was developed to correlate the various non-homogeneous databases. For example, by querying the Maximo database to return a record set of all buildings to be replaced, and then joining the results to a GIS database, a thematic map highlighting the buildings to be replaced is generated. The web application consists of two major components: the Map Application and the Interactive Floor Plan Application. From the Map Application, queries and reports are generated from a Center and Building perspective. The user can drill down to more building information through the Interactive Floor Plan Application, where queries and reports are generated from a floor and room perspective. A Security Module was implemented to restrict access to sensitive data to specific users. The application was built using a modular approach; the modules include drawing and map navigation tools, web menu controls, map and floor plan layer controls, search tools, database reports, etc. Components can be added, removed, or modified depending on the customer's requirements and the structure of their databases.

Results

Previously, engineers and decision makers had to contact many individuals and request data in order to gather the necessary information. For example, they would contact the mapping department to request drawings of utility systems, aerials, and other maps; then the Environmental department for locations of monitoring wells and other concerns; then Asset Management for work orders and equipment reports; then Space Management for color-coded floor plans; then the Planning Department; and so on. Now decision makers can access all these data sets from their desktops and generate reports and maps. Pertinent data can be viewed and analyzed from new perspectives. Data managers can also dedicate more time to building and improving the integrity and reliability of their databases.
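The Integration Hub's cross-database correlation, querying Maximo for buildings flagged for replacement and joining the record set to GIS data, can be sketched as follows. This is an illustrative sketch only: the table names, columns, and coordinate values are hypothetical, and two in-memory SQLite databases stand in for the real Maximo and GIS systems.

```python
import sqlite3

# Hypothetical Maximo-style asset table: building number and status.
maximo = sqlite3.connect(":memory:")
maximo.execute("CREATE TABLE assets (bldg_no TEXT, status TEXT)")
maximo.executemany("INSERT INTO assets VALUES (?, ?)",
                   [("4200", "REPLACE"), ("4203", "ACTIVE"), ("4487", "REPLACE")])

# Hypothetical GIS-style table: building number and footprint centroid.
gis = sqlite3.connect(":memory:")
gis.execute("CREATE TABLE footprints (bldg_no TEXT, centroid TEXT)")
gis.executemany("INSERT INTO footprints VALUES (?, ?)",
                [("4200", "34.646,-86.674"), ("4203", "34.647,-86.671"),
                 ("4487", "34.630,-86.665")])

def buildings_to_replace(maximo_db, gis_db):
    """Query Maximo for the record set of buildings to be replaced,
    then join each building number to its GIS footprint."""
    flagged = [row[0] for row in maximo_db.execute(
        "SELECT bldg_no FROM assets WHERE status = 'REPLACE'")]
    themed = []
    for bldg in flagged:
        geom = gis_db.execute(
            "SELECT centroid FROM footprints WHERE bldg_no = ?", (bldg,)).fetchone()
        if geom:
            themed.append((bldg, geom[0]))
    return themed  # feature list a thematic map layer could highlight

print(buildings_to_replace(maximo, gis))
# [('4200', '34.646,-86.674'), ('4487', '34.630,-86.665')]
```

The key design point the brochure describes is that neither source database is migrated or duplicated: the hub issues a query to one system and joins the results against another at request time, so each office keeps maintaining its own data.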