This document proposes an Intelligent Data Entry and Acquisition (IDEA) system to help with on-site highway maintenance and construction. It describes an architecture using wearable computers and sensors to collect asset data in the field, process it using pattern recognition, and upload it to centralized databases. Field workers could use tools like digital notepads, cameras, and GPS to gather location-tagged images, notes and condition reports on assets, which the IDEA system would then analyze and integrate into maintenance planning databases back at the office. The goal is to streamline data collection and improve safety, productivity and data quality for tasks like infrastructure inspections.
This document introduces SQL-H, which enables SQL analytics on Hadoop. It offers a primer on HCatalog and Aster, defines SQL-H, and gives examples of SQL-H usage. SQL-H allows direct access to HCatalog tables from within AsterDB, providing full SQL support and integration with BI tools on data stored in Hadoop. It performs reads from HCatalog in a distributed, native manner without using MapReduce.
This document provides status updates on various client projects including:
1. BI governance and data management projects to improve effective BI delivery.
2. Decoupling applications from BI and increasing user enablement through operational reporting, BI reporting/analytics, and data mining.
3. Updates on OLAP cube capability, data quality, security, and infrastructure including ETL processes and hierarchy management.
4. Planned timeline for Q1-Q4 2012 showing projects like HANA implementation, CCM for information management, and MDM stabilization.
HCLT Brochure: E-Discovery and Document Review Solutions, HCL Technologies
With litigation expected to increase due to the economy, corporations and law firms are increasingly concerned with cost-effective, high-quality electronic discovery (“e-discovery”) solutions. With document review fees accounting for 70% of total litigation cost, corporations and law firms must select innovative document review solutions to stay within budget. Simple Solutions’ e-Discovery and Document Review Services provide corporations and law firms with high-quality, cost-effective document review services that give them the cost certainty needed to stay within budget.
e-Discovery companies are leveraging cloud computing and deployment of Software as a Service (SaaS) platforms with focus on back office services to improve legal compliance service levels.
Download our e-Discovery and Document Review Solutions Brochure to understand how HCL focuses on creating efficient and cost-effective document review solutions by marrying e-discovery with document review.
Big Data launch keynote Singapore, Patrick Buddenbaum, IntelAPAC
The document describes Intel's open platform for next-generation analytics called the Intel Distribution for Apache Hadoop software. The platform delivers hardware-enhanced performance and security for Apache Hadoop and enables partners to innovate in data analytics. It strengthens the Apache Hadoop ecosystem and helps organizations unlock value from data.
Big Data launch Singapore, Patrick Buddenbaum, IntelAPAC
The document discusses Intel's Open Platform for Next-Gen Analytics. It introduces Intel's Distribution for Apache Hadoop software, which delivers optimized performance, security, and ease of deployment for Apache Hadoop. The software is backed by Intel's portfolio of data center products and contributes enhancements to the open source Apache Hadoop ecosystem. The distribution enables partners to innovate on analytics solutions.
Big Data from CSC's perspective, CSC Representative, IBM Danmark
This document provides an overview of CSC's Netezza case study for a client in Zurich. It discusses how the client was struggling with performance issues on their DB2 database. CSC conducted a proof of concept that showed Netezza and Teradata providing significant performance improvements over DB2. Netezza was ultimately chosen due to cost and compatibility factors. The implementation of Netezza reduced the client's month-end processing time from 9 days to 3 days and improved query performance dramatically. Future plans include migrating more systems to Netezza and taking advantage of upcoming Netezza upgrades.
This document discusses the role of geospatial technology in safely and effectively delivering major events. It notes that major events face unique challenges related to scale, organizational complexity, risks, and balancing conflicting requirements. It provides examples of how geospatial technology can help with security, operational decision support, developing a common operational picture, and maximizing the use of geospatial information to improve outcomes.
This document provides an overview of offerings from AST Global related to data storage, data centers, and electromagnetic field shielding. The offerings include modular data center systems for enterprises and small/medium businesses, data center services for operations, management, and energy efficiency, and solutions for attenuating electromagnetic fields through smart shielding products and consulting services.
The document summarizes a case study of Credium, a financial services company, implementing a cloud backup and disaster recovery solution with GTS. The solution involved continual backup of Credium's critical systems and data to a remote virtual environment hosted by GTS. This minimized Credium's risk from catastrophic failures by allowing operations to quickly failover to the backup systems. The flexible virtual platform also provided scalable capacity without requiring Credium to invest in unused backup hardware. Overall, the solution helped Credium achieve reliable disaster recovery protection with minimal costs.
This 4-page success story describes how private health insurer HBF selected the Hitachi Virtual Storage Platform to deliver a seamless user experience when transitioning from physical to virtual desktops, and to gain performance tiering and dynamic pooling efficiencies.
For more information on Virtual Storage Platform please visit: http://www.hds.com/products/storage-systems/hitachi-virtual-storage-platform.html?WT.ac=us_mg_pro_hvsp
The document discusses a unified data architecture that enables any user to access and analyze any data type from data capture through analysis. It describes using a discovery platform to enable interactive data discovery on structured and unstructured data without extensive modeling. It also describes using an integrated data warehouse for cross-functional analysis, shared analytics, and lowest total cost of ownership. Finally, it provides examples of using the architecture for IPTV quality of service analysis, including predictive models using decision trees and naive Bayes.
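The presentation's actual IPTV models are not reproduced in this summary; purely as an illustration of the naive Bayes technique it mentions, a minimal Gaussian naive Bayes classifier over hypothetical QoS features (say, packet loss and jitter) might be sketched as:

```python
import math
from collections import defaultdict

def fit_gaussian_nb(rows, labels):
    """Estimate class priors and per-feature mean/variance from training rows."""
    by_class = defaultdict(list)
    for x, y in zip(rows, labels):
        by_class[y].append(x)
    stats = {}
    for y, xs in by_class.items():
        cols = list(zip(*xs))
        means = [sum(c) / len(c) for c in cols]
        vars_ = [sum((v - m) ** 2 for v in c) / len(c) + 1e-9  # smoothed
                 for c, m in zip(cols, means)]
        stats[y] = (len(xs) / len(rows), means, vars_)
    return stats

def predict(stats, x):
    """Return the class maximizing log-prior plus Gaussian log-likelihood."""
    def log_post(entry):
        prior, means, vars_ = entry
        lp = math.log(prior)
        for v, m, var in zip(x, means, vars_):
            lp -= 0.5 * math.log(2 * math.pi * var) + (v - m) ** 2 / (2 * var)
        return lp
    return max(stats, key=lambda y: log_post(stats[y]))

# Hypothetical QoS rows: [packet_loss_pct, jitter_ms]
train = [[0.1, 5], [0.2, 6], [0.1, 4], [5.0, 50], [6.0, 55], [4.5, 48]]
labels = ["good", "good", "good", "degraded", "degraded", "degraded"]
model = fit_gaussian_nb(train, labels)
```

A decision-tree model, the other technique the summary names, would instead learn thresholds on individual features rather than per-class Gaussian likelihoods.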
Ipscf2011 I K Price Infor10 EAM v10.1 Roadmap, ricardorodalves
The document provides an overview of Infor's EAM solution roadmap and details on version 10.1. It discusses Infor10 Workspace, which offers in-context business intelligence, collaboration tools, and a unified user experience. The roadmap aims to deliver improved asset lifecycle management, work management, analytics and reporting capabilities. Version 10.1 will focus on enhancing the mobile workforce and flexibility of the solution.
The document describes an IT portfolio management solution called IT Discovery that provides a centralized view of metadata from various IT systems. It extracts information from sources like code, databases, and logs, and integrates it into a repository. This enables managers to analyze applications, optimize resources, and improve communication across departments. IT Discovery runs on mainframes and supports various programming languages, databases and tools. It provides reporting and querying capabilities to help with tasks like license management, performance analysis and maintenance planning.
Embarcadero ER/Studio helps companies document and enhance existing databases, improve data consistency, effectively communicate models across the enterprise, and model more than just data. With many features that Sybase PowerDesigner lacks, ER/Studio brings clarity to complex data models.
The document discusses green IT and energy efficiency strategies at Austin Energy. It provides details on Austin Energy's initiatives to increase the use of renewable energy, improve energy efficiency in data centers and buildings, and reduce customers' energy use and carbon footprint through various rebate and incentive programs. The utility aims to power all of its facilities with 100% green energy by 2012 and meet 30% of the city's power needs through renewable sources by 2020.
Etzard Stolte and Ralph Schlapbach introduce the proof of concept provided by Hewlett-Packard Life Sciences for Phase 2 of the Pistoia Alliance Sequence Services project.
Paradox Routing Tool Overview and Quick Tour, paradoxsci
Overview of a truck routing and scheduling tool for shippers, third-party logistics service providers, logistics consulting companies, and private fleet operations.
Hadoop's Opportunity to Power Next-Generation Architectures, DataWorks Summit
(1) Hadoop has the opportunity to power next-generation big data architectures by integrating transactions, interactions, and observations from various sources.
(2) For Hadoop to fully power the big data wave, many communities must work together, including being diligent stewards of the open source core and providing enterprise-ready solutions and services.
(3) Integrating Hadoop with existing IT investments through services, APIs, and partner ecosystems will be vitally important to unlocking the value of big data.
Isis Papyrus Document Capture Solutions, Friso de Jong
Papyrus Capture was used to automate the processing of over 10,000 documents received daily by Telekom Austria. The documents were scanned and classified into over 30 different types using Papyrus Capture and FreeForm technology. Key index data such as customer number, area code, postal code, and telephone number was automatically extracted from the documents using intelligent recognition. Documents requiring human review were flagged for operator verification. The automated system provided transparency and improved processing capabilities, allowing Telekom Austria to achieve new processing targets and improve customer service.
Tackling big data with Hadoop and open source integration, DataWorks Summit
The document discusses Talend's goal of democratizing integration and big data. It describes how big data involves transactions, interactions, and observations from diverse sources, requiring a different approach than traditional data integration. Talend aims to make big data accessible to everyone with its open source Talend Open Studio for Big Data, which improves the efficiency of designing big data jobs with intuitive interfaces and generates code to run transforms within Hadoop. Because big data projects can magnify poor data quality, Talend recommends incorporating data quality checks into loading processes or running them as separate MapReduce jobs.
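Talend's generated jobs are not shown in this summary; as a hedged sketch of the kind of load-time data-quality check the summary describes (the field names and rules below are invented for illustration, not taken from Talend), a map-style filter could partition incoming records into clean and rejected sets:

```python
def validate_record(rec):
    """Return a list of rule violations for one record (empty list = clean)."""
    errors = []
    for field in ("customer_id", "timestamp"):  # hypothetical required fields
        if not rec.get(field):
            errors.append(f"missing:{field}")
    amount = rec.get("amount")
    if isinstance(amount, (int, float)) and amount < 0:
        errors.append("negative:amount")
    return errors

def partition(records):
    """The filter a load job (or a separate MapReduce pass) might apply."""
    clean, rejected = [], []
    for rec in records:
        errors = validate_record(rec)
        if errors:
            rejected.append((rec, errors))  # keep rejects with their reasons
        else:
            clean.append(rec)
    return clean, rejected
```

Routing rejects to a side output with their violation reasons, rather than silently dropping them, is one way to keep bad data from being magnified downstream.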
Tech Ed 09 - Arc302 - Analysis and Architecture, mhessinger
The document discusses the disconnect that often exists between architects and business analysts in defining system requirements. It argues that architects need to get more involved in the requirements process to help ensure all necessary requirements are considered. Closer collaboration between architects and analysts can yield benefits like reduced complexity, increased usability, and quicker time to market. The document provides examples of how architects can influence requirements, and the risks that arise if they do not increase their role in defining system needs.
The document discusses trends in big data and data management. It notes that data volume, velocity, variety, and value are increasing dramatically. This rapid growth is challenging IT to manage and analyze more complex data relationships in real time and at large scale. The document also discusses how new consumption models like cloud computing and storage virtualization can help reduce costs and better manage the explosion of data replication. It introduces Hitachi's accelerated flash storage and new HUS VM entry-level enterprise storage system to address these big data challenges.
The document discusses how datacenter networks are evolving from fixed, hierarchical designs optimized for client/server transactions to dynamic networks better suited to cloud computing and big data needs. This requires flattening network topologies, converging server and storage networks onto high-speed Ethernet fabrics, and introducing more intelligence and flexibility at the network edge to support virtualized, application-driven workloads. The network must be able to quickly and reliably handle increased server-to-server traffic within the datacenter in order to enable real-time analytics across massive and diverse data sources.
This document provides an annotated list of presentations, courses, seminars, and workshops by MJD and TETRAD related to topics like disaster management, biothreat detection, counterterrorism, and humanitarian applications of science and technology. It describes formal courses taught at universities, as well as presentations given at conferences on subjects such as border security, emergency response, and connecting dots to locate terrorist operations. The document aims to provide information for organizing future training opportunities on issues covered in the materials listed.
Coordinated And Unified Responses To Unpredictable And Widespread Biothreats, martindudziak
Intelligent and rapid dissemination of information is essential for responding to CBRN threats but has been missing from most response plans. The CUBIT system provides a solution with its coordinated and unified approach. CUBIT uses sensors, analytics, diagnostics, treatments, and population control protocols incorporated as scalable and modular components that can dynamically interact. It employs principles of "plug and play" and adaptability to respond to unpredictable biothreats affecting populations when infrastructure is damaged.
The document describes a proposed system for detecting land mines using an unmanned aerial vehicle (UAV) equipped with multiple sensors, including magneto-optic thin-film sensors (MODE sensors) and video cameras. The system would analyze sensor data using pattern recognition software to identify locations of land mines. A key challenge is developing a modular system that can efficiently integrate different sensors onto a small UAV. The document outlines designs for a modular payload assembly and embedded computer system that could process sensor data and transmit it to ground control in real-time. Initial testing of MODE sensors would be conducted on a laboratory workbench to evaluate their effectiveness at detecting ferromagnetic objects like land mines from aerial images.
This document discusses a novel magneto-optic sensor called the MODE sensor that can be used for non-destructive testing of structural integrity. The MODE sensor uses thin films made of rare earth and transition metal oxides that have high magneto-optic properties, allowing it to detect cracks, fissures, and corrosion in structures. A portable system has been designed using this sensor to allow real-time inspection of bridges, fuel tanks, and other metal structures. The system includes image processing and pattern recognition capabilities to help identify defects.
This document proposes a technology using magneto-optic thin film sensors to study magnetic fields in deep space through wide-area arrays deployed by spacecraft. Each sensor would measure local magnetic fields and disturbances, with data communicated to reconstruct magnetic activity over large regions. The arrays could also control large space systems through parallel computing principles. The sensors use bismuth-substituted iron-garnet films that respond to magnetic fields through the magneto-optic Faraday effect, providing high sensitivity and domain wall velocity. Deployed arrays would allow unprecedented magnetic mapping beyond spacecraft's direct reach.
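For context, the magneto-optic Faraday effect that these films exploit follows the standard relation (this is textbook magneto-optics, not taken from the source document): light traversing a film has its polarization rotated by an angle proportional to the field component along the propagation path,

```latex
\theta = V \, B_{\parallel} \, d
```

where \(\theta\) is the rotation angle, \(V\) the material's Verdet constant, \(B_{\parallel}\) the magnetic flux density along the light path, and \(d\) the film thickness. Bismuth substitution in iron garnets raises the effective \(V\), which is why such films offer the high sensitivity the summary describes.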
The document describes a family of microinstruments being developed for use in space missions. The instruments use magneto-optic thin film sensors to perform tasks like non-destructive testing of spacecraft components, detecting electromagnetic fields, monitoring biomagnetic fields, and optical signal processing. Each sensor is based on a proprietary Fe-Ga thin film material and uses polarized light and a spatial light modulator. The sensors can detect magnetic fields as small as 10⁻⁷ Oersted and have applications in areas like defect detection, energy generation, medicine, and neural networks. The technology provides advantages over existing non-destructive testing methods by directly imaging defects in real time with high resolution and low false readings.
This document summarizes research on using magneto-optic imaging for non-destructive testing of metal structures. It describes developing new thin-film sensors with improved sensitivity, integrating the sensors and image recognition algorithms into a portable system, and applying a neural network algorithm called SONON to enhance defect detection in images. Laboratory experiments demonstrated the new sensors could detect smaller defects than previous methods. The overall aim is more accurate, automated inspections using portable, wearable equipment.
TETRAD Technologies Group provides solutions for critical event detection and situation awareness. It has three divisions focused on products for emergencies, origins of events, and types of threats. TETRAD develops sensors for chemicals, explosives, and biopathogens. It also offers software, bioprotection products, and consulting services. The company aims to understand complex asymmetric situations and provide fast, adaptive solutions exceeding expectations.
1. The document discusses scanning probe microscopy (SPM) techniques such as atomic force microscopy (AFM) and their applications in biomedical research.
2. SPM allows high-resolution imaging of surfaces and can be used to study cell topology, structures like cytoskeletons and membranes, and how electromagnetic fields impact cells.
3. Experiments aim to use AFM to image living cells over time and study phenomena like solitons and fractals at the microscopic level to gain insights into cell behavior and pathology.
Splunk is a big data company founded in 2004 that provides a platform for collecting, indexing, and analyzing machine-generated data. It has over 5,000 customers in over 80 countries across various industries. Splunk's software can handle large volumes of machine data, scaling to terabytes per day and thousands of users. It collects and indexes machine data from various sources like logs, metrics, and applications without needing prior knowledge of schemas or custom connectors.
How a Cloud Computing Provider Reached the Holy Grail of Visibility – eladgotfrid
CloudShare is a cloud computing provider that was seeking a centralized system for operational visibility across its infrastructure. It evaluated Splunk and found that Splunk could aggregate data from all its systems, correlate business and infrastructure data, and provide analytics and dashboards. Splunk helped CloudShare gain insights through real-time monitoring, historical analysis, and business intelligence dashboards for various teams. It provided a single platform to handle CloudShare's massive "big data" volumes and give the visibility needed across the organization.
Cetas Analytics as a Service for Predictive Analytics – J. David Morris
This document discusses how predictive analytics using big data can lead to successful recommendations and revenue maximization. It describes trends in data growth, the value of data analytics exceeding hardware costs, and how a unified analytics cloud platform can simplify infrastructure and optimize resources. Sample predictive analytics applications are outlined for industries like ecommerce, mobile, advertising, gaming, and IT, with the goal of revenue maximization and user engagement through recommendation engines and targeted placements. The cloudification of predictive analytics as an analytics-as-a-service approach is presented as the logical conclusion to fully leverage big data.
This document discusses how predictive analytics using big data leads to successful recommendations and revenue maximization. It outlines key trends like the growth of new data sources and analyzes how companies are using predictive analytics in applications like ecommerce, mobile, advertising, and gaming to optimize customer engagement and maximize profits. The document advocates taking predictive analytics to its logical conclusion through cloud-based analytics-as-a-service and leveraging big data to directly monetize insights from predictive modeling.
This document discusses big data use cases and business value. It provides examples of companies using big data across various industries like telecommunications, waste management, healthcare, and government. For each use case, it describes the data sources, techniques used like Hadoop, analytics, machine learning, and how it provides business value through increased revenue, cost reductions, or other benefits.
This document discusses integrating independent data sets at NASA Marshall Space Flight Center. Previously, data was siloed across different departments and systems with no linkages. The new application aims to rapidly gather information from various sources through a visual portal, while improving access, integration, decision making and efficiencies. It utilizes existing databases, business processes and software without migrating all data into a central system. An integration hub correlates non-homogeneous databases and regroups data sets into reports. An access control module allows restricted access to specific users.
The document summarizes a presentation on evolving a new analytical platform. It discusses defining the platform to include tools for the whole research cycle beyond just business intelligence (BI), with SQL Server 2008 R2 as an example of defining the platform. It also discusses what is working with existing platforms and what is still missing, including the need for more scalable data storage and processing.
The document describes DataFed, a federated data system that provides non-intrusive integration of diverse environmental datasets using open standards. DataFed allows users to find and access datasets through a catalog and flexible tools for processing and visualizing the data. It facilitates publishing, finding, and accessing geospatial and environmental data through loose coupling of autonomous nodes and OGC web service protocols.
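DataFed's OGC-based access pattern can be illustrated with a WMS GetMap request, the standard way a client pulls a rendered map layer from a federated node. The sketch below builds such a URL with the standard library; the endpoint and layer name are invented, not actual DataFed services:

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layers, bbox, width=512, height=512,
                   srs="EPSG:4326", fmt="image/png", version="1.1.1"):
    """Build an OGC WMS 1.1.1 GetMap request URL from its required
    parameters. bbox is (minx, miny, maxx, maxy) in the given SRS."""
    params = {
        "SERVICE": "WMS",
        "VERSION": version,
        "REQUEST": "GetMap",
        "LAYERS": ",".join(layers),
        "BBOX": ",".join(str(v) for v in bbox),
        "SRS": srs,
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical node and layer, continental-US bounding box:
url = wms_getmap_url("http://example.org/wms", ["aerosol_optical_depth"],
                     (-125.0, 24.0, -66.0, 50.0))
```

Because every node speaks the same protocol, the catalog only needs to store each dataset's endpoint and layer names; this is the "loose coupling" the summary refers to.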
The document discusses the need for a single data and events platform to handle high volumes of data and events. It describes GemStone Systems, which provides a distributed main-memory data management platform using a data fabric/grid. The platform allows applications to process and distribute large amounts of data and events at high speeds and scales linearly. It provides an example of using the platform for electronic trade order management to normalize, validate, aggregate and distribute trading data and events in real-time across clustered applications.
The document discusses DDS (Data Distribution Service), a middleware standard for distributed real-time systems. It provides an overview of DDS technology including its data-centric publish-subscribe model, quality of service capabilities, and ability to integrate external systems. The document also discusses RTI's implementation of the DDS standard and how it provides high performance, scalability and other benefits for building distributed real-time systems.
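The data-centric publish-subscribe model can be sketched in a few lines. This is a toy illustration of one DDS idea — a HISTORY-style quality-of-service policy that replays retained samples to late-joining readers — not RTI's API:

```python
from collections import defaultdict, deque

class Bus:
    """Minimal data-centric pub-sub sketch (not the RTI API).
    `depth` mimics a DDS HISTORY QoS policy: late-joining readers
    receive up to `depth` of the most recent samples per topic."""
    def __init__(self, depth=5):
        self.history = defaultdict(lambda: deque(maxlen=depth))
        self.readers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.readers[topic].append(callback)
        for sample in self.history[topic]:   # replay retained history
            callback(sample)

    def publish(self, topic, sample):
        self.history[topic].append(sample)   # retain, evicting oldest
        for cb in self.readers[topic]:
            cb(sample)

bus = Bus(depth=2)
bus.publish("temperature", 20.5)
bus.publish("temperature", 21.0)
seen = []
bus.subscribe("temperature", seen.append)    # late joiner gets history
bus.publish("temperature", 21.5)
# seen == [20.5, 21.0, 21.5]
```

Real DDS adds typed topics, discovery, and many more QoS dimensions (reliability, deadline, durability); the point here is only that readers couple to data, not to specific writers.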
The document discusses plans by BPMIGAS, Indonesia's upstream oil and gas regulator, to improve integration and management of petroleum resources data. It outlines a two-phase approach:
1) Implement an internal "PRIM" system for centralized GIS, online connectivity, analytics and decision support within BPMIGAS.
2) Expand to an external "PRM" system connecting BPMIGAS and oil companies through shared GIS maps, project management, standards and administration portals. The goal is reliable data sharing to optimize resource management and maintain national oil production targets.
As the core SQL processing engine of the Greenplum Unified Analytics Platform, the Greenplum Database delivers industry-leading performance for Big Data analytics while scaling linearly on massively parallel processing clusters of standard x86 servers. This session reviews the product's underlying architecture, identifies key differentiation areas, goes deep into the new features introduced in Greenplum Database Release 4.2, and discusses our plans for 2012.
The document discusses using tablets as part of an infrastructure to enable various business applications. It notes that a single tablet has limited uses but that connecting tablets to a backend platform allows for easier use, lower costs through volume purchases, and many new applications across business functions like sales, marketing, customer service, manufacturing and more. It provides examples of potential applications including learning management, signage, TV integration, monitoring, ticketing, sales catalogues, and more. The overall message is that a backend platform can significantly expand what is possible with a fleet of connected tablets versus individual standalone tablets.
The document provides an overview of IBM's Big Data platform vision. The platform addresses big data use cases involving high volume, velocity and variety of data. It integrates with existing data warehouse and master data management systems. The platform handles different data types and formats, provides real-time and batch analytics, and has tools to make it easy for developers and users to work with. It is designed with enterprise-grade security, scalability and failure tolerance. The platform allows organizations to analyze big data from various sources to gain insights.
Making your Analytics Investment Pay Off - StampedeCon 2012 – StampedeCon
At StampedeCon 2012 in St. Louis, Bill Eldredge of Nokia presents: At Nokia, we expect to save millions on avoided license fees this year on a single “Big Data” project by creating a symbiotic relationship between our traditional RDBMS storage and our newer Hadoop cluster. Our hybrid approach to data enables us to manage the convergence of structured and unstructured data, and save money. In our case we use Hadoop to process and import data into traditional systems. We have found that this use of Hadoop as a preprocessing engine has enabled maximum value to be derived from our systems, our data and our people.
SAP HANA and Apache Hadoop for Big Data Management (SF Scalable Systems Meetup) – Will Gardella
In this presentation I argue that the future of data management may see a split between (1) real-time in-memory systems such as SAP HANA for most enterprise workloads (2) disk-based free and open-source Apache Hadoop for certain specialized big data uses.
The presentation starts with a definition of what is intended by the term big data, then talks about SAP HANA and Apache Hadoop from the perspective of suitability for enterprise use with a special concentration on Hadoop. (The basics of SAP HANA were covered in the immediately preceding session). This is followed by a description of currently available SAP support for Apache Hadoop in SAP BI 4.0 and SAP Data Services / EIM. Due to time constraints I did not discuss Apache Hadoop support built into Sybase IQ.
Hadoop World 2011: Data Ingestion, Egression, and Preparation for Hadoop - Sa... – Cloudera, Inc.
One of the first challenges Hadoop developers face is accessing all the data they need and getting it into Hadoop for analysis. Informatica PowerExchange accesses a variety of data types and structures at different latencies (e.g. batch, real-time, or near real-time) and ingests data directly into Hadoop. The next step is to parse the data in preparation for analysis in Hadoop. Informatica provides a visual IDE to deploy pre-built parsers or design specific parsers for complex data formats and deploy them on Hadoop. Once the analysis is complete, Informatica PowerExchange delivers the resulting output to other information management systems such as a data warehouse. Learn in this session from Informatica and one of their customers how to get all the data you need into Hadoop, parse a variety of data formats and structures, and egress the resultant output to other systems.
This document outlines a proposal for a 6-month, $150,000 project to develop concepts of operations (CONOPS) for a Regional Environmental Biothreat Detection Network (REDBIONET). The network would integrate existing biodefense sensing systems and new diagnostic tools to enable early detection of biothreats through wildlife monitoring. Key components include adapting the RODS predictive system and integrating it with GITI's knowledge management tools. The proposal also describes integrating rapid diagnostic technologies, evaluating sensor placement options, and demonstrating a field-ready prototype to identify and respond to biothreats. Personnel are identified with relevant experience in pattern recognition, bioinformatics, and emergency response networks.
The document discusses a proposed mobile early warning system called Nomad EyesTM to detect and prevent nuclear terrorism. It argues that terrorism relies on networks and readily available technology. Radiation attacks are attractive due to their ability to cause social and economic disruption even without loss of life. The system would use mobile and wireless sensors to detect suspicious movements and shipments of radioactive and conventional materials. Data would be analyzed using various techniques like sensor fusion, graph theory, and gaming to identify potential threats while also providing emergency response capabilities. The goal is to develop a flexible, low-cost, and disruptive counterterrorism system.
This document discusses the Nomad Eyes project, which aims to use a network of mobile sensors and the general public to detect and prevent nuclear terrorism through early warning. The project would distribute radiation sensors that can attach to mobile phones to collect and transmit data. Games and advertising would encourage public participation. Collected data would be analyzed using graph theory and Bayesian methods to identify potential terrorist planning and threats. In the event of an attack, the network could quickly notify the public and route them to safety. The current status describes sensor prototypes, public engagement design, and network/database software development. The goal is to move terrorism prevention and response capabilities out of secure facilities and into the hands of the general public.
Global InfoTek will develop concepts of operations (CONOPS) for an Emergency Mobile Phone Incident Reporting System (EMPIRES) that leverages mobile phones and infrastructure to collect and share situational data from citizens during crises. The 6-month project will cost $90,000. Global InfoTek will conceptualize a system using mobile phones to collect incident reports, environmental effects data, and real-time audio/video from citizens. They will integrate existing programs and technologies to disseminate this data to emergency responders through an integrated display. Global InfoTek will focus on communication challenges faced by responders during crises when infrastructure may be unavailable and develop solutions using emerging technologies like sensors and GPS on mobile phones.
The document describes the I3BAT and Nomad Eyes systems, which are designed to incorporate terrorist thinking and tactics to help prevent terrorist attacks. Nomad Eyes would involve widely distributing sensors and collecting data using mobile phones and other devices. This data would then be analyzed using statistical and mathematical models to identify patterns that could link people, objects, and events and help forecast terrorist plans and activities. The goal is for the general public to help detect threats through passive and anonymous data collection using everyday devices to supplement formal security and law enforcement efforts.
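The Bayesian side of such analysis can be illustrated with a chain of belief updates over independent sensor reports. All probabilities below are invented for illustration and do not come from the source:

```python
def posterior(prior, p_reading_given_threat, p_reading_given_benign):
    """Single Bayes update: P(threat | reading) from one sensor report."""
    num = p_reading_given_threat * prior
    return num / (num + p_reading_given_benign * (1.0 - prior))

p = 1e-4                      # assumed base rate of a genuine threat pattern
reports = [(0.90, 0.05),      # (detection rate, false-alarm rate) per sensor,
           (0.80, 0.10),      # treated as independent; all values invented
           (0.95, 0.02)]
for tp, fp in reports:
    p = posterior(p, tp, fp)
print(round(p, 3))            # 0.406
```

Even with a tiny prior, a few corroborating reports raise the posterior dramatically — which is exactly why wide, passive sensor distribution is attractive, and why false-alarm rates dominate the design.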
This document discusses the concept of ecosymbiotics, which aims to integrate economic profitability with environmental and social sustainability. It argues that education, basic research, environmental protection, and economic development are interdependent and should be viewed holistically. Ecosymbiotics proposes developing commercial innovations through collaborative, interdisciplinary research that also benefits education and future generations. The goal is to move beyond dependence on non-profit funding and directly link basic scientific progress with business and capital growth in a mutually sustainable way.
The document outlines a seminar on how quantum events may play a role in coherent biomolecular systems. It discusses several topics: (1) introducing motivations around reconciling quantum mechanics and relativity in biological systems; (2) exploring quantum network dynamics and structures like solitons that could provide stability; and (3) investigating chiral and tensegrity-stable solitons in higher dimensions that may model quantum networks sustaining topological identities. The goal is to better understand intracellular control and signaling at the quantum scale.
The document outlines a theory of topological process dynamics and its applications to biosystems. It discusses how a stable spacetime emerges from a quantum process flux described as a "spin glass" of topological 3-surface regions. Below certain length scales, p-adic numbers and an ultrametric topology are hypothesized to apply, with favored p-adic primes corresponding to physically important length scales like those seen in biological structures. The length scale hypothesis proposes lengths scales of L(p) = sqrt(p) * L0 that match observations of elementary particles, cells, viruses, and nanobacteria. P-adic topology is proposed below these scales with continuous classical spacetime emerging at larger scales.
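The stated hypothesis L(p) = sqrt(p) * L0 is easy to compute. The sketch below uses two Mersenne primes as examples of favored primes and an arbitrary base scale L0, so only the ratio between scales is meaningful:

```python
import math

def padic_length(p, L0=1.0):
    """Length scale hypothesis from the text: L(p) = sqrt(p) * L0.
    L0 is an unspecified base scale; set to 1 so only ratios matter."""
    return math.sqrt(p) * L0

# Two Mersenne primes, chosen here only to illustrate the scaling:
# sqrt(2**127 - 1) / sqrt(2**107 - 1) is (to double precision) 2**10.
ratio = padic_length(2**127 - 1) / padic_length(2**107 - 1)
print(ratio)  # 1024.0
```

The square-root scaling means primes spanning twenty binary orders of magnitude map to length scales only three decimal orders apart, which is how a discrete set of primes can tile the range from elementary particles to cells.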
This document summarizes a study on pattern recognition and learning in networks of coupled bistable units. The network is composed of N oscillators moving in a double-well potential, with pair-wise interactions between all elements. Two methods are used for training the network: (1) constructing the coupling matrix using Hebb's rule based on stored patterns, and (2) iteratively updating the matrix to minimize error between applied and desired patterns. Graphs show the learning rate converges as mean squared error and coupling strengths decrease over iterations.
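Hebb's rule from training method (1) can be sketched as follows, using simple sign-threshold units in place of the paper's double-well oscillators; the stored pattern and noisy probe are illustrative:

```python
def hebb_matrix(patterns):
    """Coupling matrix J[i][j] = (1/N) * sum over patterns of
    xi[i] * xi[j], with zero diagonal (Hebb's rule)."""
    n = len(patterns[0])
    J = [[0.0] * n for _ in range(n)]
    for xi in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    J[i][j] += xi[i] * xi[j] / n
    return J

def recall(J, state, steps=5):
    """Synchronous sign-threshold updates relax toward a stored pattern."""
    n = len(state)
    s = list(state)
    for _ in range(steps):
        s = [1 if sum(J[i][j] * s[j] for j in range(n)) >= 0 else -1
             for i in range(n)]
    return s

stored = [[1, -1, 1, -1, 1, -1]]
noisy = [1, -1, 1, -1, -1, -1]        # one flipped unit
print(recall(hebb_matrix(stored), noisy))  # recovers [1, -1, 1, -1, 1, -1]
```

Method (2) from the summary would instead start from a random J and iteratively adjust couplings to shrink the error between applied and desired patterns; the converging mean squared error in the paper's graphs tracks that process.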
This document discusses a hypothesis that molecular dynamics across neural membranes and cytoskeletal structures provide a matrix for self-organized behavior and information processing in the brain. Specifically:
1) Patterns of molecular activity may form stable solitons or "chaotons" capable of storing information over time, providing a basis for learning, memory, and consciousness.
2) These solitons could behave in a self-similar way across complexes of neurons operating within synapto-dendritic field activity.
3) Atomic force microscopy may help experimentally confirm theoretical models of these solitons and emergent structures in subcellular processes.
Evolutionary IED Prevention 09 2006 Updated W Comments Jan2010 – martindudziak
The document provides an overview of a presentation given at a 2006 conference on evolutionary detection and prevention of improvised explosive devices (IEDs) and related terrorist weapons. The presentation discusses the need for detection systems that can (1) think ahead of terrorists rather than just react, (2) detect multiple substances in diverse environments, and (3) be usable by non-technical users. It also examines challenges like evolving weapon technologies and effectiveness. The presentation proposes solutions like a single, reconfigurable detection technology that can integrate with existing systems.
The document discusses applying geospatial representation and forecasting models to improve chemical, biological, radiological, nuclear and explosive (CBRNE) defense. It proposes integrating CBRNE prediction, detection, and countermeasures with geospatial analysis. This would allow incorporation of mobile, wireless, and portable technologies. The goal is a smooth transition between combat, post-combat and civilian CBRNE situations. Challenges include differences between field and domestic environments and issues with sensors. The document outlines several proposed technologies, including the Nomad Eyes architecture for distributed sensor deployment using inverse modeling. It also discusses the ADaM software for real-time data processing and sensor devices like the portable OPA for chemical detection.
Increased Vulnerability To Nuclear Terrorist Actions 20july07 – martindudziak
This document discusses the increased vulnerability of the continental United States to nuclear terrorism. It outlines three types of potential nuclear terrorist attacks: 1) an atomic bomb, 2) a dirty bomb using conventional explosives to disperse radioactive material, and 3) a passive radiation exposure device to contaminate areas without explosives. It argues that current container scanning methods are ineffective and may unintentionally aid terrorists. A recent assassination using polonium-210 demonstrated how easily radiation can spread, and terrorists could replicate this on a larger scale. The supply chain for nuclear materials is more accessible than assumed, as terrorist groups do not need state-level resources and can learn from incidents like the assassination. Overall, the document warns that the US remains vulnerable to nuclear terrorism.
The document proposes a community program called SAFETY NET that utilizes cell phones and civilian awareness to more quickly detect and respond to emerging crisis situations through basic data collection and analysis, with the goal of preventing violence, abuse, and crime. The program could be implemented at nearly no cost while providing benefits to both community users and service providers. It aims to address issues like domestic abuse, gang activity, and terrorism by getting to the root causes within families and communities.
This document proposes using mutual information techniques in a hierarchical, staged approach to improve deformable registration of medical images for clinical applications. It involves pairing different object models corresponding to image components and measuring their mutual agreement through deformations to maximize mutual information at each stage. Overcoming failures at one stage involves introducing another level based on results from previous pairings. The goal is to leverage established algorithms and databases to improve registration accuracy and utility for clinicians without overcomplicating protocols.
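The mutual information objective underlying such registration can be computed directly from a joint histogram: a candidate deformation is better when the paired intensities of the two images are more predictive of one another. A minimal sketch over aligned intensity samples (tiny invented data, not medical images):

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """I(X;Y) = sum over (x,y) of p(x,y) * log2(p(x,y) / (p(x)*p(y))),
    estimated from aligned samples (e.g. overlapping voxels of two
    images under a candidate deformation)."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# A perfectly aligned two-level "image" yields 1 bit of shared information;
# a misregistered pairing with independent intensities yields none.
a = [0, 0, 1, 1, 0, 0, 1, 1]
aligned = mutual_information(a, a)                      # 1.0
misaligned = mutual_information(a, [0, 1, 0, 1, 0, 1, 0, 1])  # 0.0
```

The staged approach in the proposal would maximize this quantity pair by pair, escalating to another object-model level only when a stage fails to improve it.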
This document describes a flexible polymer film-based sensor network system called SenseNet for medical monitoring applications. The SenseNet uses polymer film patches laminated with wireless communication components and embedded visual sensors based on compound eye technology. These sensor patches can be applied in various medical environments both internal and external to continuously monitor patients and procedures in situations where staffing may be limited. The visual sensor units take inspiration from insect eyes to create compact imaging systems. The overall system architecture involves individual sensor patches communicating data via wireless networks to a central server and database for real-time monitoring and historical recording by medical staff and observers.
The document proposes Project Sano y Salvo, an efficient and cost-effective international program to prevent child abuse, corruption, and slavery using new technologies. It would use marketing strategies to promote educational activities on the internet that build strengths against these issues. By embedding lessons and games targeting children, families, and society, it could help strengthen communities and circumvent crimes before they happen. The proposal suggests piloting the program jointly in Costa Rica and the US due to their challenges and opportunities, with support from government, corporate, and nonprofit partners.
This document discusses detecting trigger points and irreversible thresholds in shock and trauma patients during catastrophic events when clinical infrastructure may be limited. It proposes that unstable recurrent patterns in physiological parameters could serve as early indicators of critical conditions. The document reviews using models like the Kuramoto-Sivashinsky equation to identify recurrent patterns in dissipative systems and associates these patterns with medical conditions to aid triage and forecasting needs under adverse conditions with sparse data. Further work is needed to determine relevant physiological parameters and associate recurrent patterns in those parameters with medical outcomes.
This document proposes a plan for an international corporate and government consortium to provide technical support to organizations helping women in Afghanistan. The plan would involve setting up portable satellite-linked sites in villages to jumpstart medical and educational programs. The sites would be assembled by volunteers and transported to Afghanistan, where they would be installed and handed over to local user organizations. The proposal estimates a budget of $40,000 per village supported and identifies potential partners and funding sources to implement the project.
Nomad Eyes is an architecture for early warning, prevention, and response to threats. It uses both inverse and forward reasoning to detect anomalies within predictable systems and unstable patterns within nonlinear systems. The fundamental model is based on using different approaches like total isolation, vaccination, camouflage, and understanding the enemy. It aims to create associations that match expected sequences of activity consistent with planned terrorist attacks. A key principle is to create models from the attacker's perspective to treat information flow as an encrypted process.
1. Smart and Secure Data Access for On-Site Highway Maintenance and Construction
Intelligent Cooperative Data Acquisition and Identification
Dr. Martin Dudziak
Parikh Advanced Systems, Inc.
September, 1998
2. Presentation Outline
System and Enterprise Issues
IDEA (Intelligent Data Entry and Acquisition) Architecture
Practical Implementation Scenario
10/10/12 Parikh Advanced Systems, Inc.
3. Data Collection and Currency are more than a Network and Systems Performance Issue
4. Enterprise-Level Premises and Constraints
Field-based operations and management vs. centralized control structures
Integration of computer technology with nuts-and-bolts technology
Must be oriented to more than the IT-proficient users of the departments and divisions
5. Platform Independence and Versatility
“I need a hands-free PC for this job”
“This is not the belt-pack wearable PC I used last week but it will do”
“This job needs pictures and video and all we have is this serial-port camera interface”
“I’m going to be in the field all day and need a longer-life extra battery”
“I can’t be bothered with a display panel; where’s the headset?”
6. Hands-free and Wearable - Why?
Full-power for computing and communication
Desktop capabilities for not much more $
Speech interface that fits Everyman
Displays that fit the task - headset, flat-panel, flexi-wrap
Expandable and modular
8. 3 Areas of Integration and Improvement
Speech, Paper Notes, Sketches, and Vision
Security and Simplicity of the SmartCard
Assisted Intelligence & Intelligent Assistance
9. Generic Inventory-Oriented Data Collection
Van-based Video Recording
Extra-vehicular Video/Still Recording
Location-Indexed Image/Loc Recs
Interactive ID/Condition Resolution System
Maintenance Database Entries
10. Early Version of IDEA -- Projected IDAAT Work Flow
Field (Mobile): Field Collection Units
Base Station: RDBMS Data Server
11. Projected IDAAT Data Flow (cont.)
Field (Mobile) Collection Unit
Data Cards
Field (Mobile) Base Station
GIS, Inventory, Maintenance, Financial, Planning Databases and Decision Support Systems
12. IDAAT Network Overview
In the Field (local/manual connectivity):
Asset Counting Unit (BarCode)
Asset Counting Unit (MagStrip Reader)
Asset Resources Inventory Unit (PDA)
Portable GPS Field Unit
Pen-based Portable Digital Notepad
Body-Wearable PC
In Regional and Central Offices (high-speed LAN/WAN (ATM) network):
Maintenance Training Workstation
GIS, Spatial, Address Data
IDAAT Relational/Object-Oriented Base Station Data Server
CAD/Map Publishing Server
13. IDAAT Base Station Architecture
Pentium Workstation (desktop minitower): 64 MB RAM, 5 GB disk, CDROM, modem, spare bays, 2 serial ports
PCMCIA (PC Card) Reader or equivalent
PDA Docking Station or equivalent
Magstrip/Barcode Reader Docking Port
Video Board with Camera Port(s)
Standard PC I/O Ports and Devices
Connectivity: Direct Network; Dial-Up Modem (land-line or wireless)
UPS power backup for mobile (vehicle-based) operations
14. PDA/HPC Connectivity and Data Transfer
Cradle/Docking Station with standard synchronization software
PC Card or Docking Station connectivity
PC Card exchange or internet connectivity between Field (Mobile) unit and Base Station
Internet or LAN connectivity
15. Base Station to Network Connectivity
The Field (Mobile) Base Station links to:
District network, sharing of incident and asset status multimedia data
Central data server(s) linking to IDAAT and other central systems with GIS and full asset data acquisition data sets collected in the field
Financial and accounting system, providing task and timekeeping data
16. Intelligent Data Entry and Acquisition (IDEA)
Van-based Recording; Extra-vehicular Recording
Asset-Targeted Image/Loc Recs
Automated Asset ID and Condition Assessment
Acceptances: IDAAT Database Entry
Rejections: Interactive ID/Condition Resolution; Resolutions to IDAAT Database Entry
17. IDEA System Architecture and Interfaces
Wireless Internet
GPS Tracking
Video/Still Cam
Voice Input
Linear Ref Data
Body-wearable PC
Instrumentation Van Docking Station
IDAAT Data Collection Post-Processing
18. IDEA in Action - Collection
Target Acquisition and Identification
Image/video (optional)
GPS location (automatic)
Asset characterization
Notepad sketches (optional)
Voice input (optional)
19. Intelligent Data Entry and Acquisition System
Integrated Maintenance Management System and Databases (typically Oracle 8 on RS/6000 or similar platform), fed from the District Field Office PC (Base Station)
1. WorkCard placed in the Base Station PC Card Reader
2. Task dataset loaded onto WorkCard in Base Station PC
3. WorkCard carried to the Wearable PC, with CardReader built-in or as plug-in
4. In-field data collection process (GPS, keyboard input, camera or video, voice input, internet access); data processed on PC and stored on WorkCard
5. Work completed and WorkCard time-stamped and ready for upload through Base Station
6. WorkCard returned to Base Station PC for upload
21. From Pad to Form to Database
Preliminary NTriplicate Form Scanner user interface: “NTriplicate Form Scanner – Rayzer’s Red Cross Blood Bank Drive”
Menus: File, View, Navigate, Scan, Options, Help
Commands: Open Form (Notebook), Open Pad, Upload Form (Notebook), Upload Pad, Exit
Report Pane: 8/12/98, 3:29:10 PM: TR12 form with 13 of 14 fields processed; TR10 form with 22 of 22 fields processed; TR12 form with 14 of 14 fields processed
Status: Transferring form 5 of 24 forms to http://www.redcross.org/bloodbank/ (CANCEL)
22. IDEA Internal Data Model

PTR (Primary Transaction Record)
FIELD            VALUE
Unique key       alphanumeric string
User-ID          alphanumeric string
TCR Field List   alphanumeric string; field pointers separated by delimiters
Jobstart         date/time
Jobend           date/time
other task-defining fields (optional)

TCR (Transaction Content Record)
FIELD      VALUE                     CONTENT
PTR key    alphanumeric string       pointer to the associated PTR record
file list  linked list (see below)   link/field ID + file pointer + locator in file
           --- link/field1           text memo in MEMO.XXX, loc 001
           --- link/field2           still photo in PHOTO1.YYY
           --- link/field3           text memo in MEMO.XXX, loc 002
           --- link/field4           sketch in DRAW01.ZZZ
           --- link/field (n)        video clip in VID01.XXX
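The PTR/TCR tables above can be sketched as record types. This is a minimal illustration only, assuming the field names and types shown on the slide; the class and helper names are hypothetical, not the actual IDEA implementation.

```python
# Hypothetical sketch of the IDEA internal data model (PTR/TCR records).
# Field names follow the slide's tables; types and structure are assumptions.
from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

@dataclass
class FileLink:
    """One entry in a TCR file list: link/field ID + file pointer + locator."""
    field_id: str                   # e.g. "link/field1"
    file_pointer: str               # e.g. "MEMO.XXX"
    locator: Optional[str] = None   # e.g. "loc 001"; None for whole-file media

@dataclass
class TCR:
    """Transaction Content Record: points back to its PTR and lists media files."""
    ptr_key: str                             # alphanumeric pointer to the PTR
    file_list: List[FileLink] = field(default_factory=list)

@dataclass
class PTR:
    """Primary Transaction Record: one field task."""
    unique_key: str                          # alphanumeric string
    user_id: str                             # alphanumeric string
    tcr_field_list: List[str]                # field pointers (delimited in storage)
    jobstart: datetime
    jobend: Optional[datetime] = None        # open until the task completes

# Example mirroring the slide's file list
tcr = TCR(ptr_key="PTR-0001", file_list=[
    FileLink("link/field1", "MEMO.XXX", "loc 001"),
    FileLink("link/field2", "PHOTO1.YYY"),
])
```

Keeping media as file pointers rather than inline blobs matches the slide's design: the WorkCard stores small transaction records plus separate memo, photo, sketch, and video files.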
23. IDEA On-Board Intelligent Processing
Assimilation and Normalization
Feature Extraction - Building the Representation Set
Classification - Matching Examples
Resolution - Interaction by the Expert
Database Entry and QA/QC
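The five stages above can be sketched end to end. This is a toy illustration under assumed details - a nearest-example matcher with a confidence threshold for expert hand-off - and all function names and the threshold value are hypothetical, not the system's actual algorithms.

```python
# Toy sketch of the IDEA on-board processing stages; all names and
# parameters here are illustrative assumptions.
import math

def normalize(sample):
    """Assimilation and normalization: scale raw readings to unit range."""
    lo, hi = min(sample), max(sample)
    span = (hi - lo) or 1.0
    return [(x - lo) / span for x in sample]

def extract_features(sample):
    """Feature extraction: build a small representation set (mean, spread)."""
    mean = sum(sample) / len(sample)
    spread = math.sqrt(sum((x - mean) ** 2 for x in sample) / len(sample))
    return (mean, spread)

def classify(features, examples):
    """Classification: match against labeled example representations."""
    label, ref = min(examples.items(), key=lambda kv: math.dist(features, kv[1]))
    confidence = 1.0 / (1.0 + math.dist(features, ref))
    return label, confidence

def resolve(label, confidence, threshold=0.5):
    """Resolution: low-confidence matches are routed to the expert."""
    return label if confidence >= threshold else "NEEDS_EXPERT_REVIEW"

examples = {"crack": (0.2, 0.1), "spall": (0.8, 0.3)}   # representation set
sample = [0.1, 0.2, 0.15, 0.3]                          # raw field readings
features = extract_features(normalize(sample))
label, conf = classify(features, examples)
record = resolve(label, conf)   # database entry and QA/QC would follow
```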
24. Adaptive Pattern Recognition - Like Humans Do It
[Figure: beginning of recognition process (left) vs. later stage of recognition process (right)]
25. Another Example - Cutting Through the Noise
[Figure: beginning of recognition process (left) vs. later stage of recognition process (right)]
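The paired figures show recognition sharpening as more evidence arrives. One simple way to picture that adaptation is a running-mean prototype update; this toy sketch is an assumption for illustration only, not the deck's actual recognition method.

```python
# Toy illustration of adaptive refinement: each new observation nudges a
# class prototype toward the running mean (names and scheme are assumed).
def update_prototype(prototype, count, observation):
    """Fold one new observation into a running-mean prototype."""
    count += 1
    prototype = [p + (o - p) / count for p, o in zip(prototype, observation)]
    return prototype, count

proto, n = [0.0, 0.0], 0
for obs in [(1.0, 2.0), (3.0, 2.0), (2.0, 2.0)]:
    proto, n = update_prototype(proto, n, obs)
# the prototype converges toward the class mean as observations accumulate
```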
26. Other Aspects of Asset Identification,
Assessment, and Planning ...
27. Extending IDAAT and the IDEA Model
to Many Highway Engineering Problems
Non-Destructive Testing
Bridges and Metal Structures
Tanks and Containers
Hard-to-reach, Hard-to-inspect Structures
Traffic Incident / Density Planning and
Emergency Response
Motorist Safety and Security
Emergency Weather Response
28. One Interesting New Technology for NDT,
One More Plug-In for the IDEA System
Magneto-Optic Defect Inspection and
Imaging
Easy to use
Safe and “hassle-free”
Images, not abstract data (e.g., ultrasonics)
Can be automated
Relatively inexpensive
Easy training, interpretation
29. Magnetic Imaging for Stress and Defect
Detection in Bridges and Poles
[Magneto-optic images: microcracks; anomaly/occlusion]
30. More Images Pertaining to
Highway Structure Safety
[Images: surface; interior; crack hidden by ice]
31. And A Few More Still...
[Images: soft steel with artificially created pit defects; soft steel with artificially created gouges and pits]
32. Acknowledgements
Ray Reaux, PARIKH
NTriplicate architecture and development
Ray Lindquist, PARIKH
Enterprise information technology analysis
Dan Widner, VDOT
GIS Systems Engineering
Robin Bresley, VDOT
ICAS Project Manager
Murali Rao, VDOT
IMMP Technology Manager