The document discusses Digital Asset Management (DAM) and how it can be implemented using the ISIS Papyrus Platform. Some key points:
1) DAM involves storing digital files and related metadata to enable search and management of assets. It is important for large enterprises that need to securely manage large volumes of assets across departments.
2) The ISIS Papyrus Platform provides a framework for defining DAM objects and attributes without coding. It allows flexible workflows and scales to support many users.
3) Assets are stored separately from metadata for improved performance. Papyrus defines the metadata and integrates with asset storage systems. This provides a consolidated solution for content and process management in enterprises.
An ISIS Papyrus Study Paper - November 2008
Sharing Experience™ -
Digital Asset Management
ISIS Information Systems GmbH
Alter Wienerweg 12
A-2344 Ma. Enzersdorf
Tel: +43-2236-27551-0
Email: info@isis-papyrus.com
Sharing Experience™ - Digital Asset Management
This document contains a study of the ISIS Papyrus Platform, a business information
system for large enterprises that optimizes all customer service content and
processes. Due to the nature of modern customer communication, this approach
also requires management of the related digital assets.
There are numerous vendors on the market that provide hardcoded solutions for
specific Digital Asset Management (DAM) needs. Some are referred to as Media
Management or Brand Management products enabling the storage, management and
deployment of digitized content, such as text, web content, audio, video, graphics,
CAD files, images, print layouts, animations, and other forms of presentations.
We at ISIS Papyrus propose that there is no content without process and no process
without content, and that DAM is therefore required for both. DAM must be a native
functionality of a business information system, because the quality of assets defines
the quality of the communication. Asset management workflows and the deployment of
assets into production or business processes have to be controlled.
The Papyrus WebRepository's graphical modeling capability for entities, interfaces,
rules, processes, content and presentation is the key to unlimited flexibility for asset
management. Using the Resource Management Framework as a starting point for
DAM, each organization can add any type of asset and its attributes as needed, use
role/policy authorization for secure access, create design, admin and management
views, processes and audit reports, and deploy the digital assets into production.
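To make the scope of such a definition concrete, the sketch below lists the kinds of settings an organization would capture for one asset type: its attributes, role/policy access, a sign-off workflow, views and a deployment target. It is a generic illustration written in Python, not Papyrus WebRepository notation, and every name in it is invented for the example.

    # Conceptual DAM configuration -- illustrative only, not Papyrus syntax.
    dam_configuration = {
        "asset_type": "MarketingImage",            # hypothetical new asset type
        "attributes": ["title", "format", "campaign", "valid_until"],
        "access_policy": {                         # role/policy authorization
            "designer": ["create", "edit"],
            "brand_manager": ["approve", "deploy"],
            "call_center": ["read"],
        },
        "signoff_workflow": ["draft", "review", "approved"],
        "views": ["design", "admin", "management"],
        "audit_report": True,
        "deploy_to": "production",
    }

    def allowed(role: str, action: str) -> bool:
        # Check a role/policy rule before an asset operation is performed.
        return action in dam_configuration["access_policy"].get(role, [])

    assert allowed("brand_manager", "deploy")
    assert not allowed("call_center", "edit")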
Market Observations
Digital Asset Management is considered a separate market segment with special needs
and therefore its own specific products. More than 70 vendors, all with small market
shares, populate this segment. While lightweight DAM functionality is acceptable for a
small business in the media domain, it is not productive for a large enterprise and its
software infrastructure. In the enterprise IT market it is important to look beyond the
fragmented market of Digital Asset Management products that have to be integrated
by means of XML and SOA.
Consolidation, not integration, as the overall target is the only direction forward for
enterprise solutions. Enterprise architects who hope to create an infrastructure of
replaceable components by segmenting and layering various products have to contend
with poor compliance with standards and continuously changing products. These are major
stumbling blocks for an enterprise architecture, whereas ISIS Papyrus integrates
seamlessly with legacy applications to provide instant benefits with its rich
functionality and its platform and output channel independence. In the long term it
provides an application infrastructure that allows for a gradual enterprise strategy
towards consolidation with no additional integration effort.
Working with a platform vendor with broad functionality that includes DAM such as
ISIS Papyrus will provide the most flexible and consolidated setup for process and
content management with the key DAM capabilities. Papyrus provides the complete
range of functionality of a DAM product and it can be custom-defined to provide
special user interfaces and processes that are a perfect fit for enterprise needs.
Principles of Digital Asset Management
A Digital Asset (DA) is a file containing digitized information used for artwork,
content creation or media production. A DA is often itself referred to as “content” if
used on the Web, as a “resource” for production or within other content.
The most common types of digital assets are:
• Text (plain, RTF, HTML, other XML formats)
• Images (GIF, TIFF, JPEG, PNG, or raw bitmaps)
• Audio (WAV, AIFF, MP3, others)
• Video (a variety of formats, most commonly MPEG-1/2/4)
• Web content (HTML, FLASH, PDF, others)
• Product specific (DOC, PPT, CAD, any)
• Complex content (a mix of any of the above)
• Source code (scripts or programs)
• Binaries (functional executables)
In a DAM system the digital assets are stored in binary format with a set of attributes.
While this list covers the most important object types, you have to expect that new
DA object types or attributes may be required. It must be possible to define new
object types, add new DA attributes or sign-off workflows, and create specific user
interfaces without complex programming. Asset objects that share the same attributes
are considered to belong to a class or type.
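To make this concrete, here is a minimal, hypothetical sketch of what defining asset types as data rather than code can look like; the Python names and structures below are illustrative only and are not Papyrus definitions.

```python
# Minimal, hypothetical sketch (not Papyrus syntax): asset types and their
# attributes are declared as data, so a new DA type is a data change only.
from dataclasses import dataclass, field

@dataclass
class AssetType:
    name: str                    # e.g. "Image", "Video"
    attributes: dict             # attribute name -> expected Python type
    workflows: list = field(default_factory=list)   # e.g. ["sign-off"]

# Registry of type definitions; adding a type requires no new code.
ASSET_TYPES = {
    "Image": AssetType("Image", {"format": str, "width": int, "height": int}),
    "Video": AssetType("Video", {"format": str, "duration_s": float},
                       workflows=["sign-off"]),
}

@dataclass
class DigitalAsset:
    type_name: str
    payload: bytes               # the binary asset itself
    attrs: dict                  # attribute values for this instance

def validate(asset: DigitalAsset) -> bool:
    """Check that the asset carries every attribute its type declares."""
    spec = ASSET_TYPES[asset.type_name].attributes
    return all(isinstance(asset.attrs.get(k), t) for k, t in spec.items())
```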
quot;&*!%75E=?>B>DF!
To avoid continuing confusion about which function is provided by what product, we
want to clarify at this point the terminology of digital asset and content management.
Many DAM libraries as well as archive systems have been named a repository by the
vendor and refer to attribute values or search indices as meta-data, most probably
because it sounds more scientific and professional. A repository is by definition a
function that manages application definitions such as metadata, whereas DAM and
archive systems store unique object instances of assets and documents.
A repository does therefore not use metadata to manage other entities but only
manages object descriptions and the metadata definitions of its attributes. It is
common today to say 'meta-data' rather than 'meta-information' or, better still,
'object attributes'. In other situations the attributes (metadata) are tags that classify the content
or asset according to a taxonomy or user created folksonomy. A flexible DAM system
or archive may have the ability to define the metadata for object attributes or its
search fields, but the key ability of a repository is to make the metadata taxonomy
available to other applications or systems.
META is a preposition in classical Greek and translates to: among; along with;
after; behind. Epistemology describes META as "about its own category", which
makes the only proper use of metadata "data about data" and not "data fields
about other entities we want to manage." Data and metadata exist at the same time
in a relationship and one does not produce the other. Data is no more than a value,
number or string and is turned into information by assigning metadata.
Therefore, information about name, size, type, location, ownership of a DA, or
system setup or specification values are not metadata but information about attributes!
Also the search tags for a DA in a library or a content instance in an archive are not
metadata but a representation of the search index.
At the very heart of a flexible and powerful DAM system lies the ability to create a
taxonomy of class definitions for DA-files, to define the metadata for DA attributes
and to link them to each other by what is called an encapsulation into an 'object class
definition'.
DA attribute types can be roughly grouped into:
• Descriptive – describes the object for reference, search and retrieval.
• Structural – specifies either internal object structure or relationship to other objects (is a part of, is composed of, is a variant of, is a version of).
• Functional – DA states, DA state transitions, which actions can be performed, how it will send and react to events.
• Administrative – format specifications, copyright and licensing information, user access rights, lifecycle information (i.e. retention periods, validity), projects and ownership, change management state.
• Taxonomy – classifies the object as part of a class.
• Operative – audit (past access and changes), reporting (productive use, cost), license fees (due or paid).
Justifying DAM for the large enterprise
Digital Asset Management or DAM is used for various purposes and in various ways.
In principle a DAM system stores digital files and their related attributes in various
formats. Some of these attributes can be used as search indices. The range is from
personal DAM libraries on a PC to large corporate solutions that extend to partners
and customers through the Internet. Apart from the flexibility and scalability of the
asset storage, a core difference of various DAM systems is how they can manage the
asset related workflows and how far they can reach in managing asset use in the
production environment. Enterprise DAM systems have to be secure and auditable.
Enterprise-level solutions require scalable, reliable, configurable systems with
unlimited numbers of assets for large numbers of simultaneous users.
Which departments may make use of a DAM?
• Design – finding existing assets and re-use, a library of ideas
• Production – push digital assets into production, job tickets, signoff
• Marketing – manage assets for online and printed catalogues
• Sales – link assets to price list, more accurate offers and proposals
• Customer Service – handle asset queries and complaints
• Management – track and audit changes, measure process performance
• IT – manage web assets, content assets, fonts, logos, …
Volume considerations for choosing a DAM:
• How many assets will be managed and added per year?
• What will the size of the average asset be?
• Sign-off workflows
• Backup procedures
• Versioning with check-in and out, rollback and history
• Number of internal and external users
• Organizational model of authorizations
Beyond cost considerations
For people costing one can use a 'dollar a minute' estimate, based on a $100k per-year
cost and 200 workdays. Let's assume a designer spends 30 minutes a day searching for
or organizing media files. If a DAM can conservatively halve that time, we are talking
about saving $3,000 a year per user. Achieving a breakeven ROI of 12 months for a
$300k DAM (prudent estimates put two thirds of that cost towards implementation
manpower!) would require 50 full-time users working with it. In a large corporation
that number is easily reached, but we now have to consider the integration effort
between the DAM system and other corporate systems, which will easily multiply the
manpower cost. If the DAM is in any case part of the content and process platform,
you have just saved half a million dollars, conservatively, with a consolidated system
like Papyrus.
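As a rough back-of-the-envelope check of the 'dollar a minute' figure, the following small Python sketch uses only the assumptions stated above; the salary, workday and minutes-saved numbers are the paper's working assumptions, not measured values.

```python
# Back-of-the-envelope check of the 'dollar a minute' estimate used above.
ANNUAL_COST = 100_000          # fully loaded cost per person per year ($)
WORKDAYS = 200                 # productive workdays per year
MINUTES_PER_DAY = 8 * 60       # assuming an 8-hour day

cost_per_minute = ANNUAL_COST / (WORKDAYS * MINUTES_PER_DAY)   # ~ $1.04/min

minutes_saved_per_day = 15     # half of the assumed 30 minutes of searching
annual_saving_per_user = minutes_saved_per_day * WORKDAYS * cost_per_minute

print(f"cost per minute: ${cost_per_minute:.2f}")
print(f"annual saving per user: ${annual_saving_per_user:,.0f}")   # ~ $3,000
```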
Consider further that labor reduction is never a good reason to invest in IT
solutions. IT must be used to improve quality more than anything else. Using a
consolidated solution addresses a number of such quality issues. The reuse of creative
assets from previous projects enables faster development and relieves workers of
tedious, error-prone tasks. Moreover, a DAM system should not ENFORCE rigid
workflows that hinder creativity but work more like a collaborative system that
empowers the user with features such as annotation while providing project
management structures, asset use auditing and accounting.
For some projects a more consistent process makes sense that allows a producer to
assign, prioritize, and track a project's progress across a production team. Auditing
makes it possible to track the history of asset edits, conversions, and sign-offs. These
processes vary substantially from business to business, and therefore a freely definable
workflow avoids the expensive tailoring of a DAM to specific vertical markets.
A Media Catalog is a special kind of asset repository that provides thumbnail display
of media assets to business users across an organization with a user-friendly web
frontend. Users are given access to check assets in and out to enable sharing and
cooperation.
Summary of potential savings and benefits:
• Reduce the time and cost to search for images of products and applications
• Preparation of assets by product management, marketing or external creative organizations such as PR or advertising agencies
• Flexible definition of approval and sign-off processes
• Provide approved variants of digital assets in different resolutions, color schemes or languages
• Avoid redundancy and related potential sources of error
• Control the Corporate Identity standard worldwide in the organization
• Avoid the storage of assets in many different systems
quot;&*!O;=?D!8J7!#(#(!/4AF5I;!P7:-7A>;=8>5F!
DAM is provided by the ISIS Papyrus Platform as a framework of definitions for
general resource management. DAM also uses the application lifecycle functions with
project management and version/variant control. Most systems only support versions,
but a larger organization also requires asset variants for different display qualities in
color or resolution, or to support multiple languages.
!quot;#$%&'()*+(,&-./01-quot;20/$3(
• Upload, store, and give access to DAs for authorized users
• Automatically extract asset attributes on import
• Enable control over versions and variants of DAs
• Distribute versioned assets to other applications by unique ID
• Provide DAM via thick and thin client environments
• Utilize DAM by setting up the needed lifecycle processes without coding
• Create, sign-off and ensure corporate identity standards
• Manage any kind of current or future asset by definable object types
• Scale the DAM functionality to any number of users needed
• Secure digital assets and their use with full security functionality
• Control copyrighted material for accounting and compliance
• Audit and monitor the use of digital assets
Fig 1: The Preview Carousel in Papyrus for browsing the asset library
The Papyrus WebRepository is a meta-data driven system that enables the definition
of executable object models with states and methods. Therefore all object definitions
are virtual and derived from a meta-class. This means that there is no limitation in the
type of objects that can be defined and stored.
In the Papyrus Platform the assets are also not stored inside the WebRepository
but within a separate, possibly independent, asset library server. The asset types and
their attributes are, however, defined in the WebRepository and used in the
library. Therefore the asset definitions are well managed and accessible by all
applications.
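The separation can be pictured with a minimal, generic sketch: attribute metadata lives in a repository and is searchable, while the binary asset sits in a separate store and is referenced by ID. The in-memory stores below are hypothetical stand-ins, not the Papyrus API.

```python
# Generic illustration of the separation described above: metadata in a
# repository, binaries in a separate asset store, linked by an asset ID.
import hashlib

asset_store = {}        # asset_id -> raw bytes (stand-in for a library server)
metadata_repo = {}      # asset_id -> attribute dict (stand-in for the repository)

def store_asset(payload: bytes, attrs: dict) -> str:
    asset_id = hashlib.sha256(payload).hexdigest()[:16]   # content-derived ID
    asset_store[asset_id] = payload
    metadata_repo[asset_id] = attrs
    return asset_id

def find_assets(**criteria):
    """Search on metadata only; the binaries are never touched."""
    return [aid for aid, attrs in metadata_repo.items()
            if all(attrs.get(k) == v for k, v in criteria.items())]

aid = store_asset(b"...jpeg bytes...", {"type": "Image", "lang": "en"})
print(find_assets(type="Image"))       # -> [aid]
```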
Fig 2: ISIS Papyrus WebRepository defines Metadata for DAM attributes
Essential DAM: Security and Digital Rights Management
In a world where using unauthorized copyrighted content can lead to expensive
fines and lawsuits, it is important not only to improve access but also to manage
and control it. The standard security functionality of the Papyrus Platform is perfect
for assuring user access rights and controlling the use of digital assets.
!quot;#$%&'(!2quot;/41%5(67.&%0/$(47quot;/&%7'3(
• User Authentication, optional SmartCard/Fingerprint
• User Authorization by assigning one or more ROLES
• User organization by assigning users a corporate POLICY
• Data security by means of encrypted communication and storage
• Document/asset authentication by PKI digital signing of assets
• Document/asset security by PKI encryption of assets
Fig 3: Papyrus Platform Authentication, Authorization and Encryption
WebRepository uses a role- and policy-based approach called RBAC. Roles enable
users to be authorized to perform specific methods on objects; policies segment
object instances into control groups so that different departments and subsidiaries on
the same system can be kept separate despite using the same organizational roles and
object types.
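A generic illustration of the role/policy idea, assuming a simplified data model (this is not the Papyrus RBAC implementation): roles grant methods on object types, while policies partition object instances, for example by department.

```python
# Generic sketch of role/policy authorization (illustrative names only).

# Roles grant methods on object types.
ROLE_GRANTS = {
    "designer": {("DigitalAsset", "check_out"), ("DigitalAsset", "annotate")},
    "auditor":  {("DigitalAsset", "read_audit_log")},
}

# Policies partition object instances, e.g. by department or subsidiary.
def policy_allows(user_policy: str, obj_policy: str) -> bool:
    return user_policy == obj_policy

def is_authorized(user, obj, method: str) -> bool:
    role_ok = any((obj["type"], method) in ROLE_GRANTS.get(r, set())
                  for r in user["roles"])
    return role_ok and policy_allows(user["policy"], obj["policy"])

user = {"roles": ["designer"], "policy": "marketing-EU"}
asset = {"type": "DigitalAsset", "policy": "marketing-EU"}
print(is_authorized(user, asset, "check_out"))       # True
print(is_authorized(user, asset, "read_audit_log"))  # False
```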
&;;78!&885=:I87!4?M!.*/!0>EA48=:=B=8F!
Many asset formats contain embedded attributes. For many of them it is necessary to
extract these from the asset file and store them as DAM attributes. An import and
export function supports a variety of formats.
The resource library function of the WebRepository also supports the import of files
in the XMP format. The Adobe Extensible Metadata Platform (XMP) is a commonly
used format for the definition, creation, and processing of extensible files that include
the asset attributes, here also called metadata. While it makes sense to utilize XMP when
an asset is exported as a file for a particular application, it is of no relevance while the
asset is under the control of the DAM system. XMP mostly uses attributes from the
Dublin Core Metadata Initiative. The biggest drawback is that, as an XML format,
XMP does not allow binary formats such as thumbnails to be included natively;
they have to be converted to the much larger Base64 encoding. Adobe provides the XMP
Toolkit free of charge under a BSD license; it includes specification and usage
documents (PDFs), API documentation (doxygen/javadoc), C++ source code
(XMPCore and XMPFiles) and Java source code (currently only XMPCore).
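As a minimal illustration of what an XMP import boils down to, the sketch below reads a few Dublin Core fields from an XMP sidecar file using only the Python standard library; the file name and selected fields are assumptions, and a real DAM import would map many more properties onto DA attributes.

```python
# Minimal sketch: read selected Dublin Core fields from an XMP sidecar file.
import xml.etree.ElementTree as ET

NS = {
    "rdf": "http://www.w3.org/1999/02/22-rdf-syntax-ns#",
    "dc":  "http://purl.org/dc/elements/1.1/",
}

def read_xmp_attrs(path: str) -> dict:
    root = ET.parse(path).getroot()
    attrs = {}
    for desc in root.iter(f"{{{NS['rdf']}}}Description"):
        title = desc.find("dc:title/rdf:Alt/rdf:li", NS)
        creator = desc.find("dc:creator/rdf:Seq/rdf:li", NS)
        if title is not None:
            attrs["title"] = title.text
        if creator is not None:
            attrs["creator"] = creator.text
    return attrs

# read_xmp_attrs("brochure_cover.xmp")   # hypothetical sidecar file
```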
Papyrus and other asset management systems utilize automated transcoders to convert
assets to the format requested. In a large corporate environment transcoding should
only be enabled for immediate production such as printing or web presentation.
Transcoding to store another variant of an asset must be under process control and
sign-off to ensure asset quality.
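As an illustration of such a transcoding step, the following sketch derives a lower-resolution web variant from a master image using the Pillow library; the approval flag is a stand-in for the sign-off control described above and is not part of any Papyrus interface.

```python
# Illustrative transcoding step using Pillow (not the Papyrus transcoder):
# derive a lower-resolution web variant from a master image.
from PIL import Image

def make_web_variant(master_path: str, out_path: str, approved: bool) -> None:
    if not approved:
        raise PermissionError("Variant creation requires sign-off")
    with Image.open(master_path) as img:
        img.thumbnail((1024, 1024))          # downscale in place, keep aspect ratio
        img.convert("RGB").save(out_path, "JPEG", quality=85)

# make_web_variant("campaign_master.tif", "campaign_web.jpg", approved=True)
```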
/4AF5I;!&IM=8=?D!4?M!-7A>58=?D!T748I57;!C>5!quot;&*!
The Papyrus Platform has the ability to define what actions and attribute changes have
to be audited. These events are written into an audit log that can be statistically
processed into audit reports.
Audit data can be used to verify the correct use of the resource processes, but it can
also produce lists of all assets that have been checked in and out. Each stored object in
the asset library retains information about when it was last used. That makes it possible
to identify assets that are stale and no longer used.
A parent object pointer enables the user to verify where assets are used in
applications and therefore provides a list of asset users. If assets are checked out for
use by third-party applications, a Check-Out Wizard can be created with the Activity
Recorder that requires the user to enter essential information about the usage of the
asset. This can then be used for billing for copyrighted resources.
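The kind of query this enables can be sketched generically; the audit-log records below are hypothetical and do not reflect the Papyrus audit-log format.

```python
# Generic sketch of a stale-asset query over an audit log: list assets that
# have not been used for a given number of days.
from datetime import datetime, timedelta

audit_log = [
    {"asset_id": "a1", "action": "check_out", "when": datetime(2008, 9, 1)},
    {"asset_id": "a2", "action": "check_out", "when": datetime(2008, 3, 15)},
]

def stale_assets(log, as_of: datetime, max_idle_days: int):
    last_used = {}
    for entry in log:
        prev = last_used.get(entry["asset_id"])
        if prev is None or entry["when"] > prev:
            last_used[entry["asset_id"]] = entry["when"]
    cutoff = as_of - timedelta(days=max_idle_days)
    return [aid for aid, when in last_used.items() if when < cutoff]

print(stale_assets(audit_log, datetime(2008, 11, 1), max_idle_days=90))  # ['a2']
```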
Papyrus EYE Definable User Interface for PC and Flash
Papyrus WebPortal presentation with Papyrus EYE gives the DAM user a very
intuitive and specialized user interface for each of the DAM user roles. Some
user roles are creative, others are administrative, and a few only require audit and
reporting functions. All of these user interfaces can be custom defined using Papyrus
EYE.
The Papyrus EYE user presentation does not require programming - unlike
Ajax, ASP, Java Swing or Adobe Flex - and is completely defined through version-
controlled OO models in the WebRepository. The same applies to the Papyrus
Desktop. Using the graphical components, any view type can be defined and saved.
That allows the user to navigate to a context of objects related to a process and
present them through his choice of predefined views. The user is not limited to one
application area, but can arrange multiple views to show queries, work queues,
collaboration, and content in the process context. The graphical functionality of EYE
enables the user to choose from a number of charting options to set up dashboards or
chart representations of customer data.
Fig 4: Papyrus EYE TransPromo using DAM library
As the metadata model from the WebRepository enables direct access to live business
data, the user can pull these into the process context and use them as operational
business intelligence for better decision making. A ‘context memory’ function enables
each user to dynamically switch between tasks and views to handle exceptions,
queries and interruptions, and to set up layouts that match his specialized needs
without programming.
Scalable Collaborative Workflows
As a DAM system evolves there is a growing number of assets to be managed. In an
enterprise environment the asset base can easily sum up to millions of items. Without
a solid workflow solution provided by the DAM system itself there is growing
pressure to integrate yet another third-party solution.
The User-Trained Agent (UTA) of the Papyrus Platform provides the context of a
business user role and monitors the user's interaction with each DA without forcing
them into the straitjacket of rigid process models. This gives each user a maximum of
creativity while maintaining audit and control features for supervisor roles.
Summary
Any organization planning to rein in the use of its digital assets for control,
compliance or cost reasons will benefit from considering the DAM approach
proposed by ISIS Papyrus. The ISIS Papyrus consolidated communications and
process platform is the ideal solution for creating a highly functional, auditable and
cost effective DAM. Its many additional features make the Papyrus Platform the
premier choice for an enterprise architecture strategy.
ISIS Papyrus Platform Overview
The Papyrus Platform is an end-to-end, process- and content driven application lifecycle
system from business need to production that includes the functionality to manage
digital assets. The ISIS Papyrus Platform offers a unique combination of application
features as well as the system management power to scale on an enterprise level to
thousands of users. By setting up Papyrus WebRepository loaded with metadata,
templates and tool components, IT enables business professionals to define
applications that tie business processes, user presentation, business content, digital
assets and service interfaces together without the need for complex process
integration projects with substantial manpower, time and budget requirements.
ISIS Papyrus offers a number of quick-start application frameworks for use with asset
management, marketing, content management, document management, web
presentation, printing, customer correspondence, marketing administration and
customer care processes. Adapting those frameworks for business use NEVER
involves Java or .Net coding!
The Papyrus Platform is uniquely integrated and provides business-changing technology with many quick-start applications that are tuned with little effort.
Already in 2001, ISIS Papyrus announced its Inbound/Outbound Process strategy and
started to promote the forgotten contextual link between business process and all
business content. Today’s problems of DAM, ECM, CRM, BPM, EMM and MRM
are centered in a lack of understanding of the business context - how people (employees
and customers) and business processes are related to content.
Businesses need a single information system that offers collaboration, process
coordination and role coaching on top of the background data processing. This single
information platform would enable a business to model its key information assets,
support its process workgroups, create and retain the knowledge as to how the
business actually performs its processes. All this knowledge and experience is shared
between workgroups according to authority. Management can monitor quality criteria
and audit each single process if needed. Because it is a single life-cycle platform,
software borders do not exist and process optimization is a continuous exercise that
does not require complex projects. All in all, such a system would not only be updated
by its users about processes, but would also spread all the gathered knowledge
across the enterprise, hence ‘Sharing Experience™.’
Gartner Group stated in a 2007 product study that a single life-cycle repository is the
key to delivering business-changing software. Such a repository has to also provide
Digital Asset Management to fulfill the central role that it is supposed to play and
cover all aspect of process and content related knowledge applications. It is also
essential for marketing and campaign management because data and service
integration and consolidation are the cornerstones of success. A major issue of all
current business process in marketing is the simplistic flowchart paradigm that
fragments a holistic customer view into many simplistic step-by-step flows. Prospects
and customers communicate with business content in a continuous closed loop.
Outbound business content (paper and electronic):
• Marketing material (enveloping-barcodes!)
• Batch produced statements, reports, credit cards
• Claims handling, service requests
• General correspondence
• Web content on self-service sites
• Phone conversations and SMS
Inbound business content:
• Letters
• Emails
• Phone calls
• Web forms
• Branch visits
• Partner requests
How does the Papyrus Platform Describe Models?
Papyrus WebRepository uses a state/event driven concept for virtual object modeling
enabled by a distributed object-relational database and transaction engine. The core
concept is to model the interaction of business entities and not rigid process flows.
Models are assembled into templates and assigned to vertical process frameworks.
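The state/event idea can be illustrated with a small, generic sketch (this is plain Python, not the Papyrus modeling language): behaviour is captured as allowed state transitions triggered by events rather than as a fixed flowchart.

```python
# Generic illustration of a state/event driven object model: allowed
# transitions are data, and an event simply moves an entity between states.
TRANSITIONS = {
    # (current_state, event) -> next_state
    ("draft",     "submit"):  "in_review",
    ("in_review", "approve"): "approved",
    ("in_review", "reject"):  "draft",
    ("approved",  "publish"): "in_production",
}

class BusinessEntity:
    def __init__(self, name: str, state: str = "draft"):
        self.name, self.state = name, state

    def on_event(self, event: str) -> str:
        next_state = TRANSITIONS.get((self.state, event))
        if next_state is None:
            raise ValueError(f"{event!r} not allowed in state {self.state!r}")
        self.state = next_state
        return self.state

asset = BusinessEntity("summer_campaign_banner")
asset.on_event("submit")
print(asset.on_event("approve"))   # -> 'approved'
```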
Metadata: All data structures used are best centrally defined and individually managed
and not hidden somewhere in source code or XML files. A metadata change
transparently propagates through all defined components of the application.
Trained Processes: While it is easy to define processes with the flexible state/event
model of Papyrus, time and money can be saved by discovering the implicit process
flows through interactive user training rather than abstract analysis. Trained processes
can additionally be bounded by business rules that link to metadata. The trained process
progression controls activities and content and also calls the service interfaces.
Business Rules: These are quite distinct from process guidance but trained process
and business rules can be freely mixed. Business rules can also be entered by business
users using a ‘Natural Language Rule’ interface and become part of the application by
simply adding them to the business context.
Activities: User interfaces cannot just be simple forms or views but must be activities
that guide the user through a number of wizard-like entry steps or direct content
interaction. This dramatically reduces the need for user training. At any point in time
all information (data, content and past processes) pertaining to a customer is available
regardless of the current process and only limited by authority. Data presentation
must be asynchronous and the user interface must allow those interactions.
Content: There is no business process or case without its content – inbound
documents such as fax and email, archived contracts or communications, batch or user
generated outbound correspondence, or in fact audio or video files. WebRepository
enables business users to define content classification and data capture for Inbound,
as well as to assemble template definitions and production jobs for Outbound.
User Presentation: by means of a flexible GUI object model called Papyrus EYE,
non-technical users can describe by drag and drop any user interface they desire and
create perspective layouts and composite data views for identical use on PC and Web.
Service Interfaces: Adapters use the metadata definitions to map the interface data.
A change in the interface can be version-controlled together with changes to all other
process components. Papyrus proposes a data federation model through its object
modeling that reduces the need to use service interfaces. MQ, SOA, dot.NET and
many other adapters such as for CICS and JES are available to call program functions.
Consolidated Vertical Solutions
ISIS Papyrus celebrates 20 years of continued innovation and has, for all those years,
successfully focused on the long-term benefits for our 2,000 corporate customers rather
than on short-term revenue targets. With the availability of its unified
Inbound/Outbound functionality under common process control in 2001, ISIS
Papyrus has long anticipated the current consolidation trend.
The ISIS Papyrus Communication & Process Platform is a consolidated solution for
content (ECM), process (BPM), business rules (BR) and operational business
intelligence (BI) using data federation to create composite knowledge applications
with front and back office collaboration. A very typical application is marketing
management where Papyrus builds closed-loop business services in a rapid
development environment that connect departments, subsidiaries, business partners
and customers through the Web.
14. &5@J=87@8I57!9I8B=?7!
• Platform architecture: A peer-to-peer network grid of distributed transaction-server nodes can be run 24/7 around the globe with clustering or hot-backup. Any number of transaction, portal or archive nodes can be linked to multiple repositories. The single, central Papyrus WebRepository is a virtual meta-data repository, which means that the meta-data models are freely definable and extendable, and it guarantees full lifecycle change management without process boundaries. The system itself and all its applications are modeled, managed and executed based on WebRepository frameworks. User interfaces are freely definable components in HTML, Ajax, or EYE-Flash in Papyrus WebPortal or native Papyrus Desktop.
• Effortless integration through configurable Papyrus Adapters. Standard messaging adapters for SMTP, POP3, FTP, SMB, JDBC, LDAP, SAP-XOM, J2EE and .NET programming, as well as XML, ASCII, COBOL data files and many others.
• Authentication and security: Papyrus employs a deeply embedded RBAC role-based access control mechanism with action controlling policies. Node-to-node data transmissions are SSL encrypted. Fingerprint and Smartcard readers use PKI to authenticate users, documents and digital signatures. Operating systems: Windows, Linux, AIX, HP/UX, Sun Solaris, z/OS.
• Process Guidance: Using the CEP complex event processing capability of the User-Trained Agent, the Papyrus Platform can discover existing business processes from user interaction with the modeled information and content assets. The substantial effort to analyze business processes, simulation, deployment, monitoring and optimization is made redundant. The Papyrus WebRepository provides the UTA with the business context metadata for the transductive training. Each UTA can be considered a coach to a business user role that monitors the specified business context and recommends actions for process progression.
• Business Rules: The Activity Recorder (AR) and Natural Language Rule (NLR) editor enable line-of-business users to react to market changes with consistent meta-data definitions in WebRepository for attributes, reporting and auditing enterprise-wide business policies. Rule editing, deployment and use is controlled by the RBAC authorization system for all frameworks, and rules can be triggered by events or trained pattern matches.
• Business Process Management: Processes and related meta-data are modeled in the WebRepository using a state/event principle that avoids the limitations of flowcharts. Events are not the exception but simply trigger rule execution. Users define the user interaction with the AR and NLR editors and train the process sequence with the User-Trained Agent. Collaboration, check-in/check-out, versioning and project management are generic platform functions.
&AAB=@48=>?!T54E7W>56!9I8B=?7!
• Enterprise Marketing Management: The Papyrus Adapters provide real-time access to either operational data or data warehouses through the meta-data and service registry definitions in WebRepository. Data access is federated and reports can be generated automatically or ad hoc from predefined document components. Business rules can be used to provide data filtering and condensation for real-time decision support. The User-Trained Agent can be used to train pattern matching for complex event handling or predictive decision support.
• Enterprise Content Management: For image capture Papyrus provides scanning, image correction, OCR, machine-learning document classification, FreeForm® data extraction and validation, email text classification, distributed archiving with automatic content replication. Any archived content (emails, PDF, …) can be assigned Records Management retention principles and PKI signed for authentication. Collaboration, check-in/check-out, versioning, project management and business process are generic platform functions that simply need to be used for content applications.
• Document and Output Management: The Papyrus Designer provides the most powerful document generation functionality for dynamic documents in high-volume or interactive correspondence. Business rules can be used to define the conditional logic for correspondence created through the WebPortal™ interface or Desktop. Papyrus PrintPool™ bundles high-volume output and correspondence that is multi-channel distributed or printed with Papyrus WebControl.
• Web Content Management and Portal: Papyrus WebPortal and the WCM Framework enable creation and management of multiple intranet and extranet sites in HTML/CSS, Ajax or EYE-Flash. The Papyrus WebRepository™ centrally controls branding, look and security and maintains consistency of information across distributed Internet sites with localized international versions.
Specific Application Frameworks
Papyrus PrintPool and WebArchive for Reports
Printing, organizing, and distributing corporate reports is expensive. Documents
in the form of billing and transaction statements, invoices, and budget analysis reports
are huge, and the sheer volume of reports is a barrier to finding relevant information.
These reports are created either by Papyrus DocEXEC or by third-party software
applications. Good decision-making hinges on access to well-formatted and
understandable business information.
• Create and manage reports within the Papyrus platform
• Integrate reports into business processes for decision making
• Utilize business data from CRM, ERP or any other application
• Papyrus user role/policy authorization and authentication controls access
• Papyrus WebArchive enables effortless distribution to the Web
• Electronically captures meta-tags, archives and distributes reports
• Transforms report content into meaningful business intelligence
• Empowers business users with multi-channel distribution
Papyrus Capture for Inbound Documents
Papyrus Capture transforms paper documents into images and metadata. The images
are associated with case files and automatically routed through business processes by
user-trained or user-defined rules. Papyrus Capture provides high-capacity document
capture with multiple scanning stations in distributed mailroom configurations.
Papyrus Capture uses automatic classification of images without human intervention.
Papyrus Solutions for Insurance, Finance and Telecom
The insurance, finance and telecom industries must be prepared to react to international
financial volatility, free-market changes and regulation. The ISIS Papyrus Platform
provides international companies, their agents or brokers with a business-driven,
customer-focused, experience-sharing environment that facilitates cooperative
business processes.
• Customer Analysis & Marketing
• Market Research
• Product Development & Implementation
• Brand Management
• Sales Force Management
• Underwriting & Policy Information
• Customer Service
• Contract Management
• Claims Management
• Regulatory Compliance
Beyond Isolated Marketing and Branding
The ISIS Papyrus Platform empowers businesses to define the marketing processes
they really need with the power and flexibility to service customers and prospects as
individuals in a mass marketing and highly automated service environment. A
consistent brand experience is important but it cannot substitute for a high-quality
service relationship. How hotel staff interact with you is a lot more important than
whether they all wear the same uniform. The transition from prospect to customer
must be as painless as that from customer to repeat customer. That is not a branding
exercise but a service-quality aspect, and it goes far beyond customer analytics, event
detection, campaign management, lead management, and marketing resource
management. The increase in service quality is the best guarantee for improving sales
and for customer satisfaction, and therefore retention.
In addition to branding, analytics and event detection, an increase in customer service quality at reasonable cost is the best possible guarantee for customer retention.
Fig 5: Information content, activity and goal centric functionality are split into disparate systems. For optimal communication quality, marketing messaging must not be isolated from customer service.
Papyrus for Insurance Correspondence
• Transforming the claims process to maximize customer service
• Insurance corporations streamline their claims operations
• Migrating legacy host claims letter generation systems
• Eliminating manual processes with process automation
• Simple change and maintenance of layouts by the business users
• Improve Time-to-Market
• Text blocks are standardized and reused rather than duplicated
• Eliminating hard-coded printer controls provides printer independence
• Bundling of all letters to one recipient in one envelope
• High-volume batch processing and user documents are bundled together
Electronic Bill Presentment
The Internet offers the opportunity to replace paper billing and check payment
processes with electronic communication. Businesses such as telecoms or utility
companies, who send and receive large numbers of bills and payments, will benefit the
most from Electronic Bill Presentment.
• Replacing paper invoices
• Reducing postage costs to mail invoices
• Increasing cash flow through faster payments
• Lowering bad debt factors by receiving payment prior to shipment
• Lowering collection costs through improved payment processing