A high-level talk given at AIIM Canada's breakfast event on March 23, 2017.
The talk walks through the challenges of information management in the era of BYOD and cloud services, and closes with how to start with a small but impactful project to demonstrate the value of IMaaS.
This document discusses developing an Information Management as a Service (IMaaS) framework. It notes that information management has fundamentally changed with the rise of mobile devices and cloud computing. The document recommends taking an iterative approach to building an IMaaS using principles from Kanban, including visualizing workflows, limiting work in process, focusing on continuous flow, and driving continuous improvement. The goal is to design an information platform that meets long-term business needs by integrating storage and service strategies across on-premise and cloud-based systems.
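The Kanban principles named above (visualize the workflow, limit work in process, keep work flowing) can be sketched in code. This is a minimal illustration, not anything from the talk itself; the column names, limits, and task names are assumptions chosen for clarity.

```python
class KanbanBoard:
    """A toy Kanban board that visualizes workflow columns and enforces WIP limits."""

    def __init__(self, wip_limits):
        # wip_limits maps each column name to the maximum items allowed in it.
        self.wip_limits = wip_limits
        self.columns = {name: [] for name in wip_limits}

    def add(self, column, item):
        """Pull an item into a column, refusing if its WIP limit is already reached."""
        if len(self.columns[column]) >= self.wip_limits[column]:
            raise RuntimeError(f"WIP limit reached for '{column}'")
        self.columns[column].append(item)

    def move(self, item, src, dst):
        """Move an item one step along the flow; the WIP check happens before removal."""
        self.add(dst, item)          # raises if dst is full, leaving src untouched
        self.columns[src].remove(item)


# A small IMaaS rollout: only two improvement projects in flight at once.
board = KanbanBoard({"backlog": 10, "in_progress": 2, "done": 10})
board.add("backlog", "map HR file shares")
board.add("backlog", "tag finance records")
board.move("map HR file shares", "backlog", "in_progress")
```

The WIP limit is what makes this Kanban rather than a simple to-do list: when `in_progress` is full, new work must wait in the backlog until something flows to `done`, which is how the iterative, small-project approach stays small.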
ThinkDox LLC provides an information management as a service (IMaaS) solution. The document discusses developing an information management strategy and framework for long term success by implementing a service oriented approach. It also covers topics like integrating storage strategies, understanding how users work with information, generating information lifecycle models, and ensuring solutions meet organizational needs.
The document defines case management and provides examples. It then describes a reference architecture for case management with five domains: Collaboration and Social, Deliver, Manage, Create, and Support. Each domain contains specific capabilities for case management. The Manage domain capabilities include information management, business rules management, process and activity management, and content management. The Create domain capabilities include modeling, content creation, and case initiation.
Master Data Management: Extracting Value from Your Most Important Intangible ... (FindWhitePapers)
This SAP Insight explores the importance of master data and the barriers to achieving sound master data, describes the ideal master data management solution, and explains the value and benefits of effective management of master data.
The document discusses product information management (PIM) for HP Printing and Personal Systems. It outlines the challenges of managing vast amounts of product data across departments and systems. It then describes how a PIM solution could address these challenges by providing a single source of truth for product information through capabilities like data integration, governance and a centralized repository. The paper also provides details on how HP could implement a PIM architecture using a transactional hub model to manage master product data.
Reference and master data management:
Three categories of structured data:
Master data: data associated with core business entities such as customer, product, and asset.
Transaction data: the recording of business transactions, such as orders in manufacturing, loan and credit card payments in banking, and product sales in retail.
Reference data: any data used solely to categorize other data in a database, or to relate data in a database to information beyond the boundaries of the enterprise.
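The three categories above can be made concrete with a small sketch. The record types and fields below are illustrative assumptions, not taken from any of the summarized presentations.

```python
from dataclasses import dataclass


@dataclass
class MasterRecord:
    """Master data: a core business entity (customer, product, asset)."""
    entity_id: str
    name: str


@dataclass
class TransactionRecord:
    """Transaction data: a recorded business event that references master data."""
    txn_id: str
    customer_id: str   # points at a MasterRecord.entity_id
    amount: float


@dataclass
class ReferenceRecord:
    """Reference data: categorizes other data, e.g. an ISO country code."""
    code: str
    description: str


customer = MasterRecord("C-100", "Acme Corp")
order = TransactionRecord("T-5001", customer.entity_id, 1250.00)
country = ReferenceRecord("CA", "Canada")

# Transactions reference master entities; reference data classifies both.
assert order.customer_id == customer.entity_id
```

The relationship is the point: transactions accumulate endlessly and always point back at master entities, while reference data is a small, slowly changing vocabulary used to categorize everything else.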
This document outlines best practices for implementing a Master Data Management (MDM) solution to improve data quality. MDM can help by providing a single source of trusted customer, product, and partner data across systems. When implementing MDM, organizations should follow best practices like establishing formal data governance policies, using architecture consistent with existing IT systems, demonstrating a clear business case and ROI, taking a phased approach while making MDM a long-term program, and ensuring active vendor support. Following these practices can help organizations realize the benefits of MDM like increased revenue, cost savings, and regulatory compliance.
Mike Ferguson, managing director of Intelligent Business Strategies, highlights his top ten worst practices in Master Data Management (MDM) in this Information Builders webinar slideshow.
Laserfiche 10 highlights - how the new features can benefit your mobile and wor... (Christopher Wynder)
Laserfiche 10 brings a lot of additional features for information management, workflow building and mobile content access. This slide deck provides the overview of how Laserfiche 10 can benefit clients looking to automate their processes.
This slide deck was assembled over months of project work at a global multinational. Collaboration with some incredibly smart people produced content I wish I had come across before I had to assemble it myself.
Tips & tricks to drive effective Master Data Management & ERP harmonization (Verdantis)
This document summarizes a presentation given by Jeffrey Karson of Siemens Water Technologies and Arthur Raguette of Verdantis regarding their master data management and ERP harmonization initiative at Siemens Water Technologies. Siemens Water Technologies had legacy data quality issues due to multiple acquisitions. They implemented a master data initiative using Verdantis' Harmonize solution to cleanse and enrich historical data and Verdantis Integrity solution for ongoing data governance. The initiative improved data quality, reduced costs, and enabled greater visibility and efficiency. Key metrics like duplicates avoided and data enrichment rates were used to measure success.
The document discusses the responsibilities of an Enterprise Data Architect, including defining vision/strategy for data management, standards, governance, modeling, and more. It lists key tasks like implementing data strategies/roadmaps, models, and governance frameworks. The architect must understand how data is used and mitigate risks. Relevant domains include data strategy/governance, modeling, store definition, analysis, and content management. The architect must also track emerging solutions/topics and possess skills like strategy analysis, communication, and leadership.
Data Integration Trends Businesses Should Watch for in 2021 (Safe Software)
Businesses should watch for several data integration trends in 2021 that can help them gain a competitive advantage. These include embracing automation to eliminate manual tasks, leveraging more data types like spatial and real-time data, evolving infrastructure to the cloud, improving customer experience with AI, planning for effective metadata management, and being prepared for changes in processor technology. To get the most value from data, organizations need data integration solutions that can adapt to these evolving trends.
Data Systems Integration & Business Value Pt. 2: Cloud (DATAVERSITY)
Certain systems are more data focused than others. Usually their primary focus is on accomplishing integration of disparate data. In these cases, failure is most often attributable to the adoption of a single pillar (silver bullet). The three webinars in the Data Systems Integration and Business Value series are designed to illustrate that good systems development more often depends on at least three DM disciplines (pie wedges) in order to provide a solid foundation.
Enterprise Information Management (EIM) is an organizational commitment to define, secure, and improve information accuracy across boundaries to support business objectives. There are 7 key trends in EIM: 1) Focus is shifting from control to access, and from content to people. 2) Keyword search is no longer effective, and alternatives like machine learning are needed. 3) Systems are becoming more complex while users prefer simplicity. 4) Ease of use, implementation, and integration are important. 5) SharePoint focuses too much on content before people. 6) Communication tools are still used more than dedicated collaboration systems. 7) Information architecture models need to evolve to better solve problems. Research shows user adoption and experience are most critical for technology success.
The document discusses different techniques for building a Customer Data Hub (CDH), including registry, co-existence, and transactional techniques. It outlines the CDH build methodology, including data analysis, defining the data model and business logic, participation models, governance, and deliverables. An example enterprise customer data model is also shown using a hybrid-party model with relationships, hierarchies, and extended attributes.
Master data management (MDM) & PLM in context of enterprise product management (Tata Consultancy Services)
The presentation discusses the classical features and advantages of a Master Data Management (MDM) system, along with the situations in which to use one. How do companies that design, manufacture, and sell products across several geographies apply MDM when facing difficult decisions about their investments in the PLM and MDM space?
Another important aspect is the comparison between an MDM system (or Product Master System) and an enterprise PLM system. How can you maximize ROI on both PLM and MDM investments? With examples from different industries, the key takeaways include whether or not your organization requires an MDM solution.
3 Keys To Successful Master Data Management - Final Presentation (James Chi)
This document discusses keys to successful master data management including process, governance, and architecture. It summarizes a survey finding that while many companies see data as an asset, only around 20% have implemented master data management. Successful MDM requires alignment with business objectives, clear governance models, and comprehensive solution architectures. The document advocates establishing policies, procedures, standards, governance, and tools to create and maintain high-quality shared reference data.
Master data management executive MDM buy-in business case (2) (Maria Pulsoni-Cicio)
The document provides guidance on gaining executive support for master data management (MDM) projects. It recommends quantifying the hidden costs of bad data, conducting interviews with stakeholders across business units to understand data issues, and analyzing the findings to build a business case that shows the specific financial benefits of implementing MDM. Key steps include identifying stakeholders in IT and business functions, preparing interview questions tailored to different roles, interviewing a wide range of staff, and using the results to quantify savings and improved revenues from reducing data problems.
Stop the madness - Never doubt the quality of BI again using Data Governance (Mary Levins, PMP)
Does this sound familiar? "Are you sure those numbers are right?" "Why are your numbers different than theirs?"
We've all heard it and had that gut wrenching feeling of doubt that comes with uncertainty around the quality of the numbers.
Stop the madness! Presented in Dunwoody on April 18 by industry-leading expert Mary Levins, who discusses what it takes to successfully take control of your data using the Data Governance Framework. This framework is proven to improve the quality of your BI solutions.
Mary is the founder of Sierra Creek Consulting.
This webinar from Gartner provided seven building blocks for a successful master data management (MDM) plan: vision, strategy, metrics, information governance, organization and roles, information lifecycle, and enabling infrastructure. The presentation emphasized the importance of establishing an MDM vision aligned with business goals, assessing the organization's current MDM maturity, defining metrics to measure success, establishing governance, and considering organizational roles and responsibilities. It also stressed understanding the information lifecycle and having the right technology infrastructure.
IBM's InfoSphere Master Data Management v11 features a unified MDM solution that supports virtual, physical and hybrid implementation styles within a single instance. It provides enhanced governance capabilities, improved support for reference data management and advanced hierarchies. The release also aims to accelerate time to value through simplifying upgrades, pre-built accelerators and modularity. Additionally, v11 further integrates MDM with big data and analytics capabilities, allowing the augmentation of master data with insights from unstructured sources.
SharePoint Saturday Access Services 2015 (InnoTech)
This document discusses Microsoft Access Services 2013 and the benefits it provides for migrating existing Access databases. It allows centralizing data in a SQL database while giving business users independence in designing user interfaces and reports. This improves data security, governance and reliability while making solutions easier to develop and maintain. It also provides tools for deploying Access apps in SharePoint, managing permissions and distribution.
Enterprise Information Management Strategy - a proven approach (Sam Thomsett)
A proven approach to Enterprise Information Management strategy, providing a framework for digital transformation, from Entity Group, a leader in information management consulting.
The document discusses strategies for managing master data through a Master Data Management (MDM) solution. It outlines challenges with current data management practices and goals for an improved MDM approach. Key considerations for implementing an effective MDM strategy include identifying initial data domains, use cases, source systems, consumers, and the appropriate MDM patterns to address business needs.
How to identify the correct Master Data subject areas & tooling for your MDM... (Christopher Bradley)
1. What are the different Master Data Management (MDM) architectures?
2. How can you identify the correct Master Data subject areas & tooling for your MDM initiative?
3. A reference architecture for MDM.
4. Selection criteria for MDM tooling.
chris.bradley@dmadvisors.co.uk
Presentation to AIIM First Canadian Chapter on April 29, 2015. Concepts for better understanding of metadata, controlled vocabularies, and taxonomies for enterprise content.
Document Imaging Initiatives in Government of Canada - PWGSC - October 27, 20... (Cheryl McKinnon)
Slides presented by Bruce Covington, PWGSC on document imaging initiatives in Canadian government. Presented to the Ottawa local AIIM chapter event on October 27, 2011 in support of World Paper Free Day.
IBM Information Management - Pas de décision de qualité sans informations de ... (Nicolas Desachy)
On an ever smarter, more instrumented, and more interconnected planet, the volume of information is exploding. There is no quality decision-making without reliable, relevant information delivered to the right person at the right time. At the Tendances Logicielles New Intelligence event, Dan Benouaisch of IBM developed these concepts and presented the IBM InfoSphere offering that addresses these imperatives.
Presentation from an Information Management seminar discussing and illustrating how Enterprise 2.0 principles and mechanisms can be used when addressing Information Management challenges.
Information Management : de l'excellence opérationnelle à l'excellence inform... (Jean-Michel Franco)
Information management in an SAP environment: why now? What solutions are in the SAP Business Objects portfolio? Lessons learned in a business intelligence context.
United Airlines Best Practices Conference 2013 presentation (Denise Wilson)
United Airlines uses SharePoint for collaboration, content management, and social features. It provisions SharePoint sites according to service level agreements and type of content. Departments have created their own content management systems in SharePoint in 2-8 weeks for initiatives like an operations dashboard, DOT compliance, and a technology service catalog. Taxonomies of lists are important for organizing content, and content types with standardized columns are reused across sites.
Determining What Information to Keep in a File Shares Cleanup - 6 Key Questio... (AIIM International)
A critical part of a file share cleanup is determining what to keep. Developed by AIIM, an independent association focused on intelligent information management, and reviewed by information professionals, this deck presents the 6 questions to ask when deciding whether certain pieces of content should be kept.
This presentation has been extracted from an AIIM online Quick Study course focused on how to cleanup your file shares. To learn more visit: http://www.aiim.org/file-share-cleanup-course
The AIIM Certificate Programs are designed from global best practices among AIIM's 65,000 members, and are available as classroom or online training courses leading to AIIM Practitioner, Specialist and Master designations.
[Webinar Slides] Finding the Right Information in Your Stockpiled Content (AIIM International)
Stop wasting your time and energy on a constant hunt for information. Learn the tips and tricks to help you find the information you need when you need it.
Want to follow along with the webinar replay? Download it here for free: http://info.aiim.org/finding-the-right-information
[Webinar Slides] How to Plan Your Information Management Strategy in 2017 (AIIM International)
AIIM’s chief analyst outlines the data from 5 community-wide surveys and identifies the game-changing trends you need to know about as you plan your information management strategy.
Want to follow along with the webinar replay? Download it here for free: http://info.aiim.org/information-management-strategy.
[Webinar Slides] 7 Key ECM Changes - A Look Ahead to 2017 (AIIM International)
What new changes are coming for Enterprise Content Management (ECM)? Based on AIIM research and conversations with our members, we see 7 key areas where changes and advancements are revolutionizing how you do business and manage information. See an outline of these changes and strategies to address them in these webinar slides.
Want to follow along with the webinar replay? Download it here for free: http://info.aiim.org/7-key-ecm-changes
This document provides an introduction and overview of electronic records management for California state government agencies. It defines key terms related to electronic records and records management systems. It discusses the importance of developing an electronic records management program and plan, including reviewing requirements, training, documenting existing systems, conducting a records inventory and appraisal, and obtaining approval for records retention schedules. The document provides guidance on various aspects of electronic records management such as creating, organizing, and arranging electronic files and records series. It also covers ensuring electronic records integrity, database management, hardware and data security, disaster preparedness and recovery, storage media care, and legal issues. The overall aim is to help agencies improve their electronic recordkeeping practices to meet statutory requirements.
Learning English as a second language - the myths, facts and realitiesNalaka Gamage
This document discusses common myths and facts about learning English to provide a realistic approach. Some myths discussed are that English is too difficult to learn, speaking is more important than writing, and needing a British accent. However, the facts are that English has a simple alphabet and grammar, what is spoken is the same as what is written, and natural accents are accepted. The reality is that English is easy to learn by focusing on essential words, being confident with one's own accent, and understanding context without knowing every word. The key is to ignore myths, put in initial effort, and use common sense when learning and using English.
This document summarizes an article from The Corporate Governance Advisor on tools for boards to oversee cybersecurity risk. It discusses the business impacts and litigation/regulatory risks of cyber attacks. It outlines how boards have an oversight duty to ensure proper information and reporting systems exist to manage cybersecurity risk. The document provides examples of cybersecurity disclosure from companies like Target and Home Depot. It discusses SEC guidance on cybersecurity disclosure and notes boards must exercise oversight in good faith to avoid liability for failures.
Product Cost Analytics solution overviewSridhar Pai
The document discusses a product cost analytics solution from ConverBiz Technologies. It provides self-service reporting on product cost management data through pre-built dashboards. The solution allows assembly cost analysis, commodity analysis, supplier spend projection, and sourcing performance tracking. Sample dashboards show cost breakdowns, top cost drivers, and supplier spending charts. The solution is built on Oracle Product Lifecycle Analytics and integrates with Oracle Agile Product Cost Management.
Experience the wonders of Sri Lanka from the best individual tour operator at budget prices.
Trust us with your holidays. We offer a wide choice of tours, and our experts can also "tailor" a holiday to your personal wishes.
Calendar of JEADER activities _ AFRICA _ 2016 JEADER
2016 was a special year thanks to you! Thank you for your support, and revisit the highlight activities of 2016.
Together, let's make 2017 an equally EXTRAORDINARY year!
Integrating user needs into ECM projects is key to success. Whether it is an initial implementation, a reboot, or simply an expansion of use, user needs and UX testing should be integrated into every project.
ECNO 2016-Using ECM to gain administrative efficiency for school boardsChristopher Wynder
Presentation from ECNO 2016. The presentation centers on embedding records management into process management. We take an IT-project-centric view of how to move from chaos to manageable information access points. A key concept is how ECM and EIM technologies provide opportunities for school boards to reduce their costs and risk.
Data Mesh in Azure using Cloud Scale Analytics (WAF)Nathan Bijnens
This document discusses moving from a centralized data architecture to a distributed data mesh architecture. It describes how a data mesh shifts data management responsibilities to individual business domains, with each domain acting as both a provider and consumer of data products. Key aspects of the data mesh approach discussed include domain-driven design, domain zones to organize domains, treating data as products, and using this approach to enable analytics at enterprise scale on platforms like Azure.
Encrypted Data Management With Deduplication In Cloud...Angie Jorgensen
The document discusses some disadvantages of Minitrex's current data management system and proposes solutions based on customer relationship management (CRM) theories. It finds that Minitrex's data is siloed across different departments, leading to issues like duplicate customer records and a lack of a holistic view of customers. It suggests integrating CRM across departments to get a unified view of customers. It also recommends utilizing CRM software to consolidate data to improve data quality, gain insights, and better manage customer relationships. Leadership support and an integrated, holistic approach are identified as important for effective use of CRM.
This document provides an introduction to business process reengineering (BPR) and enterprise resource planning (ERP). It defines BPR as fundamentally rethinking and redesigning business processes to achieve dramatic improvements in areas like cost, quality and speed. The objectives of BPR include reducing costs and time, improving customer service and reinventing business rules. ERP software aims to integrate all departments and functions across a company onto a single system. It discusses the benefits of ERP including lower costs and better data access, as well as challenges such as high implementation costs and potential inflexibility.
The document discusses several topics related to information management within government organizations. It begins by outlining the key considerations for a Canadian government RFI on cloud services, including policy, business, technical, procurement, pricing and security. It then discusses challenges of moving to the cloud and key capabilities needed for collaboration and content management. Several graphics show examples of infrastructure layouts, the variety of locations information can be stored, and the need to define user journeys to understand how people complete tasks. It emphasizes identifying "dangerous" user groups where compliance issues are most likely to occur to prioritize support and adoption of information management systems.
The deliverable from a consulting engagement for a hospital. The hospital needed to define the requirements for a single EIM platform. This two-day clinic allowed them to identify key issues and requirements to reduce the time to move from idea to RFP, while ensuring that the process stayed focused on hospital goals rather than just technical ease and fastest implementation.
- IT needs to implement an ECM system to manage the growing amount of unstructured data and content that users are storing in unauthorized locations outside of the organization's control, like cloud storage and personal devices.
- For the ECM system to be effective, it must have high adoption amongst end users. This requires understanding how users actually work rather than making assumptions, and designing the ECM system around users' daily tasks and challenges finding information.
- The document outlines exercises for requirements gathering that focus on understanding users' information sources, challenges, and daily workflows in order to design an ECM system that solves users' problems and enables productivity, leading to higher adoption.
Data analytics tools help organizations derive insights from vast amounts of data, enabling informed decision-making, identifying trends and patterns, personalizing customer experiences, optimizing processes, and driving innovation and competitive advantage.
1) MDM is the process of creating a single point of reference for highly shared types of data like customers, products, and suppliers. It links multiple data sources to ensure consistent policies for accessing, updating, and routing exceptions for master data.
2) Successful MDM requires defining business needs, setting up governance roles, designing flexible platforms, and engaging lines of business in incremental programs. Common challenges include lack of clear business cases and roadmaps.
3) Key aspects of MDM include modeling shared data, managing data quality, enabling stewardship of data, and integrating/propagating master data to operational systems in real-time or batch processes.
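As an illustrative sketch only (not from the white paper), the record linking and survivorship described above can be demonstrated in a few lines of Python; the source systems, field names, and the "newest non-empty value wins" rule are all assumptions chosen for the example:

```python
# Minimal master-data consolidation sketch: link records from two source
# systems by a shared natural key and apply a simple survivorship rule
# (the most recently updated non-empty value per field wins).

from datetime import date

crm_records = [
    {"customer_id": "C1", "name": "Acme Corp", "phone": "", "updated": date(2017, 1, 5)},
]
billing_records = [
    {"customer_id": "C1", "name": "ACME Corporation", "phone": "555-0100", "updated": date(2017, 3, 1)},
]

def consolidate(*sources):
    """Build one golden record per customer_id across all sources."""
    golden = {}
    for source in sources:
        for rec in source:
            key = rec["customer_id"]
            current = golden.setdefault(key, dict(rec))
            # Survivorship: newer non-empty values overwrite older ones.
            if rec["updated"] >= current["updated"]:
                for field, value in rec.items():
                    if value:
                        current[field] = value
    return golden

master = consolidate(crm_records, billing_records)
print(master["C1"]["phone"])  # → 555-0100
```

A real MDM platform adds governance, stewardship workflows, and real-time propagation back to operational systems, but the core idea is the same: one point of reference built from many sources under explicit rules.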
Big Data Paris - A Modern Enterprise ArchitectureMongoDB
Since the 1980s, the volume of data produced and the risk associated with that data have exploded. 90% of the data that exists today was created in the last 2 years, and 80% of it is unstructured. With more users and the need for constant availability, the risks are much higher.
What database parameters should a decision-maker take into account when deploying innovative applications?
Making Informed Business Decisions with an Enterprise Information Management ...Perficient, Inc.
Perficient presents: An Enterprise Information Management (EIM) solution integrates structured and unstructured information in a context that users can draw on to make decisions.
EIM solutions provide a seamless, role-based set of tools that let users be more efficient in completing their key tasks.
These tools can include:
Business Intelligence
Enterprise Content Management
Portal
Enterprise Search
Collaboration
E-Mail Management
This document provides information about obtaining fully solved assignments from an assignment help service. It lists the email and phone contact information for the service and provides instructions to include semester and specialization name when reaching out. It also lists the subject codes and names for assignments that are available for various MBA programs and semesters, including Business Intelligence & Tools for semester 3.
Information Governance: Reducing Costs and Increasing Customer SatisfactionCapgemini
The document discusses best practices for information governance, including how it can help organizations reduce costs and increase customer satisfaction. It provides an overview of SAP and Capgemini's information governance best practices and addresses common questions clients have around data issues. Information governance is important because data is a key organizational asset, and governance helps ensure consistent, accurate data is available for reporting and decision making. Lack of governance can lead to issues like multiple versions of the truth and inefficient processes. The benefits of effective information governance include reduced costs through improved data management, better decisions from leveraging high-quality data, and increased customer satisfaction.
AIIM Info 2011 Increasing mobile worker productivityZia Consulting
This session describes how education, healthcare and government organizations can implement a collaborative mobile ECM and Project management strategy for their workforce. Attend as we cover the benefits of using CMIS, mobile applications and devices, and best practices for a mobile ECM delivery strategy.
• The value of writing content-rich mobile CMIS applications that work against multiple ECM repositories.
• How to build a strategy that increases mobile worker productivity by creating task-oriented ECM and project management activities delivered on mobile devices.
• Mobile ECM best practices that utilize a variety of free and widely available software packages on the iPhone and iPad.
• Examples of mobile content delivery and how it has saved local governments time and money.
Successful artificial intelligence enables organizations to capture the thought process of top performers and deploy it as a virtual coach. Combining artificial intelligence with expert knowledge, metadata generation, auto-classification, and taxonomy management delivers great knowledge transfer.
In this webinar Discovery Machine and Concept Searching will demonstrate how their combined offering enables enterprises to establish an effective information framework by enhancing access to corporate knowledge sources with artificial intelligence.
Join us to find out more about how the solution can save your organization both time and money, while increasing accuracy and consistency of corporate knowledge access.
What you will learn about during this session:
• Capturing enterprise knowledge and deploying subject matter expertise as a virtual coach
• Effective content identification and classification, regardless of content location in the enterprise
• Eliminating the error and cost burdens of identification and management of records
• Documenting knowledge in the context of business process to create tangible knowledge assets
• Increasing the quality of information for decision making
• Automatic migration of content driven by classification of metadata
Speakers:
Todd Griffith, CTO and Co-Founder at Discovery Machine
Ken Lemons, Vice President Federal Programs at Concept Searching
John Challis, Founder and Chief Executive Officer at Concept Searching
Tips --Break Down the Barriers to Better Data AnalyticsAbhishek Sood
1) Analytics executives face challenges in collecting, analyzing, and delivering insights from data due to a lack of skills, cultural barriers, IT backlogs, and productivity drains.
2) Legacy systems and complex analytics platforms also impede effective data use. Modular solutions that integrate with existing systems and empower self-service are recommended.
3) The document promotes the Statistica software as addressing these challenges through its ease of use, integration capabilities, and support for big data analytics.
The webinar presentation deck for "Intranet Content Management in a Social World" webinar, presented by Toby Ward, Founder, Prescient Digital Media.
Learn how to create, publish, and manage great content across multiple departments and publishers; and how to ensure old and bad content is renewed, archived or deleted.
A how-to 60-minute webinar hosted by Toby Ward, founder of Prescient Digital Media and the Digital Workplace & Intranet Global Forum conference series. You will learn:
- Rules for creating intranet content
- Intranet governance
- Empowering employees to create the RIGHT CONTENT
- Dos and Don'ts for content management and SharePoint
Similar to Information Management aaS AIIM First Canadian presentation (20)
Whitepaper developed with Pharma Exec magazine on how EIM (Enterprise Information Management) can provide efficiency and kick-start innovation by ensuring information flows correctly inside and outside the company.
Healthcare products suffer from a lack of ability to control documents and non-clinical images. OpenText ApplicationXtender can solve that problem for vendors through our OEM program. This whitepaper goes through the benefits of embedding ApplicationXtender into healthcare products.
OpenText ApplicationXtender provides cost-effective document management. For software vendors looking to expand or build a healthcare-focused product, "AX" can be embedded to provide first-class content services without the high cost of research and development.
Automating Patient Management with ApplicationXtender WorkflowChristopher Wynder
The hardest part about managing a clinic is keeping everybody up to date with the right information, whether that is as simple as making sure billing is alerted of a new bill or as complex as managing follow-ups after a referral. There are simply too many documents, emails, and schedules for a person to manage. This is the value of workflow to a clinic or hospital: setting the rules regarding who gets to see certain types of documents and ensuring that they know about the updated information.
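As a hedged illustration (not ApplicationXtender's actual product logic), the kind of routing rule described above, deciding who may view a document type and who must be notified when it changes, can be sketched like this; all document types and role names are assumptions made up for the example:

```python
# Toy document-routing rules for a clinic: map each document type to the
# roles allowed to view it and the roles that must be notified on update.
# Document types and role names here are illustrative assumptions only.

ROUTING_RULES = {
    "invoice":         {"viewers": {"billing", "admin"},            "notify": {"billing"}},
    "referral_letter": {"viewers": {"physician", "nurse", "admin"}, "notify": {"physician"}},
    "lab_result":      {"viewers": {"physician", "nurse"},          "notify": {"physician", "nurse"}},
}

def can_view(role, doc_type):
    """Return True if the role is allowed to see this document type."""
    rule = ROUTING_RULES.get(doc_type)
    return rule is not None and role in rule["viewers"]

def notify_list(doc_type):
    """Roles that should be alerted when a document of this type arrives."""
    rule = ROUTING_RULES.get(doc_type, {"notify": set()})
    return sorted(rule["notify"])

print(can_view("billing", "invoice"))  # → True
print(notify_list("lab_result"))       # → ['nurse', 'physician']
```

A workflow engine layers timers, escalation, and audit trails on top, but the value starts with exactly this: explicit, centrally maintained rules instead of each person deciding ad hoc who sees what.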
This document discusses preparing healthcare organizations for a digital future. It explains that electronic health records and digital health information systems can improve patient care by giving providers a comprehensive digital view of a patient's health history. However, digital transformation requires new tools to securely manage both structured data and unstructured information like medical images. The document recommends that healthcare organizations implement an enterprise content management system to collect, manage, and act as a repository for both structured and unstructured patient information across departments. This will help improve efficiency, collaboration, and regulatory compliance while reducing costs.
This deck goes through the Information conundrum and how ApplicationXtender is positioned to provide the technical platform for organizations to start moving from paper to a digital future
ThinkDox talk from ECNO 2017 on using Laserfiche to manage student records and student information. We use the examples of Field trip forms and student record search to highlight the potential administrative efficiencies that can be gained.
Histone demethylase and its role in cell biology reviewChristopher Wynder
This document provides a scientific review of the histone demethylase enzymes, particularly the H3K4 demethylases (KDM5 family), focusing on their role in cell biology. This review was written in 2014.
1. The document discusses different types of electronic signatures and their legal validity, including email signatures, signatures on password-protected websites, signatures validated through third parties like social media accounts, and digital signatures.
2. It analyzes four scenarios representing the different signature types to determine their ability to meet legal requirements in Canada.
3. While digital signatures provide the highest assurance, the document concludes that other electronic signatures could also be legally valid depending on the specific use case and how well they meet criteria like uniquely identifying the signer.
The document discusses improving processes for updating student records at the end of the school year. It identifies three sub-processes: 1) records change approval, 2) records change workflow, and 3) records update capture. Each sub-process is occurring outside the existing systems and causing issues. The document recommends analyzing each sub-process, addressing why they happen outside the systems, and moving the entire process to a form-based approach within a single system to optimize the process.
We are often asked why use a VAR: what am I paying you for? This presentation goes through the basics of how we implement Laserfiche and provide continual support above and beyond basic technical support; we make sure you understand what is possible and support you as you maximize your investment.
Moving records management from a paper-based strategy to an electronic strategy requires re-thinking what needs to be protected and where the threats to security exist.
The key is to stop focusing on the artifact (the document) and focus on the information that is important. Documents are just the storage media to move the information from person to person.
AMCTO presentation on moving from records management to information managementChristopher Wynder
This presentation was given to AMCTO zones 1 and 4/5. It presents how to use the records classification as the core for a faceted classification schema that can be used to enable workflow and processes across the organization.
Protocol for preparing small-volume samples from ES cells, clinical samples, and non-adherent cell types.
This protocol is designed to prepare samples for histone-modifying enzyme assays, including demethylase, methyltransferase, and deacetylase assays.
This is a re-boot of a presentation on the potential role of cloud infrastructure in healthcare delivery, originally given at eHealth Canada 2012.
Key concepts are the drivers of change in healthcare, how hospitals can protect themselves when using the cloud, the potential use of enterprise content management as part of healthcare delivery, and the current models that we are seeing in Canada and the US.
Regulation of KDM5 by multiple cofactors regulates cancer and stem cellsChristopher Wynder
The document discusses regulation of histone modifications and neural differentiation. It notes that during the first two years, ASD brains "over-grow" leading to hyper-connectivity and often apoptosis, resulting in absence of neurons. ASD can also result from later synaptic activity changes. Many ASD genes are also cancer-related, and epigenetics is given as the reason for ASD. Recently, mutations in multiple epigenetic regulators have been found in ASD patients. Histone modifications, like those regulated by KDM5 proteins, are an example of epigenetic changes that can be significant and changeable. KDM5 proteins are regulated by a two-part system involving activation by TLE4 and gene-specific
KDM5 epigenetic modifiers as a focus for drug discoveryChristopher Wynder
A summary presentation of my scientific work.
My laboratory focused on an enzyme KDM5b (aka PLU-1, JARID1b) that was widely expressed during development and played a key role in progression of breast cancer through HER-2.
My lab focused on understanding the key biochemical activity of the enzyme through dissecting the proteomic and genomic interactors.
Our results were confirmed through the use of ES cells, adult stem cells and mouse models.
Much of this work remains unpublished, please contact me for more information and/or access to any reagents that I still have as part of this work.
crwynder@gmail.com
Primer on Epigenetics given at the IRSF family conference 2011Christopher Wynder
My presentation for the families of Rett syndrome patients.
This serves as a basic primer on what epigenetics is without deep details on the science.
Appropriate for all levels of education.
For more information contact the author: crwynder@gmail.com
Taking AI to the Next Level in Manufacturing.pdfssuserfac0301
Read Taking AI to the Next Level in Manufacturing to gain insights on AI adoption in the manufacturing industry, such as:
1. How quickly AI is being implemented in manufacturing.
2. Which barriers stand in the way of AI adoption.
3. How data quality and governance form the backbone of AI.
4. Organizational processes and structures that may inhibit effective AI adoption.
5. Ideas and approaches to help build your organization's AI strategy.
Fueling AI with Great Data with Airbyte WebinarZilliz
This talk will focus on how to collect data from a variety of sources, leveraging this data for RAG and other GenAI use cases, and finally charting your course to productionalization.
Freshworks Rethinks NoSQL for Rapid Scaling & Cost-EfficiencyScyllaDB
Freshworks creates AI-boosted business software that helps employees work more efficiently and effectively. Managing data across multiple RDBMS and NoSQL databases was already a challenge at their current scale. To prepare for 10X growth, they knew it was time to rethink their database strategy. Learn how they architected a solution that would simplify scaling while keeping costs under control.
In the realm of cybersecurity, offensive security practices act as a critical shield. By simulating real-world attacks in a controlled environment, these techniques expose vulnerabilities before malicious actors can exploit them. This proactive approach allows manufacturers to identify and fix weaknesses, significantly enhancing system security.
This presentation delves into the development of a system designed to mimic Galileo's Open Service signal using software-defined radio (SDR) technology. We'll begin with a foundational overview of both Global Navigation Satellite Systems (GNSS) and the intricacies of digital signal processing.
The presentation culminates in a live demonstration. We'll showcase the manipulation of Galileo's Open Service pilot signal, simulating an attack on various software and hardware systems. This practical demonstration serves to highlight the potential consequences of unaddressed vulnerabilities, emphasizing the importance of offensive security practices in safeguarding critical infrastructure.
5th LF Energy Power Grid Model Meet-up SlidesDanBrown980551
5th Power Grid Model Meet-up
It is with great pleasure that we extend to you an invitation to the 5th Power Grid Model Meet-up, scheduled for 6th June 2024. This event will adopt a hybrid format, allowing participants to join us either through an online Microsoft Teams session or in person at TU/e located at Den Dolech 2, Eindhoven, Netherlands. The meet-up will be hosted by Eindhoven University of Technology (TU/e), a research university specializing in engineering science & technology.
Power Grid Model
The global energy transition is placing new and unprecedented demands on Distribution System Operators (DSOs). Alongside upgrades to grid capacity, processes such as digitization, capacity optimization, and congestion management are becoming vital for delivering reliable services.
Power Grid Model is an open source project from Linux Foundation Energy and provides a calculation engine that is increasingly essential for DSOs. It offers a standards-based foundation enabling real-time power systems analysis, simulations of electrical power grids, and sophisticated what-if analysis. In addition, it enables in-depth studies and analysis of the electrical power grid’s behavior and performance. This comprehensive model incorporates essential factors such as power generation capacity, electrical losses, voltage levels, power flows, and system stability.
Power Grid Model is currently being applied in a wide variety of use cases, including grid planning, expansion, reliability, and congestion studies. It can also help in analyzing the impact of renewable energy integration, assessing the effects of disturbances or faults, and developing strategies for grid control and optimization.
What to expect
For the upcoming meetup we are organizing, we have an exciting lineup of activities planned:
-Insightful presentations covering two practical applications of the Power Grid Model.
-An update on the latest advancements in Power Grid Model technology during the first and second quarters of 2024.
-An interactive brainstorming session to discuss and propose new feature requests.
-An opportunity to connect with fellow Power Grid Model enthusiasts and users.
A Comprehensive Guide to DeFi Development Services in 2024Intelisync
DeFi represents a paradigm shift in the financial industry. Instead of relying on traditional, centralized institutions like banks, DeFi leverages blockchain technology to create a decentralized network of financial services. This means that financial transactions can occur directly between parties, without intermediaries, using smart contracts on platforms like Ethereum.
In 2024, we are witnessing an explosion of new DeFi projects and protocols, each pushing the boundaries of what’s possible in finance.
In summary, DeFi in 2024 is not just a trend; it’s a revolution that democratizes finance, enhances security and transparency, and fosters continuous innovation. As we proceed through this presentation, we'll explore the various components and services of DeFi in detail, shedding light on how they are transforming the financial landscape.
At Intelisync, we specialize in providing comprehensive DeFi development services tailored to meet the unique needs of our clients. From smart contract development to dApp creation and security audits, we ensure that your DeFi project is built with innovation, security, and scalability in mind. Trust Intelisync to guide you through the intricate landscape of decentralized finance and unlock the full potential of blockchain technology.
Ready to take your DeFi project to the next level? Partner with Intelisync for expert DeFi development services today!
Introduction of Cybersecurity with OSS at Code Europe 2024Hiroshi SHIBATA
I develop the Ruby programming language, RubyGems, and Bundler, which are package managers for Ruby. Today, I will introduce how to enhance the security of your application using open-source software (OSS) examples from Ruby and RubyGems.
The first topic is CVE (Common Vulnerabilities and Exposures). I have published CVEs many times. But what exactly is a CVE? I'll provide a basic understanding of CVEs and explain how to detect and handle vulnerabilities in OSS.
Next, let's discuss package managers. Package managers play a critical role in the OSS ecosystem. I'll explain how to manage library dependencies in your application.
I'll share insights into how the Ruby and RubyGems core team works to keep our ecosystem safe. By the end of this talk, you'll have a better understanding of how to safeguard your code.
Dandelion Hashtable: beyond billion requests per second on a commodity serverAntonios Katsarakis
This slide deck presents DLHT, a concurrent in-memory hashtable. Despite efforts to optimize hashtables that go as far as sacrificing core functionality, state-of-the-art designs still incur multiple memory accesses per request and block request processing in three cases. First, most hashtables block while waiting for data to be retrieved from memory. Second, open-addressing designs, which represent the current state-of-the-art, either cannot free index slots on deletes or must block all requests to do so. Third, index resizes block every request until all objects are copied to the new index. Defying folklore wisdom, DLHT forgoes open-addressing and adopts a fully-featured and memory-aware closed-addressing design based on bounded cache-line-chaining. This design (1) offers lock-free index operations and deletes that free slots instantly, (2) completes most requests with a single memory access, (3) utilizes software prefetching to hide memory latencies, and (4) employs a novel non-blocking and parallel resizing. In a commodity server and a memory-resident workload, DLHT surpasses 1.6B requests per second and provides 3.5x (12x) the throughput of the state-of-the-art closed-addressing (open-addressing) resizable hashtable on Gets (Deletes).
Let's Integrate MuleSoft RPA, COMPOSER, APM with AWS IDP along with Slackshyamraj55
Discover the seamless integration of RPA (Robotic Process Automation), COMPOSER, and APM with AWS IDP enhanced with Slack notifications. Explore how these technologies converge to streamline workflows, optimize performance, and ensure secure access, all while leveraging the power of AWS IDP and real-time communication via Slack notifications.
Have you ever been confused by the myriad of choices offered by AWS for hosting a website or an API?
Lambda, Elastic Beanstalk, Lightsail, Amplify, S3 (and more!) can each host websites + APIs. But which one should we choose?
Which one is cheapest? Which one is fastest? Which one will scale to meet our needs?
Join me in this session as we dive into each AWS hosting service to determine which one is best for your scenario and explain why!
Building Production Ready Search Pipelines with Spark and MilvusZilliz
Spark is the widely used ETL tool for processing, indexing and ingesting data to serving stack for search. Milvus is the production-ready open-source vector database. In this talk we will show how to use Spark to process unstructured data to extract vector representations, and push the vectors to Milvus vector database for search serving.
Salesforce Integration for Bonterra Impact Management (fka Social Solutions A...Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on integration of Salesforce with Bonterra Impact Management.
Interested in deploying an integration with Salesforce for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
zkStudyClub - LatticeFold: A Lattice-based Folding Scheme and its Application...Alex Pruden
Folding is a recent technique for building efficient recursive SNARKs. Several elegant folding protocols have been proposed, such as Nova, Supernova, Hypernova, Protostar, and others. However, all of them rely on an additively homomorphic commitment scheme based on discrete log, and are therefore not post-quantum secure. In this work we present LatticeFold, the first lattice-based folding protocol based on the Module SIS problem. This folding protocol naturally leads to an efficient recursive lattice-based SNARK and an efficient PCD scheme. LatticeFold supports folding low-degree relations, such as R1CS, as well as high-degree relations, such as CCS. The key challenge is to construct a secure folding protocol that works with the Ajtai commitment scheme. The difficulty, is ensuring that extracted witnesses are low norm through many rounds of folding. We present a novel technique using the sumcheck protocol to ensure that extracted witnesses are always low norm no matter how many rounds of folding are used. Our evaluation of the final proof system suggests that it is as performant as Hypernova, while providing post-quantum security.
Paper Link: https://eprint.iacr.org/2024/257
GraphRAG for Life Science to increase LLM accuracyTomaz Bratanic
GraphRAG for life science domain, where you retriever information from biomedical knowledge graphs using LLMs to increase the accuracy and performance of generated answers
4. Patterns of HOW work gets done have changed
ThinkDox INC.
Model adapted from Dion Hinchcliffe @dionhinchcliffe; IT models
Image from: http://gmdd.shgmo.org/Computational-Biology/ANAP/ANAP_V1.1/help/anap-userguide/manual.html
Strictly Org. Chart:
• Information passed upwards.
• Access to information was strictly tied to position in hierarchy.
• Ability to act on information was tied to hierarchy.
Nodal:
• Information passes between nodes based on relationships within the organization.
• Ability to act on information is tied to role and project.
5. How information is conveyed and used is no longer tied to artifacts
Users do not have “silo’ed” work days where they handle just records or just documents.
Documents contain information that is used for particular business processes. There is NO INHERENT VALUE in the container.
Records are a subclass of documents that must be treated differently. Specifically, they must be maintained in the format that conveyed the information, i.e. the container has value.
[Diagram: the average user’s day, 9am to 5pm, crossing ERP/CRM and other information sources]
6. The days of separate information sources are over
[Diagram: a resource-driven view of corporate information (organization-owned content stores, departmentally controlled content stores, individual corporate stores, individual personal data, ERP/CRM) contrasted with a service-driven view of corporate information]
7. It’s not a tech problem; it is an alignment problem
Solving the digital divide problem
Preparing for success
How to ensure that you meet the actual needs of the organization
8. Most records-handling issues are systemic, not antagonistic
Use and understanding of “e” records technology
“Technology agnostic problems”
9. The technical issues typically stem from user habits
Collaboration · System of record · Access control · Templates
Example failure chain around a template:
• PDF generation, as designed, strips metadata and is not linked to a form type in the system of record.
• The admin kept a copy of the template on a hard drive.
• No one actually used SharePoint for version control.
• IT had tied metadata to the “live” copy.
12. Effective ECM is service driven:
• It is embedded into normal work processes
• Provides time savings to system users
• Aligns with organization strategy and goals
13. How to ensure that you meet the actual needs of the organization
Moving process into the digital forum
Understanding how information flows between users
Preparing for success
14. Current practices in information management are designed based on content control rather than information movement
Lifecycle: Capture → Organize → Use → Archive or retire
• Rigid organization-enforced taxonomy
• Retention rules
• Disposition workflow
• Audit of deletion schedules
How it is generated does not matter in a paper world; the physical artifact is “handed over”.
Use is controlled via ownership of the artifact.
15. Nodal working patterns mean that a worker’s path through information sources looks chaotic
[Diagram: a user’s journey through information sources, e.g. Finance]
16. Supporting nodal working patterns
An architect plans the design of information:
• Brings structure to unstructured sources.
• Provides easy access to information users already know about.
• Requires existing user compliance and understanding of information sources.
A gardener sets the parameters of access:
• Single point of entry across multiple types of information based on process.
• Provides access to a wide variety of information.
• Requires understanding of how work gets done.
17. The Information Garden
Mix of content types · Refresh schedule · Harvest schedule
18. Integrate the information architecture into the garden at the “plot” level
• Re-think information architecture to be a “feature” of each information source.
• The financial system and the ECM holding the POs will have different IA.
• Both, however, need to be intuitive to the user.
IA is simply the balancing of IT efficiency, business efficiency, and risk mitigation.
19. Generate an information lifecycle for different asset classes
ECM lifecycle: Capture → Organize → Use → Archive or retire
User information lifecycle: Generate → Record → Use (with repeated organizing and re-organizing) → Forget or store
• ECM works best when the information is organized at capture.
• The un-asked question: “How do users get work done?” This is key to how users expect to find information.
• Users lack the tools to appropriately archive content.
• Re-use leads to lots of local copies.
20. Addressing the “Why”: understand how each user works
User journey of an admin’s day, touching admin, student records, and facilities management systems. Steps include: check information, request updates, confirm updates, draft orders, request approval, get approval, review orders, send orders.
Analysis:
• The nature of approvals is the real issue.
• Facilities management is completely done through accounting software, which has no ability to capture “wet signatures”.
• The approver wants to just send an email.
21. Focus on cross-application metadata to ensure information findability
The right two pieces of process information will allow users to find the right documents:
• Object alone = weak recall
• Who alone = weak recall
• Object + Who = strong recall
People are hard-wired to remember WHO they:
• Work with
• Communicated with
• Made the original
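The “who + object = strong recall” idea above can be sketched as a toy search filter. This is a hypothetical illustration only (the document names, descriptor values, and `find` helper are invented, not any ECM product’s API): each document carries a WHO descriptor and an OBJECT descriptor, and supplying both narrows weak recall down to a single hit.

```python
# Toy model of cross-application metadata recall (hypothetical example).
# Each document carries two process descriptors: WHO and OBJECT.
documents = [
    {"id": 1, "who": "alice", "object": "purchase order", "source": "ERP"},
    {"id": 2, "who": "alice", "object": "invoice", "source": "ECM"},
    {"id": 3, "who": "bob", "object": "purchase order", "source": "ECM"},
]

def find(who=None, obj=None):
    """Return documents matching every descriptor that was supplied."""
    return [
        d for d in documents
        if (who is None or d["who"] == who)
        and (obj is None or d["object"] == obj)
    ]

# One descriptor alone over-recalls ("weak recall"):
print(len(find(who="alice")))               # 2 candidates
print(len(find(obj="purchase order")))      # 2 candidates
# Both descriptors together pinpoint the document ("strong recall"):
print(find(who="alice", obj="purchase order")[0]["id"])  # 1
```

In practice the same two descriptors would be applied as shared metadata fields across the ERP, ECM, and file shares, so one query spans all sources.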
22. Matching access to resources to working patterns
[Diagram: the average user’s day, 9am to 5pm, across ERP/CRM and other applications]
Ask: how many different applications are they using? How many times are they breaking compliance?
Generate: How do users generate content? What are the file types, and what are the key applications?
Record: Where is the information from that content being recorded? Office documents, applications?
Organize: What is the point of the content? Is the information being shared? Is it for revenue generation? Does it need to be moved to other people?
When: ...is the information source used again? What do users really need, and what can you securely provide them?
23. Categorize the descriptors based on GROW fields
Potential taxonomy descriptors (GROW), which could be the drop-down terms under a wide category: contract negotiations, billing, contracts, secondary office, remote, CRM logs, surveys, direct interaction, location financials, call list, daily activities, calendar, hand-over, workgroup.
Remember, this initial goal is about gaining control over documents. The long-term goal is a living set of descriptors that mirror business practices.
These are probably too specific; additional personas will generalize them further to make them usable.
25. Information management is no longer about choosing the right application; it is about ensuring information is served to users appropriately.
Example stack: business processes (financial services, IT service, resource planning) run on applications (ERP, the “S drive”, Excel) on infrastructure (ATL data center, local SAN, web service).
Users care about:
1. How do I get to the right app?
2. Where is the information I need to complete my task?
3. How do I find that information?
IT should know:
1. Which application(s) management wants users to use.
2. Who should have access to all the applications in the process.
3. The compliance risk of the information managed.
Management should know: the cost of the application and the cost of maintaining the application(s).
IT should measure: usage of the app and the cost of contracts.
The CIO should know: the cost of infrastructure and the cost of IT time to maintain it.
IT should control: usage of bulk storage for high-risk processes.
26. Information Management as a Service is a multi-project move
• Information risk and value: enterprise-wide policies
• Archiving: disposition, growth control
• Information organization: build a taxonomy
• Storage management: enterprise-wide storage control through deletion
The key to controlling growth is translating management practices into governance policies: Management → Governance → Long-term ROI.
27. Move from here to IMaaS carefully
What is shaping ECM and information management in general
Managing IMaaS
How to ensure that you meet the actual needs of the organization
28. Design an information platform for long-term use
[Diagram: ECM at the center, surrounded by public access, home, mobile client, and web access]
Open questions: API? Data lake? Connector? User access? Vendor consolidation?
29. Your security plan needs to change to protect your information
[Diagram: before and after views of the ECM security perimeter, each with web access, home, AD, mobile client, and client connections]
30. Build your IMaaS iteratively
Principles of Kanban:
1. Visualize work
• By creating a visual model of your work and workflow, you can observe the flow of work moving through your Kanban system.
2. Limit work in process
• You can also avoid problems caused by task switching and reduce the need to constantly reprioritize items.
3. Focus on flow
• By using work-in-process (WIP) limits and developing team-driven policies, you can optimize your Kanban system to improve the smooth flow of work, collect metrics to analyze flow, and even get leading indicators of future problems by analyzing the flow of work.
4. Continuous improvement
• Once your Kanban system is in place, it becomes the cornerstone for a culture of continuous improvement. Teams measure their effectiveness by tracking flow, quality, throughput, lead times, and more.
Use Kanban as a starting point. Kanban comes from Toyota’s “Just-in-Time” model of supply management, which required precise knowledge of when parts were needed, how many, and what the rate of replenishment was for any given part. Similarly, from a user’s perspective, information is only useful at the time of use, and out-of-date data is wasteful. Unless you are an academic, you likely want the information in order to do something now.
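The “limit work in process” principle above can be sketched in a few lines. This is a minimal hypothetical model (the `KanbanColumn` class and task names are invented, not any Kanban tool’s API): a column refuses to pull new work once its WIP limit is reached, so the team must finish something before starting more.

```python
# Minimal sketch of a WIP-limited Kanban column (hypothetical example).
class KanbanColumn:
    def __init__(self, name, wip_limit):
        self.name = name
        self.wip_limit = wip_limit
        self.items = []

    def pull(self, item):
        """Pull an item into the column only if capacity remains."""
        if len(self.items) >= self.wip_limit:
            return False  # blocked: finish work before starting more
        self.items.append(item)
        return True

    def finish(self, item):
        self.items.remove(item)

doing = KanbanColumn("Doing", wip_limit=2)
print(doing.pull("migrate HR fileshare"))  # True
print(doing.pull("tag finance records"))   # True
print(doing.pull("build taxonomy v2"))     # False, WIP limit hit
doing.finish("migrate HR fileshare")
print(doing.pull("build taxonomy v2"))     # True, capacity freed
```

For an IMaaS rollout, the “items” would be the small, pain-point-sized projects described in the following slides, visualized on a shared board so the whole team sees flow and blockages.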
31. Alignment and mapping of goals and how-to
User perspective:
• What do users need to know to perform their job?
• Is it tied to a process or just general knowledge?
• Does it expire, or change based on time or location?
Enterprise perspective:
• What are the required permissions for users to get their job done?
• Does access to information enhance process efficiency?
• Where is the necessary information for a process coming from?
32. Don’t impede day-to-day operations
User perspective:
• Don’t re-build the process in version 1.0.
• Aggressively push for more access to base information.
• Limit the need for application switching for information-only purposes.
Enterprise perspective:
• Design simple, implementable versions that fix a pain point.
• Think information movement first, technical integration second.
• Limit the number of apps that manage processes.
33. Look to cull “extra” people-steps
User perspective:
• Collect data; focus on changes that reduce tedious steps.
• Embed compliance steps into automation.
• Design for “peak laziness”.
Enterprise perspective:
• Focus on the efficiency of the whole process.
• Be willing to pay for automation to be done correctly.
• Provide the ECM/BPM team with full access to compliance needs.
34. Have a plan for version 3.0 before completing 1.0
User perspective:
• Have a transparent roadmap for what you are building towards.
• Keep assuming there are steps that can be removed from users.
• Stage the transfer of user-driven to machine-performed steps so as to be seamless.
Enterprise perspective:
• Place a premium on process efficiency as a design element.
• Engage process users to reduce the headaches of change.
• Automate steps that impede information movement.
35. Balance strategy with reality
[Diagram: department-level to org-level spectrum covering system of interaction, system of record, access control, findability, archive, and ad hoc/fileshare]
Holistic planning for information management:
• Infrastructure planning
• Requirement gathering
• Implementation
• Integrated retention and disposition schedules
• Understanding trends in content generation
• Information management strategy
• Technological support for managing information
36. DO NOT underestimate the role of engagement as part of the move to IMaaS
Focus on the three key tactics for success when implementing change:
Communicate
• Over-explain the need for user involvement in the move to IMaaS.
• Provide a mechanism for feedback.
• Schedule, and keep to the schedule of, feedback.
• Nothing kills an ECM project faster than silence from the ECM team.
Collaborate
• Build collaborative partnerships with the business when shaping the changes in related processes: employee on-boarding, retire/fire, financial reporting.
• Create a clear, shared vision between the key stakeholders and IT. Take everyone with you; develop a shared agenda.
Build confidence
• Build confidence in the change: allocate time and resources for user testing and training.
• Provide visible and active post-rollout support. Get feedback, fix problems, and keep communication channels open.
37. IMaaS is as much about IT staffing as it is about choosing the right platforms
• Start by determining how similar the key intra- and inter-departmental information movement patterns are across key IT platforms.
• Start with the SERVICE issue: “Salesforce isn’t working” may be the equivalent of “I can’t find the customer ID” to users. The application support team and the service desk team need to work in concert.
• IT asset management and capacity planning are the necessary skills for technical success.
• Communicate, communicate, communicate. Every aspect of any “aaS” is about clear lines of communication between support and users.
38. Thank you
Have questions or want a copy of the presentation?
Email me: chrisw@thinkdox.com
Don’t want to email me? See our website’s presentation page:
http://thinkdox.com/news/white-papers-and-presentations/
We are on Twitter and LinkedIn: @Thinkdox, @ChrisW_thinkdox
40. Provide a single strategy to manage information
A single value-focused governance plan spanning physical records, documents, records, and databases:
• Putting a value on information, not artifacts
• Reducing risk through process
• Reducing mundane tasks
41. Multiple service offerings
Workshops
• Provide peer support and specific information management training through problem solving.
• Designed to be both networking events and skills development.
• Typically industry focused on process or technology.
Clinics
• Provide planning advice, recommendations, and technology review customized to you.
• Private, focused problem-solving engagements.
• Designed to provide a plan for solving a problem as well as the processes for development.
Consulting/Teaching
• Provides project management and an extra pair of hands to get past issues when you have momentum.
• Long-term, traditional professional services.
42. Take full advantage of your software’s strong suits
Automated · Optimized · Simplified
• Keep up with the users.
• Re-evaluate the technology.
• Review the schedules and alignment with regulations.
• Expand the connections between information sources.
Editor's Notes
Barely repeatable process:
Organization applications such as ERPs, CRMs, and other data-focused apps give a home to highly repeatable processes such as order processing and customer engagements. These are often mundane tasks that have the same start, end, and order to the workflow.
These highly repeatable processes often surround highly regulated documents. Users understand the need for workflow and repeatability to reduce regulatory pain.
The problem becomes using these data sources as part of a user’s job, to be productive.
Any process that has high complexity, crosses information sources, and needs to be communicated is rarely done the same way or in the same order.
These barely repeatable processes are often ad hoc, multi-source, multi-person processes: building a document, diagnosing a patient, requesting time off, building revenue projections.
For IT, it is nearly impossible to understand what users actually do, and therefore to ensure the tools work.