Presentation on using workflow to implement a widely used ECM system.
Provides a step-by-step outline of how to understand user needs through marketing techniques such as user journeys and persona building.
Introduces the concept that ECM is an organically growing system rather than an architected software solution.
Moving records management from a paper-based strategy to an electronic strategy requires rethinking what needs to be protected and where the threats to security exist.
The key is to stop focusing on the artifact (the document) and focus on the information that is important. Documents are just the storage media to move the information from person to person.
AMCTO presentation on moving from records management to information management - Christopher Wynder
This presentation was given to AMCTO zones 1 and 4/5. It presents how to use the records classification as the core for a faceted classification schema that can be used to enable workflow and processes across the organization.
Integrating user needs into ECM projects is key to success. Whether it is an initial implementation, a reboot, or an expansion of use, user needs and UX testing should be integrated into every project.
The deliverable from a consulting engagement for a hospital. The hospital needed to define the requirements for a single EIM platform. This two-day clinic allowed them to identify key issues and requirements to reduce the time to move from idea to RFP, while ensuring that the process stayed focused on hospital goals rather than just technical ease and the fastest implementation.
This is a reboot of a presentation originally given on the potential role of cloud infrastructure in healthcare delivery at eHealth Canada 2012.
Key concepts are the drivers of change in healthcare, how hospitals can protect themselves when using the cloud, the potential use of enterprise content management as part of healthcare delivery, and the current models that we are seeing in Canada and the US.
Many companies have started to experience the consequences of non-existent, insufficient or poorly implemented data security plans. The absence of ‘proper IT’ to serve the diversity of information management, analysis and human-centric workflow requirements that exist in the office has created a mass of unsecured business-critical information held in spreadsheets and micro-databases beyond the governance of IT teams. For most organizations, up to 60% of business-critical information is found in these unsecured office environments.
Forrester: How Organizations Are Improving Business Resiliency with Continuou... - EMC
This analyst report describes the reasons why adoption of continuous availability is rapidly increasing, citing research on the benefits organizations believe they can realize in their IT environments.
Online Hospital Appointment Management using Cloud Computing - ijtsrd
Salesforce cloud computing is quickly replacing the traditional model of having software applications installed on on-premises hardware, from desktop computers to rooms full of servers, depending on the size of the business. With cloud computing, businesses access applications via the internet; this is called Software as a Service, or SaaS. Salesforce is the leader in cloud computing, offering applications for all aspects of your business, including CRM, sales, ERP, customer service, marketing automation, business analytics, mobile application building, and much more. And it all works on the same, connected platform, drawing from the same customer data. The outpatient departments of most clinics in developing countries face a plethora of issues. These include overtime for doctors and nurses during clinic sessions, long waiting times for patients, and peak workloads for counter personnel. The quality of healthcare delivery has been threatened by overtime and peak workloads. This paper focuses on developing a web-based appointment system to improve efficiency and quality of care by reducing waiting time. In this project, a patient appointment and scheduling system is designed using Salesforce Lightning for the frontend, and Apex is used to get values from the frontend and store them in backend cloud storage. Nithiyasri Sathiyamoorthy | Amaresan S "Online Hospital Appointment Management using Cloud Computing" Published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-4 | Issue-4, June 2020, URL: https://www.ijtsrd.com/papers/ijtsrd31681.pdf Paper URL: https://www.ijtsrd.com/management/other/31681/online-hospital-appointment-management-using-cloud-computing/nithiyasri-sathiyamoorthy
IDC Study on Enterprise Hybrid Cloud Strategies - EMC
White paper discussing an IDC survey of over 650 enterprise IT decision makers that was designed to understand the evolution of the cloud across the world’s largest IT organizations.
From Asset to Impact - Presentation to ICS Data Protection Conference 2011 - Castlebridge Associates
This is a presentation I delivered to the Irish Computer Society Data Protection Conference in February 2011 and again on a webinar for dataqualitypro.com in March 2011.
It looks (for what I believe was the first time) at the relationship between Information Quality and Data Governance principles and practices and the objectives of Data Protection/Privacy compliance. It includes my first version of the mapping of the 8 Data Protection principles to the POSMAD Information Life Cycle referred to by McGilvray and others in the IQ/DQ fields.
Consolidating Payment Processing, A Nonprofit Perspective with American Heart... - CDS Global, Inc.
At IOFM Payments Summit 2014, the American Heart Association shared how moving to a centralized payment processing model with help from CDS Global improved its organization, including better data integrity and consistency.
Learn how to get started on finding and implementing the best consolidated payment processing model for your organization.
IOFM Payments Summit on Twitter: #PaymentsSummit
For more info, grab the case study:
http://www.cds-global.com/resources/american-heart-association-case-study/
PRESENTERS:
• Chip Sugrue, ( @ChipSugrue ) National Vice President - Customer Strategies/Affiliate Management Consultant; American Heart Association
• Erin Westergaard, ( @eewestergaard ) Client Director, CDS Global
Grant Thornton - Global Tax Automation Talkbook - Alex Baulf
Instead, there is a suite of solutions available (from different software solution providers and consultants), each designed to relieve specific pressure points in the global VAT/GST compliance process. This high-level talkbook is designed to set out the pressure points at each stage of the VAT/GST compliance cycle and the various options, at a high level, to help facilitate an impartial commercial discussion around tax automation and technology solutions.
Technology shouldn't be deployed in isolation; instead, it should be implemented alongside best practices as part of a wider tax control framework encompassing Technology, People, Process and Data.
Avoiding the Cash Flow Crunch - 3 Ways to Win Accounts Payable Automation - Derek Gerber
Do you struggle with cash flow cycles from month to month? Tallega helps its clients maximize their business efficiencies by automating processes like your accounts payable processes.
We say: let the OCR scanning software plus document management software do the tedious data entry automatically. Then use that same software in other departments, like human resources and your legal department. This is the first step toward the paperless office and working electronically with your documents.
You'll learn how to:
- decrease processing time
- capture early-payment discounts
- decrease processing costs
Use these 3 Approaches to find approval for your AP Automation project and finally fight off the Cash Flow Monster in 2015.
Comcast, Integra LifeSciences, LPL Financial, and Smucker's - Doing Your ERP ... - Oracle
The GRC panel “Doing Your ERP Implementation/Upgrade Right with Oracle Advanced Controls Solutions,” Session ID: CON8210. Find out how these companies accelerated and improved their EBS and PeopleSoft implementations, upgrades, module rollouts and patching using Advanced Controls. This is a great opportunity to learn from some of the most experienced Advanced Controls owners around!
Travel and Entertainment Expense Management Trends for 2016 - Ashley Emery
Certify recently surveyed more than 500 CFOs, controllers and accounting professionals from outside of its customer base to uncover the top trends and best practices in expense management today. Find out what other small, midsized and enterprise businesses like yours are doing to navigate the complex expense reporting process, streamline workflow, and increase spending visibility and control.
Learn more about:
• T&E expense spending benchmarks by category and average amount
• Common expense management challenges and expected benefits
• The factors and estimated costs to process an individual expense report
• Types of systems and processes currently used for expense management
• What it takes to achieve a return on investment with a new system
AP Fiscal Year-End Close: A 10-Step Checklist - Tradeshift
It’s that time of year again. For accounts payable professionals, along with CFOs, controllers, and corporate finance analysts, year-end is a whirlwind. Given all the internal and external reporting and audit requirements, you need access to timely, accurate, and consistent AP data. Your organization’s vital decisions depend upon it.
In this instructive 45-minute webinar, internal controls and compliance expert Chris Doxey will present strategic and tactical suggestions that you can use to deliver the best data and improve the fiscal closing process for your AP department. You’ll learn 10 steps for preventing duplicate efforts and unnecessary controls during the closing crunch.
This free educational event will leave you with a clearer understanding of the entire financial close process, including:
- Roles and responsibilities during the financial close
- Key metrics, controls, processes, and policies
- Properly accruing expenses, reconciling accounts, and dealing with reporting variances
Success Measurement: Measuring the Value and Performance of Your Source-to-Pa... - SAP Ariba
Research consistently shows that companies with defined source-to-pay strategies (including robust value and performance management) not only consistently outperform their peers in the industry, but also—and perhaps more importantly—drive recognized value within their organization. Come spend 75 minutes with us to gain insights into how to establish a value framework and options for measuring your strategy. We’ll delve into how to maintain executive sponsorship, metrics that matter—by solution, and discuss the SAP Ariba free benchmark program available to all customers, which provides actionable data for your program.
Innovative Procurement Strategies for Thriving in a Networked Economy - SAP Ariba
Building an agile procurement organization requires a focus on value, performance and capabilities.
Presented by Amy Fong – Senior Director, Procurement Executive Advisory - The Hackett Group
ChiefDigitalOfficer.net (“CDO”) is seeking sponsors for 2014 events. Each event provides attendees with rich and unique opportunities to acquire knowledge and interact with peers in an intimate setting. For partners and sponsors, CDO events accommodate thought leadership development, brand-building, and access to new audiences in a favorable environment.
1.Types of Computer Information SystemsThere are four basic type.pdf - arccreation001
1.
Types of Computer Information Systems
There are four basic types of computer-based information Systems:
Transaction Processing Systems (TPS)
Record day-to-day transactions such as customer orders, bills, inventory.
Helps supervisors by generating databases needed for other information Systems.
Examples: recording customer orders, bills, inventory levels, and production output.
Management Information Systems (MIS)
Summarizes the detailed data of the transaction processing system.
Produces standard reports for middle-level managers.
Examples: Production schedule and budget summaries.
Decision Support Systems (DSS)
Draws on the detailed data of the transaction processing system.
Provides a flexible tool for middle-level managers for analysis.
Examples: Analyzing the effects of events such as strikes, rising interest rates, etc.
Executive Support Systems (ESS)
Presents information in a very highly summarized form.
Combines the internal data from TPS and MIS with external data.
Helps top-level managers oversee operations and develop strategic plans.
Examples: Introducing new products, starting a company-wide cost control program, etc.
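The relationship between these system types can be illustrated in code: a TPS captures raw transaction detail, and an MIS summarizes that detail into a report for middle managers. This is a minimal sketch; the record fields and function names are invented for illustration, not taken from the original text.

```python
from collections import defaultdict

# Hypothetical TPS output: raw day-to-day transactions (invented data).
transactions = [
    {"type": "order", "amount": 120.0},
    {"type": "order", "amount": 80.0},
    {"type": "bill", "amount": 45.5},
]

def mis_summary(tps_records):
    """Roll up TPS detail into per-category totals, an MIS-style report."""
    totals = defaultdict(float)
    for rec in tps_records:
        totals[rec["type"]] += rec["amount"]
    return dict(totals)

print(mis_summary(transactions))  # {'order': 200.0, 'bill': 45.5}
```

A DSS or ESS would layer further analysis and external data on top of such summaries, but the basic TPS-to-MIS roll-up is the foundation.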
2. Benefits:
1. Enabling better and faster decision making
By delivering relevant information at the time of need through structure, search, subscription,
syndication, and support, a knowledge management environment can provide the basis for
making good decisions. Collaboration brings the power of large numbers, diverse opinions, and
varied experience to bear when decisions need to be made. The reuse of knowledge in
repositories allows decisions to be based on actual experience, large sample sizes, and practical
lessons learned.
2. Making it easy to find relevant information and resources
When faced with a need to respond to a customer, solve a problem, analyze trends, assess
markets, benchmark against peers, understand competition, create new offerings, plan strategy,
and to think critically, you typically look for information and resources to support these
activities. If it is easy and fast to find what you need when you need it, you can perform all of
these tasks efficiently.
3. Reusing ideas, documents, and expertise
Once you have developed an effective process, you want to ensure that others use the process
each time a similar requirement arises. If someone has written a document or created a
presentation which addresses a recurring need, it should be used in all future similar situations.
When members of your organization have figured out how to solve a common problem, know
how to deliver a recurring service, or have invented a new product, you want that same solution,
service, and product to be replicated as much as possible. Just as the recycling of materials is
good for the environment, reuse is good for organizations because it minimizes rework, prevents
problems, saves time, and accelerates progress.
4. Avoiding redundant effort
No one likes to spend time doing something over again. But they do so all the time.
Information Management aaS AIIM First Canadian presentation - Christopher Wynder
High level talk given at AIIM Canada's breakfast event March 23, 2017.
The talk goes through the challenges of information management in the era of BYOD and cloud services. The last part of the talk is how to start with a small but impactful project to show the value of IMaaS.
Posting 1 Reply required for belowBusiness costs or risks of p.docx - harrisonhoward80223
Posting 1 : Reply required for below
Business costs or risks of poor data quality:
Poor data quality may leave executives unable to make good decisions, or unable to make decisions at all. Poor data may lead to lost sales and other missed opportunities, misallocation of resources, and flawed strategies; orders may be wrong, inventory levels may be incorrect, and customers may become dissatisfied and driven away. The cost of poor-quality data spreads throughout the organization, affecting systems from shipping and receiving to accounting and customer services. Additional costs are incurred when employees must take time to hunt down and correct data errors.
The development of information technology during recent decades has enabled organizations to collect and store enormous amounts of data. However, as data volumes increase, so does the complexity of managing them. Since larger and more complex information assets are being collected and managed in organizations today, the risk of poor data quality increases (Watts and Shankaranarayanan, 2009). Another frequently mentioned data-related issue is that organizations often manage data at a local level (e.g. division or location).
Data mining:
Data mining, also called knowledge discovery in databases, is the process in computer science of discovering interesting and useful patterns and relationships in large volumes of data. The field combines tools from statistics and artificial intelligence (for example, neural networks and machine learning) with database management to analyze large digital collections, known as data sets. Data mining is widely used in business (insurance, banking, retail), scientific research (astronomy, medicine), and government security (detection of criminals and terrorists).
Text mining:
Text analytics, also known as text mining, is the process of examining large collections of written resources to generate new information and to transform unstructured text into structured data for use in further analysis. Text mining identifies facts, relationships and assertions that would otherwise remain buried in the mass of textual big data. These facts are extracted and turned into structured data for analysis, visualization (e.g. via HTML tables, mind maps, charts), integration with structured data in databases or warehouses, and further refinement using machine learning (ML) systems.
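The transformation from unstructured text to structured data described above can be sketched in a few lines. This is an illustrative example only (the stopword list and function name are invented): tokenizing free text and counting content words yields a structured term-frequency table, one of the simplest text-mining outputs.

```python
import re
from collections import Counter

# Invented, minimal stopword list for the sketch.
STOPWORDS = {"the", "of", "and", "in", "to", "a", "is", "for", "into"}

def term_frequencies(text):
    """Turn unstructured text into structured data: a word-count table."""
    tokens = re.findall(r"[a-z]+", text.lower())
    return Counter(t for t in tokens if t not in STOPWORDS)

doc = "Text mining turns unstructured text into structured data for analysis."
print(term_frequencies(doc))
```

Real text-mining systems add entity extraction, relationship detection, and ML-based refinement on top of this kind of structured representation.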
Posting 2 : Reply required for below
What Are The Business Costs Or Risks Of Poor Data Quality?
Financial effect: poor data quality impacts organizations negatively in financial terms by raising the cost of operations, which in turn reduces the revenue as well as the profit realized by the company over a given period. Also, poor data quality results in inapp.
Project 3 – Hollywood and IT· Find 10 incidents of Hollywood p.docx - stilliegeorgiana
Project 3 – Hollywood and IT
· Find 10 incidents of Hollywood portraying IT security incorrectly
· You can use movies or TV episodes
· Write 2-5 paragraphs for each incident. Use supporting citations for each part.
· What has Hollywood portrayed wrong? Describe the scene and what is being shown. Make sure to state whether it is partially wrong or totally fictitious.
· How would you protect/secure against what they show? (Answers might include installing a firewall, loading antivirus software, etc.)
· Use APA formatting for your sources on everything.
· Make sure to put your name on assignment.
Big Data and Social Media
Colgate Palmolive
Agenda of social media use
Business intelligence and social media concepts
Intelligent organization
Data analysis and data trustworthiness
Conclusion
Business intelligence and social media concepts
No-Hassle Documentation
Gain Trusted Followers
Spy on Competition
Learn Customer Demographics
Research and Analyze Events
Advertise More Accurately
Intelligent organization
They consistently use (big) data proactively
They know exactly where they want to go: all-round vision
They continuously discuss business matters: alignment
They talk to each other regarding positive and negative performance
They know their customers through and through
They think and work in an agile way
Data analysis and data trustworthiness
Data completeness and accuracy
Data credibility
Data consistency
Data processing and algorithms
Data Validity
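The trustworthiness dimensions listed above (completeness, accuracy, consistency, validity) can each be expressed as a simple automated check. The sketch below is hypothetical: the field names and validation rules are invented for illustration and are not part of the original slides.

```python
# Required fields for the completeness check (invented schema).
REQUIRED = ("customer_id", "email", "age")

def quality_issues(record):
    """Return a list of data-quality issues found in one record."""
    issues = []
    for field in REQUIRED:                            # completeness
        if record.get(field) in (None, ""):
            issues.append(f"missing {field}")
    age = record.get("age")
    if isinstance(age, int) and not 0 <= age <= 130:  # validity
        issues.append("age out of range")
    email = record.get("email")
    if email and "@" not in email:                    # format consistency
        issues.append("malformed email")
    return issues

print(quality_issues({"customer_id": "C1", "email": "nobody", "age": 200}))
```

Running such checks over every incoming record is one practical way an organization can quantify, and then improve, the trustworthiness of its data.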
Conclusion
How Colgate benefits from big data and social media
Social media increases sales and customers
Big data shows popular trends and popular companies
All around they are both beneficial
Big Data can find trends that can benefit you greatly
Criteria
Title Page:
Name, Contact info, title of Presentation
Slide 1
Agenda: topics you are going to cover, in order
Slide 2
Discuss how to use big data, social media concepts and knowledge to successfully create business intelligence (support your bullet points with data, analysis, charts)
Slide 3
Describe how big data can be used to build an intelligent organization
Slide 4
Discuss the importance of data source trustworthiness and data analysis
Slide 5
Conclusion
Slide 6
Big Data And Business Intelligence
Business Value With Big Data
For a business to survive in a competitive environment, organizational change requires improved governance, sponsorship, processes, and controls, in addition to new skill sets and technology, all working in harmony to deliver the benefits of big data. See Fig. 13.2.
Data science has taken the business world by storm. Every field of study and area of business has been affected as companies realize the value of the incredible quantities of data being generated. But to extract value from those data, one needs to be trained in the proper data science skills. The R programming language has become the de facto programming language for data science. Its flexibility, power, sophistication, and expressiveness have ma ...
The pioneers in the big data space have battle scars and have learnt many of the lessons in this report the hard way. But if you are a general manager just embarking on the big data journey, you should now have what they call the 'second mover advantage'. My hope is that this report helps you better leverage that advantage. The goal here is to shed some light on the people and process issues in building a central big data analytics function.
The Role of Community-Driven Data Curation for Enterprises - Edward Curry
With increased utilization of data within their operational and strategic processes, enterprises need to ensure data quality and accuracy. Data curation is a process that can ensure the quality of data and its fitness for use. Traditional approaches to curation are struggling with increased data volumes and near real-time demands for curated data. In response, curation teams have turned to community crowd-sourcing and semi-automated metadata tools for assistance. This chapter provides an overview of data curation, discusses the business motivations for curating data and investigates the role of community-based data curation, focusing on internal communities and pre-competitive data collaborations. The chapter is supported by case studies from Wikipedia, The New York Times, Thomson Reuters, Protein Data Bank and ChemSpider, upon which best practices for both social and technical aspects of community-driven data curation are described.
E. Curry, A. Freitas, and S. O’Riáin, “The Role of Community-Driven Data Curation for Enterprises,” in Linking Enterprise Data, D. Wood, Ed. Boston, MA: Springer US, 2010, pp. 25-47.
Whitepaper developed with Pharma Exec magazine on how EIM (Enterprise Information Management) can provide efficiency and kick-start innovation by ensuring information flows correctly inside and outside the company.
Healthcare products suffer from a lack of ability to control documents and non-clinical images. OpenText ApplicationXtender can solve that problem for vendors through our OEM program. This whitepaper goes through the benefits of embedding ApplicationXtender into healthcare products.
OpenText ApplicationXtender provides cost-effective document management. For software vendors looking to expand or build a healthcare-focused product, "AX" can be embedded to provide first-class content services without the high cost of research and development.
Automating Patient Management with ApplicationXtender Workflow - Christopher Wynder
The hardest part about managing a clinic is keeping everybody up to date with the right information, whether that is simply making sure billing is alerted of a new bill or as complex as managing follow-ups after a referral. There are simply too many documents, emails and schedules for one person to manage. This is the value of workflow to a clinic or hospital: setting the rules regarding who gets to see certain types of documents and ensuring that people know about updated information.
Healthcare information management is complex. Not only does it require a system designed specifically for medically relevant information, you also need a system that manages billing and the everyday business records (HR, vendor, etc.) that confound organizations of all kinds.
In this whitepaper we go through how Content Services (nee ECM) can be used as the connective tissue between your clinical systems and administrative use cases.
This deck goes through the Information conundrum and how ApplicationXtender is positioned to provide the technical platform for organizations to start moving from paper to a digital future
ThinkDox talk from ECNO 2017 on using Laserfiche to manage student records and student information. We use the examples of field trip forms and student record search to highlight the potential administrative efficiencies that can be gained.
Histone demethylase and its role in cell biology review (Christopher Wynder)
This document provides a scientific review of the histone demethylase enzymes, particularly the H3K4 demethylases (KDM5 family), focusing on their role in cell biology. This review was written in 2014.
ECNO 2016 - Using ECM to gain administrative efficiency for school boards (Christopher Wynder)
Presentation from ECNO 2016. The presentation centers on embedding records management into process management. We take an IT-project-centric view of how to move from chaos to manageable information access points. A key concept is how ECM and EIM technologies provide opportunities for school boards to reduce their costs and risk.
Embedding records management practices into how people work is essential to ensuring compliance and reducing risk in the digital age. This presentation goes through examples of how current processes, and the mixed digital/paper processes commonly found at organizations, are reducing control rather than increasing it.
We are often asked why use a VAR: what am I paying you for? This presentation goes through the basics of how we implement Laserfiche and provide continual support above and beyond basic technical support; we make sure you understand what is possible and support you as you maximize your investment.
Laserfiche 10 highlights - how the new features can benefit your mobile and wor... (Christopher Wynder)
Laserfiche 10 brings a lot of additional features for information management, workflow building and mobile content access. This slide deck provides the overview of how Laserfiche 10 can benefit clients looking to automate their processes.
Protocol for preparing small volume samples from ES cells, clinical samples and non-adherent cell types.
This protocol is designed to prepare samples for histone-modifying enzyme assays, including demethylase, methyltransferase, and deacetylase assays.
Regulation of KDM5 by multiple cofactors regulates cancer and stem cells (Christopher Wynder)
Presentation of data regarding proteins that regulate the activity of KDM5b.
The studies use multiple disciplines, including in vitro enzymology, ES cell studies of differentiation, and mass spectrometry to detect protein-protein interactions.
These studies resulted in a comprehensive view of KDM5b function. It required development of at least three novel assays that are focused on moving epigenetic research from yeast and HeLa cell types to primary, clinically relevant cell types.
The techniques have been successfully used in embryonic stem cells (human and mouse) and neural stem cells (mouse and patient-derived), as well as iPSCs.
KDM5 epigenetic modifiers as a focus for drug discovery (Christopher Wynder)
A summary presentation of my scientific work.
My laboratory focused on an enzyme KDM5b (aka PLU-1, JARID1b) that was widely expressed during development and played a key role in progression of breast cancer through HER-2.
My lab focused on understanding the key biochemical activity of the enzyme through dissecting the proteomic and genomic interactors.
Our results were confirmed through the use of ES cells, adult stem cells and mouse models.
Much of this work remains unpublished, please contact me for more information and/or access to any reagents that I still have as part of this work.
crwynder@gmail.com
Primer on Epigenetics given at the IRSF family conference 2011 (Christopher Wynder)
My presentation for the families of Rett syndrome patients.
This serves as a basic primer on what epigenetics is without deep details on the science.
Appropriate for all levels of education.
For more information contact the author: crwynder@gmail.com
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -... (DanBrown980551)
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses.
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
In the rapidly evolving landscape of technologies, XML continues to play a vital role in structuring, storing, and transporting data across diverse systems. The recent advancements in artificial intelligence (AI) present new methodologies for enhancing XML development workflows, introducing efficiency, automation, and intelligent capabilities. This presentation will outline the scope and perspective of utilizing AI in XML development. The potential benefits and the possible pitfalls will be highlighted, providing a balanced view of the subject.
We will explore the capabilities of AI in understanding XML markup languages and autonomously creating structured XML content. Additionally, we will examine the capacity of AI to enrich plain text with appropriate XML markup. Practical examples and methodological guidelines will be provided to elucidate how AI can be effectively prompted to interpret and generate accurate XML markup.
Further emphasis will be placed on the role of AI in developing XSLT, or schemas such as XSD and Schematron. We will address the techniques and strategies adopted to create prompts for generating code, explaining code, or refactoring the code, and the results achieved.
The discussion will extend to how AI can be used to transform XML content. In particular, the focus will be on the use of AI XPath extension functions in XSLT, Schematron, Schematron Quick Fixes, or for XML content refactoring.
The presentation aims to deliver a comprehensive overview of AI usage in XML development, providing attendees with the necessary knowledge to make informed decisions. Whether you’re at the early stages of adopting AI or considering integrating it in advanced XML development, this presentation will cover all levels of expertise.
By highlighting the potential advantages and challenges of integrating AI with XML development tools and languages, the presentation seeks to inspire thoughtful conversation around the future of XML development. We’ll not only delve into the technical aspects of AI-powered XML development but also discuss practical implications and possible future directions.
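One guardrail implied throughout the abstract is that AI-generated markup should never be trusted blindly. A minimal sketch (the model call itself is omitted) that checks well-formedness with Python's standard library before any generated XML is accepted:

```python
# Validate AI-generated XML before accepting it into a workflow.
# If parsing fails, the caller can re-prompt the model instead of
# propagating broken markup downstream.
import xml.etree.ElementTree as ET
from typing import Optional

def accept_generated_xml(candidate: str) -> Optional[ET.Element]:
    """Return the root element if the candidate is well-formed XML,
    otherwise None."""
    try:
        return ET.fromstring(candidate)
    except ET.ParseError:
        return None

good = accept_generated_xml("<para>Enriched <b>text</b></para>")
bad = accept_generated_xml("<para>Unclosed <b>text</para>")
print(good.tag if good is not None else None)  # para
print(bad)                                     # None
```

Well-formedness is only the first gate; in practice a schema check (XSD or Schematron, as the talk discusses) would follow.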
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ... (James Anderson)
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with a passion for making things work and a knack for helping others understand how things work. He has around 20 years of solution-engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations on CI/CD and application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Generative AI Deep Dive: Advancing from Proof of Concept to Production (Aggregage)
Join Maher Hanafi, VP of Engineering at Betterworks, in this new session where he'll share a practical framework to transform Gen AI prototypes into impactful products! He'll delve into the complexities of data collection and management, model selection and optimization, and ensuring security, scalability, and responsible use.
Sudheer Mechineni, Head of Application Frameworks, Standard Chartered Bank
Discover how Standard Chartered Bank harnessed the power of Neo4j to transform complex data access challenges into a dynamic, scalable graph database solution. This keynote will cover their journey from initial adoption to deploying a fully automated, enterprise-grade causal cluster, highlighting key strategies for modelling organisational changes and ensuring robust disaster recovery. Learn how these innovations have not only enhanced Standard Chartered Bank’s data infrastructure but also positioned them as pioneers in the banking sector’s adoption of graph technology.
Removing Uninteresting Bytes in Software Fuzzing (Aftab Hussain)
Imagine a world where software fuzzing, the process of mutating bytes in test seeds to uncover hidden and erroneous program behaviors, becomes faster and more effective. A lot depends on the initial seeds, which can significantly dictate the trajectory of a fuzzing campaign, particularly in terms of how long it takes to uncover interesting behaviour in your code. We introduce DIAR, a technique designed to speedup fuzzing campaigns by pinpointing and eliminating those uninteresting bytes in the seeds. Picture this: instead of wasting valuable resources on meaningless mutations in large, bloated seeds, DIAR removes the unnecessary bytes, streamlining the entire process.
In this work, we equipped AFL, a popular fuzzer, with DIAR and examined two critical Linux libraries -- Libxml's xmllint, a tool for parsing xml documents, and Binutil's readelf, an essential debugging and security analysis command-line tool used to display detailed information about ELF (Executable and Linkable Format) files. Our preliminary results show that AFL+DIAR not only discovers new paths more quickly but also achieves higher coverage overall. This work thus showcases how starting with lean and optimized seeds can lead to faster, more comprehensive fuzzing campaigns -- and DIAR helps you find such seeds.
These are slides of the talk given at the IEEE International Conference on Software Testing, Verification and Validation Workshops (ICSTW) 2022.
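DIAR's internals are not spelled out in this abstract, so the following is only a toy illustration of the general idea: mutate each seed byte and keep only the offsets whose mutation changes the program's observed behavior. The `coverage` function below is an invented stand-in for real edge coverage.

```python
# Toy sketch of seed-byte pruning (NOT the published DIAR algorithm).
# A byte is "interesting" if flipping it changes observed coverage.
def coverage(data: bytes) -> frozenset:
    # Stand-in for real coverage: in this toy program, only the
    # high bit of each of the first four bytes affects behavior.
    return frozenset(i for i, b in enumerate(data[:4]) if b & 0x80)

def interesting_bytes(seed: bytes) -> list:
    base = coverage(seed)
    keep = []
    for i in range(len(seed)):
        # Flip byte i and see whether the "program" behaves differently.
        mutated = seed[:i] + bytes([seed[i] ^ 0xFF]) + seed[i + 1:]
        if coverage(mutated) != base:
            keep.append(i)
    return keep

seed = b"ABCD-lots-of-padding"
print(interesting_bytes(seed))  # [0, 1, 2, 3]
```

Mutating only the surviving offsets means the fuzzer stops wasting cycles on the padding bytes that never influence execution.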
Observability Concepts EVERY Developer Should Know -- DeveloperWeek Europe.pdf (Paige Cruz)
Monitoring and observability aren't traditionally found in software curriculums, and many of us cobble this knowledge together from whatever vendor or ecosystem we were first introduced to and whatever is part of our current company's observability stack.
While the dev and ops silo continues to crumble, many organizations still relegate monitoring and observability to the purview of ops, infra, and SRE teams. This is a mistake: achieving a highly observable system requires collaboration up and down the stack.
I, a former op, would like to extend an invitation to all application developers to join the observability party, and will share these foundational concepts to build on:
In his public lecture, Christian Timmerer provides insights into the fascinating history of video streaming, starting from its humble beginnings before YouTube to the groundbreaking technologies that now dominate platforms like Netflix and ORF ON. Timmerer also presents provocative contributions of his own that have significantly influenced the industry. He concludes by looking at future challenges and invites the audience to join in a discussion.
Dr. Sean Tan, Head of Data Science, Changi Airport Group
Discover how Changi Airport Group (CAG) leverages graph technologies and generative AI to revolutionize their search capabilities. This session delves into the unique search needs of CAG’s diverse passengers and customers, showcasing how graph data structures enhance the accuracy and relevance of AI-generated search results, mitigating the risk of “hallucinations” and improving the overall customer journey.
Essentials of Automations: The Art of Triggers and Actions in FME (Safe Software)
In this second installment of our Essentials of Automations webinar series, we’ll explore the landscape of triggers and actions, guiding you through the nuances of authoring and adapting workspaces for seamless automations. Gain an understanding of the full spectrum of triggers and actions available in FME, empowering you to enhance your workspaces for efficient automation.
We’ll kick things off by showcasing the most commonly used event-based triggers, introducing you to various automation workflows like manual triggers, schedules, directory watchers, and more. Plus, see how these elements play out in real scenarios.
Whether you’re tweaking your current setup or building from the ground up, this session will arm you with the tools and insights needed to transform your FME usage into a powerhouse of productivity. Join us to discover effective strategies that simplify complex processes, enhancing your productivity and transforming your data management practices with FME. Let’s turn complexity into clarity and make your workspaces work wonders!
Goodbye Windows 11: Make Way for Nitrux Linux 3.5.0! (SOFTTECHHUB)
As the digital landscape continually evolves, operating systems play a critical role in shaping user experiences and productivity. The launch of Nitrux Linux 3.5.0 marks a significant milestone, offering a robust alternative to traditional systems such as Windows 11. This article delves into the essence of Nitrux Linux 3.5.0, exploring its unique features, advantages, and how it stands as a compelling choice for both casual users and tech enthusiasts.
GridMate - End to end testing is a critical piece to ensure quality and avoid... (ThomasParaiso2)
End to end testing is a critical piece to ensure quality and avoid regressions. In this session, we share our journey building an E2E testing pipeline for GridMate components (LWC and Aura) using Cypress, JSForce, FakerJS…
Maruthi Prithivirajan, Head of ASEAN & IN Solution Architecture, Neo4j
Get an inside look at the latest Neo4j innovations that enable relationship-driven intelligence at scale. Learn more about the newest cloud integrations and product enhancements that make Neo4j an essential choice for developers building apps with interconnected data and generative AI.
Climate Impact of Software Testing at Nordic Testing Days (Kari Kakkonen)
My slides at Nordic Testing Days 6.6.2024
The talk discusses the climate impact and sustainability of software testing. ICT and testing must carry their part of the global responsibility to help with climate warming. We can minimize the carbon footprint, but we can also have a carbon handprint, a positive impact on the climate. Quality characteristics can be extended with sustainability and then measured continuously. Test environments can be used less, at smaller scale, and on demand. Test techniques can be used to optimize or minimize the number of tests. Test automation can be used to speed up testing.
Securing your Kubernetes cluster: a step-by-step guide to success! (KatiaHIMEUR1)
Today, after several years of existence, an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been so easy to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
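The step-by-step details live in the talk itself, but one widely recommended hardening step can be sketched as a pod spec that drops privileges. This is a minimal illustration using standard Kubernetes fields; the image name is a placeholder.

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: hardened-example
spec:
  automountServiceAccountToken: false    # don't expose API credentials by default
  containers:
    - name: app
      image: registry.example.com/app:1.0   # placeholder; pin versions, avoid :latest
      securityContext:
        runAsNonRoot: true                # refuse to start as UID 0
        allowPrivilegeEscalation: false   # block setuid-style escalation
        readOnlyRootFilesystem: true      # writes go only to mounted volumes
        capabilities:
          drop: ["ALL"]                   # start from zero Linux capabilities
      resources:
        limits:                           # cap blast radius of a compromised pod
          memory: "256Mi"
          cpu: "500m"
```

Network policies, RBAC, and admission control would be the next layers on top of per-pod settings like these.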
Expand ecm acrossorg_empower15
1. BAS302 Build a Compelling Case for Enterprise-Wide Information Management
Define the value of information access across departments
Christopher Wynder
Senior Consulting Analyst
Info-Tech Research Group
cwynder@infotech.com
@ChrisW_ptmd
2. Enterprise Content Management has fundamentally changed
It is time to give up on the concept of content management.
Prior to consumerization: content was generated through corporate resources and stored in corporate databases. Risk could be mitigated by use of a single monolithic ECM system.
Now: content is generated through a complex mix of applications (cloud SaaS, on-premise, user acquired) and stored in several unlinked databases. Risk mitigation now requires a strategic plan for determining what content requires tight controls or needs to be findable to maximize user productivity.
[Diagram: a fireworks metaphor (burst, lift, mortar). The fuel: financials, HR, email. The burn: BI, documents, social, marketing, user notes, podcasts. Fire control: Active Directory, ECM, content creation suites.]
3. Content is exploding in all directions and most of it is garbage
Explosive, multi-directional content has low value:
• End-user generated content across these types of documents is generally of low value to the enterprise as a whole.
• Storing and managing this information is costly and not sustainable long term.
Focused, consolidated information has potential value:
• The important information is the analysis and the customer-facing deliverables that are generated using the short-lived materials.
The only way to separate valuable content from garbage is to govern all content as an asset that has a potential value and an expiration date.
IT can only control the explosion if it builds a holistic framework based on information use rather than content type.
4. The holistic strategy must account for user access and organizational workflows
Organizations need to define the value of information based on the breadth of use.
Most users' day is a series of Barely Repeatable Processes of sorting through information sources.
[Diagram: the average user's day, 9am to 5pm, moving between enterprise-wide data, department data, and a personal filter (ERP/CRM). Information movement is the key IT control point. How many different applications are they using? How many times are they breaking compliance?]
5. Re-think how you enable and protect information. Content-focused management is too hard.
Do you know how content will GROW in the future?
• Generate: how do users generate content? What are the filetypes? What are the key applications?
• Record: where is the information from that content being recorded? Office documents, applications?
• Organize: what is the point of the content? Is the information being shared? Is it for revenue generation? Does it need to be moved to other people?
• When: ...is the information source used again? What do users really need, and what can you securely provide them?
[Chart: amount of content per FTE (GBs), 0.00 to 3.00, for "not Office documents", Office documents, email, and social. Source: Info-Tech Research Group analysis of available statistics from Facebook, Twitter, Radicati Group, Internet Statistics Group, and EMC.]
Everyone's content is growing, but the solution requires you to understand how users expand content. Providing access to Office documents alone will not solve worker problems. Organizations do not live in classic filetypes anymore.
6. IT has a role, but what is it? Are we in charge of the structure of information, or do we control the growth and fruit of the labor?
Control GROW-th by accepting the organic nature of ECM/EIM.
An architect plans the design of information and brings structure to unstructured sources by enabling users to move through a "journey". Requires existing user compliance and understanding of information sources. Practically, this only works on external sites where you know the purpose of the user's visit.
A gardener sets the parameters of access and provides a single point of entry to user needs by understanding that every user has multiple "journeys" that encompass their job. Requires access control to key information sources to ensure user compliance. Acknowledges that content growth is organic and needs to be constantly re-evaluated for appropriate growth.
7. Be the gardener: plant the seed, control the weeds, and nourish the environment
• Gardeners cannot control growth; they can only maximize the conditions for growth.
• What can you, as an Information Gardener, do?
◦ Provide appropriate access (the size of the plot).
◦ Set limits on where the seeds can grow (users).
◦ Provide within that plot the nutrients (information) that seeds need.
• You cannot control the growth, but you can limit the unwanted growth.
• The personas will define how well the seeds grow into knowledge and productivity.
Plan for organic growth of personas by focusing on access to key information. A persona grows based on the content and information provided.
[Diagram: a persona is defined by its information sources, its mix of content types, and its refresh schedule.]
8. The Information Garden
[Diagram: personas mapped by information sources (internal, partnerships, records, random cross-department/general notice), mix of content types (structured, analysis), and refresh schedule.]
9. Define the types of seeds that you need to support
[Diagram: a persona is a business process, its users, and a workflow. Example personas: Customer Service Representative and A/P, with workflow steps such as case management, check schedule, follow-up, confirm payment, send order, review order, monitor action, request action, and review fulfillment, spanning customer service, the ERP A/P module, CRM case numbers, and confirmation by SMTP.]
10. The mix of fertilizer components is dependent on the needs of the seeds
What information do users need to "get work done"? How many of these resources are up-to-date?
[Diagram: mix of content types across CRM, sales, vacation requests, and R&D, each stamped with a date.]
11. How often do the seeds need to be watered? How long does content have value to users?
Refresh schedule: generate, use, evaluate, then delete or archive.
What can IT provide to enable users?
• Integration of mobile and "non-Office" content
• Ease of sharing, "folksonomy"
• Transparent disposition and archive automation
12. The soil is the key. Each plot needs to be balanced for the crop.
The soil is the platform for information movement. The ECM provides the simplest platform to enable the various processes and information usage that the business requires. Each "plot" is designed to enable personas based on information usage.
[Diagram: balancing IT efficiency, risk mitigation, and business efficiency.]
13. Getting approval to buy the farm
[Chart: costs from $0 to $1,200,000 for ECM workflow, mobile and email replacement, and ECM customization. Measure effectiveness for a single department; contrast replacement with workflow and mobile; then show the cost of expansion across the organization.]
14. User productivity can get the conversation started
Time wasted due to misalignment of user needs and ECM deployments:

Task | Hours (per week) | Knowledge worker cost | Process worker cost
Reformatting | 2.4 | $86.21 | $58.05
Re-creating content | 1.9 | $68.06 | N/A
Moving documents between locations | 1.5 | $54.45 | $36.66
Publishing to multiple applications | 1.8 | $63.53 | $42.77
Manual retrieval of archive records | 1.4 | $52.18 | N/A
Searching but not finding | 2.2 | $79.41 | $53.47
Totals | 11.2 | $403.85 | $190.96

A single platform that mediates document search and file sharing can save that investment by solving the key user frustration of findability.
Source: Info-Tech Research Group analysis of productivity surveys by McKinsey (2011), Oracle (2009), Ponemon (2010), and KPMG (2009)
15. Enterprise-wide adoption must balance three drivers: users, business, and the information itself
Balance the needs of different stakeholders to ensure buy-in across multiple levels.
• Business: focus on decision-making processes by the exec team; simplified usage for reports to stakeholders; identifying new opportunities; reduced cost of information usage (AKA governance).
• Users: focus on the unnecessary steps of their day-to-day process; "the easy button" for vacation request documents.
• Information: focus on enabling future revenue; analytics; eased compliance.
16. The first common problem of business case development: the Popes & Presidents Problem
EIM projects often fail to get off the ground because they start too big. Consider a project that starts with: "Engage all senior executives in a governance and steering process."
You will never get the CEO, CFO, CxO, [the Pope, the President, etc.] in a room together at the same time. They are too busy and are focused on bigger issues.
Expect to have to prove that an EIM solved a problem. You have to start within IT before pushing out to the rest of the business.
Bottom Line: Identify departments that will have success.
17. Develop a strategy that avoids both the Popes and the Presidents
Put together an Information Organization strategy from within IT first, before disseminating it to the rest of the enterprise. It is far easier to engage with business units and get buy-in when you have already started something. You need a straw-man strategy that starts in IT and is then pushed to the rest of the enterprise via the governance structure.
Answer some basic questions about the enterprise and its information needs, and then ask the business why you're wrong:
• What are the information-related problems facing the enterprise?
• How do these relate to business and IT priorities?
• What information sources do we actually have?
• Is there risk or opportunity associated with those information sources?
• Who uses these information sources, and what do they really need?
Bottom Line: The straw man doesn't have to be definitive. But it does have to be defensible!
18. Focus on the trends that the business cares about. First trend: compliance
• Business: visibility into the information contained within "content"; visibility into age of, and changes in, information; control of information access; control over ILM.
• Users: appropriate access without additional layers; reduce the technological barriers to collaboration.
• Information: reduce risk of breach; ease compliance reporting; provide a platform for expanding the types of assets that can be tracked.
19. Regulatory pressure is increasing on all types of organizations
Public and political pressure continues to drive finance reform and privacy requirements in jurisdictions around the world. Regulations are proliferating to cope with growing volumes of sensitive data and content.
[Timeline, 1965 to 2015, of finance reform, legal requirements, privacy regulations, and non-government standards & certifications: Freedom of Information Act, Access to Information Act, FERPA, HIPAA, GLB, PATRIOT Act, FISMA, SOX, Basel II, Basel III, Dodd-Frank, FRCP update to include ESI (eDiscovery), PIPEDA, PCI DSS v2.]
IT's ability to reduce risk is inversely related to the number of different information sources that need protection. Enterprise-wide adoption of a single platform consolidates the risk, increasing visibility.
20. There is too much content to control it
Information security and ensuring compliance require a more mature approach that includes defining the role of storage in the problem.
• 20% of all corporate content is now stored on employee-owned cloud storage.
• Email as a percent of total content is decreasing by 40% per year.
• Evernote has grown by 33% every 6 months; half of enterprise users have Dropbox.
[Chart: breakdown of average user information sources, amount of content per FTE (GBs) from 0.00 to 3.00: corporate-owned storage, user-owned documents, email, social. Source: Info-Tech Research Group analysis of available statistics from Facebook, Twitter, Radicati Group, Internet Statistics Group, and EMC.]
[Chart: breakdown of enterprise-wide information sources, 2008 to 2014 to 2020: structured data, replicated data, unstructured data; a 30x increase. Source: Info-Tech Research Group, 2011.]
21. Introduce RM and archive capabilities to control compliance risk
"90% of data stored to disk is never accessed again in a 90-day period." -- University of California study
"69 percent of information in most companies has no business, legal, or regulatory value." -- Deidre Paknad, Director of Information Lifecycle Governance Solutions at IBM
Most of your data is useless. To get a handle on data growth, you must decide what to keep in long-term archive and what to get rid of. Harness best practices to close the affordability gap from both sides.
Info-Tech Insight (business case development point): Currently, the TCO of storage is 3-10 times the cost of acquisition. The largest TCO impact can be found in optimizing information management processes. Use the TCO of storage as the long-term risk of not adopting ECM/EIM solutions as part of the long-term information/storage plan.
22. Securing information is an ever increasing cost
Public sector is reacting with more spending
Percentage of Total Annual Revenue Allocated to Security by Organization Type

Organization type                <0.1%   0.1-0.25%   0.25-0.5%   0.5-0.75%   0.75-1%   >1%
For-profit, privately owned      28.0%   12.0%       20.0%       4.0%        20.0%     16.0%
Government                       22.2%   22.2%       11.1%       11.1%       11.1%     22.2%
Non-government, not-for-profit   27.7%   22.2%       11.1%       0.0%        33.3%     5.5%
Publicly traded                  35.7%   35.7%       7.1%        7.1%        10.7%     3.5%

Source: IOFM’s Benchmark Survey on Security Salaries and Spending
IT security spending is predicted to
increase at a 6.6% compound annual
growth rate, reaching $30.1 billion in
2017.
Companies with 100-499 employees
are predicted to drive IT security
spending most, totalling $8.5 billion
by 2017.
23. Control Information Security costs through ILM
Drive the cost per incident down by increasing visibility into day-to-day
activities and removing risk from storage locations.
Be tidy:
Delete old data, lock
down high risk data
Know what you have:
Metadata, audit trails
Know how users work:
Workflow, important info
sources
Savings from:
IT time
Reporting time
Consolidated
access control
24. EIM provides control for both storage and security risks
• Appropriately governed information
sources can reduce the TCO of storage.
The cost per user for storage is
reduced by 40-60%.
• Additional IT cost decreases are seen by
the reduced eDiscovery through ECM
search and hold capabilities.
[Chart (log scale, $1.00 to $10,000.00): cost to store user information from initial deployment through Year 6 — base costs vs. EIM vs. storage controls. Information governance reduces the required additional storage and the cost to store user information.]
Only 15% of CIOs believe their data to
be well and comprehensively
managed.
-- Merv Adrian, Principal,
ITMarketstrategy
• Human error accounts for 1/3 of all
data/information breaches. Over half of
these losses are due to lost endpoints
containing data.
• Centrally locating information and
processes on an ECM system
exceeds most industries’ compliance
needs for digital assets.
25. Give the right access and control your growth and risk (Analyst
story)
As with all workers, I’m a pack rat. I always think, “I really used this a lot last
week, I should put this somewhere safe.” So what do I do? I put it on the hard
drive, the file share, and Evernote. Do I update all of those copies? Of course
not, but I’m not about to delete them, because there is always a chance that I’ll
re-use that template or want that line that I really liked.
It extends to my personal information habits… I have a shoebox of hard drives,
one of them so old that the SCSI cable to hook it up to a computer no longer exists.
So what does this tell us about end users? We can’t predict the actual
information that they will want to keep or why. We can shape how the users
move through our systems to get to information.
Once we have confidence that users have the access they need we can start to
put the disposition policies in place to deal with the garbage content that is filling
our storage.
If I had confidence that I could find my information I would be willing to
just have one-okay maybe two copies. Just like the hard drive that is too
old to be added to the computer, we can prevent old garbage from coming
back to the system.
26. Focus on the trends that the business cares about
Second trend: Findability
The Google problem:
relevance and ranking
Standardize tags and
search control by role.
Business
Users
Information
Multiple
locations.
Indexing and
ranking.
Versioning and
modifying.
27. “The real issue is
when to get rid of
stuff. We have shared
drives with files from
pre-2K.”
– Scott Macleod
Illustrative Case Study: Dead content suppresses users’ ability
to respond to clients in a timely fashion.
Results
• IT recovered 20% of the
storage capacity by
deleting old files.
• The executives realized
that IT had to control all
content for legal coverage.
• Using regulatory needs, IT
built a flexible system that
decreased duplication and
export of key information.
• Users’ helpdesk tickets for
lost or missing documents
have decreased.
• IT has started to build self-
serve analytics, taking
advantage of the user
personas for access.
Action
• Recognized the need for
tighter controls on content
to control growth.
• Started to delete all files
that have not been
accessed in more than five
years.
• IT recognized that the
website and Twitter
account often received
comments that
departments should be
aware of in a timely
fashion.
“It was the craziest thing.
Folks are writing personal
information on our
webpage. We quickly
realized that this was going
to be an issue.”
Scott Macleod
Information System
manager
State government
Situation
• Content growth is
straining the
infrastructure.
• The user population is
highly dynamic, but needs
specific access to data.
• Users often re-make
content rather than
attempt to find it.
• The content is often used
and updated by multiple
user groups.
• Privacy of content is a
paramount concern.
“We’re actually in the midst
of a re-organization. So we
aren’t really sure which
documents are paper and
which are electronic.”
28. How users work is the key to maximizing the value of an ECM
system
Focus on the workflow problems to enable user adoption.
BPM
System based language.
This is the nuts and bolts of
application development and
information automation.
Requires multiple systems to
effectively move a process from
beginning to end.
A BPMN engine does not equal BPM.
System integration for passing
processing and data through
applications
Workflow
User language
This is how the user has to work
within their day.
This is a surface level customization
that may require multiple
applications.
This requires understanding how
information moves amongst users.
The workflow needs to be hidden
behind the information access layer.
29. Focus on the trends that the business cares about
Third trend: Analysis
Statistical relevance
Access to necessary
information
Visualization
Business
Users
Information
Relevant
information
Contextual data
Historical trends
Predictive
analytics
Judging value of insights
Controlling intellectual
property
Ensuring privacy and
compliance
30. ECM/EIM platform
1. As businesses adopt social and web
2.0 tools the need for a single plan for
how information is formatted, used and
protected becomes key.
2. IT plays a key role in ensuring the
success of business initiatives by
providing access to information to the
applications and workers, wherever
they are.
3. Use the business initiatives to survey
the most important information sources
and the business units that should be
part of the Information Governance
committees.
Effective internal document
sharing
Easy
knowledge
transfer
The platform should be robust
enough to support the business
use of information and regulations
5X growth in information per year
Business use and re-use of
information expanding
BYOD/Mobile
Analytics
Weight of regulations is increasing
Information management is at the heart of key business
initiatives
31. Value example-BI
The primary source of latency in BI is the lack of context for data points. This
requires human intervention.
Source: A Business Approach to Right-Time Decision Making, TDWI
Data Latency
Analysis Latency
BI initiatives fail through
incomplete data and unclear
presentation.
Both of these failures are due to
the lack of context regarding
data points.
ECM systems can provide the
context thus improving success.
This requires a multi-department
system to provide a business
context.
32. ECM can provide enterprise wide control and access for BI
projects
The BI operating model is considered effective when all four criteria are met.
Effective
Business
Intelligence
Focused on
Business
Requirements
Tailored
Functionality
Based on
Users’ Needs
Use KPIs to
Monitor
Performance
Simplify
Complex
Processes
33. Focus on the trends that the business cares about
Fourth trend: Storage
Templates
Collaboration
Business
Users
Information
Relevant
information
Compliance
Mobile/External
IT costs
Format
34. Storage growth is an immediate and pressing trend due to
growth in content
• Structured and Unstructured data
grows by approximately 50% each
year.
• Sixty-three percent of CIOs have
increased data storage spending in
2013.
• On average, organizations are
allocating 20-30% of IT budgets
towards storage.
• Most storage has a 20:1 data
duplication ratio. Moving high-use
documents to an ECM with version
control can significantly reduce the
storage overhead.
Key Data Growth Drivers
Media intensive industries such as
entertainment broadcasting, medical, legal, and
insurance can expect to see data growth rates
over 120 percent year on year.
-- Storage Strategies Inc.
35. Protect the investment that you have already made into
information systems
The investment that an organization makes into information storage and
applications such as SharePoint, BI, etc., is based on the expected value
that it will bring to the users.
The direct investment in applications that use information:
An organization with 5000 Standard SharePoint server Licenses (and CALs)
has an approximate 3yr TCO of $740,000.
Storage strategy and flexibility
Low SharePoint adoption means that storage resources that are dedicated to
SharePoint are underutilized.
Lost Investment = (ECM + Storage) ÷ (#GBs × Adoption rate)
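As a rough illustration of the underutilization point, the effective cost per GB that users actually consume rises sharply as adoption falls. This is a hypothetical sketch: the $740,000 three-year TCO figure comes from the slide, but the provisioned storage size and adoption rates below are invented for illustration.

```python
# Sketch of the "lost investment" idea: low adoption makes every GB that is
# actually used more expensive. Only the TCO figure comes from the slide;
# STORAGE_GB and the adoption rates are assumptions.

def cost_per_used_gb(platform_tco, storage_gb, adoption_rate):
    """Effective cost per GB that users actually consume."""
    used_gb = storage_gb * adoption_rate
    return platform_tco / used_gb

TCO = 740_000        # 3-yr TCO for 5,000 SharePoint Standard licenses (slide)
STORAGE_GB = 10_000  # assumed provisioned SharePoint storage

full = cost_per_used_gb(TCO, STORAGE_GB, 1.00)  # $74.00 per used GB
low = cost_per_used_gb(TCO, STORAGE_GB, 0.25)   # $296.00 per used GB
print(f"100% adoption: ${full:.2f}/GB, 25% adoption: ${low:.2f}/GB")
```

At a quarter of the expected adoption, the same platform investment costs four times as much per GB of content users actually put into it.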
37. Place an organizational wide value on content based on
information use
Risk Value
Compliance
Collaboration mandate
Special projects
Internal initiative
Standard risk assessment
Likelihood
Impact
Legal and compliance
Σ =5 Σ =4
Specific risks
IT supportability
Vendor roadmap
For the strawman we just need a rough risk assessment. As the project
matures you will need a more granular risk assessment based on
the business goals and IT capabilities.
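The rough risk assessment can be sketched as a simple scoring table: rate each specific risk for likelihood and impact, then sum each column, as the slide's Σ values suggest. The risk names come from the slide; the individual 1-5 scores below are invented for illustration.

```python
# Strawman risk scoring: (likelihood, impact) pairs per specific risk,
# summed per dimension. Scores are illustrative assumptions.

specific_risks = {
    "Legal and compliance": (2, 2),  # (likelihood, impact)
    "IT supportability":    (2, 1),
    "Vendor roadmap":       (1, 1),
}

total_likelihood = sum(l for l, _ in specific_risks.values())
total_impact = sum(i for _, i in specific_risks.values())
print(f"likelihood Σ = {total_likelihood}, impact Σ = {total_impact}")
```

A more granular version would weight each risk against specific business goals, but this level of detail is enough to rank information sources for the strawman.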
38. Enterprises can manage content effectively through a variety of approaches.
Technology must be aligned with your strategic needs
• The majority of enterprises
surveyed had several products to
control content.
• Enterprises with several products
were equally successful at
controlling content as those with a
single ECM product.
HOWEVER:
• Enterprises with no ECM
products reported greater
concern regarding business
efficiency and compliance.
[Chart: “How do you manage content?” (% respondents, 0-60%) — several products, shared drives, single ECM, user controlled, no strategy; a 6X difference across approaches. Source: Info-Tech Research Group Q2 2012; N=75.]
Single ECM platforms are not required
for success.
39. Successful organizations have a mix of skills within IT to administer ECM
Challenge 1: The IT skill set must be capable of meeting the
administrative needs
Information
Governance
IT
Competency
Technology
readiness
IT Competency
Information sources risk
assessment
Standard operating
procedures for
requirements gathering
Mature process for
application development
Basic understanding of
consumerization trends
Information Competency
Do you have a:
Information governance
committee
Program manual for
information governance
Retention and archive plan
Executives acknowledge
need for better user adoption
A controlled vocabulary to
base user needs on
Technological Readiness
Implemented an ECM solution
Applied the taxonomy to the ECM
Assessed the gaps in user needs and
ECM features
Checked vendor roadmap for updates
to current issues
40. Challenge 2: Avoid re-building the junk drawers
Be proactive with ECM or users will default to the
old habits of throwing everything in the same
place.
• ECM cannot be used appropriately without a
Risk profile and Information Governance plan.
• Users do not know what they
want from ECM; they just know
what they need to do.
• When we allow users to decide on the
organization of ECM they often become
frustrated with the lack of built-in tools, which
then leads to dissatisfaction and low use of
ECM.
Do not ask “What can an ECM do?” Ask “What do we want our ECM to do?”
ECM is an expansive tool box that can support
both application development and document
management, out of the box.
• Solve this problem by having a business-focused
plan for the initial roll-out.
• Focus on solving a user-driven problem. This
will likely require building workflows or adding
third-party tools.
• Set up a straw man based on IT’s view of
what users need so that we can get the users
talking about what they actually want compared
to what we’ve shown them.
• “This is what I think you need.
Why am I wrong?”
41. Challenge 3: Information movement patterns
Individual to Individual
Ensure that all content has
enterprise-wide descriptors as part
of the metadata. Use role-based
access to ensure key individuals
have the access they need.
Standing collaboration
group
Prioritize classification tools and
provide federated search. Allow
user-based tagging to ensure that
content has long term find-ability.
Ad Hoc groups
No tool can ensure that information
is available to unknown queries.
Ensure that user profiles are up-
to-date so that experts can be
found to combat these situations.
Individual to group
Prioritize collaborative tools. Build
a strong process to define
enterprise-wide content versus
group content.
Open share of personal
stores
Prioritize semantic or ontology-
based classification tools.
Prepare a clear, but concise
governance package for sensitive
content.
Peer to Peer
Prepare to bring strong user
profiles into content metadata.
Ensure that communication
streams are part of content
capture.
Typical combinations (user vs. department) and key IT tools:
• Communication capture
• User tagging
• Author field as metadata
• Last-accession-based archive
• Enterprise-defined metadata
• Integrated social tools
• Strict retention rules
42. Start your technical requirement gathering by defining what
the content is doing within the enterprise
We need to increase
knowledge sharing
Workflows need
better visibility
Define knowledge within your enterprise.
• How do workers capture knowledge?
o Peer to peer communication?
o Outside journals?
o Publishing internal documents?
• Define how sources of knowledge are mixed. Find
out how users think it can be best organized.
o By subject
o By project
• Guide the business leaders in defining the
metadata.
• Enterprises with expansive metadata libraries
should consider more advanced and flexible
organization tools including separate classification
tools.
Define which manual processes have
dependable, predictable outcomes.
• For each department, which processes have
paper to electronic workflows?
o Vacation/leave requests?
o Procurement cycles?
o Order reconciliation?
• Define who the process owners and key decision
makers are for exceptions.
o Who is alerted for orders/requests that do not
match paper versions?
o What is the process for reconciliation?
• Guide the business leaders in defining retention
times for data from each process.
Ensure that all communications associated with a project are
included. Many of the kernels of long term knowledge are
buried in the social communications.
43. Prioritize organization tools when comparing applications
Classification works best when it matches the information sharing needs of
the enterprise.
• Content capture: What
is the primary type of
content?
• Process or knowledge
based?
• Content organization:
How will content be
used?
• Organic user-based
tagging, rigid
enterprise-enforced
taxonomy.
• Use governance:
What is the
enterprise’s security
need for content?
• Define end of life:
When does a piece of
data cease to be
generally useful?
• Records, file shares,
eDiscovery.
For process heavy
enterprises, these
capture features are the
key:
• OCR
• batch metadata
addition
• combining e-docs and
paper docs from the
same process
For all content these
features are key:
Document IDs: for
version control.
Records management
tools: taxonomy, file
plans, access control,
audit features.
Applying Holds:
Retention Policy
Services, workflow
review, and approval
tools (e.g., comment
controls, exception
management).
Search: cross-library
searches using content
attributes.
Records management
tools available for all
content:
Archiving tools: backup
to storage, automatic
deletion dates.
Capture Organize Use Archive or retire
44. Use the content lifecycle to define requirement priorities
Information managers must ensure that their deployment supports the
critical needs at each stage of the content lifecycle.
The expansion of content beyond the corporate walls means that IT cannot manage all content nor can
it control users to manage the growth of content. Ensure that the content management strategy
provides controls at each step that give visibility and slow growth.
The ECM
Lifecycle
Capture Organize Use
Archive or
retire
Administer
Content must be
brought into an ECM
system. It might come
from existing
systems, imaging, or
it could be uploaded
by users.
Information within
an ECM repository
must be
organized,
indexed, and
classified.
Users must get
access to content.
Consider time,
workflow, format,
and device.
Enterprise content
must be retired,
either through
deletion or
archiving.
Content Management changes the expectations for administrators. They must
reassess their approach to strategy, risk, security, monitoring, and support.
45. Align the ECM and user information lifecycles to define the
system requirements
Adoption and user workflow are linked together. Solve the users’ key needs and you’ll solve
your compliance concerns surrounding structured documents and records.
Capture Organize Use
Archive or
retire
System
touchpoints
User
information
lifecycle
Generate Record Use
Forget or
store
?
Organize Re-Organize
Specific ECM
requirements
46. Business capabilities should focus on cross-functional,
organization-wide use of knowledge for decision making
• Business capabilities supported by data and analysis to enable fast,
value-based, near-real time decisions
◦ Information at point of need and in a form to enable decision
• Identification of a few distinctive capabilities where analytics and insights
will provide competitive advantage
• Embed the process of Analytics (insight generation, validation and
value realization) as a capability across the organization
• Emphasize end-to-end process and insight thinking
The ECM
Lifecycle
Capture Organize Use
Archive or
retire
Administer
47. Client conversation: “What are the best practices in
documenting my Information Governance program”
Situation
Insurance company with multiple
very public breaches.
Currently has an under-used and
confusing Information Governance
program based on its records
management regulations.
A recent review showed that the
data governance program has
increased data quality and reduced
risk of data loss.
The key identified problem was lack
of user understanding of how to
communicate information across
departments and to customers.
Complication
The public nature of the
breaches caused deep
scrutiny by regulators and
stockholders.
The growth in product
offerings and client
touchpoints have
outstripped IT’s ability to
respond.
These two complications
led to turnover at the
management tier and
expansion of untrained
customer service
representatives.
Resolution
The IT and compliance offices set
up a dedicated panel to oversee the
process.
They identified the data governance
program as a relevant and
successful model.
The documents and committees
were customized and expanded to
reflect the wider information need.
Program growth was achieved by
acknowledging that information had
a value to the organization and
needed to be protected as such,
just like data.
Insurance company expands a successful data governance model to build an
Information Governance program
48. Efficiency is nice but complying with your regulators pays the
bills
Business
Efficiency
Collaboration
Enterprise
Search
Compliance
Cost reduction
Web Content
External
Communication
Litigation
[Chart axes: importance to enterprise vs. priority needed for controls.]
Compliance and Litigation rank
lower in importance, but need to be
the key feature guide for IT.
Enterprise Search ranks high for
all the wrong reasons.
Enterprises believe that better
search will replace the need for
quality metadata.
It will not.
To achieve the top needs of
Business Efficiency and
Collaboration you need to build
keywords and metadata to
search upon.
Increased Business Efficiency
is the number one reason why
enterprises are willing to invest
in an ECM.
49. You can’t Capex your way to the best solution (Analyst story)
I recently talked to the US branch of a global company headquartered in
the EU. The global company has several hunting and camping products
sold across its different branches. They have identified consolidating the
supply chain and centralizing purchase and customer relations as the
priority for the next fiscal year.
The good news is that money is no object, and they have invested (heavily)
in a technical solution that can absolutely meet the needs of the global
organization. They have also defined a three-month timeline for
implementation of the solution based on the vendor’s case studies of similar
companies.
The problem is that the US branch has only one customer: the US
Department of Defense, which has strict rules on origination of
communications, data storage, and a whole host of other security
regulations that centralization would breach.
Technically there is no problem, but the standard implementation will not
meet the existing contracts with the Department of Defense.
Defining technical solutions is important but the investment will be
wasted if you do not understand the key problems that need to be
solved.
50. Thank you for your time
Ways to contact me:
LinkedIn: Christopher Wynder
Email: cwynder@infotech.com
Twitter: @ChrisW_ptmd
This presentation will be made available via SlideShare.
Search via “ECM business case” OR “Chris Wynder” OR “Empower 15”
Please see the Info-Tech website for more detailed information on Information
Governance or any IT problem.
www.infotech.com
52. The soil is the key. Design the plots based on the crops
Information sources
This need is the top ranked driver for ECM adoption
by Info-Tech clients.
Business efficiency is the only need that will
garner buy-in.
Business efficiency is based on findability and well
implemented business initiatives.
IT Efficiency
Business Efficiency
The content growth provides a perfect opportunity to
control storage costs.
Well governed information can reduce the cost of
storage, in the long term, by 60% through
controlling growth rate, reduced duplication of
content and automated disposition.
Risk Mitigation
Internal records
All organizations have HR documents and
financial records that require governance.
Litigation
eDiscovery is the elephant in the room. For most
organizations the risk is huge but the likelihood is
very low.
Compliance
For most organizations the limited regulatory
overhead will not be an effective driver for
Information Governance.
53. These five steps were identified as critical to increasing adoption
Already using an ECM but need to increase adoption?
Establish an Information Governance plan prior to evaluating any technical changes.
Establish information governance as an agenda item for existing compliance
committees.
Build an organizational taxonomy that can provide both control
and increased document findability.
Identify existing templates and taxonomies for departments or user groups
that can be extended to the whole organization.
Establish an information architecture that can increase user
adoption.
54. Information organization in SharePoint is not intuitive. User journeys allow IT
to tailor SharePoint to guide users to the appropriate sites for information
Build user journeys to detail the activities that require
Information that the Organization owns.
• User journeys are maps of the
steps in an activity.
• They represent a linear set of
steps or tasks that a user must
complete to complete an activity
• Essentially it is the same as
process mapping that is done for
BPM projects.
• Depending on the goal of the
journey they may represent a daily
activity or a multi-day activity.
• The key is that each activity is
broken down in smaller steps
that use or generate information
in a documented form.
User Journey of a Doctor’s Day
[Diagram: steps in a doctor’s day, including check schedule, get case history, patient diagnosis, confer w/ nurse, order tests, write-up, confer w/ peers, grand rounds, schedule, and follow-up.]
The goal of a user journey is to break down activities into
actionable steps.
Specifically, we are looking to focus on those tasks that use, or
should use, SharePoint.
Once we have a straw man set of user journeys, we can
attach the information sources to each step.
The user journey then provides guidelines for what IT needs to
provide to users in SharePoint.
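One way to capture a user journey as data is to record, for each step, the information it uses or generates; steps with no documented source surface immediately as gaps for IT to investigate. The step names follow the doctor's-day example on this slide, but the specific information sources are invented for illustration.

```python
# Minimal user-journey model: ordered steps, each carrying the documented
# information sources it uses or generates. Source names are assumptions.

from dataclasses import dataclass, field

@dataclass
class Step:
    name: str
    info_sources: list = field(default_factory=list)

journey = [
    Step("Check schedule", ["clinic calendar"]),
    Step("Get case history", ["patient record", "prior write-ups"]),
    Step("Patient diagnosis", ["diagnostic guidelines"]),
    Step("Order tests", ["lab requisition form"]),
    Step("Write-up", ["case note template"]),
]

# Which steps still have no documented information source?
gaps = [s.name for s in journey if not s.info_sources]
print(gaps)  # []
```

Once every step has at least one attached source, the journey doubles as a checklist of what IT must surface in SharePoint.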
55. Combine how information moves within departments with the enterprise-
wide needs to define the management strategy.
Each IG project has best fit use cases. Use the use case to guide
your decision on which project to use.
[Diagram: systems arranged from department to enterprise scope — ad hoc/fileshare, system of interaction, system of record, archive — with findability and access control projects spanning them.]
• Ad hoc/fileshare: allows users and workgroups a junk drawer.
• Findability: enables search and collaboration; reduces duplication.
• Access control: controls sensitive information and assures an audit trail.
• System of interaction: provides a wider set of tools for social, collaboration, and access control.
• System of record: provides rigid controls and automation of complex policies.
• Archive: provides a separated database with disposition and robust search.
Moving sources between systems is not
always feasible. Use the findability and
access control projects to maximize
value on “unmoveable” sources.
56. Funnel information sources through ECM to build an
Organizational level System of Information
1. Users create content using a device. The device could be a workstation or a mobile device.
2. Properly tagging documents improves findability. Tags/metadata also become the basis for providing appropriate access and classification.
3. Systems create content through comments and transactions (e.g., payable reports, PHI).
4. A single set of enterprise descriptors automates association of similar files from multiple sources.
5. Users query on keywords and enterprise descriptors.
6. The search returns multiple documents that have the keywords or the same descriptors (e.g., same author, department, project).
7. User choice becomes a data field to rank search (accession date).
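The query step can be sketched in a few lines: a search matches documents that either contain the keyword or share an enterprise descriptor (author, department, project) with the query. The documents and descriptor values below are invented for illustration.

```python
# Sketch of descriptor-based retrieval: match on keyword OR on any shared
# enterprise descriptor. All data is illustrative.

docs = [
    {"title": "Q3 payables", "keywords": {"payable", "report"},
     "descriptors": {"author": "jsmith", "department": "finance"}},
    {"title": "Vendor audit", "keywords": {"audit"},
     "descriptors": {"author": "jsmith", "department": "finance"}},
    {"title": "HR handbook", "keywords": {"policy"},
     "descriptors": {"author": "hlee", "department": "hr"}},
]

def search(keyword, descriptors):
    return [d["title"] for d in docs
            if keyword in d["keywords"]
            or any(d["descriptors"].get(k) == v for k, v in descriptors.items())]

# A keyword hit plus a same-author match come back together:
print(search("payable", {"author": "jsmith"}))  # ['Q3 payables', 'Vendor audit']
```

Consistent descriptors are what make step 4 work: without a single enterprise vocabulary, the same-author and same-project matches never fire.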
57. Focus on user tools to improve ECM success
[Chart: 45% of ECM implementations met expectations; 55% did not.]
• ECM brings many of the tools that are
needed to appropriately manage information
and administer the system.
• Technically, any ECM has the tools to
support most business needs.
Most organizations do not identify a
business need prior to implementation
High adoption naturally feeds risk mitigation. Start with
a system that solves a user problem and they WILL use
it.
ECM: More failures than successes!
Why does ECM fail?
1. User tools
2. Information management
3. System administration
If you have an Information Governance
plan, this is about the user tools.
Info-Tech Research Group, “Does ECM
meet the needs of your end-users?” n=58,
Q4 2012
58. Mold ECM to meet your needs before further technology
investment
[Chart (0-60% of respondents): customize SharePoint vs. add a 3rd-party tool.]
Successful ECM implementations focus on
customization and application integration
ECM success requires a dedication to
the platform through integration of
LOB applications.
AIIM, survey 2012, The ECM puzzle, adapted from Figure
16. N=345
ECM is more an application platform than a traditional
document store. Its ability to centralize document sharing and
integrate communications can provide users a
platform to manage their mundane tasks and bring
efficiency to the “processes” that encompass their
workday.
Info-Tech Insight
ECM has a variety of tools that ease
customization.
The adoption problem will not be solved by
additional tools.
This is a problem that must be dealt with
through ensuring that ECM makes workday
tasks easier to perform.
59. Align ECM and user information lifecycles at key points in the
process
Adoption and BRPs are linked together. Solve the users’ key needs and you’ll solve your
compliance concerns surrounding structured documents and records.
Capture Organize Use
Archive or
retire
ECM
lifecycle
User
information
lifecycle
Generate Record Use
Forget or
store
?
Organize Re-Organize
ECM works best when
the information is
organized at capture
The unasked question: “How do
users get work done?”
This is key to how users
expect to find documents
Users lack the
tools to
appropriately
archive content
Re-use leads to lots
of local copies.
60. Start to build a taxonomy by defining key user groups as
personas
Role:
What do they do?
What are their key challenges?
E. What are their activities?
Se. For what do they search?
M. What document types do they use?
S. Where do they work?
T. When do they work?
Code
Identify key
challenges with
information use or
access.
Now that we have
some of this
information use it to
jump start the
taxonomy process
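The persona coding above (E = activities, Se = searches, M = document types, S = where, T = when) can be captured as a simple record, and overlaps across personas point at candidate top-level taxonomy terms. The roles and values below are invented for illustration.

```python
# Persona record following the slide's coding scheme; all values are
# illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Persona:
    role: str
    activities: list   # E:  what are their activities?
    searches: list     # Se: for what do they search?
    doc_types: list    # M:  what document types do they use?
    location: str      # S:  where do they work?
    schedule: str      # T:  when do they work?

clerk = Persona("Records clerk", ["file requests"], ["bylaw number"],
                ["bylaws", "minutes"], "city hall", "business hours")
planner = Persona("Planner", ["site review"], ["parcel id"],
                  ["minutes", "site plans"], "field/office", "business hours")

# Document types used by more than one persona seed the shared taxonomy:
shared = set(clerk.doc_types) & set(planner.doc_types)
print(shared)  # {'minutes'}
```

Document types that only one persona uses can stay in a departmental branch; the shared ones belong near the top of the taxonomy.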
61. Classification is hard. It is an exercise in logic, philosophy, and – occasionally – faith, since it
deals with universalities. Thomas Jefferson, for example, ordered the books in Monticello
according to Francis Bacon’s Faculties of the Mind: Memory (History), Reason (Philosophy),
and Imagination (Fine Arts). Melvil Dewey borrowed this structure – and indirectly borrowed
from Hegel – to create the popular Dewey Decimal System.
The best approach for IT comes from S.R. Ranganathan. He was inspired by both Meccano
and Hindu mysticism to create a scheme centered on five key facets:
How do we actually classify stuff?
But what actually belongs in the taxonomy?
• Personality: the core subject of the work. Ignore it! It is too difficult to operationalize in the typical enterprise.
• Matter: objects, typically inanimate. Examples: Desktops; Servers; Storage; Buildings.
• Energy: actions and interactions; it can also describe specific processes. Examples: Customer service; Quality control; Manufacturing; Research; Accounts payable.
• Space: locations, departments, or similar descriptors. Examples: Human resources; APAC; Guatemala; Building A2.
• Time: hour, period, or duration. Examples: Morning; Q3; Financial close; Winter; 2011.
62. Focus on information findability with strong document
classification
You don’t need a tree structure to capture everything
Most people are familiar with the rigid classification systems used by
biologists, the periodic table of the elements, or library systems such as the
Dewey Decimal System. Each of these systems lets things be in only one
location in the classification system. This approach makes sense if you’re
trying to shelve books.
Most classification systems are pre-coordinated: things can only be in
one place at a time. Enterprise information is different. We need a
post-coordinated system that enables us to classify documents in a
variety of different ways.
Take three different creatures: grasshoppers, dufflepuds, and kangaroos. We
need different post-coordinated facets to effectively describe them: mammal,
insect, fictional, and things-that-jump.
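The creature example can be sketched directly as post-coordinated facets: each item carries several independent facet values, and queries combine facets after the fact rather than forcing each item into a single slot.

```python
# Post-coordinated faceted tags for the slide's three creatures.
# Each item holds a SET of facet values, not one tree position.

creatures = {
    "grasshopper": {"insect", "things-that-jump"},
    "dufflepud":   {"fictional", "things-that-jump"},
    "kangaroo":    {"mammal", "things-that-jump"},
}

def having(*facets):
    """Return all items carrying every requested facet."""
    return sorted(name for name, tags in creatures.items()
                  if set(facets) <= tags)

print(having("things-that-jump"))            # ['dufflepud', 'grasshopper', 'kangaroo']
print(having("things-that-jump", "mammal"))  # ['kangaroo']
```

A pre-coordinated tree would have forced a choice between filing the kangaroo under "mammal" or under "things-that-jump"; here it answers both queries.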
63. Keep the taxonomy structure to 8x3
Long lists of anything are a disaster for information collection
Marketing Joke: “What is the biggest state in the United States?”
Punch line: Alabama.
The joke isn’t funny but it does illustrate a common problem with Information Organization
and data collection. Digital marketers often solicit information from site visitors who aren’t
highly motivated to provide accurate information. Hence, they select the first option in the
“State” drop down list: AL – Alabama.
We have this same problem when we develop taxonomies and expect users to accurately
catalog documents when they upload them.
The Answer:
8x3Humans work best when presented with a list of about eight
items. We can typically keep that many items in working
memory. Furthermore, we will typically drill through three levels
of how detail.
Keep your taxonomy to three levels of detail, each with about eight items. The taxonomy for a facet,
therefore, can have 8³ – or 512 – items.
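The 8x3 arithmetic is quick to verify. A short sketch, assuming a fully populated facet at every level:

```python
# The 8x3 rule: about eight choices per level, three levels deep.
ITEMS_PER_LEVEL = 8
LEVELS = 3

# Leaf terms in one facet if every branch is fully populated.
leaf_terms = ITEMS_PER_LEVEL ** LEVELS
print(leaf_terms)  # 512

# Total terms across all three levels (8 + 64 + 512).
total_terms = sum(ITEMS_PER_LEVEL ** level for level in range(1, LEVELS + 1))
print(total_terms)  # 584
```

In practice few facets need anywhere near 512 leaves; the point is that three shallow levels give ample room without ever showing users a long list.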
64. Managed metadata, taxonomies, ontologies, thesauri, etc. all have
subtle differences but share some core elements:
• Authority file. Names that can be used. Descriptors and names
are listed in authority files.
• Broader term. Terms to which other terms are subordinate.
• Category. Grouping of terms which are associated, either
semantically or statistically.
• Related term. Terms which are similar to one another and often
exist in the same category.
• Modifier. A term that narrows the focus of another term. For
example, the use of “Character” in the compound term “Stanton,
Archibald – Character”.
• Narrower term. A term that is subordinate to another in a
category.
• Preferred term. The term that is used for indexing among a
group of related terms.
• Scope note. Direction on how to apply a term explaining usage
and coverage.
The controlled vocabulary is the basis of taxonomy and
findability
It can get complicated, but focus on the core elements.
[Diagram: nested rings showing a controlled vocabulary at the core of a thesaurus, which in turn sits inside an ontology.]
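The core elements listed above can be made concrete with a small data structure. This is a hypothetical sketch of one thesaurus entry: the sample terms, the "use for" entry points, and the lookup logic are all illustrative assumptions, not drawn from any real authority file.

```python
# A minimal thesaurus entry built from the core elements: preferred term,
# broader/narrower/related terms, and a scope note.
vocabulary = {
    "invoice": {
        "preferred": "invoice",
        "broader": "financial record",
        "narrower": ["purchase invoice", "sales invoice"],
        "related": ["purchase order"],
        "scope_note": "Use for demands of payment; not for receipts.",
    },
}

# Non-preferred entry points steer users onto the preferred term,
# playing the role of an authority file's cross-references.
use_for = {"bill": "invoice", "statement of charges": "invoice"}

def resolve(term):
    """Return the preferred term for any known entry point, else None."""
    term = term.lower()
    if term in vocabulary:
        return term
    return use_for.get(term)

print(resolve("Bill"))     # resolves to the preferred term "invoice"
print(resolve("invoice"))  # already preferred
```

The same lookup is what lets search and tagging interfaces accept the words users actually type while indexing under one consistent term.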
65. Move from defining problems to building a solution
The goals for requirements gathering.
Basics of building an ECM site with user experience in mind.
• Identify goals of the site
What is the one activity that will drive users to stay within
ECM.
• Create a logical hierarchy for the content
• Create a structure for the site based on the content hierarchy
• Explore the use of metaphors to come up with a site structure (organizational
metaphors, functional metaphors, visual metaphors)
Design the wireframes for the individual pages
Justify the project to stakeholders
Provide a feedback system to ensure that the site
adoption stays high.
(For internal sites, the content hierarchy is inherited from the controlled vocabulary.)
66. Start to build a taxonomy by defining key user groups as
personas
Role:
What do they do?
What are their key challenges?
E. What are their activities?
Se. For what do they search?
M. What document types do they use?
S. Where do they work?
T. When do they work?
Identify key challenges with information use or access. Now that we have
some of this information, use it to jump start the taxonomy process.
67. Start your taxonomy based on the vocabulary that already
exists
[Diagram: taxonomy pillars – department, budget-related, location,
research activities, daily activities, clinical activities – fed by
sources such as the folksonomy, an intranet workshop, and other sources.]
Remember, our goal at the beginning is to have enough taxonomy to
confidently allow users to add content to SharePoint for the purposes
that the organization has defined. The taxonomy WILL need to be updated
through a controlled process.
The key with folksonomy is a clear process for evaluating the usage. The
goal should be to have these terms integrated into the controlled
vocabulary to replace unused terms rather than to create a shadow
metadata system.
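The evaluation process described above can be sketched as a periodic review: count how often user-supplied tags appear, promote heavily used ones into the controlled vocabulary, and retire terms the review found no one applies. The threshold, sample tags, and unused-term list are illustrative assumptions; a real review would be run by the taxonomy governance process.

```python
from collections import Counter

# Current controlled vocabulary and observed folksonomy tag usage
# (sample data for illustration only).
controlled_vocabulary = {"budget", "clinical", "research"}
usage = Counter(["grant", "grant", "grant", "budget", "onboarding"])

PROMOTE_AT = 3               # review threshold, not a fixed rule
unused_terms = {"clinical"}  # terms the review found no one applies

def review(usage, vocabulary, unused):
    """Fold popular folksonomy tags into the controlled vocabulary."""
    promoted = {tag for tag, count in usage.items()
                if count >= PROMOTE_AT and tag not in vocabulary}
    # Replace unused terms rather than grow a shadow metadata system.
    return (vocabulary - unused) | promoted

print(review(usage, controlled_vocabulary, unused_terms))
```

Running the review on the sample data drops "clinical" (unused) and promotes "grant" (used three times), keeping the vocabulary the single source of truth.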
User journeys are a process map of the tasks that workers perform. IT can
use this as a guide for which information sources end users should be
able to access from SharePoint process steps.
Focus on the User’s journey through the system to increase
adoption
You know you need this if:
√ You ask users when they last looked at SharePoint and they say "Huh?"
√ You get requests for adding Google Drive to the desktop.
√ You spend more time explaining where to find the vacation request form than on application development.
√ You have Excel files called "corporate financials-confidential".
√ Finance is asking IT to pay for their version of "Search for Dummies".
√ The top sent email address ends in "@evernote.com" [or "@gmail.com"].
√ You have spent more than one hour looking for "some document that Joe from research worked on, maybe last year".
√ You have 300 TBs of data for five users.
√ You get helpdesk requests that start "I need Jane to access my…"
√ The CIO keeps forgetting to approve your vacation, suggesting "it would be easier if I could just press a button".
√ You have product requests for productivity apps.
If you are on older versions of portals, can you extract what worked and what failed?
69. Synthesis of individual knowledge
Individuals use a variety of information sources to build their knowledge base
70. Group knowledge requires individual information
Synthesis of group knowledge requires the right mix of information from
each individual and collaborative analysis
71. 4.1 Mitigate storage growth with the “one-two punch” of
policy and technology
Policy: Data management policy
and practices will mitigate data
growth and maximize data value.
Technology: innovations
will improve utilization,
matching data value to
storage cost.
Technology:
• Data deduplication
• Automatic tiering
• Storage virtualization/software-defined storage
• Thin provisioning
• Data compression
Policy:
• Data governance
• User policies
• Archival practices
• Purchase timing optimization
IT managers who optimize their storage
environment through policy and technology
approaches are able to:
1) Drastically reduce the cost of storing the
same amount of data.
Effect size: Reduce storage budget by over
half; experience as much as 60-70% cost
reduction.
Or
2) Purchase more storage capacity with existing
budget amounts.
Effect size: 150-200% more storage volume for
the same price.
Addressing storage with only one of these approaches is like boxing with
one hand tied behind your back.
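The two effect sizes above are the same optimization viewed two ways: a cost reduction of fraction r buys 1/(1 − r) times the volume for the same budget. A quick sketch checking the slide's ranges (the input percentages are taken from the text; the conversion itself is plain arithmetic):

```python
def extra_volume(cost_reduction):
    """Percent more storage for the same spend, given a fractional cost reduction."""
    return (1 / (1 - cost_reduction) - 1) * 100

# 60% cheaper per unit means 2.5x the volume, i.e. 150% more.
print(round(extra_volume(0.60)))
# Roughly two-thirds cheaper means 3x the volume, i.e. 200% more.
print(round(extra_volume(2 / 3)))
```

So the "60-70% cost reduction" and "150-200% more storage" figures describe one underlying improvement, not two separate wins.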
72. Do not discount any content type without determining your risk and value
guidelines.
The Information Governance team needs to have full visibility
into all potential information sources.
[Diagram: potential information sources feeding the Information
Governance strategy – archives and back-up, old hardware, hosted
services, user-acquired services, communication, and new content types.]
The explosion of content types means fragmentation of how information is
shared and stored. The Information Governance committee’s definition of
information should be content-type neutral.
73. Start with policy, then apply the policies to information sources based on
value to the business.
Break Information Governance into manageable projects
Project 1: Enterprise-wide policies
• Focus: information risk and value.
Project 2: Build a taxonomy
• Focus: information organization – specific risk mitigation and findability.
Project 3: Disposition and growth control
• Focus: archiving. Is specific content valuable enough to keep?
Project 4: Enterprise-wide storage control through deletion
• Focus: storage management. The key to controlling growth is translating
governance policies into management practices.
Archiving can be the driver for better governance, but it cannot replace
governance. Archiving requires rules and policies both for enterprise-wide
rules and for managing exceptions to the rule.
74. Define the value of governance based on the initiatives that
use information
Potential information sources: what information is important long term?
Most users spend their time making documents that will
not be used or likely opened more than once. All
stakeholders can agree that these types of files are a
waste of space.
It really comes down to this: if file X was deleted
tomorrow, would anyone care – or even notice?
The answer for most files is no, but there is also no
value to end users in determining which documents
are low value.
Storage wins can drive cost reduction, but without
strong backing from the executive these will not lead
to long-term adoption of Information Governance.
Focus on those information sources where good
governance will increase the value or ease the
implementation of a business initiative.
Initiatives that require information to move between
users or applications will be more valuable and easier to
implement with clear guidelines on what information
should be included, how it should be classified, and who
can access the information.
Info-Tech Insight
All of these sources should be governed.
Start with sources where there is a
clear enterprise-wide mandate for
expanding their use.
75. IT needs a strategy that links relevant content and brings appropriate
controls to the important content.
Strategically balance the compliance needs with productivity
goals
Enterprise-wide content management:
Enterprise Content Management (ECM) is a strategy for
IT to employ for unstructured data.
The three key factors in the content management
strategy are:
1. Compliance and litigation arising from
communication.
2. Locating where documents, multimedia, and records sit
in the enterprise’s databases to ensure audit-ability and
transparency to IT.
3. Enhancing usability and cross-department content
sharing.
• It is not a “one-size-fits-all” solution. Strategic decisions
require a real understanding of what content has
strategic value to the enterprise.
• The strategic value is based on how content visibility will
enable productivity allowing for a transparent audit trail.
[Diagram: the content management strategy balances IT, compliance, and
productivity; ECM provides the required technical controls to shape the
explosion to meet the enterprise’s needs.]
76. Understand the drivers for Information Governance
Find the right mix of enterprise-wide needs to structure your
Information Governance framework
Business Efficiency
This need is the top-ranked driver for ECM adoption by Info-Tech
clients. Business efficiency is the only need that will enable a
long-term Information Governance program. Business efficiency is based
on findability and well-implemented business initiatives.
IT Efficiency
The content growth provides a perfect opportunity to control storage
costs. Well-governed information can reduce the cost of storage, in the
long term, by 60% through controlling the growth rate, reducing
duplication of content, and automating disposition.
Compliance
For most organizations the limited regulatory overhead will not be an
effective driver for Information Governance.
Risk Mitigation
Litigation: eDiscovery is the elephant in the room. For most
organizations the risk is huge but the likelihood is very low.
Internal records: all organizations have HR documents and financial
records that require governance.
[Diagram: ECM technology landscape – a core of records management,
document management, and web content management, surrounded by
collaboration, eDiscovery, capture, analytics, wikis, blogs, archiving,
workflow, forms, intranet, search, DAM, and repository.]
ECM strategy is implemented with a variety of different
technologies
The core of ECM as a technology is a
pyramid of three technologies:
• Records Management
• Document Management
• Web Content Management
These three technologies form the
basis of ECM applications.
ECM applications often bleed into a
fringe of related ancillary technologies
like archiving and collaboration.
Strategically, ECM applications are
the technical control to implement and
control content throughout its
whole lifecycle.
Dedicated ECM suites include both core and
fringe technologies.
78. Content will continue to explode. The proliferation of cloud and mobile
devices has altered where content comes from and how it is used.
Fireworks look better from a distance - protect the enterprise
from personal content
• Ad hoc/Personal. Most enterprises are seeing growth in
this area. This includes enterprise social (activity feeds),
mobile workers (purchase orders), or personal knowledge
stores (e.g., Evernote).
• Collaborative. Content generated as part of group efforts;
templates and documents specific to a single department
or workgroup. Collaborative content is a low security risk,
but potentially useful to many users.
• System of record. Widely used documents (content
marketing), workflow (vacation approvals), and content
requiring tight control due to compliance or litigation
concerns (communications-IM, email). These records
require a structured system to ensure control of growth
and compliance.
The future of content will be social. AdAge reports more than 3.5 billion pieces of content are shared over
Facebook each week. As the Facebook demographic fills out, the workforce sharing will increase. IT needs
rules in place defining what traits of any communication require storage in the system of record and which
should be left to individual management.
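The three tiers above imply a routing decision for each piece of content. A hypothetical sketch of that decision as rules; the trait names and the precedence order are assumptions for illustration, since a real policy would come from the Information Governance committee:

```python
def tier(traits):
    """Route content into one of the three tiers described above.

    `traits` is a dict of illustrative boolean flags; compliance and
    wide reuse take precedence over mere group sharing.
    """
    if traits.get("compliance_hold") or traits.get("widely_used"):
        return "system of record"
    if traits.get("shared_with_group"):
        return "collaborative"
    return "ad hoc/personal"

print(tier({"compliance_hold": True}))    # contracts, regulated records
print(tier({"shared_with_group": True}))  # departmental templates
print(tier({}))                           # personal notes, one-off files
```

Writing the rules down this way forces the committee to decide which traits actually trigger the system of record, which is exactly the gap the slide identifies for social and mobile content.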
[Diagram: content spectrum from personal to enterprise.]
79. Combine how information moves within departments with the enterprise-
wide needs to define the management strategy.
Fluid information movement requires good governance
• Start by determining how similar the key intra- and inter-
departmental movement patterns are.
• Enterprises with similar departmental and enterprise-wide
needs for their system (user profiles, classifications) should
prioritize a single ECM platform that spans both departmental
and enterprise content.
• Where these needs diverge IT must carefully consider the
compliance environment.
o Enterprises with low compliance and litigation burdens
should consider giving departments autonomy on the
choice of system or even just a collaboration platform.
o When tools diverge IT must ensure that the appropriate
access and controls exist to share information between
departments. IT’s goal should be to protect the enterprise
from compliance and legal concerns.
• For highly regulated industries, provide personal content tools
that have search and audit features. The enterprise may still
be responsible during eDiscovery for employee-generated
content in their personal stores.
Key considerations for ECM:
• The greater the number of regulations, the higher up the firework the
ECM must reach.
• The similarities between departmental data and enterprise-wide data
define the complexity of the ECM.
80. “What did you know and when did you know it?” The enterprise is obligated
to know what has been communicated by any recordable medium.
Move beyond content type – it’s about the context of the
communication
• The advent of IM and activity streams has changed the
landscape.
• The majority of rules and regulations do not point to
any particular type of communication such as email.
• The critical decision is what the information in the
electronic communication is.
• New consumer tools and social media change the
thinking regarding records. Records are a content type
that require a strategy to define acceptable-use policy
not just for email, but for all forms of internal and
external communications.
Email as a record: it depends on the context.
In general, email contracts, invoices, and personnel
records should be retained in systems with records
management functionality.
According to the United States’ Federal Rules of Civil
Procedure (FRCP), the obligation to preserve email as a
record begins as soon as there is a reasonable
expectation of litigation.
Specific regulations have audit requirements. For
example, Sarbanes-Oxley has an audit-ability of email
and other communications requirement.
For highly regulated industries, IT must work closely
with Legal to ensure that the ECM strategy can meet
their regulatory burdens.
The world has changed. Mobile, consumerization, and
BYOD have fragmented content storage locations.
Ensure that you have full visibility by adopting mobile
device management solutions that have content
management capabilities. See Info-Tech’s strategy set:
Develop a Mobile Device Management.
The basis of compliance is visibility and findability. These same needs
are also the basis for productivity.
Guard your assets through clear rules and appropriate search
tools
• ECM applications provide centralized logic and organization for
associated cross-department content.
• The fluid ways that workers are producing and using content
presents issues to ECM applications for applying appropriate
security. IT is no longer capable of blocking export of
content to personal devices.
• Digital Asset Management technology allows enterprises similar
controls for images as rights management does for documents,
providing control over where and when it can be published.
• Extend this approach by taking advantage of the role-based
security to build enterprise-wide author lists for content.
This will ease finding relevant content based on known
relationships to subject matter experts.
Digital Asset Management
This once-stand-alone product is an advanced feature of all web
experience management (WEM) solutions.
Similar to record retention classifications, DAM is integral to
monitoring where assets have been used and who has the right to
publish or share that piece of content.
Enable DAM to ensure an audit trail of strategic content.
[Chart: percentage of respondents (0–40%) reporting each item as a
challenge vs. no challenge.]
Securing content is the
largest challenge identified.
Source: Info-Tech Research Group Q2 2012;
N=75
As with all aspects of ECM, the
challenges vary based on industry
and size. Content security is the only
challenge that cuts evenly across
these lines.
Editor's Notes
Despite reports of tightening IT budgets, security spending is expected to increase over the next few years.
Security continues to grow as a priority for all industries, so the budget allocated to security is increasing within organizations, with a heavy focus on security personnel.
Generally, about 5% of a company’s IT budget is spent on security. The average security budget in 2013 of large companies was 4.3 million dollars. Total IT security spending is actually predicted to increase at a 6.6% compound annual growth rate, reaching 30.1 billion dollars spent on security in 2017 for all organizations. Trends predict that medium-to-large businesses with 100 to just-under-500 employees will be the greatest generator of security spending, totalling about 8.5 billion dollars in 2017.
This chart shows the breakdown of a CISO’s security budget. As you can see, security personnel get the majority of the funding allocated.
The table below shows the percentage of organizations’ total annual revenue that is allocated to security, broken up by organization type. So you can see, in the grand scheme of the entire company, the money spent on security is pretty low, almost always below 0.1% of the annual revenue.
<Walking the audience through the graphic with your hands> Walking through a basic example, as a business event occurs, we start to collect data, at which point the application and staff analyze said data. Once the data is analyzed, reports are generated and passed on to business users where they can make data driven decisions based on that insight. <click>
This chain of events creates three areas of latency that are in the BI team’s purview to minimize.
First is data latency. <click> This is the time between the business event occurring and the data being ready for analysis. The BI team can reduce data latency by pulling relevant data from storage and preparing this data for analysis as the business event concludes.
Second is analysis latency. <click> This latency is the time between the data being prepared for analysis and the actual delivery of information to the business. Analysis latency is the biggest driver of business value and we recommend you focus here as a first step when considering timeliness. Reducing analysis latency involves building responsive new processes and enhancing skill sets within the team to increase the speed-to-analyze.
Last is decision latency. <click> Decision latency is relatively out of the control of the BI team. It represents the time between information being delivered to the business and their business decision based on the information. This is easily the biggest factor outside of the control of the BI team. But tactics can still be employed to reduce this type of latency. Ensuring reports are findable by utilizing metadata and enabling an efficient user experience are just two that can help reduce decision latency.
While we treat these three latencies as independent events for learning purposes today, the key here is to remember that regardless of the controllability of these latencies from the BI team’s perspective, the business will hold you responsible for slow time-to-insight so working to reduce all of these latencies is a key component of remedying our unhealthy BI diet. <click>
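The decomposition walked through above can be sketched in a few lines: total time-to-insight is the sum of the three latencies, and the business judges the whole even though the BI team directly controls only part of it. The hour figures below are made-up illustrations, not benchmarks.

```python
# Time-to-insight = data latency + analysis latency + decision latency.
# Sample figures in hours, purely for illustration.
latencies = {"data": 4.0, "analysis": 10.0, "decision": 24.0}

time_to_insight = sum(latencies.values())
controllable = latencies["data"] + latencies["analysis"]

print(time_to_insight)  # hours from business event to business decision
print(controllable)     # the portion directly in the BI team's purview
```

Even in this toy example the largest single component (decision latency) is the one the BI team controls least, which is why the tactics above target findability and user experience rather than faster pipelines alone.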
Our last pill to swallow is about the effectiveness of your BI operations strategy. When you can confidently meet these four criteria, you can consider your BI program effective.
First, <click> focus on business requirements. Doing so yields higher benefits per BI output.
Second, <click> tailor functionality based on users’ needs. Accomplishing this ensures investment in maintaining BI application functionality is producing value to the business.
Third, <click> use KPIs to monitor performance. This allows you to facilitate measuring performance, and re-assessing when necessary.
Lastly, <click> you want to simplify complex processes. Doing so lowers BI unit costs, allowing for investments in innovative practices.
The impact of an effective BI operation is felt in two areas: the quality of the final deliverables, and the efficiency of the BI team as projects and requests require less rework, thereby increasing throughput.
Priorities:
Frequency of use
RPO/RTO
User groups
Exec
HR
Accounting
Development-related tasks (user group)
Pilotability
Supportability
Risks
Compliance
retention
Internal bias
Due diligence
property valuation (mistakes with missing information)
Audit-ability
Compatibility
Encryption/redaction
Task notes:
Had to clarify what was meant by priorities
the actual priorities vs. the criteria to identify priorities
A lot to say for the risks section, very engaged
Emphasis was on the risk section from client side
difficult to come up with a long list without prompting
Thinks the full exercise would be a meaningful exercise to go through
Without the focus on distinctive capabilities, the organization will not be getting the biggest bang for its effort.
Good decisions usually have systematically assembled data and analysis behind them.
Distinctive capabilities:
American airlines in the early days: yield management
Capital One
Harrah’s Casinos: Customer Loyalty and Service
Marriott Hotels:
WalMart, CEMEX Cement: supply chain optimization, vendor management and more
Several studies have found significant correlation between higher levels of analytical maturity and robust compound annual growth rates.