This document discusses how artificial intelligence can help companies better manage complex contracts and associated risks. It makes the following key points:
1) Companies face increasing financial risk due to their inability to manage large amounts of contract data. AI can help interpret complex contracts and interrelated agreements to provide a more comprehensive view of risk.
2) AI-powered collaboration tools and search engines can organize contract data, automate certain tasks, and detect risks earlier, freeing up legal and risk professionals to focus on higher-level work.
3) By returning thousands of hours to risk analysis through automation and organization of data, AI technologies can help reduce legal costs from disputes and better inform negotiations.
Your firm needs to be committed to protecting information assets, including personal data and client documents. As a trusted advisor to our clients, the expectation is that we are aware of threats and are guarding their data. Data privacy and information security are fundamental components of doing business today, no matter how large your firm is.
In this paper we will look at three specific ways of protecting our clients:
1. Protection through our ability to research and improve intellectual capital
2. Protection through policies, procedures and processes
3. Protection by securing client data
The document discusses how digital technologies are disrupting traditional work practices by enabling employees to work remotely in various locations. It finds that while many employees want improved communication and collaboration tools to work flexibly, outdated infrastructure and limited investment are hindering organizations' ability to support this new digital way of working. The survey of IT leaders reveals that videoconferencing and mobile access to work networks are increasingly important but that legacy phone systems and other technologies need upgrades to meet evolving employee and business needs.
Crime Scene Investigation: Content – Who killed Enterprise Content Management? As consumer technology takes more attention, enterprise content management seems to have disappeared, particularly ECM. Presentation by John Newton was made at the Technology Services Group led by Dave Giordano at the University of Chicago Gleacher Center on 8 June 2011.
Is your infrastructure holding you back?Gabe Akisanmi
This ebook will help you connect the dots between
today’s biggest business opportunities and the specific
technology required to seize them. You’ll get the facts
you need to identify where current components may
be falling short—and how the right investments in infrastructure
can lead to better business outcomes while
strengthening your role as a strategic consultant within
your organization.
20140408 LOMA Life Insurance Conference: STP More Than Just A Tweak To Your O...Steven Callahan
Provides an overview of what may be achieved through the digitalization of new business processing and the implementation of straight-through processing including the digital delivery of life insurance policies.
When you’re planning to move to the cloud and manage a hybrid environment, security is a top concern. But cloud is not necessarily less secure than a traditional environment. In fact, it may be possible to deliver even greater security in a hybrid cloud environment because it offers new and advanced opportunities.
In this eBook, you’ll discover how hackers are using traditional tactics in new ways to attack the cloud. You’ll also find out how the cloud can help you increase security with innovative approaches designed to detect threats long before they threaten your enterprise.
The digital revolution changed the way we work forever. Is your organisation keeping up? Here’s a quick guide to the four
most popular digital workers, and how IT support can help your company make the most of the tech boom.
This document discusses how artificial intelligence can help companies better manage complex contracts and associated risks. It makes the following key points:
1) Companies face increasing financial risk due to their inability to manage large amounts of contract data. AI can help interpret complex contracts and interrelated agreements to provide a more comprehensive view of risk.
2) AI-powered collaboration tools and search engines can organize contract data, automate certain tasks, and detect risks earlier, freeing up legal and risk professionals to focus on higher-level work.
3) By returning thousands of hours to risk analysis through automation and organization of data, AI technologies can help reduce legal costs from disputes and better inform negotiations.
Your firm needs to be committed to protecting information assets, including personal data and client documents. As a trusted advisor to our clients, the expectation is that we are aware of threats and are guarding their data. Data privacy and information security are fundamental components of doing business today, no matter how large your firm is.
In this paper we will look at three specific ways of protecting our clients:
1. Protection through our ability to research and improve intellectual capital
2. Protection through policies, procedures and processes
3. Protection by securing client data
The document discusses how digital technologies are disrupting traditional work practices by enabling employees to work remotely in various locations. It finds that while many employees want improved communication and collaboration tools to work flexibly, outdated infrastructure and limited investment are hindering organizations' ability to support this new digital way of working. The survey of IT leaders reveals that videoconferencing and mobile access to work networks are increasingly important but that legacy phone systems and other technologies need upgrades to meet evolving employee and business needs.
Crime Scene Investigation: Content – Who killed Enterprise Content Management? As consumer technology takes more attention, enterprise content management seems to have disappeared, particularly ECM. Presentation by John Newton was made at the Technology Services Group led by Dave Giordano at the University of Chicago Gleacher Center on 8 June 2011.
Is your infrastructure holding you back?Gabe Akisanmi
This ebook will help you connect the dots between
today’s biggest business opportunities and the specific
technology required to seize them. You’ll get the facts
you need to identify where current components may
be falling short—and how the right investments in infrastructure
can lead to better business outcomes while
strengthening your role as a strategic consultant within
your organization.
20140408 LOMA Life Insurance Conference: STP More Than Just A Tweak To Your O...Steven Callahan
Provides an overview of what may be achieved through the digitalization of new business processing and the implementation of straight-through processing including the digital delivery of life insurance policies.
When you’re planning to move to the cloud and manage a hybrid environment, security is a top concern. But cloud is not necessarily less secure than a traditional environment. In fact, it may be possible to deliver even greater security in a hybrid cloud environment because it offers new and advanced opportunities.
In this eBook, you’ll discover how hackers are using traditional tactics in new ways to attack the cloud. You’ll also find out how the cloud can help you increase security with innovative approaches designed to detect threats long before they threaten your enterprise.
The digital revolution changed the way we work forever. Is your organisation keeping up? Here’s a quick guide to the four
most popular digital workers, and how IT support can help your company make the most of the tech boom.
Explore the Roles and Myths of Automation and Virtualization in Data Center T...Dana Gardner
The document is a transcript of a podcast discussion about virtualization and automation in data centers. It discusses how automation can help companies achieve higher levels of virtualization adoption, beyond the 30% level many are stuck at. Automation allows virtualization to extend to more applications, including tier 1 apps, and provides benefits like improved compliance, repeatability, and agility. It argues companies need to take a strategic view of virtualization through a full lifecycle lens to realize its full potential for transformation.
A look at IT decision making, budgeting, priorities and technology adoption among UK and Germany-based SMEs based on 500 interviews (250 in the UK and 250 in Germany) with IT decision makers from private sector SME organisations.
Let’s face it; technology can be confusing. And it doesn’t help when the jargon all sounds the same — intranet, extranet… internet vs. extranet. That's why we’re here to clear up a bit of the confusion and set the record straight on intranet vs. extranet.
20140826 I&T Webinar_The Proliferation of Data - Finding Meaning Amidst the N...Steven Callahan
Joint presentation with I&T's covering the proliferation of data available to insurance companies today and a high level view of searching for value and leveraging the relevant and useful buried in all of the trivia.
BSI Teradata: The Case of the Dropped Mobile CallsTeradata
The BSI team helps Intergalactic Telephone Corp., whose customers are experiencing dropped mobile phone calls. Using Teradata Aster, Teradata Hybrid Storage, Aprimo and Tableau, the team develops some novel approaches to analyzing call detail records to identify the top potential defectors and to improve customer satisfaction with a powerful combination of real-time apolgies, credits on bills, software upgrades, and free femtocell boosters for selected customers. Watch the video at http://bit.ly/zDfmJH.
The consumerization of IT is under way. Workers want tablet access to business applications, often from personal devices. Learn why VPNs are not ideal for mobile connectivity and why remote desktop is a more secure, less expensive approach to tablet access that is easier to deploy, manage and use.
This document provides an overview of conducting an audit of an e-business to evaluate its security, performance, and overall health. It outlines key areas to examine, including response time, security practices, network configuration, and development processes. The audit aims to identify any issues and make recommendations for improvement. Sample diagrams and checklists are provided to help structure the audit. The goal is to ensure the e-business is operating securely and meeting customer needs.
Learn How to Maximize Your ServiceNow InvestmentStave
Understand how leading companies are adopting an aPaaS strategy
Learn the evolution of ServiceNow's platform capabilities
Assert IT's influence over shadow IT practices
In this Slideshare, Fuji Xerox covers the hottest trends for any organisations actively seeking to achieve a more efficient business workflow through digitalisation.
The document discusses improving customer experience through interactions. It states that 86% of customers quit doing business with a brand due to a bad customer experience. The document then lists 5 pain points companies commonly face regarding customer interactions: 1) providing personalized customer service is expensive and difficult, 2) concerns about data loss in a digital world, 3) customers dislike lengthy signup forms, 4) increasing risks to customer data as a company grows, and 5) difficulty servicing customers quickly through social media. It provides solutions to each pain point, such as using CRMs to access customer data and automating processes to improve the customer experience. Overall, the document stresses that customer interactions can define a company and securing data while streamlining processes is important to
Digitization offers businesses opportunities to improve efficiency and reduce costs by transitioning processes from paper-based to digital. While some businesses have made progress in digitizing, most still rely on paper for some processes. Fully digitizing customer communications and integrating paper and digital information could help businesses improve productivity, enhance customer service, and reduce expenses associated with paper usage and storage. However, going completely paper-free is not necessary or realistic for most established businesses, as paper still provides benefits in some contexts like customer communications. The goal of digitization should be to reduce paper where possible while leveraging its advantages elsewhere.
Streamlining paper processes in a digital world - Canon CEECanon Business CEE
Wherever you are on your journey to process modernisation, we can help you go further.
We can help organizations of all shapes and sizes transform the way they work.
We can work with you to seamlessly integrate paper and digital workflows, automating information capture, storage and distribution to accelerate operational performance.
En ebook-e-signatures and workflow automation transform your tomorrowNiranjanaDhumal
Market competition is now higher than ever, and automation is very likely to be a prominent resolve for the future of business.
With the automation workflows and approval procedures – the most influential aspects of your business – you can transition seamlessly towards a digital tomorrow.
This document discusses the growing need for enterprise-wide remote access and mobility solutions. It outlines some of the key business drivers, such as increased productivity and business continuity. It then evaluates some popular existing approaches like VPNs, server-based computing, and virtual desktop infrastructure (VDI) and finds that they do not scale cost-effectively to support large numbers of remote users. A new appliance-based remote desktop access solution is introduced that aims to provide secure, high-performance remote access across many device types in a more cost-effective way than traditional options.
Flexible Working enables both the employee and the business’ needs to be met through agility and adjustments to when, where and how both choose to work.
This is mutually beneficial to both the employer and employee and result in outcomes the reap success.
Backfile Conversion: Best Practices and ConsiderationsDATAMARK
This document discusses best practices for converting paper documents to digital files through a process called backfile conversion. It addresses the need for organizations to go digital to improve efficiency and services. The key aspects of backfile conversion covered are comprehensive planning, analyzing document workflows, addressing preparation and indexing of documents, and considering whether to conduct the conversion in-house or outsource it. Careful planning is emphasized to make the process of digitizing potentially millions of legacy documents strategic and successful.
En ebook-where-does-edi-stand-today-and-where-do-we-stand-with-itNiranjanaDhumal
This eBook explores every nook and cranny of EDI, where it stands today, and where do we stand with it. It scrutinizes the core competencies of how EDI works across various ecosystems and trading channels, what standards it follows, and implementation options.
Outsourcing Email Management? Companies are getting the Messagepaully58
Email is a critical tool for businesses, but maintaining email systems has become challenging and costly for companies. The volume of emails and mailboxes is growing rapidly, exposing businesses to security vulnerabilities. Most companies rely on overburdened IT departments to manage complex email systems, but outsourcing email management to a provider can free up internal resources and ensure email reliability, security, and compliance with regulations. Outsourcing saves companies time, money and productivity by standardizing systems and allowing IT experts to focus on strategic initiatives rather than maintaining email.
White paper I developed on best practices when selecting an HR technology product suite. Includes an amazing checklist to help those in the market for a new human resources suite tech product.
The pioneers in the big data space have battle scars and have learnt many of the lessons in this report the hard way. But if you are a general manger & just embarking on the big data journey, you should now have what they call the 'second mover advantage’. My hope is that this report helps you better leverage your second mover advantage. The goal here is to shed some light on the people & process issues in building a central big data analytics function
Building an Effective Data Management StrategyHarley Capewell
In June 2013, Experian hosted a Data
Management Summit in London, with over
100 delegates from the public, private and
third sectors. Speakers from Experian
and across the data industry explored the
challenges of developing and implementing
data quality strategies - and how to
overcome them. Read on for more information.
IBM's InfoSphere software helps organizations successfully leverage big data by providing an understanding of their data. It addresses the challenges of big data's four V's (volume, variety, velocity, and veracity) by automating data integration and governance. This helps boost confidence in big data by establishing standard terminology, tracing data lineage, and separating useful "good" data from unnecessary "bad" data. As a result, organizations can more accurately analyze big data and act on the insights with confidence.
Explore the Roles and Myths of Automation and Virtualization in Data Center T...Dana Gardner
The document is a transcript of a podcast discussion about virtualization and automation in data centers. It discusses how automation can help companies achieve higher levels of virtualization adoption, beyond the 30% level many are stuck at. Automation allows virtualization to extend to more applications, including tier 1 apps, and provides benefits like improved compliance, repeatability, and agility. It argues companies need to take a strategic view of virtualization through a full lifecycle lens to realize its full potential for transformation.
A look at IT decision making, budgeting, priorities and technology adoption among UK and Germany-based SMEs based on 500 interviews (250 in the UK and 250 in Germany) with IT decision makers from private sector SME organisations.
Let’s face it; technology can be confusing. And it doesn’t help when the jargon all sounds the same — intranet, extranet… internet vs. extranet. That's why we’re here to clear up a bit of the confusion and set the record straight on intranet vs. extranet.
20140826 I&T Webinar_The Proliferation of Data - Finding Meaning Amidst the N...Steven Callahan
Joint presentation with I&T's covering the proliferation of data available to insurance companies today and a high level view of searching for value and leveraging the relevant and useful buried in all of the trivia.
BSI Teradata: The Case of the Dropped Mobile CallsTeradata
The BSI team helps Intergalactic Telephone Corp., whose customers are experiencing dropped mobile phone calls. Using Teradata Aster, Teradata Hybrid Storage, Aprimo and Tableau, the team develops some novel approaches to analyzing call detail records to identify the top potential defectors and to improve customer satisfaction with a powerful combination of real-time apolgies, credits on bills, software upgrades, and free femtocell boosters for selected customers. Watch the video at http://bit.ly/zDfmJH.
The consumerization of IT is under way. Workers want tablet access to business applications, often from personal devices. Learn why VPNs are not ideal for mobile connectivity and why remote desktop is a more secure, less expensive approach to tablet access that is easier to deploy, manage and use.
This document provides an overview of conducting an audit of an e-business to evaluate its security, performance, and overall health. It outlines key areas to examine, including response time, security practices, network configuration, and development processes. The audit aims to identify any issues and make recommendations for improvement. Sample diagrams and checklists are provided to help structure the audit. The goal is to ensure the e-business is operating securely and meeting customer needs.
Learn How to Maximize Your ServiceNow InvestmentStave
Understand how leading companies are adopting an aPaaS strategy
Learn the evolution of ServiceNow's platform capabilities
Assert IT's influence over shadow IT practices
In this Slideshare, Fuji Xerox covers the hottest trends for any organisations actively seeking to achieve a more efficient business workflow through digitalisation.
The document discusses improving customer experience through interactions. It states that 86% of customers quit doing business with a brand due to a bad customer experience. The document then lists 5 pain points companies commonly face regarding customer interactions: 1) providing personalized customer service is expensive and difficult, 2) concerns about data loss in a digital world, 3) customers dislike lengthy signup forms, 4) increasing risks to customer data as a company grows, and 5) difficulty servicing customers quickly through social media. It provides solutions to each pain point, such as using CRMs to access customer data and automating processes to improve the customer experience. Overall, the document stresses that customer interactions can define a company and securing data while streamlining processes is important to
Digitization offers businesses opportunities to improve efficiency and reduce costs by transitioning processes from paper-based to digital. While some businesses have made progress in digitizing, most still rely on paper for some processes. Fully digitizing customer communications and integrating paper and digital information could help businesses improve productivity, enhance customer service, and reduce expenses associated with paper usage and storage. However, going completely paper-free is not necessary or realistic for most established businesses, as paper still provides benefits in some contexts like customer communications. The goal of digitization should be to reduce paper where possible while leveraging its advantages elsewhere.
Streamlining paper processes in a digital world - Canon CEECanon Business CEE
Wherever you are on your journey to process modernisation, we can help you go further.
We can help organizations of all shapes and sizes transform the way they work.
We can work with you to seamlessly integrate paper and digital workflows, automating information capture, storage and distribution to accelerate operational performance.
En ebook-e-signatures and workflow automation transform your tomorrowNiranjanaDhumal
Market competition is now higher than ever, and automation is very likely to be a prominent resolve for the future of business.
With the automation workflows and approval procedures – the most influential aspects of your business – you can transition seamlessly towards a digital tomorrow.
This document discusses the growing need for enterprise-wide remote access and mobility solutions. It outlines some of the key business drivers, such as increased productivity and business continuity. It then evaluates some popular existing approaches like VPNs, server-based computing, and virtual desktop infrastructure (VDI) and finds that they do not scale cost-effectively to support large numbers of remote users. A new appliance-based remote desktop access solution is introduced that aims to provide secure, high-performance remote access across many device types in a more cost-effective way than traditional options.
Flexible Working enables both the employee and the business’ needs to be met through agility and adjustments to when, where and how both choose to work.
This is mutually beneficial to both the employer and employee and result in outcomes the reap success.
Backfile Conversion: Best Practices and ConsiderationsDATAMARK
This document discusses best practices for converting paper documents to digital files through a process called backfile conversion. It addresses the need for organizations to go digital to improve efficiency and services. The key aspects of backfile conversion covered are comprehensive planning, analyzing document workflows, addressing preparation and indexing of documents, and considering whether to conduct the conversion in-house or outsource it. Careful planning is emphasized to make the process of digitizing potentially millions of legacy documents strategic and successful.
En ebook-where-does-edi-stand-today-and-where-do-we-stand-with-itNiranjanaDhumal
This eBook explores every nook and cranny of EDI, where it stands today, and where do we stand with it. It scrutinizes the core competencies of how EDI works across various ecosystems and trading channels, what standards it follows, and implementation options.
Outsourcing Email Management? Companies are getting the Messagepaully58
Email is a critical tool for businesses, but maintaining email systems has become challenging and costly for companies. The volume of emails and mailboxes is growing rapidly, exposing businesses to security vulnerabilities. Most companies rely on overburdened IT departments to manage complex email systems, but outsourcing email management to a provider can free up internal resources and ensure email reliability, security, and compliance with regulations. Outsourcing saves companies time, money and productivity by standardizing systems and allowing IT experts to focus on strategic initiatives rather than maintaining email.
White paper I developed on best practices when selecting an HR technology product suite. Includes an amazing checklist to help those in the market for a new human resources suite tech product.
The pioneers in the big data space have battle scars and have learnt many of the lessons in this report the hard way. But if you are a general manger & just embarking on the big data journey, you should now have what they call the 'second mover advantage’. My hope is that this report helps you better leverage your second mover advantage. The goal here is to shed some light on the people & process issues in building a central big data analytics function
Building an Effective Data Management StrategyHarley Capewell
In June 2013, Experian hosted a Data
Management Summit in London, with over
100 delegates from the public, private and
third sectors. Speakers from Experian
and across the data industry explored the
challenges of developing and implementing
data quality strategies - and how to
overcome them. Read on for more information.
IBM's InfoSphere software helps organizations successfully leverage big data by providing an understanding of their data. It addresses the challenges of big data's four V's (volume, variety, velocity, and veracity) by automating data integration and governance. This helps boost confidence in big data by establishing standard terminology, tracing data lineage, and separating useful "good" data from unnecessary "bad" data. As a result, organizations can more accurately analyze big data and act on the insights with confidence.
Master Data in the Cloud: 5 Security FundamentalsSarah Fane
Your master data is essential to the smooth operation of your business. But it is also valuable to others. Master data is vulnerable to both internal and external attacks. As the future of business and data is increasingly cloud-based, we explore five fundamentals to ensure the security of your data.
Project 3 – Hollywood and IT· Find 10 incidents of Hollywood p.docxstilliegeorgiana
Project 3 – Hollywood and IT
· Find 10 incidents of Hollywood portraying IT security incorrectly
· You can use movies or TV episodes
· Write 2-5 paragraphs for each incident. Use supporting citations for each part.
· What has Hollywood portrayed wrong? Describe the scene and what is being shown. Make sure to state whether it is partially wrong or totally fictitious.
· How would you protect/secure against what they show (answers might include install firewall, load Antivirus etc.)
· Use APA formatting for your sources on everything.
· Make sure to put your name on assignment.
Big Data and Social Media
Colgate Palmolive
Agenda Of socail media use
Buisness intellegence and Social media concenpts
Intellegent organization
Data Anaylysis and Data trustworthiness
Conclusion
Buisness intellegence and Social media concenpts
No-Hassle Documentation
Gain Trusted Followers
Spy on Competition
Learn Customer Demographics
Research and Analyze Events
Advertise More Accurately
Intellegent organization
They consistently use (big) data proactively
They know exactly where they want to go: all-round vision
They continuously discuss business matters: alignment
They talk to each other regarding positive and negative performance
They know their customers through and through
They think and work in an agile way
Data Anaylysis and Data trustworthiness
Data completeness and accuracy
Data credibility
Data consistency
Data processing and algorithms
Data Validity
Conclusion
How Colgate benefit from Big Data and Social Media
Social media increases sales and customers
Big data shows popular trends and popular companies
All around they are both beneficial
Big Data can find trends that can benefit you greatly
Criteria
Title Page:
Name, Contact info, title of Presentation
Slide 1
Adenda : Topic you going to cover in order
Slide 2
Discuss how big data, social media concepts and knowledge to successfully create business intellegence (Support your bullets points with data, analysis, charts)
Slide 3
Describe how big data can be used to build an intelligent organization
Slide 4
Discuss the importance of data source trustworthiness and data analysis
Slide 5
Conclusion
Slide 6
Big Data And Business Intelligence
Business Value With Big Data
For business to survive in a competitive environment, organizational change requires improved governance, sponsorship, processes, and controls, in addition to new skill sets and technology all work in harmony to deliver the benefits of big data. See Fig. 13.2
Data science has taken the business world by storm. Every field of study and area of business has been affected as companies realize the value of the incredible quantities of data being generated. But to extract value from those data, one needs to be trained in the proper data science skills. The R programming language has become the de fac to programming language for data science. Its flexibility, power, sophistication, and expressiveness have ma ...
Exploring new mobile and cloud platforms without a governance .docxssuser454af01
The document discusses the challenges and risks of adopting new mobile and cloud technologies without proper governance strategies. It provides examples of companies that experienced negative consequences due to a lack of oversight and planning around BYOD policies and unsupported devices/applications. The author advocates for proactive governance that supports new tools while managing risks through well-defined protocols and transparency between IT and users.
19Question 1 4 4 pointsLO5 What is a packetQu.docxaulasnilda
This document contains a series of questions and feedback from an exam on information systems topics. It discusses firewalls, databases, authentication methods, competitive advantages, software quality triangles, trademarks, and other concepts. The feedback indicates some multiple choice questions were missed and to review for the next test.
A strategy for security data analytics - SIRACon 2016Jon Hawes
A snag list for 'things that can go wrong' with big data analytics initiatives in security, and ways to think about the problem space to avoid that happening.
Data observability is a collection of technologies and activities that allows data science teams to prevent problems from becoming severe business issues.
The document discusses how CFOs can better evaluate IT departments and technology investments. It recommends that CFOs focus on three key areas: communication between finance and IT, governance over technology spending and projects, and assessing IT capabilities and risks.
For communication, the document suggests CFOs and CIOs develop a shared language focused on business value and processes rather than just technology. For governance, it advises implementing two levels - one for strategic initiatives and one for individual projects. And for assessment, it provides questions for CFOs to ask about information quality, technology capabilities, reliability, and risks. Taking steps in these three areas can help CFOs optimize IT spending and prepare their companies technologically for the future
1) The document discusses speed bumps that companies face in adopting advanced technology for document processing, including regulatory headaches, staying compliant, cost control, and meeting service level agreements. It also discusses technology speed bumps such as handling document volume and velocity, validation challenges, and issues with cloud vs. on-premise solutions.
2) It notes that while the technology for automated document processing has matured, many companies have been slow to adopt it due to rigid pricing models from some vendors and challenges upgrading legacy software.
3) However, it suggests the industry is at a critical juncture and companies that take a customer-centric approach with flexible, upgradeable technology will be best positioned to automate document processing and
Driving Digital Supply Chain Transformation - A Handbook - 23 MAY 2017Lora Cecere
Insights on driving a digital transformation based on research on new technologies, advisory work with clients, and quantitative research projects. This is a short handbook to help companies get started on their journey to define the digital supply chain.
Hiring the best data entry professionals from a top manpower outsourcing company can be of great benefits for you and your business.
Hiring expert data entry operators will let you focus solely on your business rather than juggling other obligations. The great team of people who have the knowledge and experience ensure the accuracy of all data entry services.
Data Analytics in Industry Verticals, Data Analytics Lifecycle, Challenges of...Sahilakhurana
Banking and securities
Challenges
Early warning for securities fraud and trade visibilities
Card fraud detection and audit trails
Enterprise credit risk reporting
Customer data transformation and analytics.
The Security Exchange commission (SEC) is using big data to monitor financial market activity by using network analytics and natural language processing. This helps to catch illegal trading activity in the financial markets.
The Data Analytics Lifecycle is designed specifically for Big Data problems and data science projects. The lifecycle has six phases, and project work can occur in several phases at once. For most phases in the lifecycle, the movement can be either forward or backward. This iterative depiction of the lifecycle is intended to more closely portray a real project, in which aspects of the project move forward and may return to earlier stages as new information is uncovered and team members learn more about various stages of the project. This enables participants to move iteratively through the process and drive toward operationalizing the project work.
Phase 1—Discovery: In Phase 1, the team learns the business domain, including relevant history such as whether the organization or business unit has attempted similar projects in the past from which they can learn. The team assesses the resources available to support the project in terms of people, technology, time, and data. Important activities in this phase include framing the business problem as an analytics challenge that can be addressed in subsequent phases and formulating initial hypotheses (IHs) to test and begin learning the data.
Phase 2—Data preparation: Phase 2 requires the presence of an analytic sandbox, in which the team can work with data and perform analytics for the duration of the project. The team needs to execute extract, load, and transform (ELT) or extract, transform and load (ETL) to get data into the sandbox. The ELT and ETL are sometimes abbreviated as ETLT. Data should be transformed in the ETLT process so the team can work with it and analyze it. In this phase, the team also needs to familiarize itself with the data thoroughly and take steps to condition the data.
This document discusses the importance of HIPAA compliance for organizations serving people with disabilities and the challenges of maintaining compliance with outdated software systems. It notes that HIPAA violations and complaints have increased in recent years. A cloud-based integrated software system can help organizations address HIPAA requirements by securely centralizing information access and updating automatically. The document recommends Intuition by Vertex as a comprehensive software solution designed for rehabilitation organizations that handles administrative tasks and ensures HIPAA compliance.
Assignment 1 Your Mobile Ordering Project team needs to provide a sdesteinbrook
Assignment 1: Your Mobile Ordering Project team needs to provide a summary of your analysis findings to IT leadership and other department stakeholders. You are assigned to create the material.
Create
material that:
Illustrates the current information system, capturing data from in-store ordering and payment processes
Illustrates the new information system that will be needed when mobile ordering and payment is added
Identifies potential uses of cloud computing technology, including the names of the cloud type and likely cloud computing services
Identifies the security vulnerabilities the new information system could create
Provides recommendations to mitigate security concerns
Identifies any ethical and privacy concerns leadership should review.
Use
diagrams to illustrate the information systems mentioned above. You can create diagrams using Microsoft® Word, Microsoft® PowerPoint®, Microsoft® Visio® or another program of your choice.
Document
your summary using one of the following options:
A 1- to 2-page Microsoft® Word document
A 10- to 12-slide Microsoft® PowerPoint® presentation with detailed speaker notes
Assignment 2: You are an IT specialist in a private medical practice. The practice includes three physical locations and 35 employees.
Until now, all IT technology and software was fully managed on-premises. However, the practice is adding another location and five new employees. Your organization will need additional storage capacity and computing power.
You have been asked to investigate available cloud computing options and recommend whether cloud computing should be used or whether the investment should be made to maintain 100% internal IT management.
Compare
and
contrast
the use of Infrastructure as a Service (IaaS) to internal investment.
Include
the following in your comparison:
The benefits of each
The risks or vulnerabilities of each
Additional information you would need to make a recommendation for which would most align with the organization's needs
Document
your summary using one of the following options:
A 1- to 2-page Microsoft® Word chart or table
A 6- to 8-slide Microsoft® PowerPoint® presentation with detailed speaker notes
A Microsoft® Excel® spreadsheet formatted for printing
Submit
your assignment to the Assignment Files tab
Discussion Questions 8 total:
1. Message expanded.Message read ERP
posted by G MARTINEZ , May 16, 2018, 12:14 PM
This chapter talks about ERP systems. It gives a good overview as to what an ERP is, its benefits and limitations and the different methods onimplementing an ERP system. One ERP system that I am familiar with is Microsoft's Dynamic NAV. We implemented this system into the company to connect our sales, purchasing, operations, accounting, inventory and HR. We used a combination of all 3 strategic approaches: vanilla, custom, and best of breed approach. We used most of the default modules that came with Navision. We also built some custom t ...
Using IoT technology can help smaller FMCG companies implement Lean manufacturing principles by addressing common challenges they face. Specifically, IoT solutions can help companies collect and consolidate operational data from packaging lines in a cost-effective way, providing insights into unplanned downtime that impede productivity. Wireless connectivity and cloud-based data platforms have become affordable options to monitor equipment from multiple vendors. This data allows managers to identify improvement opportunities, boosting output while reducing wasteful costs from downtime by thousands of dollars per line each month.
Do you think data warehousing has failed in the market?? If yes, what are the possible causes?
What is the current status of data warehousing tools? What is required in future upcoming tools?
What is the difference between creation of data warehouse and implementation of MDDB?
AIIM Nuxeo Webinar: Modern Problems Require Modern SolutionsNuxeo
John Mancini from AIIM explores why the key information management challenges faced by a modern organization require a new modern set of solutions to solve them.
Information systems are combinations of hardware, software, and networks that organizations use to collect, create, and distribute useful data. They help process and manage data, monitor performance, and support decision making. Information systems are central to virtually every organization and provide users with resources like Wi-Fi networks, database search services, and printers. They consist of hardware, software, data, knowledge, and telecommunications components. Data analytics uses specialized software to explore large volumes of data and provide insights to help organizations grow customer bases, improve efficiency, and detect fraud. Common types of information systems include transaction processing, office automation, decision support, and executive information systems.
Similar to Tackling the ticking time bomb – Data Migration and the hidden risks (20)
Introduction of Cybersecurity with OSS at Code Europe 2024Hiroshi SHIBATA
I develop the Ruby programming language, RubyGems, and Bundler, which are package managers for Ruby. Today, I will introduce how to enhance the security of your application using open-source software (OSS) examples from Ruby and RubyGems.
The first topic is CVE (Common Vulnerabilities and Exposures). I have published CVEs many times. But what exactly is a CVE? I'll provide a basic understanding of CVEs and explain how to detect and handle vulnerabilities in OSS.
Next, let's discuss package managers. Package managers play a critical role in the OSS ecosystem. I'll explain how to manage library dependencies in your application.
I'll share insights into how the Ruby and RubyGems core team works to keep our ecosystem safe. By the end of this talk, you'll have a better understanding of how to safeguard your code.
zkStudyClub - LatticeFold: A Lattice-based Folding Scheme and its Application...Alex Pruden
Folding is a recent technique for building efficient recursive SNARKs. Several elegant folding protocols have been proposed, such as Nova, Supernova, Hypernova, Protostar, and others. However, all of them rely on an additively homomorphic commitment scheme based on discrete log, and are therefore not post-quantum secure. In this work we present LatticeFold, the first lattice-based folding protocol based on the Module SIS problem. This folding protocol naturally leads to an efficient recursive lattice-based SNARK and an efficient PCD scheme. LatticeFold supports folding low-degree relations, such as R1CS, as well as high-degree relations, such as CCS. The key challenge is to construct a secure folding protocol that works with the Ajtai commitment scheme. The difficulty, is ensuring that extracted witnesses are low norm through many rounds of folding. We present a novel technique using the sumcheck protocol to ensure that extracted witnesses are always low norm no matter how many rounds of folding are used. Our evaluation of the final proof system suggests that it is as performant as Hypernova, while providing post-quantum security.
Paper Link: https://eprint.iacr.org/2024/257
This presentation provides valuable insights into effective cost-saving techniques on AWS. Learn how to optimize your AWS resources by rightsizing, increasing elasticity, picking the right storage class, and choosing the best pricing model. Additionally, discover essential governance mechanisms to ensure continuous cost efficiency. Whether you are new to AWS or an experienced user, this presentation provides clear and practical tips to help you reduce your cloud costs and get the most out of your budget.
Driving Business Innovation: Latest Generative AI Advancements & Success StorySafe Software
Are you ready to revolutionize how you handle data? Join us for a webinar where we’ll bring you up to speed with the latest advancements in Generative AI technology and discover how leveraging FME with tools from giants like Google Gemini, Amazon, and Microsoft OpenAI can supercharge your workflow efficiency.
During the hour, we’ll take you through:
Guest Speaker Segment with Hannah Barrington: Dive into the world of dynamic real estate marketing with Hannah, the Marketing Manager at Workspace Group. Hear firsthand how their team generates engaging descriptions for thousands of office units by integrating diverse data sources—from PDF floorplans to web pages—using FME transformers, like OpenAIVisionConnector and AnthropicVisionConnector. This use case will show you how GenAI can streamline content creation for marketing across the board.
Ollama Use Case: Learn how Scenario Specialist Dmitri Bagh has utilized Ollama within FME to input data, create custom models, and enhance security protocols. This segment will include demos to illustrate the full capabilities of FME in AI-driven processes.
Custom AI Models: Discover how to leverage FME to build personalized AI models using your data. Whether it’s populating a model with local data for added security or integrating public AI tools, find out how FME facilitates a versatile and secure approach to AI.
We’ll wrap up with a live Q&A session where you can engage with our experts on your specific use cases, and learn more about optimizing your data workflows with AI.
This webinar is ideal for professionals seeking to harness the power of AI within their data management systems while ensuring high levels of customization and security. Whether you're a novice or an expert, gain actionable insights and strategies to elevate your data processes. Join us to see how FME and AI can revolutionize how you work with data!
Main news related to the CCS TSI 2023 (2023/1695)Jakub Marek
An English 🇬🇧 translation of a presentation to the speech I gave about the main changes brought by CCS TSI 2023 at the biggest Czech conference on Communications and signalling systems on Railways, which was held in Clarion Hotel Olomouc from 7th to 9th November 2023 (konferenceszt.cz). Attended by around 500 participants and 200 on-line followers.
The original Czech 🇨🇿 version of the presentation can be found here: https://www.slideshare.net/slideshow/hlavni-novinky-souvisejici-s-ccs-tsi-2023-2023-1695/269688092 .
The videorecording (in Czech) from the presentation is available here: https://youtu.be/WzjJWm4IyPk?si=SImb06tuXGb30BEH .
For the full video of this presentation, please visit: https://www.edge-ai-vision.com/2024/06/temporal-event-neural-networks-a-more-efficient-alternative-to-the-transformer-a-presentation-from-brainchip/
Chris Jones, Director of Product Management at BrainChip , presents the “Temporal Event Neural Networks: A More Efficient Alternative to the Transformer” tutorial at the May 2024 Embedded Vision Summit.
The expansion of AI services necessitates enhanced computational capabilities on edge devices. Temporal Event Neural Networks (TENNs), developed by BrainChip, represent a novel and highly efficient state-space network. TENNs demonstrate exceptional proficiency in handling multi-dimensional streaming data, facilitating advancements in object detection, action recognition, speech enhancement and language model/sequence generation. Through the utilization of polynomial-based continuous convolutions, TENNs streamline models, expedite training processes and significantly diminish memory requirements, achieving notable reductions of up to 50x in parameters and 5,000x in energy consumption compared to prevailing methodologies like transformers.
Integration with BrainChip’s Akida neuromorphic hardware IP further enhances TENNs’ capabilities, enabling the realization of highly capable, portable and passively cooled edge devices. This presentation delves into the technical innovations underlying TENNs, presents real-world benchmarks, and elucidates how this cutting-edge approach is positioned to revolutionize edge AI across diverse applications.
Best 20 SEO Techniques To Improve Website Visibility In SERPPixlogix Infotech
Boost your website's visibility with proven SEO techniques! Our latest blog dives into essential strategies to enhance your online presence, increase traffic, and rank higher on search engines. From keyword optimization to quality content creation, learn how to make your site stand out in the crowded digital landscape. Discover actionable tips and expert insights to elevate your SEO game.
Dandelion Hashtable: beyond billion requests per second on a commodity serverAntonios Katsarakis
This slide deck presents DLHT, a concurrent in-memory hashtable. Despite efforts to optimize hashtables, that go as far as sacrificing core functionality, state-of-the-art designs still incur multiple memory accesses per request and block request processing in three cases. First, most hashtables block while waiting for data to be retrieved from memory. Second, open-addressing designs, which represent the current state-of-the-art, either cannot free index slots on deletes or must block all requests to do so. Third, index resizes block every request until all objects are copied to the new index. Defying folklore wisdom, DLHT forgoes open-addressing and adopts a fully-featured and memory-aware closed-addressing design based on bounded cache-line-chaining. This design offers lock-free index operations and deletes that free slots instantly, (2) completes most requests with a single memory access, (3) utilizes software prefetching to hide memory latencies, and (4) employs a novel non-blocking and parallel resizing. In a commodity server and a memory-resident workload, DLHT surpasses 1.6B requests per second and provides 3.5x (12x) the throughput of the state-of-the-art closed-addressing (open-addressing) resizable hashtable on Gets (Deletes).
Ivanti’s Patch Tuesday breakdown goes beyond patching your applications and brings you the intelligence and guidance needed to prioritize where to focus your attention first. Catch early analysis on our Ivanti blog, then join industry expert Chris Goettl for the Patch Tuesday Webinar Event. There we’ll do a deep dive into each of the bulletins and give guidance on the risks associated with the newly-identified vulnerabilities.
leewayhertz.com-AI in predictive maintenance Use cases technologies benefits ...alexjohnson7307
Predictive maintenance is a proactive approach that anticipates equipment failures before they happen. At the forefront of this innovative strategy is Artificial Intelligence (AI), which brings unprecedented precision and efficiency. AI in predictive maintenance is transforming industries by reducing downtime, minimizing costs, and enhancing productivity.
5th LF Energy Power Grid Model Meet-up SlidesDanBrown980551
5th Power Grid Model Meet-up
It is with great pleasure that we extend to you an invitation to the 5th Power Grid Model Meet-up, scheduled for 6th June 2024. This event will adopt a hybrid format, allowing participants to join us either through an online Mircosoft Teams session or in person at TU/e located at Den Dolech 2, Eindhoven, Netherlands. The meet-up will be hosted by Eindhoven University of Technology (TU/e), a research university specializing in engineering science & technology.
Power Grid Model
The global energy transition is placing new and unprecedented demands on Distribution System Operators (DSOs). Alongside upgrades to grid capacity, processes such as digitization, capacity optimization, and congestion management are becoming vital for delivering reliable services.
Power Grid Model is an open source project from Linux Foundation Energy and provides a calculation engine that is increasingly essential for DSOs. It offers a standards-based foundation enabling real-time power systems analysis, simulations of electrical power grids, and sophisticated what-if analysis. In addition, it enables in-depth studies and analysis of the electrical power grid’s behavior and performance. This comprehensive model incorporates essential factors such as power generation capacity, electrical losses, voltage levels, power flows, and system stability.
Power Grid Model is currently being applied in a wide variety of use cases, including grid planning, expansion, reliability, and congestion studies. It can also help in analyzing the impact of renewable energy integration, assessing the effects of disturbances or faults, and developing strategies for grid control and optimization.
What to expect
For the upcoming meetup we are organizing, we have an exciting lineup of activities planned:
-Insightful presentations covering two practical applications of the Power Grid Model.
-An update on the latest advancements in Power Grid -Model technology during the first and second quarters of 2024.
-An interactive brainstorming session to discuss and propose new feature requests.
-An opportunity to connect with fellow Power Grid Model enthusiasts and users.
GraphRAG for Life Science to increase LLM accuracyTomaz Bratanic
GraphRAG for life science domain, where you retriever information from biomedical knowledge graphs using LLMs to increase the accuracy and performance of generated answers
Freshworks Rethinks NoSQL for Rapid Scaling & Cost-EfficiencyScyllaDB
Freshworks creates AI-boosted business software that helps employees work more efficiently and effectively. Managing data across multiple RDBMS and NoSQL databases was already a challenge at their current scale. To prepare for 10X growth, they knew it was time to rethink their database strategy. Learn how they architected a solution that would simplify scaling while keeping costs under control.
Monitoring and Managing Anomaly Detection on OpenShift.pdfTosin Akinosho
Monitoring and Managing Anomaly Detection on OpenShift
Overview
Dive into the world of anomaly detection on edge devices with our comprehensive hands-on tutorial. This SlideShare presentation will guide you through the entire process, from data collection and model training to edge deployment and real-time monitoring. Perfect for those looking to implement robust anomaly detection systems on resource-constrained IoT/edge devices.
Key Topics Covered
1. Introduction to Anomaly Detection
- Understand the fundamentals of anomaly detection and its importance in identifying unusual behavior or failures in systems.
2. Understanding Edge (IoT)
- Learn about edge computing and IoT, and how they enable real-time data processing and decision-making at the source.
3. What is ArgoCD?
- Discover ArgoCD, a declarative, GitOps continuous delivery tool for Kubernetes, and its role in deploying applications on edge devices.
4. Deployment Using ArgoCD for Edge Devices
- Step-by-step guide on deploying anomaly detection models on edge devices using ArgoCD.
5. Introduction to Apache Kafka and S3
- Explore Apache Kafka for real-time data streaming and Amazon S3 for scalable storage solutions.
6. Viewing Kafka Messages in the Data Lake
- Learn how to view and analyze Kafka messages stored in a data lake for better insights.
7. What is Prometheus?
- Get to know Prometheus, an open-source monitoring and alerting toolkit, and its application in monitoring edge devices.
8. Monitoring Application Metrics with Prometheus
- Detailed instructions on setting up Prometheus to monitor the performance and health of your anomaly detection system.
9. What is Camel K?
- Introduction to Camel K, a lightweight integration framework built on Apache Camel, designed for Kubernetes.
10. Configuring Camel K Integrations for Data Pipelines
- Learn how to configure Camel K for seamless data pipeline integrations in your anomaly detection workflow.
11. What is a Jupyter Notebook?
- Overview of Jupyter Notebooks, an open-source web application for creating and sharing documents with live code, equations, visualizations, and narrative text.
12. Jupyter Notebooks with Code Examples
- Hands-on examples and code snippets in Jupyter Notebooks to help you implement and test anomaly detection models.
Tackling the ticking time bomb – Data Migration and the hidden risks
An Experian white paper
Understand your data:
Understand your risks
Tackling the ticking time bomb
– Data Migration and the hidden risks
Written by Johny Morris.
Table of contents
1. Introduction
2. What kinds of projects?
3. Why so many bad news stories?
4. Common errors
4.1. Underestimating the scale of data issues in the legacy system(s)
4.2. Overestimating the ability of the technologists to fix the problems unaided
4.3. Poor prioritisation and management of data issues
4.4. Misunderstanding what you signed up for
5. Defusing the bomb
5.1. Landscape analysis and profiling
5.2. Early business stakeholder engagement
5.3. Data quality management processes
5.4. A proper project charter
6. Appropriate Software choices
6.1. Profiling software
6.2. Data quality software
6.3. Mapping software
6.4. ETL software
6.5. Why buy – can’t I just use existing coding resource to do all this?
7. Appropriate methodology
7.1. Once in a business lifetime event
8. The benefits of a well conducted data migration project
8.1. Early wins
8.2. Data governance
8.3. Health warning
9. Learning points
1. Introduction
Data migration is intrinsic to most big IT projects and yet
as an industry we have a poor record of tackling it.
If you are faced with responsibility for an IT project where
there will be some element of data migration then this
white paper written by industry expert Johny Morris will
help guide you past the pitfalls that await the unwary and
even how to add value as a consequence of performing
this necessary task.
About the author
Johny Morris has more than 25 years’ experience in Information Technology,
spending the last 15 years consulting exclusively on Data Migration
assignments for the likes of the BBC, British Telecommunications plc, and
Barclays Bank. Johny is the author of the Practical Data Migration (PDM)
method and book of the same name. Johny co-founded the specialist data
migration consultancy, Iergo Ltd (www.iergo.com) and is on the expert panel
of Data Migration Pro.
2. What kinds of projects?
Data migration is the moving of data from our old computer systems to new ones, but it is not an activity we perform in its own right. However, if you are about to embark on:
A technology refresh
A merger or acquisition
A de-merger or buyout
Outsourcing or in-sourcing
System replacement due to end of support
Regulatory requirement to produce a consistent view of customer
Then you are probably going to be involved with a data
migration.
3. Why so many bad news stories?
How can data migration be
so hard? We are all expert
enough to download huge video
files from the internet. We all
routinely copy files to our data
sticks and pass them around.
The more adventurous of us may even synch
our calendars on phone and desktop. So why is
it that with all the technical competence at our
disposal we routinely fail to move data that has
been sitting happily in our old systems?
And the figures are appalling. If you go it
alone, without appropriate software support,
training or external methodologies, the chances
are greater that you will fail than that you will
succeed (source: Data Migration, Philip Howard,
August 2011, Bloor Research).
5. Experian QAS
5 - Tackling the ticking time bomb – Data Migration and the hidden risks
4. Common errors
Data migrations go wrong for four principal reasons:
Underestimating the scale of data issues in the
legacy system(s)
Overestimating the ability of the technologists to fix
the problems unaided
Poor prioritisation and management of data issues
Misunderstanding what you signed up for
You will see that “Mapping problems” is not on the list. Although we sometimes hear of mapping being a problem, on closer inspection it is usually because of problems in other areas – like assuming that a legacy field is populated with consistent values and then finding out that half the values are missing.
4.1 Underestimating the scale of data issues
We get to trust our existing systems to the
extent that they become almost invisible
to us. And yet under the cover of that old
familiar skin unknown numbers of errors
lurk. Looking into old systems is like being
on an archaeological dig. Each layer reveals
a new set of issues. There are long forgotten
system crashes, misapplied patches, changes
to validation, company re-organisations,
botched updates, long forgotten initiatives,
poor end user training and (my favourite)
end-user inventiveness that populates under-used fields with user-determined and undocumented values. These problems are
compounded if our data migration is driven by
a merger or acquisition where even the limited
knowledge we have of our own systems is
missing.
The scale of the issues facing us is normally a
complete unknown when we start our project.
Will we be pleasantly surprised or horrified at what we find when we peel back the covers?
Because we routinely use these systems our
tendency is to underestimate the time and
effort that will be involved.
4.2 Overestimating the ability of the technologists to fix the
problem unaided
Most of the data problems you find are quite
within the competence of your technologists
to fix. When it comes to dates in the wrong
format, names and addresses that need to be rearranged according to the post office address format, or the proper representation for phone
numbers, our programmers and analysts are
in their element. But, and it is a big but, there
are always some residual problems that they
cannot fix or to put it another way, they cannot
fix without help from the business.
Let us look at a real life example. A supplier
in a heavily regulated industry was halfway through their migration to a new, all-embracing
computer system. The commercial customer
migration had been successful but when they
started on the domestic customers, something like 20% of them had commercial tariffs. From a technical perspective this is
simple. Get a list of the erroneous tariffs,
cross reference it to the list of correct
domestic tariffs and in the data migration
replace one with the other.
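Purely as an illustration of how small the technical half of that fix is (the white paper does not prescribe any implementation, and every file name, column name and tariff code below is hypothetical), the substitution could be sketched like this:

    # Illustrative sketch only: swap erroneous commercial tariffs on domestic
    # accounts for the agreed domestic equivalents during transformation.
    # File names, column names and tariff codes are all hypothetical.
    import pandas as pd

    # Cross-reference agreed with the business: wrong tariff -> correct tariff
    tariff_corrections = {
        "COM-STD": "DOM-STD",
        "COM-ECO": "DOM-ECO",
    }

    accounts = pd.read_csv("legacy_accounts.csv")          # hypothetical extract
    is_domestic = accounts["customer_type"] == "DOMESTIC"

    # Substitute only where a correction exists; leave everything else untouched
    accounts.loc[is_domestic, "tariff_code"] = (
        accounts.loc[is_domestic, "tariff_code"].replace(tariff_corrections)
    )

    accounts.to_csv("staging_accounts.csv", index=False)   # ready for the load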
From a technical point of view it may be simple
but from a business perspective things were
a lot more difficult. There were over payments
and under payments. There were differences
in sales tax on commercial and domestic
tariffs. There were potential regulatory
problems. There were PR problems. There
was the potential for the call centre being
swamped if 20% of the customers phoned in.
All of these issues and more took months of
interdepartmental meetings to resolve and
were certainly above the pay-grade of the data
analyst in the ITC department who discovered
the underlying problem.
This was an exceptional case but even where
you do not find such mammoth problems you
should still plan to find some problems that
just have to go back to the business to resolve
and unfortunately the business side is just not
equipped to turn these around quickly on top
of their normal workload.
4.3 Poor prioritisation and management of data issues
If there is a necessary time lag on the
business side getting questions answered,
this will be exacerbated by failing to have in
place arrangements for managing data quality
issues or, even worse, failure to pre-warn our business colleagues that these data issues are coming their way.
The worst that can happen is that the dreaded responsibility gap opens up. This is where the
technical side, having exhausted what they
can do, push a large pile of data items at the
business side, who are overwhelmed, under-prepared and under-supported. They push
back. Between the two sides a gap develops
into which many a data migration project has
plunged.
We need to arm both sides with the
appropriate tools and structures which will
ready them for the data preparation task and
unite them into one big virtual team.
Those of us that have been involved in data
migrations before will be aware that we will
uncover more data issues than we can fix in
the time available before go live. We will need
to prioritise but we need to do it in a way that
takes account of the business as well as the
technical drivers. Left to their own devices it
is all too easy for the technologists to make
erroneous assumptions about what is most
important.
4.4 Misunderstanding what you signed up for
Just as we have to build a common team it is
also essential that all the parties know what
is expected and, just as importantly, what is
not expected of them. In most medium to
large scale data migrations there will be a
systems implementer (SI) or a vendor as well
as the client’s own IT competency carrying out
different aspects of the migration. There may
even be multiple teams from multiple vendors.
It is vital that we know what everyone is doing.
This may seem obvious but we could share
any number of case studies where the
purchaser has misunderstood the scope of
the implementer’s activity and vice versa.
There are a number of reasons for this but the
two most common are the rigidity of modern
procurement contracts and the impact of
underestimating the scale of data issues.
Underestimating the scale of data issues
we have already touched on and when this
is compounded by poor prioritisation and
overestimating the ability of technologists
to solve the problems unaided it’s easy to
see that we have a recipe for going into the
project with a misjudgement of what each
party can realistically do. The technologist
cannot fix all the problems unaided and the
ones they cannot fix tend to be the most
intractable and time consuming. This is the
first misunderstanding.
The second is the unforeseen consequence
of tighter and tighter procurement contracts
drawn up by our buyers. The history of IT
projects is long and marked with some
spectacular failures. The learning from this
has been to draw up well-specified, output-based contracts where the supplier is tied
to delivering a working system at a given
point in time for a fixed price. In response the
suppliers have had to withdraw responsibility
for activities where they cannot predict the
outcome with sufficient exactness to be able
to price the risk of failure into the contract.
Experienced suppliers know that the issues
of data quality in legacy data are often
understated but for any one client they cannot
know by how much. They cannot therefore
quantify the effort needed to fix the issues.
The result of these two forces – the purchaser
looking for certainty in the contract and the
supplier looking to remove unquantifiable risk
– has been contracts that focus more on the “final mile”. In other words, the responsibility
for discovering and fixing legacy data issues
is a client side one. Classically the supplier
will present a template to the client for them
to source or a staging database for them to
populate.
The end result is that the supplier will do data migration – but only of data that matches the staging database they provide. Anything else will be rejected and it will be up to the client to get it fixed and re-submitted within a timeframe that will not compromise the contracted end date.
So the responsibility for selecting data sources, profiling and performing some initial transformations lies with the client – but often, dazzled by the promise that the new system brings, this is overlooked or underestimated.
There are many variants to this scenario. Sometimes the supplier will perform profiling of suggested sources and report potential errors back to the client. Sometimes the supplier will develop the target but expect the client to understand how to generate the staging database. Sometimes the supplier will perform extended transformations of data so that it will load, but just as often we have seen cases where the contract and the supplier will refuse responsibility to perform any data enhancement. We have even seen this extended to include defaulting in missing values, but that is not the norm. More often than not the supplier will use their experience to willingly go beyond the contract to help the client select and prepare data for loading, but in almost all cases the ultimate responsibility for the data selection and preparation tasks invariably falls back on the client.
To be fair to the suppliers, if you are migrating to a complex product (like SAP, for instance) then that final mile can be a tortuous one that requires considerable expertise, so there is still plenty for them to do.
Altogether, underestimating the scale of data issues, overestimating the ability of technology to fix the problem, poor prioritisation and management of data issues and mutual misunderstandings leave a time bomb ticking away in the heart of our biggest projects. This set of inter-related problems explodes when we try to load data in earnest, blowing apart our project plans, compromising our delivery date and severely curtailing our return on investment.
5. Defusing the time bomb
Just as there are four major sources of trouble when it
comes to data migration, there are also four key steps
that should be taken to alleviate them:
Landscape analysis and profiling
Early business stakeholder engagement
Data quality management processes
A proper project charter
5.1 Landscape analysis and profiling
Data migration is the building of a bridge
from old systems to new over which data
flows. When engineers are building a bridge
they pay just as much attention to each end.
Classically within IT projects we just have not
done that. We have been so mesmerised by
the promise and novelty of the new that we
have neglected the old. Our focus is on buying
in expertise of the target not on rediscovering
the legacy. But our legacy may represent
twenty or more years of effort. Some well directed, some not. We cannot know, without
looking, what all those keystrokes have
contributed to the quality of the data in there.
We can anticipate from experience that there
will be all sorts of dubious data to deal with.
If you are dealing with anything but the very
smallest of legacy data sets then we should
also know that manual efforts just will not be
good enough. To review the millions of entries
in large data stores across many years we
need tools capable of doing the job. This is
even more the case when we have to combine
data from multiple sources. How well will it
fit together both at the model level and at the
detailed, customer by customer and account
by account level? How do we de-duplicate a
million customers?
These tasks gain extra degrees of complexity
when we are dealing with a merger or de-merger driven data migration. Here we
may not even have access to subject matter
experts (SME) who know the data.
For these jobs specialist data profiling tools
have been created. These reduce weeks or
months of potentially hand coded scripting
to the work of hours or days. Specifically
designed to look for rules hidden in the data
they direct our effort to the anomalies – the
fields that are always supposed to have a
unique value but which are empty or contain
duplicates, curious data patterns in fields that
are not supposed to be in use and so on.
The research cited above suggests that without access to a built-for-purpose profiling tool you have a 45% chance of missing your go-live date.
With a tool, and the good practices that go
with it, you will have the knowledge that will
allow you to properly scale and plan the
remainder of the migration.
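To make the de-duplication question raised above a little more concrete, here is a deliberately tiny sketch of the kind of fuzzy matching a purpose-built tool automates at scale. The records, the threshold and the matching rule are invented for illustration; real tools add blocking, scoring and survivorship logic on top of this.

    # Simplified illustration of fuzzy customer de-duplication on hypothetical data.
    from difflib import SequenceMatcher

    customers = [
        {"id": 1, "name": "J. Smith",   "postcode": "SW1A 1AA"},
        {"id": 2, "name": "John Smith", "postcode": "SW1A1AA"},
        {"id": 3, "name": "Jane Doe",   "postcode": "M1 2AB"},
    ]

    def normalise(c):
        return c["name"].lower().replace(".", ""), c["postcode"].replace(" ", "")

    def similarity(a, b):
        name_a, post_a = normalise(a)
        name_b, post_b = normalise(b)
        # Only compare records within the same postcode "block" to keep the work manageable
        if post_a != post_b:
            return 0.0
        return SequenceMatcher(None, name_a, name_b).ratio()

    suspects = [
        (a["id"], b["id"])
        for i, a in enumerate(customers)
        for b in customers[i + 1:]
        if similarity(a, b) > 0.6
    ]
    print(suspects)   # [(1, 2)] - candidate duplicates for a business decision, not an automatic merge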
5.2 Early business stakeholder engagement
The profiling and landscape analysis phases
will generate an awfully large number of
questions and potential issues to deal with.
As we saw earlier, the majority of these can
be dealt with by the technologists but there
will be some, usually the most intractable, that
will need to be referred back to the business
to be solved. We know we are going to have to
do this, so it is important that we seek out and
bring into our processes the subject matter
experts (SME) that we are going to need as
soon as possible. It is only human nature but if
you only seek help when you have a potentially project-wrecking problem on your hands and time is running short, don’t be surprised if you
encounter reluctance to engage. It therefore
makes sense to start profiling early when
there is less time pressure and to have a data
quality management process in place before
the big issues appear.
5.3 Establish data quality management processes
Within our favoured methodology (Practical
Data Migration or PDM) one of our earliest
steps is to establish a board, composed jointly of technologists and SMEs, to look together at known problems that will impact the migration.
To perform this function properly you need access to modern data quality software. This
allows side by side working where data can
be examined swiftly and rules created and
tested on live data by a process of joint SME –
technologist engagement.
Often the solution to data quality issues
lies with the business correcting errors in
the source data. Where this is the case it is
important that the data quality rules can be
re-run efficiently to track that you are on target
to get your data right by the go live date. And
of course, it is also true that often whatever is
causing the bad data in the first place may not
have been fixed (after all why fix something
when the replacement is only months away?)
Built for purpose data quality software allows
you to easily monitor that the source data
once cleaned up is not degrading again and to
take remedial action where it is.
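As a hedged sketch of what re-running the rules can look like in practice (the rule, the field names and the extract are all invented for illustration; a dedicated data quality tool would manage a whole library of such rules and their history):

    # Illustration only: a data quality rule expressed so it can be re-run against
    # each fresh extract and the trend tracked in the run-up to go-live.
    import pandas as pd
    from datetime import date

    def rule_missing_date_of_birth(df):
        """Customers must have a date of birth before they can be migrated."""
        failures = df[df["date_of_birth"].isna()]
        return len(failures), failures["customer_id"].tolist()

    extract = pd.read_csv("legacy_customers.csv")   # hypothetical weekly extract
    count, offenders = rule_missing_date_of_birth(extract)
    print(f"{date.today()}: {count} records still fail 'missing date of birth'")
    # Recording the count per run shows whether the cleanse is on track for the
    # go-live date and whether previously corrected data is degrading again.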
5.4 A proper project charter
As with many journeys the first steps are the most important. There needs to be a clear
understanding of the responsibility boundaries of each of the participants. It needs to be shown
that there are no gaps. Here are some specimen questions:
Who is responsible for the detailed design of the target including reference data
design (e.g. the code values for products, the detailed chart of accounts)? Often the
supplier is contracted only to deliver a working system but it is the responsibility of
the client to create this level of detail.
Who is responsible for retiring legacy systems? If we do not retire systems then
we are not performing a migration which by definition is the permanent movement
of data from one platform to another. Retirement can be logical not necessarily
physical. When an enterprise is demerged from its parent it may have to give up the
use of all of its existing systems but of course these carry on working to serve the
parent.
Who is responsible for the archive solution? In any but the simplest of migrations a
large amount of historical data that may be required for occasional processing (e.g.
tax records retained just in case there is an inspection) will not be loaded into the
target. Where is this data going to reside? Who is responsible for the design and
build of this archive?
Where is the transformation and enrichment going to be performed and by whom?
By the client between extracting data and placing it in the staging database? By the
supplier after the raw data is placed in the staging database? Some by the client
and some by the supplier and if so who decides where?
How are you going to manage the efficient passing of design changes from supplier
to data migration team, especially towards the end of the project when there can be a
swirl of late fixes falling out of user acceptance testing?
There is a helpful checklist of all the items to look for in
the book Practical Data Migration – 2nd Edition.
6. Appropriate Software choices
By now it should be clear that when it comes to data
migrations you will be faced with having to perform
significant amounts of activity, to a high degree of
accuracy, within a tight timeframe. It is therefore
imperative that you give the technologists the right tools
to do the job.
Generically there are four types of software. Each type
corresponds to a specific phase in the migration process. These
software types are:
Profiling software
Data quality software
Mapping software
ETL software
6.1 Profiling Software
We saw that we are often ignorant of the
exact state of the data in our legacy systems.
The first step therefore is to shine a light on
that uncertainty and discover what is really
going on in our legacy data stores. This is
what profiling tools do. Reading through
the source databases they rapidly produce
a set of reports on what they encounter. The
best of breed produce reports on scores of dimensions of data issues. They show us
where columns have mostly unique values.
They show us columns that have identifiable
patterns (e.g. bank phone numbers or credit
card numbers). They show us which columns
seem to have matching data with other
columns (i.e. where columns appear to be
related and so possibly a lookup table join).
They show us where a column has data types
inconsistent with its purpose (numbers in a
first name field). Altogether they give us a
rich picture of what we have in our legacy data.
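For readers who have never seen one, the sketch below mimics a very small slice of what such a profiling report covers – population, uniqueness and a simple pattern check per column. It is illustrative only; the file and column names are hypothetical and commercial profilers analyse scores of dimensions in a single pass.

    # Minimal, illustrative column profile; purpose-built tools report far more
    # (patterns, cross-column dependencies, candidate keys and so on).
    import pandas as pd

    df = pd.read_csv("legacy_table.csv")   # hypothetical source extract

    profile = pd.DataFrame({
        "populated_pct": df.notna().mean() * 100,          # how full is each column?
        "distinct_pct": df.nunique() / len(df) * 100,      # near 100% suggests a key
        "sample_values": [df[c].dropna().unique()[:3].tolist() for c in df.columns],
    }, index=df.columns)

    # Columns that should be text but contain all-digit values
    # (e.g. numbers sitting in a first-name field)
    suspect = [
        c for c in df.columns
        if df[c].dtype == object and df[c].astype(str).str.fullmatch(r"\d+").any()
    ]

    print(profile)
    print("Columns with unexpected all-digit values:", suspect)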
Our recommendation is to start profiling as
early as possible. Do not wait until the target
is fully defined. The target is likely to arrive
later than planned and subject to change right
up to go-live. In theory there is a risk that you
will try to fix problems that you ultimately do
not need to fix. In practice this is not the case.
There will be enough data issues that have to
be fixed, whatever the target will ultimately
look like, to be getting on with. On the other
hand, as we have seen, when it comes to
getting responses back from the business
these are necessarily slow. Move as much
activity up the time line as possible. Tools that
allow maximum collaboration between the
technologist and the expert business user are
best suited to this task. You need to be able to
review real data sitting side by side.
6.2 Data quality software
If profiling software discovers the rules in
legacy data (and their violation), data quality
software takes business rules and applies
them to legacy data. This is a task that
commences once the target starts becoming
clearer. In the best case the software allows
rules that are discovered during profiling to
be carried forward into the data quality phase.
For this kind of reuse to be effective you need
a single, integrated software set. Again, prototyping software that allows agile, rapid development of rules is what is needed here.
6.3 Mapping software
Mapping software is, naively, what we first
think about when we consider data migration.
Linking fields from source to target, enhancing
and transforming existing data en route is the
end point of our journey. As we have seen
however the bulk of our work should have
occurred prior to this. Once again software
that allows for a collaborative approach, where
real data can be seen transformed, is more
efficient and less risky than the traditional
forest of mapping spreadsheets.
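By way of illustration only, a source-to-target mapping boiled down to its essentials might look like the sketch below. The legacy field names and transformation rules are hypothetical; the point of collaborative mapping tools is precisely that such rules stay visible and testable against real data rather than being buried in spreadsheets.

    # Illustrative source-to-target mapping: each target field is paired with the
    # rule that derives it from the legacy record. All names are hypothetical.
    legacy_record = {"CUST_NM": "SMITH, JOHN", "TEL": "020 7946 0000", "DOB": "31/12/1970"}

    mapping = {
        "surname":    lambda r: r["CUST_NM"].split(",")[0].strip().title(),
        "first_name": lambda r: r["CUST_NM"].split(",")[1].strip().title(),
        "phone":      lambda r: "+44" + r["TEL"].replace(" ", "").lstrip("0"),
        "dob_iso":    lambda r: "-".join(reversed(r["DOB"].split("/"))),
    }

    target_record = {field: rule(legacy_record) for field, rule in mapping.items()}
    print(target_record)
    # {'surname': 'Smith', 'first_name': 'John',
    #  'phone': '+442079460000', 'dob_iso': '1970-12-31'}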
6.4 ETL software
This is the software for the final mile. Often
the target will have specific load software
optimised to load data into their environment.
With SAP it is the BAPI, IDOCS, LSMW
etc. These are often within the purview of
the supplier and mandated by the target.
Mapping, from a client perspective, is
therefore often only to the staging database to
be picked up by the ETL software.
6.5 Why buy – can’t I just use existing coding resource to do
all this?
It is true that everything profiling, data quality,
mapping and ETL software can do could be
done by writing programs in the traditional
way; however, the speed and sophistication of
the specialised tools could not be replicated
in anything like the time you will have available
to you. A modern profiling tool will perform
hundreds of checks in a single pass that it
would take a programmer months to replicate
and that is before we get on to more esoteric
features like fuzzy matching. The cost benefit
case is clear. If in doubt about what this is all about, ask for a demonstration; the results will be startling.
7. Appropriate methodology
As we all know technology is a wasted expense if we do
not know how to exploit it properly. With fast profiling and
data quality tools come great leaps in productivity but
only if they are used within an appropriate framework.
Obviously there needs to be training in the
application itself but also in utilising it in
a data migration setting. A profiling tool
on its own can create a library of metrics.
Every column in the source database will be
analysed in over a score of different ways.
How do we sort the wood from the trees?
When we have a bunch of problems refined,
how do we get answers and action out of our
hard pressed business colleagues? How do
we know that we have covered all the bases in
our data migration strategy?
7.1 Once in a business lifetime event
It is a simple fact that for most enterprises
data migration is not a business as
usual activity. Of course there are some
organisations – data bureaux, outsourcing companies, etc. – for whom data migration is
a normal activity. These companies need
to take a factory, production line approach
to data migration but for the rest of us, our
organisations may get involved in a major
migration once every fifteen or so years. The
chances are that the people who were involved
last time will have moved on (or are still so
scarred by the experience that they do not
want to get involved again). The technology
will certainly have moved on and yet data
migrations are difficult, complex things requiring their own special skill sets, so as well as appropriate technology you must have an appropriate methodology and access to trained, experienced staff.
8. The benefits of a well conducted
data migration project
Obviously the major benefit of a well conducted data
migration project is the delivery of a fully functional
software application that adds value to the enterprise
from the moment it is turned on, not a chaotic shambles for the first three months after go-live as data is found to be missing or badly structured. However, that is not an
end to it.
8.1 Early wins
The great thing about starting your profiling
and data quality activity early is that you
can publicly slay a few dragons that have
been hanging about for ages. If you are
implementing a new Customer Relationship
Management (CRM) package then you will
need to de-duplicate your customer list. This
is an activity that has probably been on a lot
of folks’ wish lists for years. So do it. Do it
early. Publicise the success. Not only do
you gain plaudits but you also build a lasting
partnership with the business.
8.2 Data governance
Many, probably most, organisations these
days have realised the value of information
and the insight it brings. Data is no longer the
lubricant that greases the wheels of profitable
production, it is part of the profit generating
product in its own right. Most forward looking
organisations have instituted some form of
data governance function to try to get a grip
on this but how successfully? As a number
of recent industry reports (backed up by our
experience) show there are an awful lot of
data governance initiatives out there that have
started out boldly enough, only to collapse
in a frustration born out of the natural clash
between the structural goals of good data governance and the short-term
drivers of everyday business. We may wish to
end the scrap and rework cycle but it just never
seems to be the right time. To rephrase St.
Augustine of Hippo’s famous aphorism: “Make me make my data good, Lord, but not yet”.
Increasingly what is being urged is that in
place of the grand design to fix everything,
look for the tactical opportunity to embed
good data practices by using data governance
principles to add value more immediately, if more locally. And a good data migration
experience is the ideal place. Procrastination
is not an option when there is the compelling
event of a data migration looming. You will
have to deploy all (well most) of the tools of
the data governance trade to succeed. You
will be building your internal case study of
the benefits of a systematic approach to data
centric activity. You will have supporters in the
business and, if you use a methodology like
Practical Data Migration (PDMv2), you will
have solved otherwise perplexing issues like
who is the data owner.
You need to consciously think through the
implications of linking data migration to your
nascent data governance activity. It may alter
your software tool selection for instance. You
will need tools that are up to the longer term
task. On the other hand it may give you reason
to bring in tools in the first place, possibly on a
lease basis, to establish their value.
8.3 Health warning
Better data governance and slaying a few
data quality dragons are excellent things in
their own right but they must not be allowed to
interfere with your raison d’être – shifting data
of an appropriate quality to the right place at
the right time. If you fail to hit the go-live date
then your longer term goals may also suffer
a fatal blow. Expect, given the time frame, to
uncover more issues than you can solve. Make
this a fact from the outset and use it to justify
the ongoing data governance activity that
should necessarily follow.