This webinar originally aired on Tuesday, October 9th, 2012. It is part of Data Blueprint's ongoing webinar series on data management with Dr. Aiken.
Sign up for future sessions at http://www.datablueprint.com/webinar-schedule.
Abstract:
This presentation provides guidance to organizations considering or preparing for data quality initiatives. We will illustrate how organizations with chronic business challenges can often trace the root of the problem to poor data quality. Showing how data quality can be engineered provides a useful framework in which to develop an organizational approach. This in turn will allow organizations to more quickly identify data problems caused by structural issues versus practice-oriented defects. Participants will also learn the importance of practicing data quality engineering quantification.
Organizations must realize what it means to utilize data quality management in support of business strategy. This webinar will illustrate how organizations with chronic business challenges can often trace the root of the problem to poor data quality. Showing how data quality should be engineered provides a useful framework in which to develop an effective approach. This in turn allows organizations to more quickly identify business problems, as well as data problems caused by structural issues versus practice-oriented defects, and prevent them from recurring.
How to Realize Benefits from Data Management Maturity Models (Kingland)
View individual use cases from a large B2B organization, mid-size financial institution, and a scientific data repository. See the plan and outcome from all case studies.
Introduction to Data Management Maturity Models (Kingland)
Jeff Gorball, the only individual accredited in the EDM Council Data Management Capability Model and the CMMI Institute Data Management Maturity Model, introduces audiences to both models and shares how you can choose which one is best for your needs.
How Ally Financial Achieved Regulatory Compliance with the Data Management Maturity (DMM) Model (DATAVERSITY)
A Data Management Maturity Model Case Study
Ally Financial Inc., previously known as GMAC Inc., is a bank holding company headquartered in Detroit, Michigan. Ally has more than 15 million customers worldwide, serving over 16,000 auto dealers in the US. In 2009 Ally Bank was launched – at present it has over 784,000 customers, a satisfaction score of over 90%, and has been named the “Best Online Bank” by Money magazine for the last four years.
Ally was an early adopter of the DMM, conducting a broad-based evaluation of its data management practices, and creating a strategy and sequence plan for improvements based on the results. Ally’s implementation of an integrated, organization-wide data management program including data governance, a robust data quality program, and managed data standards, resulted in a “Satisfactory” rating on its latest regulatory audit.
In this webinar, you will learn:
How Ally employed the DMM to evaluate its data management practices
Who was involved / lessons learned
How Ally prioritized and sequenced data management improvement initiatives
How the data management program has been enhanced and expanded
Business impacts and benefits realized
Major initiatives completed and underway
How Ally is leveraging DMM 1.0 to proactively prepare for BCBS 239 compliance.
Data-Ed Webinar: Implementing the Data Management Maturity Model (DMM) - With... (DATAVERSITY)
The Data Management Maturity (DMM) model is a framework for the evaluation and assessment of an organization’s data management capabilities. This model—based on the Capability Maturity Model pioneered by the U.S. Department of Defense for improving software development processes—allows an organization to evaluate its current state data management capabilities, discover gaps to remediate, and identify strengths to leverage. In doing so, this assessment method reveals organizational priorities, business needs, and a clear path for rapid process improvements.
In this webinar, we will:
- Describe the DMM model, its purpose and evolution, and how it can be used as a roadmap for assessing and improving organizational data management and data management maturity
- Discuss how to get the most out of a DMM assessment, including its dependencies and requirements for use
Introduction to DCAM, the Data Management Capability Assessment Model (Element22)
DCAM is a model to assess data management capability within the financial industry. It was created by the EDM Council. This presentation provides an overview of DCAM and how financial institutions leverage DCAM to improve or establish their data management programs and meet regulatory requirements such as BCBS 239.
Data Quality Management: Cleaner Data, Better Reporting (Accenture)
In this new Accenture Finance & Risk presentation we explore a process to investigate, prioritize and resolve data quality issues, key to creating a more efficient and accurate reporting environment. View our presentation to learn more.
For more on regulatory reporting, see presentation on Financial Reporting Robotics: http://bit.ly/2qaLK9y
Visit our blog for latest Regulatory Insights: https://accntu.re/2qnXs1B
A 15-slide presentation displaying the use cases, features, and benefits of the 4th-generation Kingland Platform. The platform delivers enterprise data management solutions for some of the world's largest organizations. Powered by an artificial intelligence suite, it helps organizations avoid costs, accelerate projects, and improve how they use data to make business decisions.
A successful data governance capability requires a strategy to align regulatory drivers and technology enhancement initiatives with business needs and objectives, taking into account the organizational, technological and cultural changes that will need to take place.
Is Your Data Ready to Drive Your Company's Future? (Edgewater)
Before investing the time and money to implement a reporting and analytics solution to guide you out of the current economic crisis, make sure that your data is prepared to lead the way.
Join Edgewater Technology for a step-by-step approach to readying your data to support enterprise reporting and analytics applications.
Introduction to DCAM, the Data Management Capability Assessment Model - Editi... (Element22)
DCAM stands for Data Management Capability Assessment Model, a model for assessing data management capabilities within the financial industry. It was created by the EDM Council in collaboration with over 100 financial institutions. This presentation provides an overview of DCAM, describes its benefits, and shows how financial institutions leverage it to improve or establish their data management programs and meet regulatory requirements such as BCBS 239.
The Data Driven University - Automating Data Governance and Stewardship in Au... (Pieter De Leenheer)
Data Governance and Stewardship requires automation of business semantics management at its nucleus in order to achieve data trust between the business and IT communities in the organization. University divisions operate autonomously and in a decentralized, often geographically distributed fashion. Hence, they benefit more from a collaborative, agile approach to Data Governance and Stewardship that adapts to this nature.
In this lecture, we start by reviewing the 'C' in ICT and reflect on a dilemma: what is the most important quality of shared data, truth or trust? We review the wide spectrum of business semantics. We visit the different phases of growing data pain as an organization expands, and we map each phase onto this spectrum of semantics.
Next, we introduce our principles and framework for business semantics management to support Data Governance and Stewardship focusing on the structural (what), processual (how) and organizational (who) components. We illustrate with use cases from Stanford University, George Washington University and Public Science and Innovation Administrations.
In this lecture we discuss data quality in general and data quality in Linked Data in particular. This 50-minute lecture was given to master's students at Trinity College Dublin (Ireland) and covered the following:
1) Defining Quality
2) Defining Data Quality - What, Why, Costs
3) Identifying problems early - using a simple semantic publishing process as an example
4) Assessing Linked (big) Data quality
5) Quality of LOD cloud datasets
References can be found at the end of the slides
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 (CC BY-SA 4.0) International License.
The Data Management Maturity (DMM) model is a framework for the evaluation and assessment of an organization’s data management capabilities. The model allows an organization to evaluate its current state data management capabilities, discover gaps to remediate, and strengths to leverage. The assessment method reveals priorities, business needs, and a clear, rapid path for process improvements. This webinar will describe the DMM, its evolution, and illustrate its use as a roadmap guiding organizational data management improvements.
Learning Objectives:
Purpose of the DMM: How to use the DMM to assess and improve Data Management Maturity
Getting the most from a DMM assessment: DMM dependencies and requirements for use
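As a minimal sketch of how assessment results might be summarized (the process-area names and the 1-5 rating scale below are assumptions for illustration, not the DMM's actual taxonomy or scoring method), ratings collected from practitioners can be rolled up into a per-area profile that surfaces the weakest areas first:

```python
from statistics import mean

# Assumed example ratings (1-5) collected per data management process area;
# the area names here are illustrative, not the DMM's official structure.
ratings = {
    "data_governance": [3, 4, 3],
    "data_quality":    [2, 2, 3],
    "data_operations": [4, 4, 5],
}

def maturity_profile(area_ratings):
    """Average each area's ratings and order areas weakest-first,
    so gaps to remediate appear before strengths to leverage."""
    scored = {area: mean(vals) for area, vals in area_ratings.items()}
    return sorted(scored.items(), key=lambda kv: kv[1])

for area, score in maturity_profile(ratings):
    print(f"{area}: {score:.1f}")
```

The weakest-first ordering mirrors the assessment's purpose: gaps become remediation priorities, while high-scoring areas are strengths to build on.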
Data-Ed Webinar: Data Governance Strategies (DATAVERSITY)
The data governance function exercises authority and control over the management of your mission-critical assets and guides how all other data management functions are performed. When selling data governance to organizational management, it is useful to concentrate on the specifics that motivate the initiative. This means developing a specific vocabulary and set of narratives to facilitate understanding of your organizational business concepts. This webinar provides you with an understanding of what data governance functions are required and how they fit with other data management disciplines. Understanding these aspects is a necessary prerequisite to eliminating the ambiguity that often surrounds initial discussions and to implementing effective data governance and stewardship programs that manage data in support of organizational strategy.
Takeaways:
Understanding why data governance can be tricky for most organizations
Steps for improving data governance within your organization
Guiding principles & lessons learned
Understanding foundational data governance concepts based on the DAMA DMBOK
Compliance issues can impact organizations in many ways. For medical device companies, this can be in the form of the FDA’s unique device identification (UDI) requirements. These requirements, a result of the passage of The FDA Amendments Act of 2007, stipulate that most medical devices carry a unique device identifier.
A webinar addressing how enterprise data management enables UDI compliance was presented live on May 23, 2013 in a joint session with Kelle O’Neal of First San Francisco Partners and Ross Hart of Riversand Technologies.
During the presentation, the following areas were discussed:
- The FDA legislation and the impact it will have on your organization
- Current UDI data challenges and benefits
- How enterprise information management and PIM support UDI
- How to get a UDI program started
- How to ensure a successful UDI program
These are the slides used in Kelle's portion of the presentation.
Many data professionals struggle to demonstrate tangible returns on data management investments. In a webinar designed to appeal to both business and IT attendees, presenter Dr. Peter Aiken describes multiple types of value produced through data-centric development and management practices. One of the examples, from the healthcare space, offers a unique opportunity to demonstrate an additional type of return on investment: lives saved through increased rates of bone marrow donor matches. In addition to metrics around increasing revenues or decreasing costs, i.e. investments that directly impact an organization's financial position, such statistics of lives saved can be used to justify data management and quality initiatives.
Check out more of our webinars here: http://www.datablueprint.com/resource-center/
Data-Ed Online: Your Documents and Other Content: Managing Unstructured Data (Data Blueprint)
This webinar originally aired on Tuesday, August 14, 2012. It is part of Data Blueprint's ongoing webinar series on data management with Dr. Aiken.
Sign up for future sessions at http://www.datablueprint.com/webinar-schedule.
Abstract
Non-tabular data plays an increasing role in organizations. While we are still far away from automated content comprehension, increasingly sophisticated technologies are extending our data management capabilities into more critical and more regulated areas. This presentation provides you with an understanding of the dimensions of this vast new area, including electronic and physical document monitoring, storage systems, content analysis and archive, retrieve and purge cycling.
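As one hypothetical sketch of the archive, retrieve, and purge cycling the abstract mentions (the document classes, retention periods, and archive-at-half-retention policy below are all invented for illustration), a retention check might decide each document's disposition from its age:

```python
from datetime import date

# Assumed retention rules, in days, per document class.
RETENTION = {"contract": 7 * 365, "invoice": 3 * 365, "memo": 365}

def disposition(doc_type: str, created: date, today: date) -> str:
    """Decide whether a document stays active, is archived, or is purged."""
    age = (today - created).days
    limit = RETENTION[doc_type]
    if age > limit:
        return "purge"
    if age > limit // 2:   # assumed policy: archive at half of retention
        return "archive"
    return "retain"

today = date(2013, 1, 1)
print(disposition("memo", date(2012, 11, 1), today))  # retain
print(disposition("memo", date(2012, 5, 1), today))   # archive
print(disposition("memo", date(2011, 6, 1), today))   # purge
```

Real records management regimes are driven by regulatory schedules rather than fixed day counts, but the retain/archive/purge decision structure is the same.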
Data-Ed: Unlock Business Value through Data Quality Engineering (Data Blueprint)
Organizations must realize what it means to utilize data quality management in support of business strategy. This webinar focuses on obtaining business value from data quality initiatives. I will illustrate how organizations with chronic business challenges can often trace the root of the problem to poor data quality. Showing how data quality should be engineered provides a useful framework in which to develop an effective approach. This in turn allows organizations to more quickly identify business problems, as well as data problems caused by structural issues versus practice-oriented defects, and prevent them from recurring.
You can sign up for future Data-Ed webinars here: http://www.datablueprint.com/resource-center/webinar-schedule/
Data-Ed Online: Let's Talk Metadata: Strategies and Successes (Data Blueprint)
This webinar originally aired on Tuesday, September 11, 2012. It is part of Data Blueprint's ongoing webinar series on data management with Dr. Aiken.
Sign up for future sessions at http://www.datablueprint.com/webinar-schedule.
Abstract:
Commonly described as metadata management, properly implemented metadata practices incorporate data structures into more abstract processing. By using data about the data to enhance its value, understandability, and ease of use, organizations have developed sophisticated ways to strengthen their data management and especially their data quality engineering efforts. Join us to learn more about specific metadata benefits and how to leverage them to achieve success within your organization.
Data-Ed: Unlock Business Value through Document & Content Management (Data Blueprint)
Organizations must realize what it means to utilize document and content management in support of business strategy. The volume of unstructured data is growing at an enormous pace. While we are still far away from automated content comprehension, increasingly sophisticated technologies are extending our business and data management capabilities into more critical and regulated areas. This presentation provides you with an understanding of the dimensions of these new developments, including electronic and physical document monitoring, storage systems, content analysis and archive, retrieve and purge cycling.
Learning Objectives:
What is Document & Content Management and why is it important?
Planning and Implementing Document & Content Management
Document/Record Management Lifecycle
Levels of Control
Content management building blocks
Guiding principles & best practices
Understanding foundational document & content management concepts based on the Data Management Body of Knowledge (DMBOK)
http://www.datablueprint.com/webinar-schedule
Data Systems Integration & Business Value Pt. 2: Cloud (Data Blueprint)
Certain systems are more data focused than others. Usually their primary focus is on accomplishing integration of disparate data. In these cases, failure is most often attributable to the adoption of a single pillar (silver bullet). The three webinars in the Data Systems Integration and Business Value series are designed to illustrate that good systems development more often depends on at least three DM disciplines (pie wedges) in order to provide a solid foundation.
Many organizations are modifying their IT portfolios to fully take advantage of the benefits of cloud computing. While the motivation is specific and focuses on broad-based challenges, all organizations are prepared to benefit from aspects of the cloud. This is accomplished by ensuring that cloud-hosted data share three attributes. Cloud-hosted datasets must be of:
Higher quality data than those data residing outside of the cloud;
Lower volume (1/5 the size of data collections) than similar collections residing outside of the cloud; and
Greater share-ability than data residing outside the cloud.
Increases in capacity utilization, improved IT flexibility and responsiveness, as well as the forecast decreases in cost accruing to cloud-based computing are all possible after these first three conditions have been met. Necessary investments in data engineering can help organizations save even more by reducing the resources required to perform their duties and by increasing the effectiveness of their operations and decision-making. This webinar will show you how to recognize the opportunities, ‘size up’ the required investment, and properly supervise your efforts to take advantage of the opportunities presented by the cloud.
You can sign up for future Data-Ed webinars here: http://www.datablueprint.com/resource-center/webinar-schedule/
Data-Ed: Show Me the Money: The Business Value of Data and ROI (Data Blueprint)
This webinar originally aired on Tuesday, December 11, 2012. It is part of Data Blueprint's ongoing webinar series on data management with Dr. Aiken.
Sign up for future sessions at http://www.datablueprint.com/webinar-schedule.
Abstract:
Failure to successfully monetize data management investments sets up an unfortunate loop of fixing symptoms without addressing the underlying problems. As organizations begin to understand poor data management practices as the root causes of many of their problems, they become more willing to make the required investments in our profession. This presentation uses specific examples to illustrate the costs of poor data management. Join us and learn how you can apply similar tactics at your organization to justify funding and gain management approval.
Data-Ed: Unlock Business Value through Data Governance (Data Blueprint)
If your organization understands your function, they see you as an investment. If your organization does not understand what you do, they are likely to perceive you as a cost. The goal of this webinar is to provide you with concrete ideas for how to reinforce the first mindset at your organization. Success stories must be used to ensure continued organizational support. When selling data governance to organizational management, it is useful to concentrate on the specifics that motivate the initiative. This means developing a specific vocabulary and set of narratives to facilitate understanding of your organizational business concepts. For example: using specific common terms (and narratives) when referencing organizational mishaps, e.g. The Chocolate Story.
Learning Objectives:
Understanding contextually why data governance can be tricky for most organizations
Demonstrate a variety of “storytelling” techniques
How to use “worst practices” to your advantage
Understanding foundational data governance concepts based on the Data Management Body of Knowledge (DMBOK)
Taking away several novel but tangible examples of generating business value through data governance
Data-Ed: Unlock Business Value Through Reference & MDM (Data Blueprint)
In order to succeed, organizations must realize what it means to utilize reference and MDM in support of business strategy. This presentation provides you with an understanding of the goals of reference and MDM, including the establishment and implementation of authoritative data sources, more effective means of delivering data to various business processes, as well as increasing the quality of information used in organizational analytical functions, e.g. BI. We also highlight the equal importance of incorporating data quality engineering into all efforts related to reference and master data management.
Check out more of our webinars here: http://www.datablueprint.com/webinar-schedule
This webinar aired originally on Tuesday, March 13, 2012. It is part of Data Blueprint's ongoing webinar series on data management with Dr. Peter Aiken.
Sign up for future sessions at http://www.datablueprint.com/webinar-schedule.
Abstract
This presentation provides you with an understanding of the data modeling and data development components of data management. Participants will understand how the analysis, design, implementation, deployment, and maintenance of data solutions should be approached in order to maximize the full value of the enterprise data resources and activities. Architecting in quality is imperative at this level and complements a subset of project activities within the system development lifecycle (SDLC) focused on defining data requirements, designing data solution components, and implementing these components. Participants will understand the difficulties organizations experience when interacting with data development efforts and how to best incorporate these efforts into specific data projects.
View the video recording here: http://www.slideshare.net/aberkowitz/dataed-online-practical-data-modeling-12019990
Data architecture is foundational to an information-based operational environment. It is your data architecture that organizes your data assets so they can be leveraged in your business strategy to create real business value. Even though this is important, not all data architectures are used effectively. This webinar describes the use of data architecture as a basic analysis method. Various uses of data architecture to inform, clarify, understand, and resolve aspects of a variety of business problems will be demonstrated. As opposed to showing how to architect data, your presenter Dr. Peter Aiken will show how to use data architecting to solve business problems. The goal is for you to be able to envision a number of uses for data architectures that will raise the perceived utility of this analysis method in the eyes of the business.
Find more Data-Ed webinars here: www.datablueprint.com
Data-Ed: Building the Case for the Top Data Job (Data Blueprint)
Reflections on the past 25 years of organizational IT accomplishments, combined with performance measurement data, indicate that current IT management has been called upon to do a job that it cannot do well. Data are assets that deserve to be managed as professionally and aggressively as other company assets. Objective measurements show that approximately 1% of all organizations achieve data management success. In the face of the ongoing “data explosion,” this leaves most organizations wholly unprepared to leverage their sole non-degrading strategic asset. These requirements and organizational performance dictate a full-time position that does not report to IT and that manages data from a function external to, and preceding, the SDLC. While the transformation may require some organizational discomfort, this move will achieve improved organizational IT performance faster and cheaper than ERPs or any other silver bullet.
Learning Objectives:
Why there typically isn’t and ultimately must be an authority (a chief) on organizational informational asset management
Why CIOs have not been able to devote the required time and attention
The seriousness of the skill gap – requisite expertise is rare
Understanding the ideal relationship between Data and IT
Introduction to Microsoft’s Master Data Services (MDS) (James Serra)
Master Data Services is bundled with SQL Server 2012 to help resolve many of the Master Data Management issues that companies are faced with when integrating data. In this session, James will show an overview of Master Data Services 2012, including the out of the box Web UI, the highly developed Excel Add-in, and how to get started with loading MDS with your data.
Microsoft SQL Server 2012 Master Data Services (Mark Ginnebaugh)
Author: Mark Gschwind, DesignMind
San Francisco, California
Master Data Services had a major upgrade in the SQL Server 2012 release. BI Consultant Mark Gschwind takes you through the new Excel interface, the new Silverlight look and feel, and integration improvements.
Knowing how to use this tool can be a valuable addition to your repertoire as a BI professional, allowing you to address data quality and other challenges.
Mark will show how to create a model, add columns and rows, manage security, and create hierarchies. He demos the new Excel interface and discusses how it allows you to manage master data yourself. He'll also touch on how to integrate with a DW and how to migrate from Dev to Production.
You'll learn:
* How to let users manage dimensions and hierarchies for your DW
* How to create workflows to improve data quality in your DW
* Tips from real-life projects to help you achieve a successful implementation
Mark Gschwind, Partner at DesignMind, is an expert on data warehousing, OLAP, and ERP migration. He has authored three enterprise data warehouses and over 80 OLAP cubes for 46 clients in a wide range of industries. Mark has certifications in SQL Server and Oracle Essbase.
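As a rough illustration of the kind of master-data loading these MDS sessions cover: SQL Server 2012 MDS accepts bulk data through generated staging tables (stg.<entity>_Leaf) that are then processed by a matching stored procedure. The sketch below only builds the T-SQL strings for a hypothetical Customer entity; the entity name, batch tag, and version name are assumptions, and the generated object names should be verified against your own MDS database before use.

```python
# Sketch of the MDS 2012 entity-based staging pattern (hypothetical "Customer" entity).
# MDS generates stg.<name>_Leaf tables and stg.udp_<name>_Leaf procedures per entity;
# verify the actual names in your MDS database before running anything like this.

def staging_insert(entity, batch_tag, members):
    """Builds an INSERT that queues leaf members; ImportType 0 = create or update."""
    rows = ",\n    ".join(
        "(0, '{tag}', '{code}', '{name}')".format(
            tag=batch_tag, code=m["code"], name=m["name"])
        for m in members
    )
    return ("INSERT INTO stg.{e}_Leaf (ImportType, BatchTag, Code, Name)\n"
            "VALUES\n    {rows};").format(e=entity, rows=rows)

def staging_sweep(entity, version, batch_tag):
    """Builds the EXEC that asks MDS to process the queued batch into the model."""
    return ("EXEC stg.udp_{e}_Leaf @VersionName = N'{v}', "
            "@LogFlag = 1, @BatchTag = N'{t}';").format(e=entity, v=version, t=batch_tag)

# Example: queue one member, then sweep the batch.
print(staging_insert("Customer", "batch-001",
                     [{"code": "C001", "name": "Contoso"}]))
print(staging_sweep("Customer", "VERSION_1", "batch-001"))
```

In practice these statements would be executed against the MDS database through any SQL client; the Excel add-in shown in the session is an alternative, interactive route to the same entities.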
Introducing Open XDX Technology for Open Data API development (Bizagi Inc)
Introduction to the concepts of Open-XDX for building Open Data APIs using the CAMeditor toolkit. See also http://www.verifyXML.org for working online demonstration site.
Similar to Data-Ed Online: Engineering Solutions to Data Quality Challenges
Data-Ed: A Framework for NoSQL and Hadoop (Data Blueprint)
Big Data and NoSQL continue to make headlines everywhere. However, most of what has been written about these topics is focused on the hardware, services, and scale-out. But what about a Big Data and NoSQL strategy, one that supports your business strategy? Virtually every major organization thinking about these data platforms is faced with the challenge of figuring out the appropriate approach and the requirements. This presentation will provide guidance on how to think about and establish realistic Big Data management plans and expectations. We will introduce a framework for evaluating the various choices when it comes to implementing and succeeding with Big Data/NoSQL and walk through a sample use case.
Many data professionals struggle with the ability to demonstrate tangible returns on data management investments. In a webinar that is designed to appeal to both business and IT attendees, your presenter will describe multiple types of value produced through data-centric development and management practices. One of our examples, the healthcare space, offers the unique opportunity to demonstrate additional types of return on investment or value outcomes, namely returns in the form of lives saved through increased rates of Bone Marrow Donor matches. In addition to metrics around increasing revenues or decreasing costs, i.e. investments that directly impact an organization’s financial position, these additional statistics of lives saved can be used to justify data management and quality initiatives.
The data governance function exercises authority and control over the management of your mission critical assets and guides how all other data management functions are performed. When selling data governance to organizational management, it is useful to concentrate on the specifics that motivate the initiative. This means developing a specific vocabulary and set of narratives to facilitate understanding of your organizational business concepts. This webinar provides you with an understanding of what data governance functions are required and how they fit with other data management disciplines. Understanding these aspects is a necessary pre-requisite to eliminate the ambiguity that often surrounds initial discussions and implement effective data governance and stewardship programs that manage data in support of organizational strategy.
Find more of our Data-Ed webinars here: http://www.datablueprint.com/resource-center/webinar-schedule/
This presentation provides you with an understanding of the goals of reference and master data management (MDM), including establishing and implementing authoritative data sources, establishing and implementing more effective means of delivering data to various business processes, as well as increasing the quality of information used in organizational analytical functions (such as BI). You will understand the parallel importance of incorporating data quality engineering into the planning of reference and MDM.
Check out more of our Data-Ed webinars here: http://www.datablueprint.com/resource-center/webinar-schedule/
Data is the lifeblood of just about every organization and functional area today. As businesses struggle to come to grips with the data flood, it is even more critical to focus on data as an asset that directly supports business imperatives, as other organizational assets do. Organizations across most industries attempt to address data opportunities (e.g. Big Data) and data challenges (e.g. data quality) to enhance business unit performance. Unfortunately, however, the results of these efforts frequently fall far below expectations due to haphazard approaches. Overall, poor organizational data management capabilities are the root cause of many of these failures. This webinar covers three lessons (illustrated by examples), which will help you to establish realistic data management plans and expectations, and help demonstrate the value of such actions to both internal and external decision makers.
Integrating data across systems has been a perpetual challenge. Unfortunately, the current technology-focused solutions have not helped IT to improve its dismal project success statistics. Data warehouses, BI implementations, and general analytical efforts achieve the same levels of success as other IT projects – approximately 1/3rd are considered successes when measured against price, schedule, or functionality objectives. The first step is determining the appropriate analysis approach to the data system integration challenge. The second step is understanding the strengths and weaknesses of various approaches. It turns out that proper analysis at this stage makes actual technology selection far more accurate. Only when these are accomplished can proper matching between problem and capabilities be achieved as the third step and true business value be delivered. This webinar will illustrate that good systems development more often depends on at least three data management disciplines in order to provide a solid foundation.
Find more Data-Ed webinars here: http://www.datablueprint.com/resource-center/webinar-schedule/
Good systems development often depends on multiple data management disciplines that provide a solid foundation. One of these is metadata. While much of the discussion around metadata focuses on understanding metadata itself along with its associated technologies, this perspective often represents a typical tool-and-technology focus, which has not achieved significant results to date. A more relevant question when considering pockets of metadata is whether to include them in the scope of organizational metadata practices. By understanding what it means to include items in the scope of your metadata practices, you can begin to build systems that allow you to practice sophisticated ways to advance your data management and supported business initiatives. After a bit of practice in this manner you can position your organization to better exploit any and all metadata technologies in support of business strategy.
Find more data management webinars here: http://www.datablueprint.com/resource-center/webinar-schedule/
Data-Ed: Best Practices with the Data Management Maturity Model (Data Blueprint)
The Data Management Maturity (DMM) model is a framework for the evaluation and assessment of an organization's data management capabilities. The model allows an organization to evaluate its current state data management capabilities, discover gaps to remediate, and strengths to leverage. The assessment method reveals priorities, business needs, and a clear, rapid path for process improvements. This webinar will describe the DMM, its evolution, and illustrate its use as a roadmap guiding organizational data management improvements.
Tools alone are not the answer: Career roles and growth tracks for data professionals. In today’s (Big) data-driven information economy, it is even more critical to focus on data as an asset that directly supports business imperatives. But tools alone are not the answer. Organizations that want to rise above their competition can only do so with the help of skilled professionals who know how to manage, mine, and draw actionable insights from the multitudes of (Big) data sources. Numerous new roles and job titles have emerged to address the high demand for specialized data professionals. This webinar brings together three individuals well qualified to contribute to this important industry-wide discussion of data jobs. We will take a closer look at these newer data management roles and present recommendations on how to enhance career paths.
Check out more webinars here: http://www.datablueprint.com/resource-center/webinar-archive/
We are in the middle of a data flood and we need to figure out how to tame it without drowning. Most of what has been written about Big Data is focused on selling hardware and services. But what about a Big Data Strategy that guides hardware and software decisions? While virtually every major organization is faced with the challenge of figuring out the approach for and the requirements of this new development, jumping into the fray hastily and unprepared will only reproduce the same dismal IT project results as previously experienced. Join Dr. Peter Aiken as he debunks a number of misconceptions about Big Data as an atypical IT project. He will provide guidance on how to establish realistic Big Data management plans and expectations, and help demonstrate the value of such actions to both internal and external decision makers without getting lost in the hype.
Check out more of our Data-Ed webinars here: www.datablueprint.com/webinar-schedule
Data-Ed: Show Me the Money: Monetizing Data Management (Data Blueprint)
Failure to successfully monetize data management investments sets up an unfortunate loop of fixing symptoms without addressing the underlying problems. As organizations begin to understand poor data management practices as the root causes of many of their business problems, they become more willing to make the required investments in our profession. This presentation uses specific examples to illustrate the costs of poor data management and how it impacts business objectives. Join us and learn how you can better align your data management projects with business objectives to justify funding and gain management approval.
Check out more of our webinars: http://www.datablueprint.com/resource-center/webinar-schedule/
Data Systems Integration & Business Value Pt. 3: Warehousing (Data Blueprint)
Certain systems are more data focused than others. Usually their primary focus is on accomplishing integration of disparate data. In these cases, failure is most often attributable to the adoption of a single pillar (silver bullet). The three webinars in the Data Systems Integration and Business Value series are designed to illustrate that good systems development more often depends on at least three DM disciplines (pie wedges) in order to provide a solid foundation.
Integrating data across systems has been a perpetual challenge. Unfortunately, the current technology-focused solutions have not helped IT to improve its dismal project success statistics. Data warehouses, BI implementations, and general analytical efforts achieve the same levels of success as other IT projects – approximately 1/3rd are considered successes when measured against price, schedule, or functionality objectives. The first step is determining the appropriate analysis approach to the data system integration challenge. The second step is understanding the strengths and weaknesses of various approaches. It turns out that proper analysis at this stage makes actual technology selection far more accurate. Only when these are accomplished can proper matching between problem and capabilities be achieved as the third step and true business value be delivered.
Data-Ed: Data Systems Integration & Business Value Pt. 1: Metadata (Data Blueprint)
Certain systems are more data focused than others. Usually their primary focus is on accomplishing integration of disparate data. In these cases, failure is most often attributable to the adoption of a single pillar (silver bullet). The three webinars in the Data Systems Integration and Business Value series are designed to illustrate that good systems development more often depends on at least three DM disciplines (pie wedges) in order to provide a solid foundation.
Much of the discussion of metadata focuses on understanding it and the associated technologies. While these are important, they represent a typical tool/technology focus, and this has not achieved significant results to date. A more relevant question when considering pockets of metadata is whether to include them in the scope of organizational metadata practices. By understanding what it means to include items in the scope of your metadata practices, you can begin to build systems that allow you to practice sophisticated ways to advance your data management and supported business initiatives. After a bit of practice in this manner you can position your organization to better exploit any and all metadata technologies.
You can sign up for future Data-Ed webinars here: http://www.datablueprint.com/resource-center/webinar-schedule/
Yes, we face a data deluge, and big data seems to be largely about how to deal with it. But 99% of what has been written about big data is focused on selling hardware and services. The truth is that until the concept of big data can be objectively defined, any measurements, claims of success, quantifications, etc. must be viewed with skepticism. While virtually every organization faces both the need for and the approaches to these new requirements, jumping into the fray ill-prepared has (to date) reproduced the same dismal IT project results.
The very real, very rapid, very great increases in data of all forms (charts showing data types and volume increases)
Challenges faced by virtually all data management programs
Means by which big data techniques can complement existing data management practices
Necessary but insufficient pre-requisites to exploiting big data techniques
Prototyping nature of practicing big data techniques
You can sign up for future Data-Ed webinars here: http://www.datablueprint.com/resource-center/webinar-schedule/
GraphSummit Singapore | The Future of Agility: Supercharging Digital Transfor... (Neo4j)
Leonard Jayamohan, Partner & Generative AI Lead, Deloitte
This keynote will reveal how Deloitte leverages Neo4j’s graph power for groundbreaking digital twin solutions, achieving a staggering 100x performance boost. Discover the essential role knowledge graphs play in successful generative AI implementations. Plus, get an exclusive look at an innovative Neo4j + Generative AI solution Deloitte is developing in-house.
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024 (Albert Hoitingh)
In this session I delve into the encryption technology used in Microsoft 365 and Microsoft Purview, including the concepts of Customer Key and Double Key Encryption.
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor... (SOFTTECHHUB)
The choice of an operating system plays a pivotal role in shaping our computing experience. For decades, Microsoft's Windows has dominated the market, offering a familiar and widely adopted platform for personal and professional use. However, as technological advancements continue to push the boundaries of innovation, alternative operating systems have emerged, challenging the status quo and offering users a fresh perspective on computing.
One such alternative that has garnered significant attention and acclaim is Nitrux Linux 3.5.0, a sleek, powerful, and user-friendly Linux distribution that promises to redefine the way we interact with our devices. With its focus on performance, security, and customization, Nitrux Linux presents a compelling case for those seeking to break free from the constraints of proprietary software and embrace the freedom and flexibility of open-source computing.
Climate Impact of Software Testing at Nordic Testing Days (Kari Kakkonen)
My slides at Nordic Testing Days 6.6.2024
The talk discusses the climate impact and sustainability of software testing. ICT and testing must carry their part of the global responsibility to help with climate warming. We can minimize the carbon footprint, but we can also have a carbon handprint, a positive impact on the climate. Quality characteristics can be extended with sustainability and then measured continuously. Test environments can be used less, at a smaller scale, and on demand. Test techniques can be used to optimize or minimize the number of tests. Test automation can be used to speed up testing.
Observability Concepts EVERY Developer Should Know -- DeveloperWeek Europe.pdf (Paige Cruz)
Monitoring and observability aren’t traditionally found in software curriculums, and many of us cobble this knowledge together from whatever vendor or ecosystem we were first introduced to and whatever is part of our current company’s observability stack.
While the dev and ops silo continues to crumble, many organizations still relegate monitoring & observability to the purview of ops, infra, and SRE teams. This is a mistake - achieving a highly observable system requires collaboration up and down the stack.
I, a former op, would like to extend an invitation to all application developers to join the observability party, and will share these foundational concepts to build on:
Unlocking Productivity: Leveraging the Potential of Copilot in Microsoft 365, a presentation by Christoforos Vlachos, Senior Solutions Manager – Modern Workplace, Uni Systems
Communications Mining Series - Zero to Hero - Session 1 (DianaGray10)
This session provides an introduction to UiPath Communication Mining, its importance, and a platform overview. You will acquire a good understanding of the phases in Communication Mining as we go over the platform with you. Topics covered:
• Communication Mining Overview
• Why is it important?
• How can it help today’s business and the benefits
• Phases in Communication Mining
• Demo on Platform overview
• Q/A
State of ICS and IoT Cyber Threat Landscape Report 2024 preview (Prayukth K V)
The IoT and OT threat landscape report has been prepared by the Threat Research Team at Sectrio using data from Sectrio's cyber threat intelligence farming facilities spread across over 85 cities around the world. In addition, Sectrio also runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors and newer malware, including new variants and latent threats that are at an earlier stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
State of global ICS asset and network exposure
Sectoral targets and attacks as well as the cost of ransom
Global APT activity, AI usage, actor and tactic profiles, and implications
Rise in volumes of AI-powered cyberattacks
Major cyber events in 2024
Malware and malicious payload trends
Cyberattack types and targets
Vulnerability exploit attempts on CVEs
Attacks on counties – USA
Expansion of bot farms – how, where, and why
In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
Why are attacks on smart factories rising?
Cyber risk predictions
Axis of attacks – Europe
Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
In his public lecture, Christian Timmerer provides insights into the fascinating history of video streaming, starting from its humble beginnings before YouTube to the groundbreaking technologies that now dominate platforms like Netflix and ORF ON. Timmerer also presents provocative contributions of his own that have significantly influenced the industry. He concludes by looking at future challenges and invites the audience to join in a discussion.
Dr. Sean Tan, Head of Data Science, Changi Airport Group
Discover how Changi Airport Group (CAG) leverages graph technologies and generative AI to revolutionize their search capabilities. This session delves into the unique search needs of CAG’s diverse passengers and customers, showcasing how graph data structures enhance the accuracy and relevance of AI-generated search results, mitigating the risk of “hallucinations” and improving the overall customer journey.
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ... (James Anderson)
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (the CI/CD process) involves many tools, distributed teams, open-source code, and cloud platforms. A constant focus on speed to market, combined with traditionally slow and manual security checks, has left gaps in continuous security, an important piece of the software supply chain. Today, organizations feel more susceptible to external and internal cyber threats because of the vast attack surface of their application supply chain and the lack of end-to-end governance and risk management.
The software team must secure its delivery process to avoid vulnerabilities and security breaches. This needs to be achieved with existing toolchains and without extensive rework of the delivery processes. This talk presents strategies and techniques for providing visibility into the true risk of existing vulnerabilities, preventing the introduction of security issues into the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
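The talk does not specify a DBOM format, so as a hedged illustration only, a deployment bill of materials can be thought of as a record tying a deployment to the exact components it shipped. The field names (`service`, `environment`, `deployed_at`, `components`) and all values below are hypothetical, not taken from OpsMx or any published schema:

```python
import json
from dataclasses import dataclass, field, asdict
from typing import List

# Illustrative schema only; real DBOM formats vary by tool.
@dataclass
class Component:
    name: str
    version: str
    sha256: str          # digest of the deployed artifact

@dataclass
class DeploymentRecord:
    service: str
    environment: str
    deployed_at: str     # ISO-8601 timestamp
    components: List[Component] = field(default_factory=list)

# Example record for a hypothetical service deployment.
record = DeploymentRecord(
    service="payments-api",
    environment="prod",
    deployed_at="2024-05-01T12:00:00Z",
    components=[Component("libssl", "3.0.13", "ab12cd34")],
)
print(json.dumps(asdict(record), indent=2))
```

Capturing such a record at deploy time is what lets a team later answer "which production services currently run the vulnerable version?" without re-scanning every environment.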
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with a passion for making things work and a knack for helping others understand how things work. He has around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms, and is known for his dynamic presentations on CI/CD and on application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for secure software delivery. Gopi also works closely with customers, leading design and architecture for strategic implementations. He is a frequent speaker and a well-known leader in continuous delivery and in integrating security into software delivery.
Removing Uninteresting Bytes in Software Fuzzing (Aftab Hussain)
Imagine a world where software fuzzing, the process of mutating bytes in test seeds to uncover hidden and erroneous program behaviors, becomes faster and more effective. Much depends on the initial seeds, which can significantly dictate the trajectory of a fuzzing campaign, particularly how long it takes to uncover interesting behavior in your code. We introduce DIAR, a technique designed to speed up fuzzing campaigns by pinpointing and eliminating uninteresting bytes in the seeds. Picture this: instead of wasting valuable resources on meaningless mutations in large, bloated seeds, DIAR removes the unnecessary bytes, streamlining the entire process.
In this work, we equipped AFL, a popular fuzzer, with DIAR and examined two critical Linux tools: Libxml's xmllint, a tool for parsing XML documents, and Binutils' readelf, an essential debugging and security analysis command-line tool used to display detailed information about ELF (Executable and Linkable Format) files. Our preliminary results show that AFL+DIAR not only discovers new paths more quickly but also achieves higher coverage overall. This work thus showcases how starting with lean, optimized seeds can lead to faster, more comprehensive fuzzing campaigns, and DIAR helps you find such seeds.
- These are the slides of a talk given at the IEEE International Conference on Software Testing, Verification and Validation Workshops (ICSTW), 2022.
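The abstract does not describe DIAR's actual analysis, so purely as a sketch of the seed-trimming idea, here is a naive greedy byte-removal loop in Python. The `still_interesting` predicate stands in for whatever signal a real tool would use (e.g. unchanged coverage of the target); the magic-header check and all names below are hypothetical:

```python
def trim_seed(data: bytes, still_interesting) -> bytes:
    """Greedy one-pass trim: drop each byte whose removal leaves the
    seed still 'interesting' to the target. A toy stand-in for DIAR's
    analysis, not the published algorithm."""
    i = 0
    while i < len(data):
        candidate = data[:i] + data[i + 1:]
        if still_interesting(candidate):
            data = candidate      # byte was uninteresting: drop it
        else:
            i += 1                # byte matters: keep it, move on
    return data

# Toy predicate: the seed stays interesting as long as it keeps the
# magic header the target parser branches on.
seed = b"ELF\x00paddingpadding\x7f"
trimmed = trim_seed(seed, lambda d: d.startswith(b"ELF"))
# The bloated seed collapses to just the bytes the check depends on.
```

Real coverage-guided trimmers (AFL ships `afl-tmin` for this purpose) use the same shrink-and-recheck loop, just with instrumented coverage as the predicate instead of a parse check.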
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdf (91mobiles)
91mobiles recently conducted a Smart TV Buyer Insights Survey, asking over 3,000 respondents about the TV they own, the features they look for in a new TV, and their TV buying preferences.