The document discusses best practices for capturing data requirements on data-rich projects. It emphasizes taking a top-down requirements approach and maintaining traceability between requirements. While use cases are useful, they often do not fully capture data needs. The document advocates looking beyond immediate needs and planning for business intelligence by capturing additional relevant data elements upfront.
07. Analytics & Reporting Requirements Template (Alan D. Duncan)
This document template defines an outline structure for the clear and unambiguous definition of analytics & reporting outputs (including standard reports, ad hoc queries, Business Intelligence, analytical models, etc.).
Capturing Business Requirements For Scorecards, Dashboards And Reports (Julian Rains)
This paper helps Management Information and Business Intelligence related projects build a solid foundation for their reporting business requirements gathering. It defines the scope of the information needed to design and build dashboards, scorecards and other types of report.
Presentation on Business Requirements gathering for Business Intelligence from our BI Practice Lead. Detailed instruction on how to maximize your time in gathering requirements and ensure you capture what is important to the user. Requirements gathering is critical to the success of a BI project.
To take a “ready, aim, fire” approach to implementing Data Governance, many organizations assess themselves against industry best practices. The process is neither difficult nor time-consuming, and it helps ensure that your activities target your specific needs. Best practices are always a strong place to start.
Join Bob Seiner for this popular RWDG topic, where he will provide the information you need to set your program in the best possible direction. Bob will walk you through the steps of conducting an assessment and share with you a set of typical results from taking this action. You may be surprised at how easy it is to organize the assessment and may hear results that stimulate the actions that you need to take.
In this webinar, Bob will share:
- The value of performing a Data Governance best practice assessment
- A practical list of industry Data Governance best practices
- Criteria to determine whether a practice is a best practice
- Steps to follow to complete an assessment
- Typical recommendations and actions that result from an assessment
03. Business Information Requirements Template (Alan D. Duncan)
A template for the clear and unambiguous definition of business data and information requirements. (cf. “Business Requirements Document”, “Functional Specification” or similar from standard SDLC processes). As such, the contents will typically form the basis for population and publication of a business glossary of information terms.
Metadata is hotter than ever, according to a number of recent DATAVERSITY surveys. More and more organizations are realizing that in order to drive business value from data, robust metadata is needed to gain the necessary context and lineage around key data assets. At the same time, industry regulations are driving the need for better transparency and understanding of information.
While metadata has been managed for decades, new strategies & approaches have been developed to support the ever-evolving data landscape, and provide more innovative ways to drive business value from metadata. This webinar will provide an overview of metadata strategies & technologies available to today’s organization, and provide insights into building successful business strategies for metadata adoption & use.
Data Governance Best Practices, Assessments, and Roadmaps (DATAVERSITY)
When starting or evaluating the present state of your Data Governance program, it is important to focus on best practices so that you don’t take a “ready, fire, aim” approach. Best practices must be practical and doable to be selected for your organization, and the program must be at risk if a best practice is not achieved.
Join Bob Seiner for an important webinar focused on industry best practice around standing up formal Data Governance. Learn how to assess your organization against the practices and deliver an effective roadmap based on the results of conducting the assessment.
In this webinar, Bob will focus on:
- Criteria to select the appropriate best practices for your organization
- How to define the best practices for ultimate impact
- Assessing against selected best practices
- Focusing the recommendations on program success
- Delivering a roadmap for your Data Governance program
Data Lakehouse, Data Mesh, and Data Fabric (r1) (James Serra)
So many buzzwords of late: Data Lakehouse, Data Mesh, and Data Fabric. What do all these terms mean and how do they compare to a data warehouse? In this session I’ll cover all of them in detail and compare the pros and cons of each. I’ll include use cases so you can see what approach will work best for your big data needs.
Example data specifications and info requirements framework OVERVIEW (Alan D. Duncan)
This example framework offers a set of outline principles, standards and guidelines to describe and clarify the semantic meaning of data terms in support of an Information Requirements Management process.
It provides template guidance to Information Management, Data Governance and Business Intelligence practitioners for such circumstances that need clear, unambiguous and reliable understanding of the context, semantic meaning and intended usages for data.
Data Governance and Metadata Management (DATAVERSITY)
Metadata is a tool that improves data understanding, builds end-user confidence, and improves the return on investment in every asset associated with becoming a data-centric organization. Metadata’s use has expanded beyond “data about data” to cover every phase of data analytics, protection, and quality improvement. Data Governance and metadata are connected at the hip in every way possible. As the song goes, “You can’t have one without the other.”
In this RWDG webinar, Bob Seiner will provide a way to renew your energy by focusing on the valuable asset that can make or break your Data Governance program’s success. The truth is metadata is already inherent in your data environment, and it can be leveraged by making it available to all levels of the organization. At issue is finding the most appropriate ways to leverage and share metadata to improve data value and protection.
Throughout this webinar, Bob will share information about:
- Delivering an improved definition of metadata
- Communicating the relationship between successful governance and metadata
- Getting your business community to embrace the need for metadata
- Determining the metadata that will provide the most bang for your buck
- The importance of Metadata Management to becoming data-centric
Enterprise Architecture vs. Data Architecture (DATAVERSITY)
Enterprise Architecture (EA) provides a visual blueprint of the organization, and shows key interrelationships between data, process, applications, and more. By abstracting these assets in a graphical view, it’s possible to see key interrelationships, particularly as they relate to data and its business impact across the organization. Join us for a discussion on how data architecture is a key component of an overall enterprise architecture for enhanced business value and success.
White Paper - Data Warehouse Documentation Roadmap (David Walker)
All projects need documentation and many companies provide templates as part of a methodology. This document describes the templates, tools and source documents used by Data Management & Warehousing. It serves two purposes:
• For projects using other methodologies or creating their own set of documents to use as a checklist. This allows the project to ensure that the documentation covers the essential areas for describing the data warehouse.
• To demonstrate our approach to our clients by describing the templates and deliverables that are produced.
Documentation, methodologies and templates are inherently both incomplete and flexible. Projects may wish to add, change, remove or ignore any part of any document. Some may also believe that aspects of one document would sit better in another. If this is the case then users of this document and these templates are encouraged to change them to fit their needs.
Data Management & Warehousing believes that the approach or methodology for building a data warehouse should be to use a series of guides and checklists. This ensures that small teams of relatively skilled resources developing the system can cover all aspects of the project whilst being free to deal with the specific issues of their environment to deliver exceptional solutions, rather than a rigid methodology that ensures that large teams of relatively unskilled staff can meet a minimum standard.
Review existing data management maturity models to identify core set of characteristics of an effective data maturity model:
DMBOK (Data Management Body of Knowledge) from DAMA (Data Management Association)
MIKE2.0 (Method for an Integrated Knowledge Environment) Information Maturity Model (IMM)
IBM Data Governance Council Maturity Model
Enterprise Data Management Council Data Management Maturity Model
Gathering and Documenting Your BI Business Requirements (Wynyard Group)
Business requirements are critical to any project. Recent studies show that 70% of organisations fail to gather business requirements well. Worse, poor requirements can lead a project to overspend its original budget by 95%.
Business Intelligence and Performance Management projects are no different. This session will provide a series of tips, techniques and ideas on how you can discover, analyse, understand and document your business requirements for your BI and PM projects. This session will also touch on specific issues, hurdles and obstacles that occur on a typical BI or PM project.
• The importance of business requirements and a well defined business requirements process
• Understanding the difference between a “wish-list” or vision and business requirements
• The need and benefits of having a business traceability matrix
Start your BI projects on the right foot – understand your requirements
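The business traceability matrix mentioned above can be sketched in a few lines of code. This is a minimal illustration only; the requirement and report identifiers are invented for the example, not taken from the session:

```python
# Minimal sketch of a business traceability matrix: each business
# requirement is traced to the BI deliverables that satisfy it, so
# that untraced requirements (and unjustified reports) are easy to
# spot. All IDs and names here are hypothetical examples.

requirements = {
    "BR-01": "Monthly sales by region",
    "BR-02": "Customer churn rate trend",
    "BR-03": "Supplier on-time delivery %",
}

# Each deliverable lists the requirement IDs it traces back to.
deliverables = {
    "RPT-SALES": ["BR-01"],
    "DASH-CHURN": ["BR-02"],
}

def untraced_requirements(requirements, deliverables):
    """Return requirement IDs not covered by any deliverable."""
    covered = {br for brs in deliverables.values() for br in brs}
    return sorted(set(requirements) - covered)

print(untraced_requirements(requirements, deliverables))  # ['BR-03']
```

Even a simple matrix like this makes the gap between a “wish-list” and agreed, traced requirements visible: BR-03 has no deliverable, so it is either descoped explicitly or a report is planned for it.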
You Need a Data Catalog. Do You Know Why? (Precisely)
The data catalog has become a popular discussion topic within data management and data governance circles. “What is it?” and “Do I need one?” are two common questions, along with “How does a catalog relate to and support the data governance program?”
The data catalog plays a key role in the governance process: how well information can be managed, aligned to business objectives, and monetized depends in great part on what you know about your data.
In this webinar you will learn about:
- The role of the data catalog
- What kinds of information should be in your data catalog
- Those catalog items that can be harvested systemically versus those that require stewardship involvement
- The role of the catalog in your data quality program
We hope you’ll join this on-demand webinar and learn how a data catalog should be part of your governance and data quality program!
Data Modeling, Data Governance, & Data Quality (DATAVERSITY)
Data Governance is often referred to as the people, processes, and policies around data and information, and these aspects are critical to the success of any data governance implementation. But just as critical is the technical infrastructure that supports the diverse data environments that run the business. Data models can be the critical link between business definitions and rules and the technical data systems that support them. Without the valuable metadata these models provide, data governance often lacks the “teeth” to be applied in operational and reporting systems.
Join Donna Burbank and her guest, Nigel Turner, as they discuss how data models & metadata-driven data governance can be applied in your organization in order to achieve improved data quality.
The data architecture of solutions is frequently not given the attention it deserves or needs: too little attention is paid to designing and specifying the data architecture within individual solutions and their constituent components. This is due to the behaviours of both solution architects and data architects.
Solution architecture tends to concern itself with the functional, technology and software components of the solution, and frequently omits the detail of the data aspects, leaving a solution data architecture gap.
Data architecture, for its part, tends not to get involved with the data aspects of technology solutions, leaving a data architecture gap. Together, these gaps result in a data blind spot for the organisation.
Data architecture also tends to concern itself with the landscape beyond individual solutions. It needs to shift left into the domain of solutions and their data, engaging more actively with the data dimensions of individual solutions. Data architecture can take the lead in closing these gaps through a shift-left of its scope and activities, as well as by providing standards and common data tooling for solution data architecture.
The objective of data design for solutions is the same as that for overall solution design:
• To capture sufficient information to enable the solution design to be implemented
• To unambiguously define the data requirements of the solution and to confirm and agree those requirements with the target solution consumers
• To ensure that the implemented solution meets the requirements of the solution consumers and that no deviations have taken place during the solution implementation journey
Solution data architecture avoids problems with solution operation and use:
• Poor and inconsistent data quality
• Poor performance, throughput, response times and scalability
• Poorly designed data structures that cause long data update times and long response times, harming solution usability and productivity and leading to transaction abandonment
• Poor reporting and analysis
• Poor data integration
• Poor solution serviceability and maintainability
• Manual workarounds for data integration, data extract for reporting and analysis
Data-design-related solution problems frequently become evident only after the solution goes live, so the benefits of solution data architecture are not always evident at the outset.
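One way to pursue the objectives listed above, capturing and unambiguously defining a solution’s data requirements, is to record them in a structured, checkable form. The sketch below is an assumption-laden illustration (the field names and the `Customer` example are invented, not a standard), showing how an entity specification can double as a simple validation rule:

```python
from dataclasses import dataclass, field

@dataclass
class AttributeSpec:
    """One data attribute required by a solution entity."""
    name: str
    data_type: str
    nullable: bool = False
    description: str = ""

@dataclass
class EntitySpec:
    """The agreed data requirements for one entity in a solution."""
    name: str
    attributes: list = field(default_factory=list)

    def validate_record(self, record: dict) -> list:
        """Return a list of rule violations for one record."""
        problems = []
        for attr in self.attributes:
            if record.get(attr.name) is None and not attr.nullable:
                problems.append(f"{attr.name}: missing required value")
        return problems

# Hypothetical example entity.
customer = EntitySpec("Customer", [
    AttributeSpec("customer_id", "string", description="Unique identifier"),
    AttributeSpec("email", "string", nullable=True),
])

print(customer.validate_record({"email": "a@example.com"}))
# ['customer_id: missing required value']
```

Because the specification is executable, it can be used both to confirm requirements with solution consumers and to check, after go-live, that the implemented solution has not deviated from them.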
Master Data Management - Aligning Data, Process, and Governance (DATAVERSITY)
Master Data Management (MDM) can provide significant value to the organization in creating consistent key data assets such as Customer, Product, Supplier, Patient, and the list goes on. But getting MDM “right” requires a strategic mix of Data Architecture, business process, and Data Governance. Join this webinar to learn how to find the “sweet spot” between technology, design, process, and people for your MDM initiative.
DAS Slides: Building a Data Strategy - Practical Steps for Aligning with Busi... (DATAVERSITY)
Developing a Data Strategy for your organization can seem like a daunting task. The opportunity in getting it right can be significant, however, as data drives many of the key initiatives in today’s marketplace: digital transformation, marketing, customer centricity, and more. This webinar will help de-mystify Data Strategy and Data Architecture and will provide concrete, practical ways to get started.
Strategic Business Requirements for Master Data Management Systems (Boris Otto)
This presentation describes strategic business requirements of master data management (MDM) systems. The requirements were developed in a consortium research approach by the Institute of Information Management at the University of St. Gallen, Switzerland, and 20 multinational enterprises.
The presentation was given at the 17th Americas Conference on Information Systems (AMCIS 2011) in Detroit, MI.
The research paper on which this presentation is based can be found here: http://www.alexandria.unisg.ch/Publikationen/Zitation/Boris_Otto/177697
Effectively Planning for an Enterprise-Scale CMDB Implementation (Antonio Rolle)
Provides a review of why a CMDB is essential to, and the foundation of, your BSM strategy. Also outlines the known challenges that require planning at the outset of a CMDB initiative, and includes a case study detailing the approach and lessons learned in the initial stages of a CMDB rollout for one of the largest financial institutions in North America.
03. Business Information Requirements TemplateAlan D. Duncan
A template for the clear and unambiguous definition of business data and information requirements. (cf. “Business Requirements Document”, “Functional Specification” or similar from standard SDLC processes). As such, the contents will typically form the basis for population and publication of a business glossary of information terms.
Metadata is hotter than ever, according to a number of recent DATAVERSITY surveys. More and more organizations are realizing that in order to drive business value from data, robust metadata is needed to gain the necessary context and lineage around key data assets. At the same time, industry regulations are driving the need for better transparency and understanding of information.
While metadata has been managed for decades, new strategies & approaches have been developed to support the ever-evolving data landscape, and provide more innovative ways to drive business value from metadata. This webinar will provide an overview of metadata strategies & technologies available to today’s organization, and provide insights into building successful business strategies for metadata adoption & use.
Data Governance Best Practices, Assessments, and RoadmapsDATAVERSITY
When starting or evaluating the present state of your Data Governance program, it is important to focus on best practices such that you don’t take a ready, fire, aim approach. Best practices need to be practical and doable to be selected for your organization, and the program must be at risk if the best practice is not achieved.
Join Bob Seiner for an important webinar focused on industry best practice around standing up formal Data Governance. Learn how to assess your organization against the practices and deliver an effective roadmap based on the results of conducting the assessment.
In this webinar, Bob will focus on:
- Criteria to select the appropriate best practices for your organization
- How to define the best practices for ultimate impact
- Assessing against selected best practices
- Focusing the recommendations on program success
- Delivering a roadmap for your Data Governance program
Data Lakehouse, Data Mesh, and Data Fabric (r1)James Serra
So many buzzwords of late: Data Lakehouse, Data Mesh, and Data Fabric. What do all these terms mean and how do they compare to a data warehouse? In this session I’ll cover all of them in detail and compare the pros and cons of each. I’ll include use cases so you can see what approach will work best for your big data needs.
Example data specifications and info requirements framework OVERVIEWAlan D. Duncan
This example framework offers a set of outline principles, standards and guidelines to describe and clarify the semantic meaning of data terms in support of an Information Requirements Management process.
It provides template guidance to Information Management, Data Governance and Business Intelligence practitioners for such circumstances that need clear, unambiguous and reliable understanding of the context, semantic meaning and intended usages for data.
Data Governance and Metadata ManagementDATAVERSITY
Metadata is a tool that improves data understanding, builds end-user confidence, and improves the return on investment in every asset associated with becoming a data-centric organization. Metadata’s use has expanded beyond “data about data” to cover every phase of data analytics, protection, and quality improvement. Data Governance and metadata are connected at the hip in every way possible. As the song goes, “You can’t have one without the other.”
In this RWDG webinar, Bob Seiner will provide a way to renew your energy by focusing on the valuable asset that can make or break your Data Governance program’s success. The truth is metadata is already inherent in your data environment, and it can be leveraged by making it available to all levels of the organization. At issue is finding the most appropriate ways to leverage and share metadata to improve data value and protection.
Throughout this webinar, Bob will share information about:
- Delivering an improved definition of metadata
- Communicating the relationship between successful governance and metadata
- Getting your business community to embrace the need for metadata
- Determining the metadata that will provide the most bang for your bucks
- The importance of Metadata Management to becoming data-centric
Enterprise Architecture vs. Data ArchitectureDATAVERSITY
Enterprise Architecture (EA) provides a visual blueprint of the organization, and shows key interrelationships between data, process, applications, and more. By abstracting these assets in a graphical view, it’s possible to see key interrelationships, particularly as they relate to data and its business impact across the organization. Join us for a discussion on how data architecture is a key component of an overall enterprise architecture for enhanced business value and success.
White Paper - Data Warehouse Documentation RoadmapDavid Walker
All projects need documentation and many companies provide templates as part of a methodology. This document describes the templates, tools and source documents used by Data Management & Warehousing. It serves two purposes:
• For projects using other methodologies or creating their own set of documents to use as a checklist. This allows the project to ensure that the documentation covers the essential areas for describing the data warehouse.
• To demonstrate our approach to our clients by describing the templates and deliverables that are produced.
Documentation, methodologies and templates are inherently both incomplete and flexible. Projects may wish to add, change, remove or ignore any part of any document. Some may also believe that aspects of one document would sit better in another. If this is the case then users of this document and these templates are encouraged to change them to fit their needs.
Data Management & Warehousing believes that the approach or methodology for building a data warehouse should be to use a series of guides and checklists. This ensures that small teams of relatively skilled resources developing the system can cover all aspects of the project whilst being free to deal with the specific issues of their environment to deliver exceptional solutions, rather than a rigid methodology that ensures that large teams of relatively unskilled staff can meet a minimum standard.
Review existing data management maturity models to identify core set of characteristics of an effective data maturity model:
DMBOK (Data Management Book of Knowledge) from DAMA (Data Management Association)
MIKE2.0 (Method for an Integrated Knowledge Environment) Information Maturity Model (IMM)
IBM Data Governance Council Maturity Model
Enterprise Data Management Council Data Management Maturity Model
Gathering And Documenting Your Bi Business RequirementsWynyard Group
Business requirements are critical to any project. Recent studies show that 70% of organisations fail to gather business requirements well. What is worse is that poor requirements can lead a project to over spend its original budget by 95%.
Business Intelligence and Performance Management projects are no different. This session will provide a series of tips, techniques and ideas on how you can discover, analyse, understand and document your business requirements for your BI and PM projects. This session will also touch on specific issues, hurdles and obstacle that occur for a typical BI or PM project
• The importance of business requirements and a well defined business requirements process
• Understanding the difference between a “wish-list” or vision and business requirements
• The need and benefits of having a business traceability matrix
Start your BI projects on the right foot – understand your requirements
You Need a Data Catalog. Do You Know Why?Precisely
Data catalog has become a more popular discussion topic within data management and data governance circles. “What is it?” and “Do I need one?” are two common questions; along with “How does a catalog relate to and support the data governance program?”
The data catalog plays a key role in the governance process; How well information can be managed, aligned to business objectives and monetized depends in great part to what you know about your data.
In this webinar you will learn about:
- The role of the data catalog
- What kinds of information should be in your data catalog
- Those catalog items that can be harvested systemically versus those that require stewardship involvement
- The role of the catalog in your data quality program
We hope you’ll join this on-demand webinar and learn how a data catalog should be part of your governance and data quality program!
Data Modeling, Data Governance, & Data QualityDATAVERSITY
Data Governance is often referred to as the people, processes, and policies around data and information, and these aspects are critical to the success of any data governance implementation. But just as critical is the technical infrastructure that supports the diverse data environments that run the business. Data models can be the critical link between business definitions and rules and the technical data systems that support them. Without the valuable metadata these models provide, data governance often lacks the “teeth” to be applied in operational and reporting systems.
Join Donna Burbank and her guest, Nigel Turner, as they discuss how data models & metadata-driven data governance can be applied in your organization in order to achieve improved data quality.
The data architecture of solutions is frequently not given the attention it deserves or needs. Frequently, too little attention is paid to designing and specifying the data architecture within individual solutions and their constituent components. This is due to the behaviours of both solution architects ad data architects.
Solution architecture tends to concern itself with functional, technology and software components of the solution
Data architecture tends not to get involved with the data aspects of technology solutions, leaving a data architecture gap. Combined with the gap where data architecture tends not to get involved with the data aspects of technology solutions, there is also frequently a solution architecture data gap. Solution architecture also frequently omits the detail of data aspects of solutions leading to a solution data architecture gap. These gaps result in a data blind spot for the organisation.
Data architecture tends to concern itself with post-individual solutions. Data architecture needs to shift left into the domain of solutions and their data and more actively engage with the data dimensions of individual solutions. Data architecture can provide the lead in sealing these data gaps through a shift-left of its scope and activities as well providing standards and common data tooling for solution data architecture
The objective of data design for solutions is the same as that for overall solution design:
• To capture sufficient information to enable the solution design to be implemented
• To unambiguously define the data requirements of the solution and to confirm and agree those requirements with the target solution consumers
• To ensure that the implemented solution meets the requirements of the solution consumers and that no deviations have taken place during the solution implementation journey
Solution data architecture avoids problems with solution operation and use:
• Poor and inconsistent data quality
• Poor performance, throughput, response times and scalability
• Poorly designed data structures can lead to long data update times leading to long response times, affecting solution usability, loss of productivity and transaction abandonment
• Poor reporting and analysis
• Poor data integration
• Poor solution serviceability and maintainability
• Manual workarounds for data integration, data extract for reporting and analysis
Data-design-related solution problems frequently become evident and manifest themselves only after the solution goes live. The benefits of solution data architecture are not always evident initially.
Master Data Management - Aligning Data, Process, and GovernanceDATAVERSITY
Master Data Management (MDM) can provide significant value to the organization in creating consistent key data assets such as Customer, Product, Supplier, Patient, and the list goes on. But getting MDM “right” requires a strategic mix of Data Architecture, business process, and Data Governance. Join this webinar to learn how to find the “sweet spot” between technology, design, process, and people for your MDM initiative.
DAS Slides: Building a Data Strategy - Practical Steps for Aligning with Busi...DATAVERSITY
Developing a Data Strategy for your organization can seem like a daunting task. The opportunity in getting it right can be significant, however, as data drives many of the key initiatives in today’s marketplace: digital transformation, marketing, customer centricity, and more. This webinar will help de-mystify Data Strategy and Data Architecture and will provide concrete, practical ways to get started.
Strategic Business Requirements for Master Data Management SystemsBoris Otto
This presentation describes strategic business requirements of master data management (MDM) systems. The requirements were developed in a consortium research approach by the Institute of Information Management at the University of St. Gallen, Switzerland, and 20 multinational enterprises.
The presentation was given at the 17th Americas Conference on Information Systems (AMCIS 2011) in Detroit, MI.
The research paper on which this presentation is based can be found here: http://www.alexandria.unisg.ch/Publikationen/Zitation/Boris_Otto/177697
Effectively Planning for an Enterprise-Scale CMDB ImplementationAntonio Rolle
Provides a review of why a CMDB is essential to, and the foundation of, your BSM strategy. I also outline the known challenges that require planning at the outset of a CMDB initiative. Includes a case study which details the approach and lessons learned in the initial stages of a CMDB rollout for one of the largest financial institutions in North America.
NormalizationNORM.docxdrennanmicah
Normalization
Charles Williams
CS352 Unit 3 IP
Professor Jeffery Karlberg
1/26/2019
Table of Contents
The Database Models, Languages, and Architecture
Database System Development Life Cycle
Database Management Systems
Advanced SQL
Web and Data Warehousing and Mining in the Business World
References
Database management
It is important that a formal design methodology is used, as it provides a mathematical approach to producing a reliable database that consolidates all the environments that use it. A design methodology helps because it allows the whole design and development effort to be carried out with minimum errors. It also helps in identifying the requirements, specifications, and design levels of the database and data warehouse under development. The planning stage of the consolidated database is very important, as it involves drawing up the plans that will guide the development of the database (Mabogunje, 2015). These plans help in managing quality, time, risks, and other related issues that might affect the design and development of the database and, eventually, the data warehouse.
The three layers of the 3-level ANSI-SPARC architecture are: the physical schema, which defines how data is stored; the conceptual schema, which indexes and relates the data; and the external schema, which defines how information is presented. The architecture is designed to guard and guide data change. The primary function of the first layer is to define how data is stored; the physical schema can change without affecting how external applications interact with the stored data (Pokorný, 2018). The second layer's primary function is to provide a consolidated view of the database. The third layer's primary function is to define richer APIs, which it can do without changing the underlying storage mechanisms. The 3-level ANSI-SPARC architecture thus promotes data independence, which saves time in the long run through the conceptual schema's emphasis on data mapping.
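As a concrete sketch of this separation (table, column, and value names are hypothetical, not from the text), SQLite can stand in for all three levels: a table plays the conceptual schema, a view plays an external schema, and an index represents a physical-level choice that leaves both of the others untouched.

```python
import sqlite3

# Illustrative sketch of the 3-level ANSI-SPARC separation using SQLite.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Conceptual schema: the consolidated, application-neutral definition.
cur.execute("CREATE TABLE employee (id INTEGER PRIMARY KEY, name TEXT, salary REAL)")
cur.execute("INSERT INTO employee VALUES (1, 'Ada', 90000), (2, 'Grace', 95000)")

# External schema: one user community's presentation of the data;
# it exposes names but hides salaries.
cur.execute("CREATE VIEW employee_directory AS SELECT id, name FROM employee")

# Physical schema: a storage/access-path decision. Adding this index
# changes nothing for consumers of the view or the table definition,
# which is the point of physical data independence.
cur.execute("CREATE INDEX idx_employee_name ON employee(name)")

rows = cur.execute("SELECT name FROM employee_directory ORDER BY id").fetchall()
print(rows)  # [('Ada',), ('Grace',)]
```

The external view keeps working whether or not the index exists, mirroring the claim above that physical changes do not ripple up to applications.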
Data administrator and database administrator
A data administrator is an individual whose function is to gather data requirements, analyze data, design data, and classify data types. The two primary roles of a data administrator are: establishing the data standards to be applied in databases, and setting the policies that govern data security, data access, data usage, and data flow, as well as data authorization in an organization. Other minor duties of a data administrator include playing an assisting role by developing data resources and enabling the sharing of data across applications.
Why BI ?
Performance management
Identify trends
Cash flow trend
Fine-tune operations
Sales pipeline analysis
Future projections
Business forecasting
Decision Making Tools
Convert data into information
How to Think ?
What happened?
What is happening?
Why did it happen?
What will happen?
What do I want to happen?
Complexities of Separating Data in an ERP Environmenteprentise
In an Enterprise Resource Planning (ERP) environment, multiple organizations can exist within a single instance. How does the data belonging to these organizations co-exist, and what are the challenges that companies face when they have to separate the data for business reasons? With a focus on Oracle E-Business Suite (EBS), our speaker, Anil Kukreja, Chief Technology Officer of eprentise and Managing Director of eprentise India, will explore the best ways to address complexities in ERP environments and achieve success when separating data in this session.
Learning Objectives: After completion of this program you will be able to:
• Objective 1: Understand how data for multiple organizations reside in a single ERP environment.
• Objective 2: Understand the complexities involved in separating data for organization(s) in an ERP environment.
• Objective 3: Achieve success in separating data for organization(s) to meet business objectives.
Brighttalk converged infrastructure and it operations management - finalAndrew White
How Converged Infrastructure Will Change IT Operations Management
Over the past decade, Enterprises have leveraged a shared service model to make IT more cost effective. The emergence of “Converged Infrastructure” and “Fabric-Based Infrastructure” will allow IT to offer purpose driven solutions rather than the function driven solutions of the past. To do this, IT will need to evolve towards more modular designs, rely more on open standards, and rethink their approach to management frameworks.
In this session you will learn:
How converged infrastructure is used to create purpose driven solutions
Why new operational challenges arise as this approach is adopted broadly
What changes need to occur to succeed with this new paradigm
Age of Exploration: How to Achieve Enterprise-Wide DiscoveryInside Analysis
The Briefing Room with Dr. Robin Bloor and IBM Information Management
Live Webcast Nov. 19, 2013
Watch the archive: https://bloorgroup.webex.com/bloorgroup/lsr.php?AT=pb&SP=EC&rID=7808847&rKey=73cc8052da2d9962
The bigger data volumes get, the wider the range of sources available, the more companies need to secure a strategic view of their information assets. This is no small challenge for all kinds of reasons, not the least of which is access to the growing array of valuable data sets available. Today's most innovative companies are using creative solutions to ride the information wave.
Register for this episode of The Briefing Room to hear veteran Analyst Dr. Robin Bloor, as he explains how the unbridled growth of data and information systems requires a holistic approach to information access. He will be briefed by Mark Myers and Scott Parker of IBM, who will showcase the company’s InfoSphere Data Explorer product, a solution aimed squarely at the need to gain a cohesive view of enterprise data, wherever it may be. Myers and Parker will discuss how Data Explorer can help organizations to get more from their SharePoint investments, enabling them to deliver information to front-line employees regardless of where it is managed.
Visit InsideAnalysis.com for more information
Want to learn more about Activity Based Costing or IT Delivery Services Transparency? Would you like to better communicate your Technology Services Catalog or articulate proper chargebacks to the Business Stakeholders?
The business demands best-in-class solutions on secure and reliable infrastructure with no downtime, and now the CIO can provide a portal view with comprehensive Business Unit Dashboards and custom Data Analytics reporting.
Integrating SIS’s with Salesforce: An Accidental Integrator’s GuideSalesforce.org
Join our next Success webinar, Integrating Student Information Systems with Salesforce: Strategies and Best Practices, to explore the many ways system integration benefits your school. Whether you want an aggregated view of your students, the ability to trigger actions based on status changes, or the automation of manual work, you will learn the three simple steps to successful integration. By highlighting how higher education institutions have integrated with the most popular Student Information Systems, Grant Miller, director of Alliances and Jill Kenney, Director of Sales Engineering at the Salesforce Foundation, will explain the layers of integration and discuss considerations like synchronous-versus-asynchronous and buy-versus-build options.
Don’t Be a Black Box. Do your end users go around you to get IT services? Learn how you can become a service provider instead of “black box IT”, and display how your technology investments map to the business’s priorities.
New Datacenter and Cloud Services. What portion of your datacenter investment are you leveraging? Learn about new Microsoft cloud services that align and extend existing datacenter products you likely already have installed.
Mobile Device Management, Identity, and Data Control.
Are you prepared for the move of users to mobile devices, SaaS services, and data that can live anywhere? Learn how these challenges can be addressed through comprehensive mobile experience management technologies.
Better Platforms for Business Productivity. Are your business processes engaging and efficient, or clumsy and disconnected? Learn how to provide a new way to interact as a business and manage your customer’s needs.
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024Albert Hoitingh
In this session I delve into the encryption technology used in Microsoft 365 and Microsoft Purview. Including the concepts of Customer Key and Double Key Encryption.
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo...James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered QualityInflectra
In this insightful webinar, Inflectra explores how artificial intelligence (AI) is transforming software development and testing. Discover how AI-powered tools are revolutionizing every stage of the software development lifecycle (SDLC), from design and prototyping to testing, deployment, and monitoring.
Learn about:
• The Future of Testing: How AI is shifting testing towards verification, analysis, and higher-level skills, while reducing repetitive tasks.
• Test Automation: How AI-powered test case generation, optimization, and self-healing tests are making testing more efficient and effective.
• Visual Testing: Explore the emerging capabilities of AI in visual testing and how it's set to revolutionize UI verification.
• Inflectra's AI Solutions: See demonstrations of Inflectra's cutting-edge AI tools like the ChatGPT plugin and Azure Open AI platform, designed to streamline your testing process.
Whether you're a developer, tester, or QA professional, this webinar will give you valuable insights into how AI is shaping the future of software delivery.
Neuro-symbolic is not enough, we need neuro-*semantic*Frank van Harmelen
Neuro-symbolic (NeSy) AI is on the rise. However, simply machine learning on just any symbolic structure is not sufficient to really harvest the gains of NeSy. These will only be gained when the symbolic structures have an actual semantics. I give an operational definition of semantics as “predictable inference”.
All of this illustrated with link prediction over knowledge graphs, but the argument is general.
Accelerate your Kubernetes clusters with Varnish CachingThijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
DevOps and Testing slides at DASA ConnectKari Kakkonen
My and Rik Marselis's slides from the 30.5.2024 DASA Connect conference. We discuss what testing is, then what agile testing is, and finally what Testing in DevOps looks like. We also ran a lovely workshop in which participants explored different ways to think about quality and testing in different parts of the DevOps infinity loop.
UiPath Test Automation using UiPath Test Suite series, part 3DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 3. In this session, we will cover desktop automation along with UI automation.
Topics covered:
UI automation Introduction,
UI automation Sample
Desktop automation flow
Pradeep Chinnala, Senior Consultant Automation Developer @WonderBotz and UiPath MVP
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
1. CAPTURING DATA REQUIREMENTS Best Practices in requirements gathering for data rich projects. Presented by: Greg Bhatia (Chief Instructor, MCOM IT Training) Visit us on the web at www.mcomtraining.com
5. The Top down requirements process The Fabled Requirements Tree
6. The Top down requirements process In Reality: The Requirements Tree Stakeholder Request Functional Non Functional Constraints
7. The Top down requirements process Stakeholder Request Functional Non Functional Constraints Use Cases Accessibility Performance …… Software Specifications Software Specifications
8. The Top down requirements process Maintaining Traceability Stakeholder Request Functional Non Functional Constraints Use Cases Misc. NF Data
9. The Top down requirements process Maintaining Traceability Stakeholder Request Functional Non Functional Constraints Use Cases Misc. NF Data
10. The Top down requirements process Maintaining Traceability Stakeholder Request Functional Non Functional Constraints Use Cases Misc. NF Data 1:1 1:1 1:1 1:1 1:1 1:n n:1
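The cardinalities on the traceability slide (1:1, 1:n, n:1) amount to a simple graph from stakeholder requests down to use cases and data requirements. The sketch below, using hypothetical requirement IDs, shows one way to record those links and query downstream impact; it is an illustration of the idea, not a tool from the presentation.

```python
from collections import defaultdict

# Hypothetical traceability map: each requirement traces down to the
# requirements derived from it. A use case may trace to many data
# requirements (1:n), and one data requirement may support many use
# cases (n:1 seen from the use-case side).
trace = defaultdict(set)

def link(parent, child):
    """Record that `child` is derived from (traces to) `parent`."""
    trace[parent].add(child)

link("SR-1: Track customer orders", "UC-1: Place order")
link("UC-1: Place order", "DR-1: Order line items")
link("UC-1: Place order", "DR-2: Customer contact data")
link("UC-2: Amend order", "DR-1: Order line items")  # shared data requirement

def impacted_by(req):
    """Walk the tree to find everything downstream of a requirement."""
    out = set()
    for child in trace.get(req, ()):
        out.add(child)
        out |= impacted_by(child)
    return out

result = sorted(impacted_by("SR-1: Track customer orders"))
print(result)
```

Keeping the links explicit like this is what makes impact analysis cheap: when a stakeholder request changes, the affected use cases and data requirements fall out of a single traversal.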
18. Data, looking beyond the bits & bytes What is Data? The layman’s definition (1) Distinct pieces of information, usually formatted in a special way. All software is divided into two general categories: data and programs. Programs are collections of instructions for manipulating data. Strictly speaking, data is the plural of datum, a single piece of information. In practice, however, people use data as both the singular and plural form of the word. (2) The term data is often used to distinguish binary machine-readable information from textual human-readable information. For example, some applications make a distinction between data files (files that contain binary data) and text files (files that contain ASCII data). (3) In database management systems, data files are the files that store the database information, whereas other files, such as index files and data dictionaries, store administrative information, known as metadata.
19. Data, looking beyond the bits & bytes What is Data? The Business Analyst’s definition: A collection of facts that have been organized and categorized for a predetermined purpose, from which conclusions may be drawn.
21. Data, looking beyond the bits & bytes “Settle on the best plan, exploit the dynamic within, develop it without. Follow the advantage, and master opportunity.” Sun Tzu, The Art of War, 6th century BC
22. Data, looking beyond the bits & bytes Every data requirement discussion is an opportunity to enhance the BI capability (current or future) of the organization. There is never a better time to have BI-focused discussions than when requirements for the data capture systems are being reviewed.
26. Why Plan for BI? Why do organizations need Business Intelligence capabilities? Running a business costs money, so organizations need to maximize their revenue. This optimization can only be achieved through insight, and it is this insight that is the key benefit a BI solution delivers to an organization.
BI Components:
• Analysis: Data Warehouse, OLAP Cubes, Predictive Models
• Reporting: Ad-hoc reports, Balanced Scorecards, Dashboards
27. Why Plan for BI? You can plan ahead, or you can play catch-up. BI is the future, not an afterthought.
28. Why Plan for BI? It costs time and effort to plan for BI in every development initiative (this is an investment). It costs money to re-configure a system to make it “BI friendly” (this is a sunk cost), and you still don’t get the full benefit of a BI solution because of missing historical data.
31. Data discovery – Ask and ye shall receive… One of the biggest challenges facing IT organizations is pinpointing the location of critical data throughout the enterprise. A lot of time is wasted trying to identify data sources, owners, and data capture strategies; that time is better spent on Requirements Analysis. Delaying the data discovery phase only makes the problem worse.
35. Data discovery – Ask and ye shall receive… Asking the right questions of the right people is equally important.
Question: Is the data refreshed on a daily basis or a weekly basis? Qualified audience: Data owner, Data SME
Question: We need to build an automated ETL pipe from this data source into our application. What would be the cost and timelines? Qualified audience: Data owner
Question: What fields need to be captured in the data feed? Qualified audience: Business SME / Stakeholder / BI Champion
Question: At what grain should we capture the data? Qualified audience: BI Champion
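The question-to-audience pairs on this slide can be kept as a simple routing table so that each discovery question reaches someone qualified to answer it. The sketch below paraphrases the slide's questions; the structure and function names are illustrative.

```python
# Hypothetical routing table pairing each data discovery question with
# the audience qualified to answer it, mirroring the slide's pairs.
ROUTING = {
    "Is the data refreshed daily or weekly?": ["Data owner", "Data SME"],
    "What would an automated ETL feed cost, and on what timeline?": ["Data owner"],
    "What fields need to be captured in the data feed?": ["Business SME", "Stakeholder", "BI Champion"],
    "At what grain should we capture the data?": ["BI Champion"],
}

def audience_for(question):
    """Return who should answer a question, flagging unrouted ones."""
    return ROUTING.get(question, ["Unrouted: assign an owner before the workshop"])

print(audience_for("At what grain should we capture the data?"))  # ['BI Champion']
```

The fallback entry is the practical point: a question with no qualified audience attached is a discovery gap to close before requirements workshops begin.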
37. Thank You for Attending For all your Individual & Corporate Training Needs Visit us on the web at www.mcomtraining.com