This document discusses how to enable continuous delivery while maintaining proper segregation of duties for security and compliance. It begins by explaining continuous delivery and the need for segregation of duties. Typical enforcement of segregation of duties acts as a blocker to continuous delivery by restricting who can access or change environments. The document then recommends implementing segregation of duties in a continuous delivery friendly way through principles like involving security early, separating confidential and regular data, pre-approving standardized deployment bundles, and using multi-factor authentication and configuration management tools. This allows continuous delivery practices like frequent deployments and rapid troubleshooting while still preventing any single person from having end-to-end access or making unregulated changes.
Data Architecture Best Practices for Advanced Analytics – DATAVERSITY
Many organizations are immature when it comes to data and analytics use. The answer lies in delivering a greater level of insight from data, straight to the point of need.
There are many Data Architecture best practices today, accumulated from years of practice. In this webinar, William will look at some Data Architecture best practices that he believes have emerged in the past two years but are not yet worked into many enterprise data programs. These are keepers that organizations will need to move toward, by one means or another, so it’s best to work them into the environment mindfully.
You Need a Data Catalog. Do You Know Why? – Precisely
The data catalog has become a popular discussion topic within data management and data governance circles. A data catalog is a central repository that contains metadata for describing data sets, how they are defined, and where to find them. TDWI research indicates that implementing a data catalog is a top priority among organizations we survey. The data catalog can also play an important part in the governance process. It provides features that help ensure data quality, compliance, and that trusted data is used for analysis. Without an in-depth knowledge of data and associated metadata, organizations cannot truly safeguard and govern their data.
Join this on-demand webinar to learn more about the data catalog and its role in data governance efforts.
Topics include:
· Data management challenges and priorities
· The modern data catalog – what it is and why it is important
· The role of the modern data catalog in your data quality and governance programs
· The kinds of information that should be in your data catalog and why
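The catalog described above is easy to picture as a simple data structure. The sketch below is purely illustrative (the `CatalogEntry` fields and `register` helper are assumptions for this example, not any product's schema): a minimal central repository of metadata records describing data sets, their definitions, locations, and owners.

```python
from dataclasses import dataclass, field

# Illustrative only: field names are assumptions, not a standard catalog schema.
@dataclass
class CatalogEntry:
    """Minimal metadata record a data catalog might hold for one data set."""
    name: str        # business-friendly data set name
    definition: str  # what the data set represents
    location: str    # where to find it (e.g., table or path)
    owner: str       # accountable steward, for governance
    tags: list = field(default_factory=list)

# The central repository: one entry per described data set.
catalog = {}

def register(entry: CatalogEntry) -> None:
    """Add a data set description to the central repository."""
    catalog[entry.name] = entry

register(CatalogEntry(
    name="customer_master",
    definition="Golden record of active customers",
    location="warehouse.crm.customer_master",
    owner="data-governance-team",
    tags=["PII", "governed"],
))

print(catalog["customer_master"].location)  # → warehouse.crm.customer_master
```

Even this toy version shows why a catalog supports governance: the owner and tags fields make stewardship and compliance attributes queryable alongside the data's definition and location.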
Data Architecture is foundational to an information-based operational environment. Without proper structure and efficiency in organization, data assets cannot be utilized to their full potential, which in turn harms bottom-line business value. When designed well and used effectively, however, a strong Data Architecture can be referenced to inform, clarify, understand, and resolve aspects of a variety of business problems commonly encountered in organizations.
The goal of this webinar is not to instruct you in being an outright Data Architect, but rather to enable you to envision a number of uses for Data Architectures that will maximize your organization’s competitive advantage. With that being said, we will:
Discuss Data Architecture’s guiding principles and best practices
Demonstrate how to utilize Data Architecture to address a broad variety of organizational challenges and support your overall business strategy
Illustrate how best to understand foundational Data Architecture concepts based on “The DAMA Guide to the Data Management Body of Knowledge” (DAMA DMBOK)
Adopting a Process-Driven Approach to Master Data Management – Software AG
What is a lasting solution to the sea of errors, headaches, and losses caused by inconsistent and inaccurate master data such as customer and product records? This is the data that your business counts on to operate business processes and make decisions, but it is often incomplete or in conflict because it resides in multiple IT systems. Master Data Management (MDM) programs are the solution to this problem, yet they can fail without the investment and involvement of business managers.
Listen to Rob Karel, Forrester analyst, and Jignesh Shah from Software AG to learn about a new, process-driven approach to MDM and why it is a win-win for both business and IT managers.
Driving Data Intelligence in the Supply Chain Through the Data Catalog at TJX – DATAVERSITY
Roles and responsibilities are a critical component of every Data Governance program. Building a set of roles that are practical and that will not interfere with people’s “day jobs” is an important consideration that will influence how well your program is adopted. This tutorial focuses on sharing a proven model guaranteed to represent your organization.
Join Bob Seiner for this lively webinar where he will dissect a complete Operating Model of Roles and Responsibilities that encompasses all levels of the organization. Seiner will detail the roles and describe the most effective way to associate people with the roles. You will walk out of this webinar with a model to apply to your organization.
In this session Bob will share:
- The five levels of Data Governance roles
- A proven Operating Model of Roles and Responsibilities
- How to customize the model to meet your requirements
- Setting appropriate role expectations
- How to operationalize the roles and demonstrate value
DAS Slides: Data Governance and Data Architecture – Alignment and Synergies – DATAVERSITY
Data Governance can have a varied definition, depending on the audience. To many, Data Governance consists of committee meetings and stewardship roles. To others, it focuses on technical Data Management and controls. Holistic Data Governance combines both of these aspects, and a robust Data Architecture and associated diagrams can be the “glue” that binds business and IT governance together. Join this webinar for practical tips and hands-on exercises for aligning Data Architecture and Data Governance for business and IT success.
Enterprise Data Management Framework Overview – John Bao Vuu
Today’s organizations need a solid data management foundation to support big data analytics and, more importantly, a data-driven culture.
A mature Data Management Program can reduce operational costs and enable rapid business growth and development. A Data Management program must evolve to monetize data assets, deliver breakthrough innovation, and help drive business strategies in new markets.
This overview describes what Enterprise Data Architecture in a software development organization should cover, listing over 200 data-architecture-related deliverables an Enterprise Data Architect should remember to evangelize.
Data Governance Program Powerpoint Presentation Slides – SlideTeam
"You can download this product from SlideTeam.net"
Showcase data management practices with our content-ready Data Governance Program PowerPoint Presentation Slides. Establish processes to ensure effective data management using Information Architecture PPT slides. Data management can enable better planning, minimize rework, and optimize staff effectiveness. The information technology governance PowerPoint complete deck contains ready-to-use slides such as the need for data governance, why companies suffer from poor data governance, automated data governance, framework, roles and responsibilities, ways to establish a data management process, improvement roadmap, etc. The easy-to-understand data migration program PPT visuals are fully editable: you can modify color, text, and font size. It has relevant templates to cater to your business needs. Utilize visually appealing information architecture PowerPoint templates to define a set of data management procedures and plans. Data governance is a quality control discipline that assesses data accuracy, consistency, and timeliness. Download this professionally designed information governance program presentation deck to manage, improve, monitor, and protect organizational information. Clientele grows with our Data Governance Program PowerPoint Presentation Slides. Give a big boost to your customer base. https://bit.ly/3EViOt3
Data Catalog for Better Data Discovery and Governance – Denodo
Watch full webinar here: https://buff.ly/2Vq9FR0
Data catalogs are en vogue, answering critical data governance questions like “Where does all my data reside?” “What other entities are associated with my data?” “What are the definitions of the data fields?” and “Who accesses the data?” Data catalogs maintain the necessary business metadata to answer these questions and many more. But that’s not enough: to be useful, data catalogs need to deliver these answers to business users right within the applications they use.
In this session, you will learn:
*How data catalogs enable enterprise-wide data governance regimes
*What key capability requirements you should expect in data catalogs
*How data virtualization combines dynamic data catalogs with delivery
The data architecture of solutions is frequently not given the attention it deserves or needs: too little attention is paid to designing and specifying the data architecture within individual solutions and their constituent components. This is due to the behaviours of both solution architects and data architects.
Solution architecture tends to concern itself with the functional, technology, and software components of the solution.
Data architecture tends not to get involved with the data aspects of technology solutions, leaving a data architecture gap. Solution architecture, in turn, frequently omits the detail of the data aspects of solutions, leaving a solution data architecture gap. Together, these gaps create a data blind spot for the organisation.
Data architecture also tends to concern itself with what happens after individual solutions are delivered. It needs to shift left into the domain of solutions and their data, engaging more actively with the data dimensions of individual solutions. Data architecture can take the lead in sealing these gaps by shifting its scope and activities left, as well as by providing standards and common data tooling for solution data architecture.
The objective of data design for solutions is the same as that for overall solution design:
• To capture sufficient information to enable the solution design to be implemented
• To unambiguously define the data requirements of the solution and to confirm and agree those requirements with the target solution consumers
• To ensure that the implemented solution meets the requirements of the solution consumers and that no deviations have taken place during the solution implementation journey
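One lightweight way to pursue these objectives is to record a solution's data requirements explicitly so the implemented schema can be checked against them. The sketch below is a hypothetical illustration (the `REQUIRED_FIELDS` names and the `find_deviations` helper are invented for this example); it shows how an unambiguous requirements definition makes deviations detectable rather than discovered after go-live.

```python
# Hypothetical sketch: the agreed data requirements, captured explicitly
# as field name -> expected type.
REQUIRED_FIELDS = {
    "order_id": "int",
    "customer_id": "int",
    "order_date": "date",
}

def find_deviations(implemented: dict) -> list:
    """Return the requirements the implemented schema fails to meet."""
    problems = []
    for name, dtype in REQUIRED_FIELDS.items():
        if name not in implemented:
            problems.append(f"missing field: {name}")
        elif implemented[name] != dtype:
            problems.append(f"type mismatch on {name}: {implemented[name]} != {dtype}")
    return problems

# An implementation that silently changed order_date to a string is flagged:
print(find_deviations({"order_id": "int", "customer_id": "int", "order_date": "str"}))
```

A check like this, run as part of the build, is one way to confirm that "no deviations have taken place during the solution implementation journey" without relying on manual review.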
Solution data architecture helps avoid problems with solution operation and use, such as:
• Poor and inconsistent data quality
• Poor performance, throughput, response times and scalability
• Poorly designed data structures that cause long data update and response times, harming solution usability and leading to lost productivity and transaction abandonment
• Poor reporting and analysis
• Poor data integration
• Poor solution serviceability and maintainability
• Manual workarounds for data integration, data extract for reporting and analysis
Data-design-related solution problems frequently become evident only after the solution goes live, which is why the benefits of solution data architecture are not always evident initially.
Data Architecture, Solution Architecture, Platform Architecture — What’s the ... – DATAVERSITY
A solid data architecture is critical to the success of any data initiative. But what is meant by “data architecture”? Throughout the industry, there are many different “flavors” of data architecture, each with its own unique value and use cases for describing key aspects of the data landscape. Join this webinar to demystify the various architecture styles and understand how they can add value to your organization.
Lessons in Data Modeling: Data Modeling & MDM – DATAVERSITY
Master Data Management (MDM) can create a 360 view of core business assets such as Customer, Product, Vendor, and more. Data modeling is a core component of MDM in both creating the technical integration between disparate systems and, perhaps more importantly, aligning business definitions & rules.
Join this webcast to learn how to effectively apply a data model in your MDM implementation.
Most Common Data Governance Challenges in the Digital Economy – Robyn Bollhorst
Today’s increasing emphasis on differentiation in the digital economy further complicates the data governance challenge. Learn about today’s common challenges and about the new adaptations that are required to support the digital era. Avoid the pitfalls and follow along on Johnson & Johnson’s journey to:
- Establish and scale a best-in-class enterprise data governance program
- Identify and focus on the most critical data and information to bolster incremental wins and garner executive support
- Ensure readiness for automation with SAP MDG on HANA
Data & Analytics ReInvent Recap [AWS Basel Meetup - Jan 2023].pdf – Chris Bingham
After recapping the key data & analytics announcements from AWS re:Invent 2022, we look a little deeper at three key new services:
• AWS DataZone
• AWS Omics
• AWS Clean Rooms
And follow up with a demo of using AWS IoT ExpressLink hardware in conjunction with AWS IoT Core, Lambda, and Amplify to build a Gatsby web app that interacts with the AWS IoT ExpressLink demo badge via a device shadow.
Because every organization produces and propagates data as part of its day-to-day operations, data trends figure more and more prominently in the mainstream business world’s consciousness. For many organizations in various industries, though, comprehension of this development begins and ends with buzzwords: “Big Data,” “NoSQL,” “Data Scientist,” and so on. Few realize that all solutions to their business problems, regardless of platform or relevant technology, rely to a critical extent on the data model supporting them. As such, data modeling is not an optional task for an organization’s data effort, but a vital activity that facilitates the solutions driving your business. Since quality engineering and architecture work products do not happen accidentally, the more your organization depends on automation, the more important the data models driving its engineering and architecture activities become. This webinar illustrates data modeling as a key activity upon which so much technology and business investment depends.
Specific learning objectives include:
- Understanding what types of challenges require data modeling to be part of the solution
- How automation requires standardization, which is derivable via data modeling techniques
- Why only a working partnership between data and the business can produce useful outcomes
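As a concrete illustration of models driving automation, the sketch below (the `MODEL` structure and `to_ddl` helper are invented for this example, not any tool's API) derives a table definition from a small declarative data model, so the standardized model rather than hand-written SQL is the source of truth.

```python
# Hypothetical sketch: a tiny declarative data model from which
# engineering artifacts (here, DDL) can be generated automatically.
MODEL = {
    "entity": "Customer",
    "attributes": [
        ("customer_id", "INTEGER", "PRIMARY KEY"),
        ("name", "VARCHAR(100)", "NOT NULL"),
        ("email", "VARCHAR(255)", "NOT NULL"),
    ],
}

def to_ddl(model: dict) -> str:
    """Derive a CREATE TABLE statement from the declarative model."""
    cols = ",\n  ".join(
        f"{name} {dtype} {constraint}".strip()
        for name, dtype, constraint in model["attributes"]
    )
    return f'CREATE TABLE {model["entity"]} (\n  {cols}\n);'

print(to_ddl(MODEL))
```

The point of the sketch is the dependency direction the webinar describes: when the model is explicit and standardized, downstream artifacts can be regenerated from it instead of drifting apart by hand.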
Data Governance Best Practices, Assessments, and Roadmaps – DATAVERSITY
When starting or evaluating the present state of your Data Governance program, it is important to focus on best practices so that you don’t take a ready-fire-aim approach. Best practices need to be practical and doable to be selected for your organization, and the program may be at risk if a selected best practice is not achieved.
Join Bob Seiner for an important webinar focused on industry best practice around standing up formal Data Governance. Learn how to assess your organization against the practices and deliver an effective roadmap based on the results of conducting the assessment.
In this webinar, Bob will focus on:
- Criteria to select the appropriate best practices for your organization
- How to define the best practices for ultimate impact
- Assessing against selected best practices
- Focusing the recommendations on program success
- Delivering a roadmap for your Data Governance program
The main aim of Data-Centric Architecture is to reduce the complexity of information systems by using shared data with clear meaning. But how can you trust your data? How do you know whether it is accurate and reliable?
In this webinar you'll learn how to quickly and easily improve your business using Snowflake and Matillion ETL for Snowflake. Webinar presented by Solution Architects Craig Collier (Snowflake) and Kalyan Arangam (Matillion).
In this webinar:
- Learn to optimize Snowflake and leverage Matillion ETL for Snowflake
- Discover tips and tricks to improve performance
- Get invaluable insights from data warehousing pros
Enterprise Data Governance Framework With Change Management – SlideTeam
“You can download this product from SlideTeam.net”
Presenting this set of slides with name Enterprise Data Governance Framework With Change Management. The topics discussed in these slides are Strategy, Organization, Management. This is a completely editable PowerPoint presentation and is available for immediate download. Download now and impress your audience. https://bit.ly/3b4VcEH
AWS offers a variety of data migration services and tools to help you easily and rapidly move everything from gigabytes to petabytes of data. We can provide guidance and methodologies to help you find the right service or tool to fit your requirements, and we share examples of customers who have used these options in their cloud journey.
Simplify Supplier Risk Management Across Your Procurement Processes - SID 51538 – SAP Ariba
Suffering from sporadic supplier due diligence and fragmented risk information? Getting burned from engaging with at-risk suppliers? You are not alone. Come learn how to simplify supplier risk management across your procurement processes. Industry experts will share their experience using the SAP Ariba Supplier Risk solution to help ensure focused risk due diligence during supplier selection, detect early warning signals, and proactively monitor and address risks for each supplier engagement.
DAS Slides: Data Governance and Data Architecture – Alignment and SynergiesDATAVERSITY
Data Governance can have a varied definition, depending on the audience. To many, Data Governance consists of committee meetings and stewardship roles. To others, it focuses on technical Data Management and controls. Holistic Data Governance combines both of these aspects, and a robust Data Architecture and associated diagrams can be the “glue” that binds business and IT governance together. Join this webinar for practical tips and hands-on exercises for aligning Data Architecture and Data Governance for business and IT success.
Enterprise Data Management Framework OverviewJohn Bao Vuu
A solid data management foundation to support big data analytics and more importantly a data-driven culture is necessary for today’s organizations.
A mature Data Management Program can reduce operational costs and enable rapid business growth and development. Data Management program must evolve to monetize data assets, deliver breakthrough innovation and help drive business strategies in new markets.
Describes what Enterprise Data Architecture in a Software Development Organization should cover and does that by listing over 200 data architecture related deliverables an Enterprise Data Architect should remember to evangelize.
Data Governance Program Powerpoint Presentation SlidesSlideTeam
"You can download this product from SlideTeam.net"
Showcase data management practices with our content ready Data Governance Program PowerPoint Presentation Slides. Establish processes to ensure effective data management using Information Architecture PPT slides. Data management can enable better planning, minimize rework, optimize staff effectiveness. The information technology governance PowerPoint complete deck contains ready to use slides such as the need for data governance, why companies suffer from data governance, automated data governance, framework, roles and responsibilities, ways to establish data management process, improvement roadmap, etc. The easy to understand data migration program PPT visuals are fully editable. You can modify, color, text, and font size. It has relevant templates to cater to your business needs. Utilize visually appealing information architecture PowerPoint templates to define a set of data management procedures and plans. Data governance is a quality control discipline, assess data accuracy consistency and timeliness. Download this professionally designed information governance program presentation deck to manage, improve, monitor and protect organizational information. Clientele grows with our Data Governance Program Powerpoint Presentation Slides. Give a big boost to your customer base. https://bit.ly/3EViOt3
Data Catalog for Better Data Discovery and GovernanceDenodo
Watch full webinar here: https://buff.ly/2Vq9FR0
Data catalogs are en vogue answering critical data governance questions like “Where all does my data reside?” “What other entities are associated with my data?” “What are the definitions of the data fields?” and “Who accesses the data?” Data catalogs maintain the necessary business metadata to answer these questions and many more. But that’s not enough. For it to be useful, data catalogs need to deliver these answers to the business users right within the applications they use.
In this session, you will learn:
*How data catalogs enable enterprise-wide data governance regimes
*What key capability requirements should you expect in data catalogs
*How data virtualization combines dynamic data catalogs with delivery
The data architecture of solutions is frequently not given the attention it deserves or needs. Frequently, too little attention is paid to designing and specifying the data architecture within individual solutions and their constituent components. This is due to the behaviours of both solution architects ad data architects.
Solution architecture tends to concern itself with functional, technology and software components of the solution
Data architecture tends not to get involved with the data aspects of technology solutions, leaving a data architecture gap. Combined with the gap where data architecture tends not to get involved with the data aspects of technology solutions, there is also frequently a solution architecture data gap. Solution architecture also frequently omits the detail of data aspects of solutions leading to a solution data architecture gap. These gaps result in a data blind spot for the organisation.
Data architecture tends to concern itself with post-individual solutions. Data architecture needs to shift left into the domain of solutions and their data and more actively engage with the data dimensions of individual solutions. Data architecture can provide the lead in sealing these data gaps through a shift-left of its scope and activities as well providing standards and common data tooling for solution data architecture
The objective of data design for solutions is the same as that for overall solution design:
• To capture sufficient information to enable the solution design to be implemented
• To unambiguously define the data requirements of the solution and to confirm and agree those requirements with the target solution consumers
• To ensure that the implemented solution meets the requirements of the solution consumers and that no deviations have taken place during the solution implementation journey
Solution data architecture avoids problems with solution operation and use:
• Poor and inconsistent data quality
• Poor performance, throughput, response times and scalability
• Poorly designed data structures can lead to long data update times leading to long response times, affecting solution usability, loss of productivity and transaction abandonment
• Poor reporting and analysis
• Poor data integration
• Poor solution serviceability and maintainability
• Manual workarounds for data integration, data extract for reporting and analysis
Data-design-related solution problems frequently become evident and manifest themselves only after the solution goes live. The benefits of solution data architecture are not always evident initially.
Data Architecture, Solution Architecture, Platform Architecture — What’s the ...DATAVERSITY
A solid data architecture is critical to the success of any data initiative. But what is meant by “data architecture”? Throughout the industry, there are many different “flavors” of data architecture, each with its own unique value and use cases for describing key aspects of the data landscape. Join this webinar to demystify the various architecture styles and understand how they can add value to your organization.
Lessons in Data Modeling: Data Modeling & MDMDATAVERSITY
Master Data Management (MDM) can create a 360 view of core business assets such as Customer, Product, Vendor, and more. Data modeling is a core component of MDM in both creating the technical integration between disparate systems and, perhaps more importantly, aligning business definitions & rules.
Join this webcast to learn how to effectively apply a data model in your MDM implementation.
Most Common Data Governance Challenges in the Digital EconomyRobyn Bollhorst
Todays’ increasing emphasis on differentiation in the digital economy further complicates the data governance challenge. Learn about today’s common challenges and about the new adaptations that are required to support the digital era. Avoid the pitfalls and follow along on Johnson & Johnson’s journey to:
- Establish and scale a best in class enterprise data governance program
- Identify and focus on the most critical data and information to bolster incremental wins and garner executive support
- Ensure readiness for automation with SAP MDG on HANA
Data & Analytics ReInvent Recap [AWS Basel Meetup - Jan 2023].pdfChris Bingham
After recapping the key data & analytics announcements from AWS re:Invent 2022, we look a little deeper at three key new services:
• AWS DataZone
• AWS Omics
• AWS Clean Rooms
And follow up with a demo of using AWS IoT ExpressLink hardware in conjunction with AWS IoT Core, Lambda, and Amplify to build a Gatsby web app that interacts with the AWS IoT ExpressLink demo badge via a device shadow.
Because every organization produces and propagates data as part of their day-to-day operations, data trends are becoming more and more important in the mainstream business world’s consciousness. For many organizations in various industries, though, comprehension of this development begins and ends with buzzwords: “Big Data,” “NoSQL,” “Data Scientist,” and so on. Few realize that all solutions to their business problems, regardless of platform or relevant technology, rely to a critical extent on the data model supporting them. As such, data modeling is not an optional task for an organization’s data effort, but rather a vital activity that facilitates the solutions driving your business. Since quality engineering/architecture work products do not happen accidentally, the more your organization depends on automation, the more important the data models driving the engineering and architecture activities of your organization. This webinar illustrates data modeling as a key activity upon which so much technology and business investment depends.
Specific learning objectives include:
- Understanding what types of challenges require data modeling to be part of the solution
- How automation requires the standardization derivable via data modeling techniques
- Why only a working partnership between data and the business can produce useful outcomes
Data Governance Best Practices, Assessments, and RoadmapsDATAVERSITY
When starting or evaluating the present state of your Data Governance program, it is important to focus on best practices so that you don’t take a ready, fire, aim approach. Best practices need to be practical and doable to be selected for your organization, and a practice should matter enough that the program is at risk if it is not achieved.
Join Bob Seiner for an important webinar focused on industry best practice around standing up formal Data Governance. Learn how to assess your organization against the practices and deliver an effective roadmap based on the results of conducting the assessment.
In this webinar, Bob will focus on:
- Criteria to select the appropriate best practices for your organization
- How to define the best practices for ultimate impact
- Assessing against selected best practices
- Focusing the recommendations on program success
- Delivering a roadmap for your Data Governance program
The main aim of Data-Centric Architecture is to reduce complexity of information systems by using shared data with clear meaning. But how can you trust your data? How do you know if it is accurate and reliable?
In this webinar you'll learn how to quickly and easily improve your business using Snowflake and Matillion ETL for Snowflake. Webinar presented by Solution Architects Craig Collier (Snowflake) and Kalyan Arangam (Matillion).
In this webinar:
- Learn to optimize Snowflake and leverage Matillion ETL for Snowflake
- Discover tips and tricks to improve performance
- Get invaluable insights from data warehousing pros
Enterprise Data Governance Framework With Change ManagementSlideTeam
“You can download this product from SlideTeam.net”
Presenting this set of slides with name Enterprise Data Governance Framework With Change Management. The topics discussed in these slides are Strategy, Organization, Management. This is a completely editable PowerPoint presentation and is available for immediate download. Download now and impress your audience. https://bit.ly/3b4VcEH
AWS offers a variety of data migration services and tools to help you easily and rapidly move everything from gigabytes to petabytes of data. We can provide guidance and methodologies to help you find the right service or tool to fit your requirements, and we share examples of customers who have used these options in their cloud journey.
Simplify Supplier Risk Management Across Your Procurement Processes - SID 51538SAP Ariba
Suffering from sporadic supplier due diligence and fragmented risk information? Getting burned from engaging with at-risk suppliers? You are not alone. Come learn how to simplify supplier risk management across your procurement processes. Industry experts will share their experience using the SAP Ariba Supplier Risk solution to help ensure focused risk due diligence during supplier selection, detect early warning signals, and proactively monitor and address risks for each supplier engagement.
Profiling for SAP - Compliance Management, Access Control and Segregation of ...TransWare AG
Complex ERP systems are potentially susceptible to segregation of duties (SoD) issues. With Profiling for SAP®, the intended responsibilities of SAP® users can be cross-checked against the actual usage of SAP®.
Presentation by Smart ERP Solutions on Smart SoD, an add-on software solution providing effective Segregation of Duties for PeopleSoft applications. For webinar playback see also http://www.smarterp.com/media/Webinar-SoD.html
How do you know that your ERP system is SOX compliant? How can you enforce Segregation of Duties (SoD) rules? Don't be another Enron. Use compliance software to give your ERP software a check up from the neck up.
To arrange for a demo of SOX and SoD compliance software for your ERP system, send an e-mail to info@i-app.com or call Performa Apps CEO Dan Aldridge at 703.251.4504.
For much more content on ERP systems and enterprise software, visit us at http://inforln.com.
Presentation from Alliance 11 conference from the University of Nebraska and Smart ERP Solutions. Covers Row Level Security and Segregation of Duties for PeopleSoft.
Segregation of duties in SAP @ ISACA Pune presentation on 18.4.2015 CA CISA Jayjit Biswas
SoD conflict mitigation is a complex subject given present manpower constraints and limited technical understanding of the core SAP domain. It is a mix of BPR and technology together, where both process and IT knowledge are a must to tackle this specialized area.
View on-demand recording: http://securityintelligence.com/events/how-vulnerable-is-your-critical-data/
Data infrastructures are highly dynamic, with changes in accounts, configurations and patches occurring regularly. Within your data infrastructure you need to understand the data. Not all data is the same. You need to protect the data that is considered high risk. However, most organizations lack the centralized control or skilled resources to review changes systematically to determine if they have introduced security gaps. While there are no silver bullets, there are key steps organizations can take to understand and reduce their risk and lower TCO.
In this presentation, Luis Casco-Arias, Senior Product Manager for IBM Security Guardium, describes best practices for:
- Assessing vulnerabilities and exposures
- Locking down critical data in various environments
- Aligning remediation workflows to prevent breaches and policy violations
Big Data LDN 2018: USING FAST-DATA TO MAKE SEMICONDUCTORSMatt Stubbs
Date: 14th November 2018
Location: Fast Data Theatre
Time: 14:30 - 15:00
Speaker: Neil Condon
Organisation: Edwards Vacuum
About: Semiconductor fabrication plants build devices with billions of components, each 1000x smaller than the human hair, with some features that are only a few atoms across. They are the most highly automated manufacturing environments in the world: In large fabs, time-series sensor data alone tops 50 TB/day, and combining that data with subject-matter-expertise fast enough to keep the automated production equipment functioning, is a major and growing challenge. The level of collaboration required to build high-performance real-time analytics, combined with the IP-sensitive nature of the data, results in a unique DataOps environment, where the use of Cloud resources serves to complicate rather than simplify the value equation. We’ll explore some of the challenges, and discuss the attributes of a PaaS that could help the industry tackle its fast-data challenges.
Federal Webinar: Security Compliance with SolarWinds Network Management ToolsSolarWinds
In this webinar attendees learned how to use our fault, performance and configuration management tools to improve your IT security posture. Our solutions help manage and monitor network devices and their configurations to enhance risk management, IT security and compliance. Discussions will include simplifying day-to-day operations, increasing automation, and generating reports to verify compliance and highlight violations.
Our federal Sales Engineers reviewed and demonstrated how our tools can help achieve and maintain RMF, FISMA and DISA STIG compliance. Configuration management helps agencies develop, deploy and maintain compliant configurations. Fault, performance, and log management help ensure that devices are continuously monitored and operating correctly. And patch management automates patching to reduce vulnerabilities. Attendees learned how SolarWinds tools can help you:
• Leverage Network Configuration Manager (NCM), and Patch Manager to satisfy security controls or help implement and manage controls
• Utilize NCM, and Log & Event Manager (LEM), our powerful SIEM, to verify that controls have been implemented correctly
• Employ LEM, Network Performance Monitor, and NCM to monitor that controls are working as expected
• Quickly and easily produce out-of-the-box compliance reports for DISA STIGS, FISMA, and more
Sales Planning vs. Demand Planning: Getting Sales Back Into S&OP
Featured Presenter:
Danny Smith, Vice President, Industries, Steelwedge Software
In recent years, functions other than Sales – including Supply Chain and Finance – have often taken ownership of predicting future sales. The process has become an aggregation exercise done by specialists, and the name itself – demand management – indicates that the Sales team is not intimately involved. But true S&OP requires Sales to “own their number,” which delivers company-wide benefits because Sales is the closest to the demand signal.
In this webinar you will learn about:
- The Sales Planning Challenges
- The Keys to Success
- How a Sales Planning Platform Can Help You Hit Your Number
Federal Webinar: RMF, DISA STIGs, and NIST FISMA Compliance using SolarWindsSolarWinds
In this webinar SolarWinds shared how to use our tools to improve your agency’s Risk Management Framework (RMF), NIST 800-53 controls, FISMA, and DISA STIGS compliance. Our solutions can help you implement, assess, and monitor your security controls. We also continuously monitor your networks, systems and application for compliance, and we provide tools to automate remediation and reporting.
Our federal Sales Engineers reviewed security controls where our tools provide support, and demonstrated how to utilize product features to meet your compliance needs. Topics included Access Controls, Audit and Accountability, and Configuration Management controls, as well as Incident Response, System Maintenance, Media Protection, and other controls.
You'll learn how SolarWinds tools can help you:
• Satisfy controls or help implement and manage controls using Network Configuration Manager (NCM), and Patch Manager
• Make sure controls have been implemented correctly using NCM, and Log & Event Manager (LEM), our powerful SIEM
• Monitor that controls are working as expected using LEM, Network Performance Monitor, and NCM
• Quickly and easily produce out-of-the-box compliance reports for DISA STIGS, FISMA, and more
How to Better Manage Technical Debt While Innovating on DevOpsDynatrace
Forget the “Unicorns.” There is a lot to learn from “DevOps Unicorns” such as Etsy or Facebook, but for enterprises dealing with technical debt in legacy systems developed by teams no longer with the company, copying the unicorns is not an option.
Richard Dominguez, Operations Developer at Prep Sportswear, needed to “keep the lights on” for their legacy systems, while enabling his DevOps teams to launch new features much faster. Today Prep Sportswear releases more updates to their legacy systems than ever before by reducing MTTR (Mean Time To Repair), giving them more time to innovate on DevOps and Continuous Delivery on their new platform. You’ll learn:
• Top metrics for an Ops dashboard to catch potential issues early
• Tips to manage technical debt in legacy code caused by dev teams long gone
• Efficient ways to close loops while providing input to DevOps so they can optimize innovation and releases
Customer Story: Scaling Security With Detections-as-CodePanther Labs
Learn how Cedar is leveraging Detections-as-Code with Panther to build high-signal alerts to gain visibility into user activity, suspicious behaviors, and unauthorized data sharing.
IDERA Live | Understanding SQL Server Compliance both in the Cloud and On Pre...IDERA Software
You can watch the replay for this IDERA Live webcast, Understanding SQL Server Compliance both in the Cloud and On Premises, on the IDERA Resource Center, http://ow.ly/tJ3V50A4rPD.
Every industry has its own regulatory compliance guidelines. On top of that, if you want to collect credit card information you must be PCI compliant. If you are trading on a Stock Exchange you must be SOX compliant. If you gather information on EU Members, you must be GDPR compliant. The list of regulations can be lengthy for an organization and some of those regulations may conflict with each other. With more companies moving to the cloud, it is even more important to review your compliance processes. With this session, we will explore the complex world of regulations and how that applies to how you collect and maintain your data.
Speaker: Kim Brushaber is the Senior Product Manager for SQL Compliance Manager at IDERA. Kim has over 20 years of experience as a Business Analyst, Software Developer, Product Manager and IT Executive. Kim enjoys working as the translator between the business and the technical teams in an organization.
Your database holds your company's most sensitive and important assets- your data. All those customers' personal details, credit card numbers, social security numbers- you can't afford leaving them vulnerable to any- outside or inside- breaches.
Exploding data growth doesn’t mean you have to sacrifice data security or compliance readiness. The more clarity you have into where your sensitive data is and who is accessing it, the easier it is to secure and meet compliance regulations.
Walk through this presentation to learn how to:
- Detect and block cyber security events in real-time
- Protect large and diverse data environments
- Simplify compliance enforcements and reporting
- Take control of escalating costs.
Presentation Title – 10 steps to DCIM success
In this presentation we will be discussing the real world practicalities of DCIM deployment and offering '10 key considerations for a successful DCIM project', utilising case studies taken from our enviable portfolio of customers.
Good data is like good water: best served fresh, and ideally well-filtered. Data Management strategies can produce tremendous procedural improvements and increased profit margins across the board, but only if the data being managed is of high quality. Determining how Data Quality should be engineered provides a useful framework for utilizing Data Quality management effectively in support of business strategy. This, in turn, allows for speedy identification of business problems, delineation between structural and practice-oriented defects in Data Management, and proactive prevention of future issues. Organizations must realize what it means to utilize Data Quality engineering in support of business strategy. This webinar will illustrate how organizations with chronic business challenges can often trace the root of the problem to poor Data Quality. Showing how Data Quality should be engineered provides a useful framework in which to develop an effective approach. This in turn allows organizations to more quickly identify business problems as well as data problems caused by structural issues versus practice-oriented defects and prevent these from reoccurring.
Learning objectives:
-Help you understand foundational Data Quality concepts for improving Data Quality at your organization
-Demonstrate how chronic business challenges for organizations are often rooted in poor Data Quality
-Share case studies illustrating the hallmarks and benefits of Data Quality success
Who, What, Where and How: Why You Want to KnowEric Kavanagh
Hot Technologies with Dr. Robin Bloor, Dez Blanchfield and IDERA
In our increasingly data-driven society, knowing who did what with information assets is a must-have. Every aspect of the business benefits: operations, analytics, planning, compliance. The story gets even more serious when sensitive information comes into play. For all these reasons and more, the modern business needs to know what's happening where, and who is involved in the process.
Register for this episode of Hot Technologies to hear Dr. Robin Bloor and Data Scientist Dez Blanchfield extol the virtues of data awareness. They'll be briefed by Bullett Manale of IDERA, who will demonstrate how his company's software allows organizations to gain deep insights into their data, right down to the column level. He'll show how to audit the most sensitive information, monitor and alert on suspicious activity, satisfy audits for PCI, HIPAA, FERPA and SOX -- all from a Web-based dashboard that simplifies access from any browser.
Enterprise Vulnerability Management: Back to BasicsDamon Small
Vulnerability Management is the lifecycle of identifying and remediating vulnerabilities in an organization's enterprise. A number of companies are starting to do this well, but in some cases, focus on advanced and emerging threats has had the unintended consequence of leaving Vulnerability Management unattended. Defense is actually hard work and people aren't doing it as well as they should! Considered in the context of asymmetric warfare, Blue Teaming is more difficult than Red Teaming. Coupled with the fact that most vulnerabilities do not actually suffer from advanced attacks and 0-days, Vulnerability Management must be the cornerstone of any Information Assurance Program.
The speakers, Kevin Dunn and Damon Small, will describe the key elements of a mature Vulnerability Management Program (VMP) and the pitfalls encountered by many organizations as they try to implement it. Dunn and Small will include detailed examples of why purchasing the scanner should be one of the last decisions made in this process, and what the attendee must do to ensure the successful defense of company assets and data. This session will cover:
- Vulnerability Management: What is it good for?
- What is it not good for?
- How do I make a real difference?
Modern data processing environments resemble factory lines, transforming raw data to valuable data products. The lean principles that have successfully transformed manufacturing are equally applicable to data processing, and are well aligned with the new trend known as DataOps. In this presentation, we will explain how applying lean and DataOps principles can be implemented as technical data processing solutions and processes in order to eliminate waste and improve data innovation speed. We will go through how to eliminate the following types of waste in data processing systems:
* Cognitive waste - unclear source of truth, dependency sprawl, duplication, ambiguity.
* Operational waste - overhead for deployment, upgrades, and incident recovery.
* Delivery waste - friction and delay in development, testing, and deployment.
* Product waste - misalignment to business value, detach from use cases, push driven development, vanity quality assurance.
We will primarily focus on technical solutions, but some of the waste mentioned requires organisational refactoring to eliminate.
@sriramNRN | www.sriramnarayanan.com
SEGREGATION OF DUTIES AND CONTINUOUS DELIVERY
What we’ll cover today
■ About Continuous Delivery
■ The need for Segregation of Duties
■ How typical enforcement of Segregation of Duties is a blocker to CD
■ How to improve SoD enforcement and accelerate CD
Important Points
■ People behave as they are measured (e.g. KPIs)
■ Most issues are 10% technical and 90% cultural/behavioral
■ CD-Friendly SoD and true Continuous Delivery are mostly process and people problems, and far less tooling problems.
■ You should move toward automation-friendly tools, though.
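Automation-friendly tooling means, at minimum, that desired state is expressed as data that can be diffed against reality rather than inspected by hand. A toy sketch of such a drift check (all keys and values are hypothetical examples, not any particular tool's format):

```python
# Toy sketch: desired configuration as data, so drift is detected mechanically.
# The setting names and values below are hypothetical examples.

def drift(desired: dict, actual: dict) -> dict:
    """Return settings whose live value differs from the declared value,
    mapped to (declared, live) pairs."""
    return {k: (v, actual.get(k)) for k, v in desired.items() if actual.get(k) != v}

desired = {"max_connections": 200, "tls_min_version": "1.2"}
actual = {"max_connections": 200, "tls_min_version": "1.0"}
print(drift(desired, actual))  # only the drifted setting is reported
```

Real configuration management tools do far more (ordering, remediation, reporting), but the core idea is the same: declared state that a machine can compare and a reviewer can read.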
What Continuous Delivery is NOT:
■ “CI/CD” – You need more than just a “daemonic CI” and a “pipeline plugin”
■ Continuous Deployment – Deployment using tools
■ Blanket permission to deploy – Environment owners need to review, approve, and trigger deployments at their convenience
■ Permission to push “Containers” to Prod – What goes in those containers needs to be validated!
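The last point above can be made concrete: validating "what goes in those containers" can be as simple as gating promotion on a pre-approved content digest, so only reviewed builds reach production. A minimal sketch with hypothetical image names and digests (not any specific registry's API):

```python
# Minimal sketch: gate production promotion on pre-approved image digests.
# Image names and digests here are hypothetical examples.

APPROVED = {
    "shop/api": {"sha256:1111aaaa", "sha256:2222bbbb"},  # reviewed builds only
}

def can_promote(image: str, digest: str) -> bool:
    """Return True only if this exact image content was reviewed and approved."""
    return digest in APPROVED.get(image, set())
```

The point is that approval attaches to the content (the digest), not to the person doing the push, so pushing rights alone are not enough to get unreviewed code into production.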
“Continuous delivery is a software engineering approach in which teams produce software in short cycles, ensuring that the software can be reliably released at any time. It aims at building, testing, and releasing software faster and more frequently.”
– Wikipedia
With fast IT turnaround times, business can:
■ Stay competitive
■ Respond to change faster
■ Fix defects earlier
■ Try new ideas boldly and revert confidently.
What we’d love to have!
Commit Code → Build and Package → Test Locally → Deploy to Production!
Production Support
■ Deploy whenever we want
■ Debug processes on Production servers
■ Query Production Databases
■ Inspect traffic, review log files
■ Apply hot fixes within minutes
Reality Check!
Commit Code → Build and Package → Test Locally → Deploy to Production! – Tickets per phase!
Production Support
■ Deploy whenever we want – “Raise a ticket to deploy”
■ Debug processes on Production servers – “No way!!”
■ Query Production Databases – A ticket for individual query results
■ Inspect traffic, review log files – A ticket for log extracts
■ Apply hot fixes within minutes – Ticket please!
12. @sriramNRNwww.sriramnarayanan.com
SEGREGATION OF DUTIES AND CONTINUOUS DELIVERY
What puzzles (frustrates!) Dev Teams and Business
■ Why are Ops, Audit and Security Teams throwing roadblocks at us?
■ Are they raising roadblocks just to assert their importance?
■ Why are Ops given access that they cannot make use of to solve issues?
■ Why do we have such ridiculous policies!?
■ Why does everyone make us raise so many tickets?
■ Why are we trusted to write the software but not to troubleshoot it!!!??
■ Are Ops, Security and Compliance on our side, or our competitors’ side?
What Ops, Security, Compliance have to say:
“We are merely following industry norms to protect business and customers. We are not the enemy! Please don’t blame us for doing our job!!”
So, who is right?
Development teams – who develop software that meets business goals?
Or
Ops and Security – who ensure uptimes and protect customers?
Expectations from an organization
■ Make money (if a business)
■ Conform to the laws (e.g. those that protect the customers’ interests)
■ Run in a stable manner
17.
How orgs are managed - GRC
Source: Wikipedia
| Topic | Explanation |
| --- | --- |
| Governance | The executives are responsible for the org’s operations |
| Risk Management | Identify, analyze and respond to risks |
| Compliance | Conform to stated requirements (regulations, org policies, business guarantees to customers) |
Applicable to IT, Finance, Legal
18.
Some examples of fraud and error
■ Untimely and/or non-uniform deployment
■ Deploying with the wrong permissions
■ Handling production environments without the requisite exposure or skills
■ Accessing confidential data in violation of privacy policies
■ Changing production configurations ad hoc, with poor review and poor documentation of changes
■ Bypassing domain logic and enforcement in the application, and changing production data directly
■ Logging confidential data and accessing it via logs
19.
Separation of duties (SoD) (also known as “Segregation of duties”) is the concept of having more than one person required to complete a task. … an internal control intended to prevent fraud and error
- Wikipedia
20.
Segregation of Duties
■ A well-understood concept in Finance, Law, Governance, the Military, etc.
■ No single person should have end-to-end access to complete an entire workflow
■ At least one other person should be able to
● Regulate the activity, if need be
● Review the activity
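The “at least one other person” rule above can be sketched as a guard that refuses to execute a change approved only by its own author. A minimal illustration (the `approve_and_execute` function and role names are hypothetical, not from any particular tool):

```python
def approve_and_execute(change: str, author: str, approver: str, execute) -> bool:
    """Two-person rule: the author of a change may not be its sole approver."""
    if approver == author:
        raise PermissionError("author cannot approve their own change")
    execute(change)  # the reviewed activity proceeds only after independent approval
    return True
```

Real systems enforce this in the ticketing or deployment tool itself, so the check cannot be bypassed by the author.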
22.
Typical SoD procedures for Deployments
| Intent | Action | Typical Implementation | Impact |
| --- | --- | --- | --- |
| Devs should not author and deploy code | Deployment by Ops | Dependent upon Ops availability | Business cannot deploy on-demand |
| Demonstrate deployment in an auditable manner | Deployment using tools | Special tools, typically not available in Dev | Dev and Prod deployments are different |
| Control over when prod is changed | Deployment at specific times | Strict calendar schedules | Cannot deploy frequently. Exceptions can be expensive. |
23.
Typical SoD procedures for Troubleshooting
| Intent | Action | Typical Implementation | Impact |
| --- | --- | --- | --- |
| Devs should not access confidential data in logs | Regulate access to log systems | Access to prod logs governed by SLAs. Extracts only. | Lack of direct access to logs prevents fast troubleshooting |
| Prevent ad-hoc/harmful changes and data sniffing | Regulate access to prod servers | Special tools, typically not available in Dev | Dev and Prod deployments are different |
24.
Typical SoD procedures for Databases
| Intent | Action | Typical Implementation | Impact |
| --- | --- | --- | --- |
| Ensure database schema and data integrity via skilled DBAs | Regulate changes to databases | Changes reviewed (and possibly denied) before prod deployment. Documentation. | Waste of precious time. Wasteful documentation. |
| Prevent ad-hoc/harmful changes and data sniffing | Regulate access to prod data | A query per ticket: reviewed, approved, applied | Waste of precious time. Penalties for delays. |
25.
Typical SoD procedures for Configuration
| Intent | Action | Typical Implementation | Impact |
| --- | --- | --- | --- |
| Ensure that all (app, OS) changes to prod are valid and documented | Regulate changes to production | Changes reviewed (and possibly denied) before prod deployment. Documentation. | Waste of precious time. Wasteful documentation. |
| Prevent attacks based on known weaknesses | Apply patches regularly at scheduled intervals | Configuration (settings, patches) not shared with devs | Software not tested with Prod configuration |
28.
CD-friendly SoD – General principles
■ Involve Ops and Security right from the design phase
■ Express policies in executable form via CD-friendly config management tools
■ Separate confidential data and logs from regular data and logs
■ Ship a single deployment bundle: app, config, policy, DB schema
■ Bundle once, deploy anywhere
■ Restrict access to confidential data/logs; permit easy access to regular data/logs
■ Enforce via config rather than via tickets (e.g. resource throttling vs tickets)
■ Use multi-factor authentication (rather than tickets) where possible to regulate actions
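The “bundle once, deploy anywhere” principle above can be sketched as a checksummed manifest: every artifact in the bundle (app, config, policy, schema) is hashed at build time, and any environment can verify it is deploying exactly the reviewed bundle. A minimal sketch, assuming hypothetical helpers (`build_bundle`, `verify_bundle`) and file names, not a real packaging tool:

```python
import hashlib
import json
import pathlib

def build_bundle(bundle_dir: pathlib.Path) -> dict:
    """Hash every artifact (app, config, policy, schema) into one manifest."""
    manifest = {
        p.name: hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(bundle_dir.iterdir())
        if p.is_file() and p.name != "MANIFEST.json"
    }
    (bundle_dir / "MANIFEST.json").write_text(json.dumps(manifest, sort_keys=True))
    return manifest

def verify_bundle(bundle_dir: pathlib.Path) -> bool:
    """Any environment can check the bundle is byte-identical to what was approved."""
    manifest = json.loads((bundle_dir / "MANIFEST.json").read_text())
    return all(
        hashlib.sha256((bundle_dir / name).read_bytes()).hexdigest() == digest
        for name, digest in manifest.items()
    )
```

The same verified bundle is promoted unchanged from Dev through Prod, which is what makes the deployment auditable without tickets.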
29.
CD-Friendly deployment and configuration
(Figure: Commit Code → Build and Package → Test Locally → Deploy to Production. Dev, DBA, Ops and Security feed policies, code, approved changes, app and OS patches, configs and DB changes into a single tested Deployment Bundle with approved prod-ready configs. Gatekeeping checks are codified and tested via automated, exploratory and pen tests. 2FA deployment enables any-time deployment by the Env owner.)
30.
CD-Friendly SoD!
Commit Code → Build and Package → Test Locally → Deploy to Production!
Production Support
■ Deploy whenever we want – Environment owners decide, use 2FA
■ Debug processes on Production servers – Yes, configs elsewhere.
■ Query Production Databases – Easier access to regular data.
■ Inspect traffic, review log files – Easier access to regular data.
■ Apply hot fixes within minutes – Test in 1-click dev envs first
(Figure: Dev, DBA, Ops and Security contribute app, OS patches, configs and DB changes to a Pre-Approved Deployment Bundle, which the Env Owner deploys via 2FA.)
31.
CD-Friendly SoD procedures for Deployments
| Intent | Action | Recommended Implementation | Impact |
| --- | --- | --- | --- |
| Devs should not author and deploy code | Deployment by Environment Owners | Review and deploy changes in small batches | Small batches make changes easier to review |
| Demonstrate deployment in an auditable manner | Configuration management tools | Build once, deploy anywhere | Dev and Prod are auditably the same |
| Control over when prod is changed | Deployment by Environment Owners | Frequent deploys in small batches. Multi-factor controls. | Deploy only when the Env owner wants to |
32.
CD-Friendly SoD procedures for Troubleshooting
| Intent | Action | Recommended Implementation | Impact |
| --- | --- | --- | --- |
| Devs should not access confidential data in logs and config files | Separate confidential and regular logs. Externalised configuration. | Log UUIDs. Prod Support teams access regular logs, and can SSH to prod. | Confidential data remains restricted. Prod support is fast. |
| Prevent ad-hoc/harmful changes and data sniffing | Standard environments | 1-click environment creation and 1-click deployment | Prod errors can be caught earlier in Dev. Reduces prod errors. |
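The “Log UUIDs” idea above can be sketched as two loggers: confidential values go only to a restricted logger, while the regular log (readable by prod support) carries only a correlation UUID. A sketch, assuming hypothetical logger names and a payment example:

```python
import logging
import uuid

regular = logging.getLogger("app.regular")            # readable by prod support
confidential = logging.getLogger("app.confidential")  # shipped to a restricted sink

def record_payment(card_number: str, amount: float) -> str:
    ref = str(uuid.uuid4())  # correlation id, safe to share across both logs
    confidential.info("ref=%s card=%s", ref, card_number)   # restricted access only
    regular.info("ref=%s payment amount=%.2f", ref, amount)
    return ref
```

Troubleshooters join the two streams on `ref` without ever seeing the card number, so most debugging needs no ticket.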
33.
CD-Friendly SoD procedures for Databases
| Intent | Action | Recommended Implementation | Impact |
| --- | --- | --- | --- |
| Ensure database schema and data integrity via skilled DBAs | Regulate changes to databases using CD-friendly DB config tools | DBAs review and recommend changes at Dev using CD-friendly tools | Identical schema from Dev through Prod, as approved by the DBA |
| Prevent ad-hoc/harmful changes and data sniffing | Delink confidential and regular data | Restrict access to confidential data. Provide access to regular data. | Most troubleshooting needs just regular data, and is fast |
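“Identical schema from Dev through Prod” can be sketched as a versioned migration runner: the same ordered, DBA-reviewed migration list is applied in every environment, and a tracking table makes reruns idempotent. A sketch against SQLite (the migration names and tables are illustrative; real projects would use a tool such as Flyway or Liquibase):

```python
import sqlite3

# DBA-reviewed migrations, applied in the same order in every environment.
MIGRATIONS = [
    ("001_create_users", "CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)"),
    ("002_add_email", "ALTER TABLE users ADD COLUMN email TEXT"),
]

def migrate(conn: sqlite3.Connection) -> list[str]:
    """Apply any pending migrations; already-applied ones are skipped."""
    conn.execute("CREATE TABLE IF NOT EXISTS schema_migrations (name TEXT PRIMARY KEY)")
    applied = {row[0] for row in conn.execute("SELECT name FROM schema_migrations")}
    newly_applied = []
    for name, ddl in MIGRATIONS:
        if name not in applied:
            conn.execute(ddl)
            conn.execute("INSERT INTO schema_migrations (name) VALUES (?)", (name,))
            newly_applied.append(name)
    conn.commit()
    return newly_applied
```

Because the migration list is code, the DBA's approval at Dev travels unchanged to Prod, replacing per-change tickets.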
34.
CD-Friendly SoD procedures for Configuration
| Intent | Action | Recommended Implementation | Impact |
| --- | --- | --- | --- |
| Ensure that all (app, OS) changes to prod are valid and documented | Ops and Security config settings in a CD-friendly config management tool | Test pre-approved configs from Dev through Prod | Pre-approved and tested configs enable frequent deploys |
| Prevent attacks based on known weaknesses | Test OS patches in Dev | Apply and test OS patches via automation in 1-click dev envs | Rapidly test OS patches and software in non-Prod first |
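“Pre-approved and tested configs” can be sketched as a canonical fingerprint: hash the config once when it is approved, then verify the same fingerprint in every environment before deploying. A minimal sketch (the `fingerprint` and `verify` function names are illustrative):

```python
import hashlib
import json

def fingerprint(config: dict) -> str:
    """Stable hash of a config, so Dev and Prod settings can be compared auditably."""
    canonical = json.dumps(config, sort_keys=True).encode()
    return hashlib.sha256(canonical).hexdigest()

def verify(config: dict, approved_fp: str) -> bool:
    """True only if this environment runs exactly the approved configuration."""
    return fingerprint(config) == approved_fp
```

A deploy gate that compares fingerprints turns “Dev and Prod must match” from a policy document into an executable check.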