June 26, 2017 presentation. With the move to infrastructure as code and continuous integration/continuous delivery pipelines, it looked like releases would become more frequent and less problematic. Then the auditors showed up and made everyone stop what they were doing. How could this have been prevented? What if the audits were part of the process instead of a roadblock? What sort of visibility do we have into the state of our Azure infrastructure compliance? This talk will provide an overview of Chef's open-source InSpec project (https://inspec.io) and how you can build "Compliance as Code" into your Azure-based infrastructure.
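As a hedged sketch of what "Compliance as Code" looks like in practice, an InSpec control might read as below. This is not from the talk: the resource group and resource names are invented, and `azure_generic_resource` comes from the inspec-azure resource pack of that era. It requires the InSpec runtime, so it is shown as a profile fragment rather than a standalone script:

```ruby
# Hypothetical InSpec control: verify that an expected Azure resource exists.
# Resource group and resource names below are made up for illustration.
control 'azure-demo-resource-exists' do
  impact 0.7
  title 'Demo network security group should exist in the resource group'

  describe azure_generic_resource(group_name: 'demo-rg',
                                  name: 'demo-nsg') do
    it { should exist }
  end
end
```

A profile containing controls like this would typically be run with something like `inspec exec my-profile -t azure://`, with credentials supplied via the Azure SDK environment variables.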
Machine learning services with SQL Server 2017 - Mark Tabladillo
SQL Server 2017 introduces Machine Learning Services with two independent technologies: R and Python. The purpose of this presentation is 1) to describe major features of this technology for technology managers; 2) to outline use cases for architects; and 3) to provide demos for developers and data scientists.
Securing Hadoop's REST APIs with Apache Knox Gateway Hadoop Summit June 6th, ... - Kevin Minder
Securing Hadoop's REST APIs with Apache Knox Gateway
Presented at Hadoop Summit on June 6th, 2014
Describes the overall roles the Apache Knox Gateway plays in Hadoop security and briefly covers its primary features.
While traditional on-prem systems have always been a target for internal and external attackers, recent times have seen increased attacks on Hadoop cloud deployments. Hadoop systems will be increasingly targeted because of the large volumes of data they store. Many cloud Hadoop installations are publicly accessible without any security measures, which poses a threat of exfiltration of large datasets and possibly of crypto-mining on this infrastructure, with its huge distributed compute capability.
Apache Knox provides multiple layers of security related to authentication, service-level authorization and web application security controls out of the box for multiple Hadoop components.
Apache Knox provides configuration to prevent common OWASP Top 10 security risks, e.g. Cross-Site Request Forgery (CSRF), Cross-Site Scripting (XSS), MIME content-type sniffing, and clickjacking. We will also discuss controls like HTTP Strict Transport Security, which prevents SSL downgrade attacks, and the CORS filter, which allows applications to make cross-domain XHR requests only to specifically allowed hosts. Support for including/excluding cipher suites and excluding SSL protocols enables compliance with the hardening guidelines provided by CIS for application servers.
Knox supports several authentication mechanisms with Kerberos underneath, e.g. LDAP over SSL, AD, PAM-based auth for Unix users, and integration with identity providers like Okta. Capabilities like Trusted Proxy, Single Sign-On auth, the Hostmap Provider, the Identity Assertion Provider, and client authentication further enhance the overall security posture.
We will also cover the typical kill-chain methodology tailored to the Hadoop ecosystem, which will help formulate preventive measures against future compromises.
During the second half of 2016, IBM built a state of the art Hadoop cluster with the aim of running massive scale workloads. The amount of data available to derive insights continues to grow exponentially in this increasingly connected era, resulting in larger and larger data lakes year after year. SQL remains one of the most commonly used languages for such analysis, but how do today’s SQL-over-Hadoop engines stack up to real BIG data? To find out, we decided to run a derivative of the popular TPC-DS benchmark using a 100 TB dataset, which stresses both the performance and SQL support of data warehousing solutions! Over the course of the project, we encountered a number of challenges such as poor query execution plans, uneven distribution of work, out of memory errors, and more. Join this session to learn how we tackled such challenges and the tuning required across the various layers of the Hadoop stack (including HDFS, YARN, and Spark) to run SQL-on-Hadoop engines such as Spark SQL 2.0 and IBM Big SQL at scale!
Speaker
Simon Harris, Cognitive Analytics, IBM Research
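Tuning of the kind described in this abstract typically lands in Spark's configuration. A hedged sketch of the sort of spark-defaults.conf settings involved in a Spark 2.0-era deployment (the values are illustrative placeholders, not the ones IBM actually used):

```
# Illustrative spark-defaults.conf fragment for large SQL workloads.
# All values are examples only; the talk's actual tuning is not reproduced here.
spark.sql.shuffle.partitions          2000        # more partitions to spread shuffle work evenly
spark.executor.memory                 24g         # larger heaps to reduce out-of-memory errors
spark.yarn.executor.memoryOverhead    4096        # off-heap headroom (MB) for shuffle buffers
spark.sql.autoBroadcastJoinThreshold  67108864    # influence broadcast-join plan selection (bytes)
```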
For decades developers and DBAs have battled over who controls the world. With each new development paradigm the battle flares again as developers push DBAs to adopt and support new data structures (JSON), new APIs (REST services), new technologies (In-Memory) and new platforms (Cloud). In this session, Gerald Venzl takes on the role of lead developer on a project to deploy a RESTful web-based application for a new coffeeshop chain, while Maria Colgan takes on the role of the DBA. Through the use of live demos, they learn to work together to find a solution that will allow them to embrace a more agile development approach, as well as the latest technology trends without exposing the business to painful availability or security vulnerabilities.
This is the second session of the learning pathway at PASS Summit 2019; it is still a stand-alone session that teaches you how to write proper Linux Bash scripts.
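In the spirit of that session, a minimal example of the defensive Bash habits such a talk typically covers (the function and file names here are made up for illustration):

```shell
#!/usr/bin/env bash
# Defensive-Bash sketch: strict mode plus careful quoting.
set -euo pipefail        # abort on errors, unset variables, and pipeline failures

# Count the lines in a file, printing 0 if the file does not exist.
count_lines() {
    local file="$1"
    if [[ -f "$file" ]]; then
        wc -l < "$file" | tr -d '[:space:]'
    else
        printf '0'
    fi
}

count_lines "/no/such/file"   # prints 0
```

Note the strict-mode flags at the top: without them, a failing command or a typoed variable name silently keeps the script running, which is the source of many production Bash incidents.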
Database Provisioning in EM12c: Provision me a Database Now! - Maaz Anjum
My presentation for Georgia Oracle User Group on December 12, 2013. In it, I discuss the Database Provisioning feature in Enterprise Manager 12c with an example of how I architected a solution by leveraging it.
Data Analytics Using Container Persistence Through SMACK - Manny Rodriguez-Pe... - {code} by Dell EMC
New digital business models facilitated by containers require collecting and analyzing device data. Apache Mesos removes the need to build separate stacks and combines optimized application containers and data analytics into a single platform. In this session, we will explore new approaches to data analytics using REX-Ray as a container persistence tool and the SMACK stack - Spark, Mesos, Akka, Cassandra, Kafka – a set of tools for building data and messaging layers for digital engagement apps.
Many enterprises are implementing Hadoop projects to manage and process large datasets. The big question is: how do you configure Hadoop clusters to connect to an enterprise directory containing 100k+ users and groups for access management? Several large enterprises have complex directory servers for managing users and groups, and many advanced features have recently been added to Hadoop user management to support various complex directory server structures.
In this session attendees will learn about setting up a Hadoop node with users from Active Directory for executing Hadoop jobs, setting up authentication for enterprise users, and setting up authorization for users and groups using Apache Ranger. Attendees will also learn about the common challenges faced in enterprise environments while interacting with Active Directory, including filtering which users are brought into Hadoop from Active Directory, restricting access to a set of users from Active Directory, and handling users from nested group structures.
Speakers
Sailaja Polavarapu, Staff Software Engineer, Hortonworks
Velmurugan Periasamy, Director - Engineering, Hortonworks
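The Active Directory integration described in this session is usually wired up through Hadoop's LDAP group mapping in core-site.xml. A hedged sketch using Hadoop's standard property names (the host, port, and search base below are placeholders, not a recommended configuration):

```xml
<!-- Illustrative core-site.xml fragment: resolve Hadoop users/groups from AD.
     Host and search base are placeholders. -->
<property>
  <name>hadoop.security.group.mapping</name>
  <value>org.apache.hadoop.security.LdapGroupsMapping</value>
</property>
<property>
  <name>hadoop.security.group.mapping.ldap.url</name>
  <value>ldaps://ad.example.com:636</value>
</property>
<property>
  <name>hadoop.security.group.mapping.ldap.base</name>
  <value>dc=example,dc=com</value>
</property>
<property>
  <name>hadoop.security.group.mapping.ldap.search.filter.user</name>
  <value>(&amp;(objectClass=user)(sAMAccountName={0}))</value>
</property>
```

The user search filter is where the "filtering which users are brought into Hadoop" challenge mentioned above is typically addressed.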
NL HUG 2016 Feb Hadoop security from the trenches - Bolke de Bruin
Setting up a secure Hadoop cluster involves a magic combination of Kerberos, Sentry, Ranger, Knox, Atlas, LDAP and possibly PAM. Add encryption on the wire and at rest to the mix and you have, at the very least, an interesting configuration and installation task.
Nonetheless, the fact that there are a lot of knobs to turn doesn't excuse you from the responsibility of taking proper care of your customers' data. In this talk, we'll detail how the different security components in Hadoop interact and how easy it can actually be to set things up correctly once you understand the concepts and tools. We'll outline a successful secure Hadoop setup with an example.
Keep remote desktop power users productive with Dell EMC PowerEdge R840 serve... - Principled Technologies
When the Dell EMC™ PowerEdge™ R840 launched, we found that companies could get more power for their CPU-intensive workloads with this 2U four-socket rack server. Now, it presents an opportunity for you to support more power users, speed up desktop responsiveness, and grow your employee base.
As Hadoop becomes a critical part of Enterprise data infrastructure, securing Hadoop has become critically important. Enterprises want assurance that all their data is protected and that only authorized users have access to the relevant bits of information. In this session we will cover all aspects of Hadoop security including authentication, authorization, audit and data protection. We will also provide demonstration and detailed instructions for implementing comprehensive Hadoop security.
Big-Data-as-a-Service (BDaaS) in an enterprise environment requires meeting the often contradictory goals of (1) providing your data scientists, analysts, and data engineers with a self-service consumption model; (2) delivering agile and scalable on-demand infrastructure for the rapidly evolving ecosystem of big data frameworks and application software; and (3) ensuring enterprise-grade capabilities for isolation, security, monitoring, etc.
In this presentation at our BDaaS meetup in Santa Clara, Tom Phelan (chief architect and co-founder of BlueData) reviewed these goals and how to resolve the potential contradictions. He also discussed the infrastructure, application, user experience, security, and maintainability considerations required before selecting (or designing and building) a Big-Data-as-a-Service platform for an enterprise big data deployment.
More info on this BDaaS meetup can be found at: http://www.meetup.com/Big-Data-as-a-Service/events/233999817
Hadoop and Kerberos: the Madness Beyond the Gate: January 2016 edition - Steve Loughran
An update of the "Hadoop and Kerberos: the Madness Beyond the Gate" talk, covering recent work on the "Fix Kerberos" JIRA and its first deliverable: KDiag.
Melbourne Infracoders: Compliance as Code with InSpec - Matt Ray
Presentation to the Melbourne Infrastructure Coders Meetup, November 8, 2016. An overview of InSpec (https://inspec.io) and the idea of "Compliance as Code".
http://www.meetup.com/Infrastructure-Coders/events/233990769/
DevSec Delight with Compliance as Code - Matt Ray - AgileNZ 2017 - AgileNZ Conference
For too long, audits and security reviews have been seen as incompatible with the frequent release of software. Auditors require access to static systems and environments, which would seem to make continuous delivery impossible. Too frequently, audits are a fire-drill sampling of the current state, and temporary fixes are put in place to appease the compliance audit without being integrated into future releases.
About Matt Ray:
Matt Ray is the Manager and Solutions Architect for Asia Pacific and Japan for Chef. He has worked in large enterprise software companies and founded his own startups in a wide variety of industries including banking, retail and government.
He has been active in open source communities for over two decades and has spoken at, and helped organise, many conferences and Meetups. He currently resides in Sydney, Australia after relocating from Austin, Texas. He podcasts at SoftwareDefinedTalk.com, blogs at LeastResistance.net and is @mattray on Twitter, IRC, GitHub and too many Slacks.
Compliance as Code with InSpec - DevOps Melbourne 2017 - Matt Ray
DevOps Melbourne Meetup March 28, 2017
PCI and auditors slowing you down? Compliance and security are the next steps in building your software-defined infrastructure. Chef's open-source project InSpec (https://inspec.io) and audit cookbooks provide an accessible pattern for building compliance into your continuous delivery pipelines.
Automating Compliance with InSpec - Chef Singapore Meetup - Matt Ray
July 24, 2017 slides and demo for Automating Compliance with InSpec. The associated GitHub repository is here: https://github.com/mattray/inspec-workshop
DevOpsDays Austin 2016 talk. Compliance and security are the next steps after Infrastructure as Code and Test-Driven Infrastructure in expanding your DevOps workflow. Chef's open-source InSpec and audit cookbooks provide an accessible pattern for building compliance into your continuous delivery pipelines.
Presentation by Matt Ray
Compliance and security are the next steps after Infrastructure as Code and Test-Driven Infrastructure in expanding your DevOps workflow. Chef's open-source InSpec and audit cookbooks provide an accessible pattern for building compliance into your continuous delivery pipelines.
OSDC 2017 | Building Security Into Your Workflow with InSpec by Mandi Walls - NETWAYS
InSpec is an open source testing framework for infrastructure with a human- and machine-readable language for specifying compliance, security, and policy requirements. Using a combination of command-line and remote-execution tools, InSpec can help you keep your infrastructure aligned with security and compliance guidelines on an ongoing basis, rather than waiting for and then remediating from arduous annual audits. InSpec’s flexibility makes it a key tool choice for incorporating security into a complete continuous delivery workflow, reducing the risk of new features and releases breaking established host-based security guidelines.
Adding Security to Your Workflow with InSpec (May 2017) - Mandi Walls
An introduction to InSpec and its motivations for teams looking for a security and compliance tool for their organizations. May 2017 edition. Atmosphere.pl Krakow and Netways OSDC Berlin.
Docker - Demo on PHP Application deployment - Arun Prasath
Docker is an open-source project to easily create lightweight, portable, self-sufficient containers from any application. The same container that a developer builds and tests on a laptop can run at scale, in production, on VMs, bare metal, OpenStack clusters, public clouds and more.
In this demo, I will show how to build an Apache image from a Dockerfile and deploy a PHP application from an external folder using custom configuration files.
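A minimal sketch of such a Dockerfile (the image tag, vhost file, and folder names are assumptions for illustration, not the ones from the demo):

```dockerfile
# Hypothetical Dockerfile: PHP application served by Apache.
# my-vhost.conf and ./app are illustrative names.
FROM php:7.4-apache

# Drop in a custom Apache virtual host configuration
COPY my-vhost.conf /etc/apache2/sites-available/000-default.conf

# Copy the PHP application from an external folder into the web root
COPY ./app/ /var/www/html/

EXPOSE 80
```

An image like this would be built and run with `docker build -t php-demo .` followed by `docker run -p 8080:80 php-demo`.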
Prescriptive Security with InSpec - All Things Open 2019 - Mandi Walls
What is Chef InSpec, and how can it help you manage and maintain system security through the full lifecycle of your applications? See how this powerful tool can keep your systems secure. Demo slides are included in the appendix.
Open Source Summit NA 2024: Open Source Cloud Costs - OpenCost's Impact on En... - Matt Ray
Discover how a leading enterprise achieved visibility into their cloud costs with the CNCF project OpenCost. OpenCost models current and historical Kubernetes cloud spend and resource allocation by service, deployment, namespace, labels, and much more. This data provides transparency for cloud bills and can be used as the basis for optimizing your Kubernetes deployments based on cost allocation. This session delves into the real-world journey of implementing OpenCost for tracking cloud costs and how they optimized their infrastructure with this information. We’ll start with an introduction to OpenCost, its capabilities, and how to get started as a user and as a contributor. Then we’ll explore the challenges faced, lessons learned, and the tangible impact observed. From initial deployment to ongoing management, learn how OpenCost empowered the enterprise to make data-driven decisions, avoid cost overruns, and streamline their cloud budgeting. Join us for practical insights, success stories, and actionable steps to harness the power of OpenCost in your enterprise.
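For a flavor of how OpenCost exposes this allocation data, it can be queried over HTTP once the service is reachable. The namespace, port, and query parameters below follow the OpenCost documentation as I recall it and should be treated as assumptions to verify against your deployment:

```
# Port-forward the OpenCost service, then query cost allocation by namespace
kubectl -n opencost port-forward service/opencost 9003:9003
curl -s 'http://localhost:9003/allocation?window=7d&aggregate=namespace'
```

The JSON response breaks down CPU, memory, storage, and network costs per namespace over the requested window, which is the raw material for the cost-allocation reporting described above.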
KubeConEU24-Monitoring Kubernetes and Cloud Spend with OpenCost - Matt Ray
KubeCon EU 2024 Lightning Talk
Understanding the cost and efficiency of Kubernetes on public clouds is essential once you start expanding your infrastructure with real production workloads. The FinOps Certified Solution and CNCF Sandbox OpenCost project monitors cloud costs and models current and historical Kubernetes cloud spend and resource allocation by service, deployment, namespace, labels, and much more. This data provides transparency for cloud bills and can be used as the basis for optimizing your Kubernetes deployments based on cost allocation. This quick introduction to OpenCost will lay the foundation for monitoring Kubernetes and cloud costs.
SCaLE 20X: Kubernetes Cloud Cost Monitoring with OpenCost & Optimization Stra... - Matt Ray
Understanding the cost and efficiency of Kubernetes on public clouds is essential once you start expanding your infrastructure with real production workloads. The CNCF Sandbox OpenCost project and specification models current and historical Kubernetes cloud spend and resource allocation by service, deployment, namespace, labels, and much more. This data provides transparency for cloud bills and can be used as the basis for optimizing your Kubernetes deployments based on cost allocation. Optimizing Kubernetes for cost and performance is an ongoing iterative process that starts with applications and works through the entire stack.
EmacsConf 2019: Interactive Remote Debugging and Development with TRAMP Mode - Matt Ray
Emacs’ TRAMP Mode allows for remotely editing files and using Emacs Shell Mode with remote systems. This session walked through the basics of using TRAMP Mode with the Free Software tools Vagrant, Chef, InSpec, and the interactive Ruby debugging shell Pry. The speaker notes are included along with the demo notes. The YouTube recording of the talk is available here: https://youtu.be/4pHid-kTBHw
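For reference, TRAMP remote paths look like the following (the host, user, and file names are made up; the syntax is standard TRAMP, which ships with Emacs):

```
;; Open a remote file over SSH from `C-x C-f':
/ssh:deploy@app.example.com:/var/log/syslog

;; Multi-hop: ssh to the host, then edit as root via sudo:
/ssh:deploy@app.example.com|sudo:app.example.com:/etc/hosts
```

Shell Mode and Dired work against these paths too, which is what makes TRAMP practical for the Vagrant/Chef/InSpec debugging workflow the session demonstrates.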
Wellington DevOps: Bringing Your Applications into the Future with Habitat - Matt Ray
Short presentation from the Wellington DevOps Meetup March 13, 2019 on why Habitat is interesting for re-platforming existing applications onto new platforms.
DevOps Days Singapore 2018 Ignite - Bringing Your Applications into the Futur... - Matt Ray
Ignite talks are 20 slides auto-advancing every 15 seconds. This session attempts to share the value of migrating existing applications from legacy to modern platforms.
Cloud Expo Asia 20181010 - Bringing Your Applications into the Future with Ha... - Matt Ray
What are we going to do about all these legacy applications? Kubernetes, Docker or Server Core? With Habitat it doesn’t matter anymore! As companies make the transition from traditional IT infrastructure to cloud-native container platforms, packaging, deploying and managing applications becomes the focus for developers and operators. Having a consistent approach to managing dependencies and building applications brings stability to CI/CD pipelines and frees developers to prioritize features. Automated, repeatable builds with immutable artifacts and consistent management of any application on any platform allow operators to focus on stability and speed. Chef's Habitat project brings all of this together in an open source automation platform that enables modern application teams to build, deploy, and run any application in any environment - from traditional data-centers to containerized microservices. This presentation provided an overview of the benefits of Habitat and a live demo of applications being built and deployed on traditional operating systems and across Docker and Kubernetes, seamlessly.
Presentation from Cloud Expo Asia Hong Kong covering the rationale for "Compliance as Code" and how InSpec may be applied to servers, cloud platforms, and much more to keep track of your compliance everywhere.
Opening keynote for DevOpsDays Jakarta. I attempted to tie the themes of DevOps to a timeline of when they received increasing focus. Books on the subjects provided a convenient way to mark those times.
https://www.devopsdays.org/events/2018-jakarta/program/matt-ray/
DevOps Talks Melbourne 2018: Whales, Cats and Kubernetes - Matt Ray
Kubernetes, Docker or VMs? With Habitat it doesn’t matter anymore! As companies make the transition from traditional IT infrastructure to cloud-native container platforms, packaging, deploying and managing applications becomes the focus for developers and operators. Having a consistent approach to managing dependencies and building applications brings stability to CI/CD pipelines and frees developers to prioritize features. Automated, repeatable builds with immutable artefacts and consistent management of any application on any platform allow operators to focus on stability and speed. Meet Habitat! This session will provide an overview of the benefits of Habitat and a live demo of applications being built and deployed on traditional operating systems and across Docker and Kubernetes, seamlessly.
Presentation to the Perth MS Cloud Computing User Group on November 14, 2017. Covered how Chef, InSpec, Habitat and Chef Automate work with Windows, Azure and the Microsoft ecosystem.
An overview of Chef Automate and the various resources for Chef, InSpec and Habitat for Azure and Microsoft's other products. Presented September 20, 2017 at Tank Stream Labs.
Essentials of Automations: Optimizing FME Workflows with ParametersSafe Software
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
Smart TV Buyer Insights Survey 2024, by 91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, the aspects they look for in a new TV, and their TV buying preferences.
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024, by Albert Hoitingh
In this session I delve into the encryption technology used in Microsoft 365 and Microsoft Purview, including the concepts of Customer Key and Double Key Encryption.
DevOps and Testing slides at DASA Connect, by Kari Kakkonen
Slides by me and Rik Marselis at the DASA Connect conference on 30.5.2024. We discuss what testing is, then what agile testing is, and finally what testing in DevOps is. We also ran a lovely workshop in which participants explored different ways to think about quality and testing in different parts of the DevOps infinity loop.
Generating a custom Ruby SDK for your web service or Rails API using Smithy, by g2nightmarescribd
Have you ever wanted a Ruby client API to communicate with your web service? Smithy is a protocol-agnostic language for defining services and SDKs. Smithy Ruby is an implementation of Smithy that generates a Ruby SDK using a Smithy model. In this talk, we will explore Smithy and Smithy Ruby to learn how to generate custom feature-rich SDKs that can communicate with any web service, such as a Rails JSON API.
State of ICS and IoT Cyber Threat Landscape Report 2024 preview, by Prayukth K V
The IoT and OT threat landscape report has been prepared by the Threat Research Team at Sectrio using data from Sectrio’s cyber threat intelligence farming facilities spread across over 85 cities around the world. In addition, Sectrio also runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors and newer malware, including new variants and latent threats that are at an earlier stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
- State of global ICS asset and network exposure
- Sectoral targets and attacks as well as the cost of ransom
- Global APT activity, AI usage, actor and tactic profiles, and implications
- Rise in volumes of AI-powered cyberattacks
- Major cyber events in 2024
- Malware and malicious payload trends
- Cyberattack types and targets
- Vulnerability exploit attempts on CVEs
- Attacks on counties – USA
- Expansion of bot farms – how, where, and why
- In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
- Why are attacks on smart factories rising?
- Cyber risk predictions
- Axis of attacks – Europe
- Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
Elevating Tactical DDD Patterns Through Object Calisthenics, by Dorra BARTAGUIZ
After immersing yourself in the blue book and its red counterpart, attending DDD-focused conferences, and applying tactical patterns, you're left with a crucial question: How do I ensure my design is effective? Tactical patterns within Domain-Driven Design (DDD) serve as guiding principles for creating clear and manageable domain models. However, achieving success with these patterns requires additional guidance. Interestingly, we've observed that a set of constraints initially designed for training purposes remarkably aligns with effective pattern implementation, offering a more ‘mechanical’ approach. Let's explore together how Object Calisthenics can elevate the design of your tactical DDD patterns, offering concrete help for those venturing into DDD for the first time!
Neuro-symbolic is not enough, we need neuro-*semantic*, by Frank van Harmelen
Neuro-symbolic (NeSy) AI is on the rise. However, simply machine learning on just any symbolic structure is not sufficient to really harvest the gains of NeSy. These will only be gained when the symbolic structures have an actual semantics. I give an operational definition of semantics as “predictable inference”.
All of this illustrated with link prediction over knowledge graphs, but the argument is general.
Connector Corner: Automate dynamic content and events by pushing a button, by DianaGray10
Here is something new! In our next Connector Corner webinar, we will demonstrate how you can use a single workflow to:
- Create a campaign using Mailchimp with merge tags/fields
- Send an interactive Slack channel message (using buttons)
- Have the message received by managers and peers along with a test email for review
But there’s more:
In a second workflow supporting the same use case, you’ll see:
- Your campaign sent to target colleagues for approval
- If the “Approve” button is clicked, a Jira/Zendesk ticket is created for the marketing design team
- But—if the “Reject” button is pushed, colleagues will be alerted via Slack message
Join us to learn more about this new, human-in-the-loop capability, brought to you by Integration Service connectors.
And...
Speakers:
Akshay Agnihotri, Product Manager
Charlie Greenberg, Host
7. SSH Control
"SSH supports two different protocol
versions.The original version, SSHv1, was
subject to a number of security issues.
Please use SSHv2 instead to avoid
these."
9. Whip up a one-liner!
grep "^Protocol" /etc/ssh/sshd_config | sed 's/Protocol //'
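The same check can also be written declaratively. A minimal sketch using InSpec's built-in `sshd_config` resource — shown here as a hedged illustration, not a slide from the deck; it runs via `inspec exec`, not plain Ruby:

```ruby
# Sketch: InSpec equivalent of the grep/sed one-liner above.
describe sshd_config do
  its('Protocol') { should cmp 2 }
end
```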
10. Apache Server Information Leakage
• Description
This directive controls whether the Server response header field sent back to clients includes a description of the generic OS type of the server. This greatly helps attackers identify web server details and increases the efficiency of any attack, as security vulnerabilities are dependent upon specific software versions.
• How to Test
To test the ServerTokens configuration, check the Apache configuration file.
• Misconfiguration
ServerTokens Full
• Remediation
Configure the ServerTokens directive in the Apache configuration to a value of Prod or ProductOnly. This tells Apache to return only "Apache" in the Server header on every page request.
ServerTokens Prod
or
ServerTokens ProductOnly
https://www.owasp.org/index.php/SCG_WS_Apache
11. More grep and sed!
grep "^ServerTokens" /etc/httpd/conf/httpd.conf | sed 's/ServerTokens //'
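As with the SSH check, this one-liner has a declarative counterpart. A hedged sketch using InSpec's `apache_conf` resource (exact parsing of directives may vary by InSpec version):

```ruby
# Sketch: checking ServerTokens with InSpec instead of grep/sed.
describe apache_conf do
  its('ServerTokens') { should cmp 'Prod' }
end
```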
20. Key Trends
• While individual rule compliance is up, testing of security systems is down
• Sustainability is low. Fewer than a third of companies were found to be still fully compliant less than a year after successful validation.
22. Shell Scripts
grep "^Protocol" /etc/ssh/sshd_config | sed 's/Protocol //'
grep "^ServerTokens" /etc/httpd/conf/httpd.conf | sed 's/ServerTokens //'
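The one-liners above can be wrapped into a small reusable script that reports pass/fail per rule. A minimal sketch — the `check` function, file paths, and expected values are illustrative assumptions, not from the deck:

```shell
#!/bin/sh
# Sketch: generic key/value compliance check built from the grep/sed pattern above.
check() {
  desc="$1"; file="$2"; key="$3"; expected="$4"
  actual=$(grep "^$key" "$file" 2>/dev/null | sed "s/$key //")
  if [ "$actual" = "$expected" ]; then
    echo "PASS: $desc"
  else
    echo "FAIL: $desc (got '$actual', expected '$expected')"
  fi
}

check "SSH protocol version" /etc/ssh/sshd_config "Protocol" "2"
check "Apache ServerTokens" /etc/httpd/conf/httpd.conf "ServerTokens" "Prod"
```

This is exactly the approach the deck goes on to contrast with InSpec: it works on one machine, but it does not scale across platforms.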
49. InSpec
Test your machine locally
> inspec exec test.rb
Test a machine remotely via SSH
> inspec exec test.rb -i identity.key -t ssh://root@172.17.0.1
Test a machine remotely via WinRM
> inspec exec test.rb -t winrm://Admin@192.168.1.2 --password super
Test a Docker container
> inspec exec test.rb -t docker://5cc8837bb6a8
AGENTLESS
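The `test.rb` referenced in these commands is an ordinary InSpec profile file. A minimal hedged example (this particular control is illustrative, not taken from the deck):

```ruby
# Sketch: a minimal test.rb that any of the above targets could execute.
describe file('/etc/ssh/sshd_config') do
  it { should exist }
  its('mode') { should cmp '0600' }
end
```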
50. Operating System & Application Coverage
• Microsoft Windows
• Red Hat Enterprise Linux
• Ubuntu Linux
• SUSE Linux Enterprise Server
• Oracle Enterprise Linux
• AIX
• HP-UX
• Solaris
• VMware ESXi
• MySQL
• Oracle
• PostgreSQL
• Tomcat
• SQL Server
• IIS
• HTTP request
61. azure_resource_group
control 'azure-1' do
  impact 1.0
  title 'Checks that there is only one storage account in the resource group'
  describe azure_resource_group(name: 'MyResourceGroup').where { type == 'Microsoft.Storage/storageAccounts' }.entries do
    its('count') { should eq 1 }
  end
end
62. azure_virtual_machine
control 'azure-1' do
  impact 1.0
  title 'Make sure Ubuntu Servers are built from an Ubuntu template'
  describe azure_virtual_machine(name: '[YOUR VM NAME]', resource_group: '[YOUR RESOURCE GROUP]') do
    its('sku') { should eq '16.04.0-LTS' }
    its('publisher') { should eq 'Canonical' }
    its('offer') { should eq 'UbuntuServer' }
  end
end
end