In a world in which flexibility and scalability of ICT are required by almost every organization, and in which new legislation and cyber threats keep growing more sophisticated, cloud and security have become themes we can no longer avoid. For many organizations, both subjects are therefore hot topics on the ICT agenda.
Presentation of 15 June 2017
Big Data is an increasingly powerful enterprise asset, and this talk explores the relationship between big data and cyber security: how we preserve privacy while exploiting the advantages of data collection and processing. Big Data technologies give both governments and corporations powerful tools to offer more efficient and personalized services. The rapid adoption of these technologies has of course created tremendous social benefits. Unfortunately, an unwanted side effect is the rich pickings available to those with malicious intentions. Increasingly, sophisticated cyber attackers are able to exploit the rich array of public data to build detailed profiles of their adversaries.
Real-Time Analytics with Apache Cassandra and Apache Spark - Guido Schmutz
Time-series data is everywhere: IoT, sensor data, financial transactions. The industry has moved to databases like Cassandra to handle the high velocity and high volume of data that is now commonplace. However, data is pointless without being able to process it in near real time. That's where Spark combined with Cassandra comes in! What was once just your storage system (Cassandra) can be transformed into an analytics system, and it's really surprising how easy it is!
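The near-real-time pattern the abstract describes, windowed aggregation over a stream of time-stamped rows, can be sketched in plain Python. This is an illustration of the idea only; it uses an in-memory list of events rather than the actual Spark or Cassandra APIs:

```python
from collections import deque

def rolling_stats(events, window_seconds):
    """Compute a rolling average over (timestamp, value) events,
    emulating the windowed aggregations Spark would run over
    time-series rows stored in Cassandra."""
    window = deque()  # events inside the current time window
    total = 0.0
    results = []
    for ts, value in events:
        window.append((ts, value))
        total += value
        # evict events that have fallen out of the window
        while window and window[0][0] <= ts - window_seconds:
            _, old_value = window.popleft()
            total -= old_value
        results.append((ts, total / len(window)))
    return results

# Sensor readings: (timestamp_seconds, temperature)
readings = [(0, 20.0), (5, 22.0), (12, 30.0)]
print(rolling_stats(readings, window_seconds=10))
# → [(0, 20.0), (5, 21.0), (12, 26.0)]
```

In a real deployment the windowing and eviction would be handled by Spark Streaming's window operators; the sketch only shows why keeping a bounded window makes the computation scale with window size rather than stream length.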
High-Performance Analytics in the Cloud with Apache Impala - Cloudera, Inc.
With more and more data being generated and stored in the cloud, you need a modern data platform that can extend to any environment so you can derive value from all your data. Cloudera Enterprise is the leading enterprise Hadoop platform for cloud deployments. It’s the easiest way to manage and secure Hadoop data across any cloud environment and includes component-level support for cloud-native object stores. This makes the platform uniquely suited to handle transient jobs like ETL and BI analytics, as well as persistent workloads like stream processing and advanced analytics.
With the recent release of Cloudera 5.8, Apache Impala (incubating) has added support for Amazon S3, enabling business analysts to get instant insights from all data through high-performance exploratory analytics and BI.
3 Things to learn:
Join David Tishgart, Director of Product Marketing, and James Curtis, Senior Analyst Data Platforms & Analytics at 451 Research, as they discuss:
* Best practices for analytic workloads in the cloud
* A live demo and real-world use cases
* What’s next for Cloudera and the cloud
Privacera and Northwestern Mutual - Scaling Privacy in a Spark Ecosystem - Privacera
Privacera and Customer Northwestern Mutual Present "How to Scale Privacy in a Spark Ecosystem" at Data + AI Summit 2021
Privacera customer Aaron Colcord, Sr. Director of Data Engineering at Northwestern Mutual, and Don Bosco Durai, CTO and co-founder of Privacera, detail an important use case in privacy and demonstrate how the financial security leader scales privacy with a focus on business needs. Because privacy has become one of the most critical topics in data today, it is about more than how to ingest and consume data; it is about how to protect customers' rights while balancing business needs.
3 guiding principles to improve data security - Keith Braswell
The information explosion, the proliferation of endpoint devices, growing user volumes, and new computing models like cloud, social business, and big data have created new security vulnerabilities. To secure sensitive data and address compliance requirements, organizations need to adopt a more proactive and systematic approach. Read this white paper to learn three simple guiding principles to help your organization achieve better security and compliance without impacting production systems or straining already-tight budgets.
Consolidate your data marts for fast, flexible analytics 5.24.18 - Cloudera, Inc.
In this webinar, Cloudera and AtScale will showcase:
How a company can modernize their analytic architecture to deliver flexibility and agility to more end-users.
How using AtScale’s Universal Semantic layer can end the data chaos and allow business users to use the data in the modern platform.
How AtScale and Cloudera's analytic database perform, demonstrated with newly completed TPC-DS standard benchmarking.
Best practices for migrating from legacy appliances.
How to Architect a Serverless Cloud Data Lake for Enhanced Data Analytics - Informatica
This presentation is geared toward enterprise architects and senior IT leaders looking to drive more value from their data by learning about cloud data lake management.
As businesses focus on leveraging big data to drive digital transformation, technology leaders are struggling to keep pace with the high volume of data coming in at high speed and rapidly evolving technologies. What's needed is an approach that helps you turn petabytes into profit.
Cloud data lakes and cloud data warehouses have emerged as a popular architectural pattern to support next-generation analytics. Informatica's comprehensive AI-driven cloud data lake management solution natively ingests, streams, integrates, cleanses, governs, protects and processes big data workloads in multi-cloud environments.
Please leave any questions or comments below.
Real-time analytics is a beautiful thing, especially if you can build it in a quick, scalable, and robust way. We built a digital command center for our marketing team, which provided real-time analytics on social media, clickstream, and Google search terms, in the span of a couple of months. This solution was built entirely on open-source technologies, using a combination of Apache NiFi, Elasticsearch, and Hadoop. Simple but very effective. In this presentation I would like to share the architecture, learnings, and business benefits of this solution.
Realizing the Promise of Big Data with Hadoop - Cloudera Summer Webinar Serie... - Cloudera, Inc.
Apache Hadoop, an open-source platform, is increasingly gaining adoption within organizations trying to draw insight from all the big data being generated. Hadoop, and a handful of open-source tools that complement it, are promising to make gigantic and diverse datasets easily and economically available for quick analysis. A burgeoning partner ecosystem is also essential to helping organizations turn big data into business value.
Data analytics, Spark, Hadoop, and AI have become fundamental tools for driving digital transformation. A critical challenge is moving from isolated experiments to an organizational or enterprise production infrastructure. In this talk, we break apart the modern data analytics workflow to focus on the data challenges across different phases of the analytics and AI life cycle. By presenting a unified approach to data storage for AI and analytics, organizations can reduce costs, modernize their data strategy, and build a sustainable enterprise data lake. By anticipating how Hadoop, Spark, TensorFlow, Caffe, and traditional analytics such as SAS and HPC can share data, IT departments and data science practitioners can not only co-exist but speed time to insight. We will present the tangible benefits of a reference architecture using real-world installations that span proprietary and open-source frameworks. Using intelligent software-defined shared storage, users are able to eliminate silos, reduce multiple data copies, and improve time to insight. PALLAVI GALGALI, Offering Manager, IBM, and DOUGLAS O'FLAHERTY, Portfolio Product Manager, IBM
The General Data Protection Regulation (GDPR), which takes effect in 2018, brings new requirements for managing the personal and sensitive data of European Union subjects. The Privacy Shield framework, enacted in 2016, now regulates the movement of data between the EU and the US. Together, both regulations are changing how CXOs think about procuring, storing, and processing personal and sensitive data.
Over the last few years, open-source projects such as Apache Ranger and Apache Atlas have been driving comprehensive security and governance within Hadoop and the big data ecosystem. Solution vendors such as Privacera leverage the power of Hadoop and Apache projects such as Atlas and Ranger to help security and compliance teams within enterprises easily identify and protect data subject to privacy regulations and monitor the use of such data.
This talk will cover the current regulatory climate in Europe and how it can impact big data implementations. We will walk through a business framework that enterprises can use to build a strategy for managing GDPR, Privacy Shield, and other regulations. We will use a live demonstration to show how projects such as Apache Ranger and Apache Atlas, and solutions such as Privacera, can be used effectively to address specific requirements of these regulations.
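As a rough illustration of the tag-based protection these tools enable, the sketch below masks any column tagged as PII before a row is released to an analyst. It is plain Python with a hypothetical tag catalog, not the actual Ranger or Atlas APIs; in practice Atlas holds the classifications and Ranger enforces the masking at query time:

```python
import hashlib

# Hypothetical tag catalog, standing in for classifications Atlas would hold
COLUMN_TAGS = {"email": "PII", "ssn": "PII", "purchase_total": "PUBLIC"}

def mask(value):
    """Replace a sensitive value with a stable one-way token."""
    return hashlib.sha256(value.encode()).hexdigest()[:12]

def apply_policy(row, tags=COLUMN_TAGS):
    """Mask every column tagged PII, emulating a tag-based masking
    policy of the kind Ranger enforces at query time."""
    return {col: mask(str(val)) if tags.get(col) == "PII" else val
            for col, val in row.items()}

row = {"email": "ada@example.com", "ssn": "123-45-6789", "purchase_total": 42.0}
print(apply_policy(row))  # email and ssn become tokens; purchase_total passes through
```

Hashing rather than redacting keeps the masked column usable for joins and counts, which is one common design choice for analytics on protected data.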
Today, financial services firms rely on data as the basis of their industry. In the absence of means of production for physical goods, data is the raw material used to create value for, and capture value from, the market. However, as data volume and variety increase, so do the susceptibility to fraud and the temptation for hackers. Learn how an enterprise data hub built on Hadoop enables advanced security and machine learning on much more descriptive, real-time data to detect and prevent fraud, from payment encryption to anti-money-laundering processes.
Verizon: Finance Data Lake implementation as a Self Service Discovery Big Dat... - DataWorks Summit
The Finance Data Lake's objective is to create a centralized enterprise data repository for all Finance and Supply Chain data. It serves as the single source of truth and enables a self-service discovery analytics platform for business users to answer ad hoc business questions and derive critical insights. The data lake is based on the open-source Hadoop big data platform and is a very cost-effective solution for breaking ERP data silos and simplifying the enterprise data architecture.
POCs were conducted on an in-house Hortonworks Hadoop data platform to validate cluster performance at production volumes. Based on business priorities, an initial roadmap was defined using three data sources: two SAP ERPs and PeopleSoft (OLTP systems). A development environment was established in the AWS cloud for agile delivery. The near-real-time data ingestion architecture for the data lake was defined using replication tools and a custom Sqoop-based micro-batching framework, with data persisted in Apache Hive in ORC format. Data and user security is implemented using Apache Ranger, and sensitive data is stored at rest in encryption zones. Business data sets were developed as Hive scripts and scheduled using Oozie. Connectivity for multiple reporting tools, including SQL tools, Excel, and Tableau, was enabled for self-service analytics. Upon successful implementation of the initial phase, a full roadmap was established to extend the Finance data lake to over 25 data sources, scale data ingestion, and enable OLAP tools on Hadoop.
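The custom micro-batching framework itself is not described in detail, but the high-water-mark pattern that Sqoop-style incremental imports rely on can be sketched as follows. This is plain Python with an in-memory table; the column name `updated_at` is illustrative:

```python
def microbatch(source_rows, last_watermark):
    """Pull one incremental batch: rows whose modification timestamp is
    newer than the last high-water mark, which is how Sqoop-style
    incremental imports typically track progress."""
    batch = [r for r in source_rows if r["updated_at"] > last_watermark]
    new_watermark = max((r["updated_at"] for r in batch), default=last_watermark)
    return batch, new_watermark

# Simulated ERP table and target table
source = [{"id": 1, "updated_at": 100}, {"id": 2, "updated_at": 205}]
target, watermark = [], 0

for _ in range(2):  # two scheduled micro-batch runs
    batch, watermark = microbatch(source, watermark)
    target.extend(batch)  # in practice: append as ORC files to a Hive table

print(len(target), watermark)  # → 2 205
```

The second run pulls nothing because the watermark already covers every row; this idempotence is what makes frequently scheduled micro-batches safe.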
In 2015/16 Worldpay deployed its Enterprise Data Platform, a highly secure cluster used for analysis of over 65 billion card transactions and the subject of last year's Hadoop Summit keynote in Dublin. A year on, we are now rapidly expanding our platform with true multi-tenancy. For our first tenant we have built and deployed the analytics and reporting for our central platforms. Our second tenant deploys 'decision engines' into our core business systems. These allow Worldpay to make decisions, derived from machine learning, on how we authorise and route payments traffic and how these decisions affect the consumer, merchant, and other business partners. We are also developing further tenants for systems management and security. This talk will look at what it means to truly have a single enterprise data lake with multiple tenants sharing that data, and will look forward to how we will extend the platform in 2017 with Hadoop 3.
Just the sketch: advanced streaming analytics in Apache Metron - DataWorks Summit
Doing advanced analytics in streaming architectures presents unique challenges around the trade-off between having more context and performance. Typically, performance and scalability requirements mandate that each message in a stream be operated on without the context of other messages that may have come before. In this talk, we discuss using sketching algorithms to engineer a compromise that allows us to consider historical state without compromising scalability.
What we found when analyzing the capabilities of many similar SIEMs and cybersecurity platforms is that a good portion of the advanced analytics boil down to simple rules enriched with the ability to do statistical baselining, set existence, and set cardinality computations. These operations are difficult to do in-stream, so often they're done after the fact. We look at ways to open up these analytics to stream computation without sacrificing scalability.
Specifically, we will introduce the infrastructure built for Apache Metron to perform these kinds of tasks. We will cover the novel integration between Apache Storm and Apache HBase, orchestrated by a custom domain-specific language called Stellar, which takes the sting out of constructing sketches and using them for simple and more advanced analytics, such as statistical outlier analysis, in stream. CASEY STELLA, Principal Software Engineer, Hortonworks
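As a minimal illustration of the statistical baselining mentioned above, the sketch below keeps an online mean and variance using Welford's algorithm and flags values far from the baseline. It is plain Python to show the idea, not Metron's Stellar implementation:

```python
import math

class StreamingBaseline:
    """Online mean/variance via Welford's algorithm: one way to keep a
    statistical baseline over a stream without storing past messages."""
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def update(self, x):
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def is_outlier(self, x, z=3.0):
        """Flag values more than z standard deviations from the baseline."""
        if self.n < 2:
            return False
        std = math.sqrt(self.m2 / (self.n - 1))
        return std > 0 and abs(x - self.mean) > z * std

baseline = StreamingBaseline()
for value in [10, 11, 9, 10, 12, 10, 11]:  # e.g. requests/second per host
    baseline.update(value)

print(baseline.is_outlier(10))   # typical value → False
print(baseline.is_outlier(500))  # far outside the baseline → True
```

Because the state is just three numbers per tracked entity, this kind of sketch fits the talk's constraint: historical context without buffering the stream's history.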
Beyond Kerberos and Ranger - Tips to discover, track and manage risks in hybr... - DataWorks Summit
Even after deploying traditional security measures such as authentication and authorization to secure sensitive data, data owners and security teams still struggle to manage, and gain visibility into, data risks. The challenge multiplies when data moves and is shared across different silos, such as on-premises Hadoop and public cloud infrastructures like AWS, Azure, and Google Cloud. To control the risks that come with data, enterprises need a comprehensive data-centric approach to easily identify risks, manage security and compliance policies, and implement behavior analytics to differentiate between good and bad behavior. This talk will explain a three-step process for implementing data-centric controls in your hybrid environment: discovering where sensitive data is stored, tracking where data is moving, and identifying and controlling potential misuse of the data in near real time.
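The first of those three steps, discovering where sensitive data is stored, can be illustrated with a toy scanner. This is plain Python; the dataset paths are illustrative, and real discovery tools use far richer classifiers than two regular expressions:

```python
import re

# Simple patterns for two common sensitive data types
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def discover(datasets):
    """Scan each dataset and report which sensitive data types it
    contains -- step one of the data-centric approach."""
    findings = {}
    for name, rows in datasets.items():
        hits = {dtype for row in rows for dtype, pat in PATTERNS.items()
                if pat.search(row)}
        if hits:
            findings[name] = sorted(hits)
    return findings

silos = {
    "hdfs:/finance/customers": ["jane@example.com, 123-45-6789"],
    "s3://logs/clicks": ["GET /home 200"],
}
print(discover(silos))
# → {'hdfs:/finance/customers': ['email', 'ssn']}
```

Once such an inventory exists, the later steps (tracking movement, detecting misuse) have something concrete to attach policies and alerts to.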
Customer Best Practices: Optimizing Cloudera on AWS - Cloudera, Inc.
Join Cloudera’s Alex Moundalexis, who will discuss time-saving design and best practices for deploying Cloudera Enterprise clusters in AWS. He will also be joined by Josh Hammer, Partner Solutions Architect at Amazon Web Services who will highlight unique advantages of running Cloudera on AWS.
In this interactive webinar, we will hear from Celgene, a global biopharmaceutical company, and explore best practices for running your Cloudera Enterprise cluster on AWS:
AWS components (EC2, S3, RDS, EBS, VPC, Direct Connect, Service Limits)
Deployment Topology
Roles & Instance Types
Networking, Connectivity and Security
Storage Configuration
Capacity Planning
Provisioning Instances
3 things to learn:
AWS components (EC2, S3, RDS, EBS, VPC, Direct Connect, Service Limits)
Networking, Connectivity and Security
Deployment Topology
Webinar - DataStax Enterprise 5.1: 3X the operational analytics speed, help f... - DataStax
Let’s get up close with the latest and greatest in DataStax Enterprise. Find out everything you need to know about the latest features, as we dive into the launch of DataStax Enterprise 5.1, OpsCenter 6.1 and Studio 2.0.
View recording: https://youtu.be/Vhu6ZkQUR0M
Explore all DataStax webinars: http://www.datastax.com/resources/webinars
Microsoft Cloud GDPR Compliance Options (SUGUK) - Andy Talbot
Recently, Microsoft introduced Microsoft 365, which brings together Office 365, Windows 10, and Enterprise Mobility + Security. We'll explore what this combination of products means for an organisation looking to ensure GDPR compliance, and which additional Office 365 products you can layer on to help meet your obligations.
A Key to Real-time Insights in a Post-COVID World (ASEAN) – Denodo
Watch full webinar here: https://bit.ly/2EpHGyd
Presented at Data Champions, Online Asia 2020
Businesses and individuals around the world are experiencing the impact of a global pandemic. With many workers and potential shoppers still sequestered, COVID-19 is proving to have a momentous impact on the global economy. Both now and in the post-pandemic era, real-time data is even more critical to healthcare practitioners, business owners, government officials, and the public at large, for whom holistic and timely information is essential to making quick decisions. It enables doctors to decide quickly where to focus care, business owners to alter production schedules to meet demand, government agencies to contain the pandemic, and the public to stay informed about prevention.
In this on-demand session, you will learn about the capabilities of data virtualization as a modern data integration technique and how organisations can:
- Rapidly unify information from disparate data sources to make accurate decisions and analyse data in real-time
- Build a single engine for security that provides audit and control by geographies
- Accelerate delivery of insights from your advanced analytics project
Accelerate Self-Service Analytics with Data Virtualization and Visualization – Denodo
Watch full webinar here: https://bit.ly/39AhUB7
Enterprise organizations are shifting to self-service analytics as business users need real-time access to holistic and consistent views of data regardless of its location, source or type for arriving at critical decisions.
Data Virtualization and Data Visualization work together through a universal semantic layer. Learn how they enable self-service data discovery and improve performance of your reports and dashboards.
In this session, you will learn:
- Challenges faced by business users
- How data virtualization enables self-service analytics
- Use case and lessons from customer success
- Overview of the highlight features in Tableau
Microsoft Azure is an ever-expanding set of cloud services to help your organization meet your business challenges. It’s the freedom to build, manage, and deploy applications on a massive, global network using your favorite tools and frameworks.
Productive
Reduce time to market by delivering features faster with over 100 end-to-end services.
Hybrid
Develop and deploy where you want, with the only consistent hybrid cloud on the market. Extend Azure on-premises with Azure Stack.
Intelligent
Create intelligent apps using powerful data and artificial intelligence services.
Trusted
Join the startups, governments, and 90 percent of Fortune 500 businesses that run on the Microsoft Cloud today.
FSI201 FINRA’s Managed Data Lake – Next Gen Analytics in the Cloud – Amazon Web Services
FINRA’s Data Lake unlocks the value in its data to accelerate analytics and machine learning at scale. FINRA's Technology group has changed its customer's relationship with data by creating a Managed Data Lake that enables discovery on Petabytes of capital markets data, while saving time and money over traditional analytics solutions. FINRA’s Managed Data Lake includes a centralized data catalog and separates storage from compute, allowing users to query from petabytes of data in seconds. Learn how FINRA uses Spot instances and services such as Amazon S3, Amazon EMR, Amazon Redshift, and AWS Lambda to provide the 'right tool for the right job' at each step in the data processing pipeline. All of this is done while meeting FINRA’s security and compliance responsibilities as a financial regulator.
Transforming and Scaling Large Scale Data Analytics: Moving to a Cloud-based ... – DataWorks Summit
The Census Bureau is the U.S. government's largest statistical agency, with a mission to provide current facts and figures about America's people, places and economy. The Bureau operates a large number of surveys to collect this data, the best known being the decennial population census. Data is being collected in increasing volumes, and the analytics solutions must scale to meet ever-increasing needs while maintaining the confidentiality of the data. Past data analytics occurred in processing silos, inhibiting the sharing of information, and common reference data was replicated across multiple systems. The use of the Hortonworks Data Platform, Hortonworks DataFlow and other open-source technologies is enabling the creation of a cloud-based enterprise data lake and analytics platform. Cloud object stores provide scalable data storage, cloud compute supports permanent and transient clusters, and data governance tools track data lineage and provide access controls to sensitive data.
To disrupt and innovate, you need access to data: all of your data. The challenge for many organisations is that the data they need is locked away in a variety of silos, and there is perhaps no bigger silo than one of the most widely deployed business applications: SAP. Bringing together all your data for analytics and machine learning unlocks new insights and business value. Together, Cloudera and Datavard hold the key to breaking SAP data out of its silo, providing access to unlimited, untapped opportunities that currently lie hidden.
Customer migration to Azure SQL database, December 2019 – George Walters
This is a real-life story of how a software-as-a-service application moved to the cloud, to Azure, over a period of two years. We discuss the migration, the business drivers, the technology, and how it got done, and we talk through modern ways to refactor or change code to move it into the cloud.
Webinar: Backup the new way: simple, fast and scalable – ICT-Partners
Are you responsible for managing your organisation's backups? Do the following points sound familiar?
- Backup is a time-consuming chore I don't really have time for.
- Scheduling all the jobs so that everything fits is a hellish task!
- Restoring a backup takes far too long for my organisation.
In this webinar we show you, including via a live demo, how to transform backup management into a simple, secure and fast process that requires hardly any attention.
In this presentation, we advise you on the best way to arrive at a strategic choice for your renewed workspace infrastructure: a workspace centred on innovation, simplicity, cost savings, security and maximum alignment with the business.
Presentation of 23 March 2017
Does the cloud really suit your organisation? – ICT-Partners
Because of their ease of use and pay-per-use model, cloud options are very tempting to adopt in your organisation. But do they fit your organisation's needs and interests? However you look at it, adopting IaaS, PaaS or SaaS is an explicit decision to (out)source a service.
Presentation of 26 January 2017
Citrix and Nutanix: the power of the combination – ICT-Partners
In this presentation, we inform you about the benefits and possibilities of Citrix combined with Nutanix, and share practical tips & tricks for successfully carrying out a workspace transformation in your organisation.
Presentation of 24 November 2016
Webinar: 'Back to simplicity in ICT | The roadmap for ICT innovation & datacen...' – ICT-Partners
In this webinar, 'Back to simplicity in ICT | The roadmap for ICT innovation & datacenter transformation', we explain how to accelerate innovation within your ICT organisation and give practical tips & tricks for drawing up an ICT roadmap that bridges business and ICT.
Presentation of 25 October 2016
The next step in workspace | Simple, Scalable, Secure – ICT-Partners
The workspace has already been optimised considerably in recent years. We give end users the option of flexible working and allow work on (personal) mobile devices, so ICT has already gone a long way towards meeting users' demands and wishes. Yet the business keeps asking for an even faster workspace that maximises user efficiency and productivity.
New technologies now make it easier to meet the demands of your business and to set up your workspace infrastructure even better, more securely and more manageably. Time, then, for 'The next step in workspace'.
Presentation of 6 October 2016
Over the past decades, ICT has become (unnecessarily) complex due to all kinds of factors, such as the economic crisis and technological developments. A similar development took place in the automotive industry, but there a counter-answer suddenly came from an unexpected corner... Inspired by striking examples from the automotive industry, our CEO, Pascal Verberkt, shows you that simplicity in ICT, thanks to disruptive innovation, is no longer an illusion! Our CTO, Frans Loth, then translates the automotive principles into practice: how to apply them in transforming your ICT into a simple, secure and manageable environment that optimally supports your business.
Presentation of 9 June 2016
The new IT reality makes physical boundaries disappear, puts people first and makes ICT infrastructures invisible. At the same time, this revolution means that with smart use of cloud options you can even make your datacenter mobile.
Presentation of 9 June 2016
IoT as the driver of a new form of UC – ICT-Partners
The world is changing: people want to use communication tools faster and more efficiently, and everything is connected. Learn about the role Unified Communications plays in this change. How can your organisation benefit from it, and which technology makes it possible? Think, for example, of the KEMP load-balancer solution.
Presentation of 9 June 2016
Discussing strategies for building the next gen data centre – ICT-Partners
Get to know our vision on datacenter transformation. BigTec helps you with its own reference architecture: a 'solution stack' of 'best-of-breed' solutions using Software Defined and web-scale technologies. By applying solutions from Nutanix, Rubrik, VMTurbo, AVI Networks and others in this reference architecture, you create a solid foundation for the Software Defined Data Centre.
Presentation of 9 June 2016
Do you still know where your company data resides? Your data is (or soon will be) everywhere. Together with Commvault, we show how your organisation can stay in control of, and add value to, your data, whether it resides on-premises, in the cloud or on an end-user device.
Presentation of 9 June 2016
Rapid change in the ICT landscape: the need for security 2.0? – ICT-Partners
Evolving threats such as ransomware and data breaches can have a nasty impact on your organisation, so start quickly with a number of obvious actions to counter them. Look at current developments such as cloud and Software Defined Datacenters, analyse the risks and take appropriate measures. In this session, ICT-Partners gives you insight into a pragmatic step-by-step plan for tackling security risks.
Presentation of 9 June 2016
Would you like insight into the performance of your ICT environment? To know which transitions are needed to make improvements, or whether implemented changes had the desired effect? In this break-out, ICT-Partners uses the Lakeside SysTrack solution and the RIVM customer case to show you how to make the invisible visible.
Presentation of 9 June 2016
By adopting a Software Defined Datacenter, you lay the foundation for an 'invisible' ICT infrastructure. It becomes possible to roll out functionality automatically, and you can easily make costs transparent for yourself and your customers. With a transition to a Software Defined Datacenter, you can offer your ICT facilities to your end users as a utility, just like gas, water and electricity.
Presentation of 9 June 2016
In this hands-on workshop, we got acquainted with the Nutanix Xtreme Computing Platform! This web-scale, invisible ICT infrastructure provides a scalable, cost-efficient and easily manageable datacenter.
Presentation of 10 March 2016
Windows 10, Microsoft's new operating system, has been available since 29 July 2015. The media have covered at length what it is and what it can do, but that information is mainly aimed at consumers. What benefits does Windows 10 offer your organisation, and how do you ensure a smooth migration...
Presentation of 8 October 2015
Nutanix in practice: data protection strategies – ICT-Partners
With Nutanix you can implement all the different data protection, disaster recovery and backup strategies.
Which are they, and which one suits you? This session explains the options you can apply today:
from snapshots, rack awareness and metro availability to backup to the cloud.
Presentation of 28 May 2015
Citrix Workspace Suite: the workspace implementation that cannot fail! – ICT-Partners
The Citrix Workspace Suite is a complete solution that enables your employees to work independently of place and time. Users get the best possible experience on all their devices, anywhere, anytime. Administrators are not forgotten either: managing large numbers of workspaces has never been easier. The result is a very low TCO and short ROI, but above all satisfied users.
Presentation of 28 May 2015
Why the end user experience needs Service Assurance (Xangati) – ICT-Partners
With the increased importance of being agile, businesses are increasingly reliant on IT to deliver new applications and a high-quality end user experience. End user experience is everything, and during this talk we will explore: How do you quickly find out why applications are not performing as expected, so the business does not continue to suffer? How do you know whether you have sufficient resources, or when you are overprovisioning? And more...
Presentation of 28 May 2015
Essentials of Automations: Optimizing FME Workflows with Parameters – Safe Software
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
"Impact of front-end architecture on development cost", Viktor Turskyi – Fwdays
I have heard many times that architecture is not important for the front-end. I have also seen many times how developers implement front-end features simply by following a framework's standard rules, believing this is enough to launch the project successfully, and then the project fails. How do you prevent this, and which approach should you choose? I have launched dozens of complex projects, and during the talk we will analyse which approaches have worked for me and which have not.
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -... – DanBrown980551
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo... – James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. A constant focus on speed to release software to market, combined with traditionally slow and manual security checks, has created gaps in continuous security, an important piece of the software supply chain. Today, organizations feel more susceptible to external and internal cyber threats due to the vast attack surface of their application supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
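A deployment bill of materials can be pictured as a verifiable inventory of what actually shipped. The sketch below is purely illustrative (not the speakers' tooling; the manifest fields are assumptions for the example): it hashes each deployed artifact and records the digests in a JSON manifest that can later be compared against what is running.

```python
import hashlib
import json

def dbom_entry(name: str, content: bytes) -> dict:
    """Record one deployed artifact with a content digest."""
    return {"name": name,
            "sha256": hashlib.sha256(content).hexdigest(),
            "size": len(content)}

def build_dbom(artifacts: dict, environment: str) -> str:
    """Build a minimal DBOM manifest for one deployment."""
    manifest = {
        "environment": environment,
        "components": [dbom_entry(n, c) for n, c in sorted(artifacts.items())],
    }
    return json.dumps(manifest, indent=2)

print(build_dbom({"app.jar": b"\x00binary", "config.yaml": b"replicas: 3"},
                 environment="prod"))
```

Because the digests are content-addressed, any drift between the approved manifest and the deployed environment shows up as a hash mismatch, which is the property a deployment firewall can enforce on.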
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with a passion for making things work, and a knack for helping others understand how they work. He brings around 20 years of solution-engineering experience in application security, software continuous delivery, and SaaS platforms, and is known for his dynamic presentations on CI/CD and application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered Quality – Inflectra
In this insightful webinar, Inflectra explores how artificial intelligence (AI) is transforming software development and testing. Discover how AI-powered tools are revolutionizing every stage of the software development lifecycle (SDLC), from design and prototyping to testing, deployment, and monitoring.
Learn about:
• The Future of Testing: How AI is shifting testing towards verification, analysis, and higher-level skills, while reducing repetitive tasks.
• Test Automation: How AI-powered test case generation, optimization, and self-healing tests are making testing more efficient and effective.
• Visual Testing: Explore the emerging capabilities of AI in visual testing and how it's set to revolutionize UI verification.
• Inflectra's AI Solutions: See demonstrations of Inflectra's cutting-edge AI tools like the ChatGPT plugin and Azure Open AI platform, designed to streamline your testing process.
Whether you're a developer, tester, or QA professional, this webinar will give you valuable insights into how AI is shaping the future of software delivery.
DevOps and Testing slides at DASA Connect – Kari Kakkonen
Slides by me and Rik Marselis from the DASA Connect conference on 30 May 2024. We discuss what testing is, what agile testing is, and finally what testing in DevOps looks like. We also ran a lovely workshop in which participants explored different ways to think about quality and testing in the different parts of the DevOps infinity loop.
UiPath Test Automation using UiPath Test Suite series, part 3 – DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 3. In this session, we will cover desktop automation along with UI automation.
Topics covered:
- UI automation introduction
- UI automation sample
- Desktop automation flow
Pradeep Chinnala, Senior Consultant Automation Developer @WonderBotz and UiPath MVP
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
8. ICT-Partners on ICT
Optimise investments and invest in innovation.
Make ICT a true business enabler.
A sound strategic ICT policy embraces technological developments that strengthen the competitive position of the business.
9. Part I - Cloud
'The (practical) benefits of Microsoft Cloud'
By Jan Depping, Microsoft / André Kamman, ICT-Partners
11. "Technology is fast reshaping our world and has the potential to change everything – people, businesses, communities and nations." – Luis Alvarez, CEO, Global Services, BT (BT Global Services CIO report 2016: The Digital CIO)
Figures (multiple industry sources):
- Employees work on nearly 2x the number of teams they did five years ago
- 41% of employees say mobile business apps are changing the way they work
- Information overload wastes 25% of employee time, costing U.S. business $997B each year
- 160M customer records leaked
- 229 days to detect security infiltration
19. [Diagram. Labels: Ecosystem Provided; Languages, Dev Tools & App Containers; CMS & Apps; Devices; Databases; Management; MS Integrated; Operating systems; Bring your own; libcloud; jclouds; IDE.]
22. [Map: Microsoft infrastructure – 38 cloud regions worldwide, 100+ datacenters, one of the 3 largest networks in the world.
Global regions: West US, West US 2, West Central US, Central US, South Central US, North Central US, East US, East US 2, Canada Central, Canada East, Brazil South, North Europe, West Europe, United Kingdom West, United Kingdom South, France (3), Japan East, Japan West, East Asia, Southeast Asia, Central India, South India, West India, Korea Central (3), Korea South (3), Australia East, Australia Southeast.
Sovereign regions: US Gov Virginia, US Gov Iowa, US Gov Texas (3), US Gov Arizona (3), US DoD East, US DoD West, China East (1), China West (1), Germany Central (2), Germany Northeast (2).
Notes: (1) China datacenters operated by 21Vianet; (2) German data trustee services provided by T-Systems; (3) France, South Korea and US Gov datacenter regions have been announced but are not yet operational.]
24. [Diagram: Azure Platform Services and Infrastructure Services. Services shown include: Web Apps, Mobile Apps, API Apps, Notification Hubs; Hybrid Cloud (Backup, StorSimple, Azure Site Recovery, Import/Export); SQL Database, DocumentDB, Redis Cache, Azure Search, Storage Tables, SQL Data Warehouse; Azure AD Health Monitoring, AD Privileged Identity Management, Operational Analytics; Cloud Services, Batch, RemoteApp, Service Fabric; Visual Studio, Application Insights, VS Team Services; Domain Services; HDInsight, Machine Learning, Stream Analytics, Data Factory, Event Hubs, Data Lake Analytics Service, Data Lake Store, IoT Hub, Data Catalog; Security & Management (Azure Active Directory, Multi-Factor Authentication, Automation, Portal, Key Vault, Store/Marketplace, VM Image Gallery & VM Depot, Azure AD B2C, Scheduler); Xamarin, HockeyApp, Power BI Embedded, SQL Server Stretch Database, Mobile Engagement, Functions, Cognitive Services, Bot Framework, Cortana, Security Center, Container Service, VM Scale Sets; BizTalk Services, Service Bus, Logic Apps, API Management; Content Delivery Network, Media Services, Media Analytics.]
27. Office 365 – the most complete, intelligent and secure service for digital work:
- Authoring: Word, Excel, PowerPoint, OneNote
- Mail & Social: Outlook, Yammer
- Sites & Content: OneDrive, SharePoint, Delve
- Chat, Meetings & Voice: Microsoft Teams, Skype for Business
- Analytics: Power BI, MyAnalytics
- Plus: Office 365 Groups, Graph, Security & Compliance
30. The General Data Protection Regulation (GDPR) imposes new rules on organizations that offer goods and services to people in the European Union (EU), or that collect and analyze data tied to EU residents, no matter where they are located. It provides clarity and consistency for the protection of personal data:
- Enhanced personal privacy rights
- Increased duty for protecting data
- Mandatory breach reporting
- Significant penalties for non-compliance
31. What are the key changes with the GDPR?
- Personal privacy – individuals have the right to: access their personal data; correct errors in their personal data; erase their personal data; object to processing of their personal data; export personal data.
- Controls and notifications – processors will need to: protect personal data using appropriate security practices; notify authorities within 72 hours of breaches; receive consent before processing personal data; keep records detailing data processing.
- Transparent policies – processors are required to: provide clear notice of data collection; outline processing purposes and use cases; define data retention and deletion policies.
- IT and training – processors will need to: train privacy personnel & employees; audit and update data policies; employ a Data Protection Officer (for larger organizations); create & manage processor/vendor contracts.
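The 72-hour breach-notification requirement above is simple calendar arithmetic, but teams often encode it explicitly so the deadline is unambiguous. A minimal helper (illustrative only; the function name and sample timestamp are made up):

```python
from datetime import datetime, timedelta

# GDPR requires notifying the supervisory authority within 72 hours.
NOTIFICATION_WINDOW = timedelta(hours=72)

def notification_deadline(breach_detected: datetime) -> datetime:
    """Latest moment the supervisory authority must be notified."""
    return breach_detected + NOTIFICATION_WINDOW

detected = datetime(2018, 5, 25, 9, 0)
print(notification_deadline(detected))  # 2018-05-28 09:00:00
```

In practice, the clock starts when the organisation becomes aware of the breach, so logging that detection timestamp accurately matters as much as the arithmetic.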
32. How do I get started?
1. Discover – identify what personal data you have and where it resides
2. Manage – govern how personal data is used and accessed
3. Protect – establish security controls to prevent, detect, and respond to vulnerabilities & data breaches
4. Report – keep required documentation, manage data requests and breach notifications
33. Discover (step 1): example solutions
- Microsoft Azure: Azure Data Catalog
- Enterprise Mobility + Security (EMS): Microsoft Cloud App Security
- Dynamics 365: Audit Data & User Activity, Reporting & Analytics
- Office & Office 365: Data Loss Prevention, Advanced Data Governance, Office 365 eDiscovery
- SQL Server and Azure SQL Database: SQL Query Language
- Windows & Windows Server: Windows Search
34. Manage (step 2): example solutions
- Microsoft Azure: Azure Active Directory, Azure Role-Based Access Control (RBAC)
- Enterprise Mobility + Security (EMS): Azure Information Protection
- Dynamics 365: Security Concepts
- Office & Office 365: Advanced Data Governance, Journaling (Exchange Online)
- Windows & Windows Server: Microsoft Data Classification Toolkit
35. Protect (step 3): example solutions
- Microsoft Azure: Azure Key Vault
- Enterprise Mobility + Security (EMS): Azure Active Directory Premium, Microsoft Intune
- Office & Office 365: Advanced Threat Protection, Threat Intelligence
- SQL Server and Azure SQL Database: Transparent Data Encryption, Always Encrypted
- Windows & Windows Server: Windows Defender Advanced Threat Protection, Windows Hello, Device Guard
36. Report (step 4): example solutions
- Microsoft Trust Center: Service Trust Portal
- Microsoft Azure: Azure Auditing & Logging, Microsoft Azure Monitor
- Enterprise Mobility + Security (EMS): Azure Information Protection
- Dynamics 365: Reporting & Analytics
- Office & Office 365: Service Assurance, Office 365 Audit Logs, Customer Lockbox
- Windows & Windows Server: Windows Defender Advanced Threat Protection
37. A partnership: security responsibility varies by service type (SaaS / PaaS / IaaS / on-prem)
- Always retained by the customer: data governance & rights management, client endpoints, account & access management
- Varies by service type: identity & directory infrastructure, application, network controls, operating system
- Transfers to the cloud service provider: physical hosts, physical network, physical datacenter
38. Assurance process (cloud consumer as data controller; cloud provider as data processor; the customer or an employee of the cloud consumer as data subject)
1. The customer defines information security, privacy, compliance, legal and policy requirements (governance, risk & compliance; risks; mitigating controls).
2. Customer continuous assessment: monitoring, investigation, examination (real-time insights).
3. Customer requests assurances from the cloud vendor (contracting).
4. Microsoft provides attestations, certifications and legal commitments (independently verified, descriptive information, integrated controls).
5. Customer evaluates the claims and adds additional controls (custom(er) controls & processes).
6. Customer demonstrates compliance.
The coverage reports help customers map and evaluate Microsoft's assurances against local frameworks (NEN7510 / BIR).
39. The same assurance process, mapped to concrete artifacts:
- Contracting: Online Service Terms | SLA
- Independently verified: ISO 27001/2/17/18 | ISAE 3000 SOC 2 Type II | FedRAMP NIST 800-53 r4
- Descriptive information: Trust Center | white papers | verbal discussions
- Integrated controls: eDiscovery and analytics, archiving and retention, auditing (Activity API/log), compliance standards
47. The moment to move to the cloud...
When:
- When renewing software
- When hardware is outdated
- When extra functionality is needed
- When capacity within the ICT department is limited
- For workloads with peak demand
What:
- Convert to SaaS / move to the cloud / or keep a traditional datacenter, given a sound business case
Why:
- Fast implementation
- Flexibility
- Disaster recovery
- Automatic software updates
- Little/no capital expenditure
- Improved collaboration
- Location independence
- Control over documents
- Security
- Better use of resources/energy
49. To the cloud...
Preparatory steps:
- Rationalise processes and their applications
- Standardise where possible
- Differentiate where it adds value
- Create buy-in
Planning:
- A fitting plan
- Phased / run side by side
- Keep applications and data together
- Low-risk apps first
Testing:
- Draw up a test plan
- Work out scenarios
- Dummy tests
50. Use case 1: Microsoft Exchange
Trigger
Outdated Exchange environment, insufficient storage capacity, no redundancy.
51. Migration
• Technical design
• Planning
• Cut-over / staged / IMAP or hybrid migration
• Dummy user test
• Key-user test (hybrid test)
• Move end users over
Benefits
• No more infrastructure management
• Flexible storage
• Archiving
• No more Exchange migrations
• Built-in backup
Other points
• You keep full control
• Tenant request / subscription
• Management via GUI and command line (scripts)
• Decommission the old environment
• Train users
• Latency / waiting times
52. Use case 2: Azure Backup & Site Recovery
Trigger
Outdated backup environment, defective tape drive, need for extra functionality.
53. Benefits
• No more infrastructure management
• No more tape handling
• Long-term archiving
• Automatic scaling
• Inexpensive storage
• Redundant storage
• Disaster-recovery scenario in the cloud
Other points
• Azure subscription
• No tape support
• You keep full control
• Management via Azure Backup Server
• Train administrators
• Backup, restore and site recovery
• Latency / waiting times
• Windows and Linux
54. Migration
• Technical design
• Planning
• Migration
• Data / VMs / Exchange / SQL
• Backup and restore test
• Small workload first
• Then move everything over
• Decommission the old environment
59. Our solutions
ICT-Partners helps you succeed with your ICT. We design, implement and optimise ICT environments with the latest technologies, contributing to continuity, stability and cost savings.
63. Why anti-virus is no longer sufficient
(Illustration: the binary contents of a file.)
Wrappers
Change the outward 'appearance' of a file
Variations / Obfuscators
Change the underlying code so it appears unique
Packers
Designed so the code only runs on a real machine (anti-VM, sleepers, interactions, anti-debug)
Targeting
Designed so the code only runs on a specific machine or configuration
Malicious Code
The actual underlying code, aimed at stealing or manipulating information
78. Why cybercrime exists
1: Money
• Easy money
• Little risk
→ Many takers
2: Warfare / hacktivism
• Online war, country against country
• Goal: inflict damage, conduct espionage and disrupt systems
→ Cyber warfare as part of military strategy
• Hacktivism: organisations against countries or other organisations
• Goal: make a statement or disrupt systems
80. What's Happening in Today's Threat Landscape
In the past 12 months, 52% of organisations were compromised 1 to 5 times by cyber attacks.
Endpoints rank among the top 3 assets most difficult to secure (along with mobile devices and social media apps).
Organisations planning to augment or replace their existing endpoint protection:
2014: 56% | 2015: 67% | 2016: 86%
82. Today’s Threats Easily Evade Detection by AV-based Solutions
(Illustration: the binary contents of a file.)
Wrappers
Designed to turn known code into a new binary
Variations / Obfuscators
Designed to slightly alter code to make known
code appear new/different
Packers
Designed to make sure code runs only on a
real machine (anti-VM, sleepers, interactions,
anti-debug)
Targeting
Designed to allow code to run only on a
specific target machine/configuration
Malicious Code
The attack code that runs with the goal of
persisting, stealing, spying, or exfiltrating data
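All of the evasion techniques above defeat signature matching the same way: by changing the bytes of the file. A minimal Python sketch (illustrative, not from the presentation; the payload bytes are made up) shows why this works. Flipping a single byte produces a completely different SHA-256 hash, so a hash-based blocklist built for the original no longer matches the variant.

```python
import hashlib

# A benign stand-in for a malware payload (hypothetical bytes).
original = b"MZ\x90\x00example-payload-bytes"

# A wrapper or obfuscator only needs to change one byte...
variant = bytearray(original)
variant[-1] ^= 0xFF
variant = bytes(variant)

h1 = hashlib.sha256(original).hexdigest()
h2 = hashlib.sha256(variant).hexdigest()

# ...and the "signature" (the hash) is now entirely different,
# so a blocklist entry for h1 misses the variant.
print(h1 == h2)  # False
```

This is why the slides argue that behaviour-based detection is needed in addition to static signatures.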
86. The SentinelOne Endpoint Protection Platform
PRE-EXECUTION (static protection)
• Prevention: advanced static analysis, dynamic whitelisting / blacklisting, cloud intelligence
ON-EXECUTION (dynamic protection)
• Detection: dynamic behavior detection
POST-EXECUTION
• Response: 360-degree attack view, forensics, mitigation, remediation, rollback, auto-immunize
Delivered through a single autonomous agent (70 MB memory footprint) and a single management console supporting up to 25,000 endpoints, with cloud or on-premise deployment.
87. Advanced Static Prevention
§ Major breakthrough in signature-less detection, based on machine learning
§ Deep File Inspection (DFI) engine prevents advanced malware on access
§ Supported on all endpoint platforms: Windows / macOS / Linux
§ Engine supports all mitigation actions
31,000 unique file characteristics defined and referenced, covering known and unknown file-based malware
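The slide describes a machine-learning engine built on thousands of static file characteristics. As a hedged illustration of what one such characteristic can look like (this is a generic example, not SentinelOne's actual feature set), byte entropy is a classic static feature: packed or encrypted executables tend toward high entropy, which a model can use as one input among many.

```python
import math
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Shannon entropy of a byte string, in bits per byte (0.0 to 8.0)."""
    if not data:
        return 0.0
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())

plain = b"hello world " * 100     # repetitive text: low entropy
packed = bytes(range(256)) * 8    # uniform byte distribution: maximal entropy

print(byte_entropy(plain) < 4.0)   # True: readable, compressible content
print(byte_entropy(packed))        # 8.0, typical of packed/encrypted data
```

A real engine would combine hundreds of such features (section layout, imports, strings, and so on) and feed them to a trained classifier rather than thresholding any single value.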
90. Detection: Dynamic Behavior Tracking, a Machine Learning Approach
The SentinelOne agent watches an executable as it runs. (Illustration: the binary contents of an .exe file.)
• The file creates a copy of itself
• It opens cmd.exe and deletes the original file
• It creates an autorun registry key
• It encrypts other files / connects to an outbound network / logs keystrokes
→ MALWARE DETECTED
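The sequence above can be sketched as a toy behavior-scoring loop. This is an illustrative assumption about how such tracking could work, not SentinelOne's actual model: each observed action adds a suspicion weight, and crossing a threshold flags the process.

```python
# Hypothetical suspicion weights for observed behaviors (made-up values).
SUSPICION_WEIGHTS = {
    "self_copy": 1,        # file creates a copy of itself
    "delete_original": 2,  # spawns cmd.exe and deletes the original file
    "autorun_key": 3,      # creates an autorun registry key
    "encrypt_files": 5,    # encrypts other files
    "outbound_conn": 2,    # connects to an outbound network host
    "keylogging": 5,       # logs keystrokes
}
THRESHOLD = 6  # hypothetical detection threshold

def classify(observed_actions):
    """Return (score, verdict) for a sequence of observed behaviors."""
    score = sum(SUSPICION_WEIGHTS.get(a, 0) for a in observed_actions)
    verdict = "MALWARE DETECTED" if score >= THRESHOLD else "benign so far"
    return score, verdict

# The ransomware-like sequence from the slide crosses the threshold:
actions = ["self_copy", "delete_original", "autorun_key", "encrypt_files"]
print(classify(actions))  # (11, 'MALWARE DETECTED')
```

A production system would of course learn these weights from data rather than hand-code them, and would correlate actions across processes, but the scoring idea is the same.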
91. Response: Deep Endpoint Visibility and a 360° View of Attacks
Lightweight, autonomous agent
▪ Continuously monitors all user/kernel activity on the user endpoint or server, online or offline
▪ Server agents sit out-of-band, preserving performance
Full-context forensics in real time
▪ 360-degree view of threats, from inception to termination
93. Gartner Jan 2017
“SentinelOne Most Visionary”
“SentinelOne is the only vendor in this analysis
that includes full EDR-type functionality in the
core platform. SentinelOne is a good prospect to
replace or augment existing EPP solutions for
any company looking for a fresh approach and
integrated EDR…”
95. Certified Replacement for Anti-Virus
§ SentinelOne outperforms and replaces traditional AV-based solutions
§ Lightweight, fully autonomous agent monitors endpoints constantly
§ Dynamic Behavior Tracking identifies threats across all vectors, not just malware
§ Product excellence recognized across the IT industry