DataOps or how I learned to love production - Michael Hausenblas (Evention)
A plethora of data processing tools, most of them open source, is available to us. But who actually runs data pipelines? What about dynamically allocating resources to data pipeline components? In this talk we will discuss options for operating elastic data pipelines on modern, cloud-native platforms such as DC/OS with Apache Mesos, Kubernetes and Docker Swarm. We will review good practices, from containerizing workloads to making things resilient, and show elastic data pipelines in action.
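The "dynamically allocating resources" idea from the abstract can be reduced to a small scaling rule, the kind an autoscaler on Kubernetes or Mesos applies. The following is a hypothetical sketch, not code from the talk; the function name and thresholds are illustrative assumptions.

```python
# Hypothetical sketch: scale pipeline workers with queue depth, the kind of
# elasticity a scheduler like Kubernetes or Mesos provides via autoscaling.
def desired_workers(queue_depth, per_worker_throughput, min_workers=1, max_workers=10):
    """Return how many workers to run for the current backlog."""
    if per_worker_throughput <= 0:
        raise ValueError("throughput must be positive")
    needed = -(-queue_depth // per_worker_throughput)  # ceiling division
    return max(min_workers, min(max_workers, needed))

print(desired_workers(0, 100))     # idle -> floor of 1 worker
print(desired_workers(250, 100))   # backlog of 250 messages -> 3 workers
print(desired_workers(5000, 100))  # capped at 10 workers
```

A real platform would feed this from pipeline metrics and enforce the result by starting or stopping containers; the floor and cap keep the pipeline both responsive and bounded.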
Big Data Day LA 2016 / Use Case Driven track - How to Use Design Thinking to J... (Data Con LA)
There is a novel approach to identifying big data use cases, one which will ultimately lower the barrier to entry to big data projects and increase overall implementation success. This talk describes the approach used by big data pioneer and Datameer CEO Stefan Groschupf to drive over 200 production implementations.
S3 Deduplication with StorReduce and Cloudian (Cloudian)
Deduplication appliances today support the CIFS and NFS protocols. But what about your cloud-based applications that use the S3 API? How do you deduplicate S3 data to save on storage and network bandwidth? Leverage your backup system's S3 API and get the deduplication you need!
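The core idea behind an S3 deduplicator can be sketched in a few lines: hash chunks of each object and store every unique chunk only once, keeping a "recipe" of digests per object. This is an illustrative toy, not StorReduce's actual algorithm; the class and chunk size are assumptions for the example.

```python
import hashlib

# Illustrative sketch (not StorReduce's actual algorithm): deduplicate an
# object stream by hashing fixed-size chunks and storing each unique chunk once.
class ChunkStore:
    def __init__(self, chunk_size=4):
        self.chunk_size = chunk_size
        self.chunks = {}  # sha256 digest -> chunk bytes

    def put(self, data: bytes):
        """Store data; return the list of chunk digests (the object's 'recipe')."""
        recipe = []
        for i in range(0, len(data), self.chunk_size):
            chunk = data[i:i + self.chunk_size]
            digest = hashlib.sha256(chunk).hexdigest()
            self.chunks.setdefault(digest, chunk)  # duplicate chunks stored once
            recipe.append(digest)
        return recipe

    def get(self, recipe):
        """Reassemble an object from its recipe."""
        return b"".join(self.chunks[d] for d in recipe)

store = ChunkStore()
recipe = store.put(b"abcdabcdabcdxyz!")  # 'abcd' appears three times
print(len(recipe), len(store.chunks))    # 4 chunks referenced, only 2 stored
```

Production systems use variable-size (content-defined) chunking and persist the chunk index, but the storage saving comes from the same digest-keyed store shown here.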
An Introduction to Big Data, Hadoop architecture, HDFS and MapReduce. Some concepts are explained through animation, which is best viewed by downloading the deck and opening it in PowerPoint.
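The MapReduce model mentioned above can be illustrated with the classic word-count example: a map phase emits (word, 1) pairs, a shuffle groups them by key, and a reduce phase sums each group. This is a minimal single-process sketch of the model, not Hadoop code.

```python
from collections import defaultdict

# Minimal word-count sketch of the MapReduce model: map emits (word, 1)
# pairs, shuffle groups pairs by key, reduce sums the counts per key.
def map_phase(document):
    return [(word, 1) for word in document.split()]

def shuffle(pairs):
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    return {key: sum(values) for key, values in grouped.items()}

docs = ["big data big compute", "data lake"]
pairs = [p for d in docs for p in map_phase(d)]  # map over every input split
counts = reduce_phase(shuffle(pairs))
print(counts)  # {'big': 2, 'data': 2, 'compute': 1, 'lake': 1}
```

In Hadoop, the same three phases run distributed across HDFS blocks, with the shuffle moving intermediate pairs between nodes.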
Rackspace::Solve NYC - Solving for Rapid Customer Growth and Scale Through De... (Rackspace)
At Rackspace::Solve NYC, Jon Hyman, CIO of Appboy, and Prashanth Chandrasekar, GM of DevOps at Rackspace, discuss the role of DevOps in helping to solve the technical challenges that come with rapid growth.
Rackspace (NYSE: RAX) is the #1 managed cloud company. Our technical expertise and Fanatical Support® allow companies to tap the power of the cloud without the pain of hiring experts in dozens of complex technologies. Rackspace is also the leader in hybrid cloud, giving each customer the best fit for its unique needs — whether on single- or multi-tenant servers, or a combination of those platforms. Rackspace is the founder of OpenStack®, the open-source operating system for the cloud. Headquartered in San Antonio, we serve more than 200,000 business customers from data centers on four continents. We rank 29th on Fortune’s list of 100 Best Companies to Work For. For more information, visit www.rackspace.com.
Revolutionising Storage for your Future Business Requirements (NetApp)
Non-disruptive operations, efficiency and seamless scale are all topics of discussion for organisations facing challenging growth in the volume of data stored. In this session, Julian Wheeler, NetApp Channel SE Manager, investigates new storage infrastructures that enable you to manage growth, scale and efficiency while improving service to the business.
The Power of DataOps for Cloud and Digital Transformation (Delphix)
Companies have been trying to speed up their innovation delivery for many years, but often at the expense of quality and security. Despite billions invested to accelerate innovation, projects are too often slowed by data friction - the result of growing volumes of siloed data and multiple requests for data.
Overcoming these sources of friction requires constant iteration across several key dimensions:
• Reducing the total cost of data by making it fast and efficient to deliver data, regardless of source or consumer. Automation and tooling are critical.
• Integrating security and governance into a seamless data delivery process. This requires integrated masking, but also a governance platform and process to ensure the right rules and access controls are in place.
• Breaking down silos between people and organizations. This starts with the organizational change to bring people together into one team, but requires technology change to provide self-service data access and control.
Journey to the Cloud: Database Modernization Best Practices (Datavail)
In this presentation from the AWS Dallas workshop, Datavail's migration team discusses the different decision paths to the cloud, how to decide when to migrate to AWS, and three case-study examples of migrations to AWS.
How Element 84 Raises the Bar on Streaming Satellite Data (Amazon Web Services)
GOES-16 is a source of critical data for monitoring smoke, flooding impacts, burn scars, volcanic ash, and weather. However, finding and using this data can require significant investment. Element 84 married video compression and streaming technology with NASA’s Cumulus data processing pipeline, plus AWS Managed Services, to make the entire GOES-16 archive interactive on an array of formats. Users can now easily identify dates of interest for events like natural disasters, and stage a subset of the archive for analysis. And all of this scales down to $0 when not in use.
Rackspace::Solve NYC - Second Stage Cloud (Rackspace)
James Staten, VP and top analyst at Forrester Research, discusses tech adoption of cloud computing at Rackspace::Solve New York. Staten explains the Second-Stage Cloud: the optimization phase of client-server is ending while we enter the rationalization phase of cloud computing. This makes today's cloud competition based on "service value", causing hyper-growth for cloud services.
SkyWatch is all about making Earth-observation data digestible and accessible. They believe that creating a single place to bring together the planet’s observational datasets will make new waves in geospatial analytics. In this session, we'll take a look at how companies can take advantage of cloud-native workflows to enable access and analysis across planetary-scale datasets. You’ll hear how SkyWatch leveraged AWS serverless technologies to build a company that transforms petabytes of sensor data from space into useful information. You'll also learn how Sinergise is merging a variety of data streams through products like Sentinel Hub, and creating actionable intelligence for its users.
Getting on C2S: Lessons Learned Migrating Space Operational Systems to the Cl... (Amazon Web Services)
Leaders across the Earth and space community are increasingly finding value in the services, capacity, and scalability of AWS. Join us as Lockheed Martin Space Systems (LMSS) discusses how it has used AWS to build revolutionary new space programs that deliver value and performance to customers. LMSS will share lessons learned after transitioning National Reconnaissance Office and National Geospatial-Intelligence Agency processing programs to the C2S Cloud, and will present a firsthand view of how they took advantage of AWS services and features to move space operations and production to AWS.
OpenStack at the speed of business with SolidFire & Red Hat (NetApp)
When it comes to OpenStack® and the enterprise, it’s critical that you can rapidly deploy a plug-and-play solution that delivers mixed workload capabilities on a shared infrastructure. Join Red Hat and SolidFire to see how Agile Infrastructure for OpenStack can help your cloud move at the speed of business.
Architecting a Modern Data Warehouse: Enterprise Must-Haves (Yellowbrick Data)
The goal of modern data warehousing is not only to deliver insights faster to more users, but also to provide a richer picture of your operations, afforded by the greater volume and variety of data available for analysis.
This presentation from a Database Trends and Applications webcast will educate IT decision makers and data warehousing professionals about the must-have capabilities for modern data warehousing today – how they work and how best to use them.
The Evolution of OpenStack – From Infancy to Enterprise (Rackspace)
As OpenStack turns 5 this year, we thought it would be a good time to take a look back at the evolution of OpenStack. We start with a quick overview of what OpenStack is and how it came to be, and describe the OpenStack Foundation. Next we describe the problem that OpenStack helps to solve, the components of OpenStack and the timeline for when these components came to be. Last, we outline the current features and benefits that make OpenStack ready for the enterprise, with supporting enterprise use case examples. The blog post can be found at https://developer.rackspace.com/blog/evolution-of-openstack-from-infancy-to-enterprise/ and the webinar at https://www.brighttalk.com/webcast/11427/138613.
Top Trends in Building Data Lakes for Machine Learning and AI (Holden Ackerman)
Presentation by Ashish Thusoo, Co-Founder & CEO at Qubole, exploring big data industry trends in moving from data warehouses to cloud-based data lakes. This presentation will cover how companies today are seeing a significant rise in the success of their big data projects by moving to the cloud to iteratively build more cost-effective data pipelines and new products with ML and AI. It also uncovers how services like AWS, Google, Oracle, and Microsoft Azure provide the storage and compute infrastructure to build self-service data platforms that enable all teams and new products to scale iteratively.
By now, enterprises understand the value of Software as a Service (SaaS) and Infrastructure as a Service (IaaS), but there is still much confusion about Platform as a Service (PaaS), which is one reason enterprises have been slow to adopt it. This presentation will help clear up the confusion around the different types of PaaS offerings in the marketplace.
IoT is still a vague buzzword for many people. In this session we will discuss the business value of IoT, which goes far beyond the general public's perception that IoT is all about wearables and home consumer services. The presentation will also discuss how IoT is perceived by investors and how venture capitalists assess this space. Other topics include barriers to success, what is new, what is old, and what the future may hold.
Today's cloud implementations require a different approach to monitoring. This presentation discusses the mindset required and discusses logging and monitoring strategies and tools.
Best Practices For Building and Operating A Managed Data Lake - StampedeCon 2016 (StampedeCon)
This session will detail best practices for architecting, building, operating and managing an Analytics Data Lake platform. Key topics will include:
1) Defining next-generation Data Lake architectures. The de facto standard has been commodity DAS servers with HDFS, but there are now multiple solutions aimed at separating compute and storage, virtualizing or containerizing Hadoop applications, and utilizing Hadoop-compatible or embedded HDFS filesystems. This portion will explore the options available, and the pros and cons of each.
2) Data Ingest. There are many ways to load data into a Data Lake, including standardized Apache tools (Sqoop, Flume, Kafka, Storm, Spark, NiFi), standard file and object protocols (SFTP, NFS, REST, WebHDFS), and proprietary tools (e.g., Zaloni Bedrock, DataTorrent). This section will explore these options in the context of best fit to workflows; it will also look at key gaps and challenges, particularly in the areas of data formats and integration with metadata/cataloging tools.
3) Metadata & Cataloguing. One of the biggest inhibitors of successful Data Lake deployments is Data Governance, particularly in the areas of indexing, cataloguing and metadata management. It is nearly impossible to run analytics on top of a Data Lake and get meaningful & timely results without solving these problems. This portion will explore both emerging open standards (Apache Atlas, HCatalog) and proprietary tools (Cloudera Navigator, Zaloni Bedrock/Mica, Informatica Metadata Manager), and balance the pros, cons and gaps of each.
4) Security & Access Controls. Solving these challenges is key for adoption in regulatory-driven industries like Healthcare & Financial Services. There are multiple Apache projects and proprietary tools to address this, but the challenge is making security and access controls consistent across the entire application and infrastructure stack and over the data lifecycle, and being able to audit this in the face of legal challenges. This portion will explore available options and best practices.
5) Provisioning & Workflow Management. The real promise of the Data Lake is integrating analytics workflows and tools on converged infrastructure, with shared data, and building "As a Service" architectures oriented towards self-service data exploration and analytics for end users. This is an emerging and immature area, but this session will explore some potential concepts, tools and options to achieve this.
This will be a moderately technical session, with the above topics being illustrated by real world examples. Attendees should have basic familiarity with Hadoop and the associated Apache projects.
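One concrete thread running through the ingest and cataloging topics above is the key layout used when landing files in the lake, since downstream metadata tools depend on it. The sketch below is a hypothetical convention for illustration; the zone names and path scheme are assumptions, not from the session.

```python
from datetime import date

# Hypothetical sketch: a date-partitioned key layout for landing ingested
# files in a data lake; ingest tools write these keys and catalog/metadata
# tools later rely on the partition structure (e.g. dt=YYYY-MM-DD).
def landing_key(zone, source, table, ingest_date, filename):
    """Build a partitioned object key, e.g. raw/crm/orders/dt=2016-07-01/part-0.avro"""
    return "{}/{}/{}/dt={}/{}".format(
        zone, source, table, ingest_date.isoformat(), filename
    )

key = landing_key("raw", "crm", "orders", date(2016, 7, 1), "part-0.avro")
print(key)  # raw/crm/orders/dt=2016-07-01/part-0.avro
```

Keeping zone, source, table and date in the key makes partition pruning and lifecycle policies mechanical, which is much of what separates a managed lake from a swamp.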
DAMA & Denodo Webinar: Modernizing Data Architecture Using Data Virtualization (Denodo)
Watch here: https://bit.ly/2NGQD7R
In an era increasingly dominated by advancements in cloud computing, AI and advanced analytics, it may come as a shock that many organizations still rely on data architectures built before the turn of the century. But that scenario is rapidly changing with the increasing adoption of real-time data virtualization - a paradigm shift in the approach that organizations take towards accessing, integrating, and provisioning data required to meet business goals.
As data analytics and data-driven intelligence take centre stage in today's digital economy, logical data integration across the widest variety of data sources, with proper security and governance structures in place, has become mission-critical.
Attend this session to learn:
- How you can meet cloud and data science challenges with data virtualization
- Why data virtualization is increasingly finding enterprise-wide adoption
- How customers are reducing costs and improving ROI with data virtualization
Logical Data Lakes: From Single Purpose to Multipurpose Data Lakes (APAC) (Denodo)
Watch full webinar here: https://bit.ly/3aePFcF
Historically, data lakes have been created as a centralized physical data storage platform for data scientists to analyze data. But lately, the explosion of big data, data privacy rules, and departmental restrictions, among many other things, have made the centralized data repository approach less feasible. In this webinar, we will discuss why decentralized multipurpose data lakes are the future of data analysis for a broad range of business users.
Attend this session to learn:
- The restrictions of physical single-purpose data lakes
- How to build a logical multipurpose data lake for business users
- The newer use cases that make multipurpose data lakes a necessity
The Transformation of your Data in modern IT (Presented by DellEMC) (Cloudera, Inc.)
Organizations have a wealth of data contained within their existing infrastructures. At DellEMC, we're helping customers remove the barriers of legacy datastores and transforming the customer experience in the modern datacentre. Learn how to unshackle the valuable data inside your existing data warehouse and leverage new techniques, applications and technology to enhance the financial impact of all your data sources.
Horses for Courses: Database Roundtable (Eric Kavanagh)
The blessing and curse of today's database market? So many choices! While relational databases still dominate the day-to-day business, a host of alternatives has evolved around very specific use cases: graph, document, NoSQL, hybrid (HTAP), column store, the list goes on. And the database tools market is teeming with activity as well. Register for this special Research Webcast to hear Dr. Robin Bloor share his early findings about the evolving database market. He'll be joined by Steve Sarsfield of HPE Vertica, and Robert Reeves of Datical in a roundtable discussion with Bloor Group CEO Eric Kavanagh. Send any questions to info@insideanalysis.com, or tweet with #DBSurvival.
Solving Enterprise Challenges Through Scale-Out Storage & Big Compute (Avere Systems)
Google Cloud Platform, Avere Systems, and Cycle Computing experts will share best practices for advancing solutions to big challenges faced by enterprises with growing compute and storage needs. In this “best practices” webinar, you’ll hear how these companies are working to improve results that drive businesses forward through scalability, performance, and ease of management.
The slides were from a webinar presented January 24, 2017. The audience learned:
- How enterprises are using Google Cloud Platform to gain compute and storage capacity on demand
- Best practices for efficient use of cloud compute and storage resources
- How to overcome the need for file systems within a hybrid cloud environment
- How to eliminate latency between cloud and data center architectures
- How to best manage simulation, analytics, and big data workloads in dynamic environments
- Market dynamics drawing companies to new storage models over the next several years
Presenters laid out a foundation for building infrastructure that supports ongoing demand growth.
How much money do you lose every time your ecommerce site goes down? (DataStax)
In today’s environment, you must serve your customers with uptime, all the time, plus hidden benefits like state-of-the-art fraud detection and game-changing recommendation engines.
In this webinar, you’ll learn how to:
- Get uptime, all the time, so you can serve your customers without outages
- Ingest huge velocities of data from anywhere
- Maximize mobile, online and cloud applications with the security your customers expect
- Identify patterns between formerly siloed data, even text and call logs
- Get the search and insight you need without performance hits
From Single Purpose to Multipurpose Data Lakes - Broadening End Users (Denodo)
Watch full webinar here: https://buff.ly/2Mt555e
Historically, data lakes have been created as a centralized physical data storage platform for data scientists to analyze data. But lately, the explosion of big data, data privacy rules, and departmental restrictions, among many other things, have made the centralized data repository approach less feasible. In his recent whitepaper, renowned analyst Rick F. Van Der Lans talks about why decentralized multipurpose data lakes are the future of data analysis for a broad range of business users.
Please attend this session to learn:
• The restrictions of physical single-purpose data lakes
• How to build a logical multipurpose data lake for business users
• The newer use cases that make multipurpose data lakes a necessity
Webinar: The Future of Data Integration - Data Mesh and GoldenGate with Kafka (Jeffrey T. Pollock)
The Future of Data Integration: Data Mesh, and a Special Deep Dive into Stream Processing with GoldenGate, Apache Kafka and Apache Spark. This video is a replay of a Live Webinar hosted on 03/19/2020.
Join us for a timely 45-minute webinar to see our take on the future of Data Integration. As the global industry shift towards the “Fourth Industrial Revolution” continues, outmoded styles of centralized batch processing and ETL tooling continue to be replaced by real-time, streaming, microservices and distributed data architecture patterns.
This webinar will start with a brief look at the macro-trends happening around distributed data management and how that affects Data Integration. Next, we’ll discuss the event-driven integrations provided by GoldenGate Big Data, and continue with a deep-dive into some essential patterns we see when replicating Database change events into Apache Kafka. In this deep-dive we will explain how to effectively deal with issues like Transaction Consistency, Table/Topic Mappings, managing the DB Change Stream, and various Deployment Topologies to consider. Finally, we’ll wrap up with a brief look into how Stream Processing will help to empower modern Data Integration by supplying realtime data transformations, time-series analytics, and embedded Machine Learning from within data pipelines.
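The table/topic mapping and transaction-consistency patterns mentioned in the deep-dive can be illustrated with a small router: change events are assigned to topics by source table, while being grouped by transaction id so a consumer can apply each transaction atomically. This is an illustrative sketch, not GoldenGate's implementation; the topic map and event fields are assumptions for the example.

```python
from collections import defaultdict

# Illustrative sketch (not GoldenGate's actual implementation): route database
# change events to Kafka-style topics by table, while grouping events by
# transaction id so each transaction can be applied atomically downstream.
TOPIC_MAP = {"app.orders": "cdc-orders", "app.customers": "cdc-customers"}

def route(events):
    """Return {topic: {txn_id: [events]}} for a batch of change records."""
    routed = defaultdict(lambda: defaultdict(list))
    for event in events:
        topic = TOPIC_MAP.get(event["table"], "cdc-default")
        routed[topic][event["txn"]].append(event)
    return routed

batch = [
    {"table": "app.orders",    "txn": "t1", "op": "INSERT", "id": 1},
    {"table": "app.customers", "txn": "t1", "op": "UPDATE", "id": 7},
    {"table": "app.orders",    "txn": "t2", "op": "DELETE", "id": 2},
]
routed = route(batch)
print(sorted(routed))                   # ['cdc-customers', 'cdc-orders']
print(len(routed["cdc-orders"]["t1"]))  # 1 order event from transaction t1
```

Note the tension this exposes: transaction t1 spans two topics, so per-table topics trade away cross-table transactional ordering, one of the deployment choices the webinar discusses.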
GoldenGate: https://www.oracle.com/middleware/tec...
Webinar Speaker: Jeff Pollock, VP Product (https://www.linkedin.com/in/jtpollock/)
ADV Slides: When and How Data Lakes Fit into a Modern Data Architecture (DATAVERSITY)
Whether to take data ingestion cycles off the ETL tool and the data warehouse or to facilitate competitive Data Science and building algorithms in the organization, the data lake – a place for unmodeled and vast data – will be provisioned widely in 2020.
Though it doesn’t have to be complicated, the data lake has a few key design points that are critical, and it does need to follow some principles for success. Avoid building the data swamp, but not the data lake! The tool ecosystem is building up around the data lake and soon many will have a robust lake and data warehouse. We will discuss policy to keep them straight, send data to its best platform, and keep users’ confidence up in their data platforms.
Data lakes will be built in cloud object storage. We’ll discuss the options there as well.
Get this data point for your data lake journey.
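The "send data to its best platform" policy above can be made concrete with a small routing rule. The criteria below (whether a dataset is modeled, and how often it is queried) are assumptions chosen for illustration, not the author's actual policy.

```python
# Hypothetical policy sketch of 'send data to its best platform': modeled,
# frequently queried data goes to the warehouse; raw or rarely used data
# stays in the lake (cloud object storage).
def best_platform(dataset):
    if dataset.get("modeled") and dataset.get("queries_per_day", 0) >= 10:
        return "warehouse"
    return "lake"

datasets = [
    {"name": "sales_facts", "modeled": True,  "queries_per_day": 500},
    {"name": "clickstream", "modeled": False, "queries_per_day": 2},
]
for d in datasets:
    print(d["name"], "->", best_platform(d))
# sales_facts -> warehouse
# clickstream -> lake
```

Encoding the policy as code, however simple, is what keeps the lake and the warehouse "straight": the decision is auditable instead of ad hoc.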
Insights into Real-world Data Management Challenges (DataWorks Summit)
Oracle began with the belief that the foundation of IT was managing information. The Oracle Cloud Platform for Big Data is a natural extension of our belief in the power of data. Oracle’s Integrated Cloud is one cloud for the entire business, meeting everyone’s needs. It’s about connecting people to information through tools that help you combine and aggregate data from any source.
This session will explore how organizations can transition to the cloud by delivering fully managed and elastic Hadoop and real-time streaming cloud services to build robust offerings that provide measurable value to the business. We will explore key data management trends and dive deeper into pain points we are hearing about from our customer base.
Real-world Cloud HPC at Scale, for Production Workloads (BDT212) | AWS re:Inv... (Amazon Web Services)
Running high-performance scientific and engineering applications is challenging no matter where you do it. Join IT executives from Hitachi Global Storage Technology, The Aerospace Corporation, Novartis, and Cycle Computing and learn how they have used the AWS cloud to deploy mission-critical HPC workloads.
Cycle Computing leads the session on how organizations of any scale can run HPC workloads on AWS. Hitachi Global Storage Technology discusses experiences using the cloud to create next-generation hard drives. The Aerospace Corporation provides perspectives on running MPI and other simulations, and offers insights into considerations like security while running rocket science on the cloud. Novartis Institutes for Biomedical Research talks about a scientific computing environment for performance-benchmark workloads and large HPC clusters, including a 30,000-core environment for research in the fight against cancer, using the Cancer Genome Atlas (TCGA).
Data Virtualization: An Essential Component of a Cloud Data LakeDenodo
Watch full webinar here: https://bit.ly/33GgqE9
Data Lake strategies seem to have found their perfect companion in cloud providers. After years of criticism and struggles in the on-prem Hadoop world, data lakes are flourishing thanks to the simplification in management and low storage prices provided by SaaS vendors. For some, this is the ultimate data strategy. For others, just a repetition of the same mistakes. Attend this session to learn:
- The benefits and shortcomings of cloud data lakes
- The role and value of data virtualization in this scenario
- New developments in data virtualization for the cloud
Watch full webinar here: https://bit.ly/3mdj9i7
You will often hear that "data is the new gold". In this context, data management is one of the areas that has received the most attention from the software community in recent years. From artificial intelligence and machine learning to new ways to store and process data, the landscape for data management is in constant evolution. From the privileged perspective of an enterprise middleware platform, we at Denodo have the advantage of seeing many of these changes happen.
In this webinar, we will discuss the technology trends that will drive the enterprise data strategies in the years to come. Don't miss it if you want to keep yourself informed about how to convert your data to strategic assets in order to complete the data-driven transformation in your company.
Watch this on-demand webinar as we cover:
- The most interesting trends in data management
- How to build a data fabric architecture
- How to manage your data integration strategy in the new hybrid world
- Our predictions on how those trends will change the data management world
- How companies can monetize data through a data-as-a-service infrastructure
- The role of voice computing in future data analytics
Conquering Disaster Recovery Challenges and Out-of-Control Data with the Hybr...actualtechmedia
More and more companies are leveraging the cloud for disaster recovery. After all, the limitless compute resources of the cloud are perfectly suited for disaster recovery. Learn how to easily leverage the cloud for DR.
The Art of the Pitch: WordPress Relationships and SalesLaura Byrne
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if something changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers without pulling teeth or pulling your hair out. Practical tips, and strategies for successful relationship building that leads to closing the deal.
GraphSummit Singapore | The Future of Agility: Supercharging Digital Transfor...Neo4j
Leonard Jayamohan, Partner & Generative AI Lead, Deloitte
This keynote will reveal how Deloitte leverages Neo4j’s graph power for groundbreaking digital twin solutions, achieving a staggering 100x performance boost. Discover the essential role knowledge graphs play in successful generative AI implementations. Plus, get an exclusive look at an innovative Neo4j + Generative AI solution Deloitte is developing in-house.
SAP Sapphire 2024 - ASUG301 building better apps with SAP Fiori.pdfPeter Spielvogel
Building better applications for business users with SAP Fiori.
• What is SAP Fiori and why it matters to you
• How a better user experience drives measurable business benefits
• How to get started with SAP Fiori today
• How SAP Fiori elements accelerates application development
• How SAP Build Code includes SAP Fiori tools and other generative artificial intelligence capabilities
• How SAP Fiori paves the way for using AI in SAP apps
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024Albert Hoitingh
In this session I delve into the encryption technology used in Microsoft 365 and Microsoft Purview. Including the concepts of Customer Key and Double Key Encryption.
UiPath Test Automation using UiPath Test Suite series, part 4DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 4. In this session, we will cover Test Manager overview along with SAP heatmap.
The UiPath Test Manager overview with SAP heatmap webinar offers a concise yet comprehensive exploration of the role of a Test Manager within SAP environments, coupled with the utilization of heatmaps for effective testing strategies.
Participants will gain insights into the responsibilities, challenges, and best practices associated with test management in SAP projects. Additionally, the webinar delves into the significance of heatmaps as a visual aid for identifying testing priorities, areas of risk, and resource allocation within SAP landscapes. Through this session, attendees can expect to enhance their understanding of test management principles while learning practical approaches to optimize testing processes in SAP environments using heatmap visualization techniques.
What will you get from this session?
1. Insights into SAP testing best practices
2. Heatmap utilization for testing
3. Optimization of testing processes
4. Demo
Topics covered:
Execution from the test manager
Orchestrator execution result
Defect reporting
SAP heatmap example with demo
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
Alt. GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using ...James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with a passion for making things work, along with a knack for helping others understand how things work. He brings around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations on CI/CD and application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Sudheer Mechineni, Head of Application Frameworks, Standard Chartered Bank
Discover how Standard Chartered Bank harnessed the power of Neo4j to transform complex data access challenges into a dynamic, scalable graph database solution. This keynote will cover their journey from initial adoption to deploying a fully automated, enterprise-grade causal cluster, highlighting key strategies for modelling organisational changes and ensuring robust disaster recovery. Learn how these innovations have not only enhanced Standard Chartered Bank’s data infrastructure but also positioned them as pioneers in the banking sector’s adoption of graph technology.
State of ICS and IoT Cyber Threat Landscape Report 2024 previewPrayukth K V
The IoT and OT threat landscape report has been prepared by the Threat Research Team at Sectrio using data from Sectrio's cyber threat intelligence farming facilities spread across over 85 cities around the world. In addition, Sectrio also runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors and newer malware, including new variants and latent threats that are at an earlier stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
State of global ICS asset and network exposure
Sectoral targets and attacks as well as the cost of ransom
Global APT activity, AI usage, actor and tactic profiles, and implications
Rise in volumes of AI-powered cyberattacks
Major cyber events in 2024
Malware and malicious payload trends
Cyberattack types and targets
Vulnerability exploit attempts on CVEs
Attacks on counties – USA
Expansion of bot farms – how, where, and why
In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
Why are attacks on smart factories rising?
Cyber risk predictions
Axis of attacks – Europe
Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
Securing your Kubernetes cluster_ a step-by-step guide to success !KatiaHIMEUR1
Today, after several years of existence, an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been so easy to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
Climate Impact of Software Testing at Nordic Testing DaysKari Kakkonen
My slides at Nordic Testing Days 6.6.2024
The climate impact and sustainability of software testing are discussed in the talk. ICT and testing must carry their part of the global responsibility to help with climate warming. We can minimize the carbon footprint, but we can also have a carbon handprint, a positive impact on the climate. Sustainability can be added to the quality characteristics and then measured continuously. Test environments can be used less, at smaller scale, and on demand. Test techniques can be used to optimize or minimize the number of tests. Test automation can be used to speed up testing.
Essentials of Automations: The Art of Triggers and Actions in FMESafe Software
In this second installment of our Essentials of Automations webinar series, we’ll explore the landscape of triggers and actions, guiding you through the nuances of authoring and adapting workspaces for seamless automations. Gain an understanding of the full spectrum of triggers and actions available in FME, empowering you to enhance your workspaces for efficient automation.
We’ll kick things off by showcasing the most commonly used event-based triggers, introducing you to various automation workflows like manual triggers, schedules, directory watchers, and more. Plus, see how these elements play out in real scenarios.
Whether you’re tweaking your current setup or building from the ground up, this session will arm you with the tools and insights needed to transform your FME usage into a powerhouse of productivity. Join us to discover effective strategies that simplify complex processes, enhancing your productivity and transforming your data management practices with FME. Let’s turn complexity into clarity and make your workspaces work wonders!
Unlocking Productivity: Leveraging the Potential of Copilot in Microsoft 365, a presentation by Christoforos Vlachos, Senior Solutions Manager – Modern Workplace, Uni Systems
Pushing the limits of ePRTC: 100ns holdover for 100 daysAdtran
At WSTS 2024, Alon Stern explored the topic of parametric holdover and explained how recent research findings can be implemented in real-world PNT networks to achieve 100 nanoseconds of accuracy for up to 100 days.
4. Exponential Growth
Data center traffic will grow 4X by 2016, to 6.6 ZB annually
Majority of traffic is caused by datacenter and cloud workloads invisible to the end users
Source: Cisco Global Cloud Index: 2011-2016
5. Exponential Growth
Growth drives demand in physical infrastructure
Datacenters often grow at rates much faster than originally estimated
Driving demand for disk, servers, network infrastructure, bandwidth, cooling, floor space, etc.
Source: Cisco Global Cloud Index: 2011-2016
7. Green & Carbon Initiatives
Large infrastructure investments to get cleaner and greener
Apple, Google, Amazon, and eBay are investing
Should companies whose core competency is not building datacenters be spending here?
Proposed datacenter from Google & Duke Energy
21. Disaster Recovery
RTO (Recovery Time Objective)
“Time to be back up & running”
RPO (Recovery Point Objective)
“Maximum acceptable window of data loss”
Value
“How much money is recovery worth?”
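The two objectives above can be checked mechanically against an actual recovery: downtime is compared to the RTO, and the age of the last recoverable backup to the RPO. A minimal sketch in Python (the function name and parameters are hypothetical, introduced here only for illustration):

```python
from datetime import timedelta

def meets_dr_objectives(downtime, data_loss_window, rto, rpo):
    """Check whether an observed recovery meets the stated objectives.

    downtime         -- how long the service was unavailable
    data_loss_window -- age of the most recent recoverable data
    rto, rpo         -- the agreed Recovery Time / Point Objectives
    """
    return downtime <= rto and data_loss_window <= rpo

# Example: a plan with a 4-hour RTO and 15-minute RPO,
# and a recovery that took 2 hours from a 10-minute-old backup.
ok = meets_dr_objectives(
    downtime=timedelta(hours=2),
    data_loss_window=timedelta(minutes=10),
    rto=timedelta(hours=4),
    rpo=timedelta(minutes=15),
)
print(ok)  # True: both objectives are met
```

The "Value" question then sets the budget: tighter RTO/RPO targets cost more, so they should only be as tight as the recovery is worth.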
27. Dev & QA Environments
Dev → QA → Stage → Production
One of the biggest time wasters is dealing with inconsistent target environments
“It worked on my laptop!”
28. Dev & QA Environments
Standard environments
- Devs get consistent, patched, secure images in all environments
Create self-service provisioning
- Stop being the bottleneck and enable the development teams
Automate everything
- Application deployments
- Environment provisioning
- Test data provisioning
- Acceptance testing
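The "automate everything" steps above chain naturally into one self-service pipeline. A minimal sketch, assuming each step is a placeholder for a call into your actual tooling (Terraform, Ansible, CI jobs, etc.); all function names here are hypothetical:

```python
# Sketch of a self-service dev/QA pipeline: provision, deploy,
# load test data, run acceptance tests. Each step only prints what
# a real implementation would delegate to provisioning tooling.

def provision_environment(name):
    print(f"provisioning environment: {name}")
    return {"name": name, "status": "ready"}

def deploy_application(env, version):
    print(f"deploying app {version} to {env['name']}")
    env["app_version"] = version

def load_test_data(env):
    print(f"loading test data into {env['name']}")

def run_acceptance_tests(env):
    print(f"running acceptance tests in {env['name']}")
    return True  # a real implementation would return the suite result

def self_service_pipeline(env_name, version):
    """Run the full provision-to-test flow for one environment."""
    env = provision_environment(env_name)
    deploy_application(env, version)
    load_test_data(env)
    return run_acceptance_tests(env)

passed = self_service_pipeline("qa", "1.4.2")
print("PASS" if passed else "FAIL")
```

The point of wrapping the steps in one entry point is that developers can invoke it themselves, removing the ops team as a bottleneck while every environment is built from the same consistent, patched images.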
29. Summary
Before building or expanding the next datacenter
- Look to the cloud for hybrid use cases
Common patterns
- File storage and archiving
- Cloud bursting
- Big data
- Disaster recovery
- Dev & QA environments
Cloud Technology Partners can help
www.cloudtp.com