The document discusses hyperscale AI and how to achieve faster performance with a smaller footprint. It outlines an end-to-end hyperscale design from HPE that spans from ingestion at the edge to analytics in the core and cloud. The design utilizes technologies like persistent memory, high-performance networking and hardware, and distributed databases optimized for AI workloads. It provides examples of how this design has been implemented and tested on real-world AI and analytics problems.
2016 Sept 1st - IBM Consultants & System Integrators Interchange - Big Data ... (Anand Haridass)
An unprecedented increase in the use of digital devices is causing an explosion in the amount of data generated and captured by businesses. The need to extract economic value from all this "Big Data", which has the potential to transform businesses completely, is immense and drives a whole slew of new workloads. Organizations need to continuously align strategy, business processes, and infrastructure investments to derive these insights. This session will discuss how solutions based on POWER deliver this in a cost-effective, open, scalable, high-performing, and reliable manner.
Like-minded and outspoken in their passion for all things NetApp, the NetApp A-Team members leverage social media to tell their stories – what worked, what didn’t, what they like, and what they think NetApp and other IT vendors can be doing better. They also share their day-to-day, hands-on product experiences with NetApp executives and subject matter experts, and this candid feedback helps shape decisions made regarding NetApp’s product roadmap.
Many have heard us at NetApp talking recently about how the world is changing fundamentally and quickly: digital transformation is – or should be – the focus of any enterprise’s IT strategy. And squarely at the center of that focus is data. As data continues to become even more distributed, dynamic and diverse, everything from IT infrastructures to application architectures to provisioning strategies will have to change in response to new realities in the hybrid cloud world. It’s in that context that we conceived our top five CTO predictions for 2018.
The Big Data solution from EMC provides market-leading scale-out storage, a unified analytics platform, and business process and application development tools. Together, these allow organizations to draw deeper insights and become a more predictive organization.
Copy Data Management & Storage Efficiency (Ravi Namboori)
In this presentation, Ravi Namboori explains how copy data management practices can change our workplaces, with freeing up storage capacity and improving storage efficiency among the main benefits.
Revolutionising Storage for your Future Business Requirements (NetApp)
Non-disruptive operations, efficiency, and seamless scale are all topics of discussion for organisations facing challenging growth in the volumes of data they store. In this session Julian Wheeler, NetApp Channel SE Manager, investigates new storage infrastructures that enable you to manage growth, scale, and efficiency while improving the service to the business.
Gain quick, easy, and flexible handling of data with a comprehensive family of SAP HANA storage solutions. IBM Storage delivers the flexibility and performance you need for your SAP applications. Discover fast, proven TDI-certified IBM Storage systems for your SAP deployment.
NetApp enterprise All Flash Storage
This presentation provides the key messages and differentiation, value propositions, and promotional programs for AFF.
Hadoop World 2011: Hadoop as a Service in Cloud (Cloudera, Inc.)
The Hadoop framework was originally designed to run in native environments on commodity hardware. However, with the growing adoption of cloud computing, there is a stronger requirement to build Hadoop clusters on public/private clouds so that customers can benefit from virtualization and multi-tenancy. This talk introduces some of the challenges of providing a Hadoop service on a virtualization platform, such as performance, rack awareness, job scheduling, and memory overcommitment, and proposes some solutions.
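Rack awareness is one of the challenges the abstract names: on a virtualized platform, HDFS should treat VMs on the same hypervisor as co-located. As an illustrative sketch (the addresses and rack names below are hypothetical, not from the talk), Hadoop's topology-script mechanism can be implemented as a small script that maps each datanode address to the rack of its physical host:

```python
#!/usr/bin/env python3
"""Minimal sketch of a Hadoop rack-awareness topology script.

Hadoop invokes the script configured via net.topology.script.file.name
with one or more datanode IPs/hostnames as arguments and expects one
rack path per argument on stdout. The table below is a hypothetical
example: on a virtualized cluster you would map each VM to the rack of
its hypervisor, so HDFS never places all replicas on VMs that share
one physical host.
"""
import sys

# Hypothetical mapping: VM address -> rack of the physical host it runs on.
VM_TO_HYPERVISOR_RACK = {
    "10.0.1.11": "/dc1/rack1",
    "10.0.1.12": "/dc1/rack1",   # same hypervisor rack as .11
    "10.0.2.21": "/dc1/rack2",
}

def rack_for(node: str) -> str:
    # Unknown nodes fall back to the default rack, as Hadoop expects.
    return VM_TO_HYPERVISOR_RACK.get(node, "/default-rack")

if __name__ == "__main__":
    print(" ".join(rack_for(n) for n in sys.argv[1:]))
```

The same script works unchanged for the multi-tenancy case: only the mapping table needs to be regenerated when VMs migrate between hosts.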
Break free from the limits of today's hyperconverged solutions. Focus on innovation that can transform you into a data-centric organization. Explore four ways that NetApp HCI can help you build a next-generation data center.
Missed us at this April FIS event? Learn how IBM Power Systems can enable the most data intensive and mission-critical workloads in private and hybrid cloud environments. With IBM POWER9 based Power Systems, you can dynamically scale compute and memory on demand and build a cloud designed for the most data intensive workloads. These systems are ideal for FIS workloads and more.
Hadoop as a Service (as offered by a handful of niche vendors today) is a cloud computing solution that makes medium- and large-scale data processing accessible, easy, fast, and inexpensive. This is achieved by eliminating the operational challenges of running Hadoop, so one can focus on business growth.
In this session, we will describe the key elements of a Dell EMC Isilon Data Lake and its key advantages including reduced IT costs, simplified management, increased operational flexibility and in-place data analytics. Dell EMC products to be featured include Isilon, ECS and Virtustream.
In this session, you will learn about the Dell EMC Isilon Data Lake and its advantages including:
• Data consolidation and increased efficiency to lower capital costs
• Streamlined management to reduce operating costs
• Improved operational flexibility and scalability to meet growing storage requirements
• Simple integration with a choice of public or private cloud storage providers
• Powerful in-place data analytics that accelerate time to insight while eliminating the need for a separate analytics storage infrastructure.
You will also hear how this solution can be easily extended to include data from remote and branch office locations with an efficient software defined storage solution.
Moving workloads to the cloud and ensuring compliance with the ASD's Essential Eight for data protection can be challenging. Veeam delivers multi-cloud mobility, backup, restore, and disaster recovery for both on-premises and AWS workloads. It also provides centralized management and enables efficient, streamlined testing of restores whenever required. Learn how to scale for growth on AWS Cloud with Veeam Availability Suite 9.5 Update 4.
Much industry focus is on All-Flash Arrays with traditional databases, but new databases using native direct-attached Flash have proven reliable, performant, and popular for operational use cases. Today, these operational databases store account information for banking and retail applications, real-time routing information for telecoms, and user profiles for advertising; they also support machine learning for applications in the financial industry, such as fraud detection. While proprietary PCIe and “wide SATA” had previously been popular, NVMe has finally come into operational use. Aerospike will discuss the benefits of NVMe for these use cases (including specific configurations and performance numbers), as well as the architectural implications of low-latency Flash and Storage Class Memory.
Demystifying Data Warehousing as a Service (GLOC 2019) (Kent Graziano)
Extended deck from the 2019 GLOC event in Cleveland. It discusses what DWaaS is, the top 10 Snowflake features that deliver it, and a checklist of questions to ask when choosing a cloud-based data warehouse.
This session introduces you to Amazon EC2 F1 instances and walks you through a typical development and deployment process, including the Approved Amazon EC2 F1 C/C++ development workflow. We also discuss a number of use cases in different domains, including financial risk simulation, genomics, video processing, big data and analytics, with a discussion about acceleration work on top of EC2 F1.
RightScale Roadtrip Boston: Accelerate to Cloud (RightScale)
The Accelerate to Cloud keynote will help you understand the current state of cloud adoption, identify the business value for your organization, and provide you a framework to plot your course to cloud adoption.
Pro Tips: Designing and Deploying End-to-End HPC and AI Solutions (Penguin Computing)
High Performance Computing (HPC)*HPC on Demand*HPC Cloud*Storage*Networking*Artificial Intelligence (AI)*Supercomputing*Beowulf Clusters*Remote Desktop*Virtual Workstations*Scyld Cluster Management Software*Datacenters*IT Infrastructure*Professional Services*Managed Services*Open Compute Project (OCP)*Linux*Open Source -- Kevin Tubbs, Ph.D., Sr. Director of the Advanced Solutions Group at Penguin Computing, shared best practices for designing and deploying end-to-end HPC and AI solutions.
Penguin Computing: Designing and Deploying End-to-End HPC and AI Solutions (Penguin Computing)
Best Practices in Designing and Deploying End-to-End HPC and AI Solutions.
Penguin Computing is a U.S.-based global provider of high-performance computing (HPC) and data center solutions, with more than 2,500 customers in 40 countries across 8 major vertical markets.
www.penguincomputing.com
IDERA Live | The Ever Growing Science of Database Migrations (IDERA Software)
You can watch the replay for this webcast in the IDERA Resource Center: http://ow.ly/QHaG50A58ZB
Many information technology professionals may not recognize it, but the bulk of their work has been, and continues to be, nothing more than database migrations: in the old days, sharing files across systems; then moving files into relational databases; then loading into data warehouses; and now moving to NoSQL and the cloud. In this presentation we'll delve into the ever-growing and increasingly complex world of database migrations, including what issues must be planned for and overcome, what problems are likely to occur, and what types of tools exist.
Database expert Bert Scalzo will cover these and many other database migration concerns.
About Bert: Bert Scalzo is an Oracle ACE, author, speaker, consultant, and a major contributor to many popular database tools used by millions of people worldwide. He has 30+ years of database experience and has worked for several major database vendors. He holds a BS, an MS, and a Ph.D. in computer science, plus an MBA. He has presented at numerous events and webcasts. His areas of key interest include data modeling, database benchmarking, database tuning, SQL optimization, "star schema" data warehousing, running databases on Linux or VMware, and using NVMe flash-based technology to speed up database performance.
New Business Applications Powered by In-Memory Technology @MIT Forum for Supp... (Paul Hofmann)
This talk covers four leading-edge projects:
1) Optimal pricing for energy management, online pricing, and truck scheduling @ Princeton University
2) Infinite DRAM - RAMCloud @ Stanford University
Applications (extremely low latency and very high bandwidth):
a) Facebook-like problems with high read AND write rates
b) advanced analytics
c) what-if scenarios for demand planning
3) Hybrid In-Memory Store @ MIT CSAIL
4) Multithreading Real Time Event Platform @ MIT Auto-ID Lab
500k events/s and millions of threads, in-memory or distributed, used for automatic meter reading, online billing, mobile billing, and Smart Grid
DM Radio Webinar: Adopting a Streaming-Enabled Architecture (DATAVERSITY)
Architecture matters. That's why today's innovators are taking a hard look at streaming data, an increasingly attractive option that can transform business in several ways: replacing aging data ingestion techniques like ETL; solving long-standing data quality challenges; improving business processes ranging from sales and marketing to logistics and procurement; or any number of activities related to accelerating data warehousing, business intelligence and analytics.
Register for this DM Radio Deep Dive Webinar to learn how streaming data can rejuvenate or supplant traditional data management practices. Host Eric Kavanagh will explain how streaming-first architectures can relieve data engineers of time-consuming, error-prone processes, ideally bidding farewell to those unpleasant batch windows. He'll be joined by Kevin Petrie of Attunity, who will explain (with real-world success stories) how streaming data solutions can keep the business fueled with trusted data in a timely, efficient manner for improved business outcomes.
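The batch-window point above can be made concrete. The following is an illustrative sketch (not from the webinar; the names and data are hypothetical) of the core difference between periodic batch ETL and a streaming-first pipeline: the batch version recomputes an aggregate over everything collected so far, while the streaming version updates a small piece of state per arriving event, so the answer is always current.

```python
"""Batch recompute vs. incremental streaming aggregation - a sketch."""
from collections import defaultdict

def batch_totals(events):
    # Batch ETL: a full scan over the accumulated dataset, run periodically.
    totals = defaultdict(float)
    for customer, amount in events:
        totals[customer] += amount
    return dict(totals)

class StreamingTotals:
    """Streaming ingestion: incremental state, no batch window."""
    def __init__(self):
        self._totals = defaultdict(float)

    def on_event(self, customer, amount):
        # One cheap update per event; the aggregate is queryable immediately.
        self._totals[customer] += amount

    def totals(self):
        return dict(self._totals)

# Hypothetical event stream of (customer, order amount) pairs.
events = [("acme", 10.0), ("globex", 5.0), ("acme", 2.5)]
stream = StreamingTotals()
for customer, amount in events:
    stream.on_event(customer, amount)
```

Both paths produce the same totals; the difference is when they are available and how much work each new event costs.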
Deep Learning at Supercomputing Scale by Rangan Sukumar from Cray (Bill Liu)
Presented at AI NEXTCon Seattle 1/17-20, 2018
http://aisea18.xnextcon.com
Join our free online AI group with 50,000+ tech engineers to learn and practice AI technology, including the latest AI news, tech articles/blogs, tech talks, tutorial videos, and hands-on workshops/codelabs on machine learning, deep learning, data science, etc.
Hadoop Summit Brussels 2015: Architecting a Scalable Hadoop Platform - Top 10... (Sumeet Singh)
Since 2006, Hadoop and its ecosystem components have evolved into a platform that Yahoo has begun to trust for running its businesses globally. Hadoop's scalability, efficiency, built-in reliability, and cost effectiveness have made it an enterprise-wide platform that web-scale cloud operations run on. In this talk, we will take a broad look at some of the top software, hardware, and services considerations that have gone into making the platform indispensable for nearly 1,000 active developers on a daily basis, including the challenges of scale, security, and multi-tenancy we have dealt with over the last several years of operating one of the largest Hadoop footprints in the world. We will cover the current technology stack that Yahoo has built or assembled, infrastructure elements such as configurations, deployment models, and network, and what it takes to offer hosted Hadoop services to a large customer base at Yahoo. Throughout the talk, we will highlight relevant use cases from Yahoo's Mobile, Search, Advertising, Personalization, Media, and Communications businesses that may make these considerations more pertinent to your situation.
Presentation from the webinar held on March 10, 2022.
Presented by:
Jaroslav Malina - Senior Channel Sales Manager, Oracle
Josef Krejčí - Technology Sales Consultant, Oracle
Josef Šlahůnek - Cloud Systems Sales Consultant, Oracle
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo... (James Anderson)
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with a passion for making things work and a knack for helping others understand how things work. He brings around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations on CI/CD and application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl (DanBrown980551)
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
Elevating Tactical DDD Patterns Through Object Calisthenics (Dorra BARTAGUIZ)
After immersing yourself in the blue book and its red counterpart, attending DDD-focused conferences, and applying tactical patterns, you're left with a crucial question: How do I ensure my design is effective? Tactical patterns within Domain-Driven Design (DDD) serve as guiding principles for creating clear and manageable domain models. However, achieving success with these patterns requires additional guidance. Interestingly, we've observed that a set of constraints initially designed for training purposes remarkably aligns with effective pattern implementation, offering a more ‘mechanical’ approach. Let's explore together how Object Calisthenics can elevate the design of your tactical DDD patterns, offering concrete help for those venturing into DDD for the first time!
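As a minimal sketch of the idea, here is one Object Calisthenics rule ("wrap all primitives and strings") applied to a tactical DDD value object. The Money domain below is a hypothetical example chosen for illustration, not taken from the talk: wrapping a raw float/string pair in a value object keeps the invariants inside the domain model instead of scattered across services.

```python
"""Object Calisthenics meets DDD: a primitive-wrapping value object."""
from dataclasses import dataclass

@dataclass(frozen=True)  # value objects are immutable and compare by value
class Money:
    amount: float
    currency: str

    def __post_init__(self):
        # Invariant lives inside the value object, not in calling code.
        if self.amount < 0:
            raise ValueError("amount must be non-negative")

    def add(self, other: "Money") -> "Money":
        # Mixing currencies is a domain error, caught at the model boundary.
        if self.currency != other.currency:
            raise ValueError("cannot add different currencies")
        return Money(self.amount + other.amount, self.currency)
```

The constraint pays off mechanically: any code holding a Money can no longer accidentally add euros to dollars or construct a negative balance, which is exactly the kind of "effective design" signal the talk is after.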
Welcome to the first live UiPath Community Day Dubai! Join us for this unique occasion to meet our local and global UiPath Community and leaders. You will get a full view of the MEA region's automation landscape and the AI-powered automation technology capabilities of UiPath. Also, hosted by our local partners Marc Ellis, you will enjoy a half-day packed with industry insights and automation peer networking.
📕 Curious on our agenda? Wait no more!
10:00 Welcome note - UiPath Community in Dubai
Lovely Sinha, UiPath Community Chapter Leader, UiPath MVPx3, Hyper-automation Consultant, First Abu Dhabi Bank
10:20 A UiPath cross-region MEA overview
Ashraf El Zarka, VP and Managing Director MEA, UiPath
10:35: Customer Success Journey
Deepthi Deepak, Head of Intelligent Automation CoE, First Abu Dhabi Bank
11:15 The UiPath approach to GenAI with our three principles: improve accuracy, supercharge productivity, and automate more
Boris Krumrey, Global VP, Automation Innovation, UiPath
12:15 Discover how Marc Ellis leverages tech-driven solutions in recruitment and managed services.
Brendan Lingam, Director of Sales and Business Development, Marc Ellis
Accelerate your Kubernetes clusters with Varnish Caching (Thijs Feryn)
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
DevOps and Testing slides at DASA Connect (Kari Kakkonen)
Slides by me and Rik Marselis from the DASA Connect conference on 30.5.2024. We discuss what testing is, then what agile testing is, and finally what testing in DevOps is. We also had a lovely workshop with the participants, trying to find different ways to think about quality and testing in different parts of the DevOps infinity loop.
Climate Impact of Software Testing at Nordic Testing Days (Kari Kakkonen)
My slides at Nordic Testing Days 6.6.2024
The talk discusses the climate impact and sustainability of software testing. ICT and testing must carry their part of global responsibility to help with climate warming. We can minimize the carbon footprint, but we can also have a carbon handprint: a positive impact on the climate. Quality characteristics can be extended with sustainability and then measured continuously. Test environments can be used less, at a smaller scale, and on demand. Test techniques can be used to optimize or minimize the number of tests. Test automation can be used to speed up testing.
GraphRAG is All You Need? LLM & Knowledge Graph (Guy Korland)
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
Dev Dives: Train smarter, not harder – active learning and UiPath LLMs for do... (UiPathCommunity)
💥 Speed, accuracy, and scaling – discover the superpowers of GenAI in action with UiPath Document Understanding and Communications Mining™:
- See how to accelerate model training and optimize model performance with active learning
- Learn about the latest enhancements to out-of-the-box document processing, with little to no training required
- Get an exclusive demo of the new family of UiPath LLMs: GenAI models specialized for processing different types of documents and messages
This is a hands-on session specifically designed for automation developers and AI enthusiasts seeking to enhance their knowledge in leveraging the latest intelligent document processing capabilities offered by UiPath.
Speakers:
👨🏫 Andras Palfi, Senior Product Manager, UiPath
👩🏫 Lenka Dulovicova, Product Program Manager, UiPath
Observability Concepts EVERY Developer Should Know -- DeveloperWeek Europe (Paige Cruz)
Monitoring and observability aren't traditionally found in software curriculums, and many of us cobble this knowledge together from whatever vendor or ecosystem we were first introduced to and whatever is part of our current company's observability stack.
While the dev and ops silo continues to crumble, many organizations still relegate monitoring and observability to the purview of ops, infra, and SRE teams. This is a mistake: achieving a highly observable system requires collaboration up and down the stack.
I, a former op, would like to extend an invitation to all application developers to join the observability party, and I will share these foundational concepts to build on:
Transcript: Selling digital books in 2024: Insights from industry leaders - T... (BookNet Canada)
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
A tale of scale & speed: How the US Navy is enabling software delivery from l... (sonjaschweigert1)
Rapid and secure feature delivery is a goal across every application team and every branch of the DoD. The Navy’s DevSecOps platform, Party Barge, has achieved:
- Reduction in onboarding time from 5 weeks to 1 day
- Improved developer experience and productivity through actionable findings and reduction of false positives
- Maintenance of superior security standards and inherent policy enforcement with Authorization to Operate (ATO)
Development teams can ship efficiently and ensure applications are cyber ready for Navy Authorizing Officials (AOs). In this webinar, Sigma Defense and Anchore will give attendees a look behind the scenes and demo secure pipeline automation and security artifacts that speed up application ATO and time to production.
We will cover:
- How to remove silos in DevSecOps
- How to build efficient development pipeline roles and component templates
- How to deliver security artifacts that matter for ATOs (SBOMs, vulnerability reports, and policy evidence)
- How to streamline operations with automated policy checks on container images
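The last bullet, automated policy checks on container images, can be sketched as follows. This is a hypothetical illustration, not Anchore's actual report format or API: given a vulnerability report for an image, the gate fails the pipeline when any finding exceeds the configured severity threshold, and returns the violations themselves as evidence for the Authorizing Official.

```python
"""A sketch of a severity-threshold policy gate for container images."""
SEVERITY_RANK = {"low": 0, "medium": 1, "high": 2, "critical": 3}

def policy_check(findings, max_allowed="medium"):
    """Return (passed, violations) for a list of vulnerability findings."""
    limit = SEVERITY_RANK[max_allowed]
    violations = [f for f in findings
                  if SEVERITY_RANK[f["severity"]] > limit]
    # Returning the violations (not just pass/fail) gives the AO a
    # reviewable evidence artifact alongside the gate decision.
    return (len(violations) == 0, violations)

# Hypothetical scan output for one image.
report = [
    {"id": "CVE-2024-0001", "severity": "low"},
    {"id": "CVE-2024-0002", "severity": "critical"},
]
ok, violations = policy_check(report)
```

In a real pipeline this decision would run on every image build, with the report and the gate result attached to the ATO evidence package.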
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024 (Albert Hoitingh)
In this session I delve into the encryption technology used in Microsoft 365 and Microsoft Purview, including the concepts of Customer Key and Double Key Encryption.
Pushing the limits of ePRTC: 100ns holdover for 100 days (Adtran)
At WSTS 2024, Alon Stern explored the topic of parametric holdover and explained how recent research findings can be implemented in real-world PNT networks to achieve 100 nanoseconds of accuracy for up to 100 days.