OpenFlowHub Webinar - Indigo v2.0 and LOXI (openflowhub)
The Big Switch team has been working overtime to develop a new version of the popular Indigo project. Indigo is an OpenFlow firmware agent for physical switches, and info on the current version can be found at http://indigo.openflowhub.org.
Why the need for a new version? We wanted to provide an extensible framework that lets switch and hypervisor vendors implement OpenFlow agents, and to offer an easier upgrade path to newly supported versions of OpenFlow.
Major new capabilities include a) a hardware abstraction layer (HAL) that makes it easy to integrate with the forwarding and port management interfaces of physical or virtual switches, b) a configuration abstraction layer to support running OpenFlow in a "hybrid" mode on your switch, and c) LOXI, a marshalling/unmarshalling engine that generates OpenFlow libraries in multiple languages. Currently it generates C, with Java and Python coming soon.
Imaginea brings more than 12 years of product engineering and services to software companies from several different industries, at any stage of the product life cycle. Through the use of several technologies and strong, innovative development processes, we deliver dependable software products at a lower cost and fulfill our customers' business needs.
It is no wonder then that all of our customers, from the startups to the big guys, call on us for comprehensive development of core products and are often return customers!
We provide product engineering services, backed by a reliable technology partnership, to independent software vendors, enterprises, and online SaaS businesses. Our services are comprehensive, covering the development process from beginning to end.
Have you been hacking some JavaScript to enable your users to do more, directly in the Chatter feed? Hack no more! In Winter '15 you can define, invoke, and debug Action Links in Force.com. If you are familiar with our REST APIs and Chatter, join the engineers who delivered this feature for a walkthrough of tools, code samples, and resources. You'll come away ready to create your own action links.
Turbo-Charge Your Analytics with IBM Netezza and Revolution R Enterprise: A S... (Revolution Analytics)
Everyone involved in high-stakes analytics wants power, speed, and flexibility, regardless of the size of the data set and the complexity of the analysis. Trailblazing organizations that have deployed IBM Netezza Analytics on their IBM Netezza data warehouse appliances (TwinFin) together with Revolution R Enterprise are getting all three.
100% R and More: Plus What's New in Revolution R Enterprise 6.0 (Revolution Analytics)
R users already know why the R language is the lingua franca of statisticians today: because it's the most powerful statistical language in the world. Revolution Analytics builds on the power of open source R, and adds performance, productivity and integration features to create Revolution R Enterprise. In this webinar, author and blogger David Smith will introduce the additional capabilities of Revolution R Enterprise.
VP of Product Development Dr. Sue Ranney will also provide an overview of the features introduced in Revolution R Enterprise 6.0, including:
1. Big Data Generalized Linear Model, the new RevoScaleR function that provides a fast, scalable, distributable implementation of generalized linear models, offering impressive speed-ups relative to glm on in-memory data frames
2. Platform LSF Cluster Support, which allows you to create a distributed compute context for the Platform LSF workload manager
3. Azure Burst support added to RxHpcServer
4. Updated R engine (R 2.14.2)
5. Ability to use RevoScaleR analysis functions with non-xdf data sources such as SAS, SPSS or text
6. New methods for RxXdfData data sources including head, tail, names, dim, colnames, length, str, and formula
7. New function rxRoc for generating ROC curves
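An ROC-curve function like rxRoc sweeps the score thresholds and records the true-positive and false-positive rates at each one. As a language-neutral illustration of that computation (this is a plain-Python sketch with toy data, not the RevoScaleR API):

```python
# Sketch of the computation behind an ROC-curve function: for each
# distinct score threshold, compute the false-positive rate (x) and
# true-positive rate (y).

def roc_points(labels, scores):
    """Return (fpr, tpr) pairs for every distinct score threshold."""
    pos = sum(labels)
    neg = len(labels) - pos
    points = []
    for thresh in sorted(set(scores), reverse=True):
        tp = sum(1 for l, s in zip(labels, scores) if l == 1 and s >= thresh)
        fp = sum(1 for l, s in zip(labels, scores) if l == 0 and s >= thresh)
        points.append((fp / neg, tp / pos))
    return points

# Toy data: 1 = event observed, scores from some hypothetical classifier.
labels = [1, 1, 0, 1, 0, 0]
scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.2]
print(roc_points(labels, scores))
```

Plotting these points (FPR on x, TPR on y) gives the familiar ROC curve; a perfect classifier hugs the top-left corner.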
In this presentation from Revolution Analytics, Bill Jacobs presents: Are You Ready for Big Data Analytics?
"Revolution Analytics delivers advanced analytics software at half the cost of existing solutions. By building on open source R—the world's most powerful statistics software—with innovations in big data analysis, integration and user experience, Revolution Analytics meets the demands and requirements of modern data-driven businesses."
Learn more: http://www.revolutionanalytics.com
Watch the presentation video: http://wp.me/p3RLEV-12S
We at Revolution Analytics are often asked “What is the best way to learn R?” While acknowledging that there may be as many effective learning styles as there are people, we have identified three factors that greatly facilitate learning R. For a quick start:
- Find a way of orienting yourself in the open source R world
- Have a definite application area in mind
- Set an initial goal of doing something useful and then build on it
In this webinar, we focus on data mining as the application area and show how anyone with just a basic knowledge of elementary data mining techniques can become immediately productive in R. We will:
- Provide an orientation to R’s data mining resources
- Show how to use the "point and click" open source data mining GUI, rattle, to perform the basic data mining functions of exploring and visualizing data, building classification models on training data sets, and using these models to classify new data.
- Show the simple R commands to accomplish these same tasks without the GUI
- Demonstrate how to build on these fundamental skills to gain further competence in R
- Move beyond small test data sets and show how, with the same level of skill, one can analyze some fairly large data sets with RevoScaleR
Data scientists and analysts using other statistical software, as well as students who are new to data mining, should come away with a plan for getting started with R.
As the Big Data market has evolved, the focus has shifted from data operations (storage, access and processing of data) to data science (understanding, analyzing and forecasting from data). And as new models are developed, organizations need a process for deploying analytics from research into the production environment. In this talk, we'll describe the five stages of real-time analytics deployment:
1. Data distillation
2. Model development
3. Model validation and deployment
4. Model refresh
5. Real-time model scoring
We'll review the technologies supporting each stage, and how Revolution Analytics software works with the entire analytics stack to bring Big Data analytics to real-time production environments.
New Features in Revolution R Enterprise 5.0 to Support Scalable Data Analysis (Revolution Analytics)
Revolution R Enterprise 5.0 is Revolution Analytics’ scalable analytics platform. At its core is Revolution Analytics’ enhanced Distribution of R, the world’s most widely-used project for statistical computing. In this webinar, Dr. Ranney will discuss new features and show examples of the new functionality, which extend the platform’s usability, integration and scalability.
Denodo Design Studio: Modeling and Creation of Data Services (Denodo)
Watch full webinar here: https://bit.ly/39T7SON
Change is the only constant, and it is very important for enterprises to keep up with the changing times in an agile fashion. To ensure faster time to market, quick business insights, and rapid data-driven decision making, it is important that the data delivery channel is optimized in the best way possible.
With the advent of API management technologies, demand for data delivered in the form of Data Services/APIs is increasing. The ability to make data available in an API format at the click of a button is the need of the hour. Join us to see how easy it is to make enterprise-wide data available as Data Services/APIs, no matter what format the data is stored in, with no prior coding experience. Faster development, zero learning curve, and huge value.
Watch this on-demand webinar to learn:
- How to explore available datasets using Denodo Data Catalog
- How to build new datasets using Denodo Design Studio's drag-and-drop interface
- How to make datasets available in RESTful, OData 4, GeoJSON, and GraphQL formats
- How to enable different authentication protocols, including OAuth 2.0
- How automatic documentation (OpenAPI) is generated and made available in the Denodo Data Catalog
Modernizing Your IT Infrastructure with Hadoop - Cloudera Summer Webinar Seri... (Cloudera, Inc.)
You will also learn how to understand the key challenges of deploying a Hadoop cluster in production, manage the entire Hadoop lifecycle using a single management console, and deliver integrated management of the entire cluster to maximize IT and business agility.
Big Data Analytics on Teradata with Revolution R Enterprise (Bill Jacobs)
Revolution Analytics brings big data analytics to the Teradata database. Presentation from Teradata Partners, October 2013, giving an overview of Revolution R Enterprise for Teradata, by Bill Jacobs, Director of Product Marketing, Revolution Analytics.
Presented to eRum (Budapest), May 2018
There are many common workloads in R that are "embarrassingly parallel": group-by analyses, simulations, and cross-validation of models are just a few examples. In this talk I'll describe the doAzureParallel package, a backend to the "foreach" package that automates the process of spawning a cluster of virtual machines in the Azure cloud to process iterations in parallel. This will include an example of optimizing hyperparameters for a predictive model using the "caret" package.
By David Smith. Presented at Microsoft Build (Seattle), May 7 2018.
Your data scientists have created predictive models using open-source tools, proprietary software, or some combination of both, and now you are interested in lifting and shifting those models to the cloud. In this talk, I'll describe how data scientists can transition their existing workflows — while using mostly the same tools and processes — to train and deploy machine learning models based on open source frameworks to Azure. I'll provide guidance on keeping connections to data sources up-to-date, evaluating and monitoring models, and deploying applications that make use of those models.
Similar to Integrate Your Advanced Analytics into BI Apps and MS Office and Multiply Their Value
Presentation delivered by David Smith to NY R Conference https://www.rstats.nyc/, April 2018:
Minecraft is an open-world creativity game, and a hit with kids. To get kids interested in learning to program with R, we created the "miner" package. This package is a collection of simple functions that allow you to connect with a Minecraft instance, manipulate the world within by creating blocks and controlling the player, and to detect events within the world and react accordingly.
The miner package is intended mainly for kids, to inspire them to learn R while playing Minecraft. But the development of the package also provides some useful insights into how to build an R package to interface with a persistent API, and how to instruct others on its use. In this talk I'll describe how to set up your own Minecraft server, and how to use and extend the package. I'll also provide a few examples of the package in action in a live Minecraft session.
While Python is a widely used tool for AI development, in this talk I'll make the case for considering R as a platform for developing models for intelligent applications. First, R provides a first-class experience for working with deep learning frameworks through its keras integration. Equally importantly, it provides the most comprehensive suite of statistical data analysis tools, which are extremely useful for many intelligent applications such as transfer learning. I'll give a few high-level examples in this talk, and we'll go into further detail in the accompanying interactive code lab.
There are many common workloads in R that are "embarrassingly parallel": group-by analyses, simulations, and cross-validation of models are just a few examples. In this talk I'll describe several techniques available in R to speed up workloads like these, by running multiple iterations simultaneously, in parallel.
Many of these techniques require the use of a cluster of machines running R, and I'll provide examples of using cloud-based services to provision clusters for parallel computations. In particular, I will describe how you can use the SparklyR package to distribute data manipulations using the dplyr syntax, on a cluster of servers provisioned in the Azure cloud.
Presented by David Smith at Data Day Texas in Austin, January 27 2018.
A look at the changing perceptions of R, from the early days of the R project to today. Microsoft sponsor talk, presented by David Smith to the useR!2017 conference in Brussels, July 5 2017.
Predicting Loan Delinquency at One Million Transactions per Second (Revolution Analytics)
Real-time applications of predictive models must be able to generate predictions at the rate that transactions are generated. Previously, such applications of models trained using R needed to be converted to other languages like C++ or Java to achieve the required throughput. In this talk, I’ll describe how to use the in-database R processing capabilities of Microsoft R Server to detect fraud in a SQL Server database of loan records at a rate exceeding one million transactions per second. I will also show the process of training the underlying gradient-boosted tree model on a large training set using the out-of-memory algorithms of Microsoft R.
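Part of why tree-based models can be scored at such rates is that scoring is branch-only work: walk each small tree, sum the leaf values, and threshold. The sketch below shows the mechanics on a hypothetical two-stump model (the features, split values, and leaf weights are all invented for illustration; this is not the Microsoft R Server implementation, and a real delinquency model would have hundreds of deeper trees):

```python
# Scoring a gradient-boosted tree ensemble: each record's score is the
# sum of one leaf value per tree, so prediction is cheap per record.
# Hypothetical model: each stump is (feature_index, split, left_leaf, right_leaf).
MODEL = [
    (0, 0.35, -1.2, 0.8),   # feature 0: debt-to-income ratio
    (1, 640,   0.9, -0.7),  # feature 1: credit score
]
BASE_SCORE = 0.0

def score(record):
    """Sum the contributions of every tree for one loan record."""
    total = BASE_SCORE
    for feat, split, left, right in MODEL:
        total += left if record[feat] < split else right
    return total

def is_delinquent(record, threshold=0.0):
    """Classify one record by thresholding its ensemble score."""
    return score(record) > threshold

# record = [debt_to_income, credit_score]
print(is_delinquent([0.50, 580]))  # high DTI, low credit score
```

Since each record is scored independently, the same loop vectorizes or shards trivially across database rows, which is what in-database scoring exploits.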
Presented by David Smith at The Data Science Summit, Chicago, April 20 2017.
The ability to independently reproduce results is a critical issue within the scientific community today, and is equally important for collaboration and compliance in business. In this talk, I'll introduce several features available in R that help you make reproducibility a standard part of your data science workflow. The talk will include tips on working with data and files, combining code and output, and managing R's changing package ecosystem.
Presented by David Smith, R Community Lead (Microsoft), at Monktoberfest October 2016.
The value of open source isn’t just in the software itself. The communities that form around open source software provide just as much value and sometimes even more: in ongoing development, in documentation, in support, in marketing, and as a supply of ready-trained employees. Companies who build on open source tend to focus on the software, but neglect communities at their peril.
In this talk, I share some of my experiences in building community for an open-source software company, Revolution Analytics, and perspectives since the acquisition by Microsoft in 2015.
R is more than just a language. Many of the reasons why R has become such a popular tool for data science come from the ecosystem surrounding the R project. R users benefit from the many resources and packages created by the community, while commercial companies (including Microsoft) provide tools to extend and support R, and services to help people use R.
In this talk, I will give an overview of the R Ecosystem and describe how it has been a critical component of R’s success, and include several examples of Microsoft’s contributions to the ecosystem.
(Presented to EARL London, September 2016)
(Presented by David Smith at useR!2016, June 2016. Recording: https://channel9.msdn.com/Events/useR-international-R-User-conference/useR2016/R-at-Microsoft )
Since the acquisition of Revolution Analytics in April 2015, Microsoft has embarked upon a project to build R technology into many Microsoft products, so that developers and data scientists can use the R language and R packages to analyze data in their data centers and in cloud environments.
In this talk I will give an overview (and a demo or two) of how R has been integrated into various Microsoft products. Microsoft data scientists are also big users of R, and I'll describe a couple of examples of R being used to analyze operational data at Microsoft. I'll also share some of my experiences in working with open source projects at Microsoft, and my thoughts on how Microsoft works with open source communities including the R Project.
Hadoop is famously scalable. Cloud Computing is famously scalable. R – the thriving and extensible open source Data Science software – not so much. But what if we seamlessly combined Hadoop, Cloud Computing, and R to create a scalable Data Science platform? Imagine exploring, transforming, modeling, and scoring data at any scale from the comfort of your favorite R environment. Now, imagine calling a simple R function to operationalize your predictive model as a scalable, cloud-based Web Service. Learn how to leverage the magic of Hadoop on-premises or in the cloud to run your R code, thousands of open source R extension packages, and distributed implementations of the most popular machine learning algorithms at scale.
GridMate - End to end testing is a critical piece to ensure quality and avoid... (ThomasParaiso2)
End to end testing is a critical piece to ensure quality and avoid regressions. In this session, we share our journey building an E2E testing pipeline for GridMate components (LWC and Aura) using Cypress, JSForce, FakerJS…
PHP Frameworks: I want to break free (IPC Berlin 2024) (Ralf Eggert)
In this presentation, we examine the challenges and limitations of relying too heavily on PHP frameworks in web development. We discuss the history of PHP and its frameworks to understand how this dependence has evolved. The focus will be on providing concrete tips and strategies to reduce reliance on these frameworks, based on real-world examples and practical considerations. The goal is to equip developers with the skills and knowledge to create more flexible and future-proof web applications. We'll explore the importance of maintaining autonomy in a rapidly changing tech landscape and how to make informed decisions in PHP development.
This talk is aimed at encouraging a more independent use of PHP frameworks, moving towards a more flexible and future-proof style of PHP development.
A tale of scale & speed: How the US Navy is enabling software delivery from l... (sonjaschweigert1)
Rapid and secure feature delivery is a goal across every application team and every branch of the DoD. The Navy’s DevSecOps platform, Party Barge, has achieved:
- Reduction in onboarding time from 5 weeks to 1 day
- Improved developer experience and productivity through actionable findings and reduction of false positives
- Maintenance of superior security standards and inherent policy enforcement with Authorization to Operate (ATO)
Development teams can ship efficiently and ensure applications are cyber ready for Navy Authorizing Officials (AOs). In this webinar, Sigma Defense and Anchore will give attendees a look behind the scenes and demo secure pipeline automation and security artifacts that speed up application ATO and time to production.
We will cover:
- How to remove silos in DevSecOps
- How to build efficient development pipeline roles and component templates
- How to deliver security artifacts that matter for ATOs (SBOMs, vulnerability reports, and policy evidence)
- How to streamline operations with automated policy checks on container images
State of ICS and IoT Cyber Threat Landscape Report 2024 preview (Prayukth K V)
The IoT and OT threat landscape report has been prepared by the Threat Research Team at Sectrio using data from Sectrio's cyber threat intelligence farming facilities spread across over 85 cities around the world. In addition, Sectrio runs AI-based advanced threat and payload engagement facilities that serve as sinks to attract and engage sophisticated threat actors and newer malware, including new variants and latent threats that are at an earlier stage of development.
The latest edition of the OT/ICS and IoT security Threat Landscape Report 2024 also covers:
- State of global ICS asset and network exposure
- Sectoral targets and attacks, as well as the cost of ransom
- Global APT activity, AI usage, actor and tactic profiles, and implications
- Rise in volumes of AI-powered cyberattacks
- Major cyber events in 2024
- Malware and malicious payload trends
- Cyberattack types and targets
- Vulnerability exploit attempts on CVEs
- Attacks on counties – USA
- Expansion of bot farms – how, where, and why
- In-depth analysis of the cyber threat landscape across North America, South America, Europe, APAC, and the Middle East
- Why are attacks on smart factories rising?
- Cyber risk predictions
- Axis of attacks – Europe
- Systemic attacks in the Middle East
Download the full report from here:
https://sectrio.com/resources/ot-threat-landscape-reports/sectrio-releases-ot-ics-and-iot-security-threat-landscape-report-2024/
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo... (James Anderson)
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. A constant focus on speed to market, combined with traditionally slow and manual security checks, has caused gaps in continuous security, an important piece of the software supply chain. Today, organizations feel more susceptible to external and internal cyber threats due to the vast attack surface of their application supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with a passion for making things work, along with a knack for helping others understand how things work. He has around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations on CI/CD and application security integrated into the software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
Transcript: Selling digital books in 2024: Insights from industry leaders - T... (BookNet Canada)
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024 (Neo4j)
Neha Bajwa, Vice President of Product Marketing, Neo4j
Join us as we explore breakthrough innovations enabled by interconnected data and AI. Discover firsthand how organizations use relationships in data to uncover contextual insights and solve our most pressing challenges – from optimizing supply chains, detecting fraud, and improving customer experiences to accelerating drug discoveries.
Threats to mobile devices are more prevalent and increasing in scope and complexity. Users of mobile devices want to take full advantage of the features available on those devices, but many features that provide convenience and capability sacrifice security. This best practices guide outlines steps users can take to better protect their personal devices and information.
Sudheer Mechineni, Head of Application Frameworks, Standard Chartered Bank
Discover how Standard Chartered Bank harnessed the power of Neo4j to transform complex data access challenges into a dynamic, scalable graph database solution. This keynote will cover their journey from initial adoption to deploying a fully automated, enterprise-grade causal cluster, highlighting key strategies for modelling organisational changes and ensuring robust disaster recovery. Learn how these innovations have not only enhanced Standard Chartered Bank’s data infrastructure but also positioned them as pioneers in the banking sector’s adoption of graph technology.
Epistemic Interaction - tuning interfaces to provide information for AI support (Alan Dix)
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
Pushing the limits of ePRTC: 100ns holdover for 100 days (Adtran)
At WSTS 2024, Alon Stern explored the topic of parametric holdover and explained how recent research findings can be implemented in real-world PNT networks to achieve 100 nanoseconds of accuracy for up to 100 days.
Observability Concepts EVERY Developer Should Know -- DeveloperWeek Europe.pdf (Paige Cruz)
Monitoring and observability aren't traditionally found in software curriculums, and many of us cobble this knowledge together from whatever vendor or ecosystem we were first introduced to and whatever is part of our current company's observability stack.
While the dev and ops silo continues to crumble, many organizations still relegate monitoring and observability to the purview of ops, infra, and SRE teams. This is a mistake: achieving a highly observable system requires collaboration up and down the stack.
I, a former op, would like to extend an invitation to all application developers to join the observability party, and will share these foundational concepts to build on:
Dr. Sean Tan, Head of Data Science, Changi Airport Group
Discover how Changi Airport Group (CAG) leverages graph technologies and generative AI to revolutionize their search capabilities. This session delves into the unique search needs of CAG’s diverse passengers and customers, showcasing how graph data structures enhance the accuracy and relevance of AI-generated search results, mitigating the risk of “hallucinations” and improving the overall customer journey.
Why You Should Replace Windows 11 with Nitrux Linux 3.5.0 for enhanced perfor... (SOFTTECHHUB)
The choice of an operating system plays a pivotal role in shaping our computing experience. For decades, Microsoft's Windows has dominated the market, offering a familiar and widely adopted platform for personal and professional use. However, as technological advancements continue to push the boundaries of innovation, alternative operating systems have emerged, challenging the status quo and offering users a fresh perspective on computing.
One such alternative that has garnered significant attention and acclaim is Nitrux Linux 3.5.0, a sleek, powerful, and user-friendly Linux distribution that promises to redefine the way we interact with our devices. With its focus on performance, security, and customization, Nitrux Linux presents a compelling case for those seeking to break free from the constraints of proprietary software and embrace the freedom and flexibility of open-source computing.
Integrate Your Advanced Analytics into BI Apps and MS Office and Multiply Their Value
1. Deploying Revolution R Enterprise with Business Intelligence Applications
David Champagne
May 8, 2012
Revolution Confidential
2. Our Customers Want to…
Be more productive
  Stop cutting and pasting analytics into MS Office applications
  Get feedback from a wider variety of analytic consumers
Make the company’s “analytic consumers” self-sufficient
  Provide the model and secure access to data and let them iterate and learn
Provide more timely updates
  Model updates with new data, so not restricted to “scheduled” reporting intervals
Deliver more value through BI application investment
  Advanced analytics complements traditional BI reporting, and can leverage the work already done
Elevate the value of the analytics team
  Once people see what’s possible, they’ll want more
3. Revolution Analytics
The most advanced statistical analysis software available, at half the cost of commercial alternatives. The professor who invented analytic software for the experts now wants to take it to the masses.
2M+ users and 2,500+ applications across finance, life sciences, manufacturing, retail, telecom, social media, and government.
[Word cloud: Power, Productivity, Enterprise Readiness, Statistics, Predictive Analytics, Data Mining, Visualization]
4. Advanced Analytics and Business Intelligence
Obtain greater insights by adding advanced analytics
  Predictive modeling
  Sales forecasts
  Customer affinity
  Associations
  Clustering and classification
Expose this capability to the business users
5. R is the Best Choice for the Job
R is a programming language
Catalog of over 2,500 open-source add-on packages
Community
  Thousands of contributors, 2 million users
  Resources and help in every domain
Develop and deploy your models with the same environment
6. Adding Advanced Analytics
Use the same analytic toolset across applications and platforms
  Web
    Reporting and dashboards
    Custom interactive applications
  Mobile
  Desktop applications
    Excel
    Custom apps via .NET, Java
  Enterprise processes via SOA
7. Revolution R Enterprise – Deployment
[Diagram: individual analysts and business users connect to RevoDeployR deployment servers, which dispatch high-workload jobs to clusters, a Hadoop cluster (HDFS), databases, and appliances.]
8. Customer Use Cases
Major NY bank using Revolution R Enterprise to create and deploy equity trading models
  500+ analysts using the developed models
Major hospital network doing analytics on clinical data to generate treatment efficacy predictions and capacity forecasting
  Executive staff using output of analytics to make business decisions
11. RevoDeployR – Key Advantages
Unlocks all the power of R to any 3rd-party application
Easy-to-use API – rapid deployment
Scalability – add nodes as you need them
Separation of expertise
  Statistician – just writes R code, no need to know about the application
  Application programmer – calls the API to execute an R script and gets the output
12. RevoDeployR
Designed to be enterprise ready
  Comprehensive collection of Web Service APIs
  Enterprise security
  Stateful and stateless execution of R code/scripts
  Asynchronous job execution
  Repository for managing R objects and files
  Administration
13. RevoDeployR – Architecture
[Diagram: end-user desktop applications (e.g. Excel), business intelligence tools (e.g. QlikView), and interactive web or mobile applications sit on top of client libraries (JavaScript, Java, .NET). The client libraries talk HTTP/HTTPS with JSON/XML payloads to the RevoDeployR Web Services layer (authentication, session management, data/script management, administration), which dispatches work to a pool of R engines. Roles shown: application developer, admin, R programmer.]
14. RevoDeployR – Server
[Diagram: applications and the admin management console call the RevoDeployR Web Services API, built on the Spring 3 and J2EE frameworks. A grid management framework distributes R sessions across grid nodes, backed by a SQL database and a WebDAV repository.]
15. RevoDeployR
Web Services layer
  Implemented as a RESTful API accessed via HTTP or HTTPS
  Support for both JSON- and XML-formatted payloads
  Client libraries in JavaScript, Java, and .NET to make integration easy
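To make the Web Services layer concrete, here is a minimal sketch of the request a client library would build for you. The endpoint path and parameter names are assumptions modeled on the deck's examples, not the official RevoDeployR API; the point is only the shape: an HTTP POST with form-encoded parameters and a JSON (or XML) response.

```javascript
// Illustrative sketch only: endpoint and parameter names are hypothetical,
// modeled loosely on the slides. No network call is made here; we just
// build the pieces of the request a client library would send.

function buildScriptExecuteRequest(scriptName, inputs, format) {
  // Inputs travel as a JSON-encoded list of named R objects.
  const params = new URLSearchParams({
    format: format,                       // "json" or "xml"
    rscript: scriptName,
    inputs: JSON.stringify(inputs),
  });
  return {
    method: 'POST',
    path: '/r/repository/script/execute', // hypothetical endpoint
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: params.toString(),
  };
}

const req = buildScriptExecuteRequest(
  'DeployR - Hello World',
  [{ name: 'input_randomNum', type: 'numeric', value: 42 }],
  'json'
);
console.log(req.method, req.path);
```

Because the wire format is just HTTP plus JSON/XML, any platform that can make a POST request can integrate, which is why the slide lists JavaScript, Java, and .NET client libraries as conveniences rather than requirements.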
16. RevoDeployR
R scripts and R code
  Stateless execution of predefined R scripts
    Supports both anonymous and authenticated access
    Project is automatically created, inputs loaded, R script executed, outputs returned, and session destroyed
  Stateful execution
    Must be an authenticated user
    Project is explicitly created/destroyed
    R script or R code executed in the defined project
  Jobs
    Code and script can be executed as a background job
    Results are persisted and can be retrieved later
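The difference between the two execution modes above can be seen in a toy model (not real API code): stateless calls get a throwaway project per request, while stateful calls reuse an explicitly created project across executions.

```javascript
// Toy model of the two execution modes. Everything here is illustrative;
// a real server holds the projects, not an in-process Set.

const liveProjects = new Set();
let nextId = 0;

function createProject() {
  const id = `PROJECT-${nextId++}`;
  liveProjects.add(id);
  return id;
}

function destroyProject(id) { liveProjects.delete(id); }

// Stateless: project created, script run, outputs returned, project destroyed.
function statelessExecute(script, inputs) {
  const id = createProject();
  const result = { project: id, outputs: `${script}(${inputs})` };
  destroyProject(id);
  return result;
}

// Stateful: the caller owns the project lifetime, so workspace state
// (variables, loaded data) survives between executions.
function statefulExecute(projectId, script, inputs) {
  return { project: projectId, outputs: `${script}(${inputs})` };
}

statelessExecute('hist', 'x');
console.log(liveProjects.size);          // 0 - nothing left behind

const p = createProject();
statefulExecute(p, 'loadData', 'csv');
statefulExecute(p, 'regress', 'y ~ x');  // same workspace as the call above
console.log(liveProjects.size);          // 1 - until we destroy it
destroyProject(p);
```

Jobs are the third variant: the same execution, but queued in the background with persisted results that can be fetched later.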
17. RevoDeployR
R developer
  Create R code, defining inputs and outputs
    Inputs – R objects
    Outputs – files, console, warnings/errors, R objects
  R objects that can be rendered in JSON or XML as part of the API payload:
    Primitives (character, numeric, logical, date, factor)
    Vector
    Matrix
    Data frame
    List
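The slides do not show the exact wire encoding, so the shapes below are only illustrative of how the listed R object types (primitive, vector, data frame) could be rendered in a JSON payload as name/type/value triples.

```javascript
// Hypothetical JSON encodings for the R object types listed above.
// The name/type/value shape is an assumption for illustration.

const inputs = [
  { name: 'cutoff',  type: 'numeric',   value: 0.05 },          // primitive
  { name: 'label',   type: 'character', value: 'Q2' },          // primitive
  { name: 'weights', type: 'vector',    value: [1, 2, 3] },     // vector
  { name: 'sales',   type: 'dataframe',                          // data frame
    value: { region: ['east', 'west'], revenue: [120, 95] } },   // as columns
];

// Round-trip through JSON exactly as an API payload would.
const payload = JSON.stringify(inputs);
const decoded = JSON.parse(payload);
console.log(decoded[3].value.region[1]); // "west"
```

The key point from the slide survives any particular encoding: the R developer only declares named inputs and outputs, and the serialization to JSON or XML is handled by the platform.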
18. RevoDeployR
Application developer
  Define RevoDeployR server connection (URL)
  *Authenticate
  *Create/open project
  Execute script or execute code
    Create list of inputs (R objects)
    Create list of named outputs, if any (R objects)
  *Close R project
* Required for stateful execution
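The steps above can be sketched as a call sequence. The endpoint paths here are assumptions extrapolated from the slide's "/r/session/create"-style URLs, and the transport is mocked so the sketch runs offline; a real client would POST these over HTTP(S).

```javascript
// Sketch of the stateful application-developer workflow. Paths are
// hypothetical; `post` records calls instead of contacting a server.

const calls = [];
function post(path, params = {}) {
  calls.push(path);
  return { success: true, params };
}

function runStatefulWorkflow() {
  post('/r/user/login', { username: 'testuser' }); // * required for stateful
  post('/r/project/create');                       // * explicit project
  post('/r/project/execute/script', {
    rscript: 'regression',                         // list of inputs...
    inputs: [{ name: 'alpha', type: 'numeric', value: 0.05 }],
    outputs: ['fit'],                              // ...and named outputs
  });
  post('/r/project/close');                        // * release the session
}

runStatefulWorkflow();
console.log(calls.join(' -> '));
```

For stateless execution, the starred steps drop out: the application would issue only the execute call, and the server would create and destroy the project around it.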
19. RevoDeployR – RESTful API
Example HTTP call to create a project
format = json
HTTP POST on API call: /r/project/create
JSON response:
JSON Response
{
"deployr": {
"response": {
"success": true,
"project": {
"lastmodified": "Thu, 20 Oct 2011 18:27:29 +0000",
"live": true,
"origin": "Project original.",
"longdescr": null,
"name": null,
"projectcookie": null,
"ispublic": false,
"owner": "testuser",
"descr": null,
"project": "PROJECT-5ab61ec0-09b9-44ea-837d-9e6f40a7e8a3"
},
"call": "/r/project/create"
}
}
}
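Since the response is plain JSON, any language can consume it without the client libraries. This snippet parses the slide's sample response verbatim and pulls out the fields an application typically needs: the success flag and the project identifier used in later calls.

```javascript
// Parse the sample project-create response shown in the slide.

const raw = `{
  "deployr": {
    "response": {
      "success": true,
      "project": {
        "lastmodified": "Thu, 20 Oct 2011 18:27:29 +0000",
        "live": true,
        "origin": "Project original.",
        "longdescr": null,
        "name": null,
        "projectcookie": null,
        "ispublic": false,
        "owner": "testuser",
        "descr": null,
        "project": "PROJECT-5ab61ec0-09b9-44ea-837d-9e6f40a7e8a3"
      },
      "call": "/r/project/create"
    }
  }
}`;

const response = JSON.parse(raw).deployr.response;
if (!response.success) throw new Error(`call failed: ${response.call}`);
const projectId = response.project.project; // handle for subsequent API calls
console.log(projectId);
```

Note the envelope convention: every payload is wrapped in `deployr.response` with a `success` flag and a `call` field echoing the invoked endpoint, which makes generic error handling straightforward.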
20. RevoDeployR – Stateless Example (JavaScript)
var exeScript = function () {
…
// set the call back configuration
var callback = { success : plot, failure: fail, scope : this, verbose : true };
var rnum = Y.Revolution.RDataFactory.createNumeric('input_randomNum', num);
var scriptConfig = { rscript:'DeployR - Hello World', inputs : [rnum]};
// execute RScript
Y.Revolution.DeployR.repositoryScriptExecute(scriptConfig, callback);
};
21. RevoDeployR – Stateful Example
Use case: simple regression
  Upload a CSV file to the RevoDeployR server
  Get a list of numeric variables
  Run a simple regression using 2 of the variables
  Return a plot
Implementation
  2 R scripts
    Read the uploaded CSV and return the list of numeric variables
    Run the regression on the selected variables
  Requires authentication (login)
  R session is explicitly created after login
  Both scripts execute in the same R session
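Here is a toy walkthrough (not real API code) of why the stateful mode matters for this use case: two scripts run against one explicitly created session, so the data frame loaded by the first script is still in the workspace when the second runs.

```javascript
// Toy model of the regression use case. The "session" stands in for a
// server-side R session; the two functions stand in for the two R scripts.

function createSession() {
  return { workspace: {} };          // explicit create, done after login
}

// Script 1: read the uploaded CSV, return the numeric variable names.
function listNumericVars(session, csvRows) {
  session.workspace.df = csvRows;    // data stays in the session workspace
  const first = csvRows[0];
  return Object.keys(first).filter(k => typeof first[k] === 'number');
}

// Script 2: regress two selected variables, reading from the SAME session.
function runRegression(session, xVar, yVar) {
  const df = session.workspace.df;   // present only because script 1 ran here
  if (!df) throw new Error('no data in session');
  return `lm(${yVar} ~ ${xVar}) over ${df.length} rows`; // placeholder "plot"
}

const session = createSession();
const vars = listNumericVars(session, [
  { city: 'NYC', price: 12, volume: 30 },
  { city: 'SF',  price: 9,  volume: 22 },
]);
console.log(vars);                          // ["price", "volume"]
console.log(runRegression(session, vars[0], vars[1]));
```

Had the two scripts been run statelessly, each would get a fresh throwaway project and the second script would find no data, which is exactly why the slide calls for an explicit session here.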
22. RevoDeployR – Scalability
Add compute nodes to handle changing workload requirements
Execute code and scripts as background jobs
Assign roles to nodes
  Anonymous
  Authenticated
  Jobs
23. Thank You
Revolution Analytics: the leading commercial provider of software and support for the popular open source R statistics language.
www.revolutionanalytics.com  650.646.9545  Twitter: @RevolutionR