This document summarizes a presentation on using data analytics for compliance, due diligence, and investigations. The presentation features four speakers: Raul Saccani of Deloitte, Dave Stewart of SAS Institute, John Walsh of SightSpan, and John Walsh of SAS Institute. It discusses challenges related to big data, including the volume, variety, and velocity of data. It provides examples of how financial institutions have used analytics for anti-money laundering model tuning and illicit network analysis. It also outlines the analytics lifecycle and considerations for adopting a proactive analytics strategy.
SQL Azure Database is a cloud database service from Microsoft. SQL Azure provides web-facing database functionality as a utility service. Cloud-based database solutions such as SQL Azure can provide many benefits, including rapid provisioning, cost-effective scalability, high availability, and reduced management overhead. This paper provides an overview of some scale-out strategies, the challenges of scaling out on-premises, and how you can benefit from scaling out with SQL Azure.
The data services marketplace is enabled by a data abstraction layer that supports rapid development of operational applications and single-data-view portals. In this presentation you will learn about the services-based reference architecture and the modality and latency of data access.
- Reference architecture for enterprise data services marketplace
- Modality and latency of data access
- Customer use cases and demo
This presentation is part of the Denodo Educational Seminar; you can watch the video here: goo.gl/vycYmZ.
The Economic Value of Data: A New Revenue Stream for Global Custodians (Cognizant)
Global custodians' big data offers myriad opportunities for generating value from analytics solutions; we explore various paths and offer three use cases to illustrate. Data aggregation, risk management, digital experience, operational agility and cross-selling are all covered.
ADV Slides: When and How Data Lakes Fit into a Modern Data Architecture (DATAVERSITY)
Whether to take data ingestion cycles off the ETL tool and the data warehouse or to facilitate competitive Data Science and building algorithms in the organization, the data lake – a place for unmodeled and vast data – will be provisioned widely in 2020.
Though it doesn’t have to be complicated, the data lake has a few key design points that are critical, and it does need to follow some principles for success. Avoid building the data swamp, but not the data lake! The tool ecosystem is building up around the data lake and soon many will have a robust lake and data warehouse. We will discuss policy to keep them straight, send data to its best platform, and keep users’ confidence up in their data platforms.
Data lakes will be built in cloud object storage. We’ll discuss the options there as well.
Get this data point for your data lake journey.
This document is the first deliverable of the Lean Big Data work package 7 (WP7). The main goal of work package 7 is to provide the use-case applications that will be used to validate the Lean Big Data platform. To this end, an analysis of the requirements of each use case is provided. This analysis will be used as the basis for the description of the evaluation, benchmarking and validation of the Lean Big Data platform.
This deliverable comprises the analysis of requirements for the following case studies provided in the context of Lean Big Data: the Data Centre Monitoring case study, the Electronic Alignment of Direct Debit Transactions case study, the Social Network-based Area Surveillance case study and the Targeted Advertisement case study.
An Agile & Adaptive Approach to Addressing Financial Services Regulations and... (Neo4j)
Watch this webinar and learn how Neo4j and ICC Technology can help you remove risk from your data governance by improving the way you approach data lineage. We’ll cover some of the common approaches, driving regulations and biggest risks for banks and financial services.
- Find out how data lineage is becoming more complex for banks and financial services companies
- Learn how a native-graph model can improve tracing data sources to targets as well as storing transformations
- Watch a demonstration of how you might approach regulations such as BCBS 239
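The native-graph idea in the abstract above can be sketched in a few lines: datasets become nodes, and each edge records the transformation that produced its target, so tracing a report back to its original sources is a graph walk. This is a minimal in-memory illustration only, not Neo4j; the dataset and transformation names are invented.

```python
# Minimal in-memory sketch of lineage-as-a-graph: datasets are nodes,
# directed edges carry the transformation that produced the target.
# Dataset and transformation names below are hypothetical.
from collections import defaultdict

class LineageGraph:
    def __init__(self):
        # target -> list of (source, transformation) pairs
        self.parents = defaultdict(list)

    def add_edge(self, source, target, transformation):
        self.parents[target].append((source, transformation))

    def trace(self, dataset):
        """Walk upstream from a dataset back to its original sources."""
        lineage = []
        stack = [dataset]
        while stack:
            node = stack.pop()
            for source, transformation in self.parents[node]:
                lineage.append((source, transformation, node))
                stack.append(source)
        return lineage

g = LineageGraph()
g.add_edge("core_banking.trades", "staging.trades", "nightly CDC load")
g.add_edge("staging.trades", "risk.exposure_report", "aggregate by counterparty")

for source, transformation, target in g.trace("risk.exposure_report"):
    print(f"{source} --[{transformation}]--> {target}")
```

A graph database makes the same upstream walk a single query; the point of the sketch is only that lineage plus transformations fit naturally on edges.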
Modern Integrated Data Environment - Whitepaper | Qubole (Vasu S)
This white paper is about building a modern data platform for data-driven organisations using a cloud data warehouse with a modern data platform architecture.
https://www.qubole.com/resources/white-papers/modern-integrated-data-environment
Joe Caserta, President at Caserta Concepts, presented "Setting Up the Data Lake" at a DAMA Philadelphia Chapter Meeting.
For more information on the services offered by Caserta Concepts, visit our website at http://casertaconcepts.com/.
You can view the full presentation of this webinar here: http://info.datameer.com/Slideshare-Fighting-Fraud-this-Holiday-Season.html
In 2012, retailers lost $3.5 billion in revenue to online fraud. These losses spike by an estimated 20% during the holiday season.
Join Datameer and Hortonworks in this webinar to learn how Big Data Analytics can be used to identify new fraud schemes during peak fraud season.
In this webinar, you will learn about:
- Current challenges in identifying fraud
- What to look for in a big data solution addressing fraud
- How big data analytics can identify credit card fraud
- Best practices
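One simple flavor of the fraud analytics mentioned above is outlier detection on a cardholder's own spending history. The sketch below is a toy version of that idea; the amounts and the 2.5-sigma threshold are illustrative assumptions, not the webinar's actual method.

```python
# Toy anomaly check of the kind a big data fraud pipeline might run per card:
# flag transactions whose amount deviates far from the card's own history.
import statistics

def flag_suspicious(amounts, threshold=2.5):
    """Flag amounts more than `threshold` sample standard deviations from the mean."""
    mean = statistics.mean(amounts)
    stdev = statistics.stdev(amounts)
    return [a for a in amounts if stdev and abs(a - mean) / stdev > threshold]

# One card's recent purchase amounts, with one obvious outlier (made-up data)
history = [23.5, 41.0, 18.2, 35.9, 27.4, 22.1, 30.0, 25.8, 19.9, 28.3, 950.0]
print(flag_suspicious(history))
```

Real systems layer many such signals (merchant category, velocity, geography) and score them together; a single z-score is only the simplest building block.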
Empowering your Enterprise with a Self-Service Data Marketplace (ASEAN) – Denodo
Watch full webinar here: https://bit.ly/3uqcAN0
Self-service is a major goal of modern data strategists. A successfully implemented self-service initiative means that business users have access to holistic and consistent views of data regardless of its location, source or type. As data unification and data collaboration become critical success factors for organizations, data catalogs play a key role as the perfect companion to a virtual layer, fully empowering those self-service initiatives and building a self-service data marketplace that requires minimal IT intervention.
Denodo’s Data Catalog is a key piece of Denodo’s portfolio, bridging the gap between the technical data infrastructure and business users. It provides documentation, search, governance and collaboration capabilities, and data exploration wizards, giving business users the tools to generate their own insights with proper security, governance, and guardrails.
In this session we will cover:
- The role of a virtual semantic layer in self-service initiatives
- Key ingredients of a successful self-service data marketplace
- Self-service (consumption) vs. inventory catalogs
- Best practices and advanced tips for successful deployment
- A product demonstration
- Examples of customers using Denodo’s Data Catalog to enable self-service initiatives
Introduction
Big Data may well be the Next Big Thing in the IT world.
Big data burst upon the scene in the first decade of the 21st century.
The first organizations to embrace it were online and startup firms. Firms like Google, eBay, LinkedIn, and Facebook were built around big data from the beginning.
Like many new information technologies, big data can bring about dramatic cost reductions, substantial improvements in the time required to perform a computing task, or new product and service offerings.
Jump start into 2013 by exploring how Big Data can transform your business. Listen to Infochimps Director of Product, Tim Gasper, cover the leading use cases for 2013, sharing where the data comes from, how the systems are architected and most importantly, how they drive business insights for data-driven decisions.
Introduction to Modern Data Virtualization (US) – Denodo
Watch full webinar here: https://bit.ly/3uyvxN5
“Through 2022, 60% of all organizations will implement data virtualization as one key delivery style in their data integration architecture,” according to Gartner. What is data virtualization, and why is its adoption growing so quickly? Modern data virtualization accelerates time to insights and data services without copying or moving data.
Watch this webinar to learn:
- Why organizations across the world are adopting data virtualization
- What is modern data virtualization
- How data virtualization works and how it compares to alternative approaches to data integration and management
- How modern data virtualization can significantly increase agility while reducing costs
- How to easily get started with Denodo Standard 8.0
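The core mechanic behind data virtualization can be caricatured in a few lines: a virtual view resolves a logical record by querying each underlying source at request time, so no rows are replicated into a central store. The two "sources" and their fields below are invented for illustration; this is not Denodo's implementation.

```python
# Toy sketch of the data-virtualization idea: a virtual view federates two
# independent "sources" at query time instead of copying rows into a warehouse.
# Source contents and field names here are made up for illustration.

crm_source = [  # pretend this table lives in a CRM database
    {"customer_id": 1, "name": "Acme Corp"},
    {"customer_id": 2, "name": "Globex"},
]

billing_source = {  # pretend this is a billing REST API keyed by customer id
    1: {"balance": 1250.0},
    2: {"balance": 90.5},
}

def virtual_customer_view(customer_id):
    """Resolve one logical customer record by querying both sources on demand."""
    crm_row = next(r for r in crm_source if r["customer_id"] == customer_id)
    billing_row = billing_source[customer_id]
    return {**crm_row, **billing_row}

print(virtual_customer_view(2))  # combined record; no data was replicated
```

A real virtualization layer adds query pushdown, caching and security on top of this federation step, which is what distinguishes it from naive application-side joins.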
Vitaly Nikitin's presentation on the capabilities of the HPE IDOL platform for working with big data in a modern call center. Audio and text analytics based on the HPE IDOL platform.
Data protection and privacy regulations such as the EU’s General Data Protection Regulation (GDPR), the California Consumer Privacy Act (CCPA), and Singapore’s Personal Data Protection Act (PDPA) have been major drivers for data governance initiatives and the emergence of data catalog solutions. Organizations have an ever-increasing appetite to leverage their data for business advantage, either through internal collaboration, data sharing across ecosystems, direct commercialization, or as the basis for AI-driven business decision-making. This requires data governance and especially data asset catalog solutions to step up once again and enable data-driven businesses to leverage their data responsibly, ethically, compliantly, and accountably.
This presentation explores how the data catalog has become a key technology enabler in overcoming these challenges.
"How to create an efficient API.. with a business model?" by Nicolas Grenié (TheFamily)
A common mistake when discussing API business models is to think of the API first. But everyone should consider how an API will interact with a business model.
We will answer the following questions:
What is an API? Which business model can you include in an API? How to use the canvas model to build a badass API?
Nicolas Grenié is a Hacker in Residence at 3scale, living between Barcelona and San Francisco. He built his first website in 2000 using Microsoft Word and has not stopped learning about programming since. Nicolas likes to try new languages all the time, so he has experience in PHP, Ruby and Node. When he is not working, you have a good chance of finding him hacking on side projects or enjoying a good craft beer. And of course, as he is French, frogs and snails are part of his daily diet!
Be My API: How to Implement an API Strategy Everyone Will Love (CA API Management)
Mike Amundsen,
Principal API Architect, Layer 7 Technologies
Mike is the author of Building Hypermedia APIs with HTML5 & Node and is a regular speaker at leading industry events on the subject of API design, Web application development and cloud computing.
Learn how to create and publish APIs that will help your business thrive and grow
February 7, 2013
9am PST | 12pm EST
Building great APIs is about more than just design; it requires detailed, thoughtful execution. Your API strategy needs to meet the business requirements of your organization but it must also be flexible enough to meet your developer community’s diverse needs. This webinar with Mike Amundsen, Layer 7's Principal API Architect, will examine the key foundational elements necessary for a solid API implementation strategy.
You Will Learn
Align API design with business goals
Architect flexible and robust APIs that are developer-accessible
Design for multiple client platforms (Web, mobile and cloud)
Implement USE methodology, versioning, reusability and hypermedia
Address issues around security, identity, social integration, reliability and scalability
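Of the implementation concerns listed above, versioning is the easiest to sketch: route the same logical endpoint to version-specific handlers so a v2 can evolve without breaking v1 clients. This is a toy dispatcher with invented endpoint and payload names, not Layer 7's methodology.

```python
# Toy illustration of API versioning: route the same logical resource to
# different handler implementations based on a version prefix in the path.
# Endpoint names and payload shapes are invented for this sketch.

def get_user_v1(user_id):
    return {"id": user_id, "name": "Ada Lovelace"}

def get_user_v2(user_id):
    # v2 splits the name field without breaking existing v1 clients
    return {"id": user_id, "first_name": "Ada", "last_name": "Lovelace"}

ROUTES = {
    ("v1", "users"): get_user_v1,
    ("v2", "users"): get_user_v2,
}

def dispatch(path, user_id):
    """Resolve /<version>/<resource> to the matching handler."""
    version, resource = path.strip("/").split("/")
    return ROUTES[(version, resource)](user_id)

print(dispatch("/v1/users", 7))
print(dispatch("/v2/users", 7))
```

The same separation can be achieved with header-based or media-type versioning; the routing table is just the most visible form of the idea.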
Developing an API strategy should be considered a journey, not a project with a predetermined outcome. This presentation describes Netflix's journey to discover a winning API strategy as well as future directions for the API.
Sachin Agarwal, SOA Software VP of Product Marketing, explains the frenzy around the mass development and adoption of APIs. In this presentation, he describes the business and technology implications of developing an API strategy.
Welcome to the API Economy: Developing Your API Strategy (MuleSoft)
View the recording of this webinar: http://www.mulesoft.com/webinars/esb/welcome-api-economy
Learn more about our Anypoint Platform for APIs: https://www.mulesoft.com/platform/api
Gartner predicts 75% of Fortune 500 enterprises will open an API by 2014. In this new API economy, those without an API strategy will be left behind. What does this mean for you and your business? Join Ross Mason, MuleSoft Founder, for a discussion on key API trends and what you can do in this New Enterprise era to unlock competitive advantage for your organization.
Questions discussed:
What has changed with APIs?
What is the API economy and how did we get here?
How are APIs transforming enterprises?
What are key API trends my organization should be planning for?
How can APIs make my business more competitive?
Architecting an Enterprise API Management Strategy (WSO2)
A good internal and external API management strategy and architecture is key to building ecosystem platforms that lead to successful API economies in the enterprise. This workshop will look at best practices in API management using the WSO2 API Manager and Integration Platform products, which are used to rapidly implement RESTful design, enforce governance policies, safely scale solutions, orchestrate complex interaction sequences, and re-use assets. The session will also look at reference architectures and architectural recommendations of building large scale API ecosystems.
Mifan Careem, Director of Solutions Architecture at WSO2, presented this session at APIdays Sydney 2015.
This paper was presented at the 'Towards a Magna Carta for Data' workshop at the RDS in Dublin on September 17th. It discusses how the ethics of big data involves much more than the issues of privacy and security it often gets boiled down to, and argues that the ethical issues related to big data are multidimensional and contested, vary in nature across domains, and that which ethical philosophy is adopted matters to the deliberation over data rights.
Enabling Data Governance - Data Trust, Data Ethics, Data Quality (Eryk Budi Pratama)
Presented on PHPID Online Learning 35.
Komunitas PHP Indonesia
Title: Enabling Data Governance - The Journey through Data Trust, Ethics, and Quality
Eryk B. Pratama
Global IT & Cybersecurity Advisor
Presentation of the webinar from our sister company Mind Your Privacy together with Cardinal Path.
In today's digital landscape, more than ever, analysts, marketers and other data professionals must be aware of changes in national and international regulations, as well as a set of basic principles for respecting the privacy and protection of the people whose data they collect.
Digital Marketing meets Privacy
With the possibility of a security incident or breach, immediate decision-making is required. It is imperative that organizations immediately kick off their IR plan and bring all functions together.
The Incident Response Decision Tree can help you build your IR Plan or ensure that you have all decision makers ready.
Time is of the essence in an incident or breach. OpenText Risk & Compliance Advisory and DFIR teams are available to help organizations in their response. For more information on OpenText Security Consulting, visit: https://www.opentext.com/services/security
Data Privacy Program – a customized solution for the new EU General Regulatio... (IAB Bulgaria)
Data Privacy Program – a customized solution for the new EU General Regulation on Data Protection, Maria Maxim, Senior Manager – Fraud Investigation & Dispute Service, Ernst&Young
Keep Calm and Comply: 3 Keys to GDPR Success (Sirius)
Recent surveys benchmarking the status of U.S. companies' efforts to meet the May 25 deadline for the EU General Data Protection Regulation (GDPR) have revealed a startling lack of preparedness.
Companies not yet in compliance are likely to violate the directive if they don’t take immediate action, and fines can amount to 2-4 percent of a company’s annual gross revenue. Do you have the resources and information you need to comply?
View to learn:
- What GDPR means to your business
- Short-, medium-, and long-term actions you can take to protect regulated data and achieve compliance
- How you can streamline incident response and third-party risk management capabilities
- How to streamline the resources and technology needed to keep up with the evolving regulatory landscape
Don't fall behind on these compliance regulations. Take the steps needed to protect the data you collect.
Presentation on key legal issues regarding the use and development of bots and AI - GDPR, data protection. Case study: BRISbot. Presentation delivered at Epicenter on 30 May 2017 in partnership with BRIS and Microsoft.
Transcript: Selling digital books in 2024: Insights from industry leaders - T... (BookNet Canada)
The publishing industry has been selling digital audiobooks and ebooks for over a decade and has found its groove. What’s changed? What has stayed the same? Where do we go from here? Join a group of leading sales peers from across the industry for a conversation about the lessons learned since the popularization of digital books, best practices, digital book supply chain management, and more.
Link to video recording: https://bnctechforum.ca/sessions/selling-digital-books-in-2024-insights-from-industry-leaders/
Presented by BookNet Canada on May 28, 2024, with support from the Department of Canadian Heritage.
The Art of the Pitch: WordPress Relationships and Sales (Laura Byrne)
Clients don’t know what they don’t know. What web solutions are right for them? How does WordPress come into the picture? How do you make sure you understand scope and timeline? What do you do if something changes?
All these questions and more will be explored as we talk about matching clients’ needs with what your agency offers without pulling teeth or pulling your hair out. Practical tips, and strategies for successful relationship building that leads to closing the deal.
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered Quality (Inflectra)
In this insightful webinar, Inflectra explores how artificial intelligence (AI) is transforming software development and testing. Discover how AI-powered tools are revolutionizing every stage of the software development lifecycle (SDLC), from design and prototyping to testing, deployment, and monitoring.
Learn about:
• The Future of Testing: How AI is shifting testing towards verification, analysis, and higher-level skills, while reducing repetitive tasks.
• Test Automation: How AI-powered test case generation, optimization, and self-healing tests are making testing more efficient and effective.
• Visual Testing: Explore the emerging capabilities of AI in visual testing and how it's set to revolutionize UI verification.
• Inflectra's AI Solutions: See demonstrations of Inflectra's cutting-edge AI tools like the ChatGPT plugin and Azure Open AI platform, designed to streamline your testing process.
Whether you're a developer, tester, or QA professional, this webinar will give you valuable insights into how AI is shaping the future of software delivery.
Accelerate your Kubernetes clusters with Varnish Caching (Thijs Feryn)
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
Kubernetes & AI - Beauty and the Beast!?! @KCD Istanbul 2024 (Tobias Schneck)
As AI technology pushes into IT, I found myself wondering, as an "infrastructure container Kubernetes guy", how this fancy AI technology gets managed from an infrastructure operations point of view. Is it possible to apply our lovely cloud-native principles as well? What benefits could the two technologies bring to each other?
Let me take these questions and offer you a short journey through existing deployment models and use cases for AI software. Using practical examples, we discuss what cloud/on-premises strategy we may need in order to apply it to our own infrastructure and get it working from an enterprise perspective. I want to give an overview of the infrastructure requirements and technologies that could benefit or limit your AI use cases in an enterprise environment. An interactive demo will give you some insights into the approaches I already have working for real.
Securing your Kubernetes cluster: a step-by-step guide to success! (KatiaHIMEUR1)
Today, after several years of existence, an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been so easy to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
Slack (or Teams) Automation for Bonterra Impact Management (fka Social Soluti... – Jeffrey Haguewood
Sidekick Solutions uses Bonterra Impact Management (fka Social Solutions Apricot) and automation solutions to integrate data for business workflows.
We believe integration and automation are essential to user experience and the promise of efficient work through technology. Automation is the critical ingredient to realizing that full vision. We develop integration products and services for Bonterra Case Management software to support the deployment of automations for a variety of use cases.
This video focuses on the notifications, alerts, and approval requests using Slack for Bonterra Impact Management. The solutions covered in this webinar can also be deployed for Microsoft Teams.
Interested in deploying notification automations for Bonterra Impact Management? Contact us at sales@sidekicksolutionsllc.com to discuss next steps.
Elevating Tactical DDD Patterns Through Object CalisthenicsDorra BARTAGUIZ
After immersing yourself in the blue book and its red counterpart, attending DDD-focused conferences, and applying tactical patterns, you're left with a crucial question: How do I ensure my design is effective? Tactical patterns within Domain-Driven Design (DDD) serve as guiding principles for creating clear and manageable domain models. However, achieving success with these patterns requires additional guidance. Interestingly, we've observed that a set of constraints initially designed for training purposes remarkably aligns with effective pattern implementation, offering a more ‘mechanical’ approach. Let's explore together how Object Calisthenics can elevate the design of your tactical DDD patterns, offering concrete help for those venturing into DDD for the first time!
Generating a custom Ruby SDK for your web service or Rails API using Smithyg2nightmarescribd
Have you ever wanted a Ruby client API to communicate with your web service? Smithy is a protocol-agnostic language for defining services and SDKs. Smithy Ruby is an implementation of Smithy that generates a Ruby SDK using a Smithy model. In this talk, we will explore Smithy and Smithy Ruby to learn how to generate custom feature-rich SDKs that can communicate with any web service, such as a Rails JSON API.
UiPath Test Automation using UiPath Test Suite series, part 3DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series part 3. In this session, we will cover desktop automation along with UI automation.
Topics covered:
UI automation Introduction,
UI automation Sample
Desktop automation flow
Pradeep Chinnala, Senior Consultant Automation Developer @WonderBotz and UiPath MVP
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
JMeter webinar - integration with InfluxDB and GrafanaRTTS
Watch this recorded webinar about real-time monitoring of application performance. See how to integrate Apache JMeter, the open-source leader in performance testing, with InfluxDB, the open-source time-series database, and Grafana, the open-source analytics and visualization application.
In this webinar, we will review the benefits of leveraging InfluxDB and Grafana when executing load tests and demonstrate how these tools are used to visualize performance metrics.
Length: 30 minutes
Session Overview
-------------------------------------------
During this webinar, we will cover the following topics while demonstrating the integrations of JMeter, InfluxDB and Grafana:
- What out-of-the-box solutions are available for real-time monitoring JMeter tests?
- What are the benefits of integrating InfluxDB and Grafana into the load testing stack?
- Which features are provided by Grafana?
- Demonstration of InfluxDB and Grafana using a practice web application
To view the webinar recording, go to:
https://www.rttsweb.com/jmeter-integration-webinar
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -...DanBrown980551
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses;
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
Encryption in Microsoft 365 - ExpertsLive Netherlands 2024Albert Hoitingh
In this session I delve into the encryption technology used in Microsoft 365 and Microsoft Purview. Including the concepts of Customer Key and Double Key Encryption.
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo...James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo...
Making ‘Big Data’ Your Ally – Using data analytics to improve compliance, due diligence and performance
1. Making Big Data Your Ally
Using data analytics to improve compliance, due diligence and investigations
Thursday, February 6, 2014 | 3:00 - 4:00 PM
Speakers: Raul Saccani, Dave Stewart, John Walsh
8. The Analytics Lifecycle
Stages: Identify/Formulate Problem → Data Preparation → Data Exploration → Transform & Select → Build Model → Validate Model → Deploy Model → Evaluate/Monitor Results
Roles across the lifecycle:
• Business Manager (domain expert): makes decisions, evaluates processes and ROI
• IT Systems/Management: data preparation, model validation, model deployment
• Data Scientist: data exploration, data visualization, report creation
• Analyst/Data Miner: exploratory analysis, descriptive segmentation, predictive modeling
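As an illustration only (none of this code comes from the deck), the lifecycle stages can be sketched end to end with a deliberately trivial "model": build a flagging threshold from training data, validate it on held-out data, and deploy it only if validation passes. All data, names and thresholds here are illustrative assumptions.

```python
# Minimal sketch of the analytics lifecycle as code. The "model" is just a
# high-amount threshold; real deployments would use proper modeling tools.
def build_model(training_amounts):
    # BUILD MODEL: flag anything above (roughly) the 95th-percentile amount.
    ordered = sorted(training_amounts)
    return ordered[int(0.95 * (len(ordered) - 1))]

def validate(threshold, holdout_amounts, max_alert_rate=0.10):
    # VALIDATE MODEL: accept only if the alert rate on held-out data is sane.
    alerts = sum(amount > threshold for amount in holdout_amounts)
    return alerts / len(holdout_amounts) <= max_alert_rate

training = [100, 120, 90, 110, 105, 95, 115, 130, 100, 5000]
holdout = [98, 102, 107, 111, 99, 104, 4800, 96, 101, 103]

threshold = build_model(training)

# DEPLOY MODEL only after validation passes; EVALUATE/MONITOR would loop back
# to data preparation when the live alert rate drifts.
if validate(threshold, holdout):
    def flag(amount):
        return amount > threshold
    print(flag(4800), flag(100))  # True False
```

The point of the sketch is the loop structure, not the model: monitoring feeds back into data preparation, which is why the slide draws the stages as a cycle rather than a pipeline.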
9. Case Studies
• Tier I Asian Bank
– Visual analytics of Group Security Operations
– Cross-border sharing of summary data
• Tier I Global Bank
– AML model tuning & optimization
– Large volume peer group analysis
• Tier I Global Bank
– “Safety Net” approach for controlling affiliate risk
– Ad hoc builds of illicit networks
10. Observations
• New capabilities require new thinking about business as usual
• Variety of data & techniques requires new skills within lines of business
• Adopt a proactive/pre-emptive analytics strategy
• Understand your company's technology roadmap and get on board
12. Raúl Saccani's presentation contents
• Data privacy standards in Latin America, compared to US and EU standards
• How data privacy rules and limitations on cross-border data sharing can impact compliance functions and internal investigations
• Role of e-discovery in financial crime investigations, including internal investigations
• Sources of data in internal investigations, including structured and unstructured data
13. Privacy and Data Protection
1) The context
2) Data protection and electronic evidence
3) EU law on privacy and data protection
4) Practical considerations
14. (1) Context
Most personal information and most evidence are digital. Lawyers and judges need to know the significance of digital information.
They need to know and understand:
• the nature of digital evidence
• the data protection rules of the road
Otherwise there is no:
• remedy for the data subject
• fair trial for the accused
• conviction for the prosecutor
15. The growth of global privacy laws
[Chart: number of countries with privacy laws, plotted over time]
16. (2) Data Protection and Electronic Evidence
• Overlapping scope
• Data protection rules apply to the courts
• Fruits of the poisoned tree
• Precautions to ensure admissibility of e-evidence
17. (3) EU Law on Privacy: two fundamental rights
(a) The Right to Privacy
ECHR (1950), Article 8: Everyone has the right to respect for his or her private and family life, home and correspondence.
EU Charter (2000), Article 7: …and communications.
18. (b) The Right to Protection of Personal Data
An autonomous fundamental right to self-determination in the Information Society (Article 16, EU Treaty).
EU Charter, Article 8:
1. Everyone has the right to the protection of personal data concerning him or her.
19. 2. Such data must be processed fairly for specified purposes and on the basis of the consent of the person concerned or some other legitimate basis laid down by law. Everyone has the right of access to data which has been collected concerning him or her, and the right to have it rectified.
3. Compliance with these rules shall be subject to control by an independent authority.
20. Data Privacy
• What is a Data Controller?
– Person or entity who determines the purpose and manner of processing
– EU Directive imposes an obligation to protect personal data
– Potential liability for failure to fulfill obligations
– Responsible for directing and controlling actions of the Data Processor
• What is a Data Processor?
– Processes data on behalf of and at the direction of the Data Controller
– Must follow instructions of the Data Controller
21. Practical Considerations
• Now you are in a position to make the necessary cost-benefit analysis. Ask yourself the following questions:
– What is the true value of this source of information relative to (a) other more easily accessible sources of information and (b) the litigation as a whole?
– What are the projected costs of complying with the EU Data Protection Directive?
– What are the projected costs of defending a discovery dispute?
– What are the relative strengths and weaknesses of each side on discovery issues?
22. Outsourcing Personal Data Processing
Contractual means:
• All practicable security measures
• Timely return, destruction or deletion of data
• Prohibition against any use or disclosure for other purposes
• Prohibition against sub-contracting
• Right to audit and inspect
28. The Change is Driving Big Data
[Chart: data growth from megabytes through gigabytes and terabytes to petabytes, plotted against rising data complexity, variety and velocity]
29. Big Data Is…
Big Data represents the trends, technologies and potential for organizations to obtain valuable insight from large amounts of structured, unstructured and fast-moving data.
80% of data is unstructured: click stream, videos, images, text, sensors.
30. Where Does Big Data Come From?
• Our data-driven world
– Science: databases from astronomy, genomics, environmental data, transportation data, …
– Humanities and social sciences: scanned books, historical documents, social interactions data, new technology like GPS, …
– Business & commerce: corporate sales, stock market transactions, census, airline traffic, …
– Entertainment: internet images, Hollywood movies, MP3 files, …
– Medicine: MRI & CT scans, patient records, …
31. Structured vs unstructured data
• Structured data: information in "tables"

Employee | Manager | Salary
---------+---------+-------
Smith    | Jones   | 50000
Chang    | Smith   | 60000
Ivy      | Smith   | 50000

Typically allows numerical range and exact match (for text) queries, e.g.:
Salary < 60000 AND Manager = Smith
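The slide's range-plus-exact-match query can be run verbatim against the example table. A minimal sketch using Python's built-in sqlite3 module (table and column names chosen to mirror the slide):

```python
import sqlite3

# Build the slide's example table in an in-memory SQLite database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE employees (employee TEXT, manager TEXT, salary INTEGER)")
conn.executemany(
    "INSERT INTO employees VALUES (?, ?, ?)",
    [("Smith", "Jones", 50000), ("Chang", "Smith", 60000), ("Ivy", "Smith", 50000)],
)

# The slide's query: a numerical range condition plus an exact text match.
rows = conn.execute(
    "SELECT employee FROM employees WHERE salary < 60000 AND manager = 'Smith'"
).fetchall()
print(rows)  # [('Ivy',)]
```

Only Ivy matches: Chang reports to Smith but earns 60000, which fails the strict range condition — exactly the kind of precise filtering that structured data makes cheap.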
32. Unstructured data
• Typically refers to free text
• Allows:
– Keyword-based queries including operators
– More sophisticated "concept" queries, e.g., find all web pages dealing with drug abuse
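A keyword-based query with AND/OR operators can be sketched in a few lines; the documents below are invented for illustration and tokenization is deliberately naive (lowercased whitespace splitting):

```python
# Minimal sketch: keyword-based retrieval with AND/OR operators over free text.
docs = {
    1: "report on drug abuse prevention programs",
    2: "quarterly sales report for the region",
    3: "study of substance abuse and treatment",
}

def tokens(text):
    """Naive tokenizer: lowercase and split on whitespace."""
    return set(text.lower().split())

def search(query_terms, mode="AND"):
    """Return sorted doc ids whose token sets satisfy the keyword query."""
    query = set(query_terms)
    if mode == "AND":
        match = lambda doc_tokens: query.issubset(doc_tokens)
    else:  # OR
        match = lambda doc_tokens: bool(query & doc_tokens)
    return sorted(doc_id for doc_id, text in docs.items() if match(tokens(text)))

print(search(["abuse", "drug"]))         # [1]
print(search(["abuse", "drug"], "OR"))   # [1, 3]
```

The "concept" query from the slide (drug abuse as a topic, not a literal phrase) is what this sketch cannot do: document 3 is about substance abuse but never says "drug", which is why concept search needs more than keyword matching.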
33. Forensic Data Analytics - Definition
Core objectives: identifying, preserving, recovering, processing, and analyzing structured, standardized and/or codified digital information for the purpose of generating evidence that may be used as such in an investigation, and that may ultimately support legal actions in litigation.
Sources of information: the company's accounting system (ERP), proprietary or third-party-developed vertical applications, intersystem interfaces, financial reporting worksheets.
34. How does data analytics work?
• Data acquisition, accounting integrity control and data mapping
• Evaluation of fraud and misconduct risk indicators
• Routines and tests
• Identification of unusual or irregular trends and patterns
• Analysis of pre-identified transactions
35. Usual procedures - Overview
How data analytics works:
– Reviews with a focus on the red flags detected
– Master vendor and customer file analysis: cross-analysis between company databases and public databases and records. Some examples are:
• Clients related to public biddings
• Vendors/Clients with invalid or incomplete key data
• Vendors/Clients with potential tax irregularities
• Vendors/Clients with unusual activities
• Vendors/Clients with unusual characteristics
• Vendors/Clients with unusual transactional activity
• Duplicate Vendors/Clients
• Vendors/Clients related to employees or other Vendors/Clients
• Employees related to other employees
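Two of the listed routines — duplicate vendors and invalid/incomplete key data — can be sketched as simple master-file checks. Field names and records below are illustrative assumptions, not the actual procedures:

```python
from collections import defaultdict

# Illustrative vendor master records; the second vendor shares a taxpayer ID
# with the first under a different vendor code, the third has no taxpayer ID.
vendors = [
    {"code": "100911", "name": "TRANSPORTES SA", "taxpayer_id": "30-70867893-0"},
    {"code": "100950", "name": "TRANSPORTES S.A.", "taxpayer_id": "30-70867893-0"},
    {"code": "100999", "name": "SERVICIOS SRL", "taxpayer_id": ""},
]

# Routine 1: duplicate vendors — same taxpayer ID under multiple vendor codes.
by_taxpayer = defaultdict(list)
for v in vendors:
    if v["taxpayer_id"]:
        by_taxpayer[v["taxpayer_id"]].append(v["code"])
duplicates = {tid: codes for tid, codes in by_taxpayer.items() if len(codes) > 1}

# Routine 2: vendors with missing key data.
missing_key_data = [v["code"] for v in vendors if not v["taxpayer_id"]]

print(duplicates)        # {'30-70867893-0': ['100911', '100950']}
print(missing_key_data)  # ['100999']
```

Real implementations would add validation of the taxpayer ID check digit and fuzzy name matching, but the grouping-by-key pattern is the core of both routines.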
36. Employee - Vendor Matching: identical domicile as per external databases
Master files:
– Vendor 100911 TRANSPORTES — domicile PARANÁ 1, CAP. FED., taxpayer ID 30-70867893-0
– Employee 502435 JUAN PEREZ — domicile AV PUEYRREDÓN 1111, CAP. FED., employee ID 23-20667877-4
→ Apparently unrelated
External databases:
– Company 30-70867893-0 — name MARÍA PEREZ, domicile PARANÁ 1, alternative domicile AV. CÓRDOBA 999 PISO 3 (the difference might arise from the fantasy name vs. the registered company name)
– Employee 23-20667877-4 — name JUAN PEREZ, domicile CÓRDOBA 999 PISO 3, CAP. FED.
→ Unusual relationship: identical domicile
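The cross-check behind this slide — matching vendor and employee domiciles once addresses are normalized — can be sketched as below. The normalization rules, record layout and codes are illustrative assumptions, not the actual procedure:

```python
import re
import unicodedata

def normalize(address):
    """Strip accents, uppercase, drop 'AV'/'AV.' prefixes and punctuation,
    and collapse whitespace, so formatting differences don't block a match."""
    address = unicodedata.normalize("NFKD", address)
    address = "".join(c for c in address if not unicodedata.combining(c))
    address = address.upper()
    address = re.sub(r"\bAV\.?\s*", " ", address)   # naive: would also hit 'AVENIDA'
    address = re.sub(r"[^A-Z0-9 ]", " ", address)
    return " ".join(address.split())

# Illustrative records keyed by vendor / employee code.
vendor_alt_domiciles = {"100911": "AV. CÓRDOBA 999 PISO 3"}
employee_domiciles = {"502435": "CORDOBA 999 PISO 3"}

matches = [
    (v_code, e_code)
    for v_code, v_addr in vendor_alt_domiciles.items()
    for e_code, e_addr in employee_domiciles.items()
    if normalize(v_addr) == normalize(e_addr)
]
print(matches)  # [('100911', '502435')]
```

As on the slide, the records look unrelated until the external-database domiciles are normalized; production matching would typically add fuzzy comparison rather than exact equality.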
37. Examples of results per vendor
Example vendors: Vendor C (provider of advertising services) and a vendor charging advisory services fees. Red flags identified across them include:
• Individual who, based on external public sources, would be working under a contract of employment
• Related to a potentially irregular entity
• Sequentially numbered invoices
• Entity showing no tax activity
• Company name does not match the information filed with AFIP
• Significant number of legal actions
• Data quality issues (incomplete information)
38. Vendors with higher scoring
[Scoring matrix: rows are the individual routines (Test 001, Test 002, …, Test 100) plus totals; columns group each vendor's findings into risk categories: High risk fraud alert, Unusual activity, Master data changes, Unusual behaviour, Inconsistent names, Potential tax irregularities, Connected entities, Suspicious taxpayer ID, Suspicious address, Suspicious telephone, Unusual information, Other potential irregularities, Data quality - invalid key data, Data quality - missing key data, Duplicates, and Total scoring.]
Vendors with the highest total scoring: 100123 Vendor 1, 100981 Vendor 2, 100789 Vendor 3, 101000 Vendor 4, 102078 Vendor 5 (total scores 122, 121, 120, 109 and 107).
Each routine is classified into these groups considering the estimated risk inherent to each test.
Note: for instance, only three routines are identified in the chart; the complete analysis includes over 200 routines.
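A minimal sketch of how such a total score could be computed: each routine group carries a weight reflecting its inherent risk, and a vendor's score sums the weights of the routines it triggers. The weights, group names and hit counts below are illustrative assumptions, not the actual scoring model:

```python
# Illustrative risk weights per routine group; high-risk fraud alerts dominate.
weights = {
    "high_risk_fraud_alert": 100,
    "potential_tax_irregularities": 10,
    "duplicates": 2,
    "data_quality_missing_key_data": 1,
}

def total_score(hits):
    """hits maps a routine group to the number of routines the vendor triggered."""
    return sum(weights[group] * count for group, count in hits.items())

# One hypothetical vendor: one fraud alert, two tax flags, two missing fields.
vendor_hits = {
    "high_risk_fraud_alert": 1,
    "potential_tax_irregularities": 2,
    "data_quality_missing_key_data": 2,
}
print(total_score(vendor_hits))  # 122
```

The weighting is what makes the ranking useful: a single high-risk hit outranks dozens of data-quality findings, so review effort is directed at the vendors most likely to matter.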
Customers are looking critically at the "big data" term and asking what all the fuss is about. Big data – however you define it – isn't going away and it isn't getting smaller. Big data is a relative term describing a situation where the volume, velocity and variety of data exceed an organization's storage or compute capacity for accurate and timely decision making.
Volume – growing volumes of data, and how much data must be processed within a time window. Variety – includes structured tables, documents, e-mail, metering data, video, image, audio, stock ticker data, and more. Velocity – how fast data is produced and processed to meet demand, and the ability to respond once a problem or opportunity is detected.
With the wealth of data coming at them, organizations struggle to manage the information overload. But we do not want the amount, type and speed at which you are collecting data to limit the analytics you can do. Organizations also have to identify the relevant set of data to answer complex questions before the answers become obsolete, so the concept of "relevance" is really important and will change over time as new data arrives.
Big data in and of itself is not that interesting. First, you need good data management practices to manage big data. Second, you can leverage big data for valuable insights by using high-performance analytics. It can help improve decision making at all levels, whether to gain better customer insights, manage risks or improve operational metrics.
The Analytics Lifecycle is a way to automate and productionize the creation, development, testing and deployment of models in an organization.