This was presented at the SAS Visual Analytics Event on May 15, 2013 in Chennai. The presentation discussed how SAS Visual Analytics can empower your organisation to gain valuable insights from your data in the shortest amount of time.
Modern Data Warehousing with the Microsoft Analytics Platform System - James Serra
The traditional data warehouse has served us well for many years, but new trends are causing it to break in four different ways: data growth, fast query expectations from users, non-relational/unstructured data, and cloud-born data. How can you prevent this from happening? Enter the modern data warehouse, which is built to handle these new trends and excel at them. It handles all types of data (Hadoop), provides a way to easily interface with all these types of data (PolyBase), and can handle “big data” while still providing fast queries. Is there one appliance that can support this modern data warehouse? Yes! It is the Analytics Platform System (APS) from Microsoft (formerly called Parallel Data Warehouse, or PDW), a Massively Parallel Processing (MPP) appliance that has recently been updated (v2 AU1). In this session I will dig into the details of the modern data warehouse and APS. I will give an overview of the APS hardware and software architecture, identify what makes APS different, and demonstrate the increased performance. In addition, I will discuss how Hadoop, HDInsight, and PolyBase fit into this new modern data warehouse.
Watch full webinar here: https://buff.ly/2mHGaLA
What started out as the most agile and real-time form of enterprise data integration, data virtualization is proving to go beyond its initial promise and is becoming one of the most important enterprise big data fabrics.
Attend this session to learn:
• What data virtualization really is
• How it differs from other enterprise data integration technologies
• Why data virtualization is finding enterprise-wide deployment inside some of the largest organizations
Data Lakehouse, Data Mesh, and Data Fabric (r1) - James Serra
So many buzzwords of late: Data Lakehouse, Data Mesh, and Data Fabric. What do all these terms mean and how do they compare to a data warehouse? In this session I’ll cover all of them in detail and compare the pros and cons of each. I’ll include use cases so you can see what approach will work best for your big data needs.
DataOps is a methodology and culture shift that brings the successful combination of development and operations (DevOps) to data processing environments. It breaks down silos between developers, data scientists, and operators, resulting in lean data feature development processes with quick feedback. In this presentation, we will explain the methodology, and focus on practical aspects of DataOps.
Next generation intelligent data lakes, powered by GraphQL & AWS AppSync - MA... - Amazon Web Services
GraphQL is a query language for APIs and a runtime to fulfill these queries, allowing applications to easily connect and access data stored on any type of database technology or API. AWS AppSync provides a powerful and flexible serverless GraphQL API that securely accesses, manipulates, and combines data from multiple sources at any scale, enabling you to build any kind of application on a range of data sources independently of the underlying database technology. In this session, we discuss different use cases where AWS AppSync and GraphQL power next-generation applications. Special guest, Candid Partners, shares how it uses AWS AppSync in its Data Fabric solution to simplify large-scale data management using a GraphQL API to interact with data lakes.
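The abstract above describes GraphQL's core idea: a query names only the fields it wants, and per-field resolvers fetch each one from whatever backing store holds it. The toy sketch below illustrates just that idea in plain Python; it is not AWS AppSync or a real GraphQL runtime, and the data sources and field names are invented for illustration.

```python
# Two stand-in data sources, e.g. a relational DB and a REST API.
orders_db = {"u1": [{"id": "o1", "total": 30}]}
profile_api = {"u1": {"name": "Ada", "email": "ada@example.com"}}

# One resolver per field: each knows which source serves it.
resolvers = {
    "name":   lambda uid: profile_api[uid]["name"],
    "email":  lambda uid: profile_api[uid]["email"],
    "orders": lambda uid: orders_db[uid],
}

def execute(user_id, selection):
    """Resolve only the requested fields, combining both sources."""
    return {field: resolvers[field](user_id) for field in selection}
```

A query for `["name", "orders"]` then returns the name from the profile source and the orders from the orders source in one response, without the client knowing that two different stores were involved.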
Azure Data Factory | Moving On-Premise Data to Azure Cloud | Microsoft Azure ... - Edureka!
** Microsoft Azure Certification Training : https://www.edureka.co/microsoft-azure-training **
This Edureka "Azure Data Factory" tutorial will give you a thorough and insightful overview of Microsoft Azure Data Factory and help you understand related terms such as data lakes and data warehousing.
This tutorial covers the following:
1. Why Azure Data Factory?
2. What Is Azure Data Factory?
3. Data Factory Concepts
4. What is Azure Data Lake?
5. Data Lake Concepts
6. Data Lake Vs Data Warehouse
7. Demo - Moving On-Premise Data To Cloud
Check out our Playlists: https://goo.gl/A1CJjM
SAP Data Services is a data integration and transformation software application. It also supports changed-data capture (CDC), which is an important capability for providing input data to both data-warehousing and stream-processing systems.
It is an ETL tool that provides a single enterprise-level solution for data integration, transformation, data quality, data profiling, and text data processing from heterogeneous sources into a target database or data warehouse.
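The changed-data capture (CDC) capability mentioned above can be sketched in its simplest form as a comparison of two keyed snapshots that emits insert/update/delete change records. This is a minimal illustration of the CDC concept, not the SAP Data Services API; the snapshot shape is assumed.

```python
def capture_changes(before, after):
    """Compare two snapshots (dicts keyed by primary key) and emit
    (operation, key, row) change records for downstream consumers."""
    changes = []
    for key, row in after.items():
        if key not in before:
            changes.append(("insert", key, row))      # new row appeared
        elif before[key] != row:
            changes.append(("update", key, row))      # existing row changed
    for key in before:
        if key not in after:
            changes.append(("delete", key, before[key]))  # row disappeared
    return changes
```

Real CDC implementations usually read the source's transaction log instead of diffing snapshots, but the output (a stream of change records) feeds warehouses and stream processors the same way.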
Agile Data Engineering: Introduction to Data Vault 2.0 (2018) - Kent Graziano
(updated slides used for North Texas DAMA meetup Oct 2018) As we move more and more towards the need for everyone to do Agile Data Warehousing, we need a data modeling method that can be agile with us. Data Vault Data Modeling is an agile data modeling technique for designing highly flexible, scalable, and adaptable data structures for enterprise data warehouse repositories. It is a hybrid approach using the best of 3NF and dimensional modeling. It is not a replacement for star schema data marts (and should not be used as such). This approach has been used in projects around the world (Europe, Australia, USA) for over 15 years and is now growing in popularity. The purpose of this presentation is to provide attendees with an introduction to the components of the Data Vault Data Model, what they are for and how to build them. The examples will give attendees the basics:
• What the basic components of a DV model are
• How to build and design structures incrementally, without constant refactoring
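The basic components referred to above are Hubs (business keys only), Links (relationships between hubs), and Satellites (descriptive attributes tracked over time). The sketch below shows those three shapes as plain Python records; the table layouts and column names are illustrative, not the Data Vault 2.0 standard DDL.

```python
import hashlib
from datetime import date

def hash_key(*business_keys):
    # DV 2.0 commonly derives surrogate keys by hashing the business key(s)
    return hashlib.md5("|".join(business_keys).encode()).hexdigest()

# Hub: only the business key, plus load metadata
hub_customer = {"hub_customer_key": hash_key("CUST-42"),
                "customer_bk": "CUST-42",
                "load_date": date(2018, 10, 1),
                "record_source": "CRM"}

# Satellite: descriptive attributes hanging off the hub, versioned by load_date
sat_customer = {"hub_customer_key": hub_customer["hub_customer_key"],
                "load_date": date(2018, 10, 1),
                "name": "Acme Ltd",
                "city": "Dallas"}

# Link: relates two hubs (customer and order) via their hashed keys
link_order = {"link_order_key": hash_key("CUST-42", "ORD-7"),
              "hub_customer_key": hub_customer["hub_customer_key"],
              "hub_order_key": hash_key("ORD-7"),
              "load_date": date(2018, 10, 1),
              "record_source": "ERP"}
```

Because new satellites and links attach to existing hubs without touching them, structures can grow incrementally, which is the point made in the bullet above.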
Data Driven Advanced Analytics using Denodo Platform on AWS - Denodo
Watch full webinar here: https://buff.ly/3JC8gCS
Accelerating cloud adoption and modernizing analytics in the cloud have become a necessity for timely, insightful, and impactful decision making. However, data spread across an organization's disparate hybrid cloud data sources poses a challenge for real-time, well-governed analytics. Data virtualization is a modern data integration technique in which a single semantic layer can be built to help drive data democratization and speed up analytics in an efficient and cost-effective manner.
Watch this session to learn:
- How various AWS services (Redshift, S3, RDS) can be quickly integrated using the Denodo Platform’s logical data management by implementing a logical data fabric (LDF)
- How an LDF helps you manage and deliver your data for data science and analytics programs, supporting your business users
- How a governed data services layer enables self-service analytics in your complex AWS data landscape
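The core data virtualization idea behind the session above is that a single logical view answers queries by federating rows from several underlying sources at query time, without copying the data first. The toy sketch below shows that pattern; the in-memory lists merely stand in for sources like Redshift, S3, or RDS, and nothing here is the Denodo API.

```python
# Stand-ins for physical sources the logical layer federates over.
source_redshift = [{"region": "EU", "sales": 100}]
source_rds      = [{"region": "US", "sales": 250}]

def logical_view(predicate=lambda row: True):
    """A virtual view: pull matching rows from every source at read time,
    so consumers never see which physical store a row came from."""
    for source in (source_redshift, source_rds):
        for row in source:
            if predicate(row):
                yield row
```

A real virtualization layer would also push the predicate down to each source rather than filtering after the fetch, which is where most of the performance work lives.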
Data Lakes are meant to support many of the same analytics capabilities as Data Warehouses while overcoming some of their core problems. Yet Data Lakes have a distinctly different technology base. This webinar provides an overview of the standard architecture components of Data Lakes.
This will include:
The lab and the factory
The base environment for batch analytics
Critical governance components
Additional components necessary for real-time analytics and ingesting streaming data
Big Data and Data Warehousing Together with Azure Synapse Analytics (SQLBits ... - Michael Rys
SQLBits 2020 presentation on how you can build solutions based on the modern data warehouse pattern with Azure Synapse Spark and SQL, including demos of Azure Synapse.
Delta Lake delivers reliability, security and performance to data lakes. Join this session to learn how customers have achieved 48x faster data processing, leading to 50% faster time to insight after implementing Delta Lake. You’ll also learn how Delta Lake provides the perfect foundation for a cost-effective, highly scalable lakehouse architecture.
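The reliability Delta Lake brings to data lakes rests on an ordered transaction log of actions over files; a reader replays the log to get a consistent snapshot. The toy version below (plain Python lists, not the actual Delta log format) illustrates only the add-file/remove-file replay idea.

```python
def snapshot(log):
    """Replay add/remove actions in order to find the data files
    currently visible in the table; later actions win."""
    files = set()
    for action, path in log:
        if action == "add":
            files.add(path)
        elif action == "remove":
            files.discard(path)
    return files

# A toy log: two writes, then a compaction that rewrites part-0.
log = [("add", "part-0.parquet"),
       ("add", "part-1.parquet"),
       ("remove", "part-0.parquet"),
       ("add", "part-2.parquet")]
```

Because readers always replay the log up to a fixed point, they see either all of a commit's actions or none of them, which is how the log gives ACID behavior over plain object storage.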
Azure Data Factory is a cloud-based data integration service that orchestrates and automates the movement and transformation of data. In this session we will learn how to create data integration solutions using the Data Factory service: ingest data from various data stores, transform/process the data, and publish the resulting data back to data stores.
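The orchestration pattern described above (ingest, then transform, then publish) can be sketched as activities executed in dependency order. This is a toy dependency runner, not the Azure Data Factory SDK, and the activity names are invented.

```python
# A pipeline as a dict of activities: each has dependencies and a callable.
pipeline = {
    "copy_raw":  {"depends_on": [],            "run": lambda: "ingested"},
    "transform": {"depends_on": ["copy_raw"],  "run": lambda: "transformed"},
    "publish":   {"depends_on": ["transform"], "run": lambda: "published"},
}

def run_pipeline(pipeline):
    """Run each activity once all of its dependencies have completed,
    returning the execution order. Assumes the dependency graph is acyclic."""
    done, order = set(), []
    while len(done) < len(pipeline):
        for name, activity in pipeline.items():
            if name not in done and all(d in done for d in activity["depends_on"]):
                activity["run"]()
                done.add(name)
                order.append(name)
    return order
```

Real ADF pipelines express the same thing declaratively in JSON, with the service scheduling activities once their `dependsOn` conditions are met.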
Event: Passcamp, 07.12.2017
Speaker: Stefan Kirner
Mehr Tech-Vorträge: https://www.inovex.de/de/content-pool/vortraege/
Mehr Tech-Artikel: https://www.inovex.de/blog
Building the Enterprise Data Lake - Important Considerations Before You Jump In - SnapLogic
In this webinar, learn from industry analyst and big data thought leader Mark Madsen about the future of big data and the importance of the new Enterprise Data Lake reference architecture.
This webinar also covers what’s important when building a modern, multi-use data infrastructure, the difference between a Hadoop application and a Data Lake infrastructure, and an enterprise data lake reference architecture to get you started.
To learn more, visit: www.snaplogic.com/big-data
Learn about the SAS and Cloudera technical integration, how SAS builds on the enterprise data hub, and the machine learning capabilities of SAS In-Memory Statistics for Hadoop.
SAS Visual Analytics is a high-performance, in-memory solution for exploring massive amounts of data very quickly. It enables you to spot patterns, identify opportunities for further analysis, and convey visual results via web reports or a mobile platform such as iPad® or Android-based tablets.
This presentation is a very brief overview of the many features and capabilities of SAS Visual Analytics. It is meant to get you started quickly, with a relatively modest data set example (only 1.4 million rows).
Insight Toy Company is an organization that produces and sells toys to resellers (“vendors”). The data is made up of 34 years of sales information, covering 128 cities across the world.
For each row of data (transaction) we have:
Information on the items sold (product brand, line, make, style, SKU)
The sale value (“order total”) and various related costs (distribution, marketing, product)
Information on the sales representative (rating, sales target, actual to date, etc.)
Geographic information (on the vendors as well as the selling facility)
Information on the vendors (rating, satisfaction, distance to nearest facility)
Text notes taken at the time of the order, based on the conversation with the vendor.
Predictive Analytics and Machine Learning…with SAS and Apache Hadoop - Hortonworks
In this interactive webinar, we'll walk through use cases showing how you can use advanced analytics like SAS Visual Statistics and SAS In-Memory Statistics with the Hortonworks Data Platform (HDP) to reveal insights in your big data and redefine how your organization solves complex problems.
Understanding SAS Data Step Processing - guest2160992
Learning
Base SAS,
Advanced SAS,
Proc SQL,
ODS,
SAS in financial industry,
Clinical trials,
SAS Macros,
SAS BI,
SAS on Unix,
SAS on Mainframe,
SAS Interview Questions and Answers,
SAS Tips and Techniques,
SAS Resources,
SAS Certification questions...
visit http://sastechies.blogspot.com
An exclusive presentation by Mr. Deepak Ramanathan, Chief Technology Officer, North Asia, AP Presales Operations, SAS Institute Pte. Ltd. (Singapore), on ‘High Performance Analytics - The Future of Analytics is Here.’ The presentation was made at the Government Analytics & Information Summit 2013.
See how you can use statistical analysis to conduct useful and effective consumer and marketing research. These slides were used in a seminar held in the UK at The Shard. To see upcoming seminars, visit http://www.jmp.com/uk/about/events/conferences/
Santosh Tiwari, Analytics Expert, SAS, presented on the topic "Data Driven Business" at the IDG CIO Summit, 8-9 Feb 2018, in which he talked about Viya, a modern, open platform built for analytics innovation, and shared a vision of what the future of analytics looks like. During this session, he also emphasized Artificial Intelligence and Machine Learning as a way to build automated, agile analytics platforms for customer engagement and business decisions.
What are the main areas of analytics, and how can they benefit your business? Learn the value of SAS analytics and how you can get better insight into your data to make more profitable decisions.
By getting a better understanding of your data you will know which part of the data can be reliably forecast using time series methods and which cannot. You will also gain an understanding of any hierarchical structure in the data that can be used.
Visualisation and forecasting on IT capacity planning data - Andrew Gadsby
This is a presentation recently given at UK CMG on linking visualisation and forecasting tools with capacity planning and performance data from distributed and mainframe systems.
This presentation introduces Statistical Discovery, a process that allows you to work with data to discover new, useful insights that drive cycles of learning. After a brief overview to introduce the concept, an example involving property prices in the US will be used to demonstrate how the process works in practice. Through this example we also exemplify the skills and aptitudes required to exercise the process successfully.
How to sustain analytics capabilities in an organization - SAS Canada
This presentation is part of the Analytics Management Series, which is designed to suggest paths towards effective decision-making in order to help sustain and grow analytical capabilities. It features thought leaders who actively manage complex analytical environments and share their best practices. This instalment features Daymond Ling, Senior Director, Modelling & Analytics (CIBC), on how organizations that want better performance and fewer problems can use data to their advantage.
SAS Data Management for Analytics: power your analyses and support innov... - SAS Italy
Now more than ever, high-quality analytics requires high-quality data! With the growing use of many new sources, such as Hadoop and the Internet of Things, the data that fuels analytics is spiralling upward in variety, volume, and complexity in the big data world. This presentation will help you understand how the SAS Data Management solution can support you in improving the quality of your data and reducing data preparation time.
During this presentation, we will discuss how SAS can provide an open analytical platform for artificial intelligence.
We’ll start by clarifying what AI and Open mean in the context of driving business value, and we will then illustrate how SAS can support this value creation through the components of the Platform and of the Viya engine. Finally, these different elements will be demonstrated through real-life examples and demos.
SAS Big Data Forum - Transforming Big Data into Corporate Gold - Louis Fernandes
Synopsis: How SAS believes organisations can turn Big Data into competitive advantage through the use of High Performance Analytics.
In this presentation, we look at how SAS is seeing organisations take the outputs from big data analysis and turn them into tangible business outcomes through real-time decision-making.
In it, we explore:
- Why we believe organisations need to exploit their data assets to create the insights that build competitive advantage
- How to develop the infrastructure required to support multi-dimensional insight
- What SAS is doing to make this a reality
Key topics include:
- Data governance
- Big data infrastructure
- High performance analytics
- Data visualisation
About SAS:
- World’s largest privately held software company
- 35 years old
- Focus on advanced and predictive analytics right from the word go
- Big data has been in our DNA before it became mainstream
Analyze Your Data, Transform Your Business - DATAVERSITY
In the era of big data, data processing has taken center stage. The focus is often on the speeds and feeds of the organizational data supply chain. Much less thought and expertise have been focused on what matters most: using analytics in the proper context.
Why is context so important? Why isn’t it enough to slay the v-named data dragons – namely, volume, variety and velocity?
In this webinar, you’ll learn how successful organizations apply the right analytical capabilities in the proper context. Because without context, analytics can become noise that disturbs the decision-making process instead of helping it. There is no one-size-fits-all context generator.
We’ll also discuss the importance of putting data into the proper context for analytical decision making, why data is your most valuable organizational asset, and how you can apply analytics in a way that converts your data into tangible benefits.
An introduction to BRIDGEi2i, an analytics solutions company focused on solving complex problems using data mining and advanced analytics on big data. Visit http://www.bridgei2i.com
The SAS Perspective on Artificial Intelligence was presented by Anil Arora, Principal Data Scientist, SAS at IDG CIO 100: Reimagineering India 2020 on 6-7th Sep at New Delhi.
Topic: Incidents, Indicators, Insights – the emergence of the Security Analytics Platform
Presented by Keith Swanson at the IBA event.
As cyber investment continues, many organisations have brought on a myriad of capabilities and technologies across operational and productivity platforms. As they move from threat and incident response to hunting as part of a risk-based approach, the Security Analytics Platform is cementing its importance in bridging these investments and information across operational and productivity platforms.
SAS was the analytics sponsor at the AML Summit by Fintelekt in Bangladesh, and Mr. Rohan Langley, SAS Fraud & Security Expert AP, presented on the topic 'Effective AML Compliance.'
SAS Anti-Money Laundering protects your assets by using advanced analytics to uncover illicit activity and comply with AML and CTF regulations.
Through this presentation, Mr. Vineet Khanna, Director - Practices, SAS India, talks about the key considerations for AML, the need for data management, analytics and optimisation.
In this era of big data, the role of marketing is constantly evolving - from monologues to dialogues, from customer satisfaction to customer delight! Through this presentation, Kiran Ajbani, Sr. Consultant at SAS India, brings into perspective the importance of data-driven marketing and the need to transform in the digital age.
This presentation was made at SAS Forum India 2014 by Mr. Manoj Shrivastava, Director - IT, MTS India. In this presentation, he talks about the various learnings derived from his experiences with BI and campaign management implementations.
This presentation was made at CIO Summit 2014, Pune. The presentation talks about the importance of data as a strategic asset and how analytics can empower modern businesses in making the most out of this recent and important asset class.
'Impact of emerging technologies in Business' was presented at the 17th IMA CEO Roundtable, by Sudipta K. Sen, Regional Director - South East Asia, Vice Chairman and Member of Board, SAS Institute (India). The presentation talks about how technologies in data management, analytics and BI can help organisations in driving breakthrough business outcomes.
An exclusive presentation by Mr. Srinivasan Iyengar, Chief Operating Officer of Reliance Life Insurance, “A Road to an Analytical Enterprise”. The presentation was made at the Analytics Conference in Mumbai.
An exclusive presentation by Mr. Imam Hoque, General Manager, Advanced Analytics BU, Advanced Analytics Sales - EMEA AP BU, SAS Software Ltd (United Kingdom), on ‘Maximising the Value of Analytics in Tax Compliance.’ The presentation was made at the Government Analytics & Information Summit 2013.
An exclusive presentation by Mr. Nirlap Vora, Practice Manager - Information Management, SAS Institute India Pvt. Ltd., on ‘Data Management as a Strategic Initiative.’ The presentation was made at the Government Analytics & Information Summit 2013.
An exclusive presentation by Keith Swanson, Director, Financial Crimes, SAS South Asia, on 'Big Data, Big Analytics & Bad Behaviour - Fighting Financial Crime.'
An exclusive presentation by Mr. Mayank Sahai, AVP - Corporate Marketing - Tata Teleservices Ltd. on ‘Enhancing Marketing Performance to drive Business Objectives.’ The presentation was made at SAS Forum India 2013.
An exclusive presentation by Ronald Fernandes, SVP - Compliance Department, Axis Bank, on 'Automation of Compliance Management – Implementation Considerations.' The presentation was made at SAS Forum India 2013.
An exclusive presentation by Mr. Mazhar Leghari, Business Development Solution Manager, SAS Middle East FZ LLC; on ‘Building for Success: The Foundation for Achievable MDM’. The presentation was made at SAS Forum India 2013.
This presentation was made by Mr. Deepak Ramanathan, Information Practice Head - SAS North Asia at SAS Forum India 2013. The presentation presents the current scenario on Big Data and how SAS High-Performance Analytics can empower organisations to derive value from this information explosion.
India's largest analytics forum, held at the Grand Hyatt on 12th March 2013. Arun V. Chearie, Practice Head - South East Asia, spoke on the topic 'Evolution & the Changing Dynamics of Customer Value Management.'
At the same forum, Deepak Ramanathan, Head - Information - North Asia, spoke on the topic 'Delivering forward-looking insights to drive breakthrough business outcomes.'
'The future of analytics is here' was presented by Mr. Deepak Ramanathan, Head - Information Management, SAS Asia Pacific (North). This document emphasizes how SAS Visual Analytics empowers businesses to analyze data in seconds or minutes, work that earlier took hours or days. This presentation was displayed at the CIO - The Year Ahead conference, Mumbai, 29-30 Nov '12.
Communications Mining Series - Zero to Hero - Session 1 - DianaGray10
This session provides an introduction to UiPath Communications Mining: its importance and a platform overview. You will acquire a good understanding of the phases in Communications Mining as we go over the platform with you. Topics covered:
• Communication Mining Overview
• Why is it important?
• How can it help today’s business and the benefits
• Phases in Communication Mining
• Demo on Platform overview
• Q/A
A tale of scale & speed: How the US Navy is enabling software delivery from l... - sonjaschweigert1
Rapid and secure feature delivery is a goal across every application team and every branch of the DoD. The Navy’s DevSecOps platform, Party Barge, has achieved:
- Reduction in onboarding time from 5 weeks to 1 day
- Improved developer experience and productivity through actionable findings and reduction of false positives
- Maintenance of superior security standards and inherent policy enforcement with Authorization to Operate (ATO)
Development teams can ship efficiently and ensure applications are cyber ready for Navy Authorizing Officials (AOs). In this webinar, Sigma Defense and Anchore will give attendees a look behind the scenes and demo secure pipeline automation and security artifacts that speed up application ATO and time to production.
We will cover:
- How to remove silos in DevSecOps
- How to build efficient development pipeline roles and component templates
- How to deliver security artifacts that matter for ATOs (SBOMs, vulnerability reports, and policy evidence)
- How to streamline operations with automated policy checks on container images
zkStudyClub - Reef: Fast Succinct Non-Interactive Zero-Knowledge Regex Proofs - Alex Pruden
This paper presents Reef, a system for generating publicly verifiable succinct non-interactive zero-knowledge proofs that a committed document matches or does not match a regular expression. We describe applications such as proving the strength of passwords, the provenance of email despite redactions, the validity of oblivious DNS queries, and the existence of mutations in DNA. Reef supports the Perl Compatible Regular Expression syntax, including wildcards, alternation, ranges, capture groups, Kleene star, negations, and lookarounds. Reef introduces a new type of automata, Skipping Alternating Finite Automata (SAFA), that skips irrelevant parts of a document when producing proofs without undermining soundness, and instantiates SAFA with a lookup argument. Our experimental evaluation confirms that Reef can generate proofs for documents with 32M characters; the proofs are small and cheap to verify (under a second).
Paper: https://eprint.iacr.org/2023/1886
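As a plain, non-zero-knowledge illustration of the statements Reef proves (that a document matches or does not match a regular expression, including capture groups and lookarounds), here is a minimal Python sketch using the standard `re` module; the document and regexes are made-up examples, not from the paper:

```python
import re

# A hypothetical "committed" document. In Reef it stays hidden from the
# verifier; here it is in the clear, purely to show the predicates proven.
document = "user=alice; token=AB12CD34; dns=example.org"

# Match predicate with a capture group: the document contains an
# 8-character alphanumeric token.
m = re.search(r"token=([A-Z0-9]{8})", document)
assert m is not None and m.group(1) == "AB12CD34"

# Non-match predicate: the document contains no 16-digit number
# (in the spirit of proving a redacted document leaks nothing).
assert re.search(r"\d{16}", document) is None

# Lookahead example (a PCRE feature Reef supports): a password-strength
# check requiring at least one digit, one letter, and length >= 8.
strong = re.compile(r"^(?=.*\d)(?=.*[A-Za-z]).{8,}$")
assert strong.match("abc123xyz") is not None
assert strong.match("12345678") is None
```

Reef's contribution is producing a succinct proof of exactly such match/non-match facts without revealing the document, skipping irrelevant parts of it via its SAFA automata.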
Unlocking Productivity: Leveraging the Potential of Copilot in Microsoft 365, a presentation by Christoforos Vlachos, Senior Solutions Manager – Modern Workplace, Uni Systems
Climate Impact of Software Testing at Nordic Testing Days - Kari Kakkonen
My slides at Nordic Testing Days 6.6.2024
The climate impact and sustainability of software testing were discussed in the talk. ICT and testing must carry their part of the global responsibility to help with climate warming. We can minimize the carbon footprint, but we can also have a carbon handprint, a positive impact on the climate. Sustainability can be added to the quality characteristics and then measured continuously. Test environments can be used less, at a smaller scale, and on demand. Test techniques can be used to optimize or minimize the number of tests. Test automation can be used to speed up testing.
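The claim that test techniques can minimize the number of tests can be illustrated with pairwise (all-pairs) combination testing. Below is a minimal Python sketch with made-up parameters, using a naive greedy cover rather than any particular tool's algorithm:

```python
from itertools import combinations, product

# Hypothetical test parameters: exhaustive testing of every combination
# would need 3 * 3 * 2 * 2 = 36 test runs.
params = {
    "browser": ["chrome", "firefox", "safari"],
    "os": ["linux", "windows", "macos"],
    "locale": ["en", "fi"],
    "dark_mode": [True, False],
}

names = list(params)

def pairs_of(case):
    # All two-parameter value pairs this single test case covers.
    return {((a, case[a]), (b, case[b])) for a, b in combinations(names, 2)}

all_cases = [dict(zip(names, combo)) for combo in product(*params.values())]

def pairwise_suite():
    # Greedy all-pairs cover: repeatedly pick the case that covers
    # the most not-yet-covered value pairs, until none remain.
    uncovered = set().union(*(pairs_of(c) for c in all_cases))
    suite = []
    while uncovered:
        best = max(all_cases, key=lambda c: len(pairs_of(c) & uncovered))
        suite.append(best)
        uncovered -= pairs_of(best)
    return suite

suite = pairwise_suite()
print(f"{len(suite)} pairwise tests instead of {len(all_cases)} exhaustive runs")
```

Every pair of parameter values is still exercised together at least once, but with only a fraction of the 36 exhaustive runs, which is the footprint-reduction argument made in the talk.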
DevOps and Testing slides at DASA Connect - Kari Kakkonen
Rik Marselis' and my slides from the DASA Connect conference on 30.5.2024. We discuss what testing is, then what agile testing is, and finally what testing in DevOps is. We also held a lovely workshop with the participants, exploring different ways to think about quality and testing in the different parts of the DevOps infinity loop.
Sudheer Mechineni, Head of Application Frameworks, Standard Chartered Bank
Discover how Standard Chartered Bank harnessed the power of Neo4j to transform complex data access challenges into a dynamic, scalable graph database solution. This keynote will cover their journey from initial adoption to deploying a fully automated, enterprise-grade causal cluster, highlighting key strategies for modelling organisational changes and ensuring robust disaster recovery. Learn how these innovations have not only enhanced Standard Chartered Bank’s data infrastructure but also positioned them as pioneers in the banking sector’s adoption of graph technology.
LF Energy Webinar: Electrical Grid Modelling and Simulation Through PowSyBl -... - DanBrown980551
Do you want to learn how to model and simulate an electrical network from scratch in under an hour?
Then welcome to this PowSyBl workshop, hosted by Rte, the French Transmission System Operator (TSO)!
During the webinar, you will discover the PowSyBl ecosystem as well as handle and study an electrical network through an interactive Python notebook.
PowSyBl is an open source project hosted by LF Energy, which offers a comprehensive set of features for electrical grid modelling and simulation. Among other advanced features, PowSyBl provides:
- A fully editable and extendable library for grid component modelling;
- Visualization tools to display your network;
- Grid simulation tools, such as power flows, security analyses (with or without remedial actions) and sensitivity analyses.
The framework is mostly written in Java, with a Python binding so that Python developers can access PowSyBl functionalities as well.
What you will learn during the webinar:
- For beginners: discover PowSyBl's functionalities through a quick general presentation and the notebook, without needing any expert coding skills;
- For advanced developers: master the skills to efficiently apply PowSyBl functionalities to your real-world scenarios.
GraphRAG is All You Need? LLM & Knowledge Graph - Guy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
Pushing the limits of ePRTC: 100ns holdover for 100 days - Adtran
At WSTS 2024, Alon Stern explored the topic of parametric holdover and explained how recent research findings can be implemented in real-world PNT networks to achieve 100 nanoseconds of accuracy for up to 100 days.
Maruthi Prithivirajan, Head of ASEAN & IN Solution Architecture, Neo4j
Get an inside look at the latest Neo4j innovations that enable relationship-driven intelligence at scale. Learn more about the newest cloud integrations and product enhancements that make Neo4j an essential choice for developers building apps with interconnected data and generative AI.
GraphSummit Singapore | The Art of the Possible with Graph - Q2 2024 - Neo4j
Neha Bajwa, Vice President of Product Marketing, Neo4j
Join us as we explore breakthrough innovations enabled by interconnected data and AI. Discover firsthand how organizations use relationships in data to uncover contextual insights and solve our most pressing challenges – from optimizing supply chains, detecting fraud, and improving customer experiences to accelerating drug discoveries.
UiPath Test Automation using UiPath Test Suite series, part 5 - DianaGray10
Welcome to UiPath Test Automation using UiPath Test Suite series, part 5. In this session, we will cover CI/CD with DevOps.
Topics covered:
CI/CD within UiPath
End-to-end overview of a CI/CD pipeline with Azure DevOps
Speaker:
Lyndsey Byblow, Test Suite Sales Engineer @ UiPath, Inc.